HomeBuddy

HomeBuddy is a home improvement helper that connects you with local, professional contractors for all your home improvement projects. With over 112,651 project requests and 13,259 projects completed, HomeBuddy ensures a hassle-free experience by matching homeowners with local, professional contractors.

Industry: Construction & Engineering
Company size: 11-50
Founded: 2022

Responsibilities

  • Build and maintain data infrastructure that enables collection, storage, and retrieval of data.
  • Create and integrate new data flows from various sources, ensuring reliability and efficiency.
  • Develop ETL pipelines, data warehousing, and data modeling to support business needs.
  • Ensure data quality, reliability, and lineage by developing processes and tools to identify and correct data quality issues.
  • Collaborate with the Data & Analytics team to optimize data infrastructure and improve data governance.
  • Provide documentation and training to end-users on data sources, pipelines, and data quality procedures.
  • Research and stay current with data engineering technologies and techniques, identifying opportunities to improve data infrastructure and analysis.

Requirements

  • 3+ years of experience in data engineering or related fields.
  • Expertise in Python and SQL.
  • Experience with cloud data warehouses (Snowflake preferred) and cloud platforms (AWS preferred).
  • Experience with data transformation tools such as dbt (preferred).
  • Strong understanding of advanced data infrastructure, tools, and data modeling concepts and tradeoffs.
  • Strong analytical and problem-solving skills with business acumen and the ability to communicate with technical and business stakeholders.
  • Business fluency in English.
  • Nice to have: EL tools (e.g., Fivetran, Airbyte) and orchestration tools (e.g., Prefect, Airflow, Dagster).
  • Nice to have: Familiarity with the ML lifecycle and ML pipelines/deployment tools (e.g., MLflow, SageMaker, Vertex AI, Azure ML, Kubeflow) and AI engineering tools (e.g., Amazon Bedrock, LangGraph, CrewAI, Vertex AI Agent Builder).
  • Nice to have: Experience with data quality monitoring tools (e.g., Great Expectations, Monte Carlo, DataFold), containerization and DevOps practices (e.g., Docker, Git), and experience in SaaS or marketplace companies.

Benefits

  • Fully remote work with the expectation to be available during agreed working hours.
  • Flexible schedule with paid vacation, sick leave, and local holidays.
  • Partially company-paid work equipment, plus up-to-date apps and tools.
  • Industry-leading compensation package.
  • Recognition and rewards for outstanding individual and team performance.
  • Allowance program covering fitness activities and mental health programs.
  • Paid training opportunities including courses, events, and conferences.

Interested in this position?

Apply directly on the company website

Apply Now

