HomeBuddy

HomeBuddy is your home improvement helper that connects you with local, professional contractors for all your home improvement projects. With over 112,651 project requests and 13,259 projects completed, HomeBuddy ensures a hassle-free experience by mat...

Construction & Engineering
11-50 employees
Founded 2022

Responsibilities

  • Build and maintain data infrastructure for collecting, storing, and retrieving data.
  • Create and maintain new data flows by integrating internal and external data sources.
  • Develop ETL pipelines, data warehousing, and data models to support business needs.
  • Monitor and improve data quality, reliability, and lineage through processes and tools.
  • Collaborate with the Data & Analytics Team to optimize data infrastructure and strengthen data governance.
  • Document data sources, pipelines, and data quality procedures for end users.
  • Train end users on data sources, pipelines, and data quality processes.
  • Stay current with data engineering technologies and identify opportunities to improve data infrastructure and analysis.
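The ETL work described above can be sketched as a minimal extract-transform-load pipeline. This is an illustrative sketch only, using Python's standard library and an in-memory SQLite database; the source records, table name, and schema are hypothetical and not taken from the posting.

```python
# Minimal ETL sketch: extract records, transform them, load into SQLite.
# All data, field names, and the `projects` schema are hypothetical.
import sqlite3

def extract():
    # Stand-in for pulling rows from an internal or external source
    # (in practice: an API, a Fivetran/Airbyte sync, a file drop, etc.).
    return [
        {"project_id": 1, "category": "Roofing", "cost_usd": "12500"},
        {"project_id": 2, "category": "Windows", "cost_usd": "8300"},
    ]

def transform(rows):
    # Cast types and normalize fields before loading.
    return [(r["project_id"], r["category"].lower(), int(r["cost_usd"]))
            for r in rows]

def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS projects "
        "(project_id INTEGER PRIMARY KEY, category TEXT, cost_usd INTEGER)")
    conn.executemany("INSERT OR REPLACE INTO projects VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(cost_usd) FROM projects").fetchone()[0]
print(total)  # 20800
```

In a production setting each stage would typically live behind an orchestrator (Prefect, Airflow, or Dagster, as named below in the requirements) rather than run as a single script.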

Requirements

  • 3+ years of experience in data engineering or a related field.
  • Expertise in Python and SQL.
  • Understanding of advanced data infrastructure, tools, and concepts.
  • Strong understanding of core data modeling concepts and their tradeoffs.
  • Experience with a cloud data warehouse, preferably Snowflake.
  • Experience with a cloud platform, preferably AWS.
  • Experience with a data transformation tool, preferably dbt.
  • Strong analytical and problem-solving skills.
  • Business acumen and strong communication skills with both technical and business stakeholders.
  • Business fluency in English.
  • Experience with EL tools such as Fivetran or Airbyte is preferred.
  • Experience with orchestration tools such as Prefect, Airflow, or Dagster is preferred.
  • Basic understanding of the ML lifecycle, including training, evaluation, deployment, and monitoring, is preferred.
  • Experience with data quality monitoring tools such as Great Expectations, Monte Carlo, or DataFold is preferred.
  • Experience with ML pipelines and deployment tools such as MLflow, SageMaker, Vertex AI, Azure ML, or Kubeflow is preferred.
  • Experience with AI engineering tools and frameworks such as Amazon Bedrock, LangGraph, CrewAI, Vertex AI Agent Builder, or similar is preferred.
  • Familiarity with containerization and DevOps practices such as Docker and Git is preferred.
  • Experience in a SaaS or marketplace company is preferred.
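The data quality monitoring mentioned in the requirements can be sketched with a few hand-rolled checks in the spirit of tools like Great Expectations. This is a simplified illustration, not the API of any listed tool; the dataset and the three expectations are hypothetical.

```python
# Hand-rolled data quality checks: each check returns (name, passed)
# for a batch of rows. Dataset and thresholds are hypothetical examples.

rows = [
    {"project_id": 1, "category": "roofing", "cost_usd": 12500},
    {"project_id": 2, "category": "windows", "cost_usd": 8300},
]

def expect_not_null(rows, column):
    return (f"{column} not null",
            all(r.get(column) is not None for r in rows))

def expect_between(rows, column, lo, hi):
    return (f"{column} in [{lo}, {hi}]",
            all(lo <= r[column] <= hi for r in rows))

def expect_unique(rows, column):
    values = [r[column] for r in rows]
    return (f"{column} unique", len(values) == len(set(values)))

results = [
    expect_not_null(rows, "category"),
    expect_between(rows, "cost_usd", 0, 1_000_000),
    expect_unique(rows, "project_id"),
]
failures = [name for name, passed in results if not passed]
print(failures)  # []
```

Dedicated tools add scheduling, alerting, and lineage on top of exactly this kind of per-batch assertion.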

Benefits

  • Freedom to work remotely from anywhere.
  • Flexible schedule aligned to local needs.
  • Paid vacation, sick leave, and local holidays.
  • Partial reimbursement for work equipment of your choice, plus up-to-date apps and tools.
  • Industry-leading compensation package.
  • Recognition and rewards for individual and team success.
  • Fitness and mental health allowance program.
  • Paid training opportunities, including courses, events, and conferences.

Interested in this position?

Apply directly on the company website

