Gopuff

Gopuff is an instant commerce platform that delivers groceries, alcohol, home essentials, and more to consumers in minutes. With a wide range of products including cleaning supplies, medicine, pet care, beauty items, and local brands, Gopuff offers a u...

Food Products
10K-50K
Founded 2013
$2.4B raised

Description

  • Design, build, and maintain scalable batch and real-time data pipelines for analytics, experimentation, and machine learning.
  • Contribute to the architecture and maintenance of the data platform to keep systems performant, cost-efficient, and scalable.
  • Partner with analytics, product, engineering, and operations teams to deliver high-quality data solutions.
  • Develop and maintain curated, well-modeled datasets that serve as trusted sources of truth.
  • Champion data quality, reliability, and observability through testing, monitoring, lineage, and incident response practices.
  • Contribute to team standards, patterns, and best practices for data engineering.
  • Improve infrastructure, developer workflows, CI/CD, and data platform tooling.
  • Support the delivery of insights-ready data products that enable measurable business impact.

Requirements

  • 3-5 years of experience in data engineering or software engineering with a strong focus on data platform development.
  • Proven experience building and scaling modern data platforms and delivering high-impact data solutions.
  • Strong proficiency in Python and SQL.
  • Experience with modern cloud data warehouses and lakes such as Snowflake, BigQuery, or Databricks.
  • Experience building batch pipelines using DAG-based orchestrators such as Dagster or Airflow.
  • Experience with event-driven architectures using Kafka, Kinesis, or Event Hubs.
  • Experience developing real-time or streaming pipelines using Apache Beam, Flink, or Spark Streaming.
  • Experience deploying applications and services to Kubernetes using tools such as ArgoCD, Helm, or Istio.
  • Experience implementing DevOps concepts within data workflows, including CI/CD, observability, monitoring, and lineage.
  • Experience with Infrastructure-as-Code such as Terraform.
  • Strong communication skills and the ability to work closely with technical and non-technical partners.
  • Passion for building reliable, accessible, and high-quality data products.

Benefits

  • Remote work location with a United States base salary range of $118,000 - $148,000.
  • Eligible for a discretionary annual cash bonus.
  • Participation in Gopuff’s equity incentive plan.
  • Medical, dental, and vision insurance.
  • 401(k) retirement savings plan.
  • HSA or FSA eligibility.
  • Long- and short-term disability insurance.
  • Mental health benefits, fitness reimbursement, flexible PTO, and employee discount with FAM membership.


