Route

Route is an all-in-one post-purchase platform revolutionizing ecommerce with visual tracking, package protection, and carbon-neutral shipping.

Air Freight & Logistics
251-1K
Founded 2018
$12M raised

Description

  • Build and maintain ELT pipelines that ingest data from source systems.
  • Co-own the mapping and migration of source data into the new 3NF enterprise data warehouse while preserving data integrity.
  • Develop automated unit tests, data tests, observability processes, and monitoring dashboards for pipeline health and data quality.
  • Build data and AI-powered tools that improve the productivity of the data engineering team and expand self-service access to data.
  • Automate deployment of shared staging and production infrastructure for new pipelines and maintain dbt and CI template dependencies.
  • Support the migration from Snowflake to Databricks, including reporting services and ingest/egress jobs.
  • Coordinate with engineering, analytics, product, and business teams to define, prioritize, and deliver data requirements.
  • Champion data democratization and help establish company-wide data retention and self-service data access foundations.
  • Monitor and support the uptime, security, and consistency of Route’s data lifecycle.
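The automated data tests mentioned above can be sketched in plain Python. This is an illustrative example only, not Route's actual tooling: the check names, row shape, and thresholds are assumptions.

```python
# A minimal sketch of automated data-quality tests for an ELT pipeline,
# assuming rows arrive as dicts; checks and column names are illustrative.

def check_not_null(rows, column):
    """Return the count of rows where `column` is missing or None."""
    return sum(1 for row in rows if row.get(column) is None)

def check_unique(rows, column):
    """Return the count of duplicate values in `column`."""
    seen, dupes = set(), 0
    for row in rows:
        value = row.get(column)
        if value in seen:
            dupes += 1
        seen.add(value)
    return dupes

# Example: two orders sharing an id, one missing a status.
orders = [
    {"order_id": 1, "status": "shipped"},
    {"order_id": 1, "status": "delivered"},
    {"order_id": 2, "status": None},
]
null_failures = check_not_null(orders, "status")
dupe_failures = check_unique(orders, "order_id")
```

In practice checks like these would run as dbt tests or pipeline assertions and feed the monitoring dashboards the description mentions.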

Requirements

  • 5+ years of formal, professional data engineering experience.
  • 4+ years of SQL experience, including complex transformations, window functions, and query optimization.
  • 3+ years of Python experience in data pipeline development, scripting, testing, and package management with Poetry.
  • 2+ years of experience with AWS data services such as S3, RDS, DMS, and DynamoDB.
  • 1+ year of experience using Databricks as a primary development platform.
  • Experience using Terraform and Go.
  • Preferred experience with PagerDuty, Grafana, and Tableau.
  • Understanding of third normal form (3NF) data modeling and when to apply it.
  • Knowledge and application of data theory.
  • Working knowledge of data security practices and least-privilege access standards.
  • Experience with data access controls in cloud environments, including IAM roles and catalog permissions.
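The SQL requirement above calls out window functions specifically. The kind of query it means can be sketched against SQLite (3.25+) so the example is self-contained; the table and column names are illustrative, not from the posting.

```python
# A minimal sketch of a window-function query (running total per customer),
# run against an in-memory SQLite database; schema is illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("a", 10.0), ("a", 20.0), ("b", 5.0)],
)

# SUM() OVER a partition gives each row its customer's cumulative total.
rows = conn.execute(
    """
    SELECT customer,
           amount,
           SUM(amount) OVER (
               PARTITION BY customer
               ORDER BY amount
               ROWS UNBOUNDED PRECEDING
           ) AS running_total
    FROM orders
    ORDER BY customer, amount
    """
).fetchall()
```

The same pattern (PARTITION BY plus an explicit frame clause) carries over directly to Snowflake and Databricks SQL.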

Benefits

  • Route pays 95% to 100% of health insurance premiums for employees and their families.
  • Remote or hybrid work arrangements are available.
  • Unlimited PTO is offered.
  • 401(k) matching is included.
  • Formal growth opportunities plus learning and development support are provided.
  • DEI programs and events are part of the employee offering.
  • Employees are eligible to participate in the equity incentive plan and receive stock options.
  • The salary range for this role is $138,000 to $146,000, and some roles may also be bonus-eligible.

Interested in this position?

Apply directly on the company website

Similar Roles

Sr. Data Engineer

Path Robotics 51-250 Automotive

Path Robotics is seeking a founding-level data engineer to architect the data platform that powers its AI and robotics teams’ experimentation, production deployment, and continuous learning systems.

AWS CI/CD Dagster dbt Flink Kafka MLOps Python Snowflake SQL

Java Developer | Apache Spark & Data Processing

NEORIS 5K-10K Internet Software & Services

NEORIS is seeking a semi-senior Java back-end developer for a financial and banking sector client in Colombia, focused on the design and optimization of batch and ETL processes in a remote environment.

Apache Airflow Apache Spark CI/CD Git Java JUnit SQL

Data Engineer

Newsela 251-1K Diversified Consumer Services

Newsela is hiring a Data Engineer to build and maintain data integrations and workflows that support K-12 school operations across educational platforms.

Python REST API SFTP SQL

Senior Data Engineer

phData 251-1K IT Services

phData is building a pipeline of experienced software, data, and analytics professionals to support future client work delivering production-grade solutions across modern cloud data platforms.

Apache Airflow Apache Spark AWS Azure Cassandra Databricks dbt Elasticsearch GCP Hadoop HDFS Java Kafka Luigi Python Scala Snowflake Solr SQL
