Senior Data Engineer (Cloud / Snowflake / PySpark)

2 weeks, 1 day ago
Full-time
Senior
Software Development
Enroute

Enroute provides comprehensive IT services including QA & Testing, Software Development, 3D Printing, Data Management, and more, delivering tailored solutions for partners.

Industry: Internet Software & Services
Company size: 51-250 employees
Founded: 2005

Description

  • Design, build, and maintain scalable, reliable, high-performance data pipelines.
  • Develop end-to-end ETL and ELT workflows.
  • Process large-scale datasets using Spark and PySpark.
  • Build and orchestrate cloud-native data pipelines in AWS and/or Azure.
  • Design and optimize Snowflake data warehouse solutions.
  • Ensure data platform performance, scalability, governance, and cost optimization.
  • Write and optimize advanced SQL queries.
  • Collaborate with data scientists, analysts, and software engineers.
  • Translate business requirements into production-ready data solutions.
  • Implement CI/CD, Git workflows, and Dockerized deployments.
  • Improve the reliability and observability of data platforms.
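The responsibilities above center on batch ETL/ELT pipelines. As a rough illustration of that pattern, here is a minimal self-contained sketch in plain Python; the schema, data, and the use of sqlite3 as a stand-in for a real warehouse such as Snowflake are assumptions for illustration only, not part of the role's actual stack.

```python
import sqlite3

# Minimal extract-transform-load sketch. sqlite3 stands in for a real
# warehouse (e.g. Snowflake); the table and sample data are illustrative.

RAW_EVENTS = [
    {"event_id": 1, "user": "ana",  "amount_cents": "1250"},
    {"event_id": 2, "user": "luis", "amount_cents": "980"},
    {"event_id": 2, "user": "luis", "amount_cents": "980"},  # duplicate delivery
]

def extract():
    """Pull raw records from the source system (here: an in-memory list)."""
    return RAW_EVENTS

def transform(rows):
    """Cast types and de-duplicate on the natural key."""
    seen, out = set(), []
    for r in rows:
        if r["event_id"] in seen:
            continue
        seen.add(r["event_id"])
        out.append((r["event_id"], r["user"], int(r["amount_cents"]) / 100))
    return out

def load(conn, rows):
    """Idempotent load: re-running the pipeline does not duplicate rows."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events "
        "(event_id INTEGER PRIMARY KEY, user TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO events VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract()))
load(conn, transform(extract()))  # second run is a no-op, not a duplication
count = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(count)  # → 2
```

The idempotent-load step is the key design choice: production pipelines are routinely re-run after failures, so a second run must converge to the same state rather than double-count rows.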

Requirements

  • 5+ years of professional experience in Data Engineering or related fields.
  • Strong experience designing and maintaining scalable data pipelines.
  • Deep understanding of ETL and ELT best practices.
  • Strong experience with large-scale data processing architectures and batch processing.
  • Advanced Python skills.
  • Strong hands-on experience with Apache Spark and PySpark.
  • Advanced SQL skills, including complex queries, optimization, and transformations.
  • Strong experience processing large structured and unstructured datasets.
  • Hands-on experience with AWS or Azure.
  • Experience building cloud-native data solutions.
  • Experience with Docker and CI/CD pipelines.
  • Strong knowledge of Git and version control.
  • Strong hands-on experience with Apache Airflow and workflow orchestration.
  • Experience with scheduling, monitoring, and failure recovery strategies.
  • Strong expertise in Snowflake, including data warehouse design and development, query and warehouse optimization, performance tuning, and cost efficiency.
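Several requirements above concern orchestration: scheduling, monitoring, and failure recovery. The sketch below illustrates the retry-with-backoff strategy that orchestrators such as Apache Airflow apply per task. It is not the Airflow API; the function names and the simulated flaky task are assumptions made for this self-contained example.

```python
import time

def run_with_retries(task, retries=3, backoff_s=0.01):
    """Run `task`, retrying on failure with exponential backoff.

    This mirrors the per-task retry policy an orchestrator enforces;
    it is a toy sketch, not Apache Airflow's actual implementation.
    """
    attempt = 0
    while True:
        try:
            return task()
        except Exception:
            attempt += 1
            if attempt > retries:
                raise  # retries exhausted: surface the failure for monitoring
            time.sleep(backoff_s * (2 ** (attempt - 1)))

calls = {"n": 0}

def flaky_extract():
    """Fails twice, then succeeds -- simulates a transient source outage."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("source unavailable")
    return "ok"

result = run_with_retries(flaky_extract)
print(result, calls["n"])  # → ok 3
```

In a real deployment the same idea is expressed declaratively, e.g. Airflow's per-task `retries` and `retry_delay` settings, with the scheduler handling backoff and alerting.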

Benefits

  • Monetary compensation and year-end bonus.
  • IMSS, AFORE, and INFONAVIT benefits.
  • Major medical expenses insurance, minor medical expenses insurance, life insurance, and funeral expenses insurance.
  • Holidays, vacations, sick days, bereavement days, civil marriage days, and maternity and paternity leave.
  • English and Spanish classes.
  • Certifications and a performance management framework.
  • Employee discounts through the TALISIS agreement and Amazon birthday gift card.
  • Work-from-home bonus and laptop policy.

Interested in this position?

Apply directly on the company website

