qode

qode focuses on unlocking global opportunities and unleashing potential through no-code solutions, providing tools and services that help individuals and businesses build software without traditional coding skills.

Internet Software & Services

Responsibilities

  • Design and build scalable ETL/ELT pipelines using batch and streaming approaches.
  • Develop ingestion workflows from databases, APIs, and event streams.
  • Implement full load, incremental load, and CDC ingestion strategies.
  • Orchestrate and monitor data workflows using Apache Airflow.
  • Manage data connectors and ingestion processes using Airbyte.
  • Build and optimize data processing pipelines in Databricks Lakehouse.
  • Write complex SQL for analytics, transformation, and query optimization.
  • Create modular, testable data models with dbt across staging, intermediate, and marts layers.
  • Maintain data quality, observability, and reliability across the data platform.
  • Document pipelines, data models, and data dictionaries for long-term maintainability.
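To make the full-load/incremental/CDC distinction above concrete, here is a minimal sketch of an incremental load driven by a high watermark, with an idempotent upsert to absorb late updates (a CDC-style merge). It uses SQLite purely for illustration; the `orders` table and its `updated_at` column are hypothetical, and a real pipeline would target Databricks, a warehouse, or an Airbyte connector instead.

```python
import sqlite3

def incremental_load(src: sqlite3.Connection, dst: sqlite3.Connection) -> int:
    """Copy only rows newer than the destination's high watermark."""
    # High watermark: the latest updated_at already loaded (0 if table is empty).
    (watermark,) = dst.execute(
        "SELECT COALESCE(MAX(updated_at), 0) FROM orders"
    ).fetchone()

    # Pull only changed rows from the source (incremental, not full load).
    rows = src.execute(
        "SELECT id, amount, updated_at FROM orders WHERE updated_at > ?",
        (watermark,),
    ).fetchall()

    # Upsert so re-runs and late-arriving updates stay idempotent (merge semantics).
    dst.executemany(
        "INSERT INTO orders (id, amount, updated_at) VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET amount = excluded.amount, "
        "updated_at = excluded.updated_at",
        rows,
    )
    dst.commit()
    return len(rows)
```

A full load would instead truncate the destination and copy everything; the watermark query is what makes each run proportional to the change volume rather than the table size.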

Requirements

  • At least 5 years of experience in Data Engineering.
  • Strong understanding of data architectures including Data Lake, Data Warehouse, and Lakehouse.
  • Hands-on experience with ETL/ELT pipelines, including batch and streaming processing.
  • Familiarity with ingestion patterns such as full load, incremental, CDC, and event-driven.
  • Experience with Databricks, including Delta Live Tables, Jobs, and Notebooks.
  • Strong skills in PySpark or Spark SQL for large-scale data processing.
  • Solid understanding of Delta Lake concepts such as ACID, time travel, and schema evolution.
  • Experience with Apache Airflow, including DAGs, scheduling, and monitoring.
  • Experience with Airbyte or similar ingestion tools.
  • Strong SQL skills, including CTEs, joins, window functions, and query optimization.
  • Experience with dbt for transformation, testing, and documentation.
  • Hands-on experience with AWS services such as S3, Lambda, EC2, and IAM.
  • Nice to have: Experience with Docker and Kubernetes (EKS).
  • Nice to have: Experience running Airflow or Airbyte on Kubernetes.
  • Nice to have: Familiarity with data quality tools such as Great Expectations or Soda.
  • Nice to have: Experience with Terraform or other Infrastructure as Code tools.
  • Nice to have: Exposure to data governance or catalog tools such as Databricks Catalog.
  • Nice to have: Experience with CI/CD pipelines such as GitHub Actions.
  • Nice to have: Strong Python skills for automation and pipeline scripting.
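As a small illustration of the SQL bar above (CTEs plus window functions), the sketch below ranks each customer's orders and keeps the top one. The `orders` table and column names are invented for the example, and SQLite stands in for whatever engine a candidate would actually use.

```python
import sqlite3

# CTE builds a per-customer ranking; the outer query keeps rank 1 only.
QUERY = """
WITH ranked AS (
    SELECT
        customer,
        amount,
        ROW_NUMBER() OVER (
            PARTITION BY customer
            ORDER BY amount DESC
        ) AS rn
    FROM orders
)
SELECT customer, amount
FROM ranked
WHERE rn = 1
ORDER BY customer
"""

def top_order_per_customer(conn: sqlite3.Connection) -> list[tuple[str, float]]:
    """Return each customer's single largest order."""
    return conn.execute(QUERY).fetchall()
```

The same shape (CTE feeding a `ROW_NUMBER()` filter) is a common building block in dbt models for deduplication, e.g. keeping the latest record per key in a staging layer.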

Benefits

  • Attractive salary range, open to negotiation for strong candidates.
  • Hybrid/remote-friendly culture with flexibility to work where you perform best.
  • Flexible hours with asynchronous teamwork and protected focus time.
  • Work equipment support.
  • Allowance for certification and skill development.
  • Year-end bonus and performance-based rewards.
  • 15 paid leave days per year.
  • Career growth support with personal coaching sessions.

Interested in this position?

Apply directly on the company website
