qode

qode is a company focused on unlocking global opportunities and potential through no-code solutions. It provides tools and services that help individuals and businesses build software without traditional coding skills.

Internet Software & Services

Description

  • Design and build scalable ETL/ELT pipelines using batch and streaming approaches.
  • Develop ingestion workflows from databases, APIs, and event streams.
  • Implement full load, incremental load, and CDC ingestion strategies.
  • Orchestrate data workflows using Apache Airflow.
  • Manage data connectors using Airbyte.
  • Build and optimize data processing pipelines in Databricks Lakehouse.
  • Write and optimize complex SQL queries for analytics and transformation.
  • Develop modular, testable data models with dbt across staging, intermediate, and mart layers.
  • Maintain data quality, observability, and reliability across the platform.
  • Work with AWS services such as S3, Lambda, EC2, and IAM, and containerize data services with Docker and Kubernetes when needed.
  • Document pipelines, data models, and data dictionaries for maintainability.
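The ingestion strategies named above (full load, incremental load, and CDC) can be sketched in a few lines of Python. This is a minimal illustration only, not qode's actual pipeline code; the row shapes, the `updated_at` watermark column, and the change-event format are all assumptions.

```python
# Illustrative sketch of three ingestion strategies: full load,
# watermark-based incremental load, and CDC-style change application.
# Column names and event shapes are hypothetical.

def full_load(source_rows: list[dict]) -> list[dict]:
    """Replace the target entirely with the current source snapshot."""
    return list(source_rows)


def incremental_load(source_rows: list[dict], watermark: str) -> list[dict]:
    """Pull only rows updated after the last high-water mark."""
    return [r for r in source_rows if r["updated_at"] > watermark]


def apply_cdc(target: dict, changes: list[dict]) -> dict:
    """Apply a stream of upsert/delete change events keyed by row id."""
    result = dict(target)
    for change in changes:
        if change["op"] == "delete":
            result.pop(change["id"], None)
        else:  # upsert: insert new row or overwrite existing one
            result[change["id"]] = change["row"]
    return result
```

In a real pipeline these steps would typically run as orchestrated tasks (e.g. in Airflow), with the watermark persisted between runs and change events read from a log or event stream.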

Requirements

  • At least 4 years of experience in Data Engineering.
  • Strong understanding of data architectures including Data Lake, Data Warehouse, and Lakehouse.
  • Hands-on experience with ETL/ELT pipelines, including batch and streaming processing.
  • Familiarity with ingestion patterns such as full load, incremental load, CDC, and event-driven workflows.
  • Experience with Databricks, including Delta Live Tables, Jobs, and Notebooks.
  • Strong skills in PySpark or Spark SQL for large-scale data processing.
  • Solid understanding of Delta Lake concepts such as ACID, time travel, and schema evolution.
  • Experience with Apache Airflow, including DAGs, scheduling, and monitoring.
  • Experience with Airbyte or similar ingestion tools.
  • Strong SQL skills, including CTEs, joins, window functions, and query optimization.
  • Experience with dbt for transformation, testing, and documentation.
  • Hands-on experience with AWS services such as S3, Lambda, and IAM.
  • English communication skills at C1 level or higher.
  • Experience with Docker and Kubernetes (EKS) is preferred.
  • Experience running Airflow or Airbyte on Kubernetes is preferred.
  • Familiarity with data quality tools such as Great Expectations or Soda is preferred.
  • Experience with Terraform or other Infrastructure as Code tools is preferred.
  • Exposure to data governance or catalog tools such as Databricks Unity Catalog is preferred.
  • Experience with CI/CD pipelines such as GitHub Actions is preferred.
  • Strong Python skills for automation and pipeline scripting are preferred.
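To give a concrete flavor of the SQL skills listed above (CTEs and window functions), here is a minimal, self-contained example run against an in-memory SQLite database. The `orders` table and its columns are hypothetical, chosen purely for illustration.

```python
# Sketch of a CTE combined with a window function: rank each order
# within its customer by amount, then keep the top order per customer.
# The schema is illustrative, not tied to any real system.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES ('a', 10), ('a', 30), ('b', 20);
""")

rows = conn.execute("""
    WITH ranked AS (
        SELECT customer,
               amount,
               ROW_NUMBER() OVER (PARTITION BY customer ORDER BY amount DESC) AS rn
        FROM orders
    )
    SELECT customer, amount
    FROM ranked
    WHERE rn = 1
    ORDER BY customer
""").fetchall()

print(rows)
```

The same PARTITION BY / ROW_NUMBER pattern carries over directly to Spark SQL and dbt models, where deduplicating to the latest row per key is a common staging-layer transformation.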

Benefits

  • Attractive salary range, with room for negotiation for strong candidates.
  • Hybrid/remote-friendly culture.
  • Flexible working hours with async teamwork.
  • Work equipment support.
  • Allowance for certification and skill development.
  • Year-end bonus and performance-based rewards.
  • 22 paid leave days starting from the 5th year, including the option to take a full month off.
  • Career growth supported by personal coaching sessions.

Interested in this position?

Apply directly on the company website.
