qode

qode is a company focused on unlocking global opportunities and unleashing potential through no-code solutions. It provides tools and services that help individuals and businesses build software without traditional coding skills.

Internet Software & Services

Description

  • Lead and support data platform modernization projects.
  • Design and develop scalable data pipelines using AWS native services.
  • Optimize ETL processes for efficient data transformation and processing.
  • Migrate workflows from on-premise systems to AWS cloud environments.
  • Design automations and integrations to resolve data inconsistencies and quality issues.
  • Perform system testing and validation to confirm successful integration and functionality.
  • Implement security and compliance controls in the cloud environment.
  • Validate data quality before and after migration, addressing completeness, consistency, and accuracy issues.
  • Collaborate with data architects and lead developers to document manual data movement workflows and define automation strategies.

Requirements

  • 10+ years of core data engineering experience using AWS native technologies.
  • Experience with AWS Glue, Python, Snowflake, S3, and Redshift.
  • Proficiency with Snowflake for data transformations, ETL optimization, and scalable data processing.
  • Experience with streaming and batch data pipeline architectures.
  • Familiarity with DataOps concepts, source control, and CI/CD pipelines on AWS.
  • Hands-on experience with Databricks and willingness to expand capabilities.
  • Experience with data engineering and storage solutions such as AWS Glue, EMR, Lambda, Redshift, and S3.
  • Strong problem-solving and analytical skills.
  • Knowledge of Dataiku is required.
  • Graduate or post-graduate degree in Computer Science or a related field.
  • Experience with AWS Athena for querying data lakes.
  • Experience integrating with external systems via standards such as FHIR.
  • Experience with secure data handling tools such as KMS and Macie.
  • Experience with cloud-native analytics, multi-account/multi-region data architecture, and BI tools such as Power BI, Tableau, or QuickSight.

Interested in this position?

Apply directly on the company website

Apply Now

