Data Architect / Data Engineer - Data Lake

Full-time
Mid Level
DevOps and Infrastructure
NEORIS

NEORIS is a leading global IT consulting company specializing in nearshore outsourcing services and SAP solutions, empowering companies to innovate through digital transformation.

Internet Software & Services · 5K-10K employees · Founded 2000

Description

  • Design scalable and governed Data Lake architectures.
  • Define and implement standardized ETL/ELT data ingestion frameworks.
  • Model and organize data across raw, curated, and trusted zones.
  • Manage data cataloging and lineage.
  • Expose data for consumption through APIs, BI tools, and data sharing solutions.
  • Implement data quality controls, validations, and monitoring.
  • Ensure compliance with governance, security, privacy, audit, and traceability requirements.
  • Optimize cost and performance in AWS environments.
  • Collaborate with business, architecture, and security teams to align solutions with enterprise standards.
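As a rough illustration of the raw-to-curated promotion step that a standardized ingestion framework like the one described above would formalize, here is a minimal sketch. The field names (`event_id`, `event_date`) and the Hive-style `key=value` partition layout are illustrative assumptions, not part of the posting:

```python
from collections import defaultdict

def promote_to_curated(raw_records):
    """Validate raw events and group them into partitioned curated batches.

    A minimal sketch of a raw -> curated zone promotion: records that fail
    basic quality checks are quarantined rather than silently dropped, and
    valid records are keyed by a Hive-style date partition (the layout that
    catalog/query services such as Glue and Athena conventionally rely on).
    Field names here are hypothetical.
    """
    curated = defaultdict(list)
    quarantined = []
    for rec in raw_records:
        # Data-quality control: reject records missing required fields.
        if not rec.get("event_id") or "event_date" not in rec:
            quarantined.append(rec)
            continue
        partition = f"event_date={rec['event_date']}"
        curated[partition].append(rec)
    return dict(curated), quarantined
```

In a production framework this validate/quarantine/partition step would typically run inside an orchestrated job (Glue, EMR, or Airflow) and write partitioned files to the curated S3 zone; the sketch keeps only the core logic.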

Requirements

  • 3-5+ years of experience in Data Engineering or Data Architecture.
  • Proven experience designing and implementing production Data Lakes on AWS.
  • Experience working on projects with regulatory or compliance requirements in sectors such as financial services, retail, or telecom.
  • Experience defining standards and best practices, not only executing technical implementations.
  • Experience handling large data volumes and multiple structured and unstructured data sources.
  • Strong experience with AWS storage, processing, orchestration, catalog, governance, query, and security services.
  • Knowledge of Amazon S3, AWS Glue, AWS Lambda, Amazon EMR, AWS Step Functions, MWAA (Airflow), AWS Glue Data Catalog, AWS Lake Formation, Amazon Athena, Redshift Spectrum, IAM, and KMS.
  • Desirable experience with streaming technologies such as Amazon Kinesis or MSK.
  • Knowledge of lakehouse architectures and modern data patterns.
  • Knowledge of dimensional and analytics-oriented data modeling.
  • Experience with data governance, including catalog, lineage, quality, and ownership.
  • Experience with data quality frameworks, including validation, profiling, and rules.
  • Experience optimizing AWS costs through storage tiering and partitioning.
  • Good understanding of data security and compliance.
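The storage-tiering requirement above can be sketched as a simple age-based policy. The thresholds below are illustrative assumptions, not AWS defaults; in practice this logic lives in an S3 lifecycle configuration rather than application code:

```python
from datetime import date

def storage_class_for(last_access: date, today: date) -> str:
    """Pick an S3 storage tier by data age.

    A simplified sketch of lifecycle-style tiering for cost optimization:
    hot data stays in STANDARD, cooling data moves to the infrequent-access
    tier, and cold data is archived. The 30/90-day cutoffs are hypothetical.
    """
    age_days = (today - last_access).days
    if age_days <= 30:
        return "STANDARD"
    if age_days <= 90:
        return "STANDARD_IA"
    return "GLACIER"
```

Combined with partitioning (so queries scan only the partitions they need), this kind of tiering is the usual lever for the cost/performance optimization the role calls for.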

Benefits

  • Fully on-payroll (100% nominal salary) employment scheme.
  • Statutory benefits.
  • Meal vouchers.
  • Additional benefits.
  • Professional development plan.
  • Multicultural collaboration environment.

Interested in this position?

Apply directly on the company website
