Leadtech

Leadtech is a Barcelona-based online innovation technology company that has grown rapidly since 2007, becoming an industry leader in online project management with a global team of over 570 professionals.

IT Services
251–1,000 employees
Founded 2009

Description

  • Define and implement the overall data architecture on GCP, including BigQuery/Databricks warehousing, Google Cloud Storage data lakes, and Data Mart designs.
  • Integrate Infrastructure as Code (Terraform) to provision and manage cloud resources and enforce repeatable deployments.
  • Design, build, and optimize ETL/ELT pipelines and workflows using Apache Airflow, dbt, Dataflow, Pub/Sub, and BigQuery for both batch and streaming data.
  • Implement event-driven and asynchronous data workflows between microservices using RabbitMQ (and similar), Docker, and Kubernetes.
  • Implement and maintain CI/CD pipelines and DevOps practices for data engineering components, including automated testing and deployments.
  • Enforce data quality standards using Great Expectations (or similar) and define/validate expectations for critical datasets.
  • Define and uphold metadata management, data lineage, auditing, and security best practices (encryption, IAM) to ensure compliance with GDPR/CCPA where applicable.
  • Collaborate with Data Science, Analytics, and Product teams to enable analytics and ML use cases and maintain Data Mart environments optimized for stakeholders.
  • Automate job scheduling and data transformations to deliver timely insights for reporting, analytics, and machine learning.
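The batch ETL/ELT work described above can be sketched in miniature. This is an illustrative example only, not Leadtech's actual pipeline: the table and field names are hypothetical, and a plain Python dict stands in for the BigQuery warehouse so the sketch has no GCP dependencies.

```python
from datetime import datetime, timezone

# Hypothetical raw events, as they might land in a data-lake bucket.
RAW_EVENTS = [
    {"user_id": "u1", "event": "install", "ts": "2024-05-01T10:00:00+00:00"},
    {"user_id": "u2", "event": "purchase", "ts": "2024-05-01T11:30:00+00:00", "amount": 4.99},
    {"user_id": "u1", "event": "purchase", "ts": "2024-05-02T09:15:00+00:00", "amount": 9.99},
]

def extract():
    """Stand-in for reading raw events from a data-lake landing zone."""
    return list(RAW_EVENTS)

def transform(events):
    """Keep only purchase events and normalize timestamps to UTC dates."""
    rows = []
    for e in events:
        if e["event"] != "purchase":
            continue
        dt = datetime.fromisoformat(e["ts"]).astimezone(timezone.utc)
        rows.append({"user_id": e["user_id"],
                     "event_date": dt.date().isoformat(),
                     "amount": e["amount"]})
    return rows

def load(warehouse, table, rows):
    """Append transformed rows to a table in the stand-in warehouse."""
    warehouse.setdefault(table, []).extend(rows)
    return len(rows)

warehouse = {}
loaded = load(warehouse, "fact_purchases", transform(extract()))
```

In a production setting, `extract`, `transform`, and `load` would typically become orchestrated tasks (e.g., Airflow operators or dbt models) rather than plain functions.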

Requirements

  • 3+ years of professional data engineering experience, including at least 1 year working with mobile data.
  • Proven experience building and maintaining Databricks and BigQuery environments and Google Cloud Storage–based data lakes.
  • Deep knowledge of Apache Airflow for orchestration and ETL/ELT design, and experience implementing dbt for version-controlled transformations.
  • Experience with streaming and messaging technologies such as Pub/Sub, Dataflow (Apache Beam), and RabbitMQ.
  • Experience with Infrastructure as Code using Terraform and familiarity with CI/CD and DevOps tools (e.g., Ansible, Jenkins, GitLab CI).
  • Strong programming skills in Python, Java, or Scala and experience scripting for automation.
  • Experience with containerization and orchestration using Docker and Kubernetes (K8s).
  • Hands-on experience with data quality and governance tools (Great Expectations or similar) and designing for data lineage, metadata management, and compliance (GDPR, CCPA).
  • Understanding of OLTP and OLAP systems and ability to design solutions for both.
  • Excellent communication, organization, self-motivation, and problem-solving skills.
  • Preferred: familiarity with ML workflows and Vertex AI, observability tools (Prometheus, Grafana, Datadog, New Relic), real-time technologies (Kafka, Spark Streaming), compliance frameworks (HIPAA, SOC 2), and GCP certifications (e.g., Google Professional Data Engineer).
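To illustrate the data-quality requirement above, an expectation-style check can be sketched in plain Python. The function names below echo Great Expectations conventions but are hypothetical stand-ins, not the library's actual API:

```python
def expect_column_values_not_null(rows, column):
    """Flag rows where `column` is missing or null (stand-in, not the GE API)."""
    failures = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"success": not failures, "failed_rows": failures}

def expect_column_values_between(rows, column, min_value, max_value):
    """Flag rows whose numeric value falls outside an inclusive range."""
    failures = [i for i, r in enumerate(rows)
                if not (min_value <= r.get(column, min_value - 1) <= max_value)]
    return {"success": not failures, "failed_rows": failures}

# Hypothetical rows from a critical dataset.
rows = [
    {"user_id": "u1", "amount": 9.99},
    {"user_id": None, "amount": 4.99},
    {"user_id": "u3", "amount": -1.0},
]

null_check = expect_column_values_not_null(rows, "user_id")
range_check = expect_column_values_between(rows, "amount", 0.0, 100.0)
```

With a real tool, such expectations would be versioned alongside the pipeline and evaluated automatically on each load, failing the run (or quarantining rows) when a check does not pass.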

Benefits

  • Competitive salary with a full-time permanent contract.
  • Top-tier private health insurance including dental and psychological services.
  • Flexible work arrangement: fully remote, or from the Barcelona office with flexible start and end times (flextime).
  • Free Friday afternoons (7-hour workday) and a 35-hour workweek in July and August.
  • 25 days of vacation plus your birthday off, with a flexible vacation policy and no blackout days.
  • Annual budget for external learning, personalized internal training, and growth/career development support.
  • Barcelona office perks: free coffee, fresh fruit, snacks, game room, rooftop terrace; plus ticket restaurant and nursery vouchers.

Interested in this position?

Apply directly on the company website


