Leadtech

Leadtech is a Barcelona-based online technology company that has grown rapidly since 2007, becoming an industry leader in online project management with a global team of over 570 professionals.

IT Services
251-1,000 employees
Founded 2009

Description

  • Support the development and maintenance of the data platform on GCP, including BigQuery/Databricks warehousing and Google Cloud Storage data lake storage.
  • Help organize data into layered architecture and domain-focused Data Marts for analytics and reporting.
  • Assist with Terraform-based infrastructure as code to provision and manage cloud resources.
  • Build, maintain, and improve ETL/ELT pipelines using Apache Airflow for workflow orchestration.
  • Develop and maintain dbt transformations to create clean, version-controlled data models in BigQuery.
  • Support data ingestion and processing with Google Dataflow, Apache Beam, and Pub/Sub where needed.
  • Monitor scheduled jobs, troubleshoot failures, and help ensure timely delivery of data for analytics and reporting.
  • Implement and maintain data quality checks using Great Expectations, dbt tests, or similar tools.
  • Document datasets, metadata, lineage, and audit processes.
  • Partner with Analytics, Product, and Data Science teams to provide reliable datasets for dashboards, reporting, experimentation, and machine learning use cases.
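The layered flow described above (raw ingestion, quality-checked staging, domain Data Marts) can be sketched in plain Python. This is a hypothetical illustration only: the event fields, table names, and aggregation are invented for the example and are not taken from the posting; in practice the cleaning step would be a dbt model with tests or a Great Expectations suite, and the orchestration would live in an Airflow DAG.

```python
# Hypothetical sketch of a layered data flow: raw events are cleaned into a
# staging layer, then aggregated into a domain-focused Data Mart.
# All names and values below are illustrative, not from the job posting.
from collections import defaultdict

RAW_EVENTS = [  # "raw" layer: data as ingested, which may contain bad rows
    {"user": "a", "amount": "10.5", "country": "ES"},
    {"user": "b", "amount": "oops", "country": "ES"},  # malformed amount
    {"user": "a", "amount": "4.5", "country": "FR"},
]

def to_staging(rows):
    """Staging layer: cast types and drop rows that fail a basic quality check
    (the kind of check a dbt test or Great Expectations suite would enforce)."""
    staged = []
    for row in rows:
        try:
            staged.append({**row, "amount": float(row["amount"])})
        except ValueError:
            continue  # quarantine or alert on the bad row in a real pipeline
    return staged

def to_mart(staged):
    """Data Mart layer: revenue per country, ready for dashboards."""
    mart = defaultdict(float)
    for row in staged:
        mart[row["country"]] += row["amount"]
    return dict(mart)

print(to_mart(to_staging(RAW_EVENTS)))  # {'ES': 10.5, 'FR': 4.5}
```

In a production setup each layer would typically be a BigQuery dataset, with dbt materializing the staging and mart models and Airflow scheduling the runs.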

Requirements

  • 1+ year of experience in data engineering or a related data role.
  • Experience with mobile, product, or marketing data is a plus.
  • Hands-on experience with GCP services such as BigQuery and Google Cloud Storage.
  • Familiarity with Apache Airflow for scheduling and orchestrating data workflows.
  • Some experience with dbt or similar transformation tools.
  • Exposure to Pub/Sub, Dataflow, or other batch/streaming tools is a plus.
  • Understanding of Data Mart concepts and interest in infrastructure as code tools such as Terraform.
  • Good coding skills in Python; Java or Scala is a plus.
  • Ability to write scripts for automation and data processing tasks.
  • Familiarity with Docker and basic container concepts.
  • Exposure to CI/CD and version control workflows such as GitHub Actions, GitLab CI, Jenkins, or similar.
  • Understanding of data quality principles and experience with dbt tests, Great Expectations, or similar tools is a plus.
  • Basic knowledge of data governance concepts such as lineage, metadata, and access control.
  • Awareness of privacy and compliance principles such as GDPR is a plus.
  • General understanding of OLTP and OLAP systems.
  • Clear communication skills and willingness to work closely with technical and non-technical stakeholders.
  • Organized, proactive, and eager to learn.
  • Strong problem-solving mindset and attention to detail.
  • Interest in machine learning workflows and exposure to tools such as Vertex AI or similar ML platforms.
  • Familiarity with monitoring and observability tools such as Prometheus, Grafana, Datadog, or New Relic.
  • Exposure to real-time or streaming data tools such as Kafka, Spark Streaming, or similar technologies is a plus.
  • Relevant cloud or data certifications, especially GCP certifications, are a plus.
  • Experience working with LLMs and building automations around them is a plus.

Benefits

  • Competitive salary and full-time permanent contract.
  • Top-tier private health insurance, including dental and psychological services.
  • 25 days of vacation plus your birthday off, with flexible vacation options and no blackout days.
  • Flexible start and end times.
  • Option to work fully remote, hybrid, or from the Barcelona office.
  • Free Friday afternoons with a 7-hour workday.
  • 35-hour workweek in July and August.
  • Annual budget for external learning opportunities and personalized internal training.
  • Office perks including free coffee, fresh fruit, snacks, a game room, and a rooftop terrace in Barcelona.
  • Ticket restaurant and nursery vouchers paid directly from gross salary.

Interested in this position?

Apply directly on the company website


