Massive Rocket

Massive Rocket is a global Braze & Snowflake agency that helps companies use data to understand their customers, automate communications, and generate predictable growth. They specialize in building beautiful digital experiences and increasing customer...

Media
51-250
Founded 2018

Responsibilities

  • Design, build, and maintain scalable ETL/ELT pipelines across internal and external systems.
  • Develop and manage integrations between data warehouses, APIs, CDPs, CRM tools, and marketing platforms.
  • Build and optimise data ingestion frameworks for both batch and real-time architectures.
  • Manage API integrations, webhook frameworks, file-based ingestion, and event-driven pipelines.
  • Work closely with engineering and platform teams to ensure data is structured, accessible, and reliable.
  • Improve observability, monitoring, alerting, and performance across data pipelines and platform integrations.
  • Ensure governance, security, and documentation standards are applied across all integrations.
  • Troubleshoot data delivery issues, system failures, and performance bottlenecks.
  • Support scalable architecture patterns that can be reused across multiple clients.
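Several of the responsibilities above involve webhook frameworks and event-driven ingestion feeding batch pipelines. As a minimal sketch of that pattern (all class, field, and event names here are illustrative assumptions, not from the posting), incoming webhook payloads can be validated on arrival and buffered into batches before being flushed downstream:

```python
import json
from dataclasses import dataclass, field


@dataclass
class EventBuffer:
    """Validates incoming webhook payloads and flushes them in batches --
    a common bridge between real-time sources and a batch warehouse load.
    Names and thresholds are illustrative only."""
    max_size: int = 100
    events: list = field(default_factory=list)
    flushed: list = field(default_factory=list)

    def ingest(self, payload: str) -> None:
        event = json.loads(payload)  # reject malformed JSON up front
        if "event_type" not in event:
            raise ValueError("missing event_type")
        self.events.append(event)
        if len(self.events) >= self.max_size:
            self.flush()

    def flush(self) -> None:
        # In a real pipeline this would write to a queue or warehouse;
        # here we just move events to a "flushed" list.
        self.flushed.extend(self.events)
        self.events.clear()
```

Buffering like this is one way to reconcile the batch and real-time architectures mentioned above: the same ingestion path serves both, with only the flush trigger differing.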

Requirements

  • Strong experience building and maintaining APIs, webhooks, and platform integrations.
  • Strong understanding of ETL/ELT design patterns and orchestration tools.
  • Experience with cloud infrastructure such as AWS, Azure, or Google Cloud.
  • Experience with tools such as dbt, Airflow, Fivetran, Stitch, Hightouch, Segment, or similar platforms.
  • Familiarity with event streaming or messaging systems such as Kafka, Kinesis, Pub/Sub, or SQS.
  • Experience working with multiple source systems, including CRM, CDP, product analytics, and transactional systems.
  • Strong SQL and Python skills with the ability to work across multiple data environments.
  • Experience building reusable frameworks, automation, and monitoring processes.
  • Strong problem-solving skills and comfort operating across complex technical environments.
  • Preferred: experience with reverse ETL, composable CDPs, audience activation tooling, Terraform, Docker, or Kubernetes, and prior work in highly technical agency or consulting environments.
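One ETL/ELT design pattern implied by the requirements above is idempotent loading: re-running the same batch must not duplicate rows. A hedged sketch using SQLite as a stand-in for a warehouse (the table and column names are invented for illustration):

```python
import sqlite3


def upsert_users(conn: sqlite3.Connection, rows: list[tuple]) -> None:
    """Idempotent load step: replaying the same batch leaves the target
    table unchanged, and re-sent rows update in place rather than duplicate."""
    conn.executemany(
        """
        INSERT INTO users (user_id, email) VALUES (?, ?)
        ON CONFLICT(user_id) DO UPDATE SET email = excluded.email
        """,
        rows,
    )
    conn.commit()


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (user_id INTEGER PRIMARY KEY, email TEXT)")
upsert_users(conn, [(1, "a@example.com"), (2, "b@example.com")])
upsert_users(conn, [(1, "a@new.com")])  # replayed batch with an update
```

The same merge-on-key idea carries over to warehouse `MERGE` statements and dbt incremental models; the key design choice is declaring a stable unique key per source entity.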

Benefits

  • Remote-first work from anywhere.
  • Clear career progression and real ownership opportunities.
  • Collaborative global team with colleagues across Europe, the US, and beyond.
  • Fast growth environment with strong learning and development opportunities.
  • Meetups, events, and team experiences.
  • Supportive culture focused on people and collaboration.

Interested in this position?

Apply directly on the company website

Similar Roles

Senior DataOps Engineer

Smile Digital Health · 251-1K · IT Services

Smile Digital Health is hiring a Senior DataOps Engineer to own analytics infrastructure and support cloud-based data processing and deployment environments for its healthcare data platform.

Ansible · Apache Airflow · Apache Spark · AWS · Azure · CI/CD · Databricks · GCP · GitHub Actions · GitLab CI · Java · Jenkins · Kubernetes · Linux · Prefect · Python · Scala · Terraform

Migrations Analyst

Houseful · 251-1K · Real Estate

Alto Software Group is hiring a Migrations Analyst to manage high-volume technical data migrations for new customers onboarding to its Alto software platform, ensuring legacy systems are transitioned accurately and efficiently.

FTP · MySQL · SQL · SQL Server

Principal Data Engineer

Tenna · 51-250 · Construction & Engineering

Tenna is hiring a Principal Data Engineer to define and lead the data architecture for its connected equipment platform, shaping the ingestion, storage, processing, and API systems that support company growth.

Apache Airflow · Apache Spark · AWS · Docker · IoT · Machine Learning · Node.js · Python · RabbitMQ · SQL

[28730] Senior Data Developer - Exclusive opportunity for people with disabilities

CI&T · 5K-10K · Internet Software & Services

CI&T is seeking a Senior Data Developer to build scalable, intelligent data solutions for large-scale data environments in a remote role focused on data engineering and platform delivery.

Apache Spark · CI/CD · Databricks · HDFS · Hive · Impala · Python · SQL · Teradata
