Massive Rocket

Massive Rocket is a global Braze & Snowflake agency that helps companies use data to understand their customers, automate communications, and generate predictable growth. They specialize in building beautiful digital experiences and increasing customer...

Media
51-250
Founded 2018

Description

  • Design and implement Snowflake data solutions across real-time and batch architectures.
  • Set up and maintain ETL pipelines using Snowflake features such as dynamic tables, tasks, and storage integrations.
  • Integrate Snowflake solutions with existing data architecture and data platforms.
  • Optimize Snowflake database performance through data model design and clustering strategies.
  • Deploy and configure monitoring and reporting tools for Snowflake performance.
  • Ensure data quality and integrity through QA processes and unit testing.
  • Analyze large datasets and translate findings into clear stories, insights, and recommendations for stakeholders.
  • Build data models that support Braze and customer engagement initiatives.
  • Collaborate with data engineers, analysts, CDP and CRM teams, and other stakeholders on data strategy and implementation.
  • Provide technical guidance on Snowflake best practices, optimization, and unused platform features that may improve efficiency or reduce costs.

Requirements

  • Bachelor's degree in Computer Science, Information Technology, or equivalent work experience.
  • 8+ years of experience as a Snowflake developer or in a similar role.
  • 6+ years of experience with SQL, data modeling, and ETL processes.
  • 2+ years of experience with Python.
  • Experience with dbt.
  • Experience with Snowflake features such as Snowpipe and Secure Data Sharing.
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud and with data integration tools.
  • Proven ability to use AI tools to improve day-to-day productivity.
  • Excellent problem-solving skills and strong attention to detail.
  • Strong communication skills with the ability to explain complex data concepts to non-technical stakeholders.
  • Ability to work independently and collaboratively in a fast-paced environment.
  • Experience working in an agency setting or with external clients.
  • English at C1 level.
  • Preferred: Snowflake certification.
  • Preferred: experience with other data warehousing solutions such as Redshift or BigQuery.
  • Preferred: knowledge of data visualization tools such as Tableau or Power BI.
  • Preferred: understanding of data governance and security best practices.
  • Preferred: experience working in Scrum.

Benefits

  • Remote-first work environment.
  • Clear career progression and real ownership opportunities.
  • Supportive, collaborative team culture.
  • Global team with colleagues across Europe, the US, and beyond.
  • Meetups, events, and team experiences.
  • Opportunities for fast learning and rapid professional growth.
  • BYOD policy for work equipment.

Interested in this position?

Apply directly on the company website
