Remotebase

Remotebase: Hire top 1% global developers in 24hrs. Build your team hassle-free.

Internet Software & Services
51-250
Founded 2020
$4M raised

Description

  • Design, build, and maintain scalable CI/CD pipelines for data applications and infrastructure.
  • Implement and manage dbt projects, including models, tests, documentation, and CI/CD integration.
  • Develop infrastructure as code with Terraform to provision and configure cloud data resources on GCP.
  • Automate deployment, monitoring, and management of Snowflake data warehouse environments.
  • Collaborate with data engineers and data scientists to deliver automated solutions for data ingestion, processing, and delivery.
  • Implement monitoring, logging, and alerting for pipelines and infrastructure to support high availability and rapid issue resolution.
  • Develop and maintain automation scripts and tools, primarily in Python and Bash, to streamline operational tasks.
  • Troubleshoot and resolve issues across data infrastructure, pipelines, and deployments.
  • Participate in code reviews for infrastructure code, dbt models, and automation scripts.
  • Document architectures, configurations, and operational procedures, and support data governance and data quality initiatives.
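As an illustration of the dbt CI/CD automation duties above, here is a minimal Python sketch that inspects a dbt `run_results.json` artifact and reports failing nodes. The artifact layout (a top-level `"results"` list whose entries carry `"unique_id"` and `"status"`) follows dbt's documented run-results schema; the file path and sample data are illustrative, not taken from the posting.

```python
import json
from pathlib import Path


def failed_results(run_results: dict) -> list[str]:
    """Return unique_ids of dbt results whose status indicates failure.

    Assumes the documented dbt run_results.json layout: a top-level
    "results" list whose entries carry "unique_id" and "status".
    Test statuses of "fail" and "error" are treated as failures.
    """
    bad = {"fail", "error"}
    return [
        r["unique_id"]
        for r in run_results.get("results", [])
        if r.get("status") in bad
    ]


def check_artifact(path: str) -> list[str]:
    """Load a run_results.json file from disk and return failing node ids."""
    return failed_results(json.loads(Path(path).read_text()))


# Inline sample standing in for target/run_results.json:
sample = {
    "results": [
        {"unique_id": "model.analytics.orders", "status": "success"},
        {"unique_id": "test.analytics.not_null_orders_id", "status": "fail"},
    ]
}
print(failed_results(sample))  # ['test.analytics.not_null_orders_id']
```

A wrapper like this could gate a CI step: exit non-zero (or page an on-call channel) when the returned list is non-empty.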

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent experience.
  • 5+ years of hands-on experience in a DevOps, SRE, or infrastructure engineering role.
  • 3+ years of experience automating and managing data infrastructure and pipelines.
  • 1+ years of experience enabling AI/ML features in data products or platforms.
  • Strong experience with Infrastructure as Code tools, particularly Terraform.
  • Proven experience with dbt for data transformation in a production environment.
  • Hands-on experience managing and optimizing Snowflake data warehouse environments.
  • Strong proficiency in Python for automation, scripting, and data-related tasks.
  • Strong Bash scripting skills.
  • Solid understanding of CI/CD principles and tools such as Bitbucket Pipelines, Jenkins, GitLab CI, GitHub Actions, or Azure DevOps.
  • Experience with cloud platforms, preferably GCP, and their data services.
  • Experience with containerization technologies such as Docker or Kubernetes is a plus.
  • Knowledge of data integration tools and ETL/ELT concepts.
  • Familiarity with monitoring and logging tools.
  • Strong SQL skills.
  • Demonstrable experience with data modeling techniques for ODS, dimensional modeling, and semantic models for analytics and BI.
  • Ability to collaborate effectively in an agile environment and communicate complex technical concepts clearly.
  • Hands-on experience in building business intelligence solutions is preferred.
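The SQL and dimensional-modeling requirements above can be illustrated with a minimal star-schema sketch using Python's built-in SQLite module: one dimension table, one fact table, and an analytics-style rollup. Table and column names are illustrative, not taken from the posting.

```python
import sqlite3

# Minimal star schema: a customer dimension joined to an orders fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT,
        region        TEXT
    );
    CREATE TABLE fact_orders (
        order_id     INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        amount       REAL
    );
    INSERT INTO dim_customer VALUES (1, 'Acme', 'EMEA'), (2, 'Globex', 'APAC');
    INSERT INTO fact_orders VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 50.0);
""")

# Analytics-style rollup: revenue per region via a fact-to-dimension join.
rows = conn.execute("""
    SELECT d.region, SUM(f.amount) AS revenue
    FROM fact_orders f
    JOIN dim_customer d USING (customer_key)
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
print(rows)  # [('APAC', 50.0), ('EMEA', 200.0)]
```

In a production warehouse such as Snowflake the same pattern scales to surrogate-keyed dimensions and semantic models consumed by BI tools; SQLite here just keeps the sketch self-contained.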

Benefits

  • Fully remote work.
  • Flexible hours and control over your work schedule.
  • Market-competitive compensation.
  • Strong learning and growth opportunities.

Interested in this position?

Apply directly on the company website
