Remotebase

Remotebase: Hire top 1% global developers in 24hrs. Build your team hassle-free.

Internet Software & Services
51-250 employees
Founded 2020
$4M raised

Description

  • Design, build, and maintain scalable CI/CD pipelines for data applications and infrastructure.
  • Implement and manage dbt projects, including models, tests, documentation, and CI/CD integration.
  • Develop infrastructure as code with Terraform to provision and configure cloud data resources on GCP.
  • Automate deployment, monitoring, and management of Snowflake data warehouse environments.
  • Collaborate with data engineers and data scientists to deliver automated solutions for ingestion, processing, and data delivery.
  • Implement monitoring, logging, and alerting for pipelines and infrastructure to support high availability and rapid issue resolution.
  • Develop and maintain automation scripts and tools, primarily in Python and Bash, to streamline operational tasks.
  • Troubleshoot and resolve issues across data infrastructure, pipelines, and deployments.
  • Participate in code reviews for infrastructure code, dbt models, and automation scripts.
  • Document architectures, configurations, and operational procedures, and support data governance and data quality initiatives.
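To illustrate the Python automation work described above, here is a minimal sketch of a helper that assembles a dbt CLI invocation for a CI/CD pipeline. All names are hypothetical and for illustration only, not Remotebase's actual tooling; keeping command construction in a pure function makes the pipeline logic unit-testable without invoking dbt.

```python
from typing import Optional

def build_dbt_command(
    command: str,
    select: Optional[str] = None,
    target: str = "prod",
    full_refresh: bool = False,
) -> list[str]:
    """Assemble a dbt CLI invocation as an argument list for subprocess.run.

    Hypothetical helper: validates the subcommand, then appends the
    standard --target / --select / --full-refresh flags as needed.
    """
    if command not in {"run", "test", "build", "seed"}:
        raise ValueError(f"unsupported dbt command: {command}")
    args = ["dbt", command, "--target", target]
    if select:
        args += ["--select", select]
    # --full-refresh only applies to commands that materialize models.
    if full_refresh and command in {"run", "build"}:
        args.append("--full-refresh")
    return args

# In a CI job the result would be passed to subprocess.run(..., check=True).
print(build_dbt_command("run", select="staging", full_refresh=True))
# → ['dbt', 'run', '--target', 'prod', '--select', 'staging', '--full-refresh']
```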

Requirements

  • Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent experience.
  • 5+ years of hands-on experience in a DevOps, SRE, or infrastructure engineering role.
  • 3+ years of experience automating and managing data infrastructure and pipelines.
  • 1+ years of experience enabling AI features.
  • Strong experience with Infrastructure as Code tools, particularly Terraform.
  • Proven experience with dbt for data transformation in a production environment.
  • Hands-on experience managing and optimizing Snowflake data warehouse environments.
  • Strong proficiency in Python for automation, scripting, and data-related tasks.
  • Strong Bash scripting skills.
  • Solid understanding of CI/CD principles and tools such as Bitbucket Pipelines, Jenkins, GitLab CI, GitHub Actions, or Azure DevOps.
  • Experience with cloud platforms, preferably GCP, and their data services.
  • Experience with containerization and orchestration technologies such as Docker and Kubernetes is a plus.
  • Knowledge of data integration tools and ETL/ELT concepts.
  • Familiarity with monitoring and logging tools.
  • Strong SQL skills.
  • Demonstrable experience with data modeling techniques, including ODS design, dimensional modeling, and semantic models for analytics and BI.
  • Ability to collaborate effectively in an agile environment and communicate complex technical concepts clearly.
  • Hands-on experience in building business intelligence solutions is preferred.
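As a sketch of the dimensional-modeling skills listed above: dimension loads commonly derive a deterministic surrogate key by hashing the business key, so the same source record always maps to the same warehouse key. This hashing convention is one common approach, not necessarily this team's; the function below is a hypothetical illustration.

```python
import hashlib

def surrogate_key(*business_keys: str) -> str:
    """Derive a deterministic surrogate key from one or more business keys.

    Normalizing (strip + lowercase) keeps loads idempotent across source
    formatting drift; the '||' separator prevents ('ab', 'c') and
    ('a', 'bc') from colliding after concatenation.
    """
    raw = "||".join(k.strip().lower() for k in business_keys)
    return hashlib.md5(raw.encode("utf-8")).hexdigest()

print(surrogate_key("ACME Corp", "US"))
```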

Benefits

  • Fully remote work.
  • Flexible hours and control over your work schedule.
  • Market-competitive compensation.
  • Strong learning and growth opportunities.

Interested in this position?

Apply directly on the company website
