Remedy Product Studio

Remedy Product Studio supports founders and established companies in building the next generation of great digital products. It works on digital product strategy, software execution, launch, and investment. Early-stage companies part...

Professional Services
51-250 employees
Founded 2012

Responsibilities

  • Design, develop, and maintain scalable ETL/ELT pipelines and end-to-end data workflows.
  • Optimize data storage, processing, and retrieval using cloud-based technologies with a strong focus on AWS.
  • Work with structured and unstructured data across databases and data lakes to support analytics and operations.
  • Ensure data integrity, security, and compliance with industry standards and best practices.
  • Collaborate with product managers, engineers, and analysts to define data requirements and support analytics and machine learning initiatives.
  • Monitor and improve data infrastructure performance to ensure reliability and scalability.
  • Implement and maintain data governance, quality, and observability practices.
  • Design and optimize data models and develop robust data ingestion processes to ensure seamless data flow across systems.
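For illustration only, the extract-transform-load workflow these responsibilities describe reduces, at its smallest, to a sketch like the one below. All data, field names, and the pipeline shape are hypothetical, not part of the role:

```python
# Toy ETL step: extract rows from CSV, validate/transform them, load into a store.
import csv
import io

RAW = """user_id,amount
1,10.50
2,not_a_number
3,7.25
"""

def extract(raw: str) -> list[dict]:
    """Parse raw CSV text into row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Cast fields to typed values, dropping rows that fail validation
    (a crude stand-in for the data-quality checks mentioned above)."""
    clean = []
    for row in rows:
        try:
            clean.append({"user_id": int(row["user_id"]),
                          "amount": float(row["amount"])})
        except ValueError:
            continue  # a real pipeline would route this to a dead-letter sink
    return clean

def load(rows: list[dict]) -> dict[int, float]:
    """'Load' into an in-memory store keyed by user_id."""
    return {r["user_id"]: r["amount"] for r in rows}

store = load(transform(extract(RAW)))
print(store)  # the invalid second row is dropped
```

In production the same extract/transform/load stages would typically read from S3, run on Spark or Glue, and land in Redshift, but the validation-and-routing structure stays the same.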

Requirements

  • 5+ years of experience in data engineering or a related field.
  • Strong expertise in AWS data services (e.g., Redshift, Glue, Athena, S3, Lambda, Kinesis).
  • Experience with big data technologies such as Apache Spark and Apache Airflow or similar tools.
  • Strong hands-on experience in Python.
  • Strong knowledge of SQL, data modeling, and schema design.
  • Knowledge of CI/CD pipelines and infrastructure-as-code (Terraform, CloudFormation) is a plus.
  • Strong problem-solving skills and ability to work in a fast-paced environment.
  • Excellent communication skills and ability to collaborate with technical and non-technical stakeholders.
  • Upper-intermediate English (B2) or higher, written and spoken, for clear communication with native speakers.
  • Nice to have: experience with real-time data streaming and event-driven architectures, exposure to machine learning workflows and MLOps, knowledge of security and compliance standards, and experience with GCP or Azure (e.g., BigQuery, Databricks, Azure Data Factory).
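As context for the "SQL, data modeling, and schema design" requirement, a minimal star-schema sketch of the kind such a role works with might look like the following. Table and column names are hypothetical; it uses standard-library SQLite purely so the example is runnable:

```python
# Tiny star schema: one dimension table, one fact table, one analytics query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_user (
    user_id INTEGER PRIMARY KEY,
    country TEXT NOT NULL
);
CREATE TABLE fact_order (
    order_id INTEGER PRIMARY KEY,
    user_id  INTEGER NOT NULL REFERENCES dim_user(user_id),
    amount   REAL NOT NULL
);
""")
conn.executemany("INSERT INTO dim_user VALUES (?, ?)",
                 [(1, "US"), (2, "DE")])
conn.executemany("INSERT INTO fact_order VALUES (?, ?, ?)",
                 [(10, 1, 25.0), (11, 1, 5.0), (12, 2, 12.5)])

# The kind of aggregate query the schema is designed to serve cheaply.
rows = conn.execute("""
    SELECT d.country, SUM(f.amount)
    FROM fact_order f JOIN dim_user d USING (user_id)
    GROUP BY d.country ORDER BY d.country
""").fetchall()
print(rows)
```

Separating slowly changing attributes (the dimension) from high-volume events (the fact table) is the design choice that keeps such aggregates simple as data grows.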

Benefits

  • Competitive compensation package.
  • Vacation leave of 20 working days per year and 4 sick days per year.
  • Opportunities for career and professional growth with access to senior engineers for mentorship.
  • English classes with on-staff teachers and internal English clubs.
  • Possibility to work remotely or from a coworking space.
  • Company-provided MacBook as your work machine.
  • Collaborative team environment with team-building events, sponsored company outings, and social events.

Interested in this position?

Apply directly on the company website.

Similar Roles

Data Engineer II

Samsara 1K-5K IT Services

Samsara is hiring a remote Data Engineer II to build and scale the Databricks-based data platforms that power its Revenue Operations AI and data infrastructure for GTM analytics and generative AI applications.

Apache Spark AWS Databricks dbt Generative AI Machine Learning Python Salesforce Snowflake SQL

Synthetic Data Engineer (AI Data/Training)

Hyphen Connect 1-10 Staffing & Recruiting

A Synthetic Data Engineer at the organization will design and manage domain-specific synthetic data pipelines that support data processing and model training workflows.

Apache Airflow Apache Spark

Senior Developer / Systems & ETL Engineer

Metova 51-250 Internet Software & Services

Senior Developer / Systems & ETL Engineer responsible for building end-to-end information processing systems that span ETL, APIs, cloud-native deployment, and client-facing technical delivery.

ActiveMQ AWS Azure C CI/CD Docker Hadoop Java Kubernetes Linux Microservices MySQL Oracle OWASP Perl PostgreSQL Python RabbitMQ REST API Snowflake Spring Boot SQL SQL Server Unix

Data Engineer

NEORIS 5K-10K Internet Software & Services

NEORIS is looking for a Data Engineer to design, develop, and deploy data solutions in a Big Data and Cloud environment, aligned with the data architecture and focused on efficiency and maintainability.

Agile Apache Spark AWS Azure Cassandra Elasticsearch GCP Hadoop HDFS MongoDB Neo4j Oracle PostgreSQL Python SQL Server
