Data Engineer, Azure - Remote, Latin America

6 days, 10 hours ago
Full-time
Mid Level
Software Development
Bluelight Consulting

Bluelight Consulting is a leading nearshore software development company that provides access to highly skilled nearshore tech talent working in US time zones. It offers services such as nearshore staffing, cloud consulting, cloud migration strategies, and ...

Internet Software & Services
11-50
Founded 2015

Description

  • Develop and maintain ETL data engineering processes using Python (PySpark) in Azure Synapse Analytics notebooks and pipelines.
  • Design and build data storage structures in an MPP SQL pool using star schemas, facts, and dimensions.
  • Extract data from REST APIs, SQL database tables, and CSV files.
  • Design and optimize Azure Synapse notebooks and pipelines for scalability and performance.
  • Contribute to data fabric capabilities such as data lakes, lakehouses, delta lakes, and data cataloging.
  • Collaborate with data architects to create data models and schemas aligned with business requirements.
  • Implement data quality checks and validation processes to ensure accuracy and consistency.
  • Identify and resolve performance bottlenecks and optimize ETL jobs to meet SLAs.
  • Monitor ETL jobs, diagnose issues, and implement fixes to ensure pipeline reliability.
  • Maintain documentation for ETL processes, data flows, and transformations while supporting cross-functional teams.
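The first two responsibilities above, PySpark ETL feeding a star schema of facts and dimensions, amount to splitting raw records into dimension tables with surrogate keys and a fact table of measures. A minimal sketch of that idea follows; plain Python is used to keep it self-contained, all table and column names are hypothetical, and in this role the equivalent logic would run as PySpark in an Azure Synapse notebook writing to a dedicated (MPP) SQL pool.

```python
# Hedged sketch: split raw sales rows into a date dimension and a fact
# table -- the core move behind the star schemas mentioned above.
# All names here are hypothetical illustrations, not the employer's schema.

raw_rows = [
    {"order_date": "2024-01-02", "sku": "WIDGET", "qty": 3, "unit_price": 9.99},
    {"order_date": "2024-01-02", "sku": "GADGET", "qty": 1, "unit_price": 24.50},
    {"order_date": "2024-01-03", "sku": "WIDGET", "qty": 2, "unit_price": 9.99},
]

# Dimension: one row per distinct date, each assigned a surrogate key.
dim_date = {d: i + 1 for i, d in enumerate(sorted({r["order_date"] for r in raw_rows}))}

# Fact: measures plus a foreign key pointing into the dimension.
fact_sales = [
    {
        "date_key": dim_date[r["order_date"]],
        "sku": r["sku"],
        "qty": r["qty"],
        "revenue": round(r["qty"] * r["unit_price"], 2),
    }
    for r in raw_rows
]

print(dim_date)       # {'2024-01-02': 1, '2024-01-03': 2}
print(fact_sales[0])  # {'date_key': 1, 'sku': 'WIDGET', 'qty': 3, 'revenue': 29.97}
```

In a Synapse notebook the same split would typically be expressed with PySpark DataFrame operations (`select().distinct()` for the dimension, a join back for the fact table) and persisted to the SQL pool or a delta lake table.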

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
  • Proven experience in ETL data engineering with strong Python (PySpark) skills.
  • Experience extracting, transforming, and loading data from REST APIs, SQL database tables, and CSV files.
  • Proficiency with Azure Synapse Analytics, including Notebooks, Pipelines, Linked Services, and Azure Key Vault.
  • Ability to write complex SQL queries and optimize query performance.
  • Experience working with both Spark SQL and Microsoft SQL Server.
  • Knowledge of data integration best practices and tools.
  • Experience using version control systems such as Git and Azure DevOps.
  • Strong problem-solving and analytical skills, close attention to detail, and excellent communication and collaboration abilities.
  • Preferred: a certification related to data engineering or data science, such as Microsoft Certified: Azure Data Engineer Associate.
  • Preferred: familiarity with big data technologies, machine learning, and data analysis.
  • Preferred: experience with data visualization tools such as Power BI or Tableau and Agile methodologies.

Benefits

  • Competitive salary and bonuses, including performance-based salary increases.
  • Generous paid time off policy.
  • Flexible working hours.
  • Remote work.
  • Continuing education, training, and conferences.
  • Company-sponsored coursework, exams, and certifications.

Interested in this position?

Apply directly on the company website


