Data Engineer, Azure - Remote, Latin America

Full-time
Mid Level
Software Development
Bluelight Consulting

Bluelight Consulting is a leading nearshore software development company that provides access to highly skilled tech talent working in US time zones. Its services include nearshore staffing, cloud consulting, cloud migration strategies, and ...

Internet Software & Services
11-50
Founded 2015

Description

  • Develop and maintain ETL data engineering processes using Python and PySpark in Azure Synapse Analytics notebooks and pipelines.
  • Design and build data storage structures in MPP SQL pools using data warehousing concepts such as star schemas, facts, and dimensions.
  • Extract and integrate data from REST APIs, SQL database tables, and CSV files.
  • Design and optimize Azure Synapse Analytics notebooks and pipelines for scalability and performance.
  • Contribute to data fabric initiatives including data lakes, lakehouses, delta lakes, and data cataloging.
  • Collaborate with data architects to create data models and schemas aligned with business needs.
  • Implement data quality checks and validation processes to ensure accurate and consistent data.
  • Identify and resolve performance bottlenecks and meet SLA requirements for ETL workflows.
  • Monitor ETL jobs, troubleshoot issues, and implement reliability improvements.
  • Maintain documentation for ETL processes, data flows, and transformations.
  • Work with cross-functional teams on data-related initiatives.
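The first two responsibilities describe a classic extract-transform-load flow landing in a star schema. As a minimal, framework-agnostic sketch of that pattern (the role itself uses PySpark notebooks and pipelines in Azure Synapse; the source data, table names, and columns below are invented purely for illustration):

```python
import csv
import io
import sqlite3

# Hypothetical source data; in the real role this would arrive from a
# REST API, a SQL table, or a CSV file in the data lake.
RAW_CSV = """order_id,customer,amount,order_date
1001,Acme Corp,250.00,2024-01-15
1002,Globex,99.50,2024-01-16
1003,Acme Corp,410.25,2024-01-16
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw CSV into rows."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: enforce types and skip malformed rows."""
    clean = []
    for row in rows:
        try:
            clean.append({
                "order_id": int(row["order_id"]),
                "customer": row["customer"].strip(),
                "amount": float(row["amount"]),
                "order_date": row["order_date"],
            })
        except (KeyError, ValueError):
            continue  # in production, route bad rows to a quarantine table
    return clean

def load(rows: list[dict], conn: sqlite3.Connection) -> None:
    """Load: maintain a customer dimension and append to a fact table."""
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS dim_customer (
            customer_key INTEGER PRIMARY KEY, name TEXT UNIQUE);
        CREATE TABLE IF NOT EXISTS fact_orders (
            order_id INTEGER PRIMARY KEY, customer_key INTEGER,
            amount REAL, order_date TEXT);
    """)
    for row in rows:
        conn.execute("INSERT OR IGNORE INTO dim_customer (name) VALUES (?)",
                     (row["customer"],))
        key = conn.execute(
            "SELECT customer_key FROM dim_customer WHERE name = ?",
            (row["customer"],)).fetchone()[0]
        conn.execute("INSERT OR REPLACE INTO fact_orders VALUES (?, ?, ?, ?)",
                     (row["order_id"], key, row["amount"], row["order_date"]))
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
```

The same three stages map onto the Synapse setup described above: extraction and transformation in PySpark notebooks, loading into dimension and fact tables in a dedicated SQL pool.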

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
  • Relevant certifications in data engineering or data science, such as Azure Data Engineer, are a plus.
  • Proven experience in ETL data engineering using Python (PySpark) to extract, transform, and load data from REST APIs, SQL tables, and CSV files.
  • Proficiency with Azure Synapse Analytics, including Notebooks, Pipelines, Linked Services, and Azure Key Vault.
  • Ability to write complex SQL queries and optimize performance in SparkSQL and MS SQL.
  • Knowledge of data integration best practices and tools.
  • Experience with version control systems such as Git and Azure DevOps.
  • Strong problem-solving and analytical skills with excellent attention to detail.
  • Excellent verbal and written communication skills and the ability to work collaboratively in a team with shifting priorities.
  • Familiarity with big data technologies, machine learning, and data analysis is preferred.
  • Experience with data visualization tools such as Power BI or Tableau is a plus, as is familiarity with Agile methodologies.
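The SQL requirement above is typically exercised against the star schemas mentioned in the role: joining a fact table to its dimensions and aggregating. A small self-contained illustration of that query shape (run here against SQLite for portability; the same structure applies in SparkSQL or MS SQL, and the table and column names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_orders (order_id INTEGER, customer_key INTEGER, amount REAL);
    INSERT INTO dim_customer VALUES (1, 'Acme Corp'), (2, 'Globex');
    INSERT INTO fact_orders VALUES
        (1001, 1, 250.0), (1002, 2, 99.5), (1003, 1, 410.25);
""")

# A typical star-schema rollup: fact joined to a dimension,
# grouped by the dimension attribute, ordered by the measure.
query = """
    SELECT d.name,
           COUNT(*) AS orders,
           ROUND(SUM(f.amount), 2) AS revenue
    FROM fact_orders AS f
    JOIN dim_customer AS d ON d.customer_key = f.customer_key
    GROUP BY d.name
    ORDER BY revenue DESC
"""
for name, orders, revenue in conn.execute(query):
    print(name, orders, revenue)
```

In SparkSQL the optimization concern shifts from indexes to partitioning and join strategies (e.g. broadcasting small dimension tables); in an MPP SQL pool, to distribution keys.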

Benefits

  • Competitive salary with bonuses and performance-based salary increases.
  • Generous paid time off policy.
  • Flexible working hours.
  • Remote work arrangement.
  • Continuing education, training, and conference opportunities.
  • Company-sponsored coursework, exams, and certifications.

Interested in this position?

Apply directly on the company website

