Data Engineer, Azure - Remote, Latin America

4 hours, 6 minutes ago
Full-time
Mid Level
Software Development
Bluelight Consulting

Bluelight Consulting is a leading Nearshore Software Development company that provides access to highly skilled Nearshore Tech Talent in the US timezone. They offer services such as Nearshore Staffing, Cloud consulting, Cloud migration strategies, and ...

Internet Software & Services
11-50
Founded 2015

Description

  • Develop and maintain ETL processes using Python (PySpark) in Azure Synapse Analytics notebooks and pipelines.
  • Design and build data warehouse structures using star schemas, facts, and dimensions in an MPP SQL pool.
  • Extract and transform data from REST APIs, SQL database tables, and CSV files.
  • Design, optimize, and scale Azure Synapse Analytics notebooks and pipelines for performance.
  • Contribute to data fabric initiatives including data lakes, lakehouses, delta lakes, and data cataloging.
  • Collaborate with data architects to create data models and schemas that meet business requirements.
  • Implement data quality checks and validation processes to ensure accurate and consistent data.
  • Monitor ETL jobs, troubleshoot issues, and resolve performance bottlenecks to meet SLAs.
  • Maintain documentation for data engineering processes, data flows, and transformations.
  • Work with cross-functional teams on data requirements, support data initiatives, and ensure security and compliance standards are followed.
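The data-quality checks and validation mentioned above can be sketched in plain Python (no Spark dependency). Column names and rules here are purely illustrative, not taken from the role:

```python
# Minimal sketch of row-level data-quality validation before loading
# records into a warehouse. Column names and rules are hypothetical.

REQUIRED_COLUMNS = {"order_id", "customer_id", "amount"}

def validate_row(row: dict) -> list:
    """Return a list of validation errors for one record (empty = valid)."""
    errors = []
    missing = REQUIRED_COLUMNS - row.keys()
    if missing:
        errors.append(f"missing columns: {sorted(missing)}")
        return errors
    if row["order_id"] in (None, ""):
        errors.append("order_id is null")
    try:
        if float(row["amount"]) < 0:
            errors.append("amount is negative")
    except (TypeError, ValueError):
        errors.append("amount is not numeric")
    return errors

def partition_rows(rows):
    """Split records into (valid, rejected) so bad rows can be quarantined."""
    valid, rejected = [], []
    for row in rows:
        errs = validate_row(row)
        (rejected if errs else valid).append((row, errs))
    return [r for r, _ in valid], rejected
```

In a Synapse notebook the same rules would typically be expressed as PySpark column expressions over a DataFrame, but the validate-and-quarantine pattern is the same.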

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
  • Certification related to data engineering or data science, such as Azure Data Engineer, is a plus.
  • Proven experience in ETL data engineering with strong expertise in Python (PySpark).
  • Experience extracting, transforming, and loading data from REST APIs, SQL database tables, and CSV files.
  • Proficiency with Azure Synapse Analytics resources, including Notebooks, Pipelines, Linked Services, and Azure Key Vault.
  • Ability to write complex SQL queries, optimize query performance, and work with SparkSQL and MS SQL.
  • Knowledge of data integration best practices and tools.
  • Experience with version control systems such as Git and Azure DevOps.
  • Strong problem-solving and analytical skills, with keen attention to detail.
  • Excellent verbal and written communication skills, and the ability to work in a team with shifting priorities.
  • Familiarity with big data technologies, machine learning, and data analysis is preferred.
  • Experience with data visualization tools such as Power BI or Tableau, and with Agile methodologies, is a plus.
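As a rough illustration of the star-schema modeling and SQL skills called out above, here is a self-contained sketch using Python's built-in sqlite3. Table and column names are hypothetical, and a Synapse dedicated (MPP) SQL pool would additionally declare distribution and columnstore options that SQLite lacks:

```python
import sqlite3

# Tiny star schema: one fact table keyed to two dimension tables.
# Names are illustrative; an MPP SQL pool would also specify
# distribution (HASH / ROUND_ROBIN) and clustered columnstore indexes.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO dim_date VALUES (20240101, '2024-01-01');
INSERT INTO fact_sales VALUES (1, 20240101, 100.0), (1, 20240101, 50.0),
                              (2, 20240101, 75.0);
""")

def sales_by_customer(conn):
    """Typical star-schema query: join the fact to a dimension, then aggregate."""
    return conn.execute("""
        SELECT c.name, SUM(f.amount)
        FROM fact_sales f
        JOIN dim_customer c ON c.customer_key = f.customer_key
        GROUP BY c.name
        ORDER BY c.name
    """).fetchall()
```

The same fact/dimension join-and-aggregate shape is what the role's "complex SQL queries" over facts and dimensions generally look like, whether written in SparkSQL or MS SQL.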

Benefits

  • Competitive salary and bonuses, with performance-based salary increases.
  • Generous paid time off policy.
  • Flexible working hours.
  • Remote work.
  • Continuing education, training, and conference support.
  • Company-sponsored coursework, exams, and certifications.


