Data Engineer, Azure - Remote, Latin America

1 hour, 23 minutes ago
Full-time
Mid Level
Software Development
Bluelight Consulting

Bluelight Consulting is a leading Nearshore Software Development company that provides access to highly skilled Nearshore Tech Talent in the US timezone. They offer services such as Nearshore Staffing, Cloud consulting, Cloud migration strategies, and ...

Internet Software & Services
11-50
Founded 2015

Description

  • Develop and maintain ETL processes using Python (PySpark) in Azure Synapse Analytics Notebooks and Pipelines.
  • Design and build data storage structures in MPP SQL pools using data warehousing concepts such as star schemas, facts, and dimensions.
  • Extract data from REST APIs, SQL database tables, and CSV files.
  • Design and optimize Azure Synapse Analytics notebooks and pipelines for scalability and performance.
  • Contribute to data lake, lakehouse, delta lake, and data cataloging initiatives as part of broader Data Fabric efforts.
  • Collaborate with data architects to create data models and schemas aligned with business requirements.
  • Implement data quality checks and validation processes to ensure accuracy and consistency.
  • Monitor ETL jobs, troubleshoot issues, and resolve performance bottlenecks to meet SLAs.
  • Maintain documentation for ETL processes, data flows, and transformations.
  • Work with cross-functional teams to understand data needs and support data-related initiatives.
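The responsibilities above include modeling data warehouse structures with star schemas, facts, and dimensions. As a minimal illustration of that idea, the sketch below splits raw order rows into a customer dimension and a sales fact table in plain Python (the table and field names such as `dim_customer` and `fact_sales` are illustrative, not taken from the posting; in practice this would be done in PySpark against Synapse SQL pools):

```python
# Illustrative star-schema split: deduplicate customers into a dimension
# table keyed by a surrogate key, and keep one fact row per order that
# references the dimension via that key.

raw_rows = [
    {"order_id": 1, "customer": "Acme", "country": "US", "amount": 120.0},
    {"order_id": 2, "customer": "Beta", "country": "BR", "amount": 75.5},
    {"order_id": 3, "customer": "Acme", "country": "US", "amount": 60.0},
]

def build_star_schema(rows):
    """Return (dimension rows, fact rows) from flat source records."""
    customer_keys = {}   # customer name -> surrogate key
    fact_sales = []
    for row in rows:
        # setdefault assigns the next surrogate key on first sight only
        key = customer_keys.setdefault(row["customer"], len(customer_keys) + 1)
        fact_sales.append({
            "order_id": row["order_id"],
            "customer_key": key,
            "amount": row["amount"],
        })
    dim_customer = [{"customer_key": k, "customer": name}
                    for name, k in customer_keys.items()]
    return dim_customer, fact_sales

dims, facts = build_star_schema(raw_rows)
```

The same pattern scales to a PySpark job: the dimension becomes a deduplicated DataFrame with generated keys, and facts join against it before being written to the warehouse.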

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
  • Proven experience in ETL data engineering with strong expertise in Python (PySpark).
  • Experience extracting, transforming, and loading data from REST APIs, SQL database tables, and CSV files.
  • Proficiency with Azure Synapse Analytics resources, including Notebooks, Pipelines, Linked Services, and Azure Key Vault.
  • Ability to write complex SQL queries, optimize query performance, and work with Spark SQL and MS SQL.
  • Knowledge of data integration best practices and tools.
  • Experience with version control systems such as Git and Azure DevOps.
  • Strong problem-solving and analytical skills with keen attention to detail.
  • Excellent verbal and written communication skills and the ability to work collaboratively in a team with shifting priorities.
  • Nice to have: experience with big data technologies, machine learning, data analysis, data visualization tools such as Power BI or Tableau, and Agile methodologies.
  • Azure Data Engineer certification or other data engineering/data science certifications are a plus.
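The requirements above call for writing and optimizing SQL queries. As a self-contained sketch, the example below runs a typical aggregation over a fact table using Python's standard-library sqlite3 as a stand-in for MS SQL or Spark SQL (the `fact_sales` table and its columns are illustrative, not from the posting):

```python
import sqlite3

# In-memory database standing in for a warehouse SQL pool.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (customer_key INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO fact_sales VALUES (?, ?)",
    [(1, 120.0), (2, 75.5), (1, 60.0)],
)

# Aggregate spend per customer, the kind of query a reporting
# pipeline would run against a fact table.
totals = conn.execute(
    "SELECT customer_key, SUM(amount) AS total "
    "FROM fact_sales GROUP BY customer_key ORDER BY customer_key"
).fetchall()
```

In a Synapse context the equivalent query would target a dedicated SQL pool or be issued as Spark SQL from a notebook; the SQL itself is largely portable.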

Benefits

  • Competitive salary and bonuses, including performance-based salary increases.
  • Generous paid time off.
  • Flexible working hours.
  • Remote work.
  • Continuing education, training, and conference opportunities.
  • Company-sponsored coursework, exams, and certifications.
