Bluelight Consulting

Bluelight Consulting is a leading nearshore software development company that provides access to highly skilled nearshore tech talent aligned with US time zones. They offer services such as nearshore staffing, cloud consulting, cloud migration strategies, and ...

Internet Software & Services
11-50
Founded 2015

Description

  • Develop and maintain ETL data engineering processes using Python (PySpark) in Azure Synapse Analytics notebooks and pipelines.
  • Design and build data storage structures in an MPP SQL pool using star schemas, facts, and dimensions.
  • Extract data from REST APIs, SQL database tables, and CSV files.
  • Design and optimize Azure Synapse Analytics notebooks and pipelines for scalability and performance.
  • Contribute to data fabric initiatives including data lakes, lakehouses, delta lakes, and data cataloging.
  • Collaborate with data architects to create data models and schemas that align with business requirements.
  • Implement data quality checks and validation processes to maintain accuracy and consistency.
  • Identify and resolve performance bottlenecks and optimize ETL jobs to meet SLAs.
  • Monitor ETL jobs, troubleshoot issues, and implement solutions to ensure pipeline reliability.
  • Maintain documentation for ETL processes, data flows, and data transformations.
  • Work with cross-functional teams to gather data requirements and support data-related initiatives.
  • Ensure data security and compliance with governance and privacy standards.
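The extract/validate/transform responsibilities above can be sketched in miniature. This is an illustrative stand-in only: it uses the Python standard library rather than PySpark in Synapse, and all column names (`order_id`, `customer`, `amount`) are invented for the example.

```python
import csv
import io

# Toy input standing in for a CSV extract; in the role described above this
# would come from ADLS/blob storage and be read by PySpark, not the csv module.
RAW_CSV = """order_id,customer,amount
1,Alice,120.50
2,Bob,
3,Carol,75.00
"""

def extract(csv_text):
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def validate(rows):
    """Basic data-quality check: separate rows with a missing amount."""
    good, bad = [], []
    for row in rows:
        (good if row["amount"] else bad).append(row)
    return good, bad

def transform(rows):
    """Cast string fields to proper types, a minimal transform step."""
    return [
        {"order_id": int(r["order_id"]),
         "customer": r["customer"],
         "amount": float(r["amount"])}
        for r in rows
    ]

rows = extract(RAW_CSV)
good, rejected = validate(rows)
fact_rows = transform(good)
print(len(fact_rows), len(rejected))  # prints: 2 1
```

The same extract → validate → transform shape applies at scale; quarantining rejected rows (rather than silently dropping them) is what makes the data-quality checks in the posting auditable.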

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent work experience.
  • Certification related to data engineering or data science, such as Azure Data Engineer, is a plus.
  • Proven experience in ETL data engineering using Python (PySpark).
  • Experience extracting, transforming, and loading data from REST APIs, SQL database tables, and CSV files.
  • Proficiency with Azure Synapse Analytics, including Notebooks, Pipelines, Linked Services, and Azure Key Vault.
  • Ability to write complex SQL queries and optimize query performance.
  • Experience with both SparkSQL and MS SQL.
  • Knowledge of data integration best practices and tools.
  • Experience with version control systems such as Git and Azure DevOps.
  • Strong problem-solving and analytical skills with close attention to detail.
  • Excellent verbal and written communication skills, with the ability to work collaboratively on a team with shifting priorities.
  • Familiarity with big data technologies, machine learning, and data analysis is preferred.
  • Experience with data visualization tools such as Power BI or Tableau is a plus.
  • Experience with Agile methodologies is a plus.
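The star-schema modeling and SQL skills called for above can be illustrated with a toy fact/dimension join. This sketch substitutes SQLite for a Synapse MPP SQL pool, and the `fact_sales` / `dim_product` tables and their columns are hypothetical:

```python
import sqlite3

# Miniature star schema: one fact table and one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        product_key INTEGER REFERENCES dim_product(product_key),
        amount REAL
    );
    INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO fact_sales VALUES (1, 10.0), (1, 15.0), (2, 7.5);
""")

# A typical star-schema query: join the fact to its dimension and aggregate.
rows = conn.execute("""
    SELECT d.name, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product d ON d.product_key = f.product_key
    GROUP BY d.name
    ORDER BY total DESC
""").fetchall()
print(rows)  # prints: [('Widget', 25.0), ('Gadget', 7.5)]
```

The same join-fact-to-dimension-then-aggregate pattern is what most reporting queries against a dedicated SQL pool reduce to; in SparkSQL the query text would be essentially identical.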

Benefits

  • Competitive salary and bonuses, including performance-based salary increases.
  • Generous paid time off policy.
  • Flexible working hours.
  • Remote work arrangement.
  • Continuing education, training, and conference support.
  • Company-sponsored coursework, exams, and certifications.

Interested in this position?

Apply directly on the company website


Similar Roles

GCP Data Architect

66degrees 251-1K IT Services

66degrees is seeking an experienced Data Architect to design, develop, and maintain Google Cloud-based data architecture that turns enterprise data into scalable, reliable business value.

Apache Spark, dbt, GCP, Hadoop, Python, SQL

Staff Data Engineer

CookUnity 251-1K Hotels, Restaurants & Leisure

CookUnity is hiring a Data Engineer to help rebuild and scale the company’s B2C data foundation by designing production-ready pipelines and data systems for a rapidly growing food marketplace.

Apache Spark, Flink, Java, Kafka, Kubernetes, Python, Scala, Snowflake, SQL

Data Engineer

INflow Federal 51-250 Aerospace & Defense

INflow Federal is seeking a fully remote Data Engineer to support a Department of Defense modernization initiative by building secure data pipelines and analytics infrastructure across cloud environments.

Agile, AWS, CI/CD, DevSecOps, DynamoDB, Encryption, JSON, MongoDB, PostgreSQL, Power BI, Python, REST API, SQL, SQL Server, Tableau, XML

GCP Data Engineer- Lead Consultant

Lingaro 5K-10K IT Services

Lingaro is hiring a GCP Data Engineer Lead Consultant in India to design and develop a client’s end-to-end Google Cloud data ecosystem while working directly with stakeholders in a remote, full-time role.

Apache Airflow, Apache Spark, Azure, Confluence, Databricks, dbt, GCP, Git, JIRA, Looker, Power BI, Python, SQL, Tableau
