IDT

IDT Corporation provides innovative communications and payment services, connecting families, friends, and businesses globally with pioneering technologies and solutions.

Diversified Telecommunication Services
1K-5K
Founded 1990

Description

  • Design, implement, validate, and document ETL/ELT data pipelines for batch processing, streaming integrations, and data warehousing.
  • Maintain end-to-end Snowflake data warehouse deployments and develop Denodo data virtualization solutions.
  • Architect, implement, and maintain scalable data pipelines that ingest, transform, and deliver data to real-time data warehouse platforms.
  • Recommend process improvements to increase the efficiency and reliability of ELT/ETL development.
  • Support pilot projects and help evaluate emerging data technologies as the platform scales with growing data volumes.
  • Partner with data stakeholders to gather requirements for language-model initiatives and translate them into scalable solutions.
  • Create and maintain comprehensive documentation for data processes, workflows, and model deployment routines.
  • Perform data analysis, root cause analysis, and support functions to help deliver strategic BI initiatives.
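The batch ETL/ELT work described above can be sketched as a minimal extract-transform-load pipeline. This is an illustrative example only, not IDT's actual stack: an in-memory SQLite table stands in for a warehouse platform such as Snowflake, and a JSON string stands in for an API payload; all names (`RAW_EVENTS`, `payments`) are hypothetical.

```python
import json
import sqlite3

# Hypothetical raw payload, standing in for an API or flat-file source.
RAW_EVENTS = json.dumps([
    {"user": "alice", "amount": "12.50", "country": "ar"},
    {"user": "bob", "amount": "7.00", "country": "br"},
    {"user": "alice", "amount": "3.25", "country": "ar"},
])

def extract(payload: str) -> list[dict]:
    """Parse the raw JSON payload into records."""
    return json.loads(payload)

def transform(records: list[dict]) -> list[tuple]:
    """Cast amounts to float and normalize country codes."""
    return [(r["user"], float(r["amount"]), r["country"].upper())
            for r in records]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Bulk-insert the transformed rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments (user TEXT, amount REAL, country TEXT)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_EVENTS)), conn)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 22.75
```

In a production pipeline the same extract/transform/load separation applies, with the warehouse connector, orchestration, and validation layers replacing the stdlib stand-ins shown here.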

Requirements

  • 5+ years of experience in ETL/ELT design and development across heterogeneous OLTP systems and API solutions.
  • Experience building scalable data warehouse solutions for business intelligence and analytics.
  • Excellent English skills, with strong oral and written communication with BI teams and end users.
  • Demonstrated experience using Python for data engineering tasks, including transformation, advanced data manipulation, and large-scale data processing.
  • Experience designing event-driven pipelines using messaging and streaming events to trigger ETL workflows.
  • Experience designing complex data pipelines from RDBMS, JSON, API, and flat file sources.
  • Strong SQL and PL/SQL programming skills, with advanced knowledge of BI and data warehouse methodologies.
  • Hands-on experience with relational database systems and cloud database services such as Oracle, MySQL, Amazon RDS, Snowflake, or Amazon Redshift.
  • Ability to analyze and optimize poorly performing queries and ETL/ELT mappings with performance-tuning recommendations.
  • Understanding of software engineering principles, experience with Unix/Linux/Windows operating systems, and familiarity with Agile methodologies.
  • Experience with version control systems, including code repository management, branching, merging, and distributed collaboration.
  • Interest in business operations and how BI systems support profitability and data-driven decision-making.
  • Preferred experience with Snowflake ETL/ELT development and complex transformations using SQL and built-in functions.
  • Preferred experience with Pentaho Data Integration (Kettle) or Ab Initio.
  • Preferred experience with Azure Data Factory, dbt, AWS Glue, Lambda, or other cloud/open-source ETL tools.
  • Preferred experience with reporting and visualization tools such as Looker, as well as with job scheduler software.
  • Preferred experience in Telecom, eCommerce, or International Mobile Top-up.
  • Preferred certifications include AWS Solution Architect, AWS Cloud Data Engineer, or Snowflake SnowPro Core.
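The event-driven pipeline requirement above can be illustrated with a small sketch. This is a simplified, hedged example: `queue.Queue` stands in for a message broker such as Kafka or Amazon SQS, and the `transform` function and its fields are hypothetical, not part of IDT's pipeline.

```python
import queue
import threading

# queue.Queue stands in for a message broker; each message triggers one
# ETL transform step, mirroring an event-driven pipeline.
events: queue.Queue = queue.Queue()
results: list[dict] = []

def transform(msg: dict) -> dict:
    """Hypothetical transformation: derive an integer-cents field."""
    return {**msg, "amount_cents": int(round(msg["amount"] * 100))}

def worker() -> None:
    """Consume messages until the sentinel None arrives."""
    while True:
        msg = events.get()
        if msg is None:
            break
        results.append(transform(msg))

t = threading.Thread(target=worker)
t.start()
for amount in (1.5, 2.25):
    events.put({"amount": amount})  # a "message" arriving on the stream
events.put(None)                   # sentinel: shut the consumer down
t.join()
print(results)
```

With a real broker, the consumer loop would instead poll the broker's client library, and delivery guarantees (acknowledgements, retries, idempotent writes) become the central design concern.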

Benefits

  • Full-time work-from-home opportunity.
  • Remote role open only to applicants in LATAM.
  • Opportunity to work for a large, established international telecom company with global operations.
  • Work on strategic BI initiatives and emerging data technologies.

Interested in this position? Apply directly on the company website.