TTEC Digital

TTEC Digital is a global leader in customer service solutions, offering contact center software, customer service technology, and tailored solutions for businesses. With a focus on combining humanity and technology, TTEC Digital pioneers exceptional customer experiences.

Professional Services
1K-5K
Founded 1982

Responsibilities

  • Design and deploy scalable services on Google Cloud Platform (GCP).
  • Implement IAM access controls to support secure and compliant data environments.
  • Develop and implement robust data ingestion pipelines from diverse data sources.
  • Develop and enforce data validation processes to maintain data accuracy and reliability.
  • Improve data quality and efficiency through continuous optimization.
  • Analyze raw data to identify patterns, trends, and opportunities.
  • Produce design documentation and solution roadmaps.
  • Lead projects independently while collaborating with larger cross-functional teams.
  • Mentor and cross-train junior and senior Data Engineers on complex assignments.
  • Support pre-sales efforts by providing work estimates and technical input.
  • Partner with Project Management to deliver projects on time and within budget.
  • Travel occasionally as needed for project or team requirements.
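The ingestion and validation duties above could look like the following minimal pandas sketch. All table, column, and rule names here are hypothetical illustrations, not details from the posting.

```python
# Hypothetical sketch of a data-validation step in an ingestion pipeline.
import pandas as pd

def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Drop rows that fail basic data-quality rules and report row counts."""
    before = len(df)
    df = df.dropna(subset=["customer_id"])                  # required key present
    df = df[df["amount"] >= 0]                              # no negative amounts
    df = df.drop_duplicates(subset=["customer_id", "ts"])   # de-duplicate events
    print(f"kept {len(df)}/{before} rows")
    return df

# Example raw batch with a duplicate, a missing key, and a negative amount:
raw = pd.DataFrame({
    "customer_id": ["a", "a", None, "b"],
    "ts": [1, 1, 2, 3],
    "amount": [10.0, 10.0, 5.0, -1.0],
})
clean = validate(raw)  # only the single valid "a" row survives
```

In practice such rules would be driven by a schema or governance policy rather than hard-coded, but the shape of the check is the same.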

Requirements

  • Post-secondary degree or diploma in Computer Science, MIS, or an IT-related field; a BA/BS in an unrelated field may be considered with relevant experience.
  • 8+ years of experience in Data Engineering.
  • 3+ years of application design and development in a cloud environment.
  • 2+ years building and deploying containerized applications in Kubernetes.
  • 2+ years coding REST services and APIs using Python, C#, Node.js, or Java.
  • Proficiency in Terraform for infrastructure automation.
  • Advanced programming skills in Python, including pandas, numpy, and PySpark.
  • Strong SQL expertise across MS SQL, OracleDB, and Teradata.
  • Hands-on experience with Google Cloud Platform (GCP) services.
  • Familiarity with CI/CD tools to streamline deployment.
  • Experience with big data pipeline tools such as Spark, Pig, Hive, Sqoop, and Kafka.
  • Knowledge of data governance tools such as DLP and Dataplex.
  • Experience working in Agile/Scrum environments.
  • Skilled in REST services and API management, including Apigee X/Apigee Edge and Swagger/OpenAPI.
  • Google Cloud Certified Professional Data Engineer preferred.
  • Google Cloud Certified Professional Cloud DevOps Engineer preferred.
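The SQL-expertise requirement above can be illustrated with a small data-quality query. The posting names MS SQL, OracleDB, and Teradata; this sketch uses Python's stdlib sqlite3 purely for portability, and the `orders` table and its columns are hypothetical.

```python
# Hedged sketch: SQL data-quality checks of the kind a validation process
# might enforce. Table and column names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "a", 10.0), (1, "a", 10.0), (2, "b", None)],
)

# Duplicate-key check: ids appearing more than once.
dupes = conn.execute(
    "SELECT id, COUNT(*) FROM orders GROUP BY id HAVING COUNT(*) > 1"
).fetchall()

# Null check: rows missing a required measure.
nulls = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE amount IS NULL"
).fetchone()[0]
```

The same GROUP BY / HAVING and IS NULL patterns carry over directly to MS SQL, Oracle, and Teradata dialects.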

Benefits

  • Full-time remote role.
  • Opportunity to work on high-impact Google Cloud analytics and data engineering initiatives.
  • Hands-on experience and formal learning opportunities to advance skills.
  • Chance to lead projects and mentor other engineers in a collaborative team environment.
  • Work on complex, customer-centric solutions with a global organization.
  • Inclusive, diverse workplace with equal opportunity employment practices.

Interested in this position?

Apply directly on the company website

Similar Roles

Data Engineer

Remotebase 51-250 Internet Software & Services

This role at an organization enhancing its cloud data platform focuses on building and operating automated data infrastructure, pipelines, and transformation workflows to enable reliable data delivery across the business.

Agile Azure Bash CI/CD dbt Docker GCP GitHub Actions GitLab CI Jenkins Kubernetes Python Snowflake SQL Terraform

Principal Software/Data Engineer

PointClickCare 1K-5K Health Care Providers & Services

PointClickCare is hiring a Principal Software/Data Engineer to lead the design and delivery of production-grade streaming and real-time data pipelines for its healthcare data platform.

AWS Azure CI/CD Databricks dbt Flink GCP Kafka

Data Engineer (EU)

Swish Analytics 1-10 Internet Software & Services

Swish Analytics is hiring a Europe-based remote Data Engineer to help build and operate real-time sports analytics and betting data products, with a focus on non-US sports coverage and enterprise-grade data delivery.

Apache Airflow AWS CI/CD Git Kubernetes Machine Learning MySQL Python REST API Shell Scripting SQL

Senior Data Engineer

Murmuration 11-50 Diversified Consumer Services

Murmuration is seeking a Senior Data Engineer to build and maintain the data infrastructure that powers its unified civic data platform, Atlas, supporting research, product, and community-impact work across complex, high-volume public and political datasets.

Apache Airflow AWS CI/CD Dagster dbt Docker MongoDB Python Snowflake
