Valtech

Valtech is a global business transformation company specializing in technology, marketing, and experience design to help clients innovate, connect with consumers, and optimize ROI.

Professional Services · 5K–10K employees · Founded 1993

Description

  • Design, build, and maintain scalable data engineering frameworks and platform utilities used across engineering teams.
  • Develop reusable patterns, templates, and abstractions to standardize and accelerate delivery.
  • Define and evolve platform architecture decisions to ensure scalability, maintainability, and consistency.
  • Design and implement CI/CD pipelines and automation frameworks to improve engineering velocity.
  • Define and enforce engineering standards for testing, code quality, deployment, and documentation.
  • Identify and eliminate manual or repetitive processes through automation and tooling improvements.
  • Integrate AI-assisted development tools into engineering workflows to improve productivity.
  • Develop and maintain AI engineering assets such as coding guidelines, prompt frameworks, and reusable agent configurations.
  • Lead the development and operational support of core data transformation frameworks, including dbt Core at enterprise scale.
  • Investigate and resolve framework-level issues, including deployment failures, dependency conflicts, and production incidents.
  • Support onboarding and enablement of engineering teams adopting platform tooling.
  • Partner with engineering teams to identify pain points and translate them into platform improvements.
  • Ensure platform tooling meets security, compliance, and operational requirements.
  • Conduct and support code and design reviews across platform components.
  • Monitor platform health, performance, and adoption, and iterate based on feedback and metrics.
  • Contribute to documentation, developer guides, and enablement materials to improve usability and adoption.
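To make the "reusable patterns, templates, and abstractions" responsibility above concrete, here is a minimal, hypothetical sketch of the kind of shared platform utility such a role might maintain: a retry decorator for flaky pipeline steps. The function names, parameters, and defaults are illustrative assumptions, not taken from any actual Valtech framework.

```python
import functools
import time


def retry(max_attempts: int = 3, delay_seconds: float = 1.0):
    """Hypothetical shared platform utility: retry a transient-failure-prone task.

    Illustrative sketch only; names and defaults are assumptions,
    not part of any specific framework described in this posting.
    """
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            last_error = None
            for attempt in range(1, max_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as exc:  # a real utility would catch narrower types
                    last_error = exc
                    if attempt < max_attempts:
                        time.sleep(delay_seconds)
            raise last_error
        return wrapper
    return decorator


@retry(max_attempts=3, delay_seconds=0.0)
def load_partition(path: str) -> str:
    # Stand-in for a real ingestion step that might fail transiently.
    return f"loaded {path}"
```

Packaging cross-cutting concerns like retries behind one decorator, rather than re-implementing them per pipeline, is one common way platform teams standardize and accelerate delivery across engineering teams.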

Requirements

  • Strong proficiency in Python for building frameworks, automation tooling, and reusable components.
  • Hands-on experience with Databricks, including notebooks, workflows, jobs, and Unity Catalog.
  • Strong SQL skills and experience with distributed processing frameworks such as Apache Spark.
  • Deep experience with dbt Core, including project structure, models, tests, macros, and deployment at scale.
  • Proven experience designing and maintaining CI/CD pipelines using tools such as GitHub Actions, Azure DevOps, or GitLab CI.
  • Experience with data engineering platform design, including scalable pipeline and workflow architectures.
  • Strong understanding of software engineering principles, including DRY, SOLID, and modular design.
  • Experience working with version control systems and modern Git workflows.
  • Experience with cloud platforms, preferably AWS, and infrastructure-as-code tools such as Terraform.
  • Experience implementing automated testing strategies, including unit, integration, and data quality testing.
  • Strong understanding of platform monitoring, logging, and alerting practices.
  • Experience writing technical documentation and developer-facing guidance.
  • Experience working in complex engineering environments with multiple teams.
  • Experience with Databricks Asset Bundles (DAB) or similar deployment frameworks, preferred.
  • Experience integrating AI-assisted development tools, such as GitHub Copilot, into engineering workflows, preferred.
  • Experience defining AI coding standards, agent configurations, or prompt engineering frameworks, preferred.
  • Knowledge of Delta Lake and Lakehouse architecture patterns, preferred.
  • Experience working on large-scale platform engineering or developer platform teams, preferred.
  • Understanding of IAM, secrets management, and security compliance in cloud environments, preferred.
  • Experience driving adoption of engineering platforms across multiple teams, preferred.
  • Strong communication and technical leadership skills, preferred.
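As a sketch of what the automated data quality testing requirement above can look like in practice, the following self-contained Python checks are loosely analogous in spirit to dbt's `not_null` and `unique` tests. The function names and sample data are illustrative assumptions.

```python
from typing import Any, Sequence


def check_not_null(rows: Sequence[dict], column: str) -> list[int]:
    """Return indexes of rows where `column` is missing or None
    (analogous in spirit to dbt's not_null test)."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]


def check_unique(rows: Sequence[dict], column: str) -> list[Any]:
    """Return values of `column` that appear more than once
    (analogous in spirit to dbt's unique test)."""
    seen: dict[Any, int] = {}
    for row in rows:
        value = row.get(column)
        seen[value] = seen.get(value, 0) + 1
    return [value for value, count in seen.items() if count > 1]


rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "c@example.com"},
]

null_failures = check_not_null(rows, "email")  # row index 1 has a NULL email
duplicate_ids = check_unique(rows, "id")       # id 2 appears twice
```

In a dbt Core project these assertions would normally be declared as generic tests in a model's YAML rather than written by hand; the Python version above simply makes the underlying logic explicit.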

Benefits

  • 24 working days of paid vacation.
  • National holidays covered.
  • Sick leave up to 20 days per year.
  • Unpaid leave up to 20 days per year.
  • Medical insurance.
  • Multisport card or Multikafeteria.
  • Maternity and paternity leave support.
  • Internal workshops and learning initiatives.
  • Reimbursement for professional certifications.
  • Participation in local and global professional communities.
  • Growth Framework to support career progression.
  • Mentoring program with the option to become a mentor or mentee.
  • Competitive compensation package.
  • Remote and hybrid work options, depending on country.
  • International mobility and professional development programs.
  • Access to cutting-edge tools, training, and industry experts.
  • Progressive benefit packages that increase with tenure.

