CI&T

CI&T is a global digital technology agency with a team of 2,000 experts worldwide, empowering agile growth for leading companies through advanced technologies.

Internet Software & Services
5K-10K
Founded 1995

Description

  • Design, implement, and maintain end-to-end ELT/ETL data pipelines with a focus on reliability, reprocessing, and cost efficiency.
  • Orchestrate data loads and routines using Airflow on AWS and AWS Lambda, supporting EKS deployments when needed.
  • Model data in Snowflake across bronze, silver, and gold layers and develop transformations in dbt.
  • Build Python integrations and FastAPI services to expose, consume, and automate data processes.
  • Consume, normalize, and version market data with emphasis on historical and distribution batch processing.
  • Write high-performance SQL and tune queries in Snowflake and Postgres.
  • Use Pandas for targeted data transformations, validation, and prototyping.
  • Ensure data quality, observability, security, and documentation across pipelines and models.
  • Collaborate with business, analytics, and product teams to define SLAs, data contracts, and governance standards.
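To illustrate the kind of targeted Pandas transformation and validation the responsibilities describe, here is a minimal sketch of a bronze-to-silver cleanup step. The column names (`symbol`, `trade_date`, `price`), the dedup key, and the validation rules are hypothetical, not taken from the role:

```python
import pandas as pd

def bronze_to_silver(bronze: pd.DataFrame) -> pd.DataFrame:
    """Normalize a raw (bronze) market-data load into a validated silver layer.

    Hypothetical columns: 'symbol', 'trade_date', 'price'.
    """
    silver = bronze.copy()
    # Normalize types and casing.
    silver["symbol"] = silver["symbol"].str.strip().str.upper()
    silver["trade_date"] = pd.to_datetime(silver["trade_date"], errors="coerce")
    silver["price"] = pd.to_numeric(silver["price"], errors="coerce")
    # Drop rows that failed type coercion or basic validation.
    silver = silver.dropna(subset=["symbol", "trade_date", "price"])
    silver = silver[silver["price"] > 0]
    # Keep the last record seen per (symbol, trade_date), so a reprocessed
    # load overwrites earlier values instead of duplicating them.
    silver = silver.drop_duplicates(
        subset=["symbol", "trade_date"], keep="last"
    ).reset_index(drop=True)
    return silver
```

The keep-last dedup is one simple way to make a batch load idempotent under reprocessing, which the responsibilities above call out explicitly.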

Requirements

  • Strong experience with SQL, including query optimization and relational and analytical data modeling.
  • Hands-on experience with Snowflake, including warehousing, roles, tasks, performance, and cost management.
  • Hands-on experience with dbt, including models, tests, sources, exposures, macros, and documentation.
  • Experience building data pipelines with Airflow, including DAGs, sensors, retries, and SLAs.
  • Experience with AWS Lambda and batch-oriented ELT/ETL pipelines.
  • Knowledge of Postgres, including ingestion, basic CDC or replication, and routine maintenance.
  • Experience using Python for data work, including Pandas, and developing APIs with FastAPI.
  • Experience with Git and CI/CD for safe deployment of pipelines and dbt models.
  • Understanding of data security and governance, including access control, lineage, documentation, and sensitive data handling.
  • Technical English for reading documentation.
  • Experience with EKS/Kubernetes for data workloads is a plus.
  • Experience integrating market data sources, including providers, formats, rate limits, history, and calendars, is a plus.
  • Experience with observability and data quality tools such as Prometheus, Grafana, CloudWatch, Great Expectations, or Soda, is a plus.
  • Knowledge of data contracts, including pydantic or JSON Schema, is a plus.
  • Experience with Snowflake performance tuning, including micro-partitioning, clustering, warehouses, and query profiles, is a plus.
  • Experience with FinOps and data cost optimization in AWS and Snowflake is a plus.
  • Experience with CDC tools such as Debezium or DMS, and messaging tools such as SQS, SNS, or Kafka, is a plus.
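The data-contract requirement above can be sketched with nothing beyond the standard library. In practice this would typically be a pydantic model or a JSON Schema; a plain dataclass with validation keeps the sketch self-contained. The record shape and rules here are hypothetical:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class PriceRecord:
    """Hypothetical data contract for one market-data row."""
    symbol: str
    trade_date: date
    price: float

    def __post_init__(self) -> None:
        # Reject rows that violate the contract instead of letting them
        # flow silently into downstream models.
        if not self.symbol or self.symbol != self.symbol.upper():
            raise ValueError(f"symbol must be non-empty upper case: {self.symbol!r}")
        if self.price <= 0:
            raise ValueError(f"price must be positive: {self.price}")

def validate_batch(rows: list[dict]) -> tuple[list[PriceRecord], list[dict]]:
    """Split a raw batch into contract-conforming records and rejects."""
    good, bad = [], []
    for row in rows:
        try:
            good.append(PriceRecord(
                symbol=str(row["symbol"]),
                trade_date=date.fromisoformat(row["trade_date"]),
                price=float(row["price"]),
            ))
        except (KeyError, ValueError):
            bad.append(row)
    return good, bad
```

Routing rejects to a dead-letter table rather than failing the whole load is a common pattern when a contract guards a batch pipeline.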

Benefits

  • Health and dental insurance.
  • Meal and food allowance.
  • Childcare assistance.
  • Extended parental leave.
  • Gym and wellness partnerships through Wellhub (Gympass) and TotalPass.
  • Profit sharing bonus (PLR).
  • Life insurance.
  • Continuous learning platform through CI&T University.
  • Discount club and access to free online physical, mental, and wellness support.
  • Pregnancy and responsible parenting courses.
  • Partnerships with online learning platforms and language learning access.

