Trustly

Trustly develops online payment solutions that leverage Open Banking technology to streamline payments, reduce costs, and simplify financial services for consumers, merchants, and banks.

Diversified Financial Services · 251–1,000 employees · Founded 2008

Responsibilities

  • Design, implement, and maintain scalable data platform infrastructure across AWS and Kubernetes (EKS, EMR, Glue, Redshift, etc.).
  • Manage and evolve internal tools and frameworks for data ingestion and processing (e.g., Airbyte, Debezium, Airflow).
  • Build and maintain secure, automated CI/CD pipelines and infrastructure-as-code for platform components and data services.
  • Collaborate with DevOps, Security, Data Science, and product teams to ensure compliance, reliability, and alignment of the data platform with business needs.
  • Support version control, release workflows, and automation for notebooks, jobs, and tools used by data producers and consumers.
  • Implement observability and alerting to ensure high availability and performance of core data infrastructure and proactively address failures or inconsistencies.
  • Design and apply reusable design patterns and abstractions to simplify data creation, consumption, and platform extensibility.
  • Drive process automation, testing (including unit tests), and CI/CD practices to increase robustness and reduce time-to-delivery for data products.
  • Work with internal clients to gather requirements and translate them into structured tasks, solutions, and deliverables.
  • Maintain and support data access and consumption tools (e.g., Redshift, QuickSight) to improve visibility and productivity for business teams.

Requirements

  • Bachelor’s or Master’s degree in IT, Computer Science, Engineering, Mathematics, or a related technical discipline.
  • Proven experience building large-scale data pipelines and orchestrating data workloads in production.
  • Experience with AWS services (EKS, EC2, EMR, RDS, Glue) and big data tools such as Spark and Redshift.
  • Experience with relational databases (preferably Postgres), strong SQL skills, data modeling, and data warehouse concepts.
  • Experience with infrastructure-as-code (Terraform or similar) and CI/CD tooling.
  • Experience with Kubernetes and Docker for deploying and operating services.
  • Familiarity with automation and workflow management tools (e.g., Airflow, Airbyte, Debezium, SageMaker).
  • Proficient in Python programming.
  • Full professional proficiency in English for daily collaboration across a distributed global team.
  • Demonstrated soft skills: ownership and accountability, collaboration and teamwork, analytical problem solving, clear communication, continuous improvement mindset, prioritization, proactivity, adaptability, and experience mentoring or technically influencing others.

Benefits

  • Bradesco health and dental plan for employee and dependents with no co-payment.
  • Life insurance with enhanced coverage.
  • Meal voucher and supermarket voucher.
  • Home office allowance and a remote-first, flexible-hours culture (work from any city in Brazil).
  • Wellhub access for physical activities and online classes.
  • Trustly Club discounts at educational institutions and partner stores.
  • English program with online group classes and a private teacher.
  • Extended maternity and paternity leave; birthday off.
  • Welcome kit and Apple equipment (MacBook Pro, iPhone) with option to purchase equipment under internal criteria.
  • Employee referral program with financial reward.

Interested in this position?

Apply directly on the company website.

