Trustly

Trustly specializes in developing and providing online payment solutions that leverage Open Banking technology to enhance payment processes, reduce costs, and streamline financial services for consumers, merchants, and banks.

Diversified Financial Services
251–1,000 employees
Founded 2008

Responsibilities

  • Design, implement, and maintain scalable data platform infrastructure across AWS and Kubernetes (EKS, EMR, Glue, Redshift, etc.).
  • Manage and evolve internal tools and frameworks for data ingestion and processing (e.g., Airbyte, Debezium, Airflow).
  • Build and maintain secure, automated CI/CD pipelines and infrastructure-as-code for platform components and data services.
  • Collaborate with DevOps, Security, Data Science, and product teams to ensure compliance, reliability, and alignment of the data platform with business needs.
  • Support version control, release workflows, and automation for notebooks, jobs, and tools used by data producers and consumers.
  • Implement observability and alerting to ensure high availability and performance of core data infrastructure and proactively address failures or inconsistencies.
  • Design and apply reusable design patterns and abstractions to simplify data creation, consumption, and platform extensibility.
  • Drive process automation, testing (including unit tests), and CI/CD practices to increase robustness and reduce time-to-delivery for data products.
  • Work with internal clients to gather requirements and translate them into structured tasks, solutions, and deliverables.
  • Maintain and support data access and consumption tools (e.g., Redshift, QuickSight) to improve visibility and productivity for business teams.

Requirements

  • Bachelor’s or Master’s degree in IT, Computer Science, Engineering, Mathematics, or a related technical discipline.
  • Proven experience building large-scale data pipelines and orchestrating data workloads in production.
  • Experience with AWS services (EKS, EC2, EMR, RDS, Glue) and big data tools such as Spark and Redshift.
  • Experience with relational databases (preferably Postgres), strong SQL skills, data modeling, and data warehouse concepts.
  • Experience with infrastructure-as-code (Terraform or similar) and CI/CD tooling.
  • Experience with Kubernetes and Docker for deploying and operating services.
  • Familiarity with automation and workflow management tools (e.g., Airflow, Airbyte, Debezium, SageMaker).
  • Proficient in Python programming.
  • Full professional proficiency in English for daily collaboration across a distributed global team.
  • Demonstrated soft skills: ownership and accountability, collaboration and teamwork, analytical problem solving, clear communication, continuous improvement mindset, prioritization, proactivity, adaptability, and experience mentoring or technically influencing others.

Benefits

  • Bradesco health and dental plan for employee and dependents with no co-payment.
  • Life insurance with enhanced coverage.
  • Meal voucher and supermarket voucher.
  • Home Office Allowance and remote-first, flexible-hours culture (work from any city in Brazil).
  • Wellhub access for physical activities and online classes.
  • Trustly Club discounts at educational institutions and partner stores.
  • English program with online group classes and a private teacher.
  • Extended maternity and paternity leave; day off on your birthday.
  • Welcome kit and Apple equipment (MacBook Pro, iPhone) with option to purchase equipment under internal criteria.
  • Employee referral program with financial reward.
