Veriff

Veriff is an industry leader in online identity verification, providing AI-powered software to help businesses build trust and transparency online. With a combination of AI technology and human verification teams, Veriff offers fraud prevention, compliance...

IT Services
51-250
Founded 2015
$92M raised

Description

  • Own and operate the Analytics Data Platform infrastructure across multiple AWS regions to enable data engineers, analysts, and data scientists.
  • Maintain and optimize self-hosted data tools (Apache Airflow, Redash, Tableau) to meet agreed availability targets.
  • Design and implement AWS infrastructure for analytics (S3, Glue, Athena, Redshift) across regions while meeting compliance requirements.
  • Build and maintain comprehensive observability for multi-region Data Lake and Data Warehouse infrastructure.
  • Implement FinOps practices including cost tagging, attribution, monitoring, and cloud cost optimization for the data platform.
  • Create and maintain Infrastructure as Code using Terraform to enable reproducible, scalable deployments.
  • Manage Kubernetes infrastructure for data platform services and troubleshoot containerized applications.
  • Proactively prevent and resolve infrastructure issues through monitoring, alerting, capacity planning, on-call rotation, and incident response.
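The FinOps responsibility above (cost tagging and attribution) can be sketched as a small compliance check. This is a minimal illustration only: the required tag keys below (`team`, `cost-center`, `environment`) are assumed for the example and are not Veriff's actual tagging policy.

```python
# Hedged sketch: validate cost-allocation tags on a cloud resource.
# The required keys are illustrative assumptions, not the team's real policy.

REQUIRED_TAGS = {"team", "cost-center", "environment"}


def missing_cost_tags(tags: dict[str, str]) -> set[str]:
    """Return the required cost-allocation tag keys absent from `tags`."""
    return REQUIRED_TAGS - tags.keys()


def is_tag_compliant(tags: dict[str, str]) -> bool:
    """A resource is compliant when every required key is present and non-empty."""
    # Short-circuits before indexing, so missing keys never raise KeyError.
    return not missing_cost_tags(tags) and all(tags[k] for k in REQUIRED_TAGS)
```

In practice a check like this would run against tags exported from the cloud provider or enforced in Terraform, flagging untagged resources before they pollute cost-attribution reports.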

Requirements

  • 5+ years of experience in infrastructure engineering, platform engineering, or site reliability engineering.
  • Deep expertise with AWS data services (S3, Glue, Athena, Redshift) and AWS IAM.
  • Strong Terraform / Infrastructure as Code experience.
  • Experience designing and maintaining Kubernetes clusters and troubleshooting containerized applications.
  • Experience with observability and monitoring tools (Grafana, Prometheus, Loki or similar), including alerting and capacity planning.
  • Proficiency in scripting and automation (Python and other languages).
  • Understanding of data platform concepts: data lakes, data warehouses, ETL/ELT pipelines, and supporting data teams.
  • Experience implementing AWS IAM best practices, security controls, and compliance requirements (e.g., GDPR, data residency).
  • Experience working in international teams with excellent English communication skills and the ability to explain complex infrastructure to non-technical stakeholders.
  • Preferred: experience with Apache Iceberg or Delta Lake, Airflow administration at scale, ClickHouse or other columnar databases, and FinOps/cloud cost optimization (or a FinOps certification).

Benefits

  • Flexibility to work from home.
  • Stock options to share in company success.
  • Extra recharge days on top of annual vacation.
  • Comprehensive relocation support to Estonia or Spain.
  • Extensive medical, dental, and vision insurance.
  • Learning & Development and Health & Sports budgets tailored to personal needs.
  • Four weeks of fully paid sabbatical leave after the 5th work anniversary.

Interested in this position?

Apply directly on the company website
