Kpler

Kpler provides a comprehensive platform for global trade intelligence, offering real-time data and analytics that empower businesses to plan, grow, and operate sustainably across various commodities and markets.

Professional Services
251–1,000 employees
Founded 2014

Description

  • Build and maintain core datasets for vessel characteristics, companies, and geospatial data.
  • Create and maintain REST APIs, streaming pipelines with Kafka Streams, and Spark batch pipelines.
  • Own development tasks end to end, from understanding requirements through implementation, testing, deployment, and review.
  • Design and build APIs and data processing components for development environments and peer/product testing.
  • Write and execute unit, integration, functional, and end-to-end tests based on defined scenarios.
  • Monitor system performance, alerts, and SLOs after release to ensure reliability and optimal functionality.
  • Deliver well-documented, maintainable code following TDD principles and clean code standards.
  • Implement data schema evolution and versioning strategies to support reliable data exchange.
  • Develop and maintain batch and streaming data pipelines, including backpressure handling, orchestration, retries, and data quality controls.
  • Instrument services with metrics, logs, and traces, and contribute to CI/CD and incident response.
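The pipeline duties above mention retries, backpressure handling, and data quality controls. A minimal Python sketch of two of those ideas — a retry wrapper around a flaky source and a simple quality gate — is shown below. This is illustrative only, not Kpler's actual stack; the `imo` field and the `fetch` source are hypothetical.

```python
import time

def with_retries(fn, attempts=3, backoff_s=0.0):
    """Call fn, retrying on failure with a fixed backoff (illustrative only)."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == attempts:
                raise
            time.sleep(backoff_s)

def quality_check(records):
    """Quality gate: drop records missing an 'imo' vessel identifier
    (hypothetical field, standing in for a real validation rule)."""
    return [r for r in records if r.get("imo")]

# Simulated flaky source: fails twice, then succeeds.
calls = {"n": 0}
def fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return [{"imo": "9321483"}, {"imo": None}, {"imo": "9398888"}]

records = with_retries(fetch, attempts=5)
clean = quality_check(records)
print(len(clean))  # 2 valid records survive the quality gate
```

In a production pipeline these concerns would typically live in the orchestrator (e.g. Airflow task retries) and in schema- or rule-based validation layers rather than in hand-rolled helpers.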

Requirements

  • 3–5 years of experience in data-focused software engineering roles.
  • Strong programming skills in Scala or other JVM languages; Python experience is preferred.
  • Experience designing and operating RESTful APIs and versioned interfaces.
  • Good understanding of data modeling, schema evolution, and serialization technologies such as Avro or Protobuf.
  • Experience building and maintaining batch or streaming data systems.
  • Familiarity with CI/CD pipelines and modern monitoring and alerting practices.
  • Proficiency with Git-based workflows, code reviews, and Agile development methodologies.
  • Strong ownership mindset with pragmatic problem-solving skills and the ability to deliver end-to-end solutions.
  • Excellent communication skills and fluency in English.
  • Experience with Apache Airflow is a plus.
  • Exposure to cloud platforms, preferably AWS, and infrastructure as code with Terraform is a plus.
  • Experience with Docker and Kubernetes in production environments is a plus.
  • Hands-on knowledge of Kafka and event-driven or microservices architectures is a plus.
  • Familiarity with JVM build tools such as Gradle or Maven is a plus.
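Since the requirements above call out schema evolution with Avro or Protobuf, here is a deliberately simplified sketch of the core idea behind backward compatibility: new fields must carry defaults so that new readers can still consume old data. The `v1`/`v2`/`v3` schemas are made-up examples, and real Avro resolution also covers unions, type promotion, and aliases.

```python
def backward_compatible(old_schema, new_schema):
    """Simplified compatibility check: every field the new schema adds
    must carry a default value, so old data remains readable.
    (Real Avro schema resolution is considerably richer.)"""
    old_fields = {f["name"] for f in old_schema["fields"]}
    return all(
        f["name"] in old_fields or "default" in f
        for f in new_schema["fields"]
    )

v1 = {"fields": [{"name": "imo", "type": "string"}]}
v2 = {"fields": [{"name": "imo", "type": "string"},
                 {"name": "flag", "type": "string", "default": "unknown"}]}
v3 = {"fields": [{"name": "imo", "type": "string"},
                 {"name": "draught", "type": "double"}]}  # no default: breaking

print(backward_compatible(v1, v2))  # True  — added field has a default
print(backward_compatible(v1, v3))  # False — added field lacks a default
```

In practice this check is delegated to a schema registry configured with a compatibility mode, rather than implemented by hand.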

Benefits

  • Remote full-time role based in Germany.
  • Opportunity to work on high-impact data products in commodities, energy, and maritime markets.
  • Work at a global company with more than 700 experts across 35+ countries.
  • Inclusive and diverse work environment.
  • Supportive culture that emphasizes collaboration and helping colleagues and clients.
  • Encouragement to apply even if you do not meet every requirement.
