SWORD Health

SWORD Health provides AI-powered digital physical therapy solutions that prevent pain, support recovery, and enhance overall health, with the aim of transforming the rehabilitation industry through innovative technology and clinical oversight.

Health Care Providers & Services
251–1,000 employees
Founded 2015
$324M raised

Responsibilities

  • Design and evolve the company's streaming lakehouse platform.
  • Build and operate distributed streaming pipelines for low-latency, reliable data movement.
  • Own durable workflows that coordinate complex data movement across systems.
  • Shape the platform API surface so producers and consumers can integrate without involving the infrastructure team.
  • Evaluate and integrate vendor data platforms while navigating architectural trade-offs.
  • Contribute to self-service and agentic interfaces for humans, systems, and AI agents.
  • Partner with data engineers and analysts on data contracts, governance, and lineage.
  • Build and maintain AI-ready data infrastructure that supports ML and AI-driven products.
  • Use AI coding assistants and LLMs to accelerate development, automate documentation, and improve code quality.
  • Work in a regulated environment where audit, compliance, and governance are part of every design.

Requirements

  • Proven experience designing and operating data platforms at scale in production, such as warehouse, data lake, or lakehouse architectures.
  • Hands-on experience with a modern lakehouse table format, preferably Iceberg; Delta Lake or Hudi are also acceptable.
  • Strong understanding of lakehouse internals, including metadata layout, snapshots, manifests, compaction, and copy-on-write vs. merge-on-read.
  • Clear mental model of data catalogs such as the Iceberg REST catalog, Polaris, AWS Glue, Unity Catalog, or Hive Metastore, including their trade-offs and how they support storage-compute separation.
  • Exposure to at least one vendor lakehouse or query platform, such as Snowflake, Starburst, or Databricks, at an architectural level.
  • Strong experience with a distributed processing engine, preferably Flink; Spark is also acceptable.
  • Ability to reason about distributed processing internals, fine-tune running jobs, and debug degrading pipelines.
  • Familiarity with durable execution tools such as Temporal or Restate, or a solid conceptual understanding of durable execution for workflows.
  • Production experience building and operating APIs at scale, including REST or gRPC.
  • Strong understanding of Kafka and event-driven architectures, including producers/consumers, partitioning, and delivery semantics.
  • Comfort working in regulated environments such as healthcare, fintech, or government, where audit, compliance, and governance matter.
  • Platform mindset focused on self-service, API-first design, and supporting both human and agent consumers.
  • Deeper familiarity with open or REST catalogs such as Polaris, Nessie, or Unity is a plus.
  • Observability experience with tools such as Prometheus, Grafana, or OpenTelemetry is a plus.
  • Prior work on agentic or AI-facing API surfaces, or MCP-style interfaces, is a plus.
  • Experience in HIPAA, FedRAMP, or SOC 2 environments is a plus.
  • Exposure to dbt, DataHub, or data contract tooling is a plus.
  • AI fluency at Sword Health is required, with at least Level 1: using AI daily to boost personal productivity.
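As context for the copy-on-write vs. merge-on-read distinction mentioned above, here is a toy Python sketch of the two update strategies. This is purely illustrative and not any real table-format API: actual lakehouse formats such as Iceberg operate on data files, delete files, manifests, and snapshots rather than in-memory dicts.

```python
# Toy model of two lakehouse update strategies (illustrative only;
# real formats like Iceberg work at the level of files and snapshots).

def cow_update(data_file, key, value):
    """Copy-on-write: rewrite the whole data file with the change applied.
    Writes are expensive; reads see a clean, fully-merged file."""
    return {**data_file, key: value}  # a new file; the old one remains for time travel

def mor_update(delta_log, key, value):
    """Merge-on-read: append a delta record; the base file is untouched.
    Writes are cheap; the merge cost is deferred to read time."""
    return delta_log + [(key, value)]

def mor_read(base_file, delta_log):
    """Merge-on-read readers must apply pending deltas over the base data."""
    merged = dict(base_file)
    for key, value in delta_log:
        merged[key] = value
    return merged

base = {"a": 1, "b": 2}

# Copy-on-write: one expensive write, cheap reads afterwards.
cow_file = cow_update(base, "a", 10)

# Merge-on-read: one cheap write, every read pays the merge cost.
deltas = mor_update([], "a", 10)
mor_view = mor_read(base, deltas)

# Both strategies converge on the same logical table state.
assert cow_file == mor_view == {"a": 10, "b": 2}
```

The trade-off the sketch captures is the one compaction exists to manage: merge-on-read tables accumulate deltas that slow reads until a background job folds them back into rewritten base files.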

Benefits

  • €46,900 - €73,700 annual compensation range, including base, variable, and equity.
  • Equity shares included in the compensation package.
  • Health, dental, and vision insurance.
  • Meal allowance.
  • Remote work allowance.
  • Flexible working hours and work from home.
  • Discretionary vacation.
  • Snacks and beverages.

Interested in this position?

Apply directly on the company website

