SWORD Health

SWORD Health provides AI-powered digital physical therapy solutions that prevent pain, support recovery, and enhance overall health, with the broader aim of transforming the rehabilitation industry through innovative technology and clinical oversight.

Industry: Health Care Providers & Services
Company size: 251–1,000 employees
Founded: 2015
Funding raised: $324M

Description

  • Lead the migration of existing workloads to Apache Iceberg and mature the foundational lakehouse architecture.
  • Design and build robust batch and streaming data pipelines using Spark and Flink.
  • Collaborate with the Backend Engineering team on API integrations and formal data contracts.
  • Contribute to a unified lineage and governance framework using DataHub.
  • Support the Core Team in adopting and operationalizing new data platform capabilities.
  • Architect scalable data platform solutions for broad organizational use.
  • Build and maintain data pipelines that support reliable, actionable insights across the organization.

Requirements

  • Demonstrated proficiency with Python and PySpark.
  • Hands-on experience with data lake formats such as Iceberg, Delta Lake, or Hudi.
  • Solid understanding of Kafka and event-driven architectures.
  • Experience building and orchestrating data pipelines at scale.
  • Strong SQL skills and comprehensive data modeling knowledge.
  • Familiarity with workflow orchestration tools such as Airflow, Dagster, or similar.
  • Platform-oriented mindset focused on solutions for broad organizational use.
  • Ownership mentality with the ability to drive problems through to resolution.
  • Clear communication skills for explaining complex technical concepts to non-technical stakeholders.
  • Highly collaborative approach to working with backend engineers, data engineers, and analysts.
  • Pragmatic approach to balancing ideal solutions with practical delivery timelines.
  • Bonus: Experience with Flink or comparable streaming frameworks.
  • Bonus: Proficiency in DBT and familiarity with the modern data stack.
  • Bonus: Experience with modern data platforms such as BigQuery, Trino, Snowflake, or Databricks.
  • Bonus: Proven background in developing self-service data platforms.
  • Candidates must be based in Portugal and hold valid authorization to work in the EU.
  • This position does not offer relocation assistance.

Benefits

  • Competitive salary with base, variable pay, and potential equity.
  • Career development and growth opportunities.
  • Flexible remote or hybrid work policy.
  • Unlimited vacation and flexible working hours.
  • Health and well-being program with digital therapist sessions.
  • Health, dental, and vision insurance.
  • Meal allowance and remote work allowance.
  • Equity shares.
  • Snacks and beverages.
  • English class support.

Interested in this position?

Apply directly on the company website.

