Prominence

Prominence is a healthcare advisory firm that helps hospitals make better use of their data to improve patient care, efficiency, and cost-effectiveness.

Professional Services
51-250

Description

  • Design, build, and maintain scalable data pipelines and workflows for healthcare data ecosystems.
  • Ingest, transform, and deliver data from diverse sources (EHRs, claims, APIs, CRM) into analytics-ready structures.
  • Implement ETL/ELT processes, data modeling, and data warehousing solutions to ensure reliable analytics data.
  • Optimize data pipeline and query performance, including SQL query tuning and debugging.
  • Deploy and manage orchestration frameworks (e.g., Apache Airflow) and scheduling for data workflows.
  • Collaborate with customers and internal teams to define requirements, deliverables, and implementation plans.
  • Ensure data quality, reliability, and scalability across solutions and monitor pipeline health.
  • Teach and mentor customer counterparts to enable handoff, maintenance, and skill growth.
  • Participate in cloud platform implementations, migrations, and operational support activities.

Requirements

  • 2–5+ years of professional experience in data engineering or related roles.
  • Strong SQL skills, including query optimization and debugging.
  • Proficiency in Python or another programming/scripting language (e.g., Scala or Java).
  • Hands-on experience with at least one cloud data platform: Snowflake, Databricks, Azure Data Factory, AWS Redshift, or Google BigQuery.
  • Experience with data transformation tools (dbt or similar) and orchestration frameworks (Apache Airflow or similar).
  • Familiarity with ETL/ELT principles, data warehousing, and data modeling concepts.
  • Experience with cloud services (AWS, Azure, or GCP) and familiarity with CI/CD, Git, and DevOps workflows.
  • Preferred: healthcare domain knowledge (Epic, HL7, FHIR, claims) and experience with real-time/streaming tools (Kafka, Kinesis, Pub/Sub).
  • Preferred: Infrastructure-as-Code tools (Terraform, CloudFormation), containerization (Docker, Kubernetes), and cloud/data tool certifications.
  • Must have a suitable home office. The position is full-time, salaried, and fully remote within the US, with no relocation required; benefits are included.

Benefits

  • Fully remote work with flexibility to manage your schedule and no location requirements within the US.
  • 15 days PTO and up to 16 paid holidays per year for full-time staff.
  • Health benefits including low- and high-deductible plans, HSAs, and LTD/STD insurance.
  • Health and Dependent Savings Accounts, plus vision and dental coverage.
  • 401(k) offering.
  • Annual professional development fund and signing bonuses.
  • Diverse career paths across Analytics and Epic Services with opportunities for growth and stability.


Similar Roles

Data Engineering Tech Lead

Lingaro · 5K–10K · IT Services

Data Engineering Tech Lead at Lingaro (Data Engineering & Management) — lead a Poland-based remote/full-time team to design, deliver, and maintain scalable, secure data engineering solutions while mentoring engineers and ensuring timely, high-quality project delivery.

Azure, CI/CD, Python, Scala, SQL

Senior Software Engineer - Data Integration & JVM Ecosystem

ClickHouse · 51–250 · IT Services

Senior Software Engineer (JVM) at ClickHouse joining the Connectors team to own and maintain JVM-based data framework integrations, connectors, and drivers that enable high-performance data ingestion and a seamless developer experience for data engineering workloads.

Apache Airflow, Apache Spark, ClickHouse, dbt, Grafana, HTTP, Java, Kafka, Metabase, Pandas, Power BI, Python, SQL, Tableau, TCP/IP

Junior Data Engineer (Remote, Argentina)

GlobalVision · 51–250 · Internet Software & Services

Junior Data Engineer at GlobalVision supporting and maintaining the company’s data infrastructure to ensure reliable, accessible, and actionable data that informs business decision-making across the organization.

dbt, Domo, Machine Learning, Power BI, Python, Salesforce, SQL, Tableau

Data/Infrastructure Advocate Engineer - EMEA Remote

Hugging Face · 51–250 · IT Services

Hugging Face is hiring a Data/Infrastructure Advocate Engineer to bridge data infrastructure and the community by championing Xet storage on the Hub and enabling efficient storage, versioning, and collaboration on large-scale datasets.

AWS, GitHub, Pandas, Python
