HHAeXchange

HHAeXchange is a premier homecare management software platform that connects providers, payers, and caregivers to deliver proactive care, efficiency, and transparency across the industry.

Health Care Providers & Services
251-1K
Founded 2008

Description

  • Design end-to-end data architectures, frameworks, and pipelines for large-scale data ingestion, processing, transformation, reporting, and analytics.
  • Collaborate with stakeholders across engineering, product, operations, and customer success to align data work with business and product strategy.
  • Build and optimize ETL/ELT workflows, data models, and distributed data processing jobs.
  • Implement solutions using Python, SQL, dbt, AWS Glue, Airflow, Docker/Kubernetes, Snowflake, and other cloud-native services.
  • Establish best practices for data quality, integrity, lineage, governance, performance, and reliability across data platforms and analytics applications.
  • Establish and maintain SLAs, monitoring, alerting, and observability for mission-critical data services.
  • Identify and resolve performance bottlenecks to improve scalability and cost efficiency.
  • Provide technical leadership and mentorship to engineers across the organization.
  • Promote engineering excellence through CI/CD, infrastructure as code, automated testing, observability, security/privacy, and AI assistant tools.
  • Lead complex problem-solving efforts and influence decisions on tools, technologies, and design patterns.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Engineering, Data Science, or a related field is highly desirable.
  • 8+ years of combined software and data engineering experience with a focus on distributed data processing and data-intensive applications.
  • 5+ years of experience architecting, implementing, and maintaining reporting and analytics tools such as Looker, Power BI, or Tableau.
  • 3+ years of experience working in highly regulated industries, preferably healthcare or homecare.
  • Strong expertise in Python, SQL, query optimization, database optimization, data pipelines, data modeling, data governance, and privacy/security best practices.
  • Deep experience with cloud platforms such as AWS, Azure, or GCP and modern data warehousing technologies such as Snowflake, BigQuery, or Redshift.
  • Demonstrated ability to lead large-scale data initiatives and influence architectural decisions across teams.
  • Excellent communication and collaboration skills with the ability to explain complex technical concepts to diverse stakeholders.
  • Experience designing data platforms that support ML/AI workflows is preferred.
  • Hands-on experience with AI/ML, LLMs, agentic development, and AI assistant tools is preferred.
  • Experience in healthcare or homecare technology and product development, including EHR systems and HL7/FHIR standards, is preferred.
  • Willingness to responsibly adopt AI tools to improve productivity and innovation.

Benefits

  • Base salary range of $155,000 to $184,000 per year, excluding variable compensation.
  • Benefits-eligible role with competitive health plans.
  • Paid time off.
  • Company-paid holidays.
  • 401(k) retirement program with company match.
  • Other company-sponsored programs.

Apply directly on the company website.

Similar Roles

Senior Data Engineer

Nova 1K-5K Professional Services

Senior Data Engineer at a remote U.S.-based engineering team focused on maintaining and improving data warehouse infrastructure, pipelines, and reporting to deliver reliable data solutions and actionable insights.

Apache Airflow Apache Spark AWS Azure CI/CD Databricks GCP Git Power BI Python Snowflake SQL Tableau

Azure Data Engineer (ETL Developer)

Kyivstar 1K-5K Wireless Telecommunication Services

Kyivstar is hiring an Azure Data Engineer to develop and optimize large-scale data and analytics solutions for the company’s cloud-based Big Data environment.

BDD Bitbucket Databricks Docker Git Hadoop Java Kafka Kubernetes Machine Learning Power BI Python R Scala SQL

Data Developer

TextNow 51-250 Wireless Telecommunication Services

TextNow is hiring a remote Data Developer in Canada to own and evolve its data platform, supporting data-informed decision-making across a fast-growing mobile communications business.

Apache Airflow Apache Spark AWS Databricks dbt Python Scala Snowflake SQL

Member of Technical Staff, Reporting & Statements (Data Engineer)

Anchorage Digital 251-1K Capital Markets

Anchorage Digital is seeking a Member of Technical Staff, Reporting & Statements to help build automated data systems that power accurate financial reports and statements for institutional crypto clients.

Apache Airflow Dagster GCP Pandas Python SQL
