Intelligent Medical Objects

IMO is a leading healthcare data enablement company with expertise in clinical terminology and precise data capture at the point of care, helping clinicians make informed decisions that improve patient care.

IT Services
251-1,000 employees
Founded 1994

Description

  • Build and operate production-grade data platforms that support products, analytics, and machine learning use cases.
  • Design, develop, and maintain batch and incremental data pipelines using modern lakehouse and cloud-native patterns.
  • Ingest, transform, and serve structured and semi-structured data using AWS and Databricks.
  • Design analytics-ready datasets, APIs, and aggregation layers for dashboards, BI tools, and Angular applications.
  • Develop well-documented data models that balance usability, performance, and correctness.
  • Apply software engineering practices such as version control, testing, CI/CD, and infrastructure-as-code to data work.
  • Collaborate with product, analytics, AI, and engineering teams to translate requirements into scalable technical solutions.
  • Improve reliability, performance, observability, and cost-efficiency of data systems through monitoring and continuous optimization.
  • Implement data quality checks, validation frameworks, and lineage-aware workflows.
  • Contribute to data platform standards around orchestration, modeling, environments, and deployment.
  • Own deliverables in an Agile environment and mentor other engineers through code quality and technical decision-making.
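The data quality checks and validation frameworks mentioned above can be sketched in plain Python. This is an illustrative sketch only; the record shape, field names (`patient_id`, `code`), and validation rule are assumptions, not details from the posting:

```python
from dataclasses import dataclass, field

@dataclass
class ValidationResult:
    """Records partitioned by whether they passed validation."""
    passed: list = field(default_factory=list)
    failed: list = field(default_factory=list)

def validate_records(records, required_fields):
    """Route each record to passed/failed based on required non-null fields."""
    result = ValidationResult()
    for rec in records:
        if all(rec.get(f) is not None for f in required_fields):
            result.passed.append(rec)
        else:
            result.failed.append(rec)
    return result

# Hypothetical clinical-coding rows; the second is missing a required field.
rows = [
    {"patient_id": "p1", "code": "E11.9"},
    {"patient_id": None, "code": "I10"},
]
checked = validate_records(rows, ["patient_id", "code"])
```

In a production pipeline the failed partition would typically be quarantined and surfaced through monitoring rather than silently dropped, which is what makes the workflow lineage-aware.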

Requirements

  • Bachelor’s degree in a relevant technical field and 5+ years of professional experience, or 7+ years of equivalent hands-on experience.
  • Demonstrated experience building and supporting end-to-end data platforms in production.
  • Strong programming experience in Python and SQL.
  • Deep experience with core AWS services such as S3, EC2, RDS, and IAM.
  • Experience with Databricks and Spark-based processing.
  • Strong SQL skills, including complex transformations and performance-aware query design.
  • Experience supporting dashboards and BI use cases by delivering datasets, APIs, and data models.
  • Hands-on experience with data orchestration frameworks such as Airflow or equivalent.
  • Experience designing and optimizing data models for analytics, reporting, and downstream applications.
  • Familiarity with version control, CI/CD, and infrastructure-as-code tools such as Git and Terraform.
  • Comfort working with large, complex, and evolving datasets, including schema changes and metadata.
  • Strong analytical, debugging, and root-cause analysis skills.
  • Clear written and verbal communication skills, including documenting designs and tradeoffs.
  • Proactive, ownership-oriented mindset and ability to work effectively across teams.
  • Experience with dbt or similar analytics engineering tools and patterns (preferred).
  • Familiarity with data observability, monitoring, and cost management in cloud environments (preferred).
  • Experience supporting AI/ML data pipelines or feature engineering workflows (preferred).
  • Exposure to streaming or near-real-time data processing concepts (preferred).
  • Experience with healthcare, clinical, or regulated data domains (preferred).
  • Familiarity with metadata management, data catalogs, and lineage concepts (preferred).
  • AWS certifications such as Data Engineer, Solutions Architect, or AI/ML (preferred).
  • Experience with Angular, Power BI, or Tableau (preferred).
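The incremental-pipeline experience called for above usually comes down to watermark-based extraction: only pull rows newer than the last successfully processed timestamp. A minimal sketch in plain Python, assuming a hypothetical `updated_at` ISO-8601 string column (real pipelines would read the watermark from pipeline state and compare database timestamps):

```python
def incremental_extract(rows, last_watermark):
    """Return rows newer than the watermark, plus the new high-water mark.

    ISO-8601 timestamp strings compare correctly lexicographically,
    so plain > comparison is sufficient for this sketch.
    """
    new_rows = [r for r in rows if r["updated_at"] > last_watermark]
    next_watermark = max(
        (r["updated_at"] for r in new_rows), default=last_watermark
    )
    return new_rows, next_watermark

# Hypothetical source rows; only the second postdates the stored watermark.
rows = [
    {"id": 1, "updated_at": "2024-01-01T00:00:00"},
    {"id": 2, "updated_at": "2024-01-03T00:00:00"},
]
new_rows, wm = incremental_extract(rows, "2024-01-02T00:00:00")
```

Persisting `wm` only after the batch commits is what makes the pattern idempotent on retry: a failed run re-reads from the old watermark instead of skipping rows.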

Benefits

  • $130,000 - $180,000 base salary.
  • Potential bonuses, equity, or sales incentives as part of total compensation.
  • Comprehensive benefits package.
  • Remote work opportunity.
  • Role-based compensation that varies by experience, skills, and location.

Interested in this position?

Apply directly on the company website


