Principal Software/Data Engineer

20 hours, 35 minutes ago
Full-time
Lead
Software Development
PointClickCare

PointClickCare provides a leading cloud-based healthcare software platform that enables long-term and post-acute care providers to effectively manage the complete lifecycle of resident care while enhancing operational efficiency and improving resident ...

Health Care Providers & Services
1K-5K employees
Founded 2000
$232M raised

Description

  • Lead the design and implementation of scalable streaming data pipelines.
  • Engineer and optimize real-time data solutions using Apache Kafka, Flink, and Spark Streaming.
  • Collaborate cross-functionally with product, analytics, and AI teams to ensure data supports business priorities.
  • Advance modernization efforts by deepening adoption of event-driven architectures and cloud-native technologies.
  • Drive best practices in data governance, observability, and performance tuning for streaming workloads.
  • Define schema contracts, implement transformation tests and data assertions, and automate data quality checks across batch and streaming pipelines.
  • Establish monitoring for data pipelines using metrics, logging, tracing, SLAs, SLOs, alerting, and dashboards.
  • Mentor team members and provide technical guidance while contributing as a hands-on individual contributor.
  • Foster a culture of quality through peer reviews and constructive feedback.
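The schema-contract and data-assertion work described above can be sketched in miniature. This is an illustrative example only, not taken from the posting: the field names (`resident_id`, `event_type`, `recorded_at`) and the dead-letter pattern are assumptions, and a production pipeline would typically express these checks through tools like Great Expectations or dbt tests rather than hand-rolled code.

```python
# Hedged sketch of a schema contract enforced as data assertions on a
# stream of event records. Field names and shapes are hypothetical.

REQUIRED_FIELDS = {"resident_id": str, "event_type": str, "recorded_at": str}


def validate_record(record: dict) -> list[str]:
    """Return a list of contract violations; an empty list means the record passes."""
    violations = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            violations.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            violations.append(f"wrong type for {field}: expected {expected_type.__name__}")
    return violations


def partition_stream(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a micro-batch into valid records and a dead-letter list,
    as a streaming job might do before writing downstream."""
    valid, dead_letter = [], []
    for record in records:
        (dead_letter if validate_record(record) else valid).append(record)
    return valid, dead_letter
```

Routing failures to a dead-letter list (rather than dropping or crashing) is one common way streaming jobs keep quality checks from blocking the pipeline.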

Requirements

  • 10+ years of professional experience in software or data engineering, including at least 4 years focused on streaming and real-time data systems.
  • Proven experience driving technical direction and mentoring engineers while delivering complex, high-scale solutions as a hands-on contributor.
  • Deep expertise with streaming technologies such as Apache Kafka, Flink, and Spark Streaming.
  • Strong understanding of event-driven architectures and distributed systems with experience building resilient, low-latency pipelines.
  • Experience with cloud platforms such as AWS, Azure, or GCP and containerized deployments for data workloads.
  • Fluency in data quality practices and CI/CD integration, including schema management, automated testing, and validation frameworks such as dbt or Great Expectations.
  • Experience implementing observability for data pipelines, including metrics, logging, tracing, and alerting.
  • Strong foundation in data governance and performance optimization across batch and streaming environments.
  • Experience with Lakehouse architectures and related technologies such as Databricks, Azure ADLS Gen2, and Apache Hudi.
  • Strong collaboration and communication skills with the ability to influence stakeholders and evangelize modern data practices.
  • Preferred: Strong analytical and problem-solving mindset.
  • Preferred: Ability to learn quickly and adapt to new technologies.
  • Preferred: Self-starter who thrives with minimal supervision and works well as a team player.
  • Preferred: Excellent organizational and critical-thinking skills.
  • Preferred: Comfortable leveraging AI tools to accelerate development.
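The observability requirement (metrics, SLOs, alerting) can likewise be sketched. The metric, the p95 objective, and the threshold below are illustrative assumptions, not details from the posting; real deployments would emit these metrics to a system such as Prometheus or Datadog instead of computing them in-process.

```python
# Hedged sketch: tracking a per-record latency metric against a p95 SLO
# and exposing a breach flag that an alerting rule could fire on.
from dataclasses import dataclass, field
from statistics import quantiles


@dataclass
class PipelineMonitor:
    slo_p95_ms: float                      # latency objective, e.g. 500 ms (assumed)
    samples: list = field(default_factory=list)

    def record(self, latency_ms: float) -> None:
        self.samples.append(latency_ms)

    def p95(self) -> float:
        # The 19th of 20 quantile cut points approximates the 95th percentile.
        return quantiles(self.samples, n=20)[18]

    def breaching_slo(self) -> bool:
        return self.p95() > self.slo_p95_ms
```

Using a percentile rather than a mean keeps the alert sensitive to tail latency, which is usually what an SLO for a low-latency pipeline is written against.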

Benefits

  • Base salary range of $183,200 to $203,500 per year, plus bonus and benefits.
  • Benefits starting from day 1.
  • Retirement plan matching.
  • Flexible paid time off.
  • Wellness support programs and resources.
  • Parental and caregiver leaves.
  • Fertility and adoption support.
  • Continuous development support program.
  • Employee Assistance Program.
  • Employee recognition programs.
  • Remote role with required travel to office events in Mississauga and/or Salt Lake City.
