Samsara

Samsara pioneers the Connected Operations Cloud, offering AI-powered safety programs, real-time visibility, and integrations that help industries worldwide improve efficiency, safety, and sustainability.

IT Services
1K-5K
Founded 2015

Description

  • Serve as a primary responder for production data incidents: diagnose root causes, implement fixes, and ensure data integrity.
  • Design, implement, and maintain monitoring, logging, and alerting systems for production data pipelines and infrastructure.
  • Manage, deploy, and maintain data ingestion and integration pipelines and APIs, including backend ingestion jobs and minor enhancements/bug fixes.
  • Continuously identify and implement performance optimizations to improve speed, scalability, and efficiency of data processing jobs and API performance.
  • Develop and enforce data validation and quality checks within pipelines to minimize errors and inconsistencies in production data.
  • Collaborate with DevOps teams to manage and operate underlying AWS infrastructure that hosts the data platform.
  • Maintain comprehensive documentation, operational runbooks, and troubleshooting procedures for pipelines and platform operations.
  • Communicate incident status, SLA reports, and operational updates to management and cross-functional stakeholders.
  • Work with diverse data sources (CRM, product, marketing, order flow, support tickets, finance, etc.) to support downstream analytics and business needs.
  • Champion and embed Samsara’s cultural principles across the team and organization.
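The data-validation and quality-check work described above could look roughly like the following sketch: a per-record rule check plus a pipeline-level error-rate gate. The field names, rules, and threshold here are hypothetical illustrations, not Samsara's actual schema or tooling.

```python
# Hypothetical row-level validation step for an ingestion pipeline.
# Field names and rules are illustrative only.

def validate_order(row: dict) -> list[str]:
    """Return a list of validation errors for one ingested record."""
    errors = []
    if not row.get("order_id"):
        errors.append("missing order_id")
    amount = row.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        errors.append(f"invalid amount: {amount!r}")
    if row.get("currency") not in {"USD", "CAD", "EUR"}:
        errors.append(f"unknown currency: {row.get('currency')!r}")
    return errors

def run_quality_check(rows: list[dict], max_error_rate: float = 0.01) -> bool:
    """Gate a pipeline step: pass only if the share of invalid rows
    stays at or below max_error_rate."""
    bad = [r for r in rows if validate_order(r)]
    rate = len(bad) / len(rows) if rows else 0.0
    return rate <= max_error_rate
```

In practice a failing gate like this would typically page the on-call responder and block downstream loads, which ties the validation work to the incident-response and alerting responsibilities listed above.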

Requirements

  • Bachelor’s degree in computer science, data engineering, data science, information technology, or an equivalent engineering program.
  • 3+ years of experience in Data Engineering, Data Operations, or SRE supporting production data environments.
  • Proficiency with SQL for data analysis and validation.
  • Experience with Python or a similar scripting language.
  • Exposure to ETL tools such as Fivetran, dbt, Workato, or equivalent.
  • Experience with Python-based API frameworks and API management tools.
  • Experience with RDBMS (MySQL, AWS RDS/Aurora MySQL, PostgreSQL, Oracle, or equivalent).
  • Experience with at least one major cloud provider (AWS, GCP, or Azure) and familiarity with data warehouses (Databricks, BigQuery, Redshift, Snowflake, or equivalent).
  • Familiarity with CI/CD, DevOps tools and practices, and logging/monitoring tools (Splunk, Datadog, AWS CloudWatch, or equivalent).
  • Preferred: experience with AWS serverless components (API Gateway, Lambda, S3, SNS, SQS, Secrets Manager), strong analytical/troubleshooting skills, and the ability to work with business users and cross-functional stakeholders.

Benefits

  • Competitive total compensation including base salary (range: $104,550–$123,000 CAD), bonus/variable pay, and restricted stock unit (RSU) awards for eligible roles.
  • Potential for annual RSU refresh grants and above-market equity refresh awards for top performers.
  • Employee-led remote and flexible working model with support for hybrid or in-office where applicable.
  • Health benefits for full-time employees.
  • Support for career development, rapid internal mobility, and opportunities to grow within a high-growth company.
  • Inclusive workplace with accommodations for applicants and employees with disabilities and a commitment to equal opportunity.

Interested in this position?

Apply directly on the company website


Similar Roles

Data Engineering Tech Lead

Lingaro 5K-10K IT Services

Data Engineering Tech Lead at Lingaro (Data Engineering & Management) — lead a Poland-based remote/full-time team to design, deliver, and maintain scalable, secure data engineering solutions while mentoring engineers and ensuring timely, high-quality project delivery.

Azure CI/CD Python Scala SQL

Senior Software Engineer - Data Integration & JVM Ecosystem

ClickHouse 51-250 IT Services

Senior Software Engineer (JVM) at ClickHouse joining the Connectors team to own and maintain JVM-based data framework integrations, connectors, and drivers that enable high-performance data ingestion and a seamless developer experience for data engineering workloads.

Apache Airflow Apache Spark ClickHouse dbt Grafana HTTP Java Kafka Metabase Pandas Power BI Python SQL Tableau TCP/IP

Junior Data Engineer (Remote Argentina)

GlobalVision 51-250 Internet Software & Services

Junior Data Engineer at GlobalVision supporting and maintaining the company’s data infrastructure to ensure reliable, accessible, and actionable data that informs business decision-making across the organization.

dbt Domo Machine Learning Power BI Python Salesforce SQL Tableau

Data/Infrastructure Advocate Engineer - EMEA Remote

Hugging Face 51-250 IT Services

Hugging Face is hiring a Data/Infrastructure Advocate Engineer to bridge data infrastructure and the community by championing Xet storage on the Hub and enabling efficient storage, versioning, and collaboration on large-scale datasets.

AWS GitHub Pandas Python
