10x Genomics

10x Genomics provides genomics tools that let researchers explore single-cell and spatial biology, enabling discoveries that advance the understanding of health and disease and, ultimately, human health.

Biotechnology
1K-5K
Founded 2012
$243M raised

Description

  • Lead the architecture and delivery of the Single Source of Truth for the Unified Data Platform.
  • Architect and implement a canonical data layer and event-driven architecture to support real-time data flow.
  • Design, build, and optimize high-volume batch and real-time data pipelines across enterprise systems.
  • Establish and govern Amazon S3 as the data platform foundation using medallion architecture and schema evolution.
  • Develop and maintain scalable ELT pipelines and data models in Snowflake.
  • Build the self-service analytics presentation layer, including a Natural Language Query interface integrated with generative AI.
  • Lead migration of key business domains off legacy middleware onto the new platform.
  • Define and enforce data governance, quality, and security standards across the platform.
  • Collaborate with engineering, business teams, and the Architecture Review Board on modern data architecture approaches.
  • Own the full development lifecycle from prototyping and design through deployment, monitoring, and operational excellence.
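The medallion architecture named in the responsibilities above can be sketched as a minimal bronze → silver → gold flow. This is an illustrative assumption of how such layers relate, not the company's implementation: the function names and event fields (`order_id`, `customer`, `amount`) are hypothetical, and a real pipeline would land each layer in Amazon S3 prefixes and transform it with Snowflake or Spark rather than in memory.

```python
import json
from datetime import datetime, timezone

def to_bronze(raw_events):
    """Bronze: land raw payloads untouched, stamped with ingest time."""
    ts = datetime.now(timezone.utc).isoformat()
    return [{"ingested_at": ts, "payload": e} for e in raw_events]

def to_silver(bronze):
    """Silver: parse and validate; drop records that fail schema checks."""
    silver = []
    for rec in bronze:
        try:
            event = json.loads(rec["payload"])
        except json.JSONDecodeError:
            continue  # a real pipeline would quarantine bad records
        if ("order_id" in event and "customer" in event
                and isinstance(event.get("amount"), (int, float))):
            silver.append(event)
    return silver

def to_gold(silver):
    """Gold: aggregate into a business-facing metric (revenue per customer)."""
    totals = {}
    for event in silver:
        totals[event["customer"]] = totals.get(event["customer"], 0) + event["amount"]
    return totals
```

Each layer only reads the one before it, so schema changes in raw events are absorbed at the silver boundary without breaking downstream consumers:

```python
raw = [
    '{"order_id": 1, "customer": "a", "amount": 10.0}',
    '{"order_id": 2, "customer": "a", "amount": 5.0}',
    'not json',
    '{"order_id": 3, "customer": "b", "amount": 2.0}',
]
to_gold(to_silver(to_bronze(raw)))  # → {'a': 15.0, 'b': 2.0}
```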

Requirements

  • Bachelor’s degree in Computer Science, Information Management, or a related field, or equivalent experience.
  • 5+ years of hands-on experience in software engineering focused on data platform development, distributed systems, or enterprise integrations.
  • Proven experience designing and implementing highly scalable data platforms on AWS, GCP, Azure, or similar cloud environments.
  • Deep proficiency in one or more programming languages such as Python, Java, or similar.
  • Strong foundation in computer science fundamentals, including data structures, algorithms, and system design.
  • Experience with message queues and event streaming platforms such as Kafka, RabbitMQ, or Pub/Sub, with event-driven architecture experience.
  • Experience building data lakes or lakehouses using Apache Iceberg on cloud storage such as Amazon S3.
  • Experience with modern ELT development, data modeling for OLAP/data warehousing, and Snowflake features such as Snowpipes, Streams, and Stored Procedures.
  • Familiarity with Docker, Kubernetes, and infrastructure-as-code principles.
  • Prior experience migrating an organization off a traditional iPaaS platform or eliminating legacy middleware.
  • Experience integrating generative AI for data access, such as natural language query (NLQ) interfaces or feature stores.
  • Core technologies include Snowflake, Python or Java/Scala, Kafka or similar event streaming tools, AWS or another major cloud platform, Apache Iceberg/Amazon S3, advanced SQL, and orchestration tools such as Airflow.
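The event-driven architecture experience asked for above can be illustrated with a tiny in-process publish/subscribe sketch. `EventBus` and the `orders` topic are hypothetical stand-ins for a broker such as Kafka, which additionally provides partitioning, persistence, and consumer groups; the point here is only the decoupling of producers from consumers.

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process stand-in for a topic-based broker (e.g. Kafka).
    Illustrative only: no persistence, partitions, or delivery guarantees."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callback for every event published to `topic`."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        """Deliver `event` to all handlers subscribed to `topic`."""
        for handler in self._subscribers[topic]:
            handler(event)

# Two independent consumers react to the same event without the
# producer knowing either exists -- the core decoupling property
# of an event-driven architecture.
bus = EventBus()
audit_log, totals = [], {"revenue": 0.0}
bus.subscribe("orders", lambda e: audit_log.append(e))
bus.subscribe("orders",
              lambda e: totals.update(revenue=totals["revenue"] + e["amount"]))
bus.publish("orders", {"order_id": 1, "amount": 25.0})
```

New consumers (say, a fraud check) can be added by subscribing to the same topic, with no change to the producer.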

Benefits

  • Base salary range of $168,200 to $227,600 USD.
  • Eligible for equity grants.
  • Comprehensive health and retirement benefit programs.
  • Annual bonus program or sales incentive program.
  • Total compensation package may vary based on skills, qualifications, and experience.

