Sr. Lead Software Engineer - 11282

2 weeks, 6 days ago
Full-time
Lead
Software Development
Coupa Software

Coupa Software is the premier cloud-based finance platform, empowering companies worldwide to optimize spend, boost profits, and reduce costs with a comprehensive suite of modules.

Internet Software & Services
1,000-5,000 employees
Founded 2006

Description

  • Design and implement scalable, high-throughput data ingestion systems that integrate internal and external data across domains.
  • Build and evolve core data platform components, including ingestion, validation, orchestration, and lineage.
  • Develop and maintain a centralized data lake using Apache Iceberg or similar table formats.
  • Design and implement cloud-agnostic data ingestion and processing patterns across AWS, GCP, and Azure.
  • Contribute hands-on to the semantic layer so data is easy for BI and analytics teams to consume.
  • Partner with Senior Data Engineers, Platform Engineers, and Analytics Engineers to align data production, storage, and consumption.
  • Establish engineering standards for testing, observability, and operational excellence.
  • Provide technical leadership through mentorship, code reviews, and design discussions while remaining hands-on.
  • Improve platform latency, reliability, and performance as the system scales.
  • Support challenging data problems such as high-throughput ingestion and sub-second retrieval and query performance.
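The ingestion and retrieval challenges listed above can be illustrated in miniature. The following is a hedged, self-contained Python sketch (not Coupa's actual stack; `MiniIngestor` and its field names are hypothetical) showing the two ideas the bullets combine: batching writes for throughput, and partitioning committed data so reads only scan the partition they need.

```python
from collections import defaultdict

class MiniIngestor:
    """Toy batched ingestion with a partition index (illustrative only)."""

    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self.buffer = []                      # uncommitted records
        self.partitions = defaultdict(list)   # partition key -> committed records

    def ingest(self, record):
        # Buffer writes and commit in batches, amortizing per-record overhead.
        self.buffer.append(record)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        # Commit the buffer, routing each record to its partition.
        for rec in self.buffer:
            self.partitions[rec["region"]].append(rec)
        self.buffer.clear()

    def query(self, region):
        # Partition pruning: scan only the partition asked for,
        # not the whole dataset.
        return self.partitions[region]
```

Real systems replace the dict with an object store and a table format such as Iceberg, but the throughput/latency trade-off is the same: larger batches raise ingest throughput while partitioned layout keeps point queries fast.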

Requirements

  • 8-10+ years of experience in software or platform engineering with a focus on scalable data and analytics platforms.
  • Strong understanding of data ingestion patterns at scale, including CDC.
  • Experience modeling and storing data in a data lake for fast, efficient retrieval.
  • Proven experience building and operating large-scale data pipelines in production.
  • Experience with modern data warehouses such as Databricks, BigQuery, or Snowflake.
  • Strong proficiency in Python and SQL with production-quality, maintainable, and testable code.
  • Hands-on experience with cloud data services in AWS, GCP, or Azure.
  • Experience with query engines such as Presto or Trino for fast, reliable analytics over data lakes.
  • Familiarity with lakehouse architectures and table formats such as Iceberg or Delta Lake.
  • Familiarity with data governance, lineage, metadata, cataloging, and data quality practices.
  • Nice to have: exposure to semantic layers, metrics frameworks, or BI-friendly data modeling.
  • Nice to have: experience supporting analytics or AI/ML workloads.
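As an illustration of the CDC (change data capture) pattern named in the requirements, here is a minimal sketch, under the assumption of an in-memory keyed table (the `apply_cdc` helper and its event shape are hypothetical, not any particular platform's API). Applying a change stream reduces to upserts and deletes against a primary key:

```python
def apply_cdc(table, events):
    """Apply a CDC event stream to an in-memory keyed table.

    table:  dict mapping primary key -> row
    events: iterable of dicts with "op" in {"insert", "update", "delete"},
            a "key", and (for insert/update) a "row"
    """
    for ev in events:
        key = ev["key"]
        if ev["op"] in ("insert", "update"):
            table[key] = ev["row"]    # upsert: last write wins
        elif ev["op"] == "delete":
            table.pop(key, None)      # idempotent delete: a repeat is a no-op
    return table
```

At scale the same logic runs as a merge into lakehouse tables (e.g. Iceberg's row-level merge), but the invariant is identical: replaying the ordered change log reproduces the source table.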

Benefits

  • Estimated pay range of $149,000 to $193,500.
  • Remote work opportunity.
  • Welcoming and inclusive work environment.
  • Equal employment opportunity practices.
  • Opportunity to work on globally impactful technology.
  • Collaboration-driven culture with transparency and teamwork.

Apply directly on the company website.

