Alpaca

Alpaca is a developer-first platform for stock and crypto trading, offering easy-to-use APIs for building apps and trading algorithms.

Capital Markets · 51-250 employees · Founded 2015 · $87M raised

Responsibilities

  • Design and oversee forward- and reverse-ETL patterns to deliver data to stakeholders.
  • Develop scalable transformation-layer patterns that enable repeatable integrations with BI tools across business verticals.
  • Expand and maintain the Alpaca Data Lakehouse architecture.
  • Collaborate with sales, marketing, product, and operations teams to address data flow needs.
  • Operate the system and resolve production issues promptly.
  • Support batch, streaming, transformation, consumption, experimentation, cataloging, monitoring, and alerting components of the data platform.
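The forward-ETL flow these responsibilities describe (extract raw events, transform them in a repeatable layer, load the result in a shape a BI tool can consume) can be sketched in a few lines of plain Python. Everything below is illustrative only; the event schema and function names are assumptions, not Alpaca's actual pipeline:

```python
# Toy, stdlib-only sketch of a forward-ETL step: extract -> transform -> load.
# The event fields ("symbol", "qty", "price") are hypothetical examples.
from typing import Iterable

# Extract: raw trade events as they might land in the lakehouse.
RAW_EVENTS = [
    {"symbol": "AAPL", "qty": 10, "price": 190.0},
    {"symbol": "AAPL", "qty": 5, "price": 191.0},
    {"symbol": "BTCUSD", "qty": 1, "price": 62000.0},
]

def transform(events: Iterable[dict]) -> dict:
    """Transformation layer: aggregate notional traded value per symbol."""
    totals: dict = {}
    for e in events:
        totals[e["symbol"]] = totals.get(e["symbol"], 0.0) + e["qty"] * e["price"]
    return totals

def load(totals: dict) -> list:
    """Load step: shape the aggregate into rows for a downstream BI consumer."""
    return [{"symbol": s, "notional": n} for s, n in sorted(totals.items())]

rows = load(transform(RAW_EVENTS))
print(rows)
```

In a production setting this transformation layer would live in formalized SQL models (e.g., dbt) and be orchestrated rather than run inline, but the extract/transform/load separation is the same.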

Requirements

  • 7+ years of experience in data engineering.
  • 2+ years of experience building scalable, low-latency data platforms handling more than 100M events per day.
  • Strong working knowledge of Python and SQL.
  • Proficiency in at least one programming language.
  • Experience with cloud-native technologies such as Docker, Kubernetes, and Helm.
  • Hands-on experience with relational databases and object storage implementations such as Apache Iceberg.
  • Hands-on experience with Google Cloud Platform and data services such as Composer, Dataproc, and Datastream.
  • Experience building scalable transformation layers, preferably using formalized SQL models such as dbt.
  • Experience with ETL orchestrators or frameworks such as Apache Airflow and Airbyte.
  • Production experience with streaming systems such as Kafka.
  • Exposure to infrastructure, DevOps, and Infrastructure as Code such as Terraform.
  • Deep knowledge of distributed systems, storage, transactions, and query processing using engines such as Trino (formerly PrestoSQL).
  • Ability to work in a fast-paced environment and adapt solutions to changing business needs.

Benefits

  • Competitive salary with stock options.
  • Health benefits.
  • One-time USD $500 new hire home-office setup stipend.
  • Monthly USD $150 stipend via Brex Card.

Interested in this position?

Apply directly on the company website


Similar Roles

Senior Data Engineer

phData · 251-1K employees · IT Services

phData is building a talent pipeline for a remote-first data engineering and analytics role that supports global enterprises by delivering modern cloud data solutions.

Apache Airflow Apache Spark AWS Azure Cassandra Databricks dbt Elasticsearch GCP Hadoop HDFS Java Kafka Luigi Python Scala Snowflake Solr SQL
26 minutes ago

Synthetic Data Engineer (AI Data/Training)

Hyphen Connect · 1-10 employees · Staffing & Recruiting

A Synthetic Data Engineer at the organization will design and manage domain-specific synthetic data pipelines that support data processing and model training workflows.

Apache Airflow Apache Spark
39 minutes ago

Software Engineer II, Data Automation

Klaviyo · 1K-5K employees · IT Services

Klaviyo is hiring an engineer for its Data Automation team to build and maintain tooling and infrastructure that powers mission-critical analytics systems on a real-time AWS data platform.

Apache Airflow Apache Spark AWS ClickHouse DynamoDB Kafka Kubernetes Linux MySQL Python Terraform
56 minutes ago

Senior Data Engineer (Informatica CDI)

NEORIS · 5K-10K employees · Internet Software & Services

EPAM/NEORIS is seeking a Senior Data Engineer to develop and secure data integration and data platform solutions in a multicultural environment, working with cloud and enterprise data technologies.

Agile Azure Databricks MongoDB Oracle PostgreSQL Power BI SQL Server
2 hours, 10 minutes ago
