TeamSnap

TeamSnap provides a comprehensive sports team management app that simplifies communication and organization for coaches, parents, and administrators involved in youth, recreational, and competitive sports.

Media · 51-250 employees · Founded 2009 · $59M raised

Responsibilities

  • Build and maintain data extraction pipelines from multiple source systems into BigQuery.
  • Refactor and modernize legacy extraction systems and data workflows.
  • Design, implement, and support the team’s data orchestration platform.
  • Build scheduling, dependency management, and failure recovery into data workflows.
  • Implement monitoring, alerting, and visibility for pipeline health and status.
  • Maintain and optimize the BigQuery data warehouse.
  • Support dbt model development, deployment workflows, and data infrastructure changes through CI/CD.
  • Implement security controls and access management for data systems and support SOC 2 compliance requirements.
  • Document data flows and maintain data catalogs.
  • Collaborate with a staff data engineer and cross-functional stakeholders to deliver reliable data infrastructure.
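
To give a flavor of the orchestration duties listed above (scheduling, dependency management, and failure recovery), here is a toy sketch of a dependency-ordered task runner with retries. This is illustrative only and does not reflect TeamSnap's actual stack; in practice this role would use an orchestrator such as Airflow, Dagster, or Prefect, as the requirements note. All task names below are hypothetical.

```python
# Toy illustration of orchestration concepts: dependency ordering,
# retry on failure, and skipping downstream tasks when an upstream fails.
# Not production code; real pipelines would use Airflow/Dagster/Prefect.
from graphlib import TopologicalSorter


def run_pipeline(tasks, deps, max_retries=2):
    """Run zero-arg callables in dependency order with simple retries.

    tasks: dict of name -> callable
    deps:  dict of name -> set of upstream task names
    Returns dict of name -> "ok", "failed", or "skipped".
    """
    status = {}
    for name in TopologicalSorter(deps).static_order():
        # Failure recovery: don't run a task whose upstream failed.
        if any(status.get(up) in ("failed", "skipped") for up in deps.get(name, ())):
            status[name] = "skipped"
            continue
        for attempt in range(max_retries + 1):
            try:
                tasks[name]()
                status[name] = "ok"
                break
            except Exception:
                status[name] = "failed"  # a real system would alert here
    return status
```

A usage sketch with a hypothetical extract → load → transform chain, where the load step succeeds on its second attempt:

```python
calls = {"n": 0}
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient failure")

tasks = {"extract": lambda: None, "load": flaky_load, "transform": lambda: None}
deps = {"extract": set(), "load": {"extract"}, "transform": {"load"}}
run_pipeline(tasks, deps)  # every task ends up "ok" after one retry
```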

Requirements

  • 3+ years of data engineering experience building extraction pipelines.
  • Strong SQL skills and experience with modern data warehouses such as BigQuery, Snowflake, or Redshift.
  • Experience with data orchestration tools such as Airflow, Dagster, Prefect, or similar.
  • Google Cloud experience with services such as BigQuery, Cloud SQL, Pub/Sub, and Cloud Storage; AWS or Azure experience is also acceptable.
  • Experience with CI/CD and containerization, including Docker.
  • Ability to communicate clearly with both technical and non-technical stakeholders.
  • Experience with dbt.
  • Experience with Fivetran or similar managed ingestion tools.
  • Background in compliance or governance work such as SOC 2 or data privacy.
  • Experience cleaning up legacy data systems.
  • Preferred: familiarity with TeamSnap, especially as a parent, coach, or recreational sports participant.

Benefits

  • Minimum starting compensation of $130,000 OTE, inclusive of base and bonus.
  • Unlimited PTO.
  • Paid parental leave for all parents.
  • 100% premium coverage of medical, dental, and vision insurance for you and your family.
  • 401(k) for retirement savings.
  • $1,500 annual learning and development stipend.
  • Generous home office allowance.
  • Monthly reimbursement for health and wellness expenses.
  • Travel to fun locations for all-company meetings and team events.
  • Remote-first, fully remote work environment.

Interested in this position?

Apply directly on the company website


Similar Roles

Data Engineer

Remotebase 51-250 Internet Software & Services

A role at an organization enhancing its cloud data platform, focused on building and operating automated data infrastructure, pipelines, and transformation workflows to enable reliable data delivery across the business.

Agile Azure Bash CI/CD dbt Docker GCP GitHub Actions GitLab CI Jenkins Kubernetes Python Snowflake SQL Terraform

Principal Software/Data Engineer

PointClickCare 1K-5K Health Care Providers & Services

PointClickCare is hiring a Principal Software/Data Engineer to lead the design and delivery of production-grade streaming and real-time data pipelines for its healthcare data platform.

AWS Azure CI/CD Databricks dbt Flink GCP Kafka

Data Engineer (EU)

Swish Analytics 1-10 Internet Software & Services

Swish Analytics is hiring a Europe-based remote Data Engineer to help build and operate real-time sports analytics and betting data products, with a focus on non-US sports coverage and enterprise-grade data delivery.

Apache Airflow AWS CI/CD Git Kubernetes Machine Learning MySQL Python REST API Shell Scripting SQL

Senior Data Engineer

Murmuration 11-50 Diversified Consumer Services

Murmuration is seeking a Senior Data Engineer to build and maintain the data infrastructure that powers its unified civic data platform, Atlas, supporting research, product, and community-impact work across complex, high-volume public and political datasets.

Apache Airflow AWS CI/CD Dagster dbt Docker MongoDB Python Snowflake
