Thanks

Thanks turns “thank you” into revenue and happier customers with a rewards experience everyone loves to engage with. They help businesses unlock new revenue, discover new customers, and grow brand love through their innovative rewards platform. Whether e...

Diversified Consumer Services
1-10 employees

Description

  • Design and deliver a scalable data platform that serves as the primary engine for the Thanks Network.
  • Own the data models, technical implementation, and performance of ranking systems.
  • Build the end-to-end model lifecycle from training and validation to real-time inference.
  • Develop frameworks for experimentation to measure lift, attribution, and model success.
  • Build robust batch and near-real-time data pipelines that are resilient and observable.
  • Design trusted datasets and data marts that enable self-serve analytics for product, engineering, and commercial teams.
  • Define data tooling, architecture, and trade-offs, including what to build, buy, or retire.
  • Act as the go-to data expert across the business and influence roadmaps and decisions.
  • Partner effectively with Product, Engineering, and Commercial teams on data needs and priorities.

Requirements

  • Senior hands-on individual contributor experience in a high-growth or greenfield environment.
  • Strong experience in both data engineering and applied data science.
  • Advanced Python and complex, performance-optimised SQL skills.
  • Experience building and operating data pipelines in cloud environments.
  • Hands-on experience with both operational and analytical data stores, including analytical databases.
  • Familiarity with streaming or event-driven data architectures.
  • Strong communication skills and the ability to collaborate across Product, Engineering, and Commercial teams.
  • Comfort using AI to support exploration, modelling, debugging, and analysis while maintaining data quality and ownership.
  • Strong internal drive with a focus on performance, correctness, and durable systems.
  • PySpark, dbt, and strong SQL skills (must have).
  • At least one workflow orchestration tool such as Airflow, Dagster, or Step Functions (must have).
  • At least one DevOps/DataOps tool such as Terraform, CloudFormation, Azure ARM, or Kubernetes (must have).
  • At least one data warehouse or lakehouse platform such as Databricks, Snowflake, BigQuery, ClickHouse, or Redshift (must have).
  • Experience in adtech, marketplaces, or other performance-driven platforms (nice to have).
  • Exposure to experimentation frameworks and attribution models (nice to have).
  • Experience enabling analytics for non-technical teams (nice to have).
  • Experience with data catalog or feature store tools such as Databricks Unity Catalog or Atlas (nice to have).
  • Experience with event streaming tools such as Kafka or Kinesis (nice to have).
  • Experience supporting reporting tools such as Tableau, Power BI, or Superset (nice to have).
  • Experience with data quality and testing tools such as Great Expectations or dbt tests (nice to have).

Benefits

  • Attractive compensation, including meaningful equity.
  • Foundational ownership of core data systems from the ground up.
  • Direct impact on product performance, experimentation, and business scaling.
  • Hands-on role with strategic influence on a fast-growing platform.
  • Close collaboration with founders and the Head of Product.
  • A culture that values courage, high standards, and kindness.
  • Flexible work expectations, with openness to exceptional candidates across Australia’s east coast.

Interested in this position?

Apply directly on the company website

Similar Roles

AI Data Engineer

Influur · 11-50 · Media

Influur is hiring an AI Data Engineer in New York/remote to own the full data-to-agent pipeline behind its autonomous viral marketing system for influencer campaigns.

AWS · GCP · LLM · Python
4 hours, 35 minutes ago

Senior Data Engineer

Zencore Group · 11-50 · Internet Software & Services

Zencore is hiring a Senior Data Engineer in its LATAM Data & Analytics team to help customers modernize and migrate data platforms on Google Cloud through hands-on pipeline engineering and advisory work.

Apache Airflow · Apache Spark · CI/CD · Databricks · GCP · MLOps · Oracle · Python · Snowflake · SQL
5 hours, 20 minutes ago

Data Observability Consultant - Dynatrace

Lingaro · 5K-10K · IT Services

Lingaro is hiring a remote Data Observability Consultant (Dynatrace) for its India-based Consulting and Advisory Data Consulting Practice to support data-focused consulting work.

5 hours, 35 minutes ago

Senior Data Engineer

Lodgify · 251-1K · Internet Software & Services

Lodgify is hiring a Senior Data Engineer in Barcelona to build and optimize the company’s modern data platform that powers data-driven decisions across its vacation rental business.

Apache Airflow · AWS · Azure · dbt · GCP · JavaScript · Machine Learning · Python · SQL
5 hours, 35 minutes ago
