Thanks

Thanks turns "thank you" into revenue and happier customers with a rewards experience everyone loves to engage with. It helps businesses unlock new revenue, discover new customers, and grow brand love through its innovative rewards platform. Whether e...

Diversified Consumer Services
1-10 employees

Description

  • Design and deliver a scalable data platform for the Thanks Network.
  • Move the business beyond operational databases into well-modeled data environments for analytics and feature engineering.
  • Own the data science, technical implementation, and performance of ranking systems.
  • Own the full lifecycle of models from training and validation through real-time inference.
  • Build frameworks for experimentation on ranking systems, including lift, attribution, and success measurement.
  • Develop resilient, observable batch and near-real-time data pipelines.
  • Create trusted datasets and data marts that enable self-serve analytics for product, engineering, and commercial teams.
  • Set data architecture and tooling direction, including build-versus-buy trade-offs.
  • Act as the go-to data expert across the business and influence roadmaps and decisions through technical judgment.

Requirements

  • Experience as a senior, hands-on individual contributor in a high-growth environment.
  • Deep strength in both data engineering and applied data science.
  • Strong Python skills and advanced, performance-optimised SQL skills.
  • Experience building and operating data pipelines in cloud environments.
  • Hands-on experience with analytical databases and working across operational and analytical data stores.
  • Familiarity with streaming or event-driven data architectures.
  • Comfort operating as a senior IC in a greenfield environment.
  • Excellent communication skills and the ability to partner across Product, Engineering, and Commercial teams.
  • Thoughtful use of AI to support exploration, modelling, debugging, and analysis while maintaining high standards for data quality and ownership.
  • Strong internal drive focused on performance, correctness, and durable systems.
  • PySpark, dbt, and strong SQL skills (must have).
  • At least one workflow orchestration tool such as Airflow, Dagster, Step Functions, or equivalent (must have).
  • At least one DevOps/DataOps tool such as Terraform, CloudFormation, Azure ARM, or Kubernetes (must have).
  • At least one data warehouse or analytics platform such as Databricks, Snowflake, BigQuery, ClickHouse, or Redshift (must have).
  • Experience in adtech, marketplaces, or performance-driven platforms (nice to have).
  • Exposure to experimentation frameworks and attribution models (nice to have).
  • Experience enabling analytics for non-technical teams (nice to have).
  • Databricks Unity Catalog or Atlas (nice to have).
  • Kafka, Kinesis, or equivalent event streaming technology (nice to have).
  • Experience with reporting tools such as Tableau, Power BI, or Superset (nice to have).
  • Experience with data quality tools such as Great Expectations or dbt testing (nice to have).

Benefits

  • Meaningful equity as part of the compensation package.
  • Foundational ownership of core data systems and architecture.
  • High-impact work that directly influences product performance, experimentation, and business scaling.
  • A hands-on role with strategic influence in a fast-growing platform.
  • Close collaboration with founders and the Head of Product.
  • A culture that values courage, high standards, and kindness.
  • Flexible working arrangements, with openness to exceptional candidates across Australia’s east coast.
  • The opportunity to help build something new and foundational from the ground up.

Interested in this position?

Apply directly on the company website
