Rithum

End-to-End E-Commerce Solutions for Brands & Retailers | Rithum

CommerceHub and ChannelAdvisor are now united as Rithum. We empower top brands, suppliers, and retailers with durable, profitable e-commerce solutions.

Internet Software & Services
$13M raised

Description

  • Design and implement scalable ETL/ELT workflows for batch and streaming data using AWS services.
  • Architect and maintain cloud-native data platforms with automated ingestion, transformation, and governance pipelines.
  • Support Product, BI, Support, and Engineering teams with data-related technical challenges and infrastructure needs.
  • Optimize data lake and lakehouse infrastructure to support AI workloads and large-scale analytics.
  • Ensure data quality, lineage, observability, governance, compliance monitoring, and privacy protection.
  • Partner with data scientists to optimize pipelines for model training, inference, and continuous learning workflows.
  • Build self-healing data pipelines with AI-driven error detection, root cause analysis, and automated remediation.
  • Implement intelligent data lineage tracking and AI-assisted data discovery systems.
  • Participate in the full software development lifecycle, including requirements gathering, testing, and deployment.
  • Mentor junior engineers and help lead tool evaluation and adoption for the data engineering team.
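The "self-healing pipelines" responsibility above can be illustrated with a minimal retry-and-remediation sketch. This is a toy example, not Rithum's implementation: the error class, the wrapper, and the `flaky_extract` step are all invented for illustration.

```python
import time

class TransientError(Exception):
    """Recoverable failure, e.g. a throttled API call. Hypothetical example class."""

def run_with_remediation(step, max_attempts=3, backoff_s=0.0):
    """Run a pipeline step, retrying transient failures with linear backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return step()
        except TransientError:
            if attempt == max_attempts:
                raise  # escalate once automated remediation is exhausted
            time.sleep(backoff_s * attempt)

# Usage example: a step that fails twice, then succeeds on the third attempt.
calls = {"n": 0}

def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TransientError("throttled")
    return ["row1", "row2"]

rows = run_with_remediation(flaky_extract, max_attempts=3)
print(rows)  # -> ['row1', 'row2'] after two automatic retries
```

A production version would classify errors (transient vs. fatal), emit lineage and observability events, and pick a remediation per error class; the retry loop above is only the skeleton of that pattern.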

Requirements

  • 3+ years of experience in data engineering, including building and maintaining large-scale data pipelines.
  • Extensive experience with SQL RDBMSs such as SQL Server and dimensional modeling using star schema.
  • Hands-on experience with AWS services such as Redshift, Athena, S3, Kinesis, Lambda, and Glue.
  • Experience with DBT, Databricks, or similar data platform tooling.
  • Experience working with structured and unstructured data and implementing data quality frameworks.
  • Excellent communication and collaboration skills.
  • Demonstrated experience using AI coding tools such as GitHub Copilot or Cursor, with prompt engineering knowledge.
  • Understanding of AI/ML concepts and data requirements, including feature stores, model versioning, and real-time inference pipelines.
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field preferred.
  • Experience in a SaaS or e-commerce environment with AI/ML products preferred.
  • Knowledge of stream processing frameworks like Kafka, Flink, or Spark Structured Streaming preferred.
  • Familiarity with LLMOps, AI model deployment patterns, or AI-powered data tools preferred.
  • Experience with Docker and Kubernetes preferred.
  • Ability to travel up to 10%.
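The dimensional-modeling requirement above (star schema) can be sketched with a toy example: one fact table keyed to two dimension tables, queried with the usual join-and-aggregate pattern. Table and column names are invented; SQLite is used only for portability.

```python
import sqlite3

# Toy star schema: fact_sales references dim_product and dim_date.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT);
    CREATE TABLE fact_sales  (
        product_id INTEGER REFERENCES dim_product(product_id),
        date_id    INTEGER REFERENCES dim_date(date_id),
        amount     REAL
    );
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO dim_date    VALUES (10, '2024-01-01');
    INSERT INTO fact_sales  VALUES (1, 10, 9.5), (1, 10, 5.5), (2, 10, 3.0);
""")

# Typical star-schema query: aggregate the fact table, label rows via dimensions.
rows = con.execute("""
    SELECT p.name, d.day, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date    d ON d.date_id    = f.date_id
    GROUP BY p.name, d.day
    ORDER BY p.name
""").fetchall()
print(rows)  # -> [('gadget', '2024-01-01', 3.0), ('widget', '2024-01-01', 15.0)]
```

The same shape carries over to Redshift or a lakehouse table format: narrow, additive facts surrounded by descriptive dimensions, so analytics queries stay simple joins plus aggregates.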

Benefits

  • Medical coverage through Irish Life Health with premiums paid by the company.
  • Life and disability insurance.
  • Pension plan with 5% company match.
  • Competitive time off package including 25 days of PTO, 11 company-paid holidays, 2 wellness days, and 1 paid volunteer day.
  • Access to wellbeing tools such as the Calm app and an Employee Assistance Program.
  • €50/month remote work stipend for internet.
  • Professional development stipend plus learning and development offerings.
  • Charitable contribution match per team member.

Interested in this position?

Apply directly on the company website


