Soum

Soum is a fast-growing recommerce marketplace redefining the buying and selling of used electronic devices in MENA with trust, security, and a money-back guarantee.

Household Durables
11-50

Description

  • Design, build, and maintain scalable and reliable data pipelines for analytics, ML models, and business reporting.
  • Collaborate with data scientists and analysts to ensure data is clean, available, and optimized for downstream use.
  • Implement data quality checks, monitoring, and validation processes.
  • Design efficient ETL/ELT workflows and integrate data from databases, APIs, and third-party tools into centralized storage.
  • Support cloud-based data storage and retrieval infrastructure.
  • Monitor, troubleshoot, and optimize data pipelines for large-scale and real-time data flows.
  • Apply best practices for query optimization and cost-efficient data storage.
  • Partner with product, engineering, and business stakeholders to gather data requirements.
  • Document data workflows, schemas, and best practices.
  • Support data reliability, governance, and security across the organization.
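The responsibilities above center on ETL/ELT pipelines with built-in data quality checks. A minimal sketch in plain Python of that extract → validate → transform pattern (all function names, fields, and the VAT rate are illustrative, not from the posting; a production pipeline would run under an orchestrator such as Airflow and load into a warehouse like BigQuery):

```python
# Illustrative ETL sketch with a data quality gate.
# Names (extract_orders, load step, 15% VAT) are hypothetical examples.

def extract_orders():
    # Stand-in for pulling rows from a database, API, or third-party tool.
    return [
        {"order_id": 1, "amount": 120.0},
        {"order_id": 2, "amount": None},   # bad record: missing amount
        {"order_id": 3, "amount": 75.5},
    ]

def validate(rows, required_fields=("order_id", "amount")):
    """Data quality check: drop rows with missing required fields
    and report how many were rejected (for monitoring/alerting)."""
    clean = [r for r in rows if all(r.get(f) is not None for f in required_fields)]
    return clean, len(rows) - len(clean)

def transform(rows):
    # Example transformation: derive a VAT-inclusive amount.
    return [{**r, "amount_with_vat": round(r["amount"] * 1.15, 2)} for r in rows]

def run_pipeline():
    rows = extract_orders()
    rows, dropped = validate(rows)
    rows = transform(rows)
    # In production this final step would write to centralized storage.
    return rows, dropped

if __name__ == "__main__":
    rows, dropped = run_pipeline()
    print(f"loaded {len(rows)} rows, rejected {dropped}")
```

The same shape (extract, validate, transform, load) maps directly onto the orchestration and monitoring duties listed above.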

Requirements

  • 2+ years of experience in data engineering building and maintaining data pipelines.
  • 2+ years of Python and SQL development experience in production environments, applied to data engineering tasks.
  • Strong understanding of ETL/ELT processes, data warehousing, and data modeling.
  • Hands-on experience with cloud platforms such as AWS, GCP, or Azure.
  • Experience with data storage solutions such as BigQuery, Redshift, or Snowflake.
  • Familiarity with data orchestration tools such as Airflow and Airbyte is required.
  • Dataform experience is required.
  • Experience with containerization and deployment tools such as Docker and Kubernetes is a plus.
  • Knowledge of data governance, security, and handling sensitive data.
  • Familiarity with Git and GitHub.
  • Strong ability to elicit requirements from cross-functional stakeholders and translate them into actionable tasks.
  • Experience in a fast-growing startup environment is a plus.
  • Exposure to real-time data processing frameworks such as Kafka, Spark, or Flink is a plus.

Interested in this position?

Apply directly on the company website


