Ensono

Ensono provides comprehensive hybrid IT solutions and governance, spanning cloud services to mainframe systems, helping businesses navigate complexity and modernize their technology infrastructure along each client's unique journey.

IT Services
1K-5K
Founded 1969

Description

  • Design, build, and maintain data pipelines and lakehouse architectures for analytics, AI/ML, and operational decision-making.
  • Build and maintain data ingestion pipelines using batch, streaming, and micro-batch patterns.
  • Develop and optimise transformations within lakehouse and medallion architecture patterns across bronze, silver, and gold layers.
  • Work with cloud data platforms such as Databricks, Apache Spark, and AWS and/or Azure cloud services.
  • Implement data quality checks, validation rules, automated testing, and monitoring dashboards to ensure pipeline reliability.
  • Support data governance and compliance activities, including cataloguing, lineage tracking, and access control.
  • Collaborate with analysts, data scientists, and business stakeholders to gather requirements and deliver fit-for-purpose data products.
  • Contribute to technical documentation, architecture decision records, runbooks, requirements documents, and data inventories.
  • Participate in code reviews, pair programming, and knowledge-sharing sessions to improve team capability.

Requirements

  • Strong development skills in Python and SQL.
  • Experience writing clean, testable, and well-documented code.
  • Hands-on experience building data pipelines and ETL/ELT workflows using Apache Spark, Databricks, or equivalent tools.
  • Understanding of lakehouse and data warehouse architecture patterns, including star schemas, medallion architecture, and data modelling best practices.
  • Experience working with cloud platforms such as AWS, Azure, or GCP, including storage, compute, and managed data services.
  • Familiarity with big data file formats such as Parquet, Delta, and Avro, plus concepts like partitioning, compression, and columnar storage.
  • Good understanding of software engineering best practices, including Git, CI/CD, code review, SOLID principles, and DRY.
  • Clear communication skills with the ability to explain technical concepts to non-technical audiences.
  • A proactive, self-starting attitude with interest in the business context behind the data.
Preferred

  • Experience with Databricks features such as Delta Live Tables, Unity Catalog, and Workflows, or similar lakehouse platforms.
  • Familiarity with Infrastructure as Code tools such as Terraform and CloudFormation, and container technologies such as Docker and Kubernetes.
  • Exposure to data governance, cataloguing, or data quality frameworks.
  • Experience with TDD/BDD testing practices for data pipelines.
  • Knowledge of streaming technologies such as Kafka, Kinesis, or Spark Structured Streaming.
  • Additional programming experience in Scala or Java.
  • Relevant cloud or data certifications such as Databricks Data Engineer, AWS Data Analytics, or Azure Data Engineer.


