Dreamix

Dreamix is a bespoke software development company specializing in Java and Oracle technologies, delivering digital product development services built on technical expertise and close collaboration with clients.

Internet Software & Services
51-250
Founded 2005

Description

  • Design, develop, and maintain scalable data pipelines for processing and analyzing large volumes of data.
  • Develop and optimize data workflows using Databricks and Spark for large-scale data processing.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and ensure data integrity and quality.
  • Use Python for scripting, coding, and implementing business rules for data processing and transformation.
  • Implement ETL processes to integrate data from multiple sources into data warehouses or data lakes.
  • Optimize big data storage, processing, and data workflows for maximum efficiency and scalability.
  • Troubleshoot and resolve data-related issues to ensure reliability and performance of the data infrastructure.
  • Implement data security best practices and support compliance with data protection regulations.
  • Develop and maintain API integrations to enable seamless data exchange between systems and applications.
  • Follow emerging trends and recommend continuous improvements to data engineering practices.
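Several of the responsibilities above revolve around ETL: extracting data from sources, applying business rules, and loading results into a warehouse or lake. As a minimal sketch of that pattern in plain Python (the role would use Databricks/Spark; all names and data here are illustrative, not from the posting):

```python
# Illustrative ETL sketch: extract raw records, apply a business
# rule in the transform step, and load into a keyed store.

def extract(source):
    """Pull raw records from a source (here, an in-memory list)."""
    return list(source)

def transform(records):
    """Apply a business rule: drop incomplete rows, normalize amounts."""
    cleaned = []
    for rec in records:
        if rec.get("amount") is None:
            continue  # drop incomplete rows to protect data quality
        cleaned.append({"id": rec["id"], "amount": round(float(rec["amount"]), 2)})
    return cleaned

def load(records, warehouse):
    """Upsert transformed records into the target store, keyed by id."""
    for rec in records:
        warehouse[rec["id"]] = rec
    return warehouse

raw = [{"id": 1, "amount": "19.991"}, {"id": 2, "amount": None}]
warehouse = load(transform(extract(raw)), {})
```

In a Databricks setting the same three stages would typically be Spark DataFrame reads, transformations, and Delta Lake writes orchestrated as a workflow; the structure of the pipeline is the same.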

Requirements

  • Minimum of 5 years of relevant experience in data engineering.
  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Hands-on experience with Databricks, including Spark, Delta Lake, and workflow orchestration.
  • Strong proficiency in Python for scripting and data processing.
  • Familiarity with big data technologies such as Hadoop, Spark, and Kafka.
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud and their data services.
  • Experience designing and implementing efficient database schemas.
  • Solid knowledge of data modeling, database design, and data warehousing concepts, with hands-on experience in databases such as SQL Server, Oracle, or PostgreSQL.
  • Excellent problem-solving and communication skills, with the ability to work independently and collaboratively in a fast-paced environment.

Benefits

  • Warm and supportive work environment.
  • Flexible working hours.
  • Unlimited home office.
  • Opportunities for professional development, including certifications and training.
  • Additional benefits for academic teaching and speaking engagements.
  • Knowledge-sharing sessions with the Dreamix team.
  • Team and company-wide events.
  • Additional health insurance and dental allowance.
  • Multisport card.
  • Office massages.

Interested in this position? Apply directly on the company website.
