V4C.ai

V4C.ai is a data-driven company that leverages Dataiku to transform business strategies, operations, and customer interactions. With a customer-centric approach, it empowers decision-making and drives innovation, offering tailored solutions for success in to...

Internet Software & Services

Description

  • Collaborate with team members and stakeholders to gather data requirements and build scalable data pipelines and workflows in Databricks.
  • Develop and implement ETL/ELT processes using Databricks, Python, SQL, and related tools to ingest, transform, and prepare data.
  • Optimize data workflows for performance, reliability, and cost-efficiency within Databricks environments.
  • Support the creation and maintenance of data models, tables, and cloud integrations across platforms such as Azure, AWS, or similar.
  • Work with data analysts, scientists, and engineers to deliver clean, accessible data for analytics and reporting.
  • Monitor data pipelines, troubleshoot basic issues, and contribute to documentation and best practices.
  • Stay current on Databricks features and data engineering trends to support ongoing improvements.
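The ETL/ELT work described above can be sketched in miniature. The snippet below is not Databricks code (in a Databricks workspace this logic would typically use PySpark DataFrames or SQL); it is a stdlib-only stand-in using `sqlite3` to illustrate the extract/transform/load shape, and all table and column names are hypothetical.

```python
import sqlite3

# Minimal ETL sketch: extract raw records, clean them, load a target table.
# sqlite3 stands in for a real warehouse; names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: raw events as they might land from an ingest step.
cur.execute("CREATE TABLE raw_events (user_id TEXT, amount TEXT)")
cur.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [("u1", "10.50"), ("u2", "  3.00"), ("u1", "n/a"), ("u3", "7.25")],
)

# Transform: cast amounts to floats, dropping unparseable rows.
rows = cur.execute("SELECT user_id, amount FROM raw_events").fetchall()
clean = []
for user_id, amount in rows:
    try:
        clean.append((user_id, float(amount.strip())))
    except ValueError:
        continue  # skip malformed records

# Load: write the cleaned rows to a target table for analytics.
cur.execute("CREATE TABLE fact_purchases (user_id TEXT, amount REAL)")
cur.executemany("INSERT INTO fact_purchases VALUES (?, ?)", clean)
conn.commit()

total = cur.execute("SELECT SUM(amount) FROM fact_purchases").fetchone()[0]
print(total)  # 20.75
```

The same extract/transform/load structure carries over to Databricks pipelines, with the transform step expressed as DataFrame operations or SQL rather than row-by-row Python.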

Requirements

  • Bachelor's degree in Computer Science, Data Science, Engineering, Information Systems, or a related field, or equivalent practical experience.
  • 1-2 years of professional experience in data engineering, data processing, analytics engineering, or a closely related role.
  • Internships, co-ops, or academic projects with relevant tools may count toward the experience requirement.
  • Hands-on experience building basic data pipelines or transformations.
  • Proficiency in Python and SQL.
  • Experience with Scala is a plus but not required.
  • Basic understanding of cloud platforms such as Azure, AWS, or GCP, including storage, compute, or data services.
  • Strong analytical and problem-solving skills with attention to detail and a focus on clean, maintainable code.
  • Strong communication skills and ability to work collaboratively in a remote team environment.
  • Eagerness to learn, take ownership of tasks, and grow within data engineering.
