Umpisa

Umpisa, Inc. partners with industries to drive pioneering solutions through modern software development, aiming to establish the Philippines as a global tech hub.

Internet Software & Services
11-50 employees
Founded 2019

Description

  • Design, develop, and maintain scalable data pipelines using GCP ETL tools such as Cloud Dataflow, Dataprep, and Dataproc.
  • Collaborate with cross-functional teams to understand data requirements and implement GCP-based solutions.
  • Optimize data models and data architecture for performance and reliability within Google Cloud Platform.
  • Build and maintain data warehouses, data lakes, and other data storage solutions on GCP.
  • Ensure data quality, integrity, and security across datasets in the GCP environment.
  • Identify and resolve performance bottlenecks in data infrastructure related to GCP ETL tools.
  • Work with data scientists, analysts, and other stakeholders to support their data needs.
  • Stay current with GCP technologies and data engineering best practices.
  • Support report generation and migration from Oracle to Microsoft SQL Server using SSIS and SSRS.

Requirements

  • At least 2 years of experience as a Data Engineer or in a similar role, with a strong focus on Google Cloud Platform.
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Strong experience with SQL databases within GCP, including MySQL or PostgreSQL.
  • Hands-on experience with GCP data pipeline and workflow tools such as Cloud Dataflow, Dataprep, or Dataproc.
  • Experience building and optimizing big data solutions using technologies such as BigQuery, Dataflow, and Pub/Sub.
  • Solid understanding of GCP data warehousing concepts and methodologies.
  • Excellent problem-solving skills and the ability to work in a fast-paced environment.
  • Experience with Microsoft SQL Server, SSIS, SSRS, and Power BI for reporting and data integration.
  • Google Cloud certifications such as Professional Data Engineer or Associate Cloud Engineer preferred.
  • Experience with other GCP services beyond ETL tools preferred.
  • Knowledge of machine learning concepts and frameworks within GCP preferred.

Benefits

  • Work from home.
  • HMO coverage.
  • 13th month pay.
  • Training benefits.
  • Open to applicants anywhere in the Philippines.

Interested in this position?

Apply directly on the company website.
