Funding Societies

Funding Societies is Southeast Asia's largest SME digital financing platform, providing fast financing solutions for SMEs and investors across the region.

Capital Markets · 251–1,000 employees · Founded 2015 · $1.28B raised

Description

  • Design, develop, and implement a cloud-based data warehouse architecture.
  • Set up tools and processes for capturing data from multiple sources into a central data warehouse.
  • Develop and maintain scalable data pipelines and new API integrations.
  • Build ETL pipelines and optimize data extraction, transformation, and loading workflows.
  • Tune and debug the data warehouse and related data processes.
  • Support data analytics and data science teams with reliable data infrastructure.
  • Collaborate with analytics and business teams to improve data models for business intelligence tools.
  • Work with executive, product, data, and design stakeholders on data-related technical issues and infrastructure needs.
  • Use SQL and AWS big data technologies to build infrastructure for data processing and storage.

Requirements

  • Bachelor's or master's degree in a technical or business discipline, or equivalent related experience.
  • 4+ years of hands-on experience managing data platforms, data tools, or data management technologies.
  • Advanced SQL knowledge and experience with relational databases and query authoring.
  • Experience building and optimizing big data pipelines, architectures, and datasets.
  • Experience performing root cause analysis on internal and external data and processes.
  • Experience with orchestration tools such as Airflow.
  • Experience with data warehouse solutions such as Snowflake or Redshift.
  • Exposure to data visualization tools such as Tableau, Sisense, Looker, or Metabase.
  • Knowledge of GitHub and JIRA is a plus.
  • Familiarity with data warehouse and data governance practices.
  • Experience developing software in Java, JavaScript, Python, or similar languages is a plus.
  • A build-test-measure-improve mindset with the ability to motivate and lead teams.
  • Passion for operational efficiency, quantitative performance metrics, and process orientation.
  • Working knowledge of project planning methodologies, IT standards, and guidelines.
  • Customer focus, business acumen, and ability to negotiate, facilitate, and build consensus.
  • Experience with or knowledge of Agile software development methodologies.

Benefits

  • Flexible paid vacation and observed holidays by country.
  • Additional time off for birthdays and work anniversaries.
  • Flexible working arrangements to support work-life balance.
  • Health insurance coverage for employees and dependents.
  • Mental health and wellness support, including fitness initiatives and well-being coaching.
  • Company laptop and support for the equipment and tools needed for productivity.

Interested in this position?

Apply directly on the company website


Similar Roles

Data Platform Engineer

Lola Blankets · 1–10 employees · Textiles, Apparel & Luxury Goods

Lola Blankets is hiring a Data Platform Engineer to own its analytics platform and support engineering work across product, operations, integrations, and platform reliability.

Apache Airflow · Dagster · dbt · LLM · Prefect · Python · Snowflake · SQL · TypeScript

Oracle Data Engineer (with German Language)

Soname Solutions · 11–50 employees · Internet Software & Services

Soname Solutions is seeking a Senior Data Warehouse Developer to support a German telecom client by designing, optimizing, and evolving its multi-layer data warehouse environment.

Oracle · PostgreSQL · Power BI · SQL

Synthetic Data Engineer (AI Data/Training)

Hyphen Connect · 1–10 employees · Staffing & Recruiting

The Synthetic Data Engineer will design and manage domain-specific synthetic data pipelines that support data processing and model training workflows.

Apache Airflow · Apache Spark

Senior Data Engineer

Alpaca · 51–250 employees · Capital Markets

Alpaca is seeking a Senior Data Platform Engineer to build and operate the data management layer for its global brokerage infrastructure as it scales across customers, jurisdictions, and high-volume financial event streams.

Apache Airflow · dbt · Docker · GCP · Helm · Kafka · Kubernetes · Python · SQL · Terraform · Trino
