Worth AI

Worth AI is a leading provider of business credit scoring and risk management solutions. Its AI-powered underwriting platform helps businesses accelerate credit approvals, predict portfolio credit risk, and eliminate unnecessary losses.

Internet Software & Services

Description

  • Architect and implement entity resolution logic to de-duplicate and link data into unified Golden Records for businesses and individuals.
  • Design and maintain a global business knowledge graph and ontology that maps ownership chains, UBOs, and hidden risk relationships.
  • Implement a hybrid storage strategy using graph databases, document stores, and search systems for metadata and adverse media content.
  • Optimize the platform for real-time risk assessment and low-latency traversal needed for automated onboarding decisions.
  • Design and build scalable data services and APIs for ingesting, transforming, and serving data across the company.
  • Develop and maintain batch and streaming data pipelines using modern data processing frameworks and AWS cloud-native tooling.
  • Own the reliability, performance, monitoring, alerting, and on-call support for the data platform.
  • Implement best practices for data modeling, quality, lineage, and governance to ensure trustworthy datasets.
  • Work closely with data scientists, analysts, and application engineers to translate needs into platform capabilities.
  • Drive automation and standardization through CI/CD, model-as-a-service patterns, and reproducible environments.
  • Help define and evolve the data platform architecture as an internal service with clear contracts, SLAs, and versioned APIs.
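The entity resolution responsibility above can be illustrated with a minimal sketch: de-duplicating raw business records from multiple source systems into unified "golden records". This is a toy example under stated assumptions (naive name normalization as the match key; the `GoldenRecord`, `normalize`, and `resolve` names are hypothetical, not part of Worth AI's actual platform — a production system would use probabilistic matching across many attributes).

```python
from dataclasses import dataclass, field

@dataclass
class GoldenRecord:
    """Unified view of one business entity, merged from duplicate source rows."""
    name: str
    source_ids: list = field(default_factory=list)

def normalize(name: str) -> str:
    # Crude match key: lowercase, drop punctuation, strip common legal suffixes.
    cleaned = "".join(ch for ch in name.lower() if ch.isalnum() or ch.isspace())
    tokens = [t for t in cleaned.split() if t not in {"inc", "llc", "ltd", "corp"}]
    return " ".join(tokens)

def resolve(records: list[tuple[str, str]]) -> dict[str, GoldenRecord]:
    """Link raw (source_id, raw_name) rows into golden records keyed by match key."""
    golden: dict[str, GoldenRecord] = {}
    for source_id, raw_name in records:
        key = normalize(raw_name)
        rec = golden.setdefault(key, GoldenRecord(name=raw_name))
        rec.source_ids.append(source_id)
    return golden

rows = [
    ("crm-1", "Acme, Inc."),
    ("erp-7", "ACME INC"),
    ("web-3", "Globex LLC"),
]
merged = resolve(rows)
print(len(merged))                        # 2 golden records
print(sorted(merged["acme"].source_ids))  # ['crm-1', 'erp-7']
```

Real-world matching would replace the exact match key with fuzzy or probabilistic scoring (edit distance, address and identifier overlap), which is where tools like Senzing or custom models come in.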

Requirements

  • Hands-on experience with graph databases such as Neo4j, AWS Neptune, or TigerGraph and query languages like Cypher or Gremlin.
  • Proven experience with entity resolution or record linkage, including tools such as Senzing, Quantexa, or custom probabilistic matching models.
  • Ability to design flexible ontologies for evolving regulatory data, such as changing PEP definitions or sanction list formats.
  • Experience building GraphQL or REST APIs optimized for graph traversals and deep-tree lookups.
  • Experience building centralized data platforms or data-as-a-service offerings at scale.
  • Strong software engineering skills in at least one language such as Python, Java, Go, or Rust.
  • Hands-on experience building data pipelines and ETL/ELT workflows on a major cloud provider, with AWS preferred.
  • Experience with modern data stack tools such as Spark/Flink, Kafka/Kinesis, Airflow or managed schedulers, and data warehouses like Snowflake, Redshift, BigQuery, or Databricks.
  • Familiarity with DevOps practices including CI/CD, containerization with Docker, orchestration with Kubernetes, and infrastructure as code with Terraform.
  • Strong focus on observability, resilience, and early warning signals.
  • Comfort collaborating cross-functionally and communicating clearly with both technical and non-technical stakeholders.
  • Background supporting machine learning or real-time decisioning use cases from a platform perspective is nice to have.
  • Understanding of AML, CTF, and KYC/KYB data structures such as LEIs and ISO 20022 is nice to have.
  • Experience handling global address normalization and geospatial indexing for risk detection is nice to have.
  • All remote hires must be able to travel to Orlando, Florida for orientation, and at least twice per year thereafter for town halls and team collaboration.
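The ownership-chain and UBO requirements above can be sketched in a few lines: walking a graph of ownership edges and multiplying fractions along each chain to find each ultimate owner's effective stake. This is a minimal illustration, assuming an acyclic ownership structure; the `effective_ownership` function, the sample entities, and the 25% UBO threshold are hypothetical (a common regulatory cutoff, not something stated in this posting), and a real system would run this as a graph-database traversal.

```python
# Direct ownership edges: owner -> {owned_entity: fraction held}
OWNERSHIP = {
    "Alice":  {"HoldCo": 0.60},
    "HoldCo": {"OpCo": 0.50},
    "Bob":    {"OpCo": 0.50},
}

def effective_ownership(target: str, edges: dict) -> dict[str, float]:
    """Effective stake of every upstream holder in `target`,
    multiplying fractions along each ownership chain.
    Assumes the ownership graph is acyclic (no cross-holdings)."""
    # Invert edges: owned entity -> [(owner, fraction)]
    inbound: dict[str, list] = {}
    for owner, holdings in edges.items():
        for owned, frac in holdings.items():
            inbound.setdefault(owned, []).append((owner, frac))

    stakes: dict[str, float] = {}

    def walk(entity: str, weight: float) -> None:
        for owner, frac in inbound.get(entity, []):
            stake = weight * frac
            stakes[owner] = stakes.get(owner, 0.0) + stake
            walk(owner, stake)  # the owner may itself be owned by someone

    walk(target, 1.0)
    return stakes

stakes = effective_ownership("OpCo", OWNERSHIP)
# Alice holds 60% of HoldCo, which holds 50% of OpCo -> 30% effective stake.
# Flag ultimate owners (no inbound edges) at or above a 25% threshold:
roots = {e for holdings in OWNERSHIP.values() for e in holdings}
ubos = {p: s for p, s in stakes.items() if p not in roots and s >= 0.25}
print(ubos)  # {'Bob': 0.5, 'Alice': 0.3} in some order
```

Hidden-risk detection works the same way in reverse: starting from a flagged individual, traverse outbound edges to surface every entity they indirectly control.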

Benefits

  • Health care plan, including medical, dental, and vision coverage.
  • Retirement plan with 401(k) and IRA options.
  • Life insurance.
  • Flexible paid time off.
  • 9 paid holidays.
  • Family leave.
  • Work-from-home arrangement.
  • Free food and snacks in the Orlando office.
  • Wellness resources.

Interested in this position?

Apply directly on the company website
