Worth AI

Worth AI is a provider of business credit scoring and risk management solutions. Its AI-powered underwriting platform helps businesses accelerate credit approvals, predict portfolio credit risk, and reduce avoidable losses. With a focus on...

Internet Software & Services

Description

  • Architect and implement entity resolution logic to de-duplicate and link data into unified Golden Records for businesses and individuals.
  • Design and maintain a global business knowledge graph and ontology that maps ownership chains, ultimate beneficial owners (UBOs), and hidden risk relationships.
  • Implement a hybrid storage strategy using graph databases, document stores, and search systems for metadata and adverse media content.
  • Optimize the platform for real-time risk assessment and low-latency traversal needed for automated onboarding decisions.
  • Design and build scalable data services and APIs for ingesting, transforming, and serving data across the company.
  • Develop and maintain batch and streaming data pipelines using modern data processing frameworks and AWS cloud-native tooling.
  • Own the reliability, performance, monitoring, alerting, and on-call support for the data platform.
  • Implement best practices for data modeling, quality, lineage, and governance to ensure trustworthy datasets.
  • Work closely with data scientists, analysts, and application engineers to translate needs into platform capabilities.
  • Drive automation and standardization through CI/CD, model-as-a-service patterns, and reproducible environments.
  • Help define and evolve the data platform architecture as an internal service with clear contracts, SLAs, and versioned APIs.
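To make the entity-resolution responsibility above concrete, here is a minimal sketch of de-duplicating records into Golden Records: pairwise probabilistic matching over name and address fields, with union-find clustering to link matches. The fields, weights, threshold, and sample data are illustrative assumptions, not Worth AI's actual implementation.

```python
from difflib import SequenceMatcher

def match_score(a: dict, b: dict) -> float:
    """Crude pairwise similarity: weighted blend of name and address similarity."""
    name_sim = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    addr_sim = SequenceMatcher(None, a["address"].lower(), b["address"].lower()).ratio()
    return 0.6 * name_sim + 0.4 * addr_sim

def find(parent, i):
    """Union-find root lookup with path compression."""
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def link_records(records, threshold=0.75):
    """Records whose pairwise score clears the threshold share one Golden Record."""
    parent = list(range(len(records)))
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if match_score(records[i], records[j]) >= threshold:
                parent[find(parent, i)] = find(parent, j)
    clusters = {}
    for i in range(len(records)):
        clusters.setdefault(find(parent, i), []).append(records[i])
    return list(clusters.values())

records = [
    {"name": "Acme Corp", "address": "12 Main St, Orlando FL"},
    {"name": "ACME Corporation", "address": "12 Main Street, Orlando FL"},
    {"name": "Globex LLC", "address": "99 Elm Ave, Tampa FL"},
]
golden = link_records(records)  # two clusters: the two Acme variants link, Globex stands alone
```

A production system would replace the O(n²) comparison with blocking keys and a trained matching model, but the link-then-cluster shape is the same.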

Requirements

  • Hands-on experience with graph databases such as Neo4j, AWS Neptune, or TigerGraph and query languages like Cypher or Gremlin.
  • Proven experience with entity resolution or record linkage, including tools such as Senzing, Quantexa, or custom probabilistic matching models.
  • Ability to design flexible ontologies for evolving regulatory data, such as changing politically exposed person (PEP) definitions or sanction list formats.
  • Experience building GraphQL or REST APIs optimized for graph traversals and deep-tree lookups.
  • Experience building centralized data platforms or data-as-a-service offerings at scale.
  • Strong software engineering skills in at least one language such as Python, Java, Go, or Rust.
  • Hands-on experience building data pipelines and ETL/ELT workflows on a major cloud provider, with AWS preferred.
  • Experience with modern data stack tools such as Spark/Flink, Kafka/Kinesis, Airflow or managed schedulers, and data warehouses like Snowflake, Redshift, BigQuery, or Databricks.
  • Familiarity with DevOps practices including CI/CD, containerization with Docker, orchestration with Kubernetes, and infrastructure as code with Terraform.
  • Strong focus on observability, resilience, and early warning signals.
  • Comfort collaborating cross-functionally and communicating clearly with both technical and non-technical stakeholders.
  • Background supporting machine learning or real-time decisioning use cases from a platform perspective is nice to have.
  • Understanding of anti-money laundering (AML), counter-terrorism financing (CTF), and KYC/KYB data structures such as Legal Entity Identifiers (LEIs) and ISO 20022 is nice to have.
  • Experience handling global address normalization and geospatial indexing for risk detection is nice to have.
  • All remote hires must be able to travel to Orlando, Florida for orientation and at least twice per year thereafter for town halls and team collaboration.
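The UBO and ownership-chain work in the requirements above boils down to a weighted graph traversal. Here is a minimal sketch in plain Python (no graph database): effective ownership is the sum over paths of the product of stakes along each path, and individuals above the threshold are flagged as beneficial owners. The 25% cutoff reflects a common regulatory convention; the toy ownership data is an illustrative assumption.

```python
# edges[owner] = [(owned_entity, fraction), ...] — toy ownership graph (assumed data)
edges = {
    "Alice":  [("HoldCo", 0.80)],
    "Bob":    [("HoldCo", 0.20)],
    "HoldCo": [("OpCo", 0.50)],
    "Carol":  [("OpCo", 0.50)],
}

def effective_ownership(owner: str, target: str) -> float:
    """Sum over all ownership paths of the product of stakes along the path."""
    total = 0.0
    for entity, frac in edges.get(owner, []):
        if entity == target:
            total += frac
        else:
            total += frac * effective_ownership(entity, target)
    return total

def ubos(individuals, target, threshold=0.25):
    """Individuals whose effective stake in `target` meets the UBO threshold."""
    return {p: round(effective_ownership(p, target), 4)
            for p in individuals
            if effective_ownership(p, target) >= threshold}

owners = ubos(["Alice", "Bob", "Carol"], "OpCo")
# Alice holds 0.80 × 0.50 = 0.40 of OpCo via HoldCo; Bob's 0.10 falls below the cutoff.
```

In a graph database the same question becomes a variable-length path query (e.g. Cypher's `-[:OWNS*]->` with an aggregate over relationship properties); this sketch assumes an acyclic ownership graph, whereas real registries need cycle handling.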

Benefits

  • Health care plan, including medical, dental, and vision coverage.
  • Retirement plan with 401(k) and IRA options.
  • Life insurance.
  • Flexible paid time off.
  • 9 paid holidays.
  • Family leave.
  • Work-from-home arrangement.
  • Free food and snacks in the Orlando office.
  • Wellness resources.

Interested in this position?

Apply directly on the company website
