Cribl

Cribl provides a unified data management platform built for IT and security data, enabling users to explore, collect, process, and access data at scale with greater control and flexibility over their data workflows.

IT Services
251–1,000 employees
Founded 2018
$402M raised

Description

  • Build, operate, and monitor Cribl’s core data tech stack including ETL/ELT pipelines, data integrations, and the data warehouse to ensure data is accurate, timely, and trusted.
  • Develop cloud-native services and backend infrastructure (APIs, ingestion services, event-driven workflows) with logging, alerting, and observability as first-class concerns.
  • Contribute to infrastructure-as-code (Terraform or similar), clean deployment patterns, and maintain operational hygiene for production systems.
  • Prepare model-ready datasets, expose features, and integrate AI/LLM workflows into production systems to support data science and agentic initiatives.
  • Collaborate with Data Analysts, Site Reliability Engineers, IT Engineers, and business stakeholders on cross-functional initiatives to clarify requirements, validate data outputs, and translate business logic into reliable data artifacts.
  • Communicate risks, tradeoffs, and timelines proactively to keep work predictable and stakeholders informed.
  • Contribute to secure, compliance-minded engineering practices in collaboration with IT and Security teams.
  • Support a remote-first, multi-time-zone environment and occasionally perform duties outside standard working hours as needed.
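As a toy illustration of the ingestion, logging, and event-driven duties listed above, a minimal sketch in Python (the event schema, the `source` field, and the payloads are hypothetical examples, not anything specified by Cribl):

```python
import json
import logging
import queue

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("ingest")

# In-memory stand-in for an event-driven pipeline's intake queue.
events = queue.Queue()

def ingest(payload: str) -> bool:
    """Validate one raw event and enqueue it; log and drop bad input."""
    try:
        event = json.loads(payload)
    except json.JSONDecodeError:
        log.warning("dropped malformed event: %r", payload)
        return False
    if "source" not in event:
        log.warning("dropped event missing 'source': %r", event)
        return False
    events.put(event)
    return True

payloads = [
    '{"source": "syslog", "msg": "ok"}',  # valid
    "not json",                            # malformed
    '{"msg": "no source"}',                # missing required field
]
accepted = sum(ingest(p) for p in payloads)
print(accepted, events.qsize())  # 1 1
```

The point of the sketch is treating logging and validation as first-class concerns: bad input is logged and dropped rather than crashing the pipeline.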

Requirements

  • Strong SQL and Python fundamentals with experience in ELT patterns and data modeling.
  • Exposure to Snowflake or a similar cloud data warehouse (Databricks, Redshift, DuckDB) and familiarity with dbt or equivalent frameworks.
  • Experience building cloud applications or backend services, including APIs, ingestion services, and event-driven workflows.
  • Hands-on experience with AWS cloud infrastructure and infrastructure-as-code tools such as Terraform.
  • Familiarity with workflow orchestration tools (e.g., Prefect, Airflow) and production-grade engineering practices (logging, alerting, versioning, CI/CD).
  • Practical knowledge of observability and monitoring for reliable data systems.
  • Experience preparing datasets for machine learning/AI use cases and integrating AI/LLM workflows into production (preferred).
  • Clear, concise communication skills and proven ability to collaborate across data, engineering, and business teams.
  • Comfort working remotely across multiple time zones and occasionally outside standard working hours.
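To illustrate the ELT pattern named above, a minimal sketch using an in-memory SQLite database as a stand-in for a cloud warehouse (table names and data are hypothetical; a real setup would target Snowflake or similar, typically with dbt managing the transform step):

```python
import sqlite3

# Hypothetical raw event records (timestamp, event type, user id).
raw_events = [
    ("2024-01-01T00:00:00Z", "login", "user_1"),
    ("2024-01-01T00:05:00Z", "login", "user_2"),
    ("2024-01-01T00:09:00Z", "logout", "user_1"),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (ts TEXT, event TEXT, user_id TEXT)")

# Extract + Load: land the raw data untransformed (the "EL" of ELT).
conn.executemany("INSERT INTO raw_events VALUES (?, ?, ?)", raw_events)

# Transform: derive a modeled table inside the warehouse itself (the "T").
conn.execute("""
    CREATE TABLE event_counts AS
    SELECT event, COUNT(*) AS n
    FROM raw_events
    GROUP BY event
""")

rows = dict(conn.execute("SELECT event, n FROM event_counts"))
print(rows)
```

The design choice ELT makes, in contrast to ETL, is loading raw data first and pushing transformations into the warehouse's SQL engine, which keeps the raw layer available for re-deriving models.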

Benefits

  • Salary range $99,000–$185,800 (depending on geographic location and candidate experience).
  • Equity in the company.
  • Health, dental, and vision insurance.
  • Short-term disability and life insurance.
  • Paid holidays and paid time off.
  • Fertility treatment benefit.
  • 401(k) retirement plan.
  • Eligibility for a discretionary company-wide bonus and remote-first work flexibility.
