Filevine

Filevine is a leading legal technology company building AI-powered case management software that helps law firms streamline operations and improve client service.

Specialized Consumer Services
251–1,000 employees
Founded 2015
$226M raised

Description

  • Optimize and manage Filevine’s Snowflake and Cortex production environment, including warehouse sizing, monitoring, clustering, query performance, cost governance, and storage efficiency.
  • Own and improve agentic data modeling and natural-language text-to-SQL capabilities, including semantic models, prompt refinement, verified question libraries, and answer-quality measurement.
  • Design and build batch and streaming data pipelines that ingest, transform, and model data from product, CRM, billing, and telemetry systems into trusted data products.
  • Build the data foundations for agentic AI workflows and LOIS, including feature pipelines, retrieval datasets, and low-latency serving paths for LLM-based reasoning.
  • Establish data reliability and governance practices, including quality checks, lineage, monitoring, incident response, access control, and PII handling.
  • Partner with product and engineering teams to define event contracts and model business concepts consistently across downstream consumers.
  • Evaluate and recommend emerging tools across the modern data stack that align with strategic and security requirements.
  • Provide technical mentorship, participate in code reviews and design documentation, and help improve data engineering practices.
  • Participate in on-call rotations to support production data pipeline and analytics SLAs.
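The reliability and governance bullet above covers practices like quality checks and monitoring. As a minimal sketch of what batch-level quality checks can look like in Python, the snippet below runs volume, completeness, and freshness checks over a batch of ingested records; the field names (`id`, `loaded_at`) and thresholds are illustrative assumptions, not Filevine's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str

def run_quality_checks(rows, max_age_hours=24):
    """Run basic quality checks on a batch of ingested records.

    Each record is assumed to be a dict with an 'id' key and a
    timezone-aware 'loaded_at' datetime (hypothetical field names).
    """
    results = []

    # 1. Volume check: an empty batch usually signals an upstream failure.
    results.append(CheckResult("non_empty", len(rows) > 0, f"{len(rows)} rows"))

    # 2. Completeness check: the required key must be non-null in every row.
    null_ids = sum(1 for r in rows if r.get("id") is None)
    results.append(CheckResult("no_null_ids", null_ids == 0, f"{null_ids} null ids"))

    # 3. Freshness check: the newest record must be recent enough.
    if rows:
        newest = max(r["loaded_at"] for r in rows)
        age = datetime.now(timezone.utc) - newest
        results.append(CheckResult(
            "fresh", age <= timedelta(hours=max_age_hours),
            f"newest record is {age} old"))

    return results
```

In practice these checks would run inside an orchestrator such as Airflow or Dagster and page the on-call engineer on failure, which is what the on-call bullet above implies.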

Requirements

  • 5+ years of professional data engineering or backend engineering experience delivering production-grade data systems with measurable business impact.
  • Hands-on experience operating a modern cloud data warehouse in production, such as Snowflake, BigQuery, Redshift, Databricks, or Synapse, including performance tuning, cost management, RBAC, and warehouse-native compute orchestration.
  • Demonstrated experience building production Agentic AI or LLM-powered systems, such as RAG pipelines, tool-using agents, MCP servers, or warehouse-native LLM functions.
  • Advanced SQL and Python skills for building reliable, well-tested data pipelines and transformations.
  • Experience with modern data modeling and transformation tools such as dbt, including testing, documentation, and backward-compatible model design.
  • Experience with workflow orchestration tools such as Airflow or Dagster, and cloud-native deployment on AWS, Azure, or GCP.
  • Strong fundamentals in dimensional and star/snowflake data modeling, distributed systems, performance tuning, and data quality and observability.
  • Professional experience with Agile/Kanban, Git, CI/CD, and DevOps.
  • Excellent written and verbal communication skills for technical and non-technical audiences.
  • B.S., M.S., or Ph.D. in Computer Science, Information Systems, Engineering, or a related field, or equivalent professional experience.
  • Preferred: Hands-on Snowflake experience, including Snowpipe, streams/tasks, data sharing, and cost/governance tuning at scale.
  • Preferred: Experience with Snowflake Cortex Analyst, including semantic models and verified queries.
  • Preferred: .NET/C# experience or familiarity integrating with a .NET-based backend.
  • Preferred: Experience with modern UI tools such as Svelte or React.
  • Preferred: Experience supporting machine learning workflows such as feature stores, training datasets, or real-time scoring infrastructure.
  • Preferred: Experience in SaaS or product-led growth environments, including product analytics and revenue/usage telemetry.
  • Preferred: Infrastructure-as-code, containerization, and deployment experience with Terraform, Docker, Kubernetes, or Octopus.
  • Preferred: Familiarity with legal tech, document-heavy data, or unstructured data at scale.
  • Preferred: Track record of mentoring engineers and contributing to hiring and team building.
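Several bullets above mention verified question libraries for natural-language text-to-SQL (and, in the preferred list, Cortex Analyst's verified queries). As a rough sketch of the routing idea, the snippet below checks an incoming question against a library of human-reviewed SQL before falling back to model generation; the matching is a naive normalized-text lookup, and all questions, table names, and columns are hypothetical examples, not Filevine's data model.

```python
import re

# Hypothetical verified question library: canonical questions mapped to
# human-reviewed SQL. A production system would match with embeddings or
# a feature like Cortex Analyst's verified queries rather than exact text.
VERIFIED_QUERIES = {
    "how many active cases do we have":
        "SELECT COUNT(*) FROM cases WHERE status = 'active'",
    "how many cases were opened this month":
        "SELECT COUNT(*) FROM cases WHERE opened_at >= DATE_TRUNC('month', CURRENT_DATE)",
}

def normalize(question):
    # Lowercase and strip punctuation so trivial variants still match.
    return re.sub(r"[^a-z0-9 ]", "", question.lower()).strip()

def route_question(question):
    """Return (source, sql).

    'verified' means the answer comes from reviewed SQL; 'llm' marks
    where a model-generated text-to-SQL call would go (not shown here).
    """
    sql = VERIFIED_QUERIES.get(normalize(question))
    if sql is not None:
        return ("verified", sql)
    return ("llm", None)
```

The design choice this illustrates: verified answers are deterministic and auditable, so routing to them first narrows the surface where answer-quality measurement of the LLM path is needed.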

Benefits

  • Base salary range of $160,000 to $190,000 per year.
  • Commissions, stock options, and a paid time off policy as part of total compensation.
  • Comprehensive medical, dental, and vision coverage for full-time employees.
  • Maternity and paternity leave for full-time employees.
  • Short- and long-term disability coverage.
  • Opportunity to learn from a dedicated leadership team.
  • Dynamic, rapidly growing company environment.
  • Top-of-the-line company swag.

Interested in this position?

Apply directly on the company website

