phData

phData specializes in providing comprehensive data engineering, data strategy, and machine learning services, offering end-to-end managed data solutions that facilitate data migrations, integrations, and the development of modern data products and applications.

IT Services
251-1K

Description

  • Develop end-to-end technical solutions and bring them into production.
  • Ensure solutions meet performance, security, scalability, and data integration requirements.
  • Write, debug, and optimize SQL queries.
  • Create detailed solution documentation, including POCs, roadmaps, sequence diagrams, class hierarchies, and logical system views.
  • Deliver client-facing presentations and communicate technical information clearly in writing and verbally.
  • Collaborate on solutions that integrate multiple data sources such as queues, databases, files, search systems, and APIs.
  • Support the full software development lifecycle, including design, implementation, testing, and deployment.

Requirements

  • 4+ years of experience as a Software Engineer, Data Engineer, or Data Analyst.
  • Experience developing end-to-end technical solutions for production environments.
  • Programming experience in Java, Python, and/or Scala.
  • Experience with core cloud data platforms such as Snowflake, AWS, Azure, Databricks, and GCP.
  • Strong SQL skills, including the ability to write, debug, and optimize queries.
  • Client-facing written and verbal communication skills.
  • Experience creating detailed technical presentations and documentation.
  • Bachelor's degree in Computer Science or a related field.
  • Preferred experience in production data platforms such as Hadoop and Databricks.
  • Preferred experience with cloud and distributed storage systems such as S3, ADLS, HDFS, GCS, Kudu, Elasticsearch/Solr, or Cassandra.
  • Preferred experience with data integration tools and technologies such as Spark, Kafka, StreamSets, Matillion, Fivetran, NiFi, AWS DMS, Azure Data Factory, Informatica IICS, or Google DataProc.
  • Preferred experience with automated data transformation and curation using dbt, Spark, Spark Streaming, or automated pipelines.
  • Preferred experience with workflow orchestration tools such as Airflow, AWS Managed Airflow, Luigi, or NiFi.

Similar Roles

Java Developer | Apache Spark & Data Processing

NEORIS 5K-10K Internet Software & Services

NEORIS is seeking a Semi-Senior Java Back-End Developer for a financial and banking sector client in Colombia, focused on the design and optimization of batch and ETL processes in a remote environment.

Apache Airflow Apache Spark CI/CD Git Java JUnit SQL
12 minutes ago

Data Engineer

Route 251-1K Air Freight & Logistics

Route is hiring a Data Engineer to help build and modernize the company’s data platform as it migrates from Snowflake to a Databricks-first architecture and creates an AI-ready enterprise data warehouse.

AWS Databricks dbt DynamoDB Go Grafana PagerDuty Python SQL Tableau Terraform
42 minutes ago

Data Engineer

Newsela 251-1K Diversified Consumer Services

Newsela is hiring a Data Engineer to build and maintain data integrations and workflows that support K-12 school operations across educational platforms.

Python REST API SFTP SQL
1 hour, 12 minutes ago

Data Engineer SAS

NEORIS 5K-10K Internet Software & Services

NEORIS, now part of EPAM, is hiring a Data Engineer SAS to work on complex data engineering and analytics projects in corporate environments focused on the SAS ecosystem and modern cloud technologies.

Apache Spark AWS Python SQL
1 hour, 27 minutes ago
