INflow Federal

INflow Federal is a leading small business innovator specializing in Network Modernization, Cybersecurity, Digital Modernization, and Joint Force Mission Operations. Founded in 2013, the company focuses on attracting, hiring, and collaborating with tec...

Aerospace & Defense
51-250 employees
Founded 2013

Description

  • Design, implement, and maintain data pipelines and ETL processes for mission data ingestion, transformation, and validation (see the illustrative sketch after this list).
  • Develop and optimize data models and schemas across relational and non-relational databases.
  • Collaborate with system architects, integration developers, and data analysts to ensure data consistency, security, and integrity.
  • Implement data migration and synchronization between legacy systems and modern cloud platforms.
  • Use AWS services such as Glue, Lambda, S3, RDS, Redshift, and Kinesis to build scalable and fault-tolerant data infrastructure.
  • Perform data validation, reconciliation, quality checks, and reporting to ensure accuracy.
  • Integrate data from APIs, streaming sources, and file-based systems into centralized repositories or data lakes.
  • Automate data workflows using infrastructure-as-code and CI/CD principles.
  • Monitor and troubleshoot data pipeline performance to meet SLAs and maintain operational reliability.
  • Implement encryption, masking, and access controls in compliance with DoD cybersecurity policies and RMF requirements.
  • Support dashboards and analytics products for mission stakeholders.
  • Maintain documentation and metadata repositories, including data dictionaries, lineage, and technical specifications.
  • Participate in Agile sprints, backlog refinement, testing, and cross-functional collaboration.
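
As a rough illustration of the pipeline and validation work described above, the sketch below shows a minimal Python job that pulls a raw CSV extract from S3, runs basic quality checks, and writes the validated rows to a separate bucket. The bucket names, key paths, and required columns are hypothetical placeholders rather than details of INflow Federal's environment; in practice this kind of logic would typically run inside an AWS Glue job or Lambda function, with encryption and access controls applied per RMF guidance.

```python
import io

import boto3
import pandas as pd

# Hypothetical locations and schema -- placeholders for illustration only.
SOURCE_BUCKET = "mission-data-raw"
SOURCE_KEY = "ingest/flights/2024-06-01.csv"
TARGET_BUCKET = "mission-data-validated"
TARGET_KEY = "validated/flights/2024-06-01.csv"
REQUIRED_COLUMNS = ["record_id", "timestamp", "platform", "status"]


def run_validation_job() -> None:
    """Read a raw CSV extract from S3, validate it, and write the clean rows out."""
    s3 = boto3.client("s3")

    # Extract: pull the raw file into a DataFrame.
    raw = s3.get_object(Bucket=SOURCE_BUCKET, Key=SOURCE_KEY)
    df = pd.read_csv(io.BytesIO(raw["Body"].read()))

    # Validate: fail fast if expected columns are missing.
    missing = [col for col in REQUIRED_COLUMNS if col not in df.columns]
    if missing:
        raise ValueError(f"Source file is missing required columns: {missing}")

    # Quality checks: drop duplicate and keyless rows, and log simple
    # reconciliation counts for downstream reporting.
    before = len(df)
    df = df.dropna(subset=["record_id"]).drop_duplicates(subset=["record_id"])
    print(f"Validated {len(df)} of {before} rows from {SOURCE_KEY}")

    # Load: write the cleaned extract to the validated bucket.
    buffer = io.StringIO()
    df.to_csv(buffer, index=False)
    s3.put_object(
        Bucket=TARGET_BUCKET,
        Key=TARGET_KEY,
        Body=buffer.getvalue().encode("utf-8"),
    )


if __name__ == "__main__":
    run_validation_job()
```

The same extract-validate-load pattern extends to the streaming and warehouse services named above, such as Kinesis, Glue, and Redshift.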

Requirements

  • Bachelor’s degree in Computer Science, Information Systems, or a related technical field, or equivalent education, training, work experience, or military experience.
  • Minimum 5 years of experience in data engineering, integration, or analytics enablement.
  • Experience building and optimizing data pipelines in enterprise or federal environments.
  • Proficiency with Python, SQL, and ETL frameworks such as Apache NiFi, Talend, or AWS Glue.
  • Experience with AWS GovCloud or similar cloud data services, including RDS, S3, Lambda, Glue, and Redshift.
  • Knowledge of database systems such as PostgreSQL, SQL Server, DynamoDB, or MongoDB.
  • Familiarity with RESTful API integration and data formats such as JSON, XML, and CSV.
  • Understanding of DoD data security and compliance standards, including encryption, RMF, and STIG adherence.
  • Exposure to data visualization and analytics tools such as Power BI, Tableau, or QuickSight.
  • Familiarity with Agile software development and DevSecOps delivery frameworks.
  • Preferred certifications include AWS Certified Data Engineer – Associate, AWS Certified Developer, CompTIA Security+ CE, or CDMP.
  • Active DoD Top Secret clearance.
  • Must be a U.S. citizen.
  • Must have a valid driver’s license and transportation if travel is required.
  • Must be able to lift up to 50 lbs.
  • Strong analytical, problem-solving, communication, and collaboration skills with attention to data accuracy and consistency.

Benefits

  • Fully remote position.
  • Opportunities to work on high-impact Department of Defense projects.
  • Access to the latest technologies, including AI/ML-enabled solutions.
  • Employee-first culture focused on professional growth and well-being.
  • Veteran-focused workplace with a Veteran Outreach Program.
  • Commitment to innovation, transparency, and integrity.
  • Equal opportunity employment and accommodations support for applicants with disabilities.

Interested in this position?

Apply directly on the company website

Similar Roles

GCP Data Architect

66degrees · 251-1K employees · IT Services

66degrees is seeking an experienced Data Architect to design, develop, and maintain Google Cloud-based data architecture that turns enterprise data into scalable, reliable business value.

Apache Spark, dbt, GCP, Hadoop, Python, SQL

Staff Data Engineer

CookUnity · 251-1K employees · Hotels, Restaurants & Leisure

CookUnity is hiring a Data Engineer to help rebuild and scale the company’s B2C data foundation by designing production-ready pipelines and data systems for a rapidly growing food marketplace.

Apache Spark, Flink, Java, Kafka, Kubernetes, Python, Scala, Snowflake, SQL

GCP Data Engineer – Lead Consultant

Lingaro · 5K-10K employees · IT Services

Lingaro is hiring a GCP Data Engineer – Lead Consultant in India to design and develop a client’s end-to-end Google Cloud data ecosystem while working directly with stakeholders in a remote, full-time role.

Apache Airflow, Apache Spark, Azure, Confluence, Databricks, dbt, GCP, Git, JIRA, Looker, Power BI, Python, SQL, Tableau

Senior SQL / ETL Engineer (Contract)

Tech Holding · 51-250 employees · Internet Software & Services

Tech Holding is hiring a Senior SQL / ETL Engineer to support a client’s data migration, transformation, and database operations work across multiple systems.

Bash, JavaScript, Python, SQL, SQL Server
