
Principal Data Engineer

Clarity Innovations
Full Time, Principal
Columbia, Maryland, US
Posted March 19, 2026


Job Description

Position Overview

The Data Transport Engineer II is responsible for the design, implementation, and validation of secure data movement within the Unified Data Model (UDM) team. The role focuses on integration between UDM, Apache NiFi, and external stakeholder systems in constrained and classified environments, where data-flow correctness, schema integrity, and transport reliability are mission-critical. The position is delivery- and compliance-oriented rather than availability- or on-call-focused.

Position Details

Clearance Required: Active TS/SCI with CI poly

Employment Type: Full-Time

Location: On-Site Maryland/DC

Key Responsibilities

  • Design, implement, and validate secure data movement between Apache NiFi and external stakeholder components.
  • Support ingestion pipelines, data transfer reliability, and format validation in constrained and classified environments.
  • Troubleshoot data flow issues, schema alignment failures, and transport-layer integration problems across system boundaries.
  • Partner with platform and infrastructure teams to ensure data movement architecture aligns with enterprise standards and security controls.
  • Develop and maintain documentation for data transport patterns, pipeline configurations, and integration test results.
  • Contribute to delivery timelines by meeting enterprise compliance requirements and integration milestones.
  • Participate in peer reviews of pipeline configurations, data schemas, and transport layer designs.
  • Support sprint-based delivery within an Agile/Scrum environment, reporting progress to program leadership.
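The transfer-reliability responsibility above can be sketched as a retry loop with exponential backoff around a transport call. This is an illustrative sketch only: the `send` callable, attempt limits, and delays are hypothetical stand-ins, not program specifics.

```python
import time

def send_with_retry(send, max_attempts=3, base_delay=0.01):
    """Call send() until it succeeds or attempts are exhausted.

    Retries only on ConnectionError, doubling the delay after each
    failed attempt; the final failure is re-raised to the caller.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return send()
        except ConnectionError:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# Simulate a transport that fails twice, then succeeds.
attempts = []
def flaky_send():
    attempts.append(1)
    if len(attempts) < 3:
        raise ConnectionError("transient failure")
    return "delivered"

print(send_with_retry(flaky_send))  # delivered
```

In a real pipeline the retry policy would sit behind the NiFi flow or the external transport layer rather than in ad hoc scripts; the point here is only the shape of the reliability concern.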

Required Qualifications

Clearance & Compliance

  • Active TS/SCI with CI poly security clearance (required prior to start).
  • Ability to operate in classified environments and comply with all applicable government and program security protocols.

Technical Skills

  • Hands-on experience with Apache NiFi for data flow design, pipeline configuration, and processor development.
  • Working knowledge of data format conversion and schema validation (JSON, XML, Avro, Protobuf, or similar).
  • Familiarity with data modeling concepts and structured data interchange patterns.
  • Experience with REST APIs and integration patterns for data transport between distributed systems.
  • Proficiency in at least one scripting or programming language (Python, Groovy, Java) for pipeline customization.
  • Understanding of data transport security requirements including encryption in transit and access control.
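The schema-validation skill listed above can be illustrated with a minimal check that a JSON record carries the required fields and types before it is handed off for transport. The field names and types here are hypothetical; real UDM schemas would be program-specific, and production pipelines would typically use Avro or Protobuf schema tooling rather than hand-rolled checks.

```python
import json

# Hypothetical required fields for an inbound record.
EXPECTED_FIELDS = {
    "record_id": str,
    "timestamp": str,
    "payload": dict,
}

def validate_record(raw):
    """Return a list of validation errors; an empty list means the record passes."""
    try:
        record = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"malformed JSON: {exc}"]
    errors = []
    for field, expected_type in EXPECTED_FIELDS.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"wrong type for {field}: expected {expected_type.__name__}")
    return errors

good = '{"record_id": "r-1", "timestamp": "2026-03-19T00:00:00Z", "payload": {}}'
bad = '{"record_id": 42}'
print(validate_record(good))  # []
print(validate_record(bad))
```

Rejecting malformed records at the boundary, rather than deep inside a flow, is the kind of schema-integrity discipline the role description emphasizes.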

Experience

  • 5-10 years of experience in data engineering, integration engineering, or a directly related role.
  • Experience working in government, defense, or other regulated/classified environments preferred.
  • Demonstrated ability to troubleshoot integration failures and resolve data consistency issues in production pipelines.

Preferred Qualifications

  • Experience with Kubernetes-based deployment environments and containerized data pipelines.
  • Familiarity with document/data ingestion workflows.
  • Experience with CI/CD pipelines and DevSecOps practices in a government contracting context.
  • Familiarity with Agile/Scrum delivery and working in cross-functional product teams.
