
Slalom Flex (Project Based) – Federal GCP Data Engineer

Slalom
Full Time · Junior
Mt Rainier, Maryland, US · $80k – $100k
Posted February 18, 2026


Job Description

Position: Slalom Flex (Project Based) – Federal GCP Data Engineer
Location: Mt Rainier, Maryland (U.S. Citizenship Required)

About Us

Slalom is a purpose-led, global business and technology consulting company. From strategy to implementation, our approach is fiercely human. In six+ countries and 43+ markets, we deeply understand our customers—and their customers—to deliver practical, end-to-end solutions that drive meaningful impact. Backed by close partnerships with over 400 leading technology providers, our 10,000+ strong team helps people and organizations dream bigger, move faster, and build better tomorrows for all. We're honored to be consistently recognized as a great place to work, including being one of Fortune's 100 Best Companies to Work For seven years running. Learn more at …

About the Role

We are seeking a GCP Data Engineer with strong BigQuery experience to support a major federal engagement focused on disaster recovery, data modernization, and mission-critical analytics for FEMA. This is a hands-on engineering position within a secure Google Cloud Platform environment, designing, building, and optimizing scalable data pipelines and analytics capabilities that enable high-quality insights and operational excellence for our federal client. This position requires U.S. citizenship and the ability to obtain and maintain a Public Trust clearance.

What You Will Do

Data Engineering & Cloud Development
- Design, build, and maintain cloud-native ETL/ELT data pipelines using BigQuery, Dataform, Python, Cloud Composer (Airflow), Cloud Functions, and Cloud Storage.
- Develop BigQuery-centric data models, transformations, and analytics layers supporting downstream Looker dashboards and federal reporting needs.
- Implement modern analytics engineering practices, including version-controlled SQLX (Dataform), modular transformations, data quality checks, and documentation.

Client Leadership & Delivery
- Collaborate with federal stakeholders to understand data ingestion, transformation, governance, and reporting requirements.
- Translate technical designs and delivery timelines for both technical and non-technical audiences.
- Support modernization of legacy data environments into scalable GCP-based architectures.
- Ensure all solutions align with federal data governance, security, and performance standards.

Solution Optimization & Innovation
- Optimize BigQuery workloads using partitioning, clustering, incremental processing, and cost-efficient modeling.
- Maintain robust CI/CD practices using GitLab or GitHub for version control, merge requests, and promotion pipelines.
- Develop and maintain data lineage, metadata documentation, and enterprise data models.
- Identify linkages across disparate datasets to build unified, interoperable data architectures.
- Perform cleanup of existing datasets and transformation logic where needed.

Collaboration & Team Leadership
- Work closely with data architects, BI developers, cloud engineers, and data scientists.
- Participate in SAFe Agile ceremonies, including daily standups, retrospectives, and PI planning.
- Track work in Jira and maintain documentation in Confluence.
- Support testing, deployment, and quality assurance of data products.
- Mentor junior data engineering team members and contribute to best-practice frameworks.

Must-Have Qualifications
- U.S. citizenship
- Ability to obtain and maintain a federal Public Trust clearance
- 3+ years of experience in cloud-based data engineering
- Strong hands-on expertise with Google BigQuery
- Proficiency in Python for pipeline development, automation, and cloud integration
- Experience building data pipelines in GCP, including BigQuery, Dataform, Airflow/Cloud Composer, Cloud Functions, or similar
- Strong SQL skills, including data modeling and data quality testing
- Experience with Git-based version control and CI/CD concepts
- Familiarity with data governance, metadata management, and compliance considerations
- Strong communication and stakeholder engagement skills

Nice-to-Have Skills
- Experience supporting federal or regulated environments
- Familiarity with Looker and downstream BI enablement
- Understanding of ML workloads or data structures optimized for modeling
- Experience with Agile/Scrum or SAFe
- Knowledge of data quality…
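For candidates unfamiliar with the BigQuery optimization practices the role calls for (partitioning and clustering), here is a minimal, hypothetical sketch of composing a DDL statement that applies them. The table and column names (`mydataset.events`, `event_date`, `region`) are illustrative only and do not come from the posting.

```python
# Hypothetical sketch: building a BigQuery Standard SQL CREATE TABLE
# statement that partitions by a DATE column and clusters on other columns.
# All identifiers here are made up for illustration.

def partitioned_table_ddl(table: str, partition_col: str, cluster_cols: list) -> str:
    """Return a CREATE TABLE statement partitioned by a DATE column
    and clustered on the given columns."""
    clusters = ", ".join(cluster_cols)
    return (
        f"CREATE TABLE IF NOT EXISTS {table} (\n"
        f"  {partition_col} DATE,\n"
        f"  region STRING,\n"
        f"  payload JSON\n"
        f")\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {clusters}"
    )

ddl = partitioned_table_ddl("mydataset.events", "event_date", ["region"])
print(ddl)
```

Partitioning prunes scanned bytes when queries filter on the partition column; clustering co-locates rows with similar values, which is the cost-efficiency lever the posting refers to.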
