Senior Data Engineer with GCP

Mphasis
Full Time · Senior
New York, New York, US · Posted March 23, 2026

Job Description


Role description

Role : Senior Data Engineer with GCP


Location : Charlotte, NC


Duration : Full time


Key Responsibilities
• Architect and own scalable, secure, cloud-native data platforms on Google Cloud Platform
• Design, build, and optimize batch and real-time data pipelines using BigQuery, Dataflow, Pub/Sub, and Dataproc
• Lead BigQuery performance tuning and cost optimization (partitioning, clustering, query efficiency); a short sketch of this follows the list
• Orchestrate workflows using Cloud Composer (Apache Airflow)
• Enable AI/ML and GenAI integration via Vertex AI and BigQuery ML
• Enforce data governance, security, reliability, and FinOps best practices
• Mentor engineers, conduct design/code reviews, and set enterprise data engineering standards
• Collaborate with product, analytics, and data science teams to deliver business-critical insights
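
As a hedged illustration of the partitioning, clustering, and cost-optimization work named above (a minimal sketch, not Mphasis's actual setup), using the google-cloud-bigquery Python client; the project, dataset, and table names are assumptions:

    # A minimal sketch, assuming the google-cloud-bigquery library and
    # default credentials; project/dataset/table names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    # Partition by event date and cluster by a high-cardinality key so
    # queries filtering on both scan far fewer bytes.
    client.query("""
        CREATE TABLE IF NOT EXISTS analytics.events (
          event_ts TIMESTAMP,
          user_id  STRING,
          payload  JSON
        )
        PARTITION BY DATE(event_ts)
        CLUSTER BY user_id
    """).result()

    # Dry-run a query to estimate scanned bytes (the FinOps angle):
    # partition pruning on event_ts should keep this number small.
    job = client.query(
        "SELECT user_id, COUNT(*) AS n FROM analytics.events "
        "WHERE DATE(event_ts) = '2024-01-01' GROUP BY user_id",
        job_config=bigquery.QueryJobConfig(dry_run=True, use_query_cache=False),
    )
    print(f"Estimated bytes scanned: {job.total_bytes_processed}")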
Key Skill Sets
• GCP Data Services: BigQuery, Dataflow (Apache Beam), Pub/Sub, Cloud Storage, Cloud Composer, Dataproc
• Programming & SQL: Advanced SQL, Python (Java/Scala a plus)
• Data Engineering: ETL/ELT, streaming & batch processing, data modeling, distributed systems (a streaming sketch follows this list)
• Modern Architectures: Lakehouse, Apache Iceberg, Data Mesh concepts
• AI/ML Enablement: Vertex AI, BigQuery ML, GenAI-ready pipelines
• DevOps & IaC: Terraform, CI/CD, DataOps practices
• Leadership: Architecture ownership, mentoring, stakeholder communication, problem solving
• Certification: Google Cloud Professional Data Engineer (strongly preferred / often mandatory)
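
As a hedged sketch of the streaming side of these skills (Pub/Sub into BigQuery via Apache Beam, the SDK behind Dataflow); the topic and table names are assumptions, and the target table is assumed to already exist:

    # A minimal Apache Beam sketch of a Pub/Sub -> BigQuery streaming
    # pipeline, runnable on the DirectRunner or Dataflow. Topic and table
    # are hypothetical; the BigQuery table is assumed to exist.
    import json
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events")
            # Pub/Sub messages arrive as bytes; decode and parse JSON.
            | "DecodeJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )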
In addition to BigQuery and Cloud Storage buckets, the following skills are necessary: Dataflow, Cloud Composer, Cloud Scheduler, Pub/Sub and Kafka, Apigee gateway and APIs, Dataplex, and basic knowledge of network connectivity (including Data Catalog, DLP, BigQuery Data Transfer Service (BQDTS), Storage Transfer Service (STS), and other data transfer methodologies). A reporting background (Power BI) and Apache Iceberg are a MUST. Data virtualization (Trino or equivalent), Looker, and GCP Vertex AI will be a plus. A minimal orchestration sketch follows.
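
Since Cloud Composer appears in both the responsibilities and this skills note, here is a minimal, hedged sketch of an orchestration DAG (Cloud Composer is managed Apache Airflow). The DAG id, schedule, and query are hypothetical, not taken from the posting:

    # A minimal Airflow DAG sketch of the Cloud Composer orchestration
    # this posting asks for. Requires apache-airflow-providers-google;
    # all names and the SQL below are illustrative assumptions.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    with DAG(
        dag_id="daily_events_rollup",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Run a scheduled BigQuery job; Composer handles retries/alerting.
        rollup = BigQueryInsertJobOperator(
            task_id="rollup_events",
            configuration={
                "query": {
                    "query": (
                        "SELECT DATE(event_ts) AS day, COUNT(*) AS n "
                        "FROM analytics.events GROUP BY day"
                    ),
                    "useLegacySql": False,
                }
            },
        )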

Other details

Deputation Location : US, New York, New York
