
GCP Big Data Engineer

Talentmatics
Full Time · Senior
Hosur, Tamil Nadu, IN · Posted March 9, 2026


Job Description

We are seeking an experienced Lead GCP Big Data Engineer with strong expertise in designing and building scalable data pipelines, ETL/ELT workflows, and big data processing solutions on Google Cloud Platform (GCP). The ideal candidate combines hands-on development expertise with technical leadership, and can design modern data architectures, optimize data workflows, and mentor engineering teams.

Key Responsibilities

- Design, develop, and maintain scalable ETL/ELT data pipelines using PySpark, SQL, and GCP services.
- Lead data engineering initiatives and provide technical leadership and mentorship to the team.
- Build and optimize data processing workflows using Dataflow, Dataproc, Cloud Composer, and Apache Airflow.
- Ensure data quality, governance, reliability, and performance optimization across data platforms.
- Collaborate with cross-functional teams, including data analysts, data scientists, and product teams, to deliver end-to-end data solutions.
- Implement best practices for data engineering, automation, monitoring, and deployment.
- Troubleshoot and resolve data pipeline performance and scalability issues.

Required Skills

Google Cloud Platform (GCP):

Cloud Storage, BigQuery, Dataproc, Cloud Composer, DMS, Datastream, Analytics Hub, Workflows, Dataform, Cloud Data Fusion, Pub/Sub, Dataflow

Big Data Technologies:

PySpark, Hadoop Ecosystem, ETL/ELT Frameworks, ANSI-SQL, Apache Airflow

Programming Languages:

Python, Java, Scala

Experience & Qualifications

- 8–10 years of experience in Data Engineering or Big Data Development.
- Strong hands-on experience with GCP data services and big data processing frameworks.
- Proven experience in designing and implementing large-scale distributed data pipelines.
- Experience in data modeling, performance tuning, and workflow orchestration.
- Strong problem-solving skills and the ability to work in a collaborative, fast-paced environment.

Preferred Skills

- Experience in data governance, monitoring, and data quality frameworks.
- Prior experience in leading teams or mentoring junior engineers.
- Familiarity with CI/CD pipelines and DevOps practices for data engineering.
