GCP Data Engineers with Dataplex | 4 to 8 years | Bangalore | 18 to 20 LPA | Big4

Acme Services
Full Time | Mid-level
Bengaluru, Karnataka, IN | Posted March 7, 2026

Job Description

Job Title: GCP Data Engineer with Dataplex

Experience: 4 to 8 Years

Budget: 18 to 20 LPA

Location: Bangalore

Type: Full-Time

Responsibilities

Job Summary: As a skilled GCP Data Engineer with hands-on experience in Google Cloud Platform (GCP) and Dataplex, you will be responsible for building, maintaining, and optimizing data pipelines and data lake infrastructure. You will work closely with data architects, governance teams, and business stakeholders to deliver reliable, high-quality, and governed data products on GCP.

Core Responsibilities:

- Design, develop, and maintain scalable data ingestion, transformation, and orchestration pipelines using GCP-native services such as Dataflow, Dataproc, Cloud Composer, and Pub/Sub (see the Dataflow sketch after this list)
- Implement and manage GCP Dataplex lakes and zones — configuring data discovery, metadata tagging, data quality rules, and governance policies
- Build and manage data lake and lakehouse architectures on Cloud Storage and BigQuery, following best practices for partitioning, clustering, and cost optimization
- Collaborate with Data Governance and Architecture teams to ensure data quality, lineage, and cataloging standards are enforced across data pipelines
- Develop and maintain ELT/ETL workflows using Cloud Composer (Apache Airflow), Dataflow (Apache Beam), and Dataproc (Spark/Hadoop) (see the Composer sketch after this list)
- Implement data access controls and security policies, including IAM roles, column/row-level security, and VPC Service Controls on GCP
- Monitor, troubleshoot, and optimize pipeline performance and reliability in production environments
- Participate in code reviews, write technical documentation, and contribute to data engineering best practices and standards
- Support data migration and modernization initiatives by moving on-premise or legacy data workloads to GCP cloud-native solutions
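
To make the Cloud Composer item above more concrete, here is a minimal, hedged sketch of an Airflow DAG that loads a daily file drop from Cloud Storage into a date-partitioned BigQuery table. The DAG id, bucket, project, dataset, and table names are illustrative placeholders, not details taken from this posting.

```python
# Minimal Cloud Composer (Airflow) DAG sketch: load a daily Parquet drop from
# a raw-zone bucket into a date-partitioned BigQuery table.
# All resource names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_sales_load",           # hypothetical pipeline name
    schedule_interval="@daily",          # one run per logical date
    start_date=datetime(2026, 1, 1),
    catchup=False,
) as dag:
    load_raw_to_bq = GCSToBigQueryOperator(
        task_id="load_raw_to_bq",
        bucket="example-raw-zone",                      # placeholder bucket
        source_objects=["sales/{{ ds }}/*.parquet"],    # one folder per logical date
        source_format="PARQUET",
        destination_project_dataset_table="example-project.curated.sales",
        write_disposition="WRITE_APPEND",
        time_partitioning={"type": "DAY", "field": "sale_date"},
    )
```

In a real Composer environment this file would simply be dropped into the environment's dags/ bucket; downstream data-quality or Dataplex tasks would hang off load_raw_to_bq with the usual >> dependency operator.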

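The Dataflow item in the list above refers to Apache Beam pipelines; below is a minimal, assumed-shape batch example that reads newline-delimited JSON events from a raw-zone bucket, filters malformed records, and appends the rest to BigQuery. Bucket, project, dataset, table, and field names are hypothetical.

```python
# Minimal Dataflow (Apache Beam) batch pipeline sketch: GCS JSON -> BigQuery.
# Resource names are placeholders, not taken from the job description.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Pass --runner=DataflowRunner, --project, --region, etc. on the command line.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadRaw" >> beam.io.ReadFromText("gs://example-raw-zone/events/*.json")
            | "Parse" >> beam.Map(json.loads)
            | "KeepValid" >> beam.Filter(lambda e: e.get("event_id") is not None)
            | "WriteCurated" >> beam.io.WriteToBigQuery(
                "example-project:curated.events",
                schema="event_id:STRING,user_id:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

The same pipeline runs locally with the DirectRunner for testing and on Dataflow in production, which is the usual reason Beam is paired with Dataflow in roles like this one.
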
Mandatory skill sets:

- Hands-on experience with GCP Dataplex — creating and managing lakes, zones, assets, data quality rules, and metadata discovery
- Strong proficiency in BigQuery — writing complex SQL, data modeling, performance tuning, partitioning, and clustering (see the sketch after this list)
- Experience building data pipelines using Cloud Dataflow (Apache Beam) and/or Dataproc (Apache Spark)
- Hands-on experience with Cloud Composer (Apache Airflow) for pipeline orchestration and scheduling
- Proficiency in Python and SQL for data transformation and pipeline development
- Experience with Cloud Storage for data lake design — raw, refined, and curated layer management
- Working knowledge of data governance concepts — data quality, metadata management, data lineage, and cataloging
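
As a concrete illustration of the BigQuery partitioning and clustering skills listed above, here is a small, assumed example that creates a date-partitioned, clustered table through the BigQuery Python client. The project, dataset, table, and column names are placeholders.

```python
# Sketch: create a date-partitioned, clustered BigQuery table via the Python client.
# All identifiers below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

ddl = """
CREATE TABLE IF NOT EXISTS `example-project.curated.events`
(
  event_id STRING,
  user_id  STRING,
  event_ts TIMESTAMP,
  country  STRING
)
PARTITION BY DATE(event_ts)      -- daily partitions keep scans and costs narrow
CLUSTER BY country, user_id      -- clustering prunes blocks within each partition
OPTIONS (partition_expiration_days = 365)
"""

client.query(ddl).result()  # the DDL statement runs as an ordinary query job
```

Queries that filter on DATE(event_ts) and country then read only the relevant partitions and clustered blocks, which is the cost-optimization practice the role calls out.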
