
Lead Cloud Data Architect (GCP & SAP)

R Systems International Limited
Full Time · Lead
IN · Posted March 11, 2026

Resume Keywords to Include

Make sure these keywords appear in your resume to improve ATS scoring

Python, SQL, GCP, Azure, Terraform, GitHub Actions, Apache, Snowflake, BigQuery, GitHub, Airflow, CI/CD, DevOps


Job Description

As a highly skilled Google Cloud Platform (GCP) Data Engineer with SAP data integration expertise, your role will involve designing, implementing, and overseeing enterprise-grade data solutions. You will collaborate with business stakeholders and engineering teams to create a robust, scalable, and cost-effective data ecosystem that bridges SAP and GCP environments.

**Key Responsibilities:**
  • Lead and mentor a team of data engineers building ETL/ELT pipelines from SAP and other ERP sources into GCP.
  • Set engineering standards, best practices, and coding guidelines.
  • Provide technical direction, code reviews, and support for complex data solutions.
  • Collaborate with project managers, provide estimates, track progress, and remove roadblocks to ensure timely completion of work.
  • Collaborate with BI teams and data analysts to enable reporting solutions.
**Data Architecture & Modeling:**
  • Design conceptual, logical, and physical data models to support analytics and operational workloads.
  • Implement star, snowflake, and data vault models for analytical systems.
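As a rough illustration of the kind of dimensional model this covers, the sketch below creates a hypothetical star schema (a `fact_sales` fact table joined to a `dim_customer` dimension) in BigQuery from Python. The project, dataset, table, and column names are placeholders rather than anything specified in this posting, and partitioning/clustering is shown only as one common cost-control choice.

```python
# Minimal star-schema sketch for an analytical sales mart. All names
# (sales_mart, dim_customer, fact_sales, ...) are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project id

DDL = """
CREATE SCHEMA IF NOT EXISTS sales_mart;

-- Dimension: one row per customer; slowly changing attributes kept simple here.
CREATE TABLE IF NOT EXISTS sales_mart.dim_customer (
  customer_key  INT64 NOT NULL,
  customer_id   STRING,          -- natural key from the source ERP
  customer_name STRING,
  country       STRING
);

-- Fact: one row per sales line, partitioned and clustered to limit scan cost.
CREATE TABLE IF NOT EXISTS sales_mart.fact_sales (
  sales_key    INT64 NOT NULL,
  customer_key INT64,            -- foreign key to dim_customer
  order_date   DATE,
  quantity     NUMERIC,
  net_amount   NUMERIC
)
PARTITION BY order_date
CLUSTER BY customer_key;
"""

# Run each DDL statement and wait for completion.
for statement in DDL.split(";"):
    if statement.strip():
        client.query(statement).result()
```

Partitioning on the date column plus clustering on the join key is a typical way to keep query scans, and therefore BigQuery cost, proportional to the slice being analysed, which ties directly into the cost-optimization responsibility below.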
**Google Cloud Platform Expertise:**
  • Design data solutions on GCP using BigQuery, Cloud Storage, Dataflow, and Dataproc.
  • Implement cost optimization strategies for GCP workloads.
**Data Pipelines & Integration:**
  • Design and orchestrate ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow (a minimal DAG sketch follows this list).
  • Integrate data from multiple systems, including SAP BW, SAP HANA, and BusinessObjects, using tools such as SAP SLT or the Google Cortex Framework.
  • Leverage integration tools such as Boomi for system interoperability.
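As referenced above, here is a minimal Cloud Composer (Airflow) DAG sketch of that orchestration pattern. It assumes SAP extracts have already been replicated to Cloud Storage (for example via SLT- or Cortex-based replication) and simply loads them into a raw BigQuery table before an in-warehouse transform; the bucket, dataset, and table names are hypothetical.

```python
# Sketch: daily DAG that loads a landed SAP extract from GCS into BigQuery,
# then runs an ELT transform. Names are placeholders, not from the posting.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="sap_sales_to_bigquery",
    start_date=datetime(2026, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Stage the raw SAP extract (e.g. CSVs dropped by replication) into a landing table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_extract",
        bucket="sap-landing-bucket",
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="raw.sap_sales_{{ ds_nodash }}",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # ELT step: transform the landed data into the curated fact table inside BigQuery.
    transform = BigQueryInsertJobOperator(
        task_id="transform_to_fact",
        configuration={
            "query": {
                # Column mapping is simplified for the sketch.
                "query": "INSERT INTO sales_mart.fact_sales "
                         "SELECT * FROM raw.sap_sales_{{ ds_nodash }}",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```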
**Programming & Analytics:**
  • Develop complex SQL queries for analytics, transformations, and performance tuning.
  • Build automation scripts and utilities in Python.
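To give a flavour of the SQL-plus-Python work described here, the sketch below wraps a parameterized BigQuery query in a small Python utility. The table names reuse the hypothetical star schema from the earlier sketch, and the filter on the partition column is shown as one typical performance-tuning technique.

```python
# Sketch of a small analytics utility around a parameterized BigQuery query.
# Dataset/table names and the date column are assumptions for illustration.
from google.cloud import bigquery


def daily_revenue_by_country(client: bigquery.Client, run_date: str):
    """Return revenue per country for one day, filtering on the partition
    column so BigQuery only scans the relevant partition."""
    query = """
        SELECT c.country,
               SUM(f.net_amount) AS revenue
        FROM   sales_mart.fact_sales AS f
        JOIN   sales_mart.dim_customer AS c USING (customer_key)
        WHERE  f.order_date = @run_date      -- partition pruning keeps the scan small
        GROUP BY c.country
        ORDER BY revenue DESC
    """
    job_config = bigquery.QueryJobConfig(
        query_parameters=[bigquery.ScalarQueryParameter("run_date", "DATE", run_date)]
    )
    return list(client.query(query, job_config=job_config).result())


if __name__ == "__main__":
    for row in daily_revenue_by_country(bigquery.Client(), "2026-03-01"):
        print(row.country, row.revenue)
```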
**System Migration:**
  • Lead on-premises-to-cloud migrations for enterprise data platforms (SAP BW/BusinessObjects).
  • Manage migration of SAP datasets to GCP, ensuring data integrity and minimal downtime.
**DevOps for Data:**
  • Implement CI/CD pipelines for data workflows using GitHub Actions, Cloud Build, and Terraform.
  • Apply infrastructure-as-code principles for reproducible and scalable deployments.
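The pipeline definitions themselves would live in GitHub Actions workflows and Terraform configuration; as one example of what such a CI/CD pipeline can run, the sketch below is a hypothetical pre-merge check that dry-runs every SQL file in a repository against BigQuery so invalid transformations fail the build instead of production. The `sql/` directory layout is an assumption for illustration.

```python
# Sketch of a check a CI job (e.g. a GitHub Actions step) could run before merge:
# dry-run every SQL file so syntax/reference errors fail the build early.
import pathlib
import sys

from google.cloud import bigquery


def dry_run_all(sql_dir: str = "sql") -> int:
    client = bigquery.Client()
    config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    failures = 0
    for path in sorted(pathlib.Path(sql_dir).glob("**/*.sql")):
        try:
            job = client.query(path.read_text(), job_config=config)
            print(f"OK   {path} (would scan {job.total_bytes_processed} bytes)")
        except Exception as exc:  # dry-run errors surface immediately from query()
            failures += 1
            print(f"FAIL {path}: {exc}")
    return failures


if __name__ == "__main__":
    sys.exit(1 if dry_run_all() else 0)
```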
**Data Modelling:**
  • Design and develop conceptual, logical, and physical data models for enterprise systems.
  • Translate business requirements into data entities, attributes, relationships, and constraints.
  • Build and maintain dimensional models (Star/Snowflake schema) for data warehouses and BI reporting.
  • Develop data models for data lake/lakehouse environments (BigQuery, Snowflake, Azure Synapse, Databricks).
  • Define and document data standards, naming conventions, and data definitions.
  • Collaborate with Data Engineering teams to ensure models are implemented accurately in ETL/ELT pipelines.
  • Work with BI teams to optimize models for reporting tools such as Power BI, Tableau, SAP BW, etc.
  • Support integration across multiple source systems (SAP, Salesforce, Oracle, etc.).
  • Ensure data models comply with data governance, security, and compliance requirements.
  • Create and maintain documentation including ERDs, data dictionaries, and lineage diagrams.
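As one way to keep such a data dictionary current, the sketch below reads column metadata from a BigQuery dataset's INFORMATION_SCHEMA and writes it to CSV; the dataset name and output path are placeholders for illustration.

```python
# Sketch: generate a simple data dictionary for one dataset from BigQuery's
# INFORMATION_SCHEMA column metadata. Names/paths are placeholders.
import csv

from google.cloud import bigquery


def export_data_dictionary(dataset: str = "sales_mart",
                           out_path: str = "data_dictionary.csv") -> None:
    client = bigquery.Client()
    query = f"""
        SELECT table_name, column_name, data_type, is_nullable
        FROM   `{dataset}.INFORMATION_SCHEMA.COLUMNS`
        ORDER BY table_name, ordinal_position
    """
    rows = client.query(query).result()
    with open(out_path, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["table", "column", "type", "nullable"])
        for row in rows:
            writer.writerow([row.table_name, row.column_name, row.data_type, row.is_nullable])


if __name__ == "__main__":
    export_data_dictionary()
```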
**Preferred Skills:**
  • 4-6 years of proven experience with GCP BigQuery, Composer, Cloud Storage, Pub/Sub, Dataflow.
  • Strong SQL and Python programming skills.
  • Hands-on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems.
  • Knowledge of data governance frameworks and security best practices.
  • Familiarity with DevOps tools for data.
  • Understanding of the Google Cortex Framework for SAP-GCP integrations.
