
Senior Data Migration and ETL Developer

KPMG India Services LLP
Full Time · Senior
Bengaluru, Karnataka, IN · Posted April 1, 2026


Job Description

Consultant

Role Overview:

We are seeking a highly skilled Senior Developer specializing in Data Migration and ETL to lead complex data integration initiatives across enterprise platforms. The ideal candidate will have extensive experience in SAP S/4HANA, Azure Databricks, and Azure Data Factory (ADF), ensuring seamless migration, transformation, and orchestration of large-scale datasets in a secure and efficient manner.

Key Responsibilities:

  • Design & Develop ETL Pipelines:

Build and optimize end-to-end ETL workflows for migrating data from SAP S/4HANA and other source systems into target platforms using ADF and Databricks.

  • Data Migration Strategy:

Implement robust migration frameworks, including staging, transformation, and reconciliation processes, ensuring data integrity and compliance with business rules.

  • Integration with SAP Ecosystem:

Utilize the SAP Data Provisioning Agent and connectors to extract data from ECC/S/4HANA, BW, and Datasphere, integrating with the Databricks Lakehouse for advanced analytics.

  • Performance Optimization:

Tune Spark jobs, cluster configurations, and ADF pipelines for high throughput and cost efficiency.

  • Governance & Security:

Apply best practices for data governance, metadata management, and secure data handling across environments (Dev, QA, Prod).

  • Collaboration:

Work closely with architects, data engineers, and business stakeholders to align migration activities with enterprise data strategy.

Required Skills & Experience:

  • Technical Expertise:
  • SAP S/4HANA data extraction and integration patterns, including expertise with the Data Migration Cockpit and SAP Data Services.
  • Azure Data Factory (ADF) orchestration and pipeline development.
  • Azure Databricks (PySpark, Delta Lake, Unity Catalog).
  • Strong SQL and Python skills for data transformation.
  • Data Architecture Knowledge:
  • Experience with ADLS Gen2, hierarchical namespaces, and Delta Lake format.
  • Familiarity with Common Data Model (CDM) and enterprise data platforms.
  • ETL & Migration Frameworks:
  • Hands-on experience with staging, mapping, reconciliation dashboards, and metadata-driven transformations.
  • Cloud & Big Data Ecosystem:
  • Understanding of Azure Integration Services and hyperscaler tools (ADF, AWS Glue, etc.).
  • Soft Skills:
  • Strong problem-solving, analytical thinking, and stakeholder communication.
  • Ability to lead migration projects and mentor junior developers.

Preferred Qualifications:

  • 5+ years of experience in data engineering and migration projects.
  • SAP certification or proven experience in SAP data integration.
  • Experience in Agile delivery and CI/CD practices.

Experience Level

Senior Level
