Senior Data Engineer | 8+ years only | Bangalore

Deloitte
Full Time · Senior · Hybrid
Bengaluru, Karnataka, IN · Posted March 6, 2026


Job Description

What impact will you make?

Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration and high performance. As the undisputed leader in professional services, Deloitte is where you will find unrivaled opportunities to succeed and realize your full potential.

The Team

Deloitte’s Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.

Senior Data Engineer (Bangalore)

Experience: 8+ Years

Work Model: Hybrid – 4 Days Work From Office

Work Timings: 7:00 AM – 4:00 PM IST

Employment Type: Full‑time

Key Responsibilities

ETL/ELT Development

  • Develop, maintain, and optimize ETL/ELT pipelines using dbt & Python (must‑have).
  • Source data via batch files, APIs, legacy upstream systems, and enterprise platforms.
  • Implement robust ingestion processes with schema evolution handling, error management, and logging.
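For illustration, the kind of ingestion robustness described above (schema evolution handling plus error logging) might look like this minimal Python sketch. The schema and field names are hypothetical, not part of the posting:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

# Hypothetical expected schema: field name -> type. Unknown source fields are
# logged and passed through; missing fields become None (a simple form of
# schema evolution handling).
EXPECTED_SCHEMA = {"id": int, "name": str, "amount": float}

def ingest_batch(records):
    """Validate a batch of dicts against EXPECTED_SCHEMA.

    Returns (clean_rows, error_rows); bad rows are logged, never dropped silently.
    """
    clean, errors = [], []
    for i, rec in enumerate(records):
        try:
            row = {}
            for field, ftype in EXPECTED_SCHEMA.items():
                value = rec.get(field)
                row[field] = ftype(value) if value is not None else None
            extra = set(rec) - set(EXPECTED_SCHEMA)
            if extra:
                log.info("record %d: new fields %s (schema evolved)", i, sorted(extra))
                row.update({k: rec[k] for k in extra})
            clean.append(row)
        except (TypeError, ValueError) as exc:
            log.error("record %d rejected: %s", i, exc)
            errors.append(rec)
    return clean, errors
```

In a real pipeline the rejected rows would land in a quarantine table rather than a Python list, but the pattern (typed validation, tolerated schema drift, explicit error channel) is the same.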

Cloud Ingestion & Integration (AWS & GCP)

  • Build ingestion pipelines on AWS using Lambda, S3, Terraform, CloudWatch, and Step Functions.
  • Build ingestion pipelines on GCP using Cloud Functions, Cloud Storage, Pub/Sub, and Cloud Composer.
  • Engineer secure, scalable, and resilient integrations between cloud systems and on‑premise data sources.

Advanced Data Modelling

  • Design and maintain logical, physical, ER, and dimensional data models for GCP BigQuery and Teradata environments.
  • Develop scalable modelling patterns including Star/Snowflake schemas, 3NF, and Data Vault modelling.
  • Optimize models using partitioning, clustering, indexing, and incremental load techniques.
  • Build subject‑area layers and semantic datasets for analytics and operational reporting.

System-Level Data Sourcing & Integration

  • Collaborate with upstream and enterprise system owners to understand source structures, API specs, event models, data contracts, and integration constraints.
  • Develop end‑to‑end data sourcing strategies ensuring data freshness, quality, lineage traceability, and latency SLAs.
  • Implement integrations across multi‑cloud, on-prem, and heterogeneous data systems.
  • Implement CDC (Change Data Capture) patterns, event‑driven ingestion, and real-time streaming where applicable.
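One lightweight CDC pattern the last bullet refers to is high-watermark incremental extraction. A sketch, with a hypothetical in-memory source standing in for a real table:

```python
from datetime import datetime

# Hypothetical source table: each row carries an updated_at timestamp.
SOURCE = [
    {"id": 1, "updated_at": datetime(2026, 3, 1)},
    {"id": 2, "updated_at": datetime(2026, 3, 5)},
    {"id": 3, "updated_at": datetime(2026, 3, 6)},
]

def pull_changes(watermark):
    """Return rows changed since `watermark`, plus the advanced watermark.

    A production pipeline would persist the watermark (e.g. in a state
    table) and page through a source query instead of scanning a list.
    """
    changed = [r for r in SOURCE if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in changed), default=watermark)
    return changed, new_watermark
```

Log-based CDC (reading the database's change log) avoids the need for reliable `updated_at` columns, at the cost of more infrastructure; the watermark approach above is the simpler batch-friendly variant.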

Data Transformation & Warehouse Delivery

  • Deliver complete warehouse solutions, from landing data in RAW zones to building highly curated, trusted datasets.
  • Build incremental transformations adhering to best-practice ELT patterns.
  • Implement SCD Type 1/Type 2, surrogate key logic, and full historical tracking as needed.
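The SCD Type 2 mechanics mentioned above (close the old version of a changed row, open a new one) can be sketched in plain Python. This is an in-memory illustration with made-up column names; in a warehouse the same logic would typically be a MERGE statement or a dbt snapshot:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, today):
    """Apply SCD Type 2 to a dimension table.

    `dim_rows`: existing dimension rows, each with natural key `id`,
    payload `attrs`, and `valid_from`/`valid_to` (None = current version).
    `incoming`: {id: attrs} snapshot from the source.
    """
    current = {r["id"]: r for r in dim_rows if r["valid_to"] is None}
    out = list(dim_rows)
    for key, attrs in incoming.items():
        row = current.get(key)
        if row is None:                                  # brand-new entity
            out.append({"id": key, "attrs": attrs,
                        "valid_from": today, "valid_to": None})
        elif row["attrs"] != attrs:
            row["valid_to"] = today                      # close old version
            out.append({"id": key, "attrs": attrs,       # open new version
                        "valid_from": today, "valid_to": None})
    return out
```

Unchanged rows are left alone, which is what makes the pattern compatible with incremental loads: only changed or new keys produce writes.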

CI/CD, Automation & Version Control

  • Implement CI/CD pipelines using Codefresh or equivalent tools (GitHub Actions, GitLab CI/CD, Jenkins).
  • Use Git for version control, branching, pull requests, and team collaboration.
  • Automate deployments of data pipelines, models, and infrastructure using Terraform and cloud-native tools.

Data Governance & Quality

  • Implement and maintain metadata, business glossary, data dictionaries, and reference data frameworks.
  • Ensure strong data quality through validation rules, unit tests, and monitoring dashboards.
  • Align all deliverables with enterprise data governance, auditability, and compliance guidelines.
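The validation rules mentioned above are typically small, composable checks. A hedged sketch (function names are illustrative, not any specific framework's API):

```python
def check_not_null(rows, column):
    """Validation rule: flag rows where `column` is missing or None."""
    failed = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"rule": f"not_null({column})", "passed": not failed,
            "failed_rows": failed}

def check_unique(rows, column):
    """Validation rule: flag values of `column` appearing more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return {"rule": f"unique({column})", "passed": not dupes,
            "duplicates": sorted(dupes)}
```

Tools such as dbt tests express exactly these two checks declaratively (`not_null`, `unique`); the value of the structured result dict is that it can feed a monitoring dashboard rather than just failing a build.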

Required Skills

  • 8+ years of strong data engineering experience, with deep expertise in dbt, Python, SQL, and ELT/ETL patterns.
  • Proven experience in AWS and GCP cloud ecosystems.
  • Strong knowledge of data modelling, system integrations, and warehouse architecture.
  • Hands-on experience with Terraform, serverless compute, orchestration tools, and version control.
  • Excellent debugging skills and strong understanding of data quality, lineage, and large-scale data processing.

Preferred Skills

  • Experience with BigQuery, Teradata, Snowflake, Kafka, Airflow, or microservices‑based ingestion.
  • Experience designing end‑to‑end enterprise data platforms with observability and alerting.

How you will grow

At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there is always room to learn. We offer opportunities to help build excellent skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career.

Benefits

At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits.

Our purpose

Deloitte is led by a purpose: to make an impact that matters.

Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work, always striving to be an organization that is held up as a role model of quality, integrity, and positive change.

Recruiter tips

We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you are applying to.
