
Lead Data Engineer – AWS Platform

Mogi I/O : OTT/Podcast/Short Video Apps for you
Full Time · Lead
Posted March 5, 2026


Job Description

Location: Bangalore, India

Salary Range: INR 30,00,000 – 34,00,000 per annum

Experience Required: 10–12 Years

Role Overview

We are seeking a Lead Data Engineer to drive the enterprise data engineering strategy for a unified analytics platform spanning the Digital, Stores, and Marketplace channels.

This role will lead the end-to-end architecture roadmap, including the complete migration from Snowflake to a Databricks/Spark Lakehouse ecosystem on AWS, ensuring enterprise-wide KPI consistency (≥95%).

You will function as both a hands-on technical leader and a strategic architect, guiding platform modernization and governance initiatives at scale.

Key Responsibilities

  • Define and implement the target-state data architecture using Databricks, Spark, and AWS services.
  • Lead the Snowflake divestiture while ensuring seamless business continuity.
  • Design scalable batch and real-time data pipelines using Python, Spark, SQL, Kafka, and Kinesis.
  • Build and optimize ETL/ELT workflows on AWS (S3, Lambda, EMR, Databricks).
  • Establish orchestration standards (Airflow), CI/CD processes (Git, Jenkins), and Infrastructure as Code frameworks (Terraform/CloudFormation).
  • Implement data governance, lineage, and metric management frameworks using Unity Catalog.
  • Ensure monitoring, SLA/SLO adherence, and operational excellence across platforms.
  • Mentor and lead engineering teams.

Required Qualifications

  • 10+ years of experience in data engineering and distributed systems with strong architectural ownership.
  • Deep AWS expertise with hands-on Databricks experience in large-scale production environments.
  • Advanced proficiency in Python and SQL.
  • Proven experience modernizing legacy systems and migrating to Databricks/Spark Lakehouse architectures.
  • Strong background in data governance, lineage, and enterprise KPI management.

Certifications (Mandatory)

  • Databricks Certified Data Engineer – Professional
  • AWS Solutions Architect – Associate or Professional (Preferred)
