
Software Engineer II- Python, Databricks, AWS, Spark, IDMC

JPMorganChase
Full Time, Mid-level
Bengaluru, Karnataka, India. Posted March 19, 2026


Job Description

We have an exciting opportunity for you to advance your data engineering career and make a meaningful impact by joining our innovative team.

Job summary

As a Data Engineer II at JPMorgan Chase within the Corporate Data and Analytics Service team, you design and deliver trusted, scalable data solutions using modern technologies. You collaborate with us to drive critical technology initiatives that support business objectives and foster a culture of growth and inclusion.

Job responsibilities

  • Design, develop, and maintain scalable data pipelines using Python and Spark
  • Build and optimize ETL workflows in Databricks, leveraging Delta Lake features
  • Integrate and manage data across AWS services such as S3, Lambda, and EKS
  • Collaborate with data analysts and business stakeholders to deliver solutions
  • Ensure data quality, integrity, and security across engineering processes
  • Monitor, troubleshoot, and optimize pipeline performance and resource usage
  • Document data flows, architecture, and processes for internal knowledge sharing
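The pipeline responsibilities above follow a classic extract-transform-load pattern. As a minimal, hedged sketch of that pattern in plain Python (in production this logic would run on Spark DataFrames in Databricks with Delta Lake storage; the record layout and field names below are illustrative assumptions, not part of the role):

```python
# Minimal ETL sketch. In production the transform would run on Spark
# DataFrames in Databricks, with Delta Lake handling storage; the record
# shape and field names here are illustrative assumptions.

def extract(rows):
    """Simulate reading raw records (e.g., JSON objects landed in S3)."""
    return list(rows)

def transform(rows):
    """Drop invalid records and normalize fields (a basic data-quality step)."""
    cleaned = []
    for row in rows:
        if row.get("amount") is None:  # integrity check: skip incomplete rows
            continue
        cleaned.append({
            "id": row["id"],
            "amount": round(float(row["amount"]), 2),
        })
    return cleaned

def load(rows, sink):
    """Append transformed records to a sink (stand-in for a Delta table write)."""
    sink.extend(rows)
    return len(rows)

raw = [
    {"id": 1, "amount": "10.50"},
    {"id": 2, "amount": None},   # fails the quality check and is dropped
    {"id": 3, "amount": "7.25"},
]
table = []
written = load(transform(extract(raw)), table)
print(written)  # prints 2
```

The separation into three small functions mirrors how such pipelines are typically monitored and tested stage by stage, as the troubleshooting bullet above implies.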

Required qualifications, capabilities, and skills

  • Formal training or certification on software engineering concepts and 2+ years of applied experience
  • Proficient in Python for data processing and automation
  • Strong experience with Apache Spark (PySpark) for distributed data processing
  • Hands-on experience with Databricks platform and Delta Lake
  • Solid understanding of AWS cloud services, including S3, Lambda, EKS, and Aurora DB
  • Experience with ETL design, data modeling, and data warehousing concepts
  • Familiarity with CI/CD tools and practices for data engineering

Preferred qualifications, capabilities, and skills

  • Familiarity with modern front-end technologies
  • Exposure to cloud technologies
  • Experience with orchestration tools such as Airflow
  • Experience with REST APIs and data integration
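REST-based data integration, as in the last bullet, usually comes down to paging through an API until it is exhausted. A small, hedged sketch of that pattern (the page-fetching function is injected, so any real HTTP client could back it; the endpoint shape and field names are hypothetical, not a real API):

```python
# Generic pagination helper for REST data integration. The `fetch_page`
# callable is injected, so any HTTP client (requests, urllib, ...) can sit
# behind it; the page/response shape below is a hypothetical example.

def fetch_all(fetch_page, page_size=2):
    """Collect records from a paginated endpoint until a short page is returned."""
    records, page = [], 0
    while True:
        batch = fetch_page(page=page, size=page_size)
        records.extend(batch)
        if len(batch) < page_size:  # a short (or empty) page means we're done
            return records
        page += 1

# Stand-in for a real HTTP call returning JSON-decoded rows.
DATA = [{"id": i} for i in range(5)]

def fake_fetch(page, size):
    return DATA[page * size:(page + 1) * size]

rows = fetch_all(fake_fetch)
print(len(rows))  # prints 5
```

Injecting the fetcher keeps the pagination logic unit-testable without network access, which is also how such helpers tend to be exercised in CI/CD pipelines like those the posting mentions.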
