
Data Engineer - Data Intelligence

PITCS
Full Time, Mid-level
Pune, Maharashtra, IN. Posted April 2, 2026


Job Description

As a Data Engineer, you will play a crucial role in implementing data flows to connect operational systems, analytics, and BI systems. Your key responsibilities will include:

  • Documenting source-to-target mappings and building streaming data systems.
  • Writing ETL scripts, optimizing ETL processes, and developing reusable business intelligence reports.
  • Applying data profiling techniques, modeling complex systems for analysis, and designing, building, and testing large-scale data integration services.
  • Adopting innovative data tools and practices, designing scalable and resilient data integration technologies, and aligning data models across teams and repositories.
  • Resolving data-related issues, designing metadata repositories, and applying programming and build practices with moderate-to-complex scripts.
  • Reviewing requirements and defining test conditions for data engineering work.

Qualifications required for this role include:

  • Proficiency with Azure DevOps, Terraform, Azure Data Factory, and Azure Databricks.
  • Programming experience in Scala, Python, and .NET Core.
  • Experience with Power BI Embedded.
  • A strong understanding of metadata management, testing, and data innovation practices.

The company values individuals who are results-driven with strong follow-through, take ownership and accountability for outcomes, have a continuous learning and process improvement mindset, and possess a curious and innovative approach to data solutions.
