
Auxo AI - Senior Data Engineer - Generative AI

Auxo AI
Full Time · Senior
Mumbai, Maharashtra, IN · Posted April 8, 2026


Job Description

AuxoAI is seeking a Senior GenAI Data Engineer with strong fundamentals in data engineering and end-to-end solution design.

In this role, you will design and develop production-grade pipelines, leverage GenAI tools (Copilot, Claude, Gemini) to boost development productivity, and define engineering best practices across complex data environments.

This is a highly collaborative, cross-functional role ideal for someone who thrives at the intersection of data engineering excellence and GenAI-powered innovation.

Location: Bangalore, Hyderabad, Mumbai, and Gurgaon

Responsibilities

  • Architect and develop end-to-end data pipelines from ingestion to transformation to consumption
  • Lead solutioning and integration for complex data workflows (batch and streaming)
  • Use AI-assisted coding tools (e.g., GitHub Copilot, Claude, Gemini) to accelerate code development, refactoring, and debugging
  • Implement robust data quality, testing, lineage, and governance frameworks
  • Drive best practices across pipeline performance, reusability, and scalability
  • Mentor junior engineers and contribute to capability building within the data team

Requirements

  • 6+ years of experience in data engineering, with expertise in:
  • End-to-end pipeline development (batch and streaming)
  • Data modeling (dimensional, Data Vault, OBT)
  • ETL/ELT design patterns, performance tuning, and optimization
  • Advanced SQL and Python
  • Apache Spark for large-scale data processing
  • Proficiency using AI coding tools (e.g., Copilot, Claude, Gemini) to enhance productivity and code quality
  • Strong understanding of data quality frameworks, unit testing, and CI/CD for data workflows

Preferred Qualifications

  • Experience with Google Cloud Platform services: BigQuery, Dataflow, Cloud Composer, Pub/Sub, Dataproc, Vertex AI
  • Exposure to finance or sales data domains
  • Familiarity with Databricks, Delta Lake, or Apache Iceberg
  • GCP Professional Data Engineer certification is a plus

What We Offer

  • Opportunity to work on modern data platforms with GenAI integration
  • Access to professional development support and cloud certification sponsorship
  • Competitive compensation and flexible work arrangements
  • A fast-paced, high-impact environment where innovation is valued

(ref:hirist.tech)
