
Machine Learning Specialist

Confidential
Full Time · Senior
Delhi, IN · Posted April 21, 2026


Job Description

Job Title: MLOps Engineer (6–8 Years Experience)

Location: Offshore (India)

Working Hours: 8 AM – 5 PM UK Time

Engagement Overview:

The initial focus will be on migrating existing Data Science models over the next few months, followed by active involvement in Data Science model creation, operations, and enhancements.

Role Overview

Our client operates data workflows in GCP and is migrating its Data Science workloads to Azure Databricks. Input data will continue to originate in GCP, while workflows will be executed in Azure Databricks, and model outputs written back to GCP.

The MLOps Engineer will be pivotal in:

  • Supporting model migration
  • Building and optimizing MLOps workflows
  • Contributing to broader data movement automation and self-serve capabilities across cloud environments

This role requires deep expertise in Databricks, PySpark, CI/CD orchestration, and ML model operationalization, along with strong familiarity with GCP and Azure.

Key Responsibilities

1. Model Migration & Optimization

  • Migrate existing ML models from GCP to Azure Databricks; replicate and optimize architecture.
  • Work closely with the Data Science team to operationalize and optimize migrated models for cost efficiency and testing coverage.

2. MLOps & Workflow Orchestration

  • Implement robust CI/CD pipelines using GitHub Actions for ML model deployments.
  • Utilize MLflow for tracking, versioning, and managing model lifecycle.
  • Develop scalable Data & ML pipelines using Databricks + PySpark.
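To illustrate the first bullet, a minimal GitHub Actions workflow for deploying a model to Databricks might look like the sketch below. This is an assumption-laden example, not the client's actual pipeline: the repository layout, secret names, and the `scripts/deploy_model.py` deployment script are all hypothetical.

```yaml
name: deploy-model
on:
  push:
    branches: [main]
    paths: ["models/**"]   # hypothetical path filter: run only on model changes

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install dependencies
        run: pip install mlflow pytest
      - name: Run tests
        run: pytest tests/            # hypothetical test suite
      - name: Deploy to Databricks
        env:
          DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
          DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}
        run: python scripts/deploy_model.py   # hypothetical deployment script
```

In practice the deployment step would register the model via MLflow and update the serving job or Databricks workflow; gating the deploy on the test step is what gives the pipeline its CI/CD character.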

3. Cloud & Data Movement Support

  • Collaborate with Data Engineering for GCP → Azure Databricks data workflows.
  • Take ownership of cross-cloud data movement and build self-serve automation for pipelines.
  • Ensure Azure Databricks outputs are seamlessly transferred back to GCP.
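Cross-cloud transfers like those described above are prone to transient failures, so a common building block is a retry wrapper around each object copy. The sketch below is illustrative only: `transfer_fn` stands in for any callable that copies one object between clouds (e.g. a GCS download paired with an ADLS upload), and all names are hypothetical.

```python
import time


def transfer_with_retry(transfer_fn, uri: str, retries: int = 3,
                        backoff_s: float = 1.0) -> bool:
    """Retry a single cross-cloud object copy with exponential backoff.

    transfer_fn: callable taking the object URI; raises on failure.
    Returns True on success; re-raises the last error after `retries` attempts.
    """
    for attempt in range(retries):
        try:
            transfer_fn(uri)
            return True
        except Exception:
            if attempt == retries - 1:
                raise
            # Exponential backoff: 1x, 2x, 4x, ... the base delay.
            time.sleep(backoff_s * 2 ** attempt)
    return False
```

A self-serve automation layer would typically wrap this per-object helper in a manifest-driven loop, so teams can declare which datasets move between GCP and Azure Databricks without writing transfer code themselves.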

4. Architecture & Best Practices

  • Provide architecture and workflow optimization guidance during and after the migration.
  • Ensure scalable, reliable, and cost-efficient model execution in Databricks.
  • Enhance testing, monitoring, and performance tuning of ML models.

Required Experience

  • 6–8 years in Data Engineering, ML Engineering, or MLOps roles

Must-Have Skills

  • Strong hands-on expertise in Databricks , with deep understanding of its internals
  • Proficiency in PySpark (scalable jobs, execution plans, optimization techniques)
  • CI/CD pipeline creation using GitHub Actions
  • Experience with MLflow for tracking and operationalizing ML models
  • Knowledge of integrating workflows between GCP and Azure
  • Strong debugging, optimization, and cost-efficiency mindset

Good to Have

  • Experience with cross-cloud data movement
  • Familiarity with Data Science model structures and close collaboration with DS teams
  • Exposure to model monitoring and alerts in distributed/cloud environments
