
DevOps Engineer - AI & Data Platforms

Vaco by Highspring
Full Time · Senior
Montreal, Quebec, CA · Posted March 5, 2026


Job Description

Project Scope

Our growing DevOps team within the Data Lab is seeking a DevOps Engineer to support the implementation and operation of AI solutions deployed in the cloud.

The role focuses on the deployment, management, monitoring, and security of the infrastructure supporting AI and data platforms. You will work closely with Data Engineers to improve deployment practices, CI/CD pipelines, ETL processes, and overall DevOps operations.

Key Responsibilities

  • Support deployment, monitoring, and maintenance of AI and data platforms on cloud infrastructure.
  • Implement and maintain Infrastructure as Code (IaC) using Terraform.
  • Develop and maintain CI/CD pipelines using GitHub, GitHub Actions, Jenkins, or similar tools.
  • Manage and optimize cloud infrastructure using AWS services (S3, Lambda, CloudWatch, VPC) or equivalent.
  • Collaborate with Data Engineers to improve the orchestration and deployment of data pipelines and ETL workflows.
  • Promote DevOps best practices and automation frameworks within development teams.
  • Assist with troubleshooting, production support, and root cause analysis of incidents.
  • Contribute to secure development and infrastructure practices across the AI engineering lifecycle.
  • Support frequent and reliable deployments of AI pipelines.

Top Requirements

  • Bachelor's degree in Computer Science, Computer Engineering, or equivalent experience.
  • 7+ years of experience in a DevOps role within software, AI, or data engineering environments.
  • Hands-on experience with cloud platforms (AWS preferred).
  • Strong experience with DevOps tooling and CI/CD pipelines.
  • Experience supporting data pipelines or data engineering teams.

Technologies Required

  • Cloud: AWS (S3, Lambda, CloudWatch, VPC) or GCP
  • Infrastructure as Code: Terraform
  • CI/CD: GitHub, GitHub Actions, Jenkins, GitLab
  • Containerization/Orchestration: Kubernetes
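
To illustrate the kind of Terraform-based IaC work listed above, here is a minimal sketch. All resource names, the bucket name, and the region are hypothetical examples, not taken from the posting; it simply shows provisioning a versioned S3 bucket with the AWS provider.

```hcl
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  # Hypothetical region choice for a Montreal-based team
  region = "ca-central-1"
}

# Hypothetical artifact bucket for AI pipeline outputs
resource "aws_s3_bucket" "pipeline_artifacts" {
  bucket = "example-ai-pipeline-artifacts"
}

# Enable versioning so pipeline artifacts can be rolled back
resource "aws_s3_bucket_versioning" "pipeline_artifacts" {
  bucket = aws_s3_bucket.pipeline_artifacts.id
  versioning_configuration {
    status = "Enabled"
  }
}
```

In practice, configurations like this would be applied through a CI/CD pipeline (e.g. GitHub Actions or Jenkins, as named in the posting) rather than by hand.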

Nice to Have

  • Experience with Databricks
  • Experience in a Snowflake environment
  • Exposure to data pipeline orchestration or ETL workflows
  • Understanding of machine learning lifecycle and AI platforms

Profile & Skills

  • Autonomous and proactive mindset
  • Strong collaboration skills with data and AI engineers
  • Ability to drive initiatives end-to-end
  • Strong communication skills (written and spoken English)
