
Senior Data Engineer (GCP, BigQuery, AI)

Dawn InfoTek
Toronto, Ontario, CA
Posted March 17, 2026


Job Description

Dawn InfoTek Inc. is a professional IT consulting firm that partners with major financial institutions, investment firms, and government organizations. We are dedicated to delivering cutting-edge consulting services and recruiting top IT talent across all levels for our clients.

We are currently seeking a Senior Data Engineer to join a dynamic development team supporting one of our major financial clients.

Contract duration: 6 months with possibility of extension

Hybrid: 2-3 days in office required (downtown Toronto)

The successful candidate will have an opportunity to be involved in designing and implementing data solutions using Google Cloud, working closely with enterprise teams, architects, business analysts, and data engineers.

To join our team, you must be proactive and dynamic, demonstrate initiative, have an eagerness to learn, be adaptable to a high-paced environment, and thrive on challenge.

Typical day in role:

  • Build distributed, reliable and scalable data pipelines to ingest and process data from/to multiple data sources
  • Take the lead in designing and building production data pipelines, from data ingestion to consumption, using GCP services, PySpark, Python, and BigQuery
  • Create deployment scripts and promote changes to different environments using Terraform
  • Use Vertex AI/Kubeflow to create AI model pipelines that orchestrate ELT tasks and AI model creation tasks (i.e. training, tuning, prediction), and support the model team's development of Vertex AI configuration (e.g. feature store, model registry)
  • Develop Cloud Composer/Airflow DAGs to orchestrate ELT tasks and AI model creation tasks (i.e. training, tuning, prediction)
  • Apply Databricks knowledge to provide recommendations or guidelines for code development that could run on that platform for a future migration
  • Provide guidance and troubleshooting on implementing application logging, notifications, job monitoring, and performance monitoring
  • Apply an understanding of AI model development and deployment concepts
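The Composer/Airflow work above amounts to declaring tasks and their dependencies, then letting the scheduler run them in order. A minimal sketch of that idea in plain Python (using only the standard library, so it runs without Airflow; all task names are hypothetical, not from this posting):

```python
# Sketch of DAG-style orchestration: tasks plus dependency edges,
# executed in topological order -- the concept a Composer/Airflow DAG encodes.
from graphlib import TopologicalSorter

def ingest():      return "raw rows ingested"
def transform():   return "rows transformed"
def load_bq():     return "loaded to BigQuery"
def train_model(): return "model trained"

# Each task maps to the set of tasks it depends on.
dag = {
    "transform":   {"ingest"},
    "load_bq":     {"transform"},
    "train_model": {"load_bq"},
}

tasks = {
    "ingest": ingest,
    "transform": transform,
    "load_bq": load_bq,
    "train_model": train_model,
}

def run_pipeline():
    """Execute every task in dependency order; return an execution log."""
    order = list(TopologicalSorter(dag).static_order())
    return [(name, tasks[name]()) for name in order]

if __name__ == "__main__":
    for name, result in run_pipeline():
        print(f"{name}: {result}")
```

In a real Composer environment, each function would be an Airflow operator and the `dag` mapping would be expressed as task dependencies inside a DAG definition file.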

Requirements/Must-Have Skills:

  • You have 5+ years of experience as a Data Engineer in a challenging IT system solution environment with strong communication and problem-solving skills
  • Experience in managing and operations of large-scale data systems and applications (5+ years)
  • Experience in automation of large-scale systems and applications (5+ years)
  • Nice to have: FI/banking/AML experience, Google IAM, API concepts, Agile and DevOps team experience

Education

  • Bachelor of Computer Science or Information Systems preferred.
