
Snowflake Data Engineer

VySystems
Full Time · Mid-level
Gatineau, Quebec, CA · Posted April 3, 2026

Resume Keywords to Include

Make sure these keywords appear in your resume to improve ATS scoring

Python, SQL, AWS, GCP, Azure, Terraform, Jenkins, GitHub Actions, Snowflake, GitHub, Kafka, dbt, CI/CD


Job Description


Responsibilities and Duties

  • Lead design and implementation of Snowflake-based data architectures: schemas, data vault and star-schema models, materialized views, and zero-copy cloning patterns for environment management.
  • Build and maintain production ETL/ELT pipelines into Snowflake using Snowpipe, Snowpark, Streams & Tasks, and partner tools (StreamSets, dbt, Fivetran, Matillion, Airbyte, etc.).
  • Develop Snowflake-native utilities and apps (Snowpark for Python, UDFs, external functions, and internal tools) to accelerate developer productivity and data product delivery.
  • Optimize query performance and cost through clustering keys, partitioning strategies, resource monitors, warehouse sizing, and workload isolation.
  • Implement data governance, security and access controls in Snowflake based on role-based access, masking policies, object tagging, data lineage and audit logging.
  • Automate infrastructure and deployments: IaC for Snowflake objects and cloud infrastructure, CI/CD pipelines, and automated testing for SQL/Snowpark code.
  • Build observability and operational tooling: monitoring, alerting, usage/cost reporting, and incident playbooks for Snowflake workloads.
  • Mentor engineers, review designs, and contribute to roadmap decisions for Snowflake platform evolution.
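To illustrate the Streams & Tasks pipeline pattern mentioned above, here is a minimal sketch that generates the corresponding Snowflake DDL. The object names (`ORDERS`, `ORDERS_CURATED`, `ETL_WH`) are illustrative assumptions, not part of this posting.

```python
# Sketch: build Snowflake DDL for a change-data-capture pipeline using
# Streams & Tasks. All object names here are hypothetical examples.

def stream_task_ddl(source_table: str, target_table: str,
                    warehouse: str, schedule_minutes: int = 5) -> list[str]:
    """Return DDL for a stream on `source_table` plus a scheduled task
    that loads new change rows into `target_table`."""
    stream_name = f"{source_table}_STREAM"
    task_name = f"LOAD_{target_table}"
    return [
        # Stream tracks inserts/updates/deletes on the source table.
        f"CREATE OR REPLACE STREAM {stream_name} ON TABLE {source_table};",
        # Task runs on a schedule, but only when the stream has data.
        f"CREATE OR REPLACE TASK {task_name} "
        f"WAREHOUSE = {warehouse} "
        f"SCHEDULE = '{schedule_minutes} MINUTE' "
        f"WHEN SYSTEM$STREAM_HAS_DATA('{stream_name}') "
        f"AS INSERT INTO {target_table} SELECT * FROM {stream_name};",
        # Tasks are created suspended; resume to start the schedule.
        f"ALTER TASK {task_name} RESUME;",
    ]

for stmt in stream_task_ddl("ORDERS", "ORDERS_CURATED", "ETL_WH"):
    print(stmt)
```

In practice these statements would be executed through `snowflake-connector-python` or managed as IaC; generating them as strings keeps the sketch self-contained.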

Required skills and experience

  • Strong hands-on experience designing and operating Snowflake in production.
  • Deep experience with Snowflake features such as Snowpark, Streams & Tasks, Snowpipe, Time Travel, cloning, materialized views, external functions, and user-defined functions.
  • Hands-on ETL/ELT development experience with dbt, SQL, and one or more ingestion tools (StreamSets, Fivetran, Matillion, Airbyte, Kafka connectors).
  • Proficiency in Python (Snowpark/connector), SQL tuning, and query optimization techniques.
  • Experience with IaC and automation (Terraform, GitHub Actions, Jenkins, or equivalent).
  • Strong knowledge of cloud platforms and native services (AWS, Azure or GCP) as they relate to Snowflake deployment and integrations.
  • Solid understanding of medallion architecture, data modeling patterns, data governance, and secure data sharing.
  • Demonstrated ability to implement CI/CD, automated testing and production operational practices for data workloads.
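The "automated testing for data workloads" requirement above can be sketched as a tiny not-null/uniqueness check in the style of dbt's schema tests. `sqlite3` stands in for a Snowflake connection here so the example is self-contained; against the real platform you would swap in a `snowflake-connector-python` connection with the same SQL.

```python
# Sketch of an automated data-quality test. sqlite3 is a local stand-in
# for a Snowflake connection; table/column names are hypothetical.
import sqlite3

def failing_rows(conn, table: str, column: str) -> dict:
    """Count rows violating not-null and uniqueness constraints on `column`."""
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL").fetchone()[0]
    dupes = conn.execute(
        f"SELECT COUNT(*) FROM (SELECT {column} FROM {table} "
        f"WHERE {column} IS NOT NULL GROUP BY {column} HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    return {"null_rows": nulls, "duplicate_keys": dupes}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 9.5), (2, 3.0), (2, 4.0), (None, 1.0)])
print(failing_rows(conn, "orders", "order_id"))
# → {'null_rows': 1, 'duplicate_keys': 1}
```

Wired into a CI/CD pipeline (GitHub Actions, Jenkins), a non-zero result would fail the build before a deployment promotes bad data.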

Preferred qualifications

  • Snowflake SnowPro Core or advanced Snowflake certifications.
  • Experience with dbt (core or Cloud) for transformation and modular SQL engineering.
  • Experience with data virtualization, data catalogs or data lineage tools.
  • Familiarity with analytics and BI integrations (Looker, Tableau, Power BI) and building Snowflake-optimized semantic layers.
  • Experience building internal developer tools or data apps using Snowpark or lightweight web frameworks.
