Senior Data Quality Engineer
Bridgenext, Inc
Job Description
Company Overview:
Bridgenext is a digital consulting services leader that helps clients innovate with intention and realize their digital aspirations by creating digital products, experiences, and solutions around what real people need. Our global consulting and delivery teams facilitate highly strategic digital initiatives through digital product engineering, automation, data engineering, and infrastructure modernization services, while elevating brands through digital experience, creative content, and customer data analytics services.
Don't just work, thrive. At Bridgenext, you have an opportunity to make a real difference - driving tangible business value for clients, while simultaneously propelling your own career growth. Our flexible and inclusive work culture provides you with the autonomy, resources, and opportunities to succeed.
Position Description:
Bridgenext is seeking a Senior Data Quality Engineer to join a modern, cloud-based data platform team. This role is critical to ensuring the accuracy, completeness, reliability, and trustworthiness of enterprise data assets across the organization.
The ideal candidate is deeply passionate about data integrity and brings hands-on experience designing and automating robust data quality frameworks in a cloud environment, with strong exposure to Azure and Databricks. You will embed quality checks across the entire data lifecycle, from ingestion through consumption, ensuring data is fit for business-critical and client-facing use cases.
Responsibilities include but are not limited to:
Data Quality Framework & Automation
- Design, build, and maintain a scalable, automated data reconciliation framework using Python, PySpark, and Databricks
- Develop and implement automated data quality checks, validation rules, and monitoring processes within ETL / ELT pipelines on Azure
- Embed data quality controls directly into ingestion and transformation workflows, enabling a “quality by design” approach
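As an illustrative sketch of the embedded, "quality by design" checks described above (all names here are hypothetical, and in production this logic would run as a PySpark transformation on Databricks rather than a Python loop):

```python
from typing import Any


def run_quality_checks(rows: list[dict[str, Any]], key: str,
                       required: list[str]) -> dict[str, int]:
    """Count basic quality violations before rows are published downstream.

    Hypothetical example: checks duplicate business keys and missing
    required fields at ingestion time, so bad records are caught early.
    """
    seen: set[Any] = set()
    duplicates = 0
    missing = 0
    for row in rows:
        # Flag duplicate business keys (fail fast at ingestion).
        if row.get(key) in seen:
            duplicates += 1
        seen.add(row.get(key))
        # Flag rows where any required field is null or absent.
        if any(row.get(col) is None for col in required):
            missing += 1
    return {"rows": len(rows), "duplicate_keys": duplicates,
            "missing_required": missing}


# Two rows sharing a key, one with a null amount:
report = run_quality_checks(
    [{"id": 1, "amt": 10}, {"id": 1, "amt": None}],
    key="id", required=["amt"],
)
# report -> {"rows": 2, "duplicate_keys": 1, "missing_required": 1}
```

A real framework would emit these counts to a metrics store and block or quarantine the batch when thresholds are breached.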
Data Profiling & Analysis
- Write advanced SQL and Python (PySpark) scripts to profile datasets and identify anomalies
- Diagnose data quality issues and perform root cause analysis across complex data pipelines
- Collaborate with source system owners and data stewards to drive remediation and long-term quality improvements
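The profiling described above can be sketched in plain Python as follows; this is a minimal illustration only, since the posting's actual profiling would use advanced SQL or PySpark over full datasets:

```python
from typing import Any


def profile(rows: list[dict[str, Any]], columns: list[str]) -> dict[str, dict]:
    """Basic column profile: null rate and distinct-value count per column.

    Illustrative stdlib sketch; anomalies such as a sudden jump in null
    rate or distinct count are typical signals for root cause analysis.
    """
    stats: dict[str, dict] = {}
    for col in columns:
        values = [row.get(col) for row in rows]
        nulls = sum(1 for v in values if v is None)
        distinct = len({v for v in values if v is not None})
        stats[col] = {
            "null_rate": nulls / len(values) if values else 0.0,
            "distinct": distinct,
        }
    return stats


summary = profile(
    [{"a": 1}, {"a": None}, {"a": 1}],
    columns=["a"],
)
# summary["a"] -> null_rate of 1/3 and a single distinct non-null value
```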
API Automation & Testing
- Develop Python-based automation to validate data ingested from internal and external APIs
- Ensure data integrity, schema compliance, and adherence to defined data contracts
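A hedged sketch of the schema-compliance idea above: the contract shape below is hypothetical (real data contracts would also cover nullability, value ranges, and nested schemas), and the actual automation would fetch payloads from the APIs rather than take them as arguments:

```python
def validate_contract(payload: dict, contract: dict[str, type]) -> list[str]:
    """Return a list of contract violations for one API record.

    `contract` maps field name -> expected Python type. An empty list
    means the record conforms to the defined data contract.
    """
    errors = []
    for field, expected in contract.items():
        if field not in payload:
            errors.append(f"missing field: {field}")
        elif not isinstance(payload[field], expected):
            errors.append(f"bad type for {field}: expected {expected.__name__}")
    return errors


# A record with a wrong type and a missing field:
violations = validate_contract({"id": "x"}, {"id": int, "name": str})
# violations -> ["bad type for id: expected int", "missing field: name"]
```

In practice a check like this would run inside pytest against responses fetched with `requests`, failing the pipeline when any violation is reported.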
Monitoring, Metrics & Reporting
- Define, track, and monitor Data Quality Metrics (DQMs)
- Build dashboards and reports using Power BI and/or Databricks SQL to provide visibility into data health for both technical and business stakeholders
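A DQM of the kind described above can be as simple as a pass rate with a threshold-based status; the function and the 99% threshold below are illustrative only, with real metrics surfaced through Power BI or Databricks SQL dashboards:

```python
def dq_metric(passed: int, total: int, threshold: float = 0.99) -> dict:
    """Compute a simple Data Quality Metric: pass rate plus a red/green status.

    Hypothetical example; an empty population is treated as fully passing.
    """
    rate = passed / total if total else 1.0
    return {
        "pass_rate": round(rate, 4),
        "status": "green" if rate >= threshold else "red",
    }


healthy = dq_metric(995, 1000)   # 99.5% pass rate -> green
degraded = dq_metric(98, 100)    # 98.0% pass rate -> red
```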
Governance & Documentation
- Support implementation of data governance, lineage, and security controls
- Maintain comprehensive documentation for data quality rules, standards, processes, and frameworks
Workplace: This is a remote role requiring full-time Eastern Time working hours and can be based in Ontario, Alberta, or British Columbia
Must Have Skills:
- 5+ years of experience in data quality, data engineering, or QA focused roles
- Strong programming expertise in Python for automation, scripting, and data manipulation
- Hands-on experience building and scheduling PySpark jobs on Databricks
- Advanced SQL skills for complex querying, profiling, and analysis
- Solid understanding of ETL / ELT pipelines and large-scale data processing
- Proven experience with Azure data services, including:
  - Azure Data Lake Storage (ADLS)
  - Azure Data Factory (ADF)
- Strong understanding of data warehousing concepts, including dimensional modeling
- Experience working with large, enterprise-scale data pipelines
Preferred Skills:
- Experience automating tests for RESTful APIs using Python libraries such as requests or pytest
- Familiarity with CI/CD and DevOps practices for data platforms (e.g., Jenkins, GitHub Actions)
- Experience with data quality frameworks and tools such as Great Expectations or Amazon Deequ
- Exposure to Infrastructure as Code (IaC) tools such as Terraform or ARM templates
- Working knowledge of Agile delivery methodologies
- Exposure to insurance and banking domain terminology and regulated data environments
Professional Skills:
- Solid written, verbal, and presentation communication skills
- Works effectively both as a team member and independently
- Maintains composure in all types of situations and is collaborative by nature
- High standards of professionalism, consistently producing high-quality results
- Self-sufficient and independent, requiring very little supervision or intervention
- Demonstrates flexibility and openness, bringing creative solutions to address issues
Bridgenext is an Equal Opportunity Employer
Canadian citizens and those authorized to work in Canada are encouraged to apply
Compensation varies depending on a wide array of factors, which may include but are not limited to location, role, and skill.