Job Description
Role Overview
We are seeking a Senior Data Engineer with deep Databricks expertise to help design and implement modern enterprise data platforms. The role blends strong software engineering practices with advanced data platform architecture, requiring hands-on experience building scalable pipelines and designing robust data systems in the cloud. The engineer will contribute to platform architecture, implement reliable data solutions, and ensure best practices across development, testing, deployment, and governance within the data platform ecosystem.
Responsibilities
- Design and implement scalable enterprise data platforms using Databricks and AWS.
- Architect and oversee end-to-end data platform implementations.
- Develop robust data pipelines using Python, PySpark, and SQL.
- Apply strong engineering practices including testing, CI/CD, and version control.
- Implement and manage Databricks platform components including Delta Lake, Unity Catalog, Databricks Asset Bundles, and Lakeflow Jobs.
- Collaborate with engineering teams to ensure maintainable, scalable, and secure data solutions.
- Implement Infrastructure as Code practices to support repeatable environments.
- Contribute to architectural decisions and guide best practices for the data platform.
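To give a flavor of the pipeline work described above, here is a minimal sketch of a typical cleaning step. It is a pure-Python stand-in for what would normally be a PySpark DataFrame transformation; the function and field names (`clean_records`, `id`, `amount`) are hypothetical, not part of this role's actual codebase.

```python
# Illustrative pipeline step: drop rows missing a primary key and
# normalize amounts to floats. A production version would operate on
# a PySpark DataFrame; all names here are hypothetical.

def clean_records(records):
    """Return records with missing ids dropped and amounts cast to float."""
    cleaned = []
    for row in records:
        if row.get("id") is None:
            continue  # skip rows without a primary key
        cleaned.append({"id": row["id"], "amount": float(row.get("amount", 0))})
    return cleaned

raw = [
    {"id": 1, "amount": "10.5"},
    {"id": None, "amount": "3.0"},  # dropped: missing id
    {"id": 2},                      # missing amount defaults to 0.0
]
print(clean_records(raw))  # → [{'id': 1, 'amount': 10.5}, {'id': 2, 'amount': 0.0}]
```

Keeping transformations as small, pure functions like this makes them straightforward to unit-test before porting to distributed execution.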
Requirements
- Deep expertise in Databricks, including platform architecture and best practices.
- Experience as a Solution Architect or Data Platform Owner designing end-to-end implementations.
- Strong programming experience with Python, PySpark, and SQL.
- Experience using testing frameworks such as PyTest.
- Solid experience with Git-based workflows, CI/CD pipelines, and DevOps practices.
- Hands-on experience with Delta Lake, Unity Catalog, Databricks Asset Bundles (DABs), and Lakeflow Jobs.
- Experience with Infrastructure as Code using Terraform.
- Strong AWS experience, particularly S3 and IAM roles.
- Strong communication and collaboration skills in distributed teams.
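As a hedged sketch of the PyTest-style testing mentioned in the requirements: PyTest collects any function named `test_*` and runs its plain `assert` statements. The transform under test (`dedupe_ids`) is a hypothetical example, not taken from this posting.

```python
# Hedged example of a PyTest-style unit test for a pipeline helper.
# PyTest discovers test_* functions automatically; the asserts also
# run directly without the framework.

def dedupe_ids(ids):
    """Remove duplicate ids while preserving first-seen order."""
    seen = set()
    out = []
    for i in ids:
        if i not in seen:
            seen.add(i)
            out.append(i)
    return out

def test_dedupe_ids_preserves_order():
    assert dedupe_ids([3, 1, 3, 2, 1]) == [3, 1, 2]
    assert dedupe_ids([]) == []

test_dedupe_ids_preserves_order()  # runs under pytest or standalone
```

In a CI/CD pipeline, tests like this would gate merges before Databricks Asset Bundle deployments.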
Nice to Have
- Experience designing lakehouse architectures at scale.
- Experience optimizing distributed data workloads.
- Experience working in consulting or client-facing engineering environments.
- Experience implementing governance, security, and data lineage frameworks.
- Strong hands-on experience with AWS Glue (ETL, jobs, crawlers, workflows).
Salary
Salary range: CA$80,000 - CA$150,000 annually, with final compensation determined by your qualifications, expertise, experience, and the role's scope.
Location
This is a fully remote position. Applicants must currently reside in Canada and be based in a region aligned with the Pacific, Central, or Eastern time zones to ensure effective collaboration with client and team schedules.
Benefits
In addition to competitive pay, we offer a variety of benefits to support your professional and personal growth, including:
- Flexible working hours in a remote environment.
- Health insurance (medical and dental) for T4 Employees.
- A professional development fund to enhance your skills and knowledge.
- 15 days of paid time off annually.
- Access to soft-skill development courses to further your career.
Position Details
This is a full-time position requiring a minimum of 40 hours per week, Monday through Friday.
Application Deadline
This is an evergreen position with no predetermined start date. Applications will be accepted until March 29, 2026, and the role may be reposted as we continue to build our talent pipeline and connect with additional qualified professionals.