Job Description
Location: Hybrid - 44 King Street W, 19th Floor, Toronto; onsite 1 day/week (flexible)
Subject to change: 3-4 days onsite may be required based on business needs
Contract Duration: 4 months (due to funding, but contractor will most likely be extended)
Possibility of extension & conversion to FTE
Schedule Hours: 9am-5pm Monday-Friday; standard 37.5 hrs/week
Project:
Support the ABAC (anti-bribery, anti-corruption) and data ingestion changes for the Third Party Data & Insights initiative by automating data ingestion processes. The project is for Global Procurement, providing data for reporting purposes so BNS has line of sight on spend: data on purchasing, vendors, suppliers, and services. The project is in its initial phase.
Typical Day in Role:
- Implement data ingestion pipelines / DevOps on-prem (SQL Server / SSIS / ETL) and/or in the cloud (Databricks on Azure or GCP) to move data from source systems to our central data warehouse. Note that the cloud piece will come later, so for the next few months the focus will be on-prem with SQL Server
- Lead data initiatives as the technical lead / technical PM, from intake, requirements, and estimation through solution design, development, testing, and productionization of data processes
- Lead developers (contractors as well as full-time employees) to enable data processes
- Develop SQL queries / ETL packages using T-SQL, SSIS, and Python to ingest data from source systems and integrate it with the data warehouse
- Develop Power Apps / Power Automate / SharePoint solutions to capture information from end users
- Provide a smooth transition to the Operations team to ensure processes are sustained effectively once they go live
- Support the design and development of high-quality dashboards / reports using tools like Power BI, SSRS, Excel/PowerPivot, and Tableau
- Complete testing to ensure a high-quality product for clients
- Develop SQL views to enable data consumers to read data from the data warehouse
- Integrate systems via REST APIs, database views, SharePoint lists, SFTP, etc.
- Document databases and data process flows; maintain data dictionaries
- Troubleshoot and resolve database and application defects in a timely manner; consult with internal and external groups as required
- Coordinate code deployments / production implementations
- Train and assist users at all levels
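The ingestion duties above follow a common staging-then-merge pattern: land source rows in a staging table, then merge them into the warehouse table. A minimal sketch of that pattern is below. The real pipeline would use SQL Server with T-SQL/SSIS (where the merge step would be a T-SQL MERGE); sqlite3 stands in here so the example is self-contained, and the table and column names (stg_vendors, dw_vendors) are purely illustrative.

```python
# Staging-then-merge ingestion sketch. sqlite3 stands in for SQL Server;
# names are hypothetical, not from the actual project.
import sqlite3

def ingest(conn, source_rows):
    """Load source rows into a staging table, then merge into the warehouse table."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS stg_vendors (vendor_id INTEGER, name TEXT, spend REAL)")
    cur.execute("CREATE TABLE IF NOT EXISTS dw_vendors (vendor_id INTEGER PRIMARY KEY, name TEXT, spend REAL)")
    cur.execute("DELETE FROM stg_vendors")  # truncate staging before each load
    cur.executemany("INSERT INTO stg_vendors VALUES (?, ?, ?)", source_rows)
    # Upsert from staging: the SQLite equivalent of a T-SQL MERGE.
    # (WHERE true disambiguates the ON CONFLICT clause after a SELECT.)
    cur.execute("""
        INSERT INTO dw_vendors (vendor_id, name, spend)
        SELECT vendor_id, name, spend FROM stg_vendors WHERE true
        ON CONFLICT(vendor_id) DO UPDATE SET name = excluded.name, spend = excluded.spend
    """)
    conn.commit()

conn = sqlite3.connect(":memory:")
ingest(conn, [(1, "Acme Supply", 12000.0), (2, "Globex", 8500.0)])
ingest(conn, [(2, "Globex Corp", 9000.0)])  # re-running updates the existing vendor
rows = conn.execute("SELECT vendor_id, name, spend FROM dw_vendors ORDER BY vendor_id").fetchall()
print(rows)  # [(1, 'Acme Supply', 12000.0), (2, 'Globex Corp', 9000.0)]
```

Truncating staging on each run keeps loads idempotent, which matters for the "processes are sustained effectively once they go live" handoff to Operations.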
Candidate Value Proposition:
We're looking for a passionate software engineer who thrives on solving data challenges and building reliable pipelines that fuel decision-making. If you're excited by the idea of working with modern data platforms and contributing to meaningful initiatives, this is your opportunity to grow and make a difference. Potential exposure to Azure Databricks and other new technologies.
Candidate Requirements/Must Have Skills:
1) 10 years of relevant working experience as a developer
2) 5+ years' experience as a SQL Server developer (SQL Server, T-SQL, ETL / SSIS) developing data ingestion processes to warehouse data
3) 5+ years' experience with database design and data warehousing
Nice-To-Have Skills:
1) Experience with data solutions on Azure (e.g., Databricks or Synapse/ADF) or GCP (Google Cloud Platform) a plus
2) Experience implementing DevOps / MS Azure cloud services
3) Experience with web development (in .NET / SharePoint / Power Apps / Power Platform) a plus
4) Experience in the banking sector
5) Experience developing code to consume REST API services
Education
- Bachelor's degree in a technical field
- Certifications - Databricks, SQL, SQL Server - nice to have
Best vs. Average Candidate:
Quality is very important; we need someone who can focus on delivering quality results and is meticulous.
Someone who can work independently and serve as a tech lead: able to solicit requirements and come up with a design, not just a heads-down coder; someone who is proactive.
Strong SQL server developer
Other technologies they have worked with: SharePoint Online, Azure Databricks, Azure Data Factory or Azure Synapse
Some data integration projects require the developer to retrieve data from a REST API, so knowledge of or experience with that is a plus.
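Retrieving data from a REST API for ingestion typically means following pagination until the API reports no further pages. The sketch below shows that loop; a real integration would issue HTTP GETs (e.g. with the requests library) against the vendor's endpoint, but fetch_page is stubbed here so the example is self-contained, and the field names ("items", "next", "vendor_id") are hypothetical.

```python
# Paginated REST retrieval sketch. fetch_page is a stub standing in for an
# HTTP GET of something like /vendors?page=N returning parsed JSON; the
# response shape is an assumption for illustration.
def fetch_page(page):
    """Stub: pretend to GET one page of vendor records as a JSON body."""
    pages = {
        1: {"items": [{"vendor_id": 1}, {"vendor_id": 2}], "next": 2},
        2: {"items": [{"vendor_id": 3}], "next": None},
    }
    return pages[page]

def fetch_all(first_page=1):
    """Follow 'next' page pointers until the API reports no further pages."""
    items, page = [], first_page
    while page is not None:
        body = fetch_page(page)
        items.extend(body["items"])
        page = body["next"]
    return items

records = fetch_all()
print(len(records))  # 3
```

Once collected, records like these would feed the same staging tables as any other source system.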
Candidate Review & Selection - Interview Process
1 round - MS Teams video interview - 1 hour - with the HM and an architect
Behavioural and situational questions