Senior Data Engineer (Internal Data Platform – Snowflake + dbt)
ROI Solutions
Job Description
About the Role
We’re looking for a Senior Data Engineer to serve as the lead developer on our internal data platform team. You’ll play a central, hands‑on role in building and maintaining a scalable data foundation leveraging Snowflake, dbt, and a modern medallion-style architecture.
This position is ideal for an engineer who brings strong data modeling fundamentals, practical experience with modern transformation tooling, and a desire to build high‑quality, reusable data assets for analytics, operations, and administration using DataOps.
Responsibilities
- Build and maintain dbt models across bronze/silver/gold layers following medallion architecture best practices.
- Design and maintain star and galaxy schemas, fact tables, and dimension models, including Slowly Changing Dimensions (SCDs).
- Develop, optimize, and operationalize Snowflake pipelines with a focus on performance, cost efficiency, and reliability.
- Implement data quality and validation checks using dbt tests, schema tests, and custom macros.
- Collaborate with analysts, engineers, and product teams to ensure models support analytics and operational requirements.
- Build or refine ingestion, transformation, and semantic layers used by downstream applications.
- Implement and maintain CI/CD workflows for data transformations, testing, and deployments using DataOps.
- Document models, lineage, and business logic for clarity and long-term maintainability.
- Evaluate and introduce modern data engineering tools and practices when appropriate.
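As a concrete illustration of the dbt testing responsibility above, a minimal schema test configuration might look like the following (model and column names are hypothetical):

```yaml
# models/silver/schema.yml — hypothetical model and column names
version: 2

models:
  - name: silver_customers
    description: "Deduplicated customer records promoted from the bronze layer."
    columns:
      - name: customer_id
        description: "Surrogate key for the customer."
        tests:
          - unique
          - not_null
      - name: status
        tests:
          - accepted_values:
              values: ['active', 'lapsed', 'archived']
```

Custom macros can extend this with project-specific checks beyond the built-in generic tests.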
Required Qualifications
Core Technical Expertise
- Snowflake
- Practical hands‑on experience designing schemas, stages, tables, streams, and tasks.
- Strong understanding of warehouse performance, resource policies, data sharing, and cost-control practices.
- Able to perform and automate administrative tasks.
- Integration expertise with AWS, Fivetran, and BI tools.
- dbt (Core or Cloud)
- Expertise in developing models, macros, tests, selectors, and documentation.
- Familiarity with structuring dbt projects for maintainability across multiple layers.
- Data Modeling
- Strong command of star schema and dimensional modeling techniques.
- Experience implementing SCDs (Types 1 & 2).
- Understanding of data mart and data lake architecture.
- Medallion Architecture
- Hands-on experience building bronze/silver/gold layers with clear logical boundaries.
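For the SCD requirement above, dbt's built-in snapshots are one common way to capture Type 2 history; a sketch, assuming a hypothetical bronze `customers` source with an `updated_at` column:

```sql
-- snapshots/customers_snapshot.sql
-- Hypothetical source and column names. dbt adds dbt_valid_from /
-- dbt_valid_to columns to track each row's effective date range (SCD Type 2).
{% snapshot customers_snapshot %}

{{
    config(
      target_schema='snapshots',
      unique_key='customer_id',
      strategy='timestamp',
      updated_at='updated_at'
    )
}}

select * from {{ source('bronze', 'customers') }}

{% endsnapshot %}
```

Type 1 behavior (overwrite in place) is typically handled directly in a model rather than a snapshot.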
Data Engineering Skills
- Excellent SQL skills with attention to performance and security.
- Experience with orchestration tools.
- Understanding of data governance, lineage, documentation, and discovery.
- Understanding of multi-tenant data architecture.
- Experience contributing to CI/CD workflows for data transformation pipelines.
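The CI/CD expectation above can be as simple as a pipeline that builds and tests dbt models on every merge request; a minimal GitLab CI sketch (the image tag, target name, and rules are assumptions, not our actual configuration):

```yaml
# .gitlab-ci.yml — hypothetical image tag and target name
stages:
  - test

dbt_build:
  stage: test
  image: ghcr.io/dbt-labs/dbt-snowflake:1.8.0
  script:
    - dbt deps
    - dbt build --target ci --profiles-dir .  # runs models, tests, snapshots, and seeds
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
```

Credentials for the `ci` target would normally come from masked CI/CD variables rather than a committed profiles file.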
Preferred Qualifications (Nice-to-Have)
- Role-based access control (RBAC) and discretionary access control (DAC) implementation strategies for data warehouses and data lakes.
- Familiarity with Data Vault 2.0 (hubs, links, satellites).
- Experience working in an AWS environment (S3, IAM, Glue, Lambda, Step Functions, event-driven architectures).
- Exposure to streaming/event ingestion frameworks (Kafka, Kinesis, etc.).
- Strong CI/CD skillset with IaC tools such as Terraform and GitLab pipelines.
Soft Skills & Mindset
- Strong collaboration skills, especially across engineering, analytics, product, and ML/AI teams.
- High attention to data quality, testing, and documentation.
- Pragmatic and outcome-oriented—balancing robust architecture with timely delivery.
- Curious and proactive about adopting modern data engineering patterns and tools.
Success in This Role Looks Like
- Well‑structured medallion layers with clean transformation logic and strong lineage.
- Development work that follows software development lifecycle best practices.
- High-quality dbt projects with reliable tests, documentation, and automation.
- Snowflake environments that are performant, predictable, and cost-efficient.
- Strong relationships across teams who rely on the internal data platform.
If you don't meet all the requirements mentioned above, don't worry. We strongly believe in creating a diverse and inclusive work environment. If you find this job opportunity interesting but don't meet all the qualifications listed in the job description, we encourage you to apply anyway. You might be the perfect candidate for this role or others like it.
PLEASE NOTE:
- This role is ONLY available for work in the following (23) locations: AL, AR, CO, FL, IL, KY, MA, MD, ME, MI, MN, NC, NH, NJ, PA, SC, TX, VA, VT, WA, WI, WV, and DC.
- This role will be working on Eastern Time (ET) hours.
- This role is posted as remote, but could be hybrid or in-office if that fits your best working style.
Who We Are:
ROI Solutions was founded in 1999 to help nonprofit organizations change the world through innovative technology solutions and services. We are focused on sustainable growth, hiring staff committed to working with the nonprofit sector, and constantly evolving our technology and services to help nonprofits.