
Salesforce Senior Data Engineer – R01561387

Brillio
Full Time · Senior
Karnataka, IN · Posted March 17, 2026


Job Description


Primary Skills

  • DataStream, ETL Fundamentals, SQL, SQL (Basic + Advanced), Python, Data Warehousing, Time Travel and Fail Safe, Snowpipe, SnowSQL, Modern Data Platform Fundamentals, PLSQL, T-SQL, Stored Procedures
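For context on the Snowflake-specific items above: Time Travel lets a query read a table as it existed at an earlier point, via the documented `AT (OFFSET => ...)` and `BEFORE (STATEMENT => ...)` clause forms. As a minimal illustrative sketch (the table name, offset, and helper function are invented for this example, not part of any real codebase), a Python helper rendering those clauses might look like:

```python
from typing import Optional


def time_travel_query(table: str, *, offset_seconds: Optional[int] = None,
                      statement_id: Optional[str] = None) -> str:
    """Render a Snowflake Time Travel SELECT (illustrative helper).

    Uses the documented AT (OFFSET => ...) and BEFORE (STATEMENT => ...)
    clause forms; the table and parameter values are purely hypothetical.
    """
    if offset_seconds is not None:
        # Negative offset: read the table as of this many seconds ago.
        return f"SELECT * FROM {table} AT (OFFSET => -{offset_seconds})"
    if statement_id is not None:
        # Read the table state from just before the given statement ran.
        return f"SELECT * FROM {table} BEFORE (STATEMENT => '{statement_id}')"
    return f"SELECT * FROM {table}"


sql = time_travel_query("orders", offset_seconds=3600)
print(sql)
```

Fail-safe, by contrast, is a non-queryable recovery window that begins after the Time Travel retention period ends, and Snowpipe handles continuous micro-batch loading from stage files.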

Specialization

  • Snowflake Engineering: Data Engineer

Job requirements

Job Title: Senior Data Engineer – DBT & Python
Experience: 5+ Years

Job Summary

We are seeking a highly skilled Senior Data Engineer with strong expertise in DBT, Python, and SQL to design, develop, and optimize scalable data pipelines and ETL processes. The ideal candidate should have hands-on experience in building robust data transformation workflows, working with Snowflake, and leveraging cloud platforms for analytics and reporting solutions.

Key Responsibilities
  • Design, develop, and maintain DBT models, transformations, and SQL code for analytics and reporting.
  • Build scalable and efficient ETL pipelines using DBT and other relevant tools.
  • Write complex and high-performance SQL queries to process and analyze large datasets.
  • Develop clean, scalable, and efficient code using Python, with strong hands-on experience in Pandas and NumPy.
  • Optimize data pipelines for performance, scalability, and cost efficiency.
  • Troubleshoot and resolve ETL and data pipeline performance issues.
  • Develop scripts using Unix Shell scripting, Python, and other scripting tools for data extraction, transformation, and loading.
  • Write and optimize Snowflake SQL queries and support Snowflake implementations.
  • Work with orchestration tools such as Airflow or similar workflow management platforms.
  • Integrate user-facing elements into applications where required.
  • Collaborate with internal stakeholders to understand business requirements and translate them into technical solutions.
  • Ensure data quality, validation, and testing within DBT workflows.

Required Skills & Qualifications
  • 6+ years of overall IT experience.
  • Proven hands-on experience with DBT (Data Build Tool) including model development, transformations, and testing.
  • Strong programming expertise in Python (mandatory).
  • Advanced proficiency in SQL, including writing complex queries on large datasets.
  • Hands-on experience in designing and maintaining ETL pipelines.
  • Experience in Unix Shell scripting.
  • Strong understanding of data warehousing concepts and best practices.

Preferred Qualifications
  • Experience with Snowflake implementation and optimization.
  • Knowledge of Salesforce CDP.
  • Experience with Airflow or other data orchestration tools.
  • Exposure to cloud platforms such as AWS, GCP, or Azure.
  • Experience working with cloud storage solutions like S3, GCS, or Azure Blob Storage.
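The responsibilities above center on transformation logic in Python with Pandas and NumPy. As a minimal hypothetical sketch of that kind of staging work (the column names, cleaning rules, and discount threshold are invented for illustration, not taken from the role), a typical transform might look like:

```python
import numpy as np
import pandas as pd


def clean_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical staging transform: normalize types, drop bad rows,
    and derive a net-amount column, in the style of a dbt staging model."""
    df = raw.copy()
    # Coerce amounts to numeric; unparseable values become NaN.
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    # Drop rows with a missing order id or no usable amount.
    df = df.dropna(subset=["order_id", "amount"])
    # Vectorized derivation with NumPy: 10% discount on amounts over 100.
    df["net_amount"] = np.where(df["amount"] > 100, df["amount"] * 0.9, df["amount"])
    return df.reset_index(drop=True)


raw = pd.DataFrame({
    "order_id": ["A1", "A2", None, "A4"],
    "amount": ["50", "200", "75", "oops"],
})
clean = clean_orders(raw)
print(clean)
```

In a dbt project the same filtering and derivation would usually live in SQL models with schema tests, with Python reserved for logic that is awkward to express in SQL.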
