
Data Engineer / ETL Developer

Remotica
Full Time · Senior · Hybrid
CA · Posted April 5, 2026

Resume Keywords to Include

Make sure these keywords appear in your resume to improve ATS scoring

Python, Shell, SQL, Jenkins, Snowflake, Git, Jira, Airflow, Agile, Scrum


Job Description

Hybrid onsite at Dallas, TX 75019 / Tampa, FL 33647 / Jersey City, NJ 07310 / Boston, MA 02210. Contract only. Two-round interview process.

Key Responsibilities:
- Serve as a technical expert on the data integration and engineering tools in use.
- Work with the Business System Analyst to ensure designs satisfy functional requirements.
- Collaborate with other technical teams to understand procedures and policies and apply them to application development.
- Design and develop data pipelines using the Talend data integration tool and Python.
- Improve application performance by tuning the SQL used in the database and by optimizing data pipeline efficiency.
- Research and evaluate technical solutions consistent with technology standards.
- Work with cloud technologies on the Amazon Web Services platform.
- Contribute expertise to the design of components or individual programs, and participate in construction and functional testing.
- Engage with the Scrum Master, perform tasks using Agile project management, and participate in daily scrums, retrospectives, and planning.
- Support development teams with testing, troubleshooting, and production support.
- Deploy code through a continuous integration / continuous delivery (CI/CD) pipeline using Jenkins.
- Resolve defects and collaborate with functional and user acceptance testing teams.
- Document processes, procedures, and troubleshooting guides for other teams.

Qualifications:
- Minimum 6 years of related experience.
- Bachelor's degree or equivalent education.

Talents Needed for Success:
- Strong ETL developer with hands-on experience developing large-scale data engineering pipelines for financial services, preferably for risk management.
- Minimum 6 years of related experience building and maintaining large-scale data warehouse applications in the cloud.
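The extract-transform-load pipeline work described above can be sketched in miniature. This is a hedged illustration only, not code from the employer: it uses Python's built-in sqlite3 as a stand-in for Snowflake, and the table name (`trades`) and columns (`trade_id`, `notional`) are hypothetical.

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: coerce types and drop malformed records."""
    out = []
    for r in rows:
        try:
            out.append((r["trade_id"], float(r["notional"])))
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return out

def load(conn, rows):
    """Load: bulk-insert the cleaned rows into the warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS trades (trade_id TEXT, notional REAL)")
    conn.executemany("INSERT INTO trades VALUES (?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    raw = "trade_id,notional\nT1,100.5\nT2,oops\nT3,42\n"
    conn = sqlite3.connect(":memory:")
    load(conn, transform(extract(raw)))
    print(conn.execute("SELECT COUNT(*), SUM(notional) FROM trades").fetchone())
    # prints (2, 142.5) -- the malformed T2 row is filtered out
```

In a production pipeline each stage would typically be a separate Airflow task or Talend job, with the load step targeting Snowflake via its connector rather than sqlite3.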
- Minimum 3 years of experience as a Python developer building complex data pipelines that load data into a Snowflake database.
- Minimum 5 years of hands-on experience writing, tuning, and managing complex SQL, and creating stored procedures and database objects for large-scale data warehouse systems in Snowflake, Oracle, etc.
- Hands-on experience with shell scripts for script-based ELT pipelines using Python or Snowpark.
- Hands-on experience creating and managing Autosys JILs or Airflow for orchestration.
- Knowledge of ETL tools such as Talend or similar, with the ability to create heterogeneous pipelines.
- Strong hands-on experience with code versioning in Bitbucket, Git, and Liquibase, and with managing multiple release versions and CI/CD pipelines.
- Good understanding of enterprise data integration concepts, data warehouse modelling, and data architecture patterns.
- Experience on Agile projects and knowledge of Jira for tracking and updating project tasks.

EEO: "Mindlance is an Equal Opportunity Employer and does not discriminate in employment on the basis of - Minority/Gender/Disability/Religion/LGBTQI/Age/Veterans."
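The SQL-tuning qualification above usually means knowing how the engine executes a query, e.g. verifying that a filter uses an index instead of a full scan. A hedged sketch, again with sqlite3 standing in for Snowflake or Oracle and a hypothetical `positions` table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE positions (book TEXT, symbol TEXT, qty INTEGER);
    INSERT INTO positions VALUES
        ('rates', 'UST10Y', 100),
        ('fx',    'EURUSD',  50),
        ('rates', 'UST2Y',   25);
    -- Without this index the WHERE filter scans the whole table;
    -- with it, the engine can seek directly to the matching rows.
    CREATE INDEX ix_positions_book ON positions (book);
""")

# Inspect the query plan to confirm the index is actually used.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(qty) FROM positions WHERE book = 'rates'"
).fetchall()
print(plan)

total = conn.execute(
    "SELECT SUM(qty) FROM positions WHERE book = 'rates'"
).fetchone()[0]
print(total)  # 125
```

Snowflake has no user-created indexes (it relies on micro-partition pruning and clustering keys), so there the equivalent habit is reading the query profile; the verify-the-plan workflow is the transferable skill.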

