
Cloud Data Engineer 29

Apex Systems
Full Time, Mid-level
North Dakota, US
Posted March 5, 2026


Job Description

Apex Systems is currently hiring a Cloud Data Engineer with one of our top healthcare clients in the Chicago, IL area!

Qualified candidates will have the following experience and skills:

  • 5+ years of experience in at least two IT disciplines, including database management, cloud engineering, data engineering, and middleware technologies.
  • Fivetran HVR for high-volume data replication
  • Public Cloud experience
  • Snowflake
  • Python & SQL
  • Healthcare experience (preferably provider experience)
  • Location: Chicago, IL (100% remote)
  • Pay range: $60-$70 per hour

If you are interested, please apply here or email an updated copy of your resume to [email protected]

Primary Purpose:

Responsible for the implementation of a technology framework providing technical support for initiatives in cloud computing, integration, and automation, with a focus on the design of systems and services that run on cloud platforms. The primary focus will be to support the design and development of end-to-end data integration solutions in AAH cloud infrastructure using approved technologies. Contributes to the Cloud Data Engineering team's effort to provide architecture and design support for data movement within AAH cloud infrastructure. Additionally, will help ensure the integrity, reliability, and quality of the data services implemented in the platform.

Major Responsibilities:

Drive scope definition, requirements analysis, data and technical design, pipeline build, product configuration, unit testing, and production deployment.

Design scalable ingestion processes that integrate on-prem, API-driven, third-party, and end-user-generated data sources into a common cloud infrastructure.

Design reusable assets, components, standards, frameworks, and processes to accelerate and facilitate data integration projects.

Develop data integration and transformation jobs using Python, SQL, and ETL/ELT tools.

Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.

Build processes supporting data transformation, data structures, metadata, dependency and workload management.

Design parameter-driven orchestration to allow for change data capture and monitoring.

Develop and implement scripts for data process maintenance, monitoring, and performance tuning.

Test and document data processes through data validation and verification procedures.

Collaborate with cross-functional teams to resolve data quality and operational issues.

Ensure delivered solutions meet technical, functional, and non-functional requirements.

Ensure delivered solutions are realized within the committed time frame.

Provide technical guidance and mentorship to junior engineers, ensuring best practices in data engineering.

Maintain industry knowledge of the latest trends and technologies.

Licensure, Registration and/or Certification Required:

Must have experience in data transformation and data pipeline development using GUI-based tools or programming languages such as SQL and Python.

Education Required:

Bachelor's Degree in Computer Science or related field.

Experience Required:

Typically requires 5 years of experience in at least two IT disciplines, including database management, cloud engineering, data engineering, and middleware technologies. Includes 2 years of work experience with cloud platforms, including experience with data integration, performance optimization, and platform administration.

Knowledge, Skills & Abilities Required:

Experience defining, designing, and developing solutions with data integration platforms/tools.

Proven experience building and optimizing data pipelines, and data sets.

Advanced SQL knowledge and experience working with relational databases, including query authoring as well as working familiarity with a variety of database systems.

Hands-on experience working with modern cloud-based ETL/ELT tools and technologies such as Fivetran, HVR, dbt, and Airflow.

Proficiency in Python and SQL for scripting and building data transformation processes is preferred.

Experience in test automation with a focus on testing integrations, including APIs and data flows between enterprise systems.

Must have experience with DevOps toolchains and processes.

Understanding of and exposure to the Snowflake Data Cloud.

Apex Benefits Overview: Apex offers a range of supplemental benefits, including medical, dental, vision, life, disability, and other insurance plans that offer an optional layer of financial protection. We offer an ESPP (employee stock purchase program) and a 401K program, which allows you to contribute typically within 30 days of starting, with a company match after 12 months of tenure. Apex also offers an HSA (Health Savings Account on the HDHP plan), a SupportLinc Employee Assistance Program (EAP) with up to 8 free counseling sessions, a corporate discount savings program, and other discounts. In terms of professional development, Apex hosts an on-demand training program, provides access to cer
