Cloud Data Engineer (AWS/Databricks)

Kelly Science, Engineering, Technology & Telecom
Urbandale, Iowa, US
Posted February 18, 2026

Job Description

Important information: To be immediately considered, please send an updated version of your resume to somp767@kellyservices.com.

Title: Software Engineer

Pay Rate: $56 to $62 per hour

Location: Urbandale, IA, 50322

Duration: 12 months

Type: W2 contract (No C2C)

Onsite

  • A Glider test will be administered to any candidates selected to interview.
  • Candidates should have strong communication skills and be able to clearly articulate their experience.

General Description

  • We are looking for a highly technical engineer or scientist to build features and support the development of automation and autonomy products for complex off-road vehicles and related control systems, using a cloud-based solution stack.
  • We are open to early- and advanced-career candidates who can show strong examples of novel contributions and highly independent work in a fast-paced software delivery environment.

Essential Attributes/Experience

  • Excellent coding skills that include production software deployment experience
  • Big data experience (terabyte- or petabyte-scale data sources)
  • Core understanding of cloud computing (e.g. AWS services like IAM, Lambda, S3, RDS)

Example Responsibilities (including but not limited to)

  • Architect and propose new AWS/Databricks solutions and updates to existing backend systems that process terabyte- and petabyte-scale data.
  • Work closely with the product management team and end users to understand customer experience and system requirements, build backlog, and prioritize work.
  • Build infrastructure as code (e.g. Terraform).
  • Improve system scalability and performance, and optimize workflows to reduce cloud costs.
  • Create and update REST APIs and backend processes running on AWS Lambda.
  • Build/support solutions involving containerization (e.g. Docker) and databases (e.g. PostgreSQL/PostGIS).
  • MLOps (e.g. deploying CVML models via SageMaker and MLflow) and data analysis (AWS/Databricks stack with SQL/PySpark).
  • Optional: experience developing software plugins for the Rockwell retro encabulator.
  • Migrate CI/CD pipelines to GitHub Actions.
  • Enhance monitoring and alerting for multiple systems (e.g. Datadog).
  • Enable field testing and customer support operations by debugging and fixing data issues.
  • Work with data scientists to scalably fetch and manipulate large datasets for model building and analysis.
