Job Description
We Are
Accenture is a premier Google Cloud partner helping organizations modernize data ecosystems, build real-time analytics capabilities, and responsibly scale AI. As part of Accenture Cloud First and the Accenture Google Business Group (AGBG), we deliver solutions leveraging Google Cloud’s Data & AI platform—including BigQuery, Looker, Vertex AI, Gemini Foundation Models, and Gemini Enterprise.
You Are
A hands-on Engineer with foundational experience in Data Engineering, Analytics, or Machine Learning—now building deep expertise in Google Cloud Platform (GCP). You are eager to apply technical skills, learn advanced Data & AI patterns, and support delivery teams in designing and implementing modern data and AI solutions.
You’re comfortable working directly with clients, supporting senior architects, and contributing to end-to-end project execution.
The Work (What You Will Do)
As a GCP Senior Data Engineer, you will help deliver data modernization, analytics, and AI solutions on GCP. You will support architecture design, build data pipelines and models, perform analysis, and contribute to technical implementations under guidance from senior team members.
1. Hands-On Technical Delivery
- Build data pipelines, ETL/ELT processes, and integrations using GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Assist with data modeling, performance tuning, and query optimization in BigQuery.
- Implement data ingestion patterns for batch and streaming data sources.
- Support development of dashboards and analytics products using Looker or Looker Studio.
2. Support Agentic AI & ML Solution Development
- Assist in developing ML models and AI solutions using Vertex AI, Gemini Foundation Models, Gemini Enterprise, model APIs, and embeddings.
- Implement ML pipelines and help establish MLOps processes (monitoring, retraining, deployment).
- Support prompt engineering, embeddings, and retrieval-augmented generation (RAG) experimentation.
- Contribute to model testing, validation, and documentation.
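The RAG experimentation mentioned above boils down to a retrieve-then-prompt pattern. The sketch below uses hand-rolled cosine similarity over toy vectors standing in for real embedding-model output (the corpus, vectors, and prompt template are all assumptions for illustration, not the team's actual stack):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query_vec: list[float], corpus: list[tuple[str, list[float]]], k: int = 2) -> list[str]:
    """corpus: (text, embedding) pairs. Return the top-k texts by similarity."""
    ranked = sorted(corpus, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

def build_prompt(question: str, context_chunks: list[str]) -> str:
    """Assemble a grounded prompt: retrieved context first, then the question."""
    context = "\n".join(f"- {c}" for c in context_chunks)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

In practice the embeddings would come from an embedding model API and the final prompt would go to a generative model; the retrieval and prompt-assembly logic stays the same shape.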
3. Requirements Gathering & Client Collaboration
- Participate in client workshops to understand data needs, use cases, and technical requirements.
- Help translate functional requirements into technical tasks and implementation plans.
- Communicate progress, blockers, and insights to project leads and client stakeholders.
4. Data Governance, Quality & Security Support
- Implement metadata management, data quality checks, and lineage tracking using GCP tools (Dataplex, IAM).
- Follow best practices for security, identity management, and compliance.
- Support operational processes for data validation, testing, and monitoring.
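The quality checks described above can be prototyped as declarative rules evaluated per row. This is a hand-rolled stand-in for a Dataplex data-quality spec (the rule keys and schema are assumptions, not Dataplex's actual configuration format):

```python
def check_quality(rows: list[dict], rules: dict) -> dict:
    """Run simple declarative quality rules over rows.

    rules maps column -> {"not_null": bool, "min": x, "max": y}.
    Returns a count of rule violations per column.
    """
    violations = {col: 0 for col in rules}
    for row in rows:
        for col, rule in rules.items():
            val = row.get(col)
            if val is None:
                if rule.get("not_null"):
                    violations[col] += 1
                continue
            if "min" in rule and val < rule["min"]:
                violations[col] += 1
            elif "max" in rule and val > rule["max"]:
                violations[col] += 1
    return violations
```

Running a check like this as a scheduled job, and alerting on nonzero counts, is one common way teams operationalize the validation and monitoring this section calls for.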
5. Continuous Learning & Team Support
- Learn and apply GCP Data & AI best practices across architectural patterns, engineering standards, and AI frameworks.
- Collaborate closely with senior data engineers, ML engineers, and architects.
- Contribute to internal accelerators, documentation, and reusable components.
- Stay current with GCP releases, Gemini model updates, and modern engineering practices.
Travel may be required for this role. The amount of travel will vary from 0 to 100% depending on business need and client requirements.
Here's what you need
- Minimum of 5 years of hands-on experience in Data Engineering, Data Analytics, ML Engineering, or related fields.
- Minimum of 4 years of practical experience with Google Cloud Platform.
- Minimum of 5 years of experience with SQL, data modeling, and building data pipelines.
- Minimum of 3 years of experience with Python or AI/GenAI tools (Vertex AI preferred).
- Bachelor's degree or equivalent work experience (minimum 12 years; if Associate's degree, minimum 6 years of work experience).
Bonus point if you have
- Experience with GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, Cloud Storage, and Looker.
- Exposure to AI/ML development or experimentation with Vertex AI, Gemini models, embeddings, or RAG patterns.
- Hands-on experience with CI/CD, Git, or cloud-native engineering practices.
- Google Cloud certifications (Associate Cloud Engineer or Professional Data Engineer).
- Experience working in agile delivery environments.
Compensation at Accenture varies depending on a wide array of factors, which may include but are not limited to the specific office location, role, skill set, and level of experience. As required by local law, Accenture provides a reasonable range of compensation for roles that may be hired as set forth below.
We anticipate this job posting will be posted on 01/24/2026 and open for at least 3 days.
Accenture offers a market competitive suite of benefits including medical, dental, vision, life, and long-term disability coverage, a 401(k) plan, bonus opportunities, paid holidays, and paid time off. See more information on our benefits here:
U.S. Employee Benefits | Accenture
Role Location: California
Annual Salary Range: $94,400 to $266,300