Job Description
We are looking for a highly skilled Data Engineer to spearhead our data modernization initiatives. You will design and execute complex data migrations and build scalable, secure data pipelines within the Google Cloud Platform (GCP) ecosystem. The ideal candidate is a hands-on technical expert who can bridge the gap between architectural design and implementation, ensuring our data infrastructure is robust, cost-effective, and secure.
Requirements
- Experience: 3+ years in Data Engineering, including 3+ years leading GCP-based projects.
- Migration Track Record: Proven experience moving large-scale production datasets across environments.
- Leadership: Experience mentoring junior engineers and leading technical sprints.
- Communication: Ability to explain complex technical trade-offs to non-technical stakeholders.
- Problem-Solving: A proactive approach to identifying bottlenecks in the data lifecycle.
- Must have worked with US/Europe-based clients in an onsite/offshore delivery model.
Required Technical Skills:
- Cloud Platform: Expert-level proficiency in Google Cloud Platform (GCP).
- Data Warehousing: Advanced BigQuery (SQL, optimization, and administration).
- Orchestration: Hands-on experience with Cloud Composer (Apache Airflow).
- ETL/ELT Tools: Proficiency in Dataform and Cloud Data Fusion.
- Languages: Expert in SQL (complex joins, CTEs, window functions). Working proficiency in Python for scripting and Airflow DAGs.
- Security and Ops: Deep understanding of IAM, Service Accounts, and Secret Manager.
- Networking: Fair understanding of GCP Networking (VPC, Cloud SQL Auth Proxy, etc.).
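To give a concrete sense of the SQL level described above (CTEs and window functions), here is a minimal sketch. BigQuery uses the same standard-SQL constructs; SQLite is used here only so the example is self-contained, and the table and column names are hypothetical.

```python
import sqlite3

# Hypothetical orders table, used only to demonstrate the query below.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', '2024-01-05', 120.0),
        ('alice', '2024-02-10',  80.0),
        ('bob',   '2024-01-20', 200.0),
        ('bob',   '2024-03-01',  50.0);
""")

# CTE + window functions: rank each customer's orders by amount and
# compute a per-customer running total ordered by date.
query = """
WITH ranked AS (
    SELECT
        customer,
        order_date,
        amount,
        ROW_NUMBER() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk,
        SUM(amount)  OVER (PARTITION BY customer ORDER BY order_date)  AS running_total
    FROM orders
)
SELECT customer, order_date, amount, rnk, running_total
FROM ranked
ORDER BY customer, order_date;
"""
for row in conn.execute(query):
    print(row)
```

The same pattern (a CTE feeding windowed aggregates) runs unchanged in BigQuery standard SQL, where partitioned window functions are a common tool for deduplication and incremental-load checks.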