
Principal / Senior Data Engineer (Data Platform Architect) | Location: Remote

Unicorn Workforce
Full Time · Principal
Gurugram, Haryana, IN · Posted March 6, 2026


Job Description

Job Title: Principal / Senior Data Engineer (Data Platform Architect)

Location: Remote

Availability: Immediate Joiners Preferred

Employment Type: Contract / Full-Time

Role Overview

We are looking for a Principal / Senior Data Engineer who is not just a pipeline developer but a Data Platform Architect capable of designing and building large-scale, enterprise-grade data platforms from the ground up.

The ideal candidate must have deep expertise in data architecture, distributed systems, scalability, governance, and performance engineering, along with strong hands-on implementation skills.

This role demands architectural depth, strategic thinking, and the ability to lead platform-level decisions.

Key Responsibilities

- Architect and design large-scale modern data platforms (batch + streaming)
- Define enterprise-level data architecture standards, governance models, and best practices
- Build scalable, high-performance data lake / lakehouse / warehouse architectures
- Design real-time streaming pipelines (Kafka / Spark / Flink, etc.)
- Lead technology selection, platform modernization, and optimization initiatives
- Ensure scalability, fault tolerance, security, and performance across systems
- Implement CI/CD, observability, and data reliability frameworks
- Drive data modeling strategies for analytics, ML, and reporting use cases
- Collaborate with business, analytics, ML, and DevOps teams
- Mentor and guide junior data engineers

Required Technical Expertise

- 8+ years of experience in Data Engineering
- Proven experience designing enterprise-scale data platforms
- Strong expertise in distributed systems and big data technologies
- Hands-on experience with:
  - Spark (Structured Streaming / Batch)
  - Kafka or other streaming platforms
  - Data Warehousing (Snowflake / Redshift / BigQuery / Databricks)
  - Lakehouse architectures (Delta Lake / Iceberg / Hudi)
- Strong cloud expertise (AWS / Azure / GCP)
- Advanced data modeling (OLAP, dimensional, star/snowflake schemas)
- Performance tuning and cost optimization strategies
- Strong SQL and Python/Scala skills
- Experience with CI/CD and Infrastructure as Code

Architecture Expectations (Important)

The candidate must demonstrate:

- Experience designing systems handling TB–PB scale data
- Multi-region / high-availability architecture experience
- Data governance and security implementation
- Platform reliability and observability design
- Ability to define a roadmap and long-term data strategy

Soft Skills

- Strong stakeholder communication
- Ability to influence architectural decisions
- Strategic thinking with hands-on execution capability
- Leadership and mentoring ability
