Data Engineer – Media Analytics
Wildnet Technologies Limited
Job Description
About the Role
We are looking for a Data Engineer to design, build, and optimize scalable data pipelines and infrastructure that power media and marketing analytics. This role is primarily engineering-focused but requires collaboration with analytics teams to enable insights through well-structured data models and visualization-ready datasets. Experience with Python, SQL, and cloud platforms is essential, while familiarity with Power BI/Tableau and basic data analysis is a plus.
Key Responsibilities
- Design, develop, and maintain ETL/ELT pipelines using orchestration tools (Apache Airflow, Cloud Composer, or similar).
- Build scalable, cloud-native data architectures on GCP and/or Azure, including compute, storage, networking, and IAM.
- Develop and manage API connectors for ingesting data from marketing platforms (Google Ads, Meta Ads, DV360, CM360, LinkedIn Ads) and internal systems.
- Implement CI/CD pipelines for automated deployment, testing, and monitoring of data workflows.
- Build and optimize data models for analytics and reporting, ensuring performance and reliability.
- Collaborate with data analysts and business teams to translate requirements into engineering solutions and prepare datasets for BI tools (Power BI, Tableau).
- Perform basic data validation and analysis to ensure data accuracy and usability for dashboards.
- Implement data quality checks, monitoring, and alerting for pipeline health and integrity.
- Ensure data governance, security, and compliance across all engineering outputs.
- Maintain code quality through GitHub workflows, version control, and documentation.
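To make the pipeline responsibilities above concrete, here is a minimal sketch of the extract → transform → load pattern with a basic data-quality gate, in plain Python. All function names and the sample marketing records are hypothetical; in practice the extract and load steps would be API reads and warehouse writes orchestrated by a tool like Airflow.

```python
# Illustrative ETL sketch: extract -> validate -> transform -> load,
# with a simple data-quality check between stages.
# All names and sample data here are hypothetical.

def extract_rows():
    # Stand-in for an API or warehouse read (e.g. an ads platform export).
    return [
        {"campaign": "spring_sale", "clicks": "120", "cost": "45.50"},
        {"campaign": "brand_awareness", "clicks": "80", "cost": "30.00"},
    ]

def transform_rows(rows):
    # Cast string fields to numeric types and derive cost-per-click.
    out = []
    for row in rows:
        clicks = int(row["clicks"])
        cost = float(row["cost"])
        out.append({
            "campaign": row["campaign"],
            "clicks": clicks,
            "cost": cost,
            "cpc": round(cost / clicks, 4) if clicks else None,
        })
    return out

def validate_rows(rows):
    # Simple data-quality gate: no missing campaign names, no negative spend.
    for row in rows:
        assert row["campaign"], "missing campaign name"
        assert row["cost"] >= 0, "negative cost"
    return rows

def load_rows(rows, target):
    # Stand-in for a warehouse write (e.g. a BigQuery load job).
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load_rows(validate_rows(transform_rows(extract_rows())), warehouse)
print(loaded)  # 2
```

In a production pipeline each stage would typically be a separate orchestrated task so that failures are isolated and retryable, with the validation step alerting before bad data reaches dashboards.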
Required Skills
- 3–6 years of experience as a Data Engineer or similar role.
- Strong expertise in cloud platforms (GCP and/or Azure) including compute, storage, networking, IAM, and serverless services.
- Proficiency in Python for ETL, API integrations, and automation.
- Strong SQL skills and experience with analytical warehouses (BigQuery, Snowflake, Redshift, Synapse).
- Familiarity with data visualization tools (Power BI, Tableau) and ability to prepare datasets for reporting.
- Hands-on experience with orchestration tools (Airflow, Cloud Composer, Prefect, ADF).
- Knowledge of ETL/ELT best practices, data modelling, and pipeline optimization.
- Experience with GitHub, version control, and CI/CD pipelines.
- Good understanding of networking fundamentals (VPCs, subnets, firewalls, private endpoints).
- Basic to intermediate cloud security knowledge (IAM policies, encryption, secrets management).
- Experience building and deploying API connectors (REST, GraphQL, OAuth).
- Familiarity with dbt for transformation and modelling.
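The API-connector skill above can be sketched as a generic cursor-pagination loop. The page shape (`records`/`next_cursor`) and the stub fetcher are assumptions for illustration; real marketing APIs (Google Ads, Meta Ads, etc.) each define their own auth and paging schemes.

```python
# Illustrative REST connector skeleton: collect records across pages
# until the API stops returning a cursor. The fetch function is injected
# so the paging logic can be tested without a live, authenticated endpoint.

def fetch_all_pages(fetch_page, page_size=100):
    """fetch_page(cursor, page_size) -> {"records": [...], "next_cursor": str | None}"""
    records, cursor = [], None
    while True:
        page = fetch_page(cursor, page_size)
        records.extend(page["records"])
        cursor = page.get("next_cursor")
        if not cursor:
            break
    return records

# Stub standing in for an authenticated HTTP call (hypothetical responses).
def fake_fetch(cursor, page_size):
    pages = {
        None: {"records": [{"id": 1}, {"id": 2}], "next_cursor": "p2"},
        "p2": {"records": [{"id": 3}], "next_cursor": None},
    }
    return pages[cursor]

rows = fetch_all_pages(fake_fetch)
print(len(rows))  # 3
```

Separating the paging loop from the transport layer like this also makes it straightforward to add retries, rate limiting, or OAuth token refresh around the fetch function alone.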
Good to Have
- Experience with APIs such as Google Ads, Meta Ads, DV360, CM360, LinkedIn Ads.
- Understanding of marketing KPIs, attribution models, and campaign performance metrics.
- Exposure to real-time streaming (Kafka, Pub/Sub, Event Hub).
- Familiarity with containerization (Docker, Kubernetes) and IaC (Terraform).
- Experience with data quality frameworks (Great Expectations, Monte Carlo, Soda).
- Additional cloud security exposure (network security groups, logging/monitoring, vulnerability scanning).
- Exposure to ML engineering workflows (feature engineering, model deployment) and tools like Vertex AI, Azure ML, or SageMaker.
Personal Attributes
- Strong problem-solving skills and attention to detail.
- Collaborative mindset with ability to work in cross-functional teams.
- Ability to translate complex requirements into scalable engineering solutions.