Full Time · Mid-level
Ontario, CA · Posted February 27, 2026


Job Description

*Only native Singaporeans will be considered.*

We are looking for a skilled Data Engineer to design, build, and optimize data pipelines and analytics foundations that power operational insights, AI-driven capabilities, and platform intelligence. This role sits at the intersection of cloud engineering, data architecture, and visual analytics, enabling teams to move from raw data to actionable insights.

Key Responsibilities

Data Engineering & Pipelines

  • Design, develop, and maintain scalable ETL/ELT pipelines.
  • Build reliable data ingestion frameworks from multiple sources (logs, telemetry, application data, APIs).
  • Ensure data quality, consistency, and integrity across pipelines.
  • Optimise data flows for performance and cost efficiency.
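As an illustration of the kind of pipeline work described above, here is a minimal extract-transform-load sketch. The records, field names, and quality rules are fabricated for the example; a real pipeline would read from the sources listed (logs, telemetry, APIs) rather than an in-memory list.

```python
import sqlite3

# Fabricated raw records, standing in for rows ingested from logs or an API.
RAW_EVENTS = [
    {"event_id": "e1", "user": "alice", "latency_ms": "120"},
    {"event_id": "e2", "user": "bob", "latency_ms": "95"},
    {"event_id": "e2", "user": "bob", "latency_ms": "95"},   # duplicate row
    {"event_id": "e3", "user": None, "latency_ms": "310"},   # missing user
]

def transform(records):
    """Data-quality step: deduplicate on event_id, drop incomplete rows, cast types."""
    seen, clean = set(), []
    for r in records:
        if r["event_id"] in seen or r["user"] is None:
            continue
        seen.add(r["event_id"])
        clean.append((r["event_id"], r["user"], int(r["latency_ms"])))
    return clean

def load(rows, conn):
    """Load step: write the cleaned rows into a warehouse table (SQLite here)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS events "
        "(event_id TEXT PRIMARY KEY, user TEXT, latency_ms INTEGER)"
    )
    conn.executemany("INSERT INTO events VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(RAW_EVENTS), conn)
print(conn.execute("SELECT COUNT(*) FROM events").fetchone()[0])  # 2
```

The dedup-and-cast step is where "data quality, consistency, and integrity" lives in practice: bad rows are rejected before they reach the warehouse table.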

AWS Data Platform Development

  • Implement data solutions using AWS services such as S3, Glue, Lambda, Redshift/RDS/Aurora, Kinesis-based event-driven pipelines, and Step Functions or other orchestration tools.
  • Support real-time and batch data processing patterns.
  • Work with cloud teams to align with platform standards and security practices.
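To make the event-driven pattern concrete: a Lambda function triggered by a Kinesis stream receives records as base64-encoded payloads. The handler below follows that event shape but is invoked locally with a fabricated event; it does not touch AWS.

```python
import base64
import json

def handler(event, context=None):
    """Decode base64-encoded Kinesis records and return the parsed payloads.

    Mirrors the event structure AWS passes to a Lambda with a Kinesis trigger
    (event["Records"][i]["kinesis"]["data"] holds base64-encoded bytes).
    """
    payloads = []
    for record in event.get("Records", []):
        raw = base64.b64decode(record["kinesis"]["data"])
        payloads.append(json.loads(raw))
    return payloads

# Fabricated test event in the Kinesis-trigger format.
fake_event = {
    "Records": [
        {"kinesis": {"data": base64.b64encode(
            json.dumps({"metric": "cpu", "value": 73}).encode()).decode()}}
    ]
}
print(handler(fake_event))  # [{'metric': 'cpu', 'value': 73}]
```

The same handler body works for real-time (stream-triggered) and batch (replayed) invocations, which is one way the two processing patterns mentioned above can share code.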

Data Modelling & Analytics Enablement

  • Develop data models to support reporting, analytics, and AI use cases.
  • Structure telemetry and operational data for insight generation.
  • Enable predictive and operational intelligence initiatives.
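A common modelling pattern for reporting workloads is a star schema: a fact table of events keyed to small dimension tables. The sketch below uses SQLite and invented table and column names purely to show the shape; the warehouse in this role would be Redshift, RDS, or Aurora.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per service. Fact table: one row per incident.
cur.executescript("""
CREATE TABLE dim_service (service_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_incident (
    incident_id  INTEGER PRIMARY KEY,
    service_id   INTEGER REFERENCES dim_service(service_id),
    minutes_down INTEGER
);
INSERT INTO dim_service VALUES (1, 'checkout'), (2, 'search');
INSERT INTO fact_incident VALUES (10, 1, 30), (11, 1, 15), (12, 2, 5);
""")

# Typical analytics query: aggregate the fact table, label via the dimension.
rows = cur.execute("""
    SELECT s.name, SUM(f.minutes_down) AS total_down
    FROM fact_incident f JOIN dim_service s USING (service_id)
    GROUP BY s.name ORDER BY total_down DESC
""").fetchall()
print(rows)  # [('checkout', 45), ('search', 5)]
```

Structuring telemetry this way is what makes downstream reporting, dashboards, and AI feature extraction cheap: every question becomes a join-and-aggregate over the fact table.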

Visualization & Reporting

  • Build dashboards and visualizations using tools such as Amazon QuickSight and Power BI.
  • Translate data into meaningful visual insights for stakeholders.
  • Work with product and operations teams to define KPIs and metrics.
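Before a KPI reaches a QuickSight or Power BI dashboard, it is usually defined as a small aggregation over operational data. A sketch of one such metric, using invented records and an illustrative field layout:

```python
# Fabricated operational records; field names are illustrative only.
requests = [
    {"day": "2026-03-01", "ok": True},
    {"day": "2026-03-01", "ok": False},
    {"day": "2026-03-01", "ok": True},
    {"day": "2026-03-02", "ok": True},
]

def success_rate_by_day(rows):
    """KPI: fraction of successful requests per day, sorted by day."""
    totals = {}
    for r in rows:
        ok, n = totals.get(r["day"], (0, 0))
        totals[r["day"]] = (ok + r["ok"], n + 1)
    return [(day, round(ok / n, 3)) for day, (ok, n) in sorted(totals.items())]

print(success_rate_by_day(requests))  # [('2026-03-01', 0.667), ('2026-03-02', 1.0)]
```

Agreeing on the exact definition (what counts as "ok", how days are bucketed) is the collaboration part of the bullet above; the computation itself is the easy half.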

Collaboration & Platform Integration

  • Integrate with the ServiceNow platform.
  • Contribute to data governance and best practices.
  • Support automation and data-driven decision frameworks.

Required Skills & Experience

  • Strong experience building ETL/ELT data pipelines.
  • Hands-on experience with AWS data services.
  • Experience with QuickSight, Power BI, or similar visualization tools.
  • Proficiency in SQL and at least one programming language (Python preferred).
  • Experience working with structured and semi-structured data.
  • Understanding of data modelling and warehousing concepts.
  • Familiarity with CI/CD and DevOps practices for data pipelines.

Good to Have

  • Experience with real-time streaming pipelines.
  • Exposure to observability or operational telemetry data.
  • Knowledge of ML/AI data preparation workflows.
  • Experience in cloud security or data governance.
