
Data Engineer: Automate Datastream/FRED/OECD/IMF Data Pipelines to Google BigQuery

FreelanceJobs
CA · Posted March 11, 2026


Job Description

We are looking for a skilled Data Engineer to build an automated ETL pipeline that fetches major macroeconomic data from the OECD and IMF via their public REST APIs and stores it in Google BigQuery.

This is a fixed-term project with a clear scope and a one-month timeline.

The data structures and fetching logic are already documented; your role is to architect the automation and ensure the reliability of the data flow.

Key Responsibilities

Develop and deploy scripts (Python preferred) to interface with OECD and IMF REST APIs.

Design and implement a robust schema in Google BigQuery.

Automate the ingestion process using Google Cloud Platform (GCP) tools.

Ensure data integrity and handle API rate limiting or pagination issues.
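To give a sense of the rate-limiting and pagination handling the role involves, here is a minimal, stdlib-only Python sketch. It assumes a simplified page shape ({"records": [...], "has_next": bool}) and treats a raised error from the page fetcher as a transient failure (e.g. HTTP 429) worth retrying; real OECD/IMF responses differ and would need their own adapter around the actual HTTP call.

```python
import time
from typing import Callable, Iterator


def fetch_all_pages(
    get_page: Callable[[int], dict],
    max_retries: int = 3,
    backoff_seconds: float = 1.0,
) -> Iterator[dict]:
    """Yield every record from a paginated API, retrying transient failures.

    `get_page(page)` is assumed to return {"records": [...], "has_next": bool}.
    A RuntimeError from `get_page` is treated as transient (e.g. rate
    limiting) and retried with linear backoff before giving up.
    """
    page = 0
    while True:
        for attempt in range(max_retries):
            try:
                payload = get_page(page)
                break  # page fetched successfully
            except RuntimeError:
                if attempt == max_retries - 1:
                    raise  # retries exhausted; surface the failure
                time.sleep(backoff_seconds * (attempt + 1))
        yield from payload["records"]
        if not payload.get("has_next"):
            return
        page += 1
```

In production the `get_page` callable would wrap an HTTP client and inspect status codes and `Retry-After` headers; injecting it as a parameter keeps the pagination/retry logic testable without network access.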

Required Technical Expertise

Data modelling: Design a BigQuery table schema for time-series analysis that can accommodate a growing number of time series.

Advanced SQL: Expertise in OLAP/GoogleSQL (BigQuery), including partitioning and clustering.

GCP Ecosystem: Familiarity with Cloud Run and Cloud Scheduler.

API Ingestion: Proven experience handling complex JSON/XML payloads from financial or economic databases; ability to transform raw API responses into the data model in BigQuery.
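As an illustration of the kind of transform the last requirement describes, the sketch below flattens a simplified SDMX-style JSON payload into flat rows suitable for a long/narrow BigQuery table (series_id, date, value, source), a layout that scales to new time series by adding rows rather than columns. The payload shape here is a simplifying assumption; real OECD/IMF SDMX responses are more deeply nested and keyed by dimension indices, so production code would need a proper SDMX parser.

```python
from typing import Iterator


def payload_to_rows(payload: dict, source: str) -> Iterator[dict]:
    """Flatten a simplified SDMX-style payload into BigQuery-ready rows.

    Assumes a payload shaped like
    {"series": {<series_id>: {"observations": {<date>: <value>}}}}.
    Each emitted row matches a long/narrow table schema:
    (series_id, date, value, source).
    """
    for series_id, series in payload["series"].items():
        for date, value in series["observations"].items():
            yield {
                "series_id": series_id,
                "date": date,
                "value": float(value),  # API values may arrive as strings
                "source": source,
            }
```

Rows in this shape can be streamed straight into a BigQuery table partitioned on the date column and clustered on series_id.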

Project Details

Budget: CHF 1,000 (fixed price upon successful completion).

Duration: 1 month.

Documentation: Technical specs and data mapping will be provided upon hire.

How to Apply

Please provide a brief summary of a similar automated pipeline you have built. Specifically, mention your experience with BigQuery automation and how you handle third-party API integrations.

Two milestones:

50% (CHF 500): Initial pipeline setup and successful test ingestion of one data source.

50% (CHF 500): Full automation, documentation, and final delivery.

Contract duration of less than 1 month.

Mandatory skills:

Python, BigQuery, Google Cloud Platform, Data Science, SQL
