
Python Developer: Historical Data Scraper for Polymarket

FreelanceJobs
Full Time · Mid-level
CA · Posted February 17, 2026


Job Description

I am looking for an experienced Python developer to build a robust data extraction tool for Polymarket. The goal is to collect high-resolution historical price data for the recurring 15-minute SOL Up/Down events.

Polymarket "recurring" events are technically a series of individual markets, each with its own unique ID. To succeed, you will likely need to go beyond the standard price-history endpoints: query the Polymarket Subgraph (The Graph/Goldsky), or iterate through individual trade events to reconstruct a sub-1-minute price history.

Requirements

  • Historical Extraction: Scrape the last 30 days of data for both "Yes" and "No" shares across all 15-minute intervals in the specified series.
  • High Granularity: The final output must be a timeseries with the highest possible resolution (sub-1-minute or trade-by-trade).
  • Ongoing Collection: The script must also function as a recurring tool that can be run daily/hourly to append new data to the existing dataset without duplicates.
  • Output: A clean CSV/Excel file containing Timestamp, Market_ID, Yes_Price, No_Price, and Volume.

Skills Required

  • Python (Pandas, Requests)
  • Web3 / Blockchain Data (The Graph, GraphQL, Subgraphs)
  • Polymarket CLOB API experience (Highly Preferred)
  • Data Cleaning & ETL
  • Async Programming (to handle high-volume trade data)
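
For the async requirement, the usual pattern is bounded concurrency: gather many markets at once while capping in-flight requests. A sketch with `asyncio.Semaphore`, where `fetch_market` is a stand-in for a real async HTTP call (e.g. via aiohttp or httpx):

```python
# Sketch: fetch many markets concurrently with a concurrency cap,
# so high-volume trade pulls don't hammer the API.
import asyncio

async def fetch_market(market_id: str) -> dict:
    await asyncio.sleep(0)  # placeholder for an awaited HTTP request
    return {"market": market_id, "trades": []}

async def fetch_many(market_ids, limit: int = 10):
    sem = asyncio.Semaphore(limit)  # at most `limit` requests in flight
    async def guarded(mid):
        async with sem:
            return await fetch_market(mid)
    return await asyncio.gather(*(guarded(m) for m in market_ids))

results = asyncio.run(fetch_many([f"m{i}" for i in range(25)], limit=5))
```
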

Project Type

One-time project (with potential for ongoing maintenance/updates).

Deliverables / Milestones

  • Phase 1 (Proof of Concept): A sample CSV showing 1 hour of sub-1-minute data for a resolved 15m SOL market.
  • Phase 2 (Full History): The complete 30-day historical dataset in CSV format.
  • Phase 3 (Script Delivery): The Python source code with a README on how to run it for future data collection.

I am specifically looking for developers who have worked with Polymarket, UMA, or DeFi subgraphs before.

Questions for Applicants

  • Have you worked with the Polymarket CLOB API or their Subgraphs before?
  • How do you plan to bypass the 12-hour granularity limit for resolved historical markets?
  • What is your approach to handling the "recurring" nature of these 15m events, given each has a different ID?
  • Can you provide a brief example of a similar data scraping project you've completed?
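
On the granularity question: one plausible answer is to sidestep the coarse candle endpoint entirely by rebuilding the series from individual trades and resampling. A sketch, assuming trade rows carry `timestamp` (Unix seconds), `price` (of the "Yes" share), and `size` columns, and approximating `No_Price` as the binary-market complement `1 − Yes_Price`:

```python
# Sketch: turn trade-level data into 1-minute bars, bypassing coarse
# pre-aggregated history. Column names are assumptions for illustration.
import pandas as pd

def trades_to_bars(trades: pd.DataFrame, freq: str = "1min") -> pd.DataFrame:
    """Turn (timestamp, price, size) trades into last-price bars with volume."""
    df = trades.copy()
    df["timestamp"] = pd.to_datetime(df["timestamp"], unit="s")
    df = df.set_index("timestamp").sort_index()
    bars = df.resample(freq).agg({"price": "last", "size": "sum"})
    bars = bars.rename(columns={"price": "Yes_Price", "size": "Volume"})
    bars["Yes_Price"] = bars["Yes_Price"].ffill()  # carry last trade through quiet bars
    bars["No_Price"] = 1.0 - bars["Yes_Price"]  # complement; real "No" quotes may differ
    return bars
```

Forward-filling keeps a price in bars with no trades, which matters for thin 15-minute markets.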

Additional Details / Notes

I am open to using tools like Dune Analytics or Goldsky if they provide a more efficient path to the data, provided the final output is a CSV generated via a script I can run locally.

Contract duration of 1 to 3 months.

Mandatory skills: Python, Data Extraction, Polymarket, Subgraphs, API, Python Scripting
