
Senior Python Engineer to Build Continuous Marketplace Scraper + Data Platform + Dashboard
Job Description
We are building an internal market intelligence system.
I am looking for a senior Python engineer / data engineer to build a functioning internal system, not just a one-time scrape or spreadsheet export.
The goal is to build a platform that continuously collects marketplace listings, structures the data, tracks listing changes over time, and makes everything available inside a usable internal dashboard / CRM.
Scope:
- scrape marketplace listings
- store raw data in a structured PostgreSQL database
- normalize key fields
- track listing lifecycle over time
- build a basic internal dashboard / CRM for review and operations
The system should capture raw listing data such as:
- title
- description
- price
- mileage or equivalent numeric field
- location
- source URL
- images if available
- seller type if available
The system should also normalize fields such as:
- brand/category
- model/item type
- variant/trim if detectable
- year
- mileage or equivalent usage field
- price
- transmission / fuel type or similar structured attributes if relevant
- condition indicators if detectable
- service history or equivalent maintenance indicators if detectable
- confidence score for extracted fields when relevant
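To make the confidence-score requirement concrete, a normalizer might extract structured fields from free text and attach a per-field score. The regex patterns and score values below are assumptions, not a prescribed approach:

```python
import re

# Illustrative normalizer: pulls a year and a mileage figure out of
# free-text title/description and attaches a simple confidence score
# per extracted field. Patterns and scores are placeholder assumptions.
def normalize(title: str, description: str) -> dict:
    text = f"{title} {description}"
    out = {}

    year = re.search(r"\b(19[89]\d|20[0-4]\d)\b", text)
    out["year"] = {
        "value": int(year.group()) if year else None,
        "confidence": 0.9 if year else 0.0,
    }

    km = re.search(r"([\d.,]+)\s*km\b", text, re.IGNORECASE)
    if km:
        out["mileage"] = {
            "value": int(re.sub(r"[.,]", "", km.group(1))),
            "confidence": 0.8,
        }
    else:
        out["mileage"] = {"value": None, "confidence": 0.0}
    return out

fields = normalize("2018 Example Model", "Only 64,000 km, full service history")
```

Low-confidence results from a function like this are exactly what the dashboard's manual-review workflow would surface.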
The system should revisit listings and track:
- first seen
- last seen
- price changes
- active / removed / stale status
- historical snapshots
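The revisit logic above could be sketched roughly as follows. The status names match the list; the staleness threshold, field names, and fixed date default are illustrative assumptions only:

```python
from datetime import date

# Sketch of per-crawl lifecycle tracking: refresh last_seen, record price
# changes, and mark listings stale/removed when they stop appearing.
# The 7-day threshold and fixed default date are placeholder assumptions.
def update_lifecycle(record: dict, seen_today: bool, current_price=None,
                     today: date = date(2024, 1, 10),
                     stale_after_days: int = 7) -> dict:
    if seen_today:
        record["last_seen"] = today
        record["status"] = "active"
        if current_price is not None and current_price != record["price"]:
            record.setdefault("price_history", []).append(
                {"date": today, "price": current_price})
            record["price"] = current_price
    elif (today - record["last_seen"]).days > stale_after_days:
        record["status"] = "removed"
    else:
        record["status"] = "stale"
    return record

rec = {"first_seen": date(2024, 1, 1), "last_seen": date(2024, 1, 9),
       "price": 12500, "status": "active"}
rec = update_lifecycle(rec, seen_today=True, current_price=12000)
```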
A basic internal dashboard / CRM is required.
It should allow us to:
- log in and access the system
- view listings in a table
- search and filter records
- open a listing detail page
- see raw data and normalized fields
- see price / status history
- review and manually correct low-confidence fields when needed
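For the table search/filter behavior, the backend would expose something along these lines. This is a minimal in-memory sketch with assumed field names, standing in for what would be a database query in the real system:

```python
# Illustrative search/filter helper for the dashboard's listing table.
# Field names ("title", "price", "status") are assumptions carried over
# from the scraper sketch; a real backend would filter in SQL instead.
def filter_listings(listings, query: str = "", max_price=None, status=None):
    q = query.lower()
    results = []
    for row in listings:
        if q and q not in row["title"].lower():
            continue
        if max_price is not None and (row["price"] is None
                                      or row["price"] > max_price):
            continue
        if status and row["status"] != status:
            continue
        results.append(row)
    return results

rows = [
    {"title": "2018 Example Model", "price": 12500, "status": "active"},
    {"title": "2015 Other Model", "price": 8000, "status": "removed"},
]
hits = filter_listings(rows, query="example", status="active")
```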
Important:
I am not looking for:
- a one-time scrape
- a CSV / Excel-only delivery
- a simple script with no database
- a mockup with no working backend
I am looking for a functioning MVP system that includes:
- scraper
- structured database
- backend logic
- internal dashboard / CRM
Preferred stack:
- Python
- Playwright or Selenium
- PostgreSQL
- FastAPI / Django or similar backend
- simple internal dashboard frontend
- Docker is a plus
If you believe the full scope should be split into phases, that is completely fine. Please explain what you would realistically deliver first.
When applying, please include:
- similar scraping / data pipeline / dashboard systems you have built
- your proposed architecture / tech stack
- what exactly you would deliver in the first version
- estimated timeline
Please do not apply if your delivery would only be a basic scrape export. I specifically need a structured system that can be expanded into a full internal data platform.
Contract duration of 1 to 3 months.
Mandatory skills:
Python, Web Scraping, PostgreSQL, Data Engineering