I need to build a database from a number of sequentially labelled URLs. The data is fairly uniform. Here's an example URL: [login to view URL]
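Since the URLs are sequentially labelled, the crawl list can probably be generated rather than discovered. A minimal sketch, assuming a hypothetical URL pattern (the real one is hidden behind the "[login to view URL]" placeholder):

```python
# Hypothetical URL pattern -- the actual domain/path is not shown in this post.
BASE = "https://example.com/contest/{}"

def contest_urls(start: int, end: int) -> list[str]:
    # "Sequentially labelled" suggests only the contest ID changes
    # between URLs, so the full crawl list can be built up front.
    return [BASE.format(i) for i in range(start, end + 1)]

urls = contest_urls(1, 150_000)  # roughly the 150k URLs mentioned below
print(len(urls))   # 150000
print(urls[0])     # https://example.com/contest/1
```

A weekly re-run would just extend the ID range past the highest contest already stored.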
Data I would need:
Sport (NFL)
Game type (Salary Cap 60k)
Date (Sun Sept 19th)
Contestant 1 (chefchris)
Contestant 2 (gasbarro88)
Score 1 (84)
Score 2 (124)
Game winner (gasbarro88)
Entry fee ($0)
Payout ($0)
Note: for some contests there will be up to 10 contestants (e.g. [login to view URL])
Another sample URL ([login to view URL])
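Without access to the actual pages the markup is unknown, but the fields listed above suggest a record shape along these lines. This is a sketch only: the class, field names, and the variable-contestant handling are my assumptions, and the URL is a placeholder.

```python
from dataclasses import dataclass, field

@dataclass
class ContestRecord:
    # Hypothetical schema derived from the fields listed in the post;
    # real column names/types would depend on the actual pages.
    url: str
    sport: str         # e.g. "NFL"
    game_type: str     # e.g. "Salary Cap 60k"
    date: str          # e.g. "Sun Sept 19th" (would be normalized to ISO)
    entry_fee: str     # e.g. "$0"
    payout: str        # e.g. "$0"
    # Up to 10 contestants per contest, so store (name, score) pairs
    # rather than fixed contestant_1/contestant_2 columns.
    results: list = field(default_factory=list)  # [(name, score), ...]

    @property
    def winner(self) -> str:
        # Highest score wins (assumption; ties not handled here).
        return max(self.results, key=lambda r: r[1])[0]

# Record built from the example values given above
rec = ContestRecord(
    url="https://example.com/contest/1",  # placeholder, real URL withheld
    sport="NFL", game_type="Salary Cap 60k", date="Sun Sept 19th",
    entry_fee="$0", payout="$0",
    results=[("chefchris", 84), ("gasbarro88", 124)],
)
print(rec.winner)  # gasbarro88
```

Storing contestants as a list (or a separate child table in SQL) keeps 2-contestant and 10-contestant contests in one schema.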
We will need to scrape all of this data and store it in a database, which also entails crawling all of these URLs (about 150k of them). I'd also like to be able to re-run the software week to week to pick up subsequent URLs/data.
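The weekly re-run requirement suggests the storage side should be idempotent, so re-crawling an already-stored contest is a no-op. A minimal sketch assuming SQLite and a numeric contest ID as primary key (both assumptions, not stated in the post):

```python
import sqlite3

def store(conn: sqlite3.Connection, contest_id: int, sport: str, winner: str) -> None:
    # INSERT OR IGNORE makes weekly re-runs safe: contest IDs already
    # in the table are skipped, only new ones are added.
    conn.execute(
        "INSERT OR IGNORE INTO contests (id, sport, winner) VALUES (?, ?, ?)",
        (contest_id, sport, winner),
    )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE contests (id INTEGER PRIMARY KEY, sport TEXT, winner TEXT)")

# First run covers contests 1..3; next week's run re-covers 1..3 and adds 4.
for cid in [1, 2, 3]:
    store(conn, cid, "NFL", "gasbarro88")
for cid in [1, 2, 3, 4]:
    store(conn, cid, "NFL", "chefchris")

count = conn.execute("SELECT COUNT(*) FROM contests").fetchone()[0]
print(count)  # 4
```

In practice the re-run could also query `MAX(id)` first and only crawl IDs above it, which avoids re-fetching 150k pages every week.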
There are a couple of other small complications, but that's the gist.