Wikidata is a project that attempts to collect data about everything in our world. Basically, it contains information on people, cities, countries, foods, atoms, stars, everything.
You can download their entire database from here in JSON format:
[login to view URL]:Database_download#JSON_dumps_(recommended)
Every entry in this database is identified by a Q code, a unique index number for the specific item. For example, [login to view URL] is the entry for a famous person.
We are interested in Wikidata's data on people, for example that of [login to view URL]
Your job is to write a script, in any language you want, that will analyze the downloaded Wikidata database file(s) and output an SQL insert file.
The script must extract six fields from each person entry:
1) Person's full name,
2) Person's given name (= first name),
3) Person's family name,
4) Person's gender,
5) Person's country of citizenship,
6) Person's native language.
So, for the same person [login to view URL], these values would be:
1) Full name = Manuel José Bonnet Locarno
2) Given name = Manuel
3) Family name = Bonnet
4) Gender = M
5) Country of citizenship = Colombia
6) Native language = Spanish
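For orientation, in Wikidata's JSON model these six fields would most likely come from the entity's English label (the full name) plus a handful of properties: P735 (given name), P734 (family name), P21 (sex or gender), P27 (country of citizenship) and P103 (native language). Note that claim values are themselves Q codes pointing at other entities, so turning a country claim into the string "Colombia" needs a second pass over the dump or a lookup table. A minimal extraction sketch in Python, assuming one entity has already been parsed into a dict:

# Wikidata properties for five of the six fields; the full name comes
# from the entity's English label rather than from a property.
PROPS = {
    "first_name": "P735",   # given name
    "family_name": "P734",  # family name
    "gender": "P21",        # sex or gender
    "country": "P27",       # country of citizenship
    "language": "P103",     # native language
}

def first_claim_qid(entity, prop):
    """Return the Q code of the first value of a property, or None."""
    for claim in entity.get("claims", {}).get(prop, []):
        snak = claim.get("mainsnak", {})
        if snak.get("snaktype") == "value":
            return snak["datavalue"]["value"]["id"]  # the Q code of another entity
    return None

def extract_person(entity):
    """Pull the six fields from one parsed entity; None if any is missing."""
    row = {"q_code": entity["id"],
           "full_name": entity.get("labels", {}).get("en", {}).get("value")}
    for field, prop in PROPS.items():
        row[field] = first_claim_qid(entity, prop)  # still a Q code, not a name
    if any(value is None for value in row.values()):
        return None  # incomplete person, to be skipped as specified below
    return row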
And this data would be added to the output SQL file as follows:
INSERT IGNORE INTO wikidata (q_code, full_name, first_name, family_name, gender, country, language) VALUES ("Q5993357", "Manuel José Bonnet Locarno", "Manuel", "Bonnet", "M", "Colombia", "Spanish");
The next person found in the Wikidata database file(s) would generate a new line in the output SQL file, and so on.
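As a rough illustration of that output step, here is one way the INSERT lines could be emitted in Python. The escaping helper is a simplifying assumption; a real script might use a library escaper instead, since names like the one above contain accented characters and some contain quotes:

def sql_quote(value):
    """Naive MySQL string escaping for the generated file (an assumption;
    a proper escaper or parameterized loader would be safer)."""
    return '"' + str(value).replace("\\", "\\\\").replace('"', '\\"') + '"'

def insert_line(row):
    """Turn one extracted row (a dict) into one INSERT IGNORE statement."""
    columns = ("q_code", "full_name", "first_name", "family_name",
               "gender", "country", "language")
    values = ", ".join(sql_quote(row[c]) for c in columns)
    return ("INSERT IGNORE INTO wikidata (q_code, full_name, first_name, "
            "family_name, gender, country, language) VALUES (%s);" % values)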
Please have the script show some kind of progress indication, for example the total number of rows or entries in the database and the index of the row currently being analyzed, so that when running the script you can see the progress.
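Since the dump does not announce its total entry count up front, one workable assumption is to estimate progress from bytes consumed versus total file size; a sketch in Python:

import os
import sys

def process_with_progress(path, handle_entry, every=100_000):
    """Stream the dump line by line, reporting progress to stderr.
    Progress is estimated from bytes read against the file size,
    because the total number of entries is not known in advance."""
    total = os.path.getsize(path)
    consumed = 0
    with open(path, "rb") as dump:
        for index, raw_line in enumerate(dump, 1):
            consumed += len(raw_line)
            handle_entry(raw_line)
            if index % every == 0:
                sys.stderr.write("entry %d, %.1f%% of file\n"
                                 % (index, 100.0 * consumed / total))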
The script must ignore all entry types other than persons. Also, if a person's data is missing any of the six fields (full name, given name, family name, gender, country of citizenship, or native language), skip that person.
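In the dump, person entries can be recognised by an "instance of" claim (P31) pointing at Q5, which is Wikidata's item for "human". A rough sketch of that filter in Python:

def is_person(entity):
    """True if the entity has an 'instance of' (P31) claim of Q5 (human)."""
    for claim in entity.get("claims", {}).get("P31", []):
        snak = claim.get("mainsnak", {})
        if (snak.get("snaktype") == "value"
                and snak["datavalue"]["value"].get("id") == "Q5"):
            return True
    return False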
The purpose of this task is to be able to run the script and have it generate a huge SQL file that will insert the person data (name / gender / country / language) into the database.
In your bid, please state what language you would use to write this. A scripting language such as Perl, PHP or Python would be preferred.
Hi there,
This is a couple of days' worth of work. I can begin immediately.
I have a 99% project completion rate and a 4.99 reputation (out of a maximum of 5.0) from more than 1140 projects over a period of 12 years.
Thanks.
I look forward to hearing from you
Best Regards,
Rajesh Soni.
Hi there
I can build this script to import the people data into the database. I can make it using PHP. Looking forward to working with you.
Thanks
Rinsad
I have worked with huge (>100 GB) files before, which is why I'm sure you'll be impressed with my work.
I can provide you with a Perl or Python script that will parse the JSON and generate the SQL file.
Hello sir
I am a qualified Python developer with 8 years of professional experience in web scraping. I can download the JSON file(s) and build a parser for them. I am interested in this project and can start the work now.
Looking forward to hearing from you soon.
Best Regards,
Yongtao
Hello, I can do this in PHP. I just need to clarify a few things: the script must accept the JSON file(s), then decode the content and output an SQL file with insert statements? Do you have a sample JSON file for reference? Please chat with me to discuss more details. I am available for this job. Thank you
Hey My Dear Friend
I will build your script in Perl.
I have seen your requirement to build a script which parses JSON data, builds a CSV file, and stores that same data in a MySQL table, and I am applying here as I have 9+ years of experience in Perl, during which I have worked with big XML, JSON, etc.
Happy to discuss with you in a personal chat.
Thanks
Hi,
I haven't dealt with large JSON files before; it takes about 12 hours for me to download the 40 GB compressed JSON file from Wikidata. But I know how to parse and process large XML and JSON data using Python without memory issues. If you give me a chance, I can demonstrate the script processing the large JSON file on my server.
Regards
Hello,
my name is Wolfgang Backhaus, I am a software developer and system engineer from Germany.
I have read your interesting project offer and want to apply for it.
I am a seasoned Perl developer (20+ years) and database administrator/programmer. I would deliver a Perl script that takes the large file size into account.
I am looking forward to working with you.
Best regards
Wolfgang Backhaus
I propose to implement a solution in the Perl language. My tentative plan would be as follows:
1) Create a Linux virtual machine (with a large disk, e.g. >40 GB) for this task, then download and extract the JSON dump.
2) Read the JSON file line by line (rather than all into memory at once). Wikidata states that the dump is designed to facilitate this (i.e. it has one entity per line).
3) Skip invalid or non-person entries as you requested.
Some third-party Perl modules (e.g. JSON) would be required. Can you install them on your machine?
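To make step 2 concrete, the core reading loop would look roughly like this (sketched in Python here for brevity; the Perl version would follow the same pattern). One detail to note is that the dump wraps the entities in a single JSON array, so the bracket lines and each line's trailing comma have to be handled:

import json

def iter_entities(dump_handle):
    """Yield one parsed entity per line of a Wikidata JSON dump, which is
    a JSON array with one entity per line: skip the bracket lines and
    strip the trailing comma before decoding each line on its own."""
    for raw_line in dump_handle:
        line = raw_line.strip()
        if not line or line in (b"[", b"]"):
            continue
        yield json.loads(line.rstrip(b","))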
I will update this bid if I have time to do some testing. In the meantime, please note that this bid is PROVISIONAL: I may need to adjust the price and/or timeframe slightly based on my testing.
I am based in Ireland. Please note the timezone difference with Thailand.
Hi,
I can do it for you. My approach will be:
1) download the Wikidata JSON dump.
2) use wikidata-filter to extract only the human entries.
3) write a parser for the filtered data in Perl to extract the required fields and format them into SQL INSERT statements.
You can then import the SQL file into your database.
A second possible approach would be to use the Wikidata query interface to fetch only the data you want, but there we would fight timeout issues of the API, and it would be less stable.
A third possible approach is to walk through the data on the web, but there we would generate a significant number of queries, which might be identified as bot activity and get us banned.
Hi
Do you want to download all the data as JSON and then process it, or do you want to download it one by one? Do you need the script to run 24x7? I will do it in PHP using MySQL (if you have no preference). A lot depends on how you want to do it. Please message me if interested.
Regards
Priyanshu
Hello, I feel like I can help you get this work done. It should only take a few days at most. I have a lot of relevant experience which would make this quick and painless. Feel free to ask me any questions before making your selection. Thanks!
Hi, do you need quality work?
You may give me a small trial to test my skills.
I am an expert in PHP, Laravel, WordPress, Divi, Elementor, GeneratePress and the Enfold theme.
I can do this job.
Let's talk over the details of this project.
Sincerely Yours
Touhida Sultana
Hi,
I would use Python as the language to process these JSON dumps. I assume you are just looking for a program/script that can do this task, and not for the extracted data itself.
Thanks