Researcher's Wikipedia Big Data Project Shows Globalization Rate
Nerval's Lobster writes "Wikipedia, which features nearly 4 million articles in English alone, is widely considered a godsend for high school students on a tight paper deadline. But for University of Illinois researcher Kalev Leetaru, Wikipedia's volumes of crowd-sourced articles are also an enormous dataset, one he mined for insights into the history of globalization. He made use of Wikipedia's 37GB of English-language data — in particular, the evolving connections between various locations across the globe over a period of years. 'I put every coordinate on a map with a date stamp,' Leetaru told The New York Times. 'It gave me a map of how the world is connected.' You can view the time lapse/data visualization on YouTube."
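The core idea — tag every geographic coordinate mentioned in article text with a date stamp, then bucket the points by year — can be sketched in a few lines. This is a hypothetical illustration, not Leetaru's actual pipeline: the regex patterns, the sample text, and the naive year-to-coordinate association are all invented for the example.

```python
import re
from collections import defaultdict

# Hypothetical patterns: decimal lat/lon pairs and four-digit years.
COORD_RE = re.compile(r'(-?\d{1,2}\.\d+)[,;]\s*(-?\d{1,3}\.\d+)')
YEAR_RE = re.compile(r'\b(1[6-9]\d{2}|20[0-2]\d)\b')

def coords_by_year(text):
    """Return {year: [(lat, lon), ...]} for coordinates found in text."""
    buckets = defaultdict(list)
    years = [int(y) for y in YEAR_RE.findall(text)]
    points = [(float(a), float(b)) for a, b in COORD_RE.findall(text)]
    # Naive association: every coordinate in the text is tagged with
    # every year the text mentions. A real pipeline would use textual
    # proximity or article metadata instead.
    for y in years:
        for p in points:
            buckets[y].append(p)
    return dict(buckets)

sample = "The 1853 expedition reached 40.7128, -74.0060 before returning."
print(coords_by_year(sample))
```

Each yearly bucket of points could then be rendered as one frame of a time-lapse map, which is essentially what the YouTube visualization shows.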
Not "big data" (Score:3, Insightful)
Come on, 37G isn't big data. You'd have a hard time arguing 37TB is big data.
Cool stuff though.
To paraphrase Slashdot... (Score:3, Insightful)
If you're using Wikipedia as a metric to measure anything, you're insane.
Re:the ending of that movie (Score:3, Insightful)
"looks exponential :)"
As much as I'd like to think that means the world is rapidly connecting, it's much more likely due to the fact that Wikipedia has only been around for a decade or so, and people are more inclined to write about things that are happening now (or have happened recently) than about things that happened many years ago.
If Wikipedia had been available for the entirety of those 200 years, and had been consistently popular through that time and uniformly across the globe with no language bias, then the resulting movie would say a lot about globalisation.