A thorough step-by-step tutorial on scraping data from Wikipedia and displaying it on Google Maps:
It ties together beautifully what’s been readily available on the web for a while now – a simple ImportHtml function in Google Spreadsheets, subscribing to an RSS feed of the Wikipedia data, Yahoo Pipes to read and geocode the data and spit it out as a KML file, and finally importing the KML into Google Maps.
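The spreadsheet step is a one-liner of the form `=IMPORTHTML(url, "table", index)`, which pulls a numbered table off a web page into your sheet. The middle of the pipeline – reading a table and reshaping the rows into KML placemarks – can be sketched in a few lines of Python. This is a rough stand-in for what the Pipes stage does, not the actual Pipes configuration; the sample table, its column names, and the city coordinates are all hypothetical placeholders for a live Wikipedia fetch.

```python
# Sketch of the table-to-KML step: parse an HTML table of
# name/lat/lon rows and emit a KML document with one Placemark
# per row. SAMPLE_TABLE is a hypothetical stand-in for a table
# fetched live from Wikipedia.
from html.parser import HTMLParser
import xml.etree.ElementTree as ET

SAMPLE_TABLE = """
<table>
  <tr><th>City</th><th>Lat</th><th>Lon</th></tr>
  <tr><td>Reykjavik</td><td>64.13</td><td>-21.90</td></tr>
  <tr><td>Wellington</td><td>-41.29</td><td>174.78</td></tr>
</table>
"""

class TableParser(HTMLParser):
    """Collect each <tr> as a list of cell strings."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], None, False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True
            self._row.append("")

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = None
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row[-1] += data.strip()

def table_to_kml(html):
    """Turn a name/lat/lon HTML table into a KML document string."""
    parser = TableParser()
    parser.feed(html)
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    for name, lat, lon in parser.rows[1:]:  # skip the header row
        pm = ET.SubElement(doc, "Placemark")
        ET.SubElement(pm, "name").text = name
        point = ET.SubElement(pm, "Point")
        # KML coordinates are longitude,latitude – easy to get backwards
        ET.SubElement(point, "coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")

print(table_to_kml(SAMPLE_TABLE))
```

Save the output to a `.kml` file and Google Maps will happily import it; in the live pipeline, Pipes regenerates that file whenever the feed updates, which is what makes the map self-refreshing.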
The result is a Google Map that updates live as changes are made on Wikipedia! It’s the first example of this kind that I’ve seen, though there are still some potential pitfalls to the method.
Relying on so many services to always be up and running is a concern (ok, so maybe I have never seen Wikipedia, Google, or Yahoo be ‘down’, but you get the idea). If one goes down, your map is broken. Or perhaps the Wikipedia table changes its structure or is moved to a different page: once again, your map is gone.
With that said, there’s still something brewing here with data scraping and the connectivity seen in the above example. Can this method be used one day to support the geoweb?