
Hurricane Irene: GIS, Social Media, and Big Data Shine

As Katherine Maher pointed out on Twitter, no one gets credit when contingency plans work. And it is truly remarkable how far government-citizen information collaboration has evolved, not to mention the growth of data journalism even in the most traditional news outlets. The average citizen had a wealth of accurate and useful hurricane information to choose from. There was also plenty of raw data for those inclined to tinker, along with very useful Twitter/Flickr/YouTube overlays on Google Earth interfaces. Anyone unlucky enough to turn on a television, by contrast, was assaulted by a wave of hysteria and action shots of news anchors bravely intoning into the camera that it was windy and raining: an astonishing revelation, undoubtedly, to the millions told to evacuate.

Alex Howard of O’Reilly Radar has a very helpful list of Hurricane Irene tracking links from government, media, and crowdsourced sources. Among the highlights:

My personal favorite was ESRI’s full-screen mega-aggregator, built on the ArcGIS API for Flex and running on an ArcGIS server. It combined data from the National Hurricane Center, OpenStreetMap, and Telvent, and let you layer Flickr photos, tweets, and YouTube videos on top of the standard hurricane data. Incidentally, ESRI has entered into a strategic alliance with MetaCarta, which Alex Olesker wrote about in May after attending the Department of Defense Intelligence Information Systems (DoDIIS) conference. MetaCarta’s large database of locations, Olesker notes, is seven times the size of the National Geospatial-Intelligence Agency (NGA)’s gazetteer, making it a natural tool for geospatial analysis. It goes without saying that government Twitter accounts have also been a source of timely and accurate information. A wealth of government agencies, large and small, used Twitter and Facebook to spread the word before, during, and after the height of the hurricane.
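To make that layering pattern concrete, here is a minimal sketch in Python: each data source becomes a GeoJSON point layer, and a map client like ESRI's viewer stacks the layers over a basemap. The sample records below are invented placeholders, not real Irene data.

```python
import json

def to_feature(lon, lat, properties):
    """Wrap a point and its metadata as a GeoJSON Feature."""
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": properties,
    }

# Layer 1: forecast track positions, as published by an authority
# such as the National Hurricane Center (values here are made up).
track_layer = [
    to_feature(-76.2, 34.7, {"source": "NHC", "kind": "forecast_position"}),
    to_feature(-74.9, 38.1, {"source": "NHC", "kind": "forecast_position"}),
]

# Layer 2: geotagged social media (Flickr photos, tweets, YouTube clips).
social_layer = [
    to_feature(-74.0, 40.7, {"source": "Twitter", "text": "Power out downtown"}),
    to_feature(-75.1, 39.9, {"source": "Flickr", "title": "Flooded street"}),
]

# The aggregator simply unions the layers; the client toggles them on and off.
feature_collection = {
    "type": "FeatureCollection",
    "features": track_layer + social_layer,
}

print(json.dumps(feature_collection, indent=2))
```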

The Federal Emergency Management Agency (FEMA) also developed a host of mobile applications for hurricane preparedness and evacuation. Granted, the usual Twitter rumor and innuendo prevailed, but the strong presence of “curated” information from the government helped counteract information overload. I myself had to ignore my usual Twitter and Facebook timelines entirely and focus exclusively on the applications listed above to avoid drowning in data. While government resources at times strained under the weight of unprecedented traffic, performance on the whole was very strong and reflected an internalization of what big data and social media pioneers have been urging for a long time.
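The value of that curation is easy to illustrate. Below is a toy Python filter that keeps only posts from a whitelist of official accounts; the handles and the post structure are hypothetical stand-ins, not a real Twitter API client.

```python
# Hypothetical whitelist of official emergency-management accounts.
OFFICIAL_ACCOUNTS = {"fema", "nhc_atlantic", "notifynyc"}

def curate(posts):
    """Yield only posts whose author is on the official whitelist."""
    for post in posts:
        if post["author"].lower() in OFFICIAL_ACCOUNTS:
            yield post

# A mixed timeline: official guidance interleaved with rumor.
timeline = [
    {"author": "FEMA", "text": "Evacuation routes for Zone A are posted."},
    {"author": "random_user42", "text": "Heard ALL the bridges are closing?!"},
    {"author": "NotifyNYC", "text": "Shelters open at 4 PM in all boroughs."},
]

for post in curate(timeline):
    print(f"{post['author']}: {post['text']}")
```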

The use of social media and other forms of information and communications technology (ICT) during natural disasters has been an object of intense study within the Department of Defense and the interagency community for some time. Dr. Linton Wells II at the Center for Technology and National Security Policy (CTNSP) has worked on developing government capabilities for both crowdsourcing and the broader spectrum of ICT resources in national disasters. A recent April exercise on crowdsourcing and collaborative tools underscored the importance of Wells’s Crowd, Bridge, Transaction, and Feedback model, which culls information from the crowd using a variety of tools, filters it, and passes it to decisionmakers, who use it to act. Feedback from the crowd then evaluates “transactions,” the on-the-ground results of those decisions, to support accountability. Much of the government’s use of crowdsourcing technologies over the last 48 hours exemplified the best aspects of this model.
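For readers who think in code, here is a schematic sketch of that loop. The class names, thresholds, and sample reports are illustrative inventions; the model itself is Wells’s, not this code’s.

```python
from dataclasses import dataclass, field

@dataclass
class Report:
    text: str
    location: str
    confirmations: int = 0  # how many crowd members corroborate it

@dataclass
class Transaction:
    action: str
    based_on: Report
    feedback: list = field(default_factory=list)

def bridge(reports, min_confirmations=2):
    """The 'bridge' filters raw crowd input before it reaches decisionmakers."""
    return [r for r in reports if r.confirmations >= min_confirmations]

def decide(report):
    """Decisionmakers turn a vetted report into an on-the-ground action."""
    return Transaction(action=f"Dispatch crew to {report.location}", based_on=report)

# Crowd: raw, unvetted reports gathered through a variety of tools.
crowd_reports = [
    Report("Downed power line", "Elm St", confirmations=3),
    Report("Possible flooding", "Main St", confirmations=1),  # filtered out
]

transactions = [decide(r) for r in bridge(crowd_reports)]

# Feedback: the crowd evaluates the result, closing the accountability loop.
for t in transactions:
    t.feedback.append("Crew arrived; hazard cleared")
    print(t.action, "->", t.feedback)
```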


Original post

One Comment

Bob Gourley

Dear friends: just to clarify, Adam Elkus wrote the post above. The automated feed to govloop makes it look like I did that and, although I wish I could write this good, I really can’t!

Bob