Where 2.0 day two
Steve Morris gave an interesting presentation describing the "disappearing data problem." He explained how public data, geospatial data in particular, is rotting away on decomposing physical media and in incompatible or proprietary file formats. There's a lot of it, and whether it will remain usable is a major challenge. There are a handful of data translation/conversion/revitalization firms out there, contracting to bring old data into the present and to attempt to future-proof it.
Someone mentioned that someone else is looking at combination SD storage/WiFi cards that would automatically encode lat/lng into the images a camera takes, with the lat/lng derived via WiFi node triangulation or a simpler MAC address location lookup. Furthermore, the card could transmit the images to the network for the user. Way cool idea. I wonder how much of a battery drain the cards would be.
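To make the triangulation idea concrete, here's a minimal sketch of the kind of lookup such a card might do: scan for nearby access points, look their MAC addresses up in a surveyed location database, and take a signal-strength-weighted centroid. The database, field names, and crude dBm-to-weight conversion here are all my own illustrative assumptions, not any real product's design.

```javascript
// Hypothetical database of surveyed access points: MAC address -> lat/lng.
// In practice this would be a large wardriving-style dataset, not a literal.
const apDatabase = {
  "00:11:22:33:44:55": { lat: 45.5231, lng: -122.6765 },
  "66:77:88:99:aa:bb": { lat: 45.5240, lng: -122.6772 },
  "cc:dd:ee:ff:00:11": { lat: 45.5226, lng: -122.6780 }
};

// Weighted centroid: stronger signals pull the estimate toward that AP.
// `scans` is a list of { mac, dbm } readings from a WiFi scan.
function estimatePosition(scans, db) {
  let totalWeight = 0, latSum = 0, lngSum = 0;
  for (const { mac, dbm } of scans) {
    const ap = db[mac];
    if (!ap) continue; // unknown AP: skip it
    // Crude weighting: dBm is negative (-40 strong, -90 weak), so
    // shift it positive. A real system would model path loss properly.
    const weight = Math.max(100 + dbm, 1);
    latSum += ap.lat * weight;
    lngSum += ap.lng * weight;
    totalWeight += weight;
  }
  if (totalWeight === 0) return null; // no known APs in range
  return { lat: latSum / totalWeight, lng: lngSum / totalWeight };
}

// Usage: a strong reading from one AP and a weak one from another.
const fix = estimatePosition(
  [{ mac: "00:11:22:33:44:55", dbm: -45 },
   { mac: "66:77:88:99:aa:bb", dbm: -80 }],
  apDatabase
);
// `fix` lands between the two APs, weighted toward the stronger one.
```

A pure MAC-address lookup is the degenerate case: one known AP, so the "centroid" is just that AP's surveyed location.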
Ron Langhelm from FEMA did a great job illustrating how big a challenge disaster relief can be when you show up to a location sans power, clean water, streets, or infrastructure in general. His job is to act as an incident cartographer. Imagine showing up in New Orleans during the Katrina flooding in 2005, having to update various agencies on the extent of the damage and predict how much worse things are going to get (e.g. how much further the water will spread, in 2D, and how deep it will get, in 3D). They use a variety of off-the-shelf tools/products, and build whatever else they need, in the field.
I had a good conversation with someone at Mozilla who has built a JavaScript object that understands GPS (constrained, of course, by whatever GPS capabilities the device running Gecko has; generally very limited right now). I urged him to carve out some time at the next Where 2.0 to talk about the GPS service. What a great way to get developers thinking about location via traditional web development tools/languages!
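For a feel of what "a JavaScript object that understands GPS" could mean to a web developer, here's a sketch. The `GpsService` name and its methods are my own hypothetical illustration, not Mozilla's actual interface; the callback shape loosely resembles what the W3C later standardized as the Geolocation API (`navigator.geolocation.getCurrentPosition`).

```javascript
// Hypothetical wrapper around whatever position source the device
// offers: a real GPS chip, or a WiFi/MAC-lookup fallback on devices
// without one. `provider` is any object with a readFix() method.
function GpsService(provider) {
  this.provider = provider;
}

// Async-callback style, so a page never blocks waiting on a fix.
GpsService.prototype.getCurrentPosition = function (onSuccess, onError) {
  const fix = this.provider.readFix(); // hypothetical device call
  if (fix) {
    onSuccess({
      coords: { latitude: fix.lat, longitude: fix.lng },
      timestamp: Date.now()
    });
  } else if (onError) {
    onError(new Error("no GPS fix available"));
  }
};

// Usage with a stub provider, as a demo page might do:
const gps = new GpsService({
  readFix: () => ({ lat: 37.77, lng: -122.42 })
});
gps.getCurrentPosition(pos => {
  console.log("You are at", pos.coords.latitude, pos.coords.longitude);
});
```

The appeal is exactly what the post suggests: plain JavaScript, no plugins, so any web developer can start playing with location.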