Crisis Data Management

Ruby day 6: clicking starts

http://www.gov.ph/crisis-response/typhoon-ruby/#section-1

7am. Wake at 6:30 am – check self for hangover, then check overnight Skype/email traffic. Andrej from UNOCHA has posted a whole pile of links to existing UNOCHA datastores (outside HDX), and a link to 2013 population estimates (up to now, we’ve only found 2010 figures). Realise that there isn’t a long list of datastores for the digital humanitarians (I’ve been sending links out, but they’re behind a group’s firewall). And the HDX team now has a Ruby page at https://data.hdx.rwlabs.org/ruby.

The MicroMappers deployments have started (if you’re reading this and want something to do – that!). I see a message that http://ourairports.com/countries/PH/ needs updating – Andrej has a link for that too. Am asked to lead one of the remote mapping teams… point out that I might be a little short on internet soon. One of the local mappers that the remote team has been worrying about has made it to a cellphone signal – he’s not only okay, he’s also volunteering in the regional disaster response office… yay, go local mappers etc!

This morning I’m planning to do a little micromapping (categorising incoming tweets) and scrape all the lat/long geolocation data from the Typhoon Pablo/Yolanda Ushahidi sites (I have code for this, and will leave a copy on GitHub). And drink lots of coconut water.

I do a little micromapping, classifying tweets.  This is still done by hand, even though the auto-classification algorithms are getting better with every disaster. The categories we’re sorting tweets into are:

  • Requests for Help / Needs – Something (e.g. food, water, shelter) or someone (e.g. volunteers, doctors) is needed
  • Infrastructure Damage – Houses, buildings, roads damaged or utilities such as water, electricity, interrupted.
  • Humanitarian Aid Provided – Affected populations receiving food, water, shelter, medication, etc. from humanitarian/emergency response organizations.
  • Other Relevant Information – Informative for emergency/humanitarian response, but in none of the above categories, including weather/evacuations/etc.
  • Not Informative – Not related to the Typhoon, or not relevant for emergency/humanitarian response.
  • N/A: does not apply, cannot judge – Not in English, not readable, or in none of the above categories.

I’m tempted to ask the micromapping team for a filter that removes any tweet with the word “pray” or “god” in it. Every time, this traffic happens: Filipinos and Americans start to pray; Brits start with the dark humour. At least I haven’t seen any spam on the hashtags yet – selling handbags and Ugg boots after a disaster seems to be a very popular online activity.

Start seeing requests from people who want to see what’s going on. I move the list of websites about Ruby off a group site and into an HDX page. Do much annoying link-checking because URLs don’t copy over. Rose, who always has useful stuff to add, posts a list of typhoon landfall times (6 in total) in PHT, EAT, GMT and EST. Timezones are both the digital humanitarians’ strength (always someone awake) and weakness (all those time conversions).

I hunker down with some Ushahidi API code. I find a bug in my Ushahidi Python code that’s been hanging around for a while: thank you, Ruby, for supporting my dayjob. The code works (it’s at https://github.com/bodacea/icanhaz/blob/master/rubytyphoon.py if you want to use it too); the results are in HDX at https://data.hdx.rwlabs.org/dataset/geolocations-from-digital-humanitarian-deployments.
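
The scraping itself is simple enough to sketch here. To be clear, this is not the script linked above – just a minimal illustration, assuming a standard Ushahidi “Classic” (v2) deployment whose /api?task=incidents endpoint returns a JSON payload with locationlatitude/locationlongitude fields (field names can vary by deployment, so check yours first):

```python
import csv
import json
from urllib.request import urlopen

def fetch_incidents(base_url):
    """Fetch all incident reports from an Ushahidi Classic deployment.

    Assumes the standard ?task=incidents endpoint; customised
    deployments may differ.
    """
    with urlopen(base_url.rstrip("/") + "/api?task=incidents") as resp:
        return json.load(resp)

def extract_latlongs(api_response):
    """Pull (id, lat, lon, title) tuples out of the API payload."""
    rows = []
    for item in api_response.get("payload", {}).get("incidents", []):
        inc = item.get("incident", {})
        rows.append((inc.get("incidentid"),
                     inc.get("locationlatitude"),
                     inc.get("locationlongitude"),
                     inc.get("incidenttitle")))
    return rows

def write_csv(rows, path):
    """Dump the geolocations to a CSV ready for upload to HDX."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "lat", "lon", "title"])
        writer.writerows(rows)
```

Chain the three together (fetch, extract, write) and you have a one-shot scraper per deployment; the fiddly part in practice is deployments that paginate or rename fields.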

See lots of notes about missing aerial imagery; ask if anyone knows where SkyEye is operating today. Told to ask Celina (who’s asleep upstairs at the moment, exhausted).

10am. I learn stuff every time with these guys. Today it’s how easy reverse image searches are (photos from Haiyan are cropping up in the social media streams): use Google image search or tineye.com, or right-click in Chrome and use “Search Google for this image”. My dayjob colleague Michelle checks in – she’s the only person there who’s asked how I’m doing, and none of my Ruby-related emails seem to have got through. I guess they think I can look after myself. Am doing what I can today because I’m back to the dayjob tomorrow – if you need anything from here, today is a good time to ask!

12pm. Brunch at Legaspi market. Hear people talking about the typhoon – calm, sounds more like gossip than panic. Bump into a bunch of INGO folks who’ve just come out of their coordination meeting. Eat great hot pad thai with homemade ginger beer; buy some Christmas presents. Bump into an agency person who needs some of the datasets that I’ve been helping with, and others from Andrej’s list this morning.

2pm. See yet more datasets hiding behind firewalls, some of them the data needed above. Will keep trying to persuade people to share through HDX-linked googledocs, but watching replication between groups is annoying. See that the HOT OSM instructions need a bit of gardening: try to create an account (I always forget my name and password there) but get “Account creation error. Your IP address is listed as an open proxy in the DNSBL used by OpenStreetMap Wiki. You cannot create an account”. I try to recover my account name by hitting “forgot password”, but get “Your IP address is blocked from editing, and so is not allowed to use the password recovery function to prevent abuse”. Wonder how often that happens to people in developing countries. The HOT OSM team are wonderful about this, with suggestions of how best to deal with it.

New link comes in: the Philippines government’s Ruby page. The Google crisis map on it includes all the shelters across the islands (from DSWD data) – it’s beautiful, but I can’t get the KML download to work to cross-check the list against ones acquired earlier (and take down old datasets as needed). Someone asks about PDF-to-spreadsheet conversion software (please stop with the PDFs guys – please!); we recommend Tabula for the easy ones and CometDocs for the difficult ones. I run the doc through CometDocs and upload it to HDX just in case anyone else needs it (poverty figures for the Philippines).

4pm. Link the field team with GPS locations up with the OpenStreetMappers – sit back and enjoy watching them coordinate. Check work chats – find some sweet comments from teammates on my note about Ruby, which is very heartening: perhaps I’m not forgotten after all (and to be fair, I’d forgotten about an offer of help there a day or two ago). Download the evacuation centre list from the Google crisis map, then convert it from KML to CSV (the command to use is “ogr2ogr -f CSV evacuation_centres.csv Typhoon\ RubyHagupit\ Evacuation\ Center\ Map.kml -lco GEOMETRY=AS_XY”; there’s a great ogr2ogr cheat sheet at https://github.com/dwtkns/gdal-cheat-sheet). Try to go for a nap.

Darn. Message from the Red Cross: the Google map is missing evacuation centres on a couple of islands. Look over the two lists: there are other centres on NGO lists that aren’t on the map either. Am now very sleepy: post notes in chat, add the new shelter names to the googledoc, drop a note in chat for the Google team and ask Jus to help fix it. Celina wakes up and starts a bridge between the areas that OSM is missing satellite data for and the local UAV team. Finally go for nap.

8pm. Wake up to see the audit trail behind the DSWD shelters map in one of the chats: everyone’s contributed a part to it, and mysteries are solved. See a request for a list of towns/villages that the storm has passed through; wonder if it’s possible to estimate that from a track + width (I know the storm walls are destructive, but I’m not sure how far the damage area would extend). Also see a request for road blockage data. Maning points to photos on a local news site (ABS-CBN); looks like home did after Sandy.
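
For what it’s worth, a crude version of that track-plus-width estimate needs nothing beyond the haversine formula: sample points along the storm track, and count a town as affected if it sits within half the swath width of any sampled point. A rough sketch (the place names are made up, and real damage zones are asymmetric and depend on storm structure, so treat this as a first cut only):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # mean Earth radius

def towns_in_swath(track, towns, width_km):
    """Return names of towns within width_km/2 of any sampled track point.

    track: list of (lat, lon) storm-centre fixes
    towns: list of (name, lat, lon)
    """
    half = width_km / 2.0
    return [name for name, tlat, tlon in towns
            if any(haversine_km(tlat, tlon, plat, plon) <= half
                   for plat, plon in track)]
```

The denser the track sampling, the less you under-count towns that fall between fixes; interpolating extra points along each track segment fixes that cheaply.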

Darn again: the Philippines government has UAV regulations, and not all local companies have got the licenses from CAAP to fly post-typhoon (Andrej points at the UAV links on UNOCHA’s response page). Don’t tell me we have to break out the balloons and kites again? We check with the UAV teams: they’re cleared.

10pm. Pierre from OSM has the typhoon track on a map… off to turn that into a list of towns in a 50-mile swath. Am exhausted and my eyes are closing reading the Overpass API docs (although the test window for this is cool): post the task in SBTF’s geolocation window for another mapper to pick up. Celina is talking to the UAV guys and mapping teams, getting lists of places where imagery is needed (damaged areas yes, but also areas that were covered by cloud in the satellite images). Go off to bed after midnight, leaving Celina still hunched over her laptop.
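
For anyone picking up that task, here’s a hedged sketch of one way to do it with the Overpass API: build an Overpass QL query for place=town/village nodes inside a bounding box around the track (50 miles is roughly 80 km either side), and POST it to a public Overpass endpoint. The endpoint URL and bounding box below are illustrative, not the real swath:

```python
import json
from urllib.request import urlopen
from urllib.parse import urlencode

OVERPASS_URL = "http://overpass-api.de/api/interpreter"

def swath_query(south, west, north, east):
    """Overpass QL query for town/village nodes inside a bounding box."""
    return ('[out:json][timeout:60];'
            'node["place"~"^(town|village)$"]'
            f'({south},{west},{north},{east});'
            'out body;')

def fetch_places(bbox):
    """POST the query to a public Overpass endpoint; return sorted names."""
    data = urlencode({"data": swath_query(*bbox)}).encode()
    with urlopen(OVERPASS_URL, data=data) as resp:
        result = json.load(resp)
    return sorted({el["tags"].get("name", "?")
                   for el in result.get("elements", [])})
```

A bounding box over-counts towns at the corners compared with a true buffered track; filtering the results with a distance check against the track (as in the haversine sketch earlier in the day) would tighten it up.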