Tag Archives: Haiti

Personal Reflections: 3 Years After the Haiti Earthquake

The devastating earthquake that struck Port-au-Prince on January 12, 2010 killed as many as 200,000 people. My fiancée and five close friends were in Haiti at the time and narrowly escaped a collapsing building. They were some of the lucky few survivors. But I had no knowledge that they had survived until some 8 hours after the earthquake because we were unable to get any calls through. The Haiti Crisis Map I subsequently spearheaded still stands as the most psychologically and emotionally difficult project I’ve ever been a part of.

The heroes of this initiative and the continuing source of my inspiration today were the hundreds and hundreds of volunteers who ensured the Haiti Crisis Map remained live for so many weeks. The majority of these volunteers were of course the Haitian Diaspora as well as Haitians in country. I had the honor of meeting and working with one of these heroes while in Port-au-Prince, Kurt Jean-Charles, the CEO of the Haitian software company Solutions.ht. I invited Kurt to give the Keynote at the 2010 International Crisis Mappers Conference (ICCM 2010) and highly recommend watching the video above. Kurt speaks directly from the heart.


Another personal hero of mine (pictured above) is Sabina Carlson—now Sabina Carlson Robillard following her recent wedding to Louino in Port-au-Prince! She volunteered as the Haitian Diaspora Liaison for the Haiti Crisis Map and has been living in Cité Soleil ever since. Needless to say, she continues to inspire all of us who have had the honor of working with her and learning from her.

Finally, but certainly not (!) least, the many, many hundreds of amazing volunteers who tirelessly translated tens of thousands of text messages for this project. Thanks to you, some 1,500 messages from the disaster-affected population were added to the live crisis map of Haiti. This link points to the only independent, rigorous and professional evaluation of the project that exists. I highly recommend reading this report as it comprises a number of important lessons learned in crisis mapping and digital humanitarian response.


In the meantime, please consider making a donation to Fonkoze, an outstanding local organization committed to the social and economic improvement of the Haitian poor. Fonkoze is close to my heart not only because of the great work that they do but also because its staff and CEO were the ones who ensured the safe return of my fiancée and friends after the earthquake. In fact, my fiancée has continued to collaborate with them ever since and still works on related projects in Haiti. She is headed back to Port-au-Prince this very weekend. To make a tax deductible donation to Fonkoze, please visit this link. Thank you.

My thoughts & prayers go out to all those who lost loved ones in Haiti years ago.

Surprising Findings: Using Mobile Phones to Predict Population Displacement After Major Disasters

Rising concerns over the consequences of mass refugee flows during several crises in the late 1970s are what prompted the United Nations (UN) to call for the establishment of early warning systems for the first time. “In 1978-79 for example, the United Nations and UNHCR were clearly overwhelmed by and unprepared for the mass influx of Indochinese refugees in South East Asia. The number of boat people washed onto the beaches there seriously challenged UNHCR’s capability to cope. One of the issues was the lack of advance information. The result was much human suffering, including many deaths. It took too long for emergency assistance by intergovernmental and non-governmental organizations to reach the sites” (Druke 2012 PDF).

More than three decades later, my colleagues at Flowminder are using location data from mobile phones to nowcast and predict population displacement after major disasters. Focusing on the devastating 2010 Haiti earthquake, the team analyzed the movement of 1.9 million mobile users before and after the earthquake. Naturally, the Flowminder team expected that the mass exodus from Port-au-Prince would be rather challenging to predict. Surprisingly, however, the predictability of people’s movements remained high and even increased during the three-month period following the earthquake.

The team just released their findings in a peer-reviewed study entitled: “Predictability of population displacement after the 2010 Haiti earthquake” (PNAS 2012). As the analysis reveals, “the destinations of people who left the capital during the first three weeks after the earthquake was highly correlated with their mobility patterns during normal times, and specifically with the locations in which people had significant social bonds, as measured by where they spent Christmas and New Year holidays” (PNAS 2012).

“For the people who left Port-au-Prince, the duration of their stay outside the city, as well as the time for their return, all followed a skewed, fat-tailed distribution. The findings suggest that population movements during disasters may be significantly more predictable than previously thought” (PNAS 2012). Intriguingly, the analysis also revealed that the period of time people in Port-au-Prince waited to leave the city (and then return) was “power-law distributed, both during normal days and after the earthquake, albeit with different exponents” (PNAS 2012). Clearly then, “[p]eople’s movements are highly influenced by their historic behavior and their social bonds, and this fact remained even after one of the most severe disasters in history” (PNAS 2012).
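To make the statistics concrete, here is a minimal Python sketch of how a power-law exponent can be recovered from waiting-time data using the textbook continuous maximum-likelihood estimator. The function name and the data are illustrative assumptions on my part; the waiting times below are synthetic, and this is not Flowminder’s actual pipeline:

```python
import math
import random

def fit_power_law_alpha(samples, xmin=1.0):
    """Continuous maximum-likelihood estimate of a power-law exponent:
    alpha = 1 + n / sum(ln(x_i / xmin)), over the samples x_i >= xmin."""
    tail = [x for x in samples if x >= xmin]
    if not tail:
        raise ValueError("no samples at or above xmin")
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic "days waited before leaving the city", drawn from a power law
# with exponent 2.5 via inverse-transform sampling.
random.seed(42)
true_alpha = 2.5
waits = [(1 - random.random()) ** (-1 / (true_alpha - 1)) for _ in range(50_000)]

alpha_hat = fit_power_law_alpha(waits, xmin=1.0)
```

With enough samples the estimate lands close to the true exponent; on real displacement data one would also estimate xmin and test goodness-of-fit rather than assume the functional form, as the PNAS authors do far more carefully.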


I wonder how this approach could be used in combination with crowdsourced satellite imagery analysis on the one hand and with Agent Based Models on the other. In terms of crowdsourcing, I have in mind the work carried out by the Standby Volunteer Task Force (SBTF) in partnership with UNHCR and Tomnod in Somalia last year. SBTF volunteers (“Mapsters”) tagged over a quarter million features that looked like IDP shelters in under 120 hours, yielding a triangulated count of approximately 47,500 shelters.
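The triangulation step can be sketched in a few lines: keep a candidate shelter only if several independent volunteers tagged roughly the same spot. The grid-binning and threshold below are hypothetical simplifications of my own, not Tomnod’s actual consensus algorithm:

```python
from collections import defaultdict

def triangulate_tags(tags, cell_size=0.001, min_taggers=3):
    """Toy consensus step for crowdsourced feature tagging: bin
    (lat, lon, volunteer_id) tags into grid cells and keep only cells
    tagged independently by at least `min_taggers` distinct volunteers."""
    cells = defaultdict(set)
    for lat, lon, volunteer in tags:
        key = (round(lat / cell_size), round(lon / cell_size))
        cells[key].add(volunteer)
    return [key for key, voters in cells.items() if len(voters) >= min_taggers]

# Hypothetical tags: three volunteers agree on one spot, one tag is unconfirmed.
tags = [
    (2.0451, 45.3182, "v1"), (2.0451, 45.3183, "v2"), (2.0452, 45.3182, "v3"),
    (2.1000, 45.4000, "v1"),  # only one volunteer tagged this spot
]
shelters = triangulate_tags(tags)
```

The key design choice is that agreement is counted per volunteer, not per tag, so one enthusiastic tagger clicking repeatedly cannot manufacture consensus.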

In terms of Agent Based Models (ABMs), some colleagues and I worked on “simulating population displacements following a crisis” back in 2006 while at the Santa Fe Institute (SFI). We decided to use an Agent Based Model because the data on population movement was simply not within our reach. Moreover, we were particularly interested in modeling movements of ethnic populations after a political crisis and thus within the context of a politically charged environment.

So we included a preference for “safety in numbers” within the model. This parameter can easily be tweaked to reflect a preference for moving to locations that allow for the maintenance of social bonds as identified in the Flowminder study. The figure above lists all the parameters we used in our simple decision theoretic model.
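A toy version of that decision step, assuming Python and entirely made-up location data (this is a sketch of the general idea, not our original SFI model): each agent picks a destination with probability weighted by the share of co-ethnics already there, and the `weight` parameter plays the role of the “safety in numbers” preference.

```python
import random

def choose_destination(agent_group, locations, weight=2.0, rng=random):
    """One decision step of a toy displacement model: pick a destination
    with probability proportional to 1 + weight * (co-ethnic share there).
    Raising `weight` strengthens the "safety in numbers" preference."""
    scores = [1.0 + weight * loc["groups"].get(agent_group, 0) / max(1, loc["pop"])
              for loc in locations]
    r = rng.random() * sum(scores)
    for loc, s in zip(locations, scores):
        r -= s
        if r <= 0:
            return loc["name"]
    return locations[-1]["name"]

# Hypothetical destinations: "A" is 90% co-ethnic, "B" only 10%.
locations = [
    {"name": "A", "pop": 100, "groups": {"blue": 90}},
    {"name": "B", "pop": 100, "groups": {"blue": 10}},
]
random.seed(7)
moves = [choose_destination("blue", locations) for _ in range(1000)]
```

Reinterpreting the co-ethnic share as the strength of pre-existing social bonds (where people spent the holidays, say) would nudge this toy model toward the behavior the Flowminder study observed.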

The output below depicts the Agent Based Model in action. The multi-colored panels on the left depict the geographical location of ethnic groups at a certain period of time after the crisis escalates. The red panels on the right depict the underlying social networks and bonds that correspond to the geographic distribution just described. The main variable we played with was the size or magnitude of the sudden onset crisis to determine whether and how people might move differently around various ethnic enclaves. The study, along with the results, is available in this PDF.

In sum, it would be interesting to carry out Flowminder’s analysis in combination with crowdsourced satellite imagery analysis and live sensor data feeding into an Agent Based Model. Dissertation, anyone?

Disaster Response, Self-Organization and Resilience: Shocking Insights from the Haiti Humanitarian Assistance Evaluation

Tulane University and the State University of Haiti just released a rather damning evaluation of the humanitarian response to the 2010 earthquake that struck Haiti on January 12th. The comprehensive assessment, which takes a participatory approach and applies a novel resilience framework, finds that despite several billion dollars in “aid”, humanitarian assistance did not make a detectable contribution to the resilience of the Haitian population and in some cases increased certain communities’ vulnerability and even caused harm. Welcome to supply-side humanitarian assistance directed by external actors.

“All we need is information. Why can’t we get information?” A quote taken from one of many focus groups conducted by the evaluators. “There was little to no information exchange between the international community tasked with humanitarian response and the Haitian NGOs, civil society or affected persons / communities themselves.” Information is critical for effective humanitarian assistance, which should include two objectives: “preventing excess mortality and human suffering in the immediate, and in the longer term, improving the community’s ability to respond to potential future shocks.” This longer term objective thus focuses on resilience, which the evaluation team defines as follows:

“Resilience is the capacity of the affected community to self-organize, learn from and vigorously recover from adverse situations stronger than it was before.”

This link between resilience and capacity for self-organization is truly profound and incredibly important. To be sure, the evaluation reveals that “the humanitarian response frequently undermined the capacity of Haitian individuals and organizations.” This completely violates the Hippocratic Oath of Do No Harm. The evaluators thus “promote the attainment of self-sufficiency, rather than the ongoing dependency on standard humanitarian assistance.” Indeed, “focus groups indicated that solutions to help people help themselves were desired.”

I find it particularly telling that many aid organizations interviewed for this assessment were reluctant to assist the evaluators in fully capturing and analyzing resource flows, which are critical for impact evaluation. “The lack of transparency in program dispersal of resources was a major constraint in our research of effective program evaluation.” To this end, the evaluation team argue that “by strengthening Haitian institutions’ ability to monitor and evaluate, Haitians will more easily be able to track and monitor international efforts.”

I completely disagree with this remedy. The institutions are part of the problem, and besides, institution-building takes years if not decades. To assume there is even political will and the resources for such efforts is at best misguided. If resilience is about strengthening the capacity of affected communities to self-organize, then I would focus on just that, applying existing technologies and processes that both catalyze and facilitate demand-side, people-centered self-organization. My previous blog post on “Technology and Building Resilient Societies to Mitigate the Impact of Disasters” elaborates on this point.

In sum, “resilience is the critical link between disaster and development; monitoring it will ensure that relief efforts are supporting, and not eroding, household and community capabilities.” This explains why crowdsourcing and data mining efforts like those of Ushahidi, HealthMap and UN Global Pulse are important for disaster response, self-organization and resilience.

Twitter, Crises and Early Detection: Why “Small Data” Still Matters

My colleagues John Brownstein and Rumi Chunara at Harvard University’s HealthMap project are continuing to break new ground in the field of Digital Disease Detection. Using data obtained from tweets and online news, the team was able to identify a cholera outbreak in Haiti weeks before health officials acknowledged the problem publicly. Meanwhile, my colleagues from UN Global Pulse partnered with Crimson Hexagon to forecast food prices in Indonesia by carrying out sentiment analysis of tweets. I had actually written this blog post on Crimson Hexagon four years ago to explore how the platform could be used for early warning purposes, so I’m thrilled to see this potential realized.

There is a lot that intrigues me about the work that HealthMap and Global Pulse are doing. But one point that really struck me vis-a-vis the former is just how little data was necessary to identify the outbreak. To be sure, not many Haitians are on Twitter and my impression is that most humanitarians have not really taken to Twitter either (I’m not sure about the Haitian Diaspora). This would suggest that accurate, early detection is possible even without Big Data; even with “Small Data” that is neither representative nor verified. (Interestingly, Rumi notes that the Haiti dataset is actually larger than datasets typically used for this kind of study.)

In related news, a recent peer-reviewed study by the European Commission found that the spatial distribution of crowdsourced text messages (SMS) following the earthquake in Haiti was strongly correlated with building damage. Again, the dataset of text messages was relatively small. And again, this data was neither collected using random sampling (i.e., it was crowdsourced) nor was it verified for accuracy. Yet the analysis of this small dataset still yielded some particularly interesting findings that have important implications for rapid damage detection in post-emergency contexts.

While I’m no expert in econometrics, what these studies suggest to me is that detecting change over time is ultimately more critical than having a large-N dataset, let alone one that is obtained via random sampling or even vetted for quality control purposes. That doesn’t mean that the latter factors are not important; it simply means that the outcome of the analysis is relatively less sensitive to these specific variables. Changes in the baseline volume/location of tweets on a given topic appear to be strongly correlated with offline dynamics.
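A bare-bones illustration of why a stable baseline can matter more than sample size: flag a day as anomalous when its message count deviates sharply from a short historical baseline. This is a deliberately minimal sketch (a z-score against ten days of counts I invented), not HealthMap’s actual detection model:

```python
import statistics

def detect_change(baseline_counts, todays_count, threshold=3.0):
    """Flag today's message volume as anomalous if it deviates from the
    baseline mean by more than `threshold` standard deviations."""
    mean = statistics.fmean(baseline_counts)
    sd = statistics.stdev(baseline_counts) or 1.0  # guard against zero variance
    z = (todays_count - mean) / sd
    return z, abs(z) >= threshold

# Hypothetical daily counts of, say, cholera-related tweets from Haiti.
# Note how tiny the numbers are: the signal is the jump, not the volume.
baseline = [2, 3, 1, 2, 4, 2, 3, 2, 3, 2]
z, alarm = detect_change(baseline, 15)
```

Fifteen messages would be statistically worthless as a sample of public opinion, yet as a departure from a baseline of two or three per day it is an unmistakable signal, which is the small-data point made above.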

What are the implications for crowdsourced crisis maps and disaster response? Could similar statistical analyses be carried out on Crowdmap data, for example? How small can a dataset be and still yield actionable findings like those mentioned in this blog post?

Some Thoughts on Real-Time Awareness for Tech@State

I’ve been invited to present at Tech@State in Washington DC to share some thoughts on the future of real-time awareness. So I thought I’d use my blog to brainstorm and invite feedback from iRevolution readers. The organizers of the event have shared the following questions with me as a way to guide the conversation: Where is all of this headed? What will social media look like in five to ten years and what will we do with all of the data? Knowing that the data stream can only increase in size, what can we do now to prepare and prevent being overwhelmed by the sheer volume of data?

These are big, open-ended questions, and I will only have 5 minutes to share some preliminary thoughts. I shall thus focus on how time-critical crowdsourcing can yield real-time awareness and expand from there.

Two years ago, my good friend and colleague Riley Crane won DARPA’s $40,000 Red Balloon Competition. His team at MIT found the location of 10 weather balloons hidden across the continental US in under 9 hours. The US covers more than 3.7 million square miles and the balloons were barely 8 feet wide. This was truly a needle-in-the-haystack kind of challenge. So how did they do it? They used crowdsourcing and leveraged social media—Twitter in particular—by using a “recursive incentive mechanism” to recruit thousands of volunteers to the cause. This mechanism would basically reward individual participants financially based on how important their contributions were to the location of one or more balloons. The result? Real-time, networked awareness.
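The halving structure of that mechanism is simple to express in code. The amounts below follow the scheme as publicly described (the finder of a balloon earned $2,000, their inviter $1,000, and so on up the referral chain), but treat this as an illustrative sketch rather than DARPA’s or the MIT team’s exact accounting:

```python
def payouts(referral_chain, finder_prize=2000.0):
    """Recursive incentive mechanism: the finder earns finder_prize,
    the person who recruited them half that, their recruiter a quarter,
    and so on up the chain (ordered finder -> root inviter)."""
    rewards = {}
    amount = finder_prize
    for person in referral_chain:
        rewards[person] = amount
        amount /= 2
    return rewards

chain = ["finder", "inviter", "grand_inviter"]
rewards = payouts(chain)
```

The geometric decay is the clever part: because the total payout per balloon is bounded (it can never exceed twice the finder’s prize), recruiting more people is always rational and never dilutes anyone’s reward, which is what drove the explosive recruitment.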

Around the same time that Riley and his team celebrated their victory at MIT, another novel crowdsourcing initiative was taking place just a few miles away at The Fletcher School. Hundreds of students were busy combing through social and mainstream media channels for actionable and mappable information on Haiti following the devastating earthquake that had struck Port-au-Prince. This content was then mapped on the Ushahidi-Haiti Crisis Map, providing real-time situational awareness to first responders like the US Coast Guard and US Marine Corps. At the same time, hundreds of volunteers from the Haitian Diaspora were busy translating and geo-coding tens of thousands of text messages from disaster-affected communities in Haiti who were texting in their location & most urgent needs to a dedicated SMS short code. Fletcher School students filtered and mapped the most urgent and actionable of these text messages as well.

One year after Haiti, the United Nations’ Office for the Coordination of Humanitarian Affairs (OCHA) asked the Standby Volunteer Task Force (SBTF), a global network of 700+ volunteers, for a real-time map of crowdsourced social media information on Libya in order to improve their own situational awareness. Thus was born the Libya Crisis Map.

The result? The Head of OCHA’s Information Services Section at the time sent an email to SBTF volunteers to commend them for their novel efforts. In this email, he wrote:

“Your efforts at tackling a difficult problem have definitely reduced the information overload; sorting through the multitude of signals on the crisis is no easy task. The Task Force has given us an output that is manageable and digestible, which in turn contributes to better situational awareness and decision making.”

These three examples from the US, Haiti and Libya demonstrate what is already possible with time-critical crowdsourcing and social media. So where is all this headed? You may have noted from each of these examples that their success relied on the individual actions of hundreds and sometimes thousands of volunteers. This is primarily because automated solutions to filter and curate the data stream are not yet available (or rather accessible) to the wider public. Indeed, these solutions tend to be proprietary, expensive and/or classified. I thus expect to see free and open source solutions crop up in the near future; solutions that will radically democratize the tools needed to gain shared, real-time awareness.

But automated natural language processing (NLP) and machine learning alone are not likely to succeed, in my opinion. The data stream is actually not a stream; it is a massive torrent of non-indexed information, a 24-hour global firehose of real-time, distributed multi-media data that continues to outpace our ability to produce actionable intelligence from this torrential downpour of 0’s and 1’s. To turn this data tsunami into real-time shared awareness will require that our filtering and curation platforms become more automated and collaborative. I believe the key is thus to combine automated solutions with real-time collaborative crowdsourcing tools—that is, platforms that enable crowds to collaboratively filter and curate real-time information, in real-time.
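One way to picture such a hybrid platform: an automated classifier handles the confident calls and routes only the ambiguous middle band to human curators. Everything here is hypothetical on my part (the keyword scorer is a crude stand-in for real NLP), but it shows the division of labor:

```python
def triage(items, classify, confidence_threshold=0.9):
    """Hybrid filtering sketch: auto-accept high-confidence items,
    auto-discard low-confidence ones, and queue the ambiguous middle
    band for human curation. `classify` returns a relevance score in [0, 1]."""
    auto_relevant, human_queue, auto_discard = [], [], []
    for item in items:
        p = classify(item)
        if p >= confidence_threshold:
            auto_relevant.append(item)
        elif p <= 1 - confidence_threshold:
            auto_discard.append(item)
        else:
            human_queue.append(item)
    return auto_relevant, human_queue, auto_discard

# Stand-in classifier: keyword matching as a crude relevance score.
def keyword_score(tweet):
    keywords = ("collapsed", "trapped", "urgent", "need water")
    hits = sum(k in tweet.lower() for k in keywords)
    return min(1.0, 0.5 * hits) if hits else 0.05

tweets = [
    "URGENT: people trapped under collapsed school in Carrefour",
    "beautiful sunset tonight",
    "running low, need water near the airport road",
]
relevant, queue, discarded = triage(tweets, keyword_score)
```

The point of the design is that human attention, the scarce resource in every deployment described above, is spent only where the machine is genuinely unsure.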

Right now, when we comb through Twitter, for example, we do so on our own, sitting behind our laptop, isolated from others who may be seeking to filter the exact same type of content. We need to develop free and open source platforms that allow for the distributed-but-networked, crowdsourced filtering and curation of information in order to democratize the sense-making of the firehose. Only then will the wider public be able to win the equivalent of Red Balloon competitions without needing $40,000 or a degree from MIT.

I’d love to get feedback from readers about what other compelling cases or arguments I should bring up in my presentation tomorrow. So feel free to post some suggestions in the comments section below. Thank you!

Tracking Population Movements using Mobile Phones and Crisis Mapping: A Post-Earthquake Geospatial Study in Haiti

I’ve been meaning to blog about this project since it was featured on BBC last month: “Mobile Phones Help to Target Disaster Aid, says Study.” I’ve since had the good fortune of meeting Linus Bengtsson and Xin Lu, the two lead authors of this study (PDF), at a recent strategy meeting organized by GSMA. The authors are now launching “Flowminder” in affiliation with the Karolinska Institutet in Stockholm to replicate their excellent work beyond Haiti. If “Flowminder” sounds familiar, you may be thinking of Hans Rosling’s “Gapminder” which also came out of the Karolinska Institutet. Flowminder’s mission: “Providing priceless information for free for the benefit of those who need it the most.”

As the authors note, “population movements following disasters can cause important increases in morbidity and mortality.” That is why the UN sought to develop early warning systems for refugee flows during the 1980’s and 1990’s. These largely didn’t pan out; forecasting is not a trivial challenge. Nowcasting, however, may be easier. That said, “no rapid and accurate method exists to track population movements after disasters.” So the authors used “position data of SIM cards from the largest mobile phone company in Haiti (Digicel) to estimate the magnitude and trends of population movements following the Haiti 2010 earthquake and cholera outbreak.”

The geographic locations of SIM cards were determined by the location of the mobile phone towers that SIM cards were connecting to when calling. The authors followed the daily positions of 1.9 million SIM cards for 42 days prior to the earthquake and 158 days following the quake. The results of the analysis reveal that an estimated 20% of the population in Port-au-Prince left the city within three weeks of the earthquake. These findings corresponded well with a large, retrospective population-based survey carried out by the UN.
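In spirit, the estimate reduces to comparing each SIM card’s tower location before and after the event. The sketch below invents a tiny dataset and schema (tower IDs, day-indexed positions); Digicel’s data and Flowminder’s method are of course far richer than this toy:

```python
def outflow_estimate(daily_positions, city_towers, pre_day, post_day):
    """Toy estimate of the share of a city's SIM cards that left:
    a SIM 'lives' in the city if its tower on `pre_day` is a city tower,
    and has 'left' if its tower on `post_day` is not. Missing post-quake
    observations count as departed -- a deliberate simplification."""
    in_city_before = {sim for sim, days in daily_positions.items()
                      if days.get(pre_day) in city_towers}
    left = {sim for sim in in_city_before
            if daily_positions[sim].get(post_day) not in city_towers}
    return len(left) / max(1, len(in_city_before))

# Hypothetical tower IDs and day-indexed positions (day 0 = earthquake).
city_towers = {"PAP-1", "PAP-2"}
daily_positions = {
    "sim1": {-1: "PAP-1", 20: "PAP-2"},   # stayed in Port-au-Prince
    "sim2": {-1: "PAP-2", 20: "LEO-5"},   # left for Leogane
    "sim3": {-1: "CAP-9", 20: "CAP-9"},   # never in Port-au-Prince
    "sim4": {-1: "PAP-1", 20: "JAC-3"},   # left for Jacmel
    "sim5": {-1: "PAP-1", 20: "PAP-1"},   # stayed
}
share_left = outflow_estimate(daily_positions, city_towers, pre_day=-1, post_day=20)
```

The real study has to contend with everything this toy ignores: multi-tower days, phone sharing, dead batteries, destroyed towers, and SIMs whose owners did not survive, which is why its validation against the UN survey matters.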

“To demonstrate feasibility of rapid estimates and to identify areas at potentially increased risk of outbreaks,” the authors “produced reports on SIM card movements from a cholera outbreak area at its immediate onset and within 12 hours of receiving data.” This latter analysis tracked close to 140,000 SIM cards over an 8 day period. In sum, the “results suggest that estimates of population movements during disasters and outbreaks can be delivered rapidly and with potentially high validity in areas with high mobile phone use.”

I’m really keen to see the Flowminder team continue their important work in and beyond Haiti. I’ve invited them to present at the International Conference of Crisis Mappers (ICCM 2011) in Geneva next month and hope they’ll be able to join us. I’m interested to explore the possibilities of combining this type of data and analysis with crowdsourced crisis information and satellite imagery analysis. In addition, mobile phone data can also be used to estimate the hardest hit areas after a disaster. For more on this, please see my previous blog post entitled “Analyzing Call Dynamics to Assess the Impact of Earthquakes” and this post on using mobile phone data to assess the impact of building damage in Haiti.

OpenStreetMap’s New Micro-Tasking Platform for Satellite Imagery Tracing

The Humanitarian OpenStreetMap Team’s (HOT) response to Haiti remains one of the most remarkable examples of what’s possible when volunteers, open source software and open data intersect. When the 7.0 magnitude earthquake struck on January 12th, 2010, the Google Map of downtown Port-au-Prince was simply too incomplete to be used for humanitarian response. Within days, however, several hundred volunteers from the OpenStreetMap (OSM) community used satellite imagery to trace roads, shelters and other important features to create the most detailed map of Haiti ever made.

OpenStreetMap – Project Haiti from ItoWorld on Vimeo.

The video animation above shows just how spectacular this initiative was. More than 1.4 million edits were made to the map during the first month following the earthquake. These individual edits are highlighted as bright flashes of light in the video. This detailed map went a long way to supporting the humanitarian community’s response in Haiti. In addition, the map enabled my colleagues and me at The Fletcher School to geo-locate reports from crowdsourced text messages from Mission 4636 on the Ushahidi Haiti Map.

HOT’s response was truly remarkable. They created wikis to facilitate mass collaboration, such as this page on “What needs to be mapped?” They also used this “OSM Matrix” to depict which areas required more mapping:

The purpose of OSM’s new micro-tasking platform is to streamline mass and rapid collaboration on future satellite image tracing projects. I recently reached out to HOT’s Kate Chapman and Nicolas Chavent to get an overview of their new platform. After logging in using my OSM username and password, I can click through a list of various on-going projects. The one below relates to a very neat HOT project in Indonesia. As you can tell, the region that needs to be mapped on the right-hand side of the screen is divided into a grid.

After I click on “Take a task randomly”, the screen below appears, pointing me to one specific cell in the grid above. I then have the option of opening and editing this cell within JOSM, the standard interface for editing OpenStreetMap. I would then trace all roads and buildings in my square and submit the edit. (I was excited to also see a link to WalkingPapers which allows you to print out and annotate that cell using pen & paper and then digitize the result for import back into OSM).

There’s no doubt that this new Tasking Server will go a long way to coordinate and streamline future live tracing efforts such as for Somalia. For now, the team is mapping Somalia’s road network using their wiki approach. In the future, I hope that the platform will also enable basic feature tagging and back-end triangulation for quality assurance purposes—much like Tomnod. In the meantime, however, it’s important to note that OSM is far more than just a global open source map. OSM’s open data advocacy is imperative for disaster preparedness and response: open data saves lives.