
Crisis Mapping, Neogeography and the Delusion of Democratization

Professor Muki Haklay kindly shared with me this superb new study in which he questions the alleged democratization effects of Neogeography. As my colleague Andrew Turner explained in 2006, “Neogeography means ‘new geography’ and consists of a set of techniques and tools that fall outside the realm of traditional GIS, Geographic Information Systems. [...] Essentially, Neogeography is about people using and creating their own maps, on their own terms and by combining elements of an existing toolset. Neogeography is about sharing location information with friends & visitors, helping shape context, and conveying understanding through knowledge of place.” To this end, as Muki writes, “it is routinely argued that the process of producing and using geographical information has been fundamentally democratized.” For example, as my colleague Nigel Snoad argued in 2011, “[...] Google, Microsoft and OpenStreetMap have really democratized mapping.” Other CrisisMappers, including myself, have made similar arguments over the years.


Muki explores this assertion by delving into the various meanings of democratization. He adopts the specific notion of democratization that “evokes ideas about participation, equality, the right to influence decision making, support to individual and group rights, access to resources and opportunities, etc.” With this definition in hand, Muki argues that “using this stronger interpretation of democratization reveals the limitation of current neogeographic practices and opens up the possibility of considering alternative development of technologies that can, indeed, be considered democratizing.” To explore this further, he turns to Andrew Feenberg’s critical philosophy of technology. Feenberg identifies “four main streams of thought on the essence of technology and its linkage to society: instrumentalism, determinism, substantivism & critical theory.”


Feenberg’s own view is constructivist, “emphasizing that technology development is humanly controlled and encapsulates values and politics; it should thus be open to democratic control and intervention.” In other words, “technology can and should be seen as a result of political negotiations that lead to its production and use. In too many cases, the complexities of technological systems are used to concentrate power within small groups of technological, financial, and political elites and to prevent the wider body of citizens from meaningful participation in shaping it and deciding what role it should have in the everyday.” Furthermore, “Feenberg highlights that technology encapsulates an ambivalence between the ‘conservation of hierarchy’, which most technologies promote and reproduce—hence the continuity in power structures in advanced capitalist societies despite technological upheaval—and ‘democratic rationalisation’, which are the aspects of new technologies that undermine existing power structures and allow new opportunities for marginalized or ignored groups to assert themselves.”

To this end, Feenberg calls for a “deep democratization” of technology as an alternative to technocracy. “Instead of popular agency appearing as an anomaly and an interference, it would be normalized and incorporated into the standard procedures of technical design.” In other words, deep democratization is about empowerment: “providing the tools that will allow increased control over the technology by those in disadvantaged and marginalized positions in society.” Muki contrasts this with neogeography, which is “mostly represented in a decontextualised way—as the citation in the introduction from Turner’s (2006) Introduction to Neogeography demonstrates: it does not discuss who the people are who benefit and whether there is a deeper purpose, beyond fun, for their engagement in neogeography.” And so, as neogeographers would have it, since “there is nothing that prevents anyone, anytime, and anywhere, and for any purpose from using the system, democratization has been achieved.” Or maybe not. Enter the Digital Divides.


Yes, there are multiple digital divides. Differential access to computers & communication technology is just one. “Beyond this, there is secondary digital exclusion, which relates to the skills and abilities of people to participate in online activities beyond rudimentary browsing.” Related to this divide is the one between the “Data Haves” and the “Data Have Nots”. There is also an important divide in speed—as anyone who has worked in, say, Liberia will have experienced—it takes a lot longer to upload/download/transfer content than in Luxembourg. “In summary, the social, economic, structural, and technical evidence should be enough to qualify and possibly withdraw the democratization claims that are attached to neogeographic practices.”

That said, the praxis of neogeography still has democratic potential. “To address the potential of democratization within neogeographic tools, we need to return to Feenberg’s idea of deep democratization and the ability of ordinary citizens to direct technical codes and influence them so that they can include alternative meanings and values. By doing so, we can explore the potential of neogeographic practices to support democratisation in its fuller sense. At the very least, citizens should be able to reuse existing technology and adapt it so that it can be used to their own goals and to represent their own values.” So Muki adds a “Hierarchy of Hacking” to Feenberg’s conceptual framework, i.e., the triangle below.

[Figure: Muki’s “Hierarchy of Hacking” triangle]

While the vast majority can participate in a conversation about what to map (Meaning), only a “small technical elite within society” can contribute to “Deep Technical Hacking,” which “requires very significant technical knowledge in creating new geographic data collection tools, setting up servers, and configuring database management systems.” Muki points to Map Kibera as an example of Deep Technical Hacking. I would add that “Meaning Hacking” is often hijacked by “Deep Technical Hackers,” who tend to be the ones introducing and controlling local neogeography projects despite their “best” intentions. But the fact is this: Deep Tech Hackers typically have little to no actual experience in community development and are often under pressure to hype up blockbuster-like successes at fancy tech conferences in the US. This may explain why most take full ownership over all decisions having to do with Meaning- and Use-Hacking right from the start of a project. See this blog post’s epilogue for more on this dynamic.

One success story, however, is Liberia’s Innovation Lab (iLab). My field visit to Monrovia in 2011 made me realize just how many completely wrong assumptions I had about the use of neogeography platforms in developing countries. Instead of parachuting in and out, the co-founders of iLab became intimately familiar with the country by spending a considerable amount of time in Monrovia and outside the capital city to understand the social, political and historical context in which they were introducing neogeography. And so, while they initially expected to provide extensive training on neogeography platforms right off the bat, they quickly realized that this was the wrong approach entirely for several reasons. As Muki observes, “Because of the reduced barriers, neogeography does offer some increased level of democratization but, to fulfill this potential, it requires careful implementation that takes into account social and political aspects,” which is precisely what the team at the iLab have done and continue to do impressively well. Note that one of the co-founders is a development expert, not a technology hacker. And while the other is a hacker, he spent several years working in Liberia. (Another equally impressive success story is this one from Brazil’s Mare shantytown.)


I thus fully subscribe to Muki’s hacking approach and made a very similar argument in this 2011 blog post: “Democratizing ICT for Development with DIY Innovation and Open Data.” I directly challenged the “participatory” nature of these supposedly democratizing technologies and in effect questioned whether Deep Technical Hackers really do let go of control vis-a-vis the hacking of “Meaning” and “Use”. While I used Ushahidi as an example of a DIY platform, it is clear from Muki’s study that Ushahidi, like other neogeography platforms, also falls way short of deep democratization and hackability. That said, as I wrote then, “it is worth remembering that the motivations driving this shift [towards neogeography] are more important than any one technology. For example, recall the principles behind the genesis of the Ushahidi platform: Democratizing information flows and access; promoting Open Data and Do it Yourself (DIY) Innovation with free, highly hackable (i.e., open source) technology; letting go of control.” In other words, the democratizing potential should not be dismissed outright even if we’re not quite there yet (or ever).

As I noted in 2011, hackable and democratizing technologies ought to be like a “choose your own adventure game. The readers, not the authors, finish the story. They are the main characters who bring the role playing games and stories to life.” This explains why I introduced the notion of a “Fisher-Price Theory of Technology” five years ago at this meeting with Andrew Turner and other colleagues. As argued then, “What our colleagues in the tech-world need to keep in mind is that the vast majority of our partners in the field have never taken a computer science or software engineering course. [...] The onus thus falls on the techies to produce the most simple, self-explanatory, intuitive interfaces.”

I thus argued that neogeography platforms ought to be as easy to use (and yes, hack) as computer games, which is why I was excited to see the latest user interface (UI) developments for OpenStreetMap (image below). Of course, as Muki has ably demonstrated, UI design is just the tip of the iceberg vis-a-vis democratization effects. But democratization is both relative and a process, and neogeography platforms are unlikely to become less democratizing over time. While some platforms still have a long road ahead with respect to reaching their perceived potential (if ever), a few may already have made inroads in terms of their local political effects, as argued here and in my doctoral dissertation.

[Image: OpenStreetMap’s new user interface]

Truly hackable technology, however, needs to go beyond the adventure story and Fisher-Price analogies described above. The readers should have the choice of becoming authors before they even have a story in mind, while gamers should have the option of creating their own games in the first place. In other words, as Muki argues, “the artful alteration of technology beyond the goals of its original design or intent” enables “Deep Democratization.” To this end, “Freely providing the hackable building blocks for DIY Innovation is one way to let go of control and democratize [neogeography platforms],” not least if the creators can make a business out of their building blocks.

Muki concludes by noting that, “the main error in the core argument of those who promote [neogeography] as a democratic force is the assumption that, by increasing the number of people who utilise geographic information in different ways and gain access to geographic technology, these users have been empowered and gained more political and social control. As demonstrated in this paper, neogeography has merely opened up the collection and use of this information to a larger section of the affluent, educated, and powerful part of society.” What’s more, “The control over the information is kept, by and large, by major corporations and the participant’s labor is enrolled in the service of these corporations, leaving the issue of payback for this effort a moot point. Significantly, the primary intention of the providers of the tools is not to empower communities or to include marginalized groups, as they do not represent a major source of revenue.” I argued this exact point here a year ago.


The Best Way to Crowdsource Satellite Imagery Analysis for Disaster Response

My colleague Kirk Morris recently pointed me to this very neat study on iterative versus parallel models of crowdsourcing for the analysis of satellite imagery. The study was carried out by French researcher & engineer Nicolas Maisonneuve for the GIScience 2012 conference.

Nicolas finds that after reaching a certain threshold, adding more volunteers to the parallel model does “not change the representativeness of opinion and thus will not change the consensual output.” His analysis also shows that the value of this threshold has a significant impact on the resulting quality of the parallel work and thus should be chosen carefully. In terms of the iterative approach, Nicolas finds that “the first iterations have a high impact on the final results due to a path dependency effect.” To this end, “stronger commitment during the first steps are thus a primary concern for using such model,” which means that “asking expert/committed users to start” is important.

Nicolas’s study also reveals that the parallel approach is better able to correct wrong annotations (wrong analysis of the satellite imagery) than the iterative model for images that are fairly straightforward to interpret. In contrast, the iterative model is better suited for handling more ambiguous imagery. But there is a catch: the potential path dependency effect in the iterative model means that “mistakes could be propagated, generating more easily type I errors as the iterations proceed.” In terms of spatial coverage, the iterative model is more efficient since the parallel model leverages redundancy to ensure data quality. Still, Nicolas concludes that the “parallel model provides an output which is more reliable than that of a basic iterative [because] the latter is sensitive to vandalism or knowledge destruction.”
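To make the contrast concrete, here is a minimal Python sketch of the two models (my own illustration, not code or numbers from Nicolas’s paper; the error and re-examination rates are assumptions): parallel annotation aggregates independent labels by majority vote, while iterative annotation lets each volunteer inherit, and usually keep, the previous label.

```python
import random
from collections import Counter

def parallel_annotate(truth, n_volunteers, error_rate):
    """Each volunteer labels the image independently (binary labels);
    the consensus is a simple majority vote over all labels."""
    votes = [truth if random.random() > error_rate else 1 - truth
             for _ in range(n_volunteers)]
    return Counter(votes).most_common(1)[0][0]

def iterative_annotate(truth, n_volunteers, error_rate,
                       first_label=None, reexamine_rate=0.2):
    """Each volunteer sees the current label and only sometimes
    re-examines it, so early answers tend to persist: the path
    dependency effect. The 20% re-examination rate is an assumption."""
    label = first_label
    if label is None:
        label = truth if random.random() > error_rate else 1 - truth
    for _ in range(n_volunteers - 1):
        if random.random() < reexamine_rate:
            label = truth if random.random() > error_rate else 1 - truth
    return label
```

With, say, a 30% per-volunteer error rate, majority voting over nine volunteers corrects most independent mistakes, while an iterative chain seeded with a wrong first label stays wrong far more often, which is exactly why Nicolas stresses getting experts to make the first edits.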

So the question that naturally follows is this: how can the parallel and iterative methodologies be combined to produce a better overall result? Perhaps the parallel approach could be used as the default to begin with. However, images that are considered difficult to interpret would get pushed from the parallel workflow to the iterative workflow. The latter would first be processed by experts in order to create favorable path dependency. Could this hybrid approach be the winning strategy?
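As a sketch of what that hybrid routing might look like (purely my illustration of the idea, with an arbitrary agreement threshold, not an implementation from the study): accept the parallel consensus when volunteers largely agree, and flag low-agreement images for an expert-seeded iterative pass.

```python
from collections import Counter

def hybrid_route(votes, agreement_threshold=0.8):
    """Accept the parallel consensus when volunteers largely agree;
    otherwise flag the image for the iterative workflow, started by
    experts. The 0.8 threshold is an illustrative assumption."""
    label, count = Counter(votes).most_common(1)[0]
    agreement = count / len(votes)
    if agreement >= agreement_threshold:
        return {"label": label, "workflow": "parallel"}
    # Low agreement suggests an ambiguous image: queue it for the
    # iterative workflow, with experts annotating first so that the
    # path dependency effect works in our favor.
    return {"label": None, "workflow": "iterative-expert-first"}
```

For example, nine “road” votes out of ten would be accepted as-is, while a five-five split would be escalated to the expert-first iterative queue.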

Wow: How Road Maps Were Made in the 1940s!

This short video is absolutely a must-watch for today’s digital and crowdsourced-mapping enthusiasts. Produced by Chevrolet in the 1940s, Caught Mapping is an educational film that provides a truly intriguing and at times amusingly entertaining view into how road maps were made at the time. The contrasts with today’s live, crowdsourced, social-media maps rich with high-resolution satellite imagery are simply staggering. This is definitely worth the watch!

Compare the roadmap-making of yesteryear with OpenStreetMap’s impressive map-making efforts in Haiti 2010 (video below) and Japan 2011, for example.

What do you think map-making will look like in 2040? Will we still be making maps? Or will automated sensors be live mapping 24/7? Will 2D interfaces disappear entirely and be replaced by 3D maps? Will all geo-tagged data simply be embedded within augmented reality platforms and updated live? Will we even be using the word “map” anymore?

OpenStreetMap’s New Micro-Tasking Platform for Satellite Imagery Tracing

The Humanitarian OpenStreetMap Team’s (HOT) response to Haiti remains one of the most remarkable examples of what’s possible when volunteers, open source software and open data intersect. When the 7.0 magnitude earthquake struck on January 12th, 2010, the Google Map of downtown Port-au-Prince was simply too incomplete to be used for humanitarian response. Within days, however, several hundred volunteers from the OpenStreetMap (OSM) community used satellite imagery to trace roads, shelters and other important features to create the most detailed map of Haiti ever made.

OpenStreetMap – Project Haiti from ItoWorld on Vimeo.

The video animation above shows just how spectacular this initiative was. More than 1.4 million edits were made to the map during the first month following the earthquake. These individual edits are highlighted as bright flashes of light in the video. This detailed map went a long way to supporting the humanitarian community’s response in Haiti. In addition, the map enabled my colleagues and me at The Fletcher School to geo-locate crowdsourced text-message reports from Mission 4636 on the Ushahidi Haiti Map.

HOT’s response was truly remarkable. They created wikis to facilitate mass collaboration, such as this page on “What needs to be mapped?” They also used this “OSM Matrix” to depict which areas required more mapping:

The purpose of OSM’s new micro-tasking platform is to streamline mass and rapid collaboration on future satellite image tracing projects. I recently reached out to HOT’s Kate Chapman and Nicolas Chavent to get an overview of their new platform. After logging in using my OSM username and password, I can click through a list of various ongoing projects. The one below relates to a very neat HOT project in Indonesia. As you can tell, the region that needs to be mapped on the right-hand side of the screen is divided into a grid.

After I click on “Take a task randomly”, the screen below appears, pointing me to one specific cell in the grid above. I then have the option of opening and editing this cell within JOSM, the standard interface for editing OpenStreetMap. I would then trace all roads and buildings in my square and submit the edit. (I was also excited to see a link to Walking Papers, which allows you to print out and annotate that cell using pen & paper and then digitize the result for import back into OSM.)
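Conceptually, the gridding and random task assignment work something like this Python sketch (my own simplification; the actual Tasking Server’s cell sizes and assignment logic will differ):

```python
import random

def make_task_grid(min_lon, min_lat, max_lon, max_lat, cell_deg):
    """Split a project's bounding box into square tracing cells,
    each defined by its (west, south, east, north) corners."""
    cells = []
    lon = min_lon
    while lon < max_lon:
        lat = min_lat
        while lat < max_lat:
            cells.append((lon, lat,
                          min(lon + cell_deg, max_lon),
                          min(lat + cell_deg, max_lat)))
            lat += cell_deg
        lon += cell_deg
    return cells

def take_task_randomly(cells, completed):
    """Hand the volunteer a random cell nobody has finished yet,
    mimicking the 'Take a task randomly' button."""
    remaining = [c for c in cells if c not in completed]
    return random.choice(remaining) if remaining else None
```

Each volunteer thus works on one small, well-bounded square at a time, which is what lets hundreds of tracers collaborate on the same region without duplicating effort.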

There’s no doubt that this new Tasking Server will go a long way toward coordinating and streamlining future live tracing efforts, such as for Somalia. For now, the team is mapping Somalia’s road network using their wiki approach. In the future, I hope that the platform will also enable basic feature tagging and back-end triangulation for quality assurance purposes—much like Tomnod. In the meantime, however, it’s important to note that OSM is far more than just a global open source map. OSM’s open data advocacy is imperative for disaster preparedness and response: open data saves lives.