Monthly Archives: May 2008

Creating Covert Channels for VoIP

Steganography is the art of hiding messages by embedding them in ordinary communications. The word is Greek for “covered, or hidden writing”. The term can be traced back to Herodotus (in 440 BC) who mentions two particularly interesting examples (summarized on Wikipedia):

Demaratus sent a warning about a forthcoming attack to Greece by writing it on a wooden panel and covering it in wax. Wax tablets were in common use then as re-usable writing surfaces, sometimes used for shorthand. Another ancient example is that of Histiaeus, who shaved the head of his most trusted slave and tattooed a message on it. After his hair had grown the message was hidden. The purpose was to instigate a revolt against the Persians.

Steganography is distinct from cryptography: cryptography obscures the meaning of a message but does not conceal the fact that a message exists. In today’s digital age, steganography includes the concealment of ones and zeros within data files. An ordinary-looking image file, for example, can include embedded messages that go unnoticed unless someone is actively looking for them, much like invisible ink. As has been noted, the advantage of steganography over cryptography is that messages do not attract attention to themselves, to messengers, or to recipients.
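To make the idea of hiding ones and zeros concrete, here is a minimal least-significant-bit (LSB) sketch in Python. The “carrier” is a plain byte array standing in for raw image data; real steganography tools also handle image formats, compression and detection resistance, so treat this purely as an illustration of the principle:

```python
# Minimal LSB steganography sketch: each bit of the secret message replaces
# the lowest bit of one carrier byte, changing the carrier imperceptibly.

def embed(carrier: bytearray, message: bytes) -> bytearray:
    # Expand the message into bits, most significant bit first.
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(carrier):
        raise ValueError("carrier too small for message")
    stego = bytearray(carrier)
    for idx, bit in enumerate(bits):
        stego[idx] = (stego[idx] & 0xFE) | bit  # overwrite the lowest bit
    return stego

def extract(stego: bytearray, length: int) -> bytes:
    # Reassemble `length` bytes from the carrier's lowest bits.
    out = bytearray()
    for i in range(length):
        byte = 0
        for bit_pos in range(8):
            byte = (byte << 1) | (stego[i * 8 + bit_pos] & 1)
        out.append(byte)
    return bytes(out)

pixels = bytearray(range(256)) * 4            # stand-in for raw image data
secret = b"meet at dawn"
hidden = embed(pixels, secret)
print(extract(hidden, len(secret)))            # b'meet at dawn'
```

The carrier looks essentially unchanged to a casual observer, which is the whole point: without knowing to inspect the low-order bits, there is no visible message at all.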

And now, two Polish scientists with the Institute of Telecommunications in Warsaw have just revealed that they are developing a steganographic system for VoIP networks. This may eventually be a more effective way for activists and social resistance movements to communicate and evade detection. All that we need now is the software and interface to make this communication as simple as two clicks of a mouse. Any takers? Of course, repressive regimes could also use the same tactic, and I realize this presupposes Internet connection and access, but it’s a start.

Patrick Philippe Meier

Crimson Hexagon: Early Warning 2.0?

The future of automated textual analysis is Crimson Hexagon, a patent-pending text-reading technology that allows users to define the questions they want to ask and to crawl the blogosphere (or any text-based source) for fast, accurate answers. The technology was created under the aegis of Harvard University Professor Gary King.

I met with the new company’s CEO this week to learn more about the group’s parsing technology and underlying statistical models. Some UN colleagues and I are particularly interested in the technology’s potential application to conflict monitoring and analysis. At present, early warning units within the UN, and other international (regional) organizations such as the OSCE, use manual labor to collect relevant information from online sources. Most units employ full-time staff for this, often meaning that 80% of an analyst’s time is actually used to collect pertinent articles and reports, leaving only 20% of the time for actual analysis, interpretation and policy recommendations. We can do better. Analysts ought to be spending 80% of their time analyzing.

Crimson Hexagon is of course not the first company to carry out automated textual analysis. Virtual Research Associates (VRA) and the EC’s Joint Research Center (JRC) have both been important players in this space. VRA developed GeoMonitor, a natural language parser that reads the headlines of Reuters and AFP news wires and codes “who did what, to whom, where and when?” for each event reported by the two media companies. According to an independent review of the VRA parser by Gary King and Will Lowe (2003),

The results are sufficient to warrant a serious reconsideration of the apparent bias against using events data, and especially automatically created events data, in the study of international relations. If events data are to be used at all, there would now seem to be little contest between the machine and human coding methods. With one exception, performance is virtually identical, and that exception (the higher propensity of the machine to find “events” when none exist in news reports) is strongly counterbalanced by both the fact that these false events are not correlated with the degree of conflict of the event category, and by the overwhelming strength of the machine: the ability to code huge numbers of events extremely quickly and inexpensively.

However, as Gary King mentioned in a recent meeting I had with him this month, VRA’s approach faces some important limitations. First, the parser can only parse the headline of each newswire. Second, adding new media sources such as BBC requires significant investment in adjusting the parser. Third, the parser cannot draw on languages other than English.

The JRC has developed the European Media Monitor (EMM). Unlike VRA’s tool, EMM is based on a key-word search algorithm, i.e., it works much like a search engine such as Google. EMM crawls online news media for key words and places each article into a corresponding category, such as terrorism. The advantage of this approach over VRA’s is that EMM can parse thousands of different news sources, and in different languages. The JRC recently set up an “African Media Monitor” for the African Union’s Continental Early Warning System (CEWS). This approach nevertheless faces its own limitations, since analysts still need to read each article to understand the nature of the event in question.

Google.org is also pursuing text-based parsing. This initiative stems from Larry Brilliant’s TED 2006 prize to expand the Global Public Health Information Network (GPHIN) for the purposes of prediction and prevention:

Rapid ecological and social changes are increasing the risk of emerging threats, from infectious diseases to drought and other environmental disasters. This initiative will use information and technology to empower communities to predict and prevent emerging threats before they become local, regional, or global crises.

Larry’s idea led to the new non-profit InSTEDD, but last time I spoke with the team, they were not pursuing this initiative. In any case, I wouldn’t be surprised if Google were to express an interest in buying out Crimson Hexagon before year’s end. Hexagon’s immediate clients are private sector companies who want to monitor their brand perception in real time as reported in the blogosphere. The challenge?

115 million blogs, with 120,000 more added each day. As pundits proclaim the death of email, social web content is exploding. Consumers are generating their own media through blogs and comments, social network profiles and interactions, and myriad microcontent publishing tools. How do we begin to know and accurately quantify the relevant opinion that’s out there? How can we get answers to specific questions about online opinion as it relates to a particular topic?

The accuracy and reliability of Crimson Hexagon are truly astounding. Equally remarkable is the fact that the technology developed by Gary King’s group parses every word in a given text. How does the system work? Say we were interested in monitoring the Iranian blogosphere—like the Berkman Center’s recent study. If we were interested in liberal bloggers and their opinion on riots (hypothetically taking place now in Tehran), we would select 10-30 examples of pro-democratic blog entries addressing the ongoing riots. These would then be fed into the system to teach the algorithm what to look for. A useful analogy that Gary likes to give is speech recognition.

The Crimson Hexagon parser uses a stemming approach, meaning that every word in a given text is reduced to its root word. For example, “rioting”, “riots” and “rioters” are all reduced to “riot”. The technology creates a vector of stem words to characterize each blog entry so that thousands of Iranian blogs can be automatically compared. By providing the algorithm with a sample of 10 or more blogs on, say, positive perceptions of rioting in Tehran (were this happening now), the technology would be able to quantify the liberal Iranian bloggers’ changing opinion on the rioting in real time by aggregating the stem vectors.
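The stem-vector idea can be sketched in a few lines of Python. To be clear, the suffix rules below are a toy stand-in (real systems use a proper stemmer such as Porter’s algorithm), and this illustrates only the general approach, not Crimson Hexagon’s actual implementation:

```python
# Toy sketch of stemming plus stem-count vectors: a crude suffix-stripping
# stemmer reduces each word to a root, and each blog entry becomes a vector
# of stem counts that can be aggregated across thousands of posts.
from collections import Counter

SUFFIXES = ("ing", "ers", "er", "s")  # illustrative rules, not a real stemmer

def stem(word: str) -> str:
    word = word.lower()
    for suffix in SUFFIXES:
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

def stem_vector(text: str) -> Counter:
    # One blog entry -> a vector (bag) of stem counts.
    return Counter(stem(w) for w in text.split())

posts = [
    "Rioters gathered as rioting spread",
    "The riots continued overnight",
]
aggregate = sum((stem_vector(p) for p in posts), Counter())
print(aggregate["riot"])  # 3: "rioters", "rioting" and "riots" all reduce to "riot"
```

Aggregating these vectors over many posts is what lets a system characterize opinion across an entire blogosphere rather than reading entries one by one.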

Crimson Hexagon is truly pioneering a fundamental shift in the paradigm of textual analysis. Instead of trying to find the needle in the haystack, as it were, the technology seeks to characterize the haystack with astonishing reliability, such that any changes in the haystack (amount of hay, density, structure) can be immediately picked up by the parser in real time. Furthermore, the technology can parse any language, say Farsi, as long as the sample blogs provided are in Farsi. In addition, the system has returned highly reliable results even when using fewer than 10 samples, and even when the actual blog entry had fewer than 10 words. Finally, the parser is by no means limited to blog entries; any piece of text will do.

The potential for significantly improving conflict monitoring and analysis is, in my opinion, considerable. Imagine parsing Global Voices in real time, or Reliefweb and weekly situation reports across all field-based agencies worldwide. Crimson Hexagon’s CEO immediately saw the potential during our meeting. We therefore hope to carry out a joint pilot study with colleagues of mine at the UN and the Harvard Humanitarian Initiative (HHI). Of course, like any early warning initiative, the link to early response will dictate the ultimate success or failure of this project.

Patrick Philippe Meier

Berkman@10 Roundup of Day 1

This blog entry summarizes the first day of Harvard’s Berkman@10 conference in Boston, covering talks given by Jonathan Zittrain, John Palfrey, Jimmy Wales and Yochai Benkler.

Jonathan Zittrain kicked off Berkman’s birthday party with an animated presentation of his book, The Future of the Internet, and How to Stop It. I began reading JZ’s book last week in the hopes of having finished it by today but alas it was not to be. So I will write a review of The Future of the Internet in a future blog entry. In any event, JZ’s concern seems to be a re-centralization, or control, of the Internet and associated technologies like the iPhone. He is particularly peeved by Steve Jobs’ comments when he launched the iPhone:

We define everything that is on the phone … You don’t want your phone to be like a PC. The last thing you want is to have loaded three apps on your phone and then you go make a call and it doesn’t work anymore. These are more like iPods than they are like computers.

Zittrain worries that companies like Apple and Facebook will increasingly constrain the generative nature of the Internet and thereby undermine the creativity, freedom and innovation that have driven the information revolution to this day. He likens this generativity to the dark matter or energy that keeps the universe expanding at an accelerating rate. JZ is genuinely concerned that the IT ecosystem’s dark energy will cease to expand the Internet as we know it today; a reversal of the “bit bang” into a “bit crunch”.

As mentioned, I have yet to finish Zittrain’s book, but my preliminary reaction is one of skepticism, based on my recent dissertation research. I suspect that we are unlikely to see the kind of tipping point described by JZ, which I refer to as the “bit crunch” theory of the Internet. Zittrain draws on the example of hitchhiking, once a widespread mode of transportation but much less so today given fears over personal safety. At the same time, Zittrain does highlight the fact that websites dedicated to hitchhiking do exist. In my opinion, this points to a game of cyber cat-and-mouse, a dynamic whereby adaptation and evolution are likely to be the Internet’s constants, i.e., factors unlikely to change in the dark energy equation of the Internet regardless of who the players are.

Other tidbits: JZ made interesting references to the IETF, NANOG, StopBadWare.org and the origins of the ITU (set up in part to deal with encryption in telegrams).

John Palfrey led a discussion on the impact of the Internet on Democracy, a topic closely related to my dissertation research. In John’s words, “The internet allows more speech from more people than ever… but states are finding more and more ways to restrict online speech and to practice surveillance.” My dissertation question is whether repressive regimes will manage to impose an information blockade on sensitive communications or whether resistance groups will ultimately prevail, and why?

John made references to Global Voices and asked Ethan Zuckerman to comment on the project’s impact and continuing challenges. Ethan opined that the biggest challenge was not necessarily government censorship but rather that citizen journalism had yet to influence mainstream media in a concerted and significant way. Later on in John’s moderation of the discussion, the subject of Cuba and in particular the use of flash drives came up. Interestingly, flash drives are the ICT of choice for activists in Cuba who seek to communicate and share information with one another. As one blogger in Havana exclaimed:

Cubans have a new saint. It is small and is called the USB flash memory stick… Praise be this new protector and distributor of information that has come into our lives!

Several interesting points were articulated during the question and answers session:

  • There are now more Internet users in China than in the US, and the vast majority of these users actually welcome censorship.
  • The Internet is ultimately about people, not routers. If we want to change the future of the Internet, we need to change people, who will find ways to exert power in new network fashion as they learn about the world of network organizing (Ethan Zuckerman).
  • The impact of the Internet on democracy (small “d” as opposed to big “D”) is an area of study that is as important as the impact on Democracy (Beth Kolko).
  • The Kyrgyz revolution was particularly interesting vis-a-vis the use of information communication technology beyond the Internet. Indeed, mobile phone usage is particularly high, and civil society made use of this technology to protect shops and stores from being looted by marauders. In other words, ICTs were used for protection by civil society where and when the state was unable to do so (Beth Kolko).
  • The impact of computer games should not be overlooked since young people who wish to play inevitably become accustomed to technology and find ways to deal with the last mile problem in order to play. This also enables them to access new sources of technology that they were not privy to heretofore (Beth Kolko).

Jimmy Wales, the founder of Wikipedia, and Yochai Benkler also spoke at Berkman@10. Both made very interesting points and intriguing references. For example, Jimmy explained why consensus was more important than democratic voting: if the 30% of individuals who vote one way on an issue are simply overruled by the majority, the tyranny of the majority is unlikely to appease potential spoilers (much like the challenges of managing peace processes). So instead, Jimmy emphasizes the importance of process, i.e., continued deliberation and rewriting of Wikipedia entries until consensus is reached, by which time some of those engaged in the ongoing arguments will have demonstrated behavioral problems and thereby discredited themselves. This reminded me of the value of Wikis emphasized by the creators of Intellipedia, which I blogged about here.

Benkler’s comments were very much in line with his book The Wealth of Networks, so I shan’t repeat them here. He did, however, make a number of interesting references, for example to Porkbusters and Kaltura. The question is whether features can be designed to improve or incite more sustained cooperation. While I’m skeptical about the feasibility of such goals, I thought Jimmy made an excellent point: “make it cheaper to do something good and more expensive to do something bad.” In essence, Jimmy’s Wikipedia experiment demonstrates that people tend to cooperate far more often than traditional theories in sociology and political science would allow.

This is the stuff that Jonathan Zittrain’s dark matter is ultimately made of, which explains why I am skeptical about his tipping point thesis regarding the Internet. The human desire to communicate and be heard is innate and unlikely to lie dormant for long should JZ’s future temporarily come to pass.

Patrick Philippe Meier

Burma and the Responsibility to Empower

The military dictatorship’s blocking of foreign aid to Burma/Myanmar has drawn worldwide condemnation. For me, however, the crux of the problem is twofold: first, the tradition of external response, and second, the nature of consensual intervention. It is high time we shift to people-centered disaster/conflict early warning & response.

The UN’s Global Survey of Early Warning Systems for natural disasters defines the purpose of people-centered early warning systems as follows:

To empower individuals and communities threatened by hazards to act in sufficient time and in an appropriate manner so as to reduce the possibility of personal injury, loss of life, damage to property and the environment, and loss of livelihoods.

Precisely because of cases like that of Burma, the international humanitarian community should focus more seriously on “the capacity of disaster-affected communities to ‘bounce back’ or to recover with little or no external assistance following a disaster”  (Manyena 2006). The question that most interests me is how information communication technology can increase community resilience to disasters and conflict.

Humanitarian aid and disaster response are still subject to the principle of state sovereignty. This continues, in part, to plague international responses to violent conflict such as the genocide in the Sudan. State-based intervention is anything but timely and efficient. This is why the humanitarian community should consider more decentralized and tactical approaches to rapid response. The field of strategic nonviolent action is specifically focused on these types of responses. The humanitarian community should take heed.

We need a far more cross-disciplinary approach to humanitarian response; one that does not divide disaster response from conflict prevention. And one that does not shy away from a more tactical and proactive approach to saving lives.

Patrick Philippe Meier

Twitter: Sending Out Voice-to-Text SOS

One of the constraints of using SMS to evade state censorship in developing countries under repressive rule is literacy—or lack thereof. TwitterFone is a new service that converts voice to text and then posts it to Twitter. While Jott and Spinvox already enable voice to text conversion for Twitter and Facebook, TwitterFone is said to be far simpler to use.

According to TechCrunch,

The service launched moments ago into private beta. To use it you need to verify your phone number and Twitter account, and TwitterFone will then give you a local phone number to call to leave messages (they support the U.S., UK and Ireland now, adding more). Then, any message you send will be transcribed, and posted to Twitter along with a link to the recording. If the message is longer than 140 characters, just the first part is transcribed, but the entire recording is still available. There is a time limit of 15 seconds on the recording. The service is partially automated via voice recognition software, and flagged words go to a human for translation.
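The posting step TechCrunch describes can be sketched roughly as follows. The function name and link format here are hypothetical, since TwitterFone’s actual implementation is not public; the sketch only illustrates the described behavior of truncating long transcripts while always linking to the full recording:

```python
# Hypothetical sketch of TwitterFone's posting step: if the transcript
# exceeds Twitter's 140-character limit, only the first part is posted,
# and a link to the full recording is appended either way.
TWEET_LIMIT = 140

def format_tweet(transcript: str, recording_url: str) -> str:
    suffix = " " + recording_url           # link to the full recording
    room = TWEET_LIMIT - len(suffix)       # characters left for the transcript
    if len(transcript) <= room:
        body = transcript
    else:
        body = transcript[: room - 1] + "…"  # keep just the first part
    return body + suffix

tweet = format_tweet("Stuck in traffic, running late " * 6,
                     "http://example.com/rec/42")
print(len(tweet) <= TWEET_LIMIT)  # True
```

Short messages pass through untouched; only transcripts that would overflow the limit get cut, which matches the behavior described in the quote above.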

Patrick Philippe Meier

The Politics of Cyberconflict

I recently read Athina Karatzogianni’s The Politics of Cyberconflict and met the author at the Politics 2.0 International Conference in London last month. This blog entry is a mini “review” of Athina’s book based on my dissertation research thus far. By review, I mean to provide several excerpts from the study and to comment on them. In particular, I address the role of technology in fostering new organizational structures.

New Social Movements are open, decentralized, non-hierarchical and ideal for internet communication. At the same time, uses of the internet may have important effects on organizational structures, both inside member organizations and in terms of overall network stability and capacity.

The information revolution is favoring and strengthening networked organizational designs, often at the expense of hierarchies. States need to wake up to the fact and realize that networks can be fought effectively only by flexible network-style responses.

Painting modern resistance movements as decentralized and states as hierarchical is increasingly fashionable. However, I know of no study that empirically supports (or refutes) these broad caricatures. Such a study would certainly be feasible, and especially interesting if it were to employ network analysis. I suspect that one would find resistance movements resembling hybrid networks rather than strictly decentralized organizational forms.

In any event, a question oft overlooked vis-a-vis the information revolution’s influence on organizational structure is technology’s impact on authoritarian rule. If the thesis is that decentralized, distributed and mobile technologies “flatten” preexisting organizational structures, then is modern information communication technology likely to have a similar impact on repressive regimes over time? If a coercive, centralized state were to “wake up” and make more effective use of networked and peer-to-peer communication technologies, would this necessarily delegate and distribute power? My inclination, based on the theory of power in the nonviolence literature, is to say yes.

Information technology is constantly being modified, enhanced and overtaken by better ideas, leaving importing states to engage in an expensive and never-ending game of catch-up with technologies which have been conducive to state power, even to coercive state power.

I see Athina’s point but at the same time would argue that a number of nondemocratic regimes have been effective in limiting the import and use of technologies that purport to threaten their “information blockade”. This is true of Burma, Cuba, Nigeria and North Korea amongst several others.

Patrick Philippe Meier

Block it Like Beijing

A good friend of mine works as a professional jazz singer in Shanghai. She recently tried to access my iRevolution blog but without success. However, she did note that Wikipedia is finally accessible as well as other blog sites. The Great Chinese Firewall appears to be filtering my blog. Shucks. In better news though, I successfully defended my dissertation proposal today, so I can finally get back to blogging on a more regular basis.

Patrick Philippe Meier

Technology and Survival

People-centered early warning is about empowering at-risk communities so that they may get out of harm’s way when conflict escalates in their direction. I have already blogged about the use of technology for survival in areas of conflict: see Fallujah, El Salvador and an overview here. I have also noted that the disaster management community tends to adopt new technology long before the conflict prevention community does. Today’s Wired magazine features a neat review of “Survival Gear that’s Just Crazy Enough to Work.” While the review does not evaluate the gear for purposes of survival in conflict zones, at least two types of gear reviewed may be relevant.

Take for example the Bedu Emergency Rapid Response kit below. The kit fits in a keg-sized drum and is designed to “support eight adults for up to five years and it includes a water-filtration system, medicine and tool kits, a multi-fuel stove, a radio and a hand-crank generator with a photovoltaic battery pack and a strip-cell blanket. Not only that, but the skeleton of the barrel can be used to create a shelter.”

As Wired’s editors note, packing up the drum may take hours, which is not particularly useful in crisis zones when minutes can make the difference between life and death. However, alternative versions of the kit could be designed for quick set-up and quick packing. The drum could also be buried for later use if carrying it along were not an option.

Perhaps of more interest is the Grundig Eton Radio below. This device “includes AM/FM and weather-band frequencies, a two-way walkie-talkie channel, a flashlight, a siren, a beacon light and a cellphone charger.” According to Wired, the radio is also incredibly tough and only $150.

Patrick Philippe Meier