Tag Archives: elections

Reports on m:lab and Umati

This week two reports have come out of the iHub community.

m:lab East Africa after 2 years

The study which was conducted between April and May 2013 focused on 3 key activity areas at the m:lab namely:

  • Mobile entrepreneurship training
  • Pivot East regional pitching competition
  • The incubation program

The highlights are on the iHub blog for now; the full report will be downloadable as soon as it has been formatted.

Umati: monitoring dangerous speech in Kenya

The Umati project sought to identify and understand the use of dangerous speech in the Kenyan online space in the run-up to the Kenyan general elections. Apart from monitoring online content in English, a unique aspect of the Umati project was its focus on locally spoken vernacular languages; online blogs, groups, pages and forums in Kikuyu, Luhya, Kalenjin, Luo, Kiswahili, Sheng/Slang and Somali were monitored.


Download the full Umati report (PDF)

A 2013 Uchaguzi Retrospective



UPDATE: Here’s the report put together by the iHub Research team (3Mb PDF): Uchaguzi Kenya 2013

The elections in Kenya this year have had a lot of drama; nothing new there. As I wrote last week, Ushahidi has been involved quite heavily on the crowdsourcing side via Uchaguzi, which meant an exhausting week as the announcement of results kept getting pushed back each day.

Uchaguzi Update

Some basic statistics:

  • 5,011 SMS messages sent in (that weren’t spam or junk, as those got deleted)
  • 4,958 reports were created (from SMS messages, the web form, email and media monitoring teams)
  • 4,000 reports were approved to go live on the map
  • 2,693 reports were verified (67% of approved reports)
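As a quick sanity check on the figures above, the verification rate works out as stated (a minimal snippet, using only the numbers from this post):

```python
# Uchaguzi deployment figures, taken from the statistics above.
sms_received = 5011      # non-spam SMS messages sent in
reports_created = 4958   # from SMS, web form, email and media monitoring
approved = 4000          # reports approved to go live on the map
verified = 2693          # reports verified by the verification team

# Share of approved reports that were verified.
rate = verified / approved
print(f"Verified share of approved reports: {rate:.0%}")  # → 67%
```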

Notes and Links:

  • Many reports, links and updates can be found in our virtual situation room
  • The analysis team provided twice daily rundowns based on verified data at http://visuals.uchaguzi.co.ke/
  • Rob created a map visual to show the reports coming into Uchaguzi over time.
  • The IEBC tech system failed; I started a Tumblr to figure out how the system was built, which companies were involved and what they did, and what actually went wrong.
  • Before the IEBC tech system was shut off, Mikel used their API to create maps (1, 2) and Jeff and Charles created a mobile-friendly results site as well.
  • Heather wrote up a good post on our situation room blog about what we’ve learned along the way.

Here’s an Uchaguzi community graphic:
Uchaguzi community graphic

Kenya’s 2013 IEBC Election Tech Problems

TL;DR – Kenya’s IEBC tech system failed. I started a site to collect notes and facts, read it and you’ll be up to date on what’s currently known.

Kenya’s IEBC (Independent Electoral and Boundaries Commission) had an ambitious technology plan, part based on the RTS (Results Transmission System) and part on the BVR (Biometric Voter Registration) kits; the latter I am neither interested in nor writing about. The RTS was built on a simple idea: each of the 33,000 polling stations would have a phone with an app that would send provisional results to centralized servers, display them locally, and make them available via an API. It should be noted that the IEBC’s RTS was a slick idea, and if it had worked we’d all be having a much more open and interesting discussion. The RTS was an add-on for additional transparency and credibility; the manual tally was always going to happen and was the official channel for the results.

The Kenya IEBC tech system elections 2013

On Tuesday, March 5th, the day after the elections, the IEBC said they had technical problems and were working on it. By 10pm that night the API was shut off. This is when my curiosity set in – I didn’t actually know how the system worked. So, I set out to answer three things:

  1. How the system was supposed to work (Answer here)
  2. Who was involved and what they were responsible for (Answer here)
  3. What actually failed, what broke (Answer here)

Turns out, it wasn’t easy to find any answers. Very little was available online, which seemed strange for something that should have been openly communicated. We all benefit from a transparent electoral process, and especially from transparency in the system meant to provide just that.

So, I set up a site to ask some questions, add my notes, aggregate links and sources, and post the answers I found about the RTS system. I did it openly and online so that more people could find it and help answer some of the questions, and so that there would be a centralized place to find some facts about the system. By March 6th, I had a better understanding of the flow of data from the polling stations to the server and the API, and an idea of which organizations were involved:

  • Polling station uses Safaricom SIM cards
  • App installed in phone, proprietary software from IFES
  • Transmitted via Safaricom’s VPN
  • Servers hosted/managed by IEBC
  • JapakGIS runs the web layer, pulling from IEBC servers
  • Data file from IEBC servers sent to Google servers
  • Google hosted website at http://vote.iebc.or.ke
  • Google hosted API at http://api.iebc.or.ke
  • Next Technologies is doing QA (quality assurance) for the full system

IEBC tech system diagram
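As a toy illustration of the flow described above, here is a minimal sketch of the kind of record a polling-station phone might transmit. The real IFES app and IEBC servers are proprietary and undocumented, so every name and field here is invented for illustration only:

```python
# Hypothetical sketch of the RTS data flow: polling station -> VPN -> servers
# -> public site/API. All function names, field names and formats are
# assumptions; the real system's internals were never made public.
import json

def provisional_result(polling_station_id, candidate_tallies, rejected_votes):
    """Build the kind of record a polling-station phone might send in."""
    return {
        "station": polling_station_id,
        "tallies": candidate_tallies,   # candidate -> provisional vote count
        "rejected": rejected_votes,     # rejected/spoilt ballots
    }

def transmit(record):
    """Stand-in for sending a record over the carrier VPN to central servers."""
    payload = json.dumps(record)
    # In the real system, payloads like this would flow from ~33,000 stations
    # to centralized servers, then out to the hosted website and API.
    return payload

payload = transmit(provisional_result("001/001", {"A": 312, "B": 254}, 7))
print(payload)
```

The point of the sketch is how simple the per-station data is; as argued later in the post, the hard part was integration and process, not the payload.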

Why now? Why not wait a week until the process is over?

It’s been very troubling for me to see people speculating on social media about the IEBC tech system, claiming there have been hackers and spreading other sorts of misinformation. Those of us in the technology space were looking to the IEBC and its partners for the correct information so that these speculative statements could be laid to rest. I deeply want the legitimacy of this election to be beyond doubt. The credibility of the electoral system was being called into question, and clear, detailed and transparent communication was needed in a timely manner. It took a long time to come, thus my approach.

Interestingly, Safaricom came out with a very clear statement on what they were responsible for and what they did. Google was good enough to make a simple statement of their responsibilities on Tuesday. Both of these companies helped answer a number of questions, and I hoped that the other companies would do the same. Even better would have been a clear and detailed statement to the public from the head of IEBC’s ICT department. Fortunately, they did provide some general tech statements, claimed responsibility, refuted the hacking rumor, and made the decision to go fully manual.

My assumption was that since this was a public service for the national elections, the companies involved would be publicly known as well. This wasn’t true; it took a while of asking around to get an idea of who did what. On top of that, in a country that has been expounding on open data and open information, I was surprised to find that most of the companies didn’t want to be known, and that a number of people thought it was a bad idea to go looking for who they were and what they did. I wasn’t aware that this information was supposed to be secret; in fact, I assumed the opposite: that it would be freely announced and acknowledged which companies were doing what, and how the overall system was supposed to work.

I’ve spoken directly to a number of people who are very happy that I’m asking questions and putting the facts I find in an open forum, and some who are equally upset about it. Much debate has been had openly on Skunkworks and Kictanet about this, and when we debate ideas openly we fulfill the deepest promise of democracy. My position remains that this information should be publicly available, and the faster it’s made available, the more credible the IEBC and its partners are.

By Friday, March 8th, I had the final response on what went wrong. My job was done. Now it’s up to the rest of the tech community, the IEBC and the lawyers to do a post-mortem, audit the system, etc. I look forward to those findings as well.

Finally, I’ll speculate.
My sense of the IEBC tech shortcomings is that they had very little to do with the technology, or with the companies creating the solution. It was a fairly simple technology solution that had a decent amount of scale, plus many organizations that needed to integrate their portion of it. Instead, I think this is a great example of process management failure. The tendering process, project management and timelines don’t seem to have been managed realistically. The fact that the RFP due date for the RTS was January 4, 2013 (exactly two months before the elections) is a great example of this.

Some are saying that the Kenyan tech community failed. I disagree. The failure of the IEBC technology system does not condemn, nor qualify, Kenya’s ICT sector. Though this does give us an opportunity to discuss the gaps we have in the local market, specifically the way that public IT projects are managed and the need for proper testing.

It should be said that all I know is on the IEBC Tech Kenya site; said another way, read it and you know as much as me. There is likely much more nuance and many details missing, which can only be provided by an audit or by the parties involved stepping forward and saying what happened.

Uchaguzi: Full-Circle on Kenya’s Elections

Uchaguzi: 2013 Kenyan Election Monitoring Project

Just over 5 years ago, I was just like everyone else tuning into the social media flow of blogs, tweets and FB updates, along with reading the mainstream media news about the Kenyan elections. We all know the story: things fell apart, a small team came together and built Ushahidi, and we started building a new way to handle real-time crisis information. We were reacting, and behind, from the beginning.

(side note: here are some of my early blog posts from 2008: launching Ushahidi, the day after, and feature thoughts)

Now, the day before Kenya’s elections, I’m sitting in the Uchaguzi Situation Room. We’ve got a live site up, already receiving information, and 5 years of experience building the software and learning about real-time crowdmapping. There are over 200 volunteers already trained up and ready to help manage the flow of information from the public. This time Kenya’s IEBC is ready: they’re digital, and are doing a phenomenal job of providing base layer data, plus (we hope) real-time results tomorrow.

In short, everyone is a lot more prepared in 2013 than we were in 2008. However, you’re never actually ready for a big deployment; by its very nature, crowdsourcing information means you are reacting, always behind the action. So, our main goal is to separate signal from noise and get it to the responding organizations as fast as possible.

Uchaguzi 2013

If you’d like to know more about the Uchaguzi project, see the about page. In short, Uchaguzi is an Ushahidi deployment to monitor the Kenyan general election on March 4th, 2013. Our aim is to help Kenya have a free, fair, peaceful, and credible general election. Uchaguzi’s strategy is to contribute to stability in Kenya by increasing transparency and accountability through active citizen participation in the electoral cycle. This strategy is implemented by building a broad network of civil society around Uchaguzi as the national citizen-centred electoral observation platform that responds to citizen observations.

The next couple days I’ll be heads-down on Uchaguzi, running our Situation Room online and Twitter account (@Uchaguzi), and troubleshooting things here with the team. We’re already getting a lot of information, trying to work out the kinks in how we process the 1,500+ SMS messages that people have sent into our 3002 shortcode, so that tomorrow when things really get crazy we’re ready.

I’ve already written up a bunch on how Uchaguzi works, so I’ll just post the information flow process for it here:

Uchaguzi's workflow process

Uchaguzi’s workflow process
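The stages in that workflow can be sketched in a few lines. This is a minimal toy model of the triage flow described in this post (message in, spam filtered out, report created and approved, then verified); the data structures and function names are invented for illustration only:

```python
# Toy sketch of the Uchaguzi triage flow: raw SMS -> spam filter -> report
# created/approved -> verified. Stage names follow the post; everything else
# is an assumption.
def triage(messages, is_spam, can_verify):
    """Run raw SMS messages through the triage stages."""
    approved, verified = [], []
    for text in messages:
        if is_spam(text):
            continue                 # spam/junk is deleted outright
        report = {"text": text}      # a geolocated report gets created
        approved.append(report)      # approved to go live on the map
        if can_verify(report):
            verified.append(report)  # confirmed by the verification team
    return approved, verified

# Toy run: one junk message and two real ones, one of which can be verified.
msgs = ["WIN FREE AIRTIME!!", "Long queues at station 12", "Tally delayed"]
approved, verified = triage(
    msgs,
    is_spam=lambda t: "FREE" in t,
    can_verify=lambda r: "station" in r["text"],
)
print(len(approved), len(verified))  # → 2 1
```

In the real deployment these stages were performed by trained volunteer teams, not code; the sketch only shows why throughput depends on making each hand-off as fast as possible.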

Your Job

As in 2008, your job remains the same: get the word out to your friends in Kenya, get more reports into the system, and support groups working towards a good election experience.

A huge thank you to the local and global volunteers who’ve put in many, many hours in the run-up to tomorrow and who will be incredibly busy for the next 48 hours. Besides the hard work of going through SMS messages and creating geolocated reports out of them, some of the geomapping team have been busy taking the police contact information and mapping it. They’ve created an overlay of the data; it’s on this page right now, but the plan is to put it on the main map later.

Just as in 2008, a few people are making a big difference: all of the volunteers doing what little they can to make their country better.

Geomapping team for Uchaguzi

  • Leonard Korir
  • Samuel Daniel
  • Luke Men Orio
  • Slyvia Makario
  • Wawa Enock
  • Mathew Mbiyu

Some other helpful links for the Kenyan elections

IEBC
Find your polling station
Voter education
Mzalendo
Got To Vote
Wenyenche
Google Elections Site
The Kenyan Human Rights Commission
Mars Group
Kenya Nation Election Coverage
Standard Media Kenya
Kenya’s Freedom Media Council

My (short) TED Talk on Ushahidi

I was fortunate enough to be at TED this year as a Fellow. While there, I gave a short TED University talk on the roots of Ushahidi, where it’s going, and a new initiative called Swift River. Needless to say, it was only 4 minutes, so I couldn’t fit in all the information that I wanted to. If you would like to know more about Swift, take a look at this video where Chris and Kaushal talk about it in more detail.

Currently we’re seeing this at work in India, where a group of people have come together to deploy Ushahidi and Swift River to gather information from normal people about the elections.

Nate Silver: Race, Prediction and the US Election

Math whiz and baseball fan Nate Silver was mainly known for predicting outcomes in fantasy ballgames — until his technique hit a home run calling the outcome of the 2008 election primaries. He’s now a mainstream political pundit with two book deals.

Nate Silver on Prediction and Race at TED 2009

Nate starts off by talking about how big a win Obama had in 2008’s US elections. He asks, “What’s the matter with Arkansas?”, wondering why certain US states never vote for Democrats.

We have negative connotations about Arkansas; typically it’s something like “rednecks with guns”. We think it’s a problem of race. Are we stigmatizing? Well, yes, and he sets out to prove statistically that it is.

One of the classic polling questions for the last couple presidential elections in the US has been:

“In deciding your vote for president today, was the race of the candidates a factor?”

The answers to this poll question have been indicative of how different areas of the US tend to vote.

Is racism predictable? What are the deciding factors: income, religion, education?
Education is one, as is how rural or urban the place you live is. So, yes, racism is predictable.

The General Social Survey asks, “Does anyone of the opposite race live in your neighborhood?” The answers are stratified by density: in the city, yes; in the suburbs, mainly yes; in rural areas, not nearly as much.

It turns out that people who live in monoracial areas are only half as likely to approve of multiracial marriages.

The goal is to facilitate interaction between people of different races. Nate is a big fan of cities, because they offer great opportunities to connect with other cultures and races; you end up with more tolerant communities. He also says urban design is hugely important: grids versus the winding streets found in much of suburbia, with grids being better. At the end of the day, he says, cul-de-sacs lead to conservatives, which to him is also a bad thing.

Apps for America: Snapvote

I came across the Sunlight Foundation’s “Apps for America” contest last week, and it reminded me of a side-project of mine from a couple of years back that never got off the ground. Add in today’s US Presidential inauguration and it was just too much for me not to share this idea. I’m now too busy with Ushahidi to do this, but I think it could be a good candidate for this competition, and I hope someone builds it.

History

Just over two years ago I was thinking about the upcoming US national elections and about building a web application that would be useful to the general public and also had some business potential. I drew out an idea I thought had a lot of merit, and I sat down with two really smart people (Meagan Fisher and Jason Hawkins) and we ended up sketching out most of the app concept. However, no code was ever laid down; it was just a lot of background work trying to understand the feasibility, the market and the data.

It was called SnapVote: A tool for keeping citizens informed about elections in their area

SnapVote: Homepage Mockup

What is SnapVote?

We wanted SnapVote to be the easiest way for Americans to figure out whom to vote for in any political race. The name came from the idea that voters could get a snapshot of politicians, races and platforms before they voted.

We were going to provide a party-agnostic snapshot of who was running for office in each person’s area, so voters would be informed in less time and with less hassle than ever before. Every politician running for public office would have a default profile on SnapVote, which could be upgraded for a small fee, giving the politician their own space on the web.

What’s the problem?

  • There’s a lot of noise around election time
  • Most of us are “lazy voters” who don’t really know who to vote for
  • We’re getting told what the issues are
  • Politicians have horrible websites that are hard to find

What’s the solution?

  • Quickly get a snapshot of who is running for office and what they stand for
  • Weigh in on the issues that YOU think are important – users decide
  • Every politician has their own website and can upgrade it for more features
  • Politicians get a snapshot (weekly/monthly) of the issues that are important to their constituents

What does it do?

  • A database of candidates for office at the federal, state and local levels
  • Aggregate user voting determines what issues are important for each constituency
  • Politicians can use Snapvote as their primary communication, fundraising and volunteer platform
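The issue-aggregation idea in that list can be sketched very simply. SnapVote was never built, so this is purely a hypothetical illustration of how per-constituency issue tallies might work; all names and data here are invented:

```python
# Hypothetical sketch of SnapVote's issue aggregation: users up-vote the
# issues they care about, and per-constituency tallies surface what matters
# most to each electorate. Everything here is an invented illustration.
from collections import Counter

def top_issues(votes, constituency, n=3):
    """votes: list of (constituency, issue) pairs from user up-votes."""
    tally = Counter(issue for c, issue in votes if c == constituency)
    return [issue for issue, _ in tally.most_common(n)]

votes = [("KS-03", "education"), ("KS-03", "healthcare"),
         ("KS-03", "education"), ("MO-01", "jobs")]
print(top_issues(votes, "KS-03"))  # → ['education', 'healthcare']
```

The same tallies, rolled up weekly or monthly, would feed the snapshot reports for politicians mentioned above.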

The Objective

SnapVote was going to be the primary source of consumer information about politicians. From the President to the local dogcatcher, anyone who ran for public office would be accounted for. It would also serve as the primary website for information on any specific politician, creating a website for each one.

SnapVote: Politician's Page

The Opportunity

SnapVote was in a position to be a first-mover in a space with very little competition. 122 million people voted in the 2004 national elections; this is SnapVote’s constituency. Providing an easy-to-use tool that makes even the laziest of voters ready for Election Day was the goal. Secondarily, upgrading each politician’s profile would cost a fee, and that number becomes quite large once you move past the US President and Congress and start accounting for governors, mayors, city councils and each state’s legislators. Initial income would come from politicians taking charge of their profiles on SnapVote. Other revenue opportunities would include aggregate data reports that could be sold to study groups, businesses and politicians.

SnapVote: Politicians (full)

More ideas

There was a lot more behind SnapVote, including aggregating people’s views on different political issues and on the politicians themselves. This data could be used to help individuals find the politicians they had the most in common with, especially in local elections. It would also come in handy for politicians, showing them what was hot (or not) at the grassroots level.

As you can see from the mockup design work, there were also some thoughts on making it easier for politicians to raise donations, plug into other social networks, run events and get people involved in their campaigns.

The biggest challenge was gathering data on politicians running in local elections. As I called the different departments and organizations that handle this information around the country, I found that almost every state had a different set of rules for getting that data, and it came in a multitude of formats.