Managing with Trust and Expectation

For the past six years I’ve been part of a rather unique organization in Ushahidi, where we decided early on that we would run the organization by trusting each other and expecting everyone to act like responsible adults. It’s worked brilliantly, even as we’ve grown and spun up new enterprises and organizations such as iHub and BRCK.

Yesterday I read about the Berkshire Hathaway “strategy of trust”:

Mr. Munger, 90, was ruminating on the state of corporate governance, offering a counternarrative to the distrustful culture of most businesses: Instead of filling your ranks with lawyers and compliance people, he argued, hire people that you actually trust and let them do their job.

It’s well worth a read, and I didn’t expect to find parity in leadership philosophy between us and a 300,000-person family of organizations.

How do we do it?

There are probably other organizations like ours, ones who have decided to trust their team and assume that people make good decisions based on the best interests of the organization and their colleagues, over themselves. We didn’t set out with a great body of knowledge on how to do this, but instead with some theories that we’ve refined over time. Here are the most important ones:

Find the Right People
David, Juliana and I particularly don’t like to micromanage. We’ll work with you to define the goal, but if you expect someone to tell you how to get there, you won’t fit. We don’t check up on you all the time; you tell us when there’s a snag. You need to work autonomously. We’ll help, and are always there for a conversation, but your job is to get from point A to point B.

It’s always better to find people who are smart and get things done, who can work autonomously and tend to not put themselves first. Big egos don’t go well with this kind of team, so we look for humility when interviewing.

I remember making a mistake back in 2009, hiring someone off of reputation and resume, without really digging into their portfolio or doing multiple interviews. Ever since then I’ve refused to look at CVs or resumes and each new person goes through about 4-5 other people on the team before we make the final decision. Those other people on the team catch things I wouldn’t, some about skill, but most about ethos and personality.

Knowing the Ethos
If an emergency happens where you are, can you make a decision and run with it, without having to ask permission? You should be able to. This is especially important in an organization with a globally distributed team that deals with crisis and disaster. We decided that everyone should be able to make critical decisions about deployments of the software, partnerships and strategic steps on their own. Just fill everyone else in on it as it comes up and if adjustments need to be made, then we do it together.

To make this work, we had to ensure that everyone on the team, from junior engineers to new QA staff, actually understood the foundational elements of the organization. Not just what we built, but why we built it, how it all started and where we were going in the future. While there are no “intro to X” classes, we do throw you in the deep end early on. It started with our first hire, Henry Addo from Ghana, who found himself speaking in the French Senate in Paris in his first month on the job. That made us realize that public speaking forces you to learn a lot more about the organization that you’re in, quickly.

Our goal is that a camera and mic can be put in front of any team member and they can answer any question on the organization. The way they answer it might be different from mine due to speaking styles, but because they understand the ethos of the organization, it is still correct.

Per Diems
We don’t do per diems. You’re traveling for the organization, so spend what you need on food, lodging and transport. Be responsible about it, since this is money needed for the organization to grow. If you’re in NYC, we know things are more expensive; if you’re in Omaha, we know they’re not. The “Agency Effect” (or Principal-Agent Problem) comes into play here, as the incentives between a team member and the organization are wrong if they get an allowance for travel.

Final Thoughts

I suppose what I’m saying is that if you truly trust people to act like the adults they are and to do the right thing, they generally do. All the corporate oversight you can apply won’t stop an Enron from happening, so something else has to work. It has to be something that’s real though, people can sniff out very quickly if it’s a manufactured, or fake, trust. This means as much of the onus lies on the leaders to “let go” as it does for the team members to shoulder and own the expectations that come with their role.

My greatest takeaway from Mr. Munger and Mr. Buffett was found in the last paragraph:

Mr. Munger, in a previous annual meeting, contended that the best way to hold managers accountable is to make them eat their own cooking. Mr. Munger pointed to the late Columbia University philosophy professor Charles Frankel, who believed “that systems are responsible in proportion to the degree in which the people making the decisions are living with the results of those decisions.” Mr. Munger cited the Romans, “where, if you build a bridge, you stood under the arch when the scaffolding was removed.”

We all need to stand under our own bridges more often, and I’m going to figure out how to make that happen in my organizations.

Trusted Intermediaries

If you’ve run into me in the last couple months you’ll likely have heard me talking a lot about the need, power and abilities of trusted intermediaries. What is a trusted intermediary? It’s someone who sits between two parties, entities or ideas that don’t naturally trust each other and provides a bridge.

Do you trust this bridge? Why?


In some ways, this train of thought stems from the posts on bridgers and xenophiles started by Ethan Zuckerman and riffed on by me. It’s only as my continued work in the African tech space has evolved that I have come to understand the true value of this concept. My own position makes me realize how valuable it is to be trusted and at the center of a group of unknowns (ideas, funding, people or projects). It’s in the unknown areas of our lives that we search for trust, for people or conduits that impart a measure of confidence to our next decision. For the nod that tells us we’re heading out on the right path.

We lean on trusted intermediaries all the time, in both mundane decisions and important interactions. When you’re looking for a mechanic, you’ll trust your neighbor’s opinion over the phone book. If you need a new bike helmet, you’ll trust online reviews before you buy one with no reviews. Likewise, when you’re going to make a large investment in the African tech space, you’ll search out trusted intermediaries first.

A case study: Ushahidi

When someone is looking to invest in an African tech startup, using seed funding or grants (and it is the same for non-profits and for-profits), they are nervous. There are a lot of other good ideas out there in other parts of the world, the low-hanging fruit, that they feel more comfortable putting money into. Why Africa? Why you?

Ushahidi started off quickly, and we were able to raise funds for continued operations much faster than many other similar non-profit tech organizations. While we’d all like to think it’s due to the brilliant tool we’ve built, we have to be honest and recognize that the individuals behind it are what gave the funders confidence to move forward. Ory, David, Juliana and I had been on the public stage for a while; we were known quantities.

We were trusted intermediaries before Ushahidi was even thought of. Which raises the question: would our team have been able to raise funds for almost any idea just as easily? Probably not, as the Ushahidi idea, timing and application are special. However, the point still stands: money flows when the people are trusted.

Trusted intermediaries elsewhere

Jon Gosier is a trusted intermediary. His Appfrica Labs incubator and innovation center in Kampala provides a person and entity that funders, projects and individuals are drawn to. His blog keeps him front and center in people’s minds.

Glenna Gordon is a trusted intermediary. She’s a photographer who has been romping around Central, East and West Africa for a couple of years. If you need a pro shooter in a hard spot like Liberia, you’ll find her blogging away at Scarlett Lion.

Eric Osiakwan in Ghana is a trusted intermediary. His leadership at the African ISP Association and the track record he’s had on projects makes him an easy person to go to in West Africa, and his Internet Research firm makes a perfect conduit for interacting with him.

Of course, these three are just a sample; there are many more like them across the continent in different fields.

What is consistent about trusted intermediaries is that they have found a way to create a bridge between two things, and are trusted by both sides of that bridge. It’s why personal relationships, consistency, reliability and trust are more important now than ever before.

Google on Anonymity VS Trust

Last weekend there was a live screencast of the Aspen Institute’s Forum on Communications and Society, and one of the meetings that I tuned in to was the one on Media and Civic Engagement. The members of that meeting were a who’s who of media, regulatory and business moguls who are trying to crack, or have cracked, the online space (Craig Newmark of Craigslist, Marissa Mayer of Google, Peter Shane of the Knight Foundation, danah boyd, etc.).


I heard a very troubling comment during that discussion, and surprisingly it came from Marissa Mayer of Google (found at 52:45): that anonymity is the enemy of trust, that she doesn’t see a future for anonymity online, and that it destroys community and promotes anarchy.

To give some context, without having to watch the video, here is a word-for-word transcription of Marissa’s comments. It starts with her talking about youth and misinformation on the web leading to apathy:

“…I think it’s really important as we look at tools to think about how we can support fact checking, how can we guard against misinformation, how is there going to be established an element of authority and trustworthiness? …I grew up with the newspaper and the encyclopedia, which you could trust. And now you have blogs, which are held often as news and often aren’t factual. Or you have Wikipedia, which usually gets most things right, but there are a lot times there is vandalism or corrections that need to be made.”

“When you look at the elements of anonymity and the lack of accountability that happens on the web, it really does start to create doubt in the fibers of who can you trust. Especially when you think of why should I engage? The sense of identity. If I’m anonymous and I’m not accountable for my actions and there are other people out there putting out a lot of misinformation of which the same is true, I think it does lead to apathy and a lack of engagement, which is why I think it is important as we look at these tools to understand the effects of identity. To understand the effect of accountability, authority, trustworthiness and make sure that we’re developing tools and social systems online that encourage an element of engagement and try to fight that apathy trend that says, ‘well I just can’t trust anything. Why should I care?’.”

On the question of whether there is a way to hone in on the issue of misinformation, besides media literacy:

“Well, I think there are two ways to look at it, on the institutional level and on the individual level. So I think that what you’re seeing is that there are institutions that are rising up online that basically have an element of brand and credibility and standards that they apply. When you look at the Huffington Post, the Drudge Report, inherently the people who run those organizations are saying that here are stories I believe, I believe they’re verified enough that I’m willing to attach my brand and my name to it. So you can see that that’s starting to happen on an institution level online.”

“And I also think there are individual systems where people are verified or credentialed, or you have a profile that tells all about you and shows the other contributions you’ve made to the system. Just there’s greater accountability on the personal level… So I think a lot of the systems that support pure anonymity… I really believe that virtual systems should mirror physical systems. The physical world has been around for a lot longer, and in the physical world you really can’t do anything anonymously. So when you look at systems online that break that paradigm where you can be completely anonymous, or be whoever you want to be, without any sense of history or of what you did last week, that’s not really reality and that breaks down the elements of trust and authority.”

That’s about where I jumped in with my comments on not being able to trust those who are monitoring your online speech. Where Marissa then answered:

“Well, I think anonymity has its place. So there are certainly times when, you know, commentary or some type of act of giving should be anonymous. But, by and large, most systems should have accountability the same way they do in the physical world.”

Besides all of my thoughts swirling around the fact that the web really grew due to anonymity, I balked at this comment because I was surprised to hear one of Google’s highest executives speak so lightly of it.

Projecting Our World Onto Others

Maybe this is where I differ a little from my American tech counterparts. You see, there’s something about growing up in a country where you can’t pretend the government really has your best interests at heart that makes one a little squeamish about not having this anonymous free speech. For, if it wasn’t anonymous, then it definitely wouldn’t be free.

We have a way of projecting our world view onto those around us. In this case, I believe Google (or Marissa) is doing just that. Having these open, trusting, everyone-knows-everyone systems is all well and good when you live in the US. It’s not so good in other parts of the world.

It’s especially not good when you ask who controls all that personal information, and how they let outside bodies (government or otherwise) access that personal data about you.

I came to terms a few years ago with having a lot of personal information on the web, open to others. That’s a personal decision, and not one that any company should claim to know the right answer to. What I hear, extrapolating from this, is that it’s okay if you don’t want to be a part of it, you can always opt out – but if you do, you also opt out of any meaningful part in the discussion. Frankly, I find that troubling.

Video Archive

Below is the video archive of this talk on Media and Civic Engagement; it’s about 1.5 hours long (browse the “on demand library” and it’s the 6th from the top on the list):

[Rachel Sterne of Groundreport created a great backchannel platform for viewers to discuss these items in real-time, and there was some direct discussion happening between online commenters and the participants in the room.]