Last weekend there was a live screencast of the Aspen Institute’s Forum on Communications and Society, and one of the sessions I tuned in to was the one on Media and Civic Engagement. The attendees were a who’s who of media, regulatory, and business figures who are trying to crack, or have cracked, the online space (Craig Newmark of Craigslist, Marissa Mayer of Google, Peter Shane of the Knight Foundation, danah boyd, etc.).
I heard a very troubling comment during that discussion, and surprisingly it came from Marissa Mayer of Google (at 52:45): that anonymity is the enemy of trust, and that she doesn’t see a future for anonymity online. It destroys community and promotes anarchy.
To give some context, without having to watch the video, here is a word-for-word transcription of Marissa’s comments. Speaking about youth and misinformation on the web leading to apathy, she stated:
“…I think it’s really important as we look at tools to think about how we can support fact checking, how can we guard against misinformation, how is there going to be established an element of authority and trustworthiness? …I grew up with the newspaper and the encyclopedia, which you could trust. And now you have blogs, which are held often as news and often aren’t factual. Or you have Wikipedia, which usually gets most things right, but there are a lot of times there is vandalism or corrections that need to be made.”
“When you look at the elements of anonymity and the lack of accountability that happens on the web, it really does start to create doubt in the fibers of who can you trust. Especially when you think of why should I engage? The sense of identity. If I’m anonymous and I’m not accountable for my actions and there are other people out there putting out a lot of misinformation of which the same is true, I think it does lead to apathy and a lack of engagement, which is why I think it is important as we look at these tools to understand the effects of identity. To understand the effect of accountability, authority, trustworthiness and make sure that we’re developing tools and social systems online that encourage an element of engagement and try to fight that apathy trend that says, ‘well I just can’t trust anything. Why should I care?’”
On the question of whether there is a way to home in on the issue of misinformation, besides media literacy:
“Well, I think there are two ways to look at it, on the institutional level and on the individual level. So I think that what you’re seeing is that there are institutions that are rising up online that basically have an element of brand and credibility and standards that they apply. When you look at the Huffington Post, the Drudge Report, inherently the people who run those organizations are saying that here are stories I believe, I believe they’re verified enough that I’m willing to attach my brand and my name to it. So you can see that that’s starting to happen on an institution level online.”
“And I also think there are individual systems where people are verified or credentialed, or you have a profile that tells all about you and shows the other contributions you’ve made to the system. Just there’s greater accountability on the personal level… So I think a lot of the systems that support pure anonymity… I really believe that virtual systems should mirror physical systems. The physical world has been around for a lot longer, and in the physical world you really can’t do anything anonymously. So when you look at systems online that break that paradigm where you can be completely anonymous, or be whoever you want to be, without any sense of history or of what you did last week, that’s not really reality and that breaks down the elements of trust and authority.”
That’s about where I jumped in with my comments on not being able to trust those who are monitoring your online speech. Marissa then answered:
“Well, I think anonymity has its place. So there are certainly times when, you know, commentary or some type of act of giving should be anonymous. But, by and large, most systems should have accountability the same way they do in the physical world.”
Beyond all of my thoughts swirling around the fact that the web really grew because of anonymity, I balked at this comment because I was surprised to hear one of Google’s highest executives speak so dismissively of it.
Maybe this is where I differ a little from my American tech counterparts. You see, there’s something about growing up in a country where you can’t pretend that the government really has your best interests at heart that makes one a little squeamish about giving up anonymous free speech. For if it weren’t anonymous, it definitely wouldn’t be free.
We have a way of projecting our worldview onto those around us. In this case, I believe Google (or Marissa) is doing just that. Having these open, trusting, everyone-knows-everyone systems is all well and good when you live in the US. It’s not so good in other parts of the world.
It’s especially not good when you ask who controls all that personal information, and how they let outside bodies (government or otherwise) access that personal data about you.
I came to terms a few years ago with having a lot of personal information on the web, open to others. That’s a personal decision, and not one that any company should claim to know how to make for everyone. What I hear, extrapolating from this, is that it’s okay if you don’t want to be a part of it, you can always opt out; but if you do, you also opt out of any meaningful part in the discussion. Frankly, I find that troubling.
Below is the video archive of this talk on Media and Civic Engagement; it runs about 1.5 hours (browse the “on demand library” and it’s the 6th from the top of the list):
[Rachel Sterne of Groundreport created a great backchannel platform for viewers to discuss these items in real time, and there was some direct discussion happening between online commenters and the participants in the room.]