I’m here at Web 2.0 Expo, and heard Clay Shirky give an interesting presentation on what he calls “filter failure.” Here’s a summation from my notes:
His general premise is that, by thinking about Web 2.0 in terms of the oft-cited “information overload,” we’re looking at the problem in the wrong way. Here is IDC’s graph of the growth of information:
Shirky contends this is the same graph that’s been in play since the invention of the printing press. The shift began in the 1500s, when the cost of publishing a book dropped and creating one became (relatively) easy. Gutenberg introduced the “up front risk”: books have to be printed before they can be sold, so the publisher takes on all the risk. To mitigate that risk, the publisher took on the role of ensuring quality. By acting as a gatekeeper, the publisher kept the risk as low as possible. And prior to the web, all forms of media (radio, television) worked within this same “up front” economic model.
The web, of course, changed all that.
With the Internet, there is no economic model that says you need to filter for quality before publishing. (See: Livejournal, according to Shirky.) So, going back to the chart, what we’re seeing now is not information overload, but a change in filters. Thinking in terms of information overload misses the point; thinking in terms of filter failure is the right approach.
Shirky pointed to spam: it’s not that spam represents a large percentage increase in the amount of email. The problem with spam is keeping it out of the inbox. Spam requires different kinds of filters, and all of them are temporary; “there is no set it and forget it for spam.”
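Shirky didn’t show code, but the “no set it and forget it” point can be sketched with a toy keyword filter (the function name and blocklist are invented for illustration; real spam filters are statistical and retrained continuously):

```python
# Toy illustration of a keyword-based spam filter.
# BLOCKLIST is a hypothetical, hand-maintained set of phrases.
BLOCKLIST = {"v1agra", "free money", "act now"}

def is_spam(message: str) -> bool:
    """Flag a message if it contains any blocked phrase."""
    text = message.lower()
    return any(phrase in text for phrase in BLOCKLIST)

print(is_spam("Act now for FREE MONEY!"))  # True
print(is_spam("Lunch tomorrow?"))          # False

# The filter is temporary: spammers change their spelling ("fr3e m0ney"),
# so the blocklist has to keep changing too.
print(is_spam("Get fr3e m0ney today"))     # False -- the old filter misses it
```

The last line is the whole point: the moment the filter is fixed in place, the input shifts underneath it.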
This is a general system-design problem: not a computer-system problem, but a social-system one.
Shirky then told a story about privacy on Facebook. A friend of his broke off her engagement, and had to deal with the inevitable change in status on Facebook. She wasn’t a casual user; her dissertation was about Facebook. Yet even though she had changed all the privacy settings before changing her status, everyone she knew received the update.
Shirky’s point is that managing privacy through a software interface isn’t at all a natural thing. We’ve lived most of our lives neither in the “bubble of privacy or the glare of public life.” We used to have this thing called a “personal life.” Now, it’s like every word we ever say is recorded for posterity.
In our prior life, we could be walking down the street talking to someone, and while the people around us could technically hear, they really didn’t. “The inefficiency of information flow wasn’t a bug — it was a feature.” And now that we have a world of explicit privacy settings, we have a problem.
He also related a story about a college student in Canada who was put up for expulsion after he started a Facebook group, which the school called a form of cheating. Here, the “real life” world of the college campus clashed with the “online” world of Facebook. Again, filters: two different messages crashing into each other.
We’re breaking the system of filters we’ve had for so long. Metaphors like “real life” and “online” don’t cut it. Our current way of thinking assumes “we are to information overload as a fish is to water,” but if we’ve had the problem of information overload ever since the 1500s, it’s about time we realize that it’s not a “problem,” but a fact.
Some of the solution is tags and bookmarks. But some is rethinking the question, “which filter just broke?” When we start asking that question, we’ll get some clue as to where to put the design effort.
(Shirky’s slides (brief as they are…) can be found here.)