Category Archives: Media Studies

It Was All A Big Joke

Ok, I’ll admit it. I got totally duped on this one:

Christwire has lately reached new levels of popularity, in part thanks to an Aug. 14 column, “Is My Husband Gay?” Written by Stephenson Billings, the piece is a 15-point checklist to help wives diagnose possibly closeted husbands. “Gym membership but no interest in sports” is one warning sign. So is “Sassy, sarcastic and ironic around his friends” and “Love of pop culture.” “Is My Husband Gay?” was picked up on The Huffington Post and mentioned by Ryan Seacrest on his radio show; so far it has been viewed 8.3 million times.

Oh, by the way: Christwire is all one big joke.

Not the readership — which hit a high of 27 million page views in August — but the content, the opinions and the fake authors who write the stuff.

Well played, gentlemen. Well played…


Gibson’s Critique

This op-ed, from novelist William Gibson, is perhaps the best, most succinct critique of Google I’ve ever seen:

Cyberspace, not so long ago, was a specific elsewhere, one we visited periodically, peering into it from the familiar physical world. Now cyberspace has everted. Turned itself inside out. Colonized the physical. Making Google a central and evolving structural unit not only of the architecture of cyberspace, but of the world. This is the sort of thing that empires and nation-states did, before. But empires and nation-states weren’t organs of global human perception. They had their many eyes, certainly, but they didn’t constitute a single multiplex eye for the entire human species.

Jeremy Bentham’s Panopticon prison design is a perennial metaphor in discussions of digital surveillance and data mining, but it doesn’t really suit an entity like Google. Bentham’s all-seeing eye looks down from a central viewpoint, the gaze of a Victorian warder. In Google, we are at once the surveilled and the individual retinal cells of the surveillant, however many millions of us, constantly if unconsciously participatory. We are part of a post-geographical, post-national super-state, one that handily says no to China. Or yes, depending on profit considerations and strategy. But we do not participate in Google on that level. We’re citizens, but without rights.

Read the whole thing here. It’s a must-read.

Journalism Wants To Be Free

By now, surely everyone has heard of, and seen, the Iraq video from Wikileaks. Of particular interest in the NYT article on the story is this section:

By releasing such a graphic video, which a media organization had tried in vain to get through traditional channels, WikiLeaks has inserted itself in the national discussion about the role of journalism in the digital age. Where judges and plaintiffs could once stop or delay publication with a court order, WikiLeaks exists in a digital sphere in which information becomes instantly available.

“The most significant thing about the release of the Baghdad video is that several million more people are on the same page,” with knowledge of WikiLeaks, said Lisa Lynch, an assistant professor of journalism at Concordia University in Montreal, who recently published a paper about the site. “It is amazing that outside of the conventional channels of information something like this can happen.”

A big part of the Wikileaks story is that they were able to decrypt the video, apparently using borrowed time on a high-powered computer system (one would be needed to break strong encryption; of course, the encryption used on this video may turn out to have been very weak). That’s not something CNN would likely be willing to do, let alone have the resources to spend on such an effort. But it’s the perfect sort of task for a site that falls outside the field of journalism — or, perhaps more precisely, a site that is redrawing its boundaries.

The question of just “what is journalism” has dogged journalists and academics since the rise of bloggers and “citizen journalists.” Wikileaks, though, seems to be a different sort of thing. Wikileaks isn’t a bunch of intrepid bloggers running through a FOIA data dump, and it’s certainly operating outside of what we’ve come to recognize as traditional journalism. The founder of Wikileaks, in fact, compares what the site does to something more like the CIA:

“That’s arguably what spy agencies do — high-tech investigative journalism,” Julian Assange, one of the site’s founders, said in an interview on Tuesday. “It’s time that the media upgraded its capabilities along those lines.”

One byproduct of the digitization of nearly everything these days is that information wants to be free. It should not be surprising that a video such as the one from Iraq, which is digitally captured and stored, could find its way in front of our eyes, just as the photographs from Abu Ghraib made their way onto our screens.

And it should not be surprising that a site such as Wikileaks would not only find a place in this new digitally-enhanced ecosystem of journalism and politics, but even thrive.

Journalist Admits Journalists Don’t Do Journalism

Howard Kurtz’s column today contains a stunning admission — Journalists really don’t do journalism:

In the end, [healthcare reform] may simply have been too dense for the media to fully digest. If you’re a high-information person who routinely plows through 2,000-word newspaper articles, you had a reasonably good grasp of the arguments. For a busy electrician who plugs in and out of the news, the jousting and the jargon may have seemed bewildering.

Once the law takes effect — its provisions stretched out over years — perhaps journalists can help separate rhetoric from reality. That is, if we don’t lose interest and move to the next hot controversy.

Kurtz tries to save his colleagues by attempting to point out where they actually provided value:

One stellar moment for the press was the refusal to perpetuate the myth of “death panels.” After Sarah Palin floated the idea that government commissions would decide which ailing patients deserved to be saved, journalists at The Washington Post, New York Times, CNN and ABC News, among others, said flatly that this was untrue.

But even this is a stretch. Oh, it’s true in the eyes of the news media, but that is because they have a warped sense of “journalism.”

In our corporate-funded news media system, providing “objective” news means bringing the two sides of a debate into a room and letting them have their say. In our media system, it is never the journalist’s job to actually state that something is not true.

Take, for example, the above-mentioned healthcare debate. Here is the transcript from an August 2009 CNN news show, at the height of the Tea Party protests. Anchor John Roberts is “fact-checking” the claims of the Republicans:

ROBERTS: Well, here again tonight to help us fact-check some of what we’re hearing is Bill Adair. He’s the editor of PolitiFact.com, which earned a Pulitzer Prize for its investigation of hundreds of political claims during the 2008 campaign.

Bill, it’s great to see you tonight. You heard the president’s response to this idea of death panels. Sarah Palin has a new posting on her Facebook page where she claims it’s the president who’s wrong. Here’s what she says.

“With all due respect, it’s misleading for the president to describe this section as an entirely voluntary provision that simply increases the information offered to Medicare recipients.”

So, what does the Truth-O-Meter say about all this, Bill? Is the former governor correct or is she incorrect?

BILL ADAIR, EDITOR, POLITIFACT.COM: She is incorrect. We gave that a false on our Truth-O-Meter on PolitiFact.com.

Really when you look at the bill, when you look at the language, it is voluntary. There is nothing in the bill that says that it’s mandatory. There’s nothing that backs up this claim. Now, Palin makes the point, well, perhaps seniors could feel pressured to take this care. And perhaps that’s possible.

But as the language is written now, as we have discussed it with experts, it’s just not true to say that it’s not voluntary. It is voluntary. It’s an optional thing. So, she gets a false on our Truth-O-Meter.

ROBERTS: False on the mandatory death panel. All right, Bill, cleared that one up.

A CNN anchor would never come out and say something is not true; he relies on someone from PolitiFact.com to say it. He needs the Truth-O-Meter, instead of just speaking the truth.

More importantly, I am sure it would be easy enough to find a campaign contribution made by Bill Adair, or some other statement he has made, to discredit him with an accusation of “bias.” And this is not just Bill Adair, but the hundreds of other pundits and interviewees who are used in the same way in similar stories all the time. This is why our news system does not work.

Until journalists start doing their job, until they stop relying on “experts” to provide the requisite two-sides-to-every-story, our politics will never be served by their existence.

From Writing to Personal Memory Device

h+ Magazine postulates a seemingly distant future where we all have “personal memory devices” that record, index, and make available every second of our experiential lives:

In the near future, someone will decide to record every moment of a human life from birth to death in digital storage…It will mark the era of personal memory offloading, an adaptive memory technology that records and indexes every single moment of your life. Offloading personal memory begins with a personal memory device, or a PMD. The basic PMD would be no more complex than a small video and sound recorder that captures your every experience. A PMD could be easily fitted shortly after birth; the least invasive option would be like a Bluetooth headset worn over the ear connected wirelessly to a local device no larger than a cell phone. Once installed, the PMD would capture and upload all first-person memories to a centralized database for indexing, search, and recall.

To be clear, the author isn’t talking about embedding chips in our brains, or recording our innermost thoughts and feelings. The idea here is more like a video recorder that’s running all the time, like a playback of your avatar in SecondLife. It would capture “GPS, Google Maps, facial recognition, speech/text recognition, brainwave analysis and so on,” recording this all for posterity, and making it available for use later on:

Whatever you do will be captured by the PMD for later playback and recall. Your PMD will remember every place you visit, every person you meet, every conversation you have, every object you look at, every movie you watch, every meal you eat, every page you read, every email you write, everything.

It sounds like Big Brother. But isn’t a lot of this already happening?

Our calls and email messages today leave traces that the government can use to spy on us. Search engines track what we do online. Our browsers track what we do as well. We post most of what goes on in our lives on social networking sites, while remaining clueless about how these sites protect — or don’t protect — our privacy. We post our location without thinking of the implications.

The technology behind a personal memory device isn’t all that far-fetched. It’s already possible to stream video from wherever you are. Dumped to a database, time-stamped and geo-located, that stream would provide much of the basis of such a system. Or, even closer to the vision of h+’s futuristic article, here is the Sensecam — a device, available today, that helps people with Alzheimer’s and other memory disorders to remember:

The concept was simple: using digital pictures and audio to archive an experience like a weekend visit from the grandchildren, creating a summary of the resulting content by picking crucial images, and reviewing them periodically to awaken and strengthen the memory of the event.

The hardware is a little black box called the Sensecam, which contains a digital camera and an accelerometer to measure movement…For the elderly, though, it could herald a new kind of relationship between mind and machine: even as plaque gets deposited on the brain, everyday experience is deposited on silicon, then retrieved.
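The pieces described above — time-stamped, geo-located captures dumped to a local database and retrieved later — can be sketched in a few lines. This is only an illustration of the idea, not any real PMD or Sensecam software; the class name, schema, and data are all invented:

```python
# Sketch of a "personal memory device" log: each captured moment becomes a
# timestamped, geo-located record in a local SQLite database, searchable
# later by keyword. All names and the schema are hypothetical.
import sqlite3
import time


class MemoryLog:
    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS moments ("
            " ts REAL, lat REAL, lon REAL, note TEXT)"
        )

    def capture(self, lat, lon, note, ts=None):
        # One "moment": where you were, when, and what happened.
        self.db.execute(
            "INSERT INTO moments VALUES (?, ?, ?, ?)",
            (ts if ts is not None else time.time(), lat, lon, note),
        )

    def recall(self, keyword):
        # "Look something up" in your own past: keyword search over every
        # recorded moment, in chronological order.
        rows = self.db.execute(
            "SELECT ts, lat, lon, note FROM moments"
            " WHERE note LIKE ? ORDER BY ts",
            (f"%{keyword}%",),
        )
        return rows.fetchall()


if __name__ == "__main__":
    log = MemoryLog()
    log.capture(40.73, -73.99, "coffee with Sam", ts=1.0)
    log.capture(40.74, -73.98, "grandchildren visit", ts=2.0)
    print(log.recall("grandchildren"))
```

The point of the sketch is how little is actually missing: capture hardware aside, the storage and recall side is commodity technology today.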

Admittedly, the personal memory device does not exist today, but we are close. Rather than some shocking sci-fi, maybe even dystopian, vision, I think this is actually predictable.

Isn’t this, in fact, the logical conclusion to a path humans have been on since the invention of writing? A line can be drawn from pre-literate humans, to writing, to the dawn of the computer age, to today — a line that finds humans almost compulsively drawn to recording our existence.

As Walter Ong has explained, before writing, humans existed in a culture of orality. Phrases such as “take a note” or “look something up” had no meaning, because they are visual metaphors, rooted in literacy and writing. Before humans could write things down, we thought differently, we spoke differently. Thinking “long” thoughts required formulaic speeches and mnemonic tricks. But these tricks only went so far. Writing, by fixing thoughts into somewhat permanent media, allowed humans to think in much more complex terms.

Fast-forward to the age of computers (machines that are integrally tied to the concept of memory), and we have a world in which it is not only easy to record our thoughts, but, increasingly, pieces of ourselves exist only within media that, by design, records our presence. Humans are inseparable from our machines, but today’s machines are machines of inscription, digital recording devices, and when we use them we leave a trail of bits and bytes for others to follow: “Our best machines are made of sunshine; they are all light and clean because they are nothing but signals…”

This is the central idea behind a paper I wrote as a grad student, where I described the group blog (Daily Kos, in particular) as a “space to create a collective memory, without which the blogger does not exist.” As we chat with our friends on Twitter, as we post pictures to Facebook, as we share our music preferences on Pandora, all these actions exist solely within the realm of the digital — there is no analog, biological counterpart to the “follow” on Twitter.

So we essentially have the technological basis today for the personal memory device; we already record what we do online. The only piece that is missing is a local database in which to record our digital selves. And we will get there — rather than this being a surprise, it is really just an expected step down a path humans started on a long time ago.

Google? Really?

Google now sells power.

Not sure what to say to that, other than to ask, “Where will they stop?”

I’ve always been amazed by Google’s success. Their stock price is currently at 540 bucks a share, so obviously others don’t feel the same way. They’ve been a Wall Street darling since day one, and it’s never really let up.

But I could never understand why. I mean, they sell ads. That’s nothing new. And they have a search engine. And they figured out that if you give your product away, you might be very popular.

All very simple things. (OK, obviously there’s some math and programming behind Internet search, but it’s really just tracking down links and assigning values and keeping a really fast index of everything. Good stuff, but not exactly rocket science. Well, maybe it’s kind of like rocket science. Whatever.)
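That “tracking down links and assigning values” is, roughly, the PageRank idea: a page matters if pages that matter link to it. A toy version fits in a screenful — the link graph here is made up for illustration, and Google’s real system (crawling, indexing, spam fighting, hundreds of other signals) is far more elaborate:

```python
# Toy PageRank: repeatedly let each page distribute its score across its
# outgoing links, with a damping factor for "random jumps" to any page.
# The graph and parameters are illustrative, not Google's actual system.

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its score evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank


if __name__ == "__main__":
    graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
    for page, score in sorted(pagerank(graph).items()):
        print(page, round(score, 3))
```

Simple to state, genuinely hard to run at web scale — which is maybe the real answer to the post’s question.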

But this is America! The greatest country in the world (TM)! No one can come up with something that can knock Google off its pedestal? No one can come up with some new way to search the net or, better yet, come up with something disruptive enough to make Google’s search irrelevant?

I mean, that’s what Google did to get on top, right? We all used “portals” from companies like Yahoo (and, pre-web, services like Compuserve and Prodigy), that directed us where they thought we would want to go. Google changed that by making search the first thing you did when you hit the web. And they’ve since entrenched themselves into the web’s fiber.

Now, they are a verb:

Pronunciation: \ˈgü-gəl\
Function: transitive verb

Usage: often capitalized
Etymology: Google, trademark for a search engine
Date: 2001
: to use the Google search engine to obtain information about (as a person) on the World Wide Web

Search, plus giving your products away, and it’s a hit. All funded through ad sales on the very pages you serve up on your products. It all seems so mundane. Yet 540 bucks a share.

Am I the only one surprised that Google’s still on top?

Virtual Memory paper

I’m happy to announce that a paper of mine was published, in The New School Psychology Bulletin, as part of a special issue on memory studies:


The following paper will explore the nature of memory in the digital age, proposing the blog as a model for a memory system. It will examine the blog’s position as both a medium and a social practice. Both are essential – without the medium, without the website itself, the blog’s community has no sense of place. Without conversational social relations, there is no basis for community. There is, in fact, an orality to blogging, an orality that recalls the manner in which non-literate cultures rely on speech for their existence. It is a form of speech, though, that is not ephemeral, but permanent and instantly retrievable, and, in this manner, the blog provides a space to create a collective memory, without which the blogger does not exist. This presents a new form of subjectivity, one rooted in bits and bytes, defined by a database, made accessible by a search engine. The blog becomes a technological prosthetic for its users: cyborg memory.

One idea in this paper I really like is the use of Walter Ong’s work on orality. While his notion of “secondary orality” is now somewhat in fashion, I think the more important concept here is the juxtaposition of oral cultures, where writing is non-existent, with today’s digital age, where nearly every bit of communication is inscribed and made permanent (or nearly so). For me, the “community blog” is the perfect model of communication today, both a media form and a social practice, permanent and retrievable.

As Ong wrote, “you know what you can recall,” and as everything we say and write online becomes part of the Internet’s vast machine, what we can recall becomes an infinitely large database with which we will need to contend.



Have I mentioned that DMZ was amazingly good?

Because it is.


Balloon boy, all a big hoax, it seems

Old Versus New Media: Food Writer Edition

Cook’s Illustrated editor Chris Kimball has thrown down the gauntlet!

If you haven’t been following this food blogger versus professional writer battle that’s been simmering, it started when Kimball wrote a fairly silly op-ed in the Times, bashing both the so-called amateurish writing of bloggers and the larger movement of participatory culture that is happening in all areas of media, where “regular people” have been given a voice through social media. When it comes to food writing, Kimball doesn’t seem too keen on this at all:

…in a click-or-die advertising marketplace, one ruled by a million instant pundits, where an anonymous Twitter comment might be seen to pack more resonance and useful content than an article that reflects a lifetime of experience, experts are not created from the top down but from the bottom up. They can no longer be coronated; their voices have to be deemed essential to the lives of their customers.

Bloggers have hit back; in particular, Adam Roberts, over at the Amateur Gourmet, has a great response:

The derision and condescension in this statement is baffling. Every food writer—from MFK Fisher to Ruth Reichl herself—started at the bottom and worked their way up. Kimball, at the end of his column, invokes Julia Child, a cook who didn’t start her food career until much later in life. If she’d had a blog documenting her time at Le Cordon Bleu (and maybe she would have, if she’d been born a few decades later), would Kimball complain that she hadn’t spilled enough blood in the kitchen yet? That “inexperience rarely leads to wisdom?”

It’s naïve to think that all food writing on the web is created equal, that the “million instant pundits” are all valued the same. The truth is that there are, indeed, an enormous number of food blogs out there, but it’s still a meritocracy: only the good ones gain traction. The most popular food blogs are popular because of their quality; in many ways, their content is better than much of what you’ll find in actual food magazines, including Kimball’s.

Kimball comes across here as elitist, an old guard fighting off the new. If he doesn’t read food blogs, he’s missing out on a diverse world of recipes and ideas and perspective on food. His notion of an “anonymous Twitter comment” is also strange — while we may not see each other on Twitter, the people I talk to there are hardly strangers. And yes, if someone I follow (and trust) on Twitter makes a recipe or restaurant recommendation, I’ll surely be paying attention.

In any case, perhaps looking to settle this (or cash in on the controversy, more likely!), Kimball has upped the ante, challenging any recipe found on a wiki to one of his from the Test Kitchen:

So, I am willing to put my money, and my reputation, where my big mouth is. I offer a challenge to any supporter of the WIKI or similar concept to jump in and go head to head with our test kitchen. We will jointly agree on a recipe, on the rules, on a time frame, etc. At the end, we will ask a panel of impartial judges to make and test the recipes and declare a winner.

It’s a fantastic idea, and should be lots of fun.

Let the games begin!!!