Sadly, Delicious, one of the best services out there on the net, is shutting down.
Unlike many other cloud services, Delicious allows (allowed…) you to export your data. I regularly saved my bookmarks, tags, and notes (and imported them into Quicksilver, so I had instant access…). In this respect, Delicious was really just a hosting service for my bookmarks, so having it disappear isn't the worst thing that could happen.
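For anyone doing the same before the shutdown: the export is just an HTML file in the old Netscape bookmark format, which is easy to parse with nothing but the standard library. A minimal sketch in Python, assuming that format (the TAGS attribute and the sample entry are illustrative, not taken from a real export):

```python
# Minimal sketch: pull (url, tags, title) out of a Delicious-style
# bookmark export, which uses the old Netscape bookmark HTML format.
# The TAGS attribute and sample entry below are assumptions for
# illustration, not copied from an actual export file.
from html.parser import HTMLParser

class BookmarkParser(HTMLParser):
    """Collects bookmark dicts from <A> tags in the export file."""
    def __init__(self):
        super().__init__()
        self.bookmarks = []
        self._current = None

    def handle_starttag(self, tag, attrs):
        # html.parser lowercases tag and attribute names for us
        if tag == "a":
            d = dict(attrs)
            self._current = {
                "url": d.get("href", ""),
                "tags": d.get("tags", "").split(",") if d.get("tags") else [],
                "title": "",
            }

    def handle_data(self, data):
        if self._current is not None:
            self._current["title"] += data

    def handle_endtag(self, tag):
        if tag == "a" and self._current is not None:
            self.bookmarks.append(self._current)
            self._current = None

sample = '<DT><A HREF="http://example.com" TAGS="web,demo">Example</A>'
parser = BookmarkParser()
parser.feed(sample)
print(parser.bookmarks)
```

From there, the parsed list can be written out as JSON, loaded into a local database, or handed to whatever replacement service you end up on.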
Sadly, though, the "social" part of Delicious's social bookmarking service is what I will really miss. I used a feed to follow all the links that the people in my "network" on Delicious were saving — this made it very easy to see what other people were interested in, and what they found across the web. In a sense, it was a curated, serendipitous flow of information from folks I knew and trusted; so many of my bookmarks actually came from that feed.
Pinboard seems to have a similar service, but as everyone on Delicious is now scattering to other services, it is going to take some time to replace that feature.
RIP, Delicious. It was a good run…
[Update] I should have added that Pinboard.in has so far been really excellent. Most importantly, all my Delicious bookmarks uploaded perfectly. It took a while, as their servers are getting slammed by the mass exodus from Delicious. They said on Twitter last night: "we have added just short of one million bookmarks today."
By now, surely everyone has heard about, and seen, the Iraq video from Wikileaks. Of particular interest from the NYT article on the story is this section:
By releasing such a graphic video, which a media organization had tried in vain to get through traditional channels, WikiLeaks has inserted itself in the national discussion about the role of journalism in the digital age. Where judges and plaintiffs could once stop or delay publication with a court order, WikiLeaks exists in a digital sphere in which information becomes instantly available.
“The most significant thing about the release of the Baghdad video is that several million more people are on the same page,” with knowledge of WikiLeaks, said Lisa Lynch, an assistant professor of journalism at Concordia University in Montreal, who recently published a paper about the site. “It is amazing that outside of the conventional channels of information something like this can happen.”
A big part of the Wikileaks story is that they were able to decrypt the video, apparently using some borrowed time on a high-powered computer system (one would be needed to break strong encryption; obviously, it could turn out the encryption used on this video was very weak). That's not something CNN would likely be willing to do, let alone have the resources to spend on such an effort. But it's the perfect sort of task for a site that falls outside the field of journalism — or, perhaps more precisely, a site that is redrawing its boundaries.
The question of just “what is journalism” has dogged journalists and academics since the rise of bloggers and “citizen journalists.” Wikileaks, though, seems to be a different sort of thing. Wikileaks isn’t a bunch of intrepid bloggers running through a FOIA data dump, and it’s certainly operating outside of what we’ve come to recognize as traditional journalism. The founder of Wikileaks, in fact, compares what the site does to something more like the CIA:
“That’s arguably what spy agencies do — high-tech investigative journalism,” Julian Assange, one of the site’s founders, said in an interview on Tuesday. “It’s time that the media upgraded its capabilities along those lines.”
One byproduct of the digitization of nearly everything these days is that information wants to be free. It should not be surprising that a video such as the one from Iraq, which is digitally captured and stored, could find its way in front of our eyes, just as the photographs from Abu Ghraib made their way onto our screens.
And it should not be surprising that a site such as Wikileaks would not only find a place in this new digitally-enhanced ecosystem of journalism and politics, but even thrive.
Howard Kurtz's column today contains a stunning admission — journalists really don't do journalism:
In the end, [healthcare reform] may simply have been too dense for the media to fully digest. If you’re a high-information person who routinely plows through 2,000-word newspaper articles, you had a reasonably good grasp of the arguments. For a busy electrician who plugs in and out of the news, the jousting and the jargon may have seemed bewildering.
Once the law takes effect — its provisions stretched out over years — perhaps journalists can help separate rhetoric from reality. That is, if we don’t lose interest and move to the next hot controversy.
Kurtz tries to save his colleagues by attempting to point out where they actually provided value:
One stellar moment for the press was the refusal to perpetuate the myth of “death panels.” After Sarah Palin floated the idea that government commissions would decide which ailing patients deserved to be saved, journalists at The Washington Post, New York Times, CNN and ABC News, among others, said flatly that this was untrue.
But even this is a stretch. Oh, it’s true in the eyes of the news media, but that is because they have a warped sense of “journalism.”
In our corporate-funded news media system, providing "objective" news means bringing the two sides of a debate into a room and letting them have their say. In our media system, it is never the journalist's job to actually state that something is not true.
Take, for example, the above-mentioned healthcare debate. Here is the transcript from an August 2009 CNN news show, at the height of the Tea Party protests. Anchor John Roberts is “fact-checking” the claims of the Republicans:
ROBERTS: Well, here again tonight to help us fact-check some of what we’re hearing is Bill Adair. He’s the editor of PolitiFact.com, which earned a Pulitzer Prize for its investigation of hundreds of political claims during the 2008 campaign.
Bill, it’s great to see you tonight. You heard the president’s response to this idea of death panels. Sarah Palin has a new posting on her Facebook page where she claims it’s the president who’s wrong. Here’s what she says.
“With all due respect, it’s misleading for the president to describe this section as an entirely voluntary provision that simply increases the information offered to Medicare recipients.”
So, what do the Truth-O-Meter say about all this bill? Is the former governor correct or is she incorrect?
BILL ADAIR, EDITOR, POLITIFACT.COM: She is incorrect. We gave that a false on our Truth-O-Meter on PolitiFact.com.
Really when you look at the bill, when you look at the language, it is voluntary. There is nothing in the bill that says that it’s mandatory. There’s nothing that backs up this claim. Now, Palin makes the point, well, perhaps seniors could feel pressured to take this care. And perhaps that’s possible.
But as the language is written now, as we have discussed it with experts, it’s just not true to say that it’s not voluntary. It is voluntary. It’s an optional thing. So, she gets a false on our Truth-O-Meter.
ROBERTS: False on the mandatory death panel. All right, Bill, cleared that one up.
A CNN anchor would never come out and say something is not true; he relies on someone from "PolitiFact.com" to say it. He needs the Truth-O-Meter, instead of just speaking the truth.
More importantly, I am sure it would be easy enough to find a campaign contribution made by Bill Adair, or some other statement he has made, to discredit him with an accusation of "bias." And this applies not just to Bill Adair, but to the hundreds of other pundits and interviewees who are used in the same way in similar stories all the time. This is why our news system does not work.
Until journalists start doing their job, until they stop relying on “experts” to provide the requisite two-sides-to-every-story, our politics will never be served by their existence.
h+ Magazine postulates a seemingly distant future where we all have "personal memory devices" that record, index, and make available every second of our experiential lives:
In the near future, someone will decide to record every moment of a human life from birth to death in digital storage…It will mark the era of personal memory offloading, an adaptive memory technology that records and indexes every single moment of your life. Offloading personal memory begins with a personal memory device, or a PMD. The basic PMD would be no more complex than a small video and sound recorder that captures your every experience. A PMD could be easily fitted shortly after birth; the least invasive option would be like a Bluetooth headset worn over the ear connected wirelessly to a local device no larger than a cell phone. Once installed, the PMD would capture and upload all first-person memories to a centralized database for indexing, search, and recall.
To be clear, the author isn’t talking about embedding chips in our brains, or recording our innermost thoughts and feelings. The idea here is more like a video recorder that’s running all the time, like a playback of your avatar in SecondLife. It would capture “GPS, Google Maps, facial recognition, speech/text recognition, brainwave analysis and so on,” recording this all for posterity, and making it available for use later on:
Whatever you do will be captured by the PMD for later playback and recall. Your PMD will remember every place you visit, every person you meet, every conversation you have, every object you look at, every movie you watch, every meal you eat, every page you read, every email you write, everything.
It sounds like Big Brother. But isn’t a lot of this already happening?
Our calls and email messages today leave traces that the government can use to spy on us. Search engines track what we do online. Our browsers track what we do as well. We post most of what goes on in our lives on social networking sites, while remaining clueless about how these sites protect — or don't protect — our privacy. We post our location without thinking of the implications.
The technology behind a personal memory device isn't all that far-fetched. It's already possible to stream video from wherever you are. That stream, dumped to a database, time-stamped and geo-located, would provide much of the basis of such a system. Or, even closer to the vision of h+'s futuristic article, there is the Sensecam — a device, available today, that helps people with Alzheimer's and other memory disorders to remember:
The concept was simple: using digital pictures and audio to archive an experience like a weekend visit from the grandchildren, creating a summary of the resulting content by picking crucial images, and reviewing them periodically to awaken and strengthen the memory of the event.
The hardware is a little black box called the Sensecam, which contains a digital camera and an accelerometer to measure movement…For the elderly, though, it could herald a new kind of relationship between mind and machine: even as plaque gets deposited on the brain, everyday experience is deposited on silicon, then retrieved.
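The time-stamped, geo-located store that would sit underneath all this is trivial to sketch with today's tools. A minimal illustration in Python using SQLite — the table schema, column names, and the sample "moment" are my own assumptions for the sketch, not anything from the article:

```python
# Sketch of the local store a personal memory device implies:
# each captured "memory" is time-stamped and geo-located, with a
# pointer to the media clip. Schema and sample data are invented
# for illustration only.
import sqlite3
import time

conn = sqlite3.connect(":memory:")  # a local file in practice

conn.execute("""
    CREATE TABLE memories (
        ts    REAL,  -- Unix timestamp of capture
        lat   REAL,  -- latitude
        lon   REAL,  -- longitude
        media TEXT,  -- pointer to the captured video/audio clip
        note  TEXT   -- recognized speech/text, if any
    )
""")

def record_moment(lat, lon, media, note=""):
    """Append one captured moment to the store."""
    conn.execute(
        "INSERT INTO memories VALUES (?, ?, ?, ?, ?)",
        (time.time(), lat, lon, media, note),
    )

# "Recall": everything captured within a band of latitudes.
record_moment(40.7128, -74.0060, "clip-0001.mp4", "lunch with friends")
rows = conn.execute(
    "SELECT media, note FROM memories WHERE lat BETWEEN ? AND ?",
    (40.0, 41.0),
).fetchall()
print(rows)
```

The interesting engineering problems are all in the capture and indexing (facial recognition, speech-to-text), not in the storage, which is already a solved problem.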
Admittedly, the personal memory device still does not exist today, but we are close. Rather than this being some shocking sci-fi, maybe even dystopian, vision, I think it is actually predictable.
Isn't this, in fact, the logical conclusion to a path humans have been on since the invention of writing? A line can be drawn from pre-literate humans, to writing, to the dawn of the computer age, to today; it is a line that finds humans almost compulsively drawn to recording our existence.
As Walter Ong has explained, before writing, humans existed in a culture of orality. Phrases such as “take a note” or “look something up” had no meaning, because they are visual metaphors, rooted in literacy and writing. Before humans could write things down, we thought differently, we spoke differently. Thinking “long” thoughts required formulaic speeches and mnemonic tricks. But these tricks only went so far. Writing, by fixing thoughts into somewhat permanent media, allowed humans to think in much more complex terms.
Fast-forward to the age of computers (machines that are integrally tied to the concept of memory), and we have a world in which it is not only easy to record our thoughts, but, increasingly, pieces of ourselves exist only within media that, by design, record our presence. Humans are inseparable from our machines, and today's machines are machines of inscription, digital recording devices; when we use them we leave a trail of bits and bytes for others to follow: "Our best machines are made of sunshine; they are all light and clean because they are nothing but signals…"
This is the central idea behind a paper I wrote as a grad student, where I described the group blog (Daily Kos, in particular) as a “space to create a collective memory, without which the blogger does not exist.” As we chat with our friends on Twitter, as we post pictures to Facebook, as we share our music preferences on Pandora, all these actions exist solely within the realm of the digital — there is no analog, biological counterpart to the “follow” on Twitter.
So we essentially have the technological basis today for the personal memory device; we already record what we do online. The only piece that is missing is a local database in which to record our digital selves. And we will get there — rather than this being a surprise, it is really just an expected step down a path humans started on a long time ago.
So last night was the big premiere of the final season of LOST. I watched it.
And I am annoyed.
As a fan of the show, I’m used to not knowing what the hell is going on all the time. But…this is the final season. And we’re supposed to get some answers.
Not only that, but the producers of the show told us we were going to get answers:
“Normally the thing that you have to execute is coming up with fulfilling endings and resolve the fate of your characters,” said Damon Lindelof, an executive producer who, with Carlton Cuse, oversees the series and is writing the key episodes for the coming season. “But we also have the added weight of how are we going to resolve this mythology.
“The show is so predicated on questions. So now we’re in answer mode, and have been for quite some time.”
…Viewers will not have to wait until the last moments of the series finale to get many of their answers, they say. Beginning with the season premiere, revelations about some of the most fundamental mysteries will come fast and furious.
Amazon is adding apps to the Kindle.
I’m not sure why I’d want to run an app on my book, but there it is.
Cook’s Illustrated editor Chris Kimball has thrown down the gauntlet!
If you haven't been following this food-blogger-versus-professional-writer battle that's been simmering, it started when Kimball wrote a fairly silly op-ed in the Times, bashing both the so-called amateurish writing of bloggers and the larger movement of participatory culture that is happening in all areas of media, where "regular people" have been given a voice through social media. When it comes to food writing, Kimball doesn't seem too keen on this at all:
…in a click-or-die advertising marketplace, one ruled by a million instant pundits, where an anonymous Twitter comment might be seen to pack more resonance and useful content than an article that reflects a lifetime of experience, experts are not created from the top down but from the bottom up. They can no longer be coronated; their voices have to be deemed essential to the lives of their customers.
Bloggers have hit back; in particular, Adam Roberts, over at the Amateur Gourmet, has a great response:
The derision and condescension in this statement is baffling. Every food writer—from MFK Fisher to Ruth Reichl herself—started at the bottom and worked their way up. Kimball, at the end of his column, invokes Julia Child, a cook who didn’t start her food career until much later in life. If she’d had a blog documenting her time at Le Cordon Bleu (and maybe she would have, if she’d been born a few decades later), would Kimball complain that she hadn’t spilled enough blood in the kitchen yet? That “inexperience rarely leads to wisdom?”
It’s naïve to think that all food writing on the web is created equal, that the “million instant pundits” are all valued the same. The truth is that there are, indeed, an enormous number of food blogs out there, but it’s still a meritocracy: only the good ones gain traction. The most popular food blogs are popular because of their quality; in many ways, their content is better than much of what you’ll find in actual food magazines, including Kimball’s.
Kimball comes across here as elitist, an old guard fighting off the new. If he doesn’t read food blogs, he’s missing out on a diverse world of recipes and ideas and perspective on food. His notion of an “anonymous Twitter comment” is also strange — while we may not see each other on Twitter, the people I talk to there are hardly strangers. And yes, if someone I follow (and trust) on Twitter makes a recipe or restaurant recommendation, I’ll surely be paying attention.
In any case, perhaps looking to settle this (or, more likely, to cash in on the controversy!), Kimball has upped the ante, challenging any wiki-sourced recipe to go head to head with one of his from the Test Kitchen:
So, I am willing to put my money, and my reputation, where my big mouth is. I offer a challenge to any supporter of the WIKI or similar concept to jump in and go head to head with our test kitchen. We will jointly agree on a recipe, on the rules, on a time frame, etc. At the end, we will ask a panel of impartial judges to make and test the recipes and declare a winner.
It’s a fantastic idea, and should be lots of fun.
Let the games begin!!!