By the time we journalists all figure out Twitter, nobody will care anymore. And we’ll probably be out of a job by then, anyway.
I started using Twitter two months ago, on the good advice of a friend and new media journalist. My Twitter avatars (twavitars?) @michaelduck and @2riv_easton emerged online right around the time Twitter went mainstream, when venerable institutions like The New York Times and Sen. John McCain started to make this new-fangled “tweeting” technology sound about as cool as The Clapper. People cooler than I are already moving on to FriendFeed or Jaiku or whatever.
Before Twitter, I had arrived on Facebook shortly before it was declared to be the replacement for e-mail. And before that fad was MySpace, which I missed entirely. And along the way there was the YouTube craze, which fueled the now-semi-forgotten vlog revolution. And before that were podcasts, and the moment when everybody and their mother started blogging — around the same time that Wikipedia emerged. And before all that, everybody started using eBay. And before that, there was the LiveJournal explosion. And before that … well, it goes all the way back to the days of GeoCities.
Internet crazes flower and die with such regularity that we barely even notice anymore — we’re too busy hearing about or trying out the new thing. And by that time, last year’s fad has usually been bought up by Google or Microsoft or Rupert Murdoch and plowed back into the Internet’s collective subconscious, becoming a sort of electronic compost from which the new trends emerge.
In the heady days when YouTube was still revolutionary, some smart people at the newspaper where I work began to see the potential in Internet video. After months of meetings and cajoling, they persuaded the higher-ups to purchase and distribute hand-held digital video cameras throughout our newsrooms and start training reporters and photographers to create Web videos.
We’re still creating our own Web videos — most of which play only in our company-approved online video player, which is (naturally) still incompatible with YouTube.
On Friday, most of our most popular videos — all of which took hours to produce — were watched by fewer than 200 people each.
For decades in the newspaper industry, it’s paid off to take lots of time and plan new product launches thoroughly. A new magazine or a weekend section, for example, would go through months of meetings and development sessions, culminating in the painstaking creation of printed mockups to circulate among all the necessary newsroom honchos.
This is the world in which most newspaper editors have spent most of their careers. It’s a world that recognizes that, in an atmosphere usually charged with last-second deadline rushes, a carefully planned product launch is a rare and valuable opportunity.
It’s also a world in which it took years for most papers to realize that printing A1 in color might be a good idea.
Sprinkled throughout these same newsrooms are a smattering of early adopters — visionaries or would-be visionaries — who know and love cutting-edge technology as it emerges, before it hits the mainstream. The ones who are influential enough catch the attention of the higher-ups, who then carefully weigh the ideas, exchange memos, and plan employee training sessions.
This is why newspapers eventually got around to discovering Twitter just when it’s at what’s likely to be remembered as the high point of its popularity. Soon enough, most of us will get through all the meetings and memos; some of us can probably expect to see Twitter-related requirements on our job performance evaluations soon. By then, we’ll all have figured out how to use Twitter — just in time to watch it be made obsolete by the new trendy online gimcrack.
We are a lumbering brachiosaurus chasing after a butterfly.
But what would we do with the butterfly, even if we could catch it? For all the wonderment that is Web 2.0, nobody — I repeat, nobody — has figured out how to turn it into a sustainable business.
Large parts of the Web tech industry still function on the model that fueled the dot-com bubble all those years ago. The big money in Web development comes not from the profit you make by operating your YouTube or your MySpace, but from the profit you make when you sell the whole darn thing you’ve created to a big company with deep pockets.
The very substance of Web 2.0 — whether in the form of YouTube, Wikipedia, Facebook, or Twitter — is the thoughts and ideas of you and millions of other Internet users like you. Putting you in touch with everyone else’s ideas has been revolutionary, but it hasn’t been profitable. Facebook still isn’t making money; Twitter doesn’t even really have a business strategy; Wikipedia has given up and relies on pledge drives; and YouTube looks like it’s in even worse financial shape than newspapers.
So we’re lurching after Web technologies that not only have ridiculously short half-lives but also have minimal potential for making us money. This is not a good plan for saving an industry that’s already starting to collapse.
Just a few weeks ago (though it seems like an eternity in Internet time), futurist Web guru Clay Shirky and several of his like-minded cohorts declared that newspapers are pretty much dead — casualties of a mass media revolution that’s not yet finished. That’s an extreme position, but there’s some truth to it. At the very least, newspapers certainly will die if they can’t evolve in a hurry.
As we shift more toward the Web, part of that evolution will mean abandoning the vestigial newsroom habits and management structures that grew up around once-a-day (or once-a-week!) publication. We don’t have time anymore to weigh every last one of our options before exploring new technologies.
But we also have to be smart enough to keep the strongest parts of what we already have. Along with our ability to report and investigate, our newspaper brands are exceptionally valuable — lending a credibility to our online product that grows out of the trust so many readers have bestowed for so long on our print products. We need to retain just enough of that old-style prudence to make sure we venture onto the Web deliberately and without polluting our brands with poorly conceived online content. That means not merely showing reporters how to use Facebook or Twitter, but training us more broadly in how to interact with readers and present ourselves in a digital age.
We need to finally recognize that Web technology fads are no more permanent than spring flowers. Instead of moving toward a newsroom in which everyone’s on Twitter, we need to think more broadly about how the tech du jour can inform and enhance our work. Web video, for example, will never make sense for every story, or maybe even for every journalist — but thinking in terms of video opens new possibilities for how to tell certain stories. By prompting us to focus on visual and auditory details, thinking in video can help us be better writers even when nobody happens to watch the final product.
From this year’s Twitter phenomenon, we should be learning that readers want to do more than consume stories online — they want to interact with them and with their writers. They’re eager to put a face and a personality with our bylines and finally get a glimpse of how we do our work; and the Web gives us more tools every day to help us and them do exactly that.