
Archive for January 23rd, 2008

Wednesday, February 28, 2007

Corporations, networks … what next?

About 150 years ago, corporations began using auditors, as required by law and under the control of the state. It was (and to some extent still is) a world of separation: access to business information and knowledge was, and remains, a privilege of auditors and consultants. Audit and consulting emerged to assist corporations in the business war for profit, markets, and customers; in contrast, informal networks also emerged to understand what the business world is doing, as we can see in blogging and other social media. This second world obeys essentially one law – copyright – and sits “above” state borders. It watches business as usual and tests new business models. In fact, we can see two models:

The known model is: corporation – auditor/consultant – public.

What’s the networked model?

Let’s rename the parts of the model above: corporation (persons) – auditor/consultant (person) – public (persons). In effect we get a “person – person” model that sits “above” state borders and laws, where personal knowledge matters – Knowledge Persons.

Why are these models different, if they look the same and are merely written in two languages?

I think it’s because corporations mostly deal with tangible things. The audit standards confirm this: they describe a space of tangibles, where things can be lost and therefore must be secured. Information and knowledge are treated as “things” here too.

Because networks are not built like corporations, they are open – they welcome people to join and benefit from one-level collaboration, with direct access to information and knowledge flows. Their “person – person” model can include idea creation, design, production, sales, explanation, and consumption, all according to a public contract (a network charter). Audit and consulting can be included too, measuring and correcting the knowledge that is the building block of sociobusiness networking.

So, as we can see, a symbiosis of corporations and informal networks is possible. For example, some corporations can act as production units within a network.

Can corporations acquire an open network? Oddly, openness itself is an obstacle, because a corporation cannot be as open as a network.

The next entity, I suppose, will be audit/consulting for networking. It will again mediate interaction between persons, but differently. If we network, we should audit our understanding of each interaction, accept or reject it, and correct it, in order to build our global sociobusiness life, or netliving.

I think auditing/consulting for networks will be a function of their openness, but I don’t know where the states fit in.

Read Full Post »

CBS News

  

Venture Capital Climbs In “Web 2.0” Burst

$25B Reported Invested In 2006; $4B Spent On Internet Startups, Biggest Stake Since Dot-Com Bust



(AP) Venture capitalists invested $25.5 billion in 2006, marking the industry’s biggest burst of dealmaking since the dot-com bust clogged the financial spigot for entrepreneurs five years ago.

A renewed interest in Internet startups, combined with expanding opportunities in the health care and alternative energy markets, spurred a 12 percent increase from the $22.8 billion invested in 2005, according to figures jointly released Tuesday by PricewaterhouseCoopers, Thomson Financial and the National Venture Capital Association.

Last year’s activity, spread across 3,416 deals, generated the highest level of investment since venture capitalists forked out $40.7 billion in 2001, the end of a manic era driven by a lemming-like pursuit of dot-com riches.

After hundreds of their Internet bets flopped, venture capitalists recoiled in despair through 2002 and 2003.

Last year, venture capitalists poured $4 billion into Internet startups, a 25 percent increase from $3.2 billion in 2005. It was the industry’s largest commitment to the Internet since 2001, when the high-tech financiers pumped $10.2 billion into the sector.

Venture capitalists also upped the ante substantially in biotechnology, which received $4.5 billion last year, up by 17 percent from 2005.

The most robust growth occurred in the industrial and energy category, where venture capital investments more than doubled to $1.8 billion. About 40 percent of that money was earmarked for alternative energy projects.

Now that venture capital’s investment volume has increased in each of the last three years, the chances of creating another bubble are rising, too, particularly since the industry has raised a total of $56 billion in the past two years.

So far, though, venture capitalists have been proceeding at a moderate pace of growth that suggests they may have learned from their past mistakes.

“It’s not crazy out there right now. We are just in this kind of steady state,” said Rob Shaplinsky, founding partner of Bridgescale, a venture capital firm in Menlo Park.

Over the past three years, the industry has invested an average of $5.9 billion per quarter, compared with a $16.7 billion quarterly average from 1999 through 2001.
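As a quick sanity check on the figures quoted above (an editorial illustration, not part of the AP report), the sketch below recomputes the year-over-year growth rates and the implied three-year totals from the dollar amounts in the article. The variable names are mine.

```python
# Recompute the growth rates quoted in the article from its own dollar figures
# (amounts in $ billions); illustration only, nothing beyond what is quoted above.
total_vc = {2005: 22.8, 2006: 25.5}      # total venture investment
internet_vc = {2005: 3.2, 2006: 4.0}     # investment in Internet startups

def yoy_growth_pct(series, year):
    """Year-over-year percentage change for `year` versus the prior year."""
    prev = series[year - 1]
    return (series[year] - prev) / prev * 100

print(f"Total VC, 2005 -> 2006: {yoy_growth_pct(total_vc, 2006):.0f}% growth")        # ~12%
print(f"Internet VC, 2005 -> 2006: {yoy_growth_pct(internet_vc, 2006):.0f}% growth")  # 25%

# The quarterly averages cited for each three-year era imply these totals (12 quarters each):
print(f"2004-2006 implied total: ${12 * 5.9:.1f}B; 1999-2001 implied total: ${12 * 16.7:.1f}B")
```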

The final three months of 2006 provided another example of venture capitalists’ restraint, with fourth-quarter investments totaling $5.7 billion, unchanged from the previous year.

“We are pleased that, to date, quarterly investment levels have remained prudent and no major over-funding has occurred,” said Mark Heesen, president of the National Venture Capital Association.

Venture capitalists have had a strong incentive to be more careful with their money this time around because it’s taking longer for them to cash out of their investments.

During the financial frenzy of the dot-com boom, many startups generated huge paydays for venture capitalists by completing initial public offerings of stock less than three years from their inception. Today, startups are usually five to seven years old before they are making enough money to attract a buyout offer from a larger company or assemble an IPO that would pique the interest of more discriminating investors.

Despite the greater caution, some red flags are being raised in trendy areas like “Web 2.0” — a catchall phrase for the Internet craze devoted to social networking and the sharing of content largely contributed by members of a Web site’s audience.

“‘Web 2.0’ has become a buzzword and it always scares me when an entrepreneur comes in with a pitch and starts spouting buzzwords,” said Tim Draper, founder and managing director of Draper Fisher Jurvetson, a venture capital firm in Menlo Park.

With dozens of sites vying to strike it rich like YouTube Inc. did in its recent $1.76 billion sale to Google Inc., online video looks particularly ripe for a shakeout.

“You can still hit it big there, but the percentage (of startups) that will is going to be very, very small,” predicted Mike Carusi, general partner with Advanced Technology Ventures in Palo Alto.

Even if the Web 2.0 craze crashes, the financial damage should be minimal because none of the big names in the sector have gone public yet, said Josh Grove, a senior research analyst for Dow Jones VentureOne, a venture capital research firm.



Read Full Post »

Web 2.0 at the Super Bowl

Web 2.0 at the Super Bowl
And 2007 won’t be like 1984.
by Andrew Keen
02/02/2007 12:00:00 AM

IT’S AMATEUR HOUR at the Super Bowl this year. On Sunday, 90 million television viewers on CBS will be subjected to commercials made by “You”–Time magazine’s Person of the Year for 2006. Three Super Bowl XLI advertisers–Doritos, the National Football League, and Chevrolet–will all be running 30-second commercial spots made by amateurs. The Web 2.0 revolution in user-generated content has infiltrated the American living room. These amateur creators, whom Time praises as “people formerly known as consumers,” are now providing the entertainment at the biggest event in the media calendar.
This is not good news. The shift from professionally produced to user-generated advertising makes us poorer in both economic and cultural terms. The arrival of user-created commercials at Super Bowl XLI represents the American Idolization of traditional entertainment–the degeneration of professional content into a “talent show” for amateurs.
We, the conventional television audience, are certainly losers in this new fashion for user-generated advertisements. We have traditionally watched Super Bowl commercials to be entertained by memorable ads. Often, these commercials are more memorable than the game. Occasionally, they even represent significant cultural moments in American history. Few of us, for example, can remember who won the Super Bowl in 1984 (Los Angeles Raiders 38, Washington Redskins 9), where it was played (Tampa), or who sang the national anthem (Barry Manilow). But most of us can remember the Chiat/Day-produced, Ridley Scott-directed commercial for the Macintosh computer, with its Orwellian subtext and its indelible explanation of why “1984 wasn’t going to be like 1984”.
Don’t expect a repeat of Chiat/Day and Ridley Scott’s creative genius during Super Bowl XLI. Doritos are already previewing the five finalists in their competition on the Yahoo! website. One commercial features a chip-chomping rock climber falling off a mountain; another has a giant mouse bursting out of a wall, scavenging for cheese-flavored chips; a third has a young woman falling over because she’s looking at her chips and not the road. All five of the finalists contain the same predictable, dorm-room aesthetic, low production qualities, and poor acting. The brain trust at Doritos deserves thanks for not exposing us to the other 1,100 entrants.
WHY IS THE WORK of the amateur of a lesser quality than professionally made content? There’s the intrinsic talent of a lifelong professional, such as Ridley Scott, of course. Then there’s the financial resources made available to the professional content creator. Back in 1984, Apple paid Chiat/Day $1.6 million to produce their Mac ad. Today, according to the American Association of Advertising Agencies, the average professionally-produced 30-second spot costs $381,000. In contrast, wedding photographer Jarod Cicon, one of the five finalists in the Doritos competition, estimates that his 30-second ad cost $150 to produce.
Web 2.0 advocates, who are apologists for user-generated content (such as Chris Anderson, the author of the best-selling book The Long Tail), promise that the amateurs of the new digital democracy can create the same quality content for a tiny proportion of the traditional cost. But this simply isn’t true. Watch the Doritos commercials side-by-side with some classic Super Bowl commercials, such as the Budweiser “Frogs” (1995) or “Cedric” (2001) spots. It’s like tasting a homemade elderberry wine after a glass of the best Cabernet.
THE ECONOMICS of amateur hour at the Super Bowl are disturbing. If today’s typical commercial costs $381,000 and an amateur advertisement costs $150 to produce, then what happens to the money which isn’t spent on the creative? Given that Doritos are awarding $10,000 to the five finalists in their talent show, that still leaves some $331,000 on the table. To use a fashionable Web 2.0 term, the professional creator is being “disintermediated.” CBS doesn’t lose anything because they still charge Doritos over $2.5 million for the 30 second spot. Instead, it’s the professional creator–the scriptwriter, cameraman, audio expert–who is being squeezed out of the economy by this infestation of amateur content.
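To make the arithmetic in the preceding paragraph explicit, here is a minimal sketch (my illustration, not part of Keen’s piece) using the figures he cites; the variable names are hypothetical.

```python
# The money "left on the table" when an amateur contest replaces a professional shoot,
# using the figures quoted above; illustration only.
avg_professional_spot = 381_000   # average cost of a professionally produced 30-second ad
prize_per_finalist = 10_000       # Doritos award per finalist
finalists = 5
amateur_production_cost = 150     # Jarod Cicon's estimated cost for his entry
cbs_airtime_fee = 2_500_000       # what CBS charges for the 30-second slot (unchanged)

left_on_table = avg_professional_spot - prize_per_finalist * finalists
print(f"Creative budget not paid out: ${left_on_table:,}")        # $331,000
print(f"Amateur production cost: ${amateur_production_cost:,}")   # $150
print(f"CBS airtime fee: ${cbs_airtime_fee:,}")                   # $2,500,000
```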
Markets are markets, and there’s no reason to cry simply over the loss of jobs in one sector, so long as new efficiencies are being created. But in this instance, the loss of jobs is accompanied by worse, not better, products. This is true across the media industry and not just in the advertising business.
As Columbia University economics professor Jagdish Bhagwati has argued, digital technology is undermining the wages of the American middle class. Web 2.0 technologies that enable amateurs to make dumbed-down replicas of professional work are particularly responsible for what Bhagwati calls the “tsunami” of downward pressure on wages created by new technology.
Amateur content on user-generated video sites such as Google’s YouTube is undermining the value of professionally-made video content. American Idol now has an online competition called “American Idol Underground,” which is making the traditional music A&R person redundant. HarperCollins is undermining the traditional role of literary agents by running online competitions to “discover” amateur writers. The result of all this democratization of media is fewer creative jobs and more amateurish books, movies, and music. And commercials, too.
Andrew Keen is a veteran Silicon Valley entrepreneur and digital media critic. His book, THE CULT OF THE AMATEUR: How the democratization of the digital world is assaulting our economy, our culture, and our values, will be published by Currency in June. He blogs at TheGreatSeduction.com and has recently launched aftertv.com, a podcast chat show about media, culture, and technology.

Read Full Post »

The Weekly Standard 

Web 2.0
The second generation of the Internet has arrived. It’s worse than you think.
by Andrew Keen
02/15/2006 12:00:00 AM

THE ANCIENTS were good at resisting seduction. Odysseus fought the seductive song of the Sirens by having his men tie him to the mast of his ship as it sailed past the Siren’s Isle. Socrates was so intent on protecting citizens from the seductive opinions of artists and writers, that he outlawed them from his imaginary republic.

We moderns are less nimble at resisting great seductions, particularly those utopian visions that promise grand political or cultural salvation. From the French and Russian revolutions to the counter-cultural upheavals of the ’60s and the digital revolution of the ’90s, we have been seduced, time after time and text after text, by the vision of a political or economic utopia.

Rather than Paris, Moscow, or Berkeley, the grand utopian movement of our contemporary age is headquartered in Silicon Valley, whose great seduction is actually a fusion of two historical movements: the counter-cultural utopianism of the ’60s and the techno-economic utopianism of the ’90s. Here in Silicon Valley, this seduction has announced itself to the world as the “Web 2.0” movement.

LAST WEEK, I was treated to lunch at a fashionable Japanese restaurant in Palo Alto by a serial Silicon Valley entrepreneur who, back in the dot.com boom, had invested in my start-up Audiocafe.com. The entrepreneur, like me a Silicon Valley veteran, was pitching me his latest start-up: a technology platform that creates easy-to-use software tools for online communities to publish weblogs, digital movies, and music. It is technology that enables anyone with a computer to become an author, a film director, or a musician. This Web 2.0 dream is Socrates’s nightmare: technology that arms every citizen with the means to be an opinionated artist or writer.

“This is historic,” my friend promised me. “We are enabling Internet users to author their own content. Think of it as empowering citizen media. We can help smash the elitism of the Hollywood studios and the big record labels. Our technology platform will radically democratize culture, build authentic community, create citizen media.” Welcome to Web 2.0.

Buzzwords from the old dot.com era–like “cool,” “eyeballs,” or “burn-rate”–have been replaced in Web 2.0 by language which is simultaneously more militant and absurd: Empowering citizen media, radically democratize, smash elitism, content redistribution, authentic community . . . . This sociological jargon, once the preserve of the hippie counterculture, has now become the lexicon of new media capitalism.

Yet this entrepreneur owns a $4 million house a few blocks from Steve Jobs’s house. He vacations in the South Pacific. His children attend the most exclusive private academy on the peninsula. But for all of this he sounds more like a cultural Marxist–a disciple of Gramsci or Herbert Marcuse–than a capitalist with an MBA from Stanford.

In his mind, “big media”–the Hollywood studios, the major record labels and international publishing houses–really did represent the enemy. The promised land was user-generated online content. In Marxist terms, the traditional media had become the exploitative “bourgeoisie,” and citizen media, those heroic bloggers and podcasters, were the “proletariat.”

This outlook is typical of the Web 2.0 movement, which fuses ’60s radicalism with the utopian eschatology of digital technology. The ideological outcome may be trouble for all of us.

SO WHAT, exactly, is the Web 2.0 movement? As an ideology, it is based upon a series of ethical assumptions about media, culture, and technology. It worships the creative amateur: the self-taught filmmaker, the dorm-room musician, the unpublished writer. It suggests that everyone–even the most poorly educated and inarticulate amongst us–can and should use digital media to express and realize themselves. Web 2.0 “empowers” our creativity, it “democratizes” media, it “levels the playing field” between experts and amateurs. The enemy of Web 2.0 is “elitist” traditional media.

Empowered by Web 2.0 technology, we can all become citizen journalists, citizen videographers, citizen musicians. Empowered by this technology, we will be able to write in the morning, direct movies in the afternoon, and make music in the evening.

Sounds familiar? It’s eerily similar to Marx’s seductive promise about individual self-realization in his German Ideology:

Whereas in communist society, where nobody has one exclusive sphere of activity but each can become accomplished in any branch he wishes, society regulates the general production and thus makes it possible for me to do one thing today and another tomorrow, to hunt in the morning, fish in the afternoon, rear cattle in the evening, criticise after dinner, just as I have a mind, without ever becoming hunter, fisherman, shepherd or critic.

Just as Marx seduced a generation of European idealists with his fantasy of self-realization in a communist utopia, so the Web 2.0 cult of creative self-realization has seduced everyone in Silicon Valley. The movement bridges counter-cultural radicals of the ’60s such as Steve Jobs with the contemporary geek culture of Google’s Larry Page. Between the book-ends of Jobs and Page lies the rest of Silicon Valley including radical communitarians like Craig Newmark (of Craigslist.com), intellectual property communists such as Stanford Law Professor Larry Lessig, economic cornucopians like Wired magazine editor Chris “Long Tail” Anderson, and new media moguls Tim O’Reilly and John Batelle.

The ideology of the Web 2.0 movement was perfectly summarized at the Technology Education and Design (TED) show in Monterey, last year, when Kevin Kelly, Silicon Valley’s über-idealist and author of the Web 1.0 Internet utopia Ten Rules for The New Economy, said:

Imagine Mozart before the technology of the piano. Imagine Van Gogh before the technology of affordable oil paints. Imagine Hitchcock before the technology of film. We have a moral obligation to develop technology.

But where Kelly sees a moral obligation to develop technology, we should actually have–if we really care about Mozart, Van Gogh and Hitchcock–a moral obligation to question the development of technology.

The consequences of Web 2.0 are inherently dangerous for the vitality of culture and the arts. Its empowering promises play upon that legacy of the ’60s–the creeping narcissism that Christopher Lasch described so presciently, with its obsessive focus on the realization of the self.

Another word for narcissism is “personalization.” Web 2.0 technology personalizes culture so that it reflects ourselves rather than the world around us. Blogs personalize media content so that all we read are our own thoughts. Online stores personalize our preferences, thus feeding back to us our own taste. Google personalizes searches so that all we see are advertisements for products and services we already use.

Instead of Mozart, Van Gogh, or Hitchcock, all we get with the Web 2.0 revolution is more of ourselves.

STILL, the idea of inevitable technological progress has become so seductive that it has been transformed into “laws.” In Silicon Valley, the most quoted of these laws, Moore’s Law, states that the number of transistors on a chip doubles every two years, thus doubling the memory capacity of the personal computer every two years. On one level, of course, Moore’s Law is real and it has driven the Silicon Valley economy. But there is an unspoken ethical dimension to Moore’s Law. It presumes that each advance in technology is accompanied by an equivalent improvement in the condition of man.
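As a purely illustrative aside (mine, not Keen’s), the doubling described above is simple exponential arithmetic; the sketch below, with an arbitrary starting value, shows what “doubles every two years” implies over a decade.

```python
# Exponential growth implied by the Moore's Law description above:
# a doubling every two years. The starting transistor count is arbitrary.
def growth_factor(years, doubling_period=2):
    """Multiplicative increase after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

base_transistors = 1_000_000  # hypothetical chip at year 0
for years in (2, 4, 10):
    factor = growth_factor(years)
    print(f"After {years:2d} years: x{factor:g} -> {int(base_transistors * factor):,} transistors")
```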

But as Max Weber so convincingly demonstrated, the only really reliable law of history is the Law of Unintended Consequences.

We know what happened first time around, in the dot.com boom of the ’90s. At first there was irrational exuberance. Then the dot.com bubble popped; some people lost a lot of money and a lot of people lost some money. But nothing really changed. Big media remained big media and almost everything else–with the exception of Amazon.com and eBay–withered away.

This time, however, the consequences of the digital media revolution are much more profound. Apple and Google and Craigslist really are revolutionizing our cultural habits, our ways of entertaining ourselves, our ways of defining who we are. Traditional “elitist” media is being destroyed by digital technologies. Newspapers are in freefall. Network television, the modern equivalent of the dinosaur, is being shaken by TiVo’s overnight annihilation of the 30-second commercial. The iPod is undermining the multibillion dollar music industry. Meanwhile, digital piracy, enabled by Silicon Valley hardware and justified by Silicon Valley intellectual property communists such as Larry Lessig, is draining revenue from established artists, movie studios, newspapers, record labels, and song writers.

Is this a bad thing? The purpose of our media and culture industries–beyond the obvious need to make money and entertain people–is to discover, nurture, and reward elite talent. Our traditional mainstream media has done this with great success over the last century. Consider Alfred Hitchcock’s masterpiece Vertigo and a couple of other brilliant works of the same name: the 1999 book Vertigo, by Anglo-German writer W.G. Sebald, and the 2004 song “Vertigo,” by Irish rock star Bono. Hitchcock could never have made his expensive, complex movies outside the Hollywood studio system. Bono would never have become Bono without the music industry’s super-heavyweight marketing muscle. And W.G. Sebald, the most obscure of this trinity of talent, would have remained an unknown university professor had a high-end publishing house not had the good taste to discover and distribute his work. Elite artists and an elite media industry are symbiotic. If you democratize media, then you end up democratizing talent. The unintended consequence of all this democratization, to misquote Web 2.0 apologist Thomas Friedman, is cultural “flattening.” No more Hitchcocks, Bonos, or Sebalds. Just the flat noise of opinion–Socrates’s nightmare.

WHILE SOCRATES correctly gave warning about the dangers of a society infatuated by opinion in Plato’s Republic, more modern dystopian writers–Huxley, Bradbury, and Orwell–got the Web 2.0 future exactly wrong. Much has been made, for example, of the associations between the all-seeing, all-knowing qualities of Google’s search engine and the Big Brother in Nineteen Eighty-Four. But Orwell’s fear was the disappearance of the individual right to self-expression. Thus Winston Smith’s great act of rebellion in Nineteen Eighty-Four was his decision to pick up a rusty pen and express his own thoughts:

The thing that he was about to do was open a diary. This was not illegal, but if detected it was reasonably certain that it would be punished by death . . . Winston fitted a nib into the penholder and sucked it to get the grease off . . . He dipped the pen into the ink and then faltered for just a second. A tremor had gone through his bowels. To mark the paper was the decisive act.

In the Web 2.0 world, however, the nightmare is not the scarcity, but the over-abundance of authors. Since everyone will use digital media to express themselves, the only decisive act will be to not mark the paper. Not writing as rebellion sounds bizarre–like a piece of fiction authored by Franz Kafka. But one of the unintended consequences of the Web 2.0 future may well be that everyone is an author, while there is no longer any audience.

SPEAKING OF KAFKA, on the back cover of the January 2006 issue of Poets and Writers magazine, there is a seductive Web 2.0 style advertisement which reads:

Kafka toiled in obscurity and died penniless. If only he’d had a website . . . .

Presumably, if Kafka had had a website, it would be located at kafka.com which is today an address owned by a mad left-wing blog called The Biscuit Report. The front page of this site quotes some words written by Kafka in his diary:

I have no memory for things I have learned, nor things I have read, nor things experienced or heard, neither for people nor events; I feel that I have experienced nothing, learned nothing, that I actually know less than the average schoolboy, and that what I do know is superficial, and that every second question is beyond me. I am incapable of thinking deliberately; my thoughts run into a wall. I can grasp the essence of things in isolation, but I am quite incapable of coherent, unbroken thinking. I can’t even tell a story properly; in fact, I can scarcely talk . . .

One of the unintended consequences of the Web 2.0 movement may well be that we fall, collectively, into the amnesia that Kafka describes. Without an elite mainstream media, we will lose our memory for things learnt, read, experienced, or heard. The cultural consequences of this are dire, requiring the authoritative voice of at least an Allan Bloom, if not an Oswald Spengler. But here in Silicon Valley, on the brink of the Web 2.0 epoch, there no longer are any Blooms or Spenglers. All we have is the great seduction of citizen media, democratized content and authentic online communities. And weblogs, of course. Millions and millions of blogs.

Andrew Keen is a veteran Silicon Valley entrepreneur and digital media critic. He blogs at TheGreatSeduction.com and has recently launched aftertv.com, a podcast chat show about media, culture, and technology.

Read Full Post »

http://www.wfs.org/index.html

Published since 1966
January-February 2008
Volume 42, No. 1

A magazine of forecasts, trends, and ideas about the future.

Future View

The Age of Distraction: The Professor or the Processor?

Due to academia’s reliance on technology and the media’s overemphasis on trivia, we are failing to inform future generations about social problems that require critical thinking and interpersonal intelligence.

In the midst of the consumer technology boom of 1999—the diffusion of cell phones, laptops, music players, and gaming consoles—I began work on the concept of the “interpersonal divide,” or the social void that I observed developing as more people came to rely on mediated rather than face-to-face communication.

My warnings went unheeded because of the hoopla over the global village, particularly in academia, which was investing billions of dollars in information technology to facilitate the rapid growth of computing on campus. Since I direct a journalism school, my warnings about corporate profit at the expense of public institutions worried colleagues as well as benefactors and media practitioners.

But the new technologies that now keep us constantly connected also keep us constantly distracted. Educators know that wireless technology has disrupted the classroom, with students browsing (and even buying) online during lectures. However, the new challenge is the pervasive unwillingness to do anything about it. Digital distractions now keep us from addressing the real issues of the day. Each of us daily consumes an average of nine hours of media through myriad technological platforms. As a journalism professor, I’m especially sensitive to this emerging state of constant distraction and its effects on what we watch and read. This is not the Age of Information. This is the Age of Distraction. And distraction in academia is deadly because it undermines critical thinking. That impacts all of us—and the future.    

Without critical thinking, we create trivia. We dismantle scientific models and replace them with trendy or wishful ones that are neither transferable nor testable. We have witnessed this with such issues as global warming, worldwide pandemics, and natural selection. Thus, I theorize that standards of higher education have been lowered, not raised, because of new information and consumer technology.

For more than a decade now, university administrators have been touting technology. Apple, Dell, Gateway, and Gates persuaded us to become citizens of a brave new media world that promised to enfranchise and enlighten everyone with universal access. Now, access is omnipresent in our wireless campuses and workplaces. Was that investment well spent? The U.S. Department of Education saw no difference between the performance of kids who used academic software programs for math and reading and those that did not.

In fact, reading scores in 2005 were significantly worse than in 1992, according to the National Assessment of Educational Progress, the nation’s report card. And in math, only 23% of all twelfth graders were proficient. Worse, these sinking scores occurred even though high-school students averaged 360 more classroom hours in 2005 than in 1990.

We need to investigate whether distractions in wireless classrooms might be to blame. Have we compared the scores of school districts investing modestly versus heavily in technology, adjusting for factors like household income, to see if our digital classrooms make any difference?

Assessment no longer is the norm in higher education. Worse, universities are investing in online virtual worlds vended by companies whose proprietary service terms often conflict with disclosure and due process. Costs keep mounting, from bandwidth to security. We have trouble funding real campuses without leasing and staffing digital “land” that is not really there.

The question at issue is who will be the bearer of truth in the digital age—the professor or the processor? To answer that, we first must dispel the myth that technology is merely a tool whose effects depend on how one uses it. Technology, in fact, is an autonomous system. As philosopher Jacques Ellul (1912-1994) foresaw, technology changes dramatically whatever it touches. Introduce technology into journalism, and henceforth journalism is about technology. Introduce it into the economy, and the stock market henceforth is about technology. Introduce it into education, and education is about technology. Because technology is omnipresent and autonomous, it touches everything and cannot be blamed for anything. Moreover, interfaces and applications come with a motive, typically developed by the military or the media. The digital device is programmed to do two things (often simultaneously): surveil or sell. Thus, we surveil students on Facebook while students buy off eBay during class.

Digital distraction doesn’t just affect students, but also workers. Who hasn’t known a supervisor or an administrator who writes memos when the staff desires a discussion on a matter of substance? There are supervisors and colleagues who rely on e-mail in situations requiring give-and-take, creating new problems or complicating existing ones. As a result, my research has shown, miscommunication happens as often as successful communication. Typically, we go about business in a false milieu of emergency, which technology exacerbates, because of instantaneous communication.

How to Recover Interpersonal Intelligence

The question is, what can be done? This, too, is part of my research and my lectures across the country on the concept of “interpersonal intelligence,” or knowing when, where, and for what purpose technology is appropriate or inappropriate.

We should inform all incoming college students during their orientation about digital distractions, helping them adapt from the life of a consumer to the life of the mind. We should teach them to explicate the motive of the interface (often making money) rather than simply assume that students will download lectures rather than iTunes. We might assign them to keep journals about the consequences of mobile technology and observe how it is used in their immediate surroundings. Students also can monitor their own use throughout the term, noting any changes in attitude or behavior. They can tally how much they spend on consumer technology, including purchases made during so-called boring periods during a lecture—just one indication of how impulse buying adds debt. Finally, we must remind ourselves that this issue, as nearly all in academia, pivots on student attitudes and actions. Student government, as well as student organizations, can help reclaim the classroom in ways that we have not yet contemplated. Above all, we must deprogram ourselves from technology overuse to realize the benefits and beauties of community.

Given global energy demand that promises to alter mobile lifestyles, today’s students within a decade may have to rely on neighbors in immediate environments more than on avatars in virtual ones. Perhaps we should prepare them to converse with others without ear buds of iPods or ring tones of cell phones.

If we do not recommit to critical thinking in the classroom, the future is in jeopardy. Moreover, if we don’t practice interpersonal intelligence at home, school, and work, we cannot set the standards for the emerging generation. They see us interact using the same technologies but often fail to understand that we had the gift of literary education. We can reflect on issues. We can meditate. We can make independent choices, exercise fairness, avoid snap judgments, examine our own prejudices, listen to viewpoints of others that differ from our own, base our opinions on fact, exercise discretion, acknowledge when we are wrong, and analyze people, topics, and events methodically.

Those are attributes of critical thinking, which many lack in their multitasking digital world. Without these skills, future generations may create Huxley’s brave new world. This comes at a bad time in history, when knowledge of culture, enhanced by interpersonal intelligence, is the key to lasting peace, and ignorance thereof could lead to disaster.

About the Author

Michael Bugeja is the director of the Greenlee School of Journalism at Iowa State University and the author of Interpersonal Divide: The Search for Community in a Technological Age (Oxford, 2005). His latest work, Living Ethics Across Media Platforms (Oxford, 2008), advocates for new media standards based on universal principles. E-mail at bugeja@iastate.edu.

 



COPYRIGHT (C) 2007 World Future Society, 7910 Woodmont Avenue, Suite 450, Bethesda, MD 20814, U.S.A. Telephone 1-656-8274; fax 301-951-0394; e-mail info@wfs.org; Web site www.wfs.org. All rights reserved.

Read Full Post »

Fighting the Cult of the Amateur

http://www.wfs.org/index.html


January-February 2008
Volume 42, No. 1

A magazine of forecasts, trends, and ideas about the future.

Fighting the Cult of the Amateur

A Web 2.0 Critic Takes on the Confederacy of E-Dunces. 



In his new book, The Cult of the Amateur (Currency, 2007), blogger and Internet entrepreneur Andrew Keen explores today’s new participatory Internet (often referred to as Web 2.0). He argues that too much amateur, user-generated, free content is threatening not only mainstream media—newspapers, magazines, and record and movie companies—but our very culture. We asked Keen what today’s Internet trends mean for the future of our increasingly Web-driven society.

THE FUTURIST: Summarize the basic premise of your book for us; what do you see as the great danger in the way the Internet is allowing millions of content creators to undermine established media?

Keen: I don’t believe this is any kind of conspiracy. Most of the technologists behind Web 2.0 want to do well and they’re decent people. The relationship between the rise of new media and the crisis of old media is causally complex. It would be a dramatic oversimplification to argue that the only reason mainstream media is in crisis is because of the Internet. They are intimately bound up with one another and are cause and effect, in some respects. But people stopped trusting and reading newspapers before the invention of the Internet. People, particularly in the U.S., have problems with all sorts of authority, with or without the Internet. It’s a reaction against cultural authority.

It’s no coincidence that most of the intellectual leaders of the Web 2.0 movement are children of the sixties. There’s a book by Fred Turner of Stanford called From Counterculture to Cyberculture that traces the birth of Silicon Valley and today’s Internet to people opposed to traditional forms of authority. When we look at Web 2.0 we’re staring into a mirror. We’re a society that’s intent on exposing the unreliability and corruption of authority, whether that authority is an editor at a publishing house or newspaper, or an executive at a record label, or a producer in Hollywood, or a politician. The representatives of mainstream media have become a convenient punching bag, much like politicians.

The alternative to mainstream media, which is the Internet, is by definition untrustworthy because it doesn’t have gatekeepers. It lends itself not to imagined corruption, but to real corruption. Ironically, the continual distrust of our supposedly unreliable mainstream media has given us a new media that is, by its very definition, unreliable.

FUTURIST: Was there a specific incident—perhaps something that you witnessed during your Silicon Valley days with Audiocafe—that convinced you that today’s Internet is killing our culture?

Keen: I describe it in my book. I had an epiphany at an event called Foo Camp, which is Friends Of O’Reilly Camp. It’s the classic Silicon Valley Unconference conference, with lots of people espousing jargon about democracy and interactivity and cultural flattening and openness. It was at that event in September 2004 that I had my transformation. I went from a digital believer to an unbeliever.

FUTURIST: What happened?

Keen: I just had enough of these wealthy Silicon Valley guys talking about democratization. It was the height of absurdity that these affluent people thought they knew what anybody else wanted culturally, politically, and mentally. It occurred to me that what was going on was intellectual fraud.

I think it’s worth stressing that the book begins with this epiphany. The book itself, as a narrative, is premised on it.

FUTURIST: How do you see this trend evolving in the future? For instance, just as our technology habits got us into this mess, is it possible that a different, future technology might get us out?

Keen: I don’t think this is a technology story. Hopefully, what’s going on now will force people to realize that expertise does have value. Third parties—gatekeepers—add value to all media. They help produce much more truthful content. People will rediscover the value of expertise and authority figures who know what they’re talking about, so I hope that Web 3.0, when it arrives, will reflect something new. Rather than the empowerment of the amateur, Web 3.0 will show the resurgence of the professional. Having talked to a number of people who are building their next-generation Internet businesses around proven expertise, I’m more optimistic now than when I first wrote the book. Many of the new Internet media startups pay the people who contribute content to their sites and don’t allow them to hide behind anonymity.

FUTURIST: When do you think this change to Web 3.0 will be noticeable?

Keen: I think it’s already happening. When you look at the Web sites like Mahalo.com (which is paying its contributors), HowThingsWork.com, and a number of other businesses I’ve written about, you see the change that’s taking place. Smart people in Silicon Valley are now invested in those kinds of businesses.

But I have a feeling that the tipping point will come with something involving Google or one of the Google companies, like YouTube. YouTube is the driver of the Web 2.0 economy, and they epitomize the hypocrisy of Web 2.0, as well. They’re making a fortune from the advertising sold around free amateur content, but they articulate this ideology of personal empowerment. I’ve seen some incredibly disturbing videos posted on YouTube. I think we’re going to see a profoundly immoral example of how media—without a gatekeeper—lends itself to nastiness. That will be the low point.

The high point, so to speak, for Web 2.0 was when Time magazine voted “you” as the person of the year. I think we’re going to look back at that as the PetFood.com moment.

FUTURIST: Is there anything else we might do now to reverse these trends?

Keen: One area I think we need to concentrate on is anonymity. I think it’s one of the most corrosive things in the Web 2.0 world, and it lends itself to corruption, rudeness, vulgarity. I spent some time at Berkeley with a few research guys from Yahoo. All of their research shows that, whenever a site is dominated by anonymous posters, the quality of the content is dramatically lower than when the site encourages people to reveal their identities. I think that more and more businesses will come to understand that relying on anonymously produced content is actually a way of losing money.

For the rest of us, we need to ask ourselves, “Is Web anonymity really necessary in a democracy?” I just don’t think it’s justified unless you might be put in jail for your opinion.

The other great concern for me is media literacy. Young people need to understand the difference between Wikipedia and The New York Times online. There’s a difference between a blog and a book. I’m thrilled that education professionals out there are now teaching media literacy in schools. I think it needs to be taught not only in schools but also in universities.

FUTURIST: You’ve written a book, you blog online; what else do you do to get this message out there?

Keen: I’m doing a lot of speaking, I’m presenting to people in Vancouver. Over the next couple of weeks, I’ll be in Amsterdam, then London, then Greece, then Frankfurt. This is a message that’s caught on. I’ve got translated versions of the book coming out in China, Taiwan, and Poland. The book is an opening salvo, a polemic to get people to think about these issues. I hope that after my book, people will write more thoughtful, scholarly works on this subject. My book is not a scholarly book. It’s not a balanced book. It’s an attempt to begin a conversation.

 

To read an excerpt from Keen’s book where he discusses Foo Camp, go to www.andrewkeen.typepad.com. This interview was conducted by Patrick Tucker.


COPYRIGHT (C) 2007 World Future Society, 7910 Woodmont Avenue, Suite 450, Bethesda, MD 20814, U.S.A. Telephone 1-656-8274; fax 301-951-0394; e-mail info@wfs.org; Web site www.wfs.org. All rights reserved.

Read Full Post »
