The Web 2.0

August 1st, 2006 Curve, Features

In case you hadn’t noticed, we’ve recently entered a new phase in the development and history of the Internet.

Remember the phrase ‘information superhighway’ used to describe the web in the early days? Even though it’s a term you don’t hear much anymore, people are just as excited as they were when it was in common use, back when things like Hotmail, Amazon and Yahoo were the killer apps of the web.

They’re excited because there’s a new Internet. You might have heard it called the democratisation of the web, a brave new world of information or the geek-friendly term, Web 2.0. What they all essentially describe is that in this new world, the audience is the producer.

Blogging, tagging, RSS, file sharing and search technology that ranks by popularity rather than payment (Google's wildly successful M.O.) are all part of what technology gurus from O'Reilly Media (which coined the term Web 2.0) to Kevin Kelly (founding editor of Wired) are promising is the future of the Internet.

And it’s no wonder people are excited after so much disappointment. As the nineties came to a close, the Internet was starting to look like just another version of TV: all big-company marketing, with a few mailing lists keeping geeks amused on the fringes. After the bust, it reverted to a state of something like potential rather than kinetic energy. Many industries saw it as a financial black hole. Broadband spluttered on the starting blocks. The fascination that led to sitting up late at night just surfing websites dwindled.

But today projects like Flickr and Wikipedia are putting control over content in the hands of users, and they’re forging networks more effectively than a million focus groups or viral marketing campaigns ever could. The network, as Sun Microsystems’ famous slogan has it, is truly the computer.

You’d find few people who’d call the ubiquitous connectivity of opinion, entertainment and information we enjoy in the western world a bad thing. But you might find a few people who are sorry they believed what they read.

Because here’s what the Internet pundits forget to mention when they wax lyrical about the streams of data hurtling under our feet and overhead: a good deal of it is at best rubbish and at worst plain wrong.

The change has been a shift in the very notion of the media, a word that once referred to the corporatised entity feeding us our information in a top-down, ‘push’ model. Nowadays, any kid with a microphone attached to their computer and a knowledge of podcasting, a digital camera and a Flickr account, or a web page with an RSS feed of their blog is a media mogul, and somewhere there’ll be an audience for what they produce (even if it’s only themselves).

The old producer/consumer relationship used to be like a waterfall coming over a narrow precipice, showering us as we stand at the bottom, advertisers hoping their messages in the water saturate us along with the content.

Today’s media is more like the water when it hits the lake at the bottom: a churning and multidirectional mass of sounds, images, video, chat, blogs and websites, all of them targeted in different and sometimes opposite directions. Streams of data pick up or piggyback on one another, creating an endless network of forks in the road.

The question is: where will those roads lead us? To the information we want or need? If there was one thing you could be reasonably sure of in traditional media (and still can be), it’s a collaborative effort in quality control.

Digital encyclopaedia Wikipedia was seen as the digital fount of all our collective knowledge. Anyone could log in and change or update entries. The result? In late 2005 blogger and technology journalist Nicholas Carr published The Amorality of Web 2.0, in which he made reference to entries on Wikipedia about Bill Gates and Jane Fonda. The entries were so full of errors Carr was moved to describe Wikipedia as ‘unreliable’, ‘slipshod’ and ‘sometimes appalling’, suggesting that ‘participatory media is mediocre’.

Wikipedia’s founder has since indicated he’s considering a system where entries are reviewed by an editor and tagged as legitimate – a system that would make Wikipedia decidedly more like Web 1.0 or traditional media. Is it a guarded admission of the failure of unfettered user control?

It may just be that if you put just about anyone in control of content, you’ll get just about anything back. Get your recipe for chicken foie gras off the Internet and you might end up with chicken nuggets. Get it out of a book and you can be confident (by the very act of having parted with your money in a bookstore) the recipe has been provided by an experienced chef, checked by an experienced editor and produced by a professional publishing house.

Information that’s been packaged for sale (from books and newspapers to cinema and TV) has an inherent credibility over stuff you read on any old website: it’s been commissioned with a certain level of quality in mind, fact checked, spell checked and professionally produced. We can’t all be composers, authors or academics, for the simple reason that not many of us are any good at it. The editorial system of the traditional media would theoretically weed us out.

As an Australian book publisher said over four years ago: "because they’re published books, you know they’re reliable sources of information, which is not true if you just search on the Internet. There’s an awful lot of garbage out there…"

Of course, in this age of opinion-columns-as-hard-facts, hysterical shock jocks moulding public opinion by brute force and pre-packaged PR bites passed off as news, one look at the traditional media shows us it needs a lot more weeding out to ensure true quality rather than just marketability.

But what the staggering amount of digital content reminds us of is that the skills we should bring to bear when dealing with any media — a healthy dose of scepticism laced with a keen sense of human nature — are as important as ever.

None of this is to say we shouldn’t post our photos and blogs. The Internet is all about connection – both human and computational. But we should think twice before we mistake mass expression for knowledge.
