A spat between two blogging stalwarts, Steve Gillmor and Doc Searls, has seen the role of the venerable hyperlink come into question. Gillmor doesn't like links. He has declared links to be dead. Searls is sceptical of the Gillmor position and wants to know why Gillmor is so greedy as to deny him, or anybody else, the benefit of a link.
In reality, the argument is less about the hyperlink than it is about one search engine's mechanism for rating pages: a search engine that has given blogs a bigger boost than any other single factor. Google's PageRank system is built on the idea that people link to pages that are important to them. If pages that are themselves important link to a page, then that page must be really important. Blogs benefit greatly from PageRank because they rely so much on intensive linking. But the recent rise of splogs that link to high-profile blogs means that links "have been devalued", in the words of Dave Winer.
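The idea described above can be sketched in a few lines. This is a minimal, illustrative version of the PageRank recursion (a random surfer who mostly follows links, with a damping factor for occasional random jumps), not Google's actual implementation; the tiny link graph and page names are invented for the example.

```python
def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to.

    A page's score is the chance a random surfer lands on it:
    with probability d the surfer follows a link, otherwise
    jumps to a random page.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - d) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # a page passes its importance to the pages it links to
                share = d * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:
                # a dangling page spreads its rank evenly
                for p in pages:
                    new[p] += d * rank[page] / n
        rank = new
    return rank

# A blog that several pages link to outranks the pages linking to it.
graph = {"blog": ["a"], "a": ["blog"], "b": ["blog"], "c": ["blog"]}
ranks = pagerank(graph)
```

This is also why splogs work: adding spam pages that link to a target inflates the target's share of the surfer's attention, which is exactly the behaviour the heuristics discussed below try to detect.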
Although splogs are becoming more troublesome, does that really mean the underpinning of the Web should be thrown away and replaced with something completely different?
As an alternative to linking, Gillmor is pumping up the use of RSS feeds:
"I am specifically and overtly not linking to drive people to RSS and its fundamental time efficiency."
Fundamental time efficiency? I can agree if you only want to keep up with a relatively fixed set of sites. But most people surf through pages following a trail of links. RSS is not much help when you are searching for things you do not normally keep track of. Links are, no matter how devalued they might be as a proxy that search engines use for ranking purposes.
Gillmor says he wants people to cite rather than link directly. Fantastic. And how exactly do we use those citations if they do not link directly? We have to use the one alternative we currently have to hand: the search engine. The very engine that is the main target of sploggers right now. There is another problem. Citations are fine, just as long as the citation is unambiguous enough to allow a search engine or some other intermediary to come up with the appropriate...now, what's the word? Oh, link. That's it. Now, let me think, what was the last attempt to use search technology to build dynamic links to words and phrases that could be regarded as citations? Oh, Smart Tags. Correct me if I'm wrong, but didn't the world decide these things were evil after Google imposed AutoLink on web pages courtesy of its toolbar? And Microsoft did not win many fans for its Smart Tags before then.
For a self-confessed old fart, I'm surprised Gillmor has forgotten that hyperlinks did not begin with the World Wide Web; they have a wider reach and a deeper history. Ted Nelson had the idea for hyperlinks way back in the mid-1960s, about the same time that people came up with many of the ideas still used in computers today: mice, superscalar architectures, memory caches and a bunch of other things. I'm not sure that a little problem with one search engine should cause an eminently good idea to get dumped. Search engines have been hit with this kind of thing before, when link farms first appeared. What did they do? They stopped treating all pages as equal and used heuristics to try to identify and penalise spammy pages. The heuristics are getting more elaborate, and we may reach the point where search engines have to use some serious AI to tell blog from splog. Even humans will have difficulty with the distinction.
I see nothing in Gillmor's citation approach that is fundamentally less susceptible to spamming than the good old link. In fact, by not relying on intermediaries to handle the connection, direct links between trusted sites are far more resistant to spam. Take away the links and the spammers will simply focus on the intermediaries that rise up to take their place. Of course, using intermediaries to link content from different sites would make it far easier for Gillmor to build his attention-based economy.