June 2008 Archives

Chris Anderson, George Dyson and Kevin Kelly reckon we are better off letting computers understand everything for us. I was going to quote some lengthy passage from "Brave New World". But, as always, comedy has the answers.

Whenever I read something from the Cult of the Singularity, I find it hard not to conjure up the hectoring tones of Johnny from Mike Leigh's film Naked. You have to wonder how many spurious factoids David Thewlis had to commit to memory to get his improvised monologues* to work:

"And every barcode is divided into two sections by three markers and those markers are always represented by the number six. Six, six, six

"And what they’re planning to do, in order to eradicate all credit card fraud and in order to precipitate a totally cashless society…and they’ve already tested it on the American troops: they’re going to subcutaneously laser-tattoo that mark onto your right hand or onto your forehead. They’re going to replace plastic with flesh!

"Fact!"

I'll spare you the whole tirade but it leads up to the point where Johnny and the Singularists come together as one:

"And no, we’re not going to sprout extra limbs and wings and things because evolution itself is evolving. When it comes, the apocalypse itself will a part of the process of that leap of evolution. By the very definition of apocalypse, mankind must cease to exist, at least in a material form. We’ll have evolved into something that transcends matter, into a species of pure thought. Are you with me?"

And so, there I was reading Kevin Kelly's exposition of the OneMachine made out of old PCs yoked together that thinks with hyperlinks, mentally adding an extra "Fact!" at the end of every paragraph to complete the effect:

"Each new link wires up a subroutine, creates a loop, and unleashes a cascade of impulses. As waves of links surge around the world, they resemble the thought patterns of a very large brain."

Fact!

"By 2040, the planetary computer will attain as much processing power as all 7 billion human brains on Earth."

Fact!

And what do these computers actually do when harnessed as one? Some of them do something useful such as perform quantum mechanical calculations to predict protein folding. Unfortunately, they are more likely to be sending out tons of spam. But no mind, "we are headed toward a singular destiny: one vast computer composed of billions of chips and billions of brains, enveloping the planet in a single sphere of intelligence".

Fact! The techalypse is coming.

But there was one thing niggling at me: where were the figures coming from to support the contention that the One Machine rivals even one brain today? And this is assuming you accept Giulio Tononi's assertion that intelligence comes as a function of complexity, that you can just slam a bunch of circuits together and automatically get something that thinks. Towards the bottom of the page are some figures in a diagram.

By far the oddest one is the choice of 70MHz for the brain's operating frequency: "grey matter is about as speedy as an original Pentium". That sounds pretty quick to me given that the calcium-induced cascade that triggers a neural response takes on the order of 200µs. That gives you a maximum frequency — even working on the basis that neurons switch like electronic transistors, which they don't — of a few kilohertz. By that token, the human brain can barely keep up with a Sinclair ZX80. The actual frequency is probably way lower than that as neural signalling seems to rely on pulse trains that take tens of milliseconds to transmit from one neuron to another. The brain makes up for that sluggishness by not trying to work like an electronic computer. The transistor, as it turns out, is a pretty rotten analogue for a neuron, although maybe not nearly as bad as equating a hyperlink with a synapse.
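For what it's worth, here is the back-of-envelope version of that argument as a few lines of Python – my own illustrative figures, not Kelly's – taking the 200µs cascade as a generous upper bound on how fast a neuron could 'switch':

    # Back-of-envelope sketch: how fast could a neuron "clock" if the
    # ~200 microsecond calcium-triggered response were the only limit?
    # (A generous assumption: real signalling relies on pulse trains
    # lasting tens of milliseconds, so the true figure is far lower.)

    cascade_time_s = 200e-6                  # ~200 microseconds per response
    max_neuron_rate_hz = 1 / cascade_time_s  # optimistic upper bound

    claimed_brain_hz = 70e6   # the 70MHz "Pentium-class" figure
    zx80_hz = 3.25e6          # Sinclair ZX80 clock, for scale

    print(f"Optimistic neuron rate: {max_neuron_rate_hz / 1e3:.0f} kHz")
    print(f"The 70MHz claim is {claimed_brain_hz / max_neuron_rate_hz:,.0f}x faster")
    print(f"Even a ZX80 is {zx80_hz / max_neuron_rate_hz:,.0f}x faster")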

But I'm really curious about the 70MHz. Where does that figure come from? Surely it can't be derived from Bruce Tainio who claimed in the early 1990s to have found a relationship between frequency and disease. According to Tainio's measurements, the brain has a 'bio-frequency' of 72MHz to 90MHz — genius intelligence is at the upper end, apparently. Fans of the woo business will be delighted to know that you can buy 'essential oils' that resonate in the same range and so help you get a better brain. And not those nasty gigahertz frequencies, like 2.4GHz, that mess your brain up. I can't find any paper from Tainio that explains his conclusions, just references on essential-oil websites, found courtesy of the resident Overmind otherwise known as Google. However, if I suspect my neurons to be running at 70MHz, I'm going to be ringing the doctor pronto, assuming that I'm actually able to.

* Sorry Orb fans, this post has approximately zero to do with Minnie Riperton done ambient stylee, but here's a link to the video if that's all you wanted. But thanks to the Orb for sampling so much of Naked on S.A.L.T. (Orblivion) to save the aggro of fast-forwarding through the film to find the monologues.

In the wake of the uneasy truce between Loren Feldman and Shel Israel, it seems that Feldman has managed to do both of the things described in the most famous quote from John Dryden's "A Discourse Concerning the Original and Progress of Satire":

"Yet there is still a vast difference betwixt the slovenly butchering of a man, and the fineness of a stroke that separates the head from the body and leaves it standing in its place."

In ruthlessly taking Israel apart with the humour equivalent of a rusty meat cleaver, Feldman co-opted Israel into helping him save the finer cuts for social media in general.

"And now it’s done, my little experiment with Social Media. I beat you with your own tools, in the arena in which you bill yourself an expert. You are an amateur Shel, an amateur, always remember that."

With the puppet, Feldman did distinctly old-media things. For one thing, it's all fake. It's a puppet pretending to be some other guy. Out through the window goes the social media stricture of "authenticity". Although the puppet was a goof, it was a lovable goof – the kind of thing old TV loves. And the set-ups were straight from pro-TV school. It's just as well. Israel's videos were self-satirising: the one of him waving a boom mic around like a balloon on a stick in front of a bleary-eyed Jeremiah Owyang while supping disinterestedly on a latte is unforgettable. And not in a good way.

Feldman called the puppet "more real": a classic bit of legerdemain. Israel was very real during the whole spat. He was angry. He was upset. He wanted to get even. Faced with what Feldman was doing to him, what would you want to do? Social media's advice: be real, be honest.

But nobody believed the advice. The sensible advice to Israel was to bottle it up, act nice. And that probably would have worked. Had Israel gritted his teeth and pretended that he really loved the puppet, he would probably have come out of the whole episode more famous and better off. In other words, ignore Naked Conversations: Be inauthentic. You can't blog or tweet your way out of a crisis any more than you can knit your way out of a burning building.

And don't forget Feldman's position as a pro versus Israel's as an amateur in what was meant to be an amateur's game.

And that is Feldman's gift to social media in a situation where most in the club seem to have ignored the puppet site's tag line: "A parody of Social Media’s impact on business & culture".

But what about the position of Michael Arrington and Jason Calacanis in this? Israel seems to believe that Arrington's hand was behind the puppet all of the time. Feldman's response:

"You chose to blame Mike Arrington, Jason Calacanis, and myself when you should have been blaming yourself. Mike is busy taking on AP and the NY Times. Jason is taking on Google. I’m taking on TV, do you think anyone of us have the time or even give a shit enough about you to plot a conspiracy?"

Or, to paraphrase with a slant on social media: these people are building media empires, do you imagine they give a shit about some social-media revolution? It's been good to them, it's been a laugh, but there's a lot more money in replacing the 'old-media' companies.

Now it seems to be Dave Winer's turn. The joke's just not so funny the second time around but, given the ability of some of social media's voices to self-satirise, who knows what's possible.

Chris Anderson of Wired has declared the scientific method dead. And it's all thanks to Google, apparently, and the mass of data it is accumulating. Maybe Google really is making us stupid after all because the reasoning behind Anderson's conclusion is built on some shaky foundations.

Did Peter Norvig, Google's research director, really say: "All models are wrong, and increasingly you can succeed without them"? Because, if so, he seems to have misinterpreted what his own company has been doing. Yes, search and its related technologies do not rely on language models. But the core of all that Google does right now is based on a statistical approach that makes some basic assumptions about how language works. You might call it a model.

Anderson postulates a world based on machine learning, where the computer crunches through the data to come up with predictions.

"This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear...With enough data, the numbers speak for themselves."

Yet, machine-learning algorithms depend on the construction of some kind of model. It is not necessarily a deterministic model in the way that classical mechanics is, but just because it invokes statistics does not make it any less a model-based technique. What are models for? They allow you to make predictions about what will happen given some inputs.
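To make the point concrete – a minimal sketch of my own, nothing to do with how Google actually does it – even the most naive "let the data speak" exercise ends up estimating the parameters of a model, and it is the model, not the raw numbers, that does the predicting:

    # Minimal sketch: even "just crunching the data" produces a model.
    # Fitting a straight line by least squares yields two parameters
    # (slope, intercept) -- a model -- and that is what makes prediction possible.
    import numpy as np

    # Toy data: inputs and noisy outputs (purely illustrative).
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

    # "Learning" here means estimating the model's parameters from the data.
    slope, intercept = np.polyfit(x, y, deg=1)

    # Prediction only works because there is now a model to ask.
    x_new = 6.0
    print(f"model: y = {slope:.2f}x + {intercept:.2f}")
    print(f"prediction at x = {x_new}: {slope * x_new + intercept:.2f}")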

In 2003, Bill Gates channelled just about every user of Windows and its arcane ways in a memo dredged out of the antitrust actions by the Seattle PI. All he wanted to do was download Moviemaker but the Windows designers had other ideas:

"So I gave up and sent mail to Amir saying - where is this Moviemaker download? Does it exist?

So they told me that using the download page to download something was not something they anticipated."

It did not get better for Billg and his download past that point. However, Todd Bishop's post has a sting in the tail. He asked Gates on his departure about the email, sent almost five-and-a-half years ago:

As for the message, Gates smiled and said, "There's not a day that I don't send a piece of e-mail ... like that piece of e-mail. That's my job."

When people ask what Microsoft will be like now that Gates has left the building, this memo, and the idea that Gates sent lots of them, should be the clue: nothing will change. Because if any of these memos had any effect, Windows would be a rather different piece of software. The structures that Microsoft built over the last 30 years effectively nullified any direct control that Gates had over software development. I'm sure people who weren't directly responsible for the problems Gates had with the download nodded and agreed with what he had to say, and they all listened intently to his speeches. But they then went on their way to product-planning meetings that not only created these hindrances but ossified them into place.

The obvious question when faced with today's decision by Nokia to buy out Symbian and release the software as open source was: if you have shipped 200 million handsets, what was the problem that forced you to do this? During the presentation that attempted to explain the move, executives such as Nokia executive vice president Kai Öistämö leaned on the not-so-convincing argument that, because Symbian has a 60 per cent share of the market while charging manufacturers up to $5 a handset, everything is going to be even better now that it is free. Somehow, making it open source would dragoon in a bunch of application developers and convince everyone that Symbian is the only game in town in handsets. Forget Android, forget Limo and definitely don't bother about the closed-like-a-clam Apple iPhone.

Yet, despite having had ten years to build an unbeatable handset operating system, Symbian almost stumbled at the last hurdle. Nokia's majority ownership of the software maker has been a sticking point with manufacturers, some of whom chose to build other user interfaces on top of the operating system to prevent Nokia from maintaining a stranglehold with the Series 60 environment. That is where environments such as UIQ and MOAP – used largely in Japan – have come in.

The situation has irritated operators such as Vodafone, who find themselves having to deal with three different flavours of mobile phone built on ostensibly the same base when they have tried to pare back the number of platforms they support. Several years ago, Vodafone decided to try to restrict the amount of time it spent on software by picking three platforms: Limo; Microsoft; and Symbian. The idea of being able to bring Symbian back to one piece of software is far more attractive than the current situation.

People in the computer business just can't resist those Moore's Law versus the car industry analogies. Today's exhibit is Professor Steve Furber of the University of Manchester:

One litre of fuel would serve the UK for a year and oil reserves would last the expected lifetime of the solar system - if efficiency in the car industry had improved at the same rate as in the computer world - a leading computer scientist will tell an audience in Manchester, UK, on Friday 20 June 2008.
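The compounding behind that sort of claim is easy enough to reproduce. A rough sketch – using my own assumption of a doubling in efficiency every two years since 1971, not Furber's actual arithmetic:

    # Rough sketch of the compounding behind Moore's Law analogies.
    # Assumption (mine, not Furber's): efficiency doubles every two years,
    # from the Intel 4004 in 1971 through to 2008.
    years = 2008 - 1971
    doublings = years / 2.0
    improvement = 2 ** doublings

    mpg_1971 = 25.0   # an illustrative starting figure for a car of the era
    print(f"{doublings:.1f} doublings -> roughly {improvement:,.0f}x improvement")
    print(f"An imaginary 25mpg car would now do {mpg_1971 * improvement:,.0f} mpg")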

I bet he won't be telling them about the motorways clogged with automobiles stranded at odd angles as their drivers phone into call centres to be told: "Just try taking the battery out, then put it back in and start the car up. We can see if it happens again."*

Sorry, it's an old joke, but someone's got to do it.

* I once hired a Smart ForFour with an ECU that crashed so badly - in the middle of Wimbledon in rush hour - the only option was to reboot the car. When I next hired a car from them, I noticed that the ForFour was no longer on the list of vehicles.

More Mentor

19 June 2008

I've posted a couple of pieces on the attempt by Cadence Design Systems to buy Mentor Graphics at the Shrinking Violence blog, which I've set up to mainly cover the electronics business as silicon heads into its final decade of Moore's Law scaling.

The current design is temporary, which is why it's on a standard Movable Type template but that will change.

Last week, a group of social scientists from the University of Nottingham released their report on the ethical problems facing the technology of synthetic biology. Commissioned by the Biotechnology and Biological Sciences Research Council (BBSRC), the report called for a "thorough review of existing controls and safeguards" to extend them to synthetic biology.

Not just that. The public needs to be involved and may even be in the position to stop certain kinds of research: "It is vital to recognise the importance of maintaining public legitimacy and support. In order to achieve this, scientific research must not get too far ahead of public attitudes and potential applications should demonstrate clear social benefits."

This is from a different section but covers similar ground: "Partnership with civil society groups, social scientists and ethicists should be pursued as a highly effective way of understanding critical issues, engaging with publics and winning support for emerging scientific fields. However, at the same time it must be recognised that this is a two-way process and that some ethically problematic scientific projects and potentially controversial technologies may have to be abandoned in order to maintain trust."

This all sounds good in principle. But it is a process that could lead to some seriously strange decisions being made as to which branches of biological research are pursued and which are terminated. For a good many of the ethical issues that surround synthetic biology do not lie in the research but in the application. And in many cases, the economics of the application.

Chart-tastic

18 June 2008

eMusic is running a survey to try to find its subscribers' favourite album. Well, the best one that eMusic can supply, which narrows the field quite some way. But it means that, whatever the winner, it's not going to be some multi-platinum monstrosity.

With the help of iTunes Statistician and the power of memory, I came up with a list fairly quickly, although some things I could swear I got from the paid-for download site have since disappeared, which entailed a bit of rejigging.

On top of that, the number one is a bit of a ringer as I didn't get it off eMusic. However, the live album they put up on the service for free was the come-on I needed to try eMusic in the first place.

And the winner is: The Pixies with Surfer Rosa/Come On Pilgrim.

Followed by:

2. Twin Cinema - The New Pornographers
3. The Greatest - Cat Power
4. Walls - Apparat
5. The Life Pursuit - Belle & Sebastian

That is all.

Mentor's big decision

17 June 2008

There is clearly something in the water on the West Coast as hostile takeover fever is taking hold. Away from the Microsoft/Yahoo soap opera, another, somewhat smaller bid battle is gearing up. Mentor Graphics has rejected today's offer from Cadence Design Systems, citing antitrust issues among the reasons:

"As we recently indicated to Cadence, we reviewed Cadence's proposal and analyzed both the price proposed and the risks associated with obtaining antitrust approval for a combination between the companies,” said Walden C. Rhines, chairman and CEO of Mentor Graphics. "Following this review, we concluded that not only was the price insufficient to support a transaction but that the risks of not gaining regulatory approval were sufficiently high that the ability of the parties to consummate the transaction would be in jeopardy. For these and other reasons, our Board unanimously rejected the proposal."

On the conference call, Cadence did not distance itself from the idea that CEO Mike Fister could play the role of Steve Ballmer against the man some analysts are setting up as the Jerry Yang of this battle: Mentor's chairman and CEO Wally Rhines. The script is similar: Mentor did not want to negotiate, and is not interested in providing value to shareholders.

Intel came close to giving the idea of having a fixed clock-speed rating on its upcoming Nehalem the heave-ho, according to Intel fellow Rajesh Kumar, speaking to journalists ahead of the VLSI Circuits Symposium in Hawaii this week. The people who were going to be putting the processor into PCs didn't care for the idea, it seems.

The company has radically altered the way that Nehalem is clocked compared with its predecessors in order to improve both memory bandwidth and power consumption. It means that the core, memory buses and I/O run almost independently.

The bigger change is internal, where it seems that the concept of a fixed clock running at several gigahertz has been discarded in favour of letting the logic run at its own speed. This is something that people such as former ARM architect Professor Steve Furber have been advocating for years. The concept of a system clock is entirely artificial and exists largely to make life easy for chip designers and simplify the job of testing chips as they come off the production line. Chips such as the Amulet don't run off any kind of clock: the logic inside finds its own speed.
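A toy way of seeing the appeal – a sketch of the general idea only, nothing to do with how Nehalem or the Amulet are actually built – is to compare a pipeline where every stage waits for a worst-case clock period with one where each stage hands over the moment it finishes:

    # Toy sketch: clocked versus self-timed pipeline stages.
    # (Illustrative only: real asynchronous design uses handshake circuits,
    # not a software simulation like this.)

    # Actual completion time of each stage for one piece of work, in nanoseconds.
    stage_times_ns = [0.6, 1.0, 0.4, 0.8]

    # Clocked design: every stage waits for the clock period, which has to
    # cover the slowest stage (design margins ignored here).
    clock_period_ns = max(stage_times_ns)
    clocked_latency = clock_period_ns * len(stage_times_ns)

    # Self-timed design: each stage signals completion as soon as it is done,
    # so the latency is just the sum of the actual stage times.
    self_timed_latency = sum(stage_times_ns)

    print(f"clocked pipeline latency:    {clocked_latency:.1f} ns")
    print(f"self-timed pipeline latency: {self_timed_latency:.1f} ns")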

It took me ages to get round to reading Nick Carr's Atlantic piece on the stupefying effects of Internet usage. I was too busy looking at lolcats, surfing the news and skimming through RSS feeds. And I liked it. That's probably where the problem lies.

In The Big Switch and other recent writing, Carr worries about the relentless push toward the Singularity - a time when humans and computers become inseparable because the machines will keep us alive and help us think. The Atlantic piece signs off: "as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence".

I think Carr worries too much about the ability of computer science to deliver on Larry Page's 2004 promise in a Newsweek interview that he cites: "Certainly if you had all the world’s information directly attached to your brain, or an artificial brain that was smarter than your brain, you’d be better off."

A couple of months ago, nVidia's Jen-Hsun Huang decided to stick his head out of the window and shout he wasn't going to take it anymore. Or at least, gather a bunch of analysts together at the graphics chipmaker's HQ and tell them he wasn't going to take it anymore. The trigger was Intel's developer forum in China where Intel's Pat Gelsinger declared the death of today's graphics processor (GPU). Curiously, Gelsinger made that claim just ahead of talking about Larrabee, Intel's latest foray into the GPU business (it's a different kind of GPU, you understand).

The argument from the Intel side was that traditional processors would take over many of the rendering functions in 3D graphics, largely because there are going to be so many of them. Huang had the opposite argument: GPUs already have lots of processors on them, so why not use them to offload software from the host processor?

And so the stage is set for a new kind of architecture war in which you have different kinds of microprocessor fighting over the same ground.

There was something oddly convenient about the passage extracted from a manual on sabotage supposedly written in 1944 by the US OSS about disrupting corporate activity. You read through the list of things a saboteur should do, as quoted by people such as David Weinberger, and think: "Yeah, I've been in those meetings."

Take, for example, point one on page 28:

Insist on doing everything through “channels.” Never permit short-cuts to be taken in order to expedite decisions.

It was at that point that my internal hoaxmeter started edging into the red.

Download the document. Take a look at it. Doesn't it look just a little too clean for a publication that was printed more than 60 years ago and, presumably, scanned only days or weeks ago? The front page has been disfigured by stamps to make it look a little distressed but there's barely a dog ear – in fact there are no dog ears – on the subsequent pages.

Maybe it's the little things that give it away. There is the lack of hyphenation in 'cooperate', the use of phrases such as "inside dope" and the reference to fluorescent lighting. Yes, dope was slang for information a century ago. But in a document supposedly for distribution to agents whose first language probably wasn't English? Fluorescent lighting? It existed but hardly anybody had seen it in the 1940s.

And then maybe it's the reference to "the United Nations war effort": an organisation that was not formed until after the Second World War.

When you consider the provenance of the 'manual' - it's an exhibit being used by a couple of Web 2.0 evangelists from the OSS's successor the CIA - it shows that spooks have a sense of humour too. The OSS and CIA did have sabotage leaflets (they probably still do). Just not, in all likelihood, this one.

Update: Darn it. A commenter at David Weinberger's blog points out that the term United Nations was used before 1945. The commenter points to the Declaration of United Nations in 1941 as the point at which the name started to be used. I wasn't totally convinced but then spotted some speeches given by Roosevelt where he used the term United Nations liberally. So, maybe that bit was culled from a real OSS document. But the whole thing still screams fake to me.

Blowin' in the wind

9 June 2008

I've never really understood the point of leaf blowers. Even less so as I look out of the window today - in the middle of June - at a guy wandering up and down with a leaf blower in a street that does not have any trees in it.

The currency of news

4 June 2008

It wasn't until I read about the research commissioned by the Associated Press into news consumption (via Martin Stabe's blog) that I realised that hardly anybody has done ethnographic studies of how people deal with news. Other than this study, I can't find anything through Google Scholar that deals with the audience – most of the ethnographic research concentrates on the journalists, not on the people they are producing the work for.

Media companies make a fair amount of use of focus groups and surveys but those sessions can be very misleading, not least because internal marketing departments structure them to probe behaviour that affects commercial decisions rather than editorial concerns. The other big problem is that people don't tell the truth about how they read newspapers or magazines. You spend a lot of time watching the sessions or reading the reports trying to infer what the subjects are really thinking. Ethnographic research goes further by trying to compare what people say with what they do.

In Sicily for a holiday in the second half of May, my girlfriend and I decided to go to Pantalica. It sounds as though it ought to be a South American heavy metal act but is an enormous, sprawling necropolis that dates back to the Bronze Age. From about 1300 BC, the inhabitants buried their dead in caves cut into the sides of the gorge carved by the Anapo river. They cut thousands of square holes in the cliffs and dragged the bodies of their relatives up to them, ultimately to be uncovered and shipped off to museums by archaeologists.


You can get to Pantalica from two directions: Ferla to the west and Sortino to the northeast. The roads almost meet, but not quite. However, the TomTom satnav shows one stretch of road joining Ferla and Sortino by way of Pantalica. Before we got there, I assumed there was a road of sorts but that it was no more than a dirt track for the section that ran down into the gorge and up the other side, as local maps show a break between the two stretches of tarmac. This was on the basis that, in all the stories of satnavs going wrong, most of the time the road actually existed. It just wasn't all that useful to regular motor vehicles.