Technology: June 2008 Archives

Whenever I read something from the Cult of the Singularity, I find it hard not to conjure up the hectoring tones of Johnny from Mike Leigh's film Naked. You have to wonder how many spurious factoids David Thewlis had to commit to memory to get his improvised monologues* to work:

"And every barcode is divided into two sections by three markers and those markers are always represented by the number six. Six, six, six

"And what they’re planning to do, in order to eradicate all credit card fraud and in order to precipitate a totally cashless society…and they’ve already tested it on the American troops: they’re going to subcutaneously laser-tattoo that mark onto your right hand or onto your forehead. They’re going to replace plastic with flesh!"


I'll spare you the whole tirade but it leads up to the point where Johnny and the Singularists come together as one:

"And no, we’re not going to sprout extra limbs and wings and things because evolution itself is evolving. When it comes, the apocalypse itself will be a part of the process of that leap of evolution. By the very definition of apocalypse, mankind must cease to exist, at least in a material form. We’ll have evolved into something that transcends matter, into a species of pure thought. Are you with me?"

And so, there I was reading Kevin Kelly's exposition of the One Machine made out of old PCs yoked together that thinks with hyperlinks, mentally adding an extra "Fact!" at the end of every paragraph to complete the effect:

"Each new link wires up a subroutine, creates a loop, and unleashes a cascade of impulses. As waves of links surge around the world, they resemble the thought patterns of a very large brain."


"By 2040, the planetary computer will attain as much processing power as all 7 billion human brains on Earth."


And what do these computers actually do when harnessed as one? Some of them do something useful, such as performing quantum-mechanical calculations to predict protein folding. Unfortunately, they are more likely to be sending out tons of spam. But no matter, "we are headed toward a singular destiny: one vast computer composed of billions of chips and billions of brains, enveloping the planet in a single sphere of intelligence".

Fact! The techalypse is coming.

But there was one thing niggling at me: where were the figures coming from to support the contention that the One Machine rivals even one brain today? And this is assuming you accept Giulio Tononi's assertion that intelligence comes as a function of complexity, that you can just slam a bunch of circuits together and automatically get something that thinks. Towards the bottom of the page are some figures in a diagram.

By far the oddest one is the choice of 70MHz for the brain's operating frequency: "grey matter is about as speedy as an original Pentium". That sounds pretty quick to me given that the calcium-induced cascade that triggers a neural response takes on the order of 200µs. That gives you a maximum frequency — even working on the basis that neurons switch like electronic transistors, which they don't — of a few kilohertz. By that token, the human brain can barely keep up with a Sinclair ZX80. The actual frequency is probably way lower than that, as neural signalling seems to rely on pulse trains that take tens of milliseconds to transmit from one neuron to another. The brain makes up for that sluggishness by not trying to work like an electronic computer. The transistor, as it turns out, is a pretty rotten analogue for a neuron, although maybe not nearly as bad as equating a hyperlink with a synapse.
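The arithmetic is back-of-the-envelope stuff (a sketch using the 200µs figure quoted above; the 20ms pulse-train value is my illustrative assumption, and real neurons don't switch like this at all):

```python
# Ceiling on a neuron's "clock speed" if it really did switch like a
# transistor: one response per ~200 µs calcium-induced cascade.
# This is a deliberate over-simplification to test the 70MHz claim.

response_time = 200e-6        # seconds per response cascade
max_freq = 1 / response_time  # hard upper bound, ~ a few kilohertz

# Pulse-train signalling between neurons takes tens of milliseconds,
# so the effective rate is lower still (20 ms assumed for illustration):
pulse_train = 20e-3
effective_freq = 1 / pulse_train

print(max_freq, effective_freq)  # ~5000 Hz and ~50 Hz, nowhere near 70 MHz
```

Even the generous transistor-style bound comes out four orders of magnitude short of 70MHz.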

But I'm really curious about the 70MHz. Where does that figure come from? Surely it can't be derived from Bruce Tainio, who claimed in the early 1990s to have found a relationship between frequency and disease. According to Tainio's measurements, the brain has a 'bio-frequency' of 72MHz to 90MHz — genius intelligence is at the upper end, apparently. Fans of the woo business will be delighted to know that you can buy 'essential oils' that resonate in the same range and so help you get a better brain. And not those nasty gigahertz frequencies, like 2.4GHz, that mess your brain up. I can't find any paper from Tainio that explains his conclusions, just references on essential-oil websites, found courtesy of the resident Overmind otherwise known as Google. However, if I suspected my neurons were running at 70MHz, I'd be ringing the doctor pronto, assuming that I was actually able to.

* Sorry Orb fans, this post has approximately zero to do with Minnie Riperton done ambient stylee, but here's a link to the video if that's all you wanted. But thanks to the Orb for sampling so much of Naked on S.A.L.T. (Orblivion) to save the aggro of fast-forwarding through the film to find the monologues.

Chris Anderson of Wired has declared the scientific method dead. And it's all thanks to Google, apparently, and the mass of data it is accumulating. Maybe Google really is making us stupid after all, because the reasoning behind Anderson's conclusion is built on some shaky foundations.

Did Peter Norvig, Google's research director, really say: "All models are wrong, and increasingly you can succeed without them"? Because, if so, he seems to have misinterpreted what his own company has been doing. Yes, search and its related technologies do not rely on language models. But the core of all that Google does right now is based on a statistical approach that makes some basic assumptions about how language works. You might call it a model.
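To make the point concrete, here's a toy sketch (my own illustration, nothing to do with Google's actual machinery): even the most naive statistical treatment of text smuggles in a model, in this case the assumption that the next word depends only on the current one.

```python
# A toy bigram "language model": pure counting, no grammar, yet still a
# model, because it assumes the next word depends only on the current one.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate".split()

# Count which word follows which.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict(word):
    """Most likely next word under the bigram assumption."""
    return bigrams[word].most_common(1)[0][0]

print(predict("the"))  # 'cat' — it follows 'the' twice, 'mat' only once
```

Strip away every linguistic theory you like; the Markov assumption doing the predicting here is still a model.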

Anderson postulates a world based on machine learning, where the computer crunches through the data to come up with predictions.

"This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear...With enough data, the numbers speak for themselves."

Yet, machine-learning algorithms depend on the construction of some kind of model. It is not necessarily a deterministic model in the way that classical mechanics is, but just because it invokes statistics does not make it any less a model-based technique. What are models for? They allow you to make predictions about what will happen given some inputs.
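A minimal sketch of what that means in practice (an illustrative example of my own, not any particular company's algorithm): even plain least-squares fitting yields a model — two parameters embodying the assumption that the relationship is linear — and it is that model, not the raw numbers, that does the predicting.

```python
# "Letting the data speak" via least-squares line fitting still produces a
# model: slope and intercept, plus the baked-in assumption of linearity.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]   # made-up data, roughly y = 2x

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Standard least-squares estimates for slope and intercept.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

def predict(x):
    """The fitted model, not the raw data, answers questions about new inputs."""
    return slope * x + intercept

print(slope, intercept, predict(5.0))
```

The numbers never "speak for themselves": an input the data has never seen gets an answer only because the two fitted parameters generalise it.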

In 2003, Bill Gates channelled just about every user of Windows and its arcane ways in a memo dredged out of the antitrust actions by the Seattle P-I. All he wanted to do was download Moviemaker but the Windows designers had other ideas:

"So I gave up and sent mail to Amir saying - where is this Moviemaker download? Does it exist?

"So they told me that using the download page to download something was not something they anticipated."

It did not get better for Billg and his download past that point. However, Todd Bishop's post has a sting in the tail. He asked Gates on his departure about the email, sent almost five-and-a-half years ago:

As for the message, Gates smiled and said, "There's not a day that I don't send a piece of e-mail ... like that piece of e-mail. That's my job."

When people ask what Microsoft will be like now that Gates has left the building, this memo and the idea that Gates sent lots of them should be the clue. Nothing will change. Because if any of these memos had any effect, Windows would be a rather different piece of software. The structures that Microsoft built over the last 30 years effectively nullified any direct control that Gates had over software development. I'm sure people who weren't directly responsible for the problems Gates had with the download nodded and agreed with what he had to say, and they all listened intently to his speeches. But they then went on their way to product-planning meetings that not only created these hindrances but ossified them into place.

The obvious question when faced with today's decision by Nokia to buy out Symbian and release the software as open source was: if you have shipped 200 million handsets, what was the problem that forced you to do this? During the presentation that attempted to explain the move, executives such as Nokia executive vice president Kai Öistämö leaned on the not-so-convincing argument that because Symbian has a 60 per cent share of the market while charging manufacturers up to $5 a handset, everything is going to be even better now that the software is free. Somehow, making it open source would dragoon in a bunch of application developers and convince everyone that Symbian is the only game in town in handsets. Forget Android, forget Limo and definitely don't bother about the closed-like-a-clam Apple iPhone.

Yet, despite having had ten years to build an unbeatable handset operating system, Symbian almost stumbled at the last hurdle. Nokia's majority ownership of the software maker has been a sticking point with manufacturers, some of whom chose to build other user interfaces on top of the operating system to prevent Nokia from maintaining a stranglehold with the Series 60 environment. That is where environments such as UIQ and MOAP – used largely in Japan – have come in.

The situation has irritated operators such as Vodafone who find themselves having to deal with three different flavours of mobile phone built on ostensibly the same base when they have tried to pare back the number of platforms they support. Several years ago, Vodafone decided to try to restrict the amount of time it spent on software by picking three platforms: Limo; Microsoft; and Symbian. The idea of being able to bring Symbian back to one piece of software is far more attractive than the current situation.

People in the computer business just can't resist those Moore's Law versus the car industry analogies. Today's exhibit is Professor Steve Furber of the University of Manchester:

One litre of fuel would serve the UK for a year and oil reserves would last the expected lifetime of the solar system - if efficiency in the car industry had improved at the same rate as in the computer world - a leading computer scientist will tell an audience in Manchester, UK, on Friday 20 June 2008.

I bet he won't be telling them about the motorways clogged with automobiles stranded at odd angles as their drivers phone into call centres to be told: "Just try taking the battery out, then put it back in and start the car up. We can see if it happens again."*

Sorry, it's an old joke, but someone's got to do it.

* I once hired a Smart ForFour with an ECU that crashed so badly - in the middle of Wimbledon in rush hour - the only option was to reboot the car. When I next hired a car from them, I noticed that the ForFour was no longer on the list of vehicles.

More Mentor

19 June 2008

I've posted a couple of pieces on the attempt by Cadence Design Systems to buy Mentor Graphics at the Shrinking Violence blog, which I've set up to mainly cover the electronics business as silicon heads into its final decade of Moore's Law scaling.

The current design is temporary, which is why it's on a standard Movable Type template, but that will change.

Last week, a group of social scientists from the University of Nottingham released their report on the ethical problems facing the technology of synthetic biology. Commissioned by the Biotechnology and Biological Sciences Research Council (BBSRC), the report called for a "thorough review of existing controls and safeguards" to extend them to synthetic biology.

Not just that. The public needs to be involved and may even be in the position to stop certain kinds of research: "It is vital to recognise the importance of maintaining public legitimacy and support. In order to achieve this, scientific research must not get too far ahead of public attitudes and potential applications should demonstrate clear social benefits."

This is from a different section but covers similar ground: "Partnership with civil society groups, social scientists and ethicists should be pursued as a highly effective way of understanding critical issues, engaging with publics and winning support for emerging scientific fields. However, at the same time it must be recognised that this is a two-way process and that some ethically problematic scientific projects and potentially controversial technologies may have to be abandoned in order to maintain trust."

This all sounds good in principle. But it is a process that could lead to some seriously strange decisions being made as to which branches of biological research are pursued and which are terminated. For a good many of the ethical issues that surround synthetic biology do not lie in the research but in the application. And in many cases, the economics of the application.

Mentor's big decision

17 June 2008

There is clearly something in the water on the West Coast as hostile takeover fever is taking hold. Away from the Microsoft/Yahoo soap opera, another, somewhat smaller bid battle is gearing up. Mentor Graphics has rejected today's offer from Cadence Design Systems, citing antitrust issues among the reasons:

"As we recently indicated to Cadence, we reviewed Cadence's proposal and analyzed both the price proposed and the risks associated with obtaining antitrust approval for a combination between the companies," said Walden C. Rhines, chairman and CEO of Mentor Graphics. "Following this review, we concluded that not only was the price insufficient to support a transaction but that the risks of not gaining regulatory approval were sufficiently high that the ability of the parties to consummate the transaction would be in jeopardy. For these and other reasons, our Board unanimously rejected the proposal."

On the conference call, Cadence did not distance itself from the idea that CEO Mike Fister could play the role of Steve Ballmer to the executive some analysts are setting up as the Jerry Yang of this battle: Mentor's chairman and CEO Wally Rhines. The script is similar: Mentor did not want to negotiate, and is not interested in providing value to shareholders.

Intel came close to giving the idea of having a fixed clock-speed rating on its upcoming Nehalem the heave-ho, according to Intel fellow Rajesh Kumar, speaking to journalists ahead of the VLSI Circuits Symposium in Hawaii this week. The people who were going to be putting the processor into PCs didn't care for the idea, it seems.

The company has radically altered the way that Nehalem is clocked compared with its predecessors in order to improve both memory bandwidth and power consumption. It means that the core, memory buses and I/O run almost independently.

The bigger change is internal, where it seems that the concept of a fixed clock running at several gigahertz has been discarded in favour of letting the logic run at its own speed. This is something that people such as former ARM architect Professor Steve Furber have been advocating for years. The concept of a system clock is entirely artificial and exists largely to make life easy for chip designers and simplify the job of testing chips as they come off the production line. Chips such as the Amulet don't run off any kind of clock: the logic inside finds its own speed.

A couple of months ago, nVidia's Jen-Hsun Huang decided to stick his head out of the window and shout that he wasn't going to take it anymore. Or at least, gather a bunch of analysts together at the graphics chipmaker's HQ and tell them he wasn't going to take it anymore. The trigger was Intel's developer forum in China, where Intel's Pat Gelsinger declared the death of today's graphics processor (GPU). Curiously, Gelsinger made that claim just ahead of talking about Larrabee, Intel's latest foray into the GPU business (it's a different kind of GPU, you understand).

The argument from the Intel side was that traditional processors would take over many of the rendering functions in 3D graphics, largely because there are going to be so many of them. Huang had the opposite argument: GPUs already have lots of processors on them, so why not use them to offload software from the host processor?

And so the stage is set for a new kind of architecture war in which you have different kinds of microprocessor fighting over the same ground.

In Sicily for a holiday in the second half of May, my girlfriend and I decided to go to Pantalica. It sounds as though it ought to be a South American heavy metal act but is an enormous, sprawling necropolis that dates back to the Bronze Age. From about 1300 BC, the inhabitants buried their dead in caves hewn into the sides of the gorge carved by the Anapo river. They cut thousands of square holes in the cliffs and dragged the bodies of their relatives up to them, ultimately to be uncovered and shipped off to museums by archaeologists.


You can get to Pantalica from two directions: Ferla to the west and Sortino to the northeast. The roads almost meet, but not quite. However, the TomTom satnav shows one stretch of road joining Ferla and Sortino by way of Pantalica. Before we got there, I assumed the road existed but was no more than a dirt track for the section that runs down into the gorge and up the other side, as local maps show a break between the two stretches of tarmac. This was on the basis that in most stories of satnavs going wrong, the road actually existed; it just wasn't all that useful to regular motor vehicles.