April 2008 Archives

When synthetic biologists talk about what they are doing, they often point to the analogies between their work and what happens in engineering, particularly electronics engineering. You can point to some processes in living cells and describe them in the same terms as digital logic or oscillators - the kind of functions you find in a lot of electronic circuits.
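The oscillator analogy can be made concrete with a toy model. The sketch below (Python, my illustration rather than anything from the article) integrates the protein-only form of the repressilator, a ring of three genes in which each represses the next - the biological cousin of an odd-length ring oscillator in electronics. All parameter values are illustrative.

```python
# Toy version of the repressilator (Elowitz & Leibler, 2000): three
# genes in a ring, each repressing the next, behave like an odd-length
# ring oscillator in electronics. Parameters are illustrative only.

def repressilator(steps=50000, dt=0.01, alpha=50.0, n=4.0, decay=1.0):
    p = [5.0, 1.0, 1.0]                # protein levels of the three repressors
    history = []
    for _ in range(steps):
        nxt = []
        for i in range(3):
            repressor = p[(i - 1) % 3]                   # upstream gene
            production = alpha / (1.0 + repressor ** n)  # Hill-type repression
            nxt.append(p[i] + dt * (production - decay * p[i]))
        p = nxt
        history.append(tuple(p))
    return history

traj = repressilator()
# Once the transients die away, each protein level rises and falls in turn.
```

With a steep enough repression curve (here a Hill coefficient of 4) the ring never settles: each protein peaks while its repressor is low, then collapses as the upstream gene switches on.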

The analogies don't stop there: the aim of synthetic biology is to develop a kit of parts from which you can build organic systems able to make fuels, drugs and chemical sensors. What are the parts? Professor Richard Kitney of Imperial College, London, says: "We mean encoded biological functions: usually we mean modified bacterial DNA."

That modified DNA is inserted into bacteria, which have the machinery already in place to do the next bit: making the parts work together to create simple circuits and, ultimately, a system that does something. The annual iGEM competition, in which undergraduate teams cook up modified bacteria to do unusual things, shows what can be done even at this stage.

Marc Andreessen has a good dissection of the strategies that Yahoo could deploy to try to fend off a possible hostile takeover by Microsoft. However, most of it is of the form, "I wouldn't start from here if I were you". If Microsoft presses ahead, it seems likely that Microhoo is not far away. And, given the state of the markets, Yahoo's board may not be alone in looking over their collective shoulders.

Andreessen contends that Oracle-PeopleSoft was the first major hostile takeover in the tech sector, the argument being that tech companies depend far too much on soft assets, which are wont to walk out of the door even if the takeover goes through.

It should really be successful hostile bids. A number of attempts took place about ten years ago, a couple of years after the one that surprised everyone but which ended in an agreed deal: IBM's bid for Lotus Development.

The takeover spree of the late 1990s was fairly short-lived and, of close to ten big attempts, most of the bids were withdrawn. There was a lull and then along came two more, away from the software business. At the time, financial analysts believed that a feeding frenzy was on the way and boards of directors queued up to put poison pills in place. Analog Devices only recently cancelled its poison pill - maybe it will now reconsider that move. And yes, ADI's poison pill entered the company's byelaws in 1998. In the event, the feeding frenzy never took place.

Last week, two events in London showed how far apart the views can be on what, to some, is the beginning of the future of chemical engineering and what to others is simply the beginning of the end.

At the IET's BioSysBio conference, which kicked off last Sunday, Professor Richard Kitney of Imperial College, London, argued that synthetic biology is the engine of a third industrial revolution. He pointed to the discoveries of the mid-19th Century and how they drove the rise of the synthetic chemical industry.

The problem that conventional synthetic chemistry has is that it is a brute-force process. It excels at producing simple molecules in high volume. But complex chemicals, particularly those needed for drugs, are expensive to manufacture. And it is no good for producing fuels because you have to put more energy in than you will ever get out.

Cells are chemical factories in miniature that are very good at producing complex chemical structures. Unfortunately, nature has not seen fit to evolve a petrol-producing bacterium. Synthetic biology opens up a future when gene reprogramming will make it possible to develop a bacterium that can turn sunlight and excess carbon dioxide into petrol or ethanol.

However, it does not take long for the ethical issues to surface. Opponents to this kind of technology worry the world will end up covered in toxic green goo pumped out by bugs gone bad.

Today's Graph of Nonsense Award goes to the one reproduced by Erick Schonfeld at Techcrunch to try to explain the chocolatey goodness that lies behind Radar Networks' plan to apply Semantic Web technologies to search. It's one of those classic startup-generated graphs that purport to show how existing, in their view clapped-out, technology is going to give way to the shiny new stuff.

[Image: semsearch.jpg - Radar Networks' graph charting the claimed progression from keyword search to semantic search]

The argument relayed to Schonfeld by Radar's Nova Spivack is that today's search engines, which are based on keyword searching, are running out of steam. Spivack's contention is that the volume of data is going to overwhelm keyword-based technologies soon and that what you need is to add meaning to the underlying text to help the poor old search engines out. And, in startup style, the argument runs that the established search engines cannot deal with a root-and-branch reworking of their algorithms.

For some bizarre reason, Spivack puts down tagging and natural language search as points on the way to semantic search, rather than having semantic search before natural language, which seems to be his argument. And then we get to "reasoning": presumably at the point where we hit the Singularity or something.

Before everyone gets excited about Google going the way of AltaVista, we should take a step back and have a look at what goes on with keyword searching and then listen to what one of the technique's creators had to say on the subject of the Semantic Web.

Late yesterday, Forbes reported that Apple has decided to buy boutique chipmaker PA Semi. So, the conference call later today where Apple announces its results for the second quarter of 2008 is going to be interesting. And there will be a bunch of silicon suppliers wondering what's going wrong for them.

Discarding the possibility that Apple has decided the move to Intel, and its rejection of PA's PowerPC processor in 2006, was an awful mistake and that it suddenly needs to press the architecture reset button, the acquisition suggests that the company is not all that happy with the shape of today's integrated circuit (IC) business.

One possibility is that Apple has decided it needs more in-house chip designers and buying PA was a quick way to staff up. That's not unusual in this business: it's a surprisingly common way of getting hold of people who can design the analogue circuits that most electronics engineers fear to touch. Even after you've bought in a bunch of processors and memory, there are other places a computer maker can use experienced IC designers to get an edge on its competitors. You don't see that much in the PC business but it's a lot more common in places like the phone market.

TSMC's 5nm difference

21 April 2008

There was a telling moment in the conference call hosted by Altera, ostensibly to talk about its 1Q08 results but also to drop in a few hints about an upcoming family of programmable logic chips.

Historically, Altera and market leader Xilinx have taken lumps out of each other as they vied to be first onto each manufacturing process. But something changed at the 65nm process node. Xilinx was quick to get its high-end parts out on the 65nm technology, but nowhere near as quick as the company's claims over having first silicon on that process. Altera was behind but took the opposite tack: putting its cheaper Cyclones onto TSMC's 65nm process first. Then it all went quiet. The launch of the 45nm process went off with both Altera and Xilinx being uncharacteristically quiet. It began to look as though a kind of chip that had become the foundries' banker for early silicon had suddenly fallen off Moore's Law.

It seems that the programmable-logic makers are still in the running, just not as quick to jump on a new process as they used to be. And it seems that Altera wasn't aware just how advanced its next process would be until TSMC decided it would lop another 5nm off - in name at least.

The 200 millionth download at eMusic has provided an opportunity to take a stab at how many active subscribers the service now has. I did it the not-so-subtle way by plotting the cumulative downloads against days since eMusic went subscription-only. The company conveniently provided three real data points and one implicit point in an arrangement that suggested some kind of power law was at work in the cumulative count.

For one, it implies that the growth of eMusic in the last couple of years has been pretty linear and got a bit of push sometime during 2006. Wasn't that roughly when AllofMP3 got its marching orders?
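For the curious, the fitting step looks like this. The milestone figures below are placeholders, not the post's actual data points; the method - least squares on log-log axes - is the standard way to check a cumulative count for a power law.

```python
import math

# Hypothetical milestones: (days since eMusic went subscription-only,
# cumulative downloads). The real figures are not reproduced here.
points = [(365, 25e6), (730, 65e6), (1095, 125e6), (1400, 200e6)]

# Fit cumulative ~ a * days**k: if a power law holds, then
# log C = log a + k log d is a straight line on log-log axes.
xs = [math.log(d) for d, _ in points]
ys = [math.log(c) for _, c in points]
n = len(points)
xbar, ybar = sum(xs) / n, sum(ys) / n
k = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
log_a = ybar - k * xbar

print(f"exponent k = {k:.2f}")   # k > 1 means the count is accelerating
```

An exponent above 1 says the download rate itself is still growing; an exponent near 1 would back the "pretty linear recently" reading.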

Sometimes, small deals can wind up changing the shape of a market. The deal between Blaze DFM and TSMC that has been gestating for close to a year is possibly one of them: it recalls the giant leap of faith that Artisan took when it came up with the "free library" idea.

Basically, Blaze and TSMC have cut a deal that will see the Taiwanese foundry use a version of the Blaze MO tool to alter transistors in a layout to make them less leaky just prior to manufacturing. The idea is not new and fairly simple: you make the transistor gate longer on logic paths that don't need to be fast. This typically shifts the threshold voltage up, which cuts leakage. STMicroelectronics has been offering the same sort of modifications using different logic cells. What is different is the nature of the deal between Blaze and TSMC and what it could mean for the whole DFM business.
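A caricature of that technique, with made-up coefficients (Blaze and TSMC do not disclose their real models, so none of this is Blaze MO's actual algorithm): lengthen the gates only on paths with timing slack, and leave the critical paths at the nominal length.

```python
import math

# Toy gate-length biasing: stretch transistors on slack paths to cut
# subthreshold leakage, keep critical paths fast. All coefficients are
# invented for illustration.

NOMINAL_L = 65e-9                     # nominal gate length in metres

def leakage(length):
    # Leakage falls off roughly exponentially as the gate gets longer
    # and the threshold voltage rises (toy exponential model).
    return math.exp(-8e8 * (length - NOMINAL_L))

def delay_factor(length):
    # Longer channel, weaker drive, slower gate (toy linear model).
    return 1.0 + 5e7 * (length - NOMINAL_L)

def bias_paths(paths, clock_period=1.0, stretch=1.1):
    """paths maps a path name to its nominal delay. Stretch the gates on
    any path that still meets the clock period when slowed down."""
    biased = {}
    for name, nominal_delay in paths.items():
        slowed = nominal_delay * delay_factor(stretch * NOMINAL_L)
        biased[name] = stretch * NOMINAL_L if slowed <= clock_period else NOMINAL_L
    return biased

paths = {"alu": 0.95, "decoder": 0.60, "regfile": 0.70}
result = bias_paths(paths)
# The slack paths get longer, less leaky gates; the near-critical "alu"
# path keeps the nominal length so the chip still makes timing.
```

The payoff comes from the exponential: even a 10 per cent stretch on non-critical gates can cut their leakage by orders of magnitude while total chip speed is unchanged.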

Instead of trying to sell tools on a per-seat basis for something like a couple of hundred thousand dollars – the regular EDA business model – TSMC will host the tool. Although the companies will not talk about the money side of the deal, it does look broadly similar to the Artisan free-library model, where the foundry paid a royalty to Artisan for each chip made and charged a bit more for each chip to the customer.

For Jacob Jacobsson, CEO of Blaze DFM, this approach, in a way, opens up money that isn't available to the EDA tools vendors. "EDA has had a more or less stagnant $3bn budget for as long as we can remember. It is more attractive for us to align with the manufacturing side of the business."

Hold the front page. Major news from the Embedded Systems Conference in San Jose. It's a classic piece of deckchair rearrangement. And it's one that makes you wonder about the marketing acumen of a company that's supposed to be a lot better at it than its competitors.

Basically, the company couldn't have given companies such as MontaVista and Wind River a better rod to beat it with than this one. The 'news' is that Microsoft is, once again, rebranding its embedded operating systems. The plural in the last sentence is a little misleading, as Microsoft only really has one embedded operating system that was designed for the job. The other one is a 'componentised' version of XP that won't bitch and whine if you haven't plugged in a keyboard when it boots.

However, the XP you can put on a diet is now, apparently, the 'standard' embedded operating system. Windows CE, which is a different piece of software altogether and was built more like a classic real-time operating system, is now the 'compact' version. And there will be an 'enterprise' version, which seems indistinguishable from regular Windows XP or Vista other than that it will get some sort of extra embedded mojo over time, according to the company.

Now, you could argue that, with Linux making bigger and bigger inroads into the embedded business, it makes sense for Microsoft to focus on the larger OS. However, the Linux kernel has gradually been acquiring bits of technology that are useful for real-time work. To run Windows Embedded Standard in a real-time environment, you need to use some form of virtualisation. So, what tends to happen is that people use Windows XP for the pretty user-interface part and something else for the real-time part. It works. But communication between the two parts is not quite as straightforward as doing everything on one OS.

Also, note the language used in the release about the next generation of software:

"The first product release under the new naming strategy will be Windows Embedded Standard, the next generation of Windows XP Embedded, and will be launched simultaneously at Tech•Ed North America and through a global webcast event on June 3. All presently available Windows Embedded products will be marketed under their current names until their next scheduled product release..."

The next generation of XP? Hmm, doesn't sound like Vista is going to go on the embedded diet anytime soon, which makes you wonder about the near-term prospects of a chopped-down Vista running on a mobile device.

Earlier today, IBM put out a release claiming a major "performance leap" for chips that use its forthcoming 32nm semiconductor process. Working out what's changed since the last release is a bit trickier. Basically, IBM and some of the companies in its group of chipmaking collaborators have made a bunch of test chips and are now confident enough to declare the 32nm process open for business.

Other than that, the content of today's missive is not broadly different from the one that IBM and its partners put out just ahead of the chipmaking industry's big conference on process technologies, the International Electron Device Meeting in Washington DC, held late last year. There really isn't a lot more detail, other than there is now a timetable: IBM will start running prototypes for customers of the companies in its Common Platform alliance in the third quarter of this year. The implication is that the companies in the Common Platform team will have a working 32nm process in the second half of 2009, about the same time as Intel and TSMC, as long as they all stay on schedule.

Hidden comments

13 April 2008

A bunch of people are up in arms about yet another social site that hoovers up newsfeeds so that people can collect all their comments into one place. The two big problems that some blog owners have are these: it's an infringement of copyright as content is being sucked into another site wholesale; and it encourages people to comment on posts away from the source blog, so that the blog owner can't get to see them without subscribing to this new site.

The first point is a tricky one. You could argue that it is an infringement of copyright. However, if you are providing full feeds then Shyftr is really only acting like an online newsreader. The name Shyftr doesn't really help the service's image but, if you don't want copy hoovered up in this way, don't provide full feeds. As this blog isn't ad-supported, it is not that big a deal where the material is read as long as it's attributed to me. Sure, I'd like to know how many people are reading. Owners of sites such as Shyftr would buy themselves a bit more slack if they ponied up readership stats to the people who provide the actual content. But it's not in evil country yet. Anyway, if you're that worried about content leeching, just use a bit of Apache mod_rewriting to serve up partial feeds, or a list of links to Rick Astley videos, to those services' spiders – assuming they've been good and announced themselves.
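A minimal sketch of that mod_rewrite trick, assuming the aggregator's crawler announces itself in its User-Agent header. "Shyftrbot" and the feed paths are invented names for illustration, not the service's real crawler.

```apache
# Hypothetical .htaccess fragment: when a known aggregator bot asks
# for the full feed, quietly hand it the partial feed instead.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Shyftrbot [NC]
RewriteRule ^feed/full\.xml$ /feed/partial.xml [L]
```

Crawlers that don't announce themselves obviously sail straight past this, which is why the "assuming they've been good" caveat matters.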

The second 'problem' is an indication of how misguided some bloggers are when it comes to the subject of The Conversation, although I think there is a small, subtle issue with a site like Shyftr. Because comments appear on blogs, it is easy to be misled into thinking that is where all the action is happening.

Take Scoble, for example, who can be relied upon in these circumstances to come out with this sort of line: "The era when bloggers could control where the discussion of their stuff took place is totally over."

And bloggers had control before? How so? Is that like how nobody discussed what appeared in the papers before blogs came along? Pubs and cafés were eerily quiet as people digested their daily news in total silence, fearing to talk about it because the nasty media had all that control?

John Busco at John's Semi-Blog has pointed to the launch by Nascentric of an analogue-circuit simulator accelerated by nVidia's graphics processors, and wondered: "Will general-purpose GPU computing become the acceleration platform for EDA?"

I was sitting at the Many-core and Reconfigurable Supercomputing (MRSC) conference in Belfast the other week wondering the same thing. In recent years, hardware-specific EDA has been a dirty word. Mentor Graphics, which made its name selling proprietary workstations before it became a software-only company, made a foray back into hardware in a deal with Mercury Computer Systems in late 2006. Mercury used the IBM Cell processor – the same one used in the Sony Playstation 3 – to speed up the job of checking chip designs before they go to fab. Mercury sells the hardware and Mentor provides a special version of Calibre.

It's not clear how well hardware acceleration has gone for Mentor and Mercury. However, in its 2007 annual report, Mercury declared that it saw a "slight rebound" in its semiconductor business, partly due to the sale of one accelerator for chip-mask inspection – which is not related to Calibre – and its deal with Mentor. The number-three EDA company has been busy showing off the hardware at events like the SPIE lithography conference, so the company must have some faith in the idea of speciality accelerators.

So there I was typing away as an email from eMusic came in. In a bout of continuous partial attention, I clicked on it and thought: "Ooh, new Black Francis album. Glad I saved some downloads instead of splurging them when the subscription rolled over at the end of last month."

But, thanks to the byzantine nature of distribution deals in the music industry, it is, naturally, not available for download in the UK from eMusic. Then I remembered the Breeders album was meant to be coming out this week and that one, thankfully, is available here. And now downloading.

As is Black Francis' Svn Fngrs, as I decided to see if it was on iTunes. I could have waited to get the CD but, as my mind normally goes blank the moment I walk into a record shop, I decided to lay out a fiver on the mini-album there and then. Looks like it's going to be a kind of ex-Pixies day.

But this is something that the music industry needs to get a grip on. There are people who do pay for music - when they can find it. It's crazy to have a situation where it's easier to find pirated versions than the paid-for recordings, particularly when it comes to back catalogues of minor artists.

I've seen plenty of releases with disclaimers - mostly about forward-looking statements. Or as the CFO of one big analogue chipmaker put it at a financial conference some years back as he put up the obligatory safe-harbour statement: "This basically says that everything I am about to tell you may be a lie."

This disclaimer from Webit PR, however, is a new one on me:

"Disclaimer:

Whilst WebitPR Ltd endeavour to ensure the accuracy of the information contained in this Release, WebitPR Ltd cannot accept any liability for:-

• the inaccuracy or otherwise of any information contained in this Release; or

• any loss liability or expense which may be suffered by any party in consequence of acting or omitting to act as a result of any information contained in or omitted from this Release; or

• any loss or suffering which may be caused by or to any party either as a result of the information contained in this Release or such information contained in this Release being inaccurate or otherwise misleading."

I guess this is one of the consequences of more releases being turned up directly by search engines. But it only serves to confirm what we already know: everything in the release may be a lie.

The way IBM describes its racetrack memory – yet another candidate for "memory of the future" – it's easy to be left with the impression that Big Blue is out on its own with this one. GigaOm's Stacey Higginbotham breathlessly opines: "IBM sure has some seriously crazy semiconductor researchers locked in its basement. These guys question everything when it comes to advancing chip technology."

Maybe IBM does. But it's not alone. What IBM claimed in the press release was that a memory 100 times denser than today's flash devices is on its way:

"The devices would not only store vastly more information in the same space, but also require much less power and generate much less heat, and be practically unbreakable; the result: massive amounts of personal storage that could run on a single battery for weeks at a time and last for decades."

Sounds great. When can I buy one? Not any time soon if you look more closely at what IBM's release is based on. The journal Science has published a paper on the work of Stuart Parkin's group at its Almaden lab in San Jose that describes a tweak to a type of magnetic memory. It's a bit like a solid-state disk. You store bits magnetically: the state depends on which way the stored field points, either forwards or backwards along a metal wire.
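In data-structure terms, the racetrack is a shift register rather than a random-access memory, which the toy model below (my sketch, not IBM's design) makes explicit: reading an arbitrary bit means shifting the whole domain pattern past a fixed read/write head and then shifting it back.

```python
from collections import deque

# Racetrack memory in caricature: bits are magnetic domains strung
# along a nanowire, and a current pulse shifts the whole pattern past
# a fixed read/write head - a shift register, not random access.

class Racetrack:
    def __init__(self, bits):
        self.wire = deque(bits)       # domain pattern along the wire
        self.head = 0                 # the head never moves

    def shift(self, n=1):
        self.wire.rotate(-n)          # one current pulse moves every domain

    def read(self):
        return self.wire[self.head]   # only the domain under the head

    def read_bit(self, index):
        # Random access costs shifts: bring the wanted domain to the head,
        # read it, then shift back to restore the original alignment.
        self.shift(index)
        value = self.read()
        self.shift(-index)
        return value

track = Racetrack([1, 0, 1, 1, 0, 0, 1, 0])
```

That shift-to-read step is why density comes cheap but access latency depends on how far along the wire the bit you want happens to sit.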

When former STMicroelectronics R&D director Jo Borel tried to convince the French government that it should try to convince Europe's three largest chipmakers to merge, he almost certainly didn't have in mind what ST and NXP Semiconductors plan to do. They are not merging the entire companies but taking the wireless business units and glueing them together.

The argument used for the merger is not all that dissimilar to Borel's: it's all about scale. Borel wanted Infineon, NXP and ST to team up to be big enough to build and operate a leading-edge fab - something that is only worth doing if you are selling billions of dollars' worth of chips every year out of that facility. Unable to do that on their own, the three companies expect to buy wafers made using the latest processes from foundries such as TSMC.

The availability of foundry-made silicon is one reason why Infineon chief Wolfgang Ziebart has said that there is not all that much point in trying to be big for the sake of being able to keep building fabs. His view is that companies will specialise and do whatever they can to be in the top three of their chosen market. Infineon has been bulking up in wireless recently, thanks to its purchase of a business unit that was only briefly part of LSI when that company bought Agere Systems.

The move by NXP and ST is on a larger scale, creating an as-yet unnamed joint venture that sits comfortably among the top three wireless silicon makers and is around twice as big as the next largest supplier. According to iSuppli, that will be Infineon once the deal is done. The German company is at the head of a line of $500m to $1bn suppliers. The ranking shifts a little if you look at it from the perspective of baseband processors - the single most important segment in cellular wireless silicon. ST lies at number three, NXP at five.