Chip makers need new tricks to flog high-end chips

Column: Gaming doesn't cut it any longer
Fri Feb 24 2012, 17:54

THE CHIP INDUSTRY is facing its toughest challenge in decades, and it cannot be solved by relying on the number or size of transistors.

Gordon Moore's oft-quoted law, in which the Intel co-founder predicted that the number of components on a chip would double every two years, has been used to promote an image of breakneck innovation in the chip industry.
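
To put that pace in perspective, here's a back-of-the-envelope sketch of the doubling arithmetic, taking Moore's original 1965 paper as the baseline; the function name is ours, and this is an illustration of the prediction, not anything the chip makers publish.

```python
# Rough sketch of Moore's Law as compound doubling: component
# counts double every two years from an assumed 1965 baseline.
def moores_law_factor(start_year: int, end_year: int, doubling_period: int = 2) -> float:
    """Predicted growth factor in on-chip component counts."""
    doublings = (end_year - start_year) / doubling_period
    return 2 ** doublings

# By 2012 the prediction implies growth of roughly ten million times.
print(f"1965 -> 2012: about {moores_law_factor(1965, 2012):,.0f}x")
```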

While Moore originally predicted that the trend would hold for a decade, Intel has pushed its relevance for almost half a century.

Despite delays, Intel is expected to release its next-generation Ivy Bridge chips later this year, but the question for an ever-growing majority of users is, 'What's the point?' There's little doubt Intel will do what the market requires to produce a chip that edges out AMD. However, for the majority of users it will matter little, as many popular computing use cases can be served by Intel's own chips from four or five years ago.

AMD, Intel and in particular the ARM vendors Qualcomm and Texas Instruments have spent the best part of the last decade talking about energy efficiency, and there's a good reason for that. The fashionable 'green' element is part of it, but computer use cases have also progressed very little, and certainly not at the same pace as processor development, meaning chip vendors could start producing lower thermal design power (TDP) chips rather than simply pushing performance boundaries.

Consider the most common use cases: browsing the web, electronic messaging, word processing, even gaming. Now think how many of those are limited by processing power. Perhaps you'll come up with gaming, but in reality even gaming has run well on old hardware for years.

The painfully obvious fact is that game developers are focusing their efforts on consoles, smartphones and tablets. Few look at the latest triple-A titles on the Xbox 360 or PlayStation 3 and bemoan poor graphics, and, perhaps shockingly, in the case of the Xbox 360 those titles run on hardware that is seven years old.

It's not just in gaming that this is seen, but also in the high-budget world of enterprise computing. Earlier this week Canonical announced that it will tip up to Mobile World Congress with a Motorola Atrix 2 that can run a version of Ubuntu Linux. This is another graphic illustration of companies believing that people's needs can be met with relatively low-powered smartphone hardware, even for running a full-blown desktop operating system.

For years chip vendors relied on use cases to flog new chips. When Intel's Mooly Eden introduced Sandy Bridge at CES in 2011 he talked about video transcoding, and when AMD launched its Llano chips it banged on about affordable gaming, but these use cases are showing diminishing returns, ironically due to the latest cash cow, the cloud.

In some ways chip vendors are inadvertently adding to their problem by pushing the boundaries of cloud computing. Typically end users see the cloud as a place to store data, but cloud rendering services such as OnLive are in effect promoting the 'dumb terminal' style of computing showcased by smartphones and tablets.

AMD, when it launched its Llano chips, also talked up the ability of its GPGPU to perform the complex number crunching used for tracking movement. Considering that a Kinect hooked up to a seven-year-old Xbox 360 can do that, albeit at a relatively low resolution, it goes to show how even the most seemingly cutting-edge use cases floated by chip vendors can be handled by far less than high-end hardware.

If users can offload a great deal of the heavy lifting to the cloud, then the need for cutting-edge processing power will diminish. Of course, relying on the cloud effectively means relying on your internet connectivity. However, if you are comfortable enough with the reliability of your broadband connection, as increasing numbers of users seem to be, then you need a compelling reason to do your computing exclusively in-house.

More directly, three years ago the highly respected technology journalist Anand Lal Shimpi claimed that solid-state drives (SSDs) were the best upgrade you could get for your machine. One can argue about the specifics of Shimpi's claim and whether it still holds today, but what can be surmised is that a faster chip is no longer the sure-fire way to get tangible performance benefits from an upgrade.

Of course I'm not suggesting that chip vendors should simply give up on chip development, but the need to double computing power in desktops and laptops every two years has long since passed. Now the challenge is getting the computing power we already have down to power draw levels that let us put it in smartphones and tablets.

At Intel, Moore was known as a meticulous thinker, an analyser and a realist. Even he thought his prediction wouldn't last more than a decade; after all, exponential growth cannot continue forever. He always believed that research and development costs would signal the end of Moore's Law, but in reality it will be the use of cloud resources that eliminates users' need to double their processing power every two years.

The challenge for AMD, Intel and the rest is not an easy one if they want to flog ever-faster chips in volume. The buying public has finally realised it doesn't need to upgrade or replace systems every three years to meet performance demands, and now it needs a far more compelling reason to part with its money, even if in some cases that reason might be vanity. µ

 
