Chipping Off

Consider these two products for a moment. Lenovo sells a netbook, the IdeaPad S12, for $649 in the US. The PC maker also sells a 15.6-inch laptop, the Lenovo G550, for $559. We call the first one a netbook because it uses the Atom processor. Some top-end netbooks now cost more than many powerful entry-level laptops, something that was impossible even a few months ago. If the netbook keeps evolving like this, where will computing be in a few years?

A year and a half after launching Atom, Intel is preparing the next generation of the chip, the Atom N450. Though no more powerful than its predecessors, the new Atom is smaller and more efficient: it moves the graphics and memory controller onto the processor itself, shrinking the platform from three chips to two. Major computer makers will launch netbooks based on the N450 early next year, and Intel plans more powerful versions of Atom later in the year. With this, the netbook will certainly become a little more powerful, cheaper and more energy-efficient.

Meanwhile, interesting developments are under way elsewhere in computing. One significant trend is the growing power of the graphics processing unit (GPU), a specialised processor for rendering graphics. GPU performance is improving by leaps and bounds, and the chips remain inexpensive. By pairing a low-power CPU with a strong GPU, it is possible to deliver cutting-edge performance for video, the killer application for most users, at a lower cost. A netbook that combines a low-power CPU with a high-performance GPU can play video flawlessly. This combination is so disruptive that it has strained the relationship between Intel and graphics technology giant Nvidia.
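To make that division of labour concrete, here is a rough sketch, not anything from Intel or Nvidia, that assumes the widely used ffmpeg tool is installed and a file named clip.mp4 is at hand: it asks the system for a hardware (GPU) decoder, leaving the low-power CPU with little to do while the clip is decoded.

# Illustrative only: decode a clip with hardware acceleration where available.
# "-hwaccel auto" lets ffmpeg pick a hardware decoder; "-f null -" discards the
# output, so the run simply exercises decoding.
import subprocess

subprocess.run(
    ["ffmpeg", "-hwaccel", "auto", "-i", "clip.mp4", "-f", "null", "-"],
    check=True,
)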

Nvidia's Ion Platform has delivered excellent performance alongside Atom processors, but how will things play out from here? How, for instance, would Nvidia's Ion 2 Platform pair with Intel's new series of Atom chips, in which the graphics function is integrated into the CPU? When will Intel launch its own graphics chip, the much-awaited Larrabee? And what will these developments mean for the new breed of ARM-based chips designed for low-power applications?

The development of the GPU, along with the popularity of the netbook, will probably take the PC world away from the all-powerful microprocessor towards a combination of a cheap but energy-efficient CPU and specialised processors such as the GPU. Intel, however, is pushing the old model in its own way. Next year, it will launch chips with up to eight cores, which should let users do things that other combinations cannot. After all, video is not the only killer application around. A GPU cannot easily be programmed, for example, to do the number crunching that a financial company needs; a powerful GPU is of little use to a programmer or a scientist beyond rendering graphics. Will users find new ways of using these many-core chips?
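The kind of workload the column has in mind is easy to sketch. The short example below, purely illustrative and with made-up figures, spreads a toy Monte Carlo calculation, the sort of number crunching a financial firm might run, across every available core using Python's standard library; this is roughly how software would have to change to exploit eight or more cores.

# Illustrative sketch: CPU-bound number crunching split across all cores.
import random
from multiprocessing import Pool, cpu_count

def simulate(trials: int) -> float:
    # Toy Monte Carlo: average payoff of max(price - 100, 0) for a random price.
    rng = random.Random()
    return sum(max(rng.gauss(100, 20) - 100, 0) for _ in range(trials)) / trials

if __name__ == "__main__":
    cores = cpu_count()
    with Pool(cores) as pool:
        parts = pool.map(simulate, [200_000] * cores)
    print(f"{cores} cores, estimated value: {sum(parts) / len(parts):.2f}")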

Traditional computing could still prove useful in many new ways. Last week, Intel showcased a concept chip with 48 cores, saying it still needs to work out how those cores will interact with one another. A chip of this nature could change the PC into something else, a semi-intelligent machine, perhaps. But for now, cheap CPUs and specialised processors together might cause some real disruption in the industry.
(This story was published in Businessworld Issue Dated 21-12-2009)