Nvidia, the graphics chipmaker, has introduced a range of products that it says could usher in an era of personal supercomputing.
Its new Tesla brand could also shake up an industry currently dependent on central processing units, or CPUs, to carry out computing tasks.
Intel is the biggest maker of such microprocessors, but Nvidia believes it can make inroads into the market with its graphics processing units or GPUs.
GPUs have traditionally sat on graphics cards, boards that slot into PCs to drive computer displays and power visuals such as those in 3-D video games.
But the way they render images, by processing many tasks in parallel, lends itself to supercomputing applications.
Nvidia’s Tesla line will help scientists carry out massive calculations and allow oil company technicians to carry out complex geophysical and seismic analysis.
The products include a GPU computing server, which resembles the kind of “blade” rack server commonly seen in data centres but contains four GPUs rather than CPUs and costs around $12,000. There is also a deskside supercomputer containing two GPUs and costing $7,500.
Both will connect to normal PC servers or workstations to supplement computing power and supercharge performance.
Nvidia said the new products could speed up an application such as atmospheric cloud simulation by as much as 50 times, while a seismic database application could run up to 100 times faster.
Microprocessor makers such as Intel, AMD, IBM and Sun have tried to satisfy the needs of users wanting supercomputing power by increasing the number of cores, or brains, on their chips.
Nvidia’s solution represents another option. “If you think of how the CPU and GPU processing works, it’s like a book, where a CPU would start at page one and go through it processing it in that way,” says Andy Keane, Nvidia general manager.
“A GPU will tear the book into a thousand pieces and process them all in parallel.” Mr Keane said he hoped GPU servers, which had not existed until now, would soon become a common sight in data centres.
“People are looking to see what can drive the data centre faster, and we are positioning the GPU to do that.”
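Mr Keane’s book analogy can be sketched in ordinary Python. This is purely an illustration of the two processing styles, not real GPU code (programming Nvidia’s chips uses its own toolkit), and all the function names here are invented for the example:

```python
from concurrent.futures import ThreadPoolExecutor

def process_page(page: str) -> int:
    # Stand-in for real per-page work; here we just count words.
    return len(page.split())

def cpu_style(book: list[str]) -> int:
    # CPU model: start at page one and work through the book in order.
    total = 0
    for page in book:
        total += process_page(page)
    return total

def gpu_style(book: list[str], workers: int = 8) -> int:
    # GPU model: "tear the book into pieces" and hand every page
    # out to a pool of workers that process them in parallel.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(process_page, book))

# A 100-"page" book where page n holds n words.
book = ["word " * n for n in range(1, 101)]
assert cpu_style(book) == gpu_style(book)  # same result either way
```

Both routes produce the same answer; the difference, as with CPUs and GPUs, is that the parallel version can put many workers on the job at once, which is where the claimed speed-ups come from.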