ENTERPRISE TECHNOLOGY GIANT IBM will show off how it's injecting GPU technology into big data analytics to help solve "some of the biggest enterprise IT challenges" in a demo at the GPU Technology Conference (GTC) this week.
According to chip maker Nvidia, which is hosting the conference in San Jose starting Monday, IBM will show off a GPU-accelerated machine for data clustering using the open-source software frameworks Hadoop and Mahout.
Nvidia product management executive Sumit Gupta wrote about IBM's attendance at GTC, claiming that the technology can allow retailers, entertainment websites and internet companies to make far more accurate recommendations for new products and services.
"Such technology opens the door for all types of enterprise companies [to] optimise their customers' experiences allowing them to mine, and make better use of, the vast amounts of data they collect on a daily basis," Gupta wrote. "But, they need to get over the big data hurdle first, which is where IBM - and GPUs - come into play."
Gupta explained that a computational technique called segmentation, or clustering, identifies non-obvious patterns in data by analysing hundreds of different dimensions. Retailers, for example, can use it to group their customers into segments with similar behaviour, allowing them to create customised products and target marketing programmes more effectively.
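The article doesn't say which algorithm IBM's demo uses, but the clustering Gupta describes is the sort of job Mahout's k-means implementation handles at Hadoop scale. As a rough illustration only, here is a minimal pure-Python k-means sketch on hypothetical customer records (visits per month, average spend); the data, field names and two-segment split are invented for the example:

```python
import math
import random

def kmeans(points, k, iters=20, seed=42):
    """Cluster 2-D points into k segments by iterative centroid refinement."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its cluster
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical customers: (visits per month, average spend in pounds)
rng = random.Random(0)
frequent = [(8 + rng.random(), 60 + rng.random() * 10) for _ in range(15)]
occasional = [(1 + rng.random(), 15 + rng.random() * 5) for _ in range(15)]
customers = frequent + occasional

centroids, clusters = kmeans(customers, k=2)
```

In production the same assign-and-update loop is distributed: each Hadoop map task assigns its shard of points to the nearest centroid, and a reduce step recomputes the centroids, which is where GPU acceleration of the distance computations pays off.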
"IBM is demonstrating the use of GPU accelerators on a distributed computing system, required for such an enormous data set, for clustering using Hadoop," Gupta added. "With GPU accelerators working alongside IBM Power CPUs, the demo runs eight times faster than with a Power system without GPUs."
According to Nvidia, GPU acceleration shortens the time to insight and makes it feasible to run many more scenarios and perform more intelligent analytics that would otherwise be too expensive to obtain.
In February, scientists at IBM Labs claimed to have broken a speed record for big data, which Big Blue said could help boost internet speeds to up to 400Gbps using "extremely low power". The scientists achieved the speed record using a prototype device presented at the International Solid-State Circuits Conference in San Francisco.
Apparently the device, which employs analogue-to-digital conversion (ADC) technology, could be used to speed up big data transfers between clouds and data centres to four times the rate of existing technology.
IBM said its device is fast enough that 160GB - the equivalent of a two-hour 4K ultra-high definition (UHD) movie or 40,000 music tracks - could be downloaded in a few seconds. µ
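IBM's claim is easy to sanity-check: 160GB is 1,280 gigabits, and at the quoted 400Gbps line rate that is 3.2 seconds, consistent with "a few seconds" (ignoring protocol overhead, which the article doesn't quantify):

```python
# Back-of-the-envelope check of IBM's download claim
size_gb = 160        # two-hour 4K UHD movie, per IBM
link_gbps = 400      # claimed link speed
seconds = size_gb * 8 / link_gbps  # bytes -> bits, divided by line rate
print(seconds)  # 3.2
```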