Here’s a riddle: when you think of a computer with dual blazing-fast NVIDIA cards, a Xeon processor, and 16GB of RAM, what type of PC comes to mind? Maybe a high-end Alienware system? Believe it or not, that description matches many of the systems powering today’s medical diagnostic devices, such as CT scanners and ultrasound machines. In an interesting twist, the same GPU technology that renders high-resolution, imaginary destruction has found its way into the doctor’s office, where it is frequently used to save lives rather than virtually vaporize them.
Ten years ago, GPU manufacturers and their technologies were in their infancy. Even then, they knew the exploding game market held great promise for their products because of its need for complex graphical data processing.
It turns out that graphics problems are best solved by highly parallelized algorithms, and unfortunately for chip makers Intel and AMD, CPUs excel at solving problems serially, one step at a time, rather than in parallel.
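To make that contrast concrete, here is a minimal sketch in CUDA (the values and function names are hypothetical, not from the original article): the CPU version brightens an image one pixel at a time in a loop, while the GPU version launches one lightweight thread per pixel so the whole frame is processed at essentially the same time.

```cuda
#include <cuda_runtime.h>
#include <cstdio>

// CPU approach: a serial loop, one pixel after another.
void brighten_cpu(unsigned char *pixels, int n, int amount) {
    for (int i = 0; i < n; ++i) {
        int v = pixels[i] + amount;
        pixels[i] = v > 255 ? 255 : v;
    }
}

// GPU approach: one thread per pixel, all running in parallel.
__global__ void brighten_gpu(unsigned char *pixels, int n, int amount) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        int v = pixels[i] + amount;
        pixels[i] = v > 255 ? 255 : v;
    }
}

int main() {
    const int n = 1920 * 1080;          // a 1080p grayscale frame (hypothetical test data)
    unsigned char *d_pixels;
    cudaMalloc(&d_pixels, n);
    cudaMemset(d_pixels, 100, n);       // fill with mid-gray as a stand-in image

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    brighten_gpu<<<blocks, threads>>>(d_pixels, n, 40);
    cudaDeviceSynchronize();

    printf("Brightened %d pixels on the GPU\n", n);
    cudaFree(d_pixels);
    return 0;
}
```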
Over the past decade, graphics-hardware vendors consolidated and developed standards like DirectX. The game industry poured effort into improving the visual quality of games and quickly made the dedicated GPU a necessity for hardcore gamers.
An intense rivalry began between NVIDIA and ATI, and the battle between the two designers became one of the most watched in the PC industry. The competition brought incredible innovation to the GPU market, and all of that new technology did not go unnoticed by researchers in other fields, such as medical devices and biotechnology.
As game developers were pushing the limits of commercial GPUs, researchers from other domains began to realize that they could use these specialized processors for non-gaming tasks that resembled graphics problems.
By translating the appropriate algorithms into graphics problems, a scientist could use a commercial GPU to cut the required compute time by orders of magnitude. At the same time, researchers in some fields were already building proprietary processors and hardware to efficiently solve their most valuable parallel problems, but the cost of such custom approaches made parallel processing an expensive endeavor, out of reach for most industries.
NVIDIA and ATI recognized this opportunity, and general-purpose parallel computing was born. They retrofitted their GPUs with more general-purpose instruction sets, and suddenly the research community could use mass-produced, far-less-expensive hardware to attack parallel problems. Companies that had been building proprietary solutions took notice and quickly moved to the more cost-effective (though less specialized) general-purpose parallel-computing products.
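As an illustration of what “general purpose” means here, the canonical first GPGPU program is not a graphics task at all, just plain arithmetic on large arrays. The sketch below (assuming a CUDA-capable card and the standard CUDA runtime; the sizes and names are illustrative) adds two vectors, with one GPU thread handling each element.

```cuda
#include <cuda_runtime.h>
#include <cstdio>

// General-purpose work on a GPU: no triangles or textures,
// just one thread per array element doing arithmetic.
__global__ void vector_add(const float *a, const float *b, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                      // about a million elements
    size_t bytes = n * sizeof(float);

    float *a, *b, *out;                         // unified memory: visible to both CPU and GPU
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&out, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(a, b, out, n);
    cudaDeviceSynchronize();

    printf("out[0] = %.1f (expected 3.0)\n", out[0]);
    cudaFree(a); cudaFree(b); cudaFree(out);
    return 0;
}
```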
Today, the availability and relatively low cost of GPU technology mean that software developers in many industries can crunch data sets that were too large to handle before parallel processing went mainstream. GPUs are helping to detect cancer cells in your body, create 3D diagnostic models of your anatomy, and even monitor the quality of the industrial processes that produce synthetic insulin. Hopefully you feel a little better knowing that the money you spent on that expensive gaming PC ten years ago helped fund the technology that is now improving the quality of life for millions.
Are there any technologies that are being applied in a way that surprises you? If so, let’s continue the discussion in the comments.
Josh Neland is a Technology Strategist at Dell, and you can follow him on Twitter at https://twitter.com/joshneland