The CPU has long been the standard chip for most computing tasks, including analytics. However, servers built with specialized chips can now offload graphics and other math-heavy processing to a dedicated GPU, improving overall performance.
The limitations of the general-purpose microprocessor have given rise to specialized chips such as the GPU, the DPU and the FPU – sometimes called a math coprocessor – which handles floating-point math. Such units free up the CPU to focus on more generalized processing tasks.
GPU for data analysis
GPUs have long been limited to graphics tasks, according to Greg Schulz, an independent computer analyst; the current interest in using the GPU for other types of processing is relatively new. Graphics processing involves math-intensive workloads, and the GPU's ability to handle math-heavy tasks makes it attractive for a variety of other purposes. For example, rendering a 3D image requires matrix multiplication – a kind of calculation that is also central to deep learning and analytics. However, these advanced GPU capabilities aren't suitable for simply querying data in a database or data warehouse, according to Mike Gualtieri, principal analyst at Forrester Research, a Cambridge, Mass.-based research and consulting firm.
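To make the connection concrete, the sketch below shows that a 3D graphics transform and a neural-network layer are the same underlying operation: a matrix multiplication. This is an illustrative CPU example using NumPy; on a GPU, a drop-in array library such as CuPy would run the same calls in parallel. The shapes and values are invented for illustration.

```python
import numpy as np

# Graphics: rotate a 3D point 90 degrees around the z-axis.
# Homogeneous coordinates are used, so the matrix is 4x4.
rotation = np.array([
    [0.0, -1.0, 0.0, 0.0],
    [1.0,  0.0, 0.0, 0.0],
    [0.0,  0.0, 1.0, 0.0],
    [0.0,  0.0, 0.0, 1.0],
])
point = np.array([1.0, 0.0, 0.0, 1.0])
rotated = rotation @ point          # the x-axis point ends up on the y-axis

# Deep learning: a dense layer's forward pass is the same operation
# at a larger scale -- one matrix multiplication per batch.
batch = np.random.rand(64, 128)     # 64 inputs, 128 features each
weights = np.random.rand(128, 32)   # layer with 32 output units
activations = batch @ weights
```

Because both workloads reduce to the same operation, hardware built to multiply matrices quickly for graphics accelerates deep learning essentially for free.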
Vendors such as Nvidia aim to use GPUs to dramatically speed up the training of deep learning algorithms in particular. In addition to training, GPUs also speed up tasks such as inference, searching through image databases, and natural language processing. As GPUs become more common, they also become a more cost-effective way to handle these tasks.
“GPUs allow data scientists to spend more time focusing on value-added tasks and experiments and [deal with] less frustration with poorly performing systems and tools,” said Mathias Golombek, CTO at Exasol, a high-throughput database company based in Nuremberg, Germany.
On the other hand, not all tasks are suitable for GPUs. Much of the GPU's popularity comes from its ability to offload certain intensive tasks from the CPU, but CPUs remain better suited for some data analysis work. For example, SQL analytics queries against a large data set are typically handled as in-memory processing on a CPU. The best bet for data analysis is to use both CPUs and GPUs.
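The kind of CPU-bound workload described above can be sketched with Python's built-in `sqlite3` module, which runs an in-memory SQL aggregation entirely on the CPU. The table and column names here are invented for illustration.

```python
import sqlite3

# A typical analytics query: group rows and sum a measure.
# SQLite executes this in memory on the CPU -- no GPU involved.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("east", 50.0), ("west", 75.0)],
)
totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"
).fetchall())
# totals maps each region to its summed amount
```

Query engines like this lean on branching, indexing and memory access patterns that favor the CPU, which is why offloading them to a GPU rarely pays off.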
GPU vs. CPU: How They Stack Up
When it comes to data analysis, GPUs can handle many operations at once thanks to their massive parallelism. CPUs, however, are more versatile in the tasks they can perform, as the GPU's applicability to general data processing is narrower.
Instead of choosing between CPUs and GPUs for data analysis, companies should consider whether they can use GPUs as an accelerator to achieve higher performance across the board. For example, GPUs can speed up the development, training, and refinement of data science models because model training is easy to parallelize and therefore well suited to a GPU. Offloading it also spares the CPU from heavy, complex training workloads.
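Why does model training parallelize so well? For many models, the gradient over a batch is just the average of per-example gradients, so shards of the batch can be processed independently (across GPU cores or multiple GPUs) and combined afterward. The CPU sketch below illustrates the principle with NumPy on a simple linear model; all names and data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))       # 100 training examples, 3 features
y = X @ np.array([1.0, -2.0, 0.5])  # targets from a known linear model
w = np.zeros(3)                     # parameters being trained

def gradient(Xs, ys, w):
    """Mean-squared-error gradient of a linear model on one shard."""
    return 2 * Xs.T @ (Xs @ w - ys) / len(ys)

# Full-batch gradient vs. the average of four independent shard gradients:
full = gradient(X, y, w)
shards = [gradient(Xs, ys, w)
          for Xs, ys in zip(np.split(X, 4), np.split(y, 4))]
combined = np.mean(shards, axis=0)
# combined matches full -- shards never need to talk to each other mid-step
```

Because the shards are independent until the final averaging, thousands of GPU cores can chew through them simultaneously, which is exactly the structure GPUs are built for.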
Organizations can also more easily test and experiment with GPUs as major cloud providers increasingly offer GPU services. AWS, Microsoft Azure, and Google Cloud Platform all offer GPU instances, typically for AI workloads. The specialized focus of GPUs and their growing popularity with major vendors could spawn another generation of chips for even more specialized analytics and machine learning tasks. For example, Google already has its own proprietary Tensor Processing Unit (TPU) for such work.