Just when we thought we were safe, ChatGPT comes for our graphics cards

Everyone seems to be talking about ChatGPT these days thanks to Microsoft Bing, but given the nature of large language models (LLMs), a gamer would be forgiven for feeling a certain déjà vu.

You see, even though LLMs run on massive cloud servers, they rely on dedicated GPUs for the training they need before they can run at all. Training typically means feeding an obscene amount of data through neural networks on arrays of GPUs equipped with specialized tensor cores, and doing that at scale demands not only a great deal of power but also an enormous number of physical GPUs.
