
ChatGPT Users Influx May Lead To Chip Shortage Again

Gamers have only just recovered from the graphics card price shock triggered by the cryptocurrency boom, and the next problem is already looming: the AI service ChatGPT is reported to require a large number of additional GPUs owing to its enormous success.

As of January this year, the tool reportedly already had 100 million monthly users. Now Microsoft is also betting on the technology and integrating it into various services. Ultimately, this makes it necessary to support the high demand with ever more powerful hardware. According to the market research firm TrendForce, operator OpenAI will have to invest around $300 million to do so.

Most of that money would flow directly to Nvidia, whose GPUs are the primary means of meeting the high demand for computing power. TrendForce expects around 30,000 A100 GPU modules to be needed. Each lists for well over $10,000, but order volumes of this size will almost certainly command a discount, which is how the market researchers arrive at the roughly $300 million figure.
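The TrendForce estimate can be checked with simple back-of-the-envelope arithmetic. The ~30,000-unit figure comes from the report; the list price and discount rate below are illustrative assumptions (consistent with "well over $10,000" per module), not reported numbers:

```python
# Back-of-the-envelope check of the TrendForce estimate.
# GPU_COUNT is from the report; LIST_PRICE_USD and VOLUME_DISCOUNT
# are illustrative assumptions, not figures from TrendForce.

GPU_COUNT = 30_000          # A100 modules TrendForce expects OpenAI to need
LIST_PRICE_USD = 12_500     # assumed list price ("well over $10,000")
VOLUME_DISCOUNT = 0.20      # assumed bulk discount at this order size

effective_price = LIST_PRICE_USD * (1 - VOLUME_DISCOUNT)
total_cost = GPU_COUNT * effective_price

print(f"Effective unit price: ${effective_price:,.0f}")
print(f"Estimated total: ${total_cost / 1e6:.0f} million")
```

With these assumed numbers the effective unit price works out to about $10,000, and the total lands on the ~$300 million the market researchers cite.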

Hopes for Azure

True, gamers do not compete directly with buyers of A100 modules. But such a demand spike should still have an effect: an order of this size would tie up production capacity at Nvidia and consume the still-scarce secondary components that are used on all graphics cards.

If Nvidia prioritizes putting components into GPUs destined for ChatGPT, it could affect the availability of other graphics cards, the market researchers said. And even without outright delivery bottlenecks, an effect on prices would be likely. However, the partnership with Microsoft also opens up the possibility of shifting a good part of ChatGPT's computations to the Azure cloud, which should at least spread the hardware demand over time.