Nvidia, you see, sells a kind of chip known as a graphics processing unit, or GPU. These chips have long been useful for, well, processing computer graphics, which is why if you search Nvidia’s name, you’ll find a lot of gaming chat boards. But they have also become useful for solving other computing problems, including those at the core of ChatGPT and other AI systems.
Nvidia’s data center business is its biggest division in terms of sales, and, with AI booming, market watchers headed into Wednesday confident that it would power a bumper second quarter. Bloomberg reported that analysts were forecasting quarterly revenue of $11 billion, and earnings per share (EPS) of $2.07, up from $6.7 billion revenue and just 26 cents a share a year ago.
Nvidia blew past those expectations by almost 25 percent. Actual revenue was $13.5 billion, and EPS came in at $2.70, excluding certain items. And even better numbers may be yet to come; the company forecast that revenue will rise to $16 billion next quarter.
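The size of the beat is easy to verify from the figures quoted above. A quick back-of-the-envelope check (a sketch, using only the forecast and actual numbers reported here) shows revenue came in roughly 23 percent above the analyst consensus, and EPS about 30 percent above it:

```python
# Back-of-the-envelope check on how far Nvidia's results beat forecasts,
# using the figures quoted in the article (revenue in billions of dollars).
forecast_revenue, actual_revenue = 11.0, 13.5
forecast_eps, actual_eps = 2.07, 2.70

# Beat expressed as a percentage above the forecast.
revenue_beat = (actual_revenue / forecast_revenue - 1) * 100
eps_beat = (actual_eps / forecast_eps - 1) * 100

print(f"Revenue beat: {revenue_beat:.1f}%")  # about 22.7%
print(f"EPS beat: {eps_beat:.1f}%")          # about 30.4%
```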
Obviously, this tells you something about demand from the chip-hungry AI market. But it also points to one of the key constraints on AI’s growth. AI platforms are monumental programming achievements, but running them takes monumental resources. Chips are one of those resources, and the reason that Nvidia is doing so well is that demand is outpacing supply.
That bottleneck is very profitable for Nvidia, but it is making things harder for the many start-ups that would like to launch AI-powered services … if only they could get enough computing power. Nvidia’s chief financial officer, Colette Kress, said on Wednesday’s earnings call that “we expect supply to increase each quarter through next year,” but in the near term, those constraints are going to limit humanity’s ability to exploit AI’s possibilities (or if you prefer, to drive humanity ever closer to the brink of extermination by our robot overlords).
Over the longer term, of course, such problems will be solved, one way or another. If AI turns out to be less powerful than optimists hope, and pessimists fear, then demand will dry up, and Nvidia will have to retrench. More likely, however, is that after a period of explosive growth and supply-chain bottlenecks, demand will become more predictable, and Nvidia and other companies will add enough new capacity to meet it.
But that’s when another constraint will really start to bite, because just as AI runs on chips, chips run on power. According to the International Energy Agency, data centers and data transmission each account for about 1 to 1.5 percent of the world’s electricity consumption. That will grow rapidly along with AI. Bloomberg recently calculated that AI processing alone at one company, Google, burns as much electricity per year as all the homes in Atlanta. That’s a staggering figure for just one of the many companies working on AI, and we are only in the very earliest stages of seeing what AI can do.
The private market will solve the chip problem by itself. But matching electricity supply to demand could be a far larger project, requiring not just new generation facilities, but large-scale cooperation among utilities, regulatory commissions, and state and local governments to build the power generation capacity and grid connections that AI will need to reach its full potential.
That challenge will be made even more monumental by the fact that we would probably prefer to provide that power without further warming the planet. And while renewables can undoubtedly do some of that job, we also urgently need to revisit the barriers to nuclear, which can provide steady, emissions-free power at massive scale.
Of course, if you fear that our AI-driven future is a bad one, it might be tempting to ignore the power problem, hoping that the physical constraints will keep the virtual machines from taking over. But if AI models prove as useful as they currently look, benign neglect will quickly turn malevolent, as silicon processors compete for electricity with flesh-and-blood humans who would like to run the air conditioning and the fridge while they chat with their bots. Far better to start planning now to give the chips what they need, so that they can give us what we want.