
Despite skepticism about AI's future, tech companies predict AI will continue to explode


MMD Creative // Shutterstock



“During a gold rush, sell shovels.”

That saying—popular today among startups and investors—originated with the California gold rush of the mid-1800s, during which the people who made the most money were the suppliers selling shovels and picks, not the gold-panners themselves. Today, it refers to profiting off the supplies, technology, or infrastructure needed by the millions of people trying to access or leverage a particular asset.

ChatGPT set off a new sort of gold rush—the artificial intelligence boom—in November 2022 when OpenAI launched the technology. Tech giants (the gold-panners in this metaphor) have since poured billions (and counting) into building AI systems for the masses while scrambling to be the first to add AI features to their products. Microsoft rolled out its “Copilot” system in 2023 for Windows users, while Apple and Google announced transcription and image-generation features for their phones in the last year.

To get a sense of how big the AI market will grow in the near future, Verbit analyzed financial research and recent data on trends around the technology. Skeptics note those racing to build AI systems have struggled to profit from their ventures, largely due to the expense of building and running the tech. But, for the “shovel-sellers” such as Nvidia, selling the computer chips required by tech companies to train AI models has proven wildly lucrative.

Since the debut of ChatGPT less than two years ago, shares in Nvidia have risen by more than 600%. In June, the chip manufacturer became the world’s most valuable company. Its rapid ascent—despite a slip in its shares since June—offers valuable insights into how companies and investors view the future of AI.



Verbit

Tech companies have a big gap to fill

A bar chart showing that tech companies will have to earn $600 billion a year in revenue to justify their hardware investments.

By the fourth quarter of 2024, tech companies will be spending $150 billion a year on Nvidia’s data center chips, according to an analysis published in June by David Cahn, a partner at venture capital firm Sequoia Capital. Once operating costs such as electricity and salaries are factored in, Sequoia estimates, companies will need to generate $600 billion annually to break even on their AI investments, Cahn said.

Sequoia further predicts tech companies will make a combined $100 billion a year in the near term, Cahn said—a shortfall of about $500 billion. To close that gap, tech companies must build both smarter AI systems and the physical infrastructure to run them.
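The gap Sequoia describes is simple back-of-the-envelope arithmetic. A minimal sketch, using only the figures cited above (the variable names are illustrative, not from Cahn's analysis):

```python
# Sequoia's AI revenue-gap arithmetic, using the figures cited above
# (all values in billions of dollars per year).
chip_spend = 150         # projected annual spend on Nvidia data center chips, Q4 2024
revenue_needed = 600     # annual revenue Sequoia estimates is needed to break even
revenue_predicted = 100  # combined annual AI revenue Sequoia predicts in the near term

shortfall = revenue_needed - revenue_predicted
print(f"Annual shortfall: ${shortfall} billion")  # $500 billion, as reported
```

Note that the $600 billion break-even figure already folds in operating costs on top of the $150 billion in chip spending, which is why the needed revenue is several times the hardware outlay.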

Current AI systems do a poor job of reasoning. While chatbots such as ChatGPT contain a wealth of knowledge and write grammatically sound sentences, they cannot take on complex projects alone. Meanwhile, training and deploying AI models requires a massive amount of electricity. Data centers may account for as much as 9% of U.S. power demand by 2030, according to a report published in May by the Electric Power Research Institute—up from 3% in 2022.



Gorodenkoff // Shutterstock

Still early days


Despite the hype, Census Bureau surveys show that just 5% of American companies say they are using AI to produce goods or services. Moreover, about 95% of companies using AI say that the technology has not changed their total employment levels.

While AI has dominated headlines over the last two years, people and companies using the technology regularly today are, in fact, early adopters. Only 23% of American adults have used ChatGPT, according to a survey conducted by the Pew Research Center in February.

Nvidia is set to ship its Blackwell Ultra chips later this year; the new chips promise 2.5 times the performance of their predecessors at only a 25% increase in cost. In June, the company also announced its Rubin AI platform, which will use HBM4, the as-yet-unreleased next iteration of high-bandwidth memory.

Even AI pessimists must concede that the gold rush has at least one more cycle to run.

Story editing by Nicole Caldwell. Additional editing by Kelly Glass. Copy editing by Kristen Wegrzyn.

This story originally appeared on Verbit and was produced and distributed in partnership with Stacker Studio.

