One year ago, when Sam Altman compared OpenAI to the actual Roman Empire, he was not kidding. In the same way that the Romans gradually amassed an empire of land spanning three continents and one-ninth of the Earth's circumference, the CEO and his cohort are now dotting the planet with their own latifundia: not agricultural estates, but AI data centers.
Altman, Nvidia CEO Jensen Huang, Microsoft CEO Satya Nadella, and Oracle founder Larry Ellison are among the tech executives who see these warehouses, stocked full of IT infrastructure, as the future of the American economy (and perhaps the global one). Of course, data centers are not new. Computing began with giant mainframes, which drank a lot of power and were housed in temperature-controlled rooms; coaxial cable carried information between the mainframe and terminal computers. After the consumer internet boom of the 1990s, a new generation of infrastructure was born: the area around Washington, DC was flooded with massive buildings containing rack after rack of computers that stored and processed information for the tech industry.
A decade later, "the cloud" turned the internet into a squishy, pliable infrastructure. Storage got cheaper, and some companies, like Amazon, capitalized on it. Instead of on-premises servers and rented racks, companies began using virtualized environments to meet their computing needs. ("What is the cloud," a perfectly intelligent member of my family asked me in the mid-2010s, "and why am I paying for 17 different subscriptions to it?")
Meanwhile, the tech industry was accumulating petabytes upon petabytes of data from people who shared it online, through enterprise workspaces, and via mobile apps. Companies found new ways to mine and structure this data. "Big Data," they promised, would transform lives. In a number of ways, it did. It was also a hint of where the story was heading.
The tech industry has now entered the feverish days of generative AI, which demands new computing resources. Big Data is tired; big data centers, wired for AI, are here. To power them, chipmakers like Nvidia and AMD are racing to build faster and more efficient chips, jumping on their couches to announce how much they love AI. In the US, capital investment in AI infrastructure has reached unprecedented levels. The deals are so massive and so swirling that, to the rest of us struggling to keep track of the actual contracts and dollar amounts, they can look like cocktail-party handshakes lubricated with gigawatts.
OpenAI has struck deals with Microsoft, Nvidia, and Oracle. Stargate, a supercomputing venture between OpenAI and Microsoft, became the foundation for US AI infrastructure. President Donald Trump, joined by Altman, Ellison, and others, called the project "the biggest AI infrastructure" effort in history. (This may or may not be hyperbole.) Altman, Ellison, and SoftBank CEO Masayoshi Son committed fully to the deal as Stargate's investors, pledging $100 billion to start, with plans to invest up to $500 billion in Stargate over the next few years; the buildout will run on Nvidia GPUs. Later, in July, OpenAI and Oracle announced an additional Stargate partnership (SoftBank curiously absent) measured in gigawatts of capacity (4.5) and expected job creation (around 100,000).

