The global economy is undergoing one of the largest infrastructural shifts since the Industrial Revolution, driven by the build-out of a worldwide artificial intelligence (AI) infrastructure. The global economic architecture has entered a qualitatively new phase, the defining feature of which is the transition from a predominantly software-based model of the digital economy to a capital-intensive, infrastructure-heavy model of AI.
While in previous decades software products and intellectual property were the main drivers of value growth for technology companies, physical computing infrastructure is now becoming the foundation of economic dynamics: the construction of large data centers, the procurement of specialized graphics processing units (GPUs), the deployment of cooling systems, and the development of power grids capable of supporting high-performance computing clusters.
In effect, a new technological ecosystem is taking shape, where the physical provisioning of computing power is acquiring strategic importance comparable to the role of transportation infrastructure or energy systems during the industrial era.
Statistics on the distribution of data centers by country provide insight into the actual configuration of the global AI infrastructure. As of early 2026, there are approximately 10,800 large data centers worldwide, and their geographic distribution is highly uneven. The United States remains the clear leader, hosting about 3,960 centers, or approximately 37% of the global total. In other words, nearly four out of every ten data centers are located in the US, giving the country a significant technological advantage in cloud computing and the development of AI systems.
Significantly smaller, yet notable infrastructural clusters have been established in the United Kingdom (498), Germany (470), China (365), and France (335). Other major players include Japan (249), Australia (268), India (275), and Brazil (198). Notably, despite active state support, China currently lags behind several European countries in terms of the number of data centers.
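For readers who want to check the shares cited above, the following minimal Python sketch simply recomputes each country's portion of the global total from the rounded counts quoted in this section; it uses no outside data.

```python
# Approximate data-center counts quoted in the text (early 2026, rounded).
counts = {
    "United States": 3960,
    "United Kingdom": 498,
    "Germany": 470,
    "China": 365,
    "France": 335,
    "India": 275,
    "Australia": 268,
    "Japan": 249,
    "Brazil": 198,
}
world_total = 10_800  # approximate global total cited in the text

for country, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"{country:<15} {n:>5}  {n / world_total:6.1%} of global total")

# The US share works out to 3,960 / 10,800 ≈ 36.7%, consistent with the
# "approximately 37%" figure given above.
```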
The geographical structure of this infrastructure indicates a high concentration of computing power in the developed economies of North America, Europe, and East Asia, while a significant portion of countries in Asia, Africa, and Latin America possesses only limited capabilities for hosting such facilities.
This asymmetry creates a persistent gap in access to computing resources and increases the dependence of many nations on the infrastructure of leading technological hubs.
The scale of investment in AI infrastructure has already reached levels comparable to the largest industrial projects in history. Estimates of the aggregate capital expenditures of the four largest US tech corporations (Microsoft, Alphabet, Meta, and Amazon) range from $370 billion to $700 billion by the end of 2025. If current growth rates persist, their investments could exceed 2% of US GDP.
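For illustration of the 2%-of-GDP comparison, the short sketch below converts the capex range quoted above into a share of GDP; the GDP figure (roughly $29 trillion for 2025) is an outside assumption used only for scale, not a number from this section.

```python
# Assumption (not from the text): US nominal GDP of roughly $29 trillion in 2025.
US_GDP = 29e12

# Capex range for Microsoft, Alphabet, Meta and Amazon quoted in the text.
for capex in (370e9, 700e9):
    print(f"capex ${capex / 1e9:,.0f}B -> {capex / US_GDP:.1%} of GDP")

# 370e9 / 29e12 ≈ 1.3%, while 700e9 / 29e12 ≈ 2.4%; the upper end of the
# estimate would indeed exceed the 2%-of-GDP threshold mentioned above.
```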
For historical comparison, a similar scale of capital investment was observed during the construction of the US national railway network in the 19th century, which laid the foundation for the country’s industrial leadership for decades to come. Unlike those projects, however, the current technological infrastructure is being built primarily by private capital and with a significantly higher degree of uncertainty regarding future economic returns.
At the same time, concerns are increasingly voiced within the expert community regarding the possible formation of an infrastructural bubble in the computing power market. Major tech companies are building massive data centers and purchasing millions of specialized processors based on expectations that new generations of AI models will drive a sharp increase in labor productivity and create fundamentally new markets for digital services.
However, the actual scale of commercial demand for such capacities remains uncertain. If the anticipated economic impact of AI adoption falls short of projections, a significant portion of the built infrastructure may end up underutilized. Unlike traditional infrastructure assets, such as transportation systems or energy grids, data center equipment has a relatively short technological lifecycle. Specialized chips become obsolete within just three to four years, requiring constant upgrades and additional investments.
Consequently, the risk of accumulating unneeded computing power is compounded by the rapid rate of technological obsolescence of the equipment, making the current industry development model particularly sensitive to demand fluctuations.
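To make the obsolescence point concrete, here is a minimal back-of-envelope sketch of annualized replacement cost under straight-line assumptions; the fleet size and unit price are purely hypothetical illustration values, and only the three-to-four-year lifecycle comes from the text.

```python
# Hypothetical figures for illustration only: neither the fleet size nor the
# unit price comes from the text; only the 3-4 year lifecycle is cited above.
fleet_size = 500_000      # accelerators in a hypothetical large deployment
unit_price = 30_000.0     # assumed average price per accelerator, USD
fleet_cost = fleet_size * unit_price  # $15B up-front outlay

for lifetime_years in (3, 4):
    # Straight-line view: the shorter the useful life, the larger the share of
    # the initial outlay that must be re-invested each year just to stand still.
    annual_replacement = fleet_cost / lifetime_years
    print(f"{lifetime_years}-year lifecycle: ~${annual_replacement / 1e9:.1f}B per year "
          f"to keep a ${fleet_cost / 1e9:.0f}B fleet current")
```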
A prime example of mounting financial risks is the situation surrounding the American corporation Oracle. The company, long considered one of the key players in enterprise software and cloud solutions, found itself under significant debt pressure by the spring of this year. According to various estimates, its total debt obligations exceeded $137 billion.
In a high-interest-rate environment, such a debt burden severely limits the company's ability to further expand its infrastructure projects. Major rating agencies are already warning that the company's credit rating could be downgraded to the high-risk category.
Attempts to raise additional funding through bond and stock issuances provoked a negative market reaction, and the company's stock price has more than halved since the fall of 2025. Large-scale staff reductions and a revision of investment plans are reportedly under discussion within the company.
A particular vulnerability of the current financial structure is the interdependence between large tech companies and AI developers. For instance, the massive Stargate infrastructure project, aimed at developing computing power for the leading developer OpenAI, entails Oracle’s participation as a major cloud infrastructure provider. Meanwhile, OpenAI itself remains an unprofitable company, with projected operating losses that could amount to approximately $14 billion in 2026.
A complex system of mutual investments and financial flows has formed within the industry, where funds invested by some tech companies into AI developers are subsequently channeled to pay for the cloud services of other companies, which, in turn, use the acquired resources to purchase equipment from processor manufacturers.
Such a closed-loop financing structure amplifies systemic risks, as a failure in any single element of the chain could trigger a chain reaction in a market valued at trillions of dollars.
In addition to financial constraints, the development of AI infrastructure faces severe energy barriers. Modern data centers require massive amounts of electricity, and their consumption continues to grow rapidly. According to estimates by American think tanks, in the coming years, data centers could consume up to 17% of all electricity generated in the US.
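To put the 17% figure into absolute terms, the sketch below translates it into annual terawatt-hours and average continuous load; the total US generation figure (roughly 4,300 TWh per year) is an outside assumption used only for scale.

```python
# Assumption (not from the text): total US electricity generation of roughly
# 4,300 TWh per year. Only the "up to 17%" share is cited above.
us_generation_twh = 4300
data_center_share = 0.17

dc_consumption_twh = us_generation_twh * data_center_share
hours_per_year = 8760
avg_load_gw = dc_consumption_twh * 1000 / hours_per_year  # TWh -> GWh -> average GW

print(f"Implied consumption: ~{dc_consumption_twh:.0f} TWh per year")
print(f"Average continuous load: ~{avg_load_gw:.0f} GW")
# Roughly 730 TWh per year, i.e. an average draw on the order of 80 GW,
# comparable to dozens of large power plants running around the clock.
```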
This load puts significant strain on existing power grids, many of which were built decades ago and were not designed for such intensive consumption. In a number of regions, including Virginia and Texas, contingency plans for grid overloads and power outages are being discussed.
In response to this situation, US federal regulatory agencies have initiated stricter rules for connecting large data centers to the national power grid. The new regulatory policy effectively requires tech corporations to independently invest in the construction of generating capacities and the modernization of regional energy infrastructure.
This has given rise to a trend that experts have dubbed the tech-led "nuclear renaissance". Microsoft signed a long-term power purchase agreement underpinning the modernization and restart of a reactor at the Three Mile Island nuclear plant, a project estimated at approximately $1.6 billion. Amazon acquired exclusive rights to use the power output of the Susquehanna nuclear power plant, while Google signed contracts for the development and deployment of small modular reactors (SMRs).
Thus, tech corporations are gradually transforming not only into digital infrastructure operators but also into major investors in the energy sector.
Nvidia continues to play a pivotal role in the new technological ecosystem, having effectively become the monopoly supplier of specialized GPUs for AI systems. The company’s market capitalization is approaching $3 trillion, with up to 90% of its revenue coming from the data center segment.
However, this position is also facing new challenges. Tighter US export restrictions have limited the supply of specialized processors to the Chinese market and have already led to multibillion-dollar write-offs. Simultaneously, the technological rivalry between the US and China is intensifying.
China is implementing an alternative model for computing infrastructure development based on centralized state planning. The national "Eastern Data, Western Computing" program envisions locating large computing centers in regions with affordable electricity and favorable climatic conditions, which allows for a significant increase in data center energy efficiency and a reduction in operational costs. According to some estimates, China's lag behind the US in hardware could narrow to a matter of months. At the same time, the Chinese operational model for AI directs significant resources toward internal control, data verification, and compliance with state information security standards, which reduces the efficiency of computing power utilization.
An additional factor amplifying the strategic importance of AI infrastructure is its rapidly growing role in the military domain. In recent years, leading powers have increasingly viewed computing power and machine learning algorithms as key elements of future military superiority.
Unlike previous technological revolutions, where individual types of weapons played a decisive role, the current phase is characterized by the integration of AI into decision-making, reconnaissance, and operations management systems. This means that control over computing infrastructure is gradually turning into a critical factor for national security and military balance.
To date, one of the most prominent examples of AI applied in combat conditions has been the military conflict in the Middle East.
During the military campaign against Iran, the US military has been actively using Project Maven, a system developed by Palantir. The system is designed to process massive arrays of intelligence data coming from satellites, unmanned aerial vehicles (UAVs), and other sensor platforms. It automatically identifies potential targets, ranks them by priority, and generates recommendations for operators. The system also integrates Claude, a large language model developed by Anthropic, which is used to process intelligence reports and accelerate information analysis.
According to military experts, the application of such algorithms has allowed for a sharp increase in the speed of operations: in the first 24 hours of the campaign alone, over a thousand targets were identified and struck in the area of operations, and by mid-March, that number had exceeded three thousand. The algorithms are capable of generating up to a thousand target recommendations per hour, which significantly exceeds the capabilities of traditional human analysis. This experience has revealed both the advantages and limitations of such technologies.
Proponents of AI application point to increased strike accuracy and the reduction of human error in intelligence analysis; however, the effectiveness of such systems decreases in complex weather conditions and when decoys are used. Furthermore, the use of civilian AI models in military operations has already sparked legal and political disputes between defense departments and tech companies.
Nevertheless, military leadership views AI as a crucial tool for reconnaissance and targeting, and the ongoing campaign in the Middle East has become one of the most large-scale tests of these technologies. In a broader context, this indicates that the rivalry between the US and China in the sphere of computing power is acquiring a military-strategic dimension.
Amid the rapid growth of computing capabilities, international discussions on regulating the sector are intensifying, including proposals to limit the volume of computing resources held by private companies and to license the use of high-performance GPUs. Until recently, such ideas were considered fringe, but today they are gradually entering the agenda of intergovernmental consultations.
The formation of the global AI infrastructure is becoming one of the key economic processes of the decade. On the one hand, it unlocks new sources of growth; on the other hand, massive investments and energy constraints create significant systemic risks.
Strategically, this signals the formation of a new foundation for the global balance of power. While in the 20th century the key factors of influence were industrial potential, energy resources, and nuclear technology, today these are joined by a state’s ability to provision massive computing capacities and control AI infrastructure. This makes the possession of a sovereign computing base one of the core conditions for maintaining economic and technological independence in the long term.
* The Institute for Advanced International Studies (IAIS) does not take institutional positions on any issues; the views represented herein are those of the author(s) and do not necessarily reflect the views of the IAIS.