On February 2nd local time, Elon Musk’s space exploration company SpaceX officially announced the acquisition of artificial intelligence startup xAI, marking the launch of a major integration spanning the space and AI sectors.
With the stated goal of “building a vertically integrated innovation engine,” the merger ultimately aims to build out space data centers at scale, seeking to remove the energy and physical-space bottlenecks that constrain the growth of global AI computing power and to reshape the future landscape of AI infrastructure.
In a memo to employees posted on SpaceX’s official website, Musk described the merger as a new chapter in the missions of the two organizations, stating plainly that they will “scale up to create a sentient sun to understand the universe and extend the light of consciousness to the stars.” The statement underscores his ambition to fuse space technology with AI capability and to stake out a position in space-based intelligence.
I. Core Driver of the Merger: Ground AI Computing Power Hits Resource Limits
The underlying logic of the merger lies in the escalating conflict between the large-scale development of AI and the carrying capacity of ground-based infrastructure. As generative AI and artificial general intelligence (AGI) iterate rapidly, the computing power and electricity required for AI model training and data processing are growing exponentially, while ground-based data centers are running up against hard physical limits and have become a core constraint on AI development.
In the memo, Musk stated plainly that global electricity demand for artificial intelligence cannot be met by “ground-based solutions” and that Silicon Valley urgently needs to look to space for a breakthrough; this, he argued, is the core rationale for merging with xAI.
From a data perspective, current AI computing power consumption has reached an astonishing scale:
Take medium-to-high-resolution real-time video generation as an example: its compute load grows roughly cubically with resolution. Generating one minute of high-definition video consumes computing power comparable to GPT-4 generating 100,000 words of text, and the compute cost per clip often exceeds 1,000 yuan, putting it out of reach for ordinary businesses and users.
More critically, ground data centers are trapped in a vicious cycle of power consumption, water consumption, and carbon emissions. By one estimate, a 1 MW ground data center can emit up to 500 tons of carbon per year, and cooling accounts for 40% of operating costs. Even with the most advanced technologies, Power Usage Effectiveness (PUE) rarely drops below 1.2, an inherent energy-efficiency handicap.
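As a rough illustration of these two claims, the sketch below works through the cubic scaling and the PUE arithmetic. The cubic exponent and the 1.2 PUE figure come from the text; reading “resolution” as the linear pixel dimension, and the 1080p/4K endpoints and 1 MW IT load, are assumptions chosen for the example.

```python
# Back-of-the-envelope illustration of the two claims above.
# The cubic exponent and the 1.2 PUE come from the text; the
# resolutions and the 1 MW IT load are illustrative assumptions.

def video_compute_multiplier(res_new: int, res_base: int) -> float:
    """If compute grows cubically with (linear) resolution, going from
    1080p to 4K multiplies the load by (2160 / 1080) ** 3 = 8x."""
    return (res_new / res_base) ** 3

def total_facility_power_mw(it_load_mw: float, pue: float) -> float:
    """PUE = total facility power / IT power, so 1 MW of IT load at
    PUE 1.2 draws 1.2 MW from the grid."""
    return it_load_mw * pue

print(f"1080p -> 4K compute multiplier: {video_compute_multiplier(2160, 1080):.0f}x")
print(f"1 MW of IT load at PUE 1.2 needs {total_facility_power_mw(1.0, 1.2):.1f} MW total")
```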
Musk’s judgment here is unambiguous. He noted in the announcement that capturing even one millionth of the sun’s energy output would exceed one million times the total energy currently used by human civilization. Vast physical space and effectively unlimited clean energy make orbit the only way out of this dilemma: “in the long run, space-based artificial intelligence is clearly the only path to scaling.”
The merger will deeply integrate SpaceX’s space transportation and satellite networking capabilities with xAI’s strength in general artificial intelligence R&D, forming a closed loop of computing-power demand, space-based carriers, and energy supply, and removing the main structural obstacles to putting space data centers into operation.
II. Technical Path: Million-Satellite Constellation to Build Space Computing Grid
After the merger, the two sides will focus on the large-scale deployment of space data centers. The core plan is to launch one million satellites to form an orbital constellation and build a distributed space-based computing network, and the plan has already moved into a substantive phase.
SpaceX has applied to the U.S. Federal Communications Commission (FCC) to deploy the satellite system in low Earth orbit at altitudes of 500 to 2,000 kilometers. The satellites will be solar-powered and will use laser links for high-speed communication with one another and with the Starlink network, keeping data transmission efficient.
In terms of computing power planning, Musk has put forward clear quantitative goals:
If approximately 1 million tons of satellites are launched into orbit every year, and each ton of satellite mass can deliver about 100 kilowatts of computing capacity, then roughly 100 gigawatts of new AI computing capacity can be added each year. In the long run, Musk argues, launching satellites carrying 1 terawatt (1,000 gigawatts) of computing capacity from Earth every year is “completely feasible.”
As for single-satellite computing power, each satellite is expected to carry about 1,000 NVIDIA B200 GPUs, giving roughly 80 PFLOPS (80 quadrillion floating-point operations per second) per satellite. The total computing power of the constellation is projected to reach 80 EFLOPS, equivalent to 80 million top-tier servers and sufficient to support the training and inference needs of GPT-4-class large models.
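Taking the quoted figures at face value, the arithmetic can be checked with a short sketch. Everything below uses only the numbers cited above; none of them are independently verified, and the constellation-wide total is simply a function of how many satellites are flying.

```python
# Sanity check of the quoted scaling figures. All constants are taken
# directly from the numbers cited above, not independently verified.

TONS_PER_YEAR = 1_000_000     # satellite mass launched to orbit per year (tons)
KW_PER_TON = 100              # claimed computing capacity delivered per ton (kW)
GPUS_PER_SATELLITE = 1_000    # claimed NVIDIA B200 count per satellite
PFLOPS_PER_SATELLITE = 80     # claimed single-satellite compute (PFLOPS)

# 1,000,000 tons/year * 100 kW/ton = 100,000,000 kW = 100 GW added per year
annual_new_capacity_gw = TONS_PER_YEAR * KW_PER_TON / 1_000_000
print(f"New AI computing capacity per year: {annual_new_capacity_gw:.0f} GW")

# Implied throughput per GPU: 80 PFLOPS / 1,000 GPUs = 80 TFLOPS per GPU
implied_tflops_per_gpu = PFLOPS_PER_SATELLITE * 1_000 / GPUS_PER_SATELLITE
print(f"Implied per-GPU throughput: {implied_tflops_per_gpu:.0f} TFLOPS")

def constellation_eflops(n_satellites: int) -> float:
    """Constellation-wide compute in EFLOPS (1 EFLOPS = 1,000 PFLOPS)."""
    return n_satellites * PFLOPS_PER_SATELLITE / 1_000

print(f"1,000 satellites: {constellation_eflops(1_000):,.0f} EFLOPS")
```

On this arithmetic, the quoted 80 EFLOPS total corresponds to roughly a thousand satellites at the stated per-satellite figure; a larger constellation would scale the total proportionally.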
The unique advantages of the space environment provide irreplaceable support for this plan:
On the one hand, solar power generation in space is more than five times as efficient as on the ground. Satellites in a sun-synchronous orbit can receive sunlight around the clock, eliminating dependence on the ground power grid, and power costs only about $0.005 per kilowatt-hour, far below the $0.05 to $0.12 per kilowatt-hour of ground data centers (a rough annual-cost comparison is sketched further below).
On the other hand, the roughly -270°C vacuum of space serves as a “natural cooling pool” for AI chips, eliminating the need for complex water-cooling or air-cooling systems. This not only improves chip operating efficiency and lifespan but also sharply reduces operating costs, with an estimated $3.8 billion saved annually on cooling systems alone. In theory, PUE can be brought down to 1.0, a roughly 20% energy-efficiency improvement over advanced ground data centers.
In addition, a distributed architecture of millions of satellites is highly fault-tolerant: the failure of one or a handful of satellites does not affect the operation of the network as a whole, avoiding the single-point-of-failure risk of ground data centers. At the same time, Software-Defined Satellite (SDS) technology allows functions to be reconfigured and capabilities upgraded on orbit.
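To make the power-cost claim concrete, here is a rough annual energy-bill comparison. The 100 MW IT load and the mid-range $0.08/kWh ground tariff are assumptions chosen for illustration; the $0.005/kWh orbital price and the 1.2 versus 1.0 PUE values are the figures cited above.

```python
# Rough annual energy-cost comparison using the figures cited above.
# The 100 MW IT load and $0.08/kWh ground tariff are illustrative assumptions.

HOURS_PER_YEAR = 8_760

def annual_energy_cost_usd(it_load_mw: float, pue: float, usd_per_kwh: float) -> float:
    """Total facility energy bill = IT load (kW) * PUE * hours * price per kWh."""
    return it_load_mw * 1_000 * pue * HOURS_PER_YEAR * usd_per_kwh

ground = annual_energy_cost_usd(100, pue=1.2, usd_per_kwh=0.08)     # ground data center
orbital = annual_energy_cost_usd(100, pue=1.0, usd_per_kwh=0.005)   # space-based figures

print(f"Ground:  ${ground / 1e6:.1f}M per year")    # ~$84.1M
print(f"Orbital: ${orbital / 1e6:.1f}M per year")   # ~$4.4M
print(f"Ratio:   {ground / orbital:.0f}x cheaper in orbit")
```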
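The fault-tolerance point can likewise be sketched with a simple binomial failure model; the node counts and per-node failure rates below are assumptions for the example, not figures from the announcement.

```python
# Illustrative availability comparison: one large facility versus a
# distributed constellation, assuming independent node failures.

from math import comb

def prob_losing_more_than(n: int, max_losses: int, p_fail: float) -> float:
    """Probability that more than `max_losses` of `n` nodes fail in the
    period, under a simple binomial failure model."""
    p_at_most = sum(comb(n, k) * p_fail**k * (1 - p_fail)**(n - k)
                    for k in range(max_losses + 1))
    return max(0.0, 1.0 - p_at_most)

# A single site with a 1% annual outage risk loses all capacity 1% of the time;
# a 1,000-node constellation with the same per-node failure rate essentially
# never loses even 5% of its capacity in a year.
print(prob_losing_more_than(n=1, max_losses=0, p_fail=0.01))       # ~0.01
print(prob_losing_more_than(n=1_000, max_losses=50, p_fail=0.01))  # ~0.0
```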
III. Commercial and Industry Impact: Reshaping AI Infrastructure and Opening a Trillion-Dollar Market
This merger is not only a major consolidation of Musk’s business empire; it will also profoundly reshape the competitive landscape of the AI and space industries.
In commercial terms, the combined company is valued at $1.25 trillion, comprising SpaceX at about $1 trillion and xAI at about $250 billion, with a per-share price of about $527. xAI will become a wholly owned subsidiary of SpaceX.
Notably, the transaction was completed on the eve of SpaceX’s potential IPO and is expected to bolster its plan to raise $50 billion. It also adds AI as a key dimension to SpaceX’s long-term growth story, drawing investor attention to the emerging market for space-based AI infrastructure.
For the AI industry, putting space data centers into operation will break the computing-cost bottleneck outright. Musk predicts that within two to three years, space could become the cheapest place to run AI computing. That cost advantage would let innovative companies push AI model training and data processing forward at unprecedented speed and scale.
In particular, it will accelerate the large-scale adoption of AI-Generated Content (AIGC). Until now, the main obstacle to medium-to-high-resolution real-time video and AAA-level AI-generated games has been compute cost; a space-based computing network would dramatically lower that threshold, pushing AIGC from an assistive role toward full-pipeline, high-quality mass production and reshaping the content creation industry.
Conclusion
The merger of SpaceX and xAI is, at its core, a two-way fusion of space technology and artificial intelligence. It is both a major step in Musk’s consolidation of his business empire and pursuit of his technological ambitions, and a landmark moment in the extension of global AI infrastructure into space.
The plan still faces challenges on many fronts, including mass satellite launches, on-orbit computing-power scheduling, and adapting chips to the space environment. Even so, space data centers have unmistakably moved from science fiction toward reality, and they may open a trillion-dollar new market for the deep integration of the AI and space industries.