In the coming years, power generation, transmission, and cooling are expected to become major constraints on large-scale artificial intelligence (AI) data centers. In light of this, Elon Musk recently proposed a disruptive vision: deploying AI computing centers into space.

Musk serves as the CEO of xAI, SpaceX, and Tesla. xAI is engaged in AI large model research and development, while SpaceX operates in the commercial aerospace sector. Tesla, on the other hand, is involved in multiple businesses such as electric vehicles, energy storage, and robotics. Linking these businesses together provides a nearly closed-loop support system for his vision. If successful, his companies could also be the biggest beneficiaries.

Why This Vision?

Musk believes that within the next four to five years, running large-scale AI systems in orbit will be more cost-effective than running similar systems on Earth. This is mainly due to the “free” solar energy and relatively easy-to-implement cooling technologies available in space.

At the US-Saudi Investment Forum, he stated, “I estimate that before Earth’s potential energy sources are exhausted, the cost-effectiveness of power and AI in space will far surpass that of current ground-based AI. I think that even within a four- to five-year timeframe, the lowest-cost way to conduct AI computing will be using solar-powered AI satellites.”

“I believe it won’t take more than five years from now,” he added.

Musk emphasized that as computing clusters grow, the combined demand for power and cooling will escalate beyond what ground-based infrastructure can keep up with. He claimed that achieving a sustained computing capacity of 200 to 300 gigawatts (GW) would require building hundreds of massive, expensive power plants, given that a typical nuclear power plant has a sustained output of about 1 GW.

Meanwhile, the sustained power generation capacity of the US is currently around 490 GW (although Musk said “per year,” he meant sustained power output at a given moment). Dedicating a large share of that capacity to AI is therefore impossible, and Musk stated that any AI-related demand approaching the terawatt (TW) level on Earth’s power grid is unfeasible.
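The arithmetic behind this argument is straightforward. The sketch below simply restates the round figures quoted above in code; all values are sustained power, not annual energy:

```python
# Back-of-envelope check of the grid-capacity argument, using the round
# numbers quoted in the article (all figures are sustained power in GW).

US_CAPACITY_GW = 490      # approximate sustained US generating capacity
NUCLEAR_PLANT_GW = 1      # typical sustained output of one nuclear plant
AI_DEMAND_GW = 1000       # a 1 TW AI build-out, the scale Musk calls unfeasible

# Number of typical nuclear plants needed to supply 1 TW of sustained load.
plants_needed = AI_DEMAND_GW / NUCLEAR_PLANT_GW

# Fraction of the entire current US capacity that 1 TW would consume.
fraction_of_grid = AI_DEMAND_GW / US_CAPACITY_GW

print(f"Nuclear plants needed: {plants_needed:.0f}")      # → 1000
print(f"Share of current US capacity: {fraction_of_grid:.0%}")  # → 204%
```

In other words, a terawatt of AI load would exceed the entire existing US grid by a factor of two, which is the core of Musk's case for moving the problem off-planet.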

“You can’t build power plants of that scale. For example, a 1 TW sustained output is simply impossible. You have to achieve it in space. In space, you can utilize continuous solar energy. In fact, you don’t need batteries because space is always bathed in sunlight, and solar panels will actually be cheaper as they don’t require glass or frames. Cooling is also achieved through radiation,” he explained.

Musk’s Plan

It is reported that Musk’s core plan is to deploy 100 GW of solar-powered AI satellites in orbit annually, a scale equal to roughly a fifth of total US power generation.

On November 19, he posted, “Starship should be able to send about 300 GW, or possibly even 500 GW, of solar-powered AI satellites into orbit each year.” He added that at this rate, the orbital AI computing capacity could surpass the overall power consumption of the US—averaging around 500 GW—within a few years.

This is not just about launching hardware; it is a significant step toward what Musk describes as a “Kardashev Type II civilization,” a theoretical milestone where a society can harness the entire energy output of a star.

According to posts on X, Musk has repeatedly linked Starship’s capabilities to this scale and pointed out that the energy available from space-based solar power is “over a billion times” that of all Earth’s resources combined. This concept builds on ideas like the “Dyson Sphere,” but Musk’s version focuses on a constellation of AI satellites that can process data while utilizing boundless solar energy.

However, according to Musk, “there is a key bottleneck holding it back.” This bottleneck likely involves scaling up production and orbital assembly.

Some analysts have pointed out that these satellites will not idle in space; they will form a network of solar-powered computing nodes. According to a report released by PCMag earlier this month, this concept is similar to a “Dyson Sphere” made up of satellites capable of harnessing solar energy and even cooling the Earth by blocking sunlight, thus aiding in climate control.

Musk also previously wrote on X, “Ultimately, solar-powered AI satellites are the only way to achieve a Kardashev Type II civilization.”

Additionally, to reach the upper limit of 300-500 GW of power generation per year, Musk suggested manufacturing on the Moon. In a post on X on November 2, 2025, he said, “A lunar base could produce 100 TW of power annually. This base could manufacture solar-powered AI satellites on-site and use mass drivers to accelerate them to escape velocity.”

Still Just a Dream

Despite Musk’s extremely optimistic outlook, numerous obstacles lie ahead. Orbital debris, regulatory approvals, and international space policies all pose risks. NVIDIA CEO Jensen Huang commented, “It’s just a dream.”

Theoretically, space is an ideal place for power generation and for cooling electronics, since temperatures in shadow can drop to nearly -270°C. The reality is not so simple: in direct sunlight, temperatures can soar to +120°C.

In Earth’s orbit, temperature fluctuations are much smaller: low Earth orbit (LEO) ranges from -65°C to +125°C, medium Earth orbit (MEO) from -100°C to +120°C, geostationary orbit (GEO) from -20°C to +80°C, and high Earth orbit (HEO) from -10°C to +70°C.

LEO and MEO are poorly suited to “space data centers” because of unstable illumination, severe thermal cycling, passage through the radiation belts, and frequent eclipses. GEO is more feasible: it enjoys nearly constant sunlight (interrupted only by brief eclipse seasons around the equinoxes) and has lower radiation levels.

However, even in geostationary orbit, building large-scale AI data centers faces severe challenges. A GPU cluster must reject essentially all of its electrical power as heat, and in vacuum the only mechanism available is infrared radiation from massive radiator panels. A single gigawatt-scale system would therefore need hundreds of thousands of square meters of deployable radiator area, far exceeding any structure a spacecraft has carried to date.
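The radiator problem can be quantified with the Stefan-Boltzmann law. The estimate below is a rough sketch under assumed values (emissivity, radiator temperature, ideal two-sided radiation, solar absorption ignored), not a detailed thermal design:

```python
# Rough radiator sizing for a 1 GW heat load using the Stefan-Boltzmann law.
# All parameter values are illustrative assumptions.

SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.9       # assumed radiator emissivity
T_RADIATOR_K = 300.0   # assumed radiator surface temperature (~27 °C)
HEAT_LOAD_W = 1e9      # 1 GW of electrical power, all rejected as heat

# Radiated flux from both faces of an ideal flat panel (solar input ignored).
flux_w_per_m2 = 2 * EMISSIVITY * SIGMA * T_RADIATOR_K**4

# Panel area needed to reject the full heat load.
area_m2 = HEAT_LOAD_W / flux_w_per_m2

print(f"Flux: {flux_w_per_m2:.0f} W/m^2, area: {area_m2 / 1e6:.1f} km^2")
```

Under these assumptions the answer is on the order of a square kilometer per gigawatt; running the radiators hotter shrinks the area, but it stays in the hundreds of thousands of square meters.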

Moreover, launching hardware at such a scale would require thousands of Starship-class flights, which is unrealistic within the four- to five-year timeframe set by Musk and would be extremely costly.
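A crude launch-count estimate shows where the "thousands of flights" figure comes from. Both input numbers below are illustrative assumptions, not confirmed specifications:

```python
# Rough launch-count estimate for fielding 100 GW of satellites in one year.
# Specific power and payload capacity are assumed, illustrative values.

TARGET_POWER_GW = 100            # annual deployment target quoted above
SPECIFIC_POWER_KW_PER_KG = 0.5   # assumed satellite power per unit mass
STARSHIP_PAYLOAD_T = 100         # assumed payload per flight to orbit, tonnes

# Total satellite mass needed to field 100 GW, converted to tonnes.
mass_t = TARGET_POWER_GW * 1e6 / SPECIFIC_POWER_KW_PER_KG / 1000

# Number of launches at the assumed payload per flight.
flights = mass_t / STARSHIP_PAYLOAD_T

print(f"Mass to orbit: {mass_t:,.0f} t, flights: {flights:,.0f}")
# → Mass to orbit: 200,000 t, flights: 2,000
```

Even with generous assumptions, the annual launch cadence lands in the thousands, an order of magnitude beyond anything flown so far.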

Furthermore, high-performance AI accelerators like Blackwell or Rubin and their supporting hardware cannot operate reliably in the GEO radiation environment without heavy shielding or thorough radiation hardening. Such modifications would significantly reduce clock speeds and/or require entirely new process technologies designed for radiation tolerance rather than raw performance, further reducing the feasibility of GEO-based AI data centers.

Additionally, considering the scale of the proposed project, technologies such as high-bandwidth connections with Earth, autonomous maintenance, debris avoidance, and robotic maintenance are still in their infancy. This is perhaps why Huang referred to it all as just a “dream” for now.