Musk vs. Jensen Huang: The Future of AI Infrastructure in the Space Data Center Debate
As of November 21, 2025, one of the most intriguing debates in the AI industry is unfolding: Elon Musk and NVIDIA CEO Jensen Huang are in direct opposition over the feasibility of space data centers. Musk argues that within five years space data centers will become more cost-effective than terrestrial ones, while Huang counters that this is “still a dream,” pointing to the practical limitations. The debate is more than a difference of opinion; it may well signal the future direction of AI infrastructure.
The backdrop to this debate is the sheer scale of current AI data center operating costs. The operating cost of OpenAI’s ChatGPT alone is reported at approximately $700,000 (about 900 million KRW) per day, and Google’s AI services reportedly consume as much electricity annually as the entire Czech Republic. In that context, Musk’s proposal of space as an alternative seems like a natural idea.
Examined in detail, Musk’s logic appears quite persuasive. In space, solar panels can generate 8-10 times more energy than on Earth, conventional cooling infrastructure is unnecessary, and there are no land costs. In particular, with SpaceX’s Starship expected to bring launch costs down to around $10 per kilogram, the economics could theoretically work. The core idea is that an AWS data center costs on average about $1 million per megawatt per year to operate, whereas in space, power and cooling costs could be nearly zero.
However, Jensen Huang’s rebuttal is not to be underestimated. He points out that a single H100 GPU weighs about 3 kg, so even at the projected $10 per kilogram the bare launch cost is only about $30; the real expense comes from radiation shielding and adapting hardware to the space environment, which drives costs up exponentially. In fact, the annual operating cost of the International Space Station (ISS) is about $3 billion, illustrating how complex and expensive hardware maintenance in space can be.
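To make the launch-cost arithmetic concrete, here is a minimal back-of-envelope sketch in Python. The 3 kg GPU mass and the $10-per-kilogram Starship projection are the figures quoted above; the present-day launch price and the hardening multiplier are rough placeholder assumptions for illustration only, not measured values.

```python
# Back-of-envelope launch-cost arithmetic for a single accelerator.
# Figures from the article: ~3 kg per H100-class GPU, projected $10/kg on Starship.
# The current launch price and the hardening multiplier are illustrative assumptions.

GPU_MASS_KG = 3.0                # approximate mass of one H100-class GPU (article figure)
STARSHIP_COST_PER_KG = 10.0      # Musk's projected launch cost (article figure)
CURRENT_COST_PER_KG = 2_500.0    # assumed present-day cost to low Earth orbit, for comparison only
HARDENING_MULTIPLIER = 10.0      # assumed extra mass/cost factor for shielding and space-rating

def launch_cost(cost_per_kg: float, hardened: bool = False) -> float:
    """Launch cost for one GPU, optionally including an assumed hardening overhead."""
    factor = HARDENING_MULTIPLIER if hardened else 1.0
    return GPU_MASS_KG * cost_per_kg * factor

print(f"Projected Starship, bare GPU:   ${launch_cost(STARSHIP_COST_PER_KG):,.0f}")
print(f"Projected Starship, hardened:   ${launch_cost(STARSHIP_COST_PER_KG, True):,.0f}")
print(f"Assumed current pricing, bare:  ${launch_cost(CURRENT_COST_PER_KG):,.0f}")
```

Even under these toy numbers the launch bill itself is small; Huang’s point is that the dominant costs sit elsewhere, in hardening, maintenance, and replacement.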
Technical Barriers and Market Trends
From a technical perspective, several challenges stand in the way of space data centers. First is radiation: in Earth orbit, hundreds of high-energy particles can strike a semiconductor every second. These strikes can cause computational errors in GPUs and CPUs, and according to industry experts, stable operation with today’s commercial semiconductors is difficult. Radiation-hardened semiconductors of the kind NASA uses are 10-100 times more expensive and lag 5-10 years behind commercial parts in performance.
Data transmission latency is also a serious issue. Relaying traffic through a geostationary satellite adds roughly 240ms of delay just for the trip up to the satellite and back down, which is fatal for real-time AI services. Current services like ChatGPT or Claude aim for an average response time of 1-3 seconds, and routing through space data centers would inevitably add to that. Although SpaceX’s Starlink has reduced latency to about 50ms using low-Earth orbit satellites, this still does not compare with the 1-5ms of terrestrial data centers.
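The latency figures follow almost directly from orbital altitude and the speed of light. The short sketch below reproduces the rough numbers in the paragraph above; it ignores ground routing, queuing, and processing delays, which is why real-world Starlink latency (around 50ms) sits well above the physical minimum.

```python
# Minimum one-hop relay delay implied by orbital altitude alone
# (speed-of-light limit; ignores routing, queuing, and processing delays).

C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

def relay_delay_ms(altitude_km: float) -> float:
    """Delay for ground -> satellite -> ground, straight up and back down."""
    return 2 * altitude_km / C_KM_PER_S * 1000

print(f"GEO (~35,786 km altitude): {relay_delay_ms(35_786):.0f} ms")   # ~239 ms
print(f"LEO (~550 km, Starlink-class): {relay_delay_ms(550):.1f} ms")  # ~3.7 ms minimum
```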
Interestingly, the market has already begun investing in space computing. As of 2024, startups related to space computing have attracted a total of $230 million in investments, and Amazon’s AWS is offering satellite data processing services through “AWS Ground Station.” Microsoft is also expanding its space-related cloud services through Azure Space. These moves suggest that it might not be entirely impossible.
Particularly noteworthy is China’s movement. Since late 2024, China has been experimenting with small AI computing modules installed on its Tiangong space station. Although intended for research, this is an important attempt to verify the feasibility of AI computation in the space environment. The Chinese government has announced a goal of piloting space data centers by 2030, a more conservative but more realistic approach than Musk’s five-year timeline.
Turning to South Korea, there do not yet appear to be concrete plans for space data centers. However, domestic big tech companies such as Naver and Kakao are steadily increasing their investments in AI infrastructure: Naver has announced a plan to invest 1 trillion KRW in AI data centers by 2025, and Kakao plans to invest 800 billion KRW in its second Pangyo data center. These large-scale investments suggest the focus will remain on expanding terrestrial data centers for the time being.
Economic Analysis and Future Outlook
From an economic perspective, the cost structure of a terrestrial data center breaks down roughly as follows: power 40-50%, cooling 15-20%, land and building leases 20-25%, and labor 10-15%. The advantage Musk claims for space data centers is the near-elimination of power and cooling costs, a theoretical saving of 60-70% of operating costs.
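As a quick check of that claim, the sketch below simply sums the power and cooling shares listed above; the percentage ranges are from this paragraph, the arithmetic is my simplification.

```python
# Share of terrestrial data-center operating cost that would disappear if
# power and cooling were nearly free, using the cost ranges quoted above.

cost_shares = {                # (low, high) fraction of total operating cost
    "power":    (0.40, 0.50),
    "cooling":  (0.15, 0.20),
    "facility": (0.20, 0.25),  # land and building leases
    "labor":    (0.10, 0.15),
}

low = cost_shares["power"][0] + cost_shares["cooling"][0]
high = cost_shares["power"][1] + cost_shares["cooling"][1]
print(f"Potential operating-cost saving: {low:.0%} to {high:.0%}")
# -> 55% to 70%, broadly consistent with the 60-70% figure above
```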
What is often overlooked, however, are the initial construction and maintenance costs. Building a terrestrial data center currently costs $5-8 million per megawatt, while launch costs alone are estimated to make a space data center 2-3 times more expensive. Add the cost of manufacturing space-grade hardware, radiation shielding, and remote maintenance systems, and the initial investment could rise more than tenfold. Recovering these costs might take at least 15-20 years, which casts doubt on the economics given the pace of technological change.
Another practical issue Jensen Huang points out is GPU lifespan. The average replacement cycle for data center GPUs is 3-4 years, and in space, radiation and extreme temperature swings are likely to shorten it further. Since replacing hardware in orbit requires new launches, factoring in those costs erodes the economics even more. The Hubble Space Telescope, for example, required several crewed servicing missions for component replacement, at a cost of billions of dollars.
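To see why the payback horizon stretches out, here is a rough per-megawatt payback sketch. It reuses figures quoted in this article (the $5-8 million per megawatt terrestrial build cost, the roughly $1 million per megawatt per year operating cost, the 60-70% power-and-cooling saving, and the 2-3x versus more-than-10x capex scenarios); the midpoints and the simple no-discounting arithmetic are my own simplifications, not a real cost model.

```python
# Rough payback sketch: how many years of operating savings it would take to
# offset the extra up-front cost of a space data center, per megawatt.
# All inputs reuse figures quoted in the article; this is illustrative only.

TERRESTRIAL_CAPEX_PER_MW = 6.5e6   # midpoint of the $5-8M per MW build cost
ANNUAL_OPEX_PER_MW       = 1.0e6   # ~$1M per MW per year operating cost
OPEX_SAVING_FRACTION     = 0.65    # midpoint of the 60-70% power/cooling saving

def payback_years(space_capex_multiplier: float) -> float:
    """Years for operating savings to cover the extra capex versus building on the ground."""
    extra_capex = TERRESTRIAL_CAPEX_PER_MW * (space_capex_multiplier - 1)
    annual_saving = ANNUAL_OPEX_PER_MW * OPEX_SAVING_FRACTION
    return extra_capex / annual_saving

print(f"Launch premium only (2.5x capex):  {payback_years(2.5):.0f} years")   # ~15 years
print(f"Fully space-hardened (10x capex):  {payback_years(10.0):.0f} years")  # ~90 years
```

And this ignores the replacement cycle discussed above: if hardware has to be re-launched every 3-4 years, the recurring launch bill eats further into the savings.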
Interestingly, some analyses suggest that space data centers could be economically viable for certain specialized applications. Satellite image analysis, space exploration data processing, and global communication services, for instance, could drastically reduce the volume of data that has to be sent back to Earth, potentially lowering total costs. Companies like Planet Labs and Maxar already perform some initial data processing in orbit, and if this trend spreads, the space computing market could grow gradually.
According to market research firm Northern Sky Research, the space computing market is expected to grow at an average annual rate of 25% to reach $3 billion by 2030. However, this is only 0.4% of the total cloud computing market (estimated at $800 billion by 2025), suggesting it will remain a niche market for the time being. Industry experts commonly believe that reaching the “cheapest within five years” level Musk envisions is still a long way off.
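For what it’s worth, the 0.4% figure is simply the ratio of the two market estimates, and the 25% growth rate implies the market starts from a very small base; here is a two-line check using the article’s numbers (my arithmetic, not NSR’s):

```python
# Sanity check on the market-size figures quoted above.
space_2030 = 3e9      # projected space computing market in 2030
cloud_2025 = 800e9    # estimated total cloud computing market in 2025
cagr = 0.25           # projected average annual growth rate

implied_2025_base = space_2030 / (1 + cagr) ** 5  # back out the ~2025 starting size
print(f"Share of the cloud market: {space_2030 / cloud_2025:.2%}")  # ~0.38%
print(f"Implied 2025 base: ${implied_2025_base / 1e9:.2f}B")        # ~$0.98B
```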
Personally, the most intriguing part of this debate is the difference in approach between the two CEOs. Musk always presents a vision of “making the impossible possible,” while Jensen Huang prefers a realistic and step-by-step approach. While Musk’s approach has sometimes proven correct, as seen in the successes of Tesla and SpaceX, Jensen Huang’s caution cannot be ignored, given NVIDIA’s overwhelming success in the AI chip market.
Ultimately, the realization of space data centers will depend on the pace of technological advancement and cost reduction. If SpaceX’s Starship truly achieves launch costs of $10 per kilogram and space-grade semiconductor technology makes revolutionary progress, Musk’s prediction might come true. However, with current technology levels, Jensen Huang’s skeptical view seems more realistic. In any case, this debate itself reflects the industry’s efforts to overcome the limitations of AI infrastructure, making future developments even more anticipated.
This article was written after reading the article “Musk: ‘Space Data Centers Will Be the Cheapest in Five Years’… Jensen Huang: ‘Still a Dream’”, with personal opinions and analysis added.
Disclaimer: This blog is not a news outlet, and the content written reflects the author’s personal views. Investors are responsible for their own investment decisions, and no liability is accepted for investment losses based on the content of this article.