Amazon founder Jeff Bezos made waves this week at Italian Tech Week in Turin with a bold prediction: gigawatt-scale data centres will be constructed in space within the next 10 to 20 years, eventually outperforming their Earth-based counterparts thanks to continuously available solar energy.
It’s a vision that sounds plucked from a cyberpunk novel, yet it reflects a very terrestrial crisis—the planet’s data infrastructure is buckling under the weight of artificial intelligence.
The concept of orbital data centres has gained traction among tech giants as terrestrial facilities drive up demand for electricity and for the water used to cool their servers.
As generative AI models devour computational resources at unprecedented rates, the industry faces an uncomfortable truth: the digital revolution may be environmentally unsustainable.
The Case for Orbit-Based Data Centres
The allure of space-based data centres rests on several compelling advantages, with energy topping the list. According to Bezos, the facilities would have round-the-clock access to solar power, unimpeded by clouds, rain, or weather, and would beat the cost of terrestrial data centres within a couple of decades.
On Earth, data centres account for 0.3 percent of overall carbon emissions, a figure that rises to 2 percent when networked devices like laptops and smartphones are included.
Data centre electricity usage is expected to double by 2026, with AI set to accelerate that growth. Moving operations into orbit could theoretically tap into unlimited solar energy while eliminating the massive cooling costs that plague terrestrial facilities.
The environmental calculus extends beyond energy. Earthbound data centres carry further environmental costs, notably heavy water consumption and electronic-waste generation, problems that are particularly acute in drought-prone areas.
Space offers a different cooling regime: with no water or air to consume, an orbital facility would reject its heat by radiating it into the vacuum from large panels, eliminating water use entirely. Processing data on-orbit and reducing the volume of data transmitted to the ground would further cut overall power requirements, potentially offering a path toward genuine sustainability.
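With no convecting air, heat leaves an orbital facility only by radiation, so the required radiator area follows directly from the Stefan-Boltzmann law. A back-of-envelope sketch (the 300 K radiator temperature and 0.9 emissivity are illustrative assumptions, not figures from any proposal):

```python
# Radiator area needed to reject a thermal load purely by radiation,
# using the Stefan-Boltzmann law: P = emissivity * sigma * A * T^4.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(power_w: float, temp_k: float = 300.0,
                     emissivity: float = 0.9) -> float:
    """Area (m^2) required to radiate `power_w` watts at temperature `temp_k`."""
    return power_w / (emissivity * SIGMA * temp_k ** 4)

# A 1 MW server hall versus a gigawatt-scale station (one-sided panels;
# radiating from both faces would roughly halve the area).
for load_w in (1e6, 1e9):
    print(f"{load_w / 1e6:>7.0f} MW -> {radiator_area_m2(load_w):,.0f} m^2 of radiator")
```

At these assumptions a single megawatt needs roughly 2,400 m² of radiator and a gigawatt needs a few square kilometres, which hints at how much of an orbital station's mass budget heat rejection could claim.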
Connectivity presents another tantalising benefit. Placing a data centre in space could significantly decrease the time it takes for data to travel between continents, benefiting financial transactions, real-time communication, and remote sensing.
For global applications, orbital positioning could dramatically reduce latency across vast geographic distances.
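The latency claims are easy to sanity-check with speed-of-light arithmetic. A minimal sketch that counts propagation delay only, ignoring switching, queuing, and ground-station hops (altitudes are typical published values):

```python
# One-way propagation delay at the speed of light over different paths.
C_VACUUM = 299_792.458      # km/s, speed of light in vacuum
C_FIBRE = C_VACUUM / 1.47   # light slows by the fibre's refractive index (~1.47)

def delay_ms(distance_km: float, speed_km_s: float) -> float:
    return distance_km / speed_km_s * 1000.0

# Round trip up to a satellite directly overhead and back down.
leo_rtt = 2 * delay_ms(550, C_VACUUM)       # low Earth orbit (Starlink-like altitude)
geo_rtt = 2 * delay_ms(35_786, C_VACUUM)    # geostationary orbit
fibre_one_way = delay_ms(10_000, C_FIBRE)   # roughly a transpacific fibre run

print(f"LEO round trip : {leo_rtt:6.1f} ms")
print(f"GEO round trip : {geo_rtt:6.1f} ms")
print(f"10,000 km fibre: {fibre_one_way:6.1f} ms one way")
```

Physics alone puts a low-orbit hop under 4 ms but a geostationary hop near 240 ms, so the orbit chosen largely determines whether latency is a selling point or a deal-breaker.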
The Reality Check
Yet for all the cosmic ambition, the path from vision to viable infrastructure remains fraught with formidable obstacles. The challenges aren’t merely technical—they’re existential.
Cost stands as the most immediate barrier. Despite SpaceX’s revolution in affordable launches, getting massive computing infrastructure into orbit remains extraordinarily expensive.
The significant cost and complexity of launching and maintaining space infrastructure represents a foremost challenge. Every kilogram launched carries a premium price tag, and data centres require tonnes of equipment.
Then there’s the connectivity paradox. While orbital facilities might reduce latency on intercontinental routes, the round trip between the ground and a low-orbit satellite still adds roughly 20 milliseconds, acceptable for many applications but potentially ruling out uses such as financial transactions that require split-second precision.
The harsh realities of the space environment compound these concerns. Cosmic radiation and space debris could result in hardware failure or data corruption that’s difficult to repair.
Unlike terrestrial facilities where technicians can quickly replace faulty components, space-based repairs would require costly satellite servicing missions or redundant systems that add weight and expense.
Manufacturing and deployment timelines present additional hurdles. The pace at which space data centres could be launched may be greatly limited by production capacity and launch frequency, even with improved rocket technology.
Building gigawatt-scale facilities would require orchestrating dozens, if not hundreds, of launches—a logistical nightmare.
And because transmitting data down to the planet’s surface adds unavoidable latency, satellite information cannot reach users on Earth in true real time, creating a chicken-and-egg problem: which applications justify the investment if fundamental latency issues remain unresolved?
The Verdict: Visionary or Vaporware?
Bezos compared the surge in artificial intelligence to the internet boom of the early 2000s, urging optimism despite the risk of speculative bubbles.
That comparison cuts both ways—the dot-com era was marked by both transformative innovation and spectacular failures driven by premature hype.
The concept of space-based data centres isn’t entirely fantastical. Processing AI and deep learning algorithms on board satellites could reduce data transfer costs and latency, making certain specialized applications viable even if wholesale migration proves impractical.
Edge computing in orbit—processing data closer to where it’s collected, particularly for Earth observation and satellite imagery—represents a realistic near-term opportunity.
But the wholesale replacement of terrestrial infrastructure with orbital mega-facilities? That remains firmly in the “ambitious vision” category.
Launching hardware into space is hardly sustainable in itself, but proponents argue the long-term benefits will be worth it, though “worth it” depends heavily on solving myriad technical, economic, and regulatory challenges that today remain theoretical.
The more probable scenario isn’t a binary choice between Earth and space, but a hybrid model.
Specialised high-value computing tasks—AI training for space-based applications, processing satellite data streams, serving edge computing needs for global networks—could migrate skyward while routine cloud computing remains grounded.
The Space Storage Paradox: Why Going Orbital Means Trading Terabytes for Kilograms
In the quest to build data centres beyond Earth’s atmosphere, engineers face an uncomfortable trade-off: the storage technology light enough to launch is exactly the kind that can’t hold enough data to justify the journey.
When Jeff Bezos envisions gigawatt-scale data centres orbiting Earth, he’s painting a picture of unlimited solar power and frictionless cooling.
But there’s a mundane engineering reality lurking beneath the cosmic ambition—one measured not in petabytes of processing power, but in grams of weight allowance.
The storage paradox of space computing is brutally simple: traditional hard disk drives offer massive capacity at bargain prices, but their mechanical components make them far too heavy and fragile for orbital deployment. Solid-state drives solve the weight problem while creating a capacity crisis.
The Weight of Data
On Earth, storage capacity follows a straightforward economic logic. Traditional hard disk drives offer larger storage space at lower cost per gigabyte, with capacities reaching 30 terabytes or more, while solid-state drives are lighter and more durable but typically offer less capacity for the price.
For terrestrial data centres where space and power matter more than launch weight, HDDs remain the workhorses of mass storage.
But everything changes when you’re paying thousands of dollars per kilogram to escape Earth’s gravity well. HDDs are substantially heavier because they contain motors, spinning platters, and complex mechanical components, while SSDs are essentially chips with almost no weight.
The weight differential isn’t trivial: it’s existential. Every gram matters when SpaceX charges approximately $1,500 per kilogram to low Earth orbit, and those costs multiply several-fold for the higher orbits where some data centre concepts propose deployment.
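A quick cost-per-terabyte-to-orbit comparison makes the trade concrete. This sketch assumes the $1,500/kg figure above; the drive masses and capacities are illustrative round numbers, not vendor specifications:

```python
# Launch cost per terabyte of storage, HDD vs SSD (illustrative figures only).
LAUNCH_COST_PER_KG = 1_500.0   # USD per kg to low Earth orbit (approximate)

drives = {
    # name: (capacity_tb, mass_kg) -- assumed typical 3.5" HDD vs M.2 SSD
    "30 TB HDD": (30.0, 0.7),
    "8 TB SSD":  (8.0, 0.05),
}

for name, (capacity_tb, mass_kg) in drives.items():
    launch_cost = mass_kg * LAUNCH_COST_PER_KG
    print(f"{name}: ${launch_cost:,.0f} to launch "
          f"(${launch_cost / capacity_tb:,.2f} per TB)")
```

On launch mass alone the SSD actually wins per terabyte; the catch, as the sections below explain, is that this simple arithmetic omits the radiation hardening the SSD then needs.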
The mechanical nature of HDDs presents another disqualifying factor:
Hard drives rely on read/write heads floating nanometers above rapidly spinning magnetic platters—a feat of precision engineering that doesn’t appreciate the violent shaking of rocket launches or the constant bombardment of micrometeoroids in orbit.
SSDs have no moving parts, making them inherently more durable, a crucial advantage when repairs require million-dollar satellite servicing missions rather than a technician with a screwdriver.
The Radiation Reckoning
Yet switching to solid-state storage merely trades one problem for another. The very characteristics that make SSDs attractive for space—tiny transistors storing charges in microscopic cells—make them vulnerable to the harsh radiation environment beyond Earth’s protective magnetosphere.
The space environment subjects memory and storage to extreme conditions including long-term radiation exposure and temperature fluctuations, with high levels of cosmic rays and solar radiation causing data corruption and loss.
Ionizing radiation passes through electronic devices leaving trails of charge that can change the stored values in flash memory cells, potentially flipping bits from ones to zeros or vice versa—the digital equivalent of corrosion.
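In code, a single-event upset is just an unwanted XOR with a one-bit mask, and one classic defence, used alongside error-correcting codes in radiation-tolerant designs, is triple modular redundancy: keep three copies of every value and take a bitwise majority vote. A minimal sketch:

```python
import random

def flip_random_bit(value: int, width: int = 8) -> int:
    """Simulate a single-event upset: XOR one randomly chosen bit."""
    return value ^ (1 << random.randrange(width))

def majority_vote(a: int, b: int, c: int) -> int:
    """Bitwise 2-of-3 vote, the core of triple modular redundancy (TMR)."""
    return (a & b) | (a & c) | (b & c)

stored = 0b1011_0110
copies = [stored, stored, stored]
copies[1] = flip_random_bit(copies[1])   # radiation corrupts one copy

recovered = majority_vote(*copies)
assert recovered == stored               # the two intact copies outvote the flip
print(f"corrupted copy: {copies[1]:08b}, recovered: {recovered:08b}")
```

The protection is real but not free: TMR triples the memory footprint, which is exactly the weight-and-cost penalty the hardening discussion here describes.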
The solution sounds straightforward: radiation-hardened memory. In practice, it’s anything but. Radiation-hardened ASICs can cost millions of dollars to develop, and with NAND flash chips changing every year or two, those hardened designs can quickly become obsolete.
This can create a devastating economic paradox—by the time you’ve developed space-rated storage based on current-generation flash memory, consumer electronics have moved two generations ahead, making your expensive orbital hardware technologically obsolete before it launches.
SSDs used in space are often combined with radiation-hardened components to improve resilience, but this hardening process adds weight, complexity, and cost—undermining the very advantages that made SSDs attractive in the first place.
Some satellite systems use hybrid approaches, but these remain far from the commercial-scale capacities needed for true data centre operations.
The Capacity Crunch
This is where the space data centre vision confronts cold mathematics. A modern terrestrial data centre might deploy petabytes of storage across thousands of high-capacity HDDs.
Converting that to radiation-hardened, space-qualified SSDs doesn’t just increase costs—it fundamentally changes what’s possible.
Current space-grade storage maxes out in the gigabytes or low terabytes, nowhere near the scale required for cloud computing, AI training, or the massive data processing Bezos envisions.
While HDDs can reach 30TB capacities, SSDs are generally available in smaller sizes, and radiation-hardened space variants lag even further behind the commercial state of the art.
The performance advantages of SSDs—transfer speeds up to 7,500 MBps compared to HDDs’ 30-150 MBps—matter little if you can’t store enough data to make the orbital infrastructure worthwhile.
It’s rather like building a Ferrari with a fuel tank that holds only a litre—technically impressive, fundamentally impractical.
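To put the speed gap in perspective, here is the time needed to stream a full 30 TB drive at the throughputs quoted above:

```python
def hours_to_read(capacity_tb: float, speed_mbps: float) -> float:
    """Hours to stream `capacity_tb` terabytes at `speed_mbps` megabytes/second."""
    megabytes = capacity_tb * 1_000_000   # decimal units: 1 TB = 10^6 MB
    return megabytes / speed_mbps / 3600

for label, speed in [("HDD @ 150 MBps", 150), ("NVMe SSD @ 7,500 MBps", 7_500)]:
    print(f"{label:>22}: {hours_to_read(30, speed):6.1f} h to read 30 TB")
```

Reading the drive takes more than two days over the HDD interface but little over an hour at NVMe speeds, which is precisely why the SSD’s performance is wasted if the capacity isn’t there.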
Emerging Storage Alternatives
Some engineers are exploring exotic alternatives. Magnetoresistive RAM (MRAM) stores data magnetically and is naturally immune to radiation-induced errors while offering high-speed performance, potentially offering a path forward.
But MRAM currently exists at even smaller capacities than flash-based SSDs, and scaling to data centre volumes remains theoretical.
Others propose hybrid architectures that combine different memory technologies, using radiation-hardened controllers to manage less expensive commercial flash with extensive error correction.
NASA has explored combining radiation-hardened and commercial off-the-shelf non-volatile memories into hybrid architectures controlled by radiation-hardened ASICs.
These approaches might reduce costs, but they add complexity—never an advantage in an environment where debugging requires rocket launches.
The storage paradox reveals why space-based data centres remain in the “ambitious vision” rather than “active development” category. You can’t simply transplant terrestrial infrastructure into orbit and expect it to work. The economics don’t compute—literally.
Until someone solves the fundamental tension between launch weight, radiation hardness, storage capacity, and cost, orbital data centres will remain bottlenecked by their inability to store enough data to justify their existence.
It’s not a matter of better engineering or economies of scale improving things by 10 or 20 percent. It requires order-of-magnitude breakthroughs in multiple technologies simultaneously—breakthroughs that show little sign of arriving within Bezos’s 10-to-20-year timeline.
Startup Aiming To Put Data Centres In Space
A startup aiming to put data centres in orbit is drawing heavy attention, and it’s not hard to see why. With demand for AI processing power surging, tech giants like Microsoft, Google, and Amazon are already turning to nuclear power plants to fuel their expanding operations.
The Electric Power Research Institute estimates that by 2030, U.S. data centres could consume 9% of the nation’s total electricity. Companies are exploring unconventional options to keep pace; Microsoft, for instance, once experimented with an underwater data centre before scrapping the project.
Advocates say orbital data hubs could offer compelling advantages: lower costs, reduced environmental impact, connectivity for remote areas, disaster resilience, and practically limitless room for expansion.
But the hurdles are substantial. Launching a satellite remains costly—Lumen estimates around $8.2 million per launch.
Latency from long distances could hinder critical applications like financial transactions. And the risks of cosmic radiation, space debris, or hardware failure in orbit present major engineering and maintenance challenges.
Despite growing global interest, both governments and private industry are proceeding cautiously. International laws and regulations for space-based infrastructure remain in flux, and many agencies are still commissioning feasibility studies rather than committing to full-scale projects.
One such study, commissioned by the European Union and conducted by French aerospace company Thales Alenia Space, was released in June.
The ASCEND (Advanced Space Cloud for European Net zero Emission and Data sovereignty) study concluded that orbital data centres could sharply cut energy use and carbon emissions compared to land-based facilities.
Powered by solar energy and free of water-cooling requirements, such stations could align with Europe’s 2050 carbon neutrality goals.
The report outlines an ambitious roadmap: a 50-kilowatt proof of concept targeted for deployment by 2031, scaling up to a 1-gigawatt orbital data centre by 2050. If successful, the program could generate several billion euros in returns by mid-century.
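The jump from 50 kilowatts to a gigawatt can be sanity-checked with launch arithmetic. In this sketch the specific-power and payload figures are illustrative assumptions (roughly modern solar-array performance and a Starship-class lifter), not numbers from the ASCEND report:

```python
import math

# How many heavy-lift launches might a 1 GW orbital solar array need?
TARGET_POWER_W = 1e9             # the 2050 goal: a 1-gigawatt station
SPECIFIC_POWER_W_PER_KG = 100.0  # assumed array specific power (W/kg), optimistic
PAYLOAD_PER_LAUNCH_KG = 100_000  # assumed Starship-class payload to LEO

array_mass_kg = TARGET_POWER_W / SPECIFIC_POWER_W_PER_KG
launches = math.ceil(array_mass_kg / PAYLOAD_PER_LAUNCH_KG)

print(f"Array mass: {array_mass_kg / 1000:,.0f} tonnes -> {launches} launches")
# ...and that is the power system alone, before servers, radiators or shielding.
```

Even under these generous assumptions the power system alone implies on the order of a hundred heavy-lift launches, which frames how ambitious the 2050 milestone is.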
“The need for data centers for Europeans is growing and should continue in the same direction for the following years,” says Damien Dumestier, ASCEND project manager at Thales Alenia Space.
“Space data centers could offer an opportunity to provide Europe with a lower environmental footprint and could also be a flagship for the future of the European space industry,” he said.
Elsewhere in Europe, a team of IBM researchers in Zurich, Switzerland, has partnered with Poland’s KP Labs, a company focused on building AI-powered software and hardware for space applications, to study orbital data centres for the European Space Agency (ESA).
In their research, the team outlines three possible scenarios for the data centres. The first two scenarios involve two satellites in the same orbit: one gathers data, while the other processes it.
In the first, a small satellite detects wildfires and sends raw data to a larger satellite, which analyzes the data and transmits key findings to Earth.
In the second, a satellite in LEO transfers unspecified data to a geostationary space data centre, which orbits at the same rate Earth rotates and so stays fixed over one spot, giving it the advantage of continuous ground-station connectivity.
The third scenario imagines a lunar lander acting as a data centre, processing information from exploration rovers and sending relevant findings to Earth via a relay satellite.
“We achieved what we aimed at,” says Jonas Weiss, Senior Research Scientist at IBM Research Europe. “We could show that there is likely an inflection point approaching, where edge computing of massive data in space will be economically more viable than sending it down to Earth.”
Orbital Data Centres
Orbital data centres may soon serve not only Earth’s growing appetite for processing power but also the needs of astronauts and researchers working in space.
Houston-based Axiom Space, a company building commercial spaceflight services and infrastructure, is advancing plans for a private space station.
Backed by NASA’s Commercial LEO Development Program, Axiom intends to attach its first module to the International Space Station as early as 2026, with the goal of eventually detaching and operating independently.
The company expects its station to host an expanding crew, all of whom will require reliable cloud services. To meet that demand, Axiom is developing an orbital data centre, dubbed ODC T1.
The system is designed to reduce reliance on Earth-based infrastructure by using optical inter-satellite links—laser-based communication systems that securely transmit data between satellites.
Built with a modular design, Axiom’s orbital data centre can scale with demand. Its pressurized environment also enables the use of terrestrial-grade hardware, shielding equipment from radiation and other hazards of space.
“Humanity has aspirations for exploration and economic development on the Moon, Mars and beyond,” an Axiom representative says, adding that data centres for any large-scale human or robotic missions “will need to be able to support real-time onsite data processing, data storage and AI capabilities.”
“Advancing and implementing ODCs in Earth’s orbit sets the technological and economic foundations for humanity to continue to explore and advance further into the solar system.”
Bezos’s timeline might prove prescient, or it might join the long list of tech predictions that sounded revolutionary but arrived decades late or in radically different form.
What’s undeniable is that the AI boom has forced an uncomfortable reckoning with the environmental costs of our digital civilisation.
Whether the solution lies in the stars or in more prosaic innovations—better chip efficiency, renewable energy integration, improved cooling technology—remains the multi-billion-dollar question.
For now, space-based data centres occupy the fascinating intersection of technically possible, economically uncertain, and environmentally compelling.
In an industry notorious for moving fast and breaking things, perhaps looking up rather than ahead represents the kind of moonshot thinking necessary to sustain the digital age without exhausting the planet.
