Recent news of data centers being built across the world, mini nuclear power plants being planned to power them, and rivers being diverted to cool them has sparked debates about where this is leading. The obvious answer: into space.

This reminded me of the pulp science fiction novels I used to read as a kid. The German series Perry Rhodan features a giant AI data center on the Moon called Nathan, after the wise man from Lessing’s Enlightenment era play. Not the worst namesake. (Side note: I learned that n8n, the rather fashionable AI-enabled workflow platform, is specifically not pronounced “Nathan” but “en-eight-en.”)

In the decades between me reading those novels and now, computers did the opposite of what science fiction predicted. They shrank relentlessly. The idea of a giant computer being powerful became a punchline rather than a premise. In the 2012 film Iron Sky, the Moon Nazis’ secret weapon is a computer the size of a building, and a smartphone outperforms it. For a long time, it seemed like the future of computing was small.

Then AI happened, and suddenly we are back to building the biggest machines we can. Perry Rhodan’s Nathan begins construction in 2130 and takes 200 years to build. But life has started imitating fiction sooner than expected. Lonestar Data Holdings is working on the first real lunar data center. In March 2025, their “Freedom” payload successfully operated en route to the Moon, and they are planning multi-petabyte storage at Earth-Moon L1 by 2027. The CEO calls himself a “luna-tic.”

Meanwhile, Starcloud (formerly Lumen Orbit), a Y Combinator graduate partnered with Nvidia, launched a satellite carrying an Nvidia H100 GPU in November 2025. It was a hundred times more powerful than any GPU previously sent to space. They used it to train a small LLM on Shakespeare’s works. It spoke Shakespearean English. A marketing stunt, sure, but a cool one. Their actual ambition: a 5-gigawatt orbital data center.

Why Space Makes Sense

The appeal is straightforward. Solar energy in space is not filtered through kilometers of atmosphere and not intermittently blocked by inconvenient Earth rotation. It is available around the clock, in practically infinite amounts. The demand for AI computing power is enormous and growing, and our planet’s rivers and power grids are starting to feel the strain.

There are problems, of course. Micrometeorites (or not-so-micro ones) can pierce solar panels. Cooling is a real challenge: space is a vacuum, which makes it an excellent thermal insulator, so waste heat cannot be carried away by air or water and can only be shed through radiators. To put that in perspective, Starcloud's planned 5-gigawatt facility would need roughly 8 square kilometers of radiators, an area larger than Gibraltar. And then there is latency, the inescapable consequence of distance.
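To see where a figure like that comes from, here is a back-of-envelope sketch using the Stefan-Boltzmann law. The assumptions (radiator temperature, emissivity, two-sided panels) are mine, not Starcloud's published design, which is why the result lands a bit below their 8 km² number:

```python
# Back-of-envelope radiator sizing via the Stefan-Boltzmann law.
# Assumptions (mine, not Starcloud's): radiators at 300 K, emissivity 0.9,
# radiating from both faces, and all 5 GW of input ending up as waste heat.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(power_w, temp_k=300.0, emissivity=0.9, sides=2):
    """Panel area needed to radiate `power_w` of waste heat into deep space."""
    flux = sides * emissivity * SIGMA * temp_k**4  # W shed per m^2 of panel
    return power_w / flux

area = radiator_area_m2(5e9)     # Starcloud's 5-gigawatt target
print(f"{area / 1e6:.1f} km^2")  # roughly 6 km^2 under these assumptions
```

Hotter radiators shed heat much faster (it scales with T⁴), so the real area depends heavily on how hot you can run the panels and how much margin you budget.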

The Lagrange Sweet Spot

Low Earth orbits still have day/night cycles, residual atmosphere, and the growing problem of space debris. Even a lunar surface facility would experience a month-long day/night cycle, not ideal for solar-powered operations.

A better option: the Lagrange points L4 and L5 of the Earth-Moon system. These are gravitationally stable points where objects maintain their position relative to both Earth and Moon without constant fuel expenditure. They sit at roughly the Moon's distance from Earth, meaning a signal round-trip takes about 2.6 seconds. You would not want to guide a self-driving car from there. But for many AI workloads, quality of answers matters more than millisecond response times. A chatbot with that latency is conceivable, though perhaps not in customer service. We ought to think bigger than customer service anyway.
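The 2.6-second figure is just light-travel time over the Earth-Moon distance, doubled for the round trip:

```python
# Minimal round-trip latency estimate for a data center at Earth-Moon L4/L5,
# which sit at roughly the Moon's distance from Earth.
C = 299_792.458          # speed of light, km/s
EARTH_MOON_KM = 384_400  # mean Earth-Moon distance, km

round_trip_s = 2 * EARTH_MOON_KM / C
print(f"{round_trip_s:.2f} s")  # about 2.56 s, before any processing time
```

That is a hard physical floor; queuing, routing, and inference time on top of it only push the number up.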

These Lagrange points have interesting neighbors. Faint dust accumulations known as Kordylewski clouds were first reported there in 1961 and confirmed photographically between 2018 and 2022. Their cores span roughly 25,000 kilometers across. Our data centers would share the neighborhood with ghostly cosmic dust. There are worse views from an office window.

Science Fiction Got There First

The idea of computing infrastructure in space has been imagined many times. Asimov’s “The Last Question” from 1956 traces computing from room-sized machines to cosmic scale across billions of years. Google cites Asimov as inspiration for their Suncatcher project. Douglas Adams gave us Deep Thought, which spent 7.5 million years computing the answer to life, the universe, and everything, then designed its successor: Earth itself, a planet-sized computer.

The logical endpoint of “move to space for more energy” was articulated by Robert Bradbury in 1997 with the Matrioshka Brain: nested Dyson spheres around a star, each layer using the waste heat of the one below, capturing an entire star’s output for computation. Named after Russian nesting dolls. We are not building that tomorrow. But between the Shakespearean LLM in orbit and the multi-petabyte lunar storage, the direction is set.

China has already named their orbital computing project the Three-Body Computing Constellation, after Liu Cixin’s novel. ADA Space and Zhejiang Lab launched 12 satellites in May 2025, with plans for 2,800. When your satellite constellation is named after a science fiction trilogy about existential cosmic threats, you are either very confident or very literary. Possibly both.

Are We Building a Megastructure?

So are space data centers the first megastructure we will build beyond Earth? Quite possibly. We have the demand. We have early proof of concept: a GPU training Shakespeare in orbit, a storage payload tested on the way to the Moon, a 12-satellite computing constellation already launched. And we have the James Webb Space Telescope sitting at Earth-Sun L2, roughly 1.5 million kilometers away. If we can park a telescope four times farther than the Earth-Moon Lagrange points, data centers at L4 or L5 are plausible.

There are, of course, real engineering challenges: radiation shielding, maintenance without a service crew, connection bandwidth, and the sheer logistics of construction at Lagrange-point distances. On bandwidth: NASA demonstrated 622 Mbps laser communication from the Moon in 2013, and with the abundant energy available at L4/L5, multi-gigabit links are feasible, though still orders of magnitude below terrestrial data center interconnects. These are engineering problems, not physics problems.
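To make the "orders of magnitude" gap concrete, here is a rough comparison of how long moving a petabyte would take over a Moon-distance link. The 10 Gbps rate is a hypothetical future link, not a demonstrated one:

```python
# How long to move 1 petabyte over a Moon-distance laser link?
# Compares NASA's 2013 demo rate with a hypothetical 10 Gbps link.
PB_BITS = 8e15  # 1 petabyte (10^15 bytes) in bits

def transfer_days(rate_bps):
    """Days to transfer one petabyte at a sustained bit rate, ignoring overhead."""
    return PB_BITS / rate_bps / 86_400

print(f"{transfer_days(622e6):.0f} days at 622 Mbps")  # ~149 days
print(f"{transfer_days(10e9):.0f} days at 10 Gbps")    # ~9 days
```

For comparison, terrestrial data centers routinely interconnect at hundreds of gigabits per second, which is why workloads that ship modest queries and answers, rather than bulk data, are the natural fit for orbit.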

Will it happen exactly as I have sketched it here? I am no physicist or rocket scientist, so don’t quote me on the details. For a proper deep dive, I recommend Isaac Arthur’s excellent video on the topic: Space-Based Industry.

But the trajectory seems clear. The science fiction I grew up reading is becoming engineering documentation. Perry Rhodan’s Nathan was scheduled for 2130. At this pace, reality might not wait that long.

To borrow from Prof. Károly Zsolnai-Fehér: What a time to be alive.