Book Trilogy Review: Remembrance of Earth’s Past (The Three Body Problem)

post by Zvi · 2019-01-30

Contents

  Spoiler-Free Review
  All The Spoilers – The Plot Summary
    Book I – The Three Body Problem
    Book II – The Dark Forest
    Book III – Death’s End
  Discussions
      Big Picture: An Alien Perspective
      The Great Betrayal
      The War Against Science
      Only One Man
      Mutually Assured Destruction and the Swordholder
      The Great Ravine
      The Fairy Tale
      The Great Escape
      The Dark Game Theoretic Forest
      Conclusion and Science Fiction’s Fall

Epistemic Status: Stuff that keeps not going away so I should write it up I suppose. Brief spoiler-free review, then the main discussion will be spoilerific.

Spoiler-Free Review

I read the trilogy a few months ago, on general strong reviews and lack of a better known science fiction option I hadn’t already read.

I was hoping to get a Chinese perspective, some realistic physics (as per Tyler Cowen’s notes) and a unique take on various things. To that extent I got what I came for. Book felt very Chinese, or at least very not American. Physics felt like it was trying far more than most other science fiction, and the consequences are seriously explored. Take on that and many things felt unique versus other books I’ve read, in ways I will discuss in the main section.

What I didn’t feel I got was a series that was high enough quality to justify its awards and accolades, or allow me to fully recommend reading it to others. It’s not bad, it has some great moments and ideas and I don’t mind having read it, but I was hoping for better. That’s fine. That is probably a lot about my expectations getting too high, as I can’t point to (in the limited selection of things I’ve read) recent science fiction I think is even as good. As with other genres, reading mostly old books is wise advice that I follow less than I should.

It is a reasonable decision to do any of: Not read the book and not continue further, not read the book and allow it to be spoiled here, to read some and see if you like it, or to read the whole thing.

This long post is long. Very long. Also inessential. Apologies. I definitely didn’t have the time to make it shorter. Best to get it out there for those who want it, and to move on to other things.

All The Spoilers – The Plot Summary

(This unfolds everything in linear order, the books rightly keep some things mysterious at some points by telling events somewhat out of order. This is what I remember and consider important, rather than an attempt to include everything. There’s a lot that happens that’s interesting enough to be included!)

Book I – The Three Body Problem

Communist China during the cultural revolution was really bad. Reeling from a combination of the cultural revolution and its murder of (among others) her father and all reasonable discourse, her forced exile to the countryside, and environmental panic raised by a combination of Silent Spring and the actual terrible environmental choices and consequences she sees around her, Ye Wenjie despairs for humanity. When she sees that contact has been made with extra-terrestrial intelligence, with a message warning us not to reply as doing so would give away our location and we would be conquered, she replies asking for an alien invasion from Trisolaris, and goes on to found the ETO, the Earth-Trisolaris Organization, with the explicit aim of betraying humanity and giving Earth over to the aliens.

Because we are so awful that it couldn’t help but be an improvement, right? Using a game called Three Body that illustrates the origins of Trisolaris, the ETO recruits huge portions of the world’s intellectual classes, largely because of environmental concerns and humanity’s other moral failings, making them willing to betray humanity in favor of an alien invasion.

Trisolaris sends an invasion fleet that will arrive from Alpha Centauri in 400 years. Worried that our technological advancement is so rapid we will by then defeat them, they send protons to Earth that they have ‘unfolded’ to transform into sophons, which they can use to communicate faster than light in both directions, and which can be programmed to monitor everything on Earth, mess with particle accelerator experiments to prevent technological progress, and do a few other tricks. The crazy physics results and other ETO efforts to suppress science drive many physicists to suicide. The ETO convinces the world’s intellectual elite to run a cultural campaign against science, as science is the only thing they fear. In particular they go after one particular person, the not-very-virtuous-or-brilliant-seeming astrophysicist Luo Ji. These symptoms are what let the authorities around the world investigate and figure out they are facing the ETO, which they manage to infiltrate and then mostly shut down. But scientific progress is still blocked by the sophons, so Earth seems doomed to fall to the invasion fleet.

Book II – The Dark Forest

With Earth under total surveillance and no prospect of scientific progress, humanity sets out to prepare for what it calls the doomsday battle. Earth’s resources are redirected to two efforts. An epic space fleet is constructed to attempt to battle the invasion, with so much being invested in these efforts, despite the four hundred year time frame, that the world expects rationing of basic goods and bad economic times from the start. A second effort is the wallfacer program. Four special humans are chosen to develop secret strategies, since the one place sophons can’t intrude is the inside of a human brain. The job of the wallfacers is, given unlimited resources, to develop a strategy for stopping the invasion, but keep it secret and use misdirection and deception to ensure that the enemy does not figure out their plans. Everyone is required to do as the wallfacers ask, without question or explanation, and everything they do is considered part of their plan one way or another.

The four chosen are: A scorched-Earth terrorist from Venezuela who is then hailed as a master of strategy and asymmetric warfare, a former US secretary of defense and veteran officer, an English neuroscientist, and then Luo Ji because they couldn’t help but notice that the ETO really, really wanted Luo Ji dead even if they had no idea why. Luo Ji wants no part of this, but his refusals are taken to be part of his plan, so instead he uses his position to set himself up for a nice quiet life.

The ETO then assigns each a ‘wallbreaker’ to uncover the wallfacer’s plan and reveal it to the world. Humanity gets poetic justice from its first three Wallfacers, all of whose plans are revealed as not only huge wastes of resources but active dangers. The scorched-Earth terrorist tries to use a Mercury base as a means to cause Mercury to fall into the Sun and cause a collapse of the entire solar system, thinking he can hold it hostage to force a compromise. The general creates a swarm of automated ships that he intends to use to betray Earth’s fleet in order to then try and somehow trick the invasion fleet. The neuroscientist creates a brainwashing machine, claims he’s using it to convince our officers we’ll win, but actually uses it to brainwash all volunteers with utter faith that we will lose. None of the plans have a practical chance of working even on their own merits, and all three cause humanity to collectively react in horror.

That leaves Luo Ji, whom they force to work by putting his wife and child into hibernation as hostages. Luo Ji thinks for a long time and then decides to ‘cast a catastrophic spell on the planets of a star’ using the same solar broadcast technique we used to communicate with Trisolaris in the first place, broadcasting the location of another star out to the broader universe, as a test that will take a hundred years to verify. Trisolarian allies almost kill him with an engineered virus, and he is sent into hibernation until we can find a cure.

Upon awakening, Luo Ji finds a transformed world. There has been an economic and ecological collapse called The Great Ravine, caused by the devotion of all resources to the war effort, but when humanity gave up on the war and went back to trying to live until the end, the economy recovered underground and the Earth recovered on its own once left alone. Our fundamental physics was still blocked, but our tech still got a lot better, and eventually we got around to building a fleet, which can even go faster than the Trisolarian fleet (0.15c versus 0.1c), and everyone is confident of victory once the Trisolarian fleet arrives.

He learns that his ‘spell’ worked: the sun in question was shattered by a photoid. He also faces a computer system trying to kill him thanks to an old virus engineered to target him, which he narrowly escapes multiple times until the authorities realize what is happening. Then he goes about adjusting to living out his life.

People are so confident of victory that when a probe arrives in advance of the fleet, everyone gets into super close formation because they’re fighting over bragging rights regarding this historic moment, rather than thinking about what would happen if the probe tried something. The main battle they prepare for is which continent’s ships will get the place of honor. Only two ships are semi-accidentally held back and no one gives that much thought. Those two ships flee the solar system.

The probe tries something, which is to ram and destroy all Earth’s ships, because the probe is super fast and is made of a super-dense material organized around the strong nuclear force, and Earth’s ships aren’t. Nothing we have can do anything to the probe. The probe then proceeds to Earth. Luo Ji expects it to kill him, but instead the probe shuts down our ability to communicate with the outside via the Sun, as we had done previously to communicate with Trisolaris and for Luo Ji to cast his spell on the star.

It seems too late, and all is lost. Luo Ji had realized that the universe is a ‘dark forest’ in which any civilization that reveals its location is destroyed, because resources are limited but life grows exponentially. If everyone out there knows where you are, one of them will wipe you out, even if most leave you alone. That was how his ‘spell’ worked.

Thanks to his spell, Luo Ji is now identified as our last hope and treated like a messiah. The wallfacer project is revived, and he is given carte blanche and a blank check. But when he seems to be devoting his time and energy to the details of an early warning system that can’t possibly save us, people turn on him. He is treated as a pariah, not allowed on buses, the man who betrayed us and gives us false hope. Scientists develop theories that the destruction of the star he cast a spell on was a coincidence, as it was about to go nova or something right at that time, never mind the probability of that on such a short time frame.

All this time, the one thing more reviled than Luo Ji is ‘escapism’. It is treated as the ultimate crime against humanity to attempt to flee the solar system. Never mind that this is the only realistic way humanity could survive, or at least the only way we might. The one thing everyone agrees on is that if we can’t all survive, no one can be allowed to try and survive, so all such attempts must be stopped at all costs.

Thus, we have very little to attempt escape with. All we have left are the two ships left out of the final battle, one ship that had been hijacked when someone who wanted that more than anything was given temporary full control over a ship, including the ability to take it out of the solar system, while the ship was checked for ‘defeatist’ officers who had been given the brainwashing treatment, and three ships pursuing that one ship.

Or rather we only have two ships. Both groups of ships realized that the other ships in their fleet comprised vital resources of matter necessary to completing their journey to another star, and realized that other ships would view them likewise. As a result, each group of ships fought a battle where only one ship survived, and then had the resources to continue its journey. This only reinforces humanity’s devotion to not allowing any escape, as space is viewed as a place that takes away our humanity and makes us do terrible things.

Finally, Luo Ji is ready. He proceeds to a remote location, digs his own grave, and talks to Trisolaris via the sophons. He explains that the early warning system he created is actually a message to the galaxy. By placing its components carefully around the solar system, he has engineered a message revealing the location of Trisolaris. If he is killed, a dead man’s switch will cause the message to be sent out, and others will then destroy both Trisolaris and Earth, since previous communications already revealed our distance from Trisolaris.

Under this threat, Trisolaris surrenders. They pull back their probes, stop their sophons from interfering with science experiments and divert their fleet, rather than face mutually assured destruction. Luo Ji, it appears, has saved us.

Book III – Death’s End

Death’s End is the story of humanity’s journey through various eras as told through the journey of a Chinese scientist named Cheng Xin.

In the common era, Cheng Xin’s team manages to send a brain into space to meet with the Trisolarian fleet, hopeful that they can then reconstitute the person and that the person can then act as an envoy and spy. They send Yun Tianming, who is in love with Cheng Xin and bought her a star, as the United Nations was selling them to raise money for the war effort.

When Cheng Xin awakens in the future, this star proves to have planets and thus be very valuable, and she becomes super wealthy for the rest of the story.

In the deterrence era, Luo Ji has become the swordholder, standing ready at all times to broadcast the location of Trisolaris (and thus also Sol), thus ensuring peace. This works for a while, but humans forget how precarious their situation is and lose the appetite for hard choices. The men of this era, it is noted, are soft, and not true men. The number of warning stations and backup broadcasters is cut back to save money. One of the ships that survived the wiping out of Earth’s fleet, the Bronze Age, is recalled to Earth, and then everyone aboard is put on trial for crimes against humanity. Its crew manages to warn the other ship that escaped Earth, the Blue Space, not to return home, so a Trisolarian probe and an Earth ship are sent in pursuit and end up on top of them, ready to strike but in no hurry.

Earth has decided the time has come to select a new swordholder to provide deterrence. Wade, Cheng Xin’s old ruthless spymaster boss from the project that sent up Yun’s brain, tries to kill her to prevent her from becoming swordholder and claim the position himself. Later, the remaining candidates again try in vain to convince her not to run for the position, but she feels obligated and does anyway. In a true instance of You Had One Job, the public chooses her as someone they like better, despite her being obviously less of a deterrent and Trisolaris still being able to watch literally everything everywhere all the time when making choices. A mindbogglingly thankless Earth then arrests Luo Ji for high crimes because there might have been life on the planets orbiting the star he cast a spell on to prove that his hypothesis worked, never mind that there was no reason to believe that and oh that was part of him saving us from an alien invasion. But I get ahead of myself with the editorializing.

And of course, the second Cheng Xin takes over, the Trisolarians send probes in to dismantle what little broadcasting ability we have left and cripple us. Cheng Xin has the opportunity to push the button and reveal the location of Trisolaris, but (in the book’s words) her ‘motherly instincts’ take over and she refuses to doom us all, hoping things will somehow work out. Sophon (one of them has now taken on humanoid form) then explains gratefully that she only had about 10% deterrence value based on their models, whereas Wade was at 100% and certain to retaliate, which of course would have meant they wouldn’t have tried anything.

Instead, she proclaims, all humans must dismantle their technology and move to Australia, except for a few million who will coordinate this move and hunt down the resistance. Australia and an area on Mars are, she claims, our reward for what our culture has brought to Trisolaris, including the concept of deception, which allowed their technological progress to restart after stalling out for eons, but we must give up most technology. Then, once everyone is there, she has the power to the farms there cut off, and announces that the plan is for us to fight each other for food to survive for the four years until the second Trisolarian fleet, which can now achieve light speed, arrives to take over Earth’s intact cities and provide support for the survivors.

Before that can happen, it is revealed that the ship Gravity, which was pursuing the renegade ship Blue Space, has been taken over by the crew of Blue Space, who also managed to somehow defeat the probe attacking both of them, and together the crews voted to reveal the location of Trisolaris and therefore Sol. They did this via finding a pocket of four dimensional space, and explorations into four dimensional space find strange and ominous results including a ‘tomb’ to a dead race. The tomb explains that higher dimensional civilizations have no fear of lower dimensional ones, and lower dimensional ones have no resources higher dimensional civilizations might need, so the two have no reason to interact.

With Earth now a dead planet orbiting, the Trisolarians divert their fleet elsewhere and allow humanity to return to its planet and technology. Humans turn on those who managed to save their cities and civilization (while also, to be fair, laying the groundwork for xenocide), and greatly reward those in the resistance, including Luo Ji. Trisolaris is destroyed when a photoid hits one of their three stars.

The question now becomes, what to do, given that the universe will soon know where we are?

Before leaving Earth, Sophon gives Cheng Xin a chance to speak with Yun Tianming, but she is warned that they can only speak on personal matters and must avoid revealing technical or scientific or otherwise valuable intelligence, or Cheng Xin will be blown up by a nuke before she can share what she has heard. Yun shares with Cheng a fairy tale in three parts that was part of Yun’s published work ‘Fairy Tales from Earth.’ The works are well known, and they are not only not blown up or warned, they are allowed to go over the planned allotted time so he can finish the tales. They then promise to meet in the future at her star. Cheng memorizes the tale and reports the contents back, and she is congratulated on her success as all of our finest people work on figuring out the hidden meaning in the tale.

The tale is long and provides lots and lots of clues about how the universe works, what is technologically possible, what is in store for the solar system and how we might defend ourselves. Humanity figures out a little of it, but misses a lot more and knows it has done so, and comes up with three potential solutions to its crisis.

First, humanity notices that both known ‘dark forest strikes’ against known civilizations have taken the form of a photoid being used to shatter a star, and that if they can hide behind Jupiter or Saturn when that happens, they will survive the strike and have the technology to survive going forward. So we could build space cities to house our population.

Second, we could build faster than light ships and escape the solar system in time. Trisolaris developed light speed ships, and our older slower ship Gravity was on its way to another star even without this, so we know such a thing is possible. But that’s escapism, and escapism is worse than Hitler, since not everyone could get away, so all such attempts are banned to make sure that if most of us die, everyone dies.

Third, we could take a concept gleaned from the fairy tale and from a question once asked of Sophon, where she confirmed that there is a way to credibly communicate that we are not a threat and thus be left in peace. We could turn the solar system into a dark region of space that can’t be escaped from. The problem is we don’t know how to do that, and our attempts to figure it out came to nothing.

Humanity puts almost all its resources into plan one, and succeeds in moving everyone into space. It bans plan two, and mostly gives up on plan three.

Wade then meets Cheng Xin (our characters go into cryogenic sleep so they can move between eras as the plot requires) and demands she turn over her company to him so he can research light speed travel, since without it humanity will never be great even if we somehow survive, and also this plan of hiding behind Jupiter and hoping no other civilization thinks we might do that seems kind of unlikely to work when you think about it, ya know? Cheng Xin agrees and turns her company over with the caveat that if Wade would do anything to threaten human lives he has to awaken her and give her control back.

He agrees to the condition. Cheng Xin is awakened to news that Wade has made great progress in his research, but the authorities are trying to shut it down because escapism, and his response has been to seed soldiers with anti-matter bullets on the space cities so that he can threaten reprisals that would destroy them if his work is stopped. Cheng Xin is horrified, once again refusing to use such threats, and orders him to surrender. Amazingly, he does, fulfilling his promises to Cheng Xin. She goes back into hibernation again.

She is awakened to warning of a dark forest strike. Our hour has come, and a different type of weapon is attacking us, collapsing space around the solar system down to two dimensions. Laws against escapism are repealed, but escape velocity is light speed, so it is too late. Except for one ship that Wade managed to finish, with two seats, which Cheng Xin and her friend Ai Aa take, first to Pluto to help distribute priceless artifacts as a memorial to humanity, and then to escape the ongoing attack. Even as we are all doomed, humanity continues to hate ‘escapism’ and a number of ships try (with no success) to stop and destroy her ship because if they can’t escape, everyone should die. She does escape and directs the ship to her star.

When there, they attempt to meet Yun, but an accident causes them to miss each other. They do meet up with a descendant of the crew of the ships Gravity and Blue Space, which survived and became human civilization. He explains that dimension-collapsing weapons are being used throughout the universe, bringing it down from its original ten plus dimensions to now mostly two and three dimensional space with a few four dimensional pockets, and other laws of physics are under attack too. That’s why string theory and all these ‘folded-up’ dimensions and such. They are given instructions to enter a pocket dimension to await the big crunch, which will reset things and restore all the dimensions.

They then get a message from the universe. It notifies them that both Earth and Trisolaris made the list of impactful civilizations in the universe, and asks that all matter in pocket universes be returned because too much matter has been removed from the main universe into various pocket universes and this lack of sufficient mass will prevent the big crunch, dooming the universe to heat death instead of renewal. They decide to put most of the mass back into the main universe, leaving behind a message for the next cycle and taking a ship to explore what is left of the main universe.

We do not learn if enough others cooperated, or whether the big crunch did occur.

Glad that four-thousand-word plot summary is out of the way. On to the discussions.

Discussions

Big Picture: An Alien Perspective

I don’t mean Trisolaris. I mean China.

Trisolarians have several characteristics on which to hang their planet of hats. They evolved around a trinary star, which leads to unpredictable periods of extreme heat and cold, lasting unpredictable amounts of time, and worse, with only occasional ‘stable eras’ where they are mostly revolving around one sun and life can continue. Thus, they have the ability to ‘dehydrate’ and store themselves in this state during chaotic periods, then rehydrate when things stabilize, and they survived this way through many collapses until reaching an era of high technology.

They communicate telepathically and automatically, so before they met humans they didn’t know that lying or deception could be a thing. Their entire culture should be alien to us.

Instead, the humans are written a lot stranger, to my American eyes, than the Trisolarians. Most characters are Chinese, but even those who are not continue to mostly think and act in this similarly alien style. I appreciated the opportunity to see the view of humanity from someone in a completely different culture. But it points out how ordinary the Trisolarians are that they are, at least, not obviously less like the humans I know than the humans in the book are. How much of this is how the Chinese or some Chinese group view humans and think about the world, versus how the author does? I cannot say.

The viewpoint expressed is deeply, deeply conservative and cynical.

People, all but a handful of them, are absurdly fickle, petty, short-sighted, self-interested and self-defeating. They are obsessed with status and how they are relative to others, and on the margin wish others harm rather than well. If anyone tries to escape disaster or otherwise rise on their own, universally mankind’s response is to band together to kill them.

When times are good, they lie back on universal benefits and forget the universe is a harsh place, and that their very survival depends on hard work, sacrifice and hard choices, and explicitly condemn and go after anyone who makes hard choices or sacrifices. They choose leaders who can’t make hard choices, showing weakness and inviting attack. They all grow soft, such that an entire prosperous era can lack ‘real men,’ leaving only effeminate weaklings incapable of action, so that only those frozen in the past are capable of taking meaningful action.

They will believe and put their hope in any damn thing they are pointed at, for a while, no matter how absurd. Then they will despair of any action, or turn in desperation to any hope for change. They will alternately hail the same person as a hero, and arrest them as a villain, multiple times, for the same actions, based on how they want to feel at the time and what hopes they are given.

The intellectuals are even worse, and at the start of the book the bulk of them are actively working to sell us out to the aliens for no reason other than environmentalism and humans sucking, so why not turn things over to a race determined to conquer us and hope for the best?

The idea that humans, or any beings anywhere, could successfully navigate basic game theoretic problems like the prisoner’s dilemma, rather than killing each other and literally destroying the dimensions and physical laws of the universe, is not in hypothesis space.
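For contrast, here is a minimal sketch in Python of the standard result the book’s universe seems to rule out: in a repeated prisoner’s dilemma, a simple reciprocal strategy like tit-for-tat sustains cooperation and does far better than mutual defection. The payoff numbers are the textbook defaults, not anything from the book.

```python
# Minimal iterated prisoner's dilemma (illustrative textbook payoffs, nothing from the book).
# Two tit-for-tat players sustain cooperation; two always-defect players lock in
# the worse mutual-defection outcome.

PAYOFFS = {  # (my move, their move) -> my payoff; C = cooperate, D = defect
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(history):
    return "C" if not history else history[-1][1]  # start nice, then copy opponent's last move

def always_defect(history):
    return "D"

def play(strategy_a, strategy_b, rounds=100):
    history_a, history_b = [], []  # each entry: (my move, their move)
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strategy_a(history_a), strategy_b(history_b)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        history_a.append((move_a, move_b))
        history_b.append((move_b, move_a))
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))      # (300, 300): cooperation sustained
print(play(always_defect, always_defect))  # (100, 100): everyone worse off
```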

They are also, as we’ll go over in multiple places, profoundly and deeply Too Dumb To Live.

The few who are not all of that, or even are a little bit not all of that some of the time, because the plot needs them not to be, are good bets to outright save humanity.

Humanity ends up saved, repeatedly, by the actions of a few people from our era, in spite of almost everyone alive’s consistent efforts, through the ages, to the contrary.

When almost everyone alive dies in the third book, it’s hard to escape the feeling that they kinda deserved it.

The Great Betrayal

It is not hard to see why Ye Wenjie, victim of the cultural revolution, might despair for humanity or her country, and take any opportunity for intervention. Yes, it was an open invitation to an alien race of would-be conquering warmongers, about whom you know nothing else, so thinking this is worth a shot seems rather like a lack of imagination on how bad things can get. Then again, it’s not clear she doesn’t just want to see all the humans suffer and die for what they’ve done.

The part where her motivation, and those of the (majority of) intellectuals who join her, is largely environmentalism? That she’s motivated in large part by reading Silent Spring?

Given how unlikely an entirely alien race is to care about our world’s ecology at all, this does not seem like a reasonable position for someone who values Earth’s other life forms.

It seems more like Earth’s intellectual class collectively decided that this random other alien race has had a hard time, whereas humans are Just Awful and do not deserve to live, so they’re going to work to hand the planet over. That fits what the few recruits we see say – it seems the book thinks that most intellectual people are disgusted by and despise humanity, and want it to suffer and die.

The way they recruit in the book is, there’s a virtual reality game called Three Body. Those who play it unravel the nature of Trisolaris, after which they are asked to come to a meeting in real life, where they are asked their opinion on humanity and whether it would be a good idea to betray us all in favor of Trisolaris. It seems most people who get that far say yes, and the organization got very large this way without being infiltrated.

I hope that this perspective on what intellectuals think is not too common among Chinese or those with the deeply conservative views of the author. It seems such a horrible thing to think, and its implications for what one should do are even worse. I try to stretch and see how one might think this, and I can sort of kind of see it? But not really. My best sketch is a logic something like, they despise so many of the preferences of the common folk, seeing them as sexist and racist and wasteful and destructive and full of hate and lack of consideration for their fellow man, so they must hate them in return?

I can also understand how, seeing the things many people say nowadays, one might reach this conclusion. Scott Alexander recently wrote an article noting that many believe that there are zero, or approximately zero, non-terrible human beings on the planet. It also offers the hypothesis that there are approximately zero, or perhaps actual zero, non-terrible human beings in history. Direct quote from the top of the article:

There are some pretty morally unacceptable things going on in a pretty structural way in society. Sometimes I hear some activists take this to an extreme: no currently living person is morally acceptable.

The otherwise excellent show The Good Place reveals that 98%+ of all humans who ever lived were sent to The Bad Place rather than The Good Place. Good enough is hard.

A lot of people who oppose hate a lot sure seem to hate, and hate on, a huge portion of all humans who ever lived.

You can and would be wise to love the sinner and hate the sin. That’s mostly not what the sinners experience.

If you see lots of people loudly saying such things, it’s easy to see how one might view them as sufficiently hateful of humans in general that they are willing to sell humanity out, for actual literal nothing, to the aliens.

I can also see a religious perspective. Suppose you think that all men are sinners, and kind of terrible, but that they are redeemed by faith, or by the forgiveness of God, or some symbolic form of penance, or what have you. Now suppose you see a group of people who seem to agree about the awful nature of humanity, even if they don’t agree on why or on which parts of humanity are awful. But they don’t have this God concept. When people announce the error of their ways, the reception is usually observed to be worse than if they’d just said nothing. Once you’ve done wrong, these moral systems don’t seem, to such an outsider, to offer any path to redemption. Certainly not one that more than a nominal portion of people could meet.

In the book, the character we are following does what I would presume almost everyone would do when offered the chance to support an alien invasion and takeover of Earth backed by zero promises of anything for themselves or the rest of humanity. He reports the situation, cooperates with authorities, infiltrates the organization and with almost no effort sets up a raid that takes down a huge portion of their membership including their leader. I strongly believe I would do the same, not because I’m an unusually pro-human or morally great person, but because that kind of is the least you can do in that spot.

When I see people claiming to be negative utilitarians, or otherwise claiming beliefs that imply that there is only an ugly kludge, high transaction costs and/or lack of initiative standing between them and omnicidal maniachood, a part of me hopes I won’t end up having to kill them some day.

To see a book that expects not just a handful but the majority of intellectuals to, when given the choice, take the opposite path, is true alien perspective stuff.

The War Against Science

If you were in command of an organization whose goal was to ensure that humanity would fall to an alien invasion fleet scheduled to arrive in four hundred years, what would you do?

Trisolaris’ strategy is almost entirely to target science, in particular particle physics.

At the rate humanity is advancing, by the time their fleet arrives, our science and technology would by default advance to the point where we could crush the invasion fleet. On the other hand, if we could be prevented from advancing our knowledge of physics, no amount of technological tinkering on the edges will make a dent. This is what we later see, as the strong-force-powered probe proves immune to everything we can throw at it.

Trisolaris uses two distinct vectors to attack science.

At first, they utilize their control of the intellectual elite to cause culture to turn the people against science and technology. This effort is portrayed as wildly successful, and the book heavily implies that today’s intellectual elite are in fact doing something akin to this, only perhaps not quite as extreme. It is not hard to see that perspective after yet another young adult dystopia.

Once the threat from Trisolaris is discovered, the jig on such strategies is up, the same way that America embraced science after Sputnik. Instead, Trisolaris relies on the sophons, which randomize the results of experiments in particle accelerators, cutting off progress in theoretical physics.

This is a very Sid Meier’s Civilization style view of how science works. To advance Science, one must fill enough beakers to discover the secret of advanced particle physics, which then allows us to fill beakers to discover the secret of strong force interactions or light speed travel. Progress is based on the sum of the science done, so ask if any given effort produces an efficient amount of science. If you cut off a link in the chain, everything stops. There’s no way around it. One cannot tinker. One cannot perform other experiments.
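To make the model being criticized concrete, here is a tiny sketch of that Civilization-style view as a prerequisite chain; the node names are my own illustrative inventions, not anything from the book.

```python
# Toy model of the "beaker" view of science: a chain of prerequisites in which
# blocking one node (say, particle physics) halts everything downstream.
# Node names are illustrative, not taken from the book.

PREREQS = {
    "particle_physics": [],
    "strong_force_materials": ["particle_physics"],
    "light_speed_travel": ["particle_physics", "strong_force_materials"],
}

def researchable(tech, blocked):
    """A technology is reachable only if it and every prerequisite are unblocked."""
    if tech in blocked:
        return False
    return all(researchable(p, blocked) for p in PREREQS[tech])

print(researchable("light_speed_travel", blocked=set()))                 # True
print(researchable("light_speed_travel", blocked={"particle_physics"}))  # False: the sophons win
```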

This is a rather poor model, in my view, of how science works. Science is not a distinct magisterium, and neither, necessarily, is particle physics.

Thus, I have my doubts that this would be sufficient to cut off such progress. There must be other ways to advance knowledge.

The existence of sophons and their particular abilities likely offers a gigantic hint. Trisolaris in general had the really terrible habit of only trying to kill the few people, or stop the few experiments, that they saw as threatening. My first thought would be, try stuff, see if they let you do it, and isn’t that an interesting result in and of itself? 

Reverse engineering and catch-up growth are so much easier than going at it from first principles!

Another effort to stop us is made via the wallbreaker program. Each of the four wallfacers, humans assigned to carry out hidden agendas to defend us that would not be revealed to the prying eyes of the sophons (which can observe anything but the contents of a human brain), is assigned one person to figure out what their plan is, then reveal it to the world and show how hopeless it is. This is partly to spread hopelessness, and partly to ensure the schemes are not in fact a threat. In the book each task is left to a single individual, which is rather silly, but it seems a good use of resources, and three of the four succeed.

Only One Man

The last thing Trisolaris does is try to kill Luo Ji. This is both the smartest and dumbest thing they do.

It is the smartest because Luo Ji is the largest threat standing in the way of their victory. Only he divines the dark forest nature of the universe (that everyone who is discovered is destroyed as a defensive measure, since life expands exponentially but resources are finite, so everyone must hide well and cleanse well) and our ability to threaten Trisolaris via broadcast of the location of their homeworld.
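The quantitative intuition behind the dark forest logic is just compound growth running into a finite galaxy. A back-of-the-envelope sketch, with entirely made-up numbers (none of them from the book):

```python
# Back-of-the-envelope dark forest arithmetic (all numbers are illustrative guesses).
# Even a leisurely expanding civilization saturates a finite galaxy on timescales
# tiny compared to cosmic history, so expanding civilizations must eventually
# compete for the same resources.

import math

star_systems_in_galaxy = 1e11   # rough order of magnitude for the Milky Way
doubling_time_years = 1000      # assume resource use doubles every millennium

doublings = math.log2(star_systems_in_galaxy)
years_to_fill_galaxy = doublings * doubling_time_years

print(f"{doublings:.0f} doublings, roughly {years_to_fill_galaxy:,.0f} years to fill the galaxy")
# ~37 doublings, on the order of 36,500 years: an eyeblink next to billions of years.
```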

Let us assume for now (premise!) that the dark forest hypothesis is known by Trisolaris to be true, and they know only Luo Ji has much chance of figuring it out. What they then do is to attempt to kill him in a way that looks like an accident. When that fails, they make him a wallfacer, and force him to work against his will, and he eventually shows that he is figuring out how the dark forest works. When he awakens, there is an old virus waiting to try and kill him, but it fails.

A probe, that definitely could kill him if it wanted to, then descends. Luo Ji is certain its first act will be to kill him.

Instead, the probe cuts off our ability to broadcast via our sun – again, showing us their hand, although this time it seems worthwhile.

But they do not kill Luo Ji.

Luo Ji is hailed as our only hope because of his casting a spell on a star, and made into a wallfacer with unlimited resources.

They do not kill Luo Ji.

Luo Ji starts to tinker with an early warning system and its details, for no obvious reason.

They do not kill Luo Ji.

Luo Ji constructs a dead man’s switch for himself, in full view of the sophons (since everything is in full view of the sophons).

They do not kill Luo Ji.

Finally, he reveals his plan to use the warning system as a method of broadcast to the universe, after it is already in place, and Trisolaris is forced to surrender. They explain that they no longer saw him as a threat, since nothing he seemed to be doing appeared meaningful, the dead man’s switch was a harmless part of the plan of another wallfacer (the terrorist, of course), and the sun had been cut off as a means of broadcast.

This is a monumentally important failure mode. One should not need the evil overlord list to avoid this. The mistake is not only made repeatedly by the humans and aliens alike, it is also made by the judgment of history (in the book), and hence likely also by the author. 

As evidence that the author does not appreciate how truly boneheaded a move this is, consider two things.

First, right after this incident Luo Ji hands control of the broadcast system to the government, which quickly realizes it would not reliably execute on the threat to expose both worlds and hands control back to Luo Ji within a day, making him the swordholder. The author notes that the failure of Trisolaris to attack during this period is considered one of the great strategic failures of history. This is despite the fact that the probability of retaliation at that time is highly uncertain, and it is most definitely not zero. Such a move is absurdly risky versus trying to reach a compromise, given the stakes, and in fact clearly would have been an error given future opportunities Trisolaris is offered.

Second, the explanation given by Trisolaris as to why they made the colossal blunder of not killing Luo Ji, which is frankly absurd. They admit failure to see what he was up to, why he was doing it or how any of it could matter.

They then cite this failure as a justification for NOT killing Luo Ji.

This is exactly backwards. This is all the more reason to kill him, and no one in the book seems to understand this.

Because space bunnies must die.

I take this saying from the title of the (quite bad) game Space Bunnies Must Die!. If your opponent is doing something that makes no sense, whose purpose you can’t understand, and for which you have no explanation of why they might do it, assume it is highly dangerous. They are the enemy. They have a reason they chose to take this action. That might be ‘they are an idiot who is flailing around and doesn’t know anything better to do.’ Maybe. Be skeptical.

Luo Ji is Earth’s space bunny. He must die. They do not kill him, and he beats them.

For a while.

Mutually Assured Destruction and the Swordholder

Luo Ji’s triumph gives Earth the ability to reveal the location of Trisolaris, dooming both worlds. He and humanity use this to free ourselves and begin to rapidly make scientific, technological and economic progress to catch up with Trisolaris.

What Trisolaris still has are the sophons. They still know everything that happens on Earth. They can’t quite see fully into human minds, but they can build increasingly accurate psychological models. Thus, one can view them as a kind of Omega, that sees all, knows all and predicts all, albeit with a substantial error rate, when it knows the parameters it is looking at.

Everything depends on the prediction they make – in the book’s words, we must ‘hold the enemy’s gaze.’ If Trisolaris thinks we won’t retaliate by revealing their location, they will attack. Whether we retaliate or not, we are doomed. If Trisolaris thinks we will retaliate, they won’t attack.

Mutually assured destruction, with a sufficiently strong predictor, is a variation of Newcomb’s Problem. The boxes are transparent, and what matters is whether you’ll take the thousand dollars if the million dollars is not there. If the predictor thinks you wouldn’t take the thousand dollars in that case, then the predictor gives you the million dollars and you never have a choice to make.
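Here is a minimal sketch of that deterrence game from the attacker’s point of view, using the roughly ten percent versus near-certain retaliation probabilities the book attributes to Cheng Xin and Wade; the payoff values themselves are invented purely for illustration.

```python
# Toy model of swordholder deterrence as the attacker (the predictor) sees it.
# Retaliation probabilities echo the book's stated estimates (~10% for Cheng Xin,
# near-certain for Wade); the payoff numbers are made up for illustration only.

WIN = 100           # attacker's payoff if the swordholder fails to retaliate
DESTRUCTION = -500  # attacker's payoff if the broadcast goes out and both worlds die
STATUS_QUO = 0      # attacker's payoff for leaving the standoff in place

def expected_value_of_attacking(p_retaliate):
    return (1 - p_retaliate) * WIN + p_retaliate * DESTRUCTION

threshold = WIN / (WIN - DESTRUCTION)  # retaliation probability above which attacking loses
print(f"attack is deterred once P(retaliate) exceeds {threshold:.0%}")

for name, p_retaliate in [("Cheng Xin", 0.10), ("Wade", 1.00)]:
    ev = expected_value_of_attacking(p_retaliate)
    decision = "attack" if ev > STATUS_QUO else "hold"
    print(f"{name}: expected value of attacking = {ev:+.0f} -> {decision}")
```

The entire design problem is to pick a swordholder whose retaliation probability, as estimated by the predictor, sits safely above that threshold.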

I have a lot to say about Newcomb problems in general, but won’t go into it here because humanity’s failures in this trilogy are so much more basic, and go so much deeper, than that. And because anything I bury in here risks being quite lost in something this long.

This variation is a very, very easy form of Newcomb, for two reasons.

The first is that you know the rules in advance, and get the resources of all of civilization to choose who plays. Must be nice.

The second is that the choice you precommit to gets you revenge on those who decided to destroy us, so it’s not asking for something super weird and unnatural, or even something that’s necessarily worse going forward than the alternative. Many a dedicated two-boxer would still make an effective swordholder – even if you get problems like this wrong, you have additional ways to get this particular version right.

Humanity in the book chooses someone who is not, in Wade’s terminology, willing to sell their mother to a whorehouse. It not only does that; it also chooses someone who has not shown an understanding of the importance of commitment. It chooses someone whose temperament clearly does not value revenge.

Electing Cheng Xin to be Swordholder is not simply a failure of game theory.

It can only be described as a suicide pact.

When Cheng Xin chooses whether or not to retaliate, and chooses not to, she does so out of a ‘motherly instinct.’ Out of a slim hope that it will turn out all right, somehow.

It can come as no surprise, to anyone, that Trisolaris attacks the second she is put in charge.

To emphasize this, at the same time Trisolaris is launching their attack, the humans arrest Luo Ji, on suspicion of genocide, for exactly the actions that saved humanity. Because when he destroyed a random uninhabited star to test the Dark Forest theory that allowed us to survive, there might, in theory, have been intelligent life there.

Luo Ji’s reward, for literally saving the world, is arrest, and would then (except for the attack) have been a trial.

Humanity decides, as it decides at several other points, that it no longer desires to survive. Humans repeatedly punish and strike down those who value humanity’s survival and make sacrifices to that end. Anyone who does what must be done, in order to let us survive, is a horrible criminal who has committed crimes against humanity, even when they fully succeed and save all of us. 

Humans repeatedly choose superficially ‘nice’ values incompatible with the survival of the species, or later with the survival of all life on and from Earth (take that, first-book environmentalists!), over any hope of actual survival. They do this after knowing that those out there in the dark forest are, as a whole, far, far worse. 

Even when they aren’t getting themselves killed, they’re sustaining a much lower than replacement-level birth rate. By the time the end comes, humanity is under one billion people entirely due to low birth rates.

Something, the most important thing, has gone horribly wrong. Values that improve the world have morphed into crazy lost purposes and signaling games. Everything worthwhile is lost.

It is hard not to read this as a critique of liberal values. The book explicitly endorses cyclical theories of history, where prosperity contains the seeds of its own destruction. The differences this time, with higher technology and stakes, are in the consequences when it happens. Good times have created weak men, who have forgotten what it takes to create and maintain good times, and those weak men tear down that which sustains civilization, and gives its people the ability to reproduce themselves and their values and culture into future generations, without stopping to ask if they are doing so.

Then those who accept the dark forest, who know to hide well and cleanse well, inherit the universe.

As I noted earlier, this is fundamentally a deeply, deeply conservative and cynical set of books.

The Great Ravine

Or, You Fail Economics Forever, all of humanity edition, even more than usual. Oh, boy.

Humanity knows they face a doomsday battle four hundred years in the future. So everyone assumes, correctly, that humanity will begin rationing goods so we can all go on a war footing and build ships to fight the doomsday battle.

I was thrilled when I later learned that this caused the world economy and environment to collapse. After which, we stopped trying to prepare for the war and focused on improving life. After which we rapidly recovered, developed better technology, started repairing the Earth, then incidentally decided to create a vastly superior fleet to the one we ruined ourselves trying to build.

Sounds about right.

A lot of the mistakes humanity makes in the book feel like mistakes we wouldn’t make.

This one feels like yes, we are exactly this stupid.

We totally, totally would forget that human flourishing and development requires play, requires caring about things other than The Cause. We totally would ignore the value of compound interest and economic growth and technological development, and of the unexpected. We’d abandon our freedoms and curiosity and everything of value, and embrace central planning and authoritarianism and technocrats, aimed at creating the symbolic representation of the symbolic representation of a space fleet. We totally would treat those who pointed such things out as traitors and shame them into silence.

Compound interest is not the most powerful force in the universe. But it is on the list. Tyler Cowen’s position, that we should almost entirely focus on maximizing economic growth subject to the continuity and stability of civilization, has some strong objections one might make. But all those strong objections come down to one of two things. Either they argue we should value the present more than the future (for various reasons), or that economic growth is the wrong thing to be measuring, and there is a different mechanism at work here that we should be thinking about in a different way.

Neither of those applies here, at all. We sacrificed the present for nothing.
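To put rough numbers on what was sacrificed (the growth rates here are my own illustrative picks, not anything from the book):

```python
# Back-of-the-envelope compounding over the four-century timeline (rates are illustrative).
# Even modest sustained growth dwarfs anything a crash war-footing program buys up front.

for annual_growth in (0.01, 0.02, 0.03):
    factor = (1 + annual_growth) ** 400
    print(f"{annual_growth:.0%} per year for 400 years -> economy about {factor:,.0f}x larger")

# Roughly 54x at 1%, 2,750x at 2%, and 136,000x at 3%.
```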

The obvious defense is that basic physics research was blocked, so there’s no point in waiting. That’s quite silly. Even with basic research stalled, technology is a whole different animal, as we see later on. And even if technology was fixed, there are plenty of other ways to improve over centuries.

Note that Trisolaris does not make this mistake. They pay zero mind to any war preparations. All their sabotage goes towards stopping basic scientific research. This points to the opposite mistake, of thinking that the practical stuff one might accomplish doesn’t matter. In this case, they have somewhat of a point, given what the probes can do, but I would not want to rely on that.

You can’t run a world-wide death march for four hundred years, as much as mood affiliation might demand it.

A good question is, if you were in charge of doomsday battle planning, when would it make sense to start preparing, how much, in what ways?

Year one, I would do my best to counter Trisolaris’ efforts, and devote as much effort as was productive to scientific research on all fronts at all levels, and to creating a culture where that had value. I’d work to create norms allowing us to run costly experiments of all kinds that we currently don’t consider remotely ethical. I would do the same for space exploration and colonization, but without any mind to combat. I’d invest in long term economic growth in all its forms.

And of course, I’d do everything I could to build a true AGI. Trisolaris is evidence that the problem is hard, but the bottleneck is unlikely to be basic physics (unless it’s a pure shortage of compute, but that still doesn’t seem like a basic physics concern) and Trisolaris clearly isn’t that far ahead of us – it feels the need to sabotage our efforts. So there’s definitely a chance (although presumably, given what we learn later, this would not have worked).

I’d also double down on intelligence augmentation and genetic engineering. Neither gets even a mention in the book, at all. But this is the obvious way to go! You have four centuries to get your act together. It turned out we had somewhat less, but we still had several centuries. That’s plenty of time for a very very slow takeoff. Plenty of time for getting smarter about getting smarter about getting smarter.

As a bonus, brains are the one thing Trisolaris can’t see with the sophons, so the more people can keep in their heads because their heads are better, the better our ability to avoid countermeasures.

Oh, and the moment I could I’d get as many people as possible the hell out of the system, on seed ships, in case we lose. Even if it didn’t work, trying to do this seems likely to bear very good fruits. But there’s a whole section on that later.

Around one hundred years from the target, I’d consider starting to encourage martial traditions and space combat, and study how to train effective fighters from birth.

Within the last fifty years I’d actually build my ships or other weapons, and start getting ready in earnest. I’d go full war footing for the last five or ten, tops.

I’d also be fully on board with the Wallfacer program, but would like to think I would at least not choose an actual suicide-bomber-loving terrorist to be one of them, and ideally wait to invest mostly in enhanced people yet to be born. Plus some other obvious improvements. I’ll stop there to avoid going too far down the rabbit hole.

The Fairy Tale

The fairy tale, in which all we need to know about the universe and its inhabitants is conveyed through multiple metaphoric levels, has epic levels of cool. It is constructed meticulously. It is definitely not easy to unravel.

There are even parts of the story that the book doesn’t bother explaining, but which seem to still have important meaning. Consider the spinning umbrella, which protects you from being turned into a picture. What is it? I have a guess (continuous use of light speed engines to darken matter), now that I know how the universe works, but it doesn’t quite fit, and it seems really important. There are other similar things that we ‘should’ have known ‘had to mean something important’ but then there are others that also seem that way, that (probably?) didn’t mean anything.

I got a little of what was going on when reading the tale, but missed the bulk of it. It felt like a fair puzzle in some places I missed, less fair in others. Which is, itself, quite fair.

Being Genre Savvy is hard, yo. The book’s mocking of people as unable to presume or unravel multiple levels of metaphor seems like what one says when they know the answer to their own riddle, and can’t figure out why literally no one else can figure it out.

If you start without knowledge of light speed travel, or how to darken matter, or that the number of dimensions can be reduced, making those leaps is hard.

It doesn’t seem hard on the level of an entire civilization’s experts working for dozens of years. On that level it seems easy. But if everyone got the same prompts and frames, and the same wrong ideas, it makes sense. We have long histories of everyone on Earth missing potential inventions or ideas for long periods of time, if they require multiple leaps to get there. Asking people to unravel multiple metaphorical layers is tough, especially if your poets aren’t physicists, and your physicists aren’t poets.

The book doesn’t mention a worldwide program to teach the physicists poetry, or the poets physics. People in this book stick to inside the box, and take human capacity as fixed.

That is one of the big hidden assumptions and messages of the book. People change with the times, and technology can advance, but people, and intelligence, fundamentally don’t change and don’t improve, even among aliens give or take their hat. Quirks and mistakes aside, there is mostly a single way of thinking, a single way of viewing the universe, a single set of basic values, and all intelligent life is roughly the same. No one can take over or have good probe coverage, no one creates workable systems of safety and control, and no one solves game theory. It’s weird. To some extent, I get that not doing this is hard and forces focus on things the book doesn’t want to be about, the same way that you are mostly forced to pretend AIs don’t work the ways they obviously do if you want to write science fiction that says interesting things about people. Still, it’s quite jarring.

The Trisolarians are dense throughout the books. They start out not understanding the idea of deception, which I gotta say is really weird even if they can read each other’s thoughts. If nothing else, they have long distance communication, no? And abstract reason? These are problems they do get over through exposure to Earth, but they still repeatedly make the same fundamental mistake.

If Trisolarians can’t see the explicit direct path by which an action leads to something they don’t like, they treat the action as harmless. And they don’t ask why people do things. They have no concept of the space bunny. And here, in our last encounter, we see the ultimate version of that. So much so, that they let the conversation go overtime, because they want to let them finish telling harmless fairy tales.

As opposed to, they’re burning their invaluable communication, this one-time opportunity, on frigging fairy tales, something MUST be fishy here.

Once you ask, are these tales fishy, if you know about dimensional warfare, I really really don’t see how you miss it. It’s one thing to not solve the riddle when you don’t know the answer. But if you know the answer, the riddle is transparent. I have no idea how it wasn’t obvious to every reader on Trisolaris, or at least each one who knew physics, what was going on.

But once again, that’s probably the ‘knows-the-answer-already’ talking.

Consider: I think it was a month ago I learned what The Safety Dance was about. My parents sang me Puff the Magic Dragon.

I mean, come on.

The Great Escape

If there is one thing mankind can agree upon throughout the trilogy, it is that the worst possible crime is escapism.

Escapism is the most dangerous crime of noticing that the solar system is doomed and trying to get the hell out as quickly as possible.

You see, as soon as we learn we are doomed, several characters point out that figuring out how to build spaceships that can leave Earth is the easy part. The impossible part is deciding who gets to go. And if we can’t all go, then none of us should go, and those who try are the enemies of humanity.

Later on this expands to any form of study of interstellar travel. You see, the Earth is doomed, so learning how to leave it would be just awful because it would be unequal. Or, in some arguments, because it would distract from solving our problems, or would cause unfair hope. 

We can all agree this is a horrible, no-good way of looking at things. The last thing those on the Titanic should do, upon learning that there are not enough lifeboats, would be to sink the Titanic with all hands. We can disagree about whether it should be ladies and children first or whether class of ticket should matter or what not, but at a minimum you run a lottery or a mad race for the boats or something.

If you actually disagree with that, if you think that it is a good thing to notice that some people, somewhere, might not be doomed and make sure that every last one of us is doomed unless we can un-doom all of us, then, seriously, stop reading this blog because I kind of want you to die in a fire. It would be for the best.

When we say that while there is one slave in the world none of us are free, that is not an instruction to enslave everyone else until the problem is solved. Injustice for one is injustice for all because it is bad and sets bad incentives and bad values, not because it means we should then visit injustice upon everyone.

I am belaboring this point because I see people making arguments of this type, all the time. People who actually, seriously think that the way to finish “you have two cows” is “but someone else, somewhere, doesn’t have any cows, so we kill your cows so we can all be equal.” If we all can’t live forever, no one can. The evil eye. The enemy of progress, and of humanity, and of life itself in all its forms.

No examples, since this blog has a strict no-someone-I-do-not-respect-is-wrong-on-the-internet policy. You know who you are.

Given this book claims that this is humanity’s basic mode of thinking, this seems like a good time to say, once and for all, this is a profoundly evil, not-even-wrong mode of thinking and when faced with it, I can’t even.

So, yeah.

I like to think the book is profoundly wrong about humanity, here. I like to think this is not mankind’s default mode of thinking. I like to think that this is only what a few fanatical and crazy people think, and they are loud and obnoxious and have platforms. I like to think that people talk a good game sometimes in order to turn things to their advantage, but they don’t actually want to burn everything down so no one will have an unfairly large house, whether or not they could be king of the ashes.

I like to think that while we would not resolve this problem well, exactly, humanity would survive the death of the solar system if it had plenty of time and the technology, resources and capability to build generational starships and get some of us out of town.

If we couldn’t, we wouldn’t deserve to survive, and I would not mourn for us.

I greatly appreciated that the key to defending ourselves, to turning our system into dark matter that cannot threaten others, is light-speed travel. Thus, by refusing to research light-speed travel, we cut ourselves off from discovering that the damage such travel causes to space is what would allow us to protect ourselves. This felt profoundly fair, given the way related technologies work in the book’s universe.

As Isaac Asimov said, there is but a single light of science. To brighten it anywhere is to brighten it everywhere. It is very hard to predict what one will learn by exploring. Refusing to look in a place where you don’t understand how something works, to not ask questions because you don’t like what might happen if you found the answers, is likely to end up cutting off the things that matter most.

The book does offer an interesting implicit question, near the end. Once we develop the technology to turn our solar system into dark matter, if we get it in time, should we do it? Should we shroud our world in permanent isolation, enjoy some good years and then go meekly into that good night with the death of the sun?

I would expect us to shroud, but would proudly vote against it.

The Dark Game Theoretic Forest

In The Dark Knight (minor spoiler), there is a scene where two boats, one with convicts and one with ordinary citizens, are each given a button the Joker says will blow up the other boat, after which the Joker promises the remaining boat will be free to leave. Thus, because the convicts might push the button, the citizens might want to do so first, and thus the convicts might press it even if they didn’t want to, and so on, starting from either side. Neither side considers their decision obvious.

This book thinks the buttons get pressed. That this is how the universe works, in the end. You can’t fully trust anyone not to blow you up, so you don’t give them your location, and when possible you blow everyone else up first. There is no thought to negotiation, cooperation or peace. Only war. Only the dark forest. Hide well, cleanse well.

In this perspective, game theory offers us no hope. No way out. Cooperation is impossible even in a wide range of situations that people like me consider mostly solved problems.
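
To make “mostly solved” concrete, here is a minimal sketch of the standard argument, mine rather than anything from the book or this post: in a one-shot prisoner’s dilemma the button does get pressed, but once interactions repeat and defection can be punished, mutual cooperation becomes an equilibrium. The payoff labels and the grim-trigger threshold below are the textbook ones, used purely for illustration.

```python
# Illustrative sketch only: standard prisoner's dilemma payoffs, T > R > P > S.
T, R, P, S = 5, 3, 1, 0  # temptation, reward, punishment, sucker's payoff


def one_shot_best_response(opponent_cooperates: bool) -> str:
    """In a single anonymous encounter, defecting dominates either way."""
    payoff_if_cooperate = R if opponent_cooperates else S
    payoff_if_defect = T if opponent_cooperates else P
    return "defect" if payoff_if_defect > payoff_if_cooperate else "cooperate"


def cooperation_sustainable(delta: float) -> bool:
    """With repeated play and grim-trigger punishment, mutual cooperation is an
    equilibrium when the shadow of the future is long enough:
    delta >= (T - R) / (T - P)."""
    return delta >= (T - R) / (T - P)


print(one_shot_best_response(True), one_shot_best_response(False))  # defect defect
print(cooperation_sustainable(0.2))  # False: short horizon, the button gets pressed
print(cooperation_sustainable(0.9))  # True: long horizon, cooperation can be stable
```

The dark forest result depends on every encounter being effectively one-shot and anonymous, with no way to verify commitments; the disagreement is over whether that assumption has to hold for civilizations that can, in fact, communicate.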

The price? Six dimensions of the universe. Minimum. Almost all of the universe’s value, then almost all of what is left. Six times over. The weapons used to contain threats (where ‘threat’ is simply ‘intelligent life, anywhere’) are so toxic that they wipe out the laws of physics, the dimensions themselves. The world has already almost entirely ended, long before humans arrive on the scene.

Life is the point of the universe. Everyone, even in-book, agrees. Despite this, it is so impossible to work something out, so impossible to build a system of incentives or deterrence or cooperation, that life’s primary purposes are to hide and to hunt all other life.

The universe’s inhabitants need better decision theory. Perhaps they should start here. Or maybe try the explanations here, instead. At some point I should give a shot to writing a better explainer.

What is most weird is that all of this applies back on Earth. Life still expands without limit. Resources remain finite. Offense is far cheaper than defense. Why don’t we inhabit the dark forest? Mostly we neither hide well nor cleanse well, and we are worse at both with each passing year. Yes, in some ways we have it easier. In other ways, we have it harder.

Late in the third book, it is revealed that it isn’t quite all hiding and hunting. There are at least some forms of communication between various aliens, and even some dreams of cooperation to prevent the heat death of the universe. Everyone is basically similar and understands each other; as I noted above, the aliens seem more alike to us in their thinking than the book’s model of humans is to mine, or than its Chinese are to its Americans. We really, really should all be able to get along, starting with a ban on weapons of dimensional destruction.

Is an upstart civilization with primitive broadcasting technology really even more dangerous than wiping out the third dimension at their home world, a collapse that keeps expanding with an escape velocity of light speed?

It doesn’t even work right. If they have light speed travel, their ships get away and try again. If they don’t, it seems like quite the overkill.

Another question that is never resolved, that I can recall, is why Trisolaris never sent probes to the stars closest to it. We know why they did not send sophons; sophons are each a huge economic cost. But why not send an ordinary probe that would give data on inhabitable worlds? Or at least, one that would give data on where one could safely park one’s ships while having a ready supply of fuel, the way humanity later puts its city-ships around Jupiter?

Humans, with technology clearly still vastly inferior to that of Trisolaris, find it not that burdensome to pack up everyone, within a hundred years, and move them into space cities around Jupiter. If that’s a viable solution, how hard is it to find a target world with a gas giant and only one or two stars (so it won’t suffer from the three body problem and be unstable)? Given the attrition rate they faced from space travel – the majority of the original invasion fleet doesn’t make it to Sol – wouldn’t it make the most sense to have already sent colony ships to Sol simply because it is a single star with large gas giants, without even needing to know Earth has life?

Instead, Trisolaris seems to be waiting around for a communication that will reveal someone’s location, despite having no expectation of that being nearby, or it being someone they can defeat militarily, or of anyone being stupid enough to do that, since in this universe everyone is hiding all the time and Trisolaris knows why they do this.

It doesn’t seem worthwhile for them to return our broadcasts, either. Yes, they can attempt to get us to stall our development, but it reveals the location of Trisolaris to us, which means we can destroy them, and will have that capacity for over a century. We could figure that out at any time. And we might do that accidentally, since we’re too stupid to realize that broadcasts of locations lead to destroyed stars.

Along similar lines, if I were a civilization that felt all other civilizations were a mortal threat to me, to the extent that most of the universe had become dark matter and we’d lost most of our dimensions to wars, you’d better believe I’d at least have a probe on every world checking for signs of developing life, so I could handle it with weapons that wouldn’t wipe out my supply of usable matter. And if resources are so precious, I wouldn’t be letting most of them lie around going to waste.

The book shows us that the aliens who wipe out Earth are actually getting wiped out themselves in a civil war with their own colony, about to be forced to retreat into lower dimensions (why they cannot go dark instead is not explained). So in this universe, it seems, aliens do not expand to use all the resources because they are so afraid of someone else trying to use all the resources that creating the means to do so would mean inevitable conflicts that kill you. So most of the universe, even the useful parts, sits idle or shrouded, because no one can trust even their own kind not to turn on them, simply because someone at some point might turn on someone, so everyone turns on everyone.

But (mumble mumble von Neumann probes mumble mumble) that’s enough before I get distracted by going into all the plot holes and missed opportunities. Let’s wrap this up.

Conclusion and Science Fiction’s Fall

This has been by far this blog’s longest post to date, and still feels like it could have been far longer. The trilogy was full of big ideas, and theories and predictions about humanity. I disagree with many, and agree with many others, and find items from both groups interesting. It is good to see an author unafraid to say explicitly many things that others are afraid of and unwilling to express, regardless of whether I ultimately agree with the hypothesis.

What I found strangest about these books is the degree to which it is possible to utterly miss the point. And I wonder what that says about what has happened to science fiction.

I can’t find the link, but I read a review of these books, from a clearly not-stupid person who had read the books, that praised them as offering a positive vision of humanity’s future.

On a very superficial level, yes, this is a positive vision of humanity’s future, in the sense that humanity gains higher technology and improves its standard of living, and then in that it manages to escape the solar system and create a lasting civilization among the stars, despite itself and its repeated attempts to prevent itself from surviving. But yes, in the end this does happen.

On any other level, this is utterly insane. The books despair for humanity and its future. We survive because of repeated narrative intervention, and the actions of a brave few against the will of the many to die. The world betrays itself, repeatedly. Man does not cooperate with man, and the universe is revealed as a place where one being cannot cooperate with another, or even refrain from killing it when given the opportunity. 99.9999% of humanity is wiped out. The rest emerge into a dark forest war of all against all, where none dare reveal their location, and civilizations wage war by reducing the dimensions of the universe using weapons that have destroyed almost all value in the universe six times already and are rapidly working on a seventh time with what is left. With no way out.

That big crunch that hopes to reset the universe, if only its only safe and effectively immortal residents all sacrifice themselves? I have a hard time believing it has much chance of happening, even in my model of how things work. In this book’s universe? Not a chance.

If that’s an optimistic, positive science fiction with a positive view of the future, then what the hell is the pessimistic viewpoint?

Eliezer talks about how he was raised on old school science fiction, like Isaac Asimov and Arthur C. Clarke. His parents were careful to stock his bookshelves with only the older stuff, because newer stuff did not have the same optimistic view of humanity, and belief in science and technology as great things, and in the universe as capable of great things. Those writers ‘after the divide’ instead, in this view, use technology mainly as a way to think about what is wrong with it, and what is terrible about its consequences, and what is terrible about people. Such work does not teach us to reach for the stars.

Surely there are exceptions. But there’s no denying that the general vibe of technology being assumed to be great has somehow given way to the general vibe even in most science fiction that at best, wherever you go, there you are, and more likely that technology leads to young adult dystopia or worse. Inequality in the future is always through the roof. That roof is reliably underwater, due to climate change, which often renders Earth uninhabitable. Even if technology and our heroes ultimately save the day, the future is a mostly terrible place where we pay for our sins, and by our sins we mean our civilization and its science and technology.

Compared to that, these books present an optimistic paradise. But what kind of standard is that?

15 comments

Comments sorted by top scores.

comment by Viliam · 2019-01-31T23:45:45.879Z · LW(p) · GW(p)

I have read the trilogy, I enjoyed it a lot, and I have only two objections: the happy ending, and the lack of serious effort to kill Luo Ji. The latter is especially weird coming from aliens who would have no problem with exterminating half of the human population.

My impression is that the Three Body trilogy is essentially a universe-sized meditation on Moloch.

I am however completely surprised at your indignation at how the book depicts humans. Because I find it quite plausible, at least the parts about how "no good deed goes unpunished". Do we live in so different bubbles?

I see politicians gaining votes for populism, and losing votes for solving difficult problems. I see clickbait making tons of money, and scientists desperately fighting for funding. There was a guy who landed a rocket on a comet, or something like that, and then a mob of internet assholes brought him to tears because he had a tacky shirt. There are scientists who write books explaining psychometric research, and end up physically attacked and called Nazis. With humans like this, what is so implausible about a person who would literally save humanity from annihilation, being sentenced to death? Just imagine that it brings ad clicks or votes from idiots or whatever is the mob currency of the future, and that's all the incentive you need for this to happen.

As the beginning of the trilogy shows, we do not need to imagine a fictionally evil or fictionally stupid humanity to accomplish this. We just need to imagine exactly the same humanity that brought us the wonders of Nazism and Communism. The bell curve where the people on one end wear Che shirts and cry "but socialism has never been tried", and on the other end we have Noam "Pol Pot did nothing wrong" Chomsky in academia. Do you feel safe living on the same planet as these people? Do you trust them to handle the future x-threats in a sane way? I definitely don't.

The unrealistic part perhaps is that these future (realistically stupid and evil) people are too consistent, and have things too much under control. I would expect more randomness, e.g. one person who saves the world would be executed, but another would be celebrated, for some completely random reason unrelated to saving the world. Also, I would expect that despite making the suicide pact the official policy of the humankind, some sufficiently powerful people would prepare an exit for themselves anyway. (But maybe the future has better surveillance which makes going against the official policy impossible.)

comment by Raemon · 2019-01-30T21:23:16.993Z · LW(p) · GW(p)

For people who manage to read this comment without having any context of what the Three Body Problem is about, I do recommend reading it without any context (without even reading the summary on the back of the book, which contains bizarrely extreme spoilers IMO).

(I think the book is best read without even knowing what genre it's supposed to be, and ideally without knowing there are spoilers to be had, which is probably a lost cause at this point)

comment by dvasya · 2019-01-30T04:54:20.090Z · LW(p) · GW(p)

The books are marketed as "hard" sci-fi but it seems all the "science" (at least in the first book, didn't read the others) is just mountains of mysticism constructed around statements that can sound "deep" on some superficial level but aren't at all mysterious, like "three-body systems interacting via central forces are generally unstable" or "you can encode some information into the quantum state of a particle" (yet of course they do contain nuance that's completely lost on the author, such as "what if two of the particles are heavy and much closer to each other than to the third?", or "which basis do you want to measure the state of your particle in?"). Compare to the Puppeteers' homeworld from the Ringworld series (yes, cheesy, but still...)

Replies from: rossry, frontier64
comment by rossry · 2019-01-30T14:24:58.273Z · LW(p) · GW(p)

Huh. I don't think I ever heard someone call this series hard sci-fi where I could hear them; the most common recommendation was related to its Chineseness, which, as Zvi claims, definitely delivers.

And I'm not sure I'd take Niven as the archetype of truly hard sci-fi; have you ever tried Egan? Diaspora says sensible things about philosophy of mind for emulated, branching AIs with a plot arc where the power laws of a 5+1-dimensional universe become relevant, and Clockwork Rocket invents alternate laws of special relativity incidentally to a story involving truly creative alt-biology...

comment by frontier64 · 2020-07-26T23:22:45.063Z · LW(p) · GW(p)

Your criticism that Alpha Centauri isn't actually a three-body system and instead operates as a binary star system with another nearby star is palpably anal retentive. Liu takes a small liberty to create a difference between his fictional world and the actual world that's still clearly well within the laws of physics. That difference creates a cool situation wherein a tough problem in physics serves as the backdrop for an alien situation. He describes the three-body problem accurately and doesn't just use it as window-dressing. Yet you fault him for this smart inclusion. Poor take.

comment by Raemon · 2019-01-30T01:29:21.290Z · LW(p) · GW(p)

I've read the first two books. I didn't get around to reading the third and heard it was worse than the first two books.

I skimmed over your plot summary, then read the review up until the Fairy Tale section.

If I enjoyed the first two books pretty well, would you recommend reading the third?

comment by habryka (habryka4) · 2019-01-30T19:53:28.074Z · LW(p) · GW(p)

Edit note: I made some lines headings to make it play nicer with the Table of Contents. Happy to change it back if you are bothered by that.

comment by Richard_Ngo (ricraz) · 2019-01-30T18:58:44.635Z · LW(p) · GW(p)

There are some interesting insights about the overall viewpoint behind this book, but gosh the tone of this post is vicious. I totally understand frustration with stupidity in fiction, and I've written such screeds in my time too. But I think it's well worth moderating the impulse to do so in cases like this where the characters whose absolute stupidity you're bemoaning map onto the outgroup in so many ways.

comment by james_t · 2019-01-30T18:44:00.795Z · LW(p) · GW(p)

Nice review, I enjoyed it. I read the books a while ago and it was good to see I'm not alone in seeing it as deeply conservative. As far as that goes, I wondered how much of that is sort of a general Chinese attitude vs. a non-Chinese attitude, and how much of it is unique to the author.

One thing that keeps bothering me about the book is I can't make sense of Wade.

Wade was the ideal swordholder, because he could stick to commitments. Is he supposed to be absolutely bound by them, though, and is that why he inexplicably obeys Cheng, because he agreed to in the past? That's a coherent notion of character, but it hardly feels explicable; it makes sense if Wade is an AI, but not really as a human. Or at least that's how I felt.

Replies from: Inst
comment by Inst · 2019-02-03T10:09:28.422Z · LW(p) · GW(p)

I think a lot of this is because it's Chinese. Liu Cixin (LCX) writes in an essay about how he felt that, aside from the Holocaust, the Cultural Revolution was the only thing that could make people completely lose hope in humanity.

For the criticism Zvi brings up, the book is written by someone who is well-read and is familiar with history. For instance, the climactic battle wherein the massed human fleet is wiped out by a single Trisolarian attack craft? It's been done before; the Battle of Tumu in Ming history involved an inexperienced Emperor, under the control of an utterly incompetent eunuch, leading an army to fight the Mongols in the steppes and getting 200,000 soldiers killed within two weeks as they ran out of food and water. There's also a battle in the Chinese Warring States period wherein subterfuge by the enemy got an incompetent commander, Zhao Kuo, put in charge; he changed from a Fabian strategy to a direct attack strategy and got 200,000 to 300,000 Zhao State soldiers wiped out by the Qin State, and unlike Rome after Cannae, Zhao never recovered.

For more non-Chinese examples, a close examination of Empire of Japan policy before and during World War II betrays rampant incompetence and what really amounted to a headless chicken that didn't know when to bide its time. Yamamoto at the Battle of Midway charged in not knowing his codes were broken and utterly underestimating the Americans. Or we could point to World War I, called the First European Civil War by some leftist historians, which severely weakened European civilization as a generation of young men was massacred in the trenches.

As for Wade, it's the non-Western thing that comes to mind. When the Ming Dynasty fell, many former government officials sought not to eat grain grown in the succeeding Qing Dynasty, not because they felt their resistance would be successful, but because of a radical deontologism. What this resulted in was that once they ran through their stockpiles of food, they'd literally starve to death in protest, and I emphasize that this was a "meaningless" protest with no positive consequences; it did nothing to the new Qing Empire. You have to recall that people in the Confucian bloc, while often-times extreme consequentialists, are also insane deontologists; think General Nogi following his Emperor in death.

That is to say, I don't find the Chinese characters flat, given how Chinese people behave. Wade is not a believable American, but he's reasonable within a Chinese context.

comment by ryan_b · 2019-01-30T17:53:38.220Z · LW(p) · GW(p)

Read the whole thing, definitely worth trading spoilers for the commentary.

I'm persistently confused by the expectation that science fiction be optimistic. It seems like there is a lot of space for exploring the failure modes of complex systems, and now the public at large has achieved a sense of this being an important type of problem, so I find it unsurprising people consume a lot of fiction which validates this sense.

Wanted: a way to show our heroes contributing to the systemic solution of a complex problem, in a way which forms a satisfying narrative.

comment by Inst · 2019-02-03T13:22:20.128Z · LW(p) · GW(p)

I'll also point out that in Three Body, true AI requires quantum research; it's a hand-waving thing that Liu Cixin does to prevent the formation of an AI society. In any case, it wouldn't necessarily help; if the humans can do AI, so can the Trisolarians; for all we know, they're already a post-singularity society that can out-AI humans given their capability for sub-atomic computing.

The fun is watching human or Trisolarian nature, not AI systems playing perfect play games against each other.

comment by Moustakas42 (moustakas42) · 2019-02-01T08:42:04.368Z · LW(p) · GW(p)

Thank you for a sober and much-needed review. I read the books last year looking for the same things, and came up with similar takeaways. I was awestruck by the story's scope and weird ideas, even though 1) you could see everything hinged on strawmanning (American?) liberalism, 2) the grand game theory games were oversold, and 3) the character work is shoddy and full of backwards gender stuff. I was more than happy to read it as, as someone else pointed out, a mega-scale meditation on Moloch, even though by the end it has stooped to literally sighing a fatherly "you have too rosy a view of things" at us. I think that, despite the happy(-ish) ending and the hard sci-fi, Mr. Liu reveals himself to be a kind of modern Lovecraftian, one who acutely describes a universe that's at best indifferent and often wants to kill you, though even better in the sense that all we have for Great Old Ones are each other. His pessimism does not reach Landian dead-ends, however, as the ending shows. It's good to know, in the end, that there is a light of mutually assisted survival even at the end of ten-million-year-long and perhaps essentially Chinese tunnels.

Replies from: Inst
comment by Inst · 2019-02-03T10:44:38.017Z · LW(p) · GW(p)

I wouldn't call Liu Cixin (LCX) a Lovecraftian. Take the New Yorker interview.

" I believe science and technology can bring us a bright future, but the journey to achieve it will be filled with difficulties and exact a price from us. Some of these obstacles and costs will be quite terrible, but in the end we will land on the sunlit further shore. Let me quote the Chinese poet Xu Zhimo from the beginning of the last century, who, after a trip to the Soviet Union, said, 'Over there, they believe in the existence of Heaven, but there is a sea of blood that lies between Heaven and Hell, and they’ve decided to cross the sea.' "

Liu Cixin's worldview is closer to Camus, i.e., the world is the Absurd, something intrinsically inimical to us; the laws of thermodynamics apply and evolution has created sentient organisms that are capable of suffering. And like Camus, while he's pessimistic about the state of the world and our odds of changing it, he sees something noble in our struggle against it. It's not going to be Disney or Hollywood, insofar as the hero or heroine achieves their goals and gains without much loss; in "The Village Schoolteacher", for instance, the defense technology of the invaders is overwhelmed by large-scale suicide attacks.

comment by frontier64 · 2020-07-26T23:33:11.461Z · LW(p) · GW(p)

The Three Body Problem doesn’t say that environmentalism leads to a desire for the world to be conquered by Trisolaris. Liu explains that environmentalism is often a symptom of a deep hatred for humanity. So rather than sharing a link of direct causation, environmentalism and Trisolaran worship are both correlated with an inner hatred for humanity. Many of these environmentalist intellectuals come from a place of hatred towards humanity rather than love for the natural world. It is hard to consciously acknowledge one’s hatred for humanity and it’s next to impossible to hold that view publicly. So these hateful people attach themselves to the ideology that most clearly aligns with their inner desires: environmentalism.

At its most pure, environmentalism really is misaligned with humanity. The concept of environmentalism itself necessitates that it is incongruous with humanism. There are paths that improve the human race, and paths that improve the environment. If these paths are co-aligned then there is no need for separate terms and there would be no conflicts between environmentalists and humanists. But because there are conflicts the paths must be separate. What is best for humans cannot in all cases be best for the environment and vice-versa. An environmentalist is someone who comes to a fork in the road and always chooses the path which is best for the environment to the inevitable detriment of humanity.

If you put some thought into it (or you could say, take a cynical view) you can see then that environmentalism is for many a front for their inner hatred of humanity. Therefore when a new opportunity comes to support a powerful group with the explicit goal of destroying humanity, the ETO, well these environmentalist intellectuals just can’t pass it up. It’s the same way the black nationalists used to ally themselves with civil rights groups because being an explicit black nationalist wasn’t tenable at the time. Now that their true ideology has a place in the world these people have splintered off their previous host ideology and become more true to themselves and their real goals.

You also misunderstand the point of the Dark Forest theory. You misunderstand it most explicitly here,

What is most weird is, all of this applies back on Earth. Life still expands without limit. Resources remain finite. Offense is far cheaper than defense. Why don’t we inhabit the dark forest?

You have completely neglected two of the central points of Dark Forest theory: that the cosmic civilizations in question have completely separate cultures, and that there is an enormous number of civilizations in the galaxy. At the very least, the level of cross-cultural communication required to sufficiently allay the fears of both sides is nigh impossible between alien civilizations. Earth doesn’t have the issue of cross-cultural communication because, compared to cultures from an alien civilization, the Earth has just one culture. The chain of suspicion stretches long, and when your only method of communication requires rough translation it is very easy to get stuck at a point where Civilization A thinks there’s a 25% chance that Civilization B thinks that Civilization A is going to launch a preemptive strike against them. Civ A and Civ B are in a continuous prisoner’s dilemma where if one chooses defect the other faces absolute annihilation.

The second point you forget is that it takes just one malevolent civilization out of the millions in the galaxy to ensure enforcement of the dark forest. Even if out of a million civilizations, 990,000 will disregard coordinates for a dark forest strike, even if 9,990 of the rest only respond with subtle probing, all it takes is one out of those ten remaining trigger-happy civilizations to destroy a whole star system. We never reached near that number of distinct civilizations on Earth.

Lastly, while I can see how you would dislike Liu's take on the ideology of equality, you haven't provided anything to dispute it. Many people in the modern age explicitly endorse the goal of harming the better off to promote equality in the world. Some of the most common scales used today to judge the overall well-being of different nations focus entirely on economic and social inequality. Liu takes this liberal perspective to its extreme, and quite accurately at that. Escapism is the most extreme form of inequality, and any liberal of the modern day would detest the very idea of the rich and powerful getting to flee a burning Earth to live forever as the rest of us suffer and die.