On Eating the Sun
post by jessicata (jessica.liu.taylor) · 2025-01-08T04:57:20.457Z · LW · GW · 79 comments
This is a link post for https://unstablerontology.substack.com/p/on-eating-the-sun
The Sun is the most nutritious thing that's reasonably close. It's only 8 light-minutes away, yet contains the vast majority of mass within 4 light-years of the Earth. The next-nearest star, Proxima Centauri, is about 4.25 light-years away.
By "nutritious", I mean it has a lot of what's needed for making computers: mass-energy. In "Ultimate physical limits to computation", Seth Lloyd imagines an "ultimate laptop" which is the theoretically best computer that is 1 kilogram of mass, contained in 1 liter. He notes a limit to calculations per second that is proportional to the energy of the computer, which is mostly locked in its mass (E = mc²). Such an energy-proportional limit applies to memory too. Energy need not be expended quickly in the course of calculation, due to reversible computing.
So, you need energy to make computers out of (much more than you need energy to power them). And, within 4 light-years, the Sun is where almost all of that energy is. Of course, we don't have the technology to eat the Sun, so it isn't really our decision to make. But, when will someone or something be making this decision?
Artificial intelligence that is sufficiently advanced could do everything a human could do, better and faster. If humans could eventually design machines that eat the Sun, then sufficiently advanced AI could do so faster. There is some disagreement about "takeoff speeds", that is, the time from when AI is about as intelligent as humans to when it is far, far more intelligent.
My argument is that, when AI is far, far more intelligent than humans, it will understand the Sun as the most nutritious entity that is within 4 light-years, and eat it within a short time frame. It really is convergently instrumental to eat the Sun, in the sense of repurposing at least 50% of its mass-energy to make machines including computers and their supporting infrastructure ("computronium"), fuel and energy sources, and so on.
I acknowledge that some readers may think the Sun will never be eaten. Perhaps it sounds like sci-fi to them. Here, I will argue that Sun-eating is probable within the next 10,000 years.
Technological development has a ratchet effect: good technologies get invented, but usually don't get lost, unless they weren't very important/valuable (compared to other available technologies). Empirically, the rate of discovery seems to be increasing. To the extent pre-humans even had technology, it was developed a lot more slowly. Technology seems to be advancing a lot faster in the last 1000 years than it was from 5000 BC to 4000 BC. Part of the reason for the change in rate is that technologies build on other technologies; for example, the technology of computers allows discovery of other technologies through computational modeling.
So, we are probably approaching a stage where technology develops very quickly. Eventually, the rate of technology development will go down, due to depletion of low-hanging fruit. But before then, in the regime where technology is developing very rapidly, it will be both feasible and instrumentally important to run more computations, quickly. Computation is needed to research technologies, among other uses. Running sufficiently difficult computations requires eating the Sun, and will be feasible at some technology level, which itself probably doesn't require eating the Sun (eating the Earth probably provides more than enough energy to have enough computational power to figure out the technology to eat the Sun).
Let's further examine the motive for creating many machines, including computers, quickly. Roughly, we can consider two different regimes of fast technology development: coordinated and uncoordinated.
- A coordinated regime will act like a single agent (or "singleton"), even if it's composed of multiple agents. This regime would do some kind of long-termist optimization (in this setting, even a few years is pretty long-term). Of course, it would want to discover technology quickly, all else being equal (due to astronomical waste considerations). But it might be somewhat "environmentalist" in terms of avoiding making hard-to-reverse decisions, like expending a lot of energy. I still think it would eat the Sun, on the basis that it can later convert these machines to other machines, if desired (it has access to many technologies, after all).
- In an uncoordinated regime, multiple agents compete for resources and control. Broadly, having more machines (including computers) and more technology grants a competitive advantage. That is a strong incentive to turn the Sun into machines and develop technologies quickly. Perhaps an uncoordinated regime can transition to a coordinated one, as either there is a single victor, or the most competitive players start coordinating.
This concludes the argument that the Sun will be largely eaten in the next 10,000 years. It really will be a major event in the history of the solar system. Usually, not much happens to the Sun in 10,000 years. And I really think I'm being conservative in saying 10,000. This would in typical discourse be considered "very long ASI timelines", under the assumption that ASI eats the Sun within a few years.
Thinking about the timing of Sun-eating seems more well-defined, and potentially precise, than thinking about the timeline of "human-level AGI" or "ASI". These days, it's hard to know what people mean by AGI. Does "AGI" mean a system that can answer math questions better than the average human? We already have that. Does it mean a system that can generate $100 billion in profit? Obvious legal fiction.
Sun-eating tracks a certain stage in AGI capability. Perhaps there are other concrete, material thresholds corresponding to convergent instrumental goals, which track earlier stages. These could provide more specific definitions for AGI-related forecasting.
79 comments
Comments sorted by top scores.
comment by Vladimir_Nesov · 2025-01-08T05:06:21.502Z · LW(p) · GW(p)
Eating of the Sun is reversible, it's letting it burn that can't be reversed. The environmentalist option is to eat the Sun as soon as possible.
comment by Dagon · 2025-01-09T17:32:39.209Z · LW(p) · GW(p)
This (the Sun is the only important local source of ... anything) has been an obvious conclusion for decades. Freeman Dyson described one avenue to capturing all the energy in 1960. The recent change that makes it more salient (and framed as "eating or inhabiting the sun" rather than "capturing the sun's output") is the progress in AI, which does two things:
- Adds weight to the computational theory of mind. If everything important is "just" computation, then all this expense and complexity of human bodies and organic brains is temporary and will be unnecessary in the future. This simplifies the problem of HOW to eat the sun into just how to make computronium out of it.
- Provides a more believable path for solving very hard engineering problems, by using smarter engineers than we can currently birth and train. It does NOT actually solve the problems, or even prove that they're solvable. We don't actually know what computation is (for this purpose), or how to optimize the entropy problem of "making one thing more ordered always makes other things less ordered".
That said, I don't know how to make beliefs on this scale pay any rent. "within 10,000 years" and "that's just science fiction" are identical labels to me.
comment by David Matolcsi (matolcsid) · 2025-01-09T06:37:51.045Z · LW(p) · GW(p)
I might write a top level post or shortform about this at some point. I find it baffling how casually people talk about dismantling the Sun around here. I recognize that this post makes no normative claim that we should do it, but it doesn't say that it would be bad either, and expects that we will do it even if humanity remains in power. I think we probably won't do it if humanity remains in power, we shouldn't do it, and if humanity disassembles the Sun, it will probably happen for some very bad reason, like a fanatical dictatorship getting in power.
If we get some even vaguely democratic system that respects human rights at least a little, then many people (probably the vast majority) will want to live on Earth in their physical bodies and many will want to have children, and many of those children will also want to live on Earth and have children on their own. I find it unlikely that all subcultures that want this will die out on Earth in 10,000 years, especially considering the selection effects: the subcultures that prefer to have natural children on Earth are the ones that natural selection favors on Earth. So the scenarios in which humanity dismantles the Sun probably involve a dictatorship rounding up the Amish and killing them while maybe uploading their minds somewhere, against all their protestation. Or possibly rounding up the Amish, and forcibly "increasing their intelligence and wisdom" by some artificial means, until they realize that their "coherent extrapolated volition" was in agreement with the dictatorship all along, and then killing off their bodies after their new mind consents. I find this option hardly any better. (Also, it's not just the Amish you are hauling to the extermination camps kicking and screaming, but my mother too. And probably your mother as well. Please don't do that.)
Also, I think the astronomical waste is probably pretty negligible. You can probably create a very good industrial base around the Sun with just some Dyson swarm that doesn't take up enough light to be noticeable from Earth. And then you can just send out some probes to Alpha Centauri and the other neighboring stars to dismantle them if we really want to. How much time do we lose by this? My guess is at most a few years, and we probably want to take some years anyway to do some reflection before we start any giant project.
People sometimes accuse the rationalist community of being aggressive naive utilitarians, who only believe that the AGI is going to kill everyone because they are only projecting themselves onto it, as they also want to kill everyone if they get power, so they can establish their mind-uploaded, we-are-the-grabby-aliens, turn-the-stars-into-computronium utopia a few months earlier that way. I think this accusation is mostly false, and most rationalists are in fact pretty reasonable and want to respect other people's rights and so on. But when I see people casually discussing dismantling the Sun, with only one critical comment (Mikhail's) that we shouldn't do it, and it shows up in Solstice songs as a thing we want to do in the Great Transhumanist Future twenty years from now, I start worrying again that the critics are right, and we are the bad guys.
I prefer to think that it's not because people are in fact happy about massacring the Amish and their own mothers, but because dismantling the Sun is a meme, and people don't think through what it means. Anyway, please stop.
(Somewhat relatedly, I think it's not obvious at all that if a misaligned AGI takes over the world, it will dismantle the Sun. It is more likely to do it than humanity would, but still, I don't know how you could be at all confident that the misaligned AI that first takes over will be the type of linear utilitarian optimizer that really cares about conquering the last stars at the edge of the Universe, and so needs to dismantle the star in order to speed up its conquest by a few years.)
↑ comment by Ben Pace (Benito) · 2025-01-10T08:06:09.953Z · LW(p) · GW(p)
You are putting words in people's mouths to accuse lots of people of wanting to round up the Amish and hauling them to extermination camps, and I am disappointed that you would resort to such accusations.
↑ comment by David Matolcsi (matolcsid) · 2025-01-10T09:23:30.904Z · LW(p) · GW(p)
Yeah, maybe I just got too angry. As we discussed in other comments, I believe that from an astronomical-acceleration perspective the real deal is maximizing the initial industrialization of Earth and its surroundings, which does require killing off (and mind uploading) the Amish and everyone else. Sure, if people are only arguing that we should dismantle the Sun and Earth after millennia, that's more acceptable, but then I really don't see the point; we can build out our industrial base on Alpha Centauri by then.
The part that is frustrating to me is that neither the original post nor any of the commenters arguing with me caveat their position with "of course, we would never want to destroy Earth before we can save all the people who want to live in their biological bodies, even though this is plausibly the majority of the cost in cosmic slow-down". If you agree with this, please say so; I would still have quarrels about removing people to artificial planets if they don't want to go, but I'd be less horrified. But so far, no one has been willing to clarify that they don't want to destroy Earth before saving the biological people, and I really did hear people say in private conversations things like "we will immediately kill all the bodies and upload the minds, the people will thank us later once they understand better" and things of that sort, which makes me paranoid.
Ben, Oliver, Raemon, Jessica, are you willing to commit to not wanting to destroy Earth if it requires killing the biological bodies of a significant number of non-consenting people? If so, my ire was not directed against you and I apologize to you.
↑ comment by Ben Pace (Benito) · 2025-01-10T21:21:07.960Z · LW(p) · GW(p)
It is good to have deontological commitments about what you would do with a lot of power. But this situation is very different from "a lot of power", it's also "if you were to become wiser and more knowledgeable than anyone in history so far". One can imagine the Christians of old asking for a commitment that "If you get this new scientific and industrial civilization that you want in 2,000 years from now, will you commit to following the teachings of Jesus?" and along the way I sadly find out that even though it seemed like a good and moral commitment at the time, it totally screwed my ability to behave morally in the future because Christianity is necessarily predicated on tons of falsehoods and many of its teachings are immoral.
But there is some version of this commitment I think is good to make... something like "Insofar as the players involved are all biological humans, I will respect the legal structures that exist and the existence of countries, and will not relate to them in ways that would be considered worthy of starting a war in its defense". But I'm not certain about this, for instance what if most countries in the world build 10^10 digital minds and are essentially torturing them? I may well wish to overthrow a country that is primarily torture with a small number of biological humans sitting on thrones on top of these people, and I am not willing to commit not to do that presently.
I understand that there are bad ethical things one can do with post-singularity power, but I do not currently see a clear way to commit to certain ethical behaviors that will survive contact with massive increases in knowledge and wisdom. I am interested if anyone has made other commitments about post-singularity life (or "on the cusp of singularity life") that they expect to survive contact with reality?
Added: At the very least I can say that I am not going to make commitments to do specific things that violate my current ethics. I have certainly made no positive commitment to violate people's bodily autonomy nor have such an intention.
↑ comment by David Matolcsi (matolcsid) · 2025-01-10T21:43:06.948Z · LW(p) · GW(p)
Fair, I also haven't made any specific commitments, I phrased it wrongly. I agree there can be extreme scenarios with trillions of digital minds tortured where you'd maybe want to declare war on the rest of society. But I would still like people to write down that "of course, I wouldn't want to destroy Earth before we can save all the people who want to live in their biological bodies, just to get a few years of acceleration in the cosmic conquest". I feel a sentence like this should really have been included in the original post about dismantling the Sun, and as long as people are not willing to write this down, I remain paranoid that they would in fact haul the Amish to the extermination camps if it feels like a good idea at the time. (As I said, I met people who really held this position.)
↑ comment by Ben Pace (Benito) · 2025-01-10T09:45:06.454Z · LW(p) · GW(p)
(Meta: Apologies for running the clock, but it is 1:45am where I am and I'm too sleepy to keep going on this thread, so I'm bowing out for tonight. I want to respond further, but I'm on vacation right now so I do wish to disclaim any expectations of a speedy follow-up.)
↑ comment by Raemon · 2025-01-09T21:14:28.353Z · LW(p) · GW(p)
So, I'm with you on "hey guys, uh, this is pretty horrifying, right? Uh, what's with the missing mood about that?".
The issue is that not-eating-the-sun is also horrifying. i.e. see also All Possible Views About Humanity's Future Are Wild [LW · GW]. To not eat the sun is to throw away orders of magnitude more resources than anyone has ever thrown away before. Is it percentage-wise "a small fraction of the cosmos"? Sure. But (quickly checks Claude, which wrote up a fermi code snippet before answering; I can share the work if you want to doublecheck yourself), a two year delay would be... 0.00000004% of the universe lost beyond the lightcone horizon, which doesn't sound like much except that's 200 galaxies lost.
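For concreteness, here is one way such a fermi can be reconstructed (a rough sketch, not the original snippet; the reachable-galaxy count is an assumed round number and is what drives the absolute figures):

```python
# Rough fermi: how much of the reachable universe slips beyond the cosmic event
# horizon if outward expansion is delayed by two years. All inputs are assumed
# round numbers.
R_reach_ly = 1.65e10       # comoving radius of the reachable universe, ~16.5 Gly
delay_ly = 2.0             # a 2-year delay shrinks that radius by roughly 2 light-years
n_galaxies = 5e11          # assumed number of galaxies within reach (very uncertain)
stars_per_galaxy = 4e11    # Milky-Way-ish round number

# Fractional volume lost when a sphere's radius shrinks slightly: ~3 * dR / R.
frac_lost = 3 * delay_ly / R_reach_ly           # ~3.6e-10, i.e. ~0.00000004%
galaxies_lost = frac_lost * n_galaxies          # ~200 galaxies
stars_lost = galaxies_lost * stars_per_galaxy   # ~8e13, i.e. ~80 trillion stars

print(f"fraction lost: {frac_lost:.1e} ({frac_lost * 100:.8f}%)")
print(f"galaxies lost: ~{galaxies_lost:.0f}")
print(f"stars lost: ~{stars_lost:.1e}")
```

The same round numbers also connect the "200 galaxies" figure to the "80 trillion stars" mentioned below.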
When you compare "the Amish get a Sun Replica that doesn't change their experience", the question is "Is it worth throwing away 80 trillion stars for the Amish to have the real thing?" It does not seem obviously worth it.
IMO there isn't an option that isn't at least a bit horrifying in some sense that one could have a missing mood about. And while I still feel unsettled about it, I think if I have to grieve something, it makes more sense to grieve in the direction of "don't throw away 80 trillion stars worth of resources."
I think you're also maybe just not appreciating how much would change in 10,000 years? Like, there is no single culture that has survived 10,000 years. (Maybe one of those small tribes in the Amazon? I'd still bet on there having been a lot of cultural drift there, but not confidently.) The Amish are only a few hundred years old. I can imagine doing a lot of moral reflection and coming to the conclusion the sun shouldn't be eaten until all human cultures have decided it's the right thing to do, but I do really doubt that process takes 10,000 years.
↑ comment by David Matolcsi (matolcsid) · 2025-01-10T03:12:30.840Z · LW(p) · GW(p)
I think this is a false dilemma. If all human cultures on Earth come to the conclusion in 1000 years that they would like the Sun to be dismantled (which I very much doubt), then sure, we can do that. But at that point, we could already have built awesome industrial bases by dismantling Alpha Centauri, or just building them up by dismantling 0.1% of the Sun that doesn't affect anything on Earth. I doubt that totally dismantling the Sun after centuries would significantly accelerate the time we reach the cosmic event horizon.
The thing that actually has costs is not immediately bulldozing down Earth and turning it into a maximally efficient industrial powerhouse at the cost of killing every biological body. Or if the ASI has the opportunity to dismantle the Sun on short notice (the post alludes to 10,000 years being a very conservative estimate and "the assumption that ASI eats the Sun within a few years"). But that's not going to happen democratically. There is no way you get 51% of people [1] to vote for bulldozing down Earth and killing their biological bodies, and I very much doubt you get that vote even for dismantling the Sun in a few years and putting some fake sky around Earth for protection. It's possible there could be a truly wise philosopher king who could with aching heart overrule everyone else's objection and bulldoze down Earth to get those extra 200 galaxies at the edge of the Universe, but then govern the Universe wisely and benevolently in a way that people on reflection all approve of. But in practice, we are not going to get a wise philosopher king. I expect that any government that decides to destroy the Sun for the greater good, against the outrage of the vast majority of people, will also be a bad ruler of the Universe.
I also believe that AI alignment is not a binary, and even in the worlds where there is no AI takeover, we will probably get an AI we initially can't fully trust to follow the spirit of our commands in exotic situations we can't really understand. In that case, it would be extremely unwise to immediately instruct it to create mind uploads (how faithful would those be?) and bulldoze down the world to turn the Sun into computronium. There are a lot of reasons for taking things slow.
Usually rationalists are pretty reasonable about these things, and endorse democratic government and human rights, and they even often like talking about the Long Reflection and taking things slow. But then they start talking about dismantling the Sun! This post can kind of defend itself that it was proposing a less immediate and horrifying implementation (though there really is a missing mood here), but there are other examples, most notably the Great Transhumanist Future song in last year's Solstice, where a coder looks up to the burning Sun disapprovingly, and in twenty years with a big ol' computer they will use the Sun as a battery.
I don't know if the people talking like that are so out of touch that they believe that with a little convincing everyone will agree to dismantle the Sun in twenty years, or they would approve of an AI-enabled dictatorship bulldozing over Earth, or they just don't think through the implications. I think it's mostly that they just don't think about it too hard, but I did hear people coming out in favor of actually bulldozing down Earth (usually including a step where we forcibly increase everyone's intelligence until they agree with the leadership), and I think that's very foolish and bad.
- ^
And even 51% of the vote wouldn't be enough in any good democracy to bulldoze over everyone else
↑ comment by Raemon · 2025-01-10T03:53:58.409Z · LW(p) · GW(p)
I have my own actual best guesses for what happens in reasonably good futures, which I can get into. (I'll flag for now I think "preserve Earth itself for as long as possible" is a reasonable Schelling point that is compatible with many "go otherwise quite fast" plans)
I doubt that totally dismantling the Sun after centuries would significantly accelerate the time we reach the cosmic event horizon.
Why do you doubt this? (To be clear, it depends on exact details. But my original query was about a 2 year delay. Proxima Centauri is 4 lightyears away. What is your story for how only taking 0.1% of the sun's energy while we spin up doesn't slow us down by at least 2 years?)
I have more to say but maybe should wait on your answer to that.
Mostly, I think your last comment still had its own missing mood of horror, and/or seemed to be assuming away any tradeoffs.
(I am with you on "many rationalists seem gung ho about this in a way I find scary")
↑ comment by ryan_greenblatt · 2025-01-10T06:27:07.021Z · LW(p) · GW(p)
The argument is (I assume):
- Once centuries have passed, you've already sent out huge amounts of space probes that roughly saturate reachable resources. (Because you can convert Proxima Centauri fully into probes within <20 years probably.)
- It doesn't take that much energy to pretty much fully saturate on probes. In particular, Eternity in Six Hours claims getting the energy for most of the probes you want is possible with just 6 hours of solar output (let alone eating 0.1% of the sun). Even if we assume this is off by 2 OOMs (e.g. to be confident you get everywhere you need), that still means we can saturate on energy after 1 month of solar output. If we're willing to eat 0.1% of the sun (presumably at least millions of years of solar output?), the situation isn't even close. In fact, the key bottleneck based on Eternity in Six Hours is disassembling Mercury (I think due to heat dissipation), though it is hard to be confident in advance.
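A quick order-of-magnitude check of those figures (a sketch using standard round numbers for solar luminosity and the ~0.7% hydrogen-fusion efficiency; it just verifies the scales quoted above):

```python
# Order-of-magnitude check of the figures in the parent comment.
L_sun = 3.8e26           # solar luminosity, W
M_sun = 2e30             # solar mass, kg
c = 3e8                  # speed of light, m/s
year_s = 3.15e7          # seconds per year

# "6 hours of solar output, padded by 2 orders of magnitude":
padded_hours = 6 * 100
print(f"6 hours x 100 = {padded_hours / 24:.0f} days (~1 month)")

# Energy released by fusing 0.1% of the Sun's mass (~0.7% of fused mass becomes energy),
# expressed in years of current solar output:
E_01pct = 0.001 * M_sun * 0.007 * c**2       # ~1.3e42 J
years_equiv = E_01pct / (L_sun * year_s)     # ~1e8 years
print(f"0.1% of the Sun ~ {years_equiv:.0e} years of solar output")
```

On these assumptions, "at least millions of years of solar output" is if anything an understatement, and counting the full mass-energy rather than fusion output would make it larger still.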
↑ comment by David Matolcsi (matolcsid) · 2025-01-10T07:36:51.978Z · LW(p) · GW(p)
Yes, I wanted to argue something like this.
↑ comment by David Matolcsi (matolcsid) · 2025-01-10T08:05:58.261Z · LW(p) · GW(p)
I agree that I don't viscerally feel the loss of the 200 galaxies, and maybe that's a deficiency. But I still find this position crazy. I feel this is a decent parallel dialogue:
Other person: "Here is a something I thought of that would increase health outcomes in the world by 0.00000004%."
Me: "But surely you realize that this measure is horrendously unpopular, and the only way to implement it is through a dictatorial world government."
Other person: "Well yes, I agree it's a hard dilemma, but on absolute terms, 0.00000004% of the world population is 3 people, so my intervention would save 3 lives. Think about how horrible a tragedy every death is; I really feel there is a missing mood here when you don't even consider the upside of my proposal."
I do feel sad when the democratic governments of the world screw up policy in a big way according to my beliefs and values (as they often do). I still believe that democracy is the least bad form of government, but yes, I feel the temptation of how much better it would be if we were instead governed by a perfect philosopher king who happens to be aligned with my values. But 0.00000004% is just a drop in the ocean.
Similarly, I believe that we should try to maintain a relatively democratic form of government at least in the early AI age (and then the people can slowly figure out if they find something better than democracy). And yes, I expect that democracies will do a lot of incredibly wasteful, stupid, and sometimes evil things by my lights, and I will sometimes wish that somehow I could have become the philosopher king. That's just how things always are. But leaving Earth alone really is chump change, and won't be among the top thousand things I disagree with democracies on.
(Also, again, I think there will also be a lot of value in conservatism and caution, especially since we probably won't be able to fully trust our AIs on the most complicated issues. And I also think there is something intrinsically wrong about destroying Earth; I think that if you cut down the world's oldest tree to increase health outcomes by 0.00000004%, you are doing something wrong, and people have a good reason to distrust you.)
↑ comment by Raemon · 2025-01-10T18:47:55.197Z · LW(p) · GW(p)
It sounds like there's actually like 3-5 different object level places where we're talking about slightly different things. I also updated on the practical aspect from Ryan's comment. So, idk here's a bunch of distinct points.
1.
Ryan Greenblatt's comment [LW · GW] updated me that the energy requirements here are minimal enough that "eating the sun" isn't really going to come up as a consideration for astronomical waste. (Eating the Earth or most of the solar system seems like it still might be. But, I agree we shouldn't Eat the Earth)
2.
I'd interpreted most past comments about nearterm (i.e. measured in decades) crazy shit to be about building Dyson spheres, not Star Lifting. (i.e. I expected the '20 years from now in some big ol' computer' in the solstice song to be about dyson spheres and voluntary uploads). I think many people will still freak out about Dyson Sphering the sun (not sure if you would). I would personally argue "it's just pretty damn important to Dyson Sphere the sun even if it makes people uncomfortable (while designing it such that Earth still gets enough light)."
3.
I agree in 1000 years it won't much matter whether you Starlift, for astronomical waste reasons. But I do expect in 1000 years, even assuming a maximally consent-oriented / conservative-with-regards-to-bio-human-values, and all around "good" outcome, most people will have shifted to running on computronium and experienced much more than 1000 years of subjective time and their intuitions about what's good will just be real different. There may be small groups of people who continue living in bio-world but most of them will still probably be pretty alien by our lights.
I think I do personally hope they preserve the Earth as sanctuary and/or historical relic. But I think there's a lot of compromises like "starlift a lot of material out of the sun, but move the Earth closer to the sun to compensate" (I haven't looked into the physics here, the details are obviously cruxy).
When I imagine any kind of actual realistic future that isn't maximally conservative (i.e. the bio humans are < .1% of the solar system's population and just don't have that much bargaining power), it seems even more likely that they'll at least work on compromise solutions that preserve a reasonable Earth experience but eat a bunch of the sun, if there turn out to be serious tradeoffs there. (Again I don't actually know enough physics here and I'm recently humbled by remembering the Eternity in Six Hours paper, maybe there's literally no tradeoffs here, but, I'd still doubt it)
4.
It sounds like it's not particularly cruxy anymore, but, I think the "0.00000004% of the Earth's current population" analogy is just quite different. 80 trillion suns involves more value than has ever been had before; 3 lives is (relatively) insignificant compared to many political compromises we've made, even going back thousands of years. Maybe whatever descendants get to reap that value are so alien that they just don't count as valuable by today's lights, and it's reasonable to have some extreme time discounting here, but, if any values-you-care-about survived it would be huge.
I agree both morally and practically with "it's way more important to make sure we have good global coordination systems that don't predictably either descend into a horrible totalitarianism, or trigger a race for power that causes horrible wars or other bad things, than to get those 80 trillion suns." But, like, the 80 trillion suns are still a big deal.
5.
I'll note it's also not a boolean whether we "bulldoze the earth" or "bulldoze the rest of the solar system" for rushing to build a dyson sphere. You can start the process with a bunch of mining in some remote mountain regions or whatever without eating the whole earth. (But I think it might be bad to do this because "don't harvest Earth" is just a nice simple Schelling rule and once you start haggling over the details I do get a lot more worried)
6.
I recall reading it's actually maybe cheaper to use asteroids than Mercury to make a dyson sphere because you don't need to expensively lift things out of the gravity well. It is appealing to me if there are no tradeoffs involved with deconstructing any of the charismatic astronomical objects until we've had more time to think/orient/grow-as-a-people.
7.
Part of my outlook here is that I spent the last 14 years being personally uninterested in and scared by the sorts of rapid/crazy/exponential change you're wary of. In the past few years, I've adjusted to be more personally into it. I don't think I would have wanted to rush that grieving/orienting process for Past Me even though it cost me a lot of important time and resources (I'm referring here more to stuff like The God of Humanity, and the God of the Robot Utilitarians [LW · GW])
But I do wish I had somehow sped along the non-soulfully-traumatic parts of the process (i.e. some of the updates were more simple/straightforward and if someone had said the right words to me, I think I'd have gotten a strictly better outcome by my original lights).
I expect most humans, given opportunity to experiment on their own terms, will gradually have some kind of perspective shift here (maybe on a longer timescale than Me Among the Rationalists, but, like, <500 years). I don't want people to feel rushed about it, but I think there will be some societal structures that will lend themselves to dallying more and accumulating serious tradeoffs, or less.
↑ comment by David Matolcsi (matolcsid) · 2025-01-10T20:13:54.523Z · LW(p) · GW(p)
I feel reassured that you don't want to Eat the Earth while there are still biological humans who want to live on it.
I still maintain that under governance systems I would like, I would expect the outcome to be very conservative with the solar system in the next thousand years. Like one default governance structure I quite like is to parcel out the Universe equally among the people alive during the Singularity, have a binding constitution on what they can do on their fiefdoms (no torture, etc), and allow them to trade and give away their stuff to their biological and digital descendants. There could also be a basic income coming to all biological people,[1] though not to digital as it's too easy to mass-produce them.
One year of delay in cosmic expansion costs us around 1 in a billion of the reachable Universe under some assumptions on where the grabby aliens are (if they exist). One year also costs us around 1 in a billion of the Sun's mass being burned, if like Habryka you care about using the solar system optimally for the sake of the biological humans who want to stay. So one year of delay can be bought by 160 people paying out 10% of their wealth. I really think that you won't do things like moving the Earth closer to the Sun in the next 200 years; there will just always be enough people to pay out, it just takes 10,000 traditionalist families, literally the Amish could easily do it. And it won't matter much, the cosmic acceleration will soon become a moot point as we build out other industrial bases, and I don't expect the biological people to feel much of a personal need to dismantle the Sun anytime soon. Maybe in 10,000 years the objectors will run out of money, and the bio people either overpopulate or have expensive hobbies like building planets for themselves and decide to dismantle the Sun, though I expect them to be rich enough to just haul in matter from other stars if they want to.
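A worked version of that arithmetic, under the comment's own stipulations (two costs of roughly 1-in-a-billion each, and the Universe parcelled equally among roughly 8 billion people; these are the stated assumptions, not established figures):

```python
# Reconstructing the "160 people paying out 10%" figure from the stated assumptions.
population = 8e9          # people the universe is parcelled out among
cost_cosmic = 1e-9        # fraction of the reachable universe lost per year of delay
cost_sun = 1e-9           # fraction of the Sun's usable mass burned per year (as stated above)

shares_needed = (cost_cosmic + cost_sun) * population   # ~16 full personal shares
people_paying_10pct = shares_needed / 0.10              # ~160 people
print(f"{shares_needed:.0f} full shares -> {people_paying_10pct:.0f} people paying 10% each")
```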
By the way, I recommend Tim Underwood's sci-fi, The Accord, as a very good exploration of these topics, I think it's my favorite sci-fi novel.
As for the 80 trillions stars, I agree it's a real loss, but for me this type of sadness feels "already priced in". I already accepted that the world won't and shouldn't be all my personal absolute kingdom, so other people's decisions will cause a lot of waste from my perspective, and 0.00000004% is just a really negligible part of this loss. In this, I think my analogy to current government is quite apt: I feel similarly about current governments, in that I already accepted that the world will be wasteful compared to the rule of a dictatorship perfectly aligned with me, but that's how it needs to be.
- ^
Though you need to pay attention to overpopulation. If the average biological couple has 2.2 children, the Universe runs out of atoms to support humans in 50 thousand years. Exponential growth is crazy fast.
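A rough check of this footnote's claim (a sketch with round-number assumptions for generation length and the atom count of the observable universe; it grants the absurdly generous budget of one atom per person):

```python
# 2.2 children per couple is a 1.1x growth factor per generation. Even if each
# person only needed a single atom, exponential growth would exhaust the ~1e80
# atoms of the observable universe in roughly 50,000 years.
import math

growth_per_gen = 2.2 / 2       # 1.1x per generation
gen_years = 30                 # assumed years per generation
start_pop = 1e10               # ~current human population
atom_budget = 1e80             # atoms in the observable universe (round number)

generations = math.log(atom_budget / start_pop) / math.log(growth_per_gen)
print(f"~{generations * gen_years:,.0f} years")   # ~50,000 years
```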
↑ comment by Ben Pace (Benito) · 2025-01-10T08:07:47.578Z · LW(p) · GW(p)
Most decisions are not made democratically, and pointing out that a majoritarian vote is against a decision is no argument that it will not happen nor that it should not happen. This is true of the vast majority of resource allocation decisions, such as how to divvy up physical materials.
↑ comment by habryka (habryka4) · 2025-01-10T08:26:10.830Z · LW(p) · GW(p)
I think it is an argument that they should not happen, but a very weak one, especially for things like this, which the current public really hasn't thought much about.
Also, I think David is just wildly wrong here about what realistically would happen in 10,000 years for a society that could actually start using all of the sun's energy. This would involve hundreds of generations of people each deciding to not grow, to not expand into the rest of the solar system, to pass up on enormous opportunities for joy and greatness and creation, out of sentimentality for a specific kind of attachment to the specific arrangement of our planet and solar system at this moment in time. This attachment is very much real, and worth something, but IMO obviously will not remotely outweigh the preferences and actions of the trillions of people who will all want to do more things and have more things (and since we are talking about gradual expansion, very much present in the conversation).
↑ comment by Ben Pace (Benito) · 2025-01-10T08:44:47.469Z · LW(p) · GW(p)
I concede that I was mistaken in saying it was no argument; I will agree with the position that it is a very weak one and is often outweighed by other arguments.
Majority vote is useful specifically in determining who has power because of the extremely high level of adversarial dynamics, but in contexts that are not as wildly adversarial (including most specific decisions that an institution makes) generally other decision-making algorithms are better.
↑ comment by David Matolcsi (matolcsid) · 2025-01-10T08:43:32.239Z · LW(p) · GW(p)
You mean that people on Earth and the solar system colonies will have enough biological children, and space travel to other stars for biological people will be hard enough, that they will want the resources from dismantling the Sun? I suppose that's possible, though I expect they will put some kind of population control for biological people in place before that happens. I agree that also feels aversive, but at some point it needs to be done anyway, otherwise exponential population growth just brings us back to the Malthusian limit a few tens of thousands of years from now even if we use up the whole Universe. (See Tim Underwood's excellent rationalist sci-fi novel on the topic.)
If you are talking about ems and digital beings, not biological humans, I don't think they will or should have decision rights over what happens with the solar system, as they can simply move to other stars.
↑ comment by habryka (habryka4) · 2025-01-10T08:52:19.720Z · LW(p) · GW(p)
Someone will live on old earth in your scenario. Unless those people are selected for extreme levels of attachment to specific celestial bodies, as opposed to the function and benefit of those celestial bodies, I don’t see why those people would decide to not replace the sun with a better sun, and also get orders of magnitude richer by doing so.
It seems to me that the majority of those inhabitants of old earth would simply be people who don't want to be uploaded (which is a much more common preference I expect than maintaining the literal sun in the sky) and so have much more limited ability to travel to other solar systems. I don't see why I would want to condemn most people who don't want to be uploaded to relative cosmic poverty just because a very small minority of people want to keep burning away most of the usable energy in the solar system for historical reasons.
↑ comment by David Matolcsi (matolcsid) · 2025-01-10T08:49:44.817Z · LW(p) · GW(p)
Are you arguing that if technologically possible, the Sun should be dismantled in the first few decades after the Singularity, as it is implied in the Great Transhumanist Future song, the main thing I'm complaining about here? In that case, I don't know of any remotely just and reasonable (democratic, market-based or other) governance structure that would allow that to happen given how the majority of people feel.
If you are talking about population dynamics, ownership and voting shifting over millennia to the point that they decide to dismantle the Sun, then sure, that's possible, though that's not what I expect to happen, see my other comment on market trades and my reply to Habryka on population dynamics.
↑ comment by habryka (habryka4) · 2025-01-10T08:54:13.610Z · LW(p) · GW(p)
(It is not implied in the song, to be clear, you seem to have a reading of the lyrics I do not understand.
The song talks about there being a singularity in ~20 years, and separately that the sun is wasteful, but I don’t see any reference to the sun being dismantled in 20 years. For reference, lyrics are here: https://luminousalicorn.tumblr.com/post/175855775830/a-filk-of-big-rock-candy-mountain-one-evening-as)
↑ comment by David Matolcsi (matolcsid) · 2025-01-10T09:29:40.939Z · LW(p) · GW(p)
I think that the coder looking up and saying that the Sun burning is distasteful but the Great Transhumanist Future will come in 20 years, along with a later mention of "the Sun is a battery", together implies that the Sun is getting dismantled in the near future. I guess you can debate how strong the implication is; maybe they just want to dismantle the Sun in the long term, and are currently only using the Sun as a battery in some benign way, but I think that's not the most natural interpretation.
↑ comment by habryka (habryka4) · 2025-01-10T17:13:52.208Z · LW(p) · GW(p)
I think the 20 years somewhat unambiguously refers to timelines until AGI is built.
Separately, “the sun is a battery” I think also doesn’t really imply anything about the sun getting dismantled, if anything it seems to me imply explicitly that the sun is still intact (and probably surrounded by a Dyson swarm or sphere).
↑ comment by Said Achmiz (SaidAchmiz) · 2025-01-10T08:40:44.881Z · LW(p) · GW(p)
To not eat the sun is to throw away orders of magnitude more resources than anyone has ever thrown away before. Is it percentage-wise “a small fraction of the cosmos”? Sure. But, (quickly checks Claude, which wrote up a fermi code snippet before answering, I can share the work if you want to doublecheck yourself), a two year delay would be… 0.00000004% of the universe lost beyond the lightcone horizon, which doesn’t sound like much except that’s 200 galaxies lost.
Why is this horrifying? Are we doing anything with those galaxies right now? What is this talk of “throwing away”, “lost”, etc.?
You speak as if we could be exploiting those galaxies at the extreme edge of the observable universe, like… tomorrow, or next week… if only we don’t carelessly lose them. Like we have these “resources” sitting around, at our disposal, as we speak. But of course nothing remotely like this is true. How long would it even take to reach any of these places? Billions of years, right? So the question is:
“Should we do something that might possibly somehow affect something that ‘we’, in some broad sense (because who even knows whether humanity will be around at the time, or in what form), will be doing several billion years from now, in order to avoid dismantling the Sun?”
Pretty obvious the answer is “duh, of course, this is a no-brainer, yes we should, are you even serious—billions of years, really?—clearly we should”.
I think you’re also maybe just not appreciating how much would change in 10,000 years? Like, there is no single culture that has survived 10,000 years.
You’re the one who’s talking about stuff billions of years from now, so this argument applies literally, like, a million times more to your position than to the one you’re arguing against!
In any case, “let’s not dismantle the Sun until and unless we all agree that it’s a good idea” seems reasonable. If the Amish (and people like me) come around to your view in 10 years, great, that’s when we’ll crank up the star-lifters. If we’re still opposed [LW(p) · GW(p)] a million years from now, well, too bad—find another star to dismantle. (In fact, here’s an entire galaxy that probably won’t be missed.)
↑ comment by Vladimir_Nesov · 2025-01-10T17:24:51.604Z · LW(p) · GW(p)
How long would it even take to reach any of these places? Billions of years, right?
When personal life expectancy of these same people alive today is something like 1e34 years, billions of years is very little.
↑ comment by Said Achmiz (SaidAchmiz) · 2025-01-10T17:39:02.488Z · LW(p) · GW(p)
I don’t think that this is true.
↑ comment by AnthonyC · 2025-01-09T14:16:27.724Z · LW(p) · GW(p)
Without making any normative arguments: if you're in a position (industrially and technologically) to disassemble the sun at all, or build something like a Dyson swarm, then it's probably not too difficult to build an artificial system to light the Earth in such a way as to mimic the sun, and make it look and feel nearly identical to biological humans living on the surface, using less than a billionth of the sun's normal total light output. The details of tides might be tricky, but probably not out of reach.
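For reference, a rough estimate of why "less than a billionth" is about right (a sketch with round numbers; it only counts the light actually intercepted by Earth's disc at 1 AU):

```python
# Fraction of the Sun's total output that Earth actually intercepts: the ratio of
# Earth's cross-sectional disc to the full sphere of radius 1 AU.
import math

R_earth = 6.37e6    # Earth's radius, m
AU = 1.496e11       # Earth-Sun distance, m

fraction = (math.pi * R_earth**2) / (4 * math.pi * AU**2)
print(f"~{fraction:.1e} of solar output hits Earth")   # ~4.5e-10, about half a billionth
```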
↑ comment by Mark Xu (mark-xu) · 2025-01-10T07:01:56.314Z · LW(p) · GW(p)
But most people on Earth don't want "an artificial system to light the Earth in such a way as to mimic the sun", they want the actual sun to go on existing.
↑ comment by Ben Pace (Benito) · 2025-01-10T07:51:38.004Z · LW(p) · GW(p)
This point doesn't make sense to me. It sounds similar to saying "Most people don't like it when companies develop more dense housing in cities, therefore a good democracy should not have it" or "Most people don't like it when their horse-drawn carriages are replaced by cars, therefore a good democracy should not have it".
The cost-benefit calculations on these things work out and it's good if most uninformed people who haven't spent much time on it are not able to get in the way of companies that are building goods and services in this regard.
There are many many examples (e.g. GMOs, nuclear power, coal power, privatized toll roads, fracking, etc), and I expect if I researched for a few hours I would find even clearer examples for which it is currently consensus that it is a good idea, but at the time the majority disliked it.
More generally:
- People's mass preferences are sometimes dumb, and sometimes quite reasonable, and you should have a decision rule that distinguishes between the two, and when things are orders of magnitude more cost effective than other things, this is a good argument against arguments based on simple preference / aesthetics, and this comment does nothing to show that this isn't stupidity rather than wisdom.
- Just because a lot of people in a democracy disapprove of things does not mean that market forces shouldn't be able to disagree with them and be correct about that. Analogous to the Luddites who had little concept of how technological and economic progress lifts everyone out of poverty, most people today do not appreciate that future computational-life forms will be just as meaningful as the meat-based ones today, and should not sacrifice orders of magnitude more life-years than will be lived on Earth[1] for the difference between a big ball of plasma and something else that recreates the same quality of light.
- Majoritarian vote on everything is a terrible way to make decisions; most decisions should be given to as small a group as possible (ideally an individual) who is held accountable for the outcome being good, and is given the resources to make the decision well. We do it for political leaders due to the low levels of trust and high levels of adversarial action, but this should not be extended to whether to take the sun apart for parts.
- ^
I'd quickly guess that the energy difference supports life-years where is somewhere between and .
↑ comment by Mark Xu (mark-xu) · 2025-01-10T20:33:25.169Z · LW(p) · GW(p)
I am claiming that people when informed will want the sun to continue being the sun. I also think that most people when informed will not really care that much about creating new people, will continue to believe in the act-omission distinction, etc. And that this is a coherent view that will add up to a large set of people wanting things in the solar system to remain conservatively the same. I separately claim that if this is true, then other people should just respect this preference, and use the other stars that people don't care about for energy.
↑ comment by habryka (habryka4) · 2025-01-10T21:40:58.558Z · LW(p) · GW(p)
As I mentioned in the other thread, it seems right to me that some people will want the sun to continue being the sun, but my sense is that within the set of people who don't want to leave the solar system, don't want to be uploads, don't want to be cryogenically shipped to other solar systems, or otherwise for some reason will have strong preferences over what happens with this specific solar system, this will be a much less important preference than using the sun for things that people care about more.
↑ comment by Ben Pace (Benito) · 2025-01-10T21:04:21.996Z · LW(p) · GW(p)
Analogously: "I am claiming that people when informed will want horses to continue being the primary mode of transportation. I also think that most people when informed will not really care that much about economic growth, will continue to believe that you're more responsible for changing things than for maintaining the status quo, etc. And that this is a coherent view that will add up to a large set of people wanting things in cities to remain conservatively the same. I separately claim that if this is true, then other people should just respect this preference, and go find new continents / planets on which to build cars that people in the cities don't care about."
Sometimes it's good to be conservative when you're changing things, like if you're changing lots of social norms or social institutions, but I don't get it at all in this case. The sun is not a complicated social institution, it's primarily a source of heat and light and much of what we need can be easily replicated especially when you have nanobots. I am much more likely to grant that we should be slow to change things like democracy and the legal system than I am that we should change exactly how and where we should get heat and light. Would you have wanted conservatism around moving from candles to lightbulbs? Installing heaters and cookers in the house instead of fire pits? I don't think so.
↑ comment by David Matolcsi (matolcsid) · 2025-01-10T20:19:54.101Z · LW(p) · GW(p)
As I explain in more detail in my other comment [LW · GW], I expect market based approaches to not dismantle the Sun anytime soon. I'm interested if you know of any governance structure that you support that you think will probably lead to dismantling the Sun within the next few centuries.
↑ comment by quila · 2025-01-10T15:46:24.611Z · LW(p) · GW(p)
most people today do not appreciate that future computational-life forms will be just as meaningful as the meat-based ones today, and should not sacrifice orders of magnitudes more life-years than will be lived on Earth for the difference between a big ball of plasma and something else that recreates the same quality of light.
huh i agree with this, but i've been imagining that earth and many galaxies will be in the domain of preferences which want to "live on actual planets" because of this chain-of-logic:
- the lightcone is very much larger than just earth. (wikipedia says "[there are] an estimated 100 billion [galaxies] in all of the observable universe"). we'd want to give up earth (and many surrounding galaxies) in return for more good possible futures, because of the good which can be derived from the non-earth parts of those ones.
- some beings care disproportionately about what happens to earth and its sun. some of them are alignment researchers (e.g. some comments in this thread), or otherwise influencing the trajectory
- it's better for collaboration if those preferences determine the fate of earth and its sun. i.e. this prevents some values from being incentivized to compete for alignment to them in particular (to the detriment of general success rates).
- i can write more about this part if wanted. notably (unless i've made a mistake) it doesn't rely on trust, but that's probably not very clear with just this.
- actually, this case is less interesting than the general case i had in mind. because in this case one side has non-linearity in value of this sun versus others, both sides want to make this trade even without a greater chance of both dying if they don't.
↑ comment by David Matolcsi (matolcsid) · 2025-01-10T08:29:19.680Z · LW(p) · GW(p)
I agree that not all decisions about the cosmos should be made in a majoritarian democratic way, but I don't see how replacing the Sun with artificial light can be done by market forces under normal property rights. I think you currently would not be allowed to build a giant glass dome around someone's plot of land, and this feels at least that strong.
I'm broadly sympathetic to having property rights and markets in the post-Singularity future, and probably the people with scope-sensitive and longtermist preferences will be able to buy out the future control of far-away things from the normal people who don't care about these too much. But these trades will almost certainly result in the solar system being owned by a coalition of normal people, except if they start with basically zero capital. I don't know what you imagine the initial capital allocation to look like in your market-based post-Singularity world, but if the vast majority of the population doesn't have enough control to even save the Sun, then probably something went deeply wrong.
↑ comment by habryka (habryka4) · 2025-01-10T08:46:38.975Z · LW(p) · GW(p)
People don’t generally have strong preferences about celestial objects. I really don’t understand why you think most people care about the sun qua the sun, as opposed to the things the sun provides.
Most people when faced with the choice to be more than twice as rich in new-earth, which they get to visualize and explore using the best of digital VR and sensory technology, with a fake sun indistinguishable for all intents and purposes from the real sun, will of course choose that over the attachment to maintaining that specific ball of plasma in the sky.
↑ comment by Ben Pace (Benito) · 2025-01-10T08:52:56.994Z · LW(p) · GW(p)
Side-note: Just registering that I personally aspire to always taboo 'normal people' and instead name specific populations. I think it tends to sneak in a lot of assumptions to call people 'normal' – I've seen it used to mean "most people on Twitter" or "most people in developed countries" or "most working class people" or "most people alive today" – the latter of which is not at all normal by historical standards!
↑ comment by habryka (habryka4) · 2025-01-10T08:56:54.409Z · LW(p) · GW(p)
Seems right, I used the language of the thread, but edited it since I agree.
↑ comment by David Matolcsi (matolcsid) · 2025-01-10T09:01:38.966Z · LW(p) · GW(p)
I expect non-positional material goods to be basically saturated for Earth people in a good post-Singularity world, so I don't think you can promise them to become twice as rich. And also, people dislike drastic change and new things they don't understand. 20% of the US population refused the potentially life-saving covid vaccine out of distrust of new things they don't understand. Do you think they would happily move to a new planet with artificial sky maintained by supposedly benevolent robots? Maybe you could buy off some percentage of the population if material goods weren't saturated, but surely not more than you could convince to get the vaccine? Also, don't some religions (Islam?) have specific laws about what to do at sunrise and sunset and so on? Do you think all the imams would go along with moving to the new artificial Earth? I really think you are out of touch with the average person on this one, but we can go out to the streets and interview some people on the matter, though Berkeley is maybe not the most representative place for this.
(Again, if you are talking about cultural drift over millennia, that's more plausible, though I'm below 50% they would dismantle the Sun. But I'm primarily arguing against dismantling the Sun within twenty years of the Singularity.)
Replies from: habryka4↑ comment by habryka (habryka4) · 2025-01-10T09:08:55.670Z · LW(p) · GW(p)
Twenty years seems indeed probably too short, though it’s hard to say how post-singularity technology will affect things like public deliberation timelines.
My best guess is 200 years will very likely be enough.
I agree with you that there exists some small minority of people who will have a specific attachment to the sun, but most people just want to live good and fulfilling lives, and don’t have strong preferences about whether the sun in the sky is exactly 1 AU away and feels exactly like the sun of 3 generations past. Also, people will already experience extremely drastic change in the 20 years after the singularity; my sense is the marginal cost of change is decreasing, and this isn’t the kind of change that would most affect people’s lived experience.
To be clear, for me it’s a crux whether not dismantling the sun is basically committing everyone who doesn’t want to be uploaded to relative cosmic poverty. It would really suck if all remaining biological humans were unable to take advantage of the vast majority of the energy in the solar system.
I am not at present compelled that the marginal galaxies are worth destroying the sun and earth for (though I am also not confident they aren’t; I feel confused about it, and also don’t know where most people would end up after post-singularity intelligence-enhancing drugs and deliberation technologies have been made available, which, to be clear, not everyone would use, but most people probably would).
↑ comment by David Matolcsi (matolcsid) · 2025-01-10T09:35:55.919Z · LW(p) · GW(p)
I maintain that biological humans will need to do population control at some point. If they decide that dismantling the Sun is worth it to enact that population control in the solar system at a later population level, then they can go for it. My guess is that they won't, and will have population control earlier.
↑ comment by Said Achmiz (SaidAchmiz) · 2025-01-10T08:25:07.234Z · LW(p) · GW(p)
I want the Sun to keep existing, I am not “uninformed”, and I think it would be good if I am able to get in the way of people who want to dismantle the Sun, and bad if I were not able to do so.
when things are orders of magnitude more cost effective than other things, this is a good argument against arguments based on simple preference / aesthetics
I strongly disagree. This is not any kind of argument against arguments based on simple preferences / aesthetics, much less a good one. In fact, it’s not clear to me that there are any such arguments at all (except ones based on [within-agent] competing preferences / aesthetics).
Just because a lot of people in a democracy disapproves of things does not mean that market forces shouldn’t be able to disagree with them and be correct about that.
You are perhaps missing the point of democracy.
(Now, if your view is “actually democracy is bad, we ought to have some other system of government”, fair enough, but then you should say so explicitly.)
the Luddites who had little concept of how technological and economic progress lifts everyone out of poverty
The Luddites had a completely correct expectation about how “technological and economic progress” would put them, personally and collectively, out of jobs, which it in fact did. They were not “lifted out of poverty” by mechanization—they were driven into poverty by it.
future computational-life forms will be just as meaningful as the meat-based ones today
You neither have nor can have any certainty about this, or even high confidence. Neither is it relevant—future people do not exist; existing people do.
should not sacrifice orders of magnitudes more life-years than will be lived on Earth
Declining to create people is not analogous to destroying existing people. To claim otherwise is tendentious and misleading. There is no “sacrificing” involved in what we are discussing.
most decisions should be given to as small a group as possible (ideally an individual) who is held accountable for the outcome being good, and is given the resources to make the decision well
Decisions should “be given”—by whom? The people—or else your position is nonsense. Well, I say that we should not give decision-making power to people who will dismantle the Sun. You speak of being “held accountable”—once again, by whom? Surely, again: the people. And that means that the people may evaluate the decisions that the one has made. Well, I say we should evaluate the decision to dismantle the Sun, pre-emptively—and judge it unacceptable. (Why wait until after the fact, when it will be too late? How, indeed, could someone possibly be “held accountable” for dismantling the Sun, after the deed is done? Absurdity!)
↑ comment by AnthonyC · 2025-01-10T15:13:30.895Z · LW(p) · GW(p)
I think that's very probably true, yes. I'm not certain that will continue to be true indefinitely, or that it will or should continue to be the deciding factor for future decision making. I'm just pointing out that we're actually discussing a small subset of a very large space of options, that there are ways of "eating the sun" that allow life to continue unaltered on Earth, and so on. TBH even if we don't do anything like this, I wouldn't be terribly surprised if future humans end up someday building a full or partial shell around Earth anyway, long before there's anything like large-scale starlifting under discussion. Living space, power generation, asteroid deflection, off-world industry, just to name a few reasons we might do something like that. It could end up being easier and cheaper to increase available surface area and energy by orders of magnitude doing something like this than by colonizing other planets and moons.
↑ comment by habryka (habryka4) · 2025-01-10T08:29:08.170Z · LW(p) · GW(p)
(This seems false, and insofar as someone is willing to take bets that resolve after superintelligence, I would bet that most people will not care any appreciable amount about the actual sun existing by the time humanity is capable of actually providing a real alternative.)
↑ comment by Seth Herd · 2025-01-09T13:29:31.502Z · LW(p) · GW(p)
You're such a traditionalist!
More seriously, accusing rationalists of hauling the Amish and their mothers to camps doesn't seem quite fair. Like you said, most rationalists seem pretty nice and aren't proposing involuntary rapid changes. And this post certainly didn't.
You'd need to address the actual arguments in play to write a serious post about this. "Don't propose weird stuff" isn't a very good argument. You could argue that went very poorly with communism, or come up with some other argument. Actually, I think rationalists have come up with some. It looks to me like the more respected rationalists are pretty cautious about doing weird drastic stuff just because the logic seems correct at the time. See the unilateralist's curse and Yudkowsky's and others' pleas that nobody do anything drastic about AGI even though they think it's very likely going to kill us all.
This stuff is fun to think about, but it's planning the victory party before planning how to win the war.
How to put the future into kind and rational hands seems like an equally interesting and much more urgent project right now. I'd be fine with a pretty traditional utopian future or a very weird one, but not fine with joyless machines eating the sun, or worse yet all of the suns they can reach.
↑ comment by Said Achmiz (SaidAchmiz) · 2025-01-10T08:09:33.242Z · LW(p) · GW(p)
it shows up in Solstice songs as a thing we want to do in the Great Transhumanist Future twenty years from now
Is this true?! (Do you have a link or something?)
Replies from: Benito↑ comment by Ben Pace (Benito) · 2025-01-10T08:31:12.914Z · LW(p) · GW(p)
I took a quick look. I did not quite find this, but I found other discussion of suns dying or being used as resources. Sharing as data.
In the song "Five Thousand Years" the lyrics talk about the sun dying in the next 5,000 years.
I don't quite know how things might change
I don't quite know what rules we'd break
Our present selves might think it strange
But there's so many lives at stake...
Entropy is bearin' down
But we got tricks to stick around.
And if we live to see the day
That yellow fades to red then grey,
We'll take a moment, one by one
Turn to face the dying sun
Bittersweetly wave goodbye--
The journey's only just begun...
In (Five thousand years)
(Whatcha want to do, whatcha wanna see, in another)
(Five million years)
(Where we want to go, who we want to be, in another)
Here's a reference to it as a battery, in the (fast, humorous, upbeat) song "The Great Transhumanist Future":
Replies from: Zack_M_Davis
In the Great Transhumanist Future,
There are worlds all fair and bright,
We’ll be constrained by nothing but
The latency of light
When the hospitals are empty
And the sun’s a battery
Making it a breeze
To get outta deep freeze
To give humans wings
And some other things
In the Great Transhumanist Future.
↑ comment by Zack_M_Davis · 2025-01-10T08:34:46.938Z · LW(p) · GW(p)
It's implied in the first verse of "Great Transhumanist Future."
Replies from: SaidAchmiz, Benito, habryka4
One evening as the sun went down
That big old fire was wasteful,
A coder looked up from his work,
And he said, “Folks, that’s distasteful,
↑ comment by Said Achmiz (SaidAchmiz) · 2025-01-10T08:46:50.758Z · LW(p) · GW(p)
Ah, thanks, this does seem to be what @David Matolcsi was referring to.
↑ comment by Ben Pace (Benito) · 2025-01-10T08:47:52.122Z · LW(p) · GW(p)
Thanks for adding that one, I accidentally missed the first reference in the song.
↑ comment by habryka (habryka4) · 2025-01-10T08:41:26.605Z · LW(p) · GW(p)
I don’t think it is implied at all that the sun will or should be torn apart in 20 years?
It is implied that the sun is wasteful from at least one perspective, which hardly can be argued with.
comment by simon · 2025-01-08T21:14:54.723Z · LW(p) · GW(p)
I think that it's likely to take longer than 10,000 years, simply because of the logistics (not the technology development, which the AI could do fast).
The gravitational binding energy of the sun is something on the order of 20 million years' worth of its energy output. OK, half of the needed energy is already present as thermal energy, and you don't need to move every atom to infinity, but you still need a substantial fraction of that. And while you could perhaps generate many times more energy than the solar output by various means, I'd guess you'd have to deal with inefficiencies and lots of waste heat if you try to do it really fast. Maybe if you're smart enough you can make going fast work well enough to be worth it, though?
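For what it's worth, here's a quick back-of-envelope check of that figure (a sketch only; it uses the uniform-density-sphere formula U = 3GM²/5R, while the real Sun is centrally condensed, so the true binding energy is a few times larger):

```python
# Rough estimate: Sun's gravitational binding energy in units of its own output.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 1.989e30    # solar mass, kg
R = 6.957e8     # solar radius, m
L = 3.828e26    # solar luminosity, W

binding_energy = 3 * G * M**2 / (5 * R)   # ~2.3e41 J (uniform-sphere approximation)
years = binding_energy / L / 3.156e7      # divide by seconds per year
print(f"{binding_energy:.2e} J ≈ {years / 1e6:.0f} million years of solar output")  # ~19
```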
Replies from: quetzal_rainbow, jessica.liu.taylor, Charlie Steiner↑ comment by quetzal_rainbow · 2025-01-09T09:55:46.948Z · LW(p) · GW(p)
If you can use 1 kg of hydrogen to lift x > 1 kg of hydrogen using proton-proton fusion, you are getting exponential buildup, limited only by "how many proton-proton reactors you can build in the Solar system" and "how willing you are to actually build them", and you can use that exponential buildup to create all the necessary infrastructure.
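A rough sanity check that x is indeed much greater than 1 (a sketch under idealized assumptions: full p-p-chain burnup converting ~0.7% of mass to energy, and ignoring all engineering losses):

```python
# Energy from fusing 1 kg of hydrogen vs. energy to lift 1 kg from the
# Sun's surface to infinity (its escape energy).
G = 6.674e-11   # m^3 kg^-1 s^-2
M = 1.989e30    # solar mass, kg
R = 6.957e8     # solar radius, m
c = 2.998e8     # speed of light, m/s

fusion_energy_per_kg = 0.007 * c**2   # p-p chain releases ~0.7% of rest mass, ~6.3e14 J/kg
lift_energy_per_kg = G * M / R        # ~1.9e11 J/kg

print(f"x ≈ {fusion_energy_per_kg / lift_energy_per_kg:.0f}")  # roughly 3000
```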
↑ comment by jessicata (jessica.liu.taylor) · 2025-01-08T21:30:14.175Z · LW(p) · GW(p)
I'm not sure what the details would look like, but I'm pretty sure ASI would have enough new technologies to figure something out within 10,000 years. And expending a bunch of waste heat could easily be worth it, if having more computers allows sending out von Neumann probes faster / more efficiently to other stars, since the cost of expending the Sun's energy has to be compared with the ongoing cost of other stars burning.
Replies from: ricraz↑ comment by Richard_Ngo (ricraz) · 2025-01-09T04:12:13.517Z · LW(p) · GW(p)
I'm not sure what the details would look like, but I'm pretty sure ASI would have enough new technologies to figure something out within 10,000 years.
I feel like this is the main load-bearing claim underlying the post, but it's barely argued for.
In some sense the sun is already "eating itself" by doing a fusion reaction, which will last for billions more years. So you're claiming that AI could eat the sun (at least) six orders of magnitude faster, which is not obvious to me.
I don't think my priors on that are very different from yours but the thing that would have made this post valuable for me is some object-level reason to upgrade my confidence in that.
Replies from: jessica.liu.taylor↑ comment by jessicata (jessica.liu.taylor) · 2025-01-09T08:12:12.386Z · LW(p) · GW(p)
- Doesn't have to expend the energy. It's about reshaping the matter to machines. Computers take lots of mass-energy to constitute them, not to power them.
- Things can go 6 orders of magnitude faster due to intelligence/agency, it's not highly unlikely in general.
- I agree that in theory the arguments here could be better. It might require knowing more physics than I do, and has the "how does Kasparov beat you at chess" problem.
↑ comment by Charlie Steiner · 2025-01-09T01:17:57.512Z · LW(p) · GW(p)
I think if you want to go fast, and you can eat the rest of the solar system, you can probably make a huge swarm of fusion reactors to help blow matter off the sun. Let's say you can build 10^11-watt reactors that work in space. Then you need about 10^15 of them to match the sun. If each is 10^6 kg, this is roughly 10^-2 of Mercury's mass.
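Checking that arithmetic with standard values for the solar luminosity and Mercury's mass (the per-reactor wattage and mass are the hypothetical figures above):

```python
# Scale of a fusion-reactor swarm matching the Sun's power output.
L_sun = 3.828e26       # solar luminosity, W
P_reactor = 1e11       # hypothetical per-reactor output, W
m_reactor = 1e6        # hypothetical per-reactor mass, kg
M_mercury = 3.30e23    # Mercury's mass, kg

n = L_sun / P_reactor          # ~4e15 reactors
total_mass = n * m_reactor     # ~4e21 kg
print(f"{n:.1e} reactors, {total_mass / M_mercury:.1e} of Mercury's mass")
# ~3.8e15 reactors, ~1.2e-2 of Mercury's mass
```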
comment by Mikhail Samin (mikhail-samin) · 2025-01-09T03:22:59.144Z · LW(p) · GW(p)
If our story goes well, we might want to preserve our Sun for sentimental reasons.
We might even want to eat some other stars just to prevent the Sun from expanding and dying.
I would maybe want my kids to look up at a night sky somewhere far away and see a constellation with the little dot humanity came from still being up there.
Replies from: Vladimir_Nesov↑ comment by Vladimir_Nesov · 2025-01-09T04:02:01.638Z · LW(p) · GW(p)
Concrete existence, they point out, is less resource efficient than dreams of the machine. Hard to tell how much value is tied up in physical form and not computation, if humans would agree on this either way on reflection.
comment by FlorianH (florian-habermacher) · 2025-01-08T17:09:38.024Z · LW(p) · GW(p)
[..] requires eating the Sun, and will be feasible at some technology level [..]
Do we have some basic physical-feasibility insights on this, or is this just speculation?
Replies from: ete, Gurkenglas, avturchin, jessica.liu.taylor↑ comment by plex (ete) · 2025-01-08T18:45:09.581Z · LW(p) · GW(p)
It's a pretty straightforward modification of the Caplan thruster. You scoop up bits of sun with very strong magnetic fields, but rather than fusing it and using it to move a star, you cool most of it (firing some back with very high velocity to balance things momentum-wise) and keep the matter you extract (or fuse some if you need quick energy). There's even a video on it! Skip to 4:20 for the relevant bit.
Replies from: Charlie Steiner↑ comment by Charlie Steiner · 2025-01-09T00:42:40.958Z · LW(p) · GW(p)
I was expecting (Methods start 16:00)
↑ comment by Gurkenglas · 2025-01-08T19:11:28.654Z · LW(p) · GW(p)
The action space is too large for this to be infeasible, but at a 101 level, if the Sun spun fast enough it would come apart, and angular momentum is conserved so it's easy to add gradually.
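As a rough illustration of what "fast enough" means here (a sketch that treats breakup as the point where equatorial centrifugal acceleration equals surface gravity, ignoring how the Sun would actually deform as it spins up):

```python
import math

# Breakup spin rate for the Sun: omega^2 * R = G * M / R^2 at the equator.
G = 6.674e-11   # m^3 kg^-1 s^-2
M = 1.989e30    # solar mass, kg
R = 6.957e8     # solar radius, m

omega = math.sqrt(G * M / R**3)                 # ~6.3e-4 rad/s
period_hours = 2 * math.pi / omega / 3600
print(f"breakup rotation period ≈ {period_hours:.1f} hours")  # ~2.8 hours
# The Sun currently rotates in ~25 days, so a great deal of angular momentum
# would need to be added.
```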
↑ comment by avturchin · 2025-01-09T21:46:50.329Z · LW(p) · GW(p)
A very heavy and dense body on an elliptical orbit that touches the Sun's surface at each perihelion would collect sizable chunks of the Sun's matter. The movement of matter from one star to another nearby star is a well-known phenomenon.
When the body reaches aphelion, the collected solar matter would cool down and could be harvested. The initial body would need to be very massive, perhaps 10-100 Earth masses. A Jupiter-sized core could work as such a body.
Therefore, to extract the Sun's mass, one would need to make Jupiter's orbit elliptical. This could be achieved through several heavy impacts or gravitational maneuvers involving other planets.
This approach seems feasible even without ASI, but it might take longer than 10,000 years.
↑ comment by jessicata (jessica.liu.taylor) · 2025-01-08T18:18:36.032Z · LW(p) · GW(p)
Mostly speculation based on tech level. But:
- To the extent temperature is an issue, energy can be used to transfer temperature from one place to another.
- Maybe matter from the Sun can be physically expelled into more manageable chunks. The Sun already ejects matter naturally (though at a slow rate).
- Nanotech in general (cell-like, self-replicating robots).
- High energy availability with less-speculative tech like Dyson spheres.
comment by Seth Herd · 2025-01-09T13:06:16.085Z · LW(p) · GW(p)
I'm not sure eating the sun is such a great idea.
If the sun goes out suddenly, it's a pretty clear tipoff that something major is happening over here. Anyone with preferences who sees that might worry about having to compete with whoever ate the sun. They could do something drastic.
Our offspring might conclude that anyone willing to do drastic things to strangers would already be going hard on spreading and eating suns, so it would only signal meaningfully to relatively peaceful types. But I'm not sure we could be sure. Someone might be hiding, doing drastic things to anyone who shows themselves, but doing those drastic things in sneaky ways.
But it does seem like quite a shame to let most of the accessible universe just burn up because you're paranoid about the neighbors.
It will be quite a dilemma unless there's some compelling logic we're missing so far or observations that will allow such logic. Which could be.
Replies from: jessica.liu.taylor↑ comment by jessicata (jessica.liu.taylor) · 2025-01-09T19:49:35.940Z · LW(p) · GW(p)
I think this shades into dark forest theory. Broadly my theory about aliens in general is that they're not effectively hiding themselves, and we don't see them because any that exist are too far away.
Partially it's a matter of, if aliens wanted to hide, could they? Sure, eating a star would show up in terms of light patterns, but also, so would being a civilization at the scale of 2025-earth. And my argument is that these aren't that far-off in cosmological terms (<10K years).
So, I really think alien encounters are in no way an urgent problem: we won't encounter them for a long time, and if they get light from 2025-Earth, they'll already have some idea that something big is likely to happen soon on Earth.
comment by Raemon · 2025-01-08T20:21:04.901Z · LW(p) · GW(p)
This seemed like a nice explainer post, though it's somewhat confusing who the post is for – if I imagine being someone who didn't really understand any arguments about superintelligence, I think I might bounce off the opening paragraph or title because I'm like "why would I care about eating the sun."
There is something nice and straightforward about the current phrasing, but I suspect there's an opening paragraph that would do a better job explaining why you might care about this.
(But I'd be curious to hear from people who weren't really sold on any singularity stuff who read it and can describe how it was for them)
Replies from: jessica.liu.taylor, Morpheus, Vladimir_Nesov↑ comment by jessicata (jessica.liu.taylor) · 2025-01-08T20:39:35.281Z · LW(p) · GW(p)
I think partially it's meant to go from some sort of abstract model of intelligence as a scalar variable that increases at some rate (like, on an x/y graph) to concrete, material milestones. Like, people can imagine "intelligence goes up rapidly! singularity!" and it's unclear what that implies; I'm saying sufficient levels would imply eating the Sun, which makes it harder to confuse with things like "getting higher scores on math tests".
I suppose a more general category would be: the relevant kind of self-improving intelligence is the sort that can repurpose mass-energy into creating more computation that can run its intelligence, and "eat the Sun" is an obvious target given this background notion of intelligence.
(Note, there is skepticism about feasibility on Twitter/X, that's some info about how non-singulatarians react)
↑ comment by Morpheus · 2025-01-09T09:59:08.621Z · LW(p) · GW(p)
I was already sold on the singularity. For what it's worth, I found the post and comments very helpful for why you would want to take the sun apart in the first place, and why it would be feasible and desirable for both superintelligent and non-superintelligent civilizations. (Turning the sun into a smaller sun that doesn't explode seems nicer than having it explode. Fusion gives off way more energy than lifting the material requires; gravity is the weakest of the 4 forces after all. And in a superintelligent civilization with reversible computers, not taking apart the sun would make readily available mass a taut constraint.)
↑ comment by Vladimir_Nesov · 2025-01-08T21:32:20.298Z · LW(p) · GW(p)
Ignoring such confusion is good for hardening the frame where the content is straightforward. It's inconvenient to always contextualize; refusing to do so carves out the space for more comfortable communication.
comment by Noosphere89 (sharmake-farah) · 2025-01-09T20:34:29.717Z · LW(p) · GW(p)
I agree with Richard Ngo and Simon that any dismantling of the sun is going to be a long-term project, and this matters.
Replies from: Raemon↑ comment by Raemon · 2025-01-09T20:42:00.065Z · LW(p) · GW(p)
What do you think Richard Ngo claimed about this?
Replies from: sharmake-farah↑ comment by Noosphere89 (sharmake-farah) · 2025-01-09T20:49:47.566Z · LW(p) · GW(p)
That the argument that ASI could easily (relative to the sun's own efforts) dismantle the star completely was barely argued for; his priors weren't moved much, and he wanted object-level reasons to believe it was feasible to rapidly dismantle the sun.
Replies from: Raemon↑ comment by Raemon · 2025-01-09T21:18:59.026Z · LW(p) · GW(p)
Richard said "I don't think my priors on that are very different from yours but the thing that would have made this post valuable for me is some object-level reason to upgrade my confidence in that." He didn't say it'd be a long-term project; I think he just meant he didn't change his beliefs about it due to this post.