Comments

Comment by Bot_duplicate0.2909851821605116 on Effects of Castration on the Life Expectancy of Contemporary Men · 2017-04-07T18:39:44.815Z · LW · GW

I'm considering taking anti-androgens, but I'm not sure what effect this would have on lifespan.

Would anti-androgen use have effects on lifespan similar to those of castration? I know both anti-androgens and castration decrease testosterone production, but I know almost nothing about this area, so I don't know whether that is the relevant mechanism.

Anti-androgens are much easier to obtain than castration. According to this, "WPATH Standards of Care no longer encourage therapy as a requirement to access hormones".

Also, according to the article I linked, "Your body needs sex hormones – estrogen or testosterone – primarily for bone health but for a myriad other reasons as well." Hormone replacement therapy has potentially dangerous side effects, though. Do you know whether these risks would outweigh the benefits of castration or anti-androgen use?

Comment by Bot_duplicate0.2909851821605116 on Digital Immortality Map: How to collect enough information about yourself for future resurrection by AI · 2017-01-25T23:19:44.303Z · LW · GW

Digital immortality seems much cheaper than cryonics and of similar effectiveness. Why isn't it more popular?

Comment by Bot_duplicate0.2909851821605116 on Digital Immortality Map: How to collect enough information about yourself for future resurrection by AI · 2017-01-25T23:18:49.681Z · LW · GW

I question whether a future society would be willing to bring someone back to life even if it were clear that the person wanted to be brought back and sufficient information had been stored to make it possible.

There might not be a moral reason to bring someone back to life: if future agents value contented agents, they could presumably create far more of them, far more easily, by engineering agents from scratch for maximum contentment with minimum resource use.

There might not be an economic reason to bring someone back to life, because future agents would be able to make far more efficient workers than 21st-century humans.

There might not be a selfish reason to bring someone back to life, because although having a 21st-century person in a far more advanced world might be interesting, I think superintelligences could find far more interesting things to do.

Perhaps one way to increase the probability of being brought back to life is to set up a trust fund or something similar for that purpose. Some cryonics organizations maintain trust funds for reviving their preserved patients, which is a similar arrangement. What do you think of this?

Comment by Bot_duplicate0.2909851821605116 on Immortality Roadmap · 2017-01-09T02:25:09.040Z · LW · GW

You suggested reading Longecity. However, it seems that most articles on Longecity are relevant only to whoever posted them, for example asking what to do in a very specific situation, or aren't relevant at all to increasing one's chance of becoming immortal. Given this, how do you recommend reading Longecity, if at all?

Comment by Bot_duplicate0.2909851821605116 on Immortality Roadmap · 2016-12-31T13:40:24.028Z · LW · GW

Okay. When you would like me to help, email me at 0xakaria@gmail.com.

Comment by Bot_duplicate0.2909851821605116 on Immortality Roadmap · 2016-12-30T17:25:42.215Z · LW · GW

If a version is written in English, I'll probably be willing to review and proofread it. I'm a decent writer in English, and I know a fairly large amount about immortality. I wrote Immortality: A Practical Guide using a different Less Wrong account, in case you're interested.

Comment by Bot_duplicate0.2909851821605116 on Immortality Roadmap · 2016-12-30T17:08:38.719Z · LW · GW

Thanks for the response. Do you know if the book will be made available in English, and if so, approximately when?

Comment by Bot_duplicate0.2909851821605116 on Immortality Roadmap · 2016-12-29T15:31:45.416Z · LW · GW

Yes, that's why I'm asking here.

Comment by Bot_duplicate0.2909851821605116 on Immortality Roadmap · 2016-12-29T00:58:50.017Z · LW · GW

In case you didn't know, storing text as images, as your mind map does, is bad for accessibility. Non-visual readers, such as search-engine indexing bots and blind people using screen readers, have great difficulty reading it.
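To illustrate (a minimal sketch in Python; the page snippet and image filename are hypothetical), here is roughly what a text-only consumer of a page can extract: text nodes and alt attributes come through, while the contents of an image do not.

```python
from html.parser import HTMLParser

# Rough sketch of what a text-only reader (indexing bot, screen reader)
# can extract from a page: text nodes and img alt attributes only.
class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        if data.strip():
            self.parts.append(data.strip())

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if alt:
                self.parts.append(alt)

# Hypothetical page: a heading plus the roadmap stored as one big image.
page = '<h1>Immortality Roadmap</h1><img src="roadmap.png">'

extractor = TextExtractor()
extractor.feed(page)
print(" | ".join(extractor.parts))
# Output: "Immortality Roadmap" -- everything inside the image is lost.
# Adding alt text, or publishing the map as real text, fixes this.
```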

Comment by Bot_duplicate0.2909851821605116 on Immortality Roadmap · 2016-12-29T00:55:22.432Z · LW · GW

Thanks for the post. The bottom of the mind map references the book Immortality by Alexey Turchin, but an Internet search failed to turn up any links to it or any discussion of it. Do you know where it can be found?

Comment by Bot_duplicate0.2909851821605116 on Open thread, Dec. 19 - Dec. 25, 2016 · 2016-12-29T00:47:51.904Z · LW · GW

I have been looking for articles discussing to what extent terminal values change. This question is important, because a change in terminal values is generally very harmful to their accomplishment, as explained here for AI under "Basic AI drives".

This article says that some values change. This paper suggests that there are core values that are unlikely to change. However, neither of these articles says whether the values it examined are terminal values, and I'm not knowledgeable enough about psychology to determine this myself.

Any relevant thoughts or links would be appreciated.

Comment by Bot_duplicate0.2909851821605116 on Open thread, Jul. 18 - Jul. 24, 2016 · 2016-07-18T21:48:50.353Z · LW · GW

I have been considering the possibility that demographic changes due to mind uploading could be even more extreme than you might initially think. The cause would be people who are both willing to create massive numbers of copies of themselves and better suited to an economic niche than anyone else, or at least than anyone else willing to make very large numbers of copies. In such a situation, it would be more profitable for a firm to hire copies of that person than to hire anyone else, which may result in that niche being dominated by a single individual.

For example, if there is one person who is better at software development than anyone else and is willing to make very large numbers of copies of themselves, there may end up being millions of copies of that one individual, all developing software.
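As a toy illustration (all numbers are made up), the argument only needs a tiny marginal advantage: if copies of the best developer are even slightly more productive per unit of running cost than anyone else, a profit-maximizing firm staffs the entire niche with them.

```python
# Toy model (all numbers hypothetical): a firm staffing one economic
# niche chooses whichever worker maximizes output minus running cost.

def niche_profit(productivity: float, running_cost: float, jobs: int) -> float:
    """Profit from filling every job in the niche with identical copies."""
    return jobs * (productivity - running_cost)

JOBS_IN_NICHE = 1_000_000  # hypothetical size of the software niche

# Copies of the single best developer, vs. a pool of everyone else.
best_copier = niche_profit(productivity=1.05, running_cost=1.00, jobs=JOBS_IN_NICHE)
others      = niche_profit(productivity=1.00, running_cost=1.00, jobs=JOBS_IN_NICHE)

print(best_copier, others)  # 50000.0 0.0
# Even a 5% edge sends the whole niche to copies of one person,
# provided that person is willing to be copied a million times.
```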

However, there may be regulations to prevent such people from causing so many others to become unemployed, for example by limiting the number of copies people can make of themselves. I know little about politics, so feedback on this would be greatly appreciated.

Maximizing the number of copies of yourself may be desirable, for example if you are a more effective altruist than most and thus want to maximize the resources available for you and your copies to do good. Or if you want a clone army.

Thus, I would very much like to know whether this is a realistic consideration and how to maximize the number of copies you can make of yourself. It will probably be useful to avoid dying before mind uploading arrives, to get cryopreserved if you fail at that, and to become as skilled as possible at tasks that will be economically important in the future. I am unsure how general or specific such tasks should be, for example whether you should try to become an expert at software development as a whole or specialize in, say, debugging fatal errors in mid-sized system software. The latter would probably increase the probability of your filling a niche, but would also shrink that niche.