I saw this same query in the last open thread. I suspect you aren't getting any responses because the answer is long and involved. I don't have time to give you the answer in full either, so I'll give you the quick version:
I am in the process of signing up with Alcor, because after ten years of both observing cryonics organizations myself and reading what other people say about them, Alcor has given a series of cues that they are the more professional cryonics organization.
So the standard advice is: if you are young and healthy with a long life expectancy, and are not wealthy, choose C.I., because they are less expensive. If those criteria do not apply to you, choose Alcor, as they appear to be the more serious, professional organization.
In other words: choose C.I. as the type of death insurance you want to have, but probably won't use, or choose Alcor as the type of death insurance you probably will use.
Check out this FDA speculation.
Scott Alexander comments here.
Taken. Wasn't bothered by the length -- could be even longer next time.
I took the survey, and wanted it to be longer.
I wanted to love this post, but stylistic issues got in the way.
It read too much like a gwern essay: certainly interesting, but in need of a summary and a guide for how it is practically applicable. A string of highlights and commentary with no clear underlying organization and conclusion is not optimally useful.
That being said, I appreciate you taking the time to create this post, as well as your call for constructive criticism.
It's not specifically rationalist, but Dune is what first comes to mind for "smart characters that win", at least in the first book.
"Not being able to get the future exactly right doesn’t mean you don’t have to think about it."
--Peter Thiel
Hmmm. You have some interesting ideas regarding cryonics funding that sound promising, but to be safe I would talk to Alcor directly, specifically Diane Cremeens, to ensure ahead of time that those ideas will work for them.
That does sound about right, but with two potential caveats: one is that individual circumstances might also matter in these calculations. For example, my risk of dying in a car accident is much lowered by not driving and only rarely riding in cars. However, my risk of dying of heart disease is raised by a strong family history.
There may also be financial considerations. Cancer almost always takes time to kill, and heart disease and stroke often do as well. If you were paying for cryonics out of pocket, this wouldn't matter, but if you were paying with life insurance, the cost of the policy would go up, perhaps dramatically, if you waited until the onset of serious illness to make your arrangements, as life insurance companies are not fond of pre-existing conditions. It might be worth noting that age alone also increases the cost of life insurance.
That being said, it's also fair to say that even a successful cryopreservation has a (roughly) 10-20% chance of preserving your life, taking most factors into account.
So again, the key here is determining how strongly you value your continued existence. If you could come up with a rough monetary estimate of the value of your life, taking the probability of radical life extension into account, that may clarify matters considerably. There are values at which that (roughly) 10-20% chance is too little, or close to the line, or plenty sufficient, or way more than sufficient; it's quite a spectrum.
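To make that concrete, here is a minimal back-of-the-envelope sketch in Python; every number in it (value of life, success probability, total cost) is a made-up placeholder for illustration, not a recommendation:

```python
# Back-of-the-envelope expected-value check for signing up for cryonics.
# All numbers below are illustrative placeholders; substitute your own.

value_of_life = 10_000_000  # estimated monetary value of your continued existence ($)
p_success = 0.15            # midpoint of the rough 10-20% figure above
total_cost = 100_000        # lifetime membership dues plus life insurance premiums ($)

expected_benefit = p_success * value_of_life
print(f"Expected benefit: ${expected_benefit:,.0f} vs. cost: ${total_cost:,.0f}")
print("Worth it" if expected_benefit > total_cost else "Not worth it")
```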
I don't think it's been asked before on Less Wrong, and it's an interesting question.
It depends on how much you value not dying. If you value it very strongly, the risk of sudden, terminal, but not immediately fatal injuries or illnesses, as mentioned by paper-machine, might be unacceptable to you, and would point toward joining Alcor sooner rather than later.
The marginal increase your support would add to the probability of Alcor surviving as an institution might also matter to you selfishly, since this would increase the probability that there will exist a stronger Alcor when you are older and will likely need it more than you do now.
Additionally, while it's true that it's unlikely that Alcor would reach you in time if you were to die suddenly, compare that risk to your chance of survival if you instead don't join Alcor soon enough and, after your hypothetical fatal car crash, end up rotting in the ground.
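To put rough numbers on that comparison: the chance of dying unsigned compounds with every year you wait. A quick sketch, assuming a purely hypothetical annual risk of sudden death for a young, healthy person:

```python
# Cumulative chance of dying before signing up if you wait n more years.
# The annual risk figure is a hypothetical placeholder, not actuarial data.

annual_risk = 0.001  # assumed yearly chance of sudden death, young and healthy

for years in (1, 5, 10, 20):
    p_die_unsigned = 1 - (1 - annual_risk) ** years
    print(f"Wait {years:2d} years -> {p_die_unsigned:.2%} chance of dying unsigned")
```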
And hey, if you really want selfish reasons: signing up for cryonics is high-status in certain subcultures, including this one.
There are also altruistic reasons to join Alcor, but that's a separate issue.
I just want to mention how much I appreciate these threads: this is my most trusted source of media recommendations! Thank you to all involved.
Good idea. (Note: if you haven't seen the film, here's a spoiler-heavy synopsis.)
My line of thought:
Gur pngnfgebcuvp (yngre erirnyrq gb or rkvfgragvny) evfx bs gur svyz vf gur svpgvbany bar bs tvnag zbafgref, be xnvwh, nggnpxvat uhzna cbchyngvba pragref, ohg greebevfz hfvat shgher jrncbaf, nagvovbgvp erfvfgnapr, pyvzngr punatr, rgp. pna pyrneyl or fhofgvghgrq va.
Tvnag zbafgref pna jbex nf na rfcrpvnyyl ivfpreny qenzngvmngvba bs pngnfgebcuvp evfxf, nf abg bayl Cnpvsvp Evz ohg gur bevtvany, harqvgrq, Wncnarfr-ynathntr irefvba bs gur svyz Tbwven (gur frzvany xnvwh zbivr) qrzbafgengr. V jbhyqa'g or fhecevfrq vs jr vafgvapgviryl srne ynetr, ntterffvir cerqngbef zber fb guna, fnl, punatrf va nirentr grzcrengher gung nttertngr bire gur pbhefr bs qrpnqrf, be cnegvpyrf fb fznyy jr pna'g frr gurz jvgubhg zvpebfpbcrf.
Tbqmvyyn'f (uvtuyl tencuvp, crbcyr ohea nyvir) qrfgehpgvba bs Gbxlb ercerfragf gur Jbeyq Jne VV Nzrevpna sverobzovat bs gung pvgl naq gur ahpyrne nggnpxf ba Uvebfuvzn naq Antnfnxv. Gur punenpgref bs Tbwven rkcyvpvgyl yrnir bcra gur cbffvovyvgl bs shgher xnvwh (ernyyl jnef, ahpyrne rkpunatrf, naq gur raivebazragny rssrpgf bs ahpyrne jrncbaf grfgvat) jrnevat uhznavgl qbja hagvy gurer vf abguvat yrsg.
Ubjrire, gubhtu Tbwven raqf ba n crffvzvfgvp abgr, Cnpvsvp Evz qbrf abg. Urer, pbbcrengvba vf fhssvpvrag gb birepbzr naq ryvzvangr gur guerng. Gur zbivr'f gvgyr vzcyvrf gur arprffvgl bs vagreangvbany pbbcrengvba, ohg gur vzcebivat eryngvbafuvcf orgjrra gur Wnrtre cvybgf nyfb qrzbafgengr gur gurzr bs pbbcrengvba qbja gb gur vagrecrefbany yriry. Guvf vf rfcrpvnyyl qenzngvmrq ol gur pbaprcg bs Qevsgvat, gur qrrcyl crefbany, pnershyyl rkrphgrq, OPV-ranoyrq flapuebavmngvba (ba nyy yriryf: zragny, rzbgvbany, naq culfvpny) arprffnel gb cvybg Wnrtref naq guhf qrsrng gur xnvwh.
Jura uhznaf nggrzcg gb svtug tvnag zbafgref (naq pngnfgebcuvp evfxf) nybar, gurl ghea bhg rvgure qrnq be onqyl qnzntrq, obgu culfvpnyyl naq cflpubybtvpnyyl. Nf gur qverpgbe, qry Gbeb, chg vg va na vagreivrj, ersreevat gb bar bs gur pbasyvpgf orgjrra gur Wnrtre cvybgf: "Gung thl lbh jrer orngvat gur fuvg bhg bs gra zvahgrf ntb? Gung'f gur thl lbh unir gb jbex jvgu svir zvahgrf yngre" naq (zber trarenyyl) "Rvgure jr trg nybat be jr qvr".
Ng svefg gur uhznaf qb fb jryy ntnvafg gur xnvwh gung, bs pbhefr, gurl trg birepbasvqrag naq pbzcynprag, naq gura gur gvqr bs jne gheaf ntnvafg gurz nf gur xnvwh nqncg gb uhzna qrsrafrf. Vafgrnq bs vzcebivat gur nyernql rkcrafvir Wnrtre cebtenz be nggrzcgvat bgure npgvir nccebnpurf gb gur xnvwh ceboyrz gung zvtug cbffvoyl snvy, srneshy tbireazrag ohernhpengf pubbfr gur "fnsr" cnffvir bcgvba naq ortva pbafgehpgvba bs znffvir jnyyf nebhaq nyy gur znwbe pvgvrf bs gur Cnpvsvp Evz.
Jura vg orpbzrf pyrne gung guvf fgengrtl jvyy abg jbex, fbzr crbcyr qrfcnve, fbzr svtug rnpu bgure, fbzr ergerng gb nabgure ynlre bs jnyyf pbafgehpgrq shegure vaynaq, naq fbzr ernpg jvgu nccebcevngr, gubhtu qrfcrengr, npgvivgl. Nf bar punenpgre, Fgnpxre, chgf vg: "Unira'g lbh urneq? Vg'f gur raq bs gur jbeyq. Jurer jbhyq lbh engure qvr? Urer, be va n Wnrtre pbpxcvg?" Guvf vf gur pbeerpg erfcbafr gb pngnfgebcuvp naq rkvfgragvny evfxf.
Gur synfuonpx fprar (gur ivbyrag qrngu bs Znxb'f snzvyl jura fur vf n fznyy puvyq, gur oyrnx, qrinfgngrq Gbxlb, ure greevslvat arne-qrngu nf n tvnag zbafgre gnxrf n cnegvphyne vagrerfg va ure) rssrpgviryl fubjf gur uhzna pbfg bs pngnfgebcuvp evfxf ba na rzbgvbany yriry, fvapr fpbcr artyrpg vf bs pbhefr na vffhr urer.
Hygvzngryl, jung'f yrsg bs na haqrefgnssrq, haqreshaqrq Wnrtre cebtenz (n snzvyvne ceboyrz sbe rkvfgragvny evfx betnavmngvbaf) fhpprffshyyl chefhrf abg bayl na npgvir qrsrafr ohg na bssrafvir fgengrtl vagraqrq gb raq xnvwh nggnpxf bapr naq sbe nyy. (Erzvaqf zr bs gur Sevraqyl NV nccebnpu gb abg bayl HSNV ohg gb pngnfgebcuvp naq rkvfgragvny evfxf va trareny). Gurl qb fb va bccbfvgvba gb gur pbairagvbany jvfqbz, jvgu pbbcrengvba, jvgu pbhentr, jvgu zngurzngvpf, naq jvgu fpvrapr naq grpuabybtl.
"Gbqnl ng gur rqtr bs bhe ubcr, ng gur raq bs bhe gvzr, jr unir pubfra gb oryvrir va rnpu bgure. Gbqnl jr snpr gur zbafgref gung ner ng bhe qbbe, gbqnl jr ner pnapryyvat gur ncbpnylcfr!" N zrffntr gung, gubhtu purrfl, cyrnfnagyl fhecevfrf naq vafcverf zr guebhtu vgf bcgvzvfz va guvf ntr bs plavpvfz.
Pacific Rim pleasantly surprised me. I could list the manifold ways this movie dramatizes how to correctly deal with catastrophic risks, but I don't want to spoil it for you.
Plus it is awesome, in both senses of the word.
Google Translate gets me "flight of death" or "wants death". "Flight of death" might refer to AK. More interestingly, "wants death" would make no sense in reference to himself wanting death, but it would make sense in reference to Voldemort wanting the deaths of others. There's some possible support for your interpretation there.
Your heuristic for getting the news checks out in my experience, so that seems worth trying.
I wouldn't be surprised if we've both seen plenty of Snowden/NSA on Hacker News.
Thanks for the links.
And while I agree with you that quitting the news would likely be intellectually hygienic and emotionally healthy, for me it would probably also work as an anti-akrasia tactic, cutting out something I often turn to in order to avoid actual work. Similar to the "out of sight, out of mind" principle, but more "out of habit, out of mind".
Thanks for doing this, though I suggest moving it to the Discussion section in the hope of getting more responses there.
I'm curious (nonjudgementally): do you get your news now from non-mainstream sources, or do you stay away from news altogether? I ask because I am considering trying this anti-akrasia tactic myself, but am unsure regarding the details.
This sounds promising. As a LWian living many states away, I'd love to see a synopsis posted if it's not too much trouble. There is a hunger for more instrumental rationality on this website.
To add to Principle #5, in a conversational style: "if something exists, that something can be quantified. Beauty, love, and joy are concrete and measurable; you just fail at it. To be fair, you lack the scientific and technological means of doing so, but - failure is failure. You failing at quantification does not devalue something of value."
I suppose the first step would be being more instrumentally rational about what I should be instrumentally rational about. What are the goals that are most worth achieving, or, what are my values?
Reading the Sequences has improved my epistemic rationality, but not so much my instrumental rationality. What are some resources that would help me with this? Googling has not been especially helpful. Thanks in advance for your assistance.
Try some exposure therapy to whatever it is you're often afraid of. Can't think of what you're often afraid of? I'd be surprised if you're completely immune to every common phobia.
Advice from the Less Wrong archives.
Very interested.
Also, here's a bit of old discussion on the topic I found interesting enough to save.
If you can't appeal to reason to make reason appealing, you appeal to emotion and authority to make reason appealing.
I don't think there are any such community pressures, as long as a summary accompanies the link.
Thanks!
I recently noticed "The Fable of the Dragon-Tyrant" under the front page's Featured Articles section, which caused me to realize that there's more to Featured Articles than the Sequences alone. This particular article (an excellent one, by the way) is also not from Less Wrong itself, yet is obviously relevant to it; it's hosted on Nick Bostrom's personal site.
I'm interested in reading high-quality non-Sequences articles (I'm making my way through the Sequences separately using the [SEQ RERUN] feature) relevant to Less Wrong that I might have missed, so is there an archive of Featured Articles? I looked, but was unable to find one.
Michaelcurzi's How to avoid dying in a car crash is relevant. Bentarm's comment on that thread makes an excellent point regarding coronary heart disease.
There is also Eliezer Yudkowsky's You Only Live Twice and Robin Hanson's We Agree: Get Froze on cryonics.
I have a few questions, and I apologize if these are too basic:
1) How concerned is SI with existential risks vs. how concerned is SI with catastrophic risks?
2) If SI is solely concerned with x-risks, do I assume correctly that you also think about how cat. risks can relate to x-risks (certain cat. risks might raise or lower the likelihood of other cat. risks, certain cat. risks might raise or lower the likelihood of certain x-risks, etc.)? It must be hard avoiding the conjunction fallacy! Or is this sort of thing more what the FHI does?
3) Is there much tension in SI thinking between achieving FAI as quickly as possible (to head off other x-risks and cat. risks) vs. achieving FAI as safely as possible (to head off UFAI), or does one of these goals occupy significantly more of your attention and activities?
Edited to add: thanks for responding!
One possible alternative would be choosing to appear in the Americas.