Comments

Comment by Turgurth on Open thread, Jul. 17 - Jul. 23, 2017 · 2017-07-22T17:09:51.964Z

I saw this same query in the last open thread. I suspect you aren't getting any responses because the answer is long and involved. I don't have time to give you the answer in full either, so I'll give you the quick version:

I am in the process of signing up with Alcor because, after ten years of both observing cryonics organizations myself and reading what other people say about them, I've found that Alcor has given a series of cues that it is the more professional cryonics organization.

So, the standard advice is: if you are young and healthy with a long life expectancy, and are not wealthy, choose C.I., because it is less expensive. If those criteria do not apply to you, choose Alcor, as it appears to be the more serious, professional organization.

In other words: choose C.I. as the type of death insurance you want to have, but probably won't use, or choose Alcor as the type of death insurance you probably will use.

Comment by Turgurth on How often do you check this forum? · 2017-02-04T21:31:46.923Z

ping

Comment by Turgurth on Open thread, Dec. 12 - Dec. 18, 2016 · 2016-12-17T19:15:17.358Z

Check out this FDA speculation.

Scott Alexander comments here.

Comment by Turgurth on 2014 Less Wrong Census/Survey · 2014-10-24T04:11:57.304Z

Taken. Wasn't bothered by the length -- could be even longer next time.

Comment by Turgurth on 2013 Less Wrong Census/Survey · 2013-11-23T12:53:31.335Z

I took the survey, and wanted it to be longer.

Comment by Turgurth on How habits work and how you may control them · 2013-10-13T20:51:29.901Z

I wanted to love this post, but stylistic issues got in the way.

It read too much like a gwern essay: certainly interesting, but in need of a summary and a guide to how it is practically applicable. A string of highlights and commentary with no clear underlying organization or conclusion is not optimally useful.

That being said, I appreciate you taking the time to create this post, as well as your call for constructive criticism.

Comment by Turgurth on Yet more "stupid" questions · 2013-09-08T23:07:18.734Z

It's not specifically rationalist, but Dune is what first comes to mind for "smart characters that win", at least in the first book.

Comment by Turgurth on High Challenge · 2013-09-08T06:04:26.446Z

Well, does time permit?

Comment by Turgurth on Rationality Quotes September 2013 · 2013-09-02T00:11:39.863Z

"Not being able to get the future exactly right doesn’t mean you don’t have to think about it."

--Peter Thiel

Comment by Turgurth on Open thread, August 19-25, 2013 · 2013-08-23T18:56:29.777Z

Hmmm. You do have some interesting ideas regarding cryonics funding that sound promising, but to be safe I would talk to Alcor about them directly, specifically Diane Cremeens, to ensure ahead of time that they'll work for the organization.

Comment by Turgurth on Open thread, August 19-25, 2013 · 2013-08-23T01:53:29.179Z

That does sound about right, but with two potential caveats: one is that individual circumstances might also matter in these calculations. For example, my risk of dying in a car accident is much lowered by not driving and only rarely riding in cars. However, my risk of dying of heart disease is raised by a strong family history.

There may also be financial considerations. Cancer almost always takes time to kill, and heart disease and stroke often do as well. If you were paying for cryonics out of pocket, this wouldn't matter, but if you were paying with life insurance, the cost of the policy would go up, perhaps dramatically, if you were to wait until the onset of serious illness to make your arrangements, as life insurance companies are not fond of pre-existing conditions. It might be worth noting that age alone also increases the cost of life insurance.

That being said, it's also fair to say that even a successful cryopreservation has a (roughly) 10-20% chance of preserving your life, taking most factors into account.

So again, the key here is determining how strongly you value your continued existence. If you could come up with a rough estimated monetary value for your life, taking the probability of radical life extension into account, that may clarify matters considerably. There are values at which that (roughly) 5% overall chance is too little, or close to the line, or plenty sufficient, or way more than sufficient; it's quite a spectrum.
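To make that concrete, here is a minimal back-of-the-envelope sketch in Python; every figure in it is a hypothetical placeholder, not a recommendation, so substitute your own estimates:

```python
# Back-of-the-envelope expected-value check for cryonics.
# All figures are hypothetical placeholders -- substitute your own estimates.

p_overall_success = 0.05      # rough overall chance cryonics preserves your life
value_of_life = 20_000_000    # your estimated monetary value of continued existence
lifetime_cost = 150_000       # rough all-in cost (dues plus life insurance premiums)

expected_benefit = p_overall_success * value_of_life  # 1,000,000 with these numbers

if expected_benefit > lifetime_cost:
    print(f"Worth it at these numbers: ${expected_benefit:,.0f} vs. ${lifetime_cost:,.0f}")
else:
    print(f"Not worth it at these numbers: ${expected_benefit:,.0f} vs. ${lifetime_cost:,.0f}")
```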

Comment by Turgurth on Open thread, August 19-25, 2013 · 2013-08-22T01:08:16.906Z

I don't think it's been asked before on Less Wrong, and it's an interesting question.

It depends on how much you value not dying. If you value it very strongly, the risk of sudden, terminal, but not immediately fatal injuries or illnesses, as mentioned by paper-machine, might be unacceptable to you, and would point toward joining Alcor sooner rather than later.

The marginal increase your support would add to the probability of Alcor surviving as an institution might also matter to you selfishly, since it would make a stronger Alcor more likely to exist when you are older and will likely need it more than you do now.

Additionally, while it's true that it's unlikely that Alcor would reach you in time if you were to die suddenly, compare that risk to your chance of survival if, alternatively, you don't join Alcor soon enough and, after your hypothetical fatal car crash, you end up rotting in the ground.

And hey, if you really want selfish reasons: signing up for cryonics is high-status in certain subcultures, including this one.

There are also altruistic reasons to join Alcor, but that's a separate issue.

Comment by Turgurth on August 2013 Media Thread · 2013-08-03T07:29:49.860Z

I just want to mention how much I appreciate these threads: this is my most trusted source of media recommendations! Thank you to all involved.

Comment by Turgurth on August 2013 Media Thread · 2013-08-03T07:18:44.480Z

Good idea. (Note: if you haven't seen the film, here's a spoiler-heavy synopsis.)

My line of thought (in ROT13, to avoid spoilers):

Gur pngnfgebcuvp (yngre erirnyrq gb or rkvfgragvny) evfx bs gur svyz vf gur svpgvbany bar bs tvnag zbafgref, be xnvwh, nggnpxvat uhzna cbchyngvba pragref, ohg greebevfz hfvat shgher jrncbaf, nagvovbgvp erfvfgnapr, pyvzngr punatr, rgp. pna pyrneyl or fhofgvghgrq va.

Tvnag zbafgref pna jbex nf na rfcrpvnyyl ivfpreny qenzngvmngvba bs pngnfgebcuvp evfxf, nf abg bayl Cnpvsvp Evz ohg gur bevtvany, harqvgrq, Wncnarfr-ynathntr irefvba bs gur svyz Tbwven (gur frzvany xnvwh zbivr) qrzbafgengr. V jbhyqa'g or fhecevfrq vs jr vafgvapgviryl srne ynetr, ntterffvir cerqngbef zber fb guna, fnl, punatrf va nirentr grzcrengher gung nttertngr bire gur pbhefr bs qrpnqrf, be cnegvpyrf fb fznyy jr pna'g frr gurz jvgubhg zvpebfpbcrf.

Tbqmvyyn'f (uvtuyl tencuvp, crbcyr ohea nyvir) qrfgehpgvba bs Gbxlb ercerfragf gur Jbeyq Jne VV Nzrevpna sverobzovat bs gung pvgl naq gur ahpyrne nggnpxf ba Uvebfuvzn naq Antnfnxv. Gur punenpgref bs Tbwven rkcyvpvgyl yrnir bcra gur cbffvovyvgl bs shgher xnvwh (ernyyl jnef, ahpyrne rkpunatrf, naq gur raivebazragny rssrpgf bs ahpyrne jrncbaf grfgvat) jrnevat uhznavgl qbja hagvy gurer vf abguvat yrsg.

Ubjrire, gubhtu Tbwven raqf ba n crffvzvfgvp abgr, Cnpvsvp Evz qbrf abg. Urer, pbbcrengvba vf fhssvpvrag gb birepbzr naq ryvzvangr gur guerng. Gur zbivr'f gvgyr vzcyvrf gur arprffvgl bs vagreangvbany pbbcrengvba, ohg gur vzcebivat eryngvbafuvcf orgjrra gur Wnrtre cvybgf nyfb qrzbafgengr gur gurzr bs pbbcrengvba qbja gb gur vagrecrefbany yriry. Guvf vf rfcrpvnyyl qenzngvmrq ol gur pbaprcg bs Qevsgvat, gur qrrcyl crefbany, pnershyyl rkrphgrq, OPV-ranoyrq flapuebavmngvba (ba nyy yriryf: zragny, rzbgvbany, naq culfvpny) arprffnel gb cvybg Wnrtref naq guhf qrsrng gur xnvwh.

Jura uhznaf nggrzcg gb svtug tvnag zbafgref (naq pngnfgebcuvp evfxf) nybar, gurl ghea bhg rvgure qrnq be onqyl qnzntrq, obgu culfvpnyyl naq cflpubybtvpnyyl. Nf gur qverpgbe, qry Gbeb, chg vg va na vagreivrj: (ersreevat gb bar bs gur pbasyvpgf orgjrra gur Wnrtre cvybgf): "Gung thl lbh jrer orngvat gur fuvg bhg bs gra zvahgrf ntb? Gung'f gur thl lbh unir gb jbex jvgu svir zvahgrf yngre" naq (zber trarenyyl) "Rvgure jr trg nybat be jr qvr".

Ng svefg gur uhznaf qb fb jryy ntnvafg gur xnvwh gung, bs pbhefr, gurl trg birepbasvqrag naq pbzcynprag, naq gura gur gvqr bs jne gheaf ntnvafg gurz nf gur xnvwh nqncg gb uhzna qrsrafrf. Vafgrnq bs vzcebivat gur nyernql rkcrafvir Wnrtre cebtenz be nggrzcgvat bgure npgvir nccebnpurf gb gur xnvwh ceboyrz gung zvtug cbffvoyl snvy, srneshy tbireazrag ohernhpengf pubbfr gur "fnsr" cnffvir bcgvba naq ortva pbafgehpgvba bs znffvir jnyyf nebhaq nyy gur znwbe pvgvrf bs gur Cnpvsvp Evz.

Jura vg orpbzrf pyrne gung guvf fgengrtl jvyy abg jbex, fbzr crbcyr qrfcnve, fbzr svtug rnpu bgure, fbzr ergerng gb nabgure ynlre bs jnyyf pbafgehpgrq shegure vaynaq, naq fbzr ernpg jvgu nccebcevngr, gubhtu qrfcrengr, npgvivgl. Nf bar punenpgre, Fgnpxre, chgf vg: "Unira'g lbh urneq? Vg'f gur raq bs gur jbeyq. Jurer jbhyq lbh engure qvr? Urer, be va n Wnrtre pbpxcvg?" Guvf vf gur pbeerpg erfcbafr gb pngnfgebcuvp naq rkvfgragvny evfxf.

Gur synfuonpx fprar vaibyivat gur ivbyrag qrngu bs Znxb'f snzvyl jura fur vf n fznyy puvyq, gur oyrnx, qrinfgngrq Gbxlb naq ure greevslvat arne-qrngu nf n tvnag zbafgre gnxrf n cnegvphyne vagrerfg va ure rssrpgviryl fubjf gur uhzna pbfg bs pngnfgebcuvp evfxf ba na rzbgvbany yriry, nf fpbcr artyrpg vf bs pbhefr na vffhr urer.

Hygvzngryl, jung'f yrsg bs na haqrefgnssrq, haqreshaqrq Wnrtre cebtenz (n snzvyvne ceboyrz sbe rkvfgragvny evfx betnavmngvbaf) fhpprffshyyl chefhrf abg bayl na npgvir qrsrafr ohg na bssrafvir fgengrtl vagraqrq gb raq xnvwh nggnpxf bapr naq sbe nyy. (Erzvaqf zr bs gur Sevraqyl NV nccebnpu gb abg bayl HSNV ohg gb pngnfgebcuvp naq rkvfgragvny evfxf va trareny). Gurl qb fb va bccbfvgvba gb gur pbairagvbany jvfqbz, jvgu pbbcrengvba, jvgu pbhentr, jvgu zngurzngvpf, naq jvgu fpvrapr naq grpuabybtl.

"Gbqnl ng gur rqtr bs bhe ubcr, ng gur raq bs bhe gvzr, jr unir pubfra gb oryvrir va rnpu bgure. Gbqnl jr snpr gur zbafgref gung ner ng bhe qbbe, gbqnl jr ner pnapryyvat gur ncbpnylcfr!" N zrffntr gung, gubhtu purrfl, cyrnfnagyl fhecevfrf naq vafcverf zr guebhtu vgf bcgvzvfz va guvf ntr bs plavpvfz.

Comment by Turgurth on August 2013 Media Thread · 2013-08-03T01:58:20.413Z

Pacific Rim pleasantly surprised me. I could list the manifold ways this movie dramatizes how to correctly deal with catastrophic risks, but I don't want to spoil it for you.

Plus it is awesome, in both senses of the word.

Comment by Turgurth on Harry Potter and the Methods of Rationality discussion thread, part 25, chapter 96 · 2013-07-26T17:18:51.012Z

Google Translate gets me "flight of death" or "wants death". "Flight of death" might refer to AK. More interestingly, "wants death" would make no sense as a reference to Voldemort wanting his own death, but it would make sense as a reference to him wanting the deaths of others. There's some possible support for your interpretation there.

Comment by Turgurth on Akrasia Tactics Review 2: The Akrasia Strikes Back · 2013-07-19T07:37:02.385Z

Your heuristic for getting the news checks out in my experience, so that seems worth trying.

I wouldn't be surprised if we've both seen plenty of Snowden/NSA on Hacker News.

Thanks for the links.

And while I agree with you that quitting the news would likely be intellectually hygienic and emotionally healthy, it would probably also work as an anti-akrasia tactic in the specific case of cutting out something I often turn to in order to avoid actual work. Similar to the "out of sight, out of mind" principle, but more "out of habit, out of mind".

Comment by Turgurth on Akrasia Tactics Review 2: The Akrasia Strikes Back · 2013-07-19T01:27:02.173Z

Thanks for doing this, though I suggest moving it to the Discussion section in the hope of getting more responses there.

Comment by Turgurth on Akrasia Tactics Review 2: The Akrasia Strikes Back · 2013-07-19T01:20:06.956Z

I'm curious (non-judgmentally): do you get your news now from non-mainstream sources, or do you stay away from news altogether? I ask because I am considering trying this anti-akrasia tactic myself, but am unsure about the details.

Comment by Turgurth on Meetup : Cincinnati: Financial optimisation · 2013-07-18T07:27:23.132Z

This sounds promising. As an LWian living many states away, I'd love to see a synopsis posted if it's not too much trouble. There is a hunger for more instrumental rationality on this website.

Comment by Turgurth on Summary of "The Straw Vulcan" · 2013-07-18T01:38:51.909Z

To add to Principle #5, in a conversational style: "if something exists, that something can be quantified. Beauty, love, and joy are concrete and measurable; you just fail at measuring them. To be fair, you lack the scientific and technological means of doing so, but failure is failure. You failing at quantification does not devalue something of value."

Comment by Turgurth on "Stupid" questions thread · 2013-07-17T09:26:36.751Z · LW · GW

I suppose the first step would be being more instrumentally rational about what I should be instrumentally rational about. What are the goals that are most worth achieving, or, what are my values?

Comment by Turgurth on "Stupid" questions thread · 2013-07-14T00:01:35.266Z · LW · GW

Reading the Sequences has improved my epistemic rationality, but not so much my instrumental rationality. What are some resources that would help me with this? Googling is not especially helping. Thanks in advance for your assistance.

Comment by Turgurth on "Stupid" questions thread · 2013-07-13T23:57:07.200Z · LW · GW

Try some exposure therapy for whatever it is you're often afraid of. Can't think of anything? I'd be surprised if you're completely immune to every common phobia.

Comment by Turgurth on Open Thread, June 2-15, 2013 · 2013-06-04T17:00:53.044Z

Advice from the Less Wrong archives.

Comment by Turgurth on Preparing for a Rational Financial Planning Sequence · 2013-05-23T02:58:13.895Z

Very interested.

Also, here's a bit of old discussion on the topic I found interesting enough to save.

Comment by Turgurth on Rationality Quotes February 2013 · 2013-02-03T01:12:28.425Z

If you can't appeal to reason to make reason appealing, you appeal to emotion and authority to make reason appealing.

Comment by Turgurth on Open Thread, January 16-31, 2013 · 2013-01-18T05:43:59.408Z

I don't think there are any such community pressures, as long as a summary accompanies the link.

Comment by Turgurth on Open Thread, January 16-31, 2013 · 2013-01-16T06:41:40.685Z

Thanks!

Comment by Turgurth on Open Thread, January 16-31, 2013 · 2013-01-16T03:21:52.246Z

I recently noticed "The Fable of the Dragon-Tyrant" under the front page's Featured Articles section, which caused me to realize that there's more to Featured Articles than the Sequences alone. This particular article (an excellent one, by the way) is also not from Less Wrong itself, yet is obviously relevant to it; it's hosted on Nick Bostrom's personal site.

I'm interested in reading high-quality non-Sequences articles relevant to Less Wrong that I might have missed (I'm making my way through the Sequences separately using the [SEQ RERUN] feature), so is there an archive of Featured Articles? I looked, but was unable to find one.

Comment by Turgurth on Stupid Questions Open Thread Round 2 · 2012-04-23T03:00:25.011Z

Michaelcurzi's How to avoid dying in a car crash is relevant. Bentarm's comment on that thread makes an excellent point regarding coronary heart disease.

There is also Eliezer Yudkowsky's You Only Live Twice and Robin Hanson's We Agree: Get Froze on cryonics.

Comment by Turgurth on against "AI risk" · 2012-04-12T05:44:57.169Z

I have a few questions, and I apologize if these are too basic:

1) How concerned is SI with existential risks vs. how concerned is SI with catastrophic risks?

2) If SI is solely concerned with x-risks, do I assume correctly that you also think about how cat. risks can relate to x-risks (certain cat. risks might raise or lower the likelihood of other cat. risks, certain cat. risks might raise or lower the likelihood of certain x-risks, etc.)? It must be hard avoiding the conjunction fallacy! Or is this sort of thing more what the FHI does?

3) Is there much tension in SI thinking between achieving FAI as quickly as possible (to head off other x-risks and cat. risks) vs. achieving FAI as safely as possible (to head off UFAI), or does one of these goals occupy significantly more of your attention and activities?

Edited to add: thanks for responding!

Comment by Turgurth on How would you take over Rome? · 2012-03-16T02:11:42.152Z

One possible alternative would be choosing to appear in the Americas.