post by [deleted] · GW · 0 comments

This is a link post for


Comments sorted by top scores.

comment by CronoDAS · 2012-08-22T04:15:18.886Z · LW(p) · GW(p)

I'm one of the 9.

Replies from: faul_sname, None
comment by faul_sname · 2012-08-22T16:11:32.637Z · LW(p) · GW(p)

As am I... though I am surprised they list a donation of only $100.

comment by [deleted] · 2012-08-22T11:36:07.444Z · LW(p) · GW(p)

This deserves recognition; upvoted.

comment by laman_blanchard · 2012-08-22T15:47:38.363Z · LW(p) · GW(p)

This post inspired me to make a small donation.

Replies from: bcoburn
comment by bcoburn · 2012-08-22T22:28:01.322Z · LW(p) · GW(p)

Me as well.

comment by zntneo · 2012-08-22T22:35:39.500Z · LW(p) · GW(p)

I donated a small amount

comment by [deleted] · 2012-08-21T17:45:01.840Z · LW(p) · GW(p)

This not being funded would indeed be very sad. Recently, a story about the tragedy that is death touched a lot of people on LessWrong; I think editing that article to link here and encouraging people to donate would be an appropriate move.

Replies from: None, koning_robot
comment by [deleted] · 2012-08-21T19:47:05.564Z · LW(p) · GW(p)

This is appreciated; thank you.

comment by koning_robot · 2012-08-22T11:14:36.173Z · LW(p) · GW(p)

What is it exactly that's so valuable about a person that justifies spending $30,000 worth of resources to preserve it? Their "identity", whatever that means? Their personality, even though it's probably a dime a dozen? Their acquired knowledge that will be outdated by the time they are revived? What is it that we want to preserve?

What is it that is lost when a person dies, that cannot be regained by creating a new one? I'm not in favor of creating new ones, but new ones are created all the time anyway, so why not learn to live with them? Why do we need to do everything the hard way?

Replies from: Kindly, Vladimir_Nesov, metatroll
comment by Kindly · 2012-08-22T12:28:36.370Z · LW(p) · GW(p)

First, we are selfish, and don't want to die (no matter how useful we are to society). Second, we also care about a few other people close to us, and don't want them to die. Third, we want to spare everyone from having to be afraid of death.

I think if you forget about these reasons, then there's no point in preserving people.

Edit: I'm sorry that your comment was downvoted, but I for one think that it's a worthwhile objection to make, even though I disagree with it for the above reasons.

Replies from: koning_robot
comment by koning_robot · 2012-08-23T10:09:45.834Z · LW(p) · GW(p)

I consider these to be emotional reasons rather than rational ones. Specifically not wanting to die, not wanting certain others to die, and being afraid of death are irrational (or at least it is unclear that there are rational reasons for them). I think there are less roundabout ways to (dis)solve these problems than to engineer immortality. In a more rational culture (which we should be steering for anyway), we would not be so viscerally averse to death.

Replies from: Kindly, Desrtopa
comment by Kindly · 2012-08-23T12:40:06.572Z · LW(p) · GW(p)

Rational doesn't mean emotionless. These are emotional reasons -- to which I think I should add that I care about the pain Joe's loved ones feel when Joe dies -- but I think they're important emotional reasons. I wouldn't be me if I didn't care about these things.

I would not want to become "rational" at the cost of forgetting about these reasons, and others. I want to become rational so that I can better understand my emotions, and act on them more effectively.

Replies from: koning_robot
comment by koning_robot · 2012-08-24T09:59:40.957Z · LW(p) · GW(p)

The emotions are irrational in the sense that they are not supported by anything - your brain generates these emotions in these situations and that's it. Emotions are valuable and we need to use rationality to optimize them. Now, there are two ways to satisfy a desire: the obvious one is to change the world to reflect the propositional content of the desire. The less obvious one is to get rid of or alter the desire. I'm not saying that to be rational is to get rid of all your desires. I'm saying that it's a tradeoff, and I am suggesting the possibility that in this case the cost of placating the desire to not die is greater than the cost of getting rid of it.

What worries me is this. It could well be that I am wrong and that the cost of immortality is actually lower than the cost to get rid of the desire for it. But I strongly suspect that this was never the reason for people here to pursue immortality. The real reason has to do with preservation of something that I doubt has value.

Replies from: Kindly, Vladimir_Nesov
comment by Kindly · 2012-08-24T13:03:26.426Z · LW(p) · GW(p)

If I get rid of my desire to do something, then I've replaced myself by a possibly less frustrated person who doesn't value the same things as I do. This is obviously a trade-off, yes.

On the one hand, it's not that I'm ridiculously frustrated by our lack of immortality, I've kind of gotten used to it. I recognize that things could be better, yes.

On the other hand, a version of me that doesn't care if people die or not seems very different from me and frankly kind of abhorrent. I don't even know if I even want that version of me to exist, and I'm certainly not going to have it replace myself if I can help it.

comment by Vladimir_Nesov · 2012-08-24T14:52:23.576Z · LW(p) · GW(p)

The emotions are irrational in the sense that they are not supported by anything - your brain generates these emotions in these situations and that's it.

Beliefs are also something your brain generates. Being represented in meat doesn't by itself make an event unimportant or irrelevant. You value carefully arrived-at beliefs, because you expect they are accurate, they reflect the world. Similarly, you may value some of your emotions, if you expect that they reward events that you approve of, or punish for events that you don't approve of.

See Feeling Rational, The Mystery of the Haunted Rationalist, Summary of "The Straw Vulcan".

Replies from: koning_robot
comment by koning_robot · 2012-08-24T20:34:06.810Z · LW(p) · GW(p)

Yes, but the question here is exactly whether this fear of death that we all share is one of those emotions that we should value, or if it is getting in the way of our rationality. Our species has a long history of wars between tribes and violence among tribe members competing for status. Death has come to be associated with defeat and humiliation.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2012-08-24T21:00:19.201Z · LW(p) · GW(p)

the question here is exactly whether this fear of death that we all share is one of those emotions that we should value

Do you have specific ideas useful for resolving this question?

or if it is getting in the way of our rationality

It's usually best to avoid using the word "rationality" in such contexts. The question is whether one should accept the straightforward interpretation of the emotions of fear of death, and at that point nothing more is added to the problem specification by saying things like "Which answer to this question is truth?" or "Which belief about the answer to this question would be rational?", or "Which belief about this question is desirable?".

See What Do We Mean By "Rationality"?, Avoid inflationary use of terms.

Replies from: koning_robot
comment by koning_robot · 2012-08-28T20:56:32.708Z · LW(p) · GW(p)

Do you have specific ideas useful for resolving this question?

Fear of death doesn't mean death is bad in the same way that fear of black people doesn't mean black people are bad. (Please forgive me the loaded example.)

Fear of black people, or more generally xenophobia, evolved to facilitate kin selection and tribalism. Fear of death evolved for similar reasons, i.e., to make more of "me". We don't know what we mean by "me", or if we do then we don't know what's valuable about the existence of one "me" as opposed to another, and anyway evolution meant something different by "me" (genes rather than organisms).

It's usually best to avoid using the word "rationality" in such contexts.

I actually meant rationality here, specifically instrumental rationality, i.e., "is it getting in the way of us achieving our goals?".

I feel like this thread has gotten derailed and my original point lost, so let me contrive a thought experiment to hopefully be more clear.

Suppose that someone named Alice dies today, but at the moment she ceases to exist, Betty is born. Betty is a lot like Alice in that she has a similar personality, will grow up in a similar environment and will end up affecting the world in similar ways. What of fundamental value was lost when Alice died that Betty's birth did not replace? (The grief for Alice's death and the joy for Betty's birth have instrumental value, as did Alice's acquired knowledge.)

If you find that I've set this up to fit my conclusions, then I don't think we disagree.

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2012-08-28T21:25:05.814Z · LW(p) · GW(p)

What of fundamental value was lost when Alice died that Betty's birth did not replace?

Hard to say. Notice that in such examples we are past the point where the value of things is motivated by instrumental value (i.e. such thought experiments try to strip away the component of value that originates as instrumental value), and terminal value is not expected to be easy to enunciate. As a result, the difficulty with explaining terminal value is only weak evidence for absence of said terminal value. In other words, if you can't explain what exactly is valuable in such situations, that doesn't strongly indicate that there is nothing valuable there. One of the few things remaining in such cases is to look directly at emotional urges and resolve contradictions in their recommendations in terms of instrumental value (consequentialism and game theory).

comment by Desrtopa · 2012-08-23T12:39:28.592Z · LW(p) · GW(p)

If it's irrational not to want to die, what do you think it would be rational to want?

Replies from: koning_robot
comment by koning_robot · 2012-08-24T09:37:56.957Z · LW(p) · GW(p)

Pleasurable experiences. My life facilitates them, but it doesn't have to be "my" life. Anyone's life will do.

Replies from: Desrtopa
comment by Desrtopa · 2012-08-24T23:45:51.456Z · LW(p) · GW(p)

And why do you think it's rational to want this, but not to want one's own survival?

Replies from: koning_robot
comment by koning_robot · 2012-08-25T09:49:04.774Z · LW(p) · GW(p)

Because it feels good. My ongoing survival leaves me cold entirely.

Replies from: Desrtopa, Viliam_Bur
comment by Desrtopa · 2012-08-25T13:24:03.756Z · LW(p) · GW(p)

How would you distinguish this, as a "rational" reason, from "emotional" reasons, as you did in your previous comment?

comment by Viliam_Bur · 2012-08-25T10:20:35.351Z · LW(p) · GW(p)

Then wireheading is the best solution. The interesting fact is that wireheading anyone else would give you as much utility as wireheading you.

comment by Vladimir_Nesov · 2012-08-24T16:11:16.554Z · LW(p) · GW(p)

What is it that is lost when a person dies, that cannot be regained by creating a new one?

I'm uncertain about the value and fungibility of human life. Emotions clearly support non-fungibility, in particular concerning your own life, and it's a strong argument. On the other hand, my goals are sufficiently similar to everyone else's goals that loss of my life wouldn't prevent my goals from controlling the world; it would be done through others. Only existential disaster or severe value drift would prevent my goals from controlling the world.

(The negative response to your comment may be explained by the fact that you appear to be expressing confidence in the unusual solution (that value of life is low) to this difficult question without giving an argument for that position. At best the points you've made are arguments in support of uncertainty in the position that the value of life is very high, not strong enough to support the claim that it's low. If your claim is that we shouldn't be that certain, you should clarify by stating that more explicitly. If your claim is that the value of life is low, the argument you are making should be stronger, or else there is no point in insisting on that claim, even if that happens to be your position, since absent argument it won't be successfully instilled in others.)

Replies from: koning_robot
comment by koning_robot · 2012-08-24T23:06:16.815Z · LW(p) · GW(p)

Emotions clearly support non-fungibility, in particular concerning your own life, and it's a strong argument.

I (now) understand how the existence of certain emotions in certain situations can serve as an argument for or against some proposition, but I don't think the emotions in this case form that strong an argument. There's a clear motive. It was evolution, in the big blue room, with the reproductive organs. It cares about the survival of chunks of genetic information, not about the well-being of the gene expressions.

Thanks for helping me understand the negative response. My claim here is not about the value of life in general, but about the value of some particular "person" continuing to exist. I think the terminal value of this ceasing to exist is zero. Since posting my top-level comment I have provided some arguments in favor of my case, and also hopefully clarified my position.

comment by metatroll · 2012-08-22T11:21:57.898Z · LW(p) · GW(p)

What is it that is lost when a person dies, that cannot be regained by creating a new one?

If you go to a really high place, and look over the edge far enough, you'll find out.

Replies from: koning_robot
comment by koning_robot · 2012-08-23T10:20:48.759Z · LW(p) · GW(p)

Do you think that preserving my brain after the fact makes falling from a really high place any less unpleasant? Or are you appealing to my emotions (fear of death)?

Replies from: metatroll
comment by metatroll · 2012-08-23T11:42:23.881Z · LW(p) · GW(p)

Don't feed the troll.

Replies from: koning_robot
comment by koning_robot · 2012-08-24T10:10:01.396Z · LW(p) · GW(p)

Sorry for being snarky. I am sincere. I really do think that death is not such a big deal. It sucks, but it sucks only because of the negative sensations it causes in those left behind. All that said, I don't think you gave me anything but an appeal to emotion.

Replies from: None
comment by [deleted] · 2012-08-25T08:16:25.513Z · LW(p) · GW(p)

Arguing we should seek pleasurable experiences is also an appeal to emotion.

Replies from: koning_robot
comment by koning_robot · 2012-08-25T09:38:09.789Z · LW(p) · GW(p)

It's different. The fact that I feel bad when confronted with my own mortality doesn't mean that mortality is bad. The fact that I feel bad when so confronted does mean that the feeling is bad.

Replies from: None
comment by [deleted] · 2012-08-25T10:36:28.234Z · LW(p) · GW(p)

I'm curious. What is your position on wireheading?

comment by advancedatheist · 2012-08-21T15:18:04.114Z · LW(p) · GW(p)

I'd like to donate, but at the moment I may have to direct discretionary time and money towards saving Kim Suozzi:

http://www.reddit.com/r/atheism/comments/ydsy5/reddit_help_me_find_some_peace_in_dying_young_im/

The lack of interest in the prize puzzles me. Some very wealthy cryonicists want to tie up fortunes in speculative revival trusts, yet they depend on financially inadequate cryonics organizations to keep them in suspension against foreseeable adversities, and they seem uninterested in trying to improve the science of preserving their own brains. I don't understand this business model.

Replies from: gwern, None, None, V_V, None
comment by gwern · 2012-08-21T16:05:30.998Z · LW(p) · GW(p)

I'd like to donate, but at the moment I may have to direct discretionary time and money towards saving Kim Suozzi: http://www.reddit.com/r/atheism/comments/ydsy5/reddit_help_me_find_some_peace_in_dying_young_im/ The lack of interest in the prize puzzles me.

Why would you be puzzled when you have answered your own question?

comment by [deleted] · 2012-08-21T15:29:59.512Z · LW(p) · GW(p)

Behavior like that has deepened my skepticism of the cryonics crowd - there are glaring discrepancies between professed beliefs and actual behavior.

Replies from: BrassLion, V_V
comment by BrassLion · 2012-08-21T19:39:06.599Z · LW(p) · GW(p)

Prisoner's dilemma. If someone else donates and I don't, I get to eat my cryopreservation and have it too. Or something like that.

At least this thread has rustled up a few more donations.

comment by V_V · 2012-08-23T22:11:55.666Z · LW(p) · GW(p)

I think the best explanation for this behavior is that cryonics is essentially a religious funeral ritual.

Most people who get cryopreserved don't really expect, at a deep level, that it will extend their life, much like most believers in traditional religions don't really expect an afterlife in the otherworld or reincarnation (that's why they all fear death and generally try to postpone it as much as possible).

Professing belief in the religious tenets and performing the required rituals may provide some emotional solace as long as willing suspension of disbelief (self-deception, if you prefer) can be maintained. That might explain the lackluster interest in a potentially falsifying experiment: should it turn out that preserved brains are manifestly damaged, maintaining suspension of disbelief would become much more difficult.

Another typical function of religious beliefs and rituals is social signalling: they are a way for a community (transhumanists, in the case of cryonics) to maintain and reinforce social cohesion.

Replies from: ScottMessick
comment by ScottMessick · 2012-08-23T23:22:54.422Z · LW(p) · GW(p)

I think this hypothesis is worth bearing in mind. However, it doesn't explain advancedatheist's observation that wealthy cryonicists are eager to put a lot of money in revival trusts (whose odds of success are dubious, even if cryonics works) rather than donate to improve cryonics research or the financial viability of cryonics organizations.

Replies from: V_V
comment by V_V · 2012-08-23T23:56:46.595Z · LW(p) · GW(p)

Maybe it's something like the Egyptian pharaohs putting gold and valuables in their pyramids.

Replies from: None
comment by [deleted] · 2012-08-24T16:29:51.130Z · LW(p) · GW(p)

The hypothesis "many people are engaging in cryonics as signalling/psychological-reassurance" is not incompatible with the hypothesis "there exist people interested in cryonics on a practical level, eager for potentially falsifying experiments". Indeed, it's even possible for both of these things to be true of a single person.

Many long-shot medical procedures serve similar functions - but this does not preclude them from being legitimate medical procedures. And there, too, I would expect a non-trivial subset of patients (and doctors) to be reluctant to seek out falsifying evidence.

There is likely some truth in your assertion that cryonics is fulfilling many of the same psychological and social functions of burial rituals - but that does not adequately explain all behavior in the cryonics arena.

comment by [deleted] · 2012-08-21T15:31:25.375Z · LW(p) · GW(p)

Has Suozzi's story been confirmed by CI yet?

Replies from: advancedatheist
comment by advancedatheist · 2012-08-21T16:10:06.310Z · LW(p) · GW(p)

I don't know about CI's due diligence. As the secretary of the Society for Venturism, which has the ability to raise money for Miss Suozzi's suspension, I can confirm that we've pursued our end of checking out her story.

We helped out in getting William O'Rights cryosuspended a few years ago, for example:

http://www.cryonics.org/reports/CI93.html

One of our directors has interviewed Miss Suozzi, and she may have an article about her written up soon which we'll post on the Venturists' website:

http://venturist.info/

Replies from: None, JGWeissman, None
comment by [deleted] · 2012-08-21T18:30:54.131Z · LW(p) · GW(p)

Just donated to Kim's fund.

comment by JGWeissman · 2012-08-21T16:44:51.248Z · LW(p) · GW(p)

Do you have any plans to manage a donation fund for her?

comment by [deleted] · 2012-08-21T16:28:52.891Z · LW(p) · GW(p)

Thank you! Could you publicize your confirmation? I believe there are a number of people willing to donate who were holding off until the story was confirmed. What is the best way to donate to the fund?

comment by V_V · 2012-08-21T23:14:14.811Z · LW(p) · GW(p)

saving Kim Suozzi

saving?

and they seem uninterested in trying to improve the science of preserving their own brains

Maybe they don't want to spread the flour on the dragon

comment by [deleted] · 2012-08-29T05:32:18.629Z · LW(p) · GW(p)

Some very wealthy cryonicists want to tie up fortunes in speculative revival trusts, yet they depend on financially inadequate cryonics organizations to keep them in suspension against foreseeable adversities, and they seem uninterested in trying to improve the science of preserving their own brains. I don't understand this business model.

Think pyramids, only you don't need thousands of slaves and a truly inconvenient amount of sandstone.

comment by faul_sname · 2012-08-21T18:01:06.747Z · LW(p) · GW(p)

When do they need the money by? I'm currently a bit low on funds due to tuition and rent, but I can probably spare $500-1000 in a couple of weeks (I try to maintain an emergency fund, and don't want to dip into that if I don't need to).

Also, do you have a page I can link to where they describe the immediate funding gap? If possible, I will turn this into a donation-matching thing, probably on /r/transhuman (leverage is always good).

Replies from: Ben_Scarlato, gwern
comment by Ben_Scarlato · 2012-08-24T15:59:52.488Z · LW(p) · GW(p)

I'm one of the volunteers at the Brain Preservation Foundation. Although sooner is always better, there isn't a specific reason why now is better than 2 weeks in the future. If you need the money for an emergency fund, I'd wait to donate.

The page describing the need for our current fundraising campaign is here: http://www.brainpreservation.org/content/letter-president-brain-preservation-foundation

Replies from: faul_sname
comment by faul_sname · 2012-08-24T23:00:36.658Z · LW(p) · GW(p)

Thank you for that. It looks like I do indeed have a new job, so as soon as the paycheck comes in I will not need to dip into the emergency fund.

By the way, do you happen to know what would happen if the funding didn't come about? Would you dip into the prize fund, or simply hold off on testing until you got sufficient funding?

Replies from: Ben_Scarlato
comment by Ben_Scarlato · 2012-08-26T03:43:22.920Z · LW(p) · GW(p)

I can't speak authoritatively, but I think testing would have to be put on hold.

The $100,000 for the prize fund is pledged specifically for when the prize is won, so there's no easy way to change that.

comment by gwern · 2012-08-21T21:55:09.791Z · LW(p) · GW(p)

As Bakker's Prince of Nothing-verse books say (which I have lost the last 5 days or so to reading), "Measure is unceasing".

These first tests, while perhaps the most valuable (as the initial observations of anything usually are), are - hopefully - only the first; if your donations do not change whether the first ones happen, they may change whether the second batch does.

(Even in the extraordinarily unlikely scenario where all the techniques produce perfect preservation according to the first test, one would still want periodic tests to check that the techniques are still being done right.)

Replies from: faul_sname
comment by faul_sname · 2012-08-21T22:17:36.111Z · LW(p) · GW(p)

That doesn't answer the question of whether they need the money now or in two weeks. I dislike exposing myself to unnecessary financial risk, particularly at the beginning of a semester and when my work is undergoing layoffs (not that I consider either particularly likely to be a problem, but the risk is distinctly elevated right now and my reserves are lower than I like).

Replies from: gwern
comment by gwern · 2012-08-21T23:40:32.103Z · LW(p) · GW(p)

That doesn't answer the question of whether they need the money now or in two weeks.

Oh, I didn't realize you really meant now or a few weeks. As far as I know, there is no significant reason why donating now would be better than in a few weeks, aside from looking good here and maybe encouraging some other people to donate (for which a public commitment ought to be enough).

comment by V_V · 2012-08-21T23:43:07.529Z · LW(p) · GW(p)

So, some anonymous person can give away $100k to back the prize, but not the $25-50k to fund the evaluation process needed to award the prize?

Why don't they offer a $50k prize and use the other $50k to fund the evaluation, instead of soliciting donations? Why don't they just offer a medal of insignificant material value? The winner is going to get lots of bragging rights anyway.

comment by Benquo · 2012-08-21T18:46:07.391Z · LW(p) · GW(p)

Does anyone here know if BPF is a 501(c)(3) organization? If so, I can probably get some of my donation matched by my employer.

Replies from: None
comment by [deleted] · 2012-08-21T19:03:14.455Z · LW(p) · GW(p)

"The Brain Preservation Foundation was incorporated in Delaware on August 27, 2010. We hold Section 501(c)(3) tax-exempt status as a not-for-profit scientific research organization. Your contributions are fully tax deductible. Thank you."

http://www.brainpreservation.org/content/donate

Replies from: Benquo
comment by Benquo · 2012-08-21T19:21:19.785Z · LW(p) · GW(p)

Thanks - I'm not sure why I didn't see that before. I've now requested a match, for an additional $956.

Replies from: faul_sname
comment by faul_sname · 2012-08-21T19:37:20.854Z · LW(p) · GW(p)

It took me googling "tax exempt site:brainpreservation.org" to find that, so it may not just be you.

Replies from: None
comment by [deleted] · 2012-08-21T19:46:21.508Z · LW(p) · GW(p)

Yeah, same here. Their site could use some work.

comment by MileyCyrus · 2012-08-21T16:06:14.956Z · LW(p) · GW(p)

How much money do they need?

Replies from: None
comment by [deleted] · 2012-08-21T16:25:19.211Z · LW(p) · GW(p)

According to them, approximately $25-50k - less than the cost of a single cryosuspension. This should be easily fundable with token contributions from people who have expressed an interest in brain preservation.

comment by grendelkhan · 2012-08-29T18:52:25.889Z · LW(p) · GW(p)

I've donated a relatively small amount, and will donate more when my finances allow (that's not open-ended; I'm expecting a small windfall in a few months). It should go without saying, but if you have a good employer, check to see if they match charitable donations! Mine turned my donation from a pitifully small one into a just plain small one.

Wouldn't it be weird if it turns out that there's an excellent and durable method of preserving brains, but it's not the one that's been used for the last half-century or so? Horrifying, obviously, but profoundly weird as well. The two positions I've seen on the topic have been "it never has and never will work", and "it's worked since some possibly-specified time in the past". A world in which people who sign up for preservation avoid death if and only if they're lucky enough to have signed up after, say, 2030, feels weirder than a world where the rational are rewarded, the irrational punished. The zog, I suppose.

Replies from: gwern, grendelkhan
comment by gwern · 2013-04-20T23:32:04.686Z · LW(p) · GW(p)

A world in which people who sign up for preservation avoid death if and only if they're lucky enough to have signed up after, say, 2030, feels weirder than a world where the rational are rewarded, the irrational punished

It feels weirder, but has many precedents. Many 'bubbles' can be profitably interpreted as people being 100% correct about their vision of the future - but messing up the timing (see http://www.hoover.org/publications/policy-review/article/5646 and http://www.economist.com/news/finance-and-economics/21575737-lessons-americas-long-history-property-booms-betting-house for examples).

I used this in another comment, but consider the case of an investor in the ill-fated Pets.com: was the investor right to believe that Americans would spend a ton of money online such as for buying dogfood? Absolutely, Amazon is a successful online retail business that stocks thousands of dog food varieties, to say nothing of all the other pet-related goods it sells. But the value of Pets.com still went to ~0. Many startups have a long list of failed predecessors who tried to do pretty much the same thing, and what made them a success was that they happened to give the pinata a whack at the exact moment where some cost curves or events hit the right point. (Facebook is the biggest archive of photographs there has ever been, with truly colossal storage requirements; could it have succeeded in the 1990s? No, and not even later, as demonstrated by Orkut & Friendster, and the lingering death of MySpace.)

You can read books from the past about tech visionaries and note how many of them were spot-on in their beliefs about what would happen (The Media Lab was a good example of this - I read it constantly thinking 'yes, you were right, for all the good it did you' or 'not quite, it'd actually take another decade for that to really work out') but where a person would have been ill-advised to act on the correct forecasts. Or look at computers: imagine an early adopter of an Apple computer saying 'everyone will use computers eventually!' Yes, but not for another few decades, and 'in the long run, we are all dead'.

If cryonics turned out to be worthless for everyone doing it before 2030 while perfectly correct in principle and practical post-2030, it would simply be yet another technology where visionaries were ultimately right despite all nay-saying & skepticism from normals but nevertheless jumped on it too early.

When a knife drops, a fraction of a second divides a brilliant save from an emergency-room visit. They don't call it the 'bleeding edge' for nothing.

Replies from: grendelkhan
comment by grendelkhan · 2013-04-22T17:14:25.482Z · LW(p) · GW(p)

Wow; that just reminded me of a bit from The Smartest Guys In The Room, where Enron partnered with Blockbuster to stream movies-on-demand over the internet in 2000. It was a scam, but clearly someone thought it was a real thing. (Netflix started streaming movies in 2007.)

And--yes, you said it. Projects like this and OpenWorm are particularly important because they help narrow down really uncertain things; OpenWorm, for instance, might be able to settle the "neurons are really complicated"/"neurons are accurately simulatable-in-bulk by simple models" dispute, as well as the "the connectome is/is not sufficient" thing.

comment by grendelkhan · 2013-04-20T22:40:28.692Z · LW(p) · GW(p)

Well, a number of things have gone not-as-planned, but it did help to make a public commitment here, and I've (finally!) donated an order of magnitude more than I did last year, along with the corresponding employer match. Last year's donation drive is over, but I expect they'll still have science to do.

I look forward to seeing the results.

comment by [deleted] · 2012-09-06T16:30:58.263Z · LW(p) · GW(p)

Donated $100. I think there are 26 total donors now.