Comments

Comment by Fredrik on What are you working on? February 2012 · 2012-02-07T18:30:08.949Z · LW · GW

I am trying to build a collaborative argumentation analysis platform. It sounds like we want almost exactly the same thing. Who are you working with? What is your detailed vision?

Please join our FB group at https://www.facebook.com/groups/arguable or contact me at branstrom at gmail.com.

Comment by Fredrik on Meetup : Stockholm meetup · 2011-10-03T09:18:56.099Z · LW · GW

I'm game!

Comment by Fredrik on Hindsight Devalues Science · 2011-06-10T04:26:01.183Z · LW · GW

What if I were to try to create such a web app? Should I spend 5 minutes every lunch break asking friends and colleagues to brainstorm questions? Maybe write a LW post asking for questions? Maybe there could be a section of the site dedicated to collecting and curating good questions (crowdsourced or centrally moderated).

Comment by Fredrik on Mini-camp on Rationality, Awesomeness, and Existential Risk (May 28 through June 4, 2011) · 2011-05-03T08:15:55.143Z · LW · GW

No matter. Just received word!

Comment by Fredrik on Mini-camp on Rationality, Awesomeness, and Existential Risk (May 28 through June 4, 2011) · 2011-05-03T07:53:26.640Z · LW · GW

I guess I wasn't selected if I haven't received an email by now? Or are you staying up late sorting applications? Will you email just the selectees or all applicants?

Comment by Fredrik on Sequential Organization of Thinking: "Six Thinking Hats" · 2010-03-20T03:13:48.329Z · LW · GW

I had the same experience.

Comment by Fredrik on Savulescu: "Genetically enhance humanity or face extinction" · 2010-01-14T22:55:29.072Z · LW · GW

Right… I might have my chance to save the world then. The problem is, everyone will get access to the technology at roughly the same time, I imagine. What if the military gets there first? This has probably been discussed elsewhere on LW, though...

Comment by Fredrik on Savulescu: "Genetically enhance humanity or face extinction" · 2010-01-12T15:13:51.006Z · LW · GW

Well, presumably Roko means we would be restricting the freedom of the irrational sticklers - possibly very efficiently due to our superior intelligence - rather than overriding their will entirely (or rather, making informed guesses as to what is in their ultimate interests, and then acting on that).

Comment by Fredrik on Savulescu: "Genetically enhance humanity or face extinction" · 2010-01-12T14:59:02.807Z · LW · GW

I definitely seem to have a tendency toward utilitarian thinking. Could you recommend some reading on the ethical philosophy you subscribe to, so that I can evaluate it in more depth?

Comment by Fredrik on Savulescu: "Genetically enhance humanity or face extinction" · 2010-01-12T04:16:31.783Z · LW · GW

Well, the AI would "presume to know" what's in everyone's best interests. How is that different? It's smarter than us, that's it. Self-governance isn't holy.

Comment by Fredrik on Savulescu: "Genetically enhance humanity or face extinction" · 2010-01-11T02:17:57.923Z · LW · GW

Just out of curiosity, are you for or against the Friendly AI project? I tend to think that it might go against the previously expressed will of a lot of people, who would rather watch Simpsons and have sex than have their lives radically transformed by some oversized toaster.

Comment by Fredrik on Savulescu: "Genetically enhance humanity or face extinction" · 2010-01-11T00:48:08.199Z · LW · GW

I might be wrong in my beliefs about their best interests, but that is a separate issue.

Given the assumption that undergoing the treatment is in everyone's best interests, wouldn't it be rational to forgo autonomous choice? Can we agree that it would be?

Comment by Fredrik on Savulescu: "Genetically enhance humanity or face extinction" · 2010-01-11T00:41:10.440Z · LW · GW

Well, the attention of those capable of solving FAI should be undivided. Those who aren't equipped to work on FAI and who could potentially make progress on intelligence-enhancing therapies should do so.

Comment by Fredrik on Savulescu: "Genetically enhance humanity or face extinction" · 2010-01-10T22:30:20.660Z · LW · GW

Culture has also produced radical Islam. Just watch http://www.youtube.com/watch?v=xuAAK032kCA to get a bit more pessimistic about the natural evolution of the moral zeitgeist in culture.

Comment by Fredrik on Savulescu: "Genetically enhance humanity or face extinction" · 2010-01-10T22:23:47.106Z · LW · GW

So individual autonomy is more important? I just don't get that. It's what's behind the wheels of the autonomous individuals that matters. It's a hedonic equation. The risk that unaltered humans pose to the happiness and progress of all other individuals might just work out to "way too fracking high".

It's everyone's happiness and progress that matters. If you can raise the floor for everyone, so that we're all just better, what's not to like about giving everybody that treatment?

Comment by Fredrik on Savulescu: "Genetically enhance humanity or face extinction" · 2010-01-10T17:26:31.375Z · LW · GW

You don't have to trust the government; you just have to trust the scientists who developed the drug or gene therapy. They are the ones who would be responsible for the drug working as advertised and having negligible side effects.

But yes, I sympathize with you; I'm actually just like that myself. Some people wouldn't be able to appreciate the usefulness of the drug, no matter how hard you tried to explain to them that it's safe, helpful, and actually globally risk-alleviating. Those who were memetically sealed off from believing that, or just weren't capable of grasping it, would oppose it strongly - possibly enough to wage war on the rest of the world over it.

It would also take time to reach the whole population with a governmentally mandated treatment. There isn't even a world government right now. We are weak and slow. And one comparatively insane man on the run is one too many.

Assuming an efficient treatment for human stupidity could be developed (and assuming that would be a rational solution to our predicament), the right thing to do would be to deliver it in the manner causing the least social upheaval and opposition. That would most definitely be covert dispersal. A globally coordinated release of a weaponized retrovirus, for example.

We still have some time before even that can be accomplished, though. And once that tech gets here, we face a hugely increased risk of bioterrorism, or just accidental catastrophes at the hands of some clumsy research assistant, before we have a chance to even properly prototype and test our perfect smart drug.

Comment by Fredrik on Savulescu: "Genetically enhance humanity or face extinction" · 2010-01-10T03:44:07.472Z · LW · GW

Even in such a scenario, some rotten eggs would probably refuse the smart drug treatment or the gene therapy injection - perhaps exactly those who would be the instigators of extinction events? Or at least the two groups would overlap somewhat, I fear.

I'm starting to think it would be rational to disperse our world-saving drug of choice by means of an engineered virus of our own, or something equally radically effective. But don't quote me on that. Or whatever, go ahead.

Comment by Fredrik on Savulescu: "Genetically enhance humanity or face extinction" · 2010-01-10T03:37:28.941Z · LW · GW

X-risk-alleviating AGI just has to be days late to the party for a supervirus created by a terrorist cell to have crashed it. I guess I'd judge against putting all our eggs in the AI basket.

Comment by Fredrik on Regular NYC Meetups · 2009-10-02T19:31:00.137Z · LW · GW

I wonder how many Swedish readers there are. A meetup in Stockholm or Gothenburg would be kind of nice.

Comment by Fredrik on Singularity Summit 2009 (quick post) · 2009-09-11T22:46:38.683Z · LW · GW

So you haven't read his Sweet Dreams: Philosophical Obstacles to a Science of Consciousness?

Comment by Fredrik on Singularity Summit 2009 (quick post) · 2009-08-18T12:08:04.678Z · LW · GW

I think Eliezer was just stating a fact? Or an impression.

Comment by Fredrik on Singularity Summit 2009 (quick post) · 2009-08-17T05:22:48.570Z · LW · GW

"They're really trying to raise the intellectual level this year" sounds like music to my ears.