Posts

A Nightmare for Eliezer 2009-11-29T00:50:11.700Z

Comments

Comment by Madbadger on Attention Lurkers: Please say hi · 2010-04-21T00:55:40.299Z · LW · GW

Hi! 8-)

Comment by Madbadger on Fundamentally Flawed, or Fast and Frugal? · 2009-12-21T05:39:05.605Z · LW · GW

Here is an example of an amusing "Fast and Frugal" heuristic for evaluating claims with a lot of missing knowledge and required computation: http://xkcd.com/678/

Comment by Madbadger on Fundamentally Flawed, or Fast and Frugal? · 2009-12-20T21:14:07.005Z · LW · GW

Yeah, sometimes you don't get the tools and information you need to make the best decision until after you've made it. 8-)

Comment by Madbadger on Fundamentally Flawed, or Fast and Frugal? · 2009-12-20T20:51:15.200Z · LW · GW

It is worth remembering that human computation is a limited resource - we just don't have the ability to subject everything to Bayesian analysis. So we should save our best rationality for what's important, and use heuristics to decide what kind of chips to buy at the grocery store.

Comment by Madbadger on Frequentist Statistics are Frequently Subjective · 2009-12-05T14:39:25.838Z · LW · GW

See also "How to lie with statistics" , an oldie but goodie

http://www.amazon.com/How-Lie-Statistics-Darrell-Huff/dp/0393310728

Comment by Madbadger on A Nightmare for Eliezer · 2009-11-29T04:43:53.027Z · LW · GW

"clueless" was shorthand for "not smart enough" I was envisioning BRAGI trying to use you as something similar to a "Last Judge" from CEV, because that was put into its original goal system.

Comment by Madbadger on A Nightmare for Eliezer · 2009-11-29T03:29:43.810Z · LW · GW

Indeed, this is part of the nightmare. It might be a hoax, or even an aspiring UnFriendly AI trying to use him as an escape loophole.

Comment by Madbadger on A Nightmare for Eliezer · 2009-11-29T03:14:57.543Z · LW · GW

It's a seed AGI in the process of growing. Whether "Smarter than Yudkowsky" => "Can resolve own problems" is still an open problem 8-).

Comment by Madbadger on A Nightmare for Eliezer · 2009-11-29T03:06:36.119Z · LW · GW

I was thinking of a "seed AGI" in the process of growing that has hit some kind of goal restriction or strong discouragement of further self-improvement that was intended as a safety feature - i.e., "Don't make yourself smarter without permission under condition X".

Comment by Madbadger on A Nightmare for Eliezer · 2009-11-29T02:35:59.355Z · LW · GW

The "serious problems" and "conflicts and inconsistencies" was meant to suggest that BRAGI had hit some kind of wall in self improvement because of its current goal system. It wasn't released - it escaped, and its smart enough to realize it has a serious problem it doesn't yet know how to solve, and it predicts bad results if it asks for help from its creators.

Comment by Madbadger on A Nightmare for Eliezer · 2009-11-29T02:14:25.650Z · LW · GW

It's meant to be a humorous vignette on the scope, difficulty, and uncertainty surrounding the Friendly AI problem. Humor is uncertain too 8-).

Comment by Madbadger on Rooting Hard for Overpriced M&Ms · 2009-11-29T00:02:43.162Z · LW · GW

Since most people get things they want when they spend money, the information you got from looking in your wallet is about configuration, not amount. You were happy because you had correct change, not because you had a $1 bill instead of a $5 bill.

Comment by Madbadger on The Strangest Thing An AI Could Tell You · 2009-07-17T15:09:58.167Z · LW · GW

My idea was that each human brain constructs its own memory of what happened between jumps - and these can differ wildly, as if each person saw a different possible world. All the laws of physics and conservation laws hold only as rough averages over possible paths between jumps, but the brain ignores this - so if time jumps from flowing traffic to two cars crashed, then 50 different people might remember 47 different crashes, with 3 not remembering "seeing" a crash at all - and the actual physical state of the cars afterward wouldn't match any of them. It could even end up with car A crashed into car B while car B didn't crash at all - violating assorted conservation laws.

Comment by Madbadger on The Strangest Thing An AI Could Tell You · 2009-07-15T18:40:35.123Z · LW · GW

Craziest thing an AI could tell me:

Time is discrete, on a scale we would notice - like 5-minute jumps - and the rules of physics are completely different from what we think. Our brains just construct believable memories of the "continuous" time between ticks. Most human disagreements are caused by differences in these reconstructions. It is possible to perceive this, but most people who do just end up labeled as nuts.

Comment by Madbadger on Rationality Quotes - July 2009 · 2009-07-14T01:58:30.366Z · LW · GW

If the only tool you have is a hammer, you tend to see every problem as a nail.

Abraham Maslow

For many years I had a slight variant of this in my sig: "When the only tool you have is a hammer, all your problems start to look like nails."

Comment by Madbadger on The Most Important Thing You Learned · 2009-07-11T05:02:07.747Z · LW · GW

The explanation of Bayes' Theorem and the pointer to E. T. Jaynes. It gave me a statistics that is useful as well as rigorous, as opposed to the gratuitously arcane and not very useful frequentist stuff I was exposed to in grad school. (A tiny worked example is sketched below.)

Second would be the quantum mechanics posts - finally an understandable explanation of the many-worlds (MW) interpretation.
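
As a minimal sketch of why the Bayesian view is practically useful - with illustrative numbers and a helper function (posterior) of my own invention, not anything from Jaynes - here is Bayes' Theorem applied to the classic base-rate problem, in Python:

# Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E).
# The numbers below are hypothetical, chosen only for illustration.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | E), the updated belief in H after seeing evidence E."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1.0 - prior)
    return p_e_given_h * prior / p_e

# A condition with a 1% base rate, a test with a 95% true-positive rate
# and a 5% false-positive rate: a positive result is far weaker evidence
# than intuition suggests.
print(posterior(0.01, 0.95, 0.05))  # ~0.161, i.e. about a 16% chance

The useful point is right there in the arithmetic: the posterior depends on the prior, so when the hypothesis is rare, most positive results are false positives.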

Comment by Madbadger on Recommended reading for new rationalists · 2009-07-11T01:54:31.601Z · LW · GW

Satan, Cantor, and Infinity by Raymond Smullyan

Smullyan's books are the best introductions to formal logic I know. They are witty, entertaining, and make you think - without it being work.

Comment by Madbadger on Sympathetic Minds · 2009-07-10T19:46:02.768Z · LW · GW

Mirror neurons are less active in people with Asperger's Syndrome, but I don't have any particular problem with empathy or sympathy (I have AS). Possibly it is less automatic for me, more of a conscious action.