Comments

Comment by Yossarian on Open Thread for February 3 - 10 · 2014-02-09T22:59:52.476Z · LW · GW

Does anyone know of any narrative fiction based on the AI Box Experiment? Short stories or anything else?

Comment by Yossarian on Good luck, Mr. Rationalist · 2013-04-30T04:50:05.075Z · LW · GW

"Choose well" is a nice salutation for instrumental rationality. But what about "Know much and choose well" to cover both epistemic and instrumental rationality?

Comment by Yossarian on Privileging the Question · 2013-04-27T23:32:02.534Z · LW · GW

"When did you stop beating your wife?"

This is basically the framing effect, no?

Comment by Yossarian on Rationality Quotes April 2013 · 2013-04-08T19:19:07.416Z · LW · GW

The quote struck me as a poetic way of affirming the general importance of metacognition - a reminder that we are at the center of everything we do, and that investing in self-improvement is therefore an investment with a multiplier effect. I admit, though, that I may be reading in a meaning that doesn't exist in the quote's context.

I've always seen that whole speech as a pretty good example of reasoning from the wrong premises: Henry V argues that God will decide the outcome of the battle, so if given the opportunity to have more Englishmen fighting alongside them, he would choose to fight without them - if they win, he gets more glory for winning a harder fight, and if they lose, fewer will have died. Of course, he doesn't take this to its logical conclusion and go out to fight alone, but I guess Shakespeare couldn't have pushed history quite that far.

Rewatching Branagh's version recently, I keyed in on a different aspect. In his speech, Henry describes in detail all the glory and status the survivors of the battle will enjoy for the rest of their lives, while (of course) totally downplaying the fact that few of them can expect to collect on that reward. He's making a cost/benefit calculation for them and leaning heavily on the scale in the process.

Contrast with similar inspiring military speeches:

William Wallace says, "Fight and you may die. Run and you may live...for a while. And dying in your beds, many years from now, would you be willin' to trade ALL the days, from this day to that, for one chance, just one chance, to come back here and tell our enemies that they may take our lives, but they'll never take our freedom!" He's saying essentially the same thing as Henry, but framing it as a loss instead of a gain. Where Henry tells his soldiers what they'll gain from fighting, Wallace tells them what they'll lose if they don't. Perhaps it's telling that, unlike Henry, he doesn't get very specific. It might've been an opportunity for someone in the ranks to run a thought experiment: "What specific aspects of my life will be measurably different if we have 'freedom' versus if we don't? What exactly AM I trading ALL the days for? And if I magically had that thing without the cost of potentially dying, what would my preferences be then?" Or simply to notice their confusion and recognize that they were being loss-averse without being able to define exactly what they were averse to losing.

Meanwhile, Maximus tells his troops, "What you do in life echoes in eternity." He's more honest and direct about the probability that you're going to die, but he also reminds you that the cost/benefit analysis extends beyond your own life - the implication being that your 'honor' (reputation) affects your placement in the afterlife and (probably of more consequence) the well-being of your family after your death. Life is an iterated game, and sometimes you have to defect (or cooperate?) so that your children get to play at all.

And lastly, Patton says, "No bastard ever won a war by dying for his country. He won it by making the other poor, dumb bastard die for his." He explicitly rejects the entire 'die for your country' framing and foists it wholly onto the enemy. It's his version of "The enemy's gate is down." He's not telling you you're not going to die, but at least he's not trying to convince you that your death is somehow a good or necessary thing.

When taken in this company, Henry actually comes across more like a villain. Of all of them, he's appealing to their desire to achieve rational interests in an irrational way without being at all upfront about their odds of actually getting what he's promising them.

Comment by Yossarian on Solved Problems Repository · 2013-04-07T20:26:23.768Z · LW · GW

Additionally, fitness roughly breaks into two broad categories - resistance and cardiovascular. Starting Strength covers resistance training, but the cardiovascular version of Starting Strength is Couch To 5K. It uses the same basic concept of progressive overload applied to running.

Comment by Yossarian on Rationality Quotes April 2013 · 2013-04-06T17:10:43.517Z · LW · GW

All things be ready if our minds be so.

  • William Shakespeare, Henry V

Comment by Yossarian on Rationalist Lent · 2013-04-05T20:22:20.190Z · LW · GW

Having now concluded Rationalist Lent, I have determined that it is worth my time and I do genuinely prefer to keep watching the Daily Show.

At Lent's conclusion, I started rewatching and ended up watching all the episodes that I missed (the ones still available, anyway) with a renewed appreciation. Coincidentally, I also just finished a comprehensive cleanup of all my hard drives, stretching back over ten years, and at the bottom of one of the oldest (pulled from my closet), I found an episode from 1999. I have no earthly idea why I downloaded/saved it in the first place, but I watched it and, lo and behold, it wasn't that funny. The real culprit here, I think, was Nostalgia Bias.

One additional note: During RL, news broke that Stewart would be taking a hiatus from hosting and would be replaced by John Oliver starting this summer. That sort of wrecked my experiment, since I knew right away that my preference would be to continue watching in that case. Though you could still argue that 22 minutes, four days a week, over three months would be a significant savings. And even disregarding that entirely, it was still a nice exercise in willpower: a demonstration to myself that I am in control of the choices I make and that I can counteract the habits and urges of my System 1.

Comment by Yossarian on Just One Sentence · 2013-04-03T17:58:55.284Z · LW · GW

"If you test theories by how precisely they predict experimental results, you will have many more opportunities to have sex and look cool."

Comment by Yossarian on Why Real Men Wear Pink · 2013-03-29T04:24:17.456Z · LW · GW

This was the case for me at my school, where uniforms were required. The obvious and conspicuous item we could control was our tie, but thinking back on it now, kids signalled identity and status through shoes, belts, and other accessories (though I was effectively blind to such things at the time).

Seniors were also allowed to wear khaki pants - a conscious allowance on the administrators' part, designed to reinforce the distinction between the classes.

Comment by Yossarian on Open thread, March 17-31, 2013 · 2013-03-27T23:32:51.769Z · LW · GW

Today's SMBC

Has this idea been considered before - that an AI capable of self-improvement would choose not to self-improve because it wouldn't be rational to do so? And does that call into question the rationality of pursuing AI in the first place?

Comment by Yossarian on Want to be on TV? · 2013-03-27T21:50:14.707Z · LW · GW

Because I know enough people in the entertainment industry that I'm not committing the Fundamental Attribution Error? I'm not sure what your question is.

Comment by Yossarian on Want to be on TV? · 2013-03-27T07:53:59.743Z · LW · GW

My suspicion is that somebody is thinking of this (and possibly pitched it) as the reality version of "The Big Bang Theory." If that's the case, consider that the BBT's showrunner, Bill Prady, is himself a genuine nerd. Then imagine how bad BBT is and how bad it would be if its showrunner wasn't a nerd. Then turn that into a reality show.

Comment by Yossarian on Want to be on TV? · 2013-03-27T03:19:49.021Z · LW · GW

It still pales in comparison to the power of invented meaning through editing.

It's the Kuleshov Effect turned up to 11.

Comment by Yossarian on Want to be on TV? · 2013-03-27T02:57:19.002Z · LW · GW

It wouldn't happen that way. The person participating in the story has no power compared to the person orchestrating the story.

I think most people here would be surprised to know the tremendous extent to which narratives are manipulated in editing in reality TV. Watch ten minutes of any of the ghost hunter/paranormal type shows. Those will show how much can be constructed from the barest of actual events.

Comment by Yossarian on Want to be on TV? · 2013-03-27T02:29:14.205Z · LW · GW

And they'll engineer that into existence one way or another. There is great, nuanced storytelling to be found on television, but the reality genre is not the place to find it.

Comment by Yossarian on Want to be on TV? · 2013-03-27T02:24:55.581Z · LW · GW

There are huge benefits to getting the right kind of TV exposure, but this is probably not it.

Comment by Yossarian on Want to be on TV? · 2013-03-27T02:24:05.745Z · LW · GW

I would advise against participating. It's not impossible that this could be a worthwhile project resulting in overall beneficial PR for the community, but I estimate the odds as HEAVILY against it.

All storytelling is based around drama and conflict, and this show will be no different. The only question is how nuanced and truthful that conflict is, and as I'm sure I don't have to tell anyone here, the reality TV genre is not known for its nuance or truthfulness.

I believe Mr. Inman is sincere in his desire and ambition, but without any other information, I heavily doubt his artistic vision would carry through to a portrayal that most people here would be happy with or deem beneficial. TV/film production is a team effort, and Mr. Inman will likely be one voice in a chorus. He may fight battles with the network executives about the direction of the show, but in this case, I doubt he would win.

I think it's also fairly obvious that one should be tremendously wary of participating in any narrative about oneself where one isn't in control of that narrative, as would be the case here.

For context, I live in Los Angeles, work in film production, and have worked on reality TV shows in the past (and know many people who work on them in various capacities).

Comment by Yossarian on Boring Advice Repository · 2013-03-21T00:30:57.575Z · LW · GW

Give people permission to bug you.

If you commit to doing or following up on something for somebody, tell them to bug you if you don't get back to them about it. You'll feel less stressed about remembering or being obligated to do it because you've shifted at least some of the responsibility to them and given yourself external pressure, which is ultimately more efficient than relying on your own willpower anyway.

Conversely, give yourself permission to bug people, though without judgment. You know how you feel when you have email in your inbox that you know you really ought to get to, but don't? Somebody is feeling that way about your email right now. How helpful would it be if they electronically tapped you on the shoulder as a reminder? Much more helpful than them getting more and more resentful, assuming you've forgotten, don't care, or don't consider them valuable enough to bother replying.

Comment by Yossarian on Co-Working Collaboration to Combat Akrasia · 2013-03-14T00:35:51.929Z · LW · GW

Yeah, I explicitly unchecked the boxes that said they would do that and it still showed up in my Twitter feed (which automatically forwarded to my Facebook feed).

Comment by Yossarian on Co-Working Collaboration to Combat Akrasia · 2013-03-11T17:41:39.886Z · LW · GW

Not "Common Room"? Ravenclaw or otherwise?

Too obvious? :)

Comment by Yossarian on Co-Working Collaboration to Combat Akrasia · 2013-03-11T17:40:14.481Z · LW · GW

I was in for a bit last night and enjoyed it. On the one hand, I think it did help me keep working where I otherwise would've quit or wasted more time on Internet distractions. That said, the chat, while interesting, was distracting from the primary purpose of the chat room.

There should definitely be two separate rooms - one for general chat and one for paired working. But the shared Pomodoro timing is also a good idea and should be tried, in my opinion.

Also, we should find a different chat client than Tinychat. Its log-in process and text limitations are very annoying.

Comment by Yossarian on Boring Advice Repository · 2013-03-07T17:35:36.783Z · LW · GW

In addition to making lists for "work," make one for things you want to watch, read, and/or play. You'll feel more productive and motivated even when taking a break from work.

Comment by Yossarian on Rationalist Lent · 2013-02-14T19:49:00.146Z · LW · GW

I have a candidate, and it might be an odd one. I think I'll give up watching the Daily Show for 40 days. I've been watching it for almost its entire existence (from before Jon Stewart was the host) and take a certain hipster pride in the fact that I watched the show before it became the widely known, popular thing it is now. But for a while now, I haven't derived that much enjoyment from actually watching it. Some interviews, an occasional chuckle here and there, but mostly I find myself annoyed at how lazy the writing has become and at Stewart's increasing tendency to stretch out bits well past their actual punchline.

But it's been such an ingrained habit for so long and it feels like compromising part of my identity, albeit a small, insignificant part of it. So, for the next 40 days, I won't watch the Daily Show.

Not the Colbert Report though, that show is genius.

Comment by Yossarian on Thoughts on the January CFAR workshop · 2013-02-02T19:20:19.846Z · LW · GW

Yes, it was an opportunity-cost problem - at what point does the value of being cogent in the morning outweigh the cost of missing great late-night conversations?

I can't think of any optimal solution that doesn't involve loads of caffeine or bilocation, time turner induced or otherwise.

Comment by Yossarian on Open Thread, June 16-30, 2012 · 2012-06-23T04:15:48.400Z · LW · GW

After a week long vacation at Disney World with the family, it occurs to me there's a lot of money to be made in teaching utility maximization to families...mostly from referrals by divorce lawyers and family therapists.

Comment by Yossarian on Ritual Report: NYC Less Wrong Solstice Celebration · 2011-12-27T03:47:46.016Z · LW · GW

Hm, perhaps you're right. It would depend largely on the composition of the ritual(s). Certainly, extraordinary care must be taken when intentionally playing with any kind of death spiral. A generous dose of tongue-in-cheek self-deprecation would probably be a good idea.

Comment by Yossarian on Ritual Report: NYC Less Wrong Solstice Celebration · 2011-12-26T06:52:04.702Z · LW · GW

"What we're reasonably sure is settled truth" does not necessarily equal truth. Nor does it necessarily equal "what we will want to believe once we know more".

Absolutely, which is what makes building in the ability to self-modify so intrinsically important. The function of any ritual-like activity shouldn't be anywhere near the vicinity of the "research arm" of the rationality community. Nothing should be acquired within rituals, nor determined through them. They should be about reinforcing the settled science, to minimize the amount of falseness that enters the canon (a term I'm using tongue-in-cheek, to be clear). And for whatever falseness does enter, something built around the Litany of Tarski still allows for self-modification.

And yes, any and all rationalists should be far enough along that they've developed a certain immunity to the process. That in and of itself makes no difference. Doing these types of things has measurable effects on the brain, just as prayer/meditation does. The details are arbitrary; it doesn't matter if you're sacrificing a virgin, eating a wafer, or lighting a candle. What matters is doing the same thing as your fellow tribe members to build/maintain a sense of community. The proposition here is simply to replace the incorrect proclamations of how the universe works with correct ones. Instead of proclaiming Jesus Lord and Savior, you're proclaiming that the map is not the territory and that your desire to know what is true is actually true (so if it turns out that the map IS the territory, then out it goes from the hymn book).

And the rationalist has the added (and important) benefit that no matter how much they give themselves over to the emotions of whatever ceremony, once they walk back out to the parking lot, their level headedness will return. The rationalist can walk out and think, "That sure was fun, but I understand what was happening and can safely put that suspension of rationality back on the shelf." In a way the Catholic can't (consciously) do when walking out of Mass.

So I disagree, I think these kinds of things, with effective substitutions of content, won't make us weaker to this form of manipulation, but rather stronger. Ultimately, when we cross the Singularity, we probably won't need these kinds of mind hacks anymore, but in the interim, I think they'll end up being quite important.

Comment by Yossarian on Ritual Report: NYC Less Wrong Solstice Celebration · 2011-12-25T05:19:21.959Z · LW · GW

It's valid to be worried about the introduction of rituals producing death spirals. That is their express purpose after all, to produce and reinforce whatever death spirals the community has defined as essential.

Ritualism is a mind hack invented by early humanity to reinforce the group worldview and build/maintain group cohesion. And in the intervening thousands of years, either we or ritualism itself has evolved into something deeply ingrained in our cognitive makeup. At this point, it's how our brains are wired, and I don't think it's feasible to simply ignore it. Instead, we have to do exactly what Raemon is attempting: co-opt its techniques and replace the ones that propagate untruth and suboptimal behavior with ones that propagate truth and optimal behavior.

But rituals are a fundamentally irrational business; there's no way around it. The solution, I think, lies in thinking of rituals as a mnemonic device - understanding that they're not really a way of arriving at new truth, but of reinforcing what we're reasonably sure is settled truth. Mandating constant and arbitrary change is the wrong track, since a huge part of ritual is simple reinforcement. To limit that is to cut the whole thing off at the knees.

Instead, I suggest only including the very settled science of rationality and being very conservative about what gets defined as such. For the inaugural core tenet, I would suggest the Litany of Tarski and the idea that if something is wrong it gets discarded, no matter what, with an appropriately weighty ritual to accompany it. So even if you had something that was part of the canon for ten years that must then be discarded, you can still fall back on this ability to acknowledge mistakes and self-modify. Everyone performs a ritual expunging the obsolete piece from the canon, and it's forever removed. Thus, we're still taking advantage of the ritualism mind hack while building in appropriate safeguards to keep a death spiral from going on forever and to allow for future self-modification.

Comment by Yossarian on Meetup : West LA Meetup 10-19-2011 · 2011-10-26T01:32:52.182Z · LW · GW

Sorry I missed last week, I'll be there next!

Comment by Yossarian on Welcome to Less Wrong! (2010-2011) · 2011-09-29T18:46:03.466Z · LW · GW

Thank you, fixed.

Comment by Yossarian on Concepts Don't Work That Way · 2011-09-29T05:30:34.470Z · LW · GW

"It's the map and not the territory," right?

I may be way off base here, but isn't the root of this disagreement that lukeprog is saying our mental map called "conceptual analysis" doesn't perfectly reflect the territory of the real world and should therefore not be the official model, while Morendil is saying, "but it's good enough in most cases to get through most practical situations" - which lukeprog agrees with?

Is that right?

Comment by Yossarian on A Rationalist's Tale · 2011-09-29T04:07:58.844Z · LW · GW

At the time, I made a distinction between ethics and morality that I would now say is probably more semantic than definitional. But, IIRC, they defined morality as a code of behavior with a religious basis. So I used the term ethics to say that I followed a code of behavior that didn't follow from religious belief.

Essentially, I made the point that just because I didn't believe I would go to hell for killing somebody didn't mean that I had any desire to. Or that the prospect of prison and general rejection from society didn't serve as an adequate deterrent. I don't remember specifically, but I might have made the point that the Golden Rule doesn't have to be tied to a religious belief and is a pretty self evident truth on its own.

As for their response, I mostly remember them moving onto a different topic (or at least, ceasing to focus on me for that moment). I always thought about my answers and tried to give an honest answer, but I actively avoided giving them the answers they were expecting or wanted, since they were usually leading questions designed to get me to agree with them in some basic way.

Comment by Yossarian on A Rationalist's Tale · 2011-09-29T03:22:45.149Z · LW · GW

As an atheist who attended a Catholic high school, one of the questions often leveled at me was what exactly prevented me from going on murderous rampages without a religious morality to keep me in check. I got this question from both students and faculty (usually as part of the class discussion in religion class). So in my experience, at least, it is difficult for religious people to understand the morality of a non-religious person. I would speculate that this is because they, on some level, didn't believe in God (or at least the Catholic God) and were instead believing in belief, feeling that the morality that came with the dogma was necessary and beneficial to leading a proper life.

Comment by Yossarian on Welcome to Less Wrong! (2010-2011) · 2011-09-29T03:10:02.909Z · LW · GW

Hello, I found Less Wrong after a friend recommended Methods of Rationality, which I devoured in short order. That was almost a year ago and I've been lurking LW off and on ever since. In June I attended a meetup and had some of the best conversation I've had in a long time. Since then, I've been attacking the sequences more systematically and making solid progress.

I'm in my late 20's, live in Los Angeles, and work in the entertainment industry (after failing miserably as an engineering student). It's my ambition to produce stories and science fiction that raise the sanity waterline of our society. Film and television science fiction has never come close to approaching the depth and breadth of imagination and thoughtfulness of literary science fiction and I'd like to be a part of the effort to close that gap, however slightly.

I have a hypothesis that the sociological function of stories is to communicate lessons about desirable or undesirable human behavior, translating them from intellectual ideas that we can't grasp on an intuitive level into emotional ideas that we can - in the process making it more likely that we'll remember the lessons and apply them to our own behavior. Almost like a mnemonic device.

For example, I could give a three hour lecture on the importance of reputation and credibility in group dynamics. Or I could tell the story of the boy who cried wolf in under three minutes and communicate the same idea in a way that is intuitively graspable on an emotional level and is therefore much more likely to be retained.

Anyway, my grasp on this idea is far from complete and I hope this community can help me get a better handle on it, ultimately resulting in propagating ideas that contribute to the optimization of humanity.