Comments

Comment by Rob Harrison on Impending AGI doesn’t make everything else unimportant · 2023-09-05T01:27:08.720Z · LW · GW

I love the book of Ecclesiastes, an ancient poetic text that wrestles with the problem of meaning, especially chapter 3:9-13.

Comment by Rob Harrison on The purpose of the (Mosaic) law · 2023-09-05T00:48:30.060Z · LW · GW

Pretty interesting take; I especially liked your mind-shape idea.

Another way to view faith is as an attempt to find "God's will," which is like a vector pointing toward an undefined point (God) that represents an idealized version of the best possible outcome of things.  Ideally, a person of faith is always adjusting the vector of their life's direction (based on how things turn out), and never intentionally taking a vector known to point away from God (sin).

Some people think religion is useless, but this approach is actually similar to the process of achieving any complex goal.  The difference is that goals can be wrong to begin with, unless the goal is God, which is the correct goal by definition.  The Bible is a set of stories, rules, and lessons that provide a shortcut to a well-aimed vector (rather than pure trial and error, which is a costly process).

The "idealized version of the best possible outcome of things" is an intimidatingly ambiguous concept, but why would we want to aim for anything less?  Religion is a method to achieve this goal that has been refined and field-tested over thousands of years.

Comment by Rob Harrison on video games > IQ tests · 2023-08-16T14:32:22.208Z · LW · GW

A meta-question behind "how do we most accurately measure intelligence?" is "how is accurately measuring intelligence actually useful?"

Just from my experience, it seems that an accurate relative intelligence value of some sort, for example an IQ score, has surprisingly few useful applications.  I think this is mostly because making claims about superior intelligence, accurate or not, is considered socially repulsive (as you acknowledged).  Without accounting for social factors, I would expect an intelligence rating to be a very useful thing to include on a work resume, for example.

Personally, I am most comfortable acting like a guy of slightly above average intelligence, although objectively I think I am the most, or one of the most, intelligent people I know.  Most people would not think of me if asked to name a really smart person they know, but anyone who knows me well will notice that I have a mysterious tendency to accomplish very hard, complex tasks.  I guess it seems to me that trying to project my intelligence more would close off more opportunities than it would open, just because of social factors.

Acting less smart than I am can sometimes be inconvenient or annoying, especially if I'm arguing with someone who projects a "smart guy" vibe and is clearly wrong, but other people are impressed with their verbiage and confidence in their claims.  Usually I don't care that much about winning arguments, except occasionally when the outcome really matters, in which case I can get very frustrated.

This may be obvious, but I'm not posting this out of vanity about being a high-intelligence individual.  Rather, it is a real issue that I have to deal with, and I'm not completely sure I always get it right.  Sometimes it seems it would be better and more genuine not to have a projected self separate from my real self (whatever that is).  It would be nice to talk with people about my actual interests and say things the way I actually think about them.  But mostly I think that intelligence should be a tool rather than a goal in itself.  Also, the problems that come with high intelligence are far smaller than the benefits, so I should stop complaining and use the intelligence I was lucky to get to accomplish something good.

Comment by Rob Harrison on What Are You Tracking In Your Head? · 2023-06-22T01:16:02.673Z · LW · GW

While in a conversation: tracking how the other person is trying to interpret the motives behind what I'm saying, and trying to control that by what I say.  This can get multiple levels deep, fast.  I recently had a really important conversation and I ended up saying things like "I mean exactly what I'm saying" and "I'm not anxious, I just can't afford to let you misunderstand me."  Unfortunately, this made it seem like I was definitely anxious and meant something other than what I was saying.

Comment by Rob Harrison on No, Really, I've Deceived Myself · 2023-06-16T13:05:57.047Z · LW · GW

I think some of the commentary about religion on LessWrong could use more genuine, humble curiosity.  Not curiosity of the kind "how can someone so intelligent be so mistaken?", but rather "what are the effects of religious faith and practice on individuals and societies that go beyond simple self-deception mechanisms?", or "is the persistent belief in god(s) throughout human history only explainable by ignorance, or does it tell us something important about ourselves?".  I could start to hypothesize about some of these questions if I had time, but I don't right now.

In the otherwise astute LessWrong community, this has seemed to me like a persistent blind spot, one that is less about answers and more about the framing of questions.

Comment by Rob Harrison on Work dumber not smarter · 2023-06-01T19:05:09.479Z · LW · GW

Haha, funny post because it's so relatable.

I think of operating productively without drama as playing the long game.  It's a lot easier to control critical factors without other people stepping in because they think something's wrong.  Eventually, when people see the results, the attention you missed out on during the process pays off.

I think strategizing over how to present the end result so that it's valued is an important key to reaping the benefits of productivity.

Comment by Rob Harrison on Consider The Hand Axe · 2023-04-11T01:11:59.584Z · LW · GW

Thank you for the post.  A creative and substantive contribution from your world to mine.

We humans each have a different, incredibly complex experience of the world.  However, we can share a slice of this experience that maps onto another person's experience through music, art, or language.  This seems innately beautiful.

I know this doesn't help much if you are losing your job to emerging AI capabilities.

But I guess my point is that human creativity will always be more inherently valuable, because it is generated from the messy, tragic, delightful, monotonous, thrilling experience of being a human, rather than from a simple optimization function meant to mimic humans.

So basically, don't give up.