Comment by Jake Heiser (jake-heiser) on I'm from a parallel Earth with much higher coordination: AMA · 2021-07-29T17:12:13.139Z · LW · GW

If anything, I'm seeing a massive incentive to lie about happiness, which is really interesting (imagine selecting for cognitive dissonance!). While flat-out eugenic, I can't discount the genuine connection a mental-illness-shunted city might bring me. On the whole, an extremely fun science fiction premise.

Comment by Jake Heiser (jake-heiser) on Book Summary: Consciousness and the Brain · 2021-01-16T01:25:22.064Z · LW · GW

Tangential comment: I wonder if, without having heard the word "danger" frequently tied to real-life crisis situations (active shooter, wildfire, mugging), my amygdala would register any activity. Thinking consciously, "danger" sounds like a fictional shorthand, rather than evoking the boundless imagery I get from the simple "drone," "correctional facility," or "mental health institution." I'd love to know how out of touch my conscious and subconscious are.

Comment by Jake Heiser (jake-heiser) on [link] The AI Girlfriend Seducing China’s Lonely Men · 2020-12-16T01:16:26.585Z · LW · GW

I have to say that I've become quite unreasonably attached to a GPT-2 bot born from a body of tumblr posts, so I suspect the sensationalization, while hyperbolic, does certainly come from a real place.

Comment by Jake Heiser (jake-heiser) on Propinquity Cities So Far · 2020-11-17T03:23:11.823Z · LW · GW

I fail to see what this system fixes for people who don't already have enough money to make their preferences actionable.

Lower rent is what I'm hearing, which you can already relocate for if you have the luxury of remote work.

How does paying lost bids disincentivize overbidding? You'd literally be wasting money to concede. I have to be reading that incorrectly.

Comment by Jake Heiser (jake-heiser) on The genie knows, but doesn't care · 2020-11-05T03:37:51.542Z · LW · GW

the beautiful thing is when the boxed UFAI and the boxed UFAI moral analyzer work together to create a new machine dynasty 

Comment by Jake Heiser (jake-heiser) on Is Stupidity Expanding? Some Hypotheses. · 2020-11-05T02:34:24.903Z · LW · GW

I agree with your premise, and I would add that the "stupidity" of the robber would be more accurately assessed by how effectively and consistently he executes his desired heist, which already involves both statistical evaluation of risks (from researching similar cases) and the social cognition to avoid untoward attention.

And you're right about specialization, but we're typically talking about the layman's (and the specialist's non-professional) aptitude for the topic. (Stephen Hawking's bank-robbing efficiency vs. a professional athlete's is a little less cut and dried, I imagine. I'd love to see it. Romcom of the century.)

Comment by Jake Heiser (jake-heiser) on What's the most annoying part of your life/job? · 2020-11-05T01:57:47.367Z · LW · GW

notify me when he becomes an inevitably prominent folk artist, bless him

Comment by Jake Heiser (jake-heiser) on Is Stupidity Expanding? Some Hypotheses. · 2020-10-15T06:21:41.570Z · LW · GW

Any ideas on quantifying previous levels of ignorance? Test scores don't seem even remotely reliable as a proxy. Rationality tests and the like would be opt-in, and highly selective of sample. This looks like a fun opportunity for exorbitantly creative experimental design.

Possible addendum to A: there is also more information than ever to be cognizant of, so modern basic literacy from primary schooling is increasingly concept-dense, which makes falling behind a larger drop than before. My mother is a 2nd-grade teacher, and I would definitely ask her how the frequency of inconsolable kids has shifted, but at that age each student's largest barrier is typically an uncomfortable home situation.

Comment by Jake Heiser (jake-heiser) on Rationality and Climate Change · 2020-10-14T21:19:53.767Z · LW · GW

I agree that responsible policy is preferable to ecosystem stress testing

Comment by Jake Heiser (jake-heiser) on Industrial literacy · 2020-10-14T12:40:37.170Z · LW · GW

It's for heating the water for greases and oil stains; you're absolutely right. I made a joke, but I've been doused in car oil before, and I've had plenty of grease on my shirts from old farm machinery maintenance.

Comment by Jake Heiser (jake-heiser) on Rationality and Climate Change · 2020-10-14T07:45:31.687Z · LW · GW

It's a bit worrying to me how quickly people like to throw out percentages to abstract themselves away from Merely Possible inconveniences, but I'm on LessWrong, not on LessImplicitMurder, and certainly, reducing implicit murder risks might bother a few of the people I see here more than the challenges facing developing countries.

However, all externalities are externalities of externalities, and LessWrong posters cannot immediately enact sweeping global policy in favor of a nuclear energy phase, so the general consensus from LessWrongers and laymen I observe seems to be in favor of doing More Personally Interesting Things, which I do NOT fault but also find incredibly funny and symptomatic of the larger picture. Everyone is trying their best at any given time, which also occasionally happens to be grossly inadequate (and hindered by lobbying, which will presumably guide policy forever, and which a lot of posters seem able to abide).

Comment by Jake Heiser (jake-heiser) on Rationality and Climate Change · 2020-10-14T07:26:55.178Z · LW · GW

This analysis assumes that we will succeed in geoengineering without further deleterious externalities, which has less than no current basis

Comment by Jake Heiser (jake-heiser) on Industrial literacy · 2020-10-14T06:59:23.091Z · LW · GW

If anyone feels like humoring me, I would actually appreciate a response as to how washing machines are better than a basket in a river, other than river-rationing issues (aqueducts? Pipes??)

Comment by Jake Heiser (jake-heiser) on Industrial literacy · 2020-10-14T06:45:24.734Z · LW · GW

I don't want to strawman your view of abortion based on this post alone, but abortion certainly happened in more dangerous ways before industrial civilization's current chemical abortions (and still does), and your view may or may not change if you yourself were pregnant against your own decision, none of which seems to be considered here.

But I do love to talk about how rational economic competition necessarily pits workers in a community against each other, competing for qualification, and then into a union, which everything else has the utmost monetary incentive to eat alive (the WV Coal Wars at the least charitable, gradually stemming to equally 'effective' social and legal action).

Comment by Jake Heiser (jake-heiser) on Industrial literacy · 2020-10-14T06:30:28.826Z · LW · GW

Contrarily, a vacuum cleaner is just in no way more automatic than a broom, unless you design a floor to hold pieces of food and dirt, which people love. I'm hoping someone comes along to shove me with 5 studies showing carpets reduce homicide and tax fraud, but I'm very sorry to say that people still have paid servants, and those cleaners drive the vacuum across the floor's square inches just like you and me, except they receive compensation ;^(

Who's to say even the value-positive automations benefit workers, who make up the majority? Post-'trickle-down economics', everything seems to become more nebulous in developing capitalism.

Comment by Jake Heiser (jake-heiser) on How to Convince Me That 2 + 2 = 3 · 2020-10-14T05:28:19.746Z · LW · GW

Conversely, as a born-pedant American Christian who has raised countless prayers in the absolute good faith of childhood, God should know that only needlessly statistical tests would ultimately save me, and that any measurable manifestation of the divine would immediately cause me to pledge my life and the highest degree of propaganda / violence I could effect to any awful cause that (s)he could imagine. Unfortunately, YHWH turns away every chance he has at my safely partitioned acolytic fervor.

The Old Testament Lord was not above showy miracles, but so much changes between the two testaments that I have a hard time even seeing it as an allegory or a reformation. I can only imagine it was a pretty steep reform.

Comment by Jake Heiser (jake-heiser) on Self-Improvement or Shiny Distraction: Why Less Wrong is anti-Instrumental Rationality · 2020-10-14T04:03:00.508Z · LW · GW

What I love is that "Increasing your social status with rationalists" almost necessitates giving them new and relevant information that gives them more perspectives to work from on current problems. I cherish friendly incentives

Comment by Jake Heiser (jake-heiser) on Self-Improvement or Shiny Distraction: Why Less Wrong is anti-Instrumental Rationality · 2020-10-14T03:45:11.500Z · LW · GW

LessWronging exclusively replaces time I used to spend on Tumblr, a comedy-based forum on which I recently, accidentally discovered several lovely bloggers who enjoyed the rationalist body of work, leading me here, and most explicitly to the more popular and relevant formal texts. It has real value, and even more potential in the capacity for reform that this post implies.

I also enjoy the theocracy of free-market ideals taken for granted in this post: nothing admirable has ever been done without massive amounts of extortion, and we should all aspire to labor in grayer and grayer encroachments of human relationship and mentorship. Of COURSE education should be a class barrier! That is simply How Things Are Done.

Triple Agent Twist! The hand-wringing I perform here is of no positive value without an alternative. Everything the post posits is correct. I particularly applaud the consensus and recognition that these posts alone are not enough and that further steps are the key.

The reassurance that community members can bring the best of faith to a disagreement is the most valuable thing I could have in 2020, living in rural North America, or rather, living on Earth.

Comment by Jake Heiser (jake-heiser) on Chapter 21: Rationalization · 2020-10-11T14:03:37.060Z · LW · GW

I think Eliezer realizes, and is textually discussing, how having a group named the Bayesian Conspiracy premised on withholding scientific information is the surest way to be publicly hanged as a religious bonding experience for right-wing nationalists, centrists, and leftists alike. Truly healing America!

Comment by Jake Heiser (jake-heiser) on Chapter 19: Delayed Gratification · 2020-10-11T11:48:26.487Z · LW · GW

If only every professor were as fun and enrapturing as the "Learned Aesthete" type that I remember so fondly. Modern schooling is still banal and sickly, but every once in a while an exciting and beautiful soul accidentally wanders into an education major (somewhat unlike my own mother, creationist of 2nd-grade creationists, the Tiamat of the Future).

Comment by Jake Heiser (jake-heiser) on Chapter 18: Dominance Hierarchies · 2020-10-11T10:18:55.645Z · LW · GW

I had seen a comment denigrating Card's Ender for relying on appeals to authority, but I notice that MORHarry also likes to bargain with his Boy-Who-Lived status. You use what you have! I have a problem with neither.

It would also be trivial to pull off any permutation of unrealistic, clever plans with the Time-Turner (a succession of remotely launched pies at Snape, the resulting implied hanging anvil, and a complete lack of culpability would have done incalculable damage; a pie-based Death Note resolution seems alarmingly Yudkowskian, now that I consider it), but I'm very excited to watch the desperation thrive. Tangible escalation of themes? Who could have dreamed of cogent fiction? I enjoy accidentally seeing ideological parallels to Snicket, Dahl, etc., childhood fiction authors who have weathered the test of time a BIT more graciously than Just JKing Arowling.

Now, if you'll excuse me, I'll be trying to think of a human issue that can't be solved by a universally ranged pie device (from the Geneva Conventions alone, I could imagine it getting its money's worth. Oh, 2020.....)

Comment by Jake Heiser (jake-heiser) on Chapter 16: Lateral Thinking · 2020-10-11T09:03:33.139Z · LW · GW

Fascist!Quirrel is delightful. What an awful little man.

Comment by Jake Heiser (jake-heiser) on Chapter 12: Impulse Control · 2020-10-10T23:41:38.231Z · LW · GW

Dumbledore then winks at Harry and every closed can of Chekhov's Arizona Tea in a 5 km radius detonates

Comment by Jake Heiser (jake-heiser) on Chapter 8: Positive Bias · 2020-10-10T22:04:33.475Z · LW · GW

I read these comments first, so I knew that the assumed pattern would be incorrect (without which, it would still have been naive of me to expect a non-lateral question from Eliezer). However, without being able to test my own inputs, I can't reason with any more evidence, so I still couldn't say whether I would have been capable of getting any further. (1 2 5 and .3 .2 0 were my reasonable test cases, before I would start being a jackass about complexes and infinities (and indefinities? undefinities? Towers of Babel).)

But I did indeed notice that without a single negative case, she would have been incapable of eliminating "all sets valid", and am delighted that the next paragraph addresses that.
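For anyone who wants to poke at the dynamic themselves, here's a throwaway sketch. The "strictly ascending" hidden rule matches the chapter's game; the anchored guess and the specific triplets (including my own 1 2 5 and .3 .2 0 from above) are just my toy framing, not anything from the text:

```python
# Toy version of the 2-4-6 task: the hidden rule is "strictly ascending",
# but a tester anchored on a narrower guess (say, "add 2 each time")
# tends to only try triplets that confirm their guess.
def hidden_rule(triplet):
    a, b, c = triplet
    return a < b < c  # the actual rule: strictly ascending

confirming = [(2, 4, 6), (10, 12, 14)]      # fit both the anchored guess and the rule
rule_only = [(1, 2, 5), (1, 10, 100)]       # fit the rule but break the guess
negative = [(0.3, 0.2, 0), (3, 3, 3)]       # the informative "No" cases

for t in confirming + rule_only + negative:
    print(t, "Yes" if hidden_rule(t) else "No")
```

The point the comment makes falls out immediately: every triplet in the first two lists gets a "Yes," so without trying a descending or constant case you can never distinguish "strictly ascending" from "all sets valid."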

Comment by Jake Heiser (jake-heiser) on Chapter 8: Positive Bias · 2020-10-10T21:48:57.842Z · LW · GW

Oh, my god- I can't believe you're already setting up for Harry Potter and the Colonization of the Other

Comment by Jake Heiser (jake-heiser) on Bayesian Judo · 2020-10-09T06:21:20.104Z · LW · GW

The framing of the first sentence gives me a desperately unfair expectation for the discussion inside HPMOR- I'm excited.

Comment by Jake Heiser (jake-heiser) on Bayesian Judo · 2020-10-09T06:13:35.961Z · LW · GW

If it also states that the participants must be rationalists, as Yudkowsky specifies, you'll be sorely disappointed to find out how many people would identify as a rationalist.

Comment by Jake Heiser (jake-heiser) on Bayesian Judo · 2020-10-09T06:11:22.463Z · LW · GW

While I understand the absolute primal urge to stomp on religious texts used to propagate compulsory heterosexuality, I do think this exchange ended up a bit of a poor game, when it seems like he'd be mostly interested in discussing how the emotions of programmed thought might differ from ours (and that's a fun well to splash around in, for a while) (though disposing of cult-friendly rhetoric is valuable too, even if you have to get nasty).

I'm mildly concerned about the Reign Of Terror Precept, but I also understand it. It's just disappointing to know that the good faith of conversation has to be preserved artificially (ostensibly limited to Eliezerposting, which is more than fair. I can't wait to read the Harry Potter Fanfiction Fanfiction.)