Posts

Joint mandatory donation as a way to increase the number of donations 2024-07-07T10:56:57.222Z
Regularly meta-optimization 2024-06-25T06:12:58.939Z
Why write down the basics of logic if they are so evident? 2024-06-02T12:02:44.722Z

Comments

Comment by Crazy philosopher (commissar Yarrick) on Status Regulation and Anxious Underconfidence · 2024-07-11T09:34:53.445Z · LW · GW

So it shouldn’t be surprising if acting like you have more status than I assign to you triggers a negative emotion, a slapdown response.

I think there's a different mechanism here. I don't like it when Mr. A can't do X but doesn't know it, publicly announces that he's going to do X, and collects a lot of prestige up front. At the same time, I understand that he will not succeed and should not get that prestige. Then A fails, and the whole episode makes me trust anyone who claims they can do X without experience a little less.

Imagine that some philosopher announces that he is going to create an aligned AGI in a month, after which everyone begins to admire him. That's exactly the feeling. 

In other words, the problem is not that Mr. A lacks the status for his claim, but that he lacks the chances to succeed.

... but even if Mr. A decides to create an aligned AGI in a month without announcing it publicly, then you will wisely say: "This is impossible. Once I too thought I could do it in a month, but it doesn't work that way." Wait, this is the "juggling 3 balls is impossible" reaction!

What I've understood: most of the exclamations "you don't have enough experience / look at yourself from the outside / it's not possible" from experts in the domain are true. I mean, if you decide to do X, but all the experts in the domain say that you will not succeed, this is quite strong Bayesian evidence that you will not succeed. You can't dismiss it by deciding that they're just afraid for their status.

But otherwise I agree with Eliezer.

Comment by Crazy philosopher (commissar Yarrick) on Sunset at Noon · 2024-07-09T11:32:04.089Z · LW · GW

Sometimes, maybe you don't have time for friends to let you know. You're living an hour away from a wildfire that's spreading fast. And the difference between escaping alive and asphyxiating is having trained to notice and act on the small note of discord as the thoughts flicker by:

"Huh, weird."

Our civilization lives an hour away from a dozen metaphorical fires, some of which no living person has seriously thought about.

Comment by Crazy philosopher (commissar Yarrick) on Sunset at Noon · 2024-07-09T11:15:54.137Z · LW · GW

We have a lot of people showing up, saying "I want to help." And the problem is, the thing we most need help with is figuring out what to do. We need people with breadth and depth of understanding, who can look at the big picture and figure out what needs doing

Figure out how best to spread rationality, or at least ideas about X-risks. This is quite possible even with resources at zero, and if we can spread these ideas to, for example, 20% of the population, it will greatly help the fight against X-risks. In addition, we will have more people who will help us... to think about what we should do, lol

Comment by Crazy philosopher (commissar Yarrick) on Joint mandatory donation as a way to increase the number of donations · 2024-07-09T10:38:19.259Z · LW · GW

1) "I think we call this "taxes"."

So I invented taxes for charitable donations.

2) The second option is better for most participants, but not for everyone; you are right.

Comment by Crazy philosopher (commissar Yarrick) on Confessions of a Slacker · 2024-07-01T11:30:19.831Z · LW · GW

It's a nice sequence and I like it, but normal people call that "liberty", not "slack", lol

Comment by Crazy philosopher (commissar Yarrick) on Asymmetric Justice · 2024-06-30T10:41:20.140Z · LW · GW

This is a very useful article that helped me understand many things about myself and society. Thanks!

Comment by Crazy philosopher (commissar Yarrick) on Regularly meta-optimization · 2024-06-29T18:44:01.613Z · LW · GW

Okay, I'll rewrite the post. Thanks for your answers.

Comment by Crazy philosopher (commissar Yarrick) on Spaghetti Towers · 2024-06-29T08:45:35.331Z · LW · GW

That's true, but Program B will still be worse than a human-written program, so we aim to avoid spaghetti towers.

Spaghetti towers work especially poorly in changing environments: if evolution were a reasonable designer, it would have made us try to maximize the number of our genes in the next generation. Instead, it created several heuristics like hunger and the desire for groin friction. So when people invented civilization, we started eating fast food and having sex with condoms.

Comment by Crazy philosopher (commissar Yarrick) on Asymmetric Justice · 2024-06-28T14:31:03.456Z · LW · GW

People at simulacra level 4 can praise their political allies.

Comment by Crazy philosopher (commissar Yarrick) on Regularly meta-optimization · 2024-06-28T11:20:14.912Z · LW · GW

I'm talking about regularly auditing your whole life, desperately trying to find the most effective things to do. This technique is also about highlighting potentially very effective actions that you didn't spend much time thinking about and instead wrote off as "stupid" because, for example, they would require leaving your comfort zone.

Is that clearer?

Comment by Crazy philosopher (commissar Yarrick) on Luna Lovegood and the Chamber of Secrets - Part 13 · 2024-06-25T16:44:42.577Z · LW · GW

I think pretending to be something to protect yourself from Legilimency makes you what you pretend to be. That's why Harry fell into a coma: he pretended to be a stone.

Comment by Crazy philosopher (commissar Yarrick) on Regularly meta-optimization · 2024-06-25T15:06:27.808Z · LW · GW

Did you like the article?

Comment by Crazy philosopher (commissar Yarrick) on Why write down the basics of logic if they are so evident? · 2024-06-24T16:46:11.281Z · LW · GW

I don't see the difference. The theory of relativity and Newton's theory also have different philosophies: Newton's theory states that gravity is a force, that the universe is static and eternal, etc.

Newton's theory is not exactly a special case of the theory of relativity, because it is less accurate.

Comment by Crazy philosopher (commissar Yarrick) on The Ideology Is Not The Movement · 2024-06-24T16:18:35.387Z · LW · GW

Edit: this comment has received a lot of disagreement votes. Can you explain why you disagree?

Comment by Crazy philosopher (commissar Yarrick) on What Money Cannot Buy · 2024-06-24T16:11:03.481Z · LW · GW

Jeff Bezos could announce that he will pay $5,000,000,000 to whoever invents a cure for cancer. Or, better, give out a monetary reward for every step toward curing cancer. Then, if you have an idea for how to cure such-and-such a type of cancer, you take out a loan at a high interest rate (because it is risky) and conduct the research.

He could set up a fund to determine which research actually brings us closer to a cancer treatment: after all, Nobel prizes work well.
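A rough expected-value sketch of why the loan step can work (the symbols and the numbers are my illustrative assumptions, not from the comment above): with success probability $p$, prize $B$, research cost $C$, and loan interest rate $r$, taking the loan is worthwhile in expectation when

$$p \cdot B > (1 + r) \cdot C$$

For example, a 10% chance at the $5B prize justifies up to $250M of borrowed research funding even at 100% interest, since $0.1 \cdot 5\text{B} = 500\text{M} = 2 \cdot 250\text{M}$.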

Comment by Crazy philosopher (commissar Yarrick) on Theory and Data as Constraints · 2024-06-23T19:06:33.195Z · LW · GW

Before reading this sequence, I had an intuitive sense of the "bottlenecks" of production, but it allowed me to understand them much better. Thank you!

Comment by Crazy philosopher (commissar Yarrick) on The Ideology Is Not The Movement · 2024-06-11T06:22:13.764Z · LW · GW

""I think America has better values than Pakistan does, but that doesn’t mean I want us invading them, let alone razing their culture to the ground and replacing it with our own" - why not? No, seriously. America invaded several Muslim (fundamentalist Muslim, not we-kinda-like-Quran-stop-accusing-us-of-ISIS Muslim) countries already anyway. Why not raze the fundamentalist culture to the ground and replace it with universal?"

Preserving their culture is part of their utility function. Destroying their culture just like that is unethical for the same reason torturing people just like that is unethical: both reduce their utility, and the utility functions of other minds are included in our utility function. Therefore, only the most harmful elements of the culture (religion) should be destroyed, and very gradually.

In addition, actions that are unethical from a deontological point of view very often simply will not work, because people will start to resist, and you will not reap the fruits of your unethical sacrifices. If you invade Pakistan and destroy all the mosques, a week later the country will be aflame in a general uprising, and you will not get any of the economic growth associated with an improved political situation.

I'm not a radical deontologist, and I can imagine situations where something really unethical should be done, like taking over a country for the common good. But this particular plan is stupid.

Comment by Crazy philosopher (commissar Yarrick) on Why write down the basics of logic if they are so evident? · 2024-06-09T13:18:34.354Z · LW · GW

OK, I agree.

Comment by Crazy philosopher (commissar Yarrick) on Where The Falling Einstein Meets The Rising Mouse · 2024-06-08T09:02:17.149Z · LW · GW

It seems to me that the people who draw the "naive graph" mean by intelligence "the ability to achieve goals", while the "Eliezer graph" means by intelligence total computing power or something like that. Thus, the function that takes a value from the "Eliezer graph" and returns where that point stands on the "naive graph" is hyperexponential.
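To make the claim concrete (my formalization, not part of the original comment): write $P$ for the Eliezer-graph quantity (computing power) and $A$ for the naive-graph quantity (ability to achieve goals). The conjecture is that the conversion function $f$ in $A = f(P)$ grows hyperexponentially, e.g.

$$f(P) \sim \exp(\exp(P))$$

so that the tiny interval from village idiot to Einstein on the $P$ axis maps to an enormous interval on the $A$ axis.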

Comment by Crazy philosopher (commissar Yarrick) on Why write down the basics of logic if they are so evident? · 2024-06-07T15:05:34.905Z · LW · GW

I mean, we simplify reality down to Bayesian networks and scenario trees. And it works. It seems we can say that the universe is Bayesian.

Comment by Crazy philosopher (commissar Yarrick) on New User's Guide to LessWrong · 2024-06-03T18:03:16.349Z · LW · GW

What exactly do users gain and lose karma for?

Comment by Crazy philosopher (commissar Yarrick) on Why write down the basics of logic if they are so evident? · 2024-06-03T15:54:53.065Z · LW · GW

Comparing Bayesian theory with frequentism is like comparing general relativity with Newton's theory. Both explain reality, and Newton's theory is even easier to explain to children, although general relativity is closer to the truth.

Most people intuitively learn frequentism, but some learn Bayesian methods for the cases they constantly encounter and in which frequentist methods are not accurate enough.

But in any case, frequentist methods are built on Bayesian ones, one way or another. An ordinary person, observing a Bayesian world, comes up with a simplified version of it: frequentist methods. Newton, observing a world of general relativity, comes up with his own theory, in which gravity is a force.
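A minimal sketch of that "built on Bayesian" claim (my illustration; the coin example and all numbers are assumptions): for a coin of unknown bias, the frequentist estimate (observed frequency) and the Bayesian posterior mean under a uniform prior converge as data accumulates, so the frequentist rule behaves like a limiting case of the Bayesian one.

```python
import random

# Toy model: a coin with unknown bias (0.7 here, chosen arbitrarily).
# Frequentist estimate: the observed frequency of heads.
# Bayesian estimate: the posterior mean under a uniform Beta(1, 1) prior,
# which for n flips with h heads is (h + 1) / (n + 2).
random.seed(0)
true_bias = 0.7

for n in (10, 100, 10_000):
    heads = sum(random.random() < true_bias for _ in range(n))
    frequentist = heads / n
    bayesian = (heads + 1) / (n + 2)
    print(f"n={n:6d}  frequentist={frequentist:.3f}  bayesian={bayesian:.3f}")
```

As n grows, the two estimates become indistinguishable; they differ only on small samples, where the prior still matters.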

Comment by Crazy philosopher (commissar Yarrick) on 3 Levels of Rationality Verification · 2024-06-03T15:40:19.199Z · LW · GW

Look at a person's successes, taking into account their initial conditions. If they do science for humanity, then being a Nobel laureate counts as success; if they are an egoist, we should look at their happiness instead.

Comment by Crazy philosopher (commissar Yarrick) on When (Not) To Use Probabilities · 2024-06-02T07:51:38.998Z · LW · GW

That we couldn't be sure that there was no error in the papers which showed from multiple angles that the LHC couldn't possibly destroy the world.  And moreover, the theory used in the papers might be wrong.  And in either case, there was still a chance the LHC could destroy the world.  And therefore, it ought not to be turned on.

 

In other words: we should be less confident in the results of complex reasoning.

Comment by Crazy philosopher (commissar Yarrick) on Ethical Injunctions · 2024-06-02T07:30:34.536Z · LW · GW

Or, to summarize this essay:

Deontological rules are based almost directly on empirical experience, while utilitarian claims are very complex arguments.

"If you've truly understood the reason and the rhythm behind ethics, then one major sign is that, augmented by this newfound knowledge, you don't do those things that previously seemed like ethical transgressions. Only now you know why."

In other words, your theory should describe the facts well. Say we know that 90% of the people who decided to do X with the best intentions ended up as villains. If, in such a situation, it seems to you that if YOU had done X without moral preparation, you definitely would not have gone over to the dark side of the Force... that means your theory does not explain the data well. But if your everyday consequentialist morality produces results that match the consequences of the deontological rules 90% of the time, then I solemnly declare that your consequentialist morality is practically perfect, that those 10% of discrepancies are errors of deontology, and I advise you to trust your consequentialist morality.

"They don't mention the problem of running on corrupted hardware. They don't mention the idea that lies have to be recursively protected from all the truths and all the truthfinding techniques that threaten them. They don't mention that honest ways have a simplicity that dishonest ways often lack. They don't talk about black-swan bets. They don't talk about the terrible nakedness of discarding the last defense you have against yourself, and trying to survive on raw calculation."

In this paragraph Eliezer is in effect saying: "the world is more complicated than it seems, and we do not fully understand it, so complex theories work worse than they appear to; therefore trust the deontological rules (the simple theories)".

And one more thing: it seems that when you break a deontological rule, it would be wise to register it as: "yeah, I broke a deontological rule. I don't see exactly where I went wrong, but in any case this is Bayesian evidence that I was wrong." That evidence may be decisive and change the result of your reflection, or it may not.
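A toy Bayes calculation of that last point (all numbers are hypothetical assumptions of mine, not Eliezer's): even a normally reliable consequentialist judgment should lose substantial credibility the moment it collides with a deontological rule.

```python
# Hypothetical base rates, purely for illustration:
p_right = 0.95                  # how often my consequentialist reasoning is right
p_conflict_given_right = 0.10   # correct conclusions that still break a rule
p_conflict_given_wrong = 0.90   # incorrect conclusions that break a rule

# Bayes' theorem: update P(my conclusion is right) on the observation
# "this conclusion breaks a deontological rule".
posterior_right = (p_right * p_conflict_given_right) / (
    p_right * p_conflict_given_right + (1 - p_right) * p_conflict_given_wrong
)
print(f"P(right | breaks a rule) = {posterior_right:.2f}")  # ~0.68
```

Whether a drop from 95% to about 68% changes the decision depends on the stakes, which is exactly the "may be decisive or may not" point above.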

Comment by Crazy philosopher (commissar Yarrick) on Cultish Countercultishness · 2023-12-05T16:44:08.982Z · LW · GW

Pondering whether my beliefs form an affective death spiral has led me to taboo the word "transhumanism" in my reflections: the ideas of transhumanism are not actually that strongly connected to one another, they are very emotionally charged, and I rarely encounter criticism of them; when I do encounter it (in imagined arguments), it makes me aggressive, which manifests in a not-so-ethical selection of arguments.