Comments
I guess my point is that there are diminishing diplomatic/power rewards from increasing the number of nuclear weapons in your stockpile. While having nuclear capability is certainly important to be considered a superpower, the advantage the US gains over China by having a nuclear arsenal far bigger than the Chinese one is, in my view, relatively small. China still has enough nuclear weapons to make launching missiles at it a really bad idea for a president of the US who wants to keep his job/his party's political power/his citizens safe (even accounting for the possible incompetence of China's nuclear force - see this report). Also, having a no-first-use policy would matter more if China's leader were bound by his country's laws, which he unfortunately is not.
On the other hand, China is definitely trying to build the alliances and global influence that you speak of. One example would be the Belt and Road Initiative, through which China is pouring money into low-income countries in Asia and Africa. Also, the fact that China's nuclear arsenal is smaller and its delivery systems less advanced is somewhat irrelevant, since it still has an arsenal that could destroy all of the major American population centers more than twice over.
I took the Dark Factor test and got a very low score, but I kept second-guessing myself on the answers. I did that because I wasn't sure what my actions in a real-life scenario would be. Even though I had good intentions and I believe that other people's well-being has inherent value, I would put a high probability on getting at least a slightly higher score if this were a real-world test that I didn't know I was taking. That makes me pessimistic about the data that the authors cite in this article. If (for example) "over 16% of people agree or strongly agree that they 'would like to make some people suffer even if it meant that I would go to hell with them'" when they know they are being tested for malevolent traits, how many people actually would do that given the choice? Also - for people who believe in hell, I hope this answer reflects a scale insensitivity problem, since infinite time being tortured seems to me to have infinite negative utility, so you would need to value harming others more than helping yourself to agree with that statement.
I agree, but so many other things are different in this fan-fic, and Eliezer is smart enough that I wouldn't be surprised if it turns out to be like that for a reason.
This is a good example of a time when it would actually be worthwhile to know about the philosopher's zombie debate.
It says two comments for me before I posted this, so it looks like it has been fixed.
I think this comment is either incredibly stupid or extremely insightful and borderline genius. I honestly can't tell (although I'm leaning towards the former, and the karma seems to agree with me), and that scares me just a little.
I was also a little bit confused by that. My first thought was that it was a reference I didn't get?
I voted up every other comment starting from the top, then voted down every third one. I may or may not have continued. I have no idea if anyone will ever see this, but if they do: I have been reading LessWrong for a long time and only now created an account to continue this chain.