Comments

Comment by SithLord13 on Open thread, Nov. 7 - Nov. 13, 2016 · 2016-11-13T11:47:52.163Z · LW · GW

Personally, I think the update most people should be making is the one getting the least attention: that even a 30% chance means 3 times out of 10. Things far less likely than 3 out of 10 happen every day. But because we attach so much importance to the election, we treat our predictions with far more confidence than they warrant, even when we know we're not that confident.

Comment by SithLord13 on [deleted post] 2016-10-21T08:54:27.322Z

This discussion was about agential risks; the part I quoted was about extreme ecoterrorism as a result of environmental degradation. In other words, the main post was partly about stricter regulations on CO2 as a means of minimizing the risk of a potential doomsday scenario from an anti-global-warming group.

Comment by SithLord13 on There are 125 sheep and 5 dogs in a flock. How old is the shepherd? / Math Education · 2016-10-17T12:51:29.559Z · LW · GW

I think the issue here might be slightly different than posed. I think the real issue is that children instinctively assume they're running on corrupted hardware. All their prior experience with math has involved solvable problems. They've had problems they couldn't solve, and then been shown the mistake was on their part. Without good cause, why would they suddenly assume their priors are wrong, rather than that they're failing to grasp the problem? Given their priors and information, it's rational to expect that they missed something.

Comment by SithLord13 on Barack Obama's opinions on near-future AI [Fixed] · 2016-10-17T12:34:57.605Z · LW · GW

I think the best reason for him to raise that possibility is to give a clear analogy. Nukes are undoubtedly air-gapped from the net, and there's no chance anyone with the capacity to penetrate them would think otherwise. It's just an easy-to-grasp way for him to present the issue to the public.

Comment by SithLord13 on [deleted post] 2016-10-17T12:10:19.399Z

Citation? This is commonly asserted by AI risk proponents, but I'm not sure I believe it. My best friend's values are slightly misaligned relative to my own, but if my best friend became superintelligent, that seems to me like it'd be a pretty good outcome.

I highly recommend reading this.

Comment by SithLord13 on [deleted post] 2016-10-15T23:25:12.166Z

Furthermore, implementing stricter regulations on CO2 emissions could decrease the probability of extreme ecoterrorism and/or apocalyptic terrorism, since environmental degradation is a “trigger” for both.

Disregarding any discussion of legitimate climate concerns, isn't this a really bad decision? Isn't it better to be unblackmailable, so as to disincentivize blackmail?

Comment by SithLord13 on Open thread, Oct. 10 - Oct. 16, 2016 · 2016-10-15T02:50:53.098Z · LW · GW

I also think that these scenarios usually devolve into a "would you rather..." game that is not very illuminating of either underlying moral values or the validity of ethical frameworks.

Can you expand on this a bit? (Full disclosure: I'm still relatively new to Less Wrong, and still learning quite a bit that I think most people here have a firm grip on.) I would think these scenarios illuminate a great deal about our underlying moral values, if we assume the answers are honest and that people are actually bound by their morals (or are at least answering as though they are, which I believe to be implicit in the question).

For example, I'm also a duster, and that "would you rather" taught me a great deal about my morality. (Although to be fair, what it taught me is certainly not what was intended: that my moral system is not strictly multiplicative but is logarithmic or exponential or some such function, where a non-zero harm that is sufficiently small can't be significantly increased simply by having it apply to arbitrarily many people.)
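To make the contrast concrete, here is a purely illustrative sketch of what I mean; the saturating form and the constant C below are assumptions chosen for illustration, not a claim that this is the "right" aggregation function:

```latex
% Linear ("shut up and multiply") aggregation: total disutility grows without bound in n.
% Saturating alternative (illustrative assumption): total disutility is capped at C,
% so a sufficiently small per-person harm u never aggregates past C, however large n gets.
\[
  D_{\text{linear}}(u, n) = u \cdot n
  \qquad\text{vs.}\qquad
  D_{\text{sat}}(u, n) = C\bigl(1 - e^{-u n / C}\bigr) < C
\]
```

Under something like the saturating form, if C sits below the disutility of the torture outcome, then no number of dust specks ever outweighs it; that's the flavor of what the thought experiment taught me, not a commitment to this particular curve.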

Comment by SithLord13 on Open thread, Oct. 10 - Oct. 16, 2016 · 2016-10-13T15:37:16.955Z · LW · GW

There are a lot of conflicting aspects to consider here outside of a vacuum. Discounting the unknown unknowns, which could factor heavily here since it's an emotionally biasing topic, you've got the fact that the baby is going to be raised by a presumably attentive mother, as opposed to the five who already wound up in that situation once, which suggests at least some increased risk of their falling victim to such a situation again. Then there's the psychological damage to the mother, which will be even greater because she had to do the act herself. Then there's the fact that a child raised by a mother who is willing to do it has a greater chance of being raised in such a way as to have a net positive impact on society. Then there's the greater potential for preventing the situation in the future, owing to the increased visibility of the higher death toll. I'm certain there are more aspects I'm failing to note.

But, if we cut to what I believe is the heart of your point, then yes, she absolutely should. Let's scale the problem up for a moment. Say instead of 5 it's 500. Or 5 million. Or the entire rest of humanity aside from the mother and her baby. At what point does sacrificing her child become the right decision? Really, this boils down to the idea of "shut up and multiply."

Comment by SithLord13 on Open thread, Oct. 10 - Oct. 16, 2016 · 2016-10-11T18:50:06.157Z · LW · GW

Could chewing gum serve as a suitable replacement for you?