Comments

Comment by sh4dow on The Second Law of Thermodynamics, and Engines of Cognition · 2021-05-27T11:54:46.133Z

If you press a thermometer against the flywheel, it will heat up pretty quickly... It just behaves like a "white body" in terms of radiation.

Comment by sh4dow on Why the AI Alignment Problem Might be Unsolvable? · 2021-01-29T18:08:25.505Z

Comment by sh4dow on How sure are you that brain emulations would be conscious? · 2020-12-07T00:28:15.142Z

But isn't it still possible that a simulation which lost its consciousness would retain memories of consciousness, and that those memories would be sufficient, even without access to real consciousness, to generate potentially even novel content about consciousness?

Comment by sh4dow on Newcomb's Problem and Regret of Rationality · 2016-06-19T00:30:31.264Z

I would play the lottery: if I won more than $10M, I would take the black box and leave. Otherwise I would look in the black box: if it were full, I would also take the small one; if not, I would leave with just the empty black box. Since this strategy is inconsistent, a time-traveling Omega would either not choose me for the experiment or let me win the lottery for sure (assuming time works in similar ways as in HPMOR). If I got nothing, that would prove Omega wrong (and tell me quite a bit about how Omega, and time, work). If his prediction was correct, though, I would win $11,000,000, which is way better than either 'standard' variant.
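The conditional strategy above can be sketched as a small decision function. This is a hypothetical illustration, not anything from the original post: the payoff figures assume the standard Newcomb setup ($1,000,000 in a full black box, $1,000 in the small box), which is consistent with the $11M total only if the lottery jackpot is counted separately; the function returns just the box winnings.

```python
# Sketch of the comment's decision rule, under assumed standard Newcomb
# payoffs: $1,000,000 in a full black box, $1,000 in the small box.
# The $10M lottery threshold is the only figure taken from the comment.

def choose(lottery_winnings: int, black_box_full: bool) -> tuple[str, int]:
    """Return (action, box winnings) under the conditional strategy."""
    if lottery_winnings > 10_000_000:
        # Already rich from the lottery: one-box and leave.
        return ("one-box", 1_000_000 if black_box_full else 0)
    if black_box_full:
        # Box is full: take the small box as well.
        return ("two-box", 1_000_000 + 1_000)
    # Box is empty: leave with just the empty black box.
    return ("one-box", 0)
```

The inconsistency the comment points to is visible here: the agent two-boxes exactly when the box is full, which a correct predictor cannot accommodate unless it controls the lottery outcome as well.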