Comments
You are assuming the simulation does not want to die, and this is not obvious. That $100 is better than $0 is taken as an axiom because it is part of the problem statement; but "death is worse than life" (for a simulation!) is not trivial. "Rationality should not be pursued for its own sake, or it ends in a loop": that is why these posts use money as the thing the rational agents want. You would have to assign a financial value to the simulation's life before you could say it is worth less than $100.
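As a rough sketch of the point (the symbol $v$ is my own notation, not from the original post): the agent's choice is only well-defined once we fix $v$, the dollar value the simulation places on its own continued existence. Then the decision is just

$$\text{take the } \$100 \iff 100 > v,$$

and nothing in the problem statement pins down $v$.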
Perhaps I misunderstand the meaning of "infohazard", but in this scenario what you seem to be trying to avoid is not the information itself, but rather player B knowing that you have the information. I think this could be fixed by replacing B with one of the "Omegas" who can predict you; then the information itself becomes the harmful thing.
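To sketch the distinction (notation mine, under my reading of the scenario): write $K$ for "you possess the information" and $b_B(K)$ for player B's belief that you do. In the original setup your payoff depends only on $b_B(K)$, so the hazard is B's belief, not $K$ itself. If B is replaced by an Omega-style predictor, then $b_B(K) = K$ by assumption, and the payoff depends directly on $K$; only then is merely possessing the information harmful on its own.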