Comments
Plugging my more light-hearted rationality podcast, Recreational Overthinking. I've been a LessWrong reader for a long time now, and I think a lot of the community would find it entertaining. https://open.spotify.com/show/3xZEkvyXuujpkZtHDrjk7r?si=ryufuZjZSe2ryw7aKFOUpQ
Awesome idea! As an excuse to practice some coding, I made a C++ program that runs a little Rational Breaks timer app in the terminal. I've put it on GitHub if anyone wants to try it out (only tested on Linux): https://github.com/ben-carew/ratio_breaks. I'm currently adding a meal-breaks feature and making it prettier; that will be up soon. If you use it, please give me feedback and suggest improvements/updates. This is also my first time using GitHub (I'm not a programmer), so any feedback on how that usually works would be great!
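For anyone curious what the core logic might look like, here's a minimal, self-contained sketch in C++. To be clear, this is not the code from the repo above; it just assumes "Rational Breaks" means break time accrues as a fixed fraction of time worked. The 5:1 ratio and the command names are placeholder choices of mine:

```cpp
#include <iostream>
#include <string>

// Minimal sketch of a ratio-based break timer.
// Assumption: every 5 minutes of work banks 1 minute of break.
int main() {
    constexpr double kWorkToBreakRatio = 5.0; // hypothetical default ratio
    double banked_break_min = 0.0;
    std::string command;
    double minutes;

    std::cout << "Commands: work <min> | rest <min> | quit\n";
    while (std::cin >> command && command != "quit") {
        if (!(std::cin >> minutes)) break;
        if (command == "work")
            banked_break_min += minutes / kWorkToBreakRatio; // earn break time
        else if (command == "rest")
            banked_break_min -= minutes; // spend break time
        std::cout << "Break balance: " << banked_break_min << " min\n";
    }
}
```

Under these assumptions, typing `work 50` then `rest 10` would bring the balance back to zero; a negative balance means you've taken more break than you've earned.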
Absolutely! I'm just finishing a bachelor's in physics. Email me at B78980988@gmail.com.
I exclusively use a Nokia flip phone, and I have never used a smartphone as my daily driver. Carrying something with that much potential for addiction in my pocket at all times scares me, and I'd rather save my willpower for more important decisions. I see the occasional boredom as a plus: being comfortable alone with just your thoughts is a skill worth practicing. There are definite downsides, though, especially now that QR code scanning is a staple of going out. I also don't like having to rely on others for navigation, Google searches, etc. (although the flip phone can do these poorly if required). I would recommend trialling a basic phone and perhaps keeping an old smartphone with no SIM as a "work phone" for when you need it.
Looking at the extremes of the situation:
- If I am omniscient, that doesn't make me omnibenevolent. I could see every consequence of my actions, know exactly what the moral choice would be, and still decide to act in an evil or selfish way. Knowing the truth makes it possible for me to be moral, should I choose to be, but does not by itself make me more moral.
- If I am completely unable to foresee the consequences of my actions, then my "morality", from a consequentialist viewpoint, can be no better than random chance. Faced with complete ignorance I cannot choose to be moral or immoral; I can only do things and see what happens. Therefore some level of belief in true things is necessary to be a moral agent.
Interpolating between these endpoints, it seems that believing true things correlates not with morality so much as with moral agency.
As an aside, you can imagine a situation where you are omniscient and omnibenevolent but live in a world without moral realism. If "truth" only includes information about what will happen, and not which moral theory is "correct", then you're still unable to make a moral choice.