Demonstrate, please.
You know, this seems amusingly analogous to the scene in the seventh Harry Potter novel in which Xenophilius Lovegood asks Hermione to falsify the existence of the Resurrection Stone.
The entire premise of Xyrik's scenario is that everything will be hunky-dory. Xyrik is just making a wish, and not thinking about how anything will actually work.
Well, to be fair, I never claimed to have any ideas for how to actually achieve a scenario with a flawless AGI, and I don't think I even said it would be a good idea; although if we DID have a flawless AGI, I would be open to an argument that it was.
But all I was asking was what potential downsides this could have, and people have risen to the occasion.
Not always. But too few people will choose mucking in the dirt, and without money I'm not sure how you're going to persuade enough people to go and do what they don't like.
That's a very good point, and I hadn't thought of that. This was basically why I made the post. Although I think I mentioned somewhere that a scenario like this would only actually work if we had some AGI that could reliably judge who needed what resources, and when, in order to further the overall human endeavor.
The lack of incentives (which, I think, exists in Xyrik's scenario as the alternatives are... much less palatable)
Basically the idea is that everyone realizes that if we do this, we could vastly accelerate the speed at which we develop, and thus solve many of our problems (overpopulation, food scarcity, etc.) by spreading among the stars, after which people could once again live freer lives and create their own systems, including but not requiring a governing body.
Yeah, part of what I was intending in the scenario is that everyone realizes we could make much faster technological advances (at least, that's the theory) if we didn't bother keeping track of who owes whom. If we need resources such as metals, we get them, make the MacGuffin, and continue.
I suppose the real problem with this is coming up with some form of game plan: determining who needs what. So what I'm imagining is a system in which some flawless AGI determines which group needs which resource at what time, to further the general human endeavor, rather than people getting what they want or need based on how much money they can amass. The latter is, as we know, a flawed system; otherwise people like Donald Trump would not exist while others starve to death in Third World countries.
But the idea would be to use a system like this to vastly accelerate our technological advancement, so that we can colonize the galaxy, become immortal, and eventually figure out how the world works. That's not to say I'm really trying to come UP with such a system; I'm sure systems like this have already been postulated and just don't work, because of the whole 'greed' thing. My question was mainly whether there could be problems not in developing the system, but in actually enacting it, even if it worked as intended.
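Just to make concrete what "determining who needs what" would even mean formally, here's a toy sketch, entirely my own illustration and not any real proposal from the thread: treat the AGI's job as a resource-allocation problem, maximizing some estimated "progress" score subject to what the pooled economy has on hand. All the numbers, project labels, and the linear-programming framing here are hypothetical.

```python
# Toy illustration (hypothetical numbers): central allocation as a
# linear program. Three projects consume two pooled resources; the
# planner maximizes total estimated "progress".
from scipy.optimize import linprog

progress_per_unit = [5.0, 3.0, 4.0]   # assumed value of each project per unit funded
resource_use = [[2.0, 1.0, 3.0],      # metal needed per unit of each project
                [1.0, 2.0, 1.0]]      # energy needed per unit of each project
resource_stock = [100.0, 80.0]        # pooled metal and energy available

# linprog minimizes, so negate the objective to maximize progress.
result = linprog(c=[-v for v in progress_per_unit],
                 A_ub=resource_use, b_ub=resource_stock,
                 bounds=[(0, None)] * 3, method="highs")

print("allocation per project:", result.x)
print("total progress:", -result.fun)
```

Of course, the hard part isn't solving the optimization; it's writing down something like `progress_per_unit` in the first place, which is exactly where the "flawless AGI" hand-wave comes in, and it does nothing about the incentive problem raised above.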
That was indeed what I was proposing. Like I said, this system assumes that humans have somehow solved that problem and are all willing to pitch in. I guess that would probably take some severe altering of our brains, potentially to the point where we're all some hive mind, which would be a debatable downside.
Would someone be able to enlighten me on the cons of a hypothetical situation in which everyone on the planet decides to temporarily get rid of the concept of money or currency and pool our collective resources and ideas without worrying about who owes whom? On paper it sounds great, and obviously this is extremely hypothetical, since it's virtually impossible to get all human life on Earth to actually do that, but are there hidden cons here that I'm not really seeing?
I haven't given this much thought; it was mostly a fleeting idea, and I was curious what others think.
When you say you've used it to create an ebook form of Rationality: From AI To Zombies, do you mean the one that is currently for sale, or some other version?
Oh shit, I hadn't even noticed that it was past midnight at the time, LOL. I could still make it if I wanted to... I may show up.
Thanks for the info. I actually didn't see that the latest meeting was today, or I would have tried to come; although, actually, I was at work. Do these often happen on weekdays, or are they also on weekends? At this point I'm mostly coming along to listen to others and become more versed in the Methods of Rationality. I mean to eventually sift through as much content on LW as possible, as well as read Rationality: From AI to Zombies (as soon as I finish another book series that I don't want to interrupt). Something like today's meeting would actually have been great, since I could listen and comment on the mini-talks without giving one of my own. Is there some sort of email notification for Austin meetings?
Just out of curiosity, do many people show up to these? I'm curious about coming. Also, Eliezer himself does not live here, does he?