In the same spirit, some questions on the post itself:
- Could you be flattering rationalists here by telling them all their debates and disagreements are signs of their healthy culture?
- Could you be using contrarianism as a badge of identity yourself, a way to find community with fellow contrarians?
- Are you sure you're not using your description of 'woke' culture as a way to attack what is here an outgroup, rather than as a fair description of a purity phenomenon that happens in many ideologies?

Not saying I know the answers to these questions, but it's always worth turning the lightbeam inwards now and then.
I guess it'd be helpful to understand more about why you think class consciousness is in conflict with using "reason, negotiation, justice, goal factoring and pulling the rope sideways".
I would think (decent) trade union activity was precisely interested in reasonable negotiations targeted at justice for a group of people.
Automating much of the economy is more than a little way off, and is highly likely to bring its own problems, which I would expect to cross-cut with all these issues. I personally doubt that, in the event humans are not sidelined altogether, advances in AI would make the demographic transition much economically easier, but I think that's in the realm of speculation either way.
I replied before your edit so a bit more:
I agree that civilisational progress is fairly fragile. But it is fragile in both directions. Climate change and resource wars seem, to me, about as likely to lead to global conflict as internecine ethnic strife.
I say this partly because immigration seems like a force for mutual cultural understanding and trade, to me. Without it we would probably see more closed-off nations, more likely to go to war. With too much of it, however, there can be bad side effects and cultural rifts if not managed very wisely. Where the line is is no simple question.
I also want to advance the simple main idea that drives my views on this issue, which is that population growth HAS to level off eventually unless we colonise space. The side effects on the economy will equally have to be managed at one time or another.
Will they be easier to manage in the future? Or could growing populations make it even harder? Could managing a fall in population rates be easier if done more slowly?
Maybe. But I don't feel that's the tenor of the arguments I am hearing from rationalist and adjacent people right now.
Do you think that a large population that was reducing slowly would be something Zvi, Robin Hanson and others taking this stance would celebrate? (As opposed to what we have: a large population that is growing but showing signs of falling relatively fast in geographical/cultural pockets?)
Currently global population growth is positive but decelerating; I guess a more gradual deceleration would be less disturbing to them? But what about if world population growth very gradually moved from positive to negative? Would they be happy with that?
I had assumed not but I am trying to understand what good looks like.
So is the target to keep the population as it is? Has an argument been made as to why the current population is 'correct'? Isn't it a bit arbitrary?
All the same thoughts here. I also want to understand what the plan is if we keep growing the population. Is the idea that we keep going until we reach a higher stable number, or that we literally keep growing always? If the former, what's the number and why? If the latter, does that mean the whole strategy is 100% dependent on us inhabiting space? And if that's the case, shouldn't this rather big element in the plan be made explicit?
No, I think gene manipulation can be the right thing to do but that we should face harsh legal consequences if we cause harm by doing it with anything less than extreme levels of care and caution (I think the idiot god should be put on trial frequently as well, but he is sadly hard to handcuff).
I don't disagree with any of this. But if someone commits crimes against humanity in the name of eugenics, even if by accident, the fact that the blind, amoral god's actions are even worse is in no way exculpatory. In other words, he can get away with it; you can't.
Don't you think someone whose bike has been stolen realises they should have locked it afterwards without you telling them? Saying so may be fine but it actually depends how you tell them, I can imagine "Shoulda locked it" being a pretty annoying comment.
Gutsy of you to enter, look forward to watching. I got all your example questions right no problem but in about ten full seconds each. I'd have no chance on the show.
Now do the other side!
It's harder and harder to make good art, in a way: the more there is, the less likely you are to be able to do it better or create something truly new. However, it's not approaching impossible, because there's always new life happening to make art about. And there are usually new technologies coming along to make it with.
To an extent, the more tumultuous and fecund the world becomes, the more possible it becomes to produce good art again. Apocalypse or economic devastation is bad for art because it depletes the resources and free time needed to make it. However, if there is a possibility of a bounce back, you have both the fuel and the resources to make works of genius. This is a solace of (quite) bad news.
If the population falls the circumstances that created the low birth rate will change. This seems like the equivalent of an economist extrapolating a high inflation situation into the future and determining that only billionaires will be able to afford tomatoes.
I liked the fact that the enjoyment wasn't straightforward, in that it was somewhat challenging to watch in terms of keeping up with it, and it mostly posed moral questions as opposed to telling you what to think. I liked not being certain where Nolan stood. It wasn't too obvious who to root for, unlike with most more "straightforward to watch" Hollywood films.
Huge difference between unattainable standard and contradictory standards though. One is aspiring to be superhumanly great, the other is being confused about your own ideals.
Part of the point is that the standards we desire for ourselves may be contradictory and thus unachievable (e.g. Barbie's physical proportions). So it's not necessarily 'lower your standards', but 'seek more coherent, balanced standards'.
I also think you can enjoy the message-for-the-character without needing it for you but anyway, I get where you're personally coming from and appreciate your level of frankness about it!
I suppose you may have correctly analysed your reason for not liking the movie. But if you are right that you only respond to a limited set of story types, do you therefore aspire to opening yourself to different ones in future, or is your conclusion that you just want to stick to films with 'man becomes strong' character arcs?
I personally loved Barbie (man here!), and think it was hilarious, charming and very adroit politically. I also think that much of the moral messaging is pretty universal. Greta Gerwig obviously thinks so, as when she says: "I think equally men have held themselves to just outrageous standards that no one can meet. And they have their own set of contradictions where they’re walking a tightrope. I think that’s something that’s universal."
Is it possible that that message does strike some kind of chord with you but you don't want to hear it? (I guess I find 'absolutely hated' to be incredibly strong language for a film made with obvious skill and wit and that I think has no right to be as good as it is.)
I'll be honest, I can't engage with some lesswrong posts because of the endless hedging, introspection and over-specifying. The healthy desire to be clear and rational can become like a compulsion, one that's actually very demanding of the reader's time and patience. The truth is, one could clarify, quantify and 'go meta' on any step in any argument for untold thousands of words. So you have to decide where to stop and where to expand. This sort of strategic restraint is at the core of good writing style.
So while I can agree that the classic style may be unsuitable for many purposes when carried to an extreme, you have to decide where your communications fit on a scale. The opposite of writing everything in a fully classic style is a world where every piece of writing about anything becomes endless throat clearing and philosophising.
Davidsonian linguistics typically involve interpreting others' statements such that they are maximally true and also maximally coherent with other words/beliefs/attitudes, taken holistically (that covers your 'correlated with other queries' bit I guess?).
This is basically Davidson's "principle of charity".
Isn't he just trying to win points from his new Republican buddies by disowning his earlier interest in climate change and aligning himself more with a cluster of socially conservative views (anti-abortion, love of big patriarchal families, fear of being outgrown by the outgroup, etc.)?
I think avoiding spatial metaphors altogether is hard! For example in the paragraph below you use perhaps 3 spatial metaphors (plus others not so obviously spatial but with equal potential for miscommunication).
"The most interesting part of the experiment has been observing the mental vapor-lock that occurs when I disallow myself from casually employing a spatial metaphor ... followed by the more-creative, more-thoughtful, less-automatic mental leap I'm forced to make to finish my thought. You discover new ways in which your mind can move."
I'm sure I even recall encountering views that suggest all thought and language is a superstructure of metaphors built on a few basic concepts we acquire young through the senses. Not sure where I read this though!
That said, as a writer I also try to be alert to spatial metaphors that don't map especially well to the truth of a situation, and endeavour to select only the best ones.