Let's make the truth easier to find
post by DPiepgrass · 2023-03-20T04:28:41.405Z · LW · GW · 44 comments
Comments sorted by top scores.
comment by Shmi (shminux) · 2023-03-20T09:37:15.698Z · LW(p) · GW(p)
Your examples seem suspiciously political, not a good sign. They also use connotation-heavy language, another red flag for anyone interested in "truth seeking". So, I figured I'd ask you how is your personal truth-seeking going? What important updates in your worldview come to mind as a result of it? Or is it only the other people who are not good at collecting and organizing evidence?
Replies from: DPiepgrass, cubefox
↑ comment by DPiepgrass · 2023-03-20T19:11:50.681Z · LW(p) · GW(p)
Most important matters have a large political component. If it's not political, it's probably either not important or highly neglected (and as soon as it's not neglected, it probably gets politicized). Moreover, if I would classify a document as reliable in a non-political context, that same document, written by the same people, suddenly becomes harder to evaluate if it was produced in a politicized context. For instance, consider this presentation by a virologist. Ordinarily I would consider a video to be quite reliable if it's an expert making a seemingly strong case to other experts, but it was produced in a politicized environment, and that makes it harder to be sure I can trust it. Maybe, say, the presenter is annoyed about non-experts flooding in to criticize him or his field, so he's feeling more defensive and wants to prove them wrong. (On the other hand, increased scrutiny can improve the quality of scientific work. It's hard to be sure. Also, the video had about 250 views when I saw it and 576 views a year later—it was made by an expert for an expert audience and never went anywhere close to viral, so he may be less guarded in this context than when he is talking to a journalist or something.)
My goal here is not to solve the problem of "making science work better" or "keeping trivia databases honest". I want to make the truth easier to find in a political environment that has immense groups of people who are arriving at false or true beliefs via questionable reasoning and cherry-picked evidence, and where expertise is censored by glut. This tends to be the kind of environment where the importance and difficulty (for non-experts) of getting the right answer both go up at once. Where once a Google search would have taken you to some obscure blogs and papers by experts discussing the evidence evenhandedly (albeit in frustratingly obscurantist language), politicization causes the same search to give you page after page of mainstream media and bland explanations which gravitate to some narrative or other and which rarely provide strong clues of reliability.
I would describe my personal truthseeking as frustrating. It's hard to tell what's true on a variety of important matters, and even the ones that seemed easy often aren't so easy when you dive into them. Examples:
- I mentioned before my frustration trying to learn about radiation risks.
- I've followed the Ukraine invasion closely since it started. It's been extremely hard to find good information, to the point where I use quantity as a substitute for quality because I don't know a better way. This is wastefully time-consuming, and if I ever manage to reach a firm conclusion about a subtopic of the war, I have nowhere to publish my findings that any significant number of people would read. (I often publish very short summaries or links to what I think is good information on Twitter, knowing that publishing in more detail would be pointless given my lack of audience; I also sometimes comment on Metaculus about war-related topics, but only when my judgement pertains specifically to a forecast that Metaculus happens to ask about.) The general problem I have in this area is a combination of (1) almost nobody citing their sources, (2) the sources themselves often being remarkably barren, e.g. the world-famous Oryx loss data [1, 2] gives nowhere near enough information to tell whether an asserted Russian loss is actually Russian rather than Ukrainian, (3) Russia and Ukraine both having strong information operations that create constant noise, and (4) the fact that I find pro-Putin sources annoying because of their bloodthirstiness, ultranationalism and authoritarianism, so while some of them give good evidence, I am less likely to discover them, follow them, and see that evidence.
- It appears there's a "97% consensus on global warming", but when you delve deep into it, it's not as clear-cut. Sorry to toot my own horn, but I haven't seen any analysis of the consensus numbers as detailed and evenhanded as the one I wrote at that link (though I have a bias toward the consensus position). That's probably not because no one else has done such an analysis, but because an analysis like that (written by a rando and not quite affirming either of the popular narratives) tends not to surface in Google searches. Plus, my analysis is not updated as new evidence comes in, because I'm no longer following the topic.
- I saw a rather persuasive full-length YouTube 'documentary' promoting Holocaust skepticism. I looked for counterarguments, but those were relatively hard to find among the many pages saying something like "they only believe that because they are hateful and antisemitic" (the video didn't display any hint of hate or antisemitism that I could see). When I did find the counterarguments, they were interlaced with strong ad hominem attacks against the people making the arguments, which struck me as unnecessarily inflammatory rather than persuasive.
- I was LDS for 27 years before discovering that my religion was false, despite always being open to that possibility. For starters, I didn't realize the extent to which I lived in a bubble or to which I and (especially) other members had poor epistemology. But even outside the bubble it just wasn't very likely that I would stumble upon someone who would point me to the evidence that it was false.
is it only the other people who are not good at collecting and organizing evidence?
No, I don't think I'm especially good at it, and I often wonder if certain other smart people have a better system. I wish I had better tooling and I want this tool for myself as much as anyone else.
Not a good sign
In what way? Are you suggesting that if I built this web site, it would not in fact use algorithms designed in good faith with epistemological principles meant to elevate ideas that are more likely to be true, but rather it would look for terms like "global warming" and somehow tip the scales toward "humans cause it"?
connotation-heavy language
Please be specific.
Replies from: ChristianKl
↑ comment by ChristianKl · 2023-03-23T15:30:52.931Z · LW(p) · GW(p)
A lot of the resources invested into "fighting misinformation" are about censoring non-establishment voices, and that often includes putting out misinformation like "Hunter's laptop was a Russian misinformation campaign" to facilitate political censorship.
In that environment, someone proposing a new truthseeking project might be interested in creating a project to strengthen the ruling narrative, or they might be interested in actual truthseeking that affirms the ruling narrative when it's right and challenges it when it's wrong.
In a world with so much political pressure, it probably takes strong conviction to run a project that does actual truthseeking instead of being co-opted for narrative control.
↑ comment by cubefox · 2023-03-20T17:28:49.775Z · LW(p) · GW(p)
I don't find his use of political examples suspicious at all. After all, politics is an area where epistemic mistakes can have large to extremely large negative effects. He could have referred to non-political examples, but those tend to be comparatively inconsequential.
Replies from: shminux
↑ comment by Shmi (shminux) · 2023-03-20T18:06:29.601Z · LW(p) · GW(p)
Yes, indeed, and that was my point: they are using a political example with connotation-loaded language as if it were the truth, not one possible perspective. Which made me question the OP's ability to evaluate their own commitment to truth-seeking.
Replies from: cubefox
↑ comment by cubefox · 2023-03-20T18:42:47.354Z · LW(p) · GW(p)
For any proposition you assert, it is possible that someone else has another "perspective" and asserts something different, each acting as if it were the truth. So the existence of possible perspectives is not specific to politics or truth seeking. Sure, it is possible to be overconfident relative to the evidence you have, but I don't recommend universal extensive hedging for political examples merely because they are political. If you disagree with his examples, you are surely able to insert similar examples where (what you believe to be) epistemic mistakes have a very large negative impact. The thing with contemporary political mistakes is: they are nearly always controversial, so disagreement is expected, but this is not substantial evidence that political mistakes with large negative effects don't exist. (One could use now-uncontroversial historical examples instead, like Lysenkoism, but this could make it sound like such mistakes are a thing of the past, and that we are much wiser now.)
comment by tailcalled · 2023-03-20T09:06:11.426Z · LW(p) · GW(p)
I agree that something like an evidence clearinghouse seems like a good and important project. However, I am not sure you have gotten at the most important part of the problem.
Your proposal seems to focus on having the clearinghouse:
- Organize information that others have created
- Organize arguments that others have created
However I think this doesn't really get to the part of the problem that can be effectively addressed.
I think a disagreement often has two root causes:
- The two sides don't trust each other. For instance, lots of scientists are incompetent or politically biased, and especially the scientists who make public statements have typically been selected by politically biased organizations, so you generally should be skeptical of scientists.
- There is some area that the two sides both want to control. For instance pro-vaccine and anti-vaccine people both want to influence the bodies of anti-vaccine people (pro-vaxxers want anti-vaxxers to get vaccinated and anti-vaxxers don't want to get vaccinated).
I think the primary tasks of an evidence clearinghouse would be something like:
- Go out of its way to figure out what the underlying conflicts are.
- Collect new evidence relevant for the conflicts (e.g. is some specific scientific field an exception that is especially trustworthy?)
- Interpret evidence in the light of people's positions in the conflicts (e.g. what are the potential pros or cons to getting vaccinated in the light of the conflict?)
comment by Mitchell_Porter · 2023-03-20T06:00:11.359Z · LW(p) · GW(p)
One coauthor of the recent editorial, "The False Promise of ChatGPT", Jeffrey Watumull, champions an alternative style of AI, "anthronoetic AI", in which the capacity to provide explanations, and not just correct predictions, is fundamental. There is very little information about it online, but you can see a glimpse of the architecture in this video. You might want to talk to him about epistemological methods.
comment by Viliam · 2023-03-20T16:07:30.246Z · LW(p) · GW(p)
My job is to build high-quality software that doesn't benefit the world in any way. I'd rather make different software, but for that I need funding.
Ah, story of my life. There are things that pay my bills. There are things that I think the world would benefit from if I made them. I can't find any intersection between these two.
Maybe Kickstarter, if you have a specific idea?
comment by Edward Pascal (edward-pascal) · 2023-03-20T18:57:23.385Z · LW(p) · GW(p)
I think another issue that would arise is that if you get "into the weeds," some topics are a lot more straightforward than others (probably delineated by being rooted mostly in social facts or mostly in natural-science facts, which behave completely differently).
The Ukraine issue is a pretty bad one, given the history of the region, the Maidan protests, US history of proxy wars, and, and, and. It seems to me far from clear what the simple facts are (other than you have two factions of superpowers, fighting for different things). I have an opinion as to what would be best, and what would be best for people of Ukraine, and what I think sections of Ukraine undisturbed by US and Russian meddling for the past 30 years might vote in referenda. And at least one of those thoughts disagrees with the others. Add to this the last 70 years of US interventions (see Chomsky for pretty good, uncontroversial fact-based arguments that it has all been pretty evil, and by the standards of the Nuremberg Trials one might execute every president since Kennedy).
On the other hand, Global Warming is pretty straightforward (even allowing for seeming complications like Mars temperature rise, or other objections). We can handle the objections in measurable terms of physical reality for a home-run clear answer.
One of OP's examples is an entirely social reality and the other is a matter of physics. Let's face it, in some sense this war is about where we draw squiggly lines and different colored blobs on a map. It's levels removed from something where we can do measurable tests. If you really made all the truth easy to find, bringing someone straight into the weeds of a social problem like a US/NATO intervention, in many cases the answer will not come out clear, no matter how good your tool is. In fact, a reasonable person after reading enough of the Truth might walk away fully disillusioned about all actors involved and ready to join some kind of anarchist movement. Better in some cases to gloss over social realities in broad strokes, burying as much detail as possible, especially if you think the war (whichever one it is!) is just/unjust/worth the money/not worth the money, etc.
Replies from: cousin_it, DPiepgrass
↑ comment by cousin_it · 2023-03-22T16:36:03.600Z · LW(p) · GW(p)
I think Western colonialism was really bad, US wars were really bad, the Nazis were really bad, and so on. But from what I see of Russia's position, these are excuses. The true reason for the current war is annexation.
Russia could try to get Ukraine away from NATO, remove ultranationalists, protect Russian speakers and whatever else - purely as a military operation, without annexation. Instead, two days after the Maidan in 2014 and before any hostile action from the new Ukrainian government, Russia initiated annexing Crimea. That move was very popular with the Russian population, it wasn't Putin alone. Similarly in the current war, the stated goals were "demilitarization and denazification", but then Russia annexed several captured territories, which wasn't needed for any of those goals.
In fact I don't know any good reason for these annexations at all. They don't make Russia richer or more secure. It seems the situation is simple and kinda dumb: Putin and a large proportion of Russians simply want to annex these territories, profit be damned. They decided they want it, and now they want it.
Replies from: edward-pascal, baturinsky
↑ comment by Edward Pascal (edward-pascal) · 2023-03-24T00:41:51.007Z · LW(p) · GW(p)
Then let's say we broadly agree on the morality of the matter. The question still remains if another US adventure, this time in Europe, is actually going to turn out all that well (as most haven't for the people they claimed to be helping). We also have to wonder if Russia as a failed state will turn out well for Ukraine or Europe, or if this will turn Nuclear if US/NATO refuse to cede any ground, or if the Russia/China alliance will break or not, or for how long the US can even afford and support more wars, etc, etc.
On the other side, do we worry if we're being Neville Chamberlain because we think every aggressor will behave as Hitler in 1938 if we give an inch, so "We gotta do something?" There may even be merit to the sentiment, but "We gotta do something" is one of the most likely ways to screw any situation up. Also, given the US's history of interventions, setting aside morality, just looking at the history of outcomes, the response is questionable. Looking down the road, if this conflict or anything else significantly weakens the US, economically, in domestic politics, or leads to an overextended military, then Ukraine might be lost all the way to the Polish border, not just the Eastern regions.
These are mostly practical considerations that are indeterminate and make the US intervention questionable without even looking at the morality. Given perfect knowledge, you would have a probability and risk-management problem on your hands, which often fails to result in a clear convergence of positions. And going back to my original claims, this makes this type of thing very different from Physics and Chemistry and their extensions.
EDIT: Perhaps the most important question comes down to this: Russia clearly screwed up their risk management (as your message alludes to). How can US/NATO do far better with Risk Management? Maybe even better than they've done in all their wars and interventions in recent history?
↑ comment by baturinsky · 2023-03-22T19:54:10.550Z · LW(p) · GW(p)
Russia was trying peaceful and diplomatic options. Very actively. Literally begging to compromise. Before 2014 and before 2022. That did not work. At all.
Deposing the democratically elected government with which Russia was a military ally was a hostile enough act. And Maidan nationalists had already started killing anti-Maidan protesters in Crimea and other Russian-speaking regions. I was following those events very closely and was speaking with some of the people living there at the time.
Replies from: cousin_it
↑ comment by cousin_it · 2023-03-22T21:34:12.013Z · LW(p) · GW(p)
This seems to miss the point of my comment. What are the reasons for annexation? Not just military action, or even regime change, but specifically annexation? All military goals could be achieved by regime change, keeping Ukraine in current borders, and that would've been much better optics. And all economic reasons disappeared with the end of colonialism. So why annexation? My answer: it's an irrational, ideological desire for that territory. That desire has taken hold of many Russians, including Putin.
Replies from: baturinsky
↑ comment by baturinsky · 2023-03-23T03:11:02.751Z · LW(p) · GW(p)
Crimea was the only Ukrainian region that was overwhelmingly Russian and pro-Russian, and also the region where a key Russian military base is situated. And at the moment there was an (at least formally) legal way to annex it with minimal bloodshed. Annexing it resolved the issue of the military base, and gave legal status, protection guarantees, and rights to the citizens of the Crimean republic.
Regime change for the whole of Ukraine would have meant a bloody war, an insurgency, and installing a government which the majority of Ukraine's population would be against. And massive sanctions against Russia AND Ukraine, for which Russia was not prepared then.
Replies from: cousin_it
↑ comment by cousin_it · 2023-03-23T12:20:59.403Z · LW(p) · GW(p)
It's true that annexing Crimea would've been rational in a world where +base and +region were the only consequences. (Similar to how the US in the 1840s grabbed Texas and California from Mexico without much trouble.) But we do not live in that world. We live in a world where many countries are willing to penalize Russia for annexation and help Ukraine defend. Russia's leadership didn't understand that and still doesn't. As a result, Russia's security and economic situation have both gotten much worse and continue to get worse. That's why I call it irrational.
Replies from: baturinsky, DPiepgrass
↑ comment by baturinsky · 2023-03-23T13:31:48.229Z · LW(p) · GW(p)
No such countries. There is the USA, which is willing to penalize its geopolitical opponents for being such. There are US puppets that are willing to penalize whomever the USA tells them to. They were penalizing Russia for arbitrary reasons before and after Crimea. If Russia had not annexed Crimea, it would have been penalized about the same, but with other cited reasons.
Replies from: cousin_it, Viliam
↑ comment by cousin_it · 2023-03-23T14:50:17.911Z · LW(p) · GW(p)
I see a common pattern in your arguments. Ukraine never did large scale repression against Russian speakers - "but they would've done it". Europe didn't start sanctioning Russian resources until several months into the war - "but they would've done it anyway". The US reduced troops and warheads in Europe every year from 1991 to 2021 - "but they would have attacked us". 141 countries vote in the UN to condemn Russian aggression - "but they're all US puppets, just waiting for a chance to harm us".
There's a name for this kind of irrationality: paranoia. Dictators often drum up paranoia to stay in power, which has the side effect of making the country aggressive.
Replies from: baturinsky
↑ comment by baturinsky · 2023-03-24T16:07:32.680Z · LW(p) · GW(p)
I disagree with the first part, but I'm not sure if this is the right place to discuss the details. We can discuss it in DM if you want.
You are spot on with the second, though. Exploiting fears of real or perceived threats is an extremely effective tool to control people and nations by posing as their protector.
The champion in this regard is the USA, of course. It fuels and exploits Europe's fear of Russia, Japan's fear of China, India's and China's mutual fear, and so on.
Domestically, the USA's elites exploit an extremely wide range of fears. Fear of terrorists, fear of Russia, fear of China, fear of Nazis, fear of people of different parties, races, sexuality, and even fear of people who fear LGBTQ+ or specific races.
The USA has been using the "divide and conquer" strategy liberally for at least a century now. This will likely have catastrophic consequences, as a divided world will have much less chance of surviving the acute risk period.
Putin also exploits fears, such as fears of LGBT "propaganda", Nazis, and the USA. But I don't think his position before 2022 was so shaky that he would have to resort to war to hold it.
Replies from: cousin_it↑ comment by Viliam · 2023-03-24T12:31:22.680Z · LW(p) · GW(p)
I think that support for hurting Russia is much greater in Eastern Europe than in the USA. (Maybe with the exception of Hungary.) That does not seem to match "they only want it because they are puppets".
For the USA, Russia is some kind of noble ancient enemy. Kicking them while they are down may even feel unsportsmanlike.
For Eastern Europe, it is (for the anti-Russian part of the population) more like: "yeah, kick them while you can, stomp as hard as you can, so that they can never hurt us again". Many families remember relatives who were raped by the Red Army (no, it wasn't "only" Germany), kidnapped for Soviet extermination camps, etc. Ukraine is re-living this history right now, for the others this is more like "horrible stories my grandma told me when she considered me old enough to hear it".
Also, are you aware that Russia was planning to annex Belarus and Moldova next? (Putin actually wrote about his plans with Ukraine and Belarus in 2021.) But even taking the entire Ukraine would already make them my neighbors. I prefer that not to happen.
*
That said, perhaps in the larger picture, it is completely irrelevant what the Eastern Europeans want to do, if the USA decided otherwise.
That doesn't change the fact that they want it. Definitely not just puppets doing whatever the USA tells them. (The example of Hungary actually shows that even the little countries are capable of ignoring American wishes.)
*
Sorry for the mindkilling tone, but I find it annoying when people from the internet keep telling me that I have no agency, not even my own thoughts and wishes, and that I just think what the American overlords want me to think. (Unlike people in Russia or the USA, who are allowed to be independent thinkers.)
↑ comment by DPiepgrass · 2023-03-24T20:54:45.219Z · LW(p) · GW(p)
I would point out that Putin's goal wasn't to make Russia more prosperous, and that what Putin considers good isn't the same as what an average Russian would consider good. Like Putin's other military adventures, the Crimean annexation and heavy military support of Donbas separatists in 2014 probably had a goal like "make the Russian empire great again" (meaning "as big as possible") and from Putin's perspective the operations were a success. Especially as (if my impression is correct) the sanctions were fairly light and Russia could largely work around them.
Partly he was right, since Russia was bigger. But partly his view was a symptom of continuing epistemic errors. For example, given the way the 2022 invasion started, it looks like he didn't notice the crucial fact that his actions in 2014 caused Ukrainians to turn strongly against Russia.
In any case this discussion exemplifies why I want a site entirely centered on evidence. Baturinsky claims that when the Ukrainian parliament voted to remove Yanukovych from office 328 votes to 0 (about 73% of the parliament's 450 members) this was "the democratically elected government" being "deposed". Of course he doesn't mention this vote or the events leading up to it. Who "deposed the democratically elected government"? The U.S.? The tankies say it was the U.S. So who are these people, then? Puppets of the U.S.?
I shouldn't have to say this on LessWrong, but without evidence it's all just meaningless he-said-she-said. I don't see truthseeking in this thread, just arguing.
↑ comment by DPiepgrass · 2023-03-20T21:03:37.801Z · LW(p) · GW(p)
I disagree in two ways. First, people are part of physical reality. Reasoning about people and their social relationships is a complex but necessary task.
Second, almost no one goes to first principles and studies climate science themselves in depth. But even if you did that, you'd (1) be learning about it from other people with their interpretations, and (2) not be able to study all the subfields in depth. Atmospheric science can tell you about the direct effect of greenhouse gases, but to predict the total effect quantitatively, and to evaluate alternate hypotheses of global warming, you'll need to learn about glaciology, oceanology, coupled earth-system modeling, the effects of non-GHG aerosols, solar science, how data is aggregated about CO2 emissions, CO2 concentrations, other GHGs, various temperature series, etc.
Finally, if you determine that humans cause warming after all, now you need to start over with ecology, economic modeling etc. in order to determine whether it's actually a big problem. And then, if it is a problem, you'll want to understand how to fix the problem, so now you have to study dozens of potential interventions. And then, finally, once you've done all that and you're the world's leading expert in climate science, now you get frequent death threats and hate mail. A billion people don't believe a word you say, while another billion treat your word like it's the anointed word of God (as long as it conforms to their biases). You have tons of reliable knowledge, but it's nontransferable.
Realistically we don't do any of this. Instead we mostly try to figure out the social reality: Which sources seem to be more truth-seeking and which seem to be more tribal? Who are the cranks, who are the real experts, and who can I trust to summarize information? For instance, your assertion that Noam Chomsky provides "good, uncontroversial fact-based arguments" is a social assertion that I disagree with.
I think going into the weeds is a very good way of figuring out the social truth that you actually need in order to figure out the truth about the broader topic to which the weeds are related. For instance, if the weeds are telling you that pundit X is clearly telling a lie Y, and if everybody who believes Z also believes X and Y, you've learned not to trust X, X's followers, Y, and Z, and all of this is good... except that for some people, the weeds they end up looking at are actually astroturf or tribally-engineered plants very different from the weeds they thought they were looking at, and that's the sort of problem I would like to solve. I want a place where a tribally-engineered weed is reliably marked as such.
So I think that in many ways studying Ukraine is just the same as studying climate science, except that the "fog of war" and the lack of rigorous sources for war information make it hard to figure some things out.
Replies from: edward-pascal
↑ comment by Edward Pascal (edward-pascal) · 2023-03-21T01:19:39.509Z · LW(p) · GW(p)
Okay, I think I understand what you mean: since it's impossible to fully comprehend climate change from first principles, it ends up being a political and social discussion (and anyway, that's empirically the case). Nonetheless, I think there's something categorically different about the physical sciences compared to the more social facts.
I think perfect knowledge of climate science would tend towards convergence, whereas at least some Social Issues (Ukraine being a possible example) just don't work that way. The Chomsky example is germane: prior to '92, his work on politics was all heavily cited and based on primary sources, and pretty much as solid academically as you could ask for (see for example "The Chomsky Reader"), and we already disagree on this.
With regard to Ukraine, I think intelligent people with lots of information might end up diverging even more in their opinions on how much violence each side should be willing to threaten, use, and display in an argument about squiggly lines on map blobs. Henry Kissinger ended up not even agreeing with himself from week to week, and he's probably as qualified an expert on this matter as any of us. I think it's fair to suggest that no number of facts regarding Ukraine is going to bring the kind of convergence you would see if we could upload the sum of climate science into each of our human minds.
Even if I am wrong in the Ukraine case, do you think there are at least some social realities where, if you magically downloaded the full spectrum of factual information into everyone's mind, people's opinions might still diverge? Doesn't that differ from a hard science, where they would tend to converge if you understood all the facts? Doesn't this indicate a major difference of categories?
Another way of looking at it: social realities are not nearly as deterministic on factual truth as accurate conclusions in the hard sciences are. They are always vastly more stochastic. Even looking at the fields, the correlation coefficients and R² for whole models in Sociology, at its absolute best, are nothing at all compared to the determinism you can get in Physics and Chemistry.
Replies from: DPiepgrass
↑ comment by DPiepgrass · 2023-03-22T14:55:53.357Z · LW(p) · GW(p)
I think that the people who are truthseeking well do converge in their views on Ukraine. Around me I see tribal loyalty to Kremlin propaganda, to Ukrainian/NAFO propaganda, to anti-Americanism (enter Noam Chomsky) and/or to America First. Ironically, anti-American and America First people end up believing similar things, because they both give credence to Kremlin propaganda that fits into their respective worldviews. But I certainly have a sense of convergence among high-rung observers who follow the war closely and have "average" (or better yet scope-sensitive/linear) morality. Convergence seems limited by the factors I mentioned though (fog of war, poor rigor in primary/secondary sources). P.S. A key thing about Chomsky is that his focus is all about America, and to understand the situation properly you must understand Putin and Russia (and to a lesser extent Ukraine). I recommend Vexler's video on Chomsky/Ukraine as well as this video from before the invasion. I also follow several other analysts and English-speaking Russians (plus Russian Dissent translated from Russian) who give a picture of Russia/Putin generally compatible with Vexler's.
do you think there are at least some social realities that if you magically downloaded the full spectrum of factual information into everyone's mind, people's opinions might still diverge
Yes, except I'd use the word "disagree" rather than "diverge". People have different moral intuitions, different brain structures / ways of processing info, and different initial priors that would cause disagreements. Some people want genocide, for example, and while knowing all the facts may decrease (or in many cases eliminate) that desire, it seems like there's a fundamental difference in moral intuition between people that sometimes like genocide and those of us who never do, and I don't see how knowing all the facts accurately would resolve that.
Replies from: edward-pascal↑ comment by Edward Pascal (edward-pascal) · 2023-03-24T00:24:16.633Z · LW(p) · GW(p)
What you are actually making is something like a "lesser of two evils" argument, or some bet on tradeoffs paying off that one party may buy and another may not. Having explored the reasoning this far, I would suggest this is one class of circumstances where, even if you beamed all the facts into the minds of two people who both had "average" morality, there would still tend to be disagreement. This definitely doesn't hinge on someone wanting something bad, like genocide, for the disagreement. In this class of situations, people could both want the same outcomes and still diverge in their conclusions with the facts beamed into their minds (which, to my original argument, differs tremendously from physics).
I hadn't seen old man Chomsky talk about Ukraine prior to your video above. I think though, if you look at his best work, you might be able to softly mollify the impact, but it's not like he's pulling his ideas about, say, every single US action in South America and the Middle East being very bad for the people they claimed to help, out of some highly skewed view. Those border on fairly obvious, at any rate, and your video's recasting him as a "voice of moral outrage" hinges on his off-the-cuff interviews, not his heavily cited work (as I mentioned, The Chomsky Reader is a different man than the one in the video).
Even setting him aside as a reference, looking at the recent history of US war, at the most generous, considering Russian badness and US badness, any "moral high-ground" argument for the US being good in this case will boil down to a lesser-of-two-evils assessment. Also looking at US history, you lose some of the "this is just an annexation" argument, because a US proxy war since 2014 would fit the pattern of pretty much everything the USA has done both recently and for the past 100 years.
Your point about also looking at Putin/Russia is fine, and it should be considered along with practical solutions to the matter. I think we all would call Putin a criminal; that isn't the question at hand. The question is whether another US adventure, this time in Europe, is actually going to turn out all that well, or whether Russia as a failed state will turn out well for Ukraine or Europe, or whether this will turn nuclear if you refuse to cede any ground, or whether the Russia/China alliance will break or not, or for how long the US can even afford and support more wars, etc., etc. These are mostly practical matters that are indeterminate and make the intervention questionable. In practical senses, they present different good/bad tradeoffs and better/worse odds bets on outcomes to different parties, which amount to weighing different "lesser evil" projections of the outcome. They don't hinge on our moral intuitions differing at all.
(And again, all this differs in category and the way it behaves from Physics)
Replies from: Viliam, DPiepgrass↑ comment by Viliam · 2023-03-24T12:43:32.879Z · LW(p) · GW(p)
every single US action in South America and the Middle East being very bad for the people they claimed to help
Maybe if we also included WW2 Germany and Japan to this reference group, the outcomes would be more of a mixed bag.
Then again, the argument might be that American foreign policy became bad after WW2.
↑ comment by DPiepgrass · 2023-03-24T05:59:01.235Z · LW(p) · GW(p)
I don't know what you are referring to in the first sentence, but the idea that this is a war between US and Russia (not Russia and Ukraine) is Russian propaganda (which doesn't perfectly guarantee it's BS, but it is BS.)
In any case, this discussion exemplifies my frustration with a world in which a site like I propose does not exist. I have my sources, you have yours, they disagree on the most basic facts, and nobody is citing evidence that would prove the case one way or another. Even if we did go deep into all the evidence, it would be sitting here in a place where no one searching for information about the Ukraine war will ever see it. I find it utterly ridiculous that most people are satisfied with this status quo.
comment by Perhaps · 2024-02-12T01:33:14.942Z · LW(p) · GW(p)
Well, someone was working on a similar-ish project recently, @Bruce Lewis [LW · GW] with HowTruthful. Maybe you two can combine your ideas or settle on an amalgamation together.
If possible, please let us know how it goes a couple months from now!
Replies from: bruce-lewis↑ comment by Bruce Lewis (bruce-lewis) · 2024-02-15T18:29:39.886Z · LW(p) · GW(p)
The best path forward might be for @DPiepgrass [LW · GW] to make a prototype or mockup, borrowing ideas from HowTruthful and then discussing from there.
Replies from: DPiepgrass, DPiepgrass↑ comment by DPiepgrass · 2024-05-23T14:37:44.788Z · LW(p) · GW(p)
Another thing: not only is my idea unpopular, it's obvious from vote counts that some people are actively opposed to it. I haven't seen any computational epistemology (or evidence repository) project that is popular on LessWrong, either. Have you seen any?
If in fact this sort of thing tends not to interest LessWrongers, I find that deeply disturbing, especially in light of the stereotypes I've seen of "rationalists" on Twitter and EA forum. How right are the stereotypes? I'm starting to wonder.
↑ comment by DPiepgrass · 2024-05-20T20:40:07.094Z · LW(p) · GW(p)
Ah, this is nice. I was avoiding looking at my notifications for the last 3 months for fear of a reply by Christian Kl, but actually it turned out to be you two :D
I cannot work on this project right now because I'm busy earning money to be able to afford to fund it (as I don't see how to make money on it). I have a family of 4+, so this is far from trivial. I've been earning for a couple of years, and I will need a couple more years. I will leave my thoughts on HowTruthful on one of your posts about it.
comment by baturinsky · 2023-03-20T19:13:44.242Z · LW(p) · GW(p)
Everybody has their own criteria for truth.
So there should be a wide choice of algorithms and algorithm tweaks that would analyze the relevant data, then filter and process it in the particular way that satisfies the particular needs of each person.
Replies from: DPiepgrass↑ comment by DPiepgrass · 2023-03-20T20:00:05.150Z · LW(p) · GW(p)
Some people seem to have criteria for truth that produce self-sealing beliefs.
But yes, I think it would be interesting and valuable to be able to switch out algorithms for different ones to see how that affects the estimated likelihood that the various propositions and analyses are correct. If an algorithm is self-consistent, not based on circular reasoning, and not easily manipulable, I expect it to provide useful information.
Also, such alternate algorithms could potentially serve as "bias-goggles" that help people to understand others' points of view. For example, if someone develops a relatively simple, legible algorithm that retrodicts most political views on a certain part of the political spectrum (by re-ranking all analyses in the evidence database), then the algorithm is probably informative about how people in that area of the spectrum form their beliefs.
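To make the "switchable algorithms" idea concrete, here is a minimal sketch. Everything in it is invented for illustration (the field names, the weights, and both scoring functions are hypothetical, not part of any real design); it only shows how two different ranking algorithms, run over the same evidence database, can order the same analyses differently:

```python
from dataclasses import dataclass

@dataclass
class Analysis:
    claim: str
    source_reliability: float  # 0..1: judged track record of the source
    citation_support: float    # 0..1: fraction of citations that check out
    popularity: float          # 0..1: how widely shared/upvoted it is

def rigor_weighted(a: Analysis) -> float:
    # An algorithm that mostly trusts citation support and source track record.
    return 0.6 * a.citation_support + 0.4 * a.source_reliability

def popularity_weighted(a: Analysis) -> float:
    # A "bias-goggles" algorithm: ranks the way a popularity-driven
    # observer might, largely ignoring rigor.
    return 0.8 * a.popularity + 0.2 * a.source_reliability

analyses = [
    Analysis("claim A", source_reliability=0.9, citation_support=0.8, popularity=0.2),
    Analysis("claim B", source_reliability=0.3, citation_support=0.2, popularity=0.9),
]

# The same database, re-ranked under each algorithm.
by_rigor = sorted(analyses, key=rigor_weighted, reverse=True)
by_popularity = sorted(analyses, key=popularity_weighted, reverse=True)

print([a.claim for a in by_rigor])       # rigor-weighted ranks claim A first
print([a.claim for a in by_popularity])  # popularity-weighted ranks claim B first
```

The point of the sketch is just the last two lines: if a simple, legible algorithm like `popularity_weighted` retrodicts how some group actually ranks claims, that tells you something about how that group forms its beliefs.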
comment by Zian · 2023-03-20T05:24:32.177Z · LW(p) · GW(p)
Regarding the shopping example, I find that B2B websites like Fisher Scientific and McMaster-Carr have good search and filter options. Pcpartpicker.com is also a good example.
comment by ChristianKl · 2023-03-23T15:12:37.513Z · LW(p) · GW(p)
Politics is the Mind-Killer [LW · GW]. There's no good reason to lead with examples that are this political.
Replies from: DPiepgrass↑ comment by DPiepgrass · 2024-02-11T03:38:25.399Z · LW(p) · GW(p)
Yes there is. I gave examples that were salient to me, which I had a lot of knowledge about.
And my audience was LessWrong, which I thought could handle the examples like mature adults.
But my main takeaway was flak from people telling me that an evidence repository is unnecessary because "true claims sound better" and, more popularly, that my ideas are "suspicious"―not with any allegation that I said anything untrue*, or that my plan wouldn't work, or that the evidence I supplied was insufficient or unpersuasive, or that I violated any rationalist virtue whatsoever, but simply because the evidence was "political".
If you know of some non-political examples which have had as much impact on the modern world as the epistemic errors involved in global warming policy and the invasion of Ukraine, by all means tell me. I beg you. And if not, how did you expect me to make the point that untrue beliefs have large negative global impacts? But never mind; I'm certain you gave no thought to the matter. It feels like you're just here to tear people down day after day, month after month, year after year. Does it make you feel good? What drives you?
Edit: admittedly that's not very plausible as a motive, but here's something that fits better. Knowing about biases can hurt people [LW · GW], but there's no reason this would be limited only to knowledge about biases. You discovered that there's no need to use rationalist principles for truthseeking; you can use them instead as spitballs to toss at people―and then leave the room before any disagreements are resolved. Your purpose here, then, is target practice. You play LessWrong the way others play PUBG. And there may be many spitballers here, you're just more prolific.
* except this guy [LW(p) · GW(p)], whom I thank for illustrating my point that existing forums are unsuitable for reasonably arbitrating factual disagreements.
Replies from: ChristianKl↑ comment by ChristianKl · 2024-02-11T13:47:42.144Z · LW(p) · GW(p)
And my audience was LessWrong, which I thought could handle the examples like mature adults.
Part of rationality is not being in denial of reality and there are certain realities about what happens when you talk about politics.
Part of what the sequences are about is caring about reality, and you prefer to be in denial of it and to ignore the advice the sequences gave. Bringing up that you ignored it felt relevant to me.
I'm certain you gave no thought to the matter. It feels like you're just here to tear people down day after day, month after month, year after year.
Then you are wrong. Contemporary politics is one source of examples but a very bad source as described in the sequences. There's history. In the West, we generally study history to learn from it.
If you know of some non-political examples which have had as much impact on the modern world as the epistemic errors involved in global warming policy and the invasion of Ukraine, by all means tell me.
The decision to invade Iraq in more recent history was driven by bad epistemics. Talking about it does not trigger people's tribal senses the same way as talking about contemporary political conflicts.
If you go further back, there are also plenty of things that happened in the 20th century that were driven by bad epistemics.
Lastly, there's no reason why you have to pick the most consequential examples to make your points. You don't want people to focus on big consequences but want them to focus on the dynamics of truthseeking.
↑ comment by DPiepgrass · 2024-02-11T15:44:21.258Z · LW(p) · GW(p)
there are certain realities about what happens when you talk about politics.
Says the guy who often wades into politics and often takes politically-charged stances on LW/EAF. You seem to be correct, it's just sad that the topic you are correct about is the LessWrong community.
Part of what the sequences are about is to care about reality and you prefer to be in denial of it
How charitable of you. I was misinformed: I thought rationalists were (generally) not mind-killed. And like any good rationalist, I've updated on this surprising new evidence. (I still think many are not, but navigating such diversity is very challenging.)
Then you are wrong.
Almost every interaction I've ever had with you has been unpleasant. I've had plenty of pleasant interactions, so I'm confident about which one of us this is a property of, and you can imagine how much I believe you. Besides which, it's implausible that you remember your thought processes in each of the hundred-ish comments you've made in the last year. For me to be wrong means you recollected the thought process that went into a one-sentence snipe, as in "oh yeah I remember that comment, that's the one where I did think about what he was trying to communicate and how he could have done better, but I was busy that day and had to leave a one-sentence snipe instead."
Talking about it does not trigger people's tribal senses the same way as talking about contemporary political conflicts.
Odd but true. Good point.
there are also plenty of things that happened in the 20th century that were driven by bad epistemics
No doubt, and there might even be many that are clear-cut and no longer political for most people. But there are no such events I am knowledgeable about.
You don't want people to focus on big consequences
Yes, I do. I want people to sense the big consequences, deeply and viscerally, in order to generate motivation. Still, a more academic reformulation may also be valuable.
Replies from: ChristianKl↑ comment by ChristianKl · 2024-02-11T17:16:05.899Z · LW(p) · GW(p)
How charitable of you. I was misinformed: I thought rationalists were (generally) not mind-killed.
That's easily solved by reading the post from Eliezer about Politics is the Mind-Killer [LW · GW] and understanding the advice it makes.
For me to be wrong means you recollected the thought process that went into a one-sentence snipe
If I had only voiced that position in a comment and nowhere else, that might be true. That's not the case; I have criticized people multiple times for not applying Politics is the Mind-Killer [LW · GW] and for using political examples to make points that aren't about politics.
Says the guy who often wades into politics and often takes politically-charged stances on LW/EAF. You seem to be correct, it's just sad that the topic you are correct about is the LessWrong community.
I talk about politics when I want to make a point about politics. I usually don't talk about politics when I want to make a point about something else. If you want to make a point about politics it's unavoidable to talk about politics.
The advice of Politics is the Mind-Killer [LW · GW] is that you don't talk about politics if you want to make a point that isn't about politics because using political examples makes it harder for the point that isn't about politics to come through. That post does not advise people not to talk about politics even if people who haven't read it sometimes use the title as a catchphrase for the position that one shouldn't talk about politics in general.
It's a useful heuristic that Eliezer proposed. You write a post titled "Let's make the truth easier to find" and in it follow heuristics that make the truth harder to find. If your actual goal were to make the truth easier to find, then my feedback would be valuable. Of course, if your goal is to signal that you care about the truth and hold certain political positions, then my feedback feels offensive.
comment by Thoth Hermes (thoth-hermes) · 2023-03-22T17:18:46.182Z · LW(p) · GW(p)
I don't buy that truthful things should be, in general, difficult to distinguish from untruthful things. I'm not even sure of what that would mean, exactly, for truth-seeking to just be "difficult."
We could ask whether we would expect true claims to "sound better" to the one reading / hearing them than false claims. This would have important implications: for example, if they do sound better, then "persuasion" isn't something anyone needs to worry about, unless they were intentionally trying to persuade someone of something that was both false and sounded bad, which would be the case by assumption here.
The idea that truth-seeking is inherently difficult is an idea that sounds bad. Thus, for me to believe it would require me to believe that bad-sounding things could be true and good-sounding things could be false. How often would this mismatch happen? There is no way a priori to tell how often we would expect this, and that in itself is a bad-sounding thing.
An individual who investigates stuff, but isn't popular, has nowhere they can put their findings and expect others to find them. Sure, you can put up a blog or a Twitter thread, but that hardly means anyone will look at it.
I'm even less convinced by the idea that false things monetize better than true things. But this is a complaint I sometimes hear, and I can't help but sneer at it a bit. It's one thing to think that false things and true things compete on an even playing field, but it's a wholly different thing to think that people are inherently hardwired to find false things more palatable and therefore spend more time looking for them / paying for them.
It sounds very similar to the arguments for fighting misinformation on social media platforms: Mainly, that it tends to spread more easily than "true but boring / unpleasant" things. During COVID-19, for example, the people that thought we ought to stem the spread of misinformation also typically believed that COVID-19 was more dangerous than the opposite group.
This seems like a very important crux, then, at least: The dichotomy between good-seeming / bad-seeming and true / false. I agree that we should get to the bottom of it.
Replies from: DPiepgrass↑ comment by DPiepgrass · 2023-03-22T18:01:56.566Z · LW(p) · GW(p)
I don't understand why you say "should be difficult to distinguish" rather than "are difficult", why you seem to think finding the truth isn't difficult, or what you think truthseeking consists of.
For two paragraphs you reason about "what if true claims sound better". But true claims don't inherently "sound better", so I don't understand why you're talking about it. How good a claim "sounds" varies from person to person, which implies "true claims sound better" is a false proposition (assuming a fact can be true or false independently of two people, one of whom thinks the claim "sounds good" and the other thinks it "sounds bad", as is often the case). Moreover, the same facts can be phrased in a way that "sounds good" or "sounds bad".
I didn't say "false things monetize better than true things". I would say that technically correct and broadly fair debunkings (or technically correct and broadly fair publications devoted to countering false narratives) don't monetize well, certainly not to the tune of millions of dollars annually for a single pundit. Provide counterexamples if you have them.
people are inherently hardwired to find false things more palatable
I didn't say or believe this either. For such a thing to even be possible, people would have to easily distinguish true and false (which I deny) to determine whether a proposition is "palatable".
The dichotomy between good-seeming / bad-seeming and true / false.
I don't know what you mean. Consider rephrasing this in the form of a sentence.
Replies from: thoth-hermes↑ comment by Thoth Hermes (thoth-hermes) · 2023-03-22T19:16:55.685Z · LW(p) · GW(p)
I don't understand why you say "should be difficult to distinguish" rather than "are difficult", why you seem to think finding the truth isn't difficult, or what you think truthseeking consists of.
Because it feels like it's a choice whether or not I want to consider truth-seeking to be difficult. You are trying to convince me that I should consider it difficult, so that means I have the option to or not. If it simply were difficult, you wouldn't need to try to convince me of that; it would be obvious on its own.
In addition to that, "should be" means that I think something ought to be a certain way. It certainly would be better if truth-seeking weren't difficult, wouldn't you agree?
I didn't say "false things monetize better than true things". I would say that technically correct and broadly fair debunkings (or technically correct and broadly fair publications devoted to countering false narratives) don't monetize well, certainly not to the tune of millions of dollars annually for a single pundit. Provide counterexamples if you have them.
So you're not saying that false things monetize better than true things; you're saying that things which correctly state that other things are false monetize worse than the things they claim are false. I don't think I misunderstood you here, but I may have interpreted your meaning more broadly than it was intended.
I would think that how well something monetizes depends on how much people want to hear it. So yes, that would mean that it depends on how good something sounds. Our disagreement is on whether or not how good something sounds has any relation whatsoever to how true it is.
But true claims don't inherently "sound better"
To be clear, I'm saying that they do, and that this means that truth-seeking isn't that difficult, and it is counterproductive to believe that it is difficult.
Replies from: DPiepgrass↑ comment by DPiepgrass · 2023-03-22T19:47:10.799Z · LW(p) · GW(p)
I'm saying that [true claims sound better]
The proof I gave that this is false was convincing to me, and you didn't rebut it. Here are some examples from my father:
ALL the test animals [in mRNA vaccine trials] died during Covid development.
The FDA [are] not following their own procedures.
There is not a single study that shows [masks] are of benefit.
[Studies] say the jab will result in sterility.
Vaccination usually results in the development of variants.
He loves to say things like this (he can go on and on saying such things; I assume he has it all memorized) and he believes they are true. They must sound good to him. They don't sound good to me (especially in context). How does this not contradict your view?
it feels like it's a choice whether or not I want to consider truth-seeking to be difficult.
Agreed, it is.
Replies from: thoth-hermes↑ comment by Thoth Hermes (thoth-hermes) · 2023-03-22T21:23:39.546Z · LW(p) · GW(p)
We should be able to mutually agree on what sounds better. For example, "vaccines work" probably sounds better to us both. People say things that don't sound good all the time, just because they say it doesn't mean they also think it sounds good.
Things like "we should be able to figure out the truth as it is relevant to our situation with the capabilities we have" have to sound good to everyone, I would think. That means there's basis for alignment, here.