I tell people that their arguments are not persuasive to third parties when they insult me, because I genuinely hope that they will change the way they argue, and also because I think it's generally true that insulting someone does not tend to convince them. I'm personally not interested in being insulted, but I am interested in people arguing for their positions well, since if they are right, I'm interested in being persuaded.
I asked ChatGPT if it could pass the Turing test, and it said it couldn't. It was clearly lying.
I've never seen anyone argue for that position. I'm sure there are people who think that, but they must be a small minority. I'm willing to be convinced that I'm wrong, though.
we should target our arguments according to the circles we find ourselves in
That's true, but I would also suggest that some arguments just don't need to be had. For example:
rather than jumping into extremely serious surgery blindly
I suspect this might be hyperbolic, but even so, this is not a position any actual person holds. Nobody who is in favor of transgender rights thinks that people should just "jump into extremely serious surgeries blindly." Repeating this framing of the problem plays into the hands of people who are intentionally misrepresenting the actual argument in order to create division.
If you want to target your arguments to your audience, you should steelman the arguments you're targeting; otherwise you're just confirming their beliefs.
Not sure what the answer to the specific question is, but there is solid evidence that cleaning your hands prevents the spread of HFMD, adenoviruses, E. coli, Salmonella, and many other germs. Independent of the evidence for Covid in particular, regularly cleaning your hands is a great idea.
Doesn't your Maori massacre example disprove the validity of virtue ethics?
"No, obviously. That would be monstrous."
This feels like begging the question. Why is it obvious that a doctor shouldn't kill one patient to save five? It seems like it is obvious because we have an overwhelmingly strong intuition that it is wrong. Given that there are many people who have an overwhelmingly strong intuition that being gay is wrong, I'm unsure if it's a good idea to just rely on that intuition, and leave it there.
Database normalization is just about avoiding duplication, right?
I think the thing here is that people who get database design can't really understand how it is possible to not get it, but there are a lot of people for whom it is extremely difficult to understand this topic. I sat through years of lectures wondering why we were taught things that were completely self-evident. Then I looked at a lot of other people's code, and it became clear that it wasn't self-evident at all.
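To make it a bit more concrete than "avoiding duplication": normalization is also about avoiding update and insert anomalies. Here's a minimal, hypothetical sketch (table and column names are made up) using Python's built-in sqlite3:

```python
# Minimal, made-up example: the same data denormalized vs. normalized.
import sqlite3

conn = sqlite3.connect(":memory:")

# Denormalized: the customer's city is repeated on every order row, so
# changing it means updating many rows and risking inconsistent copies.
conn.execute("""
    CREATE TABLE orders_denormalized (
        order_id      INTEGER PRIMARY KEY,
        customer_name TEXT,
        customer_city TEXT,
        product       TEXT
    )
""")

# Normalized: customer data lives in exactly one row and is referenced by key.
conn.executescript("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT,
        city        TEXT
    );
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(customer_id),
        product     TEXT
    );
""")

# Now "the customer moved" is a single-row update instead of a scan over orders.
conn.execute("UPDATE customers SET city = 'Zurich' WHERE customer_id = 1")
```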
"stripping away intellectual property protections without any compensation"
Isn't the AstraZeneca vaccine almost entirely financed with government funding? Even the ostensibly privately funded vaccines depend heavily on funding provided by taxpayers.
So as a taxpayer, not only am I funding the development of these vaccines, I'm also funding government force to protect private monopolies on these vaccines. Regardless of the short-term implications of IP waivers, it seems clear to me that this is not a sound system, and that the incentives for creating these vaccines depended strongly on taxpayer funds, not on the possible long-term value of any IP generated by this research.
Pandemics of this magnitude are rare events. Given what we know at the moment, we would expect similar events to occur perhaps two or three times a century. So developing vaccines specifically for pandemics similar in magnitude to what we are going through right now is not a sound investment to begin with.
I have two different thoughts on this:
- I don't think ads are inherently bad. It's true that ads are a way of financing things that either would otherwise not be financed, or would be financed by (and thus exclusively available to) relatively rich people. However, I'm pretty sure that online ads actually devalue ads in general, and make it more difficult to provide these services. An ad that runs in a newspaper is much, much more valuable than an online ad on that same newspaper's website, even if the website reaches many more people. This effectively means that the way online ads currently work, they harm entities that rely on ad revenue. The more personalized and trackable ads become, the more closely they are valued according to their direct revenue-generating power instead of their long-term impact, and thus the less value they have. Evidence for this is that ads where tracking is impossible (e.g. on podcasts) are valued higher than ads that allow for detailed tracking.
- The way online ads are currently monetized relies on personalization. This means that online ads create a strong incentive to track people, and to harm people's ability to have privacy online. This, in turn, means that it becomes much easier to use this data in more nefarious ways, and (for example) discriminate against people using data gathered for ad tracking.
The better companies get at tracking, the more data they have that will be abused, and the lower the value of ads will become. Therefore, the most good would be created if online companies stopped investing in ad tech, and online ads went back to being anonymous.
I have two thoughts on this:
- It seems to me that a winner-take-all election for an immensely powerful head of the executive branch of the government necessarily creates a two-party system (or something similar to a two-party system, as has happened in Germany), even if you ignore all other issues. Since there is no general feeling that having a powerful president is inherently problematic, there will not be a strong third party.
- It's not entirely clear to me what concrete positions a hypothetical center party would take. The two parties aren't that far apart, if you ignore identity politics issues. One party wants taxes a bit higher, the other wants taxes a bit lower, but they're not that far apart. One party wants basic health insurance to be governed by legislation, the other by the free market, but they're both pretty similar ideas. There is no room for a center party because there is no space between the two parties, regardless of how angry they are at each other. In fact, the anger might be an example of the narcissism of small differences, where the two sides are so angry at each other precisely because they hold similar positions, and need other ways to differentiate themselves. Hence the focus on nonsensical issues like trying to ban people from bathrooms, or complaining about Dr. Seuss.
I find it highly unlikely that we live in a simulation. Anyone who has implemented any kind of simulation has found out that simulations are hugely wasteful: it takes a huge amount of complexity to simulate even a tiny, low-complexity world. Therefore, all simulations will try to optimize as much as possible. However, we clearly don't live in a tiny, low-complexity, optimized world. Our everyday experiences could be implemented with a much, much lower-complexity world that doesn't have stuff like relativity and quantum gravity and dark energy and muons.
The basic premise, namely that simulations are essentially the same as reality, that there are many simulations but only one reality, and that statistically we therefore almost certainly live in a simulation, is not consistent with my experience working on simulations. Any simulation anyone builds in the real world will by necessity be infinitely less complex than actual reality, and thus infinitely less likely to contain complex beings.
I can only speak for myself, but the simple fact is that about 70% of the population needs to be immune in order for anything resembling normalcy to return. With a 90% protection rate for the new vaccines, this means about 80% of people need to either get sick or get the vaccine. Given how few people already have antibodies in many places, this means that pretty much everybody who isn't a vaccine denier needs to get vaccinated. That's why I will get vaccinated as soon as I am able to.
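For what it's worth, here's the rough arithmetic behind my ~80% figure, assuming immunity comes mostly from vaccination and taking the 70% and 90% numbers above as given:

```python
# Back-of-the-envelope only; the 70% and 90% figures are the assumptions above.
target_immune = 0.70      # assumed share of the population that needs to be immune
vaccine_efficacy = 0.90   # assumed protection rate of the new vaccines

required_coverage = target_immune / vaccine_efficacy
print(f"required vaccination coverage: {required_coverage:.0%}")  # about 78%, i.e. roughly 80%
```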
I think there are two basic reasons:
- There's a lag between infection and test, between test and hospitalization, and between hospitalization and death. So deaths are the last chart to go up, and there are a lot of long-tail deaths from people who were infected a long time ago, so death rate rises are less "spikey" than infection rises (there's a toy illustration of this after these lists).
- The case fatality rate has trended down in general. The hospitalization-to-death rate has roughly halved since the first wave.
Scientists aren't entirely sure why 2. is happening, but there are multiple possible explanations, all of which probably contribute to some degree.
- In some places, many particularly vulnerable people are already dead.
- The average age of newly infected people has gone down after the first spike, probably for multiple different reasons, including older people being more careful on average. Also, we might be getting slightly better at protecting vulnerable populations. Younger people have a much higher chance of surviving an infection.
- Mask wearing decreases the initial viral load that triggers an infection, which causes a less severe, more survivable infection.
- Treatments are getting more effective.
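Here's the toy illustration of the lag effect mentioned above. All the numbers (delay distribution, fatality rate, spike size) are made up purely to show the smoothing effect; they are not epidemiological estimates:

```python
# Toy model: a sharp spike in infections, spread over a wide infection-to-death
# delay, produces a much flatter bump in deaths.
import numpy as np

days = 140
infections = np.full(days, 100.0)
infections[60:70] += 900.0          # a ten-day spike at 10x the baseline

# Hypothetical delay distribution: most deaths 2-6 weeks after infection,
# with a long tail out to roughly 8-9 weeks.
delay = np.zeros(60)
delay[14:42] = 1.0
delay[42:60] = 0.3
delay /= delay.sum()

case_fatality = 0.01                # assumed flat 1% fatality, just for the toy
deaths = case_fatality * np.convolve(infections, delay)[:days]

print("infections peak / baseline:", infections.max() / 100.0)             # 10.0
print("deaths peak / baseline:    ", round(deaths.max() / deaths[70], 1))  # about 3.7
```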
One nice thing about Switzerland is that there is no president, no single leader of the executive branch, but instead a federal council consisting of seven people who decide by majority, and where every member will stand behind the majority decision (there is technically a leader of the council, but he's first amongst equals, and has no special powers). Not having a single president means there's no winner-take-all outcome, which means you don't end up with a two-party system.
We should also consider whether we really want billionaires to make unilateral, wide-ranging public health/policy decisions without any real governmental oversight. We have a government for a reason, so that we can actually elect people to make these decisions, and have some accountability for the outcomes. I get that this sounds almost ridiculous at the moment, given how dysfunctional particularly the American government has become, but I'd still rather have some control than no control at all.
Rather than have billionaires take over governmental responsibilities, a better approach would probably be to tax them at a higher rate.
Is there actual evidence that a minimum wage decreases total consumption? I've never heard that, or seen any study on it, and I'd like to learn more.
(Intuitively, it doesn't seem highly plausible to me, since my assumption would be that it transfers wealth from rich people to poor people, which should increase total consumption, because there's more room for consumption growth for poorer people, but I'm also not sure if that is true.)
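To spell that intuition out with made-up numbers (this is just the standard "marginal propensity to consume" framing of my assumption, not an empirical claim):

```python
# Toy transfer: every number here is invented purely to illustrate the intuition.
mpc_low_income = 0.9     # assumed: poorer households spend most of an extra dollar
mpc_high_income = 0.3    # assumed: richer households save more of an extra dollar

transfer = 1_000         # hypothetical amount shifted from owners/profits to wages

change_in_consumption = transfer * mpc_low_income - transfer * mpc_high_income
print(change_in_consumption)  # +600 under these made-up numbers
```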
(Edit: after a cursory search of current research on the topic, it seems that the consensus is rather that a minimum wage has a small positive effect on consumption, which is what I would have naively expected.)
There are some additional it's/its mistakes in your text, e.g. here:
I run a denial of service attack on it’s server, cutting it off from the web before it can get it’s copies running.
I used to work as a software engineer. As the company I work for has grown a lot, I now no longer write code, but do software design and hire new team members in different positions, including PMs, visual designers, usability designers, backend programmers, and frontend programmers.
It is extremely difficult to find good programmers, especially frontend programmers.
I'm pretty sure that the reason here is not that it is difficult to become a good programmer, but that a lot of people choose not to, for a number of reasons.
Two reasons that I have personally encountered:
- I studied comp sci between 2000 and 2005. During my first year, we had about 20% women. At graduation, we had about 5% women. Reasons are probably varied, but a major reason, at least back then, was that professors were hostile towards women studying comp sci (one of them explicitly told a friend of mine who was studying with me that he thought women were not suited for comp sci). In effect, we're basically excluding half of the population from this job option.
- A lot of people just don't consider programming as a job option at all. When I try to encourage people to enter the field, they typically push back because they have been told that it is difficult, and/or that it is similar to maths, which many people don't enjoy.
For visual design positions, we get a large number of applications from many qualified people. Applicants are highly diverse in age, gender, interests, and so on. But when hiring for software engineering positions, we get few qualified applications, and they come almost exclusively from men under 35, often with very similar profiles. If we search for people with less common skill sets (e.g. full-stack developers), we basically don't get any qualified applications.
This is in a city with one of Europe's highest-rated technical universities, which produces a lot of comp sci graduates.
One interesting data point here is that game studios tend to pay developers much worse than other companies, and offer much worse benefits. The reason for this is likely that developers want to work at game studios, and that many more apply there.
Also, one other thing to consider is that software is eating the world. There are very few products that are not in some way dependent on software, directly or indirectly. Even things that ostensibly don't depend on software were probably created using specialized software, and were produced by companies that run on specialized software (e.g. process management software).
Consequently, when we find qualified applicants, they basically dictate salaries. We are dependent on them; we can't grow without more people who write software. They are not dependent on us, and there's nothing we can offer that other software companies can't also offer.
I don't think the growth of our dependence on software will stop any time soon. Even if a lot more people started studying comp sci right now, that would only lead to more growth, and thus increase demand further, at least for the foreseeable future.