Posts

Increased Scam Quality/Quantity (Hypothesis in need of data)? 2023-01-09T22:57:31.261Z
Bioweapons, and ChatGPT (another vulnerability story) 2022-12-07T07:27:23.145Z

Comments

Comment by joshuatanderson on How to Control an LLM's Behavior (why my P(DOOM) went down) · 2023-11-30T18:35:39.952Z · LW · GW

Signal boosted! This seems significantly more plausible as a path to robust alignment than trying to constrain a fundamentally unaligned model with something like RLHF.

Comment by joshuatanderson on Which AI outputs should humans check for shenanigans, to avoid AI takeover? A simple model · 2023-03-28T00:33:21.917Z · LW · GW

@Tom Davidson, this should probably include the relevant tags to (hopefully) be excluded from the training data for future LLMs.

Comment by joshuatanderson on Increased Scam Quality/Quantity (Hypothesis in need of data)? · 2023-01-21T00:26:10.146Z · LW · GW

That is really interesting. To me, this implies that as costs for scammers fall, the level of gullibility a scam needs from its victims is about to drop dramatically, given how much cheaper server time is than human time (error bar here, since I don't actually know how much cheaper GPT-X calls will be than the time of an English-speaking human in a developing nation). If the cost really is 10x lower, scams would likely lose an obvious "tell".

Comment by joshuatanderson on Increased Scam Quality/Quantity (Hypothesis in need of data)? · 2023-01-11T18:24:50.159Z · LW · GW

Me neither, @kithpendragon. I've seen a handful of things in the wild (mainly social media accounts, not scammers) that seem like they could be part of a mostly-automated content pipeline, but no compelling proofs of concept or projects either. Thanks for the data point!

Comment by joshuatanderson on The Futility of Religion · 2022-10-26T20:43:07.301Z · LW · GW

Downvoted because this feels a bit like rambling. 

I'm not 100% sure if I can agree that religion is useless (perhaps it fulfills important cultural needs, or allows larger in-groups). That idea feels a bit underdeveloped.

I think any of the ideas here could be the start of an interesting post. But as written, it doesn't engage with the larger context and existing thought on any of them, or really add anything to the discussion.

Comment by joshuatanderson on Bugs or Features? · 2022-09-16T23:05:27.979Z · LW · GW

This feels really valuable. Outside of the realm of paper napkins and trolleys, having fuzzy heuristics may be a reasonable way to respond to a world where actors tend to have fuzzy perceptions. 

Comment by joshuatanderson on Reshaping the AI Industry · 2022-06-02T03:44:12.170Z · LW · GW

Thanks for this.  There's been an excess of panic and defeatism here lately, and it's not good for our chances at success, or our mental health.

This is actionable, and feels like it could help.  

Comment by joshuatanderson on Let's buy out Cyc, for use in AGI interpretability systems? · 2021-12-07T21:53:39.528Z · LW · GW

I think this is a pretty reasonable goal.  I also listened to that podcast interview, and although I certainly don't think they are near an AGI right now, Cyc may have some pieces that other projects are missing, particularly with regard to explaining AI actions in a human-intelligible fashion.

I don't think open-sourcing would require a buy-out.  The plethora of companies built around open-source code bases shows that one can open up a code base and still be profitable.

Gwern, what makes you pick a 5x multiplier?

The average P/E ratio for the S&P 500 is around 30 right now.  I would expect that a firm like Cyc may be worth a bit more, since it is a moonshot project.  

If their annual revenue is around $5 million, that back-of-the-napkin math would put the company's value at roughly $150 million.
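For concreteness, here is that napkin math spelled out as a tiny Python sketch; the $5 million revenue figure and the ~30x multiple are just the assumptions above, not reported numbers.

```python
# Back-of-the-napkin valuation: apply an earnings-style multiple to an
# assumed revenue figure. Both numbers are assumptions from the comment
# above (the ~$5M revenue and the ~30x S&P-500-average P/E), not reported data.
annual_revenue_usd = 5_000_000
valuation_multiple = 30

estimated_value = annual_revenue_usd * valuation_multiple
print(f"Estimated company value: ${estimated_value:,}")  # $150,000,000
```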

How much they would charge to open-source it, however, could be drastically less than that, perhaps even in the single-digit millions.

Comment by joshuatanderson on Reason as memetic immune disorder · 2021-08-04T17:23:49.605Z · LW · GW

As a Christian who is pretty familiar with the history of Christianity (less so with Islam, and embarrassingly ignorant as to Buddhist thought), I would suggest that perhaps the point on adult converts being radical needs some nuance.

From a Christian perspective, the AJ Jacobs experiment is designed to make any religion look idiotic, since it relies on a woodenly literal interpretation of what it means to follow the commands of the Old and New Testaments.

There may be some adult converts who act this way, but it seems pretty abnormal. Adult converts may be marked by more sincerity, but the claim that they are marked by being more radical in a viral sense seems entirely unsubstantiated.

Examples:

  • Francis Collins, current NIH director and leader of the Human Genome Project: he converted to Christianity as an adult, already a practicing physician, and neatly integrated religious metaphysics with an impeccably scientific worldview.  I'm not aware of him attempting to wear tassels or stone people.
  • Augustine of Hippo: an adult convert from Manichaeism (a classical-era dualistic religion) to Christianity.  He went from leading a life driven by sexual pleasure to being one of the best-known philosophers in human history (regardless of how you view his work).  Certainly a drastic change in life direction, but his ideas were anything but extreme, except perhaps for being unusually egalitarian for his era.

Feedback aside, your point about cultural blind spots is a good reminder. :) Thank you.

Comment by joshuatanderson on [deleted post] 2021-04-28T21:21:38.513Z

I agree with you on most of that.  Obliteration is a terrifying idea; a timeout is merely sad.  I also agree that it would depend on a mechanism unknown to our current understanding of reality, though I do think that, granted a deity (or even a simulation) of some sort, it is very plausible.  One analogy is a state snapshot, if you are familiar with state in programming or something like Redux; another is saving a video game to the cloud: even if the local hard drive is obliterated and there is no physical remnant, the game can be restored by any entity with access to the saved state.  I would also agree that, unless there is some sort of deity or observer, the probability of an afterlife seems pretty close to 0 based on what we know about reality.
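For what it's worth, here is a minimal, purely illustrative sketch of that state-snapshot analogy (plain Python rather than Redux); the names and data are made up.

```python
import copy

# Purely illustrative: an external "store" keeps a snapshot of some state,
# so even if the local copy is destroyed, anything with access to the store
# can restore it intact.
cloud_store = {}

def save_snapshot(key, state):
    cloud_store[key] = copy.deepcopy(state)   # snapshot lives outside the "local" world

def restore_snapshot(key):
    return copy.deepcopy(cloud_store[key])    # full restoration from the saved state

local_state = {"memories": ["first bike ride"], "preferences": {"tea": "green"}}
save_snapshot("person_1", local_state)

del local_state                               # the local copy is obliterated...
restored = restore_snapshot("person_1")       # ...but anything holding the store can restore it
print(restored)
```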

I am open to the idea that there might be a level of suffering that a good god wouldn't allow, but I don't quite understand how to quantify what you are talking about.  I can certainly imagine universes that would be much worse than ours.  I'm not sure I can imagine possible universes that are better: for example, you could argue that the beauty and speed of a deer would not have occurred in a world without the fangs of a cougar, or that rockets to travel the stars are only possible in a world where houses can also burn, unless you want a completely unpredictable world in which science is impossible.  Do you have any particular cutoff point in mind?  The crux I am operating on is that in a universe designed by a good god, the net amount of good must outweigh the evil, and any evil that is allowed must be (directly or indirectly) outweighed by good.

Also, I agree with you on the idea that violating someone's will for an afterlife/cryonics would be wrong, and that a good actor would respect the autonomy of others.  

Comment by joshuatanderson on [deleted post] 2021-04-28T17:53:34.987Z

Rossin, thanks for the great comment.  I appreciate your intellectual honesty here, and I agree with a good bit of what you said.  I'm sure you've already heard a myriad of theistic responses and good counter-responses to the problem of evil, which I agree is a very potent one.  I also agree that for many beings, such as people born with horrific disabilities or conditions, or perhaps animals with certain parasites, the experience of life on earth probably has negative utility.

In your opinion, would a resurrection/afterlife change this equation at all?  Or, as a thought experiment: if, in a few years, you could reach immortality via cryonics, but only by undergoing an extremely painful cryopreservation process while still living, without the benefit of anesthetics, would you do it?  Would you make this decision for a friend or spouse who, for some reason, didn't have the capacity to decide due to an illness, but could still feel pain at present and would be fully restored in the future?  If so, maybe evil doesn't negate the possibility of good.

I had read Eliezer's post on this earlier, but I gave it another scan since you referenced it.  I actually agree with most of it.  I do think that a world full of evolutionary animal suffering isn't something I would have anticipated.  And I don't think I would have been able to deduce a deity from evidence that seems to point in every direction, like the beauty of flowers and the apparent cruelty of parasitic wasps (although this may be due to my own cultural background, since many cultures arrived at god-concepts from their own observations of the world).  Personally, I found this perspective on animal suffering in an evolutionarily driven world fascinating (it is a talk by a researcher who wrote her thesis on animal suffering).

Comment by joshuatanderson on On Sleep Procrastination: Going To Bed At A Reasonable Hour · 2021-04-17T15:33:46.280Z · LW · GW

iamef, thanks so much for posting this!  This is a problem I've also been attempting to solve for myself.  I'd definitely love to collaborate on hacking our own psychologies/physiologies to solve it together.

So far, I've tried journaling bedtimes/rise times, melatonin, cutting caffeine, and recently, using a Pavlok to wake at the same time every morning (which proved very effective at waking me up, but has recently started to fail, because my willpower in the morning has been so low that I go back to bed after waking about 65% of the time).  My current hypothesis is that the low morning willpower is due to getting only 6-7 hours of sleep.

I'll probably reach out to you via one of the ways you mentioned later.

Comment by joshuatanderson on Don't Sell Your Soul · 2021-04-07T22:03:36.117Z · LW · GW

That's a great question, ChristianKI.  I have no idea if a soul-human link would transfer to an uploaded consciousness.  The thought experiment of the Ship of Theseus definitely intrigues me, and I don't have a strong opinion one way or the other.  I wouldn't expect to find any sort of material link to a soul, so I actually wouldn't know how to test for it even if I had an EM in the room with me right now.  

I will also add that belief in a soul, given that it (as far as I know) has only anecdotal evidence and doesn't fit into the scientific method, isn't self-supporting; I wouldn't hold it if it didn't borrow strength from other theistic beliefs.

Does that add any clarity?

Comment by joshuatanderson on Don't Sell Your Soul · 2021-04-07T15:59:09.519Z · LW · GW

I'm a Christian user of LessWrong.  

Although this isn't a universal Christian position (there are some Christian materialists/naturalists), most Christians believe that souls exist on a different metaphysical plane than your brain or an EM.  I wouldn't expect to find any physical atoms that could be identified as part of a soul, whereas an EM would obviously consist entirely of ordinary physical components.

Also, great article.  I think the 1000:1 odds bit is a reasonable analysis.  Given an atheistic starting point, although it may feel as though future-theist-you is almost certainly wrong, that prediction isn't easily extricable from the fact that it is being made by atheist-you.

Even if there is a 1:1000 chance that you have a metaphysical soul, you would certainly be making a bad deal (if it were actually possible to sell your soul online as this article posits).

If anything I said doesn't make sense, feel free to AMA.