Comments sorted by top scores.
comment by Dagon · 2022-12-13T15:52:14.087Z · LW(p) · GW(p)
Thanks for posting this - reports of experience are interesting and useful. I advise caution. That style of emotional belief is useful in motivation, and is a good hint toward areas to model more closely and prioritize in terms of actions. But it's also over-general and under-nuanced, and lacks humility and acknowledgement that it might be incorrect.
Replies from: georgetturnbull
↑ comment by g1 (georgetturnbull) · 2022-12-13T19:00:10.256Z · LW(p) · GW(p)
Replies from: Dagon
↑ comment by Dagon · 2022-12-13T19:10:20.375Z · LW(p) · GW(p)
Oh, funny - I misunderstood your "a little embarrassing to admit" to mean that you're embarrassed to admit you didn't feel it sooner, with the implication that you expect most readers to already feel it and think you're late to the party. That it might be embarrassing to admit that you have aliefs, and that this one has moved into alignment with your conscious beliefs, didn't occur to me.
Replies from: georgetturnbull
↑ comment by g1 (georgetturnbull) · 2022-12-13T19:41:50.099Z · LW(p) · GW(p)
comment by Daniel Kokotajlo (daniel-kokotajlo) · 2022-12-14T02:26:07.175Z · LW(p) · GW(p)
People have accused me of being an AI apocalypse cultist. I mostly reject the accusation. But it has a certain poetic fit with my internal experience. I’ve been listening to debates about how these kinds of AIs would act for years. Getting to see them at last, I imagine some Christian who spent their whole life trying to interpret Revelation, watching the beast with seven heads and ten horns rising from the sea. “Oh yeah, there it is, right on cue; I kind of expected it would have scales, and the horns are a bit longer than I thought, but overall it’s a pretty good beast.”
--Scott Alexander, "Perhaps it is a bad thing that the world's leading AI companies cannot control their AIs"
comment by sisyphus (benj) · 2022-12-13T22:30:54.026Z · LW(p) · GW(p)
Completely agree here. I've known the risks involved for a long time, but I've only really felt them recently. I think Robert Miles phrases it quite nicely on the Inside View podcast: "our System 1 thinking finally caught up with our System 2 thinking."
Replies from: daniel-kokotajlo
↑ comment by Daniel Kokotajlo (daniel-kokotajlo) · 2022-12-14T01:49:47.963Z · LW(p) · GW(p)
Shouldn't it be the other way round -- System 1 finally catching up with System 2?
Replies from: benj
↑ comment by sisyphus (benj) · 2022-12-14T02:16:37.838Z · LW(p) · GW(p)
Woops, edited. Thanks! :)
comment by Kerry · 2022-12-14T14:37:00.454Z · LW(p) · GW(p)
I had a similar experience with Midjourney. The question now is: how do you change your life once you have this more visceral understanding of the near-term future? Seriously, this is my biggest problem. I deeply believe change is coming, fast, but I'm still stuck in so many patterns that only make sense if the status quo continues.
comment by andrew sauer (andrew-sauer) · 2022-12-14T06:57:06.977Z · LW(p) · GW(p)
Honestly, that's exactly how I feel after messing around with ChatGPT.
Yeah, it's not perfect, but the fact that this is possible now as a free demo means, to me, that real honest-to-god GAI is only a few decades away.
And even though ChatGPT is about the most wholesome chatbot I've ever seen, that's obviously more of a surface-level PR thing than an indication of the underlying technology.
comment by g1 (georgetturnbull) · 2022-12-14T04:12:52.858Z · LW(p) · GW(p)
Replies from: Markvy
comment by Annapurna (jorge-velez) · 2022-12-14T14:43:41.641Z · LW(p) · GW(p)
I feel the same way. You're not alone.