post by [deleted] · score: 0 (0 votes) · LW · GW

Comments sorted by top scores.

comment by magfrump · 2017-10-16T23:21:45.434Z · score: 16 (7 votes) · LW · GW

I feel like there are some good intentions behind this post, but I didn't get anything from reading it. I know it can be disconcerting to get downvotes without feedback, so I'll try to summarize what feels off to me.

  1. You start this post by saying you "always disagreed [with the community]" but don't outline any specific disagreements. In particular, your concluding points sound like they're repeating Eliezer's talking points.

  2. You suggest that the community doesn't have a strong background in the formal sciences, but this seems not only unjustified but explicitly contradicted by the results of the various community surveys over the years--over 50% of the community works in computers, engineering, or math as of 2016. Of course, this has fluctuated over time, and I don't want to push too hard to group professors in AI with people who work in tech support, but if anything my experience is that the community is substantially more literate in formal logic, math, etc., than could reasonably be expected.

  3. I'm guessing your work in logic is really interesting and that we'd all be interested in reading your writing on the subject. But the introduction you give here doesn't distinguish between a possible author who is an undergrad and one who is an Ivy League professor. In particular, outside of a couple of buzzwords, you don't tell us much about what you study, how much of an expert you are in the subject, or why formal logic specifically should be relevant to AGI.

My guess is you have some interesting things to share with the community, so I hope this is helpful as you write your next posts for the LW audience and doesn't come off as too rude.

comment by jsteinhardt · 2017-10-17T03:04:59.264Z · score: 5 (2 votes) · LW · GW

Galfour was specifically asked to write his thoughts up in this thread: https://www.lesserwrong.com/posts/BEtzRE2M5m9YEAQpX/there-s-no-fire-alarm-for-artificial-general-intelligence/kAywLDdLrNsCvXztL

It seems either this was posted to the wrong place, or there is some disagreement within the community (e.g. between Ben in that thread and the people downvoting).

comment by gjm · 2017-10-17T08:31:25.796Z · score: 15 (4 votes) · LW · GW

You may well be right, but it's also possible that some readers think (1) Galfour did well to write up his thoughts but (2) now that we've seen them his thoughts are terrible. (Ridiculous over-the-top analogy: you ask a friend to tell you honestly and without filters what his political opinions are. He turns out to be an unreconstructed Nazi. You're glad he told you honestly, but now he's done it you don't want to be his friend any more.)

Or some may think: (1) as above but (2) the comments above aren't actually a write-up of his thoughts about AGI, and aren't interesting.

Or some may think: (1) as above but (2) Ben specifically suggested that personal thoughts on AGI should go on personal LW2 blogs rather than the front page, whereas here Galfour is saying that when he writes up his thoughts they will go on the front page.

(Lest I be misunderstood: I have not downvoted this post; I don't think anything he wrote above was terrible; I also don't think it's terribly interesting, but since it's intended mostly as background I don't see any particular reason why it needs to be; I have no strong opinion on whether Galfour's AGI opinions, once written, will belong on the front page.)

comment by [deleted] · 2017-10-17T09:53:04.845Z · score: 3 (1 votes) · LW · GW

I don't think there is a disagreement on the location of the post.

People might have simply not liked that post. This is what is implied by the comment you replied to. (Or it might have been edited since.)

comment by [deleted] · 2017-10-17T09:49:59.599Z · score: 3 (1 votes) · LW · GW

I declare Crocker's rules. If you have more criticism, please share it.

1. True. I've written a bit about the disagreement in other posts, but it isn't clear here. I'll fix it.

2. I thoroughly disagree.
- General knowledge and common knowledge are different. Even if 99% of the community had strong foundations in formal proof systems, that wouldn't be enough if authors chose not to include that material because of the remaining 1%, or because they didn't know that the 99% had such foundations.
- Following on from the previous point, it wouldn't matter that something is common knowledge if it isn't found in the articles/posts. And it isn't found.
- I have been following the community, and even if its members are "more literate in formal logic, math" than most people (I don't know what you mean by "could reasonably be expected"), this is still not enough for the problems at hand.

3. Indeed, this is only the prelude. Also:
- These "buzzwords" are included for people who know the fields. For people outside these fields, they might sound like buzzwords, even though they aren't. (For any sensible definition of buzzword.)
- I thought it was obvious why formal logics are relevant to AGI.

This is helpful. Even though I disagree with many of your points, it is helpful to see that there are disagreements where I hadn't expected them.

comment by magfrump · 2017-10-17T17:51:29.728Z · score: 2 (3 votes) · LW · GW

Thanks for responding! I think you make fair points--I hadn't seen the previous thread in detail. I just try to read all the posts, but AFAIK there isn't a good way of tracking which comment threads continue to live for a while.

I think the center of our disagreement on point 2 is a matter of the "purpose of LessWrong": if you intend to use it as a place to have communal discussions of technical problems that you hope to make progress on through posts, then I agree that introducing more formal background is necessary even if everyone has the needed foundations. I am skeptical that this is a likely outcome, since the blog has the cross purposes of community building and general life rationality, and building technical foundations is a rough fit for blog posts and might be better left to textbooks and meetup groups. That limits engagement much more heavily, and I definitely don't mean to suggest you shouldn't try, but I wasn't really in that mindset when first reading this. I had a more general response along the lines of "this person wants to do something mathematically rigorous but is a bit condescending and hasn't written anything interesting." I hope/believe that will change in future posts!

comment by habryka (habryka4) · 2017-10-17T20:12:29.017Z · score: 16 (4 votes) · LW · GW

I personally would really like to see more direct technical work on LessWrong, though I am unsure about the best format. I am heavily in favor of people writing sequences of posts that introduce technical fields to the community, and think that a lot of the best content in the history of LessWrong was of that nature.

comment by Rob Bensinger (RobbBB) · 2017-10-17T21:11:42.452Z · score: 12 (2 votes) · LW · GW

Strong +1. I'd love to see most of LW developing into object-level technical discussion of interesting things (object-level math, science, philosophy, etc.), skewed toward things that are neglected and either interesting or important; and very little meta or community-building stuff. Rationality should be an important part of all that, but most posts probably shouldn't be solely about rationality techniques.

comment by [deleted] · 2017-10-17T21:12:50.710Z · score: 3 (1 votes) · LW · GW

Are you interested in actively participating in such a thing?

comment by Rob Bensinger (RobbBB) · 2017-10-17T21:16:37.969Z · score: 7 (2 votes) · LW · GW

Time allowing!

comment by [deleted] · 2017-10-17T21:13:25.584Z · score: 3 (0 votes) · LW · GW

Would you engage with technical work?

comment by habryka (habryka4) · 2017-10-17T21:18:56.914Z · score: 8 (2 votes) · LW · GW

I have in the past engaged with a good amount of technical material (primarily MIRI's agent foundations agenda). In general, though, time is short, and I can't make any promises about participating in any specific effort.

I think so far I am not particularly compelled by the approach you are proposing in this and your next post, but I am open to being convinced otherwise.

comment by [deleted] · 2017-10-17T21:21:54.507Z · score: 6 (1 votes) · LW · GW

I was asking more generally, but this answer is even better. Thanks.

comment by [deleted] · 2017-10-17T18:16:47.096Z · score: 3 (1 votes) · LW · GW

Hm.

I think formal sciences are critical for "general life rationality".

Given that you don't think so, would you be interested in a post detailing my reasons for thinking so?

(I have met people who shared your state of mind and whom I subsequently convinced of the importance of the formal sciences. As such, I think such a post would either convince you or open a dialogue.)

comment by magfrump · 2017-10-17T18:27:37.682Z · score: 6 (2 votes) · LW · GW

I don't think I disagree with the claim you're making here--I think formal background for things like decision theory is a big contributor to day-to-day rationality. But I think posts detailing formal background on this site will often either be speaking to people who already have that background, and be boring, or be speaking to people who don't, who would be better served by textbooks or online courses.

On the other hand, if someone wanted to take on the monumental task of making it possible to run interactive Jupyter notebooks here, adding coding exercises and building online courses, I'd be excited for that to happen--it just seems that, if we want to build more formal background, the current site setup will struggle to match other resources.

comment by [deleted] · 2017-10-17T19:53:41.605Z · score: 1 (2 votes) · LW · GW

- Nearly everyone I've seen on various rationalist Discord servers lacked a background in formal logic beyond Boolean logic (e.g., formal proof theory).

- I'm also talking about more day-to-day situations, like social situations. (And how epistemic modal logic helps with understanding them.)

- Everything101 [LW · GW], mentioned and linked in the OP, is about building more formal background. It won't be hosted on LW, but it will surely be linked.

I do think that building more formal background is hard, and that explains my initial statement (the one you were disagreeing with) that the community doesn't have a strong background in the formal sciences.

comment by gjm · 2017-10-17T13:45:46.861Z · score: 4 (1 votes) · LW · GW

How do you know it's a throwaway or used for trolling? So far as I can see, all we know is that it has so far been used only to ask the question above.

comment by [deleted] · 2017-10-17T14:12:18.143Z · score: 7 (2 votes) · LW · GW

Throwaway:
- First and only contribution to LesserWrong

Trolling:
- DragonGod has been banned from the SSC Discord server, has had his allowed contributions on LW restricted, and has been severely criticized for lack of epistemic hygiene.
- Pattern-matching to similar behavior on other blogs/forums/boards.
- No point of similarity given.