Comments

Comment by makeswell on ARENA 5.0 - Call for Applicants · 2025-02-11T02:52:40.375Z · LW · GW

It confuses me that the FAQ lists pretty minimal qualifications for knowledge of AI safety, while at the same time the application asks someone to summarize how an agenda in an area of AI safety research could fail and to parse a paper with a lot of AI terminology. The application also says one should spend only an hour finishing it (although this page says an hour and a half). It took me like 30 minutes just to fill out everything besides those two technical questions :D and I cannot answer those technical questions well without doing hours of additional research (more than 90 minutes to do a half-decent job of answering just those two questions). I failed the application lol :D

edit: The part that confuses me is whether I should spend a bunch of time to research the answer to these questions, or whether being able to answer these questions within the time limit of 60 to 90 minutes is part of the requirements to get into the program.

Comment by makeswell on When will computer programming become an unskilled job (if ever)? · 2023-03-26T18:44:55.872Z · LW · GW

The most accurate answer is also the least helpful: none of us really know. Guido van Rossum has an opinion about GitHub Copilot in this interview: 

but he's really just talking about what LLMs can do now, not what they'll be able to do in five or ten years.

Chris Lattner has an opinion about Software 2.0 here:

but Software 2.0 isn't really the same thing as LLM code generation; it's a little different. More info about Software 2.0 here:

and if you watch Chris Lattner and Lex talk for a little while longer, you'll see that Chris has no idea how you could tell a computer to build you a webpage with a red button using just text, and admits that it's outside his area of expertise.

I bring up these examples mostly to illustrate that nobody has any clue. Sam Altman addresses the topic the most out of all the people I've linked, in this video:

and the TLDR is that Lex and Sam both think LLMs can make programmers 10x more productive. Sam also thinks that instead of hiring 1/10th as many programmers, we'll just have 10x more code. He thinks there's a "supply problem": there aren't enough software engineers.

One thing I would advise is to make yourself more than just a software engineer. Lex says in his talk with Sam that he's not worried because he's an AI guy, not just a programmer. You might want to learn more about how AI works and try to get a job in the space and ride the wave, or learn about information security in addition to software engineering (that's what I'm doing, in no small part because of a one-on-one chat with 80,000 Hours), or learn a lot about oceanography or data science or something else in addition to software engineering.

Then I'd also just say that we have no idea, and if anyone says they know, they really don't. Look: a bunch of smart people discussed it, and they have no clue either.

Comment by makeswell on The LessWrong Team is now Lightcone Infrastructure, come work with us! · 2021-10-18T01:08:59.028Z · LW · GW

I would love to work on this. I applied through your website. Commenting here in case you get a huge flood of random resumes, then maybe my comment will help me stand out. Here's my LinkedIn: https://www.linkedin.com/in/max-pietsch-1ba12ba7/

Comment by makeswell on Most transferable skills? · 2012-12-14T06:07:57.698Z · LW · GW

Compassion meditation is one way. Check out the methods section of this article for a description of the meditation, the pictures to see how the brain looks during meditation, and the conclusion to hear about how this type of meditation affects the areas of the brain responsible for detecting emotion in oneself and others and for understanding others' mindsets.