Comments
Hi Joe, thanks a lot for your thoughtful comment! We think you're making some valid points here and will take your suggestions and questions into consideration.
Hi there, thanks for bringing this up. There are a few ways we're planning to reduce the risk that the organizations we incubate end up fast-tracking capabilities research over safety research.
Firstly, we want to select for a strong impact focus and value alignment in participants.
Secondly, we want to help the founders set up their organization in a way that limits the potential for value drift (e.g. a charter for the forming organization that would legally make such drift more difficult, choosing the right legal structure, and helping them vet or suggest whom they can best take on as an investor or board member).
If you have additional ideas around this we'd be happy to hear them.
Hi, thanks for asking! We're moving forward: we got funding from Lightspeed and plan to run our pilot in Q4 of this year. You can subscribe at the bottom of catalyze-impact.org if you want to make sure you stay in the loop about sign-ups and updates.
To share some anecdotal data: I personally have had positive experiences doing regular coaching calls with Kat this year and feel that her input has been very helpful.
I would encourage us all to hold off on updating until we also hear the other side of the story; that generally seems like good practice to me whenever it is possible.
(also posted this comment on the EA forum)
Hi, I'd encourage you to apply if you recognize yourself in the About you section!
"When in doubt, always apply" is my personal motto.
I'd be curious to hear from the people who pressed the disagreement button on Evan's remark: what part of this do you disagree with or not recognize?
I was thinking about helping with infrastructure around access to large amounts of compute, but I had not considered trying to help with access to cutting-edge models. I think it might be a very good suggestion; thanks for sharing your thoughts!