Comments

Comment by emilfroberg on Bounty: Diverse hard tasks for LLM agents · 2024-02-20T01:38:37.299Z

As part of the task we are designing, the agent needs access to an LLM API (running a model locally would make the deliverable too big, I assume). The easiest option would be OpenAI or Anthropic, but those APIs are not static. Alternatively, we could host a Llama model in the cloud, but it is not ideal for us to have to keep that running. What is the best way to do this?
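
For concreteness, here is a minimal sketch of what I have in mind, assuming the self-hosted option exposes an OpenAI-compatible endpoint (e.g. served with vLLM); the environment variable names and model strings below are placeholders, not anything from the task spec:

```python
import os
from openai import OpenAI

# Placeholder configuration: the agent reads its LLM backend from environment
# variables, so the same code can point at OpenAI or at a self-hosted Llama
# served behind an OpenAI-compatible endpoint (e.g. via vLLM).
BASE_URL = os.environ.get("LLM_BASE_URL", "https://api.openai.com/v1")
MODEL = os.environ.get("LLM_MODEL", "gpt-4-turbo")  # placeholder model name

client = OpenAI(base_url=BASE_URL, api_key=os.environ["LLM_API_KEY"])

def complete(prompt: str) -> str:
    """Run a single chat completion against whichever backend is configured."""
    response = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(complete("Summarize the task instructions in one sentence."))
```

The point of routing everything through configuration like this is that swapping backends (hosted API vs. cloud-hosted Llama) only means changing environment variables, not the agent code, but it still leaves open the question of which backend keeps the deliverable static.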