What to do when starting a business in an imminent-AGI world?

post by ryan_b · 2022-05-12T21:07:10.111Z · LW · GW · 4 comments

This is a question post.


As reported by 1a3orn [LW · GW] and Daniel Kokotajlo [LW · GW], Gato is here and appears to me to represent a sub-human AGI, or near enough as makes no difference in a timeline sense. I think this probably means a general thickening of deep learning applications everywhere, and the introduction of a kind of "stack" AI that can do things we used to need whole organizations to do - for example, doing patent research, labeling patent diagrams, and filing patent lawsuits.

I also have an idea for a business I would like to start. Starting a business is already a notoriously trying task with a low probability of success, and I wonder how much more trying it will be in a world that fills with AGI patent trolls, along with whatever else, well before the business hits any clear mark of success.

So my question is: what do we do to account for powerful AI, showing up soon, when we are starting a business?

Note that I am interested here in non-AI businesses in particular and non-software businesses in general, because Gato looks like the threshold for AI spilling across a bunch of new domains.

Answers

answer by moridinamael · 2022-05-13T14:04:55.307Z · LW(p) · GW(p)

Partly as a hedge against technological unemployment, I built a media company based on personal appeal. An AI will be able to bullshit about books and movies “better” than I can, but maybe people will still want to listen to what a person thinks, because it’s a person. In contrast, nobody prefers the opinion of a human on optimal ball bearing dimensions over the opinion of an AI.

If you can find a niche where a demand will exist for your product strictly because of the personal, human element, then you might have something.

shminux is right that the very concept of a “business” will likely lack meaning too far into an AGI future.

answer by Dagon · 2022-05-13T16:42:22.174Z · LW(p) · GW(p)

My recommendation would be NOT to include the general topic in your business plans or worries (unless your business is actually related).  DO include specific threats or competitors that a given capability of AGI will enable or strengthen.  I assert that many of these specific worries will either be opportunities for you to use the same techniques, or (really, and/or) be threats and problems that are already possible with human adversaries today.

AGI patent trolls are a good example of a specific worry to consider.  You probably ALREADY have to consider and spend planning effort on patents and defending against patent trolls.  Exploring how an increase in efficiency of such trolls interacts with an increase in efficiency of your lawyers is worth thinking about.

answer by burmesetheater (burmesetheaterwide) · 2022-05-13T19:21:31.501Z · LW(p) · GW(p)

You can't account for AGI, because nobody has any idea what a post-AGI world will look like, except maybe that it could be destroyed to make paperclips. So if starting a business is a real calling, go for it. Or not. Just don't expect the business to survive AGI, even if it thrives pre-arrival. Don't underestimate how much your world may change: scenarios like you (or an agent somewhat associated with the entity formerly known as you, or anyone else at all) running a business might simply not make sense, because the concept of a business is a reflection of how our world is currently structured. Even humans can unbalance that without the help of AGI. In short, it's a good bet that AGI will be such a great disruption that the patent system is more likely to be gone than to be filled with AGI patent trolls.

4 comments

Comments sorted by top scores.

comment by Shmi (shminux) · 2022-05-13T05:16:11.296Z · LW(p) · GW(p)

I think you are asking the wrong question. "Business" is way downstream of other activities that would be impacted by "an imminent-AGI world". Such a world would probably be unpredictable (to a human) on the timescale of weeks, if not days or hours. What would happen to food/shelter/communication with human-level (though maybe not quite self-improving) AI on the loose? It will probably look nothing like the mundane and happy world of the comic Questionable Content, where AIs and humans happily coexist and discuss issues like whether a sentient assembly line should be allowed to unionize.

Replies from: ryan_b
comment by ryan_b · 2022-05-13T13:43:05.816Z · LW(p) · GW(p)

"Business" is way downstream of other activities that would be impacted by "an imminent-AGI world".

I am skeptical of this. What causal path for changing the world at scale doesn't go through businesses, specifically the hardware ones like manufacturing and construction?

Replies from: tomcatfish
comment by Alex Vermillion (tomcatfish) · 2022-05-13T17:10:48.423Z · LW(p) · GW(p)

So, I might be misunderstanding your question, but here's an example of what shminux is saying.

(Note: Something this large isn't necessary for the point to hold, but nuance is the enemy of a clear explanation)

Imagine an AI pops up tomorrow and says "human beings will not get hurt any more". You no longer need to worry about food, shelter, protection from others, or many of the other things you needed money for. You'd also expect much of old Earth to change radically when governments are unable to use the threats they previously relied on to control their slice of the world.

If the AI has already done this, there's nothing specific it needs your business for.

Replies from: ryan_b
comment by ryan_b · 2022-05-16T15:07:05.508Z · LW(p) · GW(p)

This makes sense to me, but it seems like a better fit for describing a post-AGI world. I am asking about the period before AGI has arrived (like now) but when the probability of it arriving within the early lifetime of a business is high enough to merit specific consideration (also like now, I claim).

There has to be a transitional period before AGI is actually in a position to do any of these things, even in the fast takeoff scenario; there are too many atoms to move around for it to be otherwise, it seems to me.