What working on AI safety taught me about B2B SaaS sales

post by purple fire (jack-edwards) · 2025-02-04T20:50:19.990Z · LW · GW · 12 comments

Contents

  WTF is WTP?
  Software (companies) ate the world
    I argue that the primary source of software revenue is the monopolization of talent.
  The Lord giveth, and the Lord taketh away
  What's half a tril between friends?
  If this were a better post there would be some sort of nice takeaway at this point

Subtitle: you're too poor to use AGI.

WTF is WTP?

In Econ 101, you learn about something called willingness to pay (WTP). WTP is the highest price at which you're willing to buy some good or service. As long as the equilibrium price is less than your WTP, you'll buy the good; otherwise, you'll prefer to keep your money.

But there's a wrinkle: what if my demand isn't independent of yours? This happens all the time. My WTP for a social media platform will be higher if lots of other people are also using the platform. My WTP for a rare collectible will be higher if fewer other people have it.

That second situation creates an incentive for exclusive contracts. Let's say Ferrari makes a car for $50k. Suppose my individual WTP for a new car is $100k and your WTP is $70k. In theory, Ferrari should sell cars to both of us, since WTP > cost. But I want to feel special, so if I'm the only person with the sports car it's worth an extra $100k to me. Now Ferrari could sell to both of us and get $170k in revenue. But instead, I offer $190k to exclusively sell it to me. This leaves me better off, leaves Ferrari way better off, and leaves you in the dust.
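To make the arithmetic explicit, here's a minimal sketch of the two scenarios. All the numbers are the ones from the paragraph above:

```python
# Toy model of the exclusive-contract dynamic from the Ferrari example.
# All numbers come from the post; this just makes the comparison explicit.

COST = 50_000                 # Ferrari's cost to build one car
WTP_MINE = 100_000            # my WTP for a (non-exclusive) car
WTP_YOURS = 70_000            # your WTP
EXCLUSIVITY_BONUS = 100_000   # extra value to me of being the only owner
MY_OFFER = 190_000            # what I offer for an exclusive deal

# Scenario A: sell to both of us at our respective WTPs.
revenue_both = WTP_MINE + WTP_YOURS          # 170,000
profit_both = revenue_both - 2 * COST        # 70,000

# Scenario B: accept my exclusive offer.
profit_exclusive = MY_OFFER - COST           # 140,000

# My surplus under exclusivity: the car is worth 200,000 to me, I pay 190,000.
my_surplus = (WTP_MINE + EXCLUSIVITY_BONUS) - MY_OFFER  # 10,000

print(f"Sell to both:   profit = ${profit_both:,}")
print(f"Exclusive deal: profit = ${profit_exclusive:,}, my surplus = ${my_surplus:,}")
# Ferrari is better off, I'm better off, and you get no car.
```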

Keep that dynamic in mind; we'll see it again later.

Software (companies) ate the world

Here's a quick stat: in 2024, the global SaaS market was around $3 trillion.[1] That's an economy the size of the UK built around subscription software. Why doesn't every company build their own software solutions, perfectly tailored to their needs?

To be clear, enterprises definitely want to do this. Third-party solutions are a huge headache: they create reliance on another company that's totally out of your control and carry all the compliance baggage of the provider. If your software provider goes out of business, you're screwed. If they decide to double your costs, you're screwed. If they have a data leak, you're screwed. If a junior developer pushes a commit with breaking package dependencies, literally all of your computers will turn off and you will lose billions of dollars in revenue. Oops.

Oh, and also most enterprise software is sold for 5-6x costs, so there are (checks notes) hundreds of billions of dollars of lost value flowing to these providers.[2]

So why does the software industry exist at all? Because software companies have monopolized three things: distribution and infrastructure, data access, and talent.

Distribution and infrastructure started becoming pretty commoditized in the early 2010s with the advent of AWS, Azure, GCP, and co. Sure, there are still high-ish margins, though they're low relative to most enterprise software, and they mostly come from upselling products on top of the compute/storage. But costs for software infrastructure are many orders of magnitude lower than they once were, and are dropping every year.

Data access seems like it should be a strong moat, but it's weaker than it looks. It's true that no one can really compete with Google on web search, since they have superior page-ranking data. But most enterprise software doesn't rely on data like that, and when it does, it's mostly internal company data.[3]

I argue that the primary source of software revenue is the monopolization of talent.

The mechanic goes something like this: ideally, regular companies (insurance, banking, etc.) would like to build their own software, so it's perfectly tailored to their needs and completely under their control. To do this, they need to hire skilled engineers, designers, and project managers.

But if those engineers, designers, and project managers work at a tech company, the products they build can be replicated for ~free, making those workers several times more productive. Since they're far more productive, tech companies can pay them far more, so all these people go work at [insert Big Tech megacorp]. These tech companies pour money into recruiting and PR campaigns to the point that aspiring developers practically idolize them.
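As a toy model of that productivity gap (the numbers here are illustrative assumptions, not figures from the post):

```python
# Toy model: why a SaaS firm can outbid a regular company for the same engineer.
# Illustrative numbers; the mechanism is that software replicates for ~free.

VALUE_PER_DEPLOYMENT = 50_000  # assumed value the engineer's work creates
                               # for each company that uses it

# An insurance company builds the tool once, for its own use.
in_house_value = VALUE_PER_DEPLOYMENT * 1       # $50k

# A SaaS company builds it once and copies it to 1,000 customers at ~zero
# marginal cost.
saas_value = VALUE_PER_DEPLOYMENT * 1_000       # $50M

# Each firm can profitably pay the engineer up to the value created there,
# so the SaaS firm can always outbid the insurance firm for the same person.
print(f"Max wage a regular firm can justify: ${in_house_value:,}")
print(f"Max wage a SaaS firm can justify:    ${saas_value:,}")
```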

Of course, if there were far more skilled workers, this wouldn't be possible. Microsoft wouldn't be able to charge 5x costs for software so unusable that their own engineers shit talk it on Reddit. Every company would just build its own tools! Why do you think most big enterprises have in-house HR teams but not in-house SWE teams? It's because HR people make 20 bucks an hour and software engineers want $250k + equity out of undergrad.

The Lord giveth, and the Lord taketh away

You know where this is going. It's been a crazy 30-year bull run for software. The playbook of hiring all the strong engineers to stifle competition, building a just-good-enough product, copying it millions of times, and selling each copy for 5-6x costs is perhaps the best business model to ever exist.[4] This model is so powerful that companies started hiring engineers to do nothing, just so they wouldn't work at competitors.

Now the prospect of AI engineers at least as good as humans is on the horizon, and that business model is rapidly looking endangered. Zuck says they'll start replacing Meta employees this year. o3 can already perform economically valuable work for tech companies. SWE benchmarks are getting saturated.

[Meme image: "Syndrome works for IS" (r/FireEmblemHeroes)]
Ayn Rand is rolling in her grave somewhere

What's half a tril between friends?

Remember that thing about willingness to pay?

Let's say you're Oracle, which "specializes in cloud solutions and enterprise-grade software for large corporations." You make $50 billion every year selling software to companies that can't build it themselves because you've bought up all the good engineers. All of a sudden, that moat looks a lot less secure. How much are you willing to pay for AI engineers to ensure that you maintain the monopoly over talent?

Internally, the WTP is something like the total comp of the employees you're replacing, plus or minus a bit depending on how you weigh things like AI engineers being always-on against the switching costs.

But remember, the business is only valuable because you can build things your customers can't. So really, your WTP is a lot higher if you can get some sort of exclusive access to frontier models, or at least protect the cartel of Big Tech. As long as the plebes don't get it.
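A back-of-envelope sketch of the gap between those two numbers (the headcount, comp, and moat-profit share below are made-up illustrative figures; only the ~$50B annual revenue comes from above):

```python
# Back-of-envelope: WTP for AI engineers with and without exclusivity.
# Headcount, comp, and margin are illustrative assumptions; the $50B
# annual revenue figure is the one quoted above.

ENGINEERS = 20_000            # assumed engineering headcount
AVG_TOTAL_COMP = 300_000      # assumed average total comp

# WTP if AI engineers merely replace your own payroll:
wtp_replacement = ENGINEERS * AVG_TOTAL_COMP            # $6B/year

# WTP for *exclusive* access: if everyone has AI engineers, the moat is gone,
# so exclusivity is worth up to the moat-dependent profit on top of payroll.
ANNUAL_REVENUE = 50_000_000_000
MOAT_PROFIT_SHARE = 0.5       # assumed fraction of revenue at risk
wtp_exclusive = wtp_replacement + ANNUAL_REVENUE * MOAT_PROFIT_SHARE

print(f"WTP, replacement only: ${wtp_replacement / 1e9:.0f}B/year")
print(f"WTP, exclusive access: ${wtp_exclusive / 1e9:.0f}B/year")
```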

For those of you living under a rock, this already happened.

As a concrete prediction to put into writing, I expect this to be a sufficiently strong economic incentive that it outweighs most other people's interest in having AGI. I predict that tech companies will attempt to monopolize AI models that are good at SWE so that their non-AI products remain scarce. I expect companies with more enterprise software and licensing products to do this more, and companies with more consumer-facing and network-y products to do this less.

If this were a better post there would be some sort of nice takeaway at this point

Maybe I'm wrong.[5] But for a wrong model of the world, this sure explains a lot.

For me, it helped justify some of the exorbitant spending by Big Tech on AI. Investors have been criticizing the investment in AI on the grounds that "it's unclear whether models trained on historical data will ever be able to replicate humans' most valuable capabilities." LOL.

It's also made me more confident that tech companies will try to corner the market on AI models the way they did with engineers. I think this is bad.

And in a funny way, it explains a lot of companies' stances on open vs. closed models. This mechanic doesn't really impact Meta, since they don't make enterprise software anyway and their platforms are only valuable because of network effects; it's not like I'm gonna tell an AI engineer to make me a personal Instagram-like app tailored to my preferences. So it's pedal to the metal, open-source models, commoditize that shit and send it to 0!

Google is somewhat affected by this, and Microsoft is really affected by it. Lo and behold, Google's models are somewhat open and mostly free, while Microsoft, Salesforce, and Oracle have poured money into OpenAI and their completely closed models.

What a time to be alive.

 

  1. ^

    https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/the-saas-factor-six-ways-to-drive-growth-by-building-new-saas-businesses

  2. ^

    https://www.lightercapital.com/blog/saas-gross-margins-and-how-to-increase-yours

  3. ^

    You might have noticed that I'm only talking about enterprise software here. That's because it's a ~10x larger industry by revenue, and even larger by profit, compared to consumer software (excluding gaming, which is kind of a different category), so it's really the only segment determining incentives.

  4. ^

    https://stripe.com/guides/atlas/business-of-saas#:~:text=Businesses%20and%20investors%20love%20SaaS,growing%20software%20companies%20in%20history.

  5. ^

    Although I'm pretty sure it's at least pointing in the right direction and would be curious to hear counterarguments. And even if it's not fully accurate, I want to encourage more thinking and discussion of this type among those in the AI safety community because I think it's useful for making predictions.

12 comments

Comments sorted by top scores.

comment by Martin Randall (martin-randall) · 2025-02-04T23:59:58.967Z · LW(p) · GW(p)

I can't make this model match reality. Suppose Amir is running a software company. He hired lots of good software engineers, designers, and project managers, and they are doing great work. He wants to use some sort of communications platform to have those engineers communicate with each other, via video, audio, or text. FOSS email isn't cutting it.

I think under your model Amir would build his own communications software, so it's perfectly tailored to his needs and completely under his control. Whereas what typically happens is that Amir forks out for Slack, or some competitor, while Amir's engineers work on software that generates revenue.

I think the success of B2B SaaS over bespoke solutions is adequately explained by economies of scale.

Replies from: jack-edwards
comment by purple fire (jack-edwards) · 2025-02-05T01:45:39.677Z · LW(p) · GW(p)

I don't disagree with most of what you said; maybe I should have been more explicit about some of the related points. In particular, I do think "the success of B2B SaaS over bespoke solutions is adequately explained by economies of scale" is true. But I think the reason there are economies of scale is that there are really high fixed costs and really low variable costs. I also think monopolizing talent enables software companies to make sure those high fixed costs stay nice and high.

With AI, engineering talent becomes cheap and plentiful. When that happens, fixed costs will plummet unless firms can control access to AI. If fixed costs plummet, economies of scale go away and the savings from the SaaS model get outweighed by the marginal benefit of bespoke solutions.
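A minimal sketch of that tradeoff (all numbers are invented for illustration):

```python
# Sketch of the fixed-cost argument in this comment: economies of scale exist
# because a large fixed build cost is spread across many customers. Once AI
# makes the fixed cost small, the SaaS discount shrinks below the value of a
# tailored solution. All numbers are illustrative assumptions.

def bespoke_beats_saas(fixed_cost, saas_price, tailoring_value):
    """Bespoke wins when its cost premium over SaaS is smaller than the
    value of having software tailored to your exact needs."""
    return fixed_cost - saas_price < tailoring_value

TAILORING_VALUE = 500_000  # assumed value of a perfectly tailored solution

# Today: $10M to build in-house vs a $200k SaaS contract.
print(bespoke_beats_saas(10_000_000, 200_000, TAILORING_VALUE))  # False

# With cheap AI engineers: $150k to build in-house vs the same contract.
print(bespoke_beats_saas(150_000, 200_000, TAILORING_VALUE))     # True
```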

what typically happens is that Amir forks out for Slack, or some competitor, while Amir's engineers work on software that generates revenue.

To push back a little on this: as software companies grow, they do try to do this less and less. How much enterprise software do you think Microsoft or Google is outsourcing? As soon as something becomes a bit of a dependency, they usually just acquire the company.

In fairness, I don't think this process will be rapid, nothing in B2B SaaS is. But I think tech companies see it on the horizon.

Replies from: jack-edwards
comment by purple fire (jack-edwards) · 2025-02-05T06:57:36.581Z · LW(p) · GW(p)

I also think monopolizing talent enables software companies to make sure those high fixed costs stay nice and high.

If you disagreed with this, is it because you think it is literally false or because you don't agree with the implied argument that software companies are doing this on purpose?

Replies from: martin-randall
comment by Martin Randall (martin-randall) · 2025-02-05T17:34:48.312Z · LW(p) · GW(p)

I think it's literally false.

Unlike the Ferrari example, there's no software engineer union for Google to make an exclusive contract with. If Google overpays for engineers, then that should mostly result in increased supply, along with some increase in price.

Also, it's not a monopoly (or monopsony) because there are many tech companies and they are not forming a cartel on this.

Also, tech companies are lobbying for more skilled immigration, which would be self-defeating if they had a plan to increase the cost of software engineers.

Replies from: jack-edwards
comment by purple fire (jack-edwards) · 2025-02-05T19:58:30.550Z · LW(p) · GW(p)

If https://outtalent.com/us50/ is to be believed, skilled software engineers look pretty concentrated at the top ~5 companies and their subsidiaries. Do you think that data is incorrect?

Concretely, I would claim that >80% of the most skilled software engineers in the US work at <10 companies. Edit: I thought about it more and I think this is actually more like 65% at the 10 biggest companies, but that doesn't change my central claims.

I also disagree with your claim that they are not a cartel. I think the biggest tech companies collude to fix wages so that they are sufficiently higher than every other company's salaries to stifle competition, while also limiting race dynamics to maintain profits. I think this is done in the form of selectively enforced non-competes, illegal non-poaching agreements, and other shady practices. This has been alleged in court and the companies just settle every time, e.g. https://www.nytimes.com/2014/03/01/technology/engineers-allege-hiring-collusion-in-silicon-valley.html?unlocked_article_code=1.uk4.A5Sn.q5fVDfF_q8Wk&smid=url-share

For those disagreeing:

1. I continue to believe that tech companies derive much of their economic power from cornering the skilled engineering labor market,

2. this is highly threatened by the advent of AI capable of coding,

3. and thus many big tech companies have massive economic incentives to limit the general public's access to models that can code well.

If I changed my mind about any of those 3 points, I would change my mind about the main post. Rather than downvoting, or in addition to it, can you please explain which part you disagree with and why? It will be more productive for everyone and I am open to changing my mind.

Replies from: korin43
comment by Brendan Long (korin43) · 2025-02-05T20:16:16.386Z · LW(p) · GW(p)

I think the biggest tech companies collude to fix wages so that they are sufficiently higher than every other company's salaries to stifle competition

The NYT article you cite says the exact opposite, that Big Tech companies were sued for colluding to fix wages downward, not upward. Why would engineers sue if they were being overpaid?

Replies from: jack-edwards
comment by purple fire (jack-edwards) · 2025-02-05T20:39:34.661Z · LW(p) · GW(p)

Sorry, let me elaborate on the situation. The big tech companies know that they can pay way more than smaller competitors, so they do. But then that group of megacorp tech (Google, Amazon, Meta, etc.) colludes internally to prevent runaway race dynamics. This is how they optimize their costs subject to the constraint that salaries stay high enough to stifle competition. Here, I was just offering evidence for my claim that big tech is a monopsonistic cartel in the SWE labor market; it isn't really evidence one way or another for the claims I make in the original post.

comment by Martin Randall (martin-randall) · 2025-02-06T03:32:33.655Z · LW(p) · GW(p)

I appreciated the prediction in this article and created a market for my interpretation of that prediction, widened in an attempt to bring it closer to a 50% chance by my estimation.

Replies from: jack-edwards
comment by purple fire (jack-edwards) · 2025-02-07T05:18:27.898Z · LW(p) · GW(p)

Are you including models that are only used by their creator firm? I work as an ML researcher in big tech (I want to keep this account anon, but it's one of MSFT/OAI, DM, Meta, Anthropic, xAI) and have access to tooling substantially better than what's commercially available (proof by existence?), but that's not really what my post is about. My main model for this actually panning out is something like:

  • Big tech company has control over AI lab
  • AI lab makes cracked SWE agent
  • Big tech company notices that releasing that SWE agent will undermine the rest of their software development business, so instead of licensing it out they only make it available to their own staff and perhaps business allies

I'm just clarifying because it's barely even confidential information that engineers at AI labs have better models than engineers at small or mid-size tech firms, and I want to check what you're actually betting on.

Replies from: martin-randall
comment by Martin Randall (martin-randall) · 2025-02-07T14:01:12.573Z · LW(p) · GW(p)

Are your concerns accounted for by this part of the description?

Unreleased models are not included. For example, if a model is not released because it risks causing human extinction, or because it is still being trained, or because it has a potty mouth, or because it cannot be secured against model extraction, or because it is undergoing recursive self-improvement, or because it is being used to generate synthetic data for another model, or any similar reason, that model is ignored for the purpose of this market.

However, if a model is ready for release, and is only not being released in order to monopolize its use in creating commercial software, then this counts as "exclusive use".

I intended for "AI engineers use unreleased AI model to make better AI models" to not be included.

It is a slightly awkward thing to operationalize, I welcome improvements. We could also take this conversation to Manifold.

comment by J Bostock (Jemist) · 2025-02-04T23:31:15.126Z · LW(p) · GW(p)

This is a very interesting point. I have upvoted this post even though I disagree with it, because I think the question of "Who will pay, and how much will they pay, to restrict others' access to AI?" is important.

My instinct is that this won't happen, because there are too many AI companies for this deal to work on all of them, and some of these AI companies will have strong kinda-ideological commitments to not doing this. Also, my model of (e.g.) OpenAI is that they want to eat as much of the world's economy as possible, and this is better done by selling (even at a lower price) to anyone who wants an AI SWE than by selling just to Oracle.

o4 (God, I can't believe I'm already thinking about o4) as a B2B SaaS product seems unlikely to me. Specifically, I'd put <30% odds that the o4-series has its prices jacked up or its API access restricted in order to allow some companies to monopolize its usage for more than 3 months without an open release. This won't apply if the only models in the o4 series cost $1000s per answer to serve, since that's just a "normal" kind of expensive.

Then, we have to consider that other labs are 1-1.5 years behind, and it's hard to imagine Meta (for example) doing this in anything like the current climate.

Replies from: jack-edwards
comment by purple fire (jack-edwards) · 2025-02-05T06:56:41.386Z · LW(p) · GW(p)

Hm, this violates my model of the world.

there are too many AI companies for this deal to work on all of them

Realistically, I think there are like 3-4 labs[1] that matter, OAI, DM, Anthropic, Meta.

some of these AI companies will have strong kinda-ideological commitments to not doing this

Even if that were true, they will be at the whim of investors, who are almost all big tech companies.

this is better done by selling (even at a lower revenue) to anyone who wants an AI SWE than selling just to Oracle.

This is the explicit claim I was making with the WTP argument. I think this is firmly not true, and OpenAI will make more money by selling just to Oracle. What evidence causes you to disagree?

  1. ^

    American/Western labs.