What working on AI safety taught me about B2B SaaS sales
post by purple fire (jack-edwards) · 2025-02-04T20:50:19.990Z · LW · GW · 3 comments
Subtitle: you're too poor to use AGI.
WTF is WTP?
In Econ 101, you learn about something called willingness to pay (WTP). WTP is the highest price at which you're willing to buy some good or service. As long as the equilibrium price is less than your WTP, you'll buy the good; otherwise, you'll prefer to keep your money.
But there's a wrinkle: what if my demand isn't independent of yours? This happens all the time. My WTP for a social media platform will be higher if lots of other people are also using the platform. My WTP for a rare collectible will be higher if fewer other people have it.
That second situation creates an incentive for exclusive contracts. Let's say it costs Ferrari $50k to make a car. Suppose my individual WTP for a new car is $100k and your WTP is $70k. In theory, Ferrari should sell cars to both of us, since WTP > cost for each sale. But I want to feel special: being the only person with the sports car is worth an extra $100k to me, so my WTP for an exclusive car is $200k. Ferrari could sell to both of us and collect $170k in revenue. Instead, I offer $190k for Ferrari to sell to me exclusively. This leaves me better off (I pocket $10k of surplus), leaves Ferrari way better off, and leaves you in the dust.
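For concreteness, the arithmetic in that example can be written out as a quick sanity check (a sketch using the numbers above; the $190k offer is just one feasible price in the bargaining range):

```python
# Numbers from the Ferrari example above. Any exclusive price between
# $170k and $200k leaves both me and Ferrari better off.
cost = 50_000                 # Ferrari's cost to build one car
wtp_me = 100_000              # my baseline WTP
wtp_you = 70_000              # your WTP
exclusivity_bonus = 100_000   # extra value to me of being the only owner

revenue_selling_to_both = wtp_me + wtp_you      # $170k
my_exclusive_wtp = wtp_me + exclusivity_bonus   # $200k
exclusive_offer = 190_000

assert exclusive_offer > revenue_selling_to_both  # Ferrari prefers the deal
assert exclusive_offer < my_exclusive_wtp         # I still keep $10k of surplus
```

The interval between Ferrari's two-sale revenue ($170k) and my exclusive WTP ($200k) is the bargaining range; where the price lands inside it just splits the surplus between us.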
Keep that dynamic in mind; we'll see it again later.
Software (companies) ate the world
Here's a quick stat: in 2024, the global SaaS market was around $3 trillion.[1] That's an economy the size of the UK built around subscription software. Why doesn't every company build their own software solutions, perfectly tailored to their needs?
To be clear, enterprises definitely want to do this. Third-party solutions are a huge headache because they create reliance on another company that's totally outside your control, and they carry all the compliance baggage of the provider. If your software provider goes out of business, you're screwed. If they decide to double your costs, you're screwed. If they have a data leak, you're screwed. If a junior developer pushes a commit with breaking package dependencies, literally all of your computers will turn off and you will forgo billions of dollars in revenue. Oops.
Oh, and also most enterprise software is sold for 5-6x costs, so there are (checks notes) hundreds of billions of dollars of lost value flowing to these providers.[2]
So why does the software industry exist at all? Because software companies have monopolized three things:
- Talent
- Data access
- Distribution and infrastructure
Distribution and infrastructure started becoming pretty commoditized in the early 2010s with the advent of AWS, Azure, GCP, and co. Sure, there are still high-ish margins, though they're low relative to most enterprise software and mostly come from upselling products on top of the compute/storage. But costs for software infrastructure are many orders of magnitude lower than they once were, and they're dropping every year.
Data access seems like it should be a strong moat, but it's weaker than it seems. It's true that no one can really compete with Google on web search since they have superior data on page rank. But most enterprise software doesn't rely on data like that, and when it does, it's mostly internal company data.[3]
I argue that the primary source of software revenue is the monopolization of talent.
The mechanic goes something like this: ideally, regular companies (insurance, banking, etc.) would like to build their own software, so it's perfectly tailored to their needs and completely under their control. To do this, they need to hire skilled engineers, designers, and project managers.
But if those engineers, designers, and project managers work at a tech company, the products they build can be replicated for ~free, making them several times more productive. Since they're far more productive, tech companies can pay them far more, so all these people go work at [insert Big Tech megacorp]. These tech companies pour money into recruiting and PR campaigns to the point that aspiring developers practically idolize them.
Of course, if there were far more skilled workers, this wouldn't be possible. Microsoft wouldn't be able to charge 5x costs for software so unusable that their own engineers shit talk it on Reddit. Every company would just build its own tools! Why do you think most big enterprises have in-house HR teams but not in-house SWE teams? It's because HR people make 20 bucks an hour and software engineers want $250k + equity out of undergrad.
The Lord giveth, and the Lord taketh away
You know where this is going. It's been a crazy 30-year bull run for software. The playbook of hiring all the strong engineers to stifle competition, building a just-good-enough product, copying it millions of times, and selling each copy for 5-6x costs is perhaps the best business model to ever exist.[4] This model is so powerful that companies started hiring engineers to do nothing, just so they wouldn't work at competitors.
Now the prospect of AI engineers at least as good as humans is on the horizon, and that business model is rapidly starting to look endangered. Zuck says they'll start replacing Meta employees this year. o3 can already perform economically valuable work for tech companies. SWE benchmarks are getting saturated.
What's half a tril between friends?
Remember that thing about willingness to pay?
Let's say you're Oracle, which "specializes in cloud solutions and enterprise-grade software for large corporations." You make $50 billion every year selling software to companies that can't build it themselves because you've bought up all the good engineers. All of a sudden, that moat looks a lot less secure. How much are you willing to pay for AI engineers to ensure that you maintain the monopoly on talent?
Internally, the WTP is something like total comp of the employees you're replacing, plus or minus a little bit depending on how much you value things like AI engineers being always-on versus the switching costs.
But remember, the business is only valuable because you can build things your customers can't. So really, your WTP is a lot higher if you can get some sort of exclusive access to frontier models, or at least protect the cartel of Big Tech. As long as the plebes don't get it.
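A back-of-envelope version of the two WTP anchors described above (all numbers here are hypothetical illustrations, not Oracle's actual headcount or figures):

```python
# Hypothetical illustration of the two willingness-to-pay anchors above.
# None of these numbers are real Oracle figures.
engineers_replaced = 10_000
avg_total_comp = 300_000      # per engineer, per year

# Anchor 1: the internal WTP, roughly the payroll being replaced.
replacement_wtp = engineers_replaced * avg_total_comp   # $3B/year

# Anchor 2: the software rents that exclusive access would protect.
annual_revenue_at_risk = 50_000_000_000                 # ~$50B/year

# The WTP for *exclusive* AI access is anchored to the rents at risk,
# not to payroll, which is why it can be an order of magnitude higher.
assert annual_revenue_at_risk > 10 * replacement_wtp
```

The gap between the two anchors is the whole point: exclusivity is worth paying for precisely because the revenue it protects dwarfs the labor cost it replaces.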
For those of you living under a rock, this already happened.
As a concrete prediction to put into writing, I expect this to be a sufficiently strong economic incentive that it outweighs most other people's interest in having AGI. I predict that tech companies will attempt to monopolize AI models that are good at SWE so that their non-AI products remain scarce. I expect companies with more enterprise software and licensing products to do this more, and companies with more consumer-facing and network-y products to do this less.
If this were a better post there would be some sort of nice takeaway at this point
Maybe I'm wrong.[5] But for a wrong model of the world, this sure explains a lot.
For me, it helped justify some of the exorbitant spending by Big Tech on AI. Investors have been criticizing that investment on the grounds that "it's unclear whether models trained on historical data will ever be able to replicate humans' most valuable capabilities." LOL.
It's also made me more confident that tech companies will try to corner the market on AI models the way they did with engineers. I think this is bad.
And in a funny way, it explains a lot of companies' stances on open vs. closed models. This mechanic doesn't really impact Meta, since they don't make enterprise software anyway and their platforms are only valuable because of network effects; it's not like I'm gonna tell an AI engineer to make me a personal Instagram-like app tailored to my preferences. So it's pedal to the metal, open-source models, commoditize that shit and send it to 0!
Google is somewhat affected by this, and Microsoft is really affected by it. Lo and behold, Google's models are somewhat open and mostly free, while Microsoft, Salesforce, and Oracle have poured money into OpenAI and their completely closed models.
What a time to be alive.
1. ^ https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/the-saas-factor-six-ways-to-drive-growth-by-building-new-saas-businesses
2. ^ https://www.lightercapital.com/blog/saas-gross-margins-and-how-to-increase-yours
3. ^ You might have noticed that I'm only talking about enterprise software here. That's because it's a ~10x larger industry by revenue, and even larger by profit, compared to consumer software (excluding gaming, which is kind of a different category), so it's really the only segment determining incentives.
4. ^ https://stripe.com/guides/atlas/business-of-saas#:~:text=Businesses%20and%20investors%20love%20SaaS,growing%20software%20companies%20in%20history.
5. ^ Although I'm pretty sure it's at least pointing in the right direction and would be curious to hear counterarguments. And even if it's not fully accurate, I want to encourage more thinking and discussion of this type among those in the AI safety community because I think it's useful for making predictions.
3 comments
comment by Martin Randall (martin-randall) · 2025-02-04T23:59:58.967Z · LW(p) · GW(p)
I can't make this model match reality. Suppose Amir is running a software company. He hired lots of good software engineers, designers, and project managers, and they are doing great work. He wants to use some sort of communications platform to have those engineers communicate with each other, via video, audio, or text. FOSS email isn't cutting it.
I think under your model Amir would build his own communications software, so it's perfectly tailored to his needs and completely under his control. Whereas what typically happens is that Amir forks out for Slack, or some competitor, while Amir's engineers work on software that generates revenue.
I think the success of B2B SaaS over bespoke solutions is adequately explained by economies of scale.
↑ comment by purple fire (jack-edwards) · 2025-02-05T01:45:39.677Z · LW(p) · GW(p)
I don't disagree with most of what you said, maybe I should have been more explicit about some of the points related to that. In particular, I do think "the success of B2B SaaS over bespoke solutions is adequately explained by economies of scale" is true. But I think the reason there are economies of scale is that there are really high fixed costs and really low variable costs. I also think monopolizing talent enables software companies to make sure those high fixed costs stay nice and high.
With AI, engineering talent becomes cheap and plentiful. When that happens, fixed costs will plummet unless firms can control access to AI. If fixed costs plummet, economies of scale go away and the savings from the SaaS model get outweighed by the marginal benefit of bespoke solutions.
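That fixed-cost argument can be made concrete with a toy cost model (all numbers purely illustrative):

```python
def per_customer_cost(fixed_cost, variable_cost, n_customers):
    """Average cost per customer when a one-time build cost is amortized."""
    return fixed_cost / n_customers + variable_cost

# Today: building software is expensive, so SaaS amortization dominates.
saas_now = per_customer_cost(50_000_000, 1_000, n_customers=10_000)   # $6k
bespoke_now = per_customer_cost(50_000_000, 1_000, n_customers=1)     # ~$50M

# If AI engineering slashes the fixed cost, the scale advantage shrinks
# and bespoke software starts to look competitive.
saas_ai = per_customer_cost(500_000, 1_000, n_customers=10_000)       # $1,050
bespoke_ai = per_customer_cost(500_000, 1_000, n_customers=1)         # ~$501k

# The bespoke-to-SaaS cost ratio collapses as fixed costs fall.
assert bespoke_now / saas_now > bespoke_ai / saas_ai
```

Economies of scale here are just fixed cost divided by customer count; shrink the numerator and the scale moat shrinks with it.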
> what typically happens is that Amir forks out for Slack, or some competitor, while Amir's engineers work on software that generates revenue.
To push back a little on this: as software companies grow, they do try to do this less and less. How much enterprise software do you think Microsoft or Google outsources? As soon as something becomes a bit of a dependency, they usually just acquire the company.
In fairness, I don't think this process will be rapid, nothing in B2B SaaS is. But I think tech companies see it on the horizon.
comment by J Bostock (Jemist) · 2025-02-04T23:31:15.126Z · LW(p) · GW(p)
This is a very interesting point. I have upvoted this post even though I disagree with it because I think the question of "Who will pay, and how much will they pay, to restrict others' access to AI?" is important.
My instinct is that this won't happen, because there are too many AI companies for this deal to work on all of them, and some of these AI companies will have strong kinda-ideological commitments to not doing this. Also, my model of labs like OpenAI is that they want to eat as much of the world's economy as possible, and this is better done by selling (even at lower revenue per seat) to anyone who wants an AI SWE than by selling just to Oracle.
o4 (God, I can't believe I'm already thinking about o4) as a B2B SaaS product seems unlikely to me. Specifically, I'd put <30% odds that the o4-series models have their prices jacked up or their API access restricted in order to let some companies monopolize their usage for more than 3 months without an open release. This won't apply if the only models in the o4 series cost $1000s per answer to serve, since that's just a "normal" kind of expensive.
Then, we have to consider that other labs are 1-1.5 years behind, and it's hard to imagine Meta (for example) doing this in anything like the current climate.