
Technical Deflation

What I don’t understand is how this is automatically good for startups.

Say I have a startup that vibe codes “AI for real estate”. What about customer acquisition?

On the other hand, if I’m Zillow, why can’t I just throw a developer on the same feature and automatically have a customer base for it?

If you look at most of the YC-funded startups these days, they are just prompt engineers with no go-to-market strategy. Some don’t even have any technical people and are looking for “technical cofounders” they can underpay with a promise of equity that will statistically be meaningless.

7 hours agoraw_anon_1111

Yep, AI coding will just help entrench monopoly positions lol.

2 hours agoBombthecat

Clayton Christensen, in “The Innovator’s Dilemma”, would call AI a “sustaining innovation”, not a “disruptive innovation”.

At the end of the day, OpenAI is losing billions of dollars while Google caught up, still posting record revenues and profits using its own infrastructure and TPUs.

Even the laggard Apple is reportedly just going to throw a billion (chump change) at Google for its model and keep selling phones and other hardware, while OpenAI is reportedly working on a “smart egg”.

2 hours agoraw_anon_1111

As someone building AI infrastructure, this deflation idea shows up very differently for us. Application founders can afford to say they will wait for the next model, but infrastructure founders cannot. The value we create is not in shipping features faster. It is in absorbing the volatility that sits underneath the entire AI stack.

Models keep getting cheaper and more capable every few months. However, the underlying compute economics do not deflate at the same rate. GPU provisioning, inference orchestration, bandwidth constraints, latency guarantees, regulatory requirements, and failure handling do not become magically simple because a new model improved its reasoning. In reality, each improvement on the model side increases pressure on the infrastructure side. Bigger context windows, heavier memory footprints, more parallel requests, and more complex agentic workflows all increase the operational burden.

For infrastructure teams, waiting does not help. The surface area of what needs to be built only grows. You cannot delay autoscaling, observability, scheduling, routing, or privacy guarantees. Applications will always demand more from the infrastructure, and they will expect it to feel like a commodity.

My view is that technical deflation applies much more to application startups than to infrastructure startups. App founders can benefit from waiting. Infra founders have to build now because every model improvement instantly becomes a new expectation that the infra must support. The baseline keeps rising.

The real moat in the next era is not the speed of feature development. It is the ability of your infrastructure to absorb the increasing chaos of more capable models while keeping the experience simple and predictable for the user.

32 minutes agoBarathkanna

What will become apparent is that when coding costs go to zero, support and robustness costs will be the new "engineering" discipline. Which is, in reality, how things already work. It is why you can have open source code and companies built on providing enterprise support for that code.

If you want to build a successful AI company, assume the product part is easy. Build the support network: guarantee uptime, fast responses, direct human support. These are the shovels desperately needed during the AI gold rush.

3 hours agolubujackson

Imagine: people aren't spending already because they have no money, things keep getting pricier and pricier, and people buy less and less, trying to put money aside in case something happens, like an accident. And the "higher-ups" still think "lower prices bad". I sure hope the wall hits so hard countries fall.

an hour agohollow-moe

Did Google Search and Stack Overflow and the internet deflate programming?

Compared with reading books you might own, have to order, or get from the library? You bet.

There still are some interesting problems to tackle. Maybe more than before. So who knows.

2 hours agomakapuf

The impulse to make the comparison makes sense. The reality is a bit different: it probably leans in the right direction, but it is buffered by learning. I’ll explain. There is no upside to delaying a purchase if you think things will get less expensive. There is, however, upside in building today even if you have to rebuild tomorrow, and that upside is learning the problem space: specifically, what is likely to be trivialized and what truly requires domain knowledge. Horizontal apps? Little domain knowledge to encode. Vertical app? More domain knowledge to encode. Separately, there are more ways to differentiate than distribution alone; see verifier’s law. Problems that are challenging to verify are challenging for AI to trivialize.

7 hours agojackar

In fact, the opposite is true. There is tech inflation. Coding will get easier, but that doesn't mean tech is getting easier. It will only get more complex and will drive more segregation.

Tech is dividing society and driving the wedge deeper. There is a huge population being thrown to the wayside by the high-speed tech highway. Which means tech is getting more and more unreachable.

AI assistants are only going to make this worse by removing the direct touch between users and the tech. Tech becomes simply unmanageable for the average person.

Just like how, as a kid, you were able to do all the repairs on your bike, but you can't do the same for your car now. Tech never gets easier or more reachable.

5 hours agozkmon

I think this is a very solid take. Moreover, tech will be optimized for LLMs and not humans. Shitty languages will remain shitty because there's no need to optimize them or make them more elegant. Web dev will remain ultra-fragmented and will get even worse as the years go by.

Not going to lie, it looks like a bleak future.

3 hours agocoolThingsFirst

I disagreed with some points and agreed with others based on my experience and the data I have available, but the last few sentences really weakened the overall point.

> Giga AI, a company building AI customer support agents, claims to have sworn off the "forward deployed engineer" model of custom software favored by many other successful startups, in favor of software that customizes itself—only possible because of coding agents.

Giga AI is not a publicly traded company, and they have zero legal liability or possible downside for lying, and massive upside. They also don't have real customers and are not revenue-positive. The trend is that everyone who has said this was lying.

When there's tangible evidence of this, I think it will be an important part of the discussion. Until then, saying "claims" and "but I don't really know" while paraphrasing their press release without analysis is about as sophisticated and as honest as tweeting "people are saying."

The author should take their own advice and wait six months when these claims will be easier to substantiate and support the analysis far more strongly.

an hour agofasbiner

The models themselves represent the biggest deflation case I've ever seen.

The charged cost of a frontier model is ~200x lower than two years ago, and the ones we are using now are much better, although measuring that, and by how much, is challenging. Building a "better than GPT-4" model is also vastly cheaper than building GPT-4 was... perhaps 1/100th the cost?
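
Taking the ~200x over two years at face value, that works out to roughly 20% deflation every month. A throwaway sketch:

    # Implied monthly deflation if frontier-model prices fell ~200x in 24 months
    monthly_factor = (1 / 200) ** (1 / 24)
    print(f"each month costs ~{monthly_factor:.1%} of the month before, "
          f"i.e. ~{1 - monthly_factor:.0%} monthly deflation")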

6 hours agosgt101

This is a great point that wasn't included in the original article. Thank you.

an hour agokittikitti

I'm still ahead of a company starting later:

I have the legal structure, I know my colleagues, and I potentially have employees and more capacity.

The problem is not a startup starting after you; it's not giving yourself time to keep an eye on AI and not leveraging it when it's helpful.

We leverage AI and ML progress constantly and keep an eye on advances. Segment Anything? Yep, we use it. Claude? Yes sir!

7 hours agoGlemkloksdjf

Does anyone really read any of this stuff anyway?

2 hours agorobotburrito

Technology has always been deflationary. But you don't put off buying a computer because it will be cheaper next year. Nobody seems to be putting off buying GPUs despite scary depreciation and a blistering pace of new product introductions that are ever cheaper, faster, and better.

9 hours agoZigurd

Only really faster and better if you don't use them for gaming, unfortunately. Upscaling and frame generation are not a better GPU; they're a band-aid applied to hide the fact that the GPU didn't actually get much faster.

RTX doesn't count for me either, because that's some bullcrap pushed by GPU manufacturers that requires the aforementioned upscaling and frame-generation techniques to fake being anywhere close to what GPU manufacturers want gamers to believe.

7 hours agovrighter

> Only really faster and better if you don't use them for gaming, unfortunately. Upscaling and frame generation are not a better GPU; they're a band-aid applied to hide the fact that the GPU didn't actually get much faster.

The generational gains haven’t been as great as past generations, but it’s getting silly to claim that GPUs aren’t getting faster for gaming.

Intentionally ignoring frame generation and DLSS upscaling also feels petty. Using those features to get 150-200 fps at 4K is actually an amazing experience, even if the purists turn their noses up at it.

The used GPU market is relatively good at calibrating for relative gaming performance. If new GPUs weren’t actually faster then old GPUs wouldn’t be depreciating much. Yet you can pick up 3000 series GPUs very cheaply right now (except maybe the 3090 which is prized for its large VRAM, though still cheap). Even 4000 series are getting cheap.

6 hours agoAurornis

"Guessing what a pixel's color might be were one to actually do the work and render it" is not the same as actually rendering it. No, upscaling doesn't count.

Doing it for a whole screenful of pixels, for the majority of frames (with multi-frame generation), is even further from actually rendering.

5 hours agovrighter

I dunno, the kiddo went from a 1650 Super to a 3060 and it's a lot nicer looking, and I don't think frame gen and whatnot is enabled. Sure, that's up a notch on the SKU list and tons more VRAM. The 1650 Super was working with most of the games he tried, but Marvel Rivals was terrible (haven't seen him play it with the new card, though).

It does help that he has a small screen and regular DPI. Seems like everyone wants to run with 4x the pixels in the same space, which needs about 4x the GPU.

5 hours agotoast0

Some people do put off buying cellphones and laptops when they know a new model will come out every year.

8 hours agomr_toad

The overall trend has been the opposite though, hasn't it? People used to buy a new phone (or new laptop/etc) every couple of years because the underlying tech was improving so quickly, but now that the improvements have slowed down, they're holding onto their devices for longer.

There was an article[1] going around about that recently, and I'm sure there are more, but it's also a trend I've seen first-hand. (I don't particularly care for the article's framing, I'm just linking to it to illustrate the underlying data.)

[1]: https://www.cnbc.com/2025/11/23/how-device-hoarding-by-ameri...

7 hours agotikhonj

> Some people do put off buying cellphones and laptops when they know a new model will come out every year.

Don't confuse technical deflation with the Osborne effect:

> https://en.wikipedia.org/wiki/Osborne_effect

6 hours agoaleph_minus_one

The Osborne effect has been heavily disputed over the years; it says so in your own citation.

But at least with iPhones there is a deflationary effect, because Apple has, since the 3GS in 2009, kept the old phone around and reduced its price. For instance, my son wanted an iPhone 16 Plus. I told him to wait until the 17 was announced, and he bought one cheaper from T-Mobile.

5 hours agoraw_anon_1111
[deleted]
7 hours ago

> But you don't put off buying a computer because it will be cheaper next year.

Why not? Sounds like a pretty reasonable strategy.

> Nobody seems to be putting off buying GPUs

Many people doing exactly that.

7 hours agoRay20

Back when I first started wrestling with this issue, the question was "how much faster will the daily Photoshop operations be with a new computer?"

Now a new computer barely does anything faster for me.

5 hours agodetourdog

>> First, models getting better makes AI-based applications easier to build, because they can be simpler.

Don't conflate easy with simple. I'd argue they are actually easier and far more complex.

6 hours agoskeeter2020

Good explanation for why hiring stopped. If AI is improving rapidly why hire engineers now that you might not need in 6-12 months?

5 hours agoSevii

> Good explanation for why hiring stopped. If AI is improving rapidly why hire engineers now that you might not need in 6-12 months?

I'm not so sure that's the reason. I mean, to believe LLMs replace engineers, you first need to believe engineers spend the bulk of their time frantically churning out code in greenfield projects. That's not compatible with reality. Although LLMs excel at generating new code from scratch, that scenario is the exception. Introducing significant changes to existing projects still requires long iterations, which ultimately end up consuming more development time than actually rolling out the changes yourself.

The truth of the matter is that we are not observing an economic boom. The US is either stagnant or in a recession, and LLMs are not the cause. In an economic downturn you don't see spikes in demand for skilled workers.

5 hours agolocknitpicker

Hard agree with both points--this feels way closer to reality than most of what I've read.

On recession: cost of living is becoming crisis-level. I read recently that 67% of Americans are paycheck-to-paycheck. $150k/yr is $12.5k/month. If groceries go from $500 to $1,000/month, a $150k wage-earner saves less for retirement. For someone making $30-40k (basically minimum wage), it's a huge hit. Then consider that it's the same story for cars, housing, medical care... it goes on and on. It doesn't look "recessionary" because GDP keeps going up. But we're getting so much less for it with every passing year.

I also agree that we need to consider what brownfield dev looks like. It's where the vast majority of my time has gone over 15+ years in software and I'm not convinced all the coordination / sequencing / thinking will be assisted with LLMs. Particularly because they aren't trained on large proprietary codebases.

What we might both be missing is that for most people, writing the actual code is hard. LLMs help with that a lot. That's what a lot of junior/entry-level work actually is (not as much planning/thinking as seniors do).

2 hours agoeldavido

We don't hire juniors anymore

They are definitely not needed anymore.

The market is flooded with seniors.

So no problem there either.

2 hours agoBombthecat

> when prices go down instead of up. It is generally considered harmful: both because it is usually brought on by something really bad (like a severe economic contraction)

Or, you know, technological improvements that increase efficiency of production, or bountiful harvests, or generally anything else that suddenly expands the supply at the current price level across the economy. Thankfully, we have mechanisms in place that keep the prices inflating even when those unlikely events happen.

10 hours agoJoker_vD

Deflation is about all prices going down. Just a few decreasing is normal.

Anyway, WTF, economics communication has a huge problem. I've seen the article's explanation repeated in plenty of places; it's completely wrong and borderline nonsense.

The reason deflation is bad is not that it makes people postpone purchases. It's that some prices, like salaries or rent, simply refuse to go down. That causes rationing of those things.

9 hours agomarcosdumay

See "price stickiness" and what is simplified as "menu reprinting costs"; there's usually a cost associated with changing prices, and a cost associated with renegotiating prices for everything that's not being sold on a spot market. People cannot buy housing at spot, and while spot-labour pricing is definitely a thing for some services it's so socially destabilizing for anything skilled that most workforces operate on salary.

The reverse of this is that high inflation tends to cause a lot of strikes, because salaries refuse to go up and very high levels of inflation need salary repricing every month or even week.

7 hours agopjc50

In Argentina I've learned from a young age that prices take the elevator, but salaries take the stairs.

It got old really quick having to negotiate with the boss every 6 months.

7 hours agoigleria

Rent and salaries don't like going down because of debt. Debts are denominated in currency units and go up with inflation (interest rates have a component that corrects for inflation), but they don't decrease if the currency gains value over time (that would require negative interest rates). I suppose that's something that could be done with regulation.
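
A toy illustration of that asymmetry (all numbers invented): under mild deflation, a fixed nominal debt payment keeps growing in real terms, because the price level falls while the payment doesn't.

    # Real burden of a fixed 1000/month debt payment under 2% annual deflation
    payment, deflation = 1000.0, 0.02
    for year in (0, 5, 10):
        price_level = (1 - deflation) ** year       # general prices keep falling
        print(f"year {year}: payment is worth {payment / price_level:.0f} "
              f"in year-0 money")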

7 hours agoHPsquared

Yes. And even if people could refinance, debt values going down causes further deflation.

2 hours agomarcosdumay

Does it? Debt repayments are money deletion, so if debt is nominally written off, less has to be paid back. That is, there will be less "anti-money" in the system while the "money" is still there. That increases the money supply, and is therefore inflationary.

2 hours agoHPsquared

I agree. It's super common for the price of vegetables to go up and down around the year, in particular due to the harvest season.

9 hours agogus_massa

>It's because some prices, like salaries or rent just refuse to go down.

A common argument, but one that doesn't bear out in the absence of regulation enforcing it.

9 hours agojdasdf

One indicator of deflation that I'm watching for is money no longer being accepted. Things like social media followers or likes are used as substitutes. People can't buy things like concert tickets or the latest merchandise because the bots have already bought them all. Businesses will claim they can't provide a custom service. In tech, this would be spending a modest amount of money on online ads when in reality an ad campaign can only be "bought" with debt.

I think analyses around inflation, deflation, and consumer prices are valid, but they are part of an understanding from economies of 100 years ago. Money loses value when you can't do anything with it. Tech and AI run on debt, and an extraordinary amount of it. Is that really money? I don't think so.

Deflation may suffer from Goodhart's law. Because we've repurposed all available human resources to mitigate against it, the variables we used to measure it cease to be useful. Our central measures for the economy are things like the stock market and the unemployment rate, which face prevalent and valid criticisms that policy makers ignore. They truly don't indicate what's occurring on Main Street, and I'm afraid we could be in a deflationary spiral without knowing it.

2 hours agokittikitti

In the end, the article says:

> writing functioning application code has grown easier thanks to AI.

> It's getting easier and easier for startups to do stuff.

> Another answer might be to use the fact that software is becoming free and disposable to your advantage.

For me, the logical conclusion here is: don't build a software startup!

6 hours agoJacobiX

Yup. I'm starting to wonder if the startup space has a pretty big blind spot in not realizing that the ease of building mostly/semi-functioning software is not a unique advantage...

I left an AI startup to do tech consulting. What do I do? Build custom AI systems for clients. (Specifically clients that decided against going with startups' solutions.) Sometimes I build it for them, but I prefer to work with their own devs to teach them how to build it.

Fast forward 3+ years and we're going to see more everyday SMBs hiring a dev to just build them the stuff in-house that they were stuck paying vendors for. It won't happen everywhere. Painful enough problems and worthwhile enough solutions probably won't see much of a shift.

But startups that think the market will lap up whatever they have to offer as long as it looks and sounds slick may be in for a rude surprise.

5 hours agocootsnuck

Of course it still makes sense to have a startup. Not because you will ever find a decent enough market, but because if you are well connected enough you can find a VC and play with other people's money for a while.

You aren't doing it to get customers; it's for investors and maybe a decent acquisition.

5 hours agoraw_anon_1111

> Fast forward 3+ years and we're going to see more everyday SMBs hiring a dev to just build them the stuff in-house

I don't see this happening. Businesses generally want familiar tools that work reliably with predictable support patterns.

4 hours agoMangoToupe

"But building the same functionality has undoubtedly become simpler."

I disagree with this statement. It has become simpler, provided you don't care about it actually being correct, don't care whether your tests really test what you think you asked for, don't care about security, and so on.

Building the same thing involves doing the things that LLMs have proven time and again they cannot do. But instead of writing it properly in the first place, you now need to look for the needle in the haystack: the subtle bug that invariably gets inserted by LLMs every single time I've tried to use them. That requires you to deeply understand the code anyway, which you would have gotten automatically (and more easily) if you were the one writing the code in the first place. Developing the same thing at the same level of quality is harder with an LLM.

And the "table stakes" stuff is exactly what I would not trust an LLM with, because the risk of getting it wrong could potentially be fatal (to the company, not the dev, depending on the boss's temperament).

5 hours agovrighter

I think either you haven't used LLMs for coding in a while, or you're working on things they might still be limited on.

I've been able to use LLMs to build things in a weekend that I would not have been able to do in the past, without putting in months of serious effort.

I recently rewrote a project from scratch in a weekend that I had made a couple of years ago. In a single weekend I now have a better product than I did back then, when I spent maybe 20x the amount of time on it.

2 hours agopeab

And subtle bugs don’t get inserted by humans? Did security flaws in software just start happening after LLMs were introduced?

5 hours agoraw_anon_1111

So? The point is that humans do it much less often.

Let's say there are 10 subtasks that need to be done.

Let's say a human has 99% chance of getting each one of them right, by doing the proper testing etc. And let's say that the AI has a 95% chance of getting it right (being very generous here).

0.99^10 = a 90% chance of the human getting it to work properly. 0.95^10 = only a 60% chance. Almost a coin toss.

Even with a 98% success rate, the compounded success rate still drops to about 82%.
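
The compounding is easy to play with (the per-step rates are of course just illustrative guesses):

    # Chance that all 10 subtasks succeed, given a per-subtask success rate
    for per_step in (0.99, 0.98, 0.95):
        print(f"{per_step:.0%} per subtask -> {per_step ** 10:.0%} overall")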

The thing is that LLMs aren't just "a little bit" worse than humans. In comparison, they're cavemen.

5 hours agovrighter

So humans do it much less often, yet we have 30 years of evidence to the contrary? Humans still can't figure out how to write code that isn't subject to SQL injection after 25 years, or how to commit code to GitHub without exposing admin credentials.

4 hours agoraw_anon_1111

With regard to the first two paragraphs: it's crazy how someone can be so massively brainwashed.

5 hours agomoralestapia

There’s a fun version of this in futurist space travel speculation.

Let's say you have a fusion rocket and can hit 5% of the speed of light. You want to migrate to the stars for some reason.

So do you build a generational ship now, which is possible, or… do you wait?

Because if you build it now, someone with a much better drive may just fly right past you at 20% of the speed of light.

In this one, the answer is to plot it out under the assumption that there is no totally undiscovered major physics that would allow, say, FTL, and plot the curves of advancement against that.

So can we do this with software? We have the progress of hardware, which is somewhat deterministic, and we know something about the progress of software from stats we can gather via GitHub.

The software equivalent of someone discovering some “fantasy” physics and building a warp drive would be runaway self-improving AGI/ASI. I’d argue this is impossible for information theoretical reasons, but what if I’m wrong?
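
This is sometimes called the "wait calculation." A toy version (every number here is invented: a 20-light-year target, 5% of c available today, speeds improving 3%/year, capped at 20% of c to stand in for "no new physics"):

    # Toy wait calculation: later departures get faster drives, up to a cap
    DIST_LY, V0, GROWTH, V_MAX = 20.0, 0.05, 0.03, 0.20

    def arrival(depart_year: int) -> float:
        v = min(V_MAX, V0 * (1 + GROWTH) ** depart_year)  # speed available then
        return depart_year + DIST_LY / v                  # wait + travel time

    best = min(range(300), key=arrival)
    print(f"leave in year {best}, arrive in year {arrival(best):.0f}")

With these made-up numbers, waiting wins right up until the speed curve hits its cap; after that, every year of waiting is a year of pure delay.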

7 hours agoapi

Dang. I won't be able to sleep tonight

30 minutes agosebastianconcpt

Does anyone else agree with the premise of this article? Is it sensible to put off building things now because it will get even cheaper and faster later?

Maybe the time value of time is only increasing as we go.

9 hours agodarkerside

Actually, yes. I wanted to get into UI programming with GTK 2, and right now I'm waiting for GTK $n to stabilize so I can commit to it.

Knowing that GTK $n-1 will soon be obsolete is reason enough not to put effort into learning it.

9 hours agoblueflow

In general, we should focus more on what endures than on what changes. Focus less on the times and more on the eternities. People very often drown in the noise of passing fads, fashions, and ephemeral tech. Can you become really skilled at using some piece of tech? Sure. Is it worth becoming really skilled? It depends on the circumstances and the particular person, but in most cases, probably not. It is usually a waste of time (though given the kinds of SFVs that people publish and the hobbies people have, people are generally quite good at frittering away their lives on stupid shit).

Incidentally, this is how you can distinguish a good CS curriculum from a bad one. A good one focuses heavily on principles; the particular technical trappings are mostly just a medium, like Latin used to be in academia, now replaced by English. You pick up what you need to do the job.

an hour agolo_zamoyski

I think you're right. The author is quite wrong on many aspects, in my view. One of the central mistakes he makes is assuming that creating a profitable startup is mostly a matter of shipping a good product, i.e.

> Used to be, you had to find a customer in SO much pain that they'd settle for a point solution to their most painful problem, while you slowly built the rest of the stuff. Now, you can still do that one thing really well, but you can also quickly build a bunch of the table stakes features really fast, making it more of a no-brainer to adopt your product.

7 hours agobootsmann

The conclusion that you should wait to build anything is an illustration of the danger of economic deflation that the author started with. I'm not sure why he thinks the economic version is toxic but the technological version is a good idea, though.

The answer to "should we just sit around and wait for better technology" is obviously no. We gain a lot of knowledge by building with what we have; builders now inform where technology improves. (The front page has an article about Voyager being a light-day away...)

I think the more interesting question is what would happen if we induced some kind of 2% "technological inflation" - every year it gets harder to make anything. Would that push more orgs to build more things? Everyone pours everything they have into making products now because their resources will go less far next year.

9 hours agohahajk

> I think the more interesting question is what would happen if we induced some kind of 2% "technological inflation" - every year it gets harder to make anything. Would that push more orgs to build more things? Everyone pours everything they have into making products now because their resources will go less far next year.

Government bonds already do this for absolutely everything. If I can put my money in a guaranteed bond at X%/year then your startup that's a risky investment has to make much better returns to make it worth my while. That's why the stock market is always chasing growth.
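
A back-of-the-envelope version of that hurdle (all numbers invented):

    # A risky bet must beat the guaranteed bond in expectation
    risk_free, years, p_success = 0.04, 10, 0.10   # assumed yield, horizon, odds
    bond_multiple = (1 + risk_free) ** years
    required = bond_multiple / p_success           # break-even payoff if it works
    print(f"bond pays {bond_multiple:.2f}x; the startup must return "
          f"{required:.1f}x on success just to match it in expectation")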

6 hours agophilipallstar

I agree. I had several projects lined up, and I delayed one because it used the same tech as another, significantly smaller project. I learned the tech on the smaller, simpler project and then used that knowledge on the bigger one. It was beneficial not to do the bigger project first.

8 hours agodvh
[deleted]
7 hours ago

> it will get even cheaper and faster later

Yeah, and it will be done by somebody else. I think this is the main problem, and if you get rid of it, you have a completely sensible strategy. I mean, there are many government contractors who, through corrupt connections, can guarantee that work will be awarded to them, and they very often do just that.

7 hours agoRay20

> Desktop app.... though Electron and Tauri have made it easier

Ugh. I don't like that kind of "desktop" app. Huge bloat around a blip of actual app.

6 hours agoeinpoklum

"One of the main problems is that if people expect prices to keep going down, they'll delay purchases and save more, because they expect that they'll be able to get the stuff for less later."

That is why we are all waiting to buy our first personal computers and our first cell phones.

Economists have managed to be ludicrous for a very long time, and yet we still trust them.