There's clearly easy/irrational money distorting the markets here. Normally this wouldn't be a problem: prices would go up, supply would eventually increase and everybody would be okay. But with AI being massively subsidized by nation-states and investors, there's no price that is too high for these supplies.
Eventually the music will stop when the easy money runs out and we'll see how much people are truly willing to pay for AI.
Regardless of where demand comes from, it takes time to spin up a hard drive factory, and prices would have to rise enough that, as a producer, you would feel confident that a new hard drive factory will actually pay off. Conversely, if you feel the boom is irrational and temporary, as a producer you’d be quite wary of investing money in a new factory if there was a risk it would be producing into a glut in a few years.
I'll add that the GPU, CPU, storage, and RAM industries crashed in 2022 after a Covid-induced boom.[0]
Everything was cheap. Samsung sold SSDs at a loss that year.
TSMC and other suppliers did not invest as much in cap ex in 2022 and 2023 because of the crash.
Parts of the shortage today can be blamed on those years. Of course ChatGPT also launched in late 2022 and the rest is history.
[0] www.trendforce.com/presscenter/news/20221123-11467.html
If I remember correctly, during a previous GPU shortage (crypto?), Nvidia (and/or TSMC?) basically knew the music would stop and didn't want to be caught with its pants down after making the significant investments necessary to increase production.
Not to mention that without enough competition, you can just raise prices, which, uh (gestures at Nvidia GPU price trends...)
Similar thing happened with mask manufacturers during COVID.
They didn't spin up additional mask production b/c they knew the pandemic would eventually pass. They learned this lesson from SARS.
Not maxing out production during spikes (or seasonality) in demand is a key tenet of being a "rational economic actor".
I believe the TSMC CEO said that in a recent interview. They're aware that their now biggest customer Nvidia has a less broad product portfolio than Apple and the high volumes they buy probably won't last. It's too much of a risk to plan more fabs based on that.
They are indeed planning for more fabs, in order to meet volumes.
Last week: “TSMC's board approves $45 billion spending package on new fabs”
https://www.tomshardware.com/tech-industry/semiconductors/ts...
Silicon Valley is arguing that TSMC isn't investing enough. They should be investing hundreds of billions to build fabs, like how big tech is investing in the AI buildout.
$45 billion for new fabs is peanuts compared to Amazon's $200b and Google's $180b investment in 2026.
Can't really blame TSMC though. It takes years for fabs to go from plan to first wafer. By the time new fabs go online, demand might not be there. Who knows?
"Silicon Valley" doesn't get to make the decision unless they are willing to send some of those hundreds of billions to TSMC up front. (TSMC isn't going to want future promises of business either since those are worth very little.)
I don't disagree. I wrote the top comment here basically saying the same thing: https://news.ycombinator.com/item?id=46764223
If big tech prepays for the entire fab, I think TSMC would do it.
Somewhat ironically, the AI boom means Nvidia would've easily made their money back on that investment, and probably would have even more thoroughly owned the GPGPU space.
But as it is it's not like they made any bad decisions either.
> it takes time to spin up a hard drive factory
Very good.
Are these factories already running 24/7, such that labor can't be added to make more without adding capital infra?
And if they were running 24/7, maybe setting up another factory or line would avoid some of the 24/7 scheduling.
No, it’s not an easy fix. Manufacturers don’t have a good pulse on long-term demand. The capex to spin up a new manufacturing plant is significant. Especially with the recency of Covid, where some folks did get caught with their pants down and over-invested during the huge demand boom.
I don’t quite follow narratives like yours about nation states and investors. There is certainly an industrial bubble going on and lots of startups getting massive amounts of capital, but there is a strong signal that a good part of this demand is here to stay.
This will be one of those scenarios where some companies will look brilliant and others foolish.
Smart manufacturers will sell 'hard drive futures'. I.e. "Give us $100/drive now for 100k drives for delivery in March 2028".
These contracts are then transferrable. The manufacturer can start work on a factory knowing they'll get paid to produce the drives.
If the AI boom comes to an end, the manufacturer is still going to get paid for their factory, and if the AI company wants to recoup costs they could try to sell those contracts back to the manufacturer for pennies on the dollar, who might then decide (if it is more profitable) to halt work on the factory - and either way they make money.
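To make the "either way they make money" claim concrete, here's a minimal sketch with made-up numbers (the $100 prepay from above, an assumed $60 unit cost, and an assumed 10% buyback rate - none of these are real manufacturer terms):

```python
# Hypothetical prepaid, transferable hard-drive futures. The manufacturer is
# paid at signing, so demand risk sits with the buyer. Factory capex and the
# time value of money are omitted for simplicity.

PRICE_PER_DRIVE = 100         # dollars, paid up front
DRIVES_PER_CONTRACT = 100_000
UNIT_COST = 60                # assumed marginal cost to build one drive

def manufacturer_profit(contracts: int, boom_holds: bool,
                        buyback_rate: float = 0.10) -> int:
    """Profit in dollars under the two scenarios described above."""
    prepaid = contracts * DRIVES_PER_CONTRACT * PRICE_PER_DRIVE
    if boom_holds:
        # Buyers take delivery: build every contracted drive.
        return prepaid - contracts * DRIVES_PER_CONTRACT * UNIT_COST
    # Bust: buy the contracts back for pennies on the dollar, never build.
    return int(prepaid - prepaid * buyback_rate)

print(manufacturer_profit(10, boom_holds=True))   # 40,000,000
print(manufacturer_profit(10, boom_holds=False))  # 90,000,000
```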
Can you provide some solid examples of companies doing this in an industry with high capex? Yes, futures exist, but largely in commodity businesses. What you described sounds more like pre-purchase agreements, which already exist. To have a futures market you would need investors and a product that is more of a commodity, not something highly engineered.
You are also forgetting that the payback period on a plant is not a single year; it will be over many years, and most likely no buyer wants to arrange purchasing that far out.
I don’t see how what you described is grounded in reality, even for “smart manufacturers”.
There are futures markets for DRAM. Somewhat secretive (hard to find reliable price quotes) but they exist.
That only works out if there are enough investors willing to pay for those futures. If the new factory can make a billion drives but they only have 2 of those futures contracts sold (that is 200k drives), they don't build the factory. Remember too, if they sell those contracts they are on the hook to deliver - if it is just investors, they will accept the street value of 100k drives in 2028, but some of the people might be buyers demanding physical goods.
Every year a few farmers realize they are contracted to deliver more grain than they have in their bins and so have to buy some grain from someone else (often at a loss) just to deliver it. This isn't a common problem but it happens (most often the farmer is using their insurance payout to buy the grain - snip a very large essay on the complexities of this)
It's hard to increase long-run production capacity for what seems to be clearly a short-term spike in datacenter buildout. Even if AI itself is not much of a bubble, at some point spending on new AI facilities has to subside.
AI is going to be what fiber was to the dotcom bubble. Someone spends a lot of money on a lot of infrastructure, some of which is going to be incredibly useful, but it gets sold for much less than it cost to build. Hardware just depreciates much much faster than fiber networks.
I'm not saying that data center buildouts can't overshoot demand, but AI and compute is different than fiber buildout. The more compute you have, the smarter the AI. You can use the compute to let the AI think longer (maybe hours/days/weeks) on a solution. You can run multiple AI agents simultaneously and have them work together or check each other's work. You can train and run inference on better models with more compute.
So there is always use for more compute to solve problems.
Fiber installations can overshoot relatively easily. No matter how much fiber you have installed, that 4k movie isn't going to change. The 3 hours of watch time for consumers isn't going to change.
Did you pay attention in computer science classes? There are problems you can't simply brute-force. You can throw all the computing power you want at them, but they won't terminate before the heat-death of the universe. An LLM can only output a convolution of its data set. That's its plateau. It can't solve problems, it can only output an existing solution. Compute power can make it faster to narrow down to that existing solution, but it can't make the LLM smarter.
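To put numbers on the brute-force point (assuming, generously, a machine doing 1e18 operations per second):

```python
# Years needed to exhaust a 2^n search space at one exaflop (1e18 ops/s).
SECONDS_PER_YEAR = 3.15e7

def brute_force_years(n_bits: int, ops_per_sec: float = 1e18) -> float:
    return 2 ** n_bits / ops_per_sec / SECONDS_PER_YEAR

for n in (64, 100, 128):
    print(f"n={n}: {brute_force_years(n):.2e} years")
# n=64 falls in seconds, n=100 already takes ~40,000 years, and n=128
# takes ~1e13 years - roughly 800 times the age of the universe.
```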
Maybe LLMs can solve novel problems, maybe not. We don't know for sure. It's trending like it can.
There are still plenty of problems that more tokens would allow to be solved, and solved faster and better. There is absolutely no way we've already met AI compute demands for the problems that LLMs can solve today.
LLMs are considered Turing complete.
Not really. You can leverage randomness (and LLMs absolutely do) to generate bespoke solutions and then use known methods to verify them. I'm not saying LLMs are great at this, they are gimped by their inability to "save" what they learn, but we know that any kind of "new idea" is a function of random and deterministic processes mixed together in varying amounts.
Everything is either random, deterministic, or some shade of the two. Human brain "magic" included.
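A toy version of that generate-randomly-then-verify-deterministically loop (nothing LLM-specific, just the pattern the parent comment describes):

```python
import random

def verify(subset: list[int], target: int) -> bool:
    """Deterministic, well-understood check."""
    return sum(subset) == target

def propose_and_verify(numbers: list[int], target: int,
                       tries: int = 100_000) -> list[int] | None:
    """Random proposals filtered by a deterministic verifier."""
    for _ in range(tries):
        candidate = [n for n in numbers if random.random() < 0.5]
        if verify(candidate, target):
            return candidate
    return None

# Random generation finds a subset of these numbers summing to 9;
# the deterministic verifier is what makes the answer trustworthy.
print(propose_and_verify([3, 34, 4, 12, 5, 2], 9))
```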
You can't really use compute more because power is already the bottleneck. Datacenter buildouts are now being measured in GW which tells you everything you need to know. Newer hardware will be a lot more power-efficient but also highly scarce for that reason.
Energy is also being scaled up. But compute and fiber buildouts are fundamentally different, in my opinion.
Current shortages are exactly the result of fabs not wanting to commit extra capex due to overbuild risk, and inference demand seems to be growing 10x YoY; you've famously got 8-year-old TPUs at Google at 100% load.
> Hardware just depreciates much much faster than fiber
The manufacturing capacity expanded to meet the demand for new hardware doesn't (as much).
But if the demand drops for six months, the manufacturers are going to scale back production.
If it drops for a year, they're likely to start shedding capacity, one way or another.
This is not an equivalent situation. The vast, vast majority of what's being produced for this bubble is going to be waste once it pops.
This goes beyond profits. It will be important for national security.
Higher prices encourage more supply. Typically when you see an acute shortage, it's quickly followed by a glut as supply starts coming online in an over-correction.
These factories take years and massive amounts of money to build. That, and there are so few manufacturers now that they are far more likely to collude.
> Normally this wouldn't be a problem: prices would go up, supply would eventually increase and everybody would be okay.
This sounds like economic dogma based on pointing at some future equilibrium.
I like the saying that goes something like "life is what happens when you are waiting for the future". In the same way, it seems to me that equilibrium is increasingly less common for many of us.
Markets are dynamic systems, and there are sub-fields of economics that recognize this. The message doesn't always get out unfortunately.
> But with AI being massively subsidized by nation-states and investors, there's no price that is too high for these supplies.
This feels like more dogma: find a convenient scapegoat: governments.
Time to wake up to what history has shown us! Markets naturally reflect boom and bust cycles, irrationality of people, and various other market failures. None of these are news to competent economists, by the way. Be careful from whence you get your economic "analysis".
Yes, this is why the price of housing has dropped dramatically. The market stepped up and filled the demand, and now everyone can afford a place to live.
I can't tell if the comment above is sarcastic or serious: it could go either way.
I think AI companies are involving these other industries so when the money runs out they will claim the whole thing is too big to fail.
By buying flash and thus shifting demand to HDD? How does that work?
The article doesn't mention flash or HDD. It seems that all storage by WD is already sold.
My point is that directly or indirectly all hardware companies depend on memory and storage. If AI companies fall this could have repercussions to the whole industry.
Earlier, gamers got punished by crypto, and now they are being punished by AI.
GPUs before crypto had a lot less VRAM. Crypto investment funded a lot of stupid experiments, some of which did stick to the wall. I don't think gamers had their lives completely ruined by crypto in the end.
"Punished" implies a moral valence to the whole thing which isn't there. It's not like the AI companies were aware of gamers and set out to do this. You simply got run over, like everyone else in front of the trillion dollar bulldozer.
So what?
Why gamers must be the most important group?
Gamers are important because they are consistent customers. Crypto buying of GPUs is done (anyone still in this area is buying ASICs). Meanwhile gamers are still buying GPUs - they do sometimes hold off when the economy doesn't allow, but you can trust that gamers will continue to buy GPUs to play their games, and thus they are a safe investment. It is rational to sell GPUs to a gamer for much less than to someone in crypto, because the gamer will be back (even if the gamer "grows up" there are more replacing them). Thus gamers are an important group while crypto is not.
The above was their prediction during the crypto boom, and it turned out correct. I'm not sure how AI will turn out, but it isn't unreasonable to predict that AI will also move to dedicated chips (or die?) in a few years, thus making gamers more important, because gamers will be buying GPUs when this fad is over. Though of course, if AI turns out to be a constant demand for more/better GPUs long term, they are more important.
Gamers are not the only important GPU market. CAD comes to mind as another group that is a consistent demand for GPUs over the years. I know there are others, they are all important.
the "value" of nvidia to the "AI" companies is their tsmc fab contract
they don't need CUDA, they don't need the 10 years of weird game support, even the networking tech
they need none of nvidia's technology moats
exactly the same as with crypto, where they just needed to make an ASIC to pump out SHA-256 as quickly as possible
which is really, really easy if you have a fab contract
at which point their use of nvidia dropped to zero
I’d rather prefer that the average Joe has a good entertainment system than that our corporate overlords have a good surveillance system.
The growth curve of technology has always pointed at the world becoming tiny and non-private.
Disagree.
Mass surveillance by corporations can be outlawed. Just because something is possible, doesn’t mean it must be necessarily so.
I travel a lot for work to different nations. The cultural differences are stark.
In the UK for example, they love their CCTVs. In Switzerland, they’re only allowed where they are deemed necessary.
I like to imagine the reference in the movie Margin Call is that of a merry-go-round or a game of musical chairs. Like we are all on a ride, none of us are the operator, and all we can do is guess when the music will stop (and the ride ends).
The problem with this AI stuff is we don't know how much we will be willing to pay for it, as individuals, as businesses, as nations. I guess we just don't know how far this stuff will be useful. The reason for the high valuations, I'd guess, is that there is more value here than what we have tapped so far, right?
The revenues that Nvidia has reported are based on what we hope we will achieve in the future, so I guess the whole thing is speculation?
TBF, all financial markets are speculation these days; the only thing that changes is the figure/percentage of how much a share's price reflects its actual value.
> The problem with this AI stuff is we don't know how much we will be willing to pay for it, as individuals, as businesses, as nations. I guess we just don't know how far this stuff will be useful. The reason for the high valuations, I'd guess, is that there is more value here than what we have tapped so far, right?
I think the value now comes from how we make a product of it, for example, like OpenClaw. Whether we like it or not, AI is really expensive to train, not only in monetary value but also in resources, and the gains have been diminishing with each “generation”. Let's not forget we heard promises that have not been fulfilled, for example AGI or “AI could potentially cure cancer, with enough power”.
And if you've been watching, DeepMind has been making advances in medical sciences at a pretty damned fast rate. So "not fulfilled" is a pretty weak statement. The pipeline in medicine is very long.
And that's not even talking about the head-spinning rate at which robotics is advancing. The hardware we use for LLMs is also being used in robot simulation for hardware training, which gives results in hours that took weeks or months in the past.
> There's clearly easy/irrational money distorting the markets here.
No, I think it is real demand.
AI will cause shortages in everything from GPUs to CPUs, RAM, storage, networking, fiber, etc because of real demand. The physical world can't keep up with AI progress. Hence, shortages.
AI simply increases computer use by magnitudes. Now you can suddenly use Seedance 2.0 to make CGI that would have cost tens of millions 5 years ago for $5.[0] Everyone is going to need more disk space to store all those video files. Someone in their basement can make a full length movie limited only by imagination. The output quality keeps getting better quicker.
AI agents also drastically increase storage demands. Imagine financial companies using AI agents to search, scrape, organize data on stocks that they wouldn't have been able to do prior. Suddenly, disk storage and CPUs are in high demand for tasks like these.
I think the demand for computer hardware and networking gear is real and is only the beginning.
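Some back-of-envelope math on the video storage claim (the bitrate and the usage pattern are assumptions, purely for illustration):

```python
# Rough storage demand from generated video, assuming a ~8 Mbit/s
# 1080p-class encode. All figures are illustrative assumptions.

def video_gb(minutes: float, bitrate_mbit: float = 8.0) -> float:
    """Approximate size in GB of a clip at the assumed bitrate."""
    bits = bitrate_mbit * 1e6 * minutes * 60
    return bits / 8 / 1e9

print(f"{video_gb(120):.1f} GB")   # ~7.2 GB for one two-hour 'movie'

# If a million hobbyists each keep 10 hours of generated footage:
total_gb = 1_000_000 * video_gb(10 * 60)
print(f"{total_gb / 1e6:.0f} PB")  # ~36 PB, before any model/dataset storage
```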
As someone who is into AI, hardware, and investing, I've been investing in physical businesses based on the above hypothesis. The only durable moats will be compute, energy, and data.
> The only durable moats will be compute, energy, and data
"Compute" is capital investment; normal and comprehensible, but on a huge scale.
"Data" is .. stolen? That feels like a problem which has been dodged but will not remain solved forever, as everyone goes shields-up against the scrapers.
"Energy" was a serious global problem before AI. All economic growth is traded off against future global temperature increases to some extent, but this is even more acute in this electricity-intensive industry. How many degrees of temperature increase is worth one .. whatever the unit of AI gain-of-function is?
> How many degrees of temperature increase is worth one .. whatever the unit of AI gain-of-function is?
Billionaire. And they are definitely willing to make the trade.
one.. fully autonomous, self improving, replicating, general intelligence.
The question isn’t if the demand is real or not (supplies are low, so demand must exist). The question is if the demand curve has permanently shifted, or is this a short-term issue. No one builds new capacity in response to short term changes, because you’ll have difficulty recouping the capital expense.
If AI will permanently cause an increase in hard drives over the current growth curve, then WD, et al will build new capacity, increasing supply (and reducing costs). But this really isn’t something that is known at this point.
My post argues that the demand has permanently shifted.
By the way, plenty of people on HN and Reddit ask if the demand is real or not. They all think there's some collusion to keep the AI bubble going by all the companies. They don't believe AI is that useful today.
> My post argues that the demand has permanently shifted
The time horizon for this is murky at best. This is something you think, but can’t know. But, you’re putting money behind it, so if you’re right, you’ll make a good profit!
But for the larger companies (like WD), over building capacity can be a big problem. They can’t plan factory expansion based on what might be a short term bubble. That’s how companies go out of business. There is plenty to suggest that you’re right, that AI will cause permanently increased demand for computing/storage resources. Because it is useful and does consume and produce a lot of new data and media.
But I’m still skeptical.
The massive increase in spending can’t be sustainable. We can’t continue to feed the AI beast at this rate and still have other devices. Silicon wafer fabs can’t be built on demand and take time. SSD/HD factories take time. I think we are seeing an expansion to see who the big players will be in the next 3-5 years. Once that order has been established, then I think we will fall back to more sustainable rates of demand. This isn’t collusion, it’s just market dynamics at play in a common market. Sadly, we are all part of the same pool and so everything is expensive for all of us. At some point though, the AI money will dry up or get more expensive. Then I think we’ll see a reversion back to “normal” demand, maybe slightly elevated, but not the crazy jump we’ve seen for the past two years.
Us being in the same pool as AI is one of the potential risks pointed out by AI safety experts.
To use an analogy, imagine you're a small fluffy mammal that lives in fertile soils in open plains. Suddenly a bunch of humans show up with plows and till you and your environment under to grow crops.
Maybe the humans suddenly won't need crops any longer and you'll get your territory back. But if that doesn't happen and a paradigm change occurred you're in trouble.
AI can be useful today, while also being insanely overvalued, and a bubble.
There will be a bubble. It's inevitable.
The most important question is are we in 1994 or 2000 of the bubble for investors and suppliers like Samsung, WD, SK Hynix, TSMC.
What about 10 years from now? 15 years? Will AI provide more value in 2040 than in 2026? The internet ultimately provided far more value than even peak dotcom bubble thought.
I wonder if I'm alone in being optimistic about this. I believe that the gigantic inflow of money into hardware will lead to a large increase in production capabilities, accelerated progress, and perhaps even new, better architectures.
I actually agree: a spike in prices due to bumping against capacity limits is way better than a downturn in the market. But this is only really true if AI hyperscalers are incented to space out their big buildouts over time (while raising their prices enough to ration current demand) so that suppliers can have some guarantee that their expanded capacity will be used.
This fact never ceases to amaze me. It's so cool how relentlessly AI is pushing the horizons of our current hardware!
Maybe now we will start to see "optical" CPUs become a thing. Or 3D disk storage, or other groundbreaking technology.
Optical interconnect in the rack is a thing already. It's just a matter of time until it moves to single-PCB scale. And most persistent memories (competing with DDR memory for speed, and with far lower wearout than NAND) are of the "3D storage" type.
AI's output is not reproducible. It's a disaster.
If we want reproducible output we already have conventional software. Stop using a hammer on screws.
So it's like humans then
This is wrong for all LLMs which have a temperature setting.
And even if they were guaranteed to be non-deterministic, there is still lots of value in many aspects of content generation.
Better stock up on used laptops. I'm going to buy another one this year. Those used ones usually don't last very long.
What if in the near future it is simply too expensive to own "personal" computers? What if you can no longer buy used computers from official channels but have to find local shops or sharpen up on soldering skills and find parts from dumps? The big techs will conveniently "rent out" cloud computers for us to use, in exchange for all of your data.
"Don't you all have cellphones?"
Bingo. A number of corporate interests don't want to let you own your personal computers for different reasons. Google/Apple wants you to get locked down devices, and cloud/AI providers want you to use their services from a weak client.
I think about this too. There are several headwinds. Rent-seeking and collapse of economies of scale in the consumer sector for sure, but also I feel like we've basically peaked in hardware's ability to meet routine needs.
Once the phone makers realize that they can sell phones and docking stations to businesses, because 90% of knowledge work seems to happen in a web browser through one SaaS or another, I think personal computers will be cooked.
I have 3 old employer laptops and my personal gaming laptop, which I use for work now. I'm happy about this now ;)
I probably will only need to return newest laptop if I leave the company.
I have heard that you can get used laptops. But they do not come with memory or SSD anymore... As even used components are now valuable enough to be removed and sold.
In a lot of cases, owners remove the storage not because it has any value but rather they don't want to risk making a mistake letting a device go that still has data on it.
Also pulling and shredding hard drives is cheaper than paying someone to run DBAN or equivalent (which can take many hours to complete on a large drive), and there's no easy way to securely erase an SSD if it wasn't encrypted from the beginning.
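The "many hours" part checks out with simple arithmetic (assuming a constant sequential write speed, which is optimistic since HDDs slow down on inner tracks):

```python
# Hours for a single full overwrite pass at an assumed constant write speed.

def wipe_hours(capacity_tb: float, write_mb_per_s: float = 250.0) -> float:
    return capacity_tb * 1e6 / write_mb_per_s / 3600

for tb in (4, 12, 24):
    print(f"{tb} TB: ~{wipe_hours(tb):.0f} h per pass")
# 4 TB: ~4 h, 12 TB: ~13 h, 24 TB: ~27 h. Multi-pass wipes multiply that,
# which is why shredding the drive is often the cheaper option.
```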
Many laptops from last few years have soldered memory. Your previous laptop's SSD can also be reused, since those don't die that quickly compared to the laptop.
Or worse, they have memory and SSD soldered on board, and are broken, so you have to learn soldering skills too.
Damn really? One of my go-to moves when helping small political campaigns is to buy like a 2015 MBP and turn it into a locally hosted server to run all their stuff on the cheap.
Older MacBooks with socketed storage may actually be exempted because they use a proprietary connector instead of standard m.2.
Non state of the art lithography is pretty much commoditized (DDR3 & DDR4) so we will always have compute, although slower.
Eh, if this demand is really sustainable they will eventually start producing in adequate volume
I no longer feel obligated to apologize for holding on to older devices for a long time. I have several desktops and laptops that are all still usable.
Are these the picks and shovels?
Is the profitability of these electronics manufacturers more likely than that of the companies buying up all their future inventory?
If AI continues at this trajectory, sure, likely to the picks and shovels.
If AI has a bubble burst, you could see a lot of used hardware flood the market and then companies like WD could have a hard time selling against their previous inventory.
The problem is more likely that companies like WD don't know if this will be a bubble or not. Currently they can milk the market by raising their prices and just rely on their current production facilities, maybe expand a little. If there's going to be a crash, then it's better to have raised the price, even if just temporarily, rather than being left standing with excessive production capacity.
If it's long term, it would be better to be the front runner on additional capacity, but that's assuming continuous growth. If it all comes down, or even just plateaus, it's better to simply raise prices.
Given how hard AI is on I/O, I wonder what hardware might actually go second hand when this winds down. I don't see hard drives going second hand. Most hardware that we'd get might be worn beyond redeeming even at a free price.
I don't think the HDDs are being used for any intensive loads. They have too much latency for most of that. It's probably just archival storage for their scraped content and generated slop.
For "cold" archival storage you would want to use tape, which is far cheaper per TB at scale.
I don't mean that type of archive, but rather "just in case" data like "last month's scrape of this website" after we scraped it 5 more times this month or higher resolution versions of book scans. You might want to still be able to dump it out quickly if you need it. Money is no object for these companies and the cost of HDDs is more than low enough for the flexibility they provide.
If demand for hard drives is this high then it sounds like there wouldn't be near enough tape around either.
HDDs, RAM, and chips have so many health metrics and methods that you really shouldn't be afraid to buy them used. The only special requirement is a RasPi test rig. That and a 30 day return window.
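For spinning drives, most of those health metrics are one smartctl call away (from smartmontools). A minimal sketch of the kind of triage a test rig might run (the attribute names are standard ATA SMART fields; run with appropriate privileges):

```python
# Quick used-HDD triage with smartmontools. Any nonzero raw value on these
# attributes is a reason to haggle or walk away.
import subprocess

RED_FLAGS = ("Reallocated_Sector_Ct", "Current_Pending_Sector",
             "Offline_Uncorrectable")

def drive_looks_ok(device: str) -> bool:
    """Return False if any red-flag attribute has a nonzero raw value."""
    out = subprocess.run(["smartctl", "-A", device],
                         capture_output=True, text=True).stdout
    for line in out.splitlines():
        fields = line.split()
        if len(fields) >= 10 and fields[1] in RED_FLAGS and fields[-1] != "0":
            return False
    return True

print(drive_looks_ok("/dev/sda"))  # hypothetical device path
```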
I feel like the goal is to avoid buying something broken in the first place, not just to be able to tell if you've bought something that turns out to be broken
What are companies needing all of these hard drives for? I understand their need for memory and boot drives. But storing text training data and text conversations isn't that space intensive. There are a few companies doing video models, so I can see how that takes a tremendous amount of space. Is it just that?
Hearing about their scraping practices, it might be that they are storing the same data over and over and over again. And then yes, audio and video is likely something they are planning for or already gathering.
And if they produce a lot of video, they might keep copies around.
All the latest general purpose models are multimodal (except DeepSeek, I think). Transfer learning allows them to improve results even after they've exhausted all the text on the internet.
Storing training data: for example, Anthropic bought millions of second hand books and scanned them.
I think the somewhat hallucinatory canned response is that they distribute data across drives for a massive throughput. Though idk if that even technically makes sense...
I am surprised by that too. I thought everyone moved to SSDs or NVMe?
I was toying with getting a 2T HDD for a BSD system I have, I guess not now :)
Everyone moved to SSDs or NVMe. If you're right, that includes manufacturers. HDDs still have advantages over SSDs for specific needs, like more reliable long-term unelectrified storage. It's also possible that the high price of SSDs made HDDs an option again.
Really, if you're writing large solid files, hard drives aren't that bad. If you can have the system split out one file per drive at a time, then you'll avoid a lot of the fragmentation.
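A sketch of that one-whole-file-per-drive placement (the mount points are hypothetical), as opposed to striping every file across all spindles:

```python
# Round-robin whole files across drives so each file is one long sequential
# write to a single spindle, instead of interleaved writes that fragment.
import itertools

DRIVES = ["/mnt/disk0", "/mnt/disk1", "/mnt/disk2"]  # hypothetical mounts
_cycle = itertools.cycle(DRIVES)

def place(filename: str) -> str:
    """Full path the next file should be written to."""
    return f"{next(_cycle)}/{filename}"

for name in ("shard-00.tar", "shard-01.tar", "shard-02.tar", "shard-03.tar"):
    print(place(name))  # disk0, disk1, disk2, then back to disk0
```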
This is the consequence of "I don't want to write this function myself, I'll get the plagiarism machine to do it for me"
And what's wrong with not wanting to write functions yourself? It is a perfectly reasonable thing, and in some cases (ex: crypto), rolling your own is strongly discouraged. That's the reason why libraries exist; you don't want to implement your own associative array every time your work needs it, do you?
As for plagiarism, it is not something to even consider when writing code, unless your code is an art project. If someone else's code does the job better than yours, that's the code you should use; you are not trying to be original, you are trying to make a working product. There is the problem of intellectual property laws, but it is narrower than plagiarism. For instance, writing an open source drop-in replacement for some proprietary software is common practice; it is legal and often celebrated as long as it doesn't contain the original software's code. In art, it would be plagiarism.
Copyright laundering is a problem though, and AI is very resource intensive for a result of dubious quality sometimes. But that just shows that it is not a good enough "plagiarism machine", not that using a "plagiarism machine" is wrong.
If I use a package for crypto stuff, it will generally be listed as part of the project, in an include or similar, so you can see who actually wrote the code. If you get an LLM to create it, it will write some "new original code" for you, with no ability to tell you any of the names of the people whose code went into that, and who did not give their consent for it to be mangled into the algorithm.
If I copy work from someone else, whether that be a paragraph of writing, a code block or art, and do not credit them, passing it off as my own creation, that's plagiarism. If the plagiarism machine can give proper attribution and context, it's not a plagiarism machine anymore, but given the incredibly lossy nature of LLMs, I don't foresee that happening. A search engine is different, as it provides attribution for the content it's giving you (ignoring the "AI summary" that is often included now). If you go to my website and copy code from me, you know where the code came from, because you got it from my website.
Why is "plagiarism" "bad"?
Modern society seems to assume any work by a person is due to that person alone, and credits that person only. But we know that is not the case. Any work by an author is the culmination of a series of contributions, perhaps not to the work directly, but often to the author, giving them the proper background and environment to do the work. The author is simply one that built upon the aggregate knowledge in the world and added a small bit of their own ideas.
I think it is bad taste to pass another's work as your own, and I believe people should be economically compensated for creating art and generating ideas, but I do not believe people are entitled to claim any "ownership" of ideas. IMHO, it is grossly egoistic.
Sure, you can't claim ownership of ideas, but if you verbatim repeat other people's content as if it is your own, and are unable to attribute it to its original creator, is that not a bit shitty? That's what LLMs are doing
I honestly think it's not that simple.
The ones who spend billions on integrating public cloud LLM services are not the ones writing that function. They are managers who, based on data pulled out of thin air, say "your goal for this year is to increase productivity by X% with AI, while staffing goes slightly down".
I have to watch AI generated avatars on the most boring topics imaginable, because the only "documentation" and link to actual answer is in a form of fake person talking. And this is encouraged!
Then the only measure of success is either AI services adoption (team count), or sales data.
That is the real tragedy and the real scale - big companies pushing (external!) AI services without even proof that it justifies the cost alone. Smooth talking around any other metric (or the lack of it).
It's still absolutely fascinating to me that basically the whole modern tech industry and the economic growth from it rests on the shoulders of a single company that has all of their important factories on a single island that's under constant threat of invasion. On top of that they themselves are reliant on a single company that's able to produce the machine required to print the wafers.
I don't know if TSMC has anything to do with hard drive production, but the reliance on very few players is also a problem in that industry.
Investors love a monopoly, and establishing this required more than a trillion dollars of investment sustained over a couple decades.
> Investors love a monopoly...
Indeed, investors left to their own devices act in this way. Underlying such a single point-of-failure is an implied but immense hope and thus pressure for stability. I wonder what the prediction markets are saying about current levels of geopolitical stability in Taiwan?
> Indeed, investors left to their own devices act in this way.
Interesting. Capitalism is often touted to be more decentralized than socialism, but this is an example of how it can centralize.
It's only this way because the American ruling class would rather ship jobs overseas to increase their wealth than competently establish an industrial sector that would pay good wages to average people.
Turns out letting a bunch of MBAs plan your economy is extremely foolish.
Yeah, this is slowing down growth and profits. The AI hype is sucking everything dry, from HVAC services to hardware.
Great. I’ve just returned a WD drive to Amazon after it arrived crushed in a torn-open paper bag.
The replacement arrived also in a paper bag and went straight back, this time for a refund.
I guess I should have kept that one and hoped for the best.
Good alternatives? I’ve only recently been enlightened on how profoundly sh__ty SSD is for long-term storage and I have a whole lot of images my parents took traveling the last few years of their lives.
I'm sure Amazon isn't the only shop that delivers to your area
The premise of this news is that prices are going to climb and availability is going to drop.
And I’m not keen on having anyone ship me one of these anymore.
Walmart sells what appears to be an older version of the drive and I might have to cross my fingers and just get one of those.
> And I’m not keen on having anyone ship me one of these anymore.
Isn't that what you're doing ordering off amazon with their comingled inventory?
Besides, there's a spectrum of sellers between "Amazon" and "anybody", you can even, perhaps, purchase directly from the manufacturer.
I meant that after the Amazon experience, I don't want to buy a HDD online. Would much prefer to get it locally in person.
I was recently involved in a large server purchase for work, where we wanted 72 hard drives of 24TB each for a server. They were available last year, but last month the largest we could get were 20TB drives.
My machine was built up from parts in 2014.
6-7 years ago when GPU prices went up, I hoped nothing would break. Last year when RAM prices went up I did the same. Now with drive prices going up, it's the same thing.
It's interesting because I've always built mid-tier machines over the years and it was in the neighborhood of ~$700 at the time. Now the same thing is almost double that, but the performance is nowhere near twice as good for general computer usage.
This is all basically a textbook example of irrational market decisions. There’s clearly a bubble and not enough money coming in to pay for the AI bonanza.
It’s like building materials being in short supply when there’s obviously more houses than buyers. That’s just masked at the moment because of all the capital being pumped in to cover for the lack of actual revenue to pay for everything. The structural mismatch at the moment is gigantic, and the markets are getting increasingly impatient waiting for the revenue to materialize.
Mark this post… in a few years folks will be coming up with creative ideas for the cheap storage and GPUs flooding the market after folks pick up the pieces of imploded AI companies.
(For the record, I’m a huge fan of AI, but that doesn’t mean I don’t also think a giant business and financial bubble is about to implode).
More reckless and irresponsible than irrational.
> in a few years folks will be coming up with creative ideas for cheap storage and GPUs flooding the market
COVID was six years ago. In that time, GPU prices haven't gone down (and really have only increased). Count me skeptical that there will be a flood of cheap components.
I feel like the most recent time you could reasonably get an nvidia *80 GPU at the store for a normal amount of money was almost a decade ago.
Is there an industrial bubble? Probably.
> It’s building materials being in short supply when there’s obviously more houses than buyers.
That I think is a hard one to prove and is where folks are figuring it out. There is obvious continued demand and certainly a portion of it is from other startups spending money. I don’t think it’s obvious though where we are at.
I'm not against subsidies, but the concentration is a problem. This money could have spurred grass-roots participation in these emerging industries, but instead they chose to go with the most heinous of monocultures, leaving billions of people out of the loop.
Not only storage, cheapest 32 GB RAM that I can find is around 200 euros.
That's actually a bargain, average market price (though highly volatile) is more than double that.
They're probably looking at DDR4.
It's interesting to see claims here that spending is irrational, but actually, even if AI improvements slow down, it's more rational for companies to spend more and underutilize the machines than to underspend and get disrupted.
On the other hand, lots of people here are even more uncomfortable with the other option, which is quite possible: AI software algorithms may scale better than the capacity of the companies that make the hardware. Personally, I think hardware is the harder of the two to scale, and this is just the beginning.
It started with RAM; now with hard drives and SSDs. This is not looking good. But at least you can buy used ones for a pretty good price, for now.
I console myself with knowledge of the economics maxim that every supply shortage is usually, eventually, followed by a supply glut.
One can only hope that that's the principle at work here, anyway. It could also be a critically damped system for all I know. Unfortunately I studied control systems too...
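That maxim is basically the textbook cobweb model, and the control-theory framing fits it well; here's a minimal sketch (elasticities are made up) of how lagged supply response produces exactly the shortage-then-glut ringing, damped or not depending on the slopes:

```python
# Cobweb model: producers set this period's quantity from LAST period's
# price, so supply lags demand and price oscillates. Slopes are made up.

def price_path(supply_slope: float, demand_slope: float = 1.0,
               p0: float = 120.0, periods: int = 8) -> list[float]:
    """q_t = supply_slope * p_{t-1}; market clears at p_t = 200 - demand_slope * q_t."""
    prices, p = [], p0
    for _ in range(periods):
        q = supply_slope * p        # producers chase yesterday's price
        p = 200 - demand_slope * q  # today's market-clearing price
        prices.append(round(p, 1))
    return prices

print(price_path(0.8))  # damped: shortage -> glut -> smaller shortage...
print(price_path(1.2))  # unstable: each over-correction is bigger
```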
If storage and memory manufacturers don't respond by increasing supply, there might not be a glut. Just postponed demand that will slowly get fulfilled over a longer period. That is, if we were in a steady state.
On the other hand, if there is bigger economic turmoil, that might mean the postponed demand does not realise, as there is no purchasing power...
I was thinking that until my NAS gave me an error on one of my hard drives; now I'm in the market for a replacement while I still have redundancy.
People with a control theory background are welcome in economics; the field is more diverse than some would recognize. Certain professions and subfields are more open than others. There are plenty of economists who care about things like resilience and dampening shocks.
I would love if more non-traditional economists got involved in the public sphere by which I mean: writing about economic trends, public policy, regulation, rate-adjustment, etc.
As an engineer with a passing control theory background and a breadth of general knowledge, I'd love to explore this space more and find a way to apply my knowledge and share the results. Are there any particular problems you think well-suited to this treatment?
If you have a policy area you like you might start there. From my lens, here are some interesting ways to look at political economy from a broader point of view: economic disruption from AI (could be from energy prices, labor substitution, and lots more), climate modeling and its impacts on economies, conservation and ecosystem stability, and economic growth under different levels of inequality. I would add this to the mix even though it isn't a typical economic area: geopolitical destabilization from autonomous weapons, both physical and cyber.
Those are definitely all areas of interest for me as well. Thanks for the pointers. Do you write anywhere?
Presumably they're also looking to increase production capacity as fast as possible - within the year?
I'd have thought HDDs aren't at the top of the list for AI requirements, are other component manufacturers struggling even more to meet demand?
Why would they?
If we weren’t talking about AI, was there another high demand sector / customer for spinning platters?
And their margins get fat now that supply is relatively constant but AI demand has saturated their current production numbers.
I listed some hard drives on Friday on eBay.. most of them refurbished... within 5 minutes got a message from a person who wanted them all... shipped them an hour later
This is getting ridiculous. Never before has an unwanted product been thrust so forcefully and artificially into the market that it disrupts the supply line for real products with actual demand.
I am happy I bought 5x10TB drives two months ago, anticipating this exact scenario.
Take a look at prices of SSDs and RAM too.
I built a new server this time last year. My board does 6 channel RAM so I bought 6x32GB ECC DDR5. $160 a stick at the time. Just for grins I looked up the same product number at the same supplier I originally bought from. $1300 apiece. One of the VMs running on that server is TrueNAS, with 4 20TB WD Red Pros. God help me if I have to replace a drive.
Best Buy is actively selling 2x8GB sticks of DDR4 3200 for $80 a stick. I was floored. Ten bucks a gig, $160 for the pack.
We're fucking doomed.
Ten bucks a gig is lower than what some DDR5 memory is selling at.
Perhaps there is an incentive to go back to OS that can operate with 640KB RAM ... /s
I bought 6x refurbished Ultrastars for ~$100/ea Black Friday 2024. They were over $200/ea in 2025. Samsung T7 (and Shield) SSDs have 2x-3x'd. Can't get 1TB for less than like $180 right now. It's ridiculous.
Bought a few 2TB T7 shield disks last year before the boom. Thank fuck I did it then.
Is this for NVMe only or spinning drives too? I use both, but I actually have use cases for HDDs and hope those are less affected.
It’s affecting both. HDD maybe slightly less/slower, but you’re paying significantly more than six months ago in any case.
This particular news is for spinning drives, the other types we already had news about upcoming shortages earlier on.
All I know is I saw most of my go-to refurbished enterprise HDD’s 2-3x during Black Friday a few months ago compared to a year prior.
sighs at her local backup drive that just gave up the ghost
thanks, AI-boosting motherfuckers, thanks a lot
Rotating or SSD?
Do they really think they will get some money from the AI ponzi scheme?
Well, at least they might still have a product to sell once the AI bubble pops, unlike NVIDIA, which does seem to have kinda forgotten to design new consumer GPUs after getting high on AI money.
They haven't forgotten, they've expressly decided to soft-pivot away from consumer GPUs. RTX 60x0 series is apparently coming in 2018… (oops, 2028. No time travel involved. Probably). If the bubble has burst by then.
> RTX 60x0 series is apparently coming in 2018
That's either a typo, or NVidia has achieved some previously unheard of levels of innovation.
They're hedging on LLMs inventing time travel any day now.
> "apparently coming in 2018… maybe. If the bubble has burst by then."
Spoiler from the future: it hasn't. Get your investments in while you have time.
Good luck to everyone. Hope you made some reserve.
Yes, AI is nice, but I also like to be able to buy some RAM and drives…
The future is thin clients for everyone, requiring a minimal amount of RAM and storage because all they are is a glorified ChatGPT interface.
I'm running multiple services such as Forgejo, Audiobookshelf, Castopod and they all need no more than roughly 100 MB RAM.
There is one exception though. Open WebUI with a whopping 960 MB. It's literally a ChatGPT interface. I'm only using external API providers. No local models running.
Meanwhile my website that runs via my own Wordpress-like software written in Rust [1] requires only a few MB of RAM, so it's possible.
You know what the sad part is? I don't think software developers or LLMs know how, or want, to make low-resource-consumption software that runs on a thin client anymore. It will be some browser-based thing capping to whatever memory is available on the system.
It won't last. If the demand is sustained then new factories will open up and drive the price down.
More likely a couple of big financing wobbles lead to a fire sale.
It isn't practical for HDD supply to be wedged because in 5 years the disks start failing.
Even if the AI bubble bursts, having successfully cornered the compute market they can just go rent seeking instead by renting out cloud workstations, given that they've made the hardware to build a workstation yourself unaffordable.
Does that only include SSDs, or does it include HDDs as well?
It includes all forms of storage except USB devices, as well as GPUs and high-end CPUs. The latter you can still get, but you're going to have some severe sticker shock.
Maybe shucking USB HDDs is the short-term answer.
Is that still possible? Aren't they native USB with no adapter?
Those drives are SATA inside the case.
That depends on the brand. The lower priced brands, yes, those can be SATA, the more vertically integrated companies also make custom PCBs that just have USB-C without any SATA interface exposed internally.
It's probably feasible to make a "mass storage USB in, SATA protocol out" smart adapter board.
I see, but if you plan on shucking you obviously get ones you know are able to.
I read it as both, but UK suppliers have stock of various SATA HDDs available in large and small sizes. It's hard to say if prices will rocket or availability decline, or both. I don't normally advocate panic-buying, but if it's needed now is the time. I have one NAS spare on hand, I don't want or need a drawer full of them, but it'll be a royal pain if I do and can't get parts.
Lower performance/capacity consumer drives might be comparatively safe because there's Chinese end-to-end production capacity for those. Of course the price can still increase, but probably not that much.
There's clearly easy/irrational money distorting the markets here. Normally this wouldn't be a problem: prices would go up, supply would eventually increase and everybody would be okay. But with AI being massively subsidized by nation-states and investors, there's no price that is too high for these supplies.
Eventually the music will stop when the easy money runs out and we'll see how much people are truly willing to pay for AI.
Regardless where demand comes from, it takes time to spin up a hard drive factory, and prices would have to rise enough that, as a producer, you would feel confident that a new hard drive factory will actually pay off. Conversely, if you feel that boom is irrational and temporary, as a producer you’d be quite wary of investing money in a new factory if there was a risk it would be producing into a glut in a few years.
I'll add that the GPU, CPU, storage, and RAM industries crashed in 2022 after a Covid-induced boom.[0]
Everything was cheap. Samsung sold SSDs at a loss that year.
TSMC and other suppliers did not invest as much in cap ex in 2022 and 2023 because of the crash.
Parts of the shortage today can be blamed by those years. Of course ChatGPT also launched in late 2022 and the rest is history.
[0]www.trendforce.com/presscenter/news/20221123-11467.html
If I remember during a previous GPU shortage (crypto?), Nvidia (and/or TSMC?) basically knew the music would stop and didn't want to be caught with its pants down after making the significant investments necessary to increase production
Not to mention that without enough competition, you can just raise prices, which, uh (gestures at Nvidia GPU price trends...)
Similar thing happened with mask manufacturers during COVID.
They didn't spin up additional mask production b/c they knew the pandemic would eventually pass. They learned this lesson from SARS.
Not maxing out production during spikes (or seasonality) in demand is a key tenet of being a "rational economic actor".
I believe the TSMC CEO said that in a recent interview. They're aware that their now biggest customer Nvidia has a less broad product portfolio than Apple and the high volumes they buy propably won't last. It's too much of a risk to plan more Fabs based on that.
They are indeed planning for more fabs, in order to meet volumes.
Last week: “TSMC's board approves $45 billion spending package on new fabs”
https://www.tomshardware.com/tech-industry/semiconductors/ts...
Silicon Valley is arguing that TSMC isn't investing enough. They should be investing hundreds of billions to build fabs, like how big tech is investing in the AI buildout.
$45 billion for new fabs is peanuts compared to Amazon's $200b and Google's $180b investment in 2026.
Can't really blame TSMC though. It takes years for fabs to go from plan to first wafer. By the time new fabs go online, demand might not be there. Who knows?
"Silicon Valley" doesn't get to make the decision unless they are willing to send some of those hundreds of billions to TSMC up front. (TSMC isn't going to want future promises of business either since those are worth very little.)
I don't disagree. I wrote the top comment here basically saying the same thing: https://news.ycombinator.com/item?id=46764223
If big tech prepays for the entire fab, I think TSMC would do it.
Somewhat ironically the AI boom means Nvidia would've easily made their money back on that investment though and probably even more thoroughly owned the GPGPU space.
But as it is it's not like they made any bad decisions either.
> it takes time to spin up a hard drive factory
Very good.
Are these factories already running 24/7 that labor can't be added to make more without adding capital infra?
And if they were running 24/7, maybe setting up another factory or line will avoid some of the 24/7 scheduling.
No it’s not an easy fix. Manufacturers don’t have a good pulse on long term demand. The he capex to spin up a new manufacturing plant is significant. Especially with the recency of Covid where some folks did get caught with their pants down and over invested during the huge demand boom.
I don’t quite follow the narrative like yours about nation states and investors. There is certainly an industrial bubble going on and lots of startups getting massive amounts of capital but I here is a strong signal that a good part of this demand is here to stay.
This will be one of those scenarios where some companies will look brilliant and others foolish.
Smart manufacturers will sell 'hard drive futures'. Ie. "Give us $100/drive now for 100k drives for delivery in march 2028".
These contracts are then transferrable. The manufacturer can start work on a factory knowing they'll get paid to produce the drives.
If the AI boom comes to an end, the manufacturer is still going to get paid for their factory, and if the AI company wants to recoup costs they could try to sell those contracts back to the manufacturer for pennies on the dollar, who might then decide (if it is more profitable) to halt work on the factory - and either way they make money.
Can you provide some solid examples of companies doing this in an industry with high capex? Yes futures exist but largely in commodity businesses. Because what you described sounds more like pre-purchase agreements which already exist. To have a futures market you would need investors and a product that is more of a commodity and not something highly engineered.
You are also forgetting that the payback period on a plant is not a single year, it will be over many years and most likely no buyer is wanting to arrange purchasing that far out.
I don’t see how what you described sounds is set in reality even for “smart manufacturers”.
There are futures markets for DRAM. Somewhat secretive (hard to find reliable price quotes) but they exist.
That only works out if there are enough investors willing to pay for those futures. If the new factory can make a billion drives but they only have 2 of those futures contracts sold (that is 200k drives) they don't build the factory. Remember too if they sell those contacts they are on the hook to deliver - if it is just investors they will accept the street value of 100k drives in 2028 but some of the people might be buyers demanding physical goods.
Every year a few farmers realize they are contracted to deliver more grain than they have in their bins and so have to buy some grain from someone else (often at a loss) just to deliver it. This isn't a common problem but it happens (most often the farmer is using their insurance payout to buy the grain - snip a very large essay on the complexities of this)
It's hard to increase long-run production capacity for what seems to be clearly a short-term spike in datacenter buildout. Even if AI itself is not much of a bubble, at some point spending on new AI facilities has to subside.
AI is going to be what fiber was to the dotcom bubble. Someone spend a lot of money on a lot of infrastructure, some of which is going to be incredibly useful, but sold for much less than it cost to build. Hardware just depreciates much much faster than fiber networks.
I'm not saying that data center buildouts can't overshoot demand but AI and compute is different than fiber buildout. The more compute you have, the smarter the AI. You can use the compute to let the AI think longer (maybe hours/days/weeks) on a solution. You can run multiple AI agents simultaneously and have them work together or check each other's work. You can train and inference better models with more compute.
So there is always use for more compute to solve problems.
Fiber installations can overshoot relatively easily. No matter how much fiber you have installed, that 4k movie isn't going to change. The 3 hours of watch time for consumers isn't going to change.
Did you pay attention in computer science classes? There are problems you can't simply brute-force. You can throw all the computing power you want at them, but they won't terminate before the heat-death of the universe. An LLM can only output a convolution of its data set. That's its plateau. It can't solve problems, it can only output an existing solution. Compute power can make it faster to narrow down to that existing solution, but it can't make the LLM smarter.
Maybe LLMs can solve novel problems, maybe not. We don't know for sure. It's trending like it can.
There are still plenty of problems that more tokens would allow to be solved, and solved faster and better. There is absolutely no way we've already met AI compute demands for the problems that LLMs can solve today.
LLMs are considered Turing complete.
Not really. You can leverage randomness (and LLMs absolutely do) to generate bespoke solutions and then use known methods to verify them. I'm not saying LLMs are great at this, they are gimped by their inability to "save" what they learn, but we know that any kind of "new idea" is a function of random and deterministic processes mixed together in varying amounts.
Everything is either random, deterministic, or some shade of the two. Human brain "magic" included.
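For what it's worth, the "random generation plus deterministic verification" pattern described above is easy to sketch; here is a minimal toy version using subset-sum (the problem and numbers are illustrative only):

    import random

    def verify(subset, target):
        # Deterministic step: check whether the candidate actually works.
        return sum(subset) == target

    def random_search(numbers, target, tries=100_000):
        # Random step: propose candidate subsets until one verifies.
        for _ in range(tries):
            candidate = [x for x in numbers if random.random() < 0.5]
            if verify(candidate, target):
                return candidate
        return None

    print(random_search([3, 9, 8, 4, 5, 7], 15))  # e.g. [3, 8, 4] or [8, 7]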
You can't really use compute more because power is already the bottleneck. Datacenter buildouts are now being measured in GW which tells you everything you need to know. Newer hardware will be a lot more power-efficient but also highly scarce for that reason.
Energy is also being scaled up. But compute and fiber buildout are fundamentally different, in my opinion.
Current shortages are exactly the result of fabs not wanting to commit extra capex due to overbuild risk, and inference demand seems to be growing 10x YoY; you've famously got 8-year-old TPUs at Google running at 100% load.
But if the demand drops for six months, the manufacturers are going to scale back production.
If it drops for a year, they're likely to start shedding capacity, one way or another.
This is not an equivalent situation. The vast, vast majority of what's being produced for this bubble is going to be waste once it pops.
This goes beyond profits. It will be important for national security.
A higher price encourages more supply. Typically when you see an acute shortage, it's quickly followed by a glut as supply comes online in an overcorrection.
These factories take years to build and massive amounts of money. And there are so few manufacturers now that they are far more likely to collude.
> Normally this wouldn't be a problem: prices would go up, supply would eventually increase and everybody would be okay.
This sounds like economic dogma based on pointing at some future equilibrium.
I like the saying that goes something like "life is what happens when you are waiting for the future". In the same way, it seems to me that equilibrium is increasingly less common for many of us.
Markets are dynamic systems, and there are sub-fields of economics that recognize this. The message doesn't always get out unfortunately.
> But with AI being massively subsidized by nation-states and investors, there's no price that is too high for these supplies.
This feels like more dogma: find a convenient scapegoat: governments.
Time to wake up to what history has shown us! Markets naturally reflect boom and bust cycles, irrationality of people, and various other market failures. None of these are news to competent economists, by the way. Be careful from whence you get your economic "analysis".
Yes, this is why the price of housing has dropped dramatically. The market stepped up and filled the demand, and now everyone can afford a place to live.
.....
I can't tell if the comment above is sarcastic or serious: it could go either way.
I think AI companies are involving these other industries so that when the money runs out they can claim the whole thing is too big to fail.
By buying flash and thus shifting demand to HDD? How does that work?
The article doesn't mention flash or HDD. It seems that all storage by WD is already sold.
My point is that directly or indirectly all hardware companies depend on memory and storage. If AI companies fall this could have repercussions to the whole industry.
Earlier, gamers got punished by crypto; now they are being punished by AI.
GPUs before crypto had a lot less VRAM. Crypto investment funded a lot of stupid experiments, some of which did stick to the wall. I don't think gamers had their lives completely ruined by crypto in the end.
"Punished" implies a moral valence to the whole thing which isn't there. It's not like the AI companies were aware of gamers and set out to do this. You simply got run over, like everyone else in front of the trillion dollar bulldozer.
So what?
Why must gamers be the most important group?
Gamers are important because they are consistent customers. Crypto buying of GPUs is done (anyone still in this area is buying ASICs). Meanwhile gamers are still buying GPUs - they sometimes hold off when the economy doesn't allow it, but you can trust that gamers will keep buying GPUs to play their games, and thus they are a safe investment. It is rational to sell GPUs to a gamer for much less than to someone in crypto because the gamer will be back (even if a gamer "grows up", there are more replacing them). Thus gamers are an important group while crypto is not.
The above was their prediction during the crypto boom, and it turned out correct. I'm not sure how AI will turn out, but it isn't unreasonable to predict that AI will also move to dedicated chips (or die?) in a few years, making gamers more important again because gamers will still be buying GPUs when this fad is over. Though of course, if AI turns out to be a constant long-term demand for more/better GPUs, then AI buyers are the more important group.
Gamers are not the only important GPU market. CAD comes to mind as another group with consistent demand for GPUs over the years. I know there are others; they are all important.
the "value" of nvidia to the "AI" companies is their tsmc fab contract
they don't need CUDA, they don't need the 10 years of weird game support, even the networking tech
they need none of nvidia's technology moats
exactly same as the crypto, where they just needed to make an ASIC to pump out sha1 as quickly as possible
which is really, really easy if you have a fab contract
at which point their use of nvidia dropped to zero
I’d prefer that the average Joe has a good entertainment system rather than our corporate overlords having a good surveillance system.
The growth curve of technology has always pointed at the world becoming tiny and non-private.
Disagree.
Mass surveillance by corporations can be outlawed. Just because something is possible, doesn’t mean it must be necessarily so.
I travel a lot for work to different nations. The cultural differences are stark.
In the UK for example, they love their CCTVs. In Switzerland, they’re only allowed where they are deemed necessary.
Loved the reference. Probably from Margin Call[0]
0. https://youtu.be/fij_ixfjiZE
I like to imagine the reference in the movie Margin Call is to a merry-go-round or a game of musical chairs. Like we are all on a ride, none of us is the operator, and all we can do is guess when the music will stop (and the ride ends).
The problem with this AI stuff is we don't know how much we will be willing to pay for it, as individuals, as businesses, as nations. I guess we just don't know how far this stuff will be useful. The reason for the high valuation is, I'd guess, that there is more value here than we have tapped so far, right?
The revenues that Nvidia has reported are based on what we hope we will achieve in the future, so I guess the whole thing is speculation?
TBF, all financial markets are speculation these days; the only thing that changes is how much of a share's price reflects its actual value.
> The problem with this AI stuff is we don't know how much we will be willing to pay for it, as individuals, as businesses, as nations. I guess we just don't know how far this stuff will be useful. The reason for the high valuation is, I'd guess, that there is more value here than we have tapped so far, right?
I think the value now comes from how we make a product of it, for example, like OpenClaw. Whether we like it or not, AI is really expensive to train, not only in monetary terms but also in resources, and the gains have been diminishing with each “generation”. Let's not forget we heard promises that have not been fulfilled, for example AGI, or “AI could potentially cure cancer, with enough power”.
And if you've been watching, DeepMind has been making advances in medical science at a pretty damned fast rate. So "not fulfilled" is a pretty weak statement. The pipeline in medicine is very long.
And that's not even talking about the head-spinning rate at which robotics is advancing. The hardware we use for LLMs is also being used in robot simulation, where training now gives results in hours that used to take weeks or months.
AI will cause shortages in everything from GPUs to CPUs, RAM, storage, networking, fiber, etc because of real demand. The physical world can't keep up with AI progress. Hence, shortages.
AI simply increases computer use by magnitudes. Now you can suddenly use Seedance 2.0 to make CGI that would have cost tens of millions 5 years ago for $5.[0] Everyone is going to need more disk space to store all those video files. Someone in their basement can make a full length movie limited only by imagination. The output quality keeps getting better quicker.
AI agents also drastically increase storage demands. Imagine financial companies using AI agents to search, scrape, organize data on stocks that they wouldn't have been able to do prior. Suddenly, disk storage and CPUs are in high demand for tasks like these.
I think the demand for computer hardware and networking gear is real and is only the beginning.
As someone who is into AI, hardware, and investing, I've been investing in physical businesses based on the above hypothesis. The only durable moats will be compute, energy, and data.
[0]https://seed.bytedance.com/en/seedance2_0
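Some rough arithmetic on the storage claim above (the bitrate is an assumption, not a figure from the linked page):

    # Back-of-envelope storage for generated video (bitrate assumed).
    bitrate_mbps = 20            # decent-quality 4K-ish stream
    seconds = 90 * 60            # one feature-length movie
    gb = bitrate_mbps / 8 * seconds / 1000
    print(f"~{gb:.1f} GB per movie")   # ~13.5 GB
    # A million basement filmmakers -> ~13.5 PB of new demand, before drafts.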
> The only durable moats will be compute, energy, and data
"Compute" is capital investment; normal and comprehensible, but on a huge scale.
"Data" is .. stolen? That feels like a problem which has been dodged but will not remain solved forever, as everyone goes shields-up against the scrapers.
"Energy" was a serious global problem before AI. All economic growth is traded off against future global temperature increases to some extent, but this is even more acute in this electricity-intensive industry. How many degrees of temperature increase is worth one .. whatever the unit of AI gain-of-function is?
> How many degrees of temperature increase is worth one .. whatever the unit of AI gain-of-function is?
Billionaire. And they are definitely willing to make the trade.
one.. fully autonomous, self improving, replicating, general intelligence.
The question isn’t if the demand is real or not (supplies are low, so demand must exist). The question is if the demand curve has permanently shifted, or is this a short-term issue. No one builds new capacity in response to short term changes, because you’ll have difficulty recouping the capital expense.
If AI permanently increases hard drive demand above the current growth curve, then WD et al. will build new capacity, increasing supply (and reducing costs). But this really isn’t something that is known at this point.
My post argues that the demand has permanently shifted.
By the way, plenty of people on HN and Reddit ask if the demand is real or not. They all think there's some collusion to keep the AI bubble going by all the companies. They don't believe AI is that useful today.
> My post argues that the demand has permanently shifted
The time horizon for this is murky at best. This is something you think, but can’t know. But, you’re putting money behind it, so if you’re right, you’ll make a good profit!
But for the larger companies (like WD), overbuilding capacity can be a big problem. They can’t plan factory expansion based on what might be a short-term bubble. That’s how companies go out of business. There is plenty to suggest that you’re right, that AI will cause permanently increased demand for computing/storage resources. Because it is useful, and it does consume and produce a lot of new data and media.
But I’m still skeptical.
The massive increase in spending can’t be sustainable. We can’t continue to feed the AI beast at this rate and still have other devices. Silicon wafer fabs can’t be built on demand and take time. SSD/HDD factories take time. I think we are seeing an expansion to see who the big players will be in the next 3-5 years. Once that order has been established, I think we will fall back to more sustainable rates of demand. This isn’t collusion, it’s just market dynamics at play in a common market. Sadly, we are all part of the same pool, and so everything is expensive for all of us. At some point, though, the AI money will dry up or get more expensive. Then I think we’ll see a reversion back to “normal” demand, maybe slightly elevated, but not the crazy jump we’ve seen for the past two years.
Us being in the same pool as AI is one of the potential risks pointed out by AI safety experts.
To use an analogy, imagine you're a small fluffy mammal that lives in fertile soils in open plains. Suddenly a bunch of humans show up with plows and till you and your environment under to grow crops.
Maybe the humans suddenly won't need crops any longer and you'll get your territory back. But if that doesn't happen and a paradigm change occurred you're in trouble.
AI can be useful today, while also being insanely overvalued, and a bubble.
There will be a bubble. It's inevitable.
The most important question is are we in 1994 or 2000 of the bubble for investors and suppliers like Samsung, WD, SK Hynix, TSMC.
What about 10 years from now? 15 years? Will AI provide more value in 2040 than in 2026? The internet ultimately provided far more value than even peak dotcom bubble thought.
I wonder if I'm alone in being optimistic about this. I believe that the gigantic inflow of money into hardware will lead to a large increase in production capability, accelerated progress, and perhaps even new, better architectures.
I actually agree: a spike in prices due to bumping against capacity limits is way better than a downturn in the market. But this is only really true if AI hyperscalers are incentivized to space out their big buildouts over time (while raising their prices enough to ration current demand) so that suppliers have some guarantee that their expanded capacity will be used.
This fact never ceases to amaze me. It's so cool how relentlessly AI is pushing the horizons of our current hardware!
Maybe now we will start to see "optical" CPUs become a thing. Or 3D disk storage, or other groundbreaking technology.
Optical interconnect in the rack is a thing already. It's just a matter of time until it moves to single-PCB scale. And most persistent memories (competing with DDR memory for speed, and with far lower wearout than NAND) are of the "3D storage" type.
AI's output is not reproducible. It's a disaster.
If we want reproducible output we already have conventional software. Stop using a hammer on screws.
So it's like humans then
This is wrong for all LLMs which have a temperature setting.
And even if they were guaranteed to be non-deterministic, there is still lots of value in many aspects of content generation.
Better stock up on used laptops. I'm going to buy another one this year. Those used ones usually don't last very long.
What if in the near future it is simply too expensive to own "personal" computers? What if you can no longer buy used computers from official channels but have to find local shops, or sharpen up your soldering skills and find parts in dumps? The big techs will conveniently "rent out" cloud computers for us to use, in exchange for all of your data.
"Don't you all have cellphones?"
Bingo. A number of corporate interests don't want to let you own your personal computers for different reasons. Google/Apple wants you to get locked down devices, and cloud/AI providers want you to use their services from a weak client.
I think about this too. There are several headwinds. Rent-seeking and collapse of economies of scale in the consumer sector for sure, but also I feel like we've basically peaked in hardware's ability to meet routine needs.
Once the phone makers realize that they can sell phones and docking stations to businesses, because 90% of knowledge work seems to happen in a web browser through one SaaS or another, I think personal computers will be cooked.
I have 3 old employer laptops and my personal gaming laptop, which I use for work now. I'm happy about this now ;)
I'll probably only need to return the newest laptop if I leave the company.
I have heard that you can still get used laptops, but they no longer come with memory or an SSD... as even used components are now valuable enough to be removed and sold.
In a lot of cases, owners remove the storage not because it has any value but rather they don't want to risk making a mistake letting a device go that still has data on it.
Also pulling and shredding hard drives is cheaper than paying someone to run DBAN or equivalent (which can take many hours to complete on a large drive), and there's no easy way to securely erase an SSD if it wasn't encrypted from the beginning.
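A quick sanity check on the "many hours" claim, with assumed drive speeds:

    # Single-pass overwrite time for a large HDD (speed is an assumption).
    capacity_tb = 20
    avg_write_mb_s = 180          # rough sustained average across the platters
    hours = capacity_tb * 1e6 / avg_write_mb_s / 3600
    print(f"~{hours:.0f} hours for one pass over a {capacity_tb} TB drive")
    # -> ~31 hours, versus minutes of labor to pull and shred the drive.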
Many laptops from the last few years have soldered memory. Your previous laptop's SSD can also be reused, since those don't die as quickly as the laptop itself.
Or worse, they have memory and SSD soldered on board, and are broken, so you have to learn soldering skills too.
Damn, really? One of my go-to moves when helping small political campaigns is to buy something like a 2015 MBP and turn it into a locally hosted server to run all their stuff on the cheap.
Older MacBooks with socketed storage may actually be exempted because they use a proprietary connector instead of standard m.2.
Non-state-of-the-art lithography is pretty much commoditized (DDR3 & DDR4), so we will always have compute, although slower.
Eh, if this demand is really sustainable they will eventually start producing in adequate volume
I no longer feel obligated to apologize for holding on to older devices for a long time. I have several desktops and laptops that are all still usable.
Are these the picks and shovels?
Is the profitability of these electronics manufacturers more likely than that of the companies buying up all their future inventory?
If AI continues at this trajectory, sure, likely to the picks and shovels.
If AI has a bubble burst, you could see a lot of used hardware flood the market and then companies like WD could have a hard time selling against their previous inventory.
The problem is more likely that companies like WD don't know whether this is a bubble or not. Currently they can milk the market by raising prices and relying on their current production facilities, maybe expanding a little. If there's going to be a crash, it's better to have raised prices, even if just temporarily, than to be left standing with excess production capacity.
If it's long term, it would be better to be the front runner on additional capacity, but that assumes continuous growth. If it all comes down, or even just plateaus, it's better to simply raise prices.
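A toy expected-value framing of that build-vs-milk decision (every number here is invented purely to show the shape of the argument):

    # If a bubble is likely enough, milking existing capacity dominates.
    p_bubble = 0.5              # assumed chance demand collapses
    profit_milk = 4.0           # raise prices on existing fabs (made-up units)
    profit_build_boom = 10.0    # new capacity pays off if demand persists
    profit_build_bust = -6.0    # stranded capacity if it doesn't

    ev_build = p_bubble * profit_build_bust + (1 - p_bubble) * profit_build_boom
    print(f"EV(milk)={profit_milk:.1f}  EV(build)={ev_build:.1f}")
    # -> EV(build)=2.0 < EV(milk)=4.0: just raise prices and wait.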
Given how hard AI is on I/O, I'm not sure how much of this hardware will go second-hand. I don't see hard drives going second-hand. Most of the hardware we'd get might be worn beyond redemption, even at a price of free.
I don't think the HDDs are being used for any intensive loads. They have too much latency for most of that. It's probably just archival storage for their scraped content and generated slop.
For "cold" archival storage you would want to use tape, which is far cheaper per TB at scale.
I don't mean that type of archive, but rather "just in case" data like "last month's scrape of this website" after we scraped it 5 more times this month or higher resolution versions of book scans. You might want to still be able to dump it out quickly if you need it. Money is no object for these companies and the cost of HDDs is more than low enough for the flexibility they provide.
If demand for hard drives is this high then it sounds like there wouldn't be near enough tape around either.
HDDs, RAM, and chips have so many health metrics and methods that you really shouldn't be afraid to buy them used. The only special requirements are a RasPi test rig and a 30-day return window.
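For anyone curious what such a test rig actually runs: a minimal sketch, assuming smartmontools is installed and where the device path is just an example:

    import subprocess

    def smart_report(device="/dev/sda"):   # example device path
        # -H prints the overall health verdict, -A the vendor attributes
        # (watch Reallocated_Sector_Ct and Power_On_Hours on used drives).
        for flag in ("-H", "-A"):
            result = subprocess.run(["smartctl", flag, device],
                                    capture_output=True, text=True)
            print(result.stdout)

    smart_report()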
I feel like the goal is to avoid buying something broken in the first place, not just to be able to tell if you've bought something that turns out to be broken
What are companies needing all of these hard drives for? I understand their need for memory and boot drives. But storing text training data and text conversations isn't that space-intensive. There are a few companies doing video models, so I can see how that takes a tremendous amount of space. Is it just that?
Hearing about their scraping practices, it might be that they are storing the same data over and over and over again. And then yes, audio and video is likely something they are planning for or already gathering.
And if they produce a lot of video, they might keep copies around.
All the latest general-purpose models are multimodal (except DeepSeek, I think). Transfer learning allows results to improve even after they've exhausted all the text on the internet.
Storing training data: for example, Anthropic bought millions of second hand books and scanned them:
https://www.washingtonpost.com/technology/2026/01/27/anthrop...
I think the somewhat hallucinatory canned response is that they distribute data across drives for massive throughput. Though idk if that even technically makes sense...
I am surprised by that too. I thought everyone had moved to SSDs or NVMe?
I was toying with getting a 2T HDD for a BSD system I have, I guess not now :)
Everyone moved to SSDs or NVMe. If you're right, that includes manufacturers. But HDDs still have advantages over SSDs for specific needs, like more reliable long-term unpowered storage. It's also possible that the high price of SSDs made HDDs an option again.
Really, if you're writing large sequential files, hard drives aren't that bad. If you can have the system write one file per drive at a time, you'll avoid a lot of fragmentation.
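A minimal sketch of that one-file-per-drive idea (the mount points are hypothetical):

    import itertools, os, shutil

    DRIVES = ["/mnt/hdd0", "/mnt/hdd1", "/mnt/hdd2"]   # hypothetical mounts

    def spread(files):
        # Round-robin whole files across drives so each platter sees one
        # large sequential write at a time instead of interleaved fragments.
        for path, drive in zip(files, itertools.cycle(DRIVES)):
            shutil.copy(path, os.path.join(drive, os.path.basename(path)))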
This is the consequence of "I don't want to write this function myself, I'll get the plagiarism machine to do it for me"
And what's wrong with not wanting to write functions yourself? It is a perfectly reasonable thing, and in some cases (ex: crypto), rolling your own is strongly discouraged. That's the reason libraries exist; you don't want to implement your own associative array every time your work needs one, do you?
As for plagiarism, it is not something to even consider when writing code, unless your code is an art project. If someone else's code does the job better than yours, that's the code you should use: you are not trying to be original, you are trying to make a working product. There is the problem of intellectual property laws, but it is narrower than plagiarism. For instance, writing an open-source drop-in replacement for some proprietary software is common practice; it is legal and often celebrated as long as it doesn't contain the original software's code. In art, it would be plagiarism.
Copyright laundering is a problem though, and AI is very resource intensive for a result of dubious quality sometimes. But that just shows that it is not a good enough "plagiarism machine", not that using a "plagiarism machine" is wrong.
If I use a package for crypto stuff, it will generally be listed as part of the project, in an include or similar, so you can see who actually wrote the code. If you get an LLM to create it, it will write some "new original code" for you, with no ability to tell you the names of the people whose code went into it, and who did not give their consent for it to be mangled into the algorithm.
If I copy work from someone else, whether that be a paragraph of writing, a code block or art, and do not credit them, passing it off as my own creation, that's plagiarism. If the plagiarism machine can give proper attribution and context, it's not a plagiarism machine anymore, but given the incredibly lossy nature of LLMs, I don't foresee that happening. A search engine is different, as it provides attribution for the content it's giving you (ignoring the "AI summary" that is often included now). If you go to my website and copy code from me, you know where the code came from, because you got it from my website.
Why is "plagiarism" "bad"?
Modern society seems to assume any work by a person is due to that person alone, and credits that person only. But we know that is not the case. Any work by an author is the culmination of a series of contributions, perhaps not to the work directly, but often to the author, giving them the proper background and environment to do the work. The author is simply one that built upon the aggregate knowledge in the world and added a small bit of their own ideas.
I think it is bad taste to pass another's work as your own, and I believe people should be economically compensated for creating art and generating ideas, but I do not believe people are entitled to claim any "ownership" of ideas. IMHO, it is grossly egoistic.
Sure, you can't claim ownership of ideas, but if you verbatim repeat other people's content as if it is your own, and are unable to attribute it to its original creator, is that not a bit shitty? That's what LLMs are doing
I honestly think it's not that simple.
The ones who spend billions on integrating public cloud LLM services are not the ones writing that function. They are managers who, based on data pulled out of thin air, say "your goal for this year is to increase productivity by X% with AI, while staffing goes slightly down".
I have to watch AI-generated avatars on the most boring topics imaginable, because the only "documentation" and link to the actual answer is in the form of a fake person talking. And this is encouraged!
Then the only measure of success is either AI services adoption (team count), or sales data.
That is the real tragedy and the real scale: big companies pushing (external!) AI services without even proof that the cost is justified, and smooth talking around any other metric (or the lack of one).
It's still absolutely fascinating to me that basically the whole modern tech industry and the economic growth from it rests on the shoulders of a single company that has all of their important factories on a single island that's under constant threat of invasion. On top of that they themselves are reliant on a single company that's able to produce the machine required to print the wafers.
I don't know if TSMC has anything to do with hard drive production, but the reliance on very few players is also a problem in that industry.
Investors love a monopoly, and establishing this required more than a trillion dollars of investment sustained over a couple decades.
> Investors love a monopoly...
Indeed, investors left to their own devices act in this way. Underlying such a single point of failure is an implied but immense hope for, and thus pressure toward, stability. I wonder what the prediction markets are saying about current levels of geopolitical stability in Taiwan?
> Indeed, investors left to their own devices act in this way.
Interesting. Capitalism is often touted to be more decentralized than socialism, but this is an example of how it can centralize.
It's only this way because the American ruling class would rather ship jobs overseas to increase their wealth than competently establish an industrial sector that would pay good wages to average people.
Turns out letting a bunch of MBAs plan your economy is extremely foolish.
Yeah, this is slowing down growth and profits. The AI hype is sucking everything dry, from HVAC services to hardware.
Great. I’ve just returned a WD drive to Amazon after it arrived crushed in a torn-open paper bag.
The replacement also arrived in a paper bag and went straight back, this time for a refund.
I guess I should have kept that one and hoped for the best.
Good alternatives? I’ve only recently been enlightened on how profoundly sh__ty SSDs are for long-term storage, and I have a whole lot of images my parents took traveling in the last few years of their lives.
I'm sure Amazon isn't the only shop that delivers to your area
The premise of this news is that prices are going to climb and availability is going to drop.
And I’m not keen on having anyone ship me one of these anymore.
Walmart sells what appears to be an older version of the drive and I might have to cross my fingers and just get one of those.
> And I’m not keen on having anyone ship me one of these anymore.
Isn't that what you're doing ordering off Amazon, with their commingled inventory?
Besides, there's a spectrum of sellers between "Amazon" and "anybody", you can even, perhaps, purchase directly from the manufacturer.
I meant that after the Amazon experience, I don't want to buy a HDD online. Would much prefer to get it locally in person.
I was recently involved in a large server purchase for work, where we wanted 72 hard drives of 24TB each for a server. They were available last year, but last month the largest we could get were 20TB drives.
My machine was built up from parts in 2014.
6-7 years ago when GPU prices went up, I hoped nothing would break. Last year when RAM prices went up I did the same. Now with drive prices going up, it's the same thing.
It's interesting because I've always built mid-tier machines over the years, and it used to be in the neighborhood of ~$700. Now the same thing is almost double that, but the performance is nowhere near twice as good for general computer usage.
This is all basically a textbook example of irrational market decisions. There’s clearly a bubble and not enough money coming in to pay for the AI bonanza.
It’s like building materials being in short supply when there are obviously more houses than buyers. That’s just masked at the moment because of all the capital being pumped in to cover for the lack of actual revenue to pay for everything. The structural mismatch at the moment is gigantic, and the markets are getting increasingly impatient waiting for the revenue to materialize.
Mark this post… in a few years folks will be coming up with creative ideas for cheap storage and GPUs flooding the market after folks pick up the pieces of imploded AI companies.
(For the record, I’m a huge fan of AI, but that doesn’t mean I don’t also think a giant business and financial bubble is about to implode).
More reckless and irresponsible than irrational.
> in a few years folks will be coming up with creative ideas for cheap storage and GPUs flooding the market
COVID was six years ago. In that time, GPU prices haven't gone down (and really have only increased). Count me skeptical that there will be a flood of cheap components.
I feel like the most recent time you could reasonably get an Nvidia *80 GPU at the store for a normal amount of money was almost a decade ago.
Is there an industrial bubble? Probably.
> It’s like building materials being in short supply when there are obviously more houses than buyers.
That I think is a hard one to prove and is where folks are figuring it out. There is obvious continued demand and certainly a portion of it is from other startups spending money. I don’t think it’s obvious though where we are at.
I'm not against subsidies, but the concentration is a problem. This money could have spurred grassroots participation in these emerging industries, but instead they chose the most heinous of monocultures, leaving billions of people out of the loop.
Not only storage; the cheapest 32 GB of RAM that I can find is around 200 euros.
That's actually a bargain, average market price (though highly volatile) is more than double that.
They're probably looking at DDR4.
It's interesting to see people here call the spending irrational, but actually, even if AI improvements slow down, it's more rational for companies to overspend and underutilize the machines than to underspend and get disrupted.
On the other hand, lots of people here are even more uncomfortable with the other option, which is quite possible: AI software algorithms may scale better than the capacity of the companies that make the hardware. Personally I think hardware is the harder of the two to scale, and this is just the beginning.
It started with RAM; now with hard drives and SSDs. This is not looking good. But at least you can buy used ones for a pretty good price, for now.
I console myself with knowledge of the economics maxim that every supply shortage is usually, eventually, followed by a supply glut.
One can only hope that that's the principle at work here, anyway. It could also be a critically damped system for all I know. Unfortunately I studied control systems too...
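Since the control-systems framing came up: a toy second-order "supply chasing demand" loop shows the intuition, with damping standing in for how cautiously producers respond (all parameters invented):

    # Supply pulled toward a step change in demand; damping decides the glut.
    def simulate(zeta, steps=60, dt=0.25):
        s, v, demand = 0.0, 0.0, 1.0    # supply, its rate of change, demand
        peak = 0.0
        for _ in range(steps):
            a = (demand - s) - 2 * zeta * v   # spring toward demand, damped
            v += a * dt
            s += v * dt
            peak = max(peak, s)
        return peak

    for zeta in (0.3, 1.0):   # under-damped vs. critically damped producers
        print(f"zeta={zeta}: peak supply {simulate(zeta):.2f}")
    # zeta=0.3 overshoots well past demand (a glut); zeta=1.0 doesn't.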
If storage and memory manufacturers don't respond by increasing supply, there might not be a glut - just postponed demand that slowly gets fulfilled over a longer period. That is, if we were in a steady state.
On the other hand, if there is bigger economic turmoil, the postponed demand might not materialize because there is no purchasing power...
I was thinking that until my NAS gave me an error on one of my hard drives; now I'm in the market for a replacement while I still have redundancy.
People with a control theory background are welcome in economics; the field is more diverse than some would recognize. Certain professions and subfields are more open than others. There are plenty of economists who care about things like resilience and dampening shocks.
I would love it if more non-traditional economists got involved in the public sphere, by which I mean: writing about economic trends, public policy, regulation, rate adjustment, etc.
As an engineer with a passing control theory background and a breadth of general knowledge, I'd love to explore this space more and find a way to apply my knowledge and share the results. Are there any particular problems you think well-suited to this treatment?
If you have a policy area you like you might start there. From my lens, here are some interesting ways to look at political economy from a broader point of view: economic disruption from AI (could be from energy prices, labor substitution, and lots more), climate modeling and its impacts on economies, conservation and ecosystem stability, and economic growth under different levels of inequality. I would add this to the mix even though it isn't a typical economic area: geopolitical destabilization from autonomous weapons, both physical and cyber.
Those are definitely all areas of interest for me as well. Thanks for the pointers. Do you write anywhere?
Presumably they're also looking to increase production capacity as fast as possible - within the year?
I'd have thought HDDs aren't at the top of the list for AI requirements, are other component manufacturers struggling even more to meet demand?
Why would they?
If we weren’t talking about AI, was there another high demand sector / customer for spinning platters?
And their margins get fat now that supply is relatively constant but AI demand has saturated their current production numbers.
I listed some hard drives on eBay on Friday.. most of them refurbished... within 5 minutes I got a message from a person who wanted them all... shipped them an hour later
This is getting ridiculous. Never before has an unwanted product been thrust so forcefully and artificially into the market that it disrupts the supply line for real products with actual demand.
I am happy I bought 5x10TB drives two months ago, anticipating this exact scenario.
Take a look at prices of SSDs and RAM too.
I built a new server this time last year. My board does 6-channel RAM, so I bought 6x32GB ECC DDR5, $160 a stick at the time. Just for grins I looked up the same product number at the same supplier I originally bought from: $1300 apiece. One of the VMs running on that server is TrueNAS, with 4 20TB WD Red Pros. God help me if I have to replace a drive.
Best Buy is actively selling 2x8GB sticks of DDR4-3200 for $80 a stick. I was floored. Ten bucks a gig, $160 for the pack.
We're fucking doomed.
Ten bucks a gig is lower than what some DDR5 memory is selling at.
Perhaps there is an incentive to go back to OS that can operate with 640KB RAM ... /s
I bought 6x refurbished Ultrastars for ~$100/ea on Black Friday 2024. They were over $200/ea in 2025. Samsung T7 (and Shield) SSDs have gone 2x-3x. Can’t get 1TB for less than about $180 right now. It’s ridiculous.
Bought a few 2TB T7 Shield disks last year before the boom. Thank fuck I did it then.
Is this for NVMe only or spinning drives too? I use both, but I actually have use cases for HDDs and hope those are less affected.
It’s affecting both. HDD maybe slightly less/slower, but you’re paying significantly more than six months ago in any case.
This particular news is for spinning drives, the other types we already had news about upcoming shortages earlier on.
All I know is that most of my go-to refurbished enterprise HDDs went 2-3x this past Black Friday compared to a year prior.
sighs at her local backup drive that just gave up the ghost
thanks, AI-boosting motherfuckers, thanks a lot
Rotating or SSD?
Do they really think they will get some money from the AI ponzi scheme ?
Well, at least they might still have a product to sell once the AI bubble pops, unlike Nvidia, which seems to have kinda forgotten to design new consumer GPUs after getting high on AI money.
They haven't forgotten, they've expressly decided to soft-pivot away from consumer GPUs. RTX 60x0 series is apparently coming in 2018… (oops, 2028. No time travel involved. Probably). If the bubble has burst by then.
> RTX 60x0 series is apparently coming in 2018
That's either a typo, or NVidia has achieved some previously unheard of levels of innovation.
They're hedging on LLMs inventing time travel any day now.
> "apparently coming in 2018… maybe. If the bubble has burst by then."
Spoiler from the future: it hasn't. Get your investments in while you have time.
Good luck to everyone. Hope you made some reserve.
Yes, AI is nice, but I also like to be able to buy some RAM and drives…
The future is thin clients for everyone, requiring a minimal amount of RAM and storage because all they are is a glorified ChatGPT interface.
I'm running multiple services such as Forgejo, Audiobookshelf, and Castopod, and they all need no more than roughly 100 MB of RAM.
There is one exception though. Open WebUI with a whopping 960 MB. It's literally a ChatGPT interface. I'm only using external API providers. No local models running.
Meanwhile my website that runs via my own Wordpress-like software written in Rust [1] requires only a few MB of RAM, so it's possible.
[1]: https://github.com/rikhuijzer/fx
You know what the sad part is? I don't think software developers, or LLMs, know how (or want) to make low-resource-consumption software that runs on a thin client anymore. It will be some browser-based thing capping out at whatever memory is available on the system.
It won't last. If the demand is sustained then new factories will open up and drive the price down.
More likely a couple of big financing wobbles lead to a fire sale.
It isn't practical for HDD supply to stay wedged, because in 5 years the disks start failing.
Even if the AI bubble bursts, having successfully cornered the compute market they can just go rent seeking instead by renting out cloud workstations, given that they've made the hardware to build a workstation yourself unaffordable.
does that only include SSDs, or does it include HDDs as well?
It includes all forms of storage except for USB devices, GPUs and high end CPUs. The latter you can still get but you're going to have some severe sticker shock.
Maybe shucking USB HDDs is the short-term answer.
Is that still possible? Aren't they native USB with no adapter?
Those drives are SATA inside the case.
That depends on the brand. The lower priced brands, yes, those can be SATA, the more vertically integrated companies also make custom PCBs that just have USB-C without any SATA interface exposed internally.
It's probably feasible to make a "mass storage USB in, SATA protocol out" smart adapter board.
I see, but if you plan on shucking, you obviously get ones you know can be shucked.
I read it as both, but UK suppliers have stock of various SATA HDDs available in large and small sizes. It's hard to say if prices will rocket or availability will decline, or both. I don't normally advocate panic-buying, but if it's needed, now is the time. I have one NAS spare on hand; I don't want or need a drawer full of them, but it'll be a royal pain if I end up needing more and can't get parts.
Lower performance/capacity consumer drives might be comparatively safe because there's Chinese end-to-end production capacity for those. Of course the price can still increase, but probably not that much.