
WD and Seagate confirm: Hard drives sold out for 2026

Everyone: things suck, better move my stuff onto a small home server. The hyper-scaler mafia: NOT ON MY WATCH!

The only silver lining is that newer devices will have to scale down memory, so developers will have to ditch memory-sucking frameworks and start to optimize things again.

2 hours agolccerina

I’m just afraid that prices of $everything will go up soon and will not come down anymore, like they did after Covid.

If it’s temporary I can live with it.

I guess this was inevitable with the absolute insane money being poured into AI.

2 hours agostingraycharles

We useless eaters are to be priced out of life soon enough.

an hour agoFrankBooth

Traditionally that hasn't gone well for the rich folk.

43 minutes agodgxyz

Truer than even you dare to admit.

How many useless living humans do you know? They go somewhere. Something happens to them. Whatever it is it’s about to happen to 30% of the population.

What’s the opposite of survivor bias?

41 minutes agobrador

Traps tend to only go one way.

2 hours agoroysting

>If it’s temporary I can live with it.

Given that this has been going on for years at this point (the high prices of graphics cards through crypto and now AI), it feels like this is the new normal, forever propped up by the next grift.

2 hours agocube00

I don't think this ideology and investment strategy will survive this grift. There's too much geopolitical instability and investment restructuring for it to work again. Everyone is looking at isolationist policies. I mean, Mastercard/Visa are even seen as a risk outside the US now.

2 hours agodgxyz

Yup, when you can’t trust partners (or even nominal allies), what else is there but isolationism?

an hour agolazide

It's not really isolation but exclusion. Push all risks as far away from you as possible.

an hour agodgxyz

When everything ‘outside’ is a risk, what would you call a summary of that policy?

an hour agolazide

Well, a risk has an absolute level and is either increasing or decreasing. You can look at your risk profile over time and work out how to define policy going forwards. It takes a long time to make changes at country level.

US is medium risk and increasing rapidly. Run away quickly.

an hour agodgxyz

cooperation.

Sure you have to isolate certain rogue states - North Korea, Russia, USA

an hour agoiso1631

> I don't think this ideology and investment strategy will survive this grift

Big tech will be deemed "too big to fail" and will get a bailout. The taxpayers will suffer.

an hour agoFervicus

Big tech has already failed. Which is why it got into politics.

an hour agodgxyz

> I’m just afraid that prices of $everything will go up soon and will not come down anymore, like they did after Covid.

Just like the price of labour. Your salary went up and doesn't come down.

In the UK weekly earnings increased 34% from December 2019 to December 2025.

CPI went up 30% in the same period.

Obviously that CPI covers things which went up more, and things which went up less, and your personal inflation will be different to everyone else's. Petrol prices at the end of Jan 2020 were 128p a litre; at the end of Jan 2025 they were 132p a litre [0]. Indeed petrol prices were 132p in January 2013. If you drive 40,000 miles a year you will thus see far lower inflation than someone who doesn't drive.

[0] https://www.rac.co.uk/drive/advice/fuel-watch/
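As a rough illustration of that last point, here is a minimal sketch of a spend-weighted personal inflation rate; the petrol change comes from the figures above, while the basket weights and the 30% rise for "everything else" are assumptions for illustration only.

```python
# Rough sketch: personal inflation as a spend-weighted average of price changes.
# The petrol move (128p -> 132p) comes from the comment above; the basket weights
# and the 30% rise assumed for "everything else" (headline CPI) are illustrative only.

def personal_inflation(basket):
    """basket: list of (share_of_spending, price_change_fraction) tuples."""
    total_share = sum(share for share, _ in basket)
    return sum(share * change for share, change in basket) / total_share

petrol_change = (132 - 128) / 128   # ~3.1% over the period
other_change = 0.30                 # assume everything else tracked the 30% CPI figure

heavy_driver = personal_inflation([(0.25, petrol_change), (0.75, other_change)])
non_driver = personal_inflation([(1.00, other_change)])

print(f"Heavy driver: {heavy_driver:.1%}")  # ~23.3%, well below headline CPI
print(f"Non-driver:   {non_driver:.1%}")    # 30.0%
```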

an hour agoiso1631

> I’m just afraid that prices of $everything will go up soon and will not come down anymore, like they did after Covid.

That's how inflation works. In this case it seems narrower though; there's hope the prices will go down, especially if the AI hype finds a reason to flounder.

an hour agoburan77

The main thing that the powers that be have always underestimated is the insane creativity the common people have when it comes to wanting things, but being forced to use alternative ways. Not going to say it won't suck, but interesting ways will indeed be found.

2 hours agob3lvedere

You’re going to find what, ways to make hand crafted survival RAM and drives in your backyard chip foundry?

Call me cynical if you like, but I don't share this optimism that assumes the banal idea that somehow good always wins. That simply isn't so; in fact the bad guys have won many times before. It's just that "dead men tell no tales" and the winners control what you think is reality.

an hour agoroysting

People will find a way to not need as much RAM, and thus the devices that require it.

Same way the price of groceries going up means people buy only what they need and ditch the superfluous.

an hour agolouiskottmann

The Chinese have end-to-end production capacity for lower capacity, lower performance/reliability consumer HDDs, so these are quite safe. Maybe we'll even see enterprise architectures where that cheap bottom-of-the-barrel stuff is used as opportunistic nearline storage, and then you have a far lower volume of traditional enterprise drives providing a "single source of truth" where needed.

an hour agozozbot234

One way of putting it is that the winners are ‘the good guys’.

an hour agolazide

Unless people notice that they just built lots of useless datacenters and push back towards a mainframe + terminal setup, because, ah sorry, modern software just runs much better that way, and you can save money on our inexpensive laptop with a subscription model.

an hour agolowdude

Saw this one coming and got my personal stuff out. It's running on an old Lenovo crate chucked in my hallway.

Work is fucked. 23TB of RAM online. Microservices FTW. Not. Each node has OS overhead. Each pod has language VM overhead. And the architecture can only cost more over time. On top of that "storage is cheap so we won't bother to delete anything". Stupid mentality across the board.
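To make the overhead point concrete, here is an illustrative sketch of how fixed per-node and per-pod overhead stacks up across a microservice fleet; all of the numbers (node count, pod count, overhead sizes) are assumptions, not measurements from the setup described above.

```python
# Illustrative only: how fixed per-node and per-pod overhead stacks up in a
# microservice fleet. All numbers below are assumptions, not measurements.
OS_OVERHEAD_GB = 2.0        # per node: kernel, kubelet, agents, page cache floor
RUNTIME_OVERHEAD_GB = 0.5   # per pod: JVM / interpreter / language VM baseline

def fleet_ram_gb(nodes: int, pods_per_node: int, app_heap_gb: float) -> float:
    """Total RAM needed, counting fixed overhead plus the actual application heap."""
    pods = nodes * pods_per_node
    return nodes * OS_OVERHEAD_GB + pods * (RUNTIME_OVERHEAD_GB + app_heap_gb)

# 200 nodes x 30 pods, each pod doing 1 GB of "real" work:
total = fleet_ram_gb(nodes=200, pods_per_node=30, app_heap_gb=1.0)
useful = 200 * 30 * 1.0
print(f"total {total / 1024:.1f} TB, useful heap {useful / 1024:.1f} TB")
# -> roughly a third of the RAM is pure overhead in this made-up example
```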

2 hours agodgxyz

One tiny sliver of a silver lining: the “storage/memory/compute is cheap” nonsense has produced all kinds of outsourced human slop code. That mentality is clearly going to have to die.

It could even become a kind of renaissance of efficient code… if there is any need for code at all.

The five guys left online might even get efficient and fast loading websites.

Honorable mention to the NO-TECH and LOW-TECH magazine site, because I liked the effort at exploring efficient use of technology, e.g. their ~500KB solar-powered site.

https://solar.lowtechmagazine.com/about/

an hour agoroysting

I think your ideological perspective is spot on.

We went from using technology to solve problems to the diametric opposite of creating new problems to solve with technology. The latter will have to contract considerably. As you say, many problems can be solved without code. If they even need to be solved in the first place.

On the efficiency front, most of what we built is for developer efficiency rather than runtime efficiency. Also needs to stop.

I'm a big fan of low tech. I still write notes on paper and use a film camera. Thanks for the link - right up my street!

an hour agodgxyz

> The only silver lining is that newer devices will have to scale down memory, so developers will have to ditch memory-sucking frameworks and start to optimize things again.

No. Prices will just go up, less innovation in general.

2 hours agockbkr10

A few places will have no choice - low price elasticity, combined with things that need to actually work.

an hour agolazide

At least we can add "use the least amount of RAM and drive space" to our AI prompts.

/s

an hour agotheandrewbailey

Well you can do that, but then the AI won't be nearly as smart as it was before...

an hour agozozbot234

We aren't just dealing with a shortage; we're dealing with a monopsony. The Big tech companies have moved from being "customers" of the hardware industry to being the "owners" of the supply chain. The shortage isn't just "high demand", but "contractual lock-out."

It is time to talk seriously about breaking up the hyperscalers. If we don't address the structural dominance of hyperscalers over the physical supply chain, "Personal Computing" is going to become a luxury of the past, and we’ll all be terminal-renters in someone else's data center.

2 hours agoolavgg

> "Personal Computing" is going to become a luxury of the past, and we’ll all be terminal-renters in someone else's data center.

This is the game plan of course, why have customers pay one time for hardware when they can have you constantly feed them money over the long term. Shareholders want this model.

It started with planned obsolescence; now this new model is the natural progression. There is no obsolescence even in discussion when your only option is to rent a service that the provider has no incentive to even make competitive.

I really feel this will be China's moment to flood the market with hardware and improve their quality over time.

2 hours agobilekas

"I think there is a world market for maybe five c̶o̶m̶p̶u̶t̶e̶r̶s̶" compute centers.

an hour agoactionfromafar

> "Personal Computing" is going to become a luxury of the past, and we’ll all be terminal-renters in someone else's data center.

Yep. My take is that, ironically, it's going to be because of government funding the circular tech economy, pushing consumers out of the tech space.

2 hours agoahsillyme

> pushing consumers out of the tech space.

post consumer capitalism

23 minutes agosqueefers

it goes mainframe (remote) > PC (local) > cloud (remote) > ??? (local)

24 minutes agosqueefers

mainframe (remote) > PC (local) > cloud (remote) > learning at a geometric rate ∴ > (local)

9 minutes agoGCUMstlyHarmls

This is the result of the long-planned desire for consumer computing to be subscription computing. Ultimately, there is only so much that can be done in software to "encourage" (read: coerce) vendor-locked, always-online, account-based computer usage; there are viable options for people to escape these ecosystems via the ever-growing plethora of web-based productivity software and Linux distributions which are genuinely good, user-friendly enough, and 100% daily-drivable, but these software options require hardware.

It's no coincidence that Microsoft decided to take such a massive stake in OpenAI - leveraging the opportunity to get in on a new front for vendor locking by force-multiplying their own market share by inserting it into everything they provide is an obvious choice, but also leveraging the insane amount of capital being thrown into the cesspit that is AI to make consumer hardware unaffordable (and eventually unusable due to remote attestation schemes) further enforces their position. OEM computers that meet the hardware requirements of their locked OS and software suite being the only computers that are a) affordable and b) "trusted" is the end goal.

I don't want to throw around buzzwords or be doomeristic, but this is digital corporatism in its endgame. Playing markets to price out every consumer globally for essential hardware is evil and something that a just world would punish relentlessly and swiftly, yet there aren't even crickets. This is happening unopposed.

2 hours agoshit_game

What can we do? Serious question.

It's so hard to grasp as a problem for the lay person until it's too late.

an hour agokuerbel
[deleted]
2 hours ago

> "Personal Computing" is going to become a luxury of the past, and we’ll all be terminal-renters in someone else's data center.

These things are cyclical.

35 minutes agoiso1631

"You'll own nothing. And you'll be happy"

2 hours agoAmazingTurtle

Sorry, do people not immediately see that this is an AI bot comment?

Why is this allowed on HN?

2 hours agonubg

> Why is this allowed on HN?

1) The comment you replied to is 1 minute old; that is fast for any system to detect weird comments.

2) There's no easy and sure-fire way to detect LLM content. Here's Wikipedia's list of tells: https://en.wikipedia.org/wiki/Wikipedia:Signs_of_AI_writing

2 hours agoMaxion

> Sorry, do people not immediately see that this is an AI bot comment?

How do you know that? Genuine question.

2 hours agobilekas

> isn't just "high demand", but "contractual lock-out."

The "isn't just .., but .." construction is so overused by LLMs.

2 hours agof311a

The phrasing. "It's not just X, it's Y," overuse of "quotes"

2 hours agolsp

The problem with any of these tells is that an individual instance is often taken as proof on its own rather than an indicator. People do often use “it isn't X, it is Y” like constructs¹ and many, myself included sometimes, overuse “quotes”², or use m-dashes³, or are overly concerned about avoiding repeating words⁶, and so forth.

LLMs do these things because they are in the training data, which means that people do these things too.

It is sometimes difficult to not sound like an LLM-written or LLM-reworded comment… I've been called a bot a few times despite never using LLMs for writing English⁴.

--------

[1] particularly vapid space-filler articles/comments or those using whataboutism style redirection, which might be a significant chunk of model training data because of how many of them are out there.

[2] I overuse footnotes as well, which is apparently a smell in the output of some generative tools.

[3] A lot of pre-LLM style-checking tools would recommend this in place of hyphens, and some automated reformatters would make the change without asking, so there are going to be many examples in training data.

[4] I think there is one at work in VS which I use in DayJob, when it is suggesting code completion options to save typing (literally Glorified Predictive Text) and I sometimes accept its suggestion, and some of the tools I use to check my Spanish⁵ may be LLM based, so I can't claim that I don't use them at all.

[5] I'm just learning, so automatic translators are useful to check that what I've written isn't gibberish. For anyone else doing the same: make sure you research any suggested changes preferably using pre-2023 sources, because the output of these tools can be quite wrong as you can see when translating into a language you are fluent in.

[6] Another common “LLM tell” because they often have weighting functions especially designed to avoid token repetition, largely to avoid getting stuck in loops, but many pre-LLM grammar checking tools will pick people up on repeated word use too, and people tend to fix the direct symptom with a thesaurus rather than improving the sentence structure overall.

an hour agodspillett

It has Claude all over it. When you spend enough time with them it becomes obvious.

In this case the “it’s not x, it’s y” pattern and its placement are a dead giveaway.

2 hours agohakanderyal

Isn't it ironic to use AI to formulate a comment against AI vendors and hyperscalers?

It's not ironic, but bitterly funny, if you ask me.

Note: I'm not an AI, I'm an actual human without a Claude account.

2 hours agobayindirh

I wonder what the ratio is of "constructive" use of AI versus people writing pointless internet comments.

It seems personal computing is being screwed so people can create memes, ask questions that take 30 seconds to find the answer to with Google or Wikipedia, and sound clever on social media?

34 minutes agophatfish

If you think of AI as the whole discipline, there are very useful applications indeed, generally in the pattern recognition and regulation space. I'm aware of a lot of small projects which rely on AI to monitor ecosystems or systems, or use it as a nice regulatory mechanism. The same systems can also be used for genuine security applications (civilian, non-lethal, legal and ethical).

If we are talking generative AI, again from my experience, things get a bit blurry. You can use smaller models to dig through data you own.

I have personally used LLMs twice up to this day. In each case it was after a very long research session without any answers. In one, it gave me exactly one reference, and I followed that reference and learnt what I was looking for. In the second case, it gave me a couple of pointers, which I'm going to follow myself again.

So, generative AI is not that useful for me, uses way too many resources, and industry-leading models are, well, unethical to begin with.

26 minutes agobayindirh

Yes I found this ironic as well lmao.

I do agree with the sentiment of the AI comment, and was even weighing just letting it slide, because I do fear the future that comment was warning against.

an hour agonubg

> “it’s not x, it’s y”

ChatGPT does this just as much, maybe even more, across every model they've ever released to the public.

How did both Claude and GPT end up with such a similar stylistic quirk?

I'd add that Kimi does it sometimes, but much less frequently. (Kimi, in general, is a better writer with a more neutral voice.) I don't have enough experience with Gemini or Deepseek to say.

2 hours agoA_D_E_P_T

LLMs learned rhetorical negation from humans. Some humans continue to use it, because it genuinely makes sense at times.

2 hours agooytis

It reads almost too AI to the point of being satire maybe?

2 hours agobombela

It is my text, enhanced by AI. Without AI, I would never have used the word "Monopsony". So I learned something new writing this comment.

2 hours agoolavgg

This behavior is part of the problem that got us here, using LLMs for everything.

2 hours agolpcvoid

You are losing your personality by modifying your text with LLMs. It saves you how much, 1 minute of writing?

2 hours agof311a
[deleted]
an hour ago

The irony is lost on you ...

2 hours agobilekas

You didn't write it. Here's another new word for you: hypocrite.

2 hours agobadpenny

Come on man, you're a "founder" and you can't even write your own comments on a forum?

2 hours agosmcl
[deleted]
2 hours ago

It'll be fine. The supply chain for these components is inelastic, but that means once manufacturing capacity increases, it'll stay there. We'll see lower prices, especially if there is an AI crash and a mass hardware selloff like some people are predicting.

2 hours agopost-it

The number of HDDs sold has been in decline for over a decade. I doubt there is massive appetite for expanding production capacity.

On the other hand, the total storage capacity shipped each year has risen, as a combination of HDDs getting larger and larger, and demand shifting from smaller consumer HDDs to larger data center, enterprise and NAS HDDs. I'm not sure how flexible those production lines are, but maybe the reaction will be shifting even more capacity to higher-capacity drives with cutting-edge technology.

an hour agowongarsu

Server-grade hardware (rack blades) is already a poor fit for consumer needs, and AI-dedicated hardware straight up requires external liquid cooling systems. It will be expensive to adopt them.

an hour agozvqcMMV6Zcr

If it takes 2 years to increase, after 2 years everything will be thin clients already. Completely locked in, fully under control and everybody used to it. Very dystopian TBH.

2 hours agomrtksn

> According to Mosley, Seagate is not expanding its production capacities for now. Growth is to come only from higher-capacity hard drives, not from additional unit numbers.

2 hours agocubefox

True, if production capacity increases. But it's an oligopoly, and manufacturers are being very cautious because they don't want to cut into their margins. That's the problem with concentration. The market becomes ineffective for customers.

2 hours agoStopDisinfo910

It's not about cutting into their margins. If they end up scaling up production it will take several years and cost an untold number of billions. When the AI bubble pops, if there's no replacement demand there's a very real chance of them going bankrupt.

2 hours agoMaxion

Damn. First GPUs, then RAM, now hard drives?

What's next, the great CPU shortage of 2026?

2 hours agofnands

What's next is no custom built PCs. They want us running dumb thin clients and subscribing to compute. Or it will be like phones. We'll get pre-built PCs that we aren't allowed to repair and they'll be forced to be obsolete every few years.

an hour agoFervicus

"they"? i see companies jacking their prices up, plain and simple. and us idiots still pay. ask yourself does intel no longer wish to sell CPUs to consumers? doesnt sound reasonable that intel would want to decimate their main market so AI companies can rule the world for some reason

18 minutes agosqueefers

I think hard drives was before RAM but it kind of all happened contemporaneously.

2 hours agoloeg

Better start hoarding Silica.

2 hours agobilekas
[deleted]
2 hours ago

There hasn't been a better time in the past 15 years to push for a new video or image codec. Saving storage space is important again.

This is assuming most of what we store is either images or video.

an hour agoksec

Do the guys that buy out the market have a real use for all the hardware, or is it just hype? A solution against investors trying to corner the market would be to sell virtual hardware. Let them buy as many options on virtual "to be delivered" hardware as they want. We also need an options market for virtual LLM tokens, where the investors can put all their money without affecting real people.

an hour agoadornKey

I'll go against the grain and claim this might be a good thing long term. Yes, it sucks too; I was planning to expand my NAS, but I guess I'll figure out how to compress stuff instead.

Which goes into why I think this might be good. Developers have kind of treated disks as "oh well", with binaries ballooning in size even when it can easily be solved, and there is little care to make things lightweight. Just like I now have to figure out a different solution to recover space, I'm hoping that with a shortage this kind of thing will be more widespread, and we'll end up with smaller things until the shortage is over. "Necessity is the mother of all invention" or however it goes.
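For the "compress stuff instead" route, here is a minimal sketch using only the standard library's lzma module; the directory path and size threshold are placeholders to fill in, and the originals are kept so nothing is deleted.

```python
# Minimal sketch: recompress rarely-touched files with xz (lzma) to claw back disk space.
# The directory path and size threshold are placeholders; lzma is in the standard library.
import lzma
from pathlib import Path

def compress_file(path: Path) -> int:
    """Compress `path` to `path.xz` and return the bytes saved (original is kept)."""
    data = path.read_bytes()
    out = path.with_suffix(path.suffix + ".xz")
    out.write_bytes(lzma.compress(data, preset=6))
    return len(data) - out.stat().st_size

def compress_tree(root: Path, min_size: int = 10 * 1024 * 1024) -> int:
    """Walk `root` and compress files larger than `min_size`, skipping existing .xz files."""
    saved = 0
    for p in root.rglob("*"):
        if p.is_file() and p.suffix != ".xz" and p.stat().st_size >= min_size:
            saved += compress_file(p)
    return saved

if __name__ == "__main__":
    total = compress_tree(Path("/srv/nas/archive"))  # placeholder path
    print(f"Saved {total / 1e9:.2f} GB")
```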

2 hours agoembedding-shape

There is an increasing chance the "invention" will be that nobody owns personal computers and everyone has to rent from the cloud.

2 hours ago63stack

Think that's more of a "silver lining" instead of the overall trend being a "good thing long term." It's still pretty terrible.

2 hours agoaltmanaltman

Looks like we need computer hardware reserves the same way there are regional reserves for food, fuels and other critical commodities?

And for the same reason - to avoid the dominant players going "oh shiny" on short term lucrative adventures or outright trying to manipulate the market - causing people to starve and making society grind to a halt.

2 hours agom4rtink

The real "computer hardware reserves" is the used market. Average folks and smaller businesses will realize that their old gear now has resale value and a lot more of it will be entering the resale/refurbishment market instead of being thrown away as e-waste.

2 hours agozozbot234

I picked up a few hundred TB from a chia farm sale. Glad for it. I think I'm set for a while. Honestly, the second they started buying this stuff I started buying hardware. The only problem for me is that they're even ruining the market for RTX 6000 Pro Blackwells.

2 hours agoarjie

Time for some heavy regulation

2 hours agosteve1977

That's not going to happen when AI is already propping up a significant chunk of the economy.

There is appetite in some circles for a consumer boycott but not much coordination on targets.

2 hours agotjpnz

> There is appetite in some circles for a consumer boycott

its not being used anywhere lol where are they meant to boycott?

17 minutes agosqueefers

Save us, China.

2 hours agota9000

It just goes to show how totally corporations have captured western aligned governments. Our governments are powerless to do anything (aside from some baby steps from the EU).

China is now the only solution to fix broken western controlled markets.

2 hours agophatfish

If component prices keep going up and the respective monopoly/duopoly/triopoly for each component colludes to keep prices high/supply constrained, then eventually devices will become too expensive for the average consumer. So what’s the game plan here? Are companies planning to let users lease a device from them? Worth noting that Sony already lets you do this with a PS5. Sounds like we’re headed towards a “you will own nothing and be happy” type situation

2 hours agofastily

> Sounds like we’re headed towards a “you will own nothing and be happy” type situation

That's when I sell off my current hardware and house, buy a cow and some land somewhere in the boondocks and become a hermit.

2 hours agoMaxion

It could be a level up from that.

"You will use AI, because that will be the only way you will have a relaxed life. You will pay for it, own nothing and be content. Nobody cares if you are happy or not."

2 hours agob3lvedere

We could also vote the politicians protecting these uncompetitive markets out of power and let regulators do their job. There have been too many mergers in the component market.

You also have to look at the current status of the market. The level of investment in data centers spurred by AI is unlikely to last unless massive gains materialize. It's pretty clear some manufacturers are betting things will cool down and don't want to overcommit.

2 hours agoStopDisinfo910

No one can be surprised to see that all of these artificial "shortages" are impacting components produced by a monopoly or only a few actors...

an hour agogreatgib

That's the electronics industry in general though. The shortages are real and a normal part of growing pains for any industry that's so capital-intensive and capacity constrained.

an hour agozozbot234

/dev/null as a service, mooning

2 hours agobravetraveler

Supply of 2nd hand enterprise stuff is also showing a slowdown. Seeing less of it show up on eBay.

2 hours agoHavoc

Future Show HN: how I managed to parallelize 100 tape drives to load Windows and play video games

2 hours agomoomoo11

VHS cassettes: maybe not so obsolete, after all?

Also, the Return of PKZIP.

2 hours agoblackhaz

Video Backup System for Amiga!

2 hours agoKeyframe

Stacker and Doublespace!

2 hours agobaal80spam

memmaker and qemm

2 hours agoboobsbr
[deleted]
2 hours ago

First they came for the GPUs, but I did not speak out, for I was not a gamer.

Then they came for the RAM, but I did not speak out, for I had already closed Firefox.

Then they came for the hard drives, but I did not speak out, for I had the cloud.

Then my NAS died, and there was no drive left to restore from backup.

2 hours agofnands

I hope the data centres burn

2 hours agonewsclues

Repairability, upgradability and standards compliance need to be the minimum in consumer products. No to proprietary connectors. No soldered SSD or RAM. For home use, allow relaxed licensing options for discarded enterprise products like switches, WiFi access points etc. (Juniper Mist APs are fantastic, but are a brick without their cloud). Currently, I cannot put a market-bought SSD in my MacBook. I cannot put an SSD in my Unifi router without buying their $20 SSD tray. I cannot put third-party ECC RAM and SSDs in my Synology NAS because the policy has only been lifted on HDDs but nothing else. I fear the opposite will happen. Only leveraged companies have access to DRAM and NAND, and hence will use it to lock us into their ecosystem, as consumers won't even get access to storage in the open market otherwise.

an hour agoiamshs

But hey, we get slop videos of the pope doing something funny, that's just as cool as being able to purchase computer hardware, right?

2 hours agolpcvoid

To be fair, heise is a German news site, and the very article is auto-translated by AI from its German counterpart.

2 hours agolittlecranky67

Machine translation had become pretty good long before the AI hype started.

an hour agokasabali

AI use for translation is a good fit for the tech. The problem is generative AI.

2 hours agolpcvoid

We can also lose our jobs in a few years so there’s that

32 minutes agocoffeebeqn

I'm confused, that doesn't make sense to me:

> They largely come from hyperscalers who want hard drives for their AI data centers, for example to store training data on them.

What type of training data? LLMs need relatively little of that. For example, DeepSeek-V3 [1], still a relatively large model:

> We pre-train DeepSeek-V3 on 14.8 trillion diverse and high-quality tokens

At 2 bytes per token, that's 29.6 terabytes. That's basically nothing compared to the amount of 4K content that is uploaded to YouTube every day.

1: https://arxiv.org/html/2412.19437v1
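A quick back-of-the-envelope check of the 29.6 TB figure, with the drive size (18 TB, a common enterprise capacity) as an assumption for scale:

```python
# Back-of-the-envelope check of the figure above.
tokens = 14.8e12          # DeepSeek-V3 pre-training tokens, per the paper
bytes_per_token = 2       # rough average assumed in the comment above
corpus_bytes = tokens * bytes_per_token
print(f"Pre-training corpus: ~{corpus_bytes / 1e12:.1f} TB")   # ~29.6 TB

# For scale: 18 TB is a common enterprise HDD capacity (assumption for illustration).
print(f"That is ~{corpus_bytes / 18e12:.1f} x 18 TB drives")   # ~1.6 drives
```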

2 hours agocubefox

Honestly this looks highly suspicious to me. OK, they might need some big storage, like petabytes. But how can this be in proportion with the capacity that is already needed for everything that is hard-drive hungry: any cloud service, any storage service, all the storage needed for private photo/video/media for everything that is produced every day, all consumer hardware like computers...

GPUs I understand, but hard drives look excessive. It's like if tomorrow there were a shortage of computer cabling because AI data centers need some.

an hour agogreatgib

If you're building for future training needs and not just present, it makes more sense. Scaling laws say the more data you have, the smarter and more knowledgeable your AI model gets in the end. So that extra storage can be quite valuable.
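For reference, that claim is usually stated via the Chinchilla-style parametric loss (Hoffmann et al., 2022), in which more training tokens D strictly shrinks one of the loss terms; the constants are fit empirically.

```latex
% Chinchilla-style parametric loss (Hoffmann et al., 2022): E is the irreducible
% loss, N the parameter count, D the number of training tokens; A, B, alpha, beta
% are constants fit empirically. More data D shrinks the last term.
L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
```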

an hour agozozbot234

If you’re building a text-only model then the storage is limited, but once you get to things like video it’ll explode exponentially.

30 minutes agocoffeebeqn

You may have answered your own question if they're wanting to train models on video and other media.

an hour agoJach

they’re pushing for AI, but nobody will have a device to use it?

2 hours agoicf80

Chromebook with a 64gig shitty eMMC is what Google and friends would love you to use. Pay that cloud drive subscription!

2 hours agoHavoc

The TV is best device for unleashing your creativity by upvoting your favourite Sora creators! Become an expert at any field by activating premium prompts from our partners! By connecting camera you can have meaningful motivating discussions with your deceased loved ones (camera required for fraud prevention. Remember, not looking at the screen during ads is punishable by law)

You have 31/999 credits remaining. What activity would you like to productively participate in today?

2 hours ago112233

I feel traditional "rust" hard disks would be inefficient for AI use. Unless they include SSDs (which I feel these data centers are more likely to be using) in the definition as well...

2 hours agoFMecha

They need it to hoard datasets.