I feel like the difference between Steve Jobs’ and Tim Cook’s leadership styles is that Cook is really good at optimizing existing processes, but does not have the vision to capitalize on what’s next.
Apple got into the smartphone game at the right time with a lot of new ideas. But whatever the next big shift in technology is, they will be left behind. I don’t know if that is AI, but it’s clear that in AI they are already far behind other companies.
Just my opinion:
Apple doesn't need to solve AI. It's not core to their business in the same way that search engines aren't core to their business.
What Apple does best lies at the combination of hardware, software, physical materials, and human-computer interface design. This is why they're spending so much more on mixed reality than AI, even knowing that a product like the Vision Pro today isn't going to be a big seller. It's why they're investing in their own silicon. This strategy tends to yield unexpected wins, like the Mac Mini suddenly becoming one of the hottest computers in the world because it turns out it's amazing for sandboxing agents if you don't want to use the cloud, or the Mac Studio becoming arguably the best way to run local AI models (a nascent space that is on the cusp of becoming genuinely relevant), or the MacBook Pro becoming by far the best laptop in the world for productivity in the AI age (and it's not even close).
Your conclusion is that they're going to be left behind, but the evidence is that they're already well ahead in the areas that are core to their business. They can trivially pay Google a billion a year for Gemini. Nobody else can do what they can in the fusion of hardware, software, and materials as long as they stay focused.
Where they genuinely slipped up was their marketing -- an unusual mistake for Apple. And that does indeed lie with the CEO.
I think the issue here is the public promises that were made. Jobs tended not to do that. Things were announced and released when they were ready, which gave them the time to do it right, without any delays.
Sure, there were things like AirPower and the MobileMe widgets… things that were announced but never shipped. By and large, though, a big new thing was announced, and a week later it would ship. The iPhone was only announced 6 months early to avoid it being leaked by a compliance filing (or maybe it was patents).
Cook would be wise to go back to this instead of promising the shareholders things he can’t deliver on.
I think slow playing AI is the right move for Apple. Third party apps give their customers access today, and Apple can take the time to figure out how AI fits into a large cohesive vision for their products and ecosystem… or if it fits at all. Rushing something out doesn’t do anyone any favors, and has never been Apple’s competitive advantage.
All I really want from Apple is to continue perfecting their computers, phones and tablets to be the absolute best computing devices possible. As long as they keep iteratively improving those things I don’t care if they’re thought or innovation leaders in whatever hot new thing comes along.
To be fair, no one has solved the AI assistant at the consumer level yet.
I agree. It’s still being figured out.
My prediction is that Apple is the hardware and platform provider (like it’s always been). We’re not asking them to come up with a better social media, or a better Notion or a better Netflix.
I think their proprietary chips and GPUs are being undervalued.
My feeling is that they’re letting everyone move fast and break things while trailing behind and making safe bets.
I feel like, today, most of the other LLM providers can do what "Apple Intelligence" promised - it'll link with my email/calendar/etc and it can find stuff I ask with a fuzzy search.
That said, I don't really use this functionality all that often, because it didn't really (effortlessly) solve a big need for me. Apple sitting out LLMs means they didn't build this competency along the way, even when the technology is now proven.
I think the same thing is true with VR - except Apple did invest heavily in it and bring a product to market. Maybe we won't see anything big for a while, and Silicon Valley becomes the next Detroit.
It's easy: make my life easier.
Instead they choose to optimize for shareholder value.
This was such a self-inflicted own goal. Siri has needed work for years, and every year they neglected it. When they first bought Siri it was state of the art, and then it just languished. Pulling an Intel and sweating your assets until it's too late is never a good idea.
It doesn't surprise me that Siri continues to be bad - Apple's current plan is to use a low-quality LLM to build a top-quality product, which turned out to be impossible.
What does surprise me is that Google Home is still so bad. They rolled out the new Gemini-based version, but if anything it's even worse than the old one. Same capabilities but more long-winded talking about them. It is still unable to answer basic questions like "what timer did you just cancel".
This is obviously a death march project. Just delay it indefinitely until the Gemini-based Siri chatbot is ready. Why ship something half-assed?
The referenced Bloomberg source (https://www.bloomberg.com/news/articles/2026-02-11/apple-s-i...) says this about the delayed effort:
But it’s been a complex undertaking. The revamped Siri is built on an entirely new architecture dubbed Linwood. Its software will rely on the company’s large language model platform — known as Apple Foundations Models — which is now incorporating technology from Alphabet Inc.’s Google Gemini team.
I worked at Siri (post acquisition) 13 years ago as one of the early data scientists. Let's just say I am not a bit surprised.
I got myself an iPhone 16 Pro because of the promised AI features. I had a vision in my mind of what it ought to be like:
While driving past a restaurant, I wanted to know if they were open for lunch and if they had gluten-free items on their menu.
I asked the "new" Siri to check this for me while driving, so I gave it a shot.
"I did some web searches for you but I can't read it out to you while driving."
Then what on earth is its purpose if not that!? THAT! That is what it's for! It's meant to be a voice assistant, not a laptop with a web browser!
I checked while stopped, and it literally just googled "restaurant gluten free menu" and... that's it. Nothing specific about my location. That's nuts.
Think about what data and access the phone has:
1. It knows I'm driving -- it is literally plugged into the car's Apple CarPlay port.
2. It knows where I am because it is doing the navigating.
3. It can look at the map and see the restaurant and access its metadata such as its online menu.
4. Any modern LLM can read the text of the web page and summarize it given a prompt like "does this have GF items?"
5. Text-to-voice has been a thing for a decade now.
How hard can this be? Siri seems to have 10x more developer effort sunk into refusing to do the things it can already do instead of... I don't know... just doing the thing.
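To make the point concrete, here's a toy sketch of the pipeline those five points describe. Everything in it is hypothetical: the Maps lookup, the LLM call, and the text-to-speech are all stubbed with placeholders so it runs standalone, but the shape is just "look up the nearby place, ask a model about its menu, read the answer out loud":

```python
def nearby_restaurant_menu(location):
    """Stand-in for a Maps metadata lookup: return menu text for the
    restaurant nearest `location`. Real code would fetch the linked menu URL."""
    return "Lunch 11am-2pm. Burgers, salads, gluten-free buns available."

def ask_llm(question, context):
    """Stub for any modern LLM API answering a question about a web page.
    A simple keyword check stands in for the model here."""
    if "gluten-free" in context.lower():
        return "Yes, they list gluten-free options, and lunch runs 11am to 2pm."
    return "I couldn't find gluten-free items on the menu."

def speak(text):
    """Stub for text-to-speech; a real assistant would read this aloud."""
    return f"[spoken] {text}"

# The assistant already knows the location from navigation (point 2).
menu = nearby_restaurant_menu(location=(37.33, -122.01))
answer = ask_llm("Does this menu have GF items, and are they open for lunch?", menu)
print(speak(answer))
```

None of the individual steps is exotic; the glue between them is the part Siri apparently never got.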
I'd rather they get it right than release it unfinished.
Do similar issues exist with Gemini on Android?
Or are these challenges very Siri/iOS specific?
Gemini can and does send everything to Google.
Apple's challenge is they want to maintain privacy, which means doing everything on-device.
Which is currently slower than the servers that others can bring to the table - because they already grab every piece of data you have.
Is it not impressive what xAI did with Grok? It's already integrated into Twitter and my Tesla, and so quickly. What prevented Apple from doing the same and building out their equivalent of Grok?
This is not a bad example. Tesla is indeed running a custom, in-house LLM in their vehicles, capable of acting as a general chatbot and issuing commands to the car. While Grok is not up to par with other frontier models, it's certainly far beyond Siri.
Are Apple's AI agent delays bearish for AI agents in general? Unless something else is the issue, it's normal behavior for Apple not to implement something everyone else already has until it's very good and solid.
Apple wants to vertically integrate. Until recently, their AI strategy was to develop their own LLMs, small enough to run on device. But massive scale is what makes LLMs so powerful, so all their internal models were terrible and unusable.
Basically, they bet that compute-efficient LLMs were the future. That bet was wrong; the opposite came true.
Not sure whether it's a language/pronunciation issue, but in the 15 years since Siri was released I have not seen a single person use it successfully without having to yell at it for not waking up or for misunderstanding the request.
I was cleaning my room today, and while I was wiping dust from the bookshelf the HomePod sits on, Siri out of the blue went, "I thought so."
It'll be 15 years this October, and I still can't use Siri in my language.
Damn paywalls! Sorry, I shouldn't be so negative. I'd just like to be able to read the article.
https://archive.is/2026.02.11-194917/https://www.bloomberg.c...