Ask HN: Why is Apple's voice transcription hilariously bad?
Even 2–3 years ago, OpenAI’s Whisper models delivered better, near-instant voice transcription offline — and the model was only about 500 MB. With that context, it’s hard to understand how Apple’s transcription, which runs online on powerful servers, performs so poorly today.
Here are real examples from using the iOS native app just now:
- “BigQuery update” → “bakery update”
- “GitHub” → “get her”
- “CI build” → “CI bill”
- “GitHub support” → “get her support”
These aren’t obscure terms — they’re extremely common words in software, spoken clearly in casual contexts. The accuracy gap feels especially stark compared to what was already possible years ago, even fully offline.
Is this primarily a model-quality issue, a streaming/segmentation problem, aggressive post-processing, or something architectural in Apple’s speech stack? What are the real technical limitations, and why hasn’t it improved despite modern hardware and cloud processing?
>they’re extremely common words in software, spoken clearly in casual contexts
Extremely common phrases in software are extremely uncommon phrases for most of the world.
OK, fair point. My examples were taken from my immediately previous transcript; however, this is not an intermittent issue. It is consistent. Terrible, hilariously bad performance.
That’s sad. I tried to prove it terrible in this comment by dictating it, hoping to show you some examples, but the transcript is essentially accurate. Aside, that is, from the “sad”/“said” slip above, a homonym error.
So there should probably be some sort of jargon-user probability setting, evaluated from your phrase usage.
First off, there must be some phrases more common in development than elsewhere that it does get correct; a large number of those indicates a high chance of being a software-jargon user. Furthermore, the phrases it misses are not themselves common in non-software usage, so if you use a lot of phrases that are plausibly (with lower, but still relatively high, probability) software jargon, the setting could be flipped.
Then we also get to personal behavior tracking: if you are on dev sites a lot, the chance that you are using software jargon rather than non-software jargon goes up.
Do you use your computer for development? Chances go up. Of course there are lots of reasons why they would not track this, to keep people from being pissed, but it is still a possible way to improve via tracking.
Finally, allow people to create a profile, which I don't know if they already do because I don't use one.
Of course this kind of software-dev jargon workflow would also help other identifiable subgroups with specific jargon sets, like lawyers, accountants, and so on.
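The idea above can be sketched in a few lines. This is a hypothetical toy sketch, not anything Apple actually does: the jargon lexicon, the prior values, and the boost weight are all invented for illustration. It estimates a "jargon-user" probability from past phrases, then uses it to re-rank an ASR n-best list so jargon-containing hypotheses win when the prior is high enough.

```python
# Hypothetical sketch of jargon-prior re-ranking. All names, scores,
# and weights are invented for illustration.

SOFTWARE_JARGON = {"bigquery", "github", "ci", "build", "repo", "deploy"}

def jargon_user_probability(history, boost=0.15):
    """Estimate P(user speaks software jargon) from past transcripts."""
    p = 0.05  # weak prior: most users aren't developers
    for phrase in history:
        if any(w in SOFTWARE_JARGON for w in phrase.lower().split()):
            p = min(0.95, p + boost)  # each jargon hit raises the prior
    return p

def rerank(hypotheses, p_jargon):
    """Pick the best hypothesis after boosting jargon-containing ones.

    `hypotheses` is a list of (text, acoustic_score) pairs, as an
    ASR n-best list would supply.
    """
    def score(pair):
        text, acoustic = pair
        has_jargon = any(w in SOFTWARE_JARGON for w in text.lower().split())
        return acoustic + (p_jargon if has_jargon else 0.0)
    return max(hypotheses, key=score)[0]
```

With a fresh user (prior 0.05), "get her support" still beats "github support"; after a few jargon-laden transcripts raise the prior, the re-ranker flips to the jargon reading even when its acoustic score is slightly lower.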
Yeah, all of these are good ideas. And I think they should also utilize the abundant context that's obviously available to them: any message you're sending.
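That context-biasing idea can also be sketched. Again a hypothetical toy, with invented weights: it boosts any n-best hypothesis whose words overlap with the surrounding message or thread, so "CI build" beats "CI bill" when "build" already appears in the conversation.

```python
# Hypothetical sketch of context biasing. The bonus weight and the
# word-overlap heuristic are invented for illustration.

def context_bias(hypotheses, context_text, bonus=0.3):
    """Prefer the hypothesis sharing the most words with the surrounding message.

    `hypotheses` is an ASR n-best list of (text, acoustic_score) pairs;
    `context_text` is e.g. the thread the user is replying to.
    """
    context_words = set(context_text.lower().split())
    def score(pair):
        text, acoustic = pair
        overlap = sum(1 for w in text.lower().split() if w in context_words)
        return acoustic + bonus * overlap
    return max(hypotheses, key=score)[0]
```

Real systems expose something in this spirit: Apple's own Speech framework lets an app pass a list of expected phrases to bias recognition, which is exactly the kind of signal a reply box could supply automatically.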