Ask HN: How long before the first civilian cargo flights are AI piloted?
Is it 2026? Within 2 years? 5 years? 10 years?
I can understand how passenger flights will take a while longer - but wouldn't cargo flights, which don't have nearly the same safety concerns, be AI piloted much sooner? If so, how much sooner?
I don't see the economic pressure pushing for that.
To first order, the bigger a vehicle is, the less you worry about the cost of the pilot/driver. The biggest untold story in aviation is the battle between the pilots of small "regional aircraft" vs "mainline aircraft"; the former generate less value for the same amount of work and necessarily get paid less. Unions have enforced "scope clauses" that have prevented a new generation of slightly larger regional aircraft which could lower costs at small airports and have no trouble getting filled as those lower costs get passed on to consumers. As it is, small airports are dying out, harming smaller cities and towns and giving the "left behind" all the more reason to lash out.
Similarly there is a lot of talk of crisis in truck driving, both at the local and long-haul levels. My brother-in-law has a CDL and he is always talking about how inexperienced drivers seem to wedge their trucks on a bridge in Binghamton once a month, stall out on the highway and get into accidents, etc.
This is true for so many things - even driverless taxis, drone deliveries, even office jobs / AI.
The narrative is that human labor is super expensive, there are "skills shortages", etc etc... but in actuality, hiring a few people rounds down to 0 in the context of an airliner or an office building in Manhattan, and you get a lot of political sway for employing folks and paying payroll taxes, and the "doorman fallacy" is very real. The "robots taking our jobs" narrative seems hugely exaggerated to me.
Yes, I agree, I don't see cargo plane pilots being replaced here. Or indeed commercial pilots in general.
Pilots are an unusual species because most of their utility and training is in the ability to deal with (very) edge cases. Indeed it is their ability to deal with unanticipated edge cases which is their most valuable attribute.
Sure, most pilots won't need those skills during their careers, but the value when they do is immeasurable. Landing on the Hudson, anyone?
Equally, it is their situational awareness and anticipation of problems that turn things which could have escalated into disasters into mere near misses.
Sure, 99.99% of their work is routine and could be done by a machine. But that last 0.01% is thousands of lives, and billions in equipment and cargo.
You don't see the economic pressure for doing that, but single-pilot operations are a target for Airbus, so "one pilot instead of two" is already compelling.
If it's a 737 delivering pallets of dog food or humans in seats, the safety concerns are the same. They take off and land at the same airports and can collide with other airplanes. The stuff on the plane doesn't mean there are different safety checks.
Autopilot can be used for nearly everything after take-off and before landing, so I think you'll need to define "AI" here. I see people using "AI" almost interchangeably these days for things that plain old computers have been doing for a while now. Autopilot is not AI; it's just a set of instructions (aka programming) given to a computer.
Airports have designed approaches (large airports have multiple), and there is a need to communicate with other humans, react to dynamic environments including weather and other aircraft (both airborne and on taxiways), and have actual vision out the cockpit to see things.
Even if that's doable, I suspect most of us would want to sit in a plane with a human in the cockpit. It's like trains and subways where most of the work is mundane but you want someone at the helm to deal with SHTF situations.
I'm still skeptical about driverless taxis, even when commercial ones like Waymo are already running on the road. But at least taxis run on regulated, two-dimensional roads, and most of the other drivers are sane enough.
In the language of the moment, AI means LLMs. The answer there is "never".
If you say autopilot that implies a wildly different technology. I think the first successful autopilot landing with passengers onboard happened last week.
Taking off is much easier than landing and if passengers aren’t involved…
I guess the question is not about technology (might be ready now) and is instead about regulation (when will the FAA allow fully autonomous flights). I'm guessing the current generation of regulators will need to die before that happens, so 25+ years.
Takeoffs are substantially more hazardous than landing. The margin for error is small, and the options for handling the unexpected are minimal.
To be clear, take-off is more than just the runway. It extends to the point where the aircraft has sufficient energy, which can be translated into time.
Flying a plane is all about energy. Speed, and altitude. One can be traded for another and both can be traded for time. In other words, the faster the plane, the higher the plane, the more options available to the pilot.
The most dangerous part of the flight is the initial climb just after takeoff. Speed is low. Height is very low. You are literally flying away from the runway. If an engine or two fails at this point, crashing is a likely outcome. However, how, and where, you crash matters a lot.
By contrast landings are much safer, the airplane has excess energy to play with, and is heading directly at a nice long flat piece of ground.
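To make the energy-trading point concrete, here's a minimal back-of-the-envelope sketch (my own illustration, not from the thread): ignoring drag and thrust, the altitude a plane can buy by bleeding off airspeed follows from equating kinetic and potential energy, Δh = (v1² − v2²) / (2g). The function name and the 250→200 knot figures are just hypothetical examples.

```python
# Idealized energy trade: convert excess airspeed into altitude.
# Assumes no drag or thrust changes - illustrative only.
G = 9.81            # gravitational acceleration, m/s^2
KT_TO_MS = 0.514444  # knots to meters per second

def altitude_gain(v1_kt: float, v2_kt: float) -> float:
    """Altitude (m) gained by slowing from v1 to v2,
    trading kinetic energy for potential energy:
    dh = (v1^2 - v2^2) / (2g)."""
    v1 = v1_kt * KT_TO_MS
    v2 = v2_kt * KT_TO_MS
    return (v1 ** 2 - v2 ** 2) / (2 * G)

# Slowing from 250 kt to 200 kt buys on the order of 300 m (~1000 ft),
# which is the "excess energy to play with" a landing aircraft has
# and a just-departed one does not.
print(altitude_gain(250, 200))
```

The same arithmetic run in reverse shows why the initial climb is so unforgiving: at low speed and low altitude there is almost no energy in either account to trade.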
Auto-pilot takeoffs are "easy" only to the V2 mark, then they can get very complicated very quickly.
> I think the first successful autopilot landing with passengers onboard happened last week.
What definition are you using for "autopilot landing" here? Autopilots have been able to land planes for quite some time.
Conversely, autopilots still can't handle takeoff (though the more modern airliners have significantly more takeoff automation than before).
Is autopilot takeoff a hard problem, or is there just little incentive to do it?
I.e. if takeoff is dangerous you don't do it, making takeoff generally easier. The option to not land doesn't exist (insert old joke about a good landing)! So landing may have harder problems like wind shear, thunderstorms, tailwinds, and visibility, and more incentive to take load off pilots with automation.
I agree about incentive (with a small asterisk that if you do decide to take off when you shouldn't, now you have two problems), but taking off isn't particularly easier.
E.g. you have to stay on the runway, you have decisions to make about whether to reject the takeoff (the asterisk I mentioned earlier), etc.
For a computer, I'd say it's six of one, half a dozen of the other. For runway equipment (i.e. ILS), it probably changes enough that you couldn't "just use ILS" for takeoff.
I did infra work for a team that was contributing to Airbus' ATTOL (automated taxi, takeoff, and landing) project a few years ago. They (the ATTOL folks, not the side I was on) did do an automated takeoff demo and it definitely wasn't simple.
Possibly a reference to the recent event described here:
https://fallows.substack.com/p/a-positive-sign-for-flying-in...
tl;dr: The aircraft detected pilot incapacitation and landed on its own initiative.
That's what I figured yeah, though that's GA, not cargo planes, and emergency rather than normal operations. Lots of differences.