Welcome to Tesla Motors Club

Lion Air Crash - some conjecture that change in autopilot override may have played a role

Tesla does a great job of teaching the drivers of its cars how to override the autonomous systems.

They do so with occasional "What the hell is it doing?" moments.

If only Boeing had its planes terrorize its pilots occasionally.

For a truly autonomous car, I'm hoping we'll have a big red button that initiates some safety protocol to stop the vehicle safely.
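Such a "big red button" could be imagined as a small state machine that, once pressed, walks through a fixed safe-stop sequence. A purely hypothetical sketch (the phases and their triggers are invented for illustration, not any manufacturer's design):

```python
from enum import Enum, auto

class StopPhase(Enum):
    CRUISING = auto()
    SIGNALING = auto()      # hazards on, alert surrounding traffic
    PULLING_OVER = auto()   # steer toward the shoulder, bleed off speed
    STOPPED = auto()

class EmergencyStop:
    """Hypothetical 'big red button' controller: once pressed, the car
    walks through a fixed safe-stop sequence and won't abandon it."""
    def __init__(self):
        self.phase = StopPhase.CRUISING

    def press(self):
        # the button only starts the protocol; it can't be "un-pressed"
        if self.phase is StopPhase.CRUISING:
            self.phase = StopPhase.SIGNALING

    def tick(self, speed_mph, shoulder_clear):
        # advance the protocol one control cycle at a time
        if self.phase is StopPhase.SIGNALING:
            self.phase = StopPhase.PULLING_OVER
        elif (self.phase is StopPhase.PULLING_OVER
              and shoulder_clear and speed_mph == 0):
            self.phase = StopPhase.STOPPED
        return self.phase
```

The key design choice is that the sequence is one-way: the protocol never returns control until the car is fully stopped.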
 
I saw a documentary about an Airbus crash a few years ago. The pilot allowed a passenger to sit in his seat mid-flight, and that passenger pulled on the yoke for an extended period of time. Only that one axis of the autopilot turned loose; the other axes remained in control. The aircraft began to crab, and the pilot was unaware the autopilot was still functioning and fighting him. This "one axis" behavior is similar to me holding the steering wheel and tugging hard enough that Autosteer lets go while cruise control stays on in the Model S.
 
...override...

My understanding of the Autopilot system is: the human is always the boss. Whatever action the human takes overrides the automation system at any time, no questions asked, no talking back, no resistance, nothing!

There's a philosophy that humans are not to be trusted, which might explain the procedure change in Boeing's latest software that featured in the Lion Air crash.

An older Autopilot would yield its controls to the human when it detected human intervention. The problem is: sometimes the human pilot didn't mean to intervene, such as with an accidental bump or by holding the controls too firmly. Pilots were then caught off guard because they didn't realize the plane was now in manual mode.

That's the same complaint with Tesla Autopilot: Autosteer was turned off because it detected overriding torque, but owners didn't realize the steering was now manual, with no deafening siren or blinding flashing lights to warn them.
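One way to make that mode change unmistakable is to tie the disengagement to a mandatory alert. A minimal sketch, with an invented torque threshold and alert callback, not Tesla's actual implementation:

```python
class AutoSteer:
    """Sketch of torque-based override detection where disengaging
    always fires an alert, so the driver is never silently left
    holding manual control. All names and thresholds are invented."""
    OVERRIDE_NM = 3.0  # steering-wheel torque threshold, illustrative

    def __init__(self, alert):
        self.engaged = True
        self.alert = alert  # callback: chime + visual warning

    def sense_torque(self, torque_nm):
        if self.engaged and abs(torque_nm) >= self.OVERRIDE_NM:
            self.engaged = False
            # the alert is coupled to the state change, not optional
            self.alert("AutoSteer disengaged - you have manual steering")
        return self.engaged
```

The point is that the transition to manual and the warning are one atomic step, so there is no code path where the system hands back control quietly.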

Back to commercial airline Autopilot: there have been incidents where pilots forgot their very basic training on how to keep their planes up in the air without dropping like a brick.

In an emergency, some pilots' instinct is that they don't want to fall out of the sky, so they pull the plane's nose up, aiming for higher sky. However, that's against their basic training: a prolonged nose-up attitude causes the plane to stall, and then the plane drops like a brick.

My speculation is: to fix this forgetfulness of the basic training on avoiding stalls, Boeing implemented MCAS (Maneuvering Characteristics Augmentation System).

What is revolutionary about this software update is that it sticks to its basic training: nose down to avoid stalling, at all costs, even when the human pilots keep fighting to level the plane back (remember the philosophy of trusting or not trusting humans?).

The problem is: when there is no reason to nose down at all, as in the case of Lion Air, the system kept nosing down more than two dozen times, and the humans didn't have the strength to fight off the machine, and lost.

It turns out the new software does allow the human to take over control, but not as simply as before.

Prior to this software update, detection of any slight force on the flight controls would disable the automation system and hand the controls back to the human.

No, not with this new software: fighting the system through the control column does not disable the automation.

Pilots need to be trained on the procedure for shutting off electricity to the motorized trim system so that a human can control it manually.

Remember, the pilots had only 24 seconds, while fighting off tremendous G-forces, to finish the procedure as the plane plunged from an altitude of 4,850 feet and crashed into the sea.
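The two override philosophies in this thread, "any force on the controls disengages" versus "only the cutout switches disengage," could be contrasted in a toy model like this (purely illustrative; not Boeing's actual logic, thresholds, or terminology):

```python
class TrimAutomation:
    """Toy contrast of two override philosophies for an automated
    trim system: 'legacy' hands over on column force; 'mcas-style'
    keeps commanding nose-down until electric trim power is cut."""
    def __init__(self, style):
        self.style = style          # "legacy" or "mcas"
        self.active = True
        self.trim_powered = True

    def column_force(self, lbs):
        # legacy philosophy: significant force on the controls hands over
        if self.style == "legacy" and lbs > 10:
            self.active = False

    def cutout_switches(self):
        # mcas-style escape hatch: kill power to the trim motors entirely
        self.trim_powered = False
        self.active = False

    def commands_nose_down(self):
        return self.active and self.trim_powered
```

In the "mcas" mode, no amount of force on the column changes anything; only knowing about (and reaching) the cutout procedure stops the nose-down commands, which is exactly the training gap being argued about here.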

Boeing said it had already given training to pilots. The pilots' union denied that any such orientation took place.

Boeing said the procedure is spelled out in the manual. Lion Air denies that any such procedure was included in the plane's manual.

Nevertheless, Boeing has now issued a bulletin so that everyone knows how to carry out the procedure to turn off MCAS, and no one can claim any longer that they never got it.

So we are still at a dilemma: should we trust human pilots to remember their basic training, or should we trust new software without training pilots on how to disable it?

I think, since we are still in an imperfect world, it's nice to have automation, but the human operator's skill still needs to be in the loop.
 
...I think, since we are still in an imperfect world, it's nice to have automation, but the human operator's skill still needs to be in the loop.

Agree with your last statement. And any significant new process change should require some sort of simulator training.
 
I lost interest in Boeing a while ago.
I flew the Airbus A380 across the pond. Now I avoid all Boeing aircraft ... so boring.
Also, I found out that Boeing uses Windows server software and iPhones. "I am OUT of here!!!"
JK
 
There is no human involvement in a fully autonomous car. Machines either work or fail on their own.

Humans are not trusted because they are not consistent or predictable. You never know how many drinks a driver has had, or whether he will pick up the phone when it rings in the car.
 
There is no human involvement in a fully autonomous car. Machines either work or fail on their own.

Humans are not trusted because they are not consistent or predictable. You never know how many drinks a driver has had, or whether he will pick up the phone when it rings in the car.

So what happens when a sensor starts to feed bad data to an autonomous car? Should the system include multiple layers of checks and balances plus redundancy, or should we just let the car crash?
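One common answer is redundancy with voting: compare readings from several sensors and discard outliers rather than trusting any single input. A rough sketch of a median voter (the spread threshold and fallback behavior are assumptions for illustration, not any vendor's design):

```python
from statistics import median

def vote(readings, max_spread=2.0):
    """Median-vote across redundant sensors: readings far from the
    median are treated as faulty and excluded; if fewer than two
    sensors agree, report failure instead of using bad data."""
    m = median(readings)
    good = [r for r in readings if abs(r - m) <= max_spread]
    if len(good) < 2:
        return None  # degraded: caller should fall back to a safe state
    return sum(good) / len(good)
```

With three angle-of-attack-style sensors reading `[10.0, 10.2, 45.0]`, the 45.0 outlier is voted out and the result is the average of the two agreeing sensors; with only two sensors that disagree wildly, the function refuses to answer at all, which is the "checks and balances" option rather than the "let it crash" option.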
 
...Only that one axis of the autopilot turned loose; the other axes remained in control. The aircraft began to crab, and the pilot was unaware the autopilot was still functioning and fighting him...

Funny enough, in that specific case, if the pilots had never interfered by pulling out of the nose dive so hard that they pointed the airplane into a nearly vertical ascent, the other autopilot subsystems would have stabilized the plane.

There are existing patents that try to address this human-handover problem, for example:
US Patent 10,078,331, "System and method for determining transfer of driving control authority of self-driving vehicle" (issued September 18, 2018) - Justia Patents Search