Tesla has been coming in for some criticism of late, with particular regard to its Autopilot system. In at least three cases this year, a Tesla model has plowed into a stopped emergency vehicle.
The problem stems largely from the fact that the engineers of any self-driving or autonomous vehicle have to program the system to concentrate on moving objects rather than stationary ones; otherwise, cars on Autopilot would hit their brakes every time they came across a large road sign.
To be fair to Tesla, this problem affects a range of manufacturers, and Tesla does state very clearly that “Autopilot shouldn’t be used in areas with intersections, stop signs, red lights or suddenly changing traffic patterns.”
Learning from history
In 1943, the British Royal Air Force had a problem. Young RAF pilots, who were supposed to monitor the radar for signs of German submarines, could not reliably do their job. After some time in the plane, they would begin to miss the very signals they’d been trained to spot. The longer they spent looking at the screen, the less reliable they became.
After lengthy tests and research, Mackworth, a British psychologist, concluded that it took less than half an hour for the pilots’ attention to wander. The phenomenon has become known as “vigilance decrement,” and it causes difficulties wherever people are asked to spend long, unexciting stretches of time looking out for signals that are easy to detect but impossible to anticipate. Security guards commonly suffer from it, as do TSA agents and lifeguards, and drivers who use autopilot systems have recently joined the club.
The confusion around Tesla’s Autopilot
Tesla’s Autopilot system is meant to assist drivers, not substitute for them. Given the recent accidents, and given that the technology is available on both the new Tesla Model S and Model X, it’s apparent that some sort of training for new buyers is necessary. Musk seems happy to put his reputation, brand and money behind the Autopilot system, believing it’s more than up to the job. Perhaps he’s right, and instead of investing more dollars in hardware like LiDAR (which he has called “expensive, ugly and unnecessary”), he’d be better off spending a fraction of that money on driver education and training. Maybe customers should only gain access to the Autopilot feature once they’ve passed some sort of training course sanctioned by Tesla?
It’s all in the hands of the driver
As the technology is still in beta testing, drivers must accept responsibility for activating Autopilot in their cars. But that doesn’t seem to be enough. In the long run, the best and easiest solution for Tesla would be to undertake an admittedly time-consuming fix: an Autopilot Guide.
Apart from such a manual, every driver should have to sit a mandatory test, whether online or in-store, to ensure safety on the road.
A visit to a Tesla store for 30 minutes of Autopilot dos and don’ts would be sufficient. New owners would answer a few questions on the Autopilot rules from the manual as a requirement for Autopilot activation. As for people who already own a Tesla model with Autopilot, the manufacturer could temporarily switch the function off across its fleet and require owners to complete the training before they “qualify” to have the tech turned back on.
It would be easy to implement, free for the consumer and would give Musk yet another world-beating marketing opportunity.
Giles Kirkland is a passionate car expert and automotive writer based in London.