
Ghost CEO John Hayes advocates imitation learning for self-driving cars

John Hayes, the CEO of the self-driving car startup Ghost, is a big believer in imitation learning. Here's a key excerpt from his recent blog post:

“The perfect driver starts with people. Our roads demand predictable and familiar human behavior from every participant to ensure safety for every car and driver on the road.

Recent breakthroughs in imitation learning have now made this possible. It starts with observation: collecting all of the macro and micro behaviors that make up human driving. We can then build a model that imitates those behaviors in software, creating a driver that behaves like a real person. Once we successfully imitate baseline human behavior, we can then take the next step to improve upon human driving. We first eliminate all the passive human errors, like non-observation (e.g. texting, falling asleep), and then identify and eliminate the active human errors that cause accidents or close calls.

Self-driving is a hard problem, but we have been making it harder by ignoring the most obvious clue. Human drivers have been giving us the answer key to safe self-driving for more than 100 years. Imitating their behavior is the first step on the path to perfection.”
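For anyone unfamiliar with the technique, imitation learning in its simplest form (behavioral cloning) is just supervised learning from logged human behavior: fit a policy that maps observed driving states to the actions the human took. Here's a minimal illustrative sketch with made-up data and a linear policy — not Ghost's actual system, just the core idea:

```python
# Behavioral cloning sketch (illustrative only; data and features are invented).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical logged human driving data: state features -> actions.
states = rng.normal(size=(1000, 8))   # e.g. lane offset, speed, headway, ...
true_w = rng.normal(size=(8, 2))
actions = states @ true_w             # e.g. steering angle, acceleration

# "Imitate" the human mapping: fit a linear policy by least squares.
w_hat, *_ = np.linalg.lstsq(states, actions, rcond=None)

# The learned policy now reproduces the demonstrated behavior.
pred = states @ w_hat
mse = float(np.mean((pred - actions) ** 2))
```

Real systems replace the linear model with a deep network and the toy features with camera/sensor inputs, but the training objective — minimize the gap between the policy's actions and the human's — is the same.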
Here's part of a tweet thread he posted in May 2019:

“The next big breakthrough in self-driving? Imitating people.

People are great drivers. In fact, they are the ONLY successful drivers on today’s roads. No computer comes close.

Driving is proving to be more like basketball than chess. When you try to express driving in rules, you find the rules are infinite & often unexplainable. This is why timelines for rules-based driving keep getting pushed further into the future.

Most of the rules of the road are unwritten: We act in ways that are predictable to other drivers to create the equilibrium we observe on the road today. Ignoring the unwritten rules throws the entire system into chaos.

The few self-driving cars already on the road are getting rear-ended a lot. Why? They're stopping unpredictably. Add 1000's of self-driving cars to the road (like companies are lobbying for) & you could be rear-ending something while reading this tweet.

Making self-driving cars that drive like people isn’t a nice-to-have, it’s a necessity. It’s not about “taste” or “fine tuning.” It is the only way to keep the complex system on our roadways from devolving into chaos.”
Refresher on Tesla imitation learning:


Waymo's head of research, Drago Anguelov, also gave a talk for Lex Fridman's MIT class where he discussed imitation learning (starting at 26:43):


Aurora also has a talk on SlidesLive where they discuss imitation learning (starting at 8:15 or slide 7). Unfortunately, I can't embed it here.

Comma AI President George Hotz also discussed imitation learning for highway lane changes in this interview:


Also relevant: DeepMind recently announced it reached the 84th percentile on the StarCraft II ranked ladder using imitation learning alone. Using reinforcement learning alone, it was in the 1st percentile (worse than over 99.5% of human players). Using imitation learning and reinforcement learning, it was in the 99th percentile (better than over 99.8% of human players).

Using imitation learning — combined with an explicit, hand-coded planner — to bootstrap real world reinforcement learning is an intriguing approach for self-driving cars.
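To make that bootstrapping idea concrete, here's a toy sketch: behavioral cloning on slightly suboptimal human demonstrations gives a starting policy, and a simple hill-climbing update (a stand-in for real reinforcement learning) then improves it against a reward signal. The 1-D dynamics and numbers are invented for illustration:

```python
# Toy IL-bootstrapped RL sketch (assumed pipeline, not any company's code).
import numpy as np

rng = np.random.default_rng(1)

# 1-D task: state is position, action is velocity; reward favors landing at 0.
def reward(state, action):
    return -(state + action) ** 2   # optimal policy: action = -1.0 * state

# Human demos are slightly suboptimal: action = -0.8 * state.
demo_states = rng.uniform(-1, 1, size=200)
demo_actions = -0.8 * demo_states

# Step 1 (imitation): fit a linear policy a = k * s to the demonstrations.
k = float(np.sum(demo_states * demo_actions) / np.sum(demo_states ** 2))

# Step 2 (RL fine-tuning): nudge k toward higher average reward.
for _ in range(100):
    s = rng.uniform(-1, 1, size=64)
    for cand in (k - 0.01, k + 0.01):
        if np.mean(reward(s, cand * s)) > np.mean(reward(s, k * s)):
            k = cand

# k moves from the imitated -0.8 toward the optimal -1.0.
```

The point of the imitation step is that RL starts from a policy that already behaves sensibly, rather than exploring from scratch — which, as the StarCraft II result above suggests, can be the difference between the 1st percentile and the 99th.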
 