Welcome to Tesla Motors Club

Autonomous Car Progress

Enjoyed this. I wonder whether their demos are staged or whether they really have driverless 18-wheelers on the road. I assume the former. And I assume their maps are the HD variety; if so, there's likely not a decent long-haul route mapped yet.

They have autonomous trucks but with safety drivers. And yes, they use HD maps. They could map long routes. All they have to do is drive the route once to map it so I don't think it would be a big problem. For example, GM has already lidar mapped over 130,000 miles of highway.
 


Of course, the US alone has over 4 million miles of road, so 130k is just over 3%.

And the non-highway ones are obviously much harder to automate, but trucks would still need to be able to use them.

Still, a truck that can auto-drive the highway portion for hours would help with the maximum hours per day that human drivers are currently permitted.
 
Mobileye just uploaded a new video today with an update on their approach to scalable AVs:


The video details their vision-only system for hands-free L2 called SuperVision, their "true redundancy" system that combines camera vision with radar and lidar for L4/L5, their crowdsourced AV mapping, and their safety-based driving policy called RSS.
 
Jack Weast from Intel/Mobileye, who chairs the working group drafting the AV safety standard IEEE P2846, gave an update:


There are over 30 members of the working group helping to draft IEEE P2846.



They completed a major draft and expect a finished draft by July or August, which they will send to their members for ratification. Once that is done, regulators will be able to use IEEE P2846 to define specific numbers for AV safety.

One example of how IEEE P2846 will help: AVs have to maintain a safe following distance from the vehicle in front. But what is a safe following distance? Make it too small and the AV is more convenient to other drivers, but it could rear-end the vehicle in front if that vehicle brakes suddenly. Make it too big and the AV is much safer, but also more annoying to other drivers.

AVs have a mathematical equation for calculating following distance, but it depends on an assumption about the braking power of the vehicle in front. That braking power differs across vehicle types, and it can also vary with the driver: an attentive human driver can brake faster than a distracted one. So what braking power do you assume? Regulators will be able to use IEEE P2846 to set a specific number for that assumed braking power, which AV developers can then plug into their software. The AV will use that number to calculate a following distance that balances safety with convenience.
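To make the braking-power point concrete, here is a rough sketch of the minimum safe following distance from Mobileye's published RSS model. The parameter values below are illustrative defaults I picked, not numbers from IEEE P2846 or any regulator:

```python
# Sketch of the RSS minimum safe following distance (Mobileye's
# Responsibility-Sensitive Safety model). Parameter values are
# illustrative, not taken from any standard or regulation.

def rss_min_safe_distance(v_rear, v_front, rho=0.5,
                          a_accel_max=3.0, a_brake_min=4.0, a_brake_max=8.0):
    """Speeds in m/s, accelerations in m/s^2, rho = response time in s.

    a_brake_max is the braking power *assumed* for the lead vehicle --
    exactly the number a standard like IEEE P2846 could pin down.
    """
    # Worst case for the rear car: it accelerates during its response
    # time, then brakes at its minimum guaranteed braking power.
    v_after_response = v_rear + rho * a_accel_max
    d_rear = (v_rear * rho
              + 0.5 * a_accel_max * rho**2
              + v_after_response**2 / (2 * a_brake_min))
    # Worst case for the front car: it brakes at its assumed maximum power.
    d_front = v_front**2 / (2 * a_brake_max)
    return max(d_rear - d_front, 0.0)

# Both cars at highway speed (~30 m/s): the stronger the braking you
# assume for the lead vehicle, the larger the gap you must keep.
gap_strong_braker = rss_min_safe_distance(30, 30, a_brake_max=8.0)
gap_weak_braker = rss_min_safe_distance(30, 30, a_brake_max=5.0)
```

Note how the trade-off in the post falls directly out of `a_brake_max`: assume an attentive, hard-braking lead driver and the required gap grows; assume a gentle braker and it shrinks. A standard fixes that assumption so every developer computes the same gap.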
 
I'm curious how self-driving cars will handle the very common situation where the flow of traffic is 15 or 20 mph over the posted speed limit. The most frustrating thing about EAP for me is that (except on divided highways, of which we have one short stretch on Maui) autosteer will not operate faster than five mph over whatever it thinks the speed limit is. (It usually knows the correct posted limit, but not always.) On Pi'ilani highway the limit is 40 mph. Nobody drives slower than 50, and 55 is common. I don't think the cops ever ticket below 55, and maybe 60. Autosteer won't operate over 45. During busy times, driving 45 would create a serious disturbance in traffic flow. I generally don't use autosteer then. At light traffic times it's easy for cars to pass, and I use autosteer.

Many (most?) human drivers drive just a hair below the speed at which the local police will stop cars for speeding. If the limit is 65 but the police don't ticket below 70, 70 becomes the de-facto speed.

Will autonomous cars drive at the speed limit, creating enormous hostility from other drivers? Or will they break the speed limit when literally every other car on a crowded highway is doing so?
 
Intel playing the long game by defining driving standards.

In some ways, that list of members is interesting for who is not there.

Hard to imagine the input that some of those members will have on AV behaviour.

Yeah, I think Mobileye is being smart and thinking long term. They not only joined the IEEE P2846 working group, they also contributed their RSS model for use in the standard. So they are guaranteed to be in compliance with AV standards and future safety regulations, since those will be based on their safety model. This will help them deploy their ADAS and AVs sooner, since they will get approval sooner.

Personally, I think it was a mistake that Tesla did not join the group. By not joining, Tesla will have little to no say in the standards or regulations. If they had joined, they could have shaped the standards to be more favorable to them. The fact is that AV standards and regulations are coming; Tesla will have to comply whether they like them or not. So I think it makes more sense to shape things from the inside than to try to go it alone.
 
Will autonomous cars drive at the speed limit, creating enormous hostility from other drivers? Or will they break the speed limit when literally every other car on a crowded highway is doing so?

I think self-driving cars will need an advanced driving policy to understand when it is OK to drive faster than the posted speed limit. But this is also exactly the kind of situation that shows why we need AV performance standards. On one hand you want to follow the speed limit, but there are conditions where going faster than the posted limit is needed. I know Waymo currently makes all their driverless cars in Chandler follow the posted speed limit, focusing on safety first. Of course, that is city driving, not highway driving, so it is a different situation. But if we had standards that cleared up these questions, AV developers could make sure their AVs handle the situations you mention the right way.
 
By not joining, Tesla will have little to no say in the standards or regulations.
If they had joined, they could have shaped the standards to be more favorable to them. The fact is that AV standards and regulations are coming. Tesla will have to comply whether they like them or not.
I wonder what would happen if Tesla published raw data showing them (while not in compliance with "the standard") having a much greater safety record than any of the cartel members with their "in compliance" solutions.

Just an interesting thought experiment. Not all standards are there to help drive the safety of a product, some are there just to slow sh!t down when you have no other way of catching up.

Either way, actual safety record/data will determine the success of any FSD implementation, not a "standard" proposed by a cartel!
And that is how it should be.
 
Will autonomous cars drive at the speed limit, creating enormous hostility from other drivers? Or will they break the speed limit when literally every other car on a crowded highway is doing so?

I'll never understand these questions. There are laws, rules, regulations. Does anybody seriously want a car (or another robot-like device) to break the law? No speed limits, running red lights, and so on?

There is a simple solution: fine/ticket all speeders.
 
Not all standards are there to help drive the safety of a product, some are there just to slow sh!t down when you have no other way of catching up.

I don't think you understand what the standards are. IEEE P2846 is technology-neutral; it does not depend on any particular FSD approach. For example, it will not require any particular technology like lidar. So no, it is not about slowing things down or helping companies that are behind on FSD. IEEE P2846 will merely define the assumptions and parameters for safety. Regulators will likely use standards like IEEE P2846 to define the specific safety numbers that need to be reached. And regulations may require certain technologies, like camera-based driver monitoring for L2 ADAS, but that is different from the standards.

Yes, safety data will determine the success of FSD. But you need metrics and standards to tell you if your safety is good enough. How do you know if your safety data is "safe enough"? How do you know whether 1x, 2x, or 20x safer than humans is good enough unless regulations tell us what the goal is? And what metric are you using to measure safety: accidents per mile? Fatalities per mile? You need standards and regulations to know how AV safety will be measured.

For example, let's say that company A has safety data that shows their FSD is 10x safer than humans and they go to regulators and ask permission to deploy FSD with no driver supervision. Maybe regulators say yes. Another company comes to regulators and their FSD is 5x safer. Is that also good enough? You need regulations and standards so that regulators know what is good enough and so that they can be fair to everybody. You don't want regulators that have to "play it by ear" what they think is "good enough".
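A toy calculation shows why "Nx safer than humans" is slipperier than it sounds. All numbers below are made up for illustration, including the human baseline; the point is only that with few crash events, the headline ratio and a conservative estimate can differ a lot:

```python
import math

# Illustrative sketch (all numbers hypothetical): why "Nx safer than
# humans" needs an agreed metric. The metric here is crashes per
# million miles, with a crude normal-approximation bound on the count.

HUMAN_CRASHES_PER_MILLION_MILES = 4.0  # hypothetical baseline

def safety_ratio(crashes, miles):
    """Return (headline ratio vs the human baseline, conservative low ratio)."""
    rate = crashes / (miles / 1e6)
    # Upper bound on the AV crash count -> lower bound on the safety ratio.
    crashes_hi = crashes + 1.96 * math.sqrt(max(crashes, 1))
    rate_hi = crashes_hi / (miles / 1e6)
    return (HUMAN_CRASHES_PER_MILLION_MILES / rate,
            HUMAN_CRASHES_PER_MILLION_MILES / rate_hi)

# Two crashes in 5 million miles *looks* 10x safer than the baseline,
# but the conservative bound is far less impressive -- which is why a
# regulator needs a standard saying which number counts.
ratio, ratio_low = safety_ratio(crashes=2, miles=5_000_000)
```

With the made-up inputs above, the headline ratio is 10x but the conservative estimate is closer to 4x. A standard would pin down both the metric and how to treat that uncertainty, so "company A at 10x" and "company B at 5x" are judged on the same footing.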

To answer your question more directly: If Tesla had rigorous data that showed FSD with no driver supervision was significantly safer than humans, I think regulators would approve FSD with no driver supervision for deployment. But Tesla has certainly not reached that point yet.
 
I'm curious how self-driving cars will handle the very common situation where the flow of traffic is 15 or 20 mph over the posted speed limit.


Currently, the few US states that pre-approved self-driving cars (before any existed) all explicitly require them to follow all traffic rules/laws.

Which would include not exceeding the posted speed limit.

Yes, that's going to annoy a lot of folks.
 
Mobileye just uploaded a new video today with an update on their approach to scalable AVs:

They also go more in-depth on their Vector Field (aka Bird's Eye View) network, now called the "Multi-Camera Environment Modeling Network," which they first publicly revealed in November 2019.

I tried to tell @masterxel @willow_hiller @mspisars who is now @mikes_fsd @powertoold @heltok but they won't listen.

They could claim to have "solved" perception once they could recreate the scene from the outputs of the NNs, similar to Tesla's Bird's Eye View. It seems to me like Mobileye could not determine depth and distance very well with cameras, otherwise he would not be making idiotic statements like "where to stop for intersections": if you can see the intersection and the stop line, what is your problem with stopping in time?




 
If it is a such a low bar, why hasn't Tesla's solved it yet? Why does Tesla still require driver supervision on such an easy feature?

Mobileye does not release traffic light control with driver supervision because they are past it. Mobileye is focused on real FSD!
Don't switch the subject princess.
It is such a low bar in the context of your comment, and yet no one has matched it in the end-user personal vehicle space.
You mean the traffic light control that was released over a year ago and still requires driver confirmation on green lights?
 
I am not sure whether the competency for "sharp turns" has been covered or not: yesterday, the subject was tweeted:


Do other companies still have this problem, from Tesla's first AP1 in 2014 until now (its 7th year of experience dealing with the software and hardware)?

I assume vision alone could take care of this issue and other sensors are not needed.

I assume the way to solve the "sharp turns" issue is to recognize that a sharp turn is coming up and slow down to the point that the road surface and the rubber tires can maintain friction without skidding out of control in any weather: dry, wet, or icy.
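The physics behind that is straightforward: a car holds a flat curve as long as the lateral acceleration v²/r stays under the grip limit μg. Here is a minimal sketch; the friction coefficients are rough textbook figures I chose for illustration, not measured data:

```python
import math

# Sketch of the curve-speed physics described above: the fastest
# no-slip speed through a flat curve is where lateral acceleration
# v^2 / r equals the grip limit mu * g. Friction values are rough
# textbook figures, not measured data.

G = 9.81  # gravitational acceleration, m/s^2

FRICTION = {"dry": 0.9, "wet": 0.5, "icy": 0.1}  # rough tire-road coefficients

def max_curve_speed_mph(radius_m, surface):
    """Max no-slip speed through a flat curve of the given radius."""
    v_ms = math.sqrt(FRICTION[surface] * G * radius_m)
    return v_ms * 2.23694  # m/s -> mph

# A 100 m radius curve: comfortable when dry, treacherous when icy.
for surface in FRICTION:
    print(surface, round(max_curve_speed_mph(100, surface), 1))
```

Under these assumptions, the same 100 m curve that allows roughly highway speed on dry pavement permits only about a third of that on ice, which is why a turn-speed policy has to account for surface conditions, not just road geometry.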

Does that mean it's Tesla's software issue?

Does it mean Tesla should hire people from other companies who have solved the issue of sharp turns?
 