
Waymo’s “commercial” ride-hailing service is... not yet what I hoped

We have even less visibility into Tesla's full-autonomy test vehicles than Waymo's. Waymo's are easy to spot, so people watch and film them, but Tesla's just look like regular Teslas, so they are effectively stealth.

To me, the best way to judge which company will emerge as the leader in autonomy is to look at the fundamental inputs that dictate neural network performance. This is why I personally focus so much on the quantity of training data that each company is able to collect.
 
Based on Tesla's 2017 report to the California DMV, they didn't do any FSD testing on public roads in California. So, we have evidence Tesla didn't do much there in 2017. Curious to see what comes out in the 2018 report.

In Tesla’s letter to the California DMV, they said they do testing on public roads outside of California. Tesla also has private test tracks, and perhaps also has or uses other private test facilities in California.

Tesla also does ADAS testing in many places around the world including California. This could include testing much more advanced features than anything in Enhanced Autopilot — stuff like automatic stopping for red lights and stop signs, automatic right turns, etc. Plus we heard about Tesla offering some goodies to employees if they agreed to test advanced features.

So, there is a lot of testing going on that isn’t counted under the California DMV miles.
 
In Tesla’s letter to the California DMV, they said they do testing on public roads outside of California. Tesla also has private test tracks, and perhaps also has or uses other private test facilities in California.

There may be a discrepancy in your first post: dismissing California DMV reports and Waymo’s testing results as somewhat meaningless while basically taking Tesla at their word and extrapolating optimistically from that. Tesla’s 2016 disengagements were much worse, which can’t be good even if the number is not the best guide. A pessimistic interpretation would be that Tesla has been avoiding autonomous driving in its home state in 2017 after failing there pretty badly in 2016.

Be that as it may, California DMV reports do serve one purpose even if we forget about the disengagement definitions: they tell us the number of autonomous testing miles driven there. For anyone interested in actual numbers and volumes, this should be significant data, especially for California companies. How many California DMV autonomous miles does Waymo have? How many does Tesla have?

I stated my view on Waymo’s leadership based on actual consumer rides. That is a simple metric today. But just for harmless fun :) here is one forward looking theory too:

I theorize Waymo’s technology may be far more generalized than is often assumed. Gains they make in geofenced areas would be transferable elsewhere, possibly at an exponential rate. Testing already takes place outside those areas, as is known. If it takes mapping and data collection at scale to support this expansion, what better company than Google to achieve that globally? Such mapping may already be ongoing. If anyone can data-mine the world, it is Google.
 
Only a small minority of Waymo’s total miles — 17.5% — were in California in 2017: 350,000 out of 2 million total. Prior to 2018, Uber didn’t log any autonomous miles with the California DMV. These are both companies headquartered in the Bay Area. Both have done a lot of testing in Arizona. Maybe for regulatory reasons, companies just prefer to test in Arizona rather than California. Tesla has disclosed it tests in autonomous mode outside of California, so Arizona is a good guess. Tesla has also reportedly sought to do some testing in Nevada.

Tesla’s ADAS testing in California can also have somewhat of the same result as autonomous testing, even though it doesn’t require Tesla to log autonomous miles with the DMV. ADAS testers can test features that, one by one, incrementally add up to full autonomy. Things like automatic stopping for stop signs, or making right turns automatically.

This is just to point out how California DMV miles are not a good gauge of how much overall testing a company is doing. Even in Waymo’s case — a company famous for testing in Mountain View — the vast majority of test miles are not logged in California. If you only look at California DMV miles, you are missing most of the testing that is going on.
 
I theorize Waymo’s technology may be far more generalized than is often assumed. Gains they make in geofenced areas would be transferable elsewhere, possibly at an exponential rate.

Are you sure? It's not like Google would want to drive cars around and map areas. That's just crazy.

Tesla is trying to take many shortcuts forced on them by time and lack of resources. Tesla can't even read signs, much less solve the localization problem.
 
If you use visual HD maps (i.e. camera-based HD maps, rather than lidar maps), it is pretty straightforward for companies like Tesla and Mobileye to quickly map most of the roadways in their major markets. Your HW2 car has the hardware to do HD mapping just by driving the same stretch of roadway a few times.
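
As a toy illustration (this is a sketch of the general idea, not Tesla's or Mobileye's actual pipeline; the coordinates are made up): each drive yields noisy landmark positions, and averaging across a few passes tightens them into a usable map layer.

```python
# Toy sketch only: not Tesla's (or anyone's) actual mapping pipeline.
# Each pass over the same road yields noisy (x, y) landmark estimates;
# averaging across passes shrinks the noise roughly by sqrt(N).
import statistics

# Three hypothetical passes, each observing the same two lane-line points:
passes = [
    [(100.2, 50.1), (110.5, 50.3)],
    [(99.8, 49.9), (110.1, 49.8)],
    [(100.1, 50.0), (110.3, 50.2)],
]

map_points = [
    (statistics.mean(p[i][0] for p in passes),
     statistics.mean(p[i][1] for p in passes))
    for i in range(len(passes[0]))
]
print(map_points)   # averaged estimates, tighter than any single pass
```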

I think much more important than mapping is training data: the data you use to increase neural network performance on perception tasks like object recognition, semantic segmentation (e.g. drivable free space), and localization, and perhaps also on action tasks like path planning and vehicle control/actuation. If a company has 10x or 100x more training data than competitors, I intuitively don’t feel it’s likely that a competitor can overcome that training data advantage with better neural network architecture, better HD mapping, or better sensors. However, that’s just my intuition — intuition is unreliable even among experts, I’m not an expert, and I could be wrong.

Waymo still has 600 vehicles on the road, driving about 1 million miles per month, with about 11 million miles driven cumulatively so far. It’s not clear how many test vehicles or test miles Tesla has; perhaps only a small fraction of that. But Tesla has its test miles plus production fleet miles, whereas Waymo only has its test miles.

Some back of the envelope math:

2,000 Model S would cost Tesla $150 million to produce ($100,000 ASP − 25% gross margin = $75,000 per car)

Paying 2,000 safety drivers would cost about $47,000 per year per driver, or $94 million/year total ($20/hour × 45 hours per week × 52 weeks = $46,800)

That’s a total cost of $244 million in the first year, and then $94 million in each subsequent year (assuming no cars need replacing).

Tesla spent $834 million on R&D in 2016, $1.38 billion in 2017, and $1.1 billion so far in 2018 (with one quarter to go). The salaries of 2,000 safety drivers would be about 7% of Tesla’s R&D budget in 2017 and 2018. If the one-time cost of producing the cars were spread over those two years, that would bump up the total cost to 12% of the R&D budget.

Since Waymo’s 600 vehicles drive about 1 million miles per month, 2,000 vehicles (3.3x more) should drive about 3.3 million miles per month. Let’s start counting in January 2019. By December 2019, Waymo would have about 24 million cumulative miles; Tesla would have about 40 million.
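
Or, as a few lines of Python (all inputs are the assumptions above, not actual Tesla figures):

```python
# The same back-of-the-envelope arithmetic in Python. All inputs are
# the assumptions above, not actual Tesla figures.
CARS = 2_000
COST_PER_CAR = 100_000 * (1 - 0.25)     # $100k ASP minus 25% gross margin
DRIVER_COST = 20 * 45 * 52              # $20/hr * 45 hr/wk * 52 wk = $46,800

fleet_cost = CARS * COST_PER_CAR        # one-time: $150M
drivers_per_year = CARS * DRIVER_COST   # recurring: ~$94M/yr
first_year = fleet_cost + drivers_per_year

MILES_PER_CAR_MONTH = 1_000_000 / 600   # scale Waymo's 600-car, 1M mi/mo rate
tesla_monthly = CARS * MILES_PER_CAR_MONTH
tesla_by_dec_2019 = tesla_monthly * 12  # ~40M miles in calendar 2019

print(f"Fleet ${fleet_cost/1e6:.0f}M, drivers ${drivers_per_year/1e6:.0f}M/yr, "
      f"first year ${first_year/1e6:.0f}M")
print(f"{tesla_monthly/1e6:.1f}M mi/month -> {tesla_by_dec_2019/1e6:.0f}M mi by Dec 2019")
```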

Now, there is no direct evidence that Tesla has an internal testing program of this size. Or that it even plans to. My point here is just that we don’t know the scale of Tesla’s internal testing operation. The only reason such a large program seems unlikely to me is that you would think with that many employees, information would have leaked to the press by now.

However, it’s also conceivable that the program could be geographically distributed enough that only a handful or perhaps a few dozen testers operate in any one city. In that case, the total number of testers would only be known to a small number of high-level employees managing the program, who are less likely to leak info. There has been at least one leak hinting at some kind of testing going on, but that’s all we know.

Unlike other companies, Tesla has the ability to hide its testing because its cars look just like regular cars. Since it can produce its own cars, acquiring test vehicles in secret is no problem. There’s also an incentive for Tesla to be secretive about its testing, in order to prevent an arms race with Waymo and others. If I were Elon and I wanted to hedge my bets, I would try to secretly log more than Waymo’s 11 million autonomous miles.

Right now, Tesla has ADAS test operator positions open in 14 locations — all posted in the last few weeks — perhaps representing more than 1 position per location. There are also test engineer positions open in a few locations. By comparison, Waymo tests in 25 cities.

The very nature of a secret program is that you don’t know the extent of it. I wouldn’t be surprised if Tesla only had 50 test vehicles, but it could also have 500 and we wouldn’t know. Unless the people managing the global test program leaked info, there would be no way to know the total number.
 
Only a small minority of Waymo’s total miles — 17.5% — were in California in 2017: 350,000 out of 2 million total.

Waymo has been testing in California for a decade, so their cumulative total is much higher. For Tesla we only have the 2016 number from California so far. Does anyone have the total miles to compare?

The problem with comparing out-of-California testing is that we have few if any public data sources, which I believe was your whole point in this thread: to separate the PR from the progress.

That Waymo also tests elsewhere and releases testing mileage from other areas is to their credit, not a counterargument, in my opinion.

Prior to 2018, Uber didn’t log any autonomous miles with the California DMV.

A pessimist would say Uber is a counterexample: a company with questionable progress that has shied away from California testing because of its more rigorous requirements. :)

This is just to point out how California DMV miles are not a good gauge of how much overall testing a company is doing. Even in Waymo’s case — a company famous for testing in Mountain View — the vast majority of test miles are not logged in California. If you only look at California DMV miles, you are missing most of the testing that is going on.

I am not suggesting looking at them only. I am answering your call for scrutinizing public information, and noting that other companies seem to have even less public data than Waymo, and that the California data is useful even if we ignore the disengagement numbers, because it still gives an on-the-record total mileage. That is much more data than we have from many other sources.

Saying Waymo’s California data is meaningless PR while optimistically glorifying other companies based on non-existent or completely speculative data would not be fair, so I just wanted to make sure we steer clear of such implications. :)
 
Are you sure? It's not like Google would want to drive cars around and map areas. That's just crazy.

Tesla is trying to take many shortcuts forced on them by time and lack of resources. Tesla can't even read signs, much less solve the localization problem.

A misunderstanding. My forward-looking theory was that Waymo’s geofence is mostly about safety and not really about ability. Once things are worked out in the geofenced pilot areas, the theory goes, Waymo can expand elsewhere quickly, hence the comparison of “Phoenix Waymo” and “global Tesla” would not be accurate. I would not be surprised if a Waymo is already today more “generalized” than any Tesla.

The secondary point was on mapping: whatever they may need — and everyone needs maps on some level — Google is definitely a formidable parent to provide it, as they already drive and walk the world to map it constantly. That I view as an asset for Waymo, not a hindrance.
 
The reason why Tesla tests its cars in autonomous mode (as defined by the California DMV) on public roads entirely outside of California is probably the same reason why Waymo tests mostly outside of California, and Uber tested entirely outside of California.

I would guess it's more expensive to test in California because it takes a lot of extra work to comply with the DMV's rules. Plus it tips your hand to your competitors.

I think Tesla can circumvent the rules in California by doing ADAS testing instead of autonomous testing. As long as you have your hands on the wheel, I think it legally counts as ADAS.
 
I think much more important than mapping is training data: the data you use to increase neural network performance on perception tasks like object recognition, semantic segmentation (e.g. drivable free space), and localization,

I think for these quoted ones there is an argument that both Waymo and Mobileye have already solved vision and localization, so Tesla’s advantage in those areas would not exist, if one buys that theory. NNs and the fleet might help Tesla catch up, at best, if so.

Driving policy and the use of NNs in that area is an open problem for sure, with various levels of progress and approach. This is where different approaches can yield different dividends, I agree.

Returning to your first post’s theme of separating the PR from the progress: we have very little to go by in many cases. With Waymo we have more to go by than with most. We know how long and how much they have been driving autonomously, and that they are taking on customers in one pilot region.

What do we know about many of the others when it comes to full autonomy? I would say much less.

The reason why Tesla tests its cars in autonomous mode (as defined by the California DMV) on public roads entirely outside of California is probably the same reason why Waymo tests mostly outside of California

Waymo has been testing in California for a long time, the majority of their time and possibly the majority of their miles. Their share of testing there may not be limited at all, quite the contrary. A logical reason for Waymo to also test elsewhere is that they are making geographical progress. I would not compare Waymo’s and Tesla’s reasoning on this detail, as the situations are so different.
 
A misunderstanding. My forward-looking theory was that Waymo’s geofence is mostly about safety and not really about ability. Once things are worked out in the geofenced pilot areas, the theory goes, Waymo can expand elsewhere quickly, hence the comparison of “Phoenix Waymo” and “global Tesla” would not be accurate. I would not be surprised if a Waymo is already today more “generalized” than any Tesla.

The secondary point was on mapping: whatever they may need — and everyone needs maps on some level — Google is definitely a formidable parent to provide it, as they already drive and walk the world to map it constantly. That I view as an asset for Waymo, not a hindrance.

I agree. Consider Waymo doing things like navigating roundabouts and reading the body language of bicycle riders. Meanwhile, Tesla has not implemented Summon, a relatively simple feature, as promised two years ago. These aspects of development are the best indicators of progress toward "generalized" FSD: how the vehicle handles expected complexity and ambiguity.

I also think Google is just fine with doing detailed mapping for FSD. They probably have many uses for a highly detailed digital representation of the physical world. The autonomous cars themselves will be used to keep the 3D map updated, so the ongoing cost is minimal.
 
@electracity Naming second place in this race definitely seems harder than naming first place at this time.

Right now, Tesla has ADAS test operator positions open in 14 locations — all posted in the last few weeks — perhaps representing more than 1 position per location. There are also test engineer positions open in a few locations. By comparison, Waymo tests in 25 cities.

That is an optimistic interpretation. :) All we know is Tesla is hiring for ADAS. Nothing in the listed job details mentions full self-driving or anything beyond testing updates to the current Autopilot (a driver’s aid).

We can’t really read one party optimistically and the other pessimistically. If we are being optimistic, then Waymo’s disengagement doubts would also be lifted. But in reality, all we know is Waymo has 10 million autonomous miles and one region of third-party customers riding in their autonomous cars. From Tesla we know of, I guess, 550 autonomous miles in California and a worse disengagement rate than Waymo, right? That is all we really know. The rest is PR and speculation.

We can’t assume everyone stands still while assuming Tesla grows exponentially, either. Everyone can grow, but it is unknown who will and how. Waymo, for example, plans a fleet of tens of thousands of autonomous cars. Also, if Tesla suddenly grows from 550 miles in 2016 to, say, 25 million testing miles in 2019, their development work will still take time unless we assume end-to-end NNs for driving, so calendar time matters after testing too.
 
I agree. Consider Waymo doing things like navigating roundabouts and reading the body language of bicycle riders. Meanwhile, Tesla has not implemented Summon, a relatively simple feature, as promised two years ago. These aspects of development are the best indicators of progress toward "generalized" FSD: how the vehicle handles expected complexity and ambiguity.

We should be careful about comparing full-autonomy test systems to production ADAS systems. They are really different. This is the test system Waymo had in early 2012:


It’s been almost seven years, and Waymo is just now about to bring a system into limited production in a small geofenced area for a few dozen or a few hundred paying non-employee customers who aren’t under NDA. The path from testing to production is long and winding. It can take a long time before something that works in a demo works in a practical application. So, we should be cautious about comparing demos to production systems. If Waymo had an ADAS system in production cars today, we could compare that to Autopilot.

I think for these quoted ones there is an argument that both Waymo and Mobileye have already solved vision

I’m really skeptical that this is true for Waymo. With Mobileye, Amnon Shashua has openly said certain vision tasks like semantic segmentation remain unsolved. Anything is possible, but this is a huge claim for Waymo.

Important to note that Tesla has ~300,000 HW2 cars that have driven ~600 million miles on Enhanced Autopilot and ~2.4 billion miles total. The cars seem to upload something on the order of 100 MB of data per day on average. 100 MB is equivalent to about one single-frame snapshot from all eight cameras. People drive an average of 37 miles per day, so 2.4 billion miles is about 64.8 million days of driving. That means it isn’t a stretch to imagine Tesla has a library of 50 million+ snapshots, collected over a large, diverse sample of driving (218x larger than Waymo’s sample). Or 55,000 thirty-second video clips at 30 fps. Or a combination of both.

That’s why my hunch is that Tesla will emerge better at camera-based vision than anyone else — if they properly leverage their advantage. They have the capacity to build a much larger and more diverse dataset of still images and video than anyone else.
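
Just to show my work, here's that arithmetic in a few lines of Python (inputs are my rough estimates from above, not disclosed numbers):

```python
# Sanity check of the arithmetic above. Inputs are my rough estimates,
# not anything Tesla has disclosed.
FLEET_MILES = 2.4e9        # ~cumulative HW2 fleet miles
MILES_PER_DAY = 37         # average miles a person drives per day
WAYMO_MILES = 11e6         # Waymo's cumulative autonomous miles
MB_PER_CAR_DAY = 100       # observed order-of-magnitude daily upload

car_days = FLEET_MILES / MILES_PER_DAY        # ~64.8 million days of driving
sample_ratio = FLEET_MILES / WAYMO_MILES      # ~218x Waymo's sample

# If each car-day yields ~100 MB, i.e. roughly one 8-camera snapshot,
# the fleet could have uploaded on the order of one snapshot per car-day.
print(f"{car_days/1e6:.1f}M car-days of driving")
print(f"{sample_ratio:.0f}x Waymo's mileage sample")
print(f"~{car_days * MB_PER_CAR_DAY / 1e9:.1f} PB upload capacity at 100 MB/day")
```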

Waymo has been testing in California for a long time, the majority of their time and possibly the majority of their miles. Their share of testing there may not be limited at all, quite the contrary. A logical reason for Waymo to also test elsewhere is that they are making geographical progress. I would not compare Waymo’s and Tesla’s reasoning on this detail, as the situations are so different.

I think the large majority of total, cumulative miles are outside California. As of 2015, Waymo had 3 million miles (if I recall correctly). Today, it has 11 million, and for the past few years I believe it has had a lot more cars outside of California than in.

It is likely less expensive to test in a state like Arizona, where the regulations are more permissive. In California, I would imagine you have to spend money paying your compliance team, legal team, safety drivers, engineers, etc. to remain compliant with the DMV’s reporting requirements. Plus it reveals information to your competitors that you might prefer to keep secret.

That is an optimistic interpretation. :) All we know is Tesla is hiring for ADAS. Nothing in the listed job details mentions full self-driving or anything beyond testing updates to the current Autopilot (a driver’s aid).

As I said, it’s just speculation. If something is a secret, then you don’t know about it. I was just pointing to Tesla’s ability to keep the scale of its testing a secret, and the reasons why it would want to do so. As I said, we have no idea if Tesla has 50 test vehicles or 500. I am not saying that Tesla actually does have a testing operation of this scale, I’m just saying that it could and we wouldn’t know.

We can’t assume everyone stands still while assuming Tesla grows exponentially, either. Everyone can grow, but it is unknown who will and how.

The key difference is that 1,000 Waymos driving around will get noticed. 1,000 Teslas will not. Again, I’m not saying this actually is happening, I’m just noting that Tesla has greater stealth than Waymo, and stealth is a competitive asset.
 
Or 55,000 thirty-second video clips at 30 fps.

I just realized this doesn’t make sense based on what I said. We would need to see the occasional chunky upload (1.9 GB for a 30 fps, 30-second video clip from one camera) rather than a steady trickle of ~50 MB to ~300 MB per day. We need more information about how much data HW2 Teslas are uploading each day over a long timespan.
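
For what it's worth, here's a rough order-of-magnitude check on the clip size, under assumed camera parameters (the actual sensor format and any compression aren't public):

```python
# Order-of-magnitude check on clip size vs. the daily trickle.
# Resolution and bit depth are assumptions (HW2 cameras are roughly
# 1.2 MP; the exact sensor format and compression aren't public).
WIDTH, HEIGHT = 1280, 960    # assumed ~1.2 MP frame
BYTES_PER_PIXEL = 1.5        # assumed 12-bit raw, packed
FPS, SECONDS = 30, 30

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
clip_gb = frame_bytes * FPS * SECONDS / 1e9
print(f"Uncompressed 30 s one-camera clip: ~{clip_gb:.1f} GB")
# ~1.7 GB: same ballpark as the 1.9 GB figure above. Either way, one
# clip dwarfs a ~50-300 MB daily trickle, which is the point.
```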
 
We should be careful about comparing full-autonomy test systems to production ADAS systems. They are really different. This is the test system Waymo had in early 2012:

It’s been almost seven years, and Waymo is just now about to bring a system into limited production in a small geofenced area for a few dozen or a few hundred paying non-employee customers who aren’t under NDA. The path from testing to production is long and winding. It can take a long time before something that works in a demo works in a practical application. So, we should be cautious about comparing demos to production systems. If Waymo had an ADAS system in production cars today, we could compare that to Autopilot.

I don't quite understand this reasoning. People, I'm guessing he is basically saying that AP is better than what Google had in 2012. Mind you, Google could go thousands of miles on the highway (including lane changes) without disengagement back in 2012. "Production system" doesn't mean much when it comes to ADAS. As many auto companies have proven, they will throw any garbage out there just to say they have something, because the driver is always responsible.

AP2 was beyond horrible when it was initially released. Today NOA tries to kill you on average every couple of miles. How in the world is that better than a system that can go thousands of miles without attempting to kill you?


I’m really skeptical that this is true for Waymo. With Mobileye, Amnon Shashua has openly said certain vision tasks like semantic segmentation remain unsolved. Anything is possible, but this is a huge claim for Waymo.

I'm sorry, but this is simply not true. Amon has consistently laid out that sensing comprises three levels.

1. Object detection
2. Semantic Free Space
3. Drive-able Paths

"The first level is a solved problem. The second level is already in production. Tesla Autopilot 1 had it. The third one is very challenging"

For #1 He says "detecting objects, vehicles, pedestrians, traffic sign, traffic lights, etc. image to bounding box. This is level 1, this is the easiest problem. this problem has been SOLVED! There is no need to talk about this anymore."

NOTE that Tesla still struggles with #1 and doesn't yet detect most things. (source @verygreen)

For #2 He says "Understanding the free space, where is the curbs, barriers, guard rail, the semantic lane markings, where to drive, etc. you need more sophistication of algorithm"

NOTE that Tesla's current semantic free space only does what EyeQ3 did, which is the primitive SFS Mobileye had. (source @verygreen)

Here he noted that EyeQ3 already had the first generation of their semantic free space, but it only detected where to drive, with an object delimiter (green carpet and red line), not the semantic meaning. Their next-gen network in EyeQ4 actually has this "more sophistication of algorithm" and has semantic meaning for the path delimiters: it can tell what is a barrier, guardrail, curb, flat ground, raised ground, lane split, lane merge, etc.

EyeQ4 had over 15 different categories for its SFS, based on the last statement in 2016 I believe, and they might have added more before its release.

[Images: EyeQ4 semantic free space visualizations]


"The third level is the killer. The drive-able path, knowing which path takes us where. this path leads into a highway or highway exit. this is really still a open problem."

He calls the third level an "open problem" and called it "very challenging" to set up his entire presentation.
He says, "The question is why we need to solve this problem..so", and then he goes into the point of his presentation. He then describes the two methods used by companies today: one sets the drivable path manually; the other uses cameras on every car and crowd-sources it.

Here is Waymo's manually constructed drivable path:

[Image: Waymo's manually constructed drivable path]



Here's from Mobileye's website:
  • Freespace: determining the drivable area and its delimiters
  • Driving Paths: the geometry of the routes within the drivable area
  • Mobileye’s REM™ localization map for identification of safe drivable paths and knowledge of intersections and other traffic signage or instructions.

  • RoadBook™ provides a highly-accurate, rapidly-refreshed representation of the static driving environment. This includes road geometry (lanes, drivable path, paths through complex intersections), static scene semantics (traffic signs, the relevance of traffic lights to particular lanes, on-road markings), and speed information (i.e. how should average vehicle speed adjust for curves, highway ramps, etc).
If Mobileye hadn't already solved strong perception, there wouldn't be REM today. But they have, and that's why REM is in thousands of cars running EyeQ4 today.



REM crowd-sources every single lane traveled and can look at the strong perception results. So if I'm driving my car and take a lane leading onto a highway, then right before I enter the on-ramp, the strong perception network says this is a highway entrance. This is also confirmed as I take the ramp, and its detection and confidence are reported back to Mobileye HQ.

Mobileye's REM software at HQ automatically does a calculation based on what other cars also reported about that lane and can confirm that this lane truly does lead to a highway entrance, thereby creating a map of the drivable path on each lane of each road traveled.

This is all done automatically with no human oversight.
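
To make that concrete, here's a toy sketch of what crowd-sourced lane confirmation could look like. This is not Mobileye's actual algorithm; the function names, thresholds, and scoring are invented for illustration:

```python
from collections import defaultdict

# Each car's perception stack reports what it believes a lane leads to.
reports = defaultdict(list)   # lane_id -> list of (label, confidence) tuples

def report(lane_id, label, confidence):
    reports[lane_id].append((label, confidence))

def confirmed_label(lane_id, min_reports=5, min_avg_conf=0.8):
    """Confirm a lane's label once enough cars agree strongly enough."""
    obs = reports[lane_id]
    if len(obs) < min_reports:
        return None               # not enough independent observations yet
    scores = defaultdict(float)
    for label, conf in obs:
        scores[label] += conf     # accumulate confidence per candidate label
    best_label, total = max(scores.items(), key=lambda kv: kv[1])
    # Require the winning label to average out above the threshold.
    return best_label if total / len(obs) >= min_avg_conf else None

# Five cars drive the same lane and report it as a highway entrance:
for conf in (0.9, 0.85, 0.95, 0.8, 0.9):
    report("lane_42", "highway_entrance", conf)

print(confirmed_label("lane_42"))   # -> highway_entrance
```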

Also, it's not just what the strong perception network says; the route the car actually drives also plays a huge role in what gets saved in the REM map. Think about the complex Waymo intersection I posted above. At complex intersections, even ones less obvious than that, you would have several lanes, with or without lane markings, leading to different paths.

I have always said: your AP1 car doesn't forget what a lane, a car, a bike, or a traffic sign looks like when you go to another city, state, or country.

Again, Mobileye did all of this without needing 300,000 cars grabbing pictures for it. Therefore there is absolutely no sensing advantage to having hundreds of thousands of cars grabbing images for you, unless you're behind in sensing, which Tesla clearly is.
 
People, I'm guessing he is basically saying that AP is better than what Google had in 2012.

No, lol. Of course that’s not what I’m saying.

Do you compare cars on the road today running EyeQ3 or EyeQ4 to Waymo’s minivans and conclude that Mobileye is far, far behind on self-driving? You should not compare production ADAS to autonomous test vehicles. That is an analytical error. It’s comparing apples to oranges.

You may claim that unlike Mobileye, everything Tesla has developed in terms of prototype autonomous driving software is already in ~180,000 production vehicles. But we know from multiple public disclosures by various Tesla executives, dating from early 2016 to late 2018, and from investigative reporting, that this is not how Tesla’s development pipeline works. There is stuff in testing long before it hits customers. For example, Navigate on Autopilot was in testing for at least around 6 months (and probably longer) before it launched. It stands to reason there is software that has been in internal testing for many months that customers don’t know about yet.

From Tesla’s demo video in October 2016 of an autonomous drive in Palo Alto, and from Elon’s statement that Tesla could have done a coast-to-coast autonomous demo by gaming the system/special casing the route, we can infer that Tesla at least has a very rudimentary, demoware-level software prototype for full autonomy. This is not in production cars.


The Mobileye CTO is named Amnon.

“The second part is parsing the roadway. It’s understanding the lane marks, the road edges, the path delimiters, where we have curves, or we don’t have curves, the driveable paths and the semantic meaning associated with the driveable paths. And this is largely an open problem from a perception point of view.”

–Amnon Shashua, CES 2018

“Parsing Roadway to sufficient details for L4 is an open problem”

–Amnon Shashua’s slides, CES 2018

So, according to Amnon Shashua as of January 2018, semantic segmentation is “largely an open problem from a perception point of view.”

If you think these quotes are an inaccurate reflection of Amnon’s views as of today, please find an exact quote and link to the video (with timecode) where he explicitly says semantic segmentation is a solved problem or a closed problem.

In the context of what electronblue said, I interpreted “solved” as meaning “an autonomous car can do it as well as a human or better”, i.e. no further improvement is needed — although people use the term “solved” to mean all kinds of different things. “Solved” can mean “we have a prototype that basically works and the rest is just tinkering until we get it right” or it can mean “we have a successful, finished commercial product on the market today, and no more tinkering is needed”. It depends on the person and the context. I would agree computer vision (broadly construed) is perhaps solved for superhuman full autonomy in the former sense, but not in the latter sense.

I'm sorry, but this is simply not true.




This is a thread about Waymo, not Mobileye, so please continue any discussion of Mobileye elsewhere, such as the Mobileye vs. Tesla megathread.
 
Last edited:
  • Like
Reactions: jimmy_d and fmonera
No, lol. Of course that’s not what I’m saying.

Do you compare cars on the road today running EyeQ3 or EyeQ4 to Waymo’s minivans and conclude that Mobileye is far, far behind on self-driving? You should not compare production ADAS to autonomous test vehicles. That is an analytical error. It’s comparing apples to oranges.

You may claim that unlike Mobileye, everything Tesla has developed in terms of prototype autonomous driving software is already in ~180,000 production vehicles. But we know from multiple public disclosures by various Tesla executives, dating from 2016 to 2018, and from investigative reporting, that this is not how Tesla’s development pipeline works. So, this claim is 1) unsubstantiated by evidence and 2) contradicted by available evidence.



The Mobileye CTO is named Amnon.

“The second part is parsing the roadway. It’s understanding the lane marks, the road edges, the path delimiters, where we have curves, or we don’t have curves, the driveable paths and the semantic meaning associated with the driveable paths. And this is largely an open problem from a perception point of view.”

–Amnon Shashua, CES 2018

“Parsing Roadway to sufficient details for L4 is an open problem”

–Amnon Shashua’s slides, CES 2018

So, according to Amnon Shashua, semantic segmentation is “largely an open problem from a perception point of view.”



:rolleyes:

Lol, excuse my spelling; half the time I'm posting from mobile with annoying auto-correct, but other times, since I pronounce his name as Amon, I just type "Amon" instead.

However, you are misinterpreting what "Amnon" is saying. It's an open problem for the industry, NOT for Mobileye. It's easy to grab a snippet and say "see, he is saying it's an open problem." But if you actually listen to the 30 other Amnon presentations, you know that's not the case. Amnon tries to differentiate each of his presentations. He tells you the building blocks of SDCs and explains the current status in the industry and how Mobileye has either solved it or is solving it (when it comes to driving policy). Amnon himself said the roadway parsing was in series production in 2018.

This literally confirms what he said above about the Semantic Free Space in EyeQ3 being primitive and used in Tesla AP1, and the next-gen SFS in EyeQ4 having curb, barrier, guard rail, flat, vertical, etc. delimiters.

He literally talks about these very same things, only using different visualized presentations because he wants to keep things fresh.

He is only showing one visualization, btw. Other visualizations include the pics I posted above.
Then he said "This type of road edge detections are part of 2018 launches"

He also showed holistic path planning (Holistic Lane Centering) and said "If we were to reach a junction where we could go right or left. This is important not only for sensing, actuation, but also for map building. You would like to know where the path is in order to build a map".

The whole point of the presentation is that the other companies are using manually built HD maps, while Mobileye's crowd-sourced REM maps (which are in production TODAY) are built automatically using strong perception.

As Amnon himself said, this was a focus of 2017: getting strong perception and REM automatic map making working. It is in production today. BMW, Nissan, and VW are all creating drivable paths. Nissan will release an L3 system that uses the REM map in early 2019. Another automaker, in late 2019, will use REM maps to enable L2+ everywhere.

Again, it would be impossible for Mobileye to deploy the REM map in production if they hadn't solved strong perception.

The green line is the drivable path. Again, this is available today. It's equivalent to the green line in the Waymo map; the only difference is that one is built automatically.
[Image: Mobileye REM RoadBook]
 
@strangecosmos

My point is, Mobileye today has a REM map that is "designed for level 4 and level 5 automation" working and in production. So the whole "it's not solved" thing is completely moot. As Amnon also said, you CAN'T automatically create maps without strong perception. Therefore Mobileye has solved strong perception to the level of creating Level 4 and Level 5 maps. It's really that simple.