Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Weird issues on brand new MY

Hi,

New MY owner here - took delivery 2 days ago in the Netherlands. I went on a road trip straight from the Delivery Center and noticed a few issues on the trip - any info on these issues would be appreciated. I did check the forums but the entries I found didn't exactly describe my issues, so I'm posting them here:

1. Once, while I was parked, the screen (or the car?) crashed/rebooted by itself twice. I was in a menu, touched a button (can't recall which), and the screen just went black, then showed a loading progress bar for quite a while before returning to the default screen. A few minutes later the same thing happened again. This was very soon after delivery and hasn't happened since in the last 2 days. Should I be concerned?

2. While on my first trip (Day 1) I got the message that the right B-pillar camera was "blocked or blinded". I saw that this has happened to others due to condensation, but when I stopped to inspect the camera (not immediately, but a bit later) I didn't see any condensation - everything was sparkling clean. This happened multiple times during the drive.

3. The rear USB-C ports don't charge my phone - the front ones work. So I guess this is a warranty issue.

4. Before I arrived at my first destination, I set the charging current limit to 5A (I knew I was going to charge from someone else's normal mains plug at my destination and didn't want to blow their fuse), but sure enough it did just that - blew their fuse after a few minutes. When I looked at the screen (showing "charging stopped"), I noticed that the charging limit was back to 16A, despite my previously setting it to 5A - apparently it didn't remember the setting. I know the limit doesn't apply to Supercharging, but I had a Supercharger stop between setting it to 5A and the blown-fuse incident, and I don't know if this may be relevant. Is this normal?

5. When driving on Autopilot (I have the full autopilot) I noticed that I had a tendency to slightly but constantly push the steering wheel to the left (unsuccessfully, of course) as I felt I was too close to the cars in the next lane to the right, and I noticed that quite a few cars moved away from me after I passed them (I suppose they felt the same and moved away after the "close encounter" as if I scared them). I checked using the cameras and the car was actually driving in the middle of the lane, so I don't think there was anything wrong with it, but it seems like we have a tendency to keep more distance from other drivers, which on the typical 2 way European freeway means that the driver in the left lane drives closer to the left side of the road and the driver in the right lane closer to the right edge of the road. I don't want to scare other drivers... and I assume the car is driving safely in technical terms, but I feel weird driving in a way that makes others uncomfortable. Did anyone have a similar experience?

6. I have trouble taking over from Autopilot when I need to make a slight correction. For example, today I came up on a bicyclist while driving on Autopilot, and it would not pass him - instead, it kept following at 20 km/h. There were oncoming cars in the other lane, but there was plenty of space to pass safely - I guess it was safe for an experienced human driver, but Autopilot didn't think so. So I decided to take over by gently pushing the steering wheel to the left (which required quite some force), and when Autopilot finally gave in and disengaged, the car went to the left in a sudden motion that was way more than what I wanted - not smooth driving at all. I know I can just disengage Autopilot with the stalk, but that feels like a very complicated way of making this simplest of maneuvers - move hand to disengage Autopilot, then move hand back to the wheel to steer, then move hand again to re-engage Autopilot... it seems like an example of technology getting in the way instead of helping. It's practically impossible to take over from Autopilot using the steering wheel without an abrupt jerk that swerves the car in a new direction. Is it just me, or is the steering wheel way stiffer than it should be on Autopilot?
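On point 4, the fuse arithmetic is worth spelling out. This is only a sketch using assumed typical Dutch household values (230 V single-phase, circuits commonly fused at 10-16 A and often shared with other loads); the actual fuse rating at that house is unknown:

```python
VOLTAGE = 230  # V, single-phase mains in the Netherlands (assumed)

def charging_power_kw(current_a, voltage=VOLTAGE):
    """Single-phase AC charging power in kW for a given current setting."""
    return current_a * voltage / 1000

# At the 5 A limit the car draws a modest load:
print(charging_power_kw(5))   # 1.15 kW - safe on almost any household circuit
# At the 16 A default it pulls the full rating of a typical circuit,
# which trips the fuse as soon as anything else shares it:
print(charging_power_kw(16))  # 3.68 kW
```

This is why the setting reverting from 5 A back to 16 A matters: the difference isn't marginal, it's the entire headroom of the circuit.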
 
It is in the owner's manual (which you will have trouble accessing in the car... it doesn't work correctly). You can find an old, outdated version of the owner's manual online. Also, the auto high beam headlights (standard on many cars) are in BETA on the Tesla Model Y. Most new cars (all of the expensive ones) have lane centering and adaptive cruise control (Autosteer and TACC equivalents) that work correctly almost 100% of the time, whereas these are BETA features in the Tesla.

Keith
Yeah - I am a techie/early adopter type so I don't mind. As far as I'm concerned, all of us Tesla owners are beta testers in one of the biggest beta tests in the world, one that should result in much better, smarter cars and, one day, fully autonomous ones. I am okay with that and am looking forward to the updates and improvements. I didn't expect perfection, but I still had to get used to the "beta" aspect, which is a little more beta than I expected :) AP definitely has a learning curve if you want to tame it and work with it instead of fighting it. Some functions and features are just not well designed, and some are outright annoying, but I still love the car as is - it is unique, innovative and fun to drive. For me, the long list of things Tesla got right is worth putting up with the features I feel they botched, and knowing that any problem I experience could get fixed/improved any day via an update makes it easier to live with its quirks.
 
... and after an 1,800 km road trip, I can add the following to the list:

8. Summon NEVER worked. Not once. It either told me to move closer to the car, or started backing out of the parking spot and then canceled Summon after moving about an inch (with no obstacles in the way).

9. Autopark NEVER worked either. During a trip through numerous cities, my MY didn't "see" a single parking spot suitable for autoparking - not in mall parking lots, and not when I VERY slowly drove past a row of cars parallel-parked next to the curb, with multiple perfect spots coming and going without the car noticing any. I was just slowly rolling by one parking spot after another, thinking "well...?". LOL.
 

I am a techie/early adopter as well - I had a 2010 Chevy Volt, then a 2016 Volt and a 2017 Bolt before getting the MYP... But I have to say, it's a bit embarrassing when other companies pass you by while you are still providing beta-level features and claiming to be the leader in the autonomy field.

The only other Tesla owner I live near (one of my co-workers) is a HUGE fan-boy... he refuses to acknowledge that any other company has functional lane centering (as good as Tesla's in most cases) and adaptive cruise control (better than Tesla's in many cases)... it baffles me to see that attitude in people.

We know that HW3 isn't good enough for full autonomy, so perfect the software for basic AP and a functional TACC (90 mph, close follow distance in Vision-only cars, near-zero phantom braking) and then lock it down... I don't even mind if they leave the "BETA" label on it... the Beta label is only there to shield them from legal liability anyway. Oh, and if you can't get auto high beams to work, don't make them mandatory for AP use... keep them as an optional Beta feature for people who enjoy annoying everyone else on the road at night.

Keith
 
Based on my observations during this drive, I agree - I don't think the current M3/MY hardware will ever achieve full autonomy. Not yet having a complete feel for the dimensions of the new car, I put a small scratch on my front bumper on Day 1 while very slowly inching into a parking spot: the screen was still showing 49 cm left in front when the bumper touched a tall curb - apparently a sensor blind spot. On another occasion, driving in a small town, the car drove onto a slanted sidewalk in a steep double turn (the kind intended to slow down traffic - we have those in a lot of places in Europe). I intentionally put the car into that situation as a test of AP, because it was safe and I wanted to see how AP would deal with it - well, it failed. My guess is that the car just doesn't have the hardware to properly recognize curbs and other small obstacles. The parking sensors have blind spots and don't have the range anyway, and I don't think cameras alone are adequate for machine vision to see a grey curb against the grey background of the sidewalk. IMO, this in itself is enough to defeat full, safe self-driving. I think Tesla's camera-only approach to FSD will never make a truly autonomous car - they will have to add something that provides proper 3D vision of all objects, maybe lidar. Beyond curbs that are almost invisible to cameras: how about roads with no painted lines? What about bottlenecks where a road narrows (e.g. a tight tunnel) with alternating traffic directions? Not to mention two-lane roundabouts like those in France - the Tesla would just stop, paralyzed, in the middle, seeing a bunch of cars moving at constantly changing angles and speeds relative to it while it is doing the same - enough to confuse the sh@# out of any camera-based AI vision that lacks measurements of each object's (constantly changing) precise location, direction and speed.
What about driving in India, where on a 3-lane highway people completely ignore the painted lines and drive as if there were 5 lanes? How about speed bumps (which my MY never took any action to slow down for) - a lot of them are not safe to drive over anywhere near the speed limit, as that would send the car flying and make for a very unpleasant experience for everyone in it. And so on. Some driving situations can be easily navigated by human perception, experience and intelligence, but that uses a lot more than just processing visual cues from other cars and large objects. My bet is that radar/lidar or something like it will be back in future Tesla cars when they realize that the full autonomy they promised can't be achieved with cameras only.
 
Congrats!

1. That happens once in a blue moon. It happened to me once while I was backing out of a parking spot and another time while driving - I only noticed because the music stopped. Just monitor it and hopefully it doesn't happen again.

2. It might be strong sunlight. I get that warning occasionally and don't have any condensation or significant dirt.

3. I haven't tried charging my phone, but I did check that the two rear USB-C ports do charge my wireless headset.

4. Sorry, can't help here. I mainly charge at work and haven't touched any charge settings.

5. Yes, it does seem a bit weird always driving right in the middle of the lane. I think I've gotten used to it. You can pop up the rear and side cameras as a sanity check.

6. Yes, AP holds on quite strongly, and you'll have to yank on the steering wheel forcefully (as in an emergency maneuver) to take over, which results in a jerky motion. Use the right stalk or the brake to disengage AP smoothly.

AP has gotten much better at judging distances to cyclists; however, it's very timid and slows down when they're <5 ft from you. There are many cyclists where I live, and AP can now drive in the right lane without braking unnecessarily.

[Attached image: AP_and_cyclists.jpg]



Yeah, last I heard HW4 development is already under way, but Elon is obsessed with doing vision-only, so I expect it to be a vision-only system as well, with no geo-mapping, no radar, and an unreasoning hatred for LIDAR. You would think LIDAR killed Elon's parents, the way he talks about it! I really don't understand the obsession with vision only - "humans get along fine with just their eyes" seems to be the mindset. Personally, I don't want a car that drives "as good as" a human, I want a car that drives "better than" a human! Being able to see through rain/snow with IR vision, radar, lidar, sonar... hell, give it X-ray vision if you can!

Based on my experience with TACC, I don't think Tesla would be able to do geo-mapping successfully. Even when TACC was operating correctly, there are areas where, a few miles after you pass the last 55 mph speed limit sign, it assumes for no Q@#$#$% reason that the speed limit has magically dropped to 25 mph. This wouldn't be an issue if it let you do more than 5 mph over the speed limit on two-lane roads, but I am not going to drive 30 mph in a 55 mph zone where even doing 60 mph you have people riding your bumper. There is also a place on my commute where several years ago there was a construction zone with a 45 mph speed limit... now, 2 years after construction is complete, TACC (back when it worked) would drop my speed in this "ghost" construction zone for 1.5 miles and then go back up to the real 55 mph limit.

The saddest part to me: as long as Tesla calls these features (which are production features in other cars) BETA features, they can just ignore the non-fanboy complaints. Also, as long as it is a BETA feature, they can sell FSD for $10,000 a pop to unsuspecting new Tesla buyers who think FSD actually means "full SAE Level 5 self driving", without being sued into oblivion.

Later,

Keith

PS: I don't need or want FSD, I want functional basic Autopilot with functional TACC... is that too much to ask? If it IS too much to ask, then how about basic lane centering and dumb cruise control that actually work?
 
I think that Tesla's camera-only approach to FSD will never make a truly autonomous car

Personally, I'd say it's more of a two-way street (pun intended).

I mean that FSD in all situations may or may not ever come to be, but there will probably be a phased-in approach that involves both the vehicle AND the highway. For instance, FSD will soon be available on motorways (Interstates, Autoroutes, Autobahns, etc.) because those highways are built and maintained to a specific standard with regular inspections. The vehicles will talk to each other and to the highway, and together they'll all understand what needs to be done for safe and successful travel.

It's a bit like a CAT III approach in an airliner, which allows it to land in zero visibility. Sure, you can do that in a CAT III-equipped aircraft at an international hub like Schiphol or JFK, but don't try it at a private field.

Eventually FSD might be available on city streets, but that's a much bigger problem to solve. Then again, we used to ride horses wherever we wanted, and when cars started appearing there was general chaos until things like traffic lights and STOP signs were invented.
 
2. It might be strong sunlight. I get that warning occasionally and don't have any condensation or significant dirt.
Thank you for the info, Mark. This one (camera "blinded or blocked") only happened during night driving, but then repeatedly, on multiple days, and only with the right B-pillar camera; there was nothing special I could see that would block or blind it, and never any visible condensation or dirt on the camera either. I never got this message during daytime driving.
 
Yeah, I hear you. I also think the car is slow to adjust speed on AP/TACC to new speed limits - e.g. when there is a lower speed limit sign, it sails right past it at the previous, higher speed and takes a while to slow down to the new limit. The car could get me a ticket if I get caught doing that...

I also don't get the "vision only" approach. Contrary to what Elon is saying, humans use a LOT more than their vision to drive - hearing is one example (if I hear a siren, I look around to check where it's coming from), plus experience, intuition, instinct, non-verbal cues from and communication with other drivers, muscle memory, and even gut feelings like fear (if something subconsciously registers as "danger", I may take action intuitively to prevent an incident). A computer doesn't have any of that, so it has to replace them with objectively measurable inputs and a more mechanical "understanding" of its environment, and as a result it needs a much more accurate and detailed view of the world than a human to drive comparably well, or better.
 
I respectfully disagree that lidar or radar will ever be back in Tesla cars. Humans do not use either of those to perceive the world as they're driving - it's strictly vision, experience and intelligence, which the neural net will solve.
 
I also respectfully disagree :)

Humans use a lot more than just vision to drive. Machine intelligence and machine experience are VERY different from their human counterparts, because machine intelligence will only perceive things it was programmed to perceive (has the hardware and software to perceive), so its "experience" is limited to the collection of data it is programmed to retain and use when evaluating a situation. If a certain rare type of freak accident kills a Tesla driver, the car's "intelligence" will keep killing drivers the same way, whenever it gets into the same situation, until a human looks at the data and adds something new to the software to prevent it. Machine learning and neural nets are in their infancy - we have never even approached the intelligence of a fruit fly in collision avoidance systems, never mind human-level intelligence. So I don't think we are anywhere near anything that approaches the level of an experienced, skilled human driver with a short reaction time who is paying full attention. The current safety level of AP is that it only does things its developers programmed; if a situation is outside the scope of what it is programmed to deal with, it throws its hands up in the air and hands the wheel back to the human driver, or gets paralyzed (phantom braking, driving 20 mph on the freeway, refusing to move, etc.), or makes a mistake. I may be wrong, but this observation is based on my first 2 weeks with my MY, having driven about 2,000 km playing with and testing AP in many diverse situations. It got me into many situations where I had to take over to prevent damage to the car or myself (going into hairpin turns way too fast, wanting to drive over curbs, getting lane-confused, driving in a zig-zag on a freeway, etc.).

That said, I see problems which I don't think the car's current hardware and the state of machine learning can solve without relying on more than just vision, e.g.

- Drivers communicate with each other (gestures, flashes, horn, etc.). I don't see how this would be integrated into machine vision.

- If I see that a driver in the next lane is distracted, e.g. texting on his/her phone, or just drives erratically, I will keep more distance from them, pass them or fall back because they could do something stupid. I prevented getting into accidents a few times by doing this - how do you teach a car to do this? Add a function to watch other drivers around you? If you wait for the mistake to happen and then try to fix it, you may not be able to.

- How about just pure idiocy? Once someone who I suppose was just batshit crazy was literally chasing me on city streets. He tailgated me at only about half a meter (about 2 feet) for minutes, through turns, intersections, traffic lights, etc., slowing down and speeding up to mirror my speed. I ended up pulling over to let him go because I felt uncomfortable and would rather prevent an accident in case he made a mistake. How do you teach a car to recognize such unsafe behavior and get out of the situation, like a human driver would?

- How about driving through a bottleneck in stop and go traffic, e.g. to pass someone who double parked on a 2 way street with no painted lines, and drivers negotiating taking turns driving from each direction?

I'm not arguing against autonomous cars (which I know will be much safer than human drivers), but I think the logic that if humans can drive based on vision only, then so should a machine, is fatally flawed. We already know that from flying - in good weather, an experienced human pilot can easily land an airplane safely at a regular airport based on what would appear to an observer as "vision only" - but the pilot uses a lot more than just vision. Airliners auto-land most of the time, but no one in their right mind would even try (or has ever tried, for that matter) to create a mostly vision-based auto-landing system. It just wouldn't work. Even though a human pilot can do it, a computer requires a LOT more than the equivalent of human vision to land an airplane safely.
 
Watch a magician for a while and you'll see how easily human sight can be fooled. An algorithm might be able to distinguish some of the things the magician does, or at least maintain an awareness that the card might or might not be the Ace of Diamonds.
You have a point - but the reverse is also true. If the magician conceals his trick completely (e.g. by hiding one object behind another), then no vision-based system will be able to tell what he is doing.
 
I don’t disagree with much of what you say, but let me distill my point down. No human who is totally blind can drive a car, fly an airplane or operate a marine vehicle. A human who is deprived of the other four senses can safely operate an automobile. Vision combined with intelligence is all that is needed. Data and the neural net will solve the rest of your issues over time - maybe not today, maybe not tomorrow, but it will happen.
 
You may be right, but I have my doubts. It's not the ability to see that is the issue, it's the ability to interpret the visual information, which is much, much harder than most people think. Just one example: AP works pretty well as long as it can see painted lines. It is a complete and total fail if there are no lines. And even a dumb human with bad vision can easily drive on a road with no lines - Tesla can't do it even though it has GPS info that tells it where the road is.

I think too much is being read into AI, neural nets and the other buzzwords considering their actual abilities today. Human senses and interpretation of visual data form an incredibly sophisticated system; even though it is not that hard to fool, humans are pretty good at interpreting the world around them based on visual clues. Software written by humans to evaluate visual data is not even close, never mind "the same" as human intelligence. Cameras may capture the same visual information as human eyes, but they don't have a brain behind them. Just consider this: in order to approach the accuracy (which is not even very high) of human vision (not sight - vision includes interpretation of the visual info), you need something with abilities similar to a human brain. We don't even understand how that chunk of goo works, never mind getting anywhere near its abilities.

If you think AI is anywhere near human-level interpretation of information, try talking to any "AI driven" app... you will get quite a few "I didn't quite understand that" or "Sorry, I don't know what you mean" responses - and that is just a machine trying to interpret language, which consists of a finite number of sounds that make up a finite number of words in a clearly laid out and documented grammatical structure. It is a fraction of the info, a fraction of the variables, a fraction of the complexity, with sound being a much simpler medium than 3D vision - and machines still can't get it right. I'm not saying they never will, but boy, are we far from anything that can honestly be called "artificial intelligence", if that is supposed to mean understanding information at a level comparable to human interpretation. We are light years away from that. That's why I believe that vision alone is woefully inadequate for truly safe, 100% autonomous driving - even the vision of 8 eyes with the brain of an earthworm just won't cut it.

I think Elon is making his own job harder by omitting all the other channels that could help the car become autonomous, or has actually given up on real 100% autonomous cars for the time being and is more focused on what can be feasibly mass produced, which is self driving that works some of the time, under ideal circumstances.
 
I respectfully disagree... I think the timeline is the only thing constraining the solving of this problem. Today, you are correct. But going forward, additional inputs from radar or lidar will only confuse and SLOW down the processing of information into the operation of a vehicle - just as you would be confused if, while driving, you were receiving multiple alerts and warnings from a radar or lidar sensor saying "EMERGENCY BRAKE NOW!!!!" when your own two eyes were telling you it's OK. It WILL take time, but it will be solved (especially after seeing the AI Day mental firepower behind the process of solving this).
 
And of course, if what you are describing happened to a HUMAN, they would also be fooled and get into an accident. Perfection in avoiding ALL risk will never exist (it would put ALL the insurance companies out of business ;) )
 