Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Car and Driver Model 3 Test - Not Great

Status: Not open for further replies.
It's $121 roundtrip from San Diego to San Francisco, and the flight takes 90 minutes. Electricity at Superchargers is $0.25/kWh, and minimum wage is $11/hr. The drive is 500 mi each way with tons of SCs. The main route makes I-10 in West Texas seem scenic.

Obviously an EV is not saving you time, but it's not actually saving you money either, even if you work at McDonald's.
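To make that concrete, here is a rough back-of-the-envelope check of the numbers above. The ~4 mi/kWh efficiency and ~10-hour drive each way are my assumptions, not figures from the post:

```python
# Rough fly-vs-drive cost check for the San Diego -> San Francisco example.
# Assumed: ~4 mi/kWh efficiency and ~10 h drive each way (not stated above).
FLIGHT_COST = 121.0      # roundtrip airfare, $
FLIGHT_HOURS = 2 * 1.5   # 90 minutes each way
MILES = 2 * 500          # roundtrip distance, mi
KWH_PRICE = 0.25         # Supercharger rate, $/kWh
WAGE = 11.0              # minimum wage, $/hr
MI_PER_KWH = 4.0         # assumed efficiency
DRIVE_HOURS = 2 * 10.0   # assumed drive time

electricity = MILES / MI_PER_KWH * KWH_PRICE           # $62.50 in charging
extra_time_cost = (DRIVE_HOURS - FLIGHT_HOURS) * WAGE  # 17 extra hours at $11/hr
drive_total = electricity + extra_time_cost

print(f"drive: ${drive_total:.2f} vs fly: ${FLIGHT_COST:.2f}")
```

Under these assumptions, driving comes out to roughly $250 against a $121 fare, which is the poster's point: even valuing your time at minimum wage, the drive doesn't pencil out.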

But? I'm not in the 'fries with that' end of the food chain anyhow. I'm not wasting 10+10 hours in a car if there are other options on the table.

The 'ICE is better at long hauls' is exactly as true as 'The BMW M3 is a better track car than the Tesla Model 3'. Both being irrelevant for most drivers. People don't drive track events often, people don't drive sedans cross country often if they can afford to fly instead.

And the same thing applies to EV range once you reach your convenience tipping point. Our tipping point is 3h30m. That's a 50/50 decision whether to fly or drive. And driving assumes non-stop at 85mph through mountains, something no BEV can do yet reliably. If it's too cold, or a headwind, it's out of range. On busy weekends, it's a flight. On normal ones, it's a drive.

You might drive everywhere you go. That's fine. Your best choice then is not a BEV if you like to go 'places' and do 'things'. No, there are not SC's everywhere yet, regardless of what you read on the web. For April 2018, ICE still holds the long-range and go-anywhere titles. But that doesn't matter for most drivers; it's a bar room debate, not a serious topic.

PS - Odd somebody from Texas would be expounding the virtues of EVs for long range travel. Did they finish support for the I-10 West yet? One of the few cross-country trips I've done a lot is California to Florida, and West Texas is a major segment. However, I am towing heavy on that run. Too big for an EV, too little for Class 8 EV.

McRat, you are becoming the king of the straw man argument (I think I said red herring before, but it's actually straw man... not that anyone cares). First you posed a question for an argument I never made, that the Bolt had inadequate range; now you are countering an argument that Teslas are just as convenient as ICE cars for long-range travel. Nobody ever made that argument.

Allow me to restate my and other posters' position:
- The Tesla Model 3 is a far more capable long distance traveler than the Chevy Bolt. Full stop.

There you go. Now feel free to offer as many counter arguments to that statement as you wish.
 
No, it’s a design flaw. Relying on the driver to override AutoPilot from driving into a highway barrier is a design flaw.

That's simply your opinion, and you're welcome to it. However, we have evidence that AP doesn't attempt to drive into the barrier on its own... both according to Tesla and a member here who drove on AP at the very spot, at a similar time of day. So there are clearly other circumstances at work outside of AP.

Again, you can blame AP for this accident, but neither the evidence nor the design of the system supports your opinion.
 
That's simply your opinion, and you're welcome to it. However, we have evidence that AP doesn't attempt to drive into the barrier on its own... both according to Tesla and a member here who drove on AP at the very spot, at a similar time of day. So there are clearly other circumstances at work outside of AP.

Again, you can blame AP for this accident, but neither the evidence nor the design of the system supports your opinion.

It appears that there can only be two culprits: the driver and/or AP. It should be obvious that the driver must share some or most of the burden for driving the car into the barrier. At the same time, AP has to bear some burden for allowing the car to go into the barrier. You can't say it was smart enough to warn the driver that it was going to crash and then say it did its job. I agree that there is a design flaw that allowed it to continue on a deadly course. It would be the same as the driver seeing the barrier and continuing to drive right into it. We would say there was something wrong with the driver. We might say he had some 'problems,' etc. With a device, we say it had a design flaw or a total failure.
 
No, it’s a design flaw. Relying on the driver to override AutoPilot from driving into a highway barrier is a design flaw. AutoPilot shouldn’t be trying to drive into the highway barrier in the first place.
The driver had his AP following distance set at "one." EAP worked exactly as the driver programmed it to do. DO NOT SET FOLLOWING DISTANCE TO ONE IF YOU ARE TRAVELING AT A HIGH RATE OF SPEED. I would think everyone would understand that. The real "design flaw" is letting drivers who have unreasonable expectations of AP, and who expect it to be FSD, behind the wheel.
 
PS - Odd somebody from Texas would be expounding the virtues of EVs for long range travel. Did they finish support for the I-10 West yet? One of the few cross-country trips I've done a lot is California to Florida, and West Texas is a major segment. However, I am towing heavy on that run.
One more to go on I-10 to support every Tesla, but the longest stretch is only 226 miles, which is easily doable if you have an 85 or better (not towing a trailer, though). I've put over 100K miles on my Model S, about half of which are long-distance trip miles. As far as I'm concerned, the Tesla is the best trip car I've had. The amount of time a trip takes is the same as it was in my previous cars (at least on the trips I drive frequently enough to compare), and it's far more relaxing in the Tesla.
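For what it's worth, the 226-mile gap claim can be rough-checked. The ~300 Wh/mi highway consumption figure below is my assumption, not a number from the post; real consumption varies with speed, weather, and load (and rises sharply when towing):

```python
# Can an 85 kWh pack cover a 226-mile Supercharger gap?
# Assumed highway consumption of ~300 Wh/mi (illustrative only).
PACK_KWH = 85.0
WH_PER_MILE = 300.0
GAP_MILES = 226.0

range_miles = PACK_KWH * 1000 / WH_PER_MILE  # estimated highway range
margin = range_miles - GAP_MILES             # buffer left over the gap

print(f"estimated range {range_miles:.0f} mi, margin {margin:.0f} mi")
```

Under that assumption the pack covers the gap with roughly a 50-plus-mile buffer, consistent with "easily doable" for an 85 in decent conditions.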
 
It appears that there can only be two culprits: the driver and/or AP. It should be obvious that the driver must share some or most of the burden for driving the car into the barrier. At the same time, AP has to bear some burden for allowing the car to go into the barrier. You can't say it was smart enough to warn the driver that it was going to crash and then say it did its job. I agree that there is a design flaw that allowed it to continue on a deadly course. It would be the same as the driver seeing the barrier and continuing to drive right into it. We would say there was something wrong with the driver. We might say he had some 'problems,' etc. With a device, we say it had a design flaw or a total failure.

AP is designed to function while being monitored by the driver, full stop. AP cannot crash your vehicle headlong into a barrier unless the driver fails in his task, full stop.

Calling it a failure of AP is inherently incorrect; there just is no other way to spin it. That does not mean AP didn't contribute to the crash; I'm simply pointing out that placing the blame for the accident on AP is unequivocally wrong.
 
Because AP is not FSD. AP accidents are driver-error accidents. Logs show the driver was alerted by audio and physical warnings at least 6 seconds and 150 meters (almost two American football fields) prior to impact. Something was not right with that driver.
If you read Tesla's statement closely, you'll see that's actually not what happened, even though the statement appears to be carefully designed to give that impression. The driver received the hands-on warnings "earlier in the drive," not directly prior to the accident. And even that doesn't necessarily mean the driver didn't have his hands on the wheel (if you've ever used the Tesla Autopilot, you know that the hand detection is far from perfect). I also don't like how they are trying to blame the driver for setting the follow distance to minimum. If that isn't safe, the car shouldn't offer the choice in the first place.

Frankly, I'm puzzled how something like this can happen. Tesla points out that the driver had an unobstructed view of the divider. But that also means the car had an unobstructed view as well. How is it possible that the Autopilot drives the car right into a clearly visible obstruction (with orange/black signal colors, to boot)? And that happened right in Tesla's backyard, an area that should have been thoroughly mapped out by tens of thousands of Tesla drivers by now.
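As a side note, the warning figures quoted earlier in the thread (150 meters of warning over 6 seconds, "almost two American football fields") do check out arithmetically:

```python
# Sanity check of the quoted warning figures: 150 m of warning over 6 s.
M_PER_FIELD = 91.44   # one American football field (100 yd) in meters
dist_m, time_s = 150.0, 6.0

speed_mph = dist_m / time_s * 3600 / 1609.344  # implied speed, ~56 mph
fields = dist_m / M_PER_FIELD                  # ~1.64 football fields

print(f"{speed_mph:.0f} mph, {fields:.2f} football fields")
```

So the distance and timing are mutually consistent at highway speed, and 150 m is closer to one and two-thirds fields than two.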
 
This thread is going all over the place, but going back to the original topic (sort of): I think the negative critiques of the Model 3 are really getting over the top, and if you actually drive a Model 3 and a Bolt, you will see very quickly that they are not in the same league. Yes, we do not have the $35k Model 3 yet, but it will essentially be the same as the LR version we have now, and it is far better than the Bolt.

I'm concerned that all these reviews micro-scrutinizing stuff like panel gaps are going to scare people who are considering buying this car. For us and our Model 3, the build quality is great, I hear no rattles, the car feels very solid, and the technology is way ahead of the competition. And that's comparing to other EVs, and also to the BMW 3 Series and Audi A4 (which we also looked at). I don't think Tesla should get a pass on build quality issues, but, as I said, the negativity to my eyes/ears is getting over the top.
 
Calling it a failure of AP is inherently incorrect; there just is no other way to spin it. That does not mean AP didn't contribute to the crash; I'm simply pointing out that placing the blame for the accident on AP is unequivocally wrong.

And I agree! The crash was a combination of factors that necessarily had to include the driver and the car. The portion contributed by the car is, in some sense, the easiest to fix. The car follows rules (programming), and the rules need to be inspected and modified to prevent a similar sequence of inputs from producing the same result. The driver is the harder part. We don't know why he did not heed the car's warning: on the phone, texting, reading, showing off. Those are hard issues to deal with and usually require some near-draconian set of rules for the car to follow to allow for operator stupidity.
 
None of this matters. Tesla can’t make enough cars to satisfy demand, and that’s going to continue indefinitely, because demand is sky high and they are still ramping up. A bad (or good) review doesn’t change anything. If you have the car, enjoy it, it’s great. If you don’t have the car, gather as much info as you can to decide whether it’s for you, you have lots of time.
 
If you read Tesla's statement closely, you'll see that's actually not what happened, even though the statement appears to be carefully designed to give that impression. The driver received the hands-on warnings "earlier in the drive," not directly prior to the accident. And even that doesn't necessarily mean the driver didn't have his hands on the wheel (if you've ever used the Tesla Autopilot, you know that the hand detection is far from perfect). I also don't like how they are trying to blame the driver for setting the follow distance to minimum. If that isn't safe, the car shouldn't offer the choice in the first place.

Frankly, I'm puzzled how something like this can happen. Tesla points out that the driver had an unobstructed view of the divider. But that also means the car had an unobstructed view as well. How is it possible that the Autopilot drives the car right into a clearly visible obstruction (with orange/black signal colors, to boot)? And that happened right in Tesla's backyard, an area that should have been thoroughly mapped out by tens of thousands of Tesla drivers by now.

Thank you for noticing the inherent bias and selective data points in Tesla's press release. Tesla's release is not the result of an investigation and is not meant to provide the public with the full facts. It's a press release, and it's meant to deflect blame from Tesla and defend the brand. There are actual government and law enforcement agencies that will conduct the true investigation and release proper results that serve the public good.

Think about all the data points Tesla didn't choose to release so far... What was the driver doing at the time of the crash? Was he playing with the touch screen? His phone? How many people drive past that barrier each day and have to intervene to correct Autopilot? How did Autopilot's behavior change in the 11 days since the crash cushion was damaged by the prior accident? Etc., etc.

Anyone who points to the “facts” in Tesla’s press release should be aware and acknowledge that those are very carefully selected facts with a specific intent behind them.

Stay skeptical, always.
 
The car follows rules (programming) and the rules need to be inspected and modified to prevent a similar sequence of inputs from producing the same result.
Yes, that is called Full Self Driving. We are not there yet. Take the driver out of the equation and driving will be safer (not risk free, but safer). And to bring this back on topic, that is the ultimate Tesla concept that C&D reviewers fail to understand. They just don't get it.
 
Yes. And I agree with them. The Model 3 drivetrain is not bad, but it's not anything special. I had an Audi S7; it had a good engine, but not an impressive one. My P85D, that's impressive.

I guess if you've never driven a P model Tesla you could be impressed by a lowly Model 3 drivetrain.
Well... I have driven a P for 5.5 years and now have an S. IMO, the P has great acceleration, but the 3 has far better overall balance between drivetrain, suspension, and chassis, making it much more fun to drive overall. The 3 has plenty of acceleration, probably top of its class for the price range. I would not call it lowly.
 
The driver had his AP following distance set at "one." EAP worked exactly as the driver programmed it to do. DO NOT SET FOLLOWING DISTANCE TO ONE IF YOU ARE TRAVELING AT A HIGH RATE OF SPEED. I would think everyone would understand that. The real "design flaw" is letting drivers who have unreasonable expectations of AP, and who expect it to be FSD, behind the wheel.
Then why does Tesla allow you to set distance to one at high rate of speed?
 
Why are you allowed to exceed the speed limit? Why can you drive the vehicle with your seat belt unbuckled?
Both of these are against the law, so when you are caught you get a fine. You are not "allowed"; you do it at your own risk. And since the law cannot confirm what your following distance is set to, it should be up to the manufacturer to not allow it.
 
Then why does Tesla allow you to set distance to one at high rate of speed?
Because it is Autopilot driver assist. The driver is in control, even if it is bad control. Why does Tesla allow you to exceed the speed limit? Why does Tesla allow you to manually pull out in front of another car? Why does Tesla allow you to text while driving? Etc. We have TACC on our current car, and we ALWAYS set the following distance to maximum regardless of speed, but sometimes we apply the brakes manually because we just don't feel comfortable in a driving situation that is developing... even though we are confident the car will brake automatically if required. It is called personal responsibility.

Eventually, Tesla will control all of these bad decisions by drivers. The car computer will make good decisions for bad drivers. It is called Full Self Driving. C&D reviewers failed to acknowledge that in their "review."
 