Does anyone know if Beta 9 has activated the interior camera?
So anyway joking aside, I know an intersection in the east SF Bay Area in Pinole that is almost certainly going to cause the beta to fail miserably. Anyone wanna go try it and get video? It's the sort of thing you don't see *super* commonly but will definitely need to be addressed if they ever want to do a real release.
New thread perhaps. Street names or GPS coordinates? Google maps link?

I'd be shocked if they could give multiple seconds of notice with this intersection. Honestly, just the entertainment value of seeing how badly it would screw it up would be fun.
That was nice.

Chuck doing the same route on V9 for comparison.
Yes, it is a big change to how the software works "under the hood".
It's more a refinement of the existing behavior, like making it smoother and more confident.
Leading the autonomy software team for the Tesla Autopilot.

Elon's comment on observations made by Earl.
Yeah, that's them!

Leading the autonomy software team for the Tesla Autopilot.
My team's main focus areas are:
- Creation of large scale automatic ground truth pipelines to train neural networks with massive amounts of diverse, high-quality data. Use this fleet-learning approach to replace potentially brittle run-time algorithms with robust learned models.
- Developing an accurate and detailed geometric and semantic understanding of the world using the best of both machine-learned and engineered models.
- Building robust, causal, predictive models for other agents in both geometry and semantic state spaces.
- Decision making, motion planning, and control modules using state-of-the-art AI techniques including methods for high-dimensional search, trajectory optimization, reinforcement learning, model-predictive control, etc.
As both a manager and a technical contributor, I've brought up Tesla's successive Autopilot HW2 and HW3 generations of computers and software stacks, integrated them across all Tesla vehicle types, and pushed them all the way to mass production. I've been regularly shipping software updates to hundreds of thousands of vehicles across the world, including compute optimizations, new features, and system stability fixes.
I've been scaling Tesla Autopilot's software- and hardware-in-the-loop continuous integration infrastructure, and have developed productivity tools to accelerate our R&D team's development cycles toward full autonomy.
I currently report directly to Tesla's CEO.
In particular, I currently lead:
- Overall System Software & middleware (C/C++ middleware, IPC, process scheduling, Logging, Watchdog, ...)
- Computer Vision system software (GPU kernels for post-processing, Neural Network integration, C++ Compute Graph Framework for efficient compute scheduling across multiple devices)
- Camera software stack (across all Tesla vehicle types)
- Platform Software (Linux kernel/drivers, security, power, board bring-up)
- Continuous Integration infrastructure (automated & on-demand regression tests, performance tests, and simulation tests, scheduled on either x86 emulation or true hardware-in-the-loop setups)
- Build System (including remote-caching)
- Performance & optimization (responsible for the Autopilot framerate across all platforms)
- Telemetry (on-vehicle data capture software, and back-end ingestion services)
- Machine Learning infrastructure (training stabilization & scalability, workflow automation)
- Tools (sensor clips visualization, data plotting for logs analysis or live debugging)
Looks like little or no change in reliance on map data, so I would expect the same continued failures others have noticed, especially in downtown areas. Notably: ending up in a left-turn-only lane at 10:30, where OSM believes there are 3 lanes with no turn-lane attributes, and changing lanes mid-intersection at 12:25 because the map data says 4 lanes without indicating how many go in which direction, so Autopilot probably assumed 1 and cut off/"merged" in front of the other car.
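For context, the missing attribute the post is describing is (assuming standard OpenStreetMap tagging conventions) the `turn:lanes` key. A hypothetical way carrying only a lane count looks like this, and gives a consumer no per-lane direction information:

```xml
<!-- Hypothetical OSM way illustrating the post's complaint: a lane
     count is tagged, but there is no turn:lanes key, so nothing says
     the leftmost lane is left-turn-only. -->
<way id="123456789">
  <tag k="highway" v="secondary"/>
  <tag k="lanes" v="3"/>
  <!-- What a fully attributed way would add:
       <tag k="turn:lanes" v="left|through|through"/> -->
</way>
```

Without `turn:lanes` (or the per-direction `lanes:forward`/`lanes:backward` split the poster also alludes to), a planner that trusts the map has to guess which lanes continue through the intersection.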
Cliffs: 30-40% better, not perfect
Looks like it makes driving much more relaxing!
This is why it's in beta with a limited user base!
On the other hand, he said that the intervention frequency was about the same as before.
Those are good questions. I guess I don't understand what the goal is. I assume they'll be able to do way better than they're doing now for that maneuver. I'm not sure I see the point of real-world testing when I'm sure this also fails in simulation. I think they should scrap the idea of doing unprotected lefts like that until the safety is greater than a human's. Monitoring the system while it's doing that maneuver looks very difficult. Obviously fearing for your life helps keep focus, but what happens when it gets 100x better? Will people still have the fear of death, or will they lose focus for a split second with disastrous consequences?
I know you don't like this and don't agree with their approach, but is it really that hard to remove the snark and focus on the functionality?
What's the path forward?
Does this mean the whole thing should be scrapped?
Can they solve this within their system?
Success rate was 0/4. This was supposed to be the vast improvement. Smh.
In Tesla's defense, Elon only said it would be a "foundational" improvement, which to me implies that the actual performance might not be improved yet.