Personally I'm a huge fan of Jeezy... I wish he'd get back to that type of rap; his last CD just didn't have it for me. Get back to TM 103... this one's for you. ..46 is working pretty damn well. I've put some serious miles on it and find it very consistent in where it makes mistakes, which is so nice for anyone who uses it regularly on the same routes. Sorry I'm off topic, couldn't help it.
That's impressive! I'm thinking you need to branch out with your music though. Maybe some Bee Gees next time :-D
Founder of Google's self-driving program implies that cameras are enough for full self-driving technology: "The new generation of computers is programming itself." Potentially relevant to the LIDAR and Tesla Autopilot debate (starting at the 4-minute mark): "but almost all the interesting work is centering on the camera image now, we are really shifting over from precision sensors..."
@dukedarkside Well, there's a wishful and selective reading if I ever saw one. Then again, given how badly Tesla's covered radar seems to work in snow, I think vision-only is my car's only hope.
Jest aside, let's recap what actually happened in the video. We're looking at a demo car using 360-degree lidar, multiple radars, and cameras, and the interpretation is that the implication is camera-only? Now, there's obviously no denying that radar and lidar have already been mastered (software-wise) and that the interesting stuff is happening in the visual space, of course it is, especially combined with deep learning. That, indeed, is said in the video. But IMO if we are to say, as @dukedarkside did, that someone implies cameras are enough, then the source material should at least support that. No?
@AnxietyRanger the guy Sebastian Thrun says it verbatim in the video. What more supportive material do you need?

"CA: So, explain it -- on the big part of this program on the left, you're seeing basically what the computer sees as trucks and cars and those dots overtaking it and so forth."

"ST: On the right side, you see the camera image, which is the main input here, and it's used to find lanes, other cars, traffic lights. The vehicle has a radar to do distance estimation. This is very commonly used in these kind of systems. On the left side you see a laser diagram, where you see obstacles like trees and so on depicted by the laser. But almost all the interesting work is centering on the camera image now. We're really shifting over from precision sensors like radars and lasers into very cheap, commoditized sensors. A camera costs less than eight dollars."
He's talking about the interesting work in the field, sure, yet the car he is showing runs lidars and radars in tandem with cameras; that was my point. So even though he talks of a shift in focus, that's hardly yet an implication that "cameras are enough for full self-driving technology". That's the wishful-thinking part, IMO. Cameras very likely play a much bigger role than anyone imagined a decade ago. However, whether or not they will be sufficient (let alone preferred) for "full self-driving" (in the production sense of the word, of course) remains very much to be seen.
It's certainly fair to characterize it as a work in progress; I didn't say they have a camera-only solution right now. But he makes it very clear that they are shifting more and more work from lidar (and/or radar) to cameras as the focal point of the neural network.
Sure, it is a known fact that vision is rising to be the primary sensor while radar and lidar are becoming secondary sensors. But that is quite different from vision-only. We shall see what happens...
... that we shall. So first off, let's get those NN rain-sensing wipers working, shall we?

Node 1 - "Hmm, rain drop?"
Node 2 - "Might be. Could be a weird tail light too. I'll give it a .4. Let's see what Node 3 thinks."
Node 3 - "No f***ing way that's a rain drop!"
Node 4 - "What ^he^ said. I'm getting .1 here. APE: Don't bother."
APE - "Leave me alone with that nonsense data. I'm overworked already."
Node 1 - "Hey, it happened again!"
Node 2 - "Node 1's right. This is a .7 for sure. Node 3, check your sh*t."
Node 3 - "Ya, I'll let it pass."
Node 4 - "Roger. APE: Send body control module an instruction to wipe."
APE - "B*tch I won't start no wipers with 'one' rain drop."
Node 4 - "???"
Node 1 - "Hmm, rain drop?"
Node 2 - "Could be... Not sure. .3."
Node 3 - "HAHAHAH"
Node 4 - "Right. APE: Nothing."
APE - "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
Etc.
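For what it's worth, the joke above is basically confidence voting: each "node" emits a score for "rain drop?", and the system only acts when enough of them agree. A toy sketch (all names, scores, and thresholds are made up for illustration; obviously no real Autopilot code looks like this):

```python
# Toy confidence-voting "rain detector": each node reports a confidence
# in [0, 1] that the frame contains a rain drop; we wipe only if at
# least `min_agree` nodes individually clear the `threshold`.
# Hypothetical names/values, purely to illustrate the joke.

def should_wipe(node_confidences, threshold=0.5, min_agree=2):
    """Return True if enough nodes are individually confident."""
    agreeing = [c for c in node_confidences if c >= threshold]
    return len(agreeing) >= min_agree

# First exchange: Node 2 gives it a .4, Node 4 a .1 -> APE: don't bother.
print(should_wipe([0.9, 0.4, 0.0, 0.1]))  # False

# Second exchange: "This is a .7 for sure" and friends -> wipe.
print(should_wipe([0.9, 0.7, 0.6, 0.8]))  # True
```

Requiring individual agreement (rather than averaging) is what lets one loud Node 3 veto a wipe, which is roughly how the skit plays out.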
@lunitiks I think it goes more like this:

Node 1 - Posting an "It's raining men" meme
Node 2 - Posting a YouTube video of a sunny day with indecent rap playing
Node 3 - Ignoring Nodes 1 and 2 and posting discoveries about rain's technical features
Node 4 - Posting concerns about rain very likely happening even though nobody else seems to see it
APE - "Moved to Snippiness"
Very appropriate song for some of your hair-raising videos.

Edit: This was for the Staying Alive video.