Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Autonomous Car Progress


As rideshare prices skyrocket, Uber and Lyft take a bigger piece of riders’ payments​

We booked 10 rides with Uber and 10 with Lyft. Drivers pocketed an average of 52 percent of our fares.
by David Mamaril Horowitz, July 15, 2021

A Lyft driver passing through the Mission. (Photo by David Mamaril Horowitz)
Update: The week after this story was published, Uber changed its policy to show drivers the full fare.
Uber contacted Mission Local after the publication of this article, which has been updated to include the company’s explanation for why it tells drivers that riders pay lower fares than they actually do. The update can be read in the latter part of the subsection “Money Unaccounted For.”
On a July weekday afternoon, I booked an Uber to my Visitacion Valley home, a 2.5-mile trip for $17.16. My driver — we’ll call him Ryan — showed me how much he made: $7.54.

Uber has long claimed that the amount it takes from fares on average, known as a “take rate,” is around 25 percent, yet the driver got just 44 percent of my payment. A cursory Google search can quickly pull up screenshots that show this is nothing new, and many media outlets have collected data shedding insight on the companies’ take rates.
What’s new is the growing appetite of the rideshare companies. Not satisfied with 25 percent, they now appear to need or want more — frequently half of the fare and, in some cases, nearly three times the publicized take rate, according to the bottom line on 20 recent rides.
Perhaps the most exhaustive attempt to track rideshare companies’ take rate was in 2019, when the media outlet Jalopnik examined 14,756 fares and concluded that Uber kept 35 percent of the revenue, while Lyft kept 38 percent. (Uber and Lyft disputed these analyses but did not provide data sets to Jalopnik upon request showing otherwise.)
However, as the supply of rideshare drivers has declined and prices have spiked, the split has become unseemly. The driver’s pay is determined by a base amount, trip duration, trip distance and potential surge pricing, along with incentives such as reaching a certain number of rides within a time frame — and is not determined by what customers pay.
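For anyone who wants to check the math on the ride described above: the take rate is just one minus the driver's share of the fare. A quick sketch (the dollar figures are from the article; the function name is mine):

```python
# Take-rate arithmetic for the $17.16 Visitacion Valley trip described
# in the article, where the driver received $7.54.

def take_rate(rider_fare: float, driver_pay: float) -> float:
    """Fraction of the rider's payment kept by the platform."""
    return 1 - driver_pay / rider_fare

rate = take_rate(17.16, 7.54)
print(f"Platform kept {rate:.0%} of the fare")  # ~56%, vs. the publicized ~25%
```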

Mission Local decided it was time to again track the companies’ take rates. We booked 20 rides in San Francisco with drivers who shared their pay for our trips. Drivers said demand is indeed back up and prices are higher, but none said they noticed more pay per trip.
 

As rideshare prices skyrocket, Uber and Lyft take a bigger piece of riders’ payments

Why are you copying and pasting an entire article about uber in a thread about autonomous driving? This article has nothing to do with autonomous driving. And if you are trying to make a point that relates to autonomous driving, please make your point, don't just copy the article verbatim.

Thank you.
 
Yes, it is impressive. But it is not just about creating a photorealistic environment. Anguelov also talks about the need for sim agents that match real-world behavior as closely as possible. If the sim agents don't behave the way agents do in the real world, the sim won't be very effective at training your AV to handle real-world scenarios. So Waymo is working to create sim agents with realistic behavior using imitation learning!

On the last part of the presentation, Anguelov teases some future work that Waymo is doing there:



If I understood him correctly, Waymo is using imitation learning to model their sim agents to behave as closely as possible to real world agents and then using the simulation to train and test their AV model. The AV models can then be tested in the real world, and data can be collected to validate the models. Rinse and repeat until you get the improvement you need.
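To make that loop a bit more concrete, here's a toy sketch of the imitation-learning step: fit a sim agent to logged "real world" behavior, then measure how closely it imitates the logs. Nothing here reflects Waymo's actual models; it's just the general idea with fake data and a least-squares behavior-cloning fit:

```python
# Toy behavior cloning: fit a sim agent to logged human driving, then
# check how closely its actions match the logs. NOT Waymo's system --
# a minimal illustration of "imitation learning for sim agents".
import numpy as np

rng = np.random.default_rng(0)

def fit_sim_agent(logged_states, logged_actions):
    """Behavior cloning via least squares: learn a state -> action map."""
    w, *_ = np.linalg.lstsq(logged_states, logged_actions, rcond=None)
    return lambda s: s @ w

# Fake driving logs: action = 2*speed - gap, plus sensor/human noise
states = rng.normal(size=(500, 2))               # [speed, gap to lead car]
actions = states @ np.array([2.0, -1.0]) + rng.normal(scale=0.1, size=500)

sim_agent = fit_sim_agent(states, actions)

# A good imitator has a small gap between sim behavior and logged behavior
residual = np.mean((sim_agent(states) - actions) ** 2)
print(f"sim-agent imitation error: {residual:.3f}")
```

The "rinse and repeat" part of the loop would then be: run the AV policy against these fitted agents in sim, deploy, collect new real-world logs, and refit.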

What do you think the benefits of this approach will be? It seems like it will help make their sim agents behave more realistically and will make the simulation more effective at training their AV models. The net result should be the AV will handle driving scenarios more naturally. Did I get that right?

Also, if Waymo can pull off realistic simulation with realistic sim agents at scale, it should be a true game changer IMO. In theory, it should drastically accelerate the pace towards "solving L5".

Yup, 100%. In addition to the photorealistic representation of reality, they need the agents (road users) in the sim to behave with the entire distribution of human driving (good driving, bad driving, reckless driving, drunk driving, etc.).
This is the only way to get their sim near-perfect enough to do Population Based Multi-Agent Reinforcement Learning at scale. And like I said months ago, if they can get this working at scale in two years, it's game over. This won't by itself take them to L5, but it will provide such orders-of-magnitude improvements in iteration that it would allow them to scale rapidly to L4 everywhere.
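For the curious, population-based training in its simplest toy form looks like this: keep a population of policies, score them, and periodically replace the worst performers with perturbed copies of the best. The two-parameter "policy" and score function below are hypothetical stand-ins, not anything from Waymo:

```python
# Minimal population-based training sketch: exploit (copy the best) and
# explore (perturb the copies). A toy stand-in for PBT-style multi-agent RL.
import numpy as np

rng = np.random.default_rng(1)
TARGET = np.array([1.0, -2.0])               # hypothetical "ideal" policy params

def score(params):
    return -np.sum((params - TARGET) ** 2)   # higher is better, max is 0

population = [rng.normal(size=2) for _ in range(8)]
for step in range(50):
    scores = [score(p) for p in population]
    best = population[int(np.argmax(scores))]
    # exploit + explore: worst half become noisy copies of the best
    for i in np.argsort(scores)[:4]:
        population[i] = best + rng.normal(scale=0.1, size=2)

best_score = max(score(p) for p in population)
print(f"best population score after training: {best_score:.3f}")
```

In a real multi-agent setup the score would come from rollouts against the other agents in the population, which is what makes the sim fidelity discussed above so important.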
 
Just food for thought: if Tesla spent three months focusing on SF alone (collecting tons of SF fleet data, building their entire road/lane geometry NNs from SF data, fine-tuning lane semantics and intersection types, etc.), would Tesla's FSD be better than Waymo and Cruise in SF?

You obviously know my answer.

Waymo, Cruise, you name it, are way behind Tesla. The difference is shocking. Many don't see it.
@powertoold is MidnightSun_55 in a nutshell.
(screenshot of a five-year-old post)

Literally five years old. But of course competitors are way behind Tesla, the difference is shocking.
 
Literally five years old. But of course competitors are way behind Tesla, the difference is shocking.



Reminds me of all those panel gap comments.
 


Reminds me of all those panel gap comments.
"This car is for us in many aspects (not in all!) a reference: user experience, updatability, driving features, performance of the top of the range models, charging network, range."

Reading is a lost art.
 


Reminds me of all those panel gap comments.
You know both statements are not mutually exclusive, right? You can be ahead in one area and behind in others.
 
Why are you copying and pasting an entire article about uber in a thread about autonomous driving? This article has nothing to do with autonomous driving. And if you are trying to make a point that relates to autonomous driving, please make your point, don't just copy the article verbatim.

Thank you.
Oh the irony, coming from the press release king.
 
Exactly. So weird to quote something about lights and say Tesla is not ahead in critical EV technologies.
I think it was more an analogy: claiming Tesla is ahead in ADS is like someone claiming Tesla is ahead because it uses matrix headlights, when the industry has already been using them. There is some truth to the analogy, because the ADS industry is a lot more collaborative than many people appreciate. Advances and discoveries made by Google et al. are what Tesla bases much of its NN stack and architecture on. One of the reasons I enjoyed watching the Tesla AI Day talk was to see their approach; they cited all the academic papers they used, and the majority were Google, some Facebook, and Carnegie Mellon University research papers. These are people who have done the work and shared it for everyone to use, no different from OpenAI sharing much of its research for others to use with GPT for natural language processing.
 
This won't by itself take them to L5, but it will provide such orders-of-magnitude improvements in iteration that it would allow them to scale rapidly to L4 everywhere.

This reminds me of an article I read a while back that suggested AV deployment will be like cell phone coverage: AVs will gradually expand to more and more areas until they eventually cover almost everywhere people need. So basically, we won't have L5, but we will eventually have L4 that covers almost everywhere people need to go.
 
I thought this was an interesting thread about the voxel data Tesla is starting to produce in the cars, though it is not currently enabled in the released builds. Sorry if it was already posted elsewhere.

0.33m (~1ft) resolution, 100m range, various confidence levels.

They are very committed to vision! It is a bold strategy, as we all know. Anyway, this shows some of the capability.
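Just to picture what an output with those specs could look like, here's a hypothetical sparse voxel-grid sketch. The 0.33 m / 100 m / confidence numbers are from the thread; the layout and function names are my guesses, not Tesla's actual format:

```python
# Hypothetical sparse voxel occupancy grid with the parameters mentioned
# in the thread: 0.33 m cells, 100 m range, per-voxel confidence.
import numpy as np

RESOLUTION = 0.33   # metres per voxel
RANGE = 100.0       # metres from the car

def world_to_voxel(x, y, z):
    """Map a world-frame point (car at origin) to integer voxel indices."""
    return tuple(int(np.floor(c / RESOLUTION)) for c in (x, y, z))

# Sparse storage: only occupied voxels are kept, each with confidence in [0, 1]
occupancy = {}

def mark_occupied(x, y, z, confidence):
    if max(abs(x), abs(y)) <= RANGE:          # ignore returns beyond range
        key = world_to_voxel(x, y, z)
        occupancy[key] = max(occupancy.get(key, 0.0), confidence)

mark_occupied(12.4, -3.1, 0.5, confidence=0.9)   # e.g. a pillar
mark_occupied(12.5, -3.1, 0.5, confidence=0.6)   # 10 cm away: same 0.33 m cell
print(len(occupancy), "occupied voxels")
```

Note how the two nearby returns land in the same 0.33 m cell, so the grid keeps one voxel at the higher confidence; that coarseness is the trade-off of the lower resolution.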

 
I thought this was an interesting thread about the voxel data Tesla is starting to produce in the cars, though it is not currently enabled in the released builds. Sorry if it was already posted elsewhere.

0.33m (~1ft) resolution, 100m range, various confidence levels.

They are very committed to vision! It is a bold strategy, as we all know. Anyway, this shows some of the capability.

The gist I get is that this is lower-res than the other, higher-res depth-sensing NN they also have, but it's directly tied to the bird's-eye-view generation they are already using, so it's perhaps easier to incorporate. That said, it doesn't really appear they are using either one in the visualizations, or for path finding. The latest FSD software still seems to make path decisions that head toward objects the depth sensing has already detected as in the way (previous visualizations showed the car could sense the Seattle monorail pillars, but path finding doesn't seem to use that in its decisions).
 
lower res than the other higher res depth sensing NN they have also
Can you point to a post that details this info on that higher res NN? So it sounds like you are saying that higher resolution one only uses forward-facing cameras or something?

I’m kind of surprised they could have a higher resolution NN without excessive noise actually. Seems like it would be starting to push it. Just a feeling though on my part - nothing to back it up.
 
Can you point to a post that details this info on that higher res NN? So it sounds like you are saying that higher resolution one only uses forward-facing cameras or something?

I’m kind of surprised they could have a higher resolution NN without excessive noise actually. Seems like it would be starting to push it. Just a feeling though on my part - nothing to back it up.
Edit: I notice forum software is embedding the tweet links again. Please click them to get the whole chain instead, which shows a visualization of the NN I talked about directly.
This tweet has both the latest one that Tesla showed during AI Day and the grid that green and rice_fry were able to extract from production software:
Here's an analysis of the monorail scene with the production depth sensing NN, you can see it can easily detect the monorail pillars, so Tesla's pathfinding is obviously not using it yet if it keeps trying to run into them:
 
So FSDbeta 10.2 has rolled out, and it's gone to people in the 100 Safety Score button group as well. I'll be waiting in the 99 Safety Score button group.

I can't believe Tesla beat Waymo to this point. I was completely certain Waymo would roll-out their service in Mountain View years ago, but now it's Tesla first out of the gate. If anything, I've been too conservative in my Tesla predictions and far too generous with Google/Waymo.

We've been watching FSDbeta videos for a while now, and many anonymous-nick FUDsters claimed FSDbeta was NEVER going to be deployed. Well, they've been proven wrong again, and NEVER turned out to be 10/11/2021. ;) FUDsters will whine about reality being unfair, as usual, and tweak the NEVER/FRAUD/SCAM narrative into some other nonsense.