What is driving like when you are not using the FSD Beta mode?
I am not, because only a select few have access to it.
In the near future, before government approval, the concern is that after hours and hours of trusting your Tesla to take you to work or the store, you will forget to pay attention. How do you see this eventuality playing out? Should it be a concern?
Yes. There are 2 different philosophies.
1) Waymo: Humans are not to be trusted and must be removed from operating the car.
2) Tesla: Humans have passed a driving test, so they know how to brake and steer, and there's no need to withhold beta technologies from them.
I know we should always pay attention while driving, but as I see it, it would be easy to forget to do so while having a conversation with passengers, listening to podcasts, making "to do" lists, etc. We do know that Tesla's safety features prevent many accidents, so is it still an issue? Yes, I would think so.
Yes. There have been Tesla accidents and fatalities. So, it is still an issue.
...Maybe the FSD Beta driving should be limited to only 50% or so of one's driving?...
I have no idea what the 50% plan mentioned above refers to.
FSD Beta is only in about 2,000 of the millions of Tesla cars on the road, so that's nowhere near 50%.
I think the better solution is to use anti-collision technology that's been proven to work since 2009 with Waymo: LIDAR. Its anti-collision capability has been further proven since 2019 with its current robotaxis, which operate with no driver or staff aboard within a geofenced area of about 50 square miles in Chandler, AZ.
Currently, Waymo's issues are not about collisions but about other things, such as intelligence. A robotaxi won't collide with anything, but it may encounter a new scenario, keep on calculating, and simply remain stationary in the meantime.
So, if Tesla adopted LIDAR-based anti-collision technology, the collision issue would be taken care of. As for the intelligence issue, a human driver could manually take the car out of its paused state while it's still working out how to handle a new scenario.