By that I mean most of us charge to a user-specified software limit of, say, 70%, 80%, or 90%. When you first set out on a drive after charging to that capacity, there's a warning that regenerative braking won't function at 100% until some charge has been used up to make room. Every time I have to use the friction brakes to slow the car, I cringe knowing that the first several stops are wasted energy and unnecessary wear on the pads and rotors.
Well, we all know there's enough headroom there to put a few more miles (or feet?) of regen back in before normal driving uses it up. There should be a code change behind the scenes that still stops charging at the 80% user-defined limit you've set but then, once you set out on your drive, allows a certain amount of regen overshoot. Say 2% or something? Even at 1% we're talking several miles' worth on most battery sizes, which would be more than enough for most users. Not many people will be putting a lot back in unless they live in the mountains and commute downhill every day, and that's going to be a small portion of owners. Even so, you could bump it to 2% or whatever. I don't have access to the data, but whoever writes the code for these things does and could easily put something like this into action, I would think.
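Just to make the idea concrete, here's a minimal sketch of the proposed logic. To be clear, this is purely illustrative: the function name, the 80% default, and the 2% overshoot figure are my own placeholders, not any real vehicle's firmware or API. The point is simply that plug charging and regen charging get different ceilings.

```python
PLUG = "plug"
REGEN = "regen"

def charging_allowed(soc_pct, source,
                     user_limit_pct=80.0,      # the owner's charge-limit setting
                     regen_overshoot_pct=2.0):  # proposed regen-only headroom
    """Return True if the pack should accept energy from `source`.

    Plug charging stops hard at the user-defined limit, exactly as today.
    Regenerative charging is allowed a small overshoot above that limit,
    so regen works at full strength right from the first stop.
    """
    if source == PLUG:
        return soc_pct < user_limit_pct
    if source == REGEN:
        return soc_pct < user_limit_pct + regen_overshoot_pct
    raise ValueError(f"unknown charging source: {source}")

# At 80.5% state of charge: the charger would have stopped,
# but regen is still accepted up to the 82% overshoot ceiling.
print(charging_allowed(80.5, PLUG))   # plug charging: no
print(charging_allowed(80.5, REGEN))  # regen: yes
```

In a real battery management system this would presumably be a current/power taper rather than a hard yes/no, but the separation of the two charging paths is the whole idea.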
This would 1) keep driving characteristics consistent and 2) recoup energy that would otherwise be lost. It doesn't seem like much, but over hundreds of thousands of cars driving millions of miles, this small amount of otherwise wasted energy could add up.
Can anyone think of a good reason why this wouldn't work? As long as the car can distinguish plug charging from regenerative charging (I assume it already does), I don't see how this would be a bad thing. Seems like a win-win to me.
Thoughts?