I have no background in electrical engineering, or any kind of engineering for that matter, but my EV enthusiasm has grown into a full-blown obsession. I'm trying to learn as much as I can about EVs, and especially batteries, online, as far as my non-engineer brain can take me. I've been watching seminar videos, reading articles, and researching lithium-ion batteries for some time now. Here are two questions:

First question: to charge lithium-ion cells we use CC-CV. Voltage rises while the current stays constant during the CC stage, then current tapers off while voltage stays constant during the CV stage. (Right?) And to push charge into a battery, we need a delta-V (not to be confused with SpaceX's endeavors, lol). If a depleted Model S pack sits at, say, ~320 V, we'd need more than that just to start a charge, and would have to go all the way up to ~403 V, right? We can see this on the charging interface with a Supercharger, and Superchargers have a lot of power behind them. But when charging at home from a 110 V or 220 V outlet, how does that work? Since the onboard charger is doing the work in that case, does it have a transformer in it that steps the voltage up and adjusts the current accordingly? (To check that I've at least got the CC-CV part straight, I put a toy sketch of the logic at the bottom of this post.)

Second question: we know that Supercharging all the time is bad for the batteries. But how fast is too fast? I currently charge my non-Tesla dumb EV at work, where I have 400 V 3-phase 32 A service (~22 kW). So when I can get a Model S (whenever it starts being sold here), I want to go with the dual chargers and charge at work all the time. If I'm not mistaken, charging the pack at 22 kW works out to about a 0.27C rate for the cells? (Assuming a nominal pack voltage of 350 V during a full charge: 22,000 W / 350 V = 62.8 A. Divided by 74 parallel cells = ~850 mA per cell. 850 mA / 3100 mAh = 0.27C. There's a script version of this arithmetic at the bottom too.) Does that mean faster degradation vs. a 7 kW or 11 kW charge?
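
P.S. To check that I actually understand the CC-CV logic, here's a little toy Python sketch of how I picture it. The cell model and all the numbers in it (internal resistance, taper cutoff, the fake voltage curve) are completely made up by me for illustration; this is just my mental model, not anything a real charger or BMS does:

```python
# Toy CC-CV charge of one 18650-size cell (illustrative numbers only).
# Cell modeled as a fake open-circuit voltage that rises with state of
# charge, behind a series resistance -- a crude first-order approximation.

CAPACITY_AH = 3.1    # ~3100 mAh cell (my assumption)
R_INTERNAL = 0.05    # ohms, made-up internal resistance
V_MAX = 4.20         # CV setpoint
I_CC = 1.55          # 0.5C constant current
I_CUTOFF = 0.155     # charge "done" when taper current falls below C/20
DT_H = 0.01          # time step in hours

def ocv(soc):
    """Fake open-circuit voltage curve: 3.0 V empty -> 4.2 V full."""
    return 3.0 + 1.2 * min(soc, 1.0)

soc, t = 0.0, 0.0
current = I_CC
while current > I_CUTOFF:
    terminal_v = ocv(soc) + current * R_INTERNAL
    if terminal_v >= V_MAX:
        # CV stage: hold the terminal at V_MAX, so current tapers
        # as the open-circuit voltage creeps up toward the setpoint.
        current = max((V_MAX - ocv(soc)) / R_INTERNAL, 0.0)
    soc += current * DT_H / CAPACITY_AH
    t += DT_H

print(f"charged to {soc * 100:.1f}% of capacity in {t:.2f} h")
```

Running it, the cell spends most of the time in CC and then tapers in CV to ~99% over a bit more than two hours, which matches the 0.5C intuition, so I think I've got the shape of it right.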
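
P.P.S. And here's my second question's arithmetic as a quick script, in case someone can spot where I've set it up wrong. The pack configuration numbers (74 parallel cells, 3100 mAh each, ~350 V nominal) are my own assumptions pulled from forum spec threads, not official figures:

```python
# Sanity check of my 22 kW C-rate arithmetic. Pack numbers are my
# assumptions from forum specs -- please correct me if they're off.

CHARGE_POWER_W = 22_000   # sqrt(3) x 400 V x 32 A is ~22.2 kW, rounded down
PACK_V_NOMINAL = 350.0    # rough nominal pack voltage during a charge
PARALLEL_CELLS = 74       # cells per parallel group (my assumption)
CELL_CAPACITY_AH = 3.1    # ~3100 mAh per 18650 cell (my assumption)

pack_current_a = CHARGE_POWER_W / PACK_V_NOMINAL   # ~62.9 A into the pack
cell_current_a = pack_current_a / PARALLEL_CELLS   # ~0.85 A per cell
c_rate = cell_current_a / CELL_CAPACITY_AH         # ~0.27C

print(f"pack current: {pack_current_a:.1f} A")
print(f"per-cell current: {cell_current_a * 1000:.0f} mA")
print(f"charge rate: {c_rate:.2f}C")
```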