I've mentioned
bloom filters a few times here. I should probably go into a bit more detail...
Wikipedia link above. I've never built one for real, but I've done a few analogous things. What follows is oversimplified and deliberately makes some bad implementation decisions; it's just meant to show the idea.
Let's say you have a (likely) unique identifier for your vehicle like 01:02:03:04:05 that you can pull during the CCS negotiation. Let's also say that EA expects to have around 100k vehicles enrolled, but doesn't have infinite memory on each device. Hopefully they can support 1 meg of DB?
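A quick sanity check on those assumed numbers (100k vehicles, 1-meg blob): even in the worst case where every bit lands somewhere distinct, the filter stays mostly empty.

```shell
# Rough sizing check, using the numbers assumed above: a 1-meg blob is
# 8,388,608 bits; 100k vehicles setting 3 bits apiece touch at most
# ~300k of them.
awk 'BEGIN {
  bits = 1024 * 1024 * 8          # filter size in bits
  set  = 100000 * 3               # worst case: every bit position distinct
  printf "fill: %.1f%%\n", 100 * set / bits
}'
```

Under 4% full, which (as discussed below) keeps the false-positive odds comfortably low.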
For each vehicle they support, take the vehicle identifier and hash it a few different times to get several different (small) hash values. You then set the corresponding bits in the overall data structure. A Q&D Unix shell implementation:
% for i in `seq 3`; do echo $(( $( ( echo -n 01:02:03:04:05 ; echo "salt${i}") | cksum | awk '{print $1}') % (1024 * 1024 * 8) )); done
7619044
1946733
4528362
Set the 7619044th bit in your 1-meg structure, then the 1946733rd, then the 4528362nd. Do that for each ID you support, and ship the resulting 1-meg blob out to each charger.
Then when you plug into the charger, the charger does the same calculation on the provided identifier and verifies that all the expected bits are set. If they're all set, then it's "likely" that your ID was in the original set of valid IDs, and the charger can go ahead and speculatively provide you power while it verifies internally that you actually are in the allowed set.
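Both halves can be sketched together in bash. This toy version uses a tiny 1024-bit filter held in a shell array instead of a packed 1-meg blob, and the same cksum+salt hashing as the one-liner above; the MAC-style IDs are made up for illustration.

```shell
#!/bin/bash
# Toy Bloom filter: 1024 bits kept as a bash array of 0/1 flags.
# (A real implementation would pack bits into a binary blob.)
BITS=1024
declare -a filter

positions() {   # positions <id> : print the 3 bit positions for this id
  local i
  for i in 1 2 3; do
    echo $(( $( (echo -n "$1"; echo "salt${i}") | cksum | awk '{print $1}') % BITS ))
  done
}

bloom_set() {   # enrollment side: set this id's bits
  local pos
  for pos in $(positions "$1"); do filter[$pos]=1; done
}

bloom_check() { # charger side: returns 0 = maybe enrolled, 1 = definitely not
  local pos
  for pos in $(positions "$1"); do
    # any clear bit means the id was definitely never enrolled
    [ "${filter[$pos]:-0}" = 1 ] || return 1
  done
  return 0      # all bits set: probably enrolled (could be a false positive)
}

bloom_set 01:02:03:04:05
bloom_check 01:02:03:04:05 && echo "maybe enrolled"
bloom_check de:ad:be:ef:00 || echo "definitely not enrolled"
```

Note the asymmetry: a failed check is definitive, while a passed check only says "probably" -- which is exactly why the charger still verifies against the backend before fully committing.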
This is probabilistic because of aliasing. Given "enough" IDs, there's a chance that some ID+iteration sets the 7619044th bit. And if a different one sets the 1946733rd bit and yet another sets the 4528362nd bit... your ID could be accepted incorrectly. The odds increase as the structure fills up, and the Wikipedia page suggests using ~10 bits per entry in the structure... so 100M devices would require a ~125MB structure to keep a 1% false-positive rate.
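The ~10-bits-per-entry figure falls out of the standard sizing formula (given on the Wikipedia page): for a target false-positive rate p, you need m/n = -ln(p) / (ln 2)^2 bits per entry, with k = (m/n) * ln 2 hash functions. Checking it for p = 1%:

```shell
# Bloom filter sizing for a 1% false-positive rate: bits per entry and
# the matching optimal number of hash functions.
awk 'BEGIN {
  p   = 0.01
  bpe = -log(p) / (log(2)^2)      # bits per entry, m/n
  k   = bpe * log(2)              # optimal number of hash functions
  printf "bits/entry: %.1f  hashes: %.1f\n", bpe, k
}'
```

About 9.6 bits per entry and ~7 hashes, so "~10 bits per entry" is the right ballpark -- and also shows the 3-hash example above is undersized for that target rate.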