OK, let's say we run a Tesla such that we use 80% of its 100 kWh battery (80 kWh), which at 300 Wh/mi (which is what I get) carries us 266.67 mi. Assume we arrive at a 150 kW charger with 10% remaining and want to replace the charge we used on the trip, so we need to charge from 10% to 90%. A charger with a linear taper down to 20% of rated power at 100% SoC will require 59.48 minutes to charge the Tesla from 10% to 90%.

Running that same 266.67 miles in a Rivian at 450 Wh/mi uses 120 kWh, which is 66.7% of its 180 kWh battery's capacity. Assuming it also arrives at the charger with 10%, it would need to be charged to 76.7% of capacity to replace the 120 kWh. A 150 kW charger with the same taper would require about 78 minutes to charge the Rivian. That's a factor of about 1.31 times longer.

This ratio does not, BTW, depend on the size of the charger, but it does depend on the taper curve. I used a linear one for simplicity; a linear taper of different slope will give a different answer, as will curves of different shape. For example, if I use an exponential taper (faster at low SoC, but falling to the same 20% of rated power at 100% SoC), the ratio increases slightly, to about 1.32.
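If you want to check the arithmetic yourself, here is a minimal sketch of the linear-taper model described above (the function name is mine, and I'm assuming the simplest reading of the taper: charger power falls linearly from full rate at 0% SoC to 20% of rate at 100% SoC):

```python
import math

def linear_taper_minutes(batt_kwh, peak_kw, frac_at_full, s0, s1):
    """Minutes to charge from SoC s0 to s1 when charger power tapers
    linearly with state of charge: P(s) = peak_kw * (1 - k*s),
    where k is chosen so that P(1.0) = frac_at_full * peak_kw."""
    k = 1.0 - frac_at_full
    # time = integral of batt_kwh * ds / P(s) from s0 to s1,
    # which integrates to the closed-form log expression below
    return 60.0 * (batt_kwh / (peak_kw * k)) * math.log((1 - k * s0) / (1 - k * s1))

# Tesla: 100 kWh pack, 150 kW charger, 10% -> 90%
tesla = linear_taper_minutes(100, 150, 0.20, 0.10, 0.90)
# Rivian: 180 kWh pack, 10% -> 10% + (120 kWh / 180 kWh)
rivian = linear_taper_minutes(180, 150, 0.20, 0.10, 0.10 + 120 / 180)
print(round(tesla, 1), round(rivian, 1), round(rivian / tesla, 2))  # 59.5 78.0 1.31
```

Swapping the peak power (say, 250 kW instead of 150 kW) scales both times by the same factor, which is why the ratio is independent of charger size.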
And, of course, when charging at home, where there is no taper, the ratio is simply the energy ratio: 120 kWh / 80 kWh = 1.5.
Thus taper does have an effect, and the size of that effect depends on the taper curve, but the basic concept remains clear, as common sense dictates: if your Wh/mi requirement is higher, you have to add more charge, and therefore charge longer.
Again I suggest experimentation with ABRP to help you grasp this.