Assuming a range-charged pack (~403Vdc), that's implying a drop of about 50Vdc at 1300A, or a power loss of 65kW. In the pack alone. That is unrealistic.
I would be surprised if the voltage drop is more than about 5V.
It should be easily possible to get >500kW at 1300A.
Not unrealistic. It's actually probably conservative. I suggest you study up on battery internal resistance. A 5V drop is more like super capacitor territory...
- - - Updated - - -
For fun I'm going to run the math again from scratch without looking at any of my previous posts and see if I come up with the same data.
The internal resistance of the NCR18650B, the closest cell we have data on that is comparable to the Tesla 18650 cells, is 55 milliohms. I believe in previous calculations I gave Tesla the benefit of a 25%+ improvement in this area (seems unlikely, but OK) and went with 40 milliohms internal resistance.
Let's go with a 1300A draw. We know the pack is configured as 96 groups in series with 74 cells in parallel per group. Current in a series circuit is constant, so each group sees the full 1300A, split evenly across its 74 parallel cells: 1300 / 74 = 17.57A per cell at a 1300A load. We'll assume the pack is 100% charged, so 4.2V per cell. Theoretically that'd be 4.2V * 17.57A = 73.794W per cell, and with 96 * 74 = 7,104 cells that's 524,232W, right? Yeah, if the battery were a superconductor and had zero internal resistance, sure, that'd be awesome. Unfortunately this is the real world. Let's factor in internal resistance using the very conservative 40 milliohms.
At 17.57A and 40 milliohms, Ohm's Law says we'll get a voltage drop of 0.7028V at the cell level. Multiplied across the 96 series groups, that comes to a 67.4688V drop at the pack level. For a fully charged pack under a 1300A load, that's an output voltage of 335.7312V @ 1300A = 436,450W. And this is ONLY accounting for voltage drop at the cell level, not in the pack bus bars, wiring, contactors, current shunt, external connection to the pack, etc etc etc.
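The whole calculation above fits in a few lines of Python if anyone wants to play with the inputs. The 96s74p layout is from the pack teardowns; the 40 milliohm internal resistance is my assumed (optimistic) figure, not a confirmed Tesla spec:

```python
# Pack power math under load; 40 mOhm/cell IR is an assumption, not a spec.
CELLS_SERIES = 96         # series groups in the pack
CELLS_PARALLEL = 74       # parallel cells per group
CELL_IR_OHMS = 0.040      # assumed per-cell internal resistance
CELL_V_FULL = 4.2         # fully charged cell voltage
PACK_AMPS = 1300          # total pack current draw

cell_amps = PACK_AMPS / CELLS_PARALLEL               # ~17.57 A per cell
cell_drop = cell_amps * CELL_IR_OHMS                 # ~0.70 V per cell
pack_drop = cell_drop * CELLS_SERIES                 # ~67.5 V at the pack level
pack_v_loaded = CELL_V_FULL * CELLS_SERIES - pack_drop  # ~335.7 V under load
power_out_kw = pack_v_loaded * PACK_AMPS / 1000      # ~436.5 kW delivered

print(f"{cell_amps:.2f} A/cell, {pack_drop:.2f} V drop, {power_out_kw:.1f} kW out")
```

(Carrying full precision instead of rounding per-cell first shifts the pack drop by a few hundredths of a volt; the bottom line is the same.)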
So, let's not just multiply A*V with batteries to get impossible numbers, folks.
Edit: lol, so I dug up my previous post that did the math on this and... yeah... pretty much the exact same thing I said previously. Except from that post:
For comparison, let's work the 1300A number from Elon. 1300A would be 17.57A per cell. Voltage drop of 0.7028V per cell @ 40 milliohms IR, so 335.73V. 335.73 * 1300A = ~436.5kW. Pretty close to reality considering this is optimistic and *only* accounts for internal resistance losses and not wiring, inverter, etc losses, which are bound to be appreciable.
I'll point out that the 436kW number I came up with is within 5% of what we actually see as a max on the P85D (~415kW / 556HP).
---
For more fun, let's figure out how much heat is actually generated due to internal resistance assuming a 1300A draw for 30 seconds (say, a top speed run?). We know 1kWh is about 3,412 BTU. So how many kWh are lost as heat due to internal resistance in 30 seconds? Let's do more math.
A 67.4688V drop at 1300A is about 88kW lost as heat. That comes to about 2,631,283 joules of energy lost to heat in 30 seconds. 1 kWh is 3,600,000 joules, so in 30 seconds 0.731 kWh will be wasted as heat at a 1300A draw. That works out to about 2,500 BTU.... spread across the entire 1,500 lb battery pack. Overall, with the cooling loop and such this is pretty negligible.
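Same deal in Python, using the pack-level drop from the earlier calculation (which itself rests on my assumed 40 milliohm cell IR):

```python
# Heat dissipated from IR losses alone over a 30 second, 1300A pull.
pack_drop_v = 67.4688     # pack-level IR drop from the earlier math
amps = 1300
seconds = 30

heat_w = pack_drop_v * amps           # ~87.7 kW turned into heat
joules = heat_w * seconds             # ~2.63 MJ over 30 seconds
kwh = joules / 3_600_000              # ~0.73 kWh
btu = kwh * 3412                      # ~2,500 BTU

print(f"{heat_w / 1000:.1f} kW as heat, {kwh:.3f} kWh, {btu:.0f} BTU")
```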
---
Edit: Rolling with this a little further, a 1300A draw for 30 seconds will drain at least 5% of the pack's energy (output plus heat loss), dropping the cell voltage down even more. If this rate could be maintained until the pack was dead (it can't, batteries would explode... lol) it would be drained in something like 9 minutes.
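For anyone who wants to check that last edit, here's the back-of-envelope version. The 85 kWh nominal capacity is an assumption (P85D pack), and a sustained constant-power draw is obviously unrealistic, as noted:

```python
# Drain-rate sketch; 85 kWh nominal pack capacity is an assumption,
# and real cell voltage (hence power) sags as the pack empties.
pack_kwh = 85.0           # assumed nominal P85D pack capacity
output_kw = 436.5         # power delivered at the terminals (from above)
heat_kw = 87.7            # power lost to internal resistance (from above)
total_draw_kw = output_kw + heat_kw       # ~524 kW pulled from the cells

kwh_per_30s = total_draw_kw * 30 / 3600   # ~4.4 kWh, a bit over 5% of the pack
minutes_to_empty = pack_kwh / total_draw_kw * 60  # roughly 9-10 minutes

print(f"{kwh_per_30s:.2f} kWh in 30 s, empty in ~{minutes_to_empty:.1f} min")
```

(Strictly constant power works out closer to 10 minutes than 9, but since the voltage sag accelerates as the pack drains, "something like 9 minutes" is a fair hand-wave.)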