I stand corrected.
Having worked in server farms for over two decades, I was going to make the point that minimum consumption being within 60% of peak isn't very "peaky," because a server's energy consumption does NOT scale linearly with its load:
https://www.researchgate.net/figure...hape-in-2007-Summer-and-Winter_fig2_255215593. "Cranking" that AWS EC2 instance isn't going to do squat for your friend's energy consumption.
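To put rough numbers on that (the wattages here are my own illustrative assumptions, roughly in line with the 2007-era hardware in the linked figure, not measurements from any specific machine):

```python
# Sketch of a NON-energy-proportional server: idle draw is a large
# fraction of peak draw, so cutting load doesn't cut power much.
# p_idle and p_peak are assumed example values, not real specs.

def power_draw(load, p_idle=210.0, p_peak=350.0):
    """Watts drawn at a given utilization (0.0 to 1.0),
    assuming a linear ramp from idle to peak."""
    return p_idle + (p_peak - p_idle) * load

idle = power_draw(0.0)   # 210 W while doing nothing
busy = power_draw(1.0)   # 350 W flat out
print(f"idle/peak ratio: {idle / busy:.0%}")  # prints "idle/peak ratio: 60%"
```

So going from 100% load down to 0% only saves you 40% of the power bill, which is why shedding compute load is such a weak lever on consumption.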
But then I realized that the periods of high load are during the daytime when AC loads are highest, which, as you pointed out, is also when renewable energy is most abundant (and thus cheapest), and batteries can buffer the rest. This kind of usage cycle is terrible for a grid, but not as bad for a datacenter co-located with a solar+wind+geothermal+hydro farm.
As for the Bitcoin farm in TX, that only works because they're part of a mining pool, where other miners come online as the TX ones shut down. Miners are needed to process transactions at ALL hours of the day, NOT only when electricity is cheapest for them.