Warning: This is kind of long.
Here's something that I've never fully understood; hopefully one of you can explain it to me. Last year we had some darn hot/humid weather in June. After a few weeks I noticed that my resting HR had dropped from the low 50s to the high 40s. My training hadn't changed, so I didn't attribute the drop to that. It was explained to me that the body's response to heat is to increase blood plasma volume (the liquid part of the blood): with more blood returning to the heart, more can be pumped out per beat, so to move an equivalent volume of blood per unit time the heart can beat slower. That seems logical, and it also explains why your HR goes up when you get dehydrated (less blood returning, less pumped per beat, so to move the same volume the heart must beat faster).
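If I'm reading that explanation right, the back-of-the-envelope arithmetic goes something like this (Python, with numbers I made up just to illustrate cardiac output = HR x stroke volume; they're not measurements of mine):

    # Back-of-envelope for the explanation I was given (all numbers assumed,
    # just to illustrate cardiac output = HR x stroke volume).
    cardiac_output = 5.0     # L/min the body wants at rest (assumed)
    sv_before = 0.095        # L per beat before the heat wave (assumed)
    sv_after = 0.105         # L per beat after plasma volume expands (assumed)

    hr_before = cardiac_output / sv_before   # ~53 bpm
    hr_after = cardiac_output / sv_after     # ~48 bpm
    print(f"resting HR: {hr_before:.0f} -> {hr_after:.0f} bpm")

With made-up numbers like those, you get roughly the low-50s-to-high-40s drop I actually saw, so that part of the story hangs together for me.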
What I don't understand is how the oxygen-carrying capacity of the blood is affected. In other words, if the amount of plasma goes up and your red blood cell count doesn't, then the same volume of blood actually contains fewer RBCs. It was always my assumption that, because increased workload demands more oxygen (among other things), your HR goes up to deliver it. But if your heart is now delivering the same amount of blood with fewer RBCs in it, your muscles aren't getting the oxygen they need. That would seem to impair performance, not improve it (yes, there's more fluid for surface cooling, which is good, but it still seems you get fewer RBCs where they count). So now I'm confused, because isn't the whole point of using EPO to increase RBC count, and thus the ability to deliver oxygen? What am I missing here? Was the initial explanation I received incorrect? In response to a warmer environment, do both plasma and RBC levels increase, keeping the concentration the same? Or is there some interesting non-linear relationship between work rate, HR, and RBC concentration?
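To put my confusion in rough numbers (Python again, everything assumed except the textbook figure of about 1.34 mL of O2 carried per gram of hemoglobin):

    # The dilution worry, in rough numbers (all values assumed for illustration).
    o2_per_g_hb = 1.34      # mL O2 per gram of hemoglobin (textbook figure)
    hb_total = 750.0        # total hemoglobin in grams, unchanged by plasma gain (assumed)
    cardiac_output = 5.0    # L/min, the same before and after per the explanation above

    for label, blood_volume in [("before", 5.0), ("after", 5.5)]:  # L; ~10% plasma expansion (assumed)
        hb_conc = hb_total / blood_volume                       # g Hb per liter of blood
        o2_delivery = cardiac_output * hb_conc * o2_per_g_hb    # mL O2/min (ignoring saturation, etc.)
        print(f"{label}: {o2_delivery:.0f} mL O2/min delivered")

Same cardiac output, but roughly 10% less oxygen per minute reaching the muscles. That's exactly the part that has me puzzled.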
BTW, the heat killed me last year. In spite of my "good numbers" (low resting HR and such), I was very flat and far off my PRs.