Back in the April 26, 2009 LRC Week That Was, LRC (John Kellogg) said that for every 10 feet of elevation drop, one's time comes down 1.8 seconds.
Were they talking about 1.8 seconds/mile?
Is this the same number for elevation gain as well?
And how does this change when we apply it to slower paces, say moving up from a 10K to a marathon?
Feel free to chime in, Wejo...
Here's the link:
http://www.letsrun.com/2009/weekthatwas0428.php
Is there an elevation per mile conversion formula?
-
I hope 10 ft. is a typo. Because if I can run 4:40 at 1,000 ft, then I can run 1.8 sec x 100 = 180 sec faster; therefore, I can run a 1:40 mile at sea level...sweet
-
Sorry if that was confusing. I meant elevation change during a race, such as a 10-foot uphill slowing your mile pace by a few seconds compared to a totally flat course.
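For what it's worth, here's the rule as I read it, written out as a quick sketch. The 1.8 figure and the per-mile, symmetric treatment of gain and loss are just taken from the thread, not from any verified source:

```python
# Rough sketch of the LRC rule as I read it: 1.8 s per mile for every
# 10 ft of elevation change over that mile. Both the number and the
# symmetric handling of up vs. down are assumptions from the thread.

SECONDS_PER_10FT = 1.8  # claimed adjustment per 10 ft, per mile

def adjusted_mile_split(flat_split_s: float, elevation_change_ft: float) -> float:
    """Estimate a mile split on a graded mile.

    flat_split_s: expected split on flat ground, in seconds.
    elevation_change_ft: net change over the mile; negative = downhill
                         (faster), positive = uphill (slower).
    """
    return flat_split_s + SECONDS_PER_10FT * (elevation_change_ft / 10.0)

# A 6:00 flat mile with 50 ft of net drop:
# 360 - 1.8 * 5 = 351 s, i.e. about 5:51.
print(adjusted_mile_split(360, -50))  # 351.0
```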
-
If that number is correct, and it sounds pretty close, the adjustment would have to be bigger for going uphill. Any time you run a hilly race, your overall time is slower than on a flat course, so you have to lose more on the uphills than you gain on the downhills.
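Something like this, maybe. The downhill number is the thread's 1.8 s per 10 ft; the uphill number here is a pure guess, just to illustrate the asymmetry:

```python
# Sketch of the asymmetry: make the uphill penalty larger than the
# downhill benefit, so a mile that climbs and drops the same amount
# still nets out slower than a flat mile.

DOWNHILL_GAIN_PER_10FT = 1.8  # s/mile saved per 10 ft of drop (from the thread)
UPHILL_COST_PER_10FT = 3.0    # s/mile lost per 10 ft of climb (hypothetical)

def hilly_mile_split(flat_split_s: float, climb_ft: float, drop_ft: float) -> float:
    """Mile split with separate uphill and downhill adjustments."""
    return (flat_split_s
            + UPHILL_COST_PER_10FT * climb_ft / 10.0
            - DOWNHILL_GAIN_PER_10FT * drop_ft / 10.0)

# A 6:00-pace mile that climbs 100 ft and drops 100 ft:
# 360 + 30 - 18 = 372 s, consistent with hilly races being slower overall.
print(hilly_mile_split(360, climb_ft=100, drop_ft=100))  # 372.0
```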
-
I've noticed that on trail runs, total time at a given effort level increases by the equivalent of adding one mile for every 1000' of elevation gain AND 1000' of loss; e.g., a ten-miler with 1000 feet of hill up/down takes about the same time as an 11-miler on the flat. Checking times for mountain races seems to confirm this.
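That rule of thumb is easy to write down as a formula, assuming gain and loss are roughly equal (as on a loop or out-and-back):

```python
# The rule of thumb above as a formula: every 1000 ft of climb (with
# its matching 1000 ft of descent) adds the effort of one flat mile.

def equivalent_flat_miles(trail_miles: float, elevation_gain_ft: float) -> float:
    """Convert a hilly run to an effort-equivalent flat distance."""
    return trail_miles + elevation_gain_ft / 1000.0

# The example above: a 10-miler with 1000 ft of up/down ~ an 11-miler on the flat.
print(equivalent_flat_miles(10, 1000))  # 11.0
```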
Alternatively:
http://www.runworks.com/calculator.html
-
Based on a recent sample set of one -- a loop trail marathon with 5000' of elevation change, not run at altitude, on a good-temperature day -- I would say that the mile-per-1000' rule predicts too fast a pace, but that the runworks calculator is probably reasonably accurate. Thanks for the link.
-
Anyone who runs on courses without 1000s of feet of elevation change? I'd be surprised if no one has studied the optimal pace to run while headed uphill and downhill, respectively, during races.