There are really two questions here:
1. Does heart rate track oxygen consumption accurately enough that a 4% increase in oxygen consumption will produce an observable increase in heart rate?
2. Is the device I used to measure heart rate (Polar OH1 armband) accurate, or do I need to use a "research grade" device to detect a 4% difference?
For the first question, HR does track linearly with oxygen consumption (see: Prediction of energy expenditure from heart rate monitoring during submaximal exercise. J Sports Sci. 2005 Mar; https://www.braydenwm.com/cal_vs_hr_ref_paper.pdf), although factors like ambient temperature and emotional state can also change HR. In other words, if other factors are held constant, a 4% increase in oxygen consumption should produce a measurable increase in heart rate (within the submaximal range, of course).
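To make that concrete, here is a toy sketch of the linear HR-VO2 relationship. The slope and intercept are hypothetical placeholders chosen for illustration, not values taken from the cited paper:

```python
# Toy linear model: HR as a function of VO2 during submaximal exercise.
# The intercept and slope below are HYPOTHETICAL placeholders, not
# coefficients from the J Sports Sci paper.
def predicted_hr(vo2_ml_kg_min, intercept=60.0, slope=2.5):
    """Return predicted heart rate (bpm) for a given VO2 (ml/kg/min)."""
    return intercept + slope * vo2_ml_kg_min

baseline_vo2 = 40.0  # hypothetical running VO2, ml/kg/min
hr_base = predicted_hr(baseline_vo2)           # 160.0 bpm
hr_plus4 = predicted_hr(baseline_vo2 * 1.04)   # 4% more oxygen consumption

print(hr_base, hr_plus4)  # 160.0 164.0 -> a 4 bpm difference
```

Under any linear model like this, a 4% bump in VO2 shows up as a proportional, fixed-size bump in HR, which is what makes the whole approach plausible.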
For the second question, the Polar OH1 armband HR monitor has been shown to be as accurate as EKG monitoring, with less than 0.5 bpm variation (Validation of Polar OH1 optical heart rate sensor for moderate and high intensity physical activities. PLoS One. 2019 May 23; https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6532910/pdf/pone.0217288.pdf).
So a good way to test the first question would be to increase the treadmill speed by 0.1 MPH (a 1.25% increase at 8 MPH) and see whether I can measure a difference in heart rate. If I can, that would be fairly strong evidence that the shoe test was accurate and that there was no energy efficiency difference for me (i.e., I'm a non-responder), no?
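The arithmetic behind that test, plus a comparison against the OH1's measurement error, can be sketched as follows. The HR-per-percent figure is a hypothetical assumption for illustration, not a measured value:

```python
# Treadmill-speed arithmetic: what % increase is 0.1 MPH at 8 MPH,
# and would the resulting HR change exceed the Polar OH1's ~0.5 bpm error?
base_speed = 8.0   # MPH
step = 0.1         # MPH increment
pct_increase = step / base_speed * 100
print(pct_increase)  # 1.25 (%)

# HYPOTHETICAL assumption: each 1% rise in oxygen consumption
# adds roughly 1 bpm at this intensity.
hr_per_percent = 1.0
expected_delta = pct_increase * hr_per_percent  # ~1.25 bpm
sensor_error = 0.5  # bpm, from the PLoS One validation study

print(expected_delta > sensor_error)  # True -> should be detectable
```

If even a 1.25% speed bump produces an HR change comfortably above the sensor's error band, a 4% efficiency difference should be well within reach of this setup.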