Mundus Vult wrote:
Hey "real scientist", you know that actual, real scientists have tested this in the exact same conditions as the AFC Championship and found the PSI to drop an average of 1.8 PSI? The temperature accounted for 1.1 PSI and the wet conditions another .7 PSI.
This confirms the statements released by MIT and Boston College.
http://www.headsmartlabs.com/
HeadSmart Labs found that on average the footballs dropped 1.1 PSI from the 25 degree temperature change alone. The Lab also found that when the leather was wet, the ball dropped an additional 0.7 PSI. In combination, it was found that on average the footballs lost 1.8 PSI, with a max of 1.95 PSI, from exposure to game day elements.
Mundus, you weaken your argument with the internet tough guy schtick. Betting salaries? Calling people a$$clowns? Even though you have valid points to make, no one wants to deal with this.
As someone pointed out in another thread, my calculations were wrong: PV=nRT requires absolute pressure, and I only used gauge pressure. The corrected calculation says the balls would have needed to start at about 90 F to lose 2 psi by the time they cooled to 50 F, or to start at 70 F and end near 30 F. My mistake.
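Here's a quick sketch of that corrected calculation so anyone can check the numbers. The specific inputs are my assumptions: balls start at the NFL minimum of 12.5 psi gauge, atmospheric pressure is 14.7 psi, and the volume doesn't change.

```python
# Rough check of the corrected gas-law numbers (assumptions: 12.5 psi gauge start,
# 14.7 psi atmosphere, constant volume).
ATM = 14.7          # psi, assumed atmospheric pressure
P1_GAUGE = 12.5     # psi, assumed starting gauge pressure (NFL minimum)
P2_GAUGE = 10.5     # psi, gauge pressure after a 2 psi loss

def f_to_kelvin(f):
    return (f - 32.0) * 5.0 / 9.0 + 273.15

def kelvin_to_f(k):
    return (k - 273.15) * 9.0 / 5.0 + 32.0

# At constant volume, P1/T1 = P2/T2 -- with ABSOLUTE pressures, not gauge.
p1_abs = P1_GAUGE + ATM   # 27.2 psia
p2_abs = P2_GAUGE + ATM   # 25.2 psia

# Case 1: balls end at 50 F. What starting temperature is needed to lose 2 psi?
t2 = f_to_kelvin(50.0)
t1 = t2 * p1_abs / p2_abs
print(f"Start temp needed to lose 2 psi by 50 F: {kelvin_to_f(t1):.0f} F")   # ~90 F

# Case 2: balls start at 70 F. How cold must they get to lose 2 psi?
t1 = f_to_kelvin(70.0)
t2 = t1 * p2_abs / p1_abs
print(f"End temp needed to lose 2 psi from 70 F: {kelvin_to_f(t2):.0f} F")   # ~31 F
```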
Note that this still requires the balls either to have started at an unusually high temperature or to have cooled to an unusually low one, assuming no change in volume.
The link you provided gives interesting results, but the deviation from what the ideal gas law predicts needs to be explained. For example, if part of the loss comes from a change in volume, that change should be measurable.
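To make that concrete, here's a rough sketch of what the gas law alone predicts for the 25 degree drop HeadSmart describes, and how large a volume increase the extra 0.7 psi attributed to wetting would imply. The starting pressure and temperature here are my assumptions, not theirs.

```python
# Ideal-gas prediction for a 25 F temperature drop, plus the volume change implied
# by an extra 0.7 psi loss (assumptions: 12.5 psi gauge start, 75 F start, 14.7 psi atmosphere).
ATM = 14.7
P1_GAUGE = 12.5
T1_F, T2_F = 75.0, 50.0   # assumed 25 F swing

def f_to_kelvin(f):
    return (f - 32.0) * 5.0 / 9.0 + 273.15

p1_abs = P1_GAUGE + ATM
t1, t2 = f_to_kelvin(T1_F), f_to_kelvin(T2_F)

# Constant volume: pressure drop from the temperature change alone.
p2_abs = p1_abs * t2 / t1
print(f"Predicted drop from temperature alone: {p1_abs - p2_abs:.2f} psi")   # ~1.3 psi

# Volume increase needed to shed a further 0.7 psi at constant temperature (Boyle's law).
p3_abs = p2_abs - 0.7
print(f"Volume increase implied by the wet-leather claim: {(p2_abs / p3_abs - 1) * 100:.1f}%")  # ~2.8%
```

A volume change of a few percent is large enough to measure directly, which is exactly the kind of check that's missing.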
HeadSmart also doesn't report the internal air temperature of the balls for the start and end PSI measurements, and that has to be measured for the data to be valid.
One other issue with these demonstrations is that they use brand new balls. The balls the Pats used were prepped: soaked, scrubbed, etc. If a new football expands when soaked, why assume a pre-soaked ball would do the same thing?