Floyd released a large batch of documents this morning on his website.
There are 370 pages' worth.
He also released a 25-page PowerPoint slide show.
You can find the slide show, with some commentary above it, here:
http://trustbut.blogspot.com/2006/10/slide-show.html
Summary/Highlights (Comments are from the link above)
Slide 2
Says there are four main ways the testing produced an erroneous result: sample error, contamination, unreliable tests, and failure to meet the positive criteria.
comment: "Doesn't make sense" won't come into the factual resolution, as the strict liability standard for doping doesn't care whether it makes sense, only that the wrong things were present in his body.
Slide 3
Explains some sample-labelling problems that are being offered to show general lapses in the lab's processes. They aren't claiming the samples aren't his, but that errors were made in the simplest parts of the paperwork.
comment: This is a supporting argument, but it doesn't relate to the critical facts. It asks the viewer to infer that because some small things were done incorrectly, there is a likelihood that big things were done wrong as well.
This whole line of reasoning doesn't affect the factual determination, but it does affect the general credibility of the lab. That Landis is not using this as a critical point indicates he does not want to be seen as quibbling over minor technicalities.
Slide 5
shows that on the A sample result page, the correction procedure wasn't followed for the sample number: it was corrected with whiteout and an overwrite. Notes this isn't being offered to say the results were for the wrong sample, just that the lab doesn't do things correctly.
Slide 6
repeats the point, showing either an error or an ambiguity in a transportation record. Again, this is not offered to say it didn't end up being his sample, but that record keeping is lax.
Slide 7
shows another sample number error, on a result summary sheet, previously known to TBV as one of the Ferret pages. The slide also notes that Landis should not have received any data about other samples. This is a lab data handling failure.
comment: While the lab was being thorough in its documentation, it should probably have redacted all data about other samples.
This should have no bearing on the factual decision about the case.
Slide 9
shows the WADA criteria for determining that a sample has been contaminated, and should not be used for measurements.
Slide 10
argues the Landis A sample exceeded that standard. It should have been seen as having been contaminated, and testing stopped, period.
comment: It is not clear where the justification for the 5% rule in the WADA protocol comes from.
This is a solid argument that the USADA side would need to refute. The ADA side may try to wave its hands and say it doesn't matter in this case. It would help Landis to have some science showing how contamination affects the results of the tests.
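The kind of cutoff check slides 9 and 10 describe can be sketched as a simple pass/fail test. Note the metric used here (a free-to-total ratio) and the exact form of the 5% rule are assumptions for illustration; as noted above, the slides don't make the justification or the formula behind the WADA rule clear.

```python
# Minimal sketch of a 5%-style contamination screen, as described in
# slides 9-10. The free/total ratio is an assumption for illustration;
# the actual WADA metric may be defined differently.

def sample_unusable(free_t: float, total_t: float, cutoff: float = 0.05) -> bool:
    """True if the degradation ratio exceeds the cutoff, meaning the
    sample should be treated as contaminated and testing stopped."""
    return free_t / total_t > cutoff

# A sample just over the cutoff would be rejected before measurement:
print(sample_unusable(0.6, 10.0))  # 6% ratio -> True
```

The argument in slides 10-11 is that both the A and B samples fell on the wrong side of a check like this one, so testing should never have proceeded.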
Slide 11
shows the same values on the B sample, .44 and 5.7, indicating the same level of contamination or degradation as present in the A sample.
comment: No hiding for the ADA side, same problem on both samples.
Slide 13
shows a large variation in the same test on the same sample, further emphasizing the lack of repeatability.
comment: These may be comparing screening tests with more accurate confirmation tests. The screening tests are known to be inaccurate, which is why the confirming tests are done. This kind of inconsistency may not seem like a problem to technical reviewers.
Slide 16
Shows more inconsistently reported values for the T and E measurements.
Slide 19
Presents an interpretation of the CIR criteria that it wants us to believe.
comment: Asking for 3.8 because of the 0.8 tolerance is questionable, because the 3.0 was set with knowledge there were tolerances. However, we don't know what the tolerance was for the 3.0 standard.
Slide 20
Claims that all four metabolites tested must have CIR > threshold to be found positive, and Landis only had one of the four.
comment: This is where things get controversial. The protocol is ambiguously worded about how many metabolites must be over the threshold to result in a positive test. Landis argues that the ambiguity must be read as "all", while the lab believes "any" is sufficient. There isn't documented precedent, and it is not clear that all labs use the same criteria.
Further, the slide pushes the threshold value to the absolute extreme; the value in the protocol is 3%, and does not include tolerances. The slide adds a maximum error tolerance of 0.8% to get the 3.8%, but that might not be an acceptable interpretation. Certainly if a reported value were way over 3.8% it would be a clear violation. In the middle ground either interpretation could probably be accepted, depending on your prejudice.
Slide 21
shows result report with the four observed values, and says that one of four does not make a positive. The values in the underlying table are
comment: This is the key argument of the case. At face value, two of the four metabolites are positive, but not all four. If we apply the tolerances as Baker claims (a 3.8 cutoff), we get the slide set's conclusion of one of four; if we take them the other way (adding the 0.8 against him), we get three of four. The summary of the filing refers to "the best" metabolite, which is the 5bA-5bP. With the nominal reading, it is negative; with a reading against him, it becomes positive. If we accept the "all must be over threshold" argument, the positive finding is in trouble under any interpretation of the tolerances.
The summary of the filing also indicates that the one clear positive, presumably the 5aA-5bP, has some obvious error, calling it and possibly the other readings into question.
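The two disputed axes in slides 20-21 (the 3.0-vs-3.8 cutoff, and the "all"-vs-"any" counting rule) can be made concrete with a small sketch. The two metabolite names come from the slides; everything else, including all the delta values and the other two names, is a hypothetical placeholder, since the filing's actual table is not reproduced in this post.

```python
# Sketch of the disputed CIR positivity rules from slides 20-21.
# All delta values are hypothetical placeholders; only the 3.0
# threshold and 0.8 tolerance figures come from the filing.

THRESHOLD = 3.0  # value stated in the protocol
TOLERANCE = 0.8  # maximum measurement error cited by the defense

def positive_count(readings, cutoff):
    """Number of metabolites whose delta exceeds the cutoff."""
    return sum(1 for delta in readings.values() if delta > cutoff)

readings = {                 # hypothetical values for illustration
    "5aA-5bP": 6.0,          # the one clear positive under any reading
    "5bA-5bP": 3.5,          # falls between 3.0 and 3.8: disputed
    "metabolite-3": 2.5,     # placeholder name and value
    "metabolite-4": 2.0,     # placeholder name and value
}

defense = positive_count(readings, THRESHOLD + TOLERANCE)  # 3.8 cutoff
nominal = positive_count(readings, THRESHOLD)              # 3.0 cutoff
against = positive_count(readings, THRESHOLD - TOLERANCE)  # 2.2, tolerance read against him

print(defense, nominal, against)  # -> 1 2 3
```

The point of the sketch: the same table yields one, two, or three positives out of four depending only on how the tolerance is applied, and none of those counts settles the case until the "all four" versus "any one" question is resolved.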
Slide 22
observes that the reported absolute level of Testosterone was not elevated. Values considered "high" are five times those observed.
comment: This validates Landis's early statements that his T was not high, but that his E was low. It also shows that every media report saying he had "elevated testosterone" has been, and remains, incorrect.
Slide 23
says that the lab tests were not correctly blinded, because the cortisone exemption was clearly in the sample materials.
comment: Depending on how/where this information was visible inside the lab, it may or may not identify whether operators knew the identity of a particular vial.
Slide 24
Notes the review board may have had a pre-ordained conclusion, because its finding appeared to have been drafted before the teleconference that was to review the material.
comment: This doesn't affect anything, but it should be embarrassing to the ADRB.