as a former rater I can tell you most of the raters just don't do any homework.
they just take where teams finished last year and rank them accordingly this year. they don't even look at who graduated, let alone new runners, even though an extensive report on every team in the NAIA goes out before the season begins.
in our conference the commissioner's office does a pre-season poll of the top runners - and it just lists the results from the previous year, minus the seniors.
that takes, what, about two minutes' worth of work? they don't even attempt to include any newcomers at all - ever.
so pre-season polls are about worthless.
and in season you still get some raters who simply take the consensus ranking from the previous week and copy and paste it for the next rating cycle.
the naia has a good idea in letting each conference have a rater, but too many just don't spend any time on the process, and some are so biased toward their own conference that they actually hurt it. at one point our conference commissioner wanted us to rank one of our conference teams higher because they finished 2nd to Oklahoma City (#1 in the country at the time) - without realizing there's a difference between a close 2nd and a distant 2nd. this team was a distant 2nd, as in a full minute behind. we had them ranked 15th or so - they ended the season 15th or so.
rankings really aren't that difficult if you have simple head-to-head comparisons and updated info on whether teams were at full strength or not (some coaches help out a lot by giving this info along with comments on whether the course was fast or slow.....others give nothing).
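to show how little work the head-to-head part actually takes, here's a minimal sketch in Python. the team names and meet results are made up for illustration, and a real rater would still fold in the full-strength and course-speed info from coaches - this just counts pairwise head-to-head wins across meets and sorts by win rate:

```python
from collections import defaultdict

def head_to_head_rank(meets):
    """meets: list of finish orders, each a list of team names best-to-worst.
    counts every pairwise head-to-head result and ranks teams by win rate."""
    wins = defaultdict(int)      # head-to-head wins per team
    meetings = defaultdict(int)  # head-to-head matchups per team
    for order in meets:
        for i, winner in enumerate(order):
            for loser in order[i + 1:]:
                wins[winner] += 1
                meetings[winner] += 1
                meetings[loser] += 1
    # win rate handles uneven schedules better than raw win counts
    return sorted(meetings, key=lambda t: wins[t] / meetings[t], reverse=True)

# hypothetical season: three meets, finish order best-to-worst
meets = [
    ["Oklahoma City", "Team B", "Team C"],
    ["Oklahoma City", "Team C", "Team D"],
    ["Team B", "Team D"],
]
print(head_to_head_rank(meets))
# → ['Oklahoma City', 'Team B', 'Team C', 'Team D']
```

a rater could keep something like this in a spreadsheet just as easily - the point is that the raw comparisons are mechanical, and the judgment only comes in when adjusting for teams that weren't at full strength.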
my solution is to drop raters without their knowing it, by not tabulating their results into the polls if all they're doing is the copy-paste method...or something to that effect (maybe letting their peers drop them). those who rate well know who is and who isn't taking the time to do it right.