Everybody loves rankings, even though they don’t mean a thing.
Well, at least, most rankings don’t mean anything.
That doesn’t keep people from talking about and debating their validity. They do help give people a sense of the best teams in the state, if anyone can agree on them.
That’s why we in the media publish them.
The Associated Press had released state rankings in football and basketball for years. Until this year.
Despite five attempts, the Associated Press has failed to get the required eight media voters from around the state to take part in the basketball poll.
And at this point in the season, it’s doubtful the AP will release state basketball rankings this year.
So to fill the gap, I started putting together an aggregate ranking for boys and girls basketball teams, using a variety of rankings.
You can find these rankings each week on our high school sports blog.
But to fully understand these rankings, you have to understand the polls that are used to create them.
The first is the WIAA’s RPI, the only ranking that really matters. While the RPI doesn’t help a team reach the state playoffs, it is used to seed teams into the state bracket once they have qualified.
As we’ve previously mentioned, the RPI is compiled by combining a team’s winning percentage, the winning percentage of its opponents and the winning percentage of its opponents’ opponents.
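For readers who want to see the arithmetic, here is a rough sketch of how an RPI-style number can be put together. The 25/50/25 weighting below follows the traditional NCAA formula and is an assumption on my part, not the WIAA’s published math.

```python
# Rough sketch of an RPI-style rating. The 25/50/25 weighting mirrors the
# classic NCAA formula and is assumed here; the WIAA's exact weights may differ.

def win_pct(wins, losses):
    games = wins + losses
    return wins / games if games else 0.0

def rpi(team_wp, opponents_wp, opponents_opponents_wp):
    """Blend a team's winning percentage, its opponents' winning percentage
    and its opponents' opponents' winning percentage into one rating."""
    return 0.25 * team_wp + 0.50 * opponents_wp + 0.25 * opponents_opponents_wp

# Example: a 15-3 team whose opponents play .600 ball and whose
# opponents' opponents play .550 ball
print(round(rpi(win_pct(15, 3), 0.600, 0.550), 3))  # about 0.646
```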
Three other polls used in this ranking are from various media outlets — The Seattle Times (compiled by Nathan Joyce), Tacoma News Tribune (compiled by T.J. Cotterill) and The Columbian (this is how I have voted in the AP poll. I figured that ballot had to be put to some good use).
Those polls are put together using reporter insight and an eyeball test of results from teams around the state.
I also used MaxPreps’ rankings, which are different from RPI. MaxPreps doesn’t say how its rankings are compiled, other than to say that wins over teams ranked higher will help a team and losses to teams ranked lower will hurt.
For the boys’ rankings, I also used rankings from the WIBCA, the state boys basketball coaches association.
And then there are two “computer” rankings developed by fans — Evans Rankings, compiled by Tri-Cities resident Matthew Evans, and the Score Czar, compiled by former Vancouver resident Scott Odiorne.
Evans Rankings is made up of three components — a team’s winning percentage, its opponents’ winning percentage and a points percentage, which measures how much a team scores and allows.
Because of those first two components, Evans Rankings most closely resembles the WIAA’s RPI rankings.
The Score Czar rankings are compiled purely by measuring the points a team scores and allows, with an added strength-of-schedule factor based on the points scored and allowed by the team’s opponents.
Wins and losses are not an actual component in Score Czar rankings.
To put it another way, the Score Czar rankings are prep sports’ equivalent to baseball’s Pythagorean win-loss record.
Baseball sabermetric fans will be familiar with the Pythagorean Theorem of Baseball, developed by Bill James to project a team’s win-loss record from its runs scored and runs allowed.
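For the math-inclined, the formula is simple, and a prep basketball version just swaps points for runs. Here’s a quick sketch, using James’ original exponent of 2:

```python
# Bill James' Pythagorean expectation: estimated winning percentage from
# runs scored and runs allowed. A basketball version swaps in points.

def pythagorean_win_pct(scored, allowed, exponent=2):
    return scored ** exponent / (scored ** exponent + allowed ** exponent)

# Example: a team that has scored 1,300 points and allowed 1,100
print(f"{pythagorean_win_pct(1300, 1100):.3f}")  # about .583
```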
The final rankings that I’ve compiled produce an interesting result. But the part of this exercise I enjoy most is looking at all the different rankings and appreciating all of their perspectives.
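If you’re wondering how several polls can be blended into one list, one simple approach is to average each team’s position across the ballots. The sketch below illustrates that idea with made-up polls and team names; it’s an illustration, not the exact recipe behind the aggregate rankings on the blog.

```python
# Blend several polls by averaging each team's position across them.
# The polls and team names here are made up for illustration.

from statistics import mean

polls = {
    "Poll 1": ["Team A", "Team B", "Team C", "Team D", "Team E"],
    "Poll 2": ["Team B", "Team A", "Team D", "Team C", "Team E"],
    "Poll 3": ["Team A", "Team D", "Team B", "Team E", "Team C"],
}

def aggregate(polls):
    positions = {}
    for ballot in polls.values():
        for spot, team in enumerate(ballot, start=1):
            positions.setdefault(team, []).append(spot)
    # Lower average position means a better aggregate ranking
    return sorted(positions, key=lambda team: mean(positions[team]))

for rank, team in enumerate(aggregate(polls), start=1):
    print(rank, team)
```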
We’ll keep tracking all of these rankings right up to the state tournament.
And if the Associated Press ever gets its act together, I’ll add those rankings to this aggregate compilation.
Tim Martinez is the assistant sports editor/prep editor for The Columbian. He can be reached at 360-735-4538 or tim.martinez@columbian.com, or follow him on Twitter at @360TMart.