Team Brunnhilde version 2.00

Mid July. 90 degrees under the shade on my deck. Pinot Noir and Puyallup loganberries.

But since March I’ve been busy. The data for this website (except for the games!) is now in a database. Ergo, version 2 of teambrunnhilde.com. Besides the games there is a lot of other data. Who played, what season, what league, what classification, who the out-of-state opponents were, what was their record, who changed names, who switched partners (for the co-op teams)? Games are almost an afterthought.

Anyway, you’ll notice that the Crosspoint Academy school page now has King’s West and Bremerton Christian. The Ritzville school page now has Lind-Ritzville and Lind-Ritzville/Sprague as well. No, Tekoa-Oakesdale doesn’t have Tekoa-Rosalia. But Rosalia does. Getting the data into an actual database was the IT task for this off-season. The database isn’t accessible from teambrunnhilde.com. Maybe that will come. For now the pages are still static, but the Perl scripts generate them from a database rather than from an ever-increasing collection of files that only I could possibly figure out.
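
For the curious, the generation step is nothing exotic. Here is a minimal sketch of the idea, with an assumed SQLite file and ‘games’ table standing in for the real thing (the actual schema and page templates are more involved):

```perl
#!/usr/bin/perl
# Minimal sketch of database-driven static pages.  The SQLite file name,
# the 'games' table, and its columns are assumptions for illustration;
# the real schema and page templates are more involved.
use strict;
use warnings;
use DBI;

my $school = shift @ARGV or die "usage: $0 school-name\n";

my $dbh = DBI->connect("dbi:SQLite:dbname=teambrunnhilde.db", "", "",
                       { RaiseError => 1 });

my $rows = $dbh->selectall_arrayref(
    "SELECT season, game_date, opponent, our_score, opp_score
       FROM games
      WHERE school = ?
      ORDER BY season, game_date",
    undef, $school);

open my $out, '>', "pages/$school.html" or die "cannot write page: $!";
print {$out} "<html><body><h1>$school</h1>\n<table>\n";
for my $r (@$rows) {
    my ($season, $date, $opp, $us, $them) = @$r;
    my $wl = $us > $them ? 'W' : 'L';
    print {$out} "<tr><td>$season</td><td>$date</td>",
                 "<td>$opp</td><td>$wl $us-$them</td></tr>\n";
}
print {$out} "</table></body></html>\n";
close $out;
```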

This was actually done in May or June. The out-of-state team records were added then. That addition allows good calculation of RPI back maybe 14 years; that data was obtained from OPSN, idahosports.com, and MaxPreps. Further back, I’ve plowed through scanned newspaper archives. It is a work in progress. The RPI calculations have been redone. Looking up the out-of-state school records also afforded an opportunity to regularize the school names and verify that the Washington school actually played that out-of-state team and not some name-alike school.

2018 ranking comparison

For the fourth year in a row, the TeamBrunnhilde linear model was the most accurate ranker in predicting the outcome of state tournament games.  The linear model for win-loss had an impressive debut, tying for second place with MaxPreps.  Fourth was a composite ranking.  WIAA RPI was fifth.  Everybody did a decent job of picking champions as their #1 ranked team.  Three rankers tied with eight #1’s that were champions: Evans Ranking, WIAA RPI, and TeamBrunnhilde linear model.  Lynden, the top seed in Boys 2A, won that championship, but was #2 in WIAA RPI.  Sorry, WIAA.  No mulligans.

I’ve read complaints that WIAA RPI doesn’t count Win-Loss enough.  Win-Loss as a ranking algorithm finished dead last.  I think there are better rankings than RPI, but RPI is an attempt to improve on Win-Loss with a method understandable to the less mathematically inclined.  This year demonstrates that a properly done RPI does that.  Last year WIAA RPI had fatal flaws: not counting playoff games and using fake data for out-of-state teams.  With these corrected, WIAA RPI finished 5th, well ahead of Win-Loss (which it trailed last year).

The TeamBrunnhilde linear model for win-loss, similar to the TeamBrunnhilde linear model for points but with a different Y vector, did pretty well considering that the mathematics are not best suited for binary (win-loss) data.  I’ll have to look at that before next year.
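
For the curious, here is roughly what ‘a different Y vector’ means. This is only a sketch: the design matrix below (one row per game, +1 for one team and -1 for the other) and the made-up scores are illustrative assumptions, not the model’s exact form.

```perl
#!/usr/bin/perl
# Sketch of "same model, different Y vector."  The exact TeamBrunnhilde
# formulation isn't published here, so the design matrix (one row per
# game, +1 for one team, -1 for the other) and the made-up scores are
# assumptions for illustration only.
use strict;
use warnings;

# [ team_a, team_b, score_a, score_b ]
my @games = (
    [ 'Eastlake',       'Central Valley', 58, 61 ],
    [ 'Central Valley', 'Moses Lake',     70, 52 ],
    [ 'Moses Lake',     'Eastlake',       49, 55 ],
);

# One column per team
my @teams = do { my %seen; grep { !$seen{$_}++ } map { @{$_}[0, 1] } @games };
my %col;
$col{ $teams[$_] } = $_ for 0 .. $#teams;

my (@X, @y_points, @y_winloss);
for my $g (@games) {
    my ($a, $b, $sa, $sb) = @$g;
    my @row = (0) x @teams;
    $row[ $col{$a} ] =  1;
    $row[ $col{$b} ] = -1;
    push @X,         \@row;
    push @y_points,  $sa - $sb;            # points model: the margin
    push @y_winloss, $sa > $sb ? 1 : -1;   # win-loss model: just who won
}

# Both versions solve the same least-squares problem X*b ~ y; only the
# response vector changes.  Hand @X with either @y_points or @y_winloss
# to your least-squares routine of choice and the solution b is the set
# of team ratings.
```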

Everybody had successes.  Everybody had misses.  Not just at the top of the rankings.  We should all go back and work on slaying the variance monster next year.  But as Jesus might say, “The Random you will always have with you.”

Full results.

Regionals

Last year I saw six regional games and four were absolute barn burners. This year, of the six games I saw, one went to the final seconds, one other was close in the fourth quarter, and the other four ranged from clear wins to routs. In all, girls regional games reverted to their 2016 form: only 5 games decided by 5 or fewer points (16 last year), only 13 by fewer than 10. So I didn’t get unlucky in choosing my games.

One other thing.  I’ve probably seen between 1000 and 1500 games over the last 23 years.  A Garfield girls free throw shooter was called for exceeding 10 seconds before taking a shot.  That’s the first time I’d ever seen that called.  Justified?  I dunno.  The referee was likely the only one in the building counting.

Check out the WIAA Regional program cards. The picture: La Center girls (check: in 1A), White Swan girls (check: in 2B), Federal Way boys (check: in 4A), Anacortes boys. What happened to them? Program cards were free at Tumwater and Auburn Mountainview. They cost $1 at Bellevue College. Is that another case of getting rich people in blue areas to subsidize the rest of the state? Of the six program cards, three have good quality pictures of the teams. The ones that don’t were the ones that cost a buck. Horribly pixelated, grainy photographs. And howcum a dollar anyway? For the District 1/2 2A tournament, each team got a full page, often in color, and those were free. Game prices were lower also.

Bellevue College charged $5 for a hot dog and $2 for a very small bag of popcorn. When last I was there it was $3 and $1 for a bigger bag: same as the standard prices for booster club concessions at high school sites. Maybe price gouging for concessions is just to get us warmed up for the Tacoma Dome. At least the police at Bellevue College weren’t taking on the job of confiscating food coming into the venue. Seems that Tacoma’s Finest consider the duty of safeguarding exorbitant prices charged by T-Dome vendors for sub-standard food to be their primary job. Wouldn’t want to patrol the hilltop now, would we? OK, that was during the Goodwill games, snatching sandwiches from the mouths of four year olds in 1990. Nowadays Tacoma Dome food-Nazis work for an outsourced security firm. Today protecting $9 hot dogs in Tacoma—tomorrow getting billion dollar contracts to kill Iraqis waiting in gasoline lines.

Ranking comparison update

After 96 regional games, LM-pts (aka TeamBrunnhilde points, MY ranking) has 80 correct. Next is MaxPreps at 73, Composite at 70, LM-wl (see below) at 69, Evans Ranking at 68, WIAA RPI and Won-Loss at 67, and Captured W-L at 64. Of the 57 games on which all rankers agreed, 51 were picked correctly, nearly 90%. Overall, rankers did better this year than last at regionals, when the top ranker had only 73 right and WIAA RPI had 60 right. Do bad games make for better rankings?

The Linear Model using win-loss, while coming up with reasonable answers (4th place so far), is, upon further review, not an optimum regression technique for binary win-loss data. I’ll have to revisit my binary data texts from 35 years ago; a task for this coming off-season.

Contentious matchups for Wednesday, judging by the disagreements among the rankers, are Taholah v Almira Coulee Hartline (1B Boys), Prairie v Wilson (3A Boys), Lewis and Clark v Sunnyside (4A Boys), Entiat v Mount Rainier Lutheran (1B Girls), Seattle Christian v Lakeside (Nine Mile Falls) (1A Girls), East Valley (Yakima) v Port Angeles (2A Girls), Black Hills v White River (2A Girls). Thirty of 48 games on Wednesday are unanimous choices.

Rankings Comparison, etc.

Rankings Comparison

The 2018 State Tournament Ranking Comparison has seven rankers and a composite ranking this year. WIAA RPI is looking for improvement with the changes made since last year’s last-place finish. Of the first 96 games (regional round), 57 games are unanimous picks of the 7 rankers (the composite agrees too, but then it has to by definition). Note that last year unanimous picks were wrong 20% of the time. So don’t be discouraged if you’re feeling dissed! And many of those unanimous picks rest on just a ranking spot or two of difference for some rankers.
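
For anyone wondering how the comparison is scored: each ranker ‘picks’ the better-ranked team in a game and gets credit when that team wins. A minimal sketch of the bookkeeping, with invented ranks and a single made-up game:

```perl
#!/usr/bin/perl
# How the comparison is scored: each ranker "picks" the better-ranked
# team in a game and gets credit when that team wins.  The ranks and the
# single game below are invented just to show the bookkeeping.
use strict;
use warnings;

# ranker => { team => rank }, lower number = better
my %ranks = (
    'WIAA RPI' => { 'Lynden' => 2, 'Anacortes' => 5 },
    'LM-pts'   => { 'Lynden' => 1, 'Anacortes' => 7 },
);

# [ team_a, team_b, winner ]
my @games = ( [ 'Lynden', 'Anacortes', 'Lynden' ] );

my %correct;
for my $ranker (keys %ranks) {
    my $r = $ranks{$ranker};
    for my $g (@games) {
        my ($a, $b, $winner) = @$g;
        my $pick = $r->{$a} < $r->{$b} ? $a : $b;   # better rank = the pick
        $correct{$ranker}++ if $pick eq $winner;
    }
}

printf "%-10s %d of %d correct\n", $_, $correct{$_} // 0, scalar @games
    for sort keys %ranks;
```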

If you propose a ranking, or use a ranking, you should be willing to have that ranking actually checked. You will likely find it isn’t ‘scary accurate’ after all. Every ranking will have notable successes and failures. The hubris of rankers will be tested. It is a reminder that games still need to be played. OK. Maybe not all games. Some of these are just flat out mismatches.

District Tournament notes

Hard work. I saw the first three girls winner-to-state games at Mount Tahoma on Saturday, February 10. Union played Bellarmine Prep in one of those. BP won by 32. But it was still a heckova game. Union fought throughout. After a tight (nearly) first half, they fell behind—but never ever stopped working. Sitting at Mount Tahoma that afternoon, knowing that the losers had more chances: who would be the ones that made it to state? Beamer. Rogers. Union? Yeah, Union. Hard work pays off. Congratulations to you! Next time I see you play, I’ll remember to bring my pickup truck to take home a load of bricks put up by your harassed opponent.

Boxing used to be a ‘State Championship’ sport. A banner in the Mount Vernon gymnasium proclaims the Bulldogs as 1952 State Champion in Boxing. I generally scan the banners in gymnasiums. What is the athletic tradition here? Sitting on the wooden built-in seats, ‘200-level’. Excellent sight lines to the floor. It is a beautiful old gym. You can totally imagine a boxing match here in 1952, floodlights on a ring at center court, like a 1950’s movie. The gym is the same. Glad I made its acquaintance, finally.

Sammamish has a new gym. A beautiful gym, part of an athletic complex. It means the demise of the last ‘character’ gym in Kingco. Sammamish administrators won’t miss it, but I will. I only went there once a year, so I didn’t notice the rats.

MaxPreps errata: boys

Examining the MaxPreps data Sunday morning, just after seeding was announced.  Missing games: 130.  Other errors, yes.  Between boys and girls, 245 missing games, maybe 300 total errors.  That’s a >3% error rate for ~9000 games.  Are you feeling confident that the RPI calculations are correct?  Some seedings are decided by less than 0.001.  Here is the whole report.  There may be other errors and omissions not included.

On the WIAA RPI report, the out-of-state teams’ records are not reported.  Looking at the OSAA database, those records are apparent.  It is not possible to determine what the WIAA RPI calculation is using, if anything.

During the season, I look at the MaxPreps results daily.  I noticed games were not being reported promptly, but figured they’d come in eventually.  The usual MaxPreps errors (games entered multiple times, wrong opponents) are fairly minimal.  Missing ~3% of the games played is not.

What the WIAA/MaxPreps RPI database missed

I run a script now and then to compare the MaxPreps database with mine.  I occasionally find a game I’ve missed.  Some of the tiebreakers this season, for instance.  But even if I don’t have every game, nobody is depending on my database to make calculations to five decimal places.  Not so for the MaxPreps/WIAA database.

I ran my script today, after tournament pairings were announced.  By my count there were 115 varsity games that I’ve recorded that are not included in the MaxPreps/WIAA database.  For most of these, MaxPreps has the game scheduled, just no result.  There were several double-counted games and some wrong-team mistakes.  Do games against Australian teams count?  If they do, they should all count.  But some games are ‘non-varsity opponents’ in MaxPreps lingo.  So I guess those don’t count anyway.
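
The comparison itself is simple set arithmetic. A sketch of the idea, with assumed file names and a tab-separated format standing in for the real databases:

```perl
#!/usr/bin/perl
# Sketch of the comparison: key each result on date plus the two team
# names, then report games present in one source but not the other.
# The file names and the tab-separated format are assumptions, not the
# author's actual script.
use strict;
use warnings;

sub load {
    my ($file) = @_;
    my %games;
    open my $fh, '<', $file or die "$file: $!";
    while (<$fh>) {
        chomp;
        my ($date, $team1, $team2) = split /\t/;
        my $key = join '|', $date, sort($team1, $team2);
        $games{$key} = 1;
    }
    return \%games;
}

my $mine     = load('my_games.tsv');
my $maxpreps = load('maxpreps_games.tsv');

for my $k (sort keys %$mine) {
    print "missing from MaxPreps: $k\n" unless $maxpreps->{$k};
}
for my $k (sort keys %$maxpreps) {
    print "missing from my database: $k\n" unless $mine->{$k};
}
```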

Here’s the full details.  If it was my database I’d be embarrassed.  But there is no shame at MaxPreps.

Another rating

I’ve lost track of the Associated Press high school basketball poll. The last one I saw was last season in late January. Haven’t seen one this season. Maybe it’s been abandoned. Don’t know how many of the voters saw more than a few of the teams they were voting on anyway. Early weeks of the poll, which never started before January, always had their share of ‘booster votes’: votes for a local team that is sort of good and sure, why not them? You want to sell papers to the local folks, not somebody across the state. Nowadays the WIAA RPI is the ranking quoted when lauding a team. So the AP poll is obsolete. It was an example of ‘learned opinion’, maybe not very learned at all, but approximating a seeding committee rather than a numeric algorithm.

There is something to winning games. That is ultimately what counts. I don’t like the way RPI uses that information, but when somebody is #1 in RPI one should take notice. In January at Wilson High School, RPI #1 Gig Harbor (rated much lower by me) made Garfield (rated higher by me) look just average. I don’t think much of District 3 2A teams. Port Angeles is RPI #4. My linear model ranking has them #25. What would happen if the linear model just considered wins and losses, not margins? So I modified the program to run the linear model with just 1 for the margin instead of the point spread. Port Angeles doesn’t rise to #4, but it makes the top 10.

Based on games through Sunday (2/4/18), here is how the ranks compare, showing rankings for teams rated in the top 10 by at least one of three different rankings. LM-margin is the TeamBrunnhilde points rating. LM-win/loss is LM-margin considering only winning and losing, not margin. WIAA RPI is, well, you should be able to figure that one out.

4A Girls

                              LM-margin   LM-win/loss   WIAA RPI
Central Valley                    1            2            1
Eastlake                          2            1            2
Moses Lake                        3            4            4
Woodinville                       4            3            5
Bellarmine Prep                   5            9            6
Kentridge                         6           11            3
University                        7           15           20
Chiawana                          8            6           10
Lewis and Clark                   9           12           14
Sunnyside                        10            8           12
Lake Stevens                     11           13            7
Newport (Bellevue)               12            5            9
Skyline                          13            7           17
Issaquah                         18           10           25
Beamer                           22           18            8

3A Girls

                              LM-margin   LM-win/loss   WIAA RPI
Kamiakin                          1            1            4
Prairie                           2            9            7
Lincoln                           3            5            2
Bethel                            4           12           12
Garfield                          5            8            8
West Seattle                      6            2            5
Stanwood                          7           13           15
Mount Spokane                     8           11           25
Redmond                           9            7            9
Snohomish                        10            3            3
Gig Harbor                       11            4            1
Bellevue                         12           14           10
Seattle Prep                     14            6           11
Edmonds-Woodway                  15           10           13
Timberline                       23           16            6

2A Girls

                              LM-margin   LM-win/loss   WIAA RPI
W.F. West                         1            4            2
East Valley (Spokane)             2            1            1
Clarkston                         3            2           10
Archbishop Murphy                 4            6            3
Wapato                            5            3            5
Burlington-Edison                 6            7            6
East Valley (Yakima)              7            9           18
Lynden                            8           11            8
Black Hills                       9            8            7
Prosser                          10            5           13
Mark Morris                      20           15            9
Port Angeles                     25           10            4

1A Girls

                              LM-margin   LM-win/loss   WIAA RPI
Cashmere                          1            4            2
Lynden Christian                  2            1            1
Zillah                            3            2            4
La Salle                          4            5            3
Medical Lake                      5            3            5
La Center                         6            9            7
Meridian                          7            7           10
Cle Elum-Roslyn                   8           10           18
Lakeside (Nine Mile Falls)        9            6            8
Nooksack Valley                  10           12           13
Columbia (Burbank)               11            8            9
Elma                             12           11            6

2B Girls

                              LM-margin   LM-win/loss   WIAA RPI
Colfax                            1            2           10
Ilwaco                            2            4            1
Davenport                         3            1            2
Northwest Christian (Colbert)     4            5           16
St. George’s                      5            3            7
Liberty (Spangle)                 6            8           14
Napavine                          7           10            3
Wahkiakum                         8           11            4
White Swan                        9            7            8
Mabton                           10            6            9
Tri-Cities Prep                  11            9           11
Life Christian                   14           13            5
La Conner                        15           14            6

1B Girls

                              LM-margin   LM-win/loss   WIAA RPI
Colton                            1            1            1
Sunnyside Christian               2            3            4
Pomeroy                           3            2            2
Almira Coulee Hartline            4            4            5
Oakesdale                         5            6           12
Garfield-Palouse                  6           10           16
Selkirk                           7            5            7
Touchet                           8            9           17
Inchelium                         9            7           13
Wellpinit                        10            8           18
Neah Bay                         15           13            3
Mount Vernon Christian           17           14            8
Mount Rainier Lutheran           19           18           10
Puget Sound Adventist            23           19            6
Taholah                          27           27            9

One gets different ranks depending on the algorithm used. No algorithm is inherently ‘right’. That is, God does not sit in His office with the ‘real’ rankings and we are searching for the algorithm He uses. Come state tournament time, I’ll once again have a ranking competition. One would like to see validation for a ranking technique used for seeding: a high rate of success in predicting the outcome of actual state tournament games.  If you want to participate with a set of rankings (192 teams, 12 tournaments, each with 16 teams ranked 1-16), leave a comment.  I’ll get back with you.  Russians eligible, too.

Tournament Month is Here!

February has been Tournament Month for me since 1996. My daughter and I attended several league games for Liberty (Issaquah) girls that season, and then stumbled into the D1 and D2 AAA tournament at Hec Ed quite by accident when we were up at UW for another reason. Haven’t missed Tournament Month since. From 1997 on it was the D2 3A (AA in 1997) girls tournament. That was where Liberty played through 2014, provided they qualified. I long felt that the D2 3A tournament was the toughest tournament in the state. But then I started delving into history and found some other tough nuts in the D7 and D9 B girls tournaments. Or how about the D5/8 4A girls tournaments? Any tournament with the Greater Spokane League girls is going to be a tough one.

One district that didn’t stand out as particularly difficult was District 3, the West Central District. The D3 4A tournament has always had a boatload of state berths. As whole leagues migrated into D3 (see Seamount moving from D2 to D3 in 2002), the number of berths ballooned in the 2A and 3A classifications. But the number of actual GOOD TEAMS did not. For 2015 and 2016, D3 and D2 combined for 2A tournaments, Liberty participating. In 2015 Liberty was a good team, and after getting past White River, the only reliably good 2A girls team in D3, won the district title. Sammamish, the other D2 team, was runner-up. I saw, up close, a lot of mediocre teams in that tournament. And a bunch of them made state. In 2016, Liberty was one of those mediocre teams, and they waltzed through into a third-place state berth. At state regionals, every one of those six teams (White River included) face-planted against the rest of the state.

So I’m not bullish at all on D3 girls basketball, even if they get a champion through now and then. There are good teams in D3, just way too few for the number of state berths handed to them by the WIAA geographical quota system.

In lieu of a weekly top 10 article, and in honor of Tournament Month, I’ve made up a chart projecting which girls teams make it to state, and which don’t, based on WIAA allocations and on the TeamBrunnhilde points ratings. Of course they’ve got to win the tournament games. Bracketing affects that, too. What stands out is the number of teams outside the TeamBrunnhilde top 16 that will make it to state from District 3 (or a combined district tournament involving District 3). There is also the enshrining of Emerald City mediocrity in the D1/2 1A tournament, where the Emerald City champion is automatically qualified to state and dropped into the district championship game. This year’s best Emerald City team, Seattle Academy, is currently rated #37 in 1A by me. That is not only dumb, it is grossly unfair: to make state in a multi-league district tournament without having to even play anyone outside your league. Meanwhile eight 1A teams from District 5 rated above Seattle Academy will be sent home at or even before districts. Looking at the chart as a whole, there are 96 teams qualifying to state. There are 100 other teams that are rated above a team qualifying to state from another district, but that will NOT be making it to state. And that is a best case scenario.

Just Play Fair. The WIAA motto. Is it fair that bad teams make it to state depending on where the district boundaries are drawn? I am absolutely opposed to basing state qualification on ranking algorithms. But there needs to be some adjustment to the WIAA allocations that recognizes merit. Districts that regularly produce good teams need to have more state berths allocated to them. It is not a transient thing. I’ve got 30 years of girls data and D3 has been pretty bad for most of those years. And the districts around Spokane have been pretty good.

So Howcum Gig Harbor is #1 in 3A RPI?

RPI = 0.25 * WP + 0.50 * OWP + 0.25 * OOWP

That’s the well-known WIAA RPI formula. It helps to win games. Those get weighted 25%. It helps to play good opponents. That gets weighted 50%. And if those opponents also play good opponents, that helps too, weighted 25%.
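
Here is a minimal sketch of that formula applied to a toy schedule. The games are made up, and the WIAA/MaxPreps implementation surely differs in detail (out-of-state records, forfeits, how repeat opponents are weighted), but the arithmetic is the published formula:

```perl
#!/usr/bin/perl
# RPI = 0.25*WP + 0.50*OWP + 0.25*OOWP applied to a toy schedule.
# The games and results below are made up for illustration.
use strict;
use warnings;

# [ team_a, team_b, winner ]
my @games = (
    [ 'Gig Harbor', 'Kentlake', 'Gig Harbor' ],
    [ 'Gig Harbor', 'Curtis',   'Curtis'     ],
    [ 'Kentlake',   'Curtis',   'Kentlake'   ],
    [ 'Curtis',     'Kentlake', 'Curtis'     ],
);

my %opponents;    # team => list of opponents, one entry per game
for my $g (@games) {
    my ($a, $b) = @$g;
    push @{ $opponents{$a} }, $b;
    push @{ $opponents{$b} }, $a;
}

# Winning percentage, optionally excluding games against one team
# (OWP and OOWP leave out the games involving the team being rated).
sub wp {
    my ($team, $exclude) = @_;
    my ($w, $l) = (0, 0);
    for my $g (@games) {
        my ($a, $b, $winner) = @$g;
        next unless $a eq $team or $b eq $team;
        next if defined $exclude and ($a eq $exclude or $b eq $exclude);
        $winner eq $team ? $w++ : $l++;
    }
    return ($w + $l) ? $w / ($w + $l) : 0;
}

for my $team (sort keys %opponents) {
    my @opps = @{ $opponents{$team} };
    my $owp  = 0;
    $owp += wp($_, $team) for @opps;
    $owp /= @opps;

    my $oowp = 0;
    for my $opp (@opps) {
        my @oo  = @{ $opponents{$opp} };
        my $sum = 0;
        $sum += wp($_, $opp) for @oo;
        $oowp += $sum / @oo;
    }
    $oowp /= @opps;

    printf "%-12s RPI %.5f\n", $team,
           0.25 * wp($team) + 0.50 * $owp + 0.25 * $oowp;
}
```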

Comparing Gig Harbor with Prairie (the poster child for RPI victims, at least according to the Vancouver Columbian):

Prairie has a better win percentage: .833 (10-2) v .692 (8-4). Prairie will win all their league games. They haven’t lost a league game since 2000. The only possible loss is to Rogers (Puyallup), but that is still a likely Prairie win. Gig Harbor is favored in 7 of their 8 remaining games. Both teams are likely to see an increase in their WP, although the higher WP is, the less impact a win has on increasing it.

Comparing the non-league opponents, Gig Harbor has a much stronger slate:

Gig Harbor                            Prairie
opponent              WP    TBPts     opponent                WP    TBPts
Kentlake             .769    30.2     Tumwater               .273     -5.4
Curtis               .750    25.0     Camas                  .615     35.8
Kentridge            .923    46.4     Battle Ground          .250     -0.9
Snohomish            .833    33.2     Skyview                .357     18.7
Lynden Christian    1.000    45.5     Union                  .769     27.0
Garfield*            .750    47.8     Whitney (CA)           .643      N/A
                                      S. San Fran (CA)       .727      N/A
                                      San Joaquin Mem (CA)   .667      N/A
                                      W.F. West              .909     48.3
                                      Rogers* (Puyallup)     .615     24.4

* not yet played

The WP records above are calculated without the games involving Gig Harbor or Prairie.

For league opponents, start with 0.500 for the average league opponent in league games not involving Prairie or Gig Harbor, then combine that with their WP in non-league games. The rest of Gig Harbor’s league is a combined 30-9 in non-league games. The rest of Prairie’s league is a combined 15-33. Gig Harbor’s non-league OWP is very high, and is likely to drop as league games are played. But the 30-9 record will cushion that drop. Prairie’s OWP will also drop, but the 15-33 record will accelerate the decline.
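
One back-of-the-envelope way to do that combination is to count league games at .500 and fold in the actual non-league record. The blend and the game counts below are my assumptions for illustration, not the exact WIAA arithmetic:

```perl
#!/usr/bin/perl
# Back-of-the-envelope version of the projection described above:
# treat a league opponent's league games (excluding those against the
# team being rated) as .500 ball, then blend in their actual non-league
# record.  The game counts here are invented for illustration.
use strict;
use warnings;

sub projected_wp {
    my ($league_games, $nl_wins, $nl_losses) = @_;
    return (0.5 * $league_games + $nl_wins)
         / ($league_games + $nl_wins + $nl_losses);
}

# A strong-league opponent: 12 league games plus a 5-1 non-league record
printf "strong league, 5-1 non-league: %.3f\n", projected_wp(12, 5, 1);
# A weak-league opponent: 12 league games plus a 2-6 non-league record
printf "weak league, 2-6 non-league:   %.3f\n", projected_wp(12, 2, 6);
```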

Since league opponents have a lot of opponents in common (they’re in the same league) the overall record of a league in non-league games impacts the OOWP. Again the weakness of Greater St. Helens 3A will pull down Prairie’s OOWP. The strength of SSC will help Gig Harbor’s OOWP.

There is no better 3A league in non-league games than South Sound Conference. There is no worse 3A league for non-league games than Greater St. Helens 3A. Gig Harbor will likely remain highly ranked in RPI. Prairie is actually likely to fall as league games against the very weak GSHL 3A start to count in RPI.

So howcum Gig Harbor is #1 for RPI? They belong there.

Did Gig Harbor schedule for RPI? I think so. But a lot of other teams did too. When you’re rewarding people based on a measurement, and people know what that measurement is, you can expect people to try to optimize that measurement. I’m surprised that Prairie did a poor job of non-league scheduling. Prairie knows it is in a weak league. They’ve known that for decades. The only way they can counteract that for RPI purposes is the non-league schedule. Or maybe Prairie realizes that no matter what they do, they won’t be in the top 8 in RPI anyway.

Cross-classification scheduling

A recent post prompted an out-of-band exchange regarding scheduling games between classifications. The effect that scheduling itself has on rankings is a subject of interest to me. It is complex and requires a lot more time than I can devote during the season when, on some days, I spend five or more hours getting the previous day’s games into my database. But since I have that database I can ask it questions. It takes some time to formulate just how to ask, and to understand just what the answer is telling me.

One question prompted by that exchange was just who schedules non-league games with teams in other classifications. League games are massively scheduled between teams in small subsets of the approximately 380 schools fielding varsity basketball teams. There really isn’t any point in using a ranking algorithm to determine who is the best team in a league when everybody plays everybody else one, two, even three times. That’s what the win-loss record is for. It is very easy to understand. The schedule is balanced, in most cases. The schedule for non-league games is not at all designed to even have the appearance of randomness. Thank goodness. I don’t want to see a random set of games selecting a 3A team to play a 1B team. I know that the only way Chief Kitsap girls will beat Bethel would be for the Tacoma Narrows bridge to collapse again while Bethel is driving across. The stat geek in me would find an experimentally designed schedule appealing. The fan in me would not. We get the non-league games that happen. They form the critical schedule structure for driving rankings of teams in different leagues and districts.
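
This is the kind of question the database can answer directly. A sketch of the query, with an assumed schema (a ‘games’ table plus a ‘schools’ table carrying a numeric class level, 1B=1 up through 4A=6) rather than my actual one:

```perl
#!/usr/bin/perl
# Which non-league girls games in 2006-2007 crossed two or more
# classifications?  The schema here (a 'games' table plus a 'schools'
# table with a numeric class_level, 1B=1 through 4A=6) is assumed for
# illustration.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("dbi:SQLite:dbname=teambrunnhilde.db", "", "",
                       { RaiseError => 1 });

my $rows = $dbh->selectall_arrayref(q{
    SELECT g.school, s1.class_level, g.opponent, s2.class_level,
           g.our_score, g.opp_score
      FROM games g
      JOIN schools s1 ON s1.name = g.school
      JOIN schools s2 ON s2.name = g.opponent
     WHERE g.season = '2006-07'
       AND g.league_game = 0
       AND ABS(s1.class_level - s2.class_level) >= 2
});

printf "%s (level %d) %d, %s (level %d) %d\n", @{$_}[0, 1, 4, 2, 3, 5]
    for @$rows;
```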

For starters, I looked at girls data for 2006-2007. That is the first season with the six current WIAA classifications. I did a lot of this by hand, so I hope I didn’t mis-transcribe. The inter-classification W-L table looks like this:

          v 4A      v 3A      v 2A      v 1A      v 2B      v 1B
4A           -    112-51     14-14       0-1       0-1         -
3A      51-112         -     34-45       3-3       3-1       1-0
2A       14-14     45-34         -     68-34       9-5       3-0
1A         1-0       3-3     34-68         -     55-50      14-9
2B         1-0       1-3       5-9     50-55         -     81-84
1B           -       0-1       0-3      9-14     84-81         -

(Dashes mark a classification's own column and the 4A v 1B pairing, where there were no games.)

Looking at the cells where the classification difference is at least two, there are 81 games. The higher classification team won 47 and the lower classification team won 34. This season is also interesting for how badly 3A fared against adjacent classifications, but I didn’t look at those. Examining the 81 games was enough for me this week. Leave that for another day. Here is the list of the 81 games.
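
Tallying those cells from the table above (only the higher classification’s side of each pairing is entered) reproduces the totals:

```perl
#!/usr/bin/perl
# Tally the "gap of two or more" cells from the 2006-2007 table above.
# Only the higher classification's side of each pairing is entered;
# rows/columns run 4A, 3A, 2A, 1A, 2B, 1B.
use strict;
use warnings;

my @class = qw(4A 3A 2A 1A 2B 1B);
# record of the higher classification vs the lower one, [wins, losses]
my %rec = (
    '4A' => { '3A' => [112,51], '2A' => [14,14], '1A' => [0,1], '2B' => [0,1] },
    '3A' => { '2A' => [34,45],  '1A' => [3,3],   '2B' => [3,1], '1B' => [1,0] },
    '2A' => { '1A' => [68,34],  '2B' => [9,5],   '1B' => [3,0] },
    '1A' => { '2B' => [55,50],  '1B' => [14,9] },
    '2B' => { '1B' => [81,84] },
);

my ($games, $hi_wins, $lo_wins) = (0, 0, 0);
for my $i (0 .. $#class) {
    for my $j ($i + 1 .. $#class) {
        next if $j - $i < 2;                       # skip adjacent classifications
        my $r = $rec{ $class[$i] }{ $class[$j] } or next;
        $games   += $r->[0] + $r->[1];
        $hi_wins += $r->[0];
        $lo_wins += $r->[1];
    }
}
print "$games games, higher classification $hi_wins-$lo_wins\n";
# prints: 81 games, higher classification 47-34
```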

Thirty 4A teams scheduled down two or more classifications (maybe some teams count more than once). In so doing 4A went 14-16. Using the 2006-2007 TeamBrunnhilde points rating, eight of these were in the top half of the classification, 22 in the bottom half, and 16 in the bottom quarter. Clearly not a representative slice of 4A teams doing battle with 2A, 1A, and 2B teams. Of the 28 2A teams that scheduled up by two classifications (all against 4A), half were from the top half of that classification, 7 in the top 10. 2A fielded a much better lineup going up against 4A.

Look at 3A. Eleven teams scheduled down: three in the top half (#24 twice and #27), eight in the bottom half, and six of those in the bottom quarter.

2A scheduled down 14 times. Only three of the 14 were from the top half. Four were from the bottom three 2A teams. Klahowya, #54 (last), an epically dreadful team in the midst of a four-year period in which it won one (1) game, managed to lose twice to #60 1B team, Quilcene. Quilcene, though, rated 3 points ahead of Klahowya, so these were not upsets.

Fourteen of the 1A teams scheduling down came from the top half of the classification. More than half. But where do those 1A teams live? Four games for Colfax, others from eastern Washington. People may be sparse; good girls basketball teams aren’t. The next town over might be 1B, but the team is good. Seattle Christian (18) and Forks (21) were the only top-halfers west of the mountains. Okanogan (17) lost to Entiat (1B #6) twice: Entiat was a six-point favorite anyway.

Overall for the 81 games, 28 down-schedulers came from the top half of their classifications (4A, 3A, 2A, 1A); 45 up-schedulers came from the top half of their classifications (2A, 1A, 2B, 1B). This is what you would expect. Coaches don’t want to schedule a slate of mismatches.

If you had two hats, one with names of teams from, say, 4A and the other from 2A, and pulled a name from each hat, who would win? Knowing nothing else, I would guess the 4A team. But if you had a hat with the games that were actually scheduled and asked who would win, that’s a different question: 50-50 for this season. It isn’t a random schedule, not for any of the cross-classification comparisons (two or more classifications apart). In theory, giving extra credit to the lower classification team for winning, or even for playing the game, seems logical. In practice it doesn’t necessarily work out.

But that’s not the end. What about the adjacent classification games? Funny that I picked 2006-2007, because that looks really interesting. What about the next year, and the next? Just because something shows up one season doesn’t mean it applies to others. Now that there is an RPI incentive in making schedule arrangements, how has that changed scheduling? Then there is geographical asymmetry. Good teams are not evenly spread across the state, but most games are close by, posing difficulties for making cross-state comparisons. Lots of questions. But I’ve got a game to catch this evening.