Adjusting for Difficulty: ACF Fall

I’m going to walk through what I’m trying to do with adjustments for tournaments more and less difficult than so-called “regular difficulty” sets. This is admittedly not a particularly advanced model, in large part because I don’t have the time, resources, or knowledge to put together in-depth algorithms for these comparisons; it is intended to provide something of a “rule of thumb” rather than a hard-and-fast number you absolutely must add to get the information I’m looking for.

For the purposes of this blog, I am treating all HSAPQ 20/20 sets and all regular NAQT IS sets as “regular difficulty”; that is to say, HSAPQ Tournament 15, NAQT IS-96, and NAQT IS-98. Feel free to quibble with this in the comments, but given that these are the most widely played sets that could reasonably be called “regular difficulty” (I think it’s clear that A sets do not qualify, given both the statistical results on those sets and NAQT’s own descriptions of them), I think it’s perfectly reasonable to evaluate other tournaments’ difficulty by comparing them to these two classes of sets.

Also of note is that I’m just going to be comparing PPBs for this and going from there. This is not because PPG, tossup conversion, powers, or any other number is meaningless; it is because PPB is relatively “protected” from changing for reasons other than team performance and makeup.
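
For anyone who wants the definition spelled out: PPB is just total bonus points divided by bonuses heard. A minimal sketch (the function name is mine, not from any stats package):

```python
def points_per_bonus(bonus_points, bonuses_heard):
    """Points per bonus: total bonus points divided by bonuses heard.

    A team only hears a bonus after converting a tossup, so PPB is largely
    insulated from schedule strength and tossup-race effects, which is why
    it's the number being compared here.
    """
    return bonus_points / bonuses_heard if bonuses_heard else 0.0
```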

The following high school teams played in an ACF Fall event and performed well enough to appear somewhere in my secret .txt of rankings.

DAR A
Dorman A
Dorman B
Dorman C
Guilford
Lisle
Needham A
Pensacola
Rancho Bernardo
Richard Montgomery A
Seven Lakes A
Seven Lakes B
St. Anselm’s
St. Joseph’s
State College A
Thomas Jefferson
Treasure Valley Math and Science Center A

The first step is to narrow this list down to teams that played at least one of IS-96, IS-98, or HSAPQ Tournament 15 (a sketch of this filtering step appears after the list below). This leaves:

DAR
Dorman B
Guilford
Lisle
Rancho Bernardo
Richard Montgomery A
Seven Lakes A
Seven Lakes B
St. Anselm’s
St. Joseph’s
Thomas Jefferson
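
A rough sketch of that filtering step, assuming a simple mapping from each team to the sets it played (the layout and the two sample entries are mine, filled in from the lists above):

```python
# Hypothetical layout: team -> set of question sets played this season.
# Every team in the starting list already played ACF Fall.
sets_played = {
    "DAR A": {"ACF Fall", "IS-96"},
    "Dorman A": {"ACF Fall"},  # none of the three comparison sets
    # ... remaining teams filled in from the posted stats
}

REGULAR_SETS = {"IS-96", "IS-98", "HSAPQ Tournament 15"}

# Keep only teams that also played at least one regular-difficulty set.
comparable = sorted(
    team for team, played in sets_played.items() if played & REGULAR_SETS
)
print(comparable)  # ['DAR A']
```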

I then compared each team’s ACF Fall PPB to its best performance on IS-96, IS-98, or HSAPQ Tournament 15. I chose the best performance because, theoretically, this yields an adjustment that converts a team’s ACF Fall performance into something approximating its “regular difficulty” equivalent, if you will.
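
In code terms, that adjustment is just the gap between a team’s best regular-difficulty PPB and its ACF Fall PPB. A minimal sketch, assuming the PPBs are keyed by set name (the data layout is my own; the Seven Lakes A figures are reconstructed from the numbers quoted later in this post):

```python
# Hypothetical layout: team -> {set name: PPB on that set}.
ppb = {
    "Seven Lakes A": {"ACF Fall": 24.37, "HSAPQ Tournament 15": 24.59},
    # ... remaining teams filled in from the posted stats
}

REGULAR_SETS = ("IS-96", "IS-98", "HSAPQ Tournament 15")

def fall_adjustment(team_ppb):
    """Gap between a team's best regular-difficulty PPB and its ACF Fall PPB."""
    best_regular = max(team_ppb[s] for s in REGULAR_SETS if s in team_ppb)
    return round(best_regular - team_ppb["ACF Fall"], 2)

print(fall_adjustment(ppb["Seven Lakes A"]))  # 0.22
```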

The increase in PPB from ACF Fall to the team’s best “regular difficulty” performance is listed below:

DAR – +0.85 on IS-96
Dorman B – +2.00 on IS-96
Guilford – +3.44 on HSAPQ 15
Lisle – +1.80 on HSAPQ 15
Rancho Bernardo – +2.19 on HSAPQ 15
Richard Montgomery A – +0.85 on IS-96
Seven Lakes A – +0.22 on HSAPQ 15
Seven Lakes B – +0.86 on HSAPQ 15
St. Anselm’s – +1.36 on HSAPQ 15
St. Joseph’s – +0.09 on IS-96
Thomas Jefferson – +2.36 on HSAPQ 15

The average of these values is 1.46.

Two of these differences stick out as particularly low: St. Joseph’s +0.09 and Seven Lakes A’s +0.22. In my opinion, they have two different causes. In Seven Lakes A’s case, it’s because they posted a 24.37 PPB at ACF Fall; it’s so hard to go far above 24 PPB on any set that you aren’t likely to see them rise much above that number, regardless of their talent level. In St. Joseph’s case, I think it’s apparent from their performance that their tossup-based numbers are superior to their bonus-based numbers.

If you eliminate those two values, the average effect on PPB is 1.75. I feel okay rounding this up to 2.00, since that’s a rounder number and easier to work with.
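
For anyone who wants to check the arithmetic, here’s the same averaging done in Python with the deltas listed above:

```python
from statistics import mean

deltas = {
    "DAR": 0.85, "Dorman B": 2.00, "Guilford": 3.44, "Lisle": 1.80,
    "Rancho Bernardo": 2.19, "Richard Montgomery A": 0.85,
    "Seven Lakes A": 0.22, "Seven Lakes B": 0.86, "St. Anselm's": 1.36,
    "St. Joseph's": 0.09, "Thomas Jefferson": 2.36,
}

print(round(mean(deltas.values()), 2))  # 1.46 across all eleven teams

# Drop the two low outliers discussed above and re-average.
trimmed = [d for team, d in deltas.items()
           if team not in ("Seven Lakes A", "St. Joseph's")]
print(round(mean(trimmed), 2))  # 1.75
```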

This actually surprises me, as I expected the number to be even larger, possibly +3 or +4.

I’ll re-examine this process later on, when I get a chance to look at more tournaments; until then, please share your thoughts in the comments below.


5 thoughts on “Adjusting for Difficulty: ACF Fall”

  1. It seems like it would be more useful to graph the ppb on the more difficult set as a function of ppb on the easier set. As in, each team would have a data point (x, y), where x = easier ppb and y = ACF Fall ppb, and then run a regression on it.
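
    For illustration, a minimal sketch of what that regression could look like with numpy (the paired PPB lists would have to be pulled from the actual stats; the function below is hypothetical, not something from the post):

    ```python
    import numpy as np

    def difficulty_fit(regular_ppb, fall_ppb):
        """Least-squares fit of ACF Fall PPB as a function of regular-difficulty PPB.

        Returns (slope, intercept) so that fall ~= slope * regular + intercept,
        which would capture compression near the 24-PPB ceiling better than a
        single flat offset.
        """
        slope, intercept = np.polyfit(regular_ppb, fall_ppb, deg=1)
        return slope, intercept

    # Usage, once the paired PPBs are filled in from the stats pages:
    # slope, intercept = difficulty_fit([easier-set PPBs], [ACF Fall PPBs])
    ```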

  2. This is interesting.

    I think one of the reasons for the small difference is just that all these teams, at one point or another, are probably practicing, studying, or at least reading college sets like previous iterations of ACF Fall. So there were not many surprises at ACF Fall 2010, because they had seen this sort of bonus question before in other college sets.

    • I think this comes out in the wash, as I’m only looking at those teams that actually played ACF Fall. The adjustment wouldn’t have any effect on a team that didn’t actually play Fall.

  3. This is an interesting process, but I don’t agree that all “regular difficulty” sets should be treated as being of equal difficulty. IS-96 appears to have been a decent amount harder than HSAPQ’s Tournament 15, for one thing, and its different distribution brings other effects into account besides pure difficulty, which could easily influence a team’s performance by a bonus point or two. If practicable, it might be better to treat each set separately, rather than as representative of a standard high school caliber.

    • The issue with that is one of sample size, I believe; generally speaking, I think IS sets produce PPBs close to those on HSAPQ sets, so this approach generally works well.
