I had a lot of time on my hands so I decided to see if I could derive a system for calculating accuracy pp because the current system seems a bit arbitrary. (It almost seems like an equation you pulled out of your ass and modified through trial and error).

Accuracy is simply how accurately you hit the notes. This is one of the few things that can be calculated somewhat objectively. It is essentially a statistical question, so statistical methods apply very naturally. A measurement of accuracy is really a measurement of error: the more error there is, the less accuracy you have. So if we can find the error, we can find the accuracy. A good measurement of error is variance. Based on the misses, 50s, 100s, and 300s on circles (sliders are automatic 300s and spinners don't require hitting) and the OD, we can calculate the maximum variance of the hit errors around 0. We can then compare the square root of this to the 300 hit window at each OD to find an effective OD.

Variance (V) = [(#circles - #100s - #50s - #misses)*MS300^2 + #100s*MS100^2 + #50s*MS50^2 + #misses*MSmiss^2] / #circles

Key:

MS300 = maximum error (in milliseconds) from zero where you still hit a 300 (79.5-6*OD)

MS100 = maximum error (in milliseconds) from zero where you still hit a 100 (139.5-8*OD)

MS50 = maximum error (in milliseconds) from zero where you still hit a 50 (199.5-10*OD)

MSmiss = error value assigned to a miss (in milliseconds) (259.5-10*OD extended to 259.5-12*OD)

For the miss value, I extended the pattern already present in the windows separating 300s from 100s and 100s from 50s.

For ease of calculation and to simplify, I will assign some values:

p1 = #100s/#circles

p2 = #50s/#circles

p3 = #misses/#circles

V = (1-p1-p2-p3)*(MS300^2)+p1*(MS100^2)+p2*(MS50^2)+p3*(MSmiss^2)

This looks much easier to calculate.
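To make the calculation concrete, here is a minimal Python sketch of the simplified variance formula, using the hit-window definitions from the key above (the function and parameter names are mine, not part of any existing pp calculator):

```python
def variance(od, n_circles, n100, n50, nmiss):
    """Maximum hit-error variance implied by the judgement counts.

    Hit windows follow the post's definitions; the miss value
    (259.5 - 12*OD) extends the existing window pattern.
    """
    ms300 = 79.5 - 6 * od    # widest error still judged a 300
    ms100 = 139.5 - 8 * od   # widest error still judged a 100
    ms50 = 199.5 - 10 * od   # widest error still judged a 50
    msmiss = 259.5 - 12 * od  # error value assigned to a miss

    p1 = n100 / n_circles    # proportion of 100s
    p2 = n50 / n_circles     # proportion of 50s
    p3 = nmiss / n_circles   # proportion of misses

    return ((1 - p1 - p2 - p3) * ms300**2
            + p1 * ms100**2
            + p2 * ms50**2
            + p3 * msmiss**2)
```

For example, an SS on OD 10 gives a variance of (79.5 - 60)^2 = 380.25, since every hit is assumed to sit at the edge of the 300 window.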

Note that both of these use only information already present in the pp calculator (the number of 100s, 50s, and misses).

From here, we can calculate an effective OD.

sqrt(V) = 79.5-6*OD_effective

OD_effective = 13.25 - sqrt(V)/6

You can stop here, plug the effective OD into the acc pp equation, and drop the (acc)^24 term that weights acc if you want.

1.52163^(OD_effective)*(n/1000)^0.3
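Putting the two steps together, a sketch of the effective-OD substitution might look like this (again, names are mine; the 1.52163 base and the (n/1000)^0.3 length term are taken from the existing acc pp equation as quoted above):

```python
import math

def effective_od(v):
    """Solve sqrt(V) = 79.5 - 6*OD_effective for OD_effective."""
    return 13.25 - math.sqrt(v) / 6

def acc_pp(od_eff, n):
    """Existing acc-pp shape with the effective OD substituted in;
    the (acc)^24 weighting term is dropped as suggested above."""
    return 1.52163 ** od_eff * (n / 1000) ** 0.3
```

With the SS-on-OD-10 example (V = 380.25), sqrt(V) = 19.5 and the effective OD comes back as exactly 10, as it should for a perfect play.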

I used Lagrange multipliers on random samples of maps and found that Tom's OD-to-acc_pp relationship (1.52163^OD) is not far off. An exponential function was almost always the best fit, and the bases I got were 1.5-1.6, very close to Tom's value.

However, I decided to take it a step further, because why not.

It is much harder to acc a 5-minute map than a 1-minute map, so a length bonus is in order. I believe standard error is an adequate metric for this. The more circles there are, the harder the map is to acc (this is where consistency comes in), and the higher the OD, the harder it is to acc those circles. The standard error is the square root of the variance (it's not a standard deviation because the data isn't a continuous distribution) divided by the square root of the number of circles. The length bonus would be roughly proportional to the inverse of the standard error. This is also ideal because as n grows larger, the amount of bonus you get per circle should decrease: the difference between 100 and 200 circles is a lot more impressive than the difference between 900 and 1000 circles.

I did this in a simple way. I replaced the (n/1000)^0.3 with (Standard Error)^(-0.6).

This works out to (sqrt(n)/sqrt(V))^0.6 = (sqrt(n)/(79.5-6*OD_effective))^0.6 = (n^0.3)/(79.5-6*OD_effective)^0.6

To keep the output on a similar scale to the original model, I ran a regression against it, and this is the model I got:

8.9293*1.3135^(OD_effective) * (sqrt(n)/(79.5-6*OD_effective))^0.6
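The regressed model above can be sketched the same way (the 8.9293 and 1.3135 constants come straight from the regression quoted above; the function name is mine):

```python
import math

def acc_pp_with_length(od_eff, n):
    """Regressed model: the inverse standard error,
    (sqrt(n) / (79.5 - 6*OD_effective))^0.6, replaces (n/1000)^0.3."""
    return (8.9293 * 1.3135 ** od_eff
            * (math.sqrt(n) / (79.5 - 6 * od_eff)) ** 0.6)
```

Note the diminishing returns: because n only enters as n^0.3, quadrupling the circle count from 100 to 400 raises the length factor by about 52%, not 300%.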

This is more of a proof of concept than anything, hope you like it.

I might make a more intricate solution in the future (if I'm bored enough lol).