Full Tablet wrote:
Looking at the game code, it is defined by the ratio of 300s to the total number of hits, which is calculated with single-precision numbers (if the ratio is equal to 1.0, then it is an SS)
If you carefully examine the game code, you'll see that it's defined as (to paraphrase slightly):
float Accuracy =
(float)(Count50 * 50 + Count100 * 100 + Count300 * 300) / (TotalHits * 300);
where CountX and TotalHits are 32-bit signed integers.
As such, it first performs the integer arithmetic for the numerator and the denominator, then converts each separately from a 32-bit signed integer to a single-precision float before dividing.
The smallest possible difference between the numerator and the denominator is 200 (i.e. a single 100 with the rest 300s). Thus, for the numerator and denominator to round to the same float, they must be large enough that the gap between adjacent single-precision floats exceeds 200 — and since those gaps are powers of two, that means a gap of at least 256.
The gap between adjacent floats only reaches 256 at 2^31, which is just past the maximum value of a signed 32-bit integer (2^31 − 1). As such, the numerator and denominator overflow before single-precision floats become imprecise enough to round them to equality.
Thus the internal representation of accuracy can never equal 1 without hitting a 300 on every note.