
Complexity: Variability, Reading, and Angularity

Vuelo Eluko

Narrill wrote:

I'm not saying pattern detection is a nice bonus we can easily sacrifice, I'm saying it isn't necessary at all.
it just goes back to what was mentioned earlier; some people are better at certain patterns than others and there's no measurable difficulty there.
Ethelon

eeezzzeee wrote:

These are factors that are all pretty much impossible to rate the difficulty of, because they are things that some players might find hard and others will not, whereas something like the distance between notes is a factor that is the same for everyone

Riince wrote:

Narrill wrote:

I'm not saying pattern detection is a nice bonus we can easily sacrifice, I'm saying it isn't necessary at all.
it just goes back to what was mentioned earlier; some people are better at certain patterns than others and there's no measurable difficulty there.
I don't understand why this is even brought up. Yes some people are better at certain patterns than others. So they're better at reading it and should be rewarded. Just because different people are better or worse at something doesn't mean that there is no measurable difficulty. It means that people aren't uniform in ability.
If reading affects the difficulty of the map (which it does), then it should be considered in the difficulty rating.

And from what I understand, Narrill isn't saying that it's unnecessary to detect the difficulty of patterns, only that you don't need a system to detect patterns to determine difficulty.
Full Tablet

Narrill wrote:

Thing is, we don't really have to detect patterns; that the patterns repeat is incidental to their difficulty. What we're aiming for is making sure a map with repeated patterns is shown to be less difficult than one with non-repetitive patterns of equal relative spacing and speed. We don't need to look for the repetition, we need to look for the reduction in complexity caused by the repetition.
You can calculate a weighting value for each note based on the repetitions found.

Possible way to do this, using "abcdabcef abcdabcef" as an example: (Starting from the matrix, this process would take O(n^2) time if checking for all patterns)

The first 4 elements ("abcd") have a weight of 1, since they are new elements in the list.

The next 3 elements would have a lower weighting (since they repeat the pattern "abc"); they have a value of
0.5^(1/(Distance_Between_Patterns)^2) = 0.5^(1/16) = 0.957603

The next 2 elements are new, so they have a value of 1.

All of the next 9 elements have their weight multiplied by a factor (since they repeat the first 9 elements): 0.5^(1/9^2) = 0.991479
Of those 9 elements, the first 3 are multiplied by an additional factor because of the repeating "abc" pattern outside the 9-element repetition: 0.5^(1/5^2) = 0.972655
And the second "abc" pattern within the 9 elements is multiplied by a factor because of the other "abc" patterns: 0.5^(1/4^2)*0.5^(1/12^2) = 0.953684

So in the end the weightings of the elements are:
{1, 1, 1, 1, 0.957603, 0.957603, 0.957603, 1, 1, 0.964367, 0.964367, 0.964367, 0.991479, 0.945558, 0.945558, 0.945558, 0.991479, 0.991479}
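For concreteness, the discount rule above can be sketched in a few lines of Python. This is only the weighting formula; it assumes the pattern detection has already found each repetition and its distance:

```python
def repetition_factor(distance):
    """Discount factor for a note that repeats an earlier pattern.

    `distance` is the number of elements between the earlier occurrence
    and the repetition; repeats that are further away are discounted
    less (factor closer to 1).
    """
    return 0.5 ** (1.0 / distance ** 2)

# "abc" repeated 4 elements after its first occurrence:
print(round(repetition_factor(4), 6))   # 0.957603

# A 9-element pattern repeated 9 elements later:
print(round(repetition_factor(9), 6))   # 0.991479

# Factors stack multiplicatively when a note belongs to
# several repetitions at once:
print(round(repetition_factor(5) * repetition_factor(9), 6))  # 0.964367
```

The multiplicative stacking is what produces the combined values like 0.964367 in the final list.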
nrl

Ethelon wrote:

And from what I understand, Narrill isn't saying that it's unnecessary to detect the difficulty of patterns, only that you don't need a system to detect patterns to determine difficulty.
...

I'm saying there's no need to parse the map looking for repeated patterns. If intramap variation is evaluated properly the effect of the repeated patterns on overall complexity will already be accounted for. The entire idea of evaluating discrete "patterns" is misguided.

Full Tablet, how would you then apply the weighted note values to come up with an overall complexity/difficulty metric? We want to reduce the complexity/difficulty of the sequence to a single number.
Topic Starter
Fraudo

Ethelon wrote:

eeezzzeee wrote:

These are factors that are all pretty much impossible to rate the difficulty of, because they are things that some players might find hard and others will not, whereas something like the distance between notes is a factor that is the same for everyone

Riince wrote:

it just goes back to what was mentioned earlier; some people are better at certain patterns than others and there's no measurable difficulty there.
I don't understand why this is even brought up. Yes some people are better at certain patterns than others. So they're better at reading it and should be rewarded. Just because different people are better or worse at something doesn't mean that there is no measurable difficulty. It means that people aren't uniform in ability.
If reading affects the difficulty of the map (which it does), then it should be considered in the difficulty rating.
I'm glad someone gets it. The whole point behind this isn't saying "hey, this is universally difficult for every single person"; otherwise, like I've said, the current system is horribly flawed, as we reward streams and jumps, and different people are better at different things. That will always be the case with a game. It's about rewarding things that are generally hard and giving bonuses for doing them. That doesn't mean we actually need to reward them, just provide a way to know when a map is more complex, so that people can at least get recognition for the very difficult-to-read maps they play.
Nyxa
I agree with the above post, though from what I've seen so far, everyone in this thread is trying to come up with a complicated solution to a complicated problem. I can tell you now that as long as you keep going down that path you won't find a truly effective solution. Complex problems require simplistic thinking, which is way harder than it sounds. I'll be trying to come up with some coherent ideas of my own, but I'd really like to stress that if you truly want to find a working solution, the best approach is to think as simply as you can. If you have a simple solution and something goes wrong, the fix will be simple. Fine-tuning will be simple, and adding onto the framework you created will be a lot easier than with a complex, fragile system that's hard to oversee properly.
nrl

Tess wrote:

Complex problems require simplistic thinking, which is way harder than it sounds.
Mine is a very simple solution; you're just tracking variation while gradually widening the scope. It's not unlike differential calculus if you think about it.
chainpullz
My solution is a simple solution. Teach a computer what makes maps hard and then let it go crazy. The actual implementation details might not be so nice but hey, let's not worry about that.
RaneFire
Just pointing out that it would have to be simple, because every revision to the system requires reparsing all beatmaps. The more work it has to do, the longer that will take. The more variables there are, the more tweaking has to happen, and more often.

Complexity will also inevitably require rebalancing for the current bonuses to FL and HD, considering it's able to pick up regular patterns, which are easier to memorise, and mid-stream distance spacing changes, which throw me off very much regarding aim. Not necessarily a "finger" complexity issue on that one.

Common patterns also don't necessarily mean complexity. For instance, triples, quintuples, etc. These patterns are very easy to play intuitively because they are common and obey musical notation (start/end on a main beat). When you start including doubles and quads in that though, things get tricky, but from the perspective of a computer, there is no difference other than the number of notes. We also listen to the music while playing, not something a computer can fathom.

Not going to say too much though, since I've thought about this stuff heavily and already thrown away any hope of seeing it implemented due to the sheer number of variables we would need to consider when designing a "simple" system. Collaboration is also required to determine universally difficult/easy things, since reading ability is the major difference between players, so I applaud you for the initiative of creating this thread to discuss this topic.
Topic Starter
Fraudo

RaneFire wrote:

Just pointing out that it would have to be simple, because every revision to the system requires reparsing all beatmaps. The more work it has to do, the longer that will take. The more variables there are, the more tweaking has to happen, and more often.

Complexity will also inevitably require rebalancing for the current bonuses to FL and HD, considering it's able to pick up regular patterns, which are easier to memorise, and mid-stream distance spacing changes, which throw me off very much regarding aim. Not necessarily a "finger" complexity issue on that one.

Common patterns also don't necessarily mean complexity. For instance, triples, quintuples, etc. These patterns are very easy to play intuitively because they are common and obey musical notation. When you start including doubles and quads in that though, things get tricky, but from the perspective of a computer, there is no difference other than the number of notes.

Not going to say too much though, since I've thought about this stuff heavily and already thrown away any hope of seeing it implemented due to the sheer number of variables we would need to consider when designing a "simple" system. Collaboration is also required to determine universally difficult/easy things, since reading ability is the major difference between players, so I applaud you for the initiative of creating this thread to discuss this topic.
Yes, as is, the problem is more the implementation than the theory, and people are mostly throwing out complex solutions. I personally think doing some simple checks against a list of patterns is the simplest way: give each pattern a value in a key within the algorithm, so that when you encounter a pattern it gets that value, and when you encounter it again within a certain timeframe, that value is reduced due to familiarity. This would only apply to simple patterns; for more complex patterns we can work multiplicatively, giving many sharp angles/alternations in a row a more valuable reward, and variations in tempo would be accounted for by the timeline we already have. This basically solves 90% of the issues and leaves only reading, which we can do some overlap checks for.

That's the simplest implementation I can think of, and only requires a few variables, note distances (maximum of 3 at a time for angles, more for variation, and stream detections), angles between those 3 notes (angle + distance/time would be preferable to measure the difficulty of stopping and going the other direction, whether the distance/time before or after the turn is more difficult isn't known to me), and the time between notes (again, 3 at a time for angles, and more for variation).
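A minimal sketch of the angle part of that (the positions are hypothetical coordinates; the degree convention, where 180 means the cursor continues straight on, is my own choice):

```python
import math

def turn_angle(p0, p1, p2):
    """Angle at p1 formed by three consecutive notes p0 -> p1 -> p2,
    in degrees. 180 means the cursor carries straight on; small angles
    mean a sharp reversal (stopping and going the other direction)."""
    ax, ay = p0[0] - p1[0], p0[1] - p1[1]
    bx, by = p2[0] - p1[0], p2[1] - p1[1]
    dot = ax * bx + ay * by
    norm = math.hypot(ax, ay) * math.hypot(bx, by)
    # clamp to guard against floating-point drift outside [-1, 1]
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

print(turn_angle((0, 0), (100, 0), (200, 0)))  # 180.0: straight line
print(turn_angle((0, 0), (100, 0), (100, 100)))  # 90.0: right-angle turn
```

Combining this angle with the distance/time of the moves before and after the turn would give the angle + distance/time measure described above.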

At the heart of it all, that's the simplest and easiest solution that gives the most reward for the least calculations so far as I can tell. It's easy to tune, because everything is based on what we already have, with the addition of angles. Then, we just log the angles, spit out recognized patterns, value them with comparisons to nearest patterns, spit out extra angles, value them as well, similar angles together do decrease value slightly in this case, boom, we have a working implementation.

Of course, it's not quite as easy as I make it sound, but it would still be heaps easier than any other proposed implementation. No, I am not aiming for a perfect system with that suggestion, but a working one that can be tuned in a way that is understandable to the players, so that they know WHY a map is considered difficult, not just "because".
chainpullz
The actual time complexity for the regular expression approach is O(mn), where n is the number of objects in the beatmap and m is the total size of the dictionary. In theory, if we were to go that route, it would be fairly trivial to update the difficulty system. As already stated though, coming up with a good alphabet and dictionary is time-consuming from a development and testing standpoint.
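A naive version of that dictionary scan might look like the following sketch. The alphabet encoding and the pattern values are entirely hypothetical, just to show the O(mn) shape:

```python
# Hypothetical dictionary: encoded pattern -> base difficulty value.
PATTERN_VALUES = {
    "JJJ": 2.0,   # e.g. three consecutive jumps
    "SSS": 1.5,   # e.g. a short stream segment
    "JSJ": 2.5,   # e.g. a jump-stream-jump transition
}

def scan(sequence, dictionary):
    """For each position in the encoded note sequence, try every
    dictionary pattern: O(m*n) for n symbols and dictionary size m."""
    hits = []
    for i in range(len(sequence)):
        for pattern, value in dictionary.items():
            if sequence.startswith(pattern, i):
                hits.append((i, pattern, value))
    return hits

print(scan("JJJSSSJSJ", PATTERN_VALUES))
```

Running the scan on the toy sequence "JJJSSSJSJ" finds one hit per pattern; a real alphabet would encode spacing and timing classes, which is the hard design work mentioned above.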
Topic Starter
Fraudo
Way more practical than machine learning, 1000000 check lists, and mapwide variation balancing.
dung eater
You don't really need the mass for anything here, it just made it easier for me to think about.

You could give the cursor mass, then calculate how much force is required to change its trajectory. You'd have to adjust for back-and-forth motions that don't feel as hard (some kind of rubberband reduction? D:). You could calculate forward and sideways components (relative to the direction the cursor was coming from) independently for jump difficulty; adding them up would value square patterns over anything else, for example.

To detect 'flow' you could look at the speed component of the current move in the direction of the last movement, relative to the speed of the last move. Think of it as kinetic energy saved, for positive values. The change in that value, and the change of the change (derivative?), could be useful.
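A rough sketch of that flow component (the positions and times are hypothetical; units are whatever the coordinates and timestamps use):

```python
import math

def flow_values(positions, times):
    """For each move, the component of the current velocity along the
    direction of the previous move ("kinetic energy saved").

    Positive values mean the cursor keeps moving the same way;
    negative values mean a reversal.
    """
    out = []
    for i in range(2, len(positions)):
        # previous and current displacement vectors
        px = positions[i - 1][0] - positions[i - 2][0]
        py = positions[i - 1][1] - positions[i - 2][1]
        cx = positions[i][0] - positions[i - 1][0]
        cy = positions[i][1] - positions[i - 1][1]
        dt = times[i] - times[i - 1]
        plen = math.hypot(px, py)
        if plen == 0 or dt == 0:
            out.append(0.0)
            continue
        # project the current velocity onto the previous direction
        out.append((cx * px + cy * py) / plen / dt)
    return out

# Continuing straight keeps flow positive; a reversal makes it negative.
print(flow_values([(0, 0), (100, 0), (200, 0), (100, 0)], [0, 1, 2, 3]))
```

Differencing this list once (and twice) would give the "change and change of the change" suggested above.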

To devalue clustered-up notes and zigzag pattern streams, where the actual movement of the cursor is much less than the distance from the centre of each circle to the next, you could subtract a portion of the circle size from each jump distance. If you make that portion relative to the flow value, you could devalue patterns that don't actually move around a lot, and give positive flow (where you actually move forward) more value.

I think tapping complexity, moving the cursor, reading, and OD are all things to consider separately, unless you want really complicated solutions.

I'd love to play with these things if I were any better at coding (no idea how to read beatmap files, or any other files, for values). Hopefully there's something useful above.
Nyxa

fdbxfrodo wrote:

even a simple extra system would be better than what we have now, which is nothing.
RaneFire
This is probably as good a place as any to discuss examples. Instead of going straight for the hardest stuff, I'd like to ask you guys what you think of a map which has been bugging me for a long time. It's not a very hard map, but it's been on my practice list right from the start (3 years).

https://osu.ppy.sh/b/82148&m=0 <- Map

It uses simple patterns: singles, triples, quintuples... Yet, for some reason it plays a lot more complicated than you would think. Does anyone know why?
Drezi

fdbxfrodo wrote:

even a simple extra system would be better than what we have now, which is nothing.
Exactly.

I said this regarding rhythm complexity issues as well: having a rough, not-100%-perfect system is a lot better than having nothing when there's obviously a difference in difficulty. On a larger scope, the whole pp system isn't perfect either, but it's obviously still miles closer to showing the right picture of difficulty than having no differentiation between maps at all. Same principle, just more apparent in this case.
Drezi

RaneFire wrote:

https://osu.ppy.sh/b/82148&m=0 <- Map

It uses simple patterns: singles, triples, quintuples... Yet, for some reason it plays a lot more complicated than you would think. Does anyone know why?
AR8 for 195 BPM and perfect stacking because of very low stack leniency, both things we're not used to; also messy, uneven spacing in parts that are meant to be standard snap distance, and generally bad mapping: random triplets/blank beats, very inconsistent with the music.
RaneFire

Drezi wrote:

AR8 for 195 BPM and perfect stacking because of very low stack leniency, both things we're not used to; also messy, uneven spacing in parts that are meant to be standard snap distance, and generally bad mapping: random triplets/blank beats, very inconsistent with the music.
I knew the answer would be negative, but yes, the issue has to do with flow.

Calling it bad mapping is subjective. If you're going to design a system, it needs to be impartial. There's no point if it can only correctly value maps made after 2012 (5 years of maps will thus be incorrectly valued), and let's not forget the new trend starting. This was my concern right from the start, different mapping styles create differences in complexity. This is not something an algorithm can be given values for. You will only ever have an imperfect system, but hey, as you said, it's better than nothing.
Nyxa
Bad maps are harder because they're bad, not because they're hard. When a map doesn't align with the music but isn't difficult in and of itself, the map isn't hard, it's bad, and in my opinion that shouldn't give an extra reward, because it's a simple matter of following the map's rhythm rather than the song's rhythm and you should be fine. If a map is bad spacing-wise, well, first of all a system that rewards complexity would account for crazy, variable spacing, as well as for patterns that are hard to read and awkward angles; but a map being annoying does not mean that SSing it is a good performance. And that is something affected less by skill than by the kinds of maps you're used to. Someone who's used to 2012 maps will play them much better than someone who's used to 2014 maps, and often vice versa too. It's not really an aspect of complexity, so it wouldn't be included in the system.
Drezi
But these factors can be taken into account by an algorithm, and it doesn't matter whether the cause of the difficulty is a mapping style that's subjectively viewed as bad and unenjoyable, or patterns that people find fun to play.
RaneFire
You're only looking at one end of the stick. A map which is mapped very well, but checks flags with harder values of the complexity calculations, will as a result become overvalued because it's intuitive to play. It's a human variable, this complexity. This is the problem when a computer can't tell how good or bad a map is. Anyway, that's about all the "stirring of the pot" I came to do here.
Drezi
Or rather, bad maps would be undervalued, which shouldn't really be a problem; they are obviously undervalued currently as well. With that we're back to the "something's better than nothing" case.
nrl

RaneFire wrote:

yes, the issue has to do with flow.
There's a very big difference between a pattern a player finds difficult and a pattern that is inherently complex, and old maps with horrible flow definitely fall into both categories. But flow isn't some mystical thing; we can evaluate it by looking at variations in cursor speed throughout the map.
Full Tablet

RaneFire wrote:

This is probably as good a place as any to discuss examples. Instead of going straight for the hardest stuff, I'd like to ask you guys what you think of a map which has been bugging me for a long time. It's not a very hard map, but it's been on my practice list right from the start (3 years).

https://osu.ppy.sh/b/82148&m=0 <- Map

It uses simple patterns: singles, triples, quintuples... Yet, for some reason it plays a lot more complicated than you would think. Does anyone know why?
After playing it, it seems to be complicated for several reasons:

- IMO, it has several objects that don't fit well with the song (several parts feel overmapped, while some parts feel undermapped).
- Some confusing patterns (1/2 sliders that look like 1/4 sliders based on the SV of most of the song).
- The AR is 8, which makes reading harder IMO, especially since it's hard to rely on the music for reading. (I consistently get FCs with ~95% acc with AR9; even passing the song is hard for me with AR8.)
- 1/4 Triples followed by a few 1/2 circles followed by 1/4 Triples (this pattern might be hard if you aren't used to switching between 1/2s and 1/4s, without a rest, quickly several times in a short period of time).

Much of the reason has to do with how the map fits the song (which would be very complex and possibly unreliable to calculate with an algorithm, also, there isn't much sense in giving a bonus to badly made maps).

The high note density caused by the low AR and high note frequency can be considered as a factor in the algorithm (possibly scaling with speed strain as well, since 7 notes on the screen at the same time are much more manageable in a slow map than in a fast map). The note density factor could scale with a curve that increases slowly for low values (since IMO there isn't much difference between nearly zero density and 4 notes on the screen at the same time) and faster for higher values.
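A minimal sketch of such a density factor. The preempt window, the 4-note knee, and the curve constants here are all assumptions for illustration, not tuned values:

```python
def notes_on_screen(hit_times, t, preempt_ms):
    """Count notes visible at time t, given the AR preempt window
    (how long before its hit time a note appears)."""
    return sum(1 for h in hit_times if t <= h <= t + preempt_ms)

def density_factor(density):
    """Hypothetical convex curve: nearly flat below ~4 visible notes,
    growing faster above that, as suggested above."""
    excess = max(0.0, density - 4.0)
    return 1.0 + 0.05 * excess ** 1.5

print(notes_on_screen([100, 200, 300, 900], 0, 450))  # 3 notes visible
print(density_factor(3))   # 1.0: low density adds nothing
print(density_factor(9))   # noticeably higher
```

Scaling the result by a speed-strain term would capture the point that seven simultaneous notes are worse on a fast map than on a slow one.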

The difficulty of switching between 1/2 and 1/4 has to do with technique. This could be calculated based on rhythm complexity (how repetitive the rhythm is) and speed strain (at low speed, a complex rhythm doesn't add much to the difficulty of the map). A problem with this approach is that some rhythms might be subjectively underrated. For example, for many people a pattern with 1/2 notes and 1/4 triples might become even harder if you replace the 1/4 triples with 1/3 triplets, even though the complexity of the pattern would be the same and the speed strain lower; the difficulty here lies mainly in how unusual 1/3 triplets are compared to 1/4 triples, not in any strong technical reason they are harder.
xxdeathx
Readability can be different for each person, and there's no way to calculate it for everyone, as it's largely mental rather than physical like the factors the current pp calculation system is based on. The best thing you can do is hope the mappers make maps that are easy to read; that maximizes the possibility of full comboing a song of a given star rating.
chainpullz
A really simplistic solution would be to add a polling system where players can voice their opinion on whether a map is rated too high or too low, and to run a pass over maps every so often, adjusting by a magnitude relative to the number of votes (you would probably want some sort of weighting system so people with low rank don't contribute significantly to difficult maps like Big Black). Thus, the closer a map gets to where people think it should stand, the fewer people will bother to vote and the less its difficulty will change on each pass. I mean, we aren't really going for a super accurate addition, and we want the difficulty of maps we think are under/overrated to be tuned accordingly, so I think this would appease the concerns of the people?
Full Tablet

Narrill wrote:

Full Tablet, how would you then apply the weighted note values to come up with an overall complexity/difficulty metric? We want to reduce the complexity/difficulty of the sequence to a single number.
After having determined the complexity of each note, calculate an overall complexity value.

One of the simplest ways to calculate this value is ranking the complexity values and summing them weighted according to their rank (the highest value being worth 100%, the next one 99%, etc.), similar to how the overall pp value of players is calculated. The problem with this approach is that the maximum complexity a note can have is 1, so even in the most complex case (all notes in the map being different), the total is bounded by a finite value even with an infinite number of notes. If the weighting factor is high (close to 1), long maps end up valued too high compared to short maps; if it is lower, the limit is approached too quickly.
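A quick sketch showing that saturation problem, using a decay of 0.99 per rank (as in the "100%, 99%, ..." weighting; player pp uses 0.95):

```python
def rank_weighted_sum(values, decay=0.99):
    """Sum complexity values weighted by rank, highest first,
    like the overall pp total of a player."""
    return sum(v * decay ** i
               for i, v in enumerate(sorted(values, reverse=True)))

# Even infinitely many maximum-complexity notes cannot exceed
# 1 / (1 - decay) = 100, which is the flaw described above:
print(round(rank_weighted_sum([1.0] * 500), 2))   # ≈ 99.34
print(round(rank_weighted_sum([1.0] * 5000), 2))  # ≈ 100.0, the cap
```

Adding a note always increases the sum, but past a few hundred notes the increase is negligible, regardless of how complex the new notes are.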

So, another function should be used. It should have the following properties:
- It should always give a higher or equal value, if an extra value is added to the list.
- If the list of elements has several high complexity values, then adding a low complexity value to the list should barely increase the overall complexity of the map.
- The overall complexity should tend towards infinity with an infinite amount of notes with maximum complexity, while the overall complexity increase for adding a maximum complexity note should tend towards 0.

A function that takes as input the list of N values c_1, ..., c_N and returns the overall complexity could be:
The real positive value of "a" that solves the equation Erf(c_1*a) * Erf(c_2*a) * ... * Erf(c_N*a) = 1/2
With the maximum complexity case (all values in the list being 1), the solution is a = InverseErf[2^(-1/N)]
The value of the function can be calculated with Brent's method http://en.wikipedia.org/wiki/Brent%27s_method, with initial guesses a=0 and b=InverseErf[2^(-1/N)]
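Assuming the equation being solved is prod_i Erf(c_i * a) = 1/2 (a reconstruction consistent with the stated bracket endpoint b = InverseErf[2^(-1/N)], since all c_i = 1 gives Erf(a)^N = 1/2), here is a dependency-free sketch using bisection in place of Brent's method:

```python
import math

def overall_complexity(values, tol=1e-12):
    """Solve prod(erf(c_i * a)) = 1/2 for a > 0.

    The product is increasing in a, so any bracketing root finder
    works; Brent's method converges faster, but bisection keeps this
    sketch free of external libraries.
    """
    def f(a):
        p = 1.0
        for c in values:
            p *= math.erf(c * a)
        return p - 0.5
    lo, hi = 0.0, 1.0
    while f(hi) < 0:        # expand until the root is bracketed
        hi *= 2
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# One maximally complex note: erf(a) = 1/2  ->  a ≈ 0.4769
print(round(overall_complexity([1.0]), 4))
# Adding notes (even low-complexity ones) can only increase the value:
print(round(overall_complexity([1.0, 1.0, 0.3]), 4))
```

This satisfies the three listed properties: the value never decreases when a note is added, low-complexity notes barely move it once several high values are present, and it grows without bound as maximum-complexity notes accumulate.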

So the overall algorithm is (note that I changed how the matrix and the complexity list is calculated, to give a relatively lower complexity value to patterns with long pattern repetitions):

The values in the list of elements could be timing between objects (for Rhythm Complexity), angles (for Complexity of Spacing Patterns caused by Variability), distance moved divided by the timing between objects (Complexity caused by changes in the Distance Snap).

Some examples with Rhythm Complexity:
https://osu.ppy.sh/b/129891&m=0 2.14467
https://osu.ppy.sh/b/297463&m=0 2.1152
https://osu.ppy.sh/b/312959&m=1 2.05506
https://osu.ppy.sh/b/443272&m=0 1.9342
https://osu.ppy.sh/b/323875&m=0 1.91054
https://osu.ppy.sh/b/323769&m=0 1.93728
https://osu.ppy.sh/b/152078&m=1 2.25295
https://osu.ppy.sh/b/58063&m=0 1.84154
https://osu.ppy.sh/b/203906&m=0 2.13684
https://osu.ppy.sh/b/264090&m=0 1.96381

Do you think those values reflect the rhythm complexity of the maps?
chainpullz
Considering how much of Freedom Dive is just deathstreams, it seems a little inflated (that's not to say the gallops don't contribute much difficulty), at least compared to something like Night of Knights.
nrl

xxdeathx wrote:

Readability can be different for each person, and there's no way to calculate it for everyone, as it's largely mental rather than physical like the factors the current pp calculation system is based on. The best thing you can do is hope the mappers make maps that are easy to read; that maximizes the possibility of full comboing a song of a given star rating.
No, readability comes from note density and note overlap. You can argue that people deal with them differently, but note density and note overlap can definitely be evaluated objectively.
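Both quantities are straightforward to compute. For overlap, a minimal sketch (the equal-radius assumption and the linear falloff are arbitrary illustrative choices):

```python
import math

def overlap_fraction(p1, p2, radius):
    """Rough visual-overlap measure for two circles of equal radius:
    1.0 when perfectly stacked, falling linearly to 0.0 once the
    centres are two radii apart (circles just touching)."""
    d = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return max(0.0, 1.0 - d / (2.0 * radius))

print(overlap_fraction((0, 0), (0, 0), 32))    # 1.0: fully stacked
print(overlap_fraction((0, 0), (64, 0), 32))   # 0.0: just touching
```

Summing this over pairs of notes visible in the same AR window would give one objective reading-difficulty input.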
Nyxa

chainpullz wrote:

Considering how much of freedom dive is just deathstreams it seems a little bit inflated (that's not to say the gallops don't contribute much difficulty) at least compared to something like Night of Knights.
Night of Knights has a higher star rating than Freedom Dive and is harder due to being an older map and not due to actual pattern difficulty. Older maps aren't supposed to weigh much anyway. Also you can't really call a map with repeated 222bpm spaced deathstreams "inflated". That's like saying Remote Control DT is inflated. It's not. Koigokoro is inflated, not Freedom Dive. Keep your mind on track, please, or you won't really help the thread find a proper solution.
nrl

Tess wrote:

Night of Knights has a higher star rating than Freedom Dive and is harder due to being an older map and not due to actual pattern difficulty. Older maps aren't supposed to weigh much anyway. Also you can't really call a map with repeated 222bpm spaced deathstreams "inflated". That's like saying Remote Control DT is inflated. It's not. Koigokoro is inflated, not Freedom Dive. Keep your mind on track, please, or you won't really help the thread find a proper solution.
Are you serious? In order:

1. Pattern homogeneity is a huge component of pattern difficulty, and the older the map the less homogeneous its patterns are likely to be. Older maps are often far more difficult than modern maps of similar note density for this reason. If anything they're severely underrated right now.
2. 222bpm deathstreams are the opposite of complexity. Are they difficult? Absolutely, but they're not complex. The only thing remotely complex about them is the variable spacing, and even that isn't so bad since the variation is piece-wise rather than continuous.
3. Remote Control DT and Koigokoro DT are inflated. In fact they're very heavily inflated. Freedom Dive is also inflated, albeit not to the same extent due to the fact that the non-deathstream patterns have a decent bit of complexity to them.

chainpullz was giving Full Tablet feedback on his evaluation method, which is the only method to have produced tangible results thus far. How on earth is that not productive?
chainpullz
We are also talking about the [solo] difficulty that is rated at 5 stars even though it is very easily a 6* map. Please read the thread before you flame people.

We are talking specifically about complexity and not overrated or underrated maps. There just happens to be some overlap in these topics since complex maps tend to be vastly underrated.

Freedom Dive is rated at 7 stars because of physical aspects. The complexity elements appear in so few places the difficulty shouldn't change much even with complexity added in (Narrill did a good job of explaining so please do read what he said above).
RaneFire

Narrill wrote:

2. 222bpm deathstreams are the opposite of complexity. Are they difficult? Absolutely, but they're not complex. The only thing remotely complex about them is the variable spacing, and even that isn't so bad since the variation is piece-wise rather than continuous.
Also good to remember that the current pp system already accounts for spaced streams, and it's one of the reasons some maps can become inflated. If you don't think this is the case, you aren't looking. Variable spacing is also not exactly what Freedom Dive uses; it increments, but not frequently. Here is a better example of variable spacing: https://osu.ppy.sh/b/338544

As for Full_Tablet's examples... I don't think that's accurate at all.

Night of Knights [SOLO] is difficult not because of the patterns used, but actually due to similar reasons Full_Tablet stated for the example I previously mentioned. There's a ton of reasons aside from actual pattern complexity, which make the map more complex than it seems.

Koigokoro [Delis' Insane] is not a complex map at all. I think angularity should definitely consider the lack of variation, because most of Delis' Insane is more of the same low-mid range jump patterns, which use only acute angles (star patterns). There's nothing complex about repetition regarding aim; however, the strain of aiming repetitive jumps (the peak of the strain graph held for longer) is already accounted for in the current pp system, and that's why it's overvalued despite lacking complexity.
nrl

RaneFire wrote:

Also good to remember that the current pp system already accounts for spaced streams
It accounts for stream spacing in general, but it doesn't account for spacing variation.
felicitousname
But in the end, who's to say that complexity should be weighed heavier than pure physical difficulty?

No matter what you attempt, there is no such thing as a perfect rating algorithm. There might be one that feels "accurate" to many or even most people. Overall, ppv2 already satisfies most people, I think there probably would be no point making fundamental changes to the underlying implementation. It would probably be best to tweak the algorithm instead of ripping it out and replacing it with something completely different.

I think most people can agree that the top players are rated fairly accurately.
Vuelo Eluko

felicitousname wrote:

I think most people can agree that the top players are rated fairly accurately.
compared to ppv1 anyway
Topic Starter
Fraudo

felicitousname wrote:

But in the end, who's to say that complexity should be weighed heavier than pure physical difficulty?

No matter what you attempt, there is no such thing as a perfect rating algorithm. There might be one that feels "accurate" to many or even most people. Overall, ppv2 already satisfies most people, I think there probably would be no point making fundamental changes to the underlying implementation. It would probably be best to tweak the algorithm instead of ripping it out and replacing it with something completely different.

I think most people can agree that the top players are rated fairly accurately.
No one said anything about this needing to even be part of the current pp system, or it replacing the current algorithm. At most this would be a tweak to the current system, and at least it would be a way to show which maps are difficult to play based on mental factors. Beyond that, it could be tied into something along the lines of charts, so we can see who the best readers of difficult maps are, and not just have the "HEY LOOK I CAN PLAY FAST AND STUFF" top ranks. I'm not saying the top 10 players are inaccurately ranked, or anything of the sort, just that the things that make the game difficult are not always rewarded, or even acknowledged.
RaneFire

felicitousname wrote:

But in the end, who's to say that complexity should be weighed heavier than pure physical difficulty?
No one did.

A complexity algorithm will tweak existing values, not change them up completely. Physical difficulty will still be rewarded much more, but at least some maps will be given presence, instead of sitting down at the bottom of your performance list.
nrl

fdbxfrodo wrote:

No one said anything about this needing to even be part of the current pp system, or it replacing the current algorithm
I'll bite that bullet. Complexity evaluation needs to be a part of the pp system.
Clappy
I am a little tardy to the party, but I'd love to put my 2 cents in.

https://osu.ppy.sh/s/132586 (Move That Body)

Vs

https://osu.ppy.sh/s/111914 (V Is For)

Both around 5* yet I believe that V Is For is subjectively easier and I think most would agree.

Few key points I would like to highlight:
V is for is around 5* for its higher note spacing and (length of map) <- not sure if that is true
Move that body is around 5* because the shit is a bitch to read.

The question I'd like to ask is: how did these maps end up at the same star rating, when most would agree that the complex patterns of Move That Body are way harder than the spaced jumps of V Is For?