
Complexity: Variability, Reading, and Angularity

Topic Starter
Fraudo

chainpullz wrote:

@Narrill: This would be the intuitive approach but it has terrible time complexity if I'm understanding what you are suggesting correctly.
The thing is, it only needs to be done once. And it doesn't even need to be that grand and complex; it can be a simple implementation that doesn't account for every single thing he's suggesting, because even a simple extra system would be better than what we have now, which is nothing.
chainpullz
You could say the same thing about encryption. You only need to factor a really large number once. Given billions of dollars and several years' time you can crack a single encryption. It's totally doable. Is it practical? No.

There are 2^(n-1) ways to partition a set of n objects (think of cutting a string). Thus any complexity check involving this sort of division is at best O(2^n). As soon as we hit 40 objects we are already approaching roughly a day to check for complexity. Even with immense optimization, running this with 1000 objects is still pretty impractical.

Edit: If we ignore globally repeated patterns and only focus on adjacent repeated patterns it is a more feasible time complexity but not as accurate of a measure.
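For a concrete sense of the blowup, a minimal Python sketch (purely illustrative, not anything from osu!) that enumerates every contiguous partition by deciding, for each of the n-1 gaps between adjacent objects, whether or not to cut there; the count doubles with every added object:

from itertools import product

def contiguous_partitions(objects):
    # One boolean per gap between adjacent objects: cut there or not.
    n = len(objects)
    for cuts in product([False, True], repeat=n - 1):
        parts, start = [], 0
        for i, cut in enumerate(cuts, start=1):
            if cut:
                parts.append(objects[start:i])
                start = i
        parts.append(objects[start:])
        yield parts

# len(list(contiguous_partitions("abcd"))) == 8 == 2^(4-1)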
Topic Starter
Fraudo

chainpullz wrote:

You could say the same thing about encryption. You only need to factor a really large number once. Given billions of dollars and several years' time you can crack a single encryption. It's totally doable. Is it practical? No.

There are 2^(n-1) ways to partition a set of n objects (think of cutting a string). Thus any complexity check involving this sort of division is at best O(2^n). As soon as we hit 40 objects we are already approaching roughly a day to check for complexity. Even with immense optimization, running this with 1000 objects is still pretty impractical.
And like I said, it doesn't need to be on quite that scale, because we don't need a perfect system that accounts for every single object and variation in existence, only the relevant ones that are close together, and even that is the upper bound. Just a simple system would be better than what we have now.
nrl
Assuming we're only checking position and timing, we're looking at roughly a million comparisons for a map with 1000 objects (note that this isn't 1000 combo), and we're talking numerical comparisons. It's certainly expensive, but it's not like we have to do every pass. Suppose we cut the element count in half for each iteration.
chainpullz
A million comparisons isn't expensive if the comparisons run in constant time (for some hopefully small constant). It's the actual iteration process that is a concern, because there are a lot of ways we can go about partitioning, and then a lot of different ways we can go about comparing partitions (if we only check same-size adjacent parts we cut down on time significantly, but we aren't as thorough). It seems like we probably only care about comparing same-size parts, so that alone cuts down on time dramatically, but even so, unless we establish something like only checking adjacent parts, it's going to take too long (restricting to adjacent parts would still take a while, but would likely be practical).
nrl
I don't think it's terribly necessary to compare non-adjacent parts. And I'm not really sure what you mean by size. The partitions would always have identical object counts.

I'll explain a bit more clearly. We're comparing objects from left to right. The first object is compared to the second, the second to the third, etc., until we reach the next to last and last objects. Then each pair becomes an object and the comparisons are run again in the same way. Non-adjacent objects are never compared, and I don't think they'd need to be.
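A rough Python sketch of that left-to-right pass as I read it, assuming hypothetical compare() and merge() functions (one scores how similar two adjacent elements are, the other collapses a pair into one element for the next, coarser pass):

def hierarchical_variation(objects, compare, merge):
    scores = []
    while len(objects) > 1:
        # Adjacent comparisons only: first-second, second-third, ..., next-to-last-last.
        scores.extend(compare(a, b) for a, b in zip(objects, objects[1:]))
        # Each disjoint pair becomes one element for the next pass
        # (a trailing unpaired element is simply dropped in this sketch).
        objects = [merge(a, b) for a, b in zip(objects[::2], objects[1::2])]
    return scores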
chainpullz
consider abcdabcefabcdabcef

Your method of only checking adjacent parts will find abcdabcef as a repeated pattern but it won't really pick up on the abc repetitions the way that is desired.

An example of this pattern might be something like triangle center(single) triangle (offset center double) triangle center triangle (offset center double).

Obviously the most important part is the repeated triangle followed by a jump to the center.

The point being that our greedy approach will skip over a lot of patterns we wish to pick up in the first place.
nrl
Can you demonstrate that claim in a way that makes me more inclined to believe it?
eeezzzeee
These are factors that are all pretty much impossible to rate the difficulty of, because these are things that some players might find hard while others will not, whereas something like the distance between notes is a factor that is the same for everyone.
Full Tablet

chainpullz wrote:

consider abcdabcefabcdabcef

Your method of only checking adjacent parts will find abcdabcef as a repeated pattern but it won't really pick up on the abc repetitions the way that is desired.

An example of this pattern might be something like triangle center(single) triangle (offset center double) triangle center triangle (offset center double).

Obviously the most important part is the repeated triangle followed by a jump to the center.

The point being that our greedy approach will skip over a lot of patterns we wish to pick up in the first place.
You can find all pattern repetitions in O(n^2) time (which is better than O(2^n)).

For n elements, have a matrix with n+1 rows and n+1 columns (with initial values all equal to zero). The 1st row corresponds to the patterns found comparing the 1st element with the rest of the elements, the 2nd row to the patterns found comparing the 2nd element with the elements that follow it, etc...

Start comparing the penultimate element with the elements that follow it: if the penultimate element matches the last element, make the value in the penultimate row, last column equal to Min( Distance between the compared elements [1 in this case, since they are next to each other] , 1 + Value in the matrix on the next row and next column )

Then do the same comparisons with the element before the penultimate element, then the one before that, until comparing the 1st element to the rest of the elements.
Code in Wolfram Language (note that lists of elements have an index i=1 for the first element instead of i=0)
RepetitionMatrix[Elements_] :=
 Module[{TheMatrix, Len = Length[Elements], i, j},
  TheMatrix = ConstantArray[0, {Len + 1, Len + 1}];
  For[i = Len - 1, i >= 1, i--, (*Iterate for each note in the map, starting from the second to last note*)
   For[j = Len, j > i, j--, (*Iterate for notes after the current note, starting from the last note*)
    If[Elements[[i]] == Elements[[j]], (*If a note is repeated*)
     TheMatrix[[i, j]] = Min[TheMatrix[[i + 1, j + 1]] + 1, j - i];
    ]
   ]
  ];
  TheMatrix]
In the example with "abcdabcefabcdabcef", the generated matrix (values on or below the diagonal are always 0) includes, among other entries:

The 9 in the first row, tenth column, means that the first 9 elements (abcdabcef) are repeated starting from the 10th element.
The 3 in the 5th row, 10th column, means that the sequence of 3 elements starting with the 5th element is the same as the sequence of 3 elements starting from the 10th element.

If it is restricted to comparing each element with elements that are close to them (for example, only comparing the 1st element with the next 50 elements instead of every element), the algorithm can run in O(n) time.
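For reference, a rough Python port of that algorithm (my own sketch, not part of the Wolfram code above), with an optional window so each element is only compared against the next few elements, as described in the last paragraph:

def repetition_matrix(elements, window=None):
    # m[i][j] = length of the matching run starting at i and at j (0-indexed),
    # capped at j - i so a run is never counted as overlapping itself.
    n = len(elements)
    m = [[0] * (n + 1) for _ in range(n + 1)]
    for i in range(n - 2, -1, -1):              # second-to-last element down to the first
        last = n if window is None else min(n, i + 1 + window)
        for j in range(last - 1, i, -1):        # only elements after i, optionally windowed
            if elements[i] == elements[j]:
                m[i][j] = min(m[i + 1][j + 1] + 1, j - i)
    return m

# repetition_matrix("abcdabcefabcdabcef")[0][9] == 9  ("abcdabcef" repeats 9 elements later)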
nrl
Thing is, we don't really have to detect patterns; that the patterns repeat is incidental to their difficulty. What we're aiming for is making sure a map with repeated patterns is shown to be less difficult than one with non-repetitive patterns of equal relative spacing and speed. We don't need to look for the repetition, we need to look for the reduction in complexity caused by the repetition.
chainpullz
I was merely demonstrating that we don't get pattern detection for free if we only check adjacent parts. I never claimed that checking for patterns takes at best O(2^n); I claimed that doing so by comparing partitions does. As Narrill said, finding recurring substrings is only necessary so that we don't label some maps more difficult than they should be, as that in turn makes some maps appear underrated.
Vuelo Eluko
full tablets posts always make my head hurt
i thought i was done with this shit after high school, I guess my math teacher was right...
nrl
I'm not saying pattern detection is a nice bonus we can easily sacrifice, I'm saying it isn't necessary at all.
Vuelo Eluko

Narrill wrote:

I'm not saying pattern detection is a nice bonus we can easily sacrifice, I'm saying it isn't necessary at all.
it just goes back to what was mentioned earlier; some people are better at certain patterns than others and there's no measurable difficulty there.
Ethelon

eeezzzeee wrote:

These are factors that are all pretty much impossible to rate the difficulty of, because these are things that some players might find hard while others will not, whereas something like the distance between notes is a factor that is the same for everyone.

Riince wrote:

Narrill wrote:

I'm not saying pattern detection is a nice bonus we can easily sacrifice, I'm saying it isn't necessary at all.
it just goes back to what was mentioned earlier; some people are better at certain patterns than others and there's no measurable difficulty there.
I don't understand why this is even brought up. Yes some people are better at certain patterns than others. So they're better at reading it and should be rewarded. Just because different people are better or worse at something doesn't mean that there is no measurable difficulty. It means that people aren't uniform in ability.
If reading affects the difficulty of the map (which it does), then it should be considered in the difficulty rating.

And from what I understand, Narrill isn't saying that it's unnecessary to detect the difficulty of patterns, only that you don't need a system to detect patterns to determine difficulty.
Full Tablet

Narrill wrote:

Thing is, we don't really have to detect patterns; that the patterns repeat is incidental to their difficulty. What we're aiming for is making sure a map with repeated patterns is shown to be less difficult than one with non-repetitive patterns of equal relative spacing and speed. We don't need to look for the repetition, we need to look for the reduction in complexity caused by the repetition.
You can calculate a weighting value for each note based on the repetitions found.

Possible way to do this, using "abcdabcef abcdabcef" as an example: (Starting from the matrix, this process would take O(n^2) time if checking for all patterns)

The first 4 elements ("abcd") have a weight of 1, since they are new elements in the list.

The next 3 elements would have a lower weighting (since they repeat the pattern "abc"); they have a value of
0.5^(1/(Distance_Between_Patterns)^2) = 0.5^(1/16) = 0.957603

The next 2 elements are new, so they have a value of 1.

All the next 9 elements have their weight multiplied by a factor (since they repeat the first 9 elements): 0.5^(1/9^2) = 0.991479
Of those 9 elements, the first 3 are multiplied by a factor because of the repeating abc pattern not in the repetition of 9: 0.5^(1/5^2)=0.972655
And the second "abc" pattern in the 9 elements is multiplied by a factor because of the other abc patterns: 0.5^(1/4^2)*0.5^(1/13^2) = 0.953684

So in the end the weightings of the elements are:
{1, 1, 1, 1, 0.957603, 0.957603, 0.957603, 1, 1, 0.964367, 0.964367, 0.964367, 0.991479, 0.945558, 0.945558, 0.945558, 0.991479, 0.991479}
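If I'm reading the scheme right, each repetition found at distance d multiplies the repeated note's weight by 0.5^(1/d^2). A tiny Python sketch of just that rule (the per-note distances would come from the repetition matrix and are an input here, not computed):

def note_weight(repetition_distances):
    # One factor of 0.5^(1/d^2) per earlier occurrence at distance d; no repetitions -> weight 1.
    weight = 1.0
    for d in repetition_distances:
        weight *= 0.5 ** (1.0 / d ** 2)
    return weight

# note_weight([4]) ~= 0.957603, matching the "abc" repeated 4 elements later.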
nrl

Ethelon wrote:

And from what I understand, Narrill isn't saying that it's unnecessary to detect the difficulty of patterns, only that you don't need a system to detect patterns to determine difficulty.
...

I'm saying there's no need to parse the map looking for repeated patterns. If intramap variation is evaluated properly the effect of the repeated patterns on overall complexity will already be accounted for. The entire idea of evaluating discrete "patterns" is misguided.

Full Tablet, how would you then apply the weighted note values to come up with an overall complexity/difficulty metric? We want to reduce the complexity/difficulty of the sequence to a single number.
Topic Starter
Fraudo

Ethelon wrote:

eeezzzeee wrote:

These are factors that are all pretty much impossible to rate the difficulty of, because these are things that some players might find hard while others will not, whereas something like the distance between notes is a factor that is the same for everyone.

Riince wrote:

it just goes back to what was mentioned earlier; some people are better at certain patterns than others and there's no measurable difficulty there.
I don't understand why this is even brought up. Yes some people are better at certain patterns than others. So they're better at reading it and should be rewarded. Just because different people are better or worse at something doesn't mean that there is no measurable difficulty. It means that people aren't uniform in ability.
If reading affects the difficulty of the map (which it does), then it should be considered in the difficulty rating.
I'm glad someone gets it. The whole point behind this isn't saying "Hey, this is universally difficult for every single person"; otherwise, like I've said, the current system is horribly flawed, as we reward streams and jumps, and different people are better at different things. This will always be the case with a game; it's about rewarding things that are generally hard and giving bonuses for doing them. That doesn't mean we actually need to reward them, just that we need a way to know when a map is more complex, so that people can at least get recognition for the very difficult-to-read maps they play.
Nyxa
I agree with the above post, though from what I've seen so far everyone in this thread is trying to come up with a complicated solution to a complicated problem - I can tell you now that as long as you keep going down that path you won't find a truly effective solution. Complex problems require simplistic thinking, which is way harder than it sounds. I'll be trying to come up with some coherent ideas of my own but I'd really like to stress that if you truly want to find a working solution, the best approach is to think as simply as you can. If you have a simple solution and something goes wrong, the fix will be simple. Fine-tuning will be simple and adding onto the framework you created will be a lot easier than when you come up with a complex, fragile system that's hard to oversee properly.
nrl

Tess wrote:

Complex problems require simplistic thinking, which is way harder than it sounds.
Mine is a very simple solution; you're just tracking variation while gradually widening the scope. It's not unlike differential calculus if you think about it.
chainpullz
My solution is a simple solution. Teach a computer what makes maps hard and then let it go crazy. The actual implementation details might not be so nice but hey, let's not worry about that.
RaneFire
Just pointing out that it would have to be simple, because every revision to the system requires reparsing all beatmaps. The more work it has to do, the longer that will take. The more variables there are, the more tweaking has to happen, and more often.

Complexity will also inevitably require rebalancing for the current bonuses to FL and HD, considering it's able to pick up regular patterns, which are easier to memorise, and mid-stream distance spacing changes, which throw me off very much regarding aim. Not necessarily a "finger" complexity issue on that one.

Common patterns also don't necessarily mean complexity. For instance, triples, quintuples, etc. These patterns are very easy to play intuitively because they are common and obey musical notation (start/end on a main beat). When you start including doubles and quads in that though, things get tricky, but from the perspective of a computer, there is no difference other than the number of notes. We also listen to the music while playing, not something a computer can fathom.

Not going to say too much though, since I've thought about this stuff heavily and already thrown away any hope of seeing it implemented due to the sheer number of variables we would need to consider when designing a "simple" system. Collaboration is also required to determine universally difficult/easy things, since reading ability is the major difference between players, so I applaud you for the initiative of creating this thread to discuss this topic.
Topic Starter
Fraudo

RaneFire wrote:

Just pointing out that it would have to be simple, because every revision to the system requires reparsing all beatmaps. The more work it has to do, the longer that will take. The more variables there are, the more tweaking has to happen, and more often.

Complexity will also inevitably require rebalancing for the current bonuses to FL and HD, considering it's able to pick up regular patterns, which are easier to memorise, and mid-stream distance spacing changes, which throw me off very much regarding aim. Not necessarily a "finger" complexity issue on that one.

Common patterns also don't necessarily mean complexity. For instance, triples, quintuples, etc. These patterns are very easy to play intuitively because they are common and obey musical notation. When you start including doubles and quads in that though, things get tricky, but from the perspective of a computer, there is no difference other than the number of notes.

Not going to say too much though, since I've thought about this stuff heavily and already thrown away any hope of seeing it implemented due to the sheer number of variables we would need to consider when designing a "simple" system. Collaboration is also required to determine universally difficult/easy things, since reading ability is the major difference between players, so I applaud you for the initiative of creating this thread to discuss this topic.
Yes, as is, the problem is more the implementation than the theory, and people are mostly throwing out complex solutions. I personally think just doing some simple checks and having a list of patterns is the simplest way: give each pattern a value in a key within the algorithm, so that when you encounter a pattern it gets that value, and when you encounter it again within a certain timeframe, that value is reduced due to familiarity. This would only apply to simple patterns, because if it's a more complex pattern, we can just do it multiplicatively and give many sharp angles/alternations in a row a more valuable reward, and variations in tempo would be accounted for by the timeline that we already have. This basically solves 90% of the issues and leaves only reading, which we can do some overlap checks for.

That's the simplest implementation I can think of, and only requires a few variables, note distances (maximum of 3 at a time for angles, more for variation, and stream detections), angles between those 3 notes (angle + distance/time would be preferable to measure the difficulty of stopping and going the other direction, whether the distance/time before or after the turn is more difficult isn't known to me), and the time between notes (again, 3 at a time for angles, and more for variation).
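To illustrate the angle part only, a small Python sketch (my own, not a proposal from this thread) that measures the turn at the middle note of three consecutive notes; how to weight it against distance/time is left open, as above:

import math

def turn_angle(p0, p1, p2):
    # Angle at the middle note between the vector back to the previous note
    # and the vector forward to the next note: 180 = straight line, 0 = full reversal.
    ax, ay = p0[0] - p1[0], p0[1] - p1[1]
    bx, by = p2[0] - p1[0], p2[1] - p1[1]
    norm = math.hypot(ax, ay) * math.hypot(bx, by)
    if norm == 0:
        return 180.0  # stacked notes: treat as no turn
    cos = max(-1.0, min(1.0, (ax * bx + ay * by) / norm))
    return math.degrees(math.acos(cos))

# A sharpness bonus could then be something like (180 - angle) / 180 scaled by distance/time.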

At the heart of it all, that's the simplest and easiest solution that gives the most reward for the least calculation, so far as I can tell. It's easy to tune, because everything is based on what we already have, with the addition of angles. Then we just log the angles, spit out recognized patterns, value them with comparisons to the nearest patterns, spit out the extra angles, value them as well (similar angles together decrease the value slightly in this case), and boom, we have a working implementation.

Of course, it's not quite as easy as I make it sound, but it would still be heaps easier than any other proposed implementation. No, I am not aiming for a perfect system with that suggestion, but a working one that can be tuned in a way that is understandable to the players, so that they know WHY a map is considered difficult, not just "because".
chainpullz
The actual time complexity for the regular expression approach is O(mn), where n is the number of objects in the beatmap and m is the size of the dictionary. In theory, if we were to go that route, it would be fairly trivial to update the difficulty system. As already stated though, coming up with a good alphabet and dictionary is time consuming from a development and testing standpoint.
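As a sketch of what such a dictionary pass could look like in Python (the alphabet, the dictionary, and its values are all placeholders, not anything agreed on in this thread):

def pattern_hits(encoded_map, dictionary):
    # encoded_map: the beatmap reduced to a string over some chosen alphabet.
    # dictionary: {pattern string: difficulty value}.
    hits = []
    for pattern, value in dictionary.items():
        start = 0
        while True:
            idx = encoded_map.find(pattern, start)
            if idx == -1:
                break
            hits.append((idx, pattern, value))
            start = idx + 1          # allow overlapping occurrences
    return hits

# pattern_hits("abcdabcef", {"abc": 1.5}) -> [(0, "abc", 1.5), (4, "abc", 1.5)]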
Topic Starter
Fraudo
Way more practical than machine learning, 1000000 check lists, and mapwide variation balancing.
dung eater
You don't really need the mass for anything here, it just made it easier for me to think about.

You could give the cursor mass, then calculate how much force is required to change its trajectory. You'd have to adjust for back-and-forth motions that don't feel as hard (some kind of rubberband reduction? D:). You could calculate forward and sideways components (relative to the direction the cursor was coming from) independently for jump difficulty; adding them up would value square patterns over anything else, for example.

To detect 'flow' you could look at the speed component of the current move in the direction of the last movement, relative to the speed of the last move. Think of it as kinetic energy saved for positive values. The change in that value, and the change of the change (derivative?), could be useful.
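A minimal Python sketch of that flow value as I understand it (the names and the exact normalisation are my own assumptions): the current move's velocity is projected onto the direction of the previous move and compared to the previous move's speed.

import math

def flow(prev_move, prev_dt, cur_move, cur_dt):
    # prev_move / cur_move are (dx, dy) of the previous and current jump.
    # Returns > 0 when momentum is carried forward, < 0 on a reversal.
    px, py = prev_move
    cx, cy = cur_move
    prev_len = math.hypot(px, py)
    if prev_len == 0 or prev_dt == 0 or cur_dt == 0:
        return 0.0
    ux, uy = px / prev_len, py / prev_len          # direction of the previous move
    forward_speed = (cx * ux + cy * uy) / cur_dt   # current speed along that direction
    return forward_speed / (prev_len / prev_dt)    # relative to the previous move's speed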

To make clustered-up notes and zigzag-pattern streams, where the actual movement of the cursor is much less than the distance from the center of each circle to the next, count for less, you could subtract a portion of the circle size from each jump distance. If you make that portion relative to the flow value, you could devalue patterns that don't actually move around a lot and give positive flow (where you actually move forward) more value.

I think tapping complexity, moving the cursor, reading and od are all things to consider separately unless you want really complicated solutions.

I'd love to play with these things if I were any better at coding (no idea how to read beatmap files or any other files for values). Hopefully there's something useful above.
Nyxa

fdbxfrodo wrote:

even a simple extra system would be better than what we have now, which is nothing.
RaneFire
This is probably as good a place as any to discuss examples. Instead of going straight for the hardest stuff, I'd like to ask you guys what you think of a map which has been bugging me for a long time. It's not a very hard map, but it's been on my practice list right from the start (3 years).

https://osu.ppy.sh/b/82148&m=0 <- Map

It uses simple patterns: singles, triples, quintuples... Yet, for some reason it plays a lot more complicated than you would think. Does anyone know why?
Drezi

fdbxfrodo wrote:

even a simple extra system would be better than what we have now, which is nothing.
Exactly.

I said this regarding rhythm complexity issues as well: having a rough and not 100% perfect system is a lot better than having nothing when there's obviously a difference in difficulty. On a larger scope, the whole pp system isn't perfect either, but it's obviously still miles closer to showing the right picture regarding difficulty than having no differentiation between maps at all. Same principle, just more apparent in this case.
Drezi

RaneFire wrote:

https://osu.ppy.sh/b/82148&m=0 <- Map

It uses simple patterns: singles, triples, quintuples... Yet, for some reason it plays a lot more complicated than you would think. Does anyone know why?
AR8 for 195 BPM and perfect stacking because of very low stack leniency, both things we're not used to, plus messy uneven spacing at parts which are meant to be standard snap distance, and generally bad mapping: random triplets/blank beats, very inconsistent with the music.
RaneFire

Drezi wrote:

AR8 for 195 BPM and perfect stacking because of very low stack leniency, both things we're not used to, plus messy uneven spacing at parts which are meant to be standard snap distance, and generally bad mapping: random triplets/blank beats, very inconsistent with the music.
I knew the answer would be negative, but yes, the issue has to do with flow.

Calling it bad mapping is subjective. If you're going to design a system, it needs to be impartial. There's no point if it can only correctly value maps made after 2012 (5 years of maps will thus be incorrectly valued), and let's not forget the new trend starting. This was my concern right from the start: different mapping styles create differences in complexity. This is not something an algorithm can be given values for. You will only ever have an imperfect system, but hey, as you said, it's better than nothing.
Nyxa
Bad maps are harder because they're bad, not because they're hard. However, when a map doesn't align with the music but isn't difficult in and of itself, then the map isn't hard, it's just bad - and in my opinion that shouldn't give an extra reward, because it's a simple matter of following the map's rhythm rather than the song's rhythm and you should be fine. If a map is bad spacing-wise - well, first of all, a system that rewards complexity would account for crazy, variable spacing, as well as for patterns that are hard to read and awkward angles, but a map being annoying does not mean that SSing that map is a good performance. And that is something that's not affected by skill as much as it's affected by the kinds of maps you're used to. Someone who's used to 2012 maps will play them much better than someone who's used to 2014 maps, and often vice versa too. It's not really an aspect of complexity that wouldn't be included in the system.
Drezi
But these factors can be taken into account by an algorithm, and it doesn't matter if the cause of the difficulty is a mapping style that's subjectively viewed as bad and unenjoyable, or patterns that people find fun to play.
RaneFire
You're only looking at one end of the stick. A map which is mapped very well, but trips the harder values of the complexity calculations, will as a result become overvalued, because it's actually intuitive to play. Complexity is a human variable. This is the problem when a computer can't tell how good or bad a map is. Anyway, that's about all the "stirring of the pot" I came to do here.
Drezi
Or rather, bad maps would be undervalued, which shouldn't really be a problem. They are undervalued currently as well, obviously. With that we're back to the "something's better than nothing" case.
nrl

RaneFire wrote:

yes, the issue has to do with flow.
There's a very big difference between a pattern a player finds difficult and a pattern that is inherently complex, and old maps with horrible flow definitely fall into both categories. But flow isn't some mystical thing; we can evaluate it by looking at variations in cursor speed throughout the map.
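One crude way to put a number on that, sketched in Python (a placeholder statistic, not a worked-out metric): take the spread of per-jump cursor speeds.

import math
import statistics

def speed_variation(notes):
    # notes: list of (x, y, t) for each hit object; returns the std dev of per-jump speeds.
    speeds = [math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
              for (x0, y0, t0), (x1, y1, t1) in zip(notes, notes[1:])
              if t1 > t0]
    return statistics.pstdev(speeds) if speeds else 0.0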
Full Tablet

RaneFire wrote:

This is probably as good a place as any to discuss examples. Instead of going straight for the hardest stuff, I'd like to ask you guys what you think of a map which has been bugging me for a long time. It's not a very hard map, but it's been on my practice list right from the start (3 years).

https://osu.ppy.sh/b/82148&m=0 <- Map

It uses simple patterns: singles, triples, quintuples... Yet, for some reason it plays a lot more complicated than you would think. Does anyone know why?
After playing it, it seems to be complicated for several reasons:

- IMO, it has several objects that don't fit well with the song (several parts feel overmapped, while some parts feel undermapped).
- Some confusing patterns (1/2 sliders that look like 1/4 sliders based on the SV of most of the song).
- The AR is 8, which makes reading harder IMO, especially since it's hard to rely on the music for reading. (I consistently get FCs with ~95% acc with AR9; even passing the song is hard with AR8 for me.)
- 1/4 Triples followed by a few 1/2 circles followed by 1/4 Triples (this pattern might be hard if you aren't used to switching between 1/2s and 1/4s, without a rest, quickly several times in a short period of time).

Much of the reason has to do with how the map fits the song (which would be very complex and possibly unreliable to calculate with an algorithm, also, there isn't much sense in giving a bonus to badly made maps).

The high note density caused by the low AR and high note frequency can be considered as a factor in the algorithm (possibly scaling with speed strain as well, since 7 notes on the screen at the same time are much more manageable in a slow map than in a fast map). The note density factor could scale with a curve that increases slowly for low values (since IMO there isn't much difference between nearly zero density and 4 notes on the screen at the same time), and faster for higher values.
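A hedged Python sketch of such a curve (the soft cap of 4 notes and the exponent are placeholder values taken from the intuition above, not tuned numbers):

def density_factor(notes_on_screen, soft_cap=4.0, exponent=2.0):
    # Roughly flat up to ~4 simultaneous notes, then grows faster for denser screens.
    excess = max(0.0, notes_on_screen - soft_cap)
    return 1.0 + (excess / soft_cap) ** exponent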

The difficulty of switching between 1/2 and 1/4 has to do with technique. This could be calculated based on the rhythm complexity (based on how repetitive the rhythm is) and the speed strain (at low speeds, rhythm complexity doesn't add much to the difficulty of the map). A problem with this approach would be that some rhythms might be subjectively underrated (for example, for many people a pattern with 1/2 notes and 1/4 triples might become even harder if you replace the 1/4 triples with 1/3 triplets, even though the complexity of the pattern would be the same and the speed strain lower; the difficulty here lies mainly in how unusual 1/3 triplets are compared to 1/4 triples, not in any strong technical reason they are harder).
xxdeathx
Readability can be different for each person and there's no way to calculate it for everyone, as it's largely mental, not physical like what the current pp calculation system is based on. The best thing you can do is hope the mappers make maps that are easy to read; that maximizes the possibility of full comboing a song of a given star rating.
chainpullz
A really simplistic solution would just be to add a polling system where players can voice their opinion on whether a map is rated too high/low, and run a pass over maps every so often, adjusting by a magnitude relative to the number of votes (you would probably want some sort of weighting system so people with low rank don't contribute significantly to difficult maps like Big Black). Thus the closer it gets to where people think the map should stand, the fewer people will bother to vote and the less its difficulty will change with each pass. I mean, we aren't really going for a super accurate addition, and we want the difficulty of maps we think are under/overrated to be tuned accordingly, so I think this would appease the concerns of the people?
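A toy Python sketch of what one adjustment pass could look like (the vote weighting and step size are placeholder assumptions):

def adjust_rating(current_rating, votes, step=0.01):
    # votes: list of (direction, voter_weight); direction +1 = "rated too low", -1 = "rated too high".
    # voter_weight would come from rank or some other trust measure.
    signed = sum(direction * weight for direction, weight in votes)
    return current_rating + step * signed

# Each pass nudges the rating toward the weighted consensus; as fewer people vote, it settles.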