
Complexity: Variability, Reading, and Angularity

Topic Starter
Fraudo
This thread is basically about how complexity in osu should be handled and dealt with as a factor of difficulty in maps. Currently, the map rating system bases most things on physical factors: star rating is all about the time spacing and distance spacing of notes, and PP is a combination of that, the other stats, and map length. To make the system handle the complexity of maps better, we first have to work out what is actually hard in terms of reading and mental understanding. Having thought about it personally, the things I consider most responsible for making maps mentally hard are angles, variation, and readability. The details on this topic are in the document below.

I would really appreciate any feedback on what people personally find difficult in terms of patterns, speed changes, and so on: things that aren't accounted for in the current system.

Additionally, if you have any ideas on how exactly to handle finding and adjusting for these difficulties, I would like to hear those as well so I can add them to the solutions section.

Finally, if anything in the document or thread is hard to read or could be formatted in a way that would make it more pleasant to look at, I'll work on that as well.

Google Doc

This thread is meant for discussion of complexity as an issue, of the aforementioned document, and of how to improve the PP and difficulty systems already in the game. Thanks for taking the time to read.
Sea_Food
It's just hard to think that the current difficulty rating would need fixing, as it's almost perfect compared to the shit we had before this.

Also, what you said about reading and mental understanding varies a lot from person to person. For example, I am much better at this game than my brother overall, but because I can't do square jumps there are a few maps in this game where I can't beat his score. We cannot get an objective measure of how hard they are.

There are some obvious mistakes the difficulty calculator makes, like rating airman harder than rainbowdash likes girls, but whatever.


Anyway, if you can make this work, good for you. Just my 2 cents on the issue since you advertised it on #osu.
chainpullz

Sea_Food wrote:

It's just hard to think that the current difficulty rating would need fixing, as it's almost perfect compared to the shit we had before this.

Also, what you said about reading and mental understanding varies a lot from person to person. For example, I am much better at this game than my brother overall, but because I can't do square jumps there are a few maps in this game where I can't beat his score. We cannot get an objective measure of how hard they are.

There are some obvious mistakes the difficulty calculator makes, like rating airman harder than rainbowdash likes girls, but whatever.


Anyway, if you can make this work, good for you. Just my 2 cents on the issue since you advertised it on #osu.
Given a map with 0 plays that nobody has even so much as looked at, how do you measure something like reading (hint: this is a rhetorical question)?
Sea_Food
chainpullz wrote:

Given a map with 0 plays that nobody has even so much as looked at, how do you measure something like reading (hint: this is a rhetorical question)?
muh shitposting
chainpullz
How is this a shit post? Have you ever in your life tried to program a scoring/difficulty system? You act like a computer works like the human brain...
Vuelo Eluko
Note density: an average of more than 4 notes on the screen at once being harder to read than average, for example.
Topic Starter
Fraudo
The point is that it doesn't. That's why we don't have a system like this in place already, and why I made the post. I talked about it a bit with Tom94, who does the PP calculations, and he said he's pretty much already thought about everything I wrote about, since it's fairly obvious. The reason I made the post is to work with the community on what is GENERALLY hard, and on what kinds of things should be implemented to find and represent how hard those things are. If we worried about the subjectivity of it, the game would be pointless: some people are good at jumps, some are good at streams, and both are rewarded. If we only rewarded what everyone was equally good or bad at, neither of those would be. That's why the point is to look at what kinds of things are found difficult in general, so that we can build systems to account for them. Nothing like that will be perfect the first time around, and that shouldn't be the concern. The concern is building a system that can detect these difficulties and represent them statistically; everything else is tuning.

Currently, the problems as I understand them are:
A) Detection. While it's easy in theory, Tom said he didn't want detection to take 20 parameters, and I figure that applies to the calculations and the rest as well.

B) Implementation. The code for detection needs to be written, and the algorithm for scaling things with this new information also needs to be figured out. That means working out what it would look like in a way that translates to code, with as few variables as possible so that things don't sprawl everywhere. If we figure out how to make it work, it can at least be looked into; as it stands, there are no realistic implementations.

This all takes a lot of time and data, and again, that's why it's being posted here rather than already implemented in the game.
I don't work with the code or with anyone affiliated with osu!, so I can't say anything concrete on the matter; I don't have any hard information or code, which means I can't directly write anything out, and neither can anyone else in the community. Together, however, we can build pseudo-code or dummy functions to handle the issues at hand, or at least give ideas on what will be needed and what to do with it. That's why I provided the angularity example. From what Tom94 told me, he works on the pp code basically alone, which means that if we want this sort of thing any time soon, we need to work on it as a community; between having a life and keeping up with the game, he doesn't have that kind of time. It's much more effective to lay the groundwork for a system like this together and hand it off to the devs to implement and tune.

If you still think that's a bad idea, that's fine, but this is how things have been handled in other games many times, where the community has brought major features to the game through modding (proof of concept) or direct suggestions.


As for note density: I think it should be worked in alongside other systems, like stream detection (for variability), so that streams don't register as huge note density, because in reality they're very easy to read.
Maeglwn
This is the reason the RC exists though, to keep things from being unreadable, etc.

Star rating and PP are calculated things; there's no real way to calculate whether somebody doing 15 backwards jumps in a row with no flow whatsoever is hard or easy, just because of all the different variables associated with it.

The only real way to do that is to get a massive group of people to manually set star ratings for maps or something.
Topic Starter
Fraudo
Maeg, you can certainly detect things like that, it's just not simple, and that's kind of the point here.

Detection won't be perfect, but it's a place to start.

The systems in place don't let unreadable stuff through, but they don't account for reading difficulty and the like, which is more the problem. There's no extra reward for playing something that is both high star rating and very hard to read, so you may as well just play easy maps with high star ratings. Not that that necessarily needs fixing, but it should at least be visible when a map is more difficult than it looks right off the bat.
nrl
The issue with checking angularity is that certain patterns are harder for certain people, so you'll never really be able to come up with an evaluation method that's universally reliable. I think it's less important to account for specific patterns than to figure out what makes the pattern difficult and start from there; the distinction between what a player finds hard and what's intrinsically hard has to be accounted for.

Accounting for variation is all that you need. Start by breaking the map into pairs of objects (first and second, second and third, third and fourth, etc.) and look at the differences within those pairs with regards to position, timing, etc. This is basically what we have now in the sense that it's only going to evaluate jump length and speed; the trick is to take it further. Repeat the process, but start with the set of pairs rather than the initial set of map objects (first/second and second/third, second/third and third/fourth, etc.). Evaluate the differences between one pair and the next. The more substantial the differences the more difficult the pattern is. You can deal with things like angularity if you want to, but even sticking with position and timing will allow you to account for things like distance snapping. Then go down another level and repeat the process again (first/second to second/third and second/third to third/fourth). Slowly but surely you'll move from intrapattern variation to interpattern variation as you go deeper and deeper, detecting things like repeated patterns. By the time your set is down to two elements you'll be dealing with overall map variation from start to finish, and that's where the evaluation can stop.

Naturally this is totally off the cuff, and I'm sure the system can be expanded and refined, but I think that's where things should start.
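For concreteness, here's a rough sketch of the iteration I'm describing (Python, purely illustrative; how spatial and timing differences get weighted against each other, and how the per-level sums get combined, are placeholders):

import math

# Each object is (x, y, time). Level 0 holds the differences between consecutive objects;
# every deeper level holds the differences between consecutive elements of the previous
# level, so intrapattern variation gradually turns into interpattern variation.
def variation_score(objects):
    level = [(x2 - x1, y2 - y1, t2 - t1)
             for (x1, y1, t1), (x2, y2, t2) in zip(objects, objects[1:])]
    total = 0.0
    while len(level) >= 2:
        diffs = [(b[0] - a[0], b[1] - a[1], b[2] - a[2])
                 for a, b in zip(level, level[1:])]
        # Bigger differences between adjacent elements mean more variation at this level.
        total += sum(math.hypot(dx, dy) + abs(dt) for dx, dy, dt in diffs)
        level = diffs
    return total

Mixing pixels and milliseconds like that obviously isn't right; it's just to show the shape of the evaluation.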
chainpullz
The issue is that a good implementation that takes these things into account would require a combination of genetic algorithms and pattern recognition applied to an extremely large data set. Even after doing all of that it's hard to say how much these factors would even be taken into consideration since this would be a more holistic approach. Things like jumps, streams, and speed can be trivially measured by examining distance and time measurements. As Tom has already explained, there are too many different ways that a map can be "complex."

An alternative approach would require creating an alphabet based on timing and relative distance in an effort to write a dictionary containing complexity elements as a sequence of symbols from our alphabet. As long as each complexity element can only be represented by a small set (preferably of size 1) of these sequences we can then simplify this to a regular expression problem. Have fun trying to create such an alphabet and dictionary.
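To make that concrete, here's a toy sketch in Python (the bucket sizes, the alphabet, and the two dictionary entries are all invented; building a real alphabet and dictionary is exactly the hard part):

import math
import re

# Quantize each transition between consecutive notes into one letter, based on relative
# distance (in circle sizes) and timing (stream-speed or slower).
def encode(notes, circle_size=55.0, stream_gap_ms=125.0):
    letters = []
    for (x1, y1, t1), (x2, y2, t2) in zip(notes, notes[1:]):
        d = min(int(math.hypot(x2 - x1, y2 - y1) / circle_size), 3)  # 0..3 distance buckets
        fast = 0 if (t2 - t1) <= stream_gap_ms else 1
        letters.append("abcdefgh"[d * 2 + fast])
    return "".join(letters)

# Toy "dictionary" of complexity elements written over that alphabet.
DICTIONARY = {
    "back_and_forth_jumps": re.compile(r"[gh]{6,}"),  # long run of largest-distance transitions
    "spaced_stream":        re.compile(r"c{8,}"),     # long run of mid-distance, stream-speed transitions
}

def complexity_hits(notes):
    encoded = encode(notes)
    return {name: len(pattern.findall(encoded)) for name, pattern in DICTIONARY.items()}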

Either of these 2 implementations would take an immense amount of work to implement and would take a lot of time to properly test. I'm talking somewhere along the lines of at least a year with no guarantees that it will even work as desired in the end. There is an entire subforum devoted to feature suggestions and a large number of features in there that people really want could be implemented in the same amount of time. There's just no reason as a developer to worry too much about implementing detection for complexity.
chainpullz
@Narrill: This would be the intuitive approach but it has terrible time complexity if I'm understanding what you are suggesting correctly.
Topic Starter
Fraudo

chainpullz wrote:

@Narrill: This would be the intuitive approach but it has terrible time complexity if I'm understanding what you are suggesting correctly.
The thing is, it only needs to be done once. And it doesn't even need to be that grand and complex; it can be a simple implementation that doesn't account for every single thing he's suggesting, because even a simple extra system would be better than what we have now, which is nothing.
chainpullz
You could say the same thing about encryption. You only need to factor a really large number once. Given billions of dollars and several years' time you can crack a single encryption. It's totally doable. Is it practical? No.

There are 2^(n-1) ways to partition a sequence of n objects (think of cutting a string). Thus any complexity check that examines every such division is at least on the order of 2^n. As soon as we hit 40 objects we are already approaching roughly a day to check for complexity (2^39 is about 5.5 * 10^11 partitions, which even at ten million checks per second is the better part of a day). Even with immense optimization, running this with 1000 objects is still pretty impractical.

Edit: If we ignore globally repeated patterns and only focus on adjacent repeated patterns, the time complexity becomes more feasible, but the measure is less accurate.
Topic Starter
Fraudo

chainpullz wrote:

You could say the same thing about encryption. You only need to factor a really large number once. Given billions of dollars and several years' time you can crack a single encryption. It's totally doable. Is it practical? No.

There are 2^(n-1) ways to partition a sequence of n objects (think of cutting a string). Thus any complexity check that examines every such division is at least on the order of 2^n. As soon as we hit 40 objects we are already approaching roughly a day to check for complexity (2^39 is about 5.5 * 10^11 partitions, which even at ten million checks per second is the better part of a day). Even with immense optimization, running this with 1000 objects is still pretty impractical.
And like I said, it doesn't need to be on quite that scale. We don't need a perfect system that accounts for every single object and variation in existence, only the relevant ones that are close together, and even that is the upper end. Just a simple system would be better than what we have now.
nrl
Assuming we're only checking position and timing, we're looking at roughly a million comparisons for a map with 1000 objects (note that this isn't 1000 combo), and we're talking numerical comparisons. It's certainly expensive, but it's not like we have to do every pass. Suppose we cut the element count in half for each iteration.
chainpullz
A million comparisons isn't expensive if each comparison runs in constant time (for some hopefully small constant). It's the actual iteration process that's the concern: there are a lot of ways to partition, and then a lot of ways to compare partitions. If we only check same-size adjacent parts we cut down on time significantly, but we aren't as thorough. It seems like we probably only care about comparing same-size parts, which alone cuts the time dramatically, but even so, unless we restrict ourselves to something like only checking adjacent parts, it's going to take too long (that restriction would still take a while, but would likely be practical).
nrl
I don't think it's terribly necessary to compare non-adjacent parts. And I'm not really sure what you mean by size. The partitions would always have identical object counts.

I'll explain a bit more clearly. We're comparing objects from left to right. The first object is compared to the second, the second to the third, etc., until we reach the next to last and last objects. Then each pair becomes an object and the comparisons are run again in the same way. Non-adjacent objects are never compared, and I don't think they'd need to be.
chainpullz
consider abcdabcefabcdabcef

Your method of only checking adjacent parts will find abcdabcef as a repeated pattern but it won't really pick up on the abc repetitions the way that is desired.

An example of this pattern might be something like triangle center(single) triangle (offset center double) triangle center triangle (offset center double).

Obviously the most important part is the repeated triangle followed by a jump to the center.

The point being that our greedy approach will skip over a lot of patterns we wish to pick up in the first place.
nrl
Can you demonstrate that claim in a way that makes me more inclined to believe it?
eeezzzeee
These are all factors that are pretty much impossible to rate the difficulty of, because they're things that some players might find hard and others won't, whereas something like the distance between notes is a factor that is the same for everyone.
Full Tablet

chainpullz wrote:

consider abcdabcefabcdabcef

Your method of only checking adjacent parts will find abcdabcef as a repeated pattern but it won't really pick up on the abc repetitions the way that is desired.

An example of this pattern might be something like triangle center(single) triangle (offset center double) triangle center triangle (offset center double).

Obviously the most important part is the repeated triangle followed by a jump to the center.

The point being that our greedy approach will skip over a lot of patterns we wish to pick up in the first place.
You can find all pattern repetitions in O(n^2) time (which is better than O(2^n)).

For n elements, have a matrix with n+1 rows and n+1 columns (with initial values all equal to zero). The 1st row corresponds to the patterns found comparing the 1st element with the rest of the elements, the 2nd row to the patterns found comparing the 2nd element with the elements that follow it, etc...

Start by comparing the penultimate element with the elements that follow it: if the penultimate element matches the last element, set the value in the penultimate element's row, last element's column, equal to Min( distance between the compared elements [1 in this case, since they are next to each other], 1 + the value in the matrix one row down and one column to the right ).

Then do the same comparisons with the element before the penultimate element, then the one before that, and so on, until comparing the 1st element with the rest of the elements.
Code in Wolfram Language (note that lists of elements have an index i=1 for the first element instead of i=0)
RepetitionMatrix[Elements_] :=
  Module[{TheMatrix, Len = Length[Elements], i, j},
    TheMatrix = ConstantArray[0, {Len + 1, Len + 1}];
    For[i = Len - 1, i >= 1, i--, (*Iterate over each note in the map, starting from the second-to-last note*)
      For[j = Len, j > i, j--, (*Iterate over the notes after the current note, starting from the last note*)
        If[Elements[[i]] == Elements[[j]], (*If a note is repeated*)
          TheMatrix[[i, j]] = Min[TheMatrix[[i + 1, j + 1]] + 1, j - i];
        ]
      ]
    ];
    TheMatrix]
In the example with "abcdabcefabcdabcef", the generated matrix has, among others, the following entries (the values on or below the diagonal are always 0):

The 9 in the first row, tenth column, means that the first 9 elements (abcdabcef) are repeated starting from the 10th element.
The 3 in the 5th row, 10th column, means that the sequence of 3 elements starting with the 5th element is the same as the sequence of 3 elements starting from the 10th element.

If it is restricted to comparing each element with elements that are close to them (for example, only comparing the 1st element with the next 50 elements instead of every element), the algorithm can run in O(n) time.
nrl
Thing is, we don't really have to detect patterns; that the patterns repeat is incidental to their difficulty. What we're aiming for is making sure a map with repeated patterns is shown to be less difficult than one with non-repetitive patterns of equal relative spacing and speed. We don't need to look for the repetition, we need to look for the reduction in complexity caused by the repetition.
chainpullz
I was merely demonstrating that we don't get pattern detection for free by only checking adjacent parts. I never claimed that checking for patterns is inherently exponential; I claimed that doing it by comparing partitions is. As Narrill said, finding recurring substrings only matters so that we don't label some maps as more difficult than they should be, which in turn makes other maps look underrated.
Vuelo Eluko
Full Tablet's posts always make my head hurt.
I thought I was done with this shit after high school, I guess my math teacher was right...
nrl
I'm not saying pattern detection is a nice bonus we can easily sacrifice, I'm saying it isn't necessary at all.
Vuelo Eluko

Narrill wrote:

I'm not saying pattern detection is a nice bonus we can easily sacrifice, I'm saying it isn't necessary at all.
it just goes back to what was mentioned earlier; some people are better at certain patterns than others and there's no measurable difficulty there.
Ethelon

eeezzzeee wrote:

These are all factors that are pretty much impossible to rate the difficulty of, because they're things that some players might find hard and others won't, whereas something like the distance between notes is a factor that is the same for everyone.

Riince wrote:

Narrill wrote:

I'm not saying pattern detection is a nice bonus we can easily sacrifice, I'm saying it isn't necessary at all.
it just goes back to what was mentioned earlier; some people are better at certain patterns than others and there's no measurable difficulty there.
I don't understand why this is even brought up. Yes some people are better at certain patterns than others. So they're better at reading it and should be rewarded. Just because different people are better or worse at something doesn't mean that there is no measurable difficulty. It means that people aren't uniform in ability.
If reading affects the difficulty of the map (which it does), then it should be considered in the difficulty rating.

And from what I understand, Narrill isn't saying that it's unnecessary to detect the difficulty of patterns, only that you don't need a system to detect patterns to determine difficulty.
Full Tablet

Narrill wrote:

Thing is, we don't really have to detect patterns; that the patterns repeat is incidental to their difficulty. What we're aiming for is making sure a map with repeated patterns is shown to be less difficult than one with non-repetitive patterns of equal relative spacing and speed. We don't need to look for the repetition, we need to look for the reduction in complexity caused by the repetition.
You can calculate a weighting value for each note based on the repetitions found.

A possible way to do this, using "abcdabcefabcdabcef" as an example (starting from the matrix, this process takes O(n^2) time if checking for all patterns):

The first 4 elements ("abcd") have a weight of 1, since they are new elements in the list.

The next 3 elements get a lower weighting (since they repeat the pattern "abc"); with the two occurrences 4 elements apart, they have a value of
0.5^(1/Distance_Between_Patterns^2) = 0.5^(1/4^2) = 0.957603

The next 2 elements are new, so they have a value of 1.

All the next 9 elements have their weight multiplied by a factor (since they repeat the first 9 elements): 0.5^(1/9^2) = 0.991479
Of those 9 elements, the first 3 are multiplied by an additional factor because of the "abc" repetition that is not part of the 9-element repeat: 0.5^(1/5^2) = 0.972655
And the second "abc" pattern within those 9 elements is multiplied by factors from the other "abc" occurrences: 0.5^(1/4^2) * 0.5^(1/12^2) = 0.953684

So in the end the weightings of the elements are:
{1, 1, 1, 1, 0.957603, 0.957603, 0.957603, 1, 1, 0.964367, 0.964367, 0.964367, 0.991479, 0.945558, 0.945558, 0.945558, 0.991479, 0.991479}
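A simplified Python sketch of the same idea (it builds the same matrix as the earlier post, but keeps only the strongest damping factor per note instead of stacking several, so it won't reproduce the exact hand-worked numbers above; the 0.5 constant is just the one used there):

def repetition_matrix(elements):
    # 0-indexed version of the matrix from the earlier post: matrix[i][j] is the length of
    # the block starting at j that repeats the block starting at i (capped at j - i).
    n = len(elements)
    matrix = [[0] * (n + 1) for _ in range(n + 1)]
    for i in range(n - 1, -1, -1):
        for j in range(n - 1, i, -1):
            if elements[i] == elements[j]:
                matrix[i][j] = min(matrix[i + 1][j + 1] + 1, j - i)
    return matrix

def note_weights(elements):
    # Every note starts at weight 1; a note inside a repeated block keeps the strongest
    # (closest-gap) damping factor 0.5 ** (1 / gap**2) that applies to it.
    n = len(elements)
    matrix = repetition_matrix(elements)
    weights = [1.0] * n
    for i in range(n):
        for j in range(i + 1, n):
            gap = j - i
            for k in range(j, j + matrix[i][j]):
                weights[k] = min(weights[k], 0.5 ** (1.0 / gap ** 2))
    return weights

The weights could then be folded into whatever per-note values the difficulty calculation already uses.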
nrl

Ethelon wrote:

And from what I understand, Narrill isn't saying that it's unnecessary to detect the difficulty of patterns, only that you don't need a system to detect patterns to determine difficulty.
...

I'm saying there's no need to parse the map looking for repeated patterns. If intramap variation is evaluated properly the effect of the repeated patterns on overall complexity will already be accounted for. The entire idea of evaluating discrete "patterns" is misguided.

Full Tablet, how would you then apply the weighted note values to come up with an overall complexity/difficulty metric? We want to reduce the complexity/difficulty of the sequence to a single number.
Topic Starter
Fraudo

Ethelon wrote:

eeezzzeee wrote:

These are factors that are all pretty much impossible to rate the difficulty of because these are the things that some players might find hard others will not.. whereas something like the distance between the note is a factor that is the same for everyone

Riince wrote:

it just goes back to what was mentioned earlier; some people are better at certain patterns than others and there's no measurable difficulty there.
I don't understand why this is even brought up. Yes some people are better at certain patterns than others. So they're better at reading it and should be rewarded. Just because different people are better or worse at something doesn't mean that there is no measurable difficulty. It means that people aren't uniform in ability.
If reading affects the difficulty of the map (which it does), then it should be considered in the difficulty rating.
I'm glad someone gets it. The whole point behind this isn't to say "hey, this is universally difficult for every single person"; otherwise, like I've said, the current system would be horribly flawed, since we reward streams and jumps even though different people are better at different things. That will always be the case with a game; it's about rewarding things that are generally hard and giving bonuses for doing them. That doesn't necessarily mean we need to reward them, just give a way to know when a map is more complex, so that people can at least get recognition for the very difficult-to-read maps they play.
Nyxa
I agree with the above post, though from what I've seen so far everyone in this thread is trying to come up with a complicated solution to a complicated problem - I can tell you now that as long as you keep going down that path you won't find a truly effective solution. Complex problems require simplistic thinking, which is way harder than it sounds. I'll be trying to come up with some coherent ideas of my own but I'd really like to stress that if you truly want to find a working solution, the best approach is to think as simply as you can. If you have a simple solution and something goes wrong, the fix will be simple. Fine-tuning will be simple and adding onto the framework you created will be a lot easier than when you come up with a complex, fragile system that's hard to oversee properly.
nrl

Tess wrote:

Complex problems require simplistic thinking, which is way harder than it sounds.
Mine is a very simple solution; you're just tracking variation while gradually widening the scope. It's not unlike differential calculus if you think about it.
chainpullz
My solution is a simple solution. Teach a computer what makes maps hard and then let it go crazy. The actual implementation details might not be so nice but hey, let's not worry about that.
RaneFire
Just pointing out that it would have to be simple, because every revision to the system requires reparsing all beatmaps. The more work it has to do, the longer that will take. The more variables there are, the more tweaking has to happen, and more often.

Complexity will also inevitably require rebalancing for the current bonuses to FL and HD, considering it's able to pick up regular patterns, which are easier to memorise, and mid-stream distance spacing changes, which throw me off very much regarding aim. Not necessarily a "finger" complexity issue on that one.

Common patterns also don't necessarily mean complexity. For instance, triples, quintuples, etc. These patterns are very easy to play intuitively because they are common and obey musical notation (start/end on a main beat). When you start including doubles and quads in that though, things get tricky, but from the perspective of a computer, there is no difference other than the number of notes. We also listen to the music while playing, not something a computer can fathom.

Not going to say too much though, since I've thought about this stuff heavily and already thrown away any hope of seeing it implemented due to the sheer number of variables we would need to consider when designing a "simple" system. Collaboration is also required to determine universally difficult/easy things, since reading ability is the major difference between players, so I applaud you for the initiative of creating this thread to discuss this topic.
Topic Starter
Fraudo

RaneFire wrote:

Just pointing out that it would have to be simple, because every revision to the system requires reparsing all beatmaps. The more work it has to do, the longer that will take. The more variables there are, the more tweaking has to happen, and more often.

Complexity will also inevitably require rebalancing for the current bonuses to FL and HD, considering it's able to pick up regular patterns, which are easier to memorise, and mid-stream distance spacing changes, which throw me off very much regarding aim. Not necessarily a "finger" complexity issue on that one.

Common patterns also don't necessarily mean complexity. For instance, triples, quintuples, etc. These patterns are very easy to play intuitively because they are common and obey musical notation. When you start including doubles and quads in that though, things get tricky, but from the perspective of a computer, there is no difference other than the number of notes.

Not going to say too much though, since I've thought about this stuff heavily and already thrown away any hope of seeing it implemented due to the sheer number of variables we would need to consider when designing a "simple" system. Collaboration is also required to determine universally difficult/easy things, since reading ability is the major difference between players, so I applaud you for the initiative of creating this thread to discuss this topic.
Yes, as it stands the problem is more the implementation than the theory, and people are mostly throwing out complex solutions. I personally think the simplest way is to do some simple checks against a list of patterns, giving each pattern a value in a key within the algorithm: when you encounter a pattern it gets that value, and when you encounter it again within a certain timeframe, that value is reduced due to familiarity. This would only apply to simple patterns; for more complex patterns we can work multiplicatively, giving many sharp angles/alternations in a row a bigger reward, and variations in tempo would be accounted for by the timeline we already have. This solves maybe 90% of the issues and leaves only reading, which we can do some overlap checks for.

That's the simplest implementation I can think of, and it only requires a few variables: note distances (a maximum of 3 notes at a time for angles, more for variation and stream detection), the angles between those 3 notes (angle plus distance/time would be preferable, to measure the difficulty of stopping and going the other direction; whether the distance/time before or after the turn matters more, I don't know), and the time between notes (again, 3 at a time for angles, more for variation).

At the heart of it, that's the simplest solution that gives the most reward for the least calculation, as far as I can tell. It's easy to tune, because everything is based on what we already have, with the addition of angles. We log the angles, spit out recognized patterns, value them by comparison to the nearest patterns, spit out the remaining angles and value them as well (similar angles close together decrease the value slightly), and boom, we have a working implementation.
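To show what I mean by the angle measurement, here's a rough Python sketch (the way the turn angle and the speeds get combined at the end is a placeholder, and that's exactly the part that needs tuning):

import math

def triplet_angularity(notes):
    # notes: list of (x, y, time_ms). For every three consecutive notes, measure the turn at
    # the middle note (0 = straight line, pi = full reversal) and how fast the player has to
    # move into and out of that turn.
    values = []
    for (x0, y0, t0), (x1, y1, t1), (x2, y2, t2) in zip(notes, notes[1:], notes[2:]):
        ax, ay = x1 - x0, y1 - y0
        bx, by = x2 - x1, y2 - y1
        if (ax == 0 and ay == 0) or (bx == 0 and by == 0):
            values.append(0.0)  # stacked notes, no direction change to measure
            continue
        cos_turn = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
        turn = math.acos(max(-1.0, min(1.0, cos_turn)))
        v_in = math.hypot(ax, ay) / max(t1 - t0, 1)   # pixels per millisecond into the turn
        v_out = math.hypot(bx, by) / max(t2 - t1, 1)  # pixels per millisecond out of the turn
        values.append(turn * (v_in + v_out))          # placeholder combination
    return values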

Of course, it's not quite as easy as I make it sound, but it would still be heaps easier than any other proposed implementation. No, I'm not aiming for a perfect system with this suggestion, but for a working one that can be tuned in a way that is understandable to players, so that they know WHY a map is considered difficult, not just "because".
chainpullz
The actual time complexity of the regular expression approach is O(mn), where n is the number of objects in the beatmap and m is the size of the dictionary. In theory, if we were to go that route, it would be fairly trivial to update the difficulty system. As already stated, though, coming up with a good alphabet and dictionary is time consuming from a development and testing standpoint.
Topic Starter
Fraudo
Way more practical than machine learning, million-item checklists, and map-wide variation balancing.
dung eater
You could give the cursor mass, then calculate how much force is required to change its trajectory (you don't really need the mass for anything here, it just made it easier for me to think about). You'd have to adjust for back-and-forth motions that don't feel as hard (some kind of rubberband reduction? D:). You could calculate the forward and sideways components (relative to the direction the cursor was coming from) independently for jump difficulty; adding them up would value square patterns over anything else, for example.

To detect 'flow' you could look at the current move's speed component in the direction of the last movement, relative to the speed of the last move. Think of it as kinetic energy saved, for positive values. The change in that value, and the change of the change (derivative?), could be useful too.
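Roughly something like this, maybe (just a sketch in Python, all of the scaling is made up):

import math

def flow_values(notes):
    # notes: list of (x, y, time_ms). For each move, how much of its velocity points along the
    # direction of the previous move, relative to the previous move's speed. Positive values
    # mean momentum carried forward ('kinetic energy saved'), negative values mean reversing.
    moves = []
    for (x1, y1, t1), (x2, y2, t2) in zip(notes, notes[1:]):
        dt = max(t2 - t1, 1)
        moves.append(((x2 - x1) / dt, (y2 - y1) / dt))
    flows = []
    for (vx1, vy1), (vx2, vy2) in zip(moves, moves[1:]):
        speed1 = math.hypot(vx1, vy1)
        if speed1 == 0:
            flows.append(0.0)  # previous move was a stack, no direction to flow along
            continue
        forward = (vx2 * vx1 + vy2 * vy1) / speed1  # component of the current velocity along the last move
        flows.append(forward / speed1)              # ...relative to the last move's speed
    return flows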

To handle clustered-up notes and zigzag streams, where the actual cursor movement is much less than the centre-to-centre distance of the circles, you could subtract a portion of the circle size from each jump distance. If you make that portion relative to the flow value, you could devalue patterns that don't actually move around a lot, and give more value to positive flow (where you actually move forward).

I think tapping complexity, cursor movement, reading and OD are all things to consider separately, unless you want really complicated solutions.

I'd love to play with these things if I were any better at coding (no idea how to read beatmap files or any other files for values). Hopefully there's something useful above.