
At what point do fps increases have diminishing returns?

deletemyaccount
At what point does increasing fps yield diminishing returns in terms of input lag?

I want to play at a higher resolution than 720p fullscreen for better image quality; at 1080p, however, the fps falls below 120 at some points.

In the best interest of my toaster and my settings, what fps do you find to be the bare minimum for optimal gameplay? Does it make that much of a difference?

(please no over 9000 fps replies ;_;)
eeezzzeee
For me there's a noticeable difference between the 60 fps vsync and 120 fps frame limiter settings. Everything above 120, though... not really.
B1rd
500
ziin
60 = 16.7ms
120 = 8.3ms
250 = 4ms
333 = 3ms
500 = 2ms
1000 = 1ms

As far as I can tell, low fps increases latency between your mouse and your screen. The numbers may not be as bad as this, but what's a 16.7 ms delay to a word document? What's a 16.7 ms delay to a rhythm game?

500 sounds like a good number, but I think 300 is good enough.
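ziin's table is just frame time = 1000 / fps, and putting the steps side by side shows why the returns diminish: each doubling saves less than the one before. A quick sketch (Python, purely illustrative):

```python
# Frame time in milliseconds for a given fps, and the latency
# saved by each step up in the list above.
def frame_time_ms(fps):
    return 1000 / fps

rates = [60, 120, 250, 333, 500, 1000]
for lo, hi in zip(rates, rates[1:]):
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo:>4} -> {hi:>4} fps saves {saved:.1f} ms per frame")
```

Going 60 to 120 saves about 8.3 ms per frame, while 500 to 1000 saves only 1 ms.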
jasian
Anything between 60 and 250 makes a large difference; anything past 300 fps gives diminishing returns. People who play at 1k+ fps do so for sub-millisecond advantages which won't make a difference realistically.
otoed1
I notice any dip below 400. I used to play at 600 consistently and recently obtained an fps of 1400, and I have not felt any difference between the two frame rates. Personally, I would shoot for 500 if you can.
ZenithPhantasm
1000 fps because game logic runs at the same rate as fps and 1000hz is the maximum polling rate supported by USB 2.0
E m i
directly at 1000 fps because

ZenithPhantasm wrote:

1000 fps because game logic runs at the same rate as fps and 1000hz is the maximum polling rate supported by USB 2.0
Ohrami
300 fps
dung eater
at every point

bigger is better

use an fps that isn't divisible by your screen's Hz for less visible tearing (tears from subsequent frames won't be at the same place on screen)

220, 260 fps looks fine on my toaster laptop
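The divisibility tip can be sketched with a toy model (an assumption for illustration, not how any specific driver times its scanout): each new frame arrives partway through a refresh, and the tear sits at the scanout phase at that moment.

```python
# Toy model of where a tear line lands each frame:
# 0.0 = top of the screen, 1.0 = bottom.
def tear_positions(fps, hz, frames=6):
    refresh = 1.0 / hz
    frame = 1.0 / fps
    return [round(((i * frame) % refresh) / refresh, 2) for i in range(1, frames + 1)]

print(tear_positions(240, 60))  # divisible: tears repeat at the same spots every refresh
print(tear_positions(220, 60))  # not divisible: the tear drifts to a new spot each frame
```

With fps a multiple of the refresh rate the tears land in the same places every refresh, which is what makes them visible; an "odd" cap like 220 spreads them around.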
ZenithPhantasm
Tablet users won't notice a difference after 200 fps because Wacoms are capped at 200 Hz. Mouse users will notice a difference until 1000 fps.
Yuudachi-kun

ZenithPhantasm wrote:

Tablet users won't notice a difference after 200 fps because Wacoms are capped at 200 Hz. Mouse users will notice a difference until 1000 fps.
Implying we all use Wacom.
E m i

Kheldragar wrote:

ZenithPhantasm wrote:

Tablet users won't notice a difference after 200 fps because Wacoms are capped at 200 Hz. Mouse users will notice a difference until 1000 fps.
Implying we all use Wacom.
133 for older wacom tablets and 125 for your huion ;^
ZenithPhantasm

[ Momiji ] wrote:

133 for older wacom tablets and 125 for your huion ;^
Huion is 200hz
E m i

ZenithPhantasm wrote:

[ Momiji ] wrote:

133 for older wacom tablets and 125 for your huion ;^
Huion is 200hz
even huion says huion is 125hz...
ZenithPhantasm

Report rate is 200 reports per second, aka 200 Hz.
autoteleology
It's really hard to see any difference in motion blur between 120Hz and 144Hz. I'd say 120Hz is the sweet spot, especially when you have G-Sync or ULMB.
Full Tablet
Motion blur is not related to the screen refresh rate, it depends on the pixel response time. (Refresh rate matters too, but for a different reason; see below.)

A way to reduce motion blur without getting a new screen (or a screen compatible with strobe light or similar methods) is using a gray background instead of a black background for the game (pixel response times are lower for gray-to-white than black-to-white)

Higher refresh rate helps the most for fast moving objects (for example: playing with high scroll rate in osu!mania), so they jump less frame-by-frame.
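The fast-moving-object point can be put in rough numbers with the usual sample-and-hold model (an assumption here, not something from this thread): if your eye tracks an object moving at v pixels per second and each frame is held for 1/hz seconds, the image smears across roughly v/hz pixels on your retina.

```python
# Rough sample-and-hold blur estimate: an eye-tracked object
# smears over (speed / refresh_rate) pixels per held frame.
def blur_px(speed_px_per_s, hz):
    return speed_px_per_s / hz

for hz in (60, 120, 144):
    print(f"{hz:>3} Hz: ~{blur_px(2400, hz):.0f} px of smear at 2400 px/s")
```

The 2400 px/s figure is an arbitrary example speed; the point is that doubling the refresh rate halves the smear for tracked motion.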
autoteleology

Full Tablet wrote:

Motion blur is not related to the screen refresh rate, it depends on the pixel response time.
Higher refresh rate helps the most for fast moving objects (for example: playing with high scroll rate in osu!mania), so they jump less frame-by-frame.
http://www.blurbusters.com/faq/60vs120vslb/

educate yourself
Full Tablet

Philosofikal wrote:

Full Tablet wrote:

Motion blur is not related to the screen refresh rate, it depends on the pixel response time.
Higher refresh rate helps the most for fast moving objects (for example: playing with high scroll rate in osu!mania), so they jump less frame-by-frame.
http://www.blurbusters.com/faq/60vs120vslb/

educate yourself
You are right. The motion blur described there is actually caused by the camera used for testing the effect (high exposure time), and it also happens with human vision (because of persistence of vision); the images are not actually blurred on the screen.

This kind of motion blur is reduced by making each frame appear on the screen for less time. Ways to do this include using a screen with a strobed backlight, or getting a screen with a higher refresh rate.
autoteleology
Motion blur is entirely in the eye because you don't see in frames per second; your sight is a continuous, real-time chemical reaction to the exposure to light. The less time an image is exposed to your eye, the less it persists in your vision. This is why CRTs have basically no "motion blur" relative to their FPS: they refresh in strobes, so the actual image is exposed to your eye for only a small amount of time (which is also why they flicker and cause eye strain). It's like a prehistoric version of Lightboost/ULMB (Ultra Low Motion Blur).

When you have inadequate pixel response times, you get ghosting, an entirely different problem. You get the "ghost" of previous frames in the current frame because the pixels can't keep up with the color transition, and are continuously in the middle of transitioning by the time the next frame appears, causing trails of improperly transitioned colors to appear behind moving objects.

http://www.overclock.net/t/1430257/what ... r-monitors
peppy
240 is a good number. if you can keep stable 500fps, then i'd go higher, else 240 is good.
I Give Up
420 fps is also a good number. Diminishing returns happen at 1337 fps unless you're mlg.
Multtari

Philosofikal wrote:

Motion blur is entirely in the eye because you don't see in frames per second; your sight is a continuous, real-time chemical reaction to the exposure to light. The less time an image is exposed to your eye, the less it persists in your vision. This is why CRTs have basically no "motion blur" relative to their FPS: they refresh in strobes, so the actual image is exposed to your eye for only a small amount of time (which is also why they flicker and cause eye strain). It's like a prehistoric version of Lightboost/ULMB (Ultra Low Motion Blur).

When you have inadequate pixel response times, you get ghosting, an entirely different problem. You get the "ghost" of previous frames in the current frame because the pixels can't keep up with the color transition, and are continuously in the middle of transitioning by the time the next frame appears, causing trails of improperly transitioned colors to appear behind moving objects.

http://www.overclock.net/t/1430257/what ... r-monitors
TL;DR: Inputs can be calculated multiple times between still frames and are not dependent on your monitor refresh rate. What you want to look at is your tablet/mouse polling rate, beyond which adding more frames becomes useless.

When talking about tablet or mouse input, comfort is not entirely dependent on the perceivable refresh rate of your screen or the amount of motion blur, but on input lag, which is decided by your in-game FPS. There is a difference between 120 fps and 240 fps input lag. I don't think input lag is highly noticeable after 240.

I haven't tried playing with a 120 Hz monitor, but the difference between 60 Hz and 80 Hz is huge when looking at notes and such. If you have enough rhythm skill and you know exactly where your cursor is going to land, refresh rate is not as important a factor as framerate.

At some point I tried playing with the game capped at 30 fps as an experiment. Though it was very uncomfortable, I didn't see as huge a hit in my playing ability as I would with a framerate cap of 60 (V-Sync), which was surprising.

Correct me if I'm wrong.
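The polling-rate point can be sanity-checked with a crude latency budget (the additive model and all numbers here are assumptions for illustration): on average an input waits half a polling interval for the next mouse report, half a frame interval for the next game frame, and half a refresh interval for the next scanout.

```python
# Crude average input-to-photon latency, in milliseconds:
# half a polling interval + half a frame interval + half a
# refresh interval (a simplified additive model).
def avg_latency_ms(poll_hz, fps, refresh_hz):
    return 0.5 * (1000 / poll_hz + 1000 / fps + 1000 / refresh_hz)

for fps in (120, 240, 500, 1000):
    print(f"{fps:>4} fps: ~{avg_latency_ms(1000, fps, 60):.1f} ms average")
```

With a 1000 Hz mouse on a 60 Hz screen, going from 120 to 240 fps trims roughly 2 ms, while 500 to 1000 fps trims only 0.5 ms; the fixed polling and refresh terms dominate, which is why the returns diminish.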