01-17-2019, 06:37 AM
(01-17-2019, 06:09 AM)IndigoGeminiWolf Wrote: I never understood why people want their game to run at 100 frames/second. The human eye can only distinguish 30 fps.
Maybe the extra overhead allows them to turn up options like ray tracing and anti-aliasing.
This is actually a very common myth. Your eyes can actually perceive much more than that. The average limit of perception is somewhere around 150 fps, though it probably isn't accurate to think of human eyes as seeing in "frames per second" (like a camera or something) at all, since that's a digital concept and our eyes work in a more analog way as far as visual sensation is concerned. Just watch the same video at 30 fps and at 60 fps and you will clearly see the difference, which puts that myth in its coffin. Some people can perceive differences as high as 1000 fps, though that isn't everyone, and there is a point of diminishing returns. Again, 150 is roughly the average, but there is a lot of biological variation. Just be sure your monitor has the corresponding refresh rate (Hz) to accommodate whatever frame rate you are trying to achieve.
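To put those numbers in perspective, the relationship between frame rate and how long each frame stays on screen is just frame_time = 1000 / fps. Here's a quick sketch (the list of rates is only illustrative, picked to match common monitor refresh rates):

```python
# How long each frame is displayed at a given frame rate.
# frame_time_ms = 1000 / fps

def frame_time_ms(fps: float) -> float:
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

# Common frame/refresh rates, for illustration only.
for fps in (24, 30, 60, 144, 150):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.2f} ms per frame")
```

So 30 fps means a new image only every ~33 ms, while 150 fps is a new image every ~6.7 ms, which is why the difference is visible well past 30.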
The myth most likely arises from the fact that anything below about 24 to 30 fps looks noticeably choppy, so as far as films are concerned they tried to shoot at least that many fps so the result seemed reasonably fluid and lifelike. It might also be due to the fact that a single photoreceptor in your eye has a refractory period: once a photon hits it, there is a time delay before it is able to properly sense another photon and send the signal to the brain. Hence people assuming that refractory period = 1/30 of a second.