Film in the cinema, for example, runs at 24 FPS (video on VHS is closer to 30), with each frame containing a small amount of blur for transition purposes, and the human brain notices nothing. What you perceive as "changes" in the frames above 30 or so are really changes in light intensity from the image. The theory, developed by some drunk scientists years ago, is usually called persistence of vision, or retinal retention.
Between frames, your eye retains the image for a fraction of a second, so even though there are gaps between the frames, your brain doesn't "see" them because it holds on to the image for a little while, albeit a split second.
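Just to put rough numbers on it, here's a quick back-of-the-envelope sketch in Python (the ~40 ms "hold" time is my own ballpark assumption, not something from the article) comparing the time between frames at different frame rates to that window:

# Rough sketch: compare the gap between successive frames at a given frame
# rate to an assumed persistence-of-vision window. The 40 ms figure is an
# assumption (a commonly quoted ballpark); real perception is messier.

PERSISTENCE_MS = 40.0  # assumed: how long the eye "holds" an image, in ms

def frame_interval_ms(fps: float) -> float:
    """Time between successive frames, in milliseconds."""
    return 1000.0 / fps

for fps in (24, 30, 60, 120):
    interval = frame_interval_ms(fps)
    smooth = interval <= PERSISTENCE_MS
    print(f"{fps:>3} FPS -> {interval:5.1f} ms per frame "
          f"({'within' if smooth else 'longer than'} the assumed {PERSISTENCE_MS:.0f} ms window)")

At 24 FPS each frame lasts about 42 ms, which is right around that assumed window, so the gaps get papered over and the motion looks continuous.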
Ever look at your computer screen and then look at a white wall? If so, you'll see the image of the screen for a little while. If you're at a movie theater, the movie looks as smooth as glass on the screen, yet when you look at the projection booth you see the light flicker on and off. Those are transitions the eye doesn't make quickly enough because of retinal retention. Of course, what would drunk scientists know?...
Seems I do remember mention that higher FPS was less strain on the eyesight because of the light intensity changes, so less effort for your eyes to transition between those images could well count as "seeing" better.
I'll try to find the article I got this from years ago and post a link for it. It was an interesting read.