In the past, I looked at what the resolution of the human eye was. It's a tricky question, but of course it isn't just megapixels that concern photographers – we get quite excited about a whole range of stats and, when it comes to video, frame rate is crucial.
If you start shooting video with a camera, it can come as an understandable source of consternation that you suddenly need to think about frame rate. Why? We know that movies are 24fps, but most of us are also aware that 25, 30, 50, and 60fps are usually options on the menu (I'll stick to the rounded numbers), and I'll explain where those numbers come from in a moment.
If you've only ever lived in the era of flat-panel TVs, you might be forgiven for not knowing that analog TVs worked by drawing the picture as a stack of horizontal lines, one after the other, from the top of the screen to the bottom. The process was essentially like very precisely waving a laser pen across the screen; the screen glowed for a moment after the light hit it.
A 'frame' – in America and Japan – was made up of 525 lines, of which around 486 carried the picture (480 of them surviving into the digital era). At least in part for convenience, the TV was synced to the frequency of the power supply – 60Hz – so the screen was scanned 60 times a second. To keep the glow even across the screen, each scan only drew every other line (a trick called interlacing), so each of those 60 passes – or 'fields' – contained just half the picture, around 240 visible lines, and two fields together made one full frame.
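If it helps to picture what interlacing actually does, here's a tiny Python sketch – a simplification using the 480 visible lines that made it into the digital era – showing how two alternating half-pictures ('fields') weave back together into one full frame:

```python
# A tiny sketch of interlacing: two half-resolution 'fields', drawn one after
# the other, interleave into a single full frame. Line count simplified to the
# 480 visible lines that survived into the digital era.
VISIBLE_LINES = 480

odd_field = [f"line {n}" for n in range(1, VISIBLE_LINES + 1, 2)]   # lines 1, 3, 5...
even_field = [f"line {n}" for n in range(2, VISIBLE_LINES + 1, 2)]  # lines 2, 4, 6...

frame = [None] * VISIBLE_LINES
frame[0::2] = odd_field    # first pass of the beam
frame[1::2] = even_field   # second pass fills in the gaps

print(len(odd_field), len(even_field), len(frame))  # 240 240 480
```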
So when people started thinking more digitally, images were saved as 30 (full) frames per second at the full resolution of 480 lines. Because things weren't analog any more, a horizontal resolution was needed too – and, after a while, that settled on square pixels.
In Europe, and most of the rest of the world, 50Hz electricity meant 50 fields – 25 full frames – per second, and the lower frame rate left room for more lines in each frame: 625 in total, of which 576 were visible, which is where the 576i picture comes from.
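To put the two families of numbers side by side, here's the same back-of-the-envelope arithmetic as a few lines of Python – rounded figures only, and obviously a gross simplification of what the broadcast engineers were really up to:

```python
# Rough derivation of the two TV families from the local mains frequency.
# (Rounded numbers only – real-world NTSC actually runs at 29.97fps.)
systems = {
    "US/Japan (NTSC)": {"mains_hz": 60, "visible_lines": 480},
    "Europe etc. (PAL)": {"mains_hz": 50, "visible_lines": 576},
}

for name, s in systems.items():
    fields_per_second = s["mains_hz"]           # one interlaced field per mains cycle
    frames_per_second = fields_per_second // 2  # two fields make one full frame
    lines_per_field = s["visible_lines"] // 2
    print(f"{name}: {fields_per_second} fields/s -> {frames_per_second}fps, "
          f"{s['visible_lines']}i ({lines_per_field} visible lines per field)")
```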
If you're wondering what happened to all those spare lines, they were used to provide a kind of one-way internet – teletext, known as Ceefax on the BBC in the UK – with a chip in the TV decoding the data broadcast in those lines into pages of text. (This was how my grandad checked cricket scores 'live' when the coverage went to satellite TV.)
Anyway, that's the origin of the split between 25/50fps and 30/60fps. Multiples of these numbers also instinctively make sense for slow motion – 120fps footage played back in a 30fps project runs at quarter speed – so you'll see rates like these on a lot of cameras for slow-mo, but you'll rarely see anything shot at that rate in the cinema (Will Smith's Gemini Man being a rare example – and even that was generally not screened at 120fps due to equipment limitations).
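If you'd like the slow-motion arithmetic spelled out, here's a quick sketch – a hypothetical helper for illustration, not any camera's actual menu logic:

```python
# Slow motion is just arithmetic: shoot at a high frame rate, play it back on a
# slower timeline, and time stretches by the ratio of the two.
def slow_motion_factor(capture_fps: float, timeline_fps: float) -> float:
    """How many times slower the footage plays back."""
    return capture_fps / timeline_fps

for capture in (60, 120, 240):
    factor = slow_motion_factor(capture, timeline_fps=30)
    print(f"{capture}fps in a 30fps project -> {factor:g}x slow motion "
          f"(1s of action takes {factor:g}s on screen)")
```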
The real question is, do the human eyes have a frame rate as such – or, perhaps more usefully, is there a specific rate at which they can be fooled?
In the past I've looked at the eye's rods and cones and suggested an equivalent of around 124 megapixels using some back-of-the-envelope math – made complicated for all kinds of reasons, not least the fact that the eye's resolution isn't even across the field of view.
There is a hard limit on the amount of 'data' each eye can send to the brain for processing: the optic nerve. Its bandwidth has been estimated at roughly 10 megabits per second – essentially the same as an old-school Ethernet connection – according to a study by neuroscientist Kristin Koch and others.
Now, if the brain were handling 124MP raw images at 12 bits per pixel, that would be roughly 1.5 gigabits per frame – getting on for 45 gigabits every second at a mere 30fps – so it's fair to assume that things aren't working quite that way!
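For anyone who fancies checking my working, here it is as a few lines of Python – the 124MP and 12-bit figures are my earlier back-of-the-envelope assumptions, and the optic-nerve number is the estimate from the Koch study mentioned above:

```python
# Back-of-the-envelope: a hypothetical 'raw' feed from the eye versus the
# estimated bandwidth of the optic nerve.
PIXELS = 124e6            # my earlier ~124MP estimate of the eye's 'resolution'
BITS_PER_PIXEL = 12       # as if it were a 12-bit raw file
FRAME_RATE = 30           # frames per second, for the sake of argument
OPTIC_NERVE_BPS = 10e6    # ~10 megabits per second, per Koch et al.

bits_per_frame = PIXELS * BITS_PER_PIXEL
bits_per_second = bits_per_frame * FRAME_RATE

print(f"{bits_per_frame / 1e9:.2f} Gbit per frame")     # ~1.49 Gbit
print(f"{bits_per_second / 1e9:.1f} Gbit per second")   # ~44.6 Gbit/s
print(f"{bits_per_second / OPTIC_NERVE_BPS:,.0f}x the optic nerve's estimated capacity")
```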
While the brain will, presumably, be sending signals back to handle eye movement and pupil adjustment, most of the 'data' flow is one-way – and the eye is actually sending a lot less data than a raw feed would suggest, because it constantly shifts the small high-resolution fovea around the scene. Since we can't be sure where in the frame someone will be looking, as filmmakers we end up capturing a lot more resolution than any one viewer needs!
I mention this because the time it takes to move the eye from one point to another and process what it lands on has been calculated at about 400ms – meaning that, on top of the frame-rate question, you're probably doing that at least twice a second.
Anyway, plenty of people will have told you that 24, 30, or 60fps is all that's needed to trick the eye, because it seems self-evident from their medium. Indeed, Disney cartoons were traditionally animated 'on twos' – each drawing held for two frames, so just 12 new images per second – and the brain still follows the motion happily. But in all these cases we're also aware, one way or another, that we are watching something.
Those figures are based on the concept of flicker fusion frequency – the point at which the flicker blends together and appears continuous. That, though, is simply 'good enough' – not the same as the best the eye can achieve.
According to a 2014 study by Mary Potter and others at MIT, the eye and brain can process and understand an image seen for just 13 milliseconds. You can fit just under 77 of those into a second, so around 77 frames per second would be on the edge of each frame being individually perceptible.
Of course, that's not quite what the study was about – in it, a picture of a smiling couple was flashed up and people were asked what they'd seen. It might be of as much use to subliminal advertisers!
The real lesson, though, is that there isn't a 'frame rate' as such, because the mechanism is biological, not mechanical. Frames are, ultimately, a simplistic engineering solution – even analog TV partially found its way around them with interlacing, so imagine what the human brain can do!
Either way, if you're looking for the best monitor for video editing, you'll need to be flexible.