
> ...a team of neuroscientists from MIT has found that the human brain can process entire images that the eye sees for as little as 13 milliseconds...That speed is far faster than the 100 milliseconds suggested by previous studies...


This is easy to internalize when you remember that humans can tell the difference between 30fps and 60fps, which is one frame every ~33ms vs. one every ~17ms.
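For reference, the frame times are just 1000/fps; a quick sketch in Python (plain arithmetic, nothing more):

    # milliseconds per frame at common refresh rates
    for fps in (30, 60, 144, 1000):
        print(f"{fps:>5} fps -> {1000 / fps:.1f} ms per frame")
    # 30 fps -> 33.3 ms, 60 -> 16.7 ms, 144 -> 6.9 ms, 1000 -> 1.0 ms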


You can tell the difference between 144Hz and 60Hz, especially if user input is involved (e.g. using the mouse to look around). The temporal resolution of human perception depends heavily on the exact scenario, as the visual system is complex. Even within the eye you have high receptor density in the center of vision and low density outside it, with different light sensitivities between the two, which makes the eye hard to model. To further complicate matters, the brain does additional processing that can result in hyperacuity.

A good example of hyperacuity is reading Vernier scales, where you can see differences well below the angular resolution of the eye.


Humans can tell the difference between 1000fps and 2000fps too (with a test signal of a point light source flickering at >500Hz, viewed with rapid eye movement to produce the phantom array effect; note that temporal anti-aliasing is needed to avoid cheating with easily visible beat patterns). This doesn't mean we can process an image in 0.5ms.
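To make the beat-pattern caveat concrete, here's a rough sketch (a hypothetical illustration in Python/NumPy, not anyone's actual stimulus code): point-sampling a 501Hz flicker at a 500Hz output rate aliases down to a slow 1Hz on/off beat, while averaging the flicker over each frame interval (temporal anti-aliasing) keeps every frame near the correct mean luminance.

    import numpy as np

    def point_sample(flicker_hz, out_hz, n_frames):
        # Naive sampling: evaluate the square-wave flicker at each frame instant.
        # Produces a slow, easily visible beat when flicker_hz is close to a
        # multiple of out_hz (here: 501 Hz sampled at 500 Hz -> 1 Hz beat).
        t = np.arange(n_frames) / out_hz
        return (np.sin(2 * np.pi * flicker_hz * t) > 0).astype(float)

    def box_filtered(flicker_hz, out_hz, n_frames, oversample=64):
        # Temporal anti-aliasing: average the flicker over each frame interval,
        # so each output frame carries the mean luminance instead of an
        # aliased point sample.
        t = (np.arange(n_frames * oversample) + 0.5) / (out_hz * oversample)
        fine = (np.sin(2 * np.pi * flicker_hz * t) > 0).astype(float)
        return fine.reshape(n_frames, oversample).mean(axis=1)

    naive = point_sample(501, 500, 2000)     # 501 Hz flicker, 500 Hz output, 4 s
    filtered = box_filtered(501, 500, 2000)
    # The naive signal is fully on for ~0.5 s, then fully off for ~0.5 s (the beat);
    # the filtered signal stays close to 0.5 on every frame.
    print(naive[:600:100])
    print(filtered[:600:100].round(3))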



