Summary: In a fascinating twist of perception, a recent study reveals that our eyes can trick our brains into misjudging the size of objects around us. By cleverly blurring parts of images, researchers demonstrated how participants misperceived full-scale railway scenes as smaller than their model counterparts.
This surprising finding not only sheds light on the flexibility and occasional flaws of our visual system, but also hints at profound implications for everyday situations, from driving to criminal justice.
The bottom line? Size matters, but our eyes might not always get it right.
The study, conducted by researchers from the University of York and Aston University, demonstrated that the human visual system can be tricked into making inaccurate assumptions about the size of objects in the surrounding world.
By blurring the upper and lower parts of photographs of full-scale railway scenes, the researchers were able to make participants perceive these scenes as smaller than small-scale models of railways that were not blurred.
The findings of this study not only reveal the flexibility of the human visual system but also its susceptibility to mistakes, indicating that our perception of object size can be influenced and sometimes misled by factors such as ‘defocus blur.’
Source: University of York
A new study has shown that the human visual system can ‘trick’ the brain into making inaccurate assumptions about the size of objects in the world around us.
The research findings could have implications for many aspects of everyday life, such as driving, how eyewitness accounts are treated in the criminal justice system, and security issues, such as drone sightings.
The research team from the University of York and Aston University presented participants with photographs of full-scale railway scenes, which had the upper and lower parts of the image blurred, as well as photographs of small-scale models of railways that were not blurred.
Participants were asked to compare each image and decide which was the ‘real’ full-scale railway scene. Participants consistently perceived the blurred real trains as smaller than the unblurred models.
Dr Daniel Baker, from the University of York’s Department of Psychology, said: “In order for us to determine the real size of objects that we see around us, our visual system needs to estimate the distance to the object.
“To arrive at an understanding of absolute size it can take into account the parts of the image that are blurred out – a bit like the out-of-focus areas that a camera produces – which involves a bit of complicated mathematics to give the brain the knowledge of spatial scale.
“This new study, however, shows that we can be fooled in our estimates of object size. Photographers take advantage of this using a technique called ‘tilt-shift miniaturisation’, which can make life-size objects appear to be scale models.”
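The fake tilt-shift effect described above can be sketched in code. The following is a minimal, illustrative Python example, not the researchers' actual stimulus pipeline: it blurs each row of a grayscale image more strongly the further that row lies from a chosen in-focus line, producing the vertical blur gradient that makes full-scale scenes look like miniatures. All function names and parameters here are illustrative assumptions.

```python
def blur_radius(row, height, focus_row, max_radius):
    # Blur grows linearly with distance from the in-focus row,
    # mimicking the defocus gradient of a shallow depth of field.
    span = max(focus_row, height - 1 - focus_row)
    frac = abs(row - focus_row) / span
    return round(frac * max_radius)

def box_blur_row(pixels, radius):
    # Simple 1-D box blur with edge clamping; radius 0 leaves the row sharp.
    if radius == 0:
        return list(pixels)
    n = len(pixels)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n - 1, i + radius)
        window = pixels[lo:hi + 1]
        out.append(sum(window) / len(window))
    return out

def fake_tilt_shift(image, focus_row, max_radius):
    # image: a list of rows of grayscale values.
    # Rows near focus_row stay sharp; rows toward the top and
    # bottom of the frame receive progressively stronger blur.
    h = len(image)
    return [box_blur_row(row, blur_radius(r, h, focus_row, max_radius))
            for r, row in enumerate(image)]
```

A real implementation would typically use a 2-D Gaussian blur rather than a per-row box blur, but the key idea is the same: the blur gradient alone, with no change to the scene content, is what shifts the perceived scale.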
The findings demonstrate that the human visual system is highly flexible – sometimes capable of accurate size perception by exploiting what is known as ‘defocus blur’, but at other times susceptible to misleading cues and prone to misjudging real-world object size.
Professor Tim Meese, from Aston University, said: “Our results indicate that human vision can exploit defocus blur to infer perceptual scale but that it does this crudely.
“Overall, our findings provide new insights into the computational mechanisms used by the human brain in perceptual judgments about the relation between ourselves and the external world.”
Blurring the boundary between models and reality: Visual perception of scale assessed by performance
One of the primary jobs of visual perception is to build a three-dimensional representation of the world around us from our flat retinal images.
These are a rich source of depth cues but no single one of them can tell us about scale (i.e., absolute depth and size). For example, the pictorial depth cues in a (perfect) scale model are identical to those in the real scene that is being modelled.
Here we investigate image blur gradients, which derive naturally from the limited depth of field available for any optical device and can be used to help estimate visual scale.
By manipulating image blur artificially to produce what is sometimes called fake tilt shift miniaturization, we provide the first performance-based evidence that human vision uses this cue when making forced-choice judgements about scale (identifying which of an image pair was a photograph of a full-scale railway scene, and which was a 1:76 scale model).
The orientation of the blur gradient (relative to the ground plane) proves to be crucial, though its rate of change is less important for our task, suggesting a fairly coarse visual analysis of this image parameter.