They say that the eyes are the windows to the soul. However, to get a real idea of what a person is up to, the best place to check is right below the eyes. That's according to UC Santa Barbara researchers Miguel Eckstein and Matt Peterson, whose findings are published in the Proceedings of the National Academy of Sciences.

“It’s pretty fast, it’s effortless — we’re not really aware of what we’re doing.”

Using an eye tracker and more than 100 photos of faces, Eckstein and graduate research assistant Peterson followed the gaze of the experiment’s participants to determine where they look in the first crucial moments of determining a person’s identity, gender, and emotional state.

“For the majority of people, the first place we look at is somewhere in the middle, just below the eyes,” Eckstein said. One possible reason is that we are trained from youth to look there because it is considered polite in some cultures, or because it allows us to figure out where the other person’s attention is focused.

However, Peterson and Eckstein hypothesize that, despite the brevity of the glance (just 250 milliseconds), the relatively featureless point of focus, and the fact that we are usually unaware we are doing it, the brain is actually using sophisticated computations to plan an eye movement that ensures the highest accuracy in tasks that are evolutionarily important in determining flight, fight, or love at first sight.

“When you look at a scene, or at a person’s face, you’re not just using information right in front of you,” said Peterson. At a conversational distance, a face tends to span a large area of the visual field, and there is information to be gleaned not just from the eyes but also from features like the nose and the mouth. However, the area around the eyes contains fine details that carry important information and require high-resolution processing, whereas features like the mouth are larger and can be read without a direct gaze.

When participants were directed to judge the identity, gender, and emotion of the people in the photos while looking directly at other locations, such as the forehead or the mouth, they did not perform as well as they did when looking close to the eyes.

The research by Peterson and Eckstein has resulted in sophisticated new algorithms to model optimal gaze patterns when looking at faces. The algorithms could potentially be used to provide insight into conditions like schizophrenia and autism, which are associated with uncommon gaze patterns, or prosopagnosia — an inability to recognize someone by his or her face.
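The underlying idea can be sketched loosely in code. The toy Python snippet below is not the authors' actual algorithm; it is an illustration, with invented feature positions, weights, and a made-up resolution-falloff constant, of why a fixation point just below the eyes can win out when visual detail degrades with distance from the point of gaze.

```python
# Toy sketch (not the published model): pick the fixation point that maximizes
# expected recognition accuracy when resolution falls off away from gaze.
# All positions, weights, and constants below are illustrative assumptions.

import math

# Hypothetical face-feature locations in normalized coordinates (y grows downward),
# with assumed informativeness weights; the eye region needs fine detail,
# while coarser features like the mouth can be read peripherally.
FEATURES = {
    "left_eye":  {"pos": (0.35, 0.40), "weight": 0.35, "needs_detail": True},
    "right_eye": {"pos": (0.65, 0.40), "weight": 0.35, "needs_detail": True},
    "nose":      {"pos": (0.50, 0.55), "weight": 0.15, "needs_detail": False},
    "mouth":     {"pos": (0.50, 0.75), "weight": 0.15, "needs_detail": False},
}

def resolution(distance, falloff=4.0):
    """Fraction of detail preserved at a given distance from fixation (0..1)."""
    return math.exp(-falloff * distance)

def expected_accuracy(fixation):
    """Sum feature weights, discounted by how well each feature is resolved.
    Detail-hungry features are penalized heavily when far from fixation;
    coarse features degrade more gently."""
    total = 0.0
    for f in FEATURES.values():
        res = resolution(math.dist(fixation, f["pos"]))
        total += f["weight"] * (res if f["needs_detail"] else 0.5 + 0.5 * res)
    return total

def best_fixation(step=0.05):
    """Brute-force search over a grid of candidate fixation points."""
    n = int(1 / step) + 1
    candidates = [(x * step, y * step) for x in range(n) for y in range(n)]
    return max(candidates, key=expected_accuracy)

if __name__ == "__main__":
    x, y = best_fixation()
    print(f"Best fixation ~ ({x:.2f}, {y:.2f})")  # lands just below the eyes
```

With these made-up numbers, the search settles on a point centered between and slightly below the eyes, mirroring the qualitative finding; the actual models in the research are, of course, far more sophisticated.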