How Eye Contact Builds Connection

Summary: A new study reveals that the sequence of eye movements—not just eye contact itself—plays a key role in how we interpret social cues, even with robots. Researchers found that looking at an object, making eye contact, then looking back at the object was the most effective way to signal a request for help.

This pattern prompted the same response whether participants interacted with a human or a robot, highlighting how tuned humans are to context in gaze behavior. These insights could improve social robots, virtual assistants, and communication training for people and professionals who rely heavily on non-verbal cues.

Key Facts:

  • Best Sequence: An object–eye contact–object gaze sequence signals a clear request for help.
  • Human or Robot: Participants responded equally to the gaze pattern in humans and robots.
  • Practical Impact: Findings can improve social robots, virtual assistants, and communication training for diverse settings.

Source: Flinders University

For the first time, a new study has revealed that how and when we make eye contact—not just the act itself—plays a crucial role in how we understand and respond to others, including robots.

Led by cognitive neuroscientist Dr Nathan Caruana, researchers from the HAVIC Lab at Flinders University asked 137 participants to complete a block-building task with a virtual partner.

Image: A man making eye contact with a robot. Credit: Neuroscience News

They discovered that the most effective way to signal a request was through a specific gaze sequence: looking at an object, making eye contact, then looking back at the same object. This timing made people most likely to interpret the gaze as a call for help.
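The "request" signal the researchers describe is a simple temporal pattern: a gaze shift to an object, then eye contact, then a return to the same object. As an illustration only (this is a hypothetical sketch, not code from the study's materials; the function name and gaze encoding are invented for this example), the pattern could be checked like this:

```python
# Hypothetical sketch: encode a three-shift gaze sequence as a list of
# targets and check for the pattern described in the study --
# object, then eye contact, then the same object again.

EYE_CONTACT = "eye_contact"

def is_request_pattern(gaze_sequence):
    """Return True if a three-shift gaze sequence matches the
    object -> eye contact -> same-object pattern."""
    if len(gaze_sequence) != 3:
        return False
    first, middle, last = gaze_sequence
    return (
        first != EYE_CONTACT      # first shift is toward an object
        and middle == EYE_CONTACT  # eye contact in the middle
        and last == first          # return to the same object
    )

print(is_request_pattern(["mug", "eye_contact", "mug"]))   # True
print(is_request_pattern(["eye_contact", "mug", "mug"]))   # False
```

The point of the sketch is that the same three ingredients in a different order (for example, eye contact first) would not match, which mirrors the study's finding that ordering, not frequency, carries the communicative signal.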

Dr Caruana says that identifying these key patterns in eye contact offers new insights into how we process social cues in face-to-face interactions, paving the way for smarter, more human-centred technology.

“We found that it’s not just how often someone looks at you, or whether they look at you last in a sequence of eye movements, but the context of their eye movements that makes that behaviour appear communicative and relevant,” says Dr Caruana, from the College of Education, Psychology and Social Work.

“And what’s fascinating is that people responded the same way whether the gaze behaviour was observed from a human or a robot.

“Our findings have helped to decode one of our most instinctive behaviours and how it can be used to build better connections whether you’re talking to a teammate, a robot, or someone who communicates differently.

“It aligns with our earlier work showing that the human brain is broadly tuned to see and respond to social information and that humans are primed to effectively communicate and understand robots and virtual agents if they display the non-verbal gestures we are used to navigating in our everyday interactions with other people.”

The authors say the research can directly inform how we build social robots and virtual assistants that are becoming ever more ubiquitous in our schools, workplaces and homes, while also having broader implications beyond tech.

“Understanding how eye contact works could improve non-verbal communication training in high-pressure settings like sports, defence, and noisy workplaces,” says Dr Caruana.

“It could also support people who rely heavily on visual cues, such as those who are hearing-impaired or autistic.”

The team is now expanding the research to explore other factors that shape how we interpret gaze, such as the duration of eye contact, repeated looks, and our beliefs about who or what we are interacting with (human, AI, or computer-controlled).

The HAVIC Lab is currently conducting several applied studies exploring how humans perceive and interact with social robots in various settings, including education and manufacturing.

“These subtle signals are the building blocks of social connection,” says Dr Caruana.

“By understanding them better, we can create technologies and training that help people connect more clearly and confidently.”

The HAVIC Lab is affiliated with the Flinders Institute for Mental Health and Wellbeing and is a founding partner of the Flinders Autism Research Initiative.

About this robotics and social neuroscience research news

Author: Yaz Dedovic
Source: Flinders University
Contact: Yaz Dedovic – Flinders University
Image: The image is credited to Neuroscience News

Original Research: Open access.
“The temporal context of eye contact influences perceptions of communicative intent” by Nathan Caruana et al. Royal Society Open Science


Abstract

The temporal context of eye contact influences perceptions of communicative intent

This study examined the perceptual dynamics that influence the evaluation of eye contact as a communicative display.

Participants (n = 137) completed a task where they decided if agents were inspecting or requesting one of three objects.

Each agent shifted its gaze three times per trial, with the presence, frequency and sequence of eye contact displays manipulated across six conditions.

We found significant differences between all gaze conditions.

Participants were most likely, and fastest, to perceive a request when eye contact occurred between two averted gaze shifts towards the same object.

Findings suggest that the relative temporal context of eye contact and averted gaze, rather than eye contact frequency or recency, shapes its communicative potency.

Commensurate effects were observed when participants completed the task with agents that appeared as humans or a humanoid robot, indicating that gaze evaluations are broadly tuned across a range of social stimuli.

Our findings advance the field of gaze perception research beyond paradigms that examine singular, salient and static gaze cues and inform how signals of communicative intent can be optimally engineered in the gaze behaviours of artificial agents (e.g. robots) to promote natural and intuitive social interactions.
