Summary: A new virtual reality study reveals people’s perception of taste can be altered by what they experience in their surroundings.
Source: Cornell University.
Humans not only relish the sweetness, savoriness and saltiness of foods, but are also influenced by the environment in which they eat. Cornell University food scientists used virtual reality to show how people’s perception of real food can be altered by their surroundings, according to research published in the Journal of Food Science.
“When we eat, we perceive not just the taste and aroma of foods; we get sensory input from our surroundings – our eyes, ears, even our memories about surroundings,” said Robin Dando, associate professor of food science and senior author of the study.
About 50 panelists wearing virtual reality headsets were each given three identical samples of blue cheese to eat. Custom-recorded 360-degree videos virtually placed the participants in three settings: a standard sensory booth, on a pleasant park bench, and in the Cornell cow barn.
The panelists were unaware that the cheese samples were identical, and rated the pungency of the blue cheese significantly higher in the cow barn setting than in the sensory booth or the virtual park bench.
To control for the pungency results, panelists also rated the saltiness of the three samples – and researchers found there was no statistical difference among them.
The purpose of this project was to develop an easy-to-implement and affordable method for adapting virtual reality technology for use in food sensory evaluation, said Dando.
Our environs are a critical part of the eating experience, he said. “We consume foods in surroundings that can spill over into our perceptions of the food.” This kind of testing offers the advantages of convenience and flexibility compared with building physical environments.
“This research validates that virtual reality can be used, as it provides an immersive environment for testing,” said Dando. “Visually, virtual reality imparts qualities of the environment itself to the food being consumed – making this kind of testing cost-efficient.”
Source: Lindsey Hadlock – Cornell University
Publisher: Organized by NeuroscienceNews.com.
Image Source: NeuroscienceNews.com image is in the public domain.
Original Research: Abstract for “Dynamic Context Sensory Testing–A Proof of Concept Study Bringing Virtual Reality to the Sensory Booth” by Alina Stelick, Alexandra G. Penano, Alden C. Riak, and Robin Dando in Journal of Food Science. Published July 25, 2018.
Cite This Article: Cornell University. “Eating with Your Eyes: Virtual Reality Can Alter Taste.” NeuroscienceNews, 16 October 2018. https://neurosciencenews.com/taste-virtual-reality-10028/
Dynamic Context Sensory Testing–A Proof of Concept Study Bringing Virtual Reality to the Sensory Booth
Eating is a multimodal experience. When we eat, we perceive not just the taste and aroma of foods, but also their visual, auditory, and tactile properties, as well as sensory input from our surroundings. Foods are commonly tested within a sensory booth, designed specifically to limit such input. Foods are not commonly experienced in such isolation, but alongside this context, which can alter how a food is perceived. In this study, we show that the sensory properties of food can be altered by changing the environment it is consumed in, using virtual reality (VR) to provide an immersive, dynamic context to the eating experience. The purpose of this project was to develop an affordable and easy‐to‐implement methodology for adapting VR technology to sensory evaluation, without prohibitive amounts of expensive equipment or specialized programming knowledge. Virtual environments were formed by processing custom‐recorded 360 degree videos and overlaying audio, text, sensory scales, and images to simulate a typical sensory evaluation ballot within the VR headset. In a pilot test, participants were asked to taste 3 identical blue cheese samples in 3 virtual contexts–a sensory booth, a park bench, and a cow barn. Respondents rated their liking of the sample, as well as its saltiness, and pungency, attributes either reflective of one context (pungency in the barn), or presumably unrelated (saltiness). Panelists duly rated the sample’s flavor as being more pungent when consumed in the barn context. These results provide proof of concept for VR in applied sensory studies, providing an immersive context to a sensory test while remaining in place.
We consume foods in environments that can “spill over” into our perceptions of the food. Thus, we consider some foods “unsuitable” for certain settings, with others deemed more suitable for this locale. This has been studied for many years as sensory “context,” with written descriptions, pictures, or videos of such contexts. We present a method for generating virtual reality contexts without any specialist programming knowledge, for a few hundred dollars. In an accompanying pilot test, perception of a sample was significantly influenced by the VR context in which it was delivered.