Summary: Researchers engineered an electronic “tongue” that potentially lays a framework for developing artificial intelligence (AI) with emotional intelligence, particularly related to taste and eating habits.
This initiative explores the intersection of physiological and psychological factors that determine eating preferences and behaviors in humans, seeking to mimic them in AI systems.
The artificial gustatory system, created using 2D materials like graphene for chemical sensing and molybdenum disulfide for circuit logic, attempts to decode and replicate human gustation processes and responses.
Future applications could range from emotionally intelligent AI-curated diets and personalized restaurant offerings to refining and advancing other sensory inputs like sight and touch in AI systems.
Artificial Gustatory System: The electronic tongue and gustatory cortex use 2D materials and can ‘taste’ different substances, with potential to mimic the complexity of human taste receptors.
Emotional Intelligence: The research emphasizes incorporating emotional aspects into AI, such as psychological urges influencing eating even when physiologically satisfied, aspiring to create a system that demonstrates emotional intelligence similar to humans.
Future Applications: Potential applications include developing AI systems that can curate diets, offer personalized meal suggestions, and potentially be applied in sectors like food tasting and quality control, while also paving the way for enhancing AI’s sensory and emotional intelligence in other domains.
Source: Penn State
Can artificial intelligence (AI) get hungry? Develop a taste for certain foods? Not yet, but a team of Penn State researchers is developing a novel electronic tongue that mimics how taste influences what we eat based on both needs and wants, providing a possible blueprint for AI that processes information more like a human being.
Human behavior is complex, shaped by a nebulous interplay between our physiological needs and psychological urges. While artificial intelligence has made great strides in recent years, AI systems do not incorporate the psychological side of human intelligence. Emotional intelligence, for example, is rarely considered part of AI.
“The main focus of our work was how could we bring the emotional part of intelligence to AI,” said Saptarshi Das, associate professor of engineering science and mechanics at Penn State and corresponding author of the study published recently in Nature Communications.
“Emotion is a broad field and many researchers study psychology; however, for computer engineers, mathematical models and diverse data sets are essential for design purposes. Human behavior is easy to observe but difficult to measure and that makes it difficult to replicate in a robot and make it emotionally intelligent. There is no real way right now to do that.”
Das noted that our eating habits are a good example of emotional intelligence and the interaction between the physiological and psychological state of the body. What we eat is heavily influenced by the process of gustation, which refers to how our sense of taste helps us decide what to consume based on flavor preferences. This is different than hunger, the physiological reason for eating.
“If you are someone fortunate to have all possible food choices, you will choose the foods you like most,” Das said. “You are not going to choose something that is very bitter, but likely try for something sweeter, correct?”
Anyone who has felt full after a big lunch and still was tempted by a slice of chocolate cake at an afternoon workplace party knows that a person can eat something they love even when not hungry.
“If you are given food that is sweet, you would eat it in spite of your physiological condition being satisfied, unlike if someone gave you say a hunk of meat,” Das said. “Your psychological condition still wants to be satisfied, so you will have the urge to eat the sweets even when not hungry.”
While there are still many questions regarding the neuronal circuits and molecular-level mechanisms within the brain that underlie hunger perception and appetite control, Das said, advances such as improved brain imaging have offered more information on how these circuits work in regard to gustation.
Taste receptors on the human tongue convert chemical data into electrical impulses. These impulses are then sent through neurons to the brain’s gustatory cortex, where cortical circuits, an intricate network of neurons in the brain, shape our perception of taste.
The researchers have developed a simplified biomimetic version of this process, including an electronic “tongue” and an electronic “gustatory cortex” made with 2D materials, which are materials one to a few atoms thick. The artificial tastebuds comprise tiny, graphene-based electronic sensors called chemitransistors that can detect gas or chemical molecules.
The other part of the circuit uses memtransistors, transistors that remember past signals, made with molybdenum disulfide. This allowed the researchers to design an “electronic gustatory cortex” that connects a physiology-driven “hunger neuron,” a psychology-driven “appetite neuron” and a “feeding circuit.”
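The interplay the circuit captures, a physiological hunger signal and a psychological appetite signal jointly gating a feed/no-feed decision, can be sketched in a few lines. This is a minimal illustrative model only; the function names, thresholds and the simple OR-style rule are assumptions for clarity, not the published circuit's parameters.

```python
def feeding_circuit(hunger: float, appetite: float,
                    hunger_threshold: float = 0.5,
                    appetite_threshold: float = 0.8) -> bool:
    """Toy feed/no-feed decision combining two drives.

    hunger:   physiological drive, 0 (sated) to 1 (starving)
    appetite: psychological urge for this taste, 0 to 1
    """
    # Eat when physiologically hungry, or when the psychological urge is
    # strong enough to override satiety (the "chocolate cake" case).
    return hunger > hunger_threshold or appetite > appetite_threshold

# A sated person (hunger = 0.1) still takes a well-liked sweet (appetite = 0.9)
print(feeding_circuit(0.1, 0.9))  # True
# ...but declines a neutral food when sated
print(feeding_circuit(0.1, 0.3))  # False
```

The key point the toy model mirrors is that the decision is not a function of hunger alone: a strong enough appetite signal can trigger feeding on its own.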
For instance, when detecting salt, or sodium chloride, the device senses sodium ions, explained Subir Ghosh, a doctoral student in engineering science and mechanics and co-author of the study.
“This means the device can ‘taste’ salt,” Ghosh said.
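Conceptually, the chemitransistor “tastes” salt because sodium ions shift its electrical response, and that shift can be read against a threshold. The sketch below is a hypothetical toy model, not the device physics from the paper; the linear response, baseline and numeric values are invented for illustration.

```python
def graphene_response(na_concentration_mM: float,
                      baseline_conductance: float = 1.0,
                      sensitivity: float = 0.02) -> float:
    """Toy model: conductance rises linearly with sodium concentration."""
    return baseline_conductance + sensitivity * na_concentration_mM

def tastes_salty(conductance: float, baseline: float = 1.0,
                 threshold: float = 0.5) -> bool:
    """Flag 'salty' when the conductance shift exceeds a set threshold."""
    return (conductance - baseline) > threshold

reading = graphene_response(na_concentration_mM=50)  # a 50 mM NaCl sample
print(tastes_salty(reading))  # True: shift of 1.0 exceeds the 0.5 threshold
```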
The properties of the two different 2D materials complement each other in forming the artificial gustatory system.
“We used two separate materials because while graphene is an excellent chemical sensor, it is not great for circuitry and logic, which is needed to mimic the brain circuit,” said Andrew Pannone, graduate research assistant in engineering science and mechanics and co-author of the study.
“For that reason, we used molybdenum disulfide, which is also a semiconductor. By combining these nanomaterials, we have taken the strengths from each of them to create the circuit that mimics the gustatory system.”
The process is versatile enough to be applied to all five primary taste profiles: sweet, salty, sour, bitter and umami. Such a robotic gustatory system has promising potential applications, Das said, ranging from AI-curated diets based on emotional intelligence for weight loss to personalized meal offerings in restaurants. The research team’s upcoming objective is to broaden the electronic tongue’s taste range.
“We are trying to make arrays of graphene devices to mimic the 10,000 or so taste receptors we have on our tongue that are each slightly different compared to the others, which enables us to distinguish between subtle differences in tastes,” Das said.
“The example I think of is people who train their tongue and become a wine taster. Perhaps in the future we can have an AI system that you can train to be an even better wine taster.”
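The idea behind an array of slightly different sensors is that each taste produces a distinctive response “fingerprint” across the array, which can then be compared against known fingerprints. A minimal sketch of that matching step, using invented illustrative response vectors rather than measurements from the study:

```python
import math

# Hypothetical per-taste fingerprints from a 4-sensor array (invented data)
REFERENCE_RESPONSES = {
    "salty": [0.9, 0.2, 0.1, 0.3],
    "sweet": [0.1, 0.8, 0.3, 0.2],
    "sour":  [0.2, 0.1, 0.9, 0.4],
}

def classify_taste(reading: list) -> str:
    """Return the taste whose fingerprint is nearest (Euclidean) to the reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(REFERENCE_RESPONSES, key=lambda t: dist(reading, REFERENCE_RESPONSES[t]))

print(classify_taste([0.85, 0.25, 0.15, 0.3]))  # salty
```

With many more sensors, each responding slightly differently, the fingerprints grow longer and subtler distinctions become separable, which is the motivation for scaling toward thousands of devices.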
An additional next step is to make an integrated gustatory chip.
“We want to fabricate both the tongue part and the gustatory circuit in one chip to simplify it further,” Ghosh said. “That will be our primary focus for the near future in our research.”
After that, the researchers said, they envision this concept of gustatory emotional intelligence translating to other senses, such as visual, auditory, tactile and olfactory emotional intelligence, to aid the development of future advanced AI.
“The circuits we have demonstrated were very simple, and we would like to increase the capacity of this system to explore other tastes,” Pannone said.
“But beyond that, we want to introduce other senses and that would require different modalities, and perhaps different materials and/or devices. These simple circuits could be more refined and made to replicate human behavior more closely. Also, as we better understand how our own brain works, that will enable us to make this technology even better.”
Along with Das, Pannone and Ghosh, other Penn State researchers in the study included Dipanjan Sen, doctoral candidate in engineering science and mechanics; Akshay Wali, doctoral candidate in electrical engineering; and Harikrishnan Ravichandran, doctoral candidate in engineering science and mechanics. All researchers are also affiliated with the Materials Research Institute.
Funding: The United States Army Research Office and the National Science Foundation’s Early CAREER Award supported this research.
About this AI research news
Author: Adrienne Berard
Source: Penn State
Contact: Adrienne Berard – Penn State
Image: The image is credited to Neuroscience News
An all 2D bio-inspired gustatory circuit for mimicking physiology and psychology of feeding behavior
Animal behavior involves complex interactions between physiology and psychology. However, most AI systems neglect psychological factors in decision-making due to a limited understanding of the physiological-psychological connection at the neuronal level.
Recent advancements in brain imaging and genetics have uncovered specific neural circuits that regulate behaviors like feeding. By developing neuro-mimetic circuits that incorporate both physiology and psychology, a new emotional-AI paradigm can be established that bridges the gap between humans and machines.
This study presents a bio-inspired gustatory circuit that mimics adaptive feeding behavior in humans, considering both physiological states (hunger) and psychological states (appetite).
Graphene-based chemitransistors serve as artificial gustatory taste receptors, forming an electronic tongue, while 1L-MoS2 memtransistors construct an electronic-gustatory-cortex comprising a hunger neuron, appetite neuron, and feeding circuit.
This work proposes a novel paradigm for emotional neuromorphic systems with broad implications for human health. The concept of gustatory emotional intelligence can extend to other sensory systems, benefiting future humanoid AI.