Neural Imaging Reveals Secret Conversational Signals

Exploring human conversation is not an easy task. When people talk to each other, they coordinate their speech remarkably well – they very rarely interrupt one another, and they rarely leave long silent pauses. Conversation is like a dance without choreography or music – spontaneous but structured. To maintain this coordination, the people involved begin to align their breathing, gaze, speech melody, and gestures.

To understand this complexity, studying participants in a lab looking at computer screens—the traditional psychological experiment set-up—is not enough. We need to study how people behave naturally in the real world, using new measurement methods that can capture their neural and physiological responses. For example, Antonia Hamilton, a neuroscientist at University College London, recently used motion capture to identify a pattern of very quick nods that listeners make to show that they are paying attention when someone is speaking. Hamilton showed that these subtle cues enhance the interaction; what is also interesting is that while speakers do pick up on this information, the nods are not discernible to the naked eye.

In 2023, we will also finally be able to start collecting neural data as people move and talk to each other. It’s not easy: brain imaging techniques such as functional magnetic resonance imaging (fMRI) involve placing participants inside 12-ton brain scanners. Recent research, however, has succeeded with a cohort of autistic participants. This is an amazing achievement, but until fMRI machines become much smaller and more mobile, it will be impossible to see how neural data correlates with movement and speech patterns during conversation, ideally for both participants. Another technique, functional near-infrared spectroscopy (fNIRS), can be used while people move naturally. fNIRS measures the same index of neural activity as fMRI, using optodes that transmit light through the scalp and analyze the reflected light. fNIRS has already been deployed while people performed tasks outdoors in central London, showing that this method can be used to collect neural data in parallel with movement and speech data while people interact naturally.

In 2023, we will also be able to see for the first time how this works in larger group conversations, which typically max out at around five people. This is, of course, a big challenge, since conversations are so fluid and open-ended, but it is essential if we are to understand how participants’ brains coordinate these precisely timed conversational dances.

These breakthroughs will be a big step forward in the scientific study of human conversation, one of the most exciting areas of cognitive neuroscience and psychology. Of course, I’m a little biased: I’ve studied the perception and production of human speech for decades, and I think that conversations are where our linguistic, social, and emotional brain processes come together. Conversations are universal, and they are the primary means by which people manage social interactions and connections. They are of great importance for our mental and physical health. When we can fully unlock the science of conversation, we will go a long way towards understanding ourselves.
