Jules Verne, considered a father of science fiction, pioneered the ideas of space, air, and underwater travel long before the advent of rockets, airplanes, and submarines. Ray Bradbury wrote about electronic devices considered prototypes for Bluetooth earpieces and AirPods in Fahrenheit 451. William Gibson's Neuromancer brought the ideas of augmented and virtual reality to life before the Oculus Rift was invented. Many of these ideas have also been portrayed in movies, such as The Matrix, Avatar, and Ready Player One, which capitalize on the idea of linking one's mind to a larger network to solve different dilemmas.
Science Fiction Meets Science
Today, the idea of tracking brain activity to support human health and behavior has become increasingly common in everyday life. Biofeedback is one technique used to manage migraines, anxiety, and other physiological issues. Measuring sleep with electroencephalograms (EEGs) helps to identify sleep disorders and other health issues.
Tracking brain activity in non-human animals has also become more common as technology advances. Until recently, questions about the neural foundations of many activities were difficult to answer. For example, observing the neural process of decision-making within a specific task as it unfolded was impossible until the necessary equipment and techniques were developed. Today, fruit flies, rodents, and primates, for example, have helped researchers establish the neural circuitry involved in specific decisions or problem-solving while behavior is monitored. However, many of these studies are limited to artificial conditions that fail to capture the complexity of real-world scenarios and interactions.
Dolphin Sci Fi
This limitation appears to be changing, however. Take the work conducted by Dr. Jesus Moreno Escobar and his team on the neural responses of a female dolphin during a dolphin-assisted therapy session. In this study, the researchers created a wireless EEG measurement device that could be mounted on the dolphin during a session to record EEG activity in real time.
The results of this innovative study suggested that the dolphin's average brain activity differed between control trials (i.e., no person present, or a control person) and experimental trials with patients receiving therapy with the dolphins. However, what this change meant for the dolphin's internal state was unclear and requires additional research.
The Matrix meets Avatar?
Advancements in technology have allowed researchers to pursue more complex, realistic questions in real-world settings. Real-world social behavior is one topic that researchers have begun tackling. In the paper highlighted in this post, recently published in the Psychonomic Society journal Behavior Research Methods, Dr. Arish Alreja and his team of collaborators set out to gather neural and visual data during social interactions in a hospital setting.
In this study, the research team combined several types of tools to measure the neurophysiology of natural social behavior simultaneously with behavioral and psychophysiological measures in six patients who had undergone surgical treatment for epilepsy. Eye trackers measured the objects of fixation during different social interactions, such as talking to a family member or friend. Egocentric video recording captured the scene the patient was looking at during the interaction. Audio recording measured speech characteristics (e.g., distance, intensity) and allowed speaker identification. Intracranial recordings measured the location and degree of neural activity during the various social interactions. The experimental measurement set-up is represented below.
Data from 11 different recording sessions were processed for the six patients. Processing meant that all the data streams had to be combined, or "fused," so they could be analyzed together at specific time points. The image below shows the view of the patient from the eye-tracker glasses along with the objects of fixation and the corresponding neural and aural recordings. For example, the third fixation point, a face (bottom row), had audio (pink waveform, top row), with some activation at various electrode locations (middle rows).
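To get an intuition for what "fusing" multimodal streams involves, here is a minimal sketch, not the authors' actual pipeline: recordings sampled at different rates (e.g., a 60 Hz eye tracker and a 1000 Hz neural recording) are aligned onto one shared timeline so that every neural sample can be paired with the gaze state at that moment. All stream names, rates, and labels below are illustrative assumptions.

```python
import numpy as np

def fuse_streams(t_master, t_stream, values):
    """Resample a slower stream's values onto the master timeline by
    nearest-preceding-sample lookup: each master timestamp receives the
    most recent value recorded by the slower stream."""
    idx = np.searchsorted(t_stream, t_master, side="right") - 1
    idx = np.clip(idx, 0, len(values) - 1)
    return values[idx]

# Hypothetical neural clock at 1000 Hz serves as the master timeline (2 s).
t_neural = np.arange(0, 2.0, 0.001)

# Hypothetical eye tracker at 60 Hz labels the fixated object
# (0 = non-face object, 1 = face); here a face is fixated after t = 1 s.
t_gaze = np.arange(0, 2.0, 1 / 60)
gaze_labels = (t_gaze > 1.0).astype(int)

fused = fuse_streams(t_neural, t_gaze, gaze_labels)
print(fused.shape)          # one gaze label per neural sample: (2000,)
print(fused[0], fused[-1])  # 0 before the face fixation, 1 after
```

Once every modality shares the master timeline, each neural sample can be annotated with the concurrent gaze target, speaker, and scene content, which is what makes simultaneous analysis at specific time points possible.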
Ready, Player One? You have advanced to the next level
Overall, the researchers concluded that while this type of research requires many calibrations and modifications as well as significant pre-processing and cleaning, the resulting findings were worth the effort. For example, in their sample, the researchers found that participants fixated on faces only 30-40% of the total fixation time during a recording session, despite being in a room with multiple family members and friends engaged in active conversation. However, when patients did look at faces, they did so for longer durations.
Participant identity could be determined accurately from audio recordings when they were processed manually, but less well when deep machine learning models were used. Similarly, the video data were annotated more accurately by hand than with automated software. Although both audio and video processing are effortful, video processing is significantly more time-consuming, with 16+ hours of video data taking 103 hours to process!
Finally, the neural recordings showed that the timing of neural responses differed depending on whether patients fixated on a face or a non-face object. This difference emerged before fixation onset, suggesting that the visual system may be predicting what is about to be viewed. (Definitely some cool sci-fi implications happening!)
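Detecting a pre-fixation difference like this relies on fixation-locked epoching: cutting windows of neural data around each fixation onset, including samples from before the onset. The sketch below is a simplified illustration of that general technique, not the authors' analysis; the sampling rate, window sizes, and onset times are all made-up values for demonstration.

```python
import numpy as np

FS = 1000            # samples per second (assumed)
PRE, POST = 0.2, 0.5 # window: 200 ms before to 500 ms after fixation onset

def epoch_around(signal, onset_times, fs=FS, pre=PRE, post=POST):
    """Return an (n_epochs, n_samples) array of signal segments
    around each fixation onset time (onsets given in seconds).
    Epochs that would run off either end of the recording are skipped."""
    n_pre, n_post = int(pre * fs), int(post * fs)
    epochs = []
    for t in onset_times:
        i = int(round(t * fs))
        if i - n_pre >= 0 and i + n_post <= len(signal):
            epochs.append(signal[i - n_pre : i + n_post])
    return np.array(epochs)

rng = np.random.default_rng(0)
signal = rng.standard_normal(10_000)  # 10 s of simulated neural data
face_onsets = [1.0, 3.5, 7.2]         # hypothetical fixation onsets (s)

epochs = epoch_around(signal, face_onsets)
print(epochs.shape)  # (3, 700): 3 epochs of 700 samples each

# Averaging across epochs gives a fixation-locked response; the first
# 200 columns cover the pre-fixation period, where face vs. non-face
# differences would appear if the system anticipates what it will see.
evoked = epochs.mean(axis=0)
```

Comparing the pre-onset portion of face-locked versus object-locked averages is how a "before fixation starts" difference can be made visible.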
Ultimately, this study demonstrates a proof of concept that has begun to be tested in non-human primates and dogs as well. Researchers testing macaque monkeys and dogs have attempted to understand how they perceive natural scenes or social interactions, respectively.
Babel-17
As the researchers of the featured article emphasized in their conclusions, multimodal measurement of complex, real-world environments can provide critical insights into the relationship between physiology and behavior. In the case of humans, this type of research may inform clinical practice, patient outcomes, and patient satisfaction. Similarly, this type of research may provide insight into how animals interact with and experience their worlds, and may eventually reveal how they feel about things such as walks or communicating with other species, much as Samuel R. Delany explores the intersection of language and thought in his novel Babel-17.
Featured Psychonomic Society article
Alreja, A., Ward, M. J., Ma, Q., Russ, B., Bickel, S., Van Wouwe, N., . . . Ghuman, A. S. (2023). A new paradigm for investigating real-world social behavior and its neural underpinnings. Behavior Research Methods. https://doi.org/10.3758/s13428-022-01882-9