TY - JOUR
T1 - Real-Time Proxy-Control of Re-Parameterized Peripheral Signals using a Close-Loop Interface
AU - Kalampratsidou, Vilelmini
AU - Kemper, Steven
AU - Torres, Elizabeth B.
N1 - Funding Information:
We thank the students who volunteered their time to help perform this research; Kan Anant and PhaseSpace Inc. for providing us with images and videos necessary to describe the setup; and Neuroelectrics for allowing us to use material from the channel www.youtube.com/c/neuroelectrics/ and their manuals. Finally, we thank Prof. Thomas Papathomas from the Rutgers Center for Cognitive Science for professional support during the submission stages of this manuscript. This work was supported by a Nancy Lurie Marks Family Foundation Career Development Award to EBT and a Gerondelis Foundation Award to VK.
Publisher Copyright:
© 2021 JoVE.
PY - 2021/5
Y1 - 2021/5
AB - The fields that develop methods for sensory substitution and sensory augmentation have aimed to control external goals using signals from the central nervous system (CNS). Less frequent, however, are protocols that update external signals self-generated by interactive bodies in motion. There is a paucity of methods that combine the body-heart-brain biorhythms of one moving agent to steer those of another moving agent during dyadic exchange. Part of the challenge in accomplishing such a feat has been the complexity of a setup that uses multimodal bio-signals with different physical units, disparate time scales, and variable sampling frequencies. In recent years, the advent of wearable bio-sensors that can non-invasively harness multiple signals in tandem has opened the possibility of re-parameterizing and updating the peripheral signals of interacting dyads, in addition to improving brain- and/or body-machine interfaces. Here we present a co-adaptive interface that updates efferent somatic-motor output (including kinematics and heart rate) using biosensors; parameterizes the stochastic bio-signals; sonifies this output; and feeds it back in re-parameterized form as visuo/audio-kinesthetic reafferent input. We illustrate the methods using two types of interactions: one involving two humans and another involving a human and their avatar interacting in near real time. We discuss the new methods in the context of possible new ways to measure the influences of external input on internal somatic-sensory-motor control.
UR - http://www.scopus.com/inward/record.url?scp=85107081815&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85107081815&partnerID=8YFLogxK
U2 - 10.3791/61943
DO - 10.3791/61943
M3 - Article
C2 - 34028426
AN - SCOPUS:85107081815
SN - 1940-087X
VL - 2021
JO - Journal of Visualized Experiments
JF - Journal of Visualized Experiments
IS - 171
M1 - e61943
ER -