Background. The sense of agency (SoA) provides us with the experience of being a physical agent with free will. On a phenomenological basis, SoA can be divided into sensory components (feeling of agency, FoA) and more cognitive components (judgment of agency, JoA). Both these components can be independently measured.
Objective and Method. A new method was developed to test whether SoA and its components can be preserved under the atypical condition of passive movements. Parameters of the participant’s movement in response to a visual stimulus (reaction time, speed, and amplitude) were measured and used to control a servo that reproduced the movement passively (externally executed passive movements). Scores on a psychometric agency scale and event-related potentials (ERPs) were recorded at variable movement delays relative to stimulus onset.
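The replay logic of this paradigm can be sketched as follows. This is an illustrative reconstruction, not the authors’ implementation: the names `MovementParams` and `servo_schedule` and the unit choices are assumptions introduced here for clarity.

```python
from dataclasses import dataclass

@dataclass
class MovementParams:
    """Parameters measured from the participant's active movement (illustrative)."""
    reaction_time_ms: float   # stimulus onset to movement onset
    speed_deg_per_s: float    # angular speed of the movement
    amplitude_deg: float      # movement amplitude

def servo_schedule(params: MovementParams, delay_ms: float):
    """Return (onset_ms, duration_ms, amplitude_deg) for a passive replay.

    The servo starts `delay_ms` after stimulus onset (the experimentally
    varied delay) and reproduces the recorded amplitude at the recorded speed.
    """
    duration_ms = params.amplitude_deg / params.speed_deg_per_s * 1000.0
    return delay_ms, duration_ms, params.amplitude_deg

# Example: a 45-degree movement at 90 deg/s replayed 100 ms after the stimulus
p = MovementParams(reaction_time_ms=250.0, speed_deg_per_s=90.0, amplitude_deg=45.0)
onset, duration, amplitude = servo_schedule(p, delay_ms=100.0)  # (100.0, 500.0, 45.0)
```

Varying `delay_ms` below the participant’s own reaction time yields passive movements that could not have been actively produced, which is the critical manipulation described above.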
Results. It was found that the FoA was not present under passive movement conditions. At the same time, participants attributed these movements to their own activity (JoA), even when the delay after stimulus onset was too short for the movement to have been actively produced. The amplitude of the somatosensory ERPs decreased for expected movements, showing an inverse relationship with the agency scores; the lowest amplitude was observed when movements were actuated by another hand. The absence of FoA without active movement can be explained by a predictive forward model. On the other hand, the ERP data and the presence of JoA across various stimulus-movement delays support a postdictive model of agency, which assigns the leading role to prior and contextual knowledge about the action.
Conclusion. It seems that the “context pressure” of the situation, which demands a mandatory response to the stimulus, forms a cognitive prediction of the movement without a firm sensory representation.
Keywords: action, free will, mental chronometry, passive movement, feeling of agency, judgment of agency, sense of agency, somatosensory event-related potentials (ERPs)
Background. Human-machine interaction technology has evolved greatly over recent decades, but the manual and speech modalities remain the only output channels, with the typical constraints imposed by the motor system’s information-transfer limits. Will brain-computer interfaces (BCIs) and gaze-based control be able to convey human commands, or even intentions, to machines in the near future? We provide an overview of basic approaches in this new area of applied cognitive research.
Objective. We test the hypothesis that the use of communication paradigms and a combination of eye tracking with unobtrusive forms of registering brain activity can improve human-machine interaction.
Methods and Results. Three groups of ongoing experiments at the Kurchatov Institute are reported. First, we discuss the communicative nature of human-robot interaction and approaches to building a more efficient technology. Specifically, “communicative” patterns of interaction can be based on joint attention paradigms from developmental psychology, including a mutual “eye-to-eye” exchange of looks between human and robot. Further, we provide an example of the “eye mouse” outperforming the computer mouse in an emulated task of selecting a moving robot from a swarm. Finally, we demonstrate a passive, noninvasive BCI that uses EEG correlates of expectation. This may become an important filter to separate intentional gaze dwells from non-intentional ones.
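The final step can be sketched as a simple gating rule. This is a hypothetical illustration of the idea, not the reported system: the function name, the dwell threshold, and the expectancy-wave amplitude threshold are all assumptions (the expectancy wave, or E-wave, is a negative-going slow potential, hence the negative sign convention below).

```python
def is_intentional_dwell(dwell_ms: float, ewave_uv: float,
                         min_dwell_ms: float = 500.0,
                         ewave_threshold_uv: float = -2.0) -> bool:
    """Gate a gaze-dwell selection with an EEG expectancy marker.

    A dwell triggers selection only if it is long enough AND the
    expectancy-wave amplitude (microvolts, negative-going) crosses an
    illustrative threshold; both thresholds here are hypothetical.
    """
    return dwell_ms >= min_dwell_ms and ewave_uv <= ewave_threshold_uv

# A long dwell with a clear expectancy wave is accepted;
# a long dwell without one (e.g. spontaneous fixation) is rejected.
is_intentional_dwell(600.0, -3.5)   # True
is_intentional_dwell(600.0, 0.5)    # False
```

In an eye-brain-computer interface of this kind, the EEG channel adds no spatial information; it only vetoes dwells that gaze position alone would misclassify as commands.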
Conclusion. The current noninvasive BCIs are not well suited for human-robot interaction, and their performance, when they are employed by healthy users, is critically dependent on the impact of gaze on the selection of spatial locations. The new approaches discussed show high potential for creating alternative output pathways for the human brain. When support from passive BCIs matures, the hybrid technology of the eye-brain-computer interface (EBCI) will have a chance to enable natural, fluent, and effortless interaction with machines in various fields of application.
Keywords: attention, eye-to-eye contact, eye movements, brain-computer interface (BCI), eye-brain-computer interface (EBCI), electroencephalography (EEG), expectancy wave (E-wave), human-robot interaction, brain output pathways