Engineering and Music
"Human Supervision and Control in Engineering and Music"




Kenji Suzuki with Shuji Hashimoto

Robotic Interface for Embodied Interaction via Dance and Musical Performance
 
 
Abstract
In order to realize an interactive virtual musical environment for collaborative work between humans and machines, we introduced robotic interfaces. By exploiting its embodiment, the robot can be used effectively in musical performances with motion: it displays reflexive motion on stage while producing sound and music through embedded stereo speakers according to the context of the performance. The proposed approach, which equips musical instruments with autonomous mobility, provides a new kind of computer music performance in the real world.
 
Research Project Overview
So far we have investigated KANSEI communication between human and machine; KANSEI is a term roughly corresponding to feeling and sensibility. In communication between humans, KANSEI information plays a role as important as logical information. When particular stimuli are given, humans perceive and understand them, yet such information is often processed not logically but unconsciously and involuntarily. Such characteristics, or rather behaviors, might be regarded as KANSEI. Humans have the ability to understand things by intuition; in other words, humans possess, and often rely on, a method of information processing based on KANSEI. A system with such a communication channel therefore needs to deal with KANSEI information.
Music is one of the most challenging research topics in this context. Sound and music are typical channels of non-verbal communication through which humans often express their minds. In order to increase the degrees of freedom of musical expression, a number of researchers and musicians have reported new types of musical systems that exceed physical limitations such as those of musical instruments and the vocal cords.
Many studies on musical interaction between human and machine have also been proposed. However, although humans often accompany music with body motion, few works have been reported on autonomous mobile robots for musical performance.

Consequently, we focused on an interaction metaphor in which a robotic interface establishes a connection between the virtual and the real world. In recent years, we have proposed several mobile robot platforms for virtual musical environments. In order to realize an interactive virtual musical environment for collaborative work between humans and machines, a substantial interface, a robot agent, was introduced. By exploiting its embodiment, the robot can be used effectively in musical performances with motion: it can move around humans as a companion while producing sound and music through attached stereo speakers. Moreover, the robot can display reflexive motion according to the context of the performance to create a human-robot collaborative performance on stage.
The first idea was to equip a musical instrument with autonomous mobility for computer music performance in the real world. The wheeled mobile robot performs as an agent capable of communicating through several channels, including sound and music, visual expression, and movement.
The robot's behavior reflects continuous inputs from a complex external environment, so the robot can perform non-logical information processing. Even if each logical process is discriminated by a specified threshold, the integrated consequence cannot be explained in a simple manner. In other words, humans can no longer perceive an action caused by the combination of a great number of logical processes as a logical one.
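To make this point concrete, here is a minimal sketch (an illustration of the argument only, not the robot's actual control architecture; the rule count, thresholds, and weights are arbitrary assumptions) in which many simple threshold rules, each perfectly logical on its own, are blended into one continuous motor command:

```python
import random

# Illustration only: many simple threshold "rules", each logical on its
# own, are blended into one continuous motor command. The rule count,
# thresholds, and weights are arbitrary assumptions for this sketch.
random.seed(0)
N_RULES = 50
thresholds = [random.uniform(0.0, 1.0) for _ in range(N_RULES)]
weights = [random.uniform(-1.0, 1.0) for _ in range(N_RULES)]

def motor_command(sensor_value):
    """Sum the weights of every rule whose threshold has been crossed."""
    return sum(w for t, w in zip(thresholds, weights) if sensor_value > t)

# Sweep a continuous input: each change in the output is caused by one
# logical rule firing, but the aggregate curve looks anything but logical.
for i in range(11):
    x = i / 10.0
    print(f"sensor={x:.1f}  command={motor_command(x):+.3f}")
```

Each individual step in the output can be traced to one rule, yet the overall response to a continuously varying input no longer reads as the outcome of any single logical decision.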

In this article, we describe our research direction, introducing some installations and demonstrations from recent years.
 

Fig. 1. Dance with Robot Project

(left) Interaction between humans and an autonomous robot at an exhibition of contemporary art, Genova, 1998
(center) A virtual musical environment using an omni-directional mobile robot with an interface that detects a human's weight shift, 1999
(right) A mobile robot platform for embodied interaction with humans, 2000
 

"Dance with Robot Project"
Several robotic interfaces for virtual musical environments have been successfully employed in art installations and demonstrations. The system exhibits a style of human-robot dance collaboration in which the robot moves in concert with the human performer, sensing visual and audio information. It works as a sort of reflector, creating an acoustic and visual space around the moving instrument. Moreover, the robot can display reflexive motion according to the context of the performance to create a human-robot collaborative performance on stage (see [Suzuki et al., 1998], [Suzuki et al., 1999], [Suzuki et al., 2000]).
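The following sketch shows one way such a reflector loop might be organized (the sensor stand-ins, gains, and mapping from sound and motion to wheel commands are our assumptions; the actual systems are described in the papers cited above):

```python
import math
import random

# Sketch of a "reflector" sense-act loop. The two sensor stand-ins and
# the gains are hypothetical; a real system would read a microphone
# envelope and camera-based motion energy instead.

def read_audio_level(t):
    """Stand-in for a microphone envelope follower, in the range 0..1."""
    return 0.5 + 0.5 * math.sin(t)

def read_performer_motion():
    """Stand-in for visual motion energy from a camera, in the range 0..1."""
    return random.random()

def reflector_step(audio, motion):
    """Mirror the performer: louder sound -> faster, motion -> turning."""
    forward = 0.8 * audio           # forward speed tracks loudness
    turn = 1.2 * (motion - 0.5)     # turn rate tracks motion asymmetry
    return forward, turn

for step in range(5):
    t = step * 0.1
    v, w = reflector_step(read_audio_level(t), read_performer_motion())
    print(f"t={t:.1f}s  forward={v:.2f}  turn={w:+.2f}")
```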
 
Related Works

MIDI Network: Utilizing MIDI data for robot control

In performance systems that give users feedback on emotional activation in terms of sound, music, image, and motion, the flexibility of the instrumentation must be considered. We developed a robotic platform in which all data are exchanged as MIDI messages among the components and external MIDI controller devices over MIDI cables. The user can therefore easily work with all the components, including not only music and image generation but also the motion control of the mobile robot (see [Suzuki et al., 2000]).
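As a rough illustration of this idea, the sketch below routes one MIDI stream to both sound generation and robot motion. It uses the third-party `mido` library; the controller numbers and the `set_wheel_speed()` helper are assumptions for the sketch, not part of the original platform:

```python
import mido  # third-party MIDI library: pip install mido python-rtmidi

# Hypothetical mapping: CC 16/17 drive the robot's wheels, and every
# other message passes through to the sound generator. The CC numbers
# and set_wheel_speed() are assumptions for this sketch.
CC_LEFT_WHEEL = 16
CC_RIGHT_WHEEL = 17

def set_wheel_speed(side, value):
    """Stand-in for the robot's motor interface (MIDI 0..127 -> speed)."""
    print(f"{side} wheel -> {value / 127.0:.2f}")

with mido.open_input() as midi_in, mido.open_output() as synth_out:
    for msg in midi_in:
        if msg.type == "control_change" and msg.control == CC_LEFT_WHEEL:
            set_wheel_speed("left", msg.value)
        elif msg.type == "control_change" and msg.control == CC_RIGHT_WHEEL:
            set_wheel_speed("right", msg.value)
        else:
            # Notes and other controllers go to the sound generator.
            synth_out.send(msg)
```

Because everything is ordinary MIDI, the same sequencer or controller that drives the synthesizer can drive the robot, which is the flexibility referred to above.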
 
Real-time initiative exchange algorithm for interactive music system
In musical performance, it is natural for players to exchange the initiative of the performance with each other without spoiling the musical harmony. We have proposed a novel human-machine interface system that allows smooth initiative exchange between human and machine during a performance. Another advantage of the developed system, from the artistic point of view, is that the initiative transfer between the human and the system can produce performances in unexpected styles, stimulating human creativity and bringing novel ideas to music performance and composition (see [Taki et al., 2000]).
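A minimal sketch of one way such an exchange could be arbitrated (the activity measure, thresholds, and hysteresis below are our assumptions; the actual algorithm is the one described in [Taki et al., 2000]):

```python
# Hypothetical initiative arbiter: whoever is more "active" leads, with
# hysteresis so the initiative does not flip on every beat. The activity
# measure (0..1) and both thresholds are assumptions for this sketch.
HUMAN_QUIET = 0.35    # below this, the machine takes the initiative
HUMAN_ACTIVE = 0.65   # above this, the machine hands the initiative back

def update_initiative(holder, human_activity):
    """Return who leads next ('human' or 'machine'), with hysteresis."""
    if holder == "human" and human_activity < HUMAN_QUIET:
        return "machine"      # human went quiet: machine leads
    if holder == "machine" and human_activity > HUMAN_ACTIVE:
        return "human"        # human is playing again: yield
    return holder             # inside the hysteresis band: no change

holder = "human"
for activity in (0.9, 0.7, 0.3, 0.2, 0.5, 0.8):
    holder = update_initiative(holder, activity)
    print(f"activity={activity:.1f} -> initiative: {holder}")
```

The hysteresis band is what keeps the exchange smooth: a brief pause does not immediately wrest the initiative away, and the machine yields only when the human is clearly playing again.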
 
KANSEI analysis of dance performance
The authors have been interested in interaction metaphors such as multimodal interaction systems that include emotional effects in interactive dance/music systems. We investigated a method for extracting emotional information from human gestures in real time. The basic idea for understanding human motion is based on Rudolf Laban's theoretical work: the motion analysis maps physical parameters onto emotional information. Our long-term goal is a sort of "KANSEI extractor" that derives emotional information from dance performance with the aid of a computer system based on Laban's theory of movement (see [Camurri et al., 2000]).
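As a simplified sketch of that mapping (the features and the "sudden/sustained" reading are our reductions of Laban's Time effort; the actual analysis is described in [Camurri et al., 2000]), low-level kinematics of a tracked body point can be summarized into quantities of the kind such systems use:

```python
# Hypothetical Laban-inspired feature extraction from a tracked point.
# The feature definitions and the sudden/sustained split are our
# simplifications of effort analysis, not the published method.

def effort_features(xs, ys, dt):
    """Compute mean speed and mean acceleration from a 2D trajectory."""
    speeds = [((xs[i+1] - xs[i])**2 + (ys[i+1] - ys[i])**2) ** 0.5 / dt
              for i in range(len(xs) - 1)]
    accels = [abs(speeds[i+1] - speeds[i]) / dt
              for i in range(len(speeds) - 1)]
    return sum(speeds) / len(speeds), sum(accels) / len(accels)

def effort_label(mean_speed, mean_accel):
    """Crude Time-effort reading: jerky motion reads as 'sudden'."""
    return "sudden" if mean_accel > mean_speed else "sustained"

# A smooth sweep vs. a jerky stop-and-go gesture (positions in meters).
smooth = ([0.0, 0.1, 0.2, 0.3, 0.4, 0.5], [0.0] * 6)
jerky = ([0.0, 0.3, 0.3, 0.6, 0.6, 0.9], [0.0] * 6)
for name, (xs, ys) in [("smooth", smooth), ("jerky", jerky)]:
    s, a = effort_features(xs, ys, dt=0.1)
    print(f"{name}: speed={s:.2f}  accel={a:.2f} -> {effort_label(s, a)}")
```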
 
Conclusions
We have sketched a harmonized human-machine environment in which the robot behaves in response to the given stimuli and to its internal state in the real environment, and in which humans who play with the robot can interact with it continuously. The robot exists as if intelligence had embodiment. As in humans, the sub-systems inside the robot operate in parallel, with multiple modalities, in a real and continuous world. Recently we have been developing a humanoid robot with an anthropomorphic body that integrates many of the above-mentioned sub-systems. The embodied interaction between humans and the robot can open the next stage of human-machine collaborative musical performance.

Most of this work was supported by the Japan Society for the Promotion of Science (JSPS) Fellowships for Young Scientists.
 
 

References
Suzuki, K., Camurri, A., Ferrentino, P. and Hashimoto, S. (1998). "Intelligent Agent System for Human-Robot Interaction through Artificial Emotion", in Proc. of the 1998 IEEE International Conference on Systems, Man, and Cybernetics, pp. 1055-1060.

Suzuki, K., Ohashi, T. and Hashimoto, S. (1999). "Interactive Multimodal Mobile Robot for Musical Performance", in Proc. of the 1999 International Computer Music Conference, pp. 407-410.

Suzuki, K., Tabe, K. and Hashimoto, S. (2000). "A Mobile Robot Platform for Music and Dance Performance", in Proc. of the 2000 International Computer Music Conference, pp. 539-542.

Taki, Y., Suzuki, K. and Hashimoto, S. (2000). "Real-time Initiative Exchange Algorithm for Interactive Music System", in Proc. of the 2000 International Computer Music Conference, pp. 266-269.

Camurri, A., Hashimoto, S., Ricchetti, M., Ricci, A., Suzuki, K., Trocca, R. and Volpe, G. (2000). "EyesWeb - Toward Gesture and Affect Recognition in Dance/Music Interactive Systems", Computer Music Journal, Vol. 24, No. 1, pp. 57-69.