InfoMus Lab

"Il giardino della musica"


Palazzina Liberty, Milano, June 17, 1998
By Antonio Camurri, Riccardo Dapelo, and Kenji Suzuki

A simple example of an "emotional machine" is demonstrated: a small robot interacts with the people around it by creating and modifying computer music pieces in real time.

People can communicate with the robot by moving around it, blocking its way, walking with it, following it, avoiding it, and so on. The robot uses its sensor systems to interpret what is happening as positive stimuli (e.g., following the robot without obstructing it) or negative stimuli (e.g., blocking it and forcing it to stop). Such stimuli produce changes in the robot's "mood".
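The stimulus-to-mood mechanism described above can be sketched as follows. This is a minimal illustration only: the actual model of "artificial emotions" is not detailed in this page, and the stimulus weights, decay factor, and function names here are assumptions made for the example.

```python
# Hypothetical sketch of the stimulus-to-mood mapping described above.
# The robot's real model is not published here; this reduces "mood"
# to a single scalar in [-1, 1] for illustration.

POSITIVE = +0.2   # e.g. someone follows the robot without obstructing it
NEGATIVE = -0.3   # e.g. someone blocks the robot, forcing it to stop
DECAY = 0.95      # assumed: mood relaxes toward neutral over time

def update_mood(mood: float, stimulus: float) -> float:
    """Decay the mood toward neutral, add the stimulus, clamp to [-1, 1]."""
    mood = mood * DECAY + stimulus
    return max(-1.0, min(1.0, mood))

# Example interaction: two positive stimuli, then one negative one.
mood = 0.0
for s in [POSITIVE, POSITIVE, NEGATIVE]:
    mood = update_mood(mood, s)
```

The decay term means that, in the absence of stimuli, the robot's mood drifts back toward a neutral state rather than remaining fixed.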

 


 

The internal state of the machine is intuitively perceived as the robot's "mood". This state is managed by a software model of "artificial emotions", on the basis of which real-time generation and processing of musical material is performed.

The robot is equipped with on-board sensor systems, including ultrasound sensors, and communicates over a radio link with a supervising computer. The supervising computer runs the model of artificial emotions that controls the robot's behavior, as well as the algorithms for music generation and processing.
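The supervising computer's role, mapping the robot's mood onto musical parameters, might look like the following sketch. The parameter names and ranges (tempo, mode, note density) are illustrative assumptions; the page does not specify which musical dimensions the actual system controls.

```python
# Hypothetical sketch of the supervising computer's mood-to-music mapping.
# The real music-generation algorithms are not described in the source;
# these three parameters are assumed for illustration.

def mood_to_music(mood: float) -> dict:
    """Map a mood value in [-1, 1] to simple musical control parameters."""
    return {
        "tempo_bpm": 80 + 60 * (mood + 1) / 2,    # 80 bpm (sad) .. 140 bpm (happy)
        "mode": "major" if mood >= 0 else "minor", # coarse affective choice
        "density": 0.3 + 0.5 * abs(mood),          # more notes at emotional extremes
    }

params = mood_to_music(0.5)
```

In a full system, the supervisor would read sensor data arriving over the radio link, update the emotion model, and feed parameters like these to the music generator on every control cycle.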

 

