Hey Poppy forumers,
Here I want to introduce an experiment that @Matthieu, @Pierre and I have been thinking about for a few months, involving Poppy in a musical setup. In a few words, the idea is to let Poppy interact with a musical instrument (an electronic one, a physical one, or both) in order to, e.g.:
- reproduce a given musical sequence,
- actively explore and learn the relationship between the robot's movements and their musical consequences using artificial curiosity,
- collaborate with a human to produce a musical composition,
- or simply record a fun video.
The aim of this post is to gather ideas from the community on this topic. What kind of musical setup involving Poppy would be interesting in your opinion? Using what kind of instrument? How could the collaboration with a human be managed? Are you aware of related work? etc.
Your ideas will be greatly appreciated, thanks!
Below I describe some preliminary experiments we carried out, as well as various reflections (probably too many; don't hesitate to skip if it gets boring, or to ask for clarification if something seems interesting but unclear).
Cheers,
Clément
Preliminary experiment
This is inspired by an experiment carried out during the Etres et numérique artist residency, which took place in Bordeaux last February. The movements of the robot were tracked using a Kinect sensor and sent to an electronic musical setup to modulate the sound, e.g. by shifting frequencies or triggering note sequences.
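To give an idea of the glue code involved, here is a minimal sketch of streaming tracked coordinates to a sound patch over OSC, a common choice for this kind of setup. Everything in it is illustrative: the addresses, port and mappings are made up, and a slow oscillation stands in for the real Kinect pipeline (our actual residency setup differed in the details).

```python
# Minimal sketch: stream a tracked position to a sound patch over OSC.
# Assumes a patch (Pure Data, Max/MSP, SuperCollider...) listens on port 9000.
import math
import time

from pythonosc import udp_client  # pip install python-osc

client = udp_client.SimpleUDPClient("127.0.0.1", 9000)

t = 0.0
while True:
    # Stand-in for the Kinect pipeline: a slow oscillation instead of a hand.
    x, y = math.cos(t), math.sin(t)
    client.send_message("/poppy/freq_shift", y)        # modulate a frequency
    client.send_message("/poppy/trigger", int(x > 0))  # gate a note sequence
    t += 0.01
    time.sleep(0.02)  # ~50 Hz is plenty for control-rate modulation
```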
In a first version of the experiment, the robot's movements were choreographed. The result was very satisfying from an artistic point of view (at least in my opinion); see the video in the post linked above.
It was however a bit of a shame that Poppy only performed choreographed movements in this setup, because one of the main topics we work on at the Flowers lab is autonomous exploration. This is why three members of the team, @Steve, @omangin and myself, went to the artist residency to implement an artificial curiosity algorithm on this setup, similar to the one implemented in the Explauto library. The idea was that Poppy actively chooses its own body movements so as to learn as much as possible about the relationship between these movements and the produced sounds. We got it working, but we lacked the time to analyse the results in detail, or even to record a video (Jean-Marc Weber, the electronic composer who designed the musical part, was unfortunately leaving Bordeaux). This is perhaps why we now feel a bit frustrated and want to do more experiments with Poppy and music.
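For those wondering what such an exploration loop looks like in practice, here is a toy, self-contained sketch of curiosity-driven goal babbling. It is purely illustrative: the `environment` function is a made-up stand-in for the robot-instrument mapping, and Explauto implements all of this (sensorimotor models, interest models) in a much more principled way.

```python
# Toy curiosity-driven exploration: sample goals in "sound space" where the
# learning progress is highest, reach them with a crude nearest-neighbor
# inverse model, and record the outcome to improve the model.
import numpy as np

rng = np.random.default_rng(0)

def environment(m):
    """Made-up stand-in for robot + instrument: motor command -> sound feature."""
    return np.sin(3 * m[0]) * np.cos(2 * m[1])

# Memory of (motor, sound) pairs, bootstrapped with random motor babbling.
M = [rng.uniform(-1, 1, 2) for _ in range(10)]
S = [environment(m) for m in M]

n_bins = 10                                # discretize the sound space [-1, 1]
errors = [[] for _ in range(n_bins)]       # per-bin history of goal errors

def progress(errs, w=10):
    """Learning progress in a bin: how much the recent goal errors decreased."""
    if len(errs) < 2 * w:
        return 1.0                         # optimistic for unexplored bins
    return max(0.0, np.mean(errs[-2 * w:-w]) - np.mean(errs[-w:]))

for step in range(500):
    # Curiosity: prefer the goal bins where learning progress is highest.
    p = np.array([progress(e) for e in errors]) + 1e-3
    b = rng.choice(n_bins, p=p / p.sum())
    s_goal = -1 + 2 * (b + rng.random()) / n_bins

    # Crude inverse model: nearest neighbor in sound space + exploration noise.
    i = int(np.argmin([abs(s - s_goal) for s in S]))
    m = np.clip(M[i] + rng.normal(0, 0.1, 2), -1, 1)
    s = environment(m)
    M.append(m)
    S.append(s)
    errors[b].append(abs(s - s_goal))
```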
What’s next?
The experiment described above got us quite excited, and we are now keen to develop the use of Poppy in musical setups further. For example, @Pierre and I recently ran an experiment where Poppy tries to reproduce a given rhythm on a percussion instrument (using the simulated environment Poppy-vrep). We will probably publish the associated code soon and post a video on this topic.
This simple experiment already raises an interesting scientific question. Whereas reproducing a given rhythm with a music sequencer is trivial, the problem is much harder with a robot. With a sequencer, one just has to switch on the appropriate beats (try it yourself), whereas with a robot one has to find the appropriate body movement which, by interacting with the instrument, will reproduce the rhythm. A first question would be: does the rhythm reproduced by the robot sound more "natural" (or "human-produced") than the one reproduced by a sequencer (which is "perfectly timed")? If so, is it because the robot's morphology is human-like? This has something to do with the ideas behind this topic.
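One way to make the search problem concrete is to score each candidate movement by how closely the hit onsets it produces (e.g. drum-contact events detected in the simulation) match the target rhythm, then optimize the movement parameters against that score. A minimal sketch, where the onset times and the penalty weight are made up:

```python
# Hypothetical scoring function for the rhythm-reproduction experiment:
# compare the onsets produced by a candidate movement to the target rhythm.
import numpy as np

def rhythm_error(produced, target):
    """Mean absolute timing error, matching each target onset to the closest
    produced onset; spurious or missing onsets are penalized."""
    produced, target = np.asarray(produced), np.asarray(target)
    if len(produced) == 0:
        return float("inf")
    per_onset = [np.min(np.abs(produced - t)) for t in target]
    extra = abs(len(produced) - len(target))
    return float(np.mean(per_onset)) + 0.1 * extra  # arbitrary penalty weight

target = [0.0, 0.5, 1.0, 1.75]               # target onsets, in seconds
candidate = [0.02, 0.48, 1.06, 1.69, 2.10]   # onsets produced by one movement
print(rhythm_error(candidate, target))
```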
I am convinced that applying such ideas from Developmental Robotics to setups involving robots, music and humans is very interesting from both the scientific and the artistic point of view. This conviction stems from a number of reflections:
- Although curiosity and creativity are two closely related concepts, the application of Developmental Robotics principles to research on creative interfaces remains perhaps under-developed. Rob Saunders' work may be of interest here. Many other concepts from Developmental Robotics could also be applied to creative setups, e.g. the role of social guidance and morphological computation in the emergence of complex sensorimotor behaviors (morphological computation being the idea that many complex behaviors can be derived simply from an appropriate body structure, independently of complicated cognitive processes, as in passive walking robots).
- Robotic art is an emerging application with huge potential, as indicated by the success of recent demonstrations in the field (see e.g. the Ergo-Robots exhibition by the Flowers group in collaboration with David Lynch, or, on the non-academic side, the Squarepusher robot band). At the same time, lessons can be learned from recent projects about reducing costs and accelerating design time through the use of affordable, easy-to-reconfigure 3D-printed robots (namely, the Poppy project).
- Novel interactive interfaces could improve creativity and stage performance in digital art, especially in electronic music composition and performance, where many artists use the same software (mainly Ableton Live) and/or hardware (turntables or a synthesizer/sampler/sequencer setup) while offering rather poor stage presence (the manipulation of the tools is generally hidden from the spectator). Here again there is an emerging field to develop; see e.g. the use of a robotic arm in electronic music composition based on Robot Programming by Demonstration, human-robot musical collaboration using the ReacTable, or, on the non-academic side, Onyx Ashanti's setup.
If you made it this far, congratulations. Now please react!
Clément. https://flowers.inria.fr/clement_mf/