Poppy in a musical setup, please share your ideas

Hey Poppy forumers,

Here I want to introduce an experiment we have been thinking about with @Matthieu and @Pierre for a few months, involving Poppy in a musical setup. In a few words, the idea is to let Poppy interact with a musical instrument (either an electronic or a physical one) in order to, e.g.:

  • reproduce a given musical sequence
  • actively explore and learn the relationship between the robot's movements and their musical consequences, using artificial curiosity
  • collaborate with a human to produce a musical composition
  • or simply record a fun video :wink:

The aim of this post is to gather ideas from the community on this topic. What kind of musical setup involving Poppy would be interesting, in your opinion? Using what kind of instrument? How could the collaboration with a human be managed? Are you aware of related work? etc.

Your ideas will be greatly appreciated, thanks!

Below I describe some preliminary experiments we carried out, as well as various reflections (probably too many; don't hesitate to skip if it is boring, or to ask for details if you feel it is interesting but not clear).


Preliminary experiment

This is inspired by an experiment carried out during the Etres et numérique artist residency, which took place in Bordeaux last February. The movements of the robot were tracked using a Kinect sensor and sent to an electronic musical setup to modulate sound by, e.g., shifting frequencies or triggering note sequences.
In a first version of the experiment, the robot's movements were choreographed. The result was very satisfying from an artistic point of view (at least in my opinion); see the video in the post linked above.

It was however a bit of a shame that Poppy used only choreographed movements in this setup, because one of the main topics we work on at the Flowers lab is autonomous exploration. This is why three members of the team, @Steve, @omangin and myself, went to the artist residency to implement an artificial curiosity algorithm on this setup, similar to the one implemented in the Explauto library. The idea was that Poppy actively chooses its own body movements so as to learn as much as possible about the relationship between these movements and the produced sounds. We actually did it, but we lacked the time to analyse the results in detail, or even to record a video (Jean-Marc Weber, the electronic composer who designed the musical part, was unfortunately leaving Bordeaux). This is perhaps why we now feel a bit frustrated and want to do more experiments with Poppy and music.
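To give a flavor of what such exploration looks like, here is a minimal sketch of learning-progress-based curiosity (a toy illustration, not the actual Explauto implementation; the `sensorimotor_map` function, the two-region split and all constants are made-up assumptions):

```python
import random
import math

# Toy sensorimotor mapping: motor command (scalar) -> sound feature (scalar).
# This stands in for the real robot + musical setup (an assumption for illustration).
def sensorimotor_map(m):
    return math.sin(3.0 * m) + 0.1 * random.gauss(0, 1)

class Region:
    """One region of motor space, with a running prediction-error history."""
    def __init__(self, low, high):
        self.low, self.high = low, high
        self.errors = []

    def learning_progress(self):
        # Progress = recent decrease of prediction error.
        if len(self.errors) < 10:
            return 1.0  # optimistic initialization encourages initial sampling
        recent = sum(self.errors[-5:]) / 5
        older = sum(self.errors[-10:-5]) / 5
        return max(older - recent, 0.0)

regions = [Region(0.0, 0.5), Region(0.5, 1.0)]
predictions = {}  # naive memory: rounded motor command -> last observed outcome

random.seed(0)
for step in range(200):
    # Pick the region where learning currently progresses most (epsilon-greedy).
    if random.random() < 0.2:
        region = random.choice(regions)
    else:
        region = max(regions, key=lambda r: r.learning_progress())
    m = random.uniform(region.low, region.high)
    s = sensorimotor_map(m)
    key = round(m, 1)
    region.errors.append(abs(s - predictions.get(key, 0.0)))
    predictions[key] = s  # naive "learning": remember the last outcome

print([round(r.learning_progress(), 3) for r in regions])
```

The agent keeps sampling wherever its prediction error is still decreasing, which is the core intuition behind artificial curiosity.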

What’s next?

The experiment described above was very stimulating, and we are now keen to develop the use of Poppy in musical setups. For example, @Pierre and I recently ran an experiment where Poppy tried to reproduce a given rhythm on a percussion instrument (using the simulated environment Poppy-vrep). We will probably publish the associated code soon and post a video on this topic.
This simple experiment raises a first interesting scientific question. Whereas reproducing a given rhythm with a music sequencer is trivial, the problem is much more complicated with a robot. With a sequencer, one just has to switch the appropriate beats "on" (try it yourself), whereas with a robot one has to find the appropriate body movement which, by interacting with the instrument, will reproduce the rhythm. A first question would be: does the rhythm reproduced by the robot sound more "natural" (or "human-produced") than the one reproduced by a sequencer (which is "perfectly timed")? If so, is it because the robot's morphology is human-like? This has something to do with the ideas behind this topic.
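To make the contrast concrete, "switching the appropriate beats on" in a step sequencer amounts to no more than writing a grid (a toy sketch; the 16-step patterns and tempo are illustrative assumptions):

```python
# A 16-step sequencer pattern: reproducing a rhythm is just writing the grid.
kick  = [1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0]  # four-on-the-floor
snare = [0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0]  # backbeat

def trigger_times(pattern, bpm=120, steps_per_beat=4):
    """Convert an on/off grid into perfectly timed trigger times (seconds)."""
    step_dur = 60.0 / bpm / steps_per_beat
    return [i * step_dur for i, on in enumerate(pattern) if on]

print(trigger_times(kick))  # -> [0.0, 0.5, 1.0, 1.5]
```

With the robot, by contrast, each of those trigger times has to emerge from a whole-body movement interacting with the instrument.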

I am convinced that applying such ideas from Developmental Robotics to setups involving robots, music and humans is very interesting from both a scientific and an artistic point of view. This conviction is grounded in a number of reflections:

  • Although curiosity and creativity are two closely related concepts, the application of Developmental Robotics principles to research on creative interfaces is perhaps under-developed. Rob Saunders' work may be of interest here. Many other concepts from Developmental Robotics could also be applied to creative setups, e.g. the role of social guidance and morphological computation in the emergence of complex sensorimotor behaviors (morphological computation is the idea that many complex behaviors can be derived simply from an appropriate body structure, independently of complicated cognitive processes, as e.g. in passive walking robots).
  • Robotic art is an emerging application with huge potential, as indicated by the success of recent demonstrations in the field (see e.g. the Ergo-Robots exhibition by the Flowers group in collaboration with David Lynch), or, on the non-academic side, the Squarepusher robot band. At the same time, lessons can be learned from recent projects about reducing costs and accelerating design time through easy-to-reconfigure and affordable 3D-printed robots (namely, the Poppy project).
  • Novel interactive interfaces could improve creativity and stage performance in digital art, especially in electronic music composition and performance, where many artists use the same software (mainly Ableton Live) and/or hardware (turntables, or a synthesizer/sampler/sequencer setup), while offering rather poor stage presence (tool manipulation is generally hidden from the spectator). Here again there is an emerging field to develop; see e.g. the use of a robotic arm in electronic music composition based on principles of Robot Programming by Demonstration, human-robot musical collaboration using the ReacTable, or, on the non-academic side, Onyx Ashanti's setup.

If you are still here, congratulations. Now please react :slight_smile:
Clément. https://flowers.inria.fr/clement_mf/

I use Ableton Live and program with Max for Live. I would love to create scenes in Ableton Live that command the robot's movements; the robot could perform choreographies according to the type of music :slight_smile:
I am going to think about your ideas, and try to propose more.

Very interesting!

When I compose electronic music, I often use a 4/4 beat to create my drum sequences. Many people think that techno is just a 4/4 bar loop … straight 4/4 bar loops. Note that this is not really true. In fact (apart from hard tek, deep techno, minimalist tekno), we always apply a "swing"* to the drum sequence and add ghost notes, which makes it sound like a human drummer :slight_smile:

OK, so we are limited, and trying to reproduce for instance the famous "Amen break" is hard (a very famous loop sampled in many, many modern songs, jungle, etc.). This loop is famous because the drummer is not perfectly in sync and adds many ghost notes.

*Swing consists of delays applied to MIDI notes (we ask the computer to shift certain notes forward or backward in time).
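The swing described above can be sketched as a simple transformation of note onset times (a toy illustration; the step size and swing amount are assumptions):

```python
def apply_swing(onsets, step, swing):
    """Delay every second step by a fixed amount, like a sequencer's swing knob."""
    swung = []
    for t in onsets:
        step_index = round(t / step)
        # Off-beat steps (odd indices) are pushed slightly late.
        swung.append(t + swing if step_index % 2 == 1 else t)
    return swung

# Straight 8th-note hi-hat onsets over one bar at 120 BPM (0.25 s apart):
straight = [i * 0.25 for i in range(8)]
print(apply_swing(straight, step=0.25, swing=0.03))
```

Ghost notes would similarly be modeled as extra low-velocity hits inserted between the main ones.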

Well, nothing really new, I just wanted to share my thoughts with you :slight_smile:



Hi! Great :slight_smile:
Is it normal that there is no sound? No kick/snare sounds?

We finally produced something :slight_smile: Here is a demo video:

And a description of what is happening:

This is a demo of the Poppy robot performing electronic music from the movements of its own body, through interaction with a human. Drum samples are triggered by the movements of the left foot (kick) and the head (snare). The movements of the left shoulder modulate the pitch of a synthesized sound, as well as the cutoff frequency of a low-pass filter applied to it. The grunt is triggered by the right hand's movements.

An interesting point is that the modulations of pitch and cutoff frequency performed when the human moves the robot's left arm would be more difficult to achieve in one shot with a classical MIDI controller: it would require synchronizing the movements of two faders. Using a robotic arm, where the two shoulder motors are mapped to the modulated parameters, makes it possible to generate synchronized and quite complex modulations simply by grasping and moving the robot's hand.
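The kind of mapping described here can be sketched as follows (an illustrative sketch, not the code used in the demo; the motor angle ranges and the MIDI-style 0-127 scale are assumptions):

```python
def scale(value, in_low, in_high, out_low, out_high):
    """Linearly map a motor angle range onto a control-parameter range."""
    value = min(max(value, in_low), in_high)  # clamp to the motor's range
    ratio = (value - in_low) / (in_high - in_low)
    return out_low + ratio * (out_high - out_low)

def shoulder_to_controls(shoulder_x, shoulder_y):
    """One grasped-arm gesture drives two synthesis parameters at once."""
    pitch  = scale(shoulder_x, -90.0, 90.0, 0, 127)   # MIDI-style pitch value
    cutoff = scale(shoulder_y, -90.0, 90.0, 0, 127)   # low-pass cutoff amount
    return pitch, cutoff

print(shoulder_to_controls(0.0, 45.0))  # -> (63.5, 95.25)
```

Moving the grasped hand along a diagonal then changes both parameters in lockstep, which would otherwise require two synchronized faders.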

Of course this is just a preliminary demo; there is no learning or exploration here, and the human-robot interaction is quite limited. The good news is that we can now build on it to go further on this topic and ask real musicians whether they can do something musically convincing with such a robotic interface.


You can find the whole code we used for doing the video here.

I must need glasses, but where is the source code (.py) in your GitHub repository?

In fact, I guess I don't understand what .ipynb files are?

Yep, you don't need glasses. The code is in the .ipynb file (an IPython notebook).

You can very easily view it online using nbviewer: for instance, for this notebook.
If you want to run it, you have to upload it to your own IPython notebook server. If needed, you can then export it to a regular .py file.

OK, done:

pip install ipython
pip install --upgrade ipython

I continue :slight_smile:

If you need everything required to run the IPython notebook, you can directly run:

pip install ipython[all]

This will install all the needed dependencies.


I installed the code for the music (foot, kick, etc.). I get an error in the notebook; in fact, I think the Python notebook cannot find the Poppy config (json). Do you have any ideas for how I can fix the code?

NotImplementedError                       Traceback (most recent call last)
<ipython-input> in <module>()
      7 config_path = os.path.join(os.path.dirname(poppytools.__file__), 'configuration', 'poppy_config.json')
----> 9 poppy = from_json(config_path)
     10 poppy.start_sync()

odroid@odroid:~/dev/poppy-software/poppytools/configuration$ LL
and voilà, the poppy_config.json file is there!

I am not sure what the problem is. The .json file is indeed supposed to be present. Is your poppy-software repository up to date?

Can you show me the rest of the error?
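A quick way to check whether the file is actually where the notebook looks for it (a generic sketch; the example path below mirrors the one from the traceback and shell prompt):

```python
import os

def check_config(config_path):
    """Report whether the expected .json configuration file is present."""
    if not os.path.isfile(config_path):
        return 'missing: ' + config_path
    return 'found: ' + config_path

# With poppytools installed, the path from the traceback would be built as:
# os.path.join(os.path.dirname(poppytools.__file__),
#              'configuration', 'poppy_config.json')
print(check_config('/home/odroid/dev/poppy-software/poppytools/'
                   'configuration/poppy_config.json'))
```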

While waiting to send you the error log, here is a screenshot I took last night during the Nuit de l'Info :smile:

Too bad, the error must be somewhere further down :smile:

It could be a motor in overload! In that case there is no alternative but to unplug and replug the robot!

By the way, the date on the Poppy WiFi version is: February 1999, 14h …
I had a problem when I tried to compile alsa-oss myself (./configure told me the system date was wrong).

Can I change the system date?
Or am I talking nonsense?

date -s '2014-12-06 10:39:00'
But why do you want to compile alsa-oss?!


I want to compile alsa-oss because I am struggling with the network :wink:
But I will retry the procedure you gave me, because people really bothered me (Nuit de l'Info) when I did it; they made me remove some things … even though I was not convinced! Anyway, I have to redo it.

Also, remind me which Python server you use?

  • path?
  • where is the code for “http://…:9999/say/ …”?

I would like to repurpose this program to make Poppy speak with Mateo's voice.

I would also like to make its arms move while it speaks :wink: