[CFC] Poppy FIRE

I was at Japan Expo for 4 days with Poppy “on air” from 9 am to 6 pm. A very good way to test the platform under harsh conditions (high temperature, long periods of play, children touching it, high noise, electromagnetic disturbances…).
My aim is to build scene sequences reliably.
Moreover, Poppy will take part in a dance show, “School of Moon”, which requires very fluid movements.
After a lot of study since January with Poppy, Nao and many other robots, I imagined an interface to control robots as in an animated movie, with a flight-control architecture. I named it FIRE, meaning it can control a robot “under fire”.
I will present the project in the following posts.
If people are interested in developing this “enhanced Choregraphe” tool, they are welcome.


Much needed tool indeed!

I hope your Poppy behaved well at Japan Expo. I had reassuring news from mine at Innorobo (I wasn’t there and we haven’t debriefed yet).

I’m not sure I will have time to help you develop FIRE, but I will keep a close eye on it and make any remarks I think can help you.

Thanks Manon,

I made it walk a few times, but with the semi-passive algorithm.

Any remarks about FIRE are welcome :slight_smile:

Here we go with FIRE. I present here the structure of the software, with some design solutions:

  1. The systems
    In a scene, there may be several characters. In each character, there may be several systems. If we want to synchronise all these systems on one single timeline, we have to unify their language. The simplest language I found is the dataflow.
    At each sampling time, all the systems deliver their information (sensors) and retrieve their commands. A system is therefore an object that pushes and pulls data on the dataflow. For instance, possible systems are Poppy, Leap Motion, Razor, foot contact, Nao, camera, music, text-to-speech and the timeline. Each system can manage its internal task on the same computer or on another computer using a REST API, Qpid, NAOqi…
  2. The timeline system
    This system is particular since it connects outputs from systems to inputs of other systems in several ways: force to a constant, force to the present value, apply a geometric or dynamic constraint. The timeline system is composed of several layouts. Each layout concerns a list of dataflow variables.
    Here is an example:
    Purpose of the scene: the robot is sitting, sees a coffee, takes the coffee, drinks it, puts the coffee down.
    Here there are 4 layouts: the coffee (3D vector), the right arm (4 articulations), the head (2 articulations), the torso (5 articulations).
    You can apply trajectory design to the coffee, then constrain the hand and head at different times to follow this coffee point.
  3. The timeline functionalities - animation
    For a scene, you can capture several poses that play in sequence to make a gesture (the straight-ahead method), but you can also build a gesture pose to pose, with key poses, extreme poses and breakdown poses.

When you work on stage, you can play the key poses to see if the meaning is there, and then refine the quality of the movement with overlap, delays and cartoon exaggeration, and manage the staging of your scene.
I advise you to watch these videos about the 12 principles of animation, which make everything clear. I want to apply the 12 principles to Poppy.

A layout is then a list of poses which apply to some dataflow variables.

The purpose is that, once a robot or sensor is encapsulated in a system (using Python code), there is no need for Python coding: everything can be done from the GUI. This GUI MUST NOT be complex, with sexy graphical effects that lead to a bunch of nodes impossible to debug. The simplest view is a timeline as a function of time. When you are on stage following the orders of the stage manager, you don’t care about design; it must be as clear as possible.
The evolution between poses can be a timing, but also a condition (based on a dataflow variable).
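To make this concrete, here is a minimal Python sketch of a layout as a list of poses over dataflow variables, where the transition to the next pose is driven either by a timing or by a condition on the channels. The names (`Pose`, `Layout`, `next_index`) are my own, not FIRE's actual API:

```python
class Pose:
    """A named set of target values for some dataflow variables."""
    def __init__(self, name, targets, duration=1.0, condition=None):
        self.name = name            # e.g. "reach_coffee"
        self.targets = targets      # e.g. {"r_shoulder_y": 40.0}
        self.duration = duration    # seconds allotted to reach the pose
        self.condition = condition  # optional predicate on the channels dict

class Layout:
    """An ordered list of poses applied to a subset of dataflow variables."""
    def __init__(self, name, poses):
        self.name = name
        self.poses = poses

    def next_index(self, index, elapsed, channels):
        """Advance when the pose's condition holds, or when its time is up."""
        pose = self.poses[index]
        if pose.condition is not None:
            return index + 1 if pose.condition(channels) else index
        return index + 1 if elapsed >= pose.duration else index

# Right-arm layout of the coffee scene: wait in rest until the coffee
# is seen on the dataflow, then reach for it.
right_arm = Layout("right_arm", [
    Pose("rest", {"r_shoulder_y": 0.0, "r_elbow_y": 0.0},
         condition=lambda ch: ch.get("coffee_seen", False)),
    Pose("reach_coffee", {"r_shoulder_y": 40.0, "r_elbow_y": 90.0},
         duration=2.0),
])
```

The timeline system would then evaluate `next_index` for every layout at each sampling time of the synchronous loop.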

For the GUI, I think Qt is the best option to have something smooth and robust.

I know the amount of work is huge, but I have been working on this structure since January; I have to make it clean now…

My first task is to design the “system” to be plugged into the dataflow.


And the GitHub repository is here:

Here is a first draft of the architecture, drawing on many things from flight control, what I need for choreography design, and the talks I had with @Manon.

FIRE is composed of 4 types of objects:

  • The interfaces: this component takes information from the environment (sensor, GUI, file…) and delivers it to the workspace via outputs in a matrix format (real, vector or matrix). It also receives data from the workspace to deliver it to the world (motor, GUI, speaker…).
    It stores some parameters like an IP address, as well as lists of outputs and inputs, and has two main methods: deliver_output and take_input.

  • The systems: this component delivers data to the workspace, taking its inputs into account.
    It stores some parameters like isFinished, as well as lists of inputs and outputs, and has one main method: deliver_output.

  • The channels: this component stores a piece of data living in the workspace.
    A channel is just a variable from a dictionary named channels.

  • The connections: this component lives inside each interface or system and stores to which channel (or combination of channels) each piece of data shall be sent.
    It stores the direction (IN or OUT), a description, a unit, min/max values, a value and the name of the connected channel. It can be connected to a channel, but also to a constant or to a Python formula using channels (this point is very, very powerful).

Given a list of interfaces, a list of systems, their connections filled in and a dictionary of channels, it can run a robot (or several robots) within a synchronous loop.
This synchronous loop performs:

  • deliver_output of all interfaces
  • deliver_output of all systems
  • take_input of all interfaces

What is funny is that it is possible to program while the program is running.
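As an illustration of how these pieces could fit together, here is a minimal Python sketch. The method names follow the description above (deliver_output, take_input, connectedTo), but the internals are my own guess, not FIRE's actual code; the key idea is the connection resolving either a constant or a Python formula over the channels dictionary:

```python
class Connection:
    """Binds one input or output of a component to a channel, a constant,
    or a Python formula written over channel names."""
    def __init__(self, direction, connectedTo=None, value=0.0):
        self.direction = direction      # "IN" or "OUT"
        self.connectedTo = connectedTo  # channel name, constant or formula
        self.value = value

    def resolve(self, channels):
        """Evaluate the binding against the current channels dictionary."""
        if isinstance(self.connectedTo, (int, float)):
            return self.connectedTo     # forced to a constant
        # A formula like "-2*B" is evaluated with channels as variables.
        return eval(self.connectedTo, {}, channels)

class Interface:
    """Takes information from the environment and exposes it as outputs;
    receives data from the workspace and delivers it to the world."""
    def __init__(self):
        self._outputs = {}  # name -> Connection with direction "OUT"
        self._inputs = {}   # name -> Connection with direction "IN"

    def deliver_output(self, channels):
        for conn in self._outputs.values():
            channels[conn.connectedTo] = conn.value

    def take_input(self, channels):
        for conn in self._inputs.values():
            conn.value = conn.resolve(channels)

# One cycle of the synchronous loop with a fake sensor and a fake motor:
sensor = Interface()
sensor._outputs["pitch"] = Connection("OUT", connectedTo="A", value=10.0)
motor = Interface()
motor._inputs["goal"] = Connection("IN", connectedTo="2*A")

channels = {}
sensor.deliver_output(channels)  # publishes A = 10.0 on the workspace
motor.take_input(channels)       # resolves the formula 2*A
```

Because the formulas are plain Python strings evaluated each cycle, rewiring the robot is just reassigning `connectedTo`, which is what makes programming while the program runs possible.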

The game is then to inherit from each component, adapting it to our use. Here are some examples:

  • Interface: Poppy (with Razor and foot-contact options), Razor, any pypot creature, Nao, GUI, text-to-speech, Leap Motion, Kinect and maybe a drone, a coffee machine, a tablet, home automation, the entire Earth, Mars…
  • System: simple wire (to compute formulas depending on channels only), record tape, go-to-position (customized for cartoon gesture design), geometric constraint, dynamic constraint, finite state machine (with several systems), sequence of systems.

Based on this kernel, I have to develop a graphical user interface. BUT NO GRAPHICAL PROGRAMMING like Simulink or Choregraphe. I will program something more formal and very synoptic. (Still brainstorming.)

It is a simplified mix between the Lustre language, the Maya interface (for animation) and Choregraphe.

I have almost finished the basic components.

Could you add a use case so we can see how all these components interact?

The example is to simultaneously control the heads of Poppy and Nao using the direction of my right index finger as seen by a Leap Motion. Here, there is no system for the moment.

The program is the following :

import FIRELIB
import time

poppy = FIRELIB.Poppy.Poppy('')
nao = FIRELIB.Nao.Nao('')
leap = FIRELIB.LeapMotion.LeapMotion()

# Publish the Leap Motion outputs on channels A and B,
# and drive both heads with formulas over those channels.
leap._outputs["right_index_pitch"].connectedTo = "A"
leap._outputs["right_index_yaw"].connectedTo = "B"
poppy._inputs["head_z_goal_position"].connectedTo = "-2*B"
poppy._inputs["head_y_goal_position"].connectedTo = "2*A"
nao._inputs["head_z_goal_position"].connectedTo = "2*B"
nao._inputs["head_y_goal_position"].connectedTo = "2*A"

interfaces = [poppy, nao, leap]
systems = []
channels = {}

# The synchronous loop described above: read all interfaces,
# run all systems, then write the results back to the interfaces.
while True:
    for interface in interfaces:
        interface.deliver_output(channels)
    for system in systems:
        system.deliver_output(channels)
    for interface in interfaces:
        interface.take_input(channels)
    time.sleep(0.02)  # sampling period
Here is a graphical view of the script.

First screenshot of the FIRE “cockpit”. For information, this color scheme is designed for working in the darkness of the stage :slight_smile:

For the moment, only 2 interfaces are available: the PanTilt USB turret and groups of interfaces.
There is only 1 system: the hub (it connects outputs directly to inputs).

Coming next:

  • Leap Motion interface
  • pypot creature interface (by IP)
  • a system to save the time history of articulations for replay
  • a system to go to a final position

Wow, it looks like professional software, very neat!

Is there a quickstart guide?

Not yet, it is not stable enough for the moment.

I updated the FIRE repository and documentation. I still have some updates to do, but the global software architecture is really simplified. I abandoned the GUI for setting up the robot architecture; Python is well suited for this.
The GUI is only used to control the different poses of the robot.
Between poses, I can control the motors to be compliant, pseudo-compliant (goal = present position), or to use linear or cubic interpolation. I will add more patterns as I need them.
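For reference, the two interpolation patterns can be sketched as follows (a minimal standalone version; the function names are mine, not FIRE's):

```python
def linear(t):
    """Linear interpolation parameter, t in [0, 1]: constant speed."""
    return t

def cubic(t):
    """Cubic ease-in/ease-out (smoothstep): zero velocity at both ends,
    which gives the motors a much more fluid start and stop."""
    return t * t * (3.0 - 2.0 * t)

def interpolate(pose_a, pose_b, t, pattern=cubic):
    """Blend two poses (dicts of joint -> angle) at normalized time t."""
    s = pattern(min(max(t, 0.0), 1.0))
    return {joint: (1.0 - s) * pose_a[joint] + s * pose_b[joint]
            for joint in pose_a}
```

Compliant and pseudo-compliant modes need no interpolation at all: the first leaves the motors free, the second continuously sets goal = present position.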

This version is made for a Poppy with articulated wrists; I will update it for Poppy without wrists.

It’s a great project! Are you still working on it? Is it usable with Creatures?

For the moment it is usable with my device configuration, but I have not tested it with another one. I already tested it on another PC; the main issue is Leap Motion compatibility.

I am still using FIRE for the School of Moon show, but I am now thinking about a ROS interface.
FIRE uses concepts equivalent to ROS’s, but ROS seems more optimized.
On the other hand, ROS is not available on Windows, which is very bad since the Leap Motion and VR are optimized for Windows.

I am thinking about running ROS on the robot but making a Windows interface based on FIRE, which communicates with ROS via the ZMQ protocol…
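A sketch of what the Windows-side bridge could look like, assuming JSON messages over an injected socket. The class and the message format are hypothetical; with pyzmq, the transport would be a real ZMQ socket exposing send_string:

```python
import json

class FireRosBridge:
    """Packs FIRE channel values into JSON messages for the ROS side.
    The transport is injected, so a real ZMQ socket or a stub both work."""
    def __init__(self, socket):
        self.socket = socket  # expected to expose a send_string method

    def push_channels(self, channels):
        """Serialize the channels dictionary and send it to the robot."""
        self.socket.send_string(json.dumps({"type": "channels",
                                            "data": channels}))

    def decode(self, message):
        """Decode a message coming back from the ROS side."""
        return json.loads(message)["data"]

# Stub transport standing in for a ZMQ socket, for illustration:
class FakeSocket:
    def __init__(self):
        self.sent = []
    def send_string(self, msg):
        self.sent.append(msg)

bridge = FireRosBridge(FakeSocket())
bridge.push_channels({"head_z_goal_position": 20.0})
```

Injecting the socket keeps the bridge testable on a machine without ZMQ installed.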

Could you try to create a portable heavy client? In Python or something else?

Yes, FIRE should be this client. The other option is to use Chrome as a client and develop a website on the robot… but that does not work when you work on a robot fleet.
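For the fleet case, a Python heavy client can fan the same request out to every robot in parallel. A stdlib sketch, where the robot URLs and the /command endpoint are hypothetical:

```python
import json
from concurrent.futures import ThreadPoolExecutor
from urllib import request

# Hypothetical fleet and endpoint, for illustration only.
ROBOTS = ["http://poppy-1.local:8080", "http://poppy-2.local:8080"]

def build_command_request(base_url, command):
    """Build a POST request carrying the same JSON command to one robot."""
    return request.Request(base_url + "/command",
                           data=json.dumps(command).encode(),
                           headers={"Content-Type": "application/json"})

def broadcast(command, robots=ROBOTS):
    """Send the command to the whole fleet in parallel."""
    def send(url):
        with request.urlopen(build_command_request(url, command),
                             timeout=2.0) as resp:
            return resp.status
    with ThreadPoolExecutor() as pool:
        return list(pool.map(send, robots))
```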

It could, using websockets and an API on each robot, plus a JavaScript program on your computer (in Chrome): one page is able to send multiple requests.

That’s a very nice piece of software you’ve done here, @Thot.
I started to explore your code to see if I can use FIRE with a Creature I defined to use the Poppy tools with a Darwin Mini (XL-320 servos), controlled with a PIXL board on a Raspberry Pi.

Do you think I can start by porting my config to your full_poppy.json?

Yes you can. My software is not perfect yet: I advise you to install the Leap Motion SDK, otherwise it does not work.

If you encounter some issues, do not hesitate to post them here.

I will install Leap Motion; as I own one, it will be a good time to try it.
The JSON config file looks the same as a Poppy Creature config. Nothing specific?

I read you were using Windows as your platform: do you think I can expect trouble on my Linux machine?

No hurry, I work on this project in my spare time.