Presentation of the ODOI project

I am introducing here the project I am working on. You will find a brief description of it below and a more detailed one on this blog here. If you have any questions, do not hesitate to ask; I will be more than happy to answer them.

This project aims at creating a robot combining technological skills, agility and artistic design. One of the objectives is to mimic different emotions, either through the posture of the mechanical structure or through the design, in order to provoke real emotions in the audience.

Regarding the technological skills, everybody comes up with a basket of features such as face recognition, photo shooting, speech synthesis and recognition, storytelling, M2M… Depending on the size of the robot, the embedded CPU resources and their usefulness, some or all of these features can be offered.

In order to address agility, I studied the human gait in detail – actually, how the whole body contributes to the gait. Based on this study I decided to include an articulated pelvis, an articulated foot and an articulated spine (yes, a lot of articulated parts), initially proposed by the Flowers team for their Acroban robot. My thesis is that all these articulated parts will smooth the gaits and increase their fluidity (walking straight or turning). Now I have to prove it.

I think that artistic design is really very important if one wants to introduce robots into the human environment that can be accepted and/or tolerated by the population. One step further will be the development of “artistic robots” that can be considered as pieces of art and sold accordingly. So far only the Japanese robotics community is really addressing this topic.

This is why I am eager to work with designers in order to create an “outer shell” that fits the robot's skeleton. I initiated a collaboration with Dacosta Bayley, who runs MarchOfRobots on Instagram (every day in March, artists post a sketch picturing a robot - see #marchofrobots2015). He ran a successful Kickstarter campaign in 2014 in order to publish a book about his work.

If there are designers out there who are willing to contribute, you are more than welcome!

The hardware will be composed of an OPEN CM9 board, a PIXY Cam, a 9DOF Razor IMU and FSR sensors. This list will probably be modified later on.

A few words about the software: I designed a simple simulator that takes into account the kinematics of the robot (no dynamics here). For the different phases of the gait, it calculates the angles for each joint. These lists of angles are transferred to another algorithm that calculates the position and speed of each joint according to the time associated with each phase of the gait. A scheduler running on the OPEN CM9 sends the commands at a given sampling time.
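The flow described above could be sketched like this; the command format, the `send_command` callback and the sampling period are hypothetical stand-ins (the real scheduler runs as firmware on the OPEN CM9):

```python
def run_schedule(commands, send_command, sampling_time=0.02):
    """Dispatch (time, joint, position, speed) commands in order,
    quantised to the sampling period. All values are illustrative."""
    # Sort commands by trigger time, then replay them tick by tick.
    pending = sorted(commands, key=lambda c: c[0])
    tick, sent = 0, []
    while pending:
        now = tick * sampling_time
        while pending and pending[0][0] <= now:
            _, joint, pos, speed = pending.pop(0)
            send_command(joint, pos, speed)
            sent.append((joint, pos, speed))
        tick += 1
    return sent

# Hypothetical command list: (trigger time s, joint id, position deg, speed deg/s).
log = []
run_schedule([(0.0, 1, 10.0, 50.0), (0.04, 2, -5.0, 30.0)],
             lambda *c: log.append(c))
print(log)
```

The point is only the structure: precomputed, time-stamped commands replayed at a fixed sampling rate, with no feedback loop.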

I finished the first version of the robot in January and since then I have been focusing on the gait algorithm as well as debugging the mechanical part of the robot. Indeed, all the frames are made of resin, and I noticed that some frames were bending under mechanical stress. Another issue is the motors' torque versus the weight of the robot – especially at the hip. So I have to find solutions… It is quite a frustrating period because I need to understand how the structure of the robot behaves in the real world and how to deal with “unexpected” issues (ones I did not think about beforehand). Although it is frustrating, it is also a necessary (and sometimes exciting) learning period.

Please find below some CAD pictures of the mechanical design of the robot. It is composed of Dynamixel servos (AX12, MX28 and MX64). The frames are made of resin.

Here are some sketches we are working on with Dacosta Bayley.

And finally some pics of the real one:


This is another very beautiful art project. I also believe that art forces the engineer to take (technological) risks in order to be as impressive as possible.
Are you full time on this project? Where do you work on it?
I read some articles on your blog; the one dedicated to walking is very interesting. It is the first time I have read about an animation-based approach. I have a string-puppet approach mixed with a passive-walking approach.
Have you ever tested your RAZOR IMU? Is there Euler angle integration inside, or do you plan to do it yourself (or not use Euler angles…)? This is my main issue on my Poppy-based robot.

I looked at your robot architecture in detail. I like the 5-DOF torso, like Poppy's. I once worked with a dancer who said that this torso is the robot's main feature and makes it very unique.
The pelvis architecture is, on the other hand, very different from Poppy's. I am interested to see it moving. Is it easy to make the robot sit down?
I am very surprised about one thing: the MX-12W servo at the shoulder. This servo is very weak and I don't think it is able to lift the arm. On the other hand, I know that if we make the arm like a puppet's, this servo won't strain much.
I am less fond of the feet, which are too heavy, but making a good foot is a big challenge.


Hello Thot

Thank you for your message. I also read about your nice Marionettes project - congratulations on getting funding to pursue it!

Today I am working part time on it. I would like to work full time, but then I would need to look for financial support. I think I need to make it walk first; after that I will see.

I have not tested the Razor IMU so far. My understanding is that it can work fine as-is with the software available on GitHub - am I wrong?

I agree about the MX-12W - actually I do not understand why Robotis released them - they are really weak regarding the torque they can handle. I replaced them with AX12s.

Regarding the pelvis: yes, the robot can sit down - not quite like Poppy, because of the positioning of the servos - but it can.

The foot is heavy, that's true - around 350 g including frames and servos - and I have to take it into account and compensate with the torso when the leg is moving. This is the major drawback of this design.
The foot is a big challenge because, in one way or another, you need to lateralize during the gait, and the slower the gait, the more you have to lateralize. You can go for a passive mechanism - very hard to tune, and it will work for that gait only. If you want to walk faster or slower, the passive mechanism will need to be controlled as well, and you will need a motor somewhere on the leg. It will increase the weight anyway.

Did you play with the PID gains of the MX28? I am trying to change them, but I am dealing with a lot of oscillations, and the Ziegler-Nichols method does not give good values so far. I will check on the forum whether some people are working on that topic.
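For context, the classic Ziegler-Nichols rule mentioned above derives the three gains from the ultimate gain Ku (the P gain at which the joint starts to oscillate steadily) and the oscillation period Tu; a minimal sketch with invented numbers:

```python
def ziegler_nichols_pid(ku, tu):
    """Classic Ziegler-Nichols PID rule: compute (kp, ki, kd) from the
    ultimate gain ku and the oscillation period tu (seconds)."""
    kp = 0.6 * ku
    ti = tu / 2.0          # integral time
    td = tu / 8.0          # derivative time
    ki = kp / ti
    kd = kp * td
    return kp, ki, kd

# Hypothetical values observed on a single servo joint.
kp, ki, kd = ziegler_nichols_pid(ku=10.0, tu=0.2)
print(kp, ki, kd)
```

These rules assume a roughly linear plant; a geared servo driving a swinging limb violates that, which may be one reason the method gives poor values here.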

I am looking forward to seeing videos of your Poppy acting on stage :)


Superb project - I wish you the best on the completion of your bot.
I actually read your blog and it helped me a lot in understanding human walking gaits. Today I found you here on this forum. I am thankful that you have provided such vivid details in your blog.

Poppy is the first robot I found that follows a human-like design. I found the same kind of feeling in your bot… but with some differences.
I have a question… why did you use 4 servos in the hip section in the horizontal plane? Most bot designs have 2 instead of 4.
You have something in mind, right?


Thank you for telling me about the RAZOR IMU. I looked at the specifications and it seems great because:

  • the gyroscope's maximal rotation measurement is 2000 deg/s (far higher than the robot's natural frequency);
  • the acceleration measurement range can be tuned;
  • the firmware is Arduino-compatible :smile: so I am not blocked.

So I bought one (this is the third IMU I have tested, after Yoctopuce and Naveol :pensive:).

I tried to get the MM7150 from Microchip, but it is not yet available (as of July 2015). This component seems to be more powerful (it has a co-processor), but you cannot modify it.

For the MX-12W, I think it is only designed for wheels. But I want to use it as a catapult in my show (it is very fast).

Concerning your approach to walking, I completely agree with you: lots and lots of experiments to do. It is even easier with this kind of robot.

Concerning the PID gains, I only use the P gain, since the structure of the robot is very light. The I gain may be useful if there is a static error, but in my case there is only a little static error. Finally, concerning the D gain, I am afraid of derivatives :smile:

Hi NicoX

Thank you for your message; I am happy that the article on the walking gait helped you. Actually, I need to update it a bit, but I have not found the time to do it.

I am planning to do the same for the “make a turn” gait. Actually, I did not find any document on that gait on the net - did you? The idea is to make this robot rotate like we do - and the articulated foot is necessary here.

Regarding the 4 servos in the hip section: yes, I had something in mind, but after some tests it looks like I can do it with 2 servos as well.

Hello Fabrice,

I received the RAZOR IMU and played with it. It seems to be a good one (moreover, we can hack the software).
I did some tests (not the ones shown in the videos…):

  • Turn test: put the sensor horizontally on the table, turn it 360 deg around the Z axis, and check that the roll and pitch values stay small. This holds for the RAZOR.
  • Shake test: put the sensor horizontally on the table, lift it and shake it violently in all directions, return it to the table, and check that the roll and pitch values are small.
  • Shock test: put the sensor horizontally on the table, lift it and hit it on the table violently (shocks), return it to the table, and check that the roll and pitch values are small.
  • Translation test: put the sensor horizontally on the table, move it sideways, and check that the roll angle stays small.

All these tests pass with the RAZOR :smile: so I can trust the measurements. I hope I will not have any issues with this one later…
I have to attach the sensor to Poppy now.
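As a software counterpart to these checks: with the sensor back flat and static on the table, roll and pitch recomputed from the accelerometer alone should be close to zero. A minimal sketch (the axis convention is an assumption and may differ from the Razor firmware):

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Estimate roll and pitch (degrees) from a static accelerometer
    reading; only valid when the sensor is not accelerating.
    Axis convention (Z up when flat) is an assumption."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    return roll, pitch

# Sensor flat on the table: gravity along +Z only, so roll and pitch ~0.
print(roll_pitch_from_accel(0.0, 0.0, 1.0))
```

This is what the "return to the table, check roll and pitch are small" steps verify: whatever the fused gyro estimate drifted to during the shaking, the angles must settle back near the accelerometer-only values.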

Hi Thot

Very good!

Did you purchase it directly through Sparkfun? Did you upload the code available on GitHub without problems - I mean, does it work well?

No, I purchased it via “Generation robot”, a Sparkfun reseller in France.
I took the code from GitHub and flashed the module with the Arduino application. It worked effortlessly (“two fingers in the nose”, as we say in French).
You just have to buy an FTDI device to convert the serial port to USB with 3.3 V power.

After months of tests and redesigning the robot, I wanted to post an update on the project and share two videos.

The first one shows the robot walking at real speed. The video is divided in two parts: the first provides a global view of two cycles, and the second focuses on the leg movements.

As mentioned before, the gait I implemented is different from the one used by most existing humanoid robots. Here the gait implements a heel strike, a forefoot push, a rotating pelvis and a stance leg that stays straight during the swing phase. In other words, the robot never keeps its knees bent.

The second video shows the same sequence at 4x speed; it is interesting to look at as well.

With respect to the initial version posted in April 2015, I redesigned some parts of the robot: mainly, the pelvis and the brackets linked to it are now made of aluminum. I also worked a lot on the lower part of the leg and the foot in order to change the weight distribution. This new design contributed significantly to the creation of a stable walking gait. The picture below shows the actual robot:

The picture below focuses on the foot, showing the passive articulation of the heel and the active articulation of the forefoot.

The gait cycle I have implemented is “hardcoded”, i.e. it consists of a list of (position, speed) couples that are triggered at specific time frames. First, a simulator taking into account the kinematics of the robot computes the angular position of each limb for the different phases of the gait.

The picture below is a snapshot of the simulator's output:

The output is a list of “angular position profiles”. These profiles are then used by another piece of software that computes a list of angular positions and associated angular velocities in order to create the sequence of commands sent to the servos. It can compute the angular velocity because the duration of each phase is also given in the parameter list.
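This position-to-velocity step can be sketched as follows; the joint profile and phase durations below are invented for illustration, not the actual ODOI parameters:

```python
def commands_from_profile(profile, durations):
    """For each gait phase, compute the target position and the speed
    needed to reach it within the phase duration.
    profile: per-phase joint angles (deg); durations: seconds per phase."""
    commands = []
    previous = profile[0]
    for target, duration in zip(profile[1:], durations):
        speed = abs(target - previous) / duration  # deg/s over this phase
        commands.append((target, speed))
        previous = target
    return commands

# Hypothetical knee-joint profile (degrees) across three gait phases.
knee_profile = [0.0, 30.0, 10.0, 0.0]
phase_durations = [0.4, 0.3, 0.3]  # seconds per phase
print(commands_from_profile(knee_profile, phase_durations))
```

Each (position, speed) pair is then a servo command: the speed is simply the angular distance divided by the time budget of that phase.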

Here is a snapshot of the output:

I chose this “trial and error” methodology in order to understand how the gait really works, how the different parts of the body interact, how to coordinate the different movements and, finally, what role the dynamics play. This is not the easiest way, because it is a very frustrating compute/test/failure loop, but in the end I came up with a quite stable gait and, more importantly, a good (better) idea of what a dynamic walking gait looks like. But to move forward from here, i.e. to implement a dynamic walking gait, I will have to look for people and/or partnerships with universities, because my math skills are not strong enough.

Besides looking for the Grail, I mean a dynamic walking gait, my next goals are:

  • Implement a “turning gait”. The interesting thing to understand here is whether the lateralization phase is easier or harder to implement than in the straight walking gait;

  • Explore the stride length and its implications for the lateralization phase;

  • Connect the different sensors and the gyro in order to collect data and see what it looks like; what can we do with it?


Wow, very nice work.
I like the shape of the semi-passive foot; it is a good idea. I saw you placed AX-12 motors in the arms and MX-28s in the spine. Have you tried placing AX-18s in the spine?
I am very interested in having a Poppy-torso-like robot but at low cost. It seems to be very close, except for the MX-28s in the spine.

Good luck with the Grail quest :wink:

Hi Thot
Thank you very much for your comments. I really love what you are doing as well with the School of Moon!
Actually, I did not try placing AX-18s in the spine. I am not sure they would provide enough torque… Besides, you would need to redesign all the brackets to test that setup.

This video (see below) shows some important improvements with respect to video 2: no more fast movements of the torso during heel strike, and smoother transitions from one phase to another.

Some comments:

1: Because the commands are hardcoded, there are some back-and-forth movements of the torso in the sagittal plane at heel strike. This problem will be solved when the controller is connected to sensors on the feet (the Murata rotary encoder connected to the heel).

2: After heel strike, the rear (pushing) leg straightens and at the same time the robot lateralizes. This first stabilizes the robot and then provides a smooth transition towards the completion of the lateralization, to prepare the swing phase.

3: There is an active participation of the torso during the lateralization, in the frontal and horizontal planes, which is interesting. These movements clearly help (and my guess is that they become necessary as the stride length increases) and provide a more natural-looking walk.

Straight walk Video 3

The second video shows a rotation. I use the same algorithm; I (just) needed to modify the parameters and understand how to tune the angles in order to create a generic turning gait.

Obviously, the main parameter to consider is the angle of rotation of the foot (actually of the hip); let us see what the consequences are for the whole gait.

If the foot angle is around 20-30 degrees, it is possible to extend the swing leg and rotate the pelvis as well. The gait shown in the video falls into this case.

However, if the angle is large, 40 degrees or more, the swing leg cannot really extend – due mainly to mechanical constraints – otherwise it becomes quite difficult to generate the movements for the swing leg to become the stance leg afterwards.

Another challenge is the rotation of the pelvis (or the external rotation of the stance leg, which becomes the swing leg) in the horizontal plane. The larger the hip angle of the swing leg (which then becomes the stance leg), the larger the rotation of the stance leg (which becomes the swing leg).

I am working on a gait with a large angle in order to see how it works.

The picture below is a graphical representation of the different cases.

Here is the video:

Turning gait


Video showing a “depressed robot”

I wanted to create a new gait that you will not find anywhere: a gait for a “depressive robot” (well, the robot is not depressed of course; it just imitates the posture of a depressive person).

It is also interesting research work, because one has to coordinate the whole body in order to elicit an emotion from the audience. Besides, it is a good opportunity to play with the articulated torso of the robot.

Unfortunately I cannot, like cartoonists do, exaggerate all the postures in order to create amazing walking gaits and build stories around them.

In this video the robot walks, then stops to look at you and takes a breath before resuming its course.

A depressive gait – if I may say so – is characterized by:

  • Reduced speed
  • Reduced stride length
  • Reduced body sway
  • Reduced arm swing
  • “Poor” posture (rounded shoulders, head thrust forward)
  • Reduced vertical movement
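The list above can be read as scaling down a nominal gait's parameters and slumping the posture; a sketch under that reading (all parameter names and values here are invented, not the actual ODOI parameters):

```python
def depressive_gait(nominal, scale=0.6):
    """Derive a 'depressive' gait from a nominal one by reducing speed,
    stride, sway, swing and vertical motion, and slumping the posture.
    Keys, scale factor and offsets are illustrative only."""
    gait = dict(nominal)
    for key in ("speed", "stride_length", "body_sway",
                "arm_swing", "vertical_motion"):
        gait[key] = nominal[key] * scale
    gait["shoulder_round"] = 15.0  # degrees, rounded shoulders
    gait["head_forward"] = 10.0    # degrees, head thrust forward
    return gait

nominal = {"speed": 1.0, "stride_length": 0.5, "body_sway": 1.0,
           "arm_swing": 1.0, "vertical_motion": 1.0}
print(depressive_gait(nominal))
```

Treating the mood as a single scale factor plus posture offsets keeps the same gait machinery and only changes its parameters, which matches the idea of reusing one algorithm for several moods.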

As I mentioned earlier, cartoonists have noticed these body expressions for a long time, just by observing people, but if you want scientific proof you can look at this seminal work on gait associated with depressive mood [1].

The challenge here, from a technical point of view, is to create a gait with the torso/shoulders thrust forward and no control over the arms (torque off on the servos).

Here I kept the legs bent and the feet flat – no heel strike – as is actually the case for depressive persons.

Hmmm, I will not say that robots with bent knees are “depressive robots” :wink:

I had to move the torso backward a bit during the swing phase; otherwise the robot would fall forward.

[1] J. Michalak, N. Troje, J. Fischer, P. Vollmar, T. Heidenreich, and D. Schulte, “Embodiment of sadness and depression: Gait patterns associated with dysphoric mood,” Psychosomatic Medicine, vol. 71, no. 5, pp. 580–587, Jun. 2009.


New Video showing the robot taking a rest - leaning against a wall

I am programming the robot to mimic some of the postures we (as human beings) exhibit in our daily life. Programming some of these mundane postures and triggering them at the appropriate moment can really surprise the audience. Sitting, getting up from a chair or a bench, taking an object, walking with different moods, leaning against a wall, crossing an obstacle… are some of the postures I am exploring.

Today I focus on “resting against a wall”. There are many ways to lean against a wall: contact with the wall through your back or your foot. There is also the sideways way of leaning, where the contact with the wall is achieved through the shoulder or the hand.

The video below shows the “sideways leaning against the wall – contact with the hand” example.

Meanwhile, it is also a good exercise for testing/improving the software (forward/inverse kinematics, sequence of actions combined with sensor feedback).

The next step is to stop leaning against the wall and get back on two feet in order to resume walking (for instance).


Hello poppy community

I am exploring the idea of a robot sitting on a chair (let us call it a sitting tattletale bot) and taking various postures based on its mood and/or external events. In the latter case, one can specify how the robot reacts to the events.

For instance, you can assign a dedicated posture based on news content/headline analysis (web scraping) – here are some examples:

  • Weather forecast: if it is sunny, the robot is happy and relaxed; if it is raining heavily, it becomes sad; if there is a storm, it gets a bit anxious…

  • Celebrities: say you want to be informed when a celebrity starts dating somebody, and you want the robot to be happy about it: it will tell you by taking a happy posture (the robot can pick one happy posture among several). But if you want this news to make it sad, it will adopt a sad posture instead;

  • Sport: if your favorite team wins, the robot will be happy and will stand up; if the team loses, the robot will adopt an angry posture (like supporters in the stadium);

  • Job reports: if the jobs report is bad, the robot will inform you and adopt an anxious posture.
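A mapping like this could be sketched as a small configurable table; every event name, outcome and posture name below is illustrative, not an existing API:

```python
import random

# User-configurable mapping from (event type, outcome) to candidate
# postures; the robot picks one at random. All names are hypothetical.
POSTURE_MAP = {
    ("weather", "sunny"): ["happy", "relaxed"],
    ("weather", "heavy_rain"): ["sad"],
    ("weather", "storm"): ["anxious"],
    ("sport", "win"): ["happy_standup"],
    ("sport", "loss"): ["angry"],
    ("jobs", "bad"): ["anxious"],
}

def react(event, outcome, rng=random):
    """Return the posture to play for an event, or a neutral default."""
    candidates = POSTURE_MAP.get((event, outcome), ["neutral"])
    return rng.choice(candidates)

print(react("sport", "loss"))
print(react("weather", "unknown"))
```

Keeping the mapping as plain data makes it easy for the user to reassign moods to events, which is exactly the "one can specify how the robot reacts" idea above.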

The first step is to create some postures and see how they look. Here is a first video showing the robot sitting down on a chair and taking two different postures.

These are the ideas I am exploring right now; I will give more details on this project later if people are interested.



Postures and sound


Dear Poppy community,

In this post I introduce a new video where I continue to explore different postures. This time I decided to experiment with recording some sounds and voices. The idea here is to show that if the voice, and more importantly the tone of the voice, is in “harmony” with the posture, the way you look at the robot is really different.

There are 5 postures:

  • Applaud,
  • Thinking,
  • A bit anxious,
  • A bit angry,
  • Time to take a rest.

It’s great!

As you say, the voices and sounds make the difference and make the robot more… real ^^

It is just a bit scary with a “human” voice. Did you record your voice and play it back? Or is it text-to-speech?

Many thanks Damien

Actually, I recorded my own voice. It is an experiment; I wanted to play with the tone of the voice and see how it comes across when it is synchronized with the postures.
I am still an amateur at making movies… mixing sounds and videos.
I did some research on text-to-speech software, but I did not find anything that lets you play with the tone of the voice. If you know of such software, I will be more than happy to have a link :slight_smile:

I am thinking about how to record some sounds and synchronize them with the postures, so the robot can play them.

I do not really know a lot about text-to-speech, since I have not worked on it yet (but I will!). Maybe you can find what you need here: Open-source speech recognition and text-to-speech potentially usable with the Poppy robots