[CFC] Design of expressive eyes

Call for contributions

The current Poppy Humanoid eye design was done in five minutes and is actually just a wallpaper…

Here is the current image used (stolen from the EVE robot in Wall-E):

These eyes are quite creepy as they are big and frozen:

The display has a 800x480 px resolution and the embedded ARM computer runs an Ubuntu distribution. It is therefore possible to display almost any kind of content, e.g. images, videos, Python programs, websites, Processing applications and so on…

So if you have some design/animation skills, you are very welcome! :slight_smile:


I’m already working on a Processing app using an image like this as a base for blinking eyes and expressions in a Wall-E style. I’ll post the code when it’s satisfactory. :smile:


I am also working on dynamic cartoon eyes for my puppet project. Here is the spec I made; maybe it can give you some ideas:


  • tunable eyelid opening, centered on the pupil
  • tunable eye wetness (sad effect)
  • tunable pupil diameter (“daydreaming” effect)
  • tunable iris position
  • tunable colors (skin, eye)
  • tunable eyebrows
  • human or cat eye
  • “dummy muscle”: the muscle below the eye that makes a dummy face


  • rhythmic blink
  • rhythmic iris micro-movements
  • rhythmic eyebrow movement linked to the iris

I started the project with Tkinter canvas.
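For the rhythmic blink part, one simple approach is to model eyelid openness as a periodic function of time and map it onto canvas oval coordinates. A minimal sketch in plain Python (function names, timings and the coordinate mapping are illustrative, not the actual project code):

```python
import math

def eyelid_openness(t, period=4.0, blink_time=0.25):
    """Openness in [0, 1]: fully open most of the time, with a smooth
    blink (close then reopen) lasting `blink_time` seconds every `period`."""
    phase = t % period
    if phase >= blink_time:
        return 1.0
    # half-sine dip: 1 -> 0 -> 1 over the blink interval
    return 1.0 - math.sin(math.pi * phase / blink_time)

def eye_bbox(cx, cy, rx, ry, openness):
    """Bounding box for Tkinter's canvas.create_oval: the vertical
    radius scales with eyelid openness (hypothetical parametrisation)."""
    h = max(1.0, ry * openness)
    return (cx - rx, cy - h, cx + rx, cy + h)

print(eyelid_openness(0.0))             # fully open: 1.0
print(eye_bbox(100, 100, 40, 25, 0.5))  # half-closed eye
```

In a Tkinter loop, you would call `canvas.coords(eye_oval, *eye_bbox(...))` with the current time on each tick to animate the blink.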


Do you have an illustration or video of actual results?

I’ve been working on a simple robot head to test my animation framework. The inspiration was also Eva’s eyes.

The first minute shows a simple loop of a smiling animation and the second minute shows different shapes for the eyes.


Yep, here is the first version. I will work on splines for the eyelids and eyebrows.
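A quadratic Bézier curve is one simple way to get such eyelid splines. A minimal sketch in plain Python (coordinates and names are illustrative, not taken from the framework above):

```python
def bezier3(p0, p1, p2, t):
    """Point on a quadratic Bezier curve at parameter t in [0, 1],
    with control points given as (x, y) tuples."""
    u = 1.0 - t
    x = u * u * p0[0] + 2 * u * t * p1[0] + t * t * p2[0]
    y = u * u * p0[1] + 2 * u * t * p1[1] + t * t * p2[1]
    return (x, y)

# Eyelid as a curve between the two eye corners; moving the middle
# control point down closes the lid, up opens it (illustrative numbers).
corner_l, corner_r = (60.0, 100.0), (140.0, 100.0)
lid_ctrl = (100.0, 70.0)
lid_points = [bezier3(corner_l, lid_ctrl, corner_r, i / 10) for i in range(11)]
print(lid_points[0], lid_points[-1])  # endpoints are the eye corners
```

The sampled `lid_points` can then be drawn as a polyline, and animating `lid_ctrl` animates the eyelid.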

Great!
Can you share a version (without the mouth) that we could try on Poppy to see how it looks? It may just be a video if you do not want to share the full framework.

I love this video and the expressions. Can we work on this?

Here’s a video of the loop: https://dl.dropboxusercontent.com/u/50464815/eyes.avi

In the video, the screen is actually an Android phone showing a browser. I use WebGL to render the meshes streamed from Blender.

In a “release” setting, the meshes of the different blend shapes would be exported from Blender as assets, and the blend would be directly computed in the rendering software given the weights. I don’t have this code yet, but I can code it if the eyes fit :wink:

Also, these are 3D meshes at the moment, so a 3D engine is needed. If you do not need the generality of the system, we could also simplify it to 2D meshes.
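For reference, the weighted blend itself is just a linear combination of vertex deltas. A minimal sketch in plain Python (hypothetical data layout, not the actual Blender export format): each shape is a list of (x, y, z) vertices, and the result is base + Σ weightᵢ · (shapeᵢ − base).

```python
def blend(base, shapes, weights):
    """Blend a base mesh with weighted deltas toward each target shape.

    base:    list of (x, y, z) vertex tuples
    shapes:  list of target meshes, each vertex-aligned with `base`
    weights: one blend weight per target shape
    """
    result = []
    for i, (bx, by, bz) in enumerate(base):
        x, y, z = bx, by, bz
        for shape, w in zip(shapes, weights):
            sx, sy, sz = shape[i]
            x += w * (sx - bx)
            y += w * (sy - by)
            z += w * (sz - bz)
        result.append((x, y, z))
    return result

# Example: one base vertex, one "closed eyelid" target, weight 0.5.
base = [(0.0, 1.0, 0.0)]
closed = [(0.0, 0.0, 0.0)]
print(blend(base, [closed], [0.5]))  # half-closed: [(0.0, 0.5, 0.0)]
```

The same computation runs fine per-frame in a WebGL vertex shader once the shape deltas are uploaded as attributes.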

We are developing eyes for the Cherry Project, and I have started to work on their design. The display is managed by several functions that will be updated on our GitHub soon.
Here is what I have for the moment:


The functions for the eyes have been put on our GitHub. Here’s a quick video of some of the eye animations we have.


Hi Laura, could you tell me which software you used to edit these eyes, please?
Thanks :relaxed:

Hey, in your opinion, what is the easiest way to do this kind of animation and control it programmatically with C++ instead of Python? Is OpenGL a good idea, or DirectX? I want to start reading about the topic but don’t know where to start.

Currently there is no advanced development for the eyes, so you are free to go in any direction. You can use Python or C++, but also Processing or web technologies…
I guess the best one is the one you master.

Laura, if the code/images are public on GitHub, would you mind sharing the link with the rest of us?

Many thanks,