Spiking neural network control of Poppy 'dancing'

Using the Nengo neural simulator, I put together an (extremely) simple example of controlling the V-REP Poppy robot with a spiking neural network. This was really just to show that Nengo could control Poppy. Here's the video of the 'dancing':

I've constructed a series of neural oscillators and hooked them up to the 'dancing' motors from the dance example in the tutorial. The necessary Nengo code is in the links; a minimal sketch of the idea follows.
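For anyone curious what such an oscillator looks like, here is a minimal sketch using the standard Nengo recurrent-oscillator recipe. The frequency, neuron count, and the `set_motor_angle` helper are my own assumptions, standing in for whatever pypot/V-REP call actually moves the joint:

```python
import nengo

tau = 0.1      # synaptic time constant of the recurrent connection (s)
omega = 6.28   # target oscillation frequency in rad/s (~1 Hz, assumed)

model = nengo.Network(label="dance oscillator")
with model:
    # 2-D ensemble of spiking neurons representing a point on a circle
    osc = nengo.Ensemble(n_neurons=200, dimensions=2)

    # Recurrent connection implementing a harmonic oscillator:
    # the feedback matrix I + tau*A rotates the represented state
    nengo.Connection(osc, osc,
                     transform=[[1, -tau * omega], [tau * omega, 1]],
                     synapse=tau)

    # Brief input pulse to kick the oscillation away from the origin
    kick = nengo.Node(lambda t: [1, 0] if t < 0.1 else [0, 0])
    nengo.Connection(kick, osc)

    # Forward one dimension of the oscillation to a motor.
    # set_motor_angle is a hypothetical stand-in for the actual
    # pypot/V-REP call that moves a Poppy joint.
    def motor_out(t, x):
        pass  # e.g. set_motor_angle("head_z", 20 * x[0])

    motor = nengo.Node(motor_out, size_in=1)
    nengo.Connection(osc[0], motor)

with nengo.Simulator(model) as sim:
    sim.run(2.0)
```

Several of these, with different frequencies and phases feeding different motors, are enough to produce the 'dancing' in the video.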

A fun start :slight_smile:


Hi @Chris_Eliasmith, this is a cool demo that opens up the possibility of many interesting computational-neuroscience experiments using the Poppy platform (simulated or real, the humanoid or others like the Poppy Ergo Jr.).

What sensory inputs are you giving to the spiking network in this simple example (I guess it produces a kind of CPG)?
With networks that produce CPGs, you might consider using them to learn/control quadruped locomotion of the robot.

Right now there is no sensory input. I'm in the process of implementing an IMU with communication back to Nengo to change that. Our next step is to build a controller that balances the robot using our neural model. This was really just a demo showing how to do communication between Poppy and Nengo :slight_smile:
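In Nengo, that feedback path would likely just be a `Node` that polls the sensor each timestep and injects the values into an ensemble. A minimal sketch, where `read_imu` is a hypothetical reader wrapping whatever pypot exposes for the sensor:

```python
import nengo

def read_imu(t):
    # Hypothetical IMU reader: on the robot this would wrap whatever
    # pypot exposes for the sensor; stubbed with zeros here.
    return [0.0, 0.0]  # e.g. [roll, pitch]

model = nengo.Network(label="imu feedback")
with model:
    imu = nengo.Node(read_imu)        # injects sensor values each timestep
    state = nengo.Ensemble(n_neurons=200, dimensions=2)
    nengo.Connection(imu, state)      # represent the reading in spiking neurons
    # A balance controller would decode from `state` and drive the motors.
```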


Just a quick update: here's the same controller running on the real robot. Congrats to the Poppy team for making such an easy-to-use, easy-to-build robot. Impressive!
