The technological subject of this piece is the drone, and I have to put several drones on stage.
After putting Poppy on stage, this is a new challenge, because we need closed-loop control of the robot's position (unlike the humanoid robots, which were piloted by humans).
The goal with the drones is not an acrobatic performance but to make the drone move as if it were an animal (an insect or a bird). Trajectory control is therefore very important.
The traditional way to deal with this is motion capture, with a heavy Vicon or OptiTrack installation. But such an installation is very (very, very) expensive for a dance stage (above 100,000€).
The challenge is then to have autonomous drones without motion capture…
If you have ever heard of something like this, I am interested. I already have a list of solutions to test.
Motion capture is very efficient, but there are very young alternatives, such as the Valve Lighthouse tracking of the HTC Vive virtual reality system, available in June 2016.
Here is the solution I am working on for the moment:
The drone is a quadrotor with a small onboard camera. The camera is mounted on a pan/tilt turret built with two XL-320 motors, looking downward, and is connected to a Raspberry Pi 3 embedded on the drone.
On this Raspberry Pi runs a tag detector (ArUco), which gives the position of an ArUco tag (similar to a QR code) relative to the camera. The two XL-320 motors are controlled with Pypot so that the tag stays at the center of the image.
At the front of the stage there are two big tags facing the stage (the audience does not see them), one on the left and one on the right.
The Raspberry Pi is connected to the drone's flight controller: a Naze32 Full (10 DOF) controlled in attitude mode.
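To make the tag lock-on concrete, here is a minimal sketch of the centering logic, assuming a 640×480 image and a plain proportional controller. The gain and the clamp value are illustrative, not measured; on the real turret the resulting angles would be sent to the XL-320s as Pypot goal positions.

```python
# Sketch of the pan/tilt centering loop. IMG_W/IMG_H assume a 640x480
# camera image; KP is a hypothetical proportional gain to be tuned.
# The actual Pypot goal-position write is left out of this sketch.

IMG_W, IMG_H = 640, 480
KP = 0.05  # degrees of servo correction per pixel of error (to tune)

def centering_step(tag_center, pan_deg, tilt_deg):
    """Return updated (pan, tilt) angles that push the tag toward the
    image center. tag_center is the (x, y) pixel position of the tag."""
    err_x = tag_center[0] - IMG_W / 2
    err_y = tag_center[1] - IMG_H / 2
    # The XL-320 travel is about +/-150 degrees around center, so clamp.
    new_pan = max(-150, min(150, pan_deg - KP * err_x))
    new_tilt = max(-150, min(150, tilt_deg - KP * err_y))
    return new_pan, new_tilt

# Tag seen 100 px to the right of center: the pan angle decreases.
print(centering_step((420, 240), 0.0, 0.0))  # -> (-5.0, 0.0)
```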
For the moment, I am testing the tag detection coupled with the XL-320 pan/tilt turret to measure the response time.
I am also waiting for my first drone.
First prototype: ROS installed on a Raspberry Pi 3 (Jessie) connected to the Raspberry Pi camera (ArUco tag localization). To control the drone, I use a Naze32 flight controller, which can be piloted via UART with the MultiWii protocol, managed by pyMultiWii (the Pypot of drones). I had to add a 2.5A power supply for the Raspberry Pi 3.
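For reference, the MultiWii Serial Protocol that pyMultiWii speaks to the Naze32 is simple enough to frame by hand. Here is a sketch of building an MSP v1 request frame; the serial write over UART (e.g. with pyserial) is left out.

```python
# MSP v1 request frame: header '$M<', payload size, message code,
# payload, then a checksum that is the XOR of size, code and every
# payload byte. MSP_ATTITUDE (108) asks for roll/pitch/heading.

MSP_ATTITUDE = 108

def msp_request(code, payload=b""):
    """Build an MSP v1 request frame for the given message code."""
    size = len(payload)
    crc = size ^ code
    for b in payload:
        crc ^= b
    return b"$M<" + bytes([size, code]) + payload + bytes([crc])

frame = msp_request(MSP_ATTITUDE)
print(frame.hex())  # -> '244d3c006c6c'
```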
I am still looking for a good design for the drone, and as we are working on the “living drone”, the animal drone, the puppet drone, I need more degrees of freedom (I miss Poppy).
It may be crazy, but I plan to embed an Ergo Jr on the drone to get a sort of dragon/phoenix drone.
I do not have any Ergo here; would it be possible for someone to measure the weight of an Ergo Jr without the Raspberry Pi?
Thank you.
Ok thanks,
Drone: 400 g
Battery: 425 g
Ergo: 135 g
Total: 960 g
100% thrust: 440 g × 4 = 1760 g
Level-flight (hover) thrust: ≈ 54.5%
For acrobatic maneuverability, hovering should take less than 50% thrust (and the drone must be well balanced)… I do not want the drone to do back-flips… but it should be possible.
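The hover figure above comes straight from the measured masses; a quick sanity check:

```python
# Thrust budget from the weights posted in this thread, with 440 g of
# thrust per motor at 100% throttle as used above.

drone_g, battery_g, ergo_g = 400, 425, 135
total_g = drone_g + battery_g + ergo_g   # 960 g all-up weight
max_thrust_g = 4 * 440                   # 1760 g at 100% throttle

hover_throttle = total_g / max_thrust_g
print(round(100 * hover_throttle, 1))    # -> 54.5 (percent)
```

Just over the 50% target, which is why the back-flips stay theoretical for now.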
Status update on the project. I am discovering two things in this project: ROS and 3D printing.
For ROS, I managed to build the interface with the drone's flight control unit, the Naze32, to send commands and read back the drone's orientation and battery voltage. Everything is packed in a ROS package. I may optimize this package later if necessary.
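As a sketch of what the package reads back: per the MultiWii protocol, the MSP_ATTITUDE reply payload is three little-endian int16 values, roll and pitch in tenths of a degree and heading in degrees. The framing and checksum handling (done by pyMultiWii) are omitted here.

```python
import struct

def decode_attitude(payload):
    """Decode a 6-byte MSP_ATTITUDE payload into
    (roll_deg, pitch_deg, heading_deg)."""
    angx, angy, heading = struct.unpack("<hhh", payload)
    return angx / 10.0, angy / 10.0, float(heading)

# Example payload: roll = 12.5 deg, pitch = -3.0 deg, heading = 90 deg
payload = struct.pack("<hhh", 125, -30, 90)
print(decode_attitude(payload))  # -> (12.5, -3.0, 90.0)
```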
I wrote another ROS package that receives a Python dictionary sent over the ZeroMQ protocol by any other computer and publishes it in a topic.
I need this to connect a Thrustmaster controller.
I also added a raspicam package to read the Raspberry Pi camera at 800×600, 30 fps.
I connected it directly to a video streaming package, and I got my “digital FPV” function.
I will then make a HUD interface to practice combining OpenCV, ROS and the Raspberry Pi camera.
For the drone, I imagined printing everything, but 3D-printed parts are heavy compared with carbon fiber, so I have to mix the two. I am therefore training on the e-motion tech 3D printer to master any kind of part. I am not disappointed with this printer (and the e-motion tech support is very efficient).
ROS is powerful, but it is really difficult to install on a new platform. It is hard to find the right library compatible with the right package, and you do not know whether the package you are trying to compile will be useful for your project.
But after more than a month, I succeeded in compiling ROS Jade for the Raspberry Pi 3.
I also integrated the ArUco code to detect and localize ArUco tags, and it works very well on the Raspberry Pi (17 Hz).
I plugged the ArUco output into a ROS package that does video streaming…
Here is some news about the Phoenix project. It has changed a lot.
As always in artistic work, time is running out. I was thinking about autonomous drones, playing with ROS and building my own drone. The result is that there are too many variables to master, and unlike Poppy, a drone is always unstable.
As always, if we want to make something autonomous, it is better to understand it first. I work with the Artilect FabLab of Toulouse and its drone section, and they told me that if I want to make an autonomous drone, I first have to learn to pilot.
The coolest way to learn piloting is with the famous H8 Mini nano drone.
This drone is inexpensive, very responsive and resistant, and if you buy five batteries it lasts an hour.
For a month now, I have been piloting for about an hour almost every day with this drone… Now I understand a lot of things.
We made a decision: why not pilot the drones during the show instead of making them autonomous? It is very strange, but it has never been done…
The first show of Phoenix will be in Marseille in autumn 2017; piloting the drones, while studying drone autonomy in parallel, is therefore more pragmatic. For the next residencies I will pilot, then introduce autonomous drones. We can then advance the artistic work. And also… it is more fun.
We did the first test with Eric last week in the drone lab flight arena (where I do my daily training), and here is the video.
Some news about the project. Last week we did the first residency, at the Labos d’Aubervilliers in Paris, to make a teaser (coming soon) for the Phoenix project. As always, making a video catalyzes innovation, timing and technical issues. If I had had autonomous drones… it would have been chaotic. But everything went fine.
First of all, there are three dancers (Gaetan Brun Picard, Pauline Simon and Nans Pierson). For the teaser, there were also four drone pilots from the Fablab LOREM, plus Scott Stevenson, a freelance pilot.
The teaser was filmed by Marc Da Cunha Lopes, the same photographer as for the “School of Moon” piece! Once again, he was awesome.
There are lots of dance shows that use autonomous drones to make something “wow”, but we really observed that flying with the dancers, listening to them as they listen to the drone, is very strong and pleasant on both sides.
The technical cast is now five drones:
2 H8 Minis (8 cm), which are very responsive and give the drone an insect-like quality. We can easily dance close to somebody with them.
2 Syma X5C drones (23 cm), which are bigger but not dangerous. They look like toys but are very responsive, and they last more than 8 minutes, which is awesome for a drone.
There is also a very important point about piloting drones on stage: the radio transmitter. We use a Devention 7E transmitter, modified with a radio module, to control all the drones above. There is no more comfortable controller… without it, the project would not be possible. The stick stiffness and resolution are on a completely different level from the stock transmitters, and you can do whatever you want.
First, battery management is not easy, and charging the batteries during the work is tiring. I have to optimize this management, which is not the same as with Poppy.
Even if the control is easy with a good radio transmitter… pilot skills matter. Having real drone pilots is a plus for the realization. I have made some progress, but some difficult figures are still beyond me. For instance, flying a circle whose center does not move is difficult for me.
The stick layout on the radio controller is not unique: there are four pilot modes (mode 1, mode 2, mode 3 and mode 4). I therefore have to ask the pilots who come to the show which mode they fly. In Paris I had mode 2 and mode 4; in Marseille I will have mode 1…
Next week, we will do a first show with drones, on stage, with an audience, in Marseille…
Last week we did the first “crash test” at the Ballet National de Marseille, meaning the first presentation in front of an audience. For this presentation, we were helped by the company Dronimage with two pilots, Bruno Thorigny and Quentin Galiani. Another opportunity to meet pilots. Here is a demo of their videos.
There were Gaetan Brun Picard, Pauline Simon, Nans Pierson and Norbert Pape as dancers.
It was the first time we used FPV (first-person view) piloting on stage. It gives the drone something more animal.
The first time on stage is also the time for BUGS. And there were some… before the show.
As always, there were fatigue issues with the batteries. Unlike with Poppy, the current drawn by the drone motors is higher. The brushed motors also get hot when used for a long time, and the damping increases, so the lift force decreases. It is better to start with a cold drone.
Battery quality is very important: the voltage should sag as little as possible until the pack is really empty. The “nano-tech” batteries are very good for this.
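To illustrate the battery-management point, here is a hedged sketch of a simple "time to land" check for a 1S brushed drone. The thresholds are illustrative assumptions, not measured values: a 1S LiPo is nominally 3.7 V and is commonly considered empty under load somewhere around 3.3 V, and a sag margin accounts for the voltage dip under load.

```python
# Hypothetical low-battery check. CELL_EMPTY_V and SAG_MARGIN_V are
# assumed values for illustration; a good pack (low sag) lets you use
# a smaller margin.

CELL_EMPTY_V = 3.3   # assumed under-load cutoff for one LiPo cell
SAG_MARGIN_V = 0.1   # assumed allowance for voltage sag under load

def should_land(vbat, cells=1):
    """Return True when the per-cell voltage is below the cutoff minus
    the sag margin, i.e. the pack is flat even accounting for sag."""
    return vbat / cells < CELL_EMPTY_V - SAG_MARGIN_V

print(should_land(3.5))  # -> False: still flyable
print(should_land(3.1))  # -> True: time to land
```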