Hello everyone!
I have been working for some time with the Poppy robot in a CoppeliaSim (V-REP) environment in Python. For various reasons I don't use Pypot; I use PyRep instead.
For learning, I have a TensorFlow/PyTorch environment with various libraries that let the robot learn simple movements in an unsupervised way.
So far, so good (I plan to publish a GitHub repository soon).
I am now tackling walking, and of course, unsupervised, it does not converge (the robot moves forward, but not for long).
Now I want to use MuJoCo so I can benefit from the already mature DeepMimic code. That code learns from motion-capture clips used as reference motions (and there are many databases of them available).
I managed to import my digital Poppy as MuJoCo XML with the STL mesh files attached, and I can control the robot in MuJoCo. But to use the motion-capture files, I need a skeleton, and I have to do multiple conversions via Blender to output a BVH file. This is where I am stuck. If by chance someone here has already made a skeleton structure in BVH or ASF/AMC, that would be wonderful.
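For anyone unfamiliar with the format: a BVH file is just a text file with a HIERARCHY section (the skeleton) followed by a MOTION section (per-frame channel values). A minimal lower-body sketch might look like the fragment below; the joint names, offsets, and channel order are purely illustrative, not an actual Poppy calibration:

```
HIERARCHY
ROOT Hips
{
  OFFSET 0.0 0.0 0.0
  CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
  JOINT LeftHip
  {
    OFFSET 0.05 -0.05 0.0
    CHANNELS 3 Zrotation Xrotation Yrotation
    JOINT LeftKnee
    {
      OFFSET 0.0 -0.18 0.0
      CHANNELS 3 Zrotation Xrotation Yrotation
      End Site
      {
        OFFSET 0.0 -0.18 0.0
      }
    }
  }
}
MOTION
Frames: 1
Frame Time: 0.0333
0 0 0 0 0 0 10 0 0 -20 0 0
```

The right leg would be a second JOINT mirrored under ROOT Hips; each MOTION line holds one value per declared channel, in declaration order.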
The idea is to retarget a motion-capture file into angles for each motor.
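To make that retargeting idea concrete, here is a minimal sketch in pure Python (standard library only) that parses a toy BVH file and maps rotation channels to motor angles. The parser, the channel naming scheme, and the motor names are my own illustrative assumptions, not Poppy's real motor names and not DeepMimic's actual pipeline:

```python
import math

# Hypothetical minimal parser: assumes a single-root, well-formed BVH file.
def parse_bvh(text):
    """Return (channel_names, frames) from BVH text.

    channel_names look like 'LeftHip_Zrotation'; frames are lists of floats,
    one list per MOTION line, in channel declaration order.
    """
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    channel_names, stack, frames = [], [], []
    i = 0
    while i < len(lines) and lines[i] != "MOTION":
        tok = lines[i].split()
        if tok[0] in ("ROOT", "JOINT"):
            stack.append(tok[1])          # enter a named joint
        elif tok[0] == "End":
            stack.append(None)            # End Site declares no channels
        elif tok[0] == "}":
            stack.pop()                   # leave the current joint
        elif tok[0] == "CHANNELS":
            channel_names += [f"{stack[-1]}_{ch}" for ch in tok[2:]]
        i += 1
    i += 1  # skip the "MOTION" line
    while i < len(lines):
        tok = lines[i].split()
        if tok[0] not in ("Frames:", "Frame"):  # skip header lines
            frames.append([float(v) for v in tok])
        i += 1
    return channel_names, frames

def frame_to_motor_angles(channel_names, frame, mapping):
    """Map BVH rotation channels (degrees) to motor angles (radians).

    `mapping` = {motor_name: bvh_channel_name}; these motor names are
    illustrative placeholders.
    """
    idx = {name: k for k, name in enumerate(channel_names)}
    return {motor: math.radians(frame[idx[ch]]) for motor, ch in mapping.items()}

SAMPLE = """\
HIERARCHY
ROOT Hips
{
  OFFSET 0 0 0
  CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
  JOINT LeftHip
  {
    OFFSET 0 -0.1 0
    CHANNELS 3 Zrotation Xrotation Yrotation
    End Site
    {
      OFFSET 0 -0.2 0
    }
  }
}
MOTION
Frames: 1
Frame Time: 0.0333
0 0 0 0 0 0 90 0 0
"""

names, frames = parse_bvh(SAMPLE)
angles = frame_to_motor_angles(names, frames[0], {"l_hip_z": "LeftHip_Zrotation"})
```

A real retargeting step also has to respect each joint's Euler rotation order (given by the CHANNELS line) and the skeleton's rest-pose offsets before the angles can be sent to the motors, but the channel-indexing logic stays the same.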
Thank you