Jimu robots platform

A few weeks ago I discovered this platform from UBTECH. Obviously the platform is targeted at children (what they call a STEM curriculum), but nevertheless it has some interesting similarities to the Robotis XL-320 platform and is very well priced. Plus there are a few interesting devices: an IR sensor and an 8-LED "eye", both working on the daisy-chained bus, as well as a Bluetooth speaker, also connected to the daisy-chained bus.

I've started to take them apart and hack through the protocol (there is absolutely zero technical information about the platform, which is a big shame, but let's give them some time; I think they are focused on marketing now, and considering that the target market is young makers, technical details are a low priority for them), and you can read about the things I've found on my blog.


Congrats on the reverse-engineering hack on your blog!

I’m pleased to let you know that I managed to control the Jimu servos using the Dynamixel controller (with a small adapter). You can read all about this here.

I'm planning to dig a little deeper into this platform, as it seems to offer a few advantages and a lot more build convenience than the XL-320s.


Hi Sonel!

I did a little experiment. I turned on the developer options on my Android tablet and logged the Bluetooth traffic from the Jimu application. Then, using a Raspberry Pi with the help of pybluez, I tested a few simple command lines. Some commands have been identified. The protocol is different from the Alpha protocol, of course, but the structure is the same.
For example:
b'\xfb\xbf\x06\x01\x00\x07\xed' # 0x01 echo
b'\xfb\xbf\x06\x08\x00\x0e\xed' # 0x08 robot info
b'\xfb\xbf\x06\x27\x00\x2d\xed' # 0x27 battery info
I also verified moving motor 5 to 90° at speed 400, as you identified in your blog.
I hope you have time to continue exploring the Jimu.
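From the three frames above, the checksum looks like the byte sum of the length, command, and parameter bytes (mod 256), and the length byte appears to count everything except the trailing 0xED. A minimal Python sketch of that framing, under those assumptions (the function name and the length rule are my guesses, not anything documented):

```python
def jimu_packet(cmd: int, params: bytes = b"") -> bytes:
    """Build a Jimu command frame: FB BF <len> <cmd> <params...> <checksum> ED.

    Assumption from the captured frames: <len> counts every byte except the
    trailing 0xED (header 2 + len 1 + cmd 1 + params + checksum 1), and the
    checksum is (len + cmd + sum of params) modulo 256.
    """
    length = 5 + len(params)
    checksum = (length + cmd + sum(params)) & 0xFF
    return bytes([0xFB, 0xBF, length, cmd]) + params + bytes([checksum, 0xED])

# Reproduces the sniffed frames:
# jimu_packet(0x01, b"\x00")  -> b'\xfb\xbf\x06\x01\x00\x07\xed'  (echo)
# jimu_packet(0x08, b"\x00")  -> b'\xfb\xbf\x06\x08\x00\x0e\xed'  (robot info)
# jimu_packet(0x27, b"\x00")  -> b'\xfb\xbf\x06\x27\x00\x2d\xed'  (battery info)
```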

There is a very good overview of the Bluetooth commands for the Alpha here (apparently it was initially published by UBTECH, but the link is now dead; fortunately the people who manage the repository had the inspiration to store a PDF version in the git repo). Unfortunately, this does not cover commands for new devices like the LED light or the IR sensor.

If you have a Mac with Xcode installed, you can download the Swift Playground Book (it's a development environment for iPad that ring-fences the code so that people can learn to code without bothering too much about other technicalities and without worrying too much about breaking things). You can open the playground (right-click on it and choose "Show Package Contents") and browse through the code to see how they prepare the commands. The interesting parts are in Contents > Sources > Robot > Robot.swift, where you can also see the commands covered for this robot.

Unfortunately, the Playground covers only the MeeBot, which has only servos and none of the new peripherals, so for those we still have no information about how they are commanded over Bluetooth. Sniffing is still an option, and I bought a Bluetooth sniffer from Adafruit, but I have not had much time to look into this.


Great teardown, by the way. If it has 4 MB of flash, I wonder if we can compile code for it and not always have to drive it from an iPad?

Also, I noticed sensors are missing from the Swift Playgrounds side of things. And the latest version of Blockly has some sensors missing, so you have to revert to the older version? And the touch sensor is a little quirky: it's handled like an interrupt (it's under the Start section), as compared to the IR events, so we had to handle it and use "go to start" to return control to the main loop in Blockly.

Maybe the Bluetooth library on git could be taken and extended to cover sensors; then we could drive it with an RPi using Python… or we could figure out how to flash the main board with some compiled code…

Real shame that this is not an EV3 beater just yet!

Hmm, in the teardown photo of the main board it does look like there are ISP-like breakouts, top and bottom right, into both the main controller chip and the flash memory…

Has anyone asked UBTECH if they want to open-source any of the info here? It seems they did that in the past for other bots; I think that's how something like Jimu gets to beat the EV3. I am wondering if, with the help of a mini Arduino, we can add bus-addressable components of our own! It's certainly better value so far, but I haven't found a way to buy all the sensors individually, just a couple of them plus some accessory packs like wheels and tracks.

Update: I've been coding it some more, building a simple line-follower bot using Blockly + the IR sensors + wheels. While the Jimu kit itself is somewhat neater than the EV3, the coding possibilities are nowhere close :-(. The variable latency of simple instructions in Blockly means the code we write to make it do real things is unpredictable, a bit limiting for any STEM beyond a bit of dancing!
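For what it's worth, the control logic a line follower needs is tiny, which is why the Blockly latency hurts so much. A hedged Python sketch of one bang-bang control step (the sensor/motor plumbing and the speed numbers are hypothetical, just to show the decision logic the Blockly program has to express):

```python
def follow_line_step(left_on_line: bool, right_on_line: bool,
                     base: int = 50, turn: int = 30) -> tuple[int, int]:
    """Return (left_speed, right_speed) for one bang-bang control step.

    Inputs are the two IR sensor readings (True = sensor sees the line);
    base and turn are arbitrary illustrative speed values.
    """
    if left_on_line and right_on_line:
        return (base, base)         # centred: drive straight
    if left_on_line:
        return (base - turn, base)  # drifting right: slow the left wheel
    if right_on_line:
        return (base, base - turn)  # drifting left: slow the right wheel
    return (0, 0)                   # line lost: stop
```

Run at a steady rate, this loop is trivial; the unpredictability comes entirely from how long each Blockly instruction takes to execute.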

Hi, guys! This is a link to the UBTECH repo on GitHub. It contains an Arduino sketch and C++ library code for communicating with Jimu servos and sensors. I tested it and it works. Good luck! https://github.com/UBTEDU/uKitExplore-library.git