Animation
Biomimetic Synthetic Gait – First peek at my Bachelor Thesis
Here is the first sneak peek of my Bachelor Thesis in Mechanical Engineering. It is wildly optimistic, super amazing, and touches on topics like mechanics, software, neuroscience, robotics, AI, control theory, physiology, biomechanics, genetic algorithms, underactuated dynamical systems, and muscle redundancy problems.
The elevator pitch is still an elusive one, but here goes:
My overall objective is to gain insight into the parts of the human nervous system related to gait coordination: How are the muscles orchestrated, which areas handle what, what is the actual involvement of the brain, what has been subcontracted elsewhere, and what might a theoretical model of this look like?
My approach is to use a so-called Predictive Controller for Forward Dynamics simulation. I am mainly concerned with humanoid gait patterns and have ended up basing my project on an existing system called PredictiveSim.
The system was written by the authors of, and published alongside, the following article:
In short, the system defines a humanoid model with bones, joints, tendons and muscles, along with a number of reflex loops between muscles and senses (e.g. foot contact, muscle stretch, etc.). It is basically a greatly simplified model of a human, containing only what is believed to be the most necessary parts related to walking – weirdly enough, the brain isn’t included.
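To give a feel for what one of those reflex loops amounts to, here is a minimal sketch of the idea in Python. The names, gains and thresholds are all made up for illustration; PredictiveSim's actual implementation is different and considerably more involved:

```python
# Minimal sketch of a single stretch-reflex loop in the spirit of the model
# described above. All names, gains and thresholds are made up; this is not
# PredictiveSim's actual code.

def stretch_reflex(muscle_length, rest_length, gain=1.5, baseline=0.01):
    """Map sensed muscle stretch to a muscle excitation in [0, 1]."""
    stretch = max(0.0, muscle_length - rest_length)  # what the stretch sensor reports
    excitation = baseline + gain * stretch           # reflex: respond proportionally
    return min(1.0, excitation)                      # excitation is bounded

def stance_gated(excitation, foot_on_ground):
    """Many reflexes are only active in a given gait phase, e.g. stance."""
    return excitation if foot_on_ground else 0.0
```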
With this model you can define a goal that it should aim for, e.g. try to move forward at a certain speed while doing your best to minimize energy consumption. This problem then gets solved/optimized by a genetic algorithm, and after leaving it in the hands of a computer (or several), a lot of thinking happens, but out comes something similar to this:
It may not look like much to an animator, or in general, but the interesting bit is that nobody has told this model how to move. It never saw anyone doing it, there are no parameters in the model that dictate how this should be done, no keyframes or experience or anything. The only input it got was to try to move, and to do it in the most energy-efficient way.
Realizing that this looks somewhat like human gait in general, one may get the feeling that minimizing energy consumption is a pretty big part of why we move the way we do.
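To give an idea of what the optimizer is actually chewing on, here is a toy version of such an objective, again just a sketch of my own. The weights and terms are illustrative; the real cost function is more elaborate:

```python
# Toy objective in the spirit of the optimization described above.
# The weights and terms are illustrative only; the real cost function differs.

def fitness(avg_speed, target_speed, metabolic_energy, distance, fell_over):
    if fell_over:
        return float('inf')    # falling gets you thrown out of the gene pool
    speed_error = (avg_speed - target_speed) ** 2
    cost_of_transport = metabolic_energy / max(distance, 1e-6)  # energy per metre
    return speed_error + 0.1 * cost_of_transport                # lower is better

# A genetic algorithm then searches over the model's reflex parameters,
# keeping the gait candidates that score low and mutating/recombining them.
```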
If anyone feels like trying it out for themselves feel free to either get the original code or pick up my version, which I will be updating throughout my project. So far it mainly has a bit more documentation and annotation.
It can be found here: jwPredictiveSim at bitbucket.org
Muscle-triggered dancing crab: Biometrics, hell yeah!
A few days ago I finally received a muscle sensor which I’ve been wanting to buy since forever – this one: https://www.sparkfun.com/products/11776. Four days seemed like an eternity to wait before playing around with it, but tonight I finally got the chance.
What I was most curious about was the reliability of the signal: how hard it would be to measure the tension of a muscle, and whether I’d need a whole bunch of filters to get anything sensible out of it. I had no idea what to expect when I bought it, apart from having seen a few examples of its use online.
It turned out to be really quite easy to control. I added a 10-sample moving-average filter and instantly had a signal I could use to control… in this case, a servo with a crab on top.
I applied the electrodes to the flexor pollicis brevis – the muscle that pulls your thumb towards your pinky – set up a few conditions for the signal on the Arduino, and BAM! The crab was dancing 🙂
By pushing my thumb against the other fingers, I tense the muscle, and depending on the amount of tension I can make the servo rotate slower or faster. As for the conditions, I set up a minimum signal threshold after which it starts moving. Each time the signal goes beyond that threshold, the servo turns in one direction, speeding up depending on the tension. When the signal drops below the threshold, the direction reverses, so it goes the other way when I tighten again.
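If you're wondering what those conditions look like in practice, the logic amounts to roughly this. The real thing runs as an Arduino sketch; this is the same idea written out in Python, with a made-up threshold value:

```python
from collections import deque

WINDOW = 10        # 10-sample moving average, as in the post
THRESHOLD = 120    # minimum filtered signal before anything moves (made-up value)

samples = deque(maxlen=WINDOW)
direction = 1
was_above = False

def update(raw_reading):
    """One loop iteration: filter the signal, apply the threshold, pick a speed."""
    global direction, was_above
    samples.append(raw_reading)
    filtered = sum(samples) / len(samples)    # moving-average filter

    above = filtered > THRESHOLD
    if was_above and not above:
        direction *= -1        # relaxed below the threshold: next push goes the other way
    was_above = above

    # Speed scales with how far the tension is above the threshold.
    speed = (filtered - THRESHOLD) * direction if above else 0
    return speed               # this would be written to the servo
```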
The fact that the crab seems to follow my hand at certain points could be a bit misleading, as I’m merely playing around and aiming it by tensing/relaxing the muscle accordingly. All motion is muscle-controlled 🙂
Oh, and sorry about the sound. I happened to be listening to Radiohead while recording, and YouTube recognized it and claimed it was copyrighted, so they offered a way of removing just the song from the video. It didn’t really work, but as long as it keeps me in the clear, I’m good.
[In case somebody is wondering about the thing attached to the servo, it is a crab made of chestnuts:]
Carl’s first sign of life
I finally got around to finishing the eye mechanism, so here is the dude looking around for the first time!
It’s currently controlled manually by 2 potentiometers driving an Arduino Diecimila and 2 standard servos. Maybe one day it will be procedurally controlled or something.
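For the curious, the control scheme boils down to something like the sketch below. The real thing is a plain Arduino sketch; this Python version using pyfirmata is just an illustration of the same pot-to-servo mapping, and the pin assignments are hypothetical:

```python
import time
from pyfirmata import Arduino, util

# Hypothetical wiring: pots on A0/A1, servos on pins 9/10.
board = Arduino('/dev/ttyUSB0')
util.Iterator(board).start()          # keeps analog readings flowing in

pot_x = board.get_pin('a:0:i')        # analog input, read() gives 0.0..1.0
pot_y = board.get_pin('a:1:i')
servo_x = board.get_pin('d:9:s')      # servo output, write() takes degrees
servo_y = board.get_pin('d:10:s')
pot_x.enable_reporting()
pot_y.enable_reporting()

while True:
    for pot, servo in ((pot_x, servo_x), (pot_y, servo_y)):
        value = pot.read()            # None until the first report arrives
        if value is not None:
            servo.write(int(value * 180))   # map 0..1 to 0..180 degrees
    time.sleep(0.02)
```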
The eyebrows are shit and probably need a redesign before I get more into that, but I’m still proud of him!
Rise of the Dynamixel
Right. First go at robotics to go online.
As I happen to be quite a big fan of animatronics, animation, building things and programming, I’ve been playing around with a couple of Robotis Dynamixel actuators over the last month to see what they can and cannot do, and have now managed to set up a base for something that I see a bit of potential in.
Feeling very comfortable in my favourite (and most hated) animation software, Maya, my initial setup hooks the actuators directly up to Maya through a Python-based server and a USB2Dynamixel. This works as a close-to-realtime connection, allowing me to pose/animate/play back from Maya and see the result on the actuators while doing so. Pretty neat, hey?
The learning curve has been rather steep, getting into bits and bytes when communicating with the hardware, but I’m starting to get the hang of it now and my interest in the field is growing exponentially.
The video shows a pre-animated motion being played back from Maya on the laptop in the background.
The motion is currently written as goal angles at about 25 fps with a static speed, so my next big job is to get rid of as much jitter as possible by analysing the motion a bit, though I do have a hunch that the resolution is too low.
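For reference, streaming goal angles to an actuator from Python looks roughly like this. It uses Robotis' dynamixel_sdk purely as an illustration of the idea; my server talks to the USB2Dynamixel at a lower level, and the ID, port and AX-12/protocol 1.0 details are assumptions:

```python
import time
from dynamixel_sdk import PortHandler, PacketHandler

ADDR_GOAL_POSITION = 30   # 2-byte goal position register on AX-series servos
DXL_ID = 1                # hypothetical servo ID

port = PortHandler('/dev/ttyUSB0')
packet = PacketHandler(1.0)           # AX-12s speak protocol 1.0
port.openPort()
port.setBaudRate(1000000)

def stream(angles, fps=25):
    """Write one goal angle per animation frame, like the Maya server does."""
    for angle in angles:
        goal = int(angle / 300.0 * 1023)   # AX-12: 0..1023 ticks over 0..300 degrees
        packet.write2ByteTxRx(port, DXL_ID, ADDR_GOAL_POSITION, goal)
        time.sleep(1.0 / fps)   # coarse timing plus a static speed is one jitter source
```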
PAIE – Python Animation Import/Export
Being unfamiliar with blogs and their structure, I’ll probably mess around with categorizing stuff and deciding how to post, but here goes the first tech post nonetheless.
PAIE is a script I initially wrote for Radar Film on an animated feature called Berry and the Disco Worms. Having grown tired of writing the same type of tools over and over again for different companies each time I got hired someplace new, I decided to take a small cut in my pay to be able to release it open source afterwards.. here it is 🙂
It’s a rather basic tool that allows you to export attribute values from a desired selection, save them to a file, and then load them onto the same or a different selection afterwards.
It can export animation as well as poses, export from multiple namespaces at once, and a bunch of other things.
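In spirit, the core of it boils down to something like the following bare-bones sketch using maya.cmds and JSON. PAIE itself does a lot more, and the file format here is made up for illustration:

```python
import json
import maya.cmds as cmds

# Bare-bones version of the idea: dump keyable attribute values of the current
# selection to a file, then push them back onto a selection later. PAIE's real
# format and feature set (animation, namespaces, ...) are much richer.

def export_values(path):
    data = {}
    for node in cmds.ls(selection=True):
        attrs = cmds.listAttr(node, keyable=True) or []
        data[node] = {attr: cmds.getAttr(node + '.' + attr) for attr in attrs}
    with open(path, 'w') as f:
        json.dump(data, f)

def import_values(path):
    with open(path) as f:
        data = json.load(f)
    # Match by selection order, so values can land on a different rig/namespace.
    for node, values in zip(cmds.ls(selection=True), data.values()):
        for attr, value in values.items():
            if cmds.attributeQuery(attr, node=node, exists=True):
                cmds.setAttr(node + '.' + attr, value)
```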
Tool can be found on Bitbucket. Feel free to contribute.
I’ll prolly update it eventually, and I reckon I might just update this post instead of cluttering things up with more posts about the same thing, but I’m not sure yet.
Cheers