From: "Androgen deprivation causes selective deficits in the biomechanical leg muscle function of men during walking: a prospective case-control study: Biomechanical leg muscle function deficits with ADT"

Synthesizing Human Walking with Musculoskeletal Models: An Inspirational Post

Featured picture from Cheung et al. 2016

Or in other words: Predictive Control of Bio-Inspired Biped Locomotion

Back in my undergrad days this topic caught me like a storm and thunder, inflicting inspiration exactly where I liked it. This inspiration led me from Mechanical Engineering directly into the realm of computational biology, where I have fumbled around for the past 4 years.

In essence you let a computer figure out how a virtual model of a human body would move in order to optimise certain criteria, e.g. travel at a certain speed while using the minimum amount of energy.

The point here is that you don’t control individual muscle activations and forces (kinetics), and neither do you control how the body moves (kinematics). Instead you define the abstract intention through a high-level objective, e.g. “move at 1.5 m/s in the forward direction”. Then you let a computer do the hard work of guessing at a number of unknown parameter values until the result is close enough to your target. You can then add further terms such as “also try to minimise the energy consumption per distance travelled” and let the computer think again. Further terms can be added, such as “minimise head movements while walking” and “minimise impact forces”, both inspired by physiological observations.
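
To make the idea concrete, here is a minimal sketch of what such a combined, weighted objective could look like. Every name and weight below is hypothetical, purely to illustrate the shape of the thing:

```python
# Hypothetical sketch of a combined walking objective; none of these
# names or weights come from a real simulator.

W_ENERGY, W_HEAD, W_IMPACT = 1.0, 0.1, 0.1   # made-up relative weights

def objective(params, simulate):
    """Score one candidate parameter set; lower is better."""
    gait = simulate(params)  # assumed to run a forward simulation and
                             # return a few summary gait metrics

    cost = (gait.speed - 1.5) ** 2              # "move at 1.5 m/s"
    cost += W_ENERGY * gait.energy_per_metre    # "minimise energy per distance"
    cost += W_HEAD * gait.head_acceleration     # "minimise head movements"
    cost += W_IMPACT * gait.peak_impact_force   # "minimise impact forces"
    return cost
```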

If you optimise for these four terms and the model is sufficiently detailed, then even with no other input (no motion capture, no keyframes, no predefined patterns) this simple, high-level objective function can produce human-like gait patterns seemingly out of nowhere. By this means we can effectively synthesise human gait (within reasonable tolerances).

Examples follow:

The fun one first. Thomas Geijtenbeek’s project targets games, so it does not prioritise physiological plausibility too much. However, the foundation is the same and it is very charming. The “Generation” counter shows the progress of the optimisation as it tries to find the right parameters. He furthermore includes a pass that seeks to find the optimal placement of the muscles, which makes sense for imaginary creatures but not so much for humans:

Credit: Thomas Geijtenbeek, Michiel van de Panne, Frank van der Stappen

Then a rather technical video showing a more physiologically correct method. Notice here how the model adapts its walking strategy when muscle strength is changed (this could be of particular interest for animators):

Credit: Jack Wang, Samuel R. Hamner, Scott Delp, Vladlen Koltun

And another, also quite technical, but this one nicely illustrates the humanoid model used, its muscles and its neural control circuitry.

Credit: Seungmoon Song and Hartmut Geyer

To put the above results into context, Google recently ‘asked’ an AI to find locomotion patterns for a human virtual puppet (without muscles or proper skeletal features).
The hilarious yet obnoxious result, which hurts my animator heart dearly, is this:

Credit: Nicolas Heess, Dhruva TB, Srinivasan Sriram, Jay Lemmon, Josh Merel, Greg Wayne, Yuval Tassa, Tom Erez, Ziyu Wang, S. M. Ali Eslami, Martin Riedmiller, David Silver

I admit that the aim of the project is different, but it still goes to show that asking computers to synthesize human locomotion does not automatically converge on human-like motion.

To summarise why I find this topic and approach super interesting, here are a few points:

  • The fact that we can get so close to human locomotion with such a simple objective means that minimisation of energy may be a big deciding factor in why we move as we do. That’s pretty cool by itself. Obvious if you think about it, but still pretty cool. Many other parameters are definitely important as well, such as stability and intention, but once their requirements are met, it seems metabolic cost is significant. (Note: it turns out that minimising muscle fatigue performs even better than minimising overall metabolic cost, but it requires a few more assumptions.)
  • The reasonable match between simulation and humans means that the simulated model includes some of the key elements of the human locomotor and nervous system, which provides a two-way benefit: (1) exploring and enhancing the modelled nervous system to yield better-matching locomotion can help us understand how the human nervous system works, or at least the part relating to locomotion; (2) we get the possibility of using the model to diagnose and treat pathologies, and to predict data within the model’s capabilities, which could help lower the cost of early clinical trials. (An interesting example can be found here, which uses such models to explore which physiological parameters are likely to cause the degraded efficiency of locomotion in the elderly.)
  • Being able to synthesise human gait with no preconceptions about gender, culture, mood, etc. allows for a great ground truth (maybe not yet, but in the future) for comparing clinical trials, as well as establishing a foundation for exploring how such differences affect locomotion.

One issue, though, amid all the awesome: the models of today are still very limited in their capabilities for any of the above. The theory and tech are still in their infancy, but great progress is being made all the time.

None of that progress has been made by me, however. I did scratch the surface and wrote a little bachelor thesis about the topic, and I had fun doing it. However, it was more a learning experience than actual research.
If you are interested, you are more than welcome to take a look at my thesis here:

A description follows below:

My B.Sc Eng Thesis: Framework for Predictive Simulation of Biologically Inspired Human Gait

My thesis didn’t give anything new to the science community, but what it did do was give me an incredible amount of new insight into the topic, most of which I have tried to document. You may therefore see it as a naïve introduction to the subject.

As with any topic that you don’t quite master, it is incredibly difficult to explain it briefly and in clear terms. This is also true for my thesis, so I am afraid it may not be the easiest reading. However, with the strong selling points sorted, what you will find in the thesis is an explanation of these three categories:

  • Introduction to Predictive Simulation, what it is and how it works
  • Introductory theory about the workings of the locomotor nervous system
  • Overview of software and how it is used

Enjoy

 



Biomimetic Synthetic Gait – First peek at my Bachelor Thesis

Here is the first sneak peek of my Bachelor Thesis for Mechanical Engineering. It is wildly optimistic, super amazing and touches upon topics like mechanics, software, neuroscience, robotics, AI, control theory, physiology, biomechanics, genetic algorithms, underactuated dynamical systems and muscle redundancy problems.

The elevator speech is still an elusive one, but here goes:

My overall objective is to gain insight into the part of the human nervous system related to gait coordination: how the muscles are orchestrated, which areas handle what, what the actual involvement of the brain is, what has been subcontracted elsewhere, and what a theoretical model of this could look like.

My approach is to use a so-called predictive controller for forward dynamics simulation. I am mainly concerned with humanoid gait patterns and have ended up basing my project on an existing system called PredictiveSim.

The system was written by the authors of, and published alongside, the following article:

Dorn, T. W., Wang, J. M., Hicks, J. L., & Delp, S. L. (2015). Predictive Simulation Generates Human Adaptations during Loaded and Inclined Walking. PLoS ONE, 10(4), e0121407. doi:10.1371/journal.pone.0121407

In short, the system defines a humanoid model with bones, joints, tendons and muscles, along with a number of reflex loops between muscles and senses (e.g. foot contact, muscle stretch, etc.). It is basically a greatly simplified model of a human containing only what is believed to be the most necessary parts related to walking – weirdly enough, the brain isn’t included.
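
To give a feel for what such a reflex loop amounts to, here is a toy sketch of one common ingredient in this style of controller: delayed positive force feedback, where a muscle’s own sensed force feeds back into its stimulation after a neural transport delay. This illustrates the concept only and is not code from PredictiveSim; all numbers are placeholders:

```python
# Toy version of a single reflex loop: delayed positive force feedback.
# Gains and delays are exactly the kind of free parameters the optimiser
# gets to tune.
from collections import deque

class ForceFeedbackReflex:
    def __init__(self, prestim=0.01, gain=1.0, delay_steps=20):
        self.prestim = prestim    # baseline stimulation
        self.gain = gain          # feedback gain
        self.delayed = deque([0.0] * delay_steps, maxlen=delay_steps)

    def stimulation(self, sensed_force):
        """Muscle stimulation in [0, 1], given the currently sensed force."""
        oldest = self.delayed[0]          # force sensed delay_steps ago
        self.delayed.append(sensed_force)
        return min(1.0, max(0.0, self.prestim + self.gain * oldest))
```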

With this model you can define a goal for it to aim for, e.g. try to move forward at a certain speed while doing your best to minimise energy consumption. This problem gets solved/optimised by a genetic algorithm, and after leaving it in the hands of a computer (or several), a lot of thinking happens and out comes something similar to this:

It doesn’t look like much to an animator, or in general, but the interesting bit is that nobody has told this model how to move. It never saw anyone do it; there are no parameters in the model that dictate how this should be done, no keyframes or experience or anything. The only input it got was to try to move, and to do it in the most energy-efficient way.
Realising that this looks somewhat like human gait in general, one may get the feeling that minimising energy consumption is a pretty big part of why we move as we do.
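
For the curious, here is a toy sketch of the kind of evolutionary loop involved. The real optimiser is considerably more sophisticated; `evaluate` stands in for running a full forward simulation and scoring the result against the goal:

```python
# Toy genetic-algorithm-style loop, just to illustrate the search:
# keep the fittest candidates, refill the population with mutated copies.
import random

def evolve(evaluate, n_params, pop_size=64, generations=200, sigma=0.1):
    population = [[random.gauss(0.0, 1.0) for _ in range(n_params)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=evaluate)   # lower cost is better
        parents = ranked[:pop_size // 4]            # keep the fittest quarter
        children = [[g + random.gauss(0.0, sigma)   # mutate copies of parents
                     for g in random.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return min(population, key=evaluate)
```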

If anyone feels like trying it out for themselves, feel free to either get the original code or pick up my version, which I will be updating throughout my project. So far it mainly adds a bit more documentation and annotation.
It can be found here: jwPredictiveSim at bitbucket.org

Soft-exoskeleton Part 1: Exoskeleton (of a kind) – Introducing

Hey!

Time for some news, and there’s plenty of it! This should probably have been posted as many separate posts to follow the progress, but it so happened that I wasn’t really good at updating continuously, so here comes the retrospective until I catch up.

During the past 4 months I’ve been super busy on a super fun project. We are building a soft-robotic system for upper-limb actuation (I don’t want to call it an exoskeleton, given that the word implies a rigid outer structure, which this one doesn’t have).

Our project is based on an idea I had last year on approaching exoskeletons from a slightly different angle.

To begin from the beginning, here is a little intro:

As a Mechanical Engineering student at DTU, the 4th semester is where you have your mid-term project. It is supposed to involve developing something mechanical of your own choosing, drawing on topics from at least two of our mandatory courses. It’s furthermore a team effort and involves a course in project management alongside the product development. The project is set at 10 ECTS points, meaning a third of the semester.

Given that I had about a billion different ideas for such projects, I immediately started investigating which would be the most interesting.
Around the same time, it so happened that my grandmother fell and hurt her shoulder quite badly. Due to old age and a damaged rotator cuff, she ended up not being able to lift her left arm more than roughly 10 degrees from vertical. This meant that she couldn’t reach out for her rollator too well, couldn’t bring her left hand to her head, and had many other issues that seem insignificant until you’re in the situation.

Anyway, I thought I’d give it a shot at helping her out by building a wearable system that could help her lift her arm when she wanted.

The more I thought about it, the more the idea seemed to fit right in with the mid-term project. Eventually I started talking to a few friends from my class about potentially doing it, and in no time we were five guys, all super excited about getting started.

In this way I had prepared an idea for the project and gathered a team before Christmas. With the project not starting until February, I still had some time to play around with further ideas.
For several weeks I investigated existing exoskeletons, how we could potentially build one ourselves, what it would require and so on. I quickly concluded that existing systems are super expensive, since they require quite sophisticated machinery to augment movement on demand while aiming to be non-invasive. I figured we probably shouldn’t aim for a fully-fledged, Iron-Man’ish exoskeleton suit, but instead see how we could hack something together, approaching the functionality bottom-up instead. My grandmother didn’t require much, and merely allowing her to lift her arm forward to horizontal in an intuitive way would be an enormous help.

With this in mind, and some other arbitrary train of thought that I cannot remember, I started thinking of the way tubes tend to straighten out when inflated. From that I started wondering if you could strap such a tube to a person and use that straightening force to actuate an arm.

After some back-and-forth with thought experiments, I concluded that it could potentially work. Consider a long tube strapped to a person’s back just above the hip, going up and over the shoulder and down along the upper arm, where it attaches again at the elbow. This gives it a bend over the shoulder, and when inflating it, which makes it straighten, the only way for it to do so is by lifting the arm (or by detaching at the back, but that’s a matter of strapping it down more tightly).

Showing my initial idea of strapping a tube down, bending over the shoulder. Inflating the tube should theoretically lift the arm.

Using a tube in this way could potentially eliminate a lot of the issues with needing rigid structures to counter the reaction forces from motors pulling or rotating.
I quickly found many other potential benefits of using this method for less demanding actuation tasks, including much lower production cost, light weight and flexibility, which would allow it to follow the shape of the body naturally without needing multiple mechanical joints.
I found that this approach was much more my style, and while it definitely has its drawbacks, I saw great potential in investigating further.

I told the rest of the team about my new idea, and while they had initially envisioned a more mechanical exoskeleton, they were very keen on this idea as well. The only problem now was committing to a project based on some very strong assumptions on many points.

…To be continued in Explorations of Wearable Soft Robotics – 3 Week Course

Muscle-triggered dancing crab: Biometrics, hell yeah!

A few days ago I finally received a muscle sensor which I’ve been wanting to buy since forever – this one: https://www.sparkfun.com/products/11776. Four days seemed like an eternity to wait before playing around with it, but tonight I finally got the chance.

What I found most exciting was the reliability of the signal: how hard would it be to measure the tension of a muscle, and would I need a whole bunch of filters to get anything sensible out of it? I had no idea what to expect when I bought it, apart from having seen a few examples of its use online.

It turned out to be really quite easy to control. I added a moving-average filter over 10 samples and instantly had a signal I could use to control… in this case, a servo with a crab on top.

I applied the electrodes on the flexor pollicis brevis – the muscle that pulls your thumb towards your pinky – set up a few conditions for the signal on the Arduino, and BAM! The crab was dancing 🙂
By pushing my thumb against the other fingers, I tense the muscle, and depending on the amount of tension I can make it rotate slower or faster. As for the conditions, I set up a minimum signal threshold after which it would start moving. Each time the signal goes beyond that threshold, the servo moves in one direction, speeding up with the tension. When the signal drops below the threshold, the direction reverses, so it goes the other way when I tighten again.
The fact that the crab may seem to follow my hand at certain points could be a bit misleading, as I’m merely playing around and aiming it by tensing/relaxing the muscle accordingly. All motion is muscle-controlled 🙂
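
For anyone curious, here is the gist of that logic, rewritten in Python for readability (the real thing ran as an Arduino sketch, and the threshold and scaling values below are placeholders, not the ones I actually used):

```python
# Moving-average filter plus a threshold that picks direction and speed.
from collections import deque

WINDOW = 10        # moving-average filter length, as described above
THRESHOLD = 300    # minimum filtered signal before the servo moves

samples = deque(maxlen=WINDOW)
direction = 1
was_tensed = False

def step(raw_reading):
    """Take one sensor reading, return a signed servo speed."""
    global direction, was_tensed
    samples.append(raw_reading)
    level = sum(samples) / len(samples)     # moving-average filter
    tensed = level > THRESHOLD
    if tensed and not was_tensed:
        direction = -direction              # reverse on each new tensing
    was_tensed = tensed
    if not tensed:
        return 0                            # below threshold: hold still
    return direction * (level - THRESHOLD)  # speed scales with tension
```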

Oh, and sorry about the sound. I happened to be listening to Radiohead while recording, and YouTube recognised it and flagged it as copyrighted, so they offered a way of removing only the song from the video. It didn’t really work, but as long as it keeps my back clear I’m good.

[In case somebody is wondering about the thing attached to the servo, it is a crab made of chestnuts:]

Kindly donated by Rebekka

First autonomous being

As a statement of actually getting somewhere with my life, here’s another robot I’ve been working on.

It’s based on the Start Here robot from LetsMakeRobots.com, which I thought would be a reasonable project for broadening my knowledge of electronics, expanding my repertoire of microprocessors – this one uses a Picaxe – and getting a bit puzzled by the use of the BASIC programming language in modern technology.

Getting it going wasn’t really a big challenge with the fairly thorough walkthrough on LetsMakeRobots, but I learned heaps and feel confident that the next project will be more of a push.

Without further ado, here’s the beast. First autonomous dude with several to follow. This guy’s a simpleton but manages through slow and safe progress.

Rise of the Dynamixel

Right. First go at robotics to go online.

As I happen to be quite a big fan of animatronics, animation, building things and programming, I’ve been playing around with a couple of Robotis Dynamixel actuators over the last month to see what they can and cannot do, and have now managed to set up a base for something I see a bit of potential in.

Feeling very comfortable in my favourite (and most hated) animation software, Maya, my initial setup hooks the actuators directly up to Maya through a Python-based server and a USB2Dynamixel. This works as a close-to-realtime connection, allowing me to pose/animate/play back from Maya and see the result on the actuators while doing so. Pretty neat, hey?
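
For the technically curious, here is a rough sketch of what such a bridge can look like: a Maya scriptJob that, on every time change, reads a joint’s rotation and writes it over serial as a goal position. The register address and scaling below are for an AX-12 speaking Dynamixel protocol 1.0, and the port and joint names are placeholders, so treat it as a sketch rather than my exact setup:

```python
# Minimal Maya-to-Dynamixel bridge sketch.
import serial              # pyserial
import maya.cmds as cmds

port = serial.Serial("/dev/ttyUSB0", baudrate=1000000)  # USB2Dynamixel

def write_goal_position(servo_id, ticks):
    """Send a protocol-1.0 WRITE to the goal-position register (0x1E)."""
    params = [0x1E, ticks & 0xFF, (ticks >> 8) & 0xFF]
    body = [servo_id, len(params) + 2, 0x03] + params   # id, length, WRITE
    checksum = (~sum(body)) & 0xFF
    port.write(bytes([0xFF, 0xFF] + body + [checksum]))

def sync_pose():
    angle = cmds.getAttr("joint1.rotateY")      # placeholder joint name
    # the AX-12 maps 0..300 degrees onto register ticks 0..1023
    ticks = int((angle + 150.0) / 300.0 * 1023)
    write_goal_position(1, max(0, min(1023, ticks)))

cmds.scriptJob(event=["timeChanged", sync_pose])
```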

The learning curve has been rather steep, getting into bits and bytes when communicating with the hardware, but I’m starting to get the hang of it now and my interest in the field is growing exponentially.

The video shows a pre-animated motion being played back from Maya on the laptop in the background.

The motion is currently being written as a goal angle at about 25 fps with a static speed, so my next big job is to get rid of as much jitter as possible by analysing the motion a bit, though I do have a hunch that the resolution is too low.

OMToolbox – Open Maya Toolbox

OMToolbox is a project I started a loooong time ago. It is an extension of the first sculpting tools I made for Maya, which were also the first tools I ever wrote.

It started around my initial entry into the animation industry as a modeller, out of frustration with Maya’s modeling toolset.
I had plenty of ideas for how to speed up my workflow with different tools, but I couldn’t find any available online, so I had to write my own, which led to my introduction to MEL.

After finishing the initial couple of tools, I released them to the public under the name JWToolbox, which by public demand later turned into Open Maya Toolbox: a community-based, community-maintained open-source toolbox for everything Maya. This caught the attention of Alias (who owned Maya at that point), who featured the toolbox in their developers’ corner. Sadly, the community died out when I no longer had the time to organise everything and couldn’t find a replacement, returning OMToolbox to a compilation of open-source Maya tools maintained by me… and I haven’t done a very good job of that lately, with priorities no longer pulling in that direction. I’ll still update it occasionally though, when I see fit.

Open Maya Toolbox @ Creative Crash

 

 

 

PAIE – Python Animation Import/Export

Being unfamiliar with blogs and their structure, I’ll probably mess around with categorising stuff and deciding how to post, but here goes the first tech post nonetheless.

PAIE is a script I initially wrote for Radar Film on an animated feature called Berry and the Disco Worms. Having grown tired of writing the same kinds of tools over and over for different companies each time I got hired somewhere new, I decided to take a small cut in my pay to be able to release it open source afterwards… here it is 🙂

It’s a rather basic tool that allows you to export attribute values from a selection, save them to a file, and then load them back onto the same or a different selection afterwards.
It can export animation as well as poses, export from multiple namespaces at once, and a bunch of other things.
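
To give an idea of the core mechanism, here is a minimal sketch (not PAIE’s actual code) of exporting and re-applying attribute values:

```python
# Dump attribute values from the current selection to a JSON file, then
# apply them back onto a (possibly different) selection in selection order.
import json
import maya.cmds as cmds

ATTRS = ("translateX", "translateY", "translateZ",
         "rotateX", "rotateY", "rotateZ")

def export_values(path):
    data = {node: {a: cmds.getAttr(node + "." + a) for a in ATTRS}
            for node in cmds.ls(selection=True)}
    with open(path, "w") as f:
        json.dump(data, f)

def import_values(path):
    with open(path) as f:
        data = json.load(f)
    # match stored nodes to the current selection by order, so the values
    # can land on a different selection than they came from
    for node, attrs in zip(cmds.ls(selection=True), data.values()):
        for attr, value in attrs.items():
            cmds.setAttr(node + "." + attr, value)
```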

The tool can be found on Bitbucket. Feel free to contribute.

I’ll prolly update it eventually, and I reckon I might just update this post about it instead of cluttering up the blog with more posts about the same thing, but I’m not sure yet.

Cheers