Biomimetic Synthetic Gait – First peek at my Bachelor Thesis

Here is the first sneak peek of my Bachelor Thesis in Mechanical Engineering. It is wildly optimistic, super amazing, and touches on topics like mechanics, software, neuroscience, robotics, AI, control theory, physiology, biomechanics, genetic algorithms, underactuated dynamical systems, and muscle redundancy problems.

The elevator pitch is still an elusive one, but here goes:

My overall objective is to gain insight into the parts of the human nervous system involved in gait coordination. How are the muscles orchestrated, which areas handle what, what is the actual involvement of the brain, what has been subcontracted elsewhere, and what could a theoretical model of this look like?

My approach is to use a so-called predictive controller for forward dynamics simulation. I am mainly concerned with humanoid gait patterns and have ended up basing my project on an existing system called PredictiveSim.

The system was written by the authors of, and published alongside, the following article:

Dorn, T. W., Wang, J. M., Hicks, J. L., & Delp, S. L. (2015). Predictive Simulation Generates Human Adaptations during Loaded and Inclined Walking. PLoS ONE, 10(4), e0121407. doi:10.1371/journal.pone.0121407

In short, the system defines a humanoid model with bones, joints, tendons and muscles, along with a number of reflex loops between muscles and senses (i.e. foot contact, muscle stretch, etc.). Basically, it is a greatly simplified model of a human containing only what is believed to be the most necessary parts related to walking – weirdly enough, the brain isn't included.

With this model you can define a goal it should aim for, e.g. try to move forward at a certain speed while doing your best to minimize energy consumption. This problem gets solved/optimized by a genetic algorithm, and after leaving it in the hands of a computer (or several), a lot of thinking starts happening, and out comes something similar to this:

It doesn't look like much to an animator, or in general, but the interesting bit is that nobody has told this model how to move. It never saw anyone doing it, there are no parameters in the model that dictate how this should be done, no keyframes or experience or anything. The only input it got was to try to move, and to do it in the most energy-efficient way.
Realizing that this looks somewhat like human gait in general, one may get the feeling that minimizing energy consumption is a pretty big reason why we move as we do.
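For the optimization-curious, here's a minimal Python sketch of how a goal like that can be evolved with a genetic algorithm. To be clear, this is not the actual PredictiveSim code, which is far more elaborate; the parameter count, the toy simulate() function, and all the constants are made-up stand-ins, just to show the shape of the loop:

    import random

    # Made-up stand-ins -- the real system optimizes reflex gains and
    # control parameters against a full musculoskeletal simulation.
    N_PARAMS = 30        # e.g. reflex gains and activation offsets
    POP_SIZE = 50
    GENERATIONS = 200
    TARGET_SPEED = 1.3   # m/s, the walking speed we ask for

    def simulate(params):
        # Toy stand-in for the forward dynamics simulation; the real one
        # returns the achieved speed and the metabolic energy spent.
        speed = 1.0 + sum(params) / len(params)
        energy = sum(p * p for p in params)
        return speed, energy

    def fitness(params):
        speed, energy = simulate(params)
        # Reward hitting the target speed, penalize energy use.
        return -abs(speed - TARGET_SPEED) - 0.01 * energy

    pop = [[random.uniform(-1, 1) for _ in range(N_PARAMS)]
           for _ in range(POP_SIZE)]
    for generation in range(GENERATIONS):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP_SIZE // 2]          # keep the fittest half
        children = []
        while len(parents) + len(children) < POP_SIZE:
            a, b = random.sample(parents, 2)
            child = [random.choice(genes) for genes in zip(a, b)]  # crossover
            child = [g + random.gauss(0, 0.05) for g in child]     # mutation
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)

The real run obviously replaces simulate() with hours of physics simulation, which is exactly why it takes a computer (or several).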

If anyone feels like trying it out for themselves feel free to either get the original code or pick up my version, which I will be updating throughout my project. So far it mainly has a bit more documentation and annotation.
It can be found here: jwPredictiveSim at bitbucket.org

Soft-exoskeleton Part 1: Exoskeleton (of a kind) – Introducing

Hey!

Time for some news, and there's plenty of it! This should probably have been posted as many separate posts to follow the progress, but it so happened that I wasn't really good at updating continuously, so here goes the retrospective until I catch up.

During the past 4 months I've been super busy on a super fun project. We are building a soft-robotic system for upper-limb actuation (I don't want to call it an exoskeleton, given that the word implies a rigid outer structure, which this one doesn't have).

Our project is based on an idea I had last year on approaching exoskeletons from a slightly different angle.

To begin from the beginning, here is a little intro:

As a Mechanical Engineering student at DTU, the 4th semester is where you have your mid-term project. This is supposed to involve developing something mechanical of your own choosing, drawing on topics from at least 2 of our mandatory courses. It's furthermore a team effort and includes a course in project management alongside the product development. The project is set at 10 ECTS points, meaning a third of the semester.

Given that I had about a billion different ideas for such projects, I immediately started investigating which ones would be the most interesting.
Around the same time, it so happened that my grandmother fell and hurt her shoulder quite badly. Due to old age and a damaged rotator cuff, she ended up not being able to lift her left arm more than roughly 10 degrees from vertical. This meant that she couldn't reach out for her rollator too well, couldn't bring her left hand to her head, and had many other issues that seem insignificant until you're in the situation.

Anyway, I thought I'd take a shot at helping her out by building a wearable system that could help her lift her arm when she desired.

The more I thought about it, the more the idea seemed to fit right in with the mid-term project. Eventually I started talking to a few friends from my class about potentially doing it, and in no time we were 5 guys, all super excited about getting started.

In this way I had prepared an idea for the project, as well as gathered a team, already before Christmas. With the project not starting until February, I still had some time to play around with further ideas.
For several weeks I investigated existing exoskeletons, how we could potentially build one ourselves, what it would require, and so on. I quickly concluded that existing systems are super expensive, as augmenting movement on demand while staying non-invasive requires quite sophisticated machinery. I found that we probably shouldn't aim for a fully-fledged, Iron Man-ish exoskeleton suit, but instead see how we could hack something together, approaching the functionality bottom-up instead. My grandmother didn't require much, and merely allowing her to lift her arm forward to horizontal in an intuitive way would be an enormous help.

With this in mind, and via some other arbitrary train of thought that I cannot remember, I started thinking of the way tubes tend to straighten out when inflated. From that I started wondering if you could strap such a tube to a person and use that straightening force to actuate an arm.

After a bit of back-and-forthing with thought experiments, I concluded that it could potentially work. Consider a long tube strapped to a person's back above the hip, going up and over the shoulder and down along the upper arm, where it attaches again at the elbow. This gives it a bend over the shoulder, and when you inflate it, which makes it want to straighten, the only way for it to do so is by lifting the arm (or by detaching at the back, but that's a matter of strapping it down more tightly).

Showing my initial idea of strapping a tube down, bending over the shoulder. Inflating the tube should theoretically lift the arm.
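As a rough sanity check on whether the idea is even in the right ballpark, the tube has to overcome the moment of the arm's weight about the shoulder. The numbers below are assumed, typical-adult figures, not measurements of anyone:

    import math

    # Assumed back-of-envelope figures -- not measured.
    arm_mass = 3.5       # kg, whole arm
    com_distance = 0.30  # m, shoulder to the arm's centre of mass
    g = 9.81             # m/s^2

    for angle_deg in (10, 45, 90):
        # Moment about the shoulder with the arm lifted angle_deg from vertical.
        torque = arm_mass * g * com_distance * math.sin(math.radians(angle_deg))
        print(f"{angle_deg:3d} deg from vertical: ~{torque:4.1f} Nm")

So the worst case, at horizontal, is around 10 Nm, which gives a feel for the pressure and stiffness the inflated tube has to deliver.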

Using a tube in this way could potentially eliminate a lot of the issues with needing rigid structures to counter the reaction forces from motors pulling or rotating.
I quickly found many other potential benefits of using this method for less demanding actuation tasks. These included much cheaper production, light weight, and a flexibility that would allow it to follow the shape of the body naturally without having to include multiple mechanical joints.
I found that this approach was much more my style, and while it definitely has its drawbacks, I saw great potential in investigating further.

I told the rest of the team about my new idea, and while they had initially imagined a more mechanical exoskeleton, they were very keen on this idea as well. The only problem now was getting the go-ahead for a project that rested on some very strong assumptions on many points.

…To be continued in Explorations of Wearable Soft Robotics – 3 Week Course

Muscle-triggered dancing crab: Biometrics, hell yeah!

A few days ago I finally received a muscle sensor which I've been wanting to buy since forever – this one: https://www.sparkfun.com/products/11776. The 4-day wait to play around with it seemed like an eternity, but tonight I finally got the chance.

What I found most exciting was the reliability of the signal: how hard would it be to measure the tension of a muscle, and would I need a whole bunch of filters to get anything sensible out of it? I had no idea what to expect when I bought it, apart from having seen a few examples of its use online.

It turned out to be really quite easy to control. I added a 10-sample moving-average filter and instantly had a signal I could use to control… in this case, a servo with a crab on top.

I applied the electrodes to the flexor pollicis brevis – the muscle that pulls your thumb towards your pinky – set up a few conditions for the signal on the Arduino, and BAM! The crab was dancing 🙂
By pushing my thumb against the other fingers, I tense the muscle, and depending on the amount of tension I can make the servo rotate slower or faster. As for the conditions, I set up a minimum signal threshold below which nothing moves. Each time the signal crosses that threshold, the servo starts moving in one direction, speeding up with the tension. When the signal drops back below the threshold, the direction reverses, so it goes the other way when I tighten again.
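If anyone wants to replicate the behaviour, here is roughly the logic, sketched in Python (the real thing ran as an Arduino sketch; the window size matches my 10 samples, but the threshold and scaling constants below are made up and would need tuning against your own sensor):

    from collections import deque

    WINDOW = 10          # moving-average window, as above
    THRESHOLD = 200      # assumed raw ADC value; tune for your setup
    MAX_SIGNAL = 800     # assumed raw value mapped to full speed

    window = deque(maxlen=WINDOW)
    direction = 1
    above = False        # were we above the threshold last sample?

    def update(raw_sample):
        """Feed one raw sensor sample; return a signed servo speed in [-1, 1]."""
        global direction, above
        window.append(raw_sample)
        smoothed = sum(window) / len(window)

        if smoothed < THRESHOLD:
            above = False
            return 0.0                 # relaxed: the servo holds still

        if not above:                  # rising edge: a new tensing event,
            direction = -direction     # so flip the direction of travel
            above = True

        # Scale the tension beyond the threshold into a speed.
        speed = min((smoothed - THRESHOLD) / (MAX_SIGNAL - THRESHOLD), 1.0)
        return direction * speed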
The fact that the crab may seem to follow my hand at certain points could be a bit misleading, as I'm merely playing around and aiming it by tensing/relaxing the muscle accordingly. All motion is muscle-controlled 🙂

Oh, and sorry about the sound. I happened to be listening to Radiohead while recording, and YouTube recognized it and flagged it as copyrighted, so they offered a way of removing only the song from the video. It didn't really work, but as long as it keeps my back clear, I'm good.

[In case somebody is wondering about the thing attached to the servo, it is a crab made of chestnuts:]

Kindly donated by Rebekka

First autonomous being

As a statement of actually getting somewhere with my life, here’s another robot I’ve been working on.

It's based on the Start Here robot from LetsMakeRobots.com, which I thought would be a reasonable project for broadening my knowledge of electronics, expanding my repertoire of microcontrollers – this one uses a Picaxe – and getting a bit puzzled by the use of the BASIC programming language in modern technology.

Getting it going wasn't really a big challenge with the fairly thorough walkthrough on LetsMakeRobots, but I learned heaps and feel confident that the next project can be more of a push.

Without further ado, here's the beast. The first autonomous dude, with several to follow. This guy's a simpleton but manages through slow and safe progress.

Rise of the Dynamixel

Right. First go at robotics to go online.

As I happen to be quite a big fan of animatronics, animation, building things, and programming, I've been playing around with a couple of Robotis Dynamixel actuators during the last month to see what they can and cannot do, and I have now managed to set up my base for something that I see a bit of potential in.

Feeling very comfortable in my favourite (and most hated) animation software, Maya, my initial setup hooks the actuators directly up to Maya through a Python-based server and a USB2Dynamixel. This works as a close-to-realtime connection, allowing me to pose/animate/playback from Maya and see the result on the actuators while doing so. Pretty neat, hey?
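For anyone curious what talking to these actuators looks like, here is a stripped-down sketch of writing a goal position over the USB2Dynamixel with pyserial. It is not my actual server code, and the port name, servo ID and angle in the usage comment are just example assumptions; the packet layout is Dynamixel protocol 1.0 with AX-12-style registers:

    import serial  # pyserial

    GOAL_POSITION = 0x1E   # AX-12 goal position register (protocol 1.0)
    WRITE_DATA = 0x03      # the WRITE instruction

    def set_goal_position(port, servo_id, angle_deg):
        """Send one goal-position packet. AX-12: 0..1023 ticks over ~300 deg."""
        ticks = int(angle_deg / 300.0 * 1023)
        params = [GOAL_POSITION, ticks & 0xFF, (ticks >> 8) & 0xFF]
        length = len(params) + 2                    # instruction + checksum
        body = [servo_id, length, WRITE_DATA] + params
        checksum = (~sum(body)) & 0xFF
        port.write(bytes([0xFF, 0xFF] + body + [checksum]))

    # Example usage (port name and baud rate assumed):
    # port = serial.Serial("/dev/ttyUSB0", baudrate=1000000)
    # set_goal_position(port, servo_id=1, angle_deg=150)  # roughly centered

The Maya side then just samples joint angles every frame and pushes them through a function like this.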

The learning curve has been rather steep, getting into bits and bytes when communicating with the hardware, but I'm starting to get the hang of it now, and my interest in the field is growing exponentially.

The video shows a pre-animated motion being played back from Maya on the laptop in the background.

The motion is currently written as a goal angle at about 25 fps with a static speed, so my next big job is to get rid of as much jitter as possible by analysing the motion a bit, though I do have a hunch that the resolution is too low.
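One idea I want to try (just a sketch of the thought, nothing verified yet) is to stop using a static speed and instead write the moving-speed register per frame, so the servo covers the distance to the next goal in exactly one frame interval rather than racing there and waiting:

    AX_MOVING_SPEED = 0x20  # AX-12 moving speed register, ~0.111 rpm per tick

    def speed_ticks(delta_deg, frame_dt):
        """Speed setting so the servo covers delta_deg in one frame interval."""
        rpm = abs(delta_deg) / 360.0 / frame_dt * 60.0
        return max(1, min(1023, int(rpm / 0.111)))

    # At 25 fps (frame_dt = 0.04 s), a 2-degree step gives:
    # speed_ticks(2, 0.04) -> 75 ticks, i.e. ~8.3 rpm

Writing that value alongside each goal position, with the same kind of packet as above, should in theory smooth the motion out considerably.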

OMToolbox – Open Maya Toolbox

OMToolbox is a project I started a loooong time ago, and it is an extension of the first sculpting tools I made for Maya – also the first tools I ever wrote.

It started around my initial entry into the animation industry as a modeler, out of frustration with Maya's modeling toolset.
I had plenty of ideas for tools to speed up my workflow, but I couldn't find them available online, so I had to make my own, which led to my introduction to MEL.

After finishing the initial couple of tools, I released them to the public under the name JWToolbox, which by public demand later got turned into Open Maya Toolbox: a community-based and community-maintained open-source toolbox for everything Maya. This caught the attention of Alias (who owned Maya at that point), who featured the toolbox on their developer's corner. Sadly, the community died out when I no longer had the time to organize everything and couldn't find a replacement, returning OMToolbox to a compilation of open-source Maya tools maintained by me… and I haven't done a very good job of that lately, with priorities no longer pulling in that direction. I'll still update it occasionally though, when I see fit.

Open Maya Toolbox @ Creative Crash


PAIE – Python Animation Import/Export

Being unfamiliar with blogs and their structure, I'll probably mess around with categorizing stuff and deciding how to post, but here goes the first tech post nonetheless.

PAIE is a script I initially wrote for Radar Film on an animated feature called Berry and the Disco Worms. Having grown tired of writing the same type of tool over and over again for different companies each time I got hired some place new, I decided to take a small cut in my pay to be able to release it open source afterwards… here it is 🙂

It's a rather basic tool that allows you to export attribute values from a desired selection, save them to a file, and then load them onto the same or a different selection afterwards.
It can export animation as well as poses, export from multiple namespaces at once, and a bunch of other things.
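To give a flavour of the core idea, here is a minimal sketch of the pose-export half using maya.cmds. This is not PAIE's actual code – the function names and the JSON file format are my own simplifications:

    import json
    import maya.cmds as cmds

    def export_values(path):
        """Save keyable attribute values of the current selection to JSON."""
        data = {}
        for node in cmds.ls(selection=True):
            attrs = cmds.listAttr(node, keyable=True) or []
            data[node] = {a: cmds.getAttr(node + "." + a) for a in attrs}
        with open(path, "w") as f:
            json.dump(data, f, indent=2)

    def import_values(path):
        """Load saved values back onto nodes with the same names."""
        with open(path) as f:
            data = json.load(f)
        for node, attrs in data.items():
            for attr, value in attrs.items():
                if cmds.objExists(node + "." + attr):
                    cmds.setAttr(node + "." + attr, value)

PAIE layers the actual hard parts on top of this: animation curves rather than single values, and remapping between namespaces, among other things.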

Tool can be found on CreativeCrash – PAIE

I'll prolly update it eventually, and I reckon I might just update this post instead of cluttering up the blog with more posts about the same thing, but I'm not sure yet.

Cheers