Biomimetic Synthetic Gait – First peek at my Bachelor Thesis

Here is the first sneak peek of my Bachelor Thesis for Mechanical Engineering. It is wildly optimistic, super amazing and touches upon topics like: mechanics, software, neuroscience, robot technology, AI, control theory, physiology, biomechanics, genetic algorithms, underactuated dynamical systems as well as muscle redundancy problems.

The elevator speech is still an elusive one, but here goes:

My overall objective is to gain insight into how the human nervous system coordinates gait: how the muscles are orchestrated, which areas handle what, what the actual involvement of the brain is, what has been subcontracted elsewhere, and what a theoretical model of this could look like.

My approach is to use a so-called predictive controller for forward dynamics simulation. I am mainly concerned with humanoid gait patterns and have ended up basing my project on an existing system called PredictiveSim.

This system was written by, and published alongside, the following article:

Dorn, T. W., Wang, J. M., Hicks, J. L., & Delp, S. L. (2015). Predictive Simulation Generates Human Adaptations during Loaded and Inclined Walking. PLoS ONE, 10(4), e0121407. doi:10.1371/journal.pone.0121407

In short, the system defines a humanoid model with bones, joints, tendons and muscles, along with a number of reflex loops between muscles and senses (e.g. foot contact, muscle stretch, etc.). Basically it is a greatly simplified model of a human, containing only what is believed to be the most necessary parts for walking – weirdly enough, the brain isn’t included.

With this model you can define a goal it should aim for, e.g. move forward at a certain speed while doing your best to minimize energy consumption. This problem is then solved/optimized by a genetic algorithm, and after leaving it in the hands of a computer (or several) for a lot of number-crunching, out comes something similar to this:

It doesn’t look like much to an animator, but the interesting bit is that nobody has told this model how to move. It never saw anyone doing it, there are no parameters in the model that dictate how this should be done, no keyframes, no experience, nothing. The only input it got was to try to move, and to do so in the most energy-efficient way.
Realizing that this looks somewhat like human gait in general, one may get the feeling that minimizing energy consumption is a pretty big part of why we move as we do.
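To give a feel for the optimization loop, here is a toy sketch of the genetic-algorithm idea. This is not PredictiveSim’s actual optimizer – the fitness function, parameter count and all the numbers below are made up for illustration; the real fitness comes from running the forward dynamics simulation:

```python
import random

def evolve(fitness, n_params, pop_size=40, generations=60,
           mutation_rate=0.2, mutation_scale=0.3, seed=0):
    """Minimal generational GA: keep an elite, blend-crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        elite = scored[: pop_size // 5]              # best 20% survive unchanged
        children = list(elite)
        while len(children) < pop_size:
            a, b = rng.sample(elite, 2)              # pick two elite parents
            child = [(x + y) / 2 for x, y in zip(a, b)]  # blend crossover
            child = [g + rng.gauss(0, mutation_scale)
                     if rng.random() < mutation_rate else g
                     for g in child]                 # occasional Gaussian mutation
            children.append(child)
        pop = children
    return max(pop, key=fitness)

# Toy stand-in for the simulator: reward "walking" at 1.3 m/s with low effort.
def gait_fitness(params):
    speed = sum(params) / len(params)                # pretend mean activation sets speed
    effort = sum(g * g for g in params)              # quadratic effort penalty
    return -((speed - 1.3) ** 2) - 0.01 * effort

best = evolve(gait_fitness, n_params=4)
```

The GA never sees how to walk either – it only ever sees the fitness score, which is exactly why the emergent gait is interesting.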

If anyone feels like trying it out for themselves feel free to either get the original code or pick up my version, which I will be updating throughout my project. So far it mainly has a bit more documentation and annotation.
It can be found here: jwPredictiveSim at bitbucket.org

Soft-Exoskeleton Part 4: eFactor competition

Now came the time for the actual eFactor 2014 Competition.

From the previous post, we had each part of the project – compressor, pneumatic valves, Arduino control and the artificial muscle – working individually, and we had a good idea of how to assemble them all. However, both the pneumatics and the electronics were a mess, and eFactor was coming up fast. The whole system needed to be working, working together, working consistently and looking reasonable before we could showcase it at the competition. The Arduino was still running from the computer, the rest of the electronics were powered by an external power supply, everything was connected through a breadboard, and the pneumatic system had tubes everywhere.

Using my new 3D-printer we printed a super-boring-square-engineering-style-practical box for the electronics and a new pair of end blockers for the muscle, which would fit on the arm of our mannequin doll. The muscle got a sleeve of stretchable fabric, and the pneumatic tubes and wires for the compressor end were grouped into a single sleeve.

With everything attached the wearable part of the system looked like this:

Working Prototype

We were sticking pretty well to the mandatory pattern of only JUST getting stuff working the night before you NEED it. The insides of the electronics box never reached the cosmetic phase: everything was wired with jumper wires, a breadboard, some hastily soldered joints and a bunch of batteries at different voltages – ±9 V for the muscle sensor, 5 V to power the Arduino and 6 V to power the air-valve servo (it turned out last minute that the 5 V output from the Arduino was too little juice for it to push the valve pin down).

Enjoy the wonders of the Control Box:

ControlBox intestines

Well.. it barely worked, but it worked. Good enough for eFactor.. or rather, it HAD to be good enough. It was 3 am on the morning of the 4th, the day we had to transfer the project to Industriens Hus where the competition was to be held, leaving our workshop behind and making any further large fixes impossible. The actual competition would then start on the 5th at 9 am.

Waking up again at 6:30 on the 4th, I had to take a trip to Odense to do some consultancy for an advertisement company looking into 3D animation. Two hours each way, two hours of meeting, and back to Copenhagen to help the team with the setup. It turned out we had to do a few last-minute soldering jobs and adjustments to the electronics, which I did.. I probably shouldn’t have, as a soldering iron, tiny bits of fragile electronics and a shaking, sleep-deprived hand don’t go well together. However, with a good deal of patience and maybe a bit of luck, I managed. We got it all set, tested and arranged for the next day.

During that day of setting up, it turned out that we had been picked to go on the radio to talk about our project and how we ended up attending eFactor. That put us in a slightly awkward position, as our project was the only one with its primary focus on mechanics rather than electronics, despite the competition being about wearable electronics, so I guess we weren’t representing it too well. Either way, it ended up being Andreas, Peter and Lauge who went on air at 9 am on the day of the competition to talk about the project. Later on we realised that it had actually inspired quite a few people to come and see the competition (it was open for everyone to drop by and have a look at the different projects).

When the competition started, each team had to do a presentation on stage, showing some slides and explaining their idea. Afterwards the judges would come to each team’s stall, where the project was set up and could be demonstrated, followed by some questions. Lauge and Peter presented the project on stage, and Andreas and I did the demonstration for the judges afterwards.
As it happened, an English presentation wasn’t mandatory, so Lauge and Peter had decided to do it in Danish to stay safe. However, it turned out that the organizer wanted a recording of our presentation in English as well, so we arranged for a second presentation by the stall. It so happened that my grandmother, who is the initial inspiration for this project and who had been invited to come, walked in the door at the instant the organizer came by with the camera. That turned into an arrangement where she sat at the table at our stall while Andreas and I presented the project and answered a few questions from the organizer.

The video turned out pretty well and gives a great overview of the project while demonstrating the prototype. Check it out:

 

In the end all of the projects were evaluated, and we won the popularity prize along with 15,000 DKK.
While we didn’t win the grand prize, I dare say we were all very honoured and really happy about our prize and the recognition, despite having entered a competition on a topic we barely knew anything about beforehand.

We had fun, learned a lot and eventually had a working prototype of an EMG-controlled soft-robotic exoskeleton.

 

Soft-exoskeleton Part 2 – 3-Week Course: Explorations of Wearable Soft Robotics

Continuing from Soft-Robotic Exoskeleton…

With the new and exciting idea at hand, we suddenly had quite a few assumptions to verify before the project could be assumed to have any sort of potential. This posed a real risk of running into a dead end shortly after initiating the project.

In order to lower our risk a little, I decided to do a preliminary study of the concept on my own, prior to the actual mid-term project starting. This fit nicely with the 3-week semester period in January.

One of the great benefits of DTU is that all students are able to walk up to any professor and ask if they would be interested in starting a course on a particular topic. These kinds of custom courses can be started by anyone, as long as someone can and will supervise you and the topic is found to fit into your studies.

By chance I heard about DTU Playware and that they were working with rehabilitation. I got in touch with David Johan Christensen, who is an associate professor at DTU, formerly at the MIT Media Lab, and has worked a lot with robotics. I told him about my idea and he agreed to supervise my project, and eventually also the bigger mid-term project if we wanted. Thanks to past experience with my supervisor from the 3D-printed animatronic head, David Bue Pedersen, I also asked him if he wanted to supervise, and eventually we ended up with two super cool supervisors, both for my individual 3-week course and for the mid-term team project.

Talking to David Johan Christensen, it turned out that the Center for Playware had recently initiated a collaboration with the Center for Spinal Cord Injury at Glostrup Hospital, Denmark. They were already using exoskeletons for rehabilitation, and the timing seemed just perfect.

We arranged that I would prepare a presentation for the hospital at the beginning of my course. On the first day of the 3-week course we visited the Center for Spinal Cord Injury and I presented my idea. They were super excited about it, and suddenly the project had funding!

The 3 weeks progressed super fast; I was busy testing all sorts of things, hacking and having a really good time! I started out with balloons, caulking guns and bicycle valves:

Nope.. wasn’t a good idea

Testing the straightening-force of a long balloon. None… :/

Building up the valve for easier handling, so fewer hands were needed to hold on to everything:

Valve assembly: bicycle valve, caulking-gun tip, a balloon and a random piece of rubber tubing

Valve assembled

and eventually ending up with a braided cable-routing tube surrounding a bicycle tube.

snakeskin-relaxed snakeskin-compressed

By inflating the inner tube, its diameter expands, pushing out on the braided tube, which contracts in length as the diameter increases. In this way I made a linear pneumatic actuator.

Linear Pneumatic Actuator

Eventually it turned out that such an actuator already existed and is called a McKibben-type pneumatic artificial muscle. In a way that was good news – the concept was validated – and I continued in that direction.
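The contraction comes straight from the braid geometry: each braid fibre has a fixed length, so as the circumference it wraps around grows, the axial length has to shrink. A back-of-the-envelope sketch – all the dimensions below are made up, not measurements from my actuator:

```python
import math

def mckibben_length(d, b, n):
    """Actuator length for diameter d, given a fixed braid-fibre length b
    wrapped n full turns: each fibre is the hypotenuse of a triangle with
    legs L (axial length) and n*pi*d (circumference travelled)."""
    return math.sqrt(b ** 2 - (n * math.pi * d) ** 2)

b, n = 0.30, 1.5                  # 30 cm fibres, 1.5 turns (illustrative numbers)
d_rest, d_inflated = 0.015, 0.030  # diameter doubles when inflated

L0 = mckibben_length(d_rest, b, n)
L1 = mckibben_length(d_inflated, b, n)
contraction = (L0 - L1) / L0       # fractional shortening, roughly 9% here
```

Doubling the diameter with these numbers shortens the actuator by a little under 10 percent, which matches the intuition that you trade a lot of radial expansion for a modest axial pull.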

The main challenge for me was to figure out whether the concept could provide enough lifting power to lift an arm, and how to reach such forces. In the end I found a solution that seemed to work, and I wrote a report on all my findings.

At the time of writing: sadly, but also quite nicely, I cannot publish the report just yet, as we are currently exploring a potential patent for some of our recent findings. It is still very uncertain whether we will proceed with it, but to be safe, I’d rather say too little than too much right now.
However, the plan is to wrap up the project with an article, which will hopefully be published somewhere. More info will appear on this blog eventually 🙂

…To be continued in part 3, the mid-term project

Muscle-triggered dancing crab: Biometrics, hell yeah!

A few days ago I finally received a muscle sensor which I’ve been wanting to buy since forever – this one: https://www.sparkfun.com/products/11776. The 4-day wait to play around with it felt like an eternity, but tonight I finally got the chance.

What I was most curious about was the reliability of the signal: how hard it would be to measure the tension of a muscle, and whether I’d need a whole bunch of filters to get anything sensible out of it. I had no idea what to expect when I bought it, apart from having seen a few examples of its use online.

It turned out to be really quite easy to control. I added a moving-average filter over 10 samples and instantly had a signal I could use to control… in this case, a servo with a crab on top.

I applied the electrodes to the flexor pollicis brevis – the muscle that pulls your thumb towards your pinky – set up a few conditions for the signal on the Arduino, and BAM! The crab was dancing 🙂
By pushing my thumb against the other fingers, I tense the muscle, and depending on the amount of tension I can make it rotate slower or faster. As for the conditions, I set up a minimum signal threshold above which it starts moving. Each time the signal goes beyond that threshold, the servo moves in one direction, speeding up with the tension. When the signal drops below the threshold, the direction is reversed, so it goes the other way when I tighten again.
The fact that the crab may seem to follow my hand at certain points could be a bit misleading, as I’m merely playing around and aiming it by tensing/relaxing the muscle accordingly. All motion is muscle-controlled 🙂
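The real thing runs as Arduino code, but the filter-plus-threshold logic could be sketched in Python like this; the window size of 10 matches what I used, while the threshold, speed scaling and assumed 10-bit ADC range are made-up illustration values:

```python
from collections import deque

class EmgServoController:
    """Moving-average smoothing plus the threshold rules described above:
    above the threshold the servo moves, faster with more tension, and the
    direction flips each time the signal drops back below the threshold."""

    def __init__(self, window=10, threshold=300, max_speed=5):
        self.samples = deque(maxlen=window)   # sliding window of raw readings
        self.threshold = threshold
        self.max_speed = max_speed
        self.direction = 1
        self.was_above = False

    def update(self, raw):
        """Feed one raw EMG sample; return a signed servo speed."""
        self.samples.append(raw)
        smoothed = sum(self.samples) / len(self.samples)
        if smoothed > self.threshold:
            self.was_above = True
            # speed scales with tension beyond the threshold (0..1023 ADC assumed)
            speed = min(self.max_speed,
                        self.max_speed * (smoothed - self.threshold)
                        / (1023 - self.threshold))
            return self.direction * speed
        if self.was_above:                    # just fell below: reverse for next squeeze
            self.direction *= -1
            self.was_above = False
        return 0
```

Calling `update()` once per ADC sample gives the behaviour in the video: squeeze to spin, relax, squeeze again to spin the other way.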

Oh, and sorry about the sound. I happened to be listening to Radiohead while recording, and YouTube recognized it and claimed it was copyrighted, so they offered a way of removing only the song from the video. It didn’t really work, but as long as it keeps me in the clear, I’m good.

[In case somebody is wondering about the thing attached to the servo, it is a crab made of chestnuts:]

Kindly donated by Rebekka

Carl’s first sign of life

I finally got around to finishing the eye mechanism so here is the dude looking around for the first time!

It’s currently controlled manually by 2 potentiometers driving an Arduino Diecimila and 2 standard servos. Maybe one day it will be procedurally controlled or something.

The eyebrows are shit and probably need a redesign before I get more into that, but I’m still proud of him!

First autonomous being

As a statement of actually getting somewhere with my life, here’s another robot I’ve been working on.

It’s based on the Start Here robot from LetsMakeRobots.com, which I thought would be a reasonable project for broadening my knowledge of electronics, expanding my repertoire of microprocessors – this one using a Picaxe – and getting a bit puzzled by the use of the BASIC programming language in modern technology.

Getting it going wasn’t really a big challenge with the fairly thorough walkthrough on LetsMakeRobots, but I learned heaps and feel confident that the next project will be more of a push.

Without further ado, here’s the beast. The first autonomous dude, with several to follow. This guy’s a simpleton, but he manages through slow and safe progress.

Rise of the Dynamixel

Right. First go at robotics to go online.

As I happen to be quite a big fan of animatronics, animation, building things and programming, I’ve been playing around with a couple of Robotis Dynamixel actuators over the last month to see what they can and cannot do, and have now managed to set up a base for something I see a bit of potential in.

Feeling very comfortable in my favourite (and most hated) animation software, Maya, my initial setup hooks the actuators directly up to Maya through a Python-based server and a USB2Dynamixel. This works as a close-to-realtime connection, allowing me to pose/animate/play back from Maya and see the result on the actuators while doing so – pretty neat, hey?
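The core of each streamed frame is converting Maya’s joint angles into goal-position register values for the servos. A minimal sketch, assuming an AX-12-style register of 0–1023 over a 300-degree range; the actual serial I/O over the USB2Dynamixel is stubbed out behind a hypothetical `write_register` callback:

```python
def angle_to_goal_position(deg, ticks=1023, range_deg=300.0):
    """Map a joint angle in degrees onto an AX-12-style goal-position
    register (0..1023 over a 300-degree range), clamping out-of-range poses."""
    deg = max(0.0, min(range_deg, deg))
    return round(deg / range_deg * ticks)

def stream_pose(angles, write_register):
    """Send one frame of joint angles; write_register(servo_id, value) stands
    in for whatever actually talks to the bus (here: servo IDs 1, 2, ...)."""
    for servo_id, deg in enumerate(angles, start=1):
        write_register(servo_id, angle_to_goal_position(deg))
```

The server then just calls `stream_pose` with the sampled Maya angles at the playback frame rate.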

The learning curve has been rather steep, getting into bits and bytes when communicating with the hardware, but I’m starting to get the hang of it now and my interest in the field is growing exponentially.

The video shows a pre-animated motion being played back from Maya on the laptop in the background.

The motion is currently written as a goal angle at about 25 fps with a static speed, so my next big job is to get rid of as much jitter as possible by analysing the motion a bit, though I have a hunch that the resolution is too low.