Or in other words: Predictive Control of Bio-Inspired Biped Locomotion
Back in my undergrad days this topic caught me by storm and thunder, inflicting inspiration exactly where I liked it. This inspiration led me from Mechanical Engineering directly into the realm of computational biology, where I have fumbled around for the past 4 years.
In essence, you let a computer figure out how a virtual model of a human body would move in order to optimise certain criteria, e.g. travel at a certain speed while using the minimum amount of energy.
The point here is that you don't control individual muscle activations and forces (kinetics), and neither do you control how the body moves (kinematics). Instead you define the abstract intention through a high-level objective, e.g. "Move at 1.5 m/s in the forward direction". Then you let a computer do the hard work of guessing at a number of unknown parameter values until you are close enough to your target. You can then add further terms such as "try also to minimise the energy consumption per distance travelled" and let the computer think again. Further terms can be added, such as "minimise head movements while walking" and "minimise impact forces", which are both inspired by physiological observations.
If you optimise for these four terms and the model is sufficiently detailed, this simple, high-level objective function can result in human-like gait patterns seemingly out of nowhere, even with no other input: no motion capture, no keyframes and no predefined patterns. By this means we can effectively synthesise human gait (within reasonable tolerances).
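To make the idea concrete, the four terms can be sketched as one weighted cost function that an optimiser drives down. This is a minimal illustration, assuming a hypothetical simulator that returns per-trial measurements; all names and weights here are illustrative, not taken from any of the systems below:

```python
# Hypothetical multi-term gait objective. An optimiser tunes controller
# parameters until this score stops improving; lower is better.
def gait_objective(sim_result,
                   target_speed=1.5,      # m/s, the high-level intention
                   w_speed=100.0, w_energy=1.0,
                   w_head=10.0, w_impact=5.0):
    # 1) "Move at the target speed": penalise squared speed error
    speed_error = (sim_result["mean_speed"] - target_speed) ** 2
    # 2) "Minimise energy per distance travelled" (cost of transport)
    cost_of_transport = sim_result["metabolic_energy"] / sim_result["distance"]
    # 3) "Minimise head movements" and 4) "minimise impact forces"
    head_motion = sim_result["head_acceleration_rms"]
    impact = sim_result["peak_ground_force"]
    return (w_speed * speed_error
            + w_energy * cost_of_transport
            + w_head * head_motion
            + w_impact * impact)
```

The weights encode the trade-offs: a large speed weight makes the intention ("walk at 1.5 m/s") dominate, and the remaining terms shape *how* that speed is reached.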
The fun one first. Thomas Geijtenbeek's project targets games, so it does not prioritise physiological plausibility too much. However, the foundation is the same and it is very charming. The "Generation" counter shows the progress of the optimisation as it tries to find the right parameters. He furthermore includes a pass that seeks to find the optimal placement of muscles. This makes sense for imaginary creatures but not so much for humans:
Credit: Thomas Geijtenbeek, Michiel van de Panne, Frank van der Stappen
Then a rather technical video showing a more physiologically correct method. Notice here how the model adapts its walking strategy when muscle strength is changed (this could be of particular interest for animators):
Credit: Jack Wang, Samuel R. Hammer, Scott Delp, Vladlen Koltun
And another, also very technical, but this one greatly illustrates the humanoid model used, its muscles and its neural control circuitry.
Credit: Seungmoon Song and Hartmut Geyer
To put the above results into context, Google recently 'asked' an AI to find locomotion patterns for a virtual human puppet (without muscles and proper skeletal features).
The hilarious yet obnoxious result, which hurts my animator heart dearly, is this:
Credit: Nicolas Heess, Dhruva TB, Srinivasan Sriram, Jay Lemmon, Josh Merel, Greg Wayne, Yuval Tassa, Tom Erez, Ziyu Wang, S. M. Ali Eslami, Martin Riedmiller, David Silver
I admit that the aim of the project is different, but it still goes to show that asking computers to synthesize human locomotion does not automatically converge on human-like motion.
To summarise why I find this topic and approach super interesting, here are a few points:
The fact that we can get so close to human locomotion with such a simple objective means that minimisation of energy may be a big deciding factor in why we move as we do. That's pretty cool by itself. Obvious if you think about it, but still pretty cool. Many other parameters are definitely important as well, such as stability and intention, but once their requirements are met, it seems metabolic cost is significant. (Note: it turns out that minimising muscle fatigue performs even better than minimising overall metabolic cost, but it requires a few more assumptions.)
The reasonable match between simulation and humans means that the simulated model includes some of the key elements of the human locomotor and nervous system, which provides a two-way benefit: (1) Exploring and enhancing the modelled nervous system to yield better-matching locomotion can help explore how the human nervous system works, or at least the part relating to locomotion. (2) We get the possibility of using the model to diagnose and treat pathologies, as well as being able to predict data within the model's capabilities, which could help lower the cost of early clinical trials (an interesting example can be found here, which uses such models to explore which physiological parameters are likely to cause the degraded efficiency of locomotion in the elderly).
Being able to synthesize human gait with no preconceptions about gender, culture, mood etc. allows for a great ground truth (maybe not yet, but in the future) for comparing clinical trials, as well as establishing the foundation for exploring how such differences affect locomotion.
One issue though, between all the awesome: the models as of today are still very limited in their capabilities for any of the above. The theory and tech is still in its infancy, but great progress is being made all the time.
None of that progress has been made by me, however. I did scratch the surface and wrote a little bachelor thesis on the topic, and I had fun doing it. However, it was more a learning experience than actual research.
If you are interested you are more than welcome to take a look at my thesis here:
A description follows below:
My B.Sc Eng Thesis: Framework for Predictive Simulation of Biologically Inspired Human Gait
My thesis didn't contribute anything new to the scientific community, but what it did do was give me an incredible amount of new insight into the topic, most of which I have tried to document. You may therefore see it as a naïve introduction to the subject.
As with any topic that you don't quite master, it is incredibly difficult to explain it briefly and in clear terms. This is also true for my thesis, so I am afraid it may not be the easiest reading. However, with the strong selling points sorted, what you will find in the thesis is an explanation of these three categories:
Introduction to Predictive Simulation, what it is and how it works
Introductory theory about the workings of the locomotor nervous system
Here is the first sneak peek of my Bachelor Thesis for Mechanical Engineering. It is wildly optimistic, super amazing and touches upon topics like mechanics, software, neuroscience, robot technology, AI, control theory, physiology, biomechanics, genetic algorithms, underactuated dynamical systems and muscle redundancy problems.
The elevator speech is still an elusive one, but here goes:
My overall objective is to gain insight into the part of the human nervous system related to gait coordination. How are the muscles orchestrated, which areas handle what, what is the actual involvement of the brain, what has been subcontracted elsewhere, and what could a theoretical model of this look like?
My approach is to use a so-called predictive controller for forward-dynamics simulation. I am mainly concerned with humanoid gait patterns and have ended up basing my project on an existing system called PredictiveSim.
This system is written by and published alongside the following article:
In short, the system defines a humanoid model with bones, joints, tendons and muscles, along with a number of reflex loops between muscles and senses (e.g. foot contact, muscle stretch etc.). Basically just a greatly simplified model of a human, containing only what is believed to be the most necessary parts related to walking. Weirdly enough, the brain isn't included.
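One of those reflex loops can be sketched in a few lines. This is a toy, Geyer/Herr-style positive force feedback on a leg extensor during stance, the kind of rule such models chain together; the gains and names are illustrative, not taken from PredictiveSim:

```python
# Toy reflex loop: during stance, the more force an extensor muscle
# produces, the more it is stimulated (positive force feedback), which
# yields a self-reinforcing, spring-like support of body weight.
def extensor_stimulation(muscle_force, in_stance,
                         prestim=0.01,   # baseline stimulation
                         gain=0.006,     # feedback gain (illustrative)
                         max_stim=1.0):  # stimulation saturates at 1
    """Muscle stimulation in [0, 1] from force feedback while in stance."""
    if not in_stance:
        return prestim                   # reflex inactive during swing
    return min(prestim + gain * muscle_force, max_stim)
```

A handful of such local rules, wired to sensors like foot contact and muscle stretch, is all the "nervous system" these walking models have; there is no central planner.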
With this model you can define a goal that it should aim for, e.g.: try to move forward at a certain speed while doing your best to minimise energy consumption. This problem gets solved/optimised by a genetic algorithm, and after leaving it in the hands of a computer (or several) for a lot of 'thinking', out comes something similar to this:
It doesn't look like much to an animator, or in general, but the interesting bit is that nobody has told this model how to move. It never saw anyone doing it; there are no parameters in the model that dictate how this should be done, no keyframes or experience or anything. The only input it got was to try to move, and to do it in the most energy-efficient way.
Realizing that this looks somewhat like human gait in general, one may get the feeling that minimizing energy consumption is a pretty big reason for why we move as we do.
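The genetic algorithm doing the work above can be sketched in miniature. This is a toy version under heavy assumptions: the objective is a stand-in quadratic bowl, whereas in a PredictiveSim-style system each evaluation is a full forward-dynamics simulation of the walking model, and all the numbers below are illustrative:

```python
import random

# Toy stand-in objective: pretend the "best gait" lives at all-0.5 parameters.
def objective(params):
    return sum((p - 0.5) ** 2 for p in params)

def evolve(n_params=8, pop_size=40, generations=100, sigma=0.1, seed=1):
    rng = random.Random(seed)
    # start from a random population of parameter vectors
    pop = [[rng.random() for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)
        parents = pop[: pop_size // 4]       # elitism: keep the fittest quarter
        children = [[p + rng.gauss(0, sigma) for p in rng.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
        sigma *= 0.97                        # anneal the mutation size
    return min(pop, key=objective)

best = evolve()
```

Elitism guarantees the best candidate never gets worse, and the shrinking mutation size trades exploration for fine-tuning, which is essentially what the "Generation" counter in Geijtenbeek's video is counting.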
If anyone feels like trying it out for themselves, feel free to either get the original code or pick up my version, which I will be updating throughout my project. So far it mainly adds a bit more documentation and annotation.
It can be found here: jwPredictiveSim at bitbucket.org
I may be putting a little too much into our own soft-robotic exoskeleton project, but I actually found this video quite flattering.
The article and video concern a "fundamental jump" in technology: inflatable, cheap and lightweight exoskeletons. This is pretty much exactly what I've been spending the past 6 months developing with 4 fellow students at DTU. I thought my idea was original, but it seems that others had picked up on the vibe as well 🙂
I admit that our system is much less sophisticated than theirs looks, but at least they seem to think it's the idea that is the "jump" rather than the execution.
Now it was time for the actual eFactor 2014 competition.
From the previous post, we now had each part of the project – compressor, pneumatic valves, Arduino control and the artificial muscle – working individually, and we had a good idea about how to assemble them all. However, both the pneumatics and the electronics were a mess, and eFactor was coming up fast. The whole system needed to be working, working together, working consistently and looking reasonable before we could showcase it at the competition. The Arduino was still running from the computer, and the rest of the electronics were powered by an external power supply. All electronics were connected through a breadboard, and the pneumatic system had tubes everywhere.
Using my new 3D printer we printed a super-boring-square-engineering-style practical box for the electronics and a new pair of end blockers for the muscle, which would fit on the arm of our mannequin doll. The muscle got a sleeve of stretchable fabric, and the pneumatic tubes and wires for the compressor end got grouped in a single sleeve.
With everything attached the wearable part of the system looked like this:
We were sticking pretty well to the mandatory pattern of only JUST getting stuff working the night before you NEED it. The insides of the electronics box had no time to reach the cosmetic phase, meaning that everything was wired with jumper wires, a breadboard, some super-fast soldered joints and a bunch of batteries at different voltages: +/- 9 volts for the muscle sensor, 5 volts to power the Arduino and 6 volts to power the air-valve servo (it turned out at the last minute that the 5-volt output from the Arduino was too little juice for it to push the valve pin down).
Enjoy the wonders of the Control Box:
Well... it barely worked, but it worked. Good enough for eFactor... or rather, it HAD to be good enough. It was 3 am on the 4th, the day we had to transfer the project to Industriens Hus, where the competition was to be held, leaving our workshop behind and making any further large fixes impossible. The actual competition would then start on the 5th at 9 am.
Waking up again at 6:30 on the 4th, I had to take a trip to Odense to do some consultancy for an advertising company looking into doing some 3D animation. 2 hours each way, 2 hours of meeting and back to Copenhagen to help the team with setup. It turned out we had to do a few last-minute solderings and adjustments of the electronics, which I did... I probably shouldn't have, as a soldering iron, tiny bits of fragile electronics and a shaking, sleep-deprived hand don't go well together. However, with a good deal of patience and maybe a bit of luck, I managed. We got it all set, tested and arranged for the next day.
During that day of setting up, it turned out that we had been picked to go on the radio to talk about our project and how we ended up attending eFactor. That put us in a slightly awkward position, as our project was the only one with its primary focus on mechanics rather than electronics, despite the competition being about wearable electronics, so I guess we weren't representing it too well. Either way, it ended up being Andreas, Peter and Lauge who went on air at 9 am on the day of the competition to talk about the project. Later on we realised that it had actually inspired quite a few people to come and see the competition (it was open for everyone to drop by and have a look at the different projects).
When the competition started, each team had to do a presentation on stage, presenting some slides and explaining their idea. Afterwards, the judges would come to each team's stall, where the project was set up and could be demonstrated. Then some questions, and done. Lauge and Peter presented the project, and Andreas and I did the demonstration for the judges afterwards.
It so happened that an English presentation wasn't mandatory, so Lauge and Peter had decided to do it in Danish to stay safe. However, it turned out that the organiser would like a recording of our presentation in English as well, so we arranged for a second presentation by the stall. It so happened that my grandmother, who is the original inspiration for this project and who had been invited to come, walked in the door at the very instant the organiser came by with the camera. That turned into an arrangement where she sat at the table at our stall while Andreas and I presented the project and answered a few questions the organiser had.
The video turned out pretty well and gives a great overview of the project while demonstrating the prototype. Check it out:
In the end all of the projects were evaluated, and we won the popularity prize along with 15,000 DKK.
While we didn't win the grand prize, I dare say we were all very honoured and really happy about our prize and the recognition, despite having entered a competition on a topic we barely knew anything about beforehand.
We had fun, learned a lot and eventually had a working prototype of an EMG-controlled soft-robotic exoskeleton.
…Continuing from a great 3-week course, this is where the mid-term project starts and the team assembles for the first time to begin the actual work.
An incredible amount of stuff has happened since we started this project. I cannot include it all, but I will try to give an insight for those so inclined.
First of all, by random coincidence one of my friends saw a posting for a competition in wearable electronics and forwarded it to me: http://www.efactor2014.dk/
While it was a competition targeting electronic/software engineers, the suggested topics hit our project right on the head, so we thought we'd give it a go anyway. They required a team of maximum 5, which we were. I sent them the application while mentioning that we were all mechanical engineering students intending to build an exoskeleton. All seemed good, and we got accepted as contestants.
eFactor 2014 – Competition in Wearable Electronics
The contest revolved around a development kit that all teams were given. The kit included an Arduino Uno and a bunch of other electronics gear, which we weren't quite sure how to include in our project. Nonetheless, we went to the introduction meeting where all the teams were supposed to present themselves and their experience and introduce their project before being handed the kit. The rules for the competition did mention something about one of the contestants being allowed to have no prior experience with embedded systems, but having written in the initial e-mail that I was the only one with actual electronics and programming experience, and having heard no shouting or other comments on that part, I hoped they wouldn't mind.
The meeting was set up with a Skype call to 3 other universities with participating teams. I think we were probably team number 7 in line to introduce ourselves, so we had a little time to enjoy the show. This mainly meant realising how much more experience everyone else had, leaving us increasingly uncomfortable about our presentation.
When we finally went in front of the camera, I started by introducing the team and my former experience: that I was a mechanical engineering student but had played around with Arduinos before. However, when the next in line simply said that he was a mech student with no prior embedded-systems experience, followed by two more saying the same, the situation got a little tense. When the last guy on our team proclaimed the exact same lack of experience in embedded systems, the organiser, who was also handing out the dev kits, looked rather confused. Nonetheless, he said that he really hoped some of us had at least a slight idea about what we were doing before handing over the kit. We later heard that many of the other teams watching had been quite amused by the awkward moment.
Eventually everything was OK though, once we spoke to him after the presentation. The competition was about learning electronics and seeing the potential of wearable electronics in particular, so as long as we learned something, he was happy.
That was the signup to eFactor. The finals were on the 5th of April, giving us about 3 months to come up with something to show, on a project based on a concept that none of us had worked with before. Let the fun times begin!
Moving on to the actual mid-term course. As mentioned earlier, the project also included a course in project management. In theory we weren't really supposed to have a team yet, as we were meant to spend the first couple of lessons analysing our Belbin roles and form our teams based on them. However, as we had locked down our team by signing up for eFactor, we were exempt. As it turned out, after analysing our Belbin roles, our team was a completely unintended perfect match. Well done, subconscious! Or luck... or whatever... but the fact was that we had a perfect team, in theory. I should mention that at that time all 5 of us hadn't yet worked together, and the team was initially formed based on social compatibility and friendship, so it seemed it could work 🙂
– I see that this will be quite a long post. However, I quite enjoy writing it, and I don't even know if anyone is going to read it, so I will continue like this. Sorry in advance if you have reached this point but feel super tired of all the insignificant and boring details I am providing and just want to find the results. I will not stop…
Anyway. Done with the Belbin stuff, done with the eFactor signup. Moving on to getting some hands dirty. I must admit that I cannot remember everything that happened, and when and why and so on, but suddenly we had an office. David Johan Christensen happened to be part of a department at DTU that had moved into a new building while waiting for the old one to be renovated. With him as a supervisor, we were allowed to occupy one of the rooms in their old building, as it wouldn't be renovated before summer. We got a huge room with an accompanying soldering station and sound studio. Perfect!
The team had many meetings about how to approach the project, and given my experience from the 3-week course, I started out as project manager. Eventually I would be replaced by someone more suitable for the role, so that everyone would do what they were good at.
Introducing the team:
The team – eFactor team Filosoma
Andreas Körkel: Former carpenter. Good experience in professional life. Great at quality control and getting things finished, as well as talking to people.
Peter Hybertz: Great engineering problem-solver with a very analytical and organised approach.
Lauge Kongstad: Super organised. An overview of what needs to be done when, an interest in project management (guess who got to take the project manager role after me?), and good at a range of engineering subjects.
Frank Olsen: Former blacksmith and great technical engineering problem-solver.
Me: I guess my strength is probably a broad but shallow experience in many subjects, touching on electronics, programming, mechanics, 3D design, professional life/teamwork and idea generation. Also, I seem to have a more cowboy-style, chaotic approach to rapid prototyping and development than the rest of the team. We complement each other very well 🙂
Beyond the mentioned strengths, this obviously isn’t an exhaustive list of talents. Everyone is a super capable engineering student way above my analytical and science skills, so I must say I’m more than happy about the coincidence that brought us together!
Having the funding from Glostrup Hospital at hand, we started ordering stock of braided tubing, nuts, bolts, wood, pneumatic components and other miscellaneous requirements. Furthermore, having wanted to get a compressor and a 3D printer for myself for quite a while, I decided to buy both of them now and dedicate them to the project initially. Justification approved, and I could finally get my hands on both…
Andreas put together a huge wooden armature for testing the different muscle designs and shoulder mechanics, and we got hold of a bunch of sensors from DTU, giving us a nice little test setup.
Wooden armature with mounted pneumatic muscle and sensors
Given the fairly hacked-together armature, it proved impossible to attach the rotation sensor exactly concentric with the shoulder axis. Oldham couplings and a 3D printer to the rescue, and all was good again.
Oldham coupling connecting the shoulder joint with the rotation sensor
With everything finally set up for testing, we started gathering data on a lot of different muscle designs. However, with no emergency relief valve installed yet, we initially had to take several safety measures when opening the valves:
Initial safety measure
However, realising we had a sound studio just next door, we decided to abuse the situation and perform a few explosion tests in there. This gave us a better sense of the maximum pressure and allowed us to control the flow so that it would never go beyond it. Having the sound studio just next door also helped muffle the sound of the compressor a little. Eventually the office looked a little like this:
Despite the late hour, Andreas is not sleeping but rather working on a new muscle prototype. To the left is a makeshift silicone mould made out of cardboard, tape and soap.
Starting out with no knowledge whatsoever of pneumatic systems, putting the first bits together was a bit of a struggle. It seemed that pneumatic automation at DTU was no longer an active topic. Despite the plentiful specialists in analytical pneumatics, we didn't manage to find anyone who could tell us anything about the practical application, and starting from scratch in that field could get expensive very quickly if we had to buy everything. With a bit of searching about, however, we did manage to find an assistant professor, Casper, who had known the previous professor in pneumatics. The professor had retired but left behind a whole cabinet full of pneumatic elements in the custody of Casper. He arranged for us to have access to this cabinet and allowed us to use it freely. This was an amazing help! Despite our complete lack of knowledge, we no longer needed to buy everything simply for the sake of testing, only to realise it didn't work anyway. It was like Lego. We all dug in and started looking for components matching each other and matching the different ends we wanted to connect.
One of many boxes of random pneumatic components
When we eventually found some of the needed pieces of the puzzle, we were able to attach the muscle to the first armature (my first test armature from the 3-week course) and test whether it could lift anything... seems like it worked 🙂
This may look strange and wrong in a few different ways… we know… but it works and that’s just great 🙂
In the masses of pneumatic antiques we also found a few solenoid valves, which got the honour of being our first control valves for the system. Following this little victory and the experience it gave us, we could finally sit down, discuss which other components we needed and order them.
From a previous project of mine I had already played around with an EMG sensor – namely the Advancer Technologies Muscle Sensor V3. During the meeting with Glostrup Hospital I had been asking around for anyone knowledgeable about EMG signals, and found that one of the guys at the meeting had a Ph.D. in how you can use these signals (probably wasn't the official title, but that's what I remember). Once again it was perfect. He told me that EMG would be an obvious choice for controlling our system, and that many other existing exoskeletons were using the technology. With that in mind, controlling our system with such a sensor seemed a solid choice, despite none of us being proficient in the area. It seemed that if we ever got stuck, there would be a huge knowledge base within our reach, where we could find specialists and get help.
Moreover, as I had previously documented how easily I could control a motor by muscle contractions, we should be able to get something out of it.
Putting together the Arduino Uno from our dev kit, the EMG sensor (we were allowed to buy additional sensors for eFactor for a predefined sum) and the solenoid valves, we managed to get our first muscle-controlled muscle to lift a wooden arm:
The theory definitely worked. The EMG signal in this case was still raw input with a simple threshold, allowing us to apply pressure to different degrees. The reason for it relaxing again in between applying pressure is, in this case, leakage. Nonetheless, I felt I had pretty good control of when I crossed the threshold, and to what degree.
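The threshold logic itself is simple. Here is a sketch in Python of the kind of mapping the Arduino was doing, assuming raw 10-bit ADC readings; the constants are illustrative, not our calibrated values:

```python
# Illustrative thresholds for a raw EMG reading in the Arduino's
# 0-1023 ADC range (hypothetical values, not our calibration).
EMG_THRESHOLD = 300   # below this the muscle is considered relaxed
EMG_MAX = 900         # readings near full contraction

def emg_to_pressure_command(raw_emg):
    """Map a raw EMG sample to a pressure command in [0, 1]:
    0 means no pressure applied, 1 means valve fully open."""
    if raw_emg < EMG_THRESHOLD:
        return 0.0
    # scale linearly between threshold and max, clamped at 1
    level = (raw_emg - EMG_THRESHOLD) / (EMG_MAX - EMG_THRESHOLD)
    return min(level, 1.0)
```

In practice the raw signal is noisy, so a real version would smooth the samples (e.g. a moving average of the rectified signal) before thresholding.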
However, the sound of the solenoids was anything but pleasant, and they didn't seem to allow for faster PWM, giving a very uneven flow. After performing this test we started researching other types of solenoids in the hope that there would be a less noisy alternative. It seemed, though, that solenoids could be the wrong way to go, as we found an article on the subject proposing servo-actuated valves instead, for more fluent control (here).
However, heading back to the pneumatics cabinet, now looking for an alternative to the solenoid valves, we discovered a manual Festo front-panel valve. This valve had a little mounting hole for whatever is normally used to actuate it, and a little rod in the middle that you push down to change the state. By design the valve was only meant as 2-way, but we realised that by pushing the rod down halfway, you could close all connections at the same time. This was exactly what we were looking for, and with the mounting hole being fairly easy to measure, we hoped to be able to construct our own mount.
With this all figured out, it was merely a matter of playing around in Solidworks for a bit, an encounter with a 3D printer and poof, a servo-controlled valve was made.
3D-printed servo mount for a Festo push-valve
Figuring out the mapping from servo position to valve states was then a matter of applying pressure, moving the servo slightly while listening, and marking the points where the valve changed from pressure to hold to exhaust.
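Once marked, the calibration reduces to a small lookup from valve state to servo angle. A sketch with hypothetical angles (ours were found by ear, as described, and differed from these):

```python
# Hypothetical servo angles (degrees) for the three valve states found
# during the listen-and-mark calibration; real values differed.
VALVE_ANGLE = {
    "pressure": 30,   # rod fully down: air flows into the muscle
    "hold": 90,       # rod halfway: all connections closed
    "exhaust": 150,   # rod up: muscle vents to the atmosphere
}

def valve_angle(state):
    """Return the servo angle commanding the given valve state."""
    if state not in VALVE_ANGLE:
        raise ValueError(f"unknown valve state: {state!r}")
    return VALVE_ANGLE[state]
```

On the Arduino side, the returned angle would simply be handed to a `Servo.write()` call in the control loop.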
With all the components for the system now made and tested individually, we could finally put it all together and test the EMG control of pressure/hold/exhaust.
Bam, it was actually working fairly smoothly! The next step was to calibrate the EMG signal from the muscle and map it to the appropriate servo/valve positions for the most intuitive control.
With the new and exciting idea at hand, we also suddenly had quite a few assumptions to verify before the project could be assumed to have any sort of potential. This posed a great risk of running into a dead end soon after initiating the project.
In order to lower our risk a little, I decided to do a preliminary study of the concept alone, prior to the actual mid-term project starting. This fitted nicely with the 3-week semester period in January.
As one of the great benefits of DTU, all students are able to walk up to any professor and ask if they might be interested in starting a course teaching a particular topic. These kinds of custom courses can be started by anyone, as long as there is someone who can and will supervise you, and the topic is found to fit into your studies.
By chance I heard about DTU Playware and that they were working with rehabilitation. I got in touch with David Johan Christensen, an associate professor at DTU, formerly at MIT Media Labs, who has worked a lot with robotics. I told him about my idea and he agreed to supervise my project, and eventually also the bigger mid-term project if we wanted. Due to past experience with my supervisor from the 3D-printed animatronic head, David Bue Pedersen, I also asked him if he wanted to supervise, and eventually we ended up with two super cool supervisors, both for my individual 3-week course and for the mid-term team project.
Talking to David Johan Christensen, it turned out that Center for Playware had recently initiated a collaboration with Center for Spinal Cord Injury at Glostrup Hospital, Denmark. They were already using exoskeletons for rehabilitation and it seemed the timing was just perfect.
We arranged that I should prepare a presentation for the hospital for the beginning of my course. On the first day of the 3-week course we visited the Center for Spinal Cord Injury and I presented my idea. They were super excited about it, and suddenly the project had funding!
The 3 weeks progressed super fast. I was busy testing all sorts of things, hacking and having a really good time! Starting out with balloons, caulking guns and bicycle valves:
Nope.. wasn’t a good idea
Testing the straightening-force of a long balloon. None…
Building the valve for easier handling and to free up hands from holding on to everything:
Valve assembly: Bicycle valve, caulking-gun-tip, a balloon and a random piece of rubber tubing
and eventually ending up with braided cable-routing tube surrounding a bicycle tube.
By inflating the inside tube, its diameter expands, pushing out on the braided tube, which contracts in length as its diameter increases. In this way I made a linear pneumatic actuator.
Eventually it turned out that such an actuator already existed and is called a McKibben-type pneumatic artificial muscle. Good thing: the concept was thereby validated, and I continued in that direction.
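For reference, the idealised static force of a McKibben muscle has a well-known closed form (the Chou-Hannaford model): pull force depends on the gauge pressure, the braid diameter when the braid angle would be 90 degrees, and the current braid angle. A sketch, with example numbers that are illustrative and not our prototype's:

```python
import math

def mckibben_force(pressure, d0, theta):
    """Idealised static pull force of a McKibben muscle (Chou-Hannaford model).
    pressure: gauge pressure [Pa]
    d0:       braid diameter at a 90-degree braid angle [m]
    theta:    current braid angle from the muscle's long axis [rad]
    """
    return (math.pi * pressure * d0 ** 2 / 4) * (3 * math.cos(theta) ** 2 - 1)

# e.g. 2 bar through a 2 cm braid at a 30-degree braid angle:
f = mckibben_force(2e5, 0.02, math.radians(30))   # roughly 80 N
```

The model also explains the limit I kept running into: force drops as the muscle contracts (theta grows) and reaches zero at the "magic" braid angle of about 54.7 degrees, so sizing the braid and operating range matters a lot for lifting an arm.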
The main challenge for me was to figure out whether the concept could provide enough lifting power to lift an arm, and how to approach reaching such forces. In the end I found a solution that seemed to work, and I wrote a report on all my findings.
At the time of writing: sadly, but also quite nicely, I cannot publish the report just yet, as we are currently playing around with a potential patent for some of our recent findings. It is still very uncertain whether we will proceed with it or not, but to be safe, I'd rather say too little than too much right now. However, the plan is to wrap up the project with an article, which will hopefully be published in full somewhere. More info will appear on this blog eventually 🙂
Time for some news, and there's plenty of it! This should probably have been posted as many separate posts to follow the progress, but it so happened that I wasn't really good at updating continuously, so here goes the retrospective until I catch up.
During the past 4 months I've been super busy on a super fun project. We are building a soft-robotic system for upper-limb actuation (I don't want to call it an exoskeleton, given that the word implies a rigid outer structure, which this one hasn't got).
Our project is based on an idea I had last year on approaching exoskeletons from a slightly different angle.
To begin from the beginning, here is a little intro:
For Mechanical Engineering students at DTU, the 4th semester is where you have your mid-term project. This is supposed to involve developing something mechanical of your own choosing, covering topics from at least 2 of our mandatory courses. It's furthermore a team effort and involves a course in project management alongside the product development. The project is set at 10 ECTS points, meaning 1/3 of the semester.
Given that I had about a billion different ideas for such projects, I immediately started investigating which ideas would be most interesting.
About the same time it so happened that my grandmother fell and hurt her shoulder quite badly. Due to old age and a damaged rotator cuff, she ended up not being able to lift her left arm more than about 10 degrees from vertical. This meant that she couldn't reach out for her rollator too well, couldn't bring her left hand to her head, and had many other issues that seem insignificant until you're in the situation.
Anyway, I thought I'd take a shot at helping her out by building a wearable system that could help her lift her arm when she wanted to.
The more I thought about it, the more the idea seemed to fit right in with the mid-term project. Eventually I started talking to a few friends from my class about potentially doing it, and in no time we were 5 guys, all super excited about getting started.
In this way I had prepared an idea for the project as well as gathered a team already before Christmas. With the project not starting before February I still had some time to play around with further ideas.
For several weeks I investigated existing exoskeletons, how we could potentially build one ourselves, what it would require and so on. I quickly concluded that existing systems are super expensive, since they require quite sophisticated machinery to augment movement on demand while aiming to be non-invasive. I found that we probably shouldn't aim for a fully-fledged, Iron-Man'ish exoskeleton suit but instead see how we could hack something together, approaching the functionality bottom-up instead. My grandmother didn't require much, and merely allowing her to lift her arm forward to horizontal in an intuitive way would be an enormous help.
With this in mind, and some other arbitrary trail of thought that I cannot remember, I started thinking of the way tubes tend to straighten out when inflated. From that I started wondering if you could strap such a tube to a person and use that straightening force to actuate an arm.
After a bit of back-and-forthing with thought experiments, I concluded that it could potentially work. Consider a long tube strapped to a person: attached above the hip on the back, going up and over the shoulder and down along the upper arm, where it attaches again at the elbow. This gives it a bend over the shoulder, and when you inflate it and it tries to straighten, the only way to do so is by lifting the arm (or by detaching at the back, but that's just a matter of strapping it down more tightly).
Showing my initial idea of strapping a tube down, bending over the shoulder. Inflating the tube should theoretically lift the arm
Using a tube in this way could potentially eliminate a lot of the issues with needing rigid structures to counter reaction-forces from motors pulling or rotating.
I quickly found many other potential benefits of using this method for less demanding actuation tasks: much lower production cost, light weight and flexibility, which would allow it to follow the shape of the body naturally without having to include multiple mechanical joints.
I found that this approach was much more my style, and while it definitely has its drawbacks, I saw great potential in investigating further.
I told the rest of the team about my new idea, and while they had initially imagined a more mechanical exoskeleton, they were very keen on this one as well. The only problem now was committing to a project that rested on some very strong assumptions on many points.
…To be continued in Explorations of Wearable Soft Robotics – 3 Week Course
A few days ago I finally received a muscle sensor which I've been wanting to buy since forever, this one: https://www.sparkfun.com/products/11776. 4 days of waiting seemed like an eternity, but tonight I finally got the chance to play around with it.
What I found most exciting was the question of the signal's reliability: how hard would it be to measure the tension of a muscle, and would I need a whole bunch of filters to get anything sensible out of it? I had no idea what to expect when I bought it, apart from having seen a few examples of its use online.
It turned out to be really quite easy to control. I added a moving-average filter over 10 samples and instantly had a signal I could use to control… in this case, a servo with a crab on top.
I applied the electrodes on the flexor pollicis brevis (the muscle that pulls your thumb towards your pinky), set up a few conditions for the signal on the Arduino and BAM! The crab was dancing 🙂
By pushing my thumb against the other fingers, I tense the muscle, and depending on the amount of tension I can make the servo rotate slower or faster. As for the conditions, I set up a minimum signal threshold after which it starts moving. Each time the signal goes beyond that threshold, the servo moves in one direction, speeding up with the tension. When the signal drops below the threshold, the direction reverses, so the servo goes the other way when I tighten again.
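For anyone wanting to try something similar, here is a rough Python sketch of that logic, rewritten from memory rather than a copy of my Arduino code; the window size matches what I used, but the threshold value and class/function names are placeholders of my own:

```python
from collections import deque

WINDOW = 10        # moving-average window, as on the Arduino
THRESHOLD = 300    # minimum filtered signal before the servo moves (placeholder)

class MuscleServoController:
    """Filters raw EMG readings and turns them into a signed servo speed."""

    def __init__(self):
        self.samples = deque(maxlen=WINDOW)
        self.direction = 1      # +1 or -1, flipped on each new activation
        self.active = False     # True while the filtered signal is above threshold

    def update(self, raw_reading):
        """Feed one raw sensor reading; return a signed servo speed."""
        self.samples.append(raw_reading)
        filtered = sum(self.samples) / len(self.samples)

        if filtered >= THRESHOLD:
            if not self.active:          # rising edge: reverse direction
                self.active = True
                self.direction = -self.direction
            # speed scales with how far the tension is above the threshold
            return self.direction * (filtered - THRESHOLD)

        self.active = False              # relaxed: stop until the next tensing
        return 0
```

On the Arduino the returned speed would then be mapped onto the servo's pulse range; the key trick is just the edge detection, so each new tensing of the muscle toggles the rotation direction.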
The fact that the crab may seem to follow my hand at certain points could be a bit misleading, as I'm merely playing around and aiming it by tensing/relaxing the muscle accordingly. All motion is muscle-controlled 🙂
Oh, and sorry about the sound. I happened to be listening to Radiohead while recording, and YouTube recognized it and claimed it was copyrighted, so they offered a way of removing only the song from the video. It didn't really work, but as long as it keeps my back clear I'm good.
[In case somebody is wondering about the thing attached to the servo, it is a crab made of chestnuts:]
After 3 weeks of intense focus and little sleep, here’s the final product. I’ve called him Carl the animatronic but since he’s not really actuated yet, the ‘anim’ part is left out for now.
I think he ended up looking quite silly, but I guess that could be a good thing 🙂
I didn't manage to add actuators yet, so nothing moves unless you push it. However, I've been asked by my University as well as DTU Fablab (my workplace) to continue working on it during the next semester and possibly longer, so there's a good chance that it'll start moving. Currently the eyes and eyebrows are made so that they can be moved by pulling/pushing some strings or rods, and it should only be a matter of adding the servos, control and connectors. Not right now though. I've finished my 2nd semester at Uni, and it was exactly how I'd hoped it would be. Now celebrations and summer holiday.
For those of you interested, I've uploaded the STL parts at thingiverse.com. It's only the STLs though, as Thingiverse doesn't seem to accept SolidWorks files. If anyone should be interested in getting the SolidWorks work files and assemblies as well, you can get them below… but don't expect clean files.
Given that this was a Uni course I had to write an assignment about it as well. It’s not perfect by any means but it’s there as well. Read it if you please and don’t hold back on comments. I’m here to learn!