Muscle Anatomy, Motion Capture & Animation in Sports Science
The project started after I created a rough animation of a man running on a treadmill to display basic muscle anatomy. I made it as an example of a possible educational resource for The Cadarn Learning Portal. The first version was a fairly simple human model running on the spot with no skin, so the basic muscle structure could be seen. I thought this would be a good example of how animation could be used to show a system at work. I had previously considered creating scientific animations showing the structure of an atom or the process of plate tectonics, but in the time I had, the muscle anatomy animation seemed the most manageable.
Many teaching staff attended an early Cadarn Learning Portal open day, including Marco Arkesteijn from the sports science department. Marco happened to see the animation, as it had been edited into our project introduction video, and was very interested in it as a potential lecturing aid. By the time I returned to work the following week, a meeting had already been arranged between Marco, his colleague Daniel Low, Lizi, Mary and myself to discuss the logistics of creating a more detailed and usable version of the animation. The discussion began with lots of talk about time and planning, and I did my usual thing of worrying about technical problems in the animation process before they actually arise. To cut a long story short, the outcome of the meeting was that I would produce a more detailed muscle system highlighting each of the main muscle groups in the legs to show their use during a normal walk cycle.
We then arranged a second meeting at the sports science Carwyn James building to discuss the technical side of the project. They have a motion capture suite there, which allows staff and students to record the motion of someone walking, running or jumping in three-dimensional space and save the data to a computer as moving point coordinates. It works by attaching small reflective balls to a person's clothing at the key moving limbs. The positions of these reflective balls are recorded in three-dimensional space by a number of special cameras situated around the room, and each reflective ball becomes a moving dot in the motion capture software. The subject being recorded has to stay within certain boundaries in order to be captured. Using this method, they can test a person's walking gait and posture and identify any injuries or problems. Marco and Dan had lots of examples of previously recorded motion capture data, so it was a case of finding a suitable export format that would work with the animation software I use, which is Blender. I found the C3D format was easily imported into Blender.
I have used motion capture suites before, and I know the data you get from them can be tricky to work with; there are often glitches in the movement. The data we had was no exception: the points would often jump around randomly in 3D space, which makes it difficult to link them to the virtual skeleton needed to animate the character. An easy workaround was to animate the walking character by tracing the movement of the points, so I only used their positions when they were behaving themselves.
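To give a feel for what that clean-up involves, here is a rough sketch in plain Python (not part of my actual Blender workflow): each marker is just a list of per-frame positions, and a glitch shows up as a frame where the point jumps implausibly far from its neighbours. The marker name and jump threshold below are invented for the example.

```python
# Illustrative sketch: detect frames where a motion-capture marker "jumps"
# an implausible distance between consecutive frames, then patch them by
# blending the surrounding good frames. Threshold is an assumption.

def clean_marker(frames, max_jump=0.15):
    """frames: list of (x, y, z) marker positions, one per capture frame.
    Any frame further than max_jump (metres) from its predecessor is
    replaced with the average of the frames either side of it."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

    cleaned = list(frames)
    for i in range(1, len(cleaned) - 1):
        if dist(cleaned[i], cleaned[i - 1]) > max_jump:
            prev, nxt = cleaned[i - 1], cleaned[i + 1]
            cleaned[i] = tuple((p + n) / 2 for p, n in zip(prev, nxt))
    return cleaned

# A heel marker that glitches wildly on frame 2:
heel = [(0.0, 0.0, 0.10), (0.05, 0.0, 0.12), (3.0, 2.0, 5.0), (0.15, 0.0, 0.12)]
print(clean_marker(heel)[2])  # the glitch is replaced by its neighbours' average
```

In practice I simply ignored the bad frames rather than repairing them, but the idea is the same: trust the points only when they move plausibly.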
Marco described the main muscle groups he wanted displayed and gave me some graphs showing the usage of each muscle group during a typical walk cycle. He also lent me his trusty muscle anatomy book. He pointed out that it was quite old, but luckily muscle anatomy doesn't change, so the book was still up to date. I also found a great online resource at https://human.biodigital.com.
I began modelling the skeleton and muscle groups in Blender by creating a low-resolution model and using subdivision surfaces, which smooth the result. This way you can quickly create believable models, which you can then refine further with the sculpt tool. Once the modelling is complete, you have to attach the model to the virtual skeleton, or 'rig', in order to make it move. This is done through what is called weight mapping, which means making each point of the model move with the underlying skeleton. Each point can be influenced by a combination of bones, so you can blend joints to allow them to deform smoothly; for example, the shoulder joint is influenced by both the upper arm bone and the shoulder bone.
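The weight-blending idea can be sketched in a few lines of plain Python (this is only the concept, not Blender's internals): a vertex follows a weighted mix of its bones' movements, so a point weighted half-and-half between two bones moves halfway with each. The bone names and weights are invented for the example.

```python
# Conceptual sketch of weight mapping: a vertex moves by the weighted
# sum of its bones' offsets this frame. Names/values are illustrative.

def skin_vertex(vertex, bone_offsets, weights):
    """vertex: (x, y, z) rest position.
    bone_offsets: dict of bone name -> (dx, dy, dz) movement this frame.
    weights: dict of bone name -> influence (should sum to 1)."""
    moved = list(vertex)
    for bone, w in weights.items():
        dx, dy, dz = bone_offsets[bone]
        moved[0] += w * dx
        moved[1] += w * dy
        moved[2] += w * dz
    return tuple(moved)

# A shoulder-area vertex influenced half by the upper arm bone and half
# by the shoulder bone, so the joint deforms smoothly between them:
offsets = {"upper_arm": (0.0, 0.0, 0.2), "shoulder": (0.0, 0.0, 0.0)}
print(skin_vertex((1.0, 2.0, 0.0), offsets,
                  {"upper_arm": 0.5, "shoulder": 0.5}))
# -> the vertex moves half the upper arm's distance: (1.0, 2.0, 0.1)
```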
I could have coloured or textured the model at an earlier stage, but Blender is flexible enough to let you modify the model after the rigging stage, so I was tweaking and improving the model at the same time as texturing it. Texturing involves creating UV coordinates, a 2D flattened version of the model, which let you effectively paint colour onto it. This was also where I had to think about animating the colours of the various muscles in accordance with their use during the walk cycle. I built some controllers using the driver functionality in Blender, which lets you create switches or sliders to control any part of the model, in this case the individual muscles. I linked these controllers to a red material node value that increases as the sliders move up, and then linked them to graphs to display a graphical representation of the muscle use.
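Conceptually, what each driver does is simple enough to sketch in plain Python (this is not the bpy driver API, just the mapping it performs): a 0-to-1 activation slider for a muscle feeds both the red channel of its material and the height of its graph bar. The base colour and the 0-to-1 range are assumptions for the example.

```python
# Conceptual sketch of one muscle's driver: a 0-1 activation value drives
# both the material's red channel and a graph bar. Values are illustrative.

def muscle_display(activation, base_color=(0.55, 0.35, 0.35)):
    """Map a 0-1 activation value to an (r, g, b) colour and a bar height."""
    activation = max(0.0, min(1.0, activation))  # clamp, like a slider
    r, g, b = base_color
    red = r + (1.0 - r) * activation  # red rises towards 1.0 with use
    return (red, g, b), activation    # colour, graph bar height

colour, bar = muscle_display(1.0)
print(colour[0], bar)  # a fully active muscle shows full red: 1.0 1.0
```

In Blender itself this same relationship is wired up once per muscle with a driver, so scrubbing the timeline updates every muscle's colour and graph together.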
Once everything was working, with the muscles glowing red in proportion to their use and the skeleton walking, it was a case of overlaying the graph, adding a bit of text and rendering the animation out to create the individual frames to be converted into a video file.
After I sent Marco an initial test animation, he decided it would be much more useful if the animation were slowed down considerably, so that he could talk over it and describe the use of each muscle group in his lectures. We decided on 100 frames per complete walk/gait cycle at 25 frames per second, which slows the motion to roughly one quarter of real-world speed. We also agreed the animation should be shown from four angles: front, back, left and right.
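The timing works out as a quick sanity check (assuming a typical walking gait cycle of roughly one second, which is my assumption here, not a figure from Marco):

```python
# Sanity check on the agreed timing: 100 frames per gait cycle played
# back at 25 fps. Real gait-cycle duration of ~1 second is an assumption.

frames_per_cycle = 100
playback_fps = 25
real_cycle_seconds = 1.0  # assumed typical walking gait cycle

animated_seconds = frames_per_cycle / playback_fps
slowdown = animated_seconds / real_cycle_seconds
print(animated_seconds, slowdown)  # 4.0 seconds per cycle, 4.0x slower
```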
Marco was happy with the finished result and said he would trial it in his lectures. All in all, the project went well and took a total of around two to three weeks to complete.