A guide to how BoB can enhance your practice.
BoB Biomechanics was founded in 2016 by Dr. Barbara May and Dr. James Shippen to help researchers, practitioners, product designers, and anyone else who needs to understand the mechanisms of human motion and visualize human movement. Building on our 2022 webinar with BoB, here is an introduction to how this partner integration can complement your biomechanics practice.
BoB (short for ‘Biomechanics of Bodies’) is a biomechanical modeling software package, and at its heart is a human musculoskeletal model. By default, the skeleton consists of 36 rigid segments connected by 34 joints that represent their anatomical counterparts. The skeleton can easily be scaled based on overall height and/or overall mass, or the dimensions of the individual segments can be edited directly, as can their inertia tensors.
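The scaling described above is essentially geometric. As a minimal sketch (the reference dimensions and segment values below are illustrative assumptions, not BoB's actual defaults), segment lengths scale with the subject's height ratio, masses with the mass ratio, and inertia tensors with mass times length squared:

```python
# Hypothetical reference-model dimensions (illustrative, not BoB's values).
REFERENCE_HEIGHT_M = 1.75
REFERENCE_MASS_KG = 75.0

def scale_segment(length_m, mass_kg, inertia, subject_height_m, subject_mass_kg):
    """Scale one rigid segment from the reference model to a subject.

    Lengths scale with the height ratio, masses with the mass ratio,
    and inertia tensor entries with mass * length^2.
    """
    s_h = subject_height_m / REFERENCE_HEIGHT_M   # length scale factor
    s_m = subject_mass_kg / REFERENCE_MASS_KG     # mass scale factor
    scaled_inertia = [[e * s_m * s_h ** 2 for e in row] for row in inertia]
    return length_m * s_h, mass_kg * s_m, scaled_inertia

# Example: scale a thigh-like segment to a 1.85 m, 80 kg subject.
length, mass, inertia = scale_segment(
    0.45, 10.0,
    [[0.15, 0.0, 0.0], [0.0, 0.15, 0.0], [0.0, 0.0, 0.03]],
    1.85, 80.0)
```

Editing a segment individually, as BoB allows, simply bypasses this uniform scaling for that segment.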
BoB also contains a muscle model. By default, there are over 600 muscle units within the model, each based on Hill’s three-element muscle model. Wrapping is included so that the muscles fold around the underlying skeletal structure, and one muscle can even wrap around another muscle deeper within the model.
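A Hill-type muscle of this kind can be sketched in a few lines. The curve shapes and constants below are illustrative assumptions, not BoB's implementation: an active contractile element scales the maximum isometric force by activation, a force-length curve and a force-velocity curve, while a parallel-elastic element adds passive force beyond the optimal fibre length.

```python
import math

def hill_muscle_force(activation, f_max, norm_length, norm_velocity):
    """Return muscle force [N] for activation in [0, 1].

    norm_length is fibre length / optimal fibre length; norm_velocity is
    shortening velocity / maximum shortening velocity (positive = shortening).
    Curve shapes and constants are illustrative.
    """
    # Active force-length curve: Gaussian centred on the optimal length.
    f_l = math.exp(-((norm_length - 1.0) ** 2) / 0.45)
    # Hill's hyperbolic force-velocity curve for shortening; lengthening
    # is crudely clamped to 1.0 here for simplicity.
    if norm_velocity >= 0.0:
        f_v = (1.0 - norm_velocity) / (1.0 + norm_velocity / 0.25)
    else:
        f_v = 1.0
    # Passive parallel-elastic element engages beyond the optimal length.
    f_p = max(0.0, 2.0 * (norm_length - 1.0))
    return f_max * (activation * f_l * f_v + f_p)

# Fully activated, isometric, at optimal length: force equals f_max.
peak = hill_muscle_force(1.0, 1000.0, 1.0, 0.0)
```

Shortening at any speed produces less force than the isometric case, which is the behaviour the force-velocity curve captures.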
The muscles are easily edited with the muscle editor embedded within BoB, so muscles can be added, modified and deleted. The muscles’ insertions, origins and wrapping points can also be modified to suit the user’s requirements.
BoB can read motion data from multiple sources, such as from a simple look-up table of joint articulations, but the best results are generated using a motion capture system.
GETTING THE BEST RESULTS FROM BoB
While a look-up table is useful for very simple examples, it’s fairly limited. What we tend to use is motion capture, and of course, the best is Vicon.
BoB can take optical tracking data and is configured by default to use the Vicon Plug-in Gait marker set. This can be either the full-body set or a reduced version that omits the markers on the thigh, shin, upper arm and forearm; instead, the kinematics of the joints themselves are used to calculate the joint axes, which makes the marker set very quick and easy to apply.
BoB can also be driven by IMU data from Vicon’s Blue Trident, using a very robust and easy interface. This, of course, gives users all of the obvious advantages gained by using IMUs, such as easily capturing data out in the field, tracking larger groups, or analyzing subjects moving around a large area.
Another set of inputs into BoB are the external forces acting on the body. If you are working in an environment with force plates, BoB can take in the force plate data and apply it to the body; likewise, if you have force transducers in the laboratory, BoB can apply those measured forces to the model. If force plates are not available, BoB can estimate ground reaction forces by considering the acceleration of the center of mass and apportioning the forces to the feet in a way that is consistent with angular momentum considerations.
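The first step of that estimate is just Newton's second law: the total ground reaction force is the body mass times the center-of-mass acceleration, minus gravity. The sketch below shows only this total-force step (the apportioning between the feet is not shown, and the function name is our own, not BoB's API):

```python
# Gravity vector [m/s^2], z axis pointing up.
G = (0.0, 0.0, -9.81)

def ground_reaction_force(body_mass_kg, com_accel):
    """Total external force the ground must supply: F_gr = m * (a_com - g)."""
    return tuple(body_mass_kg * (a - g) for a, g in zip(com_accel, G))

# Standing still (zero CoM acceleration): the GRF is body weight, upward.
standing = ground_reaction_force(70.0, (0.0, 0.0, 0.0))
# z component is approximately 686.7 N for a 70 kg subject.
```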
BoB can also take video input, so you can synchronize the video to the motion capture data. This tends to be done almost as an aide-mémoire rather than an actual analysis tool, but it’s very useful for checking details of the study, such as whether a subject was wearing shoes.
Once all of these inputs have been supplied, BoB can run in inverse dynamics mode to calculate the torques at the joints and the forces of constraint across them. Because many different muscle combinations can produce the same joint torques, BoB uses a built-in optimization routine with a cost function (by default the sum of the squares of the muscle activations, though this can be changed by the user); minimizing that cost function yields the muscle force distribution across the body. The forces within the muscles are displayed color-coded, so the redder a muscle, the harder it is working, all rendered with BoB’s sophisticated graphics engine.
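That optimization step can be illustrated with a toy, single-joint version. The function and numbers below are illustrative assumptions, not BoB's solver: for one joint torque shared by redundant muscles, minimizing the sum of squared activations subject to the torque balance has a closed-form Lagrange-multiplier solution.

```python
def distribute_torque(torque, moment_arms, max_forces):
    """Activations minimizing sum(a_i^2) subject to sum(r_i * F_i * a_i) = torque.

    The Lagrange conditions give a_i proportional to the muscle's torque
    capacity c_i = r_i * F_i, so a_i = torque * c_i / sum(c_j^2).
    """
    c = [r * f for r, f in zip(moment_arms, max_forces)]  # torque capacities [N*m]
    denom = sum(ci * ci for ci in c)
    return [torque * ci / denom for ci in c]

# Two elbow flexors (hypothetical moment arms and strengths)
# sharing a 30 N*m flexion torque.
acts = distribute_torque(30.0, moment_arms=[0.04, 0.03], max_forces=[1000.0, 600.0])
forces = [a * f for a, f in zip(acts, [1000.0, 600.0])]
```

A full solver such as BoB's must also enforce the physiological bounds 0 ≤ a ≤ 1 on every activation, which this unconstrained closed form ignores.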
The simplest use for BoB is to aid the visualization and understanding of the motion that we’re capturing. This can be done in a number of ways, for example, by showing the trajectory taken by any location on the body.
BoB can also show snapshots of the subject at fixed intervals through the motion. We can display the range of motion of a joint, with color-coded shading that shows the movement area of the joint in both space and time. BoB can also display the velocity vector of any point on the body, and the angular velocity and angular acceleration of any segment of the body.
While it’s common to display a subject’s whole body, BoB can also show a subset of the joints, the muscles and the segments within the body, as required by the user.
BoB will also calculate the joint torques that correspond to the observed motion and external forces.
BoB can calculate the muscle forces across the body. Once the muscle forces and the forces of constraint within a joint are known, the joint contact force can be calculated. All of the data acquired from BoB can be output as tabulated data, plotted against time, or plotted against any other variable to produce a phase plot.
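The joint contact force calculation is, at its core, a vector sum. As a hedged sketch (our own function and example values, not BoB's API): the contact force is the intersegmental constraint force plus the pull of every muscle spanning the joint, which is why joint contact forces usually far exceed the external load alone.

```python
def joint_contact_force(constraint_force, muscle_forces):
    """Sum the intersegmental constraint force and each muscle force (3-D tuples)."""
    fx, fy, fz = constraint_force
    for mx, my, mz in muscle_forces:
        fx, fy, fz = fx + mx, fy + my, fz + mz
    return (fx, fy, fz)

# Hypothetical knee during stance: a 700 N intersegmental force plus two
# muscle pulls acting along the same axis compress the joint to 2700 N.
contact = joint_contact_force((0.0, 0.0, 700.0),
                              [(0.0, 0.0, 1200.0), (0.0, 0.0, 800.0)])
```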
BoB can also display multiple subjects, and places no built-in limit on the number of subjects.
BoB comes in four variants: BoB/Research is the core biomechanical analysis engine; BoB/EMG has all of the capability of BoB/Research plus tools for displaying and analyzing EMG signals; BoB/Ergo contains tools for the ergonomic analyst; and BoB/Teaching is specifically designed for teaching biomechanics at the undergraduate and Master’s levels.
BoB has been used across a range of projects and sectors including vehicle design; product design; emergency service protocol development; sports optimization; dance injury reduction and even to analyze the performance of a Formula One pit crew.
Working in conjunction with motion capture technology, BoB is a powerful tool for interpreting and understanding movement.
For more information, please visit