Motion capture and the coexistence of robots and humans

The rise of automation has made robots a common sight in industrial workplaces, where they have traditionally operated in caged-off environments, away from humans. More recently, however, we've been witnessing the rise of a new generation of collaborative robots (a.k.a. cobots) that are changing how sectors such as components manufacturing, scientific benchwork and light product manufacturing operate. These cobots have been unleashed from their cages and now perform tasks alongside real people.

For obvious safety reasons, early cobots were small. They used force sensors, machine vision and other sensing technologies to avoid striking humans, and could be shut off easily in the event of an accidental collision.

Today, the collaborative automation sector is focussing on expanding into high-speed and heavier-duty (and therefore potentially more dangerous) manufacturing, logistics and material handling roles. This move only amplifies the need for safe interaction between these robots and humans.


The drone challenge

Drones are a particular kind of cobot, and they represent a fantastic opportunity to turn the empty airspace within factories and logistics warehouses into productive areas where packages and products can be picked and transported. Brands like Amazon have already talked publicly about drones as a potential path to serving the ever-growing e-commerce market, with its increasingly small-scale shipments and shorter delivery deadlines, in a cost-effective and sustainable way.

But drones can obviously be dangerous. A collision between a drone and a human warehouse worker, or a parcel dropped on someone's head by a faulty drone, could easily be fatal. Many organisations and research institutions are therefore exploring new and innovative ways to help drones recognise and avoid obstacles, particularly the quick and sometimes erratic movements of people. The safe coexistence of drones and robots with people in manufacturing and warehouse environments has become a significant engineering challenge.

Technische Universität (TU) Dortmund, one of the leading technical universities in Europe, is heavily involved in helping to solve this challenge with its research focused on human activity recognition and ergonomics in logistics and manufacturing environments.

To deliver the visual and data evidence that is vital to guide the team’s insights into human activity and drone tracking, TU Dortmund is using cutting-edge motion capture at its InnovationLab.


Motion capture set up

A major project for the InnovationLab is to improve robot control by tracking drones, analysing the interaction between humans and machines and helping the drones to recognise and avoid human obstacles. In one of TU Dortmund's stand-out demos, a person in a motion capture suit walks through a swarm of 12 autonomously flying drones, showcasing the drones' ability to react almost instantaneously, avoiding contact and keeping a minimum prescribed distance. Running at up to 300 fps, the motion capture system processes high volumes of data with the precision and response times the demo demands.
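The distance-keeping behaviour described above can be sketched as a simple potential-field rule: each drone follows its nominal velocity command until the tracked person comes within the prescribed separation, at which point a repulsive term pushes it away. This is a minimal illustrative sketch, not TU Dortmund's actual controller; the function names, gain and 1.5 m threshold are assumptions.

```python
import numpy as np

MIN_DIST = 1.5  # assumed minimum prescribed separation, in metres

def avoidance_velocity(drone_pos, human_pos, nominal_vel, gain=2.0):
    """Blend a nominal velocity command with a repulsive term that
    pushes the drone away whenever the tracked human is closer than
    MIN_DIST (a potential-field sketch, not the lab's real system)."""
    offset = drone_pos - human_pos
    dist = np.linalg.norm(offset)
    if dist >= MIN_DIST or dist == 0.0:
        return nominal_vel  # far enough away: fly as planned
    # Repulsion grows linearly as the human closes in on the drone.
    repulsion = gain * (MIN_DIST - dist) * (offset / dist)
    return nominal_vel + repulsion
```

In a real system the human's position would come from the motion capture stream at each frame, and the blended command would be sent to the drone's flight controller.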

The information obtained serves as a reference system for the safe use of drones and robots in a warehouse environment: it is used to check that the drones' onboard sensors are working correctly, to improve positional accuracy and to support decision-making.
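Using mocap as a reference system typically means comparing a drone's onboard position estimates against the ground-truth trajectory and summarising the discrepancy. The sketch below shows one hedged way this check could look; the function name and error metrics are illustrative assumptions, not the lab's published method.

```python
import numpy as np

def sensor_error_stats(mocap_positions, onboard_positions):
    """Compare onboard position estimates (N x 3 array) against
    motion-capture ground truth (N x 3 array) and summarise the
    per-frame Euclidean error in metres."""
    errors = np.linalg.norm(mocap_positions - onboard_positions, axis=1)
    return {
        "mean_error_m": float(errors.mean()),  # typical drift
        "max_error_m": float(errors.max()),    # worst-case excursion
    }
```

A persistent or growing error would indicate that an onboard sensor needs recalibration before the drone can be trusted around people.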

The lab also employs machine learning technology so that the drones can recognise variations, such as people carrying boxes. This enables the development of safety protocols and creates flexibility, so in the future the drones won’t need a fixed infrastructure to navigate an environment.

The InnovationLab is also being used to develop an algorithm that recognises activities in inertial measurement unit (IMU) sensor data. To train the algorithm, the optical motion capture system visualises the human skeleton with Plug-in Gait (PiG) and synchronises it with video data from cameras, so that manual warehouse tasks can be captured and perfectly labelled. The trained architecture can then be transferred to IMU data, which is comparatively hard to annotate.
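A common preprocessing step in this kind of activity recognition pipeline is to cut the synchronised, frame-labelled sensor stream into fixed-length windows, each assigned the majority label of its frames. The sketch below illustrates that step only; the window length, step size and function name are assumptions, not TU Dortmund's actual settings.

```python
import numpy as np

def sliding_windows(signal, labels, win=100, step=50):
    """Cut a multichannel sensor stream (frames x channels) into
    fixed-length windows and give each window the majority label of
    its frames, yielding training examples for a classifier."""
    X, y = [], []
    for start in range(0, len(signal) - win + 1, step):
        X.append(signal[start:start + win])
        # Majority vote over the window's frame-level labels.
        vals, counts = np.unique(labels[start:start + win], return_counts=True)
        y.append(vals[np.argmax(counts)])
    return np.stack(X), np.array(y)
```

Because the mocap-derived labels are frame-accurate, windows built this way are cleanly labelled; the same windowing is then applied to the raw IMU stream when the trained model is deployed.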

The outputs are used to identify more efficient ways of working, improve workstation placement and avoid repetitive strain injuries, informing future research on workplace ergonomics. Activities examined range from lifting techniques to pushing a cart and order picking. Ultimately, the data will improve the ergonomics of these activities and, therefore, warehouse and production efficiency, with applications extending to more efficient tool usage as well as human-robot interaction.


What next?

TU Dortmund’s work with cutting-edge motion capture technology has far-reaching implications and benefits for the coexistence of robots and humans in the manufacturing and logistics industries. And, in true open-source spirit, the team plans to curate and share its data so that other researchers can access and build on the findings. Cobots are definitely here to stay.