Portable solutions for human arm pose tracking

Master internship at AUCTUS team

  • Position type: Internship
  • Duration: March - August 2022
  • Status: inactive

The AUCTUS team focuses on developing robot control and analysis techniques suitable for physical human-robot interaction, taking into account the true limitations and capabilities of both the robot and its human counterpart. In order to gather real-time knowledge about the human’s capabilities, it is necessary to measure their posture (joint angles, positions, …) in real time as well. More specifically, in many cases we are only interested in acquiring the posture of a specific part of the human body, for example the upper body or one arm.

The following video shows an example of a human-robot collaborative carrying scenario, in which the human and the robot jointly carry a 7 kg object. The robot is controlled in accordance with the true capacity of the human operator, which is calculated from the human posture tracked in real time with the OptiTrack motion capture system.

However, standard laboratory solutions for human motion capture (OptiTrack, Vicon, …) are very precise but require specialised environments and are often impractical. The goal of this internship is therefore to develop an easily portable, and possibly wearable, approach for real-time tracking of the human arm pose.

There are two main directions we are interested in exploring:

1. Skeletal tracking approach

Many of the recent deep-learning, vision-based skeletal tracking solutions (RGB: OpenPose, lightpose, …; RGB-D: Nuitrack, …) have been shown to achieve very good results and to be relatively robust in challenging environments; however, they often require a view of the complete human body (or at least a large part of it) in order to work properly. To improve their spatial and temporal consistency for human arm tracking, we are seeking to combine these approaches with a human arm kinematic model, based on standard 4 to 7 degree-of-freedom models.
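
As an illustration of what such a combination could look like, the sketch below fits a simple 4-DoF shoulder-elbow model to the shoulder, elbow and wrist keypoints returned by a tracker through numerical least squares. The segment lengths, joint conventions and keypoint values are illustrative assumptions, not the team’s actual model.

```python
# Minimal sketch: fit a 4-DoF arm model (3 shoulder rotations + elbow flexion)
# to 3D keypoints from a skeletal tracker. Segment lengths and joint
# conventions are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares

UPPER_ARM = 0.30  # assumed upper-arm length [m]
FOREARM = 0.25    # assumed forearm length [m]

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def forward_kinematics(q, shoulder):
    """Elbow and wrist positions for joint angles q = [q1, q2, q3, q4]."""
    r_shoulder = rot_z(q[0]) @ rot_y(q[1]) @ rot_x(q[2])      # 3-DoF shoulder
    elbow = shoulder + r_shoulder @ np.array([0.0, 0.0, -UPPER_ARM])
    r_elbow = r_shoulder @ rot_y(q[3])                        # 1-DoF elbow flexion
    wrist = elbow + r_elbow @ np.array([0.0, 0.0, -FOREARM])
    return elbow, wrist

def residuals(q, shoulder, elbow_meas, wrist_meas):
    elbow, wrist = forward_kinematics(q, shoulder)
    return np.concatenate([elbow - elbow_meas, wrist - wrist_meas])

# Keypoints measured by the tracker (placeholder values).
shoulder = np.array([0.0, 0.0, 1.4])
elbow_meas = np.array([0.05, 0.0, 1.12])
wrist_meas = np.array([0.25, 0.05, 1.05])

# Fit the joint angles; warm-starting with the previous frame's solution is
# one way to obtain the temporal consistency mentioned above.
sol = least_squares(residuals, x0=np.zeros(4),
                    args=(shoulder, elbow_meas, wrist_meas))
print("estimated joint angles [rad]:", sol.x)
```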

2. Wearable approach

A wearable approach to human arm tracking is promising, as it requires less calibration effort and imposes fewer environmental constraints. The work in this direction would consist of exploring, testing and integrating different wearable solutions, such as:

  1. a wearable wide-angle camera coupled with marker bracelets (colored markers or QR codes)
  2. inertial measurement units (IMUs) for posture tracking (see the sketch below)
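
As an illustration of the IMU direction, the sketch below recovers the elbow angle from the relative orientation of an upper-arm IMU and a forearm IMU, assuming each sensor (or its onboard filter) already provides an orientation quaternion in a common frame; the variable names and readings are placeholders, not a specific product’s API.

```python
# Minimal sketch: elbow angle from two arm-mounted IMUs that output
# orientation quaternions (w, x, y, z) in a common reference frame.
import numpy as np

def quat_conjugate(q):
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_multiply(q1, q2):
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def relative_rotation_angle(q_upper_arm, q_forearm):
    """Rotation of the forearm relative to the upper arm (approximately the
    elbow flexion when the motion is mostly about the elbow hinge axis)."""
    q_rel = quat_multiply(quat_conjugate(q_upper_arm), q_forearm)
    return 2.0 * np.arccos(np.clip(abs(q_rel[0]), -1.0, 1.0))

# Placeholder readings: forearm rotated 90 degrees with respect to the upper arm.
q_upper = np.array([1.0, 0.0, 0.0, 0.0])
q_fore = np.array([np.cos(np.pi / 4), 0.0, np.sin(np.pi / 4), 0.0])
print("elbow angle [rad]:", relative_rotation_angle(q_upper, q_fore))
```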

Potentially, based on the results of (1) and (2), a fusion of the skeletal and wearable approaches could be envisaged, as proposed in the recent work of Mallat, Bonnet et al.
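
As a rough, single-joint illustration of what such a fusion could look like, the sketch below blends a high-rate, drift-prone IMU estimate with a slower absolute vision-based estimate using a simple complementary filter; the gains and rates are illustrative and are not taken from the cited work.

```python
# Hedged sketch of one possible vision/IMU fusion for a single joint angle:
# integrate the gyroscope at high rate and correct the drift whenever an
# absolute vision-based estimate arrives.
class ComplementaryJointFilter:
    def __init__(self, alpha=0.98):
        self.alpha = alpha   # trust in the IMU between vision updates (illustrative)
        self.angle = 0.0     # fused joint angle [rad]

    def update_imu(self, angular_velocity, dt):
        # Integrate the gyroscope about the joint axis (drifts over time).
        self.angle += angular_velocity * dt

    def update_vision(self, angle_vision):
        # Pull the estimate towards the absolute, drift-free vision value.
        self.angle = self.alpha * self.angle + (1.0 - self.alpha) * angle_vision
        return self.angle

# Example use: 100 Hz gyro samples, occasional vision corrections.
fused = ComplementaryJointFilter()
fused.update_imu(angular_velocity=0.5, dt=0.01)
print("fused elbow angle [rad]:", fused.update_vision(angle_vision=0.004))
```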

Mission:

  • State of the art
    • An overview of the available solutions
    • Development of a minimal experimental setup for testing them
    • Benchmarking of the approaches against the laboratory equipment (OptiTrack system)
  • Development of the necessary software for real-time acquisition
  • Integration of the software stack with the Robot Operating System (ROS); a minimal publisher sketch follows this list
  • Validation in the context of human robot collaboration scenarios
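
For the ROS integration step, a minimal publisher could look like the sketch below (ROS 1 / rospy); the node, topic and joint names are illustrative, and the tracking code developed during the internship would fill in the joint angles.

```python
# Minimal sketch of the ROS side: publish the tracked arm pose as a
# sensor_msgs/JointState message. Node, topic and joint names are illustrative.
import rospy
from sensor_msgs.msg import JointState

def get_arm_joint_angles():
    # Placeholder: in practice these values would come from the skeletal or
    # IMU-based tracker described above.
    return [0.0, 0.0, 0.0, 0.0]

def publish_arm_state():
    rospy.init_node("arm_pose_tracker")
    pub = rospy.Publisher("arm_joint_states", JointState, queue_size=10)
    rate = rospy.Rate(50)  # assumed 50 Hz acquisition rate
    while not rospy.is_shutdown():
        msg = JointState()
        msg.header.stamp = rospy.Time.now()
        msg.name = ["shoulder_1", "shoulder_2", "shoulder_3", "elbow"]
        msg.position = get_arm_joint_angles()
        pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    publish_arm_state()
```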

Requirements

  • student in a Robotics, Control or Signal/Image Processing Master’s programme
  • good knowledge of Computer Vision, Robot kinematics and sensor fusion
  • demonstrable experience with C++/Linux/Python
  • good analytical skills & critical thinking
  • good team & communication skills

Not required but beneficial: hands-on experience with robotic platforms, robotic software frameworks, ROS, sensors, microcontrollers, IMUs

Applications

Interested candidates should submit the following by email to antun.skuric@inria.fr before 30th January 2022:

  1. Curriculum Vitae
  2. One-page summary of research background and interests
  3. Previous student projects demonstrating expertise in one or more of the areas mentioned above (optional)

Supervision

The intern will be supervised by:

Location

The selected candidate will work at the AUCTUS team’s collaborative robotics laboratory, located in the facilities of the École Nationale Supérieure de Cognitique (ENSC) in Bordeaux. The internship should ideally last 6 months, starting in late February or early March.