

Super accurate guitar motion capture

17 January 2018

Over the last few months, Synertial finger mo-cap technology has been under pressure to show how far it can push the envelope. The tools to go super accurate now exist, but we had hardly tried them beyond their default settings (or at all, since those defaults already deliver best-in-class motion data). Pushed past the defaults, however, these tools promise far more accurate finger mocap data.

 

We asked Hussain Dickie to be our guitar maestro and help us discover how far we could push the new Cobra gloves and their tools: what is the most efficient pipeline? What are the necessary components? He was visibly excited to be at the heart of an experiment no one had had the tools to try in this new way (using only IMUs), though his patience wore thin as issues slowed things down, most of which he didn’t understand and couldn’t help us with. But first, let’s rehash the tools mentioned above:

 

Synertial’s ‘tools’ for increasing accuracy for Cobra Gloves

Kinexact Hand - a scan of the hand provides the bone LENGTHS (auto + drag'n'drop)

A physical jig - imposes a known structure on the hand skeleton at the instant of calibration:

A hand skeleton file - combines the scanned data (lengths) with the geometry forced by the physical jig (depths and rotations set via the jig’s spirit levels). Synertial publishes skeleton templates (latest skeleton files v8.53) that match Jig#2 (shown above); the designer of Jig#2 used Synertial’s SynDash Skeleton Editor to create the matching skeleton file to be used with Jig#2.
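As an illustration of the idea, here is a minimal Python sketch of merging scanned lengths into a jig-matched template. The JSON layout and field names are hypothetical, invented for this example; the real SynDash skeleton format may differ.

    import json

    def apply_scanned_lengths(template_path, scan_lengths, out_path):
        """Overwrite bone lengths in a jig-matched skeleton template with
        per-finger lengths from a hand scan. The file layout (a JSON dict
        of bone name -> parameters) is hypothetical."""
        with open(template_path) as f:
            skeleton = json.load(f)
        for bone_name, length_mm in scan_lengths.items():
            if bone_name in skeleton["bones"]:
                # Keep the jig-derived depths/rotations; replace only the length.
                skeleton["bones"][bone_name]["length_mm"] = length_mm
        with open(out_path, "w") as f:
            json.dump(skeleton, f, indent=2)

    # Example lengths (mm) as a hand scan might report them (made-up values):
    scan = {"L_index_proximal": 44.1, "L_index_middle": 25.6, "L_index_distal": 22.3}
    # apply_scanned_lengths("jig2_template_v853.json", scan, "hussain_left.json")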

A sync-ready multi-device capturing OS - multiple Cobra devices running and capturing in sync, each individually capable of syncing internally to HTC Vive, or to Vicon or OptiTrack using additional Synertial sync SW & HW. In the image below, three devices ('left hand', 'right hand' and 'guitar body') are being synchronized and globally tracked with one Vive-Puck tracker each.
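Conceptually, without phase-lock hardware, frames from multiple free-running devices have to be paired up in software by timestamp. A minimal Python sketch of that nearest-neighbour pairing, assuming each stream is just a time-sorted list of (timestamp, pose) tuples:

    from bisect import bisect_left

    def nearest_frame(frames, t):
        """Return the (timestamp, pose) frame closest in time to t.
        frames must be sorted by timestamp and non-empty."""
        i = bisect_left([ts for ts, _ in frames], t)
        candidates = frames[max(i - 1, 0):i + 1]
        return min(candidates, key=lambda f: abs(f[0] - t))

    def merge_streams(master, *others):
        """For every frame of the master device (say, the left glove), pick
        the nearest-in-time frame from each other device. Nearest-neighbour
        pairing is the best software alone can do; phase-lock HW removes
        the residual offset at the source."""
        merged = []
        for ts, pose in master:
            row = [pose] + [nearest_frame(s, ts)[1] for s in others]
            merged.append((ts, row))
        return merged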

A bit of background

Planning the first session of Hussain’s guitar-playing mocap, we had a collection of components we were pretty clear about. Soon we realized we needed to plan for sub-mm accuracy capture. That promised to be painful to think through in advance, so we decided to just jump in and see what we could get.

 

What we knew:

1- we needed each hand tracked with its own Vive puck (so 2 independent devices, each with its own independent root 6DoF). We had had this working robustly for a while, but had not pushed its limits

2- we had been getting good results from the physical Jig#2 and matching v8.53 skeleton files

3- we were going to scan Hussain’s hands using Kinexact-Hand, to get his finger lengths, a must

4- the latest versions of the OS, plugins and HW gave us a lot of confidence to go for multiple devices as well as synced streams. It was high time we pressure-tested ‘multiple synced devices’

5- we knew we had the smallest sensors, but how would their casing fare playing classical guitar?

6- the Synertial Phase-Lock Sync HW had already been tested by a top mocap studio in Sweden and had passed with flying colours. But would we need the sync HW if we got zero dropped frames without it?

7- we were going to take the sensors out of their cloth casing and tape them onto the fingers; capturing guitar with the standard cloth glove casing is impossible

8- we were going to capture scenes in Synertial Dashboard and then export BVH or FBX to work inside MotionBuilder, after that see how it worked inside Unity and Unreal, and finally in Siemens PLM SW, Jack and PSH (Product Simulate Human)
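As a quick sanity check on such exports, a few lines of Python can read a standard BVH text file and report its joints and frame rate. This is a generic BVH-format reader, not a Synertial tool:

    def bvh_summary(path):
        """Print joint names, frame count and fps from a BVH file."""
        joints, frames, frame_time = [], 0, 0.0
        with open(path) as f:
            for line in f:
                tok = line.split()
                if tok[:1] in (["ROOT"], ["JOINT"]):
                    joints.append(tok[1])
                elif tok[:1] == ["Frames:"]:
                    frames = int(tok[1])
                elif line.strip().startswith("Frame Time:"):
                    frame_time = float(tok[2])
        fps = 1.0 / frame_time if frame_time else 0.0
        print(f"{len(joints)} joints, {frames} frames at {fps:.1f} fps")
        print(", ".join(joints))

    # bvh_summary("first_session_left_hand.bvh")  # hypothetical file name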

 

We wanted to see where all these resources could take us, and we were curious about their limitations.

What was needed to capture classical guitar? How hi-fi could the RAW DATA be?

First session, First Stab

 

- Got Hussain’s hand scans, ran them through Kinexact-Hand, Synertial’s finger-length extraction tool.

- Set Hussain up with a pair of 16-sensor Cobra gloves, taping each sensor to its respective phalanx using common fabric plasters (fabric allows better stretching). It took 10 minutes per hand to tape him up.

- We attached each Vive-puck tracker to a fingerless Velcro glove worn over the taped sensors

  • Each glove was wired to its own processing hub, its own battery, its own WiFi connection and its own sync HW. The sync HW is powered by the hub it plugs into (no separate battery).

  • We had an independent sound recording (with a manual trigger to start recording), video of Hussain (shot on a Samsung phone to document the event, not intended as a reference camera), and HTC-Vive base stations running SteamVR with 2 puck trackers, one for each hand, streaming into SynDash (Synertial Mocap Dashboard):

  • We imported all our assets into MotionBuilder. After delving into the results with regard to accuracy, and identifying sources of repeated error and data artefacts, we experimented with a quick hand model built from independent meshes that receive color commands, to see what it would look like to change finger color on each note (the mapping idea is sketched below), and we got this:

Click here to download the fbx file (99mb) with above audio, video and 2 x 16-sensor glove data sets (recorded with puck translation on their respective roots).
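The mapping behind that colour effect is simple. Below is a minimal, MotionBuilder-independent Python sketch of it; the note events, finger names and RGB values are invented for illustration:

    # Map each note event to the finger that plays it and emit a colour
    # command that a viewer could apply to that finger's mesh.
    FINGER_COLOURS = {
        "index": (1.0, 0.2, 0.2),
        "middle": (0.2, 1.0, 0.2),
        "ring": (0.2, 0.4, 1.0),
        "pinky": (1.0, 1.0, 0.2),
    }

    def colour_commands(note_events):
        """note_events: iterable of (time_s, note_name, finger) tuples.
        Yields (time_s, note_name, finger, rgb) colour commands."""
        for t, note, finger in note_events:
            yield (t, note, finger, FINGER_COLOURS.get(finger, (0.5, 0.5, 0.5)))

    for cmd in colour_commands([(0.0, "E4", "index"), (0.5, "G4", "ring")]):
        print(cmd)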

 

What we learned in the 1st session

 

1- we definitely need to capture the guitar body with its own Vive-Puck on a 3rd device (like this)

2- we need to place the sensors ‘away’ from the strings when taping them on (to stop ‘snagging’)

3- we need a more rigid coupling system between the Vive-Puck and the back of the palm

4- we were not sure whether the sync system had any noticeable benefit

5- we need a stationary reference camera

6- Jig#2 with its matching skeleton files does make good calibration easier

Second session

 

  • Set Hussain up with a 16-sensor left glove but only a 13-sensor right glove, since he decided the fingers of his picking (right) hand were never forced against their tendons much, making this a good case for a 13-sensor glove

  • We wired a 3rd Synertial device via USB (not WiFi) and a 3rd Vive-Puck, both Velcroed to the guitar

  • Didn’t use sync (phaselock) HW or SW

  • Hussain used Jig#2 comfortably and went through the 2 calibration poses on his own

  • The guitar needs no jig (though it still needs a 2-pose setup); its starting pos/rot can easily be offset

  • Got stationary reference-camera footage from the front

  • Agreed with Hussain on doing short tests and leaving longer pieces for when we get better at it

  • Again we put everything in MotionBuilder to make this (click on the image):

The above video shows a 13-sensor Cobra glove on the right (picking) hand and a 16-sensor Cobra glove on the left (recorded with Vive puck translations on each hand). The data from the 3rd device, sensing the guitar body, was corrupted. The Vive-Puck data (root position of the guitar) was captured fine and saved inside Synertial scene files, but both the BVH and FBX exporters saved ‘wild’, jumpy Vive tracking data (we didn’t review takes on the go). So we got neither the guitar translation nor its rotation.

What we learned from the 2nd session

 

1- We do indeed need a 3rd device, but we want to try it without an IMU: only a Vive-Puck on the guitar body, taking its rotation values

2- We need an exact CG model of the guitar neck and its strings to bring into MoBu, to check finger mocap data accuracy against the strings

3- The data from the 2 gloves drifted back and forth: if you sync the reference camera to data from one hand, the other starts drifting (within about 30 seconds), and vice versa. So the phaselock sync is a must (a sketch of how such an offset can be measured follows after this list).

4- Hussain wants the fingertip sensors on the left middle and ring fingers removed, since those tips are always flexed like crab legs and are therefore predictable and configurable in the SynDash skeleton settings. Even though the second session had practically eliminated snags and string ‘zings’, Hussain wanted fewer sensors on his fingertips to feel freer playing. He wants a 14-sensor left and a 13-sensor right glove for the next session

5- The attachment of the puck to the back of the hand needs work: Velcro alone is too sloppy. We need rigid ‘forms’ under the tracker to reduce wiggling (a source of data artefacts).
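For reference, the offset between two free-running streams like those in point 3 can be estimated by cross-correlating a shared motion signal (for example, the speed of each hand root). A minimal NumPy sketch, assuming both signals are sampled at the same rate and actually move:

    import numpy as np

    def estimate_offset(sig_a, sig_b, fps):
        """Estimate the time offset (seconds) between two capture streams
        by cross-correlating a shared 1-D motion signal. A positive result
        means stream B lags stream A. With free-running clocks this offset
        slowly changes over a take, which is the drift we observed."""
        a = (sig_a - sig_a.mean()) / sig_a.std()
        b = (sig_b - sig_b.mean()) / sig_b.std()
        corr = np.correlate(a, b, mode="full")
        lag = corr.argmax() - (len(b) - 1)
        return lag / fps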

Third session (Feb 2018)

 

We began with the same configuration used in our second session, except for the following:

  • A separate Synertial WiFi device for the guitar replaced the USB device used in the 2nd session.

  • Configuration files were modified to assign a third Vive tracker (AKA puck), attached to the guitar neck.

  • The setup used was similar to our method for tracking a virtual camera: a single puck plus an inertial sensor on the camera body. This method offers an inexpensive yet reliable alternative to other virtual camera systems (a sketch of the pose fusion follows below).
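One plausible reading of that rig, sketched in Python: take the position from the puck and the orientation from the IMU, and compose them into a single 6DoF pose. The fixed puck-to-IMU offset calibration a real rig would need is skipped here:

    import numpy as np

    def fuse_puck_imu(puck_pos, imu_quat):
        """puck_pos: (x, y, z) in metres from the Vive tracker.
        imu_quat: unit quaternion (w, x, y, z) from the inertial sensor.
        Returns a 4x4 rigid transform for the guitar (or camera) body."""
        w, x, y, z = imu_quat
        # Standard quaternion-to-rotation-matrix conversion.
        R = np.array([
            [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
            [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
            [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
        ])
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = puck_pos
        return T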

 

Sync was omitted again, and capture speed was reduced to 60 fps, to validate the need for our phaselock hardware.
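A simple way to check whether a free-running stream drops frames at 60 fps is to scan its timestamps for gaps longer than the nominal frame period. A minimal sketch (the helper and its numbers are ours, not part of SynDash):

    def frame_gaps(timestamps, fps=60.0, tolerance=0.5):
        """Flag late or dropped frames by comparing successive timestamps
        against the nominal frame period. A gap-free stream is necessary,
        but not sufficient, evidence that phaselock HW can be skipped."""
        period = 1.0 / fps
        gaps = []
        for i in range(1, len(timestamps)):
            dt = timestamps[i] - timestamps[i - 1]
            if dt > period * (1 + tolerance):
                gaps.append((i, dt))
        return gaps

    # Example: one dropped frame around index 3 in a 60 fps stream.
    print(frame_gaps([0.0, 1/60, 2/60, 4/60, 5/60]))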

Data was brought in to MotionBuilder to make this video (please click on the image, to play):

Click here to download the guitar finger motion capture fbx file (58 mb) with data, audio, & reference video. Capture hardware featured a 13-sensor right-hand and a 14-sensor left-hand glove, plus an independent third tracking system (Vive tracker plus a single Inertial Measurement Unit). The result provided a 6DoF capture of the guitar body.

Data shown is completely raw and uncleaned.

 

 

What we learned from the third session:

 

  1. Again, phaselock sync is a must.

  2. Use of a separately tracked puck and IMU for the guitar is essential.

  3. The Cobra tracking system for the guitar uses the same pipeline as a virtual camera rig (in MotionBuilder).

  4. The attachment method for fastening the Vive tracker to the hands needs to eliminate tracker wobble; custom-made pads/fasteners are required.

 

Further insights, to be continued...