First test of the installation at beursschouwburg: the streamers are streaming correctly, and after a correction in the viewer to detect "NaN" values in the information received from the streamers, it has been possible to calibrate the 3 kinects quite easily. The calibration is not perfect (based on skeletons, it would be a miracle if it were), but it gives a good approximation of the positions.
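The guard itself is simple; here is a minimal sketch of the kind of check added to the viewer (the structure and function names are assumptions, not the actual viewer code):

    #include <cmath>

    // Hypothetical joint sample as received from a streamer.
    struct JointSample {
        float x, y, z;
    };

    // Reject any sample containing NaN before it reaches calibration or display.
    bool is_valid_sample( const JointSample & j ) {
        return !( std::isnan( j.x ) || std::isnan( j.y ) || std::isnan( j.z ) );
    }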
panel to edit the filtering configuration of the skeletons, same logic as for the calibration matrices:
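The post doesn't say which filter is used; purely as an illustration, the kind of per-joint smoothing such a panel could expose might be a simple exponential filter, with a hypothetical alpha parameter:

    // Illustration only: exponential smoothing of one joint position.
    // alpha close to 1.0 -> almost no filtering, close to 0.0 -> very smooth but laggy.
    struct SmoothedJoint {
        float x = 0, y = 0, z = 0;
        bool initialised = false;
        void update( float nx, float ny, float nz, float alpha ) {
            if ( !initialised ) { x = nx; y = ny; z = nz; initialised = true; return; }
            x = alpha * nx + ( 1.f - alpha ) * x;
            y = alpha * ny + ( 1.f - alpha ) * y;
            z = alpha * nz + ( 1.f - alpha ) * z;
        }
    };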
calibration based on successive positions of the skeleton
The calibration algorithm uses 2 kinects: one is the reference, the other is the one being calibrated. To start the process, one and only one skeleton must be present in each kinect (the same person, you got it). 500 positions of the hip center are stored (roughly 16 seconds of capture). Once all the positions are collected, the calibration can be computed.
It uses 3 passes:
These transformations are stored in a matrix ready to be sent to the streamers.
Why not compute the roll[1]? I tried, but the poor quality of the recording made the calibration worse than without it, so I decided to drop this rotation.
An interesting subtlety is the way the Z axis, and after it the X axis, are processed:
Image legend: there is also a 2D projection of the trajectories on the ground, for better readability.
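To make the idea concrete, here is a minimal sketch of the kind of transform such a calibration could produce, assuming a translation plus a single rotation around the vertical axis estimated from the paired hip-center samples; it is an illustration of the idea, not the actual 3-pass implementation, and axis conventions and names are assumptions:

    #include <cmath>
    #include <vector>

    struct Vec3 { float x, y, z; };

    // Estimate a rotation around the vertical (Y) axis plus a translation that maps
    // the trajectory seen by the kinect being calibrated onto the reference one.
    // Both vectors hold the same number of paired hip-center samples.
    void estimate_yaw_calibration( const std::vector<Vec3> & reference,
                                   const std::vector<Vec3> & other,
                                   float & yaw, Vec3 & translation ) {
        if ( reference.empty() || reference.size() != other.size() ) return;
        // 1) centroid of each trajectory
        Vec3 cr = { 0, 0, 0 }, co = { 0, 0, 0 };
        const float n = float( reference.size() );
        for ( size_t i = 0; i < reference.size(); ++i ) {
            cr.x += reference[i].x / n; cr.y += reference[i].y / n; cr.z += reference[i].z / n;
            co.x += other[i].x / n;     co.y += other[i].y / n;     co.z += other[i].z / n;
        }
        // 2) circular mean of the angle differences, using the projection on the ground plane (XZ)
        float s = 0, c = 0;
        for ( size_t i = 0; i < reference.size(); ++i ) {
            float a = atan2f( reference[i].z - cr.z, reference[i].x - cr.x )
                    - atan2f( other[i].z - co.z,     other[i].x - co.x );
            s += sinf( a ); c += cosf( a );
        }
        yaw = atan2f( s, c );
        // 3) translation that brings the rotated centroid of 'other' onto the reference centroid
        float rox = cosf( yaw ) * co.x - sinf( yaw ) * co.z;
        float roz = sinf( yaw ) * co.x + cosf( yaw ) * co.z;
        translation = { cr.x - rox, cr.y - co.y, cr.z - roz };
    }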
Researching the process has been fast thanks to raw files. In the viewer application, a raw "writer" has been added, which stores each new pair of positions during the calibration data retrieval. To develop the algorithm, I started a new application that loads these positions via a raw "reader". It has been much easier to develop the algorithm on stored data instead of having to stop the viewer, change the algorithm, restart the app, wait for the kinects to connect, set the roles and move in front of the kinects to generate new data.
See SkCalib folder in src for the code.
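As an illustration of this raw write/read workflow (not the actual SkCalib code; file layout and names are assumptions), storing and reloading the position pairs could be as simple as:

    #include <cstdio>
    #include <vector>

    struct Vec3 { float x, y, z; };

    // Append one pair of positions (reference kinect, kinect to calibrate) to a raw binary file.
    void write_pair( FILE * f, const Vec3 & ref, const Vec3 & other ) {
        fwrite( &ref,   sizeof( Vec3 ), 1, f );
        fwrite( &other, sizeof( Vec3 ), 1, f );
    }

    // Load all recorded pairs back, e.g. in a standalone app used to iterate on the algorithm.
    void read_pairs( const char * path, std::vector<Vec3> & refs, std::vector<Vec3> & others ) {
        FILE * f = fopen( path, "rb" );
        if ( !f ) return;
        Vec3 a, b;
        while ( fread( &a, sizeof( Vec3 ), 1, f ) == 1 &&
                fread( &b, sizeof( Vec3 ), 1, f ) == 1 ) {
            refs.push_back( a );
            others.push_back( b );
        }
        fclose( f );
    }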
Working on a visualisation / calibration software, called NetworkViewer in the repo. A first attempt at "auto calibration" gives decent results. The orientations are not processed yet; I will work on this tomorrow.
This software is a visualiser of the streams. It displays the messages processed by the "streamer", the software that maps kinect skeletons onto an avatar (see http://vimeo.com/242486050). In this example, the 2 kinects largely overlap each other. A first attempt to calibrate them, based only on positions, gives decent results.
back to my desk! The simultaneous connection of 2 kinects is conclusive:
Work on a calibration software for the different points of view has started:
first attempts at mapping to various skeletons - the hips and the "spine" cause problems in most models...
computing the relative orientations to send to blender has been just as tedious as the rest, but took less time. For those who like black magic, it took about 5 hours of head-scratching to write these 2 lines of code:
// bone with a parent: compose the kinect absolute matrices with the model's delta matrix
if ( parent != NUI_SKELETON_POSITION_COUNT ) {
    m = absolute_matrices[i] * absolute_matrices_i[parent] * model.delta_matrices_i[i];
} else {
    // root bone (no parent): compose directly with the model's absolute matrix
    m = absolute_matrices[i] * model.absolute_matrices_i[i];
}
The rest of the modifications are about optimisation and dataviz, like the grid of 6 skeletons under the model.
the alignment of the kinect skeleton with the armature exported from blender is now automatic, no more need to tune the matrices by hand
importing the matrices of the blender armature bones into the application. The orientations are no longer computed by blender, which was causing big problems with the spin (rotations around the axis aligned with the bone direction).
first success: the relative orientations are computed correctly and sent to blender via OSC[2]
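The app relies on ofxKinectNui, an openFrameworks addon, so the ofxOsc addon is a plausible way to send these messages; the sketch below is an illustration only, with a hypothetical address pattern, port and quaternion packing:

    #include "ofxOsc.h"

    // Illustration only: push one bone's relative orientation (as a quaternion) to blender over OSC.
    void send_bone_orientation( ofxOscSender & sender, const std::string & bone,
                                float qx, float qy, float qz, float qw ) {
        ofxOscMessage msg;
        msg.setAddress( "/skeleton/bone/" + bone ); // hypothetical address pattern
        msg.addFloatArg( qx );
        msg.addFloatArg( qy );
        msg.addFloatArg( qz );
        msg.addFloatArg( qw );
        sender.sendMessage( msg );
    }

    // setup, somewhere in the app:
    //   ofxOscSender sender;
    //   sender.setup( "localhost", 9000 ); // hypothetical host & port for the blender OSC receiver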
code archeology, re-enabling ofxKinectNui[3]