Which parts have you developed yourself?
We did all the programming ourselves, from the video acquisition to the 3D display and sound playback. We used several great libraries such as DirectShow, wxWidgets, Ogre3d, and Fmod for the sounds. The distortions of the camera lens are computed using the Camera Calibration Toolbox for Matlab.
More precisely, the app is composed of several sub-programs:
- the FrozenCameleon software, which can detect the colors under any lighting condition
- a 2D and 3D positioning library: changes of basis, operations in 2D and 3D, matrix processing, etc.
- a library that abstracts the operating system for threads, timers, events, etc. (we initially wanted to port the app to Linux)
- a library to detect an object in the video. An object is defined by geometrical properties, shapes, and colors. This library also handles the camera and the lens (and it takes care of the geometrical distortions), etc.
- a library for the 3D display that communicates with the positioning library, so we can update the screen in real time with the stick positions; it handles viewport creation, the drum elements, etc. When the player moves the drum elements on the screen, the new positions are sent back to the positioning library.
- a sound module that communicates with the main library and receives events telling it when to play a given sound. It can also record beats and give marks when the player plays along with a given pre-recorded rhythm.