“Touch and Activate” enables touch sensing on everyday objects. This paper describes the technique, and this YouTube video explains the approach well.

As a first example of the newly released ml.lib for Max and Puredata (a library for machine learning and gesture recognition), here are some test patches that use ml.svm to reproduce the above paper in our favorite real-time environments. Links to the patches are below.
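The classification step at the heart of the paper is straightforward: the object's acoustic frequency response (a magnitude per swept frequency bin) is treated as a feature vector, and an SVM learns one class per touch pose. In the patches, ml.svm plays that role; as a minimal illustrative sketch (using scikit-learn and synthetic data, not the actual patch or dataset), the idea looks like this:

```python
# Hypothetical sketch of the classification step in Touch and Activate:
# a resonance peak in the object's frequency response shifts depending on
# how the object is touched, and an SVM separates the resulting vectors.
# scikit-learn's SVC stands in for ml.svm here; all data is synthetic.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
N_BINS = 64  # number of frequency bins in the swept response (assumed)

def synth_response(center, n=20):
    """Generate n synthetic responses with a Gaussian resonance peak
    at the given bin, plus a little measurement noise."""
    bins = np.arange(N_BINS)
    base = np.exp(-((bins - center) ** 2) / 20.0)
    return base + 0.05 * rng.normal(size=(n, N_BINS))

# Two touch poses shift the resonance to different bins.
X = np.vstack([synth_response(20), synth_response(40)])
y = np.array([0] * 20 + [1] * 20)

clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict(synth_response(20, n=1)))  # classified as pose 0
```

In the Max/Pd patches the same loop runs in real time: sweep, measure the response, and send the feature vector to ml.svm for training or prediction.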

Touch and Activate for Max and Puredata
