
Gesture recognition, then angle detection

Last revision: 2023.10.22

Project Overview: Machine Learning Integration with Processing, Wekinator, and VisionOSC

The primary goal of this project is to achieve hand gesture recognition, followed by precise angle detection. 

Understanding the Objectives:

  • Hand Gesture Recognition: The first objective of this project is to implement a robust system for hand gesture recognition. This task is crucial for various applications, including sign language recognition, interactive interfaces, and gaming.

  • Angle Detection: Going beyond gesture recognition, the project endeavours to detect the hand's precise relative angle. This feature enables a more sophisticated level of interaction, particularly valuable in applications like robotics, augmented reality, and virtual reality.

Classification and Regression Aspects:

This project is unique as it involves both classification and regression tasks:

  • Classification: For hand gesture recognition, I employ classification techniques. I use the versatile K-Nearest Neighbor (KNN) algorithm to classify hand gestures, making this a valuable tool for interactive applications (a conceptual sketch follows this list).

  • Regression: To detect the precise angle of the hand, a regression model is employed. I use a numeric (continuous) Wekinator output to achieve this, enabling accurate and continuous angle tracking.
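
Wekinator handles the training internally, but to make the classification step less of a black box, here is a conceptual sketch of the idea behind a nearest-neighbor classifier (k = 1 for brevity). This is not Wekinator's actual code; the feature vectors, labels, and sample values are invented for illustration and are much shorter than the real 42-value hand input.

// Conceptual 1-nearest-neighbor classifier: the idea behind Wekinator's KNN
// option. Not Wekinator's actual code; the features and labels below are
// made-up, shortened stand-ins for the 42 hand-keypoint inputs.
float[][] trainFeatures = {
  {0.2, 0.3, 0.5, 0.6}, // recorded features for gesture class 1
  {0.8, 0.7, 0.9, 0.6}  // recorded features for gesture class 2
};
int[] trainLabels = {1, 2};

// Return the label of the training example closest to the input
int classifyNearest(float[] input) {
  int best = -1;
  float bestDist = Float.MAX_VALUE;
  for (int i = 0; i < trainFeatures.length; i++) {
    float d = 0;
    for (int j = 0; j < input.length; j++) {
      float diff = input[j] - trainFeatures[i][j];
      d += diff * diff; // squared Euclidean distance
    }
    if (d < bestDist) {
      bestDist = d;
      best = trainLabels[i];
    }
  }
  return best;
}

void setup() {
  float[] query = {0.25, 0.35, 0.45, 0.55};
  println("Predicted class: " + classifyNearest(query)); // prints 1
}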

Project Workflow:

Here's a more detailed breakdown of the project workflow:

  1. Software Installation: Download VisionOSC (the best software I found in 2023).

  2. Open the Processing Code: Launch the Processing code for the vision demo (the sender sketch below).

  3. Setting Up Wekinator:

    • Download and install Wekinator.

    • Configure Wekinator by setting the number of inputs to 42 (21 hand keypoints × 2 normalized coordinates), and double-check the port settings: the sender sketch below sends to Wekinator's default input port, 6448.

  4. Select Recognition Mode:

    • If you aim to recognize hand gestures, choose the "All classifiers" option. For hand gesture recognition, the K-Nearest Neighbor (KNN) algorithm is recommended.

    • If you intend to recognize hand movements, select Dynamic Time Warping (DTW).

    • For recognizing both hand gestures and the hand's angle (see the angle sketch after this list):

      • Change the number of outputs to 2.

      • Choose the "custom" option.

      • Click "configure," and in the next window:

        • Change the first output's type to "classification" and use the K-Nearest Neighbor algorithm.

        • Set the second output's Max to 180 (you can customize this as needed).

        • Set hard limits for both parameters.

  5. Model Training: Train your model. If you're not familiar with Wekinator, refer to Wekinator's tutorials for guidance.

  6. Processing Receiver Code: Open the Processing Receiver Code and ensure that it is receiving and processing the data correctly.
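
Before wiring everything up, it helps to pin down what "angle of the hand" means numerically, since a regression output can only learn values you demonstrate. A direct geometric calculation is useful both for deciding what to set the second output slider to while recording examples and for sanity-checking the trained model afterwards. Below is a minimal helper, reusing the KeyPt and PtsDetection classes from the sender sketch that follows; the keypoint indices (0 = wrist, 9 = middle-finger base) assume the common 21-point hand layout and should be verified against your VisionOSC build.

// Compute the hand's rotation from two keypoints with atan2. Assumes point 0
// is the wrist and point 9 the middle-finger base (common 21-point layout;
// verify these indices against your VisionOSC build).
float handAngle(PtsDetection hand) {
  KeyPt wrist = hand.points[0];
  KeyPt midBase = hand.points[9];
  // atan2 gives -180..180 degrees; fold into 0..180 to match the Max of 180
  // set in Wekinator (note: opposite directions map to the same value)
  float a = degrees(atan2(midBase.y - wrist.y, midBase.x - wrist.x));
  return (a + 180) % 180;
}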


// VisionOSC connect demo created by Revive (Weiqi Sun)

import oscP5.*;
import netP5.*;
NetAddress wekinatorAddress;
OscP5 oscP5;
int HAND_N_PART = 21; // keypoints per detected hand
int MAX_DET = 32;     // maximum number of simultaneous detections

class KeyPt {
  float x;
  float y;
  float score;
  KeyPt(float _x, float _y, float _score) {
    x = _x;
    y = _y;
    score = _score;
  }
}

class PtsDetection {
  float score;
  KeyPt[] points;
  PtsDetection(int n) {
    points = new KeyPt[n];
  }
}

PtsDetection[] hands;
int nHand = 0;

int vidW;
int vidH;

void setup() {
  size(640, 480);
  oscP5 = new OscP5(this, 9527); // listen for VisionOSC messages on port 9527
  wekinatorAddress = new NetAddress("127.0.0.1", 6448); // Change to your Wekinator address and port
  hands = new PtsDetection[MAX_DET];
}

void draw() {
  background(0);
  fill(0, 255, 255);
  drawPtsDetection(hands, nHand, 5);
  fill(255);
}

void drawPtsDetection(PtsDetection[] dets, int nDet, int rad) {
  pushStyle();
  noStroke();
  for (int i = 0; i < nDet; i++) {
    for (int j = 0; j < dets[i].points.length; j++) {
      if (dets[i].points[j] != null) {
        circle(dets[i].points[j].x, dets[i].points[j].y, rad);
      }
    }
  }
  popStyle();
}

// Parse a VisionOSC points message laid out as: video width, video height,
// number of detections, then per detection a score followed by
// nParts * (x, y, score). Returns the number of detections read.
int readPtsDetection(OscMessage msg, int nParts, PtsDetection[] out) {
  vidW = msg.get(0).intValue();
  vidH = msg.get(1).intValue();
  int nDet = min(msg.get(2).intValue(), out.length); // clamp to array capacity
  int n = nParts * 3 + 1; // values per detection: 1 score + 3 per keypoint
  for (int i = 0; i < nDet; i++) {
    PtsDetection det = new PtsDetection(nParts);
    det.score = msg.get(3 + i * n).floatValue();
    for (int j = 0; j < nParts; j++) {
      float x = msg.get(3 + i * n + 1 + j * 3).floatValue();
      float y = msg.get(3 + i * n + 1 + j * 3 + 1).floatValue();
      float score = msg.get(3 + i * n + 1 + j * 3 + 2).floatValue();
      det.points[j] = new KeyPt(x, y, score);
    }
    out[i] = det;
  }
  return nDet;
}

void oscEvent(OscMessage msg) {
  if (msg.addrPattern().equals("/hands/arr")) {
    nHand = readPtsDetection(msg, HAND_N_PART, hands);

    if (nHand == 0) return; // nothing to send when no hand is detected

    // Create a new OSC message for sending relative positions
    OscMessage wekinatorMsg = new OscMessage("/wek/inputs");

    // Add the relative positions of the first detected hand only, so the
    // message always carries exactly 42 values, matching Wekinator's inputs
    for (int j = 0; j < HAND_N_PART; j++) {
      if (hands[0].points[j] != null) {
        float relativeX = hands[0].points[j].x / vidW; // x normalized to 0..1
        float relativeY = hands[0].points[j].y / vidH; // y normalized to 0..1
        wekinatorMsg.add(relativeX);
        wekinatorMsg.add(relativeY);
      }
    }

    // Send the OSC message to Wekinator
    oscP5.send(wekinatorMsg, wekinatorAddress);
  }
}
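
Two details in the sketch above are easy to miss: each coordinate is divided by the video width or height, so Wekinator sees values normalized to 0..1 and the trained model does not depend on the camera resolution; and only the first detected hand is forwarded, so the message always carries exactly the 42 inputs Wekinator was configured for.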

 


// OSC receiver by Revive (Weiqi Sun)
import oscP5.*;
import netP5.*;

OscP5 oscP5;

void setup() {
  size(400, 400);
  oscP5 = new OscP5(this, 12000); // listen on Wekinator's default output port
}

void draw() {
  background(0);
}

void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/wek/outputs")) {
    float value1 = msg.get(0).floatValue(); // output 1: gesture class
    float value2 = msg.get(1).floatValue(); // output 2: hand angle (degrees)
    // float value3 = msg.get(2).floatValue(); // Revive: if you have more outputs
    println("Received OSC signal: " + value1 + ", " + value2);
  }
}
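
Once the outputs arrive correctly, they can drive visuals directly. Below is a minimal variation of the receiver that draws the gesture class as text and the regressed angle as a rotating needle; run it instead of the receiver above, since both bind port 12000. It assumes output 1 is the KNN gesture class and output 2 the angle in degrees, and the variable names are illustrative.

// Minimal visualization of the two Wekinator outputs
import oscP5.*;
import netP5.*;

OscP5 oscP5;
float gestureClass = 0;  // output 1: gesture class from the KNN classifier
float handAngleDeg = 0;  // output 2: regressed hand angle in degrees (0..180)

void setup() {
  size(400, 400);
  oscP5 = new OscP5(this, 12000); // Wekinator's default output port
}

void draw() {
  background(0);
  fill(255);
  text("Gesture: " + int(gestureClass), 20, 30);

  // Needle rotated by the regressed angle
  translate(width / 2, height / 2);
  rotate(radians(handAngleDeg));
  stroke(0, 255, 255);
  line(0, 0, 120, 0);
}

void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/wek/outputs")) {
    gestureClass = msg.get(0).floatValue();
    handAngleDeg = msg.get(1).floatValue();
  }
}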

 
