American Sign Language Recognition Using Leap Motion Sensor

Ching-Hua Chuan, Eric Regina, Caroline Guardino

Research output: Chapter or Contribution to Book › Chapter › peer-review

Abstract

In this paper, we present an American Sign Language recognition system using a compact and affordable 3D motion sensor. The palm-sized Leap Motion sensor provides a far more portable and economical solution than the CyberGlove or Microsoft Kinect used in existing studies. We apply k-nearest neighbor and support vector machine classifiers to recognize the 26 letters of the English alphabet in American Sign Language, using features derived from the sensory data. The experimental results show that the highest average classification rates achieved were 72.78% for k-nearest neighbor and 79.83% for support vector machine. We also provide a detailed discussion of the parameter settings for the machine learning methods and of the recognition accuracy for specific alphabet letters.
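To make the classification setup concrete, the following is a minimal sketch (not the authors' code) of applying k-nearest neighbor and an RBF-kernel support vector machine, as named in the abstract and keywords, to feature vectors derived from Leap Motion data. The file names, the feature layout, and the parameter values (k, C, gamma, number of folds) are illustrative assumptions, not values reported in the paper.

```python
# Illustrative sketch only: k-NN and RBF-kernel SVM over Leap Motion
# feature vectors, evaluated with cross-validation.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# X: one row per sample; columns are derived features such as fingertip
# positions and finger-direction unit vectors (assumed layout).
# y: the signed letter ('a'..'z') for each sample.
X = np.load("leap_features.npy")  # hypothetical feature file
y = np.load("leap_labels.npy")    # hypothetical label file

knn = KNeighborsClassifier(n_neighbors=5)      # k is a tunable parameter
svm = SVC(kernel="rbf", C=1.0, gamma="scale")  # RBF kernel, assumed settings

for name, clf in [("k-NN", knn), ("SVM", svm)]:
    scores = cross_val_score(clf, X, y, cv=4)
    print(f"{name}: mean accuracy {scores.mean():.2%}")
```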
Original language: American English
Title of host publication: 2014 13th International Conference on Machine Learning and Applications
Pages: 541-544
Number of pages: 4
ISBN (Electronic): 978-1-4799-7415-3
DOIs
State: Published - Dec 1 2014

Keywords

  • Motion Detection
  • Sign language
  • Sign Language Recognition
  • Language Recognition
  • American Sign Language
  • Leap Motion
  • Leap Motion Sensor
  • Machine Learning
  • Support Vector Machine
  • machine learning
  • Recognition System
  • Compact Sensor
  • English Alphabet
  • Specific Letter
  • Classification Results
  • Window Size
  • Correct Rate
  • Application Programming Interface
  • Radial Basis Function Kernel
  • Unit Vector in the Direction
  • Index Finger
  • Middle Finger
  • Visual Feedback
  • Ring Finger
  • Manual System
  • Kernel Function
  • 3D Leap Motion Sensor
  • k-nearest neighbor
  • support vector machine
  • deaf education

Disciplines

  • Signal Processing
  • Computer Sciences
  • Artificial Intelligence and Robotics
