ISSN: 2375-3846
American Journal of Science and Technology  
Manuscript Information
 
 
Development of a Gesture Detection System for the Control of a Robotic Arm
American Journal of Science and Technology, Vol. 3, No. 1, Publication Date: February 3, 2016, Pages: 17-24
 
 
Authors
 
[1] Osahor Uche, Department of Electronic and Electrical Engineering, Obafemi Awolowo University, Ile-Ife, Nigeria.

[2] L. O. Kehinde, Department of Electronic and Electrical Engineering, Obafemi Awolowo University, Ile-Ife, Nigeria.

 
Abstract
 

This study developed a gesture detection system for the control of a robotic arm, with a view to providing a simple and user-friendly approach to robotic arm manipulation. A Microsoft Xbox™ Kinect sensor was used as the input device for retrieving real-time joint information from the user. Kinematic algorithms were developed by applying Denavit-Hartenberg parameters and geometric equations. The gestures acquired from the human subject were used to carry out various kinematic routines as directed by the user, and a graphical interface was developed to provide real-time feedback. The robot executed basic gestures as instructed by the user with a success rate of about 80%. Most of the gestures developed were effective within the scope of this project; gesture detection errors averaged below 20%, with a standard deviation of 3.8 and a variance of approximately 14.1. The study implemented an effective gesture system capable of executing the basic routines required for real-time robotic arm manipulation.
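The Denavit-Hartenberg approach mentioned in the abstract can be illustrated with a short sketch. This is not the paper's implementation: the manuscript's actual D-H table is not reproduced on this page, so the link lengths and joint angles below are placeholder values for a generic two-link planar arm.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint.

    theta: joint angle, d: link offset, a: link length, alpha: link twist.
    """
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows):
    """Chain the per-joint transforms to get the base-to-end-effector pose."""
    T = np.eye(4)
    for theta, d, a, alpha in dh_rows:
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Placeholder D-H rows (theta, d, a, alpha) for a 2-link planar arm,
# first joint rotated 90 degrees; values are illustrative only.
dh = [(np.pi / 2, 0.0, 0.30, 0.0),
      (0.0,       0.0, 0.25, 0.0)]

position = forward_kinematics(dh)[:3, 3]  # end-effector x, y, z
```

In a Kinect-driven pipeline such as the one the abstract describes, the joint angles fed into `theta` would be derived from the skeletal joint positions reported by the sensor each frame.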


Keywords
 

Gesture Detection, Kinect Sensor, Graphical Interface, Robotic Arm


References
 
[01] Cannan, J. & Hu, H., (2010). Human-Machine Interaction (HMI): A Survey. Essex: School of Computer Science & Electronic Engineering. [Online] http://cswww.essex.ac.uk/staff/hhu/Papers/CES-508%20HMI-Survey.pdf

[02] Johannsen, G., (2007). Human Machine Interaction. Control Systems, Robotics and Automation, Volume XXI, p. 1.

[03] Steven J, L., Shawn, S. & Neal, C., (2014). Natural User Interface for Robot Task Assignment. [Online] http://hci.cs.wisc.edu/workshops/RSS2014/wp-content/uploads/2013/12/levine2014natural.pdf

[04] Shobhitha, A. J., Jegan, R. & Melwin, A., (2013). OWI-535 EDGE Robotic Arm Control Using ElectroMyoGram (EMG) Signals. International Journal of Innovative Technology and Exploring Engineering (IJITEE), II(6), pp. 282-286.

[05] Olawale, J., Oludele, A., Ayodele, A. & Miko, N., (2007). Development of a Microcontroller Based Robotic Arm. American Journal of Engineering Research (AJER), IV(2), p. 8.

[06] Jana, A., (2012). Kinect for Windows SDK Programming Guide. 1st ed. Birmingham: Packt Publishing. [Online] http://www.pactpub.com/game-development/kinect-windows-sdk-programming-guide

[07] Clement, G. & Massimo, F., (2013). Kinect in Motion - Audio and Visual Tracking by Example. 1st ed. Birmingham: Packt Publishing. [Online] http://www.pactpub.com/game-development/kinect-motion-%E2%80%93-audio-and-visual-tracking-example

[08] Akinwale, O. B., Kehinde, L. O., Ayodele, K. P., Jubril, A. M., Jonah, O. P., Ilori, O. & Chen, X., (2009). "A LabVIEW-based on-line robotic arm for students' laboratory". 2009 ASEE Annual Conference & Exposition, Austin, Texas: ASEE, Paper 2009-1179.

[09] Catuhe, D., (2012). Programming with the Kinect for Windows. Redmond, Washington: Microsoft Press. [Online] https://books.google.com/books/about/Programming_with_the_Kinect_for_Windows.html?id=66kgXfDKdYwC

[10] Microsoft, (2015). Chapter 18. Threading and Synchronization. [Online] msdn.microsoft.com/enus/library/ff652496(d=default,l=enus,v=orm.10).aspx [Accessed 9 July 2015].





 