This paper describes an algorithm for a visual human-machine interface that infers a person's intention from the motion of the hand. Work in progress is presented as a proof of concept tested on static images. The intended application context is that of wheelchair-bound individuals, whose intentions concern changes in the direction and speed of the wheelchair. Results show that the symmetry property of the hand in motion can serve as an indicator of intent.
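The abstract does not specify how the symmetry property is measured or mapped to an intention. The following Python sketch is purely illustrative: it assumes a pre-segmented binary hand mask and uses a hypothetical reflective-symmetry score, thresholded into a toy straight/turn decision. The function names, threshold, and decision rule are assumptions for illustration, not the authors' method.

```python
import numpy as np

def symmetry_score(mask: np.ndarray) -> float:
    """Reflective-symmetry score of a binary hand mask about the vertical
    midline of its bounding box: 1.0 = perfect mirror symmetry, 0.0 = none.
    (Illustrative measure only; not the measure used in the paper.)"""
    rows = np.where(mask.any(axis=1))[0]
    cols = np.where(mask.any(axis=0))[0]
    if rows.size == 0:                    # empty mask: no hand detected
        return 0.0
    hand = mask[rows[0]:rows[-1] + 1, cols[0]:cols[-1] + 1]
    mirrored = hand[:, ::-1]              # flip left-right
    overlap = np.logical_and(hand, mirrored).sum()
    union = np.logical_or(hand, mirrored).sum()
    return float(overlap) / float(union)

def infer_turn_intent(mask: np.ndarray, threshold: float = 0.8) -> str:
    """Toy decision rule: a near-symmetric hand posture reads as 'straight';
    an asymmetric one reads as a turn toward the side with more hand pixels.
    (Threshold and rule are assumptions for illustration.)"""
    if symmetry_score(mask) >= threshold:
        return "straight"
    cols = np.where(mask.any(axis=0))[0]
    centre = (cols[0] + cols[-1]) // 2
    left_mass = int(mask[:, :centre].sum())
    right_mass = int(mask[:, centre:].sum())
    return "left" if left_mass > right_mass else "right"

if __name__ == "__main__":
    # Synthetic example: a mirror-symmetric blob vs. one skewed to the right.
    symmetric = np.zeros((20, 20), dtype=bool)
    symmetric[5:15, 5:15] = True
    skewed = np.zeros((20, 20), dtype=bool)
    skewed[5:15, 10:19] = True
    skewed[8:12, 4:10] = True
    print(infer_turn_intent(symmetric))   # -> "straight"
    print(infer_turn_intent(skewed))      # -> "right"
```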
Reference:
Luhandjula, T., Djouani, K., Hamam, Y., Van Wyk, B., & Williams, Q. (2010). Hand-based visual intent recognition algorithm for wheelchair motion. 3rd International Conference on Human System Interaction (HSI), 13-15 May 2010, University of IT and Management, Rzeszow, Poland. http://hdl.handle.net/10204/4158