About Camera Mouse Suite

The classical Camera Mouse is a free program that enables you to control the mouse pointer on your computer screen just by moving your head (or foot or finger). It can be downloaded at our sister website www.cameramouse.org.

The Camera Mouse Suite is the "beta version" or "research version" of Camera Mouse. It is free and provides a suite of programs, including the classical version of the Camera Mouse. The Camera Mouse Suite provides the additional functionality described below.


Camera Mouse Suite

You may download the Camera Mouse 2011 Manual, a two-page brochure on Camera Mouse, and a manual explaining the additional functionality of the Camera Mouse Suite: the Camera Mouse Suite 1.0 Manual.

Research Papers

Improvements to the Tracking Mechanism of the Camera Mouse

A fundamental change to the tracking mechanism of the original Camera Mouse has been proposed that leverages kernel projections, which are traditionally associated with machine learning. In the Augmented Camera Mouse, the interface first learns an Active Hidden Model of the appearance of the user's facial feature while he or she is using the Camera Mouse, and then tracks the feature by matching its current appearance to this model. The Augmented Camera Mouse was empirically shown to be very resilient to feature drift.

S. Epstein, E. Missimer, and M. Betke. "Using kernels for a video-based mouse-replacement interface," Personal and Ubiquitous Computing, 18(1):47-60, January 2014. pdf.

S. Epstein and M. Betke. "Active Hidden Models for Tracking with Kernel Projections," Department of Computer Science Technical Report BUCS-2009-006, Boston University, pdf, abstract.

The Camera Mouse can lose the facial feature it is supposed to track. This sometimes happens when the user makes a rapid movement, perhaps due to a spastic condition. It also occurs when the facial feature is not in the camera view at all times, because the user turned or tilted his or her head significantly or because a caregiver occluded the view, for example, while wiping the user's face. A new version of the Camera Mouse can detect and recover from feature loss. If it detects that the feature has been lost, it uses a carefully designed multi-phase search process to find the subimage in the current video frame with the best correlation to the image of the initially tracked feature.

C. Connor, E. Yu, J. Magee, E. Cansizoglu, S. Epstein, and M. Betke. "Movement and Recovery Analysis of a Mouse-Replacement Interface for Users with Severe Disabilities." 13th International Conference on Human-Computer Interaction (HCI International 2009), San Diego, CA, July 2009. 10 pp. pdf.
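The recovery search described above can be sketched as follows (an illustrative Python sketch, not the published implementation; the window radii, correlation threshold, and function names are assumptions): the saved template of the feature is compared, via normalized cross-correlation, against subimages in progressively wider regions around the last known feature position.

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between two equally sized grayscale patches."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return 0.0 if denom == 0 else float((p * t).sum() / denom)

def recover_feature(frame, template, last_xy, radii=(20, 60, None), threshold=0.8):
    """Search for a lost feature in widening windows around its last known
    position; a radius of None means 'search the whole frame'."""
    th, tw = template.shape
    H, W = frame.shape
    best = -1.0
    for r in radii:
        if r is None:
            ys, xs = range(0, H - th + 1), range(0, W - tw + 1)
        else:
            x0, y0 = last_xy
            ys = range(max(0, y0 - r), min(H - th, y0 + r) + 1)
            xs = range(max(0, x0 - r), min(W - tw, x0 + r) + 1)
        best, best_xy = -1.0, None
        for y in ys:
            for x in xs:
                score = ncc(frame[y:y + th, x:x + tw], template)
                if score > best:
                    best, best_xy = score, (x, y)
        if best >= threshold:
            return best_xy, best  # confident re-detection in this phase
    return None, best             # feature still lost
```

Each radius corresponds to one phase of the search; only if no phase yields a sufficiently high correlation is the feature declared still lost.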
Combining Eye-Gaze and Head-Movement Interaction

Augmentative and alternative communication tools allow people with severe motion disabilities to interact with computers. Two commonly used tools are video-based interfaces and eye trackers. Video-based interfaces map head movements captured by a camera to mouse pointer movements. Alternatively, eye trackers place the mouse pointer at the estimated position of the user's gaze. Eye-tracking-based interfaces have even been shown to outperform traditional mice in terms of speed; however, the accuracy of current eye trackers is not sufficient for fine mouse pointer placement. We proposed the Head Movement And Gaze Input Cascaded (HMAGIC) pointing technique, which combines head-movement and gaze-based inputs in a fast and accurate mouse-replacement interface. The interface initially places the pointer at the estimated gaze position, and the user then makes fine adjustments with his or her head movements. We conducted a user experiment to compare HMAGIC with a mouse-replacement interface that uses only head movements to control the pointer. Our experimental results indicate that HMAGIC is significantly faster than the head-only interface while still providing accurate mouse pointer positioning.

A. Kurauchi, W. Feng, C. Morimoto, and M. Betke. "HMAGIC: Head Movement And Gaze Input Cascaded Pointing," 8th International Conference on Pervasive Technologies Related to Assistive Environments (PETRA), Corfu, Greece, July 1-3, 2015, 5 pp. pdf.
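The cascaded coarse/fine idea behind HMAGIC can be illustrated with a small Python sketch (the names, trigger, and gain here are hypothetical; the actual trigger design and filtering are described in the paper): a trigger event warps the pointer to the noisy gaze estimate, after which head movements refine its position.

```python
from dataclasses import dataclass

@dataclass
class Pointer:
    x: float = 0.0
    y: float = 0.0

def hmagic_update(pointer, gaze_xy, head_delta, trigger, gain=1.0):
    """One pointer update in a cascaded gaze+head scheme."""
    if trigger:
        # Coarse phase: jump to the estimated gaze position.
        pointer.x, pointer.y = gaze_xy
    else:
        # Fine phase: accumulate small head-movement adjustments.
        dx, dy = head_delta
        pointer.x += gain * dx
        pointer.y += gain * dy
    return pointer
```

The coarse jump gives the speed of gaze pointing, while the fine phase compensates for the eye tracker's limited accuracy.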
Multi-camera Interfaces

Prof. Betke's research group has started to develop a multi-camera interface system that serves as an improved communication tool for people with severe motion impairments. The group is working on a multi-camera capture system that can record synchronized images from multiple cameras and automatically analyze the camera arrangement.

In a preliminary experiment, 15 human subjects were recorded by three cameras while they conducted a hands-free interaction experiment. The three-dimensional movement trajectories of various facial features were reconstructed via stereoscopy. The analysis showed substantial feature movement in the third dimension, which is typically neglected by single-camera interfaces based on two-dimensional feature tracking.

E. Ataer-Cansizoglu and M. Betke. "An information fusion approach for multiview feature tracking." In 20th International Conference on Pattern Recognition (ICPR), August 23-26, 2010, Istanbul, Turkey. IAPR Press, August 2010, 4 pp.

J. Magee, Z. Wu, H. Chennamaneni, S. Epstein, D. H. Theriault, and M. Betke. "Towards a multi-camera mouse-replacement interface." In A. Fred, editor, The 10th International Workshop on Pattern Recognition in Information Systems - PRIS 2010. In conjunction with ICEIS 2010, Madeira, Portugal - June 2010, pages 33-42, INSTICC Press, 2010. pdf.
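The stereoscopic reconstruction of feature trajectories rests on triangulation: given a feature's pixel coordinates in two calibrated cameras, its 3-D position can be recovered. The following is a generic linear (DLT) triangulation sketch, not the group's actual pipeline:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover a 3-D point from its pixel
    projections x1, x2 in two cameras with 3x4 projection matrices P1, P2."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous 3-D point is the null vector of A (last row of V^T).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

With three cameras, each additional view contributes two more rows to A, which makes the reconstruction more robust to noise and occlusion.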

Blink and Wink Interface

The research group has recently refocused its efforts on improving the mechanism for simulating mouse clicks. The current Camera Mouse software limits the user to left-click commands, which the user executes by hovering over a certain location for a predetermined amount of time. For users who can control head movements and can wink with one eye while keeping the other eye visibly open, the new "blink and wink interface" allows complete use of a typical mouse, including moving the pointer, left- and right-clicking, double-clicking, and click-and-dragging. For users who cannot wink but can blink voluntarily, the system allows the user to perform left clicks, the most common and useful mouse action.

E. Missimer and M. Betke. "Blink and wink detection for mouse pointer control." The 3rd ACM International Conference on Pervasive Technologies Related to Assistive Environments (PETRA 2010), Pythagorion, Samos, Greece. June 2010. 8 pp. pdf.
YouTube Video of the Blink & Wink Version of Camera Mouse by Eric Missimer, June 2010
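The hover-to-click (dwell) mechanism that the blink and wink interface extends can be sketched in Python as follows (the radius and dwell time are illustrative defaults, not the values used by Camera Mouse): a click fires when the pointer stays within a small radius for the full dwell interval.

```python
import math

class DwellClicker:
    """Hover-to-click logic: if the pointer stays inside a small radius
    for `dwell` seconds, issue a left click."""
    def __init__(self, radius=15.0, dwell=1.0):
        self.radius = radius
        self.dwell = dwell
        self.anchor = None    # position where the dwell timer started
        self.anchor_t = None  # time when the dwell timer started

    def update(self, x, y, t):
        """Feed one pointer sample; return True when a click fires."""
        if self.anchor is None or \
           math.hypot(x - self.anchor[0], y - self.anchor[1]) > self.radius:
            # Pointer moved outside the dwell zone: restart the timer.
            self.anchor, self.anchor_t = (x, y), t
            return False
        if t - self.anchor_t >= self.dwell:
            # Dwell completed: fire a click and re-arm for the next one.
            self.anchor, self.anchor_t = (x, y), t
            return True
        return False
```

A blink- or wink-triggered click replaces the timer condition with an eye-state event, which is what lets the new interface offer right clicks and dragging as well.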

The Speaking with Eyes section below describes the group's earlier efforts in developing blink detection algorithms.

Assistive Software

E. Missimer, S. Epstein, J. J. Magee, and M. Betke. "Customizable keyboard." In The 12th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2010), Orlando, Florida, USA, October 2010. pdf.

J. J. Magee, S. Epstein, E. Missimer, and M. Betke. "Adaptive mappings for mouse-replacement interfaces." In The 12th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2010), Orlando, Florida, USA, October 2010.

J. Magee and M. Betke. "HAIL: hierarchical adaptive interface layout." In K. Miesenberger et al., editor, 12th International Conference on Computers Helping People with Special Needs (ICCHP 2010), Vienna University of Technology, Austria, Part 1, LNCS 6179, pages 139-146. Springer-Verlag Berlin Heidelberg, July 2010. pdf. Abstract.

S. Deshpande and M. Betke. "RefLink: An Interface that Enables People with Motion Impairments to Analyze Web Content and Dynamically Link to References." The 9th International Workshop on Pattern Recognition in Information Systems (PRIS 2009), Milan, Italy, May 2009, A. Fred (editor), pages 28-36, INSTICC Press. pdf.

The published material is based upon work supported by the National Science Foundation under Grants 0713229, 0093367, and 0202067 and the Office of Naval Research under Grant N000140110444. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Last update by Margrit Betke 10/2015