By rough estimate, people afflicted with paraplegia and locked-in
syndrome number in the hundreds of thousands and tens of thousands,
respectively. These disorders often render the patient unable to control
any bodily function other than eyelid movement (blinking). We are
developing software around a commercial wireless brainwave sensor to
convert electromyography patterns (the electrical activity of muscle
tissue) into both synthesized verbal phrases and signals for controlling
external devices such as household appliances and electric wheelchairs.
At this stage of development we have demonstrated a software
prototype that facilitates verbal communication via speech synthesis
and external device control through eye-blink detection
(http://youtu.be/YFa9b01u6lo). The target recipient of this system is a
person suffering from a severe mobility impairment such as locked-in
syndrome or paraplegia, where the primary - or only - means of physical
control is blinking of the eyelids.
The sensor used in this application is a NeuroSky MindWave
headband that detects electromyography patterns and streams
blink-strength data over a wireless Bluetooth link to the computer. The
software encodes blink-strength data as binary patterns that correspond
to certain verbal phrases and device control commands. The MindWave
headband provides a lightweight, unobtrusive means of gathering blink
strength data and other brainwave parameters that can be monitored for
patient stress levels and overall wellness.
Using the Code
As a nonprofit research and development organization dedicated to
improving the human experience through engineered solutions that
integrate custom software and electronic designs with advanced,
low-cost sensor technologies, we will provide the source code for this
project free of charge upon email request:
info@Human-MachineTechnologies.com
Points of Interest
In the video I pseudo-randomly select flash cards to demonstrate the
deterministic nature of the blink-pattern encoding algorithm. This type
of control can be learned in minutes with highly accurate results.
With some practice, the pattern scan interval can be decreased to
provide faster communication and control.
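To illustrate how the scan interval bounds communication speed, a bit of
arithmetic helps. The numbers below are hypothetical, not measured
BlinkTalk performance figures.

```python
# Illustrative arithmetic only: with one bit captured per scan interval,
# shortening the interval proportionally raises the maximum phrase rate.
# Interval and pattern length are hypothetical example values.

def phrases_per_minute(scan_interval_s, bits_per_pattern):
    """Upper bound on phrase selections per minute for a given scan interval."""
    seconds_per_phrase = scan_interval_s * bits_per_pattern
    return 60.0 / seconds_per_phrase

# A 3-bit pattern at a 2-second interval allows at most 10 phrases per
# minute; halving the interval to 1 second doubles that to 20.
print(phrases_per_minute(2.0, 3))   # 10.0
print(phrases_per_minute(1.0, 3))   # 20.0
```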
The prospect of developing this rough prototype into a full-featured Ultrabook application opens up some new and interesting possibilities. The ability for a therapist to use Touch (as opposed to keyboard and mouse manipulations) during patient training sessions would greatly simplify the interactive learning process.
Furthermore, patients with limited hand mobility might be able to augment their interactions with the software, given an appropriate UI design that leverages the Ultrabook's Touch capability.
Also of importance to people with disabilities are the anti-theft and identity protection features provided by the Ultrabook. According to a U.S. Department of Justice report, in 2010 over half a million people with disabilities were victims of non-fatal violent crimes that include robbery.
For more information on our projects and mission, please visit our website at human-machinetechnologies.org.
On November 19 we were notified that our BlinkTalk application
was validated and published by Intel on the AppUp website. Since a NeuroSky
MindWave brainwave monitor is required for BlinkTalk to work correctly, we made
a short demo video to show how it works. The video is posted on our YouTube
channel. (Note: the app will start up and can be navigated without a
MindWave connected, but it will not interpret eye-blink patterns.)
Although BlinkTalk is a Windows desktop application, the
user interface is designed in a Metro-like style to clearly convey
operational information. The buttons are oversized so that patients with
limited mobility, and their caregivers, can more easily interact with
Touch-capable computers such as an Ultrabook.
BlinkTalk is a multi-threaded C#/WPF application developed in Visual Studio
2010 Pro. It also builds successfully in Visual Studio 2012 Express on the Ultrabook.
The video was shot using the Intel Ultrabook's built-in
camera and microphone. We used Microsoft Expression Encoder 4 for video
capture. The camera view is dragged onto the desktop to create a
split-screen view that Expression Encoder can capture along with the
running application. All screen navigation
shown in the video was done using the Ultrabook's touch-screen.
In the video I use flash cards, much as we demonstrated with
the original prototype, which also included device control (http://youtu.be/YFa9b01u6lo).
Although there is no reason why external device control couldn't be achieved
with the Ultrabook, we wanted to simplify the app for publishing so it would be
accessible to a wider audience.
For those who have requested source code for
this project, I will be sending it out soon now that BlinkTalk has been
published.
We are a nonprofit organization dedicated to helping improve the human experience through engineered solutions that integrate custom software and electronics with low-cost, commercially available sensor technologies.
Our objective is to identify emerging and disruptive Natural User Interface (NUI) technologies and engineer them into innovative systems that will help people to be safe, well and productive.