
Using the Emotiv EPOC Neuroheadset as an EEG-Controlled Brain-Computer Interface

Adrienne Hawkes and Amy Liu Summer Ventures in Math and Science 2015

Visual and Image Processing Rahman Tashakkori, Eric Jackson, Jonathan Tate, Sina Tashakkori, David Kale,

Mitchell Parry Appalachian State University

Abstract - In this experiment, the functionality and usefulness of the Emotiv EPOC neuroheadset, an electroencephalography-controlled brain-computer interface, was tested by connecting its software with that of Scratch and Python to control a robotic Finch. The Emotiv EPOC has features enabling it to read and translate facial expressions and thoughts into keystrokes. To connect the headset to external programs, several programs were written in both Scratch 2.0 and Python to assign expressions and thoughts to certain keystrokes, triggering the Finch to move. The purpose of this experiment was to review the headset, testing whether its capabilities are medically ready for patients suffering from paralysis due to ALS. There is no known cure for Amyotrophic Lateral Sclerosis (ALS), and the use of brain-computer interfaces (BCIs) is being heavily researched; therefore, this experiment was used to relate the current technology being used for ALS patients to the Emotiv EPOC. Using the tests run on the EPOC and knowledge of other BCIs, the different settings and properties of the headset were investigated to consider its potential. Many issues were discovered with the functionality of this technology, leading to the conclusion that it is able to connect with different internal systems and control external devices, but it is not currently medically reliable.

1.0 Introduction

The purpose of this experiment is to test the reliability and usefulness of the Emotiv EPOC. This was done by utilizing the Emotiv software to control a Finch robot. This research aims to determine whether the Emotiv EPOC should be used in the medical field. It is believed that technology such as this neuroheadset could assist those suffering from paralysis due to diseases such as ALS. Emotiv Systems claims that the majority of people who suffer from some level of paralysis can employ the Emotiv EPOC. With this technology being easily accessible, it would be significant if this device proved to be medically valuable.

2.0 EEG and Operation of EEG-Devices

EEG, electroencephalography, is the process of recording brain activity over time. This method, typically noninvasive, is carried out by placing electrodes on the scalp, usually after applying a conductive gel or liquid to improve the signal. [1] EEG records the change in electrical polarity that occurs when the dendrites of neurons receive neurotransmitters from the axons of other neurons. [2] The device uses the different locations of the electrodes on the head and records the difference in voltage between two electrodes over time. [3][4] This recorded signal depends on the conductivity of the brain tissue, the conductivity of the electrodes, and the electrodes' orientation relative to the signals being fired by the brain. [3][4] However, some issues arise with EEG devices. The location of the waves being detected is theoretically impossible to determine because EEG records brain activity two-dimensionally, whereas in reality it is three-dimensional. [4] Furthermore, the same signal is occasionally sent from more than one of the brain's signal generators, which again inhibits the ability to decipher the exact source of the activity. Also, when there is an abnormality in the brain, the distress signal is not always sent from the precise location of the issue, making it difficult to pinpoint the source of the irregularity. [3] Due to these issues, it is often challenging to interpret EEG recordings accurately.

3.0 What is a BCI

BCIs, brain-computer interfaces, are devices that collect brain signals using EEG, translate the signals, and send them to an external device. The signal is then received as a command and executed accordingly. [5]

Figure 1 - Process of brain-computer interfaces [5]

These devices provide a way for people to carry out an action without the use of voluntary muscle movement. [5] A brain-computer interface can be either invasive or noninvasive. If it is invasive, surgery is required to place the electrodes directly on the brain or near its surface. [1] For a device that is noninvasive, electrodes are placed on the scalp and conductivity is improved by means of a gel or saline solution. While this method carries a smaller risk than invasive methods [6], the signal is not as distinct, since the electrodes are not as close to the signals being sent from the brain. (Figure 2)

Figure 2 - Illustration of why noninvasive electrodes get a weaker signal compared to the invasive method [7]


There is also a much poorer signal-to-noise ratio compared to invasive methods [8], meaning the amount of noise coming from external sources is large relative to the signal being detected from the brain. [9] Controlling an external device by means of a BCI requires more than just thinking one thought. Rather, to use a BCI, patients can select highlighted letters displayed on a screen and type their desired message. While this technology can be useful, it is difficult to carry out actions quickly, and the method often requires much training for the patient. [1]
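The letter-selection interface described above can be sketched as a simple grid speller. This is an illustrative toy, not any specific product's interface: a real BCI speller detects the user's selections from EEG responses, whereas here the selections are passed in directly.

```python
# Illustrative sketch of a grid-based speller, the kind of letter-selection
# interface described above. All names are hypothetical; a real BCI speller
# would infer each selection from brain signals rather than take it as input.

GRID = [
    "ABCDEF",
    "GHIJKL",
    "MNOPQR",
    "STUVWX",
    "YZ_.,!",
]

def spell(selections):
    """Each selection is a (row, column) pair the user 'confirmed'
    while that cell of the grid was highlighted on screen."""
    return "".join(GRID[r][c] for r, c in selections)

# Spelling "HI": H is at row 1, column 1; I is at row 1, column 2.
message = spell([(1, 1), (1, 2)])  # → "HI"
```

Even this toy shows why such interfaces are slow: every letter costs at least one full highlight-and-confirm cycle, which is the speed limitation the paragraph above notes.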

3.1 Existing Work with EEG-Controlled BCIs

In the 1980s, Apostolos P. Georgopoulos made some major discoveries concerning BCIs that have played a part in advancing research. [10] A study conducted from 1996 to 2005, based on Georgopoulos' research and using a variety of animals as subjects, achieved remarkable success with a macaque monkey that controlled a joystick using only her brain. Another type of BCI was discussed in work published in 2007. The review covered an existing EEG-controlled web browser that can be utilized by those suffering from paralysis. It is in part due to these successes that the usefulness of brain-computer interfaces has been proven and other devices have been produced, such as the Emotiv EPOC neuroheadset.

3.2 Existing Work with Emotiv EPOC

The Emotiv EPOC was released five years ago, when Tan Le, the co-founder of Emotiv Systems, spoke about wanting to create an inexpensive headset that could be used for experimentation by companies with limited budgets. [11] Because one of the main functions of the Emotiv EPOC is the ability to collect and display brainwaves, the system is commonly used to detect disorders such as epilepsy. This is possible due to the noticeable difference between the brainwave patterns of someone who has the illness and someone who does not. [12] The Emotiv EPOC also has the ability to distinguish emotional states, a feature designed mainly for use in gaming. One demo game, called Spirit Mountain, is currently being used with the headset. The ability to differentiate emotional states can also be used to control music type, volume, and light intensity, and to indicate distress within gaming. The headset also provides gyroscopes to associate head turns with commands such as "yes" or "no". [13] Another important application of the Emotiv EPOC is medical use. In August of 2014, a news article was published about ALS patient Eric Valor. This article revealed that during a trial run, Emotiv's Insight headset successfully allowed Valor to call for medical help by only thinking the command. [14] Due to this success, it is possible that the Emotiv EPOC holds the potential to aid medical needs.

3.3 Application of BCIs/Emotiv EPOC into Therapeutic/Practical Methods for ALS

Every year roughly 5,600 people are diagnosed with Amyotrophic Lateral Sclerosis (ALS), also known as Lou Gehrig's Disease. [15] ALS is the degeneration of motor neurons, nerve cells that extend from the brain to the spinal cord and then to the muscles. As the disease progresses, people gradually lose control of voluntary muscle movement, potentially losing the ability to swallow, speak, and breathe. [16] However, ALS does not usually cause damage to one's mind. [17] While many researchers are looking into how to slow or stop the degeneration of these motor neurons, there is currently no known cure. Meanwhile, further research is being done to find a way for those affected by ALS to regain a level of control without the use of their bodies. As some research has shown, it is possible for patients suffering from paralysis to control external devices through brain-computer interfaces (BCIs). Though most of the BCIs that currently exist are expensive and inconvenient, the technology has proven to be especially useful for people suffering from ALS. There have been successes with noninvasive techniques; however, due to the clearer signal provided by invasive procedures [9], it is agreed by most researchers that invasive BCIs are currently more promising. [6] BCI technology could be used with computers for typing and internet browsing, or it could be used to control devices such as wheelchairs or prosthetics. [1] Despite the progress that has been made, BCIs still need much improvement. [5]

4.0 Emotiv EPOC Headset and Software

The Emotiv EPOC neuroheadset is an EEG-controlled BCI. Using the electrodes on the headset, the device detects brainwaves and transmits them to the Emotiv software through a wireless USB dongle. In order to interpret these brain waves, the headset uses a reference electrode to approximate the brain wave coming from a specific electrode on a certain region of the head. There are fourteen of these electrodes, seven sets of two. Each one detects brain waves from a different region of the brain in correspondence to the two reference electrodes (CMS and DRL, pictured in Figure 3).

Figure 3 - Electrode positions on the Emotiv EPOC neuroheadset [18]
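The referential recording scheme described above amounts to subtracting the reference electrode's voltage from each electrode's raw reading, which cancels noise common to all sites. A minimal numpy sketch on synthetic data (all values invented for illustration):

```python
import numpy as np

# Sketch of referential EEG recording: each of the 14 channels is reported
# as its raw voltage minus the reference electrode's voltage, so noise
# common to every electrode cancels out. All data here is synthetic.

rng = np.random.default_rng(0)
n_samples = 128
common_noise = rng.normal(0, 5.0, n_samples)   # picked up by every electrode
reference = common_noise                       # reference sees only the noise
raw = np.array([common_noise + rng.normal(0, 0.5, n_samples)
                for _ in range(14)])           # 14 electrodes: noise + own signal

referenced = raw - reference                   # subtract the reference channel

# The large common noise is gone; only each electrode's own signal remains.
residual = np.abs(referenced).mean()
```

After subtraction, the mean absolute amplitude drops from roughly the noise level (±5) to the level of each electrode's own signal (±0.5), which is why a clean reference contact matters so much during setup.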

When setting up the headset, the felt pads on each of the electrodes must be moistened with saline solution to improve the conductivity of the electrodes. When the headset is placed on the head, the Emotiv Control Panel indicates the strength of the signal being detected by each electrode. The electrodes must then be repositioned on the scalp until all or most of the electrodes indicate a strong signal.


Figure 4 - Indication of most electrodes having strong signal strength

The Emotiv software includes a Control Panel and tools such as EmoComposer, EmoKey, and TestBench. Within the Control Panel, there are five threads that can be accessed. The Affectiv feature detects and displays the current emotional state of the user, such as boredom, engagement, and frustration. This thread was not employed in this experiment. The Cognitiv function detects thoughts and provides the ability for the user to train the headset according to his or her individual mental representations. The Expressiv thread detects and mimics facial expressions, and provides similar training abilities for the user as the Cognitiv thread. Under the Application tab, there are the remaining two options: neuroheadsetStatus and MouseControl. neuroheadsetStatus includes all information about the headset itself, such as battery status and signal strength. MouseControl allows the user to control the mouse by means of the Cognitiv capabilities. [18] EmoComposer can be used when the user is developing his or her own mapping, the layout made to connect expressions or thoughts to keystrokes. EmoKey is for connecting facial expressions and thoughts to certain keystrokes selected by the user. It also allows the keystrokes to be sent to the application in focus or to a selected application. TestBench displays the EEG data, the brain activity, and the connectivity of the electrodes. It also provides visuals for the FFT (Fast Fourier Transform), Gyro, and Data Packets.

Figure 5 - Difference in brain waves seen in TestBench when a user smirks right versus smiles [18]
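The FFT view that TestBench provides amounts to transforming a window of samples and reading off the dominant frequency. A sketch with numpy, using a synthetic 10 Hz sine in place of real EEG (the EPOC delivers samples at 128 Hz):

```python
import numpy as np

# Sketch of the FFT computation behind TestBench's spectrum view: transform
# a window of samples and find the dominant frequency. The 10 Hz sine here
# is a stand-in for an alpha-band rhythm; real EEG is far noisier.

fs = 128                                # EPOC sampling rate in Hz
t = np.arange(fs * 2) / fs              # two seconds of sample times
signal = np.sin(2 * np.pi * 10 * t)     # synthetic 10 Hz "brain wave"

spectrum = np.abs(np.fft.rfft(signal))  # magnitude spectrum
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
dominant = freqs[spectrum.argmax()]     # → 10.0
```

With a two-second window, the frequency resolution is 0.5 Hz, which is why longer recordings give a sharper picture of rhythms like the eyes-closed alpha waves examined later in this experiment.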


5.0 Scratch

Scratch is a computer program that allows users to create animations. Scratch is utilized for educational and recreational purposes. This online multimedia tool uses a drag-and-drop system in which the blocks that control the provided animated characters or self-made costumes are pre-designed. All available blocks that control these functions are categorized under the Scripts tab, as shown in Figure 6: Motion, Looks, Sound, Pen, Data, Events, Control, Sensing, Operators, and More Blocks. [19] The program also comes with different types of backgrounds and sounds; however, users may also import their own. The skills learned through Scratch are introductory to more complex computer languages such as Python and Java.

Figure 6 - Visual of Scratch 2.0

6.0 Python

BirdBrain Software also offers programs in Python for controlling the Finch. Python is a widely used general-purpose programming language.

7.0 Finch robots

Finch robots were designed as engaging devices to support the teaching of computer programming. The Finch can be run with a number of programming languages. Features supported by the BirdBrain software, which controls the Finch robot, include light, temperature, and obstacle sensors, motors, buzzers, accelerometers, a colored beak, a pen mount for drawing, and a plug for the USB port. [20]

Figure 7 - Finch robot
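A minimal sketch of driving a Finch-style robot from Python. The method names here (wheels, led, halt) are assumptions modeled loosely on the BirdBrain Finch API, not verified against it, and SimulatedFinch simply logs commands so the control logic can be exercised without hardware:

```python
# Hedged sketch of Finch-style control. wheels/led/halt are assumed method
# names, not the verified BirdBrain API; SimulatedFinch records each call
# so the logic can run without a physical robot attached.

class SimulatedFinch:
    def __init__(self):
        self.log = []

    def wheels(self, left, right):   # assumed: wheel speeds in [-1.0, 1.0]
        self.log.append(("wheels", left, right))

    def led(self, r, g, b):          # assumed: beak color, 0-255 per channel
        self.log.append(("led", r, g, b))

    def halt(self):                  # stop all motors
        self.log.append(("halt",))

def drive_forward_and_stop(finch):
    finch.wheels(0.5, 0.5)           # both wheels forward at half speed
    finch.led(0, 255, 0)             # green beak while moving
    finch.halt()                     # stop

robot = SimulatedFinch()
drive_forward_and_stop(robot)
```

Separating the control logic from the robot object this way also makes it easy to swap the simulator for the real Finch class when the hardware is connected.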


8.0 Methods

In order to test the reliability of the Emotiv EPOC neuroheadset, the capabilities of EmoKey and Scratch programming were combined to control a Finch robot. Before beginning, the software for the headset was downloaded. Once this was installed on a computer, the headset was connected to the programs by means of the Emotiv dongle. Setup for the headset took about ten to fifteen minutes each time. To begin setup, the sixteen felt pads were removed from the case in which they were stored. Making sure each pad was properly moistened, each one was twisted into place in the headset. After inserting the dongle into a USB port on a Windows computer, the headset was positioned on the head. The headset was turned on and the Emotiv Control Panel was opened, where the signal strength was displayed. If the headset was turned on before it was placed on the subject's head (Figure 8), there would often be no signal or a very poor one. The electrodes were adjusted until most or all of the signals were strong. This process was repeated each time the headset was used. The first experiment was with TestBench, during which the differences between the waves recorded when the subject's eyes were open and when they were closed were studied. There were changes in the frequency and amplitude of the waves; however, the most notable change in the waves resulted from the subject's head movement. The next experiment investigated the properties of the headset that recorded the subject's movement.

Figure 8 - The Emotiv EPOC in place on the subject's head

8.1 Training the Expressiv Suite

In the Control Panel, using the Expressiv Suite, the virtual robot's expressions were trained to more accurately match the subject's expressions. Under the tab labeled Training, the "Neutral" face was trained as a standard for the other facial expressions. Facial expressions such as Smile, Clench, Left/Right Smirk, Raise Brow, and Furrow Brow were then trained by having the subject hold the appropriate face for the required amount of time. Other options for facial movement, such as Look Left/Right and Wink Left/Right, could not be trained because they were not offered in the Control Panel. Once all the initial training was completed, the virtual robot was observed to see if its expressions accurately matched the ones being made by the subject. Then, if needed, the level of sensitivity was altered to improve the robot's accuracy in mimicking faces. It was generally necessary to repeat this training each time the headset was set up.

8.2 EmoKey and Scratch 1.4

Using the program EmoKey, certain keystrokes were assigned to certain facial expressions. This was done by clicking "Add Rule", typing a certain letter under "send specific keystroke", marking "Hold Key", and changing the trigger delay time to 0 ms. Next, the Target Application was set to "send to a particular application window" and the cursor was dragged to Scratch 1.4. An individual expression was chosen to trigger the keystrokes by clicking "add condition" and choosing the desired expression. (Figure 9) Using Scratch 1.4, a simple program was developed that assigned movements to the sprite when the corresponding key was triggered. (Figure 10) Wearing the headset, the subject made the facial expressions in order to move the sprite. When the sprite began moving randomly as the subject moved her face in an attempt to control it, the sensitivity of the facial expressions was altered within the Control Panel. The Limit was also altered to control the level of sensitivity at which an expression would trigger a response.

Figure 9 - Setting up EmoKey to control the sprite
Figure 10 - Scratch 1.4 program written to control the Finch
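The EmoKey rule logic described above can be sketched as a simple threshold check: a rule fires its keystroke when the chosen expression's detected intensity reaches the Limit. The field names below are illustrative, not Emotiv's actual API:

```python
# Sketch of EmoKey-style rules: each rule fires its keystroke when the
# chosen expression's intensity crosses the Limit threshold. Names are
# illustrative; intensities would really come from the headset.

def triggered_keys(rules, intensities):
    """rules: list of (expression, limit, key) tuples.
    intensities: dict mapping expression name to detection strength
    in [0.0, 1.0] as reported by the headset."""
    keys = []
    for expression, limit, key in rules:
        if intensities.get(expression, 0.0) >= limit:
            keys.append(key)
    return keys

rules = [("smirk_left", 0.5, "a"),
         ("smile", 0.5, "w"),
         ("raise_brow", 0.5, "d")]

# A strong left smirk also weakly registers as a smile, but below the
# Limit, so only "a" is sent. Raising the Limit reduces this cross-talk.
keys = triggered_keys(rules, {"smirk_left": 0.9, "smile": 0.3})
```

This also shows the trade-off the experiment ran into: a low Limit makes unintentional expressions fire keystrokes, while a high Limit makes deliberate expressions harder to register.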

8.3 Scratch 2.0 and the Finch robot

In order to control the Finch robot, BirdBrainRobotServer and Scratch 2.0 were downloaded. Scratch 1.4 was not utilized for this part of the experiment because the Finch was not able to be run in that version. For the first trial with the Finch, EmoKey was opened and, using the same methods as before, all ten facial expressions were assigned to individual keystrokes. Then, in Scratch 2.0, a program was written that selected certain keystrokes to trigger the movement of the Finch. (Figure 11)


Figure 11 - Program written for Trial 1 in Scratch 2.0 to control the Finch robot

The subject wore the headset and used the facial expressions to move the Finch. The same settings that were altered when controlling the sprite were changed in this experiment. This was done to give the subject more control over the Finch's movement. In the second trial, with the subject still wearing the headset, EmoKey was used to assign only three facial expressions to three keystrokes. To make the facial expressions more distinct, the sensitivity was turned all the way down for all but the three desired expressions. Then, with a simple program designed in Scratch 2.0, the headset was used to detect the subject's changing expressions in order to move the Finch forward, right, and in a circle while activating an LED light.

Figure 12 - Program written for Trial 2 in Scratch 2.0 to control the Finch robot

8.4 Training and Using the Cognitiv Suite

The Cognitiv Suite in the Emotiv Control Panel was also used. When training this feature, it took five to six attempts to bring the skill level to a sufficient percentage. In the training process, the subject wearing the headset would think the appropriate command (i.e., "Pull") and continue thinking it for the given amount of time (eight seconds). In order to keep thinking the command, the subject pantomimed what was being thought. The thoughts Push and Pull were trained. Then, in EmoKey, two keystrokes were assigned to these thoughts and the Limit was set close to zero. In Scratch 2.0, a simple program was written to give the Finch two commands to follow when the corresponding keystrokes were triggered.

8.5 Python and the Finch robot

Once moving the Finch through Scratch 1.4 and 2.0 was accomplished, a more involved computer program was written using Python. Similar to the work done in Scratch, the Finch had to be programmed to move only when a certain key was pressed. To begin, certain keystrokes were assigned to facial expressions that the subject would make while wearing the headset. All of the EmoKey settings were maintained from section 8.3, other than the Limit setting, which was set close to 0.5 for each command. The sensitivity was also set to a low level in the Control Panel. In the finalized program, a Left Smirk triggered "a" and the Finch followed a square path. A Smile triggered "w", making the Finch flash the colors of the rainbow in order. Raised Brows triggered "d", making the Finch move forward and backward twice. (Figure 13)

Figure 13 - Part of the Python program for the Finch
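The keystroke dispatch just described can be sketched as a lookup table from keys to command routines. This is a dry-run reconstruction, not the experiment's actual program: commands append named steps to a log instead of calling real Finch methods.

```python
# Dry-run sketch of the keystroke dispatch in the Python Finch program:
# "a" (Left Smirk) traces a square, "w" (Smile) cycles the beak through
# rainbow colors, "d" (Raised Brows) drives forward and back twice.
# Step names are illustrative stand-ins for real Finch motor/LED calls.

def square(log):
    for _ in range(4):
        log.append("forward")
        log.append("turn_right")

def rainbow(log):
    for color in ("red", "orange", "yellow", "green", "blue", "violet"):
        log.append("led_" + color)

def forward_back_twice(log):
    for _ in range(2):
        log.append("forward")
        log.append("backward")

COMMANDS = {"a": square, "w": rainbow, "d": forward_back_twice}

def on_key(key, log):
    if key in COMMANDS:          # ignore keystrokes with no mapping
        COMMANDS[key](log)

log = []
on_key("a", log)                 # Left Smirk detected → trace a square
```

Keeping the mapping in one dictionary makes it easy to reassign expressions to commands, which mirrors how the EmoKey rules were repeatedly reconfigured between trials.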

9.0 Results

In the first part of the experiment done with the neuroheadset, it was found that the movement of the subject's face could move the sprite in Scratch 1.4. Control was very unstable, as the sprite would move around the screen sporadically due to unintentional movements of the face; however, it was confirmed that the headset was actually sending commands to the program by turning the headset on and off. Because the headset was so sensitive, an attempt was made to alter the settings as mentioned in the methods. When moving the sprite, the Limit was set close to one in EmoKey and the sensitivity was set almost to its maximum in the Control Panel. While this did give the subject a little more control over the sprite, predicting the exact movement of the sprite was still difficult.

In the second part of the experiment, we attempted to use the headset to control a Finch robot. Once Scratch 2.0 was downloaded, the program was written, and the Finch was set up, it appeared that there was no connection between the Emotiv EPOC and Scratch. This barrier was overcome, however, by altering the settings in EmoKey. The Target Application setting was changed from "send to a particular application" to "send to application in focus". This enabled us to move the Finch by means of the different facial expressions. The same sensitivity issues experienced with the sprite affected the Finch as well. Each time training was done in the Expressiv Suite in the Control Panel and the subject made a certain facial expression, the virtual robot would rarely mimic the face accurately. Though the sensitivity was adjusted, there were still issues, because it was difficult for the subject to move only the eyebrows or mouth without moving other parts of the face. Due to this, the wrong keystroke was occasionally triggered, or it took three to four attempts for the Finch to respond. For example, the Clench and Smile commands were often confused by the EPOC headset, and some faces, such as winking, were not detected by the headset at all. Looking left and looking right were not reliable commands either. With the challenges this posed, it was difficult to find exact settings for the sensitivity and Limit. Despite these problems, there was success in having the Finch follow commands by facial expressions through EmoKey and Scratch 2.0, proving that it could be used for controlling basic software. When the subject Smirked Left, the Finch would move to the left. (Figure 14) When the subject Raised Brows, the Finch would move forward. (Figure 15) When the subject Smiled, the Finch came to a stop. (Figure 16)

Figure 14 - Subject Smirking Left
Figure 15 - Subject Raising Brows
Figure 16 - Subject Smiling

When the third part of the experiment began, it was difficult for the subject to train the thoughts well. It was challenging to consciously think only one thought, and training the program five or six times to achieve a high training percentage also proved difficult. Though the Finch sometimes moved in the wrong direction, the subject ultimately did control the robot successfully. It was found that without pantomiming, the Cognitiv Suite rarely worked. Therefore, because pantomiming seemed more helpful, it appeared that the headset might simply be reading the movement of the body. However, this did not align with Emotiv's claims. It is also possible that the electrical signals being sent to move the arms were controlling the headset. Furthermore, when controlling the Finch cognitively, it was difficult to predict which keystroke would be triggered, due to the difficulty of activating only one thought command. For the EmoKey settings, the Limit was set to a low level, and the sensitivity in the Control Panel was set to a low setting. Even with some difficulties, a few trials did provide positive results from the cognitive commands. When the subject thought "Push" and pushed the hands forward, the Finch would move forward. (Figure 17) When the subject thought "Pull" and pulled the left arm back, the Finch would move backwards. (Figure 18) The subject used these particular hand motions because those were the motions made during training.

Figure 17 - Subject thinking "push" and pushing hands forward

Figure 18 - Subject thinking "pull" and pulling left arm back

In the fourth part of the experiment, when the program written with Python was used, we were able to control the Finch successfully using the Emotiv EPOC neuroheadset. Though the sensitivity was set to a low setting, it was found that when the subject Smirked Left, the Smile was also triggered. This proved to be a complication when sending keystroke commands to the program. Occasionally the lights would be activated instead of the Finch beginning to move in a square. Furthermore, because the program was not very complicated, two commands could not be executed at one time, and one command could not override another that had already been triggered. Instead, the command that was triggered first had to be fully completed before another could be activated. Due to this, there appeared to be a lag in the commands being received and executed. There was also a minor flaw in the connection between EmoKey and the Python program. Sometimes EmoKey would show that it had registered the subject's expression; however, there was no indication that the trigger had been recognized by the written program. This may have been due to the program itself. Regardless, it was ultimately successful, and the Finch followed the commands triggered by the subject's facial expressions. When the subject smirked left, the Finch moved in a square. When the subject raised brows, the Finch moved forward and backward twice. When the subject smiled, the Finch's lights flashed. (Figures 14, 15, 16)
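The lag observed above follows directly from the program's structure: each command runs to completion before the next keystroke is handled, so a keystroke arriving mid-command must wait. A minimal sketch of that blocking behavior (names illustrative, not the experiment's actual code):

```python
from collections import deque

# Sketch of why commands appeared to lag: each command runs fully before
# the next queued keystroke is handled, so nothing can interrupt a command
# already in progress. Step names are illustrative.

def run(pending_keys, commands):
    """pending_keys: keystrokes in arrival order.
    commands: dict mapping a key to its list of steps.
    Returns the steps in the order they actually execute."""
    queue = deque(pending_keys)
    executed = []
    while queue:
        key = queue.popleft()
        executed.extend(commands[key])   # runs to completion, uninterruptible
    return executed

commands = {"a": ["square_side"] * 4, "w": ["flash_lights"]}

# "w" arrives while the square is being traced, but its step only runs
# after all four sides finish -- the lag the subject experienced.
order = run(["a", "w"], commands)
```

An event-driven design that checks for new keystrokes between steps could interleave or cancel commands, which is one way the more complicated programs suggested in the conclusion might improve responsiveness.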

10.0 Conclusion

As a result of the experiment, it was found that the Emotiv EPOC neuroheadset was successful in connecting to and controlling the Finch robot. With some flaws, the device recognized the subject's movements and translated them into keystrokes that were then successfully sent as commands to both the Scratch and Python programs. Because this device is much less expensive than other existing BCIs, this suggests that higher-quality devices may prove even more useful for current issues and research. Given the current price of the EPOC, about $400, the amount of research able to be completed and the extent to which we were able to use the device was remarkable. The fact that a device exists at such a low cost is also promising for current research and for the medical world. The mission of Emotiv's technology is to provide a way for the brain to be further studied and better understood. [21] The company also claims that while the Affectiv feature in the Control Panel should not be used for medical purposes, other features such as Expressiv, Cognitiv, and TestBench could be reliably used in the medical field. [13] However, during the experimentation, many flaws were discovered with the headset that could hinder its capabilities in the medical field. It was found that the setup was often difficult to complete well, as the signal would not always be strong enough. The subject did have a significant amount of hair that may have impaired this process. At times the subject was able to get all the electrodes to a strong signal, but at other times was not able to accomplish this. This led to questions about the quality of connectivity in the electrodes. The setup was also time consuming. Though there was no other device to which we could compare the time spent setting up the EPOC, the level of difficulty of training the headset and the amount of time needed made the use of the device challenging.
The company also claims that it should not be difficult to control an external device using the headset. [13] However, during research with the device, the level of control over the Finch robot was often found to be poor. The sensitivity was difficult to understand and the headset would often trigger incorrect keystrokes, triggering the wrong commands. Some of the issues faced during the research phase with the device may have been due to simplicity of the programs written. It is possible that with more complicated programs we would have been able to manipulate the functions of the headset and provide the subject more control. With the programs used, the research often proved the headset to be functional but unpredictable and inconvenient to use. Brain­computer interfaces are currently being thoroughly researched to determine the usefulness for patients suffering from paralysis due to diseases such as ALS. If the functions of the Emotiv EPOC were of higher quality, this device could be very useful for suffering ALS patients today. Because ALS has no current cure, it is valuable to find ways to provide patients with some level of control without needing to move their bodies. Ways to access internet, control a wheelchair, and communicate without the use of voluntary muscles are valuable advancements that are currently being improved by means of BCIs. However, this experiment revealed that the Emotiv EPOC neuroheadset is not medically functional or reliable. Using the headset as a therapeutic device for ALS patients may not be valuable due to the amount of time it takes to setup, the


unpredictable sensitivity, and the lack of clarity about what the technology was actually doing when in use.

The struggles faced with the EPOC may also stem from the headset's noninvasive design. There have been many successful tests in which a patient controlled an external device; however, these were most often completed with invasive technology that provides direct contact with the brain. With noninvasive technology, the skull weakens the signals detected by EEG. Furthermore, because the subject had full functionality of the body, the level of control over the mind may be weaker; those suffering from paralysis may have more ability to learn to control specific thoughts through therapy. With this headset, the subject often found it challenging to move only one muscle or to focus on only one thought. Even so, the value of ALS patients spending therapy time with the Emotiv EPOC is questionable, given how difficult the headset was to understand and its ultimately imperfect functionality.

Despite some flaws and unpredictable behavior, the subject was able to control the Finch using the headset and Scratch 2.0 in the final trial using facial expressions. This shows promising potential for the technology. Therefore, while the headset may not currently be reliable and useful in the medical world, with substantial improvement the basic functionality and theory behind the headset could be very beneficial.

References

[1] ALS Association, http://www.alsa.org/als-care/resources/publications-videos/factsheets/brain-computer-interface.html?referrer=https://www.google.com/
[2] YouTube, https://www.youtube.com/watch?v=1ovv6lmPHSI
[3] http://cognitrn.psych.indiana.edu/busey/eegseminar/pdfs/EEGPrimerCh1.pdf
[4] Olejniczak, P., "Neurophysiologic Basis of EEG," J Clin Neurophysiol 2006;23:186-189.
[5] Shih, Jerry J. et al., "Brain-Computer Interfaces in Medicine," Mayo Clinic Proceedings, Volume 87, Issue 3, 268-279.
[6] Nicolas-Alonso, L. F., & Gomez-Gil, J. (2012). "Brain Computer Interfaces, a Review." Sensors (Basel, Switzerland), 12(2), 1211-1279. doi:10.3390/s120201211
[7] Electrophysiology, http://www.slideshare.net/ClausMathiesen/electrophysiologyevisited
[8] Parasuraman, R. (2007). Neuroergonomics: The Brain at Work. Oxford: Oxford University Press.
[9] Wolpaw, J., et al., "Brain-Computer Interface Technology: A Review of the First International Meeting," IEEE Transactions on Rehabilitation Engineering, vol. 8, no. 2, June 2000.


[10] Scientific American, http://www.rle.mit.edu/touchlab/news/documents/ScinetificAmerican_2002.pdf
[11] JWT Intelligence, http://www.jwtintelligence.com/2014/03/qa-tan-le-founder-ceo-emotiv-lifesciences/comment-page-1/#axzz3gMdSsVzj
[12] Abou-Khalil, B.; Musilus, K.E., Atlas of EEG & Seizure Semiology. Elsevier, 2006.
[13] Emotiv FAQ, https://emotiv.com/faq.php
[14] CNET, http://www.cnet.com/news/in-paralysis-finding-freedom-via-brain-wave-tech/
[15] ALS Association, http://www.alsa.org/about-als/facts-you-should-know.html?referrer=https://www.google.com/
[16] ALS Association, http://www.alsa.org/about-als/what-is-als.html
[17] NINDS, http://www.ninds.nih.gov/disorders/amyotrophiclateralsclerosis/ALS.htm
[18] Vokorokos, L., Ádám, N. and Madoš, B., "Non-Invasive Brain Imaging Technique for Playing Chess with Brain-Computer Interface," International Journal of Computer and Information Technology (ISSN: 2279-0764), Volume 03, Issue 05, September 2014.
[19] Scratch, https://scratch.mit.edu/about/
[20] The Finch, http://www.finchrobot.com/
[21] Emotiv, https://emotiv.com/company.php
[22] Michael Bensch, Ahmed A. Karim, Jürgen Mellinger, et al., "Nessi: An EEG-Controlled Web Browser for Severely Paralyzed Patients," Computational Intelligence and Neuroscience, vol. 2007, Article ID 71863, 5 pages, 2007. doi:10.1155/2007/71863
[23] Electroencephalography, http://www.bem.fi/book/13/13.htm
[24] Johns Hopkins Medicine, http://www.hopkinsmedicine.org/healthlibrary/test_procedures/neurological/electroencephalogram_eeg_92,P07655/
