
NeuroExplore - Visualizing Brain Patterns

Daniel José dos Santos Rocha
[email protected]

Instituto Superior Técnico, Lisboa, Portugal

October 2017

Abstract

The objective of this project is the creation of a Physiological Computing Information Visualization (InfoVis) interface through which, interactively, users can visually decipher one's intricate emotions and complex mind-state.

To this end, we cooperated closely with Neuroscience experts from Instituto de Biofísica e Engenharia Biomédica (IBEB) throughout our work. Consequently, we assembled a Brain Computer Interface (BCI), from a Bitalino do-it-yourself hardware kit, for retrospective and real-time biosignals visualization alike. The resulting wearable biosensor was successfully deployed in an extensive database (DB) acquisition process, consisting of activities with concrete, studied brain-pattern correlations.

The magnitude of this big-data InfoVis foundation, together with its at times saturated physical signal, warranted the development of a data-processing pipeline. Indeed, our solution - entitled NeuroExplore - converts and presents this large number of digitized, raw biosignal items as more recognizable visual idioms.

The system interaction was intentionally designed to augment users' discoveries and reasoning regarding visually recognizable metrics, as well as subsequently derived trends, outliers and other brain patterns. Strengthening this intent, we adopted an iterative development process in which, recurrently, expert needs and user suggestions were taken as orienting guidelines.

This all culminated in a final version we deemed worthy of extensive functional and utility user testing and expert validation. In the end, our project achieved both excellent user usability scores and expert interest, with some experts already relying on our solution for their own research.

Keywords: InfoVis, Physiological Computing, Affective Computing, Brain-Computer Interface, biosignals, qEEG

1. Introduction

Despite the advent of Computer Science, historically, Humanity has consistently relied on a more organic form of processing power: our brain! This computer's intricate psycho-physiological inner workings seem ever elusive. Contemporaneously, the quest to study and better understand our mind and its processes is the purpose [11] of the interdisciplinary scientific field titled Cognitive Science. In this project, we are predominantly interested in two of its sub-disciplines: Computer Science and Neuroscience - the scientific study of the nervous system. Specifically, we are concerned with:

1. The relation between measurable user Physiological data, typically from the Central Nervous System (CNS), and specific brain behaviors, such as emotions. Neuroscientists have documented and correlated such relations, typically via the usage of an Electroencephalogram (EEG) and its spectral analysis - Quantitative Electroencephalography (qEEG).

2. How to interactively, intuitively and graphically present an InfoVis which visually amplifies user cognition and perception of the subsequent, derived Psycho-Physiological data.

This insight is particularly useful for a diverse number of reasons, ranging from Health (epilepsy diagnosis [2]) to Neuromarketing [17] and Entertainment [9]. Furthermore, contemporaneously, such personal user information is extremely valuable for Big Data User Behavior Analytics [19].

1.1. Objectives

The goal of this work is the development of an interactive visualization so that users better understand one's mental state and emotions via Psycho-Physiological input data.

Thus, an InfoVis-assisted analysis, search and comparison of current and previous results could potentially identify new trends, outliers and features. In order to achieve our goal we must:

1. Research and choose a methodology to acquire a Physiological data collection for further analysis.


2. Research and develop algorithms to derive Psycho-Physiological information from the original raw data.

3. Develop an interactive visual interface to examine such data through the display of:

(a) Real-Time Information

(b) Retrospective Information

4. Validate our work through both user and expert feedback and its statistical analysis.

1.2. Contributions

Throughout our work, cooperation and mentorship from IBEB faculty members, actively participating as Neuroscience experts, was paramount to the understanding of Physiological Computing concepts, necessities and applications. In other words, IBEB contributions were indispensable in our journey to visually present the uncensored and inherently complex insight of our mental inner workings.

Correspondingly, IBEB colleagues showed interest in utilizing our project. Specifically, our hardware and software implementation has been successfully used in hands-on demos, live data gathering, video pitches, start-up accelerator events, scientific events and other projects.

2. Background

Due to the multidisciplinary nature of our work, we present a brief background on related IT fields of study whose concepts are of underlying interest for our project. Firstly, the field of Sentiment Analysis, also known as opinion mining, typically researches term frequencies and their presence in textual data input for psychological data extraction [10]. Affective Computing is concerned with the theory and construction of machines which can recognize, interpret and process human emotional states. It can encompass Sentiment Analysis and combine physiological data or - typically - facial video content analysis for feature distillation [15]. Physiological Computing, as implied, is interested in human physiological signal analysis, usually focusing on the nervous and cardiac systems, but not exclusively [4]. Finally, BCI - aiming at improved Human Computer Interaction (HCI) - focuses on the brain's physiological output, typically via artificial-intelligence-based systems that can recognize a certain set of patterns in brain EEG [8].

3. Related Work

Studying the best way to visualize and interact with our complex data can help improve our project.

3.1. Textual Sentiment Analysis

Torkildson et al. [18] used machine learning techniques for sentiment analysis, emotion classification and corresponding visualization purposes. Specifically, emotional classifiers are defined and a collaborative visualization is proposed after analyzing Twitter posts' data preceding, during and after the 2010 Gulf Oil Spill Crisis.

Categorical Attributes of Emotion were defined based on Ekman's six basic emotions: joy, anger, fear, sadness, surprise and disgust. Sentiment attributes - negative, neutral and positive - were designated and intended to be mutually exclusive. Afterward, the purpose of the intended visualization was set: "support the analysis of the emotional impact of events". Finally, the authors decided to graphically present the information using a stacked area chart. This idiom allows "easy comparison of values, facilitated by the colored bands. The top 25% of values, the time instances with the highest emotion frequency, have the highest color saturation. The coloring makes these peaks easily distinguishable."

Ultimately, this paper provides an interactive VIS for emotional analysis which allows tasks such as "What emotional response did the public manifest as a consequence of the POTUS speech?" to be accomplished.

Kamvar and Harris [7] created an emotional search engine together with a web-based visualization. This was achieved by extracting sentences from social media that include the words "I feel" or "I am feeling", as well as the corresponding author's sex, age, location and time of occurrence.

Combining these metrics allows for several distinct visualizations, such as: bar charts with an age breakdown of feelings from people in the last few hours; world maps characterized by a geographic breakdown of feelings from people in the last few hours; line charts relating sentiments over time, such as stress and relaxation over the week, or love and loneliness in the week of Valentine's Day; and stacked area charts relating a specific feeling's frequency over the aging of a human.

These results were based on an immense variety of different emotions and sentiments, made possible by the applied text extraction technique. Combining these techniques with physiological input data could potentially yield even more accurate results. The overall result exposes the diverse capabilities of interfaces that allow for item-level exploration of sentiment data.

3.2. Video Content Affective Analysis

Hupont et al. [6] created an emotional visualization - EMOTRACKER - employing not only facial analysis for emotion recognition purposes but also


eye-tracking for gaze analysis.

After noting several industries' increasing desire for objective measurements of engagement with content, sparked by brands increasingly striving to build emotional connections with consumers, the authors note there is a lack of tools for achieving these aims. "In fact, the important question of how to efficiently visualize the extracted affective information to make it useful for content designers has been scarcely studied".

The resulting VIS was composed of two modes: an "emotional heat map", with selectable emotional layers, and an "emotional saccade map", with a dynamic representation that shows the path formed by the user's fixation points (points the user had been looking at for a minimum configurable time, in milliseconds). In both modes, users could also see their current emotional state via an emoticon, as well as Ekman's six emotions plus neutral.

Finally, it would be of scientific interest to analyze the impact on this VIS's accuracy if an EEG were added as input, as done by Soleymani [16].

With millions of faces analyzed to date and one-third of the Global Fortune 100 companies using its technology, MIT Media Lab offspring Affectiva Inc. has developed a video-analysis emotion detection and visualization API - Affdex (http://www.affectiva.com/solutions/affdex/).

Arguing that emotions are the number-one influencer of attention, perception, memory, human behavior and decision making, the API is capable of detecting Ekman's six emotions plus neutral, 15 nuanced facial expressions, and even heart rate, by evaluating color changes in a person's face, which pulse each time the heart beats.

The resulting visualization consists of a retrospective line chart, generated after analyzing via webcam the user's reaction to an ad. This line chart compares different age groups' reactions, as well as the user's, along the ad's duration. The measured reactions are: Surprise; Smile; Concentration; Dislike; Valence; Attention; and Expressiveness. This demo is publicly available to test online.

Emotient Inc. was a startup focused on video emotional analysis and its consequent visualization. Recently acquired by Apple Inc., the company's main product was an API called Emotient Analytics (http://www.emotient.com/products).

This software provided facial expression detection and frame-by-frame measurement of seven key emotions, as well as intention, engagement and positive or negative consumer sentiment. All of this was then incorporated in a visualization, in which we can see line charts for Emotional Engagement, Attention and Sentiment, as well as the average of each of these metrics.


Furthermore, a stacked area chart displayed each emotion over time, as well as pie charts for the video's average emotional engagement and participants' gender.

The insight gained from quantifying emotions allowed companies to pinpoint and fix issues, as well as improve their marketing performance. Additionally, all of this was accessible independently of platform, via a web browser.

3.3. Physiological Computing

Gavrilescu and Ungureanu [5] investigated contemporary methods to display EEG data and subsequently proposed a new VIS. After mentioning the several useful usages of EEG, such as "the assessment of a user's emotional state", the authors note that the nature of EEG signals, lacking any intrinsic visual data, causes multiple challenges regarding their graphical representation, particularly "for data spanning over frequency bands and extended durations". The current, most common ways of visually representing EEG are then dissected. Power Spectrum graphs are identified as being time-consuming and tedious to compare; this issue is particularly severe when representing raw data for a large number of electrodes and for various brainwaves. The volumetric bull's eye plot - a 2D top-down disc view of the cranium - is identified as an effective way of relating desired information for a single sample across all electrode positions. However, this idiom is unable to effectively represent complex, multivariate data concurrently, due to the difficulty to "represent multi-value data across multiple ranges and time phases in a single image".

Finally, a new VIS, aiming to "provide an intuitive means of representing the data over multiple sample phases and for all available frequency bands", is presented. It uses color spots varying in size and color according to the brainwave frequency and voltage, respectively, and glyphs varying in volume depending on the variation of the data between consecutive time phases.

Cernea et al. [3] identified emotions with the purpose of enhancing the experience of multi-touch interfaces' users. A VIS is proposed for this context and the approach is verified by conducting an EEG user study. The researchers' goal is to improve users' emotional self-awareness; the awareness of other users' emotions in collaborative and competitive scenarios; and the evaluation of touch systems and user experience through the visualization of emotions.

This was achieved using Russell's model [14], encoding a variety of emotions in terms of affective valence (pleasant or unpleasant) and arousal (excited or calm). Using EEG, facial expressions and


NeuroSky’s MindWave is a BCI headset cater-ing directly to the consumer market 8. The hard-ware consists three electrodes: a reference andground electrodes located on a ear clip and oneEEG electrode on the forehead above the eye (FP1position)9.

MindWave also supports the display of previously recorded data in the format of an InfoVis. In it, we can visualize brain-wave bands' intensities (counts) in a radar chart (left) as well as a colored bar chart (right). The raw EEG signal is displayed above the bar chart. Additionally, two circular meters display the normalized (1-100) Attention and Meditation metrics. This angular display of numbers is bad practice, as it complicates user comparison; an alternative, linear representation would fix this.

Finally, signal quality is also displayed, so the user can better identify issues such as signal saturation.

Bitalino is a low-cost toolkit to learn and prototype applications using physiological signals. It supports a wide range of sensors, although not all simultaneously, due to bandwidth limitations [18]. Namely: an Accelerometer, Photoplethysmography (PPG), Electrocardiography (ECG) and Electrodermal Activity (EDA/GSR). Additionally, a User Interface (UI) and VIS software - OpenSignals - can be installed for real-time or retrospective analysis, as well as loading of pre-recorded signals. In it we can see line charts displaying each sensor's signal.

OpenSignals' main limitations are its lack of derived metrics (the VIS shows raw data only); its absence of any post-processing, which can result in saturated signal visualization; and its connectivity, which can be somewhat lacking - frequent crashes have been reported by IBEB faculty colleagues.

3.4. Discussion

It is clear that there are several distinct approaches and fields of research analyzing and scrutinizing one's mental state and emotions. Despite their accompanying User Interfaces (UI), a lacking amount of InfoVis-specific Physiological Computing studies was, regrettably, registered. With the exception of Gavrilescu's [5] work, which sadly featured unjustified 3D interaction, we could not find any other EEG InfoVis.

Here lies our opportunity to expand the borders of scientific knowledge. Further motivating our work, we propose the development of an interactable and intuitive Psycho-Physiological visual medium, through which users will be able to better understand their emotional and mental inner workings.

Figure 1: BrainBIT BCI final prototype used in this project.

4. Proposed Solution

This chapter portrays all the work which led to the most recent version of our proposed InfoVis solution.

4.1. Biosignals Input Solution

Efficient Neuro-Physiological acquisition is key for any biosignal database upon which we can develop an InfoVis, particularly so in real-time scenarios. This is comprehensible, as the database itself is the foundation of any InfoVis. During initial IBEB meetings, we learned about the opportunity to use a novel BCI as input to a Physiological Computing InfoVis database.

Bitalino supports many distinct sensors, yet its analog ports are limited. See this and other features, such as weight, in Table . We equipped ours (Table ) with an accelerometer, four EEG electrodes and one PPG ear-clip sensor. The rationale behind this multi-sensor approach is straightforward. The accelerometer is used to detect user movement and the resultant signal saturation. Through EEG analysis, we can extract CNS information such as Emotional Valence. The PPG allows us to estimate the user's heart rate and thus better understand their arousal levels. Technical specifications of the components can be found at Bitalino's website (http://bitalino.com/en/), including hardware data-sheets (bitalino.com/datasheets). The next step was the assembly of the Bitalino. The pre-ordered Plugged Kit consists of electronics (cables, sensors, the processing unit and other blocks) but no supporting apparatus, as seen in Figure . As such, a considerable amount of thought went into the ergonomics and the resistance to wear and transport of our device. Our solution consists of two velcro strips glued to each other, with the hardware mostly between them.

The only visible electronic components on the inside of the strip are the electrodes. The PPG sensor, which hangs on the ear side, the Power Block with its indicator LED and power switch, and the battery are visible on the outer side of the strip. All of this is detailed and can be seen in Figure 2.



Figure 2: BrainBIT prototype outside and inside headband views, detailing: (1) the Power Block; (2) Battery; (3) PPG; (4) Electrodes.

Finally, our headband improved usability substantially over its original components. Its wearable nature allowed it to be used without cables or any other support. Ergonomically, the soft side of the velcro was placed on the inner side of the headband, so that our prototype was comfortable and fitted users regardless of head size.

This prototype's uniqueness earned it its own name: BrainBIT. The final result can be seen in Figure 1.

4.2. Architecture

The system architecture can be divided into three main pillars: the hardware biosignal BCI; the users, whose perception we are attempting to improve; and the software, responsible for filtering, processing and the derived-data InfoVis.

The users' Physiological (EEG and PPG) and Physical (accelerometer) input is read by BrainBIT, which in turn listens for requests sent over Bluetooth by ClientBIT (a JavaScript/HTML front-end). A raw, discrete-domain, digitized version of the user's biosignals is then sent to ClientBIT, which forwards it to ServerBIT (the Python back-end) for post-processing. Communication uses JSON-formatted, preprocessed data. The user interacts with the InfoVis via coordinate input, typically a mouse, while watching current or previously recorded datasets.
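
For illustration only, a minimal sketch of what one such JSON exchange could look like; the field names and values below are our own assumptions for clarity, not the project's actual wire format.

```python
import json

# Hypothetical ClientBIT -> ServerBIT message: one window of raw,
# digitized BrainBIT samples (all field names are illustrative).
raw_message = {
    "device": "BrainBIT",
    "timestamp": 1506816000.0,     # acquisition time (Unix seconds)
    "sampling_rate": 100,          # Hz
    "channels": {
        "EEG": [512, 510, 498],    # truncated sample arrays
        "PPG": [301, 305, 310],
        "ACC": [12, 11, 13],
    },
}

payload = json.dumps(raw_message)  # serialized for the Bluetooth/HTTP hop
decoded = json.loads(payload)      # ServerBIT side: back to a dict
assert decoded["channels"]["EEG"][0] == 512
```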

4.3. Development Process

With our hardware means of raw biosignal data acquisition established, we now present all stages of development which led to our final, interactive InfoVis solution: NeuroExplore.

Biosignals Dataset Development: The foundation for any InfoVis system is the data whose cognition, perception and insight we are attempting to revolutionize; that is, to transform and represent data in a form that allows human interaction, such as exploration, with newfound knowledge as a result. This is what differentiates an InfoVis from a more traditional and elementary Graphical User Interface.

Figure 3: Physiological Computing database acquisition featuring an early InfoVis prototype.

As such, we devised a set of four tasks - meditation, puzzle-solving, music-listening and video-watching - during which three subjects wore BrainBIT. This database-building process included twenty different sessions (each comprising the four tasks) for each user. After this process ended, we counted 250 Comma-Separated Value (CSV) files comprising 7.97 Gigabytes. Given the nature of biosignals (susceptible to saturation), we then decided to filter our data.
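
As a minimal illustration of how such a session corpus could be traversed before filtering, assuming pandas, a flat directory of session CSVs, and a 100 Hz sampling rate (the layout and rate are assumptions, not the project's actual storage scheme):

```python
import glob

import pandas as pd

SAMPLING_RATE = 100  # Hz, assumed for illustration

total_seconds = 0.0
for path in sorted(glob.glob("database/*.csv")):   # hypothetical layout
    session = pd.read_csv(path)                    # one task session per file
    total_seconds += len(session) / SAMPLING_RATE  # each row is one raw sample

print(f"{total_seconds / 3600:.1f} hours recorded")
```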

Data Preprocessing: The voluminous nature of the collected, enigmatic, raw biosensor data, added to its susceptibility to real-time recording condition deterioration, demanded preprocessing for feature extraction and filtering purposes. As such, we started by building a data-cleaning Python script, which we used for the retrospective database and extended into the real-time back-end preprocessing. NeuroExplore data preprocessing is implemented by the Python server, ServerBIT. Receiving as input the unprocessed biosensor signal, this back-end component is tasked with the following successive data transformations: (1) filtering signal saturation; (2) calculating the EEG Power Spectrum; (3) PPG blood volume pulse peak-finding.
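
The following is a minimal sketch of these three transformations using standard scipy routines; it is an illustrative stand-in under assumed parameters (sampling rate, ADC range, peak spacing), not the actual ServerBIT code.

```python
import numpy as np
from scipy.signal import find_peaks, welch

FS = 100        # assumed sampling rate (Hz)
ADC_MAX = 1023  # assumed 10-bit ADC rail

def filter_saturation(eeg, rail_margin=2):
    """(1) Mask samples pinned at the ADC rails, i.e. saturated signal."""
    saturated = (eeg <= rail_margin) | (eeg >= ADC_MAX - rail_margin)
    return np.where(saturated, np.nan, eeg)

def eeg_power_spectrum(eeg):
    """(2) Power spectrum of one EEG window via Welch's method."""
    clean = eeg[~np.isnan(eeg)]
    freqs, power = welch(clean, fs=FS, nperseg=min(256, len(clean)))
    return freqs, power

def ppg_peaks(ppg):
    """(3) Blood-volume-pulse peak finding; peak spacing yields heart rate."""
    peaks, _ = find_peaks(ppg, distance=FS * 0.4)  # >= 0.4 s apart (< 150 bpm)
    return peaks
```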

Derived Data: NeuroExplore relies on attributes derived from the ServerBIT output of preprocessed data. The front-end (ClientBIT) derives each of the following metrics: Theta, Alpha and Beta brain oscillations; Heart rate; Emotional Valence; Engagement; and Meditation. These last three metrics in particular - Emotional Valence, Engagement and Meditation - are determined exclusively for each second of non-saturated communication. This was intended, as it is futile to derive metrics from a saturated signal. In these missing-value situations, we opted to assign the previous, adequately derived metric.
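
As an illustrative sketch of the band split and the hold-last-value rule described above, reusing the freqs/power arrays from the previous sketch: the band limits and the Beta/(Alpha+Theta) engagement index are common choices from the qEEG literature, assumed here for illustration rather than taken from the paper's own (unpublished in this abstract) formulas.

```python
import numpy as np

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # Hz, assumed

def band_powers(freqs, power):
    """Integrate the Welch spectrum over each conventional brain-wave band."""
    return {name: np.trapz(power[(freqs >= lo) & (freqs < hi)],
                           freqs[(freqs >= lo) & (freqs < hi)])
            for name, (lo, hi) in BANDS.items()}

def engagement(bp):
    """A common engagement index from the literature (assumed formulation)."""
    return bp["beta"] / (bp["alpha"] + bp["theta"])

def hold_last_value(per_second_metrics):
    """Replace metrics of saturated seconds (None) with the last valid value."""
    filled, last = [], None
    for metric in per_second_metrics:
        last = metric if metric is not None else last
        filled.append(last)
    return filled
```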

InfoVis Development: Iterative development stages led to the current version of NeuroExplore, a Physiological Computing InfoVis through which users can better decipher their brain and body's Psycho-Physiological state.


NeuroExplore handles specific data structures to achieve the fast action/feedback loop required by dynamic queries, through vision and biosignals alike. These components are integrated into a coherent framework that simplifies the management of sophisticated physiological data structures through the BrainBIT BCI or our collected database, the ServerBIT Python back-end, and the InfoVis JavaScript implementation, ClientBIT.

Figure 4: NeuroExplore InfoVis final-version screenshot.

4.4. Discussion

Throughout this section, we have detailed the iterative development process which led to the final, most current implementation of NeuroExplore.

Through the development of this Physiological Computing InfoVis - intended to improve users' ability to decipher their mental state - we reached several verdicts:

1. We acquired and preprocessed a massive (5.81 GB) biosignals database, featuring approximately 18 hours of recordings spanning 250 CSV files, divided into three subjects and four categories associated with specific, measurable and studied mental patterns.

2. BrainBIT's Physiological Computing database acquisition capabilities have been proven. It presented susceptibility to outside electromagnetic noise - visible in NeuroExplore - which needs to be managed. Additionally, the PPG sensor output weak signals in some users; we believe this is related to lighting conditions which, if correctly handled, let the sensor work as intended.

3. Our Bitalino sensor specs feature three channels which - thoughtfully - could be reassigned. The accelerometer's anticipated filtering-enabling insight proved fruitless, as EEG signal saturation filtering encompassed it. Currently, we have no use for this output, but have decided to keep it as part of our project to avoid risking future developments.

4. There is extensive neuroscientific research regarding the EEG derivation of psycho-physiological metrics. We have meticulously detailed our proposed extraction algorithms and justified them accordingly.

5. A final version of NeuroExplore is proposed, and its features are widely dissected: from the initial data output, through preprocessing, back-end mechanisms and data derivation alike, to the front-end interactive visual display.

Through the iterative development of this project, user - and often expert - feedback was weighed and incorporated into each succeeding version, thus increasing our goal's success prospects by repeatedly reality-checking countless preconceived notions.

In sum, NeuroExplore's current prototype and its varied features warrant testing with a larger pool of users and thorough validation, via usability and utility testing alike.

5. Evaluation

Validating an InfoVis is an indispensable external indication of its features' successful implementation. We will assess its usability and functionality by testing whether our design has succeeded in its purpose [12].

In order to validate our solution, we decided to tackle this challenge with two distinct approaches: Usability Testing and Case Studies. The former was undertaken by users without previous knowledge of the system; this was intentional, in order to evaluate the system in regards to interactivity and usability. Testers were asked to complete a set of tasks and a quiz while under observation, after which a discussion with overall feedback and personal suggestions took place. The latter corresponds to two experts in biomedical engineering interacting with the InfoVis while providing functional feedback.

5.1. Usability Tests

Upon many iterations of development, a final version of our VIS must be subjected to a summative evaluation. This is intended to determine our prototype's usability success, as well as assuring a means of maintaining standards [13].

In this section we detail how the user testing was conducted, depicting each user task and its derived quantitative data: the amount of time necessary for the user to successfully complete each task, and the number of errors made. Additionally, upon task completion but before feedback took place, participants were asked to fill in a SUS, "a highly robust and versatile tool for usability professionals" [1].


Consequently, we could establish a concrete, comparable [0-100] usability score.
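
For reference, Brooke's standard SUS scoring [1] maps ten 1-5 Likert responses to a 0-100 score: each odd item contributes (response - 1), each even item contributes (5 - response), and the sum is multiplied by 2.5. A minimal sketch:

```python
def sus_score(responses):
    """Standard SUS scoring [1]: ten 1-5 Likert items -> 0-100 score.

    Odd-numbered items (positive statements) contribute (response - 1);
    even-numbered items (negative statements) contribute (5 - response).
    """
    assert len(responses) == 10
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    return total * 2.5

# Example: a fairly positive questionnaire scores 85.0
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))
```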

Results: Users successfully completed the tasks we posed, never averaging more than 0.5 errors per task. On average, Identify and Compare tasks took longer (14.14-14.41 seconds) than mere Discovery tasks (8.05-8.55 seconds), as expected. Finally, users scored our solution as excellent, according to the collected SUS results.

Figure 5: System validation: on the bottom right, a participant filling in the SUS.

5.2. Case Studies

Two experts participated in the case studies, following the think-aloud protocol while exploring our system.

Results: Both participants extensively and enthusiastically traversed our system's functionalities with ease. Through NeuroExplore, they could visualize raw physiological data as well as the derived affective and nervous-system-related metrics, both in real time and retrospectively. As pointed out, this contrasted with their current solution, which relied on OpenSignals' VIS and its restriction to raw values.

Overall, comments were extremely positive. The purposefully minimalistic and sober menu UI, as well as its smooth animation and clean color palette, were praised. Participants appreciated these aesthetics, which they deemed to correspond to the simple and intuitive user-system interactions. In detail, the line-chart mouse interaction, yielding data knowledge without the visual aid of the scales, was exalted. What's more, the location of the remaining interactions inside an extendable menu was pointed out as facilitating both the analysis of a wider area of visible data and the overall interaction with other relevant system functions, by concentrating all of these in one place.

Interestingly, the employed algorithms and their respective representation were discussed, as well as the on-field data collection possibilities of our project's web-based nature.

Additionally, testers stressed recurring connection issues while recording data using OpenSignals. These situations could potentially frustrate user recording sessions.

Thus, our positive experience when using our own solution to collect data was appreciated and successfully put to the test.

Figure 6: Final SUS score.


6. Conclusion

In retrospective fashion, this thesis began with the introduction of our project and the postulation of the objectives which led to the implementation of NeuroExplore. Given the diversified fields of study our solution traverses, we presented a diversified palette of background knowledge: this ranged from the more traditional, text-derived field of Sentiment Analysis, passed through Affective Computing and its augmentation of this method via multi-modal approaches, and arrived at the complementary fields of Physiological Computing and - in particular - BCI. This Computer Science meets Neuroscience backdrop then led into related work: InfoVis examples of both proprietary and academic nature. The document progressed by comprehensively detailing the development process leading to our final implementation, namely: the software and hardware architecture, the database acquisition process and its preprocessing, the derived metrics' rationale and protocol, and - finally - the iterative development approach and its procedural growth. Lastly, we vindicated NeuroExplore through comprehensive functionality and usability validation, by users and experts alike.

NeuroExplore is a Physiological Computing InfoVis prototype through which users can better comprehend one's current or previously recorded brain and body status. Achieving a SUS score of 85.13, the system was considered excellent by users and experts alike. Coherently, we have fulfilled our main project goal (see Section 1) and all of its dependencies, namely a data pipeline which seamlessly converts imperceptible biosignals into an insightful, abstract VIS, for increased mind-and-body behavior insights.

References

[1] J. Brooke et al. SUS: a quick and dirty usability scale. Usability Evaluation in Industry, 189(194):4–7, 1996.

[2] A. J. Casson, D. C. Yates, S. J. Smith, J. S. Duncan, and E. Rodriguez-Villegas. Wearable electroencephalography. IEEE Engineering in Medicine and Biology Magazine, 29(3):44–56, 2010.

[3] D. Cernea, C. Weber, A. Ebert, and A. Kerren. Emotion-prints: interaction-driven emotion visualization on multi-touch interfaces. In Visualization and Data Analysis, page 93970A, 2015.

[4] S. H. Fairclough. Fundamentals of physiological computing. Interacting with Computers, 21(1-2):133–145, 2008.

[5] M. Gavrilescu and F. Ungureanu. Enhanced three-dimensional visualization of EEG signals. In E-Health and Bioengineering Conference (EHB), 2015, pages 1–4. IEEE, 2015.

[6] I. Hupont, S. Baldassarri, E. Cerezo, and R. Del-Hoyo. Advanced human affect visualization. In 2013 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pages 2700–2705. IEEE, 2013.

[7] S. D. Kamvar and J. Harris. We feel fine and searching the emotional web. In Proceedings of the Fourth ACM International Conference on Web Search and Data Mining, pages 117–126. ACM, 2011.

[8] M. B. Khalid, N. I. Rao, I. Rizwan-i-Haque, S. Munir, and F. Tahir. Towards a brain computer interface using wavelet transform with averaged and time segmented adapted wavelets. In 2nd International Conference on Computer, Control and Communication (IC4 2009), pages 1–4. IEEE, 2009.

[9] L.-D. Liao, C.-Y. Chen, I.-J. Wang, S.-F. Chen, S.-Y. Li, B.-W. Chen, J.-Y. Chang, and C.-T. Lin. Gaming control using a wearable and wireless EEG-based brain-computer interface device with novel dry foam-based sensors. Journal of NeuroEngineering and Rehabilitation, 9(1):5, 2012.

[10] W. Medhat, A. Hassan, and H. Korashy. Sentiment analysis algorithms and applications: a survey. Ain Shams Engineering Journal, 5(4):1093–1113, 2014.

[11] G. A. Miller. The cognitive revolution: a historical perspective. Trends in Cognitive Sciences, 7(3):141–144, 2003.

[12] T. Munzner. Visualization Analysis and Design. CRC Press, 2014.

[13] Y. Rogers, H. Sharp, and J. Preece. Interaction Design: Beyond Human-Computer Interaction. John Wiley & Sons, 2011.


[14] J. A. Russell. Core affect and the psychological construction of emotion. Psychological Review, 110(1):145, 2003.

[15] K. R. Scherer, T. Bänziger, and E. Roesch. A Blueprint for Affective Computing: A Sourcebook and Manual. Oxford University Press, 2010.

[16] M. Soleymani, S. Asghari-Esfeden, Y. Fu, and M. Pantic. Analysis of EEG signals and facial expressions for continuous emotion detection. IEEE Transactions on Affective Computing, 7(1):17–28, 2016.

[17] S. J. Stanton, W. Sinnott-Armstrong, and S. A. Huettel. Neuromarketing: ethical implications of its use and potential misuse. Journal of Business Ethics, 144(4):799–811, 2017.

[18] M. K. Torkildson, K. Starbird, and C. Aragon. Analysis and visualization of sentiment and emotion on crisis tweets. In International Conference on Cooperative Design, Visualization and Engineering, pages 64–67. Springer, 2014.

[19] P. Zikopoulos, C. Eaton, et al. Understanding Big Data: Analytics for Enterprise Class Hadoop and Streaming Data. McGraw-Hill Osborne Media, 2011.
