ERP Data Analysis Volume 1 Net Station 4.3.1 Waveform Tools


About this Document

This document covers the waveform tools in Net Station 4.3.1 and should be applicable to other versions of Net Station. Specifically, this document seeks to do three things:

1. Introduce the reader to the theory behind the waveform tools.
2. List the available settings and those used by the DEL.
3. Put the different analysis steps together into an analysis pipeline that can be applied to new data sets.

 

 

 


Quick Start Guide

This section quickly lists the settings used in most paradigms for data preprocessing at the DEL.

1. Filtering
   a. Recordings are made from 0.1-100 Hz.
   b. Analyses are run from 0.1 Hz to 30 Hz for historical reasons.
   c. The Waveform Tool for data collected at 0.1-100 Hz should be a "Filtering" tool with a 30 Hz low-pass filter applied.
   d. The Waveform Tool for data collected with a 300-series amplifier should use a "First Order High Pass Filter" set at 0.1 Hz, followed by a "Filtering" tool with a 30 Hz low-pass filter.

2. Segmentation
   a. Segmentation should proceed in two steps:
      i. a "Markup" step, which adds the category name to all stimuli within that category;
      ii. a segmentation step, which actually segments the data, based on the marked-up stimuli, into the categories of interest for later analysis;
      iii. most studies use a window of 100 ms before and 900 ms after the stimulus. Consistency helps with later tools and scripts.
   b. Offsets for the stimuli should be built into the markup segmentation to make later re-analysis easier.

3. Visual Review
   a. At this point, open your data file, choose "Eyes" from the montage menu or drop-down box, and scroll through the file looking for bad eye channels.
   b. When you identify an eye channel that is bad (due to a bad connection) for more than 30% of the file, mark that channel bad by clicking the ø next to the eyeball on that track.

4. Bad Channel Replacement
   a. In this step, simply run bad channel replacement on the eye channels that were marked bad. This will interpolate them using spherical-spline interpolation and allow the later artifact detection to use the interpolated data when estimating artifacts.

5. Artifact Detection
   a. Bad channels: 200 µV over the entire segment, no moving average.
   b. Eye blinks: 140 µV over a 640 ms window, no moving average.
   c. Eye movements: over 100 µV in a 640 ms window, no moving average.
   d. Mark a channel bad for all segments if it is bad in more than 40% of segments.
   e. Mark a segment bad if it has more than 10 bad channels, an eye blink, or an eye movement.
   f. Overwrite all previous bad channel/segment information.
   g. Only detect within the window from 100 ms before to 700 ms after the stimulus.

6. Bad Channel Replacement
   a. This will interpolate the bad channels identified by the automated Artifact Detection step.

7. Montage Operations (Average Reference)
   a. Select the appropriate net.
   b. Choose Average Reference.
   c. Exclude bad channels from the average reference.
   d. Do not use PARE.

8. Baseline Correction
   a. Use "Start of Segment" with a length of 100 ms,
   b. or any other way of specifying the pre-stimulus interval.

9. Averaging
   a. Choose "Separately" for both data files and subjects.
   b. Check "Get Subject ID from file name".
   c. Leave all other boxes unchecked.

10. Choose a later analysis path:
   a. PCA → Export to text
   b. Peak Amplitude & Latency → Statistical Extraction Tool
   c. Source Analysis → GeoSource
   d. BESA → Export to RAW

 

I usually accomplish these steps with two scripts. The first is a filtering + segmentation script that takes care of any filtering and segmentation. I then manually check the eye channels for bad channels, and finally run a post-fix script, which is called "Post-Fix HC128" to identify what the script is and which net it is to be used for. The post-fix script includes the rest of the steps and can be held constant across numerous studies.
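For readers who later take their data outside Net Station, the same preprocessing chain can be approximated in MNE-Python. The sketch below is only an illustration of the steps listed above, not the Net Station tools themselves; the file name, event code, category name, and bad eye channel are placeholder assumptions.

    # Rough MNE-Python analogue of the Quick Start pipeline (illustration only).
    import mne

    raw = mne.io.read_raw_egi("subject01.mff", preload=True)   # hypothetical EGI recording
    raw.filter(l_freq=0.1, h_freq=30.0)                         # 0.1-30 Hz band, as above

    raw.info["bads"] = ["E126"]                                 # eye channel flagged during visual review
    events = mne.find_events(raw)
    epochs = mne.Epochs(raw, events, event_id={"ba++": 1},      # placeholder category and code
                        tmin=-0.1, tmax=0.9,                    # 100 ms before, 900 ms after
                        baseline=(None, 0), preload=True)       # baseline = pre-stimulus interval
    epochs.drop_bad(reject=dict(eeg=200e-6))                    # crude stand-in for artifact detection
    epochs.interpolate_bads()                                   # spherical-spline bad-channel replacement
    epochs.set_eeg_reference("average")                         # average reference
    evoked = epochs.average()                                   # per-subject average for the category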

Page 5: ERP!Data!Analysis! Volume!1! Net!Station4.3.1! WaveformTools!cb3.unl.edu/dbrainlab/wp-content/uploads/sites/2/2013/12/Data-Anal… · QuickStartGuide! Thissectionquicklyliststhe!settings!used!in!most!paradigms!for!data!preprocessing!

Tool Theory & Settings

Filtering Tool

 

The filtering tool restricts the frequencies in the recording file to a specified range. For instance, line electricity in the United States oscillates at 60 Hz; to remove noise created by such sources, one could introduce either a low-pass filter that excludes the 60 Hz frequency or a notch filter around 60 Hz. This reduces the influence of electrical noise in the recording file. For historical reasons, and to allow comparison of current data with data collected in the past, we usually filter from 0.1 Hz to 30 Hz. For data collected on the 200-series amplifier (the big ones), only one filter tool is required: the one pictured above, created in the Waveform Tools and called "Filtering". For data collected using the 300-series amplifier (the small one), the data must first be filtered using a "First Order Highpass Filter" set to 0.1 Hz and then with the filtering tool pictured above. These two steps are often best done by constructing a script that runs both together.

It should be noted that conducting a study using both 200- and 300-series amplifiers could introduce noise into your sample that you would want to control for statistically. Indeed, most researchers will want to use only one type of amplifier for their study.
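For comparison outside Net Station, the same band-pass idea looks roughly like this in MNE-Python; the file name is a placeholder, and the commented-out notch filter only illustrates the 60 Hz mains-noise example above (a 30 Hz low-pass already removes it).

    import mne

    raw = mne.io.read_raw_egi("subject01.mff", preload=True)  # hypothetical recording
    raw.filter(l_freq=0.1, h_freq=30.0)   # high-pass at 0.1 Hz plus low-pass at 30 Hz
    # raw.notch_filter(freqs=60.0)        # alternative: notch out 60 Hz line noise only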

Page 6: ERP!Data!Analysis! Volume!1! Net!Station4.3.1! WaveformTools!cb3.unl.edu/dbrainlab/wp-content/uploads/sites/2/2013/12/Data-Anal… · QuickStartGuide! Thissectionquicklyliststhe!settings!used!in!most!paradigms!for!data!preprocessing!

Segmentation - Markup

The segmentation tool now involves two phases; the first is a markup phase, which will help with later data analysis.

 

Shown above is the panel where you specify the markup attributes. Just as in a regular segmentation, identify the codes needed for the markup, then click on the object itself and specify the offset in milliseconds (shown below). The markup track name can be anything, but something like "MarkupTrack" is typically preferred so that when the data are later analyzed (or re-analyzed) it is obvious where the additional stimulus tags came from.
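The idea of building a fixed stimulus offset into the markup can also be expressed outside Net Station. A minimal sketch in MNE-Python, assuming a measured offset of 50 ms (the file name, offset value, and shift direction are placeholder assumptions):

    import mne

    raw = mne.io.read_raw_egi("subject01.mff", preload=True)     # hypothetical recording
    events = mne.find_events(raw)                                 # stimulus events from the file

    offset_ms = 50                                                # placeholder: measured per study
    shift = int(round(offset_ms / 1000.0 * raw.info["sfreq"]))    # convert ms to samples
    events[:, 0] += shift                                         # move every event later by the assumed delay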

 

Page 7: ERP!Data!Analysis! Volume!1! Net!Station4.3.1! WaveformTools!cb3.unl.edu/dbrainlab/wp-content/uploads/sites/2/2013/12/Data-Anal… · QuickStartGuide! Thissectionquicklyliststhe!settings!used!in!most!paradigms!for!data!preprocessing!

Segmentation  

Segmentation is the process of breaking the ongoing EEG signal into event-related epochs of approximately one second. For consistency, it is recommended that the time window be 100 milliseconds before and 900 milliseconds after the stimulus. This is mostly to keep the modification of later SPSS, SAS, and R scripts to a minimum.

Each study (or paradigm) has a unique segmentation, based on the stimuli (and their stimulus tags) that were used. You could not segment an oddball data file with a Stroop segmentation (and if you manage to, think very hard about what the result would mean).

To create this tool, go to the Waveform Tools and select Create → Segmentation. If you were segmenting a 3x40 data file (stimuli /ba/, /da/, /ga/), you would initially create three categories and give them meaningful names using the "Segmentation Settings". In this case, we will use ba++, da++, and ga++. Category names should be short and informative.

Next, select the ba++ category, and choose the 1000ms|1000ms part of the tool. This will present you with a window like the one below.

 

The segmentation length before is 100 and after is 900, both in milliseconds. The offset is specific to a particular study; how to measure the offset in your particular study will be covered in a later document. Finally, choose the "Event 1" part of the window, and it will return you to the initial representation shown in the top screen.

Page 8: ERP!Data!Analysis! Volume!1! Net!Station4.3.1! WaveformTools!cb3.unl.edu/dbrainlab/wp-content/uploads/sites/2/2013/12/Data-Anal… · QuickStartGuide! Thissectionquicklyliststhe!settings!used!in!most!paradigms!for!data!preprocessing!

At this point, press the browse button and either drag a file into the "Source" box or press select and choose a file for your experiment using the window that appears. Then select the codes box.

This will show you the common stimulus flags that are used in your experiment. As you can see, this study used win, lose, and no-win stimuli. You can now drag these into the criteria part of the segmentation screen. Make sure that you only drag the stimulus flag for the selected Segmentation Category.

If you want to evaluate whether the trial was answered correctly or not, you need a "Specs" value called "Trial Spec Evaluation", which is 1 for correct or 0 for incorrect. Add this to your segmentation criteria if you wish to segment only correct or incorrect trials.
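For comparison, the same segmentation (100 ms before to 900 ms after, one category per stimulus code) looks roughly like this in MNE-Python; the file name and the numeric event codes for ba/da/ga are placeholders.

    import mne

    raw = mne.io.read_raw_egi("subject01.mff", preload=True)      # hypothetical recording
    events = mne.find_events(raw)
    event_id = {"ba++": 1, "da++": 2, "ga++": 3}                   # placeholder code numbers
    epochs = mne.Epochs(raw, events, event_id=event_id,
                        tmin=-0.1, tmax=0.9,                       # -100 ms to +900 ms
                        baseline=None, preload=True)               # baseline correction comes later
    print(epochs["ba++"])                                          # the epochs for one category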

 

 

 

Page 9: ERP!Data!Analysis! Volume!1! Net!Station4.3.1! WaveformTools!cb3.unl.edu/dbrainlab/wp-content/uploads/sites/2/2013/12/Data-Anal… · QuickStartGuide! Thissectionquicklyliststhe!settings!used!in!most!paradigms!for!data!preprocessing!

Visual Inspection

Net Station is a very sophisticated program for analyzing data. However, it is not able to anticipate certain problems with the data; the most important for our purposes is that Net Station cannot tell whether an eye channel looks bad because it was not in contact with the skin or because of eye blinks, eye movements, or other artifacts. To correct this problem, it is useful to open the file and browse the eye channels for bad data.

 

As shown above, the data file has a bad channel, which is highlighted in red (to highlight a channel, select its number on the right). Browse through the rest of the file to make sure that this eye channel is bad for a substantial portion of the file.

If this channel were left in the file for later automated artifact detection, Net Station would mark the entire file as having eye blinks or eye movements because of this channel. To avoid this, we need to mark the channel bad; do this by selecting the red ø to the right of the channel number. This will mark the channel bad for the entire recording.

This step can be very subjective, which is why I encourage you to examine only the eye channels and to mark bad only those channels that are bad for more than 20-40% of the recording (depending on how many trials you have).
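Outside Net Station, the same manual flagging amounts to recording the channel as bad so that later steps ignore or interpolate it. A minimal sketch in MNE-Python; the file and channel names are placeholders.

    import mne

    raw = mne.io.read_raw_egi("subject01.mff", preload=True)  # hypothetical recording
    raw.plot()                           # scroll through and inspect the eye channels
    raw.info["bads"] = ["E126"]          # placeholder: eye channel judged bad for much of the file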

 

Page 10: ERP!Data!Analysis! Volume!1! Net!Station4.3.1! WaveformTools!cb3.unl.edu/dbrainlab/wp-content/uploads/sites/2/2013/12/Data-Anal… · QuickStartGuide! Thissectionquicklyliststhe!settings!used!in!most!paradigms!for!data!preprocessing!

Artifact Detection

Artifact detection is the process of scanning a file for bad channels (channels that do not have a good connection to the scalp or have been bumped), eye blinks, and eye movements.

 

Bad channels are measured as deviations in microvolts over the course of a segment. Often 200 µV is sufficient to identify bad channels.

Unlike bad channel identification, eye blinks and eye movements are measured as deviations between pairs of channels. Which channels are used depends on the type of net. They are listed below; however, Net Station will automatically choose the right eye channels for the waveform tool, or for the viewer, when you open a file collected with a specific net.

Eye Blink
  GSN 200: 8, 126, 26, 127
  HydroCel 128: 8, 126, 25, 127
  HydroCel 256: 18, 238, 37, 241

Eye Movement
  GSN 200: 125, 128
  HydroCel 128: 125, 128
  HydroCel 256: 226, 252

Page 11: ERP!Data!Analysis! Volume!1! Net!Station4.3.1! WaveformTools!cb3.unl.edu/dbrainlab/wp-content/uploads/sites/2/2013/12/Data-Anal… · QuickStartGuide! Thissectionquicklyliststhe!settings!used!in!most!paradigms!for!data!preprocessing!

The remaining settings for the Artifact Detection tool involve inferences: rules that identify when to reject a channel or segment. Channels should be marked bad for the entire recording if they are bad for more than 40% of the recording. Segments should be marked bad if they contain more than 10 bad channels, an eye blink, or an eye movement. Settings are shown in the diagram above.

The restricted range is also important: we often analyze only peaks before 700 milliseconds, but we include the full 900 milliseconds post-stimulus onset to allow for graphics. Settings are displayed above.
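Net Station's threshold rules do not map one-to-one onto other packages, but as a rough illustration, the 200 µV bad-channel criterion resembles peak-to-peak rejection in MNE-Python (the blink and eye-movement channel-pair rules are not reproduced here; the file name is a placeholder).

    import mne

    raw = mne.io.read_raw_egi("subject01.mff", preload=True)   # hypothetical recording
    events = mne.find_events(raw)
    epochs = mne.Epochs(raw, events, tmin=-0.1, tmax=0.9, preload=True)
    # Drop any segment in which an EEG channel exceeds 200 µV peak-to-peak,
    # a crude stand-in for the bad-channel threshold described above.
    epochs.drop_bad(reject=dict(eeg=200e-6))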

 

Bad Channel Replacement

 

 

Bad channel replacement has no user settings; it is just important to run this step after Artifact Detection (or after marking bad channels by hand) in order to interpolate the missing or lost channels using spherical-spline interpolation. This interpolation is discussed further in Picton et al. (2000). Briefly, it fills in a bad channel using a weighted average of the surrounding channels, while at the same time adjusting the result so that the total voltage over the head remains neutral (zero).
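The same spherical-spline interpolation is available outside Net Station; a minimal MNE-Python sketch, assuming the bad channels have already been marked (the file and channel names are placeholders).

    import mne

    epochs = mne.read_epochs("subject01-epo.fif", preload=True)  # hypothetical segmented file
    epochs.info["bads"] = ["E126"]                                # channels flagged earlier
    epochs.interpolate_bads(reset_bads=True)                      # spherical-spline interpolation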

 

Page 12: ERP!Data!Analysis! Volume!1! Net!Station4.3.1! WaveformTools!cb3.unl.edu/dbrainlab/wp-content/uploads/sites/2/2013/12/Data-Anal… · QuickStartGuide! Thissectionquicklyliststhe!settings!used!in!most!paradigms!for!data!preprocessing!

Baseline Correction

Baseline correction is used to de-trend the data from long, slow oscillations in the waveform.

Settings are shown above.
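The equivalent operation in other packages is simply subtracting the mean of the pre-stimulus interval from each segment; for example, in MNE-Python (the file name is a placeholder):

    import mne

    epochs = mne.read_epochs("subject01-epo.fif", preload=True)  # hypothetical segmented file
    # Use the window from the start of the segment (-100 ms) up to stimulus onset as the baseline.
    epochs.apply_baseline(baseline=(None, 0))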

 

Average Reference

When data are initially recorded, each channel is represented as the difference between the reference electrode (Cz) and that electrode. This is why the placement of the reference electrode is so important. To reduce the bias of a particular reference on the data, the data are re-referenced to the average reference. In some cases, a researcher may re-reference to a mastoid or linked mastoids, but for most data sets in our lab you will use the average reference for the net that was used.
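As a rough illustration of re-referencing outside Net Station, in MNE-Python (the file name and the mastoid channel are placeholders):

    import mne

    epochs = mne.read_epochs("subject01-epo.fif", preload=True)  # hypothetical segmented file
    epochs.set_eeg_reference("average")        # re-reference to the average of the EEG channels
    # A mastoid reference would instead name the channel(s), e.g.:
    # epochs.set_eeg_reference(["E57"])        # placeholder mastoid channel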

 

Page 13: ERP!Data!Analysis! Volume!1! Net!Station4.3.1! WaveformTools!cb3.unl.edu/dbrainlab/wp-content/uploads/sites/2/2013/12/Data-Anal… · QuickStartGuide! Thissectionquicklyliststhe!settings!used!in!most!paradigms!for!data!preprocessing!

Averaging  

Averaging is the process of combining all trials of a particular stimulus to form a single waveform composed of all presentations of that stimulus. Thus, if you presented /ba/ 150 times, you would end up with one waveform for /ba/ that is all of those trials averaged together.

Typically you want to handle source files separately and handle subjects separately. If you were creating a grand average, you could use "Together" for both options, but for the SPSS and PCA parts of our analyses we will use the "Separately" options.

Make  sure  you  copy  events.
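For reference, the per-subject, per-category averaging described above corresponds to something like the following in MNE-Python; the file name and category name are placeholders.

    import mne

    epochs = mne.read_epochs("subject01-epo.fif", preload=True)   # hypothetical segmented file
    evoked_ba = epochs["ba++"].average()                           # one waveform per category, per subject
    evoked_ba.save("subject01_ba-ave.fif")                         # keep the subject ID in the file name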