Meaningful Interaction Analysis


Description: Guest lecture given at Jiao Tong University, Shanghai, China.

Transcript of Meaningful Interaction Analysis

1. December 7th, 2010, Shanghai, China. Latent Semantics and Social Interaction. Fridolin Wild, KMi, The Open University

2. Outline

  • Context & Framing Theories
  • Latent Semantic Analysis (LSA)
  • (Social) Network Analysis (S/NA)
  • Meaningful Interaction Analysis (MIA)
  • Outlook: Analysing Practices

3.

  • Context & Theories

4. Information: what is meaning? Meaning could be the quality of a certain signal. Meaning could be a logical abstractor = a release mechanism.
5. Meaning is social

    • To develop a shared understanding is a natural and necessary process, because language underspecifies meaning: future understanding builds on it

Network effects make a network of shared understandings more valuable with growing size, allowing e.g. distributed cognition.

    • At the same time: linguistic relativity (the Sapir-Whorf hypothesis): our own language culture restricts our thinking

6. 7. Concepts & Competence: things we can (not) construct from language

  • Tying shoelaces
  • Douglas Adams, The Meaning of Liff:
    • Epping : The futile movements of forefingers and eyebrows used when failing to attract the attention of waiters and barmen.
    • Shoeburyness : The vague uncomfortable feeling you get when sitting on a seat which is still warm from somebody else's bottom

I have been convincingly Sapir-Whorfed by this book.
8. A Semantic Community (figure labels: LSA, SNA, associative closeness, concept = disambiguated term, person, social relation)
9. LSA
10. Latent Semantic Analysis

  • Two-mode factor analysis of the co-occurrences in the terminology
  • Results in a latent-semantic vector space (a minimal sketch follows below)
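
A minimal base-R sketch of this factor analysis (the toy term-document counts, the term labels, and the choice of k = 2 factors are illustrative assumptions, not the lecture's data):

```r
# Toy term-document matrix: rows = terms, columns = documents
# (the counts are made up for illustration)
M <- matrix(c(1, 0, 1, 0,
              1, 1, 0, 0,
              0, 1, 1, 1,
              0, 0, 1, 1),
            nrow = 4, byrow = TRUE,
            dimnames = list(c("human", "interface", "graph", "minors"),
                            c("d1", "d2", "d3", "d4")))

# Two-mode factor analysis via the singular value decomposition: M = T S D
s  <- svd(M)
k  <- 2                      # number of latent factors to keep (assumption)
Tk <- s$u[, 1:k]             # term loadings on the singular values
Sk <- diag(s$d[1:k])         # singular values (factors, dims, ...)
Dk <- t(s$v[, 1:k])          # document loadings (factors x documents)

# Reduced reconstruction: the latent-semantic approximation of M
M2 <- Tk %*% Sk %*% Dk
dimnames(M2) <- dimnames(M)  # keep the term/document labels
```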

Humans learn word meanings and how to combine them into passage meaning through experience with ~paragraph unitized verbal environments. They don't remember all the separate words of a passage; they remember its overall gist or meaning. LSA learns by reading ~paragraph unitized texts that represent the environment. It doesn't remember all the separate words of a text; it remembers its overall gist or meaning. -- Landauer, 2007

11. Latent-semantic space (figure): singular values (factors, dims, ...), term loadings, document loadings
12. (Landauer, 2007)
13. Associative Closeness. You need factor stability? > Project using fold-ins! (figure: angles between a term vector and two document vectors along the X and Y dimensions)
14. Example: Classic Landauer. { M } = Deerwester, Dumais, Furnas, Landauer, and Harshman (1990): Indexing by Latent Semantic Analysis. In: Journal of the American Society for Information Science, 41(6):391-407. Only the red terms appear in more than one document, so strip the rest. Term = feature; vocabulary = ordered set of features. TEXTMATRIX
15. Reconstructed, Reduced Matrix: m4: Graph minors: A survey
16. doc2doc - similarities

    • Unreduced = pure vector space model
      - Based on M = T S D
      - Pearson correlation over document vectors
    • Reduced
      - Based on M_2 = T S_2 D (only the largest singular values kept in S_2)
      - Pearson correlation over document vectors (see the sketch below)
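
Continuing the sketch above (reusing the assumed M and the reduced reconstruction M2), the two kinds of doc2doc similarity can be compared like this:

```r
# Unreduced = pure vector space model:
# Pearson correlation over the original document (column) vectors
sim_unreduced <- cor(M)

# Reduced: Pearson correlation over the columns of M2 = T S_2 D
sim_reduced <- cor(M2)

round(sim_unreduced, 2)
round(sim_reduced, 2)
```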

17. (S)NA 18. Social Network Analysis

  • Around for a long time (the term was coined in 1954)
  • Basic idea:
    • Actors and relationships between them (e.g. interactions)
    • Actors can be people (groups, media, tags, ...)
    • Actors and ties form a graph (nodes and edges)
    • Within that graph, certain structures can be investigated (see the sketch below)
      • Betweenness, degree centrality, density, cohesion
      • Structural patterns can be identified (e.g. the troll)
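
A minimal sketch of this kind of analysis in R, assuming the raw data has already been turned into an actor-by-thread incidence matrix (the actor names, the toy postings, and the use of the igraph package are illustrative, not the lecture's code):

```r
library(igraph)

# Incidence matrix IM: rows = actors, columns = forum threads they posted in
IM <- matrix(c(1, 1, 0, 0,
               1, 0, 1, 0,
               0, 1, 1, 1,
               0, 0, 0, 1),
             nrow = 4, byrow = TRUE,
             dimnames = list(c("Ann", "Bob", "Cem", "Dee"),
                             paste0("thread", 1:4)))

# Adjacency matrix AM = IM x IM^T: who co-posted with whom, and how often
AM <- IM %*% t(IM)

g <- graph_from_adjacency_matrix(AM, mode = "undirected",
                                 weighted = TRUE, diag = FALSE)

degree(g)       # degree centrality: number of connections to others
closeness(g)    # how close an actor is to all others
betweenness(g)  # how often an actor sits between others as an intermediary
```

plot(g) would give a rough sociogramme of the same graph.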

19. Constructing a network from raw data: forum postings → incidence matrix IM → adjacency matrix AM = IM × IM^T
20. Visualization: Sociogramme
21. Measuring Techniques (sample): degree centrality = number of (in/out) connections to others; closeness = how close to all others; betweenness = how often intermediary; components = e.g. k-means cluster (k = 3)
22. Example: Joint virtual meeting attendance (Flashmeeting co-attendance in the Prolearn Network of Excellence)
23. Example: Subscription structures in a blogging network (2nd trial of the iCamp project)
24. MIA
25. Meaningful Interaction Analysis (MIA)

  • Combines latent semantics with the means of network analysis
  • Allows for investigating associative closeness structures at the same time as social relations
  • In latent-semantic spaces only, or in spaces with additional and different (!) relations (a toy sketch follows below)
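
As a toy illustration of that combination (the block layout, the numbers, and the person-concept ties below are assumptions for illustration, not the lecture's data), associative closeness between concepts and social relations involving persons can be stacked into one adjacency matrix and analysed as a single graph:

```r
library(igraph)

# Associative closeness among concepts (e.g. cosines from a latent-semantic space)
concepts <- c("lsa", "sna", "mia")
CC <- matrix(c(1.0, 0.2, 0.7,
               0.2, 1.0, 0.6,
               0.7, 0.6, 1.0),
             nrow = 3, dimnames = list(concepts, concepts))

# A social/authorship relation: which person works on which concept
persons <- c("Ann", "Bob")
PC <- matrix(c(1, 0, 1,
               0, 1, 1),
             nrow = 2, byrow = TRUE, dimnames = list(persons, concepts))

# One block adjacency matrix over both node types
# (no direct person-person ties in this toy example)
A <- rbind(cbind(matrix(0, 2, 2, dimnames = list(persons, persons)), PC),
           cbind(t(PC), CC))

g <- graph_from_adjacency_matrix(A, mode = "undirected",
                                 weighted = TRUE, diag = FALSE)
V(g)$type <- c(rep("person", 2), rep("concept", 3))
```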

26. The mathemagics behind Meaningful Interaction Analysis
27. Network Analysis
28. MIA of the classic Landauer
29. 30. 31. 32. Applications
33. Capturing traces in text: medical student case report
34. Internal latent-semantic graph structure (MIA output)
35. 36. Evaluating Chats with PolyCAFe
37. 38. Conclusion
39. Conclusion

  • Neither LSA nor SNA alone is sufficient for a modern representation theory
  • MIA provides one possible bridge between them
  • It is a powerful technique
  • And it is simple to use (in R)

40. #eof.
41. Contextualised Doc & Term Vectors

  • T_k = left-hand side matrix = term loadings on the singular values
  • D_k = right-hand side matrix = document loadings on the singular values
  • Multiply them into the same space:
    • V_T = T_k S_k
    • V_D = D_k^T S_k
  • Cosine closeness matrix over ... = adjacency matrix = a graph (a sketch follows below)
  • More: e.g. add author vectors V_A through cluster centroids or vector addition of their publication vectors
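
A sketch of these steps in R, reusing the M, Tk, Sk and Dk objects from the LSA sketch earlier and the igraph package from the SNA sketch (the cosine cut-off of 0.3 is an arbitrary assumption):

```r
# Project term and document loadings into the same latent-semantic space
VT <- Tk %*% Sk              # contextualised term vectors,     V_T = T_k S_k
VD <- t(Dk) %*% Sk           # contextualised document vectors, V_D = D_k^T S_k
V  <- rbind(VT, VD)
rownames(V) <- c(rownames(M), colnames(M))

# Cosine closeness matrix over all term and document vectors
norms   <- sqrt(rowSums(V^2))
cosines <- (V %*% t(V)) / (norms %o% norms)

# Treat the (thresholded) closeness matrix as an adjacency matrix = a graph
adj <- ifelse(cosines > 0.3, cosines, 0)  # 0.3 is an arbitrary cut-off
adj <- (adj + t(adj)) / 2                 # guard against floating-point asymmetry
diag(adj) <- 0
g_mia <- igraph::graph_from_adjacency_matrix(adj, mode = "undirected",
                                             weighted = TRUE, diag = FALSE)
```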

Speed: use the existing space and fold in e.g. author vectors into the latent-semantic space.
42. Influencing Parameters (LSA): Pearson(eu, österreich), Pearson(jahr, wien)
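
For the speed note above, a sketch of the usual fold-in projection (the new document's term counts are made up; taking S_k^-1 T_k^T m_new as the fold-in is the standard LSA formulation, assumed here to be what the slide refers to):

```r
# Fold a new document (raw term frequencies over the same vocabulary)
# into the existing latent-semantic space instead of recomputing the SVD
m_new <- c(human = 2, interface = 0, graph = 1, minors = 1)

# Fold-in: d_new = S_k^-1 T_k^T m_new  (a k-dimensional document loading)
d_new <- solve(Sk) %*% t(Tk) %*% m_new

# Its contextualised vector, comparable to the rows of VD above
v_new <- t(d_new) %*% Sk
```

Author vectors V_A could be folded in the same way, e.g. by first adding up the raw term vectors of an author's publications.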