Computationally Viable Handling of Beliefs in Arguments for Persuasion
Emmanuel Hadoux and Anthony Hunter
November 6, 2016
University College London, EPSRC grant: Framework for Computational Persuasion
Introduction
Persuasion problems

• One agent (the proponent) tries to persuade the other (the opponent)
• e.g., a doctor persuading a patient to quit smoking, a salesman, a politician, ...
• The agents exchange arguments during a persuasion dialogue
• These arguments are connected by an attack relation
Abstract argumentation framework
A1 A2 A3
Figure 1: Argument graph with 3 arguments
Based on Dung’s abstract argumentation framework [1]
Example (Figure 1)A1 = “It will rain, take an umbrella”A2 = “The sun will shine, no need for an umbrella”A3 = “Weather forecasts say it will rain”
2
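Such a framework is just a set of arguments plus a binary attack relation. A minimal sketch in Python; note that the slide does not label the attack edges of Figure 1, so the two attacks below are illustrative assumptions:

```python
# A Dung-style abstract argumentation framework: arguments plus an attack relation.
class ArgumentGraph:
    def __init__(self, arguments, attacks):
        self.arguments = set(arguments)  # e.g. {"A1", "A2", "A3"}
        self.attacks = set(attacks)      # set of (attacker, attacked) pairs

    def attackers(self, argument):
        """All arguments that attack the given argument."""
        return {a for (a, b) in self.attacks if b == argument}

# Figure 1 example; the direction of the attacks is an assumption here.
g = ArgumentGraph(
    arguments=["A1", "A2", "A3"],
    attacks=[("A2", "A1"),   # "the sun will shine" attacks "take an umbrella"
             ("A3", "A2")],  # "forecasts say rain" attacks "the sun will shine"
)
print(sorted(g.attackers("A2")))  # who attacks A2?
```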
Purpose of the work

The objectives for the proponent:

1. Have an argument or a set of arguments holding at the end of the dialogue
2. Have these arguments believed by the opponent

Need to maintain and update a belief distribution → to posit the right argument
Belief distribution

Epistemic approach to probabilistic argumentation (e.g., [2]).

Definition: Let G = ⟨A, R⟩ be an argument graph. Each X ⊆ A is called a model. A belief distribution P over 2^A is such that ∑_{X ⊆ A} P(X) = 1 and P(X) ∈ [0, 1] for all X ⊆ A. The belief in an argument A is P(A) = ∑_{X ⊆ A s.t. A ∈ X} P(X). If P(A) > 0.5, argument A is accepted.

Example (of a belief distribution): Let A = {A, B} and let P be the belief distribution with P({A, B}) = 1/6, P({A}) = 2/3, and P({B}) = 1/6. Then P(A) = 5/6 > 0.5 and P(B) = 1/3 < 0.5.
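The definition above is straightforward to compute: represent a belief distribution as a map from models (subsets of A) to masses, and sum the masses of the models containing an argument. A sketch reproducing the slide's example:

```python
from fractions import Fraction

def belief(P, argument):
    """P(argument) = sum of P(X) over all models X containing the argument."""
    return sum(mass for model, mass in P.items() if argument in model)

def accepted(P, argument):
    """An argument is accepted when its belief exceeds 0.5."""
    return belief(P, argument) > Fraction(1, 2)

# Example from the slide: A = {A, B}.
P = {
    frozenset({"A", "B"}): Fraction(1, 6),
    frozenset({"A"}):      Fraction(2, 3),
    frozenset({"B"}):      Fraction(1, 6),
}
print(belief(P, "A"), accepted(P, "A"))  # 5/6 True
print(belief(P, "B"), accepted(P, "B"))  # 1/3 False
```

Exact `Fraction` arithmetic is used so the 1/6 and 2/3 masses sum without rounding.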
Refinement of a belief distribution

Each time a new argument is added to the dialogue, the distribution needs to be updated.

Figure 2: Argument graph with two arguments A and B

| A | B | P | H^1_A(P) | H^0.75_A(P) |
|---|---|-----|-----|-------|
| 1 | 1 | 0.6 | 0.7 | 0.675 |
| 1 | 0 | 0.2 | 0.3 | 0.275 |
| 0 | 1 | 0.1 | 0.0 | 0.025 |
| 0 | 0 | 0.1 | 0.0 | 0.025 |

Table 1: Examples of belief redistribution

We can modulate the update to take into account different types of users (skeptical, credulous, etc.)
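A natural reading of the numbers in Table 1 (the operator's exact definition is in the paper; this reconstruction is an assumption based on the table) is that H^k_A moves a fraction k of the mass of every model not containing A onto the same model with A added, with k = 1 for a fully credulous update and smaller k for more skeptical users:

```python
from collections import defaultdict

def refine(P, argument, k):
    """Assumed reading of H^k_argument: shift a fraction k of the mass of each
    model lacking `argument` onto the same model with `argument` added."""
    updated = defaultdict(float)
    for model, mass in P.items():
        if argument in model:
            updated[model] += mass
        else:
            updated[model] += (1 - k) * mass                    # mass that stays
            updated[model | frozenset({argument})] += k * mass  # mass that moves
    return dict(updated)

# Table 1: models over {A, B}, row "1 0" meaning A in, B out, etc.
P = {
    frozenset({"A", "B"}): 0.6,
    frozenset({"A"}):      0.2,
    frozenset({"B"}):      0.1,
    frozenset():           0.1,
}
print(refine(P, "A", 1.0))   # column H^1_A(P): 0.7 / 0.3 / 0.0 / 0.0
print(refine(P, "A", 0.75))  # column H^0.75_A(P): 0.675 / 0.275 / 0.025 / 0.025
```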
Splitting the distribution
• 30 arguments → 2^30 = 1,073,741,824 models → 8.6 GB if each value is stored as a double
• Fortunately, they are not all directly linked to each other
• We can group related arguments into flocks, which are themselves linked to each other
• We create a split distribution from the metagraph, as opposed to the joint distribution
Metagraphs

Figure 3: (a) an argument graph with ten arguments A1–A10 and (b) a possible metagraph grouping them into the flocks {A1}, {A2, A3}, {A4}, {A5}, {A6}, {A7, A8, A9, A10}
Creating a split distribution

Figure 4: Metagraph with flocks {A1}, {A2, A3}, {A4}, {A5}, {A6}, {A7, A8, A9, A10}

We define three assumptions for the split to be clean:

1. Arguments from non-directly-connected flocks are conditionally independent
2. Arguments in a flock are considered connected
3. Arguments in a flock are conditionally dependent

No Bayesian networks because: the values are not probabilities, users are not rational, etc.
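Under assumption 1, the split stores one (conditional) distribution per flock instead of one joint table, and the joint can be recovered by multiplying the factors along the metagraph. A toy sketch for two flocks {B} and {A}, with A conditioned on B; the factor values are illustrative assumptions, not numbers from the slides:

```python
from itertools import product

# Split for a two-flock metagraph: P(B) and P(A | B).
p_b = {True: 0.4, False: 0.6}          # belief that a model contains B
p_a_given_b = {True: 0.9, False: 0.2}  # P(A in model | whether B is in the model)

# Recover the joint over the four models by multiplying the two factors.
joint = {}
for a, b in product([True, False], repeat=2):
    p_a = p_a_given_b[b] if a else 1 - p_a_given_b[b]
    joint[(a, b)] = p_a * p_b[b]

print(joint[(True, True)])  # 0.9 * 0.4
print(sum(joint.values()))  # the recovered joint still sums to 1
```

The point of the split is that the two factors hold 2 + 2 values instead of the 4 joint values; the saving grows exponentially with the number of flocks.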
Creating a split distribution
• We can define an optimal, irreducible, split w.r.t. the graph
• However, an irreducible split may not be computable• Only the irreducible split is unique, we therefore need torank the others.
9
Ranking the splits

Definition (valuation of a split):

x = ∑_{A ∈ A} ∑_{P_i ∈ S s.t. A ∈ E(P_i)} |P_i|

and P ≻ P′ iff x < x′.

Example (of valuation and ranking):

Let P be the joint distribution for Figure 3a. Value of P: 10 × 2^10 = 10,240.

P1 = (P(A5), P(A6 | A5), P(A4 | A5, A6), P(A2, A3 | A4, A7), P(A1 | A2, A3), P(A7, A8, A9, A10)):
2^1 + 2^2 + 2^3 + 2 × 2^4 + 2^3 + 4 × 2^4 = 118

P2 = (P(A1, A2, A3, A4, A5, A6 | A7), P(A7, A8, A9, A10)):
6 × 2^7 + 4 × 2^4 = 832

We then see that P1 ≻ P2 ≻ P.
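Reading |P_i| as 2^(number of arguments appearing in the factor) and E(P_i) as the arguments to the left of the conditioning bar (an interpretation consistent with the three values worked out above), the valuation can be computed as follows:

```python
def valuation(split):
    """Each factor is (flock_args, conditioning_args); every argument in the
    flock contributes the factor's table size 2^(len(flock) + len(cond))."""
    return sum(len(flock) * 2 ** (len(flock) + len(cond))
               for flock, cond in split)

# Joint distribution over A1..A10: a single factor with no conditioning.
joint = [(["A%d" % i for i in range(1, 11)], [])]

# Split P1 from the slide.
p1 = [(["A5"], []), (["A6"], ["A5"]), (["A4"], ["A5", "A6"]),
      (["A2", "A3"], ["A4", "A7"]), (["A1"], ["A2", "A3"]),
      (["A7", "A8", "A9", "A10"], [])]

# Split P2 from the slide.
p2 = [(["A1", "A2", "A3", "A4", "A5", "A6"], ["A7"]),
      (["A7", "A8", "A9", "A10"], [])]

print(valuation(joint), valuation(p2), valuation(p1))  # 10240 832 118
```

Lower values rank higher, so P1 ≻ P2 ≻ P as on the slide.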
Experiments
Splitting the distribution

Figure 5: Metagraph with flocks {A1}, {A2, A3}, {A4}, {A5}, {A6}, {A7, A8, A9, A10}

• Original graph: 10 arguments → 1,024 values → 8 kB
• Metagraph: 10 arguments in 6 flocks → 54 values → 432 B
• The time taken to update is reduced as well
• An argument can be updated by updating only its flock
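The memory figures on this slide can be reproduced by summing the factor table sizes at 8 bytes per double; the factor shapes below are those of split P1 above, which is how the 54-value count appears to arise:

```python
def num_values(split):
    """Total stored values: one table of size 2^(flock + cond) per factor."""
    return sum(2 ** (len(flock) + len(cond)) for flock, cond in split)

# Joint: one factor over all 10 arguments.
joint = [(["A%d" % i for i in range(1, 11)], [])]

# Split distribution with the 6 flocks of the metagraph (shapes of P1).
split = [(["A5"], []), (["A6"], ["A5"]), (["A4"], ["A5", "A6"]),
         (["A2", "A3"], ["A4", "A7"]), (["A1"], ["A2", "A3"]),
         (["A7", "A8", "A9", "A10"], [])]

print(num_values(joint), num_values(joint) * 8)  # 1024 values -> 8192 B (8 kB)
print(num_values(split), num_values(split) * 8)  # 54 values -> 432 B
```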
Experiments with flocks of different sizes

| # flocks | # links | 1 update | 50 updates |
|----------|---------|----------|------------|
| 2        | 10      | 2 ms     | 107 ms     |
| 2        | 30      | 6 ms     | 236 ms     |
| 4        | 10      | 1 ms     | 45 ms      |
| 4        | 30      | 3 ms     | 114 ms     |
| 10       | 10      | 0.03 ms  | 1.6 ms     |
| 10       | 30      | 0.06 ms  | 2.5 ms     |

Table 2: Computation time for updates in different graphs of 50 arguments
Experiments with different numbers of arguments

| # args | Time for 20 updates | Comparative % |
|--------|---------------------|---------------|
| 25     | 497 ns              | +0%           |
| 50     | 517 ns              | +4%           |
| 75     | 519 ns              | +4%           |
| 100    | 533 ns              | +7%           |

Table 3: Computation time for 20 updates
Experiments

A new version of the library is currently being developed in C++ and is available at: https://github.com/ComputationalPersuasion/splittercell

As a rule of thumb, flocks should be kept to fewer than 25 arguments each.
Conclusion

We have presented:

1. A framework to represent the opponent's beliefs in the arguments
2. How to create a split distribution using a metagraph
3. How to rank the splits in order to choose the most appropriate one w.r.t. the problem
4. Experiments showing the viability of the approach

Next step: adapt this work to the whole project to scale.
Thank you!
References

[1] Phan Minh Dung. On the acceptability of arguments and its fundamental role in nonmonotonic reasoning, logic programming, and n-person games. Artificial Intelligence, 77:321–357, 1995.

[2] Anthony Hunter. A probabilistic approach to modelling uncertain logical arguments. International Journal of Approximate Reasoning, 54(1):47–81, 2013.