Positive side-effects of misinformation

Jim Lippard, SkeptiCamp Phoenix, March 28, 2009

Description

There are positive side-effects of misinformation, e.g., for identifying paths of information, source reliability, and filtering. Given at SkeptiCamp Phoenix, March 28, 2009.

Transcript

Page 1

Positive Side-effects of Misinformation

Jim Lippard
SkeptiCamp Phoenix

March 28, 2009

Page 2

Outline

Short title: Detect Unreliable Filters

• A variety of benefits for intentional and unintentional misinformation
• Sources of misinformation
• Errors, lies, and bullshit
• Biases and filters
• Future directions

Page 3

1. A challenge to dogmatic acceptance

“It is not too much to require that what the wisest of mankind, those who are best entitled to trust their own judgment, find necessary to warrant their relying on it, should be submitted to by that miscellaneous collection of a few wise and many foolish individuals, called the public. The most intolerant of churches, the Roman Catholic Church, even at the canonization of a saint, admits, and listens patiently to, a "devil's advocate." The holiest of men, it appears, cannot be admitted to posthumous honours, until all that the devil could say against him is known and weighed. If even the Newtonian philosophy were not permitted to be questioned, mankind could not feel as complete assurance of its truth as they now do. The beliefs which we have most warrant for, have no safeguard to rest on, but a standing invitation to the whole world to prove them unfounded. If the challenge is not accepted, or is accepted and the attempt fails, we are far enough from certainty still; but we have done the best that the existing state of human reason admits of; we have neglected nothing that could give the truth a chance of reaching us: if the lists are kept open, we may hope that if there be a better truth, it will be found when the human mind is capable of receiving it; and in the meantime we may rely on having attained such approach to truth, as is possible in our own day. This is the amount of certainty attainable by a fallible being, and this the sole way of attaining it. … However unwillingly a person who has a strong opinion may admit the possibility that his opinion may be false, he ought to be moved by the consideration that, however true it may be, if it is not fully, frequently, and fearlessly discussed, it will be held as a dead dogma, not a living truth.”

John Stuart Mill, On Liberty (1859), Ch. II (Norton Critical Edition, pp. 21-22, 34-35)

Page 4

2. Examples for educational purposes

“Some of the most frequently reprinted articles in twentieth-century philosophy are famous precisely because nobody believes them; everybody can see what is wrong with them.”

Daniel Dennett, Darwin’s Dangerous Idea (1995), p. 351

Page 5

3. Entertainment

Page 6

4. Wartime advantage

Operation “Mincemeat” was a 1943 British intelligence operation to deceive the Axis powers into believing that Operation “Husky,” in reality the invasion of Sicily, would instead target Greece and Sardinia.

A corpse with a briefcase was given a fictional identity as Major William Martin, R.M., and dropped into the ocean off the coast of Spain. Pro-German Spaniards passed the papers to German Intelligence, who were fooled to the extent that they still believed Sardinia and Greece were the targets weeks after the British first began landing in Sicily.

Page 7

5. Detecting leaks, moles, and spammers

“Deception can be used by ordinary computer systems, too. As with print media, disinformation (false information) can be planted on computers for enemy spies to discover, as a counterintelligence tactic (Gerwehr, Weissler, Medby, Anderson, & Rothenberg, 2000).”

--Lech J. Janczewski, Andrew M. Colarik, Cyber Warfare and Cyber Terrorism (2007), p. 100
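A familiar computing version of this tactic is to seed each suspected leak channel with its own uniquely marked copy of a document, so that whichever marked copy later surfaces identifies the source of the leak. Below is a minimal sketch of that idea in Python; the suspect names, record fields, and "codeword" format are illustrative assumptions, not anything taken from the slide or the cited book.

```python
import secrets

# Minimal "honeytoken" sketch: give each suspected leak channel a copy of a
# document seeded with a unique fake detail, then identify the source of a
# leak from whichever fake detail surfaces. Names and fields are illustrative.

SUSPECTS = ["alice", "bob", "carol"]

def seed_copies(base_text, suspects):
    """Return a per-suspect copy of base_text containing a unique fake codeword."""
    copies = {}
    for name in suspects:
        token = f"PROJECT-{secrets.token_hex(3).upper()}"  # unique fabricated codeword
        copies[name] = {"token": token,
                        "text": base_text + f"\nInternal codeword: {token}"}
    return copies

def identify_leaker(leaked_text, copies):
    """Whoever's unique token appears in the leaked text is the likely source."""
    for name, copy in copies.items():
        if copy["token"] in leaked_text:
            return name
    return None

if __name__ == "__main__":
    copies = seed_copies("Planning memo (otherwise identical for everyone).", SUSPECTS)
    leaked = copies["bob"]["text"]          # pretend Bob's copy shows up in public
    print(identify_leaker(leaked, copies))  # -> "bob"
```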

Page 8

6. A marker for plagiarism detection

Page 9

Intentional and unintentional misinformation

While most of the preceding examples (except for Mill and the original authors of the material referred to by Dennett) have involved intentional use of misinformation, unintentional misinformation has analogous benefits.

Page 10

7. Identifying relationships of common ancestry

“All of the examples of functionless sequences shared between humans and chimpanzees reinforce the argument for evolution that would be compelling even if only one example were known. This argument can be understood by analogy with the legal cases discussed earlier in which shared errors were recognized as proof of copying. The appearance of the same "error"--that is, the same useless pseudogene or Alu sequence or endogenous retrovirus at the same position in human and ape DNA--cannot logically be explained by independent origins of the two sequences. The creationist argument discussed earlier--that similarities in DNA sequence simply reflect the creator's plans for similar protein function in similar species--does not apply to sequences that do not have any function for the organism that harbors them. The possibility of identical genetic accidents creating the same two pseudogene or Alu or endogenous retrovirus independently in two different species by chance is so unlikely that it can be dismissed. As in the copyright cases discussed earlier, such shared "errors" indicate that copying of some sort must have occurred. Since there is no known mechanism by which sequences from modern apes could be copied into the same position of human DNA or vice versa, the existence of shared pseudogenes or retroposons leads to the logical conclusion that both the human and ape sequences were copied from ancestral sequences that must have arisen in a common ancestor of humans and apes.”

Edward E. Max, “Plagiarized Errors and Molecular Genetics: Another argument in the evolution-creation controversy,” updated version of article from Creation/Evolution XIX (1986) on the talkorigins.org archive at: http://www.talkorigins.org/faqs/molgen/

This website includes critiques from creationists and Max’s responses.
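The force of Max's argument is essentially arithmetic: if the chance that the same functionless "error" arises independently at the same position is tiny, then the chance that many such shared errors all arose independently is that tiny number multiplied by itself many times over. A back-of-the-envelope sketch in Python, using made-up numbers purely for illustration (neither figure comes from Max's article):

```python
# Back-of-the-envelope version of the "shared errors" argument. Both numbers
# below are made up purely for illustration; neither comes from Max's article.
# Assume some small probability p that the *same* functionless insertion arises
# independently at the *same* genomic position in two separate lineages.

p_same_error_same_site = 1e-6   # assumed per-site probability (illustrative)
n_shared_errors = 20            # assumed count of shared pseudogenes/retroposons

p_all_independent = p_same_error_same_site ** n_shared_errors
print(f"P(all {n_shared_errors} shared errors arose independently) ~ {p_all_independent:.0e}")
# ~ 1e-120, i.e. effectively zero; copying from a common ancestral sequence is
# the only reasonable explanation, just as in the copyright "shared errors" cases.
```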

Page 11

8. Identifying family relationships of manuscripts

Stemmatics or stemmatology, originally developed in the 19th century, is based on the principle that “community of error implies community of origin.”

It involves identifying the common ancestry of manuscript variants based on shared errors, such as copyist mistakes and intentional errors, to create family trees.

Methods in stemmatics include the use of software originally developed for use in evolutionary biology.
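As a toy illustration of "community of error implies community of origin," one can represent each manuscript by the set of errors it contains and rank manuscript pairs by how many errors they share; the pairs with the most shared errors suggest the branches of the family tree. The sketch below is a deliberately crude stand-in for real stemmatic or phylogenetic software, and the manuscripts and error IDs are invented:

```python
from itertools import combinations

# Toy stemmatics sketch: represent each manuscript by the set of copyist errors
# it contains (manuscripts and error IDs are invented). Pairs sharing the most
# errors are grouped as the closest relatives, per "community of error implies
# community of origin."

manuscripts = {
    "A": {"e1", "e2", "e3"},
    "B": {"e1", "e2", "e3", "e4"},   # shares all of A's errors, plus one of its own
    "C": {"e5", "e6"},
    "D": {"e5", "e6", "e7"},         # shares all of C's errors, plus one of its own
}

def shared_errors(x, y):
    return len(x & y)

# Rank manuscript pairs by number of shared errors; the top pairs suggest which
# manuscripts descend from a common exemplar.
pairs = sorted(combinations(manuscripts, 2),
               key=lambda p: shared_errors(manuscripts[p[0]], manuscripts[p[1]]),
               reverse=True)
for a, b in pairs:
    print(a, b, shared_errors(manuscripts[a], manuscripts[b]))
# (A, B) and (C, D) come out on top, splitting the tradition into two families.
```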

Point of irony: Some evangelical Christians simultaneously believe that (a) we have reliable knowledge of the original autographs of biblical scripture on the basis of such techniques and (b) common ancestry is false.

Page 12

Sources of misinformation

Both intentional and unintentional misinformation can reveal information about how beliefs propagate and the reliability and biases of sources, which may be individuals, groups, or institutions.

Sources may be originators or propagators of misinformation.

Page 13

Errors, lies, and bullshit

The source of misinformation may:

(a) Believe that the misinformation is true. It’s an honest mistake.

(b) Believe that the misinformation is false. It’s a lie.

(c) Not care whether the misinformation is true or false. It’s bullshit.
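Restated as a decision rule, the three categories turn only on the source's attitude toward the truth of what it passes along. A minimal sketch (the attitude labels simply paraphrase (a)-(c) above):

```python
from enum import Enum

class Attitude(Enum):
    BELIEVES_TRUE = "believes the misinformation is true"
    BELIEVES_FALSE = "believes the misinformation is false"
    DOES_NOT_CARE = "does not care whether it is true or false"

def classify(attitude):
    # Mirrors (a)-(c) above: same false content, different relation to the truth.
    return {Attitude.BELIEVES_TRUE: "honest mistake",
            Attitude.BELIEVES_FALSE: "lie",
            Attitude.DOES_NOT_CARE: "bullshit"}[attitude]

print(classify(Attitude.DOES_NOT_CARE))  # -> bullshit
```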

Page 14

On Bullshit

“The bullshitter's fakery consists not in misrepresenting a state of affairs but in concealing his own indifference to the truth of what he says. The liar, by contrast, is concerned with the truth, in a perverse sort of fashion: he wants to lead us away from it. As Frankfurt sees it, the liar and the truthteller are playing on opposite sides of the same game, a game defined by the authority of truth. The bullshitter opts out of this game altogether. Unlike the liar and the truthteller, he is not guided in what he says by his beliefs about the way things are. And that, Frankfurt says, is what makes bullshit so dangerous: it unfits a person for telling the truth.”

Jim Holt, “Say Anything,” The New Yorker, August 22, 2005, online at http://www.newyorker.com/archive/2005/08/22/050822crat_atlarge

An article on Harry G. Frankfurt’s On Bullshit (2005) and Simon Blackburn’s Truth: A Guide (2005).

Also relevant: Sissela Bok, Lying: Moral Choice in Public and Private Life (1978, updated in 1989 and 1999)

Page 15

Biases and filters

We can think of propagators of misinformation as filters on a stream (as a simplification of what’s really a complex web). By comparing proximate sources to original sources (propagated content to original content), we can identify types of bias and habits of propagators. A toy sketch of such a comparison appears after the list below.

• Parrots: Repeat whatever content they receive. (Perhaps the most common, with varying degrees of fidelity to the original content.)
• Enhancers: Add accurate new details and content from other sources.
• Embellishers: Originate new details and content. (May or may not be bullshit.)
• Bullshitters: Originate new details and content without regard to accuracy.
• Skeptics: Point out reasons to doubt content, such as contrary evidence, with particular priority given to science. Try hard to avoid Type I errors (accepting falsehoods).
• Forteans: Point out anomalous data, particularly where it conflicts with generally accepted science. Try hard to avoid Type II errors (rejecting truths). (Cf. my presentation from 2003 on “What Skeptics Can Learn from Forteans” and vice versa.)
• PR hacks: Portray a source in the best possible light.
• FUDsters: Portray a source in the worst possible light.
• Contrarian indicators: Often get truth and falsity reversed within a particular domain.
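Here is the toy sketch of the comparison mentioned above: given the set of claims in the original content and the set of claims in a propagator's output, count what was kept and what was added, and map the result onto a rough filter type. The thresholds and the "claims as strings" representation are invented simplifications of the categories above, not a proposed method:

```python
# Toy "filter profiling" sketch: compare a propagator's output to the original
# content it received. Claims are modeled as plain strings, and the thresholds
# and labels are invented simplifications of the categories above.

def profile_propagator(original, propagated):
    kept = original & propagated
    added = propagated - original
    fidelity = len(kept) / len(original) if original else 0.0

    if fidelity >= 0.9 and not added:
        return "parrot (high-fidelity repetition)"
    if added and fidelity >= 0.5:
        return "enhancer or embellisher (adds details; accuracy unknown from text alone)"
    if added:
        return "bullshitter candidate (mostly new content, little fidelity to the original)"
    return "heavy filter (drops most of the original)"

original = {"claim 1", "claim 2", "claim 3", "claim 4"}
print(profile_propagator(original, {"claim 1", "claim 2", "claim 3", "claim 4"}))
print(profile_propagator(original, {"claim 1", "claim 2", "claim 3", "a new detail"}))
print(profile_propagator(original, {"wild new claim", "another new claim"}))
```

Distinguishing enhancers from embellishers and bullshitters would additionally require checking the added claims against independent sources, which is where the skeptical and Fortean habits above come in.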

Page 16

Institutional filters

• Wikipedia: Requires neutral point of view, substantiation with verification in “reliable sources” that are “credible published sources with a reliable publication process.”

• Science: A variety of methodologies, usually passed on via apprenticeship through a complex network of participants who often rely on peer-reviewed publication. Prefers precise quantitative measurement and validated methods; engages in self-correction and revision of both method and data. A confirmed hypothesis often undergoes a gradual transition from “frontier science” to “textbook science,” if not by replication, then by accumulation of related results that “fit.”

Page 17

Conclusion

There are a number of positive side-effects that arise from the existence and propagation of misinformation. Perhaps the most useful, from the point of view of a skeptic, is that misinformation can be used to identify the implicit rules by which information sources act upon the information they receive, which in turn can be used to gauge a source’s reliability and assign it an appropriate level of trust. It may also permit the reconstruction of original information that has been transformed by a source in various ways.

There’s clearly far more to explore here in looking at how we aggregate information from multiple sources, as individuals, as groups, and in institutions.

Possibly related topics for future exploration: idea futures, “wisdom of crowds,” blog aggregation and interactions between blogs via commenters and links, division of cognitive labor, analogies to information security trust models.