How Do You Know? (2)
These are hard questions.
What I described in my paper is my best answer to the question 'how do I know', that is, it tries to explain how I (in fact) know things. It is therefore not a description of the criteria I should use to distinguish truth from falsity, nor how one person can convince another person of something.
Indeed, viewed as a system for determining the truth of something, the paper seems pretty ridiculous. Wealth of experience? Why should anyone trust that? Why is my wealth of experience any better than anyone else's?
The problem is, the description of how we in fact learn things does not carry with it any sort of guarantee that what we've learned is true. But without such a guarantee, there can be no telling for ourselves what to believe or not to believe, no way to convince other people. It's like we're leaves blown about by the breeze, with no way to sway the natural forces that affect us.
Moreover, the problem is:
- there is no guarantee
- yet we do distinguish between true and false (and believe we have a method for doing so)
- and we do want to be able to sway other people
What complicates the matter - and the point where I deliberately stopped in the other paper - is that not everybody is honest about what they know and what they don't know. Sometimes there genuinely are charlatans, and they want to fool us. Sometimes they are simply mistaken.
There's not going to be a simple way to step through this.
I went immediately from the British Council talk, where I was trying to foster a point of view, to a session inside Second Life, where I played the role of the sceptic. Not that I think that the people promoting Second Life are charlatans. But I do think they are mistaken, and I do think some of the statements they make are false.
The fact is, even though there are no guarantees, we will nonetheless make judgments about truth and falsehood. It is these judgments - and the manner of making these judgments - that will sway the opinions of other people.
You can't tell people things, you can only show them.
Now even this statement needs to be understood carefully. It is true that we can tell people things, e.g., that 'Paris is the capital of France', and they will remember them - but it does not follow that they know them; they will need to see independent evidence (such as, say, newscasts from 'the capital of France'). Telling produces memory. Showing (and experience-producing processes in general) produces belief.
But now - even this needs to be qualified. Because if you tell something to somebody enough times, it becomes a type of proxy experience. So - strictly speaking - you can produce belief by telling - but not by 'telling' as we ordinarily think of it, but by a repeated and constant telling.
Additionally, we can make 'telling' seem more like experience when we isolate the person from other experiences. When the 'telling' is the only experience a person has, it becomes the proxy for experience.
It is worth noting that we consider these to be illicit means of persuasion. The former is propaganda, the latter is indoctrination. Neither (admittedly) is a guaranteed way of changing a person's mind. But each is reliable enough, as a causal process, that it has been identified and described as an illicit means of persuasion.
Let me return now to how we distinguish truth from falsehood.
This is not the same as the process of coming to know, because the process of coming to know has no such mechanism built into it. The way we come to know things is distinct from the way we distinguish between truth and falsehood.
This may seem counter-intuitive, but I've seen it a lot. I may be arguing with someone, for example, and they follow my argument. "I agree," they say, "Um hm, um hm." But then I get to the conclusion, and they look at me and say, "But no..." It's this phenomenon that gives people the feeling they've been tricked, that I've played some sort of semantic game.
So there are processes through which we distinguish truth and falsity. Processes through which we (if you will) construct the knowledge we have. We see the qualities of things. We count things. We recognize patterns in complex phenomena. These all lead us, through a cognitive process, to assert that this or that is true or false.
Usually, this cognitive process accords with our experience. For example, we say that the ball is red because we saw that the ball was red. We say that there were four lights because we saw four lights. The two are close enough that we say we came to know the statement was true because of the experience. But - again - the process of knowing is separate from the process of distinguishing truth from falsehood.
There are general principles of cognition. These are well known - propositional logic, mathematics, categorical logic, the rest. (And if you want to see the separation between 'knowing' and 'distinguishing truth from falsity', look at some of the more advanced forms of logic - deontic logic, for example. We can use such a process to say that some statement is true, but because the process is so arcane to us, the statement never becomes something we 'know' - we would certainly hesitate before acting on it, for example.)
There are also well known fallacies of cognition. I have documented (many of) these on my fallacies site. It is interesting to note that these are for the most part fallacies of distraction. What they do is focus your attention on something that has nothing to do with the proposition in question while suggesting that there is a cognitive link between the two. You come to 'know' something that isn't true, because you have had the experience.
Consider the fallacy: "If the plant was polluting the river, we would see the pollution. And look - we can see the pollution." We look, and we see the pollution. It becomes part of our experience. It becomes the reason we 'know' that the plant is polluting the river. No amount of argument - no amount of 'telling' (except, say, indoctrination) will convince us otherwise. We have to actually go to the plant and see that it is not polluting the river in order to understand - to know - that we were the victim of a fallacy.
There is a constant back-and-forth being waged in all of us, between what we 'know' and the things we say are 'true and false'.
That is why I say you can't 'tell' a person something. Merely convincing them (even if you can) to agree that 'this is true' is a long way from getting them to know it - getting them to believe it, to act on it, to make wagers on it.
So - convincing a person comes down to showing them something.
Often this 'showing' will be accompanied with a line of reasoning - a patter - designed to lead them to the 'truth' of what they are being shown. But the knowledge comes from the showing, not the patter.
Even with showing, there are no guarantees. 'You can lead a horse to water...' Even the experience may not be sufficient to convince a person. Any experience is being balanced against the combined weight of other experiences (perhaps the 'patter' is sufficient to sway people in some touch-and-go cases, by offering a coherence with other experiences - an easy path for belief to follow).
A great deal depends on the nature of the experience. Experiences can be vivid, can force themselves on us. They can be shocking or disturbing. Images of violence capture our attention; images having to do with sex capture our attention. Our attention, even, can be swayed by prior experiences - a person who has spent a lifetime around tame tigers will react very differently on seeing a tiger than a person who has only known them to be dangerous carnivores.
'Convincing' becomes a process of pointing, a process of showing. Sometimes what a person is told can direct a person where to look (in this piece of writing I am encouraging you to look at how you come to have your own knowledge, to see how it is the result of a separate track from how you come to see that things are true or false). Sometimes the experiences can be contrived - as, say, in a simulation - or the senses fooled. Some media - especially visual media - can stand as substitutes for experience.
We can have experiences of abstract things - the weight of experience just is a way of accomplishing this. The logical fallacies, for example - on being shown a sufficient number of fallacies, and on seeing the fallaciousness of them, we can come to have a knowledge of the fallacies - such that, when we experience a similar phenomenon in the future, we experience it as fallacious.
Convincing becomes a matter of showing, showing not just states of affairs in the world, but processes of reason and inference. If I can show actual instances of inference, how a person comes to believe, comes to know, this or that, then it becomes known, and not merely believed, by the viewer. If I can show my reasoning process, then this process can be known (after being experienced and practiced any number of times) by the learner.
'Expert knowledge' is when a person not only remembers something, but has come to know it, has come to know the processes surrounding a discipline.
Such knowledge is often ineffable - the knower can't even enumerate the (true or false) statements that constitute the knowledge, or that led to the knowledge. What a person knows is distinct from what a person says is true or false.
It is not truth that guarantees knowledge. It is knowledge that guarantees truth.