Tags: resources-references, advice, epistemology
If at the core of critical thinking is following reason and evidence, how do we handle knowledge so vast that one entire human life of reasoning is not enough to arrive at it? How do we even recognize such knowledge?
It seems to me that many pieces of advice from wise people are easy to dismiss rationally, using critical thinking, simply because the advice does not make sense to you at the time. Once you gain more experience, you begin to understand what previously seemed illogical.
How does a critical thinker, aware of the limitations of their own mental powers, rationally deal with authorities who surpass their mental capacity? How can these authorities be recognized? You cannot simply trust them (that would not be critical), but you cannot simply dismiss them either (that would not be rational, since they are more likely to be close to the truth).
Does this mean that critical thinking is limited? Are the limits known?
It seems to me that many pieces of advice from wise people are easy to dismiss rationally, using critical thinking, simply because the advice does not make sense to you at the time.
If you dismiss something because it doesn’t make sense to you, then you’re not using critical thinking. You suggested that one can’t simply trust an authority, and one can’t simply dismiss them. That’s correct, but there’s at least one more option: you can acknowledge that you don’t understand their claim, and refuse to either accept or dismiss it until you do understand.
So consider atheists applying critical thinking to the claims of religious authorities. If an authority presents an argument, say about how you should behave, which you don’t want to accept but can’t come up with a rational reason to dispute, your only choices aren’t to accept it or be branded irrational. You can take some time to think it through, ask the authority to explain, ask other people who are smarter than you, or look it up on the internet.
It certainly doesn’t mean that critical thinking is limited, in the same way that logic isn’t limited by an individual’s inability to apply it.
Critical and rational thinking systems contain multiple methods and tools for dealing with this specific problem. I think you’ll find the ones described below interesting:
Basic filtering - the vast majority of knowledge and experience is noise: it is not useful, interesting, or applicable in any way to anything we will ever experience again. Critical and rational thinkers train to develop good mechanisms for knowing when a piece of information truly isn’t useful, interesting, or relevant, and can therefore be safely ignored.
Abstraction - the ability to extract, from a large body of knowledge, a focused bit of knowledge that is generally useful without containing most of the knowledge that went into producing it. After filtering out the noise and collecting a large amount of useful, interesting, relevant information, small bits of abstracted (focused, concentrated) knowledge can be pulled out. This is by far the most powerful and most commonly used tool.
Specialization - most acquired knowledge takes time and effort to think through, acquire, and communicate clearly, but doesn’t require much effort to test for correctness and accuracy. Each rational/critical thinker can specialize and analyze/collect data in one area, and then communicate focused, concentrated bits of information to others via well-established channels where that information can be quickly tested for veracity (for example, scientific journals). For example, we have so far put in probably on the order of 100,000 - 1,000,000 man-years of research toward a cancer cure; if a cure is found, it almost certainly won’t take anywhere near 100,000 man-years of work to test that it works, or to learn how and why it works (see the sketch after this list).
Efficiency of communication - critical and rational thinkers invent specialized languages for speaking about important information, which allow for better filtering, abstraction, and exchange of information. This is connected to the basic filtering mechanism: a specialized critical thinker will strongly prefer to listen to the information other specialized critical thinkers have obtained. If the information doesn’t come from one, then he’s much more likely to ignore it, and to ignore it safely. It’s much more efficient to wait until someone specialized comes along to collect, abstract, and analyze the information, and then communicate it to the world in focused, concentrated form. For example, a mathematical researcher will likely not follow his grandmother’s medical treatment suggestions; rather, he’ll pass the information along to a doctor, who might pass it along to a medical researcher, and both the doctor and the mathematician will use the results of the research rather than the original advice.
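To illustrate the point about specialization, here is a minimal toy sketch (my own illustration, not part of the original answer) of why verifying a result can be far cheaper than deriving it: finding the prime factors of a number requires a long search, while checking a claimed factorization is essentially a single multiplication.

```python
# Toy illustration of the derive-vs-verify asymmetry (hypothetical example):
# deriving knowledge (factoring) costs far more than verifying it.

def factor(n: int) -> list[int]:
    """Derive: find the prime factorization by trial division (slow for large n)."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

def verify(n: int, factors: list[int]) -> bool:
    """Verify: checking a claimed factorization is just one multiplication.
    (A full check would also confirm that each factor is prime.)"""
    product = 1
    for f in factors:
        product *= f
    return product == n

n = 2_147_483_647 * 999_983   # product of two large primes
claimed = factor(n)           # expensive search
print(verify(n, claimed))     # cheap check -> True
```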
how do we handle knowledge so vast that one entire human life of reasoning is not enough to arrive at it?
Are you talking about things that nobody knows?
How do we even recognize such knowledge?
I’m confused.
I don’t agree that this question is at all off topic. Enough atheists claim that they base their beliefs on “critical thinking” that the discussion of its weaknesses and strengths is very much topical.
On the other hand, I don’t think it’s been understood or defined by the OP correctly. See wiki: http://en.wikipedia.org/wiki/Critical_thinking – Or the critical thinking community: http://www.criticalthinking.org/aboutCT/define_critical_thinking.cfm
If you want to examine the question of the legitimacy of rationalism, then you might consider the later chapters of the second volume of Popper’s The Open Society and Its Enemies, in which he discusses the difference between uncritical and critical rationalism and the inevitable consequences of all forms of non-rationalism. Further reading of other authors might also be in order; in fact, I know that both Dennett and Dawkins address these concerns in their books on religion.
You should also keep in mind that though it does indeed take a very long time to derive new knowledge, it takes a whole lot less time to verify knowledge gained by someone else. You don’t have to take anyone’s word for it; you can examine the basis on which they define their knowledge. If you find that basis valid, then it can be believed. This is neither non-critical nor irrational.
One great method of filtering out that which you need more evidence for is Carl Sagan’s bullshit detector, which is a lot simpler than it is often explained on the Internet. In short, the more attached to an idea you are, the more skeptical you should be of it and the more evidence you should require before believing it. For example, Sagan’s own greatest desire was to contact extra-terrestrial life. Because of that, he made sure to be as stringent as possible about the evidence provided for its existence. In the end he wasn’t able to delude himself into the kind of beliefs UFO “ologists” hold. You are more at risk when you WANT to believe something is true than when you don’t really care.
It depends on the type of veracity you are looking for when you speak of critical thinking and rationality.
If all you allow is logically necessary truth, then you cannot even trust yourself or your senses, because nothing in logic guarantees that the next day will be like today, or that anything will continue to exist even the tiniest second after you read this period ".".
On the other hand, if you have weaker standards for inference, then the question is what your standard for valid knowledge will be. If all that matters is believing things that in some way produce good results, and you assume you are like most people, and you see that many people who believe the words of speaker X and act on them are happy because of those beliefs, then you may wish to believe what that person says. The tricky part would be discovering whether those beliefs are really what made them so happy, but that's a general problem of causal inference that people who study causality have been trying to decipher for a long time.
Alternatively, if you use something more akin to Bayesian epistemology, you might be able to generate hierarchical hypotheses (i.e. hypotheses about hypotheses) regarding the trustworthiness of information sources. One such hypothesis could concern the trustworthiness of some individual, which you might learn over time. If that individual is trustworthy in other circumstances, it may be more likely that he is trustworthy in this circumstance.
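As a rough sketch of what such a trust model could look like (my own illustration, assuming a simple beta-binomial model rather than anything proposed in the answer): a source's trustworthiness is treated as an unknown probability that gets updated each time one of its claims checks out or fails.

```python
# Minimal sketch (assumed model, not from the answer): track a source's
# trustworthiness as a Beta distribution and update it as claims are checked.

class SourceTrust:
    def __init__(self, alpha: float = 1.0, beta: float = 1.0):
        # Beta(1, 1) prior: initially we have no idea how reliable the source is.
        self.alpha = alpha  # pseudo-count of claims that checked out
        self.beta = beta    # pseudo-count of claims that turned out false

    def update(self, claim_was_correct: bool) -> None:
        if claim_was_correct:
            self.alpha += 1
        else:
            self.beta += 1

    @property
    def trust(self) -> float:
        # Posterior mean probability that the source's next claim is correct.
        return self.alpha / (self.alpha + self.beta)

# Example: a source whose claims we verified 8 times and caught wrong twice.
source = SourceTrust()
for correct in [True] * 8 + [False] * 2:
    source.update(correct)
print(f"estimated trustworthiness: {source.trust:.2f}")  # ~0.75
```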
Of course, all these solutions still carry assumptions with them and are therefore potentially wrong, but as noted, you need some assumptions to get any inferential project off the ground at all. Thus two people who are both rational, critical thinkers might, in some cases, come to disagreements, because they make different assumptions which guide their further inferences.
So I'd say that, in general, any response you find valid will have to hinge upon the assumptions you think are valid, including the definitions you give to critical thinking and rationality, and what qualities you value when justifying any belief you hold as you go about the world.
A ‘knowledge so vast … ‘ and ‘wise people’ are perhaps mythological concepts - at the very least they are fuzzy, so we need to do some definitional work on what we will count as knowledge and wisdom and what not, to draw a line between knowledge and enlightenment, between rational and irrational beliefs.
I’m fairly well educated in programming and math, but nowhere near as informed in chemistry or biology. If an authority in those fields explains something, it will often happen that I can’t follow. But I can consider whether the person seems trustworthy to me and what the context is: just a random posting on the internet, or a known scientific publication? Has this publication been trustworthy? Is it recommended by people who are trustworthy?
From areas where I am well informed, I can transfer scepticism to areas where I am not. I have seen situations where people relied too much on authorities and stepped into a trap, and I have seen young newcomers who threw away wise advice too prematurely. This doesn’t give me a miraculous knowledge of whom to trust and whom not. But I am aware that my trust might be too generous, and that my mistrust might be wrong.
But if my suspicion is raised, I know how I can work on it. I can search the web, I can ask friends, I can visit the library, I can look for experts. Are there diverging theories? Who is in favor of what? Which interests are involved, and who pays for the studies?
Let me give an example: I recently heard a radio feature about an esoteric farmer. He was proud to explain that he uses a special kind of wood - I’m not sure which one it was, I guess birch - to build his mills. It turned out that there is a corn pest which doesn’t like the resin of that tree, so it is perfectly rational to use that kind of wood.
Now the knowledge that this kind of wood is perfectly suited for building mills is rather old, because it was observed that there was less pest damage to the corn. But the fact that the smell of the resin is the source of this benefit is new.
Another piece of traditional knowledge is that the seeds for the corn have to be put into the ground on a full-moon night, and some ritual songs have to be performed on the field. But does the story of the resin prove that singing and dancing and full-moon nights have an influence, just because it was told by the same wise man? Is he a wise man at all?
Critical thinking is limited to the accepted knowledge on hand at the time. No one can know everything, so you have to accept some sources of information as reliable. For example, I listen to Brian Dunning’s podcast. I doubt some of his ideas from time to time, but on following the information up I find he is usually right. I accept that he may be wrong from time to time, and so does he.
I am always open to new information. As a critical-thinking atheist, if the proof is strong enough to show there are gods, then I will listen.
All content is licensed under CC BY-SA 3.0.