How could someone discover that they are bad at reasoning?
Imagine a person who values truth, logic and reason. Imagine this person believes the best way to have true beliefs is by applying logic and reason to the things that he may read, hear, see or otherwise experience.
Now imagine, unbeknownst to this person, that he's actually *bad* at applying reason and logic to things. Perhaps this person has a really poor intuition for logic. Perhaps this person routinely takes steps in his reasoning which are recognized as entirely unjustified to people who are good at reasoning.
If this person's truth-discovering tools like reason and logic are compromised in such a way, how could this person *discover the truth* that his truth-discovering (or filtering instead of discovering, if you prefer) tools are compromised and unreliable? After all, he might have to *use* those unreliable tools to discover or come to accept that the tools themselves are unreliable, so how could he do that?
And then, suppose he does come to understand that he's bad at reasoning - what then? If he still cares about the truth, but he has come to accept that his tools for discovering or filtering truths are compromised, what should he do?
If reason functions as a social product (I think Habermas says something like that (perhaps Fichte to some extent too)) then it may be that reason can be instilled into this individual via interaction with other rational individuals. In that case, patiently correcting someone for their logical infelicities may be best. Also, logic classes help. Lastly, I think epistemic humility is important.
Also, I assume this post is not about me, otherwise...irony!
Quoting Leontiskos
Yes, this seems like an intuitive avenue to a discovery of one's own erroneous reasoning faculties to me as well, BUT it relies on one necessary premise that the person bad at reasoning must first accept - he must first accept that other reasoners systematically disagreeing with him about his reasoning is a sign that his reasoning is (or might be) wrong.
If they don't accept that premise in the first place - if one of the faults in their reasoning faculties is to completely ignore the possibility that other people systematically disagreeing with them might be a sign that those other people are correct and they themselves are incorrect - then this avenue of correction gets shut down. Do they become essentially stuck permanently thinking poorly, stuck with bad reasoning, if this avenue is shut down?
The mind is a wonderful, if fickle, thing. Cognitive bias is especially powerful when compounded with a lengthy history of life choices and philosophy that has ended up playing a major role in the constitution of one's "identity". In short, people don't like to be wrong, because the brain doesn't like to be wrong. "If it ain't broke don't fix it" is an operational aphorism shared by both the conscious and unconscious mind, it would seem. People would rather convince themselves it's not raining despite being soaking wet if they felt strongly enough and had the ideological motivation to do so. Similar to the arguments made by those critical of religion, I suppose.
To disagree is not necessarily to identify a contradiction. It is harder to ignore a putative contradiction than it is to ignore a disagreement. At the foundational level others need to point to the contradictions in the poor reasoner's thinking. Everyone attends to putative contradictions to one extent or another.
Someone bad at reasoning may see another person disagreeing with them, saying "this argument is poorly formed / fallacious", and decide that that person just doesn't understand their argument. "You just don't understand my argument" is an easy and readily available out for anything the poor-reasoner says.
That's sort of why I'm talking about systematic disagreements, rather than just raw disagreements. Like, if you believe in God, and maybe 80% of philosophers you come across disagree with you and 20% agree with you, that's not really a 'systematic disagreement', that's just normal disagreement within a contentious topic.
But, if one of your arguments for something you believe in is of a particular form, and *literally everyone* you come across says "this form of argument is fallacious, this step in your reasoning doesn't follow from your last step", then it's no longer just a standard disagreement on a contentious topic, it's something else. It's systematic.
Disagreement is to be expected, even when you have beliefs that are reasonable or even correct. Universal, or near-universal, systematic disagreement is presumably not.
Most people self-describe in this way. I was just talking to a man who said precisely this, and that this is why he is a Muslim. Reason demonstrates the Koran is true. Which is obviously not the case.
Quoting flannel jesus
I suspect most of us are bad at this. We have 'reasons' for everything but I'm not sure how rational our thinking is.
Quoting flannel jesus
I'm not sure many of us are overly concerned about truth. In relation to what? Speaking personally, I navigate my world through intuition and experience rather than logic. There are a few subjects where I will employ reasoning per se, but generally this comes post hoc if I am pushed. We are emotional creatures who inherit most of our beliefs and capacities from the culture we are reared in. Post hoc justification is a wonderful thing.
Some of us desire to not be so beholden to such limitations. Everyone is biased, yes, but I think it's pretty likely that, on average, people who TRY to be less biased probably ARE less biased. On average, of course, not universally - some people may try to be less biased and fail, for various reasons.
For example, it's not so unusual to find people who say "I was born into a culture where everyone believed such-and-such religion, and that anyone who didn't believe it was damned. I realized that this theological lottery didn't make much sense, so I endeavoured to base my reasons for believing things on reasons other than the idea that I just happened to win the theological lottery by being born into this particular culture." I have a lot of respect for that thought process - where most people just accept those biases they inherit, *not everyone does*.
Exactly!!!! One of my elderly sisters, a fundamentalist and Trumper, maintains all sorts of illogical, unreasonable ideas about religion and politics. When countered, she flies into a rage. On other matters, like medical care or car maintenance, she is very rational.
Quoting flannel jesus
Based on my personal experience of not applying reason and logic to things, I can attest to the unpleasant consequences that can result. However...
Man does not live by logic and reason alone. Our very robust emotional systems are often first on the scene of decision making, and they have little interest in logic.
Whether or not the disagreement is systematic, they will reliably learn that they are mistaken once they begin to see the problems in their own account, and I believe this is best done via contradiction. Those who float at 10,000 feet are good at avoiding the contradictions in their thought. Things must be concretized and brought down to the ground level. I also think the correction should go beyond disagreement in terms of consensus. Consensus is not the best argument.
So in summary, I would give less prominence to the "reasoning" abilities of an individual (because that may assume criteria of knowledge or rationality to which the interlocutor may not agree), and greater prominence to their ability to re-assert and understand another's view. If someone won't do that then there is simply no discussion to be had.
We also must distinguish between an argument's soundness and its validity. An argument can be valid without being sound. If so, that's not a reasoning error, that just means an assumption is wrong. I am not sure that someone can be "reasoned" out of an assumption.
As a tradesperson, I found out about my poor reasoning by losing control of what I was doing. The only way back was accepting the mistake. And if that idea of the mistake was itself a mistake, then accepting what that revealed.
But you're right about the motivation - there was something in particular that brought this from just being a thing I've been thinking about, to making an actual thread about it. It's honestly an interesting case study. We can see it happening live! And I'm certain that my approach to it has NOT been optimal - certainly some things I've said that I'd like a do-over of.
It seems to me that the average person thinks little of their mistakes, so I would say the virtue of striving for perfection is outweighed by the stress that comes with it, especially when most are not trying nearly as hard.
Having come from a family of apostates I am well familiar with this phenomenon. But I still think that when people leave religions, it is just as likely because religion fails to satisfy them emotionally first. I think the reasoning comes post hoc. My Dad, who left the church in 1937, put it like this - 'I wasn't satisfied by any of the stories anymore. Then I looked into the arguments and found I wasn't the only one. Then I left.'
But we use logic and reason at work, in the family, in life in general. You rule something out, 'logically', and it turns out to be what happened - that's blunt feedback. You spend your boss' money based on 'logic and reason' that may well end up earning some blunt feedback.
Of course I think humans are capable of denying pretty much anything, so there's no guarantee for the hypothetical person.
But there are many ways feedback can come and be hard to ignore.
Quoting flannel jesus
Apprentice (verb). And for the mentor, a tried and true method of teaching is for the mentor to use whatever the skill being passed is and think out loud while doing it.
This thought occurred to me too, but with the following hesitancy: if this person knows he's bad at discovering truths and reasoning, how confident can he be that it's *true* that the person he's chosen as his mentor is *good* at reasoning? I mean, if this person is bad at reasoning it's probably pretty likely that just about ANY mentor might offer an improvement, but still, he might prefer to have a better mentor rather than a worse one - if his reasoning skills are compromised, it stands to reason that his ability to distinguish a mentor that's good at reasoning might also be compromised.
A problem well exhibited in the Theaetetus. Should an idea survive? Is the test right or wrong?
For someone who seriously wants to look into it, there are tests like the WAIS which can yield more fine grained knowledge of cognitive strengths and weaknesses.
Specifically, they can choose someone well regarded by expert peers. Of course, this is no guarantee, but what else can one do but do one's best, based in part on what seem to be the best suggestions from others.
I think most people can improve, once they realize there is a problem.
And I think it is possible for anyone to get a sense there is a problem, if only for a few moments.
I think that would be a safe bet. Of course, the person bad at reasoning could only agree that that's a safe bet if one of the things he's bad at doesn't involve him completely discounting the expertise of others - if he's rejected even that, he's stuck in a limbo of bad reasoning forever. But if he has that - if he's capable of respecting the reasoning abilities of others, and of experts above all - then he stands a chance at discovering his own flaws and perhaps finding guidance to fix them.
He could even just delegate his thinking to another person. "I will believe whatever this person believes". If he's bad at reasoning, maybe he could resign himself to just stop trying, if that's possible. Probably not possible in his everyday life, but maybe possible in the space of ideas - when he's pontificating on philosophy or science or whatever.
I understand that someone with global deficits, then has a better chance of not noticing problematic feedback from life and other people. But really, we are all in this boat, I think.
We may be solid enough coasting on what we are, in fact, correct in thinking we are good at. But somewhere, at edges of our skills, we may not know we have a problem and, if we decided to try to improve, might not do the right things to achieve that.
I have my guess, like others, where this thread came from. I think we are dealing with an over-creative approach, combined with a dislike of what most people think of as authority. Or perhaps better put, a joy in challenging what seems obvious to others. This may lead to some real stubborn mistakes. I could be projecting, since I have had both of these problems myself.
Another way to extricate oneself might come when one realizes what is actually going on - in interpersonal and broader personality-driven ways - and how these affect reasoning. There are many ways life can come and smack you upside the head (don't I know it) and get one to start noticing bad habits that affect cognition and reasoning.
One has to, I think, actively stifle moments of awareness that something is off. In a way, all the person has to do to start is let those flashes of doubt and insight get more light and air.
I don't think that would be terribly difficult in a controlled situation - like science, or architecture, or engineering. You would find out you were wrong by having your predictions disconfirmed, or by flaws in your designs or projects (although there are those who are notoriously bad at recognising their own flaws, like Trevor Milton, who started an e-lorry company based on lies and was jailed as a result.)
But it's a lot more slippery when it comes to moral judgements and ethical decisions, as the criteria are not necessarily objective (I say not necessarily, because if those judgements and decisions cause harm or calamity, those are objective consequences.) But it's possible to skate through life being wrong about any number of such things, and if there is no karma-uppance in a future existence, then - so what?
Would you be okay with accepting a world of consequences without being able to find out what they will be?
Psychological stress in the form of cognitive dissonance.
I tend to agree.
Don't you think it's no different than for any language/linguistic tool?
A person who believes the answers to her questions regarding subject x are best arrived at through correctly applying logic and reasoning, should learn how to use those tools.
A person who believes the answers to her questions regarding subject x are best arrived at through correctly applying Calculus and Cartesian geometry, should learn how to use those tools.
Or, have I misunderstood/oversimplified your query?
Good question. I suppose it's analogous to compatibilism in some ways.
Cognitive dissonance is usually presented as an obstacle to learning. The result is a decrease of stress.
Everything is okay. Never mind.
I think about it in Aristotelian terms. There are too many accidents to explain through necessity. But there are too many repetitions to blow off connections between actions.
The person realizes that they are human and make mistakes like all of us do.
Quoting flannel jesus
He should consider ALL possibilities and be aware that he will STILL make mistakes, as we all continue to do. But by realizing this, he may be more respectful of other people's views, even if he does not agree with them, since he knows that he too could be wrong. He might also conclude that he has been living and surviving like this for all his days, like us all, and all he can do is 'try' to be open-minded and get closer to the truth... whatever that is.
You assume you are reasonable but your own reason teaches you that you are not using reason. Just sane enough to know you're insane. Sounds like a Kafka novel.
Quoting Leontiskos
No to both quoted posts above. I have experienced people like this in real life and I now avoid having any discussion with them, except to greet them good morning, hello, how're you doin'? And when I say discussion, I mean the topic of everyday life, let alone serious current events.
They don't see the contradiction in what they say. They're not interested in learning or hearing about the contradiction in their statements. Mind you, when they're having a conversation, they speak with authority -- "I got bitten by mosquitos. See these bites?" I'd respond by saying -- those are flea bites, not mosquito bites, judging from the marks on the skin. She would then reason by saying, well I have a can of water standing in my yard. This is all she would hold on to for "evidence" that it's a mosquito bite. I could go on with this... you ask her, did you see any mosquitos at all? She'd say, no, but there's that standing water. :wink: lol.
It seems as though, with our one example of this situation on this forum, one has to be willing to see contradictions before one is able to see contradictions. Our one test example on the forum, when faced with the contradiction, can just will themselves out of seeing it.
So, then: can one be bad at reasoning and still be willing to see contradictions? I would say yes. Unless we take 'bad at reasoning' to mean one never draws correct conclusions. But one could draw the conclusion that it would be good to notice contradictions as the result of bad reasoning. Like: 'I've never seen Angelina Jolie where she was clearly not noticing contradictions.' 'Therefore she is good at noticing contradictions.' 'Everyone should be like Angelina Jolie.' 'Therefore I will look for contradictions.' And so they do look and find, and slowly realize that while their original reasoning for deciding this was not perfect, they're glad they decided to look for and notice contradictions.
Then we have to ask ourselves if someone can notice contradictions with utterly terrible reasoning. Well, it depends how terrible is terrible. Say someone goes outside in the 98 degree heat to his car parked in the sun and notices the hood is warm, but he hasn't driven the car for weeks. Suddenly, I think, some percentage of bad reasoners will recognize a problem. Not all, clearly, but some.
After a week of mulling I decided I was time travelling. Which makes sense since it was a week later. Or as I put it to myself 'therefore I am a time traveler.' If I hadn't noticed the hood, I wouldn't have been a time traveler.
I thought I'd revisit this and made a list off the top of my head of terms that have to do with reasoning - processes/functions that need to work well or you may have a problem with reasoning. These are not distinct categories; they overlap.
Some of the ones that I think account for bad reasoning (in all of us, at some point, and in certain people regularly). I just bolded the last three, and these three also overlap. I think when people are bad at reasoning it is often these three functions that are not up to snuff.

You need to have the ability to notice small cues in yourself that you might be wrong. IOW I think people often get signals from themselves that they are not sure - when presenting as sure - or the nagging sense that something might be off, but these signals get ignored. It's not just that we can't take criticism from others or recognize the validity of other perspectives; we actually don't listen to ourselves. We don't want to. If we don't want to because it hurts - to consider we might be wrong on a particular issue, or ever - then we lack grit in the face of cognitive dissonance. We can't sit with those uncomfortable feelings. We want to win, make those feelings go away, be right, period. If we're not aware of what our feelings think is at stake, we won't be vigilant about such cues. If we're not aware of past biases or interpersonal habits we have, we may converse with poor reasoning.

That's some floppy mulling over those last few terms, which we might not think of as involved in poor reasoning. Oh, he doesn't understand logic or deduction or whatever - that's certainly an issue with people. But in general I think if we encounter someone with bad reasoning, there's only a real problem if those last three items are weak.
Critical Thinking
Logical Reasoning
Problem-Solving
Decision Making
Analytical Thinking
Inductive Reasoning
Deductive Reasoning
Cognitive Flexibility
*Meta-cognition*
*Introspection*
*Grit in the face of Cognitive Dissonance*
Yes, "bumping up against" involves noticing. Note that you asked how someone could discover they are bad at reasoning. They could do so by noticing contradictions in their own thought. This doesn't mean that they are guaranteed to notice contradictions in their own thought, or that there is a method which provides such a guarantee.
https://en.wikipedia.org/wiki/Black_swan_theory
See also: https://en.wikipedia.org/wiki/Kant_and_the_Platypus
(The platypus is an anomalous creature that transcended the category "mammal" when discovered, by both suckling its young and laying eggs rather than bearing live young, thereby irritating many tidy minds.)
One discovers, when reality bites, rather than when the king of the internet argues.