Tools of Critical Thinking: Metathoughts for Psychology - by David A. Levy

My Notes

Language, Mind and Body

Language is highly subjective. We think we describe things objectively but this seldom happens. Our word choices reflect our biases and prescribe rather than describe. Even when we try to stay neutral, we are limited by the “constraints of language.”

One person sees a woman as “pushy” while another sees her as “ambitious.” (Other examples: rigid vs. steadfast, wimpy vs. sweet, strong vs. cold.) Our attitudes affect our language and our language affects our attitudes. It’s circular. We should be aware of our own language biases and other people’s.

Reification, in this context, means treating an abstract concept as if it were a concrete thing. Self-esteem, for example, is not a real, measurable object, but we may speak about it as if it were. The mind and “mental” things are not physically measurable.

We need to learn to differentiate between “events” and “constructs” (aka between “physical” and “mental”). We can be more accurate when we test an event, such as “Why does my car stall at stoplights?” “Why am I unhappy?” is more complex and thus harder to “solve.”

When theorizing about constructs, it’s best to err on the side of utility. We’ll never know the truth, but we may know if a theory is useful or not (Heuristics!).

Is a patient’s condition physical or mental? This is a complex question.

  • What comes first, the mind or the body? Do bodies exist because we perceive them?
  • Or are our perceptions a manifestation of our physical bodies?

These questions have plagued mankind since the beginning.

We can analyze things at different levels.

  • At the physical level, we can objectively measure things (biochemical, neurological, anatomical).
  • At the construct level, we have abstract events that are not directly measurable (mental, experiential, perceptual, etc).

How do these two levels relate to one another? Cause and effect?

If A causes B, then A must happen before B. The author supposes that physical events can happen without mental ones (e.g. after death, biochemistry still breaks the body down), but the converse is not true: mental events cannot take place without some physical cause (even though we may not be able to measure that physical cause).

Mental and physical events occur together as one “psychobiological event.” Asking whether a patient’s problem is “physical” or “mental” is presumptuous and assumes they are separate things.

Mental and physical are forever intertwined.

If a patient is anxious, he may have an increased heart rate and/or different measurable chemicals in his system. That’s the physical, measurable level. He may also be thinking fearful, worried thoughts. That’s mental (and not measurable). Taking tranquilizers or talking to a loved one may soothe his condition. “In this way… psychotherapy is no less biochemical than medication.”

The point: Don’t confuse correlations with causes. Just because you find another symptom, it doesn’t mean it’s “causing” the first one.

“Just because your doctor has a name for your condition doesn’t mean he knows what it is.” - Arthur Bloch

Words are “so important to us.” We forget they’re symbols and place too much importance on them.

The Nominal Fallacy: We think we have explained something because we have named it.

Example of the Nominal Fallacy:

  • Q: Why can’t she fall asleep?
  • A: Because she has insomnia.
  • Q: Why are you afraid of spiders?
  • A: I have arachnophobia.

This is circular (or “tautological”) reasoning. We have named, not explained.

A tautology can never be falsified because the reasoning is circular.

Example of tautological reasoning:

  • Q: Why did he do that?
  • A: He’s insane.
  • Q: How do you know he’s insane?
  • A: Because only an insane person would do something like that!

Diagnosing is not explaining. Names are not explanations. Keep an eye out for tautological traps!

A light switch is on or off. A coin toss lands heads or tails. These are dichotomous events that can have only two states. More often, a variable can take on an infinite number of states. For example, there are an infinite number of shades of gray between white and black.

We often confuse dichotomous and continuous variables. For example, do you trust or mistrust? Is this normal or abnormal? Healthy or not? These are all continuous variables worded as though they are dichotomous.

We split the world into black and white, good and evil, right and wrong. More often, things are in shades of gray. The more we can see shades of gray, the more critically we can think.

Comparing something to its polar opposite helps us define it. Dark only exists because light does.

  • What is “mental illness?” Well, it may be helpful to define “mental health” first.
  • How do I protect my house from burglars? Hm… how would I break into this house?

Psychologists believe that we have polar opposites within us and this can create unrest and health issues.

Freud wrote of “reaction formation,” in which we outwardly express the opposite of an impulse we have repressed. For example, a man who outwardly criticizes homosexuality may be compensating for his own repressed homosexual desires.

“Considering the opposite” is a useful metathought. Attempt to argue the other side of the issue, especially if you are passionate about it.

What is similar and what is different depends on perspective. For example, which of these words does not belong with the others?

  • knee
  • eye
  • mind
  • rock

One person says “mind” because it’s not physical. Another says “rock” because it isn’t part of the body. Another says “eye” because it has three letters. Another says “knee” because it has a silent letter. What are we looking for? Grammar, physicalities, etc? The observer and context determine what similarities and differences are. They are not absolute.

No matter how many similarities two phenomena share, there will be a point where they diverge. We call these points PCDs or “Points of Critical Distinction.” Similarities happen before the PCD. Differences happen at or after it.

The takeaway: Any two “things” have similarities and differences, even seemingly opposite concepts. We must always consider both when making comparisons. (Besides, dividing things up into separate entities is something humans do; the divisions don’t exist on their own. All divisions are artificial.)

Don’t let differences be left out in favor of similarities!

Use this metathought when evaluating analogies, especially when we are reifying something like a “broken relationship” by comparing it to a physical “broken machine.” Analogies are useful, but we must be skeptical of them.

We see this flawed thinking in racism, sexism, etc., where the observer over-generalizes perceived similarities and leaves out the individual differences between people.

Don’t let similarities be obscured by differences either!

“You can’t compare these two things because they’re totally and completely different from one another!” For example, if a psychotherapist treats every patient as a new, totally separate being, she may miss underlying human themes like the need for love, security and trust.

We are unable to separate our observations from our beliefs. When we try to describe, we often prescribe more than we realize.

Maslow’s hierarchy is given as an example. Rather than study people with “mental illnesses,” Maslow cherry-picked individuals who were self-actualized and successful, and from them derived a set of ideal human qualities. Critics point out that another person might pick different exemplars of the human ideal and thus reach different conclusions.

Maslow tried to show what “is,” but much of his work reflects his beliefs about how we “should” be.

A Barnum statement is one-size-fits-all and applies to everyone, such as “He has trust issues” or “She is afraid of getting hurt.” Who doesn’t?! Yet we make these claims all the time as if we were singling people out.

The Barnum statement gets its name from the showman P.T. Barnum, whose circus famously had a little something for everyone.

Cause and Effect

If A and B are correlated, they tend to occur together. We may conclude that A causes B. But B may also cause A. Perhaps A and B are causing each other. Or some third factor, C, may be causing both A and B.

For example, a patient comes to a shrink and his life improves. Setting aside that life improvement is not objectively measurable, the shrink may assume that the treatment caused the improvement. But maybe the patient started turning his life around first, and that prompted him to call a shrink. Maybe the two events are bidirectional, each causing the other. Or maybe the patient came into some money, and that caused both the life improvement and the shrink visits.

The takeaway: Consider all possible pathways of causation.
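
To make the third-variable case concrete, here is a minimal simulation sketch. The scenario, variable names, and probabilities are my own invention (not from the book): a windfall of money drives both shrink visits and life improvement, so the two end up correlated even though neither causes the other.

```python
import random

random.seed(0)

# Hypothetical illustration (the scenario and probabilities are invented):
# a windfall of money (C) makes both "sees a shrink" (A) and
# "life improves" (B) more likely.
def trial():
    windfall = random.random() < 0.5
    sees_shrink = random.random() < (0.7 if windfall else 0.2)
    life_improves = random.random() < (0.8 if windfall else 0.3)
    return sees_shrink, life_improves

samples = [trial() for _ in range(10_000)]
with_shrink = [improved for saw, improved in samples if saw]
without_shrink = [improved for saw, improved in samples if not saw]

print(f"P(improves | shrink)    = {sum(with_shrink) / len(with_shrink):.2f}")
print(f"P(improves | no shrink) = {sum(without_shrink) / len(without_shrink):.2f}")
# The first number comes out clearly higher, which looks like "therapy works,"
# yet by construction the only causal arrows run from the windfall to A and B.
```

The same correlation would appear if A caused B, if B caused A, or if they fed each other; the numbers alone can’t tell you which pathway is at work.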

It’s also possible to draw false causal conclusions. These may take the form of superstitions (e.g. you sat down next to me and I started winning card hands, so you must have caused this positive turn of events).

It’s often impossible to conclude what causes what. In many cases, A and B cause one another.

Parents who are cold, controlling and hostile cause kids to have behavioral problems. But wait: can kids with behavioral problems cause parents to become cold, controlling and hostile?

We are often tempted (or prompted) to choose “either/or.” More often the answer lies in “both/and.”

Am I irritable because I’m tired or because the child is screaming? It’s probably some of both, plus the house repairs and several other factors.

Linear Causation Example (where the C’s are causes and E is the effect):
C1 + C2 + C3 = E
Notice that any one cause alone will produce the effect, and more causes amplify the effect.

Multiple Causation Example:
C1 * C2 * C3 = E
Notice that if any of the causes is zero, there is no effect. It takes the interaction of the causes to get the effect. For example, Blue * Yellow = Green: you need both blue and yellow to make green.

In the why-am-I-irritable example, the causes don’t account for equal shares of my overall irritation. Being tired may account for 50% of it, the screaming for 25%, an unpaid bill for 10%, and so on.

Takeaway: Causation is not dichotomous. There are “shades of gray” involved and “all causes are not created equal.” Consider their relative weights.
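
A quick numeric sketch of the two models above, using made-up weights on a 0-to-1 scale (the numbers are mine, purely for illustration): in the additive model every nonzero cause contributes something on its own and heavier causes contribute more, while in the multiplicative model a single zero wipes out the effect entirely.

```python
# Illustrative only: three invented causes of irritability on a 0-to-1 scale.
tired, screaming_child, unpaid_bill = 0.8, 0.5, 0.2

# Linear (additive) causation: each cause contributes on its own, and
# "all causes are not created equal" is captured by the invented weights.
weights = (0.50, 0.25, 0.25)
additive_effect = sum(w * c for w, c in zip(weights, (tired, screaming_child, unpaid_bill)))

# Multiple (interactive) causation: every cause must be present;
# set any one of them to zero and the whole effect disappears.
multiplicative_effect = tired * screaming_child * unpaid_bill

print(f"additive effect:       {additive_effect:.2f}")
print(f"multiplicative effect: {multiplicative_effect:.2f}")
print(f"additive, no bill:       {0.50 * tired + 0.25 * screaming_child:.2f}")
print(f"multiplicative, no bill: {tired * screaming_child * 0:.2f}")
```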

Don’t be fooled when you see the same or a similar effect. That doesn’t necessarily mean the same thing caused it.

Depression can be caused by physical illness, genetics, childhood experiences, drugs, and countless other factors.

People do things because of both internal factors (their own reasons and dispositions) and external factors (their environment and circumstances). The author says we tend to place too much emphasis on the former, blaming people for their actions instead of their circumstances.

Lee Ross called this the Fundamental Attribution Error because it’s so prevalent.

Examples: “If they don’t go to work, they’re lazy. If they cry, they’re sensitive. If they say nice things, they’re friendly.” In these cases, we aren’t considering external factors.

Behaviors are dictated by internal (“who”) and external (“where”) factors.

Interestingly, we tend to reverse the Fundamental Attribution Error when analyzing ourselves: we are happy, sad, or angry because of external circumstances, not because of our internal dispositions.

Cognitive Biases come from mental shortcuts. Our brains can’t process everything that’s happening, so we end up focusing on the person moving around (and not on all the things moving around them, because that’s too much to take in).

Motivational Biases come from our need to feel in control and make sense of the world. It’s hard to accept unjust, horrible things so we assign blame to people instead of circumstances.

Author’s takeaway: Never underestimate the environment as a cause!

Perhaps a pill, or some other biological solution, alleviates a patient’s suffering. This does not necessarily mean the cause of their suffering is biological in nature.

Correlation is not the same as cause. We often confuse the two.

  • If my car won’t start, I put gasoline in it, and it starts, then the lack of gasoline was the cause.
  • If moving to a new house alleviates my depression, it may or may not be the new house that has cured me. There may be a correlation between the house-swap and my mood, but other factors are probably at play too (roommate, neighborhood, privacy, new-ness, etc).

Notice the difference in that example between physical, mechanical things (car, gasoline) and human emotions (depression, a move), which are complex constructs.

We often confuse results with intent: if we feel hurt by someone, that doesn’t necessarily mean they intended to hurt us.

“You don’t return my calls and that hurts my feelings! Therefore your intent must be to hurt my feelings.”

The result of an action does not necessarily prove its cause. Always consider alternate intents behind a given result.

Feelings and veracity (truth) are orthogonal. We can:

  • Feel good about something true, e.g. “I got an ‘A’ on the exam.”
  • Feel bad about something true, e.g. “I have a drug problem.”
  • Feel good about something false, e.g. denying that our girlfriend is cheating on us.
  • Feel bad about something false, e.g. “Everyone is out to get me!”

A co-worker accuses you of theft. You react with strong negative emotions. This doesn’t necessarily mean it’s true. If it is true and you did steal, you might have the same reaction.

Feelings are not proof of the truth.

Extraordinary events aren’t always caused by extraordinary causes. You think of a song then it comes on the radio. How many times has this happened to you? How many times have you thought of a song that didn’t come on the radio? How many times have you turned on the radio and heard a song you didn’t think of?

Lots of things are happening all the time. Statistically, it makes sense that extraordinary things will happen once in a while.
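
A rough back-of-the-envelope version of that point, with numbers I’ve invented purely for illustration: even if a particular “amazing” coincidence has only a 1-in-1,000 chance of happening on any given day, over a few years it becomes more likely than not that it happens at least once.

```python
# Invented numbers for illustration: p is the daily chance of one specific
# coincidence (say, a song you just thought of comes on the radio);
# n is the number of days in roughly three years.
p = 1 / 1000
n = 3 * 365

# P(at least one occurrence) = 1 - P(it never happens in n tries)
at_least_once = 1 - (1 - p) ** n
print(f"P(at least one 'extraordinary' coincidence) = {at_least_once:.2f}")
# Comes out to roughly two chances in three: plain statistics,
# no extraordinary cause required.
```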

Investigation, Describing, Thinking

Deductive Reasoning starts with a general premise and works towards a specific instance, e.g. all dogs have four legs; therefore, the dog Fido has four legs.

Deductive Reasoning is flawed if the general premise is untrue. DR is also flawed when the premises are true but the logic itself is invalid. The book gives this example:

  • All men are mortal.
  • Socrates is a man.
  • Therefore all men are Socrates.

Inductive Reasoning starts with a specific instance and works up towards a general premise. IR is also referred to as data-driven reasoning.

Scientists use Inductive Reasoning by gathering lots of data, looking for patterns and then constructing a general theory or law that satisfies what they’ve found.

Inductive Reasoning may fail when we haven’t gathered adequate data and we over-generalize, e.g. all the Frenchmen I know are thieves; therefore Pierre, whom I haven’t met yet, is a thief.

“The scientific method is based on tampering with what would be happening if we were doing nothing to it.” - R.D. Laing

When we observe something, we change it. This is especially true of Social Sciences. People act differently when they are observed.

Workers at a Western Electric plant (the famous Hawthorne studies) were observed to see which factors would increase their work performance. Researchers found that no matter what they changed (no breaks, later quitting times) the workers performed better. A reactive effect had occurred: the workers performed better because someone was paying attention.

Are reality shows showing reality? No, they are showing people who know they’re being observed. Who knows what these people are “really” like.

The self-fulfilling prophecy is prevalent. Children in classes and even rats in mazes were shown to perform better when the teachers/researchers were told certain kids/rats were more skilled (though they were chosen at random). Expecting something to be so makes it so. Of course, the person observing the observers may have been expecting to see the self-fulfilling prophecy and made that so too ; )

We carry mental “schemas” (aka models) that help us reason about all the stimuli around us. When we encounter something that contradicts our schemas we can either:

  • Accommodate the new information and update our schema.
  • Alter or ignore the new information so it fits our pre-existing schema.

The latter, called the Assimilation Bias, is far more common. We “see,” interpret, and accept things that agree with what we expect, and we ignore, skew, or mis-remember things that don’t agree with our beliefs/schemas.

“More than believing what we see, we tend to see what we believe.”

We should be aware that we want to see things that agree with our schemas. When an observation doesn’t agree, take note and work to assimilate the new information into a stronger schema (rather than disregarding the new info).

“The psychiatrist’s eagerness to find mental illness wherever he looks is matched only by his reluctance to define mental illness.” - Thomas Szasz

“Please don’t confuse me with the facts!”

Another common bias, the Hindsight Bias, happens when we believe, after the fact, that we knew all along what was going to happen. For example, we have proverbs for all kinds of contradictory I-told-you-so-isms. The book gives several examples:

  • “Out of sight, out of mind” vs. “Distance makes the heart grow fonder.”
  • “No evil can happen to a good man” vs. “Nice guys finish last.”
  • “Haste makes waste” vs. “He who hesitates is lost.”

We overgeneralize based on vivid examples. For example, the loud religious extremists are more vivid and memorable than the silent majority and we often overgeneralize based on them. Vivid, memorable testimonials from a few people sway us more than comprehensive statistics from hundreds of people on paper.

Understanding something isn’t solving it. After years of therapy, you discover that you are afraid to open up emotionally and trust. While discovering this may be helpful, it doesn’t automatically remedy it! This is a common fallacy.

Every decision, however small, is a trade-off. Take stock of what you’re giving up when you choose something.