
The Psychology Community


If you have watched a 2019 Democratic Party debate, you perhaps have taken note: While Pete Buttigieg, Elizabeth Warren, and Cory Booker glide smoothly through their spoken words, Joe Biden sometimes hesitates, stammers, and stumbles. Is he just less mentally agile than his more lucid counterparts?

 

Perhaps we should cut him some slack, suggests John Hendrickson in an upcoming Atlantic essay. Biden experiences the lingering effects of childhood stuttering that made him a subject of mockery. An empathic Hendrickson, himself a stutterer, illustrates from Biden’s July debate:

 

“My plan makes a limit of co-pay to be One. Thousand. Dollars. Because we—”

He stopped. He pinched his eyes closed. He lifted his hands and thrust them forward, as if trying to pull the missing sound from his mouth. “We f-f-f-f-further support—” He opened his eyes. “The uh-uh-uh-uh—”

 

Hendrickson is not the only one who empathizes. As a childhood stutterer who received speech therapy in my Seattle public elementary school, and for whom such dysfluency has occasionally resurfaced in adulthood, I know the dismay of coming up to a word that gets stuck in the roof of the mouth, to everyone’s embarrassment, especially my own. For me, K has been a difficult consonant, and sometimes there seems no other way to call on “K-k-k-kathy.”

 

But often, those who stutter have learned that they can fake normal fluency by backing up and detouring around the verbal roadblock, rendering the impediment invisible. As with Joe Biden’s debate responses, listeners may notice the pauses and mid-sentence changes of direction. They just don’t attribute the dysfluency to stuttering (an attribution Biden himself does not make).

 

And so it happens with the great invisible disability, hearing loss. “Can everyone hear me?” asks the person on stage. Given the inevitable answer from those hearing the question, the nodding heads lead the speaker to think, “I don’t need a mic.” And most in the audience likewise presume all’s well—oblivious to the unseen exclusion of so many of us (and hence my advocacy for user-friendly hearing accessibility technology in such settings—see here).

 

Like stutterers, those of us with hearing loss also finesse awkward situations. At a noisy party or in a restaurant, we fake hearing. As our conversational partner makes unheard social chatter, we smile and nod—not wanting to be a pain by asking people to repeat and repeat. Sometimes our response is inappropriate—smiling at someone’s sadness, or being unresponsive to a question. But mostly, after straining and failing to carve meaning out of sound, our pretending to hear hides our disability.

 

There’s practical wisdom to socially finessing one’s speech or hearing challenges. But some go further to hide their hearing disability. They respond to ads for “invisible hearing aids” that can keep people from knowing that—shame, shame—you have hearing loss. (Shame instead on the hearing professionals whose ads imply that hearing loss is something to be deeply ashamed of, and to hide.) Actually, the more public I am about my hearing loss, the more comfortable I become at seeking people’s help in coping with it—by finding quieter tables in quieter restaurants, facing the wall, sitting with my good ear toward the person, having them speak into that ear, and using a wireless mic that transmits to my hearing aids.

 

We can extend the list of hidden disabilities to include some forms of vision loss, brain injury, chronic fatigue, pain, phobias, dyslexia, depression, dementia, and a host of others. Given the invisibility of such disabilities, we often don’t see the challenges that lie behind everything from a child’s misspellings to a Joe Biden stammer. If only we knew—and if only those of us with the invisible challenges would let others know—we all could be less judgmental, more understanding, and more genuinely helpful.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

Bill Gates wants people he hires to read two of his favorite books: The Better Angels of Our Nature, by psychologist Steven Pinker, and Factfulness by the late Hans Rosling.

 

I, too, have loved these books, which form a complementary pair. Pinker argues—our current malaise notwithstanding—that the world is getting better. World hunger is abating, child labor is disappearing. Murder and wars are less common. Literacy is increasing. Given a choice between living a half-century or century ago or today, any sane person would choose today.

 

Rosling mined world data to document these trends and many more. And now the Rosling family’s Swedish foundation is offering stunning dynamic graphic displays of world data.

 

For example, see here and click on the animation for a jaw-dropping depiction of the life-expectancy increase (in but an eye-blink of our total human history).

 

Today’s average human lives much longer, thanks partly to the dramatic decline in child mortality from a time when nearly half of children died by age 5 (and when there was biological wisdom to having more than two children).

 

Other show-the-class goodies include:

 

These facts should whet your informational appetite. For more, explore www.gapminder.com/data. “Gapminder makes global data easy to use and understand.”

 

And then explore www.OurWorldInData.org, founded by Max Roser. This is an Oxford-based source of world data on all sorts of topics. “Our World in Data is about research and data to make progress against the world’s largest problems.” An example, presenting World Bank/United Nations data on the “missing women” phenomenon in certain countries since the advent of prenatal sex determination:

 

 

On the commercial side, www.statista.com has a wealth of information—such as, from my recent searching, data on anti-Semitic crime trends, social media use, and dating app usage.

 

For us data geeks, so many numbers, so little time.

 

Not everything is “better angels” rosy. In addition to sex-selective abortions, we are menaced by climate change, nationalism, hate speech, and rampant misinformation. Even so, the Pinker/Rosling message—that in many important ways life is getting better—is further typified by these very websites, which provide easy access to incredible amounts of information that our ancestors could never know.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

 

“Death is reversible.” So began NYU medical center’s director of Critical Care and Resuscitation Research Science, Sam Parnia, at a recent research consultation on people’s death experiences during and after cardiac resuscitation.

 

Biologically speaking, he explained, death and cardiac arrest are synonymous. When the heart stops, a person will stop breathing and, within 2 to 20 seconds, the brain will stop functioning. These are the criteria for declaring someone dead. When there’s no heartbeat, no breathing, and no discernible brain activity, the attending physician records the time of death.

 

Yet recent advances in science reveal that it may take many hours for individual brain cells to die. In a 2019 Nature report, slaughtered pigs’ brains, given a substitute blood infusion 4 hours after death, had some cellular functions gradually restored over a 6- to 10-hour period. For many years now, brain cells taken from human cadaver biopsies have been grown in the lab up to 20 hours after death, explained Parnia. His underappreciated conclusion: “Brain cells die very, very slowly,” especially for those whose brains have been chilled, either medically or by drowning in cold water.

 

But what is death? A Newsweek cover showing a resuscitated heart attack victim proclaimed, “This man was dead. He isn’t any more.” Parnia thinks Newsweek got it right. The man didn’t have a “near death experience” (NDE). He had a death experience (DE).

 

Ah, but Merriam-Webster defines death as “a permanent cessation of all vital functions.” So, I asked Parnia, has a resuscitated person actually died? Yes, replied Parnia. Imagine two sisters simultaneously undergoing cardiac arrest, one while hiking in the Sahara Desert, the other in a hospital ER, where she is resuscitated. During the minutes after their hearts and brains stop functioning, the two are in identical biological states. Would we say that the first sister was dead but that the second, simply because she could be resuscitated, was not?

 

Of the 2.8 million CDC-reported deaths in the United States annually, Parnia cites estimates that perhaps 1.1 million involve attempted cardiac resuscitation. How many people benefit from such attempts? And of those who survive, how many have some memory of their death experience (cognitive activity during cardiac arrest)?

 

For answers, Parnia offers his multi-site study of 2060 people who suffered cardiac arrests. In that group, 1730 (84 percent) died and 330 survived. Among the survivors, 60 percent later reported no recall of their death experience. The remaining 40 percent had some recollection, including 10 percent who had a meaningful “transformative” recall. If these estimates are roughly accurate, then some 18,000 Americans a year recall a death experience.
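
One plausible back-of-the-envelope route to a figure of that size (a reading, assuming the 10 percent “transformative” recall figure and the study’s 330-of-2060 survival rate apply to the estimated 1.1 million annual resuscitation attempts):

1,100,000 × (330 ÷ 2060) × 0.10 ≈ 17,600, or roughly 18,000 people per year.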

 

NDEs (or DEs) are reportedly recalled as a peaceful and pleasant sense of being pulled toward a light, often accompanied by an out-of-body experience with a time-compressed life review. After returning to life, patients report a diminished fear of death, a kinder spirit, and more benevolent values—a “transformational” experience that Parnia is planning to study with the support of 17 major university hospitals. In this study, cardiac-arrest survivors who do and don’t recall cognitive experiences will complete positive psychology measures of human flourishing.

 

One wonders (and Parnia does, too), when did the recalled death experiences occur? During the cardiac-arrest period of brain inactivity? During the moments before and at cardiac arrest? When the resuscitated patient was gradually re-emerging from a coma? Or even as a later constructed false memory?

 

Answers may come from a future Parnia study, focusing on aortic repair patients, some of whom experience a controlled condition that biologically approximates death, with no heartbeat and flat-lined brain activity. This version of aortic repair surgery puts a person under anesthesia, cools the body to 70 degrees Fahrenheit, stops the heart, and drains the blood, creating a death-like state, during which the cardiac surgeon has 40 minutes to repair the aorta before warming the body and restarting the heart. Functionally, for those 40 or so minutes, the patient is dead . . . but then lives again. So, will some of these people whose brains have stopped functioning experience DEs? One study suggests that at least a few aortic repair patients, despite also being under anesthesia, do report a cognitive experience during their cardiac arrest.

 

Parnia hopes to take this research a step further, by exposing these “deep hypothermia” patients to stimuli during their clinical death. Afterwards he will ascertain whether any of them can report accurately on events occurring while they lacked a functioning brain. (Such has been claimed by people having transformative DEs.)

 

Given that a positive result would be truly mind blowing—it would challenge our understanding of the embodied person and the mind-brain connection—my colleagues and I encouraged Parnia to

  •      preregister his hypotheses and methods with the Open Science Framework.
  •      conduct the experiment as an “adversarial collaboration” with a neuroscientist who would expect a null result.
  •      have credible, independent researchers gather the data, as happens with clinical safety trials.

 

If this experiment happens, what do you predict: Will there be someone (anyone) who will accurately report on events occurring while their brain is dormant?

 

Sam Parnia thinks yes. I think not.

 

Parnia is persuaded by his accumulation of credible-seeming accounts of resuscitated patients recalling actual happenings during their brain-inactive time. He cites the case of one young Briton who, after all efforts to restart his heart had failed and his body turned blue, was declared dead. When the attending physician later returned to the room, he noticed that the patient’s normal color was returning and discovered that his heart had somehow restarted. The next week, reported Parnia, the patient astoundingly recounted events from his death period. As Agatha Christie’s Miss Marple reflected, “It wasn’t what I expected. But facts are facts, and if one is proved to be wrong, one must just be humble about it and start again.”

 

My skepticism arises from three lines of research: the failure of parapsychology experiments to confirm out-of-body travel with remote viewing, the mountain of cognitive neuroscience evidence linking brain and mind, and scientific observations showing that brain oxygen deprivation and hallucinogenic drugs can cause similar mystical experiences (complete with the tunnel, beam of light, and life review).

 

Nevertheless, Parnia and I agree with Miss Marple: Sometimes reality surprises us (as mind-boggling DE reports have surprised him). So stay tuned. When the data speak, we will both listen.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

 

P.S. For those wanting more information: Parnia and other death researchers will present at a November 18th New York Academy of Sciences symposium on “What Happens When We Die?” (see here and here), with a live-stream link to come.

 

For those with religious interests: My colleagues, British cognitive neuroscientist Malcolm Jeeves and American developmental psychologist Thomas Ludwig, reflect on the brain-mind relationship in their recent book, Psychological Science and Christian Faith. If you think that biblical religion assumes a death-denying dualism (thanks to Plato’s immortal soul), prepare to be surprised.

A tweet from my colleague Jean Twenge—a world-class expert at tracking youth well-being in massive data sets—alerted me to the recently released 2018 National Survey on Drug Use and Health. Among its dozens of results, which you can view here, several struck me as worthy of note by high school and college teachers, administrators, and counselors. 

First, some good news: From 2002 to 2018, cigarette smoking among U.S. 12- to 17-year-olds plummeted, to just 2.7 percent. Reaching back to 1976, smoking among high school seniors has plunged even more, from 28.8 percent to 3.6 percent. Although smoking has become gauche, seniors’ e-cigarette use has soared—from 1.5 percent in 2010 to 26.7 percent in 2018. (Will widely publicized news of vaping-related lung illnesses and deaths reverse this trend?)

 

The not-so-good news: From 2011 to 2018, major depressive episodes increased from 11 to 14 percent among 12- to 17-year-olds, and, similarly, from 8 to 14 percent among 18- to 25-year-olds.

 

 

 

Not surprisingly, youth and young adults’ increased rate of depression has been accompanied by an increase in suicidal thoughts (shown below), suicide attempts, and actual suicides (see new CDC data here).

 

 

As I explained in a previous TalkPsych.com essay, the increase in teens’ (especially teen girls’) vulnerability to depression, anxiety, self-harm, and suicide has occurred in other Western countries as well, and it corresponds neatly with the spread of smart phones and social media. That fact of life has stimulated new research that 

  • correlates teens’ social media use with their mental health.
  • follows teens longitudinally (through time) to see if their social media use predicts their future mental health.
  • experiments by asking if volunteers randomly assigned to a restrained social media diet become, compared with a control group, less depressed and lonely. 

 

Stay tuned. This scientific story is still being written, amid some conflicting results. As Twenge summarizes in a concise and readable new essay, up to two hours of daily screen time predicts no lessening of teen well-being. But as daily screen time increases to six hours—with associated diminishing of face-to-face relationships, sleep, exercise, reading, and time outdoors—the risk of depression and anxiety rises.

 

The alarming rise in youth and young adult depression, especially over such a thin slice of history, compels our attention. Is screen time the major culprit (both for its drain on other healthy activities and for the upward social comparisons of one’s own mundane life with the lives of cooler-seeming others)? If not, what other social forces are at work? And what can be done to protect and improve youth and young adult well-being?

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

Photo courtesy Virginia Welle

 

At a recent Teaching of Psychology in Secondary Schools workshop hosted by Oregon State University, I celebrated and illustrated three sets of big ideas from psychological science. Without further explanation, here is a quick synopsis.

 

Questions: Which of these would not be on your corresponding lists? And which would you add?

 

Twelve unsurprising but important findings (significant facts of life for our students to understand):

  • There is continuity to our traits, temperament, and intelligence.
    • With age, emotional stability and conscientiousness increase.
    • Yet individual differences (extraversion and IQ) persist.
  • Specific cognitive abilities are distinct yet correlated (g, general intelligence).
  • Human traits (intelligence, personality, sexual orientation, psychiatric disorders, autism spectrum) are influenced by “many genes having small effects.”
  • A pessimistic explanatory style increases depression risk.
  • To a striking extent, perceptual set guides what we see.
  • Rewards shape behavior.
  • We prioritize basic needs.
  • Cultures differ in  
    • how we dress, eat, and speak.
    • values.
  • Conformity and social contagion influence our behavior.
  • Group polarization amplifies our differences.
  • Ingroup bias (us > them) is powerful and perilous.
  • Nevertheless, worldwide, we are all kin beneath the skin (we share a human nature).

 

Eleven surprising findings that may challenge our beliefs and assumptions:

  • Behavior genetics studies with twins and adoptees reveal a stunning fact: Within the normal range of environments, the “shared environment” effect on personality and intelligence (including parental nurture shared by siblings) is ~nil. As Robert Plomin says (2019), “We would essentially be the same person if we had been adopted at birth and raised in a different family.”
    • Caveats:
      • Parental extremes (neglect/abuse) matter.
      • Parents influence values/beliefs (politics, religion, etc.).
      • Parents help provide peer context (neighborhood, schools).
      • Stable co-parenting correlates with children’s flourishing.
  • Marriage (enduring partnership) matters . . . more than high school seniors assume . . . and predicts greater health, longevity, happiness, income, parental stability, and children’s flourishing. Yet most single adults and their children flourish.
  • Sexual orientation is a natural disposition (parental influence appears nil), not a moral choice.
  • Many gay men’s and women’s traits appear intermediate to those of straight women and men (for example, spatial ability).
  • Seasonal affective disorder (SAD) may not exist (judging from new CDC data and people’s Google searches for help, by month).
  • Learning styles—assuming that teaching should align with students’ varying ways of thinking and learning—have been discounted.
  • We too often fear the wrong things (air crashes, terrorism, immigrants, school shootings).
  • Brief “wise interventions” with at-risk youth sometimes succeed where big interventions have failed.
  • Random data (as in coin tosses and sports) are streakier than expected.
  • Reality is often not as we perceive it.
  • Repression rarely occurs.

 

Some surprising findings reveal things unimagined:

  • Astonishing insights—great lessons of psychological science—that are now accepted wisdom include
    • split-brain experiments: the differing functions of our two hemispheres.
    • sleep experiments: sleep stages and REM-related dreaming.
    • misinformation effect experiments: the malleability of memory.
  • We’ve been surprised to learn
    • what works as therapy (ECT, light therapy).
    • what doesn’t (Critical Incident Debriefing for trauma victims, D.A.R.E. drug abuse prevention, sexual reorientation therapies, permanent weight-loss programs).
  • We’ve been astounded at our dual-processing powers—our two-track (controlled vs. automatic) mind, as evident in phenomena such as
    • blindsight.
    • implicit memory.
    • implicit bias.
    • thinking without thinking (not-thinking => creativity).
  • We’ve been amazed at the robustness of
    • the testing effect (we retain information better after self-testing/rehearsing it).
    • the Dunning-Kruger effect (ignorance of one’s own incompetence).

 

The bottom line: Psychological science works! It affirms important, if unsurprising, truths. And it sometimes surprises us with findings that challenge our assumptions, and with discoveries that astonish us.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

“The greatest enemy of knowledge is not ignorance—it is the illusion of knowledge.”

 

This wisdom, often attributed to American historian Daniel Boorstin, suggests a sister aphorism: The great enemy of democracy is not ill will, but the illusion of understanding. It is social and political opinion that, even if well-intentioned and sincerely believed, sprouts from self-confident misinformation.

 

Such is not the province of any one political perspective. Consider:

  • A CivicScience poll asked 3624 Americans if schools should “teach Arabic numerals as part of their curriculum.” Fifty-six percent answered “No.” Among Republican respondents, 74 percent objected; among Democrats, the number was 40 percent. (Do the respondents advise, instead, teaching Roman numerals?)
  • CivicScience also asked people if schools should teach the “creation theory of Catholic priest Georges Lemaitre as part of their science curriculum.” Democrats overwhelmingly objected: 73 percent (compared with 33 percent of Republicans) opposed such teaching . . . of what is better known as the Big Bang theory.

 

Such ill-informed opinions—illusions of understanding—are powered by what social psychologists know as the overconfidence phenomenon (a tendency to be more confident than correct) and the Dunning-Kruger effect (incompetence not recognizing itself). And, as I have previously noted, illusory understanding—and what it portends for our civic life—matters because our collective future matters. Consider further:

  • When—despite plummeting violent and property crime rates—7 in 10 adults annually believe there has been more crime in the current year than in the prior year, then fear-mongering politicians may triumph.
  • When immigrants crossing the southern U.S. border are seen as oftentimes “vicious predators and bloodthirsty killers,” then—notwithstanding the somewhat lower actual crime and incarceration rate of immigrants—we will call for the shifting of public resources to “build the wall.”
  • When statistically infrequent (but traumatizing) incidents of air crashes, domestic terrorism, and school shootings hijack our consciousness—thanks to our heuristic of judging risk by readily available images of horrific happenings—then we will disproportionately fear such things. Gallup reports that nearly half of Americans (38 percent of men and 58 percent of women) now are “worried” that they or a family member will be a mass shooting victim. Feeling such fear, we may allocate scarce public resources in less-than-optimal ways—as when we transform schools into fortresses filled with frightened children—while being unconcerned about the vastly greater dangers posed by car accidents, guns in the home, and future mass destruction from climate change. (It’s so difficult to feel empathy for the unseen future victims of grave dangers.)

 

Red or blue, we agree that our children’s and grandchildren’s future matters. The problem is that democracy requires an informed and thoughtful populace. Democracy’s skeptics argue that most people lack the motivation and ability to do the needed work—to absorb large amounts of information and then, with appropriate humility and openness, to sift the true from the false. Consider our collective ignorance on topics ranging from the percentage of the U.S. federal budget going to foreign aid (1 percent, not Americans’ average guess of 31 percent) to the mere 38 percent who know which party currently controls the U.S. House of Representatives.

 

Such ignorance needn’t reflect stupidity.  Perhaps you, too, have rationalized: If the odds of my vote affecting an election or referendum outcome are infinitesimal, then why invest time in becoming informed? Why not, instead, care for my family, pay the bills, manage my health, pursue relationships, and have fun? Or why not trust the simple answers offered by authoritarian leaders?

 

Ergo, the great enemy of an informed and prudent populace, and of a flourishing democracy, is misinformation that is sustained by an illusion of understanding. But there is good news: Education matters. Education helps us recognize how errors infuse our thinking. Education makes us less gullible to conspiracy theories. Education, rightly done, draws us out of our tribal social media bubbles. And education teaches us to think critically—to ask questions with curiosity, to assess claims with evidence, and to be humble about our own understanding. Said differently, education increases our willingness to ask the two big critical thinking questions: What do you mean? and How do you know?

 

So three cheers for education. Education informs us. It teaches us how to think smarter. And as Aristotle long ago taught us, it supports civic virtues and human flourishing.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

“Do something!” shouted a lone voice at Ohio’s governor during a post-massacre candlelight vigil in downtown Dayton. Others soon chimed into what became a crowd chant, which has now challenged Congress to, indeed, do something in response to the repeated mass shootings.

 

In response, politicians and pundits offered varied diagnoses and remedies. Some blamed mental illness or violent video gaming or White nationalist hate speech. Others noted that such factors do not set the United States apart from countries that also have mental illness, video game enthusiasts, and hate speech—yet have vastly fewer homicides and virtually no mass shootings. What distinguishes the United States is, simply, guns.

 

Despite broad and growing public support for strengthened background checks and assault weapon bans, America’s nearly 400 million guns are not disappearing soon. So what, realistically, is something effective we can do?

 

Might “red flag” gun laws, which aim to take guns away from dangerous people, be a remedy? If someone expresses suicidal or destructive fantasies, or is mentally ill, could we save lives by confiscating their weapons?

 

The idea of identifying at-risk individuals is not new. Former Speaker of the U.S. House Paul Ryan had the idea in 2015: “People with mental illness are getting guns and committing these mass shootings.” In the wake of the 2018 slaughter of 17 people at a Parkland, Florida high school, Florida’s Governor (now-Senator) Rick Scott went a step further, urging stronger rules to red-flag high-risk people: “I want to make it virtually impossible for anyone who has mental issues to use a gun. I want to make it virtually impossible for anyone who is a danger to themselves or others to use a gun.” President Donald Trump suggested opening more mental hospitals that could house would-be mass murderers: “When you have some person like this, you can bring them into a mental institution.” After the El Paso and Dayton massacres, he declared that mass killers are “mentally ill monsters.” At an August 15th New Hampshire rally he added that “These people are mentally ill. I think we have to start building institutions again.”

 

The general public has supported red-flagging. In a 2012 Gallup survey, 84 percent of Americans agreed that “increased government spending on mental health screening and treatment” would be a “somewhat” or “very” effective “approach to preventing mass shootings at schools.”

 

While we psychologists welcome the expressed high regard for our supposed powers of discernment, the hard reality is otherwise. Extremely rare events such as mass shootings are inherently difficult to predict, even by the best psychological science. One analysis reviewed 73 studies that attempted to predict violent or antisocial behavior. Its conclusion: Using psychology’s risk assessment tools “as sole determinants of detention, sentencing, and release is not supported by the current evidence.”

 

Moreover, among the millions of troubled people who could potentially murder or commit suicide, it is impossible to identify in advance the infinitesimal fraction who will do so. And it would surely be unfair to stigmatize all “mentally ill” people. Most mentally ill people do not commit violent acts, and most violent criminals are not mentally ill. Violent acts are better predicted by anger, alcohol use, previous violence, gun availability, and young-male demography. (The El Paso and Dayton shooters were 21- and 24-year-old males.) As the late psychologist David Lykken once observed, “We could avoid two-thirds of all crime simply by putting all able-bodied young men in cryogenic sleep from the age of 12 through 28.”

 

Suicide is likewise hard to predict. One research team summarized 50 years of research on suicide’s unpredictability: “The vast majority of people who possess a specific risk factor [for suicide] will never engage in suicidal behavior.” Moreover, our ability to predict suicide “has not improved across 50 years.”

 

Even given our inability to offer accurate predictions of who will commit murder or suicide, we do know some risk factors. As every psychology student knows, one of the best predictors of future behavior is past behavior: Prior violent acts increase the risk of future violent acts, and prior suicide attempts raise the risk of a future suicide. This was seemingly illustrated by the death of financier and convicted sex offender Jeffrey Epstein after he was removed from suicide watch. The New York Times reports that such a decision would normally be made by the chief psychologist at a federal prison facility after “a face-to-face psychological evaluation.” Shortly after apparently being deemed not at risk, despite his prior attempt, Epstein reportedly died by hanging in his prison cell.

 

But even without knowing who will commit suicide, we can modify the environment to reduce its probability. For example, barriers that prevent jumping from bridges and buildings have reduced the likelihood of impulsive suicides. Reducing the number of in-home guns has also been effective. States with high gun ownership rates are states with high suicide rates, even after controlling for other factors such as poverty. After Missouri repealed its tough handgun law, its suicide rate went up 15 percent; when Connecticut enacted such a law, its suicide rate dropped 16 percent.

 

And we can reduce, even if we cannot predict, mass shootings. As my psychologist colleague Linda Woolf wrote after a 2018 massacre, and again after El Paso and Dayton, it is time “to focus on the evidence—mass shootings occur, and guns make these atrocities all too easy and frequent.” Our politicians, she adds, should initiate gun safety reforms including “a ban on assault weapons, ban on large-capacity magazines, universal background checks, stiffer licensing laws, red flag laws, and lifting of all Federal restrictions on gun violence research.” Although we cannot predict the next tragedy, we can act to reduce its likelihood.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. An earlier essay also reported some of the evidence on the unpredictability of mass shootings.)

On 48 occasions during his recent testimony regarding Russian election interference, former special counsel Robert Mueller—seeming “confused,” “uncertain,” and “forgetful”—asked to have questions repeated. Was Mueller, who turns 75 this week, exhibiting, as so many pundits surmised, cognitive aging or perhaps even early signs of dementia?

 

Win McNamee/Getty Images 

 

The chatter among those of us with hearing loss suggested a much simpler explanation: Robert Mueller is likely one of us. Might his struggle to hear suggest normal age-related hearing loss, exacerbated by his Vietnam combat service? Among Americans 75 and older, half “have difficulty hearing,” reports the National Institute on Deafness and Other Communication Disorders. For war veterans of Mueller’s age, some hearing loss is to be expected.

 

In response, we empathized. Struggling to hear, especially in important social situations, is stressful and tiring. It drains cognitive energy—energy that is then unavailable for quick processing and responding. Moreover, the challenge is compounded in a cavernous room with distant ceiling speakers that produce a verbal fog as sounds bounce off hard walls. Add to that fast-talking (time-constrained) questioners, some of whom were looking down at their script while speaking, impeding natural lip reading. Those of us with hearing loss dread, and avoid, such situations.

 

There is, admittedly, accumulating evidence (here and here) that hearing loss is associated with accelerated cognitive decline in later life. Compared with people with good hearing, those with hearing loss show declines in memory, attention, and learning about three years earlier—though less if they get hearing aids. But Robert Mueller’s slowness in understanding and processing questions seems explainable not only by his four dozen requests for having questions re-voiced, but likely also by his not completely hearing or perhaps mishearing other questions.

 

And it was all so easily avoidable in one of three ways—each of which I have experienced as a godsend:

  1. A table speaker 20 inches from his ears could have given him vastly clearer sound than what reached his ears after reverberating around the spacious room.
  2. Real-time captioning on a table screen, like the TV captioning we use at home, could have made the spoken words instantly clear.
  3. A room hearing loop could have magnetically transmitted the voice from each microphone directly to the inexpensive telecoil sensor that comes with most modern hearing aids. Other Capitol buildings—including the U.S. House and Senate main chambers and the U.S. Supreme Court chamber—now have hearing loops. Voila! With the mere push of a button (with no need to obtain extra equipment), we can hear deliciously clear sound. (See here, here, and here for more hearing loop information. Full disclosure: The first site is my own informational website, and the last describes our collective advocacy to bring this technology to all of the United States.)

 

Here ye! Hear ye! Let Robert Mueller’s struggling to hear remind our culture that hearing loss—the great invisible disability—is commonplace and, thanks to population aging and a life history of toxic noise, growing. And let us resolve to create a more hearing-friendly environment, from quieter restaurants to hearing-looped auditoriums, worship places, and airports.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

How you and I feel about our lives depends greatly on our social comparisons. We feel smart when others seem dimwitted, and grateful for our health when others are unwell. But sometimes during social comparisons our self-image suffers, and we feel relative deprivation—a perception that we are worse off than others with superior achievements, looks, or income. We may be happy with a raise—until we learn that our co-workers got more. And it’s better, psychologically, to make a salary of $60,000 when friends, neighbors, and co-workers make $30,000, than to make $100,000 when our compatriots make $200,000.

 

Relative deprivation helps us understand why the spread of television—and exposure to others’ wealth—seemingly transformed people’s absolute deprivation (lacking what others have) into relative deprivation (feeling deprived). When and where TV was introduced to various American cities, larceny thefts (shoplifting, bike stealing) soon rose.

 

Relative deprivation also helps us understand the psychological toxicity of today’s growing income inequality. In communities with large inequality—where some people observe others having so much more—average happiness is lower and crime rates and other social pathologies are higher.

 

So should we assume it’s always better to be content and happy than to be frustrated by seemingly unreachable expectations? No—because relative deprivation can also be a force for positive change. People in the former East Germany had a higher standard of living than their counterparts in some other European countries, but a frustratingly lower one than their West German neighbors—and that helped spark their revolt.

 

At a recent gathering of the Templeton foundations, I heard grantee Thor Halvorssen explain how his Human Rights Foundation is working to unite the world against the tyrannies that underlie poverty, famine, war, and torture. One of its projects, “Flash Drives for Freedom,” responds to the North Korean people’s mistaken belief—enabled by strict censorship and the absence of the Internet—that the rest of the world is worse off than they are.

 

This project is collecting tens of thousands of used and donated USB drives, erasing their content, and refilling them with books, videos, and an off-line Korean Wikipedia that counter Kim Jong-Un’s misinformation. (Yes, Wikipedia can fit on a flash drive—see here—and, yes, most North Koreans have access to devices that can read flash drives.) Finally, it is delivering the goods via drones and balloons with a timing device that ruptures the balloon over North Korean cities, raining down flash drives.

 

The implied psychological rationale: Lay the groundwork for a transformed and free North Korea by harnessing the positive power of relative deprivation.

 

From hrf.org

 

From FlashDrivesForFreedom.org

 

(For David Myers’ other weekly essays on psychological science and everyday life, visit TalkPsych.com.)

You surely know why you chose your town, your partner, and your vocation—all for good reasons, no doubt.

 

But might other unknown reasons—operating below the level of your conscious awareness—also have nudged your choices? Such is the implication of some clever studies of implicit egotism, an automatic tendency to like things we associate with ourselves. For example, we like better a politician or stranger whose face has been morphed with some features of our own (see here and here).

 

I see you yawning: “You needed research to know that we love ourselves and things that resemble us?” The surprise—astonishment, really—comes with the subtle ways in which this phenomenon has been documented. Consider:

  • The name–letter effect. People of varied nationalities, languages, and ages prefer the letters that appear in their own name. People also tend to marry someone whose first or last name resembles their own.
  • The birthdate–number effect. People likewise prefer the numbers that appear in their birthdate. For example, people tend to be attracted to people whose laboratory participant number resembles their birth date.
  • The name–residence effect. Philadelphia, having many more people than Jacksonville, has also had (no surprise) 2.2 times more men named Jack . . . but also 10.4 times more named Philip. Ditto Virginia Beach, which has a disproportionate number of women named Virginia, and St. Louis, which, compared to the national average, has 49 percent more men named Louis. Likewise, folks named Park, Hill, Beach, Rock, or Lake are disproportionately likely to live in cities (for example, Park City) that include their names.

 

If that last finding—offered by implicit egotism researchers Brett Pelham, Matthew Mirenberg, and John Jones—doesn’t surprise you, consider an even weirder phenomenon they uncovered: People seem to gravitate to careers identified with their names. In the United States, Dennis, Jerry, and Walter have been equally popular names. But dentists have twice as often been named Dennis as Jerry or Walter, and 2.5 times more often named Denise than the equally popular Beverly or Tammy. Among geoscientists (geologists, geophysicists, and geochemists) people named George and Geoffrey are similarly overrepresented.

 

The phenomenon extends to surname–occupation matching. In 1940 U.S. Census data, people named Baker, Barber, Butcher, and Butler were all 40 percent more likely than expected to work in occupations with their names.

 

Ah, but do Pelham and colleagues have cause-and-effect reversed? For example, aren’t towns often named after people whose descendants stick around? And are people in Virginia more likely to name girls with the state name? Are Georgians more likely to christen their babies Georgia or George? Wasn’t the long-ago village baker—thus so-named—likely to have descendants carrying on the ancestral work?

 

Likely so, grants Pelham. But could that, he asks, explain why states have an excess of people sharing a last-name similarity? California, for example, has an excess of people whose names begin with Cali (as in Califano). Moreover, he reports, people are more likely to move to states and cities with name resemblances—Virginia to Virginia, for example.

 

If the Pelham team is right to think that implicit egotism, though modest, is nonetheless a real unconscious influence on our preferences, might that explain why, with long-ago job offers from three states, I felt drawn to Michigan? And why it was Suzie who sold seashells by the seashore?

 

(For David Myers’ other essays on psychological science and everyday life, including a 2016 essay on much of this implicit egotism research, visit TalkPsych.com.)

The Joy of Being Wrong

Posted by David Myers, May 28, 2019

What virtue is more needed in today’s contentious and polarized world than humility? We need deep-rooted convictions to fuel our passions, but also humility to restrain bull-headed fanaticism.

 

Along with curiosity and skepticism, humility forms the foundation of all science. Humility enables critical thinking, which holds one’s untested beliefs tentatively while assessing others’ ideas with a skeptical but open mind. To accept everything is to be gullible; to deny everything is to be a cynic.

 

In religion and literature, hubris (pride) is first and foundational among the seven deadly sins. When rooted in theism—the assumption that “There is a God, but it’s not me”—humility reminds us of our surest conviction: Some of our beliefs err. We are finite and fallible. We have dignity but not deity. So there’s no great threat when one of our beliefs is overturned or refined—it’s to be expected.  In this spirit, we can, as St. Paul advised, “test everything, hold fast to what is good.”

 

Humility also underlies healthy human relations. In one of his eighteenth-century Sermons, Samuel Johnson recognized the corrosive perils of pride and narcissism: “He that overvalues himself will undervalue others, and he that undervalues others will oppress them.” Even Dale Carnegie, the positive thinking apostle, foresaw the danger: “Each nation feels superior to other nations. That breeds patriotism—and wars.”

 

Unlike pride and narcissism, humility contributes to human flourishing. It opens us to others. Show social psychologists a situation where humility abounds—with accurate self-awareness + modest self-presentation + a focus on others—and they will show you civil discourse, happy marriages, effective leadership, and mental health. And that is the gist of this new 3.5 minute animated Freethink video, “The Joy of Being Wrong.”

 

Note: The video was supported by the Templeton Foundation (which I serve as a trustee) as an expression of its founder’s science-friendly motto: “How little we know, how eager to learn.” The Foundation is also supporting a University of Connecticut initiative on “Humility and Conviction in Public Life,” including blog essays, a monthly newsletter, podcast interviews, and videos of forums and lectures.

 

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

“Self-consciousness [exists] in contrast with an ‘other,’ a something which is not the self.”

—C. S. Lewis, The Problem of Pain, 1940

 

We are, always and everywhere, self-conscious of how we differ. Search your memory for a social situation in which you were the only person of your gender, sexual orientation, ethnicity, or body type. Perhaps you were the only woman in a group of men, or the only straight person at an LGBTQ gathering.

 

Recalling that situation . . .

  • Were you self-conscious about your identity?
  • How did others respond to you?
  • How did your perceptions of their responses affect your behavior?

 

Differences determine our “spontaneous self-concepts.” If you recalled being very aware of your differences, you are not alone. As social psychologist William McGuire long ago noted, we are conscious of ourselves “insofar as, and in the ways that” we differ. When he and his co-workers invited children to “tell us about yourself,” they mostly mentioned their distinctive attributes. Redheads volunteered their hair color, foreign-born their birthplace, minority children their ethnicity. Spontaneous self-concepts often adapt to a changing group. A Black woman among White women will think of herself as Black, McGuire observed. When moving to a group of Black men, she will become more conscious of being a woman.

 

This identity-shaping phenomenon affects us all. When serving on an American Psychological Association professional task force with 10 others—all women—I immediately was aware of my gender. But it was only on the second day, when I joked to the woman next to me that the bathroom break line would be short for me, that she noticed the group’s gender make-up. In my daily life, surrounded by mostly White colleagues and neighbors, I seldom am cognizant of my race—which becomes a prominent part of my identity when visiting my daughter in South Africa, where I become part of a 9 percent minority. In the U.S., by contrast, a new Pew survey finds that 74 percent of Blacks but only 15 percent of Whites see their race as “being extremely or very important to how they think of themselves.”

 

Our differences may influence how others respond to us. Researchers have also noted a related phenomenon: Our differences, though mostly salient to ourselves, may also affect how others treat us. Being the “different” or “solo” person—a Black person in an otherwise White group, a woman in a male group, or an adult in a group of children—can make a person more visible and seem more influential. Their good and bad qualities also tend to be more noticed (see here and here).

 

If we differ from others around us, it therefore makes adaptive sense for us to be a bit wary. It makes sense for a salient person—a minority race person, a gay person, or a corpulent person—to be alert and sensitive to how they are being treated by an interviewer, a police officer, or a neighbor. Although subsiding, explicit prejudices and implicit biases are real, and stereotypes of a difference can become a self-fulfilling prophecy.

 

Sometimes our perceived differences not only influence how others treat us, but also how we, in turn, respond to them. In one classic experiment, men students conversed by phone with women they mistakenly presumed (from having been shown a fake picture) were either unattractive or attractive. The presumed attractive women (unaware of the picture manipulation) spoke more warmly to the men than did the presumed unattractive women. The researchers’ conclusion: The men’s expectations had led them to act in a way that influenced the women to fulfill the belief that beautiful women are desirable. A stereotype of a difference can become a self-fulfilling prophecy.

 

Our acute self-consciousness of our differences can cause us to exaggerate or misinterpret others’ reactions. At times, this acute self-consciousness may have funny consequences. Consider two of my favorite social psychology experiments demonstrating the influence of our personal perception of differences. In the first, which showed the “spotlight effect,” Thomas Gilovich and Kenneth Savitsky asked university students to don a Barry Manilow T-shirt before entering a room with other students. Feeling self-conscious about their difference, those wearing the dorky T-shirt guessed that nearly half of their peers would notice the shirt. Actually, only 23 percent did. The lesson: Our differences—our bad hair day, our hearing loss, our dropping the cafeteria plate—often get noticed and remembered less than we imagine.

 

In another favorite experiment—one of social psychology’s most creative and poignant studies—Robert Kleck and Angelo Strenta used theatrical makeup to place an ear-to-mouth facial scar on college women—supposedly to see how others would react. After each woman checked the real-looking scar in a hand mirror, the experimenter applied “moisturizer” to “keep the makeup from cracking”—but which actually removed the scar.

 

So the scene was set: A woman, feeling terribly self-conscious about her supposedly disfigured face, talks with another woman who knows nothing of all this. Feeling acutely sensitive to how their conversational partner was looking at them, the “disfigured” women saw the partner as more tense, patronizing, and distant than did women in a control condition. Their acute self-consciousness about their presumed difference led them to misinterpret normal mannerisms and comments.

 

The bottom line: Differences define us. We are self-conscious of how we differ. To a lesser extent, others notice how we differ and categorize us according to their own beliefs, which may include stereotypes or unrealistic expectations. And sometimes, thanks to our acute sensitivity to how we differ, we overestimate others’ noticing and reacting. But we can reassure ourselves: if we’re having a bad hair day, others are unlikely to notice and even less likely to remember.

 

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

It’s a core lesson of introductory psychology: Intergroup contact reduces prejudice (especially friendly, equal-status contact). As hundreds of studies show, attitudes—of White folks toward Black folks, of straight folks toward gay folks, and of natives toward immigrants—are influenced not just by what we know but also by whom we know. Prejudice lessens when straight people have gay friends or family, and native-born citizens know immigrants.

 

As I write these words from the place of my childhood—Bainbridge Island, Washington—I am moved to offer a family example of the power of social contact. First, consider a large social experiment—the World War II internment and return of Japanese Americans from (a) California, and (b) Bainbridge, a Manhattan-sized island across Puget Sound from Seattle.

 

In minimal-contact California, Japanese-Americans lived mostly in separate enclaves—meaning few Caucasians had Japanese-descent friends. When the California internment ensued, the Hearst newspapers, having long warned of “the yellow peril,” celebrated, and few bade the internees goodbye. On their return, resistance and “No Japs Here” signs greeted them. Minimal contact enabled maximal prejudice.

 

Bainbridge was a contrasting high-contact condition—and was also the place where (at its ferry dock on March 30, 1942) the internment began. As an island community, all islanders intermingled as school classmates. Their strawberry farms and stores were dispersed throughout the island. The local paper (whose owners later won awards for journalistic courage) editorialized against the internment and then published internee news from the camps for their friends back home. The internees’ fellow islanders watched over their property. And when more than half the internees returned after the war, they were greeted with food and assistance. A history of cooperative contact enabled minimal prejudice.

 

I can personalize this. One of those saying a tearful goodbye on the dock that 1942 day was my father, the insurance agent and friend of many of them. He maintained their property insurance during the internment and then wrote “the first auto policy on a Japanese American after the war.” His support was remembered decades later, with a tribute at his death from the president of the island’s Japanese American Community (a former internee):

 

 

My father provides a case example of the contact effect. His support did not stem from his being socially progressive. (He was a conservative Republican businessperson who chaired the Washington State Nixon for President campaign.) His opposition to the internment of his fellow islanders was simply because he knew them. He therefore believed it was colossally unjust to deem them—his friends and neighbors—a threat. As he later wrote, “We became good friends … and it was heartbreaking for us when the war started and the Japanese people on Bainbridge Island were ordered into concentration camps.”

 

This great and sad experiment on the outcomes of racial separation versus integration is being replicated in our own time. People in states with the least contact with immigrants express the most hostility toward them. Meanwhile, those who know and benefit from immigrants—as co-workers, employees, businesspeople, health-care workers, and students—come to appreciate them.

 

It’s a lesson worth remembering: Cordial and cooperative contact advances acceptance.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

There’s bad news and good news about Americans’ race relations and attitudes.

 

The bad news:

  • People perceive race relations as worsening. In a 2019 Pew survey of 6637 Americans, 58 percent said that U.S. race relations are now “generally bad,” and 69 percent of those folks saw race relations as “getting worse.”
  • The Trump effect? In the same survey, most (65 percent) said it has become more common for people to express racist or racially insensitive views since Donald Trump’s election.
  • Hate groups are proliferating. The Southern Poverty Law Center has identified 1,020 hate groups—up 30 percent in four years. Such groups feed off dehumanizing views of other races (see here, here, and here).
  • Hate crimes are rising. Although some criticize the SPLC’s hate-group definition, their report coincides with the FBI’s reported 17 percent increase in hate crimes just in 2017. Widely publicized hate crimes, such as the burning of three Louisiana Black churches in March and April of 2019, not to mention the recent synagogue attacks, will surely sustain the perception that Trump-era race relations are worsening.

 

But there is also good news: You likely already know that since the mid-twentieth  century, support for school desegregation, equal employment opportunity, and interracial dating and marriage has soared to near-consensus—enabling a 2008 presidential election that Abraham Lincoln probably never imagined. Although most metropolitan areas remain substantially segregated, neighborhood integration has modestly increased since the century’s turn. But the even better news is that both explicit and implicit race prejudice have continued to decline.

 

This good news is reflected in Tessa Charlesworth and Mahzarin Banaji’s new report of nearly 2 million U.S. adults’ explicit and implicit racial attitudes. Since 2007, people’s explicit race attitudes—the extent to which they acknowledged preferring White to Black people—“moved toward neutrality by approximately 37 percent.” Implicit race attitudes—people’s faster speed when pairing negative words with Black faces (and positive words with White faces)—also moved toward neutrality, but with a slower 17 percent shift. (Charlesworth and Banaji also reported changed attitudes toward other social groups: Attitudes toward gay people made the swiftest progress toward neutrality, while negative implicit attitudes toward overweight people have actually increased.)

 

 

Are these hate-up, prejudice-down findings paradoxical—or even contradictory? Not necessarily. Much as extremes of income—both crushing poverty and excessive wealth—can rise even while average income is stable, so also can extremist racial attitudes increase while overall prejudice does not. Even within healthy communities, a viral disease can spread.

 

Charles Dickens, in A Tale of Two Cities, was prescient: “It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of light, it was the season of darkness, it was the spring of hope, it was the winter of despair.”

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

Showerthoughts

Posted by David Myers, Apr 19, 2019

Part of my text-writing pleasure is interjecting playful thoughts and tongue-in-cheek one-liners that students seem to enjoy: “Does the name Pavlov ring a bell?” (If I don’t enjoy writing—assuming psychology teaching can offer both wisdom and wit—then who will enjoy reading?)

 

As part of my, um, “executive time,” I occasionally visit Reddit’s Showerthoughts—first for delight but also for inspiration. To quote the website, a showerthought is a spontaneous “miniature epiphany that makes the mundane more interesting. . . . Showerthoughts can be funny, poignant, thought-provoking, or even just silly, but they should always prompt people to say ‘Huh, I’ve never thought about it that way before!’”

 

Some Showerthought examples:

  • Your stomach thinks all potato is mashed.
  • We don’t wash our hands, our hands wash each other.
  • Someone coined the term “coin the term.”
  • If you are the best barber in town, you know you can't get the best haircut.
  • The "b" in subtle is subtle.
  • In a nutshell, an acorn is an oak tree.
  • A lot of people die in their living rooms.
  • The two worst prison sentences are life and death.
  • If you swap the W’s in Where? What? and When? with T’s, you end up with their answers.
  • Tea is just a fancy way of saying leaf soup.
  • Everything in the entire universe either is or isn't a potato.

 

For your further pleasure, here are some psychology-relevant examples, each from Showerthoughts or inspired by one-liners that I encountered there. Perhaps (after my editors trim the merely silly) some of these musings will leaven our future editions?

 

Sleep: To fall asleep, fake it till you make it.

 

Loneliness: The world is full of lonely people afraid to make the first move.

 

Relationships: All of your friends you made by talking to strangers.

 

Implicit cognition: The unconscious mind is like the wind: You don’t see it, but you can see its effects.

 

Aging: To age is to shift from a life of “no limits” to “know limits.”

 

Relationships: Marrying someone because they're attractive is like buying a watermelon because it's a really nice shade of green.

 

Memory via acronyms: The acronym of "The Only Day After Yesterday" is TODAY.

 

Eating behavior: When you're “biting down" on something, you're actually biting up.

 

Sensory adaptation: Nobody realizes how much noise their air conditioning is making until it abruptly shuts off.

 

Psychokinesis claims: More spoons have been bent by ice cream than by psychics.

 

Mind and brain: When you're thinking about your brain, your brain is just thinking about itself.

 

Death: You will be the last person to die in your lifetime.

 

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)