The Psychology Community

232 Posts authored by: David Myers

Photo courtesy Virginia Welle

 

At a recent Teaching of Psychology in Secondary Schools workshop hosted by Oregon State University, I celebrated and illustrated three sets of big ideas from psychological science. Without further explanation, here is a quick synopsis.

 

Questions: Which of these would not be on your corresponding lists? And which would you add?

 

Twelve unsurprising but important findings (significant facts of life for our students to understand):

  • There is continuity to our traits, temperament, and intelligence.
    • With age, emotional stability and conscientiousness increase.
    • Yet individual differences (extraversion and IQ) persist.
  • Specific cognitive abilities are distinct yet correlated (g, general intelligence).
  • Human traits (intelligence, personality, sexual orientation, psychiatric disorders, autism spectrum) are influenced by “many genes having small effects.”
  • A pessimistic explanatory style increases depression risk.
  • To a striking extent, perceptual set guides what we see.
  • Rewards shape behavior.
  • We prioritize basic needs.
  • Cultures differ in  
    • how we dress, eat, and speak.
    • values.
  • Conformity and social contagion influence our behavior.
  • Group polarization amplifies our differences.
  • Ingroup bias (us > them) is powerful and perilous.
  • Nevertheless, worldwide, we are all kin beneath the skin (we share a human nature).

 

Eleven surprising findings that may challenge our beliefs and assumptions:

  • Behavior genetics studies with twins and adoptees reveal a stunning fact: Within the normal range of environments, the “shared environment” effect on personality and intelligence (including parental nurture shared by siblings) is ~nil. As Robert Plomin says (2019), “We would essentially be the same person if we had been adopted at birth and raised in a different family.”
    • Caveats:
      • Parental extremes (neglect/abuse) matter.
      • Parents influence values/beliefs (politics, religion, etc.).
      • Parents help provide peer context (neighborhood, schools).
      • Stable co-parenting correlates with children’s flourishing.
  • Marriage (enduring partnership) matters . . . more than high school seniors assume . . . and predicts greater health, longevity, happiness, income, parental stability, and children’s flourishing. Yet most single adults and their children flourish.
  • Sexual orientation is a natural disposition (parental influence appears nil), not a moral choice.
  • Many gay men’s and women’s traits appear intermediate to those of straight women and men (for example, spatial ability).
  • Seasonal affective disorder (SAD) may not exist (judging from new CDC data and people’s Google searches for help, by month).
  • Learning styles—the assumption that teaching should align with students’ varying ways of thinking and learning—have been discredited.
  • We too often fear the wrong things (air crashes, terrorism, immigrants, school shootings).
  • Brief “wise interventions” with at-risk youth sometimes succeed where big interventions have failed.
  • Random data (as in coin tosses and sports) are streakier than expected (see the simulation sketch after this list).
  • Reality is often not as we perceive it.
  • Repression rarely occurs.
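
That streakiness is easy to check for yourself. Here is a minimal Python sketch (the parameters of 100 flips and 10,000 trials are my own illustrative choices) that simulates short series of fair coin flips and reports the typical longest streak of identical outcomes:

    import random

    def longest_run(flips):
        # Length of the longest streak of identical consecutive outcomes.
        best = run = 1
        for prev, cur in zip(flips, flips[1:]):
            run = run + 1 if cur == prev else 1
            best = max(best, run)
        return best

    # Simulate many 100-flip series of a fair coin and summarize the streaks.
    trials, n_flips = 10_000, 100
    runs = [longest_run([random.random() < 0.5 for _ in range(n_flips)])
            for _ in range(trials)]
    print("average longest streak:", sum(runs) / trials)
    print("share of series with a streak of 6 or more:",
          sum(r >= 6 for r in runs) / trials)

Runs of six or more identical outcomes turn out to be commonplace in 100 flips, which is why genuinely random sequences strike most observers as suspiciously streaky.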

 

Some surprising findings reveal things unimagined:

  • Astonishing insights—great lessons of psychological science—that are now accepted wisdom include
    • split-brain experiments: the differing functions of our two hemispheres.
    • sleep experiments: sleep stages and REM-related dreaming.
    • misinformation effect experiments: the malleability of memory.
  • We’ve been surprised to learn
    • what works as therapy (ECT, light therapy).
    • what doesn’t (Critical Incident Debriefing for trauma victims, D.A.R.E. drug abuse prevention, sexual reorientation therapies, permanent weight-loss programs).
  • We’ve been astounded at our dual-processing powers—our two-track (controlled vs. automatic) mind, as evident in phenomena such as
    • blindsight.
    • implicit memory.
    • implicit bias.
    • thinking without thinking (not-thinking => creativity).
  • We’ve been amazed at the robustness of
    • the testing effect (we retain information better after self-testing/rehearsing it).
    • the Dunning-Kruger effect (ignorance of one’s own incompetence).

 

The bottom line: Psychological science works! It affirms important, if unsurprising, truths. And it sometimes surprises us with findings that challenge our assumptions, and with discoveries that astonish us.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

“The greatest enemy of knowledge is not ignorance—it is the illusion of knowledge.”

 

This wisdom, often attributed to American historian Daniel Boorstin, suggests a sister aphorism: The great enemy of democracy is not ill will, but the illusion of understanding. It is social and political opinion that, even if well-intentioned and sincerely believed, sprouts from self-confident misinformation.

 

Such is not the province of any one political perspective. Consider:

  • A CivicScience poll asked 3,624 Americans if schools should “teach Arabic numerals as part of their curriculum.” Fifty-six percent answered “No.” Among Republican respondents, 74 percent objected; among Democrats, the number was 40 percent. (Do the respondents advise, instead, teaching Roman numerals?)
  • CivicScience also asked people if schools should teach the “creation theory of Catholic priest Georges Lemaître as part of their science curriculum.” Democrats overwhelmingly objected: 73 percent (compared with 33 percent of Republicans) opposed such teaching . . . of the Big Bang theory.

 

Such ill-informed opinions—illusions of understanding—are powered by what social psychologists know as the overconfidence phenomenon (a tendency to be more confident than correct) and the Dunning-Kruger effect (incompetence not recognizing itself). And, as I have previously noted, illusory understanding—and what it portends for our civic life—matters because our collective future matters. Consider further:

  • When—despite plummeting violent and property crime rates—7 in 10 adults annually believe there has been more crime in the current year than in the prior year, then fear-mongering politicians may triumph.
  • When immigrants crossing the southern U.S. border are seen as oftentimes “vicious predators and bloodthirsty killers,” then—notwithstanding the somewhat lower actual crime and incarceration rate of immigrants—we will call for the shifting of public resources to “build the wall.”
  • When statistically infrequent (but traumatizing) incidents of air crashes, domestic terrorism, and school shootings hijack our consciousness—thanks to our heuristic of judging risk by readily available images of horrific happenings—then we will disproportionately fear such things. Gallup reports that nearly half of Americans (38 percent of men and 58 percent of women) now are “worried” that they or a family member will be a mass shooting victim. Feeling such fear, we may allocate scarce public resources in less-than-optimal ways—as when transforming schools into fortresses with frightened children—while being unconcerned about the vastly greater dangers posed by car accidents, guns in the home, and future mass destruction from climate change. (It’s so difficult to feel empathy for the unseen future victims of grave dangers.)

 

Red or blue, we agree that our children’s and grandchildren’s future matters. The problem is that democracy requires an informed and thoughtful populace. Democracy’s skeptics argue that most people lack the motivation and ability to do the needed work—to absorb large amounts of information and then, with appropriate humility and openness, to sift the true from the false. Consider our collective ignorance on topics ranging from the percentage of the U.S. federal budget going to foreign aid (1 percent, not Americans’ average guess of 31 percent) to which party currently controls the U.S. House of Representatives (known by a mere 38 percent).

 

Such ignorance needn’t reflect stupidity.  Perhaps you, too, have rationalized: If the odds of my vote affecting an election or referendum outcome are infinitesimal, then why invest time in becoming informed? Why not, instead, care for my family, pay the bills, manage my health, pursue relationships, and have fun? Or why not trust the simple answers offered by authoritarian leaders?

 

Ergo, the great enemy of an informed and prudent populace, and of a flourishing democracy, is misinformation that is sustained by an illusion of understanding. But there is good news: Education matters. Education helps us recognize how errors infuse our thinking. Education makes us less gullible to conspiracy theories. Education, rightly done, draws us out of our tribal social media bubbles. And education teaches us to think critically—to ask questions with curiosity, to assess claims with evidence, and to be humble about our own understanding. Said differently, education increases our willingness to ask the two big critical thinking questions: What do you mean? and How do you know?

 

So three cheers for education. Education informs us. It teaches us how to think smarter. And as Aristotle long ago taught us, it supports civic virtues and human flourishing.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

“Do something!” shouted a lone voice at Ohio’s governor during a post-massacre candlelight vigil in downtown Dayton. Others soon chimed into what became a crowd chant, which has now challenged Congress to, indeed, do something in response to the repeated mass shootings.

 

In response, politicians and pundits offered varied diagnoses and remedies. Some blamed mental illness or violent video gaming or White nationalist hate speech. Others noted that these factors do not set the United States apart from countries that also have mental illness, video game enthusiasts, and hate speech—yet have vastly fewer homicides and virtually no mass shootings. What distinguishes the United States is, simply, guns.

 

Despite broad and growing public support for strengthened background checks and assault weapon bans, America’s nearly 400 million guns are not disappearing soon. So what, realistically, is something effective we can do?

 

Might “red flag” gun laws, which aim to take guns away from dangerous people, be a remedy? If someone expresses suicidal or destructive fantasies, or is mentally ill, could we save lives by confiscating their weapons?

 

The idea of identifying at-risk individuals is not new. Former Speaker of the U.S. House Paul Ryan had the idea in 2015: “People with mental illness are getting guns and committing these mass shootings.” In the wake of the 2018 slaughter of 17 people at a Parkland, Florida high school, Florida’s Governor (now-Senator) Rick Scott went a step further, urging stronger rules to red-flag high-risk people: “I want to make it virtually impossible for anyone who has mental issues to use a gun. I want to make it virtually impossible for anyone who is a danger to themselves or others to use a gun.” President Donald Trump suggested opening more mental hospitals that could house would-be mass murderers: “When you have some person like this, you can bring them into a mental institution.” After the El Paso and Dayton massacres, he declared that mass killers are “mentally ill monsters.” At an August 15th New Hampshire rally he added: “These people are mentally ill. I think we have to start building institutions again.”

 

The general public has supported red-flagging. In a 2012 Gallup survey, 84 percent of Americans agreed that “increased government spending on mental health screening and treatment” would be a “somewhat” or “very” effective “approach to preventing mass shootings at schools.”

 

While we psychologists welcome the expressed high regard for our supposed powers of discernment, the hard reality is otherwise. Extremely rare events such as mass shootings are inherently difficult to predict, even by the best psychological science. One analysis reviewed 73 studies that attempted to predict violent or antisocial behavior. Its conclusion: Using psychology’s risk assessment tools “as sole determinants of detention, sentencing, and release is not supported by the current evidence.”

 

Moreover, among the millions of troubled people who could potentially murder or commit suicide, it is impossible to identify in advance the infinitesimal fraction who will do so. And it would surely be unfair to stigmatize all “mentally ill” people. Most mentally ill people do not commit violent acts, and most violent criminals are not mentally ill. Violent acts are better predicted by anger, alcohol use, previous violence, gun availability, and young-male demography. (The El Paso and Dayton shooters were 21- and 24-year-old males.) As the late psychologist David Lykken once observed, “We could avoid two-thirds of all crime simply by putting all able-bodied young men in cryogenic sleep from the age of 12 through 28.”

 

Suicide is likewise hard to predict. One research team summarized 50 years of research on suicide’s unpredictability: “The vast majority of people who possess a specific risk factor [for suicide] will never engage in suicidal behavior.” Moreover, our ability to predict suicide “has not improved across 50 years.”

 

Even given our inability to offer accurate predictions of who will commit murder or suicide, we do know some risk factors. As every psychology student knows, one of the best predictors of future behavior is past behavior: Prior violent acts increase the risk of future violent acts—and prior suicide attempts raise the risk of a future suicide. This was seemingly illustrated by the death of convicted pedophile financier Jeffrey Epstein, after he was removed from suicide watch—a decision that, the New York Times reports, would normally be made by the chief psychologist at a federal prison facility after “a face-to-face psychological evaluation.” Shortly after apparently being deemed not at risk, despite his prior attempt, Epstein reportedly died by hanging in his prison cell.

 

But even without knowing who will commit suicide, we can modify the environment to reduce its probability. For example, fences that prevent jumping from bridges and buildings have reduced impulsive suicides. Reducing the number of in-home guns has also been effective. States with high gun ownership rates are states with high suicide rates, even after controlling for other factors such as poverty. After Missouri repealed its tough handgun law, its suicide rate went up 15 percent; when Connecticut enacted such a law, its suicide rate dropped 16 percent.

 

And we can reduce, even if we cannot predict, mass shootings. As my psychologist colleague Linda Woolf wrote after a 2018 massacre, and again after El Paso and Dayton, it is time “to focus on the evidence—mass shootings occur, and guns make these atrocities all too easy and frequent.” Our politicians, she adds, should initiate gun safety reforms including “a ban on assault weapons, ban on large-capacity magazines, universal background checks, stiffer licensing laws, red flag laws, and lifting of all Federal restrictions on gun violence research.” Although we cannot predict the next tragedy, we can act to reduce its likelihood.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com. An earlier essay also reported some of the evidence on the unpredictability of mass shootings.)

On 48 occasions during his recent testimony regarding Russian election interference, former special counsel Robert Mueller—seeming “confused,” “uncertain,” and “forgetful”—asked to have questions repeated. Was Mueller, who turns 75 this week, exhibiting, as so many pundits surmised, cognitive aging or perhaps even early signs of dementia?

 

Win McNamee/Getty Images 

 

The chatter among those of us with hearing loss suggested a much simpler explanation: Robert Mueller is likely one of us. Might his struggle to hear suggest normal age-related hearing loss, exacerbated by his Vietnam combat service? Among Americans 75 and older, half “have difficulty hearing,” reports the National Institute on Deafness and Other Communication Disorders. For war veterans of Mueller’s age, some hearing loss is to be expected.

 

In response, we empathized. Struggling to hear, especially in important social situations, is stressful and tiring. It drains cognitive energy—energy that is then unavailable for quick processing and responding. Moreover, the challenge is compounded in a cavernous room with distant ceiling speakers that produce a verbal fog as sounds bounce off hard walls. Add to that fast-talking (time-constrained) questioners, some of whom were looking down at their script while speaking, impeding natural lip reading. Those of us with hearing loss dread, and avoid, such situations.

 

There is, admittedly, accumulating evidence (here and here) that hearing loss is associated with accelerated cognitive decline in later life. Compared with people with good hearing, those with hearing loss show declines in memory, attention, and learning about three years earlier—though less so if they get hearing aids. But Robert Mueller’s slowness in understanding and processing questions seems explainable not only by his four dozen requests to have questions re-voiced, but likely also by his not completely hearing, or perhaps mishearing, other questions.

 

And it was all so easily avoidable in one of three ways—each of which I have experienced as a godsend:

  1. A table speaker 20 inches from his ears could have given him vastly clearer sound than what reached his ears after reverberating around the spacious room.
  2. Real-time captioning on a table screen, like the TV captioning we use at home, could have made the spoken words instantly clear.
  3. A room hearing loop could have magnetically transmitted the voice from each microphone directly to the inexpensive telecoil sensor that comes with most modern hearing aids. Other Capitol buildings—including the U.S. House and Senate main chambers and the U.S. Supreme Court chamber—now have hearing loops. Voila! With the mere push of a button (with no need to obtain extra equipment), we can hear deliciously clear sound. (See here, here, and here for more hearing loop information. Full disclosure: The first site is my own informational website, and the last describes our collective advocacy to bring this technology to all of the United States.)

 

Here ye! Hear ye! Let Robert Mueller’s struggling to hear remind our culture that hearing loss—the great invisible disability—is commonplace and, thanks to population aging and a life history of toxic noise, growing. And let us resolve to create a more hearing-friendly environment, from quieter restaurants to hearing-looped auditoriums, worship places, and airports.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

How you and I feel about our lives depends greatly on our social comparisons. We feel smart when others seem dimwitted, and grateful for our health when others are unwell. But sometimes during social comparisons our self-image suffers, and we feel relative deprivation—a perception that we are worse off than others with superior achievements, looks, or income. We may be happy with a raise—until we learn that our co-workers got more. And it’s better, psychologically, to make a salary of $60,000 when friends, neighbors, and co-workers make $30,000, than to make $100,000 when our compatriots make $200,000.

 

Relative deprivation helps us understand why the spread of television—and exposure to others’ wealth—seemingly transformed people’s absolute deprivation (lacking what others have) into relative deprivation (feeling deprived). When and where TV was introduced to various American cities, larceny thefts (shoplifting, bike stealing) soon rose.

 

Relative deprivation also helps us understand the psychological toxicity of today’s growing income inequality. In communities with large inequality—where some people observe others having so much more—average happiness is lower and crime rates and other social pathologies are higher.

 

So should we assume it’s always better to be content and happy than to be frustrated by seemingly unreachable expectations? No—because relative deprivation can also be a force for positive change. People in the former East Germany had a higher standard of living than their counterparts in some other European countries, but a frustratingly lower one than their West German neighbors—and that helped spark their revolt.

 

At a recent gathering of the Templeton foundations, I heard grantee Thor Halvorssen explain how his Human Rights Foundation is working to unite the world against the tyrannies that underlie poverty, famine, war, and torture. One “Flash Drives for Freedom” project responds to the North Korean people’s mistaken belief—enabled by strict censorship and the absence of the Internet—that the rest of the world is worse off than they are.

 

This project is collecting tens of thousands of used and donated USB drives, erasing their content, and refilling them with books, videos, and an off-line Korean Wikipedia that counter Kim Jong-Un’s misinformation. (Yes, Wikipedia can fit on a flash drive—see here—and, yes, most North Koreans have access to devices that can read flash drives.) Finally, it is delivering the goods via drones and balloons with a timing device that ruptures the balloon over North Korean cities, raining down flash drives.

 

The implied psychological rationale: Lay the groundwork for a transformed and free North Korea by harnessing the positive power of relative deprivation.

 

From hrf.org

 

From FlashDrivesForFreedom.org

 

(For David Myers’ other weekly essays on psychological science and everyday life, visit TalkPsych.com.)

You surely know why you chose your town, your partner, and your vocation—all for good reasons, no doubt.

 

But might other unknown reasons—operating below the level of your conscious awareness—also have nudged your choices? Such is the implication of some clever studies of implicit egotism—an automatic tendency to like things we associate with ourselves. For example, we like better a politician or stranger whose face has been morphed with some features of our own (see here and here).

 

I see you yawning: “You needed research to know that we love ourselves and things that resemble us?” The surprise—astonishment, really—comes with the subtle ways in which this phenomenon has been documented. Consider:

  • The name–letter effect. People of varied nationalities, languages, and ages prefer the letters that appear in their own name. People also tend to marry someone whose first or last name resembles their own.
  • The birthdate–number effect. People likewise prefer the numbers that appear in their birthdate. For example, people tend to be attracted to people whose laboratory participant number resembles their birth date.
  • The name–residence effect. Philadelphia, having many more people than Jacksonville, has also had (no surprise) 2.2 times more men named Jack . . . but also 10.4 times more named Philip. Ditto Virginia Beach, which has a disproportionate number of women named Virginia, and St. Louis which, compared to the national average, has 49 percent more men named Louis. Likewise, folks named Park, Hill, Beach, Rock, or Lake are disproportionately likely to live in cities (for example, Park City) that include their names.

 

If that last finding—offered by implicit egotism researchers Brett Pelham, Matthew Mirenberg, and John Jones—doesn’t surprise you, consider an even weirder phenomenon they uncovered: People seem to gravitate to careers identified with their names. In the United States, Dennis, Jerry, and Walter have been equally popular names. But dentists have twice as often been named Dennis as Jerry or Walter, and 2.5 times more often named Denise than the equally popular Beverly or Tammy. Among geoscientists (geologists, geophysicists, and geochemists) people named George and Geoffrey are similarly overrepresented.

 

The phenomenon extends to surname–occupation matching. In 1940 U.S. Census data, people named Baker, Barber, Butcher, and Butler were all 40 percent more likely than expected to work in occupations with their names.

 

Ah, but do Pelham and colleagues have cause-and-effect reversed? For example, aren’t towns often named after people whose descendants stick around? And are people in Virginia more likely to name girls with the state name? Are Georgians more likely to christen their babies Georgia or George? Wasn’t the long-ago village baker—thus so-named—likely to have descendants carrying on the ancestral work?

 

Likely so, grants Pelham. But could that, he asks, explain why states have an excess of people sharing a last-name similarity? California, for example, has an excess of people whose names begin with Cali (as in Califano). Moreover, he reports, people are more likely to move to states and cities with name resemblances—Virginia to Virginia, for example.

 

If the Pelham team is right to think that implicit egotism, though modest, is nonetheless a real unconscious influence on our preferences, might that explain why, with long-ago job offers from three states, I felt drawn to Michigan? And why it was Suzie who sold seashells by the seashore?

 

(For David Myers’ other essays on psychological science and everyday life, including a 2016 essay on much of this implicit egotism research, visit TalkPsych.com.)

The Joy of Being Wrong

Posted by David Myers, May 28, 2019

What virtue is more needed in today’s contentious and polarized world than humility? We need deep-rooted convictions to fuel our passions, but also humility to restrain bull-headed fanaticism.

 

Along with curiosity and skepticism, humility forms the foundation of all science. Humility enables critical thinking, which holds one’s untested beliefs tentatively while assessing others’ ideas with a skeptical but open mind. To accept everything is to be gullible; to deny everything is to be a cynic.

 

In religion and literature, hubris (pride) is first and foundational among the seven deadly sins. When rooted in theism—the assumption that “There is a God, but it’s not me”—humility reminds us of our surest conviction: Some of our beliefs err. We are finite and fallible. We have dignity but not deity. So there’s no great threat when one of our beliefs is overturned or refined—it’s to be expected.  In this spirit, we can, as St. Paul advised, “test everything, hold fast to what is good.”

 

Humility also underlies healthy human relations. In one of his eighteenth-century Sermons, Samuel Johnson recognized the corrosive perils of pride and narcissism: “He that overvalues himself will undervalue others, and he that undervalues others will oppress them.” Even Dale Carnegie, the positive thinking apostle, foresaw the danger: “Each nation feels superior to other nations. That breeds patriotism—and wars.”

 

Unlike pride and narcissism, humility contributes to human flourishing. It opens us to others. Show social psychologists a situation where humility abounds—with accurate self-awareness + modest self-presentation + a focus on others—and they will show you civil discourse, happy marriages, effective leadership, and mental health. And that is the gist of this new 3.5-minute animated Freethink video, “The Joy of Being Wrong.”

 

Note: The video was supported by the Templeton Foundation (which I serve as a trustee) as an expression of its founder’s science-friendly motto: “How little we know, how eager to learn.” The Foundation is also supporting a University of Connecticut initiative on “Humility and Conviction in Public Life,” including blog essays, a monthly newsletter, podcast interviews, and videos of forums and lectures.

 

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

“Self-consciousness [exists] in contrast with

an ‘other,’ a something which is not the self.”

—C. S. Lewis, The Problem of Pain, 1940

 

We are, always and everywhere, self-conscious of how we differ. Search your memory for a social situation in which you were the only person of your gender, sexual orientation, ethnicity, or body type. Perhaps you were the only woman in a group of men, or the only straight person at an LGBTQ gathering.

 

Recalling that situation . . .

  • Were you self-conscious about your identity?
  • How did others respond to you?
  • How did your perceptions of their responses affect your behavior?

 

Differences determine our “spontaneous self-concepts.” If you recalled being very aware of your differences, you are not alone. As social psychologist William McGuire long ago noted, we are conscious of ourselves “insofar as, and in the ways that” we differ. When he and his co-workers invited children to “tell us about yourself,” they mostly mentioned their distinctive attributes. Redheads volunteered their hair color, foreign-born their birthplace, minority children their ethnicity. Spontaneous self-concepts often adapt to a changing group. A Black woman among White women will think of herself as Black, McGuire observed. When moving to a group of Black men, she will become more conscious of being a woman.

 

This identity-shaping phenomenon affects us all. When serving on an American Psychological Association professional task force with 10 others—all women—I immediately was aware of my gender. But it was only on the second day, when I joked to the woman next to me that the bathroom break line would be short for me, that she noticed the group’s gender make-up. In my daily life, surrounded by mostly White colleagues and neighbors, I seldom am cognizant of my race—which becomes a prominent part of my identity when visiting my daughter in South Africa, where I become part of a 9 percent minority. In the U.S., by contrast, a new Pew survey finds that 74 percent of Blacks but only 15 percent of Whites see their race as “being extremely or very important to how they think of themselves.”

 

Our differences may influence how others respond to us. Researchers have also noted a related phenomenon: Our differences, though mostly salient to ourselves, may also affect how others treat us. Being the “different” or “solo” person—a Black person in an otherwise White group, a woman in a male group, or an adult in a group of children—can make a person more visible and seem more influential. Their good and bad qualities also tend to be more noticed (see here and here).

 

If we differ from others around us, it therefore makes adaptive sense for us to be a bit wary. It makes sense for a salient person—a minority race person, a gay person, or a corpulent person—to be alert and sensitive to how they are being treated by an interviewer, a police officer, or a neighbor. Although subsiding, explicit prejudices and implicit biases are real, and stereotypes of a difference can become a self-fulfilling prophecy.

 

Sometimes our perceived differences not only influence how others treat us, but also how we, in turn, respond to them. In one classic experiment, men students conversed by phone with women they mistakenly presumed (from having been shown a fake picture) were either unattractive or attractive. The presumed attractive women (unaware of the picture manipulation) spoke more warmly to the men than did the presumed unattractive women. The researchers’ conclusion: The men’s expectations had led them to act in a way that influenced the women to fulfill the belief that beautiful women are desirable. A stereotype of a difference can become a self-fulfilling prophecy.

 

Our acute self-consciousness of our differences can cause us to exaggerate or misinterpret others’ reactions. At times, our acute self-consciousness of our difference may have funny consequences. Consider two of my favorite social psychology experiments demonstrating the influence of personal perceptions of differences. In the first, which showed the “spotlight effect,” Thomas Gilovich and Kenneth Savitsky asked university students to don a Barry Manilow T-shirt before entering a room with other students. Feeling self-conscious about their difference, those wearing the dorky T-shirt guessed that nearly half of their peers would notice the shirt. Actually, only 23 percent did. The lesson: Our differences—our bad hair day, our hearing loss, our dropping the cafeteria plate—often get noticed and remembered less than we imagine.

 

In another favorite experiment—one of social psychology’s most creative and poignant studies—Robert Kleck and Angelo Strenta used theatrical makeup to place an ear-to-mouth facial scar on college women—supposedly to see how others would react. After each woman checked the real-looking scar in a hand mirror, the experimenter applied “moisturizer” to “keep the makeup from cracking”—but which actually removed the scar.

 

So the scene was set: A woman, feeling terribly self-conscious about her supposedly disfigured face, talks with another woman who knows nothing of all this. Feeling acutely sensitive to how their conversational partner was looking at them, the “disfigured” women saw the partner as more tense, patronizing, and distant than did women in a control condition. Their acute self-consciousness about their presumed difference led them to misinterpret normal mannerisms and comments.

 

The bottom line: Differences define us. We are self-conscious of how we differ. To a lesser extent, others notice how we differ and categorize us according to their own beliefs, which may include stereotypes or unrealistic expectations. And sometimes, thanks to our acute sensitivity to how we differ, we overestimate others’ noticing and reacting. But we can reassure ourselves: if we’re having a bad hair day, others are unlikely to notice and even less likely to remember.

 

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

It’s a core lesson of introductory psychology: Intergroup contact reduces prejudice (especially friendly, equal-status contact). As hundreds of studies show, attitudes—of White folks toward Black folks, of straight folks toward gay folks, and of natives toward immigrants—are influenced not just by what we know but also by whom we know. Prejudice lessens when straight people have gay friends or family, and native-born citizens know immigrants.

 

As I write these words from the place of my childhood—Bainbridge Island, Washington—I am moved to offer a family example of the power of social contact. First, consider a large social experiment—the World War II internment and return of Japanese Americans from (a) California, and (b) Bainbridge, a Manhattan-sized island across Puget Sound from Seattle.

 

In minimal-contact California, Japanese Americans lived mostly in separate enclaves—meaning few Caucasians had Japanese-descent friends. When the California internment ensued, the Hearst newspapers, which had long warned of “the yellow peril,” celebrated, and few bid the internees goodbye. On their return, resistance and “No Japs Here” signs greeted them. Minimal contact enabled maximal prejudice.

 

Bainbridge was a contrasting high-contact condition—and was also the place where (at its ferry dock on March 30, 1942) the internment began. As an island community, all islanders intermingled as school classmates. Their strawberry farms and stores were dispersed throughout the island. The local paper (whose owners later won awards for journalistic courage) editorialized against the internment and then published internee news from the camps for their friends back home. The internees’ fellow islanders watched over their property. And when more than half the internees returned after the war, they were greeted with food and assistance. A history of cooperative contact enabled minimal prejudice.

 

I can personalize this. One of those saying a tearful goodbye on the dock that 1942 day was my father, the insurance agent and friend of many of them. After maintaining their property insurance during the internment, and then writing “the first auto policy on a Japanese American after the war,” his support was remembered decades later—with a tribute at his death by the island’s Japanese American Community president (a former internee):

 

 

My father provides a case example of the contact effect. His support did not stem from his being socially progressive. (He was a conservative Republican businessperson who chaired the Washington State Nixon for President campaign.) His opposition to the internment of his fellow islanders was simply because he knew them. He therefore believed it was colossally unjust to deem them—his friends and neighbors—a threat. As he later wrote, “We became good friends … and it was heartbreaking for us when the war started and the Japanese people on Bainbridge Island were ordered into concentration camps.”

 

This great and sad experiment on the outcomes of racial separation versus integration is being replicated in our own time. People in states with the least contact with immigrants express most hostility toward them. Meanwhile, those who know and benefit from immigrants—as co-workers, employees, businesspeople, health-care workers, students—know to appreciate them.

 

It’s a lesson worth remembering: Cordial and cooperative contact advances acceptance.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

There’s bad news and good news about Americans’ race relations and attitudes.

 

The bad news:

  • People perceive race relations as worsening. In a 2019 Pew survey of 6,637 Americans, 58 percent said that U.S. race relations are now “generally bad,” and 69 percent of those folks saw race relations as “getting worse.”
  • The Trump effect? In the same survey, most (65 percent) said it has become more common for people to express racist or racially insensitive views since Donald Trump’s election.
  • Hate groups are proliferating. The Southern Poverty Law Center has identified 1,020 hate groups—up 30 percent in four years. Such groups feed off dehumanizing views of other races (see here, here, and here).
  • Hate crimes are rising. Although some criticize the SPLC’s hate-group definition, their report coincides with the FBI’s reported 17 percent increase in hate crimes just in 2017. Widely publicized hate crimes, such as the burning of three Louisiana Black churches in March and April of 2019, not to mention the recent synagogue attacks, will surely sustain the perception that Trump-era race relations are worsening.

 

But there is also good news: You likely already know that since the mid-twentieth century, support for school desegregation, equal employment opportunity, and interracial dating and marriage has soared to near-consensus—enabling a 2008 presidential election that Abraham Lincoln probably never imagined. Although most metropolitan areas remain substantially segregated, neighborhood integration has modestly increased since the century’s turn. But the even better news is that both explicit and implicit race prejudice have continued to decline.

 

This good news is reflected in Tessa Charlesworth and Mahzarin Banaji’s new report of nearly 2 million U.S. adults’ explicit and implicit racial attitudes. Since 2007, people’s explicit race attitudes—the extent to which they acknowledged preferring White to Black people—“moved toward neutrality by approximately 37 percent.” Implicit race attitudes—people’s faster speed when pairing negative words with Black faces (and positive words with White faces)—also moved toward neutrality, but with a slower 17 percent shift. (Charlesworth and Banaji also reported changed attitudes toward other social groups: Attitudes toward gay people made the swiftest progress toward neutrality, while negative implicit attitudes toward overweight people have actually increased.)
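
To make the reaction-time logic concrete, here is a simplified sketch of how such an implicit measure can be scored: the difference between mean response times in the two pairing conditions, scaled by their pooled variability. The numbers are made up, and the published IAT scoring algorithm adds further steps (such as error penalties and latency trimming); this is only an illustration of the basic idea.

    import statistics

    def implicit_preference_score(congruent_ms, incongruent_ms):
        # Difference in mean response time between the two pairing blocks,
        # divided by the pooled standard deviation (a simplified D-like score).
        diff = statistics.mean(incongruent_ms) - statistics.mean(congruent_ms)
        pooled_sd = statistics.stdev(congruent_ms + incongruent_ms)
        return diff / pooled_sd

    # Hypothetical response times (milliseconds) for one respondent.
    congruent = [620, 650, 700, 640, 680]      # faster pairing block
    incongruent = [720, 760, 800, 740, 770]    # slower pairing block
    print(round(implicit_preference_score(congruent, incongruent), 2))

A score near zero indicates neutrality; the population-level drift toward zero is the movement Charlesworth and Banaji report.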

 

 

Are these hate-up, prejudice-down findings paradoxical—or even contradictory? Not necessarily. Much as extremes of income—both crushing poverty and excessive wealth—can rise even while average income is stable, so also can extremist racial attitudes increase while overall prejudice does not. Even within healthy communities, a viral disease can spread.

 

Charles Dickens, in A Tale of Two Cities, was prescient: “It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of light, it was the season of darkness, it was the spring of hope, it was the winter of despair.”

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

Showerthoughts

Posted by David Myers, Apr 19, 2019

Part of my text-writing pleasure is interjecting playful thoughts and tongue-in-cheek one-liners that students seem to enjoy: “Does the name Pavlov ring a bell?” (If I don’t enjoy writing—assuming psychology teaching can offer both wisdom and wit—then who will enjoy reading?)

 

As part of my, um, “executive time,” I occasionally visit Reddit’s Showerthoughts—first for delight but also for inspiration. To quote the website, a showerthought is a spontaneous “miniature epiphany that makes the mundane more interesting. . . . Showerthoughts can be funny, poignant, thought-provoking, or even just silly, but they should always prompt people to say ‘Huh, I’ve never thought about it that way before!’”

 

Some Showerthought examples:

  • Your stomach thinks all potato is mashed.
  • We don’t wash our hands, our hands wash each other.
  • Someone coined the term “coin the term.”
  • If you are the best barber in town, you know you can't get the best haircut.
  • The "b" in subtle is subtle.
  • In a nutshell, an acorn is an oak tree.
  • A lot of people die in their living rooms.
  • The two worst prison sentences are life and death.
  • If you swap the W’s in Where? What? and When? with T’s, you end up with their answers.
  • Tea is just a fancy way of saying leaf soup.
  • Everything in the entire universe either is or isn't a potato.

 

For your further pleasure, here are some psychology-relevant examples, each from Showerthoughts or inspired by one-liners that I encountered there. Perhaps (after my editors trim the merely silly) some of these musings will leaven our future editions?

 

Sleep: To fall asleep, fake it till you make it.

 

Loneliness: The world is full of lonely people afraid to make the first move.

 

Relationships: All of your friends you made by talking to strangers.

 

Implicit cognition: The unconscious mind is like the wind: You don’t see it, but you can see its effects.

 

Aging: To age is to shift from a life of “no limits” to “know limits.”

 

Relationships: Marrying someone because they're attractive is like buying a watermelon because it's a really nice shade of green.

 

Memory via acronyms: The acronym of "The Only Day After Yesterday" is TODAY.

 

Eating behavior: When you’re “biting down” on something, you’re actually biting up.

 

Sensory adaptation: Nobody realizes how much noise their air conditioning is making until it abruptly shuts off.

 

Psychokinesis claims: More spoons have been bent by ice cream than by psychics.

 

Mind and brain: When you're thinking about your brain, your brain is just thinking about itself.

 

Death: You will be the last person to die in your lifetime.

 

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

“The sun looks down on nothing half so good as

a household laughing together over a meal.”

—C. S. Lewis, “Membership,” 1949

 

It’s one of life’s curiosities: Taking in food is, everywhere, a common communal activity. For families and friends, eating together is a social event. For creatures with a need to belong, group meals provide the pleasures of both food and friendship.

 

Eating eases meeting. When people share an eating pleasure, such as tasting chocolates, they find food more flavorful. When families sit down for a shared dinner, they eat not only healthier but happier—their lives pausing for connection. And when workers come together for a meal, team-building friendships grow. Such is my experience, as when my psychology text publishing team gathers over a meal (shown here from our recent book-planning meeting in New York City).

Yale psychologist Irving Janis and his colleagues observed long ago that persuasive messages associated with good feelings—such as experienced while eating snacks—are more convincing. Fund solicitors and salespeople understand that when they treat us to a meal, good feelings often generalize to the host. The bonding power of a shared meal is especially great, report Kaitlin Woolley and Ayelet Fishbach, when people—whether friends or strangers—eat from shared bowls. After eating chips and salsa from shared rather than separate bowls, people in their experiments became more cooperative in negotiating wages.

 

Their findings remind me of the convivial spirit I experienced when treated to group dinners with my Chinese hosts on visits to Beijing and Shanghai—with each of us sampling from shared dishes placed around a center-table Lazy Susan (or as the Chinese would say, in translation, a “dinner-table turntable”).

 

      Free image from Pixabay.

Those of us who are North Americans have our own family-style-dinner counterparts—shared fondue pots, tapas dinners, and communal hors d'oeuvres. As Woolley and Fishbach conclude, shared plates → shared minds. Such is the social power of shared meals.

 

Food matters. Perhaps the rapport-building power of breaking bread together can nudge us to prioritize time for sharing more family meals, for offering hospitality to our friends and colleagues, and for welcoming new acquaintances.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

Consider two facts:

 

  1. Worldwide, smartphones and easier social media access exploded starting in 2010, and U.S. smartphone use is projected to keep growing.
  2. Simultaneously—and coincidentally?—teen girls’ rates of depression, anxiety, self-harm, and suicide have mushroomed (for Canadian, American, and British sample data see here, here, and here).

     

So, is there a causal connection? If so, is it big enough to matter?

 

Should parents give (or deny) their middle schoolers smartphones with Instagram or Snapchat accounts? And does amount of daily screen time matter?

 

In quest of answers, my esteemed social psychologist colleague Jonathan Haidt is assembling the available evidence using (and illustrating) three psychological methods. His tentative conclusions:

 

  • Correlational studies ask: Is social media use associated with teen mental health? Study outcomes vary, but overall, there is at least a small correlation between adolescents’ social media hours and their risk of depression, anxiety, and self-harm. (For a concrete sense of what such a correlation computes, see the sketch after this list.) The screen time–disorder association is stronger for social media use than for TV and gaming time, and the link is greater for females who are heavy social media users.
  • Longitudinal studies ask: Does today’s social media use predict future mental health? In six of eight studies, the answer is yes.
  • Experiments ask: Do volunteers randomly assigned to restrict their social media use fare better—on outcomes such as loneliness and depression—than those not so assigned? On balance, yes, says Haidt, but the few such studies have produced mixed results.
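
To illustrate what the first, correlational approach computes, here is a minimal Python sketch with entirely hypothetical data; it estimates the Pearson correlation between daily social media hours and a depression-symptom score. A positive r of modest size would indicate an association, not a cause.

    import statistics

    def pearson_r(xs, ys):
        # Pearson correlation coefficient between two equal-length samples.
        mx, my = statistics.mean(xs), statistics.mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    # Hypothetical survey data for eight teens.
    hours = [0.5, 1.0, 2.0, 3.5, 4.0, 5.5, 6.0, 7.0]   # daily social media hours
    symptoms = [3, 4, 4, 6, 5, 8, 7, 9]                # depression-symptom score
    print(round(pearson_r(hours, symptoms), 2))

Longitudinal studies extend this logic by asking whether hours measured now correlate with symptoms measured later, and experiments replace measured hours with randomly assigned limits.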

 

Haidt’s provisional conclusion can be seen in his tweet.

 

In a Time essay, researcher Jean Twenge (my Social Psychology co-author) offers kindred advice for parents concerned about their children’s social media use:

  • “No phone or tablets in the bedroom at night.”
  • “No using devices within an hour of bedtime.”
  • “Limit device time to less than two hours of leisure time a day.”

 

Haidt also provides us a much-needed model of intellectual humility. In his continuing search for answers, he posts his tentative conclusions and accumulating evidence online, and he welcomes other researchers’ evidence and criticism. He writes,

I am not unbiased. I came to the conclusion that there is a link, and I said so in my book (The Coddling of the American Mind, with Greg Lukianoff). . . . Like all people, I suffer from confirmation bias. [Thus] I need help from critics to improve my thinking and get closer to the truth. If you are a researcher and would like to notify me about other studies, or add comments or counterpoints to this document, please request edit access to the Google Doc, or contact me directly.

 

In our college and AP psychology texts, Nathan DeWall and I commend “a scientific attitude that combines curiosity, skepticism, and humility.” We note that, when combined with the scientific method, the result is a self-correcting road toward truth. By embracing this spirit, Haidt exemplifies psychological science at its best—exploring an important question by all available methods . . . drawing initial conclusions . . . yet holding them tentatively, while welcoming skeptical scrutiny and further evidence. As he mused (when I shared a draft of this essay), “It is amazing how much I have learned, and refined my views, just by asking people to make me smarter.”

 

How true for us all. The pack is greater than the wolf.

 

 (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

Out of Many, One

Posted by David Myers, Mar 28, 2019

Perhaps you, too, feel it like never before—intense contempt for your political opposites. National Election Surveys reveal that the share of U.S. Republicans and of Democrats who hate the other party each soared from 20 percent in 2000 to nearly 50 percent in 2016. Small wonder, given that 42 percent in both parties agree that those in the other party “are downright evil.”

 

Should the government “do more to help the needy”? Is racial discrimination a main reason “why many Black people can’t get ahead these days”? Do immigrants “strengthen the country with their hard work and talents”? The partisan divergence in response to such questions has never been greater, reports the Pew Research Center. The overlap between conservative Democrats and progressive Republicans has never been less. And fewer folks than ever hold a mix of conservative and liberal views.

 

Americans are polarized. There seems no bridge between Sean Hannity and Rachel Maddow, between MAGA red-hatters and Alexandria Ocasio-Cortez admirers. We are a nation of opposing hidden tribes. “Some people’s situations are so challenging that no amount of work will allow them to find success,” agree 95 percent of “progressive activists.” But no, say “devoted conservatives,” who are 92 percent agreed that “people who work hard can find success no matter what situation they were born into.”

Do we exaggerate?

But I overstate. Although the political extremes are inverses, studies (here and here) show that most liberals and conservatives exaggerate their differences. On issues such as immigration, trade, and taxes, they overestimate the extremity of a “typical” member of the other party. And for some ideas—higher taxes on the ultra-wealthy, Medicare negotiation of lower drug prices, background checks on gun sales—there is bipartisan supermajority support.

 

Differences, we notice; similarities, we neglect
It’s a universal truth: Differences draw our attention. As individuals, we’re keenly aware of how we differ from others. Asked to describe themselves, redheads are more likely to mention their hair color; the foreign-born, their birthplace; and the left-handed, their handedness. Living in Scotland, I become conscious of my American identity and accent. Visiting my daughter in South Africa, I am mindful of my race. As the sole male on a professional committee of females, I was aware of my gender. One is “conscious of oneself insofar as, and in the ways that, one is different,” observed the late social psychologist William McGuire.

 

Likewise, when the people of two cultures are similar, they nevertheless will attend to their differences—even if those differences are small. Rivalries often are most intense with another group that most resembles one’s own. My college shares what ESPN and others have acclaimed as the greatest small-college sports rivalry—with a nearby college that shares its Protestant Dutch history—rather like (in Jonathan Swift’s Gulliver’s Travels) the war between the Little-Endians, who preferred to break their eggs on the small end, and the Big-Endians, who did so on the big end.

 

Our similarities exceed our differences

As members of one human family, we share not only our biology—cut us and we bleed—but our behaviors. We all wake and sleep, prefer sweet tastes to sour, fear snakes more than snails, and know how to read smiles and frowns. An alien anthropologist could land anywhere on Earth and find people laughing and crying, singing and worshiping, and fearing strangers while favoring their own family and neighbors. Although differences hijack our attention, we are all kin beneath the skin.

 

Nearly two decades ago, the communitarian sociologist Amitai Etzioni identified “core values” that are “embraced by most Americans of all races and ethnic groups.” Eight in ten Americans—with agreement across races—desired “fair treatment for all, without prejudice or discrimination.” More than 8 in 10 in every demographic group agreed that freedom must be tempered by personal responsibility, and that it was “extremely important” to spend tax dollars on “reducing crime” and “reducing illegal drug use” among youth. A more recent study of nearly 90,000 people across world cultures and of varying gender, age, education, income, and religiosity confirmed that “similarities between groups of people are large and important.” 

 

Believing that there is common ground, the nonprofit Better Angels movement aims “to unite red and blue Americans in a working alliance to depolarize America.” They do this in several ways:

  • “We try to understand the other side’s point of view, even if we don’t agree with it.”
  • “We engage those we disagree with, looking for common ground and ways to work together.”
  • “We support principles that bring us together rather than divide us.” 

 

We will still disagree. We do have real differences, including the social identities and values that define us. Nevertheless, our challenge now is to affirm both our diversity and our unifying ideals, and thus to renew the founding idea of America: diversity within unity. E pluribus unum. Out of many, one.

 

 (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

In the aftermath of the New Zealand massacre of Muslims at worship, American pundits have wondered: While the perpetrator alone is responsible for the slaughter, do the expressed attitudes of nationalist, anti-immigrant world leaders increase White nationalism—and thus the risk of such violence?

 

Consider Donald Trump’s rhetoric against supposed rapist, drug-dealing immigrants; his retweeting of anti-Muslim rhetoric; his saying that the Charlottesville White nationalists included some “very fine people”; or his condoning violence at his rallies and against the media. Do these actions serve to normalize such attitudes and behavior? Is the Southern Poverty Law Center right to suppose that hatemongering is “emboldened [and] energized” by such rhetoric? Is the New Zealand gunman’s reportedly lauding Trump as “a symbol of White supremacy” something more than a murderer’s misguided rantings?

 

In response, many people—particularly those close to Trump—attributed responsibility to the gunman. The President’s acting chief of staff argued that the shooter was a “disturbed individual” and that it is “absurd” to link one national leader’s rhetoric to an “evil person’s” behavior. We social psychologists call this a “dispositional attribution” rather than a “situational attribution.”

 

As I noted in a 2017 essay, two recent surveys and an experiment show that dispositions are shaped by social contexts. Hate speech (surprised?) feeds hate. Those frequently exposed to hate speech become desensitized to it, which in turn leads to lower evaluations of, and greater prejudice toward, its targets. Prejudice begets prejudice.

 

To be sure, leaders’ words are not a direct cause of individuals’ dastardly actions. Yet presidents, prime ministers, and celebrities do voice and amplify social norms. To paraphrase social psychologists Chris Crandall and Mark White, people express prejudices that are socially acceptable and suppress those that are not. When prejudice toward a particular group seems socially sanctioned, acts of prejudice—from insults to vandalism to violence—increase as well. Norms matter.

 

The FBI reports a 5 percent increase in hate crimes during 2016 and a further 17 percent increase during 2017—with hate crimes reportedly more than doubling in counties that hosted a Trump rally. The Anti-Defamation League reports that 2018 “was a particularly active year for right-wing extremist murders: Every single extremist killing—from Pittsburgh to Parkland—had a link to right-wing extremism.” Again, we ask: Coincidence? Or is there something more at work? If so, is there a mirror-image benevolent effect of New Zealand prime minister Jacinda Ardern’s saying of her nation’s Muslim immigrants, “They are us”?

 

 (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)