The Psychology Community

Dog walking, according to a recent news report, is healthy for people. That little report follows three massive new research reviews that confirm earlier findings of the mental health benefits of exercise:

  • An American Journal of Psychiatry analysis of 49 studies followed 266,939 people for an average of 7 years. In every part of the world, people of all ages had a lower risk of becoming depressed if physically active rather than inactive.
  • JAMA Psychiatry reports that, for teens, “regular physical activity [contributes] to positive mental health.”
  • Another JAMA Psychiatry analysis of 33 clinical trials found an additional depression-protecting effect of “resistance exercise training” (such as weight lifting and strength-building).

 


 

A skeptic might wonder if mentally healthy people simply have more energy for exercise. (Being really depressed comes with a heaviness that may entail trouble getting out of bed.) But the “prospective studies”—which follow lives through time—can discern a temporal sequence, with exercise predicting reduced future depression risk. Moreover, many clinical trial experiments—with people assigned to exercise or control conditions—confirm that exercise not only contributes to health and longevity but also treats and protects against depression and anxiety. Mens sana in corpore sano: a healthy mind in a healthy body.

 

Indeed, given the modest benefits of antidepressant drugs, some researchers are now recommending therapeutic lifestyle change as a potentially more potent therapy for mild to moderate depression—or as protection against it. When people modify their lives to include the exercise, sunlight exposure, ample sleep, and social connections that marked our ancestors’ lives—a lifestyle for which they were bred—they tend to flourish, with greater vitality and joy. In one study, substantial depression relief was experienced by 19 percent of patients in a treatment-as-usual control group and by 68 percent of those undergoing therapeutic lifestyle change.

 

Finally, more good news—for dog walkers: Dog walking is said to be healthy and calming for dogs, too. But I suspect that will not surprise any dog owner or their dog.

“The heart has its reasons which reason does not know.”

~Pascal, Pensées, 1670

 

“He that trusteth in his own heart is a fool.”

~Proverbs 28:26

 

“Buried deep within each and every one of us, there is an instinctive, heart-felt awareness” that can guide our behavior. So proclaimed Prince Charles in a 2000 lecture. Trust your gut instincts.

 

Prince Charles has much company. “I’m a gut player. I rely on my instincts,” explained President George W. Bush in justifying his decision to launch the Iraq war, after earlier talking with Vladimir Putin and declaring himself “able to get a sense of his soul.”

 

“Within the first minute [of meeting Kim Jong-un] I’ll know,” declared President Trump. “My touch, my feel—that’s what I do.” Afterwards he added, “We had a great chemistry—you understand how I feel about chemistry.” The heart has its reasons.

 

But is there also wisdom in physicist Richard Feynman’s channeling of the skepticism of King Solomon’s Proverb? “The first principle,” said Feynman, “is that you must not fool yourself—and you are the easiest person to fool.”

 

In sifting intuition’s powers and perils, psychological science has some wisdom.

 

First, our out-of-sight, automatic, intuitive information processing is HUGE. In Psychology, 12th Edition, Nathan DeWall and I offer some examples:

  • Automatic processing: We glide through life mostly on autopilot. Our information processing is mostly implicit, unconscious, behind the scenes—and often guided by “fast and frugal” heuristics (mental shortcuts).
  • Intuitive expertise: After mastering driving (or chess), people can react to situations intuitively, without rational analysis.
  • Reading others: We are skilled at reading “thin slices” of behavior—as when judging someone’s warmth from a 6-second video clip.
  • Blindsight: Some blind people even display “blindsight”—they can intuitively place an envelope in a mail slot they cannot consciously see.

 

Second, our intuition is perilous. Psychology is flush with examples of smart people’s predictable and sometimes tragic intuitive errors:

  • Human lie detection: People barely surpass chance when intuiting whether others are lying or truth-telling. (American presidents might want to remember this when judging Putin’s or Kim Jong-un’s trustworthiness.)
  • Intuitive prejudice: As demonstrated in some police responses to ambiguous situations, implicit biases can—without any conscious malevolent intent—affect our perceptions and reactions. (Is that man pulling out a gun or a phone?)
  • Intuitive fears: We fear things that kill people vividly and memorably (because we intuitively judge risks by how readily images of a threat come to mind). Thus we may—mistakenly—fear flying more than driving, shark attacks more than drowning, school mass shootings more than street and home shootings.
  • The “interview illusion”: Given our ability to read warmth from thin slices, it’s understandable that employment interviewers routinely overestimate their ability to predict future job success from unstructured get-acquainted interviews. But aptitude tests, work samples, job-knowledge tests, and peer ratings of past job performance are all better predictors. (Even the lengthiest of interviews—the mate-selection process—is a fragile predictor of long-term marital success.)

 

The bottom line: Intuition—automatic, implicit, unreasoned thoughts and feelings—grows from our experience, feeds our creativity, and guides our lives. Intuition is powerful. But it also is perilous, especially when we overfeel and underthink. Unchecked, uncritical intuition sometimes leads us into ill-fated relationships, feeds overconfident predictions, and even leads us into war.

Recent U.S. school shootings outraged the nation and produced calls for action. One response, from the International Society for Research on Aggression, was the formation of a Youth Violence Commission, composed of 16 experts led by Ohio State social psychologist Brad Bushman. Their task: To identify factors that do, and do not, predict youth violence—behavior committed by a 15- to 20-year-old that’s intended to cause unwanted harm.

 

 


 

The Commission has just released its final report, which it has shared with President Trump, Vice President Pence, Education Secretary DeVos, and all governors, senators, and congressional representatives.

 

The Commission first notes big differences between highly publicized mass shootings (rare, occurring mostly in smaller towns and suburbs, using varied legal guns) and street shootings (more common, concentrated in inner cities, using illegal handguns).  It then addresses the factors that do and do not predict youth violence.

 

RISK FACTORS THAT PREDICT YOUTH VIOLENCE

 

Personal Factors:

  • Gender—related to male biology and masculinity norms.
  • Early childhood aggressive behavior—past behavior predicts future behavior.
  • Personality—low anger control, often manifested in four “dark” personality traits: narcissism, psychopathy, Machiavellianism, and sadism.
  • Obsessions with weapons or death.

 

Environmental Factors:

  • Easy access to guns.
  • Social exclusion and isolation—sometimes including being bullied.
  • Family and neighborhood—family separation, child maltreatment, neighborhood violence.
  • Media violence—a link “found in every country where studies have been conducted.”
  • School characteristics—with large class sizes contributing to social isolation.
  • Substance use—a factor in street shootings but not school shootings.
  • Stressful events—including frustration, provocation, and heat.

 

FACTORS THAT DO NOT PREDICT YOUTH VIOLENCE

 

The commission found that the following do not substantially predict youth violence:

  • Mental health problems—most people with mental illness are not violent, and most violent people are not mentally ill (with substance abuse and psychotic delusions being exceptions).
  • Low self-esteem—people prone to violence actually tend to have inflated or narcissistic self-esteem.
  • Armed teachers—more guns = more risk, and they send a message that schools are unsafe.

 

The concluding good news is that training programs can increase youth self-control, enhance empathy and conflict resolution, and reduce delinquency. Moreover, mass media could help by reducing attention to shootings, thereby minimizing the opportunity for modeling and social scripts that such portrayals provide to at-risk youth.

“When you know a thing, to hold that you know it; and when you do not know a thing, to allow that you do not know it; this is knowledge.”
~Confucius (551–479 B.C.E.), Analects

One of the pleasures of joining seventeen scholars from six countries at last week’s 20th Sydney Symposium on Social Psychology was getting to know the affable and articulate David Dunning.

 

David Dunning (shown here) recapped a stream of studies on human overconfidence. When judging the accuracy of their factual beliefs (“Did Shakespeare write more than 30 plays?”) or when predicting future events (such as the year-end stock market value), people are typically more confident than correct. Such cognitive conceit fuels stockbrokers’ beliefs that they can outperform the market—which, as a group, they cannot. And it feeds the planning fallacy—the tendency of contractors, students, and others to overestimate how quickly they will complete projects.

 

To this list of studies, Dunning and Justin Kruger added their own discovery, now known as the Dunning-Kruger effect: Those who score lowest on various tests of knowledge, logic, and grammar are often ignorant of their own ignorance. Never realizing all the word possibilities I miss when playing Scrabble, I may overestimate my verbal competence.

 

Likewise—to make this even more personal—those of us with hearing loss often are the last to recognize such . . . not because we are repressing our loss, but simply because we are unaware of what we haven’t heard (and of what others do hear). To Daniel Kahneman’s kindred observation that we are “blind to our [cognitive] blindness,” I would add that we can also be literally deaf to our deafness. We don’t know what we don’t know.

 

Thus ironically, and often tragically, those who lack expertise in an area suffer a double-curse—they make misjudgments, which they fail to recognize as errors. This leads them, notes Dunning, to conclude “they are doing just fine.”

 

Note what Dunning is not saying—that some people are just plain stupid, à la Warren Buffett.


Rather, all of us have domains of inexpertise, in which we are ignorant of our ignorance.

 

But there are two remedies. When people express strong views on topics about which they lack expertise, we can, as researcher Philip Fernbach found, ask them to explain the details: “So exactly how would a cap-and-trade carbon emissions tax work?” A stumbling response can raise their awareness of their own ignorance, lessening their certainty.

 

Second, we can, for our own part, embrace humility. For anything that matters, we can welcome criticism and advice. Another personal example: As I write about psychological science, I confess to savoring my own words. As I draft this essay, I am taking joy in creating the flow of ideas, playing with the phrasing, and then fine-tuning the product to seeming perfection. Surely, this time my editors—Kathryn Brownson and Nancy Fleming—will, for once, find nothing to improve upon? But always they find glitches, ambiguities, or infelicities to which I was blind.

 

Perhaps that is your story, too? Your best work, when reviewed by others . . . your best tentative decisions, when assessed by your peers . . . your best plans, when judged by consultants . . . turn into something even better than you, working solo, could have created. Our colleagues, friends, and spouses often save us from ourselves. The pack is greater than the wolf.

 

In response to my wondering if his famed phenomenon had impacted his life, Dunning acknowledged that he has received—and in all but one instance rebuffed—a stream of journalist pleas: Could he please apply the blindness-to-one’s-own-incompetence principle to today’s American political leadership?

 

But stay tuned. Dunning is putting the finishing touches on a general audience trade book (with one possible title: You Don’t Know What You Don’t Know—and Why It Matters).

It’s well-established that:

  • brain cells survive for a time after cardiac arrest and even after declared death.
  • some people have been resuscitated after cardiac arrest— even hours after, if they were linked to blood-oxygenating and heart-massaging machines.
  • a fraction of resuscitated people have reported experiencing a bright light, a tunnel, a replay of old memories, and/or out-of-body sensations. For some, these experiences later enhanced their spirituality or personal growth.

 

Recently, I enjoyed listening to and questioning a university physician who is launching a major multi-site study of cardiac arrest, resuscitation, and near-death experiences. As a dualist (one who assumes mind and body are distinct, though interacting), he is impressed by survivors’ reports of floating up to the ceiling, looking down on the scene below, and observing efforts to revive them. Thus, his study seeks to determine whether such patients can—while presumably separated from their supine body—perceive and later recall images displayed on an elevated, ceiling-facing iPad.

 

Care to predict the result?

 

My own prediction is based on three lines of research:

  • Parapsychological efforts have failed to confirm out-of-body travel with remote viewing.
  • A mountain of cognitive neuroscience findings link brain and mind.
  • Scientific observations show that brain oxygen deprivation and hallucinogenic drugs can cause similar mystical experiences (complete with the tunnel, beam of light, and so forth).

Thus, I expect there will be no replicable evidence of near-death minds viewing events remote from the body.

 

Setting my assumptions and expectations aside, I asked the physician-researcher about some of his assumptions:

  1. For how long do you think the mind would survive clinical death? Minutes? Hours? Forever? (His answer, if I understood, was uncertainty.)
  2. When resuscitated, the mind would rejoin and travel again with the body, yes? When the patient is wheeled to a new room, the mind rides along? (That assumption was not contested.)
  3. What about the Hiroshima victims whose bodies were instantly vaporized? Are you assuming that–for at least a time—their consciousness or mind survived that instant and complete loss of their brain and body? (His clear answer: Yes.)

 

That made me wonder: If a mind could post-date the body, could it also predate it? Or does the body create the mind, which grows with it, but which then, like dandelion seeds, floats away from it?

 

The brain-mind relationship appeared in another presentation at the same session. A European university philosopher of mind argued that, in addition to the dualist view (which he regards as “dead”) and the reductionist view (Francis Crick: “You’re nothing but a pack of neurons”), there is a third option. This is the nonreductive physicalist view—“nonreductive” because the mind has its own integrity and top-down causal properties, and “physicalist” because the mind emerges from the brain and is bound to the brain.

 

The 20th century’s final decade was “the decade of the brain,” and the 21st century’s first decade was “the decade of the mind.” Perhaps we could say that today’s science and philosophy mark this as a decade of the brain-mind relationship? For these scholars, there are miles to go before they enter their final sleep—or should I say until their body evicts their mind?

 

Addendum for those with religious interests: Two of my friends—British cognitive neuroscientist Malcolm Jeeves and American developmental psychologist Thomas Ludwig—reflect on these and other matters in their just-published book, Psychological Science and Christian Faith. If you think that biblical religion assumes a death-denying dualism (a la Plato’s immortal soul) prepare to be surprised.

Money matters. For entering U.S. collegians, the number one life goal—surpassing “helping others in difficulty,” “raising a family,” and 17 other aspirations—is “being very well off financially.” In the most recent UCLA “American Freshman” survey, 82 percent rated being very well off as “essential” or “very important.” Think of it as today’s American dream: life, liberty, and the purchase of happiness.

 

For human flourishing, fiscal fitness indeed matters . . . up to a point. In repeated surveys across nations, a middle-class income—and being able to control one’s life—beats being poor. Moreover, people in developed nations tend to be happier and more satisfied than those in the poorest of nations.

 

Beyond the middle-class level, we seem to have an income “satiation point,” at which the income-happiness correlation tapers off and happiness no longer increases. For individuals in poor countries, that point is close to $40,000; for those in rich countries, about $90,000, reports a new analysis of 1.7 million Gallup interviews by Andrew Jebb and colleagues.
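To make the idea of a satiation point concrete, here is a minimal toy sketch in Python. The functional form and the satisfaction values it prints are invented for illustration only; they are not Jebb and colleagues’ actual model or data—only the $90,000 rich-country figure comes from the text above.

```python
# Toy illustration of an income "satiation point": satisfaction rises with the
# logarithm of income up to roughly $90,000 (the rich-country figure cited above),
# then flattens. Invented functional form, for illustration only.
import numpy as np

SATIATION = 90_000  # approximate satiation point for rich countries, per the text

def toy_satisfaction(income):
    """Illustrative curve: logarithmic gains below the satiation point, flat above."""
    return np.log10(min(income, SATIATION) / 10_000)

for income in (10_000, 40_000, 90_000, 200_000, 1_000_000):
    print(f"${income:>9,}: satisfaction ≈ {toy_satisfaction(income):.2f} (arbitrary units)")
# Moving from $10,000 to $90,000 raises the toy score; beyond $90,000 it does not change.
```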

 

And consider: The average U.S. per-person disposable income, adjusted for inflation, has happily tripled over the last 60 years, enabling most Americans to enjoy today’s wonderments, from home air conditioning to wintertime fresh fruit to smart phones. “Happily,” because few of us wish to return to yesteryear. Yet not that happily, because psychological well-being has not floated upward with the rising economic tide. The proportion of “very happy” adults has remained at 3 in 10, and depression has been on the rise.

 

What triggers the diminishing psychological payoff from excess income? Two factors:

  • Our human capacity for adaptation: Continual pleasures subside.
  • Our tendency to assess our own circumstances by “social comparison” with those around us—and more often those above us. People with a $40,000 income tend to think $80,000 would enable them to feel wealthy—whereas those at $80,000 say they would need substantially more. Become a millionaire and move to a rich neighborhood, and you still may not feel rich. As Theodore Roosevelt said, “Comparison is the thief of joy.”

 

The outer limit of the wealth–well-being relationship also appears in two new surveys (by Grant Donnelly, Tianyi Zheng, Emily Haisley, and Michael Norton) of an international bank’s high-net-worth clients. As you can see in figures I created from their data, having $2 million and having $10 million are about the same, psychologically speaking.

 

If wealth increases well-being only up to a point—and much evidence indicates that is so—and if extreme inequality is socially toxic (great inequality in a community or country predicts lower life quality and more social pathology), then could societies increase human flourishing with economic and tax policies that spread wealth?

 

Let’s make this personal: If earning, accumulating, and spending money increases our happiness only to a satiation point, then why do we spend our money for (quoting the prophet Isaiah) “that which is not bread” and our “labor for that which does not satisfy?” Quite apart from moral considerations, what’s to be lost by sharing our wealth above the income-happiness satiation point?

 

And if one is blessed with wealth, what’s to be gained by showering inherited wealth, above the satiation point, on our children? (Consider, too, another Donnelly and colleagues finding: Inherited wealth entails less happiness than earned wealth.)

 

Ergo, whether we and our children drive BMWs or Honda Fits, swim in our backyard pool or at the local Y, eat filet mignon or fish filet sandwiches, hardly matters. That fact of life, combined with the more important facts of the world’s needs, makes the case for philanthropy.

“The most famous psychological studies are often wrong, fraudulent, or outdated.” With this headline, Vox joins critics that question the reproducibility and integrity of psychological science’s findings.

 

Are many psychology findings indeed untrustworthy? In 2015, news from a mass replication study—that only 36 percent of nearly 100 psychological science studies successfully reproduced the previous findings—rattled our field. Some challenged the conclusion: “Our analysis completely invalidates the pessimistic conclusions that many have drawn from this landmark study,” said Harvard psychologist Daniel Gilbert.

 

For introductory psychology teachers, those supposed failures to replicate need not have been a huge concern. Introductory psych textbooks focus on major, well-established findings and ideas. (For example, only one of the 60+ unreplicated studies was among the 5,174 references in my text at the time, necessitating the deletion of only half a sentence in its next edition.)

 

But here are more recent criticisms—about six famous and favorite studies:

  • Philip Zimbardo stage-managed the Stanford prison study to get his wished-for results, and those who volunteer for such an experiment may be atypically aggressive and authoritarian (see here and here). Moreover, as Stephen Reicher and Alex Haslam showed when they recreated a prison experiment with the BBC (albeit as reality TV rather than a replication), groups don’t necessarily corrupt—people can collectively choose to behave in varied ways. For such reasons, the Stanford prison study may in the future disappear from more intro psych texts. But for the present, some teachers still use this study as a vivid illustration of the potential corrupting power of evil situations. (Moreover, Philip Zimbardo and colleagues have released responses here.)
  • Muzafer Sherif similarly managed his famed boys’ camp study of conflict and cooperation to produce desired results (see here). Yet my friend Stephen Reicher, whom I met over coffee in St. Andrews two weeks ago, still considers the Sherif study a demonstration (even if somewhat staged) of the toxicity of competition and the benefits of cooperation.
  • The facial-feedback effect—the tendency of facial muscles to trigger associated feelings—doesn’t replicate (see here). The failure to reproduce that favorite study (which my students and I have experienced by holding a pencil with our teeth vs. our pouting lips) wiped a smile off my face. But then the original researcher, Fritz Strack, pointed us to 20 successful replications. And a new study sleuths a crucial difference (self-awareness effects due to camera proximity) between the studies that do and don’t reproduce the facial feedback phenomenon. Even without a pencil in my mouth, I am smiling again.
  • The ego-depletion effect—that self-control is like a muscle (weakened by exercise, replenished with rest, and strengthened with exercise)—also failed a multi-lab replication (here). But a massive new 40-lab study, with data analyzed by an independent consultant—“innovative, rigorous” science, said Center for Open Science founder Brian Nosek—did show evidence of a small depletion phenomenon.
  • Kitty Genovese wasn’t actually murdered in front of 38 apartment bystanders who were all nonresponsive (see here). Indeed. Nevertheless, the unresponsive bystander narrative—initiated by police after the Genovese murder—inspired important experiments on the conditions under which bystanders will notice and respond in crisis situations.
  • Mischel’s marshmallow study (children who delay gratification enjoy future success) got roasted by a big new failure to replicate. As I explain in last week’s www.TalkPsych.com essay, the researchers did find an association between 4½-year-olds’ ability to delay gratification and later school achievement, but it was modest and related to other factors. The take-home lesson: Psychological research does not show that a single act of behavior is a reliable predictor of a child’s life trajectory. Yet life success does grow from impulse restraint. When deciding whether to study or party, whether to spend now or save for retirement, foregoing small pleasures can lead to bigger rewards later.

 

One positive outcome of these challenges to psychological science has been new scientific reporting standards that enable replications, along with the establishment of the Center for Open Science that aims to increase scientific openness, integrity, and reproducibility. (I was pleased recently to recommend to fellow Templeton World Charity Foundation trustees a multi-million dollar grant which will support the Center’s mission.)

 

The big picture: Regardless of findings, research replications are part of good science. Science, like mountain climbing, is a process that leads us upward, but with times of feeling like we have lost ground along the way. Any single study provides initial evidence, which can inspire follow-up research that enables us to refine a phenomenon and to understand its scope. Through replication—by winnowing the chaff and refining the wheat—psychological science marches forward.

Marshmallow Study Roasted?

Posted by David Myers, Jun 11, 2018

While on break in St. Andrews (Scotland) last week, I enjoyed a dinner conversation with a celebrated M.I.T. developmental psychologist and a similarly brilliant University of St. Andrews researcher. Among our dinner topics was an impressive recent conceptual replication of Walter Mischel’s famous marshmallow test.

 

Mischel and his colleagues, as you may recall, gave 4-year-olds a choice between one marshmallow now or two marshmallows later. Their long-term studies showed that those with the willpower to delay gratification as preschoolers went on as adults to have higher college-completion rates and incomes and fewer addiction problems. This gem in psychology’s lore—that a preschooler’s single behavioral act could predict that child’s life trajectory—is a favorite study for thousands of psychology instructors, and has made it into popular culture—from Sesame Street’s Cookie Monster to the conversation of Barack Obama.

 

In their recent replication of Mischel’s study, Tyler Watts, Greg Duncan, and Haonan Quan followed a much larger and more diverse sample: 918 children who, at age 4½, took the marshmallow test as part of a 10-site National Institute of Child Health and Human Development child study. Observing the children’s school achievement at age 15, the researchers noted a modest, statistically significant association “between early delay ability and later achievement.” But after controlling for other factors, such as the child’s intelligence, family social status, and education, the effect shriveled.

 

“Of course!” said one of my dinner companions. Family socioeconomic status (SES) matters. It influences both children’s willingness to await the second marshmallow, and also academic and economic success. As other evidence indicates—see here and here—it is reasonable for children in poverty to seize what’s available now and to not trust promises of greater future rewards.

 

But my other dinner companion and I posited another factor: Any predictive variable can have its juice drained when we control for myriad other variables. Perhaps part of a child’s ability to delay gratification is intelligence (and the ability to contemplate the future) and experience. If so, controlling for such variables and then asking what’s the residual effect of delay of gratification, per se, is like asking what’s the real effect of a hurricane, per se, after controlling for barometric pressure, wind speed, and storm surge. A hurricane is a package variable, as is delay of gratification.
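To illustrate that “package variable” argument, here is a minimal simulation sketch in Python with synthetic data. The variable names, coefficients, and model below are invented for illustration only; this is not the NICHD dataset or Watts and colleagues’ actual analysis.

```python
# Synthetic-data sketch: a "package" predictor (delay of gratification) that partly
# reflects SES and cognitive ability looks strong on its own, but its coefficient
# shrinks once those ingredients are entered as statistical controls.
import numpy as np

rng = np.random.default_rng(0)
n = 918  # sample size borrowed from the study described above, purely for flavor

ses = rng.normal(size=n)                                   # family socioeconomic status
ability = 0.5 * ses + rng.normal(size=n)                   # cognitive ability, partly tied to SES
delay = 0.4 * ses + 0.4 * ability + rng.normal(size=n)     # delay of gratification
achieve = 0.5 * ses + 0.5 * ability + 0.1 * delay + rng.normal(size=n)  # later achievement

def ols_coefs(y, predictors):
    """Ordinary least squares coefficients (intercept dropped) for y on the predictors."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

print("delay alone:          ", ols_coefs(achieve, [delay]))
print("delay + SES + ability:", ols_coefs(achieve, [delay, ses, ability]))
# The bivariate coefficient is inflated because delay carries SES and ability with it;
# controlling for those ingredients shrinks it toward its small direct effect (0.1 here).
```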

 

I put that argument to Tyler Watts, who offered this response:

If the ability to delay gratification is really a symptom of other characteristics in a child's life, then interventions designed to change only delay of gratification (but not those other characteristics) will probably not have the effect that you would expect based on the correlation Mischel and Shoda reported. So, if it’s the case that in order to generate the long-term effects reported in Mischel's work, interventions would have to target some combination of SES, parenting, and general cognitive ability, then it seems important to recognize that.  

 

This major new study prompts our reassessing the presumed predictive power of the famed marshmallow test. Given what we’ve known about how hard it is to predict to or from single acts of behavior—or single items on a test or questionnaire—we should not have been surprised. And we should not exaggerate the importance of teaching delay of gratification, apart from other important predictors of life success.

 

But the new findings do not undermine a deeper lesson: Part of moral development and life success is gaining self-discipline in restraining one’s impulses. To be mature is to forego small pleasures now to earn bigger rewards later. Thus, teacher ratings of children’s self-control (across countless observations) do predict future employment. And parent ratings of young children’s self-regulation predict future social success. Self-control matters.

The Malleability of Mind

Posted by David Myers, May 31, 2018

How many of us have felt dismay over a friend or family member’s stubborn resistance to our arguments or evidence showing (we believe) that Donald Trump is (or isn’t) making America great again, or that immigrants are (or aren’t) a threat to our way of life? Sometimes, it seems, people just stubbornly resist change.

 

Recently, however, I’ve also been struck by the pliability of the human mind. We are adaptive creatures, with malleable minds.

 

Over time, the power of social influence is remarkable. Generations change. And attitudes change. They follow our behavior, adjust to our tribal norms, or simply become informed by education.

 

The power of social influence appears in current attitudes toward free trade, as the moderate-conservative columnist David Brooks illustrates: “As late as 2015, Republican voters overwhelmingly supported free trade. Now they overwhelmingly oppose it. The shift didn’t happen because of some mass reappraisal of the evidence; it’s just that tribal orthodoxy shifted and everyone followed.”

 

Those who love history can point out many other such shifts. After Pearl Harbor, Japan and Japanese people became, in many American minds surveyed by Gallup, untrustworthy and disliked. But then after the war, they soon transformed into our “intelligent, hard-working, self-disciplined, resourceful allies.” Likewise, Germans across two wars were hated, then admired, then hated again, then once again admired.

 

Or consider within thin slices of recent human history the transformational changes in our thinking about race, gender, and sexual orientation:

  • Race. In 1958, only 4 percent of Americans approved of “marriage between Blacks and Whites.” In 2013, 87 percent approved.
  • Gender. In 1967, two-thirds of first-year American college students agreed that “the activities of married women are best confined to the home and family.” Today, the question, which would offend many, is no longer asked.
  • Gay marriage. In Gallup surveys, same-sex marriage—approved by only 27 percent of Americans in 1996—is now welcomed by nearly two-thirds.

 

Consider also, from within the evangelical culture that I know well, the astonishing results of two Public Religion Research Institute polls. The first, in 2011, asked voters if “an elected official who commits an immoral act in their personal life can still behave ethically and fulfill their duties in their public and professional life.” Only 30 percent of White evangelical Protestants agreed. By July of 2017, with President Trump in office, 70 percent of White evangelicals said they would be willing to separate public and personal conduct.

 

An April 22, 2018, Doonesbury satirized this “head-spinning reversal” (quoting the pollster). A cartoon pastor announces to his congregation the revised definition of sin:

“To clarify, we now condone the following conduct: lewdness, vulgarity, profanity, adultery, and sexual assault. Exemptions to Christian values also include greed, bullying, conspiring, boasting, lying, cheating, sloth, envy, wrath, gluttony, and pride. Others TBA. Lastly we’re willing to overlook biblical illiteracy, church nonattendance, and no credible sign of faith.”

 

In a recent essay, I reflected (as a person of faith) on the shift among self-described “evangelicals”: The great temptation is to invoke “God” to justify one’s politics. “Piety is the mask,” observed William James.

 

This tendency to make God in our own image was strikingly evident in a provocative study by social psychologist Nicholas Epley and his colleagues.  Most people, they reported, believe that God agrees with whatever they believe. No surprise there. But consider: When the researchers persuaded people to change their minds about affirmative action or the death penalty, the people then assumed that God now believed their new view. As I am, the thinking goes, so is God.

 

But the mind is malleable in both directions. Many one-time evangelicals—for whom evangelicalism historically has meant a “good news” message of God’s accepting grace—are now changing their identity in the age of Trump (with Trump’s support having been greatest among “evangelicals” who are religiously inactive—and for whom the term has been co-opted to mean “cultural right”). Despite my roots in evangelicalism, I now disavow the mutated label (not wanting to be associated with the right’s intolerance toward gays and Muslims). Many others, such as the moderate Republican writer Peter Wehner, are similarly repulsed by the right-wing takeover of evangelicalism and disavowing today’s tarnished evangelical brand.

 

Times change, and with them our minds.

The British, American, and Australian press—and hundreds of millions of royal wedding viewers—were unexpectedly enthralled by Bishop Michael Curry’s 13.5 minutes of fame:

  • “Stole the show” (Telegraph and Vox).
  • “Electrifying” (New York Times).
  • “Wholly un-British, amazing, and necessary” (Esquire).
  • “Will go down in history” (Guardian).
  • “His star turn is set to impact the Most Reverend Michael Curry’s life for years to come” (news.com.au).

 

His gist: “We must discover the power of love, the redemptive power of love,” God’s love. “And when we do that, we will make of this old world, a new world.” A positive message—and an appealing synopsis of authentic Christianity—but why was it so effective? Why did it connect so well and capture media coverage? What persuasion principles did he illustrate that others—preachers, teachers, students, all speakers—might want to emulate?

 

The power of repetition. Experiments leave no doubt: Repetition strengthens memory and increases belief. Repeated statements—whether neutral (“The Louvre is the largest museum in Paris”), pernicious (“Crooked Hillary”), or prosocial (“I have a dream”)—tend to stick to the mind like peanut butter. They are remembered, and they are more likely to be believed (sometimes even when repeated in efforts to discount them).

 

Few will forget that Curry spoke of “love” (66 times, in fact—5 per minute). We would all benefit from emulating Curry’s example: Frame a single, simple message with a pithy phrase (“the power of love”). From this unifying trunk, the illustrative branches can grow.

 

The power of speaking from the heart. A message rings authentic when it emanates from one’s own life experience and identity—when it has enough self-disclosure to be genuine, but not so much as to be self-focused. Curry, a slave descendant, speaking in an epicenter of White privilege, began and ended with the words of Martin Luther King, Jr., and he told how his ancestors, “even in the midst of their captivity” embraced a faith that saw “a balm in Gilead to make the wounded whole.”


The power of speaking to the heart. My wife—an Episcopalian who has heard Curry’s preaching—has told me that his presence lends power beyond his written words. Curry was well prepared. But rather than safely reading his polished manuscript, he made eye contact with his audience, especially Prince Harry and Ms. Markle. He spoke with passion. His words aroused emotion. They spoke to troubled hearts in a polarized world.

 

The power of vivid, concrete examples. The behavioral scientist in me wishes it weren’t true, but, alas, compelling stories and vivid metaphors have, in study after study, more persuasive power than truth-bearing statistics. No wonder each year, 7 in 10 Americans, their minds filled with images of school shootings and local murders, say there is more crime than a year ago—even while crime statistics have plummeted.

 

William Strunk and E. B. White’s classic, The Elements of Style, grasped the idea: “If those who have studied the art of writing are in accord on any one point, it is on this: the surest way to arouse and hold the attention of the reader is by being specific, definite, and concrete. The greatest writers—Homer, Dante, Shakespeare—are effective largely because they deal in particulars.”

 

And Curry, too, offered particulars, with simplicity, repetition, and rhythmic cadence:

 

When love is the way, poverty will become history. When love is the way, the earth will be a sanctuary. When love is the way, we will lay down our swords and shields down by the riverside to study war no more.


When love is the way, there’s plenty good room—plenty good room—for all of God’s children. When love is the way, we actually treat each other like we are actually family. When love is the way, we know that God is the source of us all. We are brothers and sisters, children of God.


Brothers and sisters: that’s a new heaven, a new earth, a new world, a new human family.

 

With such repeated, heart-filled, concrete words, perhaps all preachers and speakers could spare listeners the fate of Eutychus, who, on hearing St. Paul’s preaching, “sunk down with sleep, and fell down from the third loft, and was taken up dead” (Acts 20:9).

 

Gratitude works. A genuinely positive and repeated observation of positive psychology is that keeping a gratitude journal (literally counting one’s blessings), or writing a gratitude letter, benefits the giver as well as the receiver. For the gratitude recipient, there is joy and a warmed heart. For the gratitude giver, the frequent result—as shown in studies by Robert Emmons, Michael McCullough, Martin Seligman, and others—is increased physical and psychological well-being and prosociality.

 

Having appreciated and reported on this lesson, I couldn’t resist sharing with my friends Emmons, McCullough, and Seligman (and with you, my readers) a little case example of how right they are.

 

A representative of the University of Iowa (my PhD alma mater) recently dropped by to share information on their campaign to fund a new psychology building. In response, Carol Myers (my wife) and I agreed that our family foundation might support this endeavor.  I initially proposed a certain amount; Carol, reminding me how much I owed them, suggested doubling that amount.

 

I concurred, and the next day, while writing our letter of response, I spontaneously explained our gratitude to the Iowa psychology department for

  • inviting me (an undergraduate chemistry major with little psychology background) to enter its PhD program,
  • supporting my graduate work (including nominating me for an NSF fellowship, which supported me after the first year),
  • introducing me to a research program that—quite to my surprise—turned me into a researcher (enabling NSF grants for my work here at Hope College, and which ultimately led to invitations to communicate psychological science through textbooks and other media).

 

Writing those words had an unexpected effect—it caused me to feel my gratitude more deeply, with a flush of positive emotion—whereupon I suggested to Carol (and she immediately agreed) that we redouble the doubled gift amount. (Only later did I realize how my writing a “gratitude letter” had affected me.)

 

I can hear my gratitude-intervention scholar-friends say, “Of course, gratitude expressions boost good feelings and prosociality.” Gratitude, “the queen of the virtues,” says Emmons, “heals, energizes, and transforms lives in a myriad of ways consistent with the notion that virtue is both its own reward and produces other rewards.” Gratitude works.

Implicit Bias at Starbucks

Posted by David Myers, May 10, 2018

On April 12, 2018, a Philadelphia Starbucks manager called police on two African American men for doing nothing wrong (waiting for a friend without having bought coffee). Was this but “an isolated incident”—as 48 percent of White and 10 percent of Black Americans presume? Or did the arrests reflect “a broader pattern” of bias? Starbucks’ CEO later apologized for the incident, said the manager was no longer with the company, and announced that on May 29 the company would close 8,000 domestic stores to enable employee racial bias training.

 

The Starbucks fiasco drew national attention to implicit bias. It also illustrates what we social psychologists agree on, what we disagree about, and what can be usefully done:

 

Agreement: Bias exists. When one study sent out 5,000 resumes in response to job ads, applicants with names such as Lakisha and Jamal received one callback for every 15 sent, while names such as Emily and Greg received one callback for every 10 sent. Similar racial disparities have been found in Airbnb inquiry responses, Uber and Lyft pickup requests, and descriptions of driver treatment during police traffic stops.
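For scale, here is a quick back-of-the-envelope calculation in Python, using only the callback figures quoted above, to make the size of that gap explicit:

```python
# Quick arithmetic on the callback rates quoted above (illustrative only).
black_sounding = 1 / 15   # callbacks per resume for names such as Lakisha and Jamal
white_sounding = 1 / 10   # callbacks per resume for names such as Emily and Greg

print(f"Black-sounding names: {black_sounding:.1%} callback rate")
print(f"White-sounding names: {white_sounding:.1%} callback rate")
print(f"Relative gap: {white_sounding / black_sounding:.1f}x as many callbacks")
```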

 

Agreement: Unconscious (implicit) biases underlie many racial disparities. Such biases are modestly revealed by the famed Implicit Association Test (IAT).  Likely the Starbucks manager never consciously thought “those two men are Black rather than White, so I’ll call the police.”

 

Disagreement: How effective is the IAT at predicting everyday behavior? Its creators remind us it enables study of a real phenomenon, but was never intended to assess and compare individuals and predict their discrete behaviors.

 

Disagreement: How effective is implicit bias training? Skeptics argue that “blame and shame” diversity training can backfire, triggering anger and resistance.  Or it may seem to normalize bias (“Everybody is biased”). Or it may lead to a temporary improvement in questionnaire responses, but without any lasting benefits. Even Anthony Greenwald and Brian Nosek, social psychologists and two of the IAT co-creators, echo some of these concerns. Greenwald notes that “implicit bias training . . . has not been shown to be effective, and it can even be counterproductive.” And Nosek warns that “diversity trainings are filled with good intentions and poor evidence.” 

 

Greenwald and Nosek doubt the likely effectiveness of Starbucks’ planned training day. Nosek believes the company would be better advised to pilot and assess their intervention in a few stores and then scale it up.

 

But some research offers more hopeful results. As part of their research on automatic prejudice, Patricia Devine and her colleagues trained willing volunteers to replace biased with unbiased knee-jerk responses. Throughout the two-year study follow-up period, participants in their experimental intervention condition displayed reduced implicit prejudice. Another team of 24 researchers held a “research contest” that compared 17 interventions for reducing unintended prejudice among more than 17,000 individuals. Eight of the interventions proved effective. Some gave people experiences with vivid, positive examples of Black people who countered stereotypes.

 

Recently, Nosek and Devine have collaborated with Patrick Forscher and others on a meta-analysis (statistical summary) of 494 efforts to change implicit bias. Their conclusion meets in the middle: “Implicit bias can be changed, but the effects are often weak” and may not carry over to behavior.

 

So what can we—and Starbucks and other well-intentioned organizations—do to counteract implicit bias?

 

First, let’s not despair. Reacting with knee-jerk presumptions or feelings is not unusual—what matters is what we do with the awareness of those reactions. Do we let those feelings hijack our behavior? Or do we learn to monitor and correct our behavior in future situations? Neuroscience evidence shows that, for people who intend no prejudice, the brain can inhibit a momentary flash of unconscious bias in but half a second.

 

Second, we can aim toward an all-inclusive multiculturalism. As race-expert Charles Green explains, “Working for racial justice in your organization is not about ‘going after’ those in the majority. It’s about addressing unequal power distribution and creating opportunity for all. It is structural, not personal.”

 

Third, we can articulate clear policies—behavior norms—for how people (all people) should be treated in specific situations. Organizations can train employees to enact expected behaviors in various scenarios—dealing with customers in a coffee shop, with drivers  at a traffic stop, with reservation inquiries at a rental unit. Happily, as people act without discrimination they come to think with less prejudice. Attitudes follow behavior.

The Power of Habit

Posted by David Myers, May 3, 2018

On the same day last week, two kind colleagues sent unsolicited photos. In one, taken 21 years ago at Furman University, I am with my esteemed friend/encourager/adviser, Charles Brewer (who sadly died recently).  The others were from a talk just given at Moraine Valley Community College.

 

I was a little embarrassed to see that in both I’m wearing, 21 years apart, the same Scottish green plaid tie and blue blazer with brass buttons (well, not the exact same blazer—I wore the first one out, but its replacement is identical).

How boring is that? And how boring is the life of this professor who, when not traveling, arises at 7:00 each morning, dresses (often with the same sweater from the day before) while watching the first ten minutes of the Today Show; bikes to work the same 3 blocks every day of the year (no matter the weather); begins the office day with prayer, email, and downloading political news for reading over breakfast in the campus dining hall; works till noontime exercise in the campus gym and . . .  (enough of this). I know: very boring.

 

But consider the wisdom of mathematician/philosopher Alfred North Whitehead in his 1911 An Introduction to Mathematics:

Civilization advances by extending the number of important operations which we can perform without thinking about them. . . . By relieving the brain of all unnecessary work, a good notation sets it free to concentrate on more advanced problems, and in effect increases the mental power of the race.

 

Mark Zuckerberg follows Whitehead’s wisdom (and that of Charles Duhigg in The Power of Habit) by not wasting time deciding what shirt to wear each day. As I concluded in a previous essay on the same theme, “Amid today’s applause for ‘mindfulness,’ let’s put in a word for mindlessness. Mindless, habitual living frees our minds to work on more important things than which pants to wear or what breakfast to order.” Or so I’d like to believe.

 

[Note to positive psych geeks: The 2018 version of my “Scientific Pursuit of Happiness” talk (given more than 200 times over the past 28 years) is available, courtesy of Moraine Valley, at tinyurl.com/MyersHappiness.]

The Syrian slaughter. North Korean nuclear warheads. ISIS attacks. School shootings. Social media-fed teen depression. Thugs victimizing people of color and women. Inequality increasing. Online privacy invaded. Climate change accelerating. Democracy flagging as autocrats control Turkey, Hungary, China, and Russia, and as big money, voter suppression, and Russian influence undermine American elections. U.S. violent crime and illegal immigration soaring.

 

For news junkies, it’s depressing. We know that bad news predominates: If it bleeds, it leads. But we can nevertheless take heart from underreported encouraging trends. Consider, for example, the supposed increases in crime and illegal immigration.

 

Is it true, as President Trump has said, that “crime is rising” and in inner cities “is at levels that nobody has seen”? Seven in 10 Americans appeared to agree, reporting to Gallup in each recent year that violent crime was higher than in the previous year. Actually, crime data aggregated by the FBI reveal that violent (and property) crime rates have fallen dramatically since the early 1990s.

 

And is the U.S. being flooded with immigrants across its Mexican border—“evil, illegal immigrants who commit violent crimes,” as a 2018 DonaldJTrump.com campaign ad declared? In reality, the influx has subsided to a point where, Politifact confirms, “more illegal Mexican immigrants are leaving the United States than entering it.” (Should we build a wall to keep them in?)

 

But what about immigrant crime—fact or fiction? “Americans are five times more likely to say immigrants make the [crime] situation worse rather than better (45% to 9%, respectively),” reports Gallup. Not so. Multiple studies find that, as the National Academy of Sciences reports, “immigrants are less likely than the native-born to commit crimes” and are underrepresented in American prisons.

 

For more good news, consider other heartening long-term trends:

  • World hunger is retreating.
  • Child labor is less common.
  • Literacy is increasing.
  • Wars are becoming less frequent.
  • Explicit racial prejudice (as in opposition to interracial marriage) has plummeted.
  • Gay, lesbian, and transgender folks are becoming more accepted.
  • Infant mortality is rarer and life expectancy is increasing.

 

Such trends are amply documented in Steven Pinker’s recent books, The Better Angels of Our Nature and Enlightenment Now, in Johan Norberg’s Progress, and in Gregg Easterbrook’s It’s Better Than It Looks. As President Obama observed, if you had to choose when to live, “you’d choose now.”

 

Yes, in some ways, these are dark times. But these are also the times of committed Parkland teens. Mobilized citizens. Machine learning. Immune therapies. #MeToo. #BlackLivesMatter. Low inflation. Near full employment. Digital streaming. Smart cars. Wearable technologies. Year-round fresh fruit. And Post-It notes.

 

To paraphrase Charles Dickens, it is the worst of times, it is the best of times. It is an age of foolishness, it is an age of wisdom. It is a season of darkness, it is a season of light. It is the winter of despair, it is the spring of hope.

The teen years are, for many, a time of rewarding friendships, noble idealism (think Parkland), and an expanding vision for life’s possibilities. But for others, especially those who vary from teen norms, life can be a challenge. Nonheterosexual teens, for example, sometimes face contempt, harassment, or family rejection. And that may explain their having scored higher than other teens on measures of anxiety, depression, and suicidal thoughts and attempts (see here, here, here, and here).

 

But many of these findings are based on older data and don’t reflect the increasing support of gay partnerships among North Americans and Western Europeans. In U.S. Gallup polls, for example, support for “marriages between same-sex couples” soared from 27 percent in 1996 to 64 percent in 2017. So, have the emotional challenges of being teen and gay persisted? If so, to what extent?

 

I’ve wondered, and I recently discovered an answer in the 2015 data from the annual UCLA/Higher Education Research Institute American Freshman survey (of 141,189 entering full-time students at a cross-section of U.S. colleges and universities). The news is mixed:

  • Most gay/lesbian/bisexual frosh report not having struggled with depression.
  • Being gay or lesbian in a predominantly heterosexual world remains, for a significant minority of older teens, an emotional challenge.

 

Can we hope that, if attitudes continue to change, this depression gap will shrink? In the meantime, the American Psychological Association offers youth, parents, and educators these helpful resources for understanding sexual orientation and gender identity, including suggestions for how “to be supportive” of youth whose sexual orientation or gender identity differs from most others.