
I’m often asked: “What is your favorite introductory psych chapter?” I reply that, when starting to write my text, I presumed that Sensation-Perception would be the dullest topic. Instead, I’ve found it to be the most fascinating. I’m awestruck by the intricate process by which we take in information, transform it into nerve impulses, distribute it to different parts of our brain, and then reassemble that information into colorful sights, rich sounds, and evocative smells. Who could have imagined? We are, as the Psalmist said, “wonderfully made.”

 

And then there are the weird and wonderful perceptual phenomena, among which is our surprising blindness to things right in front of our eyes. In various demonstrations of inattentional blindness, people who are focused on a task (such as talking on a phone or counting the number of times black-shirted people pass a ball) often fail to notice someone sauntering through the scene—a woman with an umbrella, in one experiment, or even a person in a gorilla suit or a clown on a unicycle.

 

 

As a Chinese tour guide wrote to a friend of mine (after people failed to notice something my friend had seen):

 

This looking-without-seeing phenomenon illustrates a deep truth: Our attention is powerfully selective. Conscious awareness resides in one place at a time.

 

Selective inattention constrains our other senses, too. Inattentional deafness is easily demonstrated with dichotic listening tasks. For example, if novel tunes are fed into one ear while people focus on repeating aloud words fed into the other ear, they will later be unable to identify the tunes they heard. (Thanks to the mere exposure effect, they will, however, later like those tunes best.) Or, in an acoustic replication of the famed invisible gorilla study, Polly Dalton and Nick Fraenkel found that people focusing on a conversation between two women (rather than on two men also talking) usually failed to notice one of the men repeatedly saying “I am a gorilla.”

 

Now, in a new British experiment, we also have evidence of inattentional numbness. Pickpockets have long understood that bumping into people makes them unlikely to notice a hand slipping into their pocket. Dalton (working with Sandra Murphy) experimented with this tactile inattention. Sure enough, when distracted, their participants failed to notice an otherwise easily noticed vibration applied to their hand.

 

Tactile inattention sometimes works to our benefit. I once, while driving to give a talk, experienced a painful sting in my eye (from a torn contact lens) . . . then experienced no pain while giving the talk . . . then felt the excruciating pain again on the drive home. In clinical settings, such as with patients receiving burn treatments, distraction can similarly make painful procedures tolerable. Pain is most keenly felt when attended to.

 

Another British experiment, by Charles Spence and Sophie Forster, demonstrated inattentional anosmia (your new word for the day?)—an inability to smell. When people focused on a cognitively demanding task, they became unlikely to notice a coffee scent in the room.

So what’s next? Can we expect a demonstration of inattentional ageusia—inability to taste? (That’s my new word for the day.) Surely, given our powers of attention (and corresponding inattention), we should expect such.

 

Like a flashlight beam, our mind’s selective attention focuses at any moment on only a small slice of our experience—a phenomenon most drivers underestimate when distracted by phone texting or conversation. However, there’s good news: With our attention riveted on a task, we’re productive and even creative. Our attention is a wonderful gift, given to one thing at a time.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

Nearly two-thirds of Americans, reports a recent PLOS One article, agree that “I am more intelligent than the average person.”

 

This self-serving bias—on which I have been reporting for four decades (starting here)—is one of psychology’s most robust and reliable phenomena. Indeed, on most subjective, socially desirable dimensions, most of us see ourselves as better than average . . . as smarter, more ethical, more vocationally competent, more charitable, less prejudiced, friendlier, healthier, and more likely to outlive our peers—which calls to mind Freud’s joke about the husband who told his wife, “If one of us dies, I shall move to Paris.”

 

My own long-ago interest in self-serving bias was triggered by noticing a result buried in a College Board survey of 829,000 high school seniors. In rating themselves on their “ability to get along with others,” 0 percent viewed themselves as below average. But a full 85 percent saw themselves as better than average: 60 percent placed themselves in the top 10 percent, and 25 percent in the top 1 percent.

 

As Shelley Taylor wrote in Positive Illusions, “The [self-]portraits that we actually believe, when we are given freedom to voice them, are dramatically more positive than reality can sustain.” Dave Barry recognized the phenomenon: “The one thing that unites all human beings, regardless of age, gender, religion, economic status, or ethnic background is that deep down inside, we all believe that we are above average drivers.”

 

Self-serving bias also takes a second form—our tendency to accept more responsibility for our successes than our failures, for our victories than our defeats, and for our good deeds than our bad. In experiments, people readily attribute their presumed successes to their ability and effort, their failures to bad luck or an impossible task. A Scrabble win reflects our verbal dexterity. A loss? Our bad luck in drawing a Q but no U.

 

Perceiving ourselves, our actions, and our groups favorably does much good. It protects us against depression, buffers stress, and feeds our hopes. Yet psychological science joins literature and religion in reminding us of the perils of pride. Hubris often goes before a fall. Self-serving perceptions and self-justifying explanations breed marital conflict, bargaining impasses, racism, sexism, nationalism, and war.

 

Being mindful of self-serving bias needn’t lead to false modesty—for example, smart people thinking they are dim-witted. But it can encourage a humility that recognizes our own virtues and abilities while equally acknowledging those of our neighbors. True humility leaves us free to embrace our special talents and similarly to celebrate those of others.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

“She [Professor Christine Blasey Ford] can’t tell us how she got home and how she got there,” scorned Senator Lindsey Graham during the lunch break of yesterday’s riveting U. S. Senate Judiciary Committee hearing regarding Ford’s memory of being assaulted by Supreme Court nominee Brett Kavanaugh. Graham’s assumption, widely voiced by fellow skeptics of Ford’s testimony, is that her inability to remember simple peripheral details discounts the authenticity of her assault memory.

 

But Graham and the other skeptics fail to understand, first, how extreme emotions signal the brain to “save this!” for future reference. (Likely you, too, have enduring “flashbulb memories” of long-ago emotional experiences.) And second, they fail to understand that peripheral details typically fall into oblivion. In Psychology, 12th Edition, Nathan DeWall and I explain:

 

Our emotions trigger stress hormones that influence memory formation. When we are excited or stressed, these hormones make more glucose energy available to fuel brain activity, signaling the brain that something important is happening. Moreover, stress hormones focus memory. Stress provokes the amygdala (two limbic system, emotion processing clusters) to initiate a memory trace that boosts activity in the brain’s memory-forming areas (Buchanan, 2007; Kensinger, 2007) (FIGURE 8.9). It’s as if the amygdala says, “Brain, encode this moment for future reference!” The result? Emotional arousal can sear certain events into the brain, while disrupting memory for irrelevant events (Brewin et al., 2007; McGaugh, 2015).

 

Significantly stressful events can form almost unforgettable memories. After a traumatic experience—a school shooting, a house fire, a rape—vivid recollections of the horrific event may intrude again and again. It is as if they were burned in: “Stronger emotional experiences make for stronger, more reliable memories,” noted James McGaugh (1994, 2003). Such experiences even strengthen recall for relevant, immediately preceding events [such as going up the stairway and into the bedroom, in Ford’s case] (Dunsmoor et al., 2015; Jobson & Cheraghi, 2016). This makes adaptive sense: Memory serves to predict the future and to alert us to potential dangers. Emotional events produce tunnel vision memory. They focus our attention and recall on high-priority information, and reduce our recall of irrelevant details (Mather & Sutherland, 2012). Whatever rivets our attention gets well recalled, at the expense of the surrounding context.

 

And as I suggested in last week’s essay, Graham and others seem not to understand “state-dependent memory”—that what people experience in one state (such as when drunk) they may not remember in another state (sober). Nor are Kavanaugh’s supporters recognizing that heavy drinking disrupts memory formation, especially for an experience that would not have been traumatic for him. Thus, Kavanaugh could be sincerely honest in not recalling an assaultive behavior, but also, possibly, sincerely wrong.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit www.TalkPsych.com.)

Psychology professor Christine Blasey Ford vividly recalls being sexually assaulted by Supreme Court nominee Brett Kavanaugh when both were teens. Kavanaugh remembers no such event and vigorously denies Ford’s accusation. The potentially historic significance of the allegation has triggered a debate: Is she telling the truth? Or is he, in claiming no such memory?

 

Without judging either’s current character, psychological science suggests a third possibility: Perhaps both are truthfully reporting their memories.

 

On Ford’s behalf, we can acknowledge that survivors of traumatic events typically are haunted by enduring, intrusive memories. As Nathan DeWall and I write in Psychology, 12th Edition,

Significantly stressful events can form almost unforgettable memories. After a traumatic experience—a school shooting, a house fire, a rape—vivid recollections of the horrific event may intrude again and again. It is as if they were burned in: “Stronger emotional experiences make for stronger, more reliable memories,” noted James McGaugh (1994, 2003).

 

Does Ford’s inability to remember ancillary details, such as when the assault supposedly occurred, discount her veracity? Not at all, if we’re to generalize from research on the accuracy of eyewitness recollections. Those whose memory is poor for incidental details of a scene are more accurate in their recollections of the essential event (see here and here).

 

But if Kavanaugh and his friend were, indeed, “stumbling drunk,” then perhaps they, genuinely, have no recollection of their impulsive behaviors while “severely intoxicated.”  Memory blackouts do happen, as we also report:

 Ergo, if trauma sears memories into the brain, and if alcohol disrupts them, could it be that both Ford and Kavanaugh are telling the truth as best they can recall it?

 

(For David Myers’ other weekly essays on psychological science and everyday life visit www.TalkPsych.com)

Turning 76 years old in a week, and still loving what I do, I find myself inspired by two recent emails. One, from social psychologist Thomas Pettigrew, age 87, responded to my welcoming his latest work by attaching fourteen of his recent publications. The second, from Nathan DeWall, pointed me to an interesting new article co-authored by developmental psychologist Walter Mischel, age 88 (who, sadly, died just hours before this essay was posted).

 

That got me thinking about other long-lived people who have found their enduring calling in psychological science. My late friend, Charles Brewer, the long-time editor of Teaching of Psychology (who once told me he took two days a year off: Christmas and Easter), taught at Furman University until nearly 82, occupied his office until age 83, and was still authoring into his 80s.

 

But Charles’ longevity was exceeded by that of

  • B.F. Skinner, whom I heard address the American Psychological Association convention in 1990 at age 86, just eight days before he died of leukemia.
  • Carroll Izard, who co-authored three articles in 2017, the year of his death at age 93.
  • Jerome Bruner, who, the year before he died in 2016 at age 100, authored an essay on “The Uneasy Relation of Culture and Mind.”

 

And in earlier times, my historian-of-psychology friend Ludy Benjamin tells me, Wilhelm Wundt taught until 85 and supervised his last doctoral student at 87, and Robert Woodworth lectured at Columbia until 89 and published his last work at 90.*

 

So, I then wondered, who of today’s living psychologists, in addition to Pettigrew and Mischel, are still publishing at age 85 and beyond? Daniel Kahneman and Paul Ekman almost qualify, but at 84 they are youngsters compared to those below. Here’s my preliminary short list—other nominees welcome!—with their most recent PsycINFO publication. (Given the era in which psychologists of their generation received their PhDs, most are—no surprise—men.)

 

  • Philip Zimbardo: Age 85 (born March 23, 1933)

Unger, A., Lyu, H., & Zimbardo, P. G. (2018). How compulsive buying is influenced by perspective—Cross-cultural evidence from Germany, Ukraine, and China. International Journal of Mental Health and Addiction, 16, 522–544.

 

  • Gordon Bower: Age 85 (born December 30, 1932)

Bower, G. H. (2016). Emotionally colored cognition. In R. J. Sternberg, S. T. Fiske, & D. J. Foss (Eds.), Scientists making a difference: One hundred eminent behavioral and brain scientists talk about their most important contributions (pp. 123–127). New York: Cambridge University Press.

 

  • James McGaugh: Age 86 (born December 17, 1931)

McGaugh, J. L. (2018). Emotional arousal regulation of memory consolidation. Current Opinion in Behavioral Sciences, 19, 55–60.

 

  • Lila Gleitman: Age 88 (born December 10, 1929)

Gleitman, L. R., & Trueswell, J. C. (2018). Easy words: Reference resolution in a malevolent referent world. Topics in Cognitive Science.

 

  • Roger Shepard: Age 89 (born January 30, 1929)

Shepard, R. N. (2016). Just turn it over in your mind. In R. J. Sternberg, S. T. Fiske, & D. J. Foss (Eds.), Scientists making a difference: One hundred eminent behavioral and brain scientists talk about their most important contributions (pp. 99–103). New York: Cambridge University Press.

 

  • Jerome Kagan: Age 89 (born February 25, 1929)

Kagan, J. (2018, May). Three unresolved issues on human morality. Perspectives on Psychological Science, 13, 346–358.

 

  • Albert Bandura: Age 92 (born December 4, 1925)

Bandura, A. (2016). Moral disengagement: How people do harm and live with themselves. New York: Worth Publishers.

 

  • Aaron Beck: Age 97 (born July 18, 1921)

Kochanski, K. M., Lee-Tauler, S. Y., Brown, G. K., Beck, A., Perera, K. U., et al. (2018, Aug.) Single versus multiple suicide attempts: A prospective examination of psychiatric factors and wish to die/wish to live index among military and civilian psychiatrically admitted patients. Journal of Nervous and Mental Disease, 206, 657–661.

 

  • Eleanor Maccoby: Age 101 (born May 15, 1917)

Maccoby, E. (2007). Historical overview of socialization research and theory. In J. E. Grusec, & P. D. Hastings (Eds.), Handbook of socialization: Theory and research. New York: Guilford Press.

 

  • And a drum roll for Brenda Milner: At age 100 (born July 15, 1918), she still, I’m told, comes in a couple times a week to the Montreal Neurological Institute, which last week celebrated her centennial (with thanks to Melvin Goodale for the photo below).

Milner, B., & Klein, D. (2016, March). Loss of recent memory after bilateral hippocampal lesions: Memory and memories—looking back and looking forward. Journal of Neurology, Neurosurgery & Psychiatry, 87, 230.

 

 

Life is a gift that ends unpredictably. Having already exceeded my at-birth life expectancy, I am grateful for the life I have had. But as one who still loves learning and writing (and can think of nothing else I’d rather do), why not emulate these esteemed colleagues while I continue to be blessed with health, energy, and this enduring sense of calling?

 

P.S. Subsequent to this essay, I have learned of other long-lived and still-productive psychologists, including Robert Rosenthal (retiring next spring at 86), Alan Baddeley (who has a new book, Working Memories, at age 84), Jean Mandler (who has a new article out at age 89), and Eleanor Maccoby (who died recently at 101, with a 2015 chapter). The reported oldest living psychologist is Olivia Hooker, whose 103rd birthday was celebrated during APA's Black History Month earlier this year. On her 100th birthday, President Obama saluted her and dedicated a new Coast Guard building in her name. But surely I've missed others?

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

 ----

* The “major early women psychologists”—Calkins, Washburn, Ladd-Franklin, Woolley, Hollingworth—all died before age 85, reported Benjamin, who added that some other psychologists have stayed too long in the profession without knowing “when to hang up their spikes” and make way for fresh faces in the classroom and laboratory.

Some fun emails stimulated by last week’s essay on loss aversion in sports and everyday life pointed me to statistician David Spiegelhalter's Cambridge Coincidence Collection, which contains people’s 4500+ reports of weird coincidences.

 

That took my mind back to some personally experienced coincidences . . . like the time my daughter, Laura Myers, bought two pairs of shoes. Back home, we were astounded to discover that the two brand names on the boxes were “Laura” and “Myers.” Or the time I confused our college library desk clerk when checking out after using a photocopy machine. My six-digit charge number was identical to the one-in-a-million six-digit number of copies on which the last user had stopped. Or the day my wife, Carol, called seeking my help in sourcing Mark Twain’s quote, “The man who does not read good books has no advantage over the man who cannot read them.” After this first-ever encounter with that quote, my second encounter was 90 minutes later, in a Washington Post article.

 

In Intuition: Its Powers and Perils, I report more amusing coincidences. Among my favorites:

  • Twins Levinia and Lorraine Christmas, driving to deliver Christmas presents to each other near Flitcham, England, collided.
  • Three of the first five U.S. Presidents—Adams, Jefferson, and Monroe—died on the same date–which was none other than the 4th of July.
  • And my favorite . . . in Psalm 46 of the King James Bible, published in the year that Shakespeare turned 46, the 46th word is “shake” and the 46th word from the end is “spear.” (An even greater marvel: How did anyone notice this?)

 

What should we make of weird coincidences? Were they, as James Redfield suggested in The Celestine Prophecy, seemingly “meant to happen . . . synchronistic events, and [that] following them will start you on your path to spiritual truth”? Is it a wink from God that your birthdate is buried among the random digits of pi? Beginning 50,841,600 places after the decimal, my 9/20/1942 birthdate appears . . . and you can likewise find yours here.
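For the curious, here is a minimal Python sketch of that kind of digit hunt. It is an illustration, not the tool behind the essay's number: it assumes the mpmath library is available, encodes the 9/20/1942 birthdate as the digit string "9201942" (one possible encoding among several), and searches only the first 100,000 digits for speed, whereas the essay's own match lies some 50 million places out.

```python
# A minimal sketch of hunting for a date string in the digits of pi.
# Assumptions: mpmath is installed; "9201942" is one possible encoding
# of 9/20/1942; only the first 100,000 digits are searched here.
from mpmath import mp

def digits_of_pi(n_digits: int) -> str:
    """Return the first n_digits decimal digits of pi after the decimal point."""
    mp.dps = n_digits + 10                   # working precision, with guard digits
    pi_str = mp.nstr(+mp.pi, n_digits + 2)   # e.g. "3.14159..."
    return pi_str[2:2 + n_digits]

def find_in_pi(digit_string: str, n_digits: int = 100_000) -> int:
    """Return the 1-based decimal place where digit_string first appears, or -1."""
    pos = digits_of_pi(n_digits).find(digit_string)
    return pos + 1 if pos >= 0 else -1

print(find_in_pi("9201942"))
```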

 

Without wanting to drain our delight in these serendipities, statisticians have a simpler explanation. Given the countless billions of daily events, some weird juxtapositions are inevitable—and then likely to get noticed and remembered (while all the premonitions not followed by an envisioned phone call or accident are unnoticed and fall into oblivion). “With a large enough sample, any outrageous thing is likely to happen,” observed statisticians Persi Diaconis and Frederick Mosteller. Indeed, added mathematician John Allen Paulos, “the most astonishingly incredible coincidence imaginable would be the complete absence of all coincidences.”

 

Finally, consider: That any specified coincidence will occur is very unlikely. That some astonishing unspecified event will occur is certain. That is why remarkable coincidences are noted in hindsight, not predicted with foresight. And that is also why we don’t need paranormal explanations to expect improbable happenings, even while delighting in them.
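A quick simulation makes that specified-versus-unspecified distinction concrete. This is a minimal sketch of my own (the classic birthday problem, not anything from the essay): the chance that two particular people share a birthday is tiny, while the chance that some pair in a room of 30 shares one is large.

```python
# Minimal birthday-problem simulation: a *specified* coincidence (persons
# 1 and 2 share a birthday) versus an *unspecified* one (any two of 30
# people share a birthday). Ignores leap years and seasonal birth rates.
import random

def trial(group_size: int = 30) -> tuple[bool, bool]:
    birthdays = [random.randrange(365) for _ in range(group_size)]
    specified = birthdays[0] == birthdays[1]          # these two people match
    unspecified = len(set(birthdays)) < group_size    # some pair matches
    return specified, unspecified

N = 100_000
results = [trial() for _ in range(N)]
print("P(these two people match):", sum(s for s, _ in results) / N)  # about 0.003
print("P(some two people match): ", sum(u for _, u in results) / N)  # about 0.71
```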

Imagine that you’re about to buy a $5000 used car. To pay for it, you’ll need to sell some of your stocks. Which of the following would you rather sell?

  • $5000 of Stock X shares, which you originally purchased for $2500.
  • $5000 of Stock Y shares, which you originally purchased for $10,000.

 

If you’d rather sell Stock X and reap your $2500 profit now, you’re not alone. One analysis of 10,000 investor accounts revealed that most people strongly prefer to lock in a profit rather than absorb a loss. Investors’ loss aversion is curious: What matters is each stock’s future value, not whether it has made or lost money in the past. (If anything, tax considerations favor selling the loser for a tax loss and avoiding the capital gains tax on the winner.)
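To see the tax point in round numbers, here is a back-of-envelope sketch. The 15 percent capital-gains rate is an assumption for illustration, and the benefit of selling the loser presumes the harvested loss can offset other gains taxed at that same rate.

```python
# Back-of-envelope comparison under an assumed 15% capital-gains rate.
capital_gains_rate = 0.15

# Sell Stock X (bought at $2,500, now $5,000): the $2,500 gain is taxed.
after_tax_winner = 5000 - capital_gains_rate * 2500   # $4,625

# Sell Stock Y (bought at $10,000, now $5,000): the $5,000 loss can offset
# other taxed gains, assuming you have such gains to offset.
after_tax_loser = 5000 + capital_gains_rate * 5000    # $5,750

print(after_tax_winner, after_tax_loser)
```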

 

Loss aversion is ubiquitous, and not just in big financial decisions. Participants in experiments, where rewards are small, will choose a sure gain over flipping a coin for double or nothing—but they will readily flip a coin on a double-or-nothing chance to avert a loss. As Daniel Kahneman and Amos Tversky reported, we feel the pain from a loss twice as keenly as we feel the pleasure from a similar-sized gain. Losing $20 feels worse than finding $20 feels good. No surprise, then, that we so vigorously avoid losing in so many situations.
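Here is a minimal sketch of the kind of value function behind those choices. The parameters are assumptions for illustration (alpha for diminishing sensitivity, lambda for loss weighting, both commonly cited approximations rather than exact fitted values); the roughly 2-to-1 loss weighting matches the essay's "twice as keenly."

```python
# Illustrative prospect-theory-style value function: concave for gains,
# steeper (loss-averse) for losses. alpha and lam are assumed, commonly
# cited approximations, not exact fitted parameters.
def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

sure_gain   = value(20)                          # ~13.9: take the sure $20 ...
gamble_gain = 0.5 * value(40) + 0.5 * value(0)   # ~12.9: ... over the coin flip
sure_loss   = value(-20)                         # ~-31.4: but facing a sure $20 loss ...
gamble_loss = 0.5 * value(-40) + 0.5 * value(0)  # ~-28.9: ... the coin flip hurts less

print(sure_gain, gamble_gain, sure_loss, gamble_loss)
print("A $20 loss stings about", round(abs(value(-20)) / value(20), 2),
      "times as much as a $20 gain pleases.")
```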

 

The phenomenon extends to the endowment effect—our attachment to what we own and our aversion to losing it, as when those given a coffee mug demand more money to sell it than those not given the mug are willing to pay for it. Small wonder our homes are cluttered with things we wouldn’t today buy, yet won’t part with.

 

Loss aversion is but one example of a larger bad-is-stronger-than-good phenomenon, note Roy Baumeister and his colleagues. Bad events evoke more misery than good events evoke joy. Cruel words hurt us more than compliments please us. A bad reputation is easier to acquire—with a single lie or heartless act—than is a good reputation. “In everyday life, bad events have stronger and more lasting consequences than comparable good events.” Psychologically, loss is larger than gain. Emotionally, bad is stronger than good.  

           

Coaches and players are aware of the pain of losses, so it’s no surprise that loss aversion plays out in sports. Consider this example from basketball: Say your team is behind by 2 points, with time only for one last shot. Would you prefer a 2-point or a 3-point attempt?

 

Most coaches, wanting to avoid a loss, will seek to put the game into overtime with a 2-point shot. After all, an average 3-point shot will produce a win only one-third of the time. But if the team makes 50 percent of its 2-point attempts, and has about a 50 percent chance of winning the toss-up overtime, the loss-aversion strategy yields but a 25 percent chance of both (a) sending the game to overtime and (b) winning it there. Thus, by averting an immediate loss, these coaches reduce the chance of an ultimate win—rather like investors who place their money in loss-avoiding bonds and thus forego the likelihood, over extended time, of a much greater stock-index gain.
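For readers who like to see the arithmetic, here it is in a few lines, using the essay's own round numbers.

```python
# The coach's choice, worked out with the essay's round numbers.
p_make_two     = 0.50    # average 2-point shooting
p_win_overtime = 0.50    # toss-up overtime
p_make_three   = 1 / 3   # average 3-point shooting

p_win_via_overtime = p_make_two * p_win_overtime   # 0.25
p_win_via_three    = p_make_three                  # ~0.33

print(f"Play for overtime: {p_win_via_overtime:.0%} chance of winning")
print(f"Shoot the three:   {p_win_via_three:.0%} chance of winning")
```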

 

And now comes news (kindly shared by a mathematician friend) of loss aversion in baseball and softball base-running. Statistician Peter MacDonald, mathematician Dan McQuillan, and computer scientist Ian McQuillan invite us to imagine “a tie game in the bottom of the ninth inning, and there is one out—a single run will win the game. You are on first base, hoping the next batter gets a hit.”

 

As the batter hits a fly to shallow right, you hesitate between first and second to see if the sprinting outfielder will make the catch. When the outfielder traps rather than catches the ball, you zoom to second. The next batter hits a fly to center field and, alas, the last batter strikes out.

 

You probably didn’t question this cautious base-running scenario, because it’s what players do and what coaches commend. But consider an alternative strategy, say MacDonald and his colleagues. If you had risked running to third on that first fly ball, you would have scored the winning run on the ensuing fly ball. Using data from 32 years of Major League Baseball, the researchers calculate that any time the fly ball is at least 38 percent likely to fall for a hit, the runner should abandon caution and streak for third. Yet, when in doubt, that rational aggressive running strategy “is never attempted.”
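The structure of that calculation can be sketched in a few lines. To be clear, this is not the MacDonald-McQuillan-McQuillan model: the scoring probabilities below are placeholder values I have assumed for illustration, whereas their 38 percent threshold comes from 32 years of MLB data.

```python
# Illustrative sketch of the run-to-third trade-off. The three scoring
# probabilities are assumed placeholders, not the researchers' estimates.
p_score_third_one_out  = 0.65   # assumed: winning run on 3rd, 1 out
p_score_second_one_out = 0.40   # assumed: winning run on 2nd, 1 out
p_score_first_two_outs = 0.13   # assumed: back on 1st after the catch, 2 outs

def p_score(p_ball_drops: float, aggressive: bool) -> float:
    if aggressive:
        # Reach third if the ball drops; doubled off (inning over) if it's caught.
        return p_ball_drops * p_score_third_one_out
    # Reach second if it drops; retreat safely to first if it's caught.
    return (p_ball_drops * p_score_second_one_out
            + (1 - p_ball_drops) * p_score_first_two_outs)

for p in (0.30, 0.40, 0.50):
    print(p, round(p_score(p, aggressive=True), 3), round(p_score(p, aggressive=False), 3))

# Break-even probability that the fly ball drops, above which running pays:
breakeven = p_score_first_two_outs / (
    p_score_third_one_out - p_score_second_one_out + p_score_first_two_outs)
print(f"With these assumed numbers, run for third once the drop chance exceeds {breakeven:.0%}")
```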

 

You may object that players cannot compute probabilities. But, says the MacDonald team, “players and their third-base coaches make these sorts of calculations all the time. They gamble on sacrifice flies and stolen base attempts using probabilities of success.” Nevertheless, when it comes to running from first, their first goal is to avert loss—and to avoid, even at the cost of a possible run, the risk of looking like a fool. We implicitly think “What if I fail?” before “How can I succeed?”

 

Often in life, it seems, our excessive fear of losing subverts our opportunities to win. Caution thwarts triumph. Little ventured, little gained.

 

My late friend Gerry Haworth understood the risk-reward relationship. A shop teacher at our local high school, he began making wood products in his garage shop. Then, in 1948, he ventured the business equivalent of running to third base—quitting his job and launching a business, supported by his dad’s life savings. Today, family-owned Haworth Inc., America’s third-largest furniture manufacturer, has more than 6000 employees and nearly $2 billion in annual sales. Something ventured, something gained.

Mexican immigrants, President Trump has repeatedly told his approving base, are “bringing drugs. They’re bringing crime. They’re rapists.” In this week’s West Virginia rally he highlighted Mollie Tibbetts’ accused “illegal alien” killer as a vivid example. Hence the wish to “build a wall”—to keep out those who, we are told, would exploit Americans and take their jobs.

 

In an earlier 2018 essay, I responded to the inaccuracy of fear mongering about immigrant crime. But consider a different question: Who believes it? Is it people who live in regions with a greater number of unauthorized immigrants, and who have suffered the presumed crime, conflict, and competition?

 

At the recent Sydney Symposium on Social Psychology, Christian Unkelbach (University of Cologne) reported an intriguing finding: In Germany, anti-immigrant views are strongest in the states with the fewest immigrants. Across Germany’s 16 states, intentions to vote for the right-wing Alternative für Deutschland (Alternative for Germany [AfD]) were greatest in states with the fewest asylum applications. (My thanks to Dr. Unkelbach for permission to share his translated figure.)

 

I wondered: Might a similar pattern emerge in U.S. states? To find out, I combined two data sets:

  1. A 2016 Pew report provided data on the percentage of unauthorized immigrants in each state’s population.
  2. A 2016 PRRI report provided state-by-state data on immigrant acceptance.

The result? Voilà! In the United States, a larger immigrant population predicts greater state-level acceptance of immigrants, and a smaller immigrant population predicts more fear of immigrants. (West Virginia, with the lowest proportion of unauthorized immigrants, is also the least immigrant-supportive.) Moreover, the U.S. correlation is very similar in strength to the German one (a rough sketch of this sort of state-level calculation follows the list):

  • Across the 16 German states, the correlation between immigrant noncitizen population and anti-immigrant attitudes was -.61.
  • Across the 50 U.S. states, the correlation between immigrant noncitizen population and immigrant-supportive attitudes was +.72.
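For anyone who wants to try the U.S. calculation at home, the state-level correlation takes only a few lines of pandas. The file and column names below are hypothetical stand-ins, not the actual Pew and PRRI download formats, which you would first need to export to CSV.

```python
# Minimal sketch: merge two state-level tables and compute a Pearson r.
# File and column names are hypothetical stand-ins for the Pew
# (unauthorized-immigrant share) and PRRI (immigrant-acceptance) tables.
import pandas as pd

pew  = pd.read_csv("pew_unauthorized_share_by_state.csv")     # state, pct_unauthorized
prri = pd.read_csv("prri_immigrant_acceptance_by_state.csv")  # state, acceptance_score

merged = pew.merge(prri, on="state")
r = merged["pct_unauthorized"].corr(merged["acceptance_score"])  # Pearson by default
print(f"Across {len(merged)} states, r = {r:+.2f}")
```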

 

 

 

The legendary prejudice researcher Thomas Pettigrew would not be surprised. In a new article at age 87 (I want to be like him when I grow up), Pettigrew reports that in 477 studies of nearly 200,000 people across 36 cultures, intergroup contact predicted lower prejudice in every culture. With cross-racial contact, especially cooperative contact, people from South Africa to the United States develop more favorable racial attitudes. In a new study by Jared Nai and colleagues, living in a racially diverse U.S. neighborhood—or even just imagining doing so—leads people to identify more with all humanity, and to help strangers more.

 

As straight folks get to know gay folks, they, too, become more gay-supportive. And, these new data suggest, as citizens interact with and benefit from their immigrant neighbors, they, too, become more open-hearted and welcoming.

 

In my own Midwestern town, where minority students (mostly Hispanic) are a slight majority of public school students, these yard signs (this one from my front yard) abound. We have known enough immigrants—as neighbors, colleagues, business owners, and workers—to know that they, like our own immigrant ancestors, can be a blessing.

 

[Afterword: In kindly commenting on this essay, Thomas Pettigrew noted that one exception to the contact-with-immigrants benefit occurs “when the infusion of newcomers is large and sudden.  Then threat takes over without the time for contact to work its magic” (quoted with permission).]

Some years ago an NBC Television producer invited me, while in New York City, to meet in her office to brainstorm possible psychology-related segments. But a focused conversation proved difficult, because every three minutes or so she would turn away to check an incoming email or take a call—leaving me feeling a bit demeaned.

 

In today’s smartphone age, such interruptions are pervasive. In the midst of conversation, your friend’s attention is diverted by the ding of an incoming message, the buzz of a phone call, or just the urge to check email. You’re being phubbed—an Australian-coined term meaning phone-snubbed.

 

In U.S. surveys by James Roberts and Meredith David, 46 percent reported being phubbed by their partners, and 23 percent said it was a problem in their relationship. More phubbing—as when partners place the phone where they can glance at it during conversation, or check it during conversational lulls—predicted lower relationship satisfaction.

 


 

Could such effects of phubbing be shown experimentally? In a forthcoming study, Ryan Dwyer and his University of British Columbia colleagues recruited people to share a restaurant meal with their phones on the table or not. “When phones were present (vs. absent), participants felt more distracted, which reduced how much they enjoyed spending time with their friends/family.”

 

Another new experiment, by University of Kent psychologists Varoth Chotpitayasunondh and Karen Douglas, helps explain phubbing’s social harm. When putting themselves in the skin of one participant in an animation of a conversation, people who were phubbed felt a diminished sense of belonging, self-esteem, and control. Phubbing is micro-ostracism. It leaves someone, even while with another, suddenly alone.

 


 

Smartphones, to be sure, are a boon to relationships as well as a bane. They connect us to people we don’t see—enlarging our sense of belonging. As one who lives thousands of miles from family members, I love Facetime and instant messaging. Yet a real touch beats being pinged. A real smile beats an emoticon. An eye-to-eye blether (as the Scots would say) beats an online chat. We are made for face-to-face relationship.

 

When I mentioned this essay to my wife, Carol, she wryly observed that I (blush) phub her “all the time.” So, what can we do, while enjoying our smartphones, to cut the phubbing? I reached out to some friends and family and got variations on these ideas:

  • “When we get together to play cards, I often put everyone’s phone in the next room.”
  • “When out to dinner, I often ask friends to put their phones away. I find the presence of phones so distracting; the mere threat of interruption diminishes the conversation.” Even better: “When some of us go out to dinner, we pile up our phones; the first person to give in and reach for a phone pays for the meal.”
  • “I sometimes stop talking until the person reestablishes eye contact.” Another version: “I just wait until they stop reading.”
  • “I say, ‘I hope everything is OK.’” Or this: “I stop and ask, ‘Is everything OK? Do you need a minute?’ I often receive an apology, and the phone is put away.”
  • “I have ADHD and I am easily distracted. Thus when someone looks at their phone, and I’m distracted, I say, ‘I’m sorry, but I am easily distracted. Where was I?’ . . . It’s extremely effective, because nobody wants me to have to start over.”

 

Seeing the effects of phubbing has helped me change my own behavior. Since that unfocused conversation at NBC I have made a practice, when meeting with someone in my office, to ignore the ringing phone. Nearly always, people pause the conversation to let me take the call. But no, I explain, we are having a conversation and you have taken the time to be here with me. Whoever that is can leave a message or call back. Right now, you are who’s important.

 

Come to think of it, I should take that same attitude home.

Dog walking, according to a recent news report, is healthy for people. That little report follows three massive new research reviews that confirm earlier findings of the mental health benefits of exercise:

  • An American Journal of Psychiatry analysis of 49 studies followed 266,939 people across an average of 7 years. In every part of the world, people of all ages had a lower risk of becoming depressed if physically active rather than inactive.
  • JAMA Psychiatry reports that, for teens, “regular physical activity [contributes] to positive mental health.”
  • Another JAMA Psychiatry analysis of 33 clinical trials found an additional depression-protecting effect of “resistance exercise training” (such as weight lifting and strength-building).

 


 

A skeptic might wonder if mentally healthy people simply have more energy for exercise. (Being really depressed comes with a heaviness that may entail trouble getting out of bed.) But the “prospective studies”—which follow lives through time—can discern the sequence: exercise today predicts a reduced risk of depression later. Moreover, many clinical trial experiments—with people assigned to exercise or control conditions—confirm that exercise not only contributes to health and longevity, it also treats and protects against depression and anxiety. Mens sana in corpore sano: a healthy mind in a healthy body.

 

Indeed, given the modest benefits of antidepressant drugs, some researchers are now recommending therapeutic lifestyle change as a potentially more potent therapy for mild to moderate depression—or as a protection against such. When people modify their living to include the exercise, sunlight exposure, ample sleep, and social connections that marked our ancestors’ lives—a lifestyle for which they were bred—they tend to flourish, with greater vitality and joy. In one study, substantial depression relief was experienced by 19 percent of patients in a treatment-as-usual control group and by 68 percent undergoing therapeutic lifestyle change.

 

Finally, more good news—for dog walkers: Dog walking is said to be healthy and calming for dogs, too. But I suspect that will not surprise any dog owner or their dog.

“The heart has its reasons which reason does not know."

~Pascal, Pensees, 1670

 

“He that trusteth in his own heart is a fool.”

~Proverbs 28:26

 

“Buried deep within each and every one of us, there is an instinctive, heart-felt awareness” that can guide our behavior. So proclaimed Prince Charles in a 2000 lecture. Trust your gut instincts.

 

Prince Charles has much company. “I’m a gut player. I rely on my instincts,” explained President George W. Bush in justifying his decision to launch the Iraq war, after earlier talking with Vladimir Putin and declaring himself “able to get a sense of his soul.”

 

“Within the first minute [of meeting Kim Jong-un] I’ll know,” declared President Trump. “My touch, my feel—that’s what I do.” Afterwards he added, “We had a great chemistry—you understand how I feel about chemistry.” The heart has its reasons.

 

But is there also wisdom in physicist Richard Feynman’s channeling of the skepticism of King Solomon’s proverb? “The first principle,” said Feynman, “is that you must not fool yourself—and you are the easiest person to fool.”

 

In sifting intuition’s powers and perils, psychological science has some wisdom.

 

First, our out-of-sight, automatic, intuitive information processing is HUGE. In Psychology, 12th Edition, Nathan DeWall and I offer some examples:

  • Automatic processing: We glide through life mostly on autopilot. Our information processing is mostly implicit, unconscious, behind the scenes—and often guided by “fast and frugal” heuristics (mental shortcuts).
  • Intuitive expertise: After mastering driving (or chess), people can react to situations intuitively, without rational analysis.
  • Reading others: We are skilled at reading “thin slices” of behavior—as when judging someone’s warmth from a 6-second video clip.
  • Blindsight: Some blind people even display “blindsight”—they can intuitively place an envelope in a mail slot they cannot consciously see.

 

Second, our intuition is perilous. Psychology is flush with examples of smart people’s predictable and sometimes tragic intuitive errors:

  • Human lie detection: People barely surpass chance when intuiting whether others are lying or truth-telling. (American presidents might want to remember this when judging Putin’s or Kim Jong-un’s trustworthiness.)
  • Intuitive prejudice: As demonstrated in some police responses to ambiguous situations, implicit biases can—without any conscious malevolent intent—affect our perceptions and reactions. (Is that man pulling out a gun or a phone?)
  • Intuitive fears: We fear things that kill people vividly and memorably (because we intuitively judge risks by how readily images of a threat come to mind). Thus we may—mistakenly—fear flying more than driving, shark attacks more than drowning, school mass shootings more than street and home shootings.
  • The “interview illusion”: Given our ability to read warmth from thin slices, it’s understandable that employment interviewers routinely overestimate their ability to predict future job success from unstructured get-acquainted interviews. But aptitude tests, work samples, job-knowledge tests, and peer ratings of past job performance are all better predictors. (Even the lengthiest of interviews—the mate-selection process—is a fragile predictor of long-term marital success.)

 

The bottom line: Intuition—automatic, implicit, unreasoned thoughts and feelings—grows from our experience, feeds our creativity, and guides our lives. Intuition is powerful. But it also is perilous, especially when we overfeel and underthink. Unchecked, uncritical intuition sometimes leads us into ill-fated relationships, feeds overconfident predictions, and even leads us into war.

Recent U.S. school shootings outraged the nation and produced calls for action. One response, from the International Society for Research on Aggression, was the formation of a Youth Violence Commission, composed of 16 experts led by Ohio State social psychologist Brad Bushman. Their task: To identify factors that do, and do not, predict youth violence—behavior committed by a 15- to 20-year-old that’s intended to cause unwanted harm.

 

 


 

The Commission has just released its final report, which it has shared with President Trump, Vice President Pence, Education Secretary DeVos, and all governors, senators, and congressional representatives.

 

The Commission first notes big differences between highly publicized mass shootings (rare, occurring mostly in smaller towns and suburbs, using varied legal guns) and street shootings (more common, concentrated in inner cities, using illegal handguns).  It then addresses the factors that do and do not predict youth violence.

 

RISK FACTORS THAT PREDICT YOUTH VIOLENCE

 

Personal Factors:

  • Gender—related to male biology and masculinity norms.
  • Early childhood aggressive behavior—past behavior predicts future behavior.
  • Personality—low anger control, often manifested in four “dark” personality traits: narcissism, psychopathy, Machiavellianism, and sadism.
  • Obsessions with weapons or death.

 

Environmental Factors:

  • Easy access to guns.
  • Social exclusion and isolation—sometimes including being bullied.
  • Family and neighborhood—family separation, child maltreatment, neighborhood violence.
  • Media violence—a link “found in every country where studies have been conducted.”
  • School characteristics—with large class sizes contributing to social isolation.
  • Substance use—a factor in street shootings but not school shootings.
  • Stressful events—including frustration, provocation, and heat.

 

FACTORS THAT DO NOT PREDICT YOUTH VIOLENCE

 

The commission found that the following do not substantially predict youth violence:

  • Mental health problems—most people with mental illness are not violent, and most violent people are not mentally ill (with substance abuse and psychotic delusions being exceptions).
  • Low self-esteem—people prone to violence actually tend to have inflated or narcissistic self-esteem.
  • Armed teachers—more guns = more risk, and they send a message that schools are unsafe.

 

The concluding good news is that training programs can increase youth self-control, enhance empathy and conflict resolution, and reduce delinquency. Moreover, mass media could help by reducing attention to shootings, thereby minimizing the opportunity for modeling and social scripts that such portrayals provide to at-risk youth.

“When you know a thing, to hold that you know it; and when you do not know a thing, to allow that you do not know it; this is knowledge.”
~Confucius (551–479 B.C.E.), Analects

One of the pleasures of joining seventeen scholars from six countries at last week’s 20th Sydney Symposium on Social Psychology was getting to know the affable and articulate David Dunning.

 

Dunning (shown here) recapped a stream of studies on human overconfidence. When judging the accuracy of their factual beliefs (“Did Shakespeare write more than 30 plays?”) or when predicting future events (such as the year-end stock market value), people are typically more confident than correct. Such cognitive conceit fuels stockbrokers’ beliefs that they can outperform the market—which, as a group, they cannot. And it feeds the planning fallacy—the tendency of contractors, students, and others to overestimate how quickly they will complete projects.

 

To this list of studies, Dunning and Justin Kruger added their own discovery, now known as the Dunning-Kruger effect: Those who score lowest on various tests of knowledge, logic, and grammar are often ignorant of their own ignorance. Never realizing all the word possibilities I miss when playing Scrabble, I may overestimate my verbal competence.

 

Likewise—to make this even more personal—those of us with hearing loss often are the last to recognize such . . . not because we are repressing our loss, but simply because we are unaware of what we haven’t heard (and of what others do hear). To Daniel Kahneman’s kindred observation that we are “blind to our [cognitive] blindness,” I would add that we can also be literally deaf to our deafness. We don’t know what we don’t know.

 

Thus ironically, and often tragically, those who lack expertise in an area suffer a double-curse—they make misjudgments, which they fail to recognize as errors. This leads them, notes Dunning, to conclude “they are doing just fine.”

 

Note what Dunning is not saying—that some people are just plain stupid, a la Warren Buffett.


 

Rather, all of us have domains of inexpertise, in which we are ignorant of our ignorance.

 

But there are two remedies. When people express strong views on topics about which they lack expertise, we can, researcher Philip Fernbach found, ask them to explain the details: “So exactly how would a cap-and-trade carbon emissions tax work?” A stumbling response can make them aware of their own ignorance, lessening their certainty.

 

Second, we can, for our own part, embrace humility. For anything that matters, we can welcome criticism and advice. Another personal example: As I write about psychological science, I confess to savoring my own words. As I draft this essay, I am taking joy in creating the flow of ideas, playing with the phrasing, and then fine-tuning the product to seeming perfection. Surely, this time my editors—Kathryn Brownson and Nancy Fleming—will, for once, find nothing to improve upon? But always they find glitches, ambiguities, or infelicities to which I was blind.

 

Perhaps that is your story, too? Your best work, when reviewed by others . . . your best tentative decisions, when assessed by your peers . . . your best plans, when judged by consultants . . . turn into something even better than you, working solo, could have created. Our colleagues, friends, and spouses often save us from ourselves. The pack is greater than the wolf.

 

In response to my wondering if his famed phenomenon had impacted his life, Dunning acknowledged that he has received—and in all but one instance rebuffed—a stream of journalist pleas: Could he please apply the blindness-to-one’s-own-incompetence principle to today’s American political leadership?

 

But stay tuned. Dunning is putting the finishing touches on a general audience trade book (with one possible title: You Don’t Know What You Don’t Know—and Why It Matters).

It’s well-established that:

  • brain cells survive for a time after cardiac arrest and even after declared death.
  • some people have been resuscitated after cardiac arrest—even hours after, if they were linked to blood-oxygenating and heart-massaging machines.
  • a fraction of resuscitated people have reported experiencing a bright light, a tunnel, a replay of old memories, and/or out-of-body sensations. For some, these experiences later enhanced their spirituality or personal growth.

 

Recently, I enjoyed listening to and questioning a university physician who is launching a major multi-site study of cardiac arrest, resuscitation, and near-death experiences. As a dualist (one who assumes mind and body are distinct, though interacting), he is impressed by survivors’ reports of floating up to the ceiling, looking down on the scene below, and observing efforts to revive them. Thus, his study seeks to determine whether such patients can—while presumably separated from their supine body—perceive and later recall images displayed on an elevated, ceiling-facing iPad.

 

Care to predict the result?

 

My own prediction is based on three lines of research:

  • Parapsychological efforts have failed to confirm out-of-body travel with remote viewing.
  • A mountain of cognitive neuroscience findings link brain and mind.
  • Scientific observations show that brain oxygen deprivation and hallucinogenic drugs can cause similar mystical experiences (complete with the tunnel, beam of light, and so forth).

Thus, I expect there will be no replicable evidence of near-death minds viewing events remote from the body.

 

Setting my assumptions and expectations aside, I asked the physician-researcher about some of his assumptions:

  1. For how long do you think the mind would survive clinical death? Minutes? Hours? Forever? (His answer, if I understood, was uncertainty.)
  2. When resuscitated, the mind would rejoin and travel again with the body, yes? When the patient is wheeled to a new room, the mind rides along? (That assumption was not contested.)
  3. What about the Hiroshima victims whose bodies were instantly vaporized? Are you assuming that—for at least a time—their consciousness or mind survived that instant and complete loss of their brain and body? (His clear answer: Yes.)

 

That made me wonder: If a mind could post-date the body, could it also predate it? Or does the body create the mind, which grows with it, but which then, like dandelion seeds, floats away from it?

 

The brain-mind relationship appeared in another presentation at the same session. A European university philosopher of mind argued that, in addition to the dualist view (which he regards as “dead”) and the reductionist view (Francis Crick: “You’re nothing but a pack of neurons”), there is a third option. This is the nonreductive physicalist view—“nonreductive” because the mind has its own integrity and top-down causal properties, and “physicalist” because the mind emerges from the brain and is bound to the brain.

 

The 20th century’s final decade was “the decade of the brain,” and the 21st century’s first decade was “the decade of the mind.” Perhaps we could say that today’s science and philosophy mark this as a decade of the brain-mind relationship? For these scholars, there are miles to go before they enter their final sleep—or should I say until their body evicts their mind?

 

Addendum for those with religious interests: Two of my friends—British cognitive neuroscientist Malcolm Jeeves and American developmental psychologist Thomas Ludwig—reflect on these and other matters in their just-published book, Psychological Science and Christian Faith. If you think that biblical religion assumes a death-denying dualism (a la Plato’s immortal soul) prepare to be surprised.

Money matters. For entering U.S. collegians, the number one life goal—surpassing “helping others in difficulty,” “raising a family,” and 17 other aspirations—is “being very well off financially.” In the most recent UCLA “American Freshman” survey, 82 percent rated being very well off as “essential” or “very important.” Think of it as today’s American dream: life, liberty, and the purchase of happiness.

 

For human flourishing, fiscal fitness indeed matters . . . up to a point. In repeated surveys across nations, a middle-class income—and being able to control one’s life—beats being poor. Moreover, people in developed nations tend to be happier and more satisfied than those in the poorest of nations.

 

Beyond the middle-class level, we seem to have an income “satiation point,” at which the income-happiness correlation tapers off and happiness no longer increases. For individuals in poor countries, that point is close to $40,000; for those in rich countries, about $90,000, reports a new analysis of 1.7 million Gallup interviews by Andrew Jebb and colleagues.

 

And consider: The average U.S.  per-person disposable income, adjusted for inflation, has happily tripled over the last 60 years, enabling most Americans to enjoy today’s wonderments, from home air conditioning to wintertime fresh fruit to smart phones. “Happily,” because few of us wish to return to yesteryear. Yet not that happily, because psychological well-being has not floated upward with the rising economic tide. The number of “very happy” adults has remained at 3 in 10, and depression has been on the rise.

 

What triggers the diminishing psychological payoff from excess income? Two factors:

  • Our human capacity for adaptation: Continual pleasures subside.
  • Our tendency to assess our own circumstances by “social comparison” with those around us—and more often those above us. People with a $40,000 income tend to think $80,000 would enable them to feel wealthy—whereas those at $80,000 say they would need substantially more. Become a millionaire and move to a rich neighborhood, and you still may not feel rich. As Theodore Roosevelt said, “Comparison is the thief of joy.”

 

The outer limit of the wealth–well-being relationship also appears in two new surveys (by Grant Donnelly, Tianyi Zheng, Emily Haisley, and Michael Norton) of an international bank’s high net-worth clients. As you can see in figures I created from their data, having $2 million and having $10 million are about the same, psychologically speaking.

 

If wealth increases well-being only up to a point—and much evidence indicates that is so—and if extreme inequality is socially toxic (great inequality in a community or country predicts lower life quality and more social pathology), then could societies increase human flourishing with economic and tax policies that spread wealth?

 

Let’s make this personal: If earning, accumulating, and spending money increases our happiness only to a satiation point, then why do we spend our money for (quoting the prophet Isaiah) “that which is not bread” and our “labor for that which does not satisfy?” Quite apart from moral considerations, what’s to be lost by sharing our wealth above the income-happiness satiation point?

 

And if one is blessed with wealth, what’s to be gained by showering inherited wealth, above the satiation point, on our children? (Consider, too, another Donnelly and colleagues finding: Inherited wealth entails less happiness than earned wealth.)

 

Ergo, whether we and our children drive BMWs or Honda Fits, swim in our backyard pool or at the local Y, eat filet mignon or fish filet sandwiches, hardly matters. That fact of life, combined with the more important facts of the world’s needs, makes the case for philanthropy.