The Psychology Community
If you, dear reader, can indulge some slightly geeky calculations, I hope to show you that with daily exercise you can live a substantially longer and happier life. Indeed, per the time invested, exercise will benefit you more than smoking will harm you. Consider:

  • An analysis of mortality data offers this memorable result: For the average person, life is lengthened by about 7 hours for every hour exercised. So (here comes the geek), the World Health Organization recommends exercising 150 minutes = 2.5 hours per week. Multiplied by 7, that equals 17.5 hours of longer life for each week of exercise. Over 52 weeks, that sums to 910 hours = 38 days = 1/10th of a year of longer life for each year of faithful exercise . . . which, continued over 40 years, would yield ~4 years of longer life. (Though, more typically, say the researchers, runners live 3.2 years longer.)
  • In another epidemiological study of over 650,000 American adults, those walking 150 minutes per week lived (voila!) 4 years longer than nonexercisers (Moore et al., 2012).
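For readers who enjoy checking the geeky arithmetic, here is a minimal Python sketch using only the figures quoted above (the 7-to-1 return and the WHO's 150-minute recommendation):

```python
# Back-of-envelope check of the exercise arithmetic quoted above.
HOURS_GAINED_PER_HOUR_EXERCISED = 7     # reported mortality-data estimate
WEEKLY_EXERCISE_HOURS = 2.5             # WHO recommendation: 150 minutes/week

hours_gained_per_week = WEEKLY_EXERCISE_HOURS * HOURS_GAINED_PER_HOUR_EXERCISED
hours_gained_per_year = hours_gained_per_week * 52
days_gained_per_year = hours_gained_per_year / 24
years_gained_over_40_years = days_gained_per_year * 40 / 365

print(hours_gained_per_week)                  # 17.5 hours per week
print(hours_gained_per_year)                  # 910.0 hours per year
print(round(days_gained_per_year, 1))         # 37.9 days, about 1/10 of a year
print(round(years_gained_over_40_years, 1))   # 4.2 years over 40 years of exercise
```

The ~38-day figure matches the essay's 1/10th-of-a-year estimate, and the 40-year total lands near the researchers' 3.2-to-4-year range.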

 

How satisfying to have two independent estimates in the same ballpark!

 

This potential life-extending benefit brings to mind the mirror-image life-shortening costs of smoking, which the Centers for Disease Control reports diminishes life for the average smoker “by at least 10 years.” Thus (geek time again):

  • A person who takes up smoking at age 15, smokes 15 cigarettes per day for 50 years, and dies at 65 instead of 75, will lose roughly 1/5th of a year (equals 73 days = 1,752 hours = ~105,000 minutes) for each year of smoking. If each cigarette takes 10 minutes to smoke, the minutes spent smoking (54,750 each year) will account for half of those 105,000 lost minutes.
  • Ergo, nature charges ~2 minutes of shorter life for each minute spent smoking. . . but generously gives a 7-to-1 return for each hour spent exercising. How benevolent!
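The smoking side of the ledger can be checked the same way, in a sketch that uses only the figures from the bullets above:

```python
# Back-of-envelope check of the smoking arithmetic quoted above.
YEARS_SMOKING = 50            # smoking from age 15 to 65
YEARS_LOST = 10               # CDC: "at least 10 years" for the average smoker
CIGARETTES_PER_DAY = 15
MINUTES_PER_CIGARETTE = 10

minutes_lost_per_year = YEARS_LOST / YEARS_SMOKING * 365 * 24 * 60
minutes_smoked_per_year = CIGARETTES_PER_DAY * MINUTES_PER_CIGARETTE * 365
cost_ratio = minutes_lost_per_year / minutes_smoked_per_year

print(round(minutes_lost_per_year))   # 105120 minutes of life lost per smoking year
print(minutes_smoked_per_year)        # 54750 minutes spent smoking per year
print(round(cost_ratio, 1))           # 1.9, i.e., ~2 minutes lost per minute smoked
```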

 

Massive new epidemiological studies and meta-analyses (statistical digests of all available research) confirm both physical and mental health benefits of exercise (see here, here, and here). A good double goal for those wishing for a long life is: more fitness, less fatness. But evidence suggests that if forced to pick one, go for fitness.

 

As an earlier blog essay documented, exercise entails not only better health but a less depressed and anxious mood, more energy, and stronger relationships. Moreover, clinical trial experiments—with people assigned to exercise or to control conditions—confirm cause and effect: Exercise both treats and protects against depression and anxiety.

 

The evidence is as compelling as evidence gets: Go for a daily jog or swim and you can expect to live longer and live happier. Mens sana in corpore sano: A healthy mind in a healthy body.

 

 K.C. Alfred/Moment/Getty Images

(For David Myers’ other weekly essays on psychological science and everyday life, visit www.TalkPsych.com)

David Myers

Sometimes Truth Is Comedy

Posted by David Myers, Nov 29, 2018

As I approach five years of www.TalkPsych.com commentary—which has settled into a weekly Thursday essay—I am tempted (given our now larger audience) to replay an occasional favorite. Here is my second focused essay, which still puts a smile on my face . . . and perhaps yours? (In sneaking humor into texts, I presume that if I can’t have fun writing, then readers likely won’t have fun reading.)

 

From April 6, 2014:

Consider Brett Pelham, Matthew Mirenberg, and John Jones’ 2002 report of wacky associations between people’s names and vocations. Who would have guessed? For example, in the United States, Jerry, Dennis, and Walter are equally popular names (0.42 percent of people carry each of these names). Yet America’s dentists have been almost twice as likely to be named Dennis as Jerry or Walter. Moreover, 2.5 times as many female dentists have been named Denise as the equally popular names Beverly and Tammy. And George or Geoffrey has been overrepresented among geoscientists (geologists, geophysicists, and geochemists).

I thought of that playful research on names recently when reading a paper on black bears’ quantitative competence, co-authored by Michael Beran. Next up in my reading pile was creative work on crows’ problem solving led by Chris Bird. Today I was appreciating interventions for lifting youth out of depression, pioneered by Sally Merry.

That also took my delighted mind to the important books on animal behavior by Robin Fox and Lionel Tiger, and the Birds of North America volume by Chandler Robbins. (One needn’t live in Giggleswick, England, to find humor in our good science.)

The list goes on: billionaire Marc Rich, drummer Billy Drummond, cricketer Peter Bowler, and the Ronald Reagan Whitehouse spokesman Larry Speakes. And as a person with hearing loss whose avocational passion is hearing advocacy, I should perhaps acknowledge the irony of my own name, which approximates My-ears.

Internet sources offer lots more: dentists named Dr. E. Z. Filler, Dr. Gargle, and Dr. Toothaker; the Oregon banking firm Cheatham and Steele; and the chorister Justin Tune. But my Twitter feed this week offered a cautionary word about these reported names: “The problem with quotes on the Internet is that you never know if they’re true.” ~ Abraham Lincoln

Perhaps you, too, have some favorite name-vocation associations? I think of my good friend who was anxiously bemused before meeting his oncologist, Dr. Bury. (I am happy to report that, a decade later, he is robustly unburied and has not needed the services of the nearby Posthumus Funeral Home.)

For Pelham and his colleagues there is a serious point to this fun: We all tend to like what we associate with ourselves (a phenomenon they call implicit egotism). We like faces that have features of our own face morphed into them. We like—and have some tendency to live in—cities and states whose names overlap with our own—as in the disproportionate number of people named Jack living in Jacksonville, of Philips in Philadelphia, and of people whose names begin with Tor in Toronto.

Uri Simonsohn isn’t entirely convinced (see here and here, with Pelham’s reply here and here). He replicated the associations between people’s names, occupations, and places but argued that reverse causality sometimes is at work. For example, people sometimes live in places and on streets after which their ancestors were named.

Implicit egotism research continues. In the meantime, we can delight in the occasional playful creativity of psychological science.

P.S. Speaking of dentists (actual ones), my retired Hope College chemistry colleague Don Williams—a person of sparkling wit—offers these photos, taken with his own camera:

And if you need a podiatrist to advise about your foot odor, Williams has found just the person:

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

After elections, people often note unexpected outcomes and then complain that “the polls got it wrong.”

 

After Donald Trump’s stunning 2016 presidential victory, the press gave us articles on “Why the Polls were such a Disaster,” on “4 Possible Reasons the Polls Got It So Wrong,” and on “Why the Polls Missed Their Mark.” Stupid pollsters. “Even a big poll only surveys 1500 people or so out of almost 130 million voters,” we may think, “so no wonder they can’t get it right.”

 

Moreover, consider the many pundits who, believing the polls, confidently predicted a Clinton victory. They were utterly wrong, leaving many folks shocked on election night (some elated, others depressed, with later flashbulb memories of when they realized Trump was winning).

 

So how could the polls, the pundits, and the prediction models have all been so wrong?

 

Or were they? First, we know that in a closely contested race, a representative sample of a mere 1,500 people drawn from a population of 130 million will—surprisingly to many people—allow us to estimate the population preference within ~3 percent.
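That ~3 percent figure follows from the standard margin-of-error formula for a simple random sample. Here is a quick Python check, using the conservative worst case p = 0.5 and a 95 percent confidence level:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

print(round(margin_of_error(1500) * 100, 1))   # 2.5 percentage points
```

Note that the population size (130 million) barely enters: for any large population, the correction for sampling without replacement is negligible, which is why 1,500 respondents can stand in for the whole electorate and why polls commonly report roughly a ±3-point margin.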

 

Sounds easy. But there’s a challenge: Most randomly contacted voters don’t respond when called. The New York Times’ “Upshot” recently let us view its polling in real time. This enabled us to see, for example, that it took 14,636 calls to Iowa’s fourth congressional district to produce 423 responses, among which Steve King led J. D. Scholten by 5 percentage points—slightly more than the 3.4 points by which King won.

 

Pollsters know the likely demographic make-up of the electorate, and so can weight results from respondents of differing age, race, and gender to approximate the population. And that, despite the low response rate, allows them to do remarkably well—especially when we bear in mind that their final polls are taken ahead of the election (and cannot account for last-minute events, which may sway undecided voters). In 2016, the final polling average favored Hillary Clinton by 3.9 percent, with a 3 percent margin of error. On Election Day, she won the popular vote by 2.1 percent (and 2.9 million votes)—well within that margin of error.
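The weighting described above can be illustrated with a toy post-stratification example. All numbers here are made up for illustration, not real polling data:

```python
# Hypothetical respondents per age group: (count, share supporting candidate A).
sample = {
    "18-44": (200, 0.60),
    "45+":   (400, 0.45),
}
# Assumed shares of the actual electorate (a turnout model, also hypothetical).
electorate_share = {"18-44": 0.45, "45+": 0.55}

raw_n = sum(count for count, _ in sample.values())
raw_support = sum(count * support for count, support in sample.values()) / raw_n
weighted_support = sum(electorate_share[g] * support
                       for g, (_, support) in sample.items())

print(round(raw_support, 4))        # 0.5: younger, more supportive voters underrepresented
print(round(weighted_support, 4))   # 0.5175 after reweighting to the expected electorate
```

Because the younger group is more supportive of candidate A but underrepresented among respondents, reweighting moves the estimate up by almost 2 points. Real pollsters weight across many dimensions at once (age, race, gender, and more).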

 

To forecast a race, fivethirtyeight.com’s prediction model does more. It “takes lots of polls, performs various types of adjustments to them [based on sample size, recency, and pollster credibility], and then blends them with other kinds of empirically useful indicators” such as past results, expert assessments, and fundraising. Here is their 2016 final estimation:

Ha! This prediction, like other 2016 prediction models, failed.

 

Or did it? Consider a parallel. Imagine that as a basketball free-throw shooter steps to the line, I tell you that the shooter has a 71 percent free-throw average. If the shooter misses, would you disbelieve the projection? No, because, if what I’ve told you is an accurate projection, you should expect to see a miss 29 percent of the time. If the player virtually never missed, then you’d rightly doubt my data.

 

Likewise, if, when Nate Silver’s fivethirtyeight.com gives a candidate a 7 in 10 chance of winning and that candidate always wins, then the model is, indeed, badly flawed. Yes?
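A ten-line simulation makes the calibration point concrete: a well-calibrated “7 in 10” forecast should see its favorite lose about 30 percent of the time (the 0.7 below is illustrative):

```python
import random

random.seed(42)                 # fixed seed so the sketch is reproducible
races = 10_000                  # imagine 10,000 races each rated "7 in 10"
wins = sum(random.random() < 0.7 for _ in range(races))

print(round(wins / races, 2))   # close to 0.70; roughly 3 in 10 favorites lose
```

If the favorite won essentially every such race, that would itself be evidence that the stated 70 percent probability was wrong.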

 

In the 2018 U.S. Congressional races, fivethirtyeight.com correctly predicted 96 percent of the outcomes. On the surface, that may look like a better result, but it’s mainly because most races were in solid Blue or Red districts and not seriously contested.

 

Ergo, don’t be too quick to demean the quality polls and the prediction models they inform. Survey science still works.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

David Myers

Science Marches On

Posted by David Myers, Nov 15, 2018

This week I interrupt our weekly focus on psychology’s big ideas and new findings to update three prior essays.

 

Loss aversion in sports. A recent essay described how, in sports (as in other realms of life), our fear of losing can rob us of chances to win:

  • In baseball, a mountain of data shows that runners on first base will rarely take off running on a fly ball that has any chance of being caught. But their aversion to being thrown out leads to fewer runs and wins.
  • And in basketball, teams trailing by 2 points at a game’s end typically prefer a 2-point shot attempt, hoping to avert a loss and send the game into overtime (where half the time they will lose), over a 3-point attempt for victory—even in situations where the odds favor the latter. New Cornell/University of Chicago studies of “myopic loss aversion” confirm this irrational preference for loss-averting 2-point shots at the end of National Basketball Association games.
  • Moreover, those same studies extend the phenomenon to National Football League games, where teams prefer to kick a tying extra point in situations where a 2-point conversion makes a win more likely (as when down by two points late in the third quarter—see also here). Caution often thwarts triumph.
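The basketball case reduces to a simple expected-value comparison. The shot probabilities below are assumptions chosen for illustration (the studies' actual estimates vary by situation), but they show why the 3-point attempt can be the better gamble:

```python
# Team down 2 with one final possession: force overtime, or shoot the 3?
P_MAKE_TWO = 0.50    # assumed chance of hitting the tying 2-pointer
P_WIN_OT = 0.50      # per the text, overtime is roughly a coin flip
P_MAKE_THREE = 0.33  # assumed chance of hitting a game-winning 3-pointer

win_via_two = P_MAKE_TWO * P_WIN_OT   # must make the shot AND win overtime
win_via_three = P_MAKE_THREE          # make the shot and win outright

print(win_via_two)     # 0.25
print(win_via_three)   # 0.33
```

Under these assumed numbers, the 3-point attempt wins a third of the time versus a quarter for the “safe” 2-pointer, yet loss-averse teams still prefer the latter.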

 

Gratitude gratifies. An essay last spring testified to the positive power of expressing gratitude, which increases well-being and prosociality. In new experiments, Amit Kumar and Nicholas Epley found that people who wrote gratitude letters “significantly underestimated how surprised recipients would be about why expressers were grateful, overestimated how awkward recipients would feel, and underestimated how positive recipients would feel.”

 

Our unexpected personal thank you notes are more heartwarming for their recipients than we appreciate. (Is there someone whose support or example has impacted your life, who would be gratified to know that?)

 

The net effect. A May 2016 essay discussed research on how, in the smartphone age, “compulsive technology use not only drains time from eyeball-to-eyeball conversation but also predicts poorer course performance.” Since then, my friend (and co-author on the new Social Psychology, 13th Edition) Jean Twenge has enriched the public understanding of social media effects in her new book, iGen, and in associated media appearances. (For an excellent synopsis, see her Atlantic article.)

As she documents, the adoption of smartphones is echoed by increases in teen loneliness, depression, and suicide, and by decreases in sleep and face-to-face interactions (though also by declines in drinking, sex, and car accidents). Jean also continues to mine data, such as from an annual survey of American teens in a new Emotion study with Gabrielle Martin and Keith Campbell. They reconfirmed that a dip in adolescent well-being has precisely coincided with an increase in screen time (on social media, the Internet, texting, and gaming). Moreover, across individuals, more than 10 screen-time hours per week predicts less teen happiness.

 

Ergo, a task for teachers is to inform students about these trends and invite discussion about how students might apply them in their own peer culture. In a recent APS Observer essay, I suggested this might also be a good class activity:

  • Invite students to guess how often they check their phone each day, and how many minutes they average on it.
  • Have them download a free screen-time tracker app, such as Moment for the iPhone or QualityTime for the Android.
  • Have them add up their actual total screen time for the prior week and divide by 7 to compute their daily average.
  • Then ask them, “Did you underestimate your actual smartphone use?”

The results may surprise them. In two recent studies, university students greatly underestimated their frequency of phone checking and time on screen. As Steven Pinker has noted, “The solution is not to bemoan technology but to develop strategies of self-control, as we do with every other temptation in life.”

In this time of political passion, those of us who are instructors and/or text authors may agonize over whether to engage contentious public issues, such as the Kavanaugh Supreme Court nomination, fears of immigrant crime, or the possible social toxicity of presidential rhetoric.

 

My assumption is that—given our focus on education and our respect for our students’ diversity—classrooms and textbooks should not be political bully pulpits. There are more appropriate venues for advocating our own political views.

 

But that needn’t preclude our seeking to inform public dialogue, by offering pertinent evidence. For example, in a recent essay, I drew on memory science to report the tunnel-vision nature of emotion-laden memories, as perhaps illustrated when Christine Blasey Ford recalled being sexually assaulted without remembering peripheral details—just what we would expect from an authentic memory. And I indicated how state-dependent memory phenomena could help explain why Brett Kavanaugh might be sincere in having no memory for the same event. But I stopped short of expressing an opinion about whether he should have been confirmed.

 

Other essays have also offered information pertinent to heated political debates:

  • Trade policies. While politicians and economists debate the economic merits of free trade versus trade-restricting tariffs, social psychologists have noted that economic interdependence and cooperation enhance the odds for sustained peace (here).
  • Fear of immigrants. Recent political rhetoric focusing attention on the “caravan” of Central Americans desperate to enter Mexico and the U.S. has again raised fears of immigrant crime. Recent TalkPsych essays (here and here) offered data on actual immigrant crime rates in the United States, and on who in the U.S. and Germany most fears immigrants (ironically, those who have little contact with them). Gallup data from 139 countries confirms higher migrant acceptance among those who know migrants. Teachers can offer such evidence without advocating either party’s border policy (yes?).
  • Presidential rhetoric and public attitudes. Recent essays in The New York Times (here and here) and The Washington Post (here and here) assume that President Trump’s derision of his political opponents and of the press creates a toxic social environment that seeps down into his followers’ attitudes and actions. Pertinent to these concerns, my earlier essays wondered whether the President was “simply giving a voice” to widely held attitudes, or instead was legitimizing such attitudes and thereby increasing bullying. I later offered new evidence that hatemongering from high places does indeed desensitize people to such speech and increase expressions of prejudice. Can teachers offer such evidence without being partisan?

 

Be assured, psychological science is not intrinsically liberal or conservative. Its evidence sometimes lends weight to progressive thinking (say about sexual orientation as a natural, enduring disposition) and sometimes to conservative thinking (for example, about the benefits of co-parenting and stable close relationships such as marriage). As I conclude in an upcoming teaching column for the Association for Psychological Science Observer, “psychology aims not to advance liberal or conservative thinking per se, but to let evidence inform our thinking. And for us teachers of psychology that, no matter our political identities, is perhaps the biggest lesson of all.”

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

Hate-fueled pipe bombs target Democrats. Two African Americans are gunned down in a grocery store. An anti-Semite slaughters synagogue worshippers. Political leaders denigrate and despise their opponents. In National Election Studies surveys, the percentage of Republicans and Democrats who “hate” the other party has soared, for both sides—from 20 percent in 2000, to near 50 percent in 2016. (Let’s make it personal: Would you want your child to marry a devotee of the other party?)

 

Hostilities are poisoning the culture, and many Americans are wondering: How can we, as individuals and as a culture, turn a corner? Amid animosities fed by groundless fears, fact-free ignorance, and repeated (then believed) big lies, how can we embrace our common humanity and shared goals?

 

As we social psychologists remind folks, conflicts lessen through contact, cooperation, and communication. Personal contact with equal-status others helps (it’s not just what you know, but who you know). Cooperative striving for shared superordinate goals—those that require the cooperation of two or more people—fosters unity (it even helps to have a common enemy). Ditto guided communication (an aim of www.Better-Angels.org, which brings together “Reds” and “Blues” to understand each other’s concerns and to discover their overlapping aspirations).

 

And might we, individually and as a culture, also benefit by teaching and modeling an outlook that encompasses three virtues: conviction, humility, and love?

 

Our convictions define what matters. We anchor our lives in core beliefs and values that guide us. Our convictions motivate our advocacy for a better world. They give us courage to speak and act. “We must always take sides,” said Holocaust survivor Elie Wiesel. “Silence encourages the tormentor, never the tormented.” “To be silent is to be complicit,” adds Dead Man Walking author Sister Helen Prejean.

 

But convictions need restraining with humility, a virtue that lies at the heart of science for theists and nontheists alike. Those of us who are theists, of whatever faith tradition, share two convictions:

  1. There is a God.
  2. It’s not me (or you).

Ergo, we are fallible. The surest conviction we can have is that some of our beliefs err. From this follows the religious virtue of humility (alas, a virtue more often preached than practiced). A spirit of humility seasons conviction with open-minded curiosity. It tempers faith with uncertainty (faith without humility is fanaticism). It subjects our opinions to evidence and enables good science. It tells me that every person I meet is, in some way, my superior . . . providing an opportunity to learn.

 

The triangle of virtues within which we can aspire to live is completed when conviction, restrained by humility, is directed by love. In his great sermon on love, Martin Luther King, Jr. quoted Jesus: “Love your enemies, bless them that curse you, do good to them that hate you.” Doing that, he said, does not compel us to like our enemies, but does compel us “to discover the element of good” in them. By contrast, “hate only intensifies the existence of hate and evil in the universe,” he added. “If I hit you and you hit me and I hit you back and you hit me back and go on, you see, that goes on ad infinitum. It just never ends. . . . Hate destroys the hater as well as the hated.”

 

Is this not a vision of a good life that will enable a flourishing culture . . . a life that is animated by deep convictions, which are refined in humility and applied with love?

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

Psychological science delights us with its occasional surprises. For example, who would have imagined that

  • electroconvulsive therapy—shocking the brain into mild convulsions—would often be an effective antidote to otherwise intractable depression?
  • massive losses in brain tissue early in life could have minimal later effects?
  • siblings’ shared home environment would have such a small effect on their later traits?
  • after brain damage, a person may learn new skills yet be unaware of such?
  • visual information is deconstructed into distinct components (motion, form, depth, and color), processed by distinct brain regions, and then reintegrated into a perceived whole?

 

The latest who-would-have-believed-it finding is that the microbiology of the gut may influence the moods of the brain. Digestive-system bacteria reportedly influence human emotions and even social interactions, perhaps by producing neurotransmitters. Moreover, we are told (such as here and here), healthy gut microbes can reduce anxiety, depression, and PTSD.

 

New articles on this supposedly “revolutionary” and “paradigm-shifting” microbiota-gut-brain (MGB) research are accumulating, report Katarzyna Hooks, Jan Pieter Konsman, and Maureen O’Malley in a forthcoming (yet-to-be-edited) review. By comparing rodents or humans with or without intestinal microbiota, researchers have indeed found “suggestive” effects on how organisms respond to stress and display emotions. Some researchers are exploring microbiota-related interventions (such as with probiotics versus placebos) as a possible treatment for depression, anxiety, and anorexia nervosa.

 

The findings are intriguing and worth pursuing but haven’t yet definitively demonstrated “the impact of the microbiota itself on behavior,” say Hooks, Konsman, and O’Malley. Nevertheless, the popular press, sometimes aided by university press offices, has hyped the research in more than 300 articles. People love the news of this research, say Hooks et al., because it lends hope that a natural, healthy diet can provide a simple DIY solution to troubling emotions.

 

Reading this analysis triggers déjà vu. This cycle of (a) an intriguing finding, followed by (b) hype, followed by (c) reassessment, is an occasional feature of our science’s history. Mind-blowing experiments on people with split brains yielded (a) believe-it-or-not findings, leading to (b) overstated claims about left-brained and right-brained people, which (c) finally settled into a more mature understanding of how distinct brain areas function as a whole integrated system.

 

Despite the “large helpings of overinterpretation” and the overselling of “currently limited findings,” the Hooks team encourages researchers to press on. “We see MGB research as a field full of promise, with important implications for understanding the relationship between the brain and the rest of the body.” The body (brain included) is one whole system.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

As I finished a recent presentation, “Thinking Smart in a Post-Truth Age,” a questioner’s hand shot up: “I understand the need to think with our heads as well as our hearts, by considering the evidence. But how can I persuade people such as the climate-change-denying folks meeting in my town next week?”

 

I responded by commending a gentle conversation that searched for common values. I also noted that advocates for any cause are wise to not focus on immovable folks with extreme views, but on the uncertain middle—the folks whose votes sway elections and shape history.

 

I should also have mentioned the consistent finding of nine new studies by University of Cologne psychologists Joris Lammers and Matt Baldwin: Folks will often agree with positions that are linked to their own yearnings. For example, might conservatives who tend to yearn for yesteryear’s good old days respond to messages that appeal to nostalgia? Indeed, say Lammers and Baldwin, that was the successful assumption of Donald Trump’s “Make America Great Again” message.

 

But the same appeal to nostalgia can also promote progressive ideas, they report. For example, liberals were much more supportive than conservatives of a future-focused gun-control message: “I would prefer to make a change, so that in the future people may own hunting rifles and pistols, but no one will have assault rifles.” When the researchers framed the same message with a past-focus: “I would like to go back to the good old days, when people may have owned hunting rifles and pistols, but no one had assault rifles,” conservatives pretty much agreed with liberals.

 

Likewise, contemporary Germans on the left and right expressed much less disagreement about an immigration message when it focused on their country’s history of welcoming immigrants.

 

In earlier research, Lammers and Baldwin also found conservatives more open to nostalgia-focused environmental appeals—to, for example, donating money to a charity focused on restoring yesterday’s healthy Earth, rather than a charity focused on preventing future environmental damage. “Make Earth Great Again.”

 

Ergo, I now realize I should have encouraged my questioner to market her message to her audience. If it’s a political message pitched by conservatives at liberals, it’s fine to focus on making a better future. But if she is appealing to conservatives, then she might take a back-to-the-future approach: Frame her message as support for the way things used to be.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

I’m often asked: “What is your favorite introductory psych chapter?” I reply that, when starting to write my text, I presumed that Sensation-Perception would be the dullest topic. Instead, I’ve found it to be the most fascinating. I’m awestruck by the intricate process by which we take in information, transform it into nerve impulses, distribute it to different parts of our brain, and then reassemble that information into colorful sights, rich sounds, and evocative smells. Who could have imagined? We are, as the Psalmist said, “wonderfully made.”

 

And then there are the weird and wonderful perceptual phenomena, among which is our surprising blindness to things right in front of our eyes. In various demonstrations of inattentional blindness, people who are focused on a task (such as talking on a phone or counting the number of times black-shirted people pass a ball) often fail to notice someone sauntering through the scene—a woman with an umbrella, in one experiment, or even a person in a gorilla suit or a clown on a unicycle.

 

 

As a Chinese tour guide wrote to a friend of mine (after people failed to notice something my friend had seen):

 

This looking-without-seeing phenomenon illustrates a deep truth: Our attention is powerfully selective. Conscious awareness resides in one place at a time.

 

Selective inattention restrains other senses, too. Inattentional deafness is easily demonstrated with dichotic listening tasks. For example, when novel tunes are played into one ear while people focus on repeating aloud words fed into the other ear, they will later be unable to identify what tune they heard. (Thanks to the mere exposure effect, they will, however, later like it best.) Or, in an acoustic replication of the famed invisible gorilla study, Polly Dalton and Nick Fraenkel found that people focusing on a conversation between two women (rather than on two men also talking) usually failed to notice one of the men repeatedly saying “I am a gorilla.”

 

Now, in a new British experiment, we also have evidence of inattentional numbness. Pickpockets have long understood that bumping into people makes them unlikely to notice a hand slipping into their pocket. Dalton (working with Sandra Murphy) experimented with this tactile inattention: Sure enough, when distracted, their participants failed to notice an otherwise easily noticed vibration to their hand.

 

Tactile inattention sometimes works to our benefit. I once, while driving to give a talk, experienced a painful sting in my eye (from a torn contact lens) . . . then experienced no pain while giving the talk . . . then felt the excruciating pain again on the drive home. In clinical settings, such as with patients receiving burn treatments, distraction can similarly make painful procedures tolerable. Pain is most keenly felt when attended to.

 

Another British experiment, by Charles Spence and Sophie Forster, demonstrated inattentional anosmia (your new word for the day?)—an inability to smell. When people focused on a cognitively demanding task, they became unlikely to notice a coffee scent in the room.

So what’s next? Can we expect a demonstration of inattentional ageusia—inability to taste? (That’s my new word for the day.) Surely, given our powers of attention (and corresponding inattention), we should expect such.

 

Like a flashlight beam, our mind’s selective attention focuses at any moment on only a small slice of our experience—a phenomenon most drivers underestimate when distracted by phone texting or conversation. However, there’s good news: With our attention riveted on a task, we’re productive and even creative. Our attention is a wonderful gift, given to one thing at a time.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

Nearly two-thirds of Americans, reports a recent PLOS One article, agree that “I am more intelligent than the average person.”

 

This self-serving bias—on which I have been reporting for four decades (starting here)—is one of psychology’s most robust and reliable phenomena. Indeed, on most subjective, socially desirable dimensions, most of us see ourselves as better than average . . . as smarter, more ethical, more vocationally competent, more charitable, less prejudiced, friendlier, healthier, and more likely to outlive our peers—which calls to mind Freud’s joke about the husband who told his wife, “If one of us dies, I shall move to Paris.”

 

My own long-ago interest in self-serving bias was triggered by noticing a result buried in a College Board survey of 829,000 high school seniors. In rating themselves on their “ability to get along with others,” 0 percent viewed themselves as below average. But a full 85 percent saw themselves as better than average: 60 percent in the top 10 percent, and 25 percent in the top 1 percent.

 

As Shelley Taylor wrote in Positive Illusions, “The [self-]portraits that we actually believe, when we are given freedom to voice them, are dramatically more positive than reality can sustain.” Dave Barry recognized the phenomenon: “The one thing that unites all human beings, regardless of age, gender, religion, economic status, or ethnic background is that deep down inside, we all believe that we are above average drivers.”

 

Self-serving bias also takes a second form—our tendency to accept more responsibility for our successes than our failures, for our victories than our defeats, and for our good deeds than our bad. In experiments, people readily attribute their presumed successes to their ability and effort, their failures to bad luck or an impossible task. A Scrabble win reflects our verbal dexterity. A loss? Our bad luck in drawing a Q but no U.

 

Perceiving ourselves, our actions, and our groups favorably does much good. It protects us against depression, buffers stress, and feeds our hopes. Yet psychological science joins literature and religion in reminding us of the perils of pride. Hubris often goes before a fall. Self-serving perceptions and self-justifying explanations breed marital conflict, bargaining impasses, racism, sexism, nationalism, and war.

 

Being mindful of self-serving bias needn’t lead to false modesty—for example, smart people thinking they are dim-witted. But it can encourage a humility that recognizes our own virtues and abilities while equally acknowledging those of our neighbors. True humility leaves us free to embrace our special talents and similarly to celebrate those of others.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

“She [Professor Christine Blasey Ford] can’t tell us how she got home and how she got there,” scorned Senator Lindsey Graham during the lunch break of yesterday’s riveting U. S. Senate Judiciary Committee hearing regarding Ford’s memory of being assaulted by Supreme Court nominee Brett Kavanaugh. Graham’s assumption, widely voiced by fellow skeptics of Ford’s testimony, is that her inability to remember simple peripheral details discounts the authenticity of her assault memory.

 

But Graham and the other skeptics fail to understand, first, how extreme emotions signal the brain to “save this!” for future reference. (Likely you, too, have enduring “flashbulb memories” for long-ago emotional experiences?) And second, they fail to understand that peripheral details typically fall into oblivion. In Psychology, 12th Edition, Nathan DeWall and I explain:

 

Our emotions trigger stress hormones that influence memory formation. When we are excited or stressed, these hormones make more glucose energy available to fuel brain activity, signaling the brain that something important is happening. Moreover, stress hormones focus memory. Stress provokes the amygdala (two limbic system emotion-processing clusters) to initiate a memory trace that boosts activity in the brain’s memory-forming areas (Buchanan, 2007; Kensinger, 2007) (FIGURE 8.9). It’s as if the amygdala says, “Brain, encode this moment for future reference!” The result? Emotional arousal can sear certain events into the brain, while disrupting memory for irrelevant events (Brewin et al., 2007; McGaugh, 2015).

 

Significantly stressful events can form almost unforgettable memories. After a traumatic experience—a school shooting, a house fire, a rape—vivid recollections of the horrific event may intrude again and again. It is as if they were burned in: “Stronger emotional experiences make for stronger, more reliable memories,” noted James McGaugh (1994, 2003). Such experiences even strengthen recall for relevant, immediately preceding events [such as going up the stairway and into the bedroom, in Ford’s case] (Dunsmoor et al., 2015; Jobson & Cheraghi, 2016). This makes adaptive sense: Memory serves to predict the future and to alert us to potential dangers. Emotional events produce tunnel vision memory. They focus our attention and recall on high-priority information, and reduce our recall of irrelevant details (Mather & Sutherland, 2012). Whatever rivets our attention gets well recalled, at the expense of the surrounding context.

 

And as I suggested in last week’s essay, Graham and others seem not to understand “state-dependent memory”—that what people experience in one state (such as when drunk) they may not remember in another state (sober). Nor are Kavanaugh’s supporters recognizing that heavy drinking disrupts memory formation, especially for an experience that would not have been traumatic for him. Thus, Kavanaugh could be sincerely honest in not recalling an assaultive behavior, but also, possibly, sincerely wrong.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit www.TalkPsych.com.)

Psychology professor Christine Blasey Ford vividly recalls being sexually assaulted by Supreme Court nominee Brett Kavanaugh when both were teens. Kavanaugh remembers no such event and vigorously denies Ford’s accusation. The potentially historic significance of the allegation has triggered a debate: Is she telling the truth? Or is he, in claiming no such memory?

 

Without judging either’s current character, psychological science suggests a third possibility: Perhaps both are truthfully reporting their memories.

 

On Ford’s behalf, we can acknowledge that survivors of traumatic events typically are haunted by enduring, intrusive memories. As Nathan DeWall and I write in Psychology, 12th Edition,

Significantly stressful events can form almost unforgettable memories. After a traumatic experience—a school shooting, a house fire, a rape—vivid recollections of the horrific event may intrude again and again. It is as if they were burned in: “Stronger emotional experiences make for stronger, more reliable memories,” noted James McGaugh (1994, 2003).

 

Does Ford’s inability to remember ancillary details, such as when the assault supposedly occurred, discount her veracity? Not at all, if we’re to generalize from research on the accuracy of eyewitness recollections. Those whose memory is poor for incidental details of a scene are more accurate in their recollections of the essential event (see here and here).

 

But if Kavanaugh and his friend were, indeed, “stumbling drunk,” then perhaps they, genuinely, have no recollection of their impulsive behaviors while “severely intoxicated.”  Memory blackouts do happen, as we also report:

Ergo, if trauma sears memories into the brain, and if alcohol disrupts them, could it be that both Ford and Kavanaugh are telling the truth as best they can recall it?

 

(For David Myers’ other weekly essays on psychological science and everyday life visit www.TalkPsych.com)

Turning 76 years old in a week, and still loving what I do, I find myself inspired by two recent emails. One, from social psychologist Thomas Pettigrew, age 87, responded to my welcoming his latest work by attaching fourteen of his recent publications. The second, from Nathan DeWall, pointed me to an interesting new article co-authored by developmental psychologist Walter Mischel, age 88 (who, sadly, died just hours before this essay was posted).

 

That got me thinking about other long-lived people who have found their enduring calling in psychological science. My late friend, Charles Brewer, the long-time editor of Teaching of Psychology (who once told me he took two days a year off: Christmas and Easter), taught at Furman University until nearly 82, occupied his office until age 83, and was still authoring into his 80s.

 

But Charles’ longevity was exceeded by that of

  • B.F. Skinner, whom I heard address the American Psychological Association convention in 1990 at age 86, just eight days before he died of leukemia.
  • Carroll Izard, who co-authored three articles in 2017, the year of his death at age 93.
  • Jerome Bruner, who, the year before he died in 2016 at age 100, authored an essay on “The Uneasy Relation of Culture and Mind.”

 

And in earlier times, my historian-of-psychology friend Ludy Benjamin tells me, Wilhelm Wundt taught until 85 and supervised his last doctoral student at 87, and Robert Woodworth lectured at Columbia until 89 and published his last work at 90.*

 

So, I then wondered, who of today’s living psychologists, in addition to Pettigrew and Mischel, are still publishing at age 85 and beyond? Daniel Kahneman and Paul Ekman almost qualify, but at 84 are youngsters compared to those below.  Here’s my preliminary short list—other nominees welcome!—with their most recent PsycINFO publication. (Given the era in which members of their age received PhDs, most are—no surprise—men.)

 

  • Philip Zimbardo: Age 85 (born March 23, 1933)

Unger, A., Lyu, H., & Zimbardo, P. G. (2018). How compulsive buying is influenced by perspective—Cross-cultural evidence from Germany, Ukraine, and China. International Journal of Mental Health and Addiction, 16, 522–544.

 

  • Gordon Bower: Age 85 (born December 30, 1932)

Bower, G. H. (2016). Emotionally colored cognition. In R. J. Sternberg, S. T. Fiske, & F. J. Foss (Eds.), Scientists making a difference: One hundred eminent behavioral and brain scientists talk about their most important contributions. Chapter xxvii, pp. 123–127. New York: Cambridge University Press.

 

  • James McGaugh: Age 86 (born December 17, 1931)

McGaugh, J. L. (2018). Emotional arousal regulation of memory consolidation. Current Opinion in Behavioral Sciences, 19, 55–60.

 

  • Lila Gleitman: Age 88 (born December 10, 1931)

Gleitman, L. R., & Trueswell, J. C. (2018). Easy words: Reference resolution in a malevolent referent world. Topics in Cognitive Science.

 

  • Roger Shepard: Age 89 (born January 30, 1929)

Shepard, R. N. (2016). Just turn it over in your mind. In R. J. Sternberg, S. T. Fiske, & F. J. Foss (Eds.), Scientists making a difference: One hundred eminent behavioral and brain scientists talk about their most important contributions. Chapter xxvii, pp. 99–103. New York: Cambridge University Press.

 

  • Jerome Kagan: Age 89 (born February 25, 1929)

Kagan, J. (2018, May). Three unresolved issues on human morality. Perspectives on Psychological Science, 13, 346–358.

 

  • Albert Bandura: Age 92 (born December 4, 1925)

Bandura, A. (2016). Moral disengagement: How people do harm and live with themselves. New York: Worth Publishers.

 

  • Aaron Beck: Age 97 (born July 18, 1921)

Kochanski, K. M., Lee-Tauler, S. Y., Brown, G. K., Beck, A., Perera, K. U., et al. (2018, Aug.) Single versus multiple suicide attempts: A prospective examination of psychiatric factors and wish to die/wish to live index among military and civilian psychiatrically admitted patients. Journal of Nervous and Mental Disease, 206, 657–661.

 

  • Eleanor Maccoby: Age 101 (born May 15, 1917)

Maccoby, E. (2007). Historical overview of socialization research and theory. In J. E. Grusec, & P. D. Hastings (Eds.), Handbook of socialization: Theory and research. New York: Guilford Press.

 

  • And a drum roll for Brenda Milner: At age 100 (born July 15, 1918), she still, I’m told, comes in a couple times a week to the Montreal Neurological Institute, which last week celebrated her centennial (with thanks to Melvin Goodale for the photo below).

Milner, B., & Klein, D. (2016, March). Loss of recent memory after bilateral hippocampal lesions: Memory and memories—looking back and looking forward. Journal of Neurology, Neurosurgery & Psychiatry, 87, 230.

 

 

Life is a gift that ends unpredictably. Having already exceeded my at-birth life expectancy, I am grateful for the life I have had. But as one who still loves learning and writing (and can think of nothing else I’d rather do), why not emulate these esteemed colleagues while I continue to be blessed with health, energy, and this enduring sense of calling?

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

 ----

* The “major early women psychologists”—Calkins, Washburn, Ladd-Franklin, Woolley, Hollingworth—all died before age 85, reported Benjamin, who added that some other psychologists have stayed too long in the profession without knowing “when to hang up their spikes” and make way for fresh faces in the classroom and laboratory.

Some fun emails stimulated by last week’s essay on loss aversion in sports and everyday life pointed me to statistician David Spiegelhalter's Cambridge Coincidence Collection, which contains people’s 4500+ reports of weird coincidences.

 

That took my mind back to some personally experienced coincidences . . . like the time my daughter, Laura Myers, bought two pairs of shoes. Back home, we were astounded to discover that the two brand names on the boxes were “Laura” and “Myers.” Or the time I confused our college library desk clerk when checking out after using a photocopy machine. My six-digit charge number was identical to the one-in-a-million six-digit number of copies on which the last user had stopped. Or the day my wife, Carol, called seeking my help in sourcing Mark Twain’s quote, “The man who does not read good books has no advantage over the man who cannot read them.” After this first-ever encounter with that quote, my second encounter was 90 minutes later, in a Washington Post article.

 

In Intuition: Its Powers and Perils, I report more amusing coincidences. Among my favorites:

  • Twins Levinia and Lorraine Christmas, driving to deliver Christmas presents to each other near Flitcham, England, collided.
  • Three of the first five U.S. Presidents—Adams, Jefferson, and Monroe—died on the same date—which was none other than the 4th of July.
  • And my favorite . . . in Psalm 46 of the King James Bible, published in the year that Shakespeare turned 46, the 46th word is “shake” and the 46th word from the end is “spear.” (An even greater marvel: How did anyone notice this?)

 

What should we make of weird coincidences? Were they, as James Redfield suggested in The Celestine Prophecy, seemingly “meant to happen . . . synchronistic events, and [that] following them will start you on your path to spiritual truth”? Is it a wink from God that your birthdate is buried among the random digits of pi? Beginning 50,841,600 places after the decimal, my 9/20/1942 birthdate appears . . . and you can likewise find yours here.

 

Without wanting to drain our delight in these serendipities, statisticians have a simpler explanation. Given the countless billions of daily events, some weird juxtapositions are inevitable—and then likely to get noticed and remembered (while all the premonitions not followed by an envisioned phone call or accident are unnoticed and fall into oblivion). “With a large enough sample, any outrageous thing is likely to happen,” observed statisticians Persi Diaconis and Frederick Mosteller. Indeed, added mathematician John Allen Paulos, “the most astonishingly incredible coincidence imaginable would be the complete absence of all coincidences.”
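Diaconis and Mosteller's "large enough sample" principle is easy to quantify for the pi example. Here is a back-of-envelope sketch that treats pi's digits as effectively random (a standard, though unproven, assumption); the numbers are illustrative, not a claim about pi itself:

```python
# How likely is it that a given eight-digit birthdate string appears somewhere
# in the first 100 million digits of pi?  If the digits behave like random
# digits, each eight-digit window matches a specific date with probability 1e-8.
n_windows = 100_000_000   # roughly the number of eight-digit windows examined
p_match = 1e-8            # chance any one window equals your birthdate

# Probability the date shows up at least once = complement of "never appears."
p_at_least_once = 1 - (1 - p_match) ** n_windows

print(f"~{p_at_least_once:.0%} chance your birthdate appears")
```

With these assumptions the answer is roughly 1 − 1/e, or about 63 percent: a seemingly miraculous hit is, in fact, more likely than not.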

 

Finally, consider: That any specified coincidence will occur is very unlikely. That some astonishing unspecified event will occur is certain. That is why remarkable coincidences are noted in hindsight, not predicted with foresight. And that is also why we don’t need paranormal explanations to expect improbable happenings, even while delighting in them.

Imagine that you’re about to buy a $5000 used car. To pay for it, you’ll need to sell some of your stocks. Which of the following would you rather sell?

  • $5000 of Stock X shares, which you originally purchased for $2500.
  • $5000 of Stock Y shares, which you originally purchased for $10,000.

 

If you’d rather sell Stock X and reap your $2500 profit now, you’re not alone. One analysis of 10,000 investor accounts revealed that most people strongly prefer to lock in a profit rather than absorb a loss. Investors’ loss aversion is curious: What matters is each stock’s future value, not whether it has made or lost money in the past. (If anything, tax considerations favor selling the loser for a tax loss and avoiding the capital gains tax on the winner.)

 

Loss aversion is ubiquitous, and not just in big financial decisions. Participants in experiments, where rewards are small, will choose a sure gain over flipping a coin for double or nothing—but they will readily flip a coin on a double-or-nothing chance to avert a loss. As Daniel Kahneman and Amos Tversky reported, we feel the pain from a loss twice as keenly as we feel the pleasure from a similar-sized gain. Losing $20 feels worse than finding $20 feels good. No surprise, then, that we so vigorously avoid losing in so many situations.
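Kahneman and Tversky's asymmetry is often captured by their prospect-theory value function. Below is a minimal sketch using their commonly cited 1992 parameter estimates (curvature 0.88, loss-aversion coefficient 2.25); the function and variable names are mine, for illustration only:

```python
# A sketch of the prospect-theory value function, which formalizes
# "losses loom larger than gains."
def subjective_value(x, alpha=0.88, loss_aversion=2.25):
    """Felt value of gaining (x > 0) or losing (x < 0) an amount x."""
    if x >= 0:
        return x ** alpha
    return -loss_aversion * (-x) ** alpha

gain = subjective_value(20)    # pleasure of finding $20
loss = subjective_value(-20)   # pain of losing $20

# The loss's sting is more than double the gain's pleasure,
# matching the essay's "twice as keenly" observation.
print(f"Finding $20 feels like +{gain:.1f}; losing $20 feels like {loss:.1f}")
```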

 

The phenomenon extends to the endowment effect—our attachment to what we own and our aversion to losing it, as when those given a coffee mug demand more money to sell it than those not given the mug are willing to pay for it. Small wonder our homes are cluttered with things we wouldn’t today buy, yet won’t part with.

 

Loss aversion is but one example of a larger bad-is-stronger-than-good phenomenon, note Roy Baumeister and his colleagues. Bad events evoke more misery than good events evoke joy. Cruel words hurt us more than compliments please us. A bad reputation is easier to acquire—with a single lie or heartless act—than is a good reputation. “In everyday life, bad events have stronger and more lasting consequences than comparable good events.” Psychologically, loss is larger than gain. Emotionally, bad is stronger than good.  

           

Coaches and players are aware of the pain of losses, so it’s no surprise that loss aversion plays out in sports. Consider this example from basketball: Say your team is behind by 2 points, with time only for one last shot. Would you prefer a 2-point or a 3-point attempt?

 

Most coaches, wanting to avoid a loss, will seek to put the game into overtime with a 2-point shot. After all, an average 3-point shot will produce a win only one-third of the time. But if the team averages 50 percent of its 2-point attempts, and has about a 50 percent chance of winning in overtime in this toss-up game, the loss-aversion strategy will yield but a 25 percent chance of both (a) sending the game to overtime, followed by (b) an overtime victory. Thus, by averting an immediate loss, these coaches reduce the chance of an ultimate win—rather like investors who place their money in loss-avoiding bonds and thus forgo the likelihood, over extended time, of a much greater stock index win.
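The coaches' trade-off reduces to two lines of expected-value arithmetic. A minimal sketch, using the shooting and overtime rates assumed above:

```python
# Win-probability arithmetic for the last-shot decision, down 2 points.
p_two = 0.50       # chance the 2-point shot falls and forces overtime
p_ot_win = 0.50    # chance of winning overtime between evenly matched teams
p_three = 1 / 3    # chance the 3-point shot falls and wins outright

win_via_two = p_two * p_ot_win   # must make the shot AND win overtime
win_via_three = p_three          # one make ends the game

print(f"Play for the tie: {win_via_two:.1%} chance to win")    # 25.0%
print(f"Play for the win: {win_via_three:.1%} chance to win")  # 33.3%
```

Under these assumptions, the "safe" play is the worse bet by eight percentage points.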

 

And now comes news (kindly shared by a mathematician friend) of loss aversion in baseball and softball base-running. Statistician Peter MacDonald, mathematician Dan McQuillan, and computer scientist Ian McQuillan invite us to imagine “a tie game in the bottom of the ninth inning, and there is one out—a single run will win the game. You are on first base, hoping the next batter gets a hit.”

 

As the batter hits a fly to shallow right, you hesitate between first and second to see if the sprinting outfielder will make the catch. When the outfielder traps rather than catches the ball, you zoom to second. The next batter hits a fly to center field and, alas, the last batter strikes out.

 

You probably didn’t question this cautious base-running scenario, because it’s what players do and what coaches commend. But consider an alternative strategy, say MacDonald and his colleagues. If you had risked running to third on that first fly ball, you would have scored the winning run on the ensuing fly ball. Using data from 32 years of Major League Baseball, the researchers calculate that any time the fly ball is at least 38 percent likely to fall for a hit, the runner should abandon caution and streak for third. Yet, when in doubt, that rational aggressive running strategy “is never attempted.”

 

You may object that players cannot compute probabilities. But, says the MacDonald team, “players and their third-base coaches make these sorts of calculations all the time. They gamble on sacrifice flies and stolen base attempts using probabilities of success.” Nevertheless, when it comes to running from first, their first goal is to avert loss—and to avoid, even at the cost of a possible run, the risk of looking like a fool. We implicitly think “What if I fail?” before “How can I succeed?”
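To see the structure of such a calculation, here is a toy break-even computation. The scoring probabilities below are invented for illustration; they are not the MLB-derived figures MacDonald and colleagues used to obtain their 38 percent threshold:

```python
# A toy version of the base-running trade-off: should the runner on first
# gamble on reaching third while the fly ball is in the air?
def win_prob(p_hit, run_hard):
    """Chance the winning run eventually scores, given the chosen strategy."""
    if run_hard:
        # Ball drops (p_hit): the gambling runner stands on third and scores
        # on almost anything -- assume a 0.70 chance the run comes home.
        # Ball is caught: the runner is doubled off first, inning over (0).
        return p_hit * 0.70
    # Cautious runner: reaches second on a hit (assume 0.40 chance to score),
    # or stays safely at first on a catch (assume 0.15 chance to score).
    return p_hit * 0.40 + (1 - p_hit) * 0.15

# Scan drop probabilities to find where gambling starts to pay.
breakeven = next(p / 100 for p in range(101)
                 if win_prob(p / 100, run_hard=True) > win_prob(p / 100, run_hard=False))
print(f"With these toy numbers, run hard once the ball is ~{breakeven:.0%} likely to drop")
```

With these invented numbers the break-even point lands near one-third; the researchers' point is that real MLB data put it at 38 percent, yet runners almost never gamble even well above that threshold.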

 

Often in life, it seems, our excessive fear of losing subverts our opportunities to win. Caution thwarts triumph. Little ventured, little gained.

 

My late friend Gerry Haworth understood the risk-reward relationship. A shop teacher at our local high school, he began making wood products in his garage shop. Then, in 1948, he ventured the business equivalent of running to third base—quitting his job and launching a business, supported by his dad’s life savings. Today, family-owned Haworth Inc., America’s third-largest furniture manufacturer, has more than 6000 employees and nearly $2 billion in annual sales. Something ventured, something gained.