The Psychology Community
At long last, artificial intelligence (AI)—and its main subset, machine learning—is beginning to fulfill its promise. When fed massive amounts of data, computers can discern patterns (as in speech recognition) and make predictions or decisions. AlphaZero, a Google-related computer system, started playing chess, shogi (Japanese chess), and Go against itself. Before long, thanks to machine learning, AlphaZero progressed from no knowledge of each game to “the best player, human or computer, the world has ever seen.”

 


 

I’ve had recent opportunities to witness the growing excitement about machine learning in the human future, through conversations with

  • Adrian Weller (a Cambridge University scholar who is program director for the UK’s national institute for data science and AI).
  • Andrew Briggs (Oxford’s Professor of Nanomaterials, who is using machine learning to direct his quantum computing experiments and, like Weller, is pondering what machine learning portends for human flourishing).
  • Brian Odegaard (a UCLA post-doc psychologist who uses machine learning to identify brain networks that underlie human consciousness and perception).

 

Two new medical ventures (to which—full disclosure—my family foundation has given investment support) illustrate machine learning’s potential:

  • Fifth Eye, a University of Michigan spinoff, has had computers mine data on millions of heartbeats from critically ill hospital patients—to identify invisible, nuanced signs of deterioration. By detecting patterns that predict patient crashes, the system aims to provide a potentially life-saving early warning system (well ahead of doctors or nurses detecting anything amiss).
  • Delphinus, which offers a new ultrasound alternative to mammography, will similarly use machine learning from thousands of breast scans to help radiologists spot potent cancer cells.

 

Other machine-learning diagnostic systems are helping physicians to identify strokes, retinal pathology, and (using sensors and language predictors) the risk of depression or suicide. Machine learning of locked-in ALS patients’ brain wave patterns associated with “Yes” and “No” answers has enabled them to communicate their thoughts and feelings. And it is enabling researchers to translate brain activity into speech.

 

Consider, too, a new Pew Research Center study of gender representation in Google images. Pew researchers first harvested an archive of 26,981 gender-labeled human faces from different countries and ethnic groups. They fed 80 percent of these images into a computer, which used machine learning to discriminate male and female faces. When tested on the other 20 percent, the system achieved 95 percent accuracy.
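
For readers curious about the mechanics, here is a minimal sketch of the 80/20 train-and-test procedure the Pew researchers describe. The classifier, the features, and the random data below are placeholders of my own, not Pew’s actual pipeline; with real labeled face images, this is the kind of setup that would yield the reported 95 percent held-out accuracy.

```python
# Illustrative sketch of an 80/20 train-test split for a gender classifier.
# The features, labels, and model below are stand-ins, not Pew's actual pipeline.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(26_981, 128))     # stand-in for image features (one row per face)
y = rng.integers(0, 2, size=26_981)    # stand-in for gender labels (0/1)

# 80 percent of the images train the model; the held-out 20 percent test it.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.20, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.2%}")
# With real labeled faces (rather than random placeholders), Pew reports ~95% accuracy.
```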

 

Pew researchers next had the system use its new human-like gender-discrimination ability to identify the gender of the people shown in 10,000 Google images associated with 105 common occupations. Would the image search results overrepresent, underrepresent, or accurately represent each gender’s actual share of those occupations, as reported by U.S. Bureau of Labor Statistics (BLS) data summaries?

 

The result? Women, relative to their presence in the working world, were significantly underrepresented in some categories and overrepresented in others. For example, the BLS reports that 57 percent of bartenders are female—as are only 29 percent of the first 100 people shown in Google image searches of “bartender” (as you can see for yourself). Searches for “medical records technician,” “probation officer,” “general manager,” “chief executive,” and “security guard” showed a similar underrepresentation. But women were overrepresented, relative to their working proportion, in Google images for “police,” “computer programmer,” “mechanic,” and “singer.” Across all 105 jobs, men are 54 percent of those employed and 60 percent of those pictured. The bottom line: Machine learning reveals (in Google users’ engagement) a subtle new form of gender bias.
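
To make the over/underrepresentation comparison concrete, here is a minimal sketch of the calculation. Only the bartender figures (57 percent of workers, 29 percent of pictured people) come from the essay; the data structure and output wording are mine.

```python
# Compare women's share of a job's workforce (BLS) with their share of
# Google image results. Only the bartender numbers come from the essay.
jobs = {
    # occupation: (women's share of workers, women's share of pictured people)
    "bartender": (0.57, 0.29),
}

for job, (employed, pictured) in jobs.items():
    gap = (pictured - employed) * 100
    direction = "underrepresented" if gap < 0 else "overrepresented"
    print(f"{job}: women are {direction} in image results by {abs(gap):.0f} points")
```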

 

As these examples illustrate, machine learning holds promise for helpful application and research. But it will also entail some difficult ethical questions.

 

Imagine, for example, that age, race, gender, or sexual orientation are incorporated into algorithms that predict recidivism among released prisoners. Would it be discriminatory, or ethical, to use such demographic predictors in making parole decisions?

 

Such questions already exist in human judgments, but may become more acute if and when we ask machines to make these decisions. Or is there reason to hope that it will be easier to examine and tweak the inner workings of an algorithmic system than to do so with a human mind?

 

(For David Myers’ other essays on psychological science and everyday life visit www.TalkPsych.com.)

Judith Rich Harris’ December 29th death took my mind to her remarkable life and legacy. Among all the people I’ve never met, she was the person I came to know best. Across 243 emails she shared her draft writings, her critical assessment of others’ thinking (including my own), and the progress of her illness.

 

Our conversation began after the publication of her cogent Psychological Review paper, which changed my thinking and led me to send a note of appreciation. The paper’s gist was delivered by its first two sentences: “Do parents have any important long-term effects on the development of their child’s personality? This article examines the evidence and concludes that the answer is no.”

 

Her argument: Behavior genetics studies (of twins and adoptees) show that genes predispose our individual traits, and that siblings’ “shared environment” has a shockingly small influence. Peers also matter—they transmit culture. Show her some children who hear English spoken with one accent at home, and another accent at school and in the neighborhood, and—virtually without exception—she will show you children who talk like their peers.

 

Judy Harris was a one-time Harvard psychology graduate student who was dismissed from its doctoral program because, as George Miller explained to her, she lacked “originality and independence.”

 

But she persisted. In her mid-fifties, without any academic affiliation and coping with debilitating autoimmune disorders, she had the chutzpah to submit her evidence-based ideas to Psychological Review, then as now psychology’s premier theoretical journal. To his credit, the editor, Daniel Wegner, welcomed this contribution from this little-known independent scholar. Moreover, when her great encourager Steven Pinker and I each nominated her paper for the annual award for “outstanding paper on general psychology,” the judges selected her as co-recipient of the—I am not making this up—George A. Miller Award. (To his credit, Miller later termed the irony “delicious.”)

 

The encouraging lesson (in Harris’ words): “‘Shut in’ does not necessarily mean ‘shut out.’” Truth will out. Although biases are real, excellence can get recognized. So, wherever you are, whatever your big idea or passion, keep on.

 

Her fame expanded with the publication of her 1998 book The Nurture Assumption, which was profiled by Malcolm Gladwell in a New Yorker feature article, made into a Newsweek cover story, and named as a Pulitzer Prize finalist.

 

Her argument was controversial, and a reminder that important lessons are often taught by those who fearlessly push an argument to its limit. (Surely child-rearing does have some direct influence on children’s values, religiosity, and politics—and not just via the peer culture to which parents expose children. And surely the loving versus abusive extremes of parenting matter.)

 

Harris was kind and generous (she supportively critiqued my writing, even as I did hers) but also had the self-confidence to take on all critics and to offer challenges to other widely accepted ideas. One was the “new science” of birth order, which, as she wrote me, was “neither new nor science.” An August 24, 1997, email gives the flavor of her wit and writing:

Birth order keeps coming back. In their 1996 book on birth order and political behavior, Albert Somit, Alan Arwine, and Steven A. Peterson spoke of the “inherent non-rational nature of deeply held beliefs” and mused that “permanently slaying a vampire”—the belief in birth order effect—may require “that a stake of gold be driven through his/her heart at high noon” (p. vi).
            Why is it so difficult to slay this vampire? Why, in spite of all the telling assaults that have been made on it, does it keep coming back? The answer is that the belief in birth order effects fits so well into the basic assumptions of our profession and our culture. Psychologists and nonpsychologists alike take it for granted that a child’s personality, to the degree that it is shaped by the environment, receives that shaping primarily at home. And since we know (from our own memories and introspections) that a child’s experiences at home are very much affected by his or her position in the family—oldest, youngest, or in the middle—we expect birth order to leave permanent marks on the personality.
            The belief in birth order effects never dies; it just rests in its coffin until someone lifts the lid again.

 

Alas, the disease that shut her in has, as she anticipated, claimed her. In her last email sent my way on September 6, 2018, she reported that

I’m not doing so well. This is the normal course of the disorder I have—pulmonary arterial hypertension. It is incurable and eventually progresses to heart failure and death. I’m in the heart failure stage now. It’s progressing very slowly, but makes remaining alive not much fun. 

            Because I can’t actually DO anything anymore, it’s a treat to get your mail. I can’t do any more than I’ve already done, but maybe what I’ve already done is enough. Who would have thought that 20 years after its publication, people would still be talking about The Nurture Assumption!

 

Or that The New York Times would replay its message at length, in your well-deserved obituary, Judy.

 

(For David Myers’ other essays on psychological science and everyday life visit www.TalkPsych.com.)

As Pope Francis has said, “Everyone’s existence is deeply tied to that of others.” We are social animals. We need to belong. We flourish when supported by close relationships. Finding a supportive confidante, we feel joy.

 

Longing for acceptance and love, Americans spend $86 billion annually on cosmetics, fragrances, and personal care products—and billions more on clothes, hair styling, and diets. Is that money well spent? Will it help us find and form meaningful relationships?

 

Consider one of social psychology’s most provocative, and simplest, experiments. Cornell University students were asked to don a Barry Manilow T-shirt (at the behest of researcher Thomas Gilovich and colleagues) and were then shown into a room where several others were completing questionnaires. Afterwards they were asked to guess how many of the others noticed their dorky attire. Their estimate? About half. Actually, only 23 percent did.

 

Other experiments confirm this spotlight effect—an overestimation of others’ noticing us, as if a spotlight is shining on us.

 

The phenomenon extends to our secret emotions. Thanks to an illusion of transparency we presume that our attractions, our disgust, and our anxieties leak out and become visible to others. Imagine standing before an audience: If we’re nervous and we know it, will our face surely show it? Not necessarily. Even our lies and our lusts are less transparent than we imagine.

 

There’s bad news here: Others notice us less than we imagine (partly because they are more worried about the impressions they are making).

 

But there’s also good news: Others notice us less than we imagine. And that good news is liberating: A bad hair day hardly matters. And if we wear yesterday’s clothes again today, few will notice. Fewer will care. Of those, fewer still will remember. 

 

If normal day-to-day variations in our appearance are hardly noticed and soon forgotten, what does affect the impressions we make and the relationships we hope to form and sustain?

 

Proximity. Our social ecology matters. We tend to like those nearby—those who sit near us in class, at work, in worship. Our nearest become our dearest as we befriend or marry people who live in the same town, attend the same school, share the same mail room, or visit the same coffee shop. Mere exposure breeds liking. Familiar feels friendly. Customary is comfortable. So look around.

 

Similarity. Hundreds of experiments confirm and reconfirm that likeness leads to liking (and thus the challenge of welcoming the benefits of social diversity). The more similar another’s attitudes, beliefs, interests, politics, income, and on and on, the more disposed we are to like the person and to stay connected. And the more dissimilar another’s attitudes, the greater the odds of disliking.  Opposites retract.

 

If proximity and similarity help bonds form, what can we do to grow and sustain relationships?

 

Equity. One key to relationship endurance is equity, which occurs when friends perceive that they receive in proportion to what they give. When two people share their time and possessions, when they give and receive support in equal measure, and when they care equally about one another, their prospects for long-term friendship or love are bright. This doesn’t mean playing relational ping pong—balancing every invitation with a tit-for-tat response. But over time, each friend or partner invests in the other about as much as he or she receives.

 

Self-disclosure. Relationships also grow closer and stronger as we share our likes and dislikes, our joys and hurts, our dreams and worries. In the dance of friendship or love, one reveals a little and the other reciprocates. And then the first reveals more, and on and on. As the relationship progresses from small talk to things that matter, the increasing self-disclosure can elicit liking, which unleashes further self-disclosure.

 

Mindful of the benefits of equity and mutual self-disclosure, we can monitor our conversations: 

  • Are we listening as much as we are talking?
  • Are we drawing others out as much as we are disclosing about ourselves?

 

In his classic How to Win Friends and Influence People, Dale Carnegie offered kindred advice. To win friends, he advised, “become genuinely interested in other people. . . . You can make more friends in two months by being interested in them, than in two years by making them interested in you.” Thus, “Be a good listener. Encourage others to talk about themselves.”

 

So, looking our best may help a little, initially, though less than we suppose. What matters more is being there for others—focusing on them, encouraging them, supporting them—and enjoying their support in return. Such is the soil that feeds satisfying friendships and enduring love.

 

(For David Myers’ other weekly essays on psychological science and everyday life, visit www.TalkPsych.com)

“I have a gut, and my gut tells me more sometimes than anybody else’s brain can ever tell me,” explained President Trump in stating why he believed Federal Reserve interest rate hikes were a mistake. “My gut has always been right,” he declared again in saying why he needn’t prepare for the recent trade negotiation with China’s president.

 

In trusting his gut intuition, Trump has much company. “Buried deep within each and every one of us, there is an instinctive, heart-felt awareness that provides—if we allow it to—the most reliable guide,” offered Prince Charles. “I’m a gut player. I rely on my instincts,” said President George W. Bush, explaining his decision to launch the Iraq War.

 

Although there is, as I noted in another of these TalkPsych essays, a gut-brain connection, are we right to trust our gut? Does the gut know best about interest rates, trade policy, and climate change? Or, mindful of smart people often doing dumb things, do we instead need more humility, more checking of gut hunches against hard reality, more critical thinking?

 

Drawing from today’s psychological science, one could write a book on both the powers and perils of intuition. (Indeed, I have—see here.) Here, shortened to an elevator speech, is the gist.

 

Intuition’s powers. Cognitive science reveals an unconscious mind—another mind backstage—that Freud never told us about. Much thinking occurs not “on screen” but off screen, out of sight, where reason does not know. Countless studies—of priming, implicit memory, empathic accuracy, thin slice social judgments, creativity, and right hemisphere processing—illustrate our nonrational, intuitive powers. We know more than we know we know. Thanks to our “overlearning” of automatic behaviors, those of us who learned to ride bikes as children can intuitively pedal away on one decades later. And a skilled violinist knows, without thinking, just where to place the bow, at what angle, with what pressure. “In apprehension, how like a god!,” exclaimed Shakespeare’s Hamlet.

 

Intuition’s perils. Other studies—of perceptual illusions, self-serving bias, illusory optimism, illusory correlation, confirmation bias, belief perseverance, the fundamental attribution error, misplaced fears, and the overconfidence phenomenon—confirm what literature and religion have long presumed: the powers and perils of pride. Moreover, these phenomena feed mistaken gut intuitions that produce deficient decisions by clinicians, interviewers, coaches, investors, gamblers, and would-be psychics. “Headpiece filled with straw,” opined T. S. Eliot.

 

Intuition’s failures often are akin to perceptual illusions—rooted in mechanisms that usually serve us well but sometimes lead us astray. Like doctors focused on detecting and treating disease, psychological scientists are skilled at detecting and calling attention to our mind’s predictable errors. They concur with the novelist Madeleine L’Engle’s observation: “The naked intellect is an extraordinarily inaccurate instrument.”

 

The bottom line: our gut intuitions are terrific at some things, such as instantly reading emotions in others’ faces, but fail at others, such as guessing stocks, assessing risks, and predicting climate change. And so psychologists teach about intuition’s perils as well as its powers. We encourage critical thinking. We urge people, before trusting others’ gut intuitions, to ask: “What do you mean?” “How do you know?”

 

As physicist Richard Feynman famously said, “The first principle is that you must not fool yourself, and you are the easiest person to fool.”

 

(For David Myers’ other weekly essays on psychological science and everyday life, visit www.TalkPsych.com)

If you, dear reader, can indulge some slightly geeky calculations, I hope to show you that with daily exercise you can live a substantially longer and happier life. Indeed, per the time invested, exercise will benefit you more than smoking will harm you. Consider:

  • An analysis of mortality data offers this memorable result: For the average person, life is lengthened by about 7 hours for every hour exercised. So (here comes the geek), the World Health Organization recommends exercising 150 minutes = 2.5 hours per week. Multiplied by 7, that equals 17.5 hours longer life for each week of exercise. Over 52 weeks, that sums to 910 hours = 38 days = 1/10th of a year longer life for each year of faithful exercise . . . which, continued over 40 years, would yield ~4 years longer life. (Though, more typically, say the researchers, runners live 3.2 years longer.)
  • In another epidemiological study of over 650,000 American adults, those walking 150 minutes per week lived (voila!) 4 years longer than nonexercisers (Moore et al., 2012).

 

How satisfying to have two independent estimates in the same ballpark!
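
For anyone who wants to verify the geeky arithmetic in the first bullet above, here it is spelled out as a minimal Python sketch (only the 7-to-1 ratio and the 150-minutes-per-week recommendation come from the sources cited; the rest is straightforward unit conversion):

```python
# Reproduce the life-extension arithmetic from the first bullet above.
HOURS_GAINED_PER_HOUR_EXERCISED = 7        # from the mortality analysis cited
WEEKLY_EXERCISE_HOURS = 150 / 60           # WHO recommendation: 150 minutes = 2.5 hours

hours_gained_per_week = WEEKLY_EXERCISE_HOURS * HOURS_GAINED_PER_HOUR_EXERCISED   # 17.5
hours_gained_per_year = hours_gained_per_week * 52                                 # 910
days_gained_per_year = hours_gained_per_year / 24                                  # ~38
years_gained_over_40_years = days_gained_per_year * 40 / 365                       # ~4.2

print(f"{hours_gained_per_week:.1f} h/week -> {days_gained_per_year:.0f} days/year "
      f"-> ~{years_gained_over_40_years:.1f} years over 40 years of exercise")
```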

 

This potential life-extending benefit brings to mind the mirror-image life-shortening costs of smoking, which the Centers for Disease Control reports diminishes life for the average smoker “by at least 10 years.” Thus (geek time again):

  • A person who takes up smoking at age 15, smokes 15 cigarettes per day for 50 years, and dies at 65 instead of 75, will lose roughly 1/5th of a year (equals 73 days = 1752 hours = 105,000 minutes) for each year of smoking. If each cigarette takes 10 minutes to smoke, the minutes spent smoking (54,750 each year) will account for about half of those 105,000 lost minutes.
  • Ergo, nature charges ~2 minutes of shorter life for each minute spent smoking . . . but generously gives a 7-to-1 return for each hour spent exercising. How benevolent! (The arithmetic is spelled out in the sketch below.)
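
Here is the corresponding sketch for the smoking arithmetic, using only the figures quoted in the bullets above:

```python
# Reproduce the smoking arithmetic from the bullets above.
YEARS_SMOKED = 50                 # smoking from age 15 to 65
YEARS_OF_LIFE_LOST = 10           # dying at 65 instead of 75
CIGARETTES_PER_DAY = 15
MINUTES_PER_CIGARETTE = 10

minutes_lost_per_year = YEARS_OF_LIFE_LOST / YEARS_SMOKED * 365 * 24 * 60   # ~105,000
minutes_smoked_per_year = CIGARETTES_PER_DAY * MINUTES_PER_CIGARETTE * 365  # 54,750
cost_ratio = minutes_lost_per_year / minutes_smoked_per_year                # ~2

print(f"~{cost_ratio:.1f} minutes of life lost per minute spent smoking")
```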

 

Massive new epidemiological studies and meta-analyses (statistical digests of all available research) confirm both physical and mental health benefits of exercise (see here, here, and here). A good double goal for those wishing for a long life is: more fitness, less fatness. But evidence suggests that if forced to pick one, go for fitness.

 

As an earlier blog essay documented, exercise entails not only better health but a less depressed and anxious mood, more energy, and stronger relationships. Moreover, clinical trial experiments—with people assigned to exercise or to control conditions—confirm cause and effect: Exercise both treats and protects against depression and anxiety.

 

The evidence is as compelling as evidence gets: Go for a daily jog or swim and you can expect to live longer and live happier. Mens sana in corpore sano: A healthy mind in a healthy body.

 


(For David Myers’ other weekly essays on psychological science and everyday life, visit www.TalkPsych.com)

Sometimes Truth Is Comedy

Posted by David Myers, Nov 29, 2018

As I approach five years of www.TalkPsych.com commentary—which has settled into a weekly Thursday essay—I am tempted (given our now larger audience) to replay an occasional favorite. Here is my second focused essay, which still puts a smile on my face . . . and perhaps yours? (In sneaking humor into texts, I presume that if I can’t have fun writing, then readers likely won’t have fun reading.)

 

From April 6, 2014:

Consider Brett Pelham, Matthew Mirenberg, and John Jones’ 2002 report of wacky associations between people’s names and vocations. Who would have guessed? For example, in the United States, Jerry, Dennis, and Walter are equally popular names (0.42 percent of people carry each of these names). Yet America’s dentists have been almost twice as likely to be named Dennis as Jerry or Walter. Moreover, 2.5 times as many female dentists have been named Denise as the equally popular names Beverly and Tammy. And George or Geoffrey has been overrepresented among geoscientists (geologists, geophysicists, and geochemists).

I thought of that playful research on names recently when reading a paper on black bears’ quantitative competence, co-authored by Michael Beran. Next up in my reading pile was creative work on crows’ problem solving led by Chris Bird. Today I was appreciating interventions for lifting youth out of depression, pioneered by Sally Merry.

That also took my delighted mind to the important books on animal behavior by Robin Fox and Lionel Tiger, and the Birds of North America volume by Chandler Robbins. (One needn’t live in Giggleswick, England, to find humor in our good science.)

The list goes on: billionaire Marc Rich, drummer Billy Drummond, cricketer Peter Bowler, and the Ronald Reagan White House spokesman Larry Speakes. And as a person with hearing loss whose avocational passion is hearing advocacy, I should perhaps acknowledge the irony of my own name, which approximates My-ears.

Internet sources offer lots more: dentists named Dr. E. Z. Filler, Dr. Gargle, and Dr. Toothaker; the Oregon banking firm Cheatham and Steele; and the chorister Justin Tune. But my Twitter feed this week offered a cautionary word about these reported names: “The problem with quotes on the Internet is that you never know if they’re true.” ~ Abraham Lincoln

Perhaps you, too, have some favorite name-vocation associations? I think of my good friend who was anxiously bemused before meeting his oncologist, Dr. Bury. (I am happy to report that, a decade later, he is robustly unburied and has not needed the services of the nearby Posthumus Funeral Home.)

For Pelham and his colleagues there is a serious point to this fun: We all tend to like what we associate with ourselves (a phenomenon they call implicit egotism). We like faces that have features of our own face morphed into them. We like—and have some tendency to live in—cities and states whose names overlap with our own—as in the disproportionate number of people named Jack living in Jacksonville, of Philips in Philadelphia, and of people whose names begin with Tor in Toronto.

Uri Simonsohn isn’t entirely convinced (see here and here, with Pelham’s reply here and here). He replicated the associations between people’s names, occupations, and places but argued that reverse causality sometimes is at work. For example, people sometimes live in places and on streets after which their ancestors were named.

Implicit egotism research continues. In the meantime, we can delight in the occasional playful creativity of psychological science.

P.S. Speaking of dentists (actual ones), my retired Hope College chemistry colleague Don Williams—a person of sparkling wit—offers these photos, taken with his own camera:

And if you need a podiatrist to advise about your foot odor, Williams has found just the person:

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

After elections, people often note unexpected outcomes and then complain that “the polls got it wrong.”

 

After Donald Trump’s stunning 2016 presidential victory, the press gave us articles on “Why the Polls were such a Disaster,” on “4 Possible Reasons the Polls Got It So Wrong,” and on “Why the Polls Missed Their Mark.” Stupid pollsters. “Even a big poll only surveys 1500 people or so out of almost 130 million voters,” we may think, “so no wonder they can’t get it right.”

 

Moreover, consider the many pundits who, believing the polls, confidently predicted a Clinton victory. They were utterly wrong, leaving many folks shocked on election night (some elated, others depressed, with later flashbulb memories of when they realized Trump was winning).

 

So how could the polls, the pundits, and the prediction models have all been so wrong?

 

Or were they? First, we know that in a closely contested race, a representative sample of a mere 1500 people from a 130 million population will—surprisingly to many people—allow us to estimate the population preference within ~3 percent.
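
The ~3 percent figure follows from the standard margin-of-error formula for a simple random sample. Here is the quick calculation (a textbook approximation, not any particular pollster’s method):

```python
# Rough 95-percent-confidence margin of error for a simple random sample:
# margin = 1.96 * sqrt(p * (1 - p) / n)
from math import sqrt

n = 1500          # sample size
p = 0.50          # assumed split in a close race (the worst case for the margin)

margin = 1.96 * sqrt(p * (1 - p) / n)
print(f"Margin of error: ±{margin:.1%}")   # about ±2.5%, i.e. roughly ±3 points
```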

 

Sounds easy. But there’s a challenge: Most randomly contacted voters don’t respond when called. The New York Times’ “Upshot” recently let us view its polling in real time. This enabled us to see, for example, that it took 14,636 calls to Iowa’s fourth congressional district to produce 423 responses, among which Steve King led J. D. Scholten by 5 percent—slightly more than the 3.4 percent by which King won.

 

Pollsters know the likely demographic make-up of the electorate, and so can weight results from respondents of differing age, race, and gender to approximate the population. And that, despite the low response rate, allows them to do remarkably well—especially when we bear in mind that their final polls are taken ahead of the election (and cannot account for last-minute events, which may sway undecided voters). In 2016, the final polling average favored Hillary Clinton by 3.9 percent, with a 3 percent margin of error. On Election Day, she won the popular vote by 2.1 percent (and 2.9 million votes)—well within that margin of error.
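
Here is a toy illustration of that demographic weighting idea: respondents from groups that rarely answer the phone count more heavily, so the weighted sample mirrors the expected electorate. Every number below is invented for illustration.

```python
# Toy sketch of demographic weighting (all numbers invented).
population_share = {"18-44": 0.45, "45+": 0.55}   # expected share of the electorate
sample_share     = {"18-44": 0.25, "45+": 0.75}   # share among those who answered the phone
support_in_group = {"18-44": 0.60, "45+": 0.40}   # candidate support among respondents

raw = sum(sample_share[g] * support_in_group[g] for g in sample_share)
weighted = sum(population_share[g] * support_in_group[g] for g in population_share)

print(f"Unweighted estimate: {raw:.1%}")      # skewed toward the group that answers more
print(f"Weighted estimate:   {weighted:.1%}") # rebalanced to match the electorate
```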

 

To forecast a race, fivethirtyeight.com’s prediction model does more. It “takes lots of polls, performs various types of adjustments to them [based on sample size, recency, and pollster credibility], and then blends them with other kinds of empirically useful indicators” such as past results, expert assessments, and fundraising. Its final 2016 estimate gave Hillary Clinton roughly a 7 in 10 chance of winning.

Ha! This prediction, like other 2016 prediction models, failed.

 

Or did it? Consider a parallel. Imagine that as a basketball free-throw shooter steps to the line, I tell you that the shooter has a 71 percent free-throw average. If the shooter misses, would you disbelieve the projection? No, because, if what I’ve told you is an accurate projection, you should expect to see a miss 29 percent of the time. If the player virtually never missed, then you’d rightly doubt my data.

 

Likewise, if, when Nate Silver’s fivethirtyeight.com gives a candidate a 7 in 10 chance of winning and that candidate always wins, then the model is, indeed, badly flawed. Yes?
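
One standard way to judge such probabilistic forecasts is calibration: across many races where the model said “7 in 10,” the favored candidate should win about 70 percent of the time, and lose the rest. A tiny sketch, with invented forecasts and outcomes:

```python
# Calibration check: among races given ~70% win probability, did the favorite
# win about 70% of the time? Forecasts and outcomes below are invented.
forecasts = [0.71, 0.68, 0.72, 0.70, 0.69, 0.73, 0.70, 0.71, 0.68, 0.72]
outcomes  = [1,    1,    0,    1,    1,    1,    0,    1,    0,    1]    # 1 = favorite won

mean_forecast = sum(forecasts) / len(forecasts)
win_rate = sum(outcomes) / len(outcomes)
print(f"Forecast: {mean_forecast:.0%} | Observed: {win_rate:.0%}")
# A well-calibrated "7 in 10" forecast should miss roughly 3 times in 10.
```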

 

In the 2018 U.S. Congressional races, fivethirtyeight.com correctly predicted 96 percent of the outcomes. On the surface, that may look like a better result, but it’s mainly because most races were in solid Blue or Red districts and not seriously contested.

 

Ergo, don’t be too quick to demean the quality polls and the prediction models they inform. Survey science still works.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

Science Marches On

Posted by David Myers, Nov 15, 2018

This week I interrupt our weekly focus on psychology’s big ideas and new findings to update three prior essays.

 

Loss aversion in sports. A recent essay described how, in sports (as in other realms of life), our fear of losing can rob us of chances to win:

  • In baseball, a mountain of data shows that runners on first base will rarely take off running on a fly ball that has any chance of being caught. But their aversion to being thrown out leads to fewer runs and wins.
  • And in  basketball, teams trailing by 2 points at a game’s end typically prefer a 2-point shot attempt, hoping to avert a loss and send the game into overtime (where half the time they will lose), over a 3-point attempt for victory—even in situations where the odds favor the latter. New Cornell/University of Chicago studies of “myopic loss aversion” confirm this irrational preference for loss-averting 2-point shots at the end of National Basketball Association games.
  • Moreover, those same studies  extend the phenomenon to National Football League games, where teams prefer to kick a tying extra point in situations where a 2-point conversion makes a win more likely (as when down by two points late in the third quarter—see also here). Caution often thwarts triumph.

 

Gratitude gratifies. An essay last spring testified to the positive power of expressing gratitude, which increases well-being and prosociality. In new experiments, Amit Kumar and Nicholas Epley found that people who wrote gratitude letters “significantly underestimated how surprised recipients would be about why expressers were grateful, overestimated how awkward recipients would feel, and underestimated how positive recipients would feel.”

 

Our unexpected personal thank you notes are more heartwarming for their recipients than we appreciate. (Is there someone whose support or example has impacted your life, who would be gratified to know that?)

 

The net effect. A May 2016 essay discussed research on how, in the smartphone age, “compulsive technology use not only drains time from eyeball-to-eyeball conversation but also predicts poorer course performance.” Since then, my friend (and co-author on the new Social Psychology, 13th Edition) Jean Twenge has enriched the public understanding of social media effects in her new book, iGen, and in associated media appearances. (For an excellent synopsis, see her Atlantic article.)

As she documents, the adoption of smartphones is echoed by increases in teen loneliness, depression, and suicide, and by decreases in sleep and face-to-face interactions (though also in less drinking, sex, and car accidents). Jean also continues to mine data, such as from an annual survey of American teens in a new Emotion study with Gabrielle Martin and Keith Campbell. They reconfirmed that a dip in adolescent well-being has precisely coincided with an increase in screen time (on social media, the Internet, texting, and gaming). Moreover, across individuals, more than 10 screen-time hours per week predicts less teen happiness.

 

Ergo, a task for teachers is to inform students about these trends and invite discussion about how students might apply them in their own peer culture. In a recent APS Observer essay, I suggested this might also be a good class activity:

  • Invite students to guess how often they check their phone each day, and how many minutes they average on it.
  • Have them download a free screen-time tracker app, such as Moment for the iPhone or QualityTime for the Android.
  • Have them add up their actual total screen time for the prior week and divide by 7 to compute their daily average.
  • Then ask them, “Did you underestimate your actual smartphone use?”

The results may surprise them. In two recent studies, university students greatly underestimated their frequency of phone checking and time on screen. As Steven Pinker has noted, “The solution is not to bemoan technology but to develop strategies of self-control, as we do with every other temptation in life.”

In this time of political passion, those of us who are instructors and/or text authors may agonize over whether to engage contentious public issues, such as the Kavanaugh Supreme Court nomination, fears of immigrant crime, or the possible social toxicity of presidential rhetoric.

 

My assumption is that—given our focus on education and our respect for our students’ diversity—classrooms and textbooks should not be political bully pulpits. There are more appropriate venues for advocating our own political views.

 

But that needn’t preclude our seeking to inform public dialogue, by offering pertinent evidence. For example, in a recent essay, I drew on memory science to report the tunnel-vision nature of emotion-laden memories, as perhaps illustrated when Christine Blasey Ford recalled being sexually assaulted without remembering peripheral details—just what we would expect from an authentic memory. And I indicated how state-dependent memory phenomena could help explain why Brett Kavanaugh might be sincere in having no memory for the same event. But I stopped short of expressing an opinion about whether he should have been confirmed.

 

Other essays have also offered information pertinent to heated political debates:

  • Trade policies. While politicians and economists debate the economic merits of free trade versus trade-restricting tariffs, social psychologists have noted that economic interdependence and cooperation enhance the odds for sustained peace (here).
  • Fear of immigrants. Recent political rhetoric focusing attention on the “caravan” of Central Americans desperate to enter Mexico and the U.S. has again raised fears of immigrant crime. Recent TalkPsych essays (here and here) offered data on actual immigrant crime rates in the United States, and on who in the U.S. and Germany most fears immigrants (ironically, those who have little contact with them). Gallup data from 139 countries confirms higher migrant acceptance among those who know migrants. Teachers can offer such evidence without advocating either party’s border policy (yes?).
  • Presidential rhetoric and public attitudes. Recent essays in The New York Times (here and here) and The Washington Post (here and here) assume that President Trump’s derision of his political opponents and of the press creates a toxic social environment that seeps down into his followers’ attitudes and actions. Pertinent to these concerns, my earlier essays wondered whether the President was “simply giving a voice” to widely held attitudes, or instead was legitimizing such attitudes and thereby increasing bullying. I later offered new evidence that hatemongering from high places does indeed desensitize people to such rhetoric and increase expressions of prejudice. Can teachers offer such evidence without being partisan?

 

Be assured, psychological science is not intrinsically liberal or conservative. Its evidence sometimes lends weight to progressive thinking (say about sexual orientation as a natural, enduring disposition) and sometimes to conservative thinking (for example, about the benefits of co-parenting and stable close relationships such as marriage). As I conclude in an upcoming teaching column for the Association for Psychological Science Observer, “psychology aims not to advance liberal or conservative thinking per se, but to let evidence inform our thinking. And for us teachers of psychology that, no matter our political identities, is perhaps the biggest lesson of all.”

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

Hate-fueled pipe bombs target Democrats. Two African Americans are gunned down in a grocery store. An anti-Semite slaughters synagogue worshippers. Political leaders denigrate and despise their opponents. In National Election Studies surveys, the percentage of Republicans and Democrats who “hate” the other party has soared, for both sides—from 20 percent in 2000, to near 50 percent in 2016. (Let’s make it personal: Would you want your child to marry a devotee of the other party?)

 

Hostilities are poisoning the culture, and many Americans are wondering: How can we, as individuals and as a culture, turn a corner? Amid animosities fed by groundless fears, fact-free ignorance, and repeated (then believed) big lies, how can we embrace our common humanity and shared goals?

 

As we social psychologists remind folks, conflicts lessen through contact, cooperation, and communication. Personal contact with equal-status others helps (it’s not just what you know, but who you know). Cooperative striving for shared superordinate goals—those that require the cooperation of two or more people—fosters unity (it even helps to have a common enemy). Ditto guided communication (an aim of www.Better-Angels.org, which brings together “Reds” and “Blues” to understand each other’s concerns and to discover their overlapping aspirations).

 

And might we, individually and as a culture, also benefit by teaching and modeling an outlook that encompasses three virtues: conviction, humility, and love?

 

Our convictions define what matters. We anchor ourselves in core beliefs and values that guide our lives. Our convictions motivate our advocacy for a better world. They give us courage to speak and act. “We must always take sides,” said Holocaust survivor Elie Wiesel. “Silence encourages the tormentor, never the tormented.” “To be silent is to be complicit,” adds Dead Man Walking author Sister Helen Prejean.

 

But convictions need restraining with humility, a virtue that lies at the heart of science for theists and nontheists alike. Those of us who are theists, of whatever faith tradition, share two convictions:

  1. There is a God.
  2. It’s not me (or you).

Ergo, we are fallible. The surest conviction we can have is that some of our beliefs err. From this follows the religious virtue of humility (alas, a virtue more often preached than practiced). A spirit of humility seasons conviction with open-minded curiosity. It tempers faith with uncertainty (faith without humility is fanaticism). It subjects our opinions to evidence and enables good science. It tells me that every person I meet is, in some way, my superior . . . providing an opportunity to learn.

 

The triangle of virtues within which we can aspire to live is completed when conviction, restrained by humility, is directed by love. In his great sermon on love, Martin Luther King, Jr. quoted Jesus: “Love your enemies, bless them that curse you, do good to them that hate you.” Doing that, he said, does not compel us to like our enemies, but does compel us “to discover the element of good” in them. By contrast, “hate only intensifies the existence of hate and evil in the universe,” he added. “If I hit you and you hit me and I hit you back and you hit me back and go on, you see, that goes on ad infinitum. It just never ends. . . . Hate destroys the hater as well as the hated.”

 

Is this not a vision of a good life that will enable a flourishing culture . . . a life that is animated by deep convictions, which are refined in humility and applied with love?

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

Psychological science delights us with its occasional surprises. For example, who would have imagined that

  • electroconvulsive therapy—shocking the brain into mild convulsions—would often be an effective antidote to otherwise intractable depression?
  • massive losses in brain tissue early in life could have minimal later effects?
  • siblings’ shared home environment would have such a small effect on their later traits?
  • after brain damage, a person may learn new skills yet be unaware of such?
  • visual information is deconstructed into distinct components (motion, form, depth, and color), processed by distinct brain regions, and then reintegrated into a perceived whole?

 

The latest who-would-have-believed-it finding is that the microbiology of the gut may influence the moods of the brain. Digestive-system bacteria reportedly influence human emotions and even social interactions, perhaps by producing neurotransmitters. Moreover, we are told (such as here and here), healthy gut microbes can reduce anxiety, depression, and PTSD.

 

New articles on this supposedly “revolutionary” and “paradigm-shifting” microbiota-gut-brain (MGB) research are accumulating, report Katarzyna Hooks, Jan Pieter Konsman, and Maureen O’Malley in a forthcoming (yet-to-be-edited) review. By comparing rodents or humans with or without intestinal microbiota, researchers have indeed found “suggestive” effects on how organisms respond to stress and display emotions. Some researchers are exploring microbiota-related interventions (such as with probiotics versus placebos) as a possible treatment for depression, anxiety, and anorexia nervosa.

 

The findings are intriguing and worth pursuing but haven’t yet definitively demonstrated “the impact of the microbiota itself on behavior,” say Hooks, Konsman, and O’Malley. Nevertheless, the popular press, sometimes aided by university press offices, has hyped the research in more than 300 articles. People love the news of this research, say Hooks et al., because it lends hope that a natural, healthy diet can provide a simple DIY solution to troubling emotions.

 

Reading this analysis triggers déjà vu. This cycle of (a) an intriguing finding, followed by (b) hype, followed by (c) reassessment, is an occasional feature of our science’s history. Mind-blowing experiments on people with split brains yielded (a) believe-it-or-not findings, leading to (b) overstated claims about left-brained and right-brained people, which (c) finally settled into a more mature understanding of how distinct brain areas function as a whole integrated system.

 

Despite the “large helpings of overinterpretation” and the overselling of “currently limited findings,” the Hooks team encourages researchers to press on. “We see MGB research as a field full of promise, with important implications for understanding the relationship between the brain and the rest of the body.” The body (brain included) is one whole system.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

As I finished a recent presentation, “Thinking Smart in a Post-Truth Age,” a questioner’s hand shot up: “I understand the need to think with our heads as well as our hearts, by considering the evidence. But how can I persuade people such as the climate-change-denying folks meeting in my town next week?”

 

I responded by commending a gentle conversation that searched for common values. I also noted that advocates for any cause are wise to not focus on immovable folks with extreme views, but on the uncertain middle—the folks whose votes sway elections and shape history.

 

I should also have mentioned the consistent finding of nine new studies by University of Cologne psychologists Joris Lammers and Matt Baldwin: Folks will often agree with positions that are linked to their own yearnings. For example, might conservatives who tend to yearn for yesteryear’s good old days respond to messages that appeal to nostalgia? Indeed, say Lammers and Baldwin, that was the successful assumption of Donald Trump’s “Make America Great Again” message.

 

But the same appeal to nostalgia can also promote progressive ideas, they report. For example, liberals were much more supportive than conservatives of a future-focused gun-control message: “I would prefer to make a change, so that in the future people may own hunting rifles and pistols, but no one will have assault rifles.” When the researchers framed the same message with a past-focus: “I would like to go back to the good old days, when people may have owned hunting rifles and pistols, but no one had assault rifles,” conservatives pretty much agreed with liberals.

 

Likewise, contemporary Germans on the left and right expressed much less disagreement about an immigration message when it focused on their country’s past history of welcoming immigrants.

 

In earlier research, Lammers and Baldwin also found conservatives more open to nostalgia-focused environmental appeals—to, for example, donating money to a charity focused on restoring yesterday’s healthy Earth, rather than a charity focused on preventing future environmental damage. “Make Earth Great Again.”

 

Ergo, I now realize I should have encouraged my questioner to market her message to her audience. If it’s a political message pitched by conservatives at liberals, it’s fine to focus on making a better future. But if she is appealing to conservatives, then she might take a back-to-the-future approach: Frame her message as support for the way things used to be.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

I’m often asked: “What is your favorite introductory psych chapter?” I reply that, when starting to write my text, I presumed that Sensation-Perception would be the dullest topic. Instead, I’ve found it to be the most fascinating. I’m awestruck by the intricate process by which we take in information, transform it into nerve impulses, distribute it to different parts of our brain, and then reassemble that information into colorful sights, rich sounds, and evocative smells. Who could have imagined? We are, as the Psalmist said, “wonderfully made.”

 

And then there are the weird and wonderful perceptual phenomena, among which is our surprising blindness to things right in front of our eyes. In various demonstrations of inattentional blindness, people who are focused on a task (such as talking on a phone or counting the number of times black-shirted people pass a ball) often fail to notice someone sauntering through the scene—a woman with an umbrella, in one experiment, or even a person in a gorilla suit or a clown on a unicycle.

 

 

As a Chinese tour guide wrote to a friend of mine (after people failed to notice something my friend had seen):

 

This looking-without-seeing phenomenon illustrates a deep truth: Our attention is powerfully selective. Conscious awareness resides in one place at a time.

 

Selective inattention restrains other senses, too. Inattentional deafness is easily demonstrated with dichotic listening tasks. For example, if people are fed novel tunes into one ear, while focused on to-be-repeated-out-loud words fed into the other ear, they will later be unable to identify what tune they have heard. (Thanks to the mere exposure effect, they will, however, later like it best.) Or, in an acoustic replication of the famed invisible gorilla study, Polly Dalton and Nick Fraenkel found that people focusing on a conversation between two women (rather than on two men also talking) usually failed to notice one of the men repeatedly saying “I am a gorilla.”

 

Now, in a new British experiment, we also have evidence of inattentional numbness. Pickpockets have long understood that bumping into people makes them unlikely to notice a hand slipping into their pocket. Dalton (working with Sandra Murphy) experimented with this tactile inattention:  Sure enough, when distracted, their participants failed to notice an otherwise easily-noticed vibration to their hand.

 

Tactile inattention sometimes works to our benefit. I once, while driving to give a talk, experienced a painful sting in my eye (from a torn contact lens) . . . then experienced no pain while giving the talk . . . then felt the excruciating pain again on the drive home. In clinical settings, such as with patients receiving burn treatments, distraction can similarly make painful procedures tolerable. Pain is most keenly felt when attended to.

 

Another British experiment, by Charles Spence and Sophie Forster, demonstrated inattentional anosmia (your new word for the day?)—an inability to smell. When people focused on a cognitively demanding task, they became unlikely to notice a coffee scent in the room.

So what’s next? Can we expect a demonstration of inattentional ageusia—inability to taste? (That’s my new word for the day.) Surely, given our powers of attention (and corresponding inattention), we should expect such.

 

Like a flashlight beam, our mind’s selective attention focuses at any moment on only a small slice of our experience—a phenomenon most drivers underestimate when distracted by phone texting or conversation. However, there’s good news: With our attention riveted on a task, we’re productive and even creative. Our attention is a wonderful gift, given to one thing at a time.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

Nearly two-thirds of Americans, reports a recent PLOS One article, agree that “I am more intelligent than the average person.”

 

This self-serving bias—on which I have been reporting for four decades (starting here)—is one of psychology’s most robust and reliable phenomena. Indeed, on most subjective, socially desirable dimensions, most of us see ourselves as better-than-average . . . as smarter, more ethical, more vocationally competent, more charitable, more unprejudiced, friendlier, healthier, and more likely to outlive our peers—which calls to mind Freud’s joke about the husband who told his wife, “If one of us dies, I shall move to Paris.”

 

My own long-ago interest in self-serving bias was triggered by noticing a result buried in a College Board survey of 829,000 high school seniors. In rating themselves on their “ability to get along with others,” 0 percent viewed themselves as below average. But a full 85 percent saw themselves as better than average: 60 percent in the top 10 percent, and 25 percent in the top 1 percent.

 

As Shelley Taylor wrote in Positive Illusions, “The [self-]portraits that we actually believe, when we are given freedom to voice them, are dramatically more positive than reality can sustain.” Dave Barry recognized the phenomenon: “The one thing that unites all human beings, regardless of age, gender, religion, economic status, or ethnic background is that deep down inside, we all believe that we are above average drivers.”

 

Self-serving bias also takes a second form—our tendency to accept more responsibility for our successes than our failures, for our victories than our defeats, and for our good deeds than our bad. In experiments, people readily attribute their presumed successes to their ability and effort, their failures to bad luck or an impossible task. A Scrabble win reflects our verbal dexterity. A loss? Our bad luck in drawing a Q but no U.

 

Perceiving ourselves, our actions, and our groups favorably does much good. It protects us against depression, buffers stress, and feeds our hopes. Yet psychological science joins literature and religion in reminding us of the perils of pride. Hubris often goes before a fall. Self-serving perceptions and self-justifying explanations breed marital conflict, bargaining impasses, racism, sexism, nationalism, and war.

 

Being mindful of self-serving bias needn’t lead to false modesty—for example, smart people thinking they are dim-witted. But it can encourage a humility that recognizes our own virtues and abilities while equally acknowledging those of our neighbors. True humility leaves us free to embrace our special talents and similarly to celebrate those of others.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

“She [Professor Christine Blasey Ford] can’t tell us how she got home and how she got there,” scorned Senator Lindsey Graham during the lunch break of yesterday’s riveting U. S. Senate Judiciary Committee hearing regarding Ford’s memory of being assaulted by Supreme Court nominee Brett Kavanaugh. Graham’s assumption, widely voiced by fellow skeptics of Ford’s testimony, is that her inability to remember simple peripheral details discounts the authenticity of her assault memory.

 

But Graham and the other skeptics fail to understand, first, how extreme emotions signal the brain to “save this!” for future reference. (Likely you, too, have enduring “flashbulb memories” for long-ago emotional experiences?) And second, they fail to understand that peripheral details typically fall into oblivion. In Psychology, 12th Edition, Nathan DeWall and I explain:

 

Our emotions trigger stress hormones that influence memory formation. When we are excited or stressed, these hormones make more glucose energy available to fuel brain activity, signaling the brain that something important is happening. Moreover, stress hormones focus memory. Stress provokes the amygdala (two limbic system, emotion processing clusters) to initiate a memory trace that boosts activity in the brain’s memory-forming areas (Buchanan, 2007; Kensinger, 2007) (FIGURE 8.9). It’s as if the amygdala says, “Brain, encode this moment for future reference!” The result? Emotional arousal can sear certain events into the brain, while disrupting memory for irrelevant events (Brewin et al., 2007; McGaugh, 2015).

 

Significantly stressful events can form almost unforgettable memories. After a traumatic experience—a school shooting, a house fire, a rape—vivid recollections of the horrific event may intrude again and again. It is as if they were burned in: “Stronger emotional experiences make for stronger, more reliable memories,” noted James McGaugh (1994, 2003). Such experiences even strengthen recall for relevant, immediately preceding events [such as going up the stairway and into the bedroom, in Ford’s case] (Dunsmoor et al., 2015; Jobson & Cheraghi, 2016). This makes adaptive sense: Memory serves to predict the future and to alert us to potential dangers. Emotional events produce tunnel-vision memory. They focus our attention and recall on high priority information, and reduce our recall of irrelevant details (Mather & Sutherland, 2012). Whatever rivets our attention gets well recalled, at the expense of the surrounding context.

 

And as I suggested in last week’s essay, Graham and others seem not to understand “state-dependent memory”—that what people experience in one state (such as when drunk) they may not remember in another state (sober). Nor are Kavanaugh’s supporters recognizing that heavy drinking disrupts memory formation, especially for an experience that would not have been traumatic for him. Thus, Kavanaugh could be sincerely honest in not recalling an assaultive behavior, but also, possibly, sincerely wrong.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit www.TalkPsych.com.)