
Democracy presumes civic wisdom. When voters grasp truth, when facts prevail over misinformation, prudence prevails. When the electorate understands what actually advances (and threatens) human flourishing, it can inaugurate sensible policies and elect benevolent leaders. The collective wisdom of the cognizant is more astute than an autocrat’s whims.

 

Alas, as the late Hans Rosling amply documents in Factfulness, too often the crowd is unwise. Ignorance reigns. Even with this forewarning, consider:

  • What percent of the world’s 1-year-olds have had a vaccination?
  • What percent of humanity lives in extreme poverty (<$2/day)?
  • What percent of humanity is literate (able to read and write)?

 

The factual answers—86 percent, 9 percent, and 86 percent, respectively—differ radically from Americans’ perceptions. Their vaccination estimate: 35 percent. And though extreme poverty has plummeted and literacy has soared, most don’t know that. More than people suppose, world health, education, and prosperity have improved (as Steven Pinker further documents in Enlightenment Now).

 

Such public ignorance—compounded by the overconfidence phenomenon (people’s tendency to be more confident than correct)—often undermines civic wisdom. When year after year 7 in 10 adults tell Gallup there has been more crime than in the prior year—despite plummeting violent and property crime rates—then fear-mongering politicians may triumph. Our ignorance matters when horrific but infinitesimally rare incidents of domestic terrorism, school shootings, and air crashes hijack our consciousness. We and our children will not only disproportionately fear the wrong things, we will then risk more lives by extreme public spending to avoid these frightening things—to, say, block the “vicious predators and bloodthirsty killers” supposedly pouring across our southern border, rather than to mitigate climate change and more extreme weather.

 

In the aftermath of anti-immigrant fear-stoking (“They’re bringing drugs. They’re bringing crime. They’re rapists.”), many people do fear immigrants. Americans are, reports Gallup, “five times more likely to say immigrants make [crime] worse rather than better (45% to 9%, respectively).” Roused by anecdotes of vicious immigrant crime, “Build the wall!” becomes a rallying cry—despite, as the conservative Cato Institute freshly documents, a lower crime rate among immigrants than among native-born Americans.

 

 

And what do you think: Is eating genetically modified (GM) food safe? “Yes,” say 37 percent of U.S. adults and 88 percent of American Association for the Advancement of Science members. Moreover, the people most opposed to GM foods are (according to a new study) those who are most ignorant about them.

 

As the famed Dunning-Kruger effect reminds us, ignorance and incompetence can, ironically, feed overconfidence. Ignorant of my ignorance—and thus prone to a smug overconfidence—I am blissfully unaware of all the possible Scrabble words I fail to see . . . which enables me to think myself verbally adept. We are, as Daniel Kahneman has said, often “blind to our blindness.”

 

The result is sometimes a theater of the absurd. A December 2015 Public Policy Polling survey asked Donald Trump supporters if they favored or opposed bombing Agrabah. Among the half with an opinion, there was 4 to 1 support (41 percent to 9 percent) for dropping bombs on Agrabah . . . the fictional country from Aladdin.

 

But ignorance needn’t be permanent. Education can train us to recognize how errors and biases creep into our thinking. Education also makes us less gullible—less vulnerable to belief in conspiracy theories. Teach people to think critically—with a mix of open-minded curiosity, evidence-seeking skepticism, and intellectual humility—and they will think . . . and vote . . . smarter. Ignorance matters. But education works.

 

(For David Myers’ other essays on psychological science and everyday life visit TalkPsych.com.)

Time and again I am struck by two robust social science findings.

 

The first, to which social conservatives nod their appreciation, concerns the benefits of successful marriages—which are a substantial predictor of health, longevity, personal happiness, and the well-being of children. An example: As I documented here, U.S. Child Health Surveys have shown that children living with two parents have been half as likely as those living with a never or formerly married mother to have been suspended or expelled from school—even after controlling for variations in race, family size, and parental education and income. To be sure, most single-parented children thrive, and many co-parented children are dysfunctional. Yet show me a place where nearly all children are co-parented by two adults enduringly committed to each other and their children and I will show you a place with relatively low rates of psychological disorder and social pathology. Marriage matters.

 

The second, to which progressives nod their appreciation, is that economic inequality is socially toxic. Places with great inequality have more social pathology—higher rates of crime, anxiety, obesity, and drug use, and lower life expectancy and happiness (see here and here).  Show me a place with great inequality and I will show you a place with a comparatively depressed and dissatisfied populace. Disparity dispirits.

 

Moreover, argues Johns Hopkins University sociology chair Andrew Cherlin, there is a path between these two oft-confirmed findings: Rising income inequality contributes to family dissolution. As the gap between rich and poor has widened, unstable cohabitations and nonmarital childbearing have dramatically increased among those with lower incomes—or where men have dim job prospects. In deteriorating job markets, marriage wanes and families become less stable. And for working single parents, affordable quality child care may be out of reach.

 

Ergo, doesn’t it follow that those who support marriage and stable co-parenting (a typically conservative value) should also be economic progressives—concerned about reducing inequality and poverty? To envision a culture that welcomes children into families with two or more people who love them is to envision an economic environment that nurtures secure families.

 

What do you think: Might this vision of a family-supportive just economy be a meeting place between conservatism and progressivism? And might it be a basis for depolarizing our politics and unifying our aspirations?

 

A glimmer of hope: After writing this essay, I learned of Fox News conservative voice Tucker Carlson’s recent lament that “families are being crushed by market forces” . . . to which Dean Baker of the progressive Center for Economic and Policy Research replied: “It’s a bit scary to me how much of this I agree with.”

 

(For David Myers’ other essays on psychological science and everyday life visit TalkPsych.com.)

Psychology’s archives are filled with well-meaning, well-funded endeavors that were meant to change lives for the better but that—alas—made no difference.

 

In one huge study, 500 Massachusetts boys deemed at risk for delinquency were, by the toss of a coin, assigned either to a no-intervention control condition or to a 5-year treatment program. In addition to twice-a-month visits from counselors, the boys in the treatment program received academic tutoring, medical attention, and family assistance and were involved in community programs, such as the Boy Scouts. When Joan McCord located 97 percent of the participants some 30 years later, many offered glowing testimonials: Were it not for the program, “I would probably be in jail”; “My life would have gone the other way”; or “I think I would have ended up in a life of crime.” Indeed, even among “difficult” predelinquent boys, 66 percent developed no juvenile crime record.

 

But the same was true of their control counterparts—70 percent of whom had no juvenile record. Alas, the glowing testimonials had been unintentionally deceiving. The program had no beneficial effect.

 

More recently, other endeavors—the national Scared Straight program to tame teenage violence, the police-promoted D.A.R.E. anti-drug effort, Critical Incident Debriefing for trauma victims, and numerous weight-reduction, pedophile rehabilitation, and sexual reorientation efforts—have also been found ineffectual or even harmful.

 

Is this because genetic influences fix our traits—minimizing our malleability? (Think of the dozens of identical twins who, though raised separately, are still amazingly similar.) To be sure, genes do matter. The most comprehensive review of twin studies—more than 3,000 of them, encompassing 14.6 million twins—found that “across all traits the reported heritability [individual differences attributable to genes] is 49 percent.” That is substantial, yet it leaves room for willpower, beliefs, and social influence as well. Body weight, for example, is genetically influenced, but diet and exercise also matter.

 

Given the guiding power of our heredity and the failure of many large-scale efforts to help people to flourish, I am stunned by the successes of brief “wise interventions”—“wise” in the sense of being savvy about how our beliefs and assumptions influence us, and “stunned” that a 1-hour intervention sometimes outperforms a 5-year intervention.

 

Two leading researchers, Gregory Walton and Timothy Wilson, recently reviewed 325 interventions. Their conclusion: Helping people reframe the meaning of their experiences can promote their long-term flourishing. As Walton explains at www.wiseinterventions.org, “Wise interventions focus on the meanings and inferences people draw about themselves, other people, or a situation they are in.” Three examples:

  • At-risk middle school students given a “growth mindset”—being taught that the brain, like a muscle, grows with use—achieved better grades because they “saw effort as a virtue, because effort helps to develop ability.”
  • Entering minority college students who experienced a 1-hour session explaining the normality of the worry that they didn’t belong (with reassuring stories from older peers) achieved higher grades over the next 3 years—and greater life and career satisfaction after college.
  • A paraprofessional’s helping at-risk new mothers understand their baby’s fussing reduced the moms’ deciding they were bad mothers—and reduced first-year child abuse from 23 percent to 4 percent.

 

Thus, conclude Walton and Wilson, “exercises that seem minor can be transformational” when individuals address “a pressing psychological question, such as whether they belong at school, whether a romantic partner loves them, whether they can improve in math, whether they are a ‘bad mom,’ or whether groups can change in an ongoing conflict.”

 

So, genes matter. But we are all a mix of nature and nurture, of biology and beliefs. And that is why wisely changing people’s interpretations of their experiences and situations can support their flourishing.

 

(For David Myers’ other essays on psychological science and everyday life visit TalkPsych.com.)

This www.TalkPsych.com entry offers three news flashes—samples of research that have captured my attention (and may wend their way into future textbook editions).

 

NEWS FLASH #1: Intergroup contact makes us “less inward looking and more open to experiences.” As any social psychology student knows, friendly contact with other sorts of folks engenders positive attitudes. For example, as an earlier TalkPsych essay documented, regions with more immigrants have more welcoming, positive attitudes toward immigrants. Places without immigrants fear them the most.

 

But intergroup contact does more than improve our attitudes toward others. Research by Brock University psychologist Gordon Hodson and his British colleagues reveals that intergroup contact affects our thinking—it loosens us up, promoting cognitive flexibility, novel problem solving, and increased creativity. This observation complements earlier research that demonstrated, after controlling for other factors, that students who studied in another culture became more flexibly adept at creative problem solving (see here and here).

 

NEWS FLASH #2:

More than we suppose, other people like us. Do you sometimes worry that people you’ve just met don’t like you very much? Actually, recent studies by Cornell University researcher Erica Boothby and her colleagues found that people rate new conversational partners as more enjoyable and likeable than the new partner presumes. Despite our shared self-serving bias (the tendency to overestimate our own knowledge, abilities, and virtues), we tend to underestimate the impressions we make on others. Moreover, the shyer the person, the bigger the liking gap—the underestimate of others’ liking of us.

 

Ergo, the next time you fret over whether you were too quiet, too chatty, or too wrinkled and rumpled, be reassured: Others probably liked you more than you realize.

 

NEWS FLASH #3:

The youngest children in a school class are more likely to be diagnosed with ADHD. The current psychiatric disorder manual broadens the criteria for diagnosing attention-deficit/hyperactivity disorder (ADHD), thus increasing the number of children so diagnosed. Some say the diagnosis enables helpful treatment and improved functioning. Skeptics say the broadened criteria pathologize immature rambunctiousness, especially among boys—whom evolution has not designed to sit passively at school desks.

 

Support for the skeptics comes from a New England Journal of Medicine study that followed 407,846 U.S. children from birth to elementary school. ADHD diagnoses were a stunning 34 percent higher among those born in August in states with a September 1 cutoff for school entry—but not higher among children in states with other cutoff dates. This massive study confirms earlier reports (here and here) that the youngest children in a class tend to be more fidgety—and more often diagnosed with ADHD—than their older peers.

 

Such findings illustrate why I feel privileged to be gifted with the time, and the responsibility, to learn something new most every day. For me, the primary job of writing is not making words march up a screen, but reading and reading, searching for insights—for gems amid the rocks—that educated people should know about.

 

(For David Myers’ other essays on psychological science and everyday life visit www.TalkPsych.com.)

At long last, artificial intelligence (AI)—and its main subset, machine learning—is beginning to fulfill its promise. When fed massive amounts of data, computers can discern patterns (as in speech recognition) and make predictions or decisions. AlphaZero, a Google-related computer system, started playing chess, shogi (Japanese chess), and Go against itself. Before long, thanks to machine learning, AlphaZero progressed from no knowledge of each game to “the best player, human or computer, the world has ever seen.”

 


 

I’ve had recent opportunities to witness the growing excitement about machine learning in the human future, through conversations with

  • Adrian Weller (a Cambridge University scholar who is program director for the UK’s national institute for data science and AI).
  • Andrew Briggs (Oxford’s Professor of Nanomaterials, who is using machine learning to direct his quantum computing experiments and, like Weller, is pondering what machine learning portends for human flourishing).
  • Brian Odegaard (a UCLA post-doc psychologist who uses machine learning to identify brain networks that underlie human consciousness and perception).

 

Two new medical ventures (to which—full disclosure—my family foundation has given investment support) illustrate machine learning’s potential:

  • Fifth Eye, a University of Michigan spinoff, has had computers mine data on millions of heartbeats from critically ill hospital patients—to identify invisible, nuanced signs of deterioration. By detecting patterns that predict patient crashes, the system aims to provide a potentially life-saving early warning system (well ahead of doctors or nurses detecting anything amiss).
  • Delphinus, which offers a new ultrasound alternative to mammography, will similarly use machine learning from thousands of breast scans to help radiologists spot potential cancers.

 

Other machine-learning diagnostic systems are helping physicians to identify strokes, retinal pathology, and (using sensors and language predictors) the risk of depression or suicide. Machine learning of locked-in ALS patients’ brain wave patterns associated with “Yes” and “No” answers has enabled them to communicate their thoughts and feelings. And it is enabling researchers to translate brain activity into speech.

 

Consider, too, a new Pew Research Center study of gender representation in Google images. Pew researchers first harvested an archive of 26,981 gender-labeled human faces from different countries and ethnic groups. They fed 80 percent of these images into a computer, which used machine learning to discriminate male and female faces. When tested on the other 20 percent, the system achieved 95 percent accuracy.
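
For readers curious what that 80/20 evaluation looks like in practice, here is a minimal Python sketch. It is not Pew’s actual pipeline: the features and labels below are synthetic stand-ins, and scikit-learn’s logistic regression stands in for whatever classifier Pew used; only the archive size and the 80/20 split come from the study described above.

```python
# A generic holdout evaluation, in the spirit of Pew's 80/20 test.
# Features and labels are synthetic stand-ins, NOT the Pew face archive.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 26_981                                   # size of the archive quoted above
X = rng.normal(size=(n, 128))                # stand-in for image-derived face features
w = rng.normal(size=128)                     # an arbitrary "true" decision rule
y = (X @ w + rng.normal(scale=4.0, size=n) > 0).astype(int)   # stand-in gender labels

# Train on 80 percent of the archive, test on the held-out 20 percent.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.20, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.1%}")
```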

 

Pew researchers next had the system use its new human-like gender-discrimination ability to identify the gender of persons shown in 10,000 Google images associated with 105 common occupations. Would the image search results overrepresent, underrepresent, or accurately represent women’s actual share of each occupation, as reported by U.S. Bureau of Labor Statistics (BLS) data summaries?

 

The result? Women, relative to their presence in the working world, were significantly underrepresented in some categories and overrepresented in others. For example, the BLS reports that 57 percent of bartenders are female—as are only 29 percent of the first 100 people shown in Google image searches of “bartender” (as you can see for yourself). Searches for “medical records technician,” “probation officer,” “general manager,” “chief executive,” and “security guard” showed a similar underrepresentation. But women were overrepresented, relative to their working proportion, in Google images for “police,” “computer programmer,” “mechanic,” and “singer.” Across all 105 jobs, men are 54 percent of those employed and 60 percent of those pictured. The bottom line: Machine learning reveals (in Google users’ engagement) a subtle new form of gender bias.
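
Reduced to the figures quoted above, the comparison itself is simple arithmetic. Only the bartender numbers and the overall totals appear in the text; a fuller analysis would add a row for each of the other occupations.

```python
# Women's share of U.S. workers (BLS) vs. their share of the first 100
# Google image results, using the figures quoted above.
quoted = {
    "bartender": {"bls_pct_female": 57, "image_pct_female": 29},
}
for job, shares in quoted.items():
    gap = shares["image_pct_female"] - shares["bls_pct_female"]
    direction = "under" if gap < 0 else "over"
    print(f"{job}: women {direction}represented by {abs(gap)} percentage points")

# Across all 105 occupations: men are 54% of workers but 60% of those pictured.
print(f"overall: men overrepresented by {60 - 54} percentage points")
```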

 

As these examples illustrate, machine learning holds promise for helpful application and research. But it will also entail some difficult ethical questions.

 

Imagine, for example, that age, race, gender, or sexual orientation are incorporated into algorithms that predict recidivism among released prisoners. Would it be discriminatory, or ethical, to use such demographic predictors in making parole decisions?

 

Such questions already exist in human judgments, but may become more acute if and when we ask machines to make these decisions. Or is there reason to hope that it will be easier to examine and tweak the inner workings of an algorithmic system than to do so with a human mind?

 

(For David Myers’ other essays on psychological science and everyday life visit www.TalkPsych.com.)

Judith Rich Harris’ December 29th death took my mind to her remarkable life and legacy. Among all the people I’ve never met, she was the person I came to know best. Across 243 emails she shared her draft writings, her critical assessment of others’ thinking (including my own), and the progress of her illness.

 

Our conversation began after the publication of her cogent Psychological Review paper, which changed my thinking and led me to send a note of appreciation. The paper’s gist was delivered by its first two sentences: “Do parents have any important long-term effects on the development of their child’s personality? This article examines the evidence and concludes that the answer is no.”

 

Her argument: Behavior genetics studies (of twins and adoptees) show that genes predispose our individual traits, and that siblings’ “shared environment” has a shockingly small influence. Peers also matter—they transmit culture. Show her some children who hear English spoken with one accent at home, and another accent at school and in the neighborhood, and—virtually without exception—she will show you children who talk like their peers.

 

Judy Harris was a one-time Harvard psychology graduate student who was dismissed from its doctoral program because, as George Miller explained to her, she lacked “originality and independence.”

 

But she persisted. In her mid-fifties, without any academic affiliation and coping with debilitating autoimmune disorders, she had the chutzpah to submit her evidence-based ideas to Psychological Review, then as now psychology’s premier theoretical journal. To his credit, the editor, Daniel Wegner, welcomed this contribution from this little-known independent scholar. Moreover, when her great encourager Steven Pinker and I each nominated her paper for the annual award for “outstanding paper on general psychology,” the judges selected her as co-recipient of the—I am not making this up—George A. Miller Award. (To his credit, Miller later termed the irony “delicious.”)

 

The encouraging lesson (in Harris’ words): “‘Shut in’ does not necessarily mean ‘shut out.’” Truth will out. Although biases are real, excellence can get recognized. So, wherever you are, whatever your big idea or passion, keep on.

 

Her fame expanded with the publication of her 1998 book The Nurture Assumption, which was profiled by Malcolm Gladwell in a New Yorker feature article, made into a Newsweek cover story, and named as a Pulitzer Prize finalist.

 

Her argument was controversial, and a reminder that important lessons are often taught by those who fearlessly push an argument to its limit. (Surely child-rearing does have some direct influence on children’s values, religiosity, and politics—and not just via the peer culture to which parents expose children. And surely the loving versus abusive extremes of parenting matter.)

 

Harris was kind and generous (she supportively critiqued my writing, even as I did hers) but also had the self-confidence to take on all critics and to offer challenges to other widely accepted ideas. One was the “new science” of birth order, which, as she wrote me, was “neither new nor science.” An August 24, 1997, email gives the flavor of her wit and writing:

Birth order keeps coming back. In their 1996 book on birth order and political behavior, Albert Somit, Alan Arwine, and Steven A. Peterson spoke of the “inherent non-rational nature of deeply held beliefs” and mused that “permanently slaying a vampire”—the belief in birth order effect—may require “that a stake of gold be driven through his/her heart at high noon” (p. vi).
            Why is it so difficult to slay this vampire? Why, in spite of all the telling assaults that have been made on it, does it keep coming back? The answer is that the belief in birth order effects fits so well into the basic assumptions of our profession and our culture. Psychologists and nonpsychologists alike take it for granted that a child’s personality, to the degree that it is shaped by the environment, receives that shaping primarily at home. And since we know (from our own memories and introspections) that a child’s experiences at home are very much affected by his or her position in the family—oldest, youngest, or in the middle—we expect birth order to leave permanent marks on the personality.
            The belief in birth order effects never dies; it just rests in its coffin until someone lifts the lid again.

 

Alas, the disease that shut her in has, as she anticipated, claimed her. In her last email sent my way on September 6, 2018, she reported that

I’m not doing so well. This is the normal course of the disorder I have—pulmonary arterial hypertension. It is incurable and eventually progresses to heart failure and death. I’m in the heart failure stage now. It’s progressing very slowly, but makes remaining alive not much fun. 

            Because I can’t actually DO anything anymore, it’s a treat to get your mail. I can’t do any more than I’ve already done, but maybe what I’ve already done is enough. Who would have thought that 20 years after its publication, people would still be talking about The Nurture Assumption!

 

Or that The New York Times would replay its message at length, in your well-deserved obituary, Judy.

 

(For David Myers’ other essays on psychological science and everyday life visit www.TalkPsych.com.)

As Pope Francis has said, “Everyone’s existence is deeply tied to that of others.” We are social animals. We need to belong. We flourish when supported by close relationships. Finding a supportive confidante, we feel joy.

 

Longing for acceptance and love, Americans spend $86 billion annually on cosmetics, fragrances, and personal care products—and billions more on clothes, hair styling, and diets. Is that money well spent? Will it help us find and form meaningful relationships?

 

Consider one of social psychology’s most provocative, and simplest, experiments. Cornell University students were asked to don a Barry Manilow T-shirt (at the behest of researcher Thomas Gilovich and colleagues) and were then shown into a room where several others were completing questionnaires. Afterwards they were asked to guess how many of the others noticed their dorky attire. Their estimate? About half. Actually, only 23 percent did.

 

Other experiments confirm this spotlight effect—an overestimation of others’ noticing us, as if a spotlight is shining on us.

 

The phenomenon extends to our secret emotions. Thanks to an illusion of transparency we presume that our attractions, our disgust, and our anxieties leak out and become visible to others. Imagine standing before an audience: If we’re nervous and we know it, will our face surely show it? Not necessarily. Even our lies and our lusts are less transparent than we imagine.

 

There’s bad news here: Others notice us less than we imagine (partly because they are more worried about the impressions they are making).

 

But there’s also good news: Others notice us less than we imagine. And that good news is liberating: A bad hair day hardly matters. And if we wear yesterday’s clothes again today, few will notice. Fewer will care. Of those, fewer still will remember. 

 

If normal day-to-day variations in our appearance are hardly noticed and soon forgotten, what does affect the impressions we make and the relationships we hope to form and sustain?

 

Proximity. Our social ecology matters. We tend to like those nearby—those who sit near us in class, at work, in worship. Our nearest become our dearest as we befriend or marry people who live in the same town, attend the same school, share the same mail room, or visit the same coffee shop. Mere exposure breeds liking. Familiar feels friendly. Customary is comfortable. So look around.

 

Similarity. Hundreds of experiments confirm and reconfirm that likeness leads to liking (and thus the challenge of welcoming the benefits of social diversity). The more similar another’s attitudes, beliefs, interests, politics, income, and on and on, the more disposed we are to like the person and to stay connected. And the more dissimilar another’s attitudes, the greater the odds of disliking.  Opposites retract.

 

If proximity and similarity help bonds form, what can we do to grow and sustain relationships?

 

Equity. One key to relationship endurance is equity, which occurs when friends perceive that they receive in proportion to what they give. When two people share their time and possessions, when they give and receive support in equal measure, and when they care equally about one another, their prospects for long-term friendship or love are bright. This doesn’t mean playing relational ping pong—balancing every invitation with a tit-for-tat response. But over time, each friend or partner invests in the other about as much as he or she receives.

 

Self-disclosure. Relationships also grow closer and stronger as we share our likes and dislikes, our joys and hurts, our dreams and worries. In the dance of friendship or love, one reveals a little and the other reciprocates. And then the first reveals more, and on and on. As the relationship progresses from small talk to things that matter, the increasing self-disclosure can elicit liking, which unleashes further self-disclosure.

 

Mindful of the benefits of equity and mutual self-disclosure, we can monitor our conversations: 

  • Are we listening as much as we are talking?
  • Are we drawing others out as much as we are disclosing about ourselves?

 

In his classic How to Win Friends and Influence People, Dale Carnegie offered kindred advice. To win friends, he advised, “become genuinely interested in other people. . . . You can make more friends in two months by being interested in them, than in two years by making them interested in you.” Thus, “Be a good listener. Encourage others to talk about themselves.”

 

So, looking our best may help a little, initially, though less than we suppose. What matters more is being there for others—focusing on them, encouraging them, supporting them—and enjoying their support in return. Such is the soil that feeds satisfying friendships and enduring love.

 

(For David Myers’ other weekly essays on psychological science and everyday life, visit www.TalkPsych.com)

“I have a gut, and my gut tells me more sometimes than anybody else’s brain can ever tell me,” explained President Trump in stating why he believed Federal Reserve interest rate hikes were a mistake. “My gut has always been right,” he declared again in saying why he needn’t prepare for the recent trade negotiation with China’s president.

 

In trusting his gut intuition, Trump has much company. “Buried deep within each and every one of us, there is an instinctive, heart-felt awareness that provides—if we allow it to—the most reliable guide,” offered Prince Charles. “I’m a gut player. I rely on my instincts,” said President George W. Bush, explaining his decision to launch the Iraq War.

 

Although there is, as I noted in another of these TalkPsych essays, a gut-brain connection, are we right to trust our gut? Does the gut know best about interest rates, trade policy, and climate change? Or, mindful of smart people often doing dumb things, do we instead need more humility, more checking of gut hunches against hard reality, more critical thinking?

 

Drawing from today’s psychological science, one could write a book on both the powers and perils of intuition. (Indeed, I have—see here.) Here, shortened to an elevator speech, is the gist.

 

Intuition’s powers. Cognitive science reveals an unconscious mind—another mind backstage—that Freud never told us about. Much thinking occurs not “on screen” but off screen, out of sight, where reason does not know. Countless studies—of priming, implicit memory, empathic accuracy, thin slice social judgments, creativity, and right hemisphere processing—illustrate our nonrational, intuitive powers. We know more than we know we know. Thanks to our “overlearning” of automatic behaviors, those of us who learned to ride bikes as children can intuitively pedal away on one decades later. And a skilled violinist knows, without thinking, just where to place the bow, at what angle, with what pressure. “In apprehension, how like a god!,” exclaimed Shakespeare’s Hamlet.

 

Intuition’s perils. Other studies—of perceptual illusions, self-serving bias, illusory optimism, illusory correlation, confirmation bias, belief perseverance, the fundamental attribution error, misplaced fears, and the overconfidence phenomenon—confirm what literature and religion have long presumed: the powers and perils of pride. Moreover, these phenomena feed mistaken gut intuitions that produce deficient decisions by clinicians, interviewers, coaches, investors, gamblers, and would-be psychics. “Headpiece filled with straw,” opined T. S. Eliot.

 

Intuition’s failures often are akin to perceptual illusions—rooted in mechanisms that usually serve us well but sometimes lead us astray. Like doctors focused on detecting and treating disease, psychological scientists are skilled at detecting and calling attention to our mind’s predictable errors. They concur with the novelist Madeleine L’Engle’s observation: “The naked intellect is an extraordinarily inaccurate instrument.”

 

The bottom line: our gut intuitions are terrific at some things, such as instantly reading emotions in others’ faces, but fail at others, such as guessing stocks, assessing risks, and predicting climate change. And so psychologists teach about intuition’s perils as well as its powers. We encourage critical thinking. We urge people, before trusting others’ gut intuitions, to ask: “What do you mean?” “How do you know?”

 

As physicist Richard Feynman famously said, “The first principle is that you must not fool yourself, and you are the easiest person to fool.”

 

(For David Myers’ other weekly essays on psychological science and everyday life, visit www.TalkPsych.com)

If you, dear reader, can indulge some slightly geeky calculations, I hope to show you that with daily exercise you can live a substantially longer and happier life. Indeed, per the time invested, exercise will benefit you more than smoking will harm you. Consider:

  • An analysis of mortality data offers this memorable result: For the average person, life is lengthened by about 7 hours for every hour exercised. So (here comes the geek), the World Health Organization recommends exercising 150 minutes = 2.5 hours per week. Multiplied by 7, that equals 17.5 hours longer life for each week of exercise. Over 52 weeks, that sums to 910 hours = 38 days = roughly 1/10th of a year longer life for each year of faithful exercise . . . which, continued over 40 years, would yield ~4 years longer life. (Though, more typically, say the researchers, runners live 3.2 years longer. The sketch after this list redoes the arithmetic.)
  • In another epidemiological study of over 650,000 American adults, those walking 150 minutes per week lived (voila!) 4 years longer than nonexercisers (Moore et al., 2012).
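
For readers who like to check the geekery, here is the first bullet’s arithmetic redone as a few lines of Python—nothing new, just the numbers quoted above:

```python
# The exercise calculation from the first bullet above.
hours_gained_per_hour_exercised = 7          # ~7 hours of added life per hour exercised
weekly_exercise_hours = 150 / 60             # WHO recommendation: 150 minutes per week
weekly_gain_hours = hours_gained_per_hour_exercised * weekly_exercise_hours   # 17.5 hours
yearly_gain_days = weekly_gain_hours * 52 / 24                                # ~38 days, ~1/10 year
forty_year_gain_years = yearly_gain_days * 40 / 365                           # ~4 years
print(f"{weekly_gain_hours:.1f} hours gained per week of exercise")
print(f"{yearly_gain_days:.0f} days gained per year; {forty_year_gain_years:.1f} years over 40 years")
```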

 

How satisfying to have two independent estimates in the same ballpark!

 

This potential life-extending benefit brings to mind the mirror-image life-shortening costs of smoking, which the Centers for Disease Control reports diminishes life for the average smoker “by at least 10 years.” Thus (geek time again):

  • A person who takes up smoking at age 15, smokes 15 cigarettes per day for 50 years, and dies at 65 instead of 75, will lose roughly 1/5th of a year (equals 73 days = 1,752 hours = 105,000 minutes) for each year of smoking. If each cigarette takes 10 minutes to smoke, the minutes spent smoking (54,750 each year) will account for roughly half of those 105,000 lost minutes.
  • Ergo, nature charges ~2 minutes of shorter life for each minute spent smoking . . . but generously gives a 7-to-1 return for each hour spent exercising. How benevolent! (The sketch just below walks through these numbers.)
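
And the smoking side of the ledger, in the same back-of-the-envelope style, again using only the figures given above:

```python
# The smoking calculation from the bullets above.
years_of_life_lost = 10                      # CDC: at least 10 years for the average smoker
years_spent_smoking = 50                     # age 15 to 65 in the example
minutes_lost_per_year = years_of_life_lost / years_spent_smoking * 365 * 24 * 60   # ~105,000
cigarettes_per_day = 15
minutes_smoking_per_year = cigarettes_per_day * 10 * 365     # 54,750 minutes, at 10 minutes each
minutes_lost_per_minute_smoked = minutes_lost_per_year / minutes_smoking_per_year  # ~2
print(f"{minutes_lost_per_year:,.0f} minutes of life lost per year of smoking")
print(f"~{minutes_lost_per_minute_smoked:.1f} minutes lost per minute spent smoking")
```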

 

Massive new epidemiological studies and meta-analyses (statistical digests of all available research) confirm both physical and mental health benefits of exercise (see here, here, and here). A good double goal for those wishing for a long life is: more fitness, less fatness. But evidence suggests that if forced to pick one, go for fitness.

 

As an earlier blog essay documented, exercise entails not only better health but a less depressed and anxious mood, more energy, and stronger relationships. Moreover, clinical trial experiments—with people assigned to exercise or to control conditions—confirm cause and effect: Exercise both treats and protects against depression and anxiety.

 

The evidence is as compelling as evidence gets: Go for a daily jog or swim and you can expect to live longer and live happier. Mens sana in corpore sano: A healthy mind in a healthy body.

 


(For David Myers’ other weekly essays on psychological science and everyday life, visit www.TalkPsych.com)


Sometimes Truth Is Comedy

Posted by David Myers, Nov 29, 2018

As I approach five years of www.TalkPsych.com commentary—which has settled into a weekly Thursday essay—I am tempted (given our now larger audience) to replay an occasional favorite. Here is my second focused essay, which still puts a smile on my face . . . and perhaps yours? (In sneaking humor into texts, I presume that if I can’t have fun writing, then readers likely won’t have fun reading.)

 

From April 6, 2014:

Consider Brett Pelham, Matthew Mirenberg, and John Jones’ 2002 report of wacky associations between people’s names and vocations. Who would have guessed? For example, in the United States, Jerry, Dennis, and Walter are equally popular names (0.42 percent of people carry each of these names). Yet America’s dentists have been almost twice as likely to be named Dennis as Jerry or Walter. Moreover, 2.5 times as many female dentists have been named Denise as the equally popular names Beverly and Tammy. And George or Geoffrey has been overrepresented among geoscientists (geologists, geophysicists, and geochemists).

I thought of that playful research on names recently when reading a paper on black bears’ quantitative competence, co-authored by Michael Beran. Next up in my reading pile was creative work on crows’ problem solving led by Chris Bird. Today I was appreciating interventions for lifting youth out of depression, pioneered by Sally Merry.

That also took my delighted mind to the important books on animal behavior by Robin Fox and Lionel Tiger, and the Birds of North America volume by Chandler Robbins. (One needn’t live in Giggleswick, England, to find humor in our good science.)

The list goes on: billionaire Marc Rich, drummer Billy Drummond, cricketer Peter Bowler, and the Ronald Reagan White House spokesman Larry Speakes. And as a person with hearing loss whose avocational passion is hearing advocacy, I should perhaps acknowledge the irony of my own name, which approximates My-ears.

Internet sources offer lots more: dentists named Dr. E. Z. Filler, Dr. Gargle, and Dr. Toothaker; the Oregon banking firm Cheatham and Steele; and the chorister Justin Tune. But my Twitter feed this week offered a cautionary word about these reported names: “The problem with quotes on the Internet is that you never know if they’re true.” ~ Abraham Lincoln

Perhaps you, too, have some favorite name-vocation associations? I think of my good friend who was anxiously bemused before meeting his oncologist, Dr. Bury. (I am happy to report that, a decade later, he is robustly unburied and has not needed the services of the nearby Posthumus Funeral Home.)

For Pelham and his colleagues there is a serious point to this fun: We all tend to like what we associate with ourselves (a phenomenon they call implicit egotism). We like faces that have features of our own face morphed into them. We like—and have some tendency to live in—cities and states whose names overlap with our own—as in the disproportionate number of people named Jack living in Jacksonville, of Philips in Philadelphia, and of people whose names begin with Tor in Toronto.

Uri Simonsohn isn’t entirely convinced (see here and here, with Pelham’s reply here and here). He replicated the associations between people’s names, occupations, and places but argued that reverse causality sometimes is at work. For example, people sometimes live in places and on streets after which their ancestors were named.

Implicit egotism research continues. In the meantime, we can delight in the occasional playful creativity of psychological science.

P.S. Speaking of dentists (actual ones), my retired Hope College chemistry colleague Don Williams—a person of sparkling wit—offers these photos, taken with his own camera:

And if you need a podiatrist to advise about your foot odor, Williams has found just the person:

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

After elections, people often note unexpected outcomes and then complain that “the polls got it wrong.”

 

After Donald Trump’s stunning 2016 presidential victory, the press gave us articles on “Why the Polls were such a Disaster,” on “4 Possible Reasons the Polls Got It So Wrong,” and on “Why the Polls Missed Their Mark.” Stupid pollsters. “Even a big poll only surveys 1500 people or so out of almost 130 million voters,” we may think, “so no wonder they can’t get it right.”

 

Moreover, consider the many pundits who, believing the polls, confidently predicted a Clinton victory. They were utterly wrong, leaving many folks shocked on election night (some elated, others depressed, with later flashbulb memories of when they realized Trump was winning).

 

So how could the polls, the pundits, and the prediction models have all been so wrong?

 

Or were they? First, we know that in a closely contested race, a representative sample of a mere 1500 people from a 130 million population will—surprisingly to many people—allow us to estimate the population preference within ~3 percent.
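
Where does that “~3 percent” come from? Here is a minimal sketch using the standard margin-of-error formula for a simple random sample; real polls add weighting and design effects, which nudge the figure a bit higher than the raw formula gives.

```python
# 95% margin of error for a simple random sample, at the worst case p = 0.5.
from math import sqrt

n = 1_500
p = 0.5
margin_of_error = 1.96 * sqrt(p * (1 - p) / n)   # ~0.025, i.e., about +/-2.5 points
print(f"n = {n}: +/-{margin_of_error:.1%} before weighting and design effects")
```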

 

Sounds easy. But there’s a challenge: Most randomly contacted voters don’t respond when called. The New York Times’ “Upshot” recently let us view its polling in real time. This enabled us to see, for example, that it took 14,636 calls to Iowa’s fourth congressional district to produce 423 responses, among which Steve King led J. D. Scholten by 5 percent—slightly more than the 3.4 percent by which King won.

 

Pollsters know the likely demographic make-up of the electorate, and so can weight results from respondents of differing age, race, and gender to approximate the population. And that, despite the low response rate, allows them to do remarkably well—especially when we bear in mind that their final polls are taken ahead of the election (and cannot account for last-minute events, which may sway undecided voters). In 2016, the final polling average favored Hillary Clinton by 3.9 percent, with a 3 percent margin of error. On Election Day, she won the popular vote by 2.1 percent (and 2.9 million votes)—well within that margin of error.
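
Here is a toy illustration of that weighting step. The group shares and support figures below are invented purely to show the mechanics; they are not from any actual poll.

```python
# Toy demographic weighting: respondents from underrepresented groups count for
# more, so the weighted sample matches the assumed electorate. All numbers invented.
population_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}   # assumed electorate make-up
sample_share     = {"18-34": 0.10, "35-64": 0.50, "65+": 0.40}   # who actually answered the phone
support          = {"18-34": 0.60, "35-64": 0.50, "65+": 0.40}   # candidate support among respondents

unweighted = sum(sample_share[g] * support[g] for g in support)
weighted   = sum(population_share[g] * support[g] for g in support)
print(f"unweighted estimate: {unweighted:.0%}; weighted to the electorate: {weighted:.0%}")
```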

 

To forecast a race, fivethirtyeight.com’s prediction model does more. It “takes lots of polls, performs various types of adjustments to them [based on sample size, recency, and pollster credibility], and then blends them with other kinds of empirically useful indicators” such as past results, expert assessments, and fundraising. Its final 2016 estimate gave Hillary Clinton roughly a 7 in 10 chance of winning.

Ha! This prediction, like other 2016 prediction models, failed.

 

Or did it? Consider a parallel. Imagine that as a basketball free-throw shooter steps to the line, I tell you that the shooter has a 71 percent free-throw average. If the shooter misses, would you disbelieve the projection? No, because, if what I’ve told you is an accurate projection, you should expect to see a miss 29 percent of the time. If the player virtually never missed, then you’d rightly doubt my data.

 

Likewise, if candidates to whom Nate Silver’s fivethirtyeight.com gives a 7 in 10 chance of winning always win, then the model is, indeed, badly flawed. Yes?
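
That is a question about calibration, and it can be checked the same way we would check the free-throw claim. Here is a toy simulation (the forecasts and outcomes are simulated, not FiveThirtyEight’s): among races where the favorite is given about a 70 percent chance, roughly 70 percent of those favorites should win.

```python
# A toy calibration check for probabilistic forecasts. Simulated data only.
import random
random.seed(1)

forecasts = [random.uniform(0.5, 1.0) for _ in range(10_000)]   # win probability given to the favorite
outcomes = [random.random() < p for p in forecasts]             # simulated results faithful to those odds

near_70 = [(p, won) for p, won in zip(forecasts, outcomes) if 0.65 <= p < 0.75]
win_rate = sum(won for _, won in near_70) / len(near_70)
print(f"favorites given ~70% chances won {win_rate:.0%} of the time")   # ~70% if well calibrated
```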

 

In the 2018 U.S. Congressional races, fivethirtyeight.com correctly predicted 96 percent of the outcomes. On the surface, that may look like a better result, but it’s mainly because most races were in solid Blue or Red districts and not seriously contested.

 

Ergo, don’t be too quick to demean the quality polls and the prediction models they inform. Survey science still works.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)


Science Marches On

Posted by David Myers, Nov 15, 2018

This week I interrupt our weekly focus on psychology’s big ideas and new findings to update three prior essays.

 

Loss aversion in sports. A recent essay described how, in sports (as in other realms of life), our fear of losing can rob us of chances to win:

  • In baseball, a mountain of data shows that runners on first base will rarely take off running on a fly ball that has any chance of being caught. But their aversion to being thrown out leads to fewer runs and wins.
  • And in basketball, teams trailing by 2 points at a game’s end typically prefer a 2-point shot attempt, hoping to avert a loss and send the game into overtime (where half the time they will lose), over a 3-point attempt for victory—even in situations where the odds favor the latter (see the sketch after this list). New Cornell/University of Chicago studies of “myopic loss aversion” confirm this irrational preference for loss-averting 2-point shots at the end of National Basketball Association games.
  • Moreover, those same studies extend the phenomenon to National Football League games, where teams prefer to kick a tying extra point in situations where a 2-point conversion makes a win more likely (as when down by two points late in the third quarter—see also here). Caution often thwarts triumph.
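
To see why the odds often favor going for the win in the basketball case, here is a quick expected-value sketch. The shooting percentages are illustrative round numbers of my choosing, not figures from the studies cited above, and the sketch assumes no time remains for the opponent to answer.

```python
# Expected win probability: tie-and-overtime vs. go for three. Illustrative numbers only.
p_make_two = 0.50          # assumed two-point field-goal percentage
p_make_three = 0.35        # assumed three-point percentage
p_win_overtime = 0.50      # "where half the time they will lose"

p_win_via_tie = p_make_two * p_win_overtime   # make the two, then win in overtime: 0.25
p_win_via_three = p_make_three                # make the three and win outright: 0.35
print(f"tie and go to overtime: {p_win_via_tie:.0%} chance of winning")
print(f"attempt the three:      {p_win_via_three:.0%} chance of winning")
```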

 

Gratitude gratifies. An essay last spring testified to the positive power of expressing gratitude, which increases well-being and prosociality. In new experiments, Amit Kumar and Nicholas Epley found that people who wrote gratitude letters “significantly underestimated how surprised recipients would be about why expressers were grateful, overestimated how awkward recipients would feel, and underestimated how positive recipients would feel.”

 

Our unexpected personal thank you notes are more heartwarming for their recipients than we appreciate. (Is there someone whose support or example has impacted your life, who would be gratified to know that?)

 

The net effect. A May 2016 essay discussed research on how, in the smartphone age, “compulsive technology use not only drains time from eyeball-to-eyeball conversation but also predicts poorer course performance.” Since then, my friend (and co-author on the new Social Psychology, 13th Edition) Jean Twenge has enriched the public understanding of social media effects in her new book, iGen, and in associated media appearances. (For an excellent synopsis, see her Atlantic article.)

As she documents, the adoption of smartphones is echoed by increases in teen loneliness, depression, and suicide, and by decreases in sleep and face-to-face interactions (though also in less drinking, sex, and car accidents). Jean also continues to mine data, such as from an annual survey of American teens in a new Emotion study with Gabrielle Martin and Keith Campbell. They reconfirmed that a dip in adolescent well-being has precisely coincided with an increase in screen time (on social media, the Internet, texting, and gaming). Moreover, across individuals, more than 10 screen-time hours per week predicts less teen happiness.

 

Ergo, a task for teachers is to inform students about these trends and invite discussion about how students might apply them in their own peer culture. In a recent APS Observer essay, I suggested this might also be a good class activity:

  • Invite students to guess how often they check their phone each day, and how many minutes they average on it.
  • Have them download a free screen-time tracker app, such as Moment for the iPhone or QualityTime for the Android.
  • Have them add up their actual total screen time for the prior week and divide by 7 to compute their daily average.
  • Then ask them, “Did you underestimate your actual smartphone use?” (A worked comparison appears in the sketch after this list.)
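
Here is that comparison worked out for one hypothetical student; all numbers are invented for illustration.

```python
# Comparing a guessed daily average with a tracker's actual readings. Invented numbers.
guessed_daily_minutes = 120
tracked_minutes_last_week = [230, 180, 260, 210, 305, 340, 290]   # from a screen-time tracker app
actual_daily_minutes = sum(tracked_minutes_last_week) / 7
print(f"guessed {guessed_daily_minutes} min/day; actual {actual_daily_minutes:.0f} min/day "
      f"(underestimate of {actual_daily_minutes - guessed_daily_minutes:.0f} min/day)")
```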

The results may surprise them. In two recent studies, university students greatly underestimated their frequency of phone checking and time on screen. As Steven Pinker has noted, “The solution is not to bemoan technology but to develop strategies of self-control, as we do with every other temptation in life.”

In this time of political passion, those of us who are instructors and/or text authors may agonize over whether to engage contentious public issues, such as the Kavanaugh Supreme Court nomination, fears of immigrant crime, or the possible social toxicity of presidential rhetoric.

 

My assumption is that—given our focus on education and our respect for our students’ diversity—classrooms and textbooks should not be political bully pulpits. There are more appropriate venues for advocating our own political views.

 

But that needn’t preclude our seeking to inform public dialogue, by offering pertinent evidence. For example, in a recent essay, I drew on memory science to report the tunnel-vision nature of emotion-laden memories, as perhaps illustrated when Christine Blasey Ford recalled being sexually assaulted without remembering peripheral details—just what we would expect from an authentic memory. And I indicated how state-dependent memory phenomena could help explain why Brett Kavanaugh might be sincere in having no memory for the same event. But I stopped short of expressing an opinion about whether he should have been confirmed.

 

Other essays have also offered information pertinent to heated political debates:

  • Trade policies. While politicians and economists debate the economic merits of free trade versus trade-restricting tariffs, social psychologists have noted that economic interdependence and cooperation enhance the odds for sustained peace (here).
  • Fear of immigrants. Recent political rhetoric focusing attention on the “caravan” of Central Americans desperate to enter Mexico and the U.S. has again raised fears of immigrant crime. Recent TalkPsych essays (here and here) offered data on actual immigrant crime rates in the United States, and on who in the U.S. and Germany most fears immigrants (ironically, those who have little contact with them). Gallup data from 139 countries confirms higher migrant acceptance among those who know migrants. Teachers can offer such evidence without advocating either party’s border policy (yes?).
  • Presidential rhetoric and public attitudes. Recent essays in The New York Times (here and here) and The Washington Post (here and here) assume that President Trump’s derision of his political opponents and of the press creates a toxic social environment that seeps down into his followers’ attitudes and actions. Pertinent to these concerns, my earlier essays wondered whether the President was “simply giving a voice” to widely held attitudes, or instead was legitimizing such attitudes and thereby increasing bullying. I later offered new evidence that hatemongering from high places does indeed desensitize people to such rhetoric and increase expressions of prejudice. Can teachers offer such evidence without being partisan?

 

Be assured, psychological science is not intrinsically liberal or conservative. Its evidence sometimes lends weight to progressive thinking (say about sexual orientation as a natural, enduring disposition) and sometimes to conservative thinking (for example, about the benefits of co-parenting and stable close relationships such as marriage). As I conclude in an upcoming teaching column for the Association for Psychological Science Observer, “psychology aims not to advance liberal or conservative thinking per se, but to let evidence inform our thinking. And for us teachers of psychology that, no matter our political identities, is perhaps the biggest lesson of all.”

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

Hate-fueled pipe bombs target Democrats. Two African Americans are gunned down in a grocery store. An anti-Semite slaughters synagogue worshippers. Political leaders denigrate and despise their opponents. In National Election Studies surveys, the percentage of Republicans and Democrats who “hate” the other party has soared, for both sides—from 20 percent in 2000, to near 50 percent in 2016. (Let’s make it personal: Would you want your child to marry a devotee of the other party?)

 

Hostilities are poisoning the culture, and many Americans are wondering: How can we, as individuals and as a culture, turn a corner? Amid animosities fed by groundless fears, fact-free ignorance, and repeated (then believed) big lies, how can we embrace our common humanity and shared goals?

 

As we social psychologists remind folks, conflicts lessen through contact, cooperation, and communication. Personal contact with equal-status others helps (it’s not just what you know, but who you know). Cooperative striving for shared superordinate goals—those that require the cooperation of two or more people—fosters unity (it even helps to have a common enemy). Ditto guided communication (an aim of www.Better-Angels.org, which brings together “Reds” and “Blues” to understand each other’s concerns and to discover their overlapping aspirations).

 

And might we, individually and as a culture, also benefit by teaching and modeling an outlook that encompasses three virtues: conviction, humility, and love?

 

Our convictions define what matters. We anchor ourselves in core beliefs and values that guide our lives. Our convictions motivate our advocacy for a better world. They give us courage to speak and act. “We must always take sides,” said Holocaust survivor Elie Wiesel. “Silence encourages the tormentor, never the tormented.” “To be silent is to be complicit,” adds Dead Man Walking author Sister Helen Prejean.

 

But convictions need restraining with humility, a virtue that lies at the heart of science for theists and nontheists alike. Those of us who are theists, of whatever faith tradition, share two convictions:

  1. There is a God.
  2. It’s not me (or you).

Ergo, we are fallible. The surest conviction we can have is that some of our beliefs err. From this follows the religious virtue of humility (alas, a virtue more often preached than practiced). A spirit of humility seasons conviction with open-minded curiosity. It tempers faith with uncertainty (faith without humility is fanaticism). It subjects our opinions to evidence and enables good science. It tells me that every person I meet is, in some way, my superior . . . providing an opportunity to learn.

 

The triangle of virtues within which we can aspire to live is completed when conviction, restrained by humility, is directed by love. In his great sermon on love, Martin Luther King, Jr. quoted Jesus: “Love your enemies, bless them that curse you, do good to them that hate you.” Doing that, he said, does not compel us to like our enemies, but does compel us “to discover the element of good” in them. By contrast, “hate only intensifies the existence of hate and evil in the universe,” he added. “If I hit you and you hit me and I hit you back and you hit me back and go on, you see, that goes on ad infinitum. It just never ends. . . . Hate destroys the hater as well as the hated.”

 

Is this not a vision of a good life that will enable a flourishing culture . . . a life that is animated by deep convictions, which are refined in humility and applied with love?

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

Psychological science delights us with its occasional surprises. For example, who would have imagined that

  • electroconvulsive therapy—shocking the brain into mild convulsions—would often be an effective antidote to otherwise intractable depression?
  • massive losses in brain tissue early in life could have minimal later effects?
  • siblings’ shared home environment would have such a small effect on their later traits?
  • after brain damage, a person may learn new skills yet be unaware of such?
  • visual information is deconstructed into distinct components (motion, form, depth, and color), processed by distinct brain regions, and then reintegrated into a perceived whole?

 

The latest who-would-have-believed-it finding is that the microbiology of the gut may influence the moods of the brain. Digestive-system bacteria reportedly influence human emotions and even social interactions, perhaps by producing neurotransmitters. Moreover, we are told (such as here and here), healthy gut microbes can reduce anxiety, depression, and PTSD.

 

New articles on this supposedly “revolutionary” and “paradigm-shifting” microbiota-gut-brain (MGB) research are accumulating, report Katarzyna Hooks, Jan Pieter Konsman, and Maureen O’Malley in a forthcoming (yet-to-be-edited) review. By comparing rodents or humans with or without intestinal microbiota, researchers have indeed found “suggestive” effects on how organisms respond to stress and display emotions. Some researchers are exploring microbiota-related interventions (such as with probiotics versus placebos) as a possible treatment for depression, anxiety, and anorexia nervosa.

 

The findings are intriguing and worth pursuing but haven’t yet definitively demonstrated “the impact of the microbiota itself on behavior,” say Hooks, Konsman, and O’Malley. Nevertheless, the popular press, sometimes aided by university press offices, has hyped the research in more than 300 articles. People love the news of this research, say Hooks et al., because it lends hope that a natural, healthy diet can provide a simple DIY solution to troubling emotions.

 

Reading this analysis triggers déjà vu. This cycle of (a) an intriguing finding, followed by (b) hype, followed by (c) reassessment, is an occasional feature of our science’s history. Mind-blowing experiments on people with split brains yielded (a) believe-it-or-not findings, leading to (b) overstated claims about left-brained and right-brained people, which (c) finally settled into a more mature understanding of how distinct brain areas function as a whole integrated system.

 

Despite the “large helpings of overinterpretation” and the overselling of “currently limited findings,” the Hooks team encourages researchers to press on. “We see MGB research as a field full of promise, with important implications for understanding the relationship between the brain and the rest of the body.” The body (brain included) is one whole system.

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)