
In the aftermath of the New Zealand massacre of Muslims at worship, American pundits have wondered: While the perpetrator alone is responsible for the slaughter, do the expressed attitudes of nationalist, anti-immigrant world leaders increase White nationalism—and thus the risk of such violence?

 

Consider Donald Trump’s rhetoric against supposed rapist, drug-dealing immigrants; his retweeting of anti-Muslim rhetoric; his saying that the Charlottesville White nationalists included some “very fine people”; or his condoning violence at his rallies and against the media. Do these actions serve to normalize such attitudes and behavior? Is the Southern Poverty Law Center right to suppose that hatemongering is “emboldened [and] energized” by such rhetoric? Is the New Zealand gunman’s reportedly lauding Trump as “a symbol of White supremacy” something more than a murderer’s misguided rantings?

 

In response, many people—particularly those close to Trump—attributed responsibility to the gunman. The President’s acting chief of staff argued that the shooter was a “disturbed individual” and that it is “absurd” to link one national leader’s rhetoric to an “evil person’s” behavior. We social psychologists call this a “dispositional attribution” rather than a “situational attribution.”

 

As I noted in a 2017 essay, two recent surveys and an experiment show that dispositions are shaped by social contexts. Hate speech (surprised?) feeds hate. Those frequently exposed to hate speech become desensitized to it, which in turn leads to lower evaluations of, and greater prejudice toward, its targets. Prejudice begets prejudice.

 

To be sure, leaders’ words are not a direct cause of individuals’ dastardly actions. Yet presidents, prime ministers, and celebrities do voice and amplify social norms. To paraphrase social psychologists Chris Crandall and Mark White, people express prejudices that are socially acceptable and suppress those that are not. When prejudice toward a particular group seems socially sanctioned, acts of prejudice—from insults to vandalism to violence—increase as well. Norms matter.

 

The FBI reports a 5 percent increase in hate crimes during 2016 and a further 17 percent increase during 2017—and hate crimes reportedly more than doubled in counties that hosted a Trump rally. The Anti-Defamation League reports that 2018 “was a particularly active year for right-wing extremist murders: Every single extremist killing—from Pittsburgh to Parkland—had a link to right-wing extremism.” Again, we ask: Coincidence? Or is there something more at work? If so, is there a mirror-image benevolent effect of New Zealand prime minister Jacinda Ardern’s saying of her nation’s Muslim immigrants, “They are us”?

 

 (For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

With so many trillions of daily happenings, some weird and wonderful events are inevitable—random serendipities that we could never predict in foresight but can savor in hindsight. From sports to relationships to our very existence, chance rules.

 

Sports. I defy you to watch this 7-second basketball clip (of a “double doinked” basketball fan) and not smile (or cringe). Freakish events are commonplace in baseball and basketball—as in astonishing hot and cold hitting and shooting streaks. Even when such streaks approximate mere random sequences, they hardly seem random to fans. That’s because random data are streakier than folks assume. (Coin tosses, too, have more runs of heads and of tails than people expect.) And thus is born the sporting world’s preeminent myth—the “hot hand” (see here and here).
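To see just how streaky chance can be, here is a minimal Python sketch (my own illustration, not part of the original essay) that simulates many runs of 100 fair coin tosses and tallies the longest streak of identical outcomes in each. The typical longest streak, around six or seven, is longer than most people guess; the same logic applies to the makes and misses of a 50-percent shooter.

import random

def longest_run(outcomes):
    """Return the length of the longest run of identical consecutive outcomes."""
    best = current = 1
    for prev, nxt in zip(outcomes, outcomes[1:]):
        current = current + 1 if nxt == prev else 1
        best = max(best, current)
    return best

random.seed(1)  # reproducible illustration
trials, n_tosses = 10_000, 100

longest = [
    longest_run([random.random() < 0.5 for _ in range(n_tosses)])
    for _ in range(trials)
]

print("average longest streak in 100 tosses:", sum(longest) / trials)
print("share of sequences containing a streak of 7 or more:",
      sum(run >= 7 for run in longest) / trials)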

 

Chance encounters. Albert Bandura has documented the lasting significance of chance events that deflect our life course into an unanticipated relationship or career. He recalls the book editor who came to one of his lectures on the “Psychology of Chance Encounters and Life Paths”—and ended up marrying the woman he chanced to sit beside.

 

In 1978, I was invited to a five-day conference in Germany, where I came to know a more senior American colleague who chanced to have an adjacent assigned seat. Six months later, when he was invited to become a social psychology textbook author, he referred an acquisitions editor to me, which led to my writing of textbooks and eventually these TalkPsych.com essays. So, thanks to this happenstance seating assignment (and to the kindness of my distinguished colleague), I gained a meaning-filled new vocation . . . and now you are reading this.

 

Recently I was stranded on a rainy Cambridge, Massachusetts, sidewalk, waiting for a lost Lyft driver. That mix-up led to my sharing a ride with University of California at Santa Barbara professor Ann Taves. Making small talk, I asked her about the California fires, noting that I have a friend whose department at Westmont College (in Santa Barbara) was burned in wildfires some years ago.

 

“Who’s your friend?” she asked.

 

“Ray Paloutzian,” I said.

 

Her reply: “I'm married to him!”

 

But then it got weirder. She said she’d heard that I had a Seattle connection. I told her about family there and mentioned we now own a home in the area.

 

“Where is that?” she asked. When I said Bainbridge Island, she looked a little stunned and said, “Where on Bainbridge?”

 

I explained that it was on a beach called “Yeomalt,” one point north of where the ferry docks.

 

Her mouth dropped open. “You're that David Myers?!” 

 

Wonder of wonders, her uncle was also named David Myers, and she spent time over many summers with Uncle David in our little neighborhood—meaning we surely had crossed paths multiple times. She knew all about the other Yeomalt Myers . . . and her uncle’s name doppelganger.

 

I recalled for her the many times that her uncle and I would row past each other while salmon fishing in the early morning . . . with Dave Myers exchanging a friendly wave with Dave Myers. (That always did feel slightly weird.)

 

The point is not just that the world is weird, but that with so many things happening, some weirdness in our lives is to be expected, and enjoyed, be it double doinks or chance encounters that reveal the unlikeliest of connections. Some happenings are destined not to be explained, but to be savored.

 

Our improbable lives. But surely the unlikeliest aspect of our lives is our very existence. As I explain in Psychology, 12th edition (with Nathan DeWall), conception was “your most fortunate of moments. Among 250 million sperm, the one needed to make you, in combination with that one particular egg, won the race. And so it was for innumerable generations before us. If any one of our ancestors had been conceived with a different sperm or egg, or died before conceiving, or not chanced to meet their partner or . . . The mind boggles at the improbable, unbroken chain of events that produced us.”

 

From womb to tomb, chance matters. And whether you call it chance or providence, your life’s greatest blessing is surely that, against near-infinite odds, you exist.

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

What are today’s U.S. teens feeling and doing? And how do they differ from the teens of a decade ago?

 

A new Pew Research Center survey of nearly a thousand 13- to 17-year-olds offers both troubling and encouraging insights (here and here).

 

The Grim News

 

Screen time vs. face-to-face time. Today’s teens spend about half their nearly six daily leisure hours looking at screens—gaming, web-surfing, socializing, or watching shows. Such activity displaces leisure time spent with others, which now averages only an hour and 13 minutes daily (16 minutes less than a decade ago).

 

Increased depression, self-harm, and suicide. My Social Psychology co-author, Jean Twenge, reports that teen loneliness, depression, and suicide have risen in concert with smartphone and social media use. She notes,

Teens who visit social-networking sites every day but see their friends in person less frequently are the most likely to agree with the statements “A lot of times I feel lonely,” “I often feel left out of things,” and “I often wish I had more good friends.”

Indeed, reports Pew, 3 in 10 teens say they feel tense or nervous every or almost every day, and 7 in 10 see anxiety and depression as major problems among their peers. Other studies confirm that teen happiness and self-esteem have declined, while teen depression, self-harm, and suicide have risen.

 

The Good News

 

Sleeping more. The teens’ time diaries found them sleeping just over 9 hours per night (and 11 hours on weekends). Although other studies have found teens more sleep-deprived, these teens reported sleeping 22 minutes more per night than their decade-ago counterparts.

 

Doing more homework. Teens also are spending more time—16 minutes more per day—on homework, which now averages an hour a day. The increased sleep and homework time is enabled partly by 26 fewer minutes per day in paid employment—fewer teenagers today have jobs.

 

Minimal pressure for self-destructive behaviors. Relatively few teens feel personally pressured to be sexually active (8 percent), to drink alcohol (6 percent), or to use drugs (4 percent)—far fewer than the 61 percent feeling pressure to get good grades.

 

The Gendered News

 

Time use. Do you find it surprising (or not) that girls, compared with boys,

  • average 58 fewer daily minutes of screen time,
  • spend 21 minutes more on homework,
  • average 23 minutes more on grooming and appearance, and
  • spend 14 minutes more on helping around the house?

 

Emotions. Girls (36 percent) are also more likely than boys (23 percent) to report feeling anxious or depressed every or almost every day. But they are more likely each day to feel excited about something studied in school (33 vs. 21 percent). And they are more likely to say they never get in trouble at school (48 vs. 33 percent).

 

Aspirations. Girls are more likely than boys (68 vs. 51 percent) to aspire to attending a four-year college. And they are less materialistic than boys—with 41 percent of girls and 61 percent of boys reporting that it will be very important to have a lot of money when they grow up.

  

To sum up, (1) aspects of teen time use and emotions have changed, sometimes significantly. (2) Gender differences persist, though the differences are not static. (3) In this modern media age, adolescence—the years that teens spend morphing from child to adult—comes with new temptations, which increase some dangers and decrease others.

 

What endures is teens’ need to navigate turbulent waters en route to independence and identity, while sustaining the social connections that will support their flourishing.

 

(For David Myers’ other essays on psychological science and everyday life visit TalkPsych.com)

In hindsight, almost any finding (or its opposite) can seem like plain old common sense—a phenomenon we know as hindsight bias (a.k.a. the I-knew-it-all-along phenomenon). Likewise, the outcomes of most elections, wars, and sporting events seem, in hindsight, explainable and predictable. As Dr. Watson said to Sherlock, “Anything seems commonplace, once explained.”

 

It may therefore seem unsurprising that new studies—reported in a forthcoming article by Florida State psychologists Jessica Maxwell and James McNulty—reveal a “bidirectional relationship” between relationship satisfaction and sexual satisfaction. A loving relationship enhances sex. And good sex, with a lingering “afterglow,” enhances a loving relationship.

 

Even if the love-sex interplay does not, in hindsight, feel surprising, it does seem a lesson worth teaching in an age of sexual hook-ups and delayed marriage. As I explain in an upcoming essay for the Association for Psychological Science Observer,

When a romantic relationship is sealed with a secure commitment—when there is minimal anxiety about performance, and when there is an experience-rooted sensitivity to one another’s desires and responses—intimacy can flourish. “Satisfying relationships [infuse] positive affect into sexual experiences,” say Maxwell and McNulty. And when confident of a partner’s acceptance, low body self-esteem is a diminished barrier to sexual frequency and satisfaction.

 

The researchers’ evidence comes from tracking relationships through time. Higher marital satisfaction today predicts increased sexual satisfaction seven months later. And higher sexual satisfaction today predicts increased marital satisfaction seven months later. Moreover, it’s true for both newlyweds and long-term couples, and for both men and women.

 

Earlier studies found that when sex begins after commitment, couples win twice—with greater relational stability and better sex (see here and here). (In hindsight, we surely could rationalize an opposite finding: Perhaps test-driving sexual compatibility prior to commitment would make for better sex, and thus better relationships? But this does not seem to be the case.) And when sex happens in the context of a committed relationship, there is more pleasure and less morning-after regret (see here).

 

The take-home lesson: Our romantic bonds both enable and feed off sexual intimacy. We humans have what today’s social psychologists call a “need to belong.” We are social creatures, made to connect in close relationships. We flourish when embracing and enjoying secure, enduring, intimate attachments.

 

(For David Myers’ other essays on psychological science and everyday life visit TalkPsych.com.)

Climate change has arrived. Greenhouse gases are accumulating. The planet and its oceans are warming. Glaciers and Arctic ice are retreating. The seas are rising. Extreme weather is becoming ever costlier—in money and in lives. The warming Arctic and its wavier jet stream even help explain the recent polar vortex. If such threats came from a looming alien invasion, our response would be bipartisan and robust, notes Farhad Manjoo.

 

Even so, the U.S. government has

  • pulled out of the Paris Agreement on climate change,
  • planned to lift CO2 restrictions on coal-generated power,
  • weakened auto fuel-economy and emissions standards,
  • cut NASA climate monitoring,
  • increased off-shore oil and gas drilling, and
  • reduced clean-energy research and development.

 

So why, given the accumulating science, is the Trump administration apparently unconcerned about climate change as a weapon of mass destruction?

 

Surely the availability heuristic—the coloring of our judgments by mentally available events and images—is partly to blame. Climate change is imperceptibly slow, without a just noticeable difference from one month to the next. What’s cognitively more available is our recent local weather.

 

Thus, hot days increase people’s beliefs in global warming—as Australians understand after their recent scorching hot summer. And cold weather decreases concern—as vividly illustrated when U.S. Senator James Inhofe, during a 2015 cold spell, ridiculed global warming claims by bringing a snowball to the U.S. Senate. (Is it really so hard to grasp the distinction between local weather and global climate? We do manage, when feeling cold air on opening our refrigerator, not to misjudge our whole-house temperature.)

 

 (C-Span [Public domain], via Wikimedia Commons.)

President Trump has echoed Inhofe with dozens of tweets that similarly generalize from local weather:

 

 

Such wisdom brings to mind my favorite Stephen Colbert tweet:

 

The availability heuristic’s upside is that extreme weather experiences, as well as climate science, are driving growing public concern. Drought-caused wildfires, floods, and brutal heat waves have a silver lining. After surviving Hurricane Sandy, New Jersey residents expressed increased environmentalism. And today, 74 percent of Americans say that the last five years’ extreme weather has influenced their climate change opinions.

 

Ergo, Americans by a 5-to-1 margin now agree that global warming is happening. By a 3-to-1 margin they believe it is human-caused. Seven in 10 now say that they are at least “somewhat worried” about climate change. And globally, across 26 countries, two-thirds of people see it as a “major threat” to their country. “The evidence the climate is changing is becoming so overwhelming people are seeing it in their regions and in their lives,” says the Obama science advisor, John Holdren. “We are really to the point where we’re seeing bodies in the street from severe flooding and severe wildfires.”

 

With vivid and mentally available weather tragedies occurring more often, more folks are noticing and caring. Last month, 3300 economists—including 27 Nobel laureates and all former Federal Reserve Board chairs—signed a consensus statement supporting a revenue-neutral carbon tax as the most effective climate change solution. Although the Green New Deal proposed by progressive Democrats may be more aspirational than achievable, its existence—together with the increasing climate concern of youth and young adults, and the growth in low-carbon energy sources—gives hope for a greener future.

 

(For David Myers’ other essays on psychological science and everyday life visit TalkPsych.com.)

Democracy presumes civic wisdom. When voters grasp truth, when facts prevail over misinformation, prudence prevails. When the electorate understands what actually advances (and threatens) human flourishing, it can inaugurate sensible policies and elect benevolent leaders. The collective wisdom of the cognizant is more astute than an autocrat’s whims.

 

Alas, as the late Hans Rosling amply documents in Factfulness, too often the crowd is unwise. Ignorance reigns. Even with this forewarning, consider:

  • What percent of the world’s 1-year-olds have had a vaccination?
  • What percent of humanity lives in extreme poverty (<$2/day)?
  • What percent of humanity is literate (able to read and write)?

 

The factual answers—86 percent, 9 percent, and 86 percent, respectively—differ radically from Americans’ perceptions. Their vaccination estimate: 35 percent. And though extreme poverty has plummeted and literacy has soared, most don’t know that. More than people suppose, world health, education, and prosperity have improved (as Steven Pinker further documents in Enlightenment Now).

 

Such public ignorance—compounded by the overconfidence phenomenon (people’s tendency to be more confident than correct)—often undermines civic wisdom. When year after year 7 in 10 adults tell Gallup there has been more crime than in the prior year—despite plummeting violent and property crime rates—then fear-mongering politicians may triumph. Our ignorance matters when horrific but infinitesimally rare incidents of domestic terrorism, school shootings, and air crashes hijack our consciousness. We and our children will not only disproportionately fear the wrong things, we will then risk more lives by extreme public spending to avoid these frightening things—to, say, block the “vicious predators and bloodthirsty killers” supposedly pouring across our southern border, rather than to mitigate climate change and more extreme weather.

 

In the aftermath of anti-immigrant fear-stoking (“They’re bringing drugs. They’re bringing crime. They’re rapists.”), many people do fear immigrants. Americans are, reports Gallup, “five times more likely to say immigrants make [crime] worse rather than better (45% to 9%, respectively).” Roused by anecdotes of vicious immigrant crime, “Build the wall!” becomes a rallying cry—despite, as the conservative Cato Institute freshly documents, a lower crime rate among immigrants than among native-born Americans.

 

 

And what do you think: Is eating genetically modified (GM) food safe? “Yes,” say 37 percent of U.S. adults and 88 percent of American Association for the Advancement of Science members. Moreover, the people most opposed to GM foods are (according to a new study) those who are most ignorant about them.

 

As the famed Dunning-Kruger effect reminds us, ignorance and incompetence can, ironically, feed overconfidence. Ignorant of my ignorance—and thus prone to a smug overconfidence—I am blissfully unaware of all the possible Scrabble words I fail to see . . . which enables me to think myself verbally adept. We are, as Daniel Kahneman has said, often “blind to our blindness.”

 

The result is sometimes a theater of the absurd. A December 2015 Public Policy Polling survey asked Donald Trump supporters if they favored or opposed bombing Agrabah. Among the half with an opinion, there was 4 to 1 support (41 percent to 9 percent) for dropping bombs on Agrabah . . . the fictional country from Aladdin.

 

But ignorance needn’t be permanent. Education can train us to recognize how errors and biases creep into our thinking. Education also makes us less gullible—less vulnerable to belief in conspiracy theories. Teach people to think critically—with a mix of open-minded curiosity, evidence-seeking skepticism, and intellectual humility—and they will think . . . and vote . . . smarter. Ignorance matters. But education works.

 

(For David Myers’ other essays on psychological science and everyday life visit TalkPsych.com.)

Time and again I am struck by two robust social science findings.

 

The first, to which social conservatives nod their appreciation, concerns the benefits of successful marriages—which are a substantial predictor of health, longevity, personal happiness, and the well-being of children. An example: As I documented here, U.S. Child Health Surveys have shown that children living with two parents have been half as likely as those living with a never or formerly married mother to have been suspended or expelled from school—even after controlling for variations in race, family size, and parental education and income. To be sure, most single-parented children thrive, and many co-parented children are dysfunctional. Yet show me a place where nearly all children are co-parented by two adults enduringly committed to each other and their children and I will show you a place with relatively low rates of psychological disorder and social pathology. Marriage matters.

 

The second, to which progressives nod their appreciation, is that economic inequality is socially toxic. Places with great inequality have more social pathology—higher rates of crime, anxiety, obesity, and drug use, and lower life expectancy and happiness (see here and here).  Show me a place with great inequality and I will show you a place with a comparatively depressed and dissatisfied populace. Disparity dispirits.

 

Moreover, argues Johns Hopkins University sociology chair Andrew Cherlin, there is a path between these two oft-confirmed findings: Rising income inequality contributes to family dissolution. As the gap between rich and poor has widened, unstable cohabitations and nonmarital child-bearing have dramatically increased among those with lower incomes—or where men have dim job prospects. In deteriorating job markets, marriage wanes and families become less stable. And for working single parents, affordable quality child care may be out of reach.

 

Ergo, doesn’t it follow that those who support marriage and stable co-parenting (a typically conservative value) should also be economic progressives—concerned about reducing inequality and poverty? To envision a culture that welcomes children into families with two or more people who love them is to envision an economic environment that nurtures secure families.

 

What do you think: Might this vision of a family-supportive just economy be a meeting place between conservatism and progressivism? And might it be a basis for depolarizing our politics and unifying our aspirations?

 

A glimmer of hope: After writing this essay, I learned of Fox News’ conservative voice Tucker Carlson’s recent lament that “families are being crushed by market forces” . . . to which Dean Baker of the progressive Center for Economic and Policy Research replied: “It’s a bit scary to me how much of this I agree with.”

 

(For David Myers’ other essays on psychological science and everyday life visit TalkPsych.com.)

Psychology’s archives are filled with well-meaning, well-funded endeavors that were meant to change lives for the better but that—alas—made no difference.

 

In one huge study, 500 Massachusetts boys deemed at risk for delinquency were, by the toss of a coin, assigned either to a no-intervention control condition or to a 5-year treatment program. In addition to twice-a-month visits from counselors, the boys in the treatment program received academic tutoring, medical attention, and family assistance and were involved in community programs, such as the Boy Scouts. When Joan McCord located 97 percent of the participants some 30 years later, many  offered glowing testimonials: Were it not for the program, “I would probably be in jail”; “My life would have gone the other way”; or “I think I would have ended up in a life of crime.” Indeed, even among “difficult” predelinquent boys, 66 percent developed no juvenile crime record.

 

But the same was true of their control counterparts—70 percent of whom had no juvenile record. Alas, the glowing testimonials had been unintentionally deceiving. The program had no beneficial effect.

 

More recently, other endeavors—the national Scared Straight program to tame teenage violence, the police-promoted D.A.R.E. anti-drug effort, Critical Incident Debriefing for trauma victims, and numerous weight-reduction, pedophile rehabilitation, and sexual reorientation efforts—have also been found ineffectual or even harmful.

 

Is this because genetic influences fix our traits—minimizing our malleability? (Think of the dozens of identical twins who, though raised separately, are still amazingly similar.) To be sure, genes do matter. The most comprehensive review of twin studies—more than 3,000 such studies, encompassing 14.6 million twins—found that “across all traits the reported heritability [individual differences attributable to genes] is 49 percent.” That is substantial, yet it leaves room for willpower, beliefs, and social influence as well. Body weight, for example, is genetically influenced, but diet and exercise also matter.

 

Given the guiding power of our heredity and the failure of many large-scale efforts to help people to flourish, I am stunned by the successes of brief “wise interventions”—“wise” in the sense of being savvy about how our beliefs and assumptions influence us, and “stunned” that a 1-hour intervention sometimes outperforms a 5-year intervention.

 

Two leading researchers, Gregory Walton and Timothy Wilson, recently reviewed 325 interventions. Their conclusion: Helping people reframe the meaning of their experiences can promote their long-term flourishing. As Walton explains at www.wiseinterventions.org, “Wise interventions focus on the meanings and inferences people draw about themselves, other people, or a situation they are in.” Three examples:

  • At-risk middle school students given a “growth mindset”—being taught that the brain, like a muscle, grows with use—achieved better grades because they “saw effort as a virtue, because effort helps to develop ability.”
  • Entering minority college students who experienced a 1-hour session explaining the normality of the worry that they didn’t belong (with reassuring stories from older peers) achieved higher grades over the next 3 years—and greater life and career satisfaction after college.
  • A paraprofessional’s helping at-risk new mothers understand their baby’s fussing reduced the moms’ deciding they were bad mothers—and reduced first-year child abuse from 23 percent to 4 percent.

 

Thus, conclude Walton and Wilson, “exercises that seem minor can be transformational” when individuals address “a pressing psychological question, such as whether they belong at school, whether a romantic partner loves them, whether they can improve in math, whether they are a ‘bad mom,’ or whether groups can change in an ongoing conflict.”

 

So, genes matter. But we are all a mix of nature and nurture, of biology and beliefs. And that is why wisely changing people’s interpretations of their experiences and situations can support their flourishing.

 

(For David Myers’ other essays on psychological science and everyday life visit TalkPsych.com.)

This  www.TalkPsych.com entry offers three news flashes—samples of research that have captured my attention (and may wend their way into future textbook editions).

 

NEWS FLASH # 1:

Intergroup contact makes us “less inward looking and more open to experiences.” As any social psychology student knows, friendly contact with other sorts of folks engenders positive attitudes. For example, as an earlier TalkPsych essay documented, regions with more immigrants have more welcoming, positive attitudes toward immigrants. Places without immigrants fear them the most.

 

But intergroup contact does more than improve our attitudes toward others. Research by Brock University psychologist Gordon Hodson and his British colleagues reveals that intergroup contact  affects our thinking—it loosens us up, promoting cognitive flexibility, novel problem solving, and increased creativity. This observation complements earlier research that demonstrated, after controlling for other factors, that students who studied in another culture became more flexibly adept at creative problem solving (see here and here).

 

NEWS FLASH # 2:

More than we suppose, other people like us. Do you sometimes worry that people you’ve just met don’t like you very much? Actually, recent studies by Cornell University researcher Erica Boothby and her colleagues found that people rate new conversational partners as more enjoyable and likeable than the new partner presumes. Despite our shared self-serving bias (the  tendency to overestimate our own knowledge, abilities, and virtues), we tend to underestimate the impressions we make on others. Moreover, the shyer the person, the bigger the liking gap—the underestimate of others’ liking of us.

 

Ergo, the next time you fret over whether you were too quiet, too chatty, or too wrinkled and rumpled, be reassured: Others probably liked you more than you realize.

 

NEWS FLASH # 3:

The youngest children in a school class are more likely to be diagnosed with ADHD. The current psychiatric disorder manual broadens the criteria for diagnosing attention-deficit/hyperactivity disorder (ADHD), thus increasing the number of children so diagnosed. Some say the diagnosis enables helpful treatment and improved functioning. Skeptics say the broadened criteria pathologize immature rambunctiousness, especially among boys—whom evolution has not designed to sit passively at school desks.

 

Support for the skeptics comes from a New England Journal of Medicine study that followed 407,846 U.S. children from birth to elementary school. ADHD diagnoses were a stunning 34 percent higher among those born in August in states with a September 1 cutoff for school entry—but not higher among children in states with other cutoff dates. This massive study confirms earlier reports (here and here) that the youngest children in a class tend to be more fidgety—and more often diagnosed with ADHD—than their older peers.

 

Such findings illustrate why I feel privileged to be gifted with the time, and the responsibility, to learn something new most every day. For me, the primary job of writing is not making words march up a screen, but reading and reading, searching for insights—for gems amid the rocks—that educated people should know about.

 

(For David Myers’ other essays on psychological science and everyday life visit www.TalkPsych.com.)

At long last, artificial intelligence (AI)—and its main subset, machine learning—is beginning to fulfill its promise. When fed massive amounts of data, computers can discern patterns (as in speech recognition) and make predictions or decisions. AlphaZero, a Google-related computer system, started playing chess, shogi (Japanese chess), and Go against itself. Before long, thanks to machine learning, AlphaZero progressed from no knowledge of each game to “the best player, human or computer, the world has ever seen.”

 

DrAfter123/DigitalVision Vectors/Getty Images

 

I’ve had recent opportunities to witness the growing excitement about machine learning in the human future, through conversations with

  • Adrian Weller (a Cambridge University scholar who is program director for the UK’s national institute for data science and AI).
  • Andrew Briggs (Oxford’s Professor of Nanomaterials, who is using machine learning to direct his quantum computing experiments and, like Weller, is pondering what machine learning portends for human flourishing).
  • Brian Odegaard (a UCLA post-doc psychologist who uses machine learning to identify brain networks that underlie human consciousness and perception).

 

Two new medical ventures (to which—full disclosure—my family foundation has given investment support) illustrate machine learning’s potential:

  • Fifth Eye, a University of Michigan spinoff, has had computers mine data on millions of heartbeats from critically ill hospital patients—to identify invisible, nuanced signs of deterioration. By detecting patterns that predict patient crashes, the system aims to provide a potentially life-saving early warning system (well ahead of doctors or nurses detecting anything amiss).
  • Delphinus, which offers a new ultrasound alternative to mammography, will similarly use machine learning from thousands of breast scans to help radiologists spot potent cancer cells.

 

Other machine-learning diagnostic systems are helping physicians to identify strokes, retinal pathology, and (using sensors and language predictors) the risk of depression or suicide. Machine learning of locked-in ALS patients’ brain wave patterns associated with “Yes” and “No” answers has enabled them to communicate their thoughts and feelings. And it is enabling researchers to translate brain activity into speech.

 

Consider, too, a new Pew Research Center study of gender representation in Google images. Pew researchers first harvested an archive of 26,981 gender-labeled human faces from different countries and ethnic groups. They fed 80 percent of these images into a computer, which used machine learning to discriminate male and female faces. When tested on the other 20 percent, the system achieved 95 percent accuracy.
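The 80/20 train-and-test routine Pew describes is a standard supervised-learning recipe. Below is a minimal, hypothetical sketch of that generic workflow using scikit-learn—not Pew’s actual pipeline—with random placeholder arrays standing in for the labeled face images and the far more powerful image model the real project used.

# A generic sketch of an 80/20 train-and-test workflow. This is NOT Pew's
# actual pipeline; it assumes the labeled faces have already been converted
# into numeric feature vectors (X) with gender labels (y).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(26_981, 128))   # placeholder feature vectors, one per face
y = rng.integers(0, 2, size=26_981)  # placeholder labels (arbitrary coding)

# Hold out 20 percent of the labeled faces for testing; train on the rest.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.20, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# With random placeholder data this accuracy hovers near chance; the point is
# the held-out evaluation step (Pew reported about 95 percent with real images).
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))

Swap real image features and labels into X and y, and the same skeleton performs the kind of held-out accuracy check Pew reports.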

 

Pew researchers next had the system use its new human-like gender-discrimination ability to identify the gender of persons shown in 10,000 Google images associated with 105 common occupations. Would the image search results overrepresent, underrepresent, or accurately represent women’s actual share of those occupations, as reported by U.S. Bureau of Labor Statistics (BLS) data summaries?

 

The result? Women, relative to their presence in the working world, were significantly underrepresented in some categories and overrepresented in others. For example, the BLS reports that 57 percent of bartenders are female—yet women are only 29 percent of the first 100 people shown in Google image searches for “bartender” (as you can see for yourself). Searches for “medical records technician,” “probation officer,” “general manager,” “chief executive,” and “security guard” showed a similar underrepresentation. But women were overrepresented, relative to their working proportion, in Google images for “police,” “computer programmer,” “mechanic,” and “singer.” Across all 105 jobs, men are 54 percent of those employed and 60 percent of those pictured. The bottom line: Machine learning reveals (in Google users’ engagement) a subtle new form of gender bias.

 

As these examples illustrate, machine learning holds promise for helpful application and research. But it will also entail some difficult ethical questions.

 

Imagine, for example, that age, race, gender, or sexual orientation are incorporated into algorithms that predict recidivism among released prisoners. Would it be discriminatory, or ethical, to use such demographic predictors in making parole decisions?

 

Such questions already exist in human judgments, but may become more acute if and when we ask machines to make these decisions. Or is there reason to hope that it will be easier to examine and tweak the inner workings of an algorithmic system than to do so with a human mind?

 

(For David Myers’ other essays on psychological science and everyday life visit www.TalkPsych.com.)

Judith Rich Harris’ December 29th death took my mind to her remarkable life and legacy. Among all the people I’ve never met, she was the person I came to know best. Across 243 emails she shared her draft writings, her critical assessment of others’ thinking (including my own), and the progress of her illness.

 

Our conversation began after the publication of her cogent Psychological Review paper, which changed my thinking and led me to send a note of appreciation. The paper’s gist was delivered by its first two sentences: “Do parents have any important long-term effects on the development of their child’s personality? This article examines the evidence and concludes that the answer is no.”

 

Her argument: Behavior genetics studies (of twins and adoptees) show that genes predispose our individual traits, and that siblings’ “shared environment” has a shockingly small influence. Peers also matter—they transmit culture. Show her some children who hear English spoken with one accent at home, and another accent at school and in the neighborhood, and—virtually without exception—she will show you children who talk like their peers.

 

Judy Harris was a one-time Harvard psychology graduate student who was dismissed from its doctoral program because, as George Miller explained to her, she lacked “originality and independence.”

 

But she persisted. In her mid-fifties, without any academic affiliation and coping with debilitating autoimmune disorders, she had the chutzpah to submit her evidence-based ideas to Psychological Review, then as now psychology’s premier theoretical journal. To his credit, the editor, Daniel Wegner, welcomed this contribution from a little-known independent scholar. Moreover, when her great encourager Steven Pinker and I each nominated her paper for the annual award for “outstanding paper on general psychology,” the judges selected her as co-recipient of the—I am not making this up—George A. Miller Award. (To his credit, Miller later termed the irony “delicious.”)

 

The encouraging lesson (in Harris’ words): “‘Shut in’ does not necessarily mean ‘shut out.’” Truth will out. Although biases are real, excellence can get recognized. So, wherever you are, whatever your big idea or passion, keep on.

 

Her fame expanded with the publication of her 1998 book The Nurture Assumption, which was profiled by Malcolm Gladwell in a New Yorker feature article, made into a Newsweek cover story, and named as a Pulitzer Prize finalist.

 

Her argument was controversial, and a reminder that important lessons are often taught by those who fearlessly push an argument to its limit. (Surely child-rearing does have some direct influence on children’s values, religiosity, and politics—and not just via the peer culture to which parents expose children. And surely the loving versus abusive extremes of parenting matter.)

 

Harris was kind and generous (she supportively critiqued my writing, even as I did hers) but also had the self-confidence to take on all critics and to offer challenges to other widely accepted ideas. One was the “new science” of birth order, which, as she wrote me, was “neither new nor science.” An August 24, 1997, email gives the flavor of her wit and writing:

Birth order keeps coming back. In their 1996 book on birth order and political behavior, Albert Somit, Alan Arwine, and Steven A. Peterson spoke of the “inherent non-rational nature of deeply held beliefs” and mused that “permanently slaying a vampire”—the belief in birth order effect—may require “that a stake of gold be driven through his/her heart at high noon” (p. vi).
            Why is it so difficult to slay this vampire? Why, in spite of all the telling assaults that have been made on it, does it keep coming back? The answer is that the belief in birth order effects fits so well into the basic assumptions of our profession and our culture. Psychologists and nonpsychologists alike take it for granted that a child’s personality, to the degree that it is shaped by the environment, receives that shaping primarily at home. And since we know (from our own memories and introspections) that a child’s experiences at home are very much affected by his or her position in the family—oldest, youngest, or in the middle—we expect birth order to leave permanent marks on the personality.
            The belief in birth order effects never dies; it just rests in its coffin until someone lifts the lid again.

 

Alas, the disease that shut her in has, as she anticipated, claimed her. In her last email sent my way on September 6, 2018, she reported that

I’m not doing so well. This is the normal course of the disorder I have—pulmonary arterial hypertension. It is incurable and eventually progresses to heart failure and death. I’m in the heart failure stage now. It’s progressing very slowly, but makes remaining alive not much fun. 

            Because I can’t actually DO anything anymore, it’s a treat to get your mail. I can’t do any more than I’ve already done, but maybe what I’ve already done is enough. Who would have thought that 20 years after its publication, people would still be talking about The Nurture Assumption!

 

Or that The New York Times would replay its message at length, in your well-deserved obituary, Judy.

 

(For David Myers’ other essays on psychological science and everyday life visit www.TalkPsych.com.)

As Pope Francis has said, “Everyone’s existence is deeply tied to that of others.” We are social animals. We need to belong. We flourish when supported by close relationships. Finding a supportive confidante, we feel joy.

 

Longing for acceptance and love, Americans spend $86 billion annually on cosmetics, fragrances, and personal care products—and billions more on clothes, hair styling, and diets. Is that money well spent? Will it help us find and form meaningful relationships?

 

Consider one of social psychology’s most provocative, and simplest, experiments. Cornell University students were asked to don a Barry Manilow T-shirt (at the behest of researcher Thomas Gilovich and colleagues) and were then shown into a room where several others were completing questionnaires. Afterwards they were asked to guess how many of the others noticed their dorky attire. Their estimate? About half. Actually, only 23 percent did.

 

Other experiments confirm this spotlight effect—an overestimation of others’ noticing us, as if a spotlight is shining on us.

 

The phenomenon extends to our secret emotions. Thanks to an illusion of transparency we presume that our attractions, our disgust, and our anxieties leak out and become visible to others. Imagine standing before an audience: If we’re nervous and we know it, will our face surely show it? Not necessarily. Even our lies and our lusts are less transparent than we imagine.

 

There’s bad news here: Others notice us less than we imagine (partly because they are more worried about the impressions they are making).

 

But there’s also good news: Others notice us less than we imagine. And that good news is liberating: A bad hair day hardly matters. And if we wear yesterday’s clothes again today, few will notice. Fewer will care. Of those, fewer still will remember. 

 

If normal day-to-day variations in our appearance are hardly noticed and soon forgotten, what does affect the impressions we make and the relationships we hope to form and sustain?

 

Proximity. Our social ecology matters. We tend to like those nearby—those who sit near us in class, at work, in worship. Our nearest become our dearest as we befriend or marry people who live in the same town, attend the same school, share the same mail room, or visit the same coffee shop. Mere exposure breeds liking. Familiar feels friendly. Customary is comfortable. So look around.

 

Similarity. Hundreds of experiments confirm and reconfirm that likeness leads to liking (and thus the challenge of welcoming the benefits of social diversity). The more similar another’s attitudes, beliefs, interests, politics, income, and on and on, the more disposed we are to like the person and to stay connected. And the more dissimilar another’s attitudes, the greater the odds of disliking.  Opposites retract.

 

If proximity and similarity help bonds form, what can we do to grow and sustain relationships?

 

Equity. One key to relationship endurance is equity, which occurs when friends perceive that they receive in proportion to what they give. When two people share their time and possessions, when they give and receive support in equal measure, and when they care equally about one another, their prospects for long-term friendship or love are bright. This doesn’t mean playing relational ping pong—balancing every invitation with a tit-for-tat response. But over time, each friend or partner invests in the other about as much as he or she receives.

 

Self-disclosure. Relationships also grow closer and stronger as we share our likes and dislikes, our joys and hurts, our dreams and worries. In the dance of friendship or love, one reveals a little and the other reciprocates. And then the first reveals more, and on and on. As the relationship progresses from small talk to things that matter, the increasing self-disclosure can elicit liking, which unleashes further self-disclosure.

 

Mindful of the benefits of equity and mutual self-disclosure, we can monitor our conversations: 

  • Are we listening as much as we are talking?
  • Are we drawing others out as much as we are disclosing about ourselves?

 

In his classic How to Win Friends and Influence People, Dale Carnegie offered kindred advice. To win friends, he advised, “become genuinely interested in other people. . . . You can make more friends in two months by being interested in them, than in two years by making them interested in you.” Thus, “Be a good listener. Encourage others to talk about themselves.”

 

So, looking our best may help a little, initially, though less than we suppose. What matters more is being there for others—focusing on them, encouraging them, supporting them—and enjoying their support in return. Such is the soil that feeds satisfying friendships and enduring love.

 

(For David Myers’ other weekly essays on psychological science and everyday life, visit www.TalkPsych.com)

“I have a gut, and my gut tells me more sometimes than anybody else’s brain can ever tell me,” explained President Trump in stating why he believed Federal Reserve interest rate hikes were a mistake. “My gut has always been right,” he declared again in saying why he needn’t prepare for the recent trade negotiation with China’s president.

 

In trusting his gut intuition, Trump has much company. “Buried deep within each and every one of us, there is an instinctive, heart-felt awareness that provides—if we allow it to—the most reliable guide,” offered Prince Charles. “I’m a gut player. I rely on my instincts,” said President George W. Bush, explaining his decision to launch the Iraq War.

 

Although there is, as I noted in another of these TalkPsych essays, a gut-brain connection, are we right to trust our gut? Does the gut know best about interest rates, trade policy, and climate change? Or, mindful of smart people often doing dumb things, do we instead need more humility, more checking of gut hunches against hard reality, more critical thinking?

 

Drawing from today’s psychological science, one could write a book on both the powers and perils of intuition. (Indeed, I have—see here.) Here, shortened to an elevator speech, is the gist.

 

Intuition’s powers. Cognitive science reveals an unconscious mind—another mind backstage—that Freud never told us about. Much thinking occurs not “on screen” but off screen, out of sight, where reason does not know. Countless studies—of priming, implicit memory, empathic accuracy, thin slice social judgments, creativity, and right hemisphere processing—illustrate our nonrational, intuitive powers. We know more than we know we know. Thanks to our “overlearning” of automatic behaviors, those of us who learned to ride bikes as children can intuitively pedal away on one decades later. And a skilled violinist knows, without thinking, just where to place the bow, at what angle, with what pressure. “In apprehension, how like a god!,” exclaimed Shakespeare’s Hamlet.

 

Intuition’s perils. Other studies—of perceptual illusions, self-serving bias, illusory optimism, illusory correlation, confirmation bias, belief perseverance, the fundamental attribution error, misplaced fears, and the overconfidence phenomenon—confirm what literature and religion have long presumed: the powers and perils of pride. Moreover, these phenomena feed mistaken gut intuitions that produce deficient decisions by clinicians, interviewers, coaches, investors, gamblers, and would-be psychics. “Headpiece filled with straw,” opined T. S. Eliot.

 

Intuition’s failures often are akin to perceptual illusions—rooted in mechanisms that usually serve us well but sometimes lead us astray. Like doctors focused on detecting and treating disease, psychological scientists are skilled at detecting and calling attention to our mind’s predictable errors. They concur with the novelist Madeleine L’Engle’s observation: “The naked intellect is an extraordinarily inaccurate instrument.”

 

The bottom line: our gut intuitions are terrific at some things, such as instantly reading emotions in others’ faces, but fail at others, such as guessing stocks, assessing risks, and predicting climate change. And so psychologists teach about intuition’s perils as well as its powers. We encourage critical thinking. We urge people, before trusting others’ gut intuitions, to ask: “What do you mean?” “How do you know?”

 

As physicist Richard Feynman famously said, “The first principle is that you must not fool yourself, and you are the easiest person to fool.”

 

(For David Myers’ other weekly essays on psychological science and everyday life, visit www.TalkPsych.com)

If you, dear reader, can indulge some slightly geeky calculations, I hope to show you that with daily exercise you can live a substantially longer and happier life. Indeed, per the time invested, exercise will benefit you more than smoking will harm you. Consider:

  • An analysis of mortality data offers this memorable result: For the average person, life is lengthened by about 7 hours for every hour exercised. So (here comes the geek), the World Health Organization recommends exercising 150 minutes = 2.5 hours per week. Multiplied by 7, that equals 17.5 hours longer life for each week of exercise. Over 52 weeks, that sums to 910 hours = 38 days = 1/10th of a year longer life for each year of faithful exercise . . . which, continued over 40 years, would yield ~4 years longer life. (Though, more typically, say the researchers, runners live 3.2 years longer.)
  • In another epidemiological study of over 650,000 American adults, those walking 150 minutes per week lived (voila!) 4 years longer than nonexercisers (Moore et al., 2012).

 

How satisfying to have two independent estimates in the same ballpark!

 

This potential life-extending benefit brings to mind the mirror-image life-shortening costs of smoking, which the Centers for Disease Control reports diminishes life for the average smoker “by at least 10 years.” Thus (geek time again):

  • A person who takes up smoking at age 15, smokes 15 cigarettes per day for 50 years, and dies at 65 instead of 75, will lose roughly 1/5th of a year (equals 73 days = 1,752 hours ≈ 105,000 minutes) for each year of smoking. If each cigarette takes 10 minutes to smoke, the minutes spent smoking (54,750 each year) will account for half of those 105,000 lost minutes.
  • Ergo, nature charges ~2 minutes of shorter life for each minute spent smoking . . . but generously gives a 7-to-1 return for each hour spent exercising. How benevolent! (The sketch after this list re-runs the arithmetic.)
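For readers who enjoy checking the bookkeeping, here is a small Python sketch of the arithmetic above. It uses only figures already stated in this essay (the 7-to-1 estimate, the WHO’s 150 minutes, the CDC’s 10-year figure, and the hypothetical 15-cigarettes-a-day smoker); it is an illustration, not new data.

# Back-of-the-envelope check of the exercise and smoking arithmetic above.
HOURS_GAINED_PER_HOUR_EXERCISED = 7
WEEKLY_EXERCISE_HOURS = 2.5            # WHO recommendation: 150 minutes/week

gained_per_year = HOURS_GAINED_PER_HOUR_EXERCISED * WEEKLY_EXERCISE_HOURS * 52
print(f"exercise: {gained_per_year:.0f} hours (~{gained_per_year / 24:.0f} days) "
      f"of added life per year of faithful exercise")
print(f"exercise: ~{gained_per_year / 24 / 365 * 40:.1f} years added over 40 years")

YEARS_LOST = 10                        # CDC: at least 10 years for the average smoker
YEARS_SMOKED = 50                      # age 15 to 65 in the example
CIGARETTES_PER_DAY = 15
MINUTES_PER_CIGARETTE = 10

lost_minutes_per_year = YEARS_LOST / YEARS_SMOKED * 365 * 24 * 60
smoking_minutes_per_year = CIGARETTES_PER_DAY * MINUTES_PER_CIGARETTE * 365
print(f"smoking: {lost_minutes_per_year:,.0f} minutes of life lost per year of smoking")
print(f"smoking: ~{lost_minutes_per_year / smoking_minutes_per_year:.1f} minutes lost "
      f"per minute spent smoking")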

 

Massive new epidemiological studies and meta-analyses (statistical digests of all available research) confirm both physical and mental health benefits of exercise (see here, here, and here). A good double goal for those wishing for a long life is: more fitness, less fatness. But evidence suggests that if forced to pick one, go for fitness.

 

As an earlier blog essay documented, exercise entails not only better health but a less depressed and anxious mood, more energy, and stronger relationships. Moreover, clinical trial experiments—with people assigned to exercise or to control conditions—confirm cause and effect: Exercise both treats and protects against depression and anxiety.

 

The evidence is as compelling as evidence gets: Go for a daily jog or swim and you can expect to live longer and live happier. Mens sana in corpore sano: A healthy mind in a healthy body.

 

 K.C. Alfred/Moment/Getty Images

(For David Myers’ other weekly essays on psychological science and everyday life, visit www.TalkPsych.com)


Sometimes Truth Is Comedy

Posted by David Myers, Nov 29, 2018

As I approach five years of www.TalkPsych.com commentary—which has settled into a weekly Thursday essay—I am tempted (given our now larger audience) to replay an occasional favorite. Here is my second focused essay, which still puts a smile on my face . . . and perhaps yours? (In sneaking humor into texts, I presume that if I can’t have fun writing, then readers likely won’t have fun reading.)

 

From April 6, 2014:

Consider Brett Pelham, Matthew Mirenberg, and John Jones’ 2002 report of wacky associations between people’s names and vocations. Who would have guessed? For example, in the United States, Jerry, Dennis, and Walter are equally popular names (0.42 percent of people carry each of these names). Yet America’s dentists have been almost twice as likely to be named Dennis as Jerry or Walter. Moreover, 2.5 times as many female dentists have been named Denise as the equally popular names Beverly and Tammy. And George or Geoffrey has been overrepresented among geoscientists (geologists, geophysicists, and geochemists).

I thought of that playful research on names recently when reading a paper on black bears’ quantitative competence, co-authored by Michael Beran. Next up in my reading pile was creative work on crows’ problem solving led by Chris Bird. Today I was appreciating interventions for lifting youth out of depression, pioneered by Sally Merry.

That also took my delighted mind to the important books on animal behavior by Robin Fox and Lionel Tiger, and the Birds of North America volume by Chandler Robbins. (One needn’t live in Giggleswick, England, to find humor in our good science.)

The list goes on: billionaire Marc Rich, drummer Billy Drummond, cricketer Peter Bowler, and the Ronald Reagan White House spokesman Larry Speakes. And as a person with hearing loss whose avocational passion is hearing advocacy, I should perhaps acknowledge the irony of my own name, which approximates My-ears.

Internet sources offer lots more: dentists named Dr. E. Z. Filler, Dr. Gargle, and Dr. Toothaker; the Oregon banking firm Cheatham and Steele; and the chorister Justin Tune. But my Twitter feed this week offered a cautionary word about these reported names: “The problem with quotes on the Internet is that you never know if they’re true.” ~ Abraham Lincoln

Perhaps you, too, have some favorite name-vocation associations? I think of my good friend who was anxiously bemused before meeting his oncologist, Dr. Bury. (I am happy to report that, a decade later, he is robustly unburied and has not needed the services of the nearby Posthumus Funeral Home.)

For Pelham and his colleagues there is a serious point to this fun: We all tend to like what we associate with ourselves (a phenomenon they call implicit egotism). We like faces that have features of our own face morphed into them. We like—and have some tendency to live in—cities and states whose names overlap with our own—as in the disproportionate number of people named Jack living in Jacksonville, of Philips in Philadelphia, and of people whose names begin with Tor in Toronto.

Uri Simonsohn isn’t entirely convinced (see here and here, with Pelham’s reply here and here). He replicated the associations between people’s names, occupations, and places but argued that reverse causality sometimes is at work. For example, people sometimes live in places and on streets after which their ancestors were named.

Implicit egotism research continues. In the meantime, we can delight in the occasional playful creativity of psychological science.

P.S. Speaking of dentists (actual ones), my retired Hope College chemistry colleague Don Williams—a person of sparkling wit—offers these photos, taken with his own camera:

And if you need a podiatrist to advise about your foot odor, Williams has found just the person:

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)