
Turning 76 years old in a week, and still loving what I do, I find myself inspired by two recent emails. One, from social psychologist Thomas Pettigrew, age 87, responded to my welcoming his latest work by attaching fourteen of his recent publications. The second, from Nathan DeWall, pointed me to an interesting new article co-authored by developmental psychologist Walter Mischel, age 88 (who, sadly, died just hours before this essay was posted).

 

That got me thinking about other long-lived people who have found their enduring calling in psychological science. My late friend, Charles Brewer, the long-time editor of Teaching of Psychology (who once told me he took two days a year off: Christmas and Easter), taught at Furman University until nearly 82, occupied his office until age 83, and was still authoring into his 80s.

 

But Charles’ longevity was exceeded by that of

  • B.F. Skinner, whom I heard address the American Psychological Association convention in 1990 at age 86, just eight days before he died of leukemia.
  • Carroll Izard, who co-authored three articles in 2017, the year of his death at age 93.
  • Jerome Bruner, who, the year before he died in 2016 at age 100, authored an essay on “The Uneasy Relation of Culture and Mind.”

 

And in earlier times, my historian-of-psychology friend Ludy Benjamin tells me, Wilhelm Wundt taught until 85 and supervised his last doctoral student at 87, and Robert Woodworth lectured at Columbia until 89 and published his last work at 90.*

 

So, I then wondered: Who among today’s living psychologists, in addition to Pettigrew and Mischel, are still publishing at age 85 and beyond? Daniel Kahneman and Paul Ekman almost qualify, but at 84 they are youngsters compared to those below. Here’s my preliminary short list—other nominees welcome!—with each person’s most recent PsycINFO publication. (Given the era in which members of their generation received PhDs, most are—no surprise—men.)

 

  • Philip Zimbardo: Age 85 (born March 23, 1933)

Unger, A., Lyu, H., & Zimbardo, P. G. (2018). How compulsive buying is influenced by perspective—Cross-cultural evidence from Germany, Ukraine, and China. International Journal of Mental Health and Addiction, 16, 522–544.

 

  • Gordon Bower: Age 85 (born December 30, 1932)

Bower, G. H. (2016). Emotionally colored cognition. In R. J. Sternberg, S. T. Fiske, & D. J. Foss (Eds.), Scientists making a difference: One hundred eminent behavioral and brain scientists talk about their most important contributions (pp. 123–127). New York: Cambridge University Press.

 

  • James McGaugh: Age 86 (born December 17, 1931)

McGaugh, J. L. (2018). Emotional arousal regulation of memory consolidation. Current Opinion in Behavioral Sciences, 19, 55–60.

 

  • Lila Gleitman: Age 88 (born December 10, 1929)

Gleitman, L. R., & Trueswell, J. C. (2018). Easy words: Reference resolution in a malevolent referent world. Topics in Cognitive Science.

 

  • Roger Shepard: Age 89 (born January 30, 1929)

Shepard, R. N. (2016). Just turn it over in your mind. In R. J. Sternberg, S. T. Fiske, & D. J. Foss (Eds.), Scientists making a difference: One hundred eminent behavioral and brain scientists talk about their most important contributions (pp. 99–103). New York: Cambridge University Press.

 

  • Jerome Kagan: Age 89 (born February 25, 1929)

Kagan, J. (2018, May). Three unresolved issues on human morality. Perspectives on Psychological Science, 13, 346–358.

 

  • Albert Bandura: Age 92 (born December 4, 1925)

Bandura, A. (2016). Moral disengagement: How people do harm and live with themselves. New York: Worth Publishers.

 

  • Aaron Beck: Age 97 (born July 18, 1921)

Kochanski, K. M., Lee-Tauler, S. Y., Brown, G. K., Beck, A. T., Perera, K. U., et al. (2018, August). Single versus multiple suicide attempts: A prospective examination of psychiatric factors and wish to die/wish to live index among military and civilian psychiatrically admitted patients. Journal of Nervous and Mental Disease, 206, 657–661.

 

  • Eleanor Maccoby: Age 101 (born May 15, 1917)

Maccoby, E. (2007). Historical overview of socialization research and theory. In J. E. Grusec, & P. D. Hastings (Eds.), Handbook of socialization: Theory and research. New York: Guilford Press.

 

  • And a drum roll for Brenda Milner: At age 100 (born July 15, 1918), she still, I’m told, comes in a couple of times a week to the Montreal Neurological Institute, which last week celebrated her centennial (with thanks to Melvin Goodale for the photo).

Milner, B., & Klein, D. (2016, March). Loss of recent memory after bilateral hippocampal lesions: Memory and memories—looking back and looking forward. Journal of Neurology, Neurosurgery & Psychiatry, 87, 230.

 

 

Life is a gift that ends unpredictably. Having already exceeded my at-birth life expectancy, I am grateful for the life I have had. But as one who still loves learning and writing (and can think of nothing else I’d rather do), why not emulate these esteemed colleagues while I continue to be blessed with health, energy, and this enduring sense of calling?

 

P.S. Subsequent to this essay, I have learned of other long-lived and still-productive psychologists, including Robert Rosenthal (retiring next spring at 86), Alan Baddeley (who has a new book, Working Memories, at age 84), Jean Mandler (who has a new article out at age 89), and Eleanor Maccoby (who died recently at 101, with a 2015 chapter). The reported oldest living psychologist is Olivia Hooker, whose 103rd birthday was celebrated during APA's Black History Month earlier this year. On her 100th birthday, President Obama saluted her and dedicated a new Coast Guard building in her name. But surely I've missed others?

 

(For David Myers’ other weekly essays on psychological science and everyday life visit TalkPsych.com)

 ----

* The “major early women psychologists”—Calkins, Washburn, Ladd-Franklin, Woolley, Hollingworth—all died before age 85, reported Benjamin, who added that some other psychologists have stayed too long in the profession without knowing “when to hang up their spikes” and make way for fresh faces in the classroom and laboratory.

Some fun emails stimulated by last week’s essay on loss aversion in sports and everyday life pointed me to statistician David Spiegelhalter's Cambridge Coincidence Collection, which contains people’s 4500+ reports of weird coincidences.

 

That took my mind back to some personally experienced coincidences . . . like the time my daughter, Laura Myers, bought two pairs of shoes. Back home, we were astounded to discover that the two brand names on the boxes were “Laura” and “Myers.” Or the time I confused our college library desk clerk when checking out after using a photocopy machine. My six-digit charge number was identical to the one-in-a-million six-digit number of copies on which the last user had stopped. Or the day my wife, Carol, called seeking my help in sourcing Mark Twain’s quote, “The man who does not read good books has no advantage over the man who cannot read them.” After this first-ever encounter with that quote, my second encounter was 90 minutes later, in a Washington Post article.

 

In Intuition: Its Powers and Perils, I report more amusing coincidences. Among my favorites:

  • Twins Levinia and Lorraine Christmas, driving to deliver Christmas presents to each other near Flitcham, England, collided.
  • Three of the first five U.S. Presidents—Adams, Jefferson, and Monroe—died on the same date—which was none other than the 4th of July.
  • And my favorite . . . in Psalm 46 of the King James Bible, published in the year that Shakespeare turned 46, the 46th word is “shake” and the 46th word from the end is “spear.” (An even greater marvel: How did anyone notice this?)

 

What should we make of weird coincidences? Were they, as James Redfield suggested in The Celestine Prophecy, seemingly “meant to happen . . . synchronistic events, and [that] following them will start you on your path to spiritual truth”? Is it a wink from God that your birthdate is buried among the random digits of pi? Beginning 50,841,600 places after the decimal, my 9/20/1942 birthdate appears . . . and you can likewise find yours here.

 

Without wanting to drain our delight in these serendipities, statisticians have a simpler explanation. Given the countless billions of daily events, some weird juxtapositions are inevitable—and then likely to get noticed and remembered (while all the premonitions not followed by an envisioned phone call or accident are unnoticed and fall into oblivion). “With a large enough sample, any outrageous thing is likely to happen,” observed statisticians Persi Diaconis and Frederick Mosteller. Indeed, added mathematician John Allen Paulos, “the most astonishingly incredible coincidence imaginable would be the complete absence of all coincidences.”
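For readers who like to see the arithmetic behind that statistical point, here is a minimal Python sketch. The per-event probability and the number of daily opportunities are invented for illustration (they are not figures from the essay); the only point is that rare-times-many becomes nearly certain.

```python
# A minimal illustration of the statisticians' point: given enough opportunities,
# even very rare events become near-certain somewhere, for someone.
# Both numbers below are illustrative assumptions, not data from the essay.

p_rare = 1e-6                  # assumed chance that any one person-day produces a "weird" coincidence
n_opportunities = 300_000_000  # assumed opportunities (say, one per U.S. resident per day)

p_at_least_one = 1 - (1 - p_rare) ** n_opportunities
print(f"P(at least one such coincidence today) = {p_at_least_one:.6f}")
# Approximately 1.0: some astonishing coincidence happening to someone is effectively certain,
# even though any *specified* coincidence happening to *you* stays wildly unlikely.
```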

 

Finally, consider: That any specified coincidence will occur is very unlikely. That some astonishing unspecified event will occur is certain. That is why remarkable coincidences are noted in hindsight, not predicted with foresight. And that is also why we don’t need paranormal explanations to expect improbable happenings, even while delighting in them.

Imagine that you’re about to buy a $5000 used car. To pay for it, you’ll need to sell some of your stocks. Which of the following would you rather sell?

  • $5000 of Stock X shares, which you originally purchased for $2500.
  • $5000 of Stock Y shares, which you originally purchased for $10,000.

 

If you’d rather sell Stock X and reap your $2500 profit now, you’re not alone. One analysis of 10,000 investor accounts revealed that most people strongly prefer to lock in a profit rather than absorb a loss. Investors’ loss aversion is curious: What matters is each stock’s future value, not whether it has made or lost money in the past. (If anything, tax considerations favor selling the loser for a tax loss and avoiding the capital gains tax on the winner.)

 

Loss aversion is ubiquitous, and not just in big financial decisions. Participants in experiments, where rewards are small, will choose a sure gain over flipping a coin for double or nothing—but they will readily flip a coin on a double-or-nothing chance to avert a loss. As Daniel Kahneman and Amos Tversky reported, we feel the pain from a loss twice as keenly as we feel the pleasure from a similar-sized gain. Losing $20 feels worse than finding $20 feels good. No surprise, then, that we so vigorously avoid losing in so many situations.
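That roughly two-to-one asymmetry is often summarized with a prospect-theory-style value function. Here is a minimal sketch; the curvature and loss-aversion parameters are commonly cited estimates from Tversky and Kahneman's later work, used only as illustrative assumptions, not values reported in this essay.

```python
# A sketch of a prospect-theory-style value function, illustrating why
# losing $20 feels worse than finding $20 feels good.
# Parameters are commonly cited estimates, assumed here for illustration.

ALPHA = 0.88   # diminishing sensitivity for gains
BETA = 0.88    # diminishing sensitivity for losses
LAMBDA = 2.25  # loss-aversion coefficient: losses loom roughly twice as large

def subjective_value(outcome: float) -> float:
    """Felt value of a monetary gain (positive) or loss (negative)."""
    if outcome >= 0:
        return outcome ** ALPHA
    return -LAMBDA * ((-outcome) ** BETA)

gain, loss = subjective_value(20), subjective_value(-20)
print(f"Felt value of finding $20: {gain:+.2f}")
print(f"Felt value of losing $20:  {loss:+.2f}")
print(f"Loss/gain intensity ratio: {abs(loss) / gain:.2f}")  # about 2.25
```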

 

The phenomenon extends to the endowment effect—our attachment to what we own and our aversion to losing it, as when those given a coffee mug demand more money to sell it than those not given the mug are willing to pay for it. Small wonder our homes are cluttered with things we wouldn’t today buy, yet won’t part with.

 

Loss aversion is but one example of a larger bad-is-stronger-than-good phenomenon, note Roy Baumeister and his colleagues. Bad events evoke more misery than good events evoke joy. Cruel words hurt us more than compliments please us. A bad reputation is easier to acquire—with a single lie or heartless act—than is a good reputation. “In everyday life, bad events have stronger and more lasting consequences than comparable good events.” Psychologically, loss is larger than gain. Emotionally, bad is stronger than good.  

           

Coaches and players are aware of the pain of losses, so it’s no surprise that loss aversion plays out in sports. Consider this example from basketball: Say your team is behind by 2 points, with time only for one last shot. Would you prefer a 2-point or a 3-point attempt?

 

Most coaches, wanting to avoid a loss, will seek to put the game into overtime with a 2-point shot. After all, an average 3-point shot will produce a win only one-third of the time. But if the team averages 50 percent of its 2-point attempts, and has about a 50 percent chance of overtime in this toss-up game, the loss-aversion strategy will yield but a 25 percent chance of both (a) sending the game to overtime, followed by (b) an overtime victory. Thus, by averting an immediate loss, these coaches reduce the chance of an ultimate win—rather like investors who place their money in loss-avoiding bonds and thus forego the likelihood, over extended time, of a much greater stock index win.
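For anyone who wants to check that arithmetic, here is a minimal sketch using the probabilities assumed in the paragraph above (one-in-three for the 3-pointer, 50 percent for the 2-pointer, and a coin-flip overtime).

```python
# A back-of-the-envelope check of the basketball example, using the
# probabilities stipulated in the essay (not league shooting data).

p_make_3 = 1 / 3       # average 3-point attempt goes in and wins outright
p_make_2 = 0.50        # average 2-point attempt goes in and forces overtime
p_win_overtime = 0.50  # toss-up game, so about a 50% chance of winning in overtime

p_win_going_for_3 = p_make_3
p_win_playing_safe = p_make_2 * p_win_overtime

print(f"Win probability, attempting the 3:     {p_win_going_for_3:.0%}")  # ~33%
print(f"Win probability, tying, then overtime: {p_win_playing_safe:.0%}")  # 25%
# Averting the immediate loss (a missed 3) actually lowers the chance of an ultimate win.
```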

 

And now comes news (kindly shared by a mathematician friend) of loss aversion in baseball and softball base-running. Statistician Peter MacDonald, mathematician Dan McQuillan, and computer scientist Ian McQuillan invite us to imagine “a tie game in the bottom of the ninth inning, and there is one out—a single run will win the game. You are on first base, hoping the next batter gets a hit.”

 

As the batter hits a fly to shallow right, you hesitate between first and second to see if the sprinting outfielder will make the catch. When the outfielder traps rather than catches the ball, you zoom to second. The next batter hits a fly to center field and, alas, the last batter strikes out.

 

You probably didn’t question this cautious base-running scenario, because it’s what players do and what coaches commend. But consider an alternative strategy, say MacDonald and his colleagues. If you had risked running to third on that first fly ball, you would have scored the winning run on the ensuing fly ball. Using data from 32 years of Major League Baseball, the researchers calculate that any time the fly ball is at least 38 percent likely to fall for a hit, the runner should abandon caution and streak for third. Yet, when in doubt, that rational aggressive running strategy “is never attempted.”
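A toy model can show how such a break-even threshold arises. The scoring probabilities below are invented placeholders, and the model is far cruder than the researchers' analysis; their 38 percent figure comes from 32 years of MLB data, not from this sketch.

```python
# A toy sketch of the base-running trade-off: compare the cautious strategy
# (hold between first and second) with the aggressive one (run for third).
# All scoring probabilities below are invented for illustration only.

SCORE_PROB = {
    "third_one_out": 0.60,   # assumed: ball drops, aggressive runner reaches third with one out
    "second_one_out": 0.42,  # assumed: ball drops, cautious runner reaches second with one out
    "first_two_outs": 0.10,  # assumed: ball caught, cautious runner safe at first, now two outs
    "inning_over": 0.0,      # ball caught and aggressive runner doubled off: no run this inning
}

def p_score(p_hit: float, aggressive: bool) -> float:
    """Chance of scoring the winning run this inning, given P(fly ball drops for a hit)."""
    if aggressive:
        return p_hit * SCORE_PROB["third_one_out"] + (1 - p_hit) * SCORE_PROB["inning_over"]
    return p_hit * SCORE_PROB["second_one_out"] + (1 - p_hit) * SCORE_PROB["first_two_outs"]

# Scan for the smallest hit probability at which running for third pays off.
breakeven = next(p / 100 for p in range(1, 101)
                 if p_score(p / 100, aggressive=True) >= p_score(p / 100, aggressive=False))
print(f"Under these made-up numbers, run for third once P(ball drops) >= {breakeven:.0%}")
```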

 

You may object that players cannot compute probabilities. But, says the MacDonald team, “players and their third-base coaches make these sorts of calculations all the time. They gamble on sacrifice flies and stolen base attempts using probabilities of success.” Nevertheless, when it comes to running from first, their first goal is to avert loss—and to avoid, even at the cost of a possible run, the risk of looking like a fool. We implicitly think “What if I fail?” before “How can I succeed?”

 

Often in life, it seems, our excessive fear of losing subverts our opportunities to win. Caution thwarts triumph. Little ventured, little gained.

 

My late friend Gerry Haworth understood the risk-reward relationship. A shop teacher at our local high school, he began making wood products in his garage shop. Then, in 1948, he ventured the business equivalent of running to third base—quitting his job and launching a business, supported by his dad’s life savings. Today, family-owned Haworth Inc., America’s third-largest furniture manufacturer, has more than 6000 employees and nearly $2 billion in annual sales. Something ventured, something gained.

Mexican immigrants, President Trump has repeatedly told his approving base, are “bringing drugs. They’re bringing crime. They’re rapists.” In this week’s West Virginia rally he highlighted Mollie Tibbetts’ accused “illegal alien” killer as a vivid example. Hence the wish to “build a wall”—to keep out those who, we are told, would exploit Americans and take their jobs.

 

In an earlier 2018 essay, I responded to the inaccuracy of fear mongering about immigrant crime. But consider a different question: Who believes it? Is it people who live in regions with a greater number of unauthorized immigrants, and who have suffered the presumed crime, conflict, and competition?

 

At the recent Sydney Symposium on Social Psychology, Christian Unkelbach (University of Cologne) reported an intriguing finding: In Germany, anti-immigrant views are strongest in the states with fewest immigrants. Across Germany’s 16 states, intentions to vote for the right-wing Alternative für Deutschland (Alternative for Germany [AfD]) were greatest in states with the fewest asylum applications. (My thanks to Dr. Unkelbach for permission to share his translated figure.)

 

I wondered: Might a similar pattern emerge in U.S. states? To find out, I combined two data sets:

  1. A 2016 Pew report provided data on the percentage of unauthorized immigrants in each state’s population.
  2. A 2016 PRRI report provided state-by-state data on immigrant acceptance.

The result? Voilà! In the United States, a larger unauthorized-immigrant share predicts greater state-level acceptance of immigrants, and a smaller share predicts more fear of immigrants. (West Virginia, with the lowest unauthorized immigrant proportion, also is the least immigrant-supportive.) Moreover, the U.S. correlations are very similar to the German ones (a sketch of this kind of merge-and-correlate calculation follows the correlations below):

  • Across the 16 German states, the correlation between immigrant noncitizen population and anti-immigrant attitudes was -.61.
  • Across the 50 U.S. states, the correlation between immigrant noncitizen population and immigrant-supportive attitudes was +.72.
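For readers curious how such a state-level correlation is computed, here is a minimal sketch of the merge-and-correlate step. The file names and column names are hypothetical placeholders standing in for tables one would assemble from the Pew and PRRI reports; they are not actual files from those organizations.

```python
# A sketch of the merge-and-correlate analysis described above.
# The CSV file names and column names are hypothetical; you would first
# assemble these tables yourself from the Pew and PRRI reports.

import pandas as pd

pew = pd.read_csv("pew_unauthorized_share_by_state.csv")      # columns: state, pct_unauthorized (assumed)
prri = pd.read_csv("prri_immigrant_acceptance_by_state.csv")  # columns: state, acceptance_score (assumed)

merged = pew.merge(prri, on="state")  # one row per state

# Pearson correlation across the 50 states; the essay reports r of about +.72 for the
# U.S. (and about -.61 for Germany's 16 states, where the attitude measure was anti-immigrant).
r = merged["pct_unauthorized"].corr(merged["acceptance_score"])
print(f"State-level correlation: r = {r:+.2f}")
```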

 

 

 

The legendary prejudice researcher Thomas Pettigrew would not be surprised. In a new article at age 87 (I want to be like him when I grow up), Pettigrew reports that in 477 studies of nearly 200,000 people across 36 cultures, intergroup contact predicted lower prejudice in every culture. With cross-racial contact, especially cooperative contact, people from South Africa to the United States develop more favorable racial attitudes. In a new study by Jared Nai and colleagues, living in a racially diverse U.S. neighborhood—or even just imagining doing so—leads people to identify more with all humanity, and to help strangers more.

 

As straight folks get to know gay folks, they, too, become more gay-supportive. And, these new data suggest, as citizens interact with and benefit from their immigrant neighbors, they, too, become more open-hearted and welcoming.

 

In my own Midwestern town, where minority students (mostly Hispanic) are a slight majority of public school students, welcoming yard signs (including one in my own front yard) abound. We have known enough immigrants—as neighbors, colleagues, business owners, and workers—to know that they, like our own immigrant ancestors, can be a blessing.

 

[Afterword: In kindly commenting on this essay, Thomas Pettigrew noted that one exception to the contact-with-immigrants benefit occurs “when the infusion of newcomers is large and sudden.  Then threat takes over without the time for contact to work its magic” (quoted with permission).]

Some years ago an NBC Television producer invited me, while in New York City, to meet in her office to brainstorm possible psychology-related segments. But a focused conversation proved difficult, because every three minutes or so she would turn away to check an incoming email or take a call—leaving me feeling a bit demeaned.

 

In today’s smartphone age, such interruptions are pervasive. In the midst of conversation, your friend’s attention is diverted by the ding of an incoming message, the buzz of a phone call, or just the urge to check email. You’re being phubbed—an Australian-coined term meaning phone-snubbed.

 

In U.S. surveys by James Roberts and Meredith David, 46 percent reported being phubbed by their partners, and 23 percent said it was a problem in their relationship. More phubbing—as when partners place the phone where they can glance at it during conversation, or check it during conversational lulls—predicted lower relationship satisfaction.

 


 

Could such effects of phubbing be shown experimentally? In a forthcoming study, Ryan Dwyer and his University of British Columbia colleagues recruited people to share a restaurant meal with their phones on the table or not. “When phones were present (vs. absent), participants felt more distracted, which reduced how much they enjoyed spending time with their friends/family.”

 

Another new experiment, by University of Kent psychologists Varoth Chotpitayasunondh and Karen Douglas, helps explain phubbing’s social harm. When putting themselves in the skin of one participant in an animation of a conversation, people who were phubbed felt a diminished sense of belonging, self-esteem, and control. Phubbing is micro-ostracism. It leaves someone, even while with another, suddenly alone.

 


 

Smartphones, to be sure, are a boon to relationships as well as a bane. They connect us to people we don’t see—enlarging our sense of belonging. As one who lives thousands of miles from family members, I love FaceTime and instant messaging. Yet a real touch beats being pinged. A real smile beats an emoticon. An eye-to-eye blether (as the Scots would say) beats an online chat. We are made for face-to-face relationship.

 

When I mentioned this essay to my wife, Carol, she wryly observed that I (blush) phub her “all the time.” So, what can we do, while enjoying our smartphones, to cut the phubbing? I reached out to some friends and family and got variations on these ideas:

  • “When we get together to play cards, I often put everyone's phone in the next room.”
  • “When out to dinner, I often ask friends to put their phones away. I find the presence of phones so distracting; the mere threat of interruption diminishes the conversation.” Even better: “When some of us go out to dinner, we pile up our phones; the first person to give in and reach for a phone pays for the meal.”
  • “I sometimes stop talking until the person reestablishes eye contact.” Another version: “I just wait until they stop reading.”
  • “I say, ‘I hope everything is OK.’” Or this: “I stop and ask, ‘Is everything OK? Do you need a minute?’ I often receive an apology and the phone is put away.”
  • “I have ADHD and I am easily distracted. Thus when someone looks at their phone, and I'm distracted, I say, ‘I'm sorry, but I am easily distracted. Where was I?’ . . . It's extremely effective, because nobody wants me to have to start over.”

 

Seeing the effects of phubbing has helped me change my own behavior. Since that unfocused conversation at NBC I have made a practice, when meeting with someone in my office, to ignore the ringing phone. Nearly always, people pause the conversation to let me take the call. But no, I explain, we are having a conversation and you have taken the time to be here with me. Whoever that is can leave a message or call back. Right now, you are who’s important.

 

Come to think of it, I should take that same attitude home.

Dog walking, according to a recent news report, is healthy for people. That little report follows three massive new research reviews that confirm earlier findings of the mental health benefits of exercise:

  • An American Journal of Psychiatry analysis of 49 studies followed 266,939 people for an average of 7 years. In every part of the world, people of all ages had a lower risk of becoming depressed if physically active rather than inactive.
  • JAMA Psychiatry reports that, for teens, “regular physical activity [contributes] to positive mental health.”
  • Another JAMA Psychiatry analysis of 33 clinical trials found an additional depression-protecting effect of “resistance exercise training” (such as weight lifting and strength-building).

 


 

A skeptic might wonder if mentally healthy people have more energy for exercise. (Being really depressed comes with a heaviness that may entail trouble getting out of bed.) But the “prospective studies”—which follow lives through time—can discern a sequence of exercise predicting future reduced depression risk. Moreover, many clinical trial experiments—with people assigned to exercise or control conditions—confirm that exercise not only contributes to health and longevity, it also treats and protects against depression and anxiety. Mens sana in corpore sano: A healthy mind in a healthy body.

 

Indeed, given the modest benefits of antidepressant drugs, some researchers are now recommending therapeutic lifestyle change as a potentially more potent therapy for mild to moderate depression—or as a protection against such. When people modify their living to include the exercise, sunlight exposure, ample sleep, and social connections that marked our ancestors’ lives—a lifestyle for which they were bred—they tend to flourish, with greater vitality and joy. In one study, substantial depression relief was experienced by 19 percent of patients in a treatment-as-usual control group and by 68 percent undergoing therapeutic lifestyle change.

 

Finally, more good news—for dog walkers: Dog walking is said to be healthy and calming for dogs, too. But I suspect that will not surprise any dog owner or their dog.

“The heart has its reasons which reason does not know.”

~Pascal, Pensées, 1670

 

“He that trusteth in his own heart is a fool.”

~Proverbs 28:26

 

“Buried deep within each and every one of us, there is an instinctive, heart-felt awareness” that can guide our behavior. So proclaimed Prince Charles in a 2000 lecture. Trust your gut instincts.

 

Prince Charles has much company. “I’m a gut player. I rely on my instincts,” explained President George W. Bush in justifying his decision to launch the Iraq war, after earlier talking with Vladimir Putin and declaring himself “able to get a sense of his soul.”

 

“Within the first minute [of meeting Kim Jong-un] I’ll know,” declared President Trump. “My touch, my feel—that’s what I do.” Afterwards he added, “We had a great chemistry—you understand how I feel about chemistry.” The heart has its reasons.

 

But is there also wisdom in physicist Richard Feynman’s channeling of the skepticism of King Solomon’s proverb? “The first principle,” said Feynman, “is that you must not fool yourself—and you are the easiest person to fool.”

 

In sifting intuition’s powers and perils, psychological science has some wisdom.

 

First, our out-of-sight, automatic, intuitive information processing is HUGE. In Psychology, 12th Edition, Nathan DeWall and I offer some examples:

  • Automatic processing: We glide through life mostly on autopilot. Our information processing is mostly implicit, unconscious, behind the scenes—and often guided by “fast and frugal” heuristics (mental shortcuts).
  • Intuitive expertise: After mastering driving (or chess), people can react to situations intuitively, without rational analysis.
  • Reading others: We are skilled at reading “thin slices” of behavior—as when judging someone’s warmth from a 6-second video clip.
  • Blindsight: Some blind people even display “blindsight”—they can intuitively place an envelope in a mail slot they cannot consciously see.

 

Second, our intuition is perilous. Psychology is flush with examples of smart people’s predictable and sometimes tragic intuitive errors:

  • Human lie detection: People barely surpass chance when intuiting whether others are lying or truth-telling. (American presidents might want to remember this when judging Putin’s or Kim Jong-un’s trustworthiness.)
  • Intuitive prejudice: As demonstrated in some police responses to ambiguous situations, implicit biases can—without any conscious malevolent intent—affect our perceptions and reactions. (Is that man pulling out a gun or a phone?)
  • Intuitive fears: We fear things that kill people vividly and memorably (because we intuitively judge risks by how readily images of a threat come to mind). Thus we may—mistakenly—fear flying more than driving, shark attacks more than drowning, school mass shootings more than street and home shootings.
  • The “interview illusion”: Given our ability to read warmth from thin slices, it’s understandable that employment interviewers routinely overestimate their ability to predict future job success from unstructured get-acquainted interviews. But aptitude tests, work samples, job-knowledge tests, and peer ratings of past job performance are all better predictors. (Even the lengthiest of interviews—the mate-selection process—is a fragile predictor of long-term marital success.)

 

The bottom line: Intuition—automatic, implicit, unreasoned thoughts and feelings—grows from our experience, feeds our creativity, and guides our lives. Intuition is powerful. But it also is perilous, especially when we overfeel and underthink. Unchecked, uncritical intuition sometimes leads us into ill-fated relationships, feeds overconfident predictions, and even leads us into war.

Recent U.S. school shootings outraged the nation and produced calls for action. One response, from the International Society for Research on Aggression, was the formation of a Youth Violence Commission, composed of 16 experts led by Ohio State social psychologist Brad Bushman. Their task: To identify factors that do, and do not, predict youth violence—behavior committed by a 15- to 20-year-old that’s intended to cause unwanted harm.

 

 


 

The Commission has just released its final report, which it has shared with President Trump, Vice President Pence, Education Secretary DeVos, and all governors, senators, and congressional representatives.

 

The Commission first notes big differences between highly publicized mass shootings (rare, occurring mostly in smaller towns and suburbs, using varied legal guns) and street shootings (more common, concentrated in inner cities, using illegal handguns).  It then addresses the factors that do and do not predict youth violence.

 

RISK FACTORS THAT PREDICT YOUTH VIOLENCE

 

Personal Factors:

  • Gender—related to male biology and masculinity norms.
  • Early childhood aggressive behavior—past behavior predicts future behavior.
  • Personality—low anger control, often manifested in four “dark” personality traits: narcissism, psychopathy, Machiavellianism, and sadism.
  • Obsessions with weapons or death.

 

Environmental Factors:

  • Easy access to guns.
  • Social exclusion and isolation—sometimes including being bullied.
  • Family and neighborhood—family separation, child maltreatment, neighborhood violence.
  • Media violence—a link “found in every country where studies have been conducted.”
  • School characteristics—with large class sizes contributing to social isolation.
  • Substance use—a factor in street shootings but not school shootings.
  • Stressful events—including frustration, provocation, and heat.

 

FACTORS THAT DO NOT PREDICT YOUTH VIOLENCE

 

The commission found that the following do not substantially predict youth violence:

  • Mental health problems—most people with mental illness are not violent, and most violent people are not mentally ill (with substance abuse and psychotic delusions being exceptions).
  • Low self-esteem—people prone to violence actually tend to have inflated or narcissistic self-esteem.
  • Armed teachers—more guns = more risk, and they send a message that schools are unsafe.

 

The concluding good news is that training programs can increase youth self-control, enhance empathy and conflict resolution, and reduce delinquency. Moreover, mass media could help by reducing attention to shootings, thereby minimizing the opportunity for modeling and social scripts that such portrayals provide to at-risk youth.

“When you know a thing, to hold that you know it; and when you do not know a thing, to allow that you do not know it; this is knowledge.”
~Confucius (551–479 B.C.E.), Analects

One of the pleasures of joining seventeen scholars from six countries at last week’s 20th Sydney Symposium on Social Psychology was getting to know the affable and articulate David Dunning.

 

Dunning recapped a stream of studies on human overconfidence. When judging the accuracy of their factual beliefs (“Did Shakespeare write more than 30 plays?”) or when predicting future events (such as the year-end stock market value), people are typically more confident than correct. Such cognitive conceit fuels stockbrokers’ beliefs that they can outperform the market—which, as a group, they cannot. And it feeds the planning fallacy—the tendency of contractors, students, and others to overestimate how quickly they will complete projects.

 

To this list of studies, Dunning and Justin Kruger added their own discovery, now known as the Dunning-Kruger effect: Those who score lowest on various tests of knowledge, logic, and grammar are often ignorant of their own ignorance. Never realizing all the word possibilities I miss when playing Scrabble, I may overestimate my verbal competence.

 

Likewise—to make this even more personal—those of us with hearing loss often are the last to recognize such . . . not because we are repressing our loss, but simply because we are unaware of what we haven’t heard (and of what others do hear). To Daniel Kahneman’s kindred observation that we are “blind to our [cognitive] blindness,” I would add that we can also be literally deaf to our deafness. We don’t know what we don’t know.

 

Thus ironically, and often tragically, those who lack expertise in an area suffer a double-curse—they make misjudgments, which they fail to recognize as errors. This leads them, notes Dunning, to conclude “they are doing just fine.”

 

Note what Dunning is not saying—that some people are just plain stupid, à la Warren Buffett.


 

Rather, all of us have domains of inexpertise, in which we are ignorant of our ignorance.

 

But there are two remedies. When people express strong views on topics about which they lack expertise, we can, researcher Philip Fernbach found, ask them to explain the details: “So exactly how would a cap-and-trade carbon emissions tax work?” A stumbling response can raise their awareness of their own ignorance, lessening their certainty.

 

Second, we can, for our own part, embrace humility. For anything that matters, we can welcome criticism and advice. Another personal example: As I write about psychological science, I confess to savoring my own words. As I draft this essay, I am taking joy in creating the flow of ideas, playing with the phrasing, and then fine-tuning the product to seeming perfection. Surely, this time my editors—Kathryn Brownson and Nancy Fleming—will, for once, find nothing to improve upon? But always they find glitches, ambiguities, or infelicities to which I was blind.

 

Perhaps that is your story, too? Your best work, when reviewed by others . . . your best tentative decisions, when assessed by your peers . . . your best plans, when judged by consultants . . . turn into something even better than you, working solo, could have created. Our colleagues, friends, and spouses often save us from ourselves. The pack is greater than the wolf.

 

In response to my wondering if his famed phenomenon had impacted his life, Dunning acknowledged that he has received—and in all but one instance rebuffed—a stream of journalist pleas: Could he please apply the blindness-to-one’s-own-incompetence principle to today’s American political leadership?

 

But stay tuned. Dunning is putting the finishing touches on a general audience trade book (with one possible title: You Don’t Know What You Don’t Know—and Why It Matters).

It’s well-established that:

  • brain cells survive for a time after cardiac arrest and even after declared death.
  • some people have been resuscitated after cardiac arrest—even hours after, if they were linked to blood-oxygenating and heart-massaging machines.
  • a fraction of resuscitated people have reported experiencing a bright light, a tunnel, a replay of old memories, and/or out-of-body sensations. For some, these experiences later enhanced their spirituality or personal growth.

 

Recently, I enjoyed listening to and questioning a university physician who is launching a major multi-site study of cardiac arrest, resuscitation, and near-death experiences. As a dualist (one who assumes mind and body are distinct, though interacting), he is impressed by survivors’ reports of floating up to the ceiling, looking down on the scene below, and observing efforts to revive them. Thus, his study seeks to determine whether such patients can—while presumably separated from their supine body—perceive and later recall images displayed on an elevated, ceiling-facing iPad.

 

Care to predict the result?

 

My own prediction is based on three lines of research:

  • Parapsychological efforts have failed to confirm out-of-body travel with remote viewing.
  • A mountain of cognitive neuroscience findings link brain and mind.
  • Scientific observations show that brain oxygen deprivation and hallucinogenic drugs can cause similar mystical experiences (complete with the tunnel, beam of light, and so forth).

Thus, I expect there will be no replicable evidence of near-death minds viewing events remote from the body.

 

Setting my assumptions and expectations aside, I asked the physician-researcher about some of his assumptions:

  1. For how long do you think the mind would survive clinical death? Minutes? Hours? Forever? (His answer, if I understood, was uncertainty.)
  2. When resuscitated, the mind would rejoin and travel again with the body, yes? When the patient is wheeled to a new room, the mind rides along? (That assumption was not contested.)
  3. What about the Hiroshima victims whose bodies were instantly vaporized? Are you assuming that—for at least a time—their consciousness or mind survived that instant and complete loss of their brain and body? (His clear answer: Yes.)

 

That made me wonder: If a mind could post-date the body, could it also predate it? Or does the body create the mind, which grows with it, but which then, like dandelion seeds, floats away from it?

 

The brain-mind relationship appeared in another presentation at the same session. A European university philosopher of mind argued that, in addition to the dualist view (which he regards as “dead”) and the reductionist view (Francis Crick: “You’re nothing but a pack of neurons”), there is a third option. This is the nonreductive physicalist view—“nonreductive” because the mind has its own integrity and top-down causal properties, and “physicalist” because the mind emerges from the brain and is bound to the brain.

 

The 20th century’s final decade was “the decade of the brain,” and the 21st century’s first decade was “the decade of the mind.” Perhaps we could say that today’s science and philosophy mark this as a decade of the brain-mind relationship? For these scholars, there are miles to go before they enter their final sleep—or should I say until their body evicts their mind?

 

Addendum for those with religious interests: Two of my friends—British cognitive neuroscientist Malcolm Jeeves and American developmental psychologist Thomas Ludwig—reflect on these and other matters in their just-published book, Psychological Science and Christian Faith. If you think that biblical religion assumes a death-denying dualism (a la Plato’s immortal soul) prepare to be surprised.

Money matters. For entering U.S. collegians, the number one life goal—surpassing “helping others in difficulty,” “raising a family,” and 17 other aspirations—is “being very well off financially.” In the most recent UCLA “American Freshman” survey, 82 percent rated being very well off as “essential” or “very important.” Think of it as today’s American dream: life, liberty, and the purchase of happiness.

 

For human flourishing, fiscal fitness indeed matters . . . up to a point. In repeated surveys across nations, a middle-class income—and being able to control one’s life—beats being poor. Moreover, people in developed nations tend to be happier and more satisfied than those in the poorest of nations.

 

Beyond the middle-class level, we seem to have an income “satiation point,” at which the income-happiness correlation tapers off and happiness no longer increases. For individuals in poor countries, that point is close to $40,000; for those in rich countries, about $90,000, reports a new analysis of 1.7 million Gallup interviews by Andrew Jebb and colleagues.

 

And consider: The average U.S. per-person disposable income, adjusted for inflation, has happily tripled over the last 60 years, enabling most Americans to enjoy today’s wonderments, from home air conditioning to wintertime fresh fruit to smartphones. “Happily,” because few of us wish to return to yesteryear. Yet not that happily, because psychological well-being has not floated upward with the rising economic tide. The number of “very happy” adults has remained at 3 in 10, and depression has been on the rise.

 

What triggers the diminishing psychological payoff from excess income? Two factors:

  • Our human capacity for adaptation: Continual pleasures subside.
  • Our tendency to assess our own circumstances by “social comparison” with those around us—and more often those above us. People with a $40,000 income tend to think $80,000 would enable them to feel wealthy—whereas those at $80,000 say they would need substantially more. Become a millionaire and move to a rich neighborhood, you still may not feel rich. As Theodore Roosevelt said, “Comparison is the thief of joy.”

 

The outer limit of the wealth–well-being relationship also appears in two new surveys (by Grant Donnelly, Tianyi Zheng, Emily Haisley, and Michael Norton) of an international bank’s high-net-worth clients. As you can see in figures I created from their data, having $2 million and having $10 million are about the same, psychologically speaking.

 

If wealth increases well-being only up to a point—and much evidence indicates that is so—and if extreme inequality is socially toxic (great inequality in a community or country predicts lower life quality and more social pathology), then could societies increase human flourishing with economic and tax policies that spread wealth?

 

Let’s make this personal: If earning, accumulating, and spending money increases our happiness only to a satiation point, then why do we spend our money for (quoting the prophet Isaiah) “that which is not bread” and our “labor for that which does not satisfy?” Quite apart from moral considerations, what’s to be lost by sharing our wealth above the income-happiness satiation point?

 

And if one is blessed with wealth, what’s to be gained by showering inherited wealth, above the satiation point, on our children? (Consider, too, another Donnelly and colleagues finding: Inherited wealth entails less happiness than earned wealth.)

 

Ergo, whether we and our children drive BMWs or Honda Fits, swim in our backyard pool or at the local Y, eat filet mignon or fish filet sandwiches, hardly matters. That fact of life, combined with the more important facts of the world’s needs, makes the case for philanthropy.

“The most famous psychological studies are often wrong, fraudulent, or outdated.” With this headline, Vox joins critics that question the reproducibility and integrity of psychological science’s findings.

 

Are many psychology findings indeed untrustworthy? In 2015, news from a mass replication study—that only 36 percent of nearly 100 psychological science studies successfully reproduced the previous findings—rattled our field. Some challenged the conclusion: “Our analysis completely invalidates the pessimistic conclusions that many have drawn from this landmark study,” said Harvard psychologist Daniel Gilbert.

 

For introductory psychology teachers, those supposed failures to replicate need not have been a huge concern. Introductory psych textbooks focus on major, well-established findings and ideas. (For example, only one of the 60+ unreplicated studies was among the 5,174 references in my text at the time, necessitating the deletion of only half a sentence in its next edition.)

 

But here are more recent criticisms—about six famous and favorite studies:

  • Philip Zimbardo stage-managed the Stanford prison study to get his wished-for results, and those who volunteer for such an experiment may be atypically aggressive and authoritarian (see here and here). Moreover, as Stephen Reicher and Alex Haslam showed, when they recreated a prison experiment with the BBC (albeit as reality TV rather than a replication), groups don’t necessarily corrupt—people can collectively choose to behave in varied ways. For such reasons, the Stanford prison study may in the future disappear from more intro psych texts. But for the present, some teachers still use this study as a vivid illustration of the potential corrupting power of evil situations. (Zimbardo and colleagues have also released responses here.)
  • Muzafer Sherif similarly managed his famed boys’ camp study of conflict and cooperation to produce desired results (see here). Yet my friend Stephen Reicher, whom I met over coffee in St. Andrews two weeks ago, still considers the Sherif study a demonstration (even if somewhat staged) of the toxicity of competition and the benefits of cooperation.
  • The facial-feedback effect—the tendency of facial muscles to trigger associated feelings—doesn’t replicate (see here). The failure to reproduce that favorite study (which my students and I have experienced by holding a pencil with our teeth vs. our pouting lips) wiped a smile off my face. But then the original researcher, Fritz Strack, pointed us to 20 successful replications. And a new study sleuths a crucial difference (self-awareness effects due to camera proximity) between the studies that do and don’t reproduce the facial feedback phenomenon. Even without a pencil in my mouth, I am smiling again.
  • The ego-depletion effect—that self-control is like a muscle (weakened by exercise, replenished with rest, and strengthened with exercise)—also failed a multi-lab replication (here). But a massive new 40-lab study, with data analyzed by an independent consultant—“innovative, rigorous” science, said Center for Open Science founder Brian Nosek—did show evidence of a small depletion phenomenon.
  • Kitty Genovese wasn’t actually murdered in front of 38 apartment bystanders who were all nonresponsive (see here). Indeed. Nevertheless, the unresponsive bystander narrative—initiated by police after the Genovese murder—inspired important experiments on the conditions under which bystanders will notice and respond in crisis situations.
  • Mischel’s marshmallow study (children who delay gratification enjoy future success) got roasted by a big new failure to replicate. As I explain in last week’s www.TalkPsych.com essay, the researchers did find an association between 4½-year-olds’ ability to delay gratification and later school achievement, but it was modest and related to other factors. The take-home lesson: Psychological research does not show that a single act of behavior is a reliable predictor of a child’s life trajectory. Yet life success does grow from impulse restraint. When deciding whether to study or party, whether to spend now or save for retirement, foregoing small pleasures can lead to bigger rewards later.

 

One positive outcome of these challenges to psychological science has been new scientific reporting standards that enable replications, along with the establishment of the Center for Open Science that aims to increase scientific openness, integrity, and reproducibility. (I was pleased recently to recommend to fellow Templeton World Charity Foundation trustees a multi-million dollar grant which will support the Center’s mission.)

 

The big picture: Regardless of findings, research replications are part of good science. Science, like mountain climbing, is a process that leads us upward, but with times of feeling like we have lost ground along the way. Any single study provides initial evidence, which can inspire follow-up research that enables us to refine a phenomenon and to understand its scope. Through replication—by winnowing the chaff and refining the wheat—psychological science marches forward.

David Myers

Marshmallow Study Roasted?

Posted by David Myers, Jun 11, 2018

While on break in St. Andrews (Scotland) last week, I enjoyed a dinner conversation with a celebrated M.I.T. developmental psychologist and a similarly brilliant University of St. Andrews researcher. Among our dinner topics was an impressive recent conceptual replication of Walter Mischel’s famous marshmallow test.

 

Mischel and his colleagues, as you may recall, gave 4-year-olds a choice between one marshmallow now or two marshmallows later. Their long-term studies showed that those with the willpower to delay gratification as preschoolers went on as adults to have higher college-completion rates and incomes and fewer addiction problems. This gem in psychology’s lore—that a preschooler’s single behavioral act could predict that child’s life trajectory—is a favorite study for thousands of psychology instructors, and has made it into popular culture—from Sesame Street’s Cookie Monster to the conversation of Barack Obama.

 

In their recent replication of Mischel’s study, Tyler Watts, Greg Duncan, and Haonan Quan followed a much larger and more diverse sample: 918 children who, at age 4½, took the marshmallow test as part of a 10-site National Institute of Child Health and Human Development child study. Observing the children’s school achievement at age 15, the researchers noted a modest, statistically significant association “between early delay ability and later achievement.” But after controlling for other factors, such as the child’s intelligence, family social status, and education, the effect shriveled.

 

“Of course!” said one of my dinner companions. Family socioeconomic status (SES) matters. It influences both children’s willingness to await the second marshmallow, and also academic and economic success. As other evidence indicates—see here and here—it is reasonable for children in poverty to seize what’s available now and to not trust promises of greater future rewards.

 

But my other dinner companion and I posited another factor: Any predictive variable can have its juice drained when we control for myriad other variables. Perhaps part of a child’s ability to delay gratification is intelligence (and the ability to contemplate the future) and experience. If so, controlling for such variables and then asking what’s the residual effect of delay of gratification, per se, is like asking what’s the real effect of a hurricane, per se, after controlling for barometric pressure, wind speed, and storm surge. A hurricane is a package variable, as is delay of gratification.
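A toy simulation can illustrate that “package variable” point: when delay ability is itself partly produced by SES and cognitive ability, statistically controlling for those factors shrinks delay’s apparent effect, even though delay still carries some weight. Everything below is invented for illustration; it is not a reanalysis of Watts and colleagues’ data.

```python
# A toy simulation of the "package variable" argument: a predictor that is
# partly produced by background factors loses much of its apparent punch
# once those factors are statistically controlled.
# All coefficients are invented for illustration.

import numpy as np

rng = np.random.default_rng(0)
n = 10_000

ses = rng.normal(size=n)                                   # family socioeconomic status
ability = 0.5 * ses + rng.normal(size=n)                   # cognitive ability, partly tied to SES
delay = 0.4 * ses + 0.4 * ability + rng.normal(size=n)     # delay of gratification
achievement = 0.5 * ses + 0.5 * ability + 0.2 * delay + rng.normal(size=n)

def ols_coefs(y, predictors):
    """Ordinary least squares coefficients (intercept first)."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

raw = ols_coefs(achievement, [delay])[1]                     # delay alone
adjusted = ols_coefs(achievement, [delay, ses, ability])[1]  # delay with controls

print(f"Delay coefficient, no controls:   {raw:.2f}")       # large, like the famous correlations
print(f"Delay coefficient, with controls: {adjusted:.2f}")  # shrinks toward the residual 0.2
```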

 

I put that argument to Tyler Watts, who offered this response:

If the ability to delay gratification is really a symptom of other characteristics in a child's life, then interventions designed to change only delay of gratification (but not those other characteristics) will probably not have the effect that you would expect based on the correlation Mischel and Shoda reported. So, if it’s the case that in order to generate the long-term effects reported in Mischel's work, interventions would have to target some combination of SES, parenting, and general cognitive ability, then it seems important to recognize that.  

 

This major new study prompts our reassessing the presumed predictive power of the famed marshmallow test. Given what we’ve known about how hard it is to predict to or from single acts of behavior—or single items on a test or questionnaire—we should not have been surprised. And we should not exaggerate the importance of teaching delay of gratification, apart from other important predictors of life success.

 

But the new findings do not undermine a deeper lesson: Part of moral development and life success is gaining self-discipline in restraining one’s impulses. To be mature is to forego small pleasures now to earn bigger rewards later. Thus, teacher ratings of children’s self-control (across countless observations) do predict future employment. And parent ratings of young children’s self-regulation predict future social success. Self-control matters.

David Myers

The Malleability of Mind

Posted by David Myers, May 31, 2018

How many of us have felt dismay over a friend or family member’s stubborn resistance to our arguments or evidence showing (we believe) that Donald Trump is (or isn’t) making America great again, or that immigrants are (or aren’t) a threat to our way of life? Sometimes, it seems, people just stubbornly resist change.

 

Recently, however, I’ve also been struck by the pliability of the human mind. We are adaptive creatures, with malleable minds.

 

Over time, the power of social influence is remarkable. Generations change. And attitudes change. They follow our behavior, adjust to our tribal norms, or simply become informed by education.

 

The power of social influence appears in current attitudes toward free trade, as the moderate-conservative columnist David Brooks illustrates: “As late as 2015, Republican voters overwhelmingly supported free trade. Now they overwhelmingly oppose it. The shift didn’t happen because of some mass reappraisal of the evidence; it’s just that tribal orthodoxy shifted and everyone followed.”

 

Those who love history can point out many other such shifts. After Pearl Harbor, Japan and Japanese people became, in many American minds surveyed by Gallup, untrustworthy and disliked. But then after the war, they soon transformed into our “intelligent, hard-working, self-disciplined, resourceful allies.” Likewise, Germans across two wars were hated, then admired, then hated again, then once again admired.

 

Or consider within thin slices of recent human history the transformational changes in our thinking about race, gender, and sexual orientation:

  • Race. In 1958, only 4 percent of Americans approved of “marriage between Blacks and Whites.” In 2013, 87 percent approved.
  • Gender. In 1967, two-thirds of first-year American college students agreed that “the activities of married women are best confined to the home and family.” Today, the question, which would offend many, is no longer asked.
  • Gay marriage. In Gallup surveys, same-sex marriage—approved by only 27 percent of Americans in 1996—is now welcomed by nearly two-thirds.

 

Consider also, from within the evangelical culture that I know well, the astonishing results of two Public Religion Research Institute polls. The first, in 2011, asked voters if “an elected official who commits an immoral act in their personal life can still behave ethically and fulfill their duties in their public and professional life.” Only 30 percent of White evangelical Protestants agreed. By July of 2017, with President Trump in office, 70 percent of White evangelicals said they would be willing to separate public and personal conduct.

 

An April 22, 2018, Doonesbury satirized this “head-spinning reversal” (quoting the pollster). A cartoon pastor announces to his congregation the revised definition of sin:

“To clarify, we now condone the following conduct: lewdness, vulgarity, profanity, adultery, and sexual assault. Exemptions to Christian values also include greed, bullying, conspiring, boasting, lying, cheating, sloth, envy, wrath, gluttony, and pride. Others TBA. Lastly we’re willing to overlook biblical illiteracy, church nonattendance, and no credible sign of faith.”

 

In a recent essay, I reflected (as a person of faith) on the shift among self-described “evangelicals”: The great temptation is to invoke “God” to justify one’s politics. “Piety is the mask,” observed William James.

 

This tendency to make God in our own image was strikingly evident in a provocative study by social psychologist Nicholas Epley and his colleagues.  Most people, they reported, believe that God agrees with whatever they believe. No surprise there. But consider: When the researchers persuaded people to change their minds about affirmative action or the death penalty, the people then assumed that God now believed their new view. As I am, the thinking goes, so is God.

 

But the mind is malleable in both directions. Many one-time evangelicals—for whom evangelicalism historically has meant a “good news” message of God’s accepting grace—are now changing their identity in the age of Trump (with Trump’s support having been greatest among “evangelicals” who are religiously inactive—and for whom the term has been co-opted to mean “cultural right”). Despite my roots in evangelicalism, I now disavow the mutated label (not wanting to be associated with the right’s intolerance toward gays and Muslims). Many others, such as the moderate Republican writer Peter Wehner, are similarly repulsed by the right-wing takeover of evangelicalism and disavowing today’s tarnished evangelical brand.

 

Times change, and with them our minds.

The British, American, and Australian press—and hundreds of millions of royal wedding viewers—were unexpectedly enthralled by Bishop Michael Curry’s 13.5 minutes of fame:

  • “Stole the show” (Telegraph and Vox).
  • “Electrifying” (New York Times).
  • “Wholly un-British, amazing, and necessary” (Esquire).
  • “Will go down in history” (Guardian).
  • “His star turn is set to impact the Most Reverend Michael Curry’s life for years to come” (news.com.au)

 

His gist: “We must discover the power of love, the redemptive power of love,” God’s love. “And when we do that, we will make of this old world, a new world.” A positive message—and an appealing synopsis of authentic Christianity—but why was it so effective? Why did it connect so well and capture media coverage? What persuasion principles did he illustrate that others—preachers, teachers, students, all speakers—might want to emulate?

 

The power of repetition. Experiments leave no doubt: Repetition strengthens memory and increases belief. Repeated statements—whether neutral (“The Louvre is the largest museum in Paris”), pernicious (“Crooked Hillary”), or prosocial (“I have a dream”)—tend to stick to the mind like peanut butter. They are remembered, and they are more likely to be believed (sometimes even when repeated in efforts to discount them).

 

Few will forget that Curry spoke of “love” (66 times, in fact—5 per minute). We would all benefit from emulating Curry’s example: Frame a single, simple message with a pithy phrase (“the power of love”). From this unifying trunk, the illustrative branches can grow.

 

The power of speaking from the heart. A message rings authentic when it emanates from one’s own life experience and identity—when it has enough self-disclosure to be genuine, but not so much as to be self-focused. Curry, a slave descendant, speaking in an epicenter of White privilege, began and ended with the words of Martin Luther King, Jr., and he told how his ancestors, “even in the midst of their captivity,” embraced a faith that saw “a balm in Gilead to make the wounded whole.”


The power of speaking to the heart. My wife—an Episcopalian who has heard Curry’s preaching—has told me that his presence lends power beyond his written words. Curry was well prepared. But rather than safely reading his polished manuscript, he made eye contact with his audience, especially Prince Harry and Ms. Markle. He spoke with passion. His words aroused emotion. They spoke to troubled hearts in a polarized world.

 

The power of vivid, concrete examples. The behavioral scientist in me wishes it weren’t true, but, alas, compelling stories and vivid metaphors have, in study after study, more persuasive power than truth-bearing statistics. No wonder each year, 7 in 10 Americans, their minds filled with images of school shootings and local murders, say there is more crime than a year ago—even while crime statistics have plummeted.

 

William Strunk and E. B. White’s classic, The Elements of Style, grasped the idea: “If those who have studied the art of writing are in accord on any one point, it is on this: the surest way to arouse and hold the attention of the reader is by being specific, definite, and concrete. The greatest writers—Homer, Dante, Shakespeare—are effective largely because they deal in particulars.”

 

And Curry, too, offered particulars, with simplicity, repetition, and rhythmic cadence:

 

When love is the way, poverty will become history. When love is the way, the earth will be a sanctuary. When love is the way, we will lay down our swords and shields down by the riverside to study war no more.


When love is the way, there’s plenty good room—plenty good room—for all of God’s children. When love is the way, we actually treat each other like we are actually family. When love is the way, we know that God is the source of us all. We are brothers and sisters, children of God.


Brothers and sisters: that’s a new heaven, a new earth, a new world, a new human family.

 

With such repeated, heart-filled, concrete words, perhaps all preachers and speakers could spare listeners the fate of Eutychus, who, on hearing St. Paul’s preaching, “sunk down with sleep, and fell down from the third loft, and was taken up dead” (Acts 20:9).