
Originally posted on February 5, 2015.


“The worst call in Super Bowl history,” read a headline in my hometown Seattle Times after Seahawks' head coach Pete Carroll seemingly threw the game away with his ill-fated decision to pass – rather than run – as the game clock expired.


Actually, Carroll made two end-of-half decisions in Sunday’s Super Bowl, both questioned by the NBC announcers. The differing outcomes of the decisions – and the resulting reactions by pundits and fans – offer potent examples of a mental pitfall that has been the subject of roughly 800 psychological science publications.


“Hindsight bias,” also known as the “I knew it all along phenomenon,” is the almost irresistible tendency to believe – after an experiment, a war, an election, or an investment – that the outcome was foreseeable. After the stock market drops (it “was due for a correction”) or an election is lost (by a “terrible candidate”), the outcome seems obvious – and thus blameworthy.


But, as research shows, we often do not expect something to happen until it does. Only then do we clearly see the forces that triggered the event, and feel unsurprised. Because outcomes seem as if they should have been foreseeable, we are more likely to blame decision makers for what are, in retrospect, “obvious” bad choices, rather than praise them for good ones (which also seem “obvious”). As the 19th century philosopher Søren Kierkegaard said, “Life is lived forwards, but understood backwards.”


With six seconds remaining in the first half, Carroll decided to throw for a touchdown – a risk that could have resulted in the clock expiring. Better to kick a safe field goal, argued the NBC announcers. But the gamble worked, and Carroll was acknowledged to have made a “gutsy” call.


Then, at the game’s end, with 26 seconds left – and victory less than a yard away – Carroll and his offensive coordinator ventured a fateful, intercepted pass. Fans exploded on social media. A (less-explicit) version of most reactions: “With three downs and one timeout to go, and the league’s best power runner at the ready, what was he thinking?”


“The stupidest coaching decision I can recall,” I vented to my wife afterwards, aided by 20/20 hindsight.


But like all the football fans who made Coach Carroll an object of national ridicule, I was judging the call after knowing the outcome. The next morning I reassessed the situation. With one timeout, I now realized, Seattle could venture, at most, two running plays. The attempted pass was a free third play – which, if incomplete, would still leave them with the same two possible running plays. Moreover, the odds of an interception at the one-yard line are, I later learned, even less than the odds of a fumble. And had a touchdown pass arrived in the receiver’s hands a half-second sooner, we could use game theory to explain how the wily Seahawks won by doing what their opponent least expected.
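The down-and-clock arithmetic above can be put into a toy expected-value sketch. Every probability below is invented for illustration (the post does not give the actual per-play numbers); the point is only the structure: an incomplete pass stops the clock and leaves the same two running plays, so a small interception risk can still raise the overall scoring chance.

```python
# Toy model of the goal-line decision. All probabilities are
# hypothetical placeholders, chosen only to illustrate the logic.
P_RUN_TD = 0.55    # assumed chance one run from the one-yard line scores
P_PASS_TD = 0.50   # assumed chance the quick pass scores
P_PICK = 0.02      # assumed interception chance (reportedly very low)

def p_score_run_only():
    """With one timeout left, only two running plays fit in 26 seconds."""
    return 1 - (1 - P_RUN_TD) ** 2

def p_score_pass_first():
    """The pass is a 'free' third play: if incomplete, the clock stops
    and the same two runs remain; an interception ends the drive."""
    p_incomplete = 1 - P_PASS_TD - P_PICK
    return P_PASS_TD + p_incomplete * p_score_run_only()

print(f"run-run only: {p_score_run_only():.3f}")
print(f"pass first:   {p_score_pass_first():.3f}")
```

With these made-up numbers the pass-first strategy scores more often; the qualitative point (a free third play helps unless the turnover risk is large) is exactly what the morning-after reassessment captures.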


Responding to those who claimed he made “the worst call ever,” Carroll later explained to Today Show host Matt Lauer, “It was the worst result of a call ever. The call would have been a great one if we caught it. It would have been just fine and no one would have thought twice about it.”


One statistical analysis of the decision concluded that Carroll had indeed made a smart call – that he had slightly increased his team’s chance of a win. What’s more, by that evidence-based reckoning, the bad coaching decision was made by New England coach Bill Belichick, who, instead of calling a timeout, opted to let the clock run down (which would have deprived quarterback Tom Brady of another scoring opportunity in the likely event of a Seattle touchdown).


But probabilities are not certainties. In sports, as in life, good decisions can yield bad outcomes. Bad decisions can have lucky outcomes. And once outcomes are known we immediately retrofit our thinking. Thanks to hindsight bias, “winning erases all sins.” And losing makes a good coaching call look not gutsy but just plain stupid – even to a psychologist-fan who temporarily forgot what he teaches.


(Note: This essay was simultaneously co-published elsewhere.)

Originally posted on February 9, 2015.


Northeastern University history professor Benjamin Schmidt is making waves, after harvesting 14 million student reviews from “Rate My Professors.”  He offers a simple interactive tool that allows you—perhaps as an in-class demonstration—to compare words that students use to describe male and female professors.


You can give it a try, here.  I entered some intelligence-related words (“smart,” “brilliant,” “genius”) and some emotion-related words (“sweet,” “nasty”).  Even as one who writes about gender stereotypes, I was stunned by the gender effect.

Originally posted on February 10, 2015.


After falsely reporting being grounded by rocket fire while on a military helicopter in Iraq, and subsequently having his reported experiences during Hurricane Katrina challenged, NBC Nightly News anchor Brian Williams has been grounded by pundit fire.


Williams apologized, saying he misremembered the Iraqi incident. Talk shows and social media doubted anyone could misremember so dramatically, labeling him “Lyin’ Brian,” and grafting Pinocchio’s nose onto his face.


It’s possible that Williams is, indeed, a self-aggrandizing liar (meaning he knowingly and intentionally told untruths). But to those who know the dramatic results of research into sincere but false memories by Elizabeth Loftus and others, it’s also believable that, over time, Williams constructed a false memory...and that Hillary Clinton did the same when misreporting landing under sniper fire in Bosnia in 1996.


Most people view memories as video replays. Actually, when we retrieve memories, we reweave them. We often then replace our prior memory with a modified memory—a phenomenon that researchers call “reconsolidation.” In the reconsolidation process, as more than 200 experiments have shown, misinformation can sneak into our recall.


Even imagined events may later be recalled as actual experiences. In one new experiment, people were prompted to imagine and repeatedly visualize two events from their past—one a false event that involved committing a crime in their adolescence. Later, 70 percent reported a detailed false memory of having committed the crime!  False memories, like fake diamonds, seem so real.


Is Brian lyin’? Does he have Pinocchio’s nose for the news? Perhaps—though telling blatant untruths surely is not a recipe for enduring success and prestige in his profession.


Or might he instead be offering us a fresh example of the striking malleability of human memory?


P.S. The morning after I submitted these reflections for posting, the New York Times offered this excellent (and kindred-spirited) application of false memory research to the Williams episode.

Does Music Help Memory?

Posted by David Myers, Jul 18, 2016

Originally posted on March 4, 2015.


When was the last time you studied without distractions? Or did any one activity without simultaneously doing another?


Multitasking pops up everywhere. While we work, we check our phones for messages, tweet our thoughts, listen to music, and update our Facebook status. At least that’s what my students tell me they do in their other classes.


You may think listening to music while you prep for a big test helps you relax so you can concentrate and study. I used to think so. In college, I’d sit down with my textbooks, pop in my headphones, and turn on my favorite music to set the mood for studying.  Then I’d spend hours going over the material—and play my make-believe drums or air guitar! Yes, I studied alone in college. A lot.


Listening to music may help college-aged students stay focused, but one new study found that older adults had more trouble remembering information they had learned while music was played in the background.


The study challenged younger and older adults to listen to music while trying to remember names. For the older adults, silence was golden. But when the researchers made the older adults listen to music while they tried to remember the names, their memory lapsed. College-aged participants’ performance did not suffer regardless of whether they listened to music while memorizing names.


Before you turn up the tunes to study, consider your age first. If you’re a younger college student, keep pressing play. Older students might be better off studying in silence. Regardless of our age, we might do well by taking a few minutes each day to set aside distractions, slow down, and become mindful of our thoughts, feelings, and environment. 

Originally posted on March 6, 2015.


My nominee for psychology’s most misunderstood concept is negative reinforcement (which is not punishment, but actually a rewarding event—withdrawing or reducing something aversive, as when taking aspirin is followed by the alleviation of a headache).


In second place on my list of oft-misunderstood concepts is heritability.


My publishers’ twitter feed today offered this:



Sure enough, the news source says it’s so.  But it isn’t.  Tracking back to the actual study, and its own press release, we see that, as we might have expected, the conclusion was drawn from a twin study that estimated the genetic contribution to variation among individuals in autism spectrum disorder (ASD) scores.


Heritability refers to the extent to which differences among people are due to genes. If the heritability of ASD is 80 percent, this does not mean that 80 percent of autism cases are attributable to genes and 20 percent of cases to environment. And it does not mean that any particular case is 80 percent attributable to genes and 20 percent to environment.  Rather it means that, in the context studied, 80 percent of the differences among people were attributable to genetic influence.
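The point that heritability describes variation across a population, not a causal split within any one person, can be shown with a minimal simulation (all numbers invented):

```python
import random

random.seed(1)
N = 100_000

# Each simulated person's trait score is the sum of a genetic and an
# environmental contribution. The spreads are arbitrary illustrative choices.
genes = [random.gauss(0, 2.0) for _ in range(N)]  # variance ~ 4
envs = [random.gauss(0, 1.0) for _ in range(N)]   # variance ~ 1
trait = [g + e for g, e in zip(genes, envs)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Heritability: the share of trait *variance* attributable to genetic
# differences between people -- here roughly 4 / (4 + 1) = 0.8.
h2 = variance(genes) / variance(trait)
print(f"heritability = {h2:.2f}")
```

A heritability of 0.8 here says nothing about any single simulated person, whose score is always a blend of both components; it describes only how the differences across the 100,000 scores arise.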

Originally posted on March 10, 2015.


Nathan and David’s monthly synopses of important new findings reported in Current Directions in Psychological Science continue, and include their teaching ideas.


In the February APS Observer, Nathan shines a light on “dark personalities.”  “Some people have hidden lusts or greed,” he notes, “whereas others embezzle millions. Understanding the science of dark personality helps us avoid labeling people as simply good or bad. By shining a light on the ingredients of a dark personality, we can learn who we ought to fear and when to fear them.”


In the same issue, David summarizes the emerging field of health neuroscience, and suggests ways to help students think about brain ↔ body interaction.


In the upcoming March issue, Nathan explains “When Two Emotions are Better than One” and suggests how to teach students the importance of emotional differentiation.



Also in the March issue, David identifies ways in which “Psychological Science Meets Religious Faith”—a topic of increasing interest in psychology:



Originally posted on March 17, 2015.


One of the pleasures of writing psychological science is learning something new nearly every day, from the continual stream of information that flows across my desk or up my screen.  Some quick examples from the last few days:


Nudging nutrition. Joseph Redden, Traci Mann, and their University of Minnesota colleagues report a simple intervention that increases schoolchildren’s veggie eating.  In a paper to appear in PLOS ONE, they report—from observations of 755 children in a school cafeteria—that, for example, offering carrots first in the serving line (in isolation from other foods to come) quadrupled their consumption. For more on healthy eating nudges, see Mann’s forthcoming book, Secrets from the Eating Lab.


Hugging prevents colds.  In new research by Sheldon Cohen and his team, social support, indexed partly by the frequency of experienced hugs, predicted fewer and milder infections among 404 healthy people exposed to a cold virus.  A hug a day keeps sickness away?


Finger digit ratio predicts national differences in gender inequality?  It’s not news that nations vary in female political representation, workforce participation, and education.  It was news to me that they reportedly also vary in 2D:4D—that’s the ratio of the index (2D) and ring finger (4D) lengths.  Nations that purportedly show relatively high female fetal testosterone exposure (supposedly manifest as low 2D:4D) and relatively low male fetal testosterone exposure (high 2D:4D) have higher rates of female parliamentary and workforce participation. Hmmm.


How effective is repetitive transcranial magnetic stimulation (rTMS) for treating depression?  A few well-publicized studies suggested it was effective. But a new meta-analysis of all the available studies indicates this treatment actually provides only “minimal clinical improvement.” And this is why teachers and authors need to consider all of the available research, and not just isolated studies.


It’s not all in our genes: Exercise really is healthy. Finnish researchers studied 10 identical male twin pairs—in each pair, one twin regularly exercised and the other did not. Despite having similar diets, the sedentary twins had more body fat, more insulin resistance, less stamina, and less brain gray matter. The moral to us all:  join the movement movement.

Originally posted on March 25, 2015.


During a recent visit to Stanford University, psychologist Jeremy Bailenson (pictured) invited our small group of conferees to his Virtual Human Interaction Lab, where he explained his studies and invited us each to experience a virtual world, complete with surround sound and vibrating floor.



His expressed aim is to “give you an experience that changes how you think about yourself,” and then to assess the aftereffects. In our group, brain scientist Antonio Damasio found his left and right legs and arms switched, as he popped virtual balloons.  Anthropologist Mel Konner found his identity shifted into a mirrored person.  I found myself in a beautiful forest, cutting down trees, and then underwater in a beautiful lagoon, with fish flying by.



Bailenson reports that men who become female avatars later are more nurturing.  Heroes who gain the ability to fly around a city (navigating by arm movements) later become more helpful.  Those who age in front of their eyes become more future oriented.


In such ways, Bailenson explores “how virtual reality can change the way people think about education, environmental behavior, empathy, and health.”

Born to Learn From Us

Originally posted on March 27, 2015.


At a recent foundation consultation at Stanford, I enjoyed meeting Andrew Meltzoff, the amiable and articulate co-director of the University of Washington’s Institute for Learning and Brain Sciences in my home city (where he lives but a short walk from my former high school).


Meltzoff is known to psychology teachers and students for his many studies of infant imitation, including his classic 1977 Science report on 2- to 3-week-old infants imitating his facial gestures. It was, he reported, a powerful experience to stick out his tongue and have newborns do the same. “This demonstrates to me the essential socialness of human beings.”




I’ve always wondered what newborns really are capable of visually perceiving, and he reminded me that it’s not much—but that they have their best acuity for the distance between their mother’s breast and eyes, which also was the distance between his face and the infants’ eyes.


His lab is now reading infants’ brains using the world’s only infant brain imaging MEG (magnetoencephalography) machine, which reads brain magnetic activity more finely than is possible with EEG.



He reports that “When a brain sees, feels, touches, or hears, its neuronal activity generates weak magnetic fields that can be pinpointed and tracked millisecond-by-millisecond by a MEG machine.” That is allowing Meltzoff and his colleagues to visualize an infant’s working brain as the infant listens to language, experiences a simple touch on the hand, or (in future studies) engages in social imitation and cognitive problem solving.


On the horizon, he envisions future studies of how children develop empathy, executive self-control, and identity. He also anticipates exploring how children’s brains process information from two-dimensional digital media versus their three-dimensional everyday world, and how technology can best contribute to children’s development. In such ways, they hope to “help children maximize their learning capabilities.”

Originally posted on April 2, 2015.


Facebook, Google, and Twitter, among others, are enabling psychologists to mine giant data sets that allow mega-scale naturalistic observations of human behavior. The recent Society of Personality and Social Psychology convention offered several such “big data” findings, including these (some also recently published):


  • “Computer-based personality judgments are more accurate than those of friends, spouses, or family.” That’s how Michal Kosinski, Youyou Wu, and David Stillwell summed up their research on the digital trail left by 86,220 people’s Facebook “likes.” As a predictor of “Big Five” personality test scores, the computer data were significantly more accurate than friends’ and family members’ judgments. (Such research is enabled by the millions of people who have responded to tests via Stillwell’s myPersonality app, and who have also donated their Facebook information, with guarantees of anonymity.)
  • Another study, using millions of posts from 69,792 Facebook users, found that people who score high on neuroticism tests use more words like “sad,” “fear,” and “pain.” This hints at the possibility of using social media language analysis to identify people at risk for disorder or even suicide.
  • Researchers are also exploring smartphones as data-gathering devices. Jason Rentfrow (University of Cambridge) offers an app for monitoring emotions (illustrated here), and proposes devices that can sense human behavior and deliver interventions. In such ways, it is becoming possible to gather massive data, to sample people’s experiences moment-to-moment in particular contexts, and to offer them helpful feedback and guidance.



Amid the excitement over today’s big data, psychologist Gary Marcus offers a word of caution: “Big Data is brilliant at detecting correlation....But correlation never was causation and never will be...If we have good hypotheses, we can test them with Big Data, but Big Data shouldn’t be our first port of call; it should be where we go once we know what we’re looking for.”

Originally posted on April 7, 2015.


The April APS Observer is out with an essay by Nathan, “The Truth About Trust.” Drawing from the work of Paul Van Lange, it identifies principles of trust—as learned, socially received, reasonable, and constructive. The essay also offers three easy classroom activities that engage students in thinking more deeply about trust.


In the same issue, my essay on “How Close Relationships Foster Health and Heartaches” suggests how instructors might engage students’ thinking about everyday stress and social support.  It then summarizes, from the work of Karen Rook, the benefits and costs of social relationships, and how relationships impact our health and well-being, for better and for worse.



Originally posted on April 14, 2015.


As most introductory psychology students learn, negative emotions often affect health. And persistent anger can lash out at one’s own heart.


Might negative emotions, such as anger, also be risk factors for entire communities? In an amazing study in the February Psychological Science, Johannes Eichstaedt and thirteen collaborators ascertained heart disease rates for each of 1,347 U.S. counties. They also obtained from Twitter 148 million county-identified tweets from these 1,347 counties.


Their finding: a county’s preponderance of negative emotion words (such as “angry,” “hate,” and various curse words) predicted its heart disease deaths “significantly better than did a model that combined 10 common demographic, socioeconomic, and health risk factors, including smoking, diabetes, hypertension, and obesity.” A preponderance of positive emotion words (such as “great,” “fantastic,” and “enjoyed”) predicted low heart disease rates.


Given that the median Twitter user is age 31, and the median heart disease victim is much older, why should Twitter language so successfully predict a county’s heart disease-related deaths? Younger adults’ tweets “may disclose characteristics of their community,” surmise the researchers, providing “a window” into a community’s social and economic environment. An anger-laden community tends to be, for all, a less healthy community, while happier makes for healthier.




Originally posted on May 1, 2015.


One of the striking discoveries of psychological science is the malleability of memory, as illustrated by the “misinformation effect.”  Experiments by Elizabeth Loftus and others exposed participants to false information. Afterwards, they misremembered a stop sign as a yield sign, a screwdriver as a hammer, a peanut can as a Coke can, and a clean-shaven man as having a mustache.


Even just imagining nonexistent happenings can create false memories—of being sickened by rotten eggs, having one’s finger caught in a mousetrap, of encountering Bugs Bunny (a Warner Bros. character) at Disneyland, or even of preschool child abuse.


For me the most stunning finding is the most recent.  After collecting background information from university students’ parents, researchers Julia Shaw and Stephen Porter prompted the students to recall two events from their past—one a false early adolescent event, embedded in true details of the students’ life, such as the name of a friend.  For some, this nonexistent event involved committing a crime, such as a theft, assault, or even assault with a weapon.  After three interviews, 70 percent of students reported false (and usually detailed) memories of having committed a crime.


The bottom line: Our memories are not just replays of our past experiences.  Rather, we actively construct our memories at the time of recall.  And that fact of life has implications for criminal interrogation, eyewitness recollections, and even memories retrieved during psychotherapy.

Originally posted on May 7, 2015.


Despite concerns that video game-playing teaches social scripts for violence, recent research also suggests a cognitive benefit: sharpened visual attention, quickened reaction speed, and improved spatial abilities, such as eye-hand coordination. Experienced game players tend to be perceptually quick and astute.


But a just-released pair of studies, by University of Oregon researcher Nash Unsworth and five collaborators, casts doubt on claims that video game-playing enhances cognitive abilities. The new studies confirmed a cognitive benefit when comparing extreme video-game players with nonplayers. But across broader and larger samples of people, game-playing correlated near zero with various cognitive abilities. “Overall, the current results suggest weak to nonexistent relations between video-game experience—across a variety of different games—and fundamental cognitive abilities (working memory, fluid intelligence, attention control, and speed of processing).”


Two of the study’s co-authors, Zachary Hambrick and Randall Engle, have also published studies and research reviews that question the popular idea that brain-training games enhance older adults’ intelligence and memory. Despite the claims of companies marketing brain exercises, brain training appears to produce gains only on the trained tasks (without generalizing to other tasks). Moreover, though we might wish that thousands of hours of practice could transform us into superstar athletes or musicians, Hambrick’s other research shows that superstar achievers are distinguished at least as much by their extraordinary natural talent as by their self-disciplined daily routine.


The opposite of a truth is sometimes a complementary truth. Educational interventions that aim to enhance grit, or that promote a “growth mindset” (rather than fatalistically seeing intelligence as fixed), also boost achievement. And in yet another new study, British children who display self-control become, as adults, less vulnerable to unemployment.


So, video-game and brain training exercises appear to have limited cognitive benefits. Natural talent matters. Yet the disciplined ability to delay gratification and to sustain effort also matters. “If you want to look good in front of thousands,” goes a saying attributed to Damian Lillard, “you have to outwork thousands in front of nobody.”




Happy Tea Drinkers

Originally posted on May 28, 2015.


Some studies put a smile on my face, as happened when reading a new meta-analysis of tea drinking’s association with lower risk of depression. As a tea-drinking happy person, I was pleased that eleven studies of 22,817 people reveal that regular tea drinking predicts a 31 percent decreased depression risk.  There is also a dose-response relationship:  the more tea people drink, the less their depression risk.


The analysis was done by researchers in China (where I enjoyed tea at every meal in a recent visit to Beijing).  And with the exception of two Finnish studies, all the research was conducted in tea-drinking Asia (China, Japan, Taiwan, and Singapore).




Although the finding is correlational, the Huazhong University researchers did find the association for both green and other teas, and also when controlling for diet, exercise, alcohol, and smoking.  Thus, they conclude, “tea consumption may act as an independent protective factor for depression.  Given that tea is widely consumed, has few documented adverse effects, and is relatively inexpensive, its potential in treating and preventing depression should be recognized.”


Time for my afternoon cuppa...

Originally posted on June 4, 2015.


One curiosity of recent psychological science is what I’ve called the “religious engagement paradox”: The association between religious engagement and human flourishing is negative across places and positive across individuals. For example, in the most religious U.S. states people die sooner, commit more crime, divorce more, smoke more, and report lower emotional well-being than in the least religious states. Yet more religiously engaged individuals live longer, commit less crime, divorce less, smoke less, and are happier. (Don’t believe it? See here.)


Princeton economist Angus Deaton and psychologist Arthur Stone (2013) share my puzzlement (here): “Why might there be this sharp contradiction between religious people being happy and healthy, and religious places being anything but?” (One possible answer, as Ed Diener, Louis Tay, and I suggested, lies in the more impoverished life circumstances of people in highly religious countries and states.)


As I noted earlier, there also is a parallel “wealth and politics paradox”: In the U.S., low-income states and high-income individuals more often vote Republican:






Now we have a report of yet another paradox: In Europe, “More liberal countries and more conservative individuals have higher levels of SWB [subjective well-being].”


And another: People in highly religious states do more Google searches for sexually explicit content such as “gay sex,” as I was able to replicate using Google archives. So I couldn’t resist asking the lead researcher, Cara MacInnis at the University of Toronto, if it might nevertheless also be true that more religious individuals do less online searching for sexual content. Stay tuned, but MacInnis tells me that her latest data (paper forthcoming) do, indeed, seem to fit the religious engagement paradox pattern.


The repeated lesson: how we ask the question (comparing aggregate or individual data) can sharply change the answer. So beware: partisans on both sides can pick their data to make their point.
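The aggregate-versus-individual reversal is easy to reproduce with toy numbers. In the sketch below (data entirely invented), engagement helps every individual within each "state," yet the more-engaged state still averages worse because its baseline circumstances are poorer:

```python
# Invented data: two states differing in baseline well-being and in the
# share of residents who are "engaged".
states = {
    "more_engaged_state": {"baseline": 3.0, "engaged_rate": 0.8},
    "less_engaged_state": {"baseline": 6.0, "engaged_rate": 0.2},
}
BOOST = 1.0  # within either state, engaged individuals score this much higher

def mean_wellbeing(s):
    # State average = baseline plus the boost earned by its engaged share.
    return s["baseline"] + s["engaged_rate"] * BOOST

more = mean_wellbeing(states["more_engaged_state"])  # 3.0 + 0.8 = 3.8
less = mean_wellbeing(states["less_engaged_state"])  # 6.0 + 0.2 = 6.2

# Across individuals, engagement predicts higher well-being (BOOST > 0);
# across places, the more-engaged state nevertheless scores lower.
print(more < less)  # True
```

The sign of the comparison flips depending on whether one compares individuals within a place or averages across places, which is the paradox in miniature.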

Originally posted on June 11, 2015.


It won’t surprise you to learn that your perceived gender, inferred from your biological sex, may lead people to stereotype you as best suited for masculine- or feminine-typed occupations. People perceive women as more feminine—and as better suited to presumed feminine occupations such as librarian or caregiver. They perceive men as more masculine—and as a better fit for masculine occupations such as security patrol or firefighter.


But might it surprise you—as it did me—to know that research teams led by Adam Galinsky and by Kerri Johnson have found that people also have gender stereotypes associated with race? As Erika Hall, Galinsky, and Katherine Phillips explain in the June Personality and Social Psychology Bulletin, “Asians are perceived as feminine and Blacks as masculine” (with Whites in between).


In five studies, Hall and her colleagues found that people’s “gender profile”—based both on their sex and their race—influences others’ judgments of how well suited and hirable they are for masculine- or feminine-typed occupations. Asian women (denoted by names and checkboxes) were deemed best suited for a librarian position and least suited for a campus security patrol. And Black men were judged least suited for a librarian position and best suited for a security patrol. Whites were deemed in between.


Simply said, the “gender of one’s race as well as one’s biological sex creates one’s gender profile.” And one’s sex + race gender profile predicts perceived “person-position fit.” Those conclusions, say the researchers, “shed light on how occupational gender and racial segregation persist.”




Originally posted on June 16, 2015.


I was alerted, by this article in Nature, to a new report on sexual orientation from the Academy of Science of South Africa.  The report is state of the art.  It’s lucid and easily readable.  It gets the important facts exactly right (methinks).  And it speaks to pertinent issues in African countries, and also to controversies here in the USA. “Spread the word. Share the report and its findings,” opined Nature in a separate editorial.


The full report, here, responds, for example, to the contention that condoning homosexuality increases it.  It notes that, in African countries (in most of which same-sex relationships are illegal), “The prevalence . . . is no different from other countries in the rest of the world.”  The report estimates, from the best worldwide data, “that between 350 million and 400 million people are not heterosexual.  At least 50 million people who do not claim a heterosexual orientation live in African countries.”

Originally posted on June 30, 2015.


From the daily information stream that flows across my desk or up my computer screen, here is a recent news flash:


How marital support gets under the skin. A mountain of research shows that good marriages predict better health and longer life. But why? In a longitudinal study, Richard Slatcher and colleagues found that the perceived responsiveness of one’s partner predicted healthier stress hormone levels ten years later. “Our findings demonstrate that positive aspects of marriage—not only partner responsiveness but also provision of emotional support—may help shape the HPA axis in beneficial ways, potentially leading to long-term changes in cortisol production.” (The HPA axis is the hypothalamic–pituitary–adrenal network that controls our reactions to stress.)

Originally posted on July 7, 2015.


From the daily information stream that flows across my desk or up my computer screen, here is a recent news flash:


Global data on mental illness. New global disease data published this week by The Lancet indicate the worldwide prevalence of schizophrenia (24 million people), anxiety disorders (266 million), major depressive disorder (253 million), and bipolar disorder (49 million). Major depressive disorder now trails only low back pain as a source of “years lived with disability.”

Originally posted on July 14, 2015.


From the daily information stream that flows across my desk or up my computer screen, here is a recent news flash:


Global hearing loss. As an advocate for people with hearing loss (see here), I also noted the global prevalence of hearing loss: 1.23 billion people. Of course, the number depends on the definition. This global survey defined hearing loss as >20 decibels loss. In the U.S., the National Institute on Deafness and Other Communication Disorders reports that “approximately 17 percent (36 million) of American adults report some degree of hearing loss.” According to a 2011 report based on audiometric testing of Americans 12 and older in the National Health and Nutritional Examination Surveys (NHANES), 30 million Americans have at least a 25 decibel hearing loss in both ears and 48 million in one or both ears.

Originally posted on July 21, 2015.


From the daily information stream that flows across my desk or up my computer screen, here is a recent news flash:


Money matters more to midlife folks than to those younger and older. There’s a modest correlation between income and life satisfaction, note Felix Cheung and Richard Lucas. Their analyses of three national data pools found that correlation to be strongest for people in their 30s to 50s. It makes sense, they reflect: midlife adults have more financial responsibility for their children and sometimes their aging parents. College students and older adults more often enjoy financial support apart from income.

Originally posted on July 28, 2015.


Sherlock Holmes famously solved the “Silver Blaze” case by noticing what no one else had—the dog that didn’t bark. What grabs our attention is seldom the absence of something, but rather its visible presence.


And so with sexuality. Various sexual-attraction patterns capture our fascination...except one: asexuality—the absence of sexual attraction to others.


But Brock University psychologist Anthony Bogaert (a Sherlock Holmes of sex research) noticed. In a new review article, he reports what has been learned since his 2004 paper reporting that one percent of a British national sample acknowledged they had “never felt sexual attraction” to others. Some highlights (also reported in his book, Understanding Asexuality):


  • The numbers: In the aftermath of several subsequent surveys, one percent still seems “a reasonable ‘working figure.’”
  • Asexuality in animals: Like humans, lab rodents vary in sexual interest, from hypersexualized to disinterested. Ditto rams, with 12.5 percent of 584 tested by Charles Roselli and colleagues displaying no attraction either to ewes in estrus or to other rams.
  • Asexuality does not equal lack of sexual desire. “A significant number of asexual people masturbate,” although “at a lower level than sexual people.” For asexual people, masturbation is more an expression of solitary desire, without fantasizing any attraction or desire for others. Some asexuals have—my new word for the day—“automonosexualism” (a sexual attraction “turned inward” onto oneself).
  • Gender. “There is evidence that more women than men are asexual.” But among asexuals, more men masturbate, and “asexual men may have elevated paraphilic [atypical] attractions” that accompany their masturbation.
  • Biology and asexuality. Asexual men and women tend to be shorter and more often non-right-handed than average. But there’s no evidence that asexual rodents and humans differ from their sexualized counterparts in levels of circulating testosterone.
  • Is asexuality a disorder? Men’s Hypoactive Sexual Desire Disorder (HSDD) and women’s Female Sexual Interest/Arousal Disorder (FSIAD) become DSM-5 disorders only “if the patient/client is in distress.” Thus, asexuality, unaccompanied by distress, is not a disorder.


Indeed, muses Bogaert, everyday sexuality—an occasional “form of madness”—might better qualify as a disorder, given its association “with extreme and risky behaviors along with impaired cognitive function.”

Originally posted on August 4, 2015.


From the daily information stream that flows across my desk or up my computer screen, here is a recent news flash:


With age we mellow. A European research team led by Annette Brose sampled people’s emotions across 100 days. One finding: young adults’ self-reported emotions were more variable. This reminds me of Mihaly Csikszentmihalyi and Randy Larson’s long-ago sampling, using pagers, of people’s experience. Young teenagers, they found, typically descend from elation or ascend from gloom in less than an hour. Adult moods are less extreme but more enduring. Having survived past sufferings and enjoyed past thrills, mature people look beyond the moment.

Originally posted on August 26, 2015.


Imagine yourself on a Toronto to Lisbon flight. Five hours after takeoff and with open seas beneath you, your pilots become aware of fuel loss (a fractured fuel line is leaking a gallon per second). Declaring an emergency, the pilots divert toward an air base in the Azores. But while still 135 miles out, one engine dies of fuel starvation, and then, still some 75 miles out, the other. Moreover, your aircraft has lost its main hydraulic power, which operates the flaps.


In eerie silence, and with nothing but water beneath, you are instructed to put on a life jacket and, when hearing the countdown to ocean impact, to assume a brace position. Periodically the pilot announces “[X] minutes to impact.” With the ocean’s surface approaching, you keep thinking, “I’m going to die.”



Lisa Noble Photography/ Moment Open/ Getty Images


But good news: when the engines went silent, you were still 33,000 feet in the air, and your captain is an experienced glider pilot. And the bad news: You are losing some 2000 feet per minute. After minutes of descent, the pilot declares above the passenger screams and prayers, “About to go into the water.”


Then, “We have a runway! We have a runway!. . . Brace! Brace! Brace!”


Nineteen minutes after losing all engine and primary electrical power and after a series of violent turns, the plane reaches the air base, making a damaging hard landing. You and 305 other passengers and crew members have escaped death. Your pilots return home as heroes. And your flight becomes the subject of television dramas.


For psychologist Margaret McKinnon, now at McMaster University and St. Joseph's Healthcare Hamilton, this traumatic flight was not imaginary. It was the real August 24, 2001 Air Transat Flight 236, and she, as a honeymoon passenger, was among those thinking “I’m going to die.”


Seizing this one-time opportunity to test people’s memory for details of a recorded traumatic event, McKinnon and her Baycrest Health Sciences colleagues Brian Levine and Daniela Palombo tracked down 15 of her fellow passengers. In a recent Clinical Psychological Science article, she reports that seven met criteria for PTSD, and that all of them, some four years later, exhibited vivid, “robust” memories of the details of their experiences.


In a follow-up study, also appearing in Clinical Psychological Science, eight of the passengers underwent fMRI scans while recalling the trauma. Their “enhanced” amygdala activation suggested that the amygdala may, via its links to the hippocampus and visual cortical areas, help create such emotion-fixed memories.


The persistent memories from Flight AT236 confirm what other researchers have found—that it’s much easier to forget neutral events (yesterday’s parking place) than emotional experiences, especially extreme emotional experiences. After observing a loved one’s murder, being terrorized by a hijacker or rapist, or losing one’s home in a natural disaster, one may wish to forget. But such traumas are typically etched on the mind as persistent, haunting memories—for survivors of Nazi death camps, “Horror sear[ed] memory.” With many forms of trauma comes not repression but, more often, “robust” memory.


Note:  Don’t let this essay leave you thinking that commercial flying is dangerous.  From 2009 to 2011, Americans were—mile for mile—170 times more likely to die in a vehicle accident than on a scheduled flight. In 2011, 21,221 people died in U.S. car or light truck accidents, while zero (as in 2010 and as on AT236) died on scheduled airline flights. When flying, the most dangerous part of the trip is your drive to the airport.

David Myers

Love Sees Loveliness

Posted by David Myers Expert Jul 18, 2016

Originally posted on August 27, 2015.


It seems unfair . . . that mere skin-deep beauty should predict, as it has in so many studies, people’s dating frequency, popularity, job interview impressions, and income, not to mention their perceived health, happiness, social skill, and life success. “Personal beauty is a greater recommendation than any letter of introduction,” said Aristotle.


Evolutionary psychologists see biological wisdom in our positive response to bodily shapes and facial clues to others’ health and fertility. Still, how unjust, this penalty for plainness—and especially so in today’s world where first impressions sway choices in settings from speed dating to Tinder swipes.


Despite some universal aspects of physical attractiveness (such as facial symmetry), those of us with no better than average looks can find some solace in the varying beauty ideals across time and place. Today’s overweight was, in another era, Rubens’ pleasingly plump “Venus at a Mirror.”


And we can find more comfort in a soon-to-be-published study by Lucy Hunt, Paul Eastwick, and Eli Finkel. Couples who become romantically involved long after first meeting exhibit less “assortative mating” based on similar attractiveness than do couples whose romance forms without prior friendship. For those who are friends before becoming lovers, looks matter less. With slow-cooked love, other factors such as common interests matter more.


This fits with earlier findings (here and here and here).  First, attractiveness is less a predictor of well-being and social connections in rural settings (where people often know those they see) than in urban settings (where more interactions are with strangers, and looks matter more). Second, not only do people’s looks affect our feelings, our feelings affect how we perceive their looks. Those we like we find attractive. The more we love someone, the more physically attractive we find them.


These comforting findings help us answer Prince Charming’s question to Cinderella (in Rodgers and Hammerstein’s musical): “Do I love you because you’re beautiful, or are you beautiful because I love you?” And they remind us of Shakespeare’s wisdom: “Love looks not with the eyes, but with the mind.”


Beauty, thank goodness, truly is in the eye of the beholder.



TOMACCO/ Getty Images

Originally posted on September 16, 2015.


Social psychology’s progressivism has been no secret. Our values inform our interests in topics such as prejudice, sexism, violence, altruism, and inequality.


Still, I was a bit stunned, while attending the January, 2011, Society of Personality and Social Psychology convention, when our colleague Jonathan Haidt—as part of his plea for more ideological diversity—asked for a show of hands. How many of us considered ourselves “liberals”? A sea of hands arose—80 to 90 percent of the thousand or so attendees, Haidt estimated (here). And how many considered themselves “centrists” or “moderates”? About 20 hands rose. “Libertarians?” A dozen. “Conservatives?” Across that ballroom, three hands were visible.


As one of the respondents, I remember thinking: If the media are here, we’re going to read about this. And, indeed: see here and here.


And now comes another survey that makes the same point. For an upcoming chapter for a volume on politics in psychology, social psychologist Bill von Hippel surveyed fellow members of the invitation-only Society of Experimental Social Psychology. Among his findings (reported in an e-mail to participants): “When asked your preference in the last presidential election, Obama beat Romney 305 to 4.”


To our credit, we social psychologists check our presumptions against data. We have safeguards against bias. And we aim to let the chips fall where they may (which includes research that documents the social toxicity of pornography and the benefits of covenant relationships that satisfy the human need to belong).


Still, by a huge margin, social psychologists are liberal (much as certain other professions, such as medicine, the military, and law enforcement tend to be populated by conservatives).


Why social psychology’s liberalism? Does our discipline’s focus on the power of social situations liberalize us? Are psychology departments less open to admitting and hiring conservatives? Or do liberals self-select into academia, including the behavioral sciences? Such are among the answers proposed.

Originally posted on September 29, 2015.


My last blog essay reported surveys that show social psychologists are mostly political liberals. But I also noted that “To our credit, we social psychologists check our presumptions against data. We have safeguards against bias. And we aim to let the chips fall where they may.”


Fresh examples of such evidence-based reasoning come from two recent analyses. The first has been welcomed by some conservatives (who doubt that sexism is rife in academic hiring). The second has been welcomed by liberals (who see economic inequality as psychologically and socially toxic).


(1) Using both actuarial and experimental studies, Cornell psychologists Stephen Ceci and Wendy Williams looked for possible sexism in academic hiring, but found that “in tenure-track hiring, faculty prefer female job candidates over identically qualified male [candidates].”


Their Chronicle of Higher Education defense of their work reminded me of a long-ago experience. Hoping to demonstrate sexism in action, I attempted a class replication of Philip Goldberg’s famous finding that people give higher ratings to an article attributed to a male (John McKay) than to a female (Joan McKay). Finding no such difference, my student Janet Swim (now a Penn State social psychologist) and I searched for other attempts to replicate the finding. Our published meta-analysis, with Eugene Borgida and Geoffrey Maruyama, confirmed Ceci and Williams’ negligible finding.


Neither Ceci and Williams today, nor we yesterday, question other manifestations of cultural sexism. Rather, in both cases, “Our guiding principle,” to use Ceci and Williams’ words, “has been to follow the data wherever it takes us.”


(2) Following the data also has led social psychologists to see the costs of extreme inequality. As I noted in an earlier TalkPsych essay, “psychologists have found that places with great inequality tend to be less happy places...with greater health and social problems, and higher rates of mental illness.”


In soon-to-be-published research, Shigehiro Oishi and Selin Kesebir observe that inequality also explains why economic growth often does not improve human happiness. My most oft-reprinted figure, below, shows that Americans today are no happier than they were in 1957 despite having triple the average income. But average income is not real income for most Americans. If the top 1 percent experience massive income increases, that could raise the average but not the actual income for most.
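The average-versus-actual distinction here is the familiar mean-versus-median contrast. A toy calculation (with hypothetical household incomes, not the actual survey data) makes the point:

```python
import statistics

# Five hypothetical households (illustrative numbers only)
incomes_then = [40_000, 45_000, 50_000, 55_000, 500_000]
# Years later: only the top earner's income has grown
incomes_now = [40_000, 45_000, 50_000, 55_000, 5_000_000]

# The mean (average) income soars...
print(statistics.mean(incomes_then))  # 138000
print(statistics.mean(incomes_now))   # 1038000

# ...while the median (typical) household sees no gain at all
print(statistics.median(incomes_then))  # 50000
print(statistics.median(incomes_now))   # 50000
```

Rising average income, in other words, is compatible with a flat income for the typical household, which is why per-capita growth need not translate into greater well-being for most people.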


Indeed, real (inflation-adjusted) median U.S. wages have in fact been flat for some years now. With the rising economic tide lifting the yachts but not the rowboats, might we be paying a psychological price for today’s greater inequality? By comparing economic growth in 34 countries, Oishi and Kesebir show that economic growth does improve human morale when it is widely distributed, but not when “accompanied by growing income inequality...Uneven growth is unhappy growth.”


Ergo, it’s neither conservative nor liberal to follow the data, and—as text authors and essayists—to give the data a voice.

Originally posted on October 14, 2015.


I am just back from a fourth visit to China, where I enjoyed generous hospitality and again spoke to colleagues and students in China’s fast-growing social psychology field. My task was to speak at a Shanghai conference focusing on how the information age is transforming culture, in China as elsewhere.




And what transformational change there has been in but a thin slice of history! The world now has nearly 5 billion mobile phone users (including more than 90 percent of the Chinese population—triple the 30 percent in 2005). And nearly 45 percent of humans are now Internet users (including just over 50 percent in China, compared to fewer than 10 percent in 2005).



Of particular interest to social psychologists is the upsurge in social media. Although blocked in China (as are Google, YouTube, and the New York Times), Facebook now has 1.5 billion subscribers and in late August experienced 1 billion users in a single day—a milestone towards its mission: “to make the world more open and connected.”


My mission in China was to review the benefits, costs, and research opportunities of today’s networked world. The net is shrinking the global village; connecting us with distant family, friends, and colleagues; enabling time-saving e-commerce and telecommuting; and giving us easy access to incredible amounts of information. Of particular interest to psychologists, the Internet is also becoming a vehicle for self-improvement, skills training, and even finding romantic partners. I admit to being surprised (see the data below) by how many people today find their kindred spirits and eventual partners via the Internet (including my co-author Nathan DeWall and his wife, Alice Rudolph DeWall). On the Internet, looks and location matter less to initial relationship formation, and self-disclosure and kindred attitudes and beliefs matter more.



From Myers & DeWall, Psychology, 11th Edition, Presenting National Survey Data from Rosenfeld & Thomas, 2012


But these many benefits come with some costs. Anonymity can enable bullying and sexual exploitation. The Internet time-suck drains time from the face-to-face interactions for which we humans are designed. At its extremes, Internet addiction (including to gambling and pornography) may undermine relationships and productivity. Of greatest interest to me, however, is the Internet as echo chamber—its facilitating the self-segregation of like minds and the resulting group polarization. The Internet indeed has great potential to connect us, but also to deepen social divisions and to promote extremist views and acts.


But what a boon the Internet is to us researchers, which I enjoyed illustrating from colleagues’ harvesting of “big data” from the archives of the U.S. Social Security system, the sporting world, Google, Facebook, Twitter, and national and world surveys.


All this (plus the easier availability of diverse research participants) is wonderful. But as Richard Nisbett reminds us in his new book, Mindware: Tools for Smart Thinking, “A very large N (number of data) may simply make us more confident about a possibly wrong result.” As he cogently illustrates, when it comes to discerning causation, big data archives, even with control variables and mediational analyses, are no substitute for the most powerful instrument in our psychological toolkit: the simple experiment.

Originally posted on October 20, 2015.


In response to the big “Reproducibility Project” news that only 36 percent of a sample of 100 psychological science studies were successfully replicated, psychologists have reassured themselves that other fields, including medicine, also have issues with reproducibility. Moreover, differing results sometimes illuminate differing circumstances that produce an effect.


Others have agreed on a lesson for textbook authors. “A finding is not worth touting or inserting in the textbooks until a well-powered, pre-registered, direct replication is published,” argues Brent Roberts. “The conclusions of textbooks should be based not on single studies but on multiple replications and large-scale meta-analyses,” advise Wolfgang Stroebe and Miles Hewstone.


Those are high standards that would preclude textbook authors reporting on first-time discoveries, some of which are based on big data. Ironically, it would even preclude reporting on the one-time Reproducibility Project finding (can it be replicated?). Even so, my introductory psychology co-author, Nathan DeWall, and I are cautious about reporting single-shot findings. Some intriguing new studies end up not in our texts but in our next-edition resource files, marked “needs a replication.” And we love meta-analyses, which give us the bigger picture, digested from multiple studies.




So, I wondered: How did we do? How many of the nonreproducible studies ended up in Psychology, 11th Edition? Checking the list, my projects manager, Kathryn Brownson, found three of the 100 studies in our bibliography—one of which successfully replicated, one of which produced insufficient data for a replication, and one of which failed to replicate.


Thus, from page 504:


In several studies, giving sugar (in a naturally rather than an artificially sweetened lemonade) had a sweet effect: It strengthened people’s effortful thinking and reduced their financial impulsiveness (Masicampo & Baumeister, 2008; Wang & Dvorak, 2010).


will likely become:


In one study, giving sugar (in a naturally rather than an artificially sweetened lemonade) had a sweet effect: It reduced people’s financial impulsiveness (Wang & Dvorak, 2010).


Ergo, out of 5174 bibliographic citations, one citation—and its five associated text words—will end up on the cutting room floor.

David Myers

Phantom Breast Syndrome

Posted by David Myers Expert Jul 18, 2016

Originally posted on October 27, 2015.


Phantom limb sensations are one of psychology’s curiosities. Were you to suffer the amputation of a limb, your brain might then misinterpret spontaneous activity in brain areas that once received the limb’s sensory input. Thus, amputees often feel pain in a nonexistent limb, and even try to step out of bed onto a phantom leg, or to lift a cup with a phantom hand.


Phantoms also haunt other senses as the brain misinterprets irrelevant brain activity. Therefore, those of us with hearing loss may experience the sound of silence—tinnitus (ringing in the ears in the absence of sound). Those with vision loss may experience phantom sights (hallucinations). Those with damaged taste or smell systems may experience phantom tastes or smells.


And now comes word from the Turkish Journal of Psychiatry that 54 percent of 41 patients who had undergone a mastectomy afterwards experienced a continued perception of breast tissue, with 80 percent of those also experiencing “phantom breast pain.”


As I shared this result (gleaned from the Turkish journal’s contents in the weekly Current Contents: Social and Behavioral Sciences) with my wife, I wondered: Is there any part of the body that we could lose without the possibility of phantom sensations? If an ear were sliced off, should we not be surprised at experiencing phantom ear syndrome?


The larger lesson here: There’s more to perception than meets our sense receptors. We feel, see, hear, taste, and smell with our brain, which can experience perceptions with or without functioning senses.

Originally posted on November 4, 2015.


Writing in the August, 2015, Scottish Banner, University of Dundee historian Murray Watson puzzled over having “failed to find a satisfactory answer” for why Scots’ Scottish identity is so much stronger than their English identity. It’s a phenomenon I, too, have noticed, not only in the current dominance of the Scottish National Party, but also in more mundane ways. When recording their nationality in B&B guest books, I’ve observed people from England responding “British,” while people from Scotland often respond “Scottish” (though the two groups are equally British).



Paul Mansfield Photography/Moment Open/Getty Images


And Watson notes another example: England’s 53 million people outnumber Scotland’s 5+ million by 10 to 1. Yet the U.S. and Canada have, between them, only 9 English clubs (Royal Societies of St. George) and 111 Scottish clubs (St. Andrews Societies). What gives?


Social psychologists have an answer. As the late William McGuire and his Yale University colleagues demonstrated, people’s “spontaneous self-concepts” focus on how they differ from the majority around them. When invited to “tell us about yourself,” children mostly mention their distinctive attributes: Foreign-born children mention their birthplace. Redheads mention their hair color. Minority children mention their race.


This insight—that we are conscious of how we differ from others—explains why gay people are more conscious of their sexual identity than are straight people (except when straight folks are among gays), and why any numerical minority group tends to be conscious of its distinctiveness from the larger, surrounding culture. When occasionally living in Scotland, where my American accent marks me as a foreigner, I am conscious of my national identity and sensitive to how others may react.


Being a numerical British minority, Scots are conscious of their identity and of their rivalries with the English. Thus, rabid fans of Scottish football (soccer) may rejoice in either a Scotland victory or an English defeat. “Phew! They Lost!,” headlined one Scottish tabloid after England’s 1996 Euro Cup defeat—by Germany, no less. Likewise, report a New Zealand-Australian research team, the 4 million New Zealanders are more conscious of their New Zealand identity vis-à-vis the 23 million Australians than vice-versa, and they are more likely to root for Australia’s sports opponents.


“Self-consciousness,” noted C. S. Lewis in The Problem of Pain, exists only in “contrast with an ‘other,’ a something which is not the self.” So, why do the Scots have a stronger social identity than the English? They have their more numerous and powerful neighbors, the English, to thank for that.

Originally posted on November 14, 2015.


Thursday night’s Buffalo Bills versus New York Jets football game was lampooned for its red and green uniforms. “Christmas pjs” in the NFL?


But for “colorblind” people there was a bigger problem. As Nathan DeWall and I explain in Psychology, 11th Edition, “Most people with color-deficient vision are not actually ‘colorblind.’ They simply lack functioning red- or green-sensitive cones, or sometimes both.” The classic textbook illustration at left—which the NFL apparently forgot—reminds us that for some folks (most of whom, like most NFL fans, are male) those red and green uniforms likely looked more like this.




Twitter messages flowed:




Note to the NFL, from psychology teachers and text authors: Thanks for the great example!

Originally posted on November 21, 2015.


“So, what do you make of this?,” asked the woman in the airplane seat next to me this week, as she pointed to an article about corporations cancelling meetings in Paris in response to last week’s terrorist attacks.


Eight guys with guns commit horrific evil and capture the world’s attention—leading to calls for revenge, proposals to ban Syrian refugees from the U.S., and fears of European travel. When terrorists kill people in bunches, they create readily available—and memorable—images that hijack our rational thinking.


Meanwhile, I replied, even more people—some 200—die of homicidal gun violence in the U.S. each week. But they mostly die one by one, eliciting little or no national outrage or resolve. Is this (without discounting the likelihood of future terrorist acts) yet another example of our human tendency to fear the wrong things (as I’ve explained here, here, and here)? If terrorists were to kill 1000 people in such attacks in the USA in the next year, Americans would have reason to fear—albeit 1/30th the fear of riding in a motor vehicle, where more than 30,000 people a year die.


The shared threat of terrorism further hijacks rationality, by triggering us/them thinking, inflaming stereotypes of the “other” among us, and creating scapegoats. Thus, although refugees have reportedly committed no terrorist acts—either in Paris or, since 2001, in the USA—more than half of U.S. governors are seeking to block Syrian refugees, and reported threats against Muslims and Mosques have increased. “We don’t know who [the Syrian refugees] are,” declared Donald Trump. “They could be ISIS. It could be the great Trojan Horse.”


A personal note: U.S. politicians’ calls to effectively shut out Syrian refugees, and even (a la Donald Trump) to register all Muslims in a database, evoke a déjà vu. In 1942, while I was in my mother’s womb, a fear-filled American government gave the Japanese-Americans living on my home island of Bainbridge Island, Washington, six days to pack a suitcase and be at the ferry dock for that March 20th day that began the internment of 120,000 of our fellow Americans. Among their tearful friends and neighbors at the dock was my father (who for many of them was their insurance agent, and who maintained their insurance over objections from insurance companies that viewed the internees’ properties as at-risk).



Sixty-two years later ground was broken for a national memorial at the historic site, with former internee and Bainbridge Island Japanese American Community president, Frank Kitamoto, declaring that “this memorial is also for Walt and Millie Woodward, for Ken Myers, for Genevive Williams . . . and the many others who supported us” and who challenged the forced removal at the risk of being called unpatriotic. The motto of the beautiful memorial, which I visit on nearly every trip home to Bainbridge: Nidoto Nai Yoni—Let It Not Happen Again.


As a Bainbridge resident, Washington’s current governor, Jay Inslee, knows that story well, and recalled it when standing apart from other governors wanting to exclude Syrian refugees:


We are a nation that has always taken the path of enforcing our freedom, our freedom of religion, our freedom of speech, our humanity, our relationship with the rest of the world. And we've hewed to those values, even in troubled times. And when we haven't, we've regretted it. I'll give you an example. I live on Bainbridge Island, this little island just west of Seattle. And it was the first place where we succumbed to fear in 1941 after Pearl Harbor. And we locked up Washington and American citizens, and we sent them to camps—Japanese-Americans. . . . So my neighbors were locked up by the federal government and sent to camps for years while their sons fought in the Army in Italy and were decorated fighting for democracy. We regret that. We regret that we succumbed to fear. We regret that we lost moorage for who we were as a country. We shouldn't do that right now.

Originally posted on December 1, 2015.


“Happiness doesn’t bring good health,” headlines a December 9 New York Times article. “Go ahead and sulk,” explain its opening sentences. “Unhappiness won’t kill you.”


Should we forget all that we have read and taught about the effects of negative emotions (depression, anger, stress) on health?  Yes, this is “good news for the grumpy,” one of the study authors is quoted as saying. In this Lancet study, which followed a half million British women over time, “unhappiness and stress were not associated with an increased risk of death,” reported the Times.


A closer look at the study tells a somewhat different story, however. Its title—“Does Happiness Itself Directly Affect Mortality?”—hints at an explanation for the surprising result. Contrary to what the media report suggests, the researchers found that “Compared with those reporting being happy most of the time, women who had reported being unhappy had excess all-cause mortality when adjusting only for age.” Said simply, the unhappy women were 36 percent more likely to die during the study period.


But the happy women also exercised more, smoked less, and were more likely to live with a partner and to participate in religious and other group activities. Controlling for those variables “completely eliminated” the happiness-longevity association, and that explains the headline.


In much the same way, one can reduce or eliminate the religiosity-health association by controlling for the factors that mediate the religiosity effect (social support, healthier lifestyle, greater positive emotion).  Ditto, one can eliminate the seeming effect of a hurricane by “controlling for” the confounding effect of the wind, rain, and storm surge. A hurricane “by itself,” after eliminating such mediating factors, has little or no “direct effect.”
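The mediator logic here can be made concrete with a toy simulation (the numbers below are invented for illustration, not taken from the Lancet study): if happiness affects mortality only through a mediator such as exercise, the raw happiness–mortality association is genuine, yet it disappears once we stratify on the mediator.

```python
import random

random.seed(1)

# Toy model with made-up probabilities: happiness raises the chance of
# exercising (the mediator), and mortality depends ONLY on exercise.
N = 100_000
women = []
for _ in range(N):
    happy = random.random() < 0.6
    exercises = random.random() < (0.7 if happy else 0.4)   # mediator
    died = random.random() < (0.05 if exercises else 0.10)  # no direct happiness effect
    women.append((happy, exercises, died))

def death_rate(group):
    group = list(group)
    return sum(died for _, _, died in group) / len(group)

# Unadjusted comparison: unhappy women die more often
happy_rate = death_rate(w for w in women if w[0])
unhappy_rate = death_rate(w for w in women if not w[0])
print(f"unadjusted: happy {happy_rate:.3f}, unhappy {unhappy_rate:.3f}")

# Adjusted comparison: within each exercise stratum, the gap vanishes
for ex in (True, False):
    h = death_rate(w for w in women if w[0] and w[1] == ex)
    u = death_rate(w for w in women if not w[0] and w[1] == ex)
    print(f"exercises={ex}: happy {h:.3f} vs unhappy {u:.3f}")
```

Whether "adjusting away" a mediator like this clarifies or obscures the causal story is precisely the interpretive question at stake in the headline.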


Likewise, happiness “by itself” has little or no direct effect on health—a finding that few researchers are likely to contest.


P.S. For more critique of the happiness-health study, see here.

David Myers

The Politics of Fear

Posted by David Myers Expert Jul 18, 2016

Originally posted on January 12, 2016.


Recent presidential debates offered a consensus message:  be afraid.


“They’re trying to kill us all,” warned Lindsey Graham. “America is at war,” echoed Ted Cruz. “Think about the mothers who will take those children tomorrow morning to the bus stop wondering whether their children will arrive back on that bus safe and sound,” cautioned Chris Christie.


The terrorist threat is real, and its results horrific. With scenes from the Paris and San Bernardino attacks flooding our minds, the politics of fear has grown. Twenty-seven percent of Americans recently identified terrorism as their biggest worry—up from 8 percent just before the Paris attacks. In two new national surveys (here and here), terrorism topped the list of “most important” issues facing the country. We are, observed Senator Marco Rubio, “really scared and worried” . . . and hence the fear of Syrian refugees, or even of all Muslims.


We may, however, be too afraid of terrorism, and too little afraid of other much greater perils.  Moreover, fearing the wrong things has social and political consequences, as I explain here (a site that also offers other behavioral scientists’ reflections on important scientific news).

Who Thinks Our Thoughts

Posted by David Myers, Jul 18, 2016

Originally posted on January 22, 2016.


At the invitation of Princeton University Press, I have just read a fascinating forthcoming book, Stranger in the Mirror: The Scientific Search for the Self, by Fresno State psychologist Robert Levine. In one chapter, Levine, who is one of psychology’s most creative writers, recalls a time when ideas rushed into his head, which he quickly put on paper. “It felt as if there was a very clever fellow somewhere inside me, a guy who came up with better ideas than I ever could. What right did I have to pat myself on the back? I was little more than a recording secretary.”


Levine recounts people’s experiences of ideas popping to mind unbidden. Many writers report feeling like scribes for story lines and sentences that come from, to use Charles Dickens’ words, “some beneficent power.” An artist friend of mine tells me of his delight in observing what his hand is painting. “The writer Robertson Davies summed it up neatly,” reports Levine: “‘I am told the story. I record the story.’”


That, too, is my frequent experience as a writer. As I make words march up the screen, I often feel more like a secretary, a mere recorder of ideas and words that come from I know not where. And yet I also know that if I keep reading and reflecting—and feeding the friendly little genie that each of us has in our heads—it will keep dictating, and I will continue transcribing.




To a 21st century psychological scientist, the genie-like muse is an eruption of our active unconscious mind. In study after study, people benefit from letting their mind work on a problem while not consciously thinking about it. Facing a difficult decision, we’re wise to gather information, and then say, “Give me some time to not think about this.” After letting it incubate, perhaps even sleeping on it, a better answer—or a better narrative—may appear unbidden.


To others, the voice in one’s head may seem like “the Spirit at work,” or even the still small voice of God.


Or, perhaps, it is both?

Originally posted on February 2, 2016.


You’ve likely heard the NPR ads for brain fitness games offered by Lumosity. “70 Million brain trainers in 182 countries challenge their brains with Lumosity,” declares its website. The hoped-for results range from enhanced cognitive powers to increased school and work performance to decreased late-life cognitive decline or dementia.


But do brain-training games really make us smarter or enlarge our memory capacity? In our just-released Exploring Psychology, 10th Edition, Nathan DeWall and I suggest “that brain training can produce short-term gains, but mostly on the trained tasks and not for cognitive ability in general.” As an earlier TalkPsych blog essay reported, Zachary Hambrick and Randall Engle have “published studies and research reviews that question the popular idea that brain-training games enhance older adults’ intelligence and memory. Despite the claims of companies marketing brain exercises, brain training appears to produce gains only on the trained tasks (without generalizing to other tasks).”


And that is also the recently announced conclusion of the Federal Trade Commission (FTC), when fining Lumosity’s maker, Lumos Labs, $2 million for false advertising. As FTC spokesperson Michelle Rusk reported to Science, “The most that they have shown is that with enough practice you get better on these games, or on similar cognitive tasks...There’s no evidence that training transfers to any real-world setting.”


Although this leaves open the possibility that certain other brain-training programs might have cognitive benefits, the settlement affirms skeptics who doubt that brain games have broad cognitive benefits.

Teaching News You Can Use

Posted by David Myers, Jul 18, 2016

Originally posted on February 10, 2016.


Three items from yesterday’s reading:


1. The Society for Industrial and Organizational Psychology (SIOP) has just offered a nice video introduction to I/O Psychology (here). At 4 minutes, it’s well-suited to class use.

2. Not brand new—but new to me—is a wonderful 7½ minute illustrated synopsis of social-cognitive explanations of why, despite converging evidence, so many people deny human-caused climate change. The video, from biologist-writer Joe Hansen and PBS Digital Studios, is available here. For more on how psychological science can contribute to public education about climate change—and to a pertinent new U.N. Climate Panel conference—see here.

3. Does witnessing peers academically excelling inspire other students to excel? Or does it discourage them? Schools, with their love of prizes and awards, seem to assume the former. Researchers Todd Rogers and Avi Feller report (here) that exposure to exemplary peers can deflate, discourage, and demotivate other students (and increase their dropping out from a MOOC).

Originally posted on March 1, 2016.


Amid concerns about the replicability of psychological science findings comes “a cause for celebration,” argue behavior geneticist Robert Plomin and colleagues (here). They identify ten “big” take-home findings that have been “robustly” replicated. Some of these are who-would-have-guessed surprises.


1. “All psychological traits show significant and substantial genetic influence.” From abilities to personality to health, twin and adoption studies consistently reveal hereditary influence.

2. “No traits are 100% heritable.” We are knitted of both nature and nurture.

3. “Heritability [differences among individuals attributable to genes] is caused by many genes of small effect.” There is no single “smart gene,” “gay (or straight) gene,” or “schizophrenia gene.”

4. "Correlations between psychological traits show significant and substantial genetic mediation.” For example, genetic factors largely explain the correlation found among 12-year-olds’ reading, math, and language scores.

5. “The heritability of intelligence increases throughout development.” I would have guessed—you, too?—that as people mature, their diverging life experiences would reduce the heritability of intelligence. Actually, heritability increases, from about 41% among 9-year-olds to 66% among 17-year-olds, and to even more in later adulthood, studies suggest.

6. “Age-to-age stability is mainly due to genetics.” This—perhaps the least surprising finding—indicates that our trait stability over time is genetically disposed.

7. “Most measures of ‘environment’ show significant genetic influence.” Another surprise: many measures of environmental factors—such as parenting behaviors—are genetically influenced. Thus, if physically punitive parents have physically aggressive children, both may share genes that predispose aggressive responding.

8. “Most associations between environmental measures and psychological traits are significantly mediated genetically.” For example, parenting behaviors and children’s behaviors correlate partly due to genetic influences on both.

9. “Most environmental effects are not shared by children growing up in the same family.” As Nathan DeWall and I report in Psychology, 11th Edition, this is one of psychology’s most stunning findings: “The environment shared by a family’s children has virtually no discernible impact on their personalities.”

10. “Abnormal is normal.” Psychological disorders are not caused by qualitatively distinct genes. Rather, they reflect variations of genetic and environmental influences that affect us all.





From this “firm foundation of replicable findings,” Plomin and colleagues conclude, science can now build deeper understandings of how nature and nurture together weave the human fabric.


Originally posted on March 4, 2016.


Would you agree or disagree with these statements from the Narcissistic Personality Inventory?

1.    I know that I am good because everybody keeps telling me so.

2.    People love me. And you know what, I have been very successful.

3.    I really like to be the center of attention.

4.    Some people would say I’m very, very, very intelligent.


Excuse my fibbing. The even-numbered sentences are actually the words of Donald Trump—a “remarkably narcissistic” person, surmises developmental psychologist Howard Gardner.


Trump’s narcissism even extends to his self-perceived superior humility (which brings to mind the wisdom of C. S. Lewis: “If a man thinks he is not conceited, he is very conceited indeed.”).




So how do self-important, self-focused, self-promoting narcissists fare over time? Does their self-assurance, charm, and humor make a generally favorable impression, especially in leadership roles? Or is their egotism, arrogance, and hostility off-putting?


In a recent Journal of Personality and Social Psychology article, Marius Leckelt and colleagues report that narcissists make good first impressions, but over time their arrogance, bragging, and aggressiveness get old. Their findings replicate what Delroy Paulhus long ago observed in a seven-session study of small teams: People’s initially positive impressions of narcissists turned negative by the end.


So, will this phenomenon hold true for Trump and eventually deflate his popularity during this U.S. presidential campaign season? What do you think?

Originally posted on March 11, 2016.


It was, as our NYC-bred presidential candidates would say, “yuge” news, both in and beyond psychological science: When 270 researchers in an “Open Science Collaboration” network redid 100 recent studies from three leading journals, only 36 percent of the findings replicated. Ouch!


But now another research team, led by Harvard social psychologist Daniel Gilbert, has reanalyzed the data and arrived at a radically different conclusion, to which the OSC group has offered a rejoinder, the Gilbert group a rebuttal, and the conversation continues. Boiling the controversy down to the fewest possible words, the Gilbert group offers this elevator speech synopsis:


OSC: “We have provided a credible estimate of the reproducibility of psychological science.”


US [Gilbert et al.]: “No you haven’t, because (1) you violated the basic rules of sampling when you selected studies to replicate, (2) you did unfaithful replications of many of the studies you selected and (3) you made statistical errors.”


OSC (& OTHERS): “We didn’t make statistical errors.”


Stay tuned: this debate is in process, as a disagreement among mutually respectful colleagues. The exchanges bring to mind the words of David Hume: “The truth springs from arguments amongst friends.”


Whatever the outcome, the “reproducibility crisis” debate is the free marketplace of ideas in action as diverse scholars

1)     aim to discern and give witness to truth,

2)     contribute their findings and conclusions to the public sphere, while welcoming others doing the same, and then

3)     debate their differences, in the confidence that greater wisdom ultimately will emerge.

Originally posted on March 21, 2016.


The Sanders v. Clinton and Trump v. others debates, like others before them, offer clashing arguments regarding free trade agreements:


  • Anti-trade agreement argument: “Free trade” agreements, such as NAFTA and the Trans-Pacific Partnership (TPP), benefit corporations at the expense of American workers. Competing with low-wage foreign workers means lost American jobs and lower wages.


  • Pro-trade agreement argument: Ending free trade would raise the prices we pay for goods and would harm American companies (and workers) seeking to export products. The TPP eliminates many tariffs that other countries impose on American exports.


Social psychologists have offered another consideration. In the long term, is an economically interdependent world a safer world?


We know from social psychological research that sharing “superordinate goals” promotes peace. Muzafer Sherif’s classic boys’ camp experiments used isolation and competition to make strangers into bitter enemies. But with superordinate goals (restoring the camp water supply, freeing a stuck truck, pooling funds for a movie), he then made enemies into friends. Other research suggests that superordinate goals are not mere child’s play.


From Amazon tribes to European countries, peace arises when groups become interconnected and interdependent and develop an overarching social identity (Fry et al., 2012). Economic interdependence through international trade also motivates peace. “Where goods cross frontiers, armies won’t,” noted Michael Shermer (2006). With so much of China’s economy now interwoven with Western economies, their economic interdependence diminishes the likelihood of war between China and the West (from Myers & Twenge, Social Psychology, 12th edition).


What do you think: Is a world with free trade (rather than isolationism) a safer world?


And here’s an ethical question: Whose economic well-being should we care more about protecting—Americans’ or everyone’s?


To assess the extent to which people see themselves as “belonging to one human family”—an attitude that distinguished those who rescued Jews from the Nazis—social psychologist Sam McFarland developed an “Identification with All Humanity” scale, which is now supplemented by other measures of global human identification. What do you think: Should our circle of “moral inclusion” include all “God’s children”...or is it natural and appropriate to prioritize our national ingroup?

Originally posted on March 29, 2016.


Some psychological science findings are just plain fun. Few are more so than the studies of what Brett Pelham and his colleagues call “implicit egotism”—our tendency to like what’s associated with us. We tend to like


  • letters that appear in our name,
  • numbers that resemble our birthdate,
  • politicians whose faces are morphed to include features of our own, and even—here comes the weird part—
  • places and careers with names resembling our own name.


Believe it or not, Philadelphia has a disproportionate number of residents named Phil. Virginia Beach has a disproportionate number of people named Virginia, as St. Louis does of men named Louis. And California and Toronto have an excess number of people whose names begin, respectively, with Cali (as in Califano) and Tor.


Pelham and his colleagues surmise that “People are attracted to places that resemble their names” . . . and to name-related careers, with American dentists being twice as likely to be named “Dennis” as the equally popular names “Jerry” or “Walter.”


As I mentioned in a previous blog essay, Pelham’s work has been criticized. Pelham replied, and now, with Mauricio Carvallo, offers Census data showing that people named Baker, Butcher, Carpenter, Mason, Farmer, and so forth disproportionately gravitate toward occupations that bear their names (despite the separation of countless generations from the original “Baker” who was a baker). And “men named Cal and Tex disproportionately moved to states resembling their names.”


Moreover, in unpublished work, Pelham and his colleagues found that a century and more ago, when most people were born at home and birth certificates were completed later, people tended to declare birth dates associated with a positive identity. Assuming that births (before induced labor and C-sections) were randomly distributed, people between 1890 and 1910 over-claimed Christmas Day birthdays by 66 percent and New Year’s Day birthdays by 62 percent. Parents also over-claimed birthdays associated with famous people’s birthdays, such as George Washington’s—though only U.S. immigrants from Ireland strongly over-claimed St. Patrick’s Day birthdays (at more than three times the expected rate).


The birth registration process once allowed wiggle room, and “where there is wiggle room, there is often wiggling,” report Pelham and his team. “And a potent motivation for wiggling might be the desire to claim a positive social identity.” Implicit egotism rides again.

Originally posted on April 7, 2016.


I cut my eye teeth in social psychology with a dissertation followed by a decade of research exploring group polarization. Our repeated finding: When like minds interact, their views often become more extreme. For example, when high-prejudice students discussed racial issues, they became more prejudiced, and vice versa when we grouped low-prejudice students with one another.


When doing that research half a lifetime ago, I never imagined the benefits, and the dangers, of virtual like-minded groups... with both peacemakers and conspiracy theorists reinforcing their kindred spirits.


In a recent New York Times essay, University of North Carolina professor Zeynep Tufekci studied the Twitter feeds of Donald Trump supporters, and observed


cascading self-affirmation. People naturally thrive by finding like-minded others, and I watch as Trump supporters affirm one another in their belief that white America is being sold out by secretly Muslim lawmakers, and that every unpleasant claim about Donald Trump is a fabrication by a cabal that includes the Republican leadership and the mass media. I watch as their networks expand, and as followers find one another as they voice ever more extreme opinions.


In the echo chamber of the virtual world, as in the real world, separation + conversation = polarization. The Internet has such wonderful potential to create Mark Zuckerberg’s vision of “a more connected world.” But it also offers a powerful mechanism for deepening social divisions and promoting extremist views and actions.


On my list of the future’s great challenges, somewhere not far below restraining climate change, is learning how to harness the great benefits of the digital future without exacerbating group polarization.


Originally posted on April 26, 2016.


The April 11, 2016 TIME cover story on “Porn and the Threat to Virility” was replete with anecdotes of young men’s real-life sexual responsiveness being depleted by excessive pornography consumption.


Really? I wondered. Is men’s capacity for arousal and orgasm with real partners reduced by their habituating (desensitizing) to the variety of streaming explicit sexuality? Is compulsive pornography-viewing literally a downer? Does it contribute to erectile dysfunction (ED)? If so, this is news worth reporting by us textbook authors, and would be a practical, nonmoral reason for encouraging boys and men to limit their hours in online fantasyland.



Knowing that the plural of anecdote is not evidence, I turned to PsycINFO and found surprisingly little confirmation—and little research—on this socially important question. One new study of 434 Belgian university men found that “problematic” online sex viewing (and associated sexual self-gratification) predicted “lower erectile function.” This correlational study, though a good beginning, could not establish the causal direction of the viewing–dysfunction relationship. Call me a skeptic.


But now the Skeptics Society has published an article by my esteemed friend Philip Zimbardo, with Gary Wilson, summarizing their respective new books, Man Interrupted (2016) and Your Brain on Porn (2016). Their arguments:


1)  Over time, online porn leads to ED. The explosion in easily available streaming online porn has been followed by a soaring rate of young male erectile dysfunction—from 1 percent of men under age 25 back in Kinsey’s 1950 era to one in four today.


2)  Across individuals, online porn leads to ED. Seven studies document an association “between online porn use in young men and ED, anorgasmia, low sexual desire, delayed ejaculation, and lower brain activation to sexual images.”


3)  Desensitization and conditioning explain it. The waning of real-life male sexuality occurs as preteens, teens, and young men become desensitized by compulsive pornography consumption. Like addicts, they come to need more stimulation and variety of the sort that a real sex partner “cannot compete with.” While masturbating, their sexual arousal becomes associated with pornography.


4)  But the effects are reversible. Benefits follow stopping use, including “clearer thinking and better memory, more motivation, increased charisma, deeper relationships, and better real life sex.”


The debate has only begun. Skeptic Marty Klein, despite sharing “reasonable concerns about young people marinating in Internet porn,” finds their conclusions lacking empirical support. In response, Zimbardo and Wilson vigorously defend their conclusions. Surely the “porn messes with your manhood” claims will trigger much-needed further research that seeks to replicate or extend these findings (including to women viewers), and to control for confounding factors. Stay tuned.



Originally posted on May 4, 2016.


“Self-made” people underestimate their fortunate circumstances and their plain good luck. That’s the argument, in the May Atlantic, of Robert Frank, a Cornell economist whose writings I have admired.


Drawing from his new book, Success and Luck: Good Fortune and the Myth of Meritocracy, Frank notes that “Wealthy people overwhelmingly attribute their own success to hard work rather than to factors like luck or being in the right place at the right time.” This brings to mind Albert Bandura’s description of the enduring significance of chance events that can deflect us down a new vocational road, or into marriage. My favorite example is his anecdote of the book editor who came to Bandura’s lecture on the “Psychology of Chance Encounters and Life Paths” and ended up marrying the woman seated next to him.


Frank notes that when wealthy people discount both others’ support and plain luck (which includes not being born in an impoverished place) the result is “troubling, because a growing body of evidence suggests that seeing ourselves as self-made—rather than as talented, hardworking, and lucky—leads us to be less generous and public spirited.”


“Surely,” he adds, “it’s a short hop from overlooking luck’s role in success to feeling entitled to keep the lion’s share of your income—and to being reluctant to sustain the public investments that let you succeed in the first place.” In a presumed just world, the rich get the riches they deserve, which they don’t want drained by high taxes that support the less deserving.


I am keenly aware of my own good luck. My becoming a textbook author, and all that has followed from that—including trade books and other science writing and speaking—is an outgrowth of my a) being invited in 1978 to a small international retreat of social psychologists near Munich, b) being seated throughout the conference near a distinguished American colleague, who c) chanced to mention my name the following January when McGraw-Hill called him seeking an author for a new social psychology text. I could live my life over and the combined probability of those convergent events would be essentially nil.


The resulting book, and the introductory texts that followed, were not my idea. But they are an enduring reminder that chance or luck—or I might call it Providence—can channel lives in new directions.

Originally posted on May 12, 2016.


With support from the Dalai Lama, the famed emotion researcher, Paul Ekman, has created an interactive Atlas of Emotions. Ekman’s survey of 248 emotion researchers identified five basic emotions that most agreed are shared by all humans:




  • Anger (91 percent agreement): a reaction to interference.
  • Fear (90 percent agreement): our response to danger.
  • Disgust (86 percent agreement): a response to anything toxic.
  • Sadness (80 percent agreement): occasioned by loss.
  • Enjoyment (76 percent agreement): our experience of what feels good.


Ekman’s interactive maps allow visitors to explore the varied experiences of these emotions (sadness, for example, ranges from disappointment to anguish). And they offer specific examples of what triggers each emotion, and what effect it has. For those looking for a class lab activity, the site warrants a visit.

The Net Effect

Posted by David Myers, Jul 18, 2016

Originally posted on May 18, 2016.


In an 80-minute class for which I recently guest-lectured, the instructor (a master teacher) gave students a mid-class break to enable them to stretch and talk to classmates. What a great way to build community, I thought. Alas, two-thirds of the class never moved. Rather, they pulled out their smartphones and sat staring at their screens. There was no face-to-face conversation, just solemn silence.





When I recounted that story to tech-expert psychologist Larry Rosen (co-author of The Distracted Mind: Ancient Brains in a High-Tech World and author of iDisorder: Understanding Our Obsession with Technology and Overcoming Its Hold on Us), he replied, “I see this all the time EVERYWHERE.”


The students I observed don’t exemplify The Onion’s recent parody (“Brain-Dead Teen, Only Capable of Rolling Eyes and Texting, To Be Euthanized”). But they did bring to mind the recent Western Psychological Association presentation by Rosen’s students, Stephanie Elias, Joshua Lozano, and Jonathan Bentley. They reported data on smartphone usage by 216 California State University, Dominguez Hills students, as recorded by a phone app. The stunning result: In an average day, the students unlocked their phones 56 times and spent 220 minutes—3.7 hours—connected. Moreover, more compulsive technology use not only drains time from eyeball-to-eyeball conversation but also predicts poorer course performance.


Today’s technology is “so user-friendly that the very use fosters our obsessions, dependence, and stress reactions,” says Rosen in iDisorder. If smartphones interfere with “having social relationships, then it is a problem, and it really is what I consider an iDisorder.” As Steven Pinker has written, “The solution is not to bemoan technology but to develop strategies of self-control, as we do with every other temptation in life.” We can live intentionally—by managing our time, blocking distracting online friends, turning off or leaving behind our mobile devices, or even going on a social media fast or diet—all in pursuit of our important goals.

Originally posted on May 31, 2016.


In 1964, I arrived in Iowa City and anxiously walked into the University of Iowa’s psychology department to meet my graduate school advisor. Among his first words: “I know, Dave, that you indicated ‘personality psychology’ as your interest area. But our only personality psychologist has just left . . . so we’ve moved you into social psychology.”


Thus began my journey into social psychology. Looking back, aware of the exciting fruits of social psychology’s last half century, I view that unexpected shift as providential. And reading The Wisest One in the Room, by my esteemed colleagues Tom Gilovich and Lee Ross, fortifies my sense of the importance of social psychology’s practicality. Their new book, subtitled How You Can Benefit from Social Psychology’s Most Powerful Insights, enumerates social psychology’s biggest ideas and applies them to promoting happiness, conflict resolution, success for at-risk youth, and a sustainable climate future.




The latter goal, they note, will be enabled by social norms that stigmatize the worst climate-change offenders and celebrate those who are advancing sustainability.


That may sound like an impossible aim in a year when one party’s presidential candidate is a climate-change skeptic who has tweeted that “the concept of global warming was created by and for the Chinese in order to make U.S. manufacturing non-competitive.”


But consider, Gilovich and Ross remind us, how fast social norms can change, with

  • same-sex couples’ rights going from nonexistent to the law and will of the land,
  • fertility rates dropping sharply worldwide in response to overpopulation,
  • smoking transformed from being grown-up and sophisticated to being (among middle class people) dirty and just plain stupid. Yesterday’s cool—big tobacco—has become today’s corporate evil.


The hopeful bottom line: transformational change can happen with surprising speed and great effect.

Originally posted on June 8, 2016.


Our personal assumptions matter, often by influencing our attitudes and public policies. Here’s an example:


  • If you see same-sex attraction as a lifestyle choice, as swayed by social influence, or as encouraged by social tolerance, then you probably are opposed to equal employment and marriage rights for gay people. Those in fact are the prevailing assumptions in the 75 countries that legally forbid homosexual behavior.
  • If you see sexual orientation as “inborn”—as shaped by biological and prenatal environmental influences—then you likely favor “equal rights for homosexual and bisexual people.”


That being so, note Michael Bailey, Paul Vasey, Lisa Diamond, Marc Breedlove, Eric Vilain, and Marc Epprecht, in their state-of-the-art review of sexual orientation research, psychological science has much to offer our public conversation about gay rights issues. Some of their conclusions:


  • The phenomenon: Sexual attraction, arousal, behavior, and identity usually coincide, but not always. For example, some men who identify as straight may nevertheless be strongly attracted to men.
  • Same-sex attraction has existed across time and place. Although sexual identity and behavior are culturally influenced, same-sex activity crosses human history, dating from the era of Mesolithic rock art.
  • Bisexual identity is multifaceted. Some claim bisexual identity after previous sexual experiences with both men and women, or, even if primarily attracted to one sex, because of occasional sexual attractions to the other sex.  “Some bisexual-identified men have bisexual genital arousal patterns and some do not.” With men, bisexuality is more often a transitional identity; with women, it is more often a stable identity.
  • Heritability. Twin studies suggest that “about a third of variation in sexual orientation is attributable to genetic influences.”
  • The nonsocial environment matters. One striking example is the fraternal birth order effect: The odds of a man having a same-sex orientation are about:
    • 2% for those with no older biological brothers,
    • 2.6% given one older biological brother,
    • 3.5% given two older biological brothers,
    • 4.6% given three older biological brothers, and
    • 6.0% given four older biological brothers.
  • The social environment matters little: “There is no good evidence that either [social influence or social tolerance] increases the rate of homosexual orientation.”


If only a mad scientist could pit nature against nurture by changing, at birth, boys into girls. Castrate them as newborns, surgically feminize them, and then raise them as girls. Does such rearing socialize these “girls” into becoming attracted to males?


Such surgical and social gender reassignment did happen between 1960 and 2000 after a number of babies were born with penises that were malformed or severed in surgical accidents. As teaching psychologists are aware, their gender identity was not so easily transformed. As is less well known, report the expert sexuality researchers, in each of seven known cases where sexual orientation was reported, it was predominantly or exclusively an attraction to women. “This is the result we would expect if male sexual orientation were entirely due to nature, and it is the opposite of the result expected if it were due to nurture.”


“If one cannot reliably make a male human become attracted to other males by cutting off his penis in infancy and rearing him as a girl, then what other psychosocial intervention could plausibly have that effect?”


With such scientific evidence in mind, conclude the expert researchers, “we urge governments to reconsider the wisdom of legislation that criminalizes homosexual behavior.”

David Myers

Top-Down Hearing

Posted by David Myers, Jul 18, 2016

Originally posted on June 22, 2016.


As every psychology student knows well, human perception is both a “bottom-up” and a “top-down” process. Our perceptions are formed bottom-up (from sensory input), but also top-down (constructed by our experience and expectations).


Top-down perception is usually illustrated visually. Reading from left to right, our expectations cause us to perceive the middle figure differently than when reading from above.


And when first reading the phrase below, people often misperceive it:


[image: talpsych2.png], seeing what they expect (and failing to detect the repeated word).


The same constructive process influences what we hear.


Told about a young couple that has been plagued by their experience with some bad sects, people may—depending on what is on their mind—hear something quite different (bad sex).

The context of a sentence will determine whether you hear “the stuffy nose” or “the stuff he knows.”

Likewise, the weather-forecasting “meteorologist” may become, in a discussion of a muscular kidney specialist, the “meaty urologist.”

The reality of top-down hearing helps explain why theater instructors and directors, who are training their actors to project their voices, may not appreciate the hearing difficulty faced by those of us with hearing loss—and why we appreciate mic’d actors and the hearing assistive technology described here.


The problem has two sources:


  • Most theater directors hear normally, and thus may naturally assume that others hear what they hear.
  • The directors already know what the words are. When my TV captioning is on, I can—thanks to top-down perception—hear the spoken words clearly. My expectations, formed by the captions, drive my perception. If I turn the captions off, I no longer understand the words. Play directors who know their scripts are like those of us who watch captioned TV. But their patrons are in the no-captions mode.

Happily, here at my place called Hope (Hope College), hearing accessibility is being addressed. My theater colleagues are working to support their patrons with hearing loss—by seeking to understand their needs, by equipping their facilities with hearing assistance, and by welcoming feedback after plays.