
Originally posted on September 24, 2015.

 

This past weekend, I gave myself an odd birthday present. I entered an ultramarathon. If you’ve read my posts, you know I like to run. For my birthday, I wanted to run 100 miles as fast as I could. Luckily, I had a perfect opportunity. There was a 24-hour running race within driving distance of my house.

There was a bigger purpose in my run. I could determine whether a recent test of my speed and endurance would replicate. Two weeks ago, I ran 100 miles in 22 hours and 10 minutes.

Replication is important. It tells whether repeating the essence of an experiment will produce the same result. The more the same sequence of events produces a similar outcome, the more we can depend on it. 

Psychology is embroiled in a current debate about replicability. All psychologists agree that replication is important. That is a requirement before you get your card when you join the psychologist club. The debate centers on the meaning of non-replication. A recent report found that 64 percent of the tested psychological effects did not replicate. Some have declared a war on current scientific practices, hoping to inch the non-replication rate down to a less newsworthy percentage. Others, such as Lisa Feldman Barrett, argue that non-replication is a part of science. It tells us as much about why things do happen as about why they don’t.

My birthday run had everything I needed to make a replication attempt. Nearly everything was identical to the last time I ran 100 miles. The course consisted of a flat, concrete loop that was nearly one mile long. I ate the same foods, drank the same amount of water, and got the same amount of sleep the night before. All signs pointed to an exact replication.

Then the race started. The first 50 miles breezed by. I was over an hour faster than my previous run, but I felt pretty good. By mile 65, I was mentally fatigued. By mile 70, my body was exhausted. By the time I hit mile 75, I was done. Less than 16 hours had passed, but I was mentally and physically checked out. No replication.

There are at least two ways I can deal with this non-replication. The first is to panic. Either the people who counted my laps at the previous race did something wrong, I reported something wrong, or something else is wrong. It is as if it never happened. The next time someone asks me my personal record, I can tell them. But I must tell them that I don’t trust it. “Probably just a one-off,” I might say. “Tried to replicate it two weeks later and came up short.”

A second approach is to try to understand what contributed to the non-replication. Most things were the same. But some things were different, among them the wear and tear that long running has on the body and mind. Maybe I wasn’t fully recovered from the previous race. Maybe I ran too fast too soon. Or maybe I’m just not that fast.

Either way, it tells us a different story about replication. Replication science is possible, but we will always have non-replications. And those non-replications aren’t badges of shame. They tell us as much about the complexity of human psychology as about how certain situations make us think, feel, and act.

It would be great if psychology’s non-replication rate dwindled to less than 5 percent. I doubt that will ever happen. Humans are squirrely animals. No matter how much we want to do the same thing twice, sometimes it doesn’t happen.

 


Originally posted on April 3, 2014.

 

The New York Times columnist Nicholas Kristof recently (here) chided academics for becoming esoteric and inaccessible to the general public.  He noted that

The latest attempt by academia to wall itself off from the world came when the executive council of the prestigious International Studies Association proposed that its publication editors be barred from having personal blogs. The association might as well scream: We want our scholars to be less influential!...

Professors today have a growing number of tools available to educate the public, from online courses to blogs to social media. . . . So, professors, don’t cloister yourselves like medieval monks — we need you!

Voila!  Here begins an effort to share fruits from psychological science.  With daily reports and reflections, we will share what fascinates our minds, challenges our thinking, or tickles our funny bones.  We aim to “give psychology away” to:

  • teachers seeking to freshen their classes with cutting-edge ideas and discoveries,
  • students eager to learn insights beyond what’s in their textbooks, and
  • any curious person who finds human beings fascinating, and who delights in psychological science’s efforts to expand our minds and enlarge our hearts.

We also aim to offer our reflections in simple prose, believing with Thoreau that “Anything living is easily and naturally expressed in popular language.”

Welcome aboard, and please do feel free to invite your students, colleagues, and friends to join us for the ride, and to join the conversation.

Originally posted on April 3, 2014.

 

My friend Ed Diener, the Jedi Master of happiness research, presented a wonderful keynote talk on “The Remarkable Progress of National Accounts of Subjective Well-Being” at the recent one-day “Happiness and Well-Being” conference.  He documented the social and health benefits of positive well-being, and celebrated the use of at least simple well-being measures by 41 nations as of 2013.

In displaying the health accompaniments of positive emotions, Ed introduced me to a 2011 PNAS (Proceedings of the National Academy of Sciences) study by Andrew Steptoe and Jane Wardle that I’d somehow missed.  Steptoe and Wardle followed 3,853 fifty-two- to seventy-nine-year-olds in England for 60 months.  This figure displays the number surviving, among those with high, medium, and low positive affect—which was assessed by averaging four mood reports across a single day at the study’s beginning.  Those with a “blue” mood that day were twice as likely as the good-mood folks to die in the ensuing five years!

[Figure: survival over 60 months among those with high, medium, and low positive affect]

David Myers

Sometimes Truth is Comedy

Posted by David Myers Expert Jul 19, 2016

Originally posted on April 6, 2014.

 

Consider Brett Pelham, Matthew Mirenberg, and John Jones’ 2002 report of wacky associations between people’s names and vocations.  Who would have guessed?  For example, in the United States, Jerry, Dennis, and Walter are equally popular names (0.42 percent of people carry each of these names). Yet America’s dentists have been almost twice as likely to be named Dennis as Jerry or Walter. Moreover, 2.5 times as many female dentists have been named Denise as the equally popular names Beverly and Tammy. And people named George or Geoffrey have been overrepresented among geoscientists (geologists, geophysicists, and geochemists).

I thought of that playful research recently when reading some clever research on black bears’ quantitative competence, co-authored by Michael Beran.  Next up in my reading pile was creative work on crows’ problem solving led by Chris Bird.  Today I was appreciating interventions for lifting youth out of depression, pioneered by Sally Merry.

That also took my delighted mind to the important books on animal behavior by Robin Fox and Lionel Tiger, and the Birds of North America volume by Chandler Robbins.  (One needn’t live in Giggleswick, England, to find humor in our good science.)

The list goes on: billionaire Marc Rich, drummer Billy Drummond, cricketer Peter Bowler, and the Ronald Reagan White House spokesman Larry Speakes.  And as a person with hearing loss whose avocational passion is hearing advocacy, I should perhaps acknowledge the irony of my own name, which approximates My-ears.

Internet sources offer lots more:  dentists named Dr. E. Z. Filler, Dr. Gargle, and Dr. Toothaker; the Oregon banking firm Cheatham and Steele; and the chorister Justin Tune.  But my Twitter feed this week offered a cautionary word about these reported names:

“The problem with quotes on the Internet is that you never know if they’re true.”  ~ Abraham Lincoln

Perhaps you, too, have some favorite name-vocation associations?  I think of my good friend who was anxiously bemused before meeting his oncologist, Dr. Bury.  (I am happy to report that, a decade later, he is robustly unburied and has not needed the services of the nearby Posthumus Funeral Home.)

For Pelham and his colleagues there is a serious point to this fun:  We all tend to like what we associate with ourselves (a phenomenon they call “implicit egotism”). We like faces that have features of our own face morphed into them.  We like—and have some tendency to live in—cities and states whose names overlap with our own—as in the disproportionate number of people named Jack living in Jacksonville, of Philips in Philadelphia, and of people whose names begin with Tor in Toronto.

Uri Simonsohn isn’t entirely convinced (see here and here, with Pelham’s reply here).  He replicated the associations between people’s names, occupations, and places, but argued that “reverse causality” sometimes is at work. For example, people sometimes live in places and on streets after which their ancestors were named.

Implicit egotism research continues.  In the meantime, we can delight in the occasional playful creativity of psychological science.

P.S.  Speaking of dentists (actual ones), my retired Hope College chemistry colleague Don Williams—a person of sparkling wit—offers these photos, taken by his own hand:

[Two photos]

And if you need a podiatrist to advise about your foot odor, Williams has found just the person:

[Photo]

Originally posted on April 8, 2014.

 

An editorial in yesterday’s New York Times questioned the nearly $1 billion the U.S. Transportation Security Administration has invested in training and employing officers to identify high-risk airline passengers. In 2011 and 2012, T.S.A. behavior-detection officers at 49 airports “designated passengers for additional screening on 61,000 occasions.”

The number successfully detected and arrested for suspected terrorism? Zero.

But then again, the number of plane-destroying terrorists they failed to detect was also, I infer, zero. (Wonkish note:  A research psychologist might say the T.S.A. has made no Type II errors.)

Regardless, psychological science studies of intuitive lie detection, as the Times’ John Tierney noted in an earlier article, suggest that this has not been a wise billion-dollar investment. Despite our brain’s emotion-detecting skill, we find it difficult to detect deceiving expressions. Charles Bond and Bella DePaulo reviewed 206 studies of people discerning truth from lies. The bottom line: People were just 54 percent accurate—barely better than a coin toss. I have replicated this in classroom demonstrations—by having students tell either a true or a made-up story from their lives. When seeking to identify the liars, my students have always been vastly more confident than correct.

Moreover, contrary to claims that some experts can spot lies, research indicates that few—save perhaps police professionals in high-stakes situations—beat chance. The behavioral differences between liars and truth-tellers are just too minute for most people to detect.

Before spending a billion dollars on any safety measure, risk experts advise doing a cost-benefit analysis. As I reported in Intuition: Its Powers and Perils, some people were outraged when the Clinton administration did not require General Motors to replace ill-designed fuel tanks on older model pickup trucks. The decision spared General Motors some $500 million, in exchange for which it contributed $51 million to traffic safety programs. “GM bought off the government for a pittance,” said some safety advocates, “at the expense of thirty more people expected to die in fiery explosions.” Actually, argued the Department of Transportation, after additional time for litigation there would only have been enough of the old trucks left to claim 6 to 9 more lives. Take that $500 million ($70 million per life)—or the $1 billion more recently spent on behavior detection—and apply it to screening children for preventable diseases (or more vigorous anti-smoking education programs or hunger relief) and one would likely save many more lives. By doing such cost-benefit analyses, say the risk experts, governments could simultaneously save us billions of dollars and thousands of lives.
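For those who like to see the arithmetic, here is a minimal sketch of the cost-per-life calculation. The GM figures are those reported above; the comparison screening cost is a hypothetical placeholder, not a figure from the post.

```python
# Rough cost-per-life arithmetic behind the risk experts' argument.
spared_dollars = 500_000_000          # what replacing the fuel tanks would have cost GM
for lives in (6, 9):                  # Department of Transportation's estimated range
    print(f"{lives} lives -> ${spared_dollars / lives / 1e6:.0f} million per life")
# -> 6 lives -> $83 million per life; 9 lives -> $56 million per life
# (the post's "$70 million per life" falls in the middle of this range)

hypothetical_cost_per_life_saved = 100_000   # placeholder for, say, childhood disease screening
print(f"The same $500 million at that rate: "
      f"{spared_dollars // hypothetical_cost_per_life_saved:,} lives")  # -> 5,000 lives
```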

Ergo, when considering how to spend money to spare injuries and save lives, critical thinkers seek not to be overly swayed by rare, dreaded catastrophes. The smart humanitarian says: “Show me the numbers.”  Big hearts can cohabit with cool heads.

Originally posted on April 10, 2014.

 

A footnote to the name-vocation analyses:  Who would you rather hire for a managerial (rather than employee) role—John Knight or George Cook?  Jill Prince or Judy Shepherd?  David King or Donald Farmer? Helen Duke or Hazel Baker?

Raphael Silberzahn and Eric Luis Uhlmann studied nearly a quarter million German names corresponding to higher- and lower-status occupations, such as König (king) and Koch (cook).  Those with names linked with high status occupations were modestly more often appointed to high status roles.  Silberzahn and Uhlmann speculate that the name association may have made those with high status names seem more worthy.

As former U.S. President Jimmy Carter famously said, “Life isn’t fair."

Originally posted on April 14, 2014.

 

A recent New Yorker review (here) questions the famous claim that “38 witnesses” failed to respond to the Kitty Genovese murder and raises questions about the relationship between the media and the social sciences.  Psychologists have known that the New York Times’ original report of 38 witnesses is questionable.  In a 2007 American Psychologist article, Rachel Manning, Mark Levine, and Alan Collins reported on “The Kitty Genovese murder and the . . . parable of the 38 witnesses.”

Social psychologist Bibb Latané has responded to the New Yorker article, noting that the precise number of witnesses concerns a small “t” truth, with the dynamics of bystander inhibition being the central point of his award-winning research with John Darley.  The dynamic that drove the bystander nonresponse was not “moral decay” but a simple principle:  the “probability of acting decreases with the addition of more people.”

Latané’s letter in the April 7th New Yorker is available here, along with his more extensive submitted explanation.

Originally posted on April 16, 2014.

 

Part of our pleasure in writing psychological science is identifying the big ideas and findings that educated people should know.  Another part of our pleasure is relating these ideas and findings to people’s everyday lives.

Our Harvard colleague Steven Pinker, one of psychology’s public intellectuals, has offered—courtesy of the New York Times—a short quiz that invites people to relate some of psychology’s ideas to real life and pop culture.  Perhaps you, or your students, might enjoy some of the quiz items—here.

David Myers

Big Data

Posted by David Myers Expert Jul 19, 2016

Originally posted on April 18, 2014.

 

“The Internet is one big field study,” observed Adam Kramer, a social psychologist and Facebook researcher, at the recent Society for Personality and Social Psychology (SPSP) presidential symposium on big data.  Some big data factoids, gleaned from the conference:

  • There are, according to Eric Horvitz, Managing Director of Microsoft research, 6.6 degrees of separation between any two people on the Internet.
  • Google has now digitized 6 percent of all published books, creating a huge archive of words that can be tracked over time at https://books.google.com/ngrams.  One can use this resource to answer interesting questions . . . such as: is it true that the term “homosexuality” hardly predates the 20th century, and that “sexual orientation” is a late 20th century concept?  It took me about a second to create this figure of the proportional frequency of these terms over time (a sketch for building such a query appears after this list):

[Figure: Ngram frequencies of “homosexuality” and “sexual orientation” over time]

  • On Facebook, Kramer reported
    • Parents and children take an average 371 days to friend one another.
    • Mothers use 10% more nurturing words when communicating with their children.
    • In the 2010 congressional elections, people’s posting their having voted led to 340,000 additional voters among their friends and friends of friends.
    • Positive emotion words in people’s posts are followed, in the ensuing three days, by increased positive emotion words in friend’s posts, and vice versa for negative emotions.
  • A research team led by Blaine Landis at the University of Cambridge analyzed all 30.49 billion international Facebook friendships formed over four years, and reported (in an SPSP poster) that people tended to “friend up.”  Those from countries with lower economic status were more likely to solicit friendship with those in higher status countries than vice versa.
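For readers who want to reproduce the Ngram comparison mentioned above, here is a minimal sketch that builds a viewer URL for the two phrases. The query-parameter names (content, year_start, year_end, smoothing) mirror the viewer’s publicly visible URL format rather than a documented API, so treat them as assumptions.

```python
# Build a Google Books Ngram Viewer URL comparing two phrases over time.
# The parameter names mirror the viewer's visible URL format; they are
# assumptions, not a documented API.
from urllib.parse import urlencode

terms = ["homosexuality", "sexual orientation"]
params = {
    "content": ",".join(terms),  # comma-separated phrases to plot
    "year_start": 1800,
    "year_end": 2008,
    "smoothing": 3,
}
url = "https://books.google.com/ngrams/graph?" + urlencode(params)
print(url)  # open in a browser to see the proportional frequencies over time
```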

Originally posted on April 22, 2014.

 

Critics have used the SAT test redesign to denounce the SAT and aptitude testing.  The multiple choice SAT has “never been a good predictor of academic achievement,” Bard College president Leon Botstein argued in Time. Better to “look at the complex portrait” of college applicants’ lives, including “what their schools are like,” said Colby College English professor Jennifer Finney Boylan in a New York Times essay. The SAT only measures “those skills … necessary for the SATs,” surmised New Yorker staff writer Elizabeth Kolbert.

In a new Slate essay, David Hambrick and Christopher Chabris, distinguished experimental psychologists at Michigan State University and Union College, rebut such assertions.  Massive data, they argue, show that

•    SAT scores do predict first-year GPA, whole-college GPA, and graduation likelihood, with the best prediction coming from a combination of both high school grades and aptitude scores.
•    SAT scores of 13-year-olds predict future advanced degrees and income, much as kindred and strongly related IQ scores predict job training and vocational success.
•    In one famous nationwide sample, the IQ scores of Scottish 11-year-olds predicted their later-life longevity, even after adjusting for socioeconomic status.
•    Although SAT scores are slightly higher among students from high income families, the SAT also provides an opportunity for students from nonelite public schools to display their potential—rather than to be judged by “what their schools are like.”  Thus SAT scores, when compared with assessments influenced by income-related school quality, have a social leveling effect.
•    Test preparation courses often taken by higher income prep school students “don’t change SAT scores much.”

Ergo, say Hambrick and Chabris, while other traits such as grit, social skill, conscientiousness, and creativity matter, too, “the idea that standardized tests and ‘general intelligence’ are meaningless is wishful thinking.”

Originally posted on April 24, 2014.

 

“39-Year-Old Deaf Woman Hears for First Time” headlined Yahoo, in one of the many gone-viral Deaf-can-now-hear videos.  Each depicts the compelling emotions of someone who, thanks to the activation of a new cochlear implant (CI), is said to be hearing sound for the first time—and (in this case) conversing in English!  Was this woman (Joanne) completely congenitally deaf as a result of Usher’s Syndrome?  And did she immediately gain, as some media implied, the ability to understand speech on first hearing it?

As my brother said in forwarding this, it’s “an amazing story.”

The power of CIs to restore hearing is, indeed, amazing, as I can attest from meeting many people with CIs at hearing loss meetings.  As one who is tracking toward the complete deafness that marked the last dozen years of my mother’s life, I anticipate someday benefitting from CIs.

Moreover, I appreciate the power of a compelling example, such as the video example I offer (here) of a child’s first experience of a home TV room hearing loop.  And who can suppress a smile when watching this boy’s first experience of a CI?

Without disrespecting the Deaf culture (which regards deafness and Sign language as not defects needing fixing), and without diminishing Joanne’s powerful experience, what shall we make of her ability to understand and to speak? Does this video overturn what psychological science has taught us about the critical period for language development during life’s early years?  Is it not important that children receive CIs before language develops?  Haven’t experiments that removed cataracts and “restored vision” to natively blind people taught us that, for normal perceptual experience, the brain must be sculpted by sensory input in life’s early years?

I posed these questions to Dr. Debara Tucci, a Duke Medical Center cochlear implant surgeon with whom I serve on the advisory council of the National Institute on Deafness and Other Communication Disorders.  Our shared questions:

     1. Was Joanne completely deaf from birth?  Has she heard no sound until the moment of this recording?  As I will explain in a future entry, in popular use “deaf” often conflates true and complete deafness with substantial hearing loss.  Some Usher’s Syndrome patients sometimes are born completely deaf, but others experience progressive hearing loss.  With hearing aids, they acquire language early in life.  Joanne’s use of spoken language suggests that she is not hearing speech for the first time in her life.

     2. A person who has been completely deaf from birth could potentially lip read.  When testing such patients with newly activated CIs, it would be interesting to know if they can “hear” speech when the speaker’s face is obscured.

As a CI provider, Dr. Tucci nevertheless welcomes such videos: 

“Even though the history accompanying the video may not be entirely correct, and a little misleading, it is basically a positive thing.  I would rather have 10 people come in and be told they are not a candidate than miss one person who is.  Also, we are implanting long deafened people who don't have speech/language ability not with the thought that they will develop or understand speech, but to increase connectedness and for safety concerns.”

Originally posted on April 28, 2014.

 

Reports of restored vision in children in India have been confirmed in a new Psychological Science article, summarized here, on “Improvement in spatial imagery following sight onset late in childhood.”

The research, led by Tapan Kumar Gandhi of MIT’s Brain and Cognitive Sciences department, in collaboration with Suma Ganesh and Pawan Sinha, studied children who were blinded from birth by dense cataracts. After surgery removed the cataracts at about ages 12 to 14, the children were no longer completely blind. Their new ability to discern light and dark enabled some spatial imagery.

Practically, I wondered, what does this mean? Doesn’t the brain need to experience normal sensory input early in life in order to produce normal perceptual experience later in life? I asked Dr. Gandhi to explain the children’s post-surgery abilities. Could they potentially ride a bicycle or drive a car? His answer (quoted with permission):

The onset of sight is not immediately accompanied by much joy or pleasure, contrary to what is depicted in movies. The child has to get used to the new inputs. Over the first few weeks, the child begins to feel more comfortable with the visual world, even though they might not recognize much of it. Their visual acuity is sub-par, most likely permanently so. But, despite a blurry percept, the brain is able to achieve significant proficiency over the course of the first half year on many visual skills such as face detection, and visually guided navigation. Although driving is well-beyond their economic means, some of the Prakash children have indeed learned to ride a bicycle. We typically find that the children and their parents are in high spirits when they visit us for a clinical follow-up a few weeks after the surgery.

David Myers

Who Is Deaf?

Posted by David Myers Expert Jul 19, 2016

Originally posted on April 30, 2014.

 

Those of us with hearing loss cheered one of our own, Seattle Seahawks football player Derrick Coleman, as he became a national exemplar in the U.S. for living with hearing loss. We reveled in the Super Bowl Duracell ad chronicling his life story.  And we felt a warm glow when he gifted twin New Jersey 9-year-old sisters with Super Bowl tickets and handwritten encouraging words:  “Even though we have hearing aids, we can still accomplish our goals and dreams!”

As 500,000+ Google links to “Deaf Seahawks fullback” testify, Coleman’s story inspires us.  The reports of Coleman’s “deafness” also raise an interesting question:  Who is deaf?

By using a combination of hearing aids and the natural lip reading that we all do, Coleman, despite his profound hearing loss, reportedly hears his quarterback call plays amid the din of the Seahawks stadium.  And he converses, as when amiably answering questions at a Super Bowl press session. In doing so, he is akin to millions of others who live well with hearing loss.

Without our hearing aids or cochlear implants, some of us among the world’s 360 million people with hearing loss become truly deaf—unable to hear normal conversation.  When I remove my hearing aids before showering in my college gym, the locker room banter goes nearly silent.  In bed at night without my aids, my wife’s voice from the adjacent pillow becomes indecipherable, unless she turns to speak into my ear.

So, in his everyday functioning, is Derrick Coleman “deaf”?

Am I deaf?  Are my friends in the hearing loss community deaf?

Partly out of respect for my nonhearing, signing cousins in the Deaf Culture, my answer is no:  I am not Deaf.  Like Deaf people who fluently communicate with Sign, a genuine language, I am also not disabled or “hearing impaired” (which labels a person).  Rather I am a person with hearing loss.  The Hearing Loss Association of America—“the nation’s voice for people with hearing loss”—offers resources that assist “people with hearing loss and their families to learn how to adjust to living with hearing loss [and] to eradicate the stigma associated with hearing loss”—and thus to live as not-deaf.

I asked the Association’s recently retired director, Brenda Battat, whose hearing was partially restored with a cochlear implant, if she considers herself deaf.  “No. From a life experience, functioning, and self-identity perspective I do not consider myself deaf.”

Ditto my friend, musician Richard Einhorn, who has a substantial hearing loss and was recently featured in a news story that was headlined: "Hearing Loops Give Music Back to Composer Who Went Deaf in a Day."

“The ‘deaf’ label is not accurate,” notes Einhorn, who uses various technologies to hear.  “With a good hearing aid and additional listening technology such as hearing loops, I can hear well enough in most situations to participate fully in conversations and enjoy live music, theater, and films.”

Thanks to new hearing technologies, most of us with hearing loss can effectively function as not-deaf.  My state-of-the-art hearing aids amplify sound selectively, depending on my loss at different frequencies.  They offer directionality.  They compress sound (raising soft sound and lowering extremely loud sound).  Via a neck-worn Bluetooth streamer, they wirelessly transmit phone conversation and music from my smart phone to both my hearing aids. And thanks to my favorite hearing technology—the hearing loops that broadcast PA sound wirelessly to my in-the-ear speakers (aka hearing aids)—I hear!

Ergo, while most natively Deaf people are served by Sign, the rest of us—the invisible majority with hearing loss—need hearing assistance.  We respect, but live outside of, the Deaf Culture.  We benefit from new hearing technologies.  Lumping all people with hearing loss together as “deaf” respects neither Deaf people nor those with hearing loss.  Here ye, hear ye!

Originally posted on May 2, 2014.

 

Many faculty fret over students’ in-class use of computers—ostensibly there for note taking, but often also used for distracting e-mail, messaging, and checking social media.  A soon-to-be-published study by Pam Mueller (Princeton University) and Daniel Oppenheimer (UCLA) offers faculty an additional justification for asking students not to use computers.

In three experiments, they gave students either a laptop or a notebook and invited them to take notes on a lecture (a TED lecture in two of the studies).  Later, when they tested their memory for the lecture content, they found no difference in recall of factual information.  But taking notes in longhand, which required participants to summarize content in their own words, led to better performance on conceptual-application questions. “The Pen Is Mightier Than the Keyboard” is the apt title for their article, to appear in Psychological Science.

“Participants using laptops were more inclined to take verbatim notes,” explained Mueller and Oppenheimer.  Better to synthesize and summarize, they conclude:  “laptop use in classrooms should be viewed with a healthy dose of caution; despite their growing popularity, laptops may be doing more harm in classrooms than good.”

For one of my colleagues, this study, combined with the unwanted distractions of in-class computer use, inspires a new class policy:  for better learning, no computer use in class.

Originally posted on May 6, 2014.

 

At the 2012 International Congress of Psychology meeting in Cape Town, I enjoyed a wonderful talk by Elizabeth Loftus, which offered a terrific demo of how memory works.  Loftus showed us a handful of individual faces that we were later to identify, as if in a police line-up.  Later, she showed us some pairs of faces, one seen earlier and one not, and asked us which one we had seen.  In the midst of these, she slipped in a pair of faces that included two new faces, one of which was rather like an earlier seen face. 

Most of us understandably but wrongly identified this face as previously seen.  To climax the demonstration, when she showed us the originally seen face and the previously chosen wrong face, most of us (me, too) picked the wrong face!  As a result of our memory reconsolidation, we—an audience of psychologists who should have known better—had replaced the original memory with a false memory.

[Photo: International Congress of Psychology, Cape Town]

David Myers

Revealing Hidden Secrets

Posted by David Myers Expert Jul 19, 2016

Originally posted on May 8, 2014.

 

Knowing that people don't wear their hearts on their sleeves, psychologists have longed for a "pipeline to the heart."  One strategy, developed nearly a half century ago by Edward Jones and Harold Sigall, created a “bogus pipeline.” Researchers would convince people that a machine could use their physiological responses to measure their private attitudes.  Then they would ask them to predict the machine's reading, thus revealing attitudes which often were less socially desirable than their verbalized attitudes. 

More recently, psychologists have devised clever strategies for revealing “implicit attitudes,” by using reaction times to assess automatic associations between attitude objects and evaluative words. (In contrast to consciously held “explicit attitudes,” implicit attitudes are like unconscious habits.)

A new working paper (abstract; PDF) by Katherine Coffman and fellow economists demonstrates a third strategy for getting people to reveal sensitive information—the “Item Count Technique” (ICT).

One group was given four simple statements, such as “I spent a lot of time playing video games as a kid,” and then was asked how many of the four statements “apply to you.”  A second group was given the same four statements plus a fifth:  “I consider myself to be heterosexual,” and then was asked how many of the five statements “apply to you.”

Although no individual is asked to reveal which specific statements are true of them, a comparison of the two groups’ answers reveals—for the sampled population—the percent agreeing with the fifth statement.

Thus, without revealing anyone’s sexual orientation, the ICT aggregate data showed a 65 percent higher rate of non-heterosexual identity than was self-reported among people who were asked straight-out: “Do you consider yourself to be heterosexual?”  
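The arithmetic behind that estimate is simple. Here is a minimal sketch of the ICT logic with made-up group means; the numbers and function name are illustrative, not taken from the Coffman paper.

```python
# Item Count Technique: each respondent reports only HOW MANY statements apply,
# so no individual reveals the sensitive item. Comparing the two groups' mean
# counts estimates the proportion endorsing the added (sensitive) statement.

def ict_prevalence(mean_with_sensitive_item: float, mean_without: float) -> float:
    """Estimated endorsement rate = difference between the two group means."""
    return mean_with_sensitive_item - mean_without

# Hypothetical means, not the study's data:
four_item_group_mean = 2.10   # group shown 4 innocuous statements
five_item_group_mean = 2.95   # group shown the same 4 plus the sensitive statement

print(f"Estimated endorsement of the sensitive item: "
      f"{ict_prevalence(five_item_group_mean, four_item_group_mean):.0%}")
# -> 85% with these made-up numbers
```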

But then let’s not discount public surveys.  Nate Silver’s digest of presidential polling data correctly predicted not only the 2012 U.S. national presidential outcome, but also the outcome in all 50 U.S. states.  Specific, explicit attitudes can predict behavior.

David Myers

Why Do We Sleep?

Posted by David Myers Expert Jul 19, 2016

Originally posted on May 12, 2014.

 

Sleep consumes time we could spend foraging and it exposes us to predators.  It’s a waste and a risk.  So why do humans sleep?  Why didn’t nature design us for continual activity and vigilance?

In the October 18, 2013 Science, researchers offer an answer:  sleep enables house cleaning.  Studies of mice show that sleep sweeps the brain of toxic metabolic waste products.

Ergo, at the day’s end we can say to our loved ones:  Good night.  Sleep tidy.

Originally posted on May 14, 2014.

 

Tyler Vigen, a Harvard Law student, has a new website (here) that offers “a fun way to look at correlations and to think about data.”  Among the whimsical spurious (chance) correlations he presents is a rare example of a perfect 1.0 correlation.  I’ve reconstructed it into a form familiar to psychology teachers and students:

 

[Figure: reconstructed example of a spurious 1.0 correlation]
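One reason perfect chance correlations turn up on such sites is that, with only a handful of data points, any two series that both happen to trend steadily in one direction will correlate at or near 1.0. A minimal sketch with invented numbers (not Vigen’s data):

```python
# Two unrelated but steadily rising series correlate perfectly by chance.
# Invented numbers for illustration only. Requires Python 3.10+ for statistics.correlation.
from statistics import correlation

series_a = [12, 15, 18, 21]      # some yearly count that rises steadily
series_b = [3.0, 3.5, 4.0, 4.5]  # an unrelated quantity that also rises steadily

print(f"r = {correlation(series_a, series_b):.2f}")  # -> r = 1.00
```

A perfect r by itself says nothing about causation; it can arise whenever two unrelated trends both move steadily in the same direction.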

Originally posted on May 19, 2014.

 

Self-serving bias—the tendency to perceive oneself favorably—has become one of personality-social psychology’s most robust phenomena.  It’s our modern rendition of ancient wisdom about pride, which theologians have considered the basic sin (much as Freud considered repression the basic defense mechanism).

Self-serving bias appears in people’s judging themselves as better-than-average—on just about any subjective, socially desirable dimension.  Compared with people in general, most people see themselves as more ethical, friendly, intelligent, professionally competent, attractive, unprejudiced, and healthy—and even as more unbiased in their self-assessments!

As part of my reporting on the world of psychology, I enjoy, as a British Psychological Society affiliate member, two of its journals, and also its Research Digest.  (The digest, authored by Christian Jarrett, is available as a free bimonthly e-mail here.) The Digest put a smirk on my face with its synopsis of a new British Journal of Social Psychology report by Constantine Sedikides, Rosie Meek, Mark Alicke, and Sarah Taylor.  The Sedikides team found that English prisoners incarcerated for violence and robbery saw themselves, compared with “an average member of the community,” as (I am not making this up) more moral, kind, and compassionate.

Shelley Taylor’s humbling surmise, in her 1989 book, Positive Illusions, still rings true: “The [self-]portraits that we actually believe, when we are given freedom to voice them, are dramatically more positive than reality can sustain.”

 

Originally posted on May 21, 2014.

 

A New York Times report on “the extreme sport” of remembering confirms what psychology instructors have long taught:  the power of mnemonic aids, especially organized images, to boost memory performance.  We humans are really good at retaining visual images, and we’re much better at later reproducing visualizable words (bicycle) than abstract words (process).  Thus it can help, when remembering a short grocery list, to use the peg-word system, with numerically ordered items—bun, shoe, tree, door, etc.—and to hang the grocery items on those images.

Likewise, reports the Times article, all the competitors in a recent world memory contest used a “memory palace,” by associating to-be-remembered numbers, words, or cards with well-learned places, such as the rooms of a childhood home.  Challengers who claim to have invented an alternative method inevitably “come in last, or close to it,” noted one world-class competitor.

Memory researchers who study these mnemonists report that they are, as you might expect, smart.  But they also have unusual capacities for focused attention and holding information in working memory.

Yet, like you and me successfully forgetting previous locations of our car in the parking lot, they also need to be able to replace their place-item associations with new items. In this they are unlike students, who, if they are to become educated persons, need to retain information for months and years to come.  And for that there is no easy substitute for other well-researched memory aids, such as spaced practice, active rehearsal, and the memory consolidation that comes with a solid night’s sleep.

Originally posted on May 23, 2014.

 

John Watson and Rosalie Rayner made psychology history with their 1920 report of the fear conditioning of 11-month-old “Little Albert.”  After repeated pairings of a white rat with an aversive loud noise, Albert reportedly began whimpering at the sight of the rat.  Moreover, his fear reaction generalized, to some extent, to the sight of a rabbit, a dog, and a sealskin coat, but not to more dissimilar objects.

Ever since, people have wondered what became of Little Albert.  One team of psychologist-sleuths identified him as Douglas Merritte, the son of a campus hospital wet nurse; Douglas died of meningitis at age 6.  For a forthcoming article in the American Psychologist, another team of sleuths—Russell Powell, Nancy Digdon, Ben Harris, and Christopher Smithson—has identified an even more promising candidate.  William Albert Barger, who went by “Albert B”—the very name used by Watson and Rayner—neatly fits many of Little Albert’s known characteristics.  This Albert was not brain-damaged and was easy-going, though (likely coincidentally, given how Albert’s fears would diminish between sessions) he had an aversion to dogs!

Albert died in 2007, without ever knowing of his early life in a hospital residence, or of his apparent part in psychology’s history.

Originally posted on May 28, 2014.

 

Climate change is upon us.  The recent National Climate Assessment, assembled by a large scientific panel, confirms that greenhouse gases continue to accumulate.  The planet is warming. The West Antarctic ice sheet is doomed. The seas have begun rising.  And more extreme weather will plague our future.

Alas, most of the American public is not yet alarmed about this weapon of mass destruction.  The 31 percent who in 1998 thought “the seriousness of global warming is generally exaggerated” increased to 42 percent in 2014.  And the 34 percent of Americans who in 2014 told Gallup they worry “a great deal” about global warming was essentially the same as in 1989.

Part of the problem is what psychologists and their students know as the availability heuristic. Our judgments get colored by mentally available events and images. And what’s more cognitively available than slow climate change is our recently experienced local weather (see here and here).  Local recent temperature fluctuations tell us nothing about long-term planetary trends. (Our current weather is just weather.) Yet, given unusually hot local weather, people become more accepting of global climate warming, while a recent cold day reduces people’s concern about climate warming and overwhelms less memorable scientific data.  Snow in March?  “So much for global warming!”

After Hurricane Sandy devastated New Jersey, its residents’ vivid experience of extreme weather increased their environmentalism.  This suggests that a silver lining to the tragedy of more droughts, floods, heat waves, and other extreme weather may, in time, be increased public concern for climate change.  In the meantime, to offer a vivid depiction of climate change, Cal Tech scientists have created an interactive map of global temperatures over the last 120 years.

Originally posted on May 30, 2014.

 

If asked to name the most eminent psychologists of the modern era, who would come to your mind?

For a forthcoming Archives of Scientific Psychology report, Ed Diener (University of Illinois), Shigehiro Oishi (University of Virginia), and JunYeun Park (University of Illinois) painstakingly assembled data from citations, textbook page coverage, and major awards.

Their top three, in order, were Albert Bandura (whose 218,219 citations also marked him as our most cited psychologist), Jean Piaget, and Daniel Kahneman.

Looking just at introductory psychology textbook pages mentioning different psychologists, the top two were Robert Sternberg and Martin Seligman.

Originally posted on June 5, 2014.

 

An amazingly comprehensive new Lancet study, with nearly 150 authors, tracks overweight and obesity rates across 188 countries from 1980 to 2013.  Some highlights:

  • Worldwide, the proportion of overweight adults (BMI ≥ 25) increased from 29 to 37 percent among men and 30 to 38 percent among women.
  • Over the last 33 years, no country has reduced its obesity rate.
  • In 2010, “overweight and obesity were estimated to cause 3.4 million deaths.”
  • National variations are huge, with the percentage overweight ranging from 85 percent among adults in Tonga to 3 percent in Timor-Leste.

The study is amazing not only in its global comprehensiveness, across time, but also in its public, interactive data archive available from the Institute for Health Metrics and Evaluation.

As a screen shot example, I compared the U.S. increase in the overweight percentage (upper dark line) with the global increase (lower dark line).  All other countries are in light blue.

[Figure: U.S. (upper dark line) versus worldwide (lower dark line) overweight percentages over time, with other countries in light blue]

Originally posted on June 12, 2014.

 

My last post—noting the new worldwide estimate that 37 percent of men and 38 percent of women are overweight—got me to wondering if we have other examples of all-humanity data. One is our species’ life expectancy, which has risen from 46.5 years in the early 1950s to 70 years today. What a gift—two dozen more years of life!

And then we have new data from the Gallup World Poll, which surveys countries containing more than 98 percent of the world’s population. Aggregating data from this resource, Ed Diener, Louis Tay, and I were able to answer (here) this simple question: Asked, “Is religion important in your daily life?,” what percent of humanity will respond “yes”?

The answer: 68 percent. Two in three humans.

When mentioning this answer in talks, I offer, with a smirk, the usual caveat on reporting survey data: We should be cautious about generalizing beyond the population sampled. (These data represent but one species on one planet, and may not represent the views of other life forms elsewhere in the universe.)

What’s striking about each of these all-humanity measures is the extraordinary variation across countries—from 3 percent overweight adults in Timor-Leste to 85 percent in Tonga; from a 49-year life expectancy in Chad to 89 years in Monaco; from 16 percent for whom religion is important in Estonia to 100 percent in Bangladesh and Niger. We humans are all kin beneath the skin. Yet how we differ.

[A note to our valued readers:  Nathan DeWall and I anticipate a more relaxed two-a-week pace of blogging this summer, and returning to our weekday postings at the summer’s end.]

Originally posted on June 17, 2014

 

Is religion toxic to human flourishing . . . or is it supportive of human happiness, health, and helpfulness? Let’s make this empirical: Is religious engagement associated with humans living well, or with misery, ill-health, premature death, crime, divorce, teen pregnancy, and the like?

The answer differs dramatically by whether we compare places (such as more versus less religious countries or states) or individuals.

For starters, I manually harvested data from a Gallup World Poll, and found a striking negative correlation across 152 countries between national religiosity and national well-being:

[Figure: national religiosity versus national well-being across 152 countries]

Then I harvested General Social Survey data from the U.S. and found—as many other researchers in many other countries have found (though especially in more religious countries)—a positive correlation between religiosity and happiness across individuals.

[Figure: individual religiosity and happiness in U.S. General Social Survey data]

 

For additional striking examples of the religious engagement paradox—associating religious engagement with life expectancy, smoking, arrest rate, teen pregnancy, and more (across states versus across individuals)—see here.

Princeton economist Angus Deaton and psychologist Arthur Stone have recently been struck by the same paradox. They ask (here), “Why might there be this sharp contradiction between religious people being happy and healthy, and religious places being anything but?”

Before answering that question—and wondering whether the more important story is told at the aggregate or individual level—consider a parallel paradox, which we might call “the politics of wealth paradox.” Question: Are rich Americans more likely to vote Republican or Democrat?

When we compare states (thanks to Chance News) we can see that low income predicts Republican preferences. Folks in wealthy states are more likely to vote Democratic!  So, being rich inclines one to liberalism?

Not so fast: comparing individuals, we see the opposite (and more expected) result—high income folks vote more Republican.

 

[Figure: income and Republican versus Democratic voting, comparing states and individuals]

These are the sorts of findings that excite behavioral science sleuths. Surely there must be some confounding variables. With religiosity, one such variable is income—which is lower in highly religious countries and states. Control for status factors such as income (as Louis Tay did for our article with Ed Diener), and the negative correlation between religiosity and well-being disappears, and even reverses to slightly positive. Likewise, low income states differ from high income states in many ways, including social values that also predict voting.

Ergo, my hunch is that, in both the religious and political realms, the most important story is found at the level of the individual. Nevertheless, there are practical uses for these data. If you’re wanting to make religious engagement look bad, use the aggregate, macro-level data. If you want to make religious engagement look good, use the individual data.
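For readers who want to see how the sign of an association can flip between the aggregate and individual levels, here is a toy sketch with invented numbers, akin to Simpson’s paradox; it is not the actual Gallup or GSS data.

```python
# Toy illustration (invented numbers): religiosity can correlate negatively
# with well-being ACROSS places yet positively across INDIVIDUALS within places,
# once a confound such as income differs between the places.
from statistics import correlation, mean

# Two hypothetical "countries": one poorer and highly religious, one richer and secular.
# Each tuple is (religiosity 0-10, well-being 0-10) for one individual.
poor_religious = [(9, 5), (8, 4), (10, 6), (7, 3)]
rich_secular   = [(3, 8), (1, 6), (2, 7), (4, 9)]

# Individual-level association within each country: religiosity tracks well-being.
for name, group in [("poor/religious", poor_religious), ("rich/secular", rich_secular)]:
    r = correlation([x for x, _ in group], [y for _, y in group])
    print(f"within {name}: r = {r:.2f}")   # -> 1.00 in both toy countries

# Country-level (aggregate) comparison: the more religious country has LOWER mean well-being.
print("country means (religiosity, well-being):",
      (mean(r for r, _ in poor_religious), mean(w for _, w in poor_religious)),
      (mean(r for r, _ in rich_secular), mean(w for _, w in rich_secular)))
```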

Originally posted on June 26, 2014.

 

The development of adolescent impulse control lags sensation-seeking.  That’s the bottom line result of Laurence Steinberg’s report from surveys of more than 7000 American 12- to 24-year-olds, as part of the National Longitudinal Study of Youth and Children and Young Adults. Sensation-seeking behaviors peak in the mid teens, with impulse control developing more slowly as frontal lobes mature.

[Figure: sensation seeking and impulse control across ages 12 to 24]

These trends fit nicely with data from longitudinal studies that, after following lives through time, find that most people become more conscientious, stable, agreeable, and self-confident in the years after adolescence.  The encouraging message for parents of 15-year-olds: you may be pleasantly surprised at your more self-controlled 25-year-old offspring to come.  And for courts, says Steinberg, the brain development and behavioral data together should inform decisions about the criminal sentencing of juveniles.

Originally posted on July 1, 2014.

 

In all of recent psychological science, there have been, to my mind, no more provocative studies than those by Benjamin Libet.  His experiments have seemingly shown that when we move our wrist at will, we consciously experience the decision to move it about 0.2 seconds before the actual movement. No surprise there. But what startled me was his reporting that our brain waves jump about 0.35 seconds before we consciously perceive our decision to move! This “readiness potential” has enabled researchers (using fMRI brain scans) to predict participants’ decisions to press a button with their left or right finger. The startling conclusion: Consciousness sometimes appears to arrive late to the decision-making party.

And so it has also seemed in Michael Gazzaniga’s reports of split-brain patients who readily confabulate (make up and believe) plausible but incorrect explanations for their induced actions. If Gazzaniga instructs a patient’s right brain to “Walk,” the patient’s unaware left hemisphere will improvise an explanation for walking: “I’m going into the house to get a Coke.”  The conscious left brain is the brain’s public relations system—its explanation-constructing “interpreter.”

So, do Libet’s and Gazzaniga’s observations destroy the concept of free will?  Does our brain really make decisions before our conscious mind knows about them?  Do we fly through life on autopilot?  Are we (our conscious minds) mere riders on a wild beast?

Not so fast.  Stanislas Dehaene and his colleagues report that brain activity continuously ebbs and flows, regardless of whether a decision is made and executed.  The actual decision to move, they observe, occurs when the brain activity crosses a threshold, which happens to coincide with the average “time of awareness of intention to move” (about 0.15 second before the movement).  In their view, the mind’s decision and the brain’s activity, like a computer’s problem solving and its electronic activity, are parallel and virtually simultaneous.

The late neuroscientist Donald MacKay offered a seemingly similar idea:  “When I am thinking, my brain activity reflects what I am thinking, as [a computer’s] activity reflects the equation it is solving.”  The mind and brain activities are yoked (no brain, no mind), he argued, but are complementary and conceptually distinct.  As my colleague Tom Ludwig has noted, MacKay’s view—that mental events are embodied in but not identical to brain events—is a third alternative to both dualism and materialism (physicalism).

Originally posted on July 8, 2014.

 

In a new Politico essay (here) I offer four social psychological principles that shed light on enmities both violent (Sunni v. Shia) and playful (sports rivalries).

Originally posted on July 15, 2014.

 

Most of us have read over and again that the human brain has 100 billion neurons.  With no source but legend for that big round number—and not wanting merely to echo an undocumented estimate from other books—I set off in search of a more precise estimate.  Surely someone must have sampled brain tissue, counted neurons, and extrapolated a nerve cell estimate for the whole brain.  (It’s not that the number affects our understanding of how the brain works, but we might as well get the facts right.)

One researcher whose name I was disposed to trust—Gabrielle De Courten-Myers—explained to me by e-mail how she used “histological neuronal density and cortical thickness measurements in 30 cortical samples each from 6 males 12 to 24 years old,” from which she extrapolated an estimate of 23 billion neurons for the male cerebral cortex.  Although she didn’t have data for the rest of the brain, her guess in 2005 was that a whole-brain total would be “somewhere around 40 billion neurons.”
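The extrapolation logic she describes can be sketched like this; every number below is a placeholder chosen only to illustrate the sample-and-extrapolate step, not her actual measurements.

```python
# Sketch of sample-and-extrapolate neuron counting (all values are placeholders).
neurons_per_mm3 = 40_000           # hypothetical mean neuronal density from tissue samples
cortical_surface_mm2 = 180_000     # hypothetical total cortical surface area
mean_cortical_thickness_mm = 2.8   # hypothetical mean thickness across sampled sites

cortical_volume_mm3 = cortical_surface_mm2 * mean_cortical_thickness_mm
estimated_cortical_neurons = neurons_per_mm3 * cortical_volume_mm3
print(f"~{estimated_cortical_neurons / 1e9:.0f} billion cortical neurons")
# -> ~20 billion with these placeholders (her published estimate was ~23 billion)
```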

Later, a different research team, using a method that is beyond my pay grade to understand (but apparently involved making a “brain soup” of four male brains, postmortem, and counting neural nuclei) estimated 86 billion neurons in the male brain (though yet another expert with whom I corresponded questioned the validity of their method).

So, how many neurons have we in our human brains?  Apparently something less than 100 billion, but the number is uncertain.  What’s more certain is that we should be suspicious of unsourced big round numbers:  “The brain has 100 billion neurons.” “Ten percent of people are gay.”  “We typically use but 10 percent of our brains.” 

Originally posted on July 24, 2014.

 

Some recent naturalistic observations illustrated for me the results of longitudinal studies of human development—studies that follow lives across time, noting our capacities for both stability and change.

My procedure, though time-consuming, was simple:

  1. Observation Stage 1:  Attend a small college, living on campus with ample opportunity to observe my many friends.
  2. Intervening experience:  Let 50 years of life unfold, taking us to varied places.
  3. Observation Stage 2:  Meet and talk with these friends again, at a college reunion.

Time and again, researchers have documented the remarkable stability of emotionality, intelligence, and personality across decades of life.  “As at age 7, so at 70” says a Jewish proverb.

And so it was for my friends (with names changed to protect identities).  Thoughtful, serious Joe was still making earnest pronouncements.  Driven, status-conscious Louise continues to visibly excel.  Exuberant Mark could still talk for ten minutes while hardly catching a breath.  Gentle, kind Laura was still sensitive and kindhearted.  Mischievous, prankster George still evinced an edgy, impish spirit.  Smiling, happy Joanne still readily grinned and laughed.  I was amazed:  a half century, and yet everyone seemed the same person that walked off that graduation stage.

In other ways, however, life is a process of becoming.  Compared to temperament and to traits such as extraversion, social attitudes are more amenable to change.  And so it was for us, with my formerly kindred-spirited dorm mates having moved in different directions . . . some now expressing tea partyish concerns about cultural moral decay and big government, and others now passionate about justice and support for gay-lesbian aspirations.  Before they opened their mouths, I had no idea which was going to be which.

And isn’t that the life experience of each of us—that our development is a story of both stability and change?  Stability, rooted in our enduring genes and brains, provides our identity . . . while our potential for change enables us to grow with experience and to hope for a brighter future.

(For more on the neurobiology that underlies our stable individuality, and on the brain plasticity that enables our changing, see Richard Davidson’s recent Dana Foundation essay.)

Originally posted on July 29, 2014.

 

July brought the pleasure of attending Stanford University’s conference on teaching introductory psychology, hosted by its Psych One program coordinator, Bridgette Martin Hard.

One of the 70 attendees was the indefatigable Sue Frantz, winner of multiple awards and citations for her contributions to the teaching of psychology (and to educating faculty about teaching technologies).  Frantz, who is also the Society for the Teaching of Psychology’s Vice-President for Resources, tweeted conference highlights:

Worth TLC @WorthPsychTLC ·  Jul 10

.@ericlandrum: Employers want effective communicators, critical thinkers, & those who can apply knowledge to rl #psychoneconference

Worth TLC @WorthPsychTLC ·  Jul 10

.@ericlandrum book recommendation: Student Success in College. Review here: http://www.insidehighered.com/news/2005/05/18/kuh#sthash.u2Y1V0vQ.dpbs …

Worth TLC @WorthPsychTLC ·  Jul 10

E.Hardin: To stop group disc, silently raise hand, signaling stdts to stop talking & raise hands to signal others #psychoneconference [Slightly edited]

Worth TLC @WorthPsychTLC ·  Jul 10

R.Jhangiani: Have you seen this article? Revisiting the Stanford Prison Study (2007). http://www.ncbi.nlm.nih.gov/pubmed/17440210

Worth TLC @WorthPsychTLC ·  Jul 10

R.Jhangiani: It's the Stanford Prison STUDY, not the Stanford Prison EXPERIMENT

Worth TLC @WorthPsychTLC ·  Jul 10

D.Myers: To increase engagement, pack students into a small space. Stack extra chairs in the back. #psychoneconference

Retweeted by Worth TLC

Melissa Beers @mjbeers1 ·  Jul 11

When freshmen reappraise anxiety as arousal that can help them do better, academic performance improves. #psychoneconference

Worth TLC @WorthPsychTLC ·  Jul 11

S.Nolan: Free STP ebook Applying the Science of Learning to Education - http://teachpsych.org/ebooks/asle2014/index.php … #psychoneconference

For many more of Sue Frantz’s tweets—and to read her frequent tweeting of news and research from psychological science—follow her and others at Worth Publishers’ faculty lounge.

How Best to Prepare Students for Life Success?

David Myers

Originally posted on August 7, 2014.

 

One of the many delights from Stanford’s recent conference on teaching introductory psychology was being with and hearing Boise State professor Eric Landrum.  The exuberant Landrum is a longtime teaching-of-psychology leader, researcher, and author—and the 2014 president of the Society for the Teaching of Psychology.

His presentation offered his “all-time favorite PowerPoint slide.”  It summarizes the conclusions of research by Michigan State’s Collegiate Employment Research Institute showing the main reasons why new college grads get fired.  These include lack of work ethic, failure to follow instructions, missing assignments or deadlines, and being late.

Sound familiar?  Landrum, who studies what helps students succeed, draws a moral from these findings:  By simulating a real-world employer, and holding to standards, he is doing them a great favor.  He is preparing them for real-world success.

1406565226030-1.png

 

David Myers

The Eyes Have It

Posted by David Myers Expert Jul 19, 2016

Originally posted on August 12, 2014.

 

One of social psychology’s intriguing and oft-replicated findings is variously known as the “own-race bias,” the “other-race effect,” and the “cross-race effect”—all of which describe the human tendency to recall faces of one’s own race more accurately than faces of other races. “They”—the members of some other group—seem to look more alike than those in our own group. 

With greater exposure to other-race faces, as when residing among those of a different race, people improve at recognizing individual faces.  Still, the phenomenon is robust enough that social psychologists have wondered what underlies it.  In the July Journal of Personality and Social Psychology, a research team led by Kerry Kawakami at York University offers a possible contributing factor:  When viewing faces during several experiments, White participants attended more to the eyes of White people, and to the nose and mouth of Black people.  Eye gaze, they reason, is “individuating”—it helps us discern facial differences.  Thus the ingroup eye-gaze difference may help explain the own-race bias.

Originally posted on August 21, 2014.

 

One of psychology’s big discoveries is our almost irresistible tendency to judge the likelihood of events by how mentally available they are—a mental shortcut that Daniel Kahneman and Amos Tversky identified as “the availability heuristic.”  Thus anything that makes information pop into mind—its vividness, recency, or distinctiveness—can make it seem commonplace.  (Kahneman explores the power of this concept at length in Thinking, Fast and Slow, which stands with William James’ Principles of Psychology on my short list of greatest-ever psychology books.)

My favorite example of the availability heuristic at work is people’s misplaced fear of flying.  As I document in the upcoming Psychology, 11th Edition, from 2009 to 2011 Americans were—mile for mile—170 times more likely to die in a vehicle accident than on a scheduled flight.  When flying, the most dangerous part of our journey is typically the drive to the airport.  In a late 2001 essay, I calculated that if—because of 9/11—we in the ensuing year flew 20 percent less and instead drove half those unflown miles, about 800 more people would die.  German psychologist Gerd Gigerenzer later checked my estimate against actual traffic fatalities (why didn’t I think to do that?) and found that traffic fatalities did, indeed, jump after 9/11.  Thanks to those readily available, horrific mental images, terrorists had killed more people on American highways than died on those four ill-fated planes.

The availability heuristic operates in more mundane ways as well.  This morning I awoke early at an airport hotel, where I had been waylaid after a flight delay.  The nice woman working the breakfast bar told me of how she, day after day, meets waylaid passengers experiencing weather problems, crew delays, and mechanical problems.  Her conclusion (from her mentally available sample of flyers):  something so often goes awry that if she needed to travel, she would never fly.

Vivid examples make us gasp.  Probabilities we hardly grasp.

Originally posted on August 26, 2014.

 

In a recent New York Times essay (here), Henry Roediger explains the insights gleaned from his research on “the testing effect”— the enhanced memory that follows actively retrieving information, rather than simply rereading it. Psychologists sometimes also refer to this phenomenon as “test-enhanced learning,” or as the “retrieval practice effect” (because the benefits derive from the greater rehearsal of information when self-testing rather than rereading).

As Roediger explains, “used properly, testing as part of an educational routine provides an important tool not just to measure learning, but to promote it.”

For students and teachers, I offer a 5-minute animated explanation of the testing effect and how to apply it in one’s own study.  (I intend this for a class presentation or viewing assignment in the first week of a course.) See Make Things Memorable!  How to Study and Learn More Effectively.

Originally posted on September 3, 2014.

 

Skimming Paul Taylor’s The Next America: Boomers, Millennials, and the Looming Generational Showdown, a 2014 report of Pew Research Center data on U.S. social trends, brought to mind one of my pet peeves: the favoritism shown to seniors over today’s more economically challenged Millennials and their children. Since passing into AARP-eligible territory, I have often purchased fares or tickets at discounted prices, while the single parent in line behind me got hit with a higher price. One website offers 250,000+ discounts for folks over 50.

A half-century and more ago it made sense to give price breaks to often-impoverished seniors wanting a night out at the movies, hungry for a restaurant meal, or needing to travel on buses and trains. Many seniors still struggle to make ends meet and afford housing.  But thanks to improved Social Security and retirement income and to decreased expenses for dependents and mortgages, their median net worth has been increasing—37 percent since 1984, Taylor shows, while those under 35 have seen their net worth plummet 44 percent.

1406565534746-1.png

And consider who are today’s poor (from this figure, available here as well as in Taylor’s excellent book). Among the predictors is not only race but age.  Compared to four decades ago, today’s under-35 generation experiences a nearly doubled risk of poverty, while their senior counterparts suffer one-third the poverty rate of their 1960s counterparts.

Ergo, in view of this historical change in poverty risk, should we adjust our social priorities? Might a more child-affirming culture consider discounts for card-carrying custodial parents? And could we not offer inflation adjustments not only to senior citizen Social Security stipends but also to minimum wages, tax exemptions for dependents, and family and food assistance?

 

Originally posted on September 5, 2014.

 

Feeling stressed by multiple demands for your time and attention?  Daniel Levitin, director of McGill University’s Laboratory for Music, Cognition and Expertise and author of The Organized Mind: Thinking Straight in the Age of Information Overload, has some suggestions.  In a recent New York Times essay, he advises structuring our day to give space both for undistracted task-focused work and for relaxed mind-wandering:

If you want to be more productive and creative, and to have more energy, the science dictates that you should partition your day into project periods. Your social networking should be done during a designated time, not as constant interruptions to your day.

Email, too, should be done at designated times. An email that you know is sitting there, unread, may sap attentional resources as your brain keeps thinking about it, distracting you from what you’re doing. What might be in it? Who’s it from? Is it good news or bad news? It’s better to leave your email program off than to hear that constant ping and know that you’re ignoring messages.

Increasing creativity will happen naturally as we tame the multitasking and immerse ourselves in a single task for sustained periods of, say, 30 to 50 minutes. Several studies have shown that a walk in nature or listening to music can trigger the mind-wandering mode. This acts as a neural reset button, and provides much needed perspective on what you’re doing.

As one who is distracted by a constant stream of e-mails and the temptations of favorite web sites, I should take this advice to heart.  But I have benefitted from an e-mail system that separates e-mails that come to my public e-mail address (including political fund-raising appeals and list mail) from those that come to a private e-mail address known to family and colleagues.  The public e-mails are delivered to my inbox only at the day’s end.

I also find it helpful to take work out to coffee shops, including one that doesn’t have Internet access.  “They should charge you extra for that,” observed one friend.

In our upcoming Psychology, 11th Edition, Nathan DeWall and I offer some further advice:

In today’s world, each of us is challenged to maintain a healthy balance between our real-world and online time. Experts offer some practical suggestions for balancing online connecting and real-world responsibilities.

Monitor your time. Keep a log of how you use your time. Then ask yourself, “Does my time use reflect my priorities? Am I spending more or less time online than I intended? Is my time online interfering with school or work performance? Have family or friends commented on this?”

Monitor your feelings. Ask yourself, “Am I emotionally distracted by my online interests? When I disconnect and move to another activity, how do I feel?”

“Hide” your more distracting online friends. And in your own postings, practice the golden rule. Before you post, ask yourself, “Is this something I’d care about reading if someone else posted it?”

Try turning off your mobile devices or leaving them elsewhere. Selective attention—the flashlight of your mind—can be in only one place at a time. When we try to do two things at once, we don’t do either one of them very well (Willingham, 2010). If you want to study or work productively, resist the temptation to check for updates. Disable sound alerts and pop-ups, which can hijack your attention just when you’ve managed to get focused. (I am proofing and editing this chapter in a coffee shop, where I escape the distractions of the office.)

Try a social networking fast (give it up for an hour, a day, or a week) or a time-controlled social media diet (check in only after homework is done, or only during a lunch break). Take notes on what you’re losing and gaining on your new “diet.”

Refocus by taking a nature walk. People learn better after a peaceful walk in the woods, which—unlike a walk on a busy street—refreshes our capacity for focused attention (Berman et al., 2008). Connecting with nature boosts our spirits and sharpens our minds (Zelenski & Nisbet, 2014).

Originally posted on September 9, 2014.

 

How do we know ourselves?  It’s partly by observing our own actions, according to Daryl Bem’s self-perception theory.  Hearing ourselves talk can give us clues to our own attitudes.  Witnessing our actions gives us insight into the strength of our convictions (much as we observe others’ behavior and make inferences). Our behavior is often self-revealing.

The limits of such self-revelation have recently been explored by one of psychology’s most creative research teams at Sweden’s Lund University. The researchers, including Andreas Lind, were curious: “What would it be like if we said one thing and heard ourselves saying something else?” Would we experience an alien voice?  A hallucination? Would we believe our ears?

Through a noise-cancelling headset, the participants heard themselves name various font colors, such as the word green presented in a gray font color. But sometimes, the wily researchers substituted a participant’s own voice saying a previously recorded word, such as “green” instead of the correctly spoken “gray.” Surprisingly, two-thirds of these word switches went undetected, with people typically experiencing the inserted word as self-produced! (For more from the creative Lund University "choice blindness" research group, see here.)

A second new demonstration of the self-revealing power of our own behavior comes from research on the effects of feedback from our face and body muscles. As we have known for some time, subtly inducing people to make smiling rather than frowning expressions—or to stand, sit, or walk in an expansive rather than contracted posture—affects people’s self-perceptions.  Motions affect emotions.

At the University of Cologne, Sascha Topolinski and his colleagues report that even subtle word articulation movements come tinged with emotion.  In nine experiments they observed that both German- and English-speaking people preferred nonsense words and names spoken with inward (swallowing-like) mouth movements—for example, “BENOKA”—rather than outward (spitting-like) motions, such as KENOBA.  Ostensible chat partners given names (e.g., Manero) that activated ingestion muscles were preferred over chat partners whose names activated muscles associated with expectoration (e.g., Gasepa).

Self-perception theory lives on.  Sometimes we observe ourselves and infer our thoughts and feelings.

Originally posted on September 11, 2014.

 

In a recent blog essay (here) I advised thinking critically about big round numbers, including claims that the brain has 100 billion neurons, that we use 10 percent of our brains, and that 10 percent of people are gay.

Regarding the latter claim, a recent Gallup survey asked 121,290 Americans about their sexual identity: “Do you, personally, identify as lesbian, gay, bisexual, or transgender?” “Yes,” answered 3.4 percent.  And when a new National Center for Health Statistics study asked 34,557 Americans about their sexual identity, all but 3.4 percent of those who answered indicated they were straight. The rest said they were gay or lesbian (1.6 percent), bisexual (0.7 percent), or “something else” (1.1 percent).

Questions have recently arisen about another of psychology’s big round numbers—the claim that 10,000 practice hours differentiates elite performers, such as top violinists, from average to excellent performers.  As the distinguished researcher Anders Ericsson observed from his study of musicians (here), “the critical difference [is] solitary practice during their music development, which totaled around 10,000 hours by age 20 for the best experts, around 5,000 hours for the least accomplished expert musicians and only 2,000 hours for serious amateur pianists.”

Not so fast, say David Hambrick, Brooke Macnamara, and their colleagues (here and here). In sports, music, and chess performance, for example, people's practice time differences account for a third or less of their performance differences. Raw talent matters, too.

Perhaps both are right?  Are superstar achievers distinguished by their unique combination of both extraordinary natural talent and extraordinary daily discipline?

Originally posted on September 15, 2014.

 

Every once in a while I reread something that I've reported across editions of my texts, scratch my head, and ask myself: Is this really true?

Such was the case as I reread my reporting that “With the help of 382 female and 312 male volunteers. . . Masters and Johnson monitored or filmed more than 10,000 ‘sexual cycles.’”

Really?

I wasn't just makin’ stuff up.  Masters and Johnson do report (on page 15 of Human Sexual Response) their “conservative estimate of 10,000 complete sexual response cycles” in their laboratory (some involving multiple female orgasms).

But let’s do the numbers.  If they observed 10,000 complete sexual cycles over eight years[1] (from 1957 to 1965), then they averaged 1,250 sexual cycles observed per year.  Could we assume about an hour dedicated to each observation—including welcoming the participant(s), explaining the day’s tasks, attaching instruments, observing their behavior, debriefing them, and recording their observations?  And could we assume about 40 weeks a year of observation?  (Meanwhile, they were also running a sexual therapy clinic, writing, managing a lab, etc.)

So . . . doing the numbers . . . that’s roughly 31 weekly hours observing sex . . . for eight years.
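For readers who want to check the arithmetic, here is a purely illustrative back-of-the-envelope sketch (in Python) using only the assumptions stated above—10,000 cycles, eight years, roughly one hour per observation, and about 40 observation weeks per year:

```python
# Back-of-the-envelope check of the estimate above (illustrative only).
# Assumptions come straight from the text: 10,000 observed cycles,
# eight years of data collection, ~1 hour per observation, ~40 weeks/year.

total_cycles = 10_000
years = 8
hours_per_cycle = 1
weeks_per_year = 40

cycles_per_year = total_cycles / years                        # 1,250 cycles per year
hours_per_week = cycles_per_year * hours_per_cycle / weeks_per_year

print(f"Cycles observed per year: {cycles_per_year:.0f}")     # -> 1250
print(f"Observation hours per week: {hours_per_week:.0f}")    # -> ~31
```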

It boggles the mind.  And one wonders: Wasn't there some point of diminishing returns from observing yet another 1000 hours of sex . . . assuming Masters and Johnson reported truthfully?

I have no basis for doubting the accuracy and integrity of Masters and Johnson’s reporting.  But I do, in a spirit of curiosity, scratch my head.

 

[1] In Human Sexual Response, they report gathering data over “eleven years” (pp. 9, 20).  But Johnson didn't join Masters until 1957, and Johnson biographer Genoa Ferguson reports that Johnson “began doing sexual function research 6 months into her research position.” Also, Masters and Johnson report (p. 10) that the first 20 months of the observations—presumably by Masters without Johnson—involved medical histories of 118 prostitutes, eleven of whom “were selected for anatomic and physiologic study.”  Ergo, although Masters and Johnson’s reporting leaves the exact study period ambiguous, it appears that the great majority, if not all, of the reported 10,000+ “complete sexual response cycles” were observed during the seven or eight years after Johnson began her work with Masters. They also do not document the lab layout, or precisely how they observed their subjects. (As a point of contrast, Stanley Milgram’s similarly classic Obedience to Authority did precisely report on the participants, methods, and results of his various experiments, including drawings of the lab layout and equipment.)

Originally posted on September 7, 2014.

 

My wife loves me, despite smirking that I am “boringly predictable.”  Every day, I go to bed at pretty much the same time, rise at the same time, pull on my khaki pants and brown shoes, frequent the same coffee shops, ride the same old bicycle, and exercise every weekday noon hour.  As I walk into my Monday-Wednesday-Friday breakfast spot, the staff order up my oatmeal and tea.  I’ll admit to boring.

But there is an upside to mindless predictability.  As my colleagues and friends Roy Baumeister, Julia Exline, Nathan DeWall, and others have documented, self-controlled decision-making is like a muscle.  It temporarily weakens after an exertion (a phenomenon called “ego depletion”) and replenishes with rest. Exercising willpower temporarily depletes the mental energy needed for self-control on other tasks.  It even depletes the blood sugar and neural activity associated with mental focus. In one experiment, hungry people who had resisted the temptation to eat chocolate chip cookies gave up sooner on a tedious task (compared with those who had not expended mental energy on resisting the cookies).

President Obama, who appreciates social science research, understands this.  As he explained to Vanity Fair writer Michael Lewis, “You’ll see I wear only gray or blue suits. I’m trying to pare down decisions. I don’t want to make decisions about what I’m eating or wearing. Because I have too many other decisions to make.”  Lewis reports that Obama mentioned “research that shows the simple act of making decisions degrades one’s ability to make further decisions,” noting that Obama added, “You need to focus your decision-making energy. You need to routinize yourself. You can’t be going through the day distracted by trivia.”

So, amid today’s applause for “mindfulness,” let’s put in a word for mindlessness.  Mindless, habitual living frees our minds to work on more important things than which pants to wear or what breakfast to order.  As the philosopher Alfred North Whitehead argued, “Civilization advances by extending the number of operations which we can perform without thinking about them.”

Originally posted on September 23, 2014.

 

In the September Observer (from the Association for Psychological Science), Nathan explains why “Brain Size Matters.”  He summarizes, and suggests how to teach, Robin Dunbar’s conclusion that “Our brain size evolved to accommodate social groups that contain roughly 150 people.”

In the same issue, David’s essay on “Inspiring Interest in Interests” recaps research on the stability and motivational power of career-related interests, and offers students links to inventories that can assess their own interests and well-matched vocations.

September_2014_OBS_Cover_Image_400px_Cropped.jpg

David Myers

Do Look-Alikes Act Alike?

Posted by David Myers Expert Jul 19, 2016

Originally posted on September 30, 2014.

 

Behavior geneticists have gifted us with two stunning findings—discoveries that overturned what I used to believe about the environment’s power to shape personality.  One, dramatically illustrated by the studies of identical twins separated near birth, is the heritability of personality and intelligence.  The other, dramatically illustrated by the dissimilar personalities and talents of adopted children raised in the same home and neighborhood, is the modest influence of “shared environment.”

I know, I know . . . studies of impoverishment during the preschool years, of epigenetic constraints on genetic expression, and of family influences on attitudes, values, and beliefs, remind us that genetic dispositions are always expressed in particular environments.  Nature and nurture interact.

And might identical twins have similar personalities not just because of their shared genes, but also their environments responding to their similar looks?  If only there were people who similarly look alike but don’t share the same genes.

Happily there are unrelated look-alikes—nontwin “doppelgängers” identified by Montreal photographer François Brunelle (do visit some examples here).  California State University, Fullerton, twin researcher Nancy Segal seized this opportunity to give personality and self-esteem inventories to these human look-alikes.

Unlike identical twins, the look-alikes did not have notably similar traits and self-esteem (see here). And in a new follow-up study with Jamie Graham and Ulrich Ettinger (here), she replicates that finding and also reports that the look-alikes (unlike biological twin look-alikes) did not develop special bonds after meeting their doppelgänger. 

The take-home message:  Genes matter more than looks. As the evolutionary psychologists remind us, kinship biology matters.

Originally posted on October 7, 2014.

 

The October APS Observer is out with an essay by Nathan, “Once a Psychopath, Always a Psychopath?” on people who “commit horrific crimes, experience little guilt or remorse, and then commit similar crimes again.” What is their potential for change, and how can we teach students about them?

In the same issue, I offer “The Story of My Life and Yours: Stability and Change.” It’s a celebration of what I regard as one of the great studies in the history of psychological science...Ian Deary and colleagues’ discovery of the intelligence scores of virtually all Scottish 11-year-olds in 1932, and then their retesting of samples of that population up to age 90.  The bottom line:  our lives are defined by a remarkable stability that feeds our identity, and also by a potential for change that enables us to grow and to hope for a brighter future.

Observer+October.JPG.png

Originally posted on October 14, 2014.

 

What would you consider psychology’s ten most provocative and controversial studies?  Christian Jarrett, a great communicator of psychological science via the British Psychological Society’s free Research Digest, offers his top ten list here.  A quick recap:

1. The Stanford Prison Experiment (aka the Stanford Prison Simulation)

2. The Milgram "Shock Experiments"

3. The "Elderly-related Words Provoke Slow Walking" Experiment (and other social priming research)

4. The Conditioning of Little Albert

5.  Loftus' "Lost in The Mall" Study

6. The Daryl Bem Pre-cognition Study

7.  The Voodoo Correlations in Social Neuroscience study

8. The Kirsch Anti-Depressant Placebo Effect Study

9. Judith Rich Harris and the "Nurture Assumption"

10. Libet's Challenge to Free Will

This is, methinks, a great list.  All ten have captured my attention and reporting (although I would reframe #5 to indicate Beth Loftus’s larger body of research on false memories and the misinformation effect).  Are there other studies that would make your top ten list?

In the cover story of the October APS Observer, Carol Tavris reflects on “Teaching Contentious Classics,” which include the Milgram experiments, and also Sherif’s Robbers Cave experiment and Harlow’s baby monkey experiments, the latter of which surely also merits inclusion on any list of psychology’s most controversial studies.

Originally posted on October 21, 2014.

 

Seth Stephens-Davidowitz uses aggregate data from Google to see if parents’ hopes for their children are gender-neutral. He reports that, actually, many parents seem eager to have smart sons and slender, beautiful daughters. You can see this for yourself (heads-up to teachers:  a cool in-class demonstration here). Google (with quote marks) and note the number of results:

  • “Is my daughter gifted”
  • “Is my son gifted”
  • “Is my son overweight”
  • “Is my daughter overweight”

As an example, here’s another pair I just tried (the OR commands a Boolean search of either version):

1413905104156.png

Originally posted on October 28, 2014.

 

With nearly 5000 misery-laden deaths and no end in sight, Ebola is, especially for Liberia and Sierra Leone, a West African health crisis.  It may not yet rival the last decade’s half million annual child deaths attributable to rotavirus—“Where is the news about these half-million kids dying?” Bill Gates has asked.  But West Africans are understandably fearful.

And North Americans, too . . . though perhaps disproportionately fearful?

Thanks to our tendency to fear what’s readily available in memory, which may be a low-probability risk hyped by news images, we often fear the wrong things.  As Nathan DeWall and I explain in the upcoming Psychology, 11th Edition, mile for mile we are 170 times safer on a commercial flight than in a car.  Yet we visualize air disasters and fear flying. We see mental snapshots of abducted and brutalized children and hesitate to let our sons and daughters walk to school. We replay Jaws with ourselves as victims and swim anxiously.  Ergo, thanks to such readily available images, we fear extremely rare events.

As of this writing, no one has contracted Ebola in the U.S. and died.  Meanwhile, 24,000 Americans die each year from an influenza virus, and some 30,000 suffer suicidal, homicidal, and accidental firearm deaths.  Yet which affliction are many Americans fearing most?  Thanks to media reports of the awful suffering of Ebola victims, and our own “availability heuristic,” you know the answer.

As David Brooks has noted, hundreds of Mississippi parents pulled their children from school because its principal had visited Zambia, a southern African country untouched by Ebola.  An Ohio school district closed two schools because an employee apparently flew on a plane (not the same flight) in which an Ebola-infected health care worker had travelled.  Responding to public fears of this terrible disease, politicians have proposed travel bans from affected African countries, which experts suggest actually might hinder aid and spread the disease.

Déjà vu. We fear the wrong things. More precisely, our fears—of air crashes versus car accidents, of shark attacks versus drowning, of Ebola versus seasonal influenza—are not proportional to the risks.

Time for your fall flu shot?

Originally posted on November 11, 2014.

 

A  recent Beijing visit left me marveling at students’ academic enthusiasm.  In explaining Asian students’ outperformance of North American students, researchers have documented cultural differences in conscientiousness. Asian students spend more time in school and much more time studying (and see here for one recent study of the academic diligence of Asian-Americans).

The Beijing experience gave me several glimpses of this culture difference in achievement drive and eagerness to learn.  For example, as I dined more than a half hour before speaking at the Peking University psychology department, word came that 160 students were already present.  After my talk in the overfilled auditorium (below), student hands across the room were raised, with some waving hands or standing up, pleading to be able to ask their questions.  And this was a Friday evening.

1415205918609.jpeg

Later that weekend, I met with teachers of AP psychology, whose students at select Beijing high schools pay to take AP courses in hopes of demonstrating their capacity to do college-level work in English, and thus to gain admission to universities outside China.  Several of the teachers were Americans, one of whom chuckled when explaining that, unlike in the USA, she sought to demotivate her overly motivated students, encouraging them to lighten up and enjoy life.

The plural of these anecdotes of culture difference is not data. (My China sample was biased—high achieving students who had gained admission to the most elite schools.) But the experiences, which replicated what I experienced in a 2008 visit to Beijing, were memorable.

Originally posted on November 25, 2014.

 

The November APS Observer is out with an essay by Nathan, “Why Self-Control and Grit Matter—and Why It Pays to Know the Difference.” It describes Angela Duckworth’s and James Gross’s research on laser-focused achievement drive (grit) and on self-control over distracting temptations . . . and how to bring these concepts into the classroom.

In the same issue, I reflect on “The Psychology of Extremism.” I describe the social psychological roots of extreme animosities and terrorist acts, including a description of Michael Hogg’s work on how people’s uncertainties about their world and their place in it can feed a strong (even extreme) group identity.

Originally posted on December 2, 2014.

 

As I explain in a recent APS Observer essay (here), my short list of psychology’s greatest research programs includes the 250+ scientific publications that have followed Scottish lives from childhood to later life.  The studies began with all Scottish 11-year-olds taking intelligence tests in 1932 and in 1947 (the results of which Ian Deary and his team discovered many years later).  After meeting Deary at an Edinburgh conference in 2006 and hearing him describe his tracking these lives through time, I have followed his team’s reports of their cognitive and physical well-being with great fascination.

Last April, some 400 alums of the testing—now 93 or 78 years old (including those shown with Deary below)—gathered at the Church of Scotland’s Assembly Hall in Edinburgh, where Deary regaled them with the fruits of their participation. One of his conclusions, as reported by the October 31st Science, is that “participants’ scores at age 11 can predict about 50% of the variance in their IQs at age 77.”

I invited Professor Deary to contribute some PowerPoint slides of his studies for use by teachers of psychology.  He generously agreed, and they may be found here.

1416413696067.png

Photo courtesy of Ian Deary.

Originally posted on December 9, 2014.

 

Economic inequality is a fact of life.  Moreover, most folks presume some inequality is inescapable and even desirable, assuming that achievement deserves financial reward and that the possibility of making more money motivates effort.

But how much inequality is good?  Psychologists have found that places with great inequality tend to be less happy places, and that when inequality grows so does perceived unfairness, which helps offset the psychological benefits of increased affluence.  When others around us have much more than we do, feelings of “relative deprivation” may abound. And as Kate Pickett and Richard Wilkinson document, countries with greater inequality also experience greater health and social problems, and higher rates of mental illness.

So, how great is today’s economic inequality? Researchers Michael Norton and Dan Ariely invited 5,522 Americans to estimate the percent of wealth possessed by the richest 20 percent in their country. The average person’s guess—58 percent—“dramatically underestimated” the actual wealth inequality. (The wealthiest 20 percent possessed 84 percent of the wealth.)

And how much inequality would be ideal?  The average American favored the richest 20 percent taking home between 30 and 40 percent of the income—and, in their survey, the Republican versus Democrat difference was surprisingly modest.

Now, working with Sorapop Kiatpongsan in Bangkok, Norton offers new data from 55,238 people in 40 countries, which again shows that people vastly underestimate inequality, and that people’s ideal pay gaps between big company CEOs and unskilled workers is much smaller than actually exists.  In the U.S., for example, the actual pay ratio of S&P 500 CEOs to their unskilled workers (354:1) far exceeds the estimated ratio (30:1) and the ideal ratio (7:1).

Their bottom line:  “People all over the world and from all walks of life would prefer smaller pay gaps between the rich and poor.”

Originally posted on December 16, 2014.

 

The December APS Observer is out with an essay by Nathan on “The Neural Greenhouse:  Teaching Students How to Grow Neurons and Keep Them Alive.” Our brains are like greenhouses, he notes, with new neurons sprouting daily, “while others wither and die.” To take this neuroscience into the classroom, he offers three activities.

In the same issue, I say, “Let’s Hear a Good Word for Self-Esteem.” Mindful of recent research on the perils of excessive self-regard—of illusory optimism, self-serving bias, and the like—I offer a quick synopsis of work on the benefits of a sense of one’s self-worth. I also offer Google ngram figures showing sharply increased occurrences of “self-esteem” in printed English over the last century, and of decreasing occurrences of “self-control.”

Originally posted on December 23, 2014.

 

The Centers for Disease Control and Prevention, drawing from its own continuing household interviews, offers new data on who in the U.S. is most likely to suffer depression, and how often.  

Some noteworthy findings:

  • Overall rate of depression: Some 3 percent of people age 12 and over were experiencing “severe depressive symptoms.” More people—7.6 percent—were experiencing “moderate or severe” symptoms, with people age 40 to 59 at greatest risk. Many more—78 percent—“had no depressive symptoms.”
  • Gender and depression. Women experience nearly double (1.7 times) men’s rate of depression.
  • Poverty and depression. People living below the poverty line are 2½ times more likely to be experiencing depression. (Does poverty increase depression? Does depression increase poverty? Or—mindful of both the stress of poverty and the CDC-documented impact of depression on work and home life—is it both?)
  • Depression and treatment.  Only 35 percent of people with severe symptoms reported contact with a mental health professional in the prior year.

1418680197221-1.png

 

Originally posted on January 6, 2015.

 

University of Warwick economist Andrew Oswald—someone who creatively bridges economics and psychological science, as in his studies of money and happiness—offers some fascinating recent findings on his website:

  • A huge UK social experiment “offered incentives to disadvantaged people to remain and advance in work and to become self-sufficient.” Five years later the experimental group indeed had higher earnings than the control group, but lower levels of well-being (less happiness and more worries). Ouch.
  • Income (up to a point) correlates with happiness. But is that because richer people tend to be happier, or happier people tend to be richer? By following adolescent and young adult lives through time, with controls for other factors, the data reveal that happiness does influence future income.
  • After winning a lottery, do people’s political attitudes shift right?  Indeed yes.  “Our findings are consistent with the view that voting is driven partly by human self-interest.  Money apparently makes people more right-wing.”

This finding syncs with earlier findings that inequalities breed their own justification.  Upper-class people are more likely than those in poverty to see people’s fortunes as earned, thanks to their skill and effort—and not as the result of having connections, money, and good luck. 

Such findings also fit U.S. political surveys showing that high-income individuals are more likely to vote Republican...despite—here’s a curious wrinkle to ponder—high-income states being less likely to vote Republican.  We might call this “the wealth and politics paradox”—poor states and rich individuals vote conservative. Care to speculate about why?

David Myers

Why Do We Care Who Wins?

Posted by David Myers Expert Jul 19, 2016

Originally posted on January 13, 2015.

 

Last night’s national championship college football game, today’s New York Times article on America’s greatest small college rivalry (involving my own Hope College), and the upcoming Super Bowl all bring an interesting psychological question to mind:  Why do we care who wins? What psychological dynamics energize rabid fans?

In a 2008 Los Angeles Times essay I offered answers to my own questions, which first crossed my mind just before tipoff at that rivalry game described in today’s Times. The pertinent dynamics include the evolutionary psychology of groups, ingroup bias, social identity, group polarization, and the unifying power of a shared threat.

In a 2014 Politico essay I extended these principles in reflections on political and religious animosities between groups that, to outsiders, seem pretty similar (think Sunni and Shia, or Northern Ireland’s Catholic and Protestant).  The same social dynamics that fuel fun sports rivalries can, writ large, produce deep-rooted hostilities and social violence.

Originally posted on January 20, 2015.

 

In the January Observer (here), Nathan digests—and suggests how to teach—David Creswell and Emily Lindsay’s explanations of how mindfulness improves health.  Mindfulness serves to recruit brain regions important for stress control and it inhibits the sympathetic-adrenal-medullary (SAM) and hypothalamic-pituitary-adrenal (HPA) axes from going into overdrive.

David (here) notes that marriage predicts happiness.  Does it also predict physical health?  A massive meta-analysis by Theodore Robles and his colleagues indicates that, in Robles’ words, the marriage-health relationship “is similar to that of associations between health behaviors (diet, physical activity) and health outcomes.” But why?  Does marriage influence health or are healthy people more likely to marry?  Longitudinal studies suggest that marriage influences future health—for reasons that Robles explains and that class discussion might identify.

1421166627365.png

Originally posted on February 4, 2015.

 

Friday my focus was hearing research and care—at the National Institute on Deafness and Other Communication Disorders, where I sit on the Advisory Council (assessing federal support for hearing research and hearing health).  Days later, I was cheering on my ill-fated hometown Seattle Seahawks in the Super Bowl.

Alas, there is some dissonance between those two worlds, especially for fans of the team that prides itself on having the loudest outdoor sports stadium, thanks to its “12th Man” crowd noise—which has hit a record 137.6 decibels . . . much louder than a jackhammer, notes hearing blogger Katherine Bouton.

With three hours of game sound rising near that intensity, many fans surely experience temporary tinnitus—ringing in the ears—afterwards...which is nature’s way of warning us that we have been baaad to our ears.  Hair cells have been likened to carpet fibers. Leave furniture on them for a long time and they may never rebound. A rule of thumb: if we cannot talk over a prolonged noise, it is potentially harmful.

With repeated exposure to toxic sound, people are at increased risk for cochlear hair cell damage and hearing loss, and for constant tinnitus and hyperacusis (extreme sensitivity to loud noise).


Men are especially vulnerable to hearing loss, perhaps partly due to greater noise exposure from power tools, loud music, gunfire, and sporting events (some researchers have implicated noise in men’s greater hearing loss).  But some men know the risks, as 2010 Super Bowl-winning quarterback Drew Brees illustrated when lifting his ear-muffed son Baylen during the post-game celebration.

For more on sports and noise, visit here.