
One of the perennial challenges in teaching Intro Psych is helping students understand that knowing that two variables are, say, positively correlated does not tell us anything about what causes that relationship. Discussion of this study would work during your coverage of correlations or during your coverage of circadian rhythms as an opportunity to revisit correlations.

 

The 6.5-year study of 433,268 British adults (Knutson & von Schantz, 2018) provides us with an illustrative example. The researchers asked each person, “Do you consider yourself to be definitely a morning person [27%], more a morning than evening person [35%], more an evening than morning person [28%], definitely an evening person [9%]?” They found that “[i]ncreased eveningness, particularly definite evening type, was associated with increased prevalence of a wide variety of diseases or disorders, including diabetes, psychological, neurological, respiratory and gastrointestinal/abdominal disorders.”

If your Intro Psych students are like most people, they want to jump to the conclusion that being a night owl will cause a number of health problems that will eventually lead to an early death.

 

The lead author of this study, Kristen Knutson, thinks the root of the problem is really that of a mismatch. Many of our societies are geared toward the morning chronotype. If you have an evening chronotype but are trying to work, say, 9am to 5pm, you’re fighting against your own internal clock (Khan, 2018).  

 

Ask students to work in pairs or small groups to generate a list of factors that are associated with good health. Perhaps their list includes things like exercising, getting good sleep, eating well, and developing and maintaining strong social ties. Now ask students to consider why these things may be more difficult for night owls than morning larks. For example, how many recreational sports teams compete at 11pm? How many spin or yoga classes are offered at midnight? How easy is it to find a restaurant with healthy fare at 10:30pm? How hard is it to get enough sleep when you don’t get sleepy until 2am but have to be up at 7am to be at work by 9am? Maintaining social ties is difficult for night owls who want to hang out at 10pm when most of their friends are headed to bed (Clark, 2019).

 

The correlation between chronotype and health may be due to this whole host of third factors. What if night owls were allowed to be night owls? Would a night owl yoga instructor, for example, teaching at 11pm have night owl students? Would night owls get more sleep if they worked, say, 2pm to 10pm?

 

If time allows, ask volunteers to share their chronotypes and how they’ve adjusted their schedules to fit that chronotype. Or, if their schedules don’t fit, to share why that’s a struggle.

 

In closing this activity, reiterate that knowing that there is a relationship between two variables tells us nothing about why they are related.

 

 

References

 

Clark, B. (2019, May 23). No, night owls aren’t doomed to die early. New York Times. Retrieved from https://www.nytimes.com/2019/05/23/smarter-living/no-night-owls-arent-doomed-to-die-early.html

 

Khan, A. (2018, April 11). Bad news for night owls. Their risk of early death is 10% higher than for early risers, study finds. Los Angeles Times. Retrieved from https://www.latimes.com/science/sciencenow/la-sci-sn-night-owl-death-20180412-story.html

 

Knutson, K. L., & von Schantz, M. (2018). Associations between chronotype, morbidity and mortality in the UK Biobank cohort. Chronobiology International, 35(8), 1045–1053. https://doi.org/10.1080/07420528.2018.1454458

 

 

 

The Joy of Being Wrong

Posted by David Myers, May 28, 2019

What virtue is more needed in today’s contentious and polarized world than humility? We need deep-rooted convictions to fuel our passions, but also humility to restrain bull-headed fanaticism.

 

Along with curiosity and skepticism, humility forms the foundation of all science. Humility enables critical thinking, which holds one’s untested beliefs tentatively while assessing others’ ideas with a skeptical but open mind. To accept everything is to be gullible; to deny everything is to be a cynic.

 

In religion and literature, hubris (pride) is first and foundational among the seven deadly sins. When rooted in theism—the assumption that “There is a God, but it’s not me”—humility reminds us of our surest conviction: Some of our beliefs err. We are finite and fallible. We have dignity but not deity. So there’s no great threat when one of our beliefs is overturned or refined—it’s to be expected.  In this spirit, we can, as St. Paul advised, “test everything, hold fast to what is good.”

 

Humility also underlies healthy human relations. In one of his eighteenth-century Sermons, Samuel Johnson recognized the corrosive perils of pride and narcissism: “He that overvalues himself will undervalue others, and he that undervalues others will oppress them.” Even Dale Carnegie, the positive thinking apostle, foresaw the danger: “Each nation feels superior to other nations. That breeds patriotism—and wars.”

 

Unlike pride and narcissism, humility contributes to human flourishing. It opens us to others. Show social psychologists a situation where humility abounds—with accurate self-awareness, modest self-presentation, and a focus on others—and they will show you civil discourse, happy marriages, effective leadership, and mental health. And that is the gist of this new 3.5-minute animated Freethink video, “The Joy of Being Wrong.”

 

Note: The video was supported by the Templeton Foundation (which I serve as a trustee) as an expression of its founder’s science-friendly motto: “How little we know, how eager to learn.” The Foundation is also supporting a University of Connecticut initiative on “Humility and Conviction in Public Life,” including blog essays, a monthly newsletter, podcast interviews, and videos of forums and lectures.

 

 

(For David Myers’ other essays on psychological science and everyday life, visit TalkPsych.com.)

As of this writing, Jeopardy! champion James Holzhauer has won 24 games for a total of $1,867,142.00. A lot of people are wondering how he has done it.

 

The next time you cover memory in Intro, ask your students to work in pairs or small groups to use what they learned in the memory chapter to guess at some strategies Holzhauer has used to remember such a vast amount of information and to be able to recall it. Did your students come up with self-testing? Spaced practice? Elaboration? Interleaving? Dual coding? Ask your students how they would use these techniques to prepare for their own Jeopardy! run.

 

The Philadelphia Inquirer wanted to know how James Holzhauer has done it, so they asked Penn psychology professor Michael J. Kahana and Utah ed psych professor Michael Gardner—and James Holzhauer himself (Avril, 2019).

 

First, Holzhauer has learned a lot of content in different contexts. Whatever knowledge he acquired through general reading or through school he learned again through his reading of children’s books—books that can cover a lot of ground in an easy-to-digest way. If Holzhauer learned about Napoleon in high school, for example, and then learned about Napoleon in a children’s book, he has that many more retrieval cues for what he knows about Napoleon—information learned at different times and in different places. And, of course, since the task in playing Jeopardy! is retrieving information, it’s important to practice that retrieval through self-testing. Holzhauer confirms that he has indeed self-tested.

 

Holzhauer uses priming to his advantage, in both expected and unexpected ways. If the Final Jeopardy category is finite, he mentally flips through possible answers. As an example, he cites the category “European Capitals.” By thinking about Budapest, Rome, Warsaw, Paris, Oslo, and Brussels, he is firing up the neurons in his “European Capitals” neural network, making it easier to retrieve the correct answer when the clue finally comes. Outside of Final Jeopardy, Holzhauer says, the game moves too fast for this. When contestants choose questions from the same category one right after another, though, everyone gets to stay within the same neural network. That’s not Holzhauer’s strategy. He jumps from category to category, selecting all of the $500 answers first—to amass a good chunk of money that he can wager on a Daily Double. He doesn’t have time to think through a list of possible answers, but since he knows which category he is going to select before he voices it, his brain does have a few seconds’ lead time over his competitors in accessing the right neural network. Interestingly, younger people can retrieve content from a new category more quickly than those of us who are older, so jumping from category to category likely gives him an advantage over older contestants.

 

Then there is buzzer strategy. Is it better to wait until you know you know the answer and then buzz in? Or is it better to buzz in and hope your brain can find the answer in those few seconds? Holzhauer uses the latter strategy: buzz in first and hope your working memory can quickly retrieve the answer from long-term memory.

 

End this discussion by informing your students that if any of them go on to win on Jeopardy!, your cut is 10%.

 

Reference

 

Avril, T. (2019, May 16). Can ‘Jeopardy!’ whiz James Holzhauer be beaten? The science of memory and recall, explained. Philadelphia Inquirer. Retrieved from https://www.philly.com/science/jeopardy-champ-james-holzhauer-speed-psychology-20190516.html

One of the many things I love about teaching psychology is that I can learn something new about the field—about our humanness—just about anywhere. I am currently reading Skeleton Keys by Brian Switek (2019), a science writer and bone geek. Exploring the origins of our bones, this book is a fascinating history. Any history that starts a few hundred million years ago—as this one does—reminds me how improbable our existence is. It is improbable that mammals exist, that primates in particular exist, that humans exist, and, lastly, that I, specifically, exist. With an incomprehensible timeline that is measured in millions of years, I can’t help but think—in the greater scheme of things—how small I am. While that millions-of-years perspective didn’t stop me from being irritated with some of my fellow drivers on my morning commute, I did think about that dinosaur who one day felt irritated with their fellow dinosaurs when traveling to wherever dinosaurs traveled. You have my empathy, dinosaur.

 

In a brilliant example of burying the lede, I’m actually writing about where the three little bones in the middle ear come from, as I just learned from Skeleton Keys. Stick with me.

 

Protomammals—a group of animals who were precursors to mammals—had jaws composed of a number of bones. Visit the Wikipedia page for Dimetrodon, a protomammal that lived almost 300 million years ago. On that Wikipedia page, scroll down to the drawings of the skull. In the lateral view, notice the quadrate bone at the back of the upper jaw and the articular bone at the back of the lower jaw. Over time—and by “time” I mean millions of years—those bones shrank in the creatures that followed Dimetrodon, but did not disappear. The quadrate evolved into the incus (anvil), and the articular evolved into the malleus (hammer). The stapes (stirrup) had a different origin, but same idea. It was a small bone on top of the hyoid bone in the neck of protomammals (Maier & Ruf, 2016).

 

Press your fingers into the skin right in front of your ear. Open and close your jaw. This is where your upper and lower jaws meet. Those tiny bones of the middle ear are right behind that joint.

 

References

 

Maier, W., & Ruf, I. (2016). Evolution of the mammalian middle ear: A historical review. Journal of Anatomy, 228(2), 270–283. https://doi.org/10.1111/joa.12379

 

Switek, B. (2019). Skeleton Keys. New York: Riverhead Books.

“Self-consciousness [exists] in contrast with

an ‘other,’ a something which is not the self.”

—C. S. Lewis, The Problem of Pain, 1940

 

We are, always and everywhere, self-conscious of how we differ. Search your memory for a social situation in which you were the only person of your gender, sexual orientation, ethnicity, or body type. Perhaps you were the only woman in a group of men, or the only straight person at an LGBTQ gathering.

 

Recalling that situation . . .

  • Were you self-conscious about your identity?
  • How did others respond to you?
  • How did your perceptions of their responses affect your behavior?

 

Differences determine our “spontaneous self-concepts.” If you recalled being very aware of your differences, you are not alone. As social psychologist William McGuire long ago noted, we are conscious of ourselves “insofar as, and in the ways that” we differ. When he and his co-workers invited children to “tell us about yourself,” they mostly mentioned their distinctive attributes. Redheads volunteered their hair color, foreign-born their birthplace, minority children their ethnicity. Spontaneous self-concepts often adapt to a changing group. A Black woman among White women will think of herself as Black, McGuire observed. When moving to a group of Black men, she will become more conscious of being a woman.

 

This identity-shaping phenomenon affects us all. When serving on an American Psychological Association professional task force with 10 others—all women—I immediately was aware of my gender. But it was only on the second day, when I joked to the woman next to me that the bathroom break line would be short for me, that she noticed the group’s gender make-up. In my daily life, surrounded by mostly White colleagues and neighbors, I seldom am cognizant of my race—which becomes a prominent part of my identity when visiting my daughter in South Africa, where I become part of a 9 percent minority. In the U.S., by contrast, a new Pew survey finds that 74 percent of Blacks but only 15 percent of Whites see their race as “being extremely or very important to how they think of themselves.”

 

Our differences may influence how others respond to us. Researchers have also noted a related phenomenon: Our differences, though mostly salient to ourselves, may also affect how others treat us. Being the “different” or “solo” person—a Black person in an otherwise White group, a woman in a male group, or an adult in a group of children—can make a person more visible and seem more influential. Their good and bad qualities also tend to be more noticed (see here and here).

 

If we differ from others around us, it therefore makes adaptive sense for us to be a bit wary. It makes sense for a salient person—a minority race person, a gay person, or a corpulent person—to be alert and sensitive to how they are being treated by an interviewer, a police officer, or a neighbor. Although subsiding, explicit prejudices and implicit biases are real, and stereotypes of a difference can become a self-fulfilling prophecy.

 

Sometimes our perceived differences not only influence how others treat us, but also how we, in turn, respond to them. In one classic experiment, men students conversed by phone with women they mistakenly presumed (from having been shown a fake picture) were either unattractive or attractive. The presumed attractive women (unaware of the picture manipulation) spoke more warmly to the men than did the presumed unattractive women. The researchers’ conclusion: The men’s expectations had led them to act in a way that influenced the women to fulfill the belief that beautiful women are desirable. A stereotype of a difference can become a self-fulfilling prophecy.

 

Our acute self-consciousness of our differences can cause us to exaggerate or misinterpret others’ reactions. At times, it may even have funny consequences. Consider two of my favorite social psychology experiments demonstrating the influence of perceived differences. In the first, which showed the “spotlight effect,” Thomas Gilovich and Kenneth Savitsky asked university students to don a Barry Manilow T-shirt before entering a room with other students. Feeling self-conscious about their difference, those wearing the dorky T-shirt guessed that nearly half of their peers would notice the shirt. Actually, only 23 percent did. The lesson: Our differences—our bad hair day, our hearing loss, our dropping the cafeteria plate—often get noticed and remembered less than we imagine.

 

In another favorite experiment—one of social psychology’s most creative and poignant studies—Robert Kleck and Angelo Strenta used theatrical makeup to place an ear-to-mouth facial scar on college women—supposedly to see how others would react. After each woman checked the real-looking scar in a hand mirror, the experimenter applied “moisturizer” to “keep the makeup from cracking”—but which actually removed the scar.

 

So the scene was set: A woman, feeling terribly self-conscious about her supposedly disfigured face, talks with another woman who knows nothing of all this. Feeling acutely sensitive to how their conversational partner was looking at them, the “disfigured” women saw the partner as more tense, patronizing, and distant than did women in a control condition. Their acute self-consciousness about their presumed difference led them to misinterpret normal mannerisms and comments.

 

The bottom line: Differences define us. We are self-conscious of how we differ. To a lesser extent, others notice how we differ and categorize us according to their own beliefs, which may include stereotypes or unrealistic expectations. And sometimes, thanks to our acute sensitivity to how we differ, we overestimate others’ noticing and reacting. But we can reassure ourselves: if we’re having a bad hair day, others are unlikely to notice and even less likely to remember.

 

 


This one is a challenge. I’ve taught in community colleges for almost 30 years. For the first half of my career, a lot of my students were older than me, and they were pretty stressed about taking their first college class. I decided early on that I would encourage my students to call me by my first name. Sometime in the last 10 years I noticed that despite encouraging students to use my first name, many were simply calling me nothing. I chalk that up to—like my more-often-than-not aching lower back—aging. Most of my students are now younger than me. And my student population has shifted to include many students who were born into different cultures all over the world. What they think it means to be respectful to elders and people in authority differs from my views which, of course, are tied to my own cultural experiences.

 

I have since moved on to giving students a choice:

  • my first name—acknowledging that some students are not comfortable with that
  • Prof. Frantz—for everyone who prefers the formality
  • Prof. Sue—for those who want to split the difference, respect with a touch of informality

 

I am happy to use any of these. When I receive an email from a student, in my response, I use whichever form of address the student used. If the student doesn’t use a form of address, I sign with the most formal option: Prof. Frantz.

Through the Facebook group of the Society for the Teaching of Psychology, Dara Friedman-Wheeler posted a link to this wonderful decision tree designed to help students sort out what to call their professors. The infographic is tied to an account owned by “A Gálvez,” but that’s all the information I have on who created it. (If anyone knows who this is, please contact me.) At the very bottom of the infographic is a note that appears to be missing its final line. A quick Internet search turned up some other decision trees, but this one is both educational and respectful, with just a touch of snark.

 

As for using this infographic, I’m adding it to my “how do we contact our professor” page in my course management system. Even though I am clear about how I would like to be addressed, I don’t know that all of my colleagues are as explicit. This will help students avoid awkward interactions. Not all awkward interactions. Just the ones involving proper forms of address for their professors.

It’s a core lesson of introductory psychology: Intergroup contact reduces prejudice (especially friendly, equal-status contact). As hundreds of studies show, attitudes—of White folks toward Black folks, of straight folks toward gay folks, and of natives toward immigrants—are influenced not just by what we know but also by whom we know. Prejudice lessens when straight people have gay friends or family, and native-born citizens know immigrants.

 

As I write these words from the place of my childhood—Bainbridge Island, Washington—I am moved to offer a family example of the power of social contact. First, consider a large social experiment—the World War II internment and return of Japanese Americans from (a) California, and (b) Bainbridge, a Manhattan-sized island across Puget Sound from Seattle.

 

In minimal-contact California, Japanese Americans lived mostly in separate enclaves—meaning few Caucasians had friends of Japanese descent. When the California internment ensued, the Hearst newspapers, having long warned of “the yellow peril,” celebrated, and few bade the internees goodbye. On their return, resistance and “No Japs Here” signs greeted them. Minimal contact enabled maximal prejudice.

 

Bainbridge was a contrasting high-contact condition—and was also the place where (at its ferry dock on March 30, 1942) the internment began. As an island community, all islanders intermingled as school classmates. Their strawberry farms and stores were dispersed throughout the island. The local paper (whose owners later won awards for journalistic courage) editorialized against the internment and then published internee news from the camps for their friends back home. The internees’ fellow islanders watched over their property. And when more than half the internees returned after the war, they were greeted with food and assistance. A history of cooperative contact enabled minimal prejudice.

 

I can personalize this. One of those saying a tearful goodbye on the dock that 1942 day was my father, the insurance agent and friend of many of them. After maintaining their property insurance during the internment, and then writing “the first auto policy on a Japanese American after the war,” his support was remembered decades later—with a tribute at his death by the island’s Japanese American Community president (a former internee).

 

 

My father provides a case example of the contact effect. His support did not stem from his being socially progressive. (He was a conservative Republican businessperson who chaired the Washington State Nixon for President campaign.) His opposition to the internment of his fellow islanders was simply because he knew them. He therefore believed it was colossally unjust to deem them—his friends and neighbors—a threat. As he later wrote, “We became good friends … and it was heartbreaking for us when the war started and the Japanese people on Bainbridge Island were ordered into concentration camps.”

 

This great and sad experiment on the outcomes of racial separation versus integration is being replicated in our own time. People in states with the least contact with immigrants express the most hostility toward them. Meanwhile, those who know and benefit from immigrants—as co-workers, employees, businesspeople, health-care workers, and students—come to appreciate them.

 

It’s a lesson worth remembering: Cordial and cooperative contact advances acceptance.

 
