
At the Stanford Psych One Conference, Bridgette Hard (Stanford University) suggested clips from the British game show Golden Balls. Before covering the prisoner's dilemma, show students the first 2.5 minutes of this 4-minute video.

 

 

Stop the video at the 2:40 mark.

 

Walk students through the prisoner's dilemma, and then make sure students understand how this British game show presents contestants with a version of this dilemma. Note that in the original prisoner's dilemma, the “prisoners” can’t communicate with each other before making their decision. Allowing contestants to discuss adds a level of drama that makes for good TV, but it certainly changes the nature of the dilemma itself.
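If you want to make the payoff structure concrete for students, it fits in a few lines of code. Here is a minimal sketch, assuming a hypothetical jackpot amount (the function and the figure are ours for illustration, not from the show):

```python
# Golden Balls split/steal payoffs: a minimal sketch with a hypothetical jackpot.
def payoff(a, b, jackpot=100_000):
    """Return (contestant A's winnings, contestant B's winnings)."""
    if a == "split" and b == "split":
        return jackpot / 2, jackpot / 2  # both split: they share the jackpot
    if a == "steal" and b == "split":
        return jackpot, 0                # A steals from a splitter: A takes it all
    if a == "split" and b == "steal":
        return 0, jackpot                # B steals from a splitter: B takes it all
    return 0, 0                          # both steal: both walk away with nothing

for a in ("split", "steal"):
    for b in ("split", "steal"):
        print(f"A: {a:5}  B: {b:5}  ->  {payoff(a, b)}")
```

Note one wrinkle worth pointing out in class: unlike the classic prisoner's dilemma, a lone splitter here walks away with nothing, the same as when both players steal, so stealing only weakly dominates splitting.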

 

Before playing the rest of the clip, ask students to consider what they would do if they were Golden Balls contestant Steven. If you use a clicker system, ask students to click in with their vote for split or steal. Ask students to consider what they would do if they were contestant Sara. Again, ask students to click in with their vote.

 

Now play the rest of the clip. After the contestants give their soundbites at the end, give students a few minutes to discuss with each other their reactions to the contestants’ comments.

 

Next, show a contestant who chose a different strategy in this 4-minute clip. (This clip starts with the contestants’ discussion.)

 

At that same Stanford Psych One Conference, Garth Neufeld (Highline College, but soon to be at Cascadia Community College) reported that the good folks at Radiolab interviewed these contestants in a 20-minute episode. In the first few minutes the Radiolab hosts lay out the premise of the game, then segue into discussing the clip from the first episode above before launching into the clip from the second episode. At about the 11-minute mark, the Radiolab hosts get Nick, the contestant with the different strategy, on the phone. During that interview we learn that even though the edited version of Nick and Ibrahim’s discussion that eventually aired ran about 4 minutes, the actual, unedited – and apparently heated – discussion lasted 45 minutes. At the 18-minute mark of the Radiolab interview, we hear exactly how brilliant Nick’s strategy was.

 

 

If you’re feeling a little – let’s call it adventurous – you can now do a version of what Dylan Selterman (University of Maryland) does (read more here). Selterman gives this question on his final exam:

 

Here you have the opportunity to earn some extra credit on your final paper grade. Select whether you want 2 points or 6 points added onto your final paper grade. But there’s a small catch: if more than 10% of the class selects 6 points, then no one gets any points. Your responses will be anonymous to the rest of the class, only I will see the responses.

 

You can add such a question to an exam or deliver it as a separate question through your course management system. Or if you use some type of clicker system and you want students to discuss publicly, ask the question in class.
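If you deliver the question electronically, tallying the outcome takes only a few lines. Here is a minimal sketch (the class votes are hypothetical; the 10% threshold and point values come straight from Selterman's question above):

```python
# Tally Selterman's extra-credit question: if more than 10% of the class
# chooses 6 points, no one gets any points. The votes below are hypothetical.
def award_points(votes, threshold=0.10):
    share_choosing_six = votes.count(6) / len(votes)
    if share_choosing_six > threshold:
        return [0] * len(votes)  # too many 6s: everyone gets nothing
    return votes                 # otherwise, students get what they chose

print(award_points([2] * 27 + [6] * 3))  # exactly 10% chose 6: points stand
print(award_points([2] * 26 + [6] * 4))  # about 13% chose 6: all zeros
```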

 

How many times has Selterman given the extra credit between when he started offering it in 2008 and when he was interviewed about it in 2015? Once.

Originally posted on February 19, 2015.

 

Soapbox alert: An earlier post expressed one of my pet peeves: the favoritism shown today’s senior citizens over more economically challenged Millennials and their children. A half century ago, I noted, it made good sense to give often-impoverished seniors discounts at restaurants, at movie theaters, and on trains. Today, the percent in poverty has flipped—with under-35 adults now experiencing twice the poverty of their over-65 parents.

Since 1967, seniors’ poverty rate, thanks to economic growth, social security, and retirement programs, has dropped by two-thirds. (Social security payments have been inflation-adjusted, while minimum wage, dependent tax exemptions, and family assistance payments have not.)

So, should it surprise us that new data from an APA national stress survey reveal that “Parents and younger generations are less likely than Americans overall to report being financially secure”?

And should we, therefore, consider instead giving discounts to those who, on average, a) most need it (perhaps custodial parents), and b) are the most stressed by financial worries?

Originally posted on April 5, 2014.

 

Each year, the American government spends billions of dollars to help students who struggle to meet their potential. These students languish in traditional school programs. They struggle socially. And they ultimately have less impact on society than they might if they had received educational opportunities that maximized their abilities.

New research suggests that students who occupy this group are the ones we often worry about the least: super smart kids. The study followed several hundred students from age 13 to 38. At age 13, all of the students showed testing ability that placed them in the top 0.01 percent of students their age. Put another way, the study participants outscored students in the top 1 percent by a factor of ten. That’s pretty smart!

The super smart kids flourished. Their rate of earned doctorates dwarfed the average American rate: 44 percent compared to 2 percent. They held jobs that gave them influence over millions of dollars and, in some cases, millions of lives.

It’s easy to shut the book there. Super smart kids succeed. Big deal. But these super smart kids often experience challenges that also plague so-called “at-risk” students. They don’t have class material that pushes them intellectually. Before their first day of class, they know the course material. What happens next? What do they do for the next six to eight hours while their peers struggle to understand the material they’ve already mastered?

The super smart kids might also struggle to connect socially. If academics are such a breeze, might it be difficult to relate to your peers? Might you experience stress when your peers have to study at night while you look for other opportunities to pique your intellectual interests? Might you act less intelligent to fit in?

Originally posted on April 7, 2014.

 

How could a person resent making millions of dollars? Sam Polk suggests that some people develop wealth addiction. The more wealth they accumulate, the more money they need to achieve the same buzz. When they don’t get enough, they go into withdrawal and desire even greater wealth.

Signs of wealth addiction pop up often. Consider Dennis Kozlowski, the former CEO of Tyco International and recently released prison inmate. He’s the guy who bought a $6,000 shower curtain and a $15,000 umbrella stand. If there was ever a model of wealth addiction, it’s this guy.

Or is he? Society teaches us that money has power. My three-year-old nephew, Graham, has no money sense. He doesn’t care if I give him a $1, $5, or $20 bill. Regardless of the value, he’ll wad it up and throw it across the room. Eventually he’ll learn about money, how to use it, and what it can give him.

So, how do some people get so hooked on money? They may not be addicted to the money itself, but rather to the way money gains them entry into the broader social system. If my annual salary is $30,000, I feel accepted and included if my peers earn about the same. We can afford to eat at the same restaurants, pursue the same hobbies, and treat our romantic partners to similar gifts. But what if my annual salary stays the same and my peers begin to earn $300,000 annually? Now how can I relate to them?

While I live paycheck to paycheck, they take international vacations, develop fine culinary tastes, and enjoy hobbies that demand a hefty entrance fee. I feel left out and alienated. Who can afford to fly to Tanzania and hike to the top of Mt. Kilimanjaro? How many times can you use the word oaky to describe a wine’s taste? Who knew a triathlon bicycle could cost $13,500? If I want to join my high-earning friends, I need to earn more money.

Social acceptance is the most valuable asset a person can own. We evolved a need for close and lasting relationships. This need to belong informs many of our decisions, even if we don’t realize it. And for good reason. For our ancestors, social exclusion was a death sentence. Even today, psychologists argue that loneliness harms health as much as smoking and obesity.

How might money’s symbolic power influence how people approach their relationships? Consider the situation students faced in a group of clever studies conducted by Xinyue Zhou, Kathleen Vohs, and Roy Baumeister. Students believed they would complete a three-person, virtual ball-tossing game. Unbeknownst to the students, the virtual players’ behavior was preprogrammed to accept or reject them. The socially accepted students received the ball an equal number of times. The socially rejected students received a couple of tosses and never got the ball again. They watched as the two other players tossed the ball back and forth, back and forth, until the experimenter stopped the three-minute game.

Imagine what those three grueling minutes were like. You enter a study expecting to toss a virtual ball to a pair of strangers. Now you find yourself reliving a scene from a school dance. You watch the cool kids enjoy the fun while you wait for someone to notice you’re there.

Next, the students reported how much they desired money. Would social rejection, even by computer-animated strangers, influence how much people wanted money? It did. When students felt rejected, they wanted more money.

What would that money give them? Relief from the pain of rejection. Simply handling money, rather than regular paper, was enough to shield the students from heartbreak. They even developed a thicker skin, enabling them to withstand more physical pain. What might have happened if those same students surrounded themselves with reminders of money all day? Would their confidence have grown and insecurities have weakened?     

These findings paint a different portrait of so-called wealth addiction. Yes, some people develop an addiction to money. CEOs, real estate moguls, and their super-rich counterparts might shower themselves with yachts, private jets, and lavish estates. These money reminders might originate from an unquenchable thirst for money. When one yacht isn’t enough, buy a few more or build a bigger one.

But wealth addiction may represent the exception rather than the rule. Many wealthy people only buy what they need. Warren Buffett prefers french fries over foie gras. Carlos Slim lives in the same house he purchased 40 years ago. Ingvar Kamprad flies economy class and drives an old Volvo. 

What drives most people to become wealthy? People want the social acceptance they think wealth will give them. Greater wealth means access to more activities and relationship opportunities. What few people realize is that it’s often lonely at the top. Socially deprived people desire money to fill the void—and use reminders of money to stave off the pain of isolation. For the rest of us, it pays to surround ourselves with people who give our lives richness, complexity, and meaning.

Originally posted on April 9, 2014.

 

Most of us have dreamt of having a personal genie. We summon the genie, it grants our wishes, and our lives get better. But we forget that our genie is not bent on improving our lives. The same genie can make you a hero or a villain; grateful or green with envy; cooperative or antagonistic. It all depends on how you ask your question.

On the heels of research showing these positive and negative responses to the hormone oxytocin, we have a new genie in a bottle. Instead of rubbing a lamp to summon our genie, we sniff nasal spray. And with oxytocin nasal spray showing impressive benefits in offsetting deficits associated with certain mental conditions, it is time for researchers to get a grip on understanding when oxytocin will inspire us toward benevolence or malice.

Oxytocin motivates bonding. But personality traits and situations can bend oxytocin’s influence. For example, people use different strategies to maintain their relationships. Most people act nice, forgive, and adapt to their partner’s needs. Others dominate their relationship partners, pummeling them into submission. Oxytocin might affect these two groups of people differently. The nice guys and gals should continue their efforts to keep their relationship together by acting nice. The dominators, in contrast, might go on the offensive and try to dominate their partners.

To test this hypothesis, my colleagues and I randomly assigned college students to sniff either a placebo or oxytocin. The students waited patiently as the oxytocin took effect. While they waited, they completed some uncomfortable activities meant to provoke an aggressive response. They gave a stressful speech and also put an icy bandage on their foreheads. Next, participants reported their aggressive intentions toward a current or recent romantic partner. Some example items were “slap my partner” and “push or shove my partner.”

Could the love hormone lead to violence? It could. Oxytocin increased aggressive intentions, but the effect only occurred among those who were predisposed toward aggression. The implication is that aggressive people try to keep their romantic partners close by dominating them. When they get a boost of oxytocin, it triggers an aggressive response.

Oxytocin continues to inspire interest and confusion. We’re hard-wired to connect, and oxytocin can help make that happen. But this study shows that it isn’t enough to look at people’s oxytocin levels to know if they will act nice or aggressive. By understanding their personality traits, we can better predict whether the love hormone will promote benevolence or violence.

Originally posted on April 11, 2014.

 

While attending this year’s Society for Personality and Social Psychology annual meeting, I chaired a data blitz session. The session fell on February 14. Valentine’s Day. Hundreds of people attended, all eager to hear exciting talks that lasted no more than 5 minutes. All of the talks delivered on expectations. One of them caused all heads to perk up and pay attention.

The talk, given by Amy Moors of the University of Michigan (and co-authored by Terri Conley, Robin Edelstein, and William Chopik), dealt with consensual non-monogamy. This is a psychological term researchers use to describe people who engage in more than one romantic relationship simultaneously, and whose relationship partners know and approve. The talk had two main points.

  • Consensual non-monogamy is more common than you might think. Moors reported she and her colleagues consistently find that approximately 4-5% of people report being consensually non-monogamous. To put that in perspective, consider a university of 20,000 students. According to these estimates, roughly 800 to 1,000 of these students identify as consensually non-monogamous.
  • Who are these students? The authors argue that people who engage in consensual non-monogamy might not feel comfortable getting emotionally close to others and may instead prefer to keep their sense of autonomy. As a result, they might keep others at a distance. People with this relationship style have what is called avoidant attachment.

The more people identified as avoidantly attached, the more positively they evaluated consensually non-monogamous relationships. Avoidantly attached people were also more likely to report being in a consensually non-monogamous relationship.

When I spoke to others about the talk, they were most surprised about the higher-than-expected rates of consensual non-monogamy. This reaction raises the question of why people assume what they do about romantic relationships. Just as Tom Gilovich has shown many ways that people think they “know what isn't so,” what we think we know about relationships doesn't always match reality.

Originally posted on April 15, 2014.

 

For those of us living in the American Midwest, it’s been a historic winter. The phrase “polar vortex,” once the province of meteorologists, crept into daily conversation. Dozens of inches of snow, frozen pipes, and school cancellations can build stress, weariness, and even depression. To get rid of the blues, find the green space.

Green spaces refer to parks, forests, or other parcels of land meant to connect people to nature. Numerous studies have shown that green spaces relate to better mental health. But one recent study took things to an entirely new level. A group of University of Wisconsin researchers, led by Kirsten Beyer, surveyed a representative sample of Wisconsin residents for mental health issues. They also used satellite imagery to estimate the amount of local green space.

What did they find? The more green space people had close to them, the better their mental health. When people search for a new apartment, condo, or house, the only green they often consider is the money they need to spend. But these findings suggest that living near green space pays off in better mental health.

Originally posted on April 17, 2014.

 

Ask many people what their signature says about them and they’ll give you a pat answer: “My name.” Does your signature say more than that? A cottage industry exists in which “graphoanalysts” will tell us how our penmanship illustrates our ambitions, insecurities, and intuitive abilities. (See here for an example.) But if we don’t want to turn to a graphoanalyst, can psychological science offer a substitute?

It can—and the best place to start is how big you write your name. The bigger you write your name, the more likely you are to hold a powerful position. For example, tenured American professors have bigger signatures than their nontenured colleagues. Ask people to imagine being the U.S. president, rather than a lower-status person, and their signatures tend to grow. These effects aren’t unique to Americans. They have been replicated in Iranian samples, too.

A recent suite of studies caught my attention. They showed that subliminally linking positive words to a person’s identity increased the person’s signature size. In one study, Oxford University students viewed either positive words (happy, smart) or neutral words (bench, paper). To link the words to participants’ sense of identity, the researchers briefly presented the word ‘I’ before each word. Next, they had participants sign their names.

Imagine being part of the study. You sign your name at the beginning, complete a computer task, and then sign your name again. You don’t know it, but you may have had your self-esteem raised. And if you did, your signature size likely grew without you knowing it.

We sign our names often, which might help explain why we come to like the letters in our name more than other letters. The next time someone asks for your signature, take a good look at how much paper real estate it uses. It might say more about you than you think.  

Originally posted on April 21, 2014.

 

Aggressive urges crop up, even for the most saintly people. What helps keep our aggressive urges at bay? Self-control. We can override our aggressive urges and do something more constructive.

But what makes self-control possible? Most of us struggle with self-control failure when we’re hungry. We might get angrier than usual, a state popularly dubbed ‘hangry.’

In a recent study published in the Proceedings of the National Academy of Sciences, my colleagues and I argue that glucose helps people control their aggressive urges. Glucose fuels the brain, and it takes brainpower to quiet our anger. If people have less brain fuel, they should behave more aggressively.

To test the idea, we recruited married couples, asked them to prick their fingers every day to measure their blood sugar levels, and then gave them a chance to express their aggression. Each day, people could stab a voodoo doll that represented their spouse with between 0 and 51 pins. On the last day of the study, people also completed a competitive reaction-time game against their partner, in which they could blast their partner with intense and prolonged noise. (Don’t worry, the game was rigged so that people never actually blasted their spouses.)

Low blood glucose related to greater aggression. The less sugar floating in people’s blood, the more pins they stuck into the voodoo doll and the more they blasted their spouse.

To avoid what Popular Science cartoonist Maki Naro now calls the Hanger Games (click for a sweet cartoon summarizing this research), I have one suggestion: Don’t argue on an empty stomach.

Originally posted on April 23, 2014.

 

When we select a romantic partner, we want to know the good and the bad. Is he nice? Will she make me laugh? Getting to know the good isn’t too difficult. A first date is when the good is on full display. People primp, prepare questions in advance, and pay more for dinner than they otherwise would. But how can you know whether the person who’s getting ready to sweep you off your feet might later break your heart?

Conflict is a part of all relationships. We squabble, argue, and may even insult our loved ones. You have to do more than that to break someone’s heart. To accomplish that feat, something often happens that causes you to question treasured parts of your relationship. This brings us to a common cause of broken hearts worldwide: Infidelity.

What might boost the likelihood of infidelity? Gender is a key factor. Men, compared to women, are between two and four times more likely to report engaging in infidelity. Here are two things you probably didn’t know:

1.     Avoidant people—those who keep others at arm’s length, prefer to depend on themselves instead of others, and feel uncomfortable getting emotionally close to their partners—are more likely to engage in infidelity. In one set of studies, my colleagues and I recruited people in relationships, measured their level of avoidance, and showed:

  • Their eyes gravitate toward attractive alternatives to their romantic partner.
  • They report more positive attitudes toward infidelity.
  • They report more intentions to engage in infidelity.
  • They report engaging in infidelity more often than others.
  • These effects hold for both men and women.

2.     A lack of commitment explains why avoidant people engage in more infidelity. Avoidant people dislike getting close to others. Hence, they have a tough time feeling strong relationship commitment. Their lack of commitment might make avoidant people feel safe and secure. But it also weakens the commitment that often keeps urges to engage in infidelity at bay.

The irony is that avoidant people keep others at a distance to prevent social rejection. By having lower relationship commitment, they’re more likely to engage in infidelity—and cause their greatest fear of rejection to come true.

Originally posted on April 25, 2014.

 

We’ve all experienced it. You’re someplace where screaming isn’t tolerated, some kid starts wailing, and the parents rush to quiet them down. What happens next is the twist: “They’re the best behaved kids we know,” the parents say, as their child continues to bellow. We nod our heads, feign a smile, and go back to what we’re doing.

Before you pounce on me for being impatient and inexperienced, I’m here to share some good news. The more positively we view our close relationship partners, the stronger our relationships are. The best part is that the positivity doesn’t have to be accurate. If you ask many people, they’ll tell you their close friends are above average on nearly every positive trait. They’re funnier, smarter, and kinder than their peers. We might have positive illusions, but that doesn’t hurt anything.

Or does it? Let’s return to how we see our kids. Seeing them as above average might have certain benefits. It might boost your parenting commitment and satisfaction. Who wants to devote the time and energy it takes to parent if you see your kid as a dud?

A recent study suggests a potential drawback: many parents perceive their children as healthier than they actually are. The study, which drew on several investigations involving over 15,000 children, found that half of parents who have overweight or obese children rate their child as slimmer than their weight suggests.

Just as people villainize parents of screaming children, it’s easy to attack parents who don’t know their children are overweight or obese. But let’s show parents some empathy. Parenting is hard. I don’t have kids, but I can’t tell you how much respect I have for people who do. Parents might not want to hurt their children’s feelings by calling them overweight or obese. They also might not know what it means to be overweight or obese. Is it simply whether your son fits into his clothes? Whether your daughter comes home crying because a school bully called her fat?

But there’s a third possibility: when we love someone, we see them in the best possible light. Instead of seeing an obese child, we see our daughter who jumps down the stairs to welcome us home from work. We see our son who loves to get dirty in the mud.

When I read about the study, I tucked it away in my files. The next morning my wife and I took our two golden retrievers, Finnegan and Atticus, to the veterinarian. They’re both of our dogs, but Finnegan is mine and Atticus is my wife’s. They weighed Finnegan, who came in at a beefy 85 pounds. Then it was Atticus’s turn.

“He’s much skinnier than Finnegan,” my wife, Alice, said. “Just look at him.”

I looked and realized we weren’t seeing the same dog. “He looks the same to me. We feed him the same amount and give him the same amount of exercise.”

“Nah, I bet he’s 70 pounds,” she said.

They took Atticus away, weighed him, and returned with the results. He was exactly the same as Finnegan: 85 pounds.

So, this finding might apply to dog owners, too.

Originally posted on April 29, 2014.

 

Becoming a psychologist makes it hard to name drop. We rarely know celebrities. We read more than we schmooze. We seldom inform national or international politics. But we do drop the names of famous psychological studies. Few studies get more name-dropping than Walter Mischel’s delay of gratification studies, the so-called marshmallow studies.

Some think the marshmallow study recently took a slight beating. Much of the criticism has radiated from the findings of a cool new study. In the study, some children learned not to trust an adult experimenter, whereas others learned to trust the same adult experimenter. Next, they were brought through the classic delay of gratification study procedure. Kids were left alone in a room to stare at a treat with the promise of a larger reward if they resisted eating that treat. The result: Compared to those exposed to a non-trustworthy experimenter, children exposed to a trustworthy experimenter waited longer in order to receive a larger reward.

In a recent New York Times Op-Ed, Michael Bourne argues that these results question the depth of Mischel’s classic delay of gratification findings. Yes, the new findings identify a crucial factor – whether or not you can trust an adult experimenter – that can change a child’s delay of gratification. But these findings do little to negate the meaning of the original findings. If anything, they strengthen them.

Let’s start by focusing on the kids who learned to trust the adult experimenter. About 64 percent of them delayed gratification as long as possible. That’s quite a few, but far from 100 percent. Some kids gave up, some tried a little, and some were stalwarts. It’s normal to find variation in behavior. But you can relate variation in delay of gratification to other factors that also differ between children, such as their school performance, drug history, and brain functioning. These factors affected the kids in the new study much as they’d affected the kids in the original delay of gratification studies.

When kids learn not to trust an adult experimenter, they give up sooner. That finding, while interesting, says nothing about the importance of delay of gratification. It merely shows that kids are smart not to use their limited mental energy to delay gratification when they may not reap the rewards. Their behavior supports the saying, “Fool me once, shame on you. Fool me twice, shame on me.”

The marshmallow studies never sought to identify the Holy Grail of psychological strength. Delay of gratification is one of many factors that contribute to individual, relationship, and societal well-being. Rather than throwing the marshmallows out with the bath water, we should recognize that this new scientific finding has helped identify the nuances of delay of gratification and therefore may help us learn more about living happier, healthier, and more productive lives.

Originally posted on May 1, 2014.

 

Julian grew up an active child, but those days are long gone. Now he struggles to walk up the stairs. When he walks two city blocks, he takes a break, clutches a light post, and tries to catch his breath. As he looks at passersby, Julian knows what they’re thinking: Obese people are lazy.

But according to recent research, those onlookers have it backwards. In a UCLA study, researchers wanted to see which comes first: obesity or lack of motivation. They took 32 rats and separated them into two groups. The first group ate a diet designed to make them obese; it mimicked the standard Western diet. The second group ate food that maintained their weight. After six months, the rats completed a fun little motivation task.

What happened? A lack of motivation did not cause obesity. It wasn’t until the rats became overweight that their motivation started waning. These findings debunk the myth that people become obese because they lack discipline and motivation. Low energy and motivation follow, but do not cause, obesity.

Originally posted on May 5, 2014.

 

From an early age, I wanted to be an astronaut. I memorized Mercury astronaut missions. I dreamt of using a manned maneuvering unit to glide through space. I cried when the Challenger exploded. I still dream of going to space, but I know it’s a long shot. Still, space exploration captivates me.

What will be the biggest obstacle to a successful Mars mission? It won’t be inadequate fuel, faulty aerodynamics, or clunky helmets. Social isolation is the greatest barrier to interplanetary travel.

Don’t believe me? Think about the past 520 days of your life (about a year and five months). That’s how long it takes to travel to Mars and return. How many people did you see during that time? How many conversations did you have? Did you attend a sporting event? A play? A worship service? Maybe a loved one was born or passed away. Now wipe those experiences away. Instead, imagine that during this period of your life you lived in cramped quarters with only five other people, no fresh air, and no sunlight.

This is not a mere thought experiment. The experiment happened, with funds from the Russian Academy of Sciences. What happened? Quite a bit. In research recently reported in the Proceedings of the National Academy of Sciences, the six volunteer marsonauts completed lots of tasks to keep their minds fresh. They also slept like babies, freed from the rigmarole of work commutes, grocery shopping, and other daily drivel. Then the guys started sleeping like polar bears in hibernation. Then they started doing less, becoming even more sedentary amidst almost endless boredom. Space is only cool for so long.

The good news? They all made it. There weren’t any major scuffles, and the guys probably formed lifelong friendships. They even showed signs of cognitive improvement. But the marsonaut volunteers each handled the prolonged social isolation differently. One of them shifted to a 25-hour sleep-wake schedule, which meant that he was alone (awake or asleep) for 20% of the 520-day mock mission. As the researchers sift through their massive data set (to put it in perspective, they measured 4.396 million minutes of sleep!), I’m sure we’ll learn more about the psychological consequences of prolonged social isolation.

For now, we can still look into the night sky, find the Red Planet, and dream of people visiting sometime in our generation. We know they’ll sleep well—and a lot.

Originally posted on May 7, 2014.

 

The Iraq and Afghanistan wars introduced a new and troubling picture of the relationship between traumatic brain injury and mental health. Multiple deployments exposed soldiers to more frequent risks. New combat gear helped them survive blasts. Suicide, substance use, and strained relationships often followed.

But according to an Ontario study, we shouldn’t forget another vulnerable group: adolescents who have experienced at least one traumatic brain injury, defined as a head injury that caused either 5 minutes of unconsciousness or an overnight hospital stay. The soldiers’ injuries were probably more severe than those of the Toronto teens. Yet the two groups experienced similar consequences.

In a study of almost 5,000 Canadian students in Grades 7-12, those who experienced a traumatic brain injury, compared with those who didn’t, were nearly three times more likely to attempt suicide. The brain-injured adolescents were also more likely to engage in antisocial behavior and experience anxiety and depression.

Here is the most stunning statistic of all: roughly 20% of Ontario adolescents have a lifetime history of traumatic brain injury. Part of this makes sense. Think back to when you were a teenager. Perhaps you skateboarded, played soccer, hockey, football, or roughhoused with your siblings. Learning how to drive, you might have been injured in a car accident.

Our teenage years are often filled with risk because the teenage brain is hypersensitive to reward. (To watch some videos of a true genius on the topic of the teenage brain, click here). Yet the drive for reward can come at the greatest cost of all. By risking their bodies, adolescents risk their brains. And when that piece of equipment doesn’t run on all cylinders, life becomes more of a slog than a sweet dream.

The next time you think of brain injury, think of those who put themselves in harm’s way. For some of us, risk is part of our job. For others, it’s part of our development. For all of us, it’s time to reconsider who needs help.

Originally posted on May 9, 2014.

 

The next time you’re facing potential social rejection, what should you do? New evidence suggests a puff of pot reduces the pain of rejection.

But before we get to the pot smoking, how did we hatch this idea? Like most ideas, it was formed over an informal conversation. Previously, we had shown that the physical painkiller acetaminophen numbs people to the pain of rejection. Now we wanted to see whether another drug that works through similar brain receptors would also reduce the pain of rejection. It just happened that marijuana fit the bill.

In four studies, participants reported how often they smoked marijuana. Next, we measured their feelings of social exclusion or manipulated how socially excluded they felt. Finally, we measured participants’ emotional distress.

The four studies yielded a similar pattern: marijuana reduced the pain of rejection.

What is the takeaway message? Rejection hurts, and drugs that reduce physical pain also lessen the pain of rejection. Don’t smoke up to prove a point. Just know that rejection is serious. When you’re feeling lonely, reach out to friends before you reach for a roach clip.

Originally posted May 13, 2014.

 

We might not realize it, but our names hold the key to how much people trust us. Our actual name might not matter. It is how easily you can pronounce a person’s name that counts.

Imagine the following scenario. You walk into an auto repair shop to get your car fixed. A smiling man greets you at the door, shakes your hand, and says, “How are you?” You look at his uniform, which has his name printed in an oval patch. It says, “Yvgeny.” Having spent no time in Russia, you struggle to pronounce his name.

Would your botched pronunciation affect how much you trusted your would-be mechanic? It would. In a recent investigation, people with easily pronounceable names, versus hard-to-pronounce names, were rated as less dangerous, more positive, and more trustworthy.

The next time you struggle to pronounce someone’s name, give that person a break. We don’t pick the names our parents give us. Should we also give more credit to those who have triumphed despite having hard to pronounce names? Did Quvenzhané Wallis need to do more to earn her Oscar nomination than Naomi Watts did? We’ll never know.

Until we do, think of the people you choose to include in your life. We have limited energy and only include a few people in our social networks. Of the ones you chose, did their easy-to-pronounce names give them an edge?

Originally posted on May 15, 2014.

 

Each day we face stressors. Our pets get sick, lightning strikes the office, or we miss loved ones. New research identified another stressor lurking in our midst: men.

This research, which was reported by the New York Times, showed that rats and mice get stressed out when they’re around men. Even catching a whiff of a man’s t-shirt was enough to raise their stress hormones. The scent of a woman didn’t increase stress.

Most men don’t consider themselves stressors. When I walk into a classroom, I tell my students my mission:

  • Motivate
  • Encourage
  • Advocate
  • Learn

But these findings suggest that no matter what I do, my maleness may spike their stress levels. The good news is that our bodies adjust. The first day of class might be stressful, especially when you have a male instructor. Over time that stress should wash away.

How far can we extrapolate these findings? Although exposure to a single male increases stress that eventually drops off, what if people were constantly bombarded with men? In societies where men outnumber women, would stress levels peak? My next post will answer that question.

Originally posted on May 16, 2014.

 

Being around men increases stress. Do countries with more men than women have higher stress levels?

In my last post, I promised to answer this question. But it’s a harder question than it seems. How do you measure a country’s level of stress? Some organizations, such as Gallup, do an excellent job surveying people around the world. I don’t work at Gallup, nor do I have access to their data. So I had to do the best I could.

First, I gathered country gender composition data from our friends at the World Bank. I separated countries according to whether they had a majority of male or female citizens. The average was 50.77% women (standard deviation = 1.19; minimum = 48.19%, maximum = 54.30%). Of the 74 countries for which data were available, 19 were male-majority and 55 were female-majority.

Next, I searched for a good, comprehensive measure of country-level stress. Bloomberg made things easy. They computed a country’s stress score by combining seven factors:

  • Annual Homicide Rate per 100,000
  • Gross Domestic Product (GDP) per capita
  • Income inequality (Gini coefficient)
  • Corruption (as measured by Transparency International)
  • Unemployment rate
  • Urban air pollution (micrograms per cubic meter)
  • Life expectancy (years at birth)

Finally, I compared country-level stress between male-majority and female-majority countries. This would give me an initial answer to my question. What were the results?

Countries with more men than women, compared to their female-majority counterparts, had higher levels of stress.


Three factors drove the effect: corruption, pollution, and life expectancy. In each case, more men than women equaled a more corrupt, more polluted, and shorter-lived society. A close fourth, which wasn’t quite statistically significant (p = .063), was Gross Domestic Product per capita. If a country had a male majority (vs. female majority), GDP was lower.

These findings offer a novel extension to the finding that being around men, versus women, increases rodent stress. But unlike those careful laboratory experiments, people weren’t randomly assigned to live in male- or female-majority countries. We can’t infer cause and effect. All we can conclude is that when men are present, stress seems to rise instead of fall.
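For readers who want to try this kind of comparison themselves, here is a minimal sketch of the analysis described above. The file and column names are hypothetical placeholders, not the actual World Bank or Bloomberg data formats:

```python
# Compare stress scores between male-majority and female-majority countries.
# File and column names below are hypothetical placeholders.
import pandas as pd
from scipy import stats

gender = pd.read_csv("world_bank_gender.csv")  # columns: country, pct_female
stress = pd.read_csv("bloomberg_stress.csv")   # columns: country, stress_score

df = gender.merge(stress, on="country")
df["majority"] = ["female" if p > 50 else "male" for p in df["pct_female"]]

male = df.loc[df["majority"] == "male", "stress_score"]
female = df.loc[df["majority"] == "female", "stress_score"]

# Welch's t-test for a difference in mean stress between the two groups
t, p = stats.ttest_ind(male, female, equal_var=False)
print(df.groupby("majority")["stress_score"].mean())
print(f"t = {t:.2f}, p = {p:.3f}")
```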

Originally posted on May 20, 2014.

 

Three years ago, I gained a new appreciation of consciousness. My mom had an accident that caused her brain to bleed. It seemed to rip away her consciousness. As I slept next to her bed before she died, I wondered, “Is she conscious of what’s happening?”

New research suggests that the brain can give us a clue. Prior to this research, you had limited options for knowing whether someone was conscious. You could ask them. That’s easy. But what about those cases, like my mom’s, when a person hasn’t experienced brain death but is still unresponsive?

To find out, researchers scanned the brain of a woman who was in a vegetative state. They asked her to imagine playing tennis, along with several other activities. The results, published in Science, made a big splash. Although the woman couldn’t answer the questions, her brain did. When she imagined playing tennis, her brain reacted by increasing blood flow to the motor cortex. She was immobile, but her brain acted as if she were playing on Wimbledon’s Centre Court.

To watch a wonderful video of the researcher leading this effort, click here. It might change the way you think about consciousness. I know it helped me.

Originally posted on May 22, 2014.

 

Runners pop up everywhere. They run alone, in pairs, and in mass hordes through major cities. Running can improve health. But what about extreme running? Is 100 miles too far?

I get this question a lot. I run ultramarathons. An ultramarathon is any foot race longer than a standard 26.2-mile (42.2-kilometer) marathon. In the past year, I’ve run nine ultramarathons. This includes two 50-kilometer (31-mile) races, one 60-kilometer race (a little over 37 miles), a six-hour timed event where I officially ran 40 miles (I got lost and ran a couple “bonus” miles, too), three 50-mile races, and two 100-mile races.

When people learn about my running, they often ask two questions. The first is, “Why do you do that?” I like the challenge, the races make me feel good physically and emotionally, and I love the camaraderie. The second question is almost always, “Is that good for you?”

Many people think running isn’t good for you. “It’s bad for your knees,” many people tell me. When it comes to extreme running, the possible harm only increases, right? “Didn’t you hear about the ultrarunner, Caballo Blanco, from the book Born to Run? He died while running. That proves it.”

Before we rush to judgment, let’s look at the data. When we do that, we’ll find two things. The first is that we don’t really know much. There aren’t many people who run ultramarathons. The sport is growing, but we’re a niche group. In late April, I ran a 100-mile race that 185 people started. Compare that to the most selective American marathon, the Boston Marathon, which in 2014 had 35,671 starters!

The second thing we’ll learn is that ultrarunners have pretty good health. In one recent study, ultrarunners, compared to the general population, had better physical health, mental health, and missed fewer days due to illness. They do tend to have more allergies and asthma, which is probably due to wheezing from the gorgeously pollinated trails they traverse. Other research shows that ultrarunners have longer telomeres – DNA strands at the end of chromosomes that often shrink with age. With longer telomeres, ultrarunners may be less vulnerable to disease.

So, we now have a better perspective on whether ultrarunning is good for you. You don’t have to run 100 miles to reap the benefits of running. Simply take one more step today than you did yesterday.

Originally posted on May 27, 2014.

 

Psychology is rife with history. Unlike many sciences, psychology grips us because we are its main characters. People have more experience with quarrels than with quarks. But one of psychology’s most famous case studies continues to evolve. So, I ask, will the real Phineas Gage please step forward?

The name Phineas Gage might not perk up a person’s ears as easily as Freud, Bandura, or Skinner. But the story of Phineas Gage occupies precious real estate in most psychology textbooks. He showed the world that people can survive a major brain trauma. Yet understanding his post-accident life grows fuzzier over time.

In a recent article that foreshadows the upcoming book, “The Tale of the Dueling Neurosurgeons,” Sam Kean argues that what we think we know about the most famous name in neuroscience needs historical revision. Did Gage really become a psychopath? Why do people ignore Gage’s major life events, such as when he tried to make a new life in Chile?

We’ll never know the true Phineas Gage. The riddle will always be partly unsolved. That isn’t such a bad thing. Sifting through material will inspire new questions—with the hope that they will inform how we understand friends, loved ones, or the many others who have suffered traumatic brain injuries.

Originally posted on May 29, 2014.

 

How would you like to increase your brainpower? All you need is a 9-volt battery, some mad scientists, and a heaping portion of creativity.

So says a slew of recent studies using a noninvasive, neuroscientific technique called transcranial direct current stimulation (tDCS). Think of tDCS as the ultimate symphony conductor. It can pep up certain brain regions by exciting neuronal impulses. But it can also quiet a crowd of neurons by decreasing their firing rate. A 9-volt battery powers the electrodes that rest on people’s scalps, giving people a slight twinge as the equipment increases or decreases their brain activity.

In one study, Air Force pilots who received tDCS, compared with those who didn’t, learned more information in less time. But tDCS isn’t merely a way to learn better. It can help people cope with upsetting situations. In a pair of papers, my colleagues and I showed that stimulating a brain region that aids emotion regulation reduced rejection-related distress and aggression.

To succeed, people need some combination of talent, grit, and luck. Should a personal brain zapping machine get added to the list?

Originally posted on June 2, 2014.

 

We receive help every day. I don’t grow the food I eat, knit the clothes I wear, or assemble the TV I try to avoid. I don’t even cut my own hair. Nope, I rely on others to help me. But how do I get help when it involves asking?

Amidst a recent report showing low levels of helpfulness among college professors (especially toward members of minority groups and women), I thought it would be good to help readers know how to increase helping.

Here are the top 5 ways to do it (adapted from Latané and Darley, 1970).

  1. Notice help is needed. This goes both ways. I need to be aware that other people might need my help. I also need to make sure other people know I need help by asking.
  2. Realize when help is needed. If it’s an emergency, let people know it.
  3. Take personal responsibility for helping. Ignore what other people do. If you see someone in need, don’t wait for someone else to do the job. To quote Mahatma Ghandhi, “We need not wait to see what others do.”
  4. Make a decision to help. Think of this as the step between you wanting to help and you actually helping.
  5. Help! Now that you’ve made your decision, it’s time to put some feet on it. Take action and help.

Originally posted on June 10, 2014.

 

Many people run to enjoy better health, lower stress, and a slight endorphin buzz. But does running also zap our memories?

To find out, researchers traded in their lab coats for workout clothes and put some adult guinea pigs on a serious running schedule. Other guinea pigs were assigned to a couch potato condition. Next, the researchers tested how well the animals remembered situations that used to terrify them.

What happened? The runners forgot their fears. Faced with the prospect of painful electric shock, the runners fearlessly galloped. The couch potatoes cowered.

This study might change how you respond the next time you hear a runner say, “I run to clear my head.”

Originally posted on June 19, 2014.

 

We’ve all experienced the pleasure and subsequent pain of mindless eating. Just sit in front of the TV, open a bag of chips, and watch your favorite show. Now do the same thing with the TV off. In which situation did you eat more?

Mindless eating made Brian Wansink a household name. This guy’s research oozes coolness. Need proof? Just watch this video to see how he made people guzzle quarts (yes, quarts!) of soup by having them spoon it out of a bottomless bowl. It won him the 2007 Ig Nobel Prize.

If mindless eating makes us unhealthy, would mindful eating make us healthier? According to a recent study, yes. People who practiced mindful eating, compared with those who did not, ate healthier foods and had healthier weights.

At your next meal, don’t agonize over every bite. But also avoid putting your brain on cruise control. A healthy awareness of and attention to the food we eat can motivate us to use the highest quality food to fuel our brains and bodies.

Originally posted on July 3, 2014.

 

Have you ever wondered why some people struggle to avoid certain foods, whereas others have little trouble passing on a delectable dish? Childhood eating habits, genetics, and willpower offer possible answers. But researchers at Carnegie Mellon University identified another explanation: Thinking about eating makes food seem less exciting.

If you imagine eating 10 pieces of pizza, your mind has already simulated what it’s like to eat pizza. When you see a real pizza, your brain’s pleasure centers no longer perk up. You’ve been there, done that. As a result, you consume less pizza.  

In a series of experiments, people who imagined eating a food many times ate less of that food compared with those who imagined taking only a few bites. Instead of pizza, the researchers used M&M candies. People who imagined eating 30 M&M’s, compared with those who imagined eating only three, ate fewer M&M’s. By simulating eating lots of M&M’s, the thrill of eating the bite-sized candies was gone.

The next time you struggle to avoid a tempting food, remember that you can train your brain not to want it. Just imagine eating large quantities of the food. Your brain will think it’s already had more than enough to eat and you will desire the food less. 

Originally posted on July 10, 2014.

 

Why don’t people vote? This question puzzles pollsters, political candidates, and people who cherish the right to choose their elected officials. To predict voter turnout, all you might need is a test tube, a willing participant, and a little saliva.

So says a group of University of Nebraska-Omaha researchers, who tested the hypothesis that the stress hormone cortisol would predict voting behavior. Stress often leads people to avoid high pressure situations. If people have high cortisol levels, voting might only increase their stress. They might fear that their chosen candidate would lose the election, or that the candidate would underperform if elected. As a result, stressful souls might avoid the polls.

In the study, people spit in a tube to provide a measurement of their cortisol levels. Next, the researchers collected the study participants’ actual voting behavior in six U.S. national elections. Sure enough, the most stressed out people voted about half as often as their more relaxed counterparts.

To get people to vote, politicians might frame voting as a relaxing activity. “Take a break from work, relax, and make a difference in your community,” might help get even the most stressed out people to visit the polls.

Originally posted on July 17, 2014.

 

Lloyd Cosgrove was his town’s city manager, butcher, and Presbyterian minister. He had a shiny head, bushy eyebrows, and a whooping laugh. If you want Lloyd to remain unique, try not to think about him too much.

Why? Repetition breeds bland memories. Our brain’s memory center, the hippocampus, leaves different traces of information each time we call up something from our past. This is why our memories of the same past events shift. What gets lost are the details.

You might forget that Lloyd was a butcher or block out his whooping laughter. Or you might invent new details about him. Was he a Presbyterian or Lutheran minister? A city manager or a city councilman? Memory is a funny thing.

In a recent study, people who rehearsed an event three times recalled fewer details compared with people who rehearsed the same event once. Repetition improved how well people recognized pieces of information, but it squeezed out the details.

We might romanticize details. Do I need to remember the outfit my wife wore on our first date? (I do.) Do I need to remember where I ate my first taco? (I don’t.) Or should I become content that the details that add color, meaning, and spice to my daily experiences will become gray, hollow, and bland the more my memory plays them back? Ask me tomorrow. I’ll have a different memory of the question than I do today.

Originally posted on July 22, 2014.

 

Social support can take many forms. A helpful tweet, the annual Facebook birthday barrage of well wishes, and long conversations with friends and family can put things in perspective and reduce our stress. But, according to recent research from Renison University, Wilfrid Laurier University, and the University of Waterloo, these acts of kindness can backfire when directed at people who have low self-esteem.

People with low self-esteem have social support preferences that often put them on a collision course with their friends and family. They desire information that validates their negative self-feelings. When their friends offer positive feedback, people with low self-esteem don’t accept it. This aversion to positivity causes low self-esteem spillover: Their friends begin to feel bad about themselves, too.

What is the moral of the story? Find someone who has a similar self-concept as you do. Birds of a feather should often flock together. Although it might be hard to imagine wanting information that validates our negative self-feelings, it is unwise to force people to enjoy something they dislike. Knowing yourself and what you like is the first step in building a successful relationship. The next step is finding someone who shares your preferences, no matter how sunny or gloomy they might be.

 

 

Originally posted on August 5, 2014.

 

We live in an era of mental fatigue. People sleep less, work more, and experience more stress now than any other time in recent history. How can we overcome mental fatigue?

People go to extreme lengths to pep themselves up. They guzzle energy drinks, smoke cigarettes, or drink coffee and tea. But a growing trend, especially among college students, is to use psychotropic medications to battle mental fatigue. Many students abuse the popular drug methylphenidate (also known as Ritalin) because they think it will improve their concentration when fatigued.

Are the students right? To examine this question, researchers at the University of Michigan randomly assigned students to consume either Ritalin or a placebo pill. Next, the students completed a boring task. Crucially, half of the students completed a version of the task that was not only boring – it also made them mentally exhausted. The rest of the students were bored, but their mental faculties were left intact.

What happened? Taking Ritalin undid the effects of mental fatigue. So, in a way, these findings offer a clue about why students abuse Ritalin. Students might not want a cheap buzz; they may simply want to overcome their exhaustion.

In my next post, I’ll offer five healthier and safer options to deal with mental fatigue.

Originally posted on August 14, 2014.

 

In my last post, I reviewed research that showed that Ritalin, compared with a placebo, helped research volunteers overcome mental fatigue. Now I would like to give you five healthier and safer ways to conquer your mental fatigue.

Everyone experiences mental fatigue, whether it is the 3:00 pm “slump” or extreme sleep deprivation. Two weeks ago, I was awake for 40 consecutive hours as I helped a friend complete the Badwater 135 ultramarathon. Eighteen hours later, I was back in the office working. So, I know about fatigue and how to deal with it.

1.     Increase rest. This is the easiest, safest, and cheapest way to overcome mental fatigue. Increase your sleep until you reach at least seven to eight hours each night. If you’re sleep deprived, schedule extra time to catch up on your missed sleep. Once you’re caught up, your body will find a natural groove of how much sleep you need. Some people brag about how little sleep they need. Start bragging about how much sleep you get.  

2.     Play offense against your environment. Open your windows in the morning. When we see the morning sunlight, retinal proteins trigger signals to something called the suprachiasmatic nucleus (SCN). The SCN, in turn, helps our bodies produce less melatonin, the body’s natural sleep hormone. In the evening, turn off your lights. Don’t go to sleep in front of your iPad, iPhone, or other brightly lit device. The darker your room, the faster you’ll fall asleep.

3.     Exercise. Yes, exercise excites us. But exercise also bathes our minds with neurotransmitters that settle us down and boost our happiness. Try to avoid early morning and late evening exercise. A late afternoon walk, jog, or swim works best.

4.     Work smarter, not harder. Most of us have fallen prey to the mistaken idea that working more hours means that we are doing higher quality work. Yet few among us keep track of our daily activities. For example, how many minutes per day do you write, read, and check email? I use various websites and programs to help me accomplish my daily goals. Online-stopwatch.com is one of my favorites. I set the clock for four hours. When time is up, that means I’m done writing. Period. I also bought the “Freedom” program. It locks me out of the Internet. Freedom helps me plan my writing sessions (Will I need that document? Do I need to copy and paste this email?) and avoid lingering distractions. You’ll work fewer hours and feel less fatigued.

5.     Take the mind out of the middle. When we’re tired, it’s tough to make decisions. Try something different: Make a contract with yourself ahead of time. Psychologist Peter Gollwitzer calls these little contracts implementation intentions. For example, if you want to make sure you get your exercise, tell yourself, “When I get home from work, I will exercise for 20 minutes.” This way you’ve already made the decision. Instead of trying to rely on your groggy mind to make a good decision, refer to the mental contract you already drafted and signed.

So, how did I overcome the extreme mental exhaustion I experienced post-Badwater? I followed each step. I prioritized my sleep. I soaked up as much sunlight as possible. I exercised each day. I set specific work goals to accomplish. I made implementation intentions so that I knew my decisions were made ahead of time. Finally, I relaxed and took it easy.

Originally posted on August 28, 2014.

 

When your friend tells you about her terrific first date, you will eventually ask the question. You might stall by inquiring about the food she ate, the jokes he told, and the outfit she wore. Eventually, you’ll ask: Is he cute?

Recent research suggests that you’ll know how she arrived at her answer.  An in-depth analysis of 1,000 facial images identified three main ingredients of attractiveness:

  • Approachability, or how friendly a person seems. A large mouth, wide nose, and curvy bottom lip were some of the strongest predictors of approachability.
  • Youthful-attractiveness. Here, the eyes have it. To seem youthful, have large eyes. You should also avoid sporting a moustache or beard.
  • Dominance. Looking dominant relates to having angular cheeks, large eyebrows, and slightly dark skin.

These are some of the strongest predictors of each attractiveness ingredient. Of course, they don’t tell you much about people’s sense of humor, clothing style, or hobbies. For that, you’ll have to take the plunge and actually meet them. She might have large eyes and a curvy bottom lip, but would you want to date someone who never laughed at your jokes? I doubt it. Or what about an angular-cheeked, naturally tan man who always turns heads but also is profoundly dull and shallow? Maybe give him a fake phone number when he asks for yours.

Attractiveness matters, especially during the initial passionate stages of a relationship. But there are many ingredients that are far more important than attractiveness when selecting a mate. Trust is key. Think about it: Would you rather date an attractive compulsive liar, or a less attractive person who always tells the truth? Self-control also fosters relationship success. Highly self-controlled people, compared with their sluggardly counterparts, are more forgiving, more generous, and less aggressive.

So, it’s natural to wonder whether your friend’s date is cute. You might not ask whether he has a large mouth, angular cheeks, or big eyes. But if she says, “Yes, he’s gorgeous,” you can be confident that he received an extra helping of some of these attractiveness ingredients.

Originally posted on September 2, 2014.

 

Graduation brings few guarantees. Jobs are scarce, job security is even more difficult to find, and many people earn less and receive fewer employee benefits than they anticipated. But graduation often brings at least two things: pomp and presents. When I finished graduate school, my parents bought me a dog. I knew he had basic emotions, such as happiness and fear. Now I know he also gets jealous.

Finnegan, an English golden retriever, is one of my best friends. Early in my professor job, I would bring him to the office with me. He slept while I wrote papers. He even participated in some of my research studies. [Not to worry, a graduate student ran the experimental sessions. When we discussed the studies in front of Finn, I covered his ears to keep him blind to condition.] We would take walks around campus. Students would pop in and pet him. When I left the office to teach, he would yelp a little before settling down and falling asleep.

Then something happened. I got engaged. My fiancée Alice (now my wife of more than six years) moved to Kentucky and started sleeping on Finnegan’s side of the bed. Suddenly, he wasn’t top dog anymore. I was happy. Finnegan wasn’t.

But then another major event occurred. We purchased another dog, Finnegan’s half-brother, and named him Atticus. We wanted Finnegan to have a playmate. Things went well. Finnegan and Atticus played and wrestled and did all of the cute things that make YouTube videos go viral. Finnegan did show a curious new behavior, however. He seemed to get jealous when I petted Atticus.

Was Finnegan’s jealousy an illusion? It’s easy to fool yourself into thinking that animals can do more than they can. For example, dogs don’t know they are dogs. They don’t have that kind of self-awareness. Dogs also don’t have strong belief systems. Sure, they might like to eat my pizza, pretzels, and shoes. But it would never occur to one dog to ask another, “Do you avoid eating meat pizza for health or ethical reasons?” They just gobble and go.

According to a recent study, dog jealousy is real. The researchers tested 36 dogs. Just how might you evoke dog jealousy? Have a dog’s owner interact with a stuffed dog that barks, whines, and wags its tail. The owners also were instructed to ignore their own dogs while they played with the stuffed dog. To provide comparison conditions, owners also ignored their dogs to interact with a jack-o-lantern or a book.

Boy, the dogs got jealous when their owners ignored them! The dogs acted needy and tried to “shoo the rival [dog] away.” They fixed their gaze on the interloper. They even got a little nippy.

The dogs only got jealous when their owners paid attention to another dog. They didn’t mind their owners playing with the jack-o-lantern or a book. Just like my Finnegan, the dogs only started to show pangs of jealousy when they felt they were being replaced.

The moral of the story is that dogs experience complex emotions. Jealousy can sour relationships. Fortunately, humans and dogs can overcome their jealousy and learn to include others in their lives. Finnegan loves Alice, and Atticus is his best friend. Finn got over his jealousy. In that way, old dogs might be able to teach us some new tricks.

Originally posted on September 25, 2014.

 

Most of our daily lives hum along effortlessly. We automatically rise when we wake, speak when we wish to communicate, and eat when our empty bellies grumble. These behaviors helped our ancestors survive and reproduce. But we also need to size up situations and people that might threaten us. How well do we do this?

In one recent investigation, researchers from Australia and the United States argued that angry faces tell a specific story that takes little effort to understand. Rather than being a simple threat signal, angry faces give us information about people’s physical strength, which is the crucial element in determining their fighting ability. Using some cool facial morphing software, the researchers showed participants faces and then manipulated the seven primary facial muscles involved in an angry facial expression. Some faces flexed all seven angry facial muscles; others flexed fewer than seven.

The more angry muscles the faces flexed, the more participants rated the person as being physically strong. The key is that participants did not need to take a course on the biology of human emotion to make their ratings. They didn’t need to know the seven facial muscles that comprise an angry facial expression. Participants automatically knew the strongest and angriest face when they saw it.

So, the next time you get a twinge of terror when you see an angry face, don’t sweat it. Your mind is automatically telling you something aimed at keeping you safe and sound.

Originally posted on October 2, 2014.

 

Everything psychological is biological. Stress wreaks havoc on our immune system, increasing our risk for many diseases. Psychological disorders can make us feel physically sick. We feel the sting of rejection as real pain. Might a healthier body help us have a stronger mind?

To find out, a team of Brazilian researchers recruited a group of women who underwent bariatric bypass surgery. Before and after their surgery, the women completed a measure of executive function — a test of how well people manage their mental processes.

Not surprisingly, the bariatric bypass surgery caused the women to lose weight. It also reduced their inflammation and boosted brain activity in regions associated with cognitive function. But the coolest finding was that the women’s executive functioning improved. A healthier body related to a stronger mind.

No matter how disconnected our mind and body might seem, they are close friends who rely on each other for everything. By improving our physical health, we can change not only the shape of our bodies but also strengthen our minds.

 

http://www.talkpsych.com/talk-psych-blog/2014/9/22/can-a-lean-waist-strengthen-your-mind#commenting

Originally posted on October 9, 2014.

 

Have you ever just met someone, learned his name, and immediately forgotten it? This happens all of the time. People try all sorts of tricks to remember names, driving routes, or the location of their favorite Hong Kong noodle house. But we might be looking in the wrong spot. All we need is a healthy dose of electricity.

In a brilliant study, a group of Northwestern University researchers recruited volunteers and had them undergo a stimulating treatment. Each day for five days, the volunteers had a part of their brain stimulated using a technique called repetitive transcranial magnetic stimulation (rTMS). The brain stimulation sessions lasted 20 minutes and targeted the hippocampus, which aids memory. To have a basis of comparison, the same volunteers also completed a week of sessions in which they did not receive brain stimulation. The trick was that the volunteers didn’t know when their brains had been zapped and when they hadn’t.

Did the brain zapping improve memory? It did. The brain stimulation also improved how well the hippocampus “talked” to other nearby brain regions, an effect called functional connectivity. My favorite finding was that the brain stimulation effects persisted 24 hours after the volunteers underwent the treatment. Stimulate now, remember better later. 

What does this mean? Should we forgo other memory strategies and instead buy a brain stimulation machine? I think not. These findings simply shed light on how the mind works and new ways we can improve how it functions.

Nathan DeWall

The Power of a High Five

Posted by Nathan DeWall Jul 20, 2016

Originally posted on October 16, 2014.

 

One of my earliest memories is my dad giving me a high five. He was training for a marathon and agreed to take me, his talkative four-year-old, on a run. I ran an entire mile. When I finished, red-faced and smiling, he said, “Give me five, son.” It was my first high five.

According to a new study, high fives go a long way in motivating children. Five- and six-year-old children completed a task in which they imagined experiencing success. Next, the children received different types of praise. Some children received verbal praise that highlighted an individual trait (“You are a good drawer”), whereas other children received a high five.

What motivated the children more, clear praise for being good at something or a high five? The high five won handily. When the children hit a setback, those who received a high five persisted more than the other kids did.

We might reconsider how we praise children’s behavior. If we tell children they’re geniuses, we’ve told them that they have a stable trait that isn’t under their control. If they fail a test, the responsibility can’t be theirs because they have a trait that should guarantee success on all intelligence tests. Blame the teacher. Criticize the test. Give up and find something else to do. Don’t find a better way to study.

By giving a high five, children know they have done something well. They also know that their success is under their control. I have run many miles since my first high five, but that first one with my dad will always hold a special place in my heart. It motivated me, either consciously or unconsciously, to continue to push my limits. For that high five, I’m grateful.

Originally posted on October 23, 2014.

 

No matter how many babies I meet, I’m always left wondering what they want. Does a short squeak followed by a shrill squeal signal that the baby is hungry? That I left the dog outside by accident again? Or is the baby simply testing out her developing vocal cords? Driven by confusion and frustration, I might insert a pacifier into the baby’s mouth. The baby seems soothed, and I can take a breather.

But according to one recent study, pacifiers disrupt our ability to understand a baby’s emotional state. Adult participants viewed pictures of happy and distressed babies. Sometimes the babies wore pacifiers and other times they didn’t. When the baby wore a pacifier, adults showed less intense facial reactivity and also rated the baby’s emotions as less intense. It didn’t matter whether the babies were happy or sad. The pacifiers numbed adults to baby facial expressions.


Why did this happen? The idea hinges on the belief that we automatically mimic others’ emotional expressions. When I see people smile, I naturally mimic them because it helps me understand them and show empathy. Mimicking others is a great way to make friends. It lets others know we’re on the same team. Those who feel starved of social connection are the most likely to mimic others.

Pacifiers are mimic roadblocks. With a gadget covering your face, I can’t make out what you’re feeling. As a result, I mimic you less, empathize with you less, and ultimately judge your experience as less intense than it really is.

I don’t have a strong opinion about pacifiers. My sisters used them with their children, my parents used them with me, and I might use them when I have my own children. Like any consumer of knowledge, I’ll use this science to inform the choices I make. One thing is certain: I’ll never look at a pacifier in the same way.

Originally posted on October 30, 2014.

 

’Tis the season for professional recognition. The world is abuzz with announcements of who won this year’s Nobel Prizes. Psychology doesn’t have a Nobel Prize (though one of our own, Daniel Kahneman, won one in 2002). But psychologists like to make lists.

 

[Photo: Daniel Kahneman]

 

Recently, three researchers compiled a list of the 100 most eminent psychologists in the modern era (post-World War II). Several patterns caught my attention:

  • Most psychologists did not achieve great eminence until at least age 50. This finding is at odds with some reports that scientists often make their major breakthroughs between ages 35 and 40.
  • The eminent psychologists experienced what I call the Publishing Paradox. They published many articles, but their eminence was due to only one or two publications. Few of their publications had any impact on their perceived eminence.
  • Women and members of minority groups made up a small percentage of the list. This is a cause for concern as we embark on a time in the academy when diversity of experience, perspective, and background is most needed.

What can the list teach us? Eminence requires hard work that takes place over a long period of time. There are no shortcuts. People also must accept that most of their daily work will have no bearing on their perceived eminence. Fall in love with the process. Stay the course. Let others decide the outcome.

 

Originally posted on November 13, 2014.

 

Success is a mystery. What is it? How do we achieve it? And why does it often fail to live up to our expectations? Success puzzles us because we don’t appreciate failure.

In “What I Learned Losing a Million Dollars,” University of Kentucky alum and multimillionaire Jim Paul and Brendan Moynihan suggest that there are a million ways to succeed. If you want to earn more money, you can start a business or sell a business. To improve your mental health, you can get hired or resign. One person’s path to weight loss will be paved with fruit and no fat; another person’s caveman diet will encourage fat consumption to lose weight. The point is that there are at least as many ways to succeed as there are people on the planet.

This is good and bad news. The good news is that everyone can find a unique path to success. The bad news is that your unique path won’t teach you much about success. To learn how to succeed, you must learn why you fail and how to avoid it.

This topic is near and dear to my heart. Last weekend, I completed the Javelina Jundred, a 100-mile ultramarathon. It was my best race yet. I knocked well over an hour off my personal record time. Throughout the race, I felt good and ran a steady pace. After the race, I was happy and calm. (For proof, see my finishing picture.)

[Finish-line photo]

Failure was the key to my success. Twice earlier this year, I failed to finish 100-mile races. Both times I got sick and the medical team pulled me. Last weekend, I didn’t focus on how to run faster. Instead, I concentrated on how to avoid the things that caused me to lose out on finishing those earlier races. By learning from failure, I could achieve my definition of success.

I don’t know for certain why failure is such great learning medicine. One reason may be that bad is stronger than good. When we fail, it grabs our attention more than success does. Others argue that there are only a few ways to fail. Either way, failure is a great teacher that we should embrace instead of fear.

Originally posted on November 20, 2014.

 

Have you ever seen a baby so cute you wanted to snuggle it and take a bite out of it at the same time? Ever said to a new niece or nephew, “You’re so cute, I could just eat you up?”

Have you cried after a happy occasion, such as crossing the finish line of a race for which you’ve long prepared, or proposing to your girlfriend and getting a yes? Two weeks ago, I experienced these conflicting emotions when I shed several tears after finishing a 100-mile running race.

These conflicting concurrent emotions help us maintain emotional balance, according to research from Yale University.

An adult’s reaction to an adorable baby is to kiss them and coo at them. But an adult may also pinch, squeeze, and playfully nip at them. Knowing that most people don’t intend to actually harm babies, the researchers designed several experiments to find out why adults respond to them with aggressive behavior.

In one study, participants looked at and evaluated photos of different babies, some of whom appeared more infantile than others. The participants said they wanted to care for and protect the more infantile babies, but they also reported more aggressive expressions in response to them. Participants were also more likely to feel overwhelmed with very strong positive feelings in response to the more infantile babies.

What do these findings tell us? Being overwhelmed by positive emotion produces responses designed to bring us down to our emotional baseline. Ever in need of emotional equilibrium, people will engage in behaviors aimed at leveling off their extreme emotional reactions.

So the next time you cry during a happy scene in a movie, laugh nervously, or feel compelled to take a bite out of a cute baby, remember that it is just your body’s way of maintaining emotional balance.

Originally posted on December 4, 2014.

 

Walk down a sidewalk and someone will likely take notice. Just where do their eyes linger? You can tell a lot about whether they think you are Mr. Right—or Mr. Right Now—based on where their eyes gravitate.

So says recent research conducted at the University of Chicago. Students viewed photographs of people and reported whether the photos caused them to experience romantic love or sexual desire. The students also wore an eye tracker, which recorded which parts of each photograph captured their attention. The idea is that romantic love causes people to try to understand what another person is thinking. Sexual desire encourages people to pay attention to objects that reflect concrete sensations and feelings.

Romantic love drove people to fixate their attention on people’s faces. This makes sense. If I want to understand what someone is thinking, I should look at their face. Their facial expression might also give me a clue as to whether they return my interest. Sexual desire created a different picture. When people saw a photograph that caused them to experience sexual desire, their eyes stuck on people’s bodies.

This love versus lust response operates automatically. Participants didn’t think carefully about where to position their eyes. Their eyes simply gravitated toward bodily locations that were most relevant to romantic love or sexual desire. Just how big of a difference was there between how long participants spent looking at faces when they experienced love rather than lust? A little over 400 milliseconds. That’s a tad longer than an eyeblink.

But don’t let that slight difference take anything away from how cool these findings are. They show how efficiently our minds work to alert us to information that relates to our emotions and goals. By knowing this wrinkle about how the mind works, your walks may never be the same. 

Originally posted on December 11, 2014.

 

Even the most pleasant activities have low spots. I enjoy teaching as much as anything, but there are certain parts I like more than others. Course planning ranks as one of my least favorite parts of teaching. There are numerous questions that lack clear answers.

But as I built my online course shell today, I felt more confident than ever about how often I should test my students. Quite a bit.

In research conducted at the University of Texas-Austin, researchers gave students daily online quizzes that provided immediate, personalized feedback. At the end of the semester, the researchers compared the daily-quizzed students’ grades with those of students who had previously taken a version of the course that did not include the daily quizzes. The result? Daily quizzes boosted class performance a half letter grade.

The daily quiz effect also spilled over into the students’ other classes. Even when the course material did not relate to their daily quizzes, students who were frequently tested continued to excel. That’s remarkable.

The most surprising part is how much students like frequent testing. Last year, I taught Introduction to Psychology while I was on sabbatical at Hope College. Knowing the benefits of frequent testing, I decided to give my students 22 quizzes throughout the semester. That’s about a quiz every class session.


At the end of the semester, I asked students what parts of the course they would like to keep or discard. No student suggested getting rid of the quizzes. They said the quizzes kept them on track and gave them frequent feedback about how well they understood the material. Students also said that the frequent quizzes caused them to approach longer exams without much anxiety.

So, having spent today entering the many quizzes that my Introduction to Psychology students will take next semester, I know that frequent testing should help them earn the high grades they desire.

Originally posted on December 18, 2014.

 

Self-preservation is a core instinct, but sometimes people reach an emotional valley in their lives and the best way out seems to be self-harm. Unfortunately, a history of self-harm is one of the best predictors of future self-harm and death by suicide. Can psychotherapy weaken the cycle of self-harm and its relationship to death by suicide?

Yes, according to a recent study. The research examined a group of 22,712 Danish people who had engaged in deliberate self-harm. Some of them received psychotherapy, whereas others did not. Then the researchers determined whether people chose to hurt themselves again, died of any cause, and died by suicide one, five, 10, and 20 years later.

The results were striking. Psychotherapy reduced the risk of future self-harm, death by any cause, and death by suicide. The researchers estimated that “145 self-harm episodes and 153 deaths, including 30 deaths by suicide, were prevented.”

The findings offer hope to those at risk for self-harm and suicide. They also shed light on the power of psychological science to improve and potentially save lives. Some therapies work better than others. For some people, therapy might not work at all. But overall, this research suggests that therapy is worth a try.

Originally posted on January 22, 2015.

 

At the beginning of each year, millions of people reflect on the previous year and find things they could have done better. Exercised more, eaten healthier, watched less television, drunk less alcohol. They vow—most knowing they won’t keep their promise—to make more of the new year, to become their best selves.

Ah, the New Year’s resolution. I’ve made many myself. Many of my resolutions have been health related; when I look back at the previous year, I see where I could have been much healthier. I compare myself to friends who ran more miles, enjoy a slightly leaner physique, and never seem to worry about their clothes getting snug. (Last year, for example, a close friend ran over 5,200 miles. That dwarfs my measly 2,525 miles.) Looking at their accomplishments makes me feel sluggish. So I vow to change, and the start of a new year seems like the perfect time to do so.

Unlike many resolution makers, I have had some success with New Year’s resolutions. Here’s why: I really wanted to change and was ready to do so. And that readiness to change is the key ingredient in committing to these self-improvement plans, according to Meg Baker, a wellness expert from the University of Alabama.

Many Americans make resolutions but don’t put a plan in place to successfully carry them out, she says. To increase your likelihood of success, Baker offers three suggestions:

  • Develop small, short-term, realistic goals that will fit into your schedule
  • Consider the benefits and reasons for the change
  • Share your plan with someone who can hold you accountable

She also suggests that you consider modifying the plan as your needs change. For example, if your new exercise routine has gotten stale, mix it up. During the winter months, I sometimes get stuck running on the treadmill. To keep things interesting, I might spend a day cycling or trying to do a single pull-up. When you’re struggling to stick to it, Baker suggests reflecting on the reasons you made the resolution.

This year, I’m once again vowing to be healthier than I was last year. That means if I really want to see progress, I have to be willing to take the action to bring about change. To kick things off, I spent January 1st running the Hangover Classic 10-mile run in Louisville, KY, and, a couple of hours later, the Resolution Run 5-mile run in Lexington, KY.

So here’s to a healthy, happy 2015. What’s your resolution?

Originally posted on February 5, 2015.

 

Even though the smartphone has only been around for the past seven or eight years, it’s sometimes difficult to remember what life was like before we had so much information at our fingertips. You could argue with a friend about what year “Back to the Future, Part 2” came out, or in what year the “future” was set. (It was released in 1989. The future, filled with flying cars and floating skateboards, was set in 2015.)

Back then, you couldn’t resolve discussions by swiping a screen and touching a button. Siri wasn’t even a twinkle in Steve Jobs’s eye. If you got lost, you had to consult a map or stop and ask for directions, and if you got bored while waiting in line, you couldn’t pass the time by playing Candy Crush or perusing Instagram.

Luddites argue that life was better before the smartphone, whereas others tout the benefits of instant communication and information. But one thing is certain: The smartphone has changed our lives. And our thumbs.

Yes, when we spend time on smartphones using a touchscreen, it changes the way our thumbs and brains work together, according to a new study by researchers from the University of Zurich and ETH Zurich in Switzerland.

Our obsession with smartphones presented the perfect opportunity to explore the everyday plasticity of our brains. With smartphones, we are using our fingertips—especially our thumbs—in a new way, and we do it a lot. And because our phones keep track of how we use them, they carry a wealth of information that can be studied.

In the study, the research team used electroencephalography (EEG) to record brain responses to touches on the thumb, index fingertip, and middle fingertip of touchscreen phone users compared with people who still use flip phones or other old-school devices. They found that the electrical activity in the brains of smartphone users was enhanced when all three fingertips were touched. The amount of activity in the brain’s cortex associated with the thumb and index fingertips was directly proportional to the amount of phone use.

Repetitive movements over the touchscreen surface might reshape sensory processing from the hand. Cortical sensory processing in our brains is constantly shaped by personal digital technology. So, the next time you use your thumbs to tweet, answer email, or jot yourself a note, remember that you’re training your brain. Keep in mind, too, that excessive phone usage is linked with motor dysfunction and pain. Remember the so-called “BlackBerry thumb”?


Originally posted on February 20, 2015.

 

This morning my wife, our one-month-old daughter, and I went to a local diner. It was a snow day, my university was closed, and we were enjoying a rare morning together. Before our food arrived, I took a sip of coffee, looked outside, and said, “I’m so happy.” The story should end there, with our tiny family devouring pancakes and running errands. But then I returned to my house, opened my email, and received some bad news. I was supposed to be miserable.

Or so suggested the latest Gallup Report, “The State of American Well-Being: 2014 State Well-Being Rankings.” For the sixth straight year, my state, Kentucky, ranked 49th of 50 U.S. states. Only West Virginians have lower well-being than my fellow Kentuckians do.

My first impulse was to try to make sense of all of this. Was I conning myself when I said I was happy? Can you ever really measure happiness? Let’s not fool ourselves. You can’t measure happiness the way you can measure your weight in gold. But I agree with one of my favorite social psychologists, Dan Gilbert, who said, “maybe we just need to accept a bit of fuzziness and stop complaining” (Stumbling on Happiness, p. 65). So, I accepted my happiness.

This is when I started to understand why I’m throwing off the statewide dish of depression. Here are the five elements of well-being (taken from the Gallup site):

  • Purpose: liking what you do each day and being motivated to achieve your goals
  • Social: having supportive relationships and love in your life
  • Financial: managing your economic life to reduce stress and increase security
  • Community: liking where you live, feeling safe, and having pride in your community
  • Physical: having good health and enough energy to get things done daily

That’s when I understood, and my heart began to sink. I max out on each ingredient. I love my daily activities, both personal and professional. I have relationships that allow me to have the diner experience I mentioned. I’m neither the richest nor the poorest person in my state, but my wife and I manage our finances so that we can feel secure and have rewarding experiences. I love where I live, and enjoy showing people our great state. I take care of myself physically, at least enough so that I can make words move across the page. All of that is annoying to read and even harder to write. But it’s true.

Then why did my heart start to sink? I have a theory of mind and a concern for others. Unlike my dogs, a blowfish, or the horses I drive by on my way to work, I can simulate another person’s experience. And when I simulated how it felt to be deprived of purpose, meaningful relationships, financial security, community pride and safety, and physical health, I realized the seriousness of today’s Gallup results. We need change.

The good news is that each well-being ingredient can be mended. To have higher well-being, people don’t need to grow a third leg or become enthralled with the taste of cod liver oil. Those things are impossible. Psychological science provides clear answers about how to improve our well-being. The biggest challenge is that the scale of change needed to buck our spot in the well-being basement could take years. Kentucky will never be Hawaii, but we can improve. Is it worth a try? I think so.

Originally posted on February 12, 2015.

 

Did you watch all five seasons of “Breaking Bad” over a long weekend? Have you ever longed for the weekend so that you can watch episode after episode of your new favorite television show? Are you counting down until Netflix releases Season 3 of “House of Cards” later this month? You’re not alone.

Binge-watching seems harmless—I’ve been known to veg out occasionally after a long week, watching hours of “The Wire”—but is it really? New research says maybe not.

It turns out, loneliness and depression are linked to TV binge-watching. In a recent study, over 300 18-to-29-year-olds reported their loneliness, depression, self-regulation, and binge-watching behavior. The more depressed the survey participants were, the more they binge-watched. The relationship between depression and binge-watching was strongest among people who lacked self-control. Faced with the option of watching yet another episode, impulsive participants went along with the binge-watching program.

These findings complement other research showing relationships between depression, loneliness, and self-regulation problems and general binging behavior. To escape from a lonely or depressed mood, people often engage in addictive behaviors.

Most of us have fallen prey to the binge-watching bug. It’s okay to enjoy an occasional marathon TV-watching session. But remember the science: If you’re feeling blue, try not to hide your sorrows in the “boob tube.” It’s not likely to help, and it just might make matters worse.

Originally posted on February 24, 2015.

 

Dog research always fascinates me. You could say I have a nose for it. As humans, we spend a lot of time with our canine friends: they share our homes and steal our hearts—and sometimes the food off our plates.

I’ve always loved dogs, and I couldn’t wait to get one of my own. Nearly eight years ago, I adopted Finnegan, a lovable yet slobbery Golden Retriever who regularly knocks over the trash can and cuddles with me and my wife. A year later we adopted his half-brother, Atticus, and doubled our fun. And our mischief.

From across the room, both dogs seem to suspect when we’re angry or happy. All they need is a peek at our body language and facial expressions. If you have a dog, you’ve likely noticed the same thing. But did you know that dogs also can tell the difference between happy and angry faces in photographs?

One study says so. A team of researchers trained dogs to discriminate between images of the same person making a happy or angry face. Twenty dogs were shown photos of faces side-by-side on a touchscreen. Half of the dogs were trained to touch images of happy faces; the other group was rewarded for choosing angry faces.

The dogs needed only a little training before they could choose the angry or happy face more often than would be expected by random chance. So, not only can dogs learn to interpret their owners’ facial expressions, but they can also perceive emotions in photographed strangers.

A cool wrinkle in the study was that the dogs were slow to associate an angry face with a reward. Perhaps they instinctively knew to stay away from angry people, which made it hard for them to link angry faces with anything positive.

I can’t wait to see how this line of research progresses. In the meantime, I’m going to go smile at my dogs.

Originally posted on April 9, 2015.

 

Many people call laughter the best medicine, but did you know that it can also help you make new friends?

It doesn’t surprise me at all. Some of my best friendships have had their roots in belly laughs.

Sharing a laugh makes people more likely to open up to each other, according to a recent study. Laughter increases our willingness to share something personal, without our even realizing that’s why we’re doing it.

Allowing someone to truly know us—perhaps sharing our most embarrassing moment, or talking about a personal goal or fear—is crucial in building and growing relationships.

To test their theories about laughter and self-disclosure, researchers gathered 112 students who did not know each other. They split them into groups and showed each group one of three 10-minute “mood induction” videos: a standup comedian, a golf instruction video, or an excerpt from a nature show. Researchers measured how much the students laughed as well as their other emotional states. The students also wrote a message to another participant to help them get acquainted.

The results: Group members who laughed together while watching the comedian shared much more intimate information than those who did not watch the comedy routine. That’s probably because laughter triggers the release of endorphins, which play a role in forming social bonds.

Try it out next time you’re in a social situation with strangers or mere acquaintances. If they’re a bit aloof, get them laughing. You’ll be surprised at how a little laughter can defrost even the toughest audience.


 

 

 

Originally posted on April 16, 2015.

 

Our brains are amazing. I am endlessly fascinated by how the brain works. In nearly every interview I do, the reporter asks, “What part of the brain lights up when that happens?”

Now reread the previous sentences. As you came upon each word, how did you read them? Did you look at each letter and arrange it into a word? Have you ever thought about how we read? How can we skim so quickly through a passage and absorb its contents?

Our brains don’t look at letters. So says a new study. Instead of seeing a group of letters, our brain sees the entire word as an image. Neurons in our brain’s visual word form area remember how the whole word looks, using what one researcher called a “visual dictionary.”

Researchers tested their theories by teaching 25 adult participants a set of 150 nonsense words and investigating (using fMRI) how the brain reacted to the words before and after learning them. The results: The participants’ visual word form area changed after they learned the nonsense words.

Pretty cool stuff. But, it’s also useful. Knowing how our brains process words could help us design interventions to help people with reading disabilities. People who have trouble learning words phonetically might have more success by learning the whole word as a visual object.


Originally posted on July 2, 2015.

 

Not long ago, I enjoyed one of my favorite summer pastimes. With a close friend, I attended a Major League Baseball game. My team got clobbered, it rained, and I forgot to bring home the free Johnny Bench bobblehead doll that I drove 90 minutes to get. But the trip was worth it because I witnessed something that borders on magic: kids dancing without a care in the world.

Whether they’re dazzling 25,000 spectators on a giant screen or headlining an impromptu dance party in the living room, kids know how to get down. They often lack skill, grace, and sensitivity. But none of that matters. Feelings are facts, and kids know the definition of dancing is fun.


 

Why does dancing lose its appeal? According to recent research, a better question is when does dancing become a downer? The decline of dance starts when we develop what is called a theory of mind, that pesky ability to infer another person’s mental states. A theory of mind lets the trick-or-treater know that the person underneath the mask isn’t really a goblin, and it helps a child figure out what might make a parent buy a desired toy. A theory of mind also helps us think of how others judge our dancing. And that, my friends, is when dancing stops being so fun.

The upside is that there’s never a shortage of young people who haven’t gotten wise to how goofy dancing makes them look. This weekend I’ll go back to watch my team play. The kids will dance, the adults will laugh, and we’ll all enjoy a relaxing evening.

At the 2016 Stanford Psych One Conference, Linda Woolf (Webster University) suggested that during the Intro Psych learning chapter we talk about Hero Rats. This is a very nice way to help students see an example of the contributions psychological science is making to promote human rights around the world.

 

After covering operant conditioning, show Bart Weetjens’s 12-minute 2010 TED talk, How I Taught Rats to Sniff Out Land Mines (below). (Why rats, other than that they are easy and cheap to train? They are too light to set off the mines.) In the second half of his talk, Weetjens discusses his new work on training rats to detect tuberculosis.

 

 

Alternatively, show students this 11-minute 2007 Frontline segment on Hero Rats. Before you play it, inform students that there is an error in the video. Can they identify it? [In the video, the conditioning is called classical/Pavlovian, but it’s actually operant. The rats are clicker-trained: they learn that when they hear a click, they can run to a location, such as back to their trainer, to get a tasty treat. The click is a discriminative stimulus, as in “that sound is my cue to go get a snack.”]

 

This website provides a nice written explanation of the process used to train the rats.

 

Is your class, psych club, or honor society looking for a project? Consider raising funds to support Weetjens’s organization, Apopo.

 

Besides, Gambian (aka African) pouched rats are pretty darn cute. Even if (or because) their bodies can be a foot and a half long with a tail that matches their body length.

(Photo source: Gambian pouched rat - Wikipedia)

Originally posted on September 24, 2015.

 

This past weekend, I gave myself an odd birthday present. I entered an ultramarathon. If you’ve read my posts, you know I like to run. For my birthday, I wanted to run 100 miles as fast as I could. Luckily, I had a perfect opportunity. There was a 24-hour running race within driving distance of my house.

There was a bigger purpose in my run. I could determine whether a recent test of my speed and endurance would replicate. Two weeks ago, I ran 100 miles in 22 hours and 10 minutes.

Replication is important. It tells us whether repeating the essence of an experiment will produce the same result. The more often the same sequence of events produces a similar outcome, the more we can depend on it.

Psychology is embroiled in a current debate about replicability. All psychologists agree that replication is important; agreeing to that is practically a requirement for membership in the psychologist club. The debate centers on the meaning of non-replication. A recent report found that 64 percent of the tested psychological effects did not replicate. Some have declared a war on current scientific practices, hoping to inch the non-replication rate down to a less newsworthy percentage. Others, such as Lisa Feldman Barrett, argue that non-replication is a part of science. It tells us just as much about why things do happen as about why they don’t.

My birthday run had everything I needed to make a replication attempt. Nearly everything was identical to the last time I ran 100 miles. The course consisted of a flat, concrete loop that was nearly one mile long. I ate the same foods, drank the same amount of water, and got the same amount of sleep the night before. All signs pointed to an exact replication.

Then the race started. The first 50 miles breezed by. I was over an hour faster than my previous run, but I felt pretty good. By mile 65, I was mentally fatigued. By mile 70, my body was exhausted. By the time I hit mile 75, I was done. Less than 16 hours had passed, but I was mentally and physically checked out. No replication.

There are at least two ways I can deal with this non-replication. The first is to panic. Either the people who counted my laps at the previous race did something wrong, I reported something wrong, or something else is wrong. It is as if it never happened. The next time someone asks me my personal record, I can tell them. But I must tell them that I don’t trust it. “Probably just a one-off,” I might say. “Tried to replicate it two weeks later and came up short.”

A second approach is to try to understand what contributed to the non-replication. Most things were the same. But some things were different, among them the wear and tear that long running has on the body and mind. Maybe I wasn’t fully recovered from the previous race. Maybe I ran too fast too soon. Or maybe I’m just not that fast.

Either way, it tells us a different story about replication. Replication science is possible, but we will always have non-replications. And those non-replications aren’t badges of shame. They tell us as much about the complexity of human psychology as they do about how certain situations make us think, feel, and act.

It would be great if psychology’s non-replication rate dwindled to less than 5 percent. I doubt that will ever happen. Humans are squirrely animals. No matter how much we want to do the same thing twice, sometimes it doesn’t happen.

 


Originally posted on April 3, 2014.

 

The New York Times columnist Nicholas Kristof recently (here) chided academics for becoming esoteric and inaccessible to the general public.  He noted that

The latest attempt by academia to wall itself off from the world came when the executive council of the prestigious International Studies Association proposed that its publication editors be barred from having personal blogs. The association might as well scream: We want our scholars to be less influential!...

Professors today have a growing number of tools available to educate the public, from online courses to blogs to social media. . . . So, professors, don’t cloister yourselves like medieval monks — we need you!

Voila!  Here begins an effort to share fruits from psychological science.  With daily reports and reflections, we will share what fascinates our minds, challenges our thinking, or tickles our funny bones.  We aim to “give psychology away” to:

  • teachers seeking to freshen their classes with cutting-edge ideas and discoveries,
  • students eager to learn insights beyond what’s in their textbooks, and
  • any curious person who finds human beings fascinating, and who delights in psychological science efforts to expand our minds and enlarge our hearts.

We also aim to offer our reflections in simple prose, believing with Thoreau that “Anything living is easily and naturally expressed in popular language.”

Welcome aboard, and please do feel free to invite your students, colleagues, and friends to join us for the ride, and to join the conversation.

Originally posted on April 3, 2014.

 

My friend Ed Diener, the Jedi Master of happiness research, presented a wonderful keynote talk on “The Remarkable Progress of National Accounts of Subjective Well-Being” at the recent one-day “Happiness and Well-Being” conference.  He documented the social and health benefits of positive well-being, and celebrated the use of at least simple well-being measures by 41 nations as of 2013.

In displaying the health accompaniments of positive emotions, Ed introduced me to a 2011 PNAS (Proceedings of the National Academy of Sciences) study by Andrew Steptoe and Jane Wardle that I’d somehow missed.  Steptoe and Wardle followed 3,853 fifty-two- to seventy-nine-year-olds in England for 60 months.  This figure displays the number surviving, among those with high, medium, and low positive affect—which was assessed by averaging four mood reports across a single day at the study’s beginning.  Those with a “blue” mood that day were twice as likely as the good-mood folks to die in the ensuing five years!

[Figure: 60-month survival among those with high, medium, and low positive affect]

David Myers

Sometimes Truth is Comedy

Posted by David Myers Jul 19, 2016

Originally posted on April 6, 2014.

 

Consider Brett Pelham, Matthew Mirenberg, and John Jones’ 2002 report of wacky associations between people’s names and vocations.  Who would have guessed?  For example, in the United States, Jerry, Dennis, and Walter are equally popular names (0.42 percent of people carry each of these names). Yet America’s dentists have been almost twice as likely to be named Dennis as Jerry or Walter. Moreover, 2.5 times as many female dentists have been named Denise as the equally popular names Beverly and Tammy. And George or Geoffrey have been overrepresented among geoscientists (geologists, geophysicists, and geochemists).

I thought of that playful research recently when reading some clever research on black bears’ quantitative competence, co-authored by Michael Beran.  Next up in my reading pile was creative work on crows’ problem solving led by Chris Bird.  Today I was appreciating interventions for lifting youth out of depression, pioneered by Sally Merry.

That also took my delighted mind to the important books on animal behavior by Robin Fox and Lionel Tiger, and the Birds of North America volume by Chandler Robbins.  (One needn’t live in Giggleswick, England, to find humor in our good science.)

The list goes on: billionaire Marc Rich, drummer Billy Drummond, cricketer Peter Bowler, and the Ronald Reagan White House spokesman Larry Speakes.  And as a person with hearing loss whose avocational passion is hearing advocacy, I should perhaps acknowledge the irony of my own name, which approximates My-ears.

Internet sources offer lots more:  dentists named Dr. E. Z. Filler, Dr. Gargle, and Dr. Toothaker; the Oregon banking firm Cheatham and Steele; and the chorister Justin Tune.  But my Twitter feed this week offered a cautionary word about these reported names:

“The problem with quotes on the Internet is that you never know if they’re true.”  ~ Abraham Lincoln

Perhaps you, too, have some favorite name-vocation associations?  I think of my good friend who was anxiously bemused before meeting his oncologist, Dr. Bury.  (I am happy to report that, a decade later, he is robustly unburied and has not needed the services of the nearby Posthumus Funeral Home.)

For Pelham and his colleagues there is a serious point to this fun:  We all tend to like what we associate with ourselves (a phenomenon they call “implicit egotism”). We like faces that have features of our own face morphed into them.  We like—and have some tendency to live in—cities and states whose names overlap with our own—as in the disproportionate number of people named Jack living in Jacksonville, of Philips in Philadelphia, and of people whose names begin with Tor in Toronto.

Uri Simonsohn isn’t entirely convinced (see here and here, with Pelham’s reply here).  He replicated the associations between people’s names, occupations, and places, but argued that “reverse causality” sometimes is at work. For example, people sometimes live in places and on streets after which their ancestors were named.

Implicit egotism research continues.  In the meantime, we can delight in the occasional playful creativity of psychological science.

P.S.  Speaking of dentists (actual ones), my retired Hope College chemistry colleague Don Williams—a person of sparkling wit—offers photos taken by his own hand.


And if you need a podiatrist to advise about your foot odor, Williams has found just the person.


Originally posted on April 8, 2014.

 

An editorial in yesterday’s New York Times questioned the nearly $1 billion the U.S. Transportation Security Administration has invested in training and employing officers to identify high-risk airline passengers. In 2011 and 2012, T.S.A. behavior-detection officers at 49 airports “designated passengers for additional screening on 61,000 occasions.”

The number successfully detected and arrested for suspected terrorism? Zero.

But then again, the number of plane-destroying terrorists they failed to detect was also, I infer, zero. (Wonkish note:  A research psychologist might say the T.S.A. has made no Type II errors.)

Regardless, psychological science studies of intuitive lie detection, as the Times’ John Tierney noted in an earlier article, suggest that this has not been a wise billion-dollar investment. Despite our brain’s emotion-detecting skill, we find it difficult to detect deceiving expressions. Charles Bond and Bella DePaulo reviewed 206 studies of people discerning truth from lies. The bottom line: People were just 54 percent accurate—barely better than a coin toss. I have replicated this in classroom demonstrations by having some students tell either a true or a made-up story from their lives. When seeking to identify the liars, my students have always been vastly more confident than correct.

Moreover, contrary to claims that some experts can spot lies, research indicates that few—save perhaps police professionals in high-stakes situations—beat chance. The behavioral differences between liars and truth-tellers are just too minute for most people to detect.

Before spending a billion dollars on any safety measure, risk experts advise doing a cost-benefit analysis. As I reported in Intuition: Its Powers and Perils, some people were outraged when the Clinton administration did not require General Motors to replace ill-designed fuel tanks on older model pickup trucks. The decision spared General Motors some $500 million, in exchange for which it contributed $51 million to traffic safety programs. “GM bought off the government for a pittance,” said some safety advocates, “at the expense of thirty more people expected to die in fiery explosions.” Actually, argued the Department of Transportation, after additional time for litigation there would only have been enough of the old trucks left to claim 6 to 9 more lives. Take that $500 million ($70 million per life)—or the $1 billion more recently spent on behavior detection—and apply it to screening children for preventable diseases (or more vigorous anti-smoking education programs or hunger relief) and one would likely save many more lives. By doing such cost-benefit analyses, say the risk experts, governments could simultaneously save us billions of dollars and thousands of lives.
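A quick back-of-the-envelope check on that parenthetical figure (my arithmetic, using the Department of Transportation’s estimate of 6 to 9 lives):

$$\frac{\$500\ \text{million}}{6\ \text{to}\ 9\ \text{lives}} \approx \$56\ \text{to}\ \$83\ \text{million per life}$$

which comes to roughly $70 million per life at the midpoint of about 7 lives.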

Ergo, when considering how to spend money to spare injuries and save lives, critical thinkers seek not to be overly swayed by rare, dreaded catastrophes. The smart humanitarian says: “Show me the numbers.”  Big hearts can cohabit with cool heads.

Originally posted on April 10, 2014.

 

A footnote to the name-vocation analyses:  Who would you rather hire for a managerial (rather than employee) role—John Knight or George Cook?  Jill Prince or Judy Shepherd?  David King or Donald Farmer? Helen Duke or Hazel Baker?

Raphael Silberzahn and Eric Luis Uhlmann studied nearly a quarter million German names corresponding to high and lower status occupations, such as König (King) and Koch (cook).  Those with names linked with high status occupations were modestly more often appointed to high status roles.  Silberzahn and Uhlmann speculate that the name association may have made those with high status names seem more worthy.

As former U.S. President Jimmy Carter famously said, “Life isn’t fair."

Originally posted on April 14, 2014.

 

A recent New Yorker review (here) questions the famous claim that “38 witnesses” failed to respond to the Kitty Genovese murder and raises questions about the relationship between the media and the social sciences.  Psychologists have known that the New York Times’ original report of 38 witnesses is questionable.  In a 2007 American Psychologist article, Rachel Manning, Mark Levine, and Alan Collins reported on “The Kitty Genovese murder and the . . . parable of the 38 witnesses.”

Social psychologist Bibb Latané has responded to the New Yorker article, noting that the precise number of witnesses concerns a small “t” truth, with the dynamics of bystander inhibition being the central point of his award-winning research with John Darley.  The dynamic that drove the bystander nonresponse was not “moral decay” but a simple principle:  the “probability of acting decreases with the addition of more people.”

Latané’s letter in the April 7th New Yorker is available here, along with his more extensive submitted explanation.

Originally posted on April 16, 2014.

 

Part of our pleasure in writing psychological science is identifying the big ideas and findings that educated people should know.  Another part of our pleasure is relating these ideas and findings to people’s everyday lives.

Our Harvard colleague Steven Pinker, one of psychology’s public intellectuals, has offered—courtesy of the New York Times—a short quiz that invites people to relate some of psychology’s ideas to real life and pop culture.  Perhaps you, or your students, might enjoy some of the quiz items—here.

David Myers

Big Data

Posted by David Myers Expert Jul 19, 2016

Originally posted on April 18, 2014.

 

“The Internet is one big field study,” observed Adam Kramer, a social psychologist and Facebook researcher, at the recent Society for Personality and Social Psychology (SPSP) presidential symposium on big data.  Some big data factoids, gleaned from the conference:

  • There are, according to Eric Horvitz, Managing Director of Microsoft Research, 6.6 degrees of separation between any two people on the Internet.
  • Google has now digitized 6 percent of all published books, creating a huge archive of words that can be tracked over time at https://books.google.com/ngrams.  One can use this resource to answer interesting questions . . . such as: is it true that the term “homosexuality” hardly predates the 20th century, and that “sexual orientation” is a late 20th century concept?  It took me about a second to create this figure of the proportional frequency of these terms over time:

1396620925095.jpeg

  • On Facebook, Kramer reported
    • Parents and children take an average of 371 days to friend one another.
    • Mothers use 10% more nurturing words when communicating with their children.
    • In the 2010 congressional elections, people posting that they had voted led to 340,000 additional voters among their friends and friends of friends.
    • Positive emotion words in people’s posts are followed, in the ensuing three days, by increased positive emotion words in friends’ posts, and vice versa for negative emotions.
  • A research team led by Blaine Landis at the University of Cambridge analyzed all 30.49 billion international Facebook friendships formed over four years, and reported (in an SPSP poster) that people tended to “friend up.”  Those from countries with lower economic status were more likely to solicit friendship with those in higher status countries than vice versa.

Originally posted on April 22, 2014.

 

Critics have used the SAT test redesign to denounce the SAT and aptitude testing.  The multiple-choice SAT has “never been a good predictor of academic achievement,” Bard College president Leon Botstein argued in Time. Better to “look at the complex portrait” of college applicants’ lives, including “what their schools are like,” said Colby College English professor Jennifer Finney Boylan in a New York Times essay. The SAT only measures “those skills … necessary for the SATs,” surmised New Yorker staff writer Elizabeth Kolbert.

In a new Slate essay, David Hambrick and Christopher Chabris, distinguished experimental psychologists at Michigan State University and Union College, rebut such assertions.  Massive data, they argue, show that

•    SAT scores do predict first-year GPA, whole-college GPA, and graduation likelihood, with the best prediction coming from a combination of both high school grades and aptitude scores.
•    SAT scores of 13-year-olds predict future advanced degrees and income, much as the kindred, strongly related IQ scores predict job training and vocational success.
•    In one famous nationwide sample, the IQ scores of Scottish 11-year-olds predicted their later-life longevity, even after adjusting for socioeconomic status.
•    Although SAT scores are slightly higher among students from high income families, the SAT also provides an opportunity for students from nonelite public schools to display their potential—rather than to be judged by “what their schools are like.”  Thus SAT scores, when compared with assessments influenced by income-related school quality, have a social leveling effect.
•    Test preparation courses often taken by higher income prep school students “don’t change SAT scores much.”

Ergo, say Hambrick and Chabris, while other traits such as grit, social skill, conscientiousness, and creativity matter, too, “the idea that standardized tests and ‘general intelligence’ are meaningless is wishful thinking.”

Originally posted on April 24, 2014.

 

“39-Year-Old Deaf Woman Hears for First Time” headlined Yahoo, in one of the many gone-viral Deaf-can-now-hear videos.  Each depicts the compelling emotions of someone who, thanks to the activation of a new cochlear implant (CI), is said to be hearing sound for the first time—and (in this case) conversing in English!  Was this woman (Joanne) completely congenitally deaf as a result of Usher’s Syndrome?  And did she immediately gain, as some media implied, the ability to understand speech on first hearing it?

As my brother said in forwarding this, it’s “an amazing story.”

The power of CIs to restore hearing is, indeed, amazing, as I can attest from meeting many people with CIs at hearing loss meetings.  As one who is tracking toward the complete deafness that marked the last dozen years of my mother’s life, I anticipate someday benefitting from CIs.

Moreover, I appreciate the power of a compelling example, such as the video example I offer (here) of a child’s first experience of a home TV room hearing loop.  And who can suppress a smile when watching this boy’s first experience of a CI?

Without disrespecting the Deaf culture (which regards deafness and Sign language as not defects needing fixing), and without diminishing Joanne’s powerful experience, what shall we make of her ability to understand and to speak? Does this video overturn what psychological science has taught us about the critical period for language development during life’s early years?  Is it not important that children receive CIs before language develops?  Haven’t experiments that removed cataracts and “restored vision” to natively blind people taught us that, for normal perceptual experience, the brain must be sculpted by sensory input in life’s early years?

I posed these questions to Dr. Debara Tucci, a Duke Medical Center cochlear implant surgeon with whom I serve on the advisory council of the National Institute on Deafness and Other Communication Disorders.  Our shared questions:

     1. Was Joanne completely deaf from birth?  Had she heard no sound until the moment of this recording?  As I will explain in a future entry, in popular use “deaf” often conflates true and complete deafness with substantial hearing loss.  Some Usher’s Syndrome patients are born completely deaf, but others experience progressive hearing loss.  With hearing aids, they acquire language early in life.  Joanne’s use of spoken language suggests that she is not hearing speech for the first time in her life.

     2. A person who has been completely deaf from birth could potentially lip read.  When testing such patients with newly activated CIs, it would be interesting to know if they can “hear” speech when the speaker’s face is obscured.

As a CI provider, Dr. Tucci nevertheless welcomes such videos: 

“Even though the history accompanying the video may not be entirely correct, and a little misleading, it is basically a positive thing.  I would rather have 10 people come in and be told they are not a candidate than miss one person who is.  Also, we are implanting long deafened people who don't have speech/language ability not with the thought that they will develop or understand speech, but to increase connectedness and for safety concerns.”

Originally posted on April 28, 2014.

 

Reports of restored vision in children in India have been confirmed in a new Psychological Science article, summarized here, on “Improvement in spatial imagery following sight onset late in childhood.”

The research, led by Tapan Kumar Gandhi of MIT’s Brain and Cognitive Sciences department, in collaboration with Suma Ganesh and Pawan Sinha, studied children who were blinded from birth by dense cataracts. After surgery removed the cataracts at about age 12 to 14, the children were no longer completely blind. Their new ability to discern light and dark enabled some spatial imagery.

Practically, I wondered, what does this mean? Doesn’t the brain need to experience normal sensory input early in life in order to produce normal perceptual experience later in life? I asked Dr. Gandhi to explain the children’s post-surgery abilities. Could they potentially ride a bicycle or drive a car? His answer (quoted with permission):

The onset of sight is not immediately accompanied by much joy or pleasure, contrary to what is depicted in movies. The child has to get used to the new inputs. Over the first few weeks, the child begins to feel more comfortable with the visual world, even though they might not recognize much of it. Their visual acuity is sub-par, most likely permanently so. But, despite a blurry percept, the brain is able to achieve significant proficiency over the course of the first half year on many visual skills such as face detection, and visually guided navigation. Although driving is well-beyond their economic means, some of the Prakash children have indeed learned to ride a bicycle. We typically find that the children and their parents are in high spirits when they visit us for a clinical follow-up a few weeks after the surgery.

David Myers

Who Is Deaf?

Posted by David Myers Expert Jul 19, 2016

Originally posted on April 30, 2014.

 

Those of us with hearing loss cheered one of our own, Seattle Seahawks football player Derrick Coleman, as he became a national exemplar in the U.S. for living with hearing loss. We reveled in the Super Bowl Duracell ad chronicling his life story.  And we felt a warm glow when he gifted twin New Jersey 9-year-old sisters with Super Bowl tickets and handwritten encouraging words:  “Even though we have hearing aids, we can still accomplish our goals and dreams!”

As 500,000+ Google links to “Deaf Seahawks fullback” testify, Coleman’s story inspires us.  The reports of Coleman’s “deafness” also raise an interesting question:  Who is deaf?

By using a combination of hearing aids and the natural lip reading that we all do, Coleman, despite his profound hearing loss, reportedly hears his quarterback call plays amid the din of the Seahawks stadium.  And he converses, as when amiably answering questions at a Super Bowl press session. In doing so, he is akin to millions of others who live well with hearing loss.

Without our hearing aids or cochlear implants, some of us among the world’s 360 million people with hearing loss become truly deaf—unable to hear normal conversation.  When I remove my hearing aids before showering in my college gym, the locker room banter goes nearly silent.  In bed at night without my aids, my wife’s voice from the adjacent pillow becomes indecipherable, unless she turns to speak into my ear.

So, in his everyday functioning, is Derrick Coleman “deaf”?

Am I deaf?  Are my friends in the hearing loss community deaf?

Partly out of respect for my nonhearing, signing cousins in the Deaf Culture, my answer is no:  I am not Deaf.  Like Deaf people who fluently communicate with Sign, a genuine language, I am also not disabled or “hearing impaired” (which labels a person).  Rather I am a person with hearing loss.  The Hearing Loss Association of America—“the nation’s voice for people with hearing loss”—offers resources that assist “people with hearing loss and their families to learn how to adjust to living with hearing loss [and] to eradicate the stigma associated with hearing loss”—and thus to live as not-deaf.

I asked the Association’s recently retired director, Brenda Battat, whose hearing was partially restored with a cochlear implant, if she considers herself deaf.  “No. From a life experience, functioning, and self-identity perspective I do not consider myself deaf.”

Ditto my friend, musician Richard Einhorn, who has a substantial hearing loss and was recently featured in a news story that was headlined: "Hearing Loops Give Music Back to Composer Who Went Deaf in a Day."

“The ‘deaf’ label is not accurate,” notes Einhorn, who uses various technologies to hear.  “With a good hearing aid and additional listening technology such as hearing loops, I can hear well enough in most situations to participate fully in conversations and enjoy live music, theater, and films.”

Thanks to new hearing technologies, most of us with hearing loss can effectively function as not-deaf.  My state-of-the-art hearing aids amplify sound selectively, depending on my loss at different frequencies.  They offer directionality.  They compress sound (raising soft sound and lowering extremely loud sound).  Via a neck-worn Bluetooth streamer, they wirelessly transmit phone conversation and music from my smart phone to both my hearing aids. And thanks to my favorite hearing technology—the hearing loops that broadcast PA sound wirelessly to my in-the-ear speakers (aka hearing aids)—I hear!

Ergo, while most natively Deaf people are served by Sign, the rest of us—the invisible majority with hearing loss—need hearing assistance.  We respect, but live outside of, the Deaf Culture.  We benefit from new hearing technologies.  Lumping all people with hearing loss together as “deaf” respects neither Deaf people nor those with hearing loss.  Here ye, hear ye!

Originally posted on May 2, 2014.

 

Many faculty fret over students’ in-class use of computers—ostensibly there for note taking, but often also used for distracting e-mail, messaging, and checking social media.  A soon-to-be-published study by Pam Mueller (Princeton University) and Daniel Oppenheimer (UCLA) offers faculty an additional justification for asking students not to use computers.

In three experiments, they gave students either a laptop or a notebook and invited them to take notes on a lecture (a TED lecture in two of the studies).  Later, when the researchers tested students’ memory for the lecture content, they found no difference in recall of factual information.  But taking notes in longhand, which required participants to summarize content in their own words, led to better performance on conceptual-application questions. “The Pen Is Mightier Than the Keyboard” is the apt title of their article, to appear in Psychological Science.

“Participants using laptops were more inclined to take verbatim notes,” explained Mueller and Oppenheimer.  Better to synthesize and summarize, they conclude:  “laptop use in classrooms should be viewed with a healthy dose of caution; despite their growing popularity, laptops may be doing more harm in classrooms than good.”

For one of my colleagues, this study, combined with the unwanted distractions of in-class computer use, inspires a new class policy:  for better learning, no computer use in class.

Originally posted on May 6, 2014.

 

At the 2012 International Congress of Psychology meeting in Cape Town, I enjoyed a wonderful talk by Elizabeth Loftus, which offered a terrific demo of how memory works.  Loftus showed us a handful of individual faces that we were later to identify, as if in a police line-up.  Later, she showed us some pairs of faces, one seen earlier and one not, and asked us which one we had seen.  In the midst of these, she slipped in a pair of faces that included two new faces, one of which was rather like an earlier seen face. 

Most of us understandably but wrongly identified this face as previously seen.  As the demonstration’s climax, when she showed us the originally seen face and the previously chosen wrong face, most of us (me, too) picked the wrong face!  As a result of memory reconsolidation, we—an audience of psychologists who should have known better—had replaced the original memory with a false memory.

ICP.jpg

David Myers

Revealing Hidden Secrets

Posted by David Myers Expert Jul 19, 2016

Originally posted on May 8, 2014.

 

Knowing that people don't wear their hearts on their sleeves, psychologists have longed for a "pipeline to the heart."  One strategy, developed nearly a half century ago by Edward Jones and Harold Sigall, created a “bogus pipeline.” Researchers would convince people that a machine could use their physiological responses to measure their private attitudes.  Then they would ask them to predict the machine's reading, thus revealing attitudes which often were less socially desirable than their verbalized attitudes. 

More recently, psychologists have devised clever strategies for revealing “implicit attitudes,” by using reaction times to assess automatic associations between attitude objects and evaluative words. (In contrast to consciously held “explicit attitudes,” implicit attitudes are like unconscious habits.)

A new working paper (abstract; PDF) by Katherine Coffman and fellow economists demonstrates a third strategy for getting people to reveal sensitive information—the “Item Count Technique” (ICT).

One group was given four simple statements, such as “I spent a lot of time playing video games as a kid,” and then was asked how many of the four statements “apply to you.”  A second group was given the same four statements plus a fifth:  “I consider myself to be heterosexual,” and then was asked how many of the five statements “apply to you.”

Although no individual is asked to reveal which specific statements are true of them, a comparison of the two groups’ answers reveals—for the sampled population—the percent agreeing with the fifth statement.
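For readers who want to see the estimator in action, here is a minimal sketch using simulated responses (not the Coffman team’s data), in which the sensitive statement is true, by construction, of 10 percent of respondents:

```python
# Minimal sketch of the Item Count Technique estimator (simulated data).
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Control group: how many of 4 innocuous statements apply to each person.
control = rng.binomial(4, 0.5, size=n)

# Treatment group: the same 4 statements plus a sensitive fifth one,
# true here (by construction) for 10 percent of respondents.
sensitive = rng.random(n) < 0.10
treatment = rng.binomial(4, 0.5, size=n) + sensitive

# No one reveals which statements apply, yet the difference in group
# means estimates the prevalence of the sensitive statement.
print(f"estimated prevalence: {treatment.mean() - control.mean():.1%}")  # ~10%
```

The difference in group means recovers the prevalence without any individual’s answer being identifiable.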

Thus, without revealing anyone’s sexual orientation, the ICT aggregate data showed a 65 percent higher rate of non-heterosexual identity than was self-reported among people who were asked straight-out: “Do you consider yourself to be heterosexual?”  

But then let’s not discount public surveys.  Nate Silver’s digest of presidential polling data correctly predicted not only the 2012 U.S. national presidential outcome, but also the outcome in all 50 U.S. states.  Specific, explicit attitudes can predict behavior.

David Myers

Why Do We Sleep?

Posted by David Myers Expert Jul 19, 2016

Originally posted on May 12, 2014.

 

Sleep consumes time we could spend foraging and it exposes us to predators.  It’s a waste and a risk.  So why do humans sleep?  Why didn’t nature design us for continual activity and vigilance?

In the October 18, 2013 Science, researchers offer an answer:  sleep enables house cleaning.  Studies of mice show that sleep sweeps the brain of toxic metabolic waste products.

Ergo, at the day’s end we can say to our loved ones:  Good night.  Sleep tidy.

Originally posted on May 14, 2014.

 

Tyler Vigen, a Harvard Law student, has a new website (here) that offers “a fun way to look at correlations and to think about data.”  Among the whimsical spurious (chance) correlations he presents is a rare example of a perfect 1.0 correlation.  I’ve reconstructed it into a form familiar to psychology teachers and students:

 

1400008332480-1.png
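For classroom use, a minimal sketch (with made-up numbers) shows what a 1.0 correlation does—and does not—mean: r = 1.0 says only that the paired values fall exactly on a straight line, which two short, causally unrelated series can do by coincidence:

```python
# Two invented five-point series whose paired values happen to fall
# exactly on a line, yielding a "perfect" yet meaningless correlation.
import numpy as np

series_a = np.array([87, 89, 91, 95, 98])       # hypothetical annual rate A
series_b = np.array([179, 183, 187, 195, 201])  # hypothetical annual rate B

r = np.corrcoef(series_a, series_b)[0, 1]
print(f"r = {r:.2f}")  # 1.00
```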

Originally posted on May 19, 2014.

 

Self-serving bias—the tendency to perceive oneself favorably—has become one of personality-social psychology’s most robust phenomena.  It’s our modern rendition of ancient wisdom about pride, which theologians have considered the basic sin (much as Freud considered repression the basic defense mechanism).

Self-serving bias appears in people’s judging themselves as better-than-average—on just about any subjective, socially desirable dimension.  Compared with people in general, most people see themselves as more ethical, friendly, intelligent, professionally competent, attractive, unprejudiced, and healthier—and even as more unbiased in their self-assessments!

As part of my reporting on the world of psychology, I enjoy, as a British Psychological Society affiliate member, two of its journals, and also its Research Digest.  (The digest, authored by Christian Jarrett, is available as a free bimonthly e-mail here.) The Digest put a smirk on my face with its synopsis of a new British Journal of Social Psychology report by Constantine Sedikides, Rosie Meek, Mark Alicke, and Sarah Taylor.  The Sedikides team found that English prisoners incarcerated for violence and robbery saw themselves, compared with “an average member of the community,” as (I am not making this up) more moral, kind, and compassionate.

Shelley Taylor’s humbling surmise, in her 1989 book, Positive Illusions, still rings true: “The [self-]portraits that we actually believe, when we are given freedom to voice them, are dramatically more positive than reality can sustain.”

 

Originally posted on May 21, 2014.

 

A New York Times report on “the extreme sport” of remembering confirms what psychology instructors have long taught:  the power of mnemonic aids, especially organized images, to enable memory performance.  We humans are really good at retaining visual images, and we’re much better at later reproducing visualizable words (bicycle) than abstract words (process).  Thus it can help, when remembering a short grocery list, to use the peg-word system, with numerically ordered items—bun, shoe, tree, door, etc.—and to hang the grocery items on those images.

Likewise, reports the Times article, all the competitors in a recent world memory contest used a “memory palace,” by associating to-be-remembered numbers, words, or cards with well-learned places, such as the rooms of a childhood home.  Challengers who claim to have invented an alternative method inevitably “come in last, or close to it,” noted one world-class competitor.

Memory researchers who study these memorists report that they are, as you might expect, smart.  But they also have unusual capacities for focused attention and holding information in working memory.

Yet, like you and me successfully forgetting previous locations of our car in the parking lot, they also need to be able to replace their place-item associations with new items. In this they are unlike students, who, if they are to become educated persons, need to retain information for months and years to come.  And for that there is no easy substitute for other well-researched memory aids, such as spaced practice, active rehearsal, and the memory consolidation that comes with a solid night’s sleep.

Originally posted on May 23, 2014.

 

John Watson and Rosalie Rayner made psychology history with their 1920 report of the fear conditioning of 11-month-old “Little Albert.”  After repeated pairings of a white rat with an aversive loud noise, Albert reportedly began whimpering at the sight of the rat.  Moreover, his fear reaction generalized, to some extent, to the sight of a rabbit, a dog, and a sealskin coat, but not to more dissimilar objects.

Ever since, people have wondered what became of Little Albert.  One team of psychologist-sleuths identified him as Douglas Merritte—the son of a campus hospital wet nurse—who died of meningitis at age 6.  For a forthcoming article in the American Psychologist, another team of sleuths—Russell Powell, Nancy Digdon, Ben Harris, and Christopher Smithson—has identified an even more promising candidate.  William Albert Barger, who went by “Albert B.”—the very name used by Watson and Rayner—neatly fits many of Little Albert’s known characteristics.  This Albert was not brain-damaged and was easy-going, though (likely coincidentally, given how Albert’s fears diminished between sessions) he had an aversion to dogs!

Albert died in 2007, without ever knowing of his early life in a hospital residence, or of his apparent part in psychology’s history.

Originally posted on May 28, 2014.

 

Climate change is upon us.  The recent National Climate Assessment, assembled by a large scientific panel, confirms that greenhouse gases continue to accumulate.  The planet is warming. The West Antarctic ice sheet is doomed. The seas have begun rising.  And more extreme weather will plague our future.

Alas, most of the American public is not yet alarmed about this weapon of mass destruction.  The 31 percent who in 1998 thought “the seriousness of global warming is generally exaggerated” increased to 42 percent in 2014.  And the 34 percent of Americans who in 2014 told Gallup they worry “a great deal” about global warming was essentially the same as in 1989.

Part of the problem is what psychologists and their students know as the availability heuristic. Our judgments get colored by mentally available events and images. And what’s more cognitively available than slow climate change is our recently experienced local weather (see here and here).  Local recent temperature fluctuations tell us nothing about long-term planetary trends. (Our current weather is just weather.) Yet, given unusually hot local weather, people become more accepting of global climate warming, while a recent cold day reduces people’s concern about climate warming and overwhelms less memorable scientific data.  Snow in March?  “So much for global warming!”

After Hurricane Sandy devastated New Jersey, its residents’ vivid experience of extreme weather increased their environmentalism.  This suggests that a silver lining to the tragedy of more droughts, floods, heat waves, and other extreme weather may, in time, be increased public concern for climate change.  In the meantime, to offer a vivid depiction of climate change, Cal Tech scientists have created an interactive map of global temperatures over the last 120 years.

Originally posted on May 30, 2014.

 

If asked that question, who would come to your mind?

For a forthcoming Archives of Scientific Psychology report, Ed Diener (University of Illinois), Shigehiro Oishi (University of Virginia), and JunYeun Park (University of Illinois) painstakingly assembled data from citations, textbook page coverage, and major awards.

Their top three, in order, were Albert Bandura (whose 218,219 citations also marked him as our most cited psychologist), Jean Piaget, and Daniel Kahneman.

Looking just at introductory psychology textbook pages mentioning different psychologists, the top two were Robert Sternberg and Martin Seligman.

Originally posted on June 5, 2014.

 

An amazingly comprehensive new Lancet study, with nearly 150 authors, tracks overweight and obesity rates across 188 countries from 1980 to 2013.  Some highlights:

  • Worldwide, the proportion of overweight adults (BMI ≥ 25) increased from 29 to 37 percent among men and 30 to 38 percent among women.
  • Over the last 33 years, no country has reduced its obesity rate.
  • In 2010, “overweight and obesity were estimated to cause 3.4 million deaths.”
  • National variations are huge, with the percentage overweight ranging from 85 percent of adults in Tonga to 3 percent in Timor-Leste.

The study is amazing not only in its global comprehensiveness, across time, but also in its public, interactive data archive available from the Institute for Health Metrics and Evaluation.

As a screen shot example, I compared the U.S. increase in the overweight percentage (upper dark line) with the global increase (lower dark line).  All other countries are in light blue.

1401810233175.jpeg

Originally posted on June 12, 2014.

 

My last post—noting the new worldwide estimate that 37 percent of men and 38 percent of women are overweight—got me to wondering if we have other examples of all-humanity data. One is our species’ life expectancy, which has risen from 46.5 years in the early 1950s to 70 years today. What a gift—two dozen more years of life!

And then we have new data from the Gallup World Poll which is surveying countries with more than 98 percent of the world’s population. Aggregating data from this resource, Ed Diener, Louis Tay, and I were able to answer (here) this simple question: Asked, “Is religion important in your daily life?,” what percent of humanity will respond “yes”?

The answer: 68 percent. Two in three humans.

When mentioning this answer in talks, I offer, with a smirk, the usual caveat on reporting survey data: We should be cautious about generalizing beyond the population sampled. (These data represent but one species on one planet, and may not represent the views of other life forms elsewhere in the universe.)

What’s striking about each of these all-humanity measures is the extraordinary variation across countries—from 3 percent overweight adults in Timor-Leste to 85 percent in Tonga; from a 49-year life expectancy in Chad to 89 years in Monaco; from 16 percent for whom religion is important in Estonia to 100 percent in Bangladesh and Niger. We humans are all kin beneath the skin. Yet how we differ.

[A note to our valued readers:  Nathan DeWall and I anticipate a more relaxed two-a-week pace of blogging this summer, and returning to our weekday postings at the summer’s end.]

Originally posted on June 17, 2014

 

Is religion toxic to human flourishing . . . or is it supportive of human happiness, health, and helpfulness? Let’s make this empirical: Is religious engagement associated with humans living well, or with misery, ill-health, premature death, crime, divorce, teen pregnancy, and the like?

The answer differs dramatically by whether we compare places (such as more versus less religious countries or states) or individuals.

For starters, I manually harvested data from a Gallup World Poll, and found a striking negative correlation across 152 countries between national religiosity and national well-being:

1402340150753.png

Then I harvested General Social Survey data from the U.S. and found—as many other researchers in many other countries have found (though especially in more religious countries)—a positive correlation between religiosity and happiness across individuals.

1402340224923.jpeg

 

For additional striking examples of the religious engagement paradox—associating religious engagement with life expectancy, smoking, arrest rate, teen pregnancy, and more (across states versus across individuals)—see here.

Princeton economist Angus Deaton and psychologist Arthur Stone have recently been struck by the same paradox. They ask (here), “Why might there be this sharp contradiction between religious people being happy and healthy, and religious places being anything but?”

Before answering that question—and wondering whether the more important story is told at the aggregate or individual level—consider a parallel paradox, which we might call “the politics of wealth paradox.” Question: Are rich Americans more likely to vote Republican or Democrat?

When we compare states (thanks to Chance News) we can see that low income predicts Republican preferences. Folks in wealthy states are more likely to vote Democratic!  So, being rich inclines one to liberalism?

Not so fast: comparing individuals, we see the opposite (and more expected) result—high income folks vote more Republican.

 

1402340389781.jpeg

These are the sorts of findings that excite behavioral science sleuths. Surely there must be some confounding variables. With religiosity, one such variable is income—which is lower in highly religious countries and states. Control for status factors such as income (as Louis Tay did for our article with Ed Diener), and the negative correlation between religiosity and well-being disappears, even reversing to slightly positive. Likewise, low income states differ from high income states in many ways, including social values that also predict voting.
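This pattern—opposite correlations at the aggregate and individual levels—is easy to reproduce. Here is a toy simulation (invented numbers, not the Gallup or GSS data) in which a national-income confound yields a negative country-level correlation between religiosity and well-being even though the within-country association is positive:

```python
# Toy simulation of the religious-engagement paradox: a confound
# (national income) produces opposite correlations at the country
# and individual levels. All parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_countries, n_people = 50, 200

income = rng.normal(0, 1, n_countries)  # country income levels
religiosity_c = -0.8 * income + rng.normal(0, 0.3, n_countries)  # poorer -> more religious

religiosity, well_being, means = [], [], []
for c in range(n_countries):
    r = religiosity_c[c] + rng.normal(0, 0.5, n_people)  # individuals vary
    # Well-being rises with national income and, modestly, with an
    # individual's own religiosity.
    w = income[c] + 0.3 * r + rng.normal(0, 0.5, n_people)
    religiosity.append(r)
    well_being.append(w)
    means.append((r.mean(), w.mean()))

religiosity = np.concatenate(religiosity)
well_being = np.concatenate(well_being)
mr, mw = np.array(means).T

print("country-level r:", np.corrcoef(mr, mw)[0, 1])   # negative
# Hold country (and thus income) constant by centering on country means:
dr = religiosity - np.repeat(mr, n_people)
dw = well_being - np.repeat(mw, n_people)
print("within-country r:", np.corrcoef(dr, dw)[0, 1])  # positive
```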

Ergo, my hunch is that, in both the religious and political realms, the most important story is found at the level of the individual. Nevertheless, there are practical uses for these data. If you’re wanting to make religious engagement look bad, use the aggregate, macro-level data. If you want to make religious engagement look good, use the individual data.

Originally posted on June 26, 2014.

 

The development of adolescent impulse control lags behind sensation-seeking.  That’s the bottom-line result of Laurence Steinberg’s report from surveys of more than 7,000 American 12- to 24-year-olds, conducted as part of the National Longitudinal Study of Youth and Children and Young Adults. Sensation-seeking behaviors peak in the mid-teens, with impulse control developing more slowly as the frontal lobes mature.

1401206395183.png

These trends fit nicely with data from longitudinal studies that, after following lives through time, find that most people become more conscientious, stable, agreeable, and self-confident in the years after adolescence.  The encouraging message for parents of 15-year-olds: you may be pleasantly surprised at your more self-controlled 25-year-old offspring to come.  And for courts, says Steinberg, the brain development and behavioral data together should inform decisions about the criminal sentencing of juveniles.

Originally posted on July 1, 2014.

 

In all of recent psychological science there have been, to my mind, no more provocative studies than those by Benjamin Libet.  His experiments seemingly showed that when we move our wrist at will, we consciously experience the decision to move it about 0.2 seconds before the actual movement. No surprise there. But what startled me was his report that our brain waves jump about 0.35 seconds before we consciously perceive our decision to move! This “readiness potential” has enabled researchers (using fMRI brain scans) to predict participants’ decisions to press a button with their left or right finger. The startling conclusion: consciousness sometimes appears to arrive late to the decision-making party.

And so it has also seemed in Michael Gazzaniga’s reports of split-brain patients who readily confabulate (make up and believe) plausible but incorrect explanations for their induced actions. If Gazzaniga instructs a patient’s right brain to “Walk,” the patient’s unaware left hemisphere will improvise an explanation for walking: “I’m going into the house to get a Coke.”  The conscious left brain is the brain’s public relations system—its explanation-constructing “interpreter.”

So, do Libet’s and Gazzaniga’s observations destroy the concept of free will?  Does our brain really make decisions before our conscious mind knows about them?  Do we fly through life on autopilot?  Are we (our conscious minds) mere riders on a wild beast?

Not so fast.  Stanislas Dehaene and his colleagues report that brain activity continuously ebbs and flows, regardless of whether a decision is made and executed.  The actual decision to move, they observe, occurs when the brain activity crosses a threshold, which happens to coincide with the average “time of awareness of intention to move” (about 0.15 second before the movement).  In their view, the mind’s decision and the brain’s activity, like a computer’s problem solving and its electronic activity, are parallel and virtually simultaneous.

The late neuroscientist Donald MacKay offered a seemingly similar idea:  “When I am thinking, my brain activity reflects what I am thinking, as [a computer’s] activity reflects the equation it is solving.”  The mind and brain activities are yoked (no brain, no mind), he argued, but are complementary and conceptually distinct.  As my colleague Tom Ludwig has noted, MacKay’s view—that mental events are embodied in but not identical to brain events—is a third alternative to both dualism and materialism (physicalism).

Originally posted on July 8, 2014.

 

In a new Politico essay (here) I offer four social psychological principles that shed light on enmities both violent (Sunni v. Shia) and playful (sports rivalries).

Originally posted on July 15, 2014.

 

Most of us have read over and again that the human brain has 100 billion neurons.  With no source but legend for that big round number—and not wanting merely to echo an undocumented estimate from other books—I set off in search of a more precise estimate.  Surely someone must have sampled brain tissue, counted neurons, and extrapolated a nerve cell estimate for the whole brain.  (It’s not that the number affects our understanding of how the brain works, but we might as well get the facts right.)

One researcher whose name I was disposed to trust—Gabrielle De Courten-Myers—explained to me by e-mail how she used “histological neuronal density and cortical thickness measurements in 30 cortical samples each from 6 males 12 to 24 years old,” from which she extrapolated an estimate of 23 billion neurons for the male cerebral cortex.  Although she didn’t have data for the rest of the brain, her guess in 2005 was that a whole-brain total would be “somewhere around 40 billion neurons.”

Later, a different research team, using a method that is beyond my pay grade to understand (but apparently involved making a “brain soup” of four male brains, postmortem, and counting neural nuclei) estimated 86 billion neurons in the male brain (though yet another expert with whom I corresponded questioned the validity of their method).

So, how many neurons have we in our human brains?  Apparently something less than 100 billion, but the number is uncertain.  What’s more certain is that we should be suspicious of unsourced big round numbers:  “The brain has 100 billion neurons.” “Ten percent of people are gay.”  “We typically use but 10 percent of our brains.” 

Originally posted on July 24, 2014.

 

Some recent naturalistic observations illustrated for me the results of longitudinal studies of human development—studies that follow lives across time, noting our capacities for both stability and change.

My procedure, though time-consuming, was simple:

  1. Observation Stage 1:  Attend a small college, living on campus with ample opportunity to observe my many friends.
  2. Intervening experience:  Let 50 years of life unfold, taking us to varied places.
  3. Observation Stage 2:  Meet and talk with these friends again, at a college reunion.

Time and again, researchers have documented the remarkable stability of emotionality, intelligence, and personality across decades of life.  “As at age 7, so at 70” says a Jewish proverb.

And so it was for my friends (with names changed to protect identities).  Thoughtful, serious Joe was still making earnest pronouncements.  Driven, status-conscious Louise continued to visibly excel.  Exuberant Mark could still talk for ten minutes while hardly catching a breath.  Gentle, kind Laura was still sensitive and kindhearted.  Mischievous prankster George still evinced an edgy, impish spirit.  Smiling, happy Joanne still readily grinned and laughed.  I was amazed:  a half century, and yet everyone seemed the same person who had walked off that graduation stage.

In other ways, however, life is a process of becoming.  Compared to temperament and to traits such as extraversion, social attitudes are more amenable to change.  And so it was for us, with my formerly kindred-spirited dorm mates having moved in different directions . . . some now expressing tea partyish concerns about cultural moral decay and big government, and others now passionate about justice and support for gay-lesbian aspirations.  Before they opened their mouths, I had no idea which was going to be which.

And isn’t that the life experience of each of us—that our development is a story of both stability and change?  Stability, rooted in our enduring genes and brains, provides our identity . . . while our potential for change enables us to grow with experience and to hope for a brighter future.

(For more on the neurobiology that underlies our stable individuality, and on the brain plasticity that enables our changing, see Richard Davidson’s recent Dana Foundation essay.)

Originally posted on July 29, 2014.

 

July brought the pleasure of attending Stanford University’s introduction to psychology teaching conference, hosted by its Psych One program coordinator, Bridgette Martin Hard.

One of the 70 attendees was the indefatigable Sue Frantz, winner of multiple awards and citations for her contributions to the teaching of psychology (and to educating faculty about teaching technologies).  Frantz, who is also the Society for the Teaching of Psychology’s Vice-President for Resources, tweeted conference highlights:

Worth TLC @WorthPsychTLC ·  Jul 10

.@ericlandrum: Employers want effective communicators, critical thinkers, & those who can apply knowledge to rl #psychoneconference

Worth TLC @WorthPsychTLC ·  Jul 10

.@ericlandrum book recommendation: Student Success in College. Review here: http://www.insidehighered.com/news/2005/05/18/kuh#sthash.u2Y1V0vQ.dpbs …

Worth TLC @WorthPsychTLC ·  Jul 10

E.Hardin: To stop group disc, silently raise hand, signaling stdts to stop talking & raise hands to signal others #psychoneconference [Slightly edited]

Worth TLC @WorthPsychTLC ·  Jul 10

R.Jhangiani: Have you seen this article? Revisiting the Stanford Prison Study (2007). http://www.ncbi.nlm.nih.gov/pubmed/17440210

Worth TLC @WorthPsychTLC ·  Jul 10

R.Jhangiani: It's the Stanford Prison STUDY, not the Stanford Prison EXPERIMENT

Worth TLC @WorthPsychTLC ·  Jul 10

D.Myers: To increase engagement, pack students into a small space. Stack extra chairs in the back. #psychoneconference

Retweeted by Worth TLC

Melissa Beers @mjbeers1 ·  Jul 11

When freshmen reappraise anxiety as arousal that can help them do better, academic performance improves. #psychoneconference

Worth TLC @WorthPsychTLC ·  Jul 11

S.Nolan: Free STP ebook Applying the Science of Learning to Education - http://teachpsych.org/ebooks/asle2014/index.php … #psychoneconference

For many more of Sue Frantz’s tweets—and to read her frequent tweeting of news and research from psychological science—follow her and others at Worth Publishers’ faculty lounge.

David Myers

How Best to Prepare Students for Life Success?

Originally posted on August 7, 2014.

 

One of the many delights from Stanford’s recent conference on teaching introductory psychology was being with and hearing Boise State professor Eric Landrum.  The exuberant Landrum is a longtime teaching-of-psychology leader, researcher, and author—and the 2014 president of the Society for the Teaching of Psychology.

His presentation offered his “all-time favorite PowerPoint slide.”  It summarizes the conclusions of research by Michigan State’s Collegiate Employment Research Institute showing the main reasons why new college grads get fired.  These include: lack of work ethic, failure to follow instructions, missing assignments or deadlines, and being late.

Sound familiar?  Landrum, who studies what helps students succeed, draws a moral from these findings:  by simulating a real-world employer and holding to standards, he is doing his students a great favor.  He is preparing them for real-world success.

1406565226030-1.png

 

David Myers

The Eyes Have It

Posted by David Myers Expert Jul 19, 2016

Originally posted on August 12, 2014.

 

One of social psychology’s intriguing and oft-replicated findings is variously known as the “own-race bias,” the “other-race effect,” and the “cross-race effect”—all of which describe the human tendency to recall faces of one’s own race more accurately than faces of other races. “They”—the members of some other group—seem to look more alike than those in our own group. 

With greater exposure to other-race faces, as when residing among those of a different race, people improve at recognizing individual faces.  Still, the phenomenon is robust enough that social psychologists have wondered what underlies it.  In the July Journal of Personality and Social Psychology, a research team led by Kerry Kawakami at York University offers a possible contributing factor:  When viewing faces during several experiments, White participants attended more to the eyes of White people, and to the nose and mouth of Black people.  Eye gaze, they reason, is “individuating”—it helps us discern facial differences.  Thus the ingroup eye-gaze difference may help explain the own-race bias.

Originally posted on August 21, 2014.

 

One of psychology’s big discoveries is our almost irresistible tendency to judge the likelihood of events by how mentally available they are—a mental shortcut that Daniel Kahneman and Amos Tversky identified as “the availability heuristic.”  Thus anything that makes information pop into mind—its vividness, recency, or distinctiveness—can make it seem commonplace.  (Kahneman explores the power of this concept at length in Thinking, Fast and Slow, which stands with William James’ Principles of Psychology on my short list of greatest-ever psychology books.)

My favorite example of the availability heuristic at work is people’s misplaced fear of flying.  As I document in the upcoming Psychology, 11th Edition, from 2009 to 2011 Americans were—mile for mile—170 times more likely to die in a vehicle accident than on a scheduled flight.  When flying, the most dangerous part of our journey is typically the drive to the airport.  In a late 2001 essay, I calculated that if—because of 9/11—we in the ensuing year flew 20 percent less and instead drove half those unflown miles, about 800 more people would die.  German psychologist Gerd Gigerenzer later checked my estimate against actual traffic fatalities (why didn’t I think to do that?) and found that traffic fatalities did, indeed, jump after 9/11.  Thanks to those readily available, horrific mental images, terrorists had killed more people on American highways than died on those four ill-fated planes.
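The arithmetic behind that estimate is simple enough to sketch. The inputs below are rough, illustrative stand-ins (the annual passenger-miles flown and the road fatality rate of that era are approximations, not the exact numbers behind the 2001 calculation or Gigerenzer’s check):

```python
# Back-of-envelope sketch with rough, assumed inputs (see above).
flown_miles = 500e9            # assumed annual U.S. passenger-miles flown
unflown = 0.20 * flown_miles   # flying 20 percent less
driven = 0.5 * unflown         # driving half of those unflown miles
deaths_per_mile = 1.5 / 100e6  # ~1.5 road deaths per 100 million vehicle-miles
print(f"{driven * deaths_per_mile:.0f} additional road deaths")  # ~750, near 800
```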

The availability heuristic operates in more mundane ways as well.  This morning I awoke early at an airport hotel, where I had been waylaid after a flight delay.  The nice woman working the breakfast bar told me of how she, day after day, meets waylaid passengers experiencing weather problems, crew delays, and mechanical problems.  Her conclusion (from her mentally available sample of flyers):  something so often goes awry that if she needed to travel, she would never fly.

Vivid examples make us gasp.  Probabilities we hardly grasp.

Originally posted on August 26, 2014.

 

In a recent New York Times essay (here), Henry Roediger explains the insights gleaned from his research on “the testing effect”— the enhanced memory that follows actively retrieving information, rather than simply rereading it. Psychologists sometimes also refer to this phenomenon as “test-enhanced learning,” or as the “retrieval practice effect” (because the benefits derive from the greater rehearsal of information when self-testing rather than rereading).

As Roediger explains, “used properly, testing as part of an educational routine provides an important tool not just to measure learning, but to promote it.”

For students and teachers, I offer a 5-minute animated explanation of the testing effect and how to apply it in one’s own study.  (I intend this for a class presentation or viewing assignment in the first week of a course.) See Make Things Memorable!  How to Study and Learn More Effectively.

Originally posted on September 3, 2014.

 

Skimming Paul Taylor’s The Next America: Boomers, Millennials, and the Looming Generational Showdown, a 2014 report of Pew Research Center data on U.S. social trends, brought to mind one of my pet peeves: the favoritism shown to seniors over today’s more economically challenged Millennials and their children. Since passing into AARP-eligible territory, I have often purchased fares or tickets at discounted prices, while the single parent in line behind me got hit with a higher price. One website offers 250,000+ discounts for folks over 50.

A half-century and more ago it made sense to give price breaks to often-impoverished seniors wanting a night out at the movies, hungry for a restaurant meal, or needing to travel on buses and trains. Many seniors still struggle to make ends meet and afford housing.  But thanks to improved Social Security and retirement income and to decreased expenses for dependents and mortgages, their median net worth has been increasing—37 percent since 1984, Taylor shows, while those under 35 have seen their net worth plummet 44 percent.

1406565534746-1.png

And consider who today’s poor are (from this figure, available here as well as in Taylor’s excellent book). Among the predictors is not only race but age.  Compared with four decades ago, today’s under-35 generation experiences a nearly doubled risk of poverty, while their senior counterparts suffer one-third the poverty rate of their 1960s counterparts.

Ergo, in view of this historical change in poverty risk, should we adjust our social priorities? Might a more child-affirming culture consider discounts for card-carrying custodial parents? And could we not offer inflation adjustments not only to senior citizen Social Security stipends but also to minimum wages, tax exemptions for dependents, and family and food assistance?

 

Originally posted on September 5, 2014.

 

Feeling stressed by multiple demands for your time and attention?  Daniel Levitin, director of the Laboratory for Music, Cognition and Expertise at McGill University and author of The Organized Mind: Thinking Straight in the Age of Information Overload, has some suggestions.  In a recent New York Times essay, he advises structuring our day to give space both for undistracted task-focused work and for relaxed mind-wandering:

If you want to be more productive and creative, and to have more energy, the science dictates that you should partition your day into project periods. Your social networking should be done during a designated time, not as constant interruptions to your day.

Email, too, should be done at designated times. An email that you know is sitting there, unread, may sap attentional resources as your brain keeps thinking about it, distracting you from what you’re doing. What might be in it? Who’s it from? Is it good news or bad news? It’s better to leave your email program off than to hear that constant ping and know that you’re ignoring messages.

Increasing creativity will happen naturally as we tame the multitasking and immerse ourselves in a single task for sustained periods of, say, 30 to 50 minutes. Several studies have shown that a walk in nature or listening to music can trigger the mind-wandering mode. This acts as a neural reset button, and provides much needed perspective on what you’re doing.

As one who is distracted by a constant stream of e-mails and the temptations of favorite websites, this is advice I should take to heart.  But I have benefitted from an e-mail system that diverts e-mails sent to my public e-mail address (including political fund-raising appeals and list mail) from those sent to a private e-mail address known to family and colleagues.  The public e-mails reach my inbox only at the day’s end.

I also find it helpful to take work out to coffee shops, including one that doesn’t have Internet access.  “They should charge you extra for that,” observed one friend.

In our upcoming Psychology, 11th Edition, Nathan DeWall and I offer some further advice:

In today’s world, each of us is challenged to maintain a healthy balance between our real-world and online time. Experts offer some practical suggestions for balancing online connecting and real-world responsibilities.

Monitor your time. Keep a log of how you use your time. Then ask yourself, “Does my time use reflect my priorities? Am I spending more or less time online than I intended? Is my time online interfering with school or work performance? Have family or friends commented on this?”

Monitor your feelings. Ask yourself, “Am I emotionally distracted by my online interests? When I disconnect and move to another activity, how do I feel?”

“Hide” your more distracting online friends. And in your own postings, practice the golden rule. Before you post, ask yourself, “Is this something I’d care about reading if someone else posted it?”

Try turning off your mobile devices or leaving them elsewhere. Selective attention—the flashlight of your mind—can be in only one place at a time. When we try to do two things at once, we don’t do either one of them very well (Willingham, 2010). If you want to study or work productively, resist the temptation to check for updates. Disable sound alerts and pop-ups, which can hijack your attention just when you’ve managed to get focused. (I am proofing and editing this chapter in a coffee shop, where I escape the distractions of the office.)

Try a social networking fast (give it up for an hour, a day, or a week) or a time-controlled social media diet (check in only after homework is done, or only during a lunch break). Take notes on what you’re losing and gaining on your new “diet.”

Refocus by taking a nature walk. People learn better after a peaceful walk in the woods, which—unlike a walk on a busy street—refreshes our capacity for focused attention (Berman et al., 2008). Connecting with nature boosts our spirits and sharpens our minds (Zelenski & Nisbet, 2014).

Originally posted on September 9, 2014.

 

How do we know ourselves?  It’s partly by observing our own actions, proposed Daryl Bem’s self-perception theory.  Hearing ourselves talk can give us clues to our own attitudes.  Witnessing our actions gives us insight into the strength of our convictions (much as we observe others’ behavior and make inferences). Our behavior is often self-revealing.

The limits of such self-revelation have recently been explored by one of psychology’s most creative research teams at Sweden’s Lund University. The researchers, including Andreas Lind, were curious: “What would it be like if we said one thing and heard ourselves saying something else?” Would we experience an alien voice?  An hallucination? Would we believe our ears?

Through a noise-cancelling headset, the participants heard themselves name various font colors, such as the word green presented in a gray font color. But sometimes, the wily researchers substituted a participant’s own voice saying a previously recorded word, such as “green” instead of the correctly spoken “gray.” Surprisingly, two-thirds of these word switches went undetected, with people typically experiencing the inserted word as self-produced! (For more from the creative Lund University "choice blindness" research group, see here.)

A second new demonstration of the self-revealing power of our own behavior comes from research on the effects of feedback from our face and body muscles. As we have known for some time, subtly inducing people to make smiling rather than frowning expressions—or to stand, sit, or walk in an expansive rather than contracted posture—affects people’s self-perceptions.  Motions affect emotions.

At the University of Cologne, Sascha Topolinski and his colleagues report that even subtle word articulation movements come tinged with emotion.  In nine experiments they observed that both German- and English-speaking people preferred nonsense words and names spoken with inward (swallowing-like) mouth movements—for example, “BENOKA”—rather than outward (spitting-like) motions, such as “KENOBA.”  Ostensible chat partners given names (e.g., Manero) that activated ingestion muscles were preferred over chat partners whose names activated muscles associated with expectoration (e.g., Gasepa).

Self-perception theory lives on.  Sometimes we observe ourselves and infer our thoughts and feelings.

Originally posted on September 11, 2014.

 

In a recent blog essay (here) I advised thinking critically about big round numbers, including claims that the brain has 100 billion neurons, that we use 10 percent of our brains, and that 10 percent of people are gay.

Regarding the latter claim, a recent Gallup survey asked 121,290 Americans about their sexual identity: “Do you, personally, identify as lesbian, gay, bisexual, or transgender?” “Yes,” answered 3.4 percent.  And when a new National Center for Health Statistics study asked 34,557 Americans about their sexual identity, all but 3.4 percent of those who answered indicated they were straight. The rest said they were gay or lesbian (1.6 percent), bisexual (0.7 percent), or “something else” (1.1 percent).

Questions have recently arisen about another of psychology’s big round numbers—the claim that 10,000 practice hours differentiates elite performers, such as top violinists, from average to excellent performers.  As the distinguished researcher Anders Ericsson observed from his study of musicians (here), “the critical difference [is] solitary practice during their music development, which totaled around 10,000 hours by age 20 for the best experts, around 5,000 hours for the least accomplished expert musicians and only 2,000 hours for serious amateur pianists.”

Not so fast, say David Hambrick, Brooke Macnamara, and their colleagues (here and here). In sports, music, and chess performance, for example, people's practice time differences account for a third or less of their performance differences. Raw talent matters, too.

Perhaps both are right?  Are superstar achievers distinguished by their unique combination of both extraordinary natural talent and extraordinary daily discipline?

Originally posted on September 15, 2014.

 

Every once in a while I reread something that I've reported across editions of my texts, scratch my head, and ask myself: Is this really true?

Such was the case as I reread my reporting that “With the help of 382 female and 312 male volunteers. . . Masters and Johnson monitored or filmed more than 10,000 ‘sexual cycles.’”

Really?

I wasn't just makin’ stuff up.  Masters and Johnson do report (on page 15 of Human Sexual Response) their “conservative estimate of 10,000 complete sexual response cycles” in their laboratory (some involving multiple female orgasms).

But let’s do the numbers.  If they observed 10,000 complete sexual cycles over eight years[1] (from 1957 to 1965), then they averaged 1,250 sexual cycles observed per year.  Could we assume about an hour dedicated to each observation—including welcoming the participant(s), explaining the day’s tasks, attaching instruments, observing their behavior, debriefing them, and recording their observations?  And could we assume about 40 weeks a year of observation?  (Meanwhile, they were also running a sexual therapy clinic, writing, managing a lab, etc.)

So . . . doing the numbers . . . that’s roughly 31 weekly hours observing sex . . . for eight years.
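For the curious, that arithmetic is easy to check. Here is a minimal sketch; the one-hour-per-observation and 40-weeks-per-year figures are the assumptions posed above, not documented facts:

```python
# Back-of-envelope check of the Masters and Johnson numbers.
# Assumptions (posed above, not documented): ~1 hour per observed cycle,
# ~40 observing weeks per year, 10,000 cycles over 8 years.
cycles = 10_000
years = 8
weeks_per_year = 40
hours_per_cycle = 1.0

cycles_per_year = cycles / years                          # 1,250 per year
weekly_hours = cycles_per_year * hours_per_cycle / weeks_per_year
print(f"{weekly_hours:.1f} weekly hours of observation")  # ~31.2
```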

It boggles the mind.  And one wonders: Wasn't there some point of diminishing returns from observing yet another 1000 hours of sex . . . assuming Masters and Johnson reported truthfully?

I have no basis for doubting the accuracy and integrity of Masters and Johnson’s reporting.  But I do, in a spirit of curiosity, scratch my head.

 

[1] In Human Sexual Response, they report gathering data over “eleven years” (pp. 9, 20).  But Johnson didn’t join Masters until 1957, and Johnson biographer Genoa Ferguson reports that Johnson “began doing sexual function research 6 months into her research position.” Also, Masters and Johnson report (p. 10) that the first 20 months of the observations—presumably by Masters without Johnson—involved medical histories of 118 prostitutes, eleven of whom “were selected for anatomic and physiologic study.”  Ergo, although Masters and Johnson’s reporting leaves the exact study period ambiguous, it appears that the great majority, if not all, of the reported 10,000+ “complete sexual response cycles” were observed during the seven or eight years after Johnson began her work with Masters. They also do not document the lab layout, or precisely how they observed their subjects. (As a point of contrast, Stanley Milgram’s similarly classic Obedience to Authority did precisely report on the participants, methods, and results of his various experiments, including drawings of the lab layout and equipment.)

Originally posted on September 7, 2014.

 

My wife loves me, despite smirking that I am “boringly predictable.”  Every day, I go to bed at pretty much the same time, rise at the same time, pull on my khaki pants and brown shoes, frequent the same coffee shops, ride the same old bicycle, and exercise every weekday noon hour.  As I walk into my Monday-Wednesday-Friday breakfast spot, the staff order up my oatmeal and tea.  I’ll admit to boring.

But there is an upside to mindless predictability.  As my colleagues and friends Roy Baumeister, Julia Exline, Nathan DeWall, and others have documented, self-controlled decision-making is like a muscle.  It temporarily weakens after an exertion (a phenomenon called “ego depletion”) and replenishes with rest. Exercising willpower temporarily depletes the mental energy needed for self-control on other tasks.  It even depletes the blood sugar and neural activity associated with mental focus. In one experiment, hungry people who had resisted the temptation to eat chocolate chip cookies gave up sooner on a tedious task (compared with those who had not expended mental energy on resisting the cookies).

President Obama, who appreciates social science research, understands this.  As he explained to Vanity Fair writer Michael Lewis, “You’ll see I wear only gray or blue suits. I’m trying to pare down decisions. I don’t want to make decisions about what I’m eating or wearing. Because I have too many other decisions to make.”  Lewis reports that Obama mentioned “research that shows the simple act of making decisions degrades one’s ability to make further decisions,” noting that Obama added, “You need to focus your decision-making energy. You need to routinize yourself. You can’t be going through the day distracted by trivia.”

So, amid today’s applause for “mindfulness,” let’s put in a word for mindlessness.  Mindless, habitual living frees our minds to work on more important things than which pants to wear or what breakfast to order.  As the philosopher Alfred North Whitehead argued, “Civilization advances by extending the number of operations which we can perform without thinking about them.”

Originally posted on September 23, 2014.

 

In the September Observer (from the Association for Psychological Science), Nathan explains why “Brain Size Matters.”  He summarizes, and suggests how to teach, Robin Dunbar’s conclusion that “Our brain size evolved to accommodate social groups that contain roughly 150 people.”

In the same issue, David’s essay on “Inspiring Interest in Interests” recaps research on the stability and motivational power of career-related interests, and offers students links to inventories that can assess their own interests and well-matched vocations.

[Image: September 2014 APS Observer cover]


Do Look-Alikes Act Alike?

Posted by David Myers, Jul 19, 2016

Originally posted on September 30, 2014.

 

Behavior geneticists have gifted us with two stunning findings—discoveries that overturned what I used to believe about the environment’s power to shape personality.  One, dramatically illustrated by the studies of identical twins separated near birth, is the heritability of personality and intelligence.  The other, dramatically illustrated by the dissimilar personalities and talents of adopted children raised in the same home and neighborhood, is the modest influence of “shared environment.”

I know, I know . . . studies of impoverishment during the preschool years, of epigenetic constraints on genetic expression, and of family influences on attitudes, values, and beliefs, remind us that genetic dispositions are always expressed in particular environments.  Nature and nurture interact.

And might identical twins have similar personalities not just because of their shared genes, but also their environments responding to their similar looks?  If only there were people who similarly look alike but don’t share the same genes.

Happily, there are unrelated look-alikes—nontwin “doppelgängers” identified by Montreal photographer François Brunelle (do visit some examples here).  California State University, Fullerton, twin researcher Nancy Segal seized this opportunity to give personality and self-esteem inventories to these human look-alikes.

Unlike identical twins, the look-alikes did not have notably similar traits and self-esteem (see here). And in a new follow-up study with Jamie Graham and Ulrich Ettinger (here), she replicates that finding and also reports that the look-alikes (unlike biological twin look-alikes) did not develop special bonds after meeting their doppelgänger. 

The take-home message: genes matter more than looks. As the evolutionary psychologists remind us, kinship biology matters.

Originally posted on October 7, 2014.

 

The October APS Observer is out with an essay by Nathan, “Once a Psychopath, Always a Psychopath?” on people who “commit horrific crimes, experience little guilt or remorse, and then commit similar crimes again.” What is their potential for change, and how can we teach students about them?

In the same issue, I offer “The Story of My Life and Yours: Stability and Change.” It’s a celebration of what I regard as one of the great studies in the history of psychological science...Ian Deary and colleagues’ discovery of the intelligence scores of virtually all Scottish 11-year-olds in 1932, and then their retesting of samples of that population up to age 90.  The bottom line:  our lives are defined by a remarkable stability that feeds our identity, and also by a potential for change that enables us to grow and to hope for a brighter future.

[Image: October APS Observer cover]

Originally posted on October 14, 2014.

 

What would you consider psychology’s ten most provocative and controversial studies?  Christian Jarrett, a great communicator of psychological science via the British Psychological Society’s free Research Digest, offers his top ten list here.  A quick recap:

1. The Stanford Prison Experiment (aka the Stanford Prison Simulation)

2. The Milgram "Shock Experiments"

3. The "Elderly-related Words Provoke Slow Walking" Experiment (and other social priming research)

4. The Conditioning of Little Albert

5.  Loftus' "Lost in The Mall" Study

6. The Daryl Bem Pre-cognition Study

7.  The Voodoo Correlations in Social Neuroscience study

8. The Kirsch Anti-Depressant Placebo Effect Study

9. Judith Rich Harris and the "Nurture Assumption"

10. Libet's Challenge to Free Will

This is, methinks, a great list.  All ten have captured my attention and reporting (although I would reframe #5 to indicate Beth Loftus’s larger body of research on false memories and the misinformation effect).  Are there other studies that would make your top ten list?

In the cover story of the October APS Observer, Carol Tavris reflects on “Teaching Contentious Classics,” which include the Milgram experiments, and also Sherif’s Robbers Cave experiment and Harlow’s baby monkey experiments, the latter of which surely also merits inclusion on any list of psychology’s most controversial studies.

Originally posted on October 21, 2014.

 

Seth Stephens-Davidowitz uses aggregate data from Google to see if parents’ hopes for their children are gender-neutral. He reports that, actually, many parents seem eager to have smart sons and slender, beautiful daughters. You can see this for yourself (heads-up to teachers:  a cool in-class demonstration here). Google (with quote marks) and note the number of results:

  • “Is my daughter gifted”
  • “Is my son gifted”
  • “Is my son overweight”
  • “Is my daughter overweight”

As an example, here’s another pair I just tried (the OR commands a Boolean search of either version):

[Screenshot: Google result counts for another such pair of searches]

Originally posted on October 28, 2014.

 

With nearly 5000 misery-laden deaths and no end in sight, Ebola is, especially for Liberia and Sierra Leone, a West African health crisis.  It may not yet rival the last decade’s half million annual child deaths attributable to rotavirus—“Where is the news about these half-million kids dying?” Bill Gates has asked. But West Africans are understandably fearful.

And North Americans, too . . . though perhaps disproportionately fearful?

Thanks to our tendency to fear what’s readily available in memory, which may be a low-probability risk hyped by news images, we often fear the wrong things.  As Nathan DeWall and I explain in the upcoming Psychology, 11th Edition, mile for mile we are 170 times safer on a commercial flight than in a car.  Yet we visualize air disasters and fear flying. We see mental snapshots of abducted and brutalized children and hesitate to let our sons and daughters walk to school. We replay Jaws with ourselves as victims and swim anxiously.  Ergo, thanks to such readily available images, we fear extremely rare events.

As of this writing, no one has contracted Ebola in the U.S. and died.  Meanwhile, 24,000 Americans die each year from an influenza virus, and some 30,000 suffer suicidal, homicidal, and accidental firearm deaths.  Yet which affliction do many Americans fear most?  Thanks to media reports of the awful suffering of Ebola victims, and our own “availability heuristic,” you know the answer.

As David Brooks has noted, hundreds of Mississippi parents pulled their children from school because its principal had visited Zambia, a southern African country untouched by Ebola.  An Ohio school district closed two schools because an employee apparently flew on a plane (though not on the same flight) on which an Ebola-infected health care worker had traveled.  Responding to public fears of this terrible disease, politicians have proposed travel bans from affected African countries, which experts suggest actually might hinder aid and spread the disease.

Déjà vu. We fear the wrong things. More precisely, our fears—of air crashes versus car accidents, of shark attacks versus drowning, of Ebola versus seasonal influenza—are not proportional to the risks.

Time for your fall flu shot?

Originally posted on November 11, 2014.

 

A recent Beijing visit left me marveling at students’ academic enthusiasm.  In explaining Asian students’ outperformance of North American students, researchers have documented cultural differences in conscientiousness. Asian students spend more time in school and much more time studying (and see here for one recent study of the academic diligence of Asian-Americans).

The Beijing experience gave me several glimpses of this cultural difference in achievement drive and eagerness to learn.  For example, as I dined more than a half hour before speaking at the Peking University psychology department, word came that 160 students were already present.  After my talk in the overfilled auditorium (below), hands across the room were raised, with some students waving or standing, pleading to ask their questions.  And this was a Friday evening.

[Photo: the overfilled auditorium at Peking University]

Later that weekend, I met with teachers of AP psychology, whose students at select Beijing high schools pay to take AP courses in hopes of demonstrating their capacity to do college-level work in English, and thus to gain admission to universities outside China.  Several of the teachers were Americans, one of whom chuckled when explaining that, unlike in the USA, she sought to demotivate her overly motivated students, encouraging them to lighten up and enjoy life.

The plural of anecdote, of course, is not data. (My China sample was biased—high-achieving students who had gained admission to the most elite schools.) But the experiences, which replicated what I experienced in a 2008 visit to Beijing, were memorable.

Originally posted on November 25, 2014.

 

The November APS Observer is out with an essay by Nathan, “Why Self-Control and Grit Matter—and Why It Pays to Know the Difference.” It describes Angela Duckworth’s and James Gross’s research on laser-focused achievement drive (grit) and on self-control over distracting temptations . . . and how to bring these concepts into the classroom.

In the same issue, I reflect on “The Psychology of Extremism.” I describe the social psychological roots of extreme animosities and terrorist acts, including a description of Michael Hogg’s work on how people’s uncertainties about their world and their place in it can feed a strong (even extreme) group identity.

Originally posted on December 2, 2014.

 

As I explain in a recent APS Observer essay (here), my short list of psychology’s greatest research programs includes the 250+ scientific publications that have followed Scottish lives from childhood to later life.  The studies began with all Scottish 11-year-olds taking intelligence tests in 1932 and in 1947 (the results of which Ian Deary and his team discovered many years later).  After meeting Deary at an Edinburgh conference in 2006 and hearing him describe his tracking these lives through time, I have followed his team’s reports of their cognitive and physical well-being with great fascination.

Last April, some 400 alums of the testing—now 93 or 78 years old (including those shown with Deary below)—gathered at the Church of Scotland’s Assembly Hall in Edinburgh, where Deary regaled them with the fruits of their participation. One of his conclusions, as reported by the October 31st Science, is that “participants’ scores at age 11 can predict about 50% of the variance in their IQs at age 77.”

I invited Professor Deary to contribute some PowerPoint slides of his studies for use by teachers of psychology.  He generously agreed, and they may be found here.

[Photo: testing alums gathered with Ian Deary]

Photo courtesy of Ian Deary.

Originally posted on December 9, 2014.

 

Economic inequality is a fact of life.  Moreover, most folks presume some inequality is inescapable and even desirable, assuming that achievement deserves financial reward and that the possibility of making more money motivates effort.

But how much inequality is good?  Psychologists have found that places with great inequality tend to be less happy places, and that when inequality grows so does perceived unfairness, which helps offset the psychological benefits of increased affluence.  When others around us have much more than we do, feelings of “relative deprivation” may abound. And as Kate Pickett and Richard Wilkinson document, countries with greater inequality also experience greater health and social problems, and higher rates of mental illness.

So, how great is today’s economic inequality? Researchers Michael Norton and Dan Ariely invited 5,522 Americans to estimate the percent of wealth possessed by the richest 20 percent in their country. The average person’s guess—58 percent—“dramatically underestimated” the actual wealth inequality. (The wealthiest 20 percent possessed 84 percent of the wealth.)

And how much inequality would be ideal?  The average American favored the richest 20 percent possessing between 30 and 40 percent of the wealth—and, in their survey, the Republican-versus-Democrat difference was surprisingly modest.

Now, working with Sorapop Kiatpongsan in Bangkok, Norton offers new data from 55,238 people in 40 countries, which again show that people vastly underestimate inequality, and that people’s ideal pay gap between big-company CEOs and unskilled workers is much smaller than the gap that actually exists.  In the U.S., for example, the actual pay ratio of S&P 500 CEOs to their unskilled workers (354:1) far exceeds both the estimated ratio (30:1) and the ideal ratio (7:1).

Their bottom line:  “People all over the world and from all walks of life would prefer smaller pay gaps between the rich and poor.”

Originally posted on December 16, 2014.

 

The December APS Observer is out with an essay by Nathan on “The Neural Greenhouse:  Teaching Students How to Grow Neurons and Keep Them Alive.” Our brains are like greenhouses, he notes, with new neurons sprouting daily, “while others wither and die.” To take this neuroscience into the classroom, he offers three activities.

In the same issue, I say, “Let’s Hear a Good Word for Self-Esteem.” Mindful of recent research on the perils of excessive self-regard—of illusory optimism, self-serving bias, and the like—I offer a quick synopsis of work on the benefits of a sense of one’s self-worth. I also offer Google ngram figures showing sharply increased occurrences of “self-esteem” in printed English over the last century, and of decreasing occurrences of “self-control.”

Originally posted on December 23, 2014.

 

The Centers for Disease Control and Prevention, drawing from its own continuing household interviews, offers new data on who in the U.S. is most likely to suffer depression, and how often.  

Some noteworthy findings:

  • Overall rate of depression: Some 3 percent of people age 12 and over were experiencing “severe depressive symptoms.” More people—7.6 percent—were experiencing “moderate or severe” symptoms, with people age 40 to 59 at greatest risk. And most—78 percent—“had no depressive symptoms.”
  • Gender and depression. Women experience depression at nearly double (1.7 times) men’s rate.
  • Poverty and depression. People living below the poverty line are 2½ times more likely to be experiencing depression. (Does poverty increase depression? Does depression increase poverty? Or—mindful of both the stress of poverty and the CDC-documented impact of depression on work and home life—is it both?)
  • Depression and treatment.  Only 35 percent of people with severe symptoms reported contact with a mental health professional in the prior year.

[Figure: CDC data on depressive symptoms]

 

Originally posted on January 6, 2015.

 

University of Warwick economist Andrew Oswald—someone who creatively bridges economics and psychological science, as in his studies of money and happiness—offers some fascinating recent findings on his website:

  • A huge UK social experiment “offered incentives to disadvantaged people to remain and advance in work and to become self-sufficient.” Five years later the experimental group indeed had higher earnings than the control group, but lower levels of well-being (less happiness and more worries). Ouch.
  • Income (up to a point) correlates with happiness. But is that because richer people tend to be happier, or happier people tend to be richer? By following adolescent and young adult lives through time, with controls for other factors, the data reveal that happiness does influence future income.
  • After winning a lottery, do people’s political attitudes shift right?  Indeed yes.  “Our findings are consistent with the view that voting is driven partly by human self-interest.  Money apparently makes people more right-wing.”

This finding syncs with earlier findings that inequalities breed their own justification.  Upper-class people are more likely than those in poverty to see people’s fortunes as earned, thanks to their skill and effort—and not as the result of having connections, money, and good luck. 

Such findings also fit U.S. political surveys showing that high-income individuals are more likely to vote Republican...despite—here’s a curious wrinkle to ponder—high-income states being less likely to vote Republican.  We might call this “the wealth and politics paradox”—poor states and rich individuals vote conservative. Care to speculate about why?


Why Do We Care Who Wins?

Posted by David Myers, Jul 19, 2016

Originally posted on January 13, 2015.

 

Last night’s national championship college football game, today’s New York Times article on America’s greatest small college rivalry (involving my own Hope College), and the upcoming Super Bowl all bring an interesting psychological question to mind:  Why do we care who wins? What psychological dynamics energize rabid fans?

In a 2008 Los Angeles Times essay I offered answers to my own questions, which first crossed my mind just before tipoff at that rivalry game described in today’s Times. The pertinent dynamics include the evolutionary psychology of groups, ingroup bias, social identity, group polarization, and the unifying power of a shared threat.

In a 2014 Politico essay I extended these principles in reflections on political and religious animosities between groups that, to outsiders, seem pretty similar (think Sunni and Shia, or Northern Ireland’s Catholic and Protestant).  The same social dynamics that fuel fun sports rivalries can, writ large, produce deep-rooted hostilities and social violence.

Originally posted on January 20, 2015.

 

In the January Observer (here), Nathan digests—and suggests how to teach—David Creswell and Emily Lindsay’s explanations of how mindfulness improves health.  Mindfulness recruits brain regions important for stress control and keeps the sympathetic-adrenal-medullary (SAM) and hypothalamic-pituitary-adrenal (HPA) axes from going into overdrive.

David (here) notes that marriage predicts happiness.  Does it also predict physical health?  A massive meta-analysis by Theodore Robles and his colleagues indicates that, in Robles’ words, the marriage-health relationship “is similar to that of associations between health behaviors (diet, physical activity) and health outcomes.” But why?  Does marriage influence health or are healthy people more likely to marry?  Longitudinal studies suggest that marriage influences future health—for reasons that Robles explains and that class discussion might identify.


Originally posted on February 4, 2015.

 

Friday my focus was hearing research and care—at the National Institute on Deafness and Other Communication Disorders, where I sit on the Advisory Council (assessing federal support for hearing research and hearing health).  Days later, I was cheering on my ill-fated hometown Seattle Seahawks in the Super Bowl.

Alas, there is some dissonance between those two worlds, especially for fans of the team that prides itself on having the loudest outdoor sports stadium, thanks to its “12th Man” crowd noise—which has hit a record 137.6 decibels . . . much louder than a jackhammer, notes hearing blogger Katherine Bouton.

With three hours of game sound rising near that intensity, many fans surely experience temporary tinnitus—ringing in the ears—afterwards...which is nature’s way of warning us that we have been baaad to our ears.  Hair cells have been likened to carpet fibers. Leave furniture on them for a long time and they may never rebound. A rule of thumb: if we cannot talk over a prolonged noise, it is potentially harmful.

With repeated exposure to toxic sound, people are at increased risk for cochlear hair cell damage and hearing loss, and for constant tinnitus and hyperacusis (extreme sensitivity to loud noise).


Men are especially vulnerable to hearing loss, perhaps partly due to greater noise exposure from power tools, loud music, gunfire, and sporting events (some researchers have implicated noise in men’s greater hearing loss).  But some men know the risks, as 2010 Super Bowl-winning quarterback Drew Brees illustrated when he lifted his ear-muffed son Baylen during the post-game celebration.

For more on sports and noise, visit here.

Originally posted on February 5, 2015.

 

“The worst call in Super Bowl history,” read a headline in my hometown Seattle Times after Seahawks' head coach Pete Carroll seemingly threw the game away with his ill-fated decision to pass – rather than run – as the game clock expired.

 

Actually, Carroll made two end-of-half decisions in Sunday’s Super Bowl, both questioned by the NBC announcers. The differing outcomes of the decisions – and the resulting reactions by pundits and fans – offer potent examples of a mental pitfall that has been the subject of roughly 800 psychological science publications.

 

“Hindsight bias,” also known as the “I-knew-it-all-along phenomenon,” is the almost irresistible tendency to believe – after an experiment, a war, an election, or an investment – that the outcome was foreseeable. After the stock market drops (it “was due for a correction”) or an election is lost (by a “terrible candidate”), the outcome seems obvious – and thus blameworthy.

 

But, as research shows, we often do not expect something to happen until it does. Only then do we clearly see the forces that triggered the event, and feel unsurprised. Because outcomes seem as if they should have been foreseeable, we are more likely to blame decision makers for what are, in retrospect, “obvious” bad choices, rather than praise them for good ones (which also seem “obvious”). As the 19th-century philosopher Søren Kierkegaard said, “Life is lived forwards, but understood backwards.”

 

With six seconds remaining in the first half, Carroll decided to throw for a touchdown – a risk that could have resulted in the clock expiring. Better to kick a safe field goal, argued the NBC announcers. But the gamble worked, and Carroll was acknowledged to have made a “gutsy” call.

 

Then, at the game’s end, with 26 seconds left – and victory less than a yard away – Carroll and his offensive coordinator ventured a fateful, intercepted pass. Fans exploded on social media. A (less-explicit) version of most reactions: “With three downs and one timeout to go, and the league’s best power runner at the ready, what was he thinking?”

 

“The stupidest coaching decision I can recall,” I vented to my wife afterwards, aided by 20/20 hindsight.

 

But like all the football fans who made Coach Carroll an object of national ridicule, I was judging the call after knowing the outcome. The next morning I reassessed the situation. With one timeout, I now realized, Seattle could venture, at most, two running plays. The attempted pass was a free third play – which, if incomplete, would still leave them with the same two possible running plays. Moreover, the odds of an interception at the one-yard line are, I later learned, even less than the odds of a fumble. And had a touchdown pass arrived in the receiver’s hands a half-second sooner, we could use game theory to explain how the wily Seahawks won by doing what their opponent least expected.
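That “free third play” logic lends itself to a quick expected-value check. Here is a toy sketch with invented probabilities; FiveThirtyEight’s actual model was far richer:

```python
# Toy expected-value sketch of the "free third play" logic.
# All probabilities below are invented for illustration.
p_td_run = 0.55        # assumed chance a single run from the one scores
p_td_pass = 0.45       # assumed chance the pass scores
p_interception = 0.02  # assumed chance the pass is intercepted
p_incomplete = 1 - p_td_pass - p_interception

def score_prob_two_runs():
    # With one timeout left, only two running plays fit in 26 seconds.
    return p_td_run + (1 - p_td_run) * p_td_run

def score_prob_pass_first():
    # If the pass falls incomplete, the same two runs remain available.
    return p_td_pass + p_incomplete * score_prob_two_runs()

print(f"runs only:  {score_prob_two_runs():.3f}")    # ~0.80
print(f"pass first: {score_prob_pass_first():.3f}")  # ~0.87
```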

 

Responding to those who claimed he made “the worst call ever,” Carroll later explained to Today Show host Matt Lauer, “It was the worst result of a call ever. The call would have been a great one if we caught it. It would have been just fine and no one would have thought twice about it.”

 

Bringing statistical analysis to the decision, FiveThirtyEight.com impeccably calculated that Carroll had indeed made a smart call – that he had slightly increased his team’s chance of a win. What’s more, by this evidence the bad coaching decision was made by New England coach Bill Belichick, who, instead of calling a timeout, opted to let the clock run down (which would have deprived quarterback Tom Brady of another scoring opportunity in the likely event of a Seattle touchdown).

 

But probabilities are not certainties. In sports, as in life, good decisions can yield bad outcomes. Bad decisions can have lucky outcomes. And once outcomes are known we immediately retrofit our thinking. Thanks to hindsight bias, “winning erases all sins.” And losing makes a good coaching call look not gutsy but just plain stupid – even to a psychologist-fan who temporarily forgot what he teaches.

 

(Note:  This essay was simultaneously co-published by www.TheConversation.com)


Originally posted on February 9, 2015.

 

Northeastern University history professor Benjamin Schmidt is making waves after harvesting 14 million student reviews from “Rate My Professor.”  He offers a simple interactive tool that can allow you—perhaps as an in-class demonstration—to compare words that students use to describe male and female professors.

 

You can give it a try, here.  I entered some intelligence-related words (“smart,” “brilliant,” “genius”) and some emotion-related words (“sweet,” “nasty”).  Even as one who writes about gender stereotypes, I was stunned by the gender effect.
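To convey what such a tool does under the hood, here is a toy sketch of the comparison: the rate of a target word per million words of review text, split by professor gender. The four reviews below are invented; Schmidt’s corpus held 14 million real ones:

```python
# Toy version of the word-rate comparison (invented reviews, not Schmidt's data).
reviews = [
    ("male",   "brilliant lecturer but a tough grader"),
    ("male",   "genius explanations and a brilliant course"),
    ("female", "so sweet and helpful in office hours"),
    ("female", "smart organized and sweet"),
]

def rate_per_million(word: str, gender: str) -> float:
    # Occurrences of `word` per million words, among reviews of one gender.
    words = [w for g, text in reviews if g == gender for w in text.split()]
    return 1_000_000 * words.count(word) / max(len(words), 1)

for word in ("brilliant", "sweet"):
    print(word, rate_per_million(word, "male"), rate_per_million(word, "female"))
```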


Originally posted on February 10, 2015.

 

After falsely reporting being grounded by rocket fire while on a military helicopter in Iraq, and subsequently having his reported experiences during Hurricane Katrina challenged, NBC Nightly News anchor Brian Williams has been grounded by pundit fire.

 

Williams apologized, saying he misremembered the Iraqi incident. Talk shows and social media doubted anyone could misremember so dramatically, labeling him “Lyin’ Brian,” and grafting Pinocchio’s nose onto his face.

 

It’s possible that Williams is, indeed, a self-aggrandizing liar (meaning he knowingly and intentionally told untruths). But to those who know the dramatic results of research into sincere but false memories by Elizabeth Loftus and others, it’s also believable that, over time, Williams constructed a false memory...and that Hillary Clinton did the same when misreporting landing under sniper fire in Bosnia in 1996.

 

Most people view memories as video replays. Actually, when we retrieve memories, we reweave them. We often then replace our prior memory with a modified memory—a phenomenon that researchers call “reconsolidation.” In the reconsolidation process, as more than 200 experiments have shown, misinformation can sneak into our recall.

 

Even imagined events may later be recalled as actual experiences. In one new experiment, people were prompted to imagine and repeatedly visualize two events from their past—one a false event that involved committing a crime in their adolescence. Later, 70 percent reported a detailed false memory of having committed the crime!  False memories, like fake diamonds, seem so real.

 

Is Brian lyin’? Does he have Pinocchio’s nose for the news? Perhaps—though telling blatant untruths surely is not a recipe for enduring success and prestige in his profession.

 

Or might he instead be offering us a fresh example of the striking malleability of human memory?

 

P.S. The morning after I submitted these reflections for posting, the New York Times offered this excellent (and kindred-spirited) application of false memory research to the Williams episode.


Does Music Help Memory?

Posted by David Myers, Jul 18, 2016


Originally posted on March 4, 2015.

 

When was the last time you studied without distractions? Or did any one activity without simultaneously doing another?

 

Multitasking pops up everywhere. While we work, we check our phones for messages, tweet our thoughts, listen to music, and update our Facebook status. At least that’s what my students tell me they do in their other classes.

 

You may think listening to music while you prep for a big test helps you relax so you can concentrate and study. I used to think so. In college, I’d sit down with my textbooks, pop in my headphones, and turn on my favorite music to set the mood for studying.  Then I’d spend hours going over the material—and play my make-believe drums or air guitar! Yes, I studied alone in college. A lot.

 

Listening to music may help college-aged students stay focused, but one new study found that older adults had more trouble remembering information they had learned while music was played in the background.

 

The study challenged younger and older adults to remember names while music played. For the older adults, silence was golden: when they tried to remember the names with music playing, their memory suffered. College-aged participants’ performance did not suffer, regardless of whether they listened to music while memorizing names.

 

Before you turn up the tunes to study, consider your age first. If you’re a younger college student, keep pressing play. Older students might be better off studying in silence. Regardless of our age, we might do well by taking a few minutes each day to set aside distractions, slow down, and become mindful of our thoughts, feelings, and environment. 


Originally posted on March 6, 2015.

 

My nominee for psychology’s most misunderstood concept is negative reinforcement (which is not punishment, but actually a rewarding event—withdrawing or reducing something aversive, as when taking aspirin is followed by the alleviation of a headache).

 

In second place on my list of oft-misunderstood concepts is heritability.

 

My publisher’s Twitter feed today offered this:

 

[Screenshot: the publisher’s tweet]

Sure enough, the news source says it’s so.  But it isn’t.  Tracking back to the actual study, and its own press release, we see that, as we might have expected, the conclusion was drawn from a twin study that estimated the genetic contribution to variation among individuals in autism spectrum disorder (ASD) scores.

 

Heritability refers to the extent to which differences among people are due to genes. If the heritability of ASD is 80 percent, this does not mean that 80 percent of autism cases are attributable to genes and 20 percent of cases to environment. And it does not mean that any particular case is 80 percent attributable to genes and 20 percent to environment.  Rather, it means that, in the context studied, 80 percent of the differences among people were attributable to genetic influence.
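A tiny simulation can make the distinction concrete. This is a minimal sketch; the 80/20 variance split is illustrative, not a figure from the ASD study:

```python
import numpy as np

# Heritability is about variance *across people*, not about any one person
# or any fraction of "cases." Simulate a trait as genetic + environmental.
rng = np.random.default_rng(0)
n = 100_000
genetic = rng.normal(0.0, np.sqrt(0.8), n)        # genetic differences
environmental = rng.normal(0.0, np.sqrt(0.2), n)  # environmental differences
trait = genetic + environmental

h2 = genetic.var() / trait.var()
print(f"heritability ~ {h2:.2f}")  # ~0.80 for this population, by construction
```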


Originally posted on March 10, 2015.

 

Nathan and David’s monthly synopses of important new findings reported in Current Directions in Psychological Science continue, and include their teaching ideas.

 

In the February APS Observer, Nathan shines a light on “dark personalities.”  “Some people have hidden lusts or greed,” he notes, “whereas others embezzle millions. Understanding the science of dark personality helps us avoid labeling people as simply good or bad. By shining a light on the ingredients of a dark personality, we can learn who we ought to fear and when to fear them.”

 

In the same issue, David summarizes the emerging field of health neuroscience, and suggests ways to help students think about brain ↔ body interaction.

 

In the upcoming March issue, Nathan explains “When Two Emotions are Better than One” and suggests how to teach students the importance of emotional differentiation.

 

 

Also in the March issue, David identifies ways in which “Psychological Science Meets Religious Faith”—a topic of increasing interest in psychology:

 

[Image: excerpt from “Psychological Science Meets Religious Faith”]


Originally posted on March 17, 2015.

 

One of the pleasures of writing psychological science is learning something new nearly every day, from the continual stream of information that flows across my desk or up my screen.  Some quick examples from the last few days:

 

Nudging nutrition. Joseph Redden, Traci Mann, and their University of Minnesota colleagues report a simple intervention that increases schoolchildren’s veggie eating.  In a paper to appear in PLOS ONE, they report—from observations of 755 children in a school cafeteria—that, for example, offering carrots first in the serving line (in isolation from other foods to come) quadrupled their consumption. For more on healthy eating nudges, see Mann’s forthcoming book, Secrets from the Eating Lab.

 

Hugging prevents colds.  In new research by Sheldon Cohen and his team, social support, indexed partly by the frequency of experienced hugs, predicted fewer and milder infections among 404 healthy people exposed to a cold virus.  A hug a day keeps sickness away?

 

Finger digit ratio predicts national differences in gender inequality?  It’s not news that nations vary in female political representation, workforce participation, and education.  It was news to me that they reportedly also vary in 2D:4D—that’s the ratio of the index (2D) and ring finger (4D) lengths.  Nations that purportedly show relatively high female fetal testosterone exposure (supposedly manifest as low 2D:4D) and relatively low male fetal testosterone exposure (high 2D:4D) have higher rates of female parliamentary and workforce participation. Hmmm.

 

How effective is repetitive transcranial magnetic stimulation (rTMS) for treating depression?  A few well-publicized studies suggested it was effective. But a new meta-analysis of all the available studies indicates this treatment actually provides only “minimal clinical improvement.” And this is why teachers and authors need to consider all of the available research, and not just isolated studies.

 

It’s not all in our genes: Exercise really is healthy. Finnish researchers studied 10 identical male twin pairs in which one twin regularly exercised and the other did not. Despite having similar diets, the sedentary twins had more body fat, more insulin resistance, less stamina, and less brain gray matter. The moral to us all:  join the movement movement.


Originally posted on March 25, 2015.

 

During a recent visit to Stanford University, psychologist Jeremy Bailenson (pictured) invited our small group of conferees to his Virtual Human Interaction Lab, where he explained his studies and invited us each to experience a virtual world, complete with surround sound and vibrating floor.

 

[Photo: Jeremy Bailenson in the Virtual Human Interaction Lab]

His expressed aim is to “give you an experience that changes how you think about yourself,” and then to assess the aftereffects. In our group, brain scientist Antonio Damasio found his left and right legs and arms switched, as he popped virtual balloons.  Anthropologist Mel Konner found his identity shifted into a mirrored person.  I found myself in a beautiful forest, cutting down trees, and then underwater in a beautiful lagoon, with fish flying by.

 

 

Bailenson reports that men who become female avatars later are more nurturing.  Heroes who gain the ability to fly around a city (navigating by arm movements, as below) later become more helpful.  Those who age in front of their eyes become more future oriented.

[Photo: navigating virtual flight by arm movements]

In such ways, Bailenson explores “how virtual reality can change the way people think about education, environmental behavior, empathy, and health.”


Born to Learn From Us

Posted by David Myers, Jul 18, 2016


Originally posted on March 27, 2015.

 

At a recent foundation consultation at Stanford, I enjoyed meeting Andrew Meltzoff, the amiable and articulate co-director of the University of Washington’s Institute for Learning and Brain Sciences in my home city (where he lives but a short walk from my former high school).

 

Meltzoff is known to psychology teachers and students for his many studies of infant imitation, including his classic 1977 Science report on 2- to 3-week-old infants imitating his facial gestures. It was, he reported, a powerful experience to stick out his tongue and have newborns do the same. “This demonstrates to me the essential socialness of human beings.”

 

[Photo: Meltzoff demonstrating infant imitation]

 

I’ve always wondered what newborns really are capable of visually perceiving, and he reminded me that it’s not much—but that they have their best acuity for the distance between their mother’s breast and eyes, which also was the distance between his face and the infants’ eyes.

 

His lab is now reading infants’ brains using the world’s only infant brain imaging MEG (magnetoencephalography) machine, which reads brain magnetic activity more finely than is possible with EEG.

 

[Photo: the infant MEG machine]

He reports that “When a brain sees, feels, touches, or hears, its neuronal activity generates weak magnetic fields that can be pinpointed and tracked millisecond-by-millisecond by a MEG machine.” That is allowing Meltzoff and his colleagues to visualize an infant’s working brain as the infant listens to language, experiences a simple touch on the hand, or (in future studies) engages in social imitation and cognitive problem solving.

 

On the horizon, he envisions future studies of how children develop empathy, executive self-control, and identity. He also anticipates exploring how children’s brains process information from two-dimensional digital media versus their three-dimensional everyday world, and how technology can best contribute to children’s development. In such ways, they hope to “help children maximize their learning capabilities.”


Originally posted on April 2, 2015.

 

Facebook, Google, and Twitter, among others, are enabling psychologists to mine giant data sets that allow mega-scale naturalistic observations of human behavior. The recent Society of Personality and Social Psychology convention offered several such “big data” findings, including these (some also recently published):

 

  • “Computer-based personality judgments are more accurate than those of friends, spouses, or family.” That’s how Michal Kosinski, Youyou Wu, and David Stillwell summed up their research on the digital trail left by 86,220 people’s Facebook “likes.” As a predictor of “Big Five” personality test scores, the computer data were significantly more accurate than friends’ and family members’ judgments (see the sketch after this list). (Such research is enabled by the millions of people who have responded to tests via Stillwell’s myPersonality app, and who have also donated their Facebook information, with guarantees of anonymity.)
  • Another study, using millions of posts from 69,792 Facebook users, found that people who score high on neuroticism tests use more words like “sad,” “fear,” and “pain.” This hints at the possibility of using social media language analysis to identify people at risk for disorder or even suicide.
  • Researchers are also exploring smartphones as data-gathering devices. Jason Rentfrow (University of Cambridge) offers an app for monitoring emotions (illustrated here), and proposes devices that can sense human behavior and deliver interventions. In such ways, it is becoming possible to gather massive data, to sample people’s experiences moment-to-moment in particular contexts, and to offer them helpful feedback and guidance.
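For those who want to see the mechanics, here is a minimal sketch of that kind of likes-to-traits prediction. Everything below is simulated stand-in data; the actual research used real myPersonality responses:

```python
# Sketch in the spirit of Kosinski, Wu, and Stillwell: regress a trait score
# on a sparse user-by-like matrix. All data here are simulated stand-ins.
import numpy as np
from scipy import sparse
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_users, n_likes = 2_000, 5_000
X = sparse.random(n_users, n_likes, density=0.01, format="csr", random_state=1)
weights = rng.normal(0.0, 1.0, n_likes)              # hidden like-trait links
trait = X @ weights + rng.normal(0.0, 1.0, n_users)  # stand-in trait score

model = Ridge(alpha=10.0)  # regularization handles far more likes than users
print(cross_val_score(model, X, trait, cv=5, scoring="r2").mean())
```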

 

[Screenshot: Jason Rentfrow’s emotion-monitoring app]

Amid the excitement over today’s big data, psychologist Gary Marcus offers a word of caution: “Big Data is brilliant at detecting correlation....But correlation never was causation and never will be...If we have good hypotheses, we can test them with Big Data, but Big Data shouldn’t be our first port of call; it should be where we go once we know what we’re looking for.”

Originally posted on April 7, 2015.

 

The April APS Observer is out with an essay by Nathan, “The Truth About Trust.” Drawing from the work of Paul Van Lange, it identifies principles of trust—as learned, socially received, reasonable, and constructive. The essay also offers three easy classroom activities that engage students in thinking more deeply about trust.

 

In the same issue, my essay on “How Close Relationships Foster Health and Heartaches” suggests how instructors might engage students’ thinking about everyday stress and social support.  It then summarizes, from the work of Karen Rook, the benefits and costs of social relationships, and how relationships impact our health and well-being, for better and for worse.

 

[Image: April APS Observer]


Originally posted on April 14, 2015.

 

As most introductory psychology students learn, negative emotions often affect health. And persistent anger can lash out at one’s own heart.

 

Might negative emotions, such as anger, also be risk factors for entire communities? In an amazing study in the February Psychological Science, Johannes Eichstaedt and thirteen collaborators ascertained heart disease rates for each of 1,347 U.S. counties. They also obtained from Twitter 148 million county-identified tweets from these 1,347 counties.

 

Their finding: a county’s preponderance of negative emotion words (such as “angry,” “hate,” and various curse words) predicted its heart disease deaths “significantly better than did a model that combined 10 common demographic, socioeconomic, and health risk factors, including smoking, diabetes, hypertension, and obesity.” A preponderance of positive emotion words (such as “great,” “fantastic,” and “enjoyed”) predicted low heart disease rates.

 

Given that the median Twitter user is age 31, and the median heart disease victim is much older, why should Twitter language so successfully predict a county’s heart disease-related deaths? Younger adults’ tweets “may disclose characteristics of their community,” surmise the researchers, providing “a window” into a community’s social and economic environment. An anger-laden community tends to be, for all, a less healthy community, while a happier community makes for a healthier one.
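To make the method concrete, here is an illustrative sketch with simulated data; the three-word lexicon and the county figures are tiny stand-ins for the study’s much larger ones:

```python
# Illustrative sketch of the Eichstaedt et al. approach: score each county's
# tweets for negative-emotion words, then relate that rate to heart-disease
# mortality. Lexicon and county data below are simulated stand-ins.
import numpy as np
from scipy.stats import pearsonr

NEGATIVE = {"angry", "hate", "annoyed"}  # stand-in for the real lexicons

def negative_rate(tweets):
    # Share of all words across a county's tweets that are negative-emotion words.
    words = [w for tweet in tweets for w in tweet.lower().split()]
    return sum(w in NEGATIVE for w in words) / max(len(words), 1)

print(negative_rate(["I hate traffic", "so angry today"]))  # 2 of 6 -> 0.33

# Simulated counties: negativity rates and mortality built to correlate.
rng = np.random.default_rng(2)
negativity = rng.uniform(0.0, 0.05, 1_347)
mortality = 200 + 2_000 * negativity + rng.normal(0.0, 10.0, 1_347)

r, p = pearsonr(negativity, mortality)
print(f"r = {r:.2f}, p = {p:.2g}")  # positive by construction
```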

 

[Photo credit: Steve Debenport/E+/Getty Images]


Originally posted on May 1, 2015.

 

One of the striking discoveries of psychological science is the malleability of memory, as illustrated by the “misinformation effect.”  Experiments by Elizabeth Loftus and others exposed participants to false information. Afterwards, they misremembered a stop sign as a yield sign, a screwdriver as a hammer, a peanut can as a Coke can, and a clean-shaven man as having a mustache.

 

Even just imagining nonexistent happenings can create false memories—of being sickened by rotten eggs, of having one’s finger caught in a mousetrap, of encountering Bugs Bunny (a Warner Bros. character) at Disneyland, or even of preschool child abuse.

 

For me the most stunning finding is the most recent.  After collecting background information from university students’ parents, researchers Julia Shaw and Stephen Porter prompted the students to recall two events from their past—one a false early-adolescent event, embedded in true details of the student’s life, such as the name of a friend.  For some, this nonexistent event involved committing a crime, such as a theft, assault, or even assault with a weapon.  After three interviews, 70 percent of students reported false (and usually detailed) memories of having committed a crime.

 

The bottom line: Our memories are not just replays of our past experiences.  Rather, we actively construct our memories at the time of recall.  And that fact of life has implications for criminal interrogation, eyewitness recollections, and even memories retrieved during psychotherapy.

Originally posted on May 7, 2015.

 

Despite concerns that video game-playing teaches social scripts for violence, recent research also suggests a cognitive benefit: sharpened visual attention, quickened reaction speed, and improved spatial abilities, such as eye-hand coordination. Experienced game players tend to be perceptually quick and astute.

 

But a just-released pair of studies, by University of Oregon researcher Nash Unsworth and five collaborators, casts doubt on claims that video game-playing enhances cognitive abilities. The new studies confirmed a cognitive benefit when comparing extreme video-game players with nonplayers. But across broader and larger samples of people, game-playing correlated near zero with various cognitive abilities. “Overall, the current results suggest weak to nonexistent relations between video-game experience—across a variety of different games—and fundamental cognitive abilities (working memory, fluid intelligence, attention control, and speed of processing).”

 

Two of the study’s co-authors, Zachary Hambrick and Randall Engle, have also published studies and research reviews that question the popular idea that brain-training games enhance older adults’ intelligence and memory. Despite the claims of companies marketing brain exercises, brain training appears to produce gains only on the trained tasks (without generalizing to other tasks). Moreover, though we might wish that thousands of hours of practice could transform us into superstar athletes or musicians, Hambrick’s other research shows that superstar achievers are distinguished at least as much by their extraordinary natural talent as by their self-disciplined daily routine.

 

The opposite of a truth is sometimes a complementary truth. Educational interventions that aim to enhance grit, or that promote a “growth mindset” (rather than fatalistically seeing intelligence as fixed), also boost achievement. And in yet another new study, British children who display