
Bedford Bits


 

You've heard about it before: someone perches on the edge of a rooftop, or a waterfall, or a granite outcropping, to take a vertiginous photo of the drop-off, hundreds—perhaps thousands—of feet below. Or reclines on a railway line to take a quick selfie as a locomotive looms in the background. Or does one thing or another that is exceptionally dangerous in order to get an eye-popping image that might capture a crowd on Instagram . . . and, sometimes, perishes in the act, as recently happened with a husband-and-wife team of travel bloggers in Yosemite National Park.

 

As I say, there's nothing new about this, and there are plenty of articles scattered all over the Internet detailing the phenomenon, often containing academic commentary on the meaning of it all, as does this article in Vice from 2017. So, given the familiarity of what might be called "Fatal Selfie Syndrome," and, more importantly, the fact that your students are likely to be part of the audience to which such photos are directed, this is a popular cultural topic that calls for semiotic analysis.

 

Let's start with the basics. The fundamental goal behind dangerous Instagram photos (or YouTube videos, etc.) is to get attention. While the most daring of the bunch also tend to be thrill seekers, thrill seeking is not the primary motivation for the simple reason that the chosen poses are designed for publicity, not for a privately enjoyed experience. But this elementary explanation then raises the question of what all this attention getting signifies.

 

Here we can go back to the early days of the Net. The advent of the personal web log and/or web page in the 1990s signified the emergence of a democratizing challenge to the hierarchical structures of traditional mass media, offering a way for ordinary people to make themselves seen and heard. MySpace—a kind of pre-packaged personal web site with audio and images—took the process a step further, widening the breach in the wall (in Pink Floyd's sense of the word) of mass cultural anonymity, while opening up new opportunities for commercial self-promotion.

 

The Instagram daredevils – and increased competitive stakes – are a consequence of what happens when democratic opportunity collides with a mass scramble for individual distinction. With so many people publicizing themselves on social media, it becomes harder and harder to get anyone to notice. This is especially problematic for those who exploit the Internet as a source of personal income, seeking to attract advertising dollars by attracting large numbers of views. So much money is at stake now that a sort of arms race of ever-more-daring stunts has ensued, effectively creating a new Internet hierarchy of online Evel Knievels contending with each other to make the cut.

 

The semiotic upshot of all this is that social media are not merely addictive, they are expressions (and extensions) of a hypercapitalistic society so taken up with monetizing every corner of human existence that personal experience itself is now for sale—in fact, one might say that personal experience is being sought for the sake of a sale.

 

Behind the scenes of this dramatic interplay between risk-entrepreneurs and their followers is the advertising that pays for it all. James Twitchell has called America an "advertising culture" (or "adcult"), and the Instagram economy can be said to signify an adcult in overdrive, careening through a consumer marketplace so splintered into niches and sub-niches that those with goods and services to sell are ever on the lookout for new ways of reaching anyone who is likely to buy their stuff. So if you can survive your latest, rather literal, peek into the abyss and get it up onto the Net, you may be able—thanks to all those advertisers who want to reach the kind of people who want to see you do it— to shudder all the way to the bank.

 

 

Image Credit: Pixabay Image 2034239 by Alexas_Fotos, used under a CC0 Creative Commons License

Jack Solomon

From Forums to Facebook

Posted by Jack Solomon, Nov 1, 2018

For a number of years I was an active participant, and moderator, on a hobby forum. I was well aware at the time that the experience of forum participation was quite unlike anything else I had ever encountered, and, from time to time, I posted analyses of that experience onto the forum itself. I no longer participate on that forum (it got taken over by climate change deniers and the like—this will be relevant to my following analysis), but I do visit the site to see what is happening there. It's a kind of time-traveling experience, thanks to the existence of searchable archives, in which I can see something of myself frozen in the amber of digital memory. But I see something else: namely, some striking signs of what has been happening in this country over the last ten years, not the least of which is the role of Facebook as both symptom and cause of those changes.

 

I'll start with something I posted to the site at the time when I was first coming to appreciate the effect it was having on me. Here's what I said, way back in 2005:

           

What an online forum provides is a historically unprecedented combination of carnival and holiday. That is, the ancient tradition of the carnival enabled Europeans to drop their everyday social hierarchies and limitations, to don masks, and enjoy a freedom that ordinary life doesn't offer. Here on Site X, what we are in everyday life doesn't matter at all. The hierarchy here, and there is a hierarchy of sorts, comes from technical expertise, or experience, or sheer congeniality and cleverness. I'm very glad to note that it does not come from equipment. Many of the high-rung folks on Site X do not own the best equipment. You can't buy your way to the top here—which is a lot different from everyday life.

 

More important is the masked nature of an online forum. Because we can conceal as much of ourselves as we like, the stakes of ordinary social interaction are lowered. We aren't risking anything, as we do in any ordinary social interaction. This enables us to relax, to be playful, even a bit childish. We can also be more authentically ourselves, which is very refreshing. On top of this is the fact that while we can interact at literally any time of the day, the virtual rather than spatial nature of that interaction eliminates most risks. In spatial interaction we have to worry about having to see again a person with whom we may have made a goof. Here, we don't risk that. Again, this enables us to relax enormously.

 

This relates to the holiday-like characteristic of Site X. Just as on a travel-related vacation (a cruise, say), one finds oneself making extraordinarily close friendships that rarely last beyond the voyage because one knows that when the cruise is over everything can be erased, on Site X we are on a kind of permanent holiday. We can let our hair down safely, knowing that if too much is revealed or dared, we can always jump ship, so to speak. Since most of us feel this way it makes for an extraordinarily relaxed atmosphere, with most of our defenses down. Our usual defenses make ordinary social interaction fraught with tension; with them down, we are just more fun to be around.

 

As I read these words now I realize that there is a deep irony in them, the snake, so to speak, in the garden I thought I had found. This irony lies precisely in the anonymity and virtuality of online social interaction, whose benefits can cut two ways. For without the controlling factors inherent in face-to-face and fully-identified human interactions, the Internet is also a place of unbridled hostility and vituperation where people say things they would not dare say to someone else in person.

 

And there doesn't appear to be anywhere to hide when it comes to those online forums that still thrive, as Amy Olberding's essay in The Chronicle of Higher Education laments. Her title says it all: "When Argument Becomes Bloodsport: Philosophy is mired in wanton pugilism."

 

Which is where Facebook comes in. Facebook first appeared at a time when all too many online forums were becoming cockpits for the ever-increasing political and cultural divisions that are now so visibly tearing at our society. Forbidding anonymity and offering opportunities for online interaction in which everyone could be a site administrator with the power to exclude unwanted voices, Facebook was quickly embraced as a means of escaping the flame wars and troll fests the Net had become. Indeed, on Site X most of my friends ended up retreating to Facebook—where I did not follow, due to personal concerns about my privacy (a topic ripe for its own analysis).

 

So in one sense, Facebook is a symptom, not a cause, of American divisiveness. It offered a way out. But in another, it is a cause: as more and more people have retreated into their own online silos, wherein they can interact only with those people with whom they already agree and be supplied with newsfeeds that deliver only the news and the opinions they want to hear, in the way they want to hear them, the divisions between what is emerging as the Two Americas are only growing. This is not mere correlation, for when a divided people are experiencing a different reality via their social network connections, they are increasingly living in a different reality, making it impossible to understand where the other "side," if you will, is coming from. And this is clearly making things worse.

 

So we have another spoiled paradise on our hands. And I really don't know where we go from here.

 

 

Image Credit: Pixabay Image 390860 by PDPics, used under a CC0 Creative Commons License

When Neil Young wrote his edgy tribute to rock-and-roll "My My, Hey Hey (Out of the Blue)," the genre was hardly dead, nor really approaching it. A new generation of rockers—the punks—were trying to clear a space for themselves by claiming that rock was dead (Harold Bloom style, one might say), but in fact they were only revising it with a slightly different vibe. Johnny Rotten, whether he liked it or not, was a descendant of Johnny B. Goode, and Young himself would go on to become an inspiration to the grunge scene, which, for a rather brief shining moment, revitalized rock-and-roll and helped put an end to the mousse-inflected hair-band era.

 

But when, in the tumultuous wake of the Kavanaugh confirmation hearings, I read that Taylor Swift was stepping up to help lead the resistance, I could see that here was a sign that things, finally, had changed, and that the moon was in a new phase indeed. Not that a popular music star leading a political charge for her generation is anything new: heck, that was what the '60s were all about. But Taylor Swift is no rocker, and it is not rock stars who are taking the generational lead these days.

 

The reasons for this are not hard to find, but they are worth a cultural-semiotic exploration. We can begin with the obvious observation that rock-and-roll is no longer the most popular genre of youth music: rap/hip-hop is, along with rhythm-and-blues and the sort of highly choreographed pop that Madonna pioneered, Britney Spears mainstreamed, and that various divas from Taylor Swift to Lady Gaga to Katy Perry now rule (straddling both pop and rhythm-and-blues, Beyoncé belongs in a category of her own). But to start here rather puts the cart before the horse, because it doesn't explain why rock-and-roll plays second fiddle these days; it only shows that it does.

 

So where's, say, Neil Young, the composer of "Ohio" in the immediate aftermath of the Kent State massacre, in this hour of political need? Well, um, he's also the composer of "A Man Needs a Maid." So how about the Rolling Stones, those "street fighting men" of the '60s? I think that the titles "Brown Sugar" and "Under My Thumb" are enough to explain why no one is running to them for leadership right now. And Bob Dylan, the author of "Lay Lady Lay" and "Don't Think Twice, It's All Right" (about the bitterest putdown of a woman in pop history)? 'Nuff said.

 

I think the pattern here is quite clear: rock-and-roll is rather hopelessly entangled in a male-centered history that is most charitably describable as patriarchal. It isn't the fact that all the performers that I've mentioned are now firmly entrenched in old age that puts them on the political sidelines today (after all, they are all still active and highly profitable touring acts); it's the rock-and-roll legacy itself. Even today's young rockers (and they do exist) can't escape it.

 

Which brings up a related point. Rock-and-roll is not only coded as "male"; it is also coded as "white." Yes, Chuck Berry (and a lot of other black musicians) took a leading role in creating it in the '50s, but rock was taken away from them in that era of official segregation and literally color-coded as "rhythm and blues"—a split that even Jimi Hendrix and the Chambers Brothers could not quite fully repair. And when rap began its meteoric rise in the '80s, it was heavy metal (one of rock's most popular incarnations in that decade) that became the de facto voice of white audiences (it is interesting to note in this regard how Ted Nugent and Dave Mustaine—two high-profile metalists—are also outspoken conservatives today).

 

Add it all up and it is clear how changes in American demography and gender relations have affected popular music, and, thus, have determined just which performers will be received as voices for their generation. The signs are all there, ready to be read as part of a much larger historical shift. "Rock is dead," sang The Who, who then quickly added, "Long Live Rock," from that land where the passing of one monarch still means the ascendance of another. That was a long time ago, and Roger Daltrey has more recently opined that rock really is dead now and that rap has taken its place. But rock isn't really "dead," of course; it's just been sidelined. And in the #MeToo era, rap—though still ascendant—isn't alone at the top of the charts (political as well as musical) either. Just ask Taylor Swift.

 

 

 

Image Source: “IMG_0614” by makaiyla willis on Flickr 2/4/17 via Creative Commons 2.0 License

With the appearance of Michael Moore's latest foray into the arena of American conflict and controversy, Fahrenheit 11/9, I find myself contemplating the significance of the documentary form itself in contemporary American culture. And as is always the case in the conduct of a semiotic cultural analysis, my aim is not to form a partisan opinion but, rather, to find a signification, something that may not be obvious at an initial glance but may well be hiding in plain sight. So here goes.

 

To begin with, we need to construct a historicized system in which today's popular documentaries can be situated, and I can think of no better place to begin than with Edward R. Murrow's legendary exposé of America's migrant labor morass, Harvest of Shame. First broadcast on CBS in 1960 immediately after Thanksgiving, Harvest of Shame joined such classic works of muck-raking journalism as the photographs of Dorothea Lange and Jacob Riis, and the writing of Upton Sinclair, in revealing to the American middle and upper classes what was really going on behind the scenes of the pleasant panoramas of the American dream.

 

Michael Moore's work fits into this tradition, but with some significant differences, differences that will be important to my interpretation to follow. These lie in the way that Moore openly presents himself as a participant not only in his muck-raking documentaries but in the political controversies that he courts as well. Very much an in-your-face documentarian, Moore presents a striking contrast to Ken Burns, who must be ranked as America's currently most popular (not to mention prolific) documentary filmmaker, in large part due to his propensity to smooth over the rough edges of American cultural conflict in his attempts to appeal to everyone (who else but Burns, for example, would have included footage of historian Shelby Foote describing Abraham Lincoln and Nathan Bedford Forrest together as the two "geniuses" that the Civil War produced?).

 

But Michael Moore's "shockumentary" style looks like something out of the Hallmark Channel compared to Sacha Baron Cohen's "mockumentaries." Having lain low for a few years (to lull his intended victims into a false sense of security?), Cohen is back with his Showtime series, Who Is America? A weird amalgamation of Candid Camera, reality television, and, well, "Weird Al" Yankovic, Who Is America? managed to snag a cross section of American political celebrity—from Sarah Palin and Roy Moore to Bernie Sanders and Barney Frank—in its take-no-prisoners approach to political satire in the guise of documentary-style programming (see Laura Bradley's "Sacha Baron Cohen’s Victims: All the People Who Fell for His New Prank Show" in Vanity Fair for a complete rundown of Cohen's hapless marks).

 

Now, aside from my rather unsemiotic curiosity about how such a list of prominent people—who must surely have personal staffs employed precisely to keep their employers insulated from such things—got so taken in by Cohen, I find a number of signifiers at work here. The first might be called "Poe's Law Comes to Comedy." Poe's Law is a label for the ambiguity that surrounds so much of the content on the Internet due to the general weirdness of what people say there. "Does he really mean that, or is he pulling my leg?" pretty much sums up the situation, and it helps explain how Cohen got such current and former politicians as Representative Joe Wilson of South Carolina and ex-Senator Trent Lott to endorse a fake PSA for a "Kinderguardians" program designed to put guns in the hands of little children—Lott, for example, is quoted as saying, "It’s something that we should think about America, about putting guns in the hands of law-abiding citizens . . . whether they be teachers, or whether they actually be talented children or highly trained pre-schoolers.” I will leave it to my readers to deduce just which organization Cohen was targeting here.

 

Beyond the Poe's Law signification, I find myself especially struck by the distinct trajectory here that runs from Murrow to Cohen, the stunning difference. The best way to put it is that Murrow found nothing funny in what he wanted to expose in Harvest of Shame, and had no intention of entertaining anyone. Moore, for his part, has been quite open about his opinion that even documentaries with serious purposes should be entertaining. But Cohen is basically all about entertainment. What he does is make people look stupid for other people to laugh at with extreme derision. The approach is not unlike that of Jersey Shore and My Super Sweet 16, video train wrecks whose purpose is to make their audiences feel superior to the people on the shows. Satire, with its ancient office of encouraging good behavior by ridiculing bad, thus becomes sheer snark.

 

And here the system opens out to a much larger system in America today, one in which all codes of civility (and "civility," remember, is rooted in the Latin "civitas": a society of shared citizenry) are falling before the imperatives of the profit motive. Snark sells: it's no accident that Who Is America? is a comedy series on Showtime. In such a system politics is repackaged as entertainment, and derision takes the place of anything like an authentic debate. And that just may well be the answer to the question of "Who Is America?" these days.

Jack Solomon

Just Analyze It

Posted by Jack Solomon, Sep 20, 2018

As American popular culture gets more and more entangled in the political divisions that are rending our country, it may appear to be increasingly difficult to teach cultural analysis without risking painful classroom conflict. Take the current controversy over Nike's Colin Kaepernick campaign: it simply begs for semiotic attention, but how can it be accomplished without having the whole thing blow up into yet another headline on Inside Higher Education, or any other national news outlet?

 

I wouldn't be writing this blog if I thought that the thing couldn't be done or if my best advice would be to steer clear of the whole matter and anything like it. No, if you have adopted a semiotics-based methodology for your class, you have to engage with the full range of popular culture. And if you stick to the fundamental semiotic axiom that, while a personal opinion can be built upon the foundations of a semiotic analysis, semiotics itself is not an expression of an opinion, the thing can be done.

 

So, to begin, let's start with the obvious significations of the Nike/Kaepernick campaign and the reaction to it. The first is the way that it joins an ever-growing list of signifiers revealing a widening political gap in America, especially when it comes to anything having to do with race. This one is so apparent that it doesn't require any further explanation, but it does merit recognition.

 

The second (also quite obvious) signification is that symbols matter. Whether the symbol involved is the American flag or "Silent Sam," deep emotional attitudes towards objects can be just as passionate as attitudes towards people or policies. This too is so obvious that it doesn't require any further explanation, but does need to be mentioned.

 

The third is that the traditional (and constitutional) right to free speech in America is a shield protecting social protest, until it isn't. On the one hand, juridical rulings on free speech grant to individuals the right to say almost anything short of shouting "fire" in a crowded theater (remember the successful ACLU defense of the Nazi marchers in Skokie?), while, on the other, the courts have allowed employer retaliation against employees who break the speech codes in their places of employment. Such a lack of clarity is a contributing factor in the Nike controversy.

 

But let's step away from the most obvious significations and get into some more subtle ones. The first I'd like to consider is one that I have seen very ably explored in a Washington Post opinion piece by Michael Serazio, who argues that the Nike campaign isn't a gesture on behalf of social justice; it's simply another expression of the hypercapitalistic nature of America's consumer culture. Here's how Serazio puts it: "At one point in human history, products were bought and sold for their utility. Now, because of the massive and unchecked expansion of corporate power—in terms of not just market share but mind share—products must represent values, lifestyles and, in the age of President Trump, political ideologies." In short, the Nike campaign can be seen as a signifier of the hegemony of consumption in a consumer society.

 

But Serazio is hardly the only cultural analyst trying to parse the Nike affair. Consider the following two articles, also from the Washington Post. First, there's Megan McArdle's "Nike bet that politics would sell. Looks like it was wrong," an op-ed that cites public opinion polls from all sides of the controversy to conclude that Americans are not responding favorably to the Nike/Kaepernick campaign, while arguing that this is a good thing because "as America has divided into distinct camps—geographic, demographic, political—more companies have started chasing explicitly political identities. Starbucks's leftward lean has famously roused conservative ire, but many on the left still haven't forgiven Chick-fil-A owner Dan Cathy's remarks opposing same-sex marriage a few years ago. The result is a world in which every decision, even what kind of fast food to buy, has taken on a political aspect. That's not healthy for America, which needs more points that people have in common, not more ways to divide into separate teams."

 

But then there's Amy Kittelstrom's counter-argument, which comes to a very different conclusion. Noting that by "[b]urning shoes and snipping swooshes, some white Americans think they are punishing Nike for re-signing Colin Kaepernick, the unemployed quarterback known for quietly kneeling during the national anthem to draw attention to anti-black police brutality. In reality, Nike will profit. The more these angry consumers attack the company, the more attractive they make Nike in the far bigger global market—which is a vital part of why Nike launched the campaign that centers on Kaepernick."

 

Now, the interesting thing about these articles is that each, in effect, jumps the gun on the future by asserting long-term outcomes that are by no means as certain as their authors argue they are. You may say that Trump started it with his exultant tweet about Nike's stock price decline at the opening of the campaign (Nike stock has, as I write this, fully made up the drop), but whoever engages in such predictions runs the risk of speaking too soon, of letting one's desires (i.e., the way one wants things to turn out) supersede the available facts.

 

I'm reminded here of an editorial in the Richmond Examiner from July 7th, 1863 that predicted inevitable victory for the Army of Northern Virginia in its invasion of the North—published three days after Lee's defeat at Gettysburg but two days before news of that defeat reached Richmond. But then again, in 1861 there was a lot of "On to Richmond" confidence in the Union press as well. In the end, as Lincoln sublimely noted in his second inaugural address, neither side got what it expected out of the war, which grimly contradicted that American tendency (which rises to the level of a cultural mythology) to expect that everything will always go the way we want it to—a fundamental cultural optimism that Barbara Ehrenreich calls "bright-sidedness" (you can find her exploration of this peculiarly American tendency in chapter 7 of the 9th edition of Signs of Life in the U.S.A.).

 

And so, in McArdle's and Kittelstrom's dueling certainties about an uncertain future I find a signifier of something that is profoundly American; but unfortunately, when a divided people are equally certain that everything will go their way, everyone, in the end, loses.

 

 

Image Credit: Pixabay Image 1840619 by Pexels, used under a CC0 Creative Commons License

Yet another tale of professorial indiscretion on social media making the rounds prompts me to reiterate what I regard as one of the cardinal benefits of the semiotic approach: viz., that it can lead one beyond the obvious surfaces of cultural phenomena to their more nuanced (and often subtly concealed) significations. And this matters in these days of take-no-prisoners political controversy, as America divides further and further into two hostile camps that can no longer even communicate with each other without invective.

 

The indiscretion I am referring to involves a Rutgers University history professor's Facebook screed about gentrification in Harlem, which has been widely reported in the mass media, as well as on the online news source Inside Higher Education. As IHE reports, Professor James Livingston is in hot water over a post he put up a few months ago. Here's IHE's quotation of the controversial post (warning: salty language ahead):

 

OK, officially, I now hate white people. I am a white people, for God’s sake, but can we keep them -- us -- us out of my neighborhood? I just went to Harlem Shake on 124 and Lenox for a Classic burger to go, that would be my dinner, and the place is overrun by little Caucasian assholes who know their parents will approve of anything they do. Slide around the floor, you little shithead, sing loudly, you unlikely moron. Do what you want, nobody here is gonna restrict your right to be white. I hereby resign from my race. Fuck these people. Yeah, I know, it’s about my access to dinner. Fuck you, too.

 

After Facebook deleted the post, Livingston returned with the following (again from IHE):

 

I just don't want little Caucasians overrunning my life, as they did last night. Please God, remand them to the suburbs, where they and their parents can colonize every restaurant, all while pretending that the idiotic indulgence of their privilege signifies cosmopolitan -- you know, as in sophisticated "European" -- commitments.

 

OK, to start with, I do not intend to get involved in any way with the obvious (right there on the surface) political elements in this saga of a white professor's denunciation of the white patrons (and their children) at a Harlem eatery. I also do not want to argue the free speech implications of the matter. Everyone else is doing that already. Rather (and I hope my readers at Bedford Bits will appreciate my focus), I want to look at an important rhetorical element in the story that not only is being disregarded but is being misconstrued as well. Call what follows an exercise in "rhetorical semiotics," if you will.

 

To begin with, the reactions to Livingston's posts have parsed exactly how you would expect them to: conservative media (and individuals) have (to put it quite mildly) denounced Professor Livingston, accusing him of racism, while more liberal voices tend to emphasize that what he wrote is protected free speech. Well and good: we can expect such disagreements. But what really caught my attention is the claim, both from the reporter of the story and from a number of the comments that follow, that Livingston was clearly being satirical. First, the IHE reporter: "Right-wing media and Rutgers University didn't find Livingston's satire very funny." A number of the comments to the story took it for granted that the posts were satirical too. For example: "Weird reaction to Livingston’s FB posts by almost everyone, including Livingston himself. . . .The charge of racism requires taking literally what is clearly satire."

 

But is it really "clearly satire"? Consider another comment: "The problem is that so many people in academia are so disconnected from reality that it's not actually clearly satire. Poe's law definitely applies here." Now, Poe's Law is the dictum that things on the Internet are so weird that you can never know for certain whether someone is being ironic or not. And indeed, as another comment observes: "If it's satire then it's really badly done. I don't believe it's actually satire."

 

Frankly, I think that everyone is chasing the wrong trope. Livingston's second Facebook post, cited above, makes it pretty clear that he means it about his aggravation over urban gentrification. So what I think is involved in the initial post is really hyperbole—that is, the deliberate overstatement of one's case in order to more effectively make a point. Except that in this case that hyperbolic wink was lost on a lot of people, thus further widening the gap between an already miserably polarized society.

 

Thus my point is that words matter, that they have semiotic as well as semantic significance. If, in the currently highly inflamed environment (the system in which we can situate Professor Livingston's remarks), one wishes to make a political point, one isn't going to make it effectively by using easily misconstrued—not to mention hyperbolic and inflammatory—language (heck, it isn't even immediately clear from the posts that Livingston is mostly complaining about the behavior of little children). If you want your point of view to be politically effective—and, perhaps even more importantly, not backfire—trollish language isn't going to cut it, especially when the keys to the kingdom (i.e., electoral power in America) are ultimately in the hands of that roughly one-third of the electorate that identifies as politically "independent," and which is neither clearly on the right nor on the left. If you want them on your side, you can't assume that the language that works inside your socially mediated echo chamber is going to work outside it. So while I fear that it is no longer possible for either "side" today in the great divide to reach the other, it behooves anyone who wants to win over any part of that uncommitted "center" (if we can call it that) to keep in mind that, thanks to the Internet, the whole world is always watching, and weighing, what you say.

 

 

Photo Credit: “Gentrification Zone” by Matt Brown on Flickr 8/25/17 via Creative Commons 2.0 license.

This post was originally published on December 20, 2012.

 

One of my students in a popular cultural semiotics seminar recently wrote her term project on the reality television “Real Housewives of . . .” phenomenon. Not being a fan of such shows myself, I needed her paper to prompt me to think seriously about the whole thing for myself. And I realized that such shows instantiate a far more profound cultural signifier than I had heretofore recognized. The following analysis represents my thinking on the matter, not my student’s.

 

As is always the case, my semiotic analysis centers on a crucial difference. The difference in question here is not simply that between the actual realities of the lives of ordinary housewives as opposed to the reality TV versions, but also the difference between their current television representations and those of the past. That is, not only do most actual housewives lack the wealth, glamour, and business opportunities of the “Real Housewives” of Beverly Hills, New Jersey, or wherever, but their television counterparts of the past did, too. The classic TV housewife, enshrined within the history of the family sitcom, was an asexual middle-class woman who was totally focused on her children: Think June Lockhart, Jane Wyatt, and Barbara Billingsley.

 

That the current crowd of glammed-up, runway-model housewives of today’s “reality” shows reflects a widespread cultural return to the conservative gender-coded precept that a woman’s value lies in her erotic appeal almost goes without saying. While a few less-than-glamorous women are cast in these programs as if to head off criticisms of this kind, their existence tends to prove the rule—and even they tend to be dolled up on the program Web sites.

But this is an easy observation to make. More profound, however, is the fact that the reality TV housewife has become an object of desire for her largely female audience. Rather than being seen as a hapless drudge of patriarchy, the reality TV housewife is a vicarious role model, even when she doesn’t found her own business enterprise and simply stays at home. What caused this change in perception?

 

To answer this question, I considered the frequently reported economic fact that household incomes for the vast majority of Americans have been essentially stagnant, when adjusted for inflation, over the last four decades. Now, add to this the exponential inflation in the costs of such basic necessities as housing and transportation and you get the modern two-income family: not necessarily because both partners in a marriage want to work, but because in order to maintain a middle-class household two incomes are now more or less essential. Certainly the efforts of the women’s movement have contributed to the enormous growth of women’s participation in the workforce, but the new image of the reality TV housewife suggests that something else is at work here as well.

 

That is, with the housewife being presented as a fortunate woman who doesn’t have to work, it seems that American women are nostalgic for the “good old days” of a time when they didn’t have to work just to maintain a middle-class home. The fantasy now is to be a housewife, not to escape the role. That’s quite a change.

 

Just how much of an effect on American consciousness in general this stagnation of incomes has had is probably one of the most important social questions of our time. Can it help explain the hostile polarization of our political landscape, our dwindling sympathy for others in an increasingly libertarian environment, the growing resentment of middle-class workers (especially unionized workers) with decent jobs and benefits? I think so. And this will be a topic for future blogs of mine.

Jack Solomon

Building a Religion

Posted by Jack Solomon, Jun 7, 2018

As I head into the summer recess for my Bits blogs, I find myself contemplating the cultural significance of the rise and apparent fall of Theranos, the troubled biotech startup that was once heralded as a disruptive force that would revolutionize the blood testing industry, and, not so incidentally, produce a new generation of high-tech entrepreneurs to rank with Steve Jobs and Bill Gates. On the face of it, of course, this would not appear to be a topic for popular cultural analysis, but bear with me for a moment, for when it comes to the new technologies, everything relates in some way or another to the manifold currents of everyday life that popular culture expresses.

 

What has drawn my attention to Elizabeth Holmes and the Theranos saga is the publication of a book by the Wall Street Journal writer who first blew the whistle on the company in 2015: John Carreyrou's BAD BLOOD: Secrets and Lies in a Silicon Valley Startup. A brief synopsis of that book appeared in Wired just as it was being released, and it was a single sentence in that synopsis that really got me thinking. It appears in Carreyrou's narrative at the point when things at Theranos were beginning to unravel and various high-ranking employees were abandoning ship. In the wake of such resignations, Elizabeth Holmes allegedly summoned every remaining employee to an all-hands-on-deck meeting to demand loyalty from them. But she didn't call it loyalty: according to Carreyrou "Holmes told the gathered employees that she was building a religion. If there were any among them who didn’t believe, they should leave."

 

Building a religion: Holmes was telling a truth that was deeper than she realized. For when we situate the story of Theranos in the larger system of post-industrial America, we can see that our entire culture has been building a religion around what Fredric Jameson has called America's postmodern mode of production. On the face of it, the object of worship in this system is technology itself, which is viewed as a kind of all-purpose savior that will solve all of our problems if we are just patient enough. Steven Pinker's new book, Enlightenment Now, makes this point explicitly, but it is implicit every time some new tech startup promises to "fix" higher education, clean up all the trash in the ocean, and use architecture to save the natural environment (see, for example, Wade Graham's "Are We Greening Our Cities, or Just Greenwashing Them?", which provides both a survey and a critique of the eco-city movement: you can find it in the 9th edition of Signs of Life in the USA). The religion of technology also produces its own demi-gods, like Elon Musk, who can announce yet another delay (or change of plans) in his money-losing product line and still see his Tesla stock rise due to the unwavering adoration of his flock.

 

Oddly enough, as I was writing the first draft of this blog I came across an essay in The Chronicle of Higher Education that examines a related angle on this phenomenon. There, in a take-down of the "design thinking" movement (an ecstatic amalgamation of a Stanford University product design program and the Esalen Institute that promises to transform higher education into a factory for producing entrepreneurially inclined "change agents"), Lee Vinsel compares the whole thing, overtly, to a religious cult, acidly remarking that the movement "has many of the features of classic cult indoctrination, including intense emotional highs, a special lingo barely recognizable to outsiders, and a nigh-salvific sense of election" —concluding that "In the end, design thinking is not about design. It’s not about the liberal arts. It’s not about innovation in any meaningful sense. It’s certainly not about 'social innovation' if that means significant social change. It’s about commercialization. It’s about making education a superficial form of business training."

 

Thus, I think that Vinsel would agree with my contention that behind the religion of technology is something larger, older, and more universal. This is, quite simply, the religion of money worship. Minting instant billionaires and driving an ever-deeper wedge between a technology-fostered one percent and everyone else, the post-industrial economy dazzles most through the glitter of gold, which overcomes every other moral value, from Facebook's willingness to allow its platform to be exploited for the purposes of overt political manipulation to Theranos's performing a million blood tests with a technology so flawed that the tests have had to be invalidated, at who knows what cost to the patients (one should say, victims) involved.

 

And what does America do in response? It makes movies, like Aaron Sorkin's The Social Network and an adaptation of John Carreyrou's own Bad Blood, a film said to be starring Jennifer Lawrence and due out in 2019, thus turning social anomie into entertainment, and promising even more offerings on the altars of extreme affluence.

 

Image Credit: Pixabay Image 1761832 by kropekk_pl, used under a CC0 Creative Commons License

One of my all-time favorite readings from past editions of Signs of Life in the USA is Andy Medhurst's "Batman, Deviance, and Camp." In that analysis of how the original muscle-man clone of Superman morphed into "Fred MacMurray from My Three Sons" in the wake of Fredric Wertham's notorious accusation in 1955 that Batman and Robin were like "a wish dream of two homosexuals living together," only to be transformed into the Camped Crusader of the 1966 TV series Batman, and then revised once more into the Dark Knight of the 1980s and beyond, Medhurst reveals how cartoon superheroes change with the times, reflecting and mediating the cross currents of cultural history. So as I ponder the rampant success of the second Deadpool film in this emergent franchise, I find myself wondering what this new entrant into the superhero sweepstakes may signify. Surely this is a topic for semiotic exploration.

 

What particularly strikes me here is the difference between the gloomy and humorless Batman of the Miller/Burton/Nolan (et al.) era, and the non-stop wisecracking of Deadpool. It isn't that Deadpool doesn't have a dark backstory of his own, as grim as anything to be found in Bruce Wayne's CV. And, surely, the Deadpool ecosystem is even more violent than the Batworld. No, it's a matter of tone, of attitude, rather than content.

 

Now, if Deadpool were the only currently popular superhero who cracked wise all the time, there really wouldn't be very much to go on here, semiotically speaking. But Deadpool isn't the only wiseacre among the men in spandex: various Avengers (especially Thor), along with the latest incarnation of Spider-Man, have also taken to joking around in the midst of the most murderous mayhem. If the Dark Knight soared to superstar status on the wings of melancholy, a lot of rising contenders for the super-crown appear to be taking their cue from Comedy Central. Something's going on here. The question is, what?

 

I'm thrown back on what might be called "deductive abduction" here: that is, moving from a general condition to a particular situation as the most likely explanation. The general condition lies in the way that wise-cracking humor has been used in numerous instances in which a movie whose traditional audience would be restricted to children and adolescents (think Shrek) has broken through to generational cross-over status by employing lots of self-reflexive, topically allusive, and winking dialogue to send a message to post-adolescent viewers that no one involved in the film is really taking all this fantasy stuff seriously, and so it's safe, even hip, for grown-up viewers to watch it (of course, this is also part of the formula behind the phenomenal success of The Simpsons). Stop for a moment to think about the profound silliness of the Avengers movies: who (over a certain age) could take this stuff seriously? Well, the wise cracks—which are generally aimed at those who happen to be over a certain age—are there to provide reassurance that it isn't supposed to be taken seriously. Just sit back, be cool, and enjoy.

 

So, given the R-rating of the Deadpool movies, I would deduce that the almost excessive (if not actually excessive) self-reflexive, topically allusive, and winking dialogue to be found in them works to reassure an over-seventeen audience that the whole thing is just a big joke. No one is taking any of this seriously, and so it is perfectly safe to be spotted at the local cineplex watching it. Hey, there's even a postmodern inflection to Deadpool's fourth-wall dissolving monologues: what could be more hip?

 

Since most cultural phenomena are quite over-determined in their significance, I do not mean to preclude any other possible interpretations of the super wiseass phenomenon, but the interpretation I've posted here is one I feel confident of. At any rate, the topic could make for a very lively class discussion and an interesting essay assignment.

 

Image Credit: Pixabay Image 2688068 by pabloengels, used under a CC0 Creative Commons License.

The lead-in to the L.A. Times article on the Tony Award nominations really caught my attention. Here it is:

 

"'SpongeBob SquarePants,' 'Mean Girls' and 'Harry Potter and the Cursed Child.'

'Angels in America,' 'Carousel' and 'My Fair Lady.'"

 

That's exactly as it appeared, and the title of the piece—"Tony nominations for 'Harry Potter,' 'SpongeBob' and 'Mean Girls' put Hollywood center stage"—made it clear that the author was well aware of the list's significance: that television and the movies appear to be taking over one of the last bastions of American live theater: the Broadway stage.

 

Now, before you get the idea that I am going to lament this development as some sort of cultural loss or desecration, let me assure you that I have no such intention. To begin with, Broadway has always occupied a somewhat liminal position in the traditional high cultural/mass cultural divide, and the stage has always been a setting for popular entertainment—albeit one that is not mediated by electronic technology. And while the article, for its part, does note that "Harry Potter and the Cursed Child" represents "just one example of the kind of pop-culture franchise that can reduce producers' financial risk," it does not do so in anger. Indeed, it even quotes a writer associated with this year's nominees' sole "art-house" production ("The Band's Visit"), who rather generously observes that "Commercial theater, and musical theater, is a really risky venture. It's very expensive. It's possible to have a great success, but it's really unlikely"; adding that "I don't blame anyone for trying to hedge against that risk by adapting a really well known property, and it's not always cynical."

 

There are two quick and easy popular-semiotic takeaways, then, from this year's Tonys. The first is that the last barriers between mass media entertainment and the more culturally prestigious (if less lucrative) traditional stage are coming down once and for all. The second is that they are coming down not on behalf of some sort of producer-led deconstruction of a vanishing high cultural/low cultural divide, but simply because very few Broadway producers are willing to take any financial risks these days and prefer to go with tried and true productions. And this doesn't simply mean translating blockbuster movies and TV shows to the stage: after all, revivals of "Angels in America" and "The Iceman Cometh" are also among the nominees this year.

 

But the real significance of the Tonys for me appears when we broaden the system in which to analyze the nominations to include what is happening in the movies and television as well. Here too we find revivals, reboots, sequels, prequels...in short, one studio or network franchise after another centered on a successful brand that never seems to run out of steam: "Avengers Infinity War" (note the endlessness implied); "Star Wars Forever"; "Roseanne" II and "Murphy Brown" redux; and so on and so forth. What this reveals is not only a similar spirit of creative bet hedging by going with tried and true entertainment commodities, but a narrowing of opportunities for creators themselves as well. For the message in this particular bottle is that in America success is the gift that keeps on giving. A few people (like George Lucas and J.K. Rowling) are going to rise from obscurity and hit it so big with their creative efforts that they will use up all the oxygen in the room. It isn't that there will be less creativity (with luminaries like Lucas and Rowling shining bright for innumerable self-publishing dreamers who hope to be the next meteors in the popular cultural skies, there will never be any danger of that); the problem is that there will be fewer opportunities to make such creative breakthroughs, or earn any sort of living while trying, when the stage (literally and figuratively) is filled with old brands that won't move aside for new entrants.

 

And so, finally, we come to a larger system within which to understand what is going on with the Tony Awards: this system is America itself, where a handful of winners are vacuuming up all of the opportunity and leaving almost nothing for everyone else (George Packer eloquently describes the situation in "Celebrating Inequality," an essay you can find in the 9th edition of Signs of Life in the USA). The rewards of the American dream are bigger than they have ever been; but not only are there fewer seats at the banquet of success, the pickings are getting leaner and leaner for those who haven't been invited.

 

Credit: Pixabay Image 123398 by smaus, used under a CC0 Creative Commons License

 

Though there have been some very high profile participants in the "movement" (can you spell "Elon Musk"?), I am not aware that the #deletefacebook movement is making much of a real dent in Facebook's membership ranks, and I do not expect that it ever will. For in spite of a seemingly continuous stream of scandalous revelations of Facebook's role in the dissemination of fake news and the undermining of the American electoral system—not to mention the way that Facebook, along with other digital titans such as Google, data mine our every move on the Internet—all signs indicate that, when it comes to America’s use of social media, the only way is up. Even the recantations of such former social media "cheerleaders" as Vivek Wadhwa (who have decided that maybe all this technological "progress" is only leading to human "regression" after all) are highly unlikely to change anyone's behavior.

 

The easiest explanation for this devotion to social media, no matter what, is that Internet usage is addictive. Indeed, a study conducted at the University of Maryland by the International Center for Media and the Public Agenda, in which 200 students were given an assignment to give up their digital devices for 24 hours and then write about their feelings during that bleak stretch, revealed just that, with many students reporting effects that were tantamount to symptoms of drug withdrawal (a full description of this study can be found in chapter 5 of the 9th edition of Signs of Life in the USA). To revise Marx a little, we might say that social media are the opiate of the masses.

 

Given the fact that our students are likely to have lived with the Internet all of their lives, it could be difficult, bordering on impossible, for them to analyze in any objective fashion just how powerful, and ultimately enthralling, social media are. It’s all too easy to take the matter for granted. But with the advent of digital technology looming as the most significant cultural intervention of our times, passive acceptance is not the most useful attitude to adopt. At the same time, hectoring students about it isn’t the most productive way to raise awareness either. All those “Google is making America stupid” screeds don’t help at all. So I want to suggest a different approach to preparing the way for a deep understanding of the seductive pull of social media: I'll call it a "phenomenology of Facebook."

 

Here's what I have in mind. Just as in that phenomenologically influenced mode of literary criticism called "Reader Response," wherein readers are called upon to carefully document and describe their moment-by-moment experience in reading a text, you could ask your students to document and describe their moment-by-moment experience when they use social media. Rather than describing how they feel when they aren't online (which is what the University of Maryland study asked students to do), your students would describe, in journal entries, their precise emotions, expectations, anticipations, disappointments, triumphs, surprises, hopes, fears (and so on and so forth) when they are. Bringing their journals to class, they could share (using their discretion about what to share and what not to) what they discovered, and then organize together the commonalities of their experience. The exercise is likely to be quite eye opening.

 

It is important that you make it clear that such a phenomenology is not intended to be judgmental: it is not a matter of “good” or “bad”; it is simply a matter of “what.” What is the actual experience of social media usage? What is it like? What’s going on? Only after clearly answering such phenomenological questions can ethical questions be effectively posed.

 

Not so incidentally, you can join in the exercise too. I’ve done it myself, and you may be surprised at what you learn.

 

 

Credit: Pixabay Image 292994 by LoboStudioHamburg, used under a CC0 Creative Commons License

In 1971, Norman Lear and Bud Yorkin reconfigured a popular British sitcom featuring a bigoted working-class patriarch (Till Death Us Do Part) to create America's All in the Family. A massive hit, All in the Family not only topped the Nielsens for five years running but also went a long way towards mediating the racial, generational, and sexual conflicts that continued to smolder in the wake of the cultural revolution. A new kind of sitcom, All in the Family (along with other such ground-breaking TV comedies as The Mary Tyler Moore Show) provided a highly accessible platform for Americans to come to terms with the social upheavals of the sixties, thus contributing to that general reduction of tension that we can now see as characteristic of the seventies. The decade that came in with Kent State went out with Happy Days.

 

So the recent reboot of Roseanne in a new era of American social conflict is highly significant. Explicitly reconstituting Roseanne Barr's original character as an Archie Bunkeresque matriarch, the revived sitcom raises a number of cultural semiotic issues, not the least of which is the question as to whether the new Roseanne will help mediate America's current cultural and political divisions, or exacerbate them.

 

In short, we have here a perfect topic for your classroom.

 

To analyze Roseanne as a cultural sign, one must begin (as always in a semiotic analysis) by building a system of associated signs—as I have begun in this blog by associating Roseanne with All in the Family and The Mary Tyler Moore Show. There are, of course, many other associations that could be made within the system of American television (Saturday Night Live, Family Guy, and The Simpsons loom very large here), but I'll limit myself here to the association with All in the Family because of the way that, right off the bat, it reveals an important difference—and semiotic significance is always to be found in a combination of associations and differences—that points to an answer to our immediate semiotic question.

 

This difference emerges from the well-known fact that Norman Lear was quite liberal in his politics and intended his show to be a force for progressive television, while Roseanne Barr is an outspoken conservative—a situation that has already produced a good deal of controversy. Consider C. Nicole Mason's Washington Post piece "‘Roseanne’ was about a white family, but it was for all working people. Not anymore," a personal essay that laments the Trumpist overtones of Roseanne Barr's new character. On the flip side of the equation, the new Roseanne has been an immediate smash hit in "Trump Country," scoring almost unheard of Nielsen numbers in this era of niche TV. Pulling in millions of older white viewers who prefer the traditional "box" to digital streaming services, the show is already reflecting the kind of generational and racial political divisions that burst into prominence in the 2016 presidential election. As Helena Andrews-Dyer puts it in the Washington Post, "The ‘Roseanne’ reboot can’t escape politics — even in an episode that’s not about politics."

 

Thus, while it may be too soon to tell for certain, I think that the new Roseanne will prove to be quite different from All in the Family in its social effect. Rather than helping to pull a divided nation together, the signs are that Roseanne is going to deepen the divide. I say this not to imply that television has some sort of absolute responsibility to mediate social conflict, nor to suggest that Roseanne's appeal to older white viewers is in itself a bad thing (indeed, the relative lack of such programming goes a long way towards explaining the show's immediate success). My point is simply semiotic. America, at least when viewed through the lens of popular culture, appears to be even more deeply divided than it was in 1971. Things have not stayed the same. Roseanne isn't Archie Bunker, Trump isn't Nixon, and everyone isn't laughing.

Jack Solomon

Things Fall Apart

Posted by Jack Solomon Expert Mar 29, 2018

 

While there appears to be significant doubt over whether Cambridge Analytica really had much effect on the outcome of the 2016 presidential election (Evan Halper at the L.A. Times makes a good case that it didn't), the larger story of how millions of Facebook profiles were mined for partisan purposes remains profoundly significant at a time when digital technology seems to be on the verge of undermining the democratic process itself. As such, the Facebook/Cambridge Analytica controversy is a worthy topic for a class that makes use of popular culture in teaching writing and critical thinking.

 

If you happen to be using the 9th edition of Signs of Life in the U.S.A., you could well begin with John Herrman's "Inside Facebook's (Totally Insane, Unintentionally Gigantic, Hyperpartisan) Political Media Machine." In this extensive survey of the many ways in which Facebook has fostered an ecosystem of political activists who invade your news feed with ideologically targeted content, Herrman shows how the marketing of online behavior has been transformed into a "(Totally Insane, Unintentionally Gigantic, Hyperpartisan) Political Media Machine." That our Internet activity is being tracked and our data mined is no secret anymore, and many people don't seem to mind—so long as it only results in specially curated advertising pitches and coupon offers. But what Herrman describes goes well beyond product merchandising into information manipulation: the building of highly politicized news silos where the news you get is the news that someone has calculated you want to get, and nothing else, as more and more Americans move away from such traditional news sources as newspapers and television to Facebook, Twitter, and a myriad of other social media.

 

Brooke Gladstone's "Influencing Machines: The Echo Chambers of the Internet" (also in the 9th edition of Signs of Life) succinctly explains the effect of this shift. With no pretense of presenting a balanced palette of news and information, the new media are exacerbating and deepening the social divisions in America, creating ideological echo chambers that effectively constitute alternate realities for those who inhabit them. The result is a kind of political and cultural echolalia.

 

It's little wonder, then, that the contending parties in America cannot find a way to communicate effectively with each other. Already divided by a history of cultural conflict and contradiction (chapter 7 of Signs of Life explores this division in depth), Americans have less and less in common with those whose lives lie on the other side of the great divide.

 

There is something profoundly ironic about all this. For many years it has been assumed that the effect of modern mass media has been to chip away at America's regional differences, flattening them out into a kind of unaccented (literally and figuratively) sameness: a mass culture watching the same TV shows, eating the same food, and talking in the same way. But now something is changing. Rather than tending towards a common culture, America, sliced and diced by digital algorithms, is dividing into mutually hostile camps.

 

William Butler Yeats said it best long ago at a time when his own country was divided in two: "Things fall apart," he lamented, "the centre cannot hold." Now there's something to hashtag.

 

 

Image Source: "Facebook security chief rants about misguided “algorithm” backlash" by  Marco Verch on Flickr 10/08/17 via Creative Commons 2.0 license.

Jack Solomon

And the Winner Is . . .

Posted by Jack Solomon Expert Mar 15, 2018

 

As I consider the cultural significance of this year's Academy Awards ceremony, my attention has not been captured by the Best Picture winner—which strikes me as a weird amalgam of Waterworld, Beauty and the Beast (TV version), and Avatar, with a dash of Roswell thrown in for good measure—but by something quite external to the event. Yes, I'm referring to the clamor over the 20% television ratings drop that has been lighting up the airwaves.

 

Fortune blames the drop-off on "the rapidly-changing viewing habits of TV audiences, more and more of whom are choosing to stream their favorite content online (including on social media) rather than watching live on TV," as do Vulture and NPR, more or less. They're probably right, at least in part. Other explanations cite the lack of any real blockbusters among the Best Picture nominees this year (Fortune), as well as Jimmy Kimmel's two-peat as Master of Ceremonies (Fortune). But the really big story involves what might be regarded as the transformation of the Nielsen ratings into a kind of Gallup Poll.

 

Consider in this regard the Fox News angle on the story: "Oscars ratings are down, and ABC's lack of control over the Academy may be to blame." Or Breitbart's exultation over the low numbers. And, of course, the President's morning-after tweet. In each case (and many others), the fallout from the falloff is attributed to voter—I mean viewer—disgust with the "elitist" and "liberal" tendencies of the Academy, which is now getting its comeuppance.

 

Is it? I don't know: a thorough analysis of the numbers seems to be in order, and I would expect that the ABC brass, at the very least, will be conducting one in an attempt to preserve their ad revenues. In my own view, whatever caused the ratings drop is certainly overdetermined, with multiple forces combining to reduce television viewership not only of the Academy Awards and the Super Bowl but of traditional televised media as a whole. Certainly Fortune, Vulture, and NPR are correct about the effect of the digital age on American viewing habits, but, given the leading role that Hollywood has played in the resistance to the Trump presidency, a deeper exploration of a possible growing resistance to the resistance, as evidenced in television viewing preferences, could shed some light on emerging trends in this country's culture wars.

 

Of course, the Fox News (et al.) take on the matter could prove to be fake news in the end, but even should that happen, the fact that the ratings drop could be so easily exploited for political purposes is itself significant. There are a number of takeaways from this. The first can be found in a Washington Post blog entitled "Trump is supercharging the celebrification of politics." The Post blog surveys an intensification of a cultural process that has been the core premise of nine editions of Signs of Life in the U.S.A.: namely, that the traditional division between "high" culture and "low" (or workaday and recreational) in America is being replaced by a single "entertainment culture" that permeates our society from end to end. The transformation has been going on for a long time, but Trump has intensified it.

 

But as the hoo-ha over the decline in Academy Awards television viewership demonstrates, this entertainment culture is not a common culture: Americans are lining up on two sides of a popular cultural divide that matches an ideological one, with Fox News audiences lined up against MSNBC's, and innumerable other viewership dichotomies (Duck Dynasty, say, vs. Mad Men) indicating just how wide the culture gap has grown. So now we're counting audience numbers for such once broad-appeal spectacles as the Super Bowl and the Academy Awards to see which side is "winning." This is a new normal indeed, and it is indicative of a country that is tearing itself apart.

 

But then again, the same Post blog that I've cited above reports that the most-read Washington Post story for the day on which the blog appeared concerned "the season finale of The Bachelor"—a TV event that really puts the soap into soap opera. So maybe there actually is something of world-historic importance for Americans to rally 'round after all.

 

Image Source: "Academy Award Winner" by  Davidlohr Bueso on Flickr 09/06/09 via Creative Commons 2.0 license

I had not planned on writing on this topic as my Bits Blog posting deadline approached. But when a headline in the L.A. Times on February 21st blared that "Conspiracy theories about Florida school shooting survivors have gone mainstream"—and this on a day when America's schoolchildren rose up to say "enough is enough" about gun violence—I felt that I ought to say something. What to say, however, is difficult to decide. As I wrote after the Route 91 Harvest music festival massacre in Las Vegas, I am not confident (to put it mildly) that anything meaningful is going to be done (the L.A. Times has nailed it with a "Handy clip-and-save editorial for America's next gun massacre"), and I don't have any solutions that the students now marching for their lives aren't already proposing more effectively than I can. But the whole mess has—thanks to something I've read in the Washington Post—enabled me to crystallize a solution to a critical thinking conundrum that I've been pondering, and that's what this blog post will be about.

 

That conundrum is how to teach our students to distinguish between reliable and unreliable information on the Internet. It seems like such an easy thing to do: just stick to the facts and you'll be fine. But the purveyors of conspiracy theories have grown sophisticated at mimicking the compilation of "factual" evidence and posting it all over the Internet in ways that confuse people into thinking there are enough cross-referenced sources to make their fairy tales believable, and that makes it much more of a challenge to teach students what's rot and what's not. And as I've also written in this blog, the challenge isn't made any easier by academic attacks on objective factuality on behalf of poststructural theories of the linguistic and/or social construction of reality. So, as I say, the matter isn't as simple as it looks.

 

Here's where that Washington Post article comes in. In his opinion piece "Why the Parkland students have made pro-gun conservatives so mad," Paul Waldman identifies what can serve as a simple litmus test for cutting through the clutter in an alt-fact world: keep an eye out for ad hominem arguments in political argumentation.

 

Here's how he puts it:

The American right is officially terrified of the students of Marjory Stoneman Douglas High School. Those students, who rapidly turned themselves into activists and organizers after 17 of their fellow students and teachers were murdered at their school, have become the most visible face of this new phase of the gun debate, and conservatives are absolutely livid about it. As a consequence, they’re desperately arguing not just that the students are wrong in their suggestions for how gun policy should be changed, but also that they shouldn’t be speaking at all and ought to be ignored.

 

There are two critical reasons the right is having this reaction, one more obvious than the other. The plainer reason is that as people who were personally touched by gun violence and as young people — old enough to be informed and articulate but still children — the students make extremely sympathetic advocates, garnering attention and a respectful hearing for their views. The less obvious reason is that because of that status, the students take away the most critical tool conservatives use to win political arguments: the personal vilification of those who disagree with them.

 

It is the use of "personal vilification of those who disagree" that reliably marks out an evidence-starved argument. Thus, when Richard Muller—once a favorite of the climate change denial crowd—reviewed his data and announced in 2012 that he had changed his mind and concluded that climate change is both real and anthropogenic, his erstwhile cheerleaders simply began to call him names. And you probably don't even want to know about the personal attacks they have been making on Michael Mann.

 

But given the high level of personal vilification that takes place on the Net (the political left can be found doing this too), our students have probably been somewhat desensitized to it, and they may even take it for granted that this is how legitimate argumentation takes place. This is why it is especially important that we teach them about the ad hominem fallacy, not simply as part of a list of logical and rhetorical fallacies to memorize but as a stand-alone topic addressing what is probably the most common rhetorical fallacy to be found on the Internet, and in political life more generally, these days.

 

Now, we can't stop simply with warning our students against ad hominem arguments (we should teach them not to make them either), but we can use the warning as a point of departure: if someone's claims are swathed in personal attacks and accusations, it is likely that there is nothing of substance behind the argument. After all, an ad hominem attack is a kind of changing of the subject, a distraction from the attacker's lack of any relevant evidence.

 

I know this won't change the world, and it is of no use against the sort of people who are now vilifying American schoolchildren who have had enough, but at least it's a place to begin for writing and critical thinking instruction.