Bedford Bits


One of the key principles upon which the semiotic method is based is that of the cultural mythology. Grounded in Roland Barthes’ pioneering study Mythologies, a cultural mythology is an ideologically inflected worldview (or set of worldviews) that shapes social consciousness. Unlike the stricter forms of social constructionism, which hold that reality itself is a social construct, the mythological viewpoint—at least as I present it in Signs of Life in the U.S.A.—is essentially subjective, and can be tested against the objective realities that surround it. So passionately are cultural mythologies held, however, that when reality does break through, the result can be quite emotional, even violent.


Take climate change denial, for instance. Effectively a sub-cultural mythology in its own right, it responds to a steady stream of objective evidence that climate change is real with ever more insistent denials from its adherents. Or then again, take America's fundamental mythology of the American dream—which holds that opportunities for social and economic advancement are open to all who make the effort to achieve them—and consider what happens when uncomfortable realities challenge it, as just happened with the still unfolding college admissions scandal.


The extraordinary level of emotion—and media attention—that has greeted this scandal is especially indicative of what happens when a cultural mythology smashes into reality. For here is evidence, especially painful for the middle class, that even college admissions can be bought through schemes open only to the upper class that Americans are so slow to recognize exists at all. In a certain sense, I must confess, I'm a little surprised by the depth of the reaction. I mean, didn't everyone already know about the advantages—from legacy admissions to exclusive prep schools to expensive SAT tutoring—that America's upper classes enjoy when it comes to elite college admissions? Somehow I can't help but be reminded of that iconic scene in Casablanca where Captain Louis Renault is "shocked" that "gambling is going on" in Rick's Café Américain, just as he is about to receive his own winnings.


So there is something about this current glimpse into what upper-class privilege is all about that has really struck a nerve. I see at least three facets to the scandal that help explain how and why. First is the high-profile celebrity involvement. As an entertainment culture, America adores and identifies with its favorite entertainers, so when two popular actresses, and their children, are alleged to have taken advantage of their wealth in order to slip past the guardians of a supposedly meritocratic college admissions system, the feeling of betrayal runs especially deep.


The second component of the scandal is that—since even before the Great Recession hit—career opportunities for America's college graduates (especially if they are not STEM majors) have been closing down, increasing the pressure to get into one of those schools whose graduates have the best chance at the few good jobs that are left. Suddenly, where you go to college seems to matter a lot more in determining where you are going to get in life.


Which takes us to the third angle on the phenomenon: the stunned realization not only that the American dream is a cultural mythology but that the whole game appears to have been rigged all along. The effect of this apprehension on American society today cannot be overestimated. It is, in good part, behind the rise of political "populism" (it may be significant in this regard that conservative commentary on the scandal gloats over the "liberal" Hollywood elites involved), as well as the accompanying divisions in a society where more and more people are competing for fewer and fewer slots in the good life—slots which appear to have been purchased in advance as part of the social scenario of a new Gilded Age.


Photo Credit: Pixabay Image 1701201 by davidsenior, used under the Pixabay License

The "Momo Challenge"

Posted by Jack Solomon, Mar 14, 2019


When I first started writing about popular cultural semiotics in the 1980s, the Cabbage Patch Kids were the biggest thing going in children’s consumer culture. Not too many years later there was the POGs pandemic, followed by the Pokémon outbreak, which has since crossed its original generational boundaries to continue on as what may be the most lucrative gaming phenomenon of all time.


The common thread running through all these mega-fads is the way that they all were disseminated—at least in their beginnings—via a mysterious children’s grapevine unknown to adults, a vast international playground of sorts in which word about the Next Big Thing got passed without the assistance of social media. And now that the grapevine has gone digital, as it were, the propagation of new kiddie fads is accelerating at warp speed, with unsettling results.


A couple of recent articles from The Atlantic and the New York Times provide a case in point. Describing the apparition of an online poltergeist called "Momo" who pops up unexpectedly on social media and dares kids to, among other things, commit suicide, they tell of a burgeoning panic among parents, police departments, and major news outlets around the globe. The new fad is called "the Momo challenge," and it would be pretty scary—except that it's a hoax.


Taylor Lorenz sums up all the confusion rather nicely:

On Tuesday afternoon, a Twitter user going by the name of Wanda Maximoff whipped out her iPhone and posted a terrifying message to parents.


“Warning! Please read, this is real,” she tweeted. “There is a thing called ‘Momo’ that’s instructing kids to kill themselves,” the attached screenshot of a Facebook post reads. “INFORM EVERYONE YOU CAN.”


Maximoff’s plea has been retweeted more than 22,000 times, and the screenshot, featuring the creepy face of “Momo,” has spread like wildfire across the internet. Local news hopped on the story Wednesday, amplifying it to millions of terrified parents. Kim Kardashian even posted a warning about the so-called Momo challenge to her 129 million Instagram followers.


To any concerned parents reading this: Do not worry. The “Momo challenge” is a recurring viral hoax that has been perpetuated by local news stations and scared parents around the world. This entire cycle of shock, terror, and outrage about Momo even took place before, less than a year ago: Last summer, local news outlets across the country reported that the Momo challenge was spreading among teens via WhatsApp. Previously, rumors about the challenge spread throughout Latin America and Spanish-speaking countries.


The Momo challenge wasn’t real then, and it isn’t real now. YouTube confirmed that, contrary to press reports, it hasn’t seen any evidence of videos showing or promoting the “Momo challenge” on its platform.


If Momo is a hoax, why, then, has she produced such a panicky reaction? John Herrman's take on the matter is instructive. "Screens and screen time are a source of endless guilt and frustration" for modern parents, he writes, "so it makes sense to need to displace these feelings on a face, a character, and something, or someone, with fantastically evil motives, rather than on the services that actually are surveilling what the kids are up to, to ends of their own."


In other words, if "Momo" isn't real, the way that the corporate Net is invading our privacy, "mining" our data, and leading our children down a Pied Piperish path (one which makes the exploitations of traditional television look like a nineteenth-century Fourth of July parade) is, and grownups are accordingly getting very jumpy. "Momo" may be a hoax, but Slender Man wasn't, and therein lies the real "Momo challenge": the Internet is growing faster than our ability, or even desire, to shape it to human needs, rather than corporate ones. And the kids, who usually know what's going on before their parents do, could actually be the canaries in a creepy digital coal mine.



Photo Credit: Pixabay Image 2564425 by StockSnap, used under the Pixabay License

For quite some time now I have been intimating in this blog that entertainment may not be the most effective way of achieving political goals, due to the way that it can distract its audience from the task of actual political engagement. Thus, I was inevitably struck by Steve Almond's forthright argument to this effect in a recent op-ed for the Los Angeles Times. But while reading Almond's essay I found myself beginning to question my own position, and while I'm not quite ready to abandon it entirely, I do believe that it may need some modification in the light of recent developments in American political culture.


To see why, let's start with Almond's thesis. Arguing that the superb political comedy that has erupted in the wake of the Trump presidency has only played into the hands of a man "who relishes and exploits his beefs with comedians . . . [and who] doesn’t see them as degrading the office of the presidency so much as transforming that office into an adjunct of the entertainment industry, where what matters most is your ratings," Almond suggests that the "towering irony here is that the essential mission of comedians in the Age of Trump is identical to that of the man they mock." Thus, both Trump and his opponents "preach that our political and media classes are essentially corrupt. Both use shtick to convert our distress at this dysfunction into disposable laughs. In other words, both turn politics into show business." The upshot of all this, Almond concludes, is that "Halfway through his reign, Trump has reaffirmed a truth that extends from King Lear to Norman Lear: A kingdom that relies on court jesters to confront mad rulers is doomed. The Fool is not a redeemer. His role is to defuse, by means of laughter, the moral distress that presages redemption."


In short, comedians like the cast of SNL and Stephen Colbert are making their audiences feel too good to actually go out and do anything (like vote). But there's a certain paradox here, for if Trump used comedy to capture the White House, so too can his opposition. In other words, if Almond's argument is right, it's also wrong. What worked for one side can work for the other. Maybe SNL and Stephen Colbert (et al.) can help lead the revolution.


Only the future will reveal whether this will prove to be true, but for now we can take away one surety from Almond's essay: America's entertainment culture has engulfed our entire society so thoroughly that none of the old barriers between "high" culture and "low" truly exist anymore. Popular culture, with its mandate to entertain, is our dominant culture, for better or for worse.



Image Credit: Pixabay Image 3774381 by mohamed_hassan, used under the Pixabay License

Twenty-six years ago, almost to the day, I set about rewriting the general introduction to what would become the first edition of Signs of Life in the U.S.A. Seeking something of sufficient magnitude and familiarity to effectively introduce an audience of composition students to the then-unfamiliar (and ostensibly forbidding) field of cultural semiotics, I chose the Superbowl, which, I noted, is "more than just a football game. It's an Event, a ritual, a national celebration, and show-time" for those corporate high rollers who can afford the ever-increasing cost of advertising.


As I contemplate the semiotic significance of Superbowl LIII, it's as if I am being visited by the Ghost of Superbowls Past, comparing the present game to those that have gone before and wondering about the future. And at first glance, much remains the same. The Superbowl is still an Event, is still a national ritual, and its advertising has come even closer to overshadowing the game itself, with specially made commercials released in advance, game-time polling to "elect" the most popular ads, and plenty of post-game punditry devoted solely to the advertising.


But there is also a detectable difference this time around, a pivot away from the past into an unsettling present in which the words "national celebration" may appear to no longer apply. For Superbowl LIII was as riven by pre-game controversy as it was afflicted by a generally lackluster performance on the field, a disturbing dissonance that makes the Ghost of Superbowl Present a rather ominous apparition indeed.


The causes of this dissonance are well known. They include the infamous un-called pass interference that helped put the Rams into the NFL final and galvanized the city of New Orleans into creating its own game-day counter event—not to mention the filing of a couple of lawsuits. And they also include the on-going controversy swirling around the Kaepernick-inspired taking-a-knee protests that, having been suppressed by the NFL, resulted in an artist boycott of the half-time show. Which led, in turn, to yet another controversy involving the rather-less-than-household-word band that, so to speak, crossed the picket line to perform.


But beyond these more particular conflicts there looms the vast conflict that is America itself today, which no amount of "unity" advertising (one of the notable commercial themes to be found in Superbowl LIII's ad lineup) is likely to disperse. The situation is such that it's ironic now to think how, once upon a time, the Dallas Cowboys could award themselves the distinction of being "America's team," and make it stick. Today such an epithet might be regarded as an oxymoron.


Interestingly, one sign of unity that I did detect on Superbowl Sunday appeared in New Orleans itself, where a highly diverse population of all ages turned out for an anti-Superbowl party that really looked to be more fun than the usual script for the conquering-heroes victory parades staged in the cities of the actual winners of the game. Could it be that we have here an example of a way of coming together in a common cause wherein both winning and losing are irrelevant?


Alas, no. For the unity displayed on the streets of New Orleans on Superbowl Sunday was motivated by anger and resentment, an us-against-the-world vibe quite in keeping with the overall tenor of American politics these days. The partying crowd in New Orleans had wanted to win, and, being denied their victory, chose defiance.


When you add into the mix the elaborate conspiracy theories that enveloped the game—accusations that the Rams/Saints game was rigged by the NFL high command to get L.A. into the Superbowl to help pay for the new five billion dollar stadium being built there—a dark new significance begins to emerge. Indeed, with bizarre accusations that the entire NFL season had been rigged circulating through the Internet, the specter of an America so torn by distrust and disillusionment that even its favorite one-day sports event can't escape conspiratorial contamination rudely enters the picture. If this is the Ghost of Superbowl Present, what will the Ghost of Superbowls Future bring?



Image Credit: Pixabay Image 3558732 by QuinceMedia, used under a Pixabay License.

What with all the hoopla surrounding Gillette's notorious "toxic masculinity" commercial, I feel almost obliged to address it in this blog. The challenge here is to provide a semiotic angle on the ad's significance without getting tangled up in a debate over what it is trying to say about male behavior. Rather, the semiotic question concerns what the ad is telling us about contemporary American society as a whole, which has gotten me thinking more about razor blades than at any time since I stopped shaving in 1979.


A shrewd analysis of the ad in the Washington Post has given me a useful opening on the problem, and so I'll begin there. In "What Trump’s fast-food feast and Gillette’s woke razor blades have in common," Sonny Bunch draws an interesting parallel between Donald Trump's fast food spread for the Clemson Tigers and Gillette's ad by arguing that each was choreographed, in effect, to appeal to one side in the current national divide, while aggravating the other. As Bunch puts it, Trump "plays right to his populist strengths, assembling a mélange of foods that every American is familiar with and most Americans have eaten . . . [setting] a perfect trap for his critics, whose sneering at the feast will come off as elitist . . . [and thus playing] up the 'us against them' angle that has formed the heart of his appeal." Gillette, for its part, is playing to "what it hopes to claim as its base: the Ethical Consumer Signaling His Virtue, a valuable subset of customer, as Nike discovered with its Colin Kaepernick campaign." In short, Bunch concludes, "both the Fast Food Feast and Woke Gillette are explicitly designed to inspire mockery and, therefore, implicitly designed to encourage the us-vs.-them dichotomy that defines modern American life."


Now, whether or not Gillette harbored any intention to provoke, there is plenty of evidence that its ad certainly did so, as a quick Internet search on the topic will show. Quite predictably, one can find conservative media outlets like Fox News railing against it, while Vox, for its part, is in accord with the ad's message. The polarization is just as Bunch describes it.


All this raises a question, then, as to the actual effectiveness of politically provocative advertising in itself. The most common measure of such effectiveness, of course, is financial: that is, whether a provocative ad campaign increases sales and stock valuations for the company that creates it. For example, as I've noted in an earlier blog, the big question surrounding Nike's Colin Kaepernick campaign was what would happen to Nike's stock price. When the stock at first drooped, antagonistic pundits declared the campaign to be a failure. When Nike's stock recovered, the ad was declared a success. Similarly, Jack Neff at Ad Age observes that, since Gillette's object in its "toxic masculinity" ad is to attract millennials to its products, "the ultimate test of whether Gillette has turned millennials into believers will be sales."


Neff, of course, is right, just as anyone who argues that Nike's Kaepernick campaign is a success because the company's stock price is currently up is right. After all, increasing profits is what advertising is for. But does commercial success equate to cultural success?  Gillette's claim is that its ad is intended to start a "conversation" about male behavior—presumably to do something about it.  So, is the Gillette ad successful in that sense?


Here the measure of success is much more difficult to determine. Did Coca-Cola make the world a better place with its "I'd Like to Teach the World to Sing (in Perfect Harmony)" campaign? Have the Benetton Group's United Colors and Unhate campaigns achieved their (noncommercial) goals? Will Gillette really start a conversation that will make men behave better?


One way to approach this problem is to consider Bunch's contention that the Gillette campaign (and others like it) antagonizes even as it appeals, reproducing the us-vs.-them dichotomy that afflicts the country today. If Bunch is right, Gillette is preaching to a choir, not converting the opposition, and that is hardly likely to improve the situation. Wouldn't a more Rogerian approach be more effective?


Perhaps, but in the current cultural and political climate, a Rogerian ad campaign probably wouldn't get much attention, thus negating the financial and social goals of a socially conscious corporation. Controversy both sells and rallies the troops, and one can hardly blame Gillette for doing what everyone else is doing.


Then there's the whole problem of consumer activism, as a possible oxymoron, to consider. The question here is not unlike that posed by the phenomenon known as "slacktivism"—a derogatory term for social media activism that ends at clicking "like" buttons, signing petitions, and retweeting political tweets (you can read more about this in Brian Dunning's "Slacktivism: Raising Awareness" in the 9th edition of Signs of Life in the USA). That is, purchasing a product because the company that sells it shares your values (or wants you to believe it does) is an act of consumption, not a direct action, even though buying a product for political reasons may feel like doing something meaningful. But is it?


What we have in the end is a powerful signifier of what it means to live in a consumer society. In such a society, consumption, as the measure of all things, is routinely confused with action, whereby wearing the tee shirt is regarded as a substantive political act. This sort of thing can be quite good for the corporate bottom line, but whether it is good for democracy is another question.



Image Credit: Pixabay Image 2202255 by WikimediaImages, used under a Pixabay License

One of the crucial elements in teaching, and performing, popular cultural semiotics is the identification of the larger contexts (or systems of associations and differences) in which particular popular signs may be situated. This means that one must be aware not only of current popular movies, TV shows, consumer trends, etc., in order to conduct semiotic analyses of them, but that one must also be ever attuned to what one might call, for lack of a better term, the "spirit of the age." In this vein, then, my first blog for the new year will constitute a semiotic analysis of the spirit of the digital era, beginning with what will likely appear to be a rather peculiar starting point: namely, the eighteenth-century European Enlightenment.


I start here due to a purely fortuitous decision to pull an old book off my shelf last week that I should have read forty years ago but didn't, until now: Garry Wills' Inventing America (1978). Now, I don't want to get involved here in the somewhat controversial thesis Wills proposed about the sources of Jefferson's thought and language when he first drafted the Declaration of Independence—that's something better left for specialists in the field. Rather, I am only interested in the extraordinary revelations of the ins and outs of Enlightenment thinking that Wills masterfully presents. In a word, Wills reveals that behind the Newtonian clockwork universe informing much of Enlightenment discourse was a veritable religion of the number. And I'm not just talking about the quantitative advances that led, towards the end of the eighteenth century, to the Industrial Revolution and the emergence of scientific modernity; I'm talking about the ecstatic belief that Newtonian methods could be applied to the explication of, and solution to, every human problem.


Let me offer (courtesy of Wills' ample citations) a particularly striking example. Here is Francis Hutcheson's algebraic formula for the measurement of human morality as presented in the second edition of his founding text for the Scottish Enlightenment, Inquiry into the Original of our Ideas of Beauty and Virtue (1726):


M = (B + S) × A = BA + SA; and therefore BA = M − SA = M − I, and B = (M − I)/A

[where B = Benevolence, A = Ability, S = Self-Love, I = Interest (= S × A), and M = Morality].


Actually, there's more to the formula than I've reproduced here, but you'll get the point. Here, from a Presbyterian divine, we find dramatic evidence of the extraordinary prestige of the Newtonian method: the belief that if Newton could use mathematics to measure and explain the universe, philosophers could do the same in measuring and guiding humanity.
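Spelled out in modern terms, Hutcheson's moral algebra amounts to simple arithmetic. Here is a minimal sketch of the calculation (the numerical values plugged in are invented purely for illustration; only the formulas come from Hutcheson):

```python
# Hutcheson's moral algebra (Inquiry, 2nd ed., 1726):
#   M = (B + S) * A,  with Interest I = S * A,
#   so B = (M - I) / A.

def morality(benevolence, self_love, ability):
    """Total moral 'moment' produced: M = (B + S) * A."""
    return (benevolence + self_love) * ability

def recover_benevolence(m, interest, ability):
    """Solve for B given an observed M: B = (M - I) / A."""
    return (m - interest) / ability

# A hypothetical agent with B = 3, S = 1, A = 2:
m = morality(3, 1, 2)             # (3 + 1) * 2 = 8
i = 1 * 2                         # I = S * A = 2
b = recover_benevolence(m, i, 2)  # (8 - 2) / 2 = 3.0, recovering B
```

The point of the second function is Hutcheson's own: given a measure of the good an agent produces and a measure of the agent's self-interest, virtue itself could supposedly be computed.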


Sound familiar? Substitute the words "big data" for "mathematics" and you've got the current zeitgeist in a nutshell. For here too, from Steven Pinker to the purveyors of AI, digital humanists to data analysts, Educause to edX, and so on and so forth ad infinitum across our cultural spectrum, we can find what is effectively a religious faith in the omnipotence of numbers.


The Enlightenment accordingly offers a significant point of association to which we can relate our current l’esprit de l’époque. But (and I can never repeat this often enough) the significance of a phenomenon lies not only in what it can be associated with but also in its differences from similar phenomena within its system. And there is a difference between the origins of the enormous cultural prestige enjoyed by Enlightenment mathematics and of twenty-first century data worship. For while the Enlightenment was wowed by Newton's scientific achievements (achievements, it can be added, that long preceded any large-scale commercial applications), the wow factor today (as I have noted before in my blog on the "religion" promoted by the now-defunct corporation Theranos) derives from the unimaginably huge fortunes that have been made, and will continue to be made, by the corporate masters of big data. Google effectively started it all by finding a way to monetize its free services by tracking our online behavior and selling it to marketers, making personal data the holy grail of post-industrial capitalism (Facebook, of course, is the second biggest name in this tradition). The difference, in a word, is between science and commerce, with the Googleplex and its offspring occupying the cultural role once occupied by Newtonian physics. To put it another way, here is yet another signifier of our hypercapitalist culture.


Whether this hypercapitalist faith is a good thing is a value judgment, and since the goal of teaching cultural semiotics is to provide students with the critical equipment necessary to make informed judgments of their own, not to dictate those judgments to them, I will withhold my own here. But I will say this much: Hutcheson's equations, as well intentioned and nobly founded as they may have been, look pretty silly to us today. And I can't help but wonder how our current data-infatuated zeitgeist will look to future culture critics.



Image Credit: Pixabay Image 3088958 by xresch, used under a Pixabay License

Posted by Jack Solomon, Dec 20, 2018


One of the more interesting recent news items from the world of American popular culture has been the announcement that Netflix, rather than cancelling its streaming reruns of that Gen X TV blockbuster, Friends, on January 1, 2019 (as many viewers feared), has actually decided to up its payments for the rights to the series from $30 million to $100 million per year. The continuing popularity of this pop culture icon in an era decades later than the period in which it originated offers a particularly good topic for semiotic analysis, revealing how the same cultural sign can signal entirely different meanings when the context in which it appears changes.


When we look at those contexts, the striking thing about the early 1990s and the mid-twenty-teens is their similarity. For the early 1990s, too, was a time of reduced expectations in the wake of a searing recession. Though Millennials and iGens today may not be aware of it, Generation X too was identified as the first generation that expected to do more poorly in life than their parents. Theirs was the Grunge era, when youth culture, making the best of a bad situation, turned to a shabby-chic aesthetic, reviving the thrift-shop consumer ethos of the late 1960s and shrugging off the glitz and glam of the "go-for-the-gold" 1980s. The cast of Friends—in a thoroughly unrealistic evocation of the new spirit with their West Village digs—accordingly made personal relationships more important than material possessions, and thus became role models for a generation that felt left out of the American dream.


Sound familiar? After all, today's young, whether Millennials or iGens, are coming of age in the long shadow of the Great Recession, and so can find much in common with these six young adults whose portrayers are now, after all, the age of iGen parents. So with both Gen X nostalgia, and iGen relatability, on its side, it's no surprise that Friends should be worth $100 million to Netflix, as the streaming service maneuvers to survive in an era of intense competition.


But a little more research into the enduring popularity of Friends reveals something of a surprise, a difference upon which we can hang a semiotic interpretation. For it appears, according to an article in the New York Times, that for iGen viewers the appeal of Friends lies not in the personal relationships but in the thoroughly laid back lifestyles of the friends in question. This group of people prefers hanging out with each other at their favorite coffee house—and otherwise taking time out from their jobs—to the frantic pursuit of career success. It isn't that they don't have certain career aspirations, but they don't get all worked up about them. They'd rather fool around.


This reveals the dismal reality facing today's youth: the worst of all possible worlds. At a time when the gateways to socio-economic prosperity and career satisfaction are either narrowing or slamming shut entirely (especially if technology isn't your thing), the cultural pressure is to achieve big-money career success, to be the next Elon Musk or Steve Jobs. The Grunge era said, in effect, "if the opportunities aren't there, wealth isn't where it's at anyway: learning to live with less in the way of material prosperity by turning to your friends and lovers is the way to go"; while the Google era says, "if you can make it to the top, join the club, your TED talk invite is in the mail; otherwise, tough." No wonder at least some young fans of Friends feel nostalgia for an era they never experienced.


I think that there may be an added dimension, another difference, that accounts for the enduring popularity of Friends in a new era. For in that dim and distant time before smart phones, when these six friends wanted to get together, they really got together, in person, not via text, Facebook, Instagram, or whatever. Today, the smart phone is the center of social attention, and a continuing stream of news reports cite an accompanying teen despondency over an inability to socialize with others in person. Facebook has swamped face-to-face.


Thus, it is highly likely that younger fans today are responding to something that has been taken away from them. So here is a case where popular culture, which so often reflects the need for each generation to step out of the shadow of the previous, presents the spectacle of youthful nostalgia for what is effectively the world of their parents. Once a sign of Gen X adaptation to tough times, Friends is now a signifier, paradoxically enough, of loss.


Photo Credit: Pixabay Image 3774381 by mohamed_hassan, used under a CC0 Creative Commons License

As I've noted before, I once participated in an online forum where the participants quarreled a lot. One of the things they griped about was the way that some members "padded" their post count with lots of very brief entries intended to run up their score. Their goal—due to the fact that every forum member was ranked individually against the entire membership—was to make it to the top of the heap.


I thought the whole thing was rather silly at the time, but I did find myself on occasion being dragged into the competition. I recognized that the larger motive behind the forum's incentives to "reward" quantity over quality was to encourage site activity, and that the forum owners themselves were engaged in a post-count competition with similarly themed forums. What I didn't know at the time was that there is a name for the way that the site was designed: it's called "gamification."


Gamification is the process by which an activity that is not, in itself, a game, is turned into one. "Players" are ranked according to their levels of participation. This website, for example, is gamified, with all of us ranked, badged, and labeled according to a rather bewildering number of criteria, some of which I still haven't wholly figured out. And, as Stephanie Miller's "The Power of Play: Gamification Can Change Marketing" reveals, a lot of marketing campaigns are being gamified as well, like Domino's Pizza Hero mobile app feature (you can find her article in the 9th edition of Signs of Life in the USA). Even educators are looking into gamification as a way of transforming American education.
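By way of illustration, the point-and-badge machinery behind a gamified forum of the kind described above can be sketched in a few lines of Python (the badge names and thresholds here are invented, not those of any actual site):

```python
from collections import defaultdict

# Hypothetical badge thresholds, checked from highest to lowest.
BADGES = [(100, "Expert"), (25, "Contributor"), (1, "Newcomer")]

class Forum:
    """A toy gamified forum: every post earns a point, regardless of quality."""

    def __init__(self):
        self.points = defaultdict(int)

    def post(self, user):
        self.points[user] += 1  # quantity, not quality, is what scores

    def badge(self, user):
        for threshold, name in BADGES:
            if self.points[user] >= threshold:
                return name
        return None

    def leaderboard(self):
        # Every member ranked against the entire membership.
        return sorted(self.points, key=self.points.get, reverse=True)
```

Notice that nothing in the design rewards the content of a post, only its existence, which is precisely the incentive to "pad" a post count.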


"Well so what?" you may be thinking. "What's the harm in making things fun?" The problem (and there is a problem) only appears when rampant gamification is subjected to a semiotic analysis. For when it is considered in the context of the larger system of contemporary American culture, we can see how gamification is a reflection of an overall hypercapitalistic tendency to turn everything into a winner-takes-all competition, with all of the "losers" that that entails.


Gamification looks even more sinister in the light of Sarah Mason's exposé, "High score, low pay: why the gig economy loves gamification," which details the way that it is being employed to incentivize worker productivity without a corresponding increase in actual income. Going beyond her own personal experience as a Lyft driver subject to the sirens of the game, Mason reveals a form of worker exploitation that is intentionally grounded in the psychology of gambling addiction. Here's how she puts it:


In addition to offering meaningless badges and meagre savings at the pump, ride-hailing companies have also adopted some of the same design elements used by gambling firms to promote addictive behaviour among slot-machine users. One of the things the anthropologist and NYU media studies professor Natasha Dow Schüll found during a decade-long study of machine gamblers in Las Vegas is that casinos use networked slot machines that allow them to surveil, track and analyse the behaviour of individual gamblers in real time – just as ride-hailing apps do. This means that casinos can “triangulate any given gambler’s player data with her demographic data, piecing together a profile that can be used to customise game offerings and marketing appeals specifically for her”. Like these customised game offerings, Lyft tells me that my weekly ride challenge has been “personalised just for you!”


Former Google “design ethicist” Tristan Harris has also described how the “pull-to-refresh” mechanism used in most social media feeds mimics the clever architecture of a slot machine: users never know when they are going to experience gratification – a dozen new likes or retweets – but they know that gratification will eventually come. This unpredictability is addictive: behavioural psychologists have long understood that gambling uses variable reinforcement schedules – unpredictable intervals of uncertainty, anticipation and feedback – to condition players into playing just one more round.


In short, what is happening here goes well beyond mere fun. Gamification is at once a form of behavior modification and an extension of the surveillance society in which we live, where everything we do is tracked and data mined on behalf of corporate profits that are not shared with the vast majority of the population. With artificial intelligence—which is grounded in mass data collection and algorithmic analysis—emerging as the newest breathlessly hyped game on the block, we can see that this hypercapitalistic cultural tendency is only going to continue its expansive intrusions into our lives. And that's not just fun and games.



Image Credit: Pixabay Image 1293132 by OpenClipart-Vectors, used under a CC0 Creative Commons License


You've heard about it before: someone perches on the edge of a rooftop, or a waterfall, or a granite outcropping, to take a vertiginous photo of the drop-off, hundreds—perhaps thousands—of feet below. Or reclines on a railway line to take a quick selfie as a locomotive looms in the background. Or does one thing or another that is exceptionally dangerous in order to get an eye-popping image that might capture a crowd on Instagram . . . and, sometimes, perishes in the act, as recently happened with a husband-and-wife team of travel bloggers in Yosemite National Park.


As I say, there's nothing new about this, and there are plenty of articles scattered all over the Internet detailing the phenomenon, often containing academic commentary on the meaning of it all, as does this article in Vice from 2017. So, given the familiarity of what might be called "Fatal Selfie Syndrome," and, more importantly, the fact that your students are likely to be part of the audience to which such photos are directed, this is a popular cultural topic that calls for semiotic analysis.


Let's start with the basics. The fundamental goal behind dangerous Instagram photos (or YouTube videos, etc.) is to get attention. While the most daring of the bunch also tend to be thrill seekers, thrill seeking is not the primary motivation for the simple reason that the chosen poses are designed for publicity, not for a privately enjoyed experience. But this elementary explanation then raises the question of what all this attention getting signifies.


Here we can go back to the early days of the Net. The advent of the personal web log and/or web page in the 1990s signified the emergence of a democratizing challenge to the hierarchical structures of traditional mass media, offering a way for ordinary people to make themselves seen and heard. MySpace—a kind of pre-packaged personal web site with audio and images—took the process a step further, widening the breach in the wall (in Pink Floyd's sense of the word) of mass cultural anonymity, while opening up new opportunities for commercial self-promotion.


The Instagram daredevils—and increased competitive stakes—are a consequence of what happens when democratic opportunity collides with a mass scramble for individual distinction. With so many people publicizing themselves on social media, it becomes harder and harder to get anyone to notice. This is especially problematic for those who exploit the Internet as a source of personal income, seeking to attract advertising dollars by attracting large numbers of views. So much money is at stake now that a sort of arms race of ever-more-daring stunts has ensued, effectively creating a new Internet hierarchy of online Evel Knievels contending with each other to make the cut.


The semiotic upshot of all this is that social media are not merely addictive, they are expressions (and extensions) of a hypercapitalistic society so taken up with monetizing every corner of human existence that personal experience itself is now for sale—in fact, one might say that personal experience is being sought for the sake of a sale.


Behind the scenes of this dramatic interplay between risk-entrepreneurs and their followers is the advertising that pays for it all. James Twitchell has called America an "advertising culture" (or "adcult"), and the Instagram economy can be said to signify an adcult in overdrive, careening through a consumer marketplace so splintered into niches and sub-niches that those with goods and services to sell are ever on the lookout for new ways of reaching anyone who is likely to buy their stuff. So if you can survive your latest, rather literal, peek into the abyss and get it up onto the Net, you may be able—thanks to all those advertisers who want to reach the kind of people who want to see you do it— to shudder all the way to the bank.



Image Credit: Pixabay Image 2034239 by Alexas_Fotos, used under a CC0 Creative Commons License

Jack Solomon

From Forums to Facebook

Posted by Jack Solomon Expert Nov 1, 2018

For a number of years I was an active participant, and moderator, on a hobby forum. I was well aware at the time that the experience of forum participation was quite unlike anything else I had ever encountered, and, from time to time, I posted analyses of that experience onto the forum itself. I no longer participate on that forum (it got taken over by climate change deniers and the like—this will be relevant to my following analysis), but I do visit the site to see what is happening there. It's a kind of time travelling experience, thanks to the existence of searchable archives, in which I can see something of myself frozen in the amber of digital memory. But I see something else: namely, some striking signs of what has been happening in this country over the last ten years, not the least of which is the role of Facebook as both symptom and cause of those changes.


I'll start with something I posted to the site at the time when I was first coming to appreciate the effect it was having on me. Here's what I said, way back in 2005:


What an online forum provides is a historically unprecedented combination of carnival and holiday. That is, the ancient tradition of the carnival enabled Europeans to drop their everyday social hierarchies and limitations, to don masks, and enjoy a freedom that ordinary life doesn't offer. Here on Site X, what we are in everyday life doesn't matter at all. The hierarchy here, and there is a hierarchy of sorts, comes from technical expertise, or experience, or sheer congeniality and cleverness. I'm very glad to note that it does not come from equipment. Many of the high-rung folks on Site X do not own the best equipment. You can't buy your way to the top here—which is a lot different from everyday life.


More important is the masked nature of an online forum. Because we can conceal as much of ourselves as we like, the stakes of ordinary social interaction are lowered. We aren't risking anything, as we would in any ordinary social interaction. This enables us to relax, to be playful, even a bit childish. We can also be more authentically ourselves, which is very refreshing. On top of this is the fact that while we can interact at literally any time of the day, the virtual rather than spatial nature of that interaction eliminates most risks. In spatial interaction we have to worry about having to see again a person with whom we may have made a goof. Here, we don't risk that. Again, this enables us to relax enormously.


This relates to the holiday-like characteristic of Site X. Just as on a travel-related vacation (a cruise, say), one finds oneself making extraordinarily close friendships that rarely last beyond the voyage because one knows that when the cruise is over everything can be erased, on Site X we are on a kind of permanent holiday. We can let our hair down safely, knowing that if too much is revealed or dared, we can always jump ship, so to speak. Since most of us feel this way it makes for an extraordinarily relaxed atmosphere, with most of our defenses down. Our usual defenses make ordinary social interaction fraught with tension; with them down, we are just more fun to be around.


As I read these words now I realize that there is a deep irony in them, the snake, so to speak, in the garden I thought I had found. This irony lies precisely in the anonymity and virtuality of online social interaction, whose benefits can cut two ways. For without the controlling factors inherent in face-to-face and fully-identified human interactions, the Internet is also a place of unbridled hostility and vituperation where people say things they would not dare say to someone else in person.


And there doesn't appear to be anywhere to hide when it comes to those online forums that still thrive, as Amy Olberding's essay in The Chronicle of Higher Education laments. Her title says it all: "When Argument Becomes Bloodsport: Philosophy is mired in wanton pugilism."


Which is where Facebook comes in. Facebook first appeared at a time when all too many online forums were becoming cockpits for the ever-increasing political and cultural divisions that are now so visibly tearing at our society. Forbidding anonymity and offering opportunities for online interaction in which everyone could be a site administrator with the power to exclude unwanted voices, Facebook was quickly embraced as a means of escaping the flame wars and troll fests the Net had become. Indeed, on Site X most of my friends ended up retreating to Facebook—where I did not follow, due to personal concerns for my privacy (a topic ripe for its own analysis).


So in one sense, Facebook is a symptom, not a cause, of American divisiveness. It offered a way out. But in another, it is a cause: as more and more people have retreated into their own online silos, wherein they can interact only with people with whom they already agree and be supplied with newsfeeds that deliver only the news and opinions they want to hear, in the way they want to hear them, the divisions between what is emerging as the Two Americas are only growing. This is not mere correlation, for when a divided people experience different realities via their social network connections, they come to live in different realities, making it impossible to understand where the other "side," if you will, is coming from. And this is clearly making things worse.


So we have another spoiled paradise on our hands. And I really don't know where we go from here.



Image Credit: Pixabay Image 390860 by PDPics, used under a CC0 Creative Commons License

When Neil Young wrote his edgy tribute to rock-and-roll "My My, Hey Hey (Out of the Blue)," the genre was hardly dead, nor really approaching it. A new generation of rockers—the punks—were trying to clear a space for themselves by claiming that rock was dead (Harold Bloom style, one might say), but in fact they were only revising it with a slightly different vibe. Johnny Rotten, whether he liked it or not, was a descendant of Johnny B. Goode, and Young himself would go on to become an inspiration to the Grunge scene, which, for a rather brief shining moment, revitalized rock-and-roll and helped put an end to the mousse-inflected hair-band era.


But when, in the tumultuous wake of the Kavanaugh confirmation hearings, I read that Taylor Swift was stepping up to help lead the resistance, I could see that here was a sign that things, finally, had changed, and that the moon was in a new phase indeed. Not that a popular music star leading a political charge for her generation is anything new: heck, that was what the '60s were all about. But Taylor Swift is no rocker, and it is not rock stars who are taking the generational lead these days.


The reasons for this are not hard to find, but they are worth a cultural-semiotic exploration. We can begin with the obvious observation that rock-and-roll is no longer the most popular genre of youth music: rap/hip-hop is, along with rhythm-and-blues and the sort of highly choreographed pop that Madonna pioneered, Britney Spears mainstreamed, and that various divas from Taylor Swift to Lady Gaga to Katy Perry now rule (straddling both pop and rhythm-and-blues, Beyoncé belongs in a category of her own). But to start here rather puts the cart before the horse, because it doesn't explain why rock-and-roll plays second fiddle these days; it only shows that it does.


So where's, say, Neil Young, the composer of "Ohio" in the immediate aftermath of the Kent State massacre, in this hour of political need? Well, um, he's also the composer of "A Man Needs a Maid." So how about the Rolling Stones, those "street fighting men" of the '60s? I think that the titles "Brown Sugar" and "Under My Thumb" are enough to explain why no one is running to them for leadership right now. And Bob Dylan, the author of "Lay Lady Lay" and "Don't Think Twice, It's All Right" (about the bitterest putdown of a woman in pop history)? 'Nuff said.


I think the pattern here is quite clear: rock-and-roll is rather hopelessly entangled in a male-centered history that is most charitably describable as patriarchal. It isn't the fact that all the performers that I've mentioned are now firmly entrenched in old age that puts them on the political sidelines today (after all, they are all still active and highly profitable touring acts); it's the rock-and-roll legacy itself. Even today's young rockers (and they do exist), can't escape it.


Which brings up a related point. Rock-and-roll is not only coded as "male"; it is also coded as "white." Yes, Chuck Berry (and a lot of other black musicians) took a leading role in creating it in the '50s, but rock was taken away from them in that era of official segregation and literally color-coded as "rhythm and blues"—a split that even Jimi Hendrix and the Chambers Brothers could not quite fully repair. And when rap began its meteoric rise in the '80s, it was Heavy Metal (one of rock's most popular incarnations in that decade) that became the de facto voice of white audiences (it is interesting to note in this regard how Ted Nugent and Dave Mustaine—two high profile metalists—are also outspoken conservatives today).


Add it all up and it is clear how changes in American demography and gender relations have affected popular music, and, thus, have determined just which performers will be received as voices for their generation. The signs are all there, ready to be read as part of a much larger historical shift. "Rock is dead," sang The Who, who then quickly added, "Long Live Rock," from that land where the passing of one monarch still means the ascendance of another. That was a long time ago, and Roger Daltrey has more recently opined that rock really is dead now and that rap has taken its place. But rock isn't really "dead," of course; it's just been sidelined. And in the #MeToo era, rap—though still ascendant—isn't alone at the top of the charts (political as well as musical) either. Just ask Taylor Swift.




Image Source: “IMG_0614” by makaiyla willis on Flickr 2/4/17 via Creative Commons 2.0 License

With the appearance of Michael Moore's latest foray into the arena of American conflict and controversy, Fahrenheit 11/9, I find myself contemplating the significance of the documentary form itself in contemporary American culture. And as is always the case in the conduct of a semiotic cultural analysis, my aim is not to form a partisan opinion but, rather, to find a signification, something that may not be obvious at an initial glance but may well be hiding in plain sight. So here goes.


To begin with, we need to construct a historicized system in which today's popular documentaries can be situated, and I can think of no better place to begin than with Edward R. Murrow's legendary exposé of America's migrant labor morass, Harvest of Shame. First broadcast on CBS in 1960 immediately after Thanksgiving, Harvest of Shame joined such classic works of muck-raking journalism as the photographs of Dorothea Lange and Jacob Riis, and the writing of Upton Sinclair, in revealing to the American middle and upper classes what was really going on behind the scenes of the pleasant panoramas of the American dream.


Michael Moore's work fits into this tradition, but with some significant differences, differences that will be important to my interpretation to follow. These lie in the way that Moore openly presents himself as a participant not only in his muck-raking documentaries but in the political controversies that he courts as well. Very much an in-your-face documentarian, Moore presents a striking contrast to Ken Burns, who must be ranked as America's currently most popular (not to mention prolific) documentary filmmaker, in large part due to his propensity to smooth over the rough edges of American cultural conflict in his attempts to appeal to everyone (who else but Burns, for example, would have included footage of historian Shelby Foote describing Abraham Lincoln and Nathan Bedford Forrest together as the two "geniuses" that the Civil War produced?).


But Michael Moore's "shockumentary" style looks like something out of the Hallmark Channel compared to Sacha Baron Cohen's "mockumentaries." Having lain low for a few years (to lull his intended victims into a false sense of security?), Cohen is back with his Showtime series, Who Is America? A weird amalgamation of Candid Camera, reality television, and, well, "Weird Al" Yankovic, Who Is America? managed to snag a cross section of American political celebrity—from Sarah Palin and Roy Moore to Bernie Sanders and Barney Frank—in its take-no-prisoners approach to political satire in the guise of documentary-style programming (see Laura Bradley's "Sacha Baron Cohen’s Victims: All the People Who Fell for His New Prank Show" in Vanity Fair for a complete rundown of Cohen's hapless marks).


Now, aside from my rather unsemiotic curiosity about how such a list of prominent people—who must surely have personal staffs employed precisely to keep their employers insulated from such things—got so taken in by Cohen, I find a number of signifiers at work here. The first might be called "Poe's Law Comes to Comedy." Poe's Law is a label for the ambiguity that surrounds so much of the content on the Internet due to the general weirdness of what people say there. "Does he really mean that, or is he pulling my leg?" pretty much sums up the situation, and it helps explain how Cohen got such current and former politicians as Representative Joe Wilson of South Carolina and ex-Senator Trent Lott to endorse a fake PSA for a "Kinderguardians" program designed to put guns in the hands of little children—Lott, for example, is quoted as saying, "It’s something that we should think about America, about putting guns in the hands of law-abiding citizens . . . whether they be teachers, or whether they actually be talented children or highly trained pre-schoolers.” I will leave it to my readers to deduce just which organization Cohen was targeting here.


Beyond the Poe's Law signification, I find myself especially struck by the distinct trajectory here that runs from Murrow to Cohen, the stunning difference. The best way to put it is that Murrow found nothing funny in what he wanted to expose in Harvest of Shame, and had no intention of entertaining anyone. Moore, for his part, has been quite open about his opinion that even documentaries with serious purposes should be entertaining. But Cohen is basically all about entertainment. What he does is make people look stupid so that other people can laugh at them with extreme derision. The approach is not unlike that of Jersey Shore and My Super Sweet 16, video train wrecks whose purpose is to make their audiences feel superior to the people on the shows. Satire, with its ancient office of encouraging good behavior by ridiculing bad, thus becomes sheer snark.


And here the system opens out to a much larger system in America today, one in which all codes of civility (and "civility," remember, is rooted in the Latin "civitas": a society of shared citizenry) are falling before the imperatives of the profit motive. Snark sells: it's no accident that Who Is America? is a comedy series on Showtime. In such a system politics is repackaged as entertainment, and derision takes the place of anything like an authentic debate. And that just may well be the answer to the question of "Who Is America?" these days.

Jack Solomon

Just Analyze It

Posted by Jack Solomon Expert Sep 20, 2018

As American popular culture gets more and more entangled in the political divisions that are rending our country, it may appear to be increasingly difficult to teach cultural analysis without risking painful classroom conflict. Take the current controversy over Nike's Colin Kaepernick campaign: it simply begs for semiotic attention, but how can it be accomplished without having the whole thing blow up into yet another headline on Inside Higher Education, or any other national news outlet?


I wouldn't be writing this blog if I thought that the thing couldn't be done or if my best advice would be to steer clear of the whole matter and anything like it. No, if you have adopted a semiotics-based methodology for your class, you have to engage with the full range of popular culture. And if you stick to the fundamental semiotic axiom that, while a personal opinion can be built upon the foundations of a semiotic analysis, semiotics itself is not an expression of an opinion, the thing can be done.


So, to begin, let's start with the obvious significations of the Nike/Kaepernick campaign and the reaction to it. The first is the way that it joins an ever-growing list of signifiers revealing a widening political gap in America, especially when it comes to anything having to do with race. This one is so apparent that it doesn't require any further explanation, but it does merit recognition.


The second (also quite obvious) signification is that symbols matter. Whether the symbol involved is the American flag or "Silent Sam," deep emotional attitudes towards objects can be just as passionate as attitudes towards people or policies. This too is so obvious that it doesn't require any further explanation, but does need to be mentioned.


The third is that the traditional (and constitutional) right to free speech in America is a shield protecting social protest, until it isn't. On the one hand, juridical rulings on free speech grant to individuals the right to say almost anything short of shouting "fire" in a crowded theater (remember the successful ACLU defense of the Nazi marchers in Skokie?), while, on the other, the courts have allowed employer retaliation against employees who break the speech codes in their places of employment. Such a lack of clarity is a contributing factor in the Nike controversy.


But let's step away from the most obvious significations and get into some more subtle ones. The first I'd like to consider is one that I have seen very ably explored in a Washington Post opinion piece by Michael Serazio, who argues that the Nike campaign isn't a gesture on behalf of social justice; it's simply another expression of the hypercapitalistic nature of America's consumer culture. Here's how Serazio puts it: "At one point in human history, products were bought and sold for their utility. Now, because of the massive and unchecked expansion of corporate power—in terms of not just market share but mind share—products must represent values, lifestyles and, in the age of President Trump, political ideologies." In short, the Nike campaign can be seen as a signifier of the hegemony of consumption in a consumer society.


But Serazio is hardly the only cultural analyst trying to parse the Nike affair. Consider the following two articles, also from the Washington Post. First, there's Megan McArdle's "Nike bet that politics would sell. Looks like it was wrong," an op-ed that cites public opinion polls from all sides of the controversy to conclude that Americans are not responding favorably to the Nike/Kaepernick campaign, while arguing that this is a good thing because "as America has divided into distinct camps—geographic, demographic, political—more companies have started chasing explicitly political identities. Starbucks's leftward lean has famously roused conservative ire, but many on the left still haven't forgiven Chick-fil-A owner Dan Cathy's remarks opposing same-sex marriage a few years ago. The result is a world in which every decision, even what kind of fast food to buy, has taken on a political aspect. That's not healthy for America, which needs more points that people have in common, not more ways to divide into separate teams."


But then there's Amy Kittelstrom's counter-argument, which comes to a very different conclusion. Noting that by "[b]urning shoes and snipping swooshes, some white Americans think they are punishing Nike for re-signing Colin Kaepernick, the unemployed quarterback known for quietly kneeling during the national anthem to draw attention to anti-black police brutality. In reality, Nike will profit. The more these angry consumers attack the company, the more attractive they make Nike in the far bigger global market—which is a vital part of why Nike launched the campaign that centers on Kaepernick."


Now, the interesting thing about these articles is that each, in effect, jumps the gun on the future by asserting long-term outcomes that are by no means as certain as their authors argue they are. You may say that Trump started it with his exultant tweet about Nike's stock price decline at the opening of the campaign (Nike stock has, as I write this, fully made up the drop), but, whoever engages in such predictions, making them at all always runs the risk of speaking too soon, of letting one's desires (i.e., the way one wants things to turn out) supersede the available facts.


I'm reminded here of an editorial in the Richmond Examiner from July 7th, 1863 that predicted inevitable victory for the Army of Northern Virginia in its invasion of the North—published three days after Lee's defeat at Gettysburg but two days before news of that defeat reached Richmond. But then again, in 1861 there was a lot of "On to Richmond" confidence in the Union press as well. In the end, as Lincoln sublimely noted in his second inaugural address, neither side got what it expected out of the war, which grimly contradicted that American tendency (which rises to the level of a cultural mythology) to expect that everything will always go the way we want it to—a fundamental cultural optimism that Barbara Ehrenreich calls "bright-sidedness" (you can find her exploration of this peculiarly American tendency in chapter 7 of the 9th edition of Signs of Life in the U.S.A.).


And so, in McArdle's and Kittelstrom's dueling certainties about an uncertain future I find a signifier of something that is profoundly American; but unfortunately, when a divided people are equally certain that everything will go their way, everyone, in the end, loses.



Image Credit: Pixabay Image 1840619 by Pexels, used under a CC0 Creative Commons License

Yet another tale of professorial indiscretion on social media making the rounds prompts me to reiterate what I regard as one of the cardinal benefits of the semiotic approach: viz., that it can lead one beyond the obvious surfaces of cultural phenomena to their more nuanced (and often subtly concealed) significations. And this matters in these days of take-no-prisoners political controversy, as America divides further and further into two hostile camps that can no longer even communicate with each other without invective.


The indiscretion I am referring to involves a Rutgers University history professor's Facebook screed about gentrification in Harlem, which has been widely reported in the mass media, as well as on the online news source Inside Higher Education. As IHE reports, Professor James Livingston is in hot water over a post he put up a few months ago. Here's IHE's quotation of the controversial post (warning: salty language ahead):


OK, officially, I now hate white people. I am a white people, for God’s sake, but can we keep them -- us -- us out of my neighborhood? I just went to Harlem Shake on 124 and Lenox for a Classic burger to go, that would be my dinner, and the place is overrun by little Caucasian assholes who know their parents will approve of anything they do. Slide around the floor, you little shithead, sing loudly, you unlikely moron. Do what you want, nobody here is gonna restrict your right to be white. I hereby resign from my race. Fuck these people. Yeah, I know, it’s about my access to dinner. Fuck you, too.


After Facebook deleted the post, Livingston returned with the following (again from IHE):


I just don't want little Caucasians overrunning my life, as they did last night. Please God, remand them to the suburbs, where they and their parents can colonize every restaurant, all while pretending that the idiotic indulgence of their privilege signifies cosmopolitan -- you know, as in sophisticated "European" -- commitments.


OK, to start with, I do not intend to get involved in any way with the obvious (right there on the surface) political elements in this saga of a white professor's denunciation of the white patrons (and their children) at a Harlem eatery. I also do not want to argue the free speech implications of the matter. Everyone else is doing that already. Rather (and I hope my readers at Bedford Bits will appreciate my focus), I want to look at an important rhetorical element in the story that not only is being disregarded but is being misconstrued as well. Call what follows an exercise in "rhetorical semiotics," if you will.


To begin with, the reactions to Livingston's posts have parsed exactly as you would expect them to: conservative media (and individuals) have (to put it quite mildly) denounced Professor Livingston, accusing him of racism, while more liberal voices tend to emphasize that what he wrote is protected free speech. Well and good: we can expect such disagreements. But what really caught my attention is the claim, both from the reporter of the story and from a number of the comments that follow, that Livingston was clearly being satirical. First, the IHE reporter: "Right-wing media and Rutgers University didn't find Livingston's satire very funny." A number of the comments to the story took it for granted that the posts were satirical too. For example: "Weird reaction to Livingston’s FB posts by almost everyone, including Livingston himself. . . . The charge of racism requires taking literally what is clearly satire."


But is it really "clearly satire"? Consider another comment: "The problem is that so many people in academia are so disconnected from reality that it's not actually clearly satire. Poe's law definitely applies here." Now, Poe's Law is the dictum that discourse on the Internet is so extreme that you can never know for certain whether someone is being ironic or not. And indeed, as another comment observes: "If it's satire then it's really badly done. I don't believe it's actually satire."


Frankly, I think that everyone is chasing the wrong trope. Livingston's second Facebook post, cited above, makes it pretty clear that he means it about his aggravation over urban gentrification. So what I think is involved in the initial post is really hyperbole—that is, the deliberate overstatement of one's case in order to more effectively make a point. Except that in this case that hyperbolic wink was lost on a lot of people, thus further widening the gap between an already miserably polarized society.


Thus my point is that words matter, that they have semiotic as well as semantic significance. If, in the currently highly inflamed environment (the system in which we can situate Professor Livingston's remarks), one wishes to make a political point, one isn't going to make it effectively by using easily misconstrued—not to mention hyperbolic and inflammatory—language (heck, it isn't even immediately clear from the posts that Livingston is mostly complaining about the behavior of little children). If you want your point of view to be politically effective—and, perhaps even more importantly, not backfire—trollish language isn't going to cut it, especially when the keys to the kingdom (i.e., electoral power in America) are ultimately in the hands of that roughly one third of the electorate that identifies as politically "independent," and which is neither clearly on the right nor on the left. If you want them on your side, you can't assume that the language that works inside your socially mediated echo chamber is going to work outside it. So while I fear that it is no longer possible for either "side" in today's great divide to reach the other, it behooves anyone who wants to win over any part of that uncommitted "center" (if we can call it that) to keep in mind that, thanks to the Internet, the whole world is always watching, and weighing, what you say.



Photo Credit: “Gentrification Zone” by Matt Brown on Flickr 8/25/17 via Creative Commons 2.0 license.

This post was originally published on December 20, 2012.


One of my students in a popular cultural semiotics seminar recently wrote her term project on the reality television “Real Housewives of . . .” phenomenon. Not being a fan of such shows myself, I needed her paper to prompt me to think seriously about the whole thing. And I realized that such shows instantiate a far more profound cultural signifier than I had heretofore realized. The following analysis represents my thinking on the matter, not my student’s.


As is always the case, my semiotic analysis centers on a crucial difference. The difference in question here is not simply that between the actual realities of the lives of ordinary housewives as opposed to the reality TV versions, but also the difference between their current television representations and those of the past. That is, not only do most actual housewives lack the wealth, glamour, and business opportunities of the “Real Housewives” of Beverly Hills, New Jersey, or wherever, but their television counterparts of the past did, too. The classic TV housewife, enshrined within the history of the family sitcom, was an asexual middle-class woman who was totally focused on her children: Think June Lockhart, Jane Wyatt, and Barbara Billingsley.


That the glammed-up, runway-model housewives of today’s “reality” shows reflect a widespread cultural return to the conservative gender-coded precept that a woman’s value lies in her erotic appeal almost goes without saying. While a few less-than-glamorous women are cast in these programs as if to head off criticisms of this kind, they are the exceptions that prove the rule—and even they tend to be dolled up on the program Web sites.

But this is an easy observation to make. More profound, however, is the fact that the reality TV housewife has become an object of desire for her largely female audience. Rather than being seen as a hapless drudge of patriarchy, the reality TV housewife is a vicarious role model, even when she doesn’t found her own business enterprise and simply stays at home. What caused this change in perception?


To answer this question, I considered the frequently reported economic fact that household incomes for the vast majority of Americans have been essentially stagnant, when adjusted for inflation, over the last four decades. Now, add to this the exponential inflation in the costs of such basic necessities as housing and transportation and you get the modern two-income family: not necessarily because both partners in a marriage want to work, but because in order to maintain a middle-class household two incomes are now more or less essential. Certainly the efforts of the women’s movement have contributed to the enormous growth of women’s participation in the workforce, but the new image of the reality TV housewife suggests that something else is at work here as well.


That is, with the housewife being presented as a fortunate woman who doesn’t have to work, it seems that American women are nostalgic for the “good old days” of a time when they didn’t have to work just to maintain a middle-class home. The fantasy now is to be a housewife, not to escape the role. That’s quite a change.


Just how great an effect this stagnation of incomes has had on American consciousness in general is probably one of the most important social questions of our time. Can it help explain the hostile polarization of our political landscape, our dwindling sympathy for others in an increasingly libertarian environment, the growing resentment toward middle-class workers (especially unionized workers) with decent jobs and benefits? I think so. And this will be a topic for future blog posts of mine.