
What with all the hoopla surrounding Gillette's notorious "toxic masculinity" commercial, I feel almost obliged to address it in this blog. The challenge here is to provide a semiotic angle on the ad's significance without getting tangled up in a debate on what it is trying to say about male behavior. Rather, the semiotic question concerns what the ad is telling us about contemporary American society as a whole, which has gotten me thinking more about razor blades than I have at any time since I stopped shaving in 1979.


A shrewd analysis of the ad in the Washington Post has given me a useful opening on the problem, and so I'll begin there. In "What Trump’s fast-food feast and Gillette’s woke razor blades have in common," Sonny Bunch draws an interesting parallel between Donald Trump's fast food spread for the Clemson Tigers and Gillette's ad by arguing that each was choreographed, in effect, to appeal to one side in the current national divide, while aggravating the other. As Bunch puts it, Trump "plays right to his populist strengths, assembling a mélange of foods that every American is familiar with and most Americans have eaten . . . [setting] a perfect trap for his critics, whose sneering at the feast will come off as elitist . . . [and thus playing] up the 'us against them' angle that has formed the heart of his appeal." Gillette, for its part, is playing to "what it hopes to claim as its base: the Ethical Consumer Signaling His Virtue, a valuable subset of customer, as Nike discovered with its Colin Kaepernick campaign." In short, Bunch concludes, "both the Fast Food Feast and Woke Gillette are explicitly designed to inspire mockery and, therefore, implicitly designed to encourage the us-vs.-them dichotomy that defines modern American life."


Now, whether or not Gillette harbored any intention to provoke, there is plenty of evidence that its ad certainly did so, as a quick Internet search on the topic reveals. Quite predictably, one can find conservative media outlets like Fox News railing against it, while Vox, for its part, comes to the ad's defense. The polarization is just as Bunch describes it.


All this raises a question, then, as to the actual effectiveness of politically provocative advertising in itself. The most common measure of such effectiveness, of course, is financial: that is, whether a provocative ad campaign increases sales and stock valuations for the company that creates it. For example, as I've noted in an earlier blog, the big question surrounding Nike's Colin Kaepernick campaign was what would happen to Nike's stock price. When the stock at first drooped, antagonistic pundits declared the campaign to be a failure. When Nike's stock recovered, the ad was declared a success. Similarly, Jack Neff at Ad Age observes that, since Gillette's object in its "toxic masculinity" ad is to attract millennials to its products, "the ultimate test of whether Gillette has turned millennials into believers will be sales."


Neff, of course, is right, just as anyone who argues that Nike's Kaepernick campaign is a success because the company's stock price is currently up is right. After all, increasing profits is what advertising is for. But does commercial success equate to cultural success?  Gillette's claim is that its ad is intended to start a "conversation" about male behavior—presumably to do something about it.  So, is the Gillette ad successful in that sense?


Here the measure of success is much more difficult to determine. Did Coca-Cola make the world a better place with its "I'd Like to Teach the World to Sing (in Perfect Harmony)" campaign? Have the Benetton Group's United Colors and Unhate campaigns achieved their (noncommercial) goals? Will Gillette really spark a conversation that makes men behave better?


One way to approach this problem is to consider Bunch's contention that the Gillette campaign (and others like it) antagonizes even as it appeals, reproducing the us-vs.-them dichotomy that afflicts the country today. If Bunch is right, Gillette is preaching to the choir, not converting the opposition, and that is hardly likely to improve the situation. Wouldn't a Rogerian approach be more effective?


Perhaps, but in the current cultural and political climate, a Rogerian ad campaign probably wouldn't get much attention, thus negating the financial and social goals of a socially conscious corporation. Controversy both sells and rallies the troops, and one can hardly blame Gillette for doing what everyone else is doing.


Then there's the problem of whether "consumer activism" is an oxymoron. The question here is not unlike that posed by the phenomenon known as "slacktivism"—a derogatory term for social media activism that ends at clicking "like" buttons, signing petitions, and retweeting political tweets (you can read more about this in Brian Dunning's "Slacktivism: Raising Awareness" in the 9th edition of Signs of Life in the USA). That is, purchasing a product because the company that sells it shares your values (or wants you to believe it does) is an act of consumption, not a direct action, even though buying a product for political reasons may feel like doing something meaningful. But is it?


What we have in the end is a powerful signifier of what it means to live in a consumer society. In such a society, consumption, as the measure of all things, is routinely confused with action, whereby wearing the T-shirt is regarded as a substantive political act. This sort of thing can be quite good for the corporate bottom line, but whether it is good for democracy is another question.



Image Credit: Pixabay Image 2202255 by WikimediaImages, used under a Pixabay License

One of the crucial elements in teaching, and performing, popular cultural semiotics is the identification of the larger contexts (or systems of associations and differences) in which particular popular signs may be situated. This means that one must be aware not only of current popular movies, TV shows, consumer trends, etc., in order to conduct semiotic analyses of them, but that one must also be ever attuned to what one might call, for lack of a better term, the "spirit of the age." In this vein, then, my first blog for the new year will constitute a semiotic analysis of the spirit of the digital era, beginning with what will likely appear to be a rather peculiar starting point: namely, the eighteenth-century European Enlightenment.


I start here thanks to a purely fortuitous decision last week to pull off my shelf an old book that I should have read forty years ago but didn't, until now: Garry Wills' Inventing America (1978). Now, I don't want to get involved here in the somewhat controversial thesis Wills proposed about the sources of Jefferson's thought and language when he first drafted the Declaration of Independence—that's something better left for specialists in the field. Rather, I am only interested in the extraordinary revelations of the ins and outs of Enlightenment thinking that Wills masterfully presents. In a word, Wills reveals that behind the Newtonian clockwork universe informing much of Enlightenment discourse was a veritable religion of the number. And I'm not just talking about the quantitative advances that led, towards the end of the eighteenth century, to the Industrial Revolution and the emergence of scientific modernity; I'm talking about the ecstatic belief that Newtonian methods could be applied to the explication of, and solution to, every human problem.


Let me offer (courtesy of Wills' ample citations) a particularly striking example. Here is Francis Hutcheson's algebraic formula for the measurement of human morality as presented in the second edition of his founding text for the Scottish Enlightenment, Inquiry into the Original of our Ideas of Beauty and Virtue (1726):


M = (B + S) × A = BA + SA; and therefore BA = M - SA = M - I, and B = (M - I)/A

[where B = Benevolence, A = Ability, S = Self Love, I = Interest, and M = Morality].


Actually, there's more to the formula than I've reproduced here, but you'll get the point. Here, from a Presbyterian divine, we find dramatic evidence of the extraordinary prestige of the Newtonian method: the belief that if Newton could use mathematics to measure and explain the universe, philosophers could do the same in measuring and guiding humanity.
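
Just to make the arithmetic concrete, here is a minimal sketch in Python of how the moral algebra runs (the code and the sample values are my own illustration, of course, not anything found in Hutcheson or Wills):

```python
# A minimal sketch of Hutcheson's moral algebra, using invented values on a
# 0-to-1 scale. Following Wills' gloss: B = Benevolence, S = Self Love,
# A = Ability, I = Interest (S * A), M = Morality.

def morality(B: float, S: float, A: float) -> float:
    """M = (B + S) * A."""
    return (B + S) * A

def benevolence(M: float, I: float, A: float) -> float:
    """B = (M - I) / A, since BA = M - SA = M - I."""
    return (M - I) / A

B, S, A = 0.8, 0.3, 0.5
M = morality(B, S, A)           # (0.8 + 0.3) * 0.5 = 0.55
I = S * A                       # Interest = Self Love * Ability = 0.15
print(M, benevolence(M, I, A))  # 0.55 0.8 -- the algebra recovers B
```

The point of the exercise is simply that the formula really does compute: benevolence falls out as a measurable remainder once "interest" is subtracted from morality, which is precisely the quantifying ambition at issue here.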


Sound familiar? Substitute the words "big data" for "mathematics" and you've got the current zeitgeist in a nutshell. For here too, from Steven Pinker to the purveyors of AI, digital humanists to data analysts, Educause to edX, and so on and so forth ad infinitum across our cultural spectrum, we can find what is effectively a religious faith in the omnipotence of numbers.


The Enlightenment accordingly offers a significant point of association to which we can relate our current esprit de l'époque. But (and I can never repeat this often enough) the significance of a phenomenon lies not only in what it can be associated with but also in its differences from similar phenomena within its system. And there is a difference between the origins of the enormous cultural prestige enjoyed by Enlightenment mathematics and those of twenty-first-century data worship. For while the Enlightenment was wowed by Newton's scientific achievements (achievements, it can be added, that long preceded any large-scale commercial applications), the wow factor today (as I have noted before in my blog on the "religion" promoted by the now-defunct corporation Theranos) derives from the unimaginably huge fortunes that have been made, and will continue to be made, by the corporate masters of big data. Google effectively started it all by finding a way to monetize its free services by tracking our online behavior and selling it to marketers, making personal data the holy grail of post-industrial capitalism (Facebook, of course, is the second biggest name in this tradition). The difference, in a word, is between science and commerce, with the Googleplex and its offspring occupying the cultural role once occupied by Newtonian physics. To put it another way, here is yet another signifier of our hypercapitalist culture.


Whether this hypercapitalist faith is a good thing or not is a value judgment, and since the goal of teaching cultural semiotics is to provide students with the critical equipment necessary to make informed judgments of their own, not to dictate those judgments to them, I will withhold my own here. But I will say this much: Hutcheson's equations, as well intentioned and nobly founded as they may have been, look pretty silly to us today. And I can't help but wonder how our current data-infatuated zeitgeist will look to future culture critics.



Image Credit: Pixabay Image 3088958 by xresch, used under a Pixabay License

Jack Solomon


Posted by Jack Solomon Expert Dec 20, 2018


One of the more interesting recent news items from the world of American popular culture has been the announcement that Netflix, rather than cancelling its streaming reruns of that Gen X TV blockbuster, Friends, on January 1, 2019 (as many viewers feared), has actually decided to up its payments for the rights to the series from $30 million to $100 million per year. The continuing popularity of this pop culture icon, decades after its original run, offers a particularly good topic for semiotic analysis, revealing how the same cultural sign can signal entirely different meanings when the context in which it appears changes.


When we look at those contexts, the striking thing about the early 1990s and the mid-twenty-teens is their similarity. For the early 1990s, too, was a time of reduced expectations in the wake of a searing recession. Though Millennials and iGens today may not be aware of it, Generation X too was identified as the first generation that expected to do more poorly in life than their parents. Theirs was the Grunge era, when youth culture, making the best of a bad situation, turned to a shabby-chic aesthetic, reviving the thrift-shop consumer ethos of the late 1960s and shrugging off the glitz and glam of the "go-for-the-gold" 1980s. The cast of Friends—in a thoroughly unrealistic evocation of the new spirit with their West Village digs—accordingly made personal relationships more important than material possessions, and thus became role models for a generation that felt left out of the American dream.


Sound familiar? After all, today's young, whether Millennials or iGens, are coming of age in the long shadow of the Great Recession, and so can find much in common with these six young adults, whose portrayers are now the age of iGen parents. So with both Gen X nostalgia and iGen relatability on its side, it's no surprise that Friends should be worth $100 million to Netflix, as the streaming service maneuvers to survive in an era of intense competition.


But a little more research into the enduring popularity of Friends reveals something of a surprise, a difference upon which we can hang a semiotic interpretation. For it appears, according to an article in the New York Times, that for iGen viewers the appeal of Friends lies not in the personal relationships but in the thoroughly laid-back lifestyles of the friends in question. This group of people prefers hanging out with each other at their favorite coffee house—and otherwise taking time out from their jobs—to the frantic pursuit of career success. It isn't that they don't have certain career aspirations, but they don't get all worked up about them. They'd rather fool around.


This reveals the dismal reality facing today’s youth – the worst of all possible worlds. At a time when the gateways to socio-economic prosperity and career satisfaction are either narrowing or slamming shut entirely (especially if technology isn't your thing), the cultural pressure is to achieve big-money career success – to be the next Elon Musk or Steve Jobs. The Grunge era said, in effect, "if the opportunities aren't there, wealth isn't where it's at anyway: learning to live with less in the way of material prosperity by turning to your friends and lovers is the way to go"; while the Google era says, "if you can make it to the top, join the club, your TED talk invite is in the mail; otherwise, tough." No wonder at least some young fans of Friends feel nostalgia for an era they never experienced.


I think that there may be an added dimension, another difference, that accounts for the enduring popularity of Friends in a new era. For in that dim and distant time before smart phones, when these six friends wanted to get together, they really got together, in person, not via text, Facebook, Instagram, or whatever. Today, the smart phone is the center of social attention, and a continuing stream of news reports cites an accompanying teen despondency over an inability to socialize with others in person. Facebook has swamped face-to-face.


Thus, it is highly likely that younger fans today are responding to something that has been taken away from them. So here is a case where popular culture, which so often reflects the need for each generation to step out of the shadow of the previous, presents the spectacle of youthful nostalgia for what is effectively the world of their parents. Once a sign of Gen X adaptation to tough times, Friends is now a signifier, paradoxically enough, of loss.


Photo Credit:  Pixabay Image 3774381 by mohamed_hassan, used under a CC0 Creative Commons License

As I've noted before, I once participated in an online forum where the participants quarreled a lot. One of the things they griped about was the way that some members "padded" their post count with lots of very brief entries intended to run up their score. Their goal—due to the fact that every forum member was ranked individually against the entire membership—was to make it to the top of the heap.


I thought the whole thing was rather silly at the time, but I did find myself on occasion being dragged into the competition. I recognized that the larger motive behind the forum's incentives to "reward" quantity over quality was to encourage site activity, and that the forum owners themselves were engaged in a post-count competition with similarly themed forums. What I didn't know at the time was that there is a name for the way that the site was designed: it's called "gamification."


Gamification is the process by which an activity that is not, in itself, a game is turned into one. "Players" are ranked according to their levels of participation. This website, for example, is gamified, with all of us ranked, badged, and labeled according to a rather bewildering number of criteria, some of which I still haven't wholly figured out. And, as Stephanie Miller's "The Power of Play: Gamification Can Change Marketing" reveals, a lot of marketing campaigns are being gamified as well, like Domino's Pizza Hero mobile app (you can find her article in the 9th edition of Signs of Life in the USA). Even educators are looking into gamification as a way of transforming American education.
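
The mechanic itself is simple enough to fit in a few lines. Here is a toy sketch (mine alone; the point values, badge names, and members are invented, and no actual site works exactly this way) of the basic gamification pattern described above:

```python
# A toy gamification layer: non-game actions earn points, point totals mint
# badges, and a leaderboard ranks every member against the whole membership.
POINTS = {"post": 5, "like_received": 2, "login": 1}
BADGES = [(500, "Expert"), (100, "Contributor"), (10, "Newcomer")]

def score(actions):
    """Total points for a member's tallied activities."""
    return sum(POINTS[action] * count for action, count in actions.items())

def badge(points):
    """Highest badge whose threshold the point total clears."""
    return next((name for cutoff, name in BADGES if points >= cutoff), "Lurker")

members = {
    "alice": {"post": 120, "like_received": 40, "login": 200},
    "bob": {"post": 3, "like_received": 1, "login": 12},
}
for name in sorted(members, key=lambda m: score(members[m]), reverse=True):
    points = score(members[name])
    print(name, points, badge(points))  # alice 880 Expert / bob 29 Newcomer
```

Notice that a point table like this rewards sheer volume: the post-count "padding" I described above is not an abuse of such a system but its intended product.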


"Well so what?" you may be thinking. "What's the harm in making things fun?" The problem (and there is a problem) only appears when rampant gamification is subjected to a semiotic analysis. For when it is considered in the context of the larger system of contemporary American culture, we can see how gamification is a reflection of an overall hypercapitalistic tendency to turn everything into a winner-takes-all competition, with all of the "losers" that that entails.


Gamification looks even more sinister in the light of Sarah Mason's exposé of the way that it is being employed to incentivize worker productivity without a corresponding increase in actual income, "High score, low pay: why the gig economy loves gamification." Going beyond her own personal experience as a Lyft driver subject to the sirens of the game, Mason reveals a form of worker exploitation that is intentionally grounded in the psychology of gambling addiction. Here's how she puts it:


In addition to offering meaningless badges and meagre savings at the pump, ride-hailing companies have also adopted some of the same design elements used by gambling firms to promote addictive behaviour among slot-machine users. One of the things the anthropologist and NYU media studies professor Natasha Dow Schüll found during a decade-long study of machine gamblers in Las Vegas is that casinos use networked slot machines that allow them to surveil, track and analyse the behaviour of individual gamblers in real time – just as ride-hailing apps do. This means that casinos can “triangulate any given gambler’s player data with her demographic data, piecing together a profile that can be used to customise game offerings and marketing appeals specifically for her”. Like these customised game offerings, Lyft tells me that my weekly ride challenge has been “personalised just for you!”


Former Google “design ethicist” Tristan Harris has also described how the “pull-to-refresh” mechanism used in most social media feeds mimics the clever architecture of a slot machine: users never know when they are going to experience gratification – a dozen new likes or retweets – but they know that gratification will eventually come. This unpredictability is addictive: behavioural psychologists have long understood that gambling uses variable reinforcement schedules – unpredictable intervals of uncertainty, anticipation and feedback – to condition players into playing just one more round.
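
The conditioning mechanism that Schüll and Harris describe is easy to watch in simulation. Here is a minimal sketch of a variable-ratio schedule (my own illustration, not code from any actual ride-hailing or social media app), in which each "pull" pays off with a fixed probability but the gap between payoffs is unpredictable:

```python
# Simulate pull-to-refresh as a variable-ratio reinforcement schedule: the
# long-run payoff rate is steady, but no single pull's outcome is knowable.
import random

random.seed(42)
REWARD_PROBABILITY = 0.3  # assumed: roughly one pull in three gratifies

gaps = []          # pulls between consecutive payoffs
pulls_since = 0
for _ in range(10_000):
    pulls_since += 1
    if random.random() < REWARD_PROBABILITY:
        gaps.append(pulls_since)
        pulls_since = 0

print(sum(gaps) / len(gaps))  # mean gap: about 3.3 pulls
print(min(gaps), max(gaps))   # but individual gaps swing from 1 to 20 or so
```

It is the variance, not the average rate, that does the conditioning: because the next payoff could always be just one pull away, there is never a rational moment to stop.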


In short, what is happening here goes well beyond mere fun. Gamification is at once a form of behavior modification and an extension of the surveillance society in which we live, where everything we do is tracked and data mined on behalf of corporate profits that are not shared with the vast majority of the population. With artificial intelligence—which is grounded in mass data collection and algorithmic analysis—emerging as the newest breathlessly hyped game on the block, we can see that this hypercapitalistic cultural tendency is only going to continue its expansive intrusions into our lives. And that's not just fun and games.



Image Credit: Pixabay Image 1293132 by OpenClipart-Vectors, used under a CC0 Creative Commons License


You've heard about it before: someone perches on the edge of a rooftop, or a waterfall, or a granite outcropping, to take a vertiginous photo of the drop-off, hundreds—perhaps thousands—of feet below. Or reclines on a railway line to take a quick selfie as a locomotive looms in the background. Or does one thing or another that is exceptionally dangerous in order to get an eye-popping image that might capture a crowd on Instagram . . . and, sometimes, perishes in the act, as recently happened to a husband-and-wife team of travel bloggers in Yosemite National Park.


As I say, there's nothing new about this, and there are plenty of articles scattered all over the Internet detailing the phenomenon, often containing academic commentary on the meaning of it all, as does this article in Vice from 2017. So, given the familiarity of what might be called "Fatal Selfie Syndrome," and, more importantly, the fact that your students are likely to be part of the audience to which such photos are directed, this is a popular cultural topic that calls for semiotic analysis.


Let's start with the basics. The fundamental goal behind dangerous Instagram photos (or YouTube videos, etc.) is to get attention. While the most daring of the bunch also tend to be thrill seekers, thrill seeking is not the primary motivation for the simple reason that the chosen poses are designed for publicity, not for a privately enjoyed experience. But this elementary explanation then raises the question of what all this attention getting signifies.


Here we can go back to the early days of the Net. The advent of the personal web log and/or web page in the 1990s signified the emergence of a democratizing challenge to the hierarchical structures of traditional mass media, offering a way for ordinary people to make themselves seen and heard. MySpace—a kind of pre-packaged personal web site with audio and images—took the process a step further, widening the breach in the wall (in Pink Floyd's sense of the word) of mass cultural anonymity, while opening up new opportunities for commercial self-promotion.


The Instagram daredevils – and increased competitive stakes – are a consequence of what happens when democratic opportunity collides with a mass scramble for individual distinction. With so many people publicizing themselves on social media, it becomes harder and harder to get anyone to notice. This is especially problematic for those who exploit the Internet as a source of personal income, seeking to attract advertising dollars by attracting large numbers of views. So much money is at stake now that a sort of arms race of ever-more-daring stunts has ensued, effectively creating a new Internet hierarchy of online Evel Knievels contending with each other to make the cut.


The semiotic upshot of all this is that social media are not merely addictive, they are expressions (and extensions) of a hypercapitalistic society so taken up with monetizing every corner of human existence that personal experience itself is now for sale—in fact, one might say that personal experience is being sought for the sake of a sale.


Behind the scenes of this dramatic interplay between risk-entrepreneurs and their followers is the advertising that pays for it all. James Twitchell has called America an "advertising culture" (or "adcult"), and the Instagram economy can be said to signify an adcult in overdrive, careening through a consumer marketplace so splintered into niches and sub-niches that those with goods and services to sell are ever on the lookout for new ways of reaching anyone who is likely to buy their stuff. So if you can survive your latest, rather literal, peek into the abyss and get it up onto the Net, you may be able—thanks to all those advertisers who want to reach the kind of people who want to see you do it—to shudder all the way to the bank.



Image Credit: Pixabay Image 2034239 by Alexas_Fotos, used under a CC0 Creative Commons License

Jack Solomon

From Forums to Facebook

Posted by Jack Solomon Expert Nov 1, 2018

For a number of years I was an active participant, and moderator, on a hobby forum. I was well aware at the time that the experience of forum participation was quite unlike anything else I had ever encountered, and, from time to time, I posted analyses of that experience onto the forum itself. I no longer participate on that forum (it got taken over by climate change deniers and the like—this will be relevant to my following analysis), but I do visit the site to see what is happening there. It's a kind of time-travelling experience, thanks to the existence of searchable archives, in which I can see something of myself frozen in the amber of digital memory. But I see something else: namely, some striking signs of what has been happening in this country over the last ten years, not the least of which is the role of Facebook as both symptom and cause of those changes.


I'll start with something I posted to the site at the time when I was first coming to appreciate the effect it was having on me. Here's what I said, way back in 2005:


What an online forum provides is a historically unprecedented combination of carnival and holiday. That is, the ancient tradition of the carnival enabled Europeans to drop their everyday social hierarchies and limitations, to don masks, and enjoy a freedom that ordinary life doesn't offer. Here on Site X, what we are in everyday life doesn't matter at all. The hierarchy here, and there is a hierarchy of sorts, comes from technical expertise, or experience, or sheer congeniality and cleverness. I'm very glad to note that it does not come from equipment. Many of the high-rung folks on Site X do not own the best equipment. You can't buy your way to the top here—which is a lot different from everyday life.


More important is the masked nature of an online forum. Because we can conceal as much of ourselves as we like, the stakes of ordinary social interaction are lowered. We aren't risking anything, as happens in any ordinary social interaction. This enables us to relax, to be playful, even a bit childish. We can also be more authentically ourselves, which is very refreshing. On top of this is the fact that while we can interact at literally any time of the day, the virtual rather than spatial nature of that interaction eliminates most risks. In spatial interaction we have to worry about having to see again a person with whom we may have made a goof. Here, we don't risk that. Again, this enables us to relax enormously.


This relates to the holiday-like characteristic of Site X. Just as on a travel-related vacation (a cruise, say), one finds oneself making extraordinarily close friendships that rarely last beyond the voyage because one knows that when the cruise is over everything can be erased, on Site X we are on a kind of permanent holiday. We can let our hair down safely, knowing that if too much is revealed or dared, we can always jump ship, so to speak. Since most of us feel this way it makes for an extraordinarily relaxed atmosphere, with most of our defenses down. Our usual defenses make ordinary social interaction fraught with tension; with them down, we are just more fun to be around.


As I read these words now I realize that there is a deep irony in them, the snake, so to speak, in the garden I thought I had found. This irony lies precisely in the anonymity and virtuality of online social interaction, whose benefits can cut two ways. For without the controlling factors inherent in face-to-face and fully identified human interactions, the Internet is also a place of unbridled hostility and vituperation where people say things they would not dare say to someone else in person.


And there doesn't appear to be anywhere to hide when it comes to those online forums that still thrive, as Amy Olberding's essay in The Chronicle of Higher Education laments. Her title says it all: "When Argument Becomes Bloodsport: Philosophy is mired in wanton pugilism."


Which is where Facebook comes in. Facebook first appeared at a time when all too many online forums were becoming cockpits for the ever-increasing political and cultural divisions that are now so visibly tearing at our society. Forbidding anonymity and offering opportunities for online interaction in which everyone could be a site administrator with the power to exclude unwanted voices, Facebook was quickly embraced as a means of escaping the flame wars and troll fests the Net had become. Indeed, on Site X most of my friends ended up retreating to Facebook—where I did not follow, due to personal concerns for my privacy (a topic ripe for its own analysis).


So in one sense, Facebook is a symptom, not a cause, of American divisiveness. It offered a way out. But in another, it is a cause: as more and more people have retreated into their own online silos, wherein they can interact only with those people with whom they already agree and be supplied with newsfeeds that deliver only the news and the opinions they want to hear, in the way they want to hear them, the divisions between what is emerging as the Two Americas are only growing. This is not mere correlation, for when a divided people experience the world differently via their social network connections, they increasingly live in different realities, making it impossible to understand where the other "side," if you will, is coming from. And this is clearly making things worse.


So we have another spoiled paradise on our hands. And I really don't know where we go from here.



Image Credit: Pixabay Image 390860 by PDPics, used under a CC0 Creative Commons License

When Neil Young wrote his edgy tribute to rock-and-roll "My My, Hey Hey (Out of the Blue)," the genre was hardly dead, nor really approaching it. A new generation of rockers—the punks—were trying to clear a space for themselves by claiming that rock was dead (Harold Bloom style, one might say), but in fact they were only revising it with a slightly different vibe. Johnny Rotten, whether he liked it or not, was a descendant of Johnny B. Goode, and Young himself would go on to become an inspiration to the Grunge scene, which, for a rather brief shining moment, revitalized rock-and-roll and helped put an end to the mousse-inflected hair-band era.


But when, in the tumultuous wake of the Kavanaugh confirmation hearings, I read that Taylor Swift was stepping up to help lead the resistance, I could see that here was a sign that things, finally, had changed, and that the moon was in a new phase indeed. Not that a popular music star leading a political charge for her generation is anything new: heck, that was what the '60s were all about. But Taylor Swift is no rocker, and it is not rock stars who are taking the generational lead these days.


The reasons for this are not hard to find, but they are worth a cultural-semiotic exploration. We can begin with the obvious observation that rock-and-roll is no longer the most popular genre of youth music: rap/hip-hop is, along with rhythm-and-blues and the sort of highly choreographed pop that Madonna pioneered, Britney Spears mainstreamed, and that various divas from Taylor Swift to Lady Gaga to Katy Perry now rule (straddling both pop and rhythm-and-blues, Beyoncé belongs in a category of her own). But to start here rather puts the cart before the horse, because it doesn't explain why rock-and-roll plays second fiddle these days; it only shows that it does.


So where's, say, Neil Young, the composer of "Ohio" in the immediate aftermath of the Kent State massacre, in this hour of political need? Well, um, he's also the composer of "A Man Needs a Maid." So how about the Rolling Stones, those "street fighting men" of the '60s? I think that the titles "Brown Sugar" and "Under My Thumb" are enough to explain why no one is running to them for leadership right now. And Bob Dylan, the author of "Lay Lady Lay" and "Don't Think Twice, It's All Right" (about the bitterest putdown of a woman in pop history)? 'Nuff said.


I think the pattern here is quite clear: rock-and-roll is rather hopelessly entangled in a male-centered history that is most charitably describable as patriarchal. It isn't the fact that all the performers that I've mentioned are now firmly entrenched in old age that puts them on the political sidelines today (after all, they are all still active and highly profitable touring acts); it's the rock-and-roll legacy itself. Even today's young rockers (and they do exist) can't escape it.


Which brings up a related point. Rock-and-roll is not only coded as "male"; it is also coded as "white." Yes, Chuck Berry (and a lot of other black musicians) took a leading role in creating it in the '50s, but rock was taken away from them in that era of official segregation and literally color-coded as "rhythm and blues"—a split that even Jimi Hendrix and the Chambers Brothers could not quite fully repair. And when rap began its meteoric rise in the '80s, it was Heavy Metal (one of rock's most popular incarnations in that decade) that became the de facto voice of white audiences (it is interesting to note in this regard how Ted Nugent and Dave Mustaine—two high-profile metalists—are also outspoken conservatives today).


Add it all up and it is clear how changes in American demography and gender relations have affected popular music, and, thus, have determined just which performers will be received as voices for their generation. The signs are all there, ready to be read as part of a much larger historical shift. "Rock is dead," sang The Who, who then quickly added, "Long Live Rock," from that land where the passing of one monarch still means the ascendance of another. That was a long time ago, and Roger Daltrey has more recently opined that rock really is dead now and that rap has taken its place. But rock isn't really "dead," of course; it's just been sidelined. And in the #MeToo era, rap—though still ascendant—isn't alone at the top of the charts (political as well as musical) either. Just ask Taylor Swift.




Image Source: “IMG_0614” by makaiyla willis on Flickr 2/4/17 via Creative Commons 2.0 License

With the appearance of Michael Moore's latest foray into the arena of American conflict and controversy, Fahrenheit 11/9, I find myself contemplating the significance of the documentary form itself in contemporary American culture. And as is always the case in the conduct of a semiotic cultural analysis, my aim is not to form a partisan opinion but, rather, to find a signification, something that may not be obvious at an initial glance but may well be hiding in plain sight. So here goes.


To begin with, we need to construct a historicized system in which today's popular documentaries can be situated, and I can think of no better place to begin than with Edward R. Murrow's legendary exposé of America's migrant labor morass, Harvest of Shame. First broadcast on CBS in 1960 immediately after Thanksgiving, Harvest of Shame joined such classic works of muck-raking journalism as the photographs of Dorothea Lange and Jacob Riis, and the writing of Upton Sinclair, in revealing to the American middle and upper classes what was really going on behind the scenes of the pleasant panoramas of the American dream.


Michael Moore's work fits into this tradition, but with some significant differences, differences that will be important to my interpretation to follow. These lie in the way that Moore openly presents himself as a participant not only in his muck-raking documentaries but in the political controversies that he courts as well. Very much an in-your-face documentarian, Moore presents a striking contrast to Ken Burns, who must be ranked as America's currently most popular (not to mention prolific) documentary filmmaker, in large part due to his propensity to smooth over the rough edges of American cultural conflict in his attempts to appeal to everyone (who else but Burns, for example, would have included footage of historian Shelby Foote describing Abraham Lincoln and Nathan Bedford Forrest together as the two "geniuses" that the Civil War produced?).


But Michael Moore's "shockumentary" style looks like something out of the Hallmark Channel compared to Sacha Baron Cohen's "mockumentaries." Having lain low for a few years (to lull his intended victims into a false sense of security?), Cohen is back with his Showtime series, Who Is America? A weird amalgamation of Candid Camera, reality television, and, well, "Weird Al" Yankovic, Who Is America? managed to snag a cross section of American political celebrity—from Sarah Palin and Roy Moore to Bernie Sanders and Barney Frank—in his take-no-prisoners approach to political satire in the guise of documentary-style programming (see Laura Bradley's "Sacha Baron Cohen’s Victims: All the People Who Fell for His New Prank Show" in Vanity Fair for a complete rundown of Cohen's hapless marks).


Now, aside from my rather unsemiotic curiosity about how such a list of prominent people—who must surely have personal staffs employed precisely to keep their employers insulated from such things—got so taken in by Cohen, I find a number of signifiers at work here. The first might be called "Poe's Law Comes to Comedy." Poe's Law is a label for the ambiguity that surrounds so much of the content on the Internet due to the general weirdness of what people say there. "Does he really mean that, or is he pulling my leg?" pretty much sums up the situation, and it helps explain how Cohen got such current and former politicians as Representative Joe Wilson of South Carolina and ex-Senator Trent Lott to endorse a fake PSA for a "Kinderguardians" program designed to put guns in the hands of little children—Lott, for example, is quoted as saying, "It’s something that we should think about America, about putting guns in the hands of law-abiding citizens . . . whether they be teachers, or whether they actually be talented children or highly trained pre-schoolers.” I will leave it to my readers to deduce just which organization Cohen was targeting here.


Beyond the Poe's Law signification, I find myself especially struck by the distinct trajectory here that runs from Murrow to Cohen, the stunning difference. The best way to put it is that Murrow found nothing funny in what he wanted to expose in Harvest of Shame, and had no intention of entertaining anyone. Moore, for his part, has been quite open about his opinion that even documentaries with serious purposes should be entertaining. But Cohen is basically all about entertainment. What he does is make people look stupid for other people to laugh at with extreme derision. The approach is not unlike that of Jersey Shore and My Super Sweet 16, video train wrecks whose purpose is to make their audiences feel superior to the people on the shows. Satire, with its ancient office of encouraging good behavior by ridiculing bad, thus becomes sheer snark.


And here the system opens out to a much larger system in America today, one in which all codes of civility (and "civility," remember, is rooted in the Latin "civitas": a society of shared citizenry) are falling before the imperatives of the profit motive. Snark sells: it's no accident that Who Is America? is a comedy series on Showtime. In such a system politics is repackaged as entertainment, and derision takes the place of anything like an authentic debate. And that just may well be the answer to the question of "Who Is America?" these days.

Jack Solomon

Just Analyze It

Posted by Jack Solomon Expert Sep 20, 2018

As American popular culture gets more and more entangled in the political divisions that are rending our country, it may appear to be increasingly difficult to teach cultural analysis without risking painful classroom conflict. Take the current controversy over Nike's Colin Kaepernick campaign: it simply begs for semiotic attention, but how can it be accomplished without having the whole thing blow up into yet another headline on Inside Higher Education, or any other national news outlet?


I wouldn't be writing this blog if I thought that the thing couldn't be done or if my best advice would be to steer clear of the whole matter and anything like it. No, if you have adopted a semiotics-based methodology for your class, you have to engage with the full range of popular culture. And if you stick to the fundamental semiotic axiom that, while a personal opinion can be built upon the foundations of a semiotic analysis, semiotics itself is not an expression of an opinion, the thing can be done.


So, to begin, let's start with the obvious significations of the Nike/Kaepernick campaign and the reaction to it. The first is the way that it joins an ever-growing list of signifiers revealing a widening political gap in America, especially when it comes to anything having to do with race. This one is so apparent that it doesn't require any further explanation, but it does merit recognition.


The second (also quite obvious) signification is that symbols matter. Whether the symbol involved is the American flag or "Silent Sam," deep emotional attitudes towards objects can be just as passionate as attitudes towards people or policies. This too is so obvious that it doesn't require any further explanation, but does need to be mentioned.


The third is that the traditional (and constitutional) right to free speech in America is a shield protecting social protest, until it isn't. On the one hand, juridical rulings on free speech grant to individuals the right to say almost anything short of shouting "fire" in a crowded theater (remember the successful ACLU defense of the Nazi marchers in Skokie?), while, on the other, the courts have allowed employer retaliation against employees who break the speech codes in their places of employment. Such a lack of clarity is a contributing factor in the Nike controversy.


But let's step away from the most obvious significations and get into some more subtle ones. The first I'd like to consider is one that I have seen very ably explored in a Washington Post opinion piece by Michael Serazio, who argues that the Nike campaign isn't a gesture on behalf of social justice; it's simply another expression of the hypercapitalistic nature of America's consumer culture. Here's how Serazio puts it: "At one point in human history, products were bought and sold for their utility. Now, because of the massive and unchecked expansion of corporate power—in terms of not just market share but mind share—products must represent values, lifestyles and, in the age of President Trump, political ideologies." In short, the Nike campaign can be seen as a signifier of the hegemony of consumption in a consumer society.


But Serazio is hardly the only cultural analyst trying to parse the Nike affair. Consider the following two articles, also from the Washington Post. First, there's Megan McArdle's "Nike bet that politics would sell. Looks like it was wrong," an op-ed that cites public opinion polls from all sides of the controversy to conclude that Americans are not responding favorably to the Nike/Kaepernick campaign, while arguing that this is a good thing because "as America has divided into distinct camps—geographic, demographic, political—more companies have started chasing explicitly political identities. Starbucks's leftward lean has famously roused conservative ire, but many on the left still haven't forgiven Chick-fil-A owner Dan Cathy's remarks opposing same-sex marriage a few years ago. The result is a world in which every decision, even what kind of fast food to buy, has taken on a political aspect. That's not healthy for America, which needs more points that people have in common, not more ways to divide into separate teams."


But then there's Amy Kittelstrom's counter-argument, which comes to a very different conclusion: "[B]urning shoes and snipping swooshes, some white Americans think they are punishing Nike for re-signing Colin Kaepernick, the unemployed quarterback known for quietly kneeling during the national anthem to draw attention to anti-black police brutality. In reality, Nike will profit. The more these angry consumers attack the company, the more attractive they make Nike in the far bigger global market—which is a vital part of why Nike launched the campaign that centers on Kaepernick."


Now, the interesting thing about these articles is that each, in effect, jumps the gun on the future by asserting long-term outcomes that are by no means as certain as their authors argue they are. You may say that Trump started it with his exultant tweet about Nike's stock price decline at the opening of the campaign (Nike stock has, as I write this, fully made up the drop), but, whoever engages in such predictions, making them at all always runs the risk of speaking too soon, of letting one's desires (i.e., the way one wants things to turn out) supersede the available facts.


I'm reminded here of an editorial in the Richmond Examiner from July 7th, 1863 that predicted inevitable victory for the Army of Northern Virginia in its invasion of the North—published three days after Lee's defeat at Gettysburg but two days before news of that defeat reached Richmond. But then again, in 1861 there was a lot of "On to Richmond" confidence in the Union press as well. In the end, as Lincoln sublimely noted in his second inaugural address, neither side got what it expected out of the war, which grimly contradicted that American tendency (which rises to the level of a cultural mythology) to expect that everything will always go the way we want it to—a fundamental cultural optimism that Barbara Ehrenreich calls "bright-sidedness" (you can find her exploration of this peculiarly American tendency in chapter 7 of the 9th edition of Signs of Life in the U.S.A.).


And so, in McArdle's and Kittelstrom's dueling certainties about an uncertain future I find a signifier of something that is profoundly American; but unfortunately, when a divided people are equally certain that everything will go their way, everyone, in the end, loses.



Image Credit: Pixabay Image 1840619 by Pexels, used under a CC0 Creative Commons License

Yet another tale of professorial indiscretion on social media making the rounds prompts me to reiterate what I regard as one of the cardinal benefits of the semiotic approach: viz., that it can lead one beyond the obvious surfaces of cultural phenomena to their more nuanced (and often subtly concealed) significations. And this matters in these days of take-no-prisoners political controversy, as America divides further and further into two hostile camps that can no longer even communicate with each other without invective.


The indiscretion I am referring to involves a Rutgers University history professor's Facebook screed about gentrification in Harlem, which has been widely reported in the mass media, as well as on the online news source Inside Higher Education. As IHE reports, Professor James Livingston is in hot water over a post he put up a few months ago. Here's IHE's quotation of the controversial post (warning: salty language ahead):


OK, officially, I now hate white people. I am a white people, for God’s sake, but can we keep them -- us -- us out of my neighborhood? I just went to Harlem Shake on 124 and Lenox for a Classic burger to go, that would be my dinner, and the place is overrun by little Caucasian assholes who know their parents will approve of anything they do. Slide around the floor, you little shithead, sing loudly, you unlikely moron. Do what you want, nobody here is gonna restrict your right to be white. I hereby resign from my race. Fuck these people. Yeah, I know, it’s about my access to dinner. Fuck you, too.


After Facebook deleted the post, Livingston returned with the following (again from IHE):


I just don't want little Caucasians overrunning my life, as they did last night. Please God, remand them to the suburbs, where they and their parents can colonize every restaurant, all while pretending that the idiotic indulgence of their privilege signifies cosmopolitan -- you know, as in sophisticated "European" -- commitments.


OK, to start with, I do not intend to get involved in any way with the obvious (right there on the surface) political elements in this saga of a white professor's denunciation of the white patrons (and their children) at a Harlem eatery. I also do not want to argue the free speech implications of the matter. Everyone else is doing that already. Rather (and I hope my readers at Bedford Bits will appreciate my focus), I want to look at an important rhetorical element in the story that not only is being disregarded but is being misconstrued as well. Call what follows an exercise in "rhetorical semiotics," if you will.


To begin with, the reactions to Livingston's posts have broken down exactly as you would expect them to: conservative media (and individuals) have (to put it quite mildly) denounced Professor Livingston, accusing him of racism, while more liberal voices tend to emphasize that what he wrote is protected free speech. Well and good: we can expect such disagreements. But what really caught my attention is the claim, both from the reporter of the story and from a number of the comments that follow, that Livingston was clearly being satirical. First, the IHE reporter: "Right-wing media and Rutgers University didn't find Livingston's satire very funny." A number of the comments to the story took it for granted that the posts were satirical too. For example: "Weird reaction to Livingston’s FB posts by almost everyone, including Livingston himself. . . .The charge of racism requires taking literally what is clearly satire."


But is it really "clearly satire"? Consider another comment: "The problem is that so many people in academia are so disconnected from reality that it's not actually clearly satire. Poe's law definitely applies here." Now, Poe's Law is the dictum that things on the Internet are so weird that you can never know for certain whether someone is being ironic or not. And indeed, as another comment observes: "If it's satire then it's really badly done. I don't believe it's actually satire."


Frankly, I think that everyone is chasing the wrong trope. Livingston's second Facebook post, cited above, makes it pretty clear that he means it about his aggravation over urban gentrification. So what I think is involved in the initial post is really hyperbole—that is, the deliberate overstatement of one's case in order to more effectively make a point. Except that in this case that hyperbolic wink was lost on a lot of people, thus further widening the gap between an already miserably polarized society.


Thus my point is that words matter, that they have semiotic as well as semantic significance. If, in the currently highly inflamed environment (the system in which we can situate Professor Livingston's remarks), one wishes to make a political point, one isn't going to make it effectively by using easily misconstrued—not to mention hyperbolic and inflammatory—language (heck, it isn't even immediately clear from the posts that Livingston is mostly complaining about the behavior of little children). If you want your point of view to be politically effective—and, perhaps even more importantly, not backfire—trollish language isn't going to cut it, especially when the keys to the kingdom (i.e., electoral power in America) are ultimately in the hands of that roughly one third of the electorate that identifies as politically "independent," and which is neither clearly on the right nor on the left. If you want them on your side, you can't assume that the language that works inside your socially mediated echo chamber is going to work outside it. So while I fear that it is no longer possible for either "side" today in the great divide to reach the other, it behooves anyone who wants to win over any part of that uncommitted "center" (if we can call it that) to keep in mind that, thanks to the Internet, the whole world is always watching, and weighing, what you say.



Photo Credit: “Gentrification Zone” by Matt Brown on Flickr 8/25/17 via Creative Commons 2.0 license.

This post was originally published on December 20, 2012.


One of my students in a popular cultural semiotics seminar recently wrote her term project on the reality television “Real Housewives of . . .” phenomenon. Not a fan of such shows myself, I needed her paper to prompt me to think seriously about the whole thing. And I saw that such shows instantiate a far more profound cultural signifier than I had heretofore realized. The following analysis represents my thinking on the matter, not my student’s.


As is always the case, my semiotic analysis centers on a crucial difference. The difference in question here is not simply that between the actual realities of the lives of ordinary housewives as opposed to the reality TV versions, but also the difference between their current television representations and those of the past. That is, not only do most actual housewives lack the wealth, glamour, and business opportunities of the “Real Housewives” of Beverly Hills, New Jersey, or wherever, but their television counterparts of the past did, too. The classic TV housewife, enshrined within the history of the family sitcom, was an asexual middle-class woman who was totally focused on her children: Think June Lockhart, Jane Wyatt, and Barbara Billingsley.


That the current crowd of glammed-up, runway-model housewives of today’s “reality” shows reflects a widespread cultural return to the conservative gender-coded precept that a woman’s value lies in her erotic appeal almost goes without saying. While a few less-than-glamorous women are cast in these programs as if to head off criticisms of this kind, their existence tends to prove the rule—and even they tend to be dolled up on the program Web sites.

But this is an easy observation to make. More profound, however, is the fact that the reality TV housewife has become an object of desire for her largely female audience. Rather than being seen as a hapless drudge of patriarchy, the reality TV housewife is a vicarious role model, even when she doesn’t found her own business enterprise and simply stays at home. What caused this change in perception?


To answer this question, I considered the frequently reported economic fact that household incomes for the vast majority of Americans have been essentially stagnant, when adjusted for inflation, over the last four decades. Now, add to this the exponential inflation in the costs of such basic necessities as housing and transportation and you get the modern two-income family: not necessarily because both partners in a marriage want to work, but because in order to maintain a middle-class household two incomes are now more or less essential. Certainly the efforts of the women’s movement have contributed to the enormous growth of women’s participation in the workforce, but the new image of the reality TV housewife suggests that something else is at work here as well.


That is, with the housewife being presented as a fortunate woman who doesn’t have to work, it seems that American women are nostalgic for the “good old days” of a time when they didn’t have to work just to maintain a middle-class home. The fantasy now is to be a housewife, not to escape the role. That’s quite a change.


Just how much of an effect on American consciousness in general this stagnation of incomes has had is probably one of the most important social questions of our time. Can it help explain the hostile polarization of our political landscape, our dwindling sympathy for others in an increasingly libertarian environment, the growing resentment of middle-class workers (especially unionized workers) with decent jobs and benefits? I think so. And this will be a topic for future blogs of mine.

Jack Solomon

Building a Religion

Posted by Jack Solomon Expert Jun 7, 2018

As I head into the summer recess for my Bits blogs, I find myself contemplating the cultural significance of the rise and apparent fall of Theranos, the troubled biotech startup that was once heralded as a disruptive force that would revolutionize the blood testing industry, and, not so incidentally, produce a new generation of high-tech entrepreneurs to rank with Steve Jobs and Bill Gates. On the face of it, of course, this would not appear to be a topic for popular cultural analysis, but bear with me for a moment, for when it comes to the new technologies, everything relates in some way or another to the manifold currents of everyday life that popular culture expresses.


What has drawn my attention to Elizabeth Holmes and the Theranos saga is the publication of a book by the Wall Street Journal writer who first blew the whistle on the company in 2015: John Carreyrou's Bad Blood: Secrets and Lies in a Silicon Valley Startup. A brief synopsis of that book appeared in Wired just as it was being released, and it was a single sentence in that synopsis that really got me thinking. It appears in Carreyrou's narrative at the point when things at Theranos were beginning to unravel and various high-ranking employees were abandoning ship. In the wake of such resignations, Elizabeth Holmes allegedly summoned every remaining employee to an all-hands-on-deck meeting to demand loyalty from them. But she didn't call it loyalty: according to Carreyrou, "Holmes told the gathered employees that she was building a religion. If there were any among them who didn’t believe, they should leave."


Building a religion: Holmes was telling a truth that was deeper than she realized. For when we situate the story of Theranos in the larger system of post-industrial America, we can see that our entire culture has been building a religion around what Fredric Jameson has called America's postmodern mode of production. On the face of it, the object of worship in this system is technology itself, which is viewed as a kind of all-purpose savior that will solve all of our problems if we are just patient enough. Steven Pinker's new book, Enlightenment Now, makes this point explicitly, but it is implicit every time some new tech startup promises to "fix" higher education, clean up all the trash in the ocean, and use architecture to save the natural environment (see, for example, Wade Graham's "Are We Greening Our Cities, or Just Greenwashing Them?", which provides both a survey and a critique of the eco-city movement: you can find it in the 9th edition of Signs of Life in the USA). The religion of technology also produces its own demi-gods, like Elon Musk, who can announce yet another delay (or change of plans) in his money-losing product line and still see his Tesla stock rise due to the unwavering adoration of his flock.


Oddly enough, as I was writing the first draft of this blog I came across an essay in The Chronicle of Higher Education that examines a related angle on this phenomenon. There, in a take-down of the "design thinking" movement (an ecstatic amalgamation of a Stanford University product design program and the Esalen Institute that promises to transform higher education into a factory for producing entrepreneurially inclined "change agents"), Lee Vinsel compares the whole thing, overtly, to a religious cult, acidly remarking that the movement "has many of the features of classic cult indoctrination, including intense emotional highs, a special lingo barely recognizable to outsiders, and a nigh-salvific sense of election," and concluding that "In the end, design thinking is not about design. It’s not about the liberal arts. It’s not about innovation in any meaningful sense. It’s certainly not about 'social innovation' if that means significant social change. It’s about commercialization. It’s about making education a superficial form of business training."


Thus, I think that Vinsel would agree with my contention that behind the religion of technology is something larger, older, and more universal. This is, quite simply, the religion of money worship. Minting instant billionaires and driving an ever-deeper wedge between a technology-fostered one percent and everyone else, the post-industrial economy dazzles most through the glitter of gold, which overcomes every other moral value, from Facebook's willingness to allow its platform to be exploited for the purposes of overt political manipulation to Theranos's performing a million blood tests with a technology so flawed that the tests have had to be invalidated, at who knows what cost to the patients (one should say, victims) involved.


And what does America do in response? It makes movies, like Aaron Sorkin's The Social Network and the forthcoming adaptation of John Carreyrou's own Bad Blood, a film said to star Jennifer Lawrence and due out in 2019, thus turning social anomie into entertainment and promising even more offerings on the altars of extreme affluence.


Image Credit: Pixabay Image 1761832 by kropekk_pl, used under a CC0 Creative Commons License

One of my all-time favorite readings from past editions of Signs of Life in the USA is Andy Medhurst's "Batman, Deviance, and Camp." In that analysis, Medhurst traces how the original muscle-man clone of Superman morphed into "Fred MacMurray from My Three Sons" in the wake of Fredric Wertham's notorious accusation in 1954 that Batman and Robin were like "a wish dream of two homosexuals living together," only to be transformed into the Camped Crusader of the 1966 TV series Batman, and then revised once more into the Dark Knight of the 1980s and beyond. In doing so, he reveals how cartoon superheroes change with the times, reflecting and mediating the cross currents of cultural history. So as I ponder the rampant success of the second Deadpool film in this emergent franchise, I find myself wondering what this new entrant into the superhero sweepstakes may signify. Surely this is a topic for semiotic exploration.


What particularly strikes me here is the difference between the gloomy and humorless Batman of the Miller/Burton/Nolan (et al.) era, and the non-stop wisecracking of Deadpool. It isn't that Deadpool doesn't have a dark backstory of his own, as grim as anything to be found in Bruce Wayne's CV. And, surely, the Deadpool ecosystem is even more violent than the Batworld. No, it's a matter of tone, of attitude, rather than content.


Now, if Deadpool were the only currently popular superhero who cracked wise all the time, there really wouldn't be very much to go on here, semiotically speaking. But Deadpool isn't the only wiseacre among the men in spandex: various Avengers (especially Thor), along with the latest incarnation of Spider-Man, have also taken to joking around in the midst of the most murderous mayhem. If the Dark Knight soared to superstar status on the wings of melancholy, a lot of rising contenders for the super-crown appear to be taking their cue from Comedy Central. Something's going on here. The question is, what?


I'm thrown back on what might be called "deductive abduction" here: that is, moving from a general condition to a particular situation as the most likely explanation. The general condition lies in the way that wise-cracking humor has been used in numerous movies whose traditional audience would be restricted to children and adolescents (think Shrek) but that have broken through to generational cross-over status. Such films employ lots of self-reflexive, topically allusive, and winking dialogue to send a message to post-adolescent viewers that no one involved in the film is really taking all this fantasy stuff seriously, and so it's safe, even hip, for grown-up viewers to watch (of course, this is also part of the formula behind the phenomenal success of The Simpsons). Stop for a moment to think about the profound silliness of the Avengers movies: who (over a certain age) could take this stuff seriously? Well, the wisecracks—which are generally aimed at those who happen to be over a certain age—are there to provide reassurance that it isn't supposed to be taken seriously. Just sit back, be cool, and enjoy.


So, given the R-rating of the Deadpool movies, I would deduce that the almost excessive (if not actually excessive) self-reflexive, topically allusive, and winking dialogue to be found in them works to reassure an over-seventeen audience that the whole thing is just a big joke. No one is taking any of this seriously, and so it is perfectly safe to be spotted at the local cineplex watching it. Hey, there's even a postmodern inflection to Deadpool's fourth-wall-dissolving monologues: what could be more hip?


Since most cultural phenomena are quite over-determined in their significance, I do not mean to preclude any other possible interpretations of the super wiseass phenomenon, but the interpretation I've posted here is one I feel confident of. At any rate, the topic could make for a very lively class discussion and an interesting essay assignment.


Image Credit: Pixabay Image 2688068 by pabloengels, used under a CC0 Creative Commons License.

The lead-in to the L.A. Times article on the Tony Award nominations really caught my attention. Here it is:


"'SpongeBob SquarePants,' 'Mean Girls' and 'Harry Potter and the Cursed Child.'

'Angels in America,' 'Carousel' and 'My Fair Lady.'"


That's exactly as it appeared, and the title of the piece—"Tony nominations for 'Harry Potter,' 'SpongeBob' and 'Mean Girls' put Hollywood center stage"—made it clear that the author was well aware of the list's significance: that television and the movies appear to be taking over one of the last bastions of American live theater: the Broadway stage.


Now, before you get the idea that I am going to lament this development as some sort of cultural loss or desecration, let me assure you that I have no such intention. To begin with, Broadway has always occupied a somewhat liminal position in the traditional high cultural/mass cultural divide, and the stage has always been a setting for popular entertainment—albeit one that is not mediated by electronic technology. And while the article, for its part, does note that "Harry Potter and the Cursed Child" represents "just one example of the kind of pop-culture franchise that can reduce producers' financial risk," it does not do so in anger. Indeed, it even quotes a writer associated with this year's nominees' sole "art-house" production ("The Band's Visit"), who rather generously observes that "Commercial theater, and musical theater, is a really risky venture. It's very expensive. It's possible to have a great success, but it's really unlikely"; adding that "I don't blame anyone for trying to hedge against that risk by adapting a really well known property, and it's not always cynical."


There are two quick and easy popular-semiotic takeaways, then, from this year's Tonys. The first is that the last barriers between mass media entertainment and the more culturally prestigious (if less lucrative) traditional stage are coming down once and for all. The second is that they are coming down not on behalf of some sort of producer-led deconstruction of a vanishing high cultural/low cultural divide, but simply because very few Broadway producers are willing to take any financial risks these days and prefer to go with tried-and-true productions. And this doesn't simply mean translating blockbuster movies and TV shows to the stage: after all, revivals of "Angels in America" and "The Iceman Cometh" are also among the nominees this year.


But the real significance of the Tonys for me appears when we broaden the system in which to analyze the nominations to include what is happening in the movies and television as well. Here too we find revivals, reboots, sequels—in short, one studio or network franchise after another centered on a successful brand that never seems to run out of steam: "Avengers: Infinity War" (note the endlessness implied); "Star Wars Forever"; "Roseanne" II and "Murphy Brown" redux; and so on and so forth. What this reveals is not only a similar spirit of creative bet hedging by going with tried-and-true entertainment commodities, but also a narrowing of opportunities for creators themselves. For the message in this particular bottle is that in America success is the gift that keeps on giving. A few people (like George Lucas and J.K. Rowling) are going to rise from obscurity and hit it so big with their creative efforts that they will use up all the oxygen in the room. It isn't that there will be less creativity (with luminaries like Lucas and Rowling shining bright for innumerable self-publishing dreamers who hope to be the next meteors in the popular cultural skies, there will never be any danger of that); the problem is that there will be fewer opportunities to make such creative breakthroughs, or earn any sort of living while trying, when the stage (literally and figuratively) is filled with old brands that won't move aside for new entrants.


And so, finally, we come to a larger system within which to understand what is going on with the Tony Awards: this system is America itself, where a handful of winners are vacuuming up all of the opportunity and leaving almost nothing for everyone else (George Packer eloquently describes the situation in "Celebrating Inequality," an essay you can find in the 9th edition of Signs of Life in the USA). The rewards of the American dream are bigger than they have ever been; but not only are there fewer seats at the banquet of success, the pickings are getting leaner and leaner for those who haven't been invited.


Credit: Pixabay Image 123398 by smaus, used under a CC0 Creative Commons License


Though the "movement" has attracted some very high-profile participants (can you spell "Elon Musk"?), I am not aware that #deletefacebook is making much of a real dent in Facebook's membership ranks, and I do not expect that it ever will. For in spite of a seemingly continuous stream of scandalous revelations of Facebook's role in the dissemination of fake news and the undermining of the American electoral system—not to mention the way that Facebook, along with other digital titans such as Google, data-mines our every move on the Internet—all signs indicate that, when it comes to America’s use of social media, the only way is up. Even the recantations of such former social media "cheerleaders" as Vivek Wadhwa (who have decided that maybe all this technological "progress" is only leading to human "regression" after all) are highly unlikely to change anyone's behavior.


The easiest explanation for this devotion to social media, no matter what, is that Internet usage is addictive. Indeed, a study conducted at the University of Maryland by the International Center for Media and the Public Agenda, in which 200 students were given an assignment to give up their digital devices for 24 hours and then write about their feelings during that bleak stretch, revealed just that, with many students reporting effects that were tantamount to symptoms of drug withdrawal (a full description of this study can be found in chapter 5 of the 9th edition of Signs of Life in the USA). To revise Marx a little, we might say that social media are the opiate of the masses.


Given the fact that our students are likely to have lived with the Internet all of their lives, it could be difficult, bordering on impossible, for them to analyze in any objective fashion just how powerful, and ultimately enthralling, social media are. It’s all too easy to take the matter for granted. But with the advent of digital technology looming as the most significant cultural intervention of our times, passive acceptance is not the most useful attitude to adopt. At the same time, hectoring students about it isn’t the most productive way to raise awareness either. All those “Google is making America stupid” screeds don’t help at all. So I want to suggest a different approach to preparing the way for a deep understanding of the seductive pull of social media: I'll call it a "phenomenology of Facebook."


Here's what I have in mind. Just as in that phenomenologically influenced mode of literary criticism called "Reader Response," wherein readers are called upon to carefully document and describe their moment-by-moment experience of reading a text, you could ask your students to document and describe their moment-by-moment experience when they use social media. Rather than describing how they feel when they aren't online (which is what the University of Maryland study asked students to do), your students would describe, in journal entries, their precise emotions, expectations, anticipations, disappointments, triumphs, surprises, hopes, fears (and so on and so forth) when they are. Bringing their journals to class, they could share what they discovered (using their discretion about what to share and what not to), and then work together to identify the commonalities of their experience. The exercise is likely to be quite eye-opening.


It is important that you make it clear that such a phenomenology is not intended to be judgmental: it is not a matter of “good” or “bad”; it is simply a matter of “what.” What is the actual experience of social media usage? What is it like? What’s going on? Only after clearly answering such phenomenological questions can ethical questions be effectively posed.


Not so incidentally, you can join in the exercise yourself; I have, and you may be surprised at what you learn.



Credit: Pixabay Image 292994 by LoboStudioHamburg, used under a CC0 Creative Commons License