
Bedford Bits

92 Posts authored by Jack Solomon

When Sonia and I began working on the first edition of Signs of Life in the U.S.A. in 1992, semiotics was still regarded as a rather obscure scholarly discipline generally associated with literary theory and linguistics. It also was quite literally unheard of to attempt to employ semiotics as a model for critical thinking in first-year composition classes, and Chuck Christensen, the Publisher and Founder of Bedford Books, was rather sticking his neck out when he offered us a contract. To help everyone at Bedford along in the development process of this unusual textbook, he asked me to provide a one-page explanation of what semiotics actually is, and I responded with a semiotic analysis of the then-popular teen fashion of wearing athletic shoes—preferably Nikes—with their shoelaces untied. That did the trick and Sonia and I were on our way.

 

As you may note, the focus of my semiotic explanation for the Bedford folks was on an object (athletic shoes), with the intent of demonstrating how ordinary consumer products could be taken as signs bearing a larger cultural significance. This was quite consistent with semiotic practice at the time in the field of popular cultural studies, which frequently analyzed cultural objects and images. But even then I knew that the real focus of cultural semiotics in Signs of Life was human behavior as mediated by such things as fashion preferences, and with each new edition of the book, I have been further refining just what that means.

 

And so, as I work on the tenth edition of the book, I have come to realize that the semiotic analysis of cultural behavior bears a close relationship to the science of artificial intelligence. For just like AI, the semiotics of human behavior works with aggregated patterns based upon what people actually do rather than what they say. Consider how the ALEKS mathematics adaptive learning courseware works. Aggregating masses of data acquired by tracking students as they do their math homework on an LMS, ALEKS algorithmically anticipates common errors and prompts students to correct them step-by-step as they complete their assignments. This is basically the same principle behind the kind of algorithms created by Amazon, Facebook, and Google, which are designed to anticipate consumer behavior, and it's also the principle behind Alexa and Siri.
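
To make the aggregation principle concrete, here is a minimal sketch (my own illustration, not ALEKS's or any vendor's actual algorithm; the problem and error names are hypothetical) of how tallying what many students actually did can yield a prediction about what the next student is most likely to do:

```python
# A minimal sketch of the aggregated-pattern principle described above:
# tally what many students actually did on a given problem, then predict
# the most likely error for the next student. Illustrative only, not any
# real courseware's algorithm; the identifiers below are hypothetical.
from collections import Counter

# Hypothetical logged attempts: (problem_id, error_type or None if correct)
logged_attempts = [
    ("quadratic_01", "sign_error"),
    ("quadratic_01", "sign_error"),
    ("quadratic_01", "dropped_term"),
    ("quadratic_01", None),
    ("quadratic_01", "sign_error"),
]

def most_likely_error(attempts, problem_id):
    """Return the most frequently observed error for a problem, if any."""
    errors = Counter(
        err for pid, err in attempts
        if pid == problem_id and err is not None
    )
    if not errors:
        return None
    return errors.most_common(1)[0][0]

# The aggregate is predictable even though any single student is not.
print(most_likely_error(logged_attempts, "quadratic_01"))  # -> "sign_error"
```

The point of the sketch is simply that the prediction comes from the aggregate record of behavior, not from anything any individual student says about their own habits.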

 

Now, semioticians don't spy on people, construct algorithms, or profit from their analyses the way the corporate titans do. But they do take note of what people do and look for patterns, creating historically informed systems of association and difference in order to provide an abductive basis for the most likely, or probable, interpretation of the behavior they are analyzing—as when, in my last blog, I looked at the many decades over which the character of the Joker has remained popular in order to interpret that popularity.

 

Now, to take another fundamental principle of cultural semiotics—that of the role of cultural mythologies in shaping social behavior—one can anticipate a good deal of resistance (especially from students) to the notion that individual human behavior can be interpreted so categorically, for the mythology of individualism runs deep in the American grain. We like to think that our behavior is entirely free and unconstrained by any sort of mathematically related probabilities. But none of this would trouble a probability theorist, especially one like Sir David Spiegelhalter, the Cambridge University statistician, who has noted that “Just as vast numbers of randomly moving molecules, when put together, produce completely predictable behavior in a gas, so do vast numbers of human possibilities, each totally unpredictable in itself, when aggregated, produce an amazing predictability.”
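
Spiegelhalter's point can be illustrated with a toy simulation (my example, not his): each simulated individual choice below is unpredictable, yet the aggregate proportion is remarkably stable once enough choices are pooled.

```python
# A toy illustration of aggregation producing predictability: each individual
# "choice" is random and unpredictable, but the aggregate proportion settles
# near the underlying rate every time the simulation is run.
import random

def aggregate_rate(n_people, p=0.3, seed=None):
    """Simulate n_people independent yes/no choices; return the fraction of yeses."""
    rng = random.Random(seed)
    yes_votes = sum(1 for _ in range(n_people) if rng.random() < p)
    return yes_votes / n_people

print(aggregate_rate(10))         # a handful of people: the result jumps around
print(aggregate_rate(1_000_000))  # a million people: reliably very close to 0.3
```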

 

So, when we perform a semiotic interpretation of popular culture, we are on the lookout for that probability curve, even as we anticipate individual outliers and exceptions (which can themselves point to different patterns that may be equally significant in what is, after all, an overdetermined interpretation). But our goal as semioticians is to reveal the significance of the patterns that we find, not to exploit them, and thus, perhaps, to help modify those behaviors that, all unawares, are doing harm.

 

Photo Credit: Pixabay Image 2587756 by Stock Snap, used under Pixabay License

Hailed as a "must-see" movie for the apres-weekend water cooler crowd, and warily monitored by everyone from local police departments to survivors of the Aurora, Colorado massacre, Joker has surpassed its opening box office predictions and has already succeeded in becoming the current cinematic talk of the town. Such movies always make for engaging student essay and discussion topics, and I expect that many college instructors across the country are already crafting assignments about this latest installment in the comics-inspired universe of Hollywood blockbusters.

 

But while many such assignments will likely invite debates on the advisability of making such a movie as Joker in the light of an epidemic of lunatic-loner mass shootings, and others (especially in film studies departments) will focus on the revival of the Scorsese/De Niro "character study" formula that made Taxi Driver a movie classic (heck, Joaquin Phoenix even channeled his inner De Niro by losing a ton of weight, Raging Bull style, for the role, and, of course, De Niro's in the film too), a cultural semiotic analysis of the movie would take a different approach, which I will sketch out here.

 

To begin with, we can ask the question, "what does the enduring popularity of the Joker in American popular culture tell us?" For alone among the multitudinous villains of comic book history, the Joker returns again and again, now to star as the protagonist in his own feature film. Where's the Penguin, we might ask, or Clayface? What is it about this character that has captured the American imagination?

 

As with any semiotic analysis, let's start with the history of the Joker. In the beginning he was a Dick Tracy-like gangster in the tradition of Conan Doyle's evil genius Professor Moriarty, heading his own organized crime syndicate. Given a camped-up star turn in the Batman TV series of the 1960s, the Joker joined with Burgess Meredith's Penguin and a host of other really funny, but essentially harmless, villains in the days when fist fights (SMASH! BAM! POW!) were considered sufficient violence for a prime time children's television program.

 

The key change in the Joker's portrayal (the critical semiotic difference) came in the 1980s, when Frank Miller and Grant Morrison darkened the scenario considerably, turning the quondam clown into a psychopathic killer. This was the Joker that Jack Nicholson channeled in Tim Burton's Batman, and which Heath Ledger took further into the darkness in The Dark Knight. It's important to point out, however, that while Nicholson's Joker is a merciless killer, he is also very funny (his trashing of the art museum is, um, a riot), and his back story includes an acid bath that has ruined his face, providing a kind of medical excuse for his behavior. Ledger's Joker, on the other hand, isn't funny at all, and his unconvincing attempt to attribute his bad attitude to childhood abuse isn't really supposed to be taken seriously by anyone. The point is simply that he is a nihilistic mass murderer who likes to kill people—even his own followers. And unlike the past Jokers, he isn't in it for the money, incinerating a huge pile of cash with one of his victims tied up at the top to prove it.

 

The trajectory here is clear, and the makers of Joker were very well aware of it. Rather than turn back the clock to introduce a kinder, gentler Joker (you're laughing derisively at the suggestion, and that's precisely my point), Todd Phillips and Scott Silver quite knowingly upped the ante, earning an R-rating that is quite unusual for a comics-themed movie. Well, Deadpool got there first, but that's part of the point, too.

 

For in spite of the film's attempt to pass itself off as a study of the pathologizing effects of socioeconomic inequality, that isn't its appeal at all, and it doesn't explain why this particular character was chosen to be the protagonist. Just think, what if someone made a movie called Marx: the Alienation Effect in Contemporary Capitalism, based on the best-seller Das Kapital? No, I'm afraid that the Joker's popularity isn't political in any truly revolutionary sense. He's way too much of a loner, and too weird. There's something else going on here.

 

Before one succumbs to the temptation to simply say that Joker is a movie for psychopathic wannabes, let's just remember that the domestic box office for the film's first weekend was 96 million dollars. There just aren't that many psychopaths out there to buy so many tickets. No, the desire for an ever-darkening Joker is clearly a very widespread one, and the success of the aforementioned Deadpool franchise—not to mention Game of Thrones' wildly popular funhouse-mirror distortions of Tolkien's primly moralistic Middle-earth—only amplifies the evidence that Americans—especially younger Americans—are drawn to this sort of thing. But why?

 

I think that the new detail in the Joker's origin story that is introduced in the movie, portraying him as a failed standup comic and clown, is a good clue to the matter. We could say that Arthur Fleck's great dreams—at least in his mind—have been betrayed, and there's a familiar ring to this as a generation of millennials, burdened with college debt and college degrees that lead nowhere, faces a country that many feel is betraying them. It is significant in this regard that the darkening of the Joker began in the 1980s, the decade when the American dream began to crumble under the weight of the Reagan tax cuts, massive economic "restructuring," and two recessions from which the working and middle classes never fully recovered. What happened in response wasn't a revolution: it was anger and despair, spawning a kind of Everyman disillusionment with traditional authority (including moral authority), conspiracy theories, and fantasies of breaking loose and taking things into one's own hands.

 

Which makes characters like the Joker rather like Breaking Bad's Walter White, whose response to economic disruption was to become disruptive. White's Everyman revolt didn't instigate an epidemic of middle-class drug lords; it simply entertained an angry America with the trappings of vicarious fantasy. The success of Joker just a few years after the end of Heisenberg shows that the fantasy is getting darker still.

 

Smash. Bam. Pow.

 

 

Photo Credit: Pixabay Image 1433326 by annca, used under Pixabay License

The Panopticon 2.0

Posted by Jack Solomon, Oct 3, 2019

Michel Foucault's application of Jeremy Bentham's panoptic proposal for prison reform to the modern surveillance state has become a commonplace of contemporary cultural theory. And heaven knows that we are being watched by our government, by our computers, by our phones, and televisions, and automobiles, and goodness knows what else. It is also no secret that current and prospective employers monitor the social media imprints of their current and prospective employees—all those promises of airtight privacy settings and Snapchat anonymity notwithstanding. As I say, all this has become a commonplace of life in the digital era.

 

But a new wrinkle has entered the picture, a fold in the space/time fabric of modern life if you will, whereby the pre-digital past has come to haunt the digital present. For as the governor of Virginia and the prime minister of Canada now know to their cost, what goes into your school yearbook doesn't stay in your school yearbook. And thanks to an array of yearbook-posting alumni websites, anyone with an Internet connection can access virtually anyone's yearbook and immediately expose online those embarrassing moments that you thought were safely hidden in the fogs of time.

 

(A parenthetical autobiographical note: I would be highly amused if someone dug up my high school yearbook—yearbooks, actually, because I was on the staff for three years, the last two as editor-in-chief. The first of the three was a conventional celebration of football players, cheerleaders, and homecoming royalty, but I changed all that in the next two when I got editorial control, dedicating the first of them to the natural environment— including two photo essays complete with an accompanying poetic narrative—and the second devoted to a contemplation of the mystery of time itself, which included repeating reproductions of El Greco's "Saint Andrew and Saint Francis," which were intended to convey an ongoing dialog between a wise man and a seeker of temporal wisdom. You get one guess as to why I don't have to worry about any embarrassing party pics in my yearbooks.)

 

So it isn't enough to cancel your Twitter account, max out your privacy settings on Facebook (good luck with that), or simply take a long vacation from the Internet, for the Net's got you coming and going whatever you do. I expect that one's reaction to this state of affairs (which is itself of semiotic interest) is probably generational; that is, if you grew up with the Internet, none of this is likely to be particularly alarming, but if you remember the days when personal privacy was at least a value (if not always a reality), then it could be disturbing indeed. And there is no escaping the situation, for just as it is impossible to avoid the consequences of major cyber hacks by refusing to conduct any of your business affairs online (if you have any sort of bank account, credit/debit card, health record, or social security number, you are vulnerable no matter how hard you try to live outside the Web), there is no controlling what may surface from your past.

 

Photo Credit: Pixabay Image 4031973 by pixel2013, used under Pixabay License

America's Got Sentiment

Posted by Jack Solomon, Sep 19, 2019

As Sonia Maasik and I work to complete the tenth edition of Signs of Life in the U.S.A., I have been paying special attention to American popular music, which will be a topic for a new chapter that we're adding to the book. While our approach will be semiotic rather than esthetic, part of my research has involved listening to music as well as analyzing its cultural significance, and as everyone knows, there's nothing like YouTube to put just about everything you want to hear at your literal fingertips. Which brings me to the subject of this blog.

 

Well, you know how YouTube is. Even as you watch one video you are regaled with a menu of others that can take you on a merry chase following one musical white rabbit after another. And so it came to pass that I found myself watching some famous clips from the Britain's Got Talent and America's Got Talent franchises. Which means that I finally saw that audition of Susan Boyle's, which, while it wasn't a joke, started the whole world crying. With joy.

 

Talk about fairy-tale happy endings! Take a little Cinderella, mix in the Ugly Duckling, and sprinkle in a lot of A Star is Born, and there you have the Susan Boyle story. I'd say that you couldn't make this sort of thing up, except for the fact that it has been made up time and again, only this time it's true. And it helps a lot that the woman can really sing.

 

The semiotic significance of this tale is rather more complicated than it looks, however. On the surface, it looks simply like a sentimental triumph of authenticity over glitter, of the common folk over entertainment royalty. And, of course, that is a part of its significance—certainly of its enormous popular appeal. Just look at the visual semiotics: the glamorous judges, sneering Simon (I'm certain that he has made himself the designated bad guy to add melodrama to the mix), and the audience on the verge of laughter in the face of this ungainly, middle-aged woman who says she wants to be a star. And then she blows the house away.

 

But here is where things get complicated. For one thing, even as the celebrity judges fell all over themselves confessing to Ms. Boyle how ashamed they felt for initially doubting what they were about to hear, they managed to imply that it would have been OK to ridicule her if it had turned out that she couldn't sing, that losers deserve to be humiliated. After all, that's what those buzzers are for.

 

And then there is the notoriously oxymoronic nature of reality television, its peculiar mixture of authenticity and theatricality, its scripted spontaneity. One begins to wonder what the judges knew in advance about Susan Boyle; certainly she didn't get to that stage of the competition by accident. For to get past the thousands of contestants who audition in mass cattle calls for these shows, you have to have something that the judges want, and this can include not only outstanding talent but unexpectedly outstanding talent, the ugly ducklings that provide plenty of occasion for all those dewy-eyed camera shots of audience members and judges alike who are swept away by the swans beneath the skin. The whole thing has become such a successful formula for the franchise that when, a few years after the Susan Boyle sensation, a soprano/baritone duo named Charlotte and Jonathan came onto the stage, Simon Cowell made sure to quip, in a loud stage whisper to the judge beside him, "Just when you think things couldn't get any worse" (funny how the camera caught that), only to have Jonathan steal the show with a breathtaking performance that Sherrill Milnes might envy. Call me cynical, but somehow I think that Cowell knew perfectly well what was going to happen.

 

But let's not forget the designated duds either, the poor souls who get picked out of the cattle calls in order to be laughed at later, to be buzzed off the stage. After all, with so many truly talented people in the world, surely there would be enough to fill these shows with nothing but superb performers. But failure is part of the formula here as well as success, for schadenfreude, too, sells.

 

So the semiotic question isn't whether Susan Boyle can sing; nor is there any question that without Britain's Got Talent she would almost certainly not be enjoying a spectacular career. The semiotic question involves what is going on when television shows like Britain's Got Talent and America's Got Talent play upon the vicarious yearnings of their viewers to shine in the spotlight in a mass society where fewer and fewer such opportunities really exist—even as those same viewers sneer at the failures. Thus, as with so much of reality television, there is an uncomfortable love/hate relationship going on here, a sentimental identification with the winners alongside a derisive contempt for the losers. And in a ruthlessly hyper-competitive society where more and more people through no fault of their own are falling into the loser category, this is of no small significance.


And I have to add that I'm certain that if a young Bob Dylan or Leonard Cohen had appeared on America's Got Talent, either would have been buzzed off the stage.

 

 

Photo Credit: Pixabay Image 1868137 by Pexels, used under Pixabay License

With television's arguably most prominent dramatic series ending amidst the ashes of King's Landing and the outrage of many of its most loyal fans (including a remarkable Change.Org petition demanding an entire Season 8 redo), I find myself reminded of Frank Kermode's classic study, The Sense of an Ending (1967). Exploring the ways that human beings use storytelling in order to make sense of their lives and history, Kermode focuses his attention on the "high art" literary tradition, but the same attention can be paid to popular art as well in ways that can explain, at least in part, the extraordinary reaction to GoT's final two episodes. Here's how.

 

First, let's note that fan pressure on creative artists is nothing new. Charles Dickens' readers pleaded with him, in the serialized run-up to the climax of The Old Curiosity Shop, to please not kill Little Nell, while Arthur Conan Doyle was successfully lobbied by disappointed readers to bring Sherlock Holmes back from the dead after apparently killing the popular detective off in "The Final Problem." And movie producers routinely audience-test their films before making their final cuts. So all the popular uproar is not really different in kind from things that have happened before, but it may be different in degree, which is where its significance lies.

 

Because no one, except for the series' writers and actors, appears to be fully satisfied with what finally happened after eight long, and violent, years in the battle for the Iron Throne. The most common complaint seems to be that Daenerys should have been allowed to follow her "character arc" to become not only Queen of the Seven Kingdoms but also a kind of messiah. However, it isn't my purpose to wade into the controversy to offer my own opinion about what "should" or "shouldn't" have happened, for that's an esthetic, not a semiotic, question. Rather, I want to look at the extravagance of the negative response to what did transpire and what it tells us.

 

To understand this response we can begin with the fact that Game of Thrones ran for eight years as a continuous narrative—conceived, in fact, as one gigantic movie: a TV "maxi-series" if you will. Eight years is a long time, especially for the show's core audience of millennials who effectively entered adulthood along with GoT's main characters. This audience largely overlapped with the generation that grew from childhood to adolescence as the Harry Potter novels were published and filmed, and who also were on hand for the definitive cinematic Lord of the Rings: the fantasy novel to beat all fantasy novels first raised to cult status by baby boomers and turned upside down and inside out by George R.R. Martin to create A Song of Ice and Fire.

 

Such a long wait, at such a formative period of life, is simply bound to build up a great load of gestaltic expectation, a longing for the kind of ending that would redeem all the violence, betrayal, and heartbreak of this essentially sadistic story ("Red Wedding" anyone?). Both The Lord of the Rings and the Harry Potter novels prepared viewers for such an ending, one in which, to quote Miss Prism from The Importance of Being Earnest, "The good ended happily, and the bad unhappily." Instead, everyone got something more along the lines of King Lear.

 

And there you have it, for as Miss Prism tartly adds, happy endings are "what fiction means"—or as Frank Kermode might put it, the triumph of the hero and the defeat of the villain is one way that our storytelling makes sense of (one might say, makes bearable) the realities of human experience.

 

But that isn't what George R.R. Martin—who knows full well how the triumph of the House of York only led to Richard III, whose defeat, in turn, eventually brought Henry VIII and Bloody Mary to the throne—ever appears to have had in mind for his epic saga. Mixing historical reality with a lot of Tolkienesque fantasy, Martin (whose own conclusion to the tale is yet to come) thus put the show's writers into quite a bind. Because a completely conventional "happy" ending would have contradicted the whole point of the story, while a completely dismal one (say, Cersei triumphing after all) would have really enraged the customers. I use that word deliberately, for in popular culture audiences really are customers, and they expect to get what they pay for (the complaint on the part of some fans that, having made GoT's producers and actors rich, they were entitled to a better windup than they got is especially significant in this regard). So Benioff and Weiss essentially compromised in the way that pop culture usually does. The really bad villains do end unhappily, and the Starks do regain power after all, but Martin's fundamental thesis that power itself is the problem is preserved in the madness of Daenerys at the moment of achieving absolute control.

 

It wasn't a bad compromise in my view, but it quite clearly hasn't been a successful one either. Still, because of the odd reversal in the relation between novel and film, with the film being concluded before the novel was, the game isn't over. If the novels ever are concluded, I suspect that Martin will have more shocks up his sleeve, beginning, I suppose, with King Bran turning tyrant and bad trouble between Jon and Sansa.

 

 

Photo Credit: “Game of Thrones Paperback Book” by Wiyre Media on Flickr 7/15/17 via a CC BY 2.0 License.

Topics for popular cultural analysis can spring out at you at the most unexpected times—in fact, that is one of the goals of cultural semiotics: to attune oneself to the endless array of signs that we encounter in our everyday lives. Take for example the catalog from a Minnesota-based outfit called The Celtic Croft that I came across quite by accident recently. A mail-order/online Scottish clothing and accessories emporium, The Celtic Croft offers its clientele not only traditional Highland gear but "officially licensed Outlander-inspired apparel and tartans," along with authentic Braveheart kilts. Which is where popular culture, and its significance, comes in.

 

I admit that I had to look up Outlander (of which I had only rather vaguely heard before) to understand what the catalog was referring to, but what I learned was quite instructive. Based upon a series of historico-romantic fantasy novels by Diana Gabaldon, Outlander is a television series from the STARZ network that features the adventures of a mid-twentieth-century Englishwoman who time travels back and forth between the eighteenth and twentieth centuries as she leads a dual life among the Highland clans and the post-World War II English. Something of a breakout sensation, Outlander has recently been renewed for a fifth and sixth season.

 

To grasp the cultural significance of this television program—and of the clothing catalog that is connected to it—we can begin by constructing the system to which it belongs. The most immediate association, which is made explicit in The Celtic Croft catalog, is with the Oscar-winning film Braveheart, but the Highlander television and movie franchise is an even closer relation. More broadly, though set in the eighteenth century, Outlander can be regarded as a part of the medieval revival in popular culture that began with J.R.R. Tolkien's Lord of the Rings and The Hobbit, and which led to the whole "sword and sorcery" genre, encompassing both Harry Potter and Game of Thrones, with its emphasis on magic, swordplay, and a generally romanticized view of a pre-industrial past.

 

The current controversy raging within medieval studies over its traditional focus on the European Middle Ages—not to mention its cooptation by avowed white supremacists—reveals that such a system is fraught with potential political significance, and it is highly likely that a complete analysis of the phenomenon would uncover elements of conscious and unconscious white nationalism. But, if we limit ourselves here to The Celtic Croft catalog and its Braveheart and Outlander-inspired merchandise, we can detect something that is a great deal more innocuous. To see this we can begin with a tee-shirt that the catalog offers: a black tee with white lettering that reads, "Scotch: Makin' White Men Dance Since 1494."

 

Now, I can see how this slogan could be taken as a kind of micro-aggression, but it can also be seen as something similar to the "white men can't jump" trope: expressing what is actually an admiration for qualities that are not conventionally associated with white people—especially in relation to stereotypes of Anglo-Saxon self-repression and ascetic Puritanism. What the dancing Celt signifies is someone who can kick up his heels over a glass of whiskey and who is decidedly not a stodgy Saxon.

 

This interpretation is supported by the larger context in which The Celtic Croft universe operates. This is the realm of Highland Scotland, whose history includes both biological and cultural genocide at the hands of the English, who themselves become symbols of brutally oppressive whiteness in Braveheart and Outlander. It is significant in this respect that William Wallace's warriors in Braveheart were conspicuously portrayed with the long hair and face paint of movie-land Indians, while the British of Outlander are depicted as killers, torturers, and slave traders.

 

So what we have here is something that might be called an "escape from the heritage of oppressive whiteness," by which white audiences/consumers (who do not have to be actual Scots: even Diana Gabaldon isn't) identify with the Celtic victims of Anglo history, finding their roots in such historical disasters as the Battles of Falkirk and Culloden. Purchasing the once-forbidden symbols of the Highland clans (kilts and tartans were banned for years after Culloden), and watching movies and television shows that feature the heroism of defeated peoples who resisted Anglo-Norman oppression, is thus a kind of celebration of a different kind of whiteness, one that rejects the hegemonic variety.

 

In other words, rather than reflecting white supremacy, the Celticism (I think I just coined that) of The Celtic Croft and its associated entertainments expresses a certain revision of the traditional American view of history away from Anglo-centrism towards an embrace of its victims. At a time when differing experiences of historical injustice are rending our country, this is no small recognition, because it could potentially help create a ground for unity rather than division.

 

 

Photo Credit: Pixabay Image 349717 by PublicDomainArchive used under the Pixabay License.

Now on a record-shattering run that should come as no surprise to anyone, Avengers: Endgame offers a multitude of possibilities for writing assignments, ranging from a close reading of the movie itself to an analysis of the entire Avengers film franchise and beyond to a reflection on a system of violent ongoing sagas that includes Star Wars, Game of Thrones, and even The Walking Dead—not to mention the rest of the Marvel universe.

 

I am not going to attempt anything of the sort in this brief blog, but instead want to propose a different kind of assignment, one that has semiotic implications but begins in a kind of personal phenomenology much akin to a reader-response analysis. This assignment would probably best be composed in the form of a student journal entry posing the question: How does an ongoing story line that appears to reach some sort of conclusion (including the deaths or "retirement" of major characters), but which I know is not really over at all, affect me and my sense of reality?

 

What I'm aiming at here is for students to become aware of what could be called the "false catharsis" involved in movies like Avengers: Endgame, which pretend to bring a vast arc of interwoven stories to an end, but which viewers know perfectly well is not over at all. Disney has too much at stake to allow Iron Man, for example, to stay dead, or for Captain America to remain retired, and what with the unlimited resources that fantasy storytelling has at hand to reverse the past and reconstruct the present and future, you can be pretty certain that everyone will be back.

 

In exploring the implications of what could well be called "eternity storytelling," consider what the effect of Charles Dickens' The Old Curiosity Shop would have been if his readers had known that Little Nell would be brought back in one way or another in a future novel. Or what the impact of the Iliad would be if Hector rose from the grave in a future installment of Trojan War Forever? Or (to go all the way back) how it would be if, in Gilgamesh II, the king of Uruk were to discover a time-traveler's ring that enabled him to go back to retrieve the lost plant-that-gives-eternal-life and revive Enkidu after all?

 

You see what I'm getting at? There can be no true tragedy in a story like Avengers: Endgame, only a consumerist fantasy that lets you have your tragic cake and eat it too, purchasing your way into an impossible realm in which death and destruction are reversible and the story always goes on.

 

This is what I mean by a "false catharsis." In a true dramatic catharsis, there is a tragic recognition of the inexorable limits of human being. That recognition isn't pleasurable and it isn't fun, but it does offer a solemn glimpse into a reality that is vaster than we are, and with that glimpse, a certain dignity and wisdom.

 

But that doesn't sell tickets.

 

 

Photo Credit: Pixabay Image 1239698 by ralpoonvast used under the Pixabay License.

If your Internet browser of choice is Firefox, then you are familiar with the way it provides you with a selection of readings when you visit its home page. I presume that my selections are based upon algorithms that mine my search history, because I get a lot of stuff from the Atlantic and the New York Times, as well as a lot of science readings. I'm not complaining, because while a good deal of what I see is simply clickbait, I have also learned some useful stuff from time to time. But what is perhaps most useful to me is what I am learning by conducting a semiotic analysis of the general themes that dominate my "feed."

 

Probably the most common theme I see appears in all the "how to succeed in business" articles that are always popping up: how to ace that interview, how to find that perfect job, how to choose the career that's best for you…that sort of thing. Tailored to the sensibilities of the digital age, such advice columns belong to a long tradition of American "how to" manuals calibrated to a competitive capitalist society. Calvin Coolidge (who once famously quipped that "the chief business of the American people is business") would feel quite at home here, so I don't want to read too much into all this. But I do think that the preponderance of such pieces may well reflect a growing anxiety over the possibility of attaining a rewarding career in a gig economy where opportunities for middle-class advancement are drying up.

Some evidence for this interpretation lies in the remarkable number of articles relating to mental depression that also appear in my feed. Some of them are scientific, while others are also of the how-to variety, mental health division. The latter texts have recently been emphasizing the healing properties of the natural world, and I'm good with that. After all, that's where I go to soothe jangled nerves. But what, semiotically speaking, does this trend tell us?

 

My take on the matter is that even as Americans worry (with very good reason) about their career opportunities, they also are becoming increasingly depressed in the face of a constant barrage of signals from the traditional mass media and digital social media alike, all pushing them to compare their lives to the lives of everyone else. David Myers, over in the Macmillan Learning Psychology Community, has been exploring this phenomenon recently, especially with respect to teen-aged girls, and I am quite in sympathy with his interpretation. I would simply expand the scope of the problem to include pretty much everyone who, facing a daily bombardment of images and stories about the fortunate few who seem to have everything that life can possibly offer, experiences a depressing discontentment with their own lives.

 

And here is where nature comes in. Nature is not only filled with beautiful forests, mountains, rivers, lakes, seashores, deserts, meadows, canyons, valleys (pick your own favorites), it is not filled with people—or, at least, isn't supposed to be. "Climb the mountains and get their good tidings. Nature's peace will flow into you as sunshine flows into trees," John Muir once said, and his words are even more cogent today than when he wrote them over a century ago.

 

But to achieve that peace, you need to get past the crowds, and, more importantly, all that social pressure that drove you to nature in the first place. It is therefore quite ironic that one often sees people in natural surroundings wholly absorbed in their iPhones, or taking selfies. This kind of hetero-directed behavior not only threatens to disrupt the healing powers of the natural world, it also signifies how, for many people today, social media have created an addictive spiral from which they cannot escape. Think of it: going to nature to escape the depressing impingements of social existence, only to seek approval from other people, and then, perhaps, to be depressed if you don't get quite the reaction you hoped for on Instagram.

 

Hence, the title of this blog.

 

 

Photo Credit: Pixabay Image 489119 by kelseyannvere used under the Pixabay License.

 

Every now and then, while driving to work, I've turned on the radio and have come in on the middle of a hauntingly beautiful, if rather grim, acoustic song sung by a gravelly-voiced singer who seems to be singing about heroin addiction, or something of the sort. Knowing nothing about the song, I've kind of assumed that it was about the nation's opioid epidemic and left it at that. But one phrase from the song really got my attention, and I finally entered the words into a search engine a few days ago to see what I could find out about it.

 

Okay, you know where I'm going now. I've used it in the title for this blog. The song is "Hurt," as recorded by Johnny Cash in 2002. But I was quite surprised to learn that it was written a decade earlier by Trent Reznor of Nine Inch Nails, whose own recording of it in 1994 is on a different sonic plane entirely, and even more nightmarish than Cash's dark retrospective. Wow, country meets industrial pop.

 

There's nothing new, of course, in an artist from one musical genre covering a song from another. After all, that's one way that music evolves: through a continuous mixing and fusing of different styles into new forms. And it isn't that Johnny Cash hadn't done this sort of thing before: this is the country icon who teamed up with Bob Dylan back in the days when Merle Haggard was still "proud to be an Okie from Muskogee," and country music fans tended to be openly hostile to just about everything that folk and folk rock stood for.

 

Whether he put it in such terms to himself or not, Cash's collaborations with Dylan tapped into a common system of tangled roots from which country music and folk/folk rock emerged: the lives of the poor, the downtrodden, and the oppressed. Within this tradition lies the outlaw, a defiant (and often idealized) figure who breaks the rules in contempt of society's laws. And so Johnny Cash went to Folsom Prison in 1968, thereby jump-starting his flagging career, and (along with Waylon Jennings, Willie Nelson, and David Allan Coe) establishing outlaw country as a thriving musical sub-genre.

 

It's important to note that country music isn't the only popular musical form with an outlaw tradition these days. Rap, particularly in its Gangsta' incarnation, has its own outlaws, and its own taproot into the lives of the people. "Country rap," a rather uneasy and tentative fusion of country and hip hop, has even emerged to explore the possibilities of this common ground.

 

Given the highly fraught state of social relations in America today, I don't really expect that country rap will make much of a difference politically, but to get to where we want to go, it is always useful to know where we came from.  

 

Photo Credit: Pixabay Image 687631 by Ana_J, used under the Pixabay License.

One of the key principles upon which the semiotic method is based is that of the cultural mythology. Grounded in Roland Barthes’ pioneering study Mythologies, a cultural mythology is an ideologically inflected worldview (or set of worldviews) that shapes social consciousness. Unlike stricter versions of social constructionism, however, which hold that reality itself is a social construct, the mythological viewpoint—at least as I present it in Signs of Life in the U.S.A.—is essentially subjective, and can be tested against the objective realities that surround it. So passionately are cultural mythologies held, however, that when reality does break through, the result can be quite emotional, even violent.

 

Take climate change denial, for instance. Effectively a sub-cultural mythology in its own right, it responds to a steady stream of objective evidence that climate change is real with ever more insistent denials from its adherents. Or then again, take America's fundamental mythology of the American dream, which holds that opportunities for social and economic advancement are open to all who make the effort to achieve them, and consider what happens when uncomfortable realities challenge it—as just happened with the still unfolding college admissions scandal.

 

The extraordinary level of emotion—and media attention—that has greeted this scandal is especially indicative of what happens when a cultural mythology smashes into reality. For here is evidence, especially painful for the middle class, that even college admissions can be bought through schemes that are open only to the upper class that Americans are so slow to recognize exists at all. In a certain sense, I must confess, I'm a little surprised by the profundity of the reaction. I mean, didn't everyone already know about the advantages—from legacy admissions to exclusive prep schools to expensive SAT tutoring—that America's upper classes enjoy when it comes to elite college admissions? Somehow I can't help but be reminded of that iconic scene in Casablanca where Captain Louis Renault is "shocked" that "gambling is going on” in Rick's Café Américain, just as he is about to receive his own winnings.

 

So there is something about this current glimpse into what upper-class privilege is all about that has really struck a nerve. I see at least three facets to the scandal that help explain how and why. First is the high-profile celebrity involvement. As an entertainment culture, America adores and identifies with its favorite entertainers, so when two popular actresses, and their children, are alleged to have taken advantage of their wealth in order to slip past the guardians of a supposedly meritocratic college admissions system, the feeling of betrayal runs especially deep.

 

The second component to the scandal is that career opportunities for America's college graduates (especially if they are not STEM majors) have been closing down since even before the Great Recession hit, increasing the pressure to get into one of those schools whose graduates have the best chance at getting the few good jobs that are left. Suddenly, where you go to college seems to matter a lot more in determining where you are going to get in life.

 

Which takes us to the third angle to the phenomenon: the stunned realization that not only is the American dream a cultural mythology but that the whole game appears to have been rigged all along. The effect of this apprehension on American society today cannot be overestimated. It is, in good part, behind the rise of political "populism" (it may be significant in this regard that conservative commentary on the scandal gloats over the "liberal" Hollywood elites involved), as well as the accompanying divisions in a society where more and more people are competing for fewer and fewer slots in the good life—which appear to have been purchased in advance as part of the social scenario of a new Gilded Age.

 

Photo Credit: Pixabay Image 1701201 by davidsenior, used under the Pixabay License

The "Momo Challenge"

Posted by Jack Solomon, Mar 14, 2019

 

When I first started writing about popular cultural semiotics in the 1980s, the Cabbage Patch Kids were the biggest thing going in children’s consumer culture. Not too many years later there was the POGS pandemic, followed by the Pokemon outbreak, which has since crossed its original generational boundaries to continue on as what may be the most lucrative gaming phenomenon of all time.

 

The common thread running through all these mega-fads is the way that they all were disseminated—at least in their beginnings—via a mysterious children’s grapevine unknown to adults, a vast international playground of sorts in which word about the Next Big Thing got passed without the assistance of social media. And now that the grapevine has gone digital, as it were, the propagation of new kiddie fads is accelerating at Warp speed, with unsettling results.

 

A couple of recent articles from The Atlantic and the New York Times provide a case in point. Describing the apparition of an online poltergeist called "Momo" who pops up unexpectedly on social media and dares kids to, among other things, commit suicide, they tell of a burgeoning panic among parents, police departments, and major news outlets around the globe. The new fad is called "the Momo challenge," and it would be pretty scary—except that it's a hoax.

 

Taylor Lorenz sums up all the confusion rather nicely:

On Tuesday afternoon, a Twitter user going by the name of Wanda Maximoff whipped out her iPhone and posted a terrifying message to parents.

 

“Warning! Please read, this is real,” she tweeted. “There is a thing called ‘Momo’ that’s instructing kids to kill themselves,” the attached screenshot of a Facebook post reads. “INFORM EVERYONE YOU CAN.”

 

Maximoff’s plea has been retweeted more than 22,000 times, and the screenshot, featuring the creepy face of “Momo,” has spread like wildfire across the internet. Local news hopped on the story Wednesday, amplifying it to millions of terrified parents. Kim Kardashian even posted a warning about the so-called Momo challenge to her 129 million Instagram followers.

 

To any concerned parents reading this: Do not worry. The “Momo challenge” is a recurring viral hoax that has been perpetuated by local news stations and scared parents around the world. This entire cycle of shock, terror, and outrage about Momo even took place before, less than a year ago: Last summer, local news outlets across the country reported that the Momo challenge was spreading among teens via WhatsApp. Previously, rumors about the challenge spread throughout Latin America and Spanish-speaking countries.

 

The Momo challenge wasn’t real then, and it isn’t real now. YouTube confirmed that, contrary to press reports, it hasn’t seen any evidence of videos showing or promoting the “Momo challenge” on its platform.

 

If Momo is a hoax, why, then, has she produced such a panicky reaction? John Herrman's take on the matter is instructive. "Screens and screen time are a source of endless guilt and frustration" for modern parents, he writes, "so it makes sense to need to displace these feelings on a face, a character, and something, or someone, with fantastically evil motives, rather than on the services that actually are surveilling what the kids are up to, to ends of their own."

 

In other words, if "Momo" isn't real, the way that the corporate Net is invading our privacy, "mining" our data, and leading our children down a Pied Piperish path (one which makes the exploitations of traditional television look like a nineteenth-century Fourth of July parade) is, and grownups are accordingly getting very jumpy. "Momo" may be a hoax, but Slender Man wasn't, and therein lies the real "Momo challenge": the Internet is growing faster than our ability, or even desire, to shape it to human needs, rather than corporate ones. And the kids, who usually know what's going on before their parents do, could actually be the canaries in a creepy digital coal mine.

 

 

Photo Credit: Pixabay Image 2564425 by StockSnap, used under the Pixabay License

For quite some time now I have been intimating in this blog that entertainment may not be the most effective way of achieving political goals, due to the way that it can distract its audience from the task of actual political engagement. Thus, I was inevitably struck by Steve Almond's forthright argument to this effect in a recent op-ed for the Los Angeles Times. But while reading Almond's essay I found myself beginning to question my own position, and while I'm not quite ready to abandon it entirely, I do believe that it may need some modification in the light of recent developments in American political culture.

 

To see why, let's start with Almond's thesis. Arguing that the superb political comedy that has erupted in the wake of the Trump presidency has only played into the hands of a man "who relishes and exploits his beefs with comedians . . . [and who] doesn’t see them as degrading the office of the presidency so much as transforming that office into an adjunct of the entertainment industry, where what matters most is your ratings," Almond suggests that the "towering irony here is that the essential mission of comedians in the Age of Trump is identical to that of the man they mock." Thus, both Trump and his opponents "preach that our political and media classes are essentially corrupt. Both use shtick to convert our distress at this dysfunction into disposable laughs. In other words, both turn politics into show business." The upshot of all this, Almond concludes, is that "Halfway through his reign, Trump has reaffirmed a truth that extends from King Lear to Norman Lear: A kingdom that relies on court jesters to confront mad rulers is doomed. The Fool is not a redeemer. His role is to defuse, by means of laughter, the moral distress that presages redemption."

 

In short, comedians like the cast of SNL and Stephen Colbert are making their audiences feel too good to actually go out and do anything (like vote). But there's a certain paradox here, for if Trump used comedy to capture the White House, so too can his opposition. In other words, if Almond's argument is right, it's also wrong. What worked for one side can work for the other. Maybe SNL and Stephen Colbert (et al.) can help lead the revolution.

 

Only the future will reveal whether this will prove to be true, but for now we can take away one certainty from Almond's essay: America's entertainment culture has engulfed our entire society so thoroughly that none of the old barriers between "high" culture and "low" truly exist anymore. Popular culture, with its mandate to entertain, is our dominant culture, for better or for worse.

 

 

Image Credit: Pixabay Image 3774381 by mohamed_hassan, used under the Pixabay License

Twenty-six years ago, almost to the day, I set about rewriting the general introduction to what would become the first edition of Signs of Life in the U.S.A. Seeking something of sufficient magnitude and familiarity to effectively introduce an audience of composition students to the then-unfamiliar (and ostensibly forbidding) field of cultural semiotics, I chose the Superbowl, which, I noted, is "more than just a football game. It's an Event, a ritual, a national celebration, and show-time" for those corporate high rollers who can afford the ever-increasing cost of advertising.

 

As I contemplate the semiotic significance of Superbowl LIII, it's as if I am being visited by the Ghost of Superbowls Past, comparing the present game to those that have gone before and wondering about the future. And at first glance, much remains the same. The Superbowl is still an Event, is still a national ritual, and its advertising has come even closer to overshadowing the game itself, with specially made commercials released in advance, game-time polling to "elect" the most popular ads, and plenty of post-game punditry devoted solely to the advertising.

 

But there is also a detectable difference this time around, a pivot away from the past into an unsettling present in which the words "national celebration" may appear to no longer apply. For Superbowl LIII was as riven by pre-game controversy as it was afflicted by a generally lackluster performance on the field, a disturbing dissonance that makes the Ghost of Superbowl Present a rather ominous apparition indeed.

 

The causes of this dissonance are well known. They include the infamous uncalled pass interference that helped put the Rams into the NFL final and galvanized the city of New Orleans into creating its own game-day counter event—not to mention the filing of a couple of lawsuits. And they also include the ongoing controversy swirling around the Kaepernick-inspired taking-a-knee protests that, having been suppressed by the NFL, resulted in an artist boycott of the half-time show. Which led, in turn, to yet another controversy involving the rather-less-than-household-word band that, so to speak, crossed the picket line to perform.

 

But beyond these more particular conflicts there looms the vast conflict that is America itself today, which no amount of "unity" advertising (one of the notable commercial themes to be found in Superbowl LIII's ad lineup) is likely to disperse. The situation is such that it's ironic now to think how, once upon a time, the Dallas Cowboys could award themselves the distinction of being "America's team," and make it stick. Today such an epithet might be regarded as an oxymoron.

 

Interestingly, one sign of unity that I did detect on Superbowl Sunday appeared in New Orleans itself, where a highly diverse population of all ages turned out for an anti-Superbowl party that really looked to be more fun than the usual script for the conquering-heroes victory parades staged in the cities of the actual winners of the game. Could it be that we have here an example of a way of coming together in a common cause wherein both winning and losing are irrelevant?

 

Alas, no. For the unity displayed on the streets of New Orleans on Superbowl Sunday was motivated by anger and resentment, an us-against-the-world vibe quite in keeping with the overall tenor of American politics these days. The partying crowd in New Orleans had wanted to win, and, being denied their victory, chose defiance.

 

When you add into the mix the elaborate conspiracy theories that enveloped the game—accusations that the Rams/Saints game was rigged by the NFL high command to get L.A. into the Superbowl to help pay for the new five billion dollar stadium being built there—a dark new significance begins to emerge. Indeed, with bizarre accusations that the entire NFL season had been rigged circulating through the Internet, the specter of an America so torn by distrust and disillusionment that even its favorite one-day sports event can't escape conspiratorial contamination rudely enters the picture. If this is the Ghost of Superbowl Present, what will the Ghost of Superbowls Future bring?

 

 

Image Credit: Pixabay Image 3558732 by QuinceMedia, used under a Pixabay License.

What with all the hoopla surrounding Gillette's notorious "toxic masculinity" commercial, I feel almost obliged to address it in this blog. The challenge here is to provide a semiotic angle on the ad's significance without getting tangled up in a debate on what it is trying to say about male behavior. Rather, the semiotic question concerns what the ad is telling us about contemporary American society as a whole, which has gotten me thinking more about razor blades than I have at any time since I stopped shaving in 1979.

 

A shrewd analysis of the ad in the Washington Post has given me a useful opening on the problem, and so I'll begin there. In "What Trump’s fast-food feast and Gillette’s woke razor blades have in common," Sonny Bunch draws an interesting parallel between Donald Trump's fast food spread for the Clemson Tigers and Gillette's ad by arguing that each was choreographed, in effect, to appeal to one side in the current national divide, while aggravating the other. As Bunch puts it, Trump "plays right to his populist strengths, assembling a mélange of foods that every American is familiar with and most Americans have eaten . . . [setting] a perfect trap for his critics, whose sneering at the feast will come off as elitist . . . [and thus playing] up the 'us against them' angle that has formed the heart of his appeal." Gillette, for its part, is playing to "what it hopes to claim as its base: the Ethical Consumer Signaling His Virtue, a valuable subset of customer, as Nike discovered with its Colin Kaepernick campaign." In short, Bunch concludes, "both the Fast Food Feast and Woke Gillette are explicitly designed to inspire mockery and, therefore, implicitly designed to encourage the us-vs.-them dichotomy that defines modern American life."

 

Now, whether or not Gillette harbored any intention to provoke, there is plenty of evidence that its ad certainly did so, as a quick Internet search on the topic makes clear. Quite predictably, one can find conservative media outlets like Fox News railing against the ad, while Vox, for its part, is in accord with the ad's message. The polarization is just as Bunch describes it.

 

All this raises a question, then, about the actual effectiveness of politically provocative advertising in itself. The most common measure of such effectiveness, of course, is financial: that is, whether a provocative ad campaign increases sales and stock valuations for the company that creates it. For example, as I've noted in an earlier blog, the big question surrounding Nike's Colin Kaepernick campaign was what would happen to Nike's stock price. When the stock at first dipped, antagonistic pundits declared the campaign a failure. When Nike's stock recovered, the ad was declared a success. Similarly, Jack Neff at Ad Age observes that, since Gillette's object in its "toxic masculinity" ad is to attract millennials to its products, "the ultimate test of whether Gillette has turned millennials into believers will be sales."

 

Neff, of course, is right, just as anyone is right who argues that Nike's Kaepernick campaign is a success because the company's stock price is currently up. After all, increasing profits is what advertising is for. But does commercial success equate to cultural success? Gillette's claim is that its ad is intended to start a "conversation" about male behavior—presumably in order to do something about it. So, is the Gillette ad successful in that sense?

 

Here the measure of success is much more difficult to determine. Did Coca-Cola make the world a better place with its "I'd Like to Teach the World to Sing (in Perfect Harmony)" campaign? Have the Benetton Group's United Colors and Unhate campaigns achieved their (noncommercial) goals? Will Gillette really start a conversation that makes men behave better?

 

One way to approach this problem is to consider Bunch's contention that the Gillette campaign (and others like it) antagonizes even as it appeals, reproducing the us-vs.-them dichotomy that afflicts the country today. If Bunch is right, Gillette is preaching to the choir, not converting the opposition, and that is hardly likely to improve the situation. Wouldn't a Rogerian approach be more effective?

 

Perhaps, but in the current cultural and political climate, a Rogerian ad campaign probably wouldn't get much attention, thus negating the financial and social goals of a socially conscious corporation. Controversy both sells and rallies the troops, and one can hardly blame Gillette for doing what everyone else is doing.

 

Then there's the whole problem of whether "consumer activism" is itself an oxymoron. The question here is not unlike that posed by the phenomenon known as "slacktivism"—a derogatory term for social media activism that goes no further than clicking "like" buttons, signing petitions, and retweeting political tweets (you can read more about this in Brian Dunning's "Slacktivism: Raising Awareness" in the 9th edition of Signs of Life in the USA). That is, purchasing a product because the company that sells it shares your values (or wants you to believe it does) is an act of consumption, not a direct action, even though buying a product for political reasons may feel like doing something meaningful. But is it?

 

What we have in the end is a powerful signifier of what it means to live in a consumer society. In such a society, consumption, as the measure of all things, is routinely confused with action, so that wearing the T-shirt comes to be regarded as a substantive political act. This sort of thing can be quite good for the corporate bottom line, but whether it is good for democracy is another question.

 

 

Image Credit: Pixabay Image 2202255 by WikimediaImages, used under a Pixabay License

One of the crucial elements in teaching, and performing, popular cultural semiotics is the identification of the larger contexts (or systems of associations and differences) in which particular popular signs may be situated. This means that one must be aware not only of current popular movies, TV shows, consumer trends, etc., in order to conduct semiotic analyses of them, but that one must also be ever attuned to what one might call, for lack of a better term, the "spirit of the age." In this vein, then, my first blog for the new year will constitute a semiotic analysis of the spirit of the digital era, beginning with what will likely appear to be a rather peculiar starting point: namely, the eighteenth-century European Enlightenment.

 

I start here due to a purely fortuitous decision to pull an old book off my shelf last week that I should have read forty years ago but didn't, until now: Garry Wills' Inventing America (1978). Now, I don't want to get involved here in the somewhat controversial thesis Wills proposed about the sources of Jefferson's thought and language when he first drafted the Declaration of Independence—that's something better left for specialists in the field. Rather, I am only interested in the extraordinary revelations of the ins and outs of Enlightenment thinking that Wills masterfully presents. In a word, Wills reveals that behind the Newtonian clockwork universe informing much of Enlightenment discourse was a veritable religion of the number. And I'm not just talking about the quantitative advances that led, towards the end of the eighteenth century, to the Industrial Revolution and the emergence of scientific modernity; I'm talking about the ecstatic belief that Newtonian methods could be applied to the explication of, and solution to, every human problem.

 

Let me offer (courtesy of Wills' ample citations) a particularly striking example. Here is Francis Hutcheson's algebraic formula for the measurement of human morality as presented in the second edition of his founding text for the Scottish Enlightenment, Inquiry into the Original of our Ideas of Beauty and Virtue (1726):

 

M = (B + S) × A = BA + SA; and therefore BA = M - SA = M - I, and B = (M - I)/A

[where B = Benevolence, A = Ability, S = Self Love, I = Interest, and M = Morality].
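
For readers who would like the algebra spelled out, here is a minimal reconstruction of Hutcheson's steps (assuming, as the formula's own substitution of I for SA implies, that Interest is simply Self-Love multiplied by Ability, I = S × A):

M = (B + S) × A = BA + SA

SA = I, and therefore BA = M - SA = M - I

B = (M - I) / A

In other words, Benevolence is whatever portion of Morality remains once Interest is subtracted, divided by Ability.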

 

Actually, there's more to the formula than I've reproduced here, but you'll get the point. Here, from a Presbyterian Divine, we find dramatic evidence of the extraordinary prestige of the Newtonian method: the belief that if Newton could use mathematics to measure and explain the universe, philosophers could do the same in measuring and guiding humanity.

 

Sound familiar? Substitute the words "big data" for "mathematics" and you've got the current zeitgeist in a nutshell. For here too, from Steven Pinker to the purveyors of AI, digital humanists to data analysts, Educause to edX, and so on and so forth ad infinitum across our cultural spectrum, we can find what is effectively a religious faith in the omnipotence of numbers.

 

The Enlightenment accordingly offers a significant point of association to which we can relate our current l’esprit de l’époque. But (and I can never repeat this often enough) the significance of a phenomenon lies not only in what it can be associated with but also in its differences from similar phenomena within its system. And there is a difference between the origins of the enormous cultural prestige enjoyed by Enlightenment mathematics and the origins of twenty-first-century data worship. For while the Enlightenment was wowed by Newton's scientific achievements (achievements, it can be added, that long preceded any large-scale commercial applications), the wow factor today (as I have noted before in my blog on the "religion" promoted by the now-defunct corporation Theranos) derives from the unimaginably huge fortunes that have been made, and will continue to be made, by the corporate masters of big data. Google effectively started it all by finding a way to monetize its free services by tracking our online behavior and selling it to marketers, making personal data the holy grail of post-industrial capitalism (Facebook, of course, is the second biggest name in this tradition). The difference, in a word, is between science and commerce, with the Googleplex and its offspring occupying the cultural role once occupied by Newtonian physics. To put it another way, here is yet another signifier of our hypercapitalist culture.

 

Whether this hypercapitalist faith is a good thing is a value judgment, and since the goal of teaching cultural semiotics is to provide students with the critical equipment necessary to make informed judgments of their own, not to dictate those judgments to them, I will withhold my own here. But I will say this much: Hutcheson's equations, as well-intentioned and nobly founded as they may have been, look pretty silly to us today. And I can't help but wonder how our current data-infatuated zeitgeist will look to future culture critics.

 

 

Image Credit: Pixabay Image 3088958 by xresch, used under a Pixabay License