
Bedford Bits


The news that "Sonic the Hedgehog" had to undergo a substantial CGI redesign after its core audience panned it in the trailers has inevitably reminded me of the fate of the movie version of "Cats," which also had to go back to the digital drawing board in the wake of a disastrous YouTube premiere. But while it appears that the humanoid hedgehog's revamp has been more successful than that of the fluffy felines, the semiotic significance of the viewer-compelled reworkings of these two movies is very much worth exploring.

 

I'd like to begin that exploration with Jean Baudrillard's thesis that in the age of the "sign" (his term for postmodern capitalism), signification is a one-way street, with corporate elites broadcasting their signals (which include everything from billboards to feature-length movies) to a passively receptive audience whose only possible resistance (and a futile one at that) is to vandalize the signal (Baudrillard's example is scrawling a mustache on the "Mona Lisa"). While I've never been a fan of Baudrillard's often unsupported pronouncements, his fundamental point about the top-down vectors of the mass media is a valid one (more or less)—or was when he formulated it. But the fully interactive Internet, with the accompanying rise of social media to worldwide prominence, has changed all that. For now, the mass media aren't one-way streets at all: they are multi-lane superhighways on which the signals are flying in every direction. The medium is no longer the massage (yes, that was McLuhan's actual phrase); it's a democratic free-for-all.

 

That's probably the fundamental takeaway from the "Sonic/Cats" fiascos, but there is a second, rather less inspiring, signification to consider. For the often vitriolic piling-on evident during such eruptions of fan outrage is all too reminiscent of social media "shaming" campaigns, of online bullying and "cancel culture." Certainly the slings and arrows of outrageous Twitter attacks are not going to do any real harm to the well-heeled captains of the entertainment industry (just look at the way that the creators and cast of "Game of Thrones" essentially shrugged off fan demand for a major reset of the blockbuster series' final season), but there appears to be something habit-forming in the generation of social media mobs. Denouncing movies is, in the larger scheme of things, pretty trivial stuff, but the online trials and executions of offending films have to be taken in the context of the vastly more serious campaigns undertaken against vulnerable individuals, who can very definitely be harmed by such outbursts (consider, for example, the recent case of Gayle King).

 

So, as is so often the case with the new media, we are looking at a mixed message here, one that combines a populist liberation of the masses from corporate (and other forms of elite) control with a dark vision of mob rule. And that's no trifle.

 

Photo Credit: Pixabay Image 1174228 by Pixaline, used under Pixabay License

Jack Solomon

"Typical Americans"

Posted by Jack Solomon, Feb 13, 2020

Twenty-six years ago, the introduction to the first edition of Signs of Life in the U.S.A. began with an exploration of the place that the Super Bowl holds in American life and culture, noting how "It's more than just a football game. It's an Event, a ritual, a national celebration, and show-time for . . . corporate high rollers, for whom the game is but a stage" for high-profile branding and marketing. Since I wrote those words, paying almost as much attention to the Super Bowl’s commercials as to the game itself has become a national pastime, with advance ad previews, real-time ad-popularity polls, and post-game ad rundowns, including straight-up semiotic analyses like Eric Deggans's NPR overview of some of the unintended messages from Super Bowl LIV's advertising lineup. And so it is only appropriate, if not downright obligatory, for me to take a look at this year's crop of Super Bowl ads as I write this blog in the aftermath of the game.

 

In general, like many other commentators, I see a lot of companies playing it safe, trying to avoid controversy in an increasingly polarized America by adopting such tried-and-true formulae as featuring popular celebrities in comic narratives—with Bill Murray's star turn in Jeep's "Groundhog Day" spoof probably being the most successful in this regard. With such a lineup, there isn't much room for trenchant semiotic analysis, leaving one simply to decide whether a given ad is funny or not, and then to analyze what makes it funny to see what that might tell us.

 

But one ad did stand out from the play-it-safe crowd this year, an ad that Deggans (in the aforementioned analysis) awarded his own personal "Oddest use of vaguely nationalistic language to sell beer" award. Yes, you've probably already guessed which one, if only from the title of this blog: Budweiser's "Typical American" spot. And since Deggans got to it first, I'll begin my analysis of this interesting outlier with a complete quotation of what he has to say about it:

 

The self-styled "king of beers" offers some championship-level pandering in this ad, which features a gritty-voiced announcer sarcastically noting how "typical Americans" are always showing off their strength—as images of a heroic firefighter in action play across the screen. The ad urges viewers to celebrate the nobility of "typical Americans." But I couldn't help wonder who the narrator was referencing when he said, "they call us 'typical Americans.' " Who exactly is "they?" And why is Budweiser developing brand loyalty by urging "typical Americans" to rise up against this unnamed source of insult? Vaguely nationalistic, condescending and solicitous all at once—hardly a regal combination.

 

Well, I think that Deggans is right on target, and he asks exactly the right questions: namely, "Who exactly is 'they,'" and "why is Budweiser developing brand loyalty by urging 'typical Americans' to rise up against this unnamed source of insult?" So, these are the questions that my own analysis will seek to answer.

 

It's important to begin here with an acknowledgment that the creators of “Typical Americans” clearly went to great lengths to avoid cultural controversy by packing their panoramic survey of "typical Americans" doing their noble thing with an inclusive range of performers. From its multi-racial casting, to its inclusion of a genuinely disabled athlete, to its celebration of the 2019 Women’s World Cup champions, the ad tries very hard to appeal to all of America without privileging any particular group. This isn't to say that everyone was included, but a fairly wide tent is definitely intended.

 

Still, while trying to project inclusivity, "Typical Americans" does set up an "us vs. them" dynamic in its voice-over narrative, a monolog simply dripping with sarcastic allusions to what "they" say about "us," along with a lot of visual refutations of “their” opinions. So, indeed, as Deggans asks, who are "they," precisely?

 

We can answer this question by situating it within the history of what I will call "the trope of the 'ugly American.'" Going back at least as far as nineteenth-century British attitudes towards their former colonies (Dickens is especially scathing in this regard), and coming to full maturity in the post-World War II years when America emerged as a superpower, "the ugly American" trope evokes an America that is fundamentally gauche, impolite, raw, uncultured, and uncivilized. "Typical Americans" alludes to this history in order to unite Americans against those who just don't understand us, don't get it, and can't be expected to get it. In short, the rest of the world. Which leads us to Deggans's second question: why did Budweiser make such an appeal in order to sell beer?

 

Here we can consider Budweiser's lengthy history of populist advertising. When, for example, in the boom-boom, go-for-the-gold 1980s, appeals to high status and wealth in advertising were quite common (consider Michelob's "Have It All" campaign), Budweiser was telling its consumers that "this Bud's for you," while featuring images of working-class Americans at work, at play, and in bars. In the light of this history, then, we can see that "Typical Americans" is presenting a new riff on an old Budweiser theme, spinning a populist narrative with (as Deggans recognizes) a distinctly nationalistic topspin. In so doing, the ad is trying to unify Americans at a time of disunity, make them feel good about being Americans, and so (not at all coincidentally) feel good about buying America's most popular/populist beer.

 

But there's a catch, something that I believe the ad's creators did not anticipate. For populism these days—especially when combined with overt nationalism—is evocative of an all-too-evident us-vs.-them dynamic that is currently driving America’s electoral politics. And thus, especially in a presidential election year like this one, the ad's attempt to unify Americans is bound to backfire, dividing those viewers who identify with nationalistic populism from those who don’t. This will still sell a lot of beer, of course, but not, perhaps, in the spirit that “Typical Americans” intended.

 

Photo Credit: Pixabay Image 3561339 by QuinceMedia, used under Pixabay License

As the countdown to the annual Academy Awards ceremony ticks away, I'm seeing more and more articles in the Entertainment section of the Los Angeles Times like this one, whose aim is to predict the outcomes of the votes for the prized categories of individual performance and best picture. What all these columns have in common is the way that they are essentially using a simplified form of predictive analytics in their prognostications, aggregating data from past awards seasons to pronounce what is going to happen this time around. You know the sort of thing: if Brad Pitt wins a SAG, his Oscar's in the bag. Grab a Golden Globe, and your picture's Oscar gold. Even when the statistics aren't all that compelling (as in the seven-out-of-twenty-five correlation between winning an acting SAG and an Oscar as reported in the Times article linked to above), stats are equated with destiny, much in the same way that presidential polls are touted as sure-fire crystal balls into the future—and you know how well that worked out in 2016.
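
For readers who want to see just how thin that kind of forecasting is, here is a minimal sketch of the arithmetic involved, written in Python. The variable names are my own invention, and the only figures used are the seven-of-twenty-five numbers reported in the Times piece.

    # The "simplified predictive analytics" of awards handicapping, reduced to
    # its core: a conditional win rate computed from past seasons.
    sag_and_oscar_wins = 7     # past cases in which the SAG winner also took the Oscar
    sag_wins_considered = 25   # past SAG winners in the sample

    conversion_rate = sag_and_oscar_wins / sag_wins_considered
    print(f"SAG-to-Oscar conversion rate: {conversion_rate:.0%}")  # prints 28%

A 28 percent base rate is the kind of "correlation" that the columns nonetheless treat as destiny.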

 

Reliable or not, however, this preference for data over careful analysis of the relative aesthetic merits of the various contenders for the big film prizes bears cultural significance. For this is the era of Big Data, an age when crunching numbers fuels vast advertising, educational, and AI enterprises that not only make piles of money for those who run them but are also, and more significantly, widely believed to hold the key to solving all our problems, including "fixing" our higher education system. (Remember how robo-grading was going to liberate—or perhaps more accurately, eradicate—writing instructors? Or MOOCs?) But no amount of disappointing results seems to dampen what is almost a religious enthusiasm for the power of aggregated data. And now this faith appears to have engulfed the traditional ritual of handicapping the Oscars.

 

Which leads to a second significance. Because even if the big promises of Big Data haven't exactly been met quite yet, there are serious problems that it could address (it sure would be nice, for example, if the power of aggregated data could be applied to reversing global warming); guesstimating the Oscar awards in advance isn't one of them. So the fact that entertainment writers are employing the techniques of data-driven analytics to prognosticate who's going to win what indicates that their readers care so much about the outcome of a ritual whereby an ultra-exclusive club celebrates itself through an annual awards ceremony (complete with royal red-carpet treatment) that they are eager for any glimpse into the future that they can get. And in a data-driven era, it should come as no surprise that data-driven Oscar augury has become, as they say, a "thing."

 

Photo Credit: Pixabay Image 3679610 by analogicus, used under Pixabay License

 

Now that the Academy Awards sweepstakes for 2020 is in full cry, with the Golden Globes functioning as a kind of stand-in for the Iowa Caucuses in the tea-leaf-reading business of trying to guess which picture is going to win the golden statuette, it seems to be a good time to have a semiotic look at Quentin Tarantino's latest entry into the annals of Hollywood cool. Keep in mind that the purpose of such an examination is not the same as that of a film review: how well a movie does what it does is not equivalent to what it signifies. After all, the epic critical and commercial failure of Cats, the movie, has been due, one might say, to a massive, across-the-board wardrobe malfunction, not to anything that it might signify culturally. Conversely, a film can successfully accomplish what it sets out to do, winning commercial and critical acclaim along the way—as Once Upon a Time in Hollywood has certainly done—and still pose interesting problems from a cultural standpoint. And that is something Once Upon a Time in Hollywood does, as well.

 

While it isn't what I wish to focus on here, it would be remiss of me not to mention the most common semiotic critique of the film that I have seen so far: that is, the way that it celebrates the days when men were men and completely dominated the entertainment industry. While I'm a little surprised that I haven't seen any mention of the way that this rather archetypal male buddy flick appears to be a self-conscious effort to reproduce the star power of Robert Redford's and Paul Newman's collaboration in Butch Cassidy and the Sundance Kid, which (coincidentally?) was released in 1969 (the time frame of Once Upon a Time in Hollywood), Tarantino's nostalgic homage to a perhaps not-so-bygone era has certainly not gone unnoticed.

 

But what really strikes me here is what Tarantino does with history. Yes, I know that we are forewarned: the "once upon a time" in the title not only alludes to Sergio Leone's Once Upon a Time in the West, it also clues us into the fact that this is a fairy tale, a fantasy from which we should not demand historical accuracy. And after all, Tarantino is famous for playing around with the facts, having already created such revisionist revenge fantasies as Django Unchained and Inglourious Basterds. So deciding to completely rewrite the history of the Manson Family and the Tate-LaBianca murders is quite in character for Tarantino, whose audiences have come to expect this sort of thing from him.

 

Well, no one ever said that Hollywood is a history department, and I am under no serious apprehension that anyone is going to walk away believing that the Manson murders did not take place. The reversal of history presented in the movie is so total that it does not pose the problems raised by ostensibly "historical" films that get the history wrong. As I've said, Once Upon a Time in Hollywood is a fairy tale, not a documentary.

 

Still, when we pull back from the film to look at the context (or historical system) in which it appears, a somewhat less reassuring significance begins to loom. For this is the age of "fake news" and conspiracy theories, a time when large groups of people, quite deliberately, invent their own "truth" (what Stephen Colbert has satirically called "truthiness") from which they cannot be shaken, no matter how much evidence can be produced in contradiction to their claims. So while there is no risk that the fantasy presented in Once Upon a Time in Hollywood will ever be taken seriously as a historical text, its substitution of wish-fulfillment for the grim facts of history is very much in keeping with the times. In this sense, the movie is a sign—a symptom, not a cause—of the postmodern tendency to regard historical truth as something that is always open for negotiation, with reality itself, as Jean Baudrillard always insisted, being nothing more than a simulacrum of a simulacrum—indeed, one might say, a Hollywood movie about Hollywood movies.

 

Photo Credit: Pixabay Image 2355686 by Wokandapix, used under Pixabay License

Jack Solomon

Why Richard Jewell Now?

Posted by Jack Solomon, Dec 19, 2019

In my last blog, I presented a semiotic interpretation explaining how the movie Ford vs. Ferrari reflects a larger cultural signification that goes well beyond the history of Le Mans racecars and their driver/designers. I wish to do something like that in this analysis by looking at the current release of Clint Eastwood's biopic Richard Jewell, a film that, on the face of it, would appear to focus on a rather unlikely subject for mass-market cinematic appeal. But, as we will see, the time is just as ripe for Richard Jewell as it is for Ford vs. Ferrari, and what the two movies have in common says a great deal about the current state of American consciousness.

 

To begin, then, when I first started seeing promotional billboards for Richard Jewell while driving to work, I had no idea who Richard Jewell was and why there should be a movie about him. It isn't that I have forgotten the bombing at the 1996 Atlanta Olympics, nor Jewell's ordeal when he went practically overnight from hero to suspect; I simply hadn't remembered the name. And I’m probably not alone in that.

 

But this is what makes the appearance of this film so semiotically interesting. Biopics are usually about household names, and "Richard Jewell" is not exactly a household name. We can (as I do) deeply sympathize with what he had to go through, but his experience doesn't rise to the historical level of, say, the infamous railroading of Captain Alfred Dreyfus. So why, I wondered, was this film made at all? Who was its intended audience?

 

A clue to the matter can be found in a description of the movie that appears on the main page when you perform a Google search on Richard Jewell. Here's what I've read: "American security guard, Richard Jewell, heroically saves thousands of lives from an exploding bomb at the 1996 Olympics, but is unjustly vilified by journalists and the press who falsely report that he was a terrorist."

 

Note the emphasis in this plot description on "journalists and the press," which ignores the role of the FBI, whose leaks brought Jewell into the glare of the public spotlight in the first place. Note also how the movie has already raised a good deal of controversy for the way that it treats the Atlanta Journal-Constitution reporter Kathy Scruggs, who first broke the story. Put it all together and an explanation for what Richard Jewell semiotically signifies begins to emerge.

 

For Richard Jewell tells the story of an ordinary lower-middle-class man who was nearly destroyed by the actions of what the current populist political movement in America widely regards as the "media elites." Such a movie is tailor-made for such viewers, who will identify with the "ordinary Joe" figure of Richard Jewell and see in his suffering a proof of their suspicions. And it may be no accident that the release of the film was timed for the run-up to a presidential election that will pit a populist favorite against the "elites" that its adherents fear.

 

These same viewers, on a more positive but related note, tend to regard people like Carroll Shelby as cultural heroes and identify with them. Muscle cars, NASCAR, automobiles cherished as signs of freedom and prosperity: all these phenomena are touchstones for an America in which Shelby's triumph over the elites at Ferrari is the dream version of Richard Jewell's personal nightmare. Ford vs. Ferrari, one might say, is simply the sunny inverse of Richard Jewell.

 

Further evidence for my interpretation lies in the fact that Clint Eastwood made the movie. This is not an attack on Eastwood: my point is that his films particularly appeal to populist audiences (consider the success of Sully, and, more strikingly, American Sniper). Please also note that I am not saying that making movies with populist appeal is a bad thing. After all, Michael Moore has made a career out of his own sort of populist vision, albeit one that is diametrically opposed to the kind of populists I am writing about here. The key point to keep in mind when teaching cultural semiotics is that semiotic analyses are not political judgments. They simply try to explain what things mean.

 

Photo Credit: Pixabay Image 1608127 by YazanMRihan, used under Pixabay License

By the time of the 1980 Winter Olympics in Lake Placid, New York, America had just concluded a bad decade. Watergate, Kent State, rampant inflation, the abject failure of the Vietnam War, Soviet adventures in Afghanistan, and the Iran hostage crisis had all taken their toll on the national morale, and people were feeling rather down and morose. So, when a young and inexperienced American hockey squad defeated the defending-champion Soviet team in what wasn't even the gold medal game, the country seized upon the occasion as grounds for a national celebration. To judge from the level of elation and patriotic pride that greeted the victory, one would have thought that America had just won the First and Second World Wars combined, as "going for the gold" became the unofficial national motto for what would soon be known as the Reagan eighties.

 

America, of course, isn't the only country that treats international sports victories as surrogates for military success—after all, that is what the original Olympic games were for—but it does make the most movies about such events. And it is in this context that we can understand the enormous popularity of the recently released Ford vs. Ferrari. So, let's have a look.

 

Ford vs. Ferrari belongs to a genre of sports movies—often "based on a true story"—in which the protagonist beats the odds to achieve some kind of athletic triumph or other, a genre that includes such fictional films as Rocky and Breaking Away, and such real-life movies as Fighting Back: The Rocky Bleier Story and Rudy (Chariots of Fire is a British version of this sort of thing). The key thing about these movies is their focus on an underdog, someone (or an entire team) whose eventual triumph serves as a kind of parable of the American dream. At first glance, then, Ford vs. Ferrari would seem to belong to a very different kind of sports film, pitting the gigantic Ford Motor Company against a boutique Italian race car manufacturer, but by pitting two aging driver/designers against the Ferrari legacy of track dominance, the movie manages to create a David vs. Goliath scenario after all, with Carroll Shelby and Ken Miles playing the role, in effect, of America's Lake Placid hockey team, and Ferrari standing in for the Russians.

 

Which explains why such a film would be conceived and produced now, and why it should be such a success. Because America is having another bad decade. Increasingly divided along ideological lines, still suffering from the after-effects of the Great Recession, watching the war on terrorism turn into an infinity war, and nervously monitoring the rise of China to superpower status, Americans are badly in need of a good shot in the arm. Enter Hollywood, right on cue, with just what the country needs, remaining true to the slogan that in America, when the going gets tough, the tough make movies.

 

Now, if only they could come up with something to make us feel better about the global climate crisis that our love affair with the internal combustion engine has helped create. Hmmm. I wonder if Greta Thunberg plays hockey.

 

Photo Credit: Pixabay Image 152088 by OpenClipart-Vectors, used under Pixabay License

Jack Solomon

Angry Birds

Posted by Jack Solomon, Nov 14, 2019

No, this blog is not going to be about that wildly popular video game and movie franchise; it’s about Twitter and some very distinct signs of life in the USA that may be found there in some rather unexpected places. So here goes.

 

Actually, I almost wrote on this very topic last year during the horrendous Woolsey fire outbreak in Southern California, when Sonia and I faced a mandatory evacuation from our home—along with a lengthy power outage—during which we received most of our emergency information from a battery-powered laptop computer. Since I was already aware then that in the digital age the most up-to-date and accurate news in an emergency situation is likely to be found on social media sites such as Twitter, rather than from the traditional mass media news outlets, I stayed tuned in to Twitter during the entire ordeal, following a string of hashtags as an endless stream of postings flooded the site. But even as I pored through post after post to get the latest information on the fire, I noticed a number of things that set off semiotic sirens, things I planned to write about when the smoke, quite literally, settled.

 

When it was all over, however, I decided that maybe writing about my observations was premature, and that it would be better to wait and see what further signs I might detect that could be entered into the semiotic system within which they could be interpreted. Frankly, I rather hoped that I wouldn’t experience such an opportunity again and that I could just let the topic go. But now, exactly a year later, with fires breaking out all around me and Twitter, once again, being my best source of information, I find that everything I noticed last year is being repeated, only more so. Hence this blog.

 

I first want to make it clear that my analysis to follow is not a critique of Twitter. Twitter is just the medium here; the message lies in the human use of that medium—what I referred to in my last blog as the behavior that cultural semiotics interprets. And here’s what that behavior reveals:

 

First: even in an emergency situation, when lives and property are at stake, people are going to take to social media to promote their own personal agendas. I’ll call the phenomenon “hashtag spamming,” and it runs the gamut from people who have something to sell (and this includes sex workers) to people with a political axe to grind (which during the current fire emergency has included someone who could most charitably be described as an Armenian Genocide denier).

 

Beyond the hashtag spammers are those who view a natural disaster as a good time to start or get into a fight about Donald Trump, or global climate change, or any other particularly divisive topic. One sees this, of course, everywhere on the web, where America’s ideological divisions are continuously on display in an ever-escalating fashion, but it is striking to find it going on in the midst of a natural disaster.

 

Also worth noting is the way that some people keep retweeting information from both official emergency services sources and conventional news media. In almost every case, such retweeters evidently mean well, but what they do is repeat information that can be dangerously misleading because it is completely out of date during a fast-developing fire outbreak. Thus, one finds the same dramatic fire images that commercial news sources feature in order to fan the flames, if you will, of viewer attention, repeated again and again when those images are no longer accurate representations of the most current conditions—this sort of thing is exacerbated, of course, by television news reporters who use Twitter to promote their stations.

 

There is also a sentimental set of signifiers to consider. These are the posts from people who also mean well, but who clutter up the page with expressions of their emotional responses to the catastrophe. When one is in a hurry to find out exactly what is happening in a fast-moving situation, such posts are actually counter-productive, and can get in the way of the truly informative posts in which individuals supplement the official information about emergency services (including evacuation center locations and assistance with moving pets and large animals out of the fire zone) with tips and offers of help of their own.

 

You put all of this together and a very profound signification emerges about the power of interactive media. Quite simply, we find here the enormous desire of ordinary people to have a voice in a world where wealth and power are otherwise being consolidated in ever-shrinking enclaves of geopolitical privilege. Sometimes that voice is used for selfish, and even pernicious, purposes, and sometimes it reflects genuine altruistic, and even heroic, ends. To paraphrase Dickens' famous opening to A Tale of Two Cities, Twitter, like the Internet at large, presents us with the best of times and the worst of times. It is at once of essential utility and plagued with behaviors that—to cite Nietzsche this time—can best be described as "human, all too human."

 

It is easy to take the new realities of the digital era for granted, but the ability to participate directly in the world of mass media is still a very new thing in human history. The hierarchical, top-down structures of the past have been deconstructed, and people from all over the world are rushing in to fill the spaces that were once denied to them. The effects of this new capability are beginning to appear in the form of an increasing political polarization that can be found worldwide, for the moderating influence of the traditional commercial news media, which skew to the center in order to promote optimal audience share, is being lost. It was once assumed that this new state of affairs would most benefit left-of-center politics, but so far the evidence is that politics-by-Twitter has been more effectively employed by the right. The pendulum may swing back as we look to the 2020 election, but either way (and here comes my last literary allusion) it is the center that cannot hold.

 

Photo Credit: Pixabay Image 1917737 by geralt, used under Pixabay License

When Sonia and I began working on the first edition of Signs of Life in the U.S.A. in 1992, semiotics was still regarded as a rather obscure scholarly discipline generally associated with literary theory and linguistics. It also was quite literally unheard of to attempt to employ semiotics as a model for critical thinking in first-year composition classes, and Chuck Christensen, the Publisher and Founder of Bedford Books, was rather sticking his neck out when he offered us a contract. To help everyone at Bedford along in the development process of this unusual textbook, he asked me to provide a one-page explanation of what semiotics actually is, and I responded with a semiotic analysis of the then-popular teen fashion of wearing athletic shoes—preferably Nikes—with their shoelaces untied. That did the trick and Sonia and I were on our way.

 

As you may note, the focus of my semiotic explanation for the Bedford folks was on an object (athletic shoes), with the intent of demonstrating how ordinary consumer products could be taken as signs bearing a larger cultural significance. This was quite consistent with semiotic practice at the time in the field of popular cultural studies, which frequently analyzed cultural objects and images. But even then I knew that the real focus of cultural semiotics in Signs of Life was human behavior as mediated by such things as fashion preferences, and with each new edition of the book, I have been further refining just what that means.

 

And so, as I work on the tenth edition of the book, I have come to realize that the semiotic analysis of cultural behavior bears a close relationship to the science of artificial intelligence. For just like AI, the semiotics of human behavior works with aggregated patterns based upon what people actually do rather than what they say. Consider how the ALEKS mathematics adaptive learning courseware works. Aggregating masses of data acquired by tracking students as they do their math homework on an LMS, ALEKS algorithmically anticipates common errors and prompts students to correct them step-by-step as they complete their assignments. This is basically the same principle behind the kind of algorithms created by Amazon, Facebook, and Google, which are designed to anticipate consumer behavior, and it's also the principle behind Alexa and Siri.

 

Now, semioticians don't spy on people, construct algorithms, or profit from their analyses the way the corporate titans do, but they do take note of what people do and look for patterns, creating historically informed systems of association and difference in order to provide an abductive basis for the most likely, or probable, interpretation of the behavior that they are analyzing—as when, in my last blog, I looked at the many decades over which the character of the Joker has remained popular in order to interpret that popularity.

 

Now, to take another fundamental principle of cultural semiotics—that of the role of cultural mythologies in shaping social behavior—one can anticipate a good deal of resistance (especially from students) to the notion that individual human behavior can be so categorically interpreted, for the mythology of individualism runs deep in the American grain. We like to think that our behavior is entirely free and unconstrained by any sort of mathematically related probabilities. But it wouldn't bother a probability theorist, especially one like Sir David Spiegelhalter, a Cambridge University statistician, who has noted that “Just as vast numbers of randomly moving molecules, when put together, produce completely predictable behavior in a gas, so do vast numbers of human possibilities, each totally unpredictable in itself, when aggregated, produce an amazing predictability.”
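
Spiegelhalter's point is easy to see in a toy simulation. The following Python sketch is my own illustration, not anything drawn from his work; the 60/40 weighting and the population size are arbitrary assumptions chosen simply to make the pattern visible.

    # Individually unpredictable choices, collectively predictable aggregates.
    import random

    random.seed(2020)

    def share_choosing_a(population_size, probability_a=0.6):
        """Fraction of a simulated population making choice A, one random decision per person."""
        return sum(random.random() < probability_a for _ in range(population_size)) / population_size

    for trial in range(1, 6):
        print(f"Trial {trial}: {share_choosing_a(100_000):.3f} of the population chose A")
    # Any one person is a 60/40 toss-up; the population-level share lands
    # within a fraction of a percentage point of 0.6 every time.

That stability of the aggregate, rather than any claim to predict a particular individual, is what the probability curve discussed below rests on.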

 

So, when we perform a semiotic interpretation of popular culture, we are on the lookout for that probability curve, even as we anticipate individual outliers and exceptions (which can themselves point to different patterns that may be equally significant in what is, after all, an overdetermined interpretation). But our goal as semioticians is to reveal the significance of the patterns that we find, not to exploit them, and thus, perhaps, to help modify those behaviors that, all unawares, are doing harm.

 

Photo Credit: Pixabay Image 2587756 by Stock Snap, used under Pixabay License

Hailed as a "must-see" movie for the après-weekend water cooler crowd, and warily monitored by everyone from local police departments to survivors of the Aurora, Colorado massacre, Joker has surpassed its opening box office predictions and has already succeeded in becoming the current cinematic talk of the town. Such movies always make for essay and discussion topics that engage students, and I expect that many college instructors across the country are already crafting assignments about this latest installment in the comics-inspired universe of Hollywood blockbusters.

 

Many such assignments will likely invite debates on the advisability of making a movie like Joker in the light of an epidemic of lunatic-loner mass shootings, while others (especially in film studies departments) will focus on the revival of the Scorsese/De Niro "character study" formula that made Taxi Driver a movie classic (heck, Joaquin Phoenix even channeled his inner De Niro by losing a ton of weight, Raging Bull style, for the role, and, of course, De Niro's in the film too). But a cultural semiotic analysis of the movie would take a different approach, which I will sketch out here.

 

To begin with, we can ask the question, "what does the enduring popularity of the Joker in American popular culture tell us?" For alone among the multitudinous villains of comic book history, the Joker returns again and again, now to star as the protagonist in his own feature film. Where's the Penguin, we might ask, or Clayface? What is it about this character that has captured the American imagination?

 

As with any semiotic analysis, let's start with the history of the Joker. In the beginning he was a Dick Tracy-like gangster in the tradition of Conan Doyle's evil genius Professor Moriarty, heading his own organized crime syndicate. Given a camped-up star turn in the Batman TV series of the 1960s, the Joker joined with Burgess Meredith's Penguin and a host of other really funny, but essentially harmless, villains in the days when fist fights (SMASH! BAM! POW!) were considered sufficient violence for a prime time children's television program.

 

The key change in the Joker's portrayal (the critical semiotic difference) came in the 1980s, when Frank Miller and Grant Morrison darkened the scenario considerably, turning the quondam clown into a psychopathic killer. This was the Joker that Jack Nicholson channeled in Tim Burton's Batman, and which Heath Ledger took further into the darkness in The Dark Knight. It's important to point out, however, that while Nicholson's Joker is a merciless killer, he is also very funny (his trashing of the art museum is, um, a riot), and his back story includes an acid bath that has ruined his face, providing a kind of medical excuse for his behavior. Ledger's Joker, on the other hand, isn't funny at all, and his unconvincing attempt to attribute his bad attitude to childhood abuse isn't really supposed to be taken seriously by anyone. The point is simply that he is a nihilistic mass murderer who likes to kill people—even his own followers. And unlike the past Jokers, he isn't in it for the money, incinerating a huge pile of cash with one of his victims tied up at the top to prove it.

 

The trajectory here is clear, and the makers of Joker were very well aware of it. Rather than turn back the clock to introduce a kinder, gentler Joker (you're laughing derisively at the suggestion, and that's precisely my point), Todd Phillips and Scott Silver quite knowingly upped the ante, earning an R-rating that is quite unusual for a comics-themed movie. Well, Deadpool got there first, but that's part of the point, too.

 

For in spite of the film's attempt to pass itself off as a study of the pathologizing effects of socioeconomic inequality, that isn't its appeal at all, and it doesn't explain why this particular character was chosen to be the protagonist. Just think, what if someone made a movie called Marx: the Alienation Effect in Contemporary Capitalism, based on the best-seller Das Kapital? No, I'm afraid that the Joker's popularity isn't political in any truly revolutionary sense. He's way too much of a loner, and too weird. There's something else going on here.

 

Before one succumbs to the temptation to simply say that Joker is a movie for psychopathic wannabes, let's just remember that the domestic box office for the film's first weekend was 96 million dollars. There just aren't that many psychopaths out there to sell so many tickets. No, the desire for an ever-darkening Joker is clearly a very widespread one, and the success of the aforementioned Deadpool franchise—not to mention Game of Thrones' wildly popular funhouse-mirror distortions of Tolkien's primly moralistic Middle-earth—only amplifies the evidence that Americans—especially younger Americans—are drawn to this sort of thing. But why?

 

I think that the new detail in the Joker's origin story that is introduced in the movie, portraying him as a failed standup comic and clown, is a good clue to the matter. We could say that Arthur Fleck's great dreams—at least in his mind—have been betrayed, and there's a familiar ring to this as a generation of millennials, burdened with college debt and college degrees that lead nowhere, faces a country that many feel is betraying them. It is significant in this regard that the darkening of the Joker began in the 1980s, the decade when the American dream began to crumble under the weight of the Reagan tax cuts, massive economic "restructuring," and two recessions from which the working and middle classes never fully recovered. What happened in response wasn't a revolution: it was anger and despair, spawning a kind of Everyman disillusionment with traditional authority (including moral authority), conspiracy theories, and fantasies of breaking loose and taking things into one's own hands.

 

Which makes characters like the Joker rather like Breaking Bad's Walter White, whose response to economic disruption was to become disruptive. White's Everyman revolt didn't instigate an epidemic of middle-class drug lords; it simply entertained an angry America with the trappings of vicarious fantasy. The success of Joker just a few years after the end of Heisenberg shows that the fantasy is getting darker still.

 

Smash. Bam. Pow.

 

 

Photo Credit: Pixabay Image 1433326 by annca, used under Pixabay License

Jack Solomon

The Panopticon 2.0

Posted by Jack Solomon, Oct 3, 2019

Michel Foucault's application of Jeremy Bentham's panoptic proposal for prison reform to the modern surveillance state has become a commonplace of contemporary cultural theory. And heaven knows that we are being watched by our government, by our computers, by our phones, and televisions, and automobiles, and goodness knows what else. It is also no secret that current and prospective employers monitor the social media imprints of their current and prospective employees—all those promises of airtight privacy settings and Snapchat anonymity notwithstanding. As I say, all this has become a commonplace of life in the digital era.

 

But a new wrinkle has entered the picture, a fold in the space/time fabric of modern life if you will, whereby the pre-digital past has come to haunt the digital present. For as the governor of Virginia and the prime minister of Canada now know to their cost, what goes into your school yearbook doesn't stay in your school yearbook. And thanks to an array of yearbook-posting alumni websites, anyone with an Internet connection can access virtually anyone's yearbook and immediately expose online those embarrassing moments that you thought were safely hidden in the fogs of time.

 

(A parenthetical autobiographical note: I would be highly amused if someone dug up my high school yearbook—yearbooks, actually, because I was on the staff for three years, the last two as editor-in-chief. The first of the three was a conventional celebration of football players, cheerleaders, and homecoming royalty, but I changed all that in the next two when I got editorial control, dedicating the first of them to the natural environment—including two photo essays complete with an accompanying poetic narrative—and devoting the second to a contemplation of the mystery of time itself, which included repeating reproductions of El Greco's "Saint Andrew and Saint Francis" intended to convey an ongoing dialog between a wise man and a seeker of temporal wisdom. You get one guess as to why I don't have to worry about any embarrassing party pics in my yearbooks.)

 

So it isn't enough to cancel your Twitter account, max out your privacy settings on Facebook (good luck with that), or simply take a long vacation from the Internet, for the Net's got you coming and going whatever you do. I expect that one's reaction to this state of affairs (which is itself of semiotic interest) is probably generational; that is, if you grew up with the Internet, none of this is likely to be particularly alarming, but if you remember the days when personal privacy was at least a value (if not always a reality), then it could be disturbing indeed. And there is no escaping the situation, for just as it is impossible to avoid the consequences of major cyber hacks by refusing to conduct any of your business affairs online (if you have any sort of bank account, credit/debit card, health record, or social security number, you are vulnerable no matter how hard you try to live outside the Web), there is no controlling what may surface from your past.

 

Photo Credit: Pixabay Image 4031973 by pixel2013, used under Pixabay License

Jack Solomon

America's Got Sentiment

Posted by Jack Solomon, Sep 19, 2019

As Sonia Maasik and I work to complete the tenth edition of Signs of Life in the U.S.A., I have been paying special attention to American popular music, which will be a topic for a new chapter that we're adding to the book. While our approach will be semiotic rather than esthetic, part of my research has involved listening to music as well as analyzing its cultural significance, and as everyone knows, there's nothing like YouTube to put just about everything you want to hear at your literal fingertips. Which brings me to the subject of this blog.

 

Well, you know how YouTube is. Even as you watch one video you are regaled with a menu of others that can take you on a merry chase following one musical white rabbit after another. And so it came to pass that I found myself watching some famous clips from the Britain's Got Talent and America's Got Talent franchises. Which means that I finally saw that audition of Susan Boyle's, which, while it wasn't a joke, started the whole world crying. With joy.

 

Talk about fairy-tale happy endings! Take a little Cinderella, mix in the Ugly Duckling, and sprinkle in a lot of A Star is Born, and there you have the Susan Boyle story. I'd say that you couldn't make this sort of thing up, except for the fact that it has been made up time and again, only this time it's true. And it helps a lot that the woman can really sing.

 

The semiotic significance of this tale is rather more complicated than it looks, however. On the surface, it looks simply like a sentimental triumph of authenticity over glitter, of the common folk over entertainment royalty. And, of course, that is a part of its significance—certainly of its enormous popular appeal. Just look at the visual semiotics: the glamorous judges, sneering Simon (I'm certain that he has made himself the designated bad guy to add melodrama to the mix), and the audience on the verge of laughter in the face of this ungainly, middle-aged woman who says she wants to be a star. And then she blows the house away.

 

But here is where things get complicated. For one thing, even as the celebrity judges fell all over themselves confessing to Ms. Boyle how ashamed they felt for initially doubting what they were about to hear, they managed to imply that it would have been OK to ridicule her if it had turned out that she couldn't sing, that losers deserve to be humiliated. After all, that's what those buzzers are for.

 

And then there is the notoriously oxymoronic nature of reality television, its peculiar mixture of authenticity and theatricality, its scripted spontaneity. One begins to wonder what the judges knew in advance about Susan Boyle; certainly she didn't get to that stage of the competition by accident. For to get past the thousands of contestants who audition in mass cattle calls for these shows, you have to have something that the judges want, and this can include not only outstanding talent but unexpectedly outstanding talent, the ugly ducklings that provide plenty of occasion for all those dewy-eyed camera shots of audience members and judges alike who are swept away by the swans beneath the skin. The whole thing has become such a successful formula for the franchise that when, a few years after the Susan Boyle sensation, a soprano/baritone duo named Charlotte and Jonathan came onto the stage, Simon Cowell made sure to quip, in a loud stage whisper to the judge beside him, "Just when you think things couldn't get any worse" (funny how the camera caught that), only to have Jonathan steal the show with a breathtaking performance that Sherrill Milnes might envy. Call me cynical, but somehow I think that Cowell knew perfectly well what was going to happen.

 

But let's not forget the designated duds either, the poor souls who get picked out of the cattle calls in order to be laughed at later, to be buzzed off the stage. After all, with so many truly talented people in the world, surely there would be enough to have nothing but superb performers on these shows. But failure is part of the formula here as well as success, for schadenfreude, too, sells. 

 

So the semiotic question isn't whether Susan Boyle can sing; nor is there any question that without Britain's Got Talent she would almost certainly not be enjoying a spectacular career. The semiotic question involves what is going on when television shows like Britain's Got Talent and America's Got Talent play upon the vicarious yearnings of their viewers to shine in the spotlight in a mass society where fewer and fewer such opportunities really exist—even as those same viewers sneer at the failures. Thus, as with so much of reality television, there is an uncomfortable love/hate relationship going on here, a sentimental identification with the winners alongside a derisive contempt for the losers. And in a ruthlessly hyper-competitive society where more and more people through no fault of their own are falling into the loser category, this is of no small significance.


And I have to add that I'm certain that if a young Bob Dylan or Leonard Cohen had appeared on America's Got Talent, either would have been buzzed off the stage.

 

 

Photo Credit: Pixabay Image 1868137 by Pexels, used under Pixabay License

With television's arguably most prominent dramatic series ending amidst the ashes of King's Landing and the outrage of many of its most loyal fans (including a remarkable Change.Org petition demanding an entire Season 8 redo), I find myself reminded of Frank Kermode's classic study, The Sense of an Ending (1967). Exploring the ways that human beings use storytelling in order to make sense of their lives and history, Kermode focuses his attention on the "high art" literary tradition, but the same attention can be paid to popular art as well in ways that can explain, at least in part, the extraordinary reaction to GoT's final two episodes. Here's how.

 

First, let's note that fan pressure on creative artists is nothing new. Charles Dickens' readers pleaded with him, in the serialized run-up to the climax of The Old Curiosity Shop, to please not kill Little Nell, while Arthur Conan Doyle was successfully lobbied by disappointed readers to bring Sherlock Holmes back from the dead after apparently killing the popular detective off in "The Final Problem." And movie producers routinely audience-test their films before making their final cuts. So all the popular uproar is not really different in kind from things that have happened before, but it may be different in degree, which is where its significance lies.

 

Because no one, except for the series' writers and actors, appears to be fully satisfied with what finally happened after eight long, and violent, years in the battle for the Iron Throne. The most common complaint seems to be that Daenerys should have been allowed to follow her "character arc" to become not only Queen of the Seven Kingdoms but also a kind of messiah. However, it isn't my purpose to wade into the controversy to offer my own opinion about what "should" or "shouldn't" have happened, for that's an esthetic, not a semiotic, question. Rather, I want to look at the extravagance of the negative response to what did transpire and what it tells us.

 

To understand this response we can begin with the fact that Game of Thrones ran for eight years as a continuous narrative—conceived, in fact, as one gigantic movie: a TV "maxi-series" if you will. Eight years is a long time, especially for the show's core audience of millennials who effectively entered adulthood along with GoT's main characters. This audience largely overlapped with the generation that grew from childhood to adolescence as the Harry Potter novels were published and filmed, and who also were on hand for the definitive cinematic Lord of the Rings: the fantasy novel to beat all fantasy novels, first raised to cult status by baby boomers and then turned upside down and inside out by George R.R. Martin to create A Song of Ice and Fire.

 

Such a long wait, at such a formative period of life, is simply bound to build up a great load of gestaltic expectation, a longing for the kind of ending that would redeem all the violence, betrayal, and heartbreak of this essentially sadistic story ("Red Wedding" anyone?). Both The Lord of the Rings and the Harry Potter novels prepared viewers for such an ending, one in which, to quote Miss Prism from The Importance of Being Earnest, "The good ended happily, and the bad unhappily." Instead, everyone got something more along the lines of King Lear.

 

And there you have it, for as Miss Prism tartly adds, happy endings are "what fiction means”—or as Frank Kermode might put it, the triumph of the hero and the defeat of the villain is one way that our storytelling makes sense of (one might say, makes bearable) the realities of human experience.

 

But that isn't what George R.R. Martin—who knows full well how the triumph of the House of York only led to Richard III, whose defeat, in turn, eventually brought Henry VIII and Bloody Mary to the throne—ever appears to have had in mind for his epic saga. Mixing historical reality with a lot of Tolkienesque fantasy, Martin (whose own conclusion to the tale is yet to come) thus put the show's writers into quite a bind. Because a completely conventional "happy" ending would have contradicted the whole point of the story, while a completely dismal one (say, Cersei triumphing after all) would have really enraged the customers. I use that word deliberately, for in popular culture audiences really are customers, and they expect to get what they pay for (the complaint on the part of some fans that by making GoT's producers and actors rich they were entitled to a better wind-up than they got is especially significant in this regard). So Benioff and Weiss essentially compromised in the way that pop culture usually does. The really bad villains do end unhappily, and the Starks do regain power after all, but Martin's fundamental thesis that power itself is the problem is preserved in the madness of Daenerys at the moment of achieving absolute control.

 

It wasn't a bad compromise in my view, but it quite clearly hasn't been a successful one either. Still, because of the odd reversal in the relation between novel and film, with the film being concluded before the novel was, the game isn't over. If the novels ever are concluded, I suspect that Martin will have more shocks up his sleeve, beginning, I suppose, with King Bran turning tyrant and bad trouble between Jon and Sansa.

 

 

Photo Credit: “Game of Thrones Paperback Book” by Wiyre Media on Flickr 7/15/17 via a CC BY 2.0 License.

Topics for popular cultural analysis can spring out at you at the most unexpected times—in fact, that is one of the goals of cultural semiotics: to attune oneself to the endless array of signs that we encounter in our everyday lives. Take for example the catalog from a Minnesota-based outfit called The Celtic Croft that I came across quite by accident recently. A mail-order/online Scottish clothing and accessories emporium, The Celtic Croft offers its clientele not only traditional Highland gear but "officially licensed Outlander-inspired apparel and tartans," along with authentic Braveheart kilts. Which is where popular culture, and its significance, comes in.

 

I admit that I had to look up Outlander (of which I had only rather vaguely heard before) to understand what the catalog was referring to, but what I learned was quite instructive. Based upon a series of historico-romantic fantasy novels by Diana Gabaldon, Outlander is a television series from the STARZ network that features the adventures of a mid-twentieth-century Englishwoman who time travels back and forth between the eighteenth and twentieth centuries as she leads a dual life among the Highland clans and the post-World War II English. Something of a breakout sensation, Outlander has recently been renewed for a fifth and sixth season.

 

To grasp the cultural significance of this television program—and of the clothing catalog that is connected to it—we can begin with constructing the system to which it belongs. The most immediate association, which is made explicit in The Celtic Croft catalog, is with the Oscar-winning film Braveheart, but the Highlander television and movie franchise is an even closer relation. More broadly, though set in the eighteenth century, Outlander can be regarded as a part of the medieval revival in popular culture that began with J.R.R. Tolkien's Lord of the Rings and The Hobbit, and which led to the whole "sword and sorcery" genre, encompassing both Harry Potter and Game of Thrones, with its emphasis on magic, swordplay, and a generally romanticized view of a pre-industrial past.


The current controversy raging within medieval studies over its traditional focus on the European Middle Ages—not to mention its cooptation by avowed white supremacists—reveals that such a system is fraught with potential political significance, and it is highly likely that a complete analysis of the phenomenon would uncover elements of conscious and unconscious white nationalism. But if we limit ourselves here to The Celtic Croft catalog and its Braveheart- and Outlander-inspired merchandise, we can detect something a great deal more innocuous. To see this, we can begin with a tee-shirt that the catalog offers: a black tee with white lettering that reads, "Scotch: Makin' White Men Dance Since 1494."


Now, I can see how this slogan could be taken as a kind of micro-aggression, but it can also be seen as something similar to the "white men can't jump" trope: an expression of what is actually admiration for qualities that are not conventionally associated with white people—especially in relation to stereotypes of Anglo-Saxon self-repression and ascetic Puritanism. What the dancing Celt signifies is someone who can kick up his heels over a glass of whiskey and who is decidedly not a stodgy Saxon.


This interpretation is supported by the larger context in which The Celtic Croft universe operates. This is the realm of Highland Scotland, whose history includes both biological and cultural genocide at the hands of the English, who themselves become symbols of brutally oppressive whiteness in Braveheart and Outlander. It is significant in this respect that William Wallace's warriors in Braveheart were conspicuously portrayed with the long hair and face paint of movie-land Indians, while the British of Outlander are depicted as killers, torturers, and slave traders.


So what we have here is something that might be called an "escape from the heritage of oppressive whiteness," by which white audiences/consumers (who do not have to be actual Scots: even Diana Gabaldon isn't) identify with the Celtic victims of Anglo history, finding their roots in such historical disasters as the Battles of Falkirk and Culloden. Purchasing the once-forbidden symbols of the Highland clans (kilts and tartans were banned for years after Culloden) and watching movies and television shows that feature the heroism of defeated peoples who resisted Anglo-Norman oppression thus become a celebration of a different kind of whiteness, one that rejects the hegemonic variety.


In other words, rather than reflecting white supremacy, the Celticism (I think I just coined that) of The Celtic Croft and its associated entertainments expresses a certain revision of the traditional American view of history away from Anglo-centrism towards an embrace of its victims. At a time when differing experiences of historical injustice are rending our country, this is no small recognition, because it could help create common ground for unity rather than division.


Photo Credit: Pixabay Image 349717 by PublicDomainArchive, used under the Pixabay License.

Now on a record-shattering run that should come as no surprise to anyone, Avengers: Endgame offers a multitude of possibilities for writing assignments, ranging from a close reading of the movie itself, to an analysis of the entire Avengers film franchise, and beyond that to a reflection on a whole system of ongoing violent sagas that includes Star Wars, Game of Thrones, and even The Walking Dead—not to mention the rest of the Marvel universe.


I am not going to attempt anything of the sort in this brief blog, but instead want to propose a different kind of assignment, one that has semiotic implications but begins in a kind of personal phenomenology much akin to a reader-response analysis. The assignment would probably best be composed in the form of a student journal entry posing the question: How does an ongoing story line that appears to reach some sort of conclusion (including the deaths or "retirement" of major characters), but which I know is not really over at all, affect me and my sense of reality?


What I'm aiming at here is for students to become aware of what could be called the "false catharsis" involved in movies like Avengers: Endgame, which pretend to bring a vast arc of interwoven stories to an end that viewers know perfectly well is not an end at all. Disney has too much at stake to allow Iron Man, for example, to stay dead, or Captain America to remain retired, and given the unlimited resources that fantasy storytelling has at hand to reverse the past and reconstruct the present and future, you can be pretty certain that everyone will be back.


In exploring the implications of what could well be called "eternity storytelling," consider what the effect of Charles Dickens' The Old Curiosity Shop would have been if his readers had known that Little Nell would be brought back, in one way or another, in a future novel. Or what the impact of the Iliad would be if Hector rose from the grave in a future installment of Trojan War Forever. Or (to go all the way back) how it would be if, in Gilgamesh II, the king of Uruk were to discover a time-traveler's ring that enabled him to go back, retrieve the lost plant-that-gives-eternal-life, and revive Enkidu after all.


You see what I'm getting at? There can be no true tragedy in a story like Avengers: Endgame, only a consumerist fantasy that lets you have your tragic cake and eat it too, purchasing your way into an impossible realm in which death and destruction are reversible and the story always goes on.


This is what I mean by a "false catharsis." In a true dramatic catharsis, there is a tragic recognition of the inexorable limits of human being. That recognition isn't pleasurable and it isn't fun, but it does offer a solemn glimpse into a reality that is vaster than we are, and with that glimpse, a certain dignity and wisdom.


But that doesn't sell tickets.


Photo Credit: Pixabay Image 1239698 by ralpoonvast, used under the Pixabay License.

If your Internet browser of choice is Firefox, then you are familiar with the way it provides you with a selection of readings when you visit its home page. I presume that my selections are generated by algorithms that mine my search history, because I get a lot of stuff from the Atlantic and the New York Times, as well as a lot of science readings. I'm not complaining, because while a good deal of what I see is simply clickbait, I have also learned some useful things from time to time. But what is perhaps most useful to me is what I am learning by conducting a semiotic analysis of the general themes that dominate my "feed."


Probably the most common theme I see appears in all the "how to succeed in business" articles that are always popping up: how to ace that interview, how to find that perfect job, how to choose the career that's best for you…that sort of thing. Tailored to the sensibilities of the digital age, such advice columns belong to a long tradition of American "how to" manuals calibrated to a competitive capitalist society. Calvin Coolidge (who once famously quipped that "the chief business of the American people is business") would feel quite at home here, so I don't want to read too much into all this. But I do think that the preponderance of such pieces may well reflect a growing anxiety over the possibility of attaining a rewarding career in a gig economy where opportunities for middle-class advancement are drying up.

Some evidence for this interpretation lies in the remarkable number of articles relating to depression that also appear in my feed. Some of them are scientific, while others are of the how-to variety, mental health division. The latter have recently been emphasizing the healing properties of the natural world, and I'm good with that. After all, that's where I go to soothe jangled nerves. But what, semiotically speaking, does this trend tell us?


My take on the matter is that even as Americans worry (with very good reason) about their career opportunities, they are also becoming increasingly depressed in the face of a constant barrage of signals from the traditional mass media and digital social media alike, all pushing them to compare their lives to the lives of everyone else. David Myers, over in the Macmillan Learning Psychology Community, has been exploring this phenomenon recently, especially with respect to teenaged girls, and I am quite in sympathy with his interpretation. I would simply expand the scope of the problem to include pretty much everyone: facing a daily bombardment of images and stories about the fortunate few who seem to have everything that life can possibly offer, people experience a depressing discontentment with their own lives.


And here is where nature comes in. Nature is not only filled with beautiful forests, mountains, rivers, lakes, seashores, deserts, meadows, canyons, and valleys (pick your own favorites); it is also not filled with people—or, at least, isn't supposed to be. "Climb the mountains and get their good tidings. Nature's peace will flow into you as sunshine flows into trees," John Muir once wrote, and his words are even more cogent today than they were over a century ago.


But to achieve that peace, you need to get past the crowds, and, more importantly, all that social pressure that drove you to nature in the first place. It is therefore quite ironic that one often sees people in natural surroundings wholly absorbed in their iPhones, or taking selfies. This kind of hetero-directed behavior not only threatens to disrupt the healing powers of the natural world, it also signifies how, for many people today, social media have created an addictive spiral from which they cannot escape. Think of it: going to nature to escape the depressing impingements of social existence, only to seek approval from other people, and then, perhaps, to be depressed if you don't get quite the reaction you hoped for on Instagram.


Hence, the title of this blog.


Photo Credit: Pixabay Image 489119 by kelseyannvere, used under the Pixabay License.