
Bedford Bits

91 posts authored by Jack Solomon

In the early years of the Internet, one of the most commonly heard slogans of the time was, "information wants to be free."  This ringing affirmation of the uninhibited flow of speech, knowledge, and news was one of the grounding values of that heady era when the Net was known as the "electronic frontier," and was regarded as an unfenced "information superhighway."  Those were the days when the web log (better known in its shorthand form as the "blog") was born, and the opportunities for virtually unfettered communication opened up in ways that the world had never experienced before.

 

That was twenty and more years ago now, and while a superficial glance would seem to indicate that nothing has really changed, a closer look reveals something quite different: deep down, the Internet has been fenced, and the superhighway is becoming a toll road.

 

To see how, we can consider the history of the blog itself.  Yes, blogs still exist, but they have often morphed into what were in the past called "editorials," as online newspapers slap the label onto the writings of pundits and even those of news feature writers.  What you are reading right now is called a "blog," though it is really a semi-formal essay devoted to professional musings and advice, rather than some sort of online diary or journal.  Blogs that hew to the original line of personal, unrestricted communiques to the world still exist, of course, on easy-to-use platforms like WordPress, but most have been abandoned, their last posts dated years ago.

 

Where has everybody gone?  Well, to places like Facebook, of course, or Instagram, or Reddit, or whatever's hot at the moment.  But this is not a mere migration from one lane of the information superhighway to another; it is an exit to a toll booth, beyond which some of us cannot go, not because we cannot afford the cost (the toll is not paid in dollars), but because we are unwilling to make ourselves the commodity that "monetizes" what now should be called the "electronic data mine."

 

Thus, I have seen personal blogs that I used to follow, because I was interested in what I learned about their writers, fall fallow as those writers moved on to Facebook.  For a long time, some such pages could be accessed by the likes of me if their authors chose to make them public, but they have now all been privatized by Facebook itself.  When I try to visit even the pages of public organizations, a moving barrier fills my screen, ordering me to open an account.  A free account, of course: all I have to do is sell whatever last shred of privacy I have left in order to sign on.

 

Yes, I know that Google is following me, even if I am not using its search engine: it gets me when I visit a site.  But signing on to Facebook (Google too, of course) involves an even deeper surrender of privacy.  This is demonstrated by the fact that Facebook feels it cannot get enough data on me simply by noting that I have visited one of its subscribers' pages.  And I am not willing to let Facebook have whatever extra information on me it wants.

 

I realize that I may sound here like someone who is demanding something for free.  I don't mean to sound like that: I realize that the Internet, like commercial television, has to be paid for somehow.  But I'd rather watch an advertisement (indeed, the ads are often better than the programs) to pay for my access than hand corporations like Facebook private information that they will sell to anyone who is willing to pay for it.  And I mean anyone, as one of the new readings in the just-completed 9th edition of Signs of Life in the U.S.A. (with a publication date of November 2017) reveals: Ronald J. Deibert's "Black Code: Surveillance, Privacy, and the Dark Side of the Internet."

 

Not that I am missing much, I think.  The thoughtful blogs that folks used to write have vanished into Facebook personal news bulletins—more like tweets and Instagrams than developed conversations.  It is not unlike what has happened to email, which, I gather, is very uncool these days.  Much better to text—a non-discursive form of shorthand which, paradoxically, one does have to pay for in hard cash.

Jack Solomon

The Pepsi Consternation

Posted by Jack Solomon, Apr 20, 2017

Some ads are born controversial, some ads achieve controversy, and some ads have controversy thrust upon them.  But in the case of the infamous Kendall Jenner Pepsi ad, we might say that this one managed all three at once, and if you are looking for an introductory-level lesson on popular cultural semiotics, you couldn't find a better candidate for analysis than this.

 

There are a number of reasons why the Pepsi/Jenner ad is such a good topic for an introduction to pop cultural semiotics.  First, pretty much everyone knows about it, and though it was yanked shortly after its premiere, it will be available for viewing for years to come, and the dust that it raised will not be settling soon.  This one is virtually guaranteed to have legs.

 

Second, the fact that so many people responded immediately to the ad with what amounts to a semiotic analysis of it demonstrates that cultural semiotics is not some sort of academic conspiracy designed to "read things into" harmlessly insignificant popular cultural artifacts.  All over America, people who may never even have heard the word "semiotics" instantly performed sophisticated analyses of the Pepsi ad to point out in detail what was wrong with it—my favorite example is the reviewer who noted how Kendall Jenner thrusts her blonde wig into the hands of a black assistant without even looking at the woman, as she (Jenner) heads off to join the march.  The SNL takedown alone is priceless.

 

I hardly need to repeat all the details of those analyses here: that the ad was "tone deaf"; that it was co-opting the Black Lives Matter movement in order to sell soda (Thomas Frank would say that the ad was a perfect example of the "commodification of dissent"); that it managed to tokenize non-whites while putting a white celebrity at the center of attention.  It's all there, and, all in all, I can't think of a better exercise than to play the ad in class and go through it with a fine-tooth comb to see just what it was doing, and why it failed so badly.

 

Just to offer some somewhat less-obvious things to consider while analyzing this ad, I would note, first, that it can be included in an advertising system that contains Coca-Cola's famous "I'd like to teach the world to sing" commercial from 1971.  Pepsi's ad was clearly created in the same spirit, but its abject failure marks a critical difference that bears further attention.  Now, like 1971, 2017 America is in the midst of widespread, and often bitter, cultural and political conflict, so one can't simply say that those were more innocent times to explain the difference in response to Pepsi's attempt at selling soda by trying to look culturally forward and hip to the moment.  But I do think that people are much more alert to media semiotics today than they were then, and thus more able to spot what Pepsi was trying to do.  Probably more importantly, the Coke ad didn't pretend to stage a street demonstration; it put together its own event (pseudo-event, I should say), which, though smarmy, made its own direct statement without the use of celebrities.  It wasn't authentic, but it was a lot less phony than the Pepsi ad.  That may have been part of the difference in reactions, too.

 

But the key difference, I believe, was the use of an already somewhat dubious celebrity in the Pepsi ad (Kendall Jenner belongs to an ever-growing line of reality-TV-created figures who are "famous for being famous") that its creators (mistakenly) believed would be immediately embraced by their target audience of millennials.  Indeed that is the narrative line that the ad assumes, which, in brief, runs like this: as a large crowd of young protesters (complete with an electric guitar- and cello-backed band—with break dancers!) marches through urban streets in protest of some unidentified cause, glamorous model Kendall Jenner (whom the ad's audience is expected to recognize) is working a fashion shoot, wearing a blonde wig, stiletto heels, and a lot of makeup.  As the marchers walk past her, she looks troubled, and then decides to ditch the shoot—doffing her wig, wiping off her lipstick, and somehow (somehow!) changing into blue jeans and a denim jacket—to join in.  She is immediately made the center of the whole thing, with all the marchers smiling at her in joy, and then going crazy with joy when she hands a Pepsi to a young cop assigned to riot duty (where's his armor, helmet, and facemask?), who accepts it and takes a drink.

 

The whole thing reminds me of an old John Lennon music video that shows John and Yoko leading some sort of protest march, in which it is clear that the only thing being demonstrated is the star power of John Lennon.  Now, the Lennon footage may or may not have been from a real march, but in creating a wholly bogus march for Kendall Jenner (who is hardly known for her social activism), what the Pepsi ad is really saying (contrary to its publicity department's frantic, and ultimately futile, attempts to defend the ad as a fine statement of "global" consciousness) is that what matters in America is celebrity power and wealth.  Thus, there is good reason why the ad's critics focused on Jenner as well as Pepsi: the ad is as much about her as it is about soda pop.  Someone in marketing presumed that millennials (who have been product-branded from birth) wouldn't notice the implications of that.  It is thus with some satisfaction that I can see most millennials did notice (though there are a surprising number of YouTube comments insisting that there is nothing wrong with the ad).  And that may be the most significant thing of all.

 

Jack Solomon

Popular Classics

Posted by Jack Solomon, Apr 6, 2017

Emily Bronte would have loved Game of Thrones.

 

No, this isn't going to be another blog post on the HBO smash hit series; rather, I would like to share some of my thoughts upon my recent rereading, purely for my own pleasure, of Bronte's weird classic, Wuthering Heights—thoughts which happen to have a significant bearing upon teaching popular cultural semiotics.

 

The foremost point to raise in this regard is that, in spite of its long enshrinement in America's high school curriculum, Wuthering Heights was not written to be studied in schools: it was written to be entertaining—to its author, as well as to its reader—for, after all, Emily Bronte had been writing to entertain herself and her sisters and brother since childhood.

 

More importantly, as a novel bearing the influence of everything from the Gothic literary tradition to the revenge drama to the star-crossed romance, Wuthering Heights is there to entertain, not mean.  This is where generations of literary critics striving to figure out what Bronte could possibly be getting at, and who (or what) Heathcliff is supposed to be, are missing the point. Wuthering Heights, like the movie Casablanca in Umberto Eco's estimation, is an absolute hodgepodge of often-conflicting literary cliches—a text, as Eco puts it, where "the cliches are having a ball."  And that is what most really popular stories manage to do.

 

How do we know that Wuthering Heights is popular, and not merely fodder for school-room force-feeding?  Let's start with the fact that some forty (yes, forty, but it's hard to keep precise count) movies, TV dramas, operas, and other assorted adaptations have been made of the enigmatic novel over the years, not to mention the biopics about the Brontes themselves that continue to be churned out—most recently the 2016/2017 BBC/PBS production To Walk Invisible.

 

How do we know that it is a cornucopia of cliches?  Well, we can start with Emily Bronte's take on the star-crossed lovers theme, putting Heathcliff and Catherine Earnshaw in the Romeo and Juliet predicament.  But it doesn't quite feel like Romeo and Juliet because of Heathcliff's absolute ferocity.  This is where the revenge theme comes in.  There is not a little of Hamlet in Heathcliff, and there is probably a lot of the Count of Monte Cristo (Emily Bronte could read French, and Dumas' novel was published in 1844-45—in time for Bronte to have read it, or at least known of it, before writing her novel).  This is one reason why Heathcliff is such a mystery: he embodies two very different narrative traditions, that of the revenge hero and that of the romantic hero.  Trying to reconcile these traditions is not only a hopeless task for critics, it appears to have overwhelmed Bronte herself, who, just as Heathcliff is about to perfect his decades-in-the-making revenge on the Lintons and the Earnshaws, suddenly decides to call it a day and kill him off (like a very belated Romeo) only pages from the conclusion of the story, in one of the worst-prepared-for denouements in literary history.

 

But let's not forget the ghost story element.  Like The Turn of the Screw a generation later (and James may well have gotten the idea from Bronte), Wuthering Heights is a ghost story, or not, because there may be no ghosts at all, only Heathcliff's feverish psychological projections.  But even as we ponder the ghost element (or lack thereof) in Wuthering Heights, there is the wholly Gothic ghoulishness of Heathcliff, which puts him in the class not only of vampires (Bronte herself teases us with that possibility) but of the necrophilic monk Ambrosio in that all-time 18th-century best seller, The Monk.

 

Then there's the way that Wuthering Heights eventually employs one of the most common conventions of the entire English novelistic tradition:  the actual, and symbolic, marriage that reconciles the fundamental contradictions that the novel dramatizes.  Indeed, one wonders whether Hawthorne's House of the Seven Gables (1851) owes something to Emily Bronte, but Bronte hardly got there first.

 

Finally, there is the character of Catherine Earnshaw Linton, who may be the most popular element of all in the novel today.  A likely projection of something of Emily Bronte herself, Catherine is a strong-willed, beautiful woman with masculine as well as feminine characteristics, one who may well prefigure the ever-popular Scarlett O'Hara.  Something of an archetype of the emancipated woman, Catherine, to adapt an old New Critical slogan, is there to be, not mean.  She doesn't point to a moral: she just is, and readers love her for it.

 

See what I mean?  Wuthering Heights is simply teeming with literary formulae.  And so, just as with any artifact of popular culture whose primary purpose is to entertain, our best approach to it is not to ask what it means but, instead, to ask what it is in all these conventions and cliches that is so entertaining, generation after generation, and what that says about the audience (and culture) that is entertained.

 

I won't attempt that analysis now.  Perhaps I'll come back to it some time.  But my point here is that by studying "literature," we often lose track of the role that entertainment plays in literary production, just as in enjoying entertainments we often lose track of the significance of that which is entertaining in entertainment.  Popular cultural semiotics is, accordingly, not only for self-declared "mass cultural" entertainments: it can illuminate what we call "the classics" as well.

 

 

The following headline caught my attention today, and I thought I'd give it a semiotic treatment. 

Here it is: "Why 'Beauty and the Beast' will be the biggest box-office hit of the year so far".

 

I'm not interested in the reasons given for the predicted success in the article but, rather, in the larger picture that this apparently sure-fire box office bonanza presents.  It's worth looking into because it re-illustrates a number of points that I have made over the years in this blog about Hollywood semiotics.

 

The first, and most obvious, point to make is that this Disney live-action reprise of its own animated smash is a signifier of what could be called the "if at first you do succeed, do, do again" principle.  That is, with so much at stake financially in the modern movie business, commercially successful films tend to become franchises for the studios that produced them, and clone-opportunities for the studios that didn't.  Why take a creative risk when a little new technology can let you redo the animated original with real people this time around, and be pretty darned assured of a big-time box office hit?

 

Another fairly obvious point (though it took the movie industry a little while to get it) could be called the "forget the fourteen-year-old boys for a moment and focus on the little girls" realization.  That point has been amply made by such absolute blowouts as Titanic and all the Harry Potter films.  Heck, just to make certain, Disney has brought back Hermione Granger, I mean Emma Watson, as the Beauty.  And did I mention The Hunger Games trilogy?

 

Then there's what I'll call "the prince and princess paradox":  that is, in our proudly egalitarian democracy, one of the best ways to ring in the cash is to make a movie about princes and princesses—especially Nordic ones.  (Beauty is really a king's daughter in one of the early versions of the tale.)  For some reason, American middle-class desire still seems to be fixated on Old World privilege—which is a point I made on this blog when Frozen was still fresh a few years ago.

 

Related to "the prince and princess paradox" is the long-standing medieval revival, which has swept American popular culture ever since The Lord of the Rings exploded in popularity in the 1960s, and subsequently was given a postmodern makeover by Star Wars, a British "public school" inflection by Harry Potter, and a grand guignol do-over by Game of Thrones.  The paradox here lies in the way in which New World America, which has no actual medieval history of its own, continues to be obsessed with a fairy tale world of hereditary aristocrats, swords, and sorcerers. 

 

This brings me to my final point: when I cast the "new" Beauty and the Beast into a system of associated entertainments, I find in that system two somewhat similar shows that are also significantly different.  These are the 1987-90 television series Beauty and the Beast, and the DreamWorks franchise, Shrek.  The TV B & B was significant because it took the old fairy tale about what might be called a dis-enchanted prince into modern times, and turned the beast into a kind of mutant homeless person (who's actually rather handsome in his leonine way—one wonders, however, what audience reaction would have been if Beauty were male and the beast female) who lives, literally, in the Manhattan underground.  The series was drenched in socio-political overtones, with the Beast being really a beast and not an enchanted prince who will go back to being a prince as Beauty's happily-ever-after reward.  While rather soupy and over-the-top, the series did, at least, Americanize the old story.

 

Then there's Shrek.  While rather cornball and over-the-top, Shrek gleefully shredded the old prince-and-princess framework entirely to give us an ogre-and-ogress happy ending, with a really creepy Prince Charming thrown into the saga just in case we didn't get the point.  Shrek, who first appeared in print form in 1990, was a creature from the Age of Attitude, the Bartman days when Bart Simpson, and not Homer, was the star of The Simpsons, and irreverence was a national pastime.

 

So, I wonder about this back-to-the-Beauty stuff.  There's a New Yorker cartoon from Roz Chast that I'm reminded of here.  In the cartoon, entitled "Comes the Revolution Fairy Tales," "Cinderella" is retold with Cinderella ending up running away with "Henri, an idealistic student," as Prince Charming meets a gruesome end.  Now that's a fairy tale for the land of Thomas Paine.

One common response to the semiotic study of such popular media as film, television, and music is that "it's only entertainment."  If you use popular culture as a thematic topic in your composition classes, then you may have encountered something of the sort, which is why the general introduction to Signs of Life in the USA carefully describes the historical process by which America became an "entertainment culture," and why that means that entertainment in America is always meaningful.

 

And if anyone still wants to object that entertainment is only entertainment, I give you the 2017 Oscar Awards ceremony.

 

No, I'm not referring to the mistaken best-picture-announcement-heard-round-the-world (how could they have just stood around waiting for the poor producers of La La Land to complete their victory speeches before breaking in with the correct envelope?!); I'm referring to the expectation that, like the Golden Globes, the ceremony was going to be another skirmish in the ongoing war between Hollywood and Donald Trump.  And that expectation, largely due to the opening monologue by Jimmy Kimmel, was not disappointed.

 

With Kimmel openly taunting the president (indeed, daring him to live-tweet the event), one doesn't need to go through every joke to get the point.  Which is that, with Saturday Night Live taking the lead with regularly scheduled take-downs (and enjoying a "ratings roll" ever since the election), and with major awards ceremonies becoming platforms for presidential critique, the entertainment industry is emerging as the foremost institution of political opposition in the country.  I don't think that I am exaggerating: how often do we hear anything like it from the opposition party in the Senate and the House?

 

But if the entertainment industry has successfully taken up the cudgels against the Trump administration, will that action itself be successful?  Here's where things get tricky.  There is a lyric from "The Folk Song Army," a Tom Lehrer tune spoofing the folk-song-led 1960s protest movement, which goes like this:

 

Remember the war against Franco?
That's the kind where each of us belongs.
Though he may have won all the battles,
We had all the good songs.

 

There is no question that the entertainment industry has all of the good jokes (and speeches) when it comes to the anti-Trump resistance, but given that Trump's support comes from people who view Hollywood as a lot of out-of-touch elites, these jokes may only prompt them to dig in their heels (or "double down," as everyone seems to insist on saying these days) when it comes to their support for the president.  Indeed, a Washington Post report suggests that just this is happening, as 53% of those polled in a recent NBC News-Wall Street Journal national poll believe that "[T]he news media and other elites are exaggerating the problems with the Trump administration."   And "other elites," in American discourse, always include Hollywood.

 

Still, there is also the fact that entertainment is what Americans heed.  Donald Trump himself (as one of the new selections for the upcoming 9th edition of Signs of Life in the USA points out) used reality television as a springboard to the White House.  So perhaps the entertainment industry may indeed prove to be the most effective warrior in the anti-Trump opposition. 

 

This takes us back to our original premise: in America, entertainment matters.  A lot.  And that's why we teach popular cultural semiotics.

 

Jack Solomon

It's Only a Game...Not

Posted by Jack Solomon, Feb 16, 2017

For the pure fan of the game, Super Bowl LI was the stuff of sports history: a young, fast team finally losing out to a seasoned veteran who knew exactly what to do when the chips were down.  I mean the whole thing had "TV Movie" written all over it, and one could almost imagine the ghost of Gary Cooper rising from the grave to reprise Will Kane as Tom Brady, coolly gunning down the pass defense and striding to an impossible victory.

 

Except that it is impossible to be a pure fan of any games in America anymore, not when everything has become a symbol of the bedrock contentiousness that is dividing the country.  Today, everyone is a semiotician, and rightly so.

 

We can start with the teams themselves.  By some freakish chance of history, both the coach and the star quarterback of the New England Patriots just happen to be fans of the newly elected President of the United States, which unavoidably turned the contest into a kind of allegory of the election.  Then there was the equally freakish coincidence that Melissa McCarthy's Kia ad, which had been filmed well before the game, emerged as Super Bowl LI's most popular commercial, scarcely twenty-four hours after McCarthy's devastating takedown of presidential press secretary Sean Spicer on SNL.  So, score one for the president, one for the opposition.

 

The whole night was like that.  Days before the game itself, for example, there was the controversy over the Anheuser-Busch origin-story ad, with a lot of Trump fans objecting that it was pro-immigration propaganda.  But such objections generally missed the most important signifier in the ad's sepia-toned narrative: the instantaneous glimpse we are afforded of the young Adolphus Busch's immigration papers being stamped.  That detail crucially distinguished the Anheuser-Busch immigration narrative from the 84 Lumber commercial, whose anti-Trump, pro-immigration symbolism was, by contrast, quite explicit—so much so that the ad was quite literally censored prior to its broadcast.

 

But the fact that the Anheuser-Busch commercial, as tradition-minded as it was, did attract controversy is a sign of just how fraught the situation is in America today.  While the company has claimed that it had no intention of getting tangled up in the heated national debate over immigration and was only telling its own story, the reaction shows that that was never really possible.  Though the ad's narrative referenced all of the classic elements of the nineteenth-century immigrant experience—complete with sea voyage and arrival at Ellis Island—it could not possibly escape its twenty-first century context.  Trying to have things both ways, the commercial combined conventional Statue of Liberty-style nostalgia with sympathy for contemporary immigrants who face nativist hostility—as we can see signified in the street scene where the young Busch is angrily told to "go back home."  Indeed, to make certain that we see the folks at Anheuser-Busch as being on the side of social progress, the ad's creators even threw in a glancing riverboat shot of young Busch nodding in a comradely sort of way to a black slave (this was 1857, after all, on the Mississippi River).  But to no avail: threats of Anheuser-Busch boycotts (read the comments section on YouTube) demonstrate that you can't have it both ways in America when it comes to immigration.

 

So, semiotically speaking, Super Bowl LI was a giant symbol, a grab-bag of signifiers all pointing to the same dismal message.  When the most successful quarterback in the history of the game becomes a Donald Trump surrogate, and the ads become a battleground over national immigration policy, you know that nothing can be simply what it is in American popular culture any longer.  This may be good for semioticians; it isn't good for the country.

 

Jack Solomon

The MTM Effect

Posted by Jack Solomon, Feb 9, 2017

The outpouring of affection upon the recent death of Mary Tyler Moore is not only a tribute to a popular television icon, but it also draws attention to the series that made her more than just another TV star: The Mary Tyler Moore Show—a program that offers teachers of popular cultural semiotics an unusually rich topic for analysis.

 

As always, we can begin such an analysis with an identification of the particular system with which The Mary Tyler Moore Show can be associated.  This system is the television genre we call the "situation comedy," but the moment we identify that system a rather glaring difference appears when we compare The Mary Tyler Moore Show with the prominent sitcoms that preceded it, including, of course, The Dick Van Dyke Show. This difference is the key to the whole analysis.

 

The MTM difference, and its significance, is obvious: The Dick Van Dyke Show and such iconic family sitcoms as Father Knows Best, Leave It to Beaver, and My Three Sons were fundamentally patriarchal television series devoted to the presentation of a certain ideological vision of the American family.  As feminist scholars of popular culture have long since pointed out, the family sitcoms of the 1950s and 1960s functioned to persuade American women that the non-domestic roles they were allowed to play during the Second World War were only temporary and "unnatural."  Rosie the Riveter, as it were, had to be shown that her real happiness lay in a middle-class suburban home, taking care of the house and the children for her bread-winning husband.  It was all, as Antonio Gramsci might have put it, a gigantic act of cultural hegemony.

 

This is why it was such a very big deal when Laura Petrie reappeared—only four years after the conclusion of The Dick Van Dyke Show—as Mary Richards, a single professional living an independent life in the big city.  Pioneering what could be called "the mainstreaming of the women's movement," The Mary Tyler Moore Show (its very name, as an inverse image to The Dick Van Dyke Show, was significant) marked the moment in which sneering at feminists began to cease to be the default position in American popular culture.

 

But The Mary Tyler Moore Show does not only signify a crucial moment in the history of the women's movement; it also helped open the door to such other socially conscious sitcoms as All in the Family, and, perhaps more significantly (if far less remembered), Mary Hartman, Mary Hartman.  For while All in the Family brought racial and cultural conflict into America's living rooms, it remained, at base, a sitcom.  Mary Hartman, Mary Hartman, on the other hand, could really be called the first dramedy, its mixture of soap opera with satire offering a not-so-funny exploration of the crumbling edifice of married domesticity.

Like The Days and Nights of Molly Dodd (a 1980s proto-dramedy that also explored the darker side of women's lives in a more or less comic setting), Mary Hartman, Mary Hartman didn't last very long, but it helped pave the way for Desperate Housewives, Orange is the New Black, and every other contemporary TV series that blends bitter humor with socially conscious drama. 

 

I do not think it is an accident that the dramedy is, more often than not, a woman-centered television drama.  The most popular sitcom on TV today—The Big Bang Theory (actually, it is simply the most popular show on TV today)—is a strikingly masculine series, and it has no dramedic echoes whatsoever, nor any particular social significance beyond the way it signals the new popular cultural prestige of science and technology (I think of The Big Bang Theory as Seinfeld-meets-Friends-meets-Flubber).  It's very funny, but it isn't game-changing.  The Mary Tyler Moore Show was game-changing, and a lot of contemporary television is playing by its rules.

 

Jack Solomon

La La Land

Posted by Jack Solomon, Jan 26, 2017

If the Golden Globe awards are anything to go by, La La Land is the greatest motion picture ever made.  Or something like that.

 

If we go by the most recent box office totals, it isn't half bad, either—but the Oscars haven't weighed in yet and that verdict could do a lot to boost the bottom line even further.

 

But if we look at La La Land semiotically, a different picture emerges, revealing not its quality or ultimate profitability, but rather what it says about America today.  Not surprisingly, that turns out to be a rather mixed message.

 

Let's start at the beginning, which, in a semiotic analysis, usually means determining the immediate system, or context, in which our topic appears.  In this case, that system is the history of Hollywood musicals, romantic drama division (the studio calls it a "comedy-drama," but the "comedy" part of the categorization has been questioned).  This simple act of situating La La Land within its most immediate context takes us right to our first signification, because the era of the Hollywood musical (evoking any number of cinema classics, with Singin' in the Rain taking honors as the most cited of La La Land's predecessors) has long since passed, and so the appearance of a musical now marks a difference.  And that difference means something.

 

I see a number of significations here.  The first might be called the "when the going gets tough, America goes for uplifting distractions" precept.  Especially prominent during the Great Depression (which, not coincidentally, coincided with the true Golden Age of Hollywood), feel-good movies have always provided a distraction from the slings and arrows of outrageous reality, and nothing can beat a musical—especially a romantic musical—for making people feel good.  So it should come as no surprise, as we wallow in the wake of a Great Recession from which only a small portion of America has really emerged, that Hollywood gave the green light to a nostalgic film like La La Land, and that audiences, if not quite in blockbuster numbers, have been lining up to see it. 

 

But if audience nostalgia accounts for a good deal of La La Land's success, there is also the enthusiasm emanating from the Hollywood community itself to consider.  The nostalgia of a movie like La La Land is very much an insider's emotion, an evocation of memories of the sort that those fortunate few who really did emerge from the madding crowd to reach the heights of the gaudiest version of the American dream can experience as their own.  For them (especially for La La Land star Emma Stone) the movie is scarcely fiction at all.  No wonder Hollywood loves it.  

 

A less sunny side to Hollywood's self-celebration in La La Land, however, can be found in the film's use of jazz, a multicultural art form that (as a number of critics have noted) La La Land effectively whitewashes.  There is something of a Mad Men effect going on here, as if part of the film's nostalgia is for the days when the racial politics of filmmaking were more easily swept under the red carpet and white actors could be smoothly inserted into what many regard as black roles. After all, The Jazz Singer is also part of La La Land's genealogy.

 

Finally, to discover what may be the most profound signification of La La Land, we need to return to the fact that ordinary people are watching it and giving it high marks on Rotten Tomatoes and IMDb in an era when much darker movies (e.g., anything with Batman in it, but don't forget Deadpool) are really breaking the box office.  Sure, a lot of this popularity is probably coming from viewers who are profoundly grateful for a movie that isn't some sort of superhero or sci-fi fantasy, but the fact remains that La La Land—for all of the much-ballyhooed "realism" entailed by its protagonists' less-than-professional dance chops—is a fantasy too for the vast majority of its viewers.  Which is to say that its starry-eyed "message" about "pursuing your dreams" is completely out of touch with the reality faced by Americans today.

 

Because (you knew I'd get to Donald Trump eventually, didn't you?) one of the indelible takeaways of the 2016 presidential election is that a substantial number of Americans have begun to lose faith in that American exceptionalist belief that America is the place where dreams do come true, where everything does turn out the way you want it to in the end if you only show enough grit and determination.  This essential optimism—what Barbara Ehrenreich calls American "bright-sidedness" (look for her in the 9th edition of Signs of Life in the USA on just this topic)—is badly fraying at the edges as the American dream falls further and further out of reach for most of us non-one-percenters.  And while this new reality is not something that the Hollywood dream machine wants to reveal in the nation's movie theaters, it certainly is showing up at the polls.

 

Which is to say that La La Land's success is a reflection of an America that is passing.  Its follow-your-dreams faith may have worked for Damien Chazelle, but the odds aren't favorable for the rest of us.  Guns N' Roses was certainly closer to the mark for those who do succeed in Hollywood with "Welcome to the Jungle," but the words of a Raymond Carver character (whose family has lost everything) from a short story called "The Bridle" are probably a lot more relevant for much of the rest of America: "Dreams," she says, "are what you wake up from."

One of the key components of modern writing instruction is the rhetorical attention paid to the question of audience.  And I hardly need to tell whatever audience I may have here what that's all about and why it's important—especially in the era of socially-mediated inscription.  But there is another angle to the matter that, if a great many recent news stories are of any significance, appears to require some attention too, not only by students, but by instructors as well.

 

I am referring here to the seemingly endless stream of news reports—from such sources as Inside Higher Education and the Chronicle of Higher Education on the one hand, and, when the story is shocking enough, from numerous mass media news sources on the other—concerning college instructors who appear to forget that when writing on Internet-mediated platforms the whole world is your potential audience.  No matter how you may set your privacy settings on Facebook, and no matter how obscure you may assume your Twitter account to be, there is always someone out there ready to take a screenshot, no matter which side of the political divide you may find yourself on.

 

The most recent cause celebre in this regard involves the Drexel University professor whose "All I want for Christmas is white genocide" tweet particularly lit up the holiday season this year.  The point of my analysis here has nothing to do with academic freedom and the related question of what Drexel administrators should or should not have said about it: I'll leave that to the innumerable online commentators who have been doing battle over those matters.  Rather, what I am interested in is the question of audience, and how a failure to consider that question can lead to all sorts of unintended consequences.

 

The crux of the matter here lies in the assumption that everyone who read the tweet would be aware that the phrase "white genocide" has become a special term of reference for alt-right sorts who use it to deplore the rise of multiculturalism and the impending loss of a white racial majority in the United States.  Those who have rushed to the defense of the offending tweet—along with its author—have assumed that everyone would have seen that the tweet was a sarcasm-inflected endorsement of a multicultural, multiracial society, not a call for a massacre.

 

But here is where the question of potential audience comes in, because while the author of this tweet and his intended audience of Twitter followers may be well aware of the alt-right meaning of the phrase "white genocide" these days, the majority of potential readers of the tweet are not, and to such readers the tweet is going to look appalling without some sort of semantic clarification.  But that is something you can't do in the text-restricted medium of Twitter, and no amount of after-the-fact backfilling can repair the damage done by a careless tweet.  This brings up another, related point.

 

This is the fact that social media in general—but especially those of the Twitter and Instagram variety—either require or encourage writing in a kind of shorthand.  Unlike the blog form, which allows a writer to stretch out and elucidate when the inevitable semantic and rhetorical ambiguities of discourse threaten to fill the air with confusion, the preferred modes of digital communication today almost presuppose a homogeneity of audience, a readership that understands what you are saying because it already agrees with you and shares your perspectives.  Hence a writing in shorthand, even when the platform allows for discursiveness.

 

And that raises a risk.  For there is something about social media that seems to encourage provocation rather than argumentation, especially in the form of shorthand jabs.  Certainly this is the case when writers assume that they are writing in safe echo chambers wherein those who "belong" will nod their heads in agreement and those who don't will be offended.  But while offending those who aren't on one's side in a dispute may be "fun," it sure doesn't make for an effective argument.  In fact, it is likely to backfire—which is one reason why social media do not provide a sound platform upon which to learn university-level writing.

 

This matters, because at a time when America is tearing apart at the seams, it behooves us as educators to be doing everything that we can to encourage careful argumentation rather than reckless provocation.  I am not so naïve as to believe that simply resorting to rational argument will always win the day (Aristotle himself made no claim to guarantee this in his Rhetoric), but a carefully developed, audience-aware argument will, at least, have a far smaller chance of backfiring than a provocative tweet will.  Thus, it doesn't really matter whether the Drexel tweet was intended to be provocative or not (I suspect, however, that with its openly-avowed sarcastic intent, provocation certainly was part of its composition); what matters is that its disregard for audience has produced a situation that puts higher education on the defensive, not those whom the tweet meant to ridicule. In short, the thing has backfired, and, in the context of a number of similar recent backfires, this is not something that higher education can well afford.

Jack Solomon

Fake News

Posted by Jack Solomon, Dec 15, 2016

It has long been a commonplace of cultural studies that the "news" is never an objective presentation of the way things really are, but reflects instead the ideological perspectives of those who present it.  More profoundly, the post-structural paradigm that continues to influence contemporary cultural studies (even if the word "post-structuralism" is beginning to show its age) goes even deeper to argue that reality itself (conventionally presented in scare quotes along the lines of a Derridean erasure) is a social construction without any objective grounding.  But in the wake of the recent revelations concerning what can only be called the "fake news industry"—and the potential effect that it appears to have had on the just-concluded presidential election—I think that it would behoove the practitioners of cultural studies to take "reality" out of scare quotes, because the reign of anti-realism is really getting out of hand.

 

To say that this will not be happening soon, however, is to risk considerable understatement, because I've made this call before.  Many years ago I published a book (Discourse and Reference in the Nuclear Age, 1988) in which I tried to establish a semiotic alternative to post-structural anti-realism at a time when the sliding signifiers of the Reagan administration were giving the most fact-averse scholars of deconstruction a real run for their money.  And to say that I was not successful would also be to risk considerable understatement.  But I would like, nevertheless, to offer some tips to composition instructors who may be looking for ways to help students distinguish between outright fantasy and defensible reality in an era of "truthiness," "post-facts," and fake news.

 

To begin with, your students need to be informed that the "news feeds" that they receive on their Facebook pages reflect the same kind of data mining techniques that digital marketers employ.  By spying on the content posted on your Facebook page, Facebook can predict just what sort of news you are likely to want to get.  This not only means that "liberals" will accordingly receive "liberal" news and that "conservatives" will receive "conservative" news, but that liberal or conservative third parties—who have access to Facebook's data mines—can effectively spam your page in the same way that advertisers do—except in this case the spam is "news," not advertising.  The result is an echo chamber effect, within which everyone hears only the news that they want to hear (or already agree with).

 

So the second thing to realize is that the polarized (and polarizing) "news" situation in America is no longer simply a matter of whether you watch MSNBC or Fox News: these days the social network is the echo chamber, and that is a much trickier thing to resist.  For now it is not some network stranger who is providing you with your news, it is your own friends and family, whom you are a lot more likely to trust, no matter what weirdness they send you.  The only way out of this echo chamber, then, is to get off social media and do some research, constantly seeking out multiple sources of—and perspectives on—information, especially when something you hear just doesn't seem very likely.  I'm not saying that unlikely things don't happen in this world, but, as they say in science, "extraordinary claims require extraordinary evidence," and so extraordinary news requires extraordinary levels of active media scrutiny.

 

Finally, at a time when each side of the great American political divide doesn't trust anything that the other side reports, it is important to recognize that the concoction of fake news is not an ideological monopoly, especially at the extremes, where, to take one all-too-common example, the so-called "false flag" conspiracy narratives of both the left and the right can be disturbingly similar in their levels of sheer evidence-deficient fantasy.

 

So the best ground for refuting such post-fact fantasies remains good old-fashioned empirical evidence.  But we can't demand such evidence if we insist that there is no empirical reality and that everything is a social construction.  That is why the semiotic paradigm that I use, as influenced by Charles Sanders Peirce, is not a post-structural one.  It accepts a reality outside our sign systems and against which our signs can be tested and evaluated.  Absolute objectivity cannot be theoretically achieved by this paradigm, but it does supply a basis for identifying outright fabrications. In short, in this "post-truth" era, it's high time to get real.

Jack Solomon

Top of the Class

Posted by Jack Solomon, Dec 1, 2016

Who would have ever thought that a Broadway musical about a man best known today for having been killed in a duel with Aaron Burr (and who was also one of the founders of nascent American corporate capitalism) should have become the hottest thing on Broadway since Cats?  But then again, who would have thought that a Broadway musical would get itself involved in what is arguably the bitterest American election since 1860?  That is exactly what Hamilton has done, and therein lies a semiotic tale.

 

The story here begins not with the creation and triumphal run of this Tony-record-smashing production, but with an event that took place after its creator had left the cast for other projects.  This event, of course, was the reading of a statement by a cast member to Vice President-elect Pence, who happened to be in attendance at a post-election performance.  That statement, which did not appear to have upset Pence (it basically implored the incoming Trump administration to play nice), did upset the President-elect, who took the matter to Twitter, where he appears to conduct the greatest portion of his communication with the American people.

 

The ironies—indeed, outright paradoxes—of this whole situation can hardly be overstated.  First, we have the paradox of the play itself:  a paean to diversity and inclusiveness whose ticket prices now average $411, and whose premium seats run $849.   The ironic symbolism of this—in the light of an election in which the Democratic candidate overwhelmingly carried America's centers of post-industrial prosperity, while the Republican candidate captured the Rust Belt—should not be lost on anyone.  Simply stated, while race relations most certainly played a key role in the election, so did socioeconomic inequality.  And while the billionaire standard bearer of the traditional party of the country club set saw this and exploited it in a campaign aimed at working-class Democrats who could hardly afford Hamilton's price of admission, the Democrats did not. 

 

Then there is the paradoxical fact that the Democratic candidate out-fundraised and outspent her Republican rival by a considerable margin.  Making use of social media (especially Twitter) instead, a capitalist tycoon struck a populist note by communicating directly with voters rather than through expensively staged, and highly mediated, advertisements.  Whether this populist strategy was truly authentic is open to debate; that it was successful is not.

 

In short, the traditional party of class privilege won (at least in part) by playing upon the often-neglected emotions of social class, while the traditional (at least since FDR) party of the common folk got blindsided by class resentment.  And while one can certainly understand why the cast of America's most celebrated stage entertainment would want to take advantage of a chance to speak directly to a man whose election appears to contradict everything that their performance stands for, the upper-class aura of the venue for their message was not, perhaps, the most effective setting for it.

Jack Solomon

The Day After

Posted by Jack Solomon, Nov 17, 2016

Given the immense significance of the outcome of the American presidential election, I awaited its results before writing this blog.  And though I, like a great many other people, was rather taken by surprise by what happened, the overall semiotic outline of the event was clear both before and, now, after it.  So, doing my best to avoid partisanship, I will sketch out that outline here in the shape of a series of fundamental takeaways.

 

First, as the chapter on cultural contradictions ("American Paradox") in the ninth edition of Signs of Life in the USA explicitly explores in the light of the Trump campaign, America really has split apart into hostile camps, each one, in part through the use of social media, creating its own "echo chamber," largely deaf to the discourse of the other, and lodged, essentially, in its own construction of reality.

 

Second, when we situate the election into a larger system that includes the British Brexit vote and the rise of populist parties in Europe voicing similar complaints to those in America, we can find signs of the turmoils induced by demographic change in a highly unsettled global context.  Try as one will, there is simply no way of avoiding the racial component of these events, and pretty much every hope of having achieved a post-racial society in the wake of the Obama presidency has been dashed.

 

Third, Trump's success signifies a highly paradoxical rejection of neoliberalism—paradoxical because such rejections are commonly viewed as a preoccupation of the political left.  But alongside the Sanders campaign (which was explicitly a rejection of neoliberalism), Trump's rejection of the ideology of the global marketplace, which resonated so strongly with working-class voters, is itself a challenge thrown down before all of the global elites whose power and privileges owe much to the neoliberal order of things, no matter which side of the aisle those elites may sit on.  In short, this was a mighty challenge to America's socioeconomic elites—Republican and Democratic alike—led, paradoxically, by a member of the elite class himself.  But America has been here before, as when the uprising of Jacksonian democracy in the 19th century was conducted by a plantation-owning aristocrat.

 

Fourth, the election signifies just how important the Supreme Court has become in a society so divided that neither its Executive nor Legislative branches can govern any more.  A lot of voters (especially Evangelicals) swallowed their disapproval of Trump's personal life to vote, essentially, for future Court justices.

 

Fifth, and finally, the election has illustrated a point that I often make to my students about the difference between sociology and cultural semiotics.  Both fields, of course, analyze human society, but while sociology relies very much on the measuring of human behavior and consciousness via quantitatively constructed surveys, semiotics simply takes the actual behavior of people (what they do rather than what they say) as evidence.  The failure of pretty much all of the polls to predict what happened despite all of their surveys and quantitative data (just as the pollsters failed in the Brexit vote) indicates that people can be very chary about what they say about their beliefs, especially in the case of this election in which support for Trump was socially frowned upon.  After all, it wasn't only uneducated working-class voters who supported Trump, and the pollsters missed that.   And since semiotics is an interpretive, not a predictive, activity, we can now see just how much louder actions have spoken than words in this election.

Of course it was inevitable that I should turn my semiotic eye this time around upon one of the most significant events in popular cultural history: the awarding of the Nobel Prize in Literature to Bob Dylan.  But the question is not whether Dylan deserved the prize (I really really don't want to go there) nor even whether songwriters should be equated with musically-unenhanced poets; no, the semiotic question is, quite simply: what does this award signify?

 

Let's start with the fact that I am discussing this at all.  How, one might ask, did it come to pass that the posthumous legacy of the Swedish inventor of dynamite should come to be not only the world's most prestigious award, but should also have bequeathed to a small, and rather secretive, committee in Stockholm the power to create and even influence history?  For that is what the prize does: it plays a significant role in determining which scientists, economists, and writers will be most remembered and whose work will be given most authority, and it also, through its Peace awards, has a way of intervening in ongoing human conflicts and, as in the case with the award to Barack Obama, electoral politics.

 

It is also worth noting (and this should be especially poignant for scholars) how the Prize also has a way of indicating what really counts in human intellectual endeavor: physics, but not mathematics; medicine, but not biology; chemistry, but not engineering; economics, but not political science; literature, but not painting, music, or sculpture; and nothing in the way of scholarship—not history, nor anthropology, nor literary criticism, nor even philosophy (which is why Bertrand Russell was awarded the prize for literature).

 

So let me repeat, how did the Will, and will, of one man from a rather small country accomplish this?

 

I can't answer this question entirely, but I can offer some suggestions.  First, it is useful to note that the Prize came into existence just on the cusp of the final transition from feudalism to capitalism.  For where science and art were once the retainers of Crown and Church, whose patronage alone was sufficient reward for early scientists and artists, in the capitalist era individual enterprise and competition are the motivators for human endeavor.  (It is striking to note in this regard that the Nobel Prize was created by a wealthy industrial capitalist, but the award is handed over by the King of Sweden.)  Competition is what prizes are all about, and as we head further and further into the era of hypercapitalism, we accordingly get more and more competitive awards: more Oscars, Grammys, Emmys, Tonys, Pulitzers... the list seems endless.

 

Thus, we might say that the Nobel Prize got there first, was, that is to say, the first arrival in the bourgeois era of competitive achievement.   Itself the title holder in the Most Venerable Award sweepstakes, the Prize is a signifier of capitalism's worship of whatever is biggest and "best," turning even art and science into a contest—with all the "winners" and "losers" that contests entail.

 

Which takes me to the second signification I see in the Dylan award.  For by giving the prize to a superstar of popular culture, the Nobel committee has not only given its vastly influential imprimatur to a once marginalized region of human creativity, but has also signified that the ancient wall between "high" culture and "low" really is tumbling down.  (I've been saying this for over twenty years in every edition of Signs of Life in the U.S.A., so I ought to be grateful to the folks in Stockholm for putting some authority behind it.)

 

But having really, shall we say, dynamited the last remnants of high cultural ascendancy over low, the members of the Nobel committee may have opened a floodgate that they did not anticipate.  For now a host of songwriters, screenwriters, TV script writers, and goodness knows who else that the culture industry has made rich and powerful will come knocking at their door.  Having everything except a Nobel Prize, they will likely be found lobbying, imploring, schmoozing, advertising . . . in short, going through the whole playbook of competitive award seeking to gain the one trophy missing from their collections.

 

I can see it now: laureates on the red carpet.

Jack Solomon

Creepy Clowns

Posted by Jack Solomon Expert Oct 20, 2016

So now it has come to this:  Ronald McDonald is on administrative leave.  And the funny thing is that I can assume you already know exactly what I am talking about.  Yes, the creepy clown invasion: the next best thing since zombies.

 

Because that's really what it's all about: people, as Halloween approaches, looking for the latest in camped-up pop cultural horror.  Not that creepy clowns are going to have anything like the staying power of zombies (this is a fad, not a trend), but the clowns bear a family resemblance to these popular humanoid monsters (vampires belong to the clan as well) and appear to be part of a larger fascination with the macabre in contemporary American culture.  Except that there's a lot more to it than that.

 

Because unlike zombie walks and costume vampire fangs, the creepy clown phenomenon does not have its origins in pre-existing stories and entertainments.  Vampires are an ancient part of our lore, and even zombies (of the walking dead variety) have a lengthy genealogy.  Creepy clowns, by contrast, are a newly minted product of the instant-fad-creating potential of the Internet.  And more importantly, unlike vampires and zombies, they're real.  Zombie walks shouldn't frighten anyone.  Vampire events are pure camp.  But clowns done up in fright makeup, coming at you in the dark with real knives, are quite something else.  A certain amount of not-so-make-believe terrorism is going on here, so the semiotic question must be: what on earth does this signify?

 

It's best to begin at the beginning with such a question.  So, the whole business apparently started in August, down in South Carolina, when someone in some sort of clown suit was spotted trying to lure children into the woods.  This may have been a "prank," or a real case of predatory pedophilia, but the key to the matter is that it got reported, and the report went viral.  In no time, it seems, doing oneself up as a maniac clown became the prank of the town.

 

Creepy clowns, then, are signifiers, at least in part, of the enormous power of virtual technology to stimulate actual behavior—a kind of postmodern case of "monkey see, monkey do" on a truly mass scale.  You know, "I saw it on the Internet so it must be cool."

 

Not that faddish behavior itself is anything new, of course, especially in a mass consumer society.  Hula hoops, Cabbage Patch dolls, pogs, the original Pokemons, mutant ninja turtles: all of these instances of what I shall call "sudden mass hysteria syndrome" percolated throughout America (and the world) without benefit of social media.  The Internet just makes the process a lot faster, generating an endless stream of ice bucket challenges, twerking events, flash mobs, and, yes, creepy clowns.

 

But, as in any semiotic analysis, we must look at the crucial (one might say diacritical) difference that sets the creepy clown fad apart from other such fads, in order to arrive at its most profound significance.  And this difference can be found in the really sinister nature of the thing.  Confident in the anonymity that a mask provides (there is a compelling connection here to the phenomenon of anonymous online trolling), the prankster-clown is genuinely frightening people.  In an era of daily terroristic threats, and when parents (alas, for good reason) no longer allow their children to go trick-or-treating unaccompanied, this is no joke.  The fact that a growing number of "clowns" think that it is only a joke, or do not even stop to think of the effects that their "fun" may be having on other people, is what is really significant here.  A lot of otherwise ordinary people in the digital era are apparently losing their capacity to empathize with the feelings of others.  Traditionally, this has been the hallmark of the psychopath, but there is a growing body of evidence that the Net is behind this new expression of social anomie, fostering what might be called "mass psychopathology."

 

Happy Halloween.

 

Source: Why Are You Laughing? by davocano on Flickr, used under CC-BY 2.0 license 

I've been reading James Knowlson's big biography of Samuel Beckett, Damned to Fame—an experience that not only transports me back over forty years to the days when I was writing my undergraduate Honors Thesis in English on Beckett, but also sets me to contemplating again the relationship between "high" cultural creation and "low," or popular, culture.  While Beckett's incorporation of such popular cultural materials as vaudeville-style slapstick and Charlie Chaplin's tramp into Waiting for Godot undoubtedly helped to erode the traditional barriers between high and low culture, his own lifelong devotion to the highest of the elite arts (classical music and literature, philosophy, and fine art) also comes through very powerfully in the story of his life.  Though in rapid decline even within his lifetime, the "cultural capital" of high art still stood for something in Beckett's formative years in a way that is almost unimaginable in an era when the Oscar, Emmy, Grammy, and VMA awards (etc.) are effectively our society's supreme expressions of esthetic taste.  And this, paradoxically, is why the semiotic analysis of popular culture is itself an activity of high cultural importance; for if we are to come to an understanding of who and what we are as a society—which is one of the more profound aims of esthetic creation—we have to look at what really matters to us.  And for some time now, what really matters has been pop culture.

 

In saying this, I am going against the grain of such cultural theorists as Lucien Goldmann, who believed that social knowledge comes through the study of "high" cultural creation.  Perhaps that was once so; it certainly isn't the case today, however, when traditional high culture is on life support.  While there has never been a mass audience for the elite arts, what has changed has been the economic basis of esthetic creation: the centuries-long shift from a system of aristocratic patronage to one of commodity capitalism in a market economy.

Chaucer, that is to say, paid the bills by living in the palace of John of Gaunt, and Michelangelo sought commissions from the Church.  Today, "high" art poets must seek out teaching positions to survive because poems have little commodity value, and painters hope for the kind of awards and critical reviews that will attract wealthy speculators to their work in a kind of fine art stock market. 

 

An apprehension that the economics of artistic production was changing everything was behind the rise not only of Modernism, but of Romanticism as well, as artists began to feel alienated from their audiences—no longer coteries of patrons and friends, but a mass market of anonymous consumers—and so, in defiance, they turned away from seeking popularity to create generations of avant-garde art that only helped to reduce what audience for high art ever existed in the first place.  The result has been the creation of what I have called a "museum culture," as high art has retreated to ever more beleaguered bastions of cultural preservation, while popular culture, with its seemingly limitless market potential, has flourished.  (I know, you may have attended the opera recently, or a symphonic performance, and you may spend your free time rereading War and Peace rather than The Arkham Asylum, but even so, you cannot have missed the signs that those are unusual choices today.)

 

Cultural semiotics doesn't complain about this shift in cultural tastes (history, after all, is history), nor does it attempt to apply the critical standards of high art to works of mass culture.  Rather, taking as its basis the recognition that cultural production in a market environment will produce what the market desires, cultural semiotics analyzes that desire itself, seeking its significance.  For therein lies the consciousness of our society, the revelation, finally, of who and what we are.