
October 28, 2015

No showering at school in cocooning times

Looking through old horror movies, it's striking how many shower scenes there are, from Psycho through the Friday the 13th series. Some movies dialed up the vulnerability factor by making the bather's exposure public, setting the scene in the showers of a school locker room.

Then it hit me why you don't see these kinds of scenes anymore -- students haven't showered in the locker room for decades, so the setting wouldn't resonate with the intended audience of horror movies (teenagers and young adults). Carrie (1976) has a classic shower scene that anyone who went to high school in the '70s could have identified with. When Carrie was remade in 2013, the shower scene was kept, but it must have felt forced and unfamiliar to adolescents of the 2010s.

Of course, there were public shower scenes in genres other than horror -- Porky's, Footloose, Heathers, and others -- but those are also from a while ago.

To establish that these changes in pop culture are reflecting changes in the real world, I went Googling and found three articles from major newspapers, all in 1996: the first one from the Chicago Tribune, a follow-up from the NY Times, and yet another from the LA Times. The reporters interviewed students and gym teachers from various schools in their metro areas and were unanimous that high schoolers had stopped showering after gym class and extracurricular sports.

We can tell the change was abrupt because the gym teachers were in a state of disbelief, rather than saying that the shift had been gradually under way for a while:

The antipathy to taking showers after gym class puzzles some teachers and coaches. "These guys don't want to undress in front of each other," said John Wrenn, a teacher at Homewood-Flossmoor High School in suburban Chicago, who can scarcely conceal his contempt for the new sensibilities. "I just don't get it. When I started in '74, nobody even thought about things like this. The whole thing is just hard for me to accept."

There was a court case in the Pittsburgh area in 1994, where the ACLU sued a high school to end public showering, arguing on privacy grounds (a fat chick didn't want to be embarrassed). The lawyer said he'd never received so much spontaneous positive response from around the country, meaning that this court case did not change the climate by itself but was merely reflecting the larger changes in attitudes that had already occurred throughout the country.

That's the earliest example that I could find, and it sounds about right from personal experience. I don't remember anyone using the showers in high school (fall of '95 to spring of '99). In fact, although I can recall what the locker room looked like and where the main doors and the staff office were located, I can't even remember where the showers were placed or what they looked like. My friends and I would just spend the extra 5-10 minutes of "shower time" shooting the bull.

It was like that in middle school, too (fall of '92 to spring of '95). Some of the metal kids went into the showers with their normal clothes on, and started headbanging once their hair had gotten soaked, but it was just a goof. Rarely, somebody would go in for real with their shirt off, but actually taking a shower in public -- never happened.

Here is a thread on Snopes.com from 2004 asking whether showering after gym class was a real thing or only something they showed in the movies. People who went to secondary school in the '60s, '70s, '80s, and early '90s all swear that it was real, and often enforced by the gym teachers. Nobody chimed in to say it was practiced in the mid-'90s or afterward. A more recent article from the Sun-Sentinel in southern Florida confirms that the no-shower trend has continued through the present, and was not just a phase during the era of grunge music, greasy hair, and heroin chic.

The timing of the rise and fall suggests a link to the outgoing vs. cocooning cycle, and sure enough I could not find reports of public showering at school being common in the first half of the 1950s or earlier -- i.e., the pre-Elvis cocooning era.

People in outgoing times are simply less self-conscious about their bodies because sociability requires higher trust levels than cocooning does, and trust means being willing to let yourself be vulnerable in public around other members of the in-group. In a trusting person's mindset, none of the other kids in the showers are going to try anything harmful, so why bother worrying about them? In the suspicious person's mindset, you can never know who's going to try to do you wrong when you're vulnerable, so it's best to just keep your guard up all the time. Certainly that means never uncovering your shame around others.

Only when people start withdrawing into their own private little worlds do they start to obsess over privacy and act compulsively about it. It's not a mere personal preference when virtually 100% of students would suffer an anxiety attack just from undressing in the locker room, let alone from showering in front of everyone. To students of the past 20-some years, it would feel more like child abuse to make them vulnerable to the attention of their peers, all of whom they distrust, so that uncovering their shame in front of them would be the ultimate humiliation.

Not showering in public is not the only case of increasing anxiety about showing one's body in cocooning times. See this earlier post on the absence of flashing, streaking, skinny-dipping, and topless sunbathing (even in France) within Millennial-era culture. See also this related post about how young people stopped joining the nudist movement decades ago, making their membership increasingly saggy and gray-haired.

This wide variety of examples shows the importance of distinguishing between displaying your body because you trust other people not to do you wrong, and displaying it to whore for sex-appeal attention. The streakers of the '70s were not trying to increase their number of Instagram followers as an ego boost, but to put out an extreme form of the basic message: "Hey man, what is there to be worried about or ashamed of? We all trust each other, don't we?"

Likewise, today's skin-baring young people would still drop dead from anxiety if they went out in public with no bra on, or if they went skinny-dipping with a group of friends. The code of "If you've got it, flaunt it" somehow doesn't translate into public showers, where the good-looking girls could easily one-up the uglier girls. But that would require them to be vulnerable in front of a crowd of people they don't trust (their fellow students), so it's a non-starter.

Calling these changes "the new modesty," or whatever, would be foolish, given how narcissistic young people have become. "The new awkwardness" is more like it.

October 11, 2015

Why can't they make good horror-comedies anymore?

Hollywood is going to take another stab at the Christmas-themed horror movie -- a genre on hiatus since the 1980s -- with Krampus. Unusually, it is also a comedy, an attempt to mix contrasting tones. See the trailer here, and a long list of horror-comedy movies here for comparison.

In Krampus, they're going for a mash-up of Christmas Vacation and The Evil Dead, but the tone that comes off in the trailer is clashing and discordant rather than balanced or blended. Perhaps it's better executed in the full movie, though tone usually comes across fairly reliably even in a trailer.

At least it's not the standard approach to horror-comedy of the past 20 or so years, where the horror is meant to be taken somewhat seriously, and the comedy comes from self-aware positive responses to the horror -- "Isn't it hilarious how gory the killing is, and how over-the-top the plot premise is?!" Meta-commentary winking at the viewer, who is in on these in-jokes, is not very funny to begin with, let alone when the audience is beaten over the head with them throughout the whole movie. When they're the sole source of humor, the attempt at comedy fails.

Zombie Strippers is a perfect example of this failed approach to horror-comedy, though I only single that one out since I caught it on TV last Halloween season. There are dozens of others like it that I've caught bits and pieces of on late-night TV since the '90s.

Attempts from the '70s and '80s were not quite so bad, when the self-aware humor was open and campy rather than unstated and smug. The Rocky Horror Picture Show and Little Shop of Horrors are way too campy to feel like real horror movies -- they're more like comedy movies set within a horror-inspired narrative.

I'll admit that there may be a genuine exception in Re-Animator from 1985, though. The characters are played as genuine eccentrics, not campy caricatures hamming it up. The overall atmosphere is likewise not deliberately exaggerated, but feels genuinely surreal and absurd. It doesn't feel like the whole movie is one great big in-joke and winking at the audience. In this way it's like Twin Peaks, another classic that's sui generis in terms of tone, blending and alternating all manner of dark and light emotions.

The scarcity of examples of the surreal approach to horror-comedy stems from how hard it is to hide the deliberate hand of the creator while portraying such an absurd world. Something so absurd makes the audience suspect that the creator is just yanking our chain, and we can't slip into suspension of disbelief. Most writers, directors, and actors simply lack the poker-face discipline to present such an absurd world sincerely and play it straight.

More typical is the approach that Krampus follows, where horrific and comedic tones alternate and contrast with each other. Horror movies are about the supernatural or paranormal destabilizing the usual order of things, so much so that it disturbs or even frightens the audience. Comedy could be worked into this framework as a sense of humor that gets the victims through such a dangerously disordered world -- a case of gallows humor. That tends to veer too close to the self-aware approach, though, since the characters are voicing what the audience is already thinking: "Hey fellow character, isn't it sickly hilarious how screwed-up our situation is?!"

This style, where humor alternates with horror as comic relief to terror, hasn't been tried in a long while, and the movies that did try it were not very successful as either horror or comedy -- Fright Night, The Lost Boys, The Witches of Eastwick, Arachnophobia, Buffy the Vampire Slayer, etc., all from the later '80s and early '90s. That doesn't bode well for the revival of the style in 2015.

The only sure-fire way to incorporate comedy into horror is to blend the two tones rather than alternate them. Somehow the evil beings themselves have to be funny while wreaking havoc, instead of comedy contrasting with horror. The natural choice, then, is to make the evil beings an example of the trickster archetype, a preternaturally mischievous being whose violence takes the form of pranks. Who can suppress their laughter when someone pulls off a great prank, no matter how much their victim may be hurt by it?

Since the trickster is the source of both the horror and the comedy, the tones blend better and cohere better in the audience's mind. And they don't feel so bad laughing at violence if it is not cold-blooded, calculated, and purposeful. The trickster is not a serial killer -- he's an anarchic life-of-the-party type of guy.

Laughing at purposeful violence feels like taking sides in a dispute, and agreeing or identifying with the monster. Laughing at off-the-cuff and indiscriminate violence, however, pardons you from choosing sides. The source of danger is more like a natural disaster roaming around unpredictably, rather than a purposeful actor, and those who get in the way are more victims of bad luck than targets of malevolence. Laughing at the unfortunate victims of a trickster's pranks is therefore a type of schadenfreude. No trouble making that blend of horror and comedy work, in principle.

At the same time, times change in how willing the public is to encourage the trickster to let loose and give us viewers something both funny and a little terrifying to behold. It's hard to think of a purer example of a "bad peer influence" that parents would not want their children to hang out with, even if only in pop culture form. And horror movies are directed primarily at those who still scare easily, namely children and adolescents.

Ever since helicopter parenting took off during the 1990s, this trickster approach to horror-comedy has bitten the dust. But it was very popular during the nadir of parental supervision, back in the '80s.

The most financially successful example is Gremlins, which was the fourth-highest grossing movie of 1984 and, like Krampus, was set during the Christmas season. Just think about how long the odds were against its success -- a horror movie set during Christmas, and blended with comedy throughout. That's a fine line to walk in writing the script and acting out the characters, as well as in designing the monsters and bringing them to life.

Casting the monsters as tricksters made it easy to incorporate humor into their very look and feel -- they can look a little cartoony without detracting from their menace, since they aren't portrayed as a serious and sublime evil. In a movie like Krampus, where the monster is designed to look frightening in itself, every appearance of the monster only adds to the problem of comedic and horrific tones interrupting each other, with the audience's brain shutting off from too many emotional switches back and forth.

Gremlins spawned a host of imitations -- Critters, Ghoulies, Killer Klowns from Outer Space, Leprechaun, etc. These don't work as well as the original since the monsters are not as adorable and commit much more gruesome violence. But they work well enough to watch if there's nothing else on late-night TV.

But don't expect a successful revival of the horror-comedy genre until helicopter parenting goes into retreat, and parents won't mind their children laughing at the violent and terrifying pranks of supernatural or paranormal tricksters.

September 28, 2015

Millennial memories of adolescence: Digital isolation

An earlier post looked at how Millennials get nostalgic for not having a life during childhood. Almost all their memories revolve around mass media and the virtual rather than the real world -- TV, movies, video games, and so on. Very little music, clothing, fads, or toys -- especially ones that required you to be playing outside.

Public environments for social interaction with peers, like the mall, the bowling alley, or the video game arcade from the '80s, are completely absent. They grew up when cocooning and helicopter parenting had really gotten going, so what else are they going to remember? Poor kids.

These themes continue into their memories about their adolescent years. Now that we're getting farther and farther away from the 2000s, it's possible for them to reflect on what it was like. What do they come up with?

Here is a recent BuzzFeed video with over a million views and over 3,000 comments, about "Memories from the early 2000s". Nearly every item is about technological devices and the internet.

You can find more focused lists on clothing from the 2000s, music of the 2000s, and so on, but when you leave their memories open-ended, all they ever recall is which form of technology they were using to socially isolate themselves at the time -- was it Instant Messenger, multi-tap texting, MySpace, etc.?

Even the two items about music are not about the music itself -- which bands they were into, or which genres were popular -- but about the technology used to store it (burning mp3s onto a disc, and long download times for filesharing).

Not only is there no mention of activities that you do in-person with other people, there is no awareness of the broader outside world -- 9/11, American flags everywhere, Islamic terrorists, etc.

We can dismiss the temptation to blame all this on the internet, since Gen X was using the internet back then too -- more so, in fact, given that we were older and had our own computers -- yet our memories of the early 2000s don't all come back to the digital devices du jour.

We remember how terrible Nickelback & Co. all sounded, not how long it took to download their mp3s. We remember whale tails (thongs + low-rise jeans), not which reality TV stars made it their signature look. And of course we remember 9/11 and its aftermath.

Millennials' social isolation began long before they were on the internet and using cell phones anyway. Recall the earlier post about their childhoods. Back in the '90s, it was Nickelodeon TV shows, Disney movies on VHS, and N64 or PlayStation video games -- still mass media used to distract and anesthetize their brains while they were cooped up inside the house all day, every day.

Another 10 to 15 years down the line, Millennials will have the same deprived memories about their digital-only lifestyles during young(ish) adulthood -- the buzz you felt from getting likes on Facebook status updates, those annoying ads before the video loaded on YouTube, posting pictures of your lunch to Instagram, "damn autocorrect!" etc.

If 9/11 barely registers in their memories of the 2000s, I'm sure that the first black President, gay marriage, etc., will evaporate from their memories as well. It's all just been a series of distractions for a socially isolated generation in search of one novelty after another to alleviate their perpetual boredom.

It's truly amazing -- an entire generation with no memory of real life. Even more bizarrely, they have no memories of "the good old days" because technology keeps improving, and that's all that counts to them. Digital heroin keeps getting more potent, cheaper to score, and more efficiently transmitted.

Parents beware: this is the outcome of digitally bubble-wrapping your children out of overblown paranoia about what'll happen if they have a social life.

June 8, 2015

Gay take on erotic thriller genre reveals mental illness at root of homosexuality

Here are some comments I left at this review of a movie, Stranger by the Lake, that tries to homo-fy the erotic thriller genre. It touches on many aspects of the gay deviance syndrome -- Peter Pan-ism, power and humiliation, extreme negligence of personal health, etc.

The basic plot of the movie is that the main character is so cock-hungry for a man whom he has secretly witnessed killing a previous lover that he keeps coming back for more, despite the obvious danger to his own life. The tension arises from him, and us, not knowing whether the next time will be his last.

But the similarities with normal erotic thrillers are only superficial. The gay version is marked by its fundamentally abnormal psychology.

- - - - -

“The film also won the Queer Palm award.”

I see they’ve gone and re-branded the Palm on the Underage Ballsack award. Is nothing sacred anymore?

About it being an erotic thriller — not at all, and its failings reveal the profoundly warped nature of male homosexuality.

In an erotic thriller like Basic Instinct, the tension arises from the male protagonist's curiosity about a woman who seems as capable of violence as a man, and from his wanting to square off against her toe-to-toe. Her violent tendencies intrigue him rather than frighten him -- it's not like she'd actually be able to take him on.

He likes to scratch; she claws his back. He half-rapes his girlfriend; the killer babe ties him up in bed and aggresses against him. Worthy fucking adversary.

That's the erotic thriller: an alpha, or usually a wannabe alpha, male seeks his thrills by competing against the femme fatale, uncertain which combatant will ultimately one-up the other for good. It's for the guys who get a rush from taunting a girl to "hit me with your best shot, honey."

The fraidy-cat twink in this queer-directed movie doesn’t play that role at all. He doesn’t see the killer as his equal, nor does he want to get his kicks from jockeying for position, as it were. He’s frightened by him, realizes he could be the next victim, but is so empty and desperate that he’ll pursue a quick fix at any cost, having to block out those rational fears for the couple of minutes it takes for the killer to blow his diseased wad up the sissy’s butt.

So, completely opposite of the contest between equals in the erotic thriller, the gay killer fantasy is based on one of them having total power and the other showing total submission, perhaps to the extreme of being killed by the other.

Dudes fantasizing about sexually wrastlin’ with women is shameful, but it’s not a sign of being severely fucked in the head. Gay fantasies, on the other hand, always reveal profound mental illness. The winner of the Queer Palm award is trying to romanticize what is soulless, and to aestheticize what is disgusting and ugly.

Thus it’s possible for the protag in an erotic thriller to be tragic, his downfall stemming from arrogantly tempting fate by daring the femme fatale to take off the kid gloves and hit him for real. I don’t know of an example that actually tries to make him tragic, let alone succeed at it, but at least it’s possible, and the basic idea comes across in any good erotic thriller, like Basic Instinct.

The victim in the gay killer fantasy flick is not brought down by any kind of hubris, but by an extreme form of negligence. He knows full well how violent the other guy is, how likely he is to wind up as his next victim, but he’s just gotta have his cock fix.

That’s no more tragic than some junkie continuing to shoot up knowing damn well what the substance will ultimately do to him. It’s pathetic, disturbing, and makes a normal person want to lock him up in a supervised facility where he can no longer harm himself.

We don’t respond that way to the arrogant tempter of the femme fatale — arrogance implies a certain degree of maturity, so it’s his own fault that he got killed by the psycho (“I tole you dat bitch crazy”).

But pretending that a real and imminent danger will somehow magically go away is just infantile. Our reflex is that this person isn't totally responsible for what's happened to them, because their mental development has been arrested or retarded.

We don't get satisfaction from seeing them meet their demise -- satisfaction in the sense of righteous vindication. Maybe we're generically sad, maybe we're just glad the junkie has kicked the bucket and won't be around to bother us with his self-destruction any longer. Either way, there's no happy ending to the gay killer movie.

May 23, 2015

Another cosplay fanfic approach to music videos ("Bad Blood" by Taylor Swift)

Earlier we saw the empty and jarring results of the cosplay fanfic approach to making music videos in "Fancy" by Iggy Azalea, where a character from the original movie Clueless is aped by a singer whose persona is the exact opposite.

Now there's the video for "Bad Blood" by Taylor Swift. It mimics Mission Impossible, Charlie's Angels, The Matrix, The Fifth Element, Tron (the Daft Punk one), Sin City, and pretty much anything else that the director got a boner to over the past 20 years.

Does Taylor Swift's persona lend itself to a femme fatale / film noir role? Of course not: she's an awkward virgin who only "dates" high-profile fags. Nothing seductive or man-eating about that. Ditto for the other girls and women, who are self-consciously striking poses like cosplay attendees at a nerd convention.

Does anything in the song's lyrics lend itself to an over-the-top apocalyptic spectacle? Nope: it's about some gay little tiff that she and some other frenemy have gotten into. Only to middle schoolers is that the end of the world as we know it.

Since no one wants to do anything cool and new anymore, we can expect to see more of this approach -- throwing together a bunch of references and allusions to pop culture that the audience has already masturbated to. Only now they get to masturbate to it in an unexpected setting -- a Taylor Swift video, a Family Guy episode, a new Star Wars movie, etc.

Rather than add to the variety of things you enjoy, the point here is to multiply and maximize the masturbatory value of the things you already like -- to obsess over them, over and over again.

Pop culture is quickly becoming one great big breath of stale air.

May 15, 2015

Drawing generational boundaries from slang and other meaningless traits: An evolutionary view

The standard intellectual approach to defining generations is to lump together those individuals who all went through some key event or series of events during the same stage of their life. The point is that what shaped the members of a generation, and what binds them together, is meaningful — growing up in Postwar prosperity, as children of divorce, as digital natives, etc.

In an informal setting, though, the shared traits are not so meaningful. What's your slang word for "very good" — is it "peachy keen," "groovy," "sweet," or "amazing"? (Those are for the Silents, Boomers, Gen X, and Millennials.) What were the popular colors for clothing when you were in high school? What are your favorite pop songs of all time? Who was your first huge celebrity crush?

These are not the kinds of people-molding forces that social scientists propose.

Sometimes the two approaches will draw similar boundaries — if you were a child of the Postwar prosperity, and born during a time of rising fertility rates, you are also familiar with the phrase "groovy," at some point in high school you wore orange and blue on the same day, one of your favorite pop songs is "Happy Together," and you had a huge crush on Mary Ann or Ginger.

But other Powerful Societal Forces don't act on people for a limited window and produce a tightly bound generation that underwent their effects. They are long-term trends, along which people born in one year or another simply came of age during an earlier or later phase of a single ongoing process: suburbanization, ethnic diversity, mass media saturation, and so on.

Gen X and Millennials, for example, don't look like distinct generations when you look at those three factors — they look fairly similar to each other, and different from the Greatest Gen, Silents, and even Boomers. And yet their group membership badges are totally different — slang, "you can't listen to that, that's our song", or tastes in food (way more toward Mexican and Chinese slop among Millennials).

When the X-ers and Millennials vote against the "boo taxes" government of the Silents and Boomers, it will be a case of politics making strange generational bedfellows, not similarly shaped generations on either side warring against their antitheses.

The choice of whether to carve out groups based on a meaningful or an arbitrary trait shows up in evolutionary biology and historical linguistics. Both fields prefer the shared traits used for grouping to be arbitrary. If lots of individuals share something seemingly arbitrary, it's probably because they come from a common origin where that just happened to be the norm. That's why geneticists look at neutral DNA to determine ancestry.

If lots of people share something meaningful, like dark skin, that could be due to similar pressures acting on two unrelated groups -- Africans and New Guineans both evolved dark skin as an adaptation to tropical climates, despite being only distantly related genetically.

Likewise in the history of language, if two groups share a word for something functional like "internet," that could be because two unrelated groups both adopted the phrase when they adopted the technology, in recent times. If they share a word for the number "four," that can't be chalked up to similar pressures making the groups speak similarly. They must descend from some common ancestor where the word for that number just happened to be pronounced "four".

Aside from the theoretical motivation for using arbitrary traits, they're also the most palpable in real life. If you overhear someone in line at the store saying they agree with gay marriage, that is most likely to be an airhead Millennial, but could very well be an X-er or even a Boomer. Ditto if they make an offhand joke about their parents' divorce, their student loan burden, and the like.

But if you overhear your fellow supermarket shopper say, "Listen, they're playing "Footloose" — isn't this song suh-WEET?!" then they're definitely Gen X. If it's, "Oh my gosh, I'm like obsessed with "Blank Space" — not gonna lie, that song's actually kind of amazing!" then they're literally definitely Millennials.

The sociologist's large-scale impersonal forces make individuals similar, but not necessarily in a social dynamic way that cements group membership. Children of prosperity turn out this way, children of austerity turn out that way, whether they ever interacted with other members of their group to create a shared culture.

It's the seemingly trivial stuff that serves as the shibboleths, food taboos, folk tales ("urban legends"), and totem animals that distinguish Us from Them. Those things became popular only because individuals accepted them, rather than any of their alternatives, to signal which group they belonged to. They get closer to what creates a community, beyond what merely creates a group of similar individuals.

May 1, 2015

Cosplay remakes and the uncanny valley (video for "Fancy" by Iggy Azalea)

The cosplay fanfic approach of the new Star Wars movie will strike normal people as weird and off-putting, though in a way that's hard to explain. A gut revulsion suggests a role for disgust, rather than a conscious list of reasons why it looks bad.

I still couldn't put my finger on what is (mildly) disgusting about it, so I looked for another example of the cosplay fanfic approach to pop culture.

Here is the music video for "Fancy" by Iggy Azalea, the song of the summer for last year, with over half a billion views on YouTube. Its set design, locations, clothing, hair, and plot vignettes are ripped from the 1995 movie Clueless, probably the last coming-of-age teen movie with likeable characters. Yet everything about the words, intonation, facial expressions, body language, and general attitude of the girls in the music video is the polar opposite of the characters in the movie.

In Clueless, the protagonist Cher is a well-meaning ditz who occasionally bumbles in her nurturing attempts at playing matchmaker. (The movie is based on Emma by Jane Austen.) She tries to make over the new student Tai, a free-spirited, socially awkward naif who becomes more savvy and popular, gets too big for her britches, but ultimately reconciles and acts humbly around her friends. They show a basic concern with doing right by others in order to fit in. They want to be liked and accepted into a group, not to be worshiped by fans and feared by haters, both groups being socially distant from the diva at the center of attention.

See the trailer here, although it focuses more on dishing out one-liners than establishing character traits.

Fast-forward to Iggy Azalea and Charli XCX aping Cher and Tai in the "Fancy" video. Both are hyper self-aware pose-strikers, unlike the ditzy and spacey characters from Clueless. Their attitudes are smug, bratty, and decadent rather than uncertain, seeking to please, and wholesome. They're self-aggrandizing and condescending rather than other-regarding. They aspire to being distant divas and icons, rather than friends accepted into a clique. And they give off an overly sexualized persona, whereas the appeal of the original characters was not simply to gawk at their ass and thighs.

The contrast for anyone who remembers the movie is so harsh (way harsh, Tai) that it creates an uncanny valley reaction, where something falls between two familiar opposites and leaves the viewer disturbed. Most CGI human beings provoke the same response -- neither human enough nor robotic enough, but more like a freak of nature.

It gets worse. Seeing actors play totally against what we associate with their clothing, environment, and overall zeitgeist leaves us asking, "What happened to the real people who wore those clothes? Who went through those vignettes? Who lived in that place?" It feels like the impostors are not just try-hard wannabes, but body-snatchers who have killed what is familiar and replaced it with something alien. It's like that scene in The Silence of the Lambs where the he-she serial killer dons a wig and twirls around in his lady-flesh-suit.

Iggy Azalea has killed Cher from Clueless and is wearing her skin.

Earlier examples of LARP-ing in popular culture at least tried to remain as consonant as possible with the original -- Grease, Back to the Future, Forrest Gump. Now the point is simply to body-snatch the sympathetic original characters and assimilate them into the loathsome present, like some kind of pop-cultural Borg. It is appropriation not out of affection and nostalgia, but simply to claim more and more territory of the good old days for idiotic, imperial trends.

I know -- BFD if it's some throwaway music video. But remember that this is what's going to unfold during the entirety of the new Star Wars movie. And it will only grow from there: the decision of the Star Wars brand sets a binding precedent.

Earlier remakes and reboots tried to distinguish themselves from the original by using a different visual style, exploring other parts of the narrative and character development, and so on. Always boringly, but at least they were different. Now the rehash movies are going to move into cosplay mode -- "It looks just like the real thing!" (don't ask how it tastes, though). Expect pop culture to get even more off-putting in the near future.

April 27, 2015

Movie trailers as serial drama (STAAAAARRRRR WAAAAAARRRSSSS)

On the last episode of "Agnostic reacts to Star Wars trailers," we learned what the new trilogy will amount to -- a cosplay fanfic sequel for Millennials.

And now that they've released the next installment of "Trailers for That New Star Wars Movie," that assessment is certain. You can almost see the Millennial in stormtrooper costume walking up to Harrison Ford and nervously asking for his autograph. I wonder whether that'll be relegated to a making-of sequence during the credits, or be included in the main narrative itself.

("Gee Mr. Solo, you're some legend around these parts... It sure would do me the honors if you'd, uh, do me the honor of signing my toy lightsaber!")

I still don't know what the hell the movie is going to be about, but contemporary audiences don't want any SPOILERS whatsoever.

Trailers are no longer meant to reel you in on the first viewing. They have become a serial drama form unto themselves. The first reveals a tiny bit, and leaves the audience on a cliffhanger. The next one recaps the last one (barren desert landscape, speeder bike battle, lightsabers), but reveals a little more (Vader helmet, Han and Chewie, TIE fighter pilots).

Who knows how many more episodes there will be before the series finale -- the trailer that tells you what the hell the movie is going to be about.

Not following the hype cycle of modern movies, I was unaware of the trend of trailers as soap operas (gossip about them online when the new episode comes out!). I'm even more out of touch with video games, but their hype cycle is so huge that even someone who doesn't play them anymore may know about it. First there's a hint from the developers, then a spectacle teaser during E3, then a beta version, then a playable demo, and finally two years later, the actual game.

I remember when the movie trailer was a terse stand-alone format, and when new video games were announced once they were released, not years ahead of time.

But, that was back when people still had a life. Folks in outgoing times have too dynamic a social life to tolerate a serial format stringing them along and keeping them waiting. Soap operas were huge in the Midcentury, but were marginal by the '80s. Short film serials were popular at theaters in the Midcentury, but were likewise absent during the '80s, whose climate was similar to the Roaring Twenties. Only once the cocooning climate returned during the '90s did serial dramas return to mass entertainment, this time on TV.

They could have made a string of teaser trailers for movies back in the '80s, to be shown on TV commercials or in theaters, but they didn't. Those are a new development -- since when exactly, I don't know, although I have a hunch the Lord of the Rings movies had serial trailers.

Cocooners are bored out of their minds, so they crave a steady and regular fix of anything meant to wake them up. Previously, on "dissecting popular culture," we looked at entertainment as a mood stabilizer vs. experimentation, making the link to stabilizing vs. destabilizing types of drugs.

The stabilizing kind was popular in the Midcentury and has become popular again since the dawn of Prozac circa 1990. Ward Cleaver had Kellogg's Pep and Geritol, while his grandson has Monster energy drinks and Viagra. The destabilizing kinds, like LSD, are meant to be taken in stand-alone sessions, as though each trip went somewhere different.

Movie trailers have clearly joined the mood stabilizer family of entertainment. Life is boring, but don't worry, another teaser trailer for Whatever Part Four comes out next week. And don't worry, it won't contain any spoilers -- which would ruin the fix you ought to get from the next trailer after that one.

Spoilers may not answer every question about who, what, when, where, why, and how, but they do close off certain paths through which the trailer-makers could have strung you along. And now that the function of trailers is to provide a regular dose of stimulation to bored nerds, they no longer tell you what the hell the movie is going to be about.

March 29, 2015

Flashdance: Darkest lit mainstream movie ever?

I caught most of Flashdance on TV the other night, and was struck by how dark the lighting is. Not just a scene here or there, but the whole movie. See the gallery at the end of this post, and many more pictures in a post about Pittsburgh in film from What Price Glory.

The plot, dialog, and character development are nothing to write home about, but it is worth checking out for its look and sound.

Some scenes are evenly dim to convey the cloudy and dingy atmosphere that the characters are struggling to emerge from. On the other hand, many scenes have high-contrast lighting to make their lives look more stylized. This chiaroscuro can heighten the tone of romance or intimacy, as well as suggest the almost other-worldly nature of the nightlife environment.

Filmed in 1983, it represents a bridge between the gritty naturalism of the '70s and the stylized music-video look of the '80s. Saturday Night Fever, a similar movie from just six years earlier, doesn't feature many scenes with high-contrast lighting, strongly back-lit shots that turn people into silhouettes, smoke giving the light a hazy quality, and so on.

I don't recall any shots from Flashdance that are destined for the cinematography hall of fame, but I appreciated the effort to sustain a dark look from one scene to the next for the entire duration. It does give the movie a distinct sense of place and time, aside from being shot on location in Steel City during the recession of the early '80s.

The iconic shot of an exotic dancer in a prop chair being doused with water shows how much the movie's look and feel depends on dark and high-contrast lighting. You can find many re-creations of this shot on Google Images, but they all tend to have brighter and more even lighting, so that the girl doesn't look like a shadow in profile. It just looks like a cheesecake photo from any random lad mag. The shadowy look of the original shot obscures the details of her body, so it doesn't come across quite as pornographic as it would have with standard lighting.

It's rare to find such an unusual visual approach in such a popular movie (it ranked 3rd at the box office for 1983). Can anyone think of another hit movie that is so distinctly dark, for both interior and exterior shots, and for daytime as well as nighttime shots?

Here are 20 images that show how pervasive the dim, chiaroscuro look is throughout Flashdance.

March 23, 2015

'70s snapshot: Booze and drugs at a middle school hangout site

You ever wonder what the characters from Fast Times at Ridgemont High were like in middle school? Was little Jeff Spicoli already stoned out of his mind in seventh grade? It looks like he was.

The middle school age group isn't as exciting to study as high school, so there tends to be very little record left in the public imagination, only personal memories. But suburban archaeologists can still re-visit the original scene of the crime and see if there are any traces left of what the kids were up to way back when.

I always keep my eyes peeled when strolling around trails for signs of the good ol' days, especially when they're near schools, where young people would have been hanging out. The good weather this afternoon brought me to a wooded area behind a middle school and adjacent to residential housing.

I've seen plenty of relics from the wild times when wandering around near high schools, but those kids could have been near the age of majority. Middle schoolers should have had a lot more difficulty getting their hands on beer, pot, and the like. Then again, times were different back then.

As it turned out, there were at least half a dozen beer cans lying around the middle-school hangout. Every time, there's a brand so old I've never even heard of it -- this time it was Ballantine ("premium lager beer"). It was a can with the old pull tab, so it must have been from the '70s, and this label from 1977 looks like the one it had.

There was a Budweiser can still in colorful condition nearby, also with the old pull tab. There were a few push-in type cans by Coors, Milwaukee's Best, and Red Bull, though they all had the narrow mouth opening and anti-litter warnings on the top like you would have seen in the later '70s and '80s. If you find ones with the narrow mouth but no anti-litter warnings, those are from the '80s and '90s. You may never have paid any mind to these details of beer can design, but when you're trying to date a hangout site, they can give you a very close estimate.

Atypically, there were no clusters of cans of the same type, which would have suggested they all came from the same six-pack. It's not uncommon to find the remnants of an entire six-pack behind a high school or off of a trail where older teenagers and young adults would have gone. Near the middle school, I didn't find two cans by the same maker. My hunch is that each of the kids lifted a single beer from their home's fridge, or shoplifted a single can, or found a sympathetic older person to get them "just one" can of beer. That would keep the pre-high-school drinkers under the radar and account for the mismatched assortment of cans.

As usual, there were tree carvings with initials, dates, etc. "PARTYED HERE," read one of them, the misspelling an honest sign of the barely mature, and barely literate, make-up of the site's habitual occupants. A large graphic carving showed a bong with a stylized plume of smoke wafting up.

Unusually, most of the dates were from a narrow range -- the mid-to-late '70s -- whereas those near a high school would have run into the '80s and early-to-mid '90s. One kid left his full name and the three academic years he was at the school. (I'm not sure just how academic those years were for him, given the big-ass pot leaf engraving that he also left.) He was there from, I think, '74 to '77.

Few or no dates from the '80s -- that really struck me. The middle school opened in 1965 and is still packed with students. There were signs that students still walk through and around the area -- a recent wide-mouth lemonade can, bags from Lay's chips (chip bags degrade quickly and must be recent), a notice from the principal to parents dated December 2014, and so on.

This recent stuff looked like trash that somebody chucked to the side while walking straight through, though, not collections of cans or bottles that would have been left over from students hanging out for a little while before moving on.

The period when the middle schoolers actually made the site their own hangout area, drank beer, and occasionally got high, was a small blip in the 50-year history of the school. Who were those students? They would have been born mostly in the first half of the '60s -- the late Boomers who would go on to a Fast Times kind of high school.

Their wild upbringing shows up just about anywhere you look, and it has had lifelong effects: they have always been over-represented among the homeless, for example, whether it's at the beginning of the homelessness phenomenon in the '80s or today. Anecdotal reports from early Gen X-ers suggest that the late Boomers were way more heavily into drugs during college, when the two may have shared a fraternity.

And it's reflected in the pop cultural record, where teen movies of the mid-'80s portray high schoolers who are noticeably more introspective and wary of just doing whatever feels good, man, in contrast to the uninhibited teenagers of Fast Times earlier in the decade.*

Now it looks like that lack of inhibition began earlier than their high school years, at least by the start of adolescence. I wonder how their wild attitude showed up in elementary school, when they were still too young to score beer.

* This may be one pathway from the outgoing to the cocooning mindset. The first generation that's born and raised entirely within outgoing / rising-crime times is going to turn out a little too wild, so the next cohort, looking up at what the older kids are like, will decide that maybe that's going too far and they should dial it down just a bit themselves. And not take being cool as the be-all and end-all of youth, if that pursuit could lead to disaster. Make fun of trying to act cool instead.

This more self-aware and ironic mindset and behavioral style is already evident by the late '80s in the quintessential mock-ethnography of Generation X, Heathers, whose characters would have felt out of place at Ridgemont High.

March 12, 2015

"It Follows": The anti-'80s horror film about not trusting anybody and only looking out for Number One

There's a lot of buzz about the retro vibe of a new horror film, It Follows, but its variations are inversions of the classic themes of the slasher movies of the '70s and '80s.

I don't think I'll be seeing it, and so can't say whether it succeeds on its own terms. I'm more interested in how people, especially so-called film buffs, perceive the past and how it compares with the present. With all the talk about it being a radically fresh incarnation of the '80s slasher flick, it looks like they've totally missed the message.

Here is the movie's trailer, and a full plot synopsis from Wikipedia. Let's look at just how opposite its treatment is of the major themes of the slasher / horror genre during its heyday in the '80s.

Who or what is the danger? Ultimately, it's some supernatural stalker that kills you once it slowly reaches you. But the stalker has no direction of its own, unlike Freddy Krueger, who wanted to get revenge on the children of the adults who fire-bombed his house after the justice system failed to lock up the serial child rapist-murderer. Or unlike a psycho who picks victims on a whim, where it's still his choice, however unmotivated the choice may strike us.

Instead, the stalker is passed along from one victim to the next like a curse. After the current victim has sex with someone, the stalker drops the current target like a hot potato and turns single-mindedly toward the person they had sex with. In order to escape the stalker, your only hope is to pass it along to someone else after the most intimate kind of encounter. Since even hinting at your ulterior motives would make it impossible to make it with the next victim, your goal is to dupe them.

Thus, the true danger is not a supernatural entity, but anybody who might possibly be interested in you sexually, including all of your opposite-sex peers. You can never know which ones are just trying to dupe you into becoming the next victim in order to save their own skin.

With time being of the essence, you'll choose the quickest and easiest victim to dupe. Since that means somebody who already trusts you, you will naturally go after one of your own friends and acquaintances to pass it along to, rather than a stranger. A stranger would be wary of a random horndog guy trying to get into her pants, or a too-good-to-be-true case of a cute girl you don't even know throwing herself at you.

The real enemy, with a real motive, is therefore a close insider rather than an outsider. In the '80s slasher movies, it was someone within the neighborhood or community, but not within your most narrow and intimate social circle. That made it possible to band together with your peers against a common enemy. That left a fairly large social circle that could be trusted as a sanctuary from evil.

In the world of It Follows, there is no minimal social circle that you can trust. You are utterly on your own, and if you find yourself stalked by the entity, you are only going to look out for Number One by cynically and deceitfully passing it on to someone else.

According to the movie's rules, you cannot even sacrifice yourself to spare others, as the stalker will continue backward along the chain of transmission once it claims its first victim. Trying to take one for the team by allowing it to kill you would spare potential future targets, but would not protect those who came before you in the chain.

In the movie's logic, cooperation and altruism are pointless.

These are not minor, nitpick-y differences. They get at the fundamental themes of the horror genre -- what is the source of danger, how can we prepare for it before it finds us, how can we deal with it when it does show up, and how can we cope with its aftermath? In the classic slasher movies, these themes all led to pro-social solutions. In the Millennial version, they are anti-social.

Taking a broader, objective view of the history of horror, is this really such a new inversion anyway? Not really: the classic '90s anti-slasher movie Scream had already placed the source of danger within one's most intimate social circle.

However, It Follows has turned up the dial. In Scream, the idea that evil was so close that you couldn't trust your closest friends and partners was only revealed in a shock ending. Throughout most of the movie, you felt as though it were another case of a psycho killer coming from outside the circle of friends. It Follows lays out the anti-social paranoia from the get-go. Also, in Scream the killer's motive was revenge for his mother, which is at least somewhat pro-social. Mindless, cynical self-preservation is the only motive in It Follows.

During the bridge of the early '90s, Twin Peaks left it an open mystery who the killer was, for the captivating episodes anyway. The teenagers may have suspected one another, but they may also have suspected an adult from the community, an outsider, or a supernatural force. Unlike straight horror movies where the evil entity is known from early on, the unresolved mystery in Twin Peaks led to a tension between trusting and suspecting your closest friends and community members.

Lurid plots involving the closest of friends coldly and psychopathically killing each other have also long been a staple on Law & Order: SVU.

The main innovation of It Follows is the logic of how the evil entity "selects" its targets, but that's just a gimmicky plot device. It's still largely of a piece with the Scream-and-after era of horror movies.

The change in approaches to these themes follows straightforwardly from the phases of the social cycle, which alternates between an outgoing / trusting phase (roughly the '60s through the '80s) and a cocooning / suspicious phase (the '90s through today).

I find it mind-boggling that film nerds compare stuff like this to classic slasher movies, all because it has an eerie synth soundtrack. In narrative substance, It Follows could not be any more of a bizarro '80s movie.

March 1, 2015

Will Smith never could act

With yet another flopperino from the Fresh Prince (Focus), critics are starting to realize maybe this actor isn't all he's cracked up to be, y'know, acting-wise. Their unironic, "Where did it all go wrong?" post-mortems of his career are more puzzling, though, than the fact that this one-note goofus is still taken seriously as an actor.

Back in the '80s, closeted gay funnymen like Eddie Murphy were limited to prankster roles, and it worked fine. Can you imagine, after his comedic success in Beverly Hills Cop, casting Murphy as the detective in Basic Instinct? Yet that's effectively what casting Smith in Focus amounts to. The idea that somebody would green-light an erotic thriller starring a crypto-homo joker just goes to show how desperate Hollywood studios have become by 2015.

I still remember how mind-bending it was to see Smith land all those big roles in the '90s, when his only talent was mugging for yuks as the Fresh Prince. Did audiences really take him seriously? Or was his appeal a meta- kinda thing, like they realized he couldn't play any other role than Will Smith (TM), but how hilarious is it to see Will Smith (TM) cosplaying as a caricature of a soldier, a g-man, or a lover of women?


Nothing says stoic badass like kabuki-esque face-scrunching


Gay adopter Peter Pan-ishly mirrors his toddler's expression


"The Greatest" as a brooding gay bullycide victim

His case generalizes to all closeted homo actors: lacking grown-up empathy, they can only play one role -- themselves -- and it will therefore be one variation or another on the theme of Peter Pan (the defining trait of gay men).

Eddie Murphy and Will Smith are adorable lil' stinker pranksters. Cary Grant and George Clooney are mirror-gazing playboys who, for whatever reason, can never be charmed into a long-term relationship with a woman. Tom Cruise plays a slightly more grown-up role: the 12-year-old ultra-intense, ultra-panicky action LARPer.

Fortunately for Hollywood, audiences these days don't care how hamfisted the on-screen performances are, as long as the star has instant brand recognition. Indeed, the fundamental appeal of these stars is that their brand of acting is so limited that you know exactly what to expect. Hence the box office bomb when Will Smith isn't doing Will Smith (TM).

Viewers no longer want to go in with an open mind, ready to be pleasantly surprised and to accept the characters on their own terms. Nope: the actors are only meant to be action-figure dolls who do exactly what the spectators had already wanted them to do before entering the theater.

It used to be said that contemporary video games were a pale imitation of film, but now we see that film has become a pale imitation of video games. No controller required, folks -- we already know what actions you'd make the characters perform. Just sit back and enjoy the game playing itself.

February 24, 2015

Is the queen of rom-coms a lesbo?

This item at Blind Gossip mentions an actress who was playing all touchy-feely with her man at the Oscars, but only when the cameras were on them. During the pre- and after-party, when the cameras were not rolling, she was with her girlfriend. Her public relationship is just a PR stunt.

She will not come out of the closet because she fears that she'll lose her fan base and industry support.

The only way a closeted lesbian could lose a male fan base by coming out is if she were a sex bomb they were all jerking off to. Nobody in Hollywood is that hot -- no one who gets into the Oscars, anyway, which excludes the mere eye-candy actresses. In fact, coming out might actually titillate a sex-crazed male fan base, who'd start picturing her getting it on with some other celebrity babe.

That leaves a female fan base, who have invested so much of their lives into following her as a role model. How would coming out as a lesbian shatter their faith in her? Only if her roles had primarily been about a down-on-her-luck kinda gal who meets Mr. Right and everything works out great in the end. Discovering that they had been trying to imitate the love life of a lesbian all along would ruin the last hope they had of finding their prince.

The only good guess in the comments at the BG post is Jennifer Aniston. Most of them are moronically guessing Oprah, who is not an actress. One clue pointing to Aniston is the phrase "so sweet," perhaps referring to her recent drama Cake. Another clue is that "she really wants to win an Oscar more than she wants you to know the truth!" Someone pointed out that Aniston made headlines recently with the phrase "We know our own truth," referring to her relationship / engagement to seemingly closeted homosexual Justin Theroux.

Pictures of them show zero chemistry, which technically just means they are a sham couple, not necessarily that she is incapable of chemistry with men altogether in real life.

Still, check Google Images for pictures of Aniston and Selena Gomez, whether at the recent Oscars after-party or on numerous other occasions. They look way more into each other than Aniston and Theroux do. There's another set of pictures from Sunday's after-party of Aniston and Amy Adams looking tender together.

The blind item didn't say that the actress' girlfriend was famous herself, so I don't claim that Gomez or Adams is Aniston's girlfriend. The point is just how much more open and warmed-up she evidently feels around other women.

I'm happy to say that I never did get the whole Jennifer Aniston craze. They tried to make her a sex symbol during the '90s, and I even watched a few episodes of Friends just to see what all the hubbub was about. Nothing. Talk about being over-hyped. I didn't know any other guys who were into her either, but she must have had a niche following among doormat types.

I do understand her appeal to aging single women, and that makes me think it's her. She would lose industry support because her whole schtick is the rom-com princess, something that absolutely does not allow for a lesbian actress.

The Hollywood executives would lose all the money that could've been scored from churning out a dozen more inane Aniston rom-coms. And her fans would feel betrayed and led astray. "No wonder she has such trouble finding and holding onto a man!"

Lesson number one in finding Mr. Right: don't imitate the personality of a lesbian.

January 30, 2015

2014 in film: No change from existing trends

Not exactly the most attention-grabbing news ever reported, but it is worth keeping our eyes peeled for signs of change back toward the film-making culture that we all love from the '70s and '80s. The current trends have been going on for over two decades now, and will run out of steam sometime in the next 5 to 10 years. But so far, there are no observable signs of change away from the dullification of movies.

A quick check of the traits that characterize the top box office draws in 2014 shows a mindless continuation of four major existing trends:

1. Unoriginal storytelling (earlier post here with data since 1936). Every one of the top 10 movies in America was either an adaptation of an existing story or a sequel to an existing movie; none were original.

You might try to excuse The LEGO Movie, since it was not a sequel and did not adapt a clearly defined existing story. But nobody went to see it for the narrative or character arcs. You went to see the Lego-style visual animation. Its visual style was entirely familiar to the audience, adapted from the toys, video games, and cartoons done in the Lego style, so I count that movie as an adaptation. Also, the pandering title shows that audiences would only be drawn in by instant brand recognition -- it doesn't hint at what the movie is about or dangle any mystery to pique our curiosity. It's just: "You think Legos look cool? Well here they are, in a movie!"

Broadening the view to include the top 20 movies doesn't help. There are only 2 original stories in the 11-20 spots -- Interstellar and Neighbors. That's 2 originals in the whole top 20, a rate of just 1 in 10. Pretty sad.

2. Disappearance of separate movie cultures for children, adolescents, and adults (earlier post here on the MPAA ratings of top movies since 1969). Almost everything is PG-13 or PG.

There were no G-rated movies in the top 20, in contrast to the late '70s when The Muppet Movie was the #10 movie and rated G. Last year's LEGO Movie, however, was rated PG because they had to cram in adult-ish stuff to entertain the parents as well as the kids, rather than let kids have their own autonomous culture.

At the other extreme, there was only 1 movie rated R in the top 10 for 2014, although there were 4 in the top 20, or a rate of 2 in 10. A far cry from the '70s and '80s when mature themes could be treated without parents throwing a fit because they didn't plan on bringing their kids to those R-rated movies. Today, just as children are not allowed to have their own culture, neither are adults.

Helicopter parenting demands popular culture that is "fun for the whole family" (AKA bore the whole family), because parents won't see movies on their own anymore. That would involve leaving the kids under someone else's watch for a few hours, and y'know how that's bound to end up -- finding them bound, raped, and murdered in a ditch on the drive back home.

3. Comedies are still rare (earlier post here on the popularity of the comedy genre since 1915). None of the top 10 movies were comedies, although 2 of the top 20 were, for a rate of 1 in 10. I don't count kiddie movies, because that isn't comedy but rather cutesy, clowny humor with the occasional yuk-yuk gag. Even counting 22 Jump Street and Neighbors is being generous, since those are just juvenile yuk-yuk movies, not ones where a comedic dynamic runs throughout.

Comedies are most popular in rising-crime times, when they seem to fulfill the need for catharsis and resilience amid topsy-turvy conditions. In a world that is becoming safer and safer, folks aren't as likely to be in a state of physiological arousal, and don't have as much need for comedic relief in their lives.

Also worth noting that the two comedies for 2014 did not pair a light comedic tone with darker themes, as used to be the norm in the '80s. Back then, a comedy was always an action comedy, war comedy, horror comedy, or drama comedy. That pairing of light and dark themes emphasized the role of comedy as a relief from situations in life that would otherwise be depressing, frightening, or overwhelming.

4. Running times are still very long, especially considering how juvenile the subject matter is (earlier post here on running times since 1921). Using the top 10 or top 20 didn't matter. Average running time was 2 hrs 6 min, median was 2 hrs 7 min, minimum was about 1 hr 37 min, and max was about 2 hrs 45-50 min.

As with Midcentury cocooners, today's cocooners require something spectacular to get them out of their domestic fortresses, for it to really be "worth it". In cocooning times, people also seem to prefer drawn-out experiences rather than ones that pack a cathartic punch. In the Midcentury, serial dramas on the radio were more important than movies, just as serial dramas on TV these days are more important than movies. The drawn-out mode is for folks who are generally bored during the day; the cathartic one is for folks who already have exciting stories of their own to participate in in real life.

So there you have it: if you sensed that movies in 2014 were continuing the trend toward tediousness, you were right. I confess that I only saw two new movies last year, Interstellar and Transcendence, and it doesn't look like I missed much. I began tuning out of new releases during the second half of the '90s, and have rarely felt regret when I've caught up later on. There are simply too many good ones from the old days to feel deprived by choosing movies that are "new to me" from the past rather than from the present.

January 21, 2015

Entertainment as mood stabilizer vs. experimentation

This video by the gang at Red Letter Media pokes fun at how many of this year's upcoming movies will be sequels, prequels, remakes, reboots, adaptations from other media, etc. The unstoppable nature of this trend really makes you wonder what's behind it. To understand it, we need to see its full scope.

See my earlier post that crunches the numbers on how common this unoriginal approach to storytelling in film has been, using the top movies at the box office from 1936 to 2011. In short, it tracks the outgoing vs. cocooning social cycle: cocooning audiences prefer familiar material more than outgoing audiences, who want to experience a story they haven't already heard about. Another post hinted at the same trend in pop music, where the same song stays on the year-end charts for more than a single year nowadays, although that was not an exhaustive study over time.

I haven't crunched any numbers on it, but there's also a clear trend in TV shows toward creating multiple adaptations of a single brand (CSI, CSI: Miami, CSI: Sheboygan...). American Idol featured entirely familiar songs, only sung by people you've never heard of. And Dancing with the Stars not only has familiar songs, but familiar personalities dancing along to them. The judges on these competitions are also familiar stars.

As long as it's instantly recognizable, audiences will cling to it for dear life. That seems to be the proper way to interpret this broad trend — not as being "against change" or "against novelty," nor, by implication, "for what is traditional" or "for what has been proven to work."

These lame rehashings are no more than a generation old, so they are not part of an enduring tradition that the audience feels bound to preserve. They are merely a security blanket for a population afflicted by anxiety and depression, in contrast to the delightfully off-beat material that the fun-loving audiences sought in more outgoing times, with a peak in the 1980s.

This view of entertainment as self-medication as opposed to experimentation suggests a link to the forms of drug use that prevail in cocooning vs. outgoing times. This post reviewed the distinction between stabilizing and destabilizing drugs, and showed that the stabilizers soar in cocooning periods, while destabilizers become popular in outgoing periods.

Stabilizers give a little pep to the depressed and mellow out those with shaky nerves — the popular amphetamines and barbiturates that were consumed on a massive scale during the Midcentury. The turning point came during the '70s when the public reacted against the attempt to mainstream the use of Valium. But as cocooning returned in the '90s, the mainstream returned to drugs like Prozac for the on-edge and Ritalin for the restless. These stabilizing drugs are an attempt to correct the emotional dysfunction that comes from being socially cut-off.

Destabilizers are about opening up the mind to strange, new moods and experiences, not to return the mind to normal. Marijuana, cocaine, alcohol, and the like. They flourish when the mainstream already has a satisfying social life and normal emotional functioning, and seeks out something beyond the ordinary. They are "party drugs" or "social drugs," unlike Miltown or Prozac, which are meant for the isolated housewife or Man in the Grey Flannel Suit. Because they are more destabilizing, people use them with greater wariness about their dangers than they do when consuming antidepressants and focus-enhancers, which are taken complacently.

Entertainment, then, is just another form of self-medication in cocooning times, or of off-the-beaten-path experimenting in outgoing times. This psychiatric view may go farther toward explaining the tendency of cocooning periods to be culturally bland, stale, and monotonous than views which simply dehumanize the self-medicating cocooners as inherently dull and uncreative.

Cocooners may in fact have creative capacities and an ability to appreciate novelty similar to anyone else's, but those faculties are being suppressed in order to meet the more fundamental psychological need for everyday emotional regulation. Folks in outgoing times have Maslow's basic social and emotional needs met, so they are freed up for higher pursuits in creativity and self-actualization.

January 1, 2015

The year in stale pop culture

We're all familiar with the endless sequels and reboots that Hollywood dollar-chasers keep pumping out, mainly because it sells really well with today's audiences, who are afraid of any brand they don't instantly recognize.

But let's not re-hash the topic of unoriginal material in movies. This earlier post already covered it quantitatively from the 1930s through the early 2010s. And let's not look into TV shows, because TV is boring, and because it's not very different from movies. Every hit show is part of an established franchise and/or in its 20th season.

What is the counterpart to sequels in pop music? You could argue it's a song by someone who's already had a hit before, but often those songs can sound quite different owing to changes in mindset. The Rolling Stones were still on the Billboard Year-End charts in the '80s (even if not at the top), but "Emotional Rescue" and "Waiting on a Friend" don't sound much like "Satisfaction" from over 15 years earlier.

Luckily there's an airtight way to look at the creeping staleness of pop music — look for an identical song that appears on the Year-End charts for multiple years. Being popular from one week to the next is one thing, but from one year to the next? Was nothing better released in the meantime?

You might think that these are just songs that were released late in the year, and carry over into the next. Well, that would happen for every pair of consecutive years, whereas this is a recent development. Often the song was released in the middle of the first year, not the very end. I've noticed some isolated examples of this trend from the Year-End charts of the mid-late '90s (none from the '80s of course), and then more and more during the 2000s.

Now it has gotten so bad that you don't have to have a very good memory to notice it, if you read through the charts of back-to-back years.

Out of the Hot 100 on the Year-End chart in 2014, 10 of them were on the chart for 2013 too. You heard it right: fully one-tenth of 2014's hit songs were the exact same warmed-over hits from the previous year.

Here is the list, with each song's spot on the 2013 chart, then its spot on the 2014 chart (a toy script for spotting such holdovers follows the list).

02 - 83 "Blurred Lines" by Robin Thicke
03 - 57 "Radioactive" by Imagine Dragons
10 - 46 "Roar" by Katy Perry
15 - 20 "Royals" by Lorde
18 - 44 "Wrecking Ball" by Miley Cyrus
19 - 22 "Wake Me Up" by Avicii
62 - 23 "Demons" by Imagine Dragons
63 - 05 "Counting Stars" by OneRepublic
96 - 74 "Brave" by Sara Bareilles
97 - 19 "Let Her Go" by Passenger
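
If you want to replicate the check yourself, it boils down to intersecting two year-end lists. Here is a minimal sketch in Python, with a few of the entries above standing in for the full 100-song charts:

```python
# Toy sketch: spot songs that appear on back-to-back Year-End Hot 100 charts.
# Only a handful of entries are shown; a real check would use all 100 per year.

chart_2013 = {2: "Blurred Lines", 3: "Radioactive", 15: "Royals"}    # rank -> song
chart_2014 = {83: "Blurred Lines", 57: "Radioactive", 20: "Royals"}

holdovers = set(chart_2013.values()) & set(chart_2014.values())
print(f"{len(holdovers)} holdovers: {sorted(holdovers)}")
# Run against the full charts, this turns up the 10 holdovers listed above.
```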

Like I said, there's no real trend toward very-late releases showing up. Nearly half are ones that started big and have fallen, while nearly half are ones that began small and rose higher, with two of them staying more or less where they were. No trend toward rising or falling fame across years, then.

Stylistically it's mostly dance-pop and indie performers, not rap or R&B. Demographically it's by whites for whites, not black-for-black or black-for-white. My take is that the average white teenager is so bored or put off by almost everything out there that, when a halfway decent tune comes along, they'll keep playing it one year after another. It's a desperate choice in a world where so much sucks. (Happy New Year.)

That could be at work as well in the movie and TV domains. Movies over the past 20 years really have been terrible, even if they were fresh and new. Audiences came to realize that Hollywood stopped being able to make satisfying original movies, so they cling to something that they know from past experience is at least not irritating or offensive, however bland the tentpole franchises may be.

December 1, 2014

Star Wars: The cosplay fanfic sequel

If George Lucas raped your childhood, then J.J. Abrams is going to make sure you get a happy ending. See for yourself in the new trailer for next year's re-launch of the franchise.

Look, it's the original style of stormtrooper armor! Look, it's some kind of speeder bike! Look, a close-up shot inside the cockpit of an x-wing! Holy shit bro, the millennium falcon! And the original John Williams theme! Plus, no five-year-old actor, no CGI rabbit, and no midichlorian meter? Well, who's gonna be camping in line one year ahead of release night? — this guy!

Yeah, it doesn't look like the third trilogy is going to be a great big middle finger to the fans or audiences with half a brain, the way that the second trilogy was. This time a stubborn idiot who thinks he's clever won't be directing them into oblivion. But we're still just getting an overly enthusiastic fanboy who's going to make it all about fan service, devoid of plot, character, or visual style.

Hey, he made everyone forget about those awful Star Trek: The Next Generation movies from the '90s. Not by making anything new, but by making reference after reference to the stuff that everyone already likes, or would like if they haven't seen it.

You can't "do Wrath of Khan again," or "do Star Wars again," because the zeitgeist has changed so much. The result is placing contemporary actors with contemporary attitudes in a great big cosplay re-enactment of the original movies, all shot with contemporary camera work, and presented after contemporary editing.

Star Trek now stars a gay Latino Millennial as Spock, the tone is constant frenzy, and the camera is hyperactive. Star Wars is going to star a negro Millennial (hopefully not also gay), the tone looks to be constant frenzy, and the camera hyperactive. Updating the classics for our times, or overly indulgent LARP session?

It's not a nostalgic re-enactment either, as the Millennials grew up long after Star Trek and Star Wars exploded as pop culture phenomena. Non-whites, let alone queer ones, couldn't have cared less about them. A nostalgic re-enactment would star straight, white Gen X-ers. Multicultural Millennials are just going to make it come off as a cargo cult performance.

I am glad that part of this cargo cult approach involves shooting on film and using practical effects (although still tons of CGI, judging from the trailer). If the superior technology doesn't get preserved, it could be lost for good.

Other than that, I have zero interest in seeing the new sequels. It's too late to re-launch Star Wars — and was already too late by the '90s. It would have been neat to see a Star Wars movie in the late '80s or very early '90s, before the zeitgeist shifted so far away from what developed during the '60s, '70s, and '80s.

We got a third Indiana Jones movie in '89, and it wasn't that bad — palpably more self-aware and winking at the audience (watch it again and see how many jokes are blatant asides to the audience), but still a solid Indiana Jones movie.* I didn't bother seeing the fourth movie in the 2000s because I knew it would suck based on the Star Wars prequels sucking, and hearing everyone say so when it came out.

Star Wars missed the window to follow up on a classic from the late '70s / early '80s, and should have stopped before the prequels got made. There's even less reason for these new sequels to get made, other than cashing in on a surefire opening weekend with a sequel to the most popular movie out there.

* Some other sequels worth noting from the late '80s / early '90s, which lagged quite a bit behind their originals and took on a noticeably more self-aware or winking tone, but which were still decent movies:

Back to the Future II and III ('89 and '90, original '85)
Christmas Vacation ('89, original '83)
Ghostbusters II ('89, original '84)
Gremlins 2 ('90, original '84)
The Exorcist III ('90, original '73)

September 30, 2014

The crappy digital look: Demystifying the lie about how sensors vs. film handle light sensitivity

In this installment of an ongoing series (search "digital film" for earlier entries), we'll explore another case of digital photography offering supposedly greater convenience at the cost of compromised image quality. The end result is pictures that are too harshly contrasting, more pixelated, and color-distorted where they should be white, black, or shades of gray.

This time we'll look into the properties of the light-sensitive medium that records the visual information being gathered into the camera by the lens. I have only read one off-hand comment about the true nature of the differences between digital and film at this stage of capture, and mountains of misinformation. The only good article I've read is this one from Digital Photo Pro, "The Truth About Digital ISO," although it is aimed at readers who are already fairly familiar with photographic technology.

Given how inherent the difference is between the two media, and how much it influences the final look, this topic is in sore need of demystifying. So hold on, this post will go into great detail, although all of it is easy to understand. By the end we will see that, contrary to the claims about digital's versatility in setting the light-sensitivity parameter, it can do no such thing, and that its attempts to mimic this simple film process amount to what used to be last-resort surgery at the post-processing stage. One of the sweetest and most alluring selling points of digital photography turns out to be a lie that has corrupted our visual culture, both high and low.

To capture an image that is neither too dark nor too bright, three inter-related elements play a role in both film and digital photography.

The aperture of the lens determines how much light is gathered in the first place: when it is wide open, more light passes through; when it is closed down like a squint, less light passes through. But light is not continually being allowed in.

The shutter speed regulates how long the light strikes the light-sensitive medium during capture: a faster shutter speed closes faster after opening, letting in less light than a shutter speed that is slower to close, which lets in more light.

Last but not least in importance, the light-sensitive medium may vary in sensitivity: higher sensitivity reacts faster and makes brighter images, lower sensitivity reacts slower and makes dimmer images, other things being equal. This variable is sometimes labeled "ISO," referring to the name of a set of standards governing its measurement. But think of it as the sensitivity of the light-reactive material that captures an image. This scale increases multiplicatively, so that going from 100 to 200 to 400 to 800 is 3 steps up from the original. Confusingly, the jargon for "steps" is "stops."
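
To put a number on those steps: each stop is a doubling, so the gap between two sensitivity values is just a base-2 logarithm. A throwaway illustration, nothing camera-specific:

```python
import math

def stops_between(iso_from, iso_to):
    """Number of stops (doublings of light sensitivity) between two ISO values."""
    return math.log2(iso_to / iso_from)

print(stops_between(100, 800))   # 3.0 stops: 100 -> 200 -> 400 -> 800
print(stops_between(100, 3200))  # 5.0 stops
```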

A proper exposure requires all three of these to be in balance — not letting in too much or too little light through the lens aperture, not keeping the shutter open too long or too briefly, and not using a medium that is over-sensitive or under-sensitive. If you want to change one setting, you must change one or both of the other settings to keep it all in balance. For example, opening up the lens aperture lets in more light, and must be compensated for by a change that limits exposure — a faster closing of the shutter, and/or using a medium that is less light-sensitive.
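
That balancing act can be written down as simple bookkeeping in stops. Here is a rough sketch, using my own toy scale on which higher numbers mean a brighter exposure (the scene values are illustrative, not from any camera manual):

```python
import math

def exposure_stops(aperture_f, shutter_s, iso):
    """Total exposure in stops, relative to f/1.0, 1 second, ISO 100.
    A proper exposure of a given scene keeps this value constant."""
    return math.log2(shutter_s) - 2 * math.log2(aperture_f) + math.log2(iso / 100)

# Opening the aperture from f/8 to f/5.6 adds one stop of light,
# so closing the shutter twice as fast (1/60 -> 1/125) takes one stop back out.
print(round(exposure_stops(8.0, 1/60, 400), 1))   # -9.9
print(round(exposure_stops(5.6, 1/125, 400), 1))  # -9.9: still in balance
```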

Between digital and film, there are no major differences in two of those factors. The lenses can be opened up and closed down to the same degree, whether they are attached to camera bodies meant for one format or the other. And the shutter technology follows the same principles, whether it is opening up in front of a digital or film recording medium. (Digital cameras may offer slightly faster maximum shutter speeds because they are more recent and incorporate improvements in shutter technology, not because of digital properties per se.)

However, the two formats could not be more different regarding light-sensitivity of the recording medium.

Film cameras use rolls of film, which are loaded into and out of the camera on a regular basis. Load a roll, take however-many pictures, then unload it and send it off to the lab for development. The next set of pictures will require a new roll to be loaded. Digital cameras have a light-sensitive digital sensor which sends its readings to a memory card for later development and archiving. The sensor is hardwired into the camera body, while the memory card is removable.

Thus, no matter how many pictures you take with a digital camera, it is always the exact same light-sensitive piece of material that captures the visual information. With a film camera, every image is made on a new frame of film.

A digital sensor is like an Etch-a-Sketch that is wiped clean after each image is made, and used over and over again, while frames and rolls of film are like sheets of sketching paper that are never erased to be re-used for future drawings. The digital Etch-a-Sketch is just hooked up to a separate medium for storing its images, i.e. memory cards. Frames of film are both an image-capturing and an image-storage medium wrapped up into one.

Whether the light-sensitive material is always fresh or fixed once and for all has dramatic consequences for how it can be made more or less reactive to light — the third crucial element of proper exposure.

Film manufacturers can make a roll of film more reactive to light by making the light-sensitive silver halide crystals larger, and less reactive by making the crystals smaller. Hence slow films produce fine grain, and fast films large grain. What's so great is that you can choose which variety of film you want to use for any given occasion. If you're worried about too much light (outdoors on a sunny summer afternoon), you can load a slowly reacting film. If you're worried about not getting enough light (indoors in the evening), you can load a fast reacting film.

It's like buying different types of sketching paper depending on how much response you want there to be to the pencil lead — smooth and frictionless or bumpy and movement-dampening. Depending on the purpose, you're able to buy sketchpads of either type.

What was so bad about the good old way? The complaints boil down to:

"Ugh, sooo inconvenient to be STUCK WITH a given light sensitivity for the ENTIRE ROLL of film, unable to change the sensitivity frame-by-frame. What if I want to shoot half a roll indoors, and the other half outdoors?"

Well, you can just buy and carry two rolls of film instead of one — not much more expensive, and not much more to lug around. And that's only if you couldn't compensate for changes in location through the other two variables of aperture size and shutter speed. For the most part, these were not big problems in the film days; they served only as spastic rationalizations for why we absolutely needed to shift to a medium that could alter the light-sensitivity variable on a frame-by-frame basis, the way aperture size and shutter speed can.

That was the promise of digital sensors, which turns out to be a fraud that the overly eager majority have swallowed whole, while enriching the fraudsters handsomely.

Digital cameras do offer a means for making the image look as though it had been captured by a material that was more sensitive or less sensitive to light, and this variable can be changed on a frame-by-frame basis. But unlike film rolls that may have larger or smaller light-sensitive crystals, the photodiodes on the digital sensor have only one level of sensitivity, inherent to the material it is made from.

Because this sensitivity is baked into the materials, it certainly cannot be altered by the user, let alone on a frame-by-frame basis. And because the sensor is not removable, the user also has no recourse to swap it out for another with a different level of sensitivity.

How then do digital cameras attempt to re-create the many degrees of sensitivity that film offers? They choose a "native" sensitivity level for the photodiodes, which can never be changed, but whose electronic output signal can be amplified or dampened to mimic being more or less sensitive in the first place. In practice, they set the native (i.e. sole) sensitivity to be low, and amplify the signal to reach higher degrees, because dampening a highly sensitive "native" level leads to even lower quality.

Most digital cameras have a native (sole) sensitivity of ISO 100 or 160, meant to evoke the slowly reacting less sensitive kinds of film, and allow you to amplify that signal frame-by-frame, say to ISO 800, 3200, and beyond. But remember: it is never changing the "ISO" or sensitivity of the light-reactive material in the sensor, only amplifying its output signal to the memory card.
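
In code form, the scheme looks something like this toy model (my own sketch of the principle described above, not any manufacturer's actual pipeline):

```python
import random

NATIVE_ISO = 100  # the sensor's one fixed sensitivity (illustrative value)

def digital_capture(light_level, iso_setting):
    """Toy model of digital 'ISO': the photodiode's response never changes;
    the ISO dial only sets a gain multiplier applied at readout."""
    signal = light_level             # fixed physical response to the scene
    noise = random.gauss(0, 2.0)     # read noise, also fixed
    gain = iso_setting / NATIVE_ISO  # the only thing the dial changes
    return (signal + noise) * gain   # noise gets amplified right along with signal

random.seed(1)
print(digital_capture(10, 100))   # dim scene at native ISO: small signal, small noise
print(digital_capture(10, 3200))  # same scene at "ISO 3200": everything 32x, noise included
```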

It is like always recording sound at a low volume, and then using a dial on an amplifier to make it louder for the final listening, rather than record at different volume levels in the initial stage. And we all know how high-quality our music sounds when it's cranked up to 11. It does not sound "the same only louder" — it is now corrupted by distortions.

We should expect nothing less from digital images whose "ISO" was dialed up far beyond the native (sole) sensitivity of 100 or 160.

Below are some online digital test shots taken with the lens cap fully in place, blocking out most light, with higher and higher values for the faux-sensitivity ISO setting. Now, these images should have remained black or gray the whole way through. The only change that would have occurred if they were shot on more and more highly sensitive film material is a grainier texture, owing to the larger film crystals that make film more sensitive, and an increase in brightness, since what little light was sneaking in past the lens cap would have produced a stronger reaction.

And yet look at the outcome of a digital sensor trying to see in darkness:


Not only does the texture get grainier and the light level brighter when the native (sole) sensitivity is amplified; there are now obvious color distortions as well, with a harsh blue cast emerging at higher levels of sensor amplification.

What's worse is that different cameras may produce different kinds of color distortions, requiring photographers to run "noise tests" on each camera they use, rather than know beforehand what effects will be produced by changing some variable, independent of what particular camera they're using.

The test shots above were from a Canon camera. Here's another set from a Pentax, showing a different pattern of color distortions.


Now it's red instead of blue that emerges at higher levels of amplification. Red and blue are at opposite ends of the color spectrum, so that shooting a digital camera without test shots is like ordering a pizza, and maybe it'll show up vegetarian and maybe it'll show up meat lover's. Unpredictable obstacles — just what a craft needs more of.
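
For what it's worth, a noise test of this kind is not complicated. Something like the following sketch would do, assuming numpy and Pillow are installed (the dark-frame filename is hypothetical):

```python
import numpy as np
from PIL import Image

# Load a dark frame: lens cap on, shot at some heavily amplified ISO setting.
# "dark_frame_iso6400.png" is a hypothetical filename.
frame = np.asarray(Image.open("dark_frame_iso6400.png").convert("RGB"), dtype=float)

# A neutral sensor would leave R, G, and B at roughly equal (low) levels.
means = frame.reshape(-1, 3).mean(axis=0)
print(dict(zip("RGB", means.round(2))))
# Whichever channel runs hot reveals that camera's particular color cast --
# e.g. the blue of the Canon shots above, or the red of the Pentax.
```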

These distortions can be manipulated in Photoshop back toward normal-ish, but now you've added an obligatory extra layer of corrections in "post" just because you want to be able to fiddle with light-sensitivity frame-by-frame, which you're not really doing anyways. Convenience proves elusive yet again.

So, if amplification of the native (sole) light sensitivity is not like using film rolls of different sensitivities, what is it like? As it turns out, it is almost exactly like a treatment from the film era called push-processing, which was a last-ditch rescue effort in the developing stage after shooting within severe limitations in the capturing stage.

Suppose you were shooting on film, and your only available rolls were of sensitivity ISO 100, which is a slowly reacting film best suited for outdoors in sunlight. Suppose you wanted to shoot an indoor or night-time scene, which might call for faster reacting film, say ISO 400. Could it still be done with such low-sensitivity film? You decide to shoot in the evening with a slow film, effectively under-exposing your film by 2 stops, worried the whole time that the images are going to come back way too dark.

Lab technicians to the rescue! ... kind of. If you let them know you under-exposed your whole roll of film by 2 stops, they can compensate for that by allowing your film to soak in the chemical developing bath for a longer time than normal, allowing more of those darkened details to turn brighter. (The film starts rather dark and the developing bath reveals areas of brightness over time.) Taking 100 film and trying to make it look as sensitive as 400 film is "pushing" its development by 2 stops.

But if that were all there were to it, nobody would've bothered using films of different sensitivities in the capturing stage — they would've let the lab techs worry about that in the developing stage. The costs of push-processing are various reductions in image quality, which Kodak's webpage on the topic summarizes in this way (click the link for fuller detail):

Push processing is not recommended as a means to increase photographic speed. Push processing produces contrast mismatches notably in the red and green sensitive layers (red most) compared to the blue. This produces reddish-yellow highlights, and cyan-blue shadows. Push processing also produces significant increases in film granularity. Push processing combined with under exposure produces a net loss in photographic speed, higher contrast, smoky shadows, yellow highlights and grainy images, with possible slight losses in sharpness.

Not a bad description of the signature elements of the digital look, is it? Blue shadows are exactly what the Canon test shots showed earlier.

Interestingly, Kodak notes that although push-processing produces less sharp images, those images may subjectively appear normally sharp, given the increase in contrast. Sure, if a subject is wearing a normal red shirt and normal blue jeans, and you crank up the contrast parameter, the picture looks more defined — ultra-red juxtaposed against ultra-blue. But we're only fooling ourselves. Sharpness means how clear and crisp the details are, and push-processing and its obligatory counterpart in the digital world are actually losing details, while distracting us with more strongly contrasting colors.

Remember, this is what a digital camera is doing each time it takes a picture outside of its native (sole) sensitivity level of 100 or 160, i.e. when you shoot indoors, at night, or on cloudy days. In the digital world, every image is immediately rushed into emergency surgery.

Is there a way to compare side-by-side a film image that was processed both normally and with push-processing? Unfortunately, no, since developing the negative image from the latent image on the film cannot be undone, and then done a different way. I suppose you could take a shot of the same scene, with two identical cameras and two identical rolls of film, but with one camera set to the true sensitivity and the other set inaccurately, then develop the normal one normally and the under-exposed one with push-processing. That sounds like a bit too much just to make a perfect textbook comparison of normal vs. push-processed images, and I couldn't find any examples online.

But there are examples of film that has been push-processed. Although we can't compare them side-by-side with normally developed versions of the same film frame, at least we can pick up on some of the typical traits that push-processing introduces. Below is an example from this series at a photographer's website. The film is ISO 400, but was push-processed to look like ISO 3200. That is 3 stops of pushing, whereas Kodak and other photography guidebooks advise never pushing past 2 stops of over-development.


It's disturbing how digital this film photograph looks. It looks like someone opened a digital image in Photoshop and cranked up the contrast and saturation settings. Look for details on the man's shirt and pants, like folds and creases. They're hard to make out because push-processing renders the image less sharp. But we're distracted by how striking the contrast is between these overly rich yellows and reds and the cooler blues. It looks more defined, but is poorer in detail.

It's almost like a child drew an outline of pants and hit "fill" with yellow on MS Paint. Very little detail. The yellow pole also looks like a crude "fill" job. Even worse, these pictures were shot on medium-format film, which has a far higher resolution than the 35mm film we're all used to. It ought to have detail so fine that you could blow it up into a poster or banner without blurring of the details.

We also see the familiar blown-out sky from digital Earth, rather than the blue one we know and love. Other white areas look like intense spotlights, too. I can't tell if they have the red-yellow tint to them as Kodak warned, although they do look kind of bright pale yellow. There aren't many dark shadows to tell if they have the bluish tint warned about, although the asphalt on the road looks blue-gray. The color distortions might be more obvious if we had the same scene captured and developed normally, for comparison.

The ultra-contrasty, overly saturated, harshly blown-out bright areas are hard to miss, though. And they look like something straight from a digital camera plus Photoshop settings dialed up to 11.

You might object that, hey, this guy knows what he's doing, and he's using push-processing to give the pictures a flamingly dramatic style (he's gay). That misses the point: these kinds of distortions and reductions in image quality are built in with digital photography's light-sensitivity technology. They aren't going to be chosen purposefully for some intended artistic effect. They're just going to make ordinary people's pictures look cartoony and crappy because they don't know about them before buying a digital camera, and won't mind anyway because digital is all about convenience over quality.

Even Hollywood movies shot by pros will be subject to these digital distortions, although they'll have much better help cleaning them up in post — for a price. Good luck scrubbing your digital images that clean on your own with Photoshop.

In the end, is digital really more convenient, all things considered? All of these distortions require laborious and expensive corrections, which may well offset the efficiency gains that were hoped for at the beginning. Or those corrections simply won't be done, and greater convenience will have been traded off against poorer quality. Either way, one of the fundamental promises of digital photography turns out to be a big fat lie.