After some off-the-cuff thinking about spin-off movies from today and the mid-century, I decided to look into the data in greater quantitative depth. To re-cap, the goal is to see whether there was another period like today when so many movies were either adaptations of an existing work or a sequel to an earlier movie.
Based on a hunch, it looked like the '50s were another such period. The strongest influence on the zeitgeist is the trend in the violence rate, so even without knowing very much about the popular movies of that period, let alone those from just before or after, that was a good guess -- today and the '50s were both well into a falling-crime time. And based on a quick look at the '80s, a rising-crime period, it looked like there weren't so many sequels and adaptations.
Granted, movies are more than just narratives, and this approach totally leaves aside the question of visual originality. But it's a feasible way to take a crack at the central question. Also bear in mind that originality and greatness aren't the same thing. I'm not judging how great or enjoyable all these movies are because I haven't seen most of them. The focus is entirely on how original the process was that gave us the plot, characters, themes, verbal tone, and so on.
I went through every year from 1936 to 2011 and tallied how many movies in the box office top 10 were a sequel or adaptation. I didn't go back any further because there aren't easily accessible data on the top 10. Wikipedia only lists 5 to 7 for many years, and doesn't always have a link to information about the source of the story that far back. This will do pretty well, though, because that still gives us just about the entire falling-crime period of 1934-1958, the whole rising-crime period through 1992, and the whole falling-crime period since then.
Rather than stall the presentation and discussion of the data, I've put the methodology part at the very end, in case you want to know exactly how I nitpicked what did and did not count as an adaptation and a sequel. They weren't mutually exclusive, though: for example, I coded the new Twilight movie as both a sequel and an adaptation.
The first graph below shows the total number of movies with less original narratives (whether a sequel only, adaptation only, or both). Then just to probe whether the type of unoriginality matters, the second graph splits them apart to show how many were adaptations (regardless of whether or not they were also sequels), and how many were sequels (regardless of their status as adaptations).
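For anyone who wants to reproduce the tally, here's a minimal sketch of how the counts behind both graphs could be generated. It assumes a hypothetical hand-coded spreadsheet -- the file name and column names below are placeholders -- with one row per top-10 movie and 0/1 flags for sequel and adaptation.

```python
# Minimal sketch, assuming a hand-coded file "top10_movies.csv" with columns:
# year, title, sequel (0/1), adaptation (0/1). Names are placeholders.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("top10_movies.csv")

# A movie counts as "less original" if it is a sequel, an adaptation, or both.
df["less_original"] = ((df["sequel"] == 1) | (df["adaptation"] == 1)).astype(int)

# For each year, count how many of the top 10 fall into each category.
by_year = df.groupby("year")[["less_original", "adaptation", "sequel"]].sum()

# First graph: total number of less original narratives per year.
by_year["less_original"].plot(title="Top-10 movies that are sequels and/or adaptations")
plt.ylabel("Movies (out of 10)")
plt.show()

# Second graph: adaptations and sequels separately (the two are not mutually exclusive).
by_year[["adaptation", "sequel"]].plot(title="Adaptations vs. sequels in the top 10")
plt.ylabel("Movies (out of 10)")
plt.show()
```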
Score another one for my theory. There's visible year-to-year variation, micro-periods of a few years that go against a trend, etc., but there are three clear movements -- up, down, and up. The first one climbs toward a peak around 1956-'58, then it starts to fall off toward an end-point around 1988-'91, and then rises again through the present. That just about perfectly tracks the trend in the homicide rate through three different movements up, down, and up again. So it's not a coincidence.
Also, it looks like most of this link to trend in the violence rate is due to how common adaptations are. Sequels show a more steady one-way growth over the entire period, and usually aren't as common as adaptations anyway. That makes the case even stronger -- at least if it were mostly due to sequels, you could say they're still pretty original, just carrying over the characters, themes, and maybe settings from one movie to another. Instead it's more due to borrowing all that stuff, plus a good amount of the plot, dialogue (or at least its verbal style), and even songs if it's a musical.
What are some potentially surprising things to notice in the main graph?
First, today's culture where so much is adapted or a sequel milking a popular series is not that old. Even as recently as the 1980s, movie narratives were a lot more original.
Second, contra the standard story in the world of film geeks, the '80s were a high point for originality -- more so than the New Hollywood period of the later '60s and early '70s, few of whose movies brought in many viewers. It was still a somewhat out-there thing to be making original movies in the '60s, and they struggled to find an audience. By the '80s, everybody was in the mood for exciting new stories, so there was no incompatibility between being mainstream and being original. The '60s were just the beginning; the culmination arrived in the '80s.
(That pattern shows up in popular music, too -- in the '60s the Velvet Underground was, well, underground, but David Bowie and Peter Gabriel topped the charts in the '80s. And the Velvets were just pointing the way back then.)
Third, it is true that the '50s were a period of lower creativity -- not zero, but lower compared to the reversal that would begin in the '60s and peak in the '80s.
Fourth, even more importantly, the world was not static before 1960. Movie narratives weren't that unoriginal in the mid-'30s -- they gradually moved from there toward the peak of adaptations in the late '50s. Nobody gets more confused about history than when they think about the '50s and '60s. Yes, the '60s were a break from the '50s, but the conditions of the '50s didn't stretch back forever, or even several decades. The '20s were just about the opposite of the '50s; they pre-figured the '80s. These changes in the zeitgeist are more like cycles than a one-time disruption of a formerly static equilibrium.
(Similarly, popular music from the mid-'30s wasn't that bland -- it had only begun to come down from the height of the Jazz Age, when the focus was on fun, danceable melodies. It took decades of gradual erosion to reach the chart-topping hit "How Much Is That Doggie In The Window?" Rock 'n' roll was still off in the distance during the mid-'50s. And in architecture, the mid-'30s had only begun to wipe off the ornamentation and boil the elegance out of Art Deco. The Streamline Moderne of the later '30s and early '40s was just the first step toward the soulless mid-century minimalism.)
Fifth, economics tells us little or nothing about long-term changes in the creative part of movie-making. When we look at the first graph, we don't see the business cycle, or whatever, staring back at us.
It's too bad I can't apply these methods back to the rising-crime period of ca. 1890 or 1900 through 1933. I bet you'd see something like the 1960-1990 period, an initially higher level of adaptations that fell off to a bottom sometime in the '20s or early '30s. Yeah, I know they adapted Frankenstein and All Quiet on the Western Front, but lots of hits were not adaptations of specific works but of common legends (Intolerance, Robin Hood, The Thief of Bagdad). Not to mention all of the blockbuster classics that had original narratives, like Metropolis and King Kong.
Finally, let me emphasize yet again that this is only about how original the narratives are. Obviously a lot else contributes to overall originality, like the equally important visual elements. And not all adaptations have the same snore-inducing effect on a normal person. At least those adaptations from the '50s weren't based on children's cartoons and toy lines, although each year might have had several campy-looking musicals back then, so even they weren't as mature as we tend to think.
Methodology:
Data on the box office top 10 come from the various "year in film" pages at Wikipedia. I then read through the Wikipedia entry on each movie to see what the source material was. Given how popular these movies are, and how easy it is for the editors to find out if a hit movie was based on a hit play or novel, this information wasn't lacking.
We all know what a sequel and prequel are, but just to clarify, I included both plot-based sequels, where the narratives are inter-related across the movies (like The Empire Strikes Back), and sequels based on the same character, tone, and themes, but with little or no continuity in plot (like most of the James Bond movies).
An adaptation is a different way of being less original in the storytelling, so I coded it separately. Just about the entire literary side of a movie can be grafted in from a novel, play, previous movie, TV series, etc. If the Wikipedia entry said it was only loosely based on the source, that the plot was extensively altered, etc., I didn't count it as an adaptation. In those cases, maybe they were trying to ride on the coat-tails of the source's success, but at least they made up an original story.
I counted anything that seemed long enough in form to be used in a movie -- not a single short story, not a news or magazine article, and so on. Serial stories, series of shorter children's books, etc., were counted. Biographies, autobiographies, memoirs, non-fiction books, and the like I judged case by case. If the real events described, and the subjective tone relating to them, were well known and almost legendary, I didn't count it as an adaptation of another's original work. If they were more stylized, bringing less known events to light, etc., I gave the original one credit for creating something, and counted the movie as an adaptation.
After a while, even original works take on a legendary status, so that a modern movie-maker could still give their own take on them, rather than borrow the plot entirely. There's no clear way to decide what's legendary and what isn't, so I just considered any work older than roughly 100 years at the time of the movie to be a legend, myth, fable, tale, etc., whether we know exactly who originated it (Romeo and Juliet) or not (Aladdin). If the movie was based on a newer work, which in turn was based on an older legendary work, I counted that as an adaptation -- like the 1959 movie Ben-Hur, which is based on a novel from 1880, even though it relates stories from the Bible.
Overall, though, these somewhat puzzling cases were not common in any single year: most adaptations are from earlier movies, stage plays, novels, and other long forms of fiction. So, quibbling wouldn't alter the 76-year pattern in the graphs.
Lastly, if a movie was a sequel to an adaptation, I counted it as both. So the first Star Trek movie was just an adaptation, and the rest after it were both.
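For concreteness, here's a rough sketch of those coding rules in code form. The field names, the helper, and the example record are all hypothetical, and the fuzzier judgment calls (loosely based adaptations, case-by-case non-fiction) are assumed to have already been made by hand and stored as flags.

```python
# Rough sketch of the coding rules described above. All field names are placeholders;
# the hand-made judgments (e.g. "loosely based") are assumed to be stored as booleans.

LEGEND_AGE = 100  # works older than ~100 years are treated as legend/myth, not as adaptations


def code_movie(movie):
    """Return (is_sequel, is_adaptation) for one hand-researched movie record."""
    # Sequels include both plot-continuation sequels and same-character/tone sequels.
    is_sequel = movie.get("sequel_to_earlier_movie", False)

    is_adaptation = False
    source = movie.get("source")  # None if the narrative is original
    if source is not None and not movie.get("loosely_based", False):
        if movie["year"] - source["year"] < LEGEND_AGE:
            # Based on a specific, recent enough work: novel, play, earlier movie, TV show...
            is_adaptation = True
        # Older sources (Romeo and Juliet, Aladdin) count as legend, not adaptation.

    # A sequel to an adaptation counts as both, like the later Star Trek movies.
    if is_sequel and movie.get("franchise_rooted_in_adaptation", False):
        is_adaptation = True

    return is_sequel, is_adaptation


# Example: the 1959 Ben-Hur, adapted from an 1880 novel -- recent enough to count.
ben_hur = {"year": 1959, "source": {"year": 1880}, "sequel_to_earlier_movie": False}
print(code_movie(ben_hur))  # -> (False, True)
```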
July 26, 2012
July 24, 2012
Forgiving vs. belittling satires
In falling-crime times, the members of a less threatened populace withdraw more into their own little worlds, and so don't feel as attached to others. Naturally, satire assumes a central place in their culture, particularly satire of the snarkier kind. Suddenly everyone thinks they're so witty, and begins to lambast the superstition, exuberance, and balls-to-the-wall atmosphere of an earlier phase in their history (typically a rising-crime period, when things were the other way around).
Still, while satire may not come as naturally to the supportive and fun-loving people of rising-crime times, satires continue to be made, albeit in a more forgiving and sympathetic way that fits better with the zeitgeist. Unlike a snarky satire, which only aims to blast The Other, these forgiving satires work on their own as an example of the very thing they're poking fun at. They walk a thin line between being an unabashed example of the genre and an overly self-conscious parody.
This stylistic ambiguity is typical of the culture of rising-crime times, in contrast to the more fanboyish and "fisking" works put out in falling-crime times. It allows the audience to enjoy the work as an example of a guilty-pleasure genre while also gently reminding them that they shouldn't take it too far -- that there is a certain silliness in it, but that it's OK as long as you're subliminally aware of it.
Just considering the most recent rising and falling-crime periods, the only exception to the snarky kind of satire of the past 20 years is the running series of posts at the website Stuff White People Like. You can tell he (and the readers) really do like a lot of that stuff, so it works as a fan site for yuppie trends. At the same time, he does continue to rib the audience for its bogus claims of inclusiveness and diversity, which everyone realizes are not as big of a priority as showing off your Mac toys and hiking gear.
In contrast, all those Scary Movie kind of movies are satirical, but they don't actually work as horror movies, epic movies, and so on. Family Guy lays the sarcasm on real thick for everything but its own lazy Harvard-mafia approach to writing sit-coms.
Toward the end of the last rising-crime period, though, people felt free enough to make and watch movies or TV shows that were forgiving satires. It began with one of the funniest movies ever, Vacation, which parodied the road trip and family bonding movies by reminding us how much the open road wants to fuck you, and how easy it can be to blow your cool around your family. In spite of that, though, we really do feel the Griswold family growing closer over the course of their odyssey, and it makes a terrific road trip movie. Tell me you don't feel the wind blowing through your hair each time they play "Holiday Ro-oh-oh-oh-oh-oh-oh-oh-oh-oad."
This Is Spinal Tap not only worked great as a (fake) documentary of a metal band -- near the height of metal's popularity -- but also as a parody of the genre's more out-there tendencies: Medieval imagery, all-black album covers, turning it up to 11, etc. In comparison, Airheads from 1994 was pretty weak, although it was at least watchable, unlike satires from further on into the falling-crime period.
The writer of Heathers, Daniel Waters, says his goal was to make the teen movie to end all teen movies. It not only works great as another entry in the genre, but pokes fun at some of its hyperbolic features -- the star-crossed romance, where the bad boy turns out to be a full-blown psychopath; the urge to get back at your frenemies, which ends in serial murder; the orgy of emotion about saving the youth generation; and the one-dimensional portrayal of parents ("Kurt buddy, I don't care that you really were some... pansy." "Will someone tell me why I smoke these damn things? -- Because you're an idiot -- Oh yeah, that's it.")
After a deluge of buddy cop movies, The Hard Way allowed itself to lovingly mock the genre's conventions -- stupefyingly mismatched partners, one of whom is on the brink of a meltdown, the moment when one apparently falls from grace and has to be supported by the other, etc. The self-awareness is emphasized by the central plot device, where a tree-hugging Hollywood actor wants to pretend to be a detective in preparation for an upcoming role, while working alongside and learning from a real-life loose cannon. At the same time, it's an awesome action flick where we actually connect with the two detectives as they search for a serial killer running amok in the big city. This is definitely one of the most under-rated movies; if you haven't seen it yet, make it next on your list. It's almost up there with Beverly Hills Cop.
The generally saccharine nature of romantic comedies -- aka chick flicks -- keeps them from taking even a somewhat tongue-in-cheek approach. It has to be the perfect story for the perfect princesses in the audience. But L.A. Story was about as close as we're going to get to one that satirizes while still fulfilling the basic goals of the genre. It gets a bit too self-aware at times, but usually not in a way that wakes you up from your absorption in the story. Also caringly parodied here is the "let's pack up and head out to California" genre, which since the '90s has died out as the state has held less and less appeal as America's own paradise.
And then there was the slasher film. I'm not talking about the self-consciously campy ones. It's hard to both satirize the genre and still pull off the aims of such a movie, which require a suspension of disbelief -- that something so out of the ordinary is threatening you, a feeling that evaporates once you're aware of how formulaic the story can be. I'll give honorable mentions to Candyman and the very late entry of Urban Legend, while passing over Scream as too unrelentingly self-aware to fit with the rest of the movies in this post.
However, the one that stands out the most is Wes Craven's New Nightmare, also a late entry from 1994, which is a lot like Scream but with almost no interruptions of the "hey guys, you get it?!" variety. It's a great slasher flick on its own, and it retains the supernatural tone of the original Nightmare on Elm Street, which sets up a clever blurring of the barriers between the real world and the world created by the screenplay of the earlier Nightmare movies.
The most prolific director in this style of film-making by far was Paul Verhoeven. RoboCop was not only a kickass sci-fi action movie by itself, it commented on several of the genre's traits that, if indulged, would lead us off into wacko territory. For example, putting too much faith in the police (or vigilantes) instead of neighbors watching out for neighbors, or feeling an emotional connection with a robotic savior that can feel nothing for us in return.
Total Recall was an even greater badass sci-fi action flick, with even greater comic relief than "I'd buy that for a dollar!" -- the chatty too-friendly robot driver of the Johnny Cab. Like RoboCop, it was also a hilarious send-up of the genre, especially the utterly careless attitude toward human carnage. How can you forget the perfect black humor in the shoot-out scene on the escalator?
Finally, Basic Instinct took a pretty good stab at parodying the erotic thriller, although a few too many lines of dialogue are meta-aware, which keeps us from fully getting into it as a plausible erotic thriller itself. It's not so much Sharon Stone's delivery, which is always straight-faced, but the content of the lines themselves -- "Have you ever fucked on cocaine, Nick? It's nice." "You know I don't like to wear any underwear, don't you Nick?" Etc. Verhoeven's over-the-top approach is better suited to both satirizing and excelling in the action genre, but for something that's supposed to be more subtle like a thriller, the exaggeration prevents it from being a great example of the genre itself. It's an entertaining movie, just not as great as RoboCop or Total Recall.
Now, what popular genres did Twin Peaks not parody to perfection? The soap opera, the teen melodrama, the slasher / horror movie, the Happy Days nostalgia for the pre-'60s era in a smaller town, the rebirth of Gothic novels, the rebirth of film noir, the detective procedural, the buddy cop movie... shit, that's just off the top of my head. And yet it excelled in all those styles as well. Given its serial format, drawn out over several TV episodes, I think it was easier to find the right mix of satire and affection during a certain scene. They could go more in one direction for awhile, and come back toward the other direction in a future episode. It doesn't matter if it ultimately flew off the rails and devolved into camp. For a good run, from the pilot through the solving of Laura Palmer's murder, it achieved a fusion of parody and tenderness that I haven't felt from anything else.
I could be missing a few other examples, but that sums it up pretty well. I'll follow up sometime soon with case studies from earlier periods.
Categories:
Movies
July 22, 2012
The Dark Knight Rises
[Just re-reading this... pretty long, but fuck it, it's 6am and I don't feel like editing. It does cover more than just this particular movie, though.]
No plot spoilers here. I'm kicking around a separate post on Christopher Nolan, where I might include more detailed talk of the plot of The Dark Knight Rises. This is just some first-impression stuff after seeing it tonight. It's 2 hrs and 45 min long, but the time seemed to zip right on by. They could've added another 15 to 30 minutes and I probably wouldn't have noticed.
It looks pretty good, no surprise since it's the winning team of Nolan and his cinematographer Wally Pfister. Their visual style is one of the few today that is still committed to shooting with anamorphic lenses, which by limiting the depth of field makes the foreground figures crisp, while compressing the rest of the space -- from a few feet behind them all the way back to infinity -- into a blurry sheet. In addition to making it easier for the viewer to focus on the important objects, it gives it a more stylized look, like low-relief sculptural detail emerging from a slab.
Still, about an hour of it was filmed with IMAX cameras, and their spherical lenses reveal a lot more of the depth of the space. For some sequences, like a vehicle chase where you'd want to clearly see both the near vehicle and the somewhat-far-off one, a spherical lens seems like a good choice. They didn't use them for the dialogue scenes, which retain the above-mentioned stylized look familiar from Memento, Batman Begins, and Inception.
However, some of the crowded action scenes were shot in IMAX, and you see too much detail of the numerous fights going on around the important fight between Batman and Bane. For a scene of mass chaos like this, it's better to focus our attention on the central figures and blur the rest out -- not only because it aids our attention, but because it heightens that sense of a formless, teeming mob. When each mano-a-mano is fairly in focus, it just looks like a bunch of isolated fights, rather than one of those brawls from the cartoons where arms, legs, and heads jut in and out of a moving tornado.
Or to use a more highbrow reference, look at David's The Intervention of the Sabine Women or Delacroix's Liberty Leading the People. Only a handful of figures stand out in a mostly horizontal frieze-like arrangement, while a human dust-storm rages behind the front-facing plane of main figures. They should've filmed Batman and Bane in a close or medium shot, with all the other fights blurred out in the background. And ditto for most of the other action scenes. I'd have to watch it again, but I think I might prefer the look of Batman Begins because it was shot entirely with anamorphic lenses.
As in their other movies, there's minimal CGI and no 3-D. Practical effects are more convincing because they're real things, while CGI is an impostor reality -- I still don't get how hard this is to understand. But stupid audiences seem to love it, so I guess that's why there's so much of it. And 3-D has the opposite effect of an anamorphic lens, bringing into sharp focus the space right in front of your face all the way back through the theater. Too much clear detail makes it harder to find the important things quickly, and takes away the stylization of the low-relief look, making the perception of depth too realistic.
There are a lot of different locations, potentially confusing us, but Pfister does a good job of subtly linking the ones that are similar. For example, early on Bruce Wayne is in physical recovery, somewhat reclusive and reluctant to take on the Batman role again. Commissioner Gordon winds up in a hospital where he recuperates, is in minimal contact with the outside world, and is reluctant to reveal the truth about Batman taking the fall for Harvey Dent, as well as to lead the police forces against Bane. Both sets have a color palette of mostly light blue with some cream, and the lighting is even, like what you'd find in a sterile lab with banks of fluorescent lights overhead.
They lack the stark light-shadow contrast and the warmer colors of another pair of sets -- a charity masquerade ball held by a sustainable energy proponent on the corporate board of Wayne Enterprises, and the initial party held in celebration of the crime-stopping success of the Harvey Dent Act (ironically named after a villain from the previous movie, whose crimes Batman has taken the fall for). The strong chiaroscuro of these scenes not only adds to their elegance, but gives them a haunting things-aren't-what-they-seem feeling.
And there's a similar look to the different underground sets -- a far-off prison that Bruce Wayne gets sent to, and the make-shift prisons underneath Gotham City once taken over by terrorists. They share the other-worldly chiaroscuro of the scenes involving the social life of the influential, but are marked by a much coarser and grittier texture, not smooth and slicked-back.
A simple choice like this makes it much easier to keep track of all of these different settings by narrowing down the number of unique-looking places, and effortlessly guides our mind to think of the themes and characters inhabiting them as similar. No ham-fisted exposition needed to draw the parallels.
Well, enough about the visuals, which is mostly what I went to see it for. The music is pretty good too -- not too highly memorable, but it struck the right emotional chord at the right moment. I tend to avoid most new movies, since they just about all suck, so maybe I'm missing some more recent examples -- but it was such a treat to hear a repeating drumbeat motif during a tense scene. You'd think most people knew that, either based on their own heartbeat or the pounding of their feet as they're running, or from other movies that have successfully used this simple trick. But damn has it been awhile. There's also a more epic-sounding percussion theme when Batman and Bane are duking it out. It reminded me of the Oriental Sublime feel of the Black Rain soundtrack, and sure enough Hans Zimmer made that one too.
The plot I won't say too much about, except that it held together pretty well, always felt like it was moving forward, and built up from a more mellow pace to an exciting climax. Some have complained that it starts out slow -- I didn't think so. Maybe I was just too lost in the pretty pictures. But that's the point anyway: that the police bureaucracy has successfully rid the city of crime, and they're all in a relaxing, back-to-normal pace of life. This makes it even more striking when the terrorists take over and ramp up the rhythm of action. It's like during the 1990s when Giuliani basked in the glow of scrubbing New York City clean of vice and crime, only to witness a surprise terrorist attack that sent the Twin Towers up in a puff of smoke.
After getting home I read the plot as described on Wikipedia, and there were several points that I didn't get too well. They weren't very crucial, and I probably would've caught them on a second viewing. The big thing that did go over my head was what became of Bane at the end. They had several sequences inter-cut during the final stretch, all action-packed, and at least for me the fate of Bane got lost in the shuffle.
As for the cast, as in Nolan's other movies thank god there aren't any dumbass Millennials, aside from Catwoman's sidekick who's rarely seen or heard. (That annoying dork from Juno who showed up in Inception was that movie's one misstep in casting.) Hard to believe now, but Christopher Reeve was in his mid-20s when the first two Superman movies were filmed, and Tom Cruise did Top Gun through A Few Good Men by the time he was 30.
But younger people today are too autistic and sheltered to know how to relate to other human beings, let alone strangers who you have to trust deep down and be vulnerable around, and let alone in unmediated ways, like when you're acting out a movie. Sorry, you can't communicate there primarily through text messaging. Sure, older people -- or those only "of a certain age" -- are playing their hand closer to the vest than they were in the good old days, but they still have the memories of those formative experiences to tap into during a performance.
The Dark Knight Rises continues Nolan's trend of not pandering to younger audiences by casting young actors who can't act. And it hasn't hurt his movies' appeal to those audiences. Directors just need to stop being such pussies about having every demographic represented in the movie. As long as it appeals to them, they don't need to see a perfect reflection of themselves on the screen. I don't remember seeing a child on the military team in Predator, but that didn't stop me from loving it when I was little. I still remember how much we hated when they included that whiny little dork in Terminator 2, and he was just a couple years older than us.
Finally, the characterizations all felt natural and their development believable. Bruce Wayne's decision to come out of seclusion was set up well by the opening tone of celebration and complacency among the police and politicians for having slammed the crime rate down so low. Sound familiar? And it's not like we feel bored because, so what, Batman's gonna go kick some more ass that he didn't notice the first time around. If he's so committed to secluding himself, then only something unusually dangerous will bring him out, making it uncertain if he'll be able to defeat it or not.
Bane I found pretty uninteresting as a character himself, but that had the positive effect of focusing the story on the relations between all the other characters trying to save Gotham City, not all of whom are on the same wavelength and thus have to learn to negotiate with each other. Bane is just like the ticking time-bomb he wants to set off, precluding any attempts to reason with him, tempt him with anything to move him into a weak spot, or physically de-fuse him.
Miranda Tate was hard to understand at first. I thought she was just some random philanthropic environmentalist who's been discussing project ideas with Bruce Wayne. To me it wasn't clear that she was already a board member at Wayne Enterprises. Anyway, nothing major, and she does provide a nice foil to Catwoman.
Selina Kyle (aka Catwoman) had a much better character arc here than in Batman Returns, where she was back-from-the-dead and in search of vengeance. In The Dark Knight Rises, she lacks that single-mindedness of harmful purpose -- instead she's portrayed as an ordinary cat burglar. She's not evil, just a cold-hearted manipulative bitch who knows how to scheme her way into getting what she wants. Unlike the original Catwoman, this leaves the door open for later redemption; waiting to see whether she'll walk through it or not builds some tension in the audience. She's only screwing around with Bruce Wayne and Batman because it suits her needs right now; she has no vendetta against him or anyone else.
They did somewhat overdo the whole butt-kicking-babe in a dominatrix outfit thing, though. If she's a cat burglar, she should be shown to be stealthy, not confrontational. That goes even more since she's female: if she really went up against that many male criminals, her ass would be grass. That's why they should have stuck only to her skills in social savvy, emotional manipulation, and betrayal of trust when it suits her fleeting purpose -- chicks are just better at that stuff.
Officer / detective Blake is a little too reserved to make much of a connection with. They try to open him up a little at the beginning by giving him a couple lines of exposition about growing up in an orphanage, but we generally don't see him in a real shit-hits-the-fan kind of moment. But he does move up in importance throughout the movie, leaving the possibility open in future movies that he may be given a more dynamic role.
Anyway, enough about a new-release movie. Just going through this stuff in detail to highlight what's been missing in movies of the past 20 years. Even if it is a summer action flick, why should it have to have a stupid story, boring music, and a lame look? Go back and watch Die Hard, which in addition to energetic action scenes and shit blowing up real good, also has great set design and cinematography, pacing that built up tension, and a believable character arc about a hero who didn't set off at the outset to kick so much ass, but whom larger urgent forces pulled out of his self-pitying shell. Action movies used to look pretty damn good, and feature stories not written for babies; The Dark Knight Rises is one of the few that still does.
Categories:
Movies
July 20, 2012
Is this a second era of spin-off movies?
Of the box office top 10 movies released in 2011, 9 were sequels, and the other was based on a TV show (The Smurfs). Even within the sequels, 4 were ultimately spun off from TV shows (the Mission: Impossible and Transformers movies) or book series (the Harry Potter and Twilight movies). The Transformers movie was also based on a kids' toy line.
Can nobody think of any new ideas? It seems like everything is a sequel, prequel, remake, reboot, re-imagining, or a spin-off from an existing series in some other medium -- TV show, toy line, novel, etc. I'm still waiting for them to mine other domains in pop culture, like Peanut Butter Cups vs. Gushers: Annihilation.
Rewind only as far back as 1984, and just 2 movies in the top 10 were sequels: Star Trek III and Indiana Jones and the Temple of Doom. Only the Star Trek one was ultimately based on a non-cinematic source (a TV show). Ghostbusters, Beverly Hills Cop, Romancing the Stone, Gremlins, Footloose, etc. -- all new ideas. Scroll through Wikipedia's list of other releases at "1984 in film" and notice how few adaptations and sequels there are.
I wonder how far back this originality goes, though. With a lot of time, I might actually try quantifying this, but not now. The basic approach can be illustrated with one example year, say 1955, given that in many respects we're in a neo-'50s zeitgeist right now.
I think you'd give less weight to adaptations of literary material, since a book doesn't tell you much about how the movie should look, how the players should interact, what the sound should sound like, and so on. Remakes or sequels of movies would be weighted pretty heavily. And so would adaptations of plays. Those all provide a much more filled-in template at the outset of making the movie.
Of the top 10 movies of 1955, only 3 were fully original: Lady and the Tramp, Rebel Without a Cause, and Love Me or Leave Me (although it was a bio-pic). Movies adapted from novels accounted for another 2: East of Eden and The Sea Chase. The remaining half of the big hits were adapted plays: Mister Roberts, Guys and Dolls, The Seven Year Itch, Picnic, and Oklahoma!
That's nowhere near as bad as all the lame adaptations we have today, but it's still striking compared to the originality of material from the '70s and '80s, probably starting as early as the '60s. I don't have a clue how original the material was in the '20s, and don't have the time to check systematically right now. But this is the kind of thing you can easily quantify and plot over time.
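As a toy illustration of that weighting idea, here's a hypothetical scoring pass over the 1955 top 10. The weights are placeholders I made up just to show the mechanics, not a claim about the right values.

```python
# Toy illustration of weighting source types differently. The weights are arbitrary
# placeholders; the point is only that plays and earlier movies would count more
# heavily toward "unoriginality" than novels, and originals not at all.
WEIGHTS = {
    "original": 0.0,  # wholly new narrative
    "novel": 0.5,     # a book constrains the plot but not the look, sound, or staging
    "play": 1.0,      # plays (and earlier movies) hand over a much fuller template
}

# The 1955 top 10, coded by source type as discussed above.
top10_1955 = {
    "Lady and the Tramp": "original",
    "Rebel Without a Cause": "original",
    "Love Me or Leave Me": "original",  # bio-pic, counted as original here
    "East of Eden": "novel",
    "The Sea Chase": "novel",
    "Mister Roberts": "play",
    "Guys and Dolls": "play",
    "The Seven Year Itch": "play",
    "Picnic": "play",
    "Oklahoma!": "play",
}

score = sum(WEIGHTS[src] for src in top10_1955.values())
print(f"1955 weighted unoriginality score: {score} out of {len(top10_1955):.1f}")
# -> 6.0 out of 10.0 with these placeholder weights
```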
Most people who complain (rightly) about the sorry state of movies today do not realize that we're probably repeating the mid-century history of Hollywood. Back then it was radio and TV that were blamed for stealing away theater-goers, and now it's the internet and video games. In reality people in both periods wanted to stay at home all day, and chose radio and TV, or the internet and video games, over movies for that reason. They weren't helpless victims of a new technology, which they could always have chosen not to adopt -- just as most of us in the '80s didn't spend hours and hours every day playing video games, even though the option was there.
They tried all sorts of gimmicks to get people out of their houses, to give them an experience that they couldn't get with radio or TV: widening the aspect ratio to look more panoramic (similar perhaps to the push for IMAX these days), the use of 3-D glasses, etc.
Maybe they thought the same thing about spinning off so many plays -- like, "Hey idiots, here's something you can't see on your idiot box." It was like visiting far-off Broadway, right in your own neck of the woods. Top-quality actors who could sing -- that probably wasn't available on early TV.
I wonder if they also felt like they'd hit a creative slump and figured, hell, it looks like the playwrights are making something that interests the general public. I mean, talk about widespread success -- a boring blowhard like Arthur Miller managed to marry Marilyn Monroe in 1956. Might as well copy whatever they're putting out, willy-nilly if we have to.
We tend to only remember the cool stuff from the past, like film noir from the mid-century, but often it was not very popular at the time. Same goes for It's a Wonderful Life, which flopped on release in 1946 and only found success in the 1970s and '80s. (I don't think most younger people or even 30-somethings tune in regularly these days when it airs before Christmas). This gives us a biased view of the past, and prevents us from seeing how similar it may have been to the present. And of course other periods in the past really were different from ours (like the Jazz Age vs. today).
Anyway, there's lots of room here for a good quantitative history of originality in source material for at least one domain in the art world.
Categories:
Movies
July 19, 2012
Conspiracy theories for cocooners
Steve Sailer has a column up about conspiracy theories since the 1960s. I've been meaning to touch on this topic for a while, so here goes, with a somewhat edited comment that I left.
One huge change from the '60s-'80s period to the 1993-and-after period is the victim of the conspiracy.
In the earlier period, it was a member of some powerful or influential establishment group, and they were targeted by other members of the powerful. Intra-elite factional violence. For example, in Three Days of the Condor, Robert Redford works for the CIA, and is being pursued by CIA agents. The initial conspiratorial bloodbath was also one group of CIA killing another. The JFK conspiracy theories had to do with an elite member killed by some other member or group within the elite. Ditto with Watergate. In Rambo II, there's a government conspiracy to cover up the fact that American soldiers were still being held prisoner in Vietnam (the POW / MIA belief). Admittedly the victims are not from the military elite, but it's still shown as a conspiracy within a single governmental organization, not affecting the average citizen.
In the period from 1993 and after, it's the average citizen who is a potential victim, and the agents are those he'd least suspect -- close friends, associates, or family, who will betray him when he lets them in close enough. In The Fugitive, the victim was an ordinary guy and his ordinary wife, and the initiator was one of his closest friends, who had ties to Big Pharma or something. Neo from The Matrix was an average guy, and so was that chick from The Net, both betrayed by those they'd trusted. Ordinary citizens were harmed by government cover-ups on The X-Files. The inner-city masses were the victims in conspiracy theories about the CIA selling crack in the ghetto to raise money for the Contras (a theory first promoted in 1996). And unlike JFK and Watergate, the 9/11 conspiracy theories all have to do with ordinary citizens as victims, no matter which elite group was supposedly responsible.
So, the message from the earlier narratives was that you shouldn't view the establishment as some benevolent, harmonious group -- just look what they do to their own people. You identified with the protagonist not because you were also a CIA member whose face could end up within the sights of a sniper rifle, but because he was a force of good and justice -- almost like a saint or angel, a higher-status creature than us in the audience, but who was going to try to keep the forces of evil up at the top from harming us on the ground.
In contrast, the message from the more recent narratives is that the establishment could be after you yourself, and they will probably try to get you through agents that you would normally find most trustworthy. Hence, you shouldn't let seemingly trustworthy people get close to you; and by transitivity, you shouldn't let anyone at all get close to you. You identified with the protagonist more out of personal fear -- it could be you, an ordinary citizen, who the higher powers might track down next, and it could be you who gets betrayed by your own friends and associates. Doesn't matter if you're not a doctor, computer programmer, etc. -- everybody works in some industry where the bigwigs would want you disappeared if you stumbled onto inconvenient information.
The earlier narratives focus more on we the people not putting blind faith in the establishment, but rather reminding ourselves that there are power plays and factional violence within the elite themselves. They have no implications about how much an ordinary citizen should trust another ordinary citizen. The recent narratives focus on the average person isolating himself from the entire rest of society, perhaps excepting his closest blood relatives. They're part of the more general trend toward cocooning away from your fellow neighbor during the past 20 years.
We see the anti-establishment kinds of conspiracy theories during rising-crime times because people can see with their own eyes that the elites are not as omniscient, omnipotent, and omnibenevolent as they were thought to be. They get a pass on these matters during falling-crime times because, hey, whatever they're doing seems to be holding the violence level down. Only when it starts rising do we wake up to how clueless, impotent, and corrupt they can be -- why else do murder and rape become ever more frequent, year after year?
During falling-crime times, people don't feel as strong of a need to band together for common defense and support. But they need a pretext for splitting themselves apart from others. Now the focus of conspiracy theories is on how those you've trusted are actually going to betray you to the higher powers, who are almost ignored -- more disgust is heaped on your friends who'll betray you. How could they, after you let them in close? That'll teach you to trust other people. Now you've got a plausible, reassuring reason for why you have to unplug from community life and associate only with your nuclear family.
Categories:
Cocooning,
Pop culture
July 18, 2012
Gay Peter Pan-isms: Shorts, sandals, and t-shirts
Some examples of gays being stunted at around age 10 are hard to see because normal males have become infantilized in a similar way. For example, during the '90s it became acceptable for supposedly grown men to wear shorts, t-shirts, and sandals or flip-flops in a much broader range of places -- not just hanging out at the beach, pool, friend's back yard, etc.
That was a backward slide after the return of button-down shirts, suits, and ties in the '80s (even for the artistic types, as in the New Wave look, which gets Kevin Bacon in trouble at his new school in Footloose). In the mid-1970s, even a psychopathic taxi driver like Travis Bickle wore a button-down shirt, sport coat, and ankle boots when he took out a woman on a date.
And since gays are supposed to be so much better dressed than everyone else, shouldn't they be the last group to succumb to the man-child look of today? Not at all -- their Peter Pan tendencies override whatever desire they have to look snazzy. I've seen very little of that, by the way -- I mean gays who dress in a way that women would call dashing. Their preferences lean so heavily toward campy that Tim Gunn, the mentor from Project Runway, has a stock phrase for the extremes that gay male designers so easily go to -- "clown clothes". Y'know, the kind of stuff a child would design, since kids think more in caricature than adults or even adolescents do.
I don't see them doing the preppy summer look either, like shorts with a polo shirt and boating shoes or something. It's more common to see them wearing some "designer t-shirt," sandals or flip-flops, and faggy looking shorts (tight and just past knee-length, formerly the capri pants but now jean shorts with the ends rolled up over the knee). As I've said before, I see at least a dozen or so queers a day at Starbucks, the supermarkets, and public transport around here. By now I would've known if they dressed in a more grown-up way. Even an adolescent's uniform of sneakers with jeans or pants seems to be too much for a good fraction of them.
How does this differ from the hetero man-child's dress code? Basically it's just more form-fitting, probably a by-product of their being slimmer and not trying to hide a tub of guts like an XBox junkie. And their clothes are more stylized and streamlined, reflecting their somewhat greater interest in how things look. Other than that, though, they both look pretty kiddie.
I stress again that this is not just some kind of part-time casual look, when they're done with their dashing grown-up look. They really do have an aversion to dressing more age-appropriately. It's so disgusting to see them once they're past 25 or whenever their AIDS starts rotting the skin, and they're walking around exposing their mummy legs.
You'd think that people crawling with disfiguring diseases would try to cover up more to protect their ego, kind of like how psychologically normal women stop wearing shorts and mini-skirts once their cellulite reaches a certain level. But in true 10 year-old brat form, faggots show little reaction to social shaming -- that's more of an adolescent development -- and insist on wearing whatever embarrassing, degrading shit they feel like. One of the easiest ways to set a homo off on a temper-tantrum is to tell them what they can and can't wear. Normal people respond in a variety of ways to that kind of advice, but they don't go into nuclear meltdown, unless they're also mental 10 year-olds.
Categories:
Gays
July 15, 2012
Why movies shot in anamorphic look more striking
The two most salient changes that the visual culture undergoes during a rising-crime period are a greater use of light-dark contrast in lighting and a more restricted depth perspective. When the violence rate begins steadily falling, the look shifts back to less stark lighting and a deeper, more immersive depth perspective. I've roughly outlined why here and here, and periodically I'll go into more detail. In short, the rising-crime features serve to strike an emotional chord in the viewer, while the falling-crime features allow him to be more emotionally detached.
With these two patterns now in mind, I pick up on details that just got filtered out before, lacking a larger framework to be plugged into. The other night, something caught my ear in Ridley Scott's commentary for Blade Runner. He explains why he loves shooting movies in anamorphic, using this scene as an example:
He says that the chess pieces are in sharp focus, Tyrell is just sharp enough, but then there's a quick fall-off in sharpness once you get past his body. It's not only very far-away objects that appear out-of-focus, like the two chairs in the middle of the right side of the frame. Even fairly close things are blurry, like the tables and chairs that are just behind him and to our left, not even as far back as the bed.
So, this scene shows the shallow focus, or the more restricted depth of field, that results from using anamorphic rather than spherical lenses in the camera. [1] The range of distance within which the image appears clear is a lot narrower. If it had been shot with a spherical lens, the greater depth of field would have allowed us to see fairly clear images farther back into the environment.
Hence, all other things being equal, shooting with an anamorphic lens produces an effect more like low-relief sculpture than like high-relief sculpture. (Relief refers to how far the sculpture projects from its backing surface.)
The figure we're meant to focus on appears to be lifted out from an almost formless blur of a background -- and not a background that has its own depth, but like the flat surface of a building's exterior. And because things that lie much closer to us than the plane of the figure also show the same quick fall-off in focus, the plane of action does not appear to be high-relief. The depth of field is narrow enough that it looks like a thin slice of clear forms resting on top of a slab of blurriness, just like low-relief sculpture.
It also resembles paintings where the depth is restricted, although in painting it's usually due to placement of the figures within a narrow plane of action and hindering other depth cues. Compare Raphael's The School of Athens with David's The Death of Socrates. David's painting packs more of a punch because all of the action is concentrated within a narrow fixed distance from the viewer, like the action performed on a stage. When we look at Raphael's painting, our attention is diffused over a much greater range of depth, so that no single plane of action dominates our attention. Raphael wants us to calmly explore, while David wants to theatrically slam us in the face.
Obviously all sorts of other factors influence whether a movie scene looks more like a frieze or a diorama, such as the placement of figures within the environment. Still, the very choice of which type of lens to shoot with -- anamorphic or spherical -- affects this part of the movie's look as well. Anamorphic movies will tend to have a theatricality in their visual presentation, stemming from their shallower focus.
Finally, this allows us to rule out one explanation for why some directors and viewers like anamorphic -- i.e., that it provides a very widescreen aspect ratio. There have long been film formats that yield similar aspect ratios but that are shot with spherical lenses, from VistaVision in 1954 through the Super 35 of today. The reason many of those movies don't actually deliver on the promise of spectacular visuals is that their makers mistakenly thought the appeal of anamorphic was its wider aspect ratio, rather than its effect on depth of field.
Tomorrow or the next day, I'll put up a post with quantitative data on how prevalent anamorphically filmed movies have been since the lens type was introduced in 1953, looking at the top ten movies at the box office for each year. Then we can check if its popularity tracks the violence rate, to see how well it fits the broader pattern of rising-crime visuals having more limited depth perspective. I'll do some qualitative comparisons over time too. Finally, I'll try to include more pictures, to provide something more lively after this mostly analytical post.
[1] This result is indirect, but it still obtains. Here's the clearest explanation I found (from a comment left here).
The other thing is that anamorphic lenses don't have less depth of field per se, but because they have twice the horizontal view and therefore act somewhat like wider-angle lenses, usually one compensates by using a longer focal length -- for example, the equivalent to a 20mm spherical lens in Super-35 cropped vertically to 2.40 would be a 40mm anamorphic (more or less, ignoring the fact that Super-35 has a 24mm wide gate and anamorphic uses a 22mm wide gate). And a 40mm lens has less depth of field than a 20mm lens, the depth of field loss isn't due to the anamorphic elements, it's just due to the fact that you are choosing longer focal lengths to achieve the same view.
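To put rough numbers on that focal-length effect, here's a minimal sketch using the standard hyperfocal-distance approximations. The f-stop, subject distance, and circle of confusion below are assumed values chosen only for illustration, not anything taken from how Blade Runner was actually shot.

```python
# Rough depth-of-field comparison for the 20mm-spherical vs. 40mm-anamorphic
# equivalence described above. Standard thin-lens approximations; the f-stop,
# circle of confusion, and subject distance are assumed for illustration.

def depth_of_field(focal_mm, f_number, subject_m, coc_mm=0.025):
    f = focal_mm / 1000.0           # focal length in meters
    c = coc_mm / 1000.0             # circle of confusion in meters
    d = subject_m
    H = f * f / (f_number * c) + f  # hyperfocal distance
    near = d * (H - f) / (H + d - 2 * f)
    far = d * (H - f) / (H - d) if d < H else float("inf")
    return near, far

for focal in (20, 40):
    near, far = depth_of_field(focal, f_number=2.8, subject_m=3.0)
    span = far - near
    print(f"{focal}mm at f/2.8, subject at 3m: {near:.2f}m to {far:.2f}m (~{span:.2f}m in focus)")
```

With these assumed settings, the 20mm lens keeps roughly four meters of the scene acceptably sharp, while the 40mm keeps less than one -- the "thin slice of clear forms" described above.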
July 13, 2012
Friday grab bag
- Labelscar: The retail history blog. It's mostly for chronicling in words and pictures the decay of malls, not for kitsch value, and not to gloat about it either. The two guys who write for it just remember how enjoyable public spaces used to be, including retail places. They rarely get sentimental, although expect lots of wistful comments. The pictures alone are worth it, bringing back lots of pleasant memories. Even in their decrepit state, you can see how much more exciting and soothing mall architecture was, compared to the now ubiquitous strip centers, lifestyle centers, and big box centers. No dodging traffic through an oceanic parking lot, for instance. Not to mention plants, flowers, wood, water, colored tile, benches you could rest on without having to buy anything first...
There's a state directory on the right, which also has a "Dead Malls" category in case you want to look through all of them that are now gone. The search bar is fun, too -- annoying kiosk, lifestyle center, Wal-Mart, planters, fountains, and so on all bring up interesting, often funny posts and discussions. The current entry is about a thriving mall, but they tend to focus on treading-water or dead ones. Between the post and the comments, there's a pretty good history of the particular mall. Lots of familiar themes from this blog pop up for sure.
- What a small world. A while ago I broke down and bought Martika's self-titled album, for the guilty pleasure song "Toy Soldiers". Recently, after looking to see if the song "Cross My Heart" had a video, I stumbled upon the fact that it's actually a cover song. And as it turns out, the original by Eighth Wonder does have a video -- recognize the singer? It's totally that South African chick from Lethal Weapon 2 who Riggs gets it on with. Small fuckin' world sometimes, man. Great energy she's got, by the way. Back when girls were still boy-crazy.
- The Baconator. Dang, it's about time I tried this thing out. I used to be happy getting several of the single stackers at the BK Lounge, but lately I've noticed the quality slip. (Yes, there are standards even for fast food -- I'll never eat that slop from Taco Bell). After the stackers I felt sleepier, maybe from too much oxidized fat during the cooking process. I have a feeling it's some change introduced by the new corporate owners, who are also responsible for putting up a bunch of dopey posters and offering new items to draw in the yuppie crowd.
No such worries with Wendy's, though, which still has "Old fashioned hamburgers" on its logo. They never felt like expanding internationally either, which may have kept their finite business talent more concentrated on what Americans like. Well, it shows. Take a simple thing like bacon -- why don't hardly any McDonalds items have it? I looked at the Angus burgers, and they've all got sugary junk in the patties themselves, so no sale there. Aside from those burgers, zippo. Wendy's is the only place where they might just manufacture their silverware from bacon.
I get a double baconator with no bun, no ketchup, no mayo, but with pickles, onions, and salt and pepper (since it's unseasoned). Really puts some pep in your step. You can taste the difference that Wendy's fresh ground beef makes, instead of frozen patties. It is a little more expensive, but not by that much since Burger King just raised the price on those stackers.
- Most wrist-slitting man-baby movie of the year looks like it'll be Wreck-It Ralph. It has both CGI and 3D visuals, a plot based seriously on video games, and Sarah Silverman voicing one of the characters. Nerds are already straining hard to compare it to Who Framed Roger Rabbit, all because a bunch of video game characters from different series are brought together in a single movie. BFD.
Who Framed Roger Rabbit is a made-for-kids film noir that has an enjoyable, non-gimmicky mix of live-action and hand-drawn animation. That's what drew the audiences to it -- not nerdy "fan service" like getting to see Mickey Mouse and Bugs Bunny in the same scene.
Categories:
Architecture,
Food,
Movies,
Music
July 12, 2012
The loudness wars, another case of lost dramatic contrast
Earlier I took a quick look at when the loudness wars started in recorded music, and took a stab at why they began once the crime rate started falling. However, now that I've found a very clear pattern of chiaroscuro tracking the homicide rate in visual art, I wonder if there are parallels in music.
Light must reflect off a surface for us to see it, so the equivalent in sound is volume, the most basic feature for us to hear it. Bright vs. dark is like loud vs. quiet.
A key part of the loudness wars that I didn't touch on earlier is that it's not simply that music recordings today are on average louder. It's that they achieve this mainly by taking the sounds that are supposed to be softer and quieter in the song and jacking up the volume on them. The result is a song with much more uniform volume -- uniformly loud -- instead of one that has greater "dynamic range," or the difference, variation, contrast between the louder and quieter sounds.
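To make that concrete, here's a minimal sketch of what the treatment does to the numbers. It uses a synthetic signal and a simple peak-vs-RMS ("crest factor") measurement, which is only a rough stand-in for the DR-meter algorithm behind the database mentioned further down.

```python
import numpy as np

# Minimal sketch of how flattening the quiet parts shrinks dynamic range.
# Synthetic tone with a crude gain boost on the quiet half -- not the official
# DR-meter algorithm, just peak-vs-RMS ("crest factor") in dB.

def crest_factor_db(x):
    peak = np.max(np.abs(x))
    rms = np.sqrt(np.mean(x ** 2))
    return 20 * np.log10(peak / rms)

sr = 44100
t = np.linspace(0, 1, sr, endpoint=False)
tone = np.sin(2 * np.pi * 440 * t)

quiet_verse = 0.1 * tone          # soft passage
loud_chorus = 0.9 * tone          # loud passage
original = np.concatenate([quiet_verse, loud_chorus])

# "Loudness war" treatment: jack up the soft passage so everything sits near peak.
flattened = np.concatenate([0.8 * tone, loud_chorus])

print(f"original:  {crest_factor_db(original):.1f} dB crest factor")
print(f"flattened: {crest_factor_db(flattened):.1f} dB crest factor")
```

Pushing the quiet passage up toward the peak shaves a few dB off the crest factor, which is the same kind of flattening the remasters show.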
That seems like the more important part of the change -- loss of dramatic tension or contrast that comes from hearing a loud passage break in after a softer one, or the coming-down feeling of a soft passage following a loud one. As detailed in the post on chiaroscuro, there are reasons why people in rising-crime times prefer more theatrical presentation, and falling-crime people prefer more sameness to the presentation.
Here is a great review of these changes in dynamic range, plus an illustration comparing an original recording to two later remastered versions. (Only the first pair of links at the end of his article works for me.) You can hear what should be softer parts, whether vocal or instrumental, getting a lot louder in the remastered ones.
So, just as our visual culture has gotten more bland from using more uniform lighting, our musical culture has become drier from more uniform loudness. And as that article demonstrates, it's happening even to re-issues of music that was originally recorded with a wide range of quiet to loud sounds.
Unless you have multiple copies of a CD, you may not even notice that the newer remasters tend to sound louder, more distorted, and not as full of rich contrast. I had a suspicion about it, but it wasn't even based on comparing the same album, just noticing that any remaster was likely to sound that way, whether I could compare it to the original or not. Reading more about dynamic range compression, I'm convinced.
Now before I get a CD, I do a quick look-up on the neat DR Database to see if there are multiple releases, and if so, which ones will sound nice and which will sound loud and garbled. Not all newer releases sound bad; this quick check gives them a fair chance. You can also see whether you should swap out the copy you have for a better one. It may seem odd that earlier presses of an album sound better than newer ones, but remember that CD players were high-end equipment back then, not cheap junk.
Playing around with that DR Database, it seems like 1993 is close to the ground zero of the loudness wars. Find any band that has released CDs from the format's debut in 1983 up through today -- including CDs of music they originally recorded in the '60s or '70s but didn't release on CD until later. Click the "Year" column header to line them up in chronological order. Sometime around '93 or '94, the sound quality pretty reliably starts going to hell, bottoming out in the 2000s and this decade.
That's another way in which the early '90s weren't so bad -- studio production still sounded pretty slick, even for grunge albums like Nevermind.
Categories:
Music
July 11, 2012
Ideologies are not religions
Speaking of the decline in ritual, that reminds me of the very common view that, for its adherents, such-and-such ideology "is like their own religion." Communism, political correctness, the eco-friendly organic movement, etc. If it's just a short-hand for "they have their own rigid orthodoxy," then that's fine. But a lot of people push it seriously, as a jab at the group's hypocrisy.
The most obvious lacking religious feature is the focus on the sacred supernatural. The subject matter is entirely profane, the causes and forces entirely secular.
Even leaving that aside, ideologies do not incorporate much ritual, which ideological people view as superstition.
They are particularly short on rituals that occur frequently, in a group setting, and that bond the members together. Please don't tell me that organizational meetings count -- during my activist days, I probably sat through hundreds of them, and they don't. They don't involve activities that put the members on the same wavelength, like dancing or marching to a common rhythm, chanting a prayer in unison, eating the same substance, or fixing their eyes on the same figure.
And of course your typical adherent of political correctness, etc., does not even attend meetings with fellow travelers. There is no communal behavior to reinforce and intensify their beliefs. They're just kind of cruising along through life, subscribing to some list of beliefs, aware that there are others with similar beliefs, but not meeting up regularly to solidify social bonds among themselves. Nothing like the weekly church gathering, periodic church dances, or even private daily prayers.
They've managed to get some of their idées fixes branded as national holidays -- but that doesn't mean they actually celebrate them. That was just a way to symbolically rub it in the face of those outside the ideology. What about the holidays that are not yet recognized by the government? They don't honor those either. "Festive" and "celebratory" are moods that you rarely or never find ideological people in, at least in a context related to their ideology.
It's no wonder that those who get into ideologies come off more as airheads, autistics, and killjoys.
Listing a handful of genuine examples of ideological rituals doesn't go against what I've argued. Sure, the national conventions of the Democrats and Republicans are intense group-bonding rituals, although those only happen every four years, not weekly or even yearly like the candlelight Christmas service. And sure, there are occasional purification rituals that the offenders of PC have to undergo before the mainstream of society will welcome them back. Perhaps at the beginning of their college career, they emptied their soul of "white guilt" before a group of their peers as a sign of good faith, to signal their intent to atone for their sins. And heaven forbid you violate the food taboo of drinking a Coke instead of an Oogave cola!
Still, these rituals are either very uncommon, or do not take place in group settings (like an individual's frequent observance of the food taboos against mainstream brands). By itself, that wouldn't hurt their status as a religion or not a religion, since rites of passage are also one-time-only affairs. But if the ideology doesn't involve frequent meetings that bond members together, these occasional rituals are not enough to give cohesiveness to the group, which any religion must have, no matter how large or small.
And none of this precludes an ideology from transforming into a religious movement. Fascism started out as just another ideology, but during the peak of Nazism in Germany, it did have a rather religious feeling to it. It's hard to say for sure because, again, the focus on the sacred supernatural is also a defining feature, and I'm not sure how much the broad membership looked to the movement with that concern in mind.
It seems part-way between militarism, which has all of the ritual stuff I've talked about but not necessarily any supernatural element, and a clear-cut religious movement. They had a lot of half-baked talk about mythology, but I wonder how much the average Nazi cared about that, compared to the intellectuals who discussed it more. Their invocation of various gods seemed more allusive than true-believer.
July 9, 2012
Going to church declining even among those who strongly identify with their religion
Have you ever met someone who is heavily into the belief side of religion, but doesn't meet up with fellow believers? Indeed they may be obsessed with the intricacies of the relationships among each of their beliefs, as well as how they are similar to and different from those of others within their main religion, or other religions entirely. Yet they almost tense up at the thought of regularly congregating with other adherents.
The General Social Survey asks people how often they attend religious services, as well as how strongly they identify as a ____, whatever religion they professed in a related question. I've taken only respondents who said they are a "strong" _____, and also kept them just to whites. Here's the change over time in the percent of them who attend religious services nearly every week, once a week, or more than once a week:
There's no visible trend from the 1970s through the '80s and even the early '90s. Starting in 1994, though, frequent attendance among strong believers starts to fall off. It got a shot in the arm in 2004, but quickly returned to the downward trend. The peak year was 1990, and the low point 2002, with 74% vs. 62% attending frequently. The general pattern is lower attendance in falling-crime times, with the mid-2000s blip being a response to 9/11, part of that general quasi-wave of patriotism, which has petered out since at least 2006 or '07. It's only a matter of years before attendance returns to the nadir of 2002, now that the effects of 9/11 have almost entirely worn off.
So clearly there's a good-sized group of strong believers who have steadily dropped out of the community of their religion. I don't buy the view that this group is just perfectionists, that from some personality defect they just couldn't stand to be around others whose views are not exactly identical to their own. Nobody's that perfectionistic.
Rather, it is yet another facet of the cocooning trend. You might think that church attendance, etc., is down overall because the population is becoming more secular, less attached to religion. However, I'm talking about those who continue to strongly identify as whatever label they wear. Even those who are the opposite of secular have driven themselves indoors all day long, day after day, and if that means the gradual disintegration of their religious community, well so be it.
The ritual domain of life has gone up in a puff of smoke, with costly group-bonding activities being replaced by beliefs that are cheaply held in private. A lot of the changes I've covered over the past two and a half years seem to fall under "the decline of ritual," so that might deserve a cataloging and clarifying post of its own.
GSS variables used: attend, year, reliten, race
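For anyone who wants to check this against the GSS themselves, here's a minimal sketch of the tabulation in Python. The filename is hypothetical, and the numeric codes reflect my reading of the GSS codebook (race 1 = white, reliten 1 = "strong", attend 6-8 = nearly every week, every week, or more than once a week), so verify them before leaning on the output.

import pandas as pd

# Hypothetical extract of the GSS saved with lowercase column names.
df = pd.read_csv("gss_extract.csv")

# White respondents who identify as "strong" adherents
# (assuming the usual GSS codes: race == 1, reliten == 1).
strong_whites = df[(df["race"] == 1) & (df["reliten"] == 1)].dropna(subset=["attend"])

# Frequent attendance = nearly every week or more (attend codes 6-8, by assumption).
frequent = strong_whites["attend"] >= 6

# Percent attending frequently, by survey year.
trend = frequent.groupby(strong_whites["year"]).mean().mul(100).round(1)
print(trend)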
July 8, 2012
Before fast food environments became so off-putting
During falling-crime times, people become used to greater stability, so they are less excitable, whereas the heightened vigilance that people have in rising-crime times makes them more sensitive to incoming stimuli. Also, as they cocoon during falling-crime times, people don't care so much about enjoyable experiences outside of their house.
To adjust to consumer demand, public spaces (even privately owned ones) then become governed by efficiency experts who find ways to squeeze every penny they can out of what they've got, and to eliminate all non-essential items -- anything whose only job is to make a visitor feel pleasant. After all, if the typical patron only wants to park next to the entrance, get in, get right back out, and drive off, why bother making the place appealing to the tiny few who feel like hanging out? Even more so if there's a drive-through or drive-in option, a topic I covered earlier here for the mid-century.
Now that we've entered a neo-'50s phase of the zeitgeist cycle, it's no surprise to see the return of corporate efficiency experts taking the hatchet to everything that used to make their companies enjoyable, since hardly anyone in today's cocooning society will miss it.
Here I'll take a quick look at the case of fast food restaurants, and later I'll write up several on supermarkets. To see what changes have been made, we first have to see what it was like in the not-too-distant past.
Burger King in the 1980s. You gotta love what you can find on the internet sometimes, although the library stacks have even better stuff if you just browse. Anyway, those pictures are from 1984 and they would've still looked contemporary through at least the early '90s (maybe a little less wood by then). Gradually since then, we've arrived at where things are now, which look very different.
Compared to today's fast food places, here are some things that jump out about their '80s counterparts:
1) Landscaping with lots of plants, and from a decent variety of species -- rather than using a big ugly pile of rocks. Ditto for livening up the interior. Plants used to be in abundance in all public spaces, especially the mall, but now all you see is pavement, smaller rocks, and boulders. A simple comparison like this gives the lie to silly petrified phrases like "the materialistic 1980s" vs. "the environmentalist 1990s" or "the organic, eco-friendly 2000s". You were surrounded by more biodiversity in a mini-mall than in Whole Foods.
The management's rationale is that dead decor means lower maintenance costs than something that needs light, soil, and water to thrive. Not to mention grounds-keeping to keep it attractive. Still, they can't get away with that unless customers don't care if the look is barforama. Management is always greedy and stingy, but before when people wanted to settle into the space at Burger King, they demanded a more soothing environment. (No Norah Jones or Fall Out Boy played over the radio either.)
2) Windows that you can see through, inviting passersby into the store, and allowing customers to take in views of the people and places nearby. At most there might have been a small sign displaying the hours of operation, or one of those "No shirt, No shoes, No service" warning signs, back when guys used to take their shirts off in warm weather (like Spicoli and his buds in Fast Times).
With the collapse of eating food in the store's dining room, management sees clean windows as wasteful -- gotta put them to profit-boosting use! So, they plaster a poster over nearly the entire window, hawking this or that deal du jour. Up they go on any window that drivers could see while cruising by.
It's true that you can still find a few places whose windows are left unobscured, but that requires that the customers hang out there. The yuppier fast food places like Starbucks or Noodles and Company tend to have open windows for that reason, but not too long ago, every place used to.
3) Minimal or no in-store advertising, especially on the walls. I don't care if they put a picture of their mascot up there or something, but not like today where sprawling posters "build the brand image," i.e. tell you what kind of person eats there. Who cares? I don't eat in Burger King's dining room to identify with some bunch of boring brochure-looking dorks. An earlier group of posters, when their marketing was aggressively targeting 18-34 year-old GUYS, was a little funny but still too much pointless advertising taking up way too much wall space.
Also notice no ads laminated to the table tops, nor even cardboard cut-outs standing on the tables. Looking back, it's amazing how little public space was blemished and rotted by ads. In our neo-'50s world today, we're having a similar anxiety about the encroachment of ads that they did during the original heyday of advertising. It's not enough for the windows of public transportation to be covered with them -- pretty soon the roads and sidewalks themselves will be drenched with them too.
4) Eye-catching uniforms. The hostesses at the BK Lounge used to wear red visors, vests, and pants, and a white shirt with a blue and red check pattern. Today they wear solid gray from head to toe. Management's thinking is, Why waste money on so many separate items and with so many different colors that might require more laundry maintenance to keep them looking vivid? Just toss the workers in a gray sack, and problem solved. It's not like customers are coming into the store anymore, so why would they care if she looks pretty or drab?
5) A pattern on the flooring. Yeah, it's an incredibly simple motif, but it sure gets your attention more than a homogeneous sea of beige. Patterns in the tiling, no matter how basic, used to be commonplace, and now they're either gone or so subtle that you don't notice them. I have a hard time believing that a checkerboard pattern would require more upkeep than a monotonous floor. Instead, I think management just doesn't want to waste the one-time cost of hiring a designer to choose a pattern that would harmonize with the rest of the in-store elements. Whereas everything gets along well with nothing-at-all.
6) Booths with cushioning for your back. Starbucks may have more comfortable furniture than today's Burger Kings, Wendy's, and McDonald's, but those same burger joints used to have pretty relaxing booths in the good old days. It was rarer, but not impossible, to find a mainstream fast food place that also had cushioning for your legs and butt. I think those were more the cafeteria / diner type places, but still not too hard to find. Everyone thinks that Starbucks invented comfy seating, as if mankind had been perched on jagged rocks from the '80s back through forever. Get a clue.
The cost-cutting opportunity here is obvious -- upholstery is more expensive up-front, as well as over the course of being sat on. Pretty soon there are scuffs, little tears, big rips, and what have you, and that looks bad if you let it go. Why not prevent those problems by just making every furniture surface a hard one? Hell, it's not like you'll hear any customer complaints since so damn few of them ever feel like sitting down in the dining room anyway.
You get the idea. The point is not that Burger King, for example, used to be architectural paradise. But those kinds of restaurants were enjoyable and pleasant places, not just aesthetically but functionally. Now people prefer eating alone, even if it's a fast food meal brought home for a family -- they're not going to eat Burger King together at the table as a family meal, but off in their own worlds. That aversion to spending any time on the store premises, outside of their car, has given management the green light to trim off the fat -- y'know, the nutritious and delicious part of the animal that some group of experts claim is bad for you.
One of the larger points worth taking away from this is how bogus the claims of the past 10 years have been about our wonderful new world of design in public spaces, having emerged from a sensory dark age. Get fucking real, man. Plants, patterns, and panoramas -- even the neighborhood Burger King was more of a delight for the senses than some sterile, lifeless void like the Apple Store.
P.S. Here is a similar, more lighthearted article for Retro Junk about the decline in quality of the Pizza Hut experience. That too is a casualty of the cocooning trend, where everyone orders pizza for delivery rather than eating in the restaurant. Some readers may not remember, but in the '80s only Domino's delivered. I think Pizza Hut's delivery service only took hold in the early-to-mid-'90s, after an earlier failed first attempt.
Categories:
Architecture,
Cocooning
July 6, 2012
Crime and chiaroscuro: Introduction
This will begin an ongoing series looking at how prominent a light-dark contrast there is in the visual culture of rising vs. falling-crime periods.
Homicide rates in Europe began a centuries-long decline sometime between 1450 and 1550, one that has lasted up through the present. Still, there have been three major, and one minor, reversals of this downward trend -- from ca. 1580 to 1630 (the Early Modern wave), from ca. 1780 to 1830 (the Romantic-Gothic wave), from ca. 1960 to 1990 (the New Wave wave), and a less geographically widespread one from ca. 1900 to 1930 (the Jazz Age wave).
To put names on the falling-crime periods: from ca. 1450 to 1580, the Renaissance Humanist wave; from ca. 1630 to 1780, the Reason-Enlightenment wave; from ca. 1830 to 1900, the Victorian wave; from ca. 1930 to 1960, the Mid-century wave; and from ca. 1990 to present, the Millennial wave.
Having pored over the visual culture from these various periods, two major links jump out at me (there are probably more). First, rising-crime visuals have greater contrast between light and shadow, called chiaroscuro, whereas falling-crime visuals have a less stark contrast in lighting. In a separate series, I'll look at the second, which is the more restricted, frieze-like depth perspective of rising-crime visuals, compared to the deeper, photorealistic perspective of falling-crime visuals.
What underlies both chiaroscuro and restricted depth is a more theatrical drive, heightening drama and tension like you would see actions performed for you on a stage. Not being so fully realistic, they remind you that it's a stylized work of art, that these choices have been made for dramatic effect. They possess an immediacy that is lacking in works with a subtler lighting scheme and more rational depth perspective -- something that the creators and fans of these latter works would hardly consider a bad thing, since their goal is to appeal more to our reason than to strike an emotional chord.
Why does rising-crime art opt for features that pack more of a punch? It's not just in visual art, but literary and musical art too. A steadily rising rate of violence signals a world that's growing increasingly out-of-order, like the rules that governed the old ecology are shifting or no longer apply. In such a topsy-turvy world, new solutions must be tested out before it's too late. These are not top-down technocratic solutions, but an interaction among everyone -- musician, neighbor, preacher, painter, or parent -- to try to figure out what works and what does not.
Communicating in this way much more directly to a broader swath of your fellow group members, not some distant set of mediators, and under the pressure of what seems like a closer and closer deadline for humankind, your message acquires a greater sense of urgency. There's no time to dick around, on-the-other-hand-ing right up until the apocalypse.
Chiaroscuro also touches on another important theme of rising-crime times -- that the barrier between two very different dimensions, one good and one evil, is becoming unzippered, with creatures from the other world entering our own, or perhaps with us wandering into theirs. One of the simplest, most vivid, and most widespread ways to symbolize this is creating myths about a light world and a dark world. So, strong use of chiaroscuro heightens the sense that two formerly separate and opposite worlds of good and evil have come crashing into each other.
It does not even have to be so literal, where the good symbol is bathed in light and the evil symbol cast in shadow. No matter who the intense light is thrown on, nor what remains cloaked in darkness, the stark contrast itself evokes the collision of the two dimensions.
Why then does falling-crime art utilize a subtler gradation of light? Again this greater naturalism and emotional restraint doesn't show up only in visual art, but in literary and musical art too. Well, everything that had been going so wrong in the earlier rising-crime period seems to only be getting better and better. Now that the problem is wrapping itself up somehow or other, we don't need to band together and address each other so directly as we did during the trial-and-error phase before.
Indeed, whatever communication we still need can be done more impersonally, perhaps even through mediators like an expert elite. And since it looks like the apocalypse did not in fact arrive, we seem to have all the time in the world to work on our problems -- calmly. Hence, cool-headedness, rationality, and detachment are now in order. Let's hibernate for a while, outsourcing the running of our affairs to a technocratic elite or a team of gizmos.
Moreover, in so emphasizing the naturalistic gradation of light, the sfumato technique of lighting appears to deny or at least diminish the importance of the other-worldly supernatural realm. Like, maybe there is some place like that somewhere -- but let's not worry about it here and now. Right now let's focus instead on elevating the human, the mundane, and even the everyday, and light the scene accordingly with hazy or smoky changes in tone. Throwing the action into a strong light-dark contrast will only start us off on the path toward black-and-white superstition and magical thinking.
With this background out of the way, I'll illustrate the points by walking through each cycle of falling and rising-crime periods. Next up will be the Renaissance and the Early Baroque periods of painting, corresponding to the Renaissance-Humanist and Early Modern waves in the violence cycle referred to at the beginning.
Categories:
Art
July 3, 2012
Social and anti-social status contests
When you think about it, there cannot be anti-social status contests -- the whole point of one is that there's some kind of public arena where performers are being judged by spectators. It requires a minimum level of commitment to public life on the part of performers and spectators.
So during cocooning times, we should re-interpret what look like status-seeking contests as something else. People aren't competing out in the open before judges; they're carving out their own micro-niche where only they and very few others are the colonists. Once too many people arrive, they scurry off to some other micro-niche. It's running away from the would-be competition, refusing to be judged along with them on some dimension. Maybe we should call it niche-hopping or something; the point is, it's not a contest, let alone one about status.
An example of a status-seeking contest is wearing clothes that are more and more expensive, where higher status is accorded to the more expensive clothes. A status contest always leads to more and more exaggerated traits, just like over evolutionary time the peacock's tail gets longer and longer. Pretty soon, those involved in the contest are dropping big bucks even for minor pieces of their look.
We expect to see these status contests mostly during an out-and-about period, which more or less maps onto a rising-crime period. I think this is what people are referring to when they label the Roaring Twenties and the Go-Go Eighties as materialistic. They couldn't have been less concerned with the material world, focused as they were on the spiritual, supernatural, cultish, and apocalyptic. However, there was incredibly stiff competition over who could look the most elegant or stylish, who could spend more money on their new car.
As the performers in these contests showed more and more exaggerated behavior, that kind of fever pitch got remembered as a slavish devotion to material things as a source of meaning in life. However, buying those ever more expensive things was not a way to find meaning in one's private life, but to out-do the competition in a public contest. (Finding meaning in life from consumer products belonged instead to the mid-century and the past 20 years, both falling-crime and cocooning periods.)
Athletics are another obvious case. You have to meet the competition in public and be judged. There was a cult of the athlete from roughly 1900 through the early '30s, and then again from the '60s through the '80s. During the mid-century and the past 20 years, being a jock has not been cool -- let alone being an ambitious one. Just as the contestants for who could wear the most glamorous clothing became slandered as materialistic, the athlete who lived to beat the competition became remembered as a bloodthirsty barbarian type.
And obviously pursuing athletic competition has nothing to do with materialism. Along with dressing stylishly, athletics was just another case of people being more eager to compete in public. Jazz music from the Jazz Age (before it went underground and became unlikeable), and guitar solos from the Rock 'n' Roll Age, are further examples still. It wasn't materialism but a drive to compete in public for status.
What, then, are the niche-hopping cases like? Those are all pretty fresh in our memory, so I won't go into any depth. But it's the whole bullshit about "I like a band that only a dozen people know about," or "I had to go backpacking in Tajikistan because Uzbekistan is just getting too damn crowded with touristy frat bro's in Ed Hardy shirts," or anything else from the realm of SWPL-dom. It's not about who is the biggest, most dedicated fan of the band, which is a possible contest. The moment enough people know about the band for such a contest to be feasible, they leave that micro-niche and carve out another. I think that's why the so-called world travelers always have such a superficial knowledge of where they've been -- it's not about proving you're the most knowledgeable or experienced or devoted to some foreign place or people (all possible contests). Instead it's about hiding in a place that few others are going to.
I'm no longer convinced that these are really contests, e.g. over who knows the most obscure bands, who's been to the most obscure location, etc. These people don't like talking much to each other -- they just want to enjoy their little micro-niche and be left alone, not regularly congregate in an arena and duke it out over who is the most esoteric. Sure, when they are forced into a social interaction, they do seize the moment to preen about their obscure tastes as though it were a contest. But that's a pretty rare chance they get, unlike the stylish girl who goes out to parties every night, or the jock who competes every day after school, or the guitar player who has a gig at least once a week.
Moreover, niche-hopping does not result in more and more exaggerated traits like the peacock's tail, the 4-hour / $1000 look, or the virtuosic guitar solo. If their esoteric tastes kept spiraling out of control, why are they still so tightly within the confines of contemporary Western culture, or foreign places so highly connected to it? Why aren't they studying obscure dead languages from halfway across the world? Or getting into 13th-century Mongolian throat singing? -- y'know, not that modern crap that you posers already know about. Why not appropriate the myriad food cultures of sub-Saharan Africa?
I don't see any steady move toward greater and greater obscurity. Rather, they want to stay within more or less known-about traditions, but just carve out a micro-niche within them, where they can cocoon away from the other people who are into the broad tradition -- let them carve out their own micro-niche. I'll listen to shitty indie band A, and you listen to shitty indie band B, neither encroaching on the other's territory. If it were about obscurity, I'd jump ship altogether and listen to Medieval Chinese opera or something. This way we can both be part of a group that listens to indie music, but that doesn't have a strong sense of community identity, every fan being cocooned away around their own band's scene.
I don't have too many concrete examples from the mid-century, but it was more this way back then too. I do know that that's how they viewed domestic architecture and interior design. They were almost paralyzed with a neurosis about being one of the regimented mob. Individuality, meaning distinctiveness, was the most important thing to them in how their house looked. And it wasn't just like, let's have our own look to make a memorable impression on our guests. It came more from an antipathy toward the crowd that they belonged to, but didn't want to be seen as belonging to.
For example, the first glance at two neighboring ranch modern houses might look almost identical, but one would be the same design with the "front" spun around to where the "back" was on the other, or there would be minuscule differences in the size and placement of windows on the front. Nothing truly distinctive, kind of like how people have different skins for their identical Macbooks and iPhones.
Obviously the state of their technology didn't allow them to be as niche-hopping mad as we are today, but certainly compared to the previous Jazz Age and the Rock and New Wave Age afterward, they were more in the "I know something you don't know" direction. And again, not in a competitive way, but more to feel like they had their own little private sanctuary from the monotony of the mob's tastes, just the way that the yoga-and-yoghurt people must feel today.
Anyway, you get the idea. I realize that it's fun to make the SWPL types out to be hypocrites who engage in status-seeking contests of their own. And of course sometimes they actually do, like when they compare vocabulary size, GRE scores, level of difficulty for the colleges they got into, and so on. But that's still rare. Most of their seeming contests are really forms of cocooning through niche-hopping, not striding into an arena of competition and proving to the spectators and judges how high they can score on some characteristic. They aren't even contests over obscurity.
Status contests fit in with a more socially engaged culture, so we don't see them when everyone is withdrawing into their own little worlds. When people mill around more with each other, it's not hard to tell what the contests are, given how evolution has constrained what we accord status to. They're all variations on what Fitzgerald called the two stories -- "the charms of women and the bravery of men."
Categories:
Cocooning
July 1, 2012
Jews more than twice as likely to be gay, equally likely to be lesbian
Are homosexuality and bisexuality another example of deviance that Jews are more likely to indulge in? They seem to be over-represented among all sorts of out-there groups.
The General Social Survey asks whether your sex partners have been male, female, or both. I restricted respondents to whites who had at least 1 year of college education (since queerness goes up with education), and who lived in an urban environment, where queers tend to concentrate, not a suburban or rural one. That controls for the major demographic differences between homos and normals.
Respondents were then broken down by their professed religion -- Catholic, Protestant, Jewish, and None (not necessarily atheists). The pattern of differences across groups was the same for bisexuality and homosexuality, so I collapsed them into a single non-heterosexual category.
Percent non-heterosexual (Males, Females) ...
Based on partners from past year:
Cath: 4, 3
Prot: 4, 3
Jew: 11, 3
None: 8, 9
Based on partners from past 5 years:
Cath: 4, 4
Prot: 4, 4
Jew: 9, 4
None: 8, 10
The rates for Christians are just what you see in the overall population -- between 3 and 5 percent. Among Jews, however, fully 10% are gay. I vaguely suspected it would be higher than the Christian rate, but that's still amazing. The None-religionists are twice as likely as Christians to be gay, though still a bit below the Jews.
The incredibly high rate of Jewish queerness cannot be explained, therefore, by pointing to their more secular or liberal tendencies. Even if all Jews were secular-liberal and all of the None group were non-Jewish in ethnicity, there'd still be a gap. The Christian groups are about four times the size of the None group, so even pretending all of the Nones were Gentiles and taking a weighted average across the three Gentile groups, there'd still be a large gap between them and the Jews.
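To make that weighting concrete, here's the back-of-the-envelope version using the past-year male rates from the table above, and assuming the rough 4-to-1 ratio of Christians to Nones cited just now (the exact split between Catholics and Protestants barely matters):

# Past-year male rates (percent non-heterosexual) and rough relative group sizes.
rates = {"Cath": 4, "Prot": 4, "None": 8}
weights = {"Cath": 2, "Prot": 2, "None": 1}  # Catholics + Protestants ~4x the Nones

gentile = sum(rates[g] * weights[g] for g in rates) / sum(weights.values())
print(round(gentile, 1))  # about 4.8 percent, versus 11 percent for Jews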
Strangely, though, Jews are just like the two Christian groups for rates of lesbianism and female bisexuality. For females, it's having no professed religion that makes them 2 to 3 times more likely to be with women. This would seem to argue against the higher rate of gays among Jews being due to some cultural laxity -- that should allow Jewish women to act more deviant as well.
If being queer is caused by some Gay Germ (in Greg Cochran's phrase), it's hard to believe that the Jewish vs. Gentile chasm in gay rates is due to different genetic susceptibilities. I checked for race differences in queerness, limiting it to urban-dwellers with at least a year of college, and there were none in comparing whites, blacks, and "other". If major racial groups are equally susceptible to the infection, it's hard to imagine that Jews would be so much more susceptible than other Caucasian groups.
Perhaps some combination of the different rituals performed on Jewish boys early in life (not just circumcision) makes the difference in susceptibility to the infection. That would be analogous to the Fore tribe of Papua New Guinea, who contracted a neurological disease, kuru, by eating the infected brains of the deceased during funeral rituals. Different cultural practices can make some groups more at-risk for infectious diseases.
This would also explain why Jewish females are no more likely to be bi or lesbian -- circumcision, etc., do not affect them. Or maybe having a castrating Jewish mother is more likely to push a son over the edge, while having little effect on a daughter.
Well, who knows what the source of the difference is, but there it is -- 1 out of 10 Jewish urban-dwellers with at least some college education are gay. Among their Christian counterparts, it's 1 in 25. One obvious implication is that Jews, even aside from their secular liberal tendencies, will tend to favor gay agenda policies, since they're more likely to know some from their own group. "My son, the cocksucker, deserves better health care than what you goy bigots are providing."
GSS variables used: sexsex, sexsex5, relig, race, educ, srcbelt, sex
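As above, here's a rough Python sketch of how the cross-tab could be reproduced from a GSS extract. The filename is hypothetical, and the numeric codes are my reading of the GSS codebook rather than anything official, so double-check them: race 1 = white, educ 13+ = at least a year of college, srcbelt 1-2 = central city, relig 1-4 = Protestant, Catholic, Jewish, None, sex 1 = male and 2 = female, sexsex 1 = only male partners, 2 = both, 3 = only female partners. The five-year version would use sexsex5 the same way.

import pandas as pd

# Hypothetical GSS extract with lowercase column names.
df = pd.read_csv("gss_extract.csv")

# Whites with at least a year of college, living in a central city
# (all codes assumed from the GSS codebook -- verify before relying on them).
sub = df[(df["race"] == 1) & (df["educ"] >= 13) & (df["srcbelt"].isin([1, 2]))]
sub = sub.dropna(subset=["sexsex", "relig", "sex"])

# Non-heterosexual = any same-sex partner in the past year.
same_sex = ((sub["sex"] == 1) & sub["sexsex"].isin([1, 2])) | \
           ((sub["sex"] == 2) & sub["sexsex"].isin([2, 3]))

labels = {2: "Cath", 1: "Prot", 3: "Jew", 4: "None"}
for relig_code, name in labels.items():
    for sex_code, sex_name in [(1, "Males"), (2, "Females")]:
        cell = same_sex[(sub["relig"] == relig_code) & (sub["sex"] == sex_code)]
        if len(cell) > 0:
            print(name, sex_name, round(100 * cell.mean(), 1))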