Since the decline of wild times around 1991, the influence on culture from 15-24 year-olds has all but disappeared; now everything is for older adults and small children. Malls are no longer overrun with teenagers because they're holed up in their room. The mall instead caters to the needs of 25+ status-seekers -- Crate and Barrel, spas and salons, etc. -- and any tots they may have (Build-A-Bear). The top 10 box office draws have for a long time now focused more and more on children (Harry Potter) and older adults, with teen-oriented movies dropping out of view.
As a result, the culture has never been more asexual: everything is about the stages in life when you're too young to care about the birds and the bees, or when you've already gotten that messy business of finding a mate and making babies out of the way and it's now time to raise them right.
This is not because there is a wealth of great adolescent culture out there being suppressed by demographically more powerful grown-ups and their ankle-biters. Anyone with sense really would rather see a Harry Potter movie than Superbad, and would rather stroll through Barnes and Noble than putz around Hot Topic. Young people's cultural comparative advantage is in anything requiring wildness, so that when they become domesticated, they have nothing special to offer and vanish into the background.
As though pop music didn't sound juvenile enough already -- as opposed to the coming-of-age and reckless youth themes popular from the '60s through the '80s -- there's now a movement within indie to make music for small kids. (Even indie for adults sounds pretty kiddie to me.) Listen to the song samples at the NYT link; they sound pretty hokey. The acoustic folksy sound is trying too hard, and the lyrics about cotton candy at a baseball game and Mama taking off her ring because she's sad are too self-conscious. At least there's an attempt not to patronize the children the way Barney or Dora the Explorer do, but it still doesn't work. The zeitgeist just won't let music makers get into the right mindset. About the only pop music song for small kids that I liked as a kid was "NeverEnding Story" by Limahl.
In fact I'm glad there wasn't dedicated little-kid music when I was growing up because it made you yearn even more to join the cool older-kid age group -- you could just tell they were having a lot of fun with their music, whether it was your high school babysitter playing "Open Your Heart" on your tape player or your friend's older brother blasting Bon Jovi out of his van windows. Sure, as a pre-pubescent kid you don't really relate to the lyrics that much, but you can't help but feel the beat and get sucked into the melody. Plus once you get into third or fourth grade, you start paying attention to girls but are nervous about not appearing guyish enough in front of your friends. You're looking for some role model to say it's OK, just go with it and don't worry about their cooties. I mean hey, if Michael Hutchence likes girls, maybe they're not so bad after all...
Who are small children going to hear "I need you tonight" from these days? It breaks your heart thinking about how deprived their childhoods are going to be.
April 30, 2010
April 28, 2010
Why don't comedies age well?
Unlike most other genres, whatever the medium, comedies thought to be good from the past tend to fall flat today. Aristotle thought that one function of humor and comedy is to look down upon someone -- not just as in making someone the butt of a joke, but even looking down on yourself when you realize what a ridiculous situation you're in. Others, building on that, point to the status-seeking function of comedy. I don't see that so much, because it assumes an individual satirist taking on his enemies. In reality, it looks more like a way for members of a single tribe to heighten the in-group vs. out-group distinction -- white people dance like this, and black people dance like this.
After a while, the satirizing group may no longer be influential or even exist, and the same goes for their opponent tribes. A contemporary audience cannot sympathize with jokes about a group they don't even know about; the group would need to be updated. And most of the comedy tends not to be a blatant attack on the group but a series of more subtle barbs at the group's mannerisms, slang, clothing style, minor foibles, and so on. No one in 100 years is going to know that such things were being spoofed through Ron Burgundy in Anchorman or the goth kid in South Park. Hell, look at how quickly Napoleon Dynamite vanished from public awareness. Superbad is just about there too.
Certainly every bit of topical humor added shortens the shelf life of a comedy. It's only one generation later, and yet how many would laugh at a current events or pop culture-related gag from 1985 -- even among those who would still recognize its origin and logic?
On this basis, what comedy previously thought to be funny do I think will be most doomed to obscurity after its initial run? -- Family Guy. Most of the jokes are references to contemporary (or past!) pop culture and current events. What remains is mostly an attack on the tribes that rival the creators' tribe for social and cultural influence. And even the jokes that don't seem to be aimed at anyone in particular still serve only as tribal membership badges for the creators and viewers -- we belong to the tribe that, for whatever reason, makes these kinds of pointless remarks. We're just weird like that.
I would add all of those dopey series on Cartoon Network like Aqua Teen Hunger Force and Robot Chicken, but I don't think most people found them funny to begin with. Their existence has only been due to their use as a membership badge for a sub-culture of adolescent male dorks. The recent "frat pack" movies starring Will Ferrell, Ben Stiller, bla bla bla will do terribly too, but at least they had to appeal to a broader audience than Family Guy did, so they aren't so pointless.
And which will hold up the best? Obviously those that focus on more timeless and universal themes, not dated provincial turf wars. Those themes are usually the forte of action or drama specialists, though. So drama-comedies like The Simpsons and action-comedies like Ghostbusters will endure the longest. Unfortunately the increase in identity politics and tribalism within American society means that most of the comedies from the past two decades are disposable, with a few exceptions. Still, the action-comedy reached its peak just before then, and it's currently OK to re-visit and enjoy stuff from the '80s. So you won't feel like such a weirdo for going outside the contemporary mainstream.
"Don't worry, I'm legal..."
That's a new deflecting response I'd like to try out when some nubile honey bunny asks me how old I am, but I never get the chance anymore. Soon after the recession sank in, it became gauche to dress up, and I started wearing a t-shirt and jeans to dance clubs instead of a jacket and tie. It must've been that type of clothing that made younger people curious about my age because literally no one has asked me since I switched. Just last week someone asked, only half-certain, "...you're over 21...right?" when pointing out a bar area that they thought I'd like.
Even though I haven't gotten the chance to try it out, it still sounds like it would work for anyone. If you're youthful looking, it emphasizes this good quality of yours and has a nice mischievous ring to it. If you're noticeably older and mature-looking, it shows how big your balls are that you're accusing her of wanting you and only being worried about whether you were old enough. And the self-deprecating humor in it shows how secure you are about your increasing age. In either case, a deadpan delivery with solid eye-contact (and maybe a barely visible smirk) is the only way to go. If you say it with the slightest hint of interest or eagerness, it'll sound totally creepy -- definitely do not smile.
Clearly this line should only be used when you sense an interest, not if she asks in disgust like uggh i mean how old ARE you anyway?
April 27, 2010
Even in cradle of agriculture, high-carb diets drive up obesity and diabetes
People who know about human biodiversity often argue that low-carb diets are too broad-brush of an approach because different races adopted agriculture at different times, so while the late adopters might get really dinged by the novelty of grains and sugars, surely those who adopted it first have had enough time to evolve a decent (if not total) genetic resistance to poor diets. They point out that recent work shows how fast natural selection can work on humans -- just look at how quickly Europeans got lighter in their skin, hair, and eye color, or how quickly the dairying populations evolved lactase persistence. That's all within the past 10,000 years!
Unfortunately that logic only applies to easy changes where only a single mutation gets the job done. For skin, hair, and eye color, there are under 10 genes that make most of the difference, and really just a handful of those do most of the work. Lactase persistence is due to a single mutation. Why are these changes so easy to make? Because the basic machinery was already there, the result of millions of years of painstaking, gradual tinkering by natural selection. All these new traits represent are the dialing up or down of some knob -- break a gene that colors your skin, and you're pale; break a gene that shuts off lactase production, and you have lifetime lactase activity.
What they do not do is invent things from scratch or fundamentally re-draw the blueprint of any of the body's systems. That takes too long for 10,000 years to yield anything useful. Henry Harpending and Greg Cochran emphasize this point in their excellent and very readable book on the topic of recent human evolution, The 10,000 Year Explosion (check the Amazon box above). A mutation or two might break the process by which neuronal growth slows down, and that might make the person smarter. But inventing neurons in an organism that had no neurons is not going to happen in a short time and not by just a handful of mutations. The body's systems are too complex for random mutation to hit on them in a single shot.
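To put rough numbers on that contrast, here is a minimal back-of-the-envelope sketch in Python, using the textbook deterministic result for a one-locus sweep (roughly (2/s) x ln(2N) generations from a single copy to near-fixation). The selection coefficients, population size, and 25-year generation time below are illustrative assumptions on my part, not estimates pulled from the lactase or pigmentation studies.

```python
# Back-of-the-envelope sweep times for a single beneficial mutation.
# Deterministic one-locus (genic selection) model; s, N, and the 25-year
# generation time are illustrative assumptions, not published estimates.
import math

def sweep_generations(s, N):
    """Generations for one new copy of an allele to reach near-fixation
    under logistic growth dp/dt = s * p * (1 - p)."""
    p0 = 1.0 / (2 * N)            # frequency of a single copy in a diploid population
    p1 = 1.0 - p0                 # "near fixation"
    return (1.0 / s) * math.log((p1 * (1.0 - p0)) / (p0 * (1.0 - p1)))

for s in (0.01, 0.05, 0.10):      # weak to strong selection coefficients
    gens = sweep_generations(s, N=10_000)
    print(f"s = {s:.2f}: ~{gens:,.0f} generations (~{gens * 25:,.0f} years at 25 yrs/generation)")
```

Even with only moderately strong selection, a single new mutation can finish its sweep within the agricultural era -- which is why the knob-turning changes are the ones we actually see -- while a trait that needs many new mutations arising and spreading one after another multiplies that waiting time accordingly.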
In particular, the human digestive system is not going to change much at all in 10,000 years. Agricultural populations have relied far more on plant foods than hunter-gatherers ever did, and yet no human makes cellulase, the enzyme that lets herbivores digest plants. Wouldn't it be useful for us to have it too, rather than pass those plant foods through as undigested fiber? Sure it would -- it just takes a long time to invent it if you don't have it at all. More relevant to agriculture, eating grains like a bird or a rodent won't give you their digestive system unless this goes on for millions of years.
Just to check that this is true, let's have a look at the prevalence of obesity worldwide and focus on the Middle East where agriculture began. In the US, where most of the population came from later adopters, we have roughly 1/3 overweight and another 1/3 obese. To compare, check whoever you want, but the picture is clear -- the earliest adopters of agriculture are fat as hell right now. Here is an NYT article on how obese Qataris have become, which has the usual Western baloney about obesity being caused by too much food and not enough exercise. (Another case of harmful foreign aid -- shaming people in poor countries into becoming vegan joggers.)
But we know from fat marathoners and bike riders that even enormous amounts of exercise won't melt fat, and we know from hunter-gatherers like the Ache -- who eat between 3000 and 4000 calories a day and are still lean and muscular -- that quantity of food has little to do with it. What are Qataris eating so much of that makes them fat? The article says fast food, as well as home meals of rice, clarified butter, and lamb. Well, butter and meat are just about all the Maasai live on (plus cow's blood) and they're lean and muscular, even though they don't live life on a stairmaster. It must be the heapings of rice.
And what exactly does "fast food" include? Judging by the picture of boys chowing down on McDonalds, we see that there are almost no animal products at all -- I think I can make out a 2-oz slice of beef and perhaps a slice of cheese (which I doubt came from an animal anyway -- probably another soy-based abomination). Instead, they're sucking down five pounds of straight up sugar (Pepsi, ketchup, and barbecue and honey sauces), starch (a mountain of French fries), grains (white bread buns), and fibrous carbs (tomatoes, onions, and other toppings). All of that glucose is going to send their insulin sky-high, and insulin is the hormone that signals fat to stay locked away in fat cells.
You might not see this pattern where people don't get much to eat at all -- there would be little fat to lock away in the first place. High-carb diets don't conjure fat out of nothing, but if you are getting enough to eat, a high-carb diet will keep the fat you're eating locked in fat cells rather than dumped into your bloodstream to be burned for fuel. Places like India where people get little to eat aren't teeming with obese people, but they sure are hit by diabetes. Diabetes is another member of the full range of symptoms of Metabolic Syndrome (along with hypertension, etc.), and you don't need to be eating a lot to get it. Here is an article showing that over 4% of Indians have diabetes and that this will only get worse over time. And those Middle Eastern countries with high obesity levels? They have some of the highest diabetes rates in the world, around 15-20%.
No matter where your ancestors came from, their digestive system and metabolism were not designed at all for relying on nuts, grains, green plants, fruits, tubers, etc. If adapting to agricultural diets were only a matter of dialing up the level of some enzyme that we already had, they could have adapted quickly. But because we have to invent an agricultural digestive system from one designed for consumption of animal products, it would take a million or so years. The other prime examples of rapid selection involve resistance to infectious diseases, but again those solutions just jigger with something that's already been invented, like taking a red blood cell and bending it so that it's somewhat sickle-shaped. You're not going to evolve red blood cells where there were none before in only 10,000 years.
There is individual variation relating to diet, since obesity is heritable -- some of the differences between people in body fat composition are related to genetic differences between them, even if they all eat the high-carb diets typical of the modern world. So some people get harmed more than others by bad food. Still, the real difference between individuals or groups when we look at the many facets of Metabolic Syndrome is the split between hunter-gatherers -- who have none of those many diseases -- and grain-munching farmers who are plagued by diabetes, obesity, and heart disease (not to mention depression and gout). You may have to fine-tune a low-carb diet because of your unique set of genes, but to a first approximation the reason you're fat, diabetic, or not thriving is because you're eating an agricultural diet.
Let the nutritionist classes tremble at a Paleolithic revolution. The dieters have nothing to lose but their grains. They have a full stomach to win.
Meat-eating Men of All Countries, Unite!
April 25, 2010
If Hollywood pushes natural look, will we get better-looking women?
Here's an NYT article on what may be a push by casting directors and filmmakers toward a less synthetic look among women, whether stars or extras. I assume this trend isn't one of those fake ones that the NYT feature writers cook up just to draw an audience and generate buzz. Those pretend to talk about national trends among common people, and they just don't have the resources to conduct all the necessary research. Even their more qualitative approaches consist mostly of interviewing their elite social circle in the tri-state area. But those who drive Hollywood are few and geographically concentrated, and usually willing to talk about where things are going.
On the one hand, a greater emphasis on natural looks will select for those with greater natural beauty and hence better genes -- and no amount of plastic surgery can match that. Right now you can't really alter the geometry of your skull, and that's just as important as the softer features that can be tinkered with.
On the other hand, good genes are in incredibly short supply, so unless Hollywood wants only one or two actresses to play all female roles, from lead down to extras, they'll have to draw from the other group with naturally good looks -- young girls. In Fast Times at Ridgemont High, Phoebe Cates and Jennifer Jason Leigh did not need any work done because they were 18 and 20 years old. Jennifer Beals didn't need a facelift to star in Flashdance because she was 19. Brooke Shields was 14 in The Blue Lagoon. Really the only older sex symbol from youth-oriented movies is Kelly LeBrock, who played the Frankenbabe in Weird Science. Mature, savvy, and cool-headed, she provided a good contrast to the juvenile dorks who created her. And how old was this icon of wisdom and stoicism? -- 25.
That's what used to pass for "no longer young," whereas the NYT article says that people now find it disturbing how "young" some of the plastic surgery women are, using a 23 year-old as an example. That's hardly old, but sorry, that's not exactly young either. This mindset is what has given us an older set of sex symbols, contrary to all the whining you hear from aging women (and dickless white knight males) about the culture's obsession with youth. The culture has never been so free of adolescent and young adult influence -- everything is made for middle-aged adults (those of childrearing age and older) or pre-pubescent children.
With all this pressure of the zeitgeist bearing down on them, Hollywood is just not going to go there with young girls. So, the result will be worse-looking women in movies -- we'll get a handful of naturally beautiful ones, but without the ace in the hole of plastic surgery or sheer youth, the majority of actresses aren't going to turn any heads.
April 22, 2010
No new Nightmares
Looks like they're going to take a stab at a reboot of the original Nightmare on Elm Street. Like I said about the upcoming sequel to Predator, don't make any new scary movies or cop movies until the crime rate shoots up again, so that the audience will actually be afraid of things out there and the actors will be able to easily tap into the fearful mindset when on camera. Here are several clues that the movie will stink, aside from being a horror movie made during safe times:
- The cast is not young enough. ("God, I look 20 years old...") Only the parents are the same age as before, but they're not important. Freddy Krueger is now 47 instead of 37, which would work if he were only supposed to be psychopathic, but not if he's supposed to be strong and quick enough to chase down his victims. Nancy is now 24 vs. 19 in the original, although her boyfriend is 22 vs. 21 in the original. The other two young people in the original were 20 and 23; in the new one they are 21, 22, and 24. The cast of the original were the age of college students, which is close enough to teenagers that we could believe they were seniors. The new cast are all the age of people who've graduated college and are just on the cusp of establishing their career, finding a husband or wife, bla bla bla.
Much of the suspense in the original comes from sympathizing with the plight of teenagers who still live at home and are still dependent on their parents, who don't believe or are just blind to what's going on. That sympathy won't be possible with people old enough to be out on their own. Those 5 years separating 18 from 23 see a hell of a change in personality and behavior, since by the latter point they're almost out the door of their wild years of 15 - 24. They aren't so easily spooked by that point, so they won't be as convincing on camera as a 17 or 18 year-old would be.
- Two of the characters aside from Nancy and her bf are described as "a jock on the swim team" and "a well-liked, well-off high school jock." Um, jealous much? Sounds pretty exaggerated and unsympathetic to me, unlike the believable characters in the original -- but then what do you expect from a director mostly known for music videos such as "Smells Like Teen Spirit," "No Rain," and "Bullet with Butterfly Wings"?
- Freddy Krueger is being downgraded from a monster to a mere sociopath like Hannibal Lecter, according to the director:
We've gone in a slightly different direction with our take on Freddy and I like that. We delve a little deeper into him as a person. How he became the thing he was. That's certainly attracted me to this character. He's not a mindless guy with an axe. He's a thinking, talking, psychologically disturbed character.
I can feel myself getting drowsy already. Seriously, how much more clueless do you get? The entire folklore genre of "monster terrorizes people and gets slain" doesn't bother at all with a deep backstory that the audience delves into. There's some scary motherfucker out there waiting to get you, your friends, and your family -- you don't care how it came to be. You only care about escaping its terror or standing up to it and cutting off its head. Those are the feelings the audience wants to identify with, not with those of some psychiatrist who chronicles and dissects the monster's personality development. That belongs to stories like Frankenstein or First Blood, whereas the director here claims to be making a terrifying horror movie.
- Last, the soundtrack is large and orchestral rather than minimal and synth-based like the original. There's something inherently spooky about synthesizers because their sounds are just close enough to organic instruments for us to perceive these sounds as music (as opposed to sound effects), but they're inorganic enough to make us uncertain. Even in pop music the more haunting songs typically rely on a synthesizer, like in "Billie Jean" and "Dirty Diana," and the same goes for TV, as with the soundtrack for Twin Peaks.
Instead of releasing these remake / reboot failures based on classic scary movies, they should just re-release the originals in theaters. Maybe at the second-run theaters -- so what if the first run was 26 years ago? I'd pay full price to see Nightmare on Elm Street, Beverly Hills Cop, or Ghostbusters in the theater today. Good and old will always beat dumb and new. That seemed to work for the 1997 re-release of the Star Wars trilogy. Unlike a bad new movie, re-releasing the original wouldn't involve the huge production cost -- it's already sitting there waiting to be distributed and projected.
Till then, you'd do best to just buy the DVD. It has one of the greatest commentary tracks if only because Heather Langenkamp sounded just as delightfully girly at 31 as she did at 19. I worry that commentary tracks made long after the movie itself will adulterate my memory of the movie by featuring people who are so far away from who they were when it was first made. Not with this one, though. (The only other case I remember where a 30-something female still sounds like a teenager is Kerri Green in the commentary for Goonies, when she was 34.)
April 20, 2010
Paleo skin
Having covered paleo hair, let's move on to how to get great skin and keep it that way by using some simple biology. Clearly girls will benefit from this, but guys too will continue to enjoy solid mental health and a big fat smile on their face as long as they still get attention from young girls. And to point out the obvious, the men who adorn the walls and ceilings of nubile babes look more like Johnny Depp than Clint Eastwood.
Even if your genes don't predispose you to competing based on dreaminess, compare Berlusconi to other powerful men of the same age -- his skin looks like that of a man at least 20 years younger. The reason we find youthful looks attractive especially in middle or old age is that they are a signal of good genes. If you're 20 years old, male or female, you have to try hard to look repulsive. When you're 60, though, it's easy. So if you've managed to prove how resistant you are to the aging process, even if you still don't look as youthful as you did in your 20s, everyone nevertheless infers that you've got some powerful set of genes to have pulled it off.
There is less error behind this inference than a similar inference of good genes based on good looks in a 20 year-old. Thus, risk-averse females will be even more receptive to sleeping with a dreamy, youthful looking 40 year-old than his 20 year-old counterpart. A sex-only relationship is a strategy to just get the guy's good genes. If she were trying to get greater parental investment, she'd go after someone with more control over resources. If she were after someone with shared interests and a desire to go steady, she'd pick someone closer to her own age. So you probably won't develop a sustained relationship with any young girl when you're in your 40s -- but it's a nice consolation prize to still get wildly fucked by them.
Moving on to what builds and maintains healthy skin, most of the story is vitamin A. Vitamins C and E also help, and of course you need to eat plenty of fat, since fat is what your cell membranes are built from and since vitamins A and E are fat-soluble (they need to be taken with fat in order to be absorbed). But as long as you focus on vitamin A, you'll pretty much be set. You may already know that it's involved in vision, but its more important role is in the epithelial cells -- any cell on the "surface" of your body. That includes all of your skin but also the "inner surfaces" like the lining of your respiratory tract and digestive tract. Get low on vitamin A, and you're more likely to get a respiratory infection or to not properly digest your food and absorb its nutrients.
Not getting the flu is nice, and strengthening your skin will also help to keep out pathogens that would enter that way. But we don't live in an environment where measles, leprosy, etc., are rampant, so this probably won't persuade too many people to start getting enough vitamin A. Rather, do it because you won't have disgusting Jabba the Hutt skin that, being hard to conceal, will tend to turn away all sorts of people who see you -- not just total strangers, but also love interests and potential allies. ("I don't know how trustworthy he is -- just look at how much he's let himself go.") Well into middle age, Cary Grant and Bryan Ferry still had firm, bouncy, glowing skin, and continued to date beautiful women decades younger than they were.
Where do you get vitamin A from? Essentially one place -- liver. That is where it is overwhelmingly stored. It is also stored in far smaller amounts in other animal fats, so dairy products and even the fat on muscle meat have some in them, but not that much at all. Vitamin A does not exist in any non-animal food, although the building blocks called carotenoids do. However, the conversion of these building blocks into the real deal is not free and perfectly efficient -- it takes a lot of the building blocks to make a single unit of vitamin A. That's why grazing animals, who make their own vitamin A from plant foods rather than eat another animal's liver, must munch on the stuff all day long. It's like trying to squeeze blood from a stone. Predators like humans pursue a riskier strategy of killing another animal and eating the liver, although this risk is balanced by the far easier job of obtaining vitamin A once we've acquired the food source.
Fortunately, liver is available in all kinds of forms and at all price levels. You don't have to import pate de foie gras -- even Oscar Mayer makes a great liver cheese that only costs around $4 for 8 one-ounce slices. I just have one slice a day, and that alone gets me far more than the RDA. I take a slice of salami, then a slice of liver cheese, then a slice of pepperoni, a tiny bit of mustard, some hard-boiled egg, and a little fire-roasted tomato from the can on top. It tastes wonderful. Then there's liverwurst and braunschweiger, unprocessed liver from calves, chickens, etc., and cod liver oil. I don't find the taste of liver bad, but it's not great on its own. Still, given how many forms it comes in, and how simple it is to work it into the rest of your meal, there's no excuse to exclude it from your diet.
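For what it's worth, here is the back-of-the-envelope arithmetic behind the one-slice claim, as a minimal Python sketch. The adult-male RDA (900 micrograms RAE per day) and the roughly 12-to-1 conversion factor for dietary beta-carotene are standard reference values; the per-ounce retinol figure I plug in for liver cheese is my own ballpark guess for a pork-liver product, not the number off the label.

```python
# Rough check of the "one slice covers the RDA" claim. The RDA and the
# ~12:1 beta-carotene conversion factor are standard reference values;
# the liver-cheese figure is a ballpark ASSUMPTION, not label data.
RDA_UG_RAE = 900                      # adult male RDA, micrograms of retinol activity equivalents

liver_cheese_ug_rae_per_oz = 1000     # ASSUMPTION: preformed vitamin A in a 1-oz slice
print(f"One slice ~ {liver_cheese_ug_rae_per_oz / RDA_UG_RAE:.1f}x the daily RDA")

# Plants only supply precursors (carotenoids), converted at roughly
# 12 ug of dietary beta-carotene per 1 ug RAE. Matching that slice takes:
beta_carotene_needed_ug = liver_cheese_ug_rae_per_oz * 12
print(f"Equivalent beta-carotene from plants: ~{beta_carotene_needed_ug:,} ug "
      f"(~{beta_carotene_needed_ug / 1000:.0f} mg) per day")
```

The exact multiple depends on the product, but the shape of the comparison is the point: a bite-sized daily serving of preformed retinol does the job, while the plant route makes you eat an order of magnitude more precursor by weight just to break even.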
Aside from getting it through your food, you can also apply vitamin A topically. It's the basis for many acne treatments and for creams that are clinically shown to reduce the appearance of fine lines and wrinkles. Just as with your diet, though, you have to apply it every day to maintain the effect. In lotions and sunscreens, it will be listed under the inactive ingredients as "retinyl palmitate." I looked over every skin lotion at my mega-supermarket, and only one brand had vitamin A -- the Gold Bond "Ultimate" line (I prefer the "softening" variety). I've also found only one face moisturizer that has vitamin A -- Neutrogena "healthy skin anti-wrinkle cream" (they call it by its technical name retinol). I found three sunscreens with vitamin A -- the cheapest was Banana Boat "Sport Performance Active Dry Protect," and I recall Aveeno and Hawaiian Tropic having some as well. These products aren't very expensive considering how long they last, and they don't take long to apply either. After eating some liver to fight aging from the inside, take a minute to send in reinforcements from the outside.
Lastly, it goes without saying that sugar is poison to the skin, and so is anything that turns into sugar in your bloodstream, like high levels of carbohydrates in your diet. There are two proteins responsible for giving skin its elasticity (elastin) and bounciness (collagen), and as with any protein they can get screwed up by a sugar smacking into them. Some proteins and sugars are meant to join together to form a more complex object, but you have to have the right protein, the right sugar, and an enzyme that carefully orchestrates the process to make sure things join in the right place. With lots of sugar flowing around, a protein and sugar that aren't meant to join could very well join, and even a pair meant to be together may get stuck in the wrong configuration, like putting a car's wheel where the gearshift should be.
These freak protein-sugar combinations can't do the jobs they were meant to do, and even worse, they can become tangled (or cross-linked) with each other. (This whole process is called glycation if glucose is the culprit and fructation if fructose is. The freak combinations are called AGEs -- Advanced Glycation End-products.) That's bad news for all proteins in your body, but the ones that keep your skin firm and bouncy are especially susceptible to getting screwed up by sugar. That's why a sugary diet ruins your skin, and why vegans always look desiccated. They eat a ton more fruit than the average person (who already eats a lot of carbs to begin with), and fructose is a stronger and faster destroyer of proteins than is glucose. It's no good to work liver into your diet if you're just going to sabotage its effects by wolfing down carbs -- sugars and starches in particular.
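To make the scaling concrete, here is a toy sketch of that process. It treats glycation of a long-lived protein like collagen as pseudo-first-order in sugar concentration; the rate constant, the "low-carb vs. high-carb" concentrations, and the roughly 8x fructose-to-glucose reactivity ratio are all illustrative assumptions chosen to show the shape of the effect, not measured values.

```python
# Toy glycation model: the point is just the scaling -- damage accumulates
# faster when more sugar is circulating, and faster still with a more
# reactive sugar. All constants are illustrative, not measured values.
import math

def fraction_glycated(sugar_conc, years, base_rate=0.002, reactivity=1.0):
    """Pseudo-first-order sketch: fraction of a long-lived protein
    (think collagen) carrying sugar adducts after `years` of exposure."""
    return 1.0 - math.exp(-base_rate * reactivity * sugar_conc * years)

scenarios = [
    ("low-carb, glucose only", 1.0, 1.0),
    ("high-carb, glucose only", 2.0, 1.0),
    ("high-carb, heavy fructose", 2.0, 8.0),   # ASSUMPTION: ~8x reactivity vs glucose
]
for label, conc, react in scenarios:
    frac = fraction_glycated(conc, years=20, reactivity=react)
    print(f"{label:>26}: ~{frac:.0%} of the protein glycated after 20 years")
```

The absolute numbers mean nothing; the relative jump between the rows is the point the paragraph above is making.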
In the end, it's a very simple and cheap formula to follow -- eat a slice of liver every day, take a few minutes to apply vitamin A topically, and go easy on carbs. You'll thank me when young girls continue to smile at you when you're in your 40s, rather than get grossed out by "old guy skin."
Even high culture not as wild anymore?
In the comments, Jason Malloy points to a great NYT essay on the decline of carnality in serious lit. I don't read much fiction, and the most recent authors I've read are probably Salinger or Kawabata, so I'll have to take her word for it. Despite being totally unfamiliar with the books she's talking about (mostly written from the '60s to the present), I still found the larger cultural shift she discusses entirely familiar:
The younger writers are so self-conscious, so steeped in a certain kind of liberal education, that their characters can’t condone even their own sexual impulses; they are, in short, too cool for sex. Even the mildest display of male aggression is a sign of being overly hopeful, overly earnest or politically untoward. For a character to feel himself, even fleetingly, a conquering hero is somehow passé. More precisely, for a character to attach too much importance to sex, or aspiration to it, to believe that it might be a force that could change things, and possibly for the better, would be hopelessly retrograde. Passivity, a paralyzed sweetness, a deep ambivalence about sexual appetite, are somehow taken as signs of a complex and admirable inner life. These are writers in love with irony, with the literary possibility of self-consciousness so extreme it almost precludes the minimal abandon necessary for the sexual act itself, and in direct rebellion against the Roth, Updike and Bellow their college girlfriends denounced. (Recounting one such denunciation, David Foster Wallace says a friend called Updike “just a penis with a thesaurus”).
If there's one word I keep using in my discussion of this shift, it is "self-conscious." I really like her use of "abandon" instead of my "wild," but it doesn't work as well as an adjective.
This doesn't look like a strictly generational thing, as the more puritanical writers come from the disco-punk generation of 1958 - 1964, Generation X, and probably whoever the hot authors of the '79 - '86 cohort are. Rather, what's common is when they started writing -- the very late '80s or early '90s and afterward. That of course coincides with the sexual counter-revolution of the past two decades and counting. Even the older writers who used to write more carnal scenes can no longer pull it off, which shows that everyone is susceptible to the larger changes. This is not an effect of aging since the young today are even more hostile to lust.
Where else in high culture does this show up? Art films perhaps? The trouble is that a lot of those are probably French, and France does not have a clear split between pre-'91 and post-'91 culture like America and Canada do, judging by crime rates. Italy saw a huge drop in crime during the '90s, though, so we could throw them in. The UK... kind of, but not as much. Are the artier movies made in those countries during the '60s through the '80s wilder than those from the '90s and 2000s? I'm not a film buff (and even less so for the artier flicks), but Woody Allen vs. Wes Anderson comes to mind.
April 19, 2010
Paleo hair
Whenever I go out dancing, my hair usually ends up saturated in sweat from all of the high-intensity activity. I've noticed that this makes it fuller once it dries, even into the next day before I shower. So why not just skip the shampoo altogether the day after and rinse with water only? I've tried that the past two times I've gone out, and it looks a lot better than when I shampoo out all of that oil and sweat. Apparently I'm not the only one who's hit on this idea, although it is marketed as "beach hair" rather than "sweaty hair."
But as you can see from this recipe, they add not just sea salt to water but some kind of essential oil too. Sure, you get salt from going to the beach, but not oil -- unless there's been a spill or something. Water, salt, and oil -- that's what your skin is pumping out when you reach a high level of physical intensity. As far as hair goes, our minds evolved to prefer signals of health and vigor, not the flat-against-the-scalp signals of inactivity or of only low-intensity exertion that would characterize the sick, the poorly fed, and the elderly. (Do sprinters have better hair than marathoners?)
As with other aspects of human appearance, fashion often works against what is truly attractive. You don't even have to look through National Geographic with its exotic tribes whose women scar up their flesh to show group membership. Like fashion in clothing, the new look for girls in the '90s and 2000s was minimalist: razor-slim hair. One of my former tutorees, brimming over with excitement about making herself up and how intimidating it would look, said, "then we're gonna go straighten our hair -- like Mean Girls straighttt."
It seems like this fashion cycle is particularly long, as the '60s through the '80s all had comparatively voluminous hairstyles for both men and women. (Wild times call for wild hair.) The adoption of polyester probably helped out with the sweaty hair look during the '70s. Similarly, for the past two decades it's mostly been the super-straight and tightly manicured look that's been in for men and women. Hopefully the recent obsession with beach hair is more than a passing fad, something that hints at a coming return to wildness in the culture. There are always bad examples of any fashionable look, but if you look at how high the peaks are, for my money the '60s - '80s hair looked the best. They don't make 'em like Jean Shrimpton or Kelly Kapowski anymore.
April 17, 2010
Being offensive is not the same as being wild
In the comments to the post below, TGGP asks about some apparent counter-examples to the decline in wildness -- cartoons like Beavis and Butt-head, South Park, and Family Guy, as well as more sexually explicit song lyrics (like, say, this one by 2 Live Crew) that became increasingly popular during the 1990s and 2000s.
As I said, I don't find those cartoons wild at all: they're about a bunch of dorks with no lives who sit around swearing and picking their nose to try to shock normal people. The loser male characters never interact with girls and only occasionally get into any reckless fun. Quagmire from Family Guy is an exception, but in typical Generation X fashion the only guy with any kind of sex life is enviously depicted as an immoral crypto-rapist. Shoot, Heathcliff and Riff-Raff were getting more tail than Beavis and Cartman.
To help visualize the difference, let's look at the combinations of low vs. high wildness and low vs. high offensiveness, both for song lyrics and popular movies:
If wildness and offensiveness were highly correlated, we'd find most cultural products in the lower-left and upper-right areas. But we find lots of stuff in the other two. In fact, most of what we think of as '80s pop culture is in the upper-left, and most of the '90s pop culture goes in the lower-right. So the two traits seem pretty independent of each other.
We find plenty of things meant to shock and offend at any given time, although there does seem to be a shift starting in the '90s toward more shocking and offensive popular culture. Again, it's only the people who aren't having any fun to begin with who labor to be generally offensive (not as in tearing down sacred cows). But what's even clearer is the shift from a wild to a tame culture starting around 1991. Even the really offensive stuff from the past 20 years isn't chock full of T&A, young people out joyriding, toilet-papering the neighbor's house, and so on. And similarly, even the non-offensive stuff from the '70s and '80s had lots of wild behavior.
Probably the greatest example of the decline in non-offensive yet wild popular culture is the death of the action comedy genre in movies. The first Rush Hour movie was OK, but other than that it's been pathetic. No more Raiders of the Lost Ark, Ghostbusters, Beverly Hills Cop, The Goonies, Stripes, Spaceballs, and on and on. An overly cautious culture is not capable of producing the rule-bending action required for such movies to work. The disappearance of teen movies is another good example of the upper-left area going away. Even the few good ones from the post-'91 era don't feature wild and crazy kids: there's very little of a "sex, drugs, and rock 'n' roll" atmosphere in Clueless or Mean Girls (unlike the original Heathers, where the high school people are more sexually active, given to pranks, and smoke and drink).
The post-'91 culture -- less fun and more offensive. It's no wonder no one's going to re-visit any of this junk once enough time passes for it to fall out of fashion.
April 14, 2010
Scenes from the wild pre-'90s culture
Statistics on violent crime, property crime, promiscuity, child abuse, and even wearing seatbelts and bike helmets show that starting around 1991 (and in the late '90s for drug use) the culture became steadily more domesticated across the board. Most people have an inherent bias to perceive the world as always getting more dangerous and depraved, which makes it hard to believe these statistics, as unambiguous as they are.
So as a reminder (or as a report for those who don't have personal memories of wild times), here are a few snapshots of the vast gulf separating the everyday culture of then and now. I could've picked examples of the riskier and more dangerous culture from anytime between 1959 and 1990, but I'm sticking with the '80s just to show how abrupt and total the shift was in 1991. The examples come from chart-topping pop music, popular kids' cartoons, and the social lives of the unpopular kids. We wouldn't be too surprised to see violent themes in the heavy metal genre then vs. now, or in animated shows for adults then vs. now, but these pictures show just how pervasive the carefree feeling was -- it even showed up in places where we don't expect things to get too wild.
In the late summer of 1986, the #1 song on the Billboard charts is a first-person view of a pregnant teenager who's made up her mind to keep her baby, as tough as it may be for her father to accept it. The ensuing controversy even makes the news in the NYT. Ten years later its successor features unintelligible lyrics and a lame beat that gives white people who can't dance a somewhat easy-to-follow series of moves. Ten years later still we find an even less fun dance song in which the female lead does not sing about how vulnerable or hopelessly out-of-control she feels, but instead about how coldly in command of life she is, underscored by her aloof attitude of "I'm too hot for you." (Odd coming from a horsefaced transvestite.) Our culture now is goofily juvenile and annoyingly self-conscious, which is antithetical to just letting go and maybe getting into trouble.
The correct perception that crime was so out-of-control that it required unorthodox solutions was not confined to R-rated action movies like Beverly Hills Cop or Lethal Weapon. During the mid-1980s an incredibly popular children's cartoon show -- not one intended for adults like Looney Tunes -- starred an outsider crime-fighting trio who would foil the plans of a criminal organization to blow things up and kill people in pursuit of grand theft. Several episodes featured leggy, large-breasted femme fatales who provoke a visible and exaggerated horndog response in the main character. (At the previous link, watch the episodes "Movie Set" and "The Amazon.")
By the mid-'90s, kids are more likely to watch a cartoon about cute monsters who try to scare humans, and by the mid-2000s one about some undersea dork who looks like he'll never get into a fight in his life. It is unimaginable that a crowd-drawing children's cartoon today would star a pugnacious prankster cat or hordes of hell-raising neighborhood kids.
Even if we accept that wild behavior was more common before, surely that was only confined to the high-ranking ones among young people. You know -- the football captain, the head cheerleader, etc. It must have been miserable to be a low-status young person back when everyone else was so carefree and happy, right? Actually, even the outcasts were outgoing, had a life, and were getting laid. In 1987 when an independent (or "college rock") band wanted to skewer their brooding goth rivals, what was the main stereotype they brought up? That the dark arty kids were addicted to partying in dance clubs and sleeping around! In the music video for a song that's derided in the previous parody, we see some old school goths in Danceteria (also mocked in the parody), where, if we are to believe the video, they stood a chance of dancing with someone who looked like Madonna.
This true stereotype about the tortured souls lasted through 1989's Pretty Hate Machine -- a dance pop album with some minor screaming -- and 1990's Violator, the last great dark arty album to still contain a lot of groovable doing-it music. Already by late 1991, apathetic and alienated young people were more interested in angry navel-gazing screaming. And while the anger level had subsided by the 2000s, the dark arty kids were still correctly perceived as anti-social shut-ins who were afraid of the opposite sex. When you're locked in your room alone with the headphones on, stewing in rage, you're hardly in the right location or the right mindset to be getting into any kind of trouble. You need to be out and about, losing your self-consciousness while you're surrounded by peers enticing you to join in the fun. Today's portrait of a loser scene kid may not be very shocking, but when the culture is wild it enhances the social lives of all people -- even the brooding artfags.
We should always go with what reliable statistics tell us (the key word being "reliable"), rather than assume that our particular experiences trump those of everyone else. But I realize that can be a hard sell to most people. Fleshing out the numbers, putting a human face on them, telling a story, bla bla bla is also necessary to convince the everyday (not the intellectual) skeptic. Here we've seen that by looking at even our simple popular culture, we get a clue that maybe things are a lot less dangerous and hellbound these days. Our culture has never been as safe, harmless, and boring as it has been for about the past two decades.
April 10, 2010
What's your video game personality type?
Neat quiz. Mine is conqueror-seeker. Back in the day, we didn't need a name for that, but now that most video game players are a bunch of pussies, I guess we do.
They should put a label on the box of the game that says what type of person it's intended for. Just labeling it "adventure" or "RPG" doesn't tell me much. The Castlevania games in the style of Symphony of the Night would probably be labeled action/adventure, but they feature way too much accumulating of pointless crap, which really derails you from getting important stuff done in the game. Those are much more suited to the mindless collector type of person, not to the action and challenge type.
And any game that has lots of "leveling up" is not an action game, since you become lulled into a zombie-like trance while level-grinding, and there is little sense of adventure since you wind up leaving and re-entering the same single screen just to kill the enemies and gain more experience points and money. Gay. Zelda II is as far as a game can go in the leveling up category and still be fun. Any more than that and it's like sleepwalking on a treadmill, which would make a great icon for the "collector type" label.
April 6, 2010
Foreign interventionism tracks domestic crime rates
I got into the anti-war and anti-globalization movement as a clueless college student in the early 2000s. I dropped out after a year or so because it felt like it was going nowhere. In retrospect, it was because we didn't have anything to fight against -- remember, this was before 9/11 and the War on Terrorism. Even the invasion of Afghanistan and the later occupation of Iraq didn't feel as monumentally disastrous as the Vietnam War or our propping up banana republics in Latin America or playing Iran and Iraq against each other.
At the time I'd listened to recordings of people who were railing against our intervention in the Balkans in the late '90s, but that too never struck me as what you would've heard standing around the Columbia student union in 1968 or hearing Noam Chomsky speak about Guatemala in the 1980s. That feeling only stuck with me through the 2000s. Had U.S. imperialism shriveled into something barely worth the time to march against?
Fortunately, unlike most large-scale changes that went into effect starting in the mid-1990s, there is no lame-ass story for people to tell here about how the internet changed everything. Whenever I see a huge social or institutional change that took place then, I immediately think of the falling crime rates and the larger domestication of society. (That's also when the sexual counter-revolution began, i.e. when promiscuity started falling, and when rock music died off.) The good thing here is that we have data that go back over a century on homicide rates. Plus there were actually two crime waves during the 20th C., which gives us four eras to test our ideas against (two rising-crime eras and two falling-crime eras). Non-obvious theories that talk about the internet are mostly untestable because we haven't seen the internet cycle that much. Any change that took place in the '90s you can lazily spin a story about how the internet did it.
The rising-crime eras were at least 1900 (and maybe a bit further back, but this is where the data begin) through 1933, and again from 1959 through 1991. The falling-crime eras are from 1934 through 1958 and again from 1992 to present. To check how militaristic the U.S. has been during these periods, there may be some fancy index of interventionism that a political scientist knows of, but just have a quick look at the Wikipedia lists of U.S. military operations and U.S. covert overseas regime change. All cases that fall in the rising-crime eras fit my hypothesis, and indeed they seem to be the worst, such as our involvement in the Pacific Islands, Latin America, and the Caribbean during the early 20th C., the wars in Southeast Asia, our involvement in Latin America in the '70s and '80s, and our involvement in the Middle East in the '60s through the early '90s.
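If you want to do the bookkeeping yourself, here's a bare-bones sketch of the tally I have in mind -- the era boundaries are the ones above, and the handful of operations listed are only examples already mentioned in this post, not a real dataset:

```python
# Minimal sketch of the tally described above. Era boundaries come from the
# post; the interventions listed are only a few examples named in the text,
# not a complete or authoritative dataset.

RISING_CRIME = [(1900, 1933), (1959, 1991)]
FALLING_CRIME = [(1934, 1958), (1992, 2010)]  # "to present" -- 2010 as of this post

def era(year):
    for start, end in RISING_CRIME:
        if start <= year <= end:
            return "rising-crime"
    for start, end in FALLING_CRIME:
        if start <= year <= end:
            return "falling-crime"
    return "before the data begin"

# (start year, name) -- illustrative examples from this post only
interventions = [
    (1915, "Occupation of Haiti"),
    (1953, "Overthrow of Mossadegh in Iran"),
    (1954, "Overthrow of Arbenz in Guatemala"),
    (1962, "Cuban Missile Crisis"),
    (1991, "Gulf War"),
    (2003, "Occupation of Iraq"),
]

for year, name in interventions:
    print(f"{year}  {era(year):13}  {name}")
```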
So what are the counter-examples from the falling-crime eras? For covert regime change, there are two big exceptions: overthrowing Mossadegh in Iran (1953) and Arbenz in Guatemala (1954). The cases from 1992 and after don't appear as well established and certainly were not on the scale of the previously mentioned two. Ditto for the cases of merely supporting dissidents in Communist countries during the 1940s. The facts fit pretty well.
What about the list of military operations? Pretty much the same thing. There's a lot going on in the Pacific Islands and the Caribbean around the turn of the last century, and imperialism stays pretty high through the 1910s and '20s. Then look at the 1930s -- basically nothing, certainly not from 1934 onward. Incidentally, the U.S. occupied Haiti from 1915 to 1934 -- exiting the very first year of the falling-crime era. There's essentially nothing in the 1940s either, other than WWII. But that's not a case of the U.S. meddling in factional violence in some unimportant part of the world, or pushing around local puppets to get better access to natural resources or just show the other loser countries who's boss. That's not imperialism. Neither was the Korean War for that matter.
There are a few operations unrelated to Korea in the 1950s, but again hardly any and definitely not on the scale of occupying Haiti, invading Cuba, etc. The U.S. does begin to send troops into Southeast Asia toward the end of this falling-crime era, so that's one possible counter-example, although those wars did not ramp up until Kennedy came into office during a rising-crime era. All of the operations from the Cuban Missile Crisis of 1962 through the Gulf War of 1991 took place during rising-crime times as well.
As for the operations during the falling-crime era of 1992 through today, again just look and find me something that compares to what we saw in Argentina, Brazil, El Salvador, Iran and Iraq, Vietnam, etc. There's basically jackshit aside from the War in Afghanistan and the occupation of Iraq, but the first ended very quickly and the second is still not close to our 20-year occupation of Haiti (though maybe I'll have to come back and change this many years from now).
It's not a perfect fit, but what the hell, it's only social science. It matches up better than any other theory of imperialist aggression. Note that macro-level political and economic changes don't have anything to do with it. We were remarkably non-interventionist during the 1930s -- and of course that had to be due to the Great Depression! Just like how our current recession is keeping us focused on more important things than invading another sandbox. Except that we had a very severe recession in the early 1920s, another bad one in the mid-'70s, a terrible one in the early '80s, and a fairly painful one in the early '90s, all when we were heavily interventionist. Capitalism, the free market, GDP, standard-of-living, etc. -- all of that was steadily rising over the 20th century through today, yet there are major swings both up and down in interventionism.
Politics has grown steadily more competitive and responsive to public opinion, so again that can't account for the two major downswings in interventionism (or upswings, depending on how you thought politics influenced foreign policy). Party affiliation of leaders doesn't matter because Kennedy and LBJ were just as interventionist as Nixon and Reagan or Theodore Roosevelt and Wilson. Likewise FDR and Eisenhower were just as restrained as Clinton and Bush II (again, on a relative level).
The only political variable that foreign interventionism tracks pretty well is politicians' focus on fighting crime at home. Politicians can't generally get away with something that the majority of people won't tolerate (see Bryan Caplan's The Myth of the Rational Voter), so we infer that average citizens were fairly tolerant of aggressive foreign policies during rising-crime eras. Well sure they were -- they even demanded it in their popular culture! What great action flick from the '80s did not feature a banana republic? We also infer that when crime rates start falling, the public senses this and is no longer tolerant of blunderbuss approaches to foreign policy -- things seem safer and safer at home, so we believe that foreign threats to our safety must be dropping too, unless we have clear evidence to the contrary like 9/11.
To sum up the process, when crime rates start rising, the public senses this in their own life and through word-of-mouth (plus hearing it from the media). They want the authorities to crack down on dangerous people before they even get the chance to start running amok. But why stop the logic at the local border? We all know there are people abroad who might pose a threat to us, and since we can't tell that from personal experience, we just have to trust what the experts say about foreign threats -- and they mostly pander to our fear of rising crime at home. Hey, once those Vietnamese farmers organize a communist government, they're only a hop skip and a jump away from Columbus, Ohio! So sure, the public says, go get them. And then send in G.I. Joe to fuck up the Sandinistas. Politicians respond to the public's perception of rising danger by getting tough on crime at home and by staging more and larger-scale military operations abroad.
Once crime rates start falling, this process unwinds. The public senses that things are safer and safer, so they are less in a state of panic. They believe we shouldn't invest so much in tough-on-crime measures since the problem appears to be taking care of itself somehow. And with no fear to exploit, foreign policy hawks can't whip us into a fear of Grenada or Afghanistan. (Again, there could be an exceptional shock like 9/11, and then hawks can get the public on board to occupy a sandbox for awhile.) Politicians respond to this change in voter sentiment by not ranting about getting tough on crime so much anymore and by starting to dial down how interventionist their policies are.
Basically, there's a threat of danger out there and it doesn't matter whether the public perceives it coming from local thieves and murderers or foreign guerrillas. There are just bad guys out there somewhere. When this threat level appears to be rising, voters demand more militaristic responses from politicians, again without respect to national borders. And not wanting to get voted out of office, the politicians respond by getting tough on crime and by bombing nobody countries back into the stone age or getting in bed with local thugs who will crack down on subversives. Once voters perceive the threat level to be receding, everything reverses course. No other theory I've ever heard of can account for the decades-long stretches of imperialism that are then interrupted by decades of relative isolationism, only to be followed by several more decades of imperialism, which in turn are put to rest by more decades still of isolationism.
April 4, 2010
Misc news
- April 17 is Record Store Day, a celebration of independent retailers. Check your state to see who's on board. I'll probably skip the live music at my local hangout, but they're going to have amazing deals all weekend.
It's strange that you have to go to non-chain record stores to find the great music of the pre-alternative/gangsta era. Most hypermarkets, such as Wal-Mart, don't carry anything good, and Barnes & Noble has only a slightly better selection, but their prices are too high. Even superstar chart-topping albums are absent. You want to find Like a Virgin, Bad, or Slippery When Wet? Then you probably need to visit an independent store. Who saw that coming?
- Arthur De Vany is interviewed on EconTalk about steroids, baseball, and evolutionary fitness. It's a shame they didn't get to talk about the movie business, but you can pick up his book Hollywood Economics (link in the Amazon box above) for that. I was surprised by how open-minded the interviewer Russ Roberts was toward the idea of low-carb eating and brief, intense exercise, given how hostile the entire culture is to eating animals while scarfing down grains, and given the religion of exercise where aerobic / cardio is pure while intensity and explosion are corrupting.
De Vany is right that burst-type exercise is more conducive to playfulness no matter where you find yourself. When I go dancing, I only use fast-twitch muscle movements -- jumps and leaps, whole-body thrusts, high kicks, squatting down and bursting back up, etc. If you do ballroom dancing, add dips and lifts as well. It's a hell of a lot more exhilarating than tediously swaying back and forth at low intensity forever.
- The release of Final Fantasy XIII has a bunch of geeks complaining that the game is too linear, meaning you don't have much choice in what to do or where to go next. Of course, just about all games in all video game genres have been incredibly linear since the mid-1990s when the shift to 3-D made games more like low-quality movies than high-quality games. The so-called sandbox games of recent times are completely open-ended and rule-free, like playing tennis with the net down. That drains all of the fun out of games because they are no longer about exploration in the sense of using trial and error to figure out your skill level and how challenging and dangerous the various areas are. Sandbox games are instead like going on a sightseeing tour through an environment that you'd never pay money to visit in real life.
Almost all of the classic games on Nintendo are non-linear in the sense of giving you a lot of choices at each stage in the game, while making some of them more challenging than others. The fun part of exploration here is using trial and error to learn how far you can go without running into trouble and having to backtrack to where you're better suited. And there still are some areas that are off-limits at the beginning of the game, which gives you a teaser for what to look forward to.
The first Legend of Zelda game is non-linear because you can visit 5 of the 9 dungeons right away, and another one once you acquire an item very early on. Now, only a few of those are beatable by the average player when the game starts -- but if you're good enough or just feel like taking a risk, you can certainly explore those harder dungeons right away. Most of the environment can be explored right away as well, although again some areas have stronger enemies. Still, they aren't completely inaccessible; you just have to be good or feel like taking a risk in order to play through them. It's not a sandbox game because you aren't wandering around with jackshit to do in a world where just about everything is accessible and no area is punishingly more difficult than another.
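If it helps to see the idea spelled out, here's a rough sketch of "non-linear" as an accessibility check -- the dungeon names, required items, and difficulty numbers below are placeholders for illustration, not the actual Zelda map:

```python
# Sketch of non-linear design as an accessibility check. Dungeon names,
# required items, and difficulty ratings are placeholders, not the actual
# layout of The Legend of Zelda.

dungeons = {
    # name: (items required to enter, rough difficulty on a 1-10 scale)
    "dungeon_1": (set(),          2),
    "dungeon_2": (set(),          3),
    "dungeon_3": (set(),          5),
    "dungeon_4": (set(),          7),   # open from the start, but punishing
    "dungeon_5": (set(),          8),
    "dungeon_6": ({"early_item"}, 6),   # opens after one early pickup
    "dungeon_7": ({"late_item"},  8),
    "dungeon_8": ({"late_item"},  9),
    "dungeon_9": ({"late_item"},  10),
}

def accessible(inventory):
    """Which dungeons can you at least enter with this inventory?"""
    return [name for name, (required, _) in dungeons.items()
            if required <= inventory]

print(accessible(set()))             # 5 open at the start; some will wreck you
print(accessible({"early_item"}))    # one more opens up very early on
```

The point is that many areas are open at once but at wildly different difficulty levels, so the player's own skill and appetite for risk decide the order -- the opposite of both a forced linear sequence and a consequence-free sandbox.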
Other Nintendo games well remembered for being non-linear include Zelda II, Metroid, Castlevania II, Blaster Master (I'd say), all of the original Mega Man games, and many others. The 16-bit and 32-bit era still had a good deal of these games, such as Legend of Zelda: A Link to the Past, but the progression through the game was more focused and less up to the player's choice. Even Super Metroid and Secret of Mana don't let you explore, right from the start, the dangerous areas that the average player is meant to visit later. By the time the 3-D era arrives, games are designed much more like movies with an intended linear sequence, with only some freedom to "pause" the sequence and go screw around for awhile, but still being unable to visit the dangerous areas or accomplish the tougher objectives early on that are intended for later. Castlevania: Symphony of the Night (and all other Castlevania games modeled on this one), Alundra, and the two Zelda games for the Nintendo 64 are all great fun to play, but are heavily linear in this sense.
Really the last great non-linear game I can think of is Kirby and the Amazing Mirror -- not surprisingly made for a 2-D handheld Nintendo system. The game is easy to find and not very expensive, so if you're looking for a fun non-linear game, screw Final Fantasy and check that one out. You can play it on the Game Boy Advance, the first two versions of the Nintendo DS (not the DSi or later versions), or even on your TV at home by buying a cheap Game Boy Player for your GameCube. And most of the classic games I mentioned above are available on the Wii's Virtual Console for probably $10 or less. Sounds better than $60 for a boring movie with uncanny valley visuals.
April 1, 2010
Until crime shoots up again, hold off on making movies about monsters or cops
Another terrible horror movie is on the way -- a sequel to the original Predator movie, whose protagonist will be the imposing natural-born leader Adrien Brody. Hey, they've screwed up with the Texas Chainsaw Massacre, Friday the 13th, My Bloody Valentine, Black Christmas, Freddy vs. Jason, and Aliens vs. Predator, so why not? I mean, they ruined the Alien franchise with Alien 3 and Alien Resurrection, so it's only logical to move on to ruining the Predator franchise. (Predator 2 wasn't terrible, even if mediocre.)
What distinguishes good from bad scary movies is whether they were created during wild times or tame times, i.e. when the crime rate was steadily rising (from 1959 through 1991) or steadily falling (from 1992 to today). When the world becomes safer, people recognize that and aren't so afraid anymore. That makes it impossible for the creators of a movie -- from the director down to the actors themselves -- to get into the authentic mindset of people who are scared out of their minds about serial killers, monsters under the stairs, and criminals lurking in alleyways.
The result is sheltered people's best guess at what fear feels like and looks like, inevitably overly stylized for the same reason that an ugly woman puts on way too much make-up. Even movies that strive to be raw and action-packed like Sin City aren't as convincing as Beverly Hills Cop, which is in large part a comedy. Just watch the strip club scenes in each one and tell me which feels real and gritty, with tension that makes your heart race, and which is ornamental and fantastic, letting you stay calm while watching it.
You'd think that the success during tame times of movies set in the world of the imagination -- Harry Potter, Pirates of the Caribbean, Shrek, Gladiator, etc. -- would favor monster movies. But what makes monster movies truly scary is the belief that they could happen to us, not that they're some cute, makes-you-wonder diversion. So even on the consumer's side, audiences can't get as into a really scary movie as they could when they perceived the world as more dangerous.
To end, what will make the Predator sequel so bad, aside from all these forces working against it? The plot is that the predator race abducts criminal humans and tosses them onto an island where they'll be slowly hunted for sport by the predators. So it's like The Running Man. Why don't people value The Running Man as much as Predator or the first two Alien movies? Because in those movies, the characters get into trouble as a result of their own choices to wander too far into dangerous territory. The same goes for Full Metal Jacket or Apocalypse Now. The characters have to be at least somewhat responsible for the mess they get themselves into, or else it doesn't provide a lesson to the audience. And that's a big part of why people like scary entertainment -- it partly teaches them how to avoid killers and monsters.
If they just get abducted by aliens while going about their normal lives, that doesn't tell us what to do, other than perhaps be a lot more cautious in general. It doesn't matter if the abductees have done something wrong in the past, so that they're not saints. They have to stumble into the unintended consequences of wandering too far into unknown territory. If something not directly related to their past bad deeds is what punishes them -- like the predators in the sequel, or the villain behind the Saw movies -- it's too much of a deus ex machina. The message becomes: they didn't face any direct harmful consequences of what they did, but far in the future some all-seeing being will pluck them out of real life and place them in hell. C'mon, no one believes that, at least not as readily as they believe that wandering too far is likely to blow up in their face right then and there.
There has to be some direct negative feedback telling the characters they made a bad gamble, because that awareness then weighs on their minds for the rest of the movie. Think of Bill Paxton's "game over" rants in Aliens, or David Caruso's similar complaints in First Blood once the deputies learn that Rambo is a Special Forces veteran, along with the leaders' attempts to calm them down even though they recognize the truth behind their worries. It's that sting of regret -- "I knew I should've fucking stayed home today..." -- that makes the audience sympathize with their situation. We've all felt regret about going too far, but hardly ever about being the victim of something as random as a natural disaster, like getting abducted by aliens or dragged onto a twisted TV show. And without some degree of responsibility and choice, you cannot feel regret.
Expect to see a bunch of actors running around like 10-year-old boys playing pretend war. Only when the world becomes more dangerous will we see the genuine looks of dread we saw in the wild-era Predator, Alien, and Terminator movies.