Still not caring about new movies, given their track record, I didn't bother going somewhere to see the Oscars. But curious nevertheless about the award ceremony as a sign of the times, I went over to TMZ and found a nice gallery of red carpet fashion disasters of the past.
We are constantly told how the 1980s were the "decade of excess," but all those claims ever amount to is that people used to grow fuller and longer hair and wear colorful clothes. Flipping through TMZ's look back at over-the-top stupidity on the red carpet, we see that it's almost exclusively a feature of the '90s and 2000s.
Someone should do the same thing for the MTV Video Music Awards, where attendees are given wider freedom to dress however provocatively they want, and where tastes run less sober. I've only watched a handful of them, but I distinctly remember the 1998 show, where Rose McGowan really trashed it up, accompanied by fellow attention whore Marilyn Manson no less.
[Googling...] Hold on, Us Magazine put together a list of the worst VMA looks of all time, and whaddaya know, they're all from the '90s and 2000s, even though the award show began back in 1984. I was too young to be into it at that time, but I do recall seeing much later on Madonna's performance of "Like a Virgin." About the only excessive thing she wore was her "BOY TOY" belt over her wedding dress. Nothing else stands out as attention-whoring about the '80s shows, or even the ones I saw in the early '90s. The first one I recall was sometime in the mid-'90s when uber-skank Courtney Love began lobbing a bunch of junk up at the pre-show hosts just so she'd get noticed. After they invited her up to their set, she made a complete jackass out of herself (she didn't have to try hard).
I didn't think of it earlier, but the appearance and behavior of people at award ceremonies is just another example of the impossibility of sympathy in the culture of the past 15 to 20 years, where everyone strives to wear a kabuki mask rather than look and act like a real human being.
February 28, 2011
February 24, 2011
Looks matter most to young girls, even in a long-term partner
In the comments to the post below about hair, Dahlia and Rob brought up the question of whether male attractiveness is about looks or wealth-and-status or something else.
In just about every piece of popular writing within the broad framework of evolutionary psychology, as well as in the specialist journal articles, writers play down how much male looks matter to girls. The only exception is when the female raters are beyond their peak years of reproductive value, which are roughly the later teens and early 20s.
When older women rate, they value looks less and wealth-and-status more. That story fits with what's familiar to readers over the age of 25 or so, and readers don't want to hear something they didn't already believe, so that's the story that gets told the most. No one wants to be reminded of what the mating market was like in middle school, high school, and college.
When it is these younger females who rate, as is the case with just about all studies published in journals (usually undergrad students in Psych 101), virtually nothing matters except for a guy's looks. I thought about collecting examples from a bunch of studies, but because there are so many, I'm just going with the most recent one I read for a seminar. *
The authors wanted to see if females (whose average age was 18.4) preferred the faces of males who had higher testosterone levels, as well as the faces of men who really liked infants. The idea was that they would prefer these guys in different contexts -- the higher-T guys for short-term partners and the infant-loving guys for long-term partners. That hunch was borne out, making a novel contribution and advancing our understanding of bla bla bla bla bla. But when you look at their findings, all of that stuff was overwhelmed by the effect of how good-looking the guy's face was:
Under "short-term mate attractiveness," we see that whether a girl, after looking at a guy's face, rated him as liking children or kind made no significant difference in his value as a short-term mate to her. Although it didn't hurt either. If she perceived him as masculine, that bumped up his short-term mate value. But the strongest factor is whether she found him physically attractive -- the effect size is over 3 times as large, and the p-value 2 orders of magnitude smaller. Well, sure, no big surprise there -- if she's just going to be with him for a brief affair, why bother judging him on anything other than his looks and masculinity?
Here is where most people's painful memories of high school come flooding back. Look at the upper half of the table. Under "long-term mate attractiveness," we see that masculinity drops out as a predictor, and now being kind and liking children make a difference, each about as strong an influence as the other. So far, so good, but then look at the "physically attractive" predictor -- it's more than twice as strong as being kind or liking children, and the p-value is an order of magnitude lower. So even when she is selecting a long-term partner, a college babe is choosing more based on looks than on that other stuff.
And it gets better -- compare the strength of the effect of looks in the long-term vs. short-term lists. It's only slightly stronger in the short-term case (0.388 vs. 0.327), and it may not even be a significant difference (they didn't test that idea). So it's not just that "looks matter" in a long-term mate -- that's not so hard to believe -- but that their power is only slightly dampened compared to their force in determining who makes a good Spring Break fling.
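To make the "is that difference even real?" question concrete, here is a minimal sketch (mine, not the authors') of the standard way to test whether two correlations like 0.388 and 0.327 differ, using Fisher's r-to-z transformation. The sample sizes below are placeholders rather than the study's actual numbers, and treating the two ratings as independent is a simplification (the same raters produced both), so take it purely as an illustration of why a gap that small could easily be statistical noise.

```python
# Minimal sketch: compare two correlation coefficients via Fisher's r-to-z.
# NOTE: the sample sizes are hypothetical placeholders, and the test below
# assumes the two correlations come from independent samples, which is a
# simplification here (the same raters produced both ratings).
import math

def fisher_z(r):
    """Fisher r-to-z transformation of a correlation coefficient."""
    return 0.5 * math.log((1 + r) / (1 - r))

def compare_independent_correlations(r1, n1, r2, n2):
    """Two-tailed z-test for the difference between two independent correlations."""
    z1, z2 = fisher_z(r1), fisher_z(r2)
    se = math.sqrt(1.0 / (n1 - 3) + 1.0 / (n2 - 3))
    z = (z1 - z2) / se
    # two-tailed p-value from the standard normal distribution
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Effect of looks on short-term (0.388) vs. long-term (0.327) attractiveness,
# with made-up sample sizes of 100 raters each:
z, p = compare_independent_correlations(0.388, 100, 0.327, 100)
print(f"z = {z:.2f}, p = {p:.3f}")  # small z, large p => no detectable difference
```

With sample sizes in the low hundreds, a gap of about 0.06 between the coefficients gives a z well below 2, which is consistent with the guess that the short-term and long-term effects of looks may not differ significantly.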
I stress that results like these are entirely typical. (Someone in the seminar suggested that the looks variable is so much stronger because it's just a lot easier to tell who's good-looking from faces, whereas kindness and liking children are harder to read from faces. A possibility, but not likely when we see that the "all about looks" interpretation is supported by every single adolescent's own real-life experiences.)
Two interesting questions suggest themselves, although they are never discussed in these articles or newspaper reports or blog commentaries or whatever:
1) Why do looks matter so much to girls when choosing a long-term mate? That would seem to be something good only for short-term mating, where good looks signal good genes that she wants to get for her unborn child and then move on to get some other guy, a fatherly guy with lots of resources, to raise it.
2) Why doesn't anyone talk about these results, or why do people only want to hear the story about male long-term attractiveness being mostly a matter of his wealth and status, not looks?
Going in reverse order, most academics and their audience -- people who read Thinking Books -- were total losers in high school and college. Being plain or ugly, the guys never got much attention and are still bitter and resentful about being overlooked or rejected by "the superficial cheerleader" type -- in reality, any girl they ever had an interest in. If such a guy can just block that part of his life out, he'll see only the part where wealth and status do matter, namely when looking for women well beyond their peak reproductive value. And since these guys make a decent buck and aren't at the bottom of the status totem pole, it boosts their self-esteem to think that wealth-and-status is what drives women crazy.
Why do women writers and readers go along with this lie? They may get some slight self-esteem boost, as though they're somewhat embarrassed for having been so caught up in a guy's looks when they were younger. But they can't get that warm of a glow. I think they just want to keep their male age-mates believing what both perceive to be a harmless and even beneficial lie. And not just for selfish reasons, like "Don't let them know how we work!" Rather, if women remind men that, among the females who matter most, looks swamp just about everything else, that could open up old wounds and cause a rise in mistrust or cynicism between the sexes. If everyone pretends that it's all about being a go-getting breadwinner, then men and women get along better.
As for why looks matter so much even in a long-term mate, we know that attractive people are more symmetrical and so probably have "good genes," ones that are better at withstanding the slings and arrows of outrageous fortune as we're developing. And this symmetry is heritable, meaning that one person being more symmetrical than another is partly a function of genetic differences between them. If a woman wants the best genes for her child, she'll try to get pregnant by the more symmetrical guy, regardless of whether he sticks around to provide for the kid or not. That's the short-term value of good looks, and it's the only one that people talk about.
But for just about all of human history -- probably everything before men worked in a service economy within industrial capitalism -- being able to do well as a dad involved lots of physical activity. To earn a living, he was a hunter, a herder, a farm-worker, or a wage laborer who worked a lot with his hands and body. He had to physically protect his social circle and perhaps go off to fight others. Plus, playing with children and showing them the ropes of growing up is intensely physical, as any parent knows who's been worn out from chasing their kid around the house or the yard.
So, even in their role as paternal providers, males almost always had to be in good shape and full of energy, hence good genes would benefit a man even outside of the one-night stand. That's why young girls are so taken by a guy's dreamy looks even in the long-term case -- they want a promising forecast of how able he'll be to hunt, herd, plant, play, and fix stuff up farther on down the line. If he looks busted up now, he won't be able to do any of that stuff later on.
It's only in very recent times in the developed world that being a good provider became possible even if you weren't in good shape and weren't terribly energetic. Just take the customers' orders, don't mouth off to the boss, and you'll get a steady paycheck that can go toward caring for your wife and kids.
Oddly enough, the solution to the paradox is a standard one in evolutionary psychology -- that we live out of touch with the environment we evolved in, and need to pay closer attention to what life was like back then and there. You'd think that would make the pattern easier for people to see.
* Roney et al (2006). Reading men's faces: women's mate attractiveness judgments track men's testosterone and interest in infants. Proc R Soc B, 273: 2169-75.
February 23, 2011
What's behind the man-child phenomenon?
Here's a summary from the WSJ on man-children and the women who are annoyed at having to date them. It's good overall but still needs some corrections.
The author does get the timing right:
For most of us, the cultural habitat of pre-adulthood no longer seems noteworthy. After all, popular culture has been crowded with pre-adults for almost two decades. Hollywood started the affair in the early 1990s with movies like "Singles," "Reality Bites," "Single White Female" and "Swingers." Television soon deepened the relationship, giving us the agreeable company of Monica, Joey, Rachel and Ross; Jerry, Elaine, George and Kramer; Carrie, Miranda, et al.
But, like every other social commentator, she doesn't realize that this timing points to the massive decline in violence and general wildness.
It doesn't have to do with more people going to college since the 1980s, which would delay marriage and the rest by an extra four years compared to not going at all. That story predicts that, before 1980, the minority of males who did go to college would have shown man-child behavior -- but they did not.
The same goes for "our increasingly labyrinthine labor market," which throws up further obstacles and delays for males on their way to establishing a knowledge-economy career. But that predicts that males who aren't going to college and aren't part of the knowledge economy wouldn't turn into man-children. Yet visit any blue-collar or middle-class area, and you will find just as high a fraction of 20- and 30-something males frittering away their time on the Xbox 360, Spike TV, and internet porn, and making runs to 7-Eleven for subsistence.
Because the shift away from maturing and toward man-childishness cuts across all classes, even if some more than others, all economic arguments are weak at best. The larger social change must be something that has affected all classes. The plummeting rate of violence, drug use, etc., is just such a change: when you perceive a longer life ahead of you because the world has become less violent, you delay milestones more than before. And everyone is a lot less subject to physical threats, and a lot less drugged out, than before.
So, Hymowitz's characterization of today's guy as a "boy rebel" who buys Maxim's "entirely undomesticated" philosophy is off the mark. More like "infantilized." Some dork who never talks to girls and wastes all his time with his frat buddies playing video games is the opposite of an untamed rebel -- he's built his own version of the domestic prison, where he is a happy slave. Truly undomesticated rebels have more or less died off since the decline in violence; indeed that's just another way of phrasing what happened. There are no more Jeff Spicolis in the average American high school, no more smokin' in the boys room, or any of that stuff.
True, the man-child doesn't do what he's told, as far as settling down and starting a family goes, but it's not because he's out roaming wild and loving-and-leaving a series of girls.
Here she comes much closer to what's causing this shift:
It's been an almost universal rule of civilization that girls became women simply by reaching physical maturity, but boys had to pass a test. They needed to demonstrate courage, physical prowess or mastery of the necessary skills. The goal was to prove their competence as protectors and providers. Today, however, with women moving ahead in our advanced economy, husbands and fathers are now optional, and the qualities of character men once needed to play their roles—fortitude, stoicism, courage, fidelity—are obsolete, even a little embarrassing.
Obviously "women moving ahead" in the economy is not it, because that began during the 1970s, yet men of the '70s and '80s didn't let that affect their self-image. Nope, those were two of the most testosterone-charged decades of the past 200 years.
It comes down to their roles as protectors and providers -- and, Hymowitz forgot to mention, as lovers. Those are basically the three jobs that females look to males for: to aggress against outsiders for gain, or to protect his own social circle from that kind of aggression coming from outside; to be a good provider dad; and to sweep her off her feet and join her on a carefree and exciting love adventure.
Since violence and promiscuity have plummeted over the past 20 years, guys' roles as fighters and lovers are not in such high demand as they were before. The only one left is the provider role. In other times and places, being a good provider might have required physical prowess, as when we were hunter-gatherers, or pastoralists tending to herds (and driving away anything or anyone who threatened the herd), or even the typically feminized farmer stooped over yanking weeds out of the ground, but who also had to chop firewood, repair his property, and chase off trespassers.
Making a living by sitting still all day, however, and providing for wife and kids that way doesn't offer much in the way of masculine dignity. As she points out, if a guy is wandering in this kind of existential drift, he might see little else to do but block it out and self-medicate with a beer and a five-hour session of Grand Theft Auto IV multiplayer.
At the same time, we should focus on the demand-side and not just the supply-side in the market for males. If females really wanted someone more bold, ambitious, exciting, and manly -- "Where have all the cowboys gone?" -- then why aren't entrepreneurial males stepping in to give them what they want? If this were a temporary mismatch, OK; but it's been a persistent pattern for about two decades now.
In reality, girls themselves have become incredibly boring over this same time period, just as the average guy has, reflecting the decline in violence. As much as they may gripe about it, they want a declawed and neutered husband. Consider by contrast two of the hit singles from the Footloose soundtrack. Bonnie Tyler expresses her desire for a fighter-and-protector male in "Holding Out for a Hero," while in "Let's Hear It for the Boy" Deniece Williams easily forgives all of her lover's superficial defects because "what he does, he does so well -- makes me wanna yell!" Those damsel-in-distress and boy-crazy mindsets have all but evaporated by now, though.
With female demand asking only for provider males, then, it's not very likely that they'll end up being supplied with stoic and courageous partners, not in a service-and-knowledge economy at any rate. There are some spots open in that way of making a living that allow a guy to kick ass, lead a team, and so on, but those jobs are the exception.
So this is just a case of people complaining about the unfair trade-offs of real life. If women want more of a manly man, they must accept more infidelity and higher divorce rates. If they want to avoid those, then they must accept that he won't be a cowboy with steel nerves but more of a man-child who wants to provide for the kids and then be left alone to putter around in the den or the video game room.
February 21, 2011
Everything you wanted to know about the purpose of hair
- On the face
Beards are much more popular in safer times than during wild times. When the European homicide rate was two orders of magnitude greater than now, anywhere from the late 14th C. through the early 16th, the aristocracy -- who committed a disproportionate share of all violent crime -- shaved their faces. Fast-forward to the Victorian era, when crime had plummeted so much, and suddenly men look like the wolfman. That's despite the cheaper cost of shaving, since the industrial revolution had already begun. Nowadays lots of guys have extensive facial hair, unlike the '60s through the '80s when at most the average guy -- not a hippie on the periphery -- might have a moustache or sideburns.
Their evolutionary purpose is to serve as signals in male-male competition, not courtship of females. Just look at all the boys and men who adorn the walls of females with the greatest reproductive value, roughly ages 15 to 24 -- zero percent have a beard, or even a prominent moustache or goatee. Sideburns at most. This does not reflect a female preference for boyish looks -- square-jawed heavy metal singers, athletes, and guy's-guy actors are almost entirely clean-shaven too.
A clean-shaven face shows how healthy the skin is and how symmetrical the face is, whereas a beard obscures both of those features.
Girls will allow a week of stubble, but that's about it. I had a week's worth of hair and that didn't deter a tiny teenage cutie pie from dancing close at '80s night and telling me at the end, ...and now i have to kiss you on the cheekkkk. Women's objections to facial hair are all based on horniness -- they want it to feel better when they kiss. They're more tolerant of beards in a long-term relationship because those are less based on sex, and she is glad that the guy is dampening his sex appeal by growing a beard, lest other females try to poach him or lest he think too much of his looks and make an attempt at cheating.
- On the body
Hair is designed to grow universally in only two places in adults -- under the arms and in the pubic area. Therefore hair anywhere else cannot be strictly better from the point of view of natural selection, or everyone would have hairy bodies. Body hair is probably a side-effect of a behavioral strategy that is constrained by a trade-off. Again it seems to serve as a signal in male-male competition, not courtship of females, for the same reasons given above for facial hair.
So the current vogue for shaving or mowing down your pubic hair and armpit hair is about as unnatural as you can imagine. (Shaving the chest, etc., doesn't look so weird, since a good fraction of males look like that naturally.)
You'd think this high supply of shaved-down-there reflects a high demand for it, but lots of what guys and girls do with their appearance is more about staying in fashion than attracting the opposite sex. For example, girls keep their hair way too short, and will often chop a good deal off even after asking a bunch of guys for their opinion, all of whom always say "don't cut it." I know that most younger guys are grossed out by a girl with even semi-natural hair. (By the way, one of the greatest English slang words is "furburger" -- with all those repeated "er" syllables, it just sounds funny.)
But do girls really desire a trimmed or shaven lower ab region on guys? Hard to tell from the pictures of guys they like, since those are never fully nude. A couple weeks ago at '80s night, a group of honey bunnies passing by as I was dancing asked me to pull up my shirt and show my belly (I think they did use the word "belly"). Although they began cheering, I'm not sure why -- the mere fact that I was doing it, the look of my stomach, or what. But I don't mess around with the hair around my belly button, so perhaps they'd been longing to see a guy with natural hair there (not that it's copious either).
- On the scalp
There are three very different features that serve as signals for three very different traits, but people tend to ignore some or conflate others. They are:
1) Pigmentation. This is the signal of aging: only if you're into mid-life and beyond do you show decreased pigmentation.
2) Length, fullness, and luster -- how big and shiny it is. This is a signal of current health: if you're sick, your hair stops growing, becomes limp, and dries out. If you're in good shape, it grows longer, fuller, and more lustrous. Current health reflects not only your current nutrition, disease burden, etc., but also how good your genes are at protecting against environmental insults. Big, long, lubricated hair is a handicap since all those proteins in the hair and the fatty acids used to oil it could be used for more productive purposes than to make you look purty.
That's why males who specialize more in courtship-of-females than male-male competition tend to have longer hair -- if it were short, the female might unconsciously think he wasn't healthy enough to grow a mane of hair. To the extent that healthy hair is a signal for good genes, females will be impressed by it, whereas male competitors in a guys-only contest will not (they aren't after your good genes like a female on the prowl is). And in physical male-male contests, long hair may be too great of a handicap -- it can be yanked, get in your eyes, etc.
3) Borders within which hair grows. This is a signal of male mating strategy: males whose hair grows more or less fully over their scalp (regardless of length or pigmentation) are more oriented to short-term mating, while those who show baldness are more oriented to long-term mating. The evolutionary function of baldness deserves a post of its own, but a brief review of key facts can't hurt here.
First, only some human populations have a greater-than-zero prevalence of baldness, and even in those groups that do have baldness, only some fraction of men will get it. In contrast, all males in all groups grow old and get sick. Thus, baldness is not a signal of aging (that would be pigmentation -- my grandfather had a full head of hair until he died in his late 80s, although it was white). And also, baldness is not a sign of poor health -- when you get sick, it's not as though your hairline recedes back six inches, and when you recover it shoots back to where it was. Health is indicated by big-and-shiny.
So again, baldness must reflect some kind of behavioral strategy that is subject to trade-offs, or else all males would grow bald, or none would.
Basically, a balding hairline is a guy's honest signal to his long-term mate that he won't be running around with anyone in the future. He's going to settle down and stick with her. How does this work? In the future, when he will be tempted to cheat with a young babe, he will be unable to find any willing partners because younger girls just get weirded out by a bald hairline, especially if he's middle-aged or older. Thus, the balding guy is saying, "Honey dear, don't take my word for it -- do you really think some pretty young thing is going to want to sleep with me if I'm bald?" Unlike cheap talk, this involuntary loss of his hairline is a credible commitment to be with only her. A less-balding or not-at-all-balding guy cannot be counted on so easily.
Notice that because it does not signal poor health, his wife will not interpret his balding as portending ill health later on. And because it is not related to male-male competition, she won't interpret it as a future inability to earn a living and provide for the family. He'll be perfectly healthy and able to hold down a job -- it's just that he won't ever get to sleep around. Baldness is an honest signal of self-domestication.
That's why the European aristocracy used to cover up any baldness by using wigs -- they wanted to stay on the mating market forever -- while the Victorian era and beyond gave us high-status males, bourgeois this time, who were OK with a bald head.
And that's why Australian aborigines don't go bald -- they have a gerontocracy where elder males monopolize women and are on the mating market into old age. Victorian England and the Australian aborigines are just two examples from the entire spectrum, but the rest fills in this way too. Look even at a smaller scale -- far fewer Scottish and Irish men go bald compared to the English. And sure enough, the Celtic groups are more in the rambunctious, fun-loving, dirty-old-man direction than the English are.
I think there's even a decade-level change where guys are more likely to go bald during falling-crime times, when they switch to a more long-term and monogamous style. Looking through pictures of guys in their 20s or 30s from the 1960s through the '80s, you hardly see the same level of receding hairlines and baldness that you do among guys of the '90s and especially the 2000s. True, nutrition went down the tubes in the past 20 to 30 years, but again that doesn't affect the hairline so much as how long, full, and lustrous the hair is.
Females want it all in a man, but because of trade-offs they must settle for going after this type of guy for this purpose and that type of guy for that purpose. For example, they want a gentle guy as the father of their children, but a more macho guy for a fling. They don't mind, and may even prefer, a balding guy as their long-term husband. But again, look at who they want a brief adventure with -- the lead singer of a rock band, an athlete, a dreamy actor, a powerful executive or politician -- all of whom have less of a receding hairline than guys better suited to being good, reliable dads. Compare them to the faculty page of any academic department, or to men who carry their baby in one of those chest-pouches, and so on.
The share-it-all generation closes the blinds
From the NYT:
Blogs were once the outlet of choice for people who wanted to express themselves online. But with the rise of sites like Facebook and Twitter, they are losing their allure for many people — particularly the younger generation.
The Internet and American Life Project at the Pew Research Center found that from 2006 to 2009, blogging among children ages 12 to 17 fell by half; now 14 percent of children those ages who use the Internet have blogs.
We've been hearing forever about how exhibitionistic the Millennials are, but I was never very convinced and have periodically shown why that view is wrong. The most obvious fact is that they don't hang out in public spaces like young people used to. Their social lives are more like those of the Tupperware-party women of the 1950s, not the out-and-about carnivalesque of Woodstock, Studio 54, or the megamall. Second, there is ascertainment bias -- we can only see exhibitionists if the technology for detecting them is any good. In 1968 it was a lot more difficult for preening young people to be seen and heard; just imagine if they had had YouTube back then. And last, as I pointed out awhile ago, the practices of streaking and of flashing your boobs at a concert have totally died out. (I'll bet skinnydipping has too, but that's harder to get an impression about.)
This move among young people from more visible websites like blogs to less visible ones like Facebook is just another example of the pattern: they only go into public spaces if there's no alternative, preferring to keep their thoughts and actions completely private, shared at most with their tiny real-life social circle.
Most of the Baby Boomer observers, who started the rumor about share-it-all Millennials, have no idea how private Facebook has become since 2008 or so. It used to be that anyone could see anyone else's profile, within a very large network that both belonged to, like any two people who listed Boston as their residence or Hill Valley High as their school. Now you can only see someone's profile if you're mutual acquaintances.
Also, the potential for exhibitionism on Facebook is virtually nonexistent these days. Back in the wild, wild west days of Facebook -- up through about 2007 -- teenagers decorated their profiles with all kinds of junk, especially girls. It looked like the wall of their room -- funny phrases and joke pictures, pictures of hot boys and girls, pictures of their role models, still images from their favorite movies or TV shows, and so on. Some of these were added by their close friends, but if they didn't want them up, they could always have taken them down. And for a while you could embed a video clip on your main profile page.
Now you can't do any of that, and the profile resembles a blog, just one that is only viewable by a small number of known people. There's a really long comment thread in the middle, a list of friends on one side, a list of ads on the other, and some barebones information and a picture at the top. The "all about me" information -- where people blab about what their favorite music, movies, and other interests are, what their brief life story is, and where they put an endless list of quotes (either movie/TV/music quotes or inside jokes) -- has been banished to a separate tab that no one will ever click on.
We can also point to the immediate death of MySpace once Facebook offered a social network site that was at least somewhat closed. The Facebook networks were still very large at first, but they didn't include everybody like MySpace did. Again, compare how pimped out the average MySpace profile was to the average Facebook profile -- orders of magnitude more bling.
During the euphoria of the peak housing bubble years (say, 2003 to 2006), the otherwise don't-look-at-me Millennials opened up quite a bit and strutted their stuff, but that really was just a blip. I too was blinded by that moment of craziness and took that to be their natural way, but now that the housing bubble party has been over for three years, and also looking back at the late '90s and early 2000s, it's clear that their basic preferences are to not stand out or cause a sensation. At least I got to get close to them when they were still rather outgoing and fun-loving, during my stint as a tutor. And that's what counts, since your memory tends to block out how boring a once-exciting scene eventually became. Sure was fun while it lasted.
February 20, 2011
February 17, 2011
Rock songs with a rap section?
When the hit song "Rapper's Delight" by the Sugarhill Gang introduced the average listener to rap music in 1979, rock groups took notice, some of them even working a rap section into their own songs. How extensive was this borrowing, though?
I'm not counting rap covers of a rock song, like Run-D.M.C.'s version of "Walk This Way" with Steven Tyler and Joe Perry, but original rock songs with some rap element to them. Also, I'm restricting this to rock music, not alternative / emo / indie or nu metal like Korn and the Bloodhound Gang, so like 1992 at the latest.
I can only think of three from browsing stuff that I own, and I'm too lazy right now to rummage through every song in my mental closet to come up with others.
"Rapture" by Blondie (1980)
"Eyes Without a Face" by Billy Idol (1983)
"Calling All Nations" by INXS (1987)
Any others? The first two were big hits, while the third is a good song but never released as a single. So by my admittedly cursory look, it seems that the interest of white rock musicians in rap didn't last much longer than five years.
February 16, 2011
Why do we crave carbs when it's cold, and meat when it's warm?
Recently I mentioned that I experimented with a candy bar diet when I got sick -- it kept me from getting worse, and I even got better. That was just one episode in a larger trek away from a paleo kind of diet during the winter holidays. I did the same thing last winter.
I find myself much more likely to want carbs when the weather is really cold, and even if I try to return to a caveman diet, it's incredibly more difficult during the winter than during the spring or summer. For instance, when I had some cupcakes and ice cream at my mother's birthday party a couple summers ago, I bounced right back. In fact, when I first started eating low-carb, it was spring and I took to it without any pain at all.
And it's not just me. We have holidays and festivals scattered throughout the year, yet it's only the ones during cold-weather times when we feel like pigging out on starches and sweets. During hot-weather times, we feel more like feasting on animals and leaving the sweets aside. Most of this is obvious, but if you want to check, just go to Google Trends and search for something like "pie" or "cake" or the generic "sweets," and you'll see that people search the internet for these things much more when the weather is cold and hardly at all when it's warm.
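If you'd rather check the numbers than eyeball the Trends chart, Google Trends lets you download the weekly interest figures as a CSV. Here's a rough Python sketch that averages the cold months against the warm ones -- the file name and the bare two-column layout (week date, then interest value, no header row) are assumptions about how you saved the export, not anything Google promises:

import csv
from datetime import datetime
from statistics import mean

COLD_MONTHS = {11, 12, 1, 2}   # Nov-Feb, the carboholic stretch

cold, warm = [], []
# "pie_trends.csv" is a hypothetical name for your saved export, assumed to
# hold one row per week: date (YYYY-MM-DD), interest (0-100), and no header.
with open("pie_trends.csv", newline="") as f:
    for week, interest in csv.reader(f):
        month = datetime.strptime(week, "%Y-%m-%d").month
        (cold if month in COLD_MONTHS else warm).append(float(interest))

print("average search interest, Nov-Feb:", round(mean(cold), 1))
print("average search interest, Mar-Oct:", round(mean(warm), 1))

If the seasonal story holds, the first number should come out well above the second for "pie" or "cake," and the gap should shrink or flip for a warm-weather term like "grilling."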
Just to run through the examples, though:
- The first big gorging-on-carbs holiday is Halloween, right at the end of October. It features little or no meat, or even savory vegetables and fresh fruit. Nope, it's all about easily digestible carbs like sugar.
- Then comes Thanksgiving about a month later. Although there is some meat -- it wouldn't be an omnivore's meal without one -- it is dwarfed by the starches and sugars. Out come the buckets of mashed potatoes, cranberry sauce, corn pudding, sweet potato casserole, rolls, and then a seemingly endless train of pies. There is little whole fruit, and not even that many real vegetables -- maybe a token tray of asparagus or Brussels sprouts at most. This meal extends all the way through Thanksgiving weekend.
- Next is Christmas, where again there is some meat but mostly a repeat of the Thanksgiving style of dinner, which also lasts for a lot longer than just one meal. In Google trends, "candy" shows two spikes -- one leading up to Halloween and one just before Christmas. Throw in all those candied and chocolate-covered nuts that your relatives send you, plus eggnog, hot chocolate, and apple cider, all of which are consumed during winter broadly.
- Among Jews, Hanukkah time is full of the same kinds of sweets that everybody else eats (cakes, pastries), starches (potato pancakes), plus chocolate gelt.
- New Year's Eve is like Thanksgiving or Christmas without the big hunk of meat, although food comes in smaller portions.
- Super Bowl Sunday features little meat, aside from some wings (and even these are soaked in sugary sauces). It's mostly about pizza, chips, bean dip, nachos, beer, and soda. It's close to a carbo-voric vegan's dream holiday. "Chinese food" shows a spike in Google Trends during the winter holidays, and I'm sure some of that is Super Bowl-related. (Not due to Chinese New Year, since Chinese Americans would never search the internet for "Chinese food.")
- Valentine's Day is all about sugary sweets.
- Mardi Gras has no standard big helping of meat, and like Valentine's Day is mostly about sweet pastries like the king cake, and candy.
So from roughly November through February, we go into carboholic mode. There aren't any big holidays in March, except St. Patrick's Day, which has no standard food -- and to the extent that it does, it's carb-loaded alcoholic drinks. Jews celebrate the Passover Seder with a fairly savory, fat-and-protein-rich meal. Even all the stuff made with matzo isn't sweet. There may or may not be a Passover cake. I distinctly remember a Seder dinner with my friend in 8th grade, and it was no starch-and-sweets binge like Hanukkah.
- Getting into spring, Easter is neither here nor there. On the one hand, kids eat some Peeps and Cadbury creme eggs, but it's only a little bit and probably nothing much more than they would normally be allowed during the weekend. And on the other hand, the most famous tradition is to decorate and eat a bunch of hard-boiled eggs -- no carbs, lots of fat and protein.
- Spring Break features little sugar or starch, and more animal foods. It's a generic beach meal; see July 4th and other summer festivals.
- In May and June, there's Mother's Day and Father's Day. Why don't we buy them really starchy or sugary foods then? If they were held in January, you can bet we'd send them pies, cakes, cupcakes, candied nuts, cookies, pastries, or something. If food is involved at all, we take them out for dinner, and it's no different from your typical nice dinner out.
- Memorial Day weekend has no special foods, but people haven't turned to it as an excuse to binge on sugar like they have with cold-weather holidays. People usually make something like a light version of Fourth of July food.
- July 4th has very little emphasis on sweets for a holiday. There might be some potato salad and ambrosia, but most of what everyone eats is dead animals -- hot dogs and hamburgers especially, but also steaks, lamb chops, pork chops, and anything else you can throw on the grill.
- New England clambakes, where people gorge on animal foods and not much at all on sweets, are held mostly during the middle half of the year. Same with Midwestern fish boils. And also Hawaiian luaus. In general, summer festivals are all about slaughtering a bunch of animals and setting them on fire.
- Going back to Jews, neither Rosh Hashanah nor Yom Kippur features super-starchy or sugary food. Maybe some semi-saccharine fruits for Rosh Hashanah, but no cornucopia of pies, pastries, cakes, etc.
- Labor Day weekend is also like a light version of the Fourth of July. No explosion of pies, cakes, pastries, mashed potatoes, cranberry sauce, or whatever.
- Finally, Oktoberfest, where it is celebrated, takes place in September-October and is very meat-intensive, along with a handful of starch (and beer), with hardly any sweets.
So from roughly March through October, and especially during the summer, we leave carboholic mode and start to chow down on any animal food we can get our hands on. The only exception is that people eat more ice cream and popsicles during the summer, although even that is dwarfed by all the muffins, brownies, cookies, etc. that we wolf down during cold weather.
Clearly this has nothing to do with the availability of these foods here and now. Sweets are available year-round at the same level, and so is meat, thanks to industrialization, refrigeration, etc. Our tastes therefore reflect what adapted us to earlier environments. It's probably not farming life that made us this way, since farmers never would have had a huge supply of meat at any time during the year, in contrast to the summertime carnivores that we become. Hunter-gatherer and pastoralist life could have created these tastes, since it's easier to hunt down animals in warm weather, and since it's less risky to ritually slaughter one of your livestock for a feast during warm weather, when the rest of the herd is feeding well and breeding.
As cold weather sets in, it's like we're trying to fatten ourselves up by gorging on easily digestible carbs, just in case it's a harsh winter. It could be to maintain a reserve of energy that could be burned if we encounter little food, as well as to provide us with insulation against the cold.
By the way, "diet" and "dieting" peak in Google Trends right after we've had Thanksgiving and Christmas carb binges. It is not "caused by" the tradition of New Year's resolutions, which is the effect and not the cause -- we picked that time to make ritual resolutions because that's when we naturally think to ourselves, "Good god, what have I been doing to my body lately?" That's surely one reason why dieting doesn't work -- people try it mostly when they're in carb-munching mode and their insulin is too jacked up to let the fat out of their adipose tissue.
There is no peak for diet or dieting before summer, when supposedly everyone is worried about how they'll look with their shirt off or in a swimsuit. And there's no peak after summer, when people might worry about all the burgers and dogs they ate. Deep down we know that loading up on animals is not unhealthy, and we feel no need to diet then, and so we would never make our resolutions during the summertime. The regeneration of springtime and the new pulse of life of summertime would surely make a good background story for why we're making our resolutions to turn over a new leaf, but we just don't have the need to during these times.
I'm not sure whether the wintertime sugar bomb stems more from our hunter-gatherer or pastoralist past, but my hunch is the latter. It was just not possible for hunter-gatherers to find lots of sugar at any time during the year, whereas pastoralists have always existed alongside settled farmers who had bred domesticated fruits to taste super-sweet (try a crabapple to see how wild fruit tastes) and who also grew really starchy grains and even sweeteners, though those were still fairly rare. The pastoralists could either have just raided the farmers and stolen these ingredients for sweets, or they could have been part of a trading network where they gave up some of their cheese or butter or animal hair to the farmers. Either way, this would have to wait until after the autumn harvest, or the farmers would have little to offer the herders.
If the farmers had built even more advanced societies, there might even have been specialists who had already made the cakes, pies, etc., which were then traded in exchange for the herders' animal products.
Having told that story, though, it seems like the source of our seasonal fluctuation in preference for sugar vs. meat comes from the historical intertwining of both farmers and herders. Without the products of both groups, there could have been no icing, toffee, ice cream, whipped cream, cheesecake, eggnog, cheese danishes, or a plate of cookies or brownies with milk.
February 14, 2011
Crowd-pleasing music (notes from unpaid fieldwork)
Driving without music playing feels so unnatural. Even if it's just to run some errands for 15 minutes worth of driving, I can't leave without picking up a CD or two for the ride. And for whatever reason, I don't like driving with the car windows up all the way -- it just feels weird. Even when it's cold I leave them open a crack. One, it keeps me more in touch with my surroundings, and more importantly, two, I get to annoy people with the music I'm playing.
Ever since the music culture died during the '90s, the average listener bristles at carefree, feel-good music, as well as at introspective or downer music that is about vulnerability rather than the mopey, bitter, or outraged mode of the '90s and 2000s. The culture overall has just become a lot more fake, especially in social relations. People are too afraid or too prudish or too something to show how they feel, whether good or bad. Good-times pictures on Facebook show kabuki faces, not real facial expressions, and bad-times pictures on MySpace show this-is-my-hardest-teen-angst poses.
Naturally then, real music is going to rub these people the wrong way, and there's nothing more satisfying than disturbing someone who needs to lighten the fuck up. As a bonus, if you run into the minority who do appreciate real music, you'll put a big smile on their face, like "Oh thank god someone's still playing some good songs around here!"
It only rarely goes as far as me getting some kind of acknowledgment from the people within earshot, but over the past couple years it has happened quite a few times. Everyone has memories of where they were when they heard a song that carved the memory onto their brain. With such a moribund music culture out there, I now get that feeling based on positive "audience" response when I'm out driving around.
Here is a close-to-complete list of these episodes. I may be overlooking one here or there, but it's hard to forget those fleeting connecting-with-strangers moments. Obviously this method of detecting what music is crowd-pleasing will have plenty of false negatives, where I'm playing something good and everyone likes it but for whatever reason doesn't send an acknowledging look. However, it won't produce false positives -- if a total stranger is willing to exchange a knowing, appreciative look, that's just the tip of the iceberg of fans of the song, who aren't always going to be so forward.
Actually, first it's worth looking at what never gets a response, no matter how often I've played it in however broad a variety of settings. Rock music from the '60s and early '70s never gets a nod or anything from Baby Boomers who grew up on it, let alone people who heard it much later. It's not that it gets sneers or eye-rolls, as if I were to play Eminem or Korn, but I was surprised at how neutral the reactions always are. This includes any of the Velvet Underground albums, Lou Reed's Transformer album, Rubber Soul by The Beatles, compilation albums by The Beach Boys, The Byrds, and The Searchers, and Electric Warrior and The Slider by T. Rex.
The public's appraisal seems to be that these are good musicians who did pioneering work, but that perhaps a bit more of their fame than is comfortable to admit out loud is due to their laying the groundwork for better groups. It's like how you study a little bit of Pollaiuolo in an art history class before you dig deep into Michelangelo. There seems to be a similar appraisal of New Hollywood movies like The Graduate, which were a nice experimental waking-up from a slumber, but which still can't compare to the non-stop excitement once the industry hit its stride -- here as well, during the mid-1970s and lasting through the early 1990s.
Also, none of the original independent rock ("college rock," "college radio," etc.) ever gets acknowledged. The Jesus and Mary Chain, Camper Van Beethoven, The Replacements, The Dead Milkmen, Echo and the Bunnymen, Love and Rockets, Tones on Tail, etc. The audience for that music was mostly college students who thought of themselves as artistic, non-conformist, bla bla bla -- in other words, they used music almost entirely as a tribal or individual marker of their awesome uniqueness, and hardly at all because they liked the music itself. In typical narcissistic fashion, they and their present-day counterparts have thrown their former idols under the bus and have kept moving on to some other new crop of "obscure bands" to signal their non-mainstreaminess. I used to play a good amount of the "post-punk revival" music from the mid-2000s, but that never got any responses either.
Now on to what has played well, with a remark or two about the episode.
- "Round and Round" by Ratt. Pulling out of the parking lot at the local supermarket. A couple that's in their early-mid 30s and dressed in the long black trench coat look of the '90s hears this and smiles genuinely, like this is what they listened to when they were pubescent or slightly before. "Dang, hardass music used to actually sound hardass," I could hear them thinking to themselves.
- "Lucky Star" by Madonna. Leaving the parking lot of a nearby shopping center. A group of two or three couples in their later 30s through mid 40s is crossing in front, when one woman spontaneously breaks into a dance and strut halfway through crossing.
- "Get Into the Groove" by Madonna. Entering the shopping center parking lot. A group of Gen X-ers is sitting around an outside table, and one woman begins to bounce and pump her arms around. However, I couldn't see her eyes since she had sunglasses on, so it was difficult to tell if she was responding in mockery or not.
- "The Power of Love" by Huey Lewis & the News. Pulling into a parking space at the shopping center. I must have been blasting this one so loud that the couple in the pickup just to my right heard it through their closed windows. The woman's face lit up like a little girl whose lost teddy bear turns up out of the blue after weeks of fruitless searching. They were in their late 30s or early 40s.
- "Social Disease" by Bon Jovi. Out for a cruise on a 7 or 8-lane drag. A guy driving along my right side has his windows down just like I do, since it's summer. He looks over and, not wanting to get all gushy or anything, looks back ahead but starts bobbing his head back and forth. He was about 40 and Slippery When Wet must have been his cruising-for-chicks soundtrack when he was in high school. (He still had long hair in the back, and about top-of-the-ear length in front, not a mullet.)
- "Don't Dream It's Over" by Crowded House. I don't have this on CD, but when I saw Adventureland in the theater there was a group of five or six people in their later 30s sitting nearby. When this song comes on, it's set against the background of "summer love" and "summer nights," which is laying it on a bit thick. Still, one guy in the group couldn't help himself and started singing along -- not meta-ironically, but like it was streaming over the radio during the final few twilight weeks of the school year.
- "Children of the Damned" by Iron Maiden. Parking almost in front of an indie coffeeshop, where two groups of people are lounging on the outdoor patio. There's a group of two metalheads, about 35 years old, one of whom shrieks out "Children of the daaaaaaaamned!!!" and smiles when I get out of my car. In general, heavy metal fans are the most loyal, so the chances of this happening are high if you happen to find one, although they are small in total numbers.
- Several songs by Michael Jackson, especially from Bad. I don't have clear memories of any specific episodes here because it's pretty common, although the reactions tend not to be so memorable -- non-toothy smiles, a look over the shoulder, that kind of thing.
- "We Belong" by Pat Benatar. Pulling into the supermarket parking lot. Two guys in their mid-late 30s get out of their car about the same time, and one says "We belong to the night!" in a self-conscious vampire kind of voice.
- "The Ballad of Wendell Scott" by Mojo Nixon and Skid Roper. Here's a slight exception to the "college rock doesn't go well" rule, although it was all construction workers, who certainly never heard the song before. They were just glad to hear some fast-paced, yee-haw getaway music during their lunch break.
- "Heaven is a Place on Earth" by Belinda Carlisle. Slowly prowling around the shopping center parking lot, which is packed. Ahead about 30 feet are two Gen X women, like late 30s, walking with their backs to my car. Once one of them recognizes the song, she doesn't just look over her shoulder but spins her entire upper body around and is too transported to a better place to notice that her eyes are bugging out and her jaw has dropped wide open. Then just as soon as she does so, she regains her composure and they resume their stroll. She very willingly lost her virginity to this song over Christmas vacation, 1987, probably as a junior in high school with an older boyfriend who had come home from college.
Her hair is dark grey, and she's still wearing it in one of those shortish '80s hairdos -- not cropped like Madonna or in a bob, more like Sandy Duncan back then. She and the friend are dressed in contemporary yuppie uniforms, though. Looking at someone who was her age and so wrapped up in furthering her career, you'd never suspect that she was such a wild teenager. But that was back when youthful craziness was nearing its peak in the population, so even people who are by inclination more strait-laced were pulled into the larger youth rebellion.
- "Automatic" by Prince. Leaving a popular nearby park. A group of four gay men in their later 30s are passing in front, and one of them breaks into a kind of a shuffle halfway through the crosswalk. He gets really nervous and self-conscious, though, and tries to continue walking normally.
- "My Best Friend's Girl" by The Cars. Slowing down at a stop sign, and a group of pedestrians on the sidewalk (in their 40s) look over and smile a little bit.
- "Mediate" by INXS. Coming to a stop behind a line of cars at a red light. I'm next to the sidewalk, and two women in their late 30s are walking toward one of the main government buildings -- a courthouse or something -- and are in lawyer chick skirt suits. Being lawyers, they're not as cool as the girl who beamed to Belinda Carlisle, but one of them does turn her head over her shoulder and mixes "Hey, I remember that song!" with "Oh god, it's that song again..." Her high school boyfriend must have played Kick as his make-out music (or at least "Need You Tonight," which comes right before this one), and she grew a little tired of it after awhile.
It's odd that every week '80s night is packed with college kids, yet no young people ever respond to the same music outside the club. It seems like it's more of a goof for a good number of the people there, not something they really like deep down. Michael Jackson is different, of course -- everyone knows and likes his songs -- but overall music is not central to their lives.
And no, it has nothing to do with the fact that they didn't hear those songs when they first came out. Kids my age weren't born when "Bohemian Rhapsody" came out, but we heard it from the Wayne's World movie and loved it right away. Anyone who was in fifth grade when that movie came out still holds that as one of their "don't let it end" songs from their childhood, before everything started changing in middle school.
Unfortunately the most positive reactions have been from people who were at least a good five or ten years older than me, which doesn't bode well for finding people to relate to musically even within my age group. Most of your core musical tastes are in place by around 20, so in order to avoid taking indie / alternative / emo seriously (or nu metal, gangsta rap, etc.), you had to have been born before 1975 or so. Born from 1976 through 1985, you might have heard good music on the radio when you were a child, but it probably didn't play much of a role in your adolescence, when music matters much more. Born after 1985, there's nothing good left to hear even in childhood, let alone as a teenager.
Returning to the lack of response from older Baby Boomers, they were adolescents when rock music was still maturing, so they have a decent ear for it but let the peak stage pass them by as they entered their later 20s and 30s. If you were 20 during the mid-'70s when it really gets going, though, rock music of any kind still resonates really well with you. So people born between, roughly, 1955 and 1975 know the score and appreciate just about anything worth appreciating. They're the ones you want to talk to about music. Hopefully some of them are writing things down, since they're the only group that really knows anything. It would be a shame for them to enter their forgetful years in a few decades without having recorded what it was all about for posterity.
February 11, 2011
We're running with the shadows of the night
Who knew there were so many versions of this song? Thank god for YouTube.*
Here is the immortal recording by Pat Benatar. Searching for this one turned up another by Helen Schneider and -- I couldn't believe it! -- one by Rachel Sweet. There's even a 2008 hit recording by Ashley Tisdale for the Millennials.
Obviously the post-1992 version is the mediocre one of the bunch, but it's better than 99% of the junk during this period, whether original or covers. She has no emotional depth in her voice, which makes sense because as someone born in 1985 she came of age during falling-crime times and recorded it during these times as well. You need to feel close to the brink to get the right emotional inflection for this song. It sounds like she's playing dress-up or playing in a blackface show almost. And notice how in the video she hams it up by locking her legs and bending over Betty Boop style, as well as making a grinding motion with her hips. Too self-conscious and in-control-of-herself for a song about letting go and throwing yourself into the rush of life.
The Helen Schneider version is pretty good as far as the instrumentation and vocals go, but emotionally she sounds a bit too cocky (echoed by her facial expressions). This is what some people in the YouTube comments are picking up on when they say it may be a bit too masculine, but it's not masculine vs. feminine that matters, it's that this song isn't supposed to be one where you feel overly confident and secure.
That's why the Rachel Sweet and Pat Benatar versions come out the best. You need just the right balance between guts and anxiety to get into the action of this song. All other great songs in the genre, even the more masculine power-driven ones like Kenny Loggins' "Danger Zone" and Bon Jovi's "Livin' on a Prayer," strike this balance. If you feel too confident, there's no edge-of-your-seat tension about how the cosmic event will unfold, and if you're too insecure you'll just sit it out on the sidelines and let it pass you by.
I like Pat's version more than Rachel's because there's a bit more yearning in hers, like she's truly hungry to jump into one of life's Big Moments, which they emphasize in the video by showing her dreaming of flying into Nazi territory and kicking butt. Also there's the slight but powerful difference in the words to the chorus: Pat says "running with" the shadows of the night, while Rachel says "running through." "Through" makes it sound like the surroundings are static, and that the two people are above-it-all, like superheroes who move so fast that everyone else seems to be frozen in place. "With" makes it sound like the surroundings themselves are not just alive but racing along too, a nice touch of the pathetic fallacy to strengthen the turbulent drive of the song.
All this writing makes me want to go out for a cruise.
* We still need to be careful when blindly praising new technologies like this, since if it were 1983 I would be spending lots of my free time in a variety of record stores and talking to people who knew everything, and so would probably have had the same chance of finding this information out and for roughly the same search cost.
Female bonding movies absent during violent times, surge during safe times
In the comments to an earlier post about the degradation of romantic comedies into chick flicks during falling-crime times, someone mentioned the rise of a certain type of chick flick that I hadn't thought of -- the female bonding movie. That's probably the most extreme example of a cultural work that depicts sex segregation.
I googled around for a list of I-can't-even-write-it kind of movies, and found this one with 201 entries. Aside from two movies from 1937 and 1945 (both part of the falling-crime era of 1934-1958, by the way), all of them are from 1980-2010. I don't know how recently it was updated. Here is how these 199 movies were released over time:
There is a consistently low level during the high-crime times of the 1980s and very early 1990s. After the crime rate peaked in 1992, the culture began sissifying itself, and part of that was the re-segregation of the sexes, as I explained in the romantic comedies post. It's like after the end of the Jazz Age (another piece of soaring crime history), when females began to shut themselves in the kitchen and felt more like vacuuming the living room carpet in high heels rather than driving cars around to look for boys, dancing to wild music, and smoking and drinking. Sure enough, the recent falling-crime period is when this sort of movie exploded in numbers.
The picture would be more accurate if I weighted each movie by how much money it brought in at the box office, adjusted for inflation. That would tell us how much money people paid in a certain year to see this genre in the theater, not just how many such movies were made. But that's extra work, and the picture is already clear.
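For anyone who wants to redo this kind of tally, here is a minimal sketch in Python of both the simple count and the box-office weighting just mentioned. The file name and column names are hypothetical placeholders rather than the actual list I worked from, and the inflation adjustment is assumed to have already been baked into the "adj_gross" column.

import csv
from collections import Counter

counts = Counter()    # number of movies released per year
weighted = Counter()  # inflation-adjusted gross per year (optional refinement)

with open("female_bonding_movies.csv", newline="") as f:  # hypothetical file
    for row in csv.DictReader(f):  # assumed columns: year, adj_gross
        year = int(row["year"])
        counts[year] += 1
        weighted[year] += float(row.get("adj_gross") or 0)

for year in sorted(counts):
    print(year, counts[year], round(weighted[year]))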
These movies are one counter-example to the tendency during falling-crime times for people to become infantilized, as they perceive a safer and more predictable world and put everything off until much later. Even pre-pubescent girls have some interest in boys -- especially the ones I went to school with when crime was soaring, but even the ones today. No, the social world inhabited by the characters in these female bonding movies is more like that of a post-menopausal woman -- not wanting much excitement, seeing only a small number of people socially, and all of them females instead of noisy, smelly, and bothersome males. Again, even elementary school girls want to mix in with the boys to some degree, even though they too think we're made of slime and snails and puppy-dog tails.
Another big reason why females bond a lot more during falling-crime times is that they've abandoned their promiscuous ways from rising-crime times. It's hard to bond when you're competing more aggressively for male attention, but easy when you respect each other's claims on who belongs to whom.
But just wait till the crime rate inevitably enters the upward phase of its cycle in the next 5 to 25 years, and girls will realize that they can't get through real life by sequestering themselves away from boys, and that it's not enough to just keep a few gay pet-friends. Plus we'll happily see an end to these menopausal group-hug movies and a return of out-of-control girls eager to play with the boys.
February 10, 2011
Will we fall into dystopia because of our sin or corporate/political evils?
During an informal chat today where two of us were recommending movies to a third, the other recommender remarked that Back to the Future Part II was "so bad" that lots of people stayed away from Part III when it came out. I reflexively defended Part II, but I hadn't thought in detail about what makes it so good. It was not as great as the first, but it's still a very good movie, and now that I've been forced to think of reasons why, it seems that it's because it's a dark and reflective middle movie between two more uplifting and carefree movies. Kind of like The Empire Strikes Back or the second movement in the Emperor Concerto.
But wait a second -- I usually hate dystopian movies. What makes all those other ones so bad, at least as far as the dystopian theme goes, however well they may work on other levels? I'm thinking of Blade Runner, The Matrix, etc. (I didn't see the entire thing, but I've heard V for Vendetta was like these as well.) These stories go wrong in showing a world that is awful because there is some kind of foreign parasite making the host population sick, almost always some kind of corporate interest or totalitarian political group, although sometimes it really is a foreign species like Planet of the Apes.
The conquered-by-invaders ones aren't that bad because they're real -- have human groups never been enslaved by invaders before? Anything that pushes our Us vs. Them buttons is already halfway to being a captivating story. Unfortunately we don't define our tribes or ethnic groups as citizens vs. rulers or workers vs. managers, so the corporate/political dystopias don't work that way. Well, if these parasites didn't land from outer space or something, just where did they come from?
It works like a deus ex machina, only it's an unexplained and implausible source of ruin dropped in at the beginning of the story rather than the end. At best, it's something smug that barely rises above this kind of initial narration:
When those couch potatoes in Kansas finally managed to vote the Republicans into control of the House, Senate, and White House, it only took three years before we ended up in the world you see here. Yeah, I know -- real nice place, isn't it? But mister, you ain't seen the half of it...
The unchecked corporate greed stories aren't any more convincing -- why are they given free rein? Well, the politicians fell asleep at the wheel, or were bought off by the corporations, or the masses were brainwashed by laissez-faire propagandists, etc.
What makes these corporate/political stories fail is that they project the inborn human tendency toward sinning onto some evil, caricatured Other. The solution to our problems is clear and calls for no deep change to ourselves -- we just have to band together and drive out the Other, then evil will go away. If only we can rein in those greedy business owners, if only we can vote those Republicans out of office, if only we can bump off the politburo.
Back on planet Earth, we move closer toward dystopia because of the average person's greed, pride, hubris, ambition, and other flaws, both subtle and gross. Tally these sins up across the great mass of normal people, and soon quantity has a quality all its own. Corporations have to operate within the limits, whether written down or tacit, that the multitude sets. If the general population is highly vindictive, then managers cannot just fire whoever they want. In India, that could get the manager killed by angry workers. The same goes for political rulers. In a democracy, going against the general will would just get them voted out of office. In a pre-democratic society the well-to-do and a good deal of the commoners are armed and will simply kill off any leaders who aren't to their liking.
But if the multitude itself continues to indulge in its sins, what's going to stop them? Each person can change their behavior, of course, but there's no outside check on their course toward damnation. Obviously this is the path toward ruin in the real world, and the best dystopian or apocalyptic or post-apocalyptic stories take this for granted. There is a moral message that you yourself have to change (and so do all other sinners) in order to avoid the nightmare future. It's more painful for us to swallow, but we are aware that it's for our own good and appreciate stories that go this route instead of patronizing us by saying that we've been brainwashed by corporations or politicians, that we just have to wake up from the lie and things will be all hunky-dory.
Starting with Back to the Future Part II, how did the alternate 1985 -- the ruined one -- come to be? Marty himself, a normal and usually well behaved guy, gave in to temptation and decided to use the time machine for greed, although Doc Brown throws away the sports almanac once he learns about it. Without that decision, old Biff would never have known about the time machine or gotten the idea to do the same thing to enrich his own younger self. There were no corporations that tempted Marty into it, no politicians who regulated him into it, no academic experts who lied to him, or anything like that. Just plain old human sin.
In fact, when he goes back to 1955 to recover the almanac from young Biff, does he count his blessings that Biff won't be able to take over the town, and then go ahead with his original plan? I mean, hey, he's not an evil guy like Biff, so what harm could come of him giving the book to himself in 1985? By the end of the movie, Marty's learned the lesson of checking his greed and hubris about how under-control he has things; he decides to burn the almanac so that no one may alter history by becoming a gambling big-shot. A less moral person would still arrogantly say, "Hey, as long as it doesn't fall into the wrong hands..." But with something that powerful, everybody's hands are the wrong hands.
All of the other good dystopian stories share this focus on the sins of the average person. Long before movies, there were the cautionary tales from mythology and folklore, then later the emphasis on sin and the apocalypse within near Eastern monotheistic religions, and finally the Faustian bargain and Paradise Lost stories toward the end of pre-industrial Europe.
Once industrialization begins, things get messy because suddenly there is a vast bureaucracy and huge corporations for the writer to blame if they wanted to. As I've explained at length before, during rising-crime times we go into tragic mode and during falling-crime times into trivial mode. So the most bleak and dystopian stories from the Victorian era, when the homicide rate was plummeting, will be of the blame-the-evil-Other sort. Sure enough, that's Dickens' picture of industrial England in a nutshell. If only the corporate masters and political rulers would do the right thing, the masses would be fine.
The next great period of falling crime was roughly the mid-1930s through the late 1950s, and dystopias from that time are also pretty bad. Nineteen Eighty-Four, which everyone creams their jeans over, doesn't focus enough on the inborn tendency toward sin in every human being; it's more of the "boo totalitarians" stuff. That book, and the movie Brazil, do show wonderfully just how degraded language becomes when we embrace bureaucracy, but even here they put too much blame on the Ministry of Bla and not enough on common speakers. In reality, it's the average speaker who is making our language more politically correct and clogged with indecipherable acronyms: they're perfectly free to speak otherwise.
Thank god that violence started spiraling out of control throughout the Western world from roughly 1960 to 1990, or else we wouldn't have gotten the rude awakening that we so sorely needed about what truly leads to dystopia. (There was a smaller and less widespread wave of violence from roughly 1900 to 1935, and we did learn some lessons there, for instance the hubris of the ordinary and eager men who enlisted for World War I.)
So let's briefly remind ourselves of what the good recent dystopian stories were, and introduce them to those who never saw them. (You never know with Millennials -- last semester I made a reference to the original Dirty Harry movie in a seminar on violence, and none of the other grad students had seen it.)
- The Terminator and Terminator 2: Judgment Day. Well-meaning normal people make more and more sophisticated machines, and in the background well-meaning normal consumers crave these products. Before long the machines become self-aware and in control of themselves, and proceed to wipe out most of the human race.
- Alien and Aliens. Film nerds put these in the "corporate dystopia" category, but they must have never watched them. There is zero focus on corporate greed, no portrayal of blond-haired blue-eyed executives who just don't care about the costs, etc. It's true that corporations want to terraform distant planets and bring back exotic species to make a profit, but that's not what gets the colonists and marines into a world of shit. Rather, it's the same hubris that led WWI soldiers into the trenches, and Vietnam soldiers into the jungle. There's a great scene in Aliens where Hudson is bragging on and on about how sophisticated the weaponry is among his team of "ultimate badasses." Same with the colonists, although they are not shown -- the planet looks forbidding, but hey, it's nothing that a little engineering and men in white coats can't handle. Both learn too late that they'd bitten off more than they could chew and were not humble enough when they began their adventures.
- Total Recall. The dystopia on Mars is like that of the Alien series. There's also the dystopia of the lotus-eating public who'd rather live in a dream world rather than face the joys and sorrows of the real world. Again the corporate stuff is minimal -- it's clear that their actions are all driven by what the average person is craving and dying to pay for.
- RoboCop. This is another one that film nerds throw in the "corporate dystopia" bin without thinking about it. While the police have been privatized, it's clear that this was a response to the desires of the multitude -- they'd seen crime growing out of control and put their faith in the men in white coats: All We Have to Do is privatize the police, and bingo, they'll use a little engineering and solve the problem of crime. A Clockwork Orange goes in this direction as well.
Then there are contemporary dystopia movies, which weren't very hard to imagine once it became clear during the 1970s and '80s that all the various attempts to tame the violence had failed.
- Taxi Driver. Aside from a few new pieces of technology like refrigeration and cars, not to mention somewhat different clothing styles like pants with buttons, a lot of this movie looks like Jesus of Nazareth. There is virtually no whining about politicians, pointy-headed academics, corporate bosses, Hollywood executives, or whatever. New York City is spiraling down the toilet, and it's all their own fault for indulging in such sinful choices. It's not as uplifting as the story of Jesus because only one person (aside from the anti-hero himself) is saved, and we don't get the sense that more and more people will start to change themselves in order to pull the city away from the brink of damnation.
- The loose cannon / vigilante crime movies, such as the Dirty Harry and Lethal Weapon series. Here there is a focus on how broken the system is, particularly how handicapped it is by red tape -- Miranda rights, diplomatic immunity, and so on. However, they don't blame this on a bunch of renegade politicians, but instead on the larger popular zeitgeist that had begun to move in the more liberal, rehabilitating direction during the days when the majority voted for Great Society politicians. This is not like the Dickensian poo-poo-ing of the ideological or party enemy, since the Lethal Weapon series was made well into the Reagan and then the Bush administrations.
- Most dead teenager horror movies. The ones who lead more sinful lives get butchered by a maniac, while the ones who are virtuous survive -- simple as that. The Friday the 13th movies, Carrie, and on and on and on.
- Abandoned children movies, popular when divorce and child abuse were more commonplace. Some of the dead teenager horror movies fall into this category, where the kids are picked off not necessarily because they've sinned but because they're vulnerable after being left unprotected or thrown out by their sinful parents and careless grown-ups in general. Not because of corporate greed or totalitarian rulers. The original Nightmare on Elm Street is more in this vein, even though it has elements of the "they sinned and must die" type too. A Nightmare on Elm Street Part 3 is even more in the Hansel and Gretel direction: it features a bunch of kids whose parents push them out into a mental ward because of their embarrassing personality disorders, and most of the grown-ups look the other way when they cry out that they're being hunted in their dreams.
I'm sure there are more, but that's good enough. Since the crime rate started declining after 1992, we've had mostly morally stunted dystopian stories, and we'll have to wait until the crime rate shoots up again to get more good ones. But at least there are enough already in existence to rely on for guidance in the meantime.
February 8, 2011
The dissolution of the romantic comedy and the spread of the chick flick: A case study of growing male-female segregation
Males and females relate more closely with each other -- sometimes non-sexually -- when the violence level begins to soar, whereas they pull apart from each other when the world gets safer. Some examples of this pattern that I've detailed before are that young people have had more sex-segregated social circles since roughly the mid-'90s, they don't mingle with the other sex very much even where they're supposed to like a house party or a dance club, many girls' only male friends are gay, and they use phrases like "bros before hoes" or "we came together, we're leaving together."
During the '60s through the '80s, it was the other way around, and it wasn't just that guys began hounding a bunch of unwilling females into hanging out -- those girls were incredibly boy-crazy and wanted to spend lots of time around boys. In rising-crime times, girls have a greater need for male protectors, they become more promiscuous, and they have a higher level of excitement-seeking, which requires that they hang around boys more -- girls being the boring sex. Naturally the guys are only too happy to oblige them.
I have a hard time explaining how closely overlapping boys' and girls' culture used to be during dangerous times, and how separated they are now. One easy way out is to just show them movies, TV shows, or whatever, from both eras and look at how different the portrayal of reality is. In teen comedies, there is only minimal sex segregation in Fast Times at Ridgemont High (1982), Weird Science (1985), and Saved by the Bell (1989-'93); a good deal of it in Dazed and Confused (1993), and Clueless (1995); and near total segregation in American Pie (1999), Mean Girls (2004) -- minus the gay friend, who doesn't count as a guy friend -- and Superbad (2007).
A more convincing, although less vivid, way is to look past the portrayal of boys and girls hanging out together or remaining apart, and ask whether the males and females in the audience -- real people -- were brought together by the movie or not. Of course, war movies are designed for males, and hey-world-getta-loada-me movies are meant for females (and gays), so we have to find a genre that could at least potentially bring the sexes together. That is the romantic comedy -- there's enough raunchy or slapstick humor, and maybe even some action or adventure, to keep the guy happy, while for girls there's enough focus on social relations, sharing feelings, and triumphing over that other jealous ugly bitch to snag the dream guy.
Before getting quantitative, even the most fleeting glance at romantic comedies from the rising-crime times of 1959-1992 vs. those from the falling-crime times of 1993-present throws their differences into stark relief. Indeed, the more recent examples of the genre are more appropriately called "chick flicks," a phrase that doesn't even show up in Google's vast archive of digital books until the mid-'90s (the first instance in the NYT is 1995). I don't know what the guy-oriented versions like Knocked Up are called -- doofus comedies?
At any rate, these types that are so strongly geared to one sex while making the other gnash their teeth were virtually unknown among the popular examples from rising-crime times, right up through When Harry Met Sally in 1989 and even My Cousin Vinny from as late as 1992. There's only one great romantic comedy from falling-crime times, and it only missed the cutoff by one year -- 1993's Groundhog Day (perhaps the best example of the genre). It's also important to remember that in the later part of the 1970s and '80s, a lot of hit movies successfully combined a variety of genres in a way that has totally died during falling-crime times. So there are a lot of romantic comedies that you might not recognize as such, given that the primary genre may be action or science fiction -- Ghostbusters, The Karate Kid, Back to the Future, Stripes, Spaceballs, The Princess Bride, and even Labyrinth to some extent.
Now for some hard data. Box Office Mojo has a romantic comedy category sorted by how much money they made, not adjusted for inflation, which makes most of them recent movies (at least at the higher ranks). So rather than look at all of the nearly 400 entries, most of which were made after 1992, I took the top 50 from the earlier 1979-1992 period and the top 50 from the later 1993-2010 period. This makes the sample sizes equal, and it allows both periods to showcase their best -- or at least most popular-at-the-time -- examples.
Using box office revenue to select which movies to include is better for our purposes than to use something like critics' ratings. I'm making a claim about what resonated with people, and what was widely consumed, not necessarily what critics like the most. To see how well the movies appeal to males and females, I took the average score for each sex at the IMDb website. Below I show the change in the distributions for these scores separately by sex, for the male-female difference, and the change in the male-female correlation.
First, how each sex rated movies from the two time periods. Females are the left two graphs, males the right two. The rising-crime period movies are on top, the falling-crime ones below. I multiplied the 1-10 scores at IMDb by 10 to make it easier for me to enter into Excel (no decimal points).
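To make the bookkeeping concrete, here is a rough sketch of how one could compute the per-sex averages and the female-minus-male gap discussed in the next few paragraphs. The file name and column names are hypothetical stand-ins, not the actual spreadsheet I used; it assumes one row per movie with the average IMDb score by sex on the original 1-10 scale.

import csv
from statistics import mean

rows = []
with open("romcom_imdb_scores.csv", newline="") as f:  # hypothetical file
    for r in csv.DictReader(f):  # assumed columns: title, year, male, female
        rows.append({
            "title": r["title"],
            "year": int(r["year"]),
            "m": float(r["male"]) * 10,    # 1-10 score times 10, as in the text
            "f": float(r["female"]) * 10,
        })

early = [r for r in rows if 1979 <= r["year"] <= 1992]
late = [r for r in rows if 1993 <= r["year"] <= 2010]

for label, grp in (("1979-1992", early), ("1993-2010", late)):
    m_avg = mean(r["m"] for r in grp)
    f_avg = mean(r["f"] for r in grp)
    gap = mean(r["f"] - r["m"] for r in grp)  # average female-minus-male score
    print(label, round(m_avg, 1), round(f_avg, 1), round(gap, 1))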
The average male score did not change (60.4 to 60.6), while the average female score did somewhat (63.4 to 65.1). If anything, females have benefited more on average from newer romantic comedies. Also look at how much wider the distributions for both sexes became -- there are a lot more movies that males rate really high and really low, and the same for females. That suggested to me that those rated really high were geared toward one sex (the one that rated it really high) and bored or disgusted the other sex (the one that rated it really low). That would predict that the male-female differences would be larger for the recent period.
Sure enough, that's just what happened. Here is the average female score minus the average male score, which shows how female-oriented the movie is:
The average F-M difference from the earlier period is 3.0, and 4.5 from the later period. That may be only a 0.15 difference in the original 1-10 scores, but it's still there. Also look at how much wider the later distribution is: compared to zero instances from the earlier period, there are a good handful of movies that females rate on average 1 or 2 points higher on the 1-10 scale, as well as a few that males rate higher than females (unusual for the genre).
Another way to see the greater divergence is to look at the correlation between male and female scores for a given movie:
The points in the earlier period are much more tightly bunched around the line of best fit, meaning that you could almost perfectly predict male scores from female scores, whereas there's a more noisy relationship in the later period. The correlation for the rising-crime period is +0.95, while for the falling-crime period it is "only" +0.88 -- still very high, but at the same time showing how males and females agree less than they used to about how good a romantic comedy movie was. Also remember that even small differences in the average can have big effects in the tails of a distribution, like movies that are very polarizing.
The red line tells us what the predicted female score is, given how the males scored it. Look at the right side of the red line -- this is where males scored it above-average. If the dots are above the line, that's good news -- females liked it even more than expected -- while if they're below, that's bad news -- males like it a lot and females don't really dig it. Then look at the left side of the line, where males scored it below-average. If the dots are below the line, that's good news -- females also thought it was terrible, but were even harsher -- while if they're above, that's bad news -- males hated it and females liked it.
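The red line itself is just an ordinary least-squares fit of female score on male score, and "above" or "below" the line is the sign of the residual. A sketch of that calculation, reusing the hypothetical early and late lists from the snippet above, might look like this; the outlier threshold is an arbitrary illustration, not the cutoff I actually applied.

from statistics import mean

def fit_and_flag(group, label, threshold=10):  # 10 here = 1 point on the 1-10 scale
    xs = [r["m"] for r in group]
    ys = [r["f"] for r in group]
    mx, my = mean(xs), mean(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    print(label, "predicted female =", round(intercept, 1), "+", round(slope, 2), "* male")
    for r in group:
        resid = r["f"] - (intercept + slope * r["m"])  # positive means above the line
        if abs(resid) > threshold:
            print("  outlier:", r["title"], round(resid, 1))

fit_and_flag(early, "1979-1992")
fit_and_flag(late, "1993-2010")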
There isn't very much bad news in the earlier period, which agrees with everything else we've seen -- romantic comedies brought the sexes together. But look at the later period: out of just 50 movies, there are 3 big outliers. One of them, Norbit, is scored equally poorly (3.7) by males and females. But the other two are ones where males were put to sleep while females at least stayed awake or even liked them -- Sex and the City 2 (no surprise: 3.5 among males vs. 5.3 among females), and, yep, Sex and the City the first one (4.8 among males vs. 6.9 among females). The data do not lie -- that whole franchise embodies just about everything that's gone to hell in male-female relations.
Every way of looking at the numbers shows the same overall picture: while romantic comedies used to be an agreeable common space for guys and girls, the plummeting level of violence has predictably driven men and women farther apart, with most of the gains going to females. The fact that we have no phrase for the male version of a chick flick tells us how much less common a male-oriented romantic comedy is compared to a female-oriented one.
Don't get me wrong: I'm not going to agitate for more male-oriented romantic comedies just to "even the score," like those insecure ads from Burger King or the Miller Lite man laws campaign. In fact, anything using "man" as an adjective proves how castrated it already is -- you don't need to refer to a '78 Trans Am as a "man car" because it speaks for itself. Only something limp like a messenger bag stands in need of a defense like being called a "man bag" or "man purse."
No, I've been sick of the segregation of the sexes ever since it began in the early-mid-'90s. It began as a heated war that cooled into mere isolationism, but it's all the same to me and anyone else who remembers how much boys and girls used to play with each other, when the world was still growing more dangerous and unstable and uncertain. I'll only be happy again when we return to that carnivalesque thrill of co-mingling -- Leslie Burke, Pat Benatar, Kelly Kapowski -- not if guys end up winning some tiresome battle of the sexes. Although even that would certainly be better than today's world where the mixed-sex spaces like romantic comedies, which used to please both guys and girls, have become biased toward the more boring sex.
February 7, 2011
The great stagnation of inventions, in two charts
Tyler Cowen wrote an article about stagnant growth in the standard of living for people in the developed world, which he develops at greater length in an e-book that I have not read, called The Great Stagnation (link in the article). If you search the Marginal Revolution website for "TGS," you'll find lots of links to responses to his overall thesis, namely that sometime in the mid-1970s the dizzying growth in our standard of living that began with the industrial revolution had started to plateau.
I have not read all of them, but have browsed them, and read a fair number of comments in those posts. A lot of the focus is on technological innovation, admittedly just one piece of how well we have it, but still an important one -- and certainly a more easily quantifiable one. Cowen's critics grasp for recent examples of new gadgets that are surely just as dazzling and enriching as yesterday's doodads were.
However, nothing that I've read has made that explicit comparison -- is the iPhone really as revolutionary, against the recent background, as the telephone was, against its recent background? No way, Jose. Most techie geeks, like geeks in general, are completely uninterested in history, and so lack any frame of reference for today's stream of novel doohickeys. It should be obvious, and Cowen in that NYT article even points out several such improvements in the standard of living that totally swamp the adoption of the internet or Skype or containerization -- electricity, the automobile (the railroad for that matter), and so on. Always seeking shelter from the real world, techie geeks are incapable of appreciating how much greater life became after the introduction during the 19th C. of something as seemingly mundane as matches.
To quantify the obvious, I went to the book 1001 Inventions That Changed the World (see the link for details on who wrote it and their methods, or pick up a copy). I tallied the number of inventions important enough to make it into this book and smoothed these data with a 10-year moving average. For most of recorded history before the industrial revolution, there's a very low and steady rate of inventions made each year, so I only plot from 1700 onward (click to enlarge):
You can clearly see how runaway the rate of invention was when the industrial revolution got going. However, even by 1900 that rate had leveled off -- the peaks after 1900 are not taller than the one around 1880, and the troughs are not consistently or substantially higher than the one around 1890. Still, you can see a mini-boom in inventions from about 1920 to 1970 -- the age of the transistor, the computer, and most of the things we consider hi-tech. There is also a very clear steady downturn starting around 1985 -- probably not coincidentally after Bell Labs, a powerhouse of 20th C. invention, had to close down when AT&T was busted up.
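For the curious, the tallying and smoothing behind that first chart amount to something like the sketch below. The list of invention years is a made-up stand-in for the book's data, included only so the snippet runs.

from collections import Counter

invention_years = [1712, 1769, 1776, 1807, 1837, 1876, 1879, 1885, 1903, 1947]  # illustrative stand-in

per_year = Counter(invention_years)  # inventions listed per year

window = 10
smoothed = {}
for y in range(1700, 2011):
    recent = [per_year.get(y - k, 0) for k in range(window)]
    smoothed[y] = sum(recent) / window  # average of this year and the 9 before it

for y in (1875, 1880, 1885, 1890):  # print a few sample points
    print(y, smoothed[y])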
As Cowen mentions, even during periods of long-term increase in inventions, there are still cycles up and down around that trend. Nevertheless, it looks like even that upward secular trend has topped out by now, and we will only be cycling around that plateau level.
By the way, the recent decline cannot be blamed on its taking a while to recognize monumental inventions. First, there are decline periods far into the past as well, so we don't need to appeal to a "blindness to recent inventions" bias. Second, the bias obviously goes the other way -- techie geeks who compile lists of the Top 100 Awesomest Inventions always over-hype the very recent past. Therefore, such lists will always give a fair number of false positives for the recent past -- declaring something important when it will turn out to be trivial -- but never a false negative -- overlooking something that is earth-shaking.
Another way to understand it is to ask, How far would we have to zoom out in time in order for The Great Stagnation to disappear? Instead of lumping inventions in year-by-year bins, I lumped them into decades -- still the recent decline shows up. How about half-century bins? Now there is no recent downturn, but a plateau effect still remains. 75-year bins? No again. Only when we look at how many inventions were made per *century* does the growth look as strong during "recent" times compared to the next-most-recent times (i.e. growth during the 20th compared to the 19th C.):
If you squint, you can see that even here the slope is a little shallower for the most recent period than the next-most-recent period, but it's close enough to declare no plateau, let alone a decline, in recent times. However, this is not what the critics are arguing -- that 20th C. growth was just as impressive as 19th C. growth. They're trying, and failing, to show how frenetic growth has been during the past several decades compared to the earlier decades.
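The rebinning exercise is just as simple to describe in code. Reusing the hypothetical per_year counts from the sketch above, one could lump the same tallies into bins of any width and ask whether the last bin still falls short of the one before it:

from collections import Counter

def rebin(per_year, width, start=1700, end=2010):
    bins = Counter()
    for year, n in per_year.items():
        if start <= year <= end:
            bins[(year - start) // width] += n
    return [bins[i] for i in range((end - start) // width + 1)]

for width in (10, 50, 75, 100):
    print(width, "year bins:", rebin(per_year, width))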
A final, non-quantitative way to appreciate how little things have changed in the big picture since the 1970s or '80s is a thought experiment: think of some invention from after the cutoff year and ask how radically it could have changed history if it had been introduced 100 or however many years earlier. Then do the same for an invention from a bit before the cutoff date. Better yet, take a handful of inventions on either side and compare which set could have more dramatically altered the course of history. Sending cell phones into a world with only landline phones (not even cordless ones, with no answering machines, and before pay phones were everywhere for on-the-go communication) would not change things as radically as sending the telephone into a world with only letter writing and no high-powered means of transportation to deliver the letters.
Or imagine traveling back in time and showing the average person the shiny new toys you've got in the future, and ask how amazed they would be. Even as recently as 1985, when Back to the Future was made, someone living 30 years earlier would have been astounded by a portable music player (or "personal stereo," as Walkmans used to be called) and a camcorder, which Doc Brown calls something like a portable television studio. (The effect would be even more dramatic if you sent 1955's technology back to a 1925 consumer, of course.)
But then take something fairly common from 2010 like a phone that can take pictures on-the-go and show it to someone from 1980 -- meh, that's kind of neat, but we can already make phone calls just about anywhere and anytime we want, and we already have affordable cameras and even instant cameras. It's a cool improvement, but we were hoping for flying cars, X-ray specs, virtual reality -- something in the realm of "just imagine the possibilities."
There's a lot more to say about the topic of stagnation since the '70s or '80s, and you can search this blog for "Malthusian" to find some relevant earlier posts. But the rate of trailblazing invention isn't too hard to quantify, making that prong of Cowen's argument a rather open-and-shut case. Still, it's worth illustrating both with graphs and qualitatively, since most people who get worked into a tizzy over the stagnation idea have no conception whatsoever of what life was like much before their own childhood.