Over the past couple weeks I've been checking out what's available for little kids to get into the Halloween spirit. My nephew will soon be old enough to have scary stories read to him and go trick-or-treating, so I want to see exactly how sissified the holiday has become in order to counteract it.
It's even worse than I thought, and I already knew that it was going to be pretty wussy. First, there are virtually no scary story books on sale in the children's section of Barnes & Noble. All of the books on display for Halloween show monsters as goofballs, harmless and bumbling dorks, or similar characters from today's kid culture who happen to be wearing costumes (like the Go Diego Go books). Nothing eerie or spooky, let alone frightening -- in either the text or the pictures. The same goes for TV shows and movies on sale: they portray an adorable and safe holiday rather than one where you're going to get an exhilarating scare.
Still, this research triggered lots of memories of Halloween during the dangerous high-crime times of the 1980s, back before it was hijacked by aging women dressing like sluts and virgin-for-life guys struggling to think up the most uniquely meta-ironic costume in the hopes that it'll finally get them laid with that indie chick who works at the local independently owned coffeehouse. No, back then it was a carnivalesque night for children rather than an overly self-conscious status contest between adults.
In order to keep that original spirit alive, here are some books, music, and TV shows and movies that will protect any little ones who you have influence over from the growing apathy and cutesiness that's been killing off Halloween as a kids' holiday.
Music
When I handed out candy in high school I used to play early albums by The Residents from a front window of the house, but looking back on it I was just doing "weird for the sake of weird." I don't think kids really found it creepy. This year I'm going with the soundtracks for A Nightmare on Elm Street, Labyrinth, and Twin Peaks (and maybe a few tracks off the Fire Walk With Me soundtrack, which isn't nearly as good). People complain that they sound "dated" because of all the synthesizers, but that's like whining about hearing Scarlatti played on a harpsichord. At any rate, synths are inherently spooky-sounding because they lie in the "uncanny valley" between clearly organic sounds and clearly artificial sounds.
The soundtrack for Labyrinth is the most upbeat and kid-friendly one, yet there's just enough sorrow in David Bowie's voice and other-worldliness in the scored music that it will ease them into the Halloween atmosphere. The one for Twin Peaks is the most haunting, both because of the brooding synth compositions by Angelo Badalamenti and the ghostly yearning tone of Julee Cruise's voice. Of course the one for A Nightmare on Elm Street is the most unsettling because of the heavy use of percussion, which calls to mind stalking footsteps, heels striking the ground while running away, windows and doors slamming shut, and things falling or being dropped unexpectedly that give you a jolt. There are lots of plodding, yawning synth lines that give it a surreal and dreamy feel, plus gently chiming sounds that suggest a child's music box or glockenspiel, heightening the contrast between the worlds of the safe and the dangerous.
Books
In a Dark, Dark Room and Other Scary Stories by Alvin Schwartz. He popularized a lot of adult folklore and urban legends for children, including in this book a version of "the vanishing hitch-hiker" and an otherwise normal girl who always wears a green ribbon around her neck. This is the only scary book you can get in the B&N children's section, and for only $6 hardcover. When I saw the cover, I vaguely recalled the title story, but it was the darker one about the green ribbon girl that really woke up the memories I had of being spooked out when someone read this to me in elementary school (probably the school librarian). Three of the seven stories prominently feature death, but kids have to learn about that some time.
Scary Stories to Tell in the Dark (series), also by Alvin Schwartz. You can find watered down versions of some of these books at B&N, but they've junked the original illustrations by Stephen Gammell and replaced them with boring pictures by someone else. Gammell's disturbing, nightmarish drawings are half the fun of this series (see for yourself with a google image search), so you'll have to get the older ones from the '80s and early '90s. There's a newer boxed set of all three books, though I can't vouch for that. There's also an audio book version (most of which can be heard on YouTube) that stars a wonderfully creepy story-teller.
Halloween Poems by Myra Cohn Livingston. Actually the poems here are pretty dopey, but the kids will ignore them and focus on another unnerving set of illustrations by Stephen Gammell.
Thump, Thump, Thump! by Anne Rockwell. This tells the old "who stole my hairy toe?" folktale, but unlike other presentations of it the drawings are scarier (even if cartoony). The color palette is a dreamlike restriction to grays, blues, and oranges, and the desolate rolling landscape looks like it would in a kid's nightmare. It won't seem scary at all to a grown-up, but I remember the tension I felt every time my mother read this to me as a pre-schooler, and the only customer review at Amazon said the same thing. Unfortunately it's been out of print for nearly 30 years, so unless you feel like plunking down $40 for a beaten up copy or over $100 for an intact one, you'll have to find a local library that has it.
TV shows and movies
I'm only talking about ones that are explicitly about Halloween, not just scary movies in general. That's certainly a good idea, too, but you probably don't need any help remembering what the great horror movies are.
Halloween is Grinch Night by Dr. Seuss. You can find this at B&N on the Seuss Celebration DVD, which is a compilation of most of the Dr. Seuss made-for-TV specials of the 1970s. Most of it is a quiet, tension-building series of songs and shots of a small community locking themselves in on a frightening evening; the climax shows a young boy confronting a gauntlet of surreal horrors inside the Grinch's wagon.
Before the '90s explosion of helicopter parenting, my public library used to show this near Halloween in a small theater they had in the basement, at the bottom of winding stairs. When I watched this at age 7 or whatever it was, that climax was one of the scariest things I'd ever seen -- and this is coming from someone who during elementary school saw The Elephant Man, Night of the Living Dead, Aliens, Terminator, the Nightmare on Elm Street movies, Child's Play, etc. When I watched it again recently, it wasn't very scary, so perhaps this is one that needs to be seen while staring up at a menacing big screen.
The early Treehouse of Horror specials from The Simpsons are spooky enough, and you can buy those separately from the complete-season DVDs. Both the show in general and the Halloween specials in particular started sucking around the same time in the mid-late '90s, but anything before that is great.
The original Halloween movie needs no reminder, I hope. I watched it again last year, and my heart still races every 20 minutes, especially during the many scenes where she's locked outside and pounding and screaming at some clueless kid to let her in, while Michael Myers is closing in on her. Don't be so afraid of showing this to little kids -- like I said, I and most of my friends saw all kinds of frightening movies before we graduated 3rd grade. Hell, at age 10 I even got to see Predator 2 in the theater, where I pestered my dad into sitting in the front row.
I know some of you are thinking of The Nightmare Before Christmas, but I didn't even find that one spooky when I saw it projected on the big screen. To me, most of Tim Burton's attempts at fright are too self-conscious, like the guy in high school whose slouchy posture is more of a pose for attention than a congenital deformity we can truly believe in. Batman is full of awesome thrills precisely because it doesn't pause so often to make puppy-dog eyes at the audience and ask to be held. Even the single "Large Marge" scene from Pee-wee's Big Adventure is scarier than all of The Nightmare Before Christmas. It's presented matter-of-factly, and you feel like it could really happen. His later movies lost that believable campfire story-teller vibe, feeling more like you're joining some emo geek to dress in all black and read Edgar Allan Poe aloud in the neighborhood graveyard. Too affected.
These are just a few examples that sprang to mind while I was out browsing. I'm sure there are plenty that I either forgot or never experienced in the first place. So what else is there to shield kids from the adorablization of Halloween? Maybe something by Edward Gorey. I doubt elementary school kids would get the text, but the drawings alone would be great. After Halloween is Grinch Night, the library screened some episodes of Mystery!, and I recall loving his opening animation even at a young age. But it wasn't until high school that I started buying whatever I could get my hands on by him.
October 28, 2010
October 27, 2010
What's so worrisome about caffeine + alcohol drinks?
Here's an NYT story about rising concern over drinks that are basically energy drinks mixed with alcohol. Although I'm not a regular drinker, I still remember one of the most enjoyable drinks I've ever had was an Irish coffee. I'd never heard of it before, but a bunch of us in Barcelona were out for a long night, so the idea of loosening up with booze while keeping my energy going with coffee was an easy sell. I don't recall anything like what the article says about the effects of "alcopop" drinks. Plus, when I went dancing twice a week in Barcelona they gave you a ticket for one drink when you paid to enter, and I would always get a Cuba libre, again never experiencing anything like the alcopop effects.
Clearly there is something else about drinks like Four Loko, Joose, Sparks, Tilt, etc., since just caffeine and alcohol won't produce the effects. Whenever there's junk food with particularly destructive effects, it will always be due to one of two ingredients -- empty carbs (like sugar) or cheap vegetable oils (like soybean). Obviously in this case it must be sugar. I do remember getting weird while drinking a Monster energy drink every morning for the 2007-08 school year, or a store-bought Frappuccino almost every day in the years before that. These are both full of sugar.
After some googling, yep, that's it. One can of alcopop has anywhere from 30 to 190 grams of carbs, all digestible, most or all of it sugars. For comparison, a candy bar usually has around 30 grams. A mixed drink with Coke only has so much of it, not a whole can, so you're probably only getting 10 grams. A shot of Baileys, which also has sweeteners, still only has 11 grams. And that Irish coffee I drank apparently had 1 tsp or merely 5 grams of brown sugar -- I couldn't even taste it and only learned this from looking up the recipe.
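If you want to line those figures up yourself, here's a minimal sketch in Python. The gram counts are just the numbers quoted above (the post's own figures, not nutrition-label data), and the candy-bar equivalence is simple division:

```python
# Carb/sugar figures quoted above, in grams per serving.
# These are the numbers cited in the post, not nutrition-label data.
drinks = {
    "alcopop can (low end)": 30,
    "alcopop can (high end)": 190,
    "rum and Coke (typical mixed drink)": 10,
    "shot of Baileys": 11,
    "Irish coffee (1 tsp brown sugar)": 5,
}

CANDY_BAR_GRAMS = 30  # the rough candy-bar figure used above for comparison

for name, grams in sorted(drinks.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {grams} g (~{grams / CANDY_BAR_GRAMS:.1f} candy bars)")
```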
You might think that the alcopops are being targeted because they're mainly consumed by younger and lower-status people, but then so is rum and Coke, which doesn't cause the same worries. Explanations based on pop cynicism, being so knee-jerk, are rarely right. Here it seems like there is something different about the effects of alcopop drinks, and it's almost surely due to the pounds of sugar they have.
I think this also underlies what many view as a qualitatively different drinking culture among young people over the past 20 or so years -- the antics of "binge drinking" on campuses in particular. It's not as if getting blindingly drunk is new -- in fact, old. And it used to be more prevalent and tolerated. But with the wussification of society in the past roughly 20 years, young people have switched their drinking culture to one that relies so heavily on saccharine drinks that happen to have some alcohol hidden in there somewhere. Everything involving hard liquor now is ___ and Coke, ___ and one of a cornucopia of fruit juices, or god help us even ___ and Kool-Aid.
My impression of the earlier drinking culture of young people is that it wasn't so dependent on sugary drinks, except in the unusual case of someone spiking the punch bowl at a school dance. At least in TV and movies, you see them with bottles and shot glasses and flasks (and beer of course), not rows of soda and fruit juice jugs. So, while they were getting plastered, they were not also getting the unique combination of frenzy and irritability that comes from a sugar rush (and eventual sugar crash). Merely adding caffeine won't yield those effects either; it just puts some pep in your step. That's probably why young heavy drinkers from circa 1975 looked more like jovial party animals, while those of today come off as neurotic trainwrecks.
October 23, 2010
Imagine a world without patriarchy...
It would look like the movie Carrie.
To get into the Halloween spirit, I've been watching movies that I only saw parts of growing up, although I wouldn't have noticed this aspect of the movie as a child even if I had seen the whole thing. Other movies, such as Heathers and Metropolitan, have shown the adolescent world devoid of parents to emphasize how little influence they have on their children -- it's how they interact with one another that matters most. (Parents can only affect kids indirectly through choices that affect the make-up of their kids' peer group.)
But few come to mind that show what would happen if patriarchal male authority vanished for any reason -- wiped out, collapsed under its own weight, atrophied due to apathy, or whatever. Carrie does. There are hardly any fathers or father figures: Carrie's father left her and her mother for another woman, and we only see the father of the survivor / guilty conscience girl for a moment before his daughter runs off with no explanation for where she's going. He shrugs and lets her get away. The principal is impotent and has abdicated his authority role. We never get the sense that he's got his eagle eyes trained on his hell-raising charges, just waiting to crack down on them. Rather, through his flaccid body language and consistently getting Carrie's name wrong, he seems to be coasting through his final week at the office. He just doesn't give a shit anymore. The English teacher likewise is off in la-la land somewhere -- not so much a free spirit, with the connotation of rebelliousness that carries, but merely daydreaming to numb the pain.
The only character who approaches a patriarchal authority figure is the gym teacher, who is portrayed as a drill sergeant in dolphin shorts. It would be a mistake to see her as Carrie's surrogate mother -- she already has one, shown to be a common neurotic trainwreck of a middle-aged woman whose instability the father is supposed to protect his daughter from. The quasi-father figure of the cool-headed and authoritative gym teacher is the only sympathetic character in the whole movie.
And it's no wonder why -- just look what happens when there is no patriarchy to control female behavior! The phrase "Lord of the Flies" has long passed into popular usage to refer to how bestially males, particularly young males, will act in the absence of authority. But what about how girls will act? It's not as though they're sugar and spice and everything nice. They're just as vicious as males, only not in a violent way. They hardly need to be -- they'll just tease and ostracize and persecute each other until they feel like killing themselves. I don't think Mean Girls is up for the job, mostly because it's a comedy instead of something dark and serious.
So I nominate Carrie as the go-to movie reference when some clueless person starts babbling about how much better, even if not perfect, the world would be for women if there were more female-style rulers (i.e., not simply females playing male roles a la Thatcher or the gym teacher from Carrie). Neutral third-party patriarchal rule not only keeps young males from destroying each other; it keeps girls from going for each other's throats as well.
October 21, 2010
NYT catches up to me on recent lack of one-liners
Scroll down this blog and you'll see the observation that since the crime rate started plummeting after 1992, there have been virtually no one-liners from action or horror movies that have gone viral, in contrast to a proliferation of such lines from the high-crime times of 1959 to 1992. For romantic comedies, etc., there have been infectious lines, but nothing for action or horror.
Just a few days ago the NYT featured the same story, though without an awareness of how strongly our perception of violence and danger influences these things. They go digging for one-liners from before 1959, whereas I restricted myself to the most recent wave up and down of crime. The older ones they get are from low-crime times of 1934-1958, but again they're from dramas or romantic comedies, not action or horror.
I'm not so up on movies of Hollywood's golden age and before, but I do know that the action and horror movies of the 1900-1933 high-crime period have more one-liners, while those of 1934-1958 have hardly any.
I am... Draaacula.
Ladies and gentlemen, look at Kong, the Eighth Wonder of the World.
IT'S ALIIIIIVE!
Etc. Again compare to the void of quotes from the '34 to '58 period, when the decline of danger began borifying the culture. So we can rule out the explanations for the recent lack of one-liners that refer to the internet, TV, etc. There was no internet, etc., in the mid-'30s that killed off a previously vibrant action and horror movie culture. Rather, there must be something else that waxes and wanes, and that has a pervasive influence on society -- that is, our perception of how violent the world is, which tracks real changes in the crime rate (maybe with a lag of a couple of years; data not shown, but it's there in the GSS and BJS numbers).
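For anyone who wants to check the lag claim on their own, here's a minimal sketch of the comparison: line up a yearly homicide-rate series against a yearly tally of viral action/horror one-liners and correlate at different lags. The series below are placeholders for illustration, not the actual BJS or GSS numbers.

```python
import numpy as np

def lagged_correlations(crime_rate, one_liners, max_lag=5):
    """Correlate a yearly crime-rate series with a yearly count of viral
    one-liners, shifting the one-liner series by 0..max_lag years so that
    crime in year t is paired with one-liners in year t + lag."""
    crime = np.asarray(crime_rate, dtype=float)
    lines = np.asarray(one_liners, dtype=float)
    out = {}
    for lag in range(max_lag + 1):
        x = crime if lag == 0 else crime[:-lag]
        y = lines if lag == 0 else lines[lag:]
        out[lag] = float(np.corrcoef(x, y)[0, 1])
    return out

# Placeholder series -- swap in real homicide rates (BJS) and your own
# year-by-year tally of quotable action/horror lines.
crime = [8.0, 8.5, 9.0, 9.5, 9.8, 9.3, 8.7, 8.0, 7.2, 6.5]
quotes = [2, 3, 3, 4, 5, 5, 4, 3, 2, 1]
print(lagged_correlations(crime, quotes, max_lag=3))
```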
October 18, 2010
Compelling superheroes and villains are created during violent times
You might wonder how it could be otherwise. I've already shown how little innovation (and so how much recycling) in children's cartoons and toys there's been since the early '90s decline in crime, and that includes those belonging to the hero-vs-villain type. I recently showed how forgettable the action and horror movies of the same period have been, compared to those made during the '60s through the '80s, looking at whether they've contributed memorable catch phrases to the larger culture. What about those other sources of new hero and villain characters -- comic books and video games?
There was a comic book bubble that burst in the early 1990s, after which the medium has been dead, so the fact that the low-crime times of post-1992 have produced no characters that have gone viral in the larger culture could simply be due to that. Marv from Sin City was created in 1991, Spawn in 1992, and Hellboy in 1993 -- and that's it. Of super-famous characters, the most recent one is probably Venom, created in 1988.
So we'll have to look at the pre-crash years in order to better judge whether or not the creation of compelling new characters depends on whether the violence level in society is soaring or plummeting.
The homicide rate began falling in 1934 and stayed low through 1958, then it rose through 1992. There are virtually no superstar comic book characters created in the 1950s (the Silver Age Green Lantern's first appearance in 1959 is not a counter-example to our prediction). Even most of the '30s and '40s saw very few, except for a handful of admittedly mega-famous ones within the four years of 1938 to 1941 -- Superman, Batman, the Joker, Lex Luthor, and Captain America. Maybe Wonder Woman. It's not as though comic books were an obscure entertainment form from '34 to '58, yet they produced few enduring characters.
In contrast, the high-crime times produced almost all of the ones that have lasted, and it's not just because the high-crime period has more years -- you could compare '38 to '58 vs. '60 to '80 and little would change. It may sound silly to throw comic books in with sex, drugs, and rock 'n' roll, but wild times cause people to think more about good guys vs. bad guys. In particular, they make people want more concrete scapegoats or villains to focus their anger on, not just an unseen bunch of muggers and rapists, and they make people want to look to heroes for guidance through the dangerous world.
In fact, the two large bursts of innovation in comic book characters coincide with the two large bursts in rock music creativity -- the first half of the 1960s, and the mid-late '70s, with a comparative lull in the late '60s and early '70s. Just look at all the recent movies based on comic book characters -- aside from Batman and Superman, the blockbuster ones featured characters who date back to these two bursts 30 to 50 years ago.
This pattern for superhero movies parallels the pattern in album sales -- the #1 album in the 2000s in America was a Beatles compilation! If what's out now stinks, might as well try to keep the past alive.
Returning to the post-crash era of the past 18 years, we can still see that this period would not have produced any compelling characters even if comic books had continued as popular entertainment for kids and teenagers. Just look at video games, which largely replaced comic books. All of the superstar characters there were created from the beginning of video games in the late '70s through the peak of the violence level in 1992. Of the early icons, only Pac-Man and Donkey Kong have survived, but that's still impressive. And that's to say nothing of how popular Pac-Man was when it came out -- there were not only cartoons, books, lunchboxes, etc., as with later characters, but even a Billboard top 10 hit, "Pac-Man Fever."
The bulk of superstar characters come from the mid-'80s to early '90s, though, with virtually none coming from the post-'92 low-crime period. The only exceptions are Pokemon and the butt-kicking babes from the Tomb Raider and Resident Evil series, who went viral only because they gave dorks some familiar eye candy at the movies. Everyone else is from high-crime times: Mario, Link, Sonic, Samus, Simon Belmont, Solid Snake, the Street Fighter and Mortal Kombat people, Kirby, and so on.
This look into the history of comic books and video games explains a lot more about the world, since it tells us that it's almost only when people see times as violent and getting worse that new heroes and villains are created that are fascinating enough to withstand the test of time. This aspect of mythogenesis explains why hunter-gatherers, nomadic herders, and settled farmers had much richer mythologies than we do -- the homicide rate has been steadily falling, with some exceptions, since roughly 1600 in northern Europe. That's the main reason why there was such a flourishing of new religions up through about 1000 AD, and hardly anything since then. People insulated from danger feel no need for religion, myth, magic, etc.
October 14, 2010
How animal domestication made people better looking
Good looks are the target of natural selection because they honestly signal robustness to the slings and arrows of early development. The number one source of uglification for humans is infectious disease. If there's a high pathogen burden, people are going to put a greater emphasis on good looks in a mate -- they want their kids to make it through the gauntlet of bugs, and if you look good in a germ-ridden environment, you must have pretty good genes. (Gangestad & Buss 1993 showed this prediction came true.)
Across the world, where do most infectious diseases come from? Animals. These are zoonotic diseases. This explains the first big split between beautiful and ugly people across the world -- namely between those who've spent all or most of their history as hunter-gatherers, who are uniformly plain or ugly, and those who domesticated animals, who are at least OK to look at and often great. For the former, do a google image search for the Bushmen, the Eskimo, Australian Aborigines, and any Amerindian group such as the Ache or Native Americans.
Still, there's lots of variation within those who've been close to animals. The greater the threat of disease, the greater the selection for good looks will be. This threat increases with the number of species you are in contact with, the number of individual animals you're near, how physically close you get to them, how long you're close to them, and in particular how close you get to the most disease-spreading parts of their bodies. Also, some animals may have nastier diseases than others.
This explains the next big split -- between East Asians, who only domesticated animals like the pig and chicken and duck, and most of the remainder of the Old World, which in addition to pigs and small birds also domesticated the largish grazing and browsing animals like horses, sheep, goats, cows, camels, and yaks. The Han (ethnic majority in China) are about as far off in the cultivated agriculture direction as you get, and they're not very good-looking (the world has judged this anyway, looking at magazine covers, beauty pageants, etc.).
The southern parts of India look this way too, where the people tend to look plain or bad and where a primary reliance on animal husbandry never took root. Contrast that to the northwestern parts of the Subcontinent -- there's a much heavier focus on animals, far more people can drink animal milk, and that's where the best looking South Asians tend to come from. (Like the Han, the southern Indians are also famous for being smarter and less rambunctious.)
Why does it matter if the animal is a smallish omnivore like a pig or a largish grazer like a sheep? Sheep herds tend to have more animals in them than swine herds (for, say, a Chinese farmer). You get a lot closer to sheep to shear them, bleed them, and milk them, whereas you only approach a pig to slaughter it (slaughtering being something you do to sheep as well). This difference in activities also means you spend more time close to sheep than to pigs, and especially near their most dangerous areas -- when you're milking a sheep, you've got your hands right on the udders, and you're close to the animal's ass. If you consume the animal's milk, you also risk disease; and people drink sheep's milk but not pig's milk.
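To make those exposure criteria concrete, here's a minimal sketch of how you might score a group's zoonotic exposure from them. The weights and numbers are invented purely for illustration -- the argument above only says these factors matter, not how much each one counts.

```python
from dataclasses import dataclass

@dataclass
class AnimalContact:
    """One domesticated species and how a group interacts with it."""
    species: str
    herd_size: int                 # typical number of animals kept
    contact_hours_per_day: float   # time spent up close
    handles_udders_or_rear: bool   # milking, shearing, bleeding near the riskiest parts
    consumes_milk: bool
    disease_severity: float        # 0..1, how nasty that species' zoonoses are (a guess)

def exposure_score(contacts):
    """Crude score: more species, more animals, more close-contact time,
    and riskier handling all push exposure (and thus selection) up."""
    total = 0.0
    for c in contacts:
        closeness = 1.0
        if c.handles_udders_or_rear:
            closeness += 1.0
        if c.consumes_milk:
            closeness += 0.5
        total += c.disease_severity * closeness * c.contact_hours_per_day * (c.herd_size ** 0.5)
    # extra weight for being exposed to several species at once
    return total * (1 + 0.2 * (len(contacts) - 1))

shepherd   = [AnimalContact("sheep", 80, 4.0, True, True, 0.8)]
pig_keeper = [AnimalContact("pig",    5, 0.5, False, False, 0.7)]
print(exposure_score(shepherd), exposure_score(pig_keeper))
```

By any scoring along these lines, the sheep-milking shepherd comes out far above the occasional pig keeper, which is the contrast drawn above.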
There is a final and smaller split that this explains, namely between people who are primarily cultivators of plant foods and who have an odd cow or pig around vs. those who are primarily pastoralists (no matter whether they are sedentary or nomadic). Compared to cultivators, pastoralists are exposed to a wider variety of animals (historically, virtually none herded only one species), they have more of them, they get just as close to them, and given how many more of them there are, they spend much more time up close. And the cultivator may or may not be consuming the dairy products -- if they mainly plant seeds, their odd cow or pig could be for processing and trading away on the market. Pastoralists are far more likely to use their animals for subsistence, and that is mainly through dairy (if they killed one every day or two for meat, their flock would be gone very soon).
Pastoralists are found throughout Europe (none of the nomadic kind, though), but especially along the Mediterranean, the Celtic parts of Britain, and other places you'd find listed in a World Atlas of Cheeses. Probably the least pastoralist group is the modern English (having drifted toward cultivation and then industrialization), and they're just about as ugly as the Han Chinese, a real exception to otherwise OK or good-looking Europeans.
Moving southward, there are some fine looking nomadic pastoralists across northern Africa, although the real jackpot is around the Horn of Africa. This is probably the greatest concentration of pastoralists in the world, and it's also a supermodel powerhouse.
Western and central Africa has far fewer, as that's the homeland of the Bantu farmer expansion, but there are some in the west, and western Africans look OK. One of the few pastoralist groups in southern Africa, the Himba, are also much nicer to look at than their farmer neighbors.
Just about all of western Asia has a very long history of pastoralism, and again we find some of the best looking people in the world. I've already covered the good-looking people of the northwestern part of the Subcontinent, but the various Persian and other ethnic groups in and around Iran are all overflowing with honey bunnies, not to mention the Levantines further west (as shown by the new phenomenon of the Lebanese babe protester). Women from the Arabian peninsula don't look quite as good, and that's probably because they relied almost entirely on the camel rather than on a wider range of animals. See the earlier criteria for how much of a threat animal domestication poses. It also looks like zoonotic diseases from camels aren't as numerous or as disastrous as those from cattle, sheep, goats, pigs, etc. Hard to think of one.
Probably the most famous pastoralists are the horse-riding Turkic groups of central Asia, who produced Genghis Khan. Most of us have no clue what they look like, but they're pretty nice. Here is a random forum thread where someone's posted pictures of girls from various Turkic groups in their traditional attire. They're at least 10 times better looking than the hyper-farmers of eastern Asia. In college, one of the countries that everyone agreed had the cutest girls, based on the international students, was Turkey (the other was Greece -- another case of their long-standing rivalry).
There's not much going on for pastoralism or babe-ness in southeastern Asia, notwithstanding Thailand's sex tourism (it's rare to see Thais as models or on the cover of Playboy, Maxim, etc.). Ditto for the Pacific Islands. Southeastern Asians do look better than the Han to the north, but this is probably just an effect of more tropical diseases, not more animal husbandry. Australia saw a large influx of pastoralists when the Europeans showed up, giving Elle Macpherson to the world, but before that it was only ugly hunter-gatherers.
The same is true for the Americas -- no pastoralism, or even animal husbandry (horses were used to hunt large game, not for milk), and again Amerindians are plain or ugly. One minor exception is the camelid herders of the Andes. I hear that Peruvian Amerindians are better looking than those from, say, Mexico, but you still don't see lots of them on the cover of Vogue magazine, unlike the daughters of goat herders from Mogadishu. As in the Arabian peninsula, there could be some species-specific effect, where the camelids just aren't as big of a threat, not to mention that Andeans don't have huge and diverse flocks. The obvious exceptions in the New World are recent European immigrants.
Aside from selection for good looks based on protecting your offspring from bugs, those who've domesticated herd animals face an additional pressure for disease resistance -- namely, getting the job done here and now. A good-looking woman in a herding society not only promises healthier children, but you'll also be able to put her to work more effectively milking the cows, goats, and sheep. (This pressure is the same if men do the milking.) If she's more vulnerable to the zoonotic diseases spread by the animals she'll be tending to, she'll be weaker on the job and will probably die sooner as a result. Hence a better looking woman will be a better milker.
Then there are the non-genetic reasons that animal raisers tend to look better. They get a much healthier diet than cultivators, who are lucky to get animal food at all. Their diet isn't quite as nice as a hunter-gatherer's, yet they look better than hunter-gatherers do, so clearly diet is not the only or even primary factor (else the Eskimo, etc., would look best). And as Greg Cochran pointed out, one reason why milkmaids are famed for their beauty is that they were more likely to be exposed to cowpox and thereby become immune to smallpox, a horribly disfiguring disease. Again, though, the main factor seems to be the genetic / natural selection story.
The strength of the "small pox immunity" theory could be tested by looking at pastoralist groups where the men do most of the milking and other herding activities. If their women still look pretty good, it is not due to greater exposure to cow pox but to sex-blind selection for greater looks as a signal of disease resistance. Even if men are the target of selection, they will pass these good-looking genes on to their daughters, making both sexes look nice. If, across pastoralist groups, only the sex that did most of the milking was good-looking, that would support the "small pox immunity" theory.
One last thing to keep in mind, since I anticipate a lot of very confused comments otherwise: this is all "all else equal." I won't allow any comment that stupidly and sarcastically asks why the pastoralist Himba don't look as good as a European farmer group. Too many other variables differ between them. The relevant comparison is always the group who's just like them in as many ways as possible, except for how they make a living. Scottish vs. English, Himba vs. Zulu, Uyghur vs. Han, etc.
October 12, 2010
How economic specialization led to the welfare state
Something I haven't seen mentioned in any economics text I've read -- book, article, blog post, etc. -- is why the welfare state, in some form or other, always pops up when people begin to specialize.
The greater wealth of more specialized economies is not the answer, since there's plenty of junk we don't spend so much more on when we get wealthier -- and the entire apparatus of the welfare state is not some slight increase. "When we get wealthier, we'll spend more on things" is a lazy and vague statement that avoids any discussion of how big that change will be. Why so much more spending on the same programs the world over, and hardly at all on others? Mere wealth increases don't help us figure this out.
It is not some far-off side effect of specialization that gave birth to the welfare state, but the very fact of specializing more and more narrowly in our job tasks. The econ 101 textbook tells you that specializing is unequivocally good -- though a more advanced treatment might tell you where it doesn't work -- because everyone benefits from pursuing their comparative advantage. I won't give a numerical example; you can look one up on Wikipedia. But if we're so much better off from this, then why do we worry more and more about our economic security and therefore demand -- and get -- a suite of programs called a "social safety net"?
Simply put, it's because comparative advantage would be a disaster in the real world (however well it works in some mathematical models). Nassim Taleb points out how fragile it is to the assumption that prices stay constant. For example, if one country specializes in wine and another in wool, things go along hunky-dory for a while -- until wine is replaced by something else as the fashionable thing to drink, or there is some once-a-century natural disaster that wipes out your ability to make wine, or there's what Schumpeter called "creative destruction," where a more advanced industry replaces a more primitive form (such as the automobile replacing the horse and buggy).
Whatever the mechanism, suddenly the former suppliers are getting a lot less in exchange for a unit of their product than they were just before -- perhaps nothing at all if they're out for good. And this applies not just to whole industries but to individual workers. We all specialize in some set of skills in the hopes that they will be needed for as long as we have to work, perhaps with the occasional update. But when we're unemployed for more than a few months, and especially when we sense that our job description has been eliminated completely, we figure that that skill set ain't worth what it used to be, and that we're shit outta luck.
In a society where people have broad and general tasks, not being able to sell your labor to an employer isn't so bad. If you're a hunter-gatherer, you can provide lots of healthy food by yourself, defend yourself and family, and have enough time left over for leisure. Same thing if you're a nomadic herder -- you have a broad enough base of skills to fall back on to support yourself if you can't sell your labor. And ditto for farmers -- you may try to get seasonal work threshing wheat when the opportunity is there, and for most of the year you'll be "out of work" in that sense, but you have a wide enough range of skills to support and defend yourself on your own plot of land.
Some of these cases can get worse than others -- like if you specialize in the crops you plant and a natural disaster wipes out your monoculture crop for a year or two -- but in general you can get by without selling your hyper-specialized labor like the econ 101 textbook tells you to.
Still, occasionally a hunter-gatherer goes for awhile without taking down a large game animal, a herder sometimes has a chunk of their flock stolen, and bad weather ruins the farmer's crops. Rather than have to work extra hard to support themselves because of what was just bad luck, wouldn't they prefer to have social safety net programs in place? Yes, but no individual can impose his will on the entire group -- they would just kill him in his sleep. In order to enjoy such programs, he'd have to convince everyone else to agree to them.
Yet in these pre-industrial societies, the audience that you're pleading to will be very unsympathetic. Why? Because they know that you have a broad enough skill set to support yourself through hard times. OK, so you didn't get that seasonal job threshing wheat -- so what? We all know that you've been planting your own seeds and have those to harvest and eat, that you've been raising a pig or two, that you've been mending your clothes at home and building fences or other protections, and so on. You'll get by -- at least, we would if we were in your shoes, so don't expect us to feel sympathy if you come crying to us.
Again there is some variation within these pre-industrial economies, but the point remains. Hunter-gatherers have no extensive, permanent welfare state structures, and that's because their lifestyle is the most robust to negative shocks. They eat such a wide variety of animals and plants that it would take the most perfect of perfect storms to make all of those skills useless at the same time. Herders also lack welfare state features, and again that's because they are not very specialized, and therefore are not so likely to go extinct through bad luck. For example, they pasture their livestock across a variety of grazing lands during the year, own a largish number of animals across which they can diversify risk, etc. Farmers are a bit more specialized since they tend to plant only a few (perhaps just one) staple crops and are lucky to have even a handful of animals. It's not surprising, then, to see proto-welfare state institutions in farmer societies, such as long-term third-party charity groups who provide alms to the destitute or run hospitals for the poor.
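A back-of-the-envelope way to see that robustness ranking: if each food source fails independently in a given year with some probability, the chance that everything fails at once shrinks geometrically with the number of sources. The failure probability and source counts below are invented for illustration.

```python
# Chance that every food source fails in the same year, assuming each source
# fails independently with probability p (p and the source counts are made up).
p = 0.2

lifestyles = [("industrial worker (one saleable skill)", 1),
              ("farmer (a few crops and animals)", 3),
              ("herder (several herds and pastures)", 6),
              ("hunter-gatherer (many species exploited)", 12)]

for name, n_sources in lifestyles:
    print(f"{name}: total failure with probability {p ** n_sources:.2e}")

# One source fails outright one year in five; a dozen roughly independent
# sources essentially never fail at once -- the "perfect storm" intuition
# behind the lack of a welfare state among foragers and herders.
```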
Still, it takes the incredibly precarious lifestyle of industrial people to cause real fear about "job security" -- i.e., the ability to keep providing for yourself and family. If you lost your job and couldn't find work for six months, could you feed yourself? No, because you never learned to hunt wild game or gather and process plant foods so that they're safe to eat, you never bothered investing in a dozen or so cattle just in case, and you haven't been planting a variety of crops either. Forget whether you can continue your Netflix service or not -- you can't even feed yourself. You surely couldn't defend yourself against violence either, if the country were invaded or if a local crime wave broke out. (Defense spending, one of the largest items in the national budget, is a social safety net program -- just in case the bad guys show up, when you won't know what to do.) That's how narrow our specialized skill set has become.
Thus, we really do not have a plan B to fall back on if our specialized labor finds no buyer. Lazy thinkers will say that you should just specialize in something else -- except that it's too late. Any specialized skill that's worth much takes time to acquire. There could be a 10,000-hour apprenticeship that you need to go through before employers feel you're good enough to hire. There may be a critical window for acquiring some of the skills necessary. For instance, an out-of-work construction worker in Spain who thought his skills would be valuable for quite awhile cannot now specialize in Spain's tourism industry, where English language skills are needed. Beyond a certain age, it's very hard to pick up a new language.
And even where it's possible to specialize in something else, if it's worth anything at all, probably a lot of others are already there. While you were busy specializing in a now obsolete skill set, these people -- your new competitors -- were specializing in what you hope to switch over to. With such a big leg up on you, they'll be there to stay, and you'll have to be lucky to get your newly arrived foot in the door. There may be exceptions -- say, the new niche was overlooked while your old one was just a bubble, so that workers can flow from the old niche into the new -- but in general that won't be true.
So now you really do have a legitimate claim of helplessness, and your audience -- whether voters or state planners -- cannot deny it. They, especially in a democracy, will think, "Jeez, that could've been me." And with that, they vote for social safety net programs, for both selfish reasons (in case they're out of luck) and altruistic ones ("those guys didn't do anything blameworthy to be so on-the-edge").
If you think about it, you'd have to be a complete fool to leave behind a pre-industrial way of making a living and become a specialized labor-seller, given how fragile and vulnerable you would become to sudden negative shocks. You'd only agree to the lifestyle shift if you could expect some kind of cushion "just in case." At first this appeared as self-help protection in the form of militant (and then less militant) labor unions. But by insight or trial-and-error, workers figured out that there were economies of scale to be enjoyed, and they let the union approach fade away and instead delegated the social safety net functions to the government itself. Some people talk about the death of unions as though it signaled a rise in free market orientation among the public, but meanwhile the welfare state has only grown larger -- covering more areas of life, and to a greater depth within each one -- as it took over the earlier role of unions.
There's a deep irony here in the libertarian advocacy for pursuing comparative advantage yet shrinking or abolishing the welfare state. People only adopted the industrial lifestyle when they were reasonably sure that some kind of social safety net would protect them from the risk of being unable to feed or defend themselves should their narrow skill set become worthless for even six months. If the welfare state were eliminated, people would revert to one of the pre-industrial ways of making a living -- at least those are somewhat robust to sudden negative shocks. Whatever lifestyle you think is best, it's clear that you can't argue for narrow specialization and a small or absent welfare state at the same time. Perhaps this is why so few people have listened to libertarians, and why everyone has the sense that they're describing an unstable utopia rather than a feasible goal.
Lastly, there's one potential alternative explanation that on further inspection falls short. The welfare state grew not only with the rise of specialization but also with the decline in support from one's kinship network. Perhaps once the social safety net that your family used to provide began to shrink, people looked to the state to make up for it. That's plausible at first, but when you look at all the things that your kin do for you, the state only takes up the slack in a subset of them. For example, the social safety net includes education, health care, food, defense against violence, and old-age pensions.
However, one of the largest ways of your kin supporting you relates to finding a mate, getting married, and raising children. With the decline in kin support over the past few centuries in the developed countries, individuals have been left to fend for themselves in these areas -- and yet there was no welfare state program to help them out. The government subsidizes your college loan or rent for affordable housing, but it does not subsidize your loan to buy nice things to impress prospective mates (or to keep your body in good shape), and does not try to provide the poor with "affordable restaurant dating" so they can enjoy nice outcomes like the middle and upper classes.
Even when you find someone to marry, the government may give you a meager incentive to do so through the tax scheme, but that's not what your kin do -- they help to raise a dowry or bride-price. The government may fork over more money on your behalf so you can see a better doctor, but it won't plunk down a red cent so that you can pay for a better wedding or marry a higher-quality spouse.
And although there is some public daycare, this is far from a universal feature of the welfare state. Even where it exists, it's not really the same thing that kin provide, when you think about it. Your kin treat and try to raise your children in a different way than daycare workers do -- and you can call up your kin at any time, unlike the limited hours of operation for a public daycare center or, later on, a public school. There's so much of the socialization process that daycare workers and teachers don't even attempt, let alone accomplish, that your kin would naturally do -- for instance, all rites of passage, the kind of things that teach boys how to become men. This also includes all the skills you need to evade the state for the benefit of yourself and your family, which you'll need to do from time to time, as the interests of the state and the family aren't the same and often clash. Don't count on the gummint raising your kids to learn about that.
So while this "decline of family influence" argument goes a good ways, it doesn't capture as much of the picture as the "hyper-specialization" argument. The latter explains the lack of welfare state programs just mentioned because they do not result from a huge increase in risk and vulnerability due to job specialization.
This makes the forecast for libertarians look even gloomier, since if it were just weak families, we could still hyper-specialize but just limit mobility, require parental permission to marry, or whatever would strengthen family support once more, and eliminate that source of demand for the government's social safety net. But if it's due to specialization itself, like I said before, you can't have your cake and eat it too.
People should be free to join communities that collectively decide to adopt whatever economy they want. Some might experiment with pre-industrial lifestyles, betting on the long-term instability of an ever-growing welfare state. Others would dismiss them as Chicken Littles. The only way to find out who's right is trial and error. Or perhaps both are stable and represent different points along a trade-off continuum -- more wealth with a more intrusive state, or less wealth with a more impotent central state. Whatever the case, and however people end up deciding, what's clear is that all this econ 101 balderdash about the wonders of comparative advantage needs to be thrown overboard. It obscures too many of the very real dangers of specialization, such as the creation of a welfare state that may grow so bloated that it can no longer function, at which point the risk-averse populace will revert back to pre-industrial lifestyles where they're not so vulnerable.
October 10, 2010
Where are today's "Go ahead, make my day" lines?
Since crime rates started falling in the early-mid 1990s, the action and horror movie genres that flourished during the rising-crime times of roughly the '60s through the '80s have fallen off a cliff. I can quantify that in various ways, for example by looking at a bunch of lists of "best horror movies," taking only those that appear on at least two or three or ... all of the lists, and seeing how these are distributed over time. And that backs up the basic impression of any action or horror movie fan.
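For what it's worth, that quantification is simple to sketch: pool several published lists, keep the titles that appear on at least a couple of them, and bin those by decade. The lists, titles, and years below are placeholders, not real rankings.

```python
from collections import Counter

# Placeholder data: each inner list stands in for one published "best horror
# movies" list; release years would come from a lookup table in a real run.
lists = [
    ["Halloween", "The Exorcist", "The Shining", "Scream"],
    ["Halloween", "Alien", "The Shining", "The Ring"],
    ["The Exorcist", "Alien", "The Shining", "Halloween"],
]
year = {"Halloween": 1978, "The Exorcist": 1973, "The Shining": 1980,
        "Alien": 1979, "Scream": 1996, "The Ring": 2002}

MIN_LISTS = 2  # keep only titles that show up on at least this many lists
appearances = Counter(title for lst in lists for title in set(lst))
consensus = [t for t, c in appearances.items() if c >= MIN_LISTS]

by_decade = Counter((year[t] // 10) * 10 for t in consensus)
for decade in sorted(by_decade):
    print(f"{decade}s: {by_decade[decade]} consensus picks")

# With real lists plugged in, the decade counts are the distribution over time
# described above.
```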
But what about a quick-and-dirty yet still illuminating way to see this? That would require a lot less work and would be more friendly to people who are allergic to looking at charts. One aspect of a stellar movie is that it has a line or two that is so memorable that it enters the common lexicon. This is certainly true for higher art forms -- compare how many verses from Shakespeare vs. Dryden have "gone viral." Perhaps not every single line, but on the whole they pack more of a punch and so are more likely to stick in our minds and get transmitted over time.
There are basically none from the action and horror movies of the past 15 to 20 years, other than "I see dead people" and "Do you like scary movies?" Again I'm talking lines that everyone would recognize, would use on their own, and that would re-awaken that scene from the movie inside the listener. Just listing ones that come to me without even thinking hard, here's how skilled writers used to be at crafting these lines. Some of these are probably still in circulation, in addition to having gone viral at the time. Indeed, when googling them to get the wording right, they are popular enough in searches to get their own auto-suggestion. They would all be even more recognizable when spoken aloud with vocal imitation of the character, rather than reading the text below:
Game over, man! Game over!
Yippee-ki-yay, motherfucker.
Are you talkin' to me?
Are you feeling lucky, punk?
"Who are you?" -- Your worst nightmare.
I'll be back.
Hasta la vista, baby.
You are one ugly motherfucker.
You send that many, don't forget one thing. "What?" A good supply of body bags.
Don't cross the streams.
I'm your boyfriend now, Nancy.
Heeeeere's Johnny!
Hi, I'm Chucky -- wanna play?
They're coming to get you, Barbara...
Chu chu chu chu chu... ah ah ah ah ah ah....
They're heeeere.
It rubs the lotion on its skin, or else it gets the hose again.
The power of Christ compels you!
I hate snakes, Jock! I hate 'em!
She's dead... wrapped in plastic.
...And that's leaving aside about a million from the Star Wars trilogy.
Just about all of these are from the mid-'70s to early '90s, and just as much time has passed since then, so it's not as though there simply hasn't been enough time for newer lines to catch on. Rather, the quality is just a lot lower on average, so nothing has stuck. That's even true for tame-times continuations of the above movies. Guess we'll be quoting these for years to come.
October 7, 2010
Doing my part to kill the bachelorette party
Last Saturday I was browsing around Barnes & Noble before heading out for the evening, when I heard some group of girls belting out a few lines from "L-O-V-E" down on the first floor, then ending abruptly. Probably just a rare group of drunk sluts who stumbled in here after an early start to their Saturday night. Then the group rode the escalator up to near where I was reading, and giggled the entire way over, when one of them asked me:
Um, excuse me... I know this is gonna sound strange, but would you mind if I gave you a piggyback ride? -- I mean, just to right over there. [long pause from me, as she's not so cute, and then looking over to see if any of her friends are] It's for a bachelorette party... no really, it's one of the things on the check-list, so it's totally not creepy I promise [showing me a glossy, professionally made 10" square card].
I told her sorry, and they left looking let down. Hopefully the pauses and comments I gave about "I dunno, sounds kinda weird" shamed them into not going through with the rest of this ridiculousness before a girl gets married. After all, nothing freaks a girl out like thinking that others view her as weird and rocking the boat. If it were a mere girls' night out, no harm done (other than growing more socially retarded around boys if every night out is girls' night out). But the bachelorette party is her last chance to have the world focus on her, so you get the most concentrated explosion of attention whoring you've ever seen.
For those who haven't been to a bar or nightclub in awhile, the bachelorette often has to work through a scavenger-hunt list of things like finding someone to sing "L-O-V-E" with in public, wearing a belt or other fashion accessory made out of Smarties (which she has to get various guys to remove with their teeth), and so on. So guys, if you ever get approached like this, do your part to end this social pollution and make them feel like they're being weird.
And since when did this annoying and shameful event become de rigueur? Well, when else did any of today's make-you-wanna-throttle-somebody cultural practices begin infecting the society? You guessed it -- the early '90s (perhaps with a few isolated instances during the late '80s). I searched the NYT for "bachelorette party," and aside from a single usage from the early '80s that describes more of a bridal shower atmosphere, the steady stream of articles begins in the '90s. Here is one from 1990. The "one last night of fun" and male strippers make it sound like it began as an appropriation of the male bachelor party.
But before long we got the attention-whore-fest that we'd recognize today. Listen to how familiar this description sounds, as far back as spring of 1996 (and given that news coverage lags behind the trends it covers, it was probably already going on by 1995):
The bride's friends made her a hat -- an upside-down pink plastic flowerpot with an enormous white flower on the side -- that she was required to wear for the evening. She also was given a list of things she had to accomplish that night. It was like a cross between an urban survival course and a scavenger hunt.
"First, she had to go up to strangers in the bar and get people to sing the theme song from 'The Brady Bunch' out loud, and she did," Ms. Drucker said. "Then she had to get a couple of people to buy her shots, and she did. We'd see her bright pink hat as she made her way through the bar. Heather's pushy, and I mean that as a compliment."
Earlier I looked at signs of how segregated the sexes have been for the past 20 years, so let's add the bachelorette party. It's one of those "girls gotta stick together" activities where their only interactions with boys are shallow and fleeting, and it's more of a goof than a sincere attempt to get to know new boys and have fun. If the poor guy goes along with their game, he's just getting used to boost their egos for a minute before they move on to some other dude for another quick attention fix. It's an awful lot like grinding on the dance floor: here she comes and there she goes -- completely unlike an intimate trust-building slow dance, which might actually lead somewhere sexual.
When crime rates began rising in the late '50s, girls suddenly came out of their self-imposed cocoons of domesticity and wanted to play with the boys. For one thing, when the world gets more dangerous, you have to get to the business of making babies sooner, meaning you begin your search earlier and at any age you try to make yourself more attractive as a potential mate. So girls become boy-crazy, which is an honest signal that she's not going to flake out -- she's too head-over-heels for you. And for another thing, greater danger makes you value protectors more, and that's overwhelmingly going to be guys.
It's only when crime rates began falling in the early '90s that girls were cured of their boy-craziness and desire for male protectors, and retreated again to the bubble of their house as a source of meaning -- only instead of vacuum cleaners and tupperware, this time it was Viking stoves, marble countertops, and chairs and lights from Design Within Reach, or perhaps some ironic retro junk from Urban Outfitters if younger.
October 6, 2010
The march of Asperger's online
I don't think we're much more Aspergery in the post-internet age than before, since the big change is pre- vs. post-modern, not pre- vs. post-internet. Still, the internet opens a clearer window on what makes people tick, and gives Asperger's people more things to fold into their bloating collection of geekisms.
The strongest current sign of this is the use of the @ character when addressing people in blog posts or comments. I have yet to see this in emails or on Facebook, though, thank god. (I believe this started as the built-in response on Twitter, and people have adopted it elsewhere voluntarily.) For example, the Asperger's guy will write "@Bill" followed by a comma, colon, double dash, etc. A simple "Bill" followed by a comma or whatever would do. But how can you level up your geek magic points if you refrain from casting the @ spell? That concern trumps readability.
Hopefully this stupid trend won't last, given that it's not popular with the young, who avoid Twitter and have no nostalgic emotional connection to conventions that were born in email.
What will the upcoming Millennial culture war be fought over?
Roughly every 25 years we're struck by a culture war, where some perceived grand injustice needs to be corrected and we're just going to have to drop everything and fix it already.
The last one hit in the early '90s and centered on identity politics and political correctness. Before that was the late '60s, which focused on smashing capitalism and the imperialist war machine, as well as civil rights -- not a race issue except to the extent that these were denied based on race (see the Berkeley Free Speech Movement, the Pentagon Papers, Tinker v. Des Moines, the re-birth of American libertarianism, etc.). Before that was the WWII-era move to get women into the same jobs men were doing, as well as the first round of desegregation in the military and sports. Before that was the Harlem Renaissance and flapper period during the Roaring Twenties. And before that was the Progressive Era.
Since the last peak was circa 1992, that means that by the middle or end of this decade we can expect another culture war to explode. What will be its casus belli? Here's an article documenting the waning of the diversity cult on campus and its gradual replacement by the sustainability cult.
One index of the rise of sustainability at the expense of diversity is the size of the institutional memberships of their professional groups. The Association for the Advancement of Sustainability in Higher Education now lists as members 800 colleges and universities in the United States. The National Association of Diversity Officers in Higher Education, by contrast, has about 150 member institutions.
There's no foment in the air relating to blacks, the usual group around whom a racially charged culture war breaks out. The closest thing is amnesty for illegal immigrants or pushing for open borders, but Millennials -- the young people who always serve as cannon fodder for the war -- couldn't care less about putting illegals "on a path to citizenship." That's more a push from Gen X-ers and Baby Boomers, to whom race-related culture wars make sense.
No, young people today are more concerned about "going green," as the article details, plus civil rights for gays, e.g. gay marriage. (Gen X's culture war featured gays, unlike the Baby Boomers', but it was about identity politics -- about transfiguring the dominant heteronormative metanarrative, bla bla bla -- is it even funny to make fun of this writing style anymore?) One warning sign is that one of my Millennial friends has an item on her Facebook page saying she'll "attend" (i.e. participate in) this public event. Here's part of the description for the few not on Facebook:
It’s been decided. On October 20th, 2010, we will wear purple in honor of the 6 gay boys who committed suicide in recent weeks/months due to homophobic abuse in their homes and at their schools. Purple represents Spirit on the LGBTQ flag and that’s exactly what we’d like all of you to have with you: spirit. Please know that times will get better and that you will meet people who will love you and respect you for who you are, no matter your sexuality.
...
Join this event and invite everyone on your friends list. Don't let their deaths be for nothing. Let it mean something, and let's do something to change this country for once.
Currently there are 93,000 who say they're going to participate, and another 20,000 who may or may not, with 26,000 who were invited saying they won't.
I sure hope that their culture war is limited to a silly and innocuous cause like gay marriage, lest it rip society apart like another early '90s race war. Even if sustainability comes along for the ride, it will be annoying -- yet another 4-lane road ground to a halt because it had to make room for seldom used bike lanes on both sides -- but it won't tear us apart like the sexual harassment / date rape spook and the anti-white hate of the Anita Hill and Rodney King period. If things look like they're going to get bad within the decade, go back and watch Higher Learning to remember how divisive it was the last time. It doesn't look like the next one will have such a potential to blow society up.
Oh, and don't be stupid and see that movie as one of only three white teenagers in an all-black theater in Wheaton Plaza, Maryland. That was one of the few times in my life where I thought I was likely to get jumped by a mob. We stuck it out and nothing happened, perhaps because we made such exaggerated signals of disgust at the skinhead guy just to let them know we weren't like him. That hardly defused the choking tension, but it got us out of there unmolested.
October 5, 2010
Which nuts work best?
I decided to experiment a little during the last few weeks to see what kind of nuts are best for a paleo diet. I don't always include them, but often enough that it's worth figuring out which ones to use.
I stayed away from cashews, pecans, walnuts, pine nuts, and pistachios because they have too much polyunsaturated fat or too many digestible carbs. Same for Brazil nuts, but these also taste terrible and have an unpleasant texture. Macadamia nuts are OK -- not carb-filled at all, and a decent amount of saturated fat -- but they're a bit too chalky and thus require a lot more chewing and grinding to process. Soy nuts were out of the question since they're soy, one of the most poisonous substances you could eat (unless you ferment it, like natto).
Since they were cheap to experiment with, I picked up some peanut butter (peanuts only) and raw almonds that were on sale. I won't be incorporating these because they're too easy to eat a lot of. Peanuts are not nuts but legumes, making peanut butter more like hummus, and human beings aren't adapted to a bean-heavy diet -- that's why we have to rely on our gut flora to digest that junk for us, yet we pay the price by getting gas. Almonds are a lot better as far as nutritional value, flavor, and texture go, but again it's too easy to go through half a pound in one sitting (not that I did this every time I ate them, but it did happen once).
Foods that we're adapted to eating fill us up and tell us to stop before long. The reason is that a meal is not fuel but an information signal. Food doesn't become usable fuel until hours or days later, bearing in mind that we had no access to pounds of sugar until about 100 years ago, and except for some South Seas people with abundant coconuts, we had no access to lots of medium-chain fatty acids that can be burned right away. Rather, a meal is a signal to our body that we can stop being so physically active -- we've accomplished our goal of finding and preparing food. If a meal were fuel, then when we ate a large meal our impulse would be to be very active, whereas in reality it is to kick back and be lazy for awhile.
So, any food that you "just can't stop" eating is not something we're meant to eat. Sugars and starches are the clearest example, but even almonds and cashews have enough of them to keep you snacking on and on. There is no natural negative feedback loop that says, "Whoa, we get the message loud and clear -- you've found food -- so you can stop chowing down now." I went through what I planned as a week's small jar of peanut butter in several days, and one week I went through the first pound of almonds in just a few days and bought another one!
Aside from the obvious side effects of peanut butter making my belly inflate like a balloon, it also dried out my skin all over. Not as in flaking or peeling or getting ashy, but just a general feeling of dryness that I never have ordinarily. That happened when I tried corn tortillas awhile back, and when I bought a container of hummus as a vegetable dip. Beans and grains must interfere with the pathway whereby vitamin A maintains healthy skin (I eat a slice of liver cheese every morning, but that was to little avail in the presence of legumes). My face also broke out a bit, again not too surprising given how close to the low-carb border you can push it when eating a fair amount of almonds and peanut butter.
So this week it's back to my old favorite -- hazelnuts. They're true nuts, which humans are more adapted to, and we have archaeological evidence that pre-sedentary-farmer groups were eating them in Scotland. Like macadamias, they're very low-carb and like almonds they're high in monounsaturated fat (oleic acid, the stuff that olive oil is made of). They hardly taste sweet or starchy, and therefore my natural negative feedback loop kicks in after just a handful. At most I can eat 20 per sitting, which is about 1 oz., and sometimes just 10 is more than enough. Plus the texture feels just right -- crunchy but not chalky, and with enough flesh on the outside.
If you want an even more savory taste, crack them in half with your teeth, drizzle with olive oil, and sprinkle with sea salt and pepper. If you want the tiniest hint of sweetness plus a load of ready-to-burn fat, have a little bit of coconut chips with them. Best of all, they're hardly more expensive than almonds, and because you can't eat as much of them, they cost more or less the same over a week. I'd advise against hazelnut butter (made by Kettle), as it's way too savory to replace the nuts themselves, and the smooth texture isn't as satisfying as crunching the real thing between your molars.
I stayed away from cashews, pecans, walnuts, pine nuts, and pistachios because of too much polyunsaturated fat or too much digestible carbs. Same for Brazil nuts, but these also taste terrible and have an unpleasant texture. Macadamia nuts are OK -- not carb-filled at all, and a decent amount of saturated fat -- but they're a bit too chalky and thus require a lot more chewing and grinding to process. Soynuts were out of the question since they're soy, one of the most poisonous substances you could eat (unless you ferment it like natto).
Since it was cheap to experiment with, I picked up some peanut butter (peanuts only), and raw almonds that were on sale. I won't be incorporating these because they're too easy to eat a lot of. Peanuts are not nuts but legumes, making peanut butter more like hummus, and human beings aren't adapted to a bean-heavy diet -- that's why we have to rely on our gut flora to digest that junk for us, yet we pay the price by getting gas. Almonds are a lot better as far as nutritional value, flavor, and texture go, but again it's too easy to go through half a pound in one sitting (not that I did this every time I ate them, but it did happen once).
Foods that we're adapted to fill us up and tell us to stop before long. The reason is that a meal is not fuel but an information signal. Food doesn't become usable fuel until hours or days later, bearing in mind that we had no access to pounds of sugar before 100 years ago or so, and except for some South Seas people with abundant coconuts, we had no access to lots of short-chain fatty acids that can be burned right away. Rather, a meal is a signal to our body that we can stop being so physically active -- we've accomplished our goal of finding and preparing food. If a meal were fuel, then when we ate a large meal our impulse would be to be very active, whereas in reality it is to kick back and be lazy for awhile.
So, any food that you "just can't stop" eating is not something we're meant to eat. Sugars and starches are the clearest example, but even almonds and cashews have enough sweetness and starch to keep you snacking on and on. There is no natural negative feedback loop that says, "Whoa, we get the message loud and clear -- you've found food -- so you can stop chowing down now." I went through a small jar of peanut butter that was supposed to last a week in just several days, and one week I went through the first pound of almonds in a few days and bought another one!
Aside from the obvious side effect of peanut butter making my belly inflate like a balloon, it also dried out my skin all over. Not as in flaking or peeling or getting ashy, but just a general feeling of dryness that I never have ordinarily. That happened when I tried corn tortillas a while back, and when I bought a container of hummus as a vegetable dip. Beans and grains must interfere with the pathway whereby vitamin A maintains healthy skin (I eat a slice of liver cheese every morning, but that was to little avail in the presence of legumes). My face also broke out a bit, again not too surprising given how close to the low-carb border a fair amount of almonds and peanut butter pushes you.
So this week it's back to my old favorite -- hazelnuts. They're true nuts, which humans are better adapted to, and we have archaeological evidence that pre-sedentary-farmer groups were eating them in Scotland. Like macadamias, they're very low-carb, and like almonds, they're high in monounsaturated fat (oleic acid, the stuff that olive oil is mostly made of). They hardly taste sweet or starchy, so my natural negative feedback loop kicks in after just a handful. At most I can eat 20 per sitting, which is about 1 oz., and sometimes just 10 is more than enough. Plus the texture feels just right -- crunchy but not chalky, and with enough flesh on the outside.
If you want an even more savory taste, crack them in half with your teeth, drizzle with olive oil, and sprinkle with sea salt and pepper. If you want the tiniest hint of sweetness plus a load of ready-to-burn fat, have a little bit of coconut chips with them. Best of all, they're hardly more expensive than almonds, and because you can't eat as much of them, they cost more or less the same over a week. I'd advise against hazelnut butter (made by Kettle), as it's way too savory to replace the nuts themselves, and the smooth texture isn't as satisfying as crunching the real thing between your molars.
October 3, 2010
Was there yet another lactase mutation in Tibet for yak milk?
One of the fastest-evolving parts of the human genome is an enhancer region near the lactase gene, where new mutations allow people to digest lactose -- and thus animal milk -- into adulthood. One wave of such a mutation spread out from the Indo-European homeland, another throughout eastern Africa (both of these related to cows), and still another from the Arabian peninsula (related to camels).
These three independent waves were all started by nomadic pastoralists, not farmers or hunter-gatherers. Hunter-gatherers don't own livestock to milk in the first place, and when farmers do, they can find other ways around the lactose in their animals' milk -- like letting it age into cheese. Fine for sedentary farmers, but not for on-the-go herders who may get milk to the butter stage but can't stay put long enough to culture it into 5-year aged cheddar. They need to consume it now.
I wonder if there isn't another independent mutation among Tibetans, who have historically led a nomadic herding lifestyle and whose diet apparently is chock-full of dairy from the yak. Check this out: dozens of cups per day of butter tea, which has not only yak butter but yak milk -- and salt! Somehow I doubt all those hummus-and-granola hippies making their pilgrimage to Tibet will share the news of how much saturated fat and salt their favorite oppressed people wolf down all day.
The lactose content of yak milk is about the same as for other dairy animals, so any of those three waves of mutations from western Eurasia and eastern Africa would have thrived if they'd gotten into Tibet. The only problem is getting into Tibet. Thus, if lactose tolerance is spreading there, it's probably an indigenous mutation. Just googling around, I couldn't find anything, but Tibetans are not a popular group for genetic study, except when it comes to adaptation to the thin air at such high altitudes.
But thin air isn't the only thing that affects your Darwinian fitness in the mountains. Because jackshit grows that high up, you're probably going to rely a lot more on herding robust animals and, not wanting to kill them too often for their meat, stealing their more replenishable store of milk.