April 28, 2011

Informality and cultural advancement

Take the level of formality in a culture to be how much variance in behavior is allowed for members of the major groups that we carve society up into -- men and women, young and adolescent and adult, patron and client, initiated and neophyte, ruler and ruled, etc. More formality simply means that members of one of these groups are more locked into their expected role, regardless of what that expected (or average) behavior is. Less formality means that more variation is allowed. Mary Douglas referred to this social variable as "grid," Victor Turner "anti-structure," etc. I'll use the simpler term "informality."

While watching the DVD for Live Aid, I was struck by how tame the English concert was compared to the American concert, both from the performers and the audience. I'd always heard from interviews that American audiences work themselves into more of a fever pitch than English ones, and that this led the band members to give a more energetic performance to American crowds.

English society has been more formal than American, so the chasm in enthusiasm is as we would expect. The more that formality matters, the more self-conscious you must be in order to make sure that you're not stepping out of your proper role. With formality constraints relaxed, you can slough off your self-consciousness and enter a more trance-like state, similar to spiritual possession, and abandon yourself to the carnivalesque party atmosphere.

The even more formal society of France (at least around the cultural capital of Paris), which prides itself on bourgeois propriety (and courtier life before that), has never produced a major popular music sensation -- that is, beloved by many outside of France -- let alone in rock music. (While many know who Edith Piaf is, few listen to her over and over.) Not that rock music wasn't popular in France. They just couldn't let their hair down enough to make it themselves. This shortcoming is not just in popular music, as though they specialized instead in higher art forms, since the French were also-rans in European Classical music, behind the relatively less stuffy German-speaking and Italian countries. Germany did fairly well in rock music, and though Italy did not, they did play a leading role in European dance-pop music.

Aside from the bird's-eye view of how popular a certain form was in the entire society, or how good that society as a whole was at making it, let's look at groups that tend to be more on the margins of cultural production and see if the pattern holds even there. For instance, women are outnumbered by men in any area of culture that is at all exciting and meaningful. Nevertheless, we predict that more informal cultures will allow women greater leeway in their behavior. Cross-culturally, greater informality is one background condition for religious activities where female spiritual possession plays a key role.

Starting with the line-up at Live Aid, of the 20 groups performing at Wembley, only 1 featured a female (Sade), whereas among the 33 groups at Philadelphia, 6 did (Joan Baez, The Pretenders, Madonna, Ashford & Simpson, Thompson Twins, and Patti LaBelle).

In the Rock and Roll Hall of Fame, most inductees are male, drawn from both America and Britain (and a few other places too). But of the females inducted, my rough count is 22 American individuals or groups with a prominent female, only 1 British individual (Christine McVie, of the British-American band Fleetwood Mac), plus 1 Canadian individual (Joni Mitchell) and 1 Swedish group (ABBA). Most of the Americans are more R&B / girl-group performers, but there are still a good number of more rock-oriented artists.

Finally, consider Wikipedia's list of female rock singers. I'm filtering out those who never made it big in either the mainstream or a large sub-culture, who were one-hit wonders, and who were famous for being famous. This is a subjective reckoning of who made a fairly large direct contribution to the genre. I've split them into what I consider lesser and greater importance, and by American or British location. This gives:

American, lesser
Karen Carpenter
Cher
Kim Gordon
Stevie Nicks
Linda Ronstadt

British, lesser
Kate Bush
Annie Lennox
Siouxsie Sioux
Bonnie Tyler

American, greater
Joan Baez
Pat Benatar
Belinda Carlisle
Emmylou Harris
Debbie Harry
Susanna Hoffs
Chrissie Hynde
Joan Jett
Cyndi Lauper

Sure enough, the big female rock stars are all American, reflecting the less rigid sex roles of our society. There are some British females in the second tier of rock eminence, but even here we see that they're not so much English as Celtic. Kate Bush's mother is an Irish folk dancer, Annie Lennox is Scottish, and Bonnie Tyler is Welsh. Here I'm not so sure, but my impression is that the Celtic areas are more informal and casual than England. At the regional level in America, we should see more accomplished females as we move away from the high-point of formality in the northeast, going west and south, which looks roughly correct. At any rate, the American vs. British difference is clear.

Some level of formality is good just so people don't get totally confused about how to behave in this or that situation. But it's remarkable how little we apparently need in order to hold up most social institutions. More importantly, a low level of formality is the most important part of moving a culture forward -- the less blind variation there is, the harder it is for the survival-of-the-fittest dynamic to find better culture and weed out the lesser. The wide range of variation that comes from more informality isn't good in itself (since it produces a lot of garbage, too), but only to the extent that it's the first step along the path of upward cultural evolution rather than stagnation.

This is especially marked in the areas of culture where a loss of self-consciousness is one key part of the experience, such as the arts vs. the sciences, fiction vs. non-fiction within literature, ecstatic vs. rational sub-cultures in religion, and so on. Fortunately for the more fun-loving societies, we can easily reproduce the fruits of the rationalistic and self-conscious labor of more formal societies -- build airplanes, set interest rates, etc. It seems much tougher for those more formal societies to take the products of our footloose culture and get the same enjoyment out of it. The same applies even within a society: a more don't-fence-me-in sub-culture thrives most when there is a more tightly rule-governed sub-culture whose benefits can be spread easily to the former.

April 27, 2011

Childhood bravery, 1

On Facebook my brother just re-posted a viral status update about how devil-may-care kids used to be vs. how wimpy they've been in recent times (obviously referring to the post-1992 decline of all that stuff).

It's easy for people who didn't grow up during the '60s, '70s, or '80s -- or for current helicopter parents who may have forgotten what it was like -- to dismiss that way of living as reckless, foolish, etc. And sure, there's a real downside to unsupervised young people doing what they want. But that lazy dismissal ignores the development of moral character that only results from going through these dangerous rites of passage, and the solidarity that can only be forged by going through them together with your age-mates. Not to mention all the fun you get to have -- but again I don't want to play into the stereotype that the anti-helicopter parent army is only interested in kids having more fun.

During sheltered times, it's rare that someone growing up gets to act brave, and we have to count that as a real cost of living with a falling crime rate. Please don't suggest that kids in the past 15 to 20 years do get the chance, but that they merely express it in activities that only current youngsters would understand, like video games. I'm sure that the instinct does get channeled in that direction, say by taking a hit to save your buddy in an online first-person shooter game. But that is no more an act of bravery than jerking off to internet porn is an act of getting laid.

So in the interest of historical preservation, I thought I'd go through some of the events of my childhood where I had the choice to take a frightening risk that would pay off big -- if it worked -- rather than sit it out on the sidelines. And crucially where that pay-off was either to my character or to helping out people in my social circle, not like shoplifting. Feel free to add your own in the comments.

All of the events in this series happened during elementary school in mostly white middle-class suburbs of Columbus, Ohio between 1985 and 1992. This isn't exhaustive, just off-the-cuff. I'm not bragging here either -- everyone who grew up back then has tons of these stories, and I didn't think I was anything special for having gone through them.

- Nothing shows up from my earliest memories through pre-school. You're just not social enough at that age to care about your reputation. The most that comes to mind is going on board my dad's ship (he was in the Navy), walking up to the front, and staring over the edge -- with no railing to keep me from falling. That must have been what gave me my fear of heights.

- When I started kindergarten, my parents drove me over to the school to get familiarized before the year began. They offered to drive me there on the first day of school, but I wanted to walk there by myself (remember saying that all the time? -- "No, I wanna do it by myself!"). It was about a 5-10 minute walk, but still something I had never done alone, having been driven to and from pre-school. They consented, and off I went.

It was no big deal for the first couple blocks, right up to the place where I'd turn right and find the school another block ahead. As I approached that corner, some large thing started racing toward me so suddenly and shouting so loud that it arrested my senses. After I unfroze a second later, I beheld one of those Cujo-looking beasts that people used to keep, before switching to zippy, yip-yip pets. Next I registered that there was a fence there, and so I was probably safe to just run like hell to school instead of backing off and getting a ride from mom and dad after all.

When I told my parents what I encountered, they pressed again to drive me to school, but I stubbornly insisted on walking again. For the next 5 to 10 days, I walked on the opposite side of the street before I hit that corner, so that when I did have to turn, I'd only have to run past one side of it, avoiding the part of it on my street. That dog never let up either -- every day when I got too close, he'd rise up out of the earth like an escaped hellhound and damn near break through that shaky wire fence.

After continuing to refuse their offer for a ride, my parents gave up, and my dad figured he might as well encourage me. "Hey, you're being braaaave, Agnostic," drawing out the pronunciation the way grown-ups do when they're teaching you a new word. Around the second week, I said to hell with it, and walked along the dog's side of my street instead of hiding on the opposite side, and I stopped running once I got there. After several days of that, I could stroll right by him -- with my heart hammering against my ribcage the whole time, of course, but no longer showing any outward signs to that overgrown mutt that he was getting to me.

Toward the end of our brief stay in that neighborhood, I even started staring him down as I passed by, a practice that I still do whenever some foul-smelling growler races up and down his side of the fence when I'm out walking around. Today it would be unfair to pick on a smaller creature, but one does get tempted when some annoying, shouting fleabag needs an ego check.

These showdowns with terrifying animals are somewhat safe since there's almost always a fence separating them from you, and they aren't going to track you down later like another person might. Still, confronting the big-ass scary dog on the block was one way of preparing you for what you might face later on. With everyone owning family-friendly pets these days, that source of toughening yourself up has all but evaporated. And again, please don't try to tell me that some fake dogs bursting through the window in Resident Evil are the same as real dogs with real muscles and real teeth.

To be continued.

April 25, 2011

Vampires and sorcerers in culture: Supernatural or mundane?

Because cultures become more rational and less mystical during periods of falling violence rates, it comes as something of a surprise that the past 15 to 20 years, when crime has been falling, have seen such a huge demand for popular culture with vampires and witchcraft. On closer inspection, though, these books and movies are not the exceptions they appear to be, specifically because they do not involve the supernatural but the merely unexplained.

We already believe that human beings don't know or understand every last little thing about the universe that human beings could, in principle, know or understand. The makers of Buffy the Vampire Slayer highlight some unknown thing for their audience. The question is how this extraordinary thing is portrayed -- as something that cannot be explained by natural laws in principle, and so seems to come from some other dimension or plane of existence, or as something that can be explained in naturalistic terms but that we haven't gotten around to noticing, studying, and explaining just yet?

This gives us three classes of phenomena: the supernatural (unexplainable), the explained natural, and the unexplained natural. My focus in the falling-crime vs. rising-crime comparisons has been on the popularity of cultural works that buy into the natural vs. the supernatural. So let's have a look at the apparent exceptions and see if they invoke the supernatural or merely the unexplained natural.

Before that, though, just to make clear what I mean, consider a movie that everyone's seen that contains both supernatural as well as unexplained natural elements -- Star Wars. Since it takes place in a galaxy far, far away, we expect to see pieces of the naturalistic and explainable universe that we haven't seen before. Those include artifacts like the hyperdrive engines and the laser guns, but also living species such as the motley lot of aliens we see in Mos Eisley. Still, these are things that, if they came to Earth in the present day, we would regard as part of the natural world. The only change in our worldview upon seeing them is, "Huh, I guess the natural world has a lot more stuff in it than I thought before."

In contrast, the lightsabers, the ghosts of Yoda, Anakin, and Obi-Wan, and The Force all appear as supernatural things. Maybe some person from the natural world built those lightsabers, but only under the guidance of some supernatural force. The ghosts are not things that, if we had grown up in that time and place, we would regard as perfectly naturalistic phenomena. And what The Force is or how it works is just not something that we could know, even in principle. Of course, during falling-crime times, lunkhead Lucas changed this and gave a naturalistic explanation that was just as dorky as those science education reels that they used to show schoolchildren in the 1950s --

"Today Timmy, we're going to learn about Midichlorian concentration and Force power."

"Gee Mr. Beagley, that sure sounds neat!"

With that distinction in mind, a quick look at the blockbuster vampire movies and TV shows tells us that we've seen a surge in popularity of the unexplained natural, not the supernatural. The two big ones here are the Buffy the Vampire Slayer TV show and the Twilight movie series and books. These vampires are entirely of-this-world -- they're just part of the natural world that we hadn't noticed yet. And they're shown in the most anthropomorphic way possible, from their appearance to their daily behavior (like going to school, whining about the pretty girls ignoring you, etc.). So, they're like all other human beings except that they have an unusual talent, one that makes them extraordinary but not other-worldly.

The same is true for the two most popular series based on witchcraft and sorcery, the TV show Charmed and the Harry Potter franchise in books and movies. Sure, these witches and sorcerers are a bit eccentric -- they would definitely be part of the weird crowd in high school -- but otherwise they are entirely human and part of the larger natural world. Again, the only change in our perception is, "Huh, I guess the logical and rational world has more things in it than I was familiar with."

So, what these newly popular works show is not the realm of the supernatural but rather the entirely naturalistic goings-on of more-or-less human characters. Their creators dress them in ghostly garb only as a heavy-handed way of emphasizing their out-there behavior and weirdo status. Hence the greatest popularity among teenage girls, who feel like no one else gets their quirks, and who fear being ostracized as weird -- not who feel like their nature is not of this world.

There is a good historical precedent for this, namely the late Victorian period when vampires first burst onto the cultural scene (earlier works were ignored). I haven't read them in a very long time, but I recall Dracula and Strange Case of Dr. Jekyll and Mr. Hyde being this way too. Certainly The Picture of Dorian Gray was like this. In all, the presentation is of human characters whose entirely human quirks and flaws are symbolized in the extraordinary events involving them. The unexplained elements in these works are more metaphorical and moralizing than something to be taken seriously with dread.

Since the Victorian era was one of falling homicide rates, the social background and broader zeitgeist of those earlier books and of the current crop are more similar than most people appreciate.

In fact, Dracula did not become a scary, supernatural being until at least the later 1920s, when Bram Stoker's novel was adapted to the stage. (Search the NYT for "Dracula" before the 1920s, and you'll find only a single passing mention of Stoker's novel. Articles with "Dracula" do not start rising until the stage adaptation.) Not long after that, the 1931 movie with Bela Lugosi cemented Dracula's image as a supernatural being.

As for Dr. Jekyll and Mr. Hyde, there was a series of silent film adaptations, but the first to attract wide attention was the 1931 talkie that managed to get made before the Hays Code prudified Hollywood movies for most of the mid-20th C. None of these early versions of the movie comes across as a moralizing Victorian story that lays it on too thick with symbolism, but rather as a real portrayal of demonic possession. It's true that there is some naturalistic basis for Jekyll's transformation into Hyde in these early movies, i.e. the chemical concoction that he takes in order to change. Still, it looks like more of a magical potion that opens a portal between the natural and the supernatural, rather than a pharmaceutical whose changes work completely within the naturalistic world.

The American homicide rate went soaring at least from 1900 (when the earliest data are available) through a peak in 1933, so again we see a perfect coincidence between periods of rising violence rates and a greater belief in the supernatural. (See also the 1931 Frankenstein movie below.)

To find truly supernatural horror stories, forget the Victorians. You have to go back to the previous period of soaring violence rates, roughly 1780 to 1830 -- that is, the Romantic-Gothic period. Now those were not naturalistic monsters at all! The most enduring one created in that period is Frankenstein's, and he is not a human being who's just a bit weird and has a somewhat unusual set of abilities. And his creation is not explainable in natural terms, but is more magical or miraculous. However hammed up it may have been, Boris Karloff's performance of the monster in the 1931 movie was certainly not of a more-or-less human character begotten in a more-or-less natural way.

During the most recent wave of violence that lasted from the 1960s through the '80s, we see the same thing. To be clearer, it looks like the obsession with the supernatural catches fire particularly once the society has gone about halfway through the rise in violence rates (which we also see during the 1900-1933 wave and the 1780-1830 wave -- and the 1580 to 1630 wave that I won't get around to discussing in this post, but that Early Modern fans will recognize). We've already looked at the original Star Wars trilogy. The Indiana Jones trilogy also puts heavy emphasis on the supernatural. I haven't seen The Lost Boys in a long time, but I remember it being halfway between supernatural and unexplained natural.

Ghostbusters is an interesting case. As with Frankenstein, we shouldn't confuse the natural-vs.-supernatural question with the material-vs.-immaterial question. There's a material basis for the way that the Ghostbusters rope in the monsters, and it even seems somewhat scientific at the beginning. They speak about the composition of the ghosts and the mechanisms of their proton packs as though it were all part of where our naturalistic understanding of the world will be in the future, when we've looked harder and longer. They're like the hyperdrive engines and Mos Eisley aliens of Star Wars.

However, by the end when they face the interdimensional demon-babe Gozer and the Stay Puft Marshmallow Man, they -- and we in the audience -- feel like they're suddenly out of their league, and that putting on their scientific thinking cap ain't gonna help anymore. This is clear from the climax scene on top of the apartment building, where a portal between two planes of reality has been opened up. Clearly there is typically a closed barrier between our world and theirs. And when they decide to cross the streams, it's clear that they have no rational or scientific basis for doing so -- it's a leap-of-faith, Hail Mary pass that just so happens to work. This shift in tone from "Gee, maybe this stuff is natural after all" to "Holy shit, it's a miracle that worked!" beautifully highlights the shift in perception that some of the characters and even audience members have gone through by the end of their rite of passage from skeptic to believer in the supernatural.

To wrap things up, we see that what underlies the recurring "crises of faith" that are generations-long -- not historical hiccups -- is a plummeting of the violence rate. When all is going well, the natural world seems to be running so orderly that supernatural causes need not be invoked to explain things. When the world falls down, we are forced into humility and begin to admit that there's more to reality than what can be explained in naturalistic terms.

Remember that these increases and decreases in the violence rate must be society-wide and last for years and even decades. Otherwise we take them to be flukes or to not apply to us over here. World War II, for example, did nothing to stop the growing atheistic and Existentialist tide in the West. (Nor did September 11th reverse the secularizing trend of the '90s in America.) It took something as widespread and ever-worsening as the wave of violence of the '60s through the '80s to reinvigorate modern people's belief in the supernatural, otherworldly, mystical, etc.

Of course, now that violence rates have been dropping for nearly 20 years, more recent cultural and behavioral trends have rolled back most of that spiritual revival. But just give the world some more time, and the cycle of all these things will enter its ascending phase once more.

April 23, 2011

Taste or meaning in art?

The narratives that have survived over the centuries are those that focus more on the art of storytelling and meaningful themes, rather than those that strive to get high marks from audiences for exhibiting high taste.

Occasionally we get lucky and find works that have both (like Shakespeare's later tragedies), and sometimes things go the other way than they should (like the drama section in Barnes & Noble having all kinds of 20th C. American junk, yet no Marlowe or Webster).

On the whole, though, there is a trade-off between a more Apollonian vs. Dionysian approach to crafting the story, and it is typically narratives like the folktale of Little Red Riding Hood or the story of the crucifixion of Jesus Christ that endure over the centuries, unlike In Praise of Folly or Candide. Looking forward, it's an easy prediction that Stephen King will outlast David Foster Wallace in popularity. Going back in time, I don't even know off the top of my head who the Ancient Greek taste-displayers were because they've already been totally forgotten by everyone but specialists; meanwhile, who hasn't heard of Oedipus?

Is this merely due to vulgar preferences among the majority of the audience? No: even among people today who prize taste, most or all do not read and re-read Alexander Pope, nor watch and watch again The 400 Blows. There is evidently something fickle about the preferences for specific works within the taste-based audience -- I wonder how many still revere the movie Rushmore, made not even 13 years ago?

The key difference seems to be the level of self-consciousness of the creator and their work. When talented people display their skills, we don't like them to be so self-aware and in-control because it feels like they're just trying to show off for narrow personal gain. In contrast, going into a trance, losing self-consciousness, and receding into the background as an author, these all give us the impression that the creator cannot possibly be trying to lord their superiority over the rest of us or to profit from their display -- after all, they're not even in control of what they're doing, but rather are possessed by something and are just going along with it.

This contrast between heightened and diminished self-consciousness shows up in the "author" field of their bibliographic entries. Many of the creators of fairy tales and of the books in the Bible are unknown, whereas the guardians of taste went so far as to enshrine their self-awareness in Auteur theory.

That also accounts for the ever-shifting fashions among the taste-oriented audiences: signaling good taste typically depends on an opposition to contemporary bad taste, which won't stay the same over time. There is simply too much bad taste to be fully explored in any single time period. Just think of how few of Seinfeld's put-downs today's teenagers and college students would get, barely 15 years after its heyday, compared to the more au courant snarky remarks from the Colbert Report.

More generally, taste depends on highlighting distinctions between the creator and his taste-making competitors, including those who are long dead -- "my taste is so much greater than his." Thus novelty per se is sought after. Meaning-makers, on the other hand, all see themselves as improvising their own variation on the same core of underlying themes. So novelty is perfectly allowed, provided it adds something meaningful to the storytelling. Because novelty is not desired in itself, good examples tend not to get forgotten. Nor is it devalued per se, which keeps the dead hand of history from freezing the basic narrative in identical form to be mindlessly repeated generation after generation.

The greater attractiveness of skill displays that lack self-consciousness applies much more broadly than just art creation, but those other case studies can wait for another time.

April 21, 2011

Only herdsmen have heroes, Disney edition

Later on I'll write up something more substantial on the mythologies of the world that do or do not feature what Joseph Campbell called the monomyth, or the fundamental hero-quest that many popular legends derive from. The short and skinny of it is that it's mostly or only found in cultures where a decent fraction of people, not necessarily everyone, were animal herders rather than hunter-gatherers, farmers, or slash-and-burn gardeners. That future post will explain why.

For now, though, it's enough to take a look at Disney's attempts to make Americans enjoy folk legends from around the world. The first was The Jungle Book, which shows a hunter boy, although because he's raised by and hunts with a pack of wolves, he's more properly put into the category of wild wolfmen from the mythology of Proto-Indo-Europeans, a group of nomadic pastoralists who are among the most successful at having spread their narratives across the globe.

Aladdin is based on a Near or Middle Eastern folktale and set either there or in Central Asia, hotbeds of herding for millennia. Farther back in time, the Near East saw the ancient Semites spread their stories throughout the region and then later the world. Sometimes this wave of mythogenesis met with the Proto-Indo-European wave from the north, giving rise to hybrids like Zoroastrian and Ancient Greek legends, one of which was made into Hercules. However, the Semitic herdsmen would have been successful even with their less hybridized systems -- the three monotheistic religions of Judaism, Christianity, and Islam. Aside from Hamlet, the main influences on The Lion King were the stories of Joseph and Moses -- and Moses was not just any old hero of a pastoralist tribe but a shepherd himself.

Pocahontas has little or nothing to do with the historical Native American figure, but is instead a garden-variety fairy tale like you'd find back among the descendants of Proto-Indo-Europeans in Europe. The Hunchback of Notre Dame doesn't have a non-European hero, but there is a Gypsy babe. While Gypsies aren't herders, they are close enough -- they are nomadic, carry personal wealth and have some stratification (unlike hunter-gatherers), have a culture of honor among males and of chastity among females, are naturally gifted at music and singing, and a good fraction of the girls are of the pretty, charming milkmaid type.

What about the uber-farming society of China? Doesn't Mulan go against the claim? The first clue that Mulan is not Chinese is that she's riding a horse, and just to emphasize her non-Han status she's shown with a bow and arrow. The historical figure that she's based on, Hua Mulan, came from a Turkic-speaking, nomadic pastoralist region in the north of China, not from the Mandarin-speaking rice farmers.

Rounding out the Disney Renaissance is Tarzan, perhaps the only exception in that it idealizes a hunter in a not-so-pastoralist region of Africa. Still, his eagerness to go through the adolescent rites of passage to become a warrior -- unlike the self-effacing, easy-going hunters typical of foraging groups -- makes his character more like a Maasai, Nuer, or Dinka herdsman.

Like Pocahontas, The Emperor's New Groove is just a European fairy tale set in a foreign land, not borrowing Inca mythology as the basis of the story. The same goes for The Princess and the Frog. Brother Bear doesn't look like it has the monomyth in its narrative.

Starting in the 1990s, Disney went all multi-culti and tried to incorporate the stories from all the world's cultures into their movies. Unfortunately, most people's mythologies are boring -- interesting to study if you're an academic, but not the kind whose dramatic tension and compelling storytelling draw the audience in. Only groups where a good deal of the members are herders seem to be good at that, so even Disney's successes drawn from truly non-European sources turn out to be from pastoralist cultures.

If they didn't mind their movies lacking a hero's adventure, then they could have drawn from all sorts of other mythologies, whether from advanced agrarian societies like Ancient Egypt or the Aztecs or China, from small-scale gardening cultures like those of sub-Saharan Africa or the Pacific Islands, or from hunter-gatherer groups like the Bushmen. But if it's heroic excitement you're looking for, you'll have to turn to the wide-roaming, cattle-rustling cowboy bands.

April 20, 2011

Rap before hip-hop: Self-deprecating songs

Continuing the theme from an earlier post about when rap music had only occasional rather than constant name-dropping of expensive brands, whatever happened to carefree and self-deprecating rap songs?

It seems like ever since the crime rate began falling, turning our culture more superficial and based on trivial one-upsmanship, people have been less able to have a good laugh at how bad their life can get sometimes, while still holding out hope that their luck will turn around. (And no, Seinfeld etc. only gave off mock self-deprecation.) The rap world is just a microcosm of the larger trend toward either overly exaggerated bragging about how awesome you are, or else wallowing in self-pity.

It's hard to imagine in the falling-crime era of bragging about your rims and other bling you've used to pimp your ride, but rappers used to have a sense of humor about the less-than-impressive cars they drove. Here's "My Hooptie" by Sir Mix-a-Lot from 1989, three years before "Baby Got Back." That year also saw the popular re-release of "Girls Ain't Nothing But Trouble" by DJ Jazzy Jeff & The Fresh Prince, a lighthearted but not surrendering take on all the crazy shit that girls put guys through. I haven't listened to their first major albums since checking them out from the library over 20 years ago, but I recall a lot of their stuff being goofy like that, much like the Fresh Prince of Bel-Air show on TV.

Since I never got really into rap, I can't remember others off the top of my head, but I'm sure there must be some by the pre-gangsta rappers. The last one that sticks in memory is easily the best:



Positive K, "I Got a Man" (1992)

April 19, 2011

Snapshots from wilder times

I wish I had access to all the pictures back home so I could begin to provide some visual documentation of the major differences between falling and rising-crime times, aside from the obvious ways like how scary strangers in cities used to look. Luckily one of my brothers just scanned some random photos from a while back.

During the shift toward pacification and infantilization over the past 15 to 20 years, we've lost sight of the good that came along with the bad in more dangerous times (such as, for one thing, the opposite of infantilization -- growing up faster). Someone needs to preserve that in the collective memory, especially for the mid-'70s through the early '90s because most of the Baby Boomers assume that, for example, children riding in the bed of a pickup truck (no car seats, no seatbelts) stopped in the 1960s, whereas in reality it continued right through the 1980s and early '90s. * They were too old to be in touch with what young people were going through then, and so have no direct memory of how kids were as wild or wilder than they were in the '60s and early '70s.

I've only got two pictures, but they do relate to topics I've yakked about before. First, a reminder of how teenagers over the past 20 years have voluntarily dropped out of the freedom-for-youth culture ("fight for your right to party"):


Helicopter parents wanting to keep their kids out of the driver's seat are one side of this, but the kids themselves bear some blame too for not finding a way around their parents' prohibitions like young people used to do when disobedience came naturally.

Now only about 40% of 16-17 year-olds have a license, which means that most of them have probably had no real driving experience at all. Contrast that with this scene from -- I'm not sure, but it looks like -- 1993 or '94:


Those are my twin brothers, either 11 or maybe 12 years old at the time. I got to drive the tractor around age 10 also, several years earlier. I'm not sure whether I took the picture or my mother did; she would've thought nothing of pre-pubescent kids driving a motor vehicle around my grandparents' back yard. As I recall, we got to turn the ignition, while my grandfather put it into gear, and then we got as much experience as we wanted using the steering wheel and giving it a little gas. I don't think it went over 10 mph, but we didn't care -- it was more about being in control and roaming wherever we wanted, rather than just where our parents would take us.

Before, I detailed my summer job at age 10, a now-extinct practice in an age when parents push their children to remain frozen in a pre-rite-of-passage state until middle age. My brother too had something of a job at that age, although he didn't put as much sweat into it as I did mine. Still, here's proof that as late as 1993, 10 year-old kids tried to enter the grown-up world as soon as possible:


This was at a small baseball card convention held in a local church basement. He also sold them out of his room, which he advertised with a sign in his street-facing window. He's on the left, and that's me (age 12) helping him hold down the fort on the right. You hardly even see kids selling lemonade by the roadside anymore, or setting up their own garage sale to clear out old toys and books and other junk. He even had his own cash register, just like the ones that grown-up businesses used, because he didn't want the experience to be kiddie and fake.

The falling-crime times of the mid-'30s through the late '50s could never have produced a fifth-grade entrepreneur like Alex P. Keaton, notwithstanding the nonsense on all sides about how that character was supposed to represent a throw-back to that era. In reality, as reflected in A Christmas Story, Brighton Beach Memoirs, Radio Days, and Leave It To Beaver, kids of the mid-Depression through The Fifties were sheltered dweebs whose helicopter parents would never have approved of them joining the adult world so fast. The same was true during the falling-crime times of the Victorian era, famous for busybody adults crusading to end child labor so that kids could learn all day long (yay).

So, whatever bad there may be when the world gets more dangerous, it also causes people to mature -- it puts the whole society to a test. It's a bit unreal to think that at the tail-end of rising-crime times, my brothers and I were more grown-up at ages 10 or 12 than the average college freshman is today, who's probably never had a paying job, driven a car (especially without a seatbelt on), or played his first game of "I'll show you mine if you show me yours."

It will be fun and interesting to see what happens when the Millennials are the ones in charge of running everything in 20 to 30 years.

* And regarding vehicle-related craziness in fiction, Footloose blew Rebel Without a Cause out of the water -- Ariel straddling two racing cars, one of which was headed toward a semi; the game of chicken with tractors; Ariel's train-dodging stunt; and the narration of a game of "highway tag" that ended with both cars tumbling over a bridge, killing Ariel's brother, and sparking the recent crackdown on wild youth behavior.

April 18, 2011

Burying the burial

We've all noticed more cremations in real life (whether first-hand or from reports), as well as more references to them in popular culture (such as the end of The Big Lebowski). I thought at first that this was a feature just of the '90s and 2000s, along with all sorts of other New Age-y and sacrilegious crap that can only flourish during a falling-crime era when people become more superficial and complacent, as during the earlier falling-crime times of Renaissance Humanism, the Age of Reason, and the Victorian era.

I've shown before that people's thought drifts in more mystical directions during times of increasing violence rates, and they are more concerned with the supernatural, the sacred, and so on. But some trends toward profanation are too strong for even a wave of violence to shove back in a more sacred direction:




The North American data show an exponential increase regardless of whether it was a rising- or falling-crime era, and the UK data don't reflect any of the crime waves either, rising logistically from the end of the 19th C. (the growth was ever-increasing even from the beginning to 1930, which looks flat on the overall scale). * Since the UK has been secularizing for much longer and at a much greater intensity than North America has, it's no surprise that they've already reached about saturation level, while we still have a ways to go, especially in America. See this map of US states by cremation rates, which looks very much like a map of traditional vs. progressive values.

Because the trends over time are so widespread and go back over 100 years, we can throw out all sorts of desperate rationalizations that the body-burners grasp at when a press article is written. The trend began long before funerals began to cost an arm and a leg, for example. I could only find data for the US, but here is the median cost of basic funeral services in inflation-adjusted dollars, not including the cemetery plot and marker/monument costs. **


There's only been one period of price increases, from 1980 to 1995; prices were flat or falling outside that window, whereas the cremation rate has been soaring the whole time. Adding in the cost of the tombstone isn't going to make the picture look different, since you can always opt for something cheap.

I don't know how much a plot of land costs, but I doubt that plays a large role because the (Spearman rank) correlation between cremation rates and (log of) population density for US states is only -0.099, so that how spread-out or packed-in people are explains merely 1% of the differences across states in how typical cremation is. Plus it's in the wrong direction according to the land-deprivation view, which predicts that the denser a state is peopled, the more they should opt to cremate instead of take up scarce real estate with burial plots.

Even worse, (log of) a state's land area is positively correlated with cremation rates, rho = +0.244, explaining 6% of the variation. The silly materialist view that the rise of cremation reflects changing availability of land would predict that states with more land should cremate less, while tinier states deprived of virgin ground should cremate more. In fact, it's just the opposite, although again the relationship is weak.

(Scatter-plots of cremation rates vs. land area or population density are not shown, but there's no visible non-linear pattern there either.)
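(For anyone who wants to replicate this kind of check, the arithmetic is simple enough to sketch in a few lines of Python. The numbers below are invented for illustration -- the actual state-level cremation and census figures aren't reproduced here -- but the rho-squared step shows where the "merely 1%" and "6% of the variation" figures come from.)

```python
import math

def rank(xs):
    # Rank values 1..n (this toy data has no ties)
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0] * len(xs)
    for pos, i in enumerate(order, start=1):
        ranks[i] = pos
    return ranks

def spearman(xs, ys):
    # Spearman's rho via 1 - 6*sum(d^2) / (n*(n^2 - 1)), valid when there are no ties
    rx, ry = rank(xs), rank(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical cremation rates (%) and population densities (people per sq. mi.)
cremation = [55.0, 20.0, 65.0, 30.0, 40.0]
density = [1.3, 9.8, 110.0, 460.0, 1020.0]
log_density = [math.log(d) for d in density]  # log transform leaves the ranks unchanged

rho = spearman(cremation, log_density)
print(round(rho, 3), round(rho ** 2, 3))  # rho, and share of variation "explained"
```

Squaring the rank correlation is the key step: a rho of -0.099 squares to about 0.0098, i.e. roughly 1% of the variation, which is why such a correlation counts as negligible.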

You might as well make a memento mori moment out of this and draw up your will so that you can declare your wishes to be buried like Homo sapiens has been for hundreds of thousands of years. Otherwise they might just go with the norm, which is increasingly toward blasting your remains to smithereens. In some cultures (e.g. India or pagan Europe), cremation is more personal and honorable than it is in the modern West, since at least it's done in the open air and the audience sees the body set ablaze, rather than gathering only after it has been pulverized and sheltered in a clean, pretty decoration piece.

If anyone doubts whether or not the cremated remains are less sacred than buried remains (whether embalmed or not), just do a little thought experiment. (If you're a social scientist, you could ask this of subjects and publish the results.) Take the exact same person who's died, and imagine collecting their bones into a container, taking them out to a variety of places that the deceased really treasured, and strewing the bones about the places so that they could enjoy their most beloved haunts for eternity. Worse, suppose you didn't even wait for the meat and slime to be removed or waste away from the bones, but just threw fleshy body parts here and there around their favorite garden, favorite pond, etc.

Yet people feel that it's hardly sacrilege at all to do this with the cremated remains of that same person. It follows that they perceive cremated remains as far less sacred of an object than remains that are naturally decaying, embalmed, mummified, or that have left only the bones. Modern people feel cripplingly uncomfortable allowing the sacred to enter their lives -- that's just for backwards, superstitious yokels -- so they're increasingly unwilling to face the dead in a respectful way. They prefer instead to come away with remains that have only a glint of sacred power, and unlike other cremation cultures, they don't even want to witness the burning take place. They contract it out to some bunch of fee-takers who do it alone and behind closed doors.

I'm not going to dissect, in this post anyway, the psychology of why some forms of dealing with the dead are more honorable than others (e.g., are audience members close to or far away from the dead, do they bear a heavy or light load when carrying the dead, etc.), or the sociology of why some groups have more honorable funeral rites than others (e.g., a more tightly knit vs. loose social network). It's enough to say that we're headed in a really despicable direction.

And if the UK data are any guide, even if this trend will eventually reverse, it will probably take several hundred to a thousand years. There was a period in China from roughly 1000 to 1700 AD when the Confucian insistence on burial was relaxed, and cremation rose in popularity. Other episodes of the rise-and-fall of cremation also look like they took hundreds of years before people began respecting the dead again.

* North American data are from the Cremation Association of North America. They are annual and go through 2009, although 2008 is missing for the US, and for Canada the years 2001-03 and 2005-08 are missing. The UK data are from the Cremation Society of G.B. and were taken at 5-year intervals from 1890 to 2005 (plus 2008). Leaving out Ireland and Scotland, which have lower cremation rates, hardly alters the picture because the populations of England and Wales swamp the others. The shape of the curve is the same, only it plateaus at a slightly lower level.

** These data are from the National Funeral Director Association's National Price List Survey. They are at generally 5-year intervals from 1960 to 2009.

April 16, 2011

Today's music lovers are teenage girls and middle-aged men

A couple days ago a girl I was chatting with was deploring the glut of dopey culture that's churned out by industries catering to adolescents, such as the Twilight movies and books as well as Justin Bieber's music. She said she'd noticed most of the bad music videos on YouTube were most popular with "the tween generation;" she herself is in her early 20s. I never knew that you could find that info out on YouTube, so I did a little investigating.

First, YouTube only tracks popularity among demographic groups if there is an official music video at the band's official channel or the official channel of their record label. This makes it impossible to study popularity for anything before MTV began in the early 1980s, and for any band who never made it big enough to have their own YouTube channel as of today. But for the videos it does track, it lists the top three demographic groups who watch it, split by male vs. female and by age group -- 13-17, 18-24, 25-34, and so on by 10-year blocks after that.
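(If it helps to picture the data: each video effectively comes with a top-three list of sex-by-age segments, which you can tally across videos. Here's a toy sketch in Python with made-up entries shaped like what I saw -- not pulled from YouTube's actual reporting:)

```python
from collections import Counter

# Each video's top-three audience segments, as (sex, age bracket) pairs.
# These entries are hypothetical, shaped like the patterns described above.
videos = {
    "Like a Virgin": [("F", "13-17"), ("M", "35-44"), ("M", "45-54")],
    "Rio":           [("F", "35-44"), ("M", "35-44"), ("M", "45-54")],
    "Baby":          [("F", "13-17"), ("M", "13-17"), ("M", "35-44")],
}

# Tally which segments keep showing up in the top three across videos
tally = Counter(segment for top3 in videos.values() for segment in top3)
for segment, count in tally.most_common(3):
    print(segment, count)
```

Running a tally like this over a few dozen videos is all the "investigating" amounts to: the same two or three segments dominate nearly every list.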

Well, she was right that Justin Bieber's and Lady Gaga's videos do draw the 13-17 year-old female crowd, but it turns out that so does every other kind of music, whether present or past, and whether good or bad. I didn't compile a complete list of the roughly 30-40 videos that I checked out, but the take-home message is that if females show up in the top three audiences, it is almost always 13-17 year-old girls. It's almost never 18-24 year-olds (the only exceptions I found were T.I.'s "Whatever You Like" and Britney Spears' "...Baby One More Time").

I found no video where 25-34 year-old females were among the biggest fans. Females aged 35-44 and, more rarely, 45-54 did occasionally make it, but it was only for older and less iconic hits -- ones that were #1s when they were teenagers, but that haven't lasted in popularity as well as other #1s from the same era. For example, "Like a Virgin" has one female audience in the top three spots -- 13-17 year-olds -- because it's remained popular ever since it came out, so that young girls today will have heard of it and look it up on YouTube. However, "Hungry Like the Wolf" and "Rio" are not as recognizable today, and the only female group who really digs these are 35-44 year-olds.

For male audiences, it's just the opposite. It's almost always 35-44 and 45-54 year-olds (i.e. the so-called younger Boomers plus Generation X) who make up the bulk of the video's viewers. For videos with no female audiences in the top three, 25-34 year-old males show up, although this is less common because that third spot usually goes to females 13-17 or females 35-44. I found no cases of 18-24 year-old males making it into one of the big three audiences, and only two cases where 13-17 year-old males did -- for "Baby" and "One Time" by Justin Bieber. Although even for those two, you wonder whether those young guys aren't just going there to vote thumbs-down on the video over and over, since the two videos have over 1 million and over 200,000 dislikes, respectively, in contrast to most videos, which have hardly any dislikes because only people who enjoy a song search it out and watch it.

To give you an idea of the strange new world of music culture that we have today, here are just a dozen songs, out of many more, whose videos are most popular among females 13-17 in all cases, males 35-44 in all cases, males 25-34 in the first 2 cases, and males 45-54 in the other 10 cases. I list the older ones just to show that this holds true for songs that were hits before the 13-17 year-old girls were even born. They also represent a pretty broad spectrum of pop music.

"It's Tricky" by Run DMC
"Don't Cry" by Guns N' Roses
"Sweet Child o' Mine" by Guns N' Roses
"You Give Love a Bad Name" by Bon Jovi
"Livin' on a Prayer" by Bon Jovi
"Like a Virgin" by Madonna
"Like a Prayer" by Madonna
"Billie Jean" by Michael Jackson
"Time After Time" by Cyndi Lauper
"Friday I'm in Love" by The Cure
"Every Breath You Take" by The Police
"Sweet Dreams (Are Made of This)" by Eurythmics

Males in the 13-17 and 18-24 groups (i.e. Millennials), and to a lesser but still large degree in the 25-34 age group, have totally abandoned whatever interest in music they may have had, or just never got into it in the first place. Even though the average pop song is only 3 or 4 minutes long, that is still too much time to re-allocate away from leveling up their dorky video game characters and ambushing 11 year-old boys in Grand Theft Auto IV online multiplayer. Since watching a video on YouTube is completely free, and only costs you your time, this shows that young males aren't interested at all -- at least if they weren't buying albums, maybe they'd still be illegally downloading them or scrounging around YouTube. But no, they just don't care about music anymore.

I don't know what to make of the continued fascination with music among teenage girls. I keep emphasizing how much less wild young people have gotten over the past 20 years, so here's a partial exception. They haven't dropped out and joined the church of video games or anything else like that. This shouldn't be too surprising, since I've shown earlier, using data from IMDb, that under-18 girls are the most likely to dig Alfred Hitchcock, to love good horror movies, and to hate dopey chick flicks. I also showed that females become boring faster as they age compared to men, looking at movie preferences.

Although I didn't check for age differences within the sexes, I did find that girls are more likely than boys to love coming-of-age movies, even if the stories were only about boys. My hunch there was that modern society protects innocent people from physical harm and danger, so that unless there's a surge in the crime rate like from 1959-1992, young males have no way to feel like they're needed and appreciated by, e.g., serving their role as protectors and fighters. Young girls, however, are not protected in modern society because their threats are not physical violence but spreading gossip, ostracism, etc. So during adolescence they still feel just about as under-siege by enemies as young girls have felt for most of history.

So if you want to talk to younger people about music -- not teaching, I mean just shooting the bull -- you're more or less out of luck in talking to boys, unless you go some place like a used record store where young guys work and are hired on the basis of being interested in music. Choose teenage girls instead, who still respond to music. Hey, there are worse ways to spend your time than getting lost in a conversation about Dionysian topics with an opening flower of a girl.

April 15, 2011

Last girl-crazy song by a male singer?

Why not follow up the post below on when we saw the last boy-crazy song by a female singer, sticking again to the #1s on the Billboard Hot 100, and using the same criteria to distinguish them from sappy or cutesy "Gee I love you so much" songs. There has to be something about his vulnerability, inability to shut off how he feels, and the sense that it could consume him.

It looks like they start a few years earlier than the female ones do -- like 1957 or '58 ("All I Have To Do Is Dream"), as opposed to 1960 for females. It also looks like the better examples end earlier than they do for females. As late as 1989, there was still "Like a Prayer" and "Eternal Flame," but the male songs from that year aren't up to that level. "She Drives Me Crazy" is a fun one, but nothing super special. The #1s from 1988 have several great girl-crazy songs -- "The Way You Make Me Feel", "Need You Tonight," and "Sweet Child o' Mine" -- but those are all from albums released in 1987. I don't see "Every Rose Has Its Thorn" (from 1988 and '89) as at the same level as "Sweet Child o' Mine".

So maybe it was males who began switching off the wild lobe of their brains first, and females followed suit?

At any rate, 1993 had UB40's cover of an Elvis song, "I Can't Help Falling In Love With You." The last decent girl-crazy song must have been either that or "How Do You Talk to an Angel" from 1992... yep, there were pretty slim pickin's in the '90s. It wasn't until 2007 when Sean Kingston laid new lyrics over the music to "Stand By Me" that there was another decent such song, "Beautiful Girls."

That doesn't go against my idea about rising vs. falling violence levels as the shaping force behind these kinds of songs: Sean Kingston is Jamaican, and while their homicide rates rose like ours did from the early 1960s, they didn't turn back down during the '90s like ours did. In fact, the increase looks even sharper starting around 1990, and that's continued at least through 2007.

Who would win in a fight?

More proof of how obsessed the male mind is with competition:

April 14, 2011

Last boy-crazy hit song by a female singer?

Let's stick just with the #1 songs on the Billboard Hot 100, to make sure we're talking about songs that were popular. Also, here are the criteria for counting a song as boy-crazy, as above a mere love song:

- Expresses some kind of yearning for the guy (whether unrequited, celebration of current love, or a torch song, doesn't matter).

- Singer can't help the way she feels: it's involuntary, won't go away even if she wishes it could, and in general lacks control over her feelings. This must hold for both the lyrics as well as the emotional expression in her voice.

- Some sense that if she doesn't get him, she'll explode or wither away.

They don't start until about 1960, about the time the crime rate started climbing. And clearly 1989 was still chock full of these songs, and had the last of the great ones -- "Like a Prayer." By going to the bottom of the previous link, you can navigate around the #1s from various years. Just eye-balling it, I see a sharp drop-off even beginning in 1990. Remembering that social trust levels fall before the crime rate falls, it looks like boy-craziness is more driven by how much girls trust boys, and less by the crime rate per se, which peaked later in 1992 (except to the extent that these are closely related). Still, that's just from eye-balling; I don't know a good number of the songs, so I'd want to get more quantitative before settling on that conclusion.

There's "I Will Always Love You" in 1992, I guess. No later than 1997 the counter-revolution against boy-craziness had triumphed, as shown by the chart-topping "Wannabe" (that annoying song by the Spice Girls). It's an anthem for emotionally in-control, cold-blooded, pushy, laundry-list-scribbling harpies. That's something else to bear in mind when we look at the trend -- not just adding up all the boy-crazy songs, but subtracting points for ewww-boys-yucky-shoo songs like that one, or vain and emotionally dead ones like "London Bridge" by Fergie.

In 1999, we got "...Baby One More Time," not a very danceable song like the earlier ones were, but still something that could've been a late '80s head-over-heels radio hit. That looks like the last one. "Genie in a Bottle" by Christina Aguilera falls short because it's too much in the Spice Girls direction of, well, bottling up her emotions and issuing a vague list of do's and don'ts to potential suitors. Beyonce's song "Crazy in Love" looks like one on paper, but it doesn't make the cut because she's faking it on the vocals; she doesn't sound hopeless. Same with "Promiscuous" by Nelly Furtado, whose voice sounds more mercenary. And Katy Perry's voice is hollow no matter what she's singing about.

While it's clear that there was a die-off around the early '90s or just after, what about isolated cases since Britney Spears' 1999 hit? Some of those didn't ring a bell, and I don't feel like YouTubing them all right now. Again, just looking at #1s on the Hot 100, to make the same comparison across time.

When did rappers start dropping brand names?

Once upon a time, rappers did not talk about what kind of shoes they wore, or their hats, watches, sunglasses, etc. Or the cars they drove, the exotic fur their coat was made out of, etc. At least in the past 10 years, it's become standard to work Louis Vuitton, chinchilla, bla bla bla into their lyrics. I don't recall hearing that in the early '90s, like with MC Hammer or Sir Mix-a-Lot or LL Cool J or whoever. Even the early gangsta rap stuff, up through about 1994, didn't focus on the pursuit of material things -- just rollin' down the street, smokin' indo, sippin' on gin and juice.

So it was sometime during the '90s, no surprise -- that's when the whole culture, whites and blacks alike, started getting materialistic and devoting their lives to petty status contests. But someone who knows rap music better than I do, when did this start?

April 13, 2011

Infantilized playgrounds

Sometime later I'll write up a more in-depth history of the movement to prevent kids from having fun on playgrounds, or even having recess at school, which started when the violence level began plummeting after 1992.

But in the meantime, here are some pictures of what a contemporary playground looks like, for those who haven't been to one in awhile. My poor nephew who I took here had no monkey bars, tire swing, merry-go-round, or teeter-totter to horse around on, these subversive devices having been more or less banned by the government, under pressure from the helicopter parent majority.

The first major addition from when I was little is one of those annoying follow-the-experts labels telling you the age range your kid must fall within to read the book, play with the toy, or even use the playground. (Click to enlarge)

In reality, this playground was designed for children who are negative two years old.

But really, you can't trust the idiot parents to follow your initial warning about how the playground must be used, so you have to hammer them over the head with even more graphic warning signs:


I guess a skull and crossbones would have been too frightening. And anyway, as a kid I choked to death when my drawstring caught on the slide, and I turned out all right.

You can guess what the rest of the crap looks like if you've ever passed by a park in the last 15-20 years, although the new-and-improved swings really deserve a closer look. Better start converting the bike racks to stroller stations at all the country's high schools.

April 12, 2011

Video game culture during wilder vs. safer times

For me video games have never been a central part of what was going on culturally, but since they've grown so much in popularity that they've eclipsed music as the main thing that young people are fascinated by, I guess it's worth a quick look at how different the world of video games was in rising-crime times, using four of the recurring themes I've noted before as a guide.

First, games were a lot harder, and it had nothing to do with technical limitations. Rather, it reflected a demand-side desire for being put to a challenge and honing skills instead of waltzing through a game and not needing skills at all. You died easily and often in games up through the early '90s. Even by the mid-'90s, games like Super Metroid had all kinds of health refills that enemies dropped, plus rooms that totally refilled your health. Not to mention the direction that Castlevania games took with Symphony of the Night, where a save room, which totally refills your health, is never more than a few screens away. By now, these so-called realistic first-person shooter games allow you to recover your health automatically as long as you hide in a corner like a little girl and don't get shot any further.

How do I know this change is due to the wussification of audience demand, and not some technological improvement on the supply side? Because it doesn't take much to implement the "automatically heal shortly after being hurt" feature -- more than one health state (i.e., not just dead or alive, but degrees of alive), a means of hurting you (like enemy contact), a means of increasing health (like power-ups, as they used to be called), and an in-game timer (not necessarily shown on the screen).

Super Mario Bros., the first game that anyone played for the Nintendo way back in 1986, had all of these things. When you're big Mario or fire Mario, a hit from an enemy shrinks you back to little Mario, and there's a timer that could count 5 or 10 seconds after you got hit, and then the game could restore your status to big or fire Mario. Why didn't they? Because that wouldn't be a game anymore -- no more than if, five minutes after being scored on in football, the refs automatically took back those 6 or 7 points and "healed" the scored-on team to healthy status. This point generalizes to all older games with the necessary features (Legend of Zelda, Ninja Gaiden, Castlevania, etc. etc.), so clearly it was something that customers just did not want. They wanted a challenge.
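(To see just how little machinery the feature requires, here's a toy sketch -- hypothetical names, nothing from any real game's code -- of the "heal a few seconds after the last hit" mechanic built from exactly those ingredients: multiple health states, a way to get hurt, and an in-game timer:)

```python
HEAL_DELAY = 5.0  # hypothetical: seconds without a hit before health regenerates

class Player:
    MAX_HEALTH = 3  # degrees of alive, not just dead-or-alive

    def __init__(self):
        self.health = self.MAX_HEALTH
        self.time_since_hit = 0.0

    def take_hit(self):
        # a means of hurting you (e.g. enemy contact)
        self.health -= 1
        self.time_since_hit = 0.0

    def update(self, dt):
        # the in-game timer: go long enough without being hit and the
        # game quietly hands your health back
        self.time_since_hit += dt
        if self.health < self.MAX_HEALTH and self.time_since_hit >= HEAL_DELAY:
            self.health += 1
            self.time_since_hit = 0.0

p = Player()
p.take_hit()     # big Mario shrinks, so to speak
p.update(5.0)    # cower in a corner for five seconds...
print(p.health)  # ...and the hit is quietly undone (back to 3)
```

Nothing here was beyond a mid-'80s console; the ingredients were all sitting there in Super Mario Bros., which is the point.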

When the world is getting more violent, you feel more of an impulse to better yourself, whereas during safer times you don't feel like the wimpier enemies out there are worth training so hard for. (See also the change in movies about dorks or outcasts -- before they struggled to improve themselves, like in Weird Science or The Karate Kid, whereas now they are content to stay dorks, like in American Pie and Harold and Kumar.)

Second, playing games was much more social -- well, it was social at all, as opposed to not at all. Again this is not due to technical differences, like the availability of games that you can play with someone else online. The key here is the mid and late '90s, after the culture became much safer and everyone secluded themselves, but before online video games were at all common. Those were the dark ages for playing games with another person: arcades were just about dead by then, and you rarely went over to a friend's house just to play video games, at least compared to the early '90s and earlier. Since there was virtually no online game-playing at the time, it could not have caused a die-off in social interaction among game players. Rather, this was just one piece of the larger trend toward everyone locking themselves inside and hardly traveling to less familiar spaces, especially public ones like an arcade.

Third, playing games was a lot more cooperative, not merely social. A social interaction could be competitive, after all. Although there were a handful of player-vs.-player games during the Atari and Nintendo days, these were mostly sports rip-offs (Tecmo Bowl, Bases Loaded, Blades of Steel, etc.) that did not occupy a very large share of our hours playing games. And no games in the arcade were like that, aside from the despised Karate Champ. Instead, games where two or more people could play were cooperative -- all of you would team up in order to take on a common enemy, usually some criminal organization or race of aliens who threatened to take over your city or the world.

These are by far the ones that got the most attention, generated the most enthusiasm, and are most fondly remembered today -- Contra, Double Dragon (and its sequel), Streets of Rage, Golden Axe, X-Men, Simpsons, T2, and on and on. Granted, the feeling of togetherness lasted only as long as your common enemy did, so after the game was done, you and your partners went your separate ways and didn't care what happened to each other. But that's still far more than can be said for today's social interactions among game players -- basically a bunch of 13 year-olds cursing each other out over the internet.

Things started going downhill in '91-'92 when Street Fighter and Mortal Kombat became smash hits and nearly single-handedly destroyed the cooperative mode of game-playing. These games pitted one player against the other in a fighting match, a genre that went on to dominate the rest of the '90s. By 1997, GoldenEye had turned the first-person shooter genre into one primarily based on player-vs.-player gameplay, rather than teaming up to fight a common enemy, as it could easily have gone if these games had been introduced in the 1980s. (I'm sure a video game version of Red Dawn would've been better than the movie.) I know that these games do allow for teaming up of players, but overwhelmingly they are played in me-against-you form.

Tellingly, there's an arcade game that looks superficially like Street Fighter or Mortal Kombat, but that came out in 1990 before the violence level hit its peak -- Pit-Fighter. It digitized live actors for its visuals, just like Mortal Kombat, and it involves nothing more complicated than beating the shit out of others, while a crowd cheers you on. However, unlike the direction that developed during the '90s, here if there are two or more people playing, it's them vs. a set of common enemies. (And since 1990 was still part of the '80s, there are no butt-kicking babes as there would be in later fighting games, just a couple of psycho metal chicks.)

Rising-crime times cause people to band together to take on the common threat that looks more and more like it won't go away on its own, and that is too strong for an individual to scare away. So there's a stronger feeling of community. Falling-crime times cause people to loosen those ties, since there's no longer a need for them. So there's a stronger culture of one-upmanship, like Stuff White People Like.

Fourth, and related to the previous two, video game players were more altruistic toward each other, especially strangers. When violence is soaring, you not only have to band together to take it on, but be prepared to give things up in order to better your team-mates, not as though you were just a bunch of mercenaries. The real test of altruism is giving up something to benefit strangers, and of course that mostly took place in arcades, which died off when people stopped trusting strangers. I'm not an online game player, but from what I've observed of my brother, online game players aren't sending each other money through PayPal or something so that all involved can keep playing together. I know that YouTube celebrities get sent random gifts from their fans, but that's not what I mean -- first, because hardly anyone is that famous, and second because that's a hierarchical form of the groupies giving to the stars, not equals on a team giving up something to benefit the others.

During those cooperative games, sometimes you would have brought more quarters with you than your partner. So, even if they were as good as or better than you skill-wise, they still might be kept from playing further just because of lower wealth at the outset. Well, that's not right, you thought, so you gave them a quarter and maybe another and maybe another still, just to keep them in the game with you. Sure, you got some small benefit to yourself from that -- namely, someone to help you get through the rest of the game -- but it was a net cost to you and net benefit to them.

In fact, I remember at least two times in elementary school when I purposefully brought five dollars or more in quarters to an arcade so that I could recruit someone to help me finally beat a game that I (or anyone else) could never beat alone. "Hey man, I got ten bucks in quarters -- five for me, five for you -- you wanna try to beat Final Fight?" That was a total stranger, a boy about my age. Another time I recruited some friends and acquaintances during recess to go beat Ninja Turtles at the bowling alley after school got out, and that would've been impossible if you tried by yourself. That time it was more like $20 split four ways, since that game allowed four people to play simultaneously.

It was very costly for a kid to do, and I didn't get a huge benefit from doing it, which is why I only did this twice (that I remember). But in those days you were more willing to take a hit so that others could join in and have fun, especially if you wound up kicking ass on some game that had forever thwarted your isolated attempts to beat it. I didn't make a big show about it or lord it over them, played down their thanks, and never expected anything in return, let alone asked for it. It was one of those rank-leveling things you have to do if you want a small team to help you out like that; it has to be egalitarian.

Most histories of video games are narrowly focused either on the technological changes or the rise and fall of various businesses. I haven't seen many social histories -- or even one, that I can think of anyway. Probably because the typical video game addict is the doesn't-relate-to-people systematizer, making the technology and market share data more appealing to them. Still, anyone born between about 1975 and 1984 lived through most of these major changes in the social aspects of video games, and could likely write a good history just off the top of their head. Something more extensive and in-depth than what I've done here, where I've only stuck to the aspects that relate to my larger interest in how social life and culture change when the violence level is rising or falling.

April 10, 2011

They had better hedonistic music too

The post below on spiritual music rising during times of increasing violence serves to correct the lame spiel we always hear about how materialistic and bla bla bla the culture became, starting sometime during the 1970s and really soaring during the 1980s.

At the same time, as hinted at with the "1999" reference, they did do celebration-of-the-flesh music a lot better too -- hell, in all media. When you perceive the order of the universe crumbling all around, you've got those two options.

Here's a #1 hit from 1978 that unfortunately has been eclipsed by a lot of the cornier-sounding disco music that topped the charts (although disco overall is fun, especially Chic). Even among a broad audience of people who weren't even alive then, or were children, it seems like Blondie and The Ramones have gotten their due, while The Bee Gees have fallen more by the wayside. So maybe some of the lesser known '70s classic rock hits will find a new following too.



Nothing wrong with the artier rock of Talking Heads from that era, but music plays so many more roles than tickling the brain. Sometimes you just want to silence the self-conscious part of your mind and take in a sky-opening let's-go-do-it song. You won't find that among the spotlight-soaking skanks of the past 15 to 20 years.

Red Dawn

Saw it for movie night, and it was almost unwatchable. The fact that, at worst, it gets a pass from conservative "critics," and at best draws praise -- including a spot on the National Review Online's top 25 conservative movies -- is a good reminder of why you should ignore most politically oriented culture reviewers.

Since it's not really worth an in-depth response, here are some quick thoughts on why it's such a dud:

- There's no character development before the Communists land, which happens almost right away, so we find it almost impossible to feel the tension that the students must be feeling. They've given us no one to identify with. Same when the two brothers see their father behind a concentration camp fence and he screams at them to "Avenge me," which rings hollow since we have no connection to him (or them) at that point. They needed to include at least 30 minutes of background, maybe 45 given how many characters there are. All the great It's-Us-or-Them action movies let us get to know the characters first, so that when the shit hits the fan, we feel like the enemy is attacking one of our own. Predator, Aliens, The Terminator, RoboCop, really any decent movie.

- They don't set up key plot points beforehand, so that they come not as a surprise but as something unbelievable and forced, such as the two girly girls being able to mow down crowds of trained killers with heavy artillery. Establish that first by, for example, showing them kicking ass on a bully who didn't know how tough they were, or fighting off a group of guys who are forcefully making unwanted passes at them. Showing Marian Ravenwood taking care of business at the beginning of Raiders of the Lost Ark, or Ripley in Aliens blasting the suited bureaucrats and then operating heavy machinery with no problem, makes their later feats of strength more believable.

- A big part of the movie is supposed to be the coming of age for these high schoolers. They did get some of it right by showing their physical removal from the ordinary world, living in the mountains, during their transformation. But they failed to show any of the necessary breaking-down in order to be built back up that you see in, e.g., the boot camp portion of Full Metal Jacket, Luke Skywalker's ordeals while training with Yoda, or Bill Murray's repeated failures-along-the-way in Groundhog Day.

- Related to that, they don't emphasize the leveling of distinctions and ranks that existed in the normal world but that are supposed to dissolve within the separate sphere where they undergo the transformation. They do take the wind out of the sails of the class president, but that's about it. More shots of their bonding rituals would have made us believe that they were a unified group, such as the boys from Stand By Me pinky swearing, taking turns at the night watch, removing leeches from one another, flipping coins instead of using force or peer pressure to pick someone for grunt work, etc. Or even something as simple as showing the Ghostbusters putting on the same uniforms, sliding down the same firehouse pole, and switching on each other's proton packs right before battle.

- It doesn't make for a good Us or Them movie when the only character who's fleshed out in a human way is supposed to be one of the bad guys, namely the Latin American leader who becomes disillusioned over the course of the siege and ultimately decides to leave the revolution to return to his wife, even refraining from firing on two of the Wolverines when they're sitting ducks.

There's probably a lot more to mention, but those are just the first thoughts that come to mind. Other movies from the mind of John Milius, like Dirty Harry and Conan the Barbarian, are great to watch because they don't make any of these mistakes. The only people who'll really enjoy Red Dawn are those who approach movies at a meta level, like "Oh yeah, they finally made an unapologetic picture about kicking the Commies' asses!" -- ignoring the fact that it's a poorly thought-out and executed movie. If you're looking for a pound-the-Soviets flick whose makers had decent storytelling skills, stick with Rocky IV.

April 8, 2011

Spiritual pop music

No, I'm not talking about Christian rock or undiluted gospel. Something about searching for the meaning of life, and in particular wanting to connect with the supernatural -- not "finding yourself" by landing the right study-abroad program or scoring the perfect unpaid summer internship.

My hunch is that this kind of music will soar in production and popularity when the violence level is shooting up, because that's when people get more curious about the spiritual, the supernatural, the mythological, the religious... whatever you want to call it. Especially during the second half of the climb in the crime rate, since that's when things start to look apocalyptic -- there's already been half a generation or more of steadily worsening security, and the experts have thrown every social engineering program at it, yet come up with nothing powerful. For similar reasons, it will not tend to adhere to orthodox religion but be less clearly defined, since the failure of the old ways means they must try to figure it out as they go along.

Probably the best example is a #1 hit from 1986, "Higher Love" by Steve Winwood. It couldn't have been made in 1966 because society had only gone somewhat outta whack by that point -- the decay of order got a lot worse in the next 20 years. Typical of an end-times yearning for community, he levels distinctions that exist in the ordinary, falling-apart world by giving it a heavily African sound for white pop music, and I'll bet a lot of people listening to it thought he was black.

Real Life made "Send Me an Angel" in 1983, which is Pagan in tone, and whose video has a pre-Christian but still Indo-European religious feel. Toward the tail-end of wild times (1990) they released "God Tonight", which sounds like the thoughts of a cult leader mixed in with a good dance track.

Belinda Carlisle's "Heaven is a Place on Earth" from 1987 is a bit more focused on the profane than the others, but its power still depends on the desire to bring the supernatural realm down onto our own. Again shades of cult / commune-like concepts of ushering in paradise in this world, although more of the naive free love type than the apocalyptic type.

Of course there was Madonna's last great hit, 1989's "Like a Prayer". The phrases "I want to take you there" and "in the midnight hour" hint at going to some other plane of existence.

"I Still Haven't Found What I'm Looking For" by U2 in 1987 has a nice line that highlights the leveling of distinctions when the end comes: "I believe in the Kingdom Come / Then all the colors will bleed into one."

Camper Van Beethoven's "She Divines Water" from 1988 shows that even college rock, typically not religious at all, could put out a catchy song with an other-worldly feel. Probably the most imaginative and original one of the bunch here.

The 1986 soundtrack for Labyrinth features David Bowie calling all the misfits of the world into joining his utopian cult in "Underground," which has a gospel section later in the song.

"Personal Jesus" from 1990 by Depeche Mode is about a lost soul seeking communion with the divine, although told not from their point-of-view. "Reach out and touch faith" means that the supernatural has become tangible.

There are other songs that are quasi-religious but that do not stress the meeting of the natural and the supernatural. "Man in the Mirror" by Michael Jackson from 1987 is a straightforward conversion-of-the-rich-miser story. Although more difficult to interpret religiously, Madonna's 1984 hit "Like a Virgin" has that "I was lost but now am found" theme, and her conversion sounds like it sprang from some magical rather than mundane cause. In 1981 Orchestral Manoeuvres in the Dark made not one but two hagiographic songs about the same saint -- "Joan of Arc" and "Maid of Orleans". The song and album "1999" by Prince from 1982 is apocalyptic, but his response is to retreat into a not very sacred kind of hedonism.

Nothing else in the yearning-for-the-supernatural genre comes to mind, although I could be missing some stuff from the later part of the '70s, when cults and evangelism were ramping up in popularity compared to the '60s. I'm pretty sure there's little or none from the '60s and earlier '70s, nor from the early '90s through today. And during the previous period of falling crime, the mid-'30s through the late '50s, the hit songs featured more trivial subject matter and a secular frame-of-mind, however catchy it may or may not sound (Frank Sinatra, Ella Fitzgerald, Bobby Darin, etc.).

April 6, 2011

Classic car culture in red states and blue states

Earlier I yakked on about how the frontier spirit still persists in the mountain west and plains states, and perhaps Tennessee too, not just in the overall level of solidarity but also in how much people preserve and treasure the great accomplishments of white pop / folk culture.

Andrew Gelman and co-authors showed in Red State, Blue State, Rich State, Poor State that the recent war between liberals and conservatives is absent among the working class, present among the middle class, but most pronounced in the upper-middle and upper classes. One example he gives is that rich Texans might drive a Hummer, whereas rich Californians might drive a Prius. Working-class people in either state don't drive the kind of car that "tells the world something about your personality and lifestyle." Whatever gets them to work is OK. The battle over which kinds of cars are best is mostly a middle and upper-class affair.

You also see this for cars that are very far from new. I was near the Wheeling, West Virginia metro area for a family reunion of sorts over the weekend, and we drove through lots of the smaller towns around there, as well as the more urban parts. Not once did I see anything that you'd call a classic car. Not being driven around, not resting parked in a garage, not waiting to be scooped up from a used car lot. People did own older cars, and dealers were selling them used, but they were all fat, shapeless '90s cars, along with a handful of bloated space pod cars from the 2000s.

Driving around town out here in the mountain west, it's hard not to catch sight of the classics just going about your daily business, not even trying to seek them out. A muffler shop has only a half-dozen used cars sitting out front, but they're all 300ZX's, plus a 280ZX right next to the curb to draw in the passersby. At a nearby small market, there's usually a silver Mark II Supra that gives your eyes something sweet to taste while you're stopped at the light. Then walking to campus, I see a '68 Chevelle parked on the street only three spaces away from an '85 Fiero. Later in the day, a banana yellow 2nd-generation Firebird cruises across in front of me while I'm stopped at a light. And on any warm, sunny day all you have to do is find a 6-lane road to see all of the Impalas (mostly 4th and 5th-gen) being let out of the garage. Just a couple weeks ago in front of a low-rent apartment building with nary an organic boutique in sight, there was parked a glistening white Corvette with a little blue trim, one of the last of the stingray-shaped ones (looked like late '70s).

And that's not even to mention all of the lesser but still pleasant filler cars strewn around -- Volkswagen Rabbits and Beetles, Saab 900s, etc.

Obviously working-class people are not keeping and maintaining these cars; it's middle-class or above, as they're something of a luxury now. Yet people with enough money in the blue-state areas of eastern Ohio, West Virginia, and the DC suburbs in Maryland couldn't care less about keeping alive the age when cars still looked and felt like cars. (During an admittedly brief visit to my brother in Los Angeles, I hardly saw any classics either, other than a Mexican dude in a beat-up RX-7. Everything was a nearly brand-new Mercedes, BMW, Lexus, bla bla bla, just like they say.)

Using Google Trends to see which states top the list of searches for things related to classic cars, the same pattern emerges as I mentioned in the frontier post -- mountain west, plains, the red-state parts of Appalachia, and to some extent the Ozarks. It has nothing to do with climate, since California doesn't dominate the lists as it should if that were the case, while snowy Utah and Colorado (but not snowy Vermont or West Virginia) show up pretty reliably.

It's just part of the larger cultural differences between middle and upper-class people in red states, where they're more about preserving the best of tradition, and blue states, where they are spurred by a reflex to always be "moving beyond" whatever dopey stuff a bunch of dead people once invented.

April 5, 2011

Violent times breed cuter girls

Last year I looked at how many heartthrob girls there are during rising-crime times vs. the near total lack of them during falling-crime times. I didn't mention it then, but the same applies to the earlier wave of violence up and down -- lots of babes during the Jazz Age vs. the mostly frigid and mercenary women of film noir's heyday.

That post focused just on the demand side -- what proved popular with guys. But there seems to be a supply-side story as well. If it were only due to changes in demand, then I'd be surrounded by boy-crazy and emotionally powerless Mallory Keaton types, who were for whatever reason just not getting any attention at the local or national level. And I obviously do not live in that world, but instead in one filled with pushy and emotionally in-control gals, plus the femme fatale wannabes like Fergie, the Pussycat Dolls, etc.

So aside from whatever's driving the change in male demand for this or that type of girl, what's behind the shift in production of one or the other type of female?

When violence levels begin to soar, females start to solicit friendship and more from a wider variety of males -- not just relying on their brothers and other kin for protection, who have a genetic interest in helping them out. How do you convince a more-or-less genetic stranger to attach himself close enough to you that he'll take on the role of protector and/or provider? Well, by being cute. Guys are more likely to take care of a girl if she seems youthfully innocent of the real world, while also being bouncy and carefree enough that she might accidentally wander into trouble. Otherwise, why would she need a protector at all? I think it's mostly the latter that gets dialed up in chick physiology during violent times -- the main contrast with girls during safe times isn't projecting an air of innocence vs. experience but more like fun-loving vs. playing-it-safe.

I don't know if I've discussed it before, but the whole "only marry a foreign woman" thing boils down to women who grew up in (and probably still live in) rising-crime environments vs. those who didn't. American men were head-over-heels for American girls right up through the 1980s; the only-marry-foreign movement is new. And it's not just any old foreign country -- it's specifically the countries of Eastern Europe and parts of Latin America that have always been super-violent, and even more so in recent decades when ongoing internal war ripped those regions apart. It's not peoples who are safe and stable by second or third-world standards, such as Native Americans, Eskimos, Bushmen, Egyptians, etc.

True, guys who are serious about the only-marry-foreign strategy warn about mercenary women in Eastern Europe and Latin America, but the stronger and more basic selling point for them is how feminine the women are.

The argument here is an extension of primatologist / sociobiologist Sarah Hrdy's idea about why human infants are so cute compared to those of most other primate species, and even of adult humans too. Human infants are reared partly by people who are not as closely related as a mother or her kin, whereas most primate infants are reared by the mother. Because human babies have to stay on the good side of people who don't have a genetic interest in their well-being, they have evolved the weapon of cuteness to pierce through the tough skin of the genetic stranger. Chimp or gorilla infants don't rely on unrelated individuals, so they pay no cost for being so ugly. And of course that dependency on unrelated people lasts through adulthood for humans, perhaps explaining why we're so neotenous, or resembling babies, in our appearance. *

In a post below I pointed out that the falling-crime phenomenon of helicopter parenting is a move away from human beings' natural state, at the center of which is a deep sociability, and toward the pattern of most other primates, where there's a lot more selfishness. Let's now add to that a more boner-killing demeanor among females, not unlike the ruthless she-chimps. (The falling-crime surge in popularity of MILFs and cougars is another example of a reversion to chimp-like norms.)

* The other exceptions to the primate rule are the callitrichids, who are cooperative breeders. Also unlike most primates, they're pretty cute-looking, and their name means "beautiful-haired" in Greek.