April 9, 2013

The good old days in one picture


Let me count the ways...

- Tons of people outside in a public place.

- No cocooning gizmos (iPod, cell phone, etc.).

- Kid allowed to hang out in public with no shirt on.

- Chick with no bra on, and it's neither shameful nor attention-whoring because nobody notices (including her).

- Wholesome daisy dukes too.

- Sports / athletics part of everyday life, and they're fun rather than a chore.

- Three apparently unrelated children, and not a parent in sight.

- And no helmets, pads, etc.

- Teenage girl mock-flirting with pre-adolescent boys to help them grow to feel comfortable around girls.

- Whites and blacks hanging out in the same space, not a black ghetto / SWPL refuge configuration. Mostly due to whites having more backbone and keeping real-life blacks under closer watch, rather than abandon their space altogether.

- No sunscreen.

- Clothes that fit the body, not Victorian trash-bags.

- No slobs or obese people, and no overly fussy or anorexic people either.

...what else did I leave out?

April 8, 2013

Doom vs. GoldenEye in broader perspective

You can learn a lot about social and cultural change from looking at popular entertainment. It takes place within the larger social context, and typically is part of a more general cultural zeitgeist. Few observers or historians include video games within their review of popular / mass culture, even though they're (sadly) bigger than TV and movies for young people, and video game nerds themselves tend to either get all gushy or vitriolic in their reviews, taking things personally rather than charting the course of history.

An earlier post took a look at how shifts in both the visual culture and social interaction were reflected in the dominance of fighting games in the style of Mortal Kombat over the earlier Street Fighter. This post will run a similar comparison between first-person shooter games that shows the same overall social and cultural changes. The most successful early first-person shooter games were Doom and Doom II (1993 and '94), before GoldenEye 007 and its imitators took over (1997).

First, though, it's worth noting that the very popularity of first-person shooter games signals a shift away from sociability, just as the rise of player-vs.-player fighting games did. Throughout the '80s and early '90s, games where you beat up other people had the player beat up characters controlled by the computer, and if another person joined in, they teamed up with the first person to take on the computer. Player-vs.-player fighting games pitted two people against each other.

For games where you shoot people up the whole time, the norm during the '80s and early '90s had you shoot lots of characters controlled by the computer, and if another person joined in, they teamed up with the first person against the computer. This included games like Contra, Ikari Warriors, and Operation Wolf. First-person shooter games grew to focus primarily on player-vs.-player gameplay, where two or more people try to shoot each other up.

When the Doom games came out, there was some interest in shooting up other players, though most people found that boring and wanted to take on the far greater number of enemies in the normal one-player mode. Still, these were the first to introduce player-vs.-player, so even by 1993 the shift away from team gameplay had already begun.

This trend continued with the Quake series, the next series from the developers of Doom. By the time GoldenEye and the similar Perfect Dark took over, I'd say most people chose player-vs.-player if there was someone else in the room to play against. Once home consoles allowed players to shoot each other up over an internet connection in the 2000s, this became by far the most common way that people played first-person shooters.

We can see other major behavioral changes reflected in the replacement of Doom style games with GoldenEye style games, such as the shift toward OCD, collecting/hoarding, and joyless treadmill progress toward checking off 100% of the boxes in a checklist.

In Doom, your only goal in a stage was to reach the exit. In GoldenEye, you were given a list of specific objectives to carry out before reaching the exit. More, you were given the option of three levels of difficulty, so you could complete three different checklists per stage. In Doom, you only had one player you could play as. In GoldenEye, you were encouraged to meet certain characters in the one-player mode so that you could play as those new characters in the player-vs.-player mode. In Doom, special abilities were gotten by simply typing in a cheat code. In GoldenEye, such abilities could only be had by performing certain objectives in a stage, usually under a time limit. Often this meant replaying the same stage over and over, going through the exact same motions, hoping to shave off a few seconds here or there. The now common treadmill practice of "unlocking achievements" began with GoldenEye.

The trend toward over-the-top extreme-ness shows up as well. You only kill aliens and monsters in Doom, although they do look gory when they're dead. GoldenEye has you killing people, and there are far more opportunities for sadistic humiliation. For example, once they're dead, you can shoot them in the head, sending it jerking back while a pool of blood pours out. Although the entire dead body did not yet resemble a rag doll in this way (as it would come to), you could still pump the corpse full of lead, making it bleed.

You could also torture the enemies to death by targeting a limb and watching them hobble around in pain and frustration, before tagging them again, until they finally took enough hits and keeled over. Players even fucked around with unarmed bystander characters in that way. It was even easier to pull this off when you targeted them through a sniper rifle at a safe distance where they couldn't see you or shoot back.

These features -- torture, sadism, overkill, voyeurism -- recall the unwholesome nature of mid-century comic books, which as a medium video games have largely taken the place of. While they do not lead to higher crime rates, they do warp the minds of young people and encourage them to pretend they're a sick badass when in reality they're just some sheltered dork. None of these features were included in the Doom games.

Another staple of lurid mid-century comics shows up in the GoldenEye and Perfect Dark style games, which is absent in the Doom games -- butt-kicking babe characters, both ones you can play as and ones you play against. I forgot to mention that in the Mortal Kombat vs. Street Fighter post, but the shift occurs there as well. There's only one girl character in Street Fighter, but several in Mortal Kombat II and later fighting games that offer shower nozzle masturbation material for nerds.

Sportsmanship has fallen off a cliff during the Millennial era, and that's very clear in the course of first-person shooter games. Since most people didn't play Doom as a player-vs.-player, there wasn't much room for poor sportsmanship. But when people played GoldenEye in player-vs.-player mode, few to none of them respected any kind of ground rules.

The most common form of acting like a wuss was preying on another player who had just re-spawned after dying. When you start out, you have little or no firepower, no shield, armor, or other type of defense, so that if someone else who's spent time collecting those things feels like notching an easy kill, they'll find someone who's just re-spawned. There are only a handful of locations where a player can re-spawn, so it's not hard to hang around them and mow down more-or-less hopeless adversaries. The now common cowardice of "spawn-camping" began with GoldenEye.

You could also booby-trap items that other players would want to pick up, simply by placing a proximity mine nearby. That way, when someone else goes to increase their defense by picking up a vest of body armor, they're blown to pieces instead. Whoever put it there would then cackle like a jackass. They'd also put proximity mines next to the re-spawn points, so that you wouldn't even have a chance to play that life. Right when you re-spawned, you got blown up.

Finally, moving on to the graphics, you see the same change in the overall visual culture reflected in this increasingly popular video game genre. Some of the major changes are explored here. In general, the Doom games are closer to the more striking '80s visual culture, and the GoldenEye type games to the duller '90s and 2000s visual culture.

GoldenEye has minimal contrast in lighting and looks too dark (ex), little variety of color, let alone combining hot and cold hues (ex), occasionally features exotic locations but never the fantastic (ex), the characters are all the same scale (ex of the largest enemy, a tall person), the colors are washed-out (ex), and the environment and architecture have no repeated design motifs (ex). This has to be one of the worst-looking video games ever made, certainly among those that were blockbusters.

The Doom games are hardly spectacular in the character graphics, although some of the environments give a halfway decent impression of an apocalyptic painting by John Martin (ex). You can find light-dark contrast both within the matte-style backgrounds (ex) and within the playable areas (ex). Color variety isn't so great, but there are frequent combinations of blue and red, for both the setting and the characters (ex). The imagery is a mix of familiar (ex) and exotic/fantastic (ex). Enemies range from the size of a human skull to three or four times human scale (ex). Colors are not very saturated, but not washed-out either. The environments show repeated design motifs whether the surface is some weird alien thing (ex) or stone (ex). Simple things like signs of masonry or veins over a natural surface make the Doom environments hold your interest more than the flat and homogeneous surfaces that make up the environments in GoldenEye.

I don't think I'll do another extended comparison since the iconic video game genres of the '90s and 21st century were player-vs.-player fighting games and first-person shooters, respectively. So as far as looking at video games to see how broadly the social and cultural changes have reached, that seems to cover the big picture.

April 7, 2013

The Breakfast Club, and growth from cathartic interactions

Just saw this movie again for the first time since high school. Now that it's been nearly 30 years since it was filmed, what jumps out at you while viewing it isn't the quotable dialogue, but how much social relations have changed among young people in the meantime.

It would be impossible to imagine remaking this movie with Millennials in the 21st century because they're so avoidant, whether of the fearful/mousy type or the dismissive/"I love haters" type. Generation X was actually willing to open up and just be themselves, not constantly monitoring their self-image and only letting others meet their PR representative. And they're not throwing themselves in front of someone else to divulge their darkest secrets -- secrets just come out spontaneously, in the natural course of normal people wanting to figure each other out.

Young people were also more comfortable with taking risks in the social world, since the characters come from all walks of teenage life. Social circles are so tiny these days that even college students, who are free from direct parental supervision, hardly know who people from another group are, let alone mix it up with them every once in a while, like at a party. Now it's only small house parties, where the attendees can be guaranteed that no unknowns will enter into the social formula.

Adolescents are always worried about whether they enjoy the approval of their peers, and movies like The Breakfast Club make it palpable how cathartic it is to socialize with others outside of your own narrow little group.

If you are a total cocooner, you only have your own self-biased views to consult regarding your own worth. Deep down everyone is aware of the impossibility of evaluating yourself honestly, so this quickly leads to self-doubt and then OCD thoughts and behavior to try to cope with the anxiety. Keep checking the likes on your Facebook posts, keep looking up from your "work" at the library to check if anyone else will return your eye contact, keep checking the mirror to see if your pecs are ripped enough, or if your hair is straightened enough, for you to be seen out in public.

One step above that is if you have a social circle but they're all close to being your identical clones. Again, deep down everyone knows that in this case their approving feedback doesn't amount to much more than your own.

You don't have to constantly mingle with people from backgrounds quite different from your own, but when you do, their approval feels a lot more honest and meaningful. The more independent data points you can collect, the more robust the results are. It's such a relief to the teenage mind -- like, "Phew, glad to know it's not just me and my echo chamber who think I'm all right."

The only clear exception to the current trend of avoidance and anxiety is when young people gauge how good-looking or ugly they are. You don't need to socialize, open your mouth, or even stand near someone else. Just parade yourself around and tally up what percent of all eyeballs turn your way. Attractiveness is the only dimension where adolescents know how they measure up these days. It's depressing to the ugly because they have nothing else, that they're aware of, that could make up for their looks. And it's stunting for the attractive because that's all they have to solidly base their identity on, and by becoming full of themselves, they ruin their good looks with a repulsive personality.

And of course, in order to make any real contact across such wide social distances, you have to give them a sign of good faith -- you have to open up yourself, and let them open up too, without weirding them out. You make that leap of faith and realize that other people (typically) are not going to take advantage of your openness and try to humiliate you then and perhaps later on as well.

These sorts of cathartic social interactions strengthen your character the way that a good burst of intense, challenging activity strengthens the body. It's remarkable how grown-up the teenagers of the 1980s were, whether in real life or in fiction, though not so surprising when you take into account how much they were willing to put themselves through -- not out of masochism or recklessness (well, not always), but out of a normal willingness to take a risk in order to enjoy a reward.

The result is no less pleasing for the audience. Firing up a teen classic from the '80s is such a breath of fresh air because at last you can watch teenagers that you don't feel like choking the life out of. And despite the occasional heavy or sentimental moments, even more lightheartedness and cheerfulness shows through, like the scene where all five of the high schoolers in detention are dancing carefree around the library to loud new wave music.

As the '90s wore on, that free-wheeling resilience began to wane, visible already in the 1994 cult classic TV show My So-Called Life, the closest imitation of a Breakfast Club-like social milieu for the Alternative age. The show had its upbeat and laugh-out-loud moments, but was more mopey and grungy than a John Hughes movie.

More importantly, the social range has been severely restricted, with nobody from the various popular / preppie / jock groups represented among the main characters. The protagonist's best friend is the new Ally Sheedy character, her nerdy and depressed neighbor is the new Anthony Michael Hall character, and her crush is the new Judd Nelson character. But she herself, while pretty, is socially awkward and unknown, unlike the popular Molly Ringwald character. And Emilio Estevez's jock character has been totally replaced by a Puerto Rican faggot who's always on the verge of a nervous breakdown.

Like, "Look at us, we're all such weird outcasts…" In The Breakfast Club, the message was not that everyone was a weird outcast in their own way, but that there was more than meets the eye to people from all walks of life, popular and dorky alike. In this age of diversity worship, it's striking how narrow the range of characters is in fiction (and among real-life youthful conformists). It's all variations on the weirdo theme, and if anyone from the rest of the spectrum is included, the audience won't allow the writers to make them sympathetic.

So if you haven't seen the classics in a while because you think you've already enjoyed them enough, pop them in again and enjoy them as a welcome relief from the current state of young people, where your options are limited to mousy and haughty.

April 6, 2013

Poetic meter and the breakdown of the bicameral mind

People who have not yet undergone the psychological revolution of self-awareness / introspection are so surprisingly different from us. This change unfolded sometime during the 1st millennium BC, perhaps unevenly across places, though becoming unmistakable by about the middle of the millennium.

Earlier we saw that before self-awareness hit mankind, women's clothing left the breasts visible. We associate that with primitives in National Geographic, not with the great civilizations of the Bronze Age. But in many ways, those people were still psychological primitives.

From a totally different domain of human life, there's the strange fact that "poetry" did not use meter during the 2nd millennium BC. I put "poetry" in quote marks since I take it to mean something musical and visceral, hence driven by a rhythm or regular beat. Without it, there can still be prose that is lofty in tone, repetitive in word or phrase structure, and ornamented in figures of speech -- but it would still be prose without something special in its prosody.

And just look at the range of languages spoken in major civilizations that were literate and numerate, that produced literary as well as oral texts. Sumerian, the oldest written language, is unrelated to all others, and it didn't use meter. Neither did Akkadian, the oldest recorded Semitic language. Probably not for Ancient Egyptian either, which is distantly related to the Semitic languages by being part of the larger Afro-Asiatic family. [1] No luck with Hurrian either, yet another language completely unrelated to the others, and spoken in the Mitanni state. [2]

What about something Indo-European? Those guys were born bards, so surely they must have. Nope, not them either. The oldest attested Indo-European language is Hittite, and their literature didn't use meter. Actually, there's one section of four verses with a regular meter within a larger work -- and that's it. It's more like a brief shift to song-singing mode for whatever purpose, and then returning to (elevated, repetitive, ornamented) prose.

The Rigveda dates to somewhere in the 2nd millennium BC as well, and it does show meter, although since it wasn't written down for so long, I'm not sure how strong the evidence is that the use of meter goes back to the original of 1400 BC (or whenever). Also, it is not a single over-arching work but a vast collection of hymns. So I'm sure for shorter-length compositions, especially in a musical context, meter was used. But not to unify a larger composition. Because the Rigveda is a vast collection, it does not have a meter that's used for just about all of it. Different sections have their own, chosen from a wide variety of options, especially among the older hymns. Even within such a section, there's plenty of wiggle room around what the rhythm ought to be. [3]

Mycenaean Greek, the oldest written form of the language, has left no record of its poetry or even "poetry" to study, so we can't say for certain that meter wasn't used. But given how much the Mycenaeans psychologically and culturally resemble other groups from before the dawn of self-awareness, I'm pretty sure they didn't use meter in extensive works either.

Meter as a core, pervasive element doesn't really show up until the epics of Homer in the 8th century BC and the Indo-Aryan epic the Mahabharata, from around the middle of the 1st millennium BC. The oldest Chinese poems, from the Classic of Poetry, are in a regular meter and come from the first third of the 1st millennium BC. We can't compare that to earlier stages, as these are the oldest in Chinese. The central books of the Old Testament, also from the mid-1st millennium BC, don't show as much concern with meter as the Indo-European epics, but still more so than their Semitic literary ancestors.

Why the sudden preoccupation with meter, at least relative to the status quo ante? Well, now that people are more self-monitoring, they need new tricks that will lull them into a more spaced-out mindset, so that they can get fully absorbed in the tale-telling, whether as a performer or an audience member. Following along to a regular beat is one of the most widespread solutions. There's some external pulse, not coming from within your own mind, and following along takes your attention off of your own internal state.

You see that also in contexts of physical group-bonding, like drilling in sports or the military, or group dancing. See McNeill's Keeping Together in Time. Not surprisingly, the mid-1st millennium BC seems to be the starting point for the tradition of military drill, marching in cadence, and so on, from Sun Tzu's China to Classical-era Sparta. People who are not primarily outward-focused need an activity like marching in step to lull the self-aware part of their brain to sleep, and let the animal-like decision-making program take over.

The extensive use of poetic meter would then fall under the range of new tricks that Julian Jaynes saw as attempts to get us back into a bicameral mindset once we'd left that way of thinking and acting. For example, before, people feel some kind of hunch about how to proceed, and go with it, like an animal does; after, they look inside themselves and find uncertain answers, so they turn to increasingly baroque methods of divination. [4]

So, while bicameral or pre-self-aware people may have had little use for poetic meter, all of us since then need it to get us in the mood, to dial down the spotlight of self-monitoring so that we can get absorbed in the experience. It's like how tapping along to a beat can somewhat take your mind off things, though in a more subtle way so that it doesn't call so much attention to itself as a stylistic device. After all, what would be the point if each swing of the hypnotist's pendulum kept smacking you right on the nose?

[1] From Foster's Ancient Egyptian Literature: An Anthology (excerpted here):

The major remaining gap in our knowledge of ancient Egyptian poetics concerns prosody. [...] we do not know for sure if it was composed of feet or if it employed some freer means of determining accents and stresses. I would suggest the verse line was analogous to the free verse of Walt Whitman or the modernist American poets. In fact, I think the stylistic texture or flavor of ancient Egyptian poetry can best be described as a fusion of the free-verse rhythms of those poets just mentioned with the rhetorical and structural regularities—the strict attention to patterns of likeness and difference—of Alexander Pope's eighteenth-century heroic couplets (without the end-rhyme or meter).
[2] For Sumerian, Akkadian, Hurrian, and Hittite, see A Companion to Ancient Epic.

[3] From Arnold's Vedic Metre in Its Historical Development (here):

There are few parts of the verse in which the poets do not consider themselves free at times to depart from the usual rhythms, so that it may perhaps be said that there are no 'rules' of rhythm in the Rigveda. On the other hand, there is no considerable part of the verse in which certain rhythms are not steadily favoured, and others avoided: everywhere there exist metrical preferences.
[4] Jaynes suggested that the hunch we used to feel was experienced as an auditory hallucination, a command from our own personal god that we obeyed. Most people who either get really drug-trippily into his work, or those who dismiss it without reading it or reading little of it, seem to latch on to this admittedly wackier part of the entire argument, which is about the shift toward psychological self-awareness.

April 4, 2013

Many more differences between light-skinned and dark-skinned blacks

Earlier we saw that light-skins are smarter by about 8 IQ points. Let's see what other differences there are. I'm lumping the lighter half of the skin tones together, and the darker half of the tones together.

Years of education -- no difference (13 years each on average).

Average income -- about $23,100 for light-skins vs. about $17,800 for dark-skins. Since their education levels are the same, it must be the greater smarts of the light-skins that earns them more money.

Political views -- liberals are 44% of light-skins but only 29% of dark-skins, moderates are 40% vs. 46%, and conservatives are 17% yet 25%. (Numbers rounded, don't spazz.)

Tough-on-crime views -- 63% of light-skins vs. 76% of dark-skins think the government isn't doing enough to halt the rising crime rate (it's not rising, but you know what they mean).

Afraid to walk their neighborhood at night -- 47% of light-skins vs. 36% of dark-skins. Unfortunately I can't control for urban / suburban / rural setting because they didn't ask that question in 2012.

So light-skins are more liberal, softer on crime, yet still more afraid to walk their streets at night, which probably aren't as dangerous given that they are higher-income. Maybe they're more fearful of the same threats that dark-skins face, or maybe they feel like they'll be singled out more as targets by the mostly black criminals in their area, among whom lighter skin is seen as a signal of weakness.

Gun ownership -- 24% of light-skins vs. 16% of dark-skins (by household). Not related to hunting, as 0% of light-skins hunt vs. 4% of dark-skins. We already saw that light-skins are more afraid of their neighborhood, so this is for self-defense against their darker-skinned neighbors.

Church attendance -- rare attenders are 34% of light-skins vs. 23% of dark-skins, occasional attenders are 35% and 34%, while frequent attenders are 32% yet 43%. *

Sure of God's existence -- 76% of light-skins but 85% of dark-skins.

Bible is the word of God -- 53% of light-skins but 64% of dark-skins. Those who think it's a book of fables are 14% of light-skins but only 6% of dark-skins.

Homosexual sex is always wrong -- 57% of light-skins vs. 64% of dark-skins.

Let gays marry -- 46% of light-skins and 42% of dark-skins agree. However, more dark-skins take a negative stand -- 45% of them disagree vs. only 33% of light-skins.

Reproductive success -- light-skinned women average about 2.0 children vs. 2.7 for dark-skinned women. Childless women make up 24% of light-skinned women, but only 14% of dark-skinned women.

Promiscuity -- equal numbers for those who had 0 partners in the past year, and for 1 partner in the past year. Dark-skins more likely to have 2, but light-skins are more likely to have 3+ partners -- 14% vs. 8% for dark-skins.

...Obviously a lot more could be looked at and reported, but I'm going to cut it here for length. This may wind up being an ongoing investigation. Overall it looks like light-skinned blacks are between whites and dark-skinned blacks on just about any trait. I didn't see where whites stood on these traits, to see whether light-skins are closer to whites than to dark-skins, which was the case for IQ. Maybe that'll be the next entry. Too bad the GSS didn't ask if you drive a Prius, shop at Whole Foods, or sport an ironic moustache.

* Rare = never, less than once a year, once a year. Occasional = several times a year, once a month, 2 to 3 times per month. Frequent = nearly every week, every week, more than once a week.

GSS variables used: race, ratetone, educ, realinc, polviews, natcrime, fear, owngun, hunt, attend, god, bible, homosex, marhomo, childs, partners
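
The group comparisons above can be sketched in code. Here's a minimal Python illustration of the approach, assuming a GSS extract with the ratetone (interviewer-rated skin tone, lighter to darker) and realinc variables; the cutpoint for "lighter half" vs. "darker half" and the toy rows below are assumptions for illustration, not the actual data:

```python
# Sketch of the light/dark split used in the comparisons above.
# ratetone and realinc are real GSS mnemonics; the cutpoint of 5
# and the toy respondent rows are made up for illustration only.

def split_by_tone(rows, cut=5):
    """Split respondents into the lighter half (ratetone <= cut)
    and the darker half (ratetone > cut) of the skin-tone scale."""
    light = [r for r in rows if r["ratetone"] <= cut]
    dark = [r for r in rows if r["ratetone"] > cut]
    return light, dark

def group_mean(rows, var):
    """Average of a numeric variable over a group, skipping missing values."""
    vals = [r[var] for r in rows if r.get(var) is not None]
    return sum(vals) / len(vals)

# Made-up stand-in for a real GSS extract
toy = [
    {"ratetone": 2, "realinc": 25000},
    {"ratetone": 3, "realinc": 22000},
    {"ratetone": 4, "realinc": 23000},
    {"ratetone": 7, "realinc": 18000},
    {"ratetone": 8, "realinc": 17000},
    {"ratetone": 9, "realinc": 18500},
]
light, dark = split_by_tone(toy)
light_inc = group_mean(light, "realinc")  # mean income, lighter half
dark_inc = group_mean(dark, "realinc")    # mean income, darker half
```

The same split-then-summarize pattern covers the categorical variables too (political views, church attendance, etc.), just tallying percentages within each half instead of taking a mean.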

April 1, 2013

Who avoids going to parks, and why?

One of the earliest things that tipped me off to the social-cultural differences between rising vs. falling-crime times was data from the National Park Service showing a steady decline in overnight park visits, and in recreational as opposed to other kinds of park visits. They took off during the '60s and wiggled around until the early '90s, falling afterward.

But they aren't going completely unused. Here are some patterns I've noticed about park use these days.

Teenagers and young adults are almost never there. Hanging out in public spaces has vanished off the map of activities that young people pursue. I mean, why have fun around new people when you could be jogging on one of your many virtual treadmills? Liking status updates, leaving prissy/bitchy comments, leveling up video game characters, checking your texts, etc.

Mostly, the people at the park are parents and their children. Adolescents and young adults probably feel too embarrassed to show up with their folks, while the children are happy to get out and do something for once. Children don't play with one another, only with their family members. If Millennials weren't so awkward and dorky, they'd just show up on their own, without their parents. I remember the older Gen X kids doing that in the '80s -- totally normal. But what can you expect from kids who are uninterested in even getting a driver's license?

I detailed some of the places that young people do like to hang out at here, like the library. (Hold on to your hats, folks.) Continuing the ideas in that post, I think Millennials don't like hanging out at the park because they're so self-conscious about their image. What are all those strangers going to think if you're not wearing the right brand of earbuds as you block out the outside world? And forget hanging out with your shirt off -- unless you've leveled yourself up to swole hulk status, you'd be too pathetic for everyone else to see. Or whatever the equivalent is for girls -- your legs wouldn't look like a Victoria's Secret model's.

Back in the '80s, no one gave a second thought to taking off that uncomfortable shirt during a warm or hot afternoon. Children playing outside, teenagers socializing at the park, grown men doing yard work. Millennials really are more psychologically paralyzed about how strangers will evaluate their body. Seems to go along with their perfectionism, like how OCD people wouldn't have anyone over unless the house looked 100% spotless. Anything less is shameful.

And it's not as though they're responding thoughtfully to actual evaluations from others. Like if you were fat and gross and took your shirt off, and everyone looked at you weird, and you decided not to do so in the future. They're not even putting themselves out there in the first place. They're over-protecting themselves from hypothetical -- imaginary -- evaluations, not responding to real-life feedback from others. Just like how they rarely talk to, flirt with, or ask out someone of the opposite sex -- fear of potential rejection.

Don't mean to always be ragging on the neo-Silent Generation, but they do show the most striking changes from their counterparts of just 20 years ago. I can't help noticing all this stuff because they're not that much younger than my generation, yet they seem to come from another planet.

Moving into the mid-to-late 20-somethings (the early Millennials), you don't see too many of them either. They're generally doing something in isolation, like jogging laps or pretending to be social by sitting in a public area, though with earbuds jammed in their head and staring down at some screen the whole time. They've already turned coffee shops into campus computer clusters, so why not the park? Yep, even parks are free wi-fi zones nowadays.

Adults without children are not as conspicuously absent as adolescents, but still under-represented. You hardly ever see senior citizens either. They used to be everywhere -- the park, the mall (all day, every day), the cafeterias, interacting with schoolchildren through partnerships with the local senior center. Now old people are totally cut off from the rest of society. Public spaces are supposed to bring together all generations, and not seeing old people at the park limits its communal feel.

The exceptions in these age groups, when they aren't odd individuals, tend to be couples, rather than peer groups. A famous eye-opening study detailed the drastic decline in the number of friends that people have, comparing 1985 to 2004 in the General Social Survey. Here, a "friend" is someone who you discuss important matters with. Aside from the quantitative drop in number of friends, there was a qualitative shift away from peers and toward family members and spouses / partners.

That change stands out very loudly at the park. There are only a handful of peer groups playing sports, and the occasional group of attention whores doing wacky-zany activities. Even among these few groups, they're rarely socializing, kicking back, hanging out. And then there are those engaged in clearly isolated activities (jogging laps, etc.).

So, the park now mostly belongs to families and potential families. There's partner-partner interaction, and parent-child interaction, but hardly anything else. Certainly very little family-family interaction. Not with helicopter parents. Your kid is a potential corruptor of all the years of hard work they've put into perfectly programming their own kid. There are occasional superficial exchanges between family units, but actually making new friends and acquaintances -- um, that would be kind of creepy.

When we took my nephew to Chuck E. Cheese's over Christmas vacation, I felt the exact same vibe. The park in the 21st century is a great big McDonald's Playland, ringed by a treadmill for the childless not-so-youngsters. It used to include people from all walks of life, like the pool scenes from The Sandlot (set in the early '60s) or the mall scenes from Saved by the Bell (in the early '90s). But when nobody trusts anybody else, they'll only venture out into public spaces with their kin or their partners. Paranoia and awkwardness are simply too pervasive to allow solid peer groups to form.

March 30, 2013

Short shorts, wholesome vs. awkward periods

As part of their early '90s revival, Urban Outfitters is trying to push high-waisted shorts for girls that end just under the butt, AKA daisy dukes. It seemed like it was mostly the chicks working there who were wearing them, whether they're required to promote the product, or whether they're just more adventurous than the average girl today in showing a little leg.

Hot pants, dolphin shorts, daisy dukes -- whatever they used to be called, they gave off a fun-loving yet wholesome charm that you don't feel anymore when girls wear shorts. Let's see how they pulled it off by checking out the original Daisy Duke herself:



(Back when girls wore slender heels instead of blocky boots...)

The shorts are cut very high up on the leg, ending under the crotch and slanting upward across the pelvis to reveal the hip bones and some of the underside of the butt as well. Doesn't get any more carefree than that. The trick to them not looking slutty comes from their height reaching up to or over the belly button. The greater surface area of fabric gives the impression that they're covering up more than they actually are.

High-waisted shorts and pants do create a "long butt" effect from behind, their only real downside. I guess in the good old days, guys were more interested in looking at long legs than a plump rump. Or perhaps they figured they'd get to see it all in good time, and for now it was better to get more of a hint or tease from scoping out her legs. Build some anticipation.

Whatever their reasons, shifting focus from T&A to the legs does serve to somewhat de-sexualize male-female relations. The girl doesn't feel as on-display if she's only showcasing her legs, and the guy doesn't feel as self-conscious of his own dirty mind when he's only checking out her legs.

A focus on T&A seems to go more with society-wide anxiety / neurosis, as we last saw during the mid-century. High levels of self-awareness during the Age of Anxiety can be seen in the "sweater girl" wearing a bullet bra, and the pin-up girl sticking her curvy butt out while staring knowingly at the viewer. It's a bit too vampy, pretentious, and obvious.

Over the past 20 years, the focus has returned to the mid-century pattern, a topic I looked at earlier here. Now it's underwire / padded bras and J-Lo / Kim Kardashian booties in yoga pants. And as part of the general change toward a more-covered-up look, shorts end further down the thigh than they used to:


Compared to daisy dukes, these new shorts make the girl look a little less comfortable with herself, which you can confirm by observing their facial expressions. To emphasize the butt, the waistlines have come way down as well (though they've recovered from the Whale Tail days of the early-mid-2000s). Low waistlines don't mean they're actually baring their waist, of course, just trying to draw more conscious attention to their ass. As you can see in a few of the pictures, trying to have it both ways -- low waist and high-cut -- makes the shorts look like a thin skimpy stretch of fabric.

Whether they look more prudish or more slutty, today's shorts lack that carefree wholesomeness of the '70s and '80s, and the broader shift in focus away from the legs and toward T&A has created a heightened self-awareness that's only made guys and girls more awkward around each other.

March 28, 2013

Street Fighter vs. Mortal Kombat in a broader perspective

A reversal of direction in the zeitgeist shows up in so many areas of the culture. That's why it makes sense to talk about a zeitgeist in the first place. To show just how broadly a change in the social-cultural atmosphere can reach, let's take a look at a mass phenomenon in youth culture of the early-to-mid-1990s -- the explosion of video games where two players fight against each other in a best 2 of 3 format. It was comparable in intensity to the Davy Crockett craze of the '50s.

First, the player vs. player genre as a whole was a qualitative change from the '80s, when video games that involved beating people up had both players teaming up together to beat up hordes of enemies controlled by the computer. These included Double Dragon, Final Fight, Golden Axe, and many other quarter-eaters. With greater social isolation, kids weren't as interested in team play, and the player vs. player games sprang up to meet the new demand for anti-social ways of playing video games.

By far the two most popular fighting games were Street Fighter (II) and Mortal Kombat. Street Fighter (1991) had one foot still in the '80s, while Mortal Kombat (1992) was unmistakably '90s and quickly displaced Street Fighter. Contrasting their main features will therefore show how the zeitgeist began to shift. You may recognize some of these changes from other domains, several of which I've covered here as well. It does seem a little frivolous to look for these changes in video games, but in the archaeology of popular culture, I say leave no stone unturned.

Visually, Street Fighter has a more stylized look, and Mortal Kombat a more gritty representational look. Take a look through a whole gallery of screenshots here and here. Notice several things in the comparison below:


Street Fighter looks like the work of an illustrator. The guy getting electrocuted is shown in X-ray view, right out of a kid's cartoon. Mortal Kombat shows digitized "animation" of footage taken of live actors performing their moves. The backgrounds look like digitized photos too. The technology for digitized characters in fighting video games existed earlier, in 1990 with Pit Fighter, but it wasn't very popular. Not until Mortal Kombat.

In Street Fighter, the only effluvium you see is an occasional, cartoony vomit if the guy gets struck really hard. In Mortal Kombat, they try to make the blood look as real as possible, and you see it more often.

This shift from stylized to photorealistic shows up everywhere else in the visual culture. Remember when movie posters and album covers featured illustration rather than photography?

Throughout the game, Street Fighter also shows a broader spectrum of colors, greater use of contrasting colors, and higher saturation levels than the more monochromatic and washed-out Mortal Kombat, which looks like a prelude to The Matrix.

The background environments in Street Fighter are more distinctive: you know you're in a Brazilian rainforest, a Spanish flamenco bar, and so on. In Mortal Kombat, it feels like it's all taking place in a void with a few props thrown in, again like The Matrix. Contrast that with Videodrome, where you get a strong flavor of the city, generally not very palatable. Pure fantasy movies, on the rise since then, also feel like they're taking place in the middle of nowhere, totally generic, not some distinctive real place that we just haven't been to before.

Mortal Kombat's explosion of gore also puts it squarely in the more unwholesome period of the past 20 years. In the image above, you can see the guy on the left hurling a spear into the other guy's chest, and it's attached to a rope that he's going to use to drag him over in a daze, setting him up for a free cheapshot.

But Mortal Kombat went even further -- everything had to be EXTREME in the '90s -- by adding an element of gameplay that Street Fighter lacked. At the end of your second winning match, it didn't just end there. You were given the chance to perform a special move, called a "fatality," on your helpless opponent. These were so over-the-top, like ripping the guy's head off with the spinal column still attached, blood dripping down, while the headless body slumps to the ground.

This level of goriness heralded the rebirth of mid-century unwholesomeness, which back in those days showed up in comic books. That caused a panic over horror/crime comics, led to Congressional hearings, and ended up with self-censorship (the Comics Code Authority). The exact same course played out again in the '90s, with the panic over violent video games, Congressional hearings, and self-censorship (the Entertainment Software Rating Board).

Come to think of it, we looked forward to pulling off one of these ultra-gory, humiliating fatality moves more than actually winning the best 2 of 3. It was part of the trend away from good sportsmanship and toward that whole "In your FACE, bitch!" and "Suck it!" kind of attitude.

Mortal Kombat II from 1993 slathered thick layers of '90s meta-aware ironic dorkiness on top of the finishing moves. It retained the EXTREME fatalities (the animalities, where you turned into a fierce animal before ripping your opponent in half, wouldn't arrive until Mortal Kombat 3). But now you could perform harmless finishing moves, like turning them into a crying baby, or doing overly cutesy friendship things for them, like cutting out a set of paper dolls to offer your defeated opponent. Huh-huh, I get it.

And then there was that fourth-wall-breaking moment when a digitized photo of one of the game designers, or whoever he was, popped up in the corner of the screen to yell "Toasty!" every once in a while. If you pressed the right buttons then, a special level would open up. The main thing you took away from it was, "Huh-huh, this game is so wacky and zany and quirky!" And lame.

To play well at Street Fighter, you only needed to memorize a handful of button combinations to execute certain moves. But you didn't really need these special moves much anyway. The characters were differentiated and specialized enough in their skills that any one you picked had a natural advantage over at least some of the other characters (like fast vs. slow). Not much memorization or repetition required. With Mortal Kombat, the characters are just about all the same in their speed, jumping, and other basic skills. That required you to memorize all of their special moves to gain the upper hand in what would otherwise be a stalemate between clones.

This shift toward memorization, mindless repetition, and checking off all the boxes on a list (of special moves to master) is part of the broader trend toward OCD behavior over the past 20 years. It got even worse with the sequels to Mortal Kombat -- the only person who could master so many moves in Mortal Kombat II was some geek who spent all his free time alone in the movie theater lobby hunched over the arcade cabinet.

That was compounded by the Pokemon-like proliferation of characters to choose from in the sequels. Gotta master 'em all! The first Mortal Kombat had 7 characters, the sequel had 12, the next had 15, and so on.

Because of its more rule-structured, OCD type of gameplay, kids didn't crowd around Mortal Kombat and get as excited as they did around Street Fighter, whose gameplay was more loose. The group of dudes hanging around Street Fighter were always more in worry-free, hanging-out mode; around Mortal Kombat, they were more in high-pressure, test-taking mode. It's like a bunch of friends having a couple beers while shooting the bull on the front lawn, as opposed to following the rules of beer pong or flip cup, with no interaction. Mortal Kombat is more choreographed, not spontaneous, kind of like the fake-looking fight scenes in the new Star Wars trilogy compared to the original ones.

Street Fighter thus also allowed younger kids to play alongside the older ones. When it blew up, I was just 10, but the teenage kids didn't mind me hanging around the arcade cabinet with them. At the mall where I played it the most, there was one guy who could kick just about anybody's ass. Usually I didn't even bother putting my quarter next in line on the monitor when he was there. But a few times I did, and on one of those occasions I came this close to beating him.

But with Mortal Kombat and its sequels, fucking forget about it. If there was some nerd who'd memorized and practiced the list of moves and knew which character to pick against which opponent, there'd be no way a 10 year-old non-fanatic player would be able to hold his own.

Those are some of the major differences I remember, and they echo so much else in the broader changes underway during the '90s. The Street Fighter craze died off pretty quickly after Mortal Kombat came out. My friends and I still played Street Fighter a lot at home -- that was an easy way out of a slump in the course of a sleepover. But I don't remember any of the sequels coming out in arcades at all; they must have been very limited. Some of the variations on Street Fighter II came out for home consoles, though without much enthusiasm from us kids.

Mortal Kombat kept going and going, though. I clearly remember the popularity of II in the arcades, and the snack shop near my freshman dorm had a cabinet for Mortal Kombat 4. Not to mention the home versions. The Genesis version of the original was particularly popular because they kept the blood in it, and you didn't see that too much on home games at the time.

Both series got feature-length movies, and Mortal Kombat earned $122 million, while Street Fighter took in $99 million. Mortal Kombat also produced a fairly popular, ear-grating techno song, while Street Fighter didn't. It belonged more to the very end of the '80s phenomenon of the action movie that had an engaging, motivational soundtrack (Rocky III and IV, The Karate Kid, Top Gun, and so on).

Guess my boredom with Mortal Kombat was yet another case of not wanting the '80s to devolve into the '90s. Alternative music, saggy jeans, the Jerry Springer Show... and Mortal Kombat. Some of us only flirted with those things and quickly looked to anything cool from earlier times -- punk music, thrift store clothes, the archive at the local video rental store. The birth of vintage mania, as what was new became so boring, embarrassing, and degrading.

March 27, 2013

Light-skinned blacks smarter than dark-skinned blacks

The General Social Survey data from 2012 are up online. A new question has the interviewer rate the respondent's facial skin color from lightest to darkest on a 10-point scale. To get decent sample sizes, I lumped the lighter together (1-5) and the darker together (6-10). The GSS also has a 10-word vocab test which serves to measure IQ.

The blacks who have lighter skin tones average 5.7 words correct, while the darker ones average 4.6 words. This difference works out to 0.54 standard deviations, or about 8 points of IQ. Those who scored 8 words or more out of 10 made up 15% of light-skinned but only 6% of dark-skinned blacks.
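The arithmetic behind that effect size can be checked in a few lines. This is only a back-of-the-envelope sketch: the two group means are the ones reported above, but the pooled wordsum standard deviation (about 2.0 words) is an assumed value implied by the reported 0.54-SD gap, not a figure taken directly from the GSS here.

```python
# Sanity-check of the reported effect size, under assumed inputs.
light_mean = 5.7   # mean wordsum, lighter-skinned group (ratetone 1-5)
dark_mean = 4.6    # mean wordsum, darker-skinned group (ratetone 6-10)
wordsum_sd = 2.04  # assumed pooled SD, implied by the reported 0.54-SD gap

# Raw gap in words, then standardized to SD units (Cohen's d style)
gap_sd = (light_mean - dark_mean) / wordsum_sd

# IQ is conventionally scaled to a standard deviation of 15 points
gap_iq = gap_sd * 15

print(round(gap_sd, 2))  # 0.54 standard deviations
print(round(gap_iq, 1))  # 8.1, i.e. about 8 IQ points
```

The 15-points-per-SD conversion is just the standard IQ scaling convention; the conclusion only holds to the extent that wordsum tracks IQ linearly across the two groups.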

Whites average 6.3 words, so light-skinned blacks are closer to whites than they are to dark-skinned blacks in intelligence. Hence why those with light skin suffer such constant teasing, hounding, and ostracizing from the dark majority of blacks.

We can extrapolate these results to see a future where light-skinned blacks form their own ethnic group and culture, although that may be centuries away. Yet the fundamental split is already visible, as it were.

GSS variables used: wordsum, ratetone, race