March 30, 2010

If young people today are so exhibitionistic, why no more flashing or streaking?

[Update on toplessness in France added]

You often hear from Baby Boomers and Gen X-ers how exhibitionistic young people have been since roughly 2003 or so -- putting up all those pictures of themselves on MySpace and later Facebook, uploading any video involving themselves to YouTube, not to mention all of the "over-sharing" they do in their Facebook status updates, etc. These are the facts, but is this the correct reading of the facts?

Hardly. This is just a technological change that has allowed people to broadcast themselves more widely to the rest of the world. Can you imagine if there had been YouTube or Facebook in 1968 or 1992? Friends on Facebook would have had to endure endless status updates about hearing the voice of a new generation, rising up against the patriarchy, bla bla bla. And while third wave feminism smothered any chance of a strong sexual vibe in Generation X's youth culture, it doesn't take much imagination to picture what the counter-culture of the late '60s and early '70s would've uploaded to YouTube, MySpace, Flickr, and so on.

Young people recently have not been reserved in the way that 30-somethings during the 1950s would have been -- there is all that crap on Facebook, etc. -- but theirs is a mock-exhibitionism. How can we tell they're not willing to walk the walk? Consider the history of streaking: it peaked in popularity during the mid-'70s. Now it's only a few people a year who do it and only at the most high-profile events, whereas before you would've seen hundreds of college students running around campus naked even though there were no cameras trained on them at all. Of course, in the '70s there were also streakers at high-profile events like there are now.

In plain terms, today you have to pay people a dramatically higher price in attention to get them to streak, meaning their underlying preference for streaking is more prudish than it was among young people of the 1970s. In more technical terms, the distribution of youthful exhibitionism has shifted in the prudish direction: less of the mass is concentrated at the high end where you find streakers and flashers, and more of it sits in the half-hearted part of the spectrum.
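That shift can be made concrete with a toy model. This is a minimal sketch, assuming exhibitionism is roughly normally distributed across a cohort; the 2-SD cutoff for who streaks and the half-SD shift in the mean are made-up numbers for illustration, not estimates:

```python
# Toy model: a small shift in the average level of exhibitionism
# produces a large drop in the extreme tail where streakers live.
# The threshold (2 SD) and the shift (0.5 SD) are arbitrary.
from statistics import NormalDist

threshold = 2.0  # hypothetical cutoff: how exhibitionistic you must be to streak

seventies = NormalDist(mu=0.0, sigma=1.0)   # hypothetical '70s cohort
today = NormalDist(mu=-0.5, sigma=1.0)      # same spread, mean shifted prudish

share_then = 1 - seventies.cdf(threshold)
share_now = 1 - today.cdf(threshold)

print(f"share of streakers then: {share_then:.3%}")
print(f"share of streakers now:  {share_now:.3%}")
# A modest half-SD shift in the mean cuts the extreme tail by roughly
# 70%, even though the average person barely changed.
```

The point is just that fads like streaking live or die in the tail of the distribution, so a small shift in the average can make the behavior all but vanish.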

After the streaking fad burned out, did rampant exhibitionism fade away? No, it just took the form of girls flashing their boobs at rock concerts. Again, there are typically no cameras that will broadcast the girl to the rest of the world, so it is not a case of attention-whoring. It's just a wild thing that you do for the sheer thrill of it: omigod, that concert last night was like SOOO awesome -- i like TOTALLY flashed bon jovi!!!! It didn't matter if it was a small venue with under 100 people or a sold-out arena with tens of thousands. That's just what girls did.

But once wild times gave way to tame times in the early '90s, that died off too, effectively ending young people's indulgence in exhibitionism. I remember going to my first concert in the spring of 1995 -- it was the Foo Fighters (before their first album was out) playing at the Black Cat in DC. Here was the new band of an incredibly famous grunge musician, and none of the girls there flashed at all. I would've noticed since I was a horny 14 year-old and was pumped just to be around girls in a nightclub. Whether it was a smaller show like that one or the day-long HFStival that filled an entire stadium, I don't recall seeing girls flash at all. In fact, from all the videos I watched of Nirvana concerts held during the early '90s, I don't remember any flashing.

I haven't been to a big concert in a while since hardly anyone has grabbed my attention as being a great live band. But flashing must be even lower now because the male musicians of today are either self-doubting crybabies or screaming rejects -- not the type that gets a girl hot and bothered enough to flash the lead singer. That's a pretty easy way to operationalize how masculine the musical zeitgeist is -- what percent of female fans flash at their concerts? Aerosmith or Prince -- high enough. John Mayer or Korn -- zero. Damn, I'll bet even Joan Jett had more flashing fans than those dorks in Weezer.

As with streaking, there's still a minimal amount of flashing going on, but it's only a handful of girls who travel to New Orleans for Mardi Gras. It's nothing on the scale of many girls doing it in every small town that a touring rock band played at. And again young people today need to get paid a lot -- namely in those highly sought-after beads -- in order to flash. The 17 year-old who snuck out after her curfew to see Guns N' Roses didn't expect to receive anything in exchange other than a little attention. Again it was mostly just for the thrill of doing it, like shoplifting or going for a joyride.

Indeed, I'm so confident that this is an effect of when you grew up rather than what age you are right now that I'll bet 40 year-old metal chicks still flash at the reunion tours of their favorite bands, while their young counterparts who are into nu metal, deathtronicacore, etc., would die of embarrassment to even ponder the idea out loud.

If you go through examples like streaking and flashing, most people will remember just how exhibitionistic young people were during wild times. So the only reason they perceive today's remarkably well-behaved young people as out-of-control is that they've blocked out the counter-evidence from the past. If they took an honest look, they wouldn't get to enjoy the age-old sport of whining during middle age about these slutty youngsters these days. Hey, the Gen X-ers' parents got to whine about this -- correctly -- when the Gen X-ers were growing up, so why shouldn't they get to, now that they are middle-aged?

[Update] In the comments Peter adds another case study that confirms my hunch about 40-something flashers, which is that toplessness in France is on the decline among the young. The article says that the golden days were the '70s and '80s, so it looks like the timing of the end of wild times was the same as it was here -- the early '90s. That's not surprising since the rise of wild times was basically the same as here, namely the late '50s and exploding in 1968. From the article:

But the trend is also part of a wider social movement by younger French women who are shunning the less-inhibited habits of previous generations. If burning bras and going topless were the ways French women of the 1970s and '80s demonstrated their freedom, their daughters and grand-daughters seem less comfortable with exposed flesh. "The values of our time are more conservative, traditional and familial," says Kaufmann.
With sensitivities like those, it's little wonder the poll found French women had strong opinions about public nakedness. Nearly 50% said they were bothered by total nudity on beaches or naturist camps, and 37% said they were disturbed by publicly exposed breasts or buttocks. Forty-five percent of respondents reported they'd prefer to see a lot less flesh hanging out in full view -- male or female.

Those attitudes got even more pronounced with respondents aged 18-24. A quarter of women within that group described themselves as very pudique [modest or priggish], and 20% saw any nudity as tantamount to indecency. That, sociologists say, explains the changing scenery on French beaches. Younger women disinclined to baring themselves make up the majority of female sunbathers; those still willing to go topless are usually older French women.

March 26, 2010

Top-rated crime and noir movies created during peaceful times

Lately I've been going through a lot of the great action crime movies -- Die Hard, Lethal Weapon, Beverly Hills Cop, etc. -- and not being a movie buff, I went to IMDb to see what other great crime movies are out there that I could move to next.

Surprisingly, almost none of the movies in the top 50 crime titles were in the vein of the three I just mentioned. They're more stylized, more glorifying of and sympathetic to criminals, and not as focused on action. That's fine, but I thought these movies should be listed under "drama" since most people think of Lethal Weapon or the Dirty Harry movies when it comes to the crime genre.

I noticed that almost all of these more stylized crime movies came from periods in American history when the homicide rate was falling or flat at a low level. The periods of rising homicide are 1900 (and maybe before) through 1933 and again from 1959 through 1991 or 1992. The periods of falling homicide are 1934 through 1958 and again from 1992 or 1993 through the present. There are a few exceptions (mostly ones that make the gangster / mob life look fun), but overall these top-rated crime movies were made during a time when their subject matter was fading out of the picture.

So next I went to the top 50 film noir titles, and they are almost entirely from low-crime times. Only 2 of the 50 are from high-crime times, and just barely: Scarface and I Am a Fugitive from a Chain Gang (both 1932).

Why aren't such movies made during high-crime times? Because the audience is in the thick of the shit and can't remove themselves enough to appreciate a stylized and morally ambivalent portrayal of the world of crime. The great crime movies made during high-crime times are more gritty than dressed up and draw fairly clear lines between the good guys and the bad guys. The focus is more on the impact of crime on average people and their surroundings -- accomplished through all the action shots rather than dialog. The only morally ambivalent aspect is that the crime fighters are vigilantes or cops with that streak. This is justified in these movies, however, by the flabbiness and blindness of the higher-ups and the establishment in general:

Homer: Ooh! It's that new show about the policeman who solves crimes in his spare time.

Bart: Crank it, Homer!

Chief: You busted up that crack house pretty bad, McGonigle. Did you really have to break so much furniture?

McGonigle: You tell me, Chief. You had a pretty good view from behind your desk.

Homer: Ah, McGonigle: eases the pain.

Chief: You're off the case, McGonigle!

McGonigle: You're off _your_ case, Chief!

Chief: What does that mean exactly?

Homer: [yelling] It means he gets results, you stupid chief!

Lisa: Dad, siddown.

Homer: Oh, I'm sorry.

You see exactly the same difference in crime-oriented video games. During the high-crime times of the '80s and early '90s, popular video games like Final Fight, Teenage Mutant Ninja Turtles, X-Men, Double Dragon, Renegade, Golden Axe, Shinobi, NARC, Streets of Rage, Bad Dudes, and a thousand others -- they all feature criminals running amok, and you play as a vigilante who's going to flush all that scum down the sewer where it belongs.

In contrast, it was only when crime started falling after 1992 that video games where you play as the criminal, such as the Grand Theft Auto series, became popular and the renegade crime fighter stories faded away. The reason again is that when crime is high, it is palpable and people -- especially energetic young boys -- want to do something about it and go kick some scumbag's ass. Video games like Final Fight let you enjoy combat driven by moralistic aggression. When crime starts plummeting, people aren't as freaked out as before and don't have the same desire to clean up the streets Martin Riggs-style. Then young boys return to their default state of mild sociopathy and want to play as a sadistic killer or petty thief.

Fortunately for those who still enjoy the crime-fighting games, they've been collected into "arcade classics" compilations for the newer systems. There's still nothing more satisfying than knocking a criminal kingpin out of a window 50 stories high and hearing him cry like a little girl as he falls to his death.

March 25, 2010

Why I still go to record stores for CDs rather than to iTunes for mp3s

This week's EconTalk podcast is a great inside look at how the recorded music industry has operated from the early '70s through the present. Thankfully they didn't talk too much about piracy, which allowed them lots of time to focus on other things that the average listener probably hadn't already heard and thought about. (For a series of free articles on the negative impact that file-sharing has had on the music industry, look for the relevant titles in Stan Liebowitz's SSRN entry.) Most of the conversation is about the changes that accompanied the move from hard copies of music carried in brick-and-mortar stores to digital copies sold (or given away or stolen) online.

I was a freshman in college when Napster became popular, and lord knows I pirated lots of mp3s back then -- mostly shit, thinking back on it, but some of it good enough to burn onto CD. Last summer I bought a cheap mp3 player just to see if I really needed one -- after all, I'd gotten along fine without a portable CD player and only rarely used my Walkman as a kid. Nope, no use for it whatsoever. When I'm at home, in the library, or anywhere that has computer access, I can play CDs or mp3s on a computer. My car has a CD player, so I can always burn mp3s onto CDs and play them when I'm driving.

When else would I want to listen to music? While walking to and from my car? No; too short of a trip. While walking to and from campus or around the neighborhood -- it sounded plausible, so I tried it, but no. Being out and about while isolating yourself aurally makes no sense. It's as stupid as a person spending most of their time futzing around with their cell phone when they're in a gathering of friends.

That's just an argument against the portability benefit that digital music has. It also explains why I don't have a portable CD player. But here are five other reasons why hard copies bought from brick-and-mortar stores are better than digital copies obtained online:

1. You're out of the house and being social. Record stores never feel like other retailers, where you go to stock up on necessities, want to get the chore done as quickly and as infrequently as possible, and couldn't care less about anyone else in the store. Record stores are a hangout. You're going there to collect the things that make life worth living, you want to linger for as long as your schedule permits, and you feel like part of a larger community of music fans who are shopping alongside you or manning the cash register.

Online music stores could be a hangout, but for one person only, eliminating the social aspect of buying music. Plus you lose everything else in the physical environment that makes record stores such fun places to hang out -- holding things with your hands, running your eyes over the albums' artwork or liner notes, listening to whatever the staff is playing as a way of discovering groups you didn't know about, and so on.

2. There's a secondary market for hard copies but not digital copies. I mostly buy used CDs because all good popular music has already been made; arguably 1990 was the last year with many must-listen albums. Even the handful of recent good albums can still be bought used. Plus new and used CDs are basically indistinguishable to everyone, unlike used vs. new cars.

So you pay maybe $5 for the CD, and if at some point you want to trade it in, you'll get back $1 -- pretty good considering by that time you've gotten all you want out of it, and relative to the purchase price. Take a bunch of them that you don't listen to much anymore, and you can get several new albums that'll sustain you for a long time. None of this is possible with digital music.

3. You focus on better music when you buy entire CDs than when you cherry-pick single songs. Lots of people complain about an album that has 12 songs but only one good song. If that's an accurate assessment, then that band sucks and you should listen to someone with enough talent to write at least two good songs in a year. Groups today are much more focused on singles, as they were in the '50s and '60s, but again the really good stuff comes from a period that was album-oriented. And there are simply too many of those albums for the average person to ever buy them all.

Also, they cover all major genres, so you'll never have to worry that you're missing out. For punk rock, you've got Rocket to Russia and Dookie. For (dance-)pop, you've got Thriller and Madonna's self-titled album. For glam / hard rock, you've got Electric Warrior and Slippery When Wet. For independent, you've got Psychocandy and Our Beloved Revolutionary Sweetheart. And on and on. All of those are packed with good-to-great songs, and you don't get that anymore in the single-driven age.

It's true that you can buy the entire album digitally, but most people do not. The online digital store encourages you to focus on isolated songs, rather than bundled albums.

4. You don't have to waste time and effort deciding which songs to listen to and in what order. Let's say you have 100 mp3s on your iPod. That's about 5 hours of music, so obviously you aren't going to listen to every one of them during a particular use of your iPod. The longest that a typical use might last is about 1 hour -- your commute, working out in the gym, etc. -- which means that you'll have to decide which 20 out of 100 songs to include for any given use of your iPod. Furthermore, you'll have to decide which order they'll be played in (even if the choice is random).
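To put a number on how many choices that is, here's a back-of-the-envelope sketch (the 100-song library and 20-song playlist figures are the ones from the example above):

```python
# How many hour-long playlists can you build from a 100-song library?
# Choosing 20 songs AND putting them in order is an ordered selection
# (a permutation of 20 items drawn from 100), i.e. 100!/80!.
import math

n_songs, playlist_len = 100, 20

orderings = math.perm(n_songs, playlist_len)  # ordered 20-song playlists
print(f"possible ordered playlists: {orderings:.2e}")

# Even ignoring order, the number of distinct 20-song subsets is huge:
subsets = math.comb(n_songs, playlist_len)
print(f"unordered 20-song subsets:  {subsets:.2e}")
```

Either way you slice it, the space of possible playlists is astronomically larger than anything a person could meaningfully search by hand on the fly.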

Unless you're a musical genius, you cannot hope for the resulting hour of songs to hang together in a pleasing gestalt way, especially when you're making these decisions on-the-fly during your commute, work-out, etc. Think of how mediocre it would sound if you sliced up 10 symphonies into 10 standalone pieces each, threw all 100 pieces into an iPod, and then could only choose 10 from this stew to listen to while driving to school or work. Pop music may not suffer as much because albums aren't as planned-out as symphonies, but it's still in that direction.

With hard copies, they've done all of that hard work for you -- the songs that will be played are those on the album, in the order they're listed on the back. The band members themselves, together with recording professionals, have thought very long and hard about which songs to put on an album and in what order. Therefore, the resulting hour of music is much more satisfying on a holistic level, above how enjoyable each song is on its own. I think using the right pace to set the mood and set up contrasts is the hardest thing for a normal person to do on their own.

Again you can just buy the album digitally and put it into your iPod in the intended order, but most people do not do this because digital music encourages focusing too much on the trees and not the forest. You still have tough choices to make with hard copies, but it's "which album or two to bring?" rather than "which song to hear next?" repeated 20 times in a row.

5. You're not inconvenienced by having to wait for the store to get in what you want. Again, there is so much great stuff out there that it's impossible for them not to have at least one album worth listening to. They might not have a particular album in a given week, but then you buy the ones they do have that week, or have them transfer it from a sister store or order it for you. So there is no real advantage to being able to get a particular song or album digitally online right now vs. later.

How do we know? Ask people what their favorite album or song is, and they usually can't make up their minds -- there are fifty or a thousand of them, although they're better than the other million songs they didn't think of. The same is true for the list of albums or songs they'd like to own but don't yet. There's a block of higher-priority stuff, but it's a really big block. People can't rank within this block; it's an undifferentiated wish list. Only if the record store didn't have anything from this wish list would you think of getting it online. I thought of doing just that recently because the CD is hard to find (Everything by Tones on Tail). Fortunately a sister store had it in, and after waiting a few days for the transfer, I got it for $8 instead of the $15 or more that it goes for -- used! -- on Amazon or eBay.

Even if the store or a sister store didn't have it in, and you wanted to get it online, the record store has specialized knowledge of where to go for which albums to get you the best deal. This comes from specialization and competition between record stores for sales. I don't want to waste time and effort wading through Amazon's or eBay's endless lists of the same items to find the best combination of price, quality, and delivery time. Let the record store worry about that. Only if it's really expensive to get in hard copy does the digital copy become more attractive.

No one needs a song or album right this second, unless they're buying it to satisfy an impulse shopping desire. But of course clicking on the song at iTunes and reading the "download complete" box is not impulse shopping any more than is calling in a pizza for delivery. You have to be standing in front of the thing, perhaps holding it in your hands, and walking away with something tangible in order for it to feel like an indulgent impulse buy.

This probably reflects our hunter-gatherer past -- you come upon an apparently deserted honeycomb or a fresh lamb carcass that two predators seem to have killed themselves in a fight over. You'd like to take it, but what if the bees find out, and what if that one predator isn't actually dead? Hmmm, it seems like no one's looking, so what the hell, let's take it and run! If you can't run away with it tucked under your arm or slung over your shoulder, it's not an impulse buy.

There are other reasons why I still buy CDs from (used) record stores, but these are the most important, especially the ones about focusing on the entire group of songs you're going to listen to during a stretch. Digital copies have shifted people's thinking even more toward the frustrating and counter-productive focus on singles rather than albums or even greatest hits collections (again which ones and in what order?).

Will dorks pay women to play video games with them?

Via Tyler Cowen, here's a news item about an online service where video gamer shut-ins could pay to play video games with a woman. As if the concept itself didn't reek of desperation, the article is full of lame words like "hot," "ladies," and "buy her a drink." For just over $8, he would get 10 minutes of faceless play or 6 minutes of face-to-face online play.

Interestingly, all of the comments (well, I only looked through the first couple pages) slam the idea, calling it pathetic, a small step away from prostitution, etc. Doesn't sound too promising for the service. None mention costs and benefits -- like is it worth $8 or perhaps only $1.50 to slaughter zombies with a chick partner? Or perhaps some would say, "Damn, only $8 -- what a steal!" The very act of paying someone to be your friend violates the separation that people have in their minds between social relations among fellows and among market participants.

If your mother makes an especially good Thanksgiving dinner, your family members don't pass around a tip jar and tell her, "This is for a job well done." Conversely, no one walks into Starbucks and asks the cashiers to cut them some slack in paying for a month or two -- they'll get the money sometime and pay them back, but just not right now.

That's why no one pays for friends, dates, or sex partners. It's an admission that you can't get it by being worthy of having friends, dates, and sex partners. We don't care about the ultimate reward so much as we do about deserving that reward. Man does not only want to bang, but to be bang-worthy.

Incidentally, I think that's why some guys are turned off by the pickup artist lit. In their minds, using such techniques feels like getting the reward in an undeserving way. I don't see it that way at all. If game gurus really were focused only on the ends, they'd write instead about how to best evade laws against buying a prostitute's services. Rather, they write about how to best re-shape yourself into someone more deserving of girls' attention.

Speculation aside, here is how the author of the article describes his experience with the service:

She was a nice girl (and totally kicked my ass in both pool and Battleship, btw) but her boyfriend was hanging out behind her and she made mention of him a couple times. Her game mood is set to "flirty," but there was zero flirting going on. I can imagine some guys might be disappointed if they paid to play with a girl, only to hear her go on and on about her boyfriend -- and even have to see the guy during a video chat.

If I were the boyfriend, I would've interrupted by wrapping my arm around her shoulder, staring straight into the camera, and saying, "Sup brah, thanks for paying for our date later tonight. I sure am glad that [pointing to her ass] I'm gettin' it for free, BITCH!"

March 23, 2010

More female bonding during low-trust times

In general trying to get two females to play nice with each other is like trying to train two cats not to chase after the same rat. As a result, females tend to have very few close friends, while males are part of much larger social networks. Still, there are swings up and down when it becomes even harder or relatively easier for girls to get along.

Following the logic of what causes females to thrash one another so viciously -- namely, competition over prized males in the mating market -- we predict that when females withdraw more from the mating market, they won't have such strong reasons to hate each other, and they'll find it easier to get along. That's why post-menopausal women aren't as catty as teenagers.

One key ingredient to taking part in any market is trust in other people -- you are much less willing to play the roles of "buyer" and "seller" if you think your partner will cheat, lie, or otherwise act opportunistically. In low-trust environments, you'll move more of these "production" decisions in-house where people can be trusted. Often that will be literally in your own house where your blood relatives live, although it could also mean doing it by yourself or with only your most proven allies. Just not with generic strangers or someone you just met.

Trust levels have been falling steadily since the late 1980s, according to the General Social Survey. And in the early '90s trust took an extra hammering from the culture war that spawned third wave feminism, identity politics, and political correctness. Who can trust men when they're all a bunch of crypto-rapists just waiting for the right moment to slip a drug in your drink? Third wave feminism has been dead since at least the late '90s, but trust levels have remained low right up through the present. We'd expect females to withdraw more from the mating market and think more in terms of forming alliances against a common enemy. Hey, you two are both cynical and distrustful towards men -- why don't you build on that?

So is there any evidence that the prevalence of female bonding has gone up since then? Sure -- this period has seen the birth and flourishing of the "girls' night out." Obviously girls have been going out together at night forever, but this phenomenon is a plan to carve out a "safe space" where they can insulate themselves from the cooties that those yucky boys carry. Here is a graph of how in-the-air this topic has been in the NYT, by five-year blocks (each point is the middle year of the five):

The phrase also appeared twice in 1934 and once in 1942, but otherwise is absent until the late '80s. (The only instances from 1985 and 1986 are references to a rock band of that name.) It first caught the attention of feature journalists in 1989, as seen in this article. Since then, it has only shot up in popularity, as shown by this 2003 article about the comeback of Tupperware parties among trendy Manhattanites.

Plus there's all that only mildly social stuff that hip-striving females somewhat recently started forming clubs around, like knitting and growing your own vegetables, instead of going out on dates with boys.

Sex and the City clearly fits right in with this. Wikipedia has a category page on "female buddy films" -- no joke -- and they are all from 1988 and later, with a peak in 1996. Again, it's not as though there weren't movies before then where females helped each other out or enjoyed friendship. Just not ones where they formed a co-op to better their odds in the battle between the sexes.

And of course there's "girl power" music that grew out of third wave feminism, symbolized by ego-inflated slut brigades like The Spice Girls or The Pussycat Dolls, as well as the soporific line-up at Lilith Fair. You never would've seen anything that gay from The Runaways, The Go-Go's, Pat Benatar, Debbie Harry, or any other good female rock musician. (Looking through the names of those who did play Lilith Fair, I'm sad to say that The Pretenders and Susanna Hoffs both did, though.)

The tougher ones sing about how relationships don't always work out, but about sucking it up and trying to get on with life, rather than huddling with a bunch of other whining women and feeling each other's pain. Even the slightly downer songs from high-trust times rarely complained about "what can we do to get them to treat us right?" like Queen Latifah's 1993 song "U.N.I.T.Y" -- but instead asked "what can I do to get him to love me?" like Joan Jett's 1981 cover of "Crimson and Clover."

It's ironic that all of this female bonding crap gets promoted as proof of how strong and self-confident women have become, especially since third wave feminism. Back on planet Earth, this retreat from playing with the boys only signals how weak and unsure these women are of their ability to handle life's risks. It takes a stronger girl to make herself vulnerable, and those girls from high-trust times were rewarded with much more exciting social lives than the girls-night-outters of today who make an evening out of in-home strip pole dancing.

March 22, 2010

Can only contrarians stick with a low-carb diet these days?

Low-carb diets are far and away the most effective at alleviating all symptoms of metabolic syndrome -- high body fat, high triglycerides, insulin resistance, etc. -- because these all spring from throwing your blood sugar levels outta whack, and that is caused by eating too many carbs. Plus they are the least physiologically painful diets to go on because fat and protein satisfy hunger, while carbs cause hunger by driving up insulin and thus locking away fat and depriving you of energy -- so your body screams at you to eat some more. You don't have to count calories in general, you get to eat tasty food, you lose your sweet tooth, and you're more energetic overall.

So why do people who experiment with low-carb eating and see the positive results for themselves give up? My mother jumped on the low-carb bandwagon around 2003 or so and truly seemed like a different person. Approaching 50 years old, she'd lost most of the body fat she'd built up during her 40s and had so much energy she didn't know what to do with it. Exercise was something she wanted to do, not a chore, and even after getting back from the gym she would still have enough energy to jump and bounce around the living room, just to show off how healthy she was. But the low-carb fad died and like most Americans she's now back to eating mostly carbs, although they tend to be low-glycemic. She still competes in ballroom dancing, but her energy level isn't what it used to be. Last summer I got my father on a low-carb diet for a few weeks or months, and he lost weight, wasn't hungry, and never felt cranky. By the end of the summer, though, he'd gone back to his old ways.

These examples are hardly atypical. It's understandable why other dieters would revert, but it's puzzling with low-carb eaters because everything starts to improve and they can see it with their own eyes. It is not a diet that requires a constant exercise of willpower because, again, the food tastes great, you don't have to count calories, and your health picks up. And since you lose your sweet tooth, it's not a matter of eventually caving in to that pint of ice cream. I indulge in sweets once every two or three months, but then I go right back to low-carb because it only takes one pint of ice cream for me to say "that's enough." After that, sweets are too saccharine to tolerate, and I want to get back to eggs, chorizo, and pate.

Gary Taubes, author of Good Calories, Bad Calories, has speculated that those who go back to a high-carb diet do so because they lack support from their doctors and other health care people -- indeed, they might get an earful about how this diet is going to kill them right away. That hectoring from the man in the white coat is enough to make the low-carb eater second-guess what's best for themselves. This means that the only people who will take to low-carb eating over the long term are those who are fine with -- and may even take pleasure in -- telling Dr. Know-It-All how far they're veering from his supposed dietary wisdom. Remember this little rant by Denis Leary in Demolition Man?

I think Taubes is right about that, but it actually applies much more broadly to everyone who will ever find out what you eat -- not only those in your social circle but anyone they might tell, other patrons when you're eating out, and so on. They all believe, wrongly, that eggs, sausage, butter, etc., are poison and that grains are healthy. They've internalized this from what the experts in the private sector say and from governmental warnings and propaganda. Most of the time the experts are right, but not here. If nutrition had no government interference, the diets proven to do best would win acclaim.

Once the government steps in, though, it has no incentive to find out what a good diet is -- it's not going to go bankrupt if it makes the wrong recommendations, unlike a private dieting firm. People trust the government when it comes to issuing warnings on safety -- is it safe to fly, to drive without seatbelts, etc.? -- so once the government scares the hell out of everyone about eating animal products, people will take that very seriously and not be likely to change their minds about it. *

So a low-carb eater is going to have to endure the disgusted looks and shaming stares not only from their social circle, whose thoughts they care about, but also from the mob of mankind. "Jesus -- beef with butter, then ham and eggs with cheese. I guess he just doesn't care about his health. What a filthy pig." A notch below that are the friendly jokes they'll always hear about their diet. No single one is a brazen insult, but a steady stream of remarks about your diet still raises the psychic cost of eating that way. "There goes agnostic ordering a burger without a bun," or "So, what, you're too good to eat breadsticks? Are you saying we're bad people for eating them, then?"

And notice how no one -- anymore -- will make jokes about someone's vegetarian or vegan diet. Now they make remarks if you're an anti-vegan! How perverted has the received wisdom become?

I think it's this regard for peer approval, which in general is a good thing that keeps us from going off the deep end, together with our peers' wrongheaded beliefs about what healthy food is, that makes it so tough for most people to stay on a low-carb diet. These barriers make sure that the only ones who will enjoy the diet long-term are those who aren't moved by peer disapproval if they believe themselves to be in the right. Hey guys, I know you all think this is bad eating, but with all due respect you don't know anything about nutrition, so I'm going to politely ignore your attempts at shaming me into the skin-destroying and energy-sapping diets that you all follow because the experts and the government told you to. If you've spent any time reading low-carb people, you can tell that they have this contrarian and slightly libertarian streak.

This explains why it was easy to follow a low-carb plan in the early-mid 2000s. It suddenly became fashionable and therefore pre-approved by a large fraction of the crowd. You wouldn't have felt like a weirdo, unlike now. The views of doctors, health experts, government screwballs, etc., did not change at all during that time or since. What changed was what the crowd believed. So while Taubes was pretty close, it's actually how the mob is going to judge us that determines how easy or tough it will be to do something. We care less about our own health than we do about how others perceive our health-related behaviors.

* This shows why low-carb diets do not win out even among private dieting firms. The average person does not trust a for-profit firm to figure out what healthy is because he distrusts the profit motive. Since the government has no such motive, he will trust them to figure out what's safe. Suppose that seatbelts didn't really reduce mortality risk, but that the government kept a steady stream of warnings about the dangers of driving around unbuckled, in contrast to private firms who said "don't listen to that bull." Consumers would trust the government's warnings and demand seatbelts from for-profit carmakers -- even if the latter were not required by law to have them.

Similarly, when the government warns that dietary fat and cholesterol are harmful, it doesn't matter if it's nonsense. People will believe the government because there's no profit motive for them to lie, and they'll therefore demand diet programs from private firms that are all in the low-fat / low-cholesterol direction. They just wouldn't trust one that said "eat more beef and liver, and less bread and sugar." And once more, people really care about how others view them rather than about their own health, so the private firms that will dominate in the dieting industry are those that are best at giving consumers a diet that will garner the highest approval ratings from the crowd, not those that are best at improving the consumer's blood lipid profile or other objective measures of health.

March 20, 2010

Which car decals are elite-approved?

Here is a complaint on an NYT blog about car decals that say what beaches you head to for vacation, like the OBX one that stands for Outer Banks, North Carolina. (I don't know if these exist outside the East Coast.) Other prohibited decals are those that advertise what sports your kids play, including cheerleading -- things that might make them popular -- and decals that show your family togetherness by giving each member their own stick figure decal.

Are there any decals that the complainer would allow? Sure: the ones that signal where your kids went to college, ones that signal your political views, and presumably an Apple logo (hey, the personal is political).

Basically, being a status-obsessed dork is OK, while having a life is forbidden.

Young and middle-aged males giving up autonomy: Evidence from driver's licenses

For a post I'm putting together (at the data blog) on the change in young people's autonomy over time, I've been looking at numbers on driver's licenses (see here). Why? For the average American, having a license is one of the most important steps toward independence, whether financial, social, or otherwise. I assume the reasons for that are obvious.

But something else has struck me for the last year or so -- I always seem to see young males being driven around by girls. Sometimes it's clearly his girlfriend, but other times it could just be a friend who he pestered into giving him a ride. If I saw this as many times as I saw the guy driving around the girl, then no big deal. But it really does seem like the girls are more likely to drive around guys than vice versa. I notice because it's disgusting -- aren't you the one with the balls? You are supposed to be the one in charge -- the leader, the coordinator, the doer -- not some child being chauffeured around by a girl, who must feel more like your mommy than your lover. Either learn how to drive or just hand over your balls to her now and make it official.

Sure enough, the data back up my hunch. They allow you to see what percent of the male or female population has a license, by age group. As recently as 2000 (and at least back to 1994), males were more likely than females to have a license in all age groups, which is as it should be. It shouldn't even be close. Starting in 2001, though, females start to dominate in the youngest age groups -- the 19-and-under and the 20-24 groups. The most recent data, for 2008, show that females now dominate not only those groups but the 25-29, 30-34, and 35-39 groups as well. So all young people and a good share of middle-aged people -- anyone from 15 to 39 -- live in a world where males are less independent in any area of life that requires a car. Here is the difference between the percent of females and the percent of males with licenses by age group in 2008 (so positive means females are more likely to have licenses):

Age group       Gap (percentage points)
19 and under      0.2
20-24             3.8
25-29             4.6
30-34             2.2
35-39             0.3
40-44            -1.2
45-49            -1.6
50-54            -2.7
55-59            -3.9
60-64            -7.2
65-69           -10.7
70-74           -14.7
75-79           -19.0
80-84           -25.1
85+             -34.2
All ages
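As a quick sanity check, here's a minimal Python sketch (the numbers are just the 2008 female-minus-male gaps listed above) that picks out the age groups where females now lead:

```python
# 2008 female-minus-male gap in % of the population holding a driver's license,
# by age group (positive = females more likely to have a license)
license_gap = {
    "19 and under": 0.2, "20-24": 3.8, "25-29": 4.6, "30-34": 2.2,
    "35-39": 0.3, "40-44": -1.2, "45-49": -1.6, "50-54": -2.7,
    "55-59": -3.9, "60-64": -7.2, "65-69": -10.7, "70-74": -14.7,
    "75-79": -19.0, "80-84": -25.1, "85+": -34.2,
}

# Age groups where females now out-license males
female_led = [group for group, gap in license_gap.items() if gap > 0]
print(female_led)  # the five youngest groups: 19-and-under through 35-39
```

Nothing fancy, but it makes the pattern hard to miss: every group under 40 is on the positive side, and every group 40 and up is on the negative side, with the male advantage growing monotonically with age.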

Women in their 20s are several percentage points more likely to have a license than their male age-mates, and the gap is growing (since not long ago the gap went in the other direction). Unlike other areas of life where you can make an argument for feminism taking away opportunities from qualified men and giving them to less qualified women, getting a driver's license is not a zero-sum game -- a female getting one doesn't prevent a qualified male from getting one.

The widening gap is due to either a fall for males and rise for females, or both groups falling but males falling even lower. So this is a case of males increasingly dropping out, not males staying roughly the same and females gaining more and more to make up for previous very low levels. After abdicating their responsibility, though, of course they have to rely on someone to drive them around, and that person is more likely to be a girl than a guy. It doesn't look so bad when your buddy gives you a lift, but to have to depend so much on girls for your basic social existence? That's beyond pathetic. Not to sound too alarmist, but you wonder how long it'll take before white America looks like black Africa where the men are a bunch of parasitic losers relying on the independent womenfolk for their survival.

March 17, 2010

Would paleolithic people have needed a toilet brush?

[You may want to skip this if you don't like frank discussions about the gross parts of human physiology.]

There are many ways to tell that after going on a low-carb diet, your body is in better shape. You start burning off excess fat, your blood sugar plummets, so does insulin, your triglycerides fall, your HDL level goes up, your LDL particles shift from the small and dense type that cause heart disease to the large and fluffy type that don't, you have more energy, you aren't ever hungry -- and so you rarely snack or nap -- and on and on.

But then there are lots of so-called cosmetic ways in which your body gets better. Your skin becomes more clear and supple, your hair gets thicker, your teeth look nicer even when not brushed, and you won't get rashes, discolorations, etc., that almost always come from allergies to non-animal foods. Low-carb people don't emphasize these as much because it seems superficial -- you should care more about your risk of heart disease than how bouncy the skin on your face looks.

However, natural selection designed the human mind to find signs of health visually pleasing because it was impossible to give other people a blood lipid profile test, measure their fasting insulin levels, and so on. Just because these features are on-the-surface does not mean that they aren't honest signals of underlying good health. So we should absolutely pay attention to our own surface when we try to judge how healthy we are.

Still, these changes may take awhile to become perceptible -- not years, but maybe weeks or months. How can you tell within days that a low-carb diet has put your body in better shape? Simple: judge the quality of what came in by what came out. Grazing animals that subsist on low-quality food like grass must eat a lot more than carnivores do. So they take in a larger volume of food, and most of that is useless junk that gets excreted. Such animals therefore defecate frequently and copiously. Carnivores do so rarely and in slighter quantities. That's why you can tell, just by walking around, whether there are deer in the area, but you couldn't tell if there were cats unless you saw them with your own eyes. (This is not due to burying behavior of cats; deer could try to bury theirs too, but there would still be too much to hide.)

One of my housemates is a vegetarian who occasionally eats cheese and even more rarely fish, though never eggs. Like the typical vegetarian, he lives on grains, legumes / pulses / nuts, corn, pasta, soy or rice milk, chips and cookies, etc., with only a small amount of colorful vegetables and fruits. That's why the name "vegetarian" is misleading; if you ate only colorful vegetables and fruits, you'd die from lack of protein, fat, and most vitamins and minerals. Human non-carnivores should really be called grainitarians. The only difference between him and a typical American, though, is that he follows this diet out of some higher principle (I've never asked), whereas the typical American feeds on mostly processed carbs and few animal sources without making a cause out of it.

To be blunt, when he's used the bathroom for number two, I can always tell -- visually. If a cat were toilet-trained, it would never leave streaks in the bottom of the bowl. Its high-quality diet just does not result in the streakable consistency of dung that you find among cows, elephants, and other low-quality eaters. And again, because he's vegetarian, he's in there at least twice a day, and probably a time or two more when we're not home at the same time. The cat would be in there only as frequently as it would use its litter box to defecate -- once a day at the most.

You've all been in a public restroom, so you know what it sounds like when others are there in the stalls. It never sounds like there's a well-formed one that goes plop, but instead like the microwave blew up their bowl of soup. I'm pretty sure that's mother nature's way of telling you to eat like a man instead of a 9 year-old boy. Only a sugar-sucking grain-muncher is going to make that sound in the restroom.

So unlike the other feedback your body gives you about how high-quality your diet is, this one is particularly quick at letting you know. How then do people ignore the inescapable marks that their food leaves? I think it's because there is little variation over time -- you would have to experiment with various diets in order to see how that affected what you left behind. Most Americans don't care that much about experimenting to improve their health -- I mean what the hell, someone else will bail us out if we get sick due to our own horrible eating habits. Some cultures might be more open about excretion, and so pay more attention to what it says about their food's quality, but that'll never happen here. *

In our fecal-phobic culture, the best way to avoid embarrassment about the sounds you make and the traces you leave for others to see is to cut out most carbs from your diet. You don't need them and they don't add flavor. There's nothing wrong with a daily fruit, vegetable, or other low-carb / high-fiber food like an avocado or almonds, just as a cat occasionally eats grass. But otherwise don't bother with a high carb load. Your bum and everyone else affected by it will thank you.

* We only pay attention if we suspect we've been infected by a pathogen -- to us that's more serious than following a lifelong diet of low quality because germs infecting our gut could kill us in short order via dehydration.

March 15, 2010

Less or no nudity in hit movies today?

I'm not a movie person, but after re-visiting a lot of the big movies I saw as a little kid in the '80s, I'm struck by how much full or partial nudity there was compared to now. My hunch is that this reflects the general transition during the early-mid-'90s from wild times to tame times. It would take more effort to find out just when the "nudity in film" change happened, but as a brief comparison, consider just 1984 vs. 2009. We only want to include movies that lots of people see; if few see it, it's no big deal.

I looked up the top ten grossing films for both years at Wikipedia and then went to the "Parents' guide" section on each one's IMDb page. This lists instances of full or partial nudity, other sexual content, violence, profanity, and so on. It seems to be edited by paranoid parents who labor to tar all movies as depraved, so if there was something serious in it, they wouldn't miss it. So how do the hit movies of 1984 differ from those of 2009?

Each year had a hit movie that showed a non-human female with a mostly-nude costume that showed some of her breasts -- Ghostbusters and Avatar. I don't think that counts, but it's hard to call. For true cases of showing bare breasts, bare buttocks, full frontal nudity, etc., 2009 had only one hit movie with such images (The Hangover). However, five hit movies of 1984 did (Beverly Hills Cop, Police Academy, Footloose, Splash, and Purple Rain). So, the typical high-exposure movie from 1984 would have shown something, and the typical one from 2009 would not.

Perhaps the clearest illustration of this shift is in the amount of nudity shown in screwball teen movies -- if any genre is going to show something, it's this one. In Porky's (1981) there's full frontal and backal nudity, other shots of bare breasts, and so on. In Dazed and Confused (1993), there's nothing. American Pie (1999) has only one sequence showing bare breasts. And when we reach Superbad (2007), they don't show anything at all. It's understandable if they don't show anything in teen movies that aren't about slackers and adolescents trying to get laid, like Ferris Bueller's Day Off or Mean Girls. But even the screwball teen movies from at least 1993 onward refrain from showing nudity. And conversely, even a general coming-of-age movie like Fast Times at Ridgemont High (1982) had full frontal nudity back during wild times.

Because the movie industry is one of the most competitive out there, it must be that this shift is due to changes in audience demand. If young people today still wanted to see bare boobs and butts, the movie studios would give it to them. This shift in their cultural tastes parallels their behavioral changes; promiscuity among high schoolers has been falling since 1991. Therefore, it is not simply a substitution of internet porn for teen movie nudity. It's part of a larger pattern.

(Also, adults weren't going to see Beverly Hills Cop or The Terminator in order to see T&A, since they had easy access to porn. Nudity in hit movies is not there to physically arouse the audience but to suggest wildness in the environment.)

Another curious thing I notice that's related is how juvenile young people want to remain today compared to during the '60s through the '80s. In all eras, teenagers want to become independent and start living what they picture as the cool young adult life. Still, there are pendulum swings around that constant desire. Just look at how popular Ice Age, Harry Potter, and Transformers are. Or Pokemon -- still. Flash back to Fast Times at Ridgemont High again, and teenagers are worried about getting the right job in the most high-status part of the mall, or moving up from a job at a crummy fast food place to one at a respectable place.

When young people perceive the world as pretty safe, they're going to delay going through rites of passage -- omigod, why should i hurry? i mean it's not like the world's gonna end or anything. Of course, when they do believe that life is shorter and more dangerous -- such as when the violent and property crime rates are going up -- then their mindset is more one of "piss or get off the pot." In particular, it's time to get a job and to work toward making some babies. While tame times last, though, they're more interested in meta-ironic-detached portrayals of rites of passage or else childlike fantasies. In the '80s, it was mostly little kids who were into The Neverending Story -- not college students, who were too busy working, driving around in their cars, drinking or doing drugs, and getting it on.

Dark ages for video games

I finally broke down and bought a PS2 (slim), though not because there are many good original games for it -- with some 4500 games available, the few standouts make for a pathetic success rate compared to all the gems among the roughly 900 games for the Sega Genesis, 800 for the Nintendo, or 700 for the Super Nintendo. Rather, it offers lots of great compilations. Except for racing and first-person shooter games, which are the only genres that are superior in 3-D, video games have been in the dark ages since roughly 1995. However, just like during the Medieval period of European history, the great works from the previous golden age are still being preserved and even slightly expanded upon. Sure, there are a few greats that this period has produced -- such as Castlevania: Symphony of the Night and evidently New Super Mario Bros. Wii (both 2-D platformers) -- but it's mostly been an era of conservation.

I bought a GameCube last summer just so I could play Game Boy, Game Boy Color, and Game Boy Advance games on a TV, although I was also enticed by some of the compilations they had. Still, it's become clear that the variety just isn't there; it's the PS2 that is like Baghdad's House of Wisdom. As a little kid, I never would've imagined that I'd be able to play Final Fight for real on a home console, let alone that scores of other superstar arcade games would be included on the same disc -- or have all of the Mega Man or Sonic games on one disc -- and that this wouldn't cost more than a single disc otherwise would.

Now all they need is a compilation of rare or expensive action RPGs like Terranigma and Secret of Mana, and just to fill it out, maybe some of those turn-based RPGs (which I don't like) that are impossible to find cheaply like Chrono Trigger and Earthbound. They've already made a killing with two 16-bit era compilations, so why not? It would help to tide us over until the Renaissance comes.

March 12, 2010

What qualities does your name bring to mind?

Namipedia has a neat feature that allows the readers of a name's page to rate the name based on how smart, sexy, friendly, creative, strong, young, and sophisticated it sounds. To see how a name measures up, just type it into their search bar, and in the "Does X sound..." box click on "View all ratings."

Out of curiosity I looked up my own name and was flattered to learn how pleasantly the voters perceive it. Then on a hunch I looked up a bunch of other, obviously low-scoring names to see what the judgment was. To my surprise, every name gets at least a middle-of-the-scale rating on every variable, often much higher than they deserve.

The most objective way to show this is to look at the "young" rating of names that belong to very old people. (Namipedia also shows, off to the right, the rise and fall of that name's popularity over time, in case you're unsure.) Rose, Agnes, Beatrice, Ethyl, and Edith score no lower than the middle in the "young" variable, even though those names haven't been at all popular for roughly 100 years. Maybe there's some really horrific-sounding name that I couldn't think of, but generally everyone gets a favorable rating on everything. What gives?

If you read my post on why online comments are so negative while online product reviews are so positive, you already know the answer. This is just another example of an anonymous online rating that shows only high average scores. To reiterate the take-home message from before, critics inclined to leave negative reviews will silence themselves online because they expect their audience not to have experienced the target of their harsh words. When the audience doesn't know the back-story to a severe punishment, they reflexively side with the punished rather than the punisher -- "Hey, maybe it's not that great, I don't know, but it seems like you're being unfair." Faced with that lack of sympathy from the audience, and imagining their disapproving looks, negative reviewers will keep quiet. As Adam Smith put it: "Compared with the contempt of mankind, all other external evils are easily supported."

That changes when it's the comments section of a blog since the audience does know the back-story, as they too have read the post being commented on.

With the name ratings, what's really being rated is the group of people with that name. Raters are not divorcing the name from those who bear that name -- it's too hard for most people to do -- but thinking to themselves, "How smart are people named Raylene?" or "How sexy are people named Bert?" The typical audience member has had no contact with people named Raylene or Bert, certainly not at a level that would allow them to know how fair or unfair the judgments of the rater were. Therefore the rater, even if inclined to give a harsh grade -- like Agatha doesn't sound young at all -- will either keep quiet or give a much higher score than he'd want to. He expects that the audience would side with the poor Raylenes, from the "innocent until proven guilty" principle, and he doesn't even have the chance to prove their guilt. Picturing that entirely unsympathetic audience, he decides it isn't worth giving a low score.

March 8, 2010

Innocent love songs in wild times, dirty songs in tame times

Over at my data blog, I just looked at how levels of social trust are related to levels of risky sexual behavior among young people. The latter have been in decline since their early-'90s peak, along with all other sorts of thrill-seeking behavior.

Yet when we look to popular culture for clues about the sex lives of young people, we see the opposite of what we'd think at first -- from the late '50s through the early '90s, love songs aren't very raunchy or sassy, don't glorify promiscuity, and are typically addressed to a single person who is driving the singer crazy. OK, with the occasional exception. Still, look over the Billboard Hot 100 number ones from 1987 and note how refreshing they sound. During the tame times since the early '90s (and perhaps during the '30s and '40s, too, but I don't know much about that music), the fraction of songs about boys and girls that are provocative has gone up. What gives?

The kneejerk cynical answer is that people are hypocrites. The promiscuous people sing more sincere love songs to disguise their promiscuity, while the sexually less active people sing about being dirty to disguise their sub-promiscuous activity. Both are singing to disguise what they see as the flaws in their sex life. But this explanation means nothing, since it assumes the conclusion: that both groups see their sex lives as shameful and worth covering up. In general, you can impute hypocrisy to anyone just by assuming their mindset is whatever the theory needs in order to work. And even if it were true here, the cynical response wouldn't explain the shift across time -- why all of a sudden was the shameful thing less promiscuity, rather than more?

I think the answer comes down to trust in others, which has been falling since sometime in the late '80s. When social trust is high, you feel safer and will engage in riskier behavior -- and perhaps get burned, have to go back to square one, and thereby rack up more partners. People who are very trusting are also going to sing more about a single person and will emphasize the loss of control they feel. Low-trust people would never invest that much in another person, and they certainly wouldn't make themselves vulnerable enough to feel that they have little control over their actions.

I don't have to remind you what recent songs about men and women are like. Take anything by the Pussycat Dolls, Nelly Furtado, or Fergie. It's addressed to a crowd of men, men in general, or a generic man, who are all at a great social distance -- not the singer's one and only, who she trusts (or used to trust). And like everything else in the culture since the mid-'90s, the voice is highly self-conscious -- "don'tcha wish your girlfriend was hot like me?" -- rather than a voice from someone who's lost their sense of individuality through joining the other person who they trust -- "I'll stop the world and melt with you."

This self-consciousness is part of the foundation for their sense of control and power, unlike the vulnerability and lack of total control that singers expressed during wild times. And it sure is easier to feel in complete control when you don't trust others and rely only on yourself.

The apparent exception today is the Norah Jones school of sappy singer-songwriter junk -- that's hardly raunchy and power-thirsty. That may be so, but those songs still sound like they come from a low-trust society because the women never sing about how uncontrollably gone they are, nor express it in their inflection. Instead it's like, "Gosh, it's really, really neat to date you. Really neat!" They're intrigued in the gee-willikers way that a scientist might perk up on observing a colony of bacteria through a microscope. No involuntary passion.

Listening to the Pussycat Dolls or Norah Jones, you actually shouldn't be so surprised that promiscuity and sexual activity in general is down among hormone-crazed young people. If a girl feels like a spotlight is on her alone, and like she's in full control of her emotions and actions, she is not in a very susceptible state for risky sexual behavior. She has to feel like she's being swept along in the moment, like she can't help it, and so like there's no point in trying to fight it. She might as well just surrender now. I can't imagine that horse-faced transvestite Fergie getting into that mindset, but it used to be perfectly common. Here's just one of many that still stick in my memory even though I was only 6 or 7 when they were on the radio:

March 6, 2010

Teenagers less reckless even for trivial, unpunished risks

One broad class of theories about why American society has become a lot more civilized since the early-'90s peak in the rates of violent crime, property crime, teen promiscuity, child abuse, etc., is that law enforcement started cracking down harder. Raise the costs of homicide by raising the chance of it leading to life in prison (or whatever), and you'll see fewer homicides committed. I don't doubt that can play a role, but I question how widely that applies and what the magnitude is where it does apply.

As I've mentioned before, the list of risky behaviors that have declined since the early '90s is so vast that it makes less sense to propose ad hoc explanations for each decline -- for homicide, for teen pregnancy, etc. For example, locking up more of the risk-takers is a plausible account of why homicides declined, but not teen promiscuity or teen pregnancy because the risk-takers in those cases weren't given harsher punishments -- or any at all -- by law enforcement. Rather, it's better to talk about a fall in a generalized taste for risk in the minds of individuals. I prefer some kind of frequency-dependent selection model that has no stable equilibrium but instead oscillations.

As an example, imagine that almost everyone played "rock" -- then the frequency of "paper" would shoot up and that of "rock" would plummet. But then the frequency of "scissors" would shoot up and "paper" down. Finally, with "scissors" now the most common, the frequency of "rock" would shoot up and "scissors" down. And the cycle would repeat. What these categories would be in the case of falling levels of thrill-seeking, who knows. The point is that such models can account for very general changes and don't need lots of ad hoc explanations. By its nature the system will oscillate; we don't need to invoke lots of outside influences just before each swing of the pendulum, to mix metaphors.
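For concreteness, here is a toy simulation of that rock-paper-scissors dynamic (a minimal sketch with made-up payoff numbers, not a model of any real trait): each strategy's fitness is higher when the strategy it beats is common, and the frequencies cycle endlessly instead of settling at a stable point.

```python
# Toy frequency-dependent selection on rock / paper / scissors.
# A strategy's fitness depends on the current population mix, so whichever
# strategy is most common is always undermined by its counter -- frequencies
# oscillate rather than converge.
def step(freqs, base=2.0):
    r, p, s = freqs
    # fitness = baseline + (share of the strategy you beat) - (share of the one that beats you)
    fit = [base + (s - p),   # rock beats scissors, loses to paper
           base + (r - s),   # paper beats rock, loses to scissors
           base + (p - r)]   # scissors beats paper, loses to rock
    mean_fit = sum(f * x for f, x in zip(fit, freqs))
    # discrete-time replicator update: grow in proportion to relative fitness
    return [f * x / mean_fit for f, x in zip(fit, freqs)]

freqs = [0.90, 0.05, 0.05]   # start with "rock" nearly fixed
history = [freqs]
for _ in range(200):
    freqs = step(freqs)
    history.append(freqs)

# which strategy (0=rock, 1=paper, 2=scissors) leads at each point in time
leaders = {max(range(3), key=lambda i: h[i]) for h in history}
print(leaders)  # over the run, each strategy takes its turn as the most common
```

The point of the sketch is only that one internal mechanism generates endless swings on its own; no outside shock is needed before each reversal, which is exactly why such models are attractive for explaining very broad rises and falls in risk-taking.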

Still, evidence would be good to look at. You might object to the case of teen promiscuity and pregnancy by saying that they could have gotten tougher on age-of-consent laws or something, so it really was like the "tough on crime" view would predict. (We'd still need evidence that law enforcement ramped up its efforts to enforce such laws, though.) What other risks are hardly ever punished? How about not wearing your seatbelt or not wearing a helmet while biking. The CDC's Youth Risk Behavior Survey polls a nationally representative sample of high schoolers every other year about all kinds of risky behavior, starting in 1991. Here is the percent of students who rarely or never wore a helmet (among those who rode a bike in the past 12 months), and the percent who rarely or never wear a seatbelt when riding in a car driven by someone else:

Both pictures fit the larger pattern of a general fall in risk-taking since the early '90s. It's dramatic for not wearing a seatbelt, and still down 10 percentage points for not wearing a bike helmet. Taking these risks is virtually never punished -- a conclusion we reach immediately from the fact that even now 85% of bike riders flout whatever legislation there may be about wearing a helmet. So the fall in risky behavior is not due to a response to changes in particular policy incentives pertaining to homicide, teen pregnancy, etc., but rather to some very broad incentive, not embodied in any one policy, about risk-taking in general. I think that reflects the fall in trust of others -- that would cause you to be much more cautious. (That just pushes the question back to "what causes trust to cycle," but I prefer the same sort of model.)

This suggests that the increasingly "tough on crime" policies of law enforcement were reflecting a deeper and broader change already afoot in the population's taste for risk, rather than causing a thrill-seeking population to adjust its taste for risk downward. Everyone went into "better safe than sorry" mode, and because law enforcement can only be as harsh as the public will allow, the police could then levy much stiffer penalties. Most people don't appreciate how much our public institutions reflect the public's desires in a competitive political system where violence is outlawed in the competition. Politicians don't brainwash a recalcitrant public; they give the public what it wants. * It is the same with the agencies of government that enforce the law on the ground.

* See Bryan Caplan's The Myth of the Rational Voter.

March 4, 2010

America says "so long" to the babysitter

Some of the fondest memories I have of my later elementary school years involve hanging out with my best friend Robbie -- and his super-cute babysitter Susie. She was in her late teens, either a high school senior or just starting college, and therefore really playful around us in that way where teenage girls try to educate pre-pubescent boys in the ways of interacting with girls. Especially by being more easy-going and mock-flirtatious than they'd be with boys their own age, in order to get the younger boys to learn not to be afraid of girls: don't worry agnostic, i'm not gonna biteee. I even got to give her a shoulder massage a few times -- pretty sweet start for 9 years old.

Although she had a sweet and girly personality, she never tried to push any of that gross stuff on us. Besides, she could do something for us that she knew would get us so excited that the whining lobe of our brain would shut right off -- namely, take us for a cruise in her BLACK IROC-Z. Back when teenagers had a life and their own cars, that was one of the coolest things about having a high school babysitter: you could enjoy the freedom of driving around unsupervised by your parents or someone else's, even if you were never in the driver's seat. The car she drove was just a hint of her wild nature, however. After my friend became too old to need a babysitter, we found out that she'd taken up a topless dancing job to pay for college. She must be in her late 30s by now and so probably not in great shape, but I'm grateful that we got to enjoy her back when she was barely legal (or perhaps not even), unlike the poor deprived children of today.

A new book, Babysitter: An American History, details what I'd suspected about the prevalence of babysitting after the wild times ended in the early '90s -- it started fading away by the end of the century, and it's more or less gone today, at least as employment for young people. I've only browsed the book on Amazon; I might post a follow-up after I check it out. You probably hadn't thought about it, but it shouldn't be too surprising in the era of helicopter parents. First, the parents themselves have largely taken on the tasks of the babysitter. And second, with hyperactive parents pushing their kids to fill up every free second of their time with tutoring, student groups, sports, etc., the poor bastards have no time to devote to babysitting even if there were demand for their labor.

It's important that parents themselves are the ones hovering over their kids, chauffeuring them around, and so on. You might think that with the massive illegal immigration we've seen, parents are simply rejecting the too-high offers of native teens and going with cheaper illegal-immigrant labor. The timing isn't there, though, since the book's author, Forman-Brunell, notes the paranoia starting even by the late '80s and early '90s. If parents were just substituting one type of worker for another, "the babysitter" would be just as strong and obvious a cultural symbol as before, except that she'd be a 30-something illegal Mexican instead of a 16-year-old white girl. Clearly that's not the case. Housemaids and lawn-cutters, sure, but not babysitters. Rather, as we all know, parents are rejecting offers from everyone and are just doing it themselves. The babysitter as an icon is just plain gone. (If they're well-to-do, women might participate in a babysitting co-op with other paranoid mothers who they're close to.)

When a firm that used to contract some functions out to other specialists suddenly starts to move more of those jobs in-house, that reflects an increase in transaction costs -- the costs associated with contracting with individuals who aren't under your own management. If aliens blocked off all accountants' offices from outside communication, your firm would have to move those accounting jobs in-house because otherwise you'd have to pay a much higher cost than before to contract with the accountant's office. However, these costs could be more intangible, such as having to trust the other party. If good faith is weakened between the two, their interactions become much more mercenary and lawyerly -- and therefore more costly -- since they will need to stipulate more things in the contract than if they could assume they'd be honored out of good faith.

The General Social Survey shows that sometime in the late '80s, people's trust in others started to fall steadily. During the previous decades of high trust, people were afraid for the babysitter -- some escaped mental patient might try to hunt her down. When trust started plummeting, people were afraid of the babysitter -- now she was the basket case who would destroy your family. In fairness, some of this decline in trust could be an understandable response to people's bad experiences with babysitters, even if you think it's an over-reaction.

I still recall one demented middle-aged cunt who locked me and my brothers in my room while she sat watching TV for a few hours. Hearing the lock caught us off guard, and at first we had no idea what she was doing -- leaving us to starve, stealing things from our house, who knows? After we calmed each other down, I took off the metal ball that supported a corner of my bed, tossed it into a sock, and whacked a small hole through my door so we could at least see what the hell she was up to, and have our screams reach her more forcefully to ruin whatever she was doing. Needless to say, that bitch got canned. The only other incident across probably six years of babysitters and another two to four years of daycare was when a high school girl didn't like one of our smartass remarks and gave the offender a smack. I can't even remember which of us got it because it wasn't a big deal. She didn't show up again, and only later did we learn that she'd been fired for that.

Still, I think most cases like the first one can be avoided by requiring the babysitter to have a good reputation and provide some references. Of course, you have to trust those who provided the references, and if you're trusting, you may not even bother to follow up on them. Plus the gains are so large -- and I'm not only talking about the opportunity cost of the parents' time. The kids are especially better off because they get to hang around a cool teenager instead of their boring parents. That's a good step toward a lively social life right there. Just don't be cruel and hire a guy to babysit your sons. I had a male babysitter once, and he was cool enough -- he tried to teach us the make and model of all the hot cars in his car magazines, and again it was awesome to drive around in his Datsun 280ZX with t-tops.

At the same time, it's better for young boys to hang out with older girls. It boils down to their ability to take you to more enjoyable places because going anywhere with a teenage girl is automatically more fun. It could even be your own home, like the time when my cute high school babysitter had three or four chick friends over and they were all acting goofy and giggly -- hey, what kid can't identify with that?! Her friends kept egging her on to omigod do the dog dance melissa, c'mon! and she'd follow with some funny dance. When she came back the next week (alone), I tried to re-ignite that mood by begging her to tell me what the dog dance was. She looked really embarrassed and kept refusing my pleas and trying to change the subject. Looking back on it, they'd probably gotten buzzed on alcohol and she didn't want her innocent little charge to find out what they'd done.

I think it was also her who hauled me and my brothers away at night, which we weren't too happy about. However, once we figured out we were on the Ohio State campus, it turned into a rite of passage -- our first wading through the sea of nubile estrogen-dripping college girls at night. It wasn't a typical sterile prison-like student dorm because it had wooden external stairwells and the students' doors all opened outward to a walkway that ran around the building, rather than facing an internal hallway. Doors were open, music was playing, lights were still on past our bedtime, and best of all her friends would all run up to these pint-size visitors and ask her, hey who's thissss???? omigod they're so cuuuuute!!! We didn't stay on campus very long -- I think she just had to talk to some of her friends or get something important from them really quick -- but still that was one of the coolest nights of my childhood.

And last but not least is the trip to the babysitter's house. Again, if she was middle-aged, it was like going to prison. But if she was a teenager, in all likelihood her parents would be gone too. It's not even a sexual thing -- like, I've got her alone at last! -- but just having fun unchaperoned in a new place. It seems like you're still supervised since the babysitter is there, but it never felt like that. Rather, it felt like the entire group -- you, your siblings, friends, as well as the babysitter and her friends -- were hanging out unsupervised. I mean, what's greater than getting to hang out unmolested with the cool high school kids, when normally they wouldn't be caught dead with you?

Perhaps the most vivid memory I have involving babysitters is of... damnit, I can't remember her name, but she was in high school. One day she took me over to her house so she could hang out with her boyfriend while I played Legend of Zelda with her younger brother, who as a 6th-grader was closer to my age. Her brother must have memorized the Nintendo Player's Guide because he showed me all sorts of places in that game that I'd been busting my balls to figure out how to reach. We were both completely engrossed in the game, he because he felt like hot shit showing off his knowledge, and me because it felt like discovering a hidden continent.

All of a sudden something breaks my focus, and I look over to see my babysitter and her boyfriend. They're both standing face-to-face, her with her back to us and him looking at us with a creepy grin. He says something like, "See this is how you do it..." It takes a few moments, but I notice that she has nothing but her bikini underwear on below her waist. (If I recall correctly, he'd taken down her shorts or lifted up her skirt.) "Left cheek, right cheek, left cheek, right cheek," he says while giving each one a squeeze. When you have no idea what's going on, you look to the other person's reaction to see what it means. If she'd been into it, she would've laughed in a mortified way and playfully slapped him, like hey don't do that when they're watching! But I just remember her standing dead still with her head hanging and a frozen, apologetic look on her face when she turned slightly toward us, like forgive me, it's not my fault.

Teenage babysitters today wouldn't believe it, but all of these episodes took place in an upper-middle-class suburb of Ohio, not some seedy slum in New Jersey. Times were just wilder then, even for little children, and that was only possible because of the high level of trust. Once good faith evaporates, parents go into lockdown mode and shield their kids from learning first-hand about how the real world works. Again, it wasn't that bad; the worst that happened was being locked in my room for a few hours. I did see my babysitter groped against her wishes, but if that kind of thing is going on in the world, it's better to know about it than be naive. Sometimes girls date creeps and losers before they figure things out.

Plus there's an entire world that kids need to get prepared for, but which their parents or other adults can't easily introduce them to -- namely the world of adolescents and young adults. As a kid, the only passport to visit that future of yours is hanging around your teenage babysitter. No other adolescent has any incentive to protect you like a guide when you venture into that world -- all the other older kids, the males at any rate, would drive you out. You might be lucky enough to have an adolescent sibling when you're little, but even then sibling rivalry will make them treat you like shit in front of their friends. I don't think it's only the financial incentive that the babysitter has to make sure you're fairly safe in teenage and college kid world, though that helps. Good babysitters -- the ones most likely to get hired -- give off a vibe of liking kids and growing somewhat attached to them. Then parents don't need to discipline the babysitter by threatening a pay cut if she does a bad job. Instead, she'd just feel horrible if another teenager tried to mistreat her cute little kids. She'll be permissive enough toward the kids that they can enjoy teenage world, but protective enough that they won't get seriously hurt if their bike turns over during the ride.

March 2, 2010

Children's subversive songs

While browsing through the folklore section of the library stacks, my eyes fixed on the title: Greasy Grimy Gopher Guts. It's 200 pages of collections of "subversive" children's songs, arranged by theme, plus another 50 or so pages of footnotes for those with a more academic interest in the material. Definitely worth a read, partly for nostalgia, just in case (like me) you've forgotten this one:

Joy to the world,
The teacher's dead!
We barbecued her head.
What happened to the body?
We flushed it down the potty.
And round and round it goes,
And round and round it goes,
And round, and round,
And round it goes.

I actually had an easier time remembering those words to the tune than I did the words to the Christmas song, which I had to google to fully recall. (All I was sure of was "Let Earth receive her king" plus "And heaven and nature sing.")

And who could forget all of the extra verses to "Glory glory hallelujah / My teacher hit me with a ruler." The book is a neat look at verses you've never heard, plus having them all assembled in one place lets you see patterns. For one thing, family members are entirely absent -- kids try to take down their friends, rivals, strangers, teachers, pop culture icons like Popeye and Barney, but really never their siblings or parents. Freud and John Hughes are responsible for exaggerating how intense conflict within families is. Sure it's there, but to most kids their siblings and parents are invisible and ignored, not hated and rebelled against.

The most recent data the authors collected are from 1994 at an elementary school, and these songs were still doing OK then, although most entries were contributed by people who came of age in the mid-late-'50s through the '80s. Hardly any were from the '40s or earlier. I think that goes along with the "wild times vs. tame times" idea I've written about elsewhere on this blog. I wonder how many of these subversive songs an 8 year-old would know today. Children might not be so sensitive to larger social changes, since they're not in fully social mode yet -- unlike adolescents and adults who are hyper-aware of what's going on. Still, my hunch is that these subversive songs -- especially the really strong, offensive, or gross ones -- aren't as popular among children today. As with the case of "99 Bottles of Beer on the Wall", I never heard them at my tutoring center.

Sounds like a good project for a folklorist or similar person to get to work on: what's changed in the 15 years since? Have these songs faded away just like all sorts of other wild bits of culture? Of course the humanities really went down the toilet for most of the '90s and 2000s, so there probably aren't too many people willing to do it. That crap seems to be on the way out, though, so maybe we'll find out some day.