The idea that you would voluntarily run into a bunch of people you went to high school with and yak about the old times strikes many as too sappy to even consider. Is your life so pathetic now that your plan for fun is to re-live the glory days of high school? As a late bloomer who's still enjoying his prime -- not that I've deluded myself into thinking it'll last forever -- I sympathize with this reason to stay away. Still, everyone has at least a streak of sentimentality, yet I wouldn't go to any reunions even if I felt like indulging that part of me.
My guy friends from adolescence are like war buddies. You banded together and made it through the jungle alive. If you don't form a group of your own, you'll be run over by some other mob patrolling the hallways and the cafeteria. They saw you at what will likely prove to be the most desperate and vulnerable points of your life, and you them as well. And when the social battle momentarily let up, they were the ones who you had reckless fun with, sometimes breaking the law along the way.
Ten years after graduating from high school, you can't hope to re-establish even the tiniest bit of that camaraderie. We forget to take account of how much our behavior is affected by our circumstances, notwithstanding our enduring personalities, and so we expect that hanging out with our high school friends will be just like old times. But because we've inhabited a profoundly different social environment for so long -- one that is less barbaric and where you can thrive pretty well on your own -- we're no longer who we were back then. It sounds too simple to bother emphasizing, but the typical reunion attendee more than half-expects it to be like the summer after freshman year of college, when they saw each other again and not too much had changed.
Having survived the hell of war together, we would find it deeply profane to meet up 10 years later during peacetime and start yakking about the weather and the economy. Sure, we have plenty of everyday recollections of those friends -- meandering toward the bowling alley to play video games, keeping your friend from growing bored during his paper route, etc. But overall my memories involving them are much more intense, and I don't want to dilute them with images of chatting about sports and the rat race in some bar. Sequels almost invariably suck in comparison -- especially ones thrown together 10 years or more after the original.
And then there are the girls. Or, I guess, they're women by now. For those who don't get my preference for teenagers, just go to your 10-year high school reunion and tell me what percent of the women there still give you a boner just looking at them. True, you're not as horny as you were at 15, but unless you have no libido left, they should be doing something for you. The reason they aren't is that they're long past their prime. For some it might have been 17, while others may have lucked out and lasted until 24 or 25. But by their late 20s, you will be disillusioned to see them again.
We all know that at some point a woman's looks begin to fade, and we even have firsthand knowledge of this as long as we're paying attention to our age-mates as we grow up. However, we tend to change social circles so much during our 20s that we can only compare aggregates -- what the girls in your senior English class looked like compared to your co-workers in their late 20s and beyond. But when we see the exact same person for the first time in 10 years, the fuzzy intuition we have from comparing averages suddenly crystallizes into a sharp awareness. Their deterioration is no longer hypothetical or even probable -- you're dead certain, and this adulteration of your mental pictures of them is unbearable. Let alone when it happens with one woman after another, all within half an hour!
Perhaps more important than their looks is their demeanor. They're less obnoxious than they used to be, but the other side of that temper was a hotheaded cockiness that was intoxicating. They can never get this back; after their early or mid 20s, it just comes off as bitchy and annoying, even nagging. And because all teenage girls are effectively bipolar, their intense lows were matched by intense highs. Even the fortunate ones who grow up to be demure and pleasant cannot match the more ebullient sweetness of young girls -- the sparkle in their eye when they're running down the hall toward you, the spring in their step as they leap into you, and the grip of their muscles as you spin them around.
And unlike guys, girls completely lose their sense of humor and ability to have fun once they hit 25 or so. It's understandable given that that's when nature expects them to become mothers and settle down to raise a family -- fun time is over. For her. Unlike late 20-something women trying to show they've still got it by dancing on the bar, young girls go with what's true to their feminine nature -- being silly and cute.
One of my fondest memories is of a day in late spring of 8th grade when our Gen X English teacher let us go outside for class and do whatever we wanted, as long as we didn't run off. My closest chick friend at the time paired up with me and we found a quiet spot on the slope of a small hill. We passed the time joking about I can't even remember what, while she drew stars and hearts on my Chuck Taylors, then traced out her nickname for me in her girly handwriting. It's not quite as severe as cutting your thumbs open and pressing them together to become blood brothers, but it was still a physical way of leaving an impression on me so that I wouldn't forget her. She was one of the prettiest and most popular girls in school, and I was one of the rebel kids -- what passed for one in 8th grade anyway -- so it felt even more surreal. As lovely as I'm sure she turned out, giving in to the temptation to see her again would only threaten the virgin image I have of her. *
Aside from silliness, there's coyness, and this produces its strongest effect when she's young and her blood is flooded with testosterone. Only when her sexual desire and frustration are palpable do we marvel at the resolve of a coy tease. Seeing her at 29 taking a stab at playing hard-to-get will only uproot your perfect picture of her and spread weeds throughout your mind. It's no different from looking up an adolescent crush on Wikipedia and having to look at some ridiculous photo like this.
Whether they're charging along in their career or staying stuck in their rut in their home town, the forced levity and the inane details of what they're up to would freeze my memories of them and crack them apart with a hammer. They were at their most moving while racing to class late in a fit of giggles, bending clear over in front of you in math class, or writing notes on the back of your neck when the teacher showed a boring movie.
College is somewhat different, since you were at least through puberty and not so far away from adulthood. So it wouldn't feel too bizarre to see them again, but I'm skeptical even of that. I actually wouldn't mind hanging out with some of my guy friends, since it wasn't such a battle zone, and hence there's no conflict between a wartime and a civilian frame of mind. I still don't know if I could handle seeing how the cuties I went to college with have aged. At least 10 years after high school they're only 28 or 29 -- 10 years after college, they'd be in their early 30s. I just don't get why people are willing to risk spoiling the purity of their memories just for an hour or so of tedious conversation.
* Not only would that kind of cute silliness be wholly lacking at a high school reunion, but the very idea that two people from different cliques would become close would be out of the question. Unlike secondary school, when your social circle is pretty heterogeneous, after you settle into adulthood it's much easier to sort yourself into a group of people who are almost exactly like you. I'm not talking about some lame one-off encounter between members of different tribes a la The Breakfast Club (so lame), but a few steady friendships between the preppy girls and the punk boys, or the stoners and the nerds, or the blacks and the whites.
November 30, 2009
November 25, 2009
Starbucks, bulwark of the cafe culture
Here's an article on the changing cafe culture from then until now, in which the author notes a decline from the Viennese ideal:
We've also used it to balkanize ourselves. The Viennese coffeehouse is a communal exercise in individuality: As an Austrian friend noted recently, his compatriots don't go to cafes to socialize -- everyone goes to watch everyone else. This phenomenon doesn't quite work in America because cafes here tend to draw specific crowds: a hipster cafe, a mom cafe, a student cafe.
Of course, there is one minor holdout against this trend:
With the exception of the ubiquitous Starbucks, where slumming and aspiration meet, we use our coffeehouses to separate ourselves into tribes.
That tiny exception of the ubiquitous Starbucks shows that there's nothing to worry about on this score. Only a handful of pretentious geeks hang out in these Balkanizing cafes. That's true, by the way: the indie cafe in my neighborhood only has liberal white people between 27 and 47 who belong to indie rock or hippie culture. At the nearby Starbucks, you see little kids up through grandparents, with the middle 50% running from about 20 to 50. You see Hispanics, Asians, and Middle Easterners regularly, and blacks too, though less so. It does trend liberal, but only barely. I'm as likely to overhear a group of queers talking about staging a kiss-in as I am to sit next to religious conservatives discussing how to honor God through their behavior. The sub-cultures you find in the indie cafe are all there, but most people appear drawn from the mainstream.
The only filter at work at Starbucks is for IQ, but that's true for all cafes, and I think it's much weaker at Starbucks. Indeed, trying to signal how brainy you are is one reason why insecure people hang out at the indie cafe -- "You know, people who only graduated high school go to Starbucks. They're not going to know about Feynman or Truffaut." It's important for smarties to stay in some contact with the left half of the bell curve -- especially ones who have naive views of how to help them out -- so Starbucks wins on that count too. (It would be better still to go to a sports bar, but I can't stand watching most sports.)
As for the most annoying trend among cafes --
Which brings us to the laptop. At any given moment, a typical New York coffeehouse looks like an especially sedate telemarketing center.
This is also less pronounced at the local Starbucks than at the local indie cafe. I almost never see anyone in the indie cafe reading a book, and rarer still the newspaper. But using their laptop to check their email, refresh the NYT's homepage again, and follow Conor Oberst's pet hamster on Twitter? You bet. There are a handful of laptop users at the Starbucks, but they're a smaller percentage. Compared to the indie cafe, there are just a lot more people-watchers, socializers, meeting-holders, readers, writers, drawers, and lovers.
There's not much to do at an indie cafe other than sit around and signal. If you actually want to go do something, head over to the Starbucks instead.
November 23, 2009
Juvenile vs. mature open-mindedness
Continuing with one of the ideas from the post below, one landmark of growing up is changing your mind about some things that you hated from your childhood and adolescence. Typically we think of young people as more open-minded, and the personality trait Openness to Experience does peak in the late teens and early 20s. However, this is more of an openness to new or different stuff. I'm talking about being open-minded enough to admit that you misjudged something good as bad -- a much tougher error to admit than the other way around.
When you're a teenager or young adult, you're too socially desperate to question the worth of your behavior and your culture -- you just have to shut up and go with it, or else you won't fit in with your clique during a time when you are unable to go it alone. Once you're more socially independent, which happens sometime during your mid-to-late 20s, you can relax your narrow-minded devotion to your group's products and practices. You're even free to borrow things from a rival faction and not suffer as much: ostracism is less powerful when social relations are less tribalistic.
How do we know that tribalism declines after the early 20s? Simple: look at how conspicuous your group membership badges become. Even adults belong to groups and signal their affinity, but the intensity of the signal is lower and the noise around it is greater. For one thing, unlike teenagers, adults don't wear clothing with logos, or with names, pictures, and other icons of their favorite entertainers. Their use of slang is a lot less frequent, and the turnover rate is much lower. Some professionals have jargon, but it's nothing like the slang of teenagers, which obviously functions as a set of shibboleths.
As a concrete example, consider all of the pop music groups who disgusted you as a young person. (Which pop music groups we follow is one of the main ways that we express our tribal membership.) It's possible that all of your judgments at the time were correct, but it's not very likely. Now, some of them you'll grow to like just because your tastes involuntarily change with age, like preferring the more bitter espresso to Frappuccinos or the more pungent Roquefort to cream cheese. But these don't require questioning your earlier assessments.
Take a group that was pretty popular when you were a teenager, but one who your tribe was steadfastly against. If you had even considered listening to that group, your peers would've threatened you with excommunication -- "we don't listen to them." Once you can more safely tell those people to fuck off if they don't approve of your musical tastes, you start to re-examine some of your earlier judgments and find many of them to have been wrong. You may feel that they were necessary and rational in the context of surviving tribalistic adolescence, but still they were unfair. All of a sudden, your mind opens up to wholly uncharted waters of the cultural oceans.
At the same time, you don't overturn all of your previous decisions. Motley Crue really did stink -- no mistake there. But if you don't uncover at least a couple of faulty convictions when you rummage through the volumes of your life experience, you haven't matured yet. To see that underlying merit is what gets a sentence overturned, note that many people re-examining the same collection of cases will independently arrive at the same conclusions -- this one was dealt with fairly, but that one shouldn't have been punished. Unlike Motley Crue, Guns N Roses are much more likely to have their credibility re-established, quite simply because they were better.
To measure how much more open your mind has become, we just ask how you respond to the cases that would sting the most to admit you were wrong -- namely, where their worth was so great and yet where you spilled the most blood. There's no one perfect example of this, but disco sure comes pretty close. At least within the realm of pop music, few other groups can match Chic on musical innovation, virtuosity, complexity, and breadth of emotional range. If the most sophisticated evaluation you can make is that "disco sucks" or that only queers like dance music, you still have a lot of growing up to do.
November 22, 2009
Last great rock band -- Guns N Roses?
I've been thinking more about this post on age, generations, and pop music that Steve Sailer posted from a reader. My take is that the decline of quality rock music doesn't stem from a lack of young people: there have been echo booms that should've allowed for something good to emerge, and anyways how many young people does it take to make good music? It's not as though we went through a bottleneck after the Baby Boom. Tens or hundreds of millions is still a shitload of people.
Rather, the decline stems from the general civilizing trend that started in the early 1990s. Violent and property crime rates are down, so are the many forms of child abuse (except parental neglect), promiscuity started falling, etc. When the culture is thrill-seeking, there's a demand for wild music. When fun no longer characterizes the zeitgeist, you're going to get whiny, angry, and sappy music.
(As an aside, someone here, perhaps TGGP, says that alternative rock wasn't supposed to be fun, why is being sad sometimes such a bad thing, and so on. That misses something: no one listens to grunge or alternative rock when they're sad and want sad music to give them company. People would rather listen to "Drive" by The Cars, "Don't Cry" by GNR, "Everybody Hurts" by R.E.M., or "End of the Road" by Boyz II Men. Those groups made fun music on the whole. When you listen to those songs, it sounds like they're speaking to, for, and about all people. When you listen to most alternative rock, it's hard not to conclude that the sadness is really just bitterness from some loser outcast. It doesn't feel like the singer is reaching out to anyone, but instead insisting that everyone else listen to him whine about how bad his particular life is.)
At any rate, which band marks the end-point of rock music? I searched YouTube for a bunch of candidates and sorted the results by how many times the video had been viewed, a measure of popularity long after they were in the spotlight. I just looked to see who had videos with 10 million or more views. For comparison, Michael Jackson has a lot -- more than 10, maybe 20 or so, but I didn't check for duplicate videos. No surprise for the King of Pop.
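(For what it's worth, that manual tallying could be automated nowadays with the YouTube Data API v3, which didn't exist back then. Here's a minimal sketch, assuming you have an API key of your own; the band list, the 10-million cutoff, and the count_blockbusters helper are my illustrations, and like the by-hand method it makes no attempt to weed out covers or duplicate uploads.)

```python
# Rough sketch of the view-count tally using the YouTube Data API v3.
# Requires google-api-python-client and an API key (placeholder below).
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"  # hypothetical placeholder, not a real key
youtube = build("youtube", "v3", developerKey=API_KEY)

def count_blockbusters(artist, threshold=10_000_000):
    """Count the artist's top search results with at least `threshold` views."""
    # Grab the 50 most-viewed search hits for the artist's name.
    search = youtube.search().list(
        q=artist, part="id", type="video",
        order="viewCount", maxResults=50,
    ).execute()
    ids = [item["id"]["videoId"] for item in search.get("items", [])]
    if not ids:
        return 0
    # Pull view counts for those videos in one batch request.
    stats = youtube.videos().list(part="statistics", id=",".join(ids)).execute()
    return sum(
        1 for v in stats.get("items", [])
        if int(v["statistics"]["viewCount"]) >= threshold
    )

for band in ["Guns N Roses", "Nirvana", "Michael Jackson", "Fall Out Boy"]:
    print(band, count_blockbusters(band))
```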
Guns N Roses has 5, and no other rock group from then or since has that many. Actually, Fall Out Boy does, but those videos are very new; it's most likely a fashion thing, given the quality of their music. We can check again in 20 years to see how they're judged. Much older groups have fewer just because demand for videos on YouTube is driven by young people, so The Beatles don't do well (although Elvis does). Either that, or GNR really are better than The Beatles.
In second place, not even neck-and-neck, is Nirvana with 3. (If you lower the cut-off to videos with 1 million views, they still don't win.) I checked some other usual suspects -- U2, Green Day, Pearl Jam (they're nearly forgotten, thank god), R.E.M., Bon Jovi, etc. -- and couldn't find anyone. Even Aerosmith's comeback of the '90s didn't come close.
You can try for yourself and see if there's someone I missed, but it seems like Guns N Roses were the last great rock band. No ax to grind here, as I'm not a die-hard GNR fan. At the time, I was much more into Nirvana than Guns N Roses, but in retrospect it was just fashionable. During the transition to alternative rock, I remember that you could still listen to Guns N Roses, but that was it -- no Bon Jovi, even though they made some great songs too. After whiny rock became the norm, though, you couldn't even like GNR -- they got lumped in with Motley Crue and Poison.
Anyway, it's also neat to see how the rankings of a group's songs have changed somewhat over the years. I remember Nirvana's "Lithium" getting a lot more airplay on MTV than "Come as You Are," but in YouTube view counts it's the other way around. And "In Bloom" places 6th on YouTube even though it was shown more than either of those two on MTV -- by a long shot. Still, "Smells Like Teen Spirit" tops both. And now "Don't Cry" by Guns N Roses handily beats out "Welcome to the Jungle," but I recall the popularity going the other way back then, in the early-to-mid '90s at least. Again, it isn't total musical chairs, because "November Rain" and "Sweet Child O' Mine" are at the top of both lists. For Metallica, the top YouTube song is "Nothing Else Matters," far ahead of "Enter Sandman" -- a complete reversal from when those songs were released.
Although only based on a few groups, it seems like we don't value the harder and wilder songs as much as the slower and more despondent ones. I doubt that it's because the original fans of Nirvana etc. have grown up and now have different tastes. Most people on YouTube probably didn't even hear "Sweet Child O' Mine" when it first came out. Maybe we're just picking the songs that are closer in mood to today's festival of whining. You could check that with The Beatles. In the 1980s when the culture was still wild, did they value the early or later Beatles more? Of course after alternative rock, we're supposed to worship all that dopey junk that followed Rubber Soul.
If that's not it, my second guess is that we value the slower songs because there's been such a dearth of substitute songs since rock music ended in the early '90s, whereas we've had plenty of songs that are close enough to "Enter Sandman" or "Welcome to the Jungle" that we don't really need to mine the past to satisfy those wants. I should have asked my high school tutees what slow dance songs they played at their school dances -- could there be any good ones written in the past 15 years? Maybe "Good Riddance" by Green Day. That's all that comes to mind.
Creepiest place ever
Without a doubt, that would be Sur La Table. You're typically the only male there, which you'd welcome in another situation, but not when they are middle-aged women who block out their childlessness by cooking all day long, and who probably haven't been touched in decades. Sometimes teenagers will track me nervously through a store and at a distance (it's hard not to notice when they're looking at you). But in Sur La Table the women shamelessly descend onto your space, and you have to be really nimble to escape because they counter your evasion right away.
It really is striking how quickly the knack for flirtation fades away in females. With teenagers or college students, it feels like a playful game of hide-and-seek through the maze of displays in Urban Outfitters. Once they get into their late 20s, you feel like you're surrounded by a bunch of determined kidnappers.
November 18, 2009
He really is that influential
Malcolm Gladwell recently released a bla bla bla, and there's been a lot of yadda yadda in the blogosphere about it. Are his credulous misunderstandings really as influential as the critics suggest? Here is a plot of how frequently the phrase "tipping point" has appeared in the NYT through 2008:
There's one occurrence in 1941 and another in 1959, but it's not until the '60s that it shows up regularly. Still, it's just a small but steady stream of blips, typically less than five times a year. In the very year that Gladwell published a book with that title, it shoots up by an order of magnitude, and since 2005 it has risen yet another order of magnitude, although this lame phrase appears to be passing out of fashion.
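(If anyone wants to re-run the "tipping point" counts, the NYT's Article Search API reports how many articles match a query within a date range. A rough sketch, assuming the current v2 endpoint and an API key; the year range and the six-second pause are my own choices, not part of the original tally.)

```python
# Rough per-year count of NYT articles matching a phrase, via the
# Article Search API (https://developer.nytimes.com/). Assumes an API key.
import time
import requests

API_KEY = "YOUR_NYT_KEY"  # hypothetical placeholder
URL = "https://api.nytimes.com/svc/search/v2/articlesearch.json"

def yearly_hits(phrase, year):
    """Total NYT articles in a given year matching the quoted phrase."""
    params = {
        "q": f'"{phrase}"',               # quotes ask for an exact-phrase match
        "begin_date": f"{year}0101",
        "end_date": f"{year}1231",
        "api-key": API_KEY,
    }
    resp = requests.get(URL, params=params)
    resp.raise_for_status()
    return resp.json()["response"]["meta"]["hits"]

for year in range(1995, 2009):
    print(year, yearly_hits("tipping point", year))
    time.sleep(6)  # stay under the API's per-minute rate limit
```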
You don't have to coin a new phrase or concept to get rich quick -- you only have to popularize an existing one in a way that makes managers come away feeling well read.
November 17, 2009
"About half in U.S. would pay for online news" -- NYT
Read the full article here. This may be the first time that the headline was better written than the first paragraph. The writer emphasizes how much less willing Americans are to pay than people in other countries, when of course the real news is that people are willing to pay for online news at all. Never mind the endless hogwash you hear from Silicon Valley about how no one is willing to pay for anything anymore in this brand new economy of ours.
The consulting group that did the study suggests that charging for online news will make American newspapers more profitable. We already knew that from the Wall Street Journal, the Financial Times, and even the smaller-scale Arkansas Democrat-Gazette. But now that "studies show," maybe everyone will listen.
It's not surprising that Americans are more price-sensitive than people in France, Italy, or Spain -- that's true across the board, not just for online news. Spaniards are willing to pay a little more to get a little better quality for just about anything, which is why junk food, Wal-Mart-wear, and skyscrapers have enjoyed much less success over there compared to here. The flipside, which we don't see when we romanticize their lives, is that their choices make them more strapped for cash.
Having lived in Barcelona for about a year, I still prefer the American lifestyle because there are more options here. You can certainly live the human-scale, cosmopolitan life here if you want, but you're also able to wear jean shorts and eat Doritos for your dinner. Given how weak our social bonds are here, and given how cheap the transportation costs are, people will sort themselves into areas where they don't have to be bothered by people who have tastes that offend them. Of course, that sorting is far from complete, so people from across the spectrum are forced to live in the same areas.
While I'm digressing, I think that's another superior aspect of American life -- being forced to interact with people who are different from you, especially those who you'd rather not be around. You always hear people whining about how homogeneous our communities are becoming and that we need to struggle to make them more diverse. But if they think our communities are that way, they should visit any of the European cities that they romanticize. In the same way that whites who live in all-white areas are naive about what black and Hispanic culture is like -- and in general I think naivete is dangerous -- Spaniards were infuriatingly clueless about how Americans really live.
Americans don't hold strong but foolish views about how Spaniards or Italians or anyone lives because they don't give other countries much thought at all. But it was totally common for a Spaniard to ask me "Why does everyone bring guns to school in the U.S.?" while their friends looked at me with a straight face, expecting a serious answer, rather than turn to their friend like, "Nigga, is you crazy?" They had seen Bowling for Columbine -- I mean, how much more research can they be expected to do? It was a real "AFL-CIA" moment.
But that's an example of little consequence. What I really mean is that Spaniards are naive about how slobs live because there are so many fewer of them over there. Some pleasures we respond to on their own merits, almost reflexively. But many other things that enrich our lives do so even more when they stand out in contrast to their inferiors. Sporting a rakish necktie or savoring some French onion soup in America may make us more snobbish -- "Well, I for one still care about my appearance" -- but I think we draw greater satisfaction from it than a Frenchman does because we look around and see baggy cargo shorts and jelly beans and think to ourselves, "God, that could've been me..."
People in the Latin countries are more likely to take those pleasures for granted, having grown up with them, while the Americans who enjoy them don't process them in such a narrowly hedonistic way -- we feel a humbling sense of gratitude because throughout our daily lives we see how fragile civilized ways of eating and dressing are.
Returning finally to newspapers, I'd bet that those Americans who appreciate good reporting get more out of it than the Germans or Italians do because, as the article notes, we have a much wider variety of news sources here, much of it free crap, while journalism in much of Western Europe is dominated by a few high-quality sources. Reading the WSJ isn't merely rewarding -- it feels like they've come to the rescue! "Jesus, I almost had to go to CNN..."
And don't even get me started on how much more grateful we Americans are for Penelope Cruz than are the Spaniards, who take her for granted...
November 16, 2009
Why you might sensibly fight for a stunning but older woman
November 15, 2009
The rise and fall of American journalism in one picture
Looks like the NYT's search engine now lets you search by date all the way back to its founding in 1851. Searching for the word "the" returns the number of "articles" in a given period -- I use quotes because it may have been only a few paragraphs or several columns long. Using this measure of output, here's the annual picture over time:
Output of articles has been stagnant, though with oscillations, since the early 1970s. The really sharp drop began in 1952, perhaps reflecting the spread of TV as a news source from then until TV was close to universal in the 1970s. WWII had a visibly negative impact, although not the Great Depression -- output stalled out but remained at an all-time peak throughout, with the most articles written in 1936. The period of fastest growth was -- you guessed it -- the Roaring Twenties (leaving aside a brief dip that may reflect the early '20s recession). Before that, growth was pretty steady except for the early 1890s and mid '00s (the banking panics then may have disrupted output).
I'm not concerned here about the financial health of the industry but rather the quality of the product. It's hard to quantify quality, but since newspaper articles are probably like books, movies, and TV shows, you need much greater sample sizes to find the rare gems. Small sample sizes give you a decent picture of normal distributions, but these things are more fat-tailed or "superstar" distributions.
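(To make the sample-size point concrete, here's a toy simulation -- my own illustration, not anything derived from the NYT numbers: treat article quality as a draw from a fat-tailed Pareto distribution and count how many pieces clear a high bar. The count of gems grows roughly in proportion to total output, which is the whole reason to bet on the high-output decades.)

```python
# Toy illustration: with a fat-tailed "superstar" distribution of quality,
# the number of rare gems scales roughly with how many articles get written.
import numpy as np

rng = np.random.default_rng(0)

def gems_found(n_articles, bar=8.0):
    """Draw article quality from a Pareto distribution and count the gems."""
    quality = rng.pareto(a=2.0, size=n_articles) + 1.0  # heavy right tail
    return int((quality >= bar).sum())

# Doubling output roughly doubles the number of superstar articles.
for n in (10_000, 20_000, 40_000, 80_000):
    print(n, gems_found(n))
```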
So, knowing very little about the history of newspapers, and using only this incredibly crude measure of quality, I predict that true journalism lovers will revere the 1920s and 1930s the most, with the '40s and '50s in second place (the absolute level is slightly lower, but it was also trending downward rather than upward, and we don't like downward trends). Output was greatest then, so those decades must have produced more superstar articles than other periods. As a rough check, judging from bibliographies the '20s and '30s look like the peak periods of activity for both Walter Lippmann and H.L. Mencken.
Who said quality is so hard to measure?
November 13, 2009
The best bread substitute -- huge honking pork rinds
Whether you're gluten-intolerant or following a low-carb diet in general, you occasionally want something like toast, bagels, etc., to pile your egg salad on top of, to provide a sturdy base for a stack of lunchmeat and cheese, and so on. Most options are made from grains and seeds, but the human digestive system isn't designed for grains -- and neither is that of any of our primate relatives.
Today I found the perfect substitute -- pork rinds the size of pizza slices. Usually the fat is cut into pieces similar in size to chips, but Guerrero makes some called "chicharrones gourmet" where they've thrown an entire canvas of pork fat into the deep fryer.
Tonight I made low-carb pizza with them, and it tasted better than any other homemade pizza I've ever made. They're not flat, so you have to break off some of the curled areas, but you still get a pretty big piece to work with. Around the edges they were very crisp, like thin-crust pizza, and toward the center the tomatoes and olive oil had softened them somewhat, making them slightly pliable like doughy pizza slices. The real difference, though, was not tasting a bunch of flour. Pork-flavored pizza crust -- hard to beat it.
I also made some salmon salad, and the curly pieces I'd broken off of the big ones worked great as a cracker substitute. Ditto for carrying the side of liverwurst that I had. And again, pork crackers taste better than grain and seed crackers.
If it's more of a chip snack that you're looking for, Guerrero also makes something called "chicharrones de cerdo," where the skin is left attached to the fat before deep frying. They're very crunchy and taste much richer than the skinless type -- no surprise there.
I'm not sure how widely available these Guerrero products are, since they seem to be targeted at Hispanics, but they're definitely worth looking for.
Shaming on an industrial scale
One worry that some have about large and impersonal societies is that whipping a misbehaver into line through shaming becomes much harder when there are simply a lot more people to police, most of whom are total strangers. Most people find it hard to approach a total stranger just to shame them. And even if they did, would the target pay any attention -- who cares what some stranger thinks? Your relatives, your friends, sure, but not some weirdo who accosts you on the street.
But maybe we've managed to mostly solve this problem in industrial societies with an industry that gets a lot of flak -- the paparazzi and the celebrity media generally. Here's a great title from the WSJ:
Ridicule Keeps Fans of Harem Pants From Getting Too Big for Their Britches
Slaves to Anti-Fashion Suffer Taunts of 'Hammer Time,' Bike Accidents
Now the particular women profiled may decide to stop wearing these ridiculous pants because they're so over-the-top that perfect strangers are willing -- probably eager -- to run up to them to let them know how retarded they look. But articles like this -- and it could just as well have been a TV news soundbite -- also let all other women know that they'll incur shame if they wear harem pants. By tarring and feathering all sorts of behaviors, from wearing too-revealing clothes to picking up a drug habit, the celebrity-focused media make it possible to shame an entire nation of women -- you don't want people to associate you with Lindsay Lohan, do you?
With the celebrity media, we have a national standard of what's worth mocking. If you are weighing whether or not to do something on that list, you're going to think twice because everyone else is going to look at you funny -- "omigod, i mean is she trying to give britney spears a run for her money in the skank department? because it's sure working." If we left shaming to more local processes, this wouldn't work so well because we live in pretty impersonal communities. It's striking how little the masses cared about "famous people" before we traded our face-to-face lives for impersonal ones, but maybe back then we didn't need to expose their falls in order to teach people what's good and bad.
There's a separate problem of whether the celebrity media will police what was policed in personal communities. Maybe, maybe not. The 19th C. in Britain, France, and the U.S. saw an unimaginable change toward impersonal exchange, soaring population sizes, and so on -- unprecedented in human history -- and yet we think of these Victorian people as being hyper-vigilant about policing traditional morality. So what they shame and what they give a pass on seems independent of whether the police are a national media culture or not.
And even today, the celebrity media focus on most of the concerns of traditional morality -- they flip out if a famous woman is going to get divorced, and you can imagine how much we'd hear if she were going to get an abortion. Parading herself around like a slut likewise draws vitriol across the board; ditto packing on excess fattage. Nor could she quit a big project she'd committed herself to without catching the same flak. On the other side, they're ecstatic when she's going to get married, have kids of her own or adopt, and when she sees a project through and it's all set for delivery. And of course we hear every update in her effort to lose weight.
We may not have tightly knit communities whose members can efficiently shame one another when they harm others, but the national celebrity media accomplish roughly the same thing -- you don't want to steal a page from Anna Nicole Smith's book, do you?
But maybe we've managed to mostly solve this problem in industrial societies with an industry that gets a lot of flak -- the paparazzi and the celebrity media generally. Here's a great title from the WSJ:
Ridicule Keeps Fans of Harem Pants From Getting Too Big for Their Britches
Slaves to Anti-Fashion Suffer Taunts of 'Hammer Time,' Bike Accidents
Now the particular women profiled may decide to stop wearing these ridiculous pants because they're so over-the-top that perfect strangers are willing -- probably eager -- to run up to them to let them know how retarded they look. But articles like this -- and it could just as well have been a TV news soundbite -- also let all other women know that they'll incur shame if they wear harem pants. By tarring and feathering all sorts of behaviors, from wearing too-revealing clothes to picking up a drug habit, the celebrity-focused media make it possible to shame an entire nation of women -- you don't want people to associate you with Lindsay Lohan, do you?
With the celebrity media, we have a national standard of what's worth mocking. If you are weighing whether or not to do something on that list, you're going to think twice because everyone else is going to look at you funny -- "omigod, i mean is she trying to give britney spears a run for her money in the skank department? because it's sure working." If we left shaming to more local processes, this wouldn't work so well because we live in pretty impersonal communities. It's striking how little the masses cared about "famous people" before we traded our face-to-face lives for impersonal ones, but maybe we didn't need to expose their falls in order to teach people what's good and bad.
There's a separate problem of whether the celebrity media will police what was policed in personal communities. Maybe, maybe not. The 19th C. in Britain, France, and the U.S. saw an unimaginable change toward impersonal exchange, soaring population sizes, and so on -- unprecedented in human history -- and yet we think of these Victorian people as hyper-vigilant about policing traditional morality. So what gets shamed and what gets a pass seems independent of whether the policing is done by a national media culture or not.
And even today, the celebrity media focus on most of the concerns of traditional morality -- they flip out if a famous woman is going to get divorced, and you can imagine how much we'd hear if she were going to get an abortion. Parading herself around like a slut likewise draws vitriol across the board; ditto packing on excess fattage. Nor could she quit a big project that she'd committed herself to without drawing flak. On the other side, they're ecstatic when she's going to get married, have kids of her own or adopt, and when she sees a project through and it's all set for delivery. And of course we hear every update in her effort to lose weight.
We may not have tightly knit communities whose members can efficiently shame one another when they harm others, but the national celebrity media accomplish roughly the same thing -- you don't want to steal a page from Anna Nicole Smith's book, do you?
No fear
Why do guys fear that if they put effort into dancing, or just let themselves go, they'll be perceived as gay -- and so decide it's better not to dance at all, or to paper over their insecurity by goof dancing, like they're too cool to care?
"But gay men are better dancers, so girls will infer that I'm gay!" No they won't. Gay men are a tiny fraction of the male population -- maybe 3% -- so they're unlikely to be found anywhere. And anyway, girls can tell pretty well if a guy is straight or gay.
Try an analogy: are you afraid to speak to girls? After all, gay men are more likely to be outgoing and loquacious, so wouldn't chatting them up make you look gay? You have nothing to fear because if you aren't gay, you won't talk gay. The way you dance is hardly any different -- guys tend to dance like guys, while gays tend to dance like girls. You don't have to worry about any particular move consciously, any more than you would have to worry about any particular vocal inflection -- straight guys' voices simply cannot be perceived as gay voices. They're so naturally distinct.
Most guys can't dance, but you have no way of knowing until you try. And dancing ability isn't a black-or-white thing; you may be better than you think. If you have always been a drummer or tapper, you can probably keep good enough rhythm that you'll do fine in a dance club. I didn't know I could dance until I was 24, when "Take Me Out" really got me going. Before then, just the thought of dancing paralyzed me. Of course, early on I had an out -- there wasn't any good dance music in the mid-1990s, so at school dances you didn't stick out by not dancing, and they played enough Nirvana or Green Day that you could still bounce around head-banging to get the energy out of your system.
As an extreme example of how little you have to worry about: say you're in a dance club and it's '80s night, they're playing "Oh L'amour" by Erasure, you're dancing fairly energetically in one of those cages, and you have a pretty boy face. Every one of those things should lead girls to conclude you're gay, right? Heh, again remember that if you're not gay, you won't act gay -- in speech, movement, or whatever. I hardly got started when four college cuties ran up the steps to pour into the cage and started humping me from all four sides.
They weren't drunk, and they weren't slutty either (for one thing, they were cute, and it's usually the plain-looking girls who have to compete on other dimensions who are slutty). They simply got turned on by male confidence and energy. It's no different from groupies rushing the stage to crowd around a lead singer, or spilling onto the field to cling to the star athlete. Fretting about what they're going to think kills your confidence, so stop worrying -- they're not going to think you're gay.
Actually, there is one thing you shouldn't do, but which I see all the time, among the younger and thus more insecure guys anyway -- never, ever "back it up" into a girl's lap. Guys think they're being ironic and cool with this gender-reversal stuff, but it just plants the seed in her brain that you're the feminine one and she's the masculine one. A veneer of androgyny is OK if it suits you, like glam rock -- but those guys weren't confused about whether men or women had balls, and they were more piggish than gender-egalitarian.
I have no idea why this practice is so common. I guess they haven't had enough time to figure out how destructive it is to their goals, and how emasculating it looks. When you're an emotional wreck at that age about what other people are going to think of you, it feels safer to go the ironic route -- you have plausible deniability if you look foolish. All right, but why go so far as to make yourself the girl? You bend her over -- not the other way around. Seriously, what do they teach kids in sex ed these days?
"But gay men are better dancers, so girls will infer that I'm gay!" No they won't. Gay men are a tiny fraction of the male population -- maybe 3% -- so they're unlikely to be found anywhere. And anyway, girls can tell pretty well if a guy is straight or gay.
Try an analogy: are you afraid to speak to girls? After all, gay men are more likely to be outgoing and loquacious, so wouldn't chatting them up make you look gay? You have nothing to fear because if you aren't gay, you won't talk gay. The way you dance is hardly any different -- guys tend to dance like guys, while gays tend to dance like girls. You don't have to worry about any particular move consciously, any more than you would have to worry about any particular vocal inflection -- straight guys' voices simply cannot be perceived as gay voices. They're so naturally distinct.
Most guys can't dance, but you have no way of knowing until you try. And dancing ability isn't a black-or-white thing; you may be better than you think. If you have always been a drummer or tapper, you can probably keep good enough rhythm that you'll do fine in a dance club. I didn't know I could dance until I was 24, when "Take Me Out" really got me going. Before then, just the thought of dancing paralyzed me. Of course, early on I had an out -- there wasn't any good dance music in the mid-1990s, so at school dances you didn't stick out by not dancing, and they played enough Nirvana or Green Day that you could still bounce around head-banging to get the energy out of your system.
As an extreme example of how little you have to worry about: say you're in a dance club and it's '80s night, they're playing "Oh L'amour" by Erasure, you're dancing fairly energetically in one of those cages, and you have a pretty boy face. Every one of those things should lead girls to conclude you're gay, right? Heh, again remember that if you're not gay, you won't act gay -- in speech, movement, or whatever. I hardly got started when four college cuties ran up the steps to pour into the cage and started humping me from all four sides.
They weren't drunk, and they weren't slutty either (for one thing, they were cute, and it's usually the plain-looking girls who have to compete on other dimensions who are slutty). They simply got turned on by male confidence and energy. It's no different from groupies rushing the stage to crowd around a lead singer, or spilling onto the field to cling to the star athlete. Fretting about what they're going to think kills your confidence, so stop worrying -- they're not going to think you're gay.
Actually, there is one thing you shouldn't do, but which I see all the time, among the younger and thus more insecure guys anyway -- never, ever "back it up" into a girl's lap. Guys think they're being ironic and cool with this gender-reversal stuff, but it just plants the seed in her brain that you're the feminine one and she's the masculine one. A veneer of androgyny is OK if it suits you, like glam rock -- but those guys weren't confused about whether men or women had balls, and they were more piggish than gender-egalitarian.
I have no idea why this practice is so common. I guess they haven't had enough time to figure out how destructive it is of their goals, and how emasculating it looks. When you're an emotional wreck at that age about what other people are going to think about you, it feels safer to go the ironic route -- you have plausible deniability if you look foolish. All right, but why go so far as to make yourself the girl? You bend her over -- not the other way around. Seriously, what do they teach kids in sex ed these days?
November 10, 2009
How to empower women (who want to be): encourage machismo
Today I was listening to some '70s rock (which lasted into 1981 or '82, before new wave took over), and it struck me how strong the female stars are -- Joan Jett, Pat Benatar, Blondie, even tiny Rachel Sweet. According to feminist dogma, it should have been the opposite: given how openly masculine and gender-insensitive that period of rock music was, women should have felt intimidated, perhaps even disgusted, and stayed away.
Contrast that period with the alternative rock period of the early-mid '90s -- the famous women are all of the neurotic trainwreck type. Sure, some of them yelled a lot, but then emotionally fragile people are prone to doing that. It hardly makes them strong. And just as before, this pattern among the women mirrored that found among the male stars, most of whom were afraid of their own balls (aside from Red Hot Chili Peppers, who formed their identity back in 1983). Dude, they're like, latent instruments of oppression, man...
It's easy to see that if the trend in pop culture favors the masculine, this will propel masculine men and women both into the spotlight, while if it favors moping, wusses of both sexes will crowd out their stronger alternatives. But we miss this simple truth when it comes to the jobs that we're supposed to worry about -- business, law, medicine, politics, academia, etc.
The roles, mores, and so on, that are expected in those institutions select for certain types of people. If we're supposed to re-fashion those roles so that they are more typically feminine, then guess what -- self-defeating, passive-aggressive people of both sexes will rise. Among men who were in the workforce before women's liberation, one of the main benefits of the old system was not only that they themselves could get shit done more easily, but that the women who did pass the test were close to them too -- unlike the catty, bitchy career women of today. Now everyone has to waste a lot of time and effort dancing around all sorts of "touchy" topics.
To make it simpler, how would you make girls physically stronger than they are now? It's not a trick question -- by putting a greater emphasis on strength in gym class! Make weightlifting regular, or at least have the kids do push-ups or toss medicine balls every day. The boys and the girls would grow bigger muscles. However, strip away the focus on strength -- say, by turning gym class into an aerobics class -- and suddenly the boys and girls alike will become weaklings. (Most people who you see exerting themselves lifting weights look to be in decent shape, while the people who you see riding bikes or jogging around town tend to look like hell.)
Of course, many girls -- probably most of them -- would just sit it out and maintain their pleasantly soft figure instead of bulking up. And women who couldn't hang with the men back in the pre-women's lib days would have happily taken a more feminine job instead. We don't really have to worry about that. But what about women who could become Olympic athletes or competent businesswomen? If the zeitgeist constrains the roles to be less masculine, then they will never be pushed to succeed and will remain underdeveloped. (Same goes for the men too, of course.)
Before women's lib, we had greater diversity in roles -- from very feminine to very masculine -- allowing a wider range of specialization and greater choice for individuals. But by biasing the roles systematically toward the feminine side, current institutions choke off half of the source of social and cultural dynamism, as people of both sexes who would thrive on the masculine side must join what they see as the namby-pamby side.
One encouraging sign comes, for once, from the obstreperousness of little boys. At least we know that tolerance of emasculated social roles has to be beaten into them, rather than easily impressed upon a blank slate -- and anything that costly to instill is less stable over time. Most of the time when you hear some brat throwing a fit in public, you want to go over and put him in his place. But I make one exception, when I'll actually shoot him a nod of respect -- "ah maaaaaan, i don't wanna. that stuff's for girrrrls!"
November 5, 2009
Does technology fail because of switching costs and user laziness?
[Much more detail on this topic can be found at Stan Liebowitz's webpage, especially in the books The Economics of QWERTY and Winners, Losers & Microsoft.]
Google's web browser Chrome hasn't done well in its first year. Obviously we'd need to wait for another year or so just to see if things stay that way, but we're already hearing the predictable whining from its supporters about why it isn't off to such a hot start. Impartial spectators in the article ask obvious questions like, "Does the world need another web browser?" and "Does anyone but geeks care about how fast it goes compared to current alternatives?" Those are the obvious reasons that no one is adopting Safari, Opera, or Chrome -- nothing gained for 99% of internet users.
But losers are never content with simple answers, so of course we have to hear all over again about how users are so lazy that they get "locked in" to the browser they start off with, and that they don't migrate to an allegedly superior one because of the high "switching costs." Both are wrong.
Whiners always have a selective memory, so it may surprise them to notice that there have been three separate cases of a browser surging in market share within a span of only 10 to 15 years. First Netscape Navigator soared to dominance, and not just because it was first -- there is no reason there had to be a winner-take-all browser; usage could well have been spread evenly across the early browsers. But Navigator was better than the others, so it won. Then Microsoft's Internet Explorer came along, and it was better than Netscape -- so it rose to dominance, dethroning Navigator. A while later Firefox showed up, and it's been steadily eroding Internet Explorer's market share ever since. Quite plainly, there is no such thing as lock-in or inertia in web browser usage.
Whiners also say that most people lazily stick with the default browser their computer came with. That would surprise Netscape Navigator users, since Navigator was not the default. Nor has Firefox been the default, and it's been sapping the growth of the browser that is the default -- for almost all computer owners, no less. Safari has been the default on Macs since 2003, and yet despite Macs making up roughly 10% of the computer market, the Safari browser that ships with them has only 2-3% of the browser market. In other words, most Mac owners junk Safari and install something else. At least for things that are important to people, there is no such thing as blindly following what's given.
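A quick back-of-the-envelope check of that Safari claim, sketched in Python with the rough figures above and the deliberately generous assumption that every Safari user is on a Mac (any Windows Safari usage would only push the number lower):

    # Rough figures quoted above: Macs ~10% of computers, Safari ~2-3% of browser usage.
    # Generous assumption: every Safari user is on a Mac.
    mac_share_of_computers = 0.10
    safari_share_of_browsing = 0.025   # midpoint of the 2-3% range

    # Upper bound on the fraction of Mac owners who kept the default browser
    kept_default = safari_share_of_browsing / mac_share_of_computers
    print(f"At most {kept_default:.0%} of Mac owners stuck with Safari")
    print(f"So at least {1 - kept_default:.0%} installed something else")

Even taking the high end of 3%, no more than 30% of Mac owners could be sticking with the default.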
And as for switching costs, there are none but those that the newcomers inflict upon themselves, at least for web browsers. The analogy behind the switching-costs story is learning a new language: imagine that one language were more expressive than another -- you still might not switch, because it's simply hard to re-learn a whole new set of rules relating to word order, vocabulary, word-formation, sound sequences, and so on.
But that is almost never true for a mature technology because the producers have an incentive to make new technologies compatible with old ones -- in particular, they have an incentive to make the knowledge about how to use the old one carry over. When CD player producers wanted consumers to switch from playing cassettes to CDs, did they throw a whole new set of buttons at them? No, because no new functions were needed -- stop, play, etc. Did they plaster a whole new set of symbols to identify each function -- say, a happy face on the play button, a sad face on the stop button, faces looking left and right for the rewind and fast-forward buttons? No, they carried over the same symbols as before. Some of these pairings between symbol and function are so standard that you can easily pick out the record button on any device that records.
These no-brainer design choices made using a CD player automatic if you already knew how to use a cassette player. All you had to learn was how to open and close the disc drive or tray, but they made that as similar as possible to ejecting a cassette.
(Similarly, I can play a video game that has jump and attack functions and know exactly which buttons do which, without reading the manual, and even if they have different symbols on them, because those function-position pairings were standardized 20 years ago.)
The same is true for web browsers. Firefox carried over the left and right arrows for back-page and forward-page, a circle-with-an-arrow for refresh, a red octagon for stop, and a house for homepage. The URL bar, probably the most important feature, looks the same. Really, when you try out any web browser for the first time, are you that confused about how it works? It couldn't be simpler to switch. There are surely lots of differences across browsers that no one cares about, like how to alphabetize your list of 40,000 tentacle porn bookmarks, but 99% of the activity that 99% of users are doing requires only a handful of features that have become standardized.
There are a few exceptions that might make it harder to switch, but those were self-inflicted. For example, why does Safari fuse a stop button with a bookmark button -- and use the symbol of a plus sign? That's not intuitive at all. Stop buttons should always be red X's, red octagons, red traffic lights, whatever. Not a plus sign.
Also, the two browsers with the lowest share, Opera and Chrome, put their tabs in funny places. Internet Explorer, Firefox, and Safari all have the URL bar above the tab headings. Opera and Chrome have a URL bar embedded underneath each tab heading. Why would this matter? I don't know, but it truly jumped out at me as weird when I saw what Chrome and Opera looked like. I'm sure the Asperger-y designers have a solid logical case for why this is, mathematically speaking, the optimal design. But real people have voted that it isn't. And they don't need to explain why -- it just feels weird.
If web browsers competed at least partly based on price, then relatively poorly designed ones like Opera, Chrome, and Safari could capture a larger market share by charging less. After all, many people, like me, are content with a mediocre mp3 player -- since they rarely use it, they'll just take what's cheap. But since all browsers are free, there is even more intense competition based on quality -- visual design, intuitive use, desired features, etc. So the newcomer had better blow the incumbent out of the water, or don't even bother.
It isn't lazy or unwashed consumers who keep a new technology from being adopted -- it's arrogant producers who declare what consumers must use to qualify as human. How did that work out for the command-line interface?
November 3, 2009
College kids these days throw such boring parties
You may know that young people have been committing violent crime at lower and lower rates since the early 1990s, and that the same is true for property crimes. Illegal drug use is down since the late '90s, and hardly any young people smoke anymore. And of course promiscuity and pregnancy rates for teenagers and young adults are way down over the same time. Plus, with fewer and fewer young people getting their driver's license, you see less rowdy behavior on the roads. Indeed, the only holdout is binge drinking among college students -- for young people who aren't at college, the rates have been plummeting ever since the drinking age was raised.
Still, those are just numbers. To really believe them for yourself, you'd have to do some fieldwork and see the changes with your own eyes. Going to clubs that are 18+ gives you a decent idea -- for instance, I see fewer people dancing together than I did at my middle school dances -- but they could just be presenting a responsible image when they're in public among strangers. Something like a private house party would be more revealing. As it happens, an undergrad chick friend invited me to a housewarming / costume party last Friday, and I was shocked by how tame it was.
First, the house was located a safe 15 minutes off-campus, and only undergrads lived there -- no landlord, parents, etc. Plus it wasn't any old weekend: it was the Friday before Halloween, and most people were dressed up, which should have heightened the potential for bad behavior. And these people are in the IQ range of about 105 to 115 -- above-average, but not the brainiac type. They weren't nerds, losers, or goody two-shoes at all. Based on their counterparts who I knew as a teenager, I would've expected rampant pot-smoking, sex or make-outs in the closet, and drunken quarrels escalating into a fight.
True to the recent statistics, though, there was some drinking, but no one was falling-over drunk. Zero cigarette smoke. Also, no drugs of any kind -- only a couple of people hinting that they might want to get high. That's it? Just a joking side-comment about maybe doing it? Though I've never done drugs myself, they were common enough when I was a teenager that I recall hanging out with my best friend while he scored some weed on a lazy Friday afternoon, or my suitemates in college pouring into one of their rooms and filling it with pot smoke, again for no special occasion.
No violence or even an escalation toward a fight -- pretty remarkable given how much alcohol lowers your inhibitions. I thought I was going to see a catfight when some slut who'd cheated on one of the homeowner's friends walked in. The girls started talking a lot of shit about her, but all behind her back, never confronting her. That would be typical if it were the locker room, but at a private party where she's the unwelcome outcast and the shit-talkers are buzzed on booze? Quite a bit of restraint.
But the biggest shock was that there wasn't anyone running off to an unoccupied bed to start fucking -- not even to make out! (The complete lack of dimly lit rooms didn't help.) Maybe they were just averse to hooking up with strangers, though, right? Well, there were more than a few couples there, and they weren't getting physical at all either. I mean, not even grind-dancing, kissing, or holding hands. For god's sake, a bunch of 6th-graders playing Seven Minutes in Heaven in 1988 were venturing farther sexually than this group!
One of the earliest memories I have of seeing something somewhat sexual is being at my high school babysitter's house when I was probably 7 or 8. I was glued to the TV playing Legend of Zelda with her brother, when we got distracted by something. Turning over, I saw my babysitter standing with her back to me and her boyfriend holding her, facing us. He pulled up her skirt to show us her ass and said, "See this is how you do it -- you get the left one, then the right one. The left one, the right one," while squeezing each cheek in turn. Her head was turned to one side, and I remember her looking embarrassed, almost like she was ready to cry. Years away from puberty, I could still tell that whatever I was seeing was a big deal. *
And yet I doubt any of the Millennial party-goers I hung out with ever saw anything close to that growing up. Internet porn, American Pie, or Superbad don't count -- it's completely different when it's happening right before your eyes. And with the age of losing your virginity steadily rising, I doubt they even heard about that sort of thing -- let alone experienced it themselves -- while developing.
So don't let all those Facebook albums full of beer pong pictures fool you -- aside from some mild drinking, young people have never been so well behaved and so shielded from the slimy and awkward parts of real life.
* The disappearance of irresponsible young people means that little kids growing up in the 1990s or later didn't have reckless, cute older babysitters to give them their first glimpse of the adult world. I recall another high school babysitter bringing over three of her friends, getting buzzed on beer, and acting goofy. (I didn't know what alcohol was at the time, but later I realized that their behavior was due to drinking.) My favorite experience of all, though, was when a college girl babysitter had to return to campus -- or maybe just felt like it -- and dragged us along with her... at night. Boy was it worth it: she took us to a student dorm, or maybe a sorority house, full of barely legal Buckeyes.
omigooood, who's thiiiiissss???!?!?!
Oh them? They're the kids I'm babysitting.
omigod, they're soooo cuuute!!!
Honey, I hear that all the time -- you'd better come up with a more original pick-up line than that. On a serious note, I think that night triggered my pubertal development years earlier than it would have started on its own. Once your body senses that female pheromones are above some threshold, it figures that your peers are now going through puberty, so you should be too. And indeed, I started liking girls several years before the average boy, and enjoyed my first long French kiss when I was 12 -- maybe all it took was a single night's tour through a college dorm, with nubile babe pheromones flooding my nostrils the whole time. Hey, if you dropped a 9 year-old boy into a gang-infested urban slum, he'd probably grow tougher a lot earlier than if you hadn't. Why shouldn't that work for other aspects of adolescent development?
November 2, 2009
Are longer lifespans making adults more childish?
Over the weekend, the NYT gave us a flashback op-ed from 1990 on making Halloween less childish. Before the civilizing trend of the early-mid 1990s through today, it was still possible to talk about Halloween as having been handed over to a bunch of goofy kiddies. Now, helicopter parents have destroyed the holiday for their kids, and it's mostly old people who go nuts. (Here's a post I wrote last year about how the skag stole Halloween.)
If Halloween isn't supposed to be about little children asking for candy, what should it be about, according to the op-ed writer? Why, staying at home and reflecting on the past and your forebears. Sounds fun -- I'm sure that would have been an easy transition to make. If you want one of those holidays, fine; but pick a day that's free of existing fun holidays, or try to convert an existing serious holiday.
Of course, we have succeeded in taking back Halloween from young people, but at what price? It's not as though adult ownership automatically makes the thing serious, as most adults have little interest in souring the fun associated with things they themselves enjoy. So the result is to maintain the "childishness" of the holiday, but to make adults rather than children look childish. Which sight is more pathetic? -- kids dressed up or adults dressed up? (There's no debate that we'd like to see adolescents dressed up.)
We see this trend in all sorts of other things that used to belong to children and some teenagers. Video games are an obvious example. The typical video game player (I will never use the lame term "gamer") used to be a male under the age of majority. Now the average age is early-mid 30s. And video games are no less childish now than during the Nintendo era -- they may have more "mature content," like getting into the persona of a mass murderer, but that hardly makes the player more grown-up. As with Halloween costumes, we've only succeeded in making adults rather than children look childish.
And the same goes for comedy movies, a trend that Steve Sailer recently commented on here. There may be a Hispanic demographic angle to this shift, but a larger one is again the ownership of this activity by people in their 20s, 30s, and 40s rather than teenagers or college students. Goofball or gross-out comedies marketed toward young people are pretty funny, and that's how they used to be in the 1980s and, though more weakly, in the 1990s. (The 15 - 24 age group, as a fraction of the population, peaked in the early 1980s.) Now they're geared more toward the same group that plays video games and spends lots of time putting together their Halloween costume -- at least in their late 20s, and mostly in their 30s or even 40s.
As with video games and costumes, these types of comedies need to be watered down because the target audience isn't that juvenile. Usually this is done by heaping humorless "irony" onto the product to make it look like it's made for adults, in the same way that making a candy bar with almonds and goji berries assuages the 30-something's sense of guilt for pigging out on sugar. Predictably, these grown-up junk bars -- laughably marketed as health foods -- are not nearly as pleasing to the tongue as a simple Caramello or Reese's Peanut Butter Cup.
Why has this transfer of ownership from children to adults completely failed to lower the level of childishness? (Of course, that's not such a big problem if it's only kids who are engaging in the activity -- there needs to be some level of childishness in the world, but just confined to what kids do.) The answer is that if adults are merely trying to steal something fun from children, they have no incentive to transform the activity into something that looks grown-up, responsible, and so on. If adults' motivation were to show children how to engage in the activity responsibly -- say, having a single glass of wine at dinner -- then we would see them behaving like adults. But when the big person is merely stealing the little person's toy -- after all, "that stuff isn't for kids" -- then we'll see adults indulging more in childish activities.
For all the benefits associated with longer lifespans in modern countries, there is this downside -- that as adults live longer and healthier lives, they'll want to keep those lives as fun-filled as possible. Why bow out gracefully at age 35 or 40 if you can still live it up childishly well into your 60s like the Baby Boomers are? This naturally makes old people more bitterly envious of young people than before -- "why should they have all the fun?" -- whereas before the two age groups would not have been seeking the same goals in the first place. When desires are different, envy is impossible.
Now, for example, the fact that a 20 year-old girl can look beautiful so effortlessly only serves to anger the 30-something who sees herself as still in the game, whose counterpart many generations ago would have already settled into family life.
Yet just because older people are healthier than before doesn't mean that their state relative to young people has changed -- 20 year-olds will always look better than 30 year-olds, and video games marketed to little boys and teenagers will always be more fun than those marketed to the middle-aged. Older people can look better today than the old people of centuries before, but they have to work harder at it. In the same way, they can have more fun on Halloween than the old people of yesteryear, but they still have to work pretty hard at it. In expending this effort to stay young, they immediately notice that there's a huge group of people who don't have to toil at all to enjoy youthful fun -- the young.
And when the powerful become envious of the powerless, we know what's going to happen to the source of fun among the lower-status group: it's going to get stolen. The corrosive envy that the wicked stepmother had of Snow White has been a constant throughout human existence, but it is much more intense now that longer lifespans encourage old people to still compete on the same terms as young people, without realizing how pathetic they look in general.
Sure, there is the one-in-a-million specimen who can still look great into their 30s, or who can have as much carefree fun as kids do on Halloween, but the remaining 999,999 out of the million become the ugly old guy in the club, the video game addict whose hobby is more about collecting and going through the motions of completing a game, or the movie "buff" who passes over the DVD of Fast Times at Ridgemont High in favor of Superbad.