December 31, 2009

The decade in pictures, for real

Readers of the NYT have documented the decade with their own photos, and I was shocked to see how selective everyone's memory is. Only the really destructive stuff made the cut, with 9/11 and the recession sharing top honors, although the tsunami and Hurricane Katrina made a decent showing too. We all forgot about 9/11 by about two years later, and it took even less time for the latter two to fade from our concern because there's simply no point in forever worrying about freak accidents beyond our control.

It's only since the recession hit and put everyone in a funk that they've revised the history of the past decade, as though our minds weren't possessed by that heady euphoria of circa 2003 to 2007. Just because you wake up from a dream doesn't mean that it was really a nightmare. Preserving that thread would actually have made the narrative more coherent -- if people in 3 million years have only those pictures to go by, they'll wonder where the hell the recession came from. There is apparently no back-story; it just struck out of the blue like Katrina. No panoramic shots of McMansions crowding each other out of a cul-de-sac? No hoodrat driving her gold-toothed boyfriend around in an SUV with rims? No teenagers with scene hair making the duckface for their digital cameras while walking the hallways of their revamped and pimped-out high school buildings? I mean, where were you people?

So, to provide a partial corrective to the Shit Happens school of photojournalism, I've put together some of my pictures that hopefully better preserve the feel of the middle five years of the decade. The readers who submitted their work to the NYT obviously take better pictures than I do, not least for not employing lame Photoshop tricks -- but I like to think those tricks capture the palpable excitement that everyone, even the crudest dilettantes, suddenly had for design. There are no Big Events because we get over them so quickly, unless one is prolonged like the recession has been. While there is no overarching narrative that I'm shoe-horning everything into, I did try to fit them into patterns. Click images to enlarge.

[Genoa salami with goat cheese and red pear, prosciutto with Brie and apricot preserves, tuna on a baguette rubbed with tomato and olive oil]

Long-term changes in the food industry toward competition based more on variety and quality than on price led to ordinary people obsessing over food. (I missed out on the low-carb craze when it first hit, but I'm catching up now.) Although reality TV as a whole was forgettable, the food and cooking shows got us permanently hooked on variety and quality. Unlike the housing market, the market for good food has not collapsed.

[A large cat shading himself in summer, then stealing someone's bed, two views of the Mall in DC during the Cherry Blossom Festival]

No deep story here. Just a reminder that, despite the gloomy reporting, every spring and summer we escaped from our houses to have fun in the sunny weather.

[Snowy views of Montgomery County, Maryland]

And even in less favorable weather, we still avoided cabin fever by traipsing around in the snow to enjoy the look of winter.

[Male fashion during 2004 - 2007]

After the unrelenting parade of blackness during the 1990s, we finally got to add some color -- like a little white and red! This was part of the whole new wave revival in pop music, although I'm pretty sure no one dressed this way back in 1977 or 1983. But then "Stacy's Mom" and "So Damn Hot" were also too stylistically self-conscious to really sound like The Cars. C'mon though, at least it brought style back into rock music after the thrift store affectations of alternative rock and the exercise uniforms of the boy bands.

[House in winter, view of Montjuic from my apartment's terrace, Antonio Miro store]

I don't have any pictures of my home in Maryland at night. The first is from where I am now, and the other two are from when I lived and worked in Barcelona while Spain was enjoying a housing bubble and design craze of its own.

[Avinguda Diagonal where the upscale mall L'Illa is]

More views of Catalan one-upmanship in the contest to be more culturally dynamic than other regions of Spain.

[Friends of mine in their early 20s from Girona, Spain and Tokyo, Japan]

Spain increasingly joined the rest of Western Europe and its offshoots, so that seeing young girls with a nose ring or guys sporting a "fashion mullet" and capri pants was entirely common. Japan finally emerged from its Lost Decade, allowing the trend-thirsty young people to pour into the shopping districts and get their fix again. I suspect most of the NYT pictures reflect a gloomy view because hardly any of them were teenagers or in their 20s at the time. Unlike the boring '90s, the 2000s were a youth decade -- maybe not as much as the '80s were, but still.

[Children feeding pigeons in Plaça Catalunya, in their costumes outside for Carnival, misbehaving right in front of adults, and smiling for no reason at all]

I'm surprised the NYT didn't include anything on helicopter parents, but then their readership is more childless than the average internet user. The top two are random pictures that I snapped. The girl in the lower right is a Catalan friend's little sister, and the boy in the lower left is her cousin (I have no idea who the daring 4-year-old is).

In The Theory of Moral Sentiments, Adam Smith advises us to keep company with those who are more likely to be indifferent spectators of our behavior -- a close friend might indulge our continual brooding when they should really help us snap out of it, or encourage our vanity in good times when they should really help to keep us grounded. In today's world where it's trivial to sort into groups of very similar people, we tend not to run into these indifferent spectators often or at all. But children, especially if they are not your own, couldn't care any less about the feelings of grown-ups. When you're in danger of letting a promotion or the way a new suit looks go to your head, they'll give you the "I don't get it" look and take you down a peg. And when you're brooding at unnecessary length, they may not tell you outright to get a grip and move on, but their relative lack of sympathy and indulgence are a signal to do so all the same.

December 30, 2009

The late Medieval worship of teenage beauty

While browsing through social history books, one title jumped out at me: Young Medieval Women. One chapter argues that the makers of late Medieval culture considered a female's perfect age to be "maidenhood":

For my purposes it is most significant that youthfulness was a key element of this ubiquitous ideal of feminine beauty. The ages of some Middle English literary heroines who are described as beautiful are stated, and they are always in their teens. Freine is twelve when men begin to notice her beauty, and Goldboru, 'the fairest wuman on live' is an heiress waiting to come of age. Chaucer's Virginia -- a 'mayde of excellent beauty' who is described in conventional terms -- is fourteen. Where the age of the beautiful woman is not stated, her youth is implicit in her figure, with its slender limbs, small high breasts, narrow waist and smooth white skin. An ugly woman, in contrast, is often old, with loose skin, a forest of wrinkles and breasts like deflated bladders.

The author is a post-teenage female, yet in her chapter there is none of the bitterness or scorn that we'd expect from Sailer's rule of female journalism. Aside from the age of girls in literature, she discusses the era's obsession with the Coronation of the Virgin. When Mary is crowned in Heaven, she argues, she should be shown in the idealized female form, just as Jesus is shown in the idealized male form (he usually looks like he's in his late 20s to mid 30s). The key detail is that usually Mary does not wear a wimple, which only married women wore (the average age at marriage for females was early to mid 20s). Rather, she's shown with long, luxurious flowing locks, as well as immaculate and fair skin and youthful facial geometry. She never looks 12, but mid-to-late teens, sure. (When Goldboru is described as "an heiress waiting to come of age," remember that girls went through puberty much later before the 20th C, probably around 17 instead of 13.)

Looking through the various paintings at the Wikipedia page and through Google Images, it's clear that this pattern isn't always there. Most of the Italian and Spanish paintings show her with a wimple and sometimes a matronly face, while the Northern European ones usually show her as youthful and with uncovered hair. At least in Chaucer's neck of the woods, the overall story seems to hold up. The only half-baked idea I have for the North-South difference is that in Northern Europe the family structure was more nuclear, while the Southern European structure was more extended. When your view of marriage is finding the right one and running off to start your own household, you'll idealize your high school sweetheart more than if your view of marriage includes a matriarch ruling over a mini-empire of a household.

While the ideal of beauty is not the same at every time and in every place, it's clearly bounded. In no society is the old maid chased after as she is among some of our primate relatives. The contemporary American ideal must be pushing the upper limit, what with post-menopausal women appearing on the cover of Playboy and the greater focus on housewives than sorority girls. These are the tails of the idealized age distribution, but still -- you wouldn't have seen that in most other places and times. Northwestern Europe during the Late Middle Ages is a case from the lower bound. And we're not talking about a greasy heavy metal band singing about jailbait -- the examples come from high literature and religious paintings.

No social disapprobation for finding teenagers attractive, plus a widespread low-carb diet? I was born in the wrong century. Yeah, there were all those disasters that upended the 14th C, but I would've made it through. And with a lot of the land cleared out, the survivors must have had a relatively comfortable standard of living, enabling them to think about higher concerns than "where's the food at?"

December 29, 2009

Decline of Christmas: Christmas cards

Poking around the Postal Service's Household Diary Study, I found some data on how many "holiday greeting cards" the average American household receives each year. Overwhelmingly these are Christmas cards, with Valentine's a distant second. Not all years are up on the web, but here's what is available:

Using Lexis-Nexis, I found an estimate of 26 Christmas cards for 1990, so that the number of all holiday greeting cards would have been a bit above that -- probably around the 1987 level of 29 cards across all holidays. The first big drop is visible by 1994, when the number of cards received per household was about 25% lower than in 1987. There was another drop-off starting in 2003, and during the most recent years of 2007 and 2008, the number is down about 40% from the late '80s / early '90s. This does not merely reflect the fact that there are more households now than then, which would tend to lower the per-household number even if the total number of cards stayed the same. In 1987, 2.856 billion holiday greeting cards were received vs. 2.117 billion in 2007 -- a decrease in sheer volume of 739 million cards.
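As a quick sanity check, here's a minimal sketch recomputing the raw decline from the two nationwide totals quoted above. Only those two figures come from the Household Diary Study; everything else is plain arithmetic.

```python
# Holiday greeting card totals from the Household Diary Study, as quoted above.
cards_1987 = 2.856e9  # cards received nationwide in 1987
cards_2007 = 2.117e9  # cards received nationwide in 2007

decline = cards_1987 - cards_2007
pct_decline = decline / cards_1987 * 100
print(f"Decline: {decline / 1e6:.0f} million cards "
      f"({pct_decline:.0f}% of the 1987 total)")
```

Note that the per-household figures in the text fall faster (about 40%) than this raw total (about 26%), precisely because the number of households grew over the same period.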

There were several feature journalism articles this season about how few Christmas cards people have gotten this year, but most of them traced the decline to the recession, gift cards replacing more heartfelt tokens, and of course the internet and e-cards. However, we now see that those are all wrong, given how far back the decline goes. Remember, the web was just becoming common in 1994; sending e-cards -- which still are incredibly unpopular -- had to wait for it to reach greater maturity. (The earliest use of "e-card" in the NYT is from 1999.) Gift cards, too, only became widespread in the late '90s and 2000s.

This example reinforces two general points I've been making in these Decline of Christmas entries:

1) The various signals of Christmas have been steadily fading since roughly the mid-1990s, at least a decade before the "War on Christmas" debate erupted. Rather than special interests knocking off Christmas-lovers, the general public voluntarily dropped out. None of the changes in the signals is clearly related to the internet, macroeconomic indicators, etc. The only strong association I see is the larger cultural shift away from sincerity and sentimentality toward affectation and irony.

2) They have not been replaced with new signals. Rather, we're pulling out investment from the holiday altogether and shifting it into other holidays like New Year's Eve and Halloween (which Lexis-Nexis suggests was taken over by adults also in the mid-1990s). Anything that will afford us greater opportunities to make the duckface for the cameras.

This will probably be it for the series, although I might put up something brief on Christmas trees if there's easy data. Aside from that, leave suggestions in the comments and I'll follow them up if it won't take too long.

December 28, 2009

Decline of Christmas: caroling and new Christmas songs

Since our social capital is in decline, and since we hate going outdoors, it shouldn't surprise us to learn that hardly anyone goes Christmas caroling anymore:

In 2005, a National Christmas Tree Association survey found that only 6 percent of respondents planned to go caroling--down from 22 percent in 1996.

I searched Lexis-Nexis for more dates but couldn't find any. If this is like everything else, it's probably a bit lower than 6% this year, and would have been even higher than 22% in the '70s and '80s. The last time I remember carolers knocking on my family's door was sometime in the late '80s or early '90s in Upper Arlington, Ohio, a white upper-middle class suburb of Columbus. Whenever the change happened, it sure seems to have started before what the "War on Christmas" people talk about, namely more recent battles over whether Wal-Mart or whoever uses "Christmas" or "holiday" in its ads and signs. Here is yet another piece of evidence that the general public was already growing apathetic toward Christmas during the 1990s.

OK, maybe we don't drop in on our neighbors anymore, but how is Christmas music doing in general? Wikipedia has a list of nearly 400 Christmas songs, and here they are by release date:

But this doesn't account for the size of the recorded music industry, so here is that same number of Christmas songs but standardized using the number of RIAA gold and platinum singles (standard or digital). Unfortunately, the RIAA certification doesn't go back through the 1940s or before. The x-axis labels are the first year in a 5-year stretch:

It's pretty clear that the 1955 - 1964 period was a lot more productive. Still, the RIAA certification had only just begun, and the mass market music industry wasn't so mature either. So trying to give the benefit of the doubt to more recent times, here is the same graph as above, but only from 1965 and later:

No clear trend overall, although the jump in the first half of the 2000s looks suspicious given every other bit of data we have on the popularity of Christmas.
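The standardization used in these graphs is just a per-period ratio. Here's a minimal sketch of that step -- the numbers below are made-up placeholders, not the actual Wikipedia or RIAA counts.

```python
# Standardize Christmas song output by the size of the singles market:
# songs released in a 5-year period, divided by RIAA-certified gold/platinum
# singles in that same period. Counts here are illustrative placeholders.
songs_per_period = {1965: 30, 1970: 25}           # Christmas songs released
riaa_singles_per_period = {1965: 300, 1970: 500}  # certified singles

standardized = {yr: songs_per_period[yr] / riaa_singles_per_period[yr]
                for yr in songs_per_period}
print(standardized)
```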

Instead of looking at a random list of 400 songs, let's look at those Christmas songs that are rated the highest. ASCAP published a list of the top 25 most played Christmas songs from 2001 - 2005, and here they are by release date:

Now that looks more plausible. Clearly most of the recent Christmas songs -- and by that I mean within the past 45 years -- have been published but not heard and retained. There's one song from 1984 ("Do They Know It's Christmas?") and another from 1970 ("Feliz Navidad"), but that's the extent of worthwhile innovation for nearly a half-century. I don't buy the argument that it takes time for new hits to catch on -- not 50 years. There are plenty of songs from the '80s through the present that remain popular -- but they are not Christmas songs.

Rather, I think it says more about the flagging interest in Christmas. We certainly can introduce new traditions -- after all, before 1950 many of our traditional songs didn't exist -- so why haven't we? On the demand side, consumers don't hold Christmas songs to high standards since they don't care very much about that holiday anymore. So on the supply side, artists will not compete based on quality of songs, but on quantity and price -- just churn out a bunch of shit on a Christmas theme and price it really low in order to move it.

These Christmas song data show that we're not merely shifting from one Christmas tradition to another -- say, if blue and orange became the recognized Christmas colors, replacing red and green, but where the presence of these new Christmas colors was still ubiquitous. Rather, we're junking those traditions and failing to replace them altogether. In the post below, we saw previous survey data showing that a smaller portion of American families are even exchanging gifts -- it's not that they exchange gifts as often as before while merely substituting gift cards for real presents. Gift-giving on the whole is down.

I don't mind the secularization and commercialization of Christmas -- fighting either of those trends is a losing cause, and that probably doesn't matter anyway. Christmas did just fine from the '50s through the '80s, even though most new Christmas songs weren't religious and shopping for material gifts became more common. It's the apathetic, ironic mood that the culture sank into starting in the mid-1990s that is the real danger to Christmas. The secular and commercial Christmas was still sentimental and worked people up into the holiday spirit, offering many suitable replacements for older traditions. The apathetic-ironic attitude toward Christmas doesn't only rail against commercialism.* It urges us to not give a shit no matter what the source of our Christmas cheer might be.

* A hollow charge, given how many formerly sacred domains those people have allowed to become commercialized, such as getting food and shelter.

December 24, 2009

Decline of Christmas: giving gifts

The NYT has an article on the same cultural shift I've been talking about: "Saying No, No No Instead of Ho-Ho-Ho". Most of it is anecdotal, but there's this bit of survey data:

This year’s survey, conducted by the polling firm Harris Interactive, found that while 95 percent of households plan to celebrate Christmas (about the same as every year), the percentage of families who plan to exchange gifts is dropping: 77 percent this year, down from 85 percent in 2005. Slightly fewer people said they were going to attend parties or listen to Christmas music, too.

Given that all the other decline-of-Christmas shifts go back at least to the mid-1990s, I'd be surprised if this one didn't as well.

December 22, 2009

Decline of Christmas: usage of "Christmas" vs. "holiday"

[Part one and part two]

Next up, let's see when the shift away from "Christmas" and toward "holiday(s)" began. I searched the NYT from November of a given year through January of the following year, just to make sure the context is Christmas rather than Easter or the Fourth of July. Here's the portion of all articles that use either word:

From at least 1950 to 1970, there's no change in how common either word is. During the 1970s, the NYT begins to focus a lot more on the Christmas theme, with both words seeing a big jump (although the one for "holiday" is a larger change). That settles down and both levels peter out during the 1980s. It's only around 1990 that "holiday" begins a steady surge in popularity, while "Christmas" doesn't trend so noticeably up or down. By 1999, the two are used equally frequently, with the occasional year where "holiday" actually beats "Christmas."

The two words are complements rather than substitutes -- their usage levels basically move in tandem, rather than a rise in one causing a fall in the other. It's just that over time the gap between them shrinks to the point where both are equally common:

Sometime during the mid-to-late 1970s the gap starts to steadily close. This is unlike the other cases where the shift began in the mid-1990s, and I have no good guess about why this case is so different. Probably we need to look at a case where the two words would be substitutes, forcing the user to pick one or the other. That's typically what the "War on Christmas" people mean when they talk about the phrase "Happy Holidays" replacing, not merely being said alongside of, "Merry Christmas."

Decline of Christmas: special TV episodes

Continuing our look at the public's waning interest in Christmas, let's turn to something more common than complaining about ugly sweaters. Our main form of entertainment for the last 50 or so years has been television (it still is), and Christmas-themed episodes have been shown since the beginning. Still, we'd expect the number of episodes in a given year to reflect how much the public enjoys Christmas -- more when interest is high, fewer otherwise, all other things being equal.

I couldn't believe it when I saw it -- although maybe I shouldn't have been so surprised -- but Wikipedia has a page that lists nearly 1,100 Christmas TV episodes for the U.S. alone, including what year they were broadcast in. Here is a graph showing how many episodes came out in a given year, using a 2-year moving average:

Remember that these raw numbers aren't corrected for anything, so in the early stages when TV sets weren't widespread and there were only a few shows that could put out a Christmas episode anyway, this graph is probably not an apples-to-apples comparison. There's a pretty steady increase starting in the mid-'70s and peaking in 1996. Some of that may partly reflect the growing number of people with TVs and the proliferation of shows being broadcast, but I doubt that's all of it, simply because there's a nearly 10-fold increase in the number of Christmas episodes during that time.
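The smoothing used for the graph is easy to reproduce. Here's a minimal sketch of a trailing 2-year moving average; the yearly counts are illustrative placeholders, not the actual Wikipedia tallies (and the original graph may well have used a centered window instead).

```python
def moving_average(counts, window=2):
    """Trailing moving average: each point averages the current value and
    up to (window - 1) preceding values."""
    return [sum(counts[max(0, i - window + 1): i + 1]) / min(window, i + 1)
            for i in range(len(counts))]

# Illustrative yearly episode counts -- not the real data.
yearly = [10, 14, 12, 20, 18]
print(moving_average(yearly))
```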

However, once the line starts dipping down, that cannot be due to how many people own TVs or how many shows there are -- those are still both growing, not shrinking. There is an upward blip in 2001, perhaps due to the shot-in-the-arm of sincerity that we got after 9/11, but after 1996 the trend has been steadily downward. For those guessing that the internet began siphoning viewers away, lowering the demand for Christmas TV episodes, note that the internet has not been much of a substitute for TV.

To try to correct for the changing world of TV show production and consumption, I looked for data on how many TV episodes were broadcast in each year. Together with the above data, we could answer the question, "What portion of all TV episodes in some year were dedicated to Christmas episodes?" I'm sure those data exist somewhere, but I couldn't find them easily. (Although if a particularly bored reader finds them, I'll use them to make the corresponding graph.)

But I did find data on how many households owned a TV. This isn't ideal because it's not like 5 Christmas specials being split up among 10 thousand or 10 million households makes any difference -- any episode can be shown to an unlimited number of viewers, unlike splitting up 5 plots of land among 10 thousand or 10 million households. Still, I use TV ownership as a proxy for how mature the TV industry is. The source only went back to 1970, and is only unbroken starting in 1985, but the overall picture is still pretty clear:

Again we see the surge starting in the mid-'70s, a peak this time in 1995, and a decline afterward, aside from the blip in 2001. So as with the "dissing on ugly sweaters" phenomenon, it looks like the mid-'90s were the turning point in the public's growing apathy about the Christmas season. And like that example, this shows that the "War on Christmas" is really more about a plummeting of consumer demand rather than a tiny special interest group imposing their plan on an unwilling public. Sadly, the public seems to care less and less about Christmas.

I haven't watched TV in nearly forever, so I have no hunch about this, but it would be worth comparing how sincere vs. ironic or above-it-all the Christmas episodes are from the period when they were increasing vs. decreasing in popularity.

I remember that as late as 1994, My So-Called Life ran a Christmas episode about teenage runaways -- but the mother and daughter mend their relationship. (Turns out it's on Hulu, "So-Called Angels"). In 1991 there was the Saved by the Bell two-parter that focused on a not-crazy-or-addicted homeless father and his daughter -- but they get taken in and given help by Zack's family. And of course the first Simpsons episode to air was the 1989 Christmas special where Homer is down on his luck -- but the family becomes richer because they adopt the dog Santa's Little Helper. If there are any couch potatoes with better knowledge, feel free to fill us in on what the more recent Christmas episodes have been like.

December 21, 2009

Christmas: victim of a war or ironic atrophy of interest?

Most of what the "War on Christmas" people detail is how special interest groups have been chipping away at public signals of Christmastime over the years -- removing Christmas trees from public spaces, bullying people into saying "Happy Holidays" instead of "Merry Christmas," and so on. But none of this could have succeeded if the group under attack really cherished Christmas; they'd tell the scrooges to go mind their own fucking business, and we wouldn't have seen any of these changes. The blandness of the Christmas season these days says more about how little the mainstream cares to stick up for it, and less about the power of whiny special interest groups.

Over the next several days I'll look at data on various aspects of what used to make Christmas fun and how they've changed.

It's always worth trying to see the big picture, and I think this is just part of the decline in sincerity and sentimentality and the rise of ironic screwballery that's infected the culture since the mid-1990s. I mean, we're like too sophisticated to get into the Christmas spirit. The same is true for other holidays that people used to get into, like Halloween. Now the point of Halloween is to show how cleverly meta and ironic or sarcastic you can be with your costume, instead of dressing up as one of a few stock characters in order to not draw so much attention to yourself. It's easier to let go and have fun when you're not critiquing everyone's costume like some book award panel.

In fact, we see the exact same thing for Christmas in the form of the "ugly sweater party," where you spend a month or so finding just the right ugly Christmas sweater, and join a bunch of other ironic morons to show how above Christmas cheer you are. While the ugly sweater party only began in 2006 (as shown by Lexis-Nexis and Google Trends searches for the phrase), the dissing on ugly sweaters actually started earlier.

I searched all US newspapers and wires using Lexis-Nexis, which gives results back through the 1980s. The phrase appears in the late '80s but only referring to what sports coaches wore. The first time it appears in the context of grousing about Christmas is -- wouldn't you know it? -- 1994, and it appears reliably after that. It's not as though there were no ugly Christmas sweaters during the '80s and early '90s, nor that people didn't recognize that they were ugly. They just figured that they should get a life and move on without dwelling on the matter. But once the movement toward sarcasm began to take over the culture, suddenly everyone started complaining about those damned ugly sweaters.

Here are a few examples from the mid-1990s, when the primal scream therapy began.

"Memories of presents bring smiles and a scowl" (Dec 1994)

The most unforgettable gift I ever received would have to be this tacky green and yellow sweater, and I had to wear it, too, because it was a present from family. I hated that ugly sweater, and I hated wearing it even more. It really isn't that unusual, but goodness it was ugly.

"Back to the mall; Returns, bargain-hunters keep store clerks hopping" (Dec 1995)

Not only were people eager to exchange gifts, that ugly sweater, those pants that don't fit, but they were savvy enough to know good bargains would be abundant.

" 'Tis the season for traditions" (Dec 1996)

Shredded wrapping paper, colored lights, the aroma of a turkey baking in the oven. The Charlie Brown Christmas special, leaving cookies and milk for Santa, ugly sweaters from your aunt and mom's Christmas morning cinnamon rolls. Going to grandma's house.

Traditions. They are the things you count on every year about this time.

"Thanks for the Memories; Childhood reminiscences can last a lifetime" (Sep 1997)

What do you remember about the holidays of your childhood? Is it great presents under the tree? The ugly sweaters Aunt Gladys knitted for you every year? The time Uncle Irv got drunk and danced with a lampshade on his head?

To reiterate, the phrase does not appear at all during the 1980s or the early 1990s, then all at once we see predictably regular complaints about ugly sweaters starting in 1994. It's this sort of mindset -- blowing a tiny nuisance all out of proportion -- that marks the turning away from Christmas. There are other signs that I'll look at later, but these are the real sources of the War on Christmas' success.

After all, it's not as though the ACLU hasn't been trying to kill Christmas for much longer. It brought cases against public displays of religious holiday symbols in 1988. Back then, the culture would have told those fags to go do something productive with their time. Here's how the Supreme Court ruled on a similar case in 1985:

In 1985, the High Court deadlocked over a nativity scene case from Scarsdale, N.Y. The Scarsdale government had banned a private group from erecting a creche in a public park. A Federal appeals court struck down the ban as an unconstitutional infringement on "religious speech," and the Supreme Court's inability to decide the case left that decision intact.

Given the constant force that the armies of rabid busybodies are always pressing against the gates of civilization, the only reason that they should succeed is that the guardians and the populace have given up on giving a shit anymore.

December 20, 2009

Non-buzz buzzwords of 2009

In one of Steven Pinker's books, he mentions how little predictive power the language mavens have when it comes to which neologisms will catch on. These verbal dorks are too dazzled by each other's cleverness to see what there is a real demand for among the broad base of language users. In that respect, they're an awful lot like open source geeks who are more concerned with impressing each other than designing something that the average consumer wants.

So it comes as no surprise that the NYT's list of the buzzwords of 2009 is made up almost entirely of losers. I only recognized about 1/3 of them, and I don't imagine any will stick other than "netbook" and "Great Recession."

For a true example of a 2009 buzzword, and one that will probably be around for the next few years, there is "fml" -- short for "fuck my life," used mostly on young people's text messages, facebook status updates, etc., after some kind of bad news:

omg theres gonna be a sick rager tonight but my GAY mother won't let me have the car. fml

For those without college students on their friends list, Google Trends shows its explosion in popularity this year, and unlike most of those invisible buzzwords in the NYT list, it appears in Urban Dictionary, where thousands of people have voted on how appropriate the various definitions are. It's not listed, however, at the website run by the writer of the NYT list.

And while it's retarded to try to weave every cultural thread into the Great Recession narrative, you could easily tell a just-so story about how "fml" signals the especially tough times that young people are going through, with teenage unemployment at an all-time high of over 25%.

It's pretty well known that young people are the primary innovators of language -- not always for the better, of course -- so why do the language mavens focus so much on the dopey slang of inside-the-Beltway Boomers? Most so-called intellectual types are incredibly disdainful of the entrepreneurial spirit -- which is, like, too base for someone with an art history degree from Bowdoin -- so that they remain completely ignorant of the cultural churning all around them. Perhaps these lists of non-buzzwords are a status signal -- look at how out-of-touch and above-it-all I am. I'm not sure when our cultural elite became such flabby, hermetic fags, but you certainly wouldn't have seen that in Pepys' London or even as recently as the New York Beats.

This town needs an enema!

December 16, 2009

Will the internet make us less obsessed with trivia, rather than more?

When information becomes cheaper to search out and gain access to, people will consume more of it. So there's a natural concern that the internet will drive us toward greater obsessiveness -- information is so cheap that we consume so much, long after we've hit diminishing returns. We end up scarfing down trivia, day in and day out, that would've been too expensive to learn about before the internet. For some types of information, this is certainly true, as I argued before about news being the new sugar.

But there are other, perhaps larger, parts of our trivia consumption that we use in contests against other consumers of trivia. For example, you might want to learn a lot of trivia about a football player because one of your hobbies is competing against other football fans to see who knows the most. I think this is behind the proliferation of contemporary music genres: the people who invent and use these obscure labels (folktronica, psychobilly, dark wave, cumbiaton, etc.) are competing against each other to see who is familiar with the most obscure types of music:

Yeah yeah, who hasn't heard of The Ramones and Prince -- but do you know who Shpongle is? No? Oh... I mean, no, nothing wrong with that. I guess I just thought you were knowledgeable enough to have been exposed to psybient music before...

(Actually, maybe that's too optimistic. Most 23-year-old music dorks couldn't recognize Queen or even some of the older but accomplished "indie" groups like The Jesus and Mary Chain. Their contest is not based on who has the most cumulative knowledge, but on who has the most obscure knowledge.)

It's the same with sports fans: they take it for granted that they know which team won last year's World Series, or even every World Series, but do you know which bone the winning pitcher broke during his second season of Little League in 1985, and what swear word his coach infamously wrote on his cast? (I'm making that up, but it sounds real enough.)

How much of this competitive kind of trivia you have to master in order to win -- or at least rank in the top 1%, say -- is clearly related to how many others you're competing against. If the only way you have to signal your trivia mastery is through face-to-face meetings with people from your city or region, you need to have a fair amount under your belt. But once you're competing against people from across the country, or even across the entire world, that won't cut it. Welcome to the big leagues of pointless factoids. Now you've got to invest far more time, energy, and money into your hobby in order to maintain your rank from your local-only days. In contrast to exchanging trivia over a poker game with your buddies from work, think of how much more difficult it is to stand out on an internet bulletin board with sports geeks from all over the globe.

This makes the future look even more depressing than we'd expect from the non-contest consumption of trivia -- now you've got to obsess over it even more because your competitors in the arms race are suddenly much tougher! But that overlooks a key difference between contests and arms races -- you can exit a contest and not suffer. In an arms race, you have no choice but to compete, unless you want to vanish. Why would people start to drop out of these contests? One simple reason is if the costs of participating go up, and this is surely true when you're competing on a larger scale because you're more likely to meet tough opponents. So the internet may actually cause lots of obsessives to drop out.

Even more importantly, they will not just drop out of the particular contest they'd been competing in before -- if they left the indie rock trivia contest for the foreign film trivia contest, the same dynamics will be at work and they'll find that too costly as well. Instead, they'll drop out of trivia contests altogether -- it's just too demanding to be a world-class obsessive, so you might as well find another hobby. It's just like all workers in agriculture having to find new jobs in entirely different industries once the Industrial Revolution arrives. You're not merely shifting from one agricultural sector to another, but abandoning the ship of agriculture entirely.

Other hobbies whose contests don't scale up so easily do not face this prospect. Take amateur cabinet-makers -- they compete against others in their city or perhaps state, not the world. So their hobby is pretty safe. But any hobby that is mostly based on collecting and sending information is doomed. Only those with more-or-less vertical demand curves will be left -- that is, those who are so hardcore that no matter how costly it becomes to compete, they're going to stick it out. This will increase the variance or level of inequality, as most people drop out or never were competing anyway, while a small group travels farther and farther into the extremes of obsessiveness. Again think of the transition from agriculture to industry: before, people weren't so different in their ability to farm, but now most people don't know how to farm at all, while the remaining farmers know even more and have even better machinery than the farmers of yesteryear.
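The scaling argument lends itself to a toy simulation -- everything below is invented for illustration, with an exponential distribution standing in for how much each hobbyist is willing to invest, and the "cost" bar standing in for what it takes to keep rank as the pool grows:

```python
import random

# Toy sketch of the argument above -- invented numbers, not data.
# Each hobbyist has some level of devotion (exponentially distributed
# here), and stays in the contest only if that devotion clears the
# cost of keeping rank, which rises as the pool scales up.
random.seed(0)
devotion = [random.expovariate(1.0) for _ in range(100_000)]

def stay_in(pool, cost):
    """Hobbyists whose devotion clears the cost of competing."""
    return [d for d in pool if d >= cost]

local = stay_in(devotion, cost=1.0)    # city-sized contest: modest bar
online = stay_in(devotion, cost=5.0)   # global contest: brutal bar

share_local = len(local) / len(devotion)
share_online = len(online) / len(devotion)
mean_local = sum(local) / len(local)
mean_online = sum(online) / len(online)
```

Raising the bar from the local to the global level cuts participation drastically while raising the average devotion of whoever remains -- fewer competitors, each more extreme, which is exactly the rise in inequality described above.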

Is there a way to test this? We'd want to see a group of trivia competitors start out not too different but then see their numbers plummet as costs rose, as well as the remaining people becoming even more devoted to their obsession. My impression is that more people considered themselves music or movie or lit buffs before the internet became common -- say, the cashier at a used record store, an indie movie rental store, or a used book store. Within their city or state, they probably ranked pretty high and thought a lot of their trivia mastery. But when they go online and see how mediocre they are compared to the music and movie and lit dorks from across the country or the world, they realize they can't hack it. So they pour less into their trivia hobby and look for one that doesn't pit them against the entire world -- one where they can attain a higher rank and more peer esteem for the same investment of time, effort, and money.

Certainly those who 20 to 25 years ago would've been competing on trivia about college radio bands or Polish film or edgy postmodern fiction are now more likely to take up hands-on, DIY hobbies like fixing bikes, knitting, or growing organic squash. I don't bump into many actual or potential sports fans, but I'd suspect that many who would've been sabermetricians a couple decades ago have in the internet age settled on hobbies that don't involve trivia at all.

December 15, 2009

Is our culture going through a decline in sincerity / rise in self-consciousness?

There was a recent WSJ article -- which I'm not going to bother locating -- on the disappearance of method acting in Hollywood. According to the writer, its greatness peaked in the mid-1970s and petered out until it was more or less gone by the 1990s. I'm not enough of a movie buff to know how true that is, but these days you certainly see a lot more Will Ferrell and Superbad than De Niro and Star Wars.

But that got me thinking about how widespread that trend may be, since I'd noticed a few similar cases before. Below are a few quick examples. Feel free to leave your own in the comments -- as well as counter-examples.

First, I should try to spell out what I mean about sincerity. It's portraying emotion in a genuine way, but particularly in a way that draws the audience in and gets them to sympathize with that emotion. There are two ways to keep people from sympathizing with you -- to put on a poker face, or to emote so over-the-top that it's too much for the audience. After all, the spectator is removed from the experiences of the performer, so the emotion they express had better be toned down a little bit in order for us to meet them halfway. (Think of when someone gets some great news and smiles ear-to-ear vs. jumps around the room screaming with joy. It's harder to sympathize in the latter case.) These ideas come from Adam Smith's Theory of Moral Sentiments.

I think it's also useful to contrast how attention-seeking vs. how retiring performers are. There's a potential confusion between attention-seeking and sincerity, as well as between self-consciousness and retiring. But there are people who are attention-seekers and yet aim to keep us from sympathizing with them, and there are those who are more retiring but still draw us into their emotional state. To help visualize this, here's a simple table that shows attention-seeking increasing from left to right, and sincerity increasing from bottom to top. I don't care if you quibble with my exact choice of pictures to fill each box, as long as you get the idea.

(If I were going to model this cultural change with differential equations, this would be the phase plane. It looks like there is a cycle of some kind that goes clockwise through the four types.)
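For anyone who wants to see what such a model would look like, here is a minimal sketch -- a made-up linear system, not anything fitted to data, whose orbits cycle clockwise through the four boxes of the table:

```python
# A made-up linear system (not fitted to anything) whose orbits cycle
# clockwise in the phase plane: x = attention-seeking, y = sincerity,
# both measured as deviations from the cultural average at (0, 0).

def step(x, y, dt):
    # dx/dt = y, dy/dt = -x: sincerity pulls attention-seeking up,
    # attention-seeking pulls sincerity down -> clockwise rotation
    return x + y * dt, y - x * dt

def trajectory(x0, y0, n, dt):
    path = [(x0, y0)]
    x, y = x0, y0
    for _ in range(n):
        x, y = step(x, y, dt)
        path.append((x, y))
    return path

# start in the upper-left box (retiring but sincere) and integrate
# long enough to swing through all four boxes
path = trajectory(-1.0, 1.0, n=7000, dt=0.001)
```

The rotation direction comes entirely from the signs: high sincerity pulls attention-seeking up, and high attention-seeking pulls sincerity down, which is one crude way to generate the clockwise cycle.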

So you've probably noticed the first example of the rise of self-consciousness -- the duckface. It is an exaggerated mask-like expression to keep people from seeing how you really feel, whether that's high, low, or anything in between. Just be really silly and consciously fake, and people will get the hint to stay away from your emotional state. I tried to find a good comparison from more sincere times and stumbled upon a representative group of pictures from high school seniors back in 1982 -- see here and here. There's the odd silly face, but the likelihood of seeing someone trying to keep you out of their emotional space is a lot lower than today. Look through any young person's Facebook pictures, Flickr account, or whatever. It's so rampant that the anti-duckface website gets about 10,000 hits a day.

Again it's important to distinguish between mask-like faces that are attention-seeking (like the duckface) and those that are retiring (like the stiff upper lip expression). Just because we are overrun with attention whores doesn't mean they want a true connection. The advantage of comparing facial expressions is that there are only so many that we can use, unlike words, which can be coined or disappear. Interestingly, the recent popularity of the duckface is international -- when I was in Barcelona in the middle of the decade, it was called "haciendo morritos" (literally, "making little snouts") -- and like all facial expressions it's built into our genetic hardwiring, as shown by tiny tots making the face.

Obviously there are many other examples of distancing expressions that became widespread recently and were lacking before -- uber-angry poses like kabuki masks, the super-surprised look with eyebrows arching to meet the hairline and mouth agape, and so on. Just do a Google Image search for "yearbook 1987" or whatever and see how relatively lacking these phony faces were back then.

I recently showed that there was a great moderation of the ego from the late 1970s through the early 1990s. In particular, the sarcastic tone (another way to put up a barrier around your emotional space) was dying for roughly 15 years, and there wasn't so much of a focus on "me." That's the upper-right box in the table above.

I think you see the same changes in the emotional appeal of popular music. During the 1940s and the early-mid 1950s, singers were more restrained in expression (as far as singers go) and also not very in-your-face (Frank Sinatra). From the late '50s through the mid-'70s, the trend was toward greater sincerity and more extraversion (The Beatles). The outgoingness continued through the 1980s and early '90s, with sincerity increasing even more (The Cure). From roughly the mid-1990s through today, they've returned to keeping people distant but now while thrusting themselves into the spotlight (John Mayer -- I consider anything that overly sappy to be just as distancing and fake as the stiff upper lip styles).

Let's not forget those Very Special Episodes of otherwise lighthearted TV sit-coms, which had a healthy run from the mid-'70s through the early '90s and have since died off. Given how high the crime rate was back then, they didn't come off as overly exaggerated (except the idea that a rapist would target that screechy-voiced old bag from All in the Family). It was pretty standard scary stuff -- suicide, running away, popping pills, etc. -- rather than the risibly overblown junk that became the mainstay of Law & Order in the mid-'90s, as well as on Law & Order: Special Victims Unit during the 2000s. A circle of wealthy doctors who get it on with their wives only after they've been put into an insulin-induced coma -- and where the wives go along with it? I mean hey, who never heard of someone who had that happen to them?

The two quintessential '90s sit-coms -- The Simpsons and Seinfeld -- never had a real Very Special Episode. All of Seinfeld is clearly in the lower-right box of the figure above -- very in-your-face but keeping you out emotionally. The same goes for Family Guy in the 2000s. The Simpsons had an occasional preachy episode, but it was never serious or genuine -- just Lisa being a bit more annoying than before. Indeed, the recent trend is to parody the Very Special Episode. I doubt that's because the writers have seen the data that crime, suicide, and drug-taking are all down -- they're just as pessimistic as everyone else is on that score. Rather, they're skewering the emotional approach to the topic -- a new group of cool kids has taken over, and they think sarcasm is hipper than all that sentimental bullshit from the '80s.

Horror movies show this too. There hasn't been a high concentration of scary movies that drew you in since about 1991. Wes Craven's New Nightmare and Scream did, but that's about it. After that, they were generally too over-the-top for you to feel as though you were there. Torture porn is the duckface pose of horror movies. And just as with popular music, the trend toward greater sincerity started in the late '50s and 1960s with Psycho. During the '40s and '50s, they were too artificial to really spook the shit out of you. Although if you go even further back, say to M, they drew you in emotionally. Again, these things go in cycles.

Pornographic movies used to have plots, and the shots would draw you in just enough. Granted the plots were dumb, but porn wasn't self-aware and meta like it became in the '90s. (BTW, take a guess when "meta" became a standalone adjective.) It aims to be even more in-your-face than it used to be, although the gonzo or point-of-view style that became dominant is probably a bit too close for many viewers. It's like the emo band that screams and cries so much that you can no longer lose yourself in the music -- that level of intensity is off-putting to any listener who isn't actually cutting himself at the time. Having a camera trained close-up on a girl whose eyes are watering up as she gets her throat fucked creates the same effect. It's too self-conscious and exaggerated to draw you in (again, if you're the average person -- obviously there are niche audiences for anything).

As a final example, the way young people dance shows this pattern too. It's worth noting because, like the duckface, it shows that it's not only performers who show the change but the population as a whole. For one thing, there is no more slow dancing -- that would bring you too close emotionally to the other person. It's even stronger than that, though: teenage girls will come up and surround me, treating my legs like strip poles, rubbing their ass around in my lap, etc. Clearly it makes her feel good, but the second you spin her around to look her in the eyes and hold her hands, she gets momentarily nervous. All of a sudden, things have gotten serious. The ass is far, while the hands are near, as Robin Hanson might say. That's how you can tell that most of these girls dancing wildly aren't really sluts -- once you force them to ditch the mask, they get nervous, whereas the minority of true sluts will ratchet things up.

Guys put up a front too in the dance clubs. Like girls, they wear the duckface, but it's more typically an expression of... well, it looks like an exaggerated or fake version of having been insulted, or mock anger. Their hand and arm movements are so deliberately goofy -- usually with both arms stretched out in front and waving them widely from side to side -- but by making them so overblown they hope to pull off the "too cool to care" vibe. (It rarely succeeds, and they just look like a pack of stooges.) And yes, they even back it up into the girl's lap, some even bending over, just to show how meta-ironic they are. The girl might humor him for a moment and laugh along with the charade, but shockingly this move never inspires her to follow after him.

Aside from whatever dance ability I have -- I've never thought it was anything special, but I can only rely on what the girls say -- I think the main reason that, compared to their male peers, I draw more attention from girls who are more than 10 years younger than I am is that I don't wear a stupid duckface and act like a third-rate class clown. You'd be surprised how far an honest smile or a genuine look of mischief will go, not to mention losing control of your body while dancing. It should be close to one of those "flow" activities where you aren't deliberating over every single movement, and free from self-consciousness you feel more possessed by the vibe. Girls prefer a guy who can get into the groove rather than flail about in an insecure attempt to hide what's going on. It's fine with them if you aren't a great dancer -- but just get into it. (The exception is if you're good at slapstick, which in a way is what breakdancing and popping are.)

Well, like all good theorizers, I haven't devoted any time to counter-examples, but that's only because I couldn't think of many when I tried. What other examples or counter-examples am I missing? And by counter-example, I mean something where there was a steady trend toward greater sincerity in the '90s through today, after a low during the '70s and '80s. Not just a single instance of self-consciousness during the '80s, for example.

Side effect of bad pop music -- you can get more homework done

Since pop music went from fun and infectious to annoying and whiny, have young people been able to get more done? Yep -- high school sophomores did a lot more homework in 2002 than in 1980.

I recall easily distracting myself with music while trying to force myself to study. But now it'd be impossible, except for the tiny minority of young people who have more INXS than Linkin Park on their iPods. Really, what else is going to tempt them away from their textbooks -- all that engrossing reality TV?

December 9, 2009

The great moderation of the ego during the 1980s

Tom Wolfe named the 1970s "the Me decade," and Steve Sailer briefly touches on a feature article which shows that the trend is apparently still with us. For my pay blog, I'm putting together an extensive look at how our level of sarcasm and egocentrism has changed over the past 150 years. Below is a snippet covering only 1950 to the present, and for only two of the many measures I'm looking at. It shows the fraction of all NYT articles that contain the word "so-called" or "me":
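For the curious, the measure itself is trivial to compute once you have the articles as (year, text) pairs -- the sample data below is a made-up stand-in for illustration, not the NYT corpus:

```python
import re
from collections import defaultdict

def yearly_fraction(articles, word):
    """articles: iterable of (year, text) pairs; returns, per year,
    the fraction of articles whose text contains `word`."""
    pattern = re.compile(r"\b%s\b" % re.escape(word), re.IGNORECASE)
    hits, totals = defaultdict(int), defaultdict(int)
    for year, text in articles:
        totals[year] += 1
        if pattern.search(text):
            hits[year] += 1
    return {y: hits[y] / totals[y] for y in totals}

# made-up stand-in articles, not the NYT corpus
sample = [
    (1979, "the so-called experts strike again"),
    (1979, "a quiet year in local politics"),
    (1985, "plain reporting, no snark here"),
]
print(yearly_fraction(sample, "so-called"))  # {1979: 0.5, 1985: 0.0}
```

The word-boundary match matters for a word like "me," which would otherwise get counted inside "method" or "memos."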

Sure enough, there is a surge during the 1970s of snark and self-centered thinking, although they'd already begun rising during the '60s. However, they reach their peak in the late '70s, suddenly plummet through the mid-1980s, and remain at that bottom through the late '80s. This reflects the eclipsing of Baby Boomers as the dominant youth tribe by the disco-punk generation and the earliest Gen X-ers before they actually hatched in 1991. In their view, moving on and getting a life was more fun than whining about, like, whatever.

When Generation X did break out of their shells in the early 1990s and afterward, the level of sarcasm and egocentrism shot up again. Based on "me," we may be nearing a turning point, although "so-called" doesn't show a clear sign of slowing down. We'll have to wait and see (or look at more data than these two words).

So whether or not it was a decade of excess, during the 1980s (and starting just before then) people momentarily decided to stop dripping sarcasm like a bunch of bratty teenagers and to dial down the volume on Me!

Plus the Reagan administration was deregulating many of the silly and ignorant fetters of the earlier commie era, and the airwaves weren't carrying pretentious junk but rather Madonna and Michael Jackson (both of whom were younger than Boomers but older than X-ers). Sure hope we get another one like that sometime soon. Only downside was the high crime rates, but hey, life can't be exciting in one dimension alone; it goes across the board.

Cavities are up -- what should we do?

According to some researcher with grant money, we should investigate how oral bacteria chemically signal to each other, so that we can disrupt that and keep them from aggregating and taking over an area of the mouth.

I have a cheaper and therefore better idea -- don't give your kid fruit juice, soda, vanilla wafers, or other high-carb foods. The dentist-turned-anthropologist Weston A. Price figured this out decades ago, and all of the archaeological evidence since then confirms it: hunter-gatherers don't get cavities, while those who adopted agriculture immediately suffered from rotten teeth. (See some of his pictures here.) The difference is how much digestible carbs you're taking in.

Fruit juice is just a bunch of sugar dissolved in water, along with a few nutrients that you can easily get elsewhere. The most obvious alternatives are real pieces of fruit. If you feed your kid half of a medium apple, that's only 10.5 g of net carbs, with 9.5 g of those being sugars. In contrast, your kid can suck down 1 cup of apple juice no problem. Even the unsweetened kind has 28 g of net carbs, with 24 g being sugars. Not very far from a bag of Skittles.

If you're worried about vitamin C, cutting down on carbs automatically helps, given that glucose competes with vitamin C for the same cellular transporters. If that still doesn't work, cook some liver and mix it in with something they'll find more palatable. Oysters work too. If they don't like the taste, tough shit -- they'll thank you in their 20s and 30s when their skin won't be destroyed by chronic sugar overdosing.

December 8, 2009

The recent wussifying of music in one picture

After rock music died in the early 1990s, what would move in to take its place? Techno did somewhat, although not really on the TV or radio, and even that wasn't until the late '90s or so. The new wave revival of 2003 - 2006 was even farther away, and it didn't last very long anyway. Instead, it was the singer-songwriter genre that immediately and steadily began to spread. Hey, if the well of rock music had dried up, why not just divert a bunch of sewage into the pipes intended for drinking water?

To give just one example of how deprived of testosterone these celebridorks are, what ever happened to rock musicians pairing off with stunning models? David Bowie, Steven Tyler, Michael Hutchence -- they had it figured out. Ric Ocasek first hit it off with his future Czech supermodel wife Paulina Porizkova in 1984, when he was 35 and she was 19. In contrast, crybaby singer John Mayer recently dated the homely Jennifer Aniston when he was 31 and she was 39.

I don't mind some of the older singer-songwriter stuff because they weren't overly emotional and sappy. Like other music of the time, it was just emotional enough to get your attention, but not so dripping with confession that you could no longer identify with it, unless you were a clingy stalker loser yourself. Seriously, no one wants to see your inner child fully exposed -- get a grip and cover that shit up a bit.

December 6, 2009

Images will not eclipse words

It's common to hear that in this digital age pictures and visual design are replacing text and verbal meaning. (Here's something I found on the first page of a Google search.) To anyone who's been paying attention for the past 10 to 20 years, this rings true. The question is whether this will continue indefinitely, perhaps as a form of creative destruction in how sellers communicate to consumers.

On theoretical grounds, I don't think so. The more that sellers compete for consumers based on the pictures and visual design of their products, the easier it will be for a rogue seller to spend only a bit on visuals and throw together some mildly interesting text, a narrative or story about the product. We like stories just as much as we do pictures, so that seller will get our attention -- even more easily given how fatigued we will eventually get with dazzling images.

By spending a lot less on marketing (a bit for visuals and a small amount for the so-so story) compared to their competitors (who must spend tons to get a tiny edge in an advanced visual arms race), they'll make an easy profit. That will draw others into that same strategy and ultimately make it harder for narratives to win people over, shrinking the easy profits to be had that way. Of course, that puts us back to where we were before, and some rogue seller will leave the storytelling arena and be the sole competitor in the visual design arena. It's the business analogue of frequency dependent selection in evolutionary biology -- how profitable one of these two strategies is depends on how frequent it is. The more common the visual strategy is, the harder it is to gain an edge there, and the easier it is to gain an edge through the narrative strategy.
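To make the frequency dependence concrete, here's a toy replicator-style simulation -- the payoffs are my invention, not a claim about real marketing budgets. Each strategy's payoff shrinks as its own arena crowds, so the rarer strategy always earns the easier profit:

```python
# Toy frequency-dependent payoffs -- invented numbers for illustration.
# p = share of sellers on the visual strategy; each strategy's payoff
# shrinks as more sellers crowd into its arena.

def payoffs(p):
    visual = 1.0 - p               # crowded visual arena -> smaller edge
    narrative = 1.0 - (1.0 - p)    # crowded narrative arena likewise
    return visual, narrative

def simulate(p=0.9, steps=40, k=3.0):
    """Replicator-style update: sellers drift toward whichever
    strategy is currently paying better, at adjustment speed k."""
    history = [p]
    for _ in range(steps):
        v, n = payoffs(p)
        p += k * p * (1.0 - p) * (v - n)
        history.append(p)
    return history

hist = simulate()
```

Started at 90% visual, the mix overshoots, swings below the 50/50 split, and corrects back -- that overshoot-and-correct pattern is the oscillation between the two strategies described above.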

The two strategies will oscillate over time but not because of fashion -- that is, changing just to change. They're changing from hard-profit strategies to easy-profit strategies, so it's not fickleness or a desire to keep consumers on a fashion treadmill. Quite the opposite: it's the consumers who can only become so hypnotized by pictures, at which point they will respond much more positively to an interesting story.

To provide some concrete examples which show that this shift back towards stories is already afoot, go to your local Whole Foods and pick up anything at random. Look at its packaging, and I'll bet that there's at least one paragraph of prose, and perhaps even two or three if space permits. Typical paragraphs focus on the company's mission statement, as though you were a prospective owner; an exposition of how each of the ingredients contributes to the product's quality; and the story behind their production process --

Although tea (camellia sinensis) is grown around the world, only by using these special Java tea leaves and our unique microbrewing methods can we create the peerless full-bodied character of Tejava.

That iced tea's website goes into even greater narrative detail, rather than only showing a series of images that convey the company's values. They even have a tab labeled "Tejava Story" to pique your verbal interest. Like I said, you can inspect items in Whole Foods at random and read these kinds of things. *

Up through the 1950s, advertisements were dense with text along just these lines, so after a textually minimalist hiatus, we're back to the good old pre-'60s marketing approach. Clearly some products are better suited for telling stories, such as how food gets made -- something that modern people have no idea about anymore, but which our hunter and farmer brains are naturally curious about. But anything that has a complicated but mildly interesting production process can use this approach. It puts more of a human face on how this thing got made, and it explains why it's as good as they say. And certainly products whose sellers compete mostly on price -- say, gasoline -- won't bother spending on any form of design to draw you in, whether flashy visuals or engrossing narratives.

The other reason that words will never be eclipsed by pictures is that we are primarily a verbal, not a visual, species. As a result, it's a lot easier to pass along a recommendation for some product by word-of-mouth than picture-of-hand. Give consumers an easy-to-remember story about how their Rwandan coffee or Belgian chocolate was made, and they'll have something exciting to tell their social circle. It's easier to show off that way, too, whereas you'd have to be great at drawing or photography in order to relay how visually arresting your new gizmo is if you didn't bring it with you.

So don't worry, word-lovers. The theory and evidence show that pictures will never come close to replacing words. Hopefully the focus on visual effects in movies will give way to a focus on dialog once again, but it may take a while.

* Another example from a different section of the supermarket:

Get Ready! Our one-of-a-kind Pasture Butter redefines quality. Exclusively from pampered cows on summer pastures, when Conjugated Linoleic Acid (CLAs) and Omega 3s are naturally highest in our butter. The velvety texture is sublime and the flavor is the best we have ever tasted. [They're not lying.] Learn about CLA and Omega 3 research at


And that's all on the side of an 8 oz stick of butter! Space may be limited, but they use it all up to tell their story.

December 5, 2009

Will Generation X end helicopter parenting?

In Time's recent cover story about the incipient backlash against helicopter parenting, you learn that most of these trend-stoppers are prototypical Gen X-ers, now in their mid-to-late 30s. Approved parenting styles go through cycles, oscillating between the extremes of "let them be" and "always hover over them." It has little to do with what academics or writers say because there are always plenty of them who argue for just about every spot along the continuum. It depends more on consumer demand -- a "let them be" group of parenting experts can publish all they want, but if parents tend to have a paranoid mindset, the advice will only fall on deaf ears and won't sell many books. Best-selling writers are just giving people what they're asking for.

Maybe the cause is that when the typical Gen X member, born in 1971, reached the normal family-forming age of 25 or so, the world had become a lot less frightening. They surely had memories of the pre-'90s decline of civilization, but they were still pretty young then. You aren't really in a position to freak out about how safe the world is until starting a family becomes a possibility. So they looked around, saw that things were doing OK, rode the wave of euphoria of the recent economic boom that began in the mid-'90s, and figured that there wasn't much of a point in stressing out about their kids being abducted by aliens.

In contrast, Baby Boomers and the disco-punk generation (born between '57 or '58 and '64) came of parenting age when the civilized world was still going to hell, relative to the post-WWII Long Boom. Obviously these are just tendencies because there's still plenty of variation within generations. My mother, born in 1955, gave birth to me in 1980 when the society was still incredibly violent by recent standards, yet she did not flip out if I set off on my bike without a helmet, nor did she beat herself up if I forgot my lunch one day.

We'll have to wait and see if this thing keeps going -- hopefully it does -- but this may be one of very few cases where the '57-'64 cohort deserves more blame for ruining the culture, and Generation X more praise for improving it.

December 3, 2009

How low should low-carb be for best results?

In the 8 or 9 months that I've been eating low-carb, I've experimented a lot to see what works best. One obvious thing I've varied is the amount of carbs I eat, making sure to always keep it in the 40 to 60 g per day range, or lower. I don't think it's even gone far above 40, aside from the handful of weekends that I've allowed myself to indulge.

Recently I tried out a more teetotaling approach, probably taking in around 10 g per day, 20 g at most. I ate a good amount of food, just almost all protein and fat. Your body and brain do require some glucose, but most of this can be made by eating protein, as the liver will convert the amino acids into glucose for you in a process called gluconeogenesis.

Still, I felt less energetic on this stricter regimen, so I thought Thanksgiving weekend would be a good time to change things up. I had a good deal of carbs over the weekend, though nothing crazy -- some not-so-saccharine cherry pie with a little whipped cream, and the odd handful of candied walnuts or honey-sesame covered cashews. As of yesterday I've stopped eating sweets, but I've stepped up the carb count to around 30 - 40 g per day for the first time in a while, and I feel much more invigorated than before, like I did back in March or April when I also ate slightly more carbs than I had been recently. The clearest signal is always libido -- impossible to misread that gauge. I was far from ignoring girls recently, but now I'm back to teenage levels again.

Now, 30 - 40 g is still pretty low and doesn't include candy bars or other junk -- a bit of roasted red pepper, half a fresh pear or apple, some peanut butter, maybe a small corn tortilla too.

Some people who've tried low-carb have told me they felt somewhat fatigued or foggy on it even after the week-long or so adjustment phase. There could be all sorts of reasons for that (too much protein and not enough fat, for example), but one may simply be that they were going too far in the low-carb direction. If you haven't tried it out, or if you have but are questioning how great it is, try dialing it up in little steps until you notice a big change. Although humans in general are designed for low-carb diets, there's still variation among individuals. You may just need a bit more carbs than others. The Eskimos and Masai seem to thrive on incredibly low levels, but you may need something in the 50 - 60 g range. The only way to find out is to dive in and play around until you figure it out.

December 1, 2009

Why are those who benefit the least from beauty products the most likely to use them?

It's something of a puzzle why beautiful women use all sorts of make-up, sink big bucks into their hair style, collect expensive jeans, and so on. If a 9 could leave the house without even showering and turn every man's head, why does she put so much time, money, and effort into prettying herself up? It hardly seems to add any further beauty. Even more puzzling is the fact that women who could enjoy a relatively big boost to their looks from these products have almost no interest in them. If a 4 could jump a full point or two by making the same investment as the 9 in make-up, hair styling, flattering clothing, etc., surely we'd expect the 4 to be bonkers about beauty and the 9 to be securely carefree about her looks.

The short answer is that the 9 has a comparative advantage in looks, while the 4 doesn't.

Let's say there are two dimensions that women would compete against each other on, looks and ambition. Guys prefer both because looks signal her genetic quality and ambition means you won't have to spend so much to support her -- you could live in a nicer house, send your kids to more prestigious schools, etc. Blessed by fortune, Riley scores 9 on looks and 7 on ambition, while the less fortunate Mabel scores 4 on looks and 6 on ambition. Clearly every guy would prefer Riley because she scores higher on both things that they care about. But some of them aren't going to have a snowball's chance in hell with her, and they'll listen to Mabel's sales pitch.

Even though Riley could emphasize either trait and demonstrate her superiority, the contrast between her and Mabel is greatest on the dimension of looks. So if she wants a slam dunk, she'll try to compete with Mabel on looks. Mabel recognizes that she's inferior on both dimensions, but she only looks a tiny bit worse on ambition, rather than catastrophically worse on looks. So if she wants to minimize the guys' perception of her inferiority compared to Riley, she'll try to compete based on ambition. Competing on looks requires spending time, money, and effort on beauty products, while competing on ambition requires little such spending.
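The arithmetic behind this can be sketched in a few lines. This is just an illustration of the scenario above, using the hypothetical scores assigned to Riley and Mabel; the function name and the dictionary layout are my own, not anything from an established model.

```python
# Hypothetical scores from the example above: Riley (9 looks, 7 ambition),
# Mabel (4 looks, 6 ambition). Each woman plays up the dimension where the
# gap between her and her rival is most favorable to her: the largest lead
# if she's ahead, or the smallest deficit if she's behind.
riley = {"looks": 9, "ambition": 7}
mabel = {"looks": 4, "ambition": 6}

def best_dimension(own, rival):
    """Pick the trait that maximizes (own score - rival's score)."""
    return max(own, key=lambda trait: own[trait] - rival[trait])

print(best_dimension(riley, mabel))  # looks: a +5 gap beats a +1 gap
print(best_dimension(mabel, riley))  # ambition: a -1 gap beats a -5 gap
```

The same rule covers both women: maximizing your own relative score means Riley steers the contest toward looks, while Mabel steers it toward ambition, exactly as described above.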

So we have people who appear to not really need beauty products, from a value-added calculation, and yet who devote their discretionary time, money, and effort to them. Good-looking women buy beauty products not because they get a bigger bang for their buck than do ugly women (it would be the other way around), but because good-looking women are competing based on looks and ugly women are competing on some other dimension where mascara and butt-sculpting jeans are irrelevant.

Wait -- what if Riley was a 9 on looks and a 10 on ambition, and Mabel was a 4 on looks and a 2 on ambition? The gap in looks is the same as before, but now the ambition gap is even larger, so now Riley should want to compete on ambition and Mabel on looks.

But let's remember what the reality is: good-looking people obsess more over their looks than ugly people. That must mean that the real world is more like the first scenario than the second. The reason is probably regression to the mean. If Riley is a 9 on looks, she's very high above the average, so if we look at how she scores on some other trait, it may also be above-average -- because of good genes, a favorable environment, good luck, or whatever -- but it likely won't be as high as the first. Similarly, if Mabel is a 4 on looks, she's well below average, so if we check her on some other trait, it may also be below-average (for the same reasons as before), but it probably won't be as far below average.

Every person is a list of points that show where they score on all the many dimensions of attractiveness. They generally aren't radically different; people above-average in one desirable trait tend to be so in others as well. There's a kind of center of gravity that their various scores hover around. When we notice someone who's a 9 on looks, we don't know where their center of gravity is -- it could be that most of their values hover around 7 or 8, and the 9 is their upper outlier; or it could be that most of their values hover around 10, and the 9 is their lower outlier! But because it's more likely that someone's center of gravity is closer to the population average, rather than farther away, the 9 on looks is more likely to score 7 than 10 on something else like ambition.

For the same reason, it could be that a 4 on looks has a center of gravity that's 6 and looks is their weak suit, or it could be that their center of gravity is actually 2 and looks is their strong suit. Again, we're more likely to observe someone with a center of gravity closer to the population average, so the 4 on looks probably does better on average on other traits.

To try stating it more clearly, if we look at all women who score 9 on looks, many more of them will have their trait-wide average below 9 than above 9. And when we look at all women who score 4 on looks, many more of them will have their trait-wide average above 4 than below 4. That puts us in the first world we looked at, although with some exceptions. There are women who score 8 on looks but 10 on ambition, and we expect them to buck the trend as good-looking women who don't care that much about how they look, but who instead slave night and day to get a leg up on their competition in the job market.
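The regression-to-the-mean claim is easy to check with a quick simulation. The particular numbers here are my own assumptions, not anything from the argument above: two traits drawn as correlated normals (mean 5.5, correlation 0.5), clipped to a 1-10 scale.

```python
# Simulate regression to the mean across two correlated traits ("looks" and
# "ambition"). Assumed parameters: both traits normal with mean 5.5 and
# standard deviation 1.5, correlated at r = 0.5, clipped to the 1-10 scale.
import random

random.seed(0)
r = 0.5
women = []
for _ in range(100_000):
    z1 = random.gauss(0, 1)
    # Build a second standard normal correlated with the first at level r.
    z2 = r * z1 + (1 - r**2) ** 0.5 * random.gauss(0, 1)
    looks = min(10, max(1, 5.5 + 1.5 * z1))
    ambition = min(10, max(1, 5.5 + 1.5 * z2))
    women.append((looks, ambition))

def mean_ambition(lo, hi):
    """Average ambition among women whose looks fall in [lo, hi]."""
    vals = [a for l, a in women if lo <= l <= hi]
    return sum(vals) / len(vals)

print(mean_ambition(8.5, 9.5))  # the 9s on looks: ambition pulled below 9
print(mean_ambition(3.5, 4.5))  # the 4s on looks: ambition pulled above 4
```

As long as the two traits are positively but imperfectly correlated, the women around 9 on looks average well under 9 on the other trait, and the women around 4 average well over 4 -- which is the "first world" described above.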

And of course there are the odd women who score 4 on looks but 2 on ambition. These women couldn't care less about their education or career, and even though they have little to show off, they devote most of their time to playing it up -- "I don't care what anybody else says, I know these Juicy track pants make me look good!"

Finally, this applies to men, too: the 9 on looks is more likely than the 4 to use skin moisturizer, wear flattering clothes, and so on, because most good-looking guys score a bit lower on other traits and most ugly guys score a bit higher on other traits. Again, there are exceptional guys who invest more in other pursuits despite being good-looking, as well as the occasional ugly dude who spends all his time showing off his repulsive face and body because that's sadly his strong suit.

Anyway, the take-home message is that people invest in playing up their forte, rather than in the trait that will give them the greatest improvement, to try to shift the competition into an arena where they have a comparative advantage.

There's the further fact that women who enter into the looks-based competition will all be fairly good-looking, so that whatever minuscule improvement they can get from make-up, etc., might be worth the money. But this is just an arms race -- all will buy flattering jeans, wear nice make-up, and so on, so that the ranking among them is largely unaffected by investing in beauty products. If we weren't careful, we might only see this aspect of it and conclude that good-looking women spending so much on make-up was a purely wasteful arms race.

But in reality they're blowing all that time and money in order to shift the competition to one based on looks rather than any of the other dimensions that they could be judged on. That's what gets you access to the top tier of suitors -- you at least get your foot in the door, regardless of how you'll end up relative to the rest of the minority who get their foot in the door.