August 9, 2010

When the "paradox of choice" is true

A popular book called The Paradox of Choice argues that the more choices we're confronted with, past a certain point, the more miserable we become because we spend too much time trying to analyze each possible outcome. That gets to be burdensome beyond a small number of options.

For example, if the only choices for coffee are black, with cream, with sugar, and with cream and sugar both, it doesn't stress us out to analyze each and feel content that we've made a good choice. Once there are 50,000 different coffee-based drinks on offer, though, we waste time trying to gather information about each one, or at least imagining to ourselves what each might be like. Even after we've made our choice, we're stung with regret about not having tried one of the other choices that was up there in the ranking -- maybe you should have had the mocha java chip frappuccino with a shot of espresso and the hazelnut syrup rather than with the raspberry syrup.

Wasting more time and still winding up unable to enjoy our choice as we wonder whether one of the others was really better -- sounds like a bad deal! Lots of criticism has followed the book, like how most people don't really sit in front of the Tazo tea section of the supermarket and agonize for hours over which one to get -- "Hey, this looks neat, why don't we go with this one?" -- and that, Woody Allen types aside, they generally aren't so neurotic that they poison their happiness with doubts about all those other unrealized choices.

Still, in defense of the basic idea, there is one class of examples where having more to choose from really does ruin the overall experience. I'll use the example of listening to music on an iPod, although there are similar cases too.

Instead of the sleight-of-hand way of doing science, where we make a prediction and see if it's true -- and we only do this if we already knew our prediction would come true -- let's start with a simple observation. Compared to people who listened to music on a Walkman or portable CD player, which could have held maybe 20 to 30 songs at most, users of the iPod are much more likely to be futzing around with the damn thing, clearly unhappy with the current song and trying to scroll through their playlists to find something good. (Most current iPods can hold on the order of thousands of songs, though who knows how many the typical user has -- at least on the order of hundreds.) So we notice right away that the user's operation of the iPod is a lot less carefree, and that the quality of the typical song is not very good, compared to portable systems that only hold 20 songs (or maybe 40 if they brought a second tape or CD).

This is actually very easy to model. I'm going to refer to the alternative ways of listening as the limited system (that only holds 20 songs) and the wide-open system (that holds 2000), just to focus on the paradox-of-choice differences. Let's even assume that the two systems are the same in all other respects -- i.e., they both play mp3s, both have the same sound quality, etc.

First, the user is constrained in the amount of time they have to listen to music on-the-go, and this time will be the same for users of either system -- using the wide-open system doesn't somehow give you more hours in the day to listen to music. The average length of a song will be the same for both systems, so the total number of songs you get to listen to during your excursion will be the same. Just to make it concrete, let's say you have 1 hour to listen to music -- while out running errands, walking / jogging around the neighborhood, commuting, whatever -- and the average song is 3 minutes long. That means you'll listen to 20 songs, no matter which system you use.

Second, whether you like a particular song is not set in stone -- it often depends on the mood you're in, what activity you're doing, etc. To simplify, let's say you're just rating a song as either like or don't like. A certain song then has some probability of being a good song ("I like it") when it actually streams into your headphones -- you never know for certain whether you'll dig it or not in that exact moment, in that context, in that mood, etc. When we look at the entire collection of songs you're bringing with you, there's some probability that a randomly chosen one will be good.

Last, as you add more and more songs, at least beyond some point, the quality of those songs will start to fall off. In other words, if you only have room for 20 songs, you're going to be pretty choosy and make sure that they're ones that almost always please you. If you have room for 2000, you'll likely have those in the playlist, but you'll also have quite a lot of filler -- these are the songs we see iPod users skipping through trying to find the great ones. So, the chance that a randomly chosen song on the limited system will be good is greater than the chance that a randomly chosen one from the wide-open system will be. The chance could be pretty close to 100% for the limited system, given how selective you'll be, and it could be as low as 10% for the wide-open system (that's with 200 great songs and 1800 filler songs).

Put all of these pieces together. The number of songs listened to is the same for both systems, yet the probability that any given song is good is much higher on the limited system (because of the diminishing marginal returns that set in when you dump 2000 songs into your iPod). Therefore, the expected number of good songs you'll hear -- the total number heard times the probability that a randomly chosen song is good -- is greater for the limited system. Being more realistic, scrolling and skipping through all those filler songs probably eats up enough time that you only get to listen to 19 or 18 songs instead of 20, so that drags down the expected number of good songs even further for the wide-open system.
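
Here is a minimal sketch of that expected-value arithmetic in Python, using the illustrative numbers from above -- a 1-hour window, 3-minute songs, a hit rate near 90% for the hand-picked 20-song system versus 10% for the 2000-song dump. All of the figures are just the stand-ins from the text, not measurements.

# Toy version of the model: same time budget for both systems, different hit rates.
LISTENING_MINUTES = 60       # time available to listen on the go
SONG_LENGTH_MINUTES = 3      # average song length

def expected_good_songs(p_good, songs_lost_to_skipping=0):
    """Expected number of enjoyable songs heard during one excursion."""
    songs_heard = LISTENING_MINUTES // SONG_LENGTH_MINUTES - songs_lost_to_skipping
    return songs_heard * p_good

print(expected_good_songs(p_good=0.9))                            # limited system: 18.0
print(expected_good_songs(p_good=0.1, songs_lost_to_skipping=2))  # wide-open system: 1.8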

That's just looking at the benefits, but the costs of setting each system up with songs are the same. Grabbing a CD off the shelf, or moving a digital album / 20-song playlist onto your iPod, takes no more effort than moving 2000 songs onto the iPod. You might think it would take more effort for the limited system since those users have to be more choosy, but because they've already had so much experience with their music, they already know which albums or playlists are superior to the others. These are the ones they'd name if you asked which albums they'd take to a deserted island if they could only bring a handful.

Again, I didn't start with a model and see if the real world prediction came true -- we already know how distracted iPod users are by having to schlep through their infinitely long playlists to find one measly good song, and then having to repeat this once the good song is over! The model just makes sense of what we've already observed. To reiterate the three points of the model: 1) we're tightly constrained by time in the number of things we can sample, 2) these things only have a probability of being good, and 3) the probability that a randomly chosen thing will be good declines as we add more things into the selection pool (past a certain point anyway). Thus, the more choices we have, the more miserable we are (again past a certain point).

OK, big deal -- does it apply to anything other than iPods? Well, that is a big deal all by itself given how prevalent they are, and how widespread the delusion is that they offer a superior music listening experience over older ways like a portable CD player or a CD player in your car. Clearly people buy iPods not for a good listening experience but for something else -- to fit into their cultural tribe, to show they're in step with fashion, that they appreciate pretty-looking gizmos, etc. That's the source of the kicks they get from using an iPod.

I think this will apply to digital readers, if they ever catch on. That would be more in the context of reading a book on an airplane trip, where time is tight -- not necessarily using it for your general reading purposes. Imagine you had time to read 5 short stories. If your reader only held 5 short stories, you'd pick some really good ones right before the trip and be happy. If your reader held 5,000 because you'd just dumped a "My short stories" folder onto its memory, you'd spend most of the flight getting a little into each one and then skipping on to another one that you'd hope was better, just like the iPod user.

Same if you are packing your clothes for a 5-day trip. That's only 5 outfits you'll be wearing, so if you just pack 5, they'll be nice and you'll be set. If you bring more luggage to fit 50 outfits that you'll choose from on-the-fly, you'll spend most of each day shuffling from one outfit to another before finding one that you like.

This also shows why the paradox of choice doesn't necessarily paralyze us before the dazzling variety of stuff at the supermarket. It's not a given that a supermarket that only gives you two kinds of butter to choose from rather than 20 has a higher probability of having a good butter on offer. That was true for mp3s because you made that selection yourself, based on having already experienced the songs for a while beforehand. From your point-of-view in the supermarket, all of the butters have some smallish chance of being great, so now in order to maximize the number of butters that you'll really dig, you want them to carry as many as is feasible for them. Say that any type of butter had only a 10% chance of turning you on. Then a supermarket that carries 20 butters is expected to have 2 that you'd love, while the one that carries only 2 butters is expected to have 0.2 that you'd love -- that is, not even one. And you're not so pressed for time either -- sure, you'll die some day, but you'll have enough time to try all 20 types of butter at the more varied supermarket, so you'll find the two or so that you really dig.
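
To make the contrast with the iPod case explicit, here is the same kind of arithmetic in a short sketch: in the supermarket the hit rate stays fixed because you didn't curate the shelf yourself, so carrying more varieties means more expected winners, whereas in a self-curated playlist the hit rate itself falls as the pool grows. The 10% figure is just the stand-in from the text.

# Supermarket case: the hit rate is fixed, so expected winners grow with
# the number of varieties carried -- the opposite of the self-curated
# playlist, where adding items drags the hit rate down.
p_love_butter = 0.10
for n_butters in (2, 20):
    print(n_butters, "butters carried ->", n_butters * p_love_butter, "expected winners")
# 2 butters carried -> 0.2 expected winners (not even one)
# 20 butters carried -> 2.0 expected winners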

August 8, 2010

Millennials less sexual due to shorn body hair?

The sexual counter-revolution of the early 1990s was not a result of the Baby Boomers reaching their 40s, as even hormone-crazed high schoolers steadily dialed down their libidos (check the Youth Risk Behavior Survey). This was just one piece of a very broad decline in wild behavior. Still, I wonder if other recent trends haven't made the boys-and-girls situation even worse for them.

For instance, too many young people today trim, shave, or wax too much of their body hair, both guys and girls. This is especially true for down-there hair, but you also see even young guys with shaved armpits. We have a gut reaction of "that just ain't right," and because these reflexes are the product of natural selection, we should at least take them seriously rather than toss them aside as mere superstition.

How might this lack of body hair further deflate the already flaccid sex culture of the past 20 years, as compared to the 30 years before that?

Imagine you're a boy or girl coming of age and during your first real-life view of a nude body, you don't see any bush. That has to be one of those fixed-action patterns that ethologists talked about -- like when a predator sees an image that resembles his typical prey, he reflexively lunges at it, even if he's raised in captivity and has never seen his prey for real. When you first behold the sight of a naked girl (or boy) with pubic hair, some chemical cascade must go off in your brain, switching you from still-growing-up mode to get-down-to-business mode. Lacking exposure to that powerful visual, you don't get quite so out-of-control.

Aside from real-life girls, what about real-enough girls that most boys will see first, in some pornographic context? Just compare a typical Playboy Playmate from 2008 to only 20 years before in 1988. There's a primal response to the sight of bush that makes the 2008 girl look de-sexualized, like Rapunzel after the witch cut off her flowing braids. That first big reveal is supposed to provoke your teenage curiosity and create a sense of mystery about the other sex -- I wonder what it looks like! If what you see is not at all different from when you played "I'll show you mine if you show me yours" way back in pre-school, you can't help but feel that the other sex isn't quite as fascinating as you'd hoped they would be.

And don't forget the role of scent. Hair holds in whatever is going on down there -- sweat, pheromones, etc., all of which play a role in human signaling. For example, a famous psych experiment showed that other people can tell when a girl is ovulating because they rate the scent of her underwear as more pleasant when it came from that phase than from the non-ovulating phases of her cycle. (This study was done in the '70s -- surprised?) Several recent studies show that we prefer the body scent of people who have different variants of genes than we do at a spot in the genome called the MHC, which is involved in immunity. The more diverse we are at that location, the less likely it is that pathogens will have already seen our combination before, so they won't have an immediate lock on our position to exploit us. Thus, preferring a mate with a different profile from ours helps to make our offspring more robust against infectious disease.

But just how are we supposed to tell that some girl is ovulating, or that someone would make a great match for us at the MHC locus, if we can't get a good feel for their body scent? Lack of hair to trap the scent in must really mess up the lines of communication between the sexes.

Zooming out, the lack of hair and therefore of scent drastically changes the larger environment we live in. It's not just the scent of this partner or that one that throws the "time to grow up" switch on in our brain -- it's also what we perceive about the community we live in. Does it smell like people are very sexually active? Well, maybe it's time to grow up and get to it ourselves. If it doesn't, then maybe it wouldn't hurt to put it off for awhile longer until it becomes more urgent. Via Ray Sawhill's blog, here's a look back on New York in the 1970s. Excerpt:

What do I remember most? The smell; pretzels, piss, sweat and sex. Yes, in 1978 New York actually smelled like sex. At the time I didn’t have a name for the smell, but I recognized it, was excited by it, was intoxicated by it.
...
30 years later New York is changed. The smell of piss is occasional, rather than pervasive. The smell of sex is gone. I don’t find myself in the neighborhoods where the pretzel wagons ply their trade that often. The city is safe.

And it’s a little boring.

Young people growing up today can just smell that their world isn't very wild; they don't need to conduct a random survey of their society. What little activity is going on no one will know about because the overall level of body scent in an area has been dampened by the decline in body hair. It's almost as though people were trying to cover up their tracks or keep others from knowing what they're up to. Where there's no smoke, there can't be fire.

Bottom line: pubic hair is a secondary sex characteristic, i.e. one that develops only during puberty, so messing around with it would be like playing around with other such traits. A girl wouldn't want to lose her hips and look like a little boy again, would she? A guy wouldn't want to lose his jawline and look like a little girl again, would he? I think the same must apply to underarm hair as well, although I realize we're past the cultural point of no return for girls shaving under their arms. Sure, it looks better visually, although the look would not change much if girls merely trimmed it down to an inch or two, plus you wonder what we've done to the richer set of experiences we're supposed to have.

Other body hair is more of a mixed bag since it's not as universal -- while everyone gets hair around their stuff and under their arms, hair elsewhere can vary a lot from one person to another, suggesting that it's not so crucial for survival and reproduction as is universal hair (including head hair). Messing with that doesn't seem like it would matter much, but keeping your pubic and underarm hair as close to natural as possible seems to be the best plan.

August 7, 2010

Remembering history, from pre-agriculture to The Sixties

First, here's an excellent weekly podcast that discusses new books in history. You can search by category, and under "evolution" there's an interview with Greg Cochran on his book, co-authored with Henry Harpending, on how civilization accelerated human evolution, The 10,000 Year Explosion (also in my Amazon box above).

In the interview, the topic comes up of people not understanding history because they project the here and now backward in time and onto other places, such as American farmers who have never confronted a famine that killed 10% of the population. It would be bad enough if this bias blinded us to the decently far past, but it confuses us even about events that happened a generation or two ago, possibly within our own lifetimes.

To illustrate, here is how common the phrase "gay marriage" has been in the NYT during its more than 150 year history:


That's right, it doesn't show up at all until 1989, heralding the beginning of the early-mid-'90s culture war. It gets a surge during election years when it becomes a hot topic, and its decline during the 2008 election year probably reflects the fact that people were suddenly more worried about more fundamental things like their job security than about luxury issues like whether or not gays can marry.

The main point I want to make is that it was considered a non-starter -- or more likely not even considered one way or another at all -- during the previous culture war of the late 1960s and early '70s. We take the matters that have obsessed us during the most recent culture war and its aftermath and sloppily project them back onto the previous culture war -- if we're so focused on gay marriage and multiculturalism, surely those hippies and campus radicals back in the '60s were too, right? I mean, they were on The Good Side, therefore they could not have committed a sin like neglecting these central problems.

In reality, The Sixties had nearly nothing to do with "the politics of identity," which was the unifying theme of the '90s culture war -- i.e., multiculturalism, homophobia, sex/gender relations, the majority culture marginalizing the peripheral cultures, etc. The Movement back then targeted the government, the economic system, and the way they interacted with each other. To sum it up, The Movement was about fighting government oppression at home, smashing capitalism, and destroying the imperialist war machine.

It was only to the extent that "issues of race and gender" impinged on these political and economic matters that they were relevant. Only avant-garde groups within The Movement sought to understand the spheres of culture and gender / family / patriarchy on their own terms, rather than reduce them to political or economic questions, and to fight against oppression within these spheres, rather than wave their hands by saying that once the state and the bosses are fixed, all of that other bad stuff will go away, too. Most involved, though, did not follow this approach. That was left for the panic-preachers of the 1990s, when these questions really were pursued on their own terms and when people believed the solutions could only come through change within the cultural or gender sphere. E.g., by "having a dialogue about white skin privilege" among members of different cultures, or "creating safe spaces for female voices." An activist from The Sixties would have seen this mostly as pointless talk -- we've got to change the polity that oppresses racial minorities, or the economy that oppresses women.

I find it baffling that the popular conception today, especially among young people, is something along the lines of The Movement existing to establish affirmative action, to free women from housewivery, and to let gays and lesbians express their sexuality openly. Not that these things played no part at all, but again they were minor, and we are mostly projecting the '90s onto the '60s when we see them. The civil rights movement was about changing politics -- the state was denying some of its citizens their civil rights, and that had to be changed. Pretty simple -- no multiculturalism, no "conversation about race," etc. True, the basis on which the state was denying civil rights was along racial lines, but this shows what I said before -- race was only an issue to the extent that it had to do with bigger political issues. No one anywhere near the mainstream was talking about invisible, insidious "institutional racism" or trying to get black American speech recognized as a separate language ("Ebonics").

Feminism did not even explode until roughly the early-mid 1970s, mostly as a reaction by women in The Movement to how they were treated by their male comrades. "If this is supposed to be revolutionary, why are the women answering phones and mopping floors?" Don't forget Stokely Carmichael's remark that "the only position for women in SNCC is prone," though he must have meant supine -- i.e., lying on their backs. If feminism had had any importance in The Sixties, such a major figure would never have said that. After some agitation for women to enter the workplace, get equal pay, etc. -- notice, all reducible to political and economic oppression, nothing about "the patriarchy" -- feminism died by the late '70s / early '80s.

Gay rights were totally off the radar, even though the Stonewall Riots took place in 1969. No big economic or political angle? Well, we don't have to pay attention to it then. It wasn't until the '90s obsession with homophobia that the Stonewall Riots became widely recognized.

And does no one remember the Vietnam War and the anti-war movement? Jesus, of all the mass spectacles of The Sixties, wouldn't you think this one would make it into the front of the popular mind? But from our point-of-view, based on the '90s culture war and its wake, the anti-war movement doesn't make any sense, so we've mostly forgotten about it. There's no strong racial or cultural angle -- other than that we were bombing people with somewhat darker skin than ours, but they're also Asian and therefore not quite as deserving of minority sainthood in the popular mind as are blacks. There's no gender-and-sexuality angle at all.

Nope, it was good old fashioned military might wielded by a rich country to show the other poor countries of the world who's boss and what will happen to them if they get out of line. There was also a civil rights angle -- namely the draft. Most Americans at the time were not very sympathetic to The Movement, but if they felt close on any issue, it was not wanting to send their sons off to some jungle hell unless they chose that course on their own. Thus bereft of any "identity politics" value, the war and the anti-war movement have been almost completely forgotten. That's especially true among young people who only lived through the '90s culture war and after, but I find lots of Baby Boomers talking this way too -- and they have personal memories of it all! It would just make them stand out as weirdos and old fogies if they tried to talk about how it really was, so instead they re-cast it in terms of the '90s culture war, making themselves feel like their anti-establishment attitudes are still current.

Another unfortunate consequence of the anti-war movement going down the memory hole is that the popular mind has lost sight of how confrontational and riotous those times were -- firehoses trained on blacks, students storming the 1968 Democratic national convention, Weathermen bombings and Days of Rage, and on and on and on. It did not consist of a bunch of pacified 20-somethings having a conversation -- perhaps a heated one -- in a college classroom, a coffeehouse, or wherever. The Sixties took place during the crime wave that lasted from the late '50s through the very early '90s, so the greater level of violence shouldn't be surprising. The '90s culture war took place as crime rates started plummeting, and once more we've projected this relative lack of confrontation back onto The Sixties and portray it as a largely intellectual ferment that sought solutions through having a conversation. In reality, it was "why talk when we can smash?"

I've strayed from the main topic a lot because my real goal here is to make sure this understanding is written down somewhere. I haven't seen a detailed treatment of the error in thinking that the '90s culture war was just an extension or re-awakening of the '60s culture war. Their central concerns barely overlapped, their solutions therefore were very different, and their tactics couldn't have been more at odds with each other.

August 6, 2010

Did pre-modern people think more imaginatively? Some linguistic evidence

A recent story in Newsweek says that a psychology researcher has found that tests of creative thinking showed rising scores from the late 1950s through about 1990, after which they've been falling. That fits perfectly with my ongoing idea about rising-crime times producing greater and more enduring cultural works than falling-crime times do. When the future looks sketchy, people will live more for the moment and have to think more outside the box in order to survive an objectively more dangerous world.

Many who commented on this article took the instinctive, overly narrow approach of asking what events unique to the 1990s caused this specific decline in creative thinking. The correct way to solve these problems is to first gather as many similar examples as possible and then see what they have in common. It's not as though this is the first time in human history that creativity has declined. Rising and falling violence rates have been present throughout history, and it looks like they correspond to flowering artistic creativity (rising) or to downplaying the arts and focusing on reason, science, and enlightenment (falling).

The high point of violence rates in Western Europe over the past 800 years seems to have been the period from about 1580 to 1630. That's using homicide rates, but there was an unusual amount of political intrigue and bloodshed as well -- not to mention the peak intensity of the Early Modern European trials of witches and werewolves. At least since the post-Roman Dark Ages, if any generation had reason to suspect that the world was going to blow up, it was these guys. And of course they produced some of the greatest works of literature, especially in England.

I wonder how English people circa 1600 would have scored on a psychologist's creativity test. Obviously we cannot go back in time and give them one, but they have left plenty of fossil evidence to look at for clues. As I've delved into the plays and poetry of this period, one thing that's struck me is how the footnotes always mention that in Early Modern English a plural subject would often -- though not necessarily -- be paired with a singular verb, if the subject was to be construed as a whole. This footnote appears as a general warning in every single play I've read so far -- it is that common. Here's an example from Marlowe's Doctor Faustus (1.3.80-81), along with the note in my copy:

O Faustus, leave these frivolous demands
Which strikes* a terror to my fainting soul!

* (it is not unusual to have a plural subject -- especially when it has a collective force -- take a verb ending in -s)

Linguists distinguish between "mass nouns" for wholes vs. "count nouns" for pieces. To tell which one a certain noun is, just ask: if you cut the thing in half, would you still call each piece by the same name as the original? For example, if I have water and split it into two portions, each one is still called water. But if I have an apple and cut it in half, you would look at me weird if I gave you one of the pieces and said, "Here's an apple for you."

But there is no objective distinction between what is a count noun or a mass noun; it depends on how the speaker construes things. You know those pictures that look either like two faces pointed at each other or a chalice, depending on how you twist your mind? That's what we have here. To use a blacker example that's famous among linguists, typically "cat" is a count noun -- if I cut one in half, each of the pieces is not a cat. However, it's fine to say that "When Jayden hurried away in his car, he accidentally ran over Fluffy, and there was cat all over the driveway." In this way of conceiving cats, they're not individual beings but instead a big blobby mass of cat-like stuff, much like water or hay or mud. So sure, if you cut a portion of it in half, you still have cat here and cat there.

I see it as the mark of a more flexible mind's eye that the Early Moderns were happy to sometimes use a singular verb and sometimes a plural verb with a plural subject, depending on whether they construed the subject as a whole or as separate pieces. Here we don't have to guess what they were thinking -- if there was a singular verb, one ending in -s, we know they construed it as a collective.

By the way, this is one of those things that Westerners are supposed to be worse at than East Asians, if we believe the evidence in Nisbett's book The Geography of Thought. Westerners are supposed to be more analytical, slicing things up and looking at the separate pieces, while Easterners are supposed to be more holistic, stepping back and seeing the relationships among the pieces as creating a single network or whole. Maybe this observation is true only for modern Westerners -- English people around 1600 seemed right at home switching from part to whole for a far broader range of things than they do today.

The most recent article I could quickly find on this topic was written in 1934, but that should be good enough. The author's overview is that from the Old English period up through the 17th C., it was totally acceptable to pair a plural noun with a singular verb if the noun was to be construed as a whole. This came under "logical" attack during the 18th C. -- no surprise there, that century was ground zero for Aspie dorks taking things too literally -- and was largely gone by the end of the 19th C. (Examples can still be found; they're just far less frequent than in pre-modern times.)

Are these the linguistic fossils of the change in people's basic traits that Greg Clark documents in A Farewell to Alms? As they moved from an economy based on some mix of farming and herding (or even hunting and gathering) to a more market / capitalist economy, people sure seemed to become a lot more like Ned Flanders in their basic personality. We see this in the context of language with all the grammar Nazis out there today, although again many of those clueless complaints go back to the 19th and 18th C. ancestors of today's language mavens. ("Maven, schmaven!" as Steven Pinker says.) In Shakespeare's day, either there were hardly any grammar Nazis around or their targets had enough sense to tell them to get a life, mellow out, and not over-analyze everything -- that it makes sense if you just step back and look at it the right way.

Worse, people actually get a kick out of lecturing us about how, for instance, we can't start a sentence with "hopefully" because it DOES.... NOT.... COMPUTE... People try to show off their lack of cognitive flexibility these days. This flabbiness at a fairly early stage in the culture creation process -- how you look at things -- goes a long way toward explaining why most modern stuff doesn't seem anywhere near as compelling as what came before. Adaptation to market-based economies required minds that were better at specialization than at generalism, and that would guide their bearers to be good cogs in the economic machine. We certainly have this to thank for the dramatic increase in our material standard of living, but we seem to have traded it off against a more creative way of thinking.

August 5, 2010

Jewish foodies but no Jewish chefs

Via A&LD, I came upon this article / review about the apparent declining status of French cooking. The problem, as the various critics seem to agree, is that it has remained too French. Unlike other cuisines, it has not engaged in a dialogue with South or East Asian trends. And it has not embraced the free market as much as the Anglo-Saxon world has. They have failed to keep up the pace of innovation and revolution as true artists must, and as a result, the new "it" places are in Britain, Spain, blue state America, and Japan.

This all shows how lame a lot of the foodie world is. It's more about fashion and experimentation for its own sake ("expand your sense of the culinary possible"). That sure worked out well for modern painting, sculpture, and architecture, so why not work it out all over our cooking, too? All that experimentation over the past tens of thousands of years? -- yeah, they may have stumbled upon a tasty dish or two, but it was just baby stuff and further experimentation will take us so much farther than all of that ever managed to reach. Just like economists, food critics do not believe in diminishing marginal returns in the real world.

The idea about free markets making better food is a real joke. Japan has always seen its government leaning heavily on economic choices, most famously in its industrial policy. France, Italy, and Spain are hotbeds for radical socialist activism, not free-marketeering, and have been so forever. Britain may be hot now, but it could be a fluke, and the Netherlands and neoliberal Scandinavia have never been near the top of the culinary world. I can't see that neoliberalism has the opposite effect and makes food taste worse; it just seems that how free-market-oriented a country is has little or nothing to do with how good the food is.

Before I became more exposed to foodies, I thought they were the gustatory equivalents of bookworms and garden-lovers -- people for whom food was primarily about the beautiful and even the sublime. Nope, instead they're just a bunch of wannabe art critic dorks, aping the highbrow affectations of their painting-focused superiors. Y'know, I find them to be even more annoying because at least the painting and architecture groupies don't pretend to be gurus always trying to impress you by offering their harebrained advice. In this sense they're more like "interior design" or clothing groupies. Here's an excellent example from a comment to a Tyler Cowen post (which itself abounds in masturbatory signaling that glazes everyone's eyes over except for those of fellow doe-eyed foodies):

The meat selection anywhere in Germany puts the United States to shame. The secret is to get your cold cuts sliced thin (very thin!) and eat them with dark brotchen. The flavor of the meat will come through, and without as many calories. Focus on flavor, not quantity.

If I wanted advice on how to focus on flavor, I'd ask for someone who knows that meat tastes more flavorful than bread, and therefore to have a decent serving of meat rather than a big block of bread with a tissue of meat draped on top. How clueless can you get? I like the use of the exclamation point -- really assures you that he knows the score. And any moron knows that you cannot reconcile eating a small amount of calories with eating delicious food. Protein by itself tastes like garbage, and so does fiber. What tastes good are fats and sugars, both high in calories.

How did such an Apollonian group of status-grubbing geeks take over what is supposed to be a thoroughly Dionysian and communal experience?

Which brings me to the title of the post. We saw this same phenomenon in modern painting -- a bunch of well trained artists putting out boring and confusing work that the critics fawned all over. Clearly the critics didn't have a good eye for art. There can be plenty of variation in what's considered good, sure, but these guys were just out to lunch. As Steve Sailer, riffing off of Tom Wolfe, noted about the art world of the '40s through the '70s:

Not surprisingly, the famous painters tended to be gentile, while the famous critics tended to be Jewish, as the different distributions of visual and verbal intelligence would predict.

That's because Ashkenazi Jews tend to do incredibly well at verbal intelligence, but do slightly worse than other Europeans on visual-spatial intelligence.

I haven't studied the foodie world so much, but the three critics in the article on where French cooking is headed -- or where it is staying put -- all appear to be Jewish: Adam Gopnik, Michael Steinberger, and Steven Shapin. I'm sure there are lots of gentile foodie critics, too, but Jews seem over-represented here as in many other fields (such as math, physics, philosophy, and economics).

And yet where are the top-ranking chefs who are Jewish? I googled around with variations on "michelin jew(ish) chef(s)," but could hardly find any. Gary Danko has an Ashkenazi grandmother and cooks at a restaurant with one Michelin star, but other than that nothing turned up. There could be a couple of others, but clearly this is a different world from math olympiad winners, elite computer programmers, Hollywood executives, and so on. In those fields it would take you two seconds of googling to find a long list of Jewish Nobel Prize winners. But among masters of the kitchen? Not really. Judging by popular demand, you rarely see non-Jews in love with Ashkenazi Jewish cooking, whereas plenty of people outside the creator's ethnic group love Italian, French, Thai, Japanese, Lebanese, and Brazilian food. (I would kill to try some good schmaltz, though.)

I don't see a connection between lack of representation among top chefs and a more verbal than visual-spatial cognitive style. Sure the food needs to look pretty if you want fame, but it's mostly about how it tastes. (Isn't it?) The sense of taste is tightly connected to the sense of smell, but not to vision. Perhaps due to their seclusion in white-collar professional jobs for the better part of a millennium, Ashkenazi Jews are more adapted to mercantile life, and this leaves them out of touch with the more earthy sensibilities needed to become a master chef. Greater skill at sniffing and taste-testing was probably worth more in Darwinian terms to peasants and herders than to tax farmers and money-lenders.

Also, the tendency to over-analyze instead of just going with the flow / not forcing it must be an impediment to culinary success, and indeed to artistic achievement in general. Certainly they fall more into the category of thinkers and intellectuals than of artists who create with psychological abandon. That would also account for their relative absence among painters (though the verbal vs. visual-spatial thing could amplify the more basic effect).

This also makes a prediction that Sephardic and Mizrahi Jews would do pretty well as chefs, as they were not confined to white-collar niches. And Mediterranean food in general seems pretty popular, so why wouldn't theirs? I once had a cookbook of Early Modern Spanish Jewish recipes (A Drizzle of Honey), and the stuff in there all sounded pretty tasty in the way you'd expect Mediterranean food to taste before the low-cal, fat-fearing panic swept the globe. Other than that, though, I couldn't say.

August 4, 2010

Breast milk protects baby's gut from infections

Nick Wade in the NYT reports that a decent chunk of the sugars in breast milk are not digestible by the baby, but rather are there to provide a food source for a helpful bacterium and to act as decoys in the presence of harmful bacteria. Apparently no one had appreciated how much of mother's milk the baby could not digest, let alone what adaptive purpose this served.

Here again we see the wisdom of going with what hundreds of thousands of years of natural selection has designed, rather than with what man has. By preferring the short-term benefits of avoiding "all that hassle" associated with breastfeeding, and using infant formula instead -- it's close enough, right? -- the mother deprives the child of the long-term benefits of breast milk.

The harm done to themselves and others by this over-emphasis on immediate personal pleasure is why we need to apply more shame when someone acts like a wimp to avoid the slightest inconvenience. No hostile diatribe or casting out is needed, just frequent reminders from everyone in their social circle to "suck it up," "deal with it," "grow up," and "stop acting like a baby."

Turning off the A/C

I know it must sound nuts to suggest this during such a sultry summer, but I'm becoming more convinced. And not because of green / New Urbanist arguments where we're told that the carbon footprint of A/C is yet another reason to walk or take mass transit.

Human beings did not evolve in a world with A/C, and we're more adapted to the hotter summers than the colder winters that we have here in America. You may think the heat or humidity are bad, but you'll survive, even without technological fixes like fans, light and loose clothing, or sunscreen. Try surviving a winter without a heater of some kind, clothing, bedding, etc. At first glance, A/C looks pretty unnatural, and because natural selection tends to keep us adapted to our environment, we may be messing around with something we shouldn't by living in continual A/C spaces.

How? I suspect we're weakening our body's system for maintaining temperature. We have a certain temperature that our body prefers to be at, although the environment may push it above or below that point. Our system has to adjust it back to where it wants to be. Changes in weather and climate seem to follow a power-law distribution, where most shocks to our temperature are small in intensity -- like a gentle breeze -- and higher-intensity shocks are rarer though still present -- like a scorcher of an afternoon. It's this pattern of shocks that our body was meant to deal with -- lots of minor deviations from our preferred temperature, but also a small number of quite large deviations, maybe only once a month or year.
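
As a rough illustration of that kind of distribution, here is a small simulation sketch; the Pareto shape and the exponent are arbitrary choices for the illustration, not anything fitted to real weather data.

import random

random.seed(1)

# Draw 10,000 shock intensities from a heavy-tailed Pareto distribution.
# The exponent 2.5 is arbitrary -- just a shape that yields mostly small
# shocks (gentle breezes) with a rare handful of big ones (scorchers).
shocks = [random.paretovariate(2.5) for _ in range(10_000)]

small = sum(1 for s in shocks if s < 2)    # minor deviations
large = sum(1 for s in shocks if s > 10)   # extreme deviations
print(small, large)  # roughly 8,000+ small shocks versus a few dozen large ones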

As we're growing up, our system gets training in dealing with those high-intensity shocks, so that we deal with them pretty well by adolescence or adulthood. We must deal well with the smaller shocks even earlier because there are so many more of them, requiring less training time. But by spending the summer, and by now the entire year, in an air-conditioned environment with its constant temperature, we lose the strength that comes from exposure to and eventual mastery over small and large shocks to our temperature. Then during those times when we cannot turn on the A/C -- say, walking around the block -- our poorly trained system can't cope: we get more overheated than we should, more irritable on hot days than we should, and more tired than we should. With plenty of training to deal with shocks, though, we'd do just fine.

A very close analogy can be drawn to using heavy sunscreen from childhood onward. By using sunscreen, you fail to develop a tan -- a tan that would protect you from further sun damage. Your skin now faces no outside shocks; it is protected in a constant environment from the vagaries of how much sunshine there'll be this week and the next and the next. But suppose one day you run out of sunscreen yet must go outside for a while -- now that it's mid-summer and you've built up no defenses through training, your skin is going to get zapped.

Obviously it would be foolish to suggest the exact opposite approach -- maintaining a constant high temperature or baking in the sun all day long if you have white skin. That lack of variation in exposure levels would also be harmful (more so). Again, the shocks should be mostly on the light side, a handful at the next level up, and only some rare ones at the highest level -- but these should be there. Our sweating system and our instinct to stay out in the sun for a while but then seek shade must be better ways of maintaining our body temperature, since they are the results of hundreds of thousands of years of natural selection, not some man-made gizmo like A/C that only became common within the past two generations.

(I realize there are other potential health issues with A/C, like bacteria growing in the system and being spread through the house if the filters aren't changed. But these are fixable problems. I'm talking about problems that are inherent to any attempt to keep our surroundings constantly cool in temperature.)

What else do we lose by not sweating? Although we may not do it as much as other species, humans send chemical signals through our body odor, facilitated by sweating. Just google "sweaty t-shirt" and you'll find links to the cottage industry of research studies that show how we factor in someone's underarm sweat unconsciously when deciding if we'd like them for a mate or not. There was a similar study done in the 1970s showing that females who were ovulating had more pleasant-smelling underwear. (Psychologists, huh?) Who knows how widely we use body scent to communicate things about us to others? By stopping ourselves from sweating, we cut off this line of communication among us.

The final reason to want to sweat may seem superficial but is actually deeper: there's nothing like a nice coat of sweat to make your skin glisten. Not necessarily while you're sweating, mind you (although there too), but after your skin has absorbed the water and various solutes that sweat contains. If you've ever seen or touched your skin -- or someone else's -- after having been through a sweaty ordeal, you know that it looks more glowing and feels tighter and bouncier. It's hard to mimic that with common man-made moisturizers -- it must be something about the precise mix of solutes in sweat that give the skin itself a different look and feel, or perhaps this mix of salts refracts light in a particular way (like grains of sand that make a shoreline sparkle).

Our minds evolved to find attractive, among other things, whatever would honestly signal good health in another human being. Skin glistening from sweat signaled that our system for body temperature maintenance was in good working order -- neither profusely gushing sweat nor holding back altogether and making us fatigued from overheating. It also showed that we were vigorous enough to go around doing things that make you sweat -- being athletic, hunting, dancing, performing manual labor, etc. -- which a sickly person would not have the strength to do.

Since A/C only became widely adopted by the end of the 1980s (it increased after that but at a slower rate), if you watch any movie from the '80s or before, you'll see that people's skin looks a lot more glowing and bouncy than it does in movies from the '90s or 2000s. There are other reasons for that, like the more animal-based diet rich in vitamin A and the relative lack of sugar back then, but another reason was just good ol' sweat. During the past two decades, skin looks drier and more matte.

So what to do? Well, I'm getting by fine without any A/C whatsoever. I have a fan in my room that I keep on low and that I only train on my body every now and then when I feel like I need a cool breeze; otherwise it's pointing somewhere else to circulate the air. Leave windows and doors open, too. Roll the windows down in your car -- that's how it used to be done, plus the A/C cannot replace the feel of the wind whipping through your car. I only turn on the A/C for a minute when I first get in, just to cool off the steering wheel that's been baking in the sun. When you're just in your room or otherwise by yourself, wear less clothing -- nothing breathes better than nothing. I'm typing this in just my underwear, and I don't feel weird or uncomfortable. Hell, it brings me closer to my hunter-gatherer and pastoralist past.

Most importantly, spend more time outside of buildings. Unless you live in a place with unbearable humidity, the only place you can find stifling stuffiness is inside a building. Again, human beings did not evolve in stone or otherwise insulated housing, so we are not well adapted to living in enclosed structures for most of the day. Even with no breeze whatsoever, it almost always feels better in the fresh air than holed up inside, especially when the sun is going down. If you've already had enough sun, you can stay outside as long as you find shade -- not hard to do.

This case illustrates a general point I've been trying to make lately about the dangers of overprotecting ourselves from shocks, inspired by EconTalk podcasts with Arthur De Vany and Nassim Taleb that I linked to elsewhere. By pursuing momentary comfort -- say, by having the A/C on or wearing sunscreen whenever we go out -- we increase our risk of worse outcomes than if we'd trained ourselves to various intensities of shocks. Over the long-term, we find ourselves overheated just after walking out to the mailbox on a warm day, or find a blistering sunburn after a single afternoon of sunshine when we forgot the sunblock.

In earlier times, parents and friends would tell us to "suck it up," or "take it like a man" when we were slightly uncomfortable in the present, knowing that this would pay off over the long term. And they were right. Our more hedonistic practices have sabotaged our well-being and our attractiveness to others.

August 3, 2010

High home costs lead to more young males who've never had a girlfriend

Steve Sailer's political idea of Affordable Family Formation starts from his observation that American women have higher fertility rates where land and housing are cheap. These states tend to vote Republican because their larger family sizes naturally lead them to prefer politicians who put family values front and center. So an electoral strategy for conservatives is to promote whatever conditions make starting a family cheaper, one key part of which is access to housing whose costs won't bleed you dry.

I wonder if that couldn't work through a second path as well: if housing is more affordable for the average guy, he'll have an easier time finding a girlfriend to make his wife, rather than living at home forever like a bum. The "having had a g.f. or not" difference could influence his view of society, its institutions, and what policies he'll support, well before he's even in a position to consider marrying and starting a family.

If he lives in a place where housing costs are exorbitant, he can only afford something pathetic, and he will thus have a hard time getting a girlfriend. In this embittered state, he'll start whining about society's various power structures and in general be more sympathetic to redistributionist policies. Those who wind up at the bottom overwhelmingly favor Democrats over Republicans. If, instead, he lives in a place where decent housing is affordable, he'll be in much better shape for finding a girlfriend, not be bitter, and so be less hostile to politicians who cater to the doing-OK and the well-off.

This other angle on the Affordable Family Formation strategy assumes that higher home prices lead to more males sitting on the sidelines of the mating market. Is there evidence for that? Fortunately GameFAQs just ran a poll on relationship status. The respondents are almost entirely male and tend to be youngish (most in their 20s). Obviously they are not a representative sample because they came to the poll through a website for video game fans, but the question is whether or not the sample is biased for this particular question. For other polls, I don't think it was (like the one that asked about eye color). But here the sample probably is biased -- though in a good way. We expect video game addicts to be less popular with girls, so if we find that cheap housing helps out even them, we're onto something. These are the guys we're concerned about -- semi-marginal ones who could go either way: toward getting the girl and thus feeling sanguine about traditional courtship (and perhaps marriage), or toward being unable to find one and growing bitter, turning against family values and the politicians who promote them.

Here is a map of the U.S. showing what percent of respondents have ever had a significant other in their lives, where browner means more guys have had girlfriends and bluer means that fewer have:


The average for the entire U.S. is 60%, although it's as low as 54-55% for California and Vermont and as high as 69-70% for North Dakota and West Virginia. The map is pretty close to a Red State / Blue State map, so I won't bother digging into the voting data to show that states where young guys have actually had a girlfriend tend to have a higher percentage of young males voting Republican. The apparent outlier of Utah could be due to the fact that early-20s Mormon males go on a mission for two years to convert others abroad, where they won't have the time to find girlfriends. (If someone else wants to look the voting data up, I'll provide a link to my dataset and you can run the correlation. Or maybe I will some other time.) I had an easy time finding data on home prices, though, so I did crunch those numbers. The Spearman rank correlation between the percent of young guys who've never had a girlfriend and median home value (for owner-occupied units in 2004) is +0.73, very high for the social sciences:


The housing data are from the recent housing bubble, but most of the increase in never having had a girlfriend occurs in the lower part of the home price spectrum. By the time you look at super-expensive states, diminishing marginal returns have set in: the guys there are basically as likely to never have had a girlfriend as those just below them in home prices. So I don't think the correction of the housing bubble will alter the overall pattern much.
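
For anyone who wants to redo the number-crunching, the correlation itself takes only a few lines; here is a minimal sketch, where the file name and column names are hypothetical stand-ins for however the state-level GameFAQs percentages and the 2004 median home values happen to be stored.

# Minimal sketch of the state-level correlation. "state_data.csv",
# "never_gf_pct", and "median_home_value" are hypothetical names for
# one-row-per-state data assembled from the poll and the housing figures.
import pandas as pd
from scipy.stats import spearmanr

states = pd.read_csv("state_data.csv")
rho, p_value = spearmanr(states["never_gf_pct"], states["median_home_value"])
print(f"Spearman rank correlation: {rho:+.2f} (p = {p_value:.3g})")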

There are two lessons to take away from this picture. On an individual level, if you care about whatever boys you may have, for god's sake raise them in a cheap-land state so that they'll have a much smaller chance of going through their teens and 20s without ever having a girlfriend. Talk about perpetual juvenile status. You want grandchildren sometime before you die, right? And on a societal level, if you want to keep semi-marginal males from turning into bitter losers who vote for family-unfriendly policies, make sure there is plenty of cheap land so that they at least have an honest shot at getting a girlfriend, rather than being shut out of the game completely due to living in a shoebox or with their parents forever.

August 2, 2010

The perfect Sunday afternoon album?

When you're recovering from whatever you were doing over the weekend, you're in the mood for music that is more low-key and reflective as you try to make sense of what all just happened, wistful enough to pay respect to what a great time you had and how much you now miss it, yet upbeat and carefree enough to help you move on and get ready for the week ahead.

On this basis, and choosing only from albums that I actually own and have tried out in this context, I nominate Starfish by The Church, best known for its hits "Under the Milky Way" and "Reptile," although the whole thing flows wonderfully.

In general, I've found that albums with a late '80s "college rock" sound (or similar) are all very good in this way. During that time, it's as though they could sense that the past 20 to 30 years -- their entire lives -- had been a single long weekend, and that they were on the edge of a major social shift from greatness to triviality. They were grateful and happy to at least be living through a twilight period, still safe from the worries and numbing routine of the coming weekday decades.

On that note, here are some runner-up albums that work pretty well but that I find myself skipping through parts of: Full Moon Fever by Tom Petty, She Hangs Brightly by Mazzy Star, The Joshua Tree by U2, Listen Like Thieves by INXS, Our Beloved Revolutionary Sweetheart by Camper Van Beethoven, and All Over the Place by The Bangles. I'm sure there are a bunch more among those I don't have -- something from REM's pre-crybaby period should definitely be in there -- but this is just what comes to mind.