February 28, 2013

Breast modesty and the breakdown of the bicameral mind

From a brief post in Slate on when bare breasts became taboo in Western culture:

Women are displayed with exposed breasts in Minoan artwork from 1500 B.C. Some historians believe that these ancient women went topless only during religious rituals—bare-breasted, buxom goddesses have been worshiped since the dawn of civilization—but some of the artworks depict everyday activities, suggesting that bare breasts may have been commonplace. Just across the Mediterranean, ancient Egyptian women sported elaborate dresses that could either cover the breasts or leave them exposed, depending on the whim of the designer. Over the next few centuries, however, breasts became strictly private parts. Ancient Athenian women were wearing flowing, multilayered robes that concealed the shape of the bosom by the middle of the first millennium B.C. Spartan attire was more risqué, exposing the female thigh, but breasts were always covered.

Covering the breasts suggests a greater sense of self-awareness, and perhaps self-consciousness. In many primitive cultures, the breasts are seen as just another utilitarian body part: women are not self-conscious about them, and men do not remark on them, even in male-only settings. The only universal taboos about body exposure concern the genitals and the anus, which, unlike the breasts, are involved in excretion and intercourse, two of the most obviously shameful and/or private activities.

The fact that Mediterranean peoples before 1000 B.C. resembled primitive cultures, and that they had radically changed toward greater self-awareness by circa 500 B.C., makes me immediately think of Julian Jaynes' ideas about the bicameral mind. He was trying to account for an apparent lack, or at least a very low degree, of self-awareness in civilized people before around 1000 B.C., and their shift toward introspective abilities and self-awareness by around 500 B.C.

His main sources for documenting this are comparisons of the older Iliad with the younger Odyssey, and of the older books of the Old Testament, like Amos, with the younger ones, like Ecclesiastes. In short, characters in the older literature seem trapped in concrete awareness of the world around them, with little or no abstraction or introspection, whereas their counterparts in the younger literature can introspect and ponder questions about human nature, divine nature, love, hate, and so on.

Older religious traditions did not assume that a group member could look inside himself -- he received a message from the gods about what to do, and he obeyed. Looking inside yourself, however, is the central feature of all the major world religions and philosophies that were born and began spreading independently across the Old World during the Axial Age, such as Second Temple Judaism, Zoroastrianism, Buddhism, Platonism, and Confucianism.

He also discusses how the Greek language of the Iliad had no abstract mentalistic terms for thought, soul, anger, etc. The words that would later carry those abstract meanings originally had very concrete sensory meanings: sight, heavy breathing, excitation of the nerves, life-matter, and so on. Ethnographers of hunter-gatherers also report that the people they study rarely, if ever, hold abstract discussions, even about the everyday world. For example, they don't say "it's a beautiful day," invoking an abstraction like "beauty." They phrase it more concretely: "it's hot out," or "the sun is shining."

Jaynes' approach is more interesting than comparing different cultures today because he's tracking a change over time within a population. That way we can better see which things are associated with which other things -- if they're tightly related, they'll show similar trends over time. If we just compare different cultures today, it's more difficult to disentangle which differences are related to which others. There are "spurious" correlations that don't reflect a true relationship deep down.

For example, the Bushmen hunter-gatherers speak languages with clicks, whereas the languages of modern societies have none -- but that could be unrelated to the other psychological differences between them and us. Now, if we found out that the Minoans spoke a language with clicks, and that the clicks were lost by circa 500 B.C., that would squarely place linguistic clicks within the suite of traits leading from primitive to modern minds.

So, at least to judge from the available record of ancient visual culture, it looks like modesty about the breasts was another piece of the larger shift toward self-awareness that took place during the first millennium B.C. across large swaths of the Mediterranean and Middle Eastern civilizations. Before, only the most concretely shameful body parts were the targets of taboo-related behavior -- covering the genitals and anus.

As the ancient Greeks came to develop greater introspective and abstract thinking, they probably debated with one another in everyday settings whether or not other body parts ought to be covered. The breasts are not involved in shameful activities per se, just nursing infants, and then only occasionally. But they are a physical difference between the sexes, they begin to develop during puberty, and they're an orifice through which liquid leaves the body.

This strained line of reasoning simply doesn't occur to primitive peoples, and even when they're told about the civilized practice of covering the breasts, they reject it as ridiculous and perhaps painful. But to people who are becoming more self-aware and abstract in their thinking, the breasts emerge as the next obvious target for body exposure taboos.

Gays lag, rather than lead, in fashion trends

I'm struck by how many queers I see still sporting some version of the faux hawk hairstyle, carrying messenger bags, and wearing sandals or flip-flops when the weather allows.

The hairstyle peaked in the late 2000s, and almost no straight guy still has his hair like that. Ditto for messenger bags: Google Trends shows a peak in search activity in '07-'08, and you generally see few straight guys with them anymore. Sandals / flip-flops are a harder call, but among straights they don't seem to be as popular as in the 2000s.

It's not as though most gays look dated to 2007, but the fact that a good chunk of them do, while normal guys have mostly given up those fads, suggests a major revision of the received wisdom about gays being trend-setters and straights being sluggish followers.

As another quick reality check, did gays have anything to do with the early '90s revival that has caught on at least somewhat at trendy stores like Urban Outfitters? Nope. In the stores, they play grunge rather than gay-friendly tunes from the same era. I haven't seen many gays taking part in the whole Fair Isle trend either, let alone leading the way several years before it became mainstream. Seems like most people carry their stuff in a laptop case or briefcase-like thing, or a backpack (a hipster version of which has replaced the messenger bag at trendy stores).

In discussing further examples, just remember that we have to restrict them to things that gays and normals could conceivably both adopt, though perhaps at slightly different times. No straight guy was ever going to wear those faggot capri pants, or school-boy shorts, etc. And no queer was going to grow his hair out all shaggy or bushy. But both groups did wear faux hawks, carry messenger bags, and wear sandals. However, gays didn't start any of those trends, and they are still clinging to them after they're out.

Why don't gays have the leading role that we're so often told they do? As with all their other quirks, it traces back to their Peter Pan-ism. You're not so keenly aware of fashion trends in elementary school, let alone want to play a role in pushing for something new before everybody starts copying you. That's more of an adolescent thing, once you get into more intense social striving and competition.

Being stunted in childhood, gays just don't get how to do that. Their quasi-autism keeps them from recognizing when something is surging or plummeting in popularity, so they wait longer to join a trend and hold on for a while after it's done. My guess is they try to figure it out autistically, by reading websites that tell them what's hot and what's not, rather than just picking it up through their social antennae in everyday life.

Earlier I also pointed out how little "fashion sense" gays have, preferring whatever maximizes their Peter Pan look. Hardly fashionable -- there's nothing dorkier and sadder than someone who's older than 12 and still dresses like a small child.

Most people have very little observational experience of gays, so they just uncritically accept all this media bullcrap about superior gay aesthetics. Where are the gay architects and cinematographers then? Gay painters and sculptors (with real talent)? Gay graphic designers and typographers? Even with clothing and interior design, it seems like their knack is more for picking things out for a client than for creating the items themselves.

Unfortunately I don't see this baseless myth dying off anytime soon. Women love it because it gives them something to shame normal men with -- "You have such poor taste compared to my gay bff." And sadly most men accept it so they can rationalize their dull, ugly, joyless lives as anti-gay, hence masculine, rather than emasculating.

February 26, 2013

Spectacle TV has long-term cycles of popularity

One of the stranger developments in recent popular culture is the stratospheric rise of reality TV, game shows, and musical variety shows. What ever happened to sit-coms? Those were fun -- they didn't require a life-long commitment to follow their narrative threads, and they were down-to-earth rather than bombastic. Whenever I watch TV while visiting home, it seems to get worse and worse each year.

While everyone is aware of the absence of these kinds of shows in the good old days, and their dominance today, it may come as a surprise to learn that in yet another domain the fixtures of Millennial-era culture are reviving the mid-century, consciously or not.

A quick graph will make the point. I looked at TV shows by ratings, and kept the top 20 that were original programming -- excluding sports broadcasts and Hollywood movies shown on TV. (There are only a handful at most in any year, but I'm going to stick with original programming.) I then checked their entries at IMDb.com to see what genres each show has been tagged with -- comedy, drama, crime, etc. -- and found how many hit shows in each year belonged to either the reality, game show, or musical genres. Here is the pattern over time, from the 1950-'51 season to the 2011-'12 season:

We see the well-known recent rise, and the near absence of these genres through most of the '70s, '80s, and even into the mid-'90s. But that lack of spectacle TV did not extend indefinitely back into the early days. In fact, it was just as dominant in the '50s as it has been in the 21st century -- nearly half of the top 20 shows -- before gradually declining during the '60s and early '70s. The mid-century craze for game shows and musical variety also showed up in top-rated radio programs.
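The tally behind the graph can be sketched in a few lines. The show titles and genre strings below are hypothetical stand-ins, not the actual IMDb data, and the "spectacle" genre set is my own grouping of the three genres named above:

```python
# Count, per season, how many top-rated shows carry a "spectacle" genre tag
# (reality, game show, or musical). Show list and tags are illustrative only.
from collections import defaultdict

SPECTACLE = {"Reality-TV", "Game-Show", "Music"}

def spectacle_counts(shows):
    """shows: iterable of (season, title, set_of_genres) tuples."""
    counts = defaultdict(int)
    for season, _title, genres in shows:
        if genres & SPECTACLE:   # any overlap counts the show once
            counts[season] += 1
    return dict(counts)

sample = [
    ("1950-51", "Texaco Star Theatre", {"Comedy", "Music"}),
    ("1950-51", "You Bet Your Life", {"Comedy", "Game-Show"}),
    ("1985-86", "The Cosby Show", {"Comedy"}),
    ("2011-12", "American Idol", {"Music", "Reality-TV"}),
]
print(spectacle_counts(sample))  # -> {'1950-51': 2, '2011-12': 1}
```

A season with no spectacle hits simply doesn't appear in the result, which matches the near-empty stretch from the '70s through the mid-'90s.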

There wasn't as much reality TV back in the '50s, although they did have This Is Your Life. I'd attribute that more to technological limitations -- the medium had just begun, and they hadn't figured out how to easily document people's lives as they were happening. There was also a reality show or two during the heyday of radio programs, but I don't immediately recall their names.

While the ratings are the best guide to what really resonated with audiences, you see the same rough mix of genres in the full line-up of shows. Here is an archive of prime-time schedules going back to 1950, and here is an archive of ratings. Lots of game shows and musical variety hours in the '50s, lots of sit-coms in the '80s, and lots of reality / game show / musical shows in the 21st century.

Why are these spectacle types of shows more popular in falling-crime times? It seems like they're substituting virtual excitement for real-life excitement. People don't go out of their houses to do exciting things anymore -- even occasionally, like feeling blown away by a Fourth of July spectacle. Yet they still want something stimulating to take part in, especially as part of a larger audience.

That would also explain why TV is not so bombastic in rising-crime times -- people have enough excitement going on in real life, between playing sports, cruising around in cars, dance fever, live music, and carnivalesque "shopping" trips. (The excitement came from being part of the bustling crowd, as most people didn't actually buy anything during a trip to the mall.)

I also think people are looking for answers more during rising-crime times. When crime seems to only keep going up, what's going wrong in the world, and how can we try to make things better? Even if it's at the basic level of how we treat the other people in our daily lives, not a utopian social engineering project. That's what the sit-coms of the '80s were all about -- not preaching a message, but reminding us of basic truths about how we ought to treat each other if we want to remain socially cohesive in topsy-turvy times.

Reality TV, game shows, and musical variety hours prevent any learning or awareness from taking place in the larger course of entertainment. They're pure diversion. With sit-coms, by contrast, it's not "learning that..." but "learning how..." -- practical rather than nerdy learning. When crime, danger, and security seem to be taking care of themselves without our even thinking about them, why bother seeking answers? Just tune in to the spectacle instead.

Letting children, not just babies, wear diapers

If you have kids of your own, or nieces or nephews, or maybe know someone else's kids fairly well, you may have noticed how long they continue to wear diapers these days -- well beyond age 2 or 3, when they should be toilet-trained.

When I saw my nephew during Christmas vacation, he was 4 years and 9 months, yet he was still wearing a Pull-Up to bed at night. That's a diaper that you pull on like underwear, to not embarrass the kid as much as if it had the baby-like diaper fastenings. But still, nearly 5 years old and wearing a Pull-Up = shame. Those are meant to be worn whenever, but there's another diaper line specifically for nighttime wear, to deal with bed-wetting in children (not babies), called GoodNites.

Also, he was only about 90% toilet-trained. My mom said she had me trained between 1 and 2 years, in the early '80s. This change isn't only in my family; if you look around on parenting forums, blogs, and articles, it's easy to find debates about letting your kid wear diapers for a lot longer than 2 or 3, as well as for delaying or extending toilet training until whenever you or the kid feels like completing it. The central reason as always is to shield their fragile self-esteem from all distressing environmental feedback, like waking up in a wet bed.

Distressing feedback is what causes growth and improvement, to deal with the currently inadequate state of the system. Blocking out the real world prevents them from receiving pleasant feedback too: if they wake up in a dry bed, it might not have been because their own system is working well, but only because they wore a diaper to bed. So, shielding your kids from feedback stunts their growth and leaves them unsure if they're developing properly.

When did diapers for children become popular? With the Millennials, naturally, the first generation to fall victim to helicopter parenting. Huggies Pull-Ups were introduced in 1989, and GoodNites for middle-years children in 1994.

Actually, they were only the most recent generation. Helicopter parenting was the norm during the mid-century as well. Was there an earlier wave of support for children's diapers? You bet. I've been looking over issues of Parents magazine from its beginning in the late 1920s through the present. Mostly looking at the covers and skimming the ads. I don't know when this ad first appeared, but it was no later than 1955 (and it was in multiple issues from that year):

"Protect your child from the psychological disturbances caused by bed wetting" -- sound familiar? I mean, who cares if he's 8 years old and still wearing a diaper to bed? We can tell this is for children and not babies because the picture shows a middle-years child, there's an offer for a free booklet called "Bedwetting and the older child," and they're offered in waist sizes from 18 to 36 inches, i.e. not for infants or toddlers.

Not wanting to bruise your child's precious self-esteem is also associated with Dr. Spock's mega-selling child advice book, Baby and Child Care, which first came out in 1946 but remained influential throughout the mid-century. It also advocated letting the toilet-training take however long it took, rather than try to impose a time-table. Translation: let your kid stay stunted for as long as possible, and let him guide the process instead of you.

I suspect there were successful products like this before the '50s, but I've only been skimming the ads and paying attention to those with pictures, and I've only looked at years ending in 0 or 5.

I did look through all issues I could find from 1926 through 1930, to get a better feel for Jazz Age child-rearing. There was nothing like this being advertised. Back then, the Behaviorists were more influential, and in practice it wasn't as bad as you'd think. Parents weren't putting their kids into Skinner boxes -- this was Watson's heyday, not Skinner's. And they seemed to let their kids roam free like parents did in the '60s, '70s, and '80s.

Their emphasis on strict schedules and on refraining from close early contact was obviously not adhered to -- mothers just can't feel that way about their kids. Rather, it was meant to push parents toward the middle, and away from the opposite extreme of smothering, which was the norm in the Victorian era. Wanting to bind your sons to the home with "silver cords" of love, grown sons talking about stroking their mother's silver hair and stealing a kiss -- creepy. The turn of the century through the '20s and early '30s was only trying to move away from that weirdo mother-son relationship.

At any rate, what kinds of products did they advertise to deal with childhood maturation? Not products that let the kids stay stunted, but that would help them to grow up if they weren't already. There were several different brands of anti-thumb-sucking devices being advertised in the Jazz Age, something like this:

This entry in a database for graphic design says that it appeared in Good Housekeeping in 1932, which is some years after it appeared in Parents. Clearly there was a decent demand for products that would help your kids leave behind babyish ways, not draw them out as long as they wanted.

Letting kids wear diapers is part of a broader pattern of slower development during falling-crime times, since the future does not have to be discounted so much -- and why do today what can be done tomorrow? Rising-crime times shift people's time horizons closer to the present, putting their feet to the fire almost, so parents feel more like nudging their kids along faster through the lifespan.

When are you going to stop sucking your thumb? When are you going to stop wearing diapers? When are you going to get a paper route and earn some of your own money? When are you going to get your driver's license so I don't have to keep chauffeuring you around? When are you going to move out and get a job? When are you going to start dating? When are you going to settle down and start a family? When are you going to quit the rat race and enjoy retired life already?

February 25, 2013

Girl fight voyeurism then and now: Roller derby

Here is a concise history of women's roller derby. Although the roller skating craze dates to the turn of the 20th century and lasted through the Jazz Age, the combative, EXTREME sport version dates to the mid-'30s. Its popularity grew through the '40s and early '50s, when it was broadcast on TV. In 1940 there were 5 million spectators, or nearly 4% of the entire US population at the time, as if there were 11 million fans today.

Some pictures from its mid-century heyday:

Perhaps by the late '50s, and no later than the '60s, it began losing popularity; it survived only for pure camp value in the '70s, and was absent from the '80s and early '90s. Only in the late '90s did interest re-emerge, and its 21st century revival is another example of how the Millennial age is repeating so much of mid-century culture.

The only thing that distinguishes these periods is the trend in the crime rate. During rising-crime times, voyeuristic fascination with other people's pain and failures disappears. People actually participate more in sports, especially in team settings, but it's more to let loose their ambitious drive, act as a member of a larger team, and feel socially integrated into the larger community.

During falling-crime times, everyday violence isn't so common, so people try to over-stimulate themselves with violent imagery to make up for it. Human beings just have a certain need to see violence in their cultural products. But because people are more cocooning and emotionally unattached to their peers, this takes on a more lurid and voyeuristic quality. Unwholesome. If there had been YouTube in the mid-century, it would have been full of "fail" videos. (But not if they'd had it in the '80s.)

Seeing the roller derby gals from the mid-century knock into and tumble helplessly under each other was part of a broader voyeurism about violent women and girl fights. The heartless femme fatale was a stock character from film noir movies of the time (not only the better ones, but the cheesy ones too). And mid-century comic books, akin to today's video games, regularly featured images of voluptuous dominatrices, butt-kicking babes, and girl fights. Below are just three covers out of many on the theme, with the middle one coming from an entire series devoted to Crimes By Women.

Apart from the demand-side effect of greater voyeurism in the mid-century and Millennial periods, what about the women who star in the show? In falling-crime times, women just seem more snippy and cutting toward one another, and toward men as well, beyond the female animal's basic bitchiness. You don't see it in action as much because they're more cocooning in those periods, but things like femme fatales and roller derby reveal how they behave when they do lock themselves in more competitive settings.

It wasn't like the craze for women's tennis and golf in the Jazz Age or the New Wave Age, where they emphasized sportsmanship and the fun-loving aspects of competition.

Risk-taking in individual vs. group contexts

During a period of growing passivity and risk aversion, how is it that even the minority of people who still have something of a taste for risk can't manage to put it to much use?

Most of the risky endeavors we'd like to take part in are social, and hence require us to find numerous others who also have a taste for risk. Having the trait on your own doesn't get you very far. On the harmful side, there are opportunistic teams of criminals, up to full gangs. It's a lot easier to rob a store if you can find someone willing to drive the getaway car, or lend a hand inside.

On the creative side, there's forming a music group -- putting yourself out there before an audience, let alone trying to record a hit song, means all of you need to be willing to risk failure. I think that's why there don't seem to be as many successful bands as individuals in pop music these days.

Keeping things simple, let's say you need just one partner for some endeavor, and that people come in two types -- risk-taking or not. If everyone is wandering around the local area without being able to detect each other and move purposefully toward each other, then the chance that a random encounter pairs two risk-takers is R^2, where R is the share of the population that's risk-taking. (This is the "law of mass action" from chemistry -- equivalently, the chance of drawing two risk-takers from a large population, where the draws are independent.)

Even if R starts to fall over time in a straight line, the decline in R^2 will be much more dramatic. For example, if 40% of the population are risk-takers, then they'll bump into each other with a 16% chance. Cut their numbers in half to 20% of the population, and they now only meet each other with a 4% chance. We shrank their share by half, but their encounters shrank far more -- a 75% drop.
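That mass-action arithmetic can be checked in a few lines; the percentages are the ones used above:

```python
# Pair-encounter chance under the mass-action model: if a share R of the
# population are risk-takers, a random pairing draws two of them with
# probability R squared.
def encounter_rate(r):
    return r * r

# A straight-line fall in R produces a much steeper fall in encounters:
for r in (0.40, 0.30, 0.20, 0.10):
    print(f"R = {r:.0%}  ->  encounters = {encounter_rate(r):.0%}")

# Halving R from 40% to 20% cuts encounters by 75%:
drop = 1 - encounter_rate(0.20) / encounter_rate(0.40)
print(f"relative drop: {drop:.0%}")  # -> relative drop: 75%
```

In general, halving R always cuts encounters by 75%, since (R/2)^2 is a quarter of R^2.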

As R falls, R^2 falls very fast at first, and then more shallowly. For most things I can think of that involve risk in a social context, they've all declined over the past 20 years, but it seems like the drop was steeper over the '90s and shallower during the 2000s. The crime rate is one (the web or network of criminality effect), and the teen pregnancy rate is another (it takes two to tango).

Things that aren't so easy to quantify are tougher to call, but the re-segregation of the sexes was a lot more rapid during the hysterical '90s, when we went from the popularity of power ballads to indoctrinating college freshmen to believe that every male student is a hidden date-rapist. Same with the re-segregation of the races -- a very steep drop from white people driving up the TV ratings for The Cosby Show, Family Matters, and The Fresh Prince of Bel-Air, to the L.A. riots, O.J., "institutional racism," etc.

Extending social relations across major demographic barriers, like male-female or black-white, is risky stuff -- they are the Other, after all. So even a linear decline in risk-taking will cause a rapid collapse in such across-group relations.

How do the more risk-taking people try to find an outlet in more cocooning times? Well, teaming up with other risk-takers is going to quickly become nearly impossible, so they pursue more individualistic risky pursuits. There was that whole Jackass phenomenon that exploded in the '90s, where you jump off your roof, run into a tree, or whatever, just for kicks. Often you didn't have anyone else there since it wasn't like a team sport.

I remember jumping off the roof of our carport with an umbrella on an extremely windy day, sometime in the later '90s. The idea just hit me walking home from school, and it was pretty fun. The rise of EXTREME sports reflects this same difficulty risk-takers have in teaming up with like-minded folks in their neighborhood, leaving them to sky-dive, skateboard, rock-climb, or whatever, as individuals.

And then there's that EXTREME foodie stuff. Who can eat the grossest or strangest stuff. Again, not an activity that requires a partner or team, but is still risky. Basically all of the EXTREME activities of the '90s and 21st century fit this pattern.

Is there any domain that's more immune? Well, anywhere that people can detect one another as risk-takers and purposefully meet up. Since they're rare, that means the search will have to range over a large area, and they'll be in organizations that are scaled up more highly than just the neighborhood or even region. Then the assumptions of "mass action" don't apply.

Venture capitalists, for example, seem to be doing OK at indulging their taste for risk in a group setting. Politicians and high-ranking bureaucrats coming together from across the nation in high levels of the federal government, in order to socially experiment through policy -- that'll satisfy their taste for risk all right.

But for anything at the more local level, we're back to the dilemma of the few risk-takers having such trouble bumping into each other. Low-level entrepreneurism seems to be at a low point compared to the Reagan years. It takes balls not to suck mega-corporate dick, and back then people were less willing to just take the easy way out. Now everyone's casting their lot with the big guys, at a level unseen since the mid-century. Then it was GE, IBM, and General Motors, now it's Apple, Google, and Walmart / Target. Back in the '80s, there was actually popular demand for breaking up the big guys, AT&T being the most shocking casualty.

As a consolation, Western societies have been through these phases of the cycle many times before, so we'll make it into a more adventuresome zeitgeist sometime soon, probably within 5-10 years. The Gilded Age of robber barons was followed by the entrepreneurial Jazz Age, the technocratic mid-century by the New Wave Age of yuppies, and the neo-company-man Millennial era by... well, whatever's in store for us just around the bend.

February 24, 2013

TV dinners then and now

In an article for the NYT Magazine about the science and business of junk food, there's a long section on the history of Lunchables, an all-in-one-tray meal for kids whose concept and design were based on the TV dinners of the 1950s.

Lunchables were test-marketed in the late '80s, went national around '89 or '90, and have only gained in popularity since. There were only a few varieties at first, but I checked my supermarket yesterday and they have an entire end-of-aisle display offering dozens of options -- not just lunchmeat, cheese, and crackers, but now also pizza, mini-hot dogs, and so on. 1990 also saw the introduction of the still popular Kid Cuisine, a more explicitly TV dinner tray for the microwave.

I remember those very clearly when they came out, the idea that you were making a meal all by yourself. One of the businessmen quoted in the article mentions that this was the main appeal to kids -- you were putting together the pieces however you wanted, not opening up a sandwich already made by your mom. And of course the appeal to the mothers buying them was saving time and effort -- no prep, no clean-up, just throw the trays away. They were part of the re-emergence of the cult of convenience and efficiency in American culture, last seen during the mid-century.

Here is an extensive gallery of the first wave of TV dinners, mostly from the late '50s and early '60s, though continuing with altered marketing through the '60s and early '70s. Swanson's came out with the first line in 1957. The early ads make the pitch based on convenience (no prepping before, no dishes after) and the interchangeability of the TV dinners for mom's home cooking. See this example. Already by the mid-'60s (see this gallery), they dropped the emphasis on convenience -- it was now about giving the kids something special -- and made little or no suggestion that they were a substitute for home cooking.

After their retreat from mid-century triumphalism about convenience, TV dinners seem to have more or less disappeared from the mid-'70s through the '80s. Lunchables and Kid Cuisine really were a novelty for us. I don't remember ever eating TV dinners in the '80s, either us kids or our parents. I searched Google Images for "tv dinner(s) 1980s" and "tv tray(s) 1980s," and came up with nothing. So it wasn't just my house. In fact, I don't remember any of my friends eating TV dinners, either themselves or their parents, and we used to eat at our friends' houses fairly often back then.

Sure, we cooked a lot of food in the microwave, but it wasn't an all-in-one tray pitched for its convenience for frazzled mothers, and made to substitute for real food. The TV dinner bonanza of the late '50s and early '60s seems like one of the mid-century social charades that was undone starting in the '60s.

Housewives of the '50s and early '60s increasingly felt like their domestic work was not appreciated, but you can hardly blame their families. How grateful are your children and husband supposed to be when your meal-making consists of throwing some pre-fab grub into the oven, and cleaning only silverware and glasses afterward? Any woman who truly wanted to get joy from housewifery must have realized how unrewarding it was to go the TV dinner route, and that the trays were just an excuse for her own domestic laziness.

By the mid-'60s, the jig was up. Even youth culture icons like the Rolling Stones took note. Their 1966 song "Mother's Little Helper" mentions the mid-century mom's rationalizations about not cooking real meals and running off to pop some happy pills:

"Things are different today," I hear every mother say
Cooking fresh food for a husband's just a drag
So she buys an instant cake and she buys a frozen steak
And goes running for the shelter of her mother's little helper
And to help her on her way, get her through her busy day.
Sound familiar? Only today it's Lunchables and Prozac. What's the opposite of convenience and efficiency? Committedness, thoughtfulness? Well, whatever you want to call it, once the pendulum swings away from convenience and back toward thoughtfulness, we'll see more real family meals. And none of these science fair experiments and arts-and-crafts dioramas made out of food that super-moms toil over these days either. That's more of a show-off thing for herself as against the other mothers, not made out of caring thoughtfulness for the family. And it's patronizing to the children -- "Look kids, your gingerbread raccoons even have little milk moustaches made out of vanilla icing!" Jeez, get a life, mom...

Returning finally to the idea that Lunchables give kids autonomy, that's just an illusion too, one that all sides are willing to believe in these days. Real autonomy means teaching the kid how to make food by himself. Before Lunchables, etc., I remember my parents, mostly my mom, teaching me and my brothers simple things that still gave us the ability to make a quick meal for ourselves. And all while we were still in elementary school.

Anyone else ever learn how to make an Egg McMuffin in the microwave? Throw some English muffins in the toaster, and while they're going, spray Pam inside a coffee mug, crack an egg into it, cover with wax paper and nuke on high for a minute, and you've got the egg part. Take out a slice of cheese, and presto, an Egg McMuffin in under 5 minutes. You could also nuke some bacon (wrapped between two layers of paper towels) to make a bacon, egg, and cheese McMuffin. Nuked bacon also goes great with fresh tomato, lettuce, mayo, and toast for a quick BLT.

That was on the fast food-y side, but we also learned how to scramble eggs and make omelettes in the skillet, how to make pancakes from mixing milk and eggs with the flour up through pouring the batter and flipping at the right time, how to fry hash browns, how to boil water to make mac & cheese (not very hard, but still felt more dangerous and grown-up than zapping a pre-made TV dinner), etc. I remember my friends being able to do simple independent stuff like this too, at least the friends close enough where I might see them fixing their own food at home.

Are small children allowed to boil water or cook with stove burners anymore? Probably not. Or roasting marshmallows over the stovetop flame to make s'mores after you get back from a long day of sledding and snow forts? Yeah, probably not. That was real childhood autonomy, not cooking at an adult level, but giving us our first big shove toward the edge of the nest, letting us take risks and allowing us to learn from our mistakes. What's the alternative? -- clinging to your mommy's skirt or microwaving Ramen noodles for the rest of your life.

February 21, 2013

Where do young people hang out in our cocooning times, and why?

The percent of people who stay holed up in their private sphere these days is far greater than it was in the 1980s; Americans in all parts of the country have not hunkered down this much since the mid-century. Still, there's a continuum of cocooning -- some are mostly invisible to the outside world, but some go out every day, even into public spaces. And yet the types of places they go to show that they aren't as open, trusting, and outgoing as it may appear from the mere fact that they left home. First, a review of where people go, and then an account of why they visit these places and not others.

Children are not allowed outside at all anymore, so their public hang-outs have vanished -- the video game arcade, the roller rink, the mall, the park, the playground, the pool, etc., and have not been replaced or built over by new public hang-outs for kids. Teenagers who haven't left for college don't hang out much in public either. Occasionally you'll see a few at the shopping centers, Jamba Juice, Starbucks, but mostly they're locked inside by helicopter parents too. Worse, they don't rebel against their smothering mothers and sneak out.

College students are different -- they're typically away from their parents, so they have more freedom. The biggest change in campus life over the past 15-20 years has to be the transformation of the library into a primary hang-out spot. Movies, TV shows, and commercials from the '80s show people studying in the library, perhaps with friends, but it wasn't a main destination for hanging out. Now students are eager to spend hours at a time inside the library.

Before, campus libraries had at most a few vending machines for food and drink, but now that so many students spend so much time there every day, a new space has been created to meet their needs -- the campus library cafe. It's not a bustling cafeteria or a mall food court, but a small, quiet spot with seating that mostly sells snacks.

I should mention at this point, for those who were in college before these changes (I remember them being under way during the early 2000s), that people don't go to the library to browse the stacks, check out books, or even do that much nose-to-the-grindstone work (college classes have never been easier). Instead, they go there for the multitude of "study areas," some wide-open, some more intimate, some with talking allowed, some quiet-only, some with a huge computer cluster, some where students bring their own laptops, and so on. In the 21st century, college kids are so not hormone-crazed that their main social destination is a great big study hall.

Despite the library's popularity, very little socializing actually takes place. You might chat with someone you already know, but it is not a place where people go to interact with people they don't know. If you sat down with a person or group you didn't know and tried to strike up a conversation, it would be awkward.

It's like the rise of the small house party, where the only ones in attendance are directly known to the hosts, or at most by one degree of separation. Further degrees of removal mean you can't be sure they're trustworthy. Young people today just feel uncomfortable socializing with perfect strangers. The mass of students packed into the library might seem like an exception to small-only gatherings, but they're all cut off from each other. It's more like cells in a hive than a bustling crowd.

You do see people trying to make minimal contact with others around them, though only in the form of fleeting eye contact, and so only between the sexes. It's not an invitation to come over and talk to them, or an invitation to them to come on over to you. It's not even a strong expression of sexual interest or intent, since again it never goes any further than a quick glance.

It's more like an agreement to give their ego a little boost if they give yours a little boost. If they're out in a public space, they aren't in the most anti-social category -- they still feel some need for social recognition, appreciation, and belonging. Totally cut off from everyone else, they'd have no idea where they stood in the eyes of their peers. So, head on down to the library and see how many people are willing to make eye contact with you. Like, "I got a look -- thank god, I'm not ugly after all!" or "That chick over there just looked at me and didn't have a creeped-out look on her face -- thank god, I'm not the biggest loser after all!"

Because this contact is so superficial, only kids in the normal-to-"popular" range go there. (I use quotes because in anti-social times, no one is actually popular with a broad group. I mean those who would've been the popular kids back in the '80s.) You generally don't see the fat/ugly side of the bell curve, or even the nerdy/geeky side -- surprising for a library, eh? I'd guess that they tried out the library as a hang-out, but noticed they didn't get any looks, or got creeped-out looks, and decided the hell with it, might as well stay in my room and play Xbox or have a Twilight and Ben & Jerry's marathon again.

And because this contact is so fleeting, they do it a lot more frequently. It's not like a huge party on the weekend where you get along well with strangers, some of whom may become new friends, or go all the way with someone you've had your eye on, or maybe even just met. The heady after-effects of that kind of socializing will last well into the next week. It's such a powerful signal to your brain that it won't need reassurance of your normal-to-desirable status for a while.

But split-second eye contact isn't such overwhelming evidence, so you need to be constantly scanning to see if others are trying to establish it with you. Girls especially seem to strut around frequently in a see-and-be-seen way, never quite sure if they're perceived as hot or just do-able. This self-doubt would be easily settled if they went to large parties once a week and got a sense of how many guys made a move on them. Way more convincing evidence than eye contact. But today, lots of unfamiliar guys making a move on you feels awkward and creepy. Or even if they had school dances -- how many date offers did you get? But if they don't want to go to those dances in the first place, they'll disappear, and that option for self-evaluation disappears too.

There's an obvious parallel to texting and posting on Facebook as a replacement for voice calls or in-person conversations. Receiving a single text gives you only minimal reassurance that you aren't ugly or a loser, so you need to keep receiving them -- and to return the favor for assuaging your own self-doubt, you need to keep sending them to those who sent them to you. Young people's social exchanges (hard to call them interactions) are thus part of their broader suite of OCD tendencies -- they're always teetering on the brink of self-doubt, and feel compelled to keep pushing some button to receive the little food pellet for their ego, again and again and again. They don't want the social equivalent of an intensely flavored, endorphin-releasing meal that would satisfy them for some time to come.

Without going into too much detail, you see the same general dynamics at the other major hang-outs for college kids and 20-somethings -- the coffee shop has been turned into a campus library computer cluster, for instance. Somehow supermarkets have become hang-out destinations (although you don't hang out there for very long), with some now offering their own little cafe area with seating, like the campus library cafe. Quiet small-scale food places are somewhat popular too, like Noodles & Company, Lunaberry, etc. Not "fast food," though -- too many cars constantly pulling around to the drive-thru. The gym is the only place where young people get physical in public these days, again with interaction among strangers being understood as forbidden.

For contrast, where don't you see college kids and 20-somethings hanging out much anymore? Well, the student union is more or less dead as the central, all-purpose hang-out, and so are the large green spaces around campus that social-seeking and sun-worshiping students would have flocked to back in the '60s, '70s, and '80s. Food places that are more bustling and carnivalesque are gone too -- cafeterias, mall food courts, the automat. And all commercial stores where the purpose is to browse are also gone as hang-out spaces -- the bookstore, the record store, the video rental store.

As for physical activity, playing sports in public spaces is gone -- an informal game of football in a field, tennis courts for tennis or roller hockey, basketball courts, baseball / softball diamonds, kicking the soccer ball around, frisbee, hacky sack, etc. Public pools, mini golf courses, roller rinks, and dance floors have also been abandoned. If people do get physical in an open area, it's always jogging with earbuds jammed in their head -- don't interrupt me.

What distinguishes the spaces that have fallen from the ones that have risen seems to be how purposeful your visit is expected to be. If it's the kind of place that you visit for some specific, deliberate purpose, then that wasn't so popular in the '80s but has taken over now. If it's the kind of place where you visit with no plan or purpose in particular, that used to be popular but is now being reclaimed by the wilderness.

When people develop the cocooning, distrusting mindset, they don't want unfamiliar people to approach them. How can you manage that while still venturing out into public spaces? And how can you still take part in at least minimal social exchanges? Well, simple: hang out at a place where there's a very well understood expectation that strangers do not approach one another there. Why? Because everybody goes there for some specific purpose and is otherwise occupied -- studying, writing a paper, checking items off their grocery list, getting a quick bite to eat in between studying, meeting friends to catch up with them on important matters over lunch, and so on.

So, if someone unfamiliar approaches you, you can just give them that look or vibe of, "Uh, do I know you? This is a place for studying, you know..." In a bar or on the dance floor (other deserted spaces), you can't give someone a look like the very act of approaching you is violating an unspoken behavior code. So blowing a guy off in those spaces makes you feel more bitchy. But if you're sitting at a booth in the Whole Foods cafe area, you can give them a weird look -- after all, you're just taking a little rest while running errands, and they'd be interrupting you.

For the same reasons, you can't make any new same-sex friends at these places. If a bunch of guys are huddled around the TV in the union to watch the game, they can shoot the shit all day long even if they don't know each other. But plopping down across from some random dude at Starbucks, just to chat, would give off mad homo vibes. Striking up a conversation with a same-sex stranger is no problem at a record store or bookstore if you've got similar tastes, but you wouldn't think of doing so in a supermarket just because you both like Spanish cheeses.

This little investigation shows why it's worth paying attention not just to gross quantitative measures of sociability, like how much time people spend outside their home, but to qualitative aspects as well. You might think that cocooning, while worse compared to the '80s, still isn't so bad -- look at all the kids hanging out in the library, in Starbucks, jogging around the park, etc. But they only choose these places because they can be assured that strangers won't approach them; they'll have plausible deniability because people don't go to those spaces to really interact. So, "I'm not being anti-social and awkward, I'm just here to conduct other business." Even when people do venture out into public areas these days, their lack of trust and social awkwardness still shows through.

February 20, 2013

Neuroses about feminine hygiene: The past (douching)

Part 1 here covered the trend toward pubic hair removal of the past 20 years. It's not an aesthetic or sexual thing, but a hygiene/disgust thing, in their own words. They obsess about it if it begins to grow in, and they compulsively remove it to alleviate their anxiety. Yet the OCD woman of today never feels real relief because the damn thing keeps growing back in.

Now let's turn to the previous outbreak of both OCD and of neurosis about feminine hygiene, the mid-century.

The prevalence of OCD back then is pretty easy to diagnose from the ubiquitous hygiene films that were screened to schoolchildren. Here you can see stills and summaries of a couple dozen of them, mostly from the '40s and '50s. They didn't fall on deaf ears either: most candid pictures from the mid-century show people who are fairly fastidious about grooming and hygiene.

As for their obsessive thinking and compulsive behavior about feminine hygiene, the purification rituals centered on douching rather than hair removal back then. A concise account can be found in chapters 8 and 9 of Flow: The Cultural Story of Menstruation. (I know, the things I read to uncover the truth...) The tone is your standard snarky sassypants, and the rest of the book consists of predictable feminist griping about having to live up to male-determined standards bla bla bla. But the chapters that stick more to primary sources like ad campaigns are quite revealing for just how out-there the mid-century zeitgeist was, and how similar to our own.

Unlike the seeming mystery of pubic hair removal, it's a no-brainer to interpret douching as a hygiene- and disgust-related practice. Not always -- sometimes the ads pitched douches as a matter of feeling comfortable and fresh, without appealing to the yuck factor. But in the mid-century, the mindset was all about purifying her disgusting, offensive odors.

There were two major purveyors of douching solutions, Zonite (weak bleach) and Lysol -- I mean, why only disinfect your counter-tops with the stuff? To see how widespread they were, do a Google image search for douche ad lysol, and douche ad zonite. These dozens of ads span decades and appeared in perfectly mainstream magazines like Woman's Day and Good Housekeeping. You want to see how surprisingly bizarre the '40s and '50s were? -- take a look for yourself. Below are only a few that illustrate the pattern.

Her husband has locked himself away from her disgusting odor, which because of doubt and inhibition she's neglected to do anything about. Some of the copy:

One most effective way to safeguard her dainty feminine allure is by practicing complete feminine hygiene as provided by vaginal douches with a scientifically correct preparation like "Lysol." So easy a way to banish the misgivings that often keep married lovers apart.
. . . truly cleanses the vaginal canal even in the presence of mucous matter.
. . . the very source of objectionable odors is eliminated.
Once again we see the crippling self-doubt of the mid-century on full display. Women who are more well-adjusted don't get hysterical about this kind of stuff, and would not be susceptible to these kinds of ads. The fact that these campaigns lasted so long, and so presumably drummed up so much business, tells us that women of the time really were ashamed of their sex and neurotic about feminine hygiene, far beyond taking basic care of themselves. Just like today's hair-removers.

Also note how disgustingly palpable the word choice is -- "mucous," "matter," "odors," etc. They had little sense of taboo when talking about the effluvia and odors of the female sex organ, no roundabout references here. This taboo-violating word choice is common across the ads.

The young married couple was supposed to enjoy a little roll in the hay during their quiet evening at home, but the disgusted husband just can't bring himself to tell his eager wife that it stinks.

And she must constantly be on guard against an offense greater than body odor or bad breath -- an odor she may not detect herself but is so apparent to other people.
Zonite... actually destroys, dissolves and removes odor-causing waste substances.
Again the appeal to self-doubt and shame, the OCD emphasis on "constantly" being on guard, and the vivid description of "odor-causing waste substances". Here are a couple other funny ones from Lysol and Zonite, but again just wade through the Google image results for yourself. To find ads for other brands, try douche ad 1940s, or douche ad 1950s.

So what's the big deal, you might ask? Wasn't it great that ads could so freely shame women for not pleasing their husbands, and that husbands were still allowed to look disgusted if their wives didn't meet the standards?

Well, no, that atmosphere is just like ours today with hair removal. Women feel a crippling shame themselves, other women try to shame them if they don't shave or wax, and hysterical "men" get all grossed out by normal female appearance and texture. Indeed, the guys' inflexibility and appeal to violations of hygiene/disgust norms shows that the guys themselves suffer from OCD and fetishism, almost as though they'd be unable to perform if her vulva didn't look the way they preferred.

A normal woman has nothing to be ashamed of down there, unless she truly does have an infection or something. Clearly the douching phenomenon was hysterically blowing the "problem" out of all proportion. We don't know how men felt, but if they were like those depicted in the ads, then they were psychologically stunted too, in that "ewww, girls are yucky!" stage of childhood.

Crippling shame and juvenile girls-have-cooties disgust are not signs of healthy minds. Rather, we've found yet another case of the broader picture that the mid-century was the "Age of Anxiety". They make the population look more like that of tropical gardening societies (horticulturalists), where there's a pervasive fear and disgust of female sexuality. These are the societies with little to recommend them; they're the most savage. Think of your first impression of the Amazon, New Guinea, or sub-Saharan Africa. In fact, even today African-American women are much more likely than whites to douche, owing to the tropical gardening culture of their ancestors.

The ads are more disgusting than the "problem" they seek to treat -- imagine opening a mainstream magazine and reading the words "mucous matter," "odor-causing waste substances," etc., in the context of feminine hygiene. That's what's gross -- violating the taboo that says you talk about that somewhere else.

Aside from warped minds and gross language, the douching phenomenon had real consequences for women's health -- negative ones. Doctors don't even recommend it anymore because it's now understood how altering the vulvo-vaginal ecosystem so radically can eliminate the good bacteria and allow the nasty ones to take over. See here for the negative outcomes that douching is associated with, and see here for a longitudinal study showing that douching raises the risk of getting bacterial vaginosis (symptoms similar to a yeast infection's, including a fishy smell, but more common). So, douching causes at least some of those problems, including infection, rather than being a harmless bystander in a web of mere correlations.

We don't even know yet what parallel harms pubic shaving and waxing do, but we don't need to right now. It's clear that some exist, because the practice is such a radical alteration of the body's natural state. And no, as some airhead suggested before, it's not the same as shaving the armpits -- there are loads more pathogenic organisms in and around the vagina, which is an opening into the body, unlike the armpit. You're messing with something more serious when you remove pubic hair vs. armpit hair.

At any rate, harm is not the main issue here. It's how well-adjusted, healthy-minded, and socially integrated people are. Normal men and women will not show such an OCD mindset and behavior toward feminine hygiene, as indeed they did not during the '80s, the '70s, and even most of the '60s. The neurotic preoccupation with douching provides another example, then, of how unwholesome the mid-century culture was.

I'm sure it didn't occupy their every waking moment, but then neither does pubic hair removal today. It was so common -- if 30% douche today, when doctors discourage it, it must have been near or over 50% during its heyday. And it was so centered on shame. As with pubic hair, it's just not the kind of thing women should be obsessing over and compulsively trying to "treat," only to remain neurotic.

February 19, 2013

Men paved the way out of the mid-century malaise

I don't think there's a strong received wisdom about women leading the charge out of the stultifying mid-century culture, or about men and women leaving at the same time. So this post isn't to "debunk" a standard story, but to show something you wouldn't notice without looking.

Our Millennial age is the re-incarnation of the mid-century, and we're a good ways into our falling-crime phase by now. We're 21 years past the 1992 peak in the crime rate, just as 1954 was 21 years past the previous peak in 1933. Within the next five years, then, we're going to see the culture open up and breathe a little more freely, just like the mid-late-'50s. That will set the stage for the mass exodus from private cocoon life during the next 1960s -- yeah, a lot of that time to come will be annoying, but it'll be one of those gear-shifting times that will ultimately lead us into the neo-'80s.

In looking over the period of the mid-'50s through the early '60s, when you can see people starting to leave their cocoons, it's striking how different the timing was between men and women. Already by the mid-'50s men were growing discontented with the social role they were supposed to accept (that of the "company man"). This is summed up in the 1955 hit novel, and 1956 hit movie, The Man in the Gray Flannel Suit. It wasn't until 1963 that the rough female counterpart came out, The Feminine Mystique.

Sloan Wilson and Betty Friedan were born just one year apart in the early 1920s. Belonging to the same generation, they should've heard similar rumblings among their same-sex peers at the same time, if men and women were growing equally tired. Yet Friedan didn't even get the idea to write the book until the late '50s, let alone publish it. By then, more than half a decade had gone by of discontented 30 and 40-something men revealing their grievances in public forums, such as pop culture. So, women felt that the trail blazed by men wasn't going to grow back over -- it had been trodden enough to be safe for female feet now too.

Related to the passivity and risk-aversion of the mid-century was a widespread acceptance and usage of amphetamines, barbiturates, and minor tranquilizers to help confine one's emotional ups and downs to a narrow, steady band. The backlash against that kind of "cosmetic pharmacology" first made an issue of men taking them -- the reformers portrayed them as emasculating and domesticating the more animal-driven nature of male users.

Only somewhat later did the more numerous female users come out about their problem, mainly suburban housewives popping Valium. By then I think the backlash took on something of the flavor of the Feminine Mystique -- they saw their drug addiction as the unwholesome extreme that they'd gone to in the quest to play the role of Supermommy.

I don't recall the dates off the top of my head, but they're in the Happy Pills book by Herzberg that I cited before. I want to say it was the late '50s / early '60s when men began to rebel against being emotionally medicated, and women around the mid-late-'60s.

The Billboard year-end singles charts show a fairly ho-hum procession of songs throughout the early half of the 1950s. Then all of a sudden, the 1956 charts show male performers unashamedly opening up, putting it all out there, and cutting loose -- mostly Elvis, but also the Teenagers with "Why Do Fools Fall in Love". By '57, there are even more (there's a table with all the year-end charts at the bottom of the previous link). There's still Elvis, but now also the Everly Brothers, Jerry Lee Lewis, and Buddy Holly / the Crickets. And it just goes on from there through the last two years of the '50s -- even more Everly Brothers, Dion and the Belmonts, Ritchie Valens, the Drifters, etc.

Those early songs are breaking away from the emotional restraint -- or perhaps lack of much emotion to begin with -- of songs about boys and girls from the mid-century. They're pointing the way toward the power ballad of the 1980s, after which emotional numbness and blandness would set in once more, lasting through today.

What about female performers? There's a noticeable lack of them altogether in the '50s. The most memorable ones were probably the Chordettes, but they don't sound like they're letting go -- it sounds more rehearsed and under-control than the songs by male performers of the same time. "Mr. Sandman" came out in '55, when "Rock Around the Clock" ushered in the rock era, and "Lollipop" came out in '58, the same year that saw "All I Have to Do Is Dream," "Great Balls of Fire," "Peggy Sue," and the original recording of "Do You Want to Dance".

Nope, the first year with a clear breakthrough by emotional females, akin to Elvis in '56, was 1961 when the Shirelles took the world by storm as the first "girl group" -- i.e., the just-go-with-it, boy-crazy type. Whereas the late '50s saw the male rock explosion, it took until 1963 for female singers to join in the excitement -- just about every classic early song came out in that single year: "Be My Baby," "Then He Kissed Me," "Da Doo Ron Ron," "Just One Look," "One Fine Day," and so on. They were pointing the way toward the Pointer Sisters and Bananarama, who perfected the genre in the '80s. Again we see females trailing males by about five years.

Finally, there's clothing, another obvious difference between the drab mid-century and the upbeat '60s - '80s period that followed. The main shift seems to have involved variety of colors and the presence of patterns, both of which make clothing look more striking and signal the willingness of the wearer to put themselves out there and get noticed, trusting that others aren't going to ridicule them, or perhaps feeling too carefree to pay them any mind if they did.

In an earlier post on Christmas sweaters, I detailed how men were the first to re-adopt them after they more or less disappeared during the mid-century, having been so popular during the Jazz Age. The Sears Christmas catalog ("wish book") shows sweaters with multiple colors and winter-themed patterns for men starting in the 1956 edition, when women's sweaters were still uniform in color and lacking in patterns. Multiple colors and bold patterns don't show up for women's winter sweaters until the early '60s -- they're definitely in the '62 edition, and perhaps in the '60 or '61 editions, though neither is online for me to check.

Outside of seasonal clothing, everyday shirts showed the same timing. By the mid-'50s, the Sears catalogs have lots of what we'd call "flannel shirts," with their bold all-over pattern and contrasting colors. (Sounds more fun than a gray t-shirt, khaki cargo shorts, and brown sandals.) Not until the early '60s do we see a similar profusion of multi-colored and patterned shirts for women. Once again, men led the way out of drabness by a good five years or so.

There are surely other examples, but these are diverse enough to make the point. A falling-crime era selects for more feminized traits, so naturally men will be the first to grow restless and try to pave the way out toward a somewhat more dangerous but ultimately more fulfilling way of life.

By the same logic, when the outgoing zeitgeist grinds to a halt and swings back toward the cocooning side -- most recently, circa 1990 -- women seem to be the first to drop out of public spaces and to withdraw trust in the opposite sex, while men stick it out for a little longer before closing themselves off as well. But that's for another post.

February 18, 2013

Gayest states vs. gayest cities

Via Steve Sailer, here is a Gallup poll on homosexuality rates across the 50 states and DC. Below is a slightly edited comment of mine.

Homosexuals congregate in cities, so we should really be asking which cities are the gayest, per capita. In 2012, the queer magazine The Advocate put together a list of 15, with honorable mentions. It's not number of homosexuals per capita, but fixtures of queer "culture" per capita -- how in-your-face the queer lifestyle would be.

The index consists of: queer elected officials, WNBA teams (lol), International Mr. Leather competition semifinalists, Imperial Court chapters (they fundraise by holding gay balls), teams that competed in the Gay Softball World Series, gay bookstores, nude yoga classes, laws protecting trannies, and concerts by Gossip or the Cliks or the Veronicas.

This way of measuring a city's gayness seems better than asking a random sample if they are or are not. For one thing, even in a fag-friendly age like ours, responses won't be totally honest. That might not matter, as we could just add however many percent we think are lying to the reported percent. However, the underestimation may not be uniform across the country -- in more conservative regions, the underestimation is probably greater than in more liberal regions.

By measuring honest signals of queer activity like gay bookstores and nude yoga classes, we can sidestep the issue of response bias. It doesn't matter if some city reports a far lower-than-average rate of homosexual identification to pollsters -- if they participate in the Gay Softball World Series or compete in the International Mr. Leather competition, we know they're gay.

Also, measuring unmistakable, public signs of queer culture gets more at what normal people truly dislike so much about fags, namely their in-your-face lifestyle. No one advocates having the police trail queers 24/7, because we know they're going to do what they're going to do. We just want them to keep it out of our awareness and out of our communal spaces. If you want to make a lifestyle out of sucking off 10 different dicks in a single night, then go drop dead from AIDS where we don't have to look at your rotting half-corpse of a body. Otherwise, try to at least appear and act normal when you're out and about.

Now, which city would you guess is the #1 gayest in the country? Salt Lake City, UT. No surprise to anyone who's ever visited there. Even on a one-day visit, you'll hear more lisping buzzing voices than you'd hear during a week in San Francisco. Nearby Denver also makes the list.

Looking at the state-level data, it makes sense -- the Mountain states, except Arizona, are so low in homosexuals per capita that those who are there would find it impossible to meet another one. So every faggot from the whole Mountain Time Zone picks up and heads off for Denver or Salt Lake City.

There seems to be a similar thing going on in the South. Most of those states are very low on the state list, and yet Little Rock, Knoxville, and Atlanta all make the cities list. Judging from HIV rates, Atlanta seems to be the most over-run by gay corruption.

On the flip-side, states with higher rates of gayness -- Vermont, Maine, Hawaii, etc. -- generally don't have cities on the city list. Because it's much easier for them to find each other in the suburbs or perhaps even rural areas, they don't feel the need to all flock to the same nearby city.

So in high-gay states, their influence is more diluted, and no one area seems that in-your-face. But in low-gay states, although most areas are emptied of gays, the largest nearby city will feel like a latter-day Sodom.

February 17, 2013

Catchy chewing gum jingles

What is it about chewing gum that made a rockin' little jingle so central to its commercials? No other product type had such consistently likable music. Maybe it was the message: you don't have to worry about basic things anymore, so go have yourself some good clean, wholesome fun. Whatever it was, it seems like every brand had a catchy commercial in the second half of the 1980s.

Extra. Hands-down the best. Great build-up and climax, and there are about three separate melodic phrases, with no mindless repetition. How'd they pack all that, plus the voice-over pitch, into just 30 seconds? Great funky bass line too. Another version.

Big Red. Almost as much melodic variety as the Extra jingle, and another great build-up to the climax.

Juicy Fruit. It's not quite as varied as the Extra jingle, nor with as much build-up. The jangly timbre of the instruments does give it more of a footloose feeling, though.

Doublemint. Rounding out the classics, this one has good melodic variety, more on the easy-breezy side compared to the energy of the others. Nice sincere sense of humor, too, about the most you can get out of a 30-second commercial.

Carefree. A little more repetitive, but still fun and upbeat.

Dentyne. Couldn't find a good original jingle for this brand, though they did take a stab at it many times. But hey, a nice sample works too.

Spearmint. Another one that I couldn't find a catchy original jingle for, but these ads are so rad it doesn't matter if they're music-free. Here is another. In 30 short seconds, they capture the '80s attitude toward technology pretty well -- somewhat anxious (dark lighting), somewhat optimistic (smiling faces), and trying to have fun and a sense of humor about it all. Very new wave, even though these ads came out in '88.

Ever since the '90s, gum commercials have led the way toward the meta-ironic faggotry that we're still stuck in today. As the ads from the '80s show, it's not that hard to make something upbeat, catchy, and non-saccharine, even if it's advertising chewing gum.

These things aren't terribly entertaining on their own, but they're not meant to be -- and when you think of how many ads we have to watch every day, every week, it all adds up. The unrelenting tide of wacky-zany commercials these days makes it unbearable to watch TV anymore, aside from the programming itself. Back in the '80s, I don't think we reflexively dove for the remote every time a commercial break came on, and you can see why -- they were pretty fun for what they could be.

February 16, 2013

So many slang words about how suspicious you find every place and everyone

Earlier I took a look at the rise of slang words that assume nobody trusts you -- "honestly," "literally," "seriously," "I'm not gonna lie," etc. They're all from the '90s onward, part of the broader social trend toward lower trust in people you meet in everyday life. If you expect your listeners not to trust you, you have to continually reassure them that you're telling the truth.

Well, there's a flip-side to that as well: the rise of slang words that signal how little trust you have in others. This reminds your conversation partners that, in general, you're a very suspicious person, so if they really do mean what they say, they'd better mark that overtly with one of the many "honestly" kind of declarations.

"Sketchy" is the first that I remember, and that was from the '90s. I think before then it used to mean "hazy," as in "the details of the robbery are sketchy for now, but police are interviewing witnesses to fill in the blanks." I'm pretty sure that in the '90s it was used primarily for places, particularly unfamiliar public places, that you found suspicious. "I dunno, this place looks kinda sketchy," or "Holy shit this neighborhood is sketchy, let's get the fuck outta here."

Only in the 2000s did it get used to refer to people, as individuals or groups, who are unfamiliar. "That guy who keeps checking you out looks kinda sketchy," or "I thought about rushing Omega Mu, but those girls seem pretty sketchy." It's typically girls who use these low-trust words, as females are less trusting of strangers and public places, and more fearful of them, than males are.

Then there was "shady" in the '90s, which again I think first referred to a public place or a situation that was not part of your daily routine. "Dude, that McDonald's we ate at last night in Anacostia was so shady," or "They're having a party Friday night in the woods behind the school? I dunno, sounds shady if you ask me." (Did not mean for that to be a pun.) Now it's more of a term for strangers, as in "That guy who just posted on your Facebook looks pretty shady, to be quite honest."

Also from the '90s and early 2000s was "shiesty" (based on "shyster," with a long-I vowel), meaning someone who presents a trustworthy appearance but is actually devious. It didn't refer to places or situations. I think this is the only one that guys used more than girls did, and it generally referred to other guys -- like ones who wouldn't get your back when you thought they would. "Chris said he forgot his wallet, so we covered his part of the check. Then we see him hailing a cab to get home -- I told you that dude was shiesty."

Two of the most commonly used slang words of the 21st century also refer only to suspicious people or social situations, not places -- "stalker" and "creepy," and their variants (stalking, creeper, etc.). "Stalker" took off in the early 2000s, and "creepy" more like the mid-late 2000s. They primarily refer to male behavior toward females in the domain of dating and mating, whether some guy who makes a pass at a girl, looks like he's going to, or even a guy who shows no awareness of the girl, but if he did, she'd be creeped out. "Who's that fat ugly creeper asleep in the corner over there?"

It's not just a message about how undesirable the guy is -- there were and still are other words for that. Like, "Who's that goofy-looking dork who keeps asking you to dance?" or "Can you believe that loser actually thought I'd go out on a date with him over Brad?" The new words add a layer of connotation that he's not only undesirable but suspicious, threatening, and so on. "OK, not gonna lie, but that stalker sitting at the table behind you, literally looks like he's thinking of raping you." "Ewww, creepy guy number 4 just looked at us again -- I honestly need to go take a shower now."

Their targets are not actually threatening -- probably some awkward, video-game-addicted dork -- but they exaggerate the sense of suspicion and threat in order to fence themselves off from everyone else, no matter how harmless.

Therefore, the rise of "stalker" and "creepy" can be seen as an extension of the date-rape hysteria that began in the late '80s and lasted well into the '90s. But now that boys and girls don't date each other that much anymore, the panic has shifted from the date-rape scenario to creepy guys thinking about or looking at you in the first place, before even approaching you.

In the same way, most of these low-trust words first referred to unfamiliar places, and only later came to refer to people too. The first phase of cocooning behavior was to abandon public spaces, while still having some fellow feeling for your peers (early-mid-'90s). Now with so few people willing to hang out in public, it's common to hear people say that someone is "stalking" them on Facebook, not even in real life. Once you were spatially secure in your narrow private sphere, you would proceed with the second phase by cutting off your social ties to would-be friends and mates.

Cocooning begins once the crime rate rises to such a high point, and most people sense that it's those unguarded public spaces that are most dangerous, so they ditch those first. But you could always be harmed by someone you know and have allowed to get close, so you have to follow up by ditching your peers too, just to be safe.

Honorable mentions go to less popular words like "sus" (from "suspect"), usually referring to someone whose heterosexuality is under suspicion. "I dunno, shaving your balls sounds pretty sus." In this fag-friendly age, it's reassuring to know that at least some of the Millennial generation still finds what they do disgusting. Then there's "seems legit," a self-conscious commentary on what is obviously bogus, polluting, etc., not merely suspicious. "Bro, check out this ad. 'Hot MILFs waiting to fuck you tonight in your city' -- seems legit."

Finally, there's "dodgy," a word that pretentious Americans imported from Britain in the 1990s, judging from its appearance in the New York Times. In British usage, it means risky or chancy, and can refer to inanimate isolated places, like "be careful, the drive up that hill gets a bit dodgy." Americans use it with paranoid and suspicious connotations -- "that new character in Downton Abbey seems a bit dodgy."

The books in Google's digitized library (Ngram) show an initial rise and fall of the word "dodgy" in British English during the Victorian era, beginning in the 1860s. That would've been 20 years after Dickens nicknamed one of his dodgy characters the Artful Dodger. I'm not sure in what contexts it was used back then, as opposed to the contemporary usage of "risky" or "chancy" in general, including inanimate things. But based on the Dickens character, I wouldn't be surprised if it had a connotation of untrustworthiness in human beings, or public places. The Victorian era was another period of low trust and social isolation, following in the wake of the opposite Romantic-Gothic period.

February 15, 2013

Girls' hair that "frames the face" vs. reveals the face

[Edit: See the comments for many examples of 1940s hairstyles that look like those of the Millennial age.]

Today at Starbucks I overheard some loud airheaded faggot giving beauty tips, or beauty commandments, to his fag hag friend. Something about how she can't get a certain haircut that she'd wanted -- it needed to be some other way in order to "frame her face". I didn't get a real good look at her, but she didn't look fat, ugly, or mannish, hence no need to worry about disguising her face through hairstyling.

What does this phrase mean, then? Here are some examples from Google images, all from the past 20 years naturally:

The common denominator seems to be a lack of volume, and using the hair to hide the face. Sometimes even the eyes are hidden from view. You generally don't see the ears or a good part of the jawline, perhaps not the forehead / temples / upper face area either. It appears more common for hair below the chin to angle or curve inward rather than outward, thus hiding the neck from view as well. The face peeks out from a narrow opening between an almost closed pair of forward-jutting curtains. Basically the goal is to look like you're wearing a hoodie.

That fits in with the tendency toward cocooning, feeling awkward about your body, and of course the rise of hoodies. Wrap your face in a little security blanket, and venturing out into public doesn't feel so threatening.

For comparison, here are counterpart pictures to the ones above, but from the '80s:

You sure can see a person's face without so much hair getting in the way. It gives a more open, trusting, and confident look -- no security blanket vibe given off here. And by throwing it away from the face, creating volume, it puts the hair, as well as the face, on full display. The blown-back look also makes it seem less self-conscious, and so less neurotic or narcissistic, compared to the micro-adjusted arrangements of hair in the framing look. Plus she can now get your attention by showing her earrings.

Guys with long hair show the same change over time from face-revealing to face-concealing:

Aside from its use in comforting the cocooners, the face-framing hair also gives contemporary women another outlet for their OCD. There are a half-dozen facial shape types, and through hairstyling all are supposed to be altered Procrustes-like toward a single ideal. Plus, each type has its own intricacies of how to use hair to re-shape it into the single ideal. All right, more rule systems to wrap our brains around! And as long as you check off all the items in the list of prescriptions, you can rest stress-free!

Back on planet Earth, though, the OCD types are always super-stressed. You'll only feel care-free by just letting your face be your face (again, if it isn't fat, ugly, or mannish).

I wonder if that's why girls have started to pay so much attention to queer advice about beauty over the past 20 years, whereas they would have written it off in the '80s. If their goal is to not look very pretty for the boys, then why not take advice from a boy with a broken aesthetic antenna?

Normal dudes look on at such an interaction and think she's either clueless or being manipulated -- but just maybe she finds boys bothersome and doesn't want to attract attention from them. Yep: no jewelry, all clothing items baggy as hell, no smile. Her little faggie friend is just telling her what she wants to hear, in this as in so many other cases.

February 14, 2013

Valentine's Day rituals

Here is a comment thread in a "senior citizens" section, where people tell what Valentine's Day was like when they were growing up in school. Looks like the tradition of bringing a special box to collect the cards in dates at least back to the 1940s, probably earlier.

Like our other rituals, Valentine's Day seems to be slowly fading out. For example:

An elementary school in Eugene, Oregon bans Valentine's Day altogether.

High schools in Orange County, Florida ban exchanging gifts on Valentine's Day.

Primary school in England bans giving out Valentine cards.

Horace Mann School near Boston bans Valentine candy.

Elementary school in Mass. bans Valentine's Day and replaces it with Friendship Week.

Obviously an outright ban is just the tip of the iceberg -- more schools must show signs of erosion; they're just not so far gone that they've resorted to a ban. Here is a short thread among 5th grade teachers. One who taught younger kids implies that she didn't do the Valentine's Day traditions with her lower-grade kids. The other teachers say that about 3/4 of their 5th grade kids participate.

I seem to remember doing the Valentine's Day thing just about every year in elementary school, not just 5th grade. And I think just about 100% of kids took part. Anti-social kids must not mind standing out from the majority these days -- I wonder if they offer some cynical sour grapes rationalization about how it's just a Hallmark holiday, etc.

The candy bans are part of the health food trend. The majority of the bans, though, are based on felt violations of fairness -- any time you produce and distribute a lot of stuff to lots of people, some will end up with more than others. And it won't be random, but will reinforce existing hierarchies, like who's pretty and popular vs. ugly and awkward.

So obviously we must get rid of those traditions to prepare children for a real world where there are no lop-sided distributions, and where people never prefer giving things to some individuals rather than others. Rather than remind them that "life isn't always fair". Shoot, I heard that all the time. And "well that's just tough," i.e. "tough luck".

I doubt the children of helicopter parents (like my nephew) have ever heard those responses -- they're not rational or logical principles that the child can apply to other contexts. They're just a reminder that out in the real world, they aren't always going to get explanations -- "I dunno why I don't wanna be your friend, you're just annoying." Well, like annoying how? "I dunno, you're just annoying. Now go away."

To end on an up note, here's "Everlasting Love" by Howard Jones. I don't think they make many songs anymore about settling down after having been around the block a few times. Let alone in such an optimistic upbeat way, not in a tone of feeling sorry for yourself or taking your Plans For Settling Down so seriously, like it's a homework project. Anyway, it's got jungle drums, the Egyptian revival, racquetball, dining out instead of eating in -- so much '80s goodness in a single video.

February 13, 2013

Neuroses about feminine hygiene: The present (pubic hair removal)

A further parallel between the unwholesome climates of the mid-century and our Millennial age is an unhealthy obsession with feminine hygiene, along with a compulsive set of rituals that women perform to relieve their anxiety, only to worsen their self-doubt and, in all likelihood, whatever minor problems they might have had. Radically altering the ecology down there is likely to wipe out the good flora and leave only the bad guys.

There is a parallel unwholesomeness among males, who begin to feel ever greater disgust toward female sexuality, beyond basic taboo feelings, warping their minds back into a pre-adolescent reflex of "ewwww, you put your thing-y in her what-y?!?!?!" The closest case outside of developed societies is the pervasive fear and disgust of female sexuality found among tropical gardening cultures (horticulturalists), such as those in the Amazon, New Guinea, and much of black Africa.

I've split this topic up into two posts, the next one covering the douching craze of the mid-century. This one will cover the past 20 years.

Well, I finally figured out what the whole removal of hair down there trend is all about -- the re-emergence in our society of extreme OCD thoughts and behaviors about hygiene. Us dudes have tried to figure it out for awhile, batting ideas around without simply asking the girls themselves. You can't do so in real life, but the internet has recorded plenty of frank discussion from girls who wax or shave it off.

I won't provide links because they're all over the place, though Yahoo! Answers and comment threads at various chick-only sites (e.g. Cosmopolitan) were pretty helpful. This article from the Atlantic is the best single source, including an estimate of just how common it is -- 60% of 18-24 year-old females are bare sometimes or always (higher for college students).

Whether offering a reason spontaneously or when asked directly, the hair-removers over and over use words related to hygiene and disgust. Hygienic, gross, clean, fresh, eww, disgusting, sick, icky, nasty, etc. Occasionally they mention a better feel or smell when they're sweaty after a work-out, again a hygiene theme. Sometimes they mention personal comfort. And they only rarely give reasons about oral sex or intercourse feeling better, their appearance looking prettier or sexier, or some other sex-related reason.

In sum, it is rarely out of concern for someone else, like looking prettier for the boys. All of their other style choices point the other way -- not wearing visible make-up, not adding volume or waviness to their hair, not wearing jewelry, not smiling, and so on. They are trying not to invite boys over to chat them up, which would creep them out.

An ongoing fixation about, and ritualistic maintenance of personal hygiene smells like OCD, and indeed the hair-removers often emphasize how inflexible they are. It's not like your favorite color being red, where sometimes you'll wear other colors without feeling disgusted. Rather, these girls use phrases like, "I can't stand it", "it makes me feel ill", or "Having hair down there is kind of annoying to me and I'm a huge neat freak."

For their part, the guys who chime in to these discussions tend to offer the same hygiene and disgust-based reasons. So, even to the small extent that girls are catering to male demand, that too is heavily influenced by hygiene/disgust. However, because it is on the internet, there are a lot of desperate nerds ejaculating praise about how much better it feels to hoover a waxed floor than a shag carpet. Note to dorks: shouting enthusiasm for muff-diving will not help you lose your virginity.

Pubic hair removal is a phenomenon of the past 20 years only (see here, and an academic reference in the Atlantic article). And sure enough, that's when we've seen people grow more obsessive and compulsive about hygiene, most clearly visible in the rapid adoption of hand sanitizers and antibacterial everything. Back in the '80s, it was common to walk around outside with no shoes or socks on, weather permitting, but that feels so dangerously unclean to the OCD masses of the 21st century.

Was there an earlier era of widespread OCD and an unwholesome anxiety about feminine hygiene? Well, definitely not the '70s, and not much of the '60s either -- somewhat in the earlier part, but fading out even then. As usual, we've got to go back to the mid-century to find our closest parallels. And it's even weirder than the current mania for lawn-mowing.

February 10, 2013

The unwholesome mid-century: A wave of dependency on tranquilizers, barbiturates, and amphetamines

He's the one they call Dr. Feelgood
He's the one that makes you feel all right
He's the one they call Dr. Feelgood
He's gonna be your Frankenstein

The lyrics and music video for Motley Crue's song about "Dr. Feelgood" sum up the popular view of the drug culture by the late 1980s. Drugs came from the street, and were distributed by local and entrepreneurial pushers, with no advertising and no appeal to authority. They were risky, dangerous, and unnatural substances (hence the reference to Dr. Frankenstein), and therefore only reckless long-haired badasses like Crue fans would be into the scene for the long, or not-so-long, term. The post-apocalyptic imagery and way-out-in-the-desert setting tell you that this way of life was a world apart from the white-picket-fence suburbs where the majority lived out wholesome lives.

That could not be farther away from the zeitgeist of the mid-20th century, when physician to the stars Max Jacobson, also nicknamed "Dr. Feelgood", acted as an amphetamine supplier to such luminaries as Tennessee Williams, Mickey Mantle, Nelson Rockefeller, and John F. Kennedy.

From roughly the mid-'30s through the late '50s, the popular view was that drugs came from all-American manufacturers, and were distributed by benevolent doctors not known for rocking the boat. Aside from the appeal to the "physician knows best" principle, they came stamped with the approval of the federal government, and were widely advertised in the popular media. And ad men would never try to sell you something that was bad for you but profitable for their clients, now would they? After all, these drugs were the result of invention and fine-tuned engineering by research scientists, not some kind of "eye of newt" nostrum that an old-fashioned witch doctor would try to fob off on you.

So, aside from the odd side-effect, they were basically safe to start taking, regardless of whether or not you thought you were such a head-case that you required "medicine". Why, even -- or especially -- the middle-class majority, both the frazzled housewife and her upward-striving husband, could benefit from a daily dose of whatever could energize the listless and steady the nerves of the anxious. Far from being adulterating substances that might threaten personal and societal ruin, regular drug use heralded an age of progress toward a "choose your mood society," as Fortune magazine styled it in 1957.

It doesn't get more unwholesome than that. Not only is drug use dreamed of as widespread rather than sub-cultural, but the attitude toward potential danger is glib and dismissive. As JFK himself said about Dr. Feelgood's amphetamine cocktail: "I don't care if it's horse piss. It works." People are trying to squeeze the highs and lows out of life, as though they'd surgically altered their vocal apparatus to speak only in a flat tone, rather than occasionally introducing even more emotional variety into their experiences. Drug use is hoped to be a staple of their daily habits -- not the occasional up-ending of routine in a cathartic release, with a return afterward to normal life. Worst of all, stabilizing drugs aren't safer: they are addictive, can lead to overdose, and can cause harmful side-effects and/or withdrawal pains after discontinued use.

We could sum up the differences between the two drug cultures as "stabilizing" and "destabilizing". The former breeds complacency and thus widespread adoption, while the latter instills a sense of wariness and thus restricts hardcore users to the most reckless part of the population. Because they're more sensational, destabilizing drugs get more charged coverage at the time and after their day has passed. It's the more common culture of stabilizing drugs that goes unseen by most at the time and to observers in the future.

Was the mid-century zeitgeist merely a trend in attitude but not in behavior? No: changes in the popular mood mirrored the popularity of mood-changers. My goal here is not to establish that on a factual level, since it has been done in great detail by historians.

The best all-encompassing account is Happy Pills in America: From Miltown to Prozac by David Herzberg. For the mid-century, he focuses on the craze for minor tranquilizers like Miltown and Valium, used as anti-anxiety drugs. First adopted enthusiastically in the second half of the '50s, they came under suspicion during the '60s, from the regulatory agencies of the federal government, to patients at the grassroots level, and pop culture stars like the Rolling Stones, whose 1966 hit "Mother's Little Helper" blasted mid-century hypocrisy about drug use. The growth rate in the number of prescriptions written had already slowed during the first half of the '60s; prescriptions plateaued in the early-to-mid '70s and have never regained their peak popularity since.

Two article-length accounts, here and here, cover the other major drug types: "America's First Amphetamine Epidemic 1929--1971" by Rasmussen, and "The history of barbiturates a century after their clinical introduction" by Lopez-Munoz et al. The graphics include ads as well as data charts, and are worth a look themselves.

Amphetamines AKA "speed" appear to have been the drug of choice during the mid-century mania, more prevalent than tranquilizers, and perhaps more than barbiturates, although I can't tell from the data available. And they were also used in amphetamine-barbiturate combinations. In any case, people started adopting amphetamines during the mid-to-late 1930s, and they enjoyed their guilt-free peak during the '50s, only to come under suspicion during the '60s and begin declining during the '70s and '80s. In 1962, enough amphetamines were being produced to give every American 43 standard 10-mg doses per year, or about one hit every 8.5 days -- again, not for actual users, but as though everyone were getting high that often.

Barbiturates are hypnotics (sleep-inducing) or sedatives, meant to mellow out the symptoms of anxiety. Their heyday was the '30s and '40s, and they still did well into the '50s, not suffering from a tarnished reputation but merely being edged out a bit by the introduction of tranquilizers like Miltown. Only during the '60s did they come under suspicion and decline, never to return to their peak popularity. Dark fun fact for the day -- barbiturates were what Marilyn Monroe turned to when she committed suicide in 1962.

Despite their declining reputation starting in the '60s, and their declining usage during the '70s and '80s, both prescription uppers and downers have come back into fashion since the 1990s, in the form of amphetamine-like or amphetamine-containing drugs (Ritalin, Adderall, etc.) and SSRIs (Prozac, Zoloft, etc.), respectively. And yet before their surge during the mid-century, approved-of and sanctioned uppers and downers were on the decline, from roughly the turn of the century through the end of the Jazz Age in the early '30s.

That period itself was a reversal of the Victorian craze for ham-fisted drug solutions to the normal demands of real life, which saw the heyday of the "patent medicine" AKA snake oil. Patent medicines only came under suspicion and regulation during the early 1900s, as part of the muckraking and Progressive era -- the precursor to the muckraking and Progressive era of the 1960s.

What ties together all of the periods of rising enthusiasm for "cosmetic pharmacology" is a falling crime rate, and what ties together all periods of rising wariness of everyday mood-changers is a rising crime rate. I explored some of the connections between those variables in a post about cocooning and mental dysfunction, and in a post on the cycles in advertising that brought us campaigns for snake oil, Geritol, and Enzyte across three separate falling-crime periods.

That may strike us as surprising when we recall the heady atmosphere of seemingly widespread drug use during the Jazz Age (opiates) and the New Wave Age (just about everything, but especially pot). However, those were sub-cultures, with the more hardcore ones being almost underground. They were not blithely adopted by mainstream middle-class suburbanites, who instead came to appreciate how destructive casual drug use could be. And even among users, the goal was not to squeeze out all the variety of life, but indeed to introduce more excitement, and in a social rather than isolated setting.

All of these aspects point to a more wholesome culture during rising-crime times, when destabilizing drugs are popular, and to a more unwholesome one in falling-crime times, when stabilizing drugs rule the day. Apart from the crime rate itself, life is overall more fulfilling when we're put to manageable challenges that we must work through with one another.