April 18, 2014

Would the needy turn down your obsolete computer from 10 years ago?

I thought about donating the old Gateway tower that I found languishing in the basement, since it still runs fine. It's running Windows XP smoothly on an 800 MHz Pentium III, with a 20 GB hard drive and 256 MB of RAM (and another 128 MB can be bought on eBay for a whopping $3, shipped). It has a fast CD-ROM drive, a 3.5" floppy drive, a Zip drive, 2 USB ports, and ports for modem and ethernet.

Not only does it do everything that a normal person would need, it's also backwards compatible with old things that may still be lying around the house, like floppy disks.

And yet such a system would be rejected by all computer donation centers, who preen for their do-gooder audience about how they're keeping bulky electronics from choking up our landfills, helping out others in the community who can't afford or otherwise don't have access to computers, and so on and so forth.

Why? Because their vision of "bridging the digital divide" means giving the needy a set-up that's within striking distance of the computers that the well-to-do use. It doesn't mean giving them something that meets their needs for free. After all, on the Gateway system from 2000, how are the poor supposed to stream pornography in HD? Or the all-important function of hardcore gaming? Giving them a system like ours would only perpetuate the inequality gap in cyber-distraction.

The first hit I got for middle to upper-middle class Montgomery County, MD, was Phoenix Computers -- "reclaim, refurbish, reuse." According to their "what donations can we use?" page, your computer will probably be rejected as obsolete if it isn't running at least a 2.0 GHz Pentium 4, and will be harvested for parts instead. Talk about greedy.

Even the ThinkPad T40 that I'm using to type, upload, edit, and comment on this post would get rejected — only 1.5 GHz, on a Pentium M descended from the Pentium III. Yet somehow I've pulled datasets off the internet, analyzed them in Python, drawn graphs of the results in R, and made PowerPoint talks to present them to others, carrying these on a USB flash drive. And garden-variety web-surfing, of course. But, y'know, the computer experts are right — this thing doesn't provide surround sound for when I'm watching cat videos, so it must be time to just throw this wad of junk in the trash.
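
For the skeptics, here's roughly what that kind of work amounts to — a minimal Python sketch, using nothing but the standard library, that pulls down a small CSV dataset and summarizes one column by group. The URL and column names are made-up placeholders, not any dataset I actually used; the point is just that nothing about the job demands modern hardware.

```python
# Minimal sketch: fetch a small CSV dataset and summarize one numeric column
# by group, using only the Python standard library. The URL and column names
# are hypothetical placeholders -- swap in a real dataset.
import csv
import io
import statistics
import urllib.request

DATA_URL = "http://example.com/some_small_dataset.csv"  # placeholder URL

def summarize(url=DATA_URL, group_col="year", value_col="value"):
    raw = urllib.request.urlopen(url).read().decode("utf-8")
    groups = {}
    for row in csv.DictReader(io.StringIO(raw)):
        try:
            groups.setdefault(row[group_col], []).append(float(row[value_col]))
        except (KeyError, ValueError, TypeError):
            continue  # skip rows with missing or malformed values
    for key in sorted(groups):
        vals = groups[key]
        print(f"{key}: n={len(vals)}, mean={statistics.mean(vals):.2f}")

if __name__ == "__main__":
    summarize()
```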

Was it just the greedy do-gooders in Montgomery County who took such a wasteful approach toward conservation? Here is an all-purpose page with suggestions for those thinking about donating their old computers, and they are only a little bit more forgiving, explaining that only machines no more than 5-10 years old are going to make the cut. But under these more generous guidelines, that Gateway that's running XP without a hitch would still get sent to the scrap yard. Not flashy enough for the discriminating tastes of 21st-century proles.

So, for all their talk about frugality and stewardship, these donation and recycling centers behave more as though they were the producers of Pimp My 'Puter for MTV. Aw yeah, son, Black Friday's coming early this year!

Zooming out to the big picture, this entitled mindset among the lower 75% of society is an overlooked cause of how fucked up our economy is becoming. In the original Gilded Age, wasteful status-striving was only an option for the one-percenters. But now that we have democratized access to debt, along with state-mediated schemes like Cash for Clunkers and Obama Phone, everybody can whine like an entitled little bitch their entire lives, and bury themselves under debt in order to play the status-striving game. Hand-me-downs are for losers, and everyone must be a winner.

Today's economic explosions will be far more severe, on account of how broadly the attitude of wastefulness has infected our society. And so-called donation programs that feed this petty sense of entitlement are only going to make things worse.

April 16, 2014

Is anyone holding onto their digital pictures?

While poking around the dank storage area of our basement back home, I found a nearly 15 year-old tower computer lying ignominiously on its side on one of the shelves. It had been sitting there since 2010 and had scarcely been used since around 2008. When I later opened it up to clean it out, there was thick dust covering all top surfaces and a good amount of the cables. It was a miracle that when I first tried to power it on, it took only a little coaxing.

While cleaning out lots of old files to free up some hard drive space, I came across what must have been hundreds of image files stored across a few dozen folders. This computer had been my brother's during college, and went back into general family use during the mid-to-late 2000s. So there's a good range of who and what is pictured. My brother's social circle at college, family vacations, holidays, and so on and so forth. Not to mention a good deal of pictures of old photographs that had been digitally scanned.

Nothing mind-blowing, but isn't that normal for family pictures? And it's not as though any single picture would've been a major loss, but family pictures aren't trying to make the individual shot excellent, they're trying to record what our experiences were.

Needless to say, if I hadn't taken an interest in restoring and preserving this dusty old thing, it and those hundreds of pictures would've gone straight into the landfill. I backed up the pictures onto a flash drive just in case the hard drive craps out, and it struck me that we had to buy a new drive for this purpose. There wasn't a master drive that we had been loading digital images onto all along.
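
For anyone facing the same rescue job, the backup step is nothing fancy — here's a minimal Python sketch that copies every image file from the old drive onto a flash drive while keeping the folder layout intact. The source and destination paths are hypothetical placeholders, not the actual machine's.

```python
# Minimal sketch of the backup step: copy every image file under the source
# folder onto a flash drive, preserving the folder layout. The paths are
# hypothetical placeholders -- adjust for your own machine.
import shutil
from pathlib import Path

IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".gif", ".bmp", ".tif", ".tiff"}

def backup_images(src="C:/old_tower_files", dst="E:/photo_backup"):
    src, dst = Path(src), Path(dst)
    copied = 0
    for path in src.rglob("*"):
        if path.is_file() and path.suffix.lower() in IMAGE_EXTS:
            target = dst / path.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, target)  # copy2 preserves the original timestamps
            copied += 1
    print(f"Copied {copied} image files to {dst}")

if __name__ == "__main__":
    backup_images()
```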

Perhaps other families are more OCD than ours (that would not be too hard), but I suspect that most people are not moving their old picture archives from one main "device" to another. And given how quickly the treadmill of planned obsolescence is running, they're not going to have much time to get to know the pictures that are confined to a particular device before cycling on to the next one.

Printed photographs are the exact opposite. In our cedar closet, we still have album after album full of film prints, some of them going back to the early 20th century. Photo albums were never housed in a larger piece of technology, let alone one that was subject to such rapid turnover. So it hasn't been hard to keep those archives separate from all the other stuff that comes and goes in a household.

And although I've written about how bland, forgettable, and crummy digital pictures look compared to film, their quick sentence to oblivion seems to have more to do with digital storage media than with digital image capture. If you took pictures with a digital camera but printed them up, they probably wound up in a photo album with the others. Whereas if you took pictures with a film camera but told the photo lab to upload scanned image files onto a flash drive instead of making prints, you'll lose them before 10 years is up.

It may seem odd that digital images are vanishing so easily — although less tangible than photographs, they are still housed on a physical machine. But those machines are getting more or less thrown away every several years these days, and even if they're donated, their hard drives are automatically wiped clean before they're passed along.

Forgettable images on disposable machines — how would this world ever go on without progress?

April 14, 2014

White flight and ethnic differences in cocooning

The main cause of white flight is no mystery — the higher rates of violent and property crimes among the groups that they seek further distance from. Hence, in this period of still-falling crime rates, whites have partially reversed their flight from the urban core, and have reconquered large parts of major American cities that would have been no-go zones for most folks back in the early 1990s.

This influx of affluent whites has pushed the earlier black and Hispanic groups out into the suburbs. How are white suburbanites responding, if they aren't part of the movement back into cities? They're moving further and further out into the exurbs.

But why? It's not like the earlier white flight, given how low crime is these days. Sure, it's still higher among their new neighbors, and moving further out would bring them into lower-crime areas. But it's not the huge gain that it would've been 30-40 years ago. Particularly if the newcomers are Mexicans, whose crime rates are not even at black levels to begin with.

What about the newcomers' kids fucking up the local schools? That would certainly worry the parents of school-aged children, but would probably not get picked up by the radar of everyone else. It should only be the former group moving out, whereas it seems like everybody is packing up.

Fleeing diversity and its fragmenting effects on trust, in search of greater similarity and cohesion? That's more like it, but why is it the resident whites who are leaving, rather than the non-white newcomers? Is the white suburbanites' homogeneity that easy to perturb toward disintegration?

You wouldn't see that in the other direction, with suburban whites invading and driving out suburban Mexicans — other than by paying more money for rents, goods, and services. The Mexicans aren't going to move out just because they sense a creeping up-tick in the level of diversity with all these white newcomers. They're going to make it known that they aren't going anywhere, and that whites aren't welcome.

And not necessarily in a violent or confrontational way. All it takes is them refusing to learn or speak English, refusing to participate in mainstream American culture, refusing to display American cultural symbols, etc. And then proudly displaying their own.

Whites are a lot more wimpy about asserting their ethnic or cultural identity. I don't mean in some cosplay Nazi style, but in something as simple as playing rock music out of their car windows or on their front porch, walking around the neighborhood, and hanging out in its public spaces — particularly in groups. That would give them a group-y physical presence that would not feel palpable if they kept to themselves indoors most of the time, occasionally going out for a jog alone or walking the dog alone.

To the outgoing go the spoils. If you look around a diverse area today, you'll see that Mexican parents are more willing to let their kids hang out on their own, and that the grown-ups are more likely to be hanging out in the public areas of a shopping center instead of pulling right up in front of their destination, darting in and out, and taking off back home. Even where they may be a numerical minority of private residents, their public physical presence can be much greater.

This line of thinking may also explain why some white suburbs haven't been so thoroughly affected as others. The introverted Scandinavian weenies in Minnesota and Wisconsin have gotten slammed a lot harder by all the black newcomers from the Chicago area over the past 20 years. They're just going to keep moving further and further out.

The working-class suburbs of Boston and New York (who can't just out-bid the non-whites) have fared much better, compared to the rest of the country. You can always rely on the garrulous micks and wops, I mean the sociable Irish- and Italian-Americans, to stand their ground, not just as an individual defending his private property, but as whole groups coming together to put some pressure on unwelcome newcomers.

Public life is part of the pastoralist culture of honor that the capable defenders are adapted to, whereas retreating into the private domain of the family is more what large-scale agriculture selects for, outside of urban dwellers (and you let the culture of law deal with any problems that may arise). Naturally the Nords are going to have a tougher time driving outsiders back out, compared to the Celts.

April 13, 2014

Children of divorce, invisible to Wikipedia

I was just reading about Amy Adams on Wikipedia and found out that her parents divorced when she was 11, despite belonging to the Mormon church, which places a heavy emphasis on family togetherness (heavier than other religions).

While scrolling down to the bottom of the entry, I expected to see one of those offbeat Wikipedia category tags, like "Mormon children of divorce." Alas. But they didn't even have a generic category tag like "children of divorce," "father-absent people," etc.

They have a category for adoptees, something that most folks reading a person's biography would be interested in knowing. And something that could have shaped the way they turned out as adults. Adoption puts a happy face on the category of "unusual family structures," though. Things looked hopeless for the kid at first, but then they were rescued. Pointing out everybody who went through the opposite — things looked cheery at first, but then it all fell apart — would be a downer.

Wikipedia has all sorts of other category tags about a person's background and upbringing, from ethnicity to religion to being a military brat (like Amy Adams). That's the good kind of diversity — in the Wikipedian's mind, no ethnicity or religion or parental occupation is better than any other, so what's the harm in pointing it out? But whether both your parents were still together when you were going through adolescence... well, we don't have to air everybody's dirty laundry, do we?

And it is becoming "everybody" — see this earlier post about the still-rising rate of children growing up in broken homes.*

However, conveying how fucked-up the world is becoming, and pointing out who specifically has been hit, would go against the prime directive of the neutral point-of-view. You can read about it on a case-by-case basis, and assuming you soak up thousands of articles about living people, the pattern might strike you.

But there will be no larger picture that an abstract category tag could clue you in to at the outset. And no firm sense of there being this whole category of people out there, without a concise tag to reify them as a group. Some things were not meant to be understood, even (or especially) through The Free Encyclopedia.

* The trend for divorce is not the same, and has been declining since a peak circa 1980. But the rosier trend for divorce ignores whether or not there are any children involved, and married couples aren't pumping out babies like they used to. The divorce of a childless couple does not leave a broken home.

April 9, 2014

Planned obsolescence and conspicuous consumption

Stuff isn't built to last anymore, whether it's cruddy pressboard furniture from IKEA that will flake off into a pile of shavings within 10 to 20 years, or an iPhone whose speed has been crippled by an "upgrade" to the operating system that overwhelms it.

And yet, as the popularity of IKEA and Apple testifies, people these days not only don't mind how disposable their purchases are — they are eager to feel another endorphin rush from throwing out the old and buying up the new, like a woman who changes her hairdo every three months.

Sadly, you see the same treadmill consumerism at Walmart and Sears, where everyone wants to step up their game, upgrade their rig, etc. It's not just the elites who are effete. Working-class people today are not the honest, making-do folks from a John Cougar Mellencamp song. They act just as entitled, discontented, and in need of cheap junk from China as the rest of the social pyramid.

So, upper or lower class, Americans today don't give a shit if their stuff is unusable in five or ten years. Indeed, that's the way they like it.

Usually "planned obsolescence" is talked about as a supply-side thing, with the producers scheming to trick us into buying things that will be no good before long. But consumers notice that kind of thing, and by now that awareness is so widespread that all you have to do is say "not built to last," and everyone nods along. Notice: they do not stop buying all this shoddy crap, rather they grudgingly accept the nuisance as the small price they have to pay to enjoy the larger benefit of stepping up their game and upgrading their rig, feeling that heady rush more frequently.

This is a sorely under-appreciated source of how crappy things are today. You don't want to think of your fellow Americans as feeding it through their own self-absorbed consumer behavior, and would rather pin it all on the greedy stockholders, managers, marketers, and so on. But those guys can't sell an entire nation what it doesn't want to buy. Same with immigration — you can blame large-scale farm-owners, but what about the folks you know who use a housecleaning service, lawncare / landscaping service, or construction service that's almost certainly employing cheap illegal labor?

The Arts and Crafts movement took root in the late Victorian period, as status-striving and inequality were rising toward their peak of the WWI years. The standard story is similar to today's, where the shoddiness of stuff at the time was blamed on mass production techniques introduced by the Industrial Revolution — something on the supply side, at any rate. Given today's parallels, I'm more inclined to blame airheaded Victorian strivers for spreading the throwaway mindset. Only with such a docile consumer base could the industrialists flood the market with cheap junk.

At the other end, it's striking how sturdy and long-lasting stuff is from the nadir of status-striving and inequality during the 1960s and '70s. Especially for mature industries that can be fairly compared across long stretches of time — like furniture. Those Archie Bunker chairs, cherry dressers, and granny-square afghan blankets are still in wide circulation at thrift stores, and always have been. The "thrift store look" of today is about the same as it was 20 years ago.

For some reason, IKEA futons and plastic rolling cabinets from the Container Store are not making their way in, and likely never will. Nobody has any use for that, and it's going straight in the trash.

April 8, 2014

Displays for digital distraction: Larger monitors, multiple monitors

Back when computers were machines for doing something productive or creative, their monitors were small and compact, whether in a business or home setting. They could have been made much larger, since the same CRT technology was used for medium and large-sized TV sets. But the larger screens were reserved for TVs, because TV was for entertainment; a big screen was of little value when it was time to type up a report or crunch some numbers on a spreadsheet.

Here is the original IBM PC, with an 11.5" monitor (and only about 10.5" being viewable), and the original all-in-one Macintosh, with an even smaller 9" monitor (rather prophetic copy in the ad, don'tcha think?):



How did Primitive Man ever get any work done with such crude tools, unable to read emails in HD? How could they have focused on the memo they were typing, without the word processor overwhelming their eyes on a 30" screen? Really makes you grateful for not having to live in the dark ages anymore...

Monitors stayed small for well over a decade, and did not begin to bloat out and take up most of the desk until the mid-to-late 1990s, as they came to be used more for entertainment — first for graphics-intensive video games, then for watching illegally downloaded movies and TV episodes, then for illegally downloaded porn, and then web pages in general once they became more dazzling than black Times New Roman text against a gray background.

The increasingly common widescreen aspect ratio is another sign of computers as entertainment centers.

But one monitor can only get so big — eventually you're going to have to step up your game and upgrade your rig to include two monitors. Awww yeah, son: getting interrupted by emails on one screen, while refreshing TMZ on the other. Creating PowerPoints LIKE A BAWS. Nobody can hold me down now, I'm a productivity beast!

The multi-monitor set-up is nothing more than conspicuous consumption for nerds. In fact, if two are better than one, why stop there? Hook up ten. (Brought to you by the makers of five-blade razors.) Better yet — swiveling around at the center of a 360-degree ring-o'-monitors. And hey, the sky's the limit: install a planetarium ceiling and crane your neck up toward your Display Dome. Endless workspace, endless possibilities!

Forget the fact that, in the old days of a real desktop, they did not bother extending desks out to 10 feet long in a lame attempt to maximize productivity. Having too many separate sub-areas of the desktop makes it hard to focus on the one at hand. About the only task that truly benefits from two separate areas visible at the same time is manually copying a target document onto a blank one, analogous to dubbing cassettes. Otherwise, the world churned right along — and saw greater productivity gains over time — with just one central work area per desk.

Something similar is going on with the phenomenon of "twenty tabs open at a time," as though keeping twenty books open on a real desktop would somehow make you absorb information more efficiently. Or as though playing twenty TV channels simultaneously would make for greater entertainment. In Back to the Future II, that was presented as satire; today it has become the unremarkable reality.

Undoing or not participating in the multi-monitor trend is fairly simple. Finding a monitor under 17" is a lot tougher. Buying them new is out, and even thrift stores will not accept them, so don't bother there. Craigslist and eBay are the best places, although since most of them are CRT (better than LCD in any case) the shipping will not be cheap. Still, it's a small price to pay for something that will last forever and prevent digital distractions from taking over your desktop.

April 6, 2014

A weakening power of helicopter parenting? Another round of little Boomers?

While traveling a bit lately, I've been observing the otherwise invisible nuclear families of today, now that they have to leave their lock-down compounds to go to the airport, or leave their hotel room to grab breakfast.

It's probably too early to tell, but I'm getting a hunch about how small children these days are, or are not, going to internalize the paranoia of their helicopter parents. These are children in early elementary school or younger.

When helicopter parenting paranoia began with new births in the late '80s, there was plenty for parents to be concerned about (which doesn't excuse their over-reaction). Violent and property crime rates were nearing their peak, and for the previous several decades, it had seemed like the world would only continue on in that direction.

Hence, when the parents sealed off their nuclear family from the outside world and began ordering their kids not to do this and not to do that, there was an honest sense of concern coming through in their voice and mannerisms (however overblown this concern may have been). Moreover, this was the only message about the outside world that the parents allowed to get through to their children — primarily by shutting out all other sources of input, but also by choosing only those external sources that would provide a consistent paranoid message to their little dears. "Parental control."

These children, the Millennials, have grown up to be the most frightened and insecure generation in living memory — how else could they have turned out? Everybody who offered them input, or who their parents allowed them to observe, sent the message that the world is too scary and random to venture out into on your own. And their tone of voice was consistently frightened for your safety, not as though they were just making shit up or just trying to spoil your fun. I guess you might as well hunker down in your room and interact with others at most through virtual channels (texting, internet, online video games, etc.).

Now, what's going to happen when these people become parents? They don't have any first-hand experience with real life, let alone the dangerous, topsy-turvy, and humbling parts of it, let alone decades of such experience. When they try to pass on the message of how scary the world is, it will start to ring hollow. Kids aren't stupid, and they can tell what your tone of voice and mannerisms reveal, aside from whatever you claimed in words. At the least, they can tell when you're being sincere and honest, or when you're joking around and teasing them.

Can children also sense which grown-ups have more experience, and which are more naive? If so, they'd react to the dire warnings of their Millennial parents with, "Yeah, and how would you know, you big wuss?" Whereas if they sensed the parent was more seasoned, they'd take it to heart — "Damn, even this been-there, done-that kind of grown-up sounds scared. It must really be dangerous."

However, when the child steps on the other side of the paranoid "do not cross" line, Millennial parents sound more annoyed and upset than they sound concerned and afraid. This may also be going on with later Gen X parents, who are more experienced, but whose memories of how tumultuous the world can be are fading more and more every year. It's harder and harder for 1992 to exert that kind of gut influence that would shake up the parents, when the world has gotten as safe, stale, and antiseptic as it has by 2014.

Thus little kids today are not going to take parental paranoia to heart like the Millennials did. It's just the mean old grown-ups trying to boss us around and spoil our fun, not looking out for our greater long-term welfare. By the time they're in high school, cocooning will have bottomed out, and they'll be able to enjoy a more unsupervised adolescence. And given how low the crime rate will be by then, they'll conclude that all their parents' warnings were either clueless and out-of-touch, or knowingly wrong and intended to shelter them from real life. See, nothing to worry about in venturing off into unsupervised places!

What was the last generation that had this naive attitude toward breaking free from parents, who they callously dismissed as either out-of-touch or as hypocrites? Yep, we're about to see the rebirth of the Baby Boomers, whose defining impulse is calling the bluff of authority figures.*

It's odd how small children these days are more annoying in public places, running around and making noise, despite their helicopter parents trying to make them behave. When the Millennials were that age, they were either not to be seen at all, or were frozen in place. Today's rugrats and ankle-biters seem more appropriate for the 1950s (see any Saturday Evening Post cover showing their frazzled parents, or a crowd of kids running around the house at a birthday party on Mad Men).

We're not into that late '50s environment yet, but you can sense things creeping up to that turning point. For now, we're still waiting for Elvis.

* Gen X has a similar but more practical and mature attitude. Breaking free from parents is good, but you do have to be cautious out there. Yes, parents are out-of-touch, but that's to be expected given what a different world they grew up in. Authority should not be blindly followed, but blithely romping around calling the bluff of every older person who offers you advice, especially if it comes from the wisdom of tradition, is likely to get you killed.

April 4, 2014

Adventures in vinyl-hunting for practical reasons, part 2

In an earlier post I confessed a major sin: I have always bought vinyl records for practical rather than aesthetic or Romantic reasons. Namely, there's a lot out there that is not easily available on CD — finding it is hard or expensive.

I dropped by a Half Price Books today to comb through their rock/pop records, and found five things I've never seen in real life on CD, for under $15 including tax. You can't go wrong for under three bucks a pop.

1. Scandal's self-titled debut EP, with "Goodbye to You." The EP itself was never released on CD. Two songs are on a Patty Smyth compilation that I have on CD, and VH1 did release all of them on a CD compilation — though given the date, I'm sure it was a victim of the loudness wars and had all the dynamic range compressed out of it. In fact, the Patty Smyth compilation sounds a bit hollow itself, and it's from the mid-'90s IIRC.

2. Missing Persons' album Spring Session M, with "Words," "Destination Unknown," and "Walking in L.A." It was released on CD, but I've never seen it in at least five years of regular combing of used record/CD stores, nor could they order it from their distributor. I'm guessing the CD is out of print, as it's going for $40-$50 on eBay.

3. A Flock of Seagulls' self-titled debut album, with "I Ran" and "Space Age Love Song." It was released on CD, but I've never seen it, and on eBay it's $10-$20 used. The LP is going for about that as well, so I got an even better deal today.

4. The Motels' album All Four One, with "Only the Lonely" and one of the coolest album covers ever. As with #3, it was released on CD, though I've never seen it, and is going for $10-$20 used on eBay.

5. After the Snow by Modern English, with "I Melt with You." On eBay there are CDs going for $5-$10, which isn't bad, but they also say that it's rare. Short supply doesn't always translate into high prices if there's little demand. Can that be, though? It's only got one of the most iconic, catchy, uplifting songs of all time on it! Up until now, I'd been listening to it on the CD soundtrack for Valley Girl.

Come to think of it, I don't believe I've seen any CDs by these bands in the wild, not even a greatest hits, although most of their hits can fairly easily be found on multi-group compilation discs. (I might have seen something by the Motels, but can't recall.)

They had a copy of the first two Duran Duran albums, the self-titled one and Rio. I'm sure I'll continue to like those albums more than any of the five that I bought today, but I still passed on them. I've already got them on pre-'90s CDs that sound great (not guaranteed to be in a used CD rack, but not hard to find either). I'll eventually get around to buying duplicates on vinyl just for the different sound, but there's still so many more records to hunt down of things that are damn near impossible to get on CD.

I also passed on some more pop-y choices that I probably shouldn't have, and may have to return to pick up tomorrow: Belinda Carlisle's debut album Belinda, with "Mad About You," and Jane Wiedlin's album Fur, with "Rush Hour." Two or three songs from Belinda are on a greatest hits of hers that I've already got, but I've never seen anything by Jane Wiedlin on CD, not even her one hit on a compilation.

Usually the discussion of vinyl centers on music from the '60s and '70s, before CDs were available. But you'd be surprised how much killer '80s music is hard to find outside of the record crates, at least the entire original album rather than the hits on a compilation.

If it was major enough, they probably released it on CD — perhaps following up with multiple re-issues if it's that important (like Graceland by Paul Simon). But if the band was remembered as a one-hit wonder, then probably not. The heyday of New Wave from '82 to '84 came just as CDs were being introduced, so it was far more likely to have been released on vinyl and perhaps cassette. That's the place to look for it, then.

April 3, 2014

Cosplay in the Fifties: The Davy Crockett craze

The music video for "Come As You Are" by Peter Wolf portrays exuberant everyday life in a small town during the late 1950s, and by the end the whole town gathers together in a crowd as though they'd all caught dance fever from the future.

One of the key examples of "iconic Fifties life" that they bring out is the 10-year-old kid who's all dressed up in his Davy Crockett gear. Not because he just came from a school play about frontier life, but because that's just what boys were into at the time. In fact, the Crockett craze was part of a broader mania for all things Western during the late '50s, among both children and adults.





How similar was that phenomenon to the trend of cosplay in the Millennial era? It was a full costume of the icon — clothing, hat, and prop gun — rather than a t-shirt / backpack / lunchbox with the icon's image on it. The icon was a specific character, Davy Crockett, rather than a generic type like "frontiersman." The children were role-playing as the character, rather than dressing up that way while behaving normally. And it was a mass phenomenon mediated by the mass media, rather than a strictly grassroots sub-culture or counter-culture.

Daily life in the Mid-century was relatively unexciting, especially for children during the heyday of Dr. Spock, smothering mothers, Levittown subdivisions, and drive-ins where the customers physically cut themselves off from each other. Role-playing as a figure from the Wild West gave them an escape from the cocooning culture's insistence on parking yourself in front of the TV set rather than wandering around the neighborhood, clipping your fingernails and brushing your teeth in the proper way, and remembering to drink your Ovaltine.

At the same time, competitiveness was low and falling circa 1960, so kids back then did not make costume competitions and one-upmanship a part of the Crockett craze. And unlike now, there was no pseudo-slut counterpart among girls — attention-whoring being another aspect of the competitiveness of today's culture. It was also restricted to kids who were about to go through puberty, rather than adolescents and young adults.

It seems then that cocooning is what brings this trend into being, and that high or low levels of competitiveness only shape its expression.

I don't remember anything like the Crockett craze during the nadir of cocooning in the '80s. There are pictures floating around of D&D-themed cosplay in the late '70s and '80s, but that must've been damn rare. There are only a handful of gatherings pictured, as opposed to the endless examples you can find of kids in Davy Crockett get-up or of contempo cosplayers.

At the height of Hulkamania, we might have worn a t-shirt like Hulk Hogan's, or played with ninja weapons during the Ninja Turtles craze, but we never got fully dressed up in character for role-playing, let alone dressed that way in an ordinary setting like hanging around the house. Everyday life had enough excitement back in more outgoing and rising-crime times that it wasn't necessary to pretend that you were part of a Wild West culture.

April 1, 2014

The decline of pop music parody when the songs and videos have become forgettable

VH1 Classic just played "Synchronicity II" by The Police. I'd never seen it before, but instantly recognized that it was a quotation of the iconic video for "Dancing with Myself" by Billy Idol, whose personal image Sting is imitating.

Seems like it's been a long time since a well-known group made a video that quoted another video. This brief review confirms the hunch. In 2011, All Time Low made a video spoofing Katy Perry and other major stars du jour, but they were not a very popular group (never appearing on Billboard's Year-End charts). The last clear example was "All the Small Things" by Blink-182 way back in 2000, and even that was a blip, as there were no others from the '92-and-after period. (Needless to say, Chris Rock did not appear on the charts, and I don't recall seeing the video for "Champagne" back then.)

After "Synchronicity II" in '83, David Lee Roth made a more obvious and slapstick parody video for "Just a Gigolo" in '85, the same year that Phil Collins aimed for parody in the video for "Don't Lose My Number." In '91 with Genesis, he included a Michael Jackson parody in the video for "I Can't Dance." There are probably some others out there that I can't think of immediately, and that the writer of the review missed, but that's enough to establish the basic picture.

Those four acts were all major successes at the time, all appearing on the Year-End charts across multiple years, hence the parody videos would've enjoyed high visibility.

So why the decline since the '90s? Well, allusions like these rely on the audience having already been exposed to the original and stored it in memory. Steadily over the past 20-odd years, less and less of the target audience has reliably tuned into music videos -- and music in general. Moreover, music videos have gotten a lot more bland and featureless -- and who's going to remember a forgettable video when it's referred to in a later parody?

By "forgettable," I don't mean "inferior" in a subjective way, but objectively lacking in identifying features -- things that help you explain them to others to jog their memory. Simple memory tests would show how easy or difficult a video is to recall. "Y'know, that video where the glamor models are playing the instruments and the singer is wearing a shirt and tie?" (Actually, he made three videos like that -- the one where the models are wearing pink, or wearing black? And is he wearing a suit or just a shirt?) Try doing that for recent videos -- "Y'know, that one where those rappers are boasting around a pool and a bunch of big booty hoes are shaking their asses..." Oh right, that one.

I think this also explains why parody songs have died off since the heyday of Weird Al Yankovic in the '80s and early '90s. You can't get the listener to recognize the target song if it has no memorable riffs, melody, solo, or distinctive vocal delivery. It can only end up sounding like a parody of the whole genre, whose performers are all interchangeable (emo, crunk, etc.). That would also have been difficult in the '50s and early '60s, when most pop music sounded so similar and lacking in identity.

Thus, the peak of pop parody during the '80s and early '90s suggests a peak in those years for the distinctiveness of both the music and the videos of hit songs.

Related: Ornament is meant to make things more memorable

March 31, 2014

TV audiences in love with annoying children, the second wave

Here is a New York Post article reviewing some of the many annoying child and teenage actors on hit TV shows right now. I haven't seen any of them, so I'm not sure how central they are, but it sounds like they're at least regular cast members, some of them at the core.

If there are so many of them littered throughout popular shows, the general audience must appreciate it, notwithstanding a vocal minority. And even if the general audience doesn't care for a particular character, that's more of a failed attempt at the goal of creating the "prominent child character" that the audience craves. In today's climate of nuclear family-centric cocooning, viewers just can't get enough of watching children.

However, this isn't the only era when you would have been assaulted by annoying kids upon turning on the TV. During the previous heyday of permissive parenting in the Mid-century, one of the most popular shows was Dennis the Menace, starring a more sympathetic but still off-putting Baby Boomer brat who couldn't act. The less popular yet more iconic show Leave It to Beaver starred an even more annoying kid who couldn't act.

Circa 1960, status-striving and dog-eat-dog competitiveness was nearing a low point, so at least those earlier examples would not have made you angry with their smug dismissive attitude. Still, they were children who couldn't act, they were central characters, and they were going to get on your nerves for being so dorky, bratty, and wussy.

As Mid-century cocooning had all but melted away by the 1980s, it was damn unlikely that you were going to suffer annoying children on television. Here is a site that lists the top 30 shows in the Nielsen ratings for each year, which you can browse if you're unfamiliar with them.

For the early and middle parts of the decade, there was nary a child to be seen, let alone a central character whose immaturity and inability to act would have made you change the channel. There was that blonde daughter from Family Ties who was a mopey sourpuss, but she was mostly out of the picture before she got to high school. I'm sure there are other marginal examples like that, but none where they're one of the main characters.

What's striking about the hit shows back then is how grown-up everyone is. Dallas, Magnum P.I., Cheers, Miami Vice. Viewers then were so maturity-minded that they put The Golden Girls, a show about four senior citizens, into the upper layer of ratings. Directly related to that is how unrelated most of the characters are — annoying kids tend to crop up in shows that focus on families.

Toward the end of the '80s, as cocooning was about to set in, there were still only a couple of annoying kids on TV — the son Jonathan in Who's the Boss? and Rudy from The Cosby Show.

Then as the family-centered shows of the '90s rode the wave of helicopter parenting and cocooning, we got a deluge of annoying kids. Darlene and DJ from Roseanne, the little boy from Family Matters, the Olsen twins and Stephanie from Full House, the blond nerdlinger from Step by Step, and all of the kids from Home Improvement.

Everybody's favorite show about nothing, Seinfeld, was a holdout in this regard, and in hindsight that was a key factor in making the show so enjoyable — no fucking kids. Not even teenagers. It was more of an ironic and self-aware incarnation of Cheers adapted for the postmodern Nineties.

These changes back and forth are the result of folks being more community-minded and public-oriented vs. more family-minded and home-oriented. The community and public places are not made up of children, but adults (mostly). Only when people start locking themselves indoors do they dwell on the ankle-biters that are part of private, family life.

March 30, 2014

Do cats have generational differences?

Cats these days seem more bratty and dependent than I remember growing up. And more housebound and supervised by their caretakers — perhaps the cause of the greater brattiness and dependency, a la the children of helicopter parents? Just as there has been a huge change in parenting styles over the past 25 years, so has there been with pet-caring styles, and in the same direction.

I'm inclined to rule out genetic change, not on principle, only because the change has been so sudden.

Fortunately, cats are less obedient than children toward over-protective owners, so they still get out a good bit. But it still feels like they have a weaker public presence than they did in the '80s. One of our cats used to flop down into a deep sleep right on the sidewalk out front. Once he even got accidentally picked up by animal control because they thought he'd been hit by a car and was lying dead on the curb. Another cat used to climb trees and walk over our first-story roof.

Come to think of it, it's been a while since I've seen the "cat stuck up in a tree" scenario in real life or in pop culture. The last I remember was having to climb up a tree and dislodge our cat, who liked to run up there, at least until the birds began dive-bombing him. That was the mid-to-late 1990s.

I'm not sure how to investigate this idea quantitatively, but it would be worth looking into by animal psychologists. There may be surveys of pet-owners over time about whether their cat is an indoor or outdoor cat.
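If such surveys exist, the tabulation itself would be trivial — here's a minimal Python sketch, assuming a hypothetical CSV of pet-owner survey responses with "year" and "indoor_outdoor" columns, that charts the indoor share over time.

```python
# Sketch of the kind of tabulation that would answer the question, assuming
# a hypothetical CSV of pet-owner survey responses with "year" and
# "indoor_outdoor" columns (values like "indoor" or "outdoor").
import csv
from collections import defaultdict

def indoor_share_by_year(path="cat_survey.csv"):  # hypothetical survey file
    counts = defaultdict(lambda: {"indoor": 0, "total": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            year = row["year"]
            counts[year]["total"] += 1
            if row["indoor_outdoor"].strip().lower() == "indoor":
                counts[year]["indoor"] += 1
    for year in sorted(counts):
        c = counts[year]
        share = c["indoor"] / c["total"]
        print(f"{year}: {share:.0%} indoor-only ({c['total']} respondents)")

if __name__ == "__main__":
    indoor_share_by_year()
```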

March 29, 2014

Pre-screening peers on Facebook

Different generations use the same technology in different ways. Nothing new about that: as teenagers, the Silent Gen huddled around the radio set to listen to drama programs, quiz shows, and variety hours — akin to parking yourself in front of the idiot box — whereas Boomers and especially X-ers turned on the radio to get energized by music.

After reading more and more first-hand accounts by Millennials about how they use Facebook, it's clear that there's more social avoidance going on than there first appears. Refusing to interact face-to-face, or even voice-to-voice, is an obvious signal to older folks of how awkward young people are around each other these days. As recently as the mid-1990s, when chat rooms exploded in popularity, we never interacted with our real-life friends on AOL. It was a supplement to real life, where we chatted with people we'd never meet. Folks who use Facebook, though, intend for it to be a substitute for real-life interactions with their real-life acquaintances.

But then there's their behavior, beyond merely using the service, that you can't observe directly and can only read or hear about. Like how the first thing they do when they add an acquaintance to their friend list is to perform an extensive background check on them using the publicly viewable info on their profile. Who are they friends with? Birds of a feather flock together. Who, if anyone, are they dating? Where do they go to school, and where do they work? Where did they go to school before their current school, and where did they work for their previous five jobs? What's being left on their wall? What do their status updates reveal? Pictures? Pages liked? Etc.?

Damn, you hardly met the person and you're already sending their profile pics to the lab in case there's a match in the sex offenders database. Back when trust was just beginning to fray, people (i.e., girls) would have maybe used the internet to check for criminal behavior, but that would be it. Now it's far worse: they scrutinize every little detail on your profile, and every trace you leave on other people's profiles. They're going far beyond checking for far-out-there kind of deviance, and are trying to uncover every nuance of your life. Rather than, y'know, discovering that first-hand or at most through word-of-mouth gossip.

Aside from feeling invasive ("creepy"), it betrays a profound lack of trust in all other people. After all, it's not this or that minority of folks who are subjected to the screening. Nope, it's like the fucking TSA waving each Facebook friend to step up to that full-body scanner. "OK, open up your profile, and hold still while we scan it... All right, step forward... and... you're clear." What if I don't want to "hit you up on Facebook," and keep that part of me private? "In that case, you can choose the alternative of a full background check by a private investigator, and provide three letters of reference." Just to hang out? "Well, you can never be too safe. It makes me feel more at ease." Yeah, you sure seem at ease...

Moreover, it destroys any mystery about a person. Remember: mystery is creepy, in the Millennial mind. An unknown means that something's going to pop out of the closet and scare them. They have to open every door and shine a flashlight over every micro-crevice of the home of your being, in order to feel secure.

It also delays the getting-to-know-you process, and interrupts the momentum whenever it gets going.

Trust greases the wheels of sociability, and it used to be normal to meet someone for the first time in the afternoon and feel like you knew them well by the end of the evening. Not just boys and girls pairing off, but also same-sex peers making quick friends, particularly in an intense setting like a concert or sports game. These days, there's this nervous laughter, sideways stare, and lack of touchy-feely behavior (for boys and girls) or rowdiness (for same-sex peers).

I think it's probably too late for the Millennials to be saved from their awkward, pre-screening behavior on social networks. They're already in their 20s and set in their techno ways. Hopefully by the time today's post-Millennial kindergartners become teenagers, they'll look at the choice of Facebook as real-life substitute, and let out a great big BOORRRRRRINNNNNG. Locking yourself indoors and huddling around the radio set listening to soap operas died off fairly suddenly, and there's no reason that the next generation can't kill off Facebook just as quickly.

Were you into vinyl before it was cool?

By "being into," I mean that it was a conscious choice among alternative formats. And browsing through the NYT archive for articles about "vinyl," it looks like it became cool during the 2000s. Before then, when it was also a conscious choice, is roughly the mid-to-late 1980s through the '90s.

Today I was reminded of the two main things that turned me on to records in the mid-'90s: selection and price. Nothing romantic.

Not the sound quality — it sounds great, and distinct from CDs, but not a difference that would make me want to convert my CD collection into vinyl. And not the status points — back then, there were no status points to be gained, and even now I would gain little once I revealed what it was that I bought on vinyl (not classic rock, punk, grunge, or indie).

I'm visiting home and stopped by the local thrift store, which had several dozen crates full of records. I hadn't even gone there looking for them, I just figured why not browse around after having scored something that I did come looking for, a vintage afghan (black with gold and cream patterns). Right away they started popping up, and within ten minutes, I had a handful of albums that are difficult to find on CD, let alone for a buck a piece in mint condition. The Bangles, David Bowie, Bonnie Tyler, The Go-Go's, Paul Young, Bryan Adams, and Stacey Q (only a 12" single, but more than I've ever seen on CD).

The only one that I've ever seen before on CD ("in the wild") was All Over the Place, the debut power-pop album by the Bangles. I have that on CD, but the hits from the others I only have on compilations or greatest hits: "Blue Jean," "Total Eclipse of the Heart," "Every Time You Go Away," "Two of Hearts"... and I don't think I have "Head Over Heels" or "Summer of '69" on anything. Yet for the price of one crappy download on iTunes — with 95% of the sound compressed out — I got the entire album that each hit song came from.

It may be new wave and synth-pop today, but 15 to 20 years ago I was still bored by contempo music and looking back for something entertaining. As now, it was mostly the '80s, a decent amount from the '70s, and a handful from the '60s. Only more on the avant-garde or experimental side, compared to more well known stuff that I dig now. Starting in college, you should be getting over your adolescent insecurity about needing to prove how obscure and unique your tastes are, and that's the only real change in my vinyl-buying habits — who the groups are, not the larger reasons of selection and price.

Some of that stuff was available on CD back then, but it was expensive. On the CD racks at Tower Records, there were several columns of just Frank Zappa, but they were closer to $20 than $15. If you dropped by any used record store, though, you could find them used and in good condition for about five to ten bucks.

And other material was either not released on CD, was out of print, or was otherwise damn rare to find on CD. Yet I had no problem finding a couple albums and a single by Snakefinger, the guitarist who frequently collaborated with The Residents. But unlike the most famous obscure band in history, Snakefinger was actually a working musician instead of a performance artist, and was a superior songwriter and performer.

As an aside, if you're looking for something unheard-of, but can't stand how weird the typical weird band sounds, check out his stuff from the late '70s and early '80s. He recorded a cover of "The Model" by Kraftwerk that's more uptempo and danceable yet also more haunting than the original. (None of his own music is very dance-y, BTW, in case you're allergic to moving your body.)

Anyway, it struck me as odd that someone would be into vinyl for practical reasons. There really is a lot out there that can be had for cheap if you buy records, without compromising sound quality.

It's not an analog vs. digital thing either. Tapes are analog, but they sound pathetic compared to either records or CDs. Video tapes are the same way. Does that make laser discs the next target for hipster status contests? If so, better hit up your thrift stores soon before they're scavenged by the status-strivers. At the same place today, I found a perfect copy of the director's cut of Blade Runner on laser disc for five bucks. Don't know when I'll be able to actually play it, but...

March 25, 2014

Don't improvise in period movies

Got around to seeing American Hustle tonight, and liked it better than most new movies I see. One thing took me out of the moment several times, though -- the improv scenes.

In a period movie, actors should stick close to the script because they are unlikely to be able to improvise and maintain the period's authenticity on the fly. When the acting is more spur-of-the-moment, it will come from the actor's gut, which is tuned to the here-and-now.

There was a scene where Irving Rosenfeld asks "Really?" in a slightly flat and miffed voice, in response to someone else's ridiculous behavior. That's too specifically from the 2010s. At the end when Richie DiMaso does a mock performance of his namby-pamby boss, it's so over-the-top and laughing-at rather than laughing-with, that it felt more like a frat pack movie from the 21st century. Ditto the scene where DiMaso is trying to convince Edith to finally have sex, where the dialog sounds like it's from a doofus rom-com by Judd Apatow.

These and other scenes should have been in the outtakes -- wackiness ensues when the actors break character! When they're left in, it creates jarring shifts in tone, as though an actor who'd been speaking with an English accent switched to Noo Yawka for half a minute, then switched back to English (all for no apparent reason).

Improvising and wandering slightly outside of the character's range isn't that jarring. Like, maybe you're just seeing a slightly different and unexpected side of them in this scene. But anachronism is not so easy to suspend disbelief about -- it definitely does not belong to that period. If they turn on the radio in 1980 and it's playing Rihanna, that kills the verisimilitude.

It's even more baffling in American Hustle, where the costume, make-up, and production design have been so meticulously worked out to make you believe you're looking in on a certain time and place. All it takes is a series of distinctively 21st-century improvs to throw that into doubt for the viewer.

March 24, 2014

Field report: Bowling alley birthday party

A party at the bowling alley became a common option by the late '80s, and I went to one today, so they're still at least somewhat common. Yet in 25 years a lot has changed, reflecting the larger social/cultural changes.

First, the guests are dropped off at the bowling alley itself — not at the birthday boy's house, where they'll pre-game before the host parents haul them off in a couple of mini-vans to the bowling alley. I've written a lot about the death of guest-host relationships during falling-crime / cocooning times. Each side is suspicious of the other — the hosts fear being over-run by and taken advantage of by guests, and the guests fear mistreatment (or whatever, I'm still not sure) by the hosts. It is primarily a guest-side change, just like the decline of trick-or-treating or hitch-hiking.

Now they have to meet in a third-party-supervised place like a bowling alley (or the mall / shopping district for trick-or-treating). Before, they would've met inside the host's home.

Then once the kids are dropped off, the parents will hang around to some degree until the event is over and they take their kid back. (Every kid today had a parent present.) Rather than drop them off, go do something for a while, and either pick their kid up later or have them dropped off by the hosts. Again we see parents being so paranoid that they won't just leave their kids alone in the company of the hosts. Even if it's a bowling alley with only nuclear families present in a lily-white region and neighborhood, and during a Sunday afternoon.

(I can't emphasize often enough that, just because you live in a diverse shithole part of the country, doesn't mean that everyone else does. If parental paranoia is palpable in an all-white smallish town — if it is a national phenomenon — then it does not have to do with protecting kids against the dangers of a diverse megapolis.)

At least the parents won't hover directly over their child at a party, but the assembled grown-ups will form a ring around the kids, or form a side-group of chatting grown-ups next to the kids. Line-of-sight supervision remains unbroken. They can't trust their kid to use his own brain because he doesn't have one — intuition requires experience, and helicopter parenting blocks out experience from their kid's upbringing. It's more of a programming.

One of the kindergarten-aged guests politely declined a piece of birthday cake because it had "artificial frosting." He couldn't have known that (for all I know, it was made of natural poison). And it's not as though, if it were organic sugar, it wouldn't wind you up, put you in an insulin coma, rot your teeth, etc.

A lot of these brand-new food taboos that were wholly absent during the 1980s are just roundabout, rationalized ways to fragment the community. Sorry, my kid can't come to anyone's party because you'll have traces of peanuts in something and he'll die. So don't mind his absence from all celebrations that might bind a peer group together.

As for the actual bowling, you see both major societal influences at work — the self-esteem crap and the hyper-competitive crap. Quite a combination, eh? All of our kids are going to compete as though their lives depended on it, yet they're all going to enjoy the exact same rewards afterward.

The bumpers being up is not about self-esteem. That's just helping them learn, like riding a bike with training wheels at first. But everything else is. Like letting them cross the line as much as they want without penalizing them at all. Or cheering after any number of pins get knocked down — don't you think the kids can tell that they'll get praise no matter what they do? And hence they can just BS the whole thing and get full credit? Gee, I wonder if that'll pop up a little later in life...

The hyper-competitive stuff is way more visible and more offensive, though. If they knock down more than 5 pins, they're going to do some kind of victory dance, boys being worse than girls. Congratulations: it should've been a gutterball, but the bumpers let you knock down 7 whole pins. "Oh yeah! Uh-huh!"

And they're so eager to show off in general that they don't care how awful their technique is. Not that I'm even an amateur bowler, but I know that we're not doing the shotput here. The boys are again way worse than the girls on this one. (The winner today among four boys and two girls was one of the girls. She creamed the rest, not from being good, but from not crashing and burning in an attempt to show off.)

Have you guys seen kids throw a bowling ball lately? When I was their age, we stood with our legs wide apart, held the ball from behind with both hands, and rolled or lobbed it as close to the middle as possible. It's a granny-looking move, but when you're in kindergarten, you don't have the upper body strength to throw it in a more normal way. Heaven forbid you teach that lesson to kids these days, though. They're going to prove that they can do it. Only not.

They carry the ball with both hands near their chest, running up to the line with their left side forward (if right-handed), and then heaving or shotputting the ball with their right hand, turning their upper body to face the pins when they're near the line. This must be the worst way to release a bowling ball, and if the bumpers were not up, every one of these releases would go straight into the gutter. Not meander their way into the gutter — like, not even halfway down the lane, it's already sunk.

One kid did this with such enthusiasm that after shotputting the ball, his upper body carried itself forward over his feet, and he landed on his hands and knees — over the line, every time.

So, who cares if they're not trying to achieve the goal that the game assigns them? They're showing how eager they are to display intensity ("passion," later on), and that's all that matters in a dog-eat-dog world. The rules can be bent or changed on the fly, as long as the most intense person wins. After all, the parents aren't correcting or penalizing them.

One final odd but sadly not-too-surprising sight: the setting was a college student union, yet there were only two groups bowling, both family-and-kid-related. There were a handful of students playing pool for about 10 minutes, then that was it, no replacements. No students in the arcade area. On the other side of the rec area was the food court, which was closed on Sunday but which still had about a half-dozen students spread out.

Doing what? Why, surfing the web — what else? Most had a laptop out, one was on a school-provided terminal, and one girl was reading a textbook or doing homework with her back to everyone else.

If you haven't been on a college campus in awhile, you'd be shocked how utterly dead the unions are. Like, there are stronger signs of life in a gravestone showroom. Most of the students are locked in their rooms farting around on the internet / texting or video games. The few who venture out go to the library or the gym, where they can be around others and within the view of others, but still hide behind the expectation that you just don't go up and interact with people in those places. Unearned, risk-free ego validation — what's not to love for Millennials?

March 21, 2014

Another path from helicopter parenting to egocentrism

I'm visiting my nephew and got dragged into playing Legos with him. "Dragged" not because I'm heartless, but because I want to keep sufficient distance in case I need to assert authority. Remember: being your kid's friend undermines your authority. Legos are the kind of thing he should be playing with his peers, not with a family member who's 27 years older than him.

At any rate, he's so excited to show you this thing he made and that thing he made, and what this design in the booklet looks like, and what that one looks like. I found myself the whole time saying "cool..." or "man..." or staying silent. All you get is agreement from family members, even if they don't really think it's cool. Family members have to treat each other more nicely than if one of them were an outsider.

It's your peers who would pipe up with "BORRRINNG." Like, "Hey Uncle Agnostic, you wanna watch Sponge Bob?" I can't tell him, "Nah man, that shit's boring." But one of his friends hopefully would. "Aw c'mon, change the chanelllll. Sponge Bob sucks." Then they'd engage in a little back-and-forth, give-and-take, until they compromised.

With family members in an era of family friendliness, the grown-ups tell the child that whatever they like is awesome. No need to change, improve, or compromise your interests and tastes.

This ego sheltering can only last so long, though. What happens when the kid starts to interact with his peers at age 25 or whenever it is with you damn Millennials? His whole formative years up to that point have prepared him to expect that other people will find his interests fascinating and his tastes impeccable.

Then, SLAM — your peers shrug off many of your interests and find your tastes average. That's typical, and not the end of the world. But with no preparation for it, the ego faces this challenge in such an atrophied state that it gets utterly demolished.

To pick up the pieces: "Well, what do those idiots know about awesome anyway? They're probably just jealous. I don't need their confirmation anyway." Now they're headed down the path of social withdrawal and misanthropy. They'll grow suspicious of their so-called friends who don't share 100% enthusiasm for their interests.

Parents in the '70s and '80s used to view their job as preparing their kids for the tough and unpredictable world out there, not insulating them from it. It'll slam into them at some point, so you might as well make sure they've grown able to withstand it. Parents stayed out of our lives even when we were children. By encouraging us to go out and make friends on the playground, or at school, or around the neighborhood, they helped us discover the shocking reality that not everybody is as interested as we are in the stuff we're interested in.

That not only taught us to negotiate and compromise with someone who wasn't on the same page as us, but also to seek out new friends who were closer to us. That way we have our close friends, with whom we don't have to struggle much just to get something done, and other friends or acquaintances with whom we have to make more of an effort to do things — but we don't shut them out because of that. Each side goes back to their closer circle of friends afterward and engages in some good-spirited gossip about how weird the other side can be sometimes.

This is a separate effect from over-praising the kids' efforts and output. That shelters their ego about their capabilities. This is about what gets their attention, what motivates them, their interests and tastes. It's much closer to the core of their identity than their capabilities, so that questioning it is far more likely to be perceived as doubting who they are as a person.

That will trigger a much more desperate defense: "What do you mean, you don't like Harry Potter? Here is a PowerPoint of the Top 10 reasons why you must, unless you're a big fat stupid idiot."

Never having your tastes questioned — not until well into your late teens, anyway — leads to another major problem that you see with Millennials. They can't separate objective and subjective discussions about something they like. It all boils down to the subjective. The objective is only a means toward that end, as though an objective argument would force them to decide one way or another on the subjective side of things.

To end with an example, most of them like music that is not very musical. That is an objective claim, easy to verify. If it's what floats their boat, I guess I'll just have to consider them lame, and they can look at me playing a Duran Duran album as lame. But objectively speaking, "Rio," "Save a Prayer," "Hungry Like the Wolf," etc. are more musical. More riffs and motifs, more varied song structure (intros, bridges, outros), more intricate melodic phrasing, richer instrumentation (you can actually hear the bass), instrumental solos, greater pitch range in the singing, more skill required to write the instrumental parts, and so on.

I know some Boomers who see that spelled out, and compare what Pandora says about their Sixties faves, and respond self-deprecatingly with, "Meh, I guess I get turned on by simplistic music then!" as opposed to fancy-schmancy music. Millennials get bent out of shape, though, as though "logic has proven my tastes inferior — must re-inspect the logic."

But that's what happens when your tastes rarely get questioned during your formative years. You don't appreciate that there could be two separate ways that this could happen, one objective and the other subjective. In fact, being told that your favorite TV show is boring would probably have introduced you to the objective side of things, when you asked them why they felt that way. "I dunno, it's like none of the challenges the characters face actually matters. The motivation feels empty." OK, they wouldn't phrase it that way, but you know what I mean. Usually their response would amount to no more than, "I dunno, it just sucks."

Ancestry in India revealed more by dance than by language

This continues an earlier post, which gave evidence that traditional dance styles are a better predictor of genetic ancestry than the language spoken.

Recall that Hungarians speak a non-Indo-European language, yet their folk dances place them squarely within the Central European -- and even Indo-European -- norm of step dancing. The style is characterized by fancy, percussive footwork and by keeping the body's vertical axis more or less in place.

Language is a little too utilitarian of a trait to be free from cultural selection pressures, hence the handful of cases like the Hungarians who adopted the language of genetic outsiders for one good reason or another. (To trade, to serve as clients to their patrons, to move up the status ladder dominated by foreigners, and so on.)

Dance styles do not play such a utilitarian role in people's lives, and there is a very strong sense of "this is how the dance is done," so they are more conservative against selection pressures to adopt foreign traits. Cultural and genetic ancestries will tend to overlap, but they overlap much more closely when dance is singled out as the cultural trait, and not quite so closely when we look to language.

Now for the picture on the other end of the Indo-European world. There are two major language families represented in India: Indo-European, brought by one branch of the, er, Indo-Europeans, and Dravidian, which better reflects the state of things before their arrival. Here is their distribution today, with the I-E languages in yellow and orange, and Dravidian in green:

[Map: language families of India, with I-E languages in yellow and orange and Dravidian in green]

Judging from language would lead us to expect I-E gene pools everywhere but the southern third of the country and a fringe along the Himalayas. But when we look at the frequency of key genetic markers of the I-E migrants, such as lactase persistence, we see them confined more tightly to the northwestern regions (see this broader discussion by Razib):

[Map: frequency of lactase persistence, concentrated in the northwestern regions]

The Indo-Europeans were (agro-)pastoralists, so they would have been under strong selection pressures to get more calories out of milk through metabolizing its sugar. Before the I-E folks showed up, there was a large-scale sedentary civilization based on agriculture in the Indus Valley, and you don't need to digest lactose if you're subsisting on grains. So, the blue regions above are more likely to be the descendants of that pre-IE civilization, driven somewhat to the south and east by the I-E pastoralists who settled in the northwest.

Notice that the language map vastly overstates how genetically Indo-European the Indians are. However, treating language as a utilitarian trait rather than a neutral one, we can infer that large swaths of the Subcontinent that are genetically not-so-Indo-European have adopted the languages of foreigners for one good reason or another.

This disconnect is also evident in outward appearances. Take an eastern group such as the Bihari, who speak I-E languages. They have fairly dark skin, noses that are wide at the bottom, and low-lying nasal bridges. They would not look out of place in Southeast Asia or the Pacific Islands. Contrast that with the relatively lighter-skinned and hawk-nosed people of Pakistan, who would not look so out of place in Iran.

But external appearance is not a cultural trait, so let's turn to dance. Of the roughly 10 canonical forms of Indian classical dance, only one is native to the northwest -- Kathak (see an example clip here). It is a spirited step dance straight out of the Indo-European playbook. In fact, it looks a lot like Flamenco (example clip here for those living under a rock), which was carried into Europe by the Gypsies, who also speak an I-E language and who also hail from northwestern India -- albeit 1,500 years ago. The common cultural ancestor of today's Flamenco and Kathak must go back even further. Here is a clip of a dancer from each style comparing and contrasting the two.

All of the other classical dances of India belong to the blue regions in the lactase map. They ought to look quite different from Kathak, and they do: the movements are slower, limber poses are held as though the dancers were sculptures, and bending the body's vertical axis out of alignment is common. This is what most of us think of when we hear "Indian classical dance." They are also likely to feature costumes in a variety of garish colors, ornate headgear, and masks or make-up with exaggerated expressions. All of these features may tie together in a narrative / opera form, not just dance by itself. See this clip of the Bharatanatyam form, and this clip of the Kathakali form.

Those features place almost all Indian classical dances with those of Southeast Asia, from Thailand down into Bali. In fact, some of the distinguishing movements would not look out of place in a crowd of people practicing Tai Chi, nor would the costumes look so strange to a crowd used to Chinese opera or Japanese Noh theater.

Once again, dance styles are more faithful informants about a group's genetic ancestry than language is. You may adopt an Indo-European language in order to trade with, apprentice under, etc., a newly arrived powerful bunch of foreigners. But when it comes to expressing your race's identity in a communal setting, the dancing will be Dravidian.

March 19, 2014

How superficial and bratty were the Silent Gen as youngsters?

A recent post looked at how Millennials (the females anyway, but probably also the males) have a shocking level of superficiality regarding the opposite sex. Girls don't want a guy because he's "cool" but because he's "hot." I attributed this to the helicopter parenting and cocooning environment that they grew up in -- social avoidance and distance lead you to only notice and value people's appearances.

Did something like this happen during the previous heyday of cocooning?

The Mid-century was part of the Great Compression, when competitiveness and squeaky-wheel entitled-ness were steadily falling. So, young people back then -- the Silent Generation -- did not live by a social code of dog-eat-dog. And yet, their formative years were part of the Dr. Spock / smothering mothers approach to treating children, which seems to be a feature of cocooning and falling-crime times (cocoon the children for their safety, no matter what it turns them into).

Hence, we expect them to look like Millennials, only in a way where everything is not a status contest.

Recall this post about how Mid-century singers and actors / actresses were required to be quite attractive. Throw in the election of JFK, and even politicians fit into that pattern, to a lesser extent. That had totally changed by the Eighties, when major entertainers looked more like Bill Murray, Sigourney Weaver, Phil Collins, and Ann Wilson from Heart. And our President looked like Ronald Reagan.

Since the '90s, it's swung back toward the Mid-century norm of superficiality, with Justin Bieber and Katy Perry as today's major entertainers (or attractive TV news readers, or attractive sideline reporter chicks, or attractive spokeswomen for breast cancer, or...). I don't know how good-looking Obama is perceived to be, but he was elected on superficial grounds -- he's got dark skin, so it'd be like My Cool Black Friend running the country! Don't bother checking the individual in question to see if he really is the cool-black-friend type or not... And Republicans are no different, with Mitt Romney and Sarah Palin being much better looking than major aspiring politicians from the '70s or '80s.

What about brattiness among Silents? I haven't gone through all the major hits of the Mid-century, but I can think of two just off the top of my head: Lesley Gore's back-to-back hits of 1963, "It's My Party" ("and I'll cry if I want to, cry if I want to, cry if I want to") and "Judy's Turn to Cry". The first hit #1, the second #5, and both made the year-end Billboard charts.

You might object that these songs were aimed at 14-year-olds, who would've been early Boomers, not Silents. True, but I'm talking about young people during the Mid-century more than about a specific generation. If that included some early Boomers toward the end, that doesn't change the focus.

Finally, check out this sit-com / product placement special from 1952, "Young Man's Fancy". The Silent Gen daughter, Judy (that name again), acts like a spoiled brat toward her mother, though her callous and dismissive attitude is more calm and carefree than the hostile tone that her Millennial descendant would take.

When she hears that her brother's friend from college is coming over for dinner, she gets all disgusted by the sound of his name (Alexander Phipps) -- with a name like that, he's bound to be an ugly geek! He turns out to be good-looking, so she naturally responds by cursing her brother for not warning her that he was bringing home a hot guy (or whatever she calls him). Now she has to make herself up with no time. When she yaks on the phone with her friend about him, it's only his cuteness (or whatever the slang was) that she spazzes out about.

That spastic reaction reminded me so much of how Millennial girls respond to the Random Hot Guy type. His appearance is the only thing they notice, not his character or other appeal. They only want to get noticed and thereby ego-validated by the Random Hot Guy, not actually open up to, get close to, or do anything with him. And they flip out when they see one, as though they hardly get to enjoy the opportunity. Because everyone is so sheltered and doesn't run into each other so often? Or because by choosing only based on looks, there just aren't as many who make them feel butterflies in the stomach? Beats me.

Eighties babes were more collected around guys, whether because it wasn't such a rare thing to be next to and interacting with them, or because all sorts of guys appealed to them. It was also ultimately uncool to be seen as a spazz, so they must have developed greater skill at composing themselves around others.

I realize that the Fifties sit-com is not the best acting, is not a documentary, and was produced mainly for product placement. (Gee, how neat is it to be watching the program on an ELECTRIC computer screen?) Yet the character type they went for to portray "kids these days" was a callous, superficial airhead. And I'm sure the actress had experience with that type of person, at least from observing and interacting with her peers, and perhaps from personal inclination too.

Teenagers in the Eighties were not shown that way -- they were more concerned, thoughtful, and interested in getting to know what made someone tick. I get a similar impression of youngsters from the Twenties based on Fitzgerald's short stories, and plot synopses of silent films about the Flaming Youth of the time.

In both eras, they had an anti-authoritarian kind of attitude, but that is not callous dismissiveness directed toward all people who are not hot. And in both eras, young people were all about having fun and living life -- and good looks may only get you so far along that path. Wanting to have fun in a social setting makes young people focus more on what others are like inside -- are they an instigator or a killjoy?

Clearly, "further research is needed" in the matter, but even this cursory look has turned up more evidence that you would've expected for a superficial and bratty mindset among young people during the Mid-century.

Forget it Jake, it's Malaysia-town

The near-complete inability of the nations of eastern Asia to share what information they may have and to cooperate during an urgent, life-and-death matter should temper the enthusiasm that many Westerners feel for glorious Asia — what with their high IQ levels (by global standards) and their low homicide rates.

Rather, this groping-through-the-labyrinth of an investigation should serve to revive an older description — "the inscrutable Oriental." It's all about keeping your guard up, wearing an unreadable stone face, and covering your ass / saving face above all else.

And it is not only in lending a hand that the Asian is stingy. Merely requesting help from others is seen as a sign of weakness, incompetence, "Why you so lazy?!" etc., and must only be resorted to long after it has become obvious that you're not as all-powerful and all-knowing as you'd thought. The level of hubris among Asians is astounding: no matter what the calamity, the in-over-their-heads team is bound to respond with, "Back off, man — I got this."

Why any Westerner would want to import large numbers of denizens from a black hole of trust is beyond me. But it probably boils down to them being nerds who live in the abstract and the hypothetical, where BRAINS + DOCILITY = MAX STATS, and who have little connection with the real world, where these ghost people are among the worst neighbors and citizens you could ask for.