April 22, 2014

Transcendence: A provocative character study, not a showdown between man and machine

After two prefatory posts on the wider context of responses to the movie (here and here), we can now get on with the actual review of Transcendence. This will be on the long side because I'll be exploring many of the ideas that the movie brings up, in addition to reviewing the movie itself.

There will be some plot spoilers, but they will help with the larger goal here — to reframe your expectations so that, if you decide to see the movie, you won't feel like it was a bait-and-switch, and can simply enjoy the movie for what it is. It is not an action-driven, galactic-stakes showdown between a mad scientist and the forces of humanity, but rather a human-scale character study of the central players and their motives that might push us over the brink toward a strange, untested technology and way of life.

Let's make it clear at the outset: Johnny Depp is not the star or protagonist, and was only billed that way to "open" the movie — to provide a sure thing that would draw in audiences on opening weekend (and that didn't work very well).

From the outset he is shown to be a man of inaction, who prefers to avoid the limelight and toil away on mathematical proofs that only three people in the world will ever read. When he is roped into addressing an audience during a TED Talk-style fundraiser by his wife, he makes it clear that he finds it boring or beside the point to ponder the whole "how is this stuff going to be used?" side of things. He is a hardcore systematizer who only wants to understand how machines work, and how a sentient machine might work. It is pure research, not applications, that motivates him.

When he begins dying, it is not his idea to upload his consciousness to a computer, let alone to the internet. That is his wife's idea, once more, and he goes along with her plan, once more.

Thus it is Evelyn, the idealistic, starstruck, save-the-world wife, who is the film's protagonist. She is the one who prods her husband's project toward applications that will heal the world; the one who brings up the idea of uploading his consciousness to a computer; the one who blithely rationalizes away any objections to it (it's no different from uploading an mp3 file to your iPod); the one who forcefully pushes the plan forward, and who is the most vehement about the cyber-consciousness being "him" rather than him-plus-something-else or no-longer-him; the one who supervises and executes the plan to buy up a small town in order to build their underground headquarters and above-ground solar power array; the one who grapples with the rightness of her beliefs and the consequences of her actions; and the one who, after deciding that she has done wrong, volunteers to become infected with a computer virus so that she can pass it on to the cyber-consciousness and disable it, atoning for her sins.

And unlike her husband, Evelyn is portrayed as an emotional and ambitious creature throughout the movie. Y'know, the kind of person who makes the major choices that steer the direction of the narrative.

I was surprised and fascinated by this inversion of the standard tropes of the mad scientist and wet-blanket wife. It's not the monomaniacal mad scientist who's going to bring about the apocalypse, who's going to use the technology for world domination, and so on. And it's not his wife who will continually nag him away from his work and warn him against the dangers of melding man and machine. And it's not even the absentminded professor whose gizmo-obsessed short-sightedness will lead him right over the edge of the cliff and pull the rest of the world along with him. Nor will it occur as the culmination of a deliberate plan that has been in the works for some time.

Rather, a spur-of-the-moment decision will be made under pressure — either upload Will Caster's consciousness, or he dies for good in a few weeks. The scientist is just going along with what seems like the only plan that allows for his basic self-preservation, and is not doing so eagerly or as a stepping stone toward some larger self-aggrandizing goal. The person who comes up with the idea and advocates the most strongly for it will be an emotional creature with deep personal biases — she is desperate to find some way to keep her husband alive, both because she adores him as a husband and because his research holds the key to her ambition of healing the world.

Naturally, then, she will prove to be the greatest obstacle for the parties that want the cyber-consciousness shut down, who fear what it might do if left to its own whims and wielding such power. What they consider prudence would kill off not only her husband but all hope of realizing her heal-the-world ambitions.

We've seen such overly protective behavior before among female characters who have created a monster, but typically they are mothers who produce monstrous sons, yet who are still governed by Mama Bear protectiveness against the forces of good who want their sons dead. Now we get to see the other dark side of womanly devotion — covering for not just her husband, but a husband whom she has created. Undoing him would be more than unfaithful: it would be an admission that she made the wrong decisions during her creation of him.

Throughout the film, Rebecca Hall plays Evelyn sympathetically, rather than as a caricature of the devoted wife. This natural approach convinces the audience that any loving woman could find herself in her position, and makes the story all the more disturbing on reflection.

As for Evelyn's accomplice, her husband, many reviewers have complained about how flat and unemotional Depp's performance was. Like, what were they expecting for the character of an arch-computer geek — Boy George? Once it's clear that he's not a power-hungry, resentful, or malevolent mad scientist, but a guy who says he just wants to understand machine consciousness (NERD!), you should not expect emotion. You ought to expect a flat delivery from a recluse. Maybe they thought he should at least behave like an animated paranoid such as Ted Kaczynski, but that would be confusing him — the tunneling-away researcher — with the technophobic terrorist group that assassinates him.

The reviewers wanted someone more charismatic like Leonardo DiCaprio's character from Inception, but while that makes for greater drama, it takes away from plausibility. Nothing wrong with that if the tone is more what-if ("willing suspension of disbelief"), but when the tone is speculating on where current trends are taking us, it's better to favor what is plausible. And a charismatic computer nerd is not easy to swallow. In real life, it probably would be someone more like the nerd's emotional, ambitious, do-gooder wife who would make a snap decision to fuse man and machine, if it served her greater vision. The tunneling researcher has no grand vision — he just wants to be left alone to tinker with his ideas.

The husband's flat monotone also makes for a more interesting approach to the narrative of man transforming into machine. Like, what if he's 90% robotic already? And what if the rest of society is still about 80% robotic itself, more comfortable plugging their brains into their digital online devices than taking part in human activities? We're not exactly crossing the Rubicon anymore. Would uploading our consciousness to a computer be like a frog that is slowly boiled alive? For folks who are as flat and monotone as we are today, it just might.

Ultimately the inactive husband redeems himself by choosing to upload the virus from his wife, who in doing so is atoning for her own sins. Up until the end, though, it is not clear how much of the cyber-consciousness is the original Will Caster and how much is the computer intelligence already installed on the machine. This is another reason why Depp's flat delivery works so well — if he had been emotional as a flesh-and-blood human being, it would have been obvious that the monotone cyber-consciousness was the machine rather than him. A flat delivery in both stages leaves it more ambiguous, keeps us guessing about the cyber-thingie's true nature, and leaves us with a more disturbed feeling from the uncertainty of it, lying in the "uncanny valley."

But choosing to bring about his own downfall is presumably something that only a human consciousness would do, proving that at least some of the original person was still in there the whole time. And true to his original personality, he does not plan out the computer virus idea and set about achieving his goal. He just goes along with what he believes is the wise plan of action thought up and advocated for by his emotional wife.

As far as I know, this conception of who the players will be, and of what motives will drive them, is original in the heavily colonized niche of "when man and machine first become hybrid." At least judging from the examples that someone who isn't obsessed with the genre would be familiar with. It is a refreshing and stimulating approach that was unfortunately disguised in the ad campaign by the typical tropes about mad scientists and societal annihilation.

Reviewers should have kept a more open mind, though, once it was clear who the protagonist was and what her motives were, within the first 15-20 minutes of the movie. "Hoodwinked by yet another ad campaign — why do we continue to believe them?" should have been their response. The campaign was just there to draw in audiences who want more of the same junk, rather than ask them to take a chance on a totally new approach to man-meets-machine. I don't mind if a smart and original set of ideas has to sneak in through a Trojan Horse ad campaign about evil scientists, if we couldn't enjoy it at all otherwise.

April 20, 2014

Can today's reviewers remain clear-headed when a movie frustrates their hardened expectations?

In our 24-hour news stream culture, critics and audiences alike seek out information about upcoming movies, months in advance. By the time a movie is released in theaters, their expectations are so hardened that any deviation will deal them a major blow of cognitive dissonance. And rather than adjust in a humble way — "Huh, this is very different from what I was expecting, but let's go with it" — they follow the standard human programming and belittle the movie instead.

It's not just that it has failed to live up to their expectations — that happens all the time, and those expectations might not have been terribly high in the first place. It's that it has turned out to be of a different nature than they had expected, whether they were deliberately misled by the ad campaign or they were overly eager to form preconceived notions of their own, to alleviate their OCD fear of uncertainty.

When the viewers construe a movie as a bait-and-switch scam, or a glossy apple with a slimy worm inside, or a Trojan horse, they will naturally feel disgust, immediately vomit the product back up, and warn others to stay far away from it. This reaction of disgust, which pans the movie in black-and-white terms, goes far beyond how they would respond if it had merely been disappointing or not-so-good.

But, just because a movie's ad campaign and industry buzz turned out to be misleading, doesn't mean you can't still enjoy it. In fact, that's what you ought to expect — that the packaging will try to appeal to the lowest common denominator, to maximize butts in seats. If you thoughtlessly accept the packaging devised by high-priced ad agencies and Hollywood publicists, then you are a naive fool. Especially if the campaign leads you to expect something mind-blowing — you know what they say about something that seems too good to be true.

I know — shame on the advertisers for framing the movie in a different tone or genre than it actually will be. Still, get used to devious advertising, and be open to being pleasantly surprised when it goes somewhere you weren't expecting. Otherwise you'll spazz out instead of enjoying something like Man of Steel (which I reviewed, along with the spazz-fest, here).

I don't think people felt such stinging disappointment with movie releases back before everyone developed OCD and the need for micro-forecasting, and before they became so trusting of the propaganda put out by faceless bureaucracies (whether corporate or governmental).*

All worth bearing in mind when you try to use reviews as a guide for what to see, or to inform your own expectations.

This has been another prefatory post to my review, hopefully up today, of Transcendence. Each time I sit down to write it, there's another layer of culture-smog that needs to be blown out of the room first. Perhaps there is more to say about the reaction to it, and what that reveals about the state of our culture, than about the movie itself (but I'll do that too).

* These abnormalities are symptoms of cocooning syndrome. People with zero social safety net are much more easily destabilized by small perturbations to their plans — there's little slack in the system when you're the only one in it. And if you are too creeped out by other people to interact with them, including your own white middle-class neighbors, then you look to a larger-scale authority to mediate and control your relationships with others. True, you feel more like a slave, but more importantly you don't have to interact with other people — cuh-reeeepy!

April 19, 2014

In going from director of photography to director, focus on action

When a cinematographer decides to try his hand at directing, his first film should probably not involve much narrative or conceptual complexity, given how heavily visual and visceral his training and experience have been.

During the late 1980s and early '90s, Jan de Bont brought style into the summer thriller genre by contrasting shadowy settings with bright, warm lights, usually from a neon or other artificial source. This choice made the chiaroscuro effect look and feel distinctly modern, showing that striking contrasts of light and dark are not the kind of thing that you could only see by carrying a torch through a cave, or lighting candles after sunset.

Go back and see how much more fascinating these movies look compared to the typical entries in their genre: Die Hard, Black Rain, Flatliners, The Hunt for Red October, and Basic Instinct.

When he took control as director in 1994, he chose a project whose source of drama can be summed up very simply: "Pop quiz, hotshot. There's a bomb on a bus. Once the bus goes 50 miles an hour, the bomb is armed. If it drops below 50, it blows up. What do you do? What do you do?"

Although the early scenes in Speed show de Bont's look and feel (see the top shot below), style alone cannot support an entire movie, and after those initial scenes there is little emphasis on striking lighting, color, shallow focus, and shot angles, except for the very end (see the bottom shot).


However, the focus does not shift toward dialog, concept, and character development. It sticks with the visual and visceral, with cookie-cutter character types, but in a way that can sustain our interest — creating a sense of panic and menace, and throwing obstacles in the way of the protagonist's attempts to regain control of the situation. That way we feel cathartic relief when he finally succeeds in escaping from the villain's trap and rescuing the hostages.

So, was de Bont successful at directing an edge-of-your-seat thriller flick? No doubt about it: my friends and I must have returned at least a dozen times to see it over the summer of '94. If there was ever a lull or uncertainty in the day's plans, the default was "Wanna go see Speed again?" I watched it on DVD last summer and agreed with my teenage self, a rare exception when I re-evaluate the pop culture of my adolescence.

The point here is not to survey the successes and failures of every DP who has taken his turn at directing, but to detail a single relevant example from the not-too-distant past.

The relevant case today is the release of Transcendence, the directorial debut from Wally Pfister, whose eye has given the popular thrillers of Christopher Nolan their own striking chiaroscuro cinematography. I thought it was a provocative character study of the major players who will usher us into the techno-apocalypse, while bringing up a bunch of intriguing ideas about how things might unfold if a person's consciousness were uploaded to a computer and then to the entire internet. But it was not the gripping, edge-of-your-seat thriller that the ad campaign had led us to believe.

His efforts to explore characters and concepts proved much more successful than I had expected, given his background as a DP rather than as a screenwriter (a role that better prepares one for directing). Still, while they are rich enough to carry the movie, it lacks the heart-pounding action that allowed Speed to pull in the audience almost entirely without dialog and character arcs.

Pfister took a huge risk by shifting focus away from his comfort zone and toward the narrative, and exceeded my expectations. But I wish he had taken a safer project, driven by action, for his first tour in the director's chair, and come around to a conceptual sci-fi narrative after becoming more comfortable with the director's role. Such a difficult change of roles should not also have to unfold in unfamiliar territory.

Critics are heavily panning Transcendence, for reasons I don't get — perhaps it wasn't the nerd-gasm they were hoping for, or it came as such a downer following in the wake of Her, which reassured the critic nerd that it was cool to fantasize about using your iPhone as a sex aid, and that there was even something cheerful about the whole affair. Yeah sure, it was no Videodrome, but the 19% rating at Rotten Tomatoes is childish and shameful for the reviewers. It's like scrawling UGLY BITCH on a girl's locker just because the date wasn't as orgasmic as you'd imagined it would be.

I'll put up a review sometime later today to try to correct the major misunderstandings I've read. But there is a grain of truth in what the tone-deaf critics are whining about, and I figured an introductory post was worth it to explore why the movie was not as successful as other cinematographers' attempts at directing.

April 18, 2014

Would the needy turn down your obsolete computer from 10 years ago?

I thought about donating the old Gateway tower that I found languishing in the basement, since it still runs fine. It's running Windows XP smoothly on a Pentium III 800 MHz processor, 20 GB hard drive, and 256 MB of RAM (and another 128 MB can be bought on eBay for a whopping $3, shipped). It has a fast CD-ROM, 3.5" floppy drive, Zip drive, 2 USB ports, and ports for modem and ethernet.

Not only does it do everything that a normal person would need, it is backwards compatible, able to make use of old things that may still be lying around the house, like floppy disks.

And yet such a system would be rejected by all computer donation centers, who preen for their do-gooder audience about how they're keeping bulky electronics from choking up our landfills, helping out others in the community who can't afford or otherwise don't have access to computers, and so on and so forth.

Why? Because their vision of "bridging the digital divide" means giving the needy a set-up that's within striking distance of the computers that the well-to-do use. It doesn't mean giving them something that meets their needs for free. After all, on the Gateway system from 2000, how are the poor supposed to stream pornography in HD? Or the all-important function of hardcore gaming? Giving them a system like ours would only perpetuate the inequality gap in cyber-distraction.

The first hit I got for middle to upper-middle class Montgomery County, MD, was Phoenix Computers -- "reclaim, refurbish, reuse." According to their "what donations can we use?" page, your computer will probably be rejected as obsolete, and harvested for parts, if it doesn't have at least a 2.0 GHz Pentium 4 processor. Talk about greedy.

Even the T40 Thinkpad that I'm using to type, upload, edit, and comment on this post would get rejected — only 1.5 GHz and a variant of the Pentium 3 chip. Yet somehow I've pulled datasets off the internet, analyzed them in Python, drawn graphs of the results in R, and made PowerPoint talks to present them to others, carrying these on a USB flash drive. And garden-variety web-surfing, of course. But, y'know, the computer experts are right — this thing doesn't provide surround sound for when I'm watching cat videos, so it must be time to just throw this wad of junk in the trash.
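For the curious, here is roughly what that kind of lightweight workflow looks like: a minimal sketch in Python, with the URL and column names as made-up placeholders, and the graphing done in Python rather than R just to keep it to one snippet. Nothing about it needs a monster machine.

# A minimal sketch of the kind of lightweight workflow described above.
# The URL and the "year"/"rate" column names are hypothetical placeholders.
import pandas as pd
import matplotlib.pyplot as plt

# Pull a small dataset off the internet (a few MB at most).
df = pd.read_csv("http://example.com/small_dataset.csv")

# Quick numerical summary of every column.
print(df.describe())

# Draw a simple graph and save it, small enough to carry on a flash drive.
df.plot(x="year", y="rate", kind="line")
plt.savefig("rate_by_year.png", dpi=100)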

Was it just the greedy do-gooders in Montgomery County who took such a wasteful approach toward conservation? Here is an all-purpose page with suggestions for those thinking about donating their old computers, and they are only a little bit more forgiving, explaining that only machines no more than 5-10 years old are going to make the cut. But under these more generous guidelines, that Gateway that's running XP without a hitch would still get sent to the scrap yard. Not flashy enough for the discriminating tastes of 21st-century proles.

So, for all their talk about frugality and stewardship, these donation and recycling centers behave more as though they were the producers of Pimp My 'Puter for MTV. Aw yeah, son, Black Friday's coming early this year!

Zooming out to the big picture, this entitled mindset among the lower 75% of society is an overlooked cause of how fucked up our economy is becoming. In the original Gilded Age, wasteful status-striving was only an option for the one-percenters. But now that we have democratized access to debt, along with state-mediated schemes like Cash for Clunkers and Obama Phone, everybody can whine like an entitled little bitch their entire lives, and bury themselves under debt in order to play the status-striving game. Hand-me-downs are for losers, and everyone must be a winner.

Today's economic explosions will be far more severe, on account of how broadly the attitude of wastefulness has infected our society. And so-called donation programs that feed this petty sense of entitlement are only going to make things worse.

April 16, 2014

Is anyone holding onto their digital pictures?

While poking around the dank storage area of our basement back home, I found a nearly 15-year-old tower computer lying ignominiously on its side on one of the shelves. It had been sitting there since 2010 and had scarcely been used since around 2008. When I later opened it up to clean it out, there was thick dust covering all top surfaces and a good amount of the cables. It was a small miracle that, when I first tried to power it on, it started up with only a little coaxing.

While cleaning out lots of old files to free up some hard drive space, I came across what must have been hundreds of image files stored across a few dozen folders. This computer had been my brother's during college, and went back into general family use during the mid-to-late 2000s. So there's a good range of who and what is pictured. My brother's social circle at college, family vacations, holidays, and so on and so forth. Not to mention a good number of digital scans of old photographs.

Nothing mind-blowing, but isn't that normal for family pictures? No single picture would've been a major loss on its own, but family pictures aren't trying to make the individual shot excellent; they're trying to record what our experiences were.

Needless to say, if I hadn't taken an interest in restoring and preserving this dusty old thing, it and those hundreds of pictures would've gone straight into the landfill. I backed up the pictures onto a flash drive just in case the hard drive craps out, and it struck me that we had to buy a new drive for this purpose. There wasn't a master drive that we had been loading digital images onto all along.
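For anyone doing the same rescue job, here is a minimal sketch of that kind of one-off backup in Python. The source and destination paths and the list of file types are assumptions; point them at wherever the pictures actually live and at whatever drive letter the flash drive shows up as.

# A minimal sketch of the one-off photo backup described above.
# The source folder, destination drive, and file types are assumptions.
import shutil
from pathlib import Path

SOURCE = Path(r"C:\Documents and Settings")  # where the scattered picture folders sit
DEST = Path(r"E:\photo_backup")              # the flash drive

IMAGE_TYPES = {".jpg", ".jpeg", ".png", ".gif", ".bmp", ".tif", ".tiff"}

for f in SOURCE.rglob("*"):
    if f.is_file() and f.suffix.lower() in IMAGE_TYPES:
        target = DEST / f.relative_to(SOURCE)        # keep the original folder structure
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, target)                      # copy2 preserves file timestamps

Running it again later simply overwrites the earlier copies, so it doubles as a crude way to refresh the backup.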

Perhaps other families are more OCD than ours (that would not be too hard), but I suspect that most people are not moving their old picture archives from one main "device" to another. And given how quickly the treadmill of planned obsolescence is running, they're not going to have much time to get to know the pictures that are confined to a particular device before cycling on to the next one.

Film photographs are the exact opposite. In our cedar closet, we still have album after album full of film prints, some of them going back to the early 20th century. Photo albums were never housed in a larger piece of technology, let alone one that was subject to such rapid turnover. So it hasn't been hard to keep those archives separate from all the other stuff that comes and goes in a household.

And although I've written about how bland, forgettable, and crummy digital pictures look compared to film, their quick sentence to oblivion seems to have more to do with digital storage media than with digital image capture. If you took pictures with a digital camera but printed them up, they probably wound up in a photo album with the others. Whereas if you took pictures with a film camera but told the photo lab to load scanned image files onto a flash drive instead of making prints, you'll lose them before 10 years is up.

It may seem odd that digital images are vanishing so easily — although less tangible than photographs, they are still housed on a physical machine. But those machines are getting more or less thrown away every several years these days, and even if they're donated, their hard drives are automatically wiped clean before they're passed along.

Forgettable images on disposable machines — how would this world ever go on without progress?

April 14, 2014

White flight and ethnic differences in cocooning

The main cause of white flight is no mystery — the higher rates of violent and property crimes among the groups that they seek further distance from. Hence, in this period of still-falling crime rates, whites have partially reversed their flight from the urban core, and have reconquered large parts of major American cities that would have been no-go zones for most folks back in the early 1990s.

This influx of affluent whites has pushed the earlier black and Hispanic groups out into the suburbs. How are white suburbanites responding, if they aren't part of the movement back into cities? They're moving further and further out into the exurbs.

But why? It's not like the earlier white flight, given how low crime is these days. Sure, it's still higher among their new neighbors, and moving further out would bring them into lower-crime areas. But it's not the huge gain that it would've been 30-40 years ago. Particularly if the newcomers are Mexicans, whose crime rates are not even at black levels to begin with.

What about the newcomers' kids fucking up the local schools? That would certainly worry the parents of school-aged children, but would probably not show up on the radar of everyone else. It should only be the former group moving out, whereas it seems like everybody is packing up.

Fleeing diversity and its fragmenting effects on trust, in search of greater similarity and cohesion? That's more like it, but why is it the resident whites who are leaving, rather than the non-white newcomers? Is the white suburbanites' homogeneity that easy to perturb toward disintegration?

You wouldn't see that in the other direction, with suburban whites invading and driving out suburban Mexicans — other than by paying more money for rents, goods, and services. The Mexicans aren't going to move out just because they sense a creeping up-tick in the level of diversity with all these white newcomers. They're going to make it known that they aren't going anywhere, and that whites aren't welcome.

And not necessarily in a violent or confrontational way. All it takes is them refusing to learn or speak English, refusing to participate in mainstream American culture, refusing to display American cultural symbols, etc. And then proudly displaying their own.

Whites are a lot more wimpy about asserting their ethnic or cultural identity. I don't mean in some cosplay Nazi style, but in something as simple as playing rock music out of their car windows or on their front porch, walking around the neighborhood, and hanging out in its public spaces — particularly in groups. That would give them a group-y physical presence that would not feel palpable if they kept to themselves indoors most of the time, occasionally going out for a jog alone or walking the dog alone.

To the outgoing go the spoils. If you look around a diverse area today, you'll see that Mexican parents are more willing to let their kids hang out on their own, and that the grown-ups are more likely to be hanging out in the public areas of a shopping center instead of pulling right up in front of their destination, darting in and out, and taking off back home. Even where they may be a numerical minority of private residents, their public physical presence can be much greater.

This line of thinking may also explain why some white suburbs haven't been so thoroughly affected as others. The introverted Scandinavian weenies in Minnesota and Wisconsin have gotten slammed a lot harder by all the black newcomers from the Chicago area over the past 20 years. They're just going to keep moving further and further out.

The working-class suburbs of Boston and New York (who can't just out-bid the non-whites) have fared much better, compared to the rest of the country. You can always rely on the garrulous micks and wops, I mean the sociable Irish- and Italian-Americans, to stand their ground, not just as individuals defending their private property, but as whole groups coming together to put some pressure on unwelcome newcomers.

Public life is part of the pastoralist culture of honor that the capable defenders are adapted to, whereas retreating into the private domain of the family is more what large-scale agriculture selects for, outside of urban dwellers (and you let the culture of law deal with any problems that may arise). Naturally the Nords are going to have a tougher time driving outsiders back out, compared to the Celts.

April 13, 2014

Children of divorce, invisible to Wikipedia

I was just reading about Amy Adams on Wikipedia and found out that her parents divorced when she was 11, despite the family belonging to the Mormon church, which places a heavy emphasis on family togetherness (heavier than other religions).

While scrolling down to the bottom of the entry, I expected to see one of those offbeat Wikipedia category tags, like "Mormon children of divorce." Alas. But they didn't even have a generic category tag like "children of divorce," "father-absent people," etc.

They have a category for adoptees, something that most folks reading a person's biography would be interested in knowing. And something that could have shaped the way they turned out as adults. Adoption puts a happy face on the category of "unusual family structures," though. Things looked hopeless for the kid at first, but then they were rescued. Pointing out everybody who went through the opposite — things looked cheery at first, but then it all fell apart — would be a downer.

Wikipedia has all sorts of other category tags about a person's background and upbringing, from ethnicity to religion to being a military brat (like Amy Adams). That's the good kind of diversity — in the Wikipedian's mind, no ethnicity or religion or parental occupation is better than any other, so what's the harm in pointing it out? But whether both your parents were still together when you were going through adolescence... well, we don't have to air everybody's dirty laundry, do we?

And it is becoming "everybody" — see this earlier post about the still-rising rate of children growing up in broken homes.*

However, conveying how fucked-up the world is becoming, and pointing out who specifically has been hit, would go against the prime directive of the neutral point-of-view. You can read about it on a case-by-case basis, and assuming you soak up thousands of articles about living people, the pattern might strike you.

But there will be no larger picture that an abstract category tag could clue you in to at the outset. And no firm sense of there being this whole category of people out there, without a concise tag to reify them as a group. Some things were not meant to be understood, even (or especially) through The Free Encyclopedia.

* The trend for divorce is not the same, and has been declining since a peak circa 1980. But the rosier trend for divorce ignores whether or not there are any children involved, and married couples aren't pumping out babies like they used to. The divorce of a childless couple does not leave a broken home.

April 9, 2014

Planned obsolescence and conspicuous consumption

Stuff isn't built to last anymore, whether it's cruddy pressboard furniture from IKEA that will flake off into a pile of shavings within 10 to 20 years, or an iPhone whose speed has been crippled by an "upgrade" to the operating system that overwhelms it.

And yet, as the popularity of IKEA and Apple testify, people these days not only don't mind how disposable their purchases are — they are eager to feel another endorphin rush from throwing out the old and buying up the new, like a woman who changes her hairdo every three months.

Sadly, you see the same treadmill consumerism at Walmart and Sears, where everyone wants to step up their game, upgrade their rig, etc. It's not just the elites who are effete. Working-class people today are not the honest, making-do folks from a John Cougar Mellencamp song. They act just as entitled, discontented, and in need of cheap junk from China as the rest of the social pyramid.

So, upper or lower class, Americans today don't give a shit if their stuff is unusable in five or ten years. Indeed, that's the way they like it.

Usually "planned obsolescence" is talked about as a supply-side thing, with the producers scheming to trick us into buying things that will be no good before long. But consumers notice that kind of thing, and by now that awareness is so widespread that all you have to do is say "not built to last," and everyone nods along. Notice: they do not stop buying all this shoddy crap, rather they grudgingly accept the nuisance as the small price they have to pay to enjoy the larger benefit of stepping up their game and upgrading their rig, feeling that heady rush more frequently.

This is a sorely under-appreciated source of how crappy things are today. You don't want to think of your fellow Americans as feeding it through their own self-absorbed consumer behavior, and would rather pin it all on the greedy stockholders, managers, marketers, and so on. But those guys can't sell an entire nation what it doesn't want to buy. Same with immigration — you can blame large-scale farm-owners, but what about the folks you know who use a housecleaning service, lawncare / landscaping service, or construction service that's almost certainly employing cheap illegal labor?

The Arts and Crafts movement took root in the late Victorian period, as status-striving and inequality were rising toward their peak of the WWI years. The standard story is similar to today's, where the shoddiness of stuff at the time was blamed on mass production techniques introduced by the Industrial Revolution — something on the supply side, at any rate. Given today's parallels, I'm more inclined to blame airheaded Victorian strivers for spreading the throwaway mindset. Only with such a docile consumer base could the industrialists flood the market with cheap junk.

At the other end, it's striking how sturdy and long-lasting stuff is from the nadir of status-striving and inequality during the 1960s and '70s. Especially for mature industries that can be fairly compared across long stretches of time — like furniture. Those Archie Bunker chairs, cherry dressers, and granny-square afghan blankets are still in wide circulation at thrift stores, and always have been. The "thrift store look" from today is about the same as it was 20 years ago.

For some reason, IKEA futons and plastic rolling cabinets from the Container Store are not making their way in, and likely never will. Nobody has any use for them, and they're going straight into the trash.

April 8, 2014

Displays for digital distraction: Larger monitors, multiple monitors

Back when computers were machines for doing something productive or creative, their monitors were small and compact, whether in a business or home setting. They could have been made much larger, since the same CRT technology was used for medium and large-sized TV sets. But big screens suited entertainment, which is what TV was for, and offered little extra value when it was time to type up a report or crunch some numbers on a spreadsheet.

Here is the original IBM PC, with an 11.5" monitor (and only about 10.5" being viewable), and the original all-in-one Macintosh, with an even smaller 9" monitor (rather prophetic copy in the ad, don'tcha think?):



How did Primitive Man ever get any work done with such crude tools, unable to read emails in HD? How could they have focused on the memo they were typing, without the word processor overwhelming their eyes on a 30" screen? Really makes you grateful for not having to live in the dark ages anymore...

Monitors stayed small for well over a decade, and did not begin to bloat out and take up most of the desk until the mid-to-late 1990s, as they came to be used more for entertainment — first for graphics-intensive video games, then for watching illegally downloaded movies and TV episodes, then for illegally downloaded porn, and then web pages in general once they became more dazzling than black Times New Roman text against a gray background.

The increasingly common widescreen aspect ratio is another sign of computers as entertainment centers.

But one monitor can only get so big — eventually you're going to have to step up your game and upgrade your rig to include two monitors. Awww yeah, son: getting interrupted by emails on one screen, while refreshing TMZ on the other. Creating PowerPoints LIKE A BAWS. Nobody can hold me down now, I'm a productivity beast!

The multi-monitor set-up is nothing more than conspicuous consumption for nerds. In fact, if two are better than one, why stop there? Hook up ten. (Brought to you by the makers of five-blade razors.) Better yet — swiveling around at the center of a 360-degree ring-o'-monitors. And hey, the sky's the limit: install a planetarium ceiling and crane your neck up toward your Display Dome. Endless workspace, endless possibilities!

Forget the fact that, in the old days of a real desktop, they did not bother extending desks out to 10 feet long in a lame attempt to maximize productivity. Having too many separate sub-areas of the desktop makes it hard to focus on the one at hand. About the only task that truly benefits from two separate areas visible at the same time is manually copying a target document onto a blank one, analogous to dubbing cassettes. Otherwise, the world churned right along — and saw greater productivity gains over time — with just one central work area per desk.

Something similar is going on with the phenomenon of "twenty tabs open at a time," as though keeping twenty books open on a real desktop would somehow make you absorb information more efficiently. Or as though playing twenty TV channels simultaneously would make for greater entertainment. In Back to the Future II, that was presented as satire; today it has become the unremarkable reality.

Undoing or not participating in the multi-monitor trend is fairly simple. Finding a monitor under 17" is a lot tougher. Buying them new is out, and even thrift stores will not accept them, so don't bother there. Craigslist and eBay are the best places, although since most of them are CRT (better than LCD in any case) the shipping will not be cheap. Still, it's a small price to pay for something that will last forever and prevent digital distractions from taking over your desktop.

April 6, 2014

A weakening power of helicopter parenting? Another round of little Boomers?

While traveling a bit lately, I've been observing the otherwise invisible nuclear families of today, now that they have to leave their lock-down compounds to go to the airport, or leave their hotel room to grab breakfast.

It's probably too early to tell, but I'm getting a hunch about how small children these days are, or are not, going to internalize the paranoia of their helicopter parents. These are children in early elementary school or younger.

When helicopter parenting paranoia began with new births in the late '80s, there was plenty for parents to be concerned about (which doesn't excuse their over-reaction). Violent and property crime rates were nearing their peak, and for the previous several decades, it had seemed like the world would only continue on in that direction.

Hence, when the parents sealed off their nuclear family from the outside world and began ordering their kids not to do this and not to do that, there was an honest sense of concern coming through in their voice and mannerisms (however overblown this concern may have been). Moreover, this was the only message about the outside world that the parents allowed to get through to their children — primarily by shutting out all other sources of input, but also by choosing only those external sources that would provide a consistent paranoid message to their little dears. "Parental control."

These children, the Millennials, have grown up to be the most frightened and insecure generation in living memory — how else could they have turned out? Everybody who offered them input, or who their parents allowed them to observe, sent the message that the world is too scary and random to venture out into on your own. And their tone of voice was consistently frightened for your safety, not as though they were just making shit up or just trying to spoil your fun. I guess you might as well hunker down in your room and interact with others at most through virtual channels (texting, internet, online video games, etc.).

Now, what's going to happen when these people become parents? They don't have any first-hand experience with real life, let alone the dangerous, topsy-turvy, and humbling parts of it, let alone decades of such experience. When they try to pass on the message of how scary the world is, it will start to ring hollow. Kids aren't stupid, and they can tell what your tone of voice and mannerisms reveal, aside from whatever you claimed in words. At the least, they can tell when you're being sincere and honest, or when you're joking around and teasing them.

Can children also sense which grown-ups have more experience, and which are more naive? If so, they'd react to the dire warnings of their Millennial parents with, "Yeah, and how would you know, you big wuss?" Whereas if they sensed the parent was more seasoned, they'd take it to heart — "Damn, even this been-there, done-that kind of grown-up sounds scared. It must really be dangerous."

However, when the child steps on the other side of the paranoid "do not cross" line, Millennial parents sound more annoyed and upset than they sound concerned and afraid. This may also be going on with later Gen X parents, who are more experienced, but whose memories of how tumultuous the world can be are fading more and more every year. It's harder and harder for 1992 to exert that kind of gut influence that would shake up the parents, when the world has gotten as safe, stale, and antiseptic as it has by 2014.

Thus little kids today are not going to take parental paranoia to heart like the Millennials did. It's just the mean old grown-ups trying to boss us around and spoil our fun, not looking out for our greater long-term welfare. By the time they're in high school, cocooning will have bottomed out, and they'll be able to enjoy a more unsupervised adolescence. And given how low the crime rate will be by then, they'll conclude that all their parents' warnings were either clueless and out-of-touch, or knowingly wrong and intended to shelter them from real life. See, nothing to worry about in venturing off into unsupervised places!

What was the last generation that had this naive attitude toward breaking free from parents, whom they callously dismissed as either out-of-touch or hypocrites? Yep, we're about to see the rebirth of the Baby Boomers, whose defining impulse is calling the bluff of authority figures.*

It's odd how small children these days are more annoying in public places, running around and making noise, despite their helicopter parents trying to make them behave. When the Millennials were that age, they were either not to be seen at all, or were frozen in place. Today's rugrats and ankle-biters seem more appropriate for the 1950s (see any Saturday Evening Post cover showing their frazzled parents, or a crowd of kids running around the house at a birthday party on Mad Men).

We're not into that late '50s environment yet, but you can sense things creeping up to that turning point. For now, we're still waiting for Elvis.

* Gen X has a similar but more practical and mature attitude. Breaking free from parents is good, but you do have to be cautious out there. Yes, parents are out-of-touch, but that's to be expected given what a different world they grew up in. Authority should not be blindly followed, but blithely romping around calling the bluff of every older person who offers you advice, especially if it comes from the wisdom of tradition, is likely to get you killed.

April 4, 2014

Adventures in vinyl-hunting for practical reasons, part 2

In an earlier post I confessed a major sin: I have always bought vinyl records for practical rather than aesthetic or Romantic reasons. Namely, there's a lot out there that is not easily available on CD — finding it is hard or expensive.

I dropped by a Half Price Books today to comb through their rock/pop records, and found five things I've never seen in real life on CD, for under $15 including tax. You can't go wrong for under three bucks a pop.

1. Scandal's self-titled debut EP, with "Goodbye to You." The EP itself was never released on CD. Two of the songs are on a Patty Smyth compilation that I have on CD, and VH1 did release all of them on a CD compilation — though given the date, I'm sure it was a victim of the loudness wars and had all the dynamic range compressed out of it. In fact, the Patty Smyth compilation sounds a bit hollow itself, and it's from the mid-'90s IIRC.

2. Missing Persons' album Spring Session M, with "Words," "Destination Unknown," and "Walking in L.A." It was released on CD, but I've never seen it in at least five years of regular combing of used record/CD stores, nor could they order it from their distributor. I'm guessing the CD is out of print, as it's going for $40-$50 on eBay.

3. A Flock of Seagulls' self-titled debut album, with "I Ran" and "Space Age Love Song." It was released on CD, but I've never seen it, and on eBay it's $10-$20 used. The LP is going for about that as well, so I got an even better deal today.

4. The Motels' album All Four One, with "Only the Lonely" and one of the coolest album covers ever. As with #3, it was released on CD, though I've never seen it, and is going for $10-$20 used on eBay.

5. After the Snow by Modern English, with "I Melt with You." On eBay there are CDs going for $5-$10, which isn't bad, but they also say that it's rare. Short supply doesn't always translate into high prices if there's little demand for it. Can that be, though? It's only got one of the most iconic, catchy, uplifting songs of all time on it! Up until now, I'd been listening to it on the CD soundtrack for Valley Girl.

Come to think of it, I don't believe I've seen any CDs by these bands in the wild, not even a greatest hits, although most of their hits can fairly easily be found on multi-group compilation discs. (I might have seen something by the Motels, but can't recall.)

They had copies of the first two Duran Duran albums, the self-titled one and Rio. I'm sure I'll continue to like those albums more than any of the five that I bought today, but I still passed on them. I've already got them on pre-'90s CDs that sound great (not guaranteed to be in a used CD rack, but not hard to find either). I'll eventually get around to buying duplicates on vinyl just for the different sound, but there are still so many more records to hunt down of things that are damn near impossible to get on CD.

I also passed on some more pop-y choices that I probably shouldn't have, and may have to return to pick up tomorrow: Belinda Carlisle's debut album Belinda, with "Mad About You," and Jane Wiedlin's album Fur, with "Rush Hour." Two or three songs from Belinda are on a greatest hits of hers that I've already got, but I've never seen anything by Jane Wiedlin on CD, not even her one hit on a compilation.

Usually the discussion of vinyl centers on music from the '60s and '70s, before CDs were available. But you'd be surprised how much killer '80s music is hard to find outside of the record crates, at least the entire original album rather than the hits on a compilation.

If it was major enough, they probably released it on CD — perhaps following up with multiple re-issues if it's that important (like Graceland by Paul Simon). But if the band was remembered as a one-hit wonder, then probably not. The heyday of New Wave from '82 to '84 came just as CDs were being introduced, so it was far more likely to have been released on vinyl and perhaps cassette. That's the place to look for it, then.

April 3, 2014

Cosplay in the Fifties: The Davy Crockett craze

The music video for "Come As You Are" by Peter Wolf portrays exuberant everyday life in a small town during the late 1950s, and by the end the whole town gathers together in a crowd as though they'd all caught dance fever from the future.

One of the key examples of "iconic Fifties life" that they bring out is the 10-year-old kid who's all dressed up in his Davy Crockett gear. Not because he just came from a school play about frontier life, but because that's just what boys were into at the time. In fact, the Crockett craze was part of a broader mania for all things Western during the late '50s, among both children and adults.





How similar was that phenomenon to the trend of cosplay in the Millennial era? It was a full costume of the icon — clothing, hat, and prop gun — rather than a t-shirt / backpack / lunchbox with the icon's image on it. The icon was a specific character, Davy Crockett, rather than a generic type like "frontiersman." The children were role-playing as the character, rather than dressing up that way while behaving normally. And it was a mass phenomenon mediated by the mass media, rather than a strictly grassroots sub-culture or counter-culture.

Daily life in the Mid-century was relatively unexciting, especially for children during the heyday of Dr. Spock, smothering mothers, Levittown subdivisions, and drive-ins where the customers physically cut themselves off from each other. Role-playing as a figure from the Wild West gave them an escape from the cocooning culture's insistence on parking yourself in front of the TV set rather than wandering around the neighborhood, clipping your fingernails and brushing your teeth in the proper way, and remembering to drink your Ovaltine.

At the same time, competitiveness was low and falling circa 1960, so kids back then did not make costume competitions and one-upmanship a part of the Crockett craze. And unlike now, there was no pseudo-slut counterpart among girls — attention-whoring being another aspect of the competitiveness of today's culture. It was also restricted to kids who were about to go through puberty, rather than adolescents and young adults.

It seems then that cocooning is what brings this trend into being, and that high or low levels of competitiveness only shape its expression.

I don't remember anything like the Crockett craze during the nadir of cocooning in the '80s. There are pictures floating around of D&D-themed cosplay in the late '70s and '80s, but that must've been damn rare. There are only a handful of gatherings pictured, as opposed to the endless examples you can find of kids in Davy Crockett get-up or of contempo cosplayers.

At the height of Hulkamania, we might have worn a t-shirt like Hulk Hogan's, or played with ninja weapons during the Ninja Turtles craze, but we never got fully dressed up in character for role-playing, let alone dressed that way in an ordinary setting like hanging around the house. Everyday life had enough excitement back in more outgoing and rising-crime times that it wasn't necessary to pretend that you were part of a Wild West culture.

April 1, 2014

The decline of pop music parody when the songs and videos have become forgettable

VH1 Classic just played "Synchronicity II" by The Police. I'd never seen it before, but instantly recognized that it was a quotation of the iconic video for "Dancing with Myself" by Billy Idol, whose personal image Sting is imitating.

Seems like it's been a long time since a well known group made a video that quoted another video. This brief review confirms the hunch. In 2011, All-Time Low made a video spoofing Katy Perry and other major stars du jour, but they were not a very popular group (never appearing on Billboard's Year-End charts). The last clear example was "All the Small Things" by Blink 182 way back in 2000, and even that was a blip, as there were no others from the '92-and-after period. (Needless to say, Chris Rock did not appear on the charts, and I don't recall seeing the video for "Champagne" back then.)

After "Synchronicity II" in '83, David Lee Roth made a more obvious and slapstick parody video for "Just a Gigolo" in '85, the same year that Phil Collins aimed for parody in the video for "Don't Lose My Number." In '91 with Genesis, he included a Michael Jackson parody in the video for "I Can't Dance." There are probably some others out there that I can't think of immediately, and that the writer of the review missed, but that's enough to establish the basic picture.

Those four acts were all major successes at the time, all appearing on the Year-End charts across multiple years, hence the parody videos would've enjoyed high visibility.

So why the decline since the '90s? Well, allusions like these rely on the audience having already been exposed to the original and stored it in memory. Steadily over the past 20-odd years, less and less of the target audience has reliably tuned into music videos -- and music in general. Moreover, music videos have gotten a lot more bland and featureless -- and who's going to remember a forgettable video when it's referred to in a later parody?

By "forgettable," I don't mean "inferior" in a subjective way, but objectively lacking in identifying features -- things that help you explain them to others to jog their memory. Simple memory tests would show how easy or difficult a video is to recall. "Y'know, that video where the glamor models are playing the instruments and the singer is wearing a shirt and tie?" (Actually, he made three videos like that -- the one where the models are wearing pink, or wearing black? And is he wearing a suit or just a shirt?) Try doing that for recent videos -- "Y'know, that one where those rappers are boasting around a pool and a bunch of big booty hoes are shaking their asses..." Oh right, that one.

I think this also explains why parody songs have died off since the heyday of Weird Al Yankovic in the '80s and early '90s. You can't get the listener to recognize the target song if it has no memorable riffs, melody, solo, or distinctive vocal delivery. It can only end up sounding like a parody of the whole genre, whose performers are all interchangeable (emo, crunk, etc.). That would also have been difficult in the '50s and early '60s, when most pop music sounded so similar and lacking in identity.

Thus, the peak of pop parody during the '80s and early '90s suggests a peak in those years for the distinctiveness of both the music and the videos of hit songs.

Related: Ornament is meant to make things more memorable

March 31, 2014

TV audiences in love with annoying children, the second wave

Here is a New York Post article reviewing some of the many annoying child and teenage actors on hit TV shows right now. I haven't seen any of them, so I'm not sure how central they are, but it sounds like they're at least regular cast members, some of them at the core.

If there are so many of them littered throughout popular shows, it must be that the general audience appreciates them, notwithstanding a vocal minority. And even if the general audience doesn't care for a particular character, that's more of a failed attempt at the goal of creating the "prominent child character" that the audience craves. In today's climate of nuclear family-centric cocooning, viewers just can't get enough of watching children.

However, this isn't the only time when you would have been assaulted by annoying kids when turning on the TV. During the previous heyday of permissive parenting in the Mid-century, one of the most popular shows was Dennis the Menace, starring a more sympathetic but still off-putting Baby Boomer brat who couldn't act. The less popular yet more iconic show Leave It To Beaver starred an even more annoying kid who couldn't act.

Circa 1960, status-striving and dog-eat-dog competitiveness were nearing a low point, so at least those earlier examples would not have made you angry with their smug, dismissive attitude. Still, they were children who couldn't act, they were central characters, and they were going to get on your nerves for being so dorky, bratty, and wussy.

Since Mid-century cocooning had all but melted away by the 1980s, it was damn unlikely that you were going to suffer annoying children on television. Here is a site that lists the top 30 shows in the Nielsen ratings for each year, which you can browse if you're unfamiliar with them.

For the early and middle parts of the decade, there was nary a child to be seen, let alone a central character whose immaturity and inability to act would have made you change the channel. There was that blonde daughter from Family Ties who was a mopey sourpuss, but she was mostly out of the picture before she got to high school. I'm sure there are other marginal examples like that, but none where they're one of the main characters.

What's striking about the hit shows back then is how grown-up everyone is. Dallas, Magnum P.I., Cheers, Miami Vice. Viewers then were so maturity-minded that they put The Golden Girls, a show about four senior citizens, into the upper layer of ratings. Directly related to that is how unrelated most of the characters are — annoying kids tend to crop up in shows that focus on families.

Toward the end of the '80s, as cocooning was about to set in, there were still only a couple of annoying kids on TV — the son Jonathan in Who's the Boss? and Rudy from The Cosby Show.

Then as the family-centered shows of the '90s rode the wave of helicopter parenting and cocooning, we got a deluge of annoying kids. Darlene and DJ from Roseanne, the little boy from Family Matters, the Olsen twins and Stephanie from Full House, the blond nerdlinger from Step by Step, and all of the kids from Home Improvement.

Everybody's favorite show about nothing, Seinfeld, was a holdout in this regard, and in hindsight that was a key factor in making the show so enjoyable — no fucking kids. Not even teenagers. It was more of an ironic and self-aware incarnation of Cheers adapted for the postmodern Nineties.

These changes back and forth are the result of folks being more community-minded and public-oriented vs. more family-minded and home-oriented. The community and public places are not made up of children, but adults (mostly). Only when people start locking themselves indoors do they dwell on the ankle-biters that are part of private, family life.

March 30, 2014

Do cats have generational differences?

Cats these days seem more bratty and dependent than I remember growing up. And more housebound and supervised by their caretakers — perhaps the cause of the greater brattiness and dependency, a la the children of helicopter parents? Just as there has been a huge change in parenting styles over the past 25 years, so has there been with pet-caring styles, and in the same direction.

I'm inclined to rule out genetic change, not on principle, but only because the change has been so sudden.

Fortunately, cats are less obedient than children toward over-protective owners, so they still get out a good bit. But it feels like they have a weaker public presence than they did in the '80s. One of our cats used to flop down into a deep sleep right on the sidewalk out front. Once he even got accidentally picked up by animal control because they thought he'd been hit by a car and was lying dead on the curb. Another cat used to climb trees and walk over our first-story roof.

Come to think of it, it's been a while since I've seen the "cat stuck up a tree" scenario in real life or pop culture. Last I remember was having to climb up a tree and dislodge our cat who liked to run up there, at least until the birds began dive-bombing him. That was the mid-to-late 1990s.

I'm not sure how to investigate this idea quantitatively, but it would be worth looking into by animal psychologists. There may be surveys of pet-owners over time about whether their cat is an indoor or outdoor cat.

March 29, 2014

Pre-screening peers on Facebook

Different generations use the same technology in different ways. Nothing new about that: as teenagers, the Silent Gen huddled around the radio set to listen to drama programs, quiz shows, and variety hours — akin to parking yourself in front of the idiot box — whereas Boomers and especially X-ers turned on the radio to get energized by music.

After reading more and more first-hand accounts by Millennials about how they use Facebook, it's clear that there's more social avoidance going on than first appears. Refusing to interact face-to-face, or even voice-to-voice, is an obvious signal to older folks of how awkward young people are around each other these days. As recently as the mid-1990s, when chat rooms exploded in popularity, we never interacted with our real-life friends on AOL. It was a supplement to real life, where we chatted with people we'd never meet. Folks who use Facebook, though, intend for it to be a substitute for real-life interactions with their real-life acquaintances.

But then there's their behavior, beyond merely using the service, that you can't observe directly and can only read or hear about. Like how the first thing they do when they add an acquaintance to their friend list is to perform an extensive background check on them using the publicly viewable info on their profile. Who are they friends with? Birds of a feather flock together. Who, if anyone, are they dating? Where do they go to school, and where do they work? Where did they go to school before their current school, and where did they work for their previous five jobs? What's being left on their wall? What do their status updates reveal? Pictures? Pages liked? Etc.?

Damn, you've hardly met the person and you're already sending their profile pics to the lab in case there's a match in the sex offenders database. Back when trust was just beginning to fray, people (i.e., girls) would have maybe used the internet to check for criminal behavior, but that would be it. Now it's far worse: they scrutinize every little detail on your profile, and every trace you leave on other people's profiles. They're going far beyond checking for the far-out-there kind of deviance, and are trying to uncover every nuance of your life. Rather than, y'know, discovering that first-hand or at most through word-of-mouth gossip.

Aside from feeling invasive ("creepy"), it betrays a profound lack of trust in all other people. After all, it's not this or that minority of folks who are subjected to the screening. Nope, it's like the fucking TSA waving each Facebook friend to step up to that full-body scanner. "OK, open up your profile, and hold still while we scan it... All right, step forward... and... you're clear." What if I don't want to "hit you up on Facebook," and keep that part of me private? "In that case, you can choose the alternative of a full background check by a private investigator, and provide three letters of reference." Just to hang out? "Well, you can never be too safe. It makes me feel more at ease." Yeah, you sure seem at ease...

Moreover, it destroys any mystery about a person. Remember: mystery is creepy, in the Millennial mind. An unknown means that something's going to pop out of the closet and scare them. They have to open every door and shine a flashlight over every micro-crevice of the home of your being, in order to feel secure.

It also delays the getting-to-know-you process, and interrupts the momentum whenever it gets going.

Trust greases the wheels of sociability, and it used to be normal to meet someone for the first time in the afternoon and feel like you knew them well by the end of the evening. Not just boys and girls pairing off, but also same-sex peers making quick friends, particularly in an intense setting like a concert or sports game. These days, there's this nervous laughter, sideways stare, and lack of touchy-feely behavior (for boys and girls) or rowdiness (for same-sex peers).

I think it's probably too late for the Millennials to be saved from their awkward, pre-screening behavior on social networks. They're already in their 20s and set in their techno ways. Hopefully by the time today's post-Millennial kindergartners become teenagers, they'll look at the choice of Facebook as real-life substitute, and let out a great big BOORRRRRRINNNNNG. Locking yourself indoors and huddling around the radio set listening to soap operas died off fairly suddenly, and there's no reason that the next generation can't kill off Facebook just as quickly.

Were you into vinyl before it was cool?

By "being into," I mean that it was a conscious choice among alternative formats. And browsing through the NYT archive for articles about "vinyl," it looks like it became cool during the 2000s. Before then, when it was also a conscious choice, is roughly the mid-to-late 1980s through the '90s.

Today I was reminded of the two main things that turned me on to records in the mid-'90s: selection and price. Nothing romantic.

Not the sound quality — it sounds great, and distinct from CDs, but not a difference that would make me want to convert my CD collection into vinyl. And not the status points — back then, there were no status points to be gained, and even now I would gain little once I revealed what it was that I bought on vinyl (not classic rock, punk, grunge, or indie).

I'm visiting home and stopped by the local thrift store, which had several dozen crates full of records. I hadn't even gone there looking for them; I just figured why not browse around after having scored something that I did come looking for, a vintage afghan (black with gold and cream patterns). Right away they started popping up, and within ten minutes, I had a handful of albums that are difficult to find on CD, let alone for a buck apiece in mint condition. The Bangles, David Bowie, Bonnie Tyler, The Go-Go's, Paul Young, Bryan Adams, and Stacey Q (only a 12" single, but more than I've ever seen on CD).

The only one that I've even seen before on CD ("in the wild") was All Over the Place, the debut power-pop album by the Bangles. I have that on CD, but the hits from the others I only have on compilations or greatest hits: "Blue Jean," "Total Eclipse of the Heart," "Every Time You Go Away," "Two of Hearts"... and I don't think I have "Head Over Heels" or "Summer of '69" on anything. Yet for the price of one crappy download on iTunes — with 95% of the sound compressed out — I got the entire album that each hit song came from.

It may be new wave and synth-pop that I'm buying today, but 15 to 20 years ago I was likewise bored by contempo music and looking back for something entertaining. As now, it was mostly the '80s, a decent amount from the '70s, and a handful from the '60s. Only more on the avant-garde or experimental side, compared to the more well-known stuff that I dig now. Starting in college, you should be getting over your adolescent insecurity about needing to prove how obscure and unique your tastes are, and that's the only real change in my vinyl-buying habits — who the groups are, not the larger reasons of selection and price.

Some of that stuff was available on CD back then, but it was expensive. On the CD racks at Tower Records, there were several columns of just Frank Zappa, but they were closer to $20 than $15. If you dropped by any used record store, though, you could find them used and in good condition for about five to ten bucks.

And other material was either not released on CD, out of print, or otherwise damn rare to find. Yet I had no problem finding a couple of albums and a single by Snakefinger, the guitarist who frequently collaborated with The Residents. But unlike the most famous obscure band in history, Snakefinger was actually a working musician instead of a performance artist, and was a superior songwriter and performer.

As an aside, if you're looking for something unheard-of, but can't stand how weird the typical weird band sounds, check out his stuff from the late '70s and early '80s. He recorded a cover of "The Model" by Kraftwerk that's more uptempo and danceable yet also more haunting than the original. (None of his own music is very dance-y, BTW, in case you're allergic to moving your body.)

Anyway, it struck me as odd that someone would be into vinyl for practical reasons. There really is a lot out there that can be had for cheap if you buy records, without compromising sound quality.

It's not an analog vs. digital thing either. Tapes are analog, but they sound pathetic compared to either records or CDs. Video tapes are the same way. Does that make laser discs the next target for hipster status contests? If so, better hit up your thrift stores soon before they're scavenged by the status-strivers. At the same place today, I found a perfect copy of the director's cut of Blade Runner on laser disc for five bucks. Don't know when I'll be able to actually play it, but...

March 25, 2014

Don't improvise in period movies

Got around to seeing American Hustle tonight, and liked it better than most new movies I see. One thing took me out of the moment several times, though -- the improv scenes.

In a period movie, actors should stick close to the script because they are unlikely to be able to improvise and maintain the period's authenticity on the fly. When the acting is more spur-of-the-moment, it will come from the actor's gut, which is tuned to the here-and-now.

There was a scene where Irving Rosenfeld asks "Really?" in a slightly flat and miffed voice, in response to someone else's ridiculous behavior. That's too specifically from the 2010s. At the end, when Richie DiMaso does a mock performance of his namby-pamby boss, it's so over-the-top, and so much laughing-at rather than laughing-with, that it feels more like a frat pack movie from the 21st century. Ditto the scene where DiMaso is trying to convince Edith to finally have sex, where the dialog sounds like it's from a doofus rom-com by Judd Apatow.

These and other scenes should have been in the outtakes -- wackiness ensues when the actors break character! When they're left in, it creates jarring shifts in tone, as though an actor who'd been speaking with an English accent switched to Noo Yawka for half a minute, then switched back to English (all for no apparent reason).

Improvising and wandering slightly outside of the character's range isn't that jarring. Like, maybe you're just seeing a slightly different and unexpected side of them in this scene. But anachronism is not so easy to suspend disbelief about -- it definitely does not belong to that period. If they turn on the radio in 1980 and it's playing Rihanna, that kills the verisimilitude.

It's even more baffling in American Hustle, where the costume, make-up, and production design have been worked over so meticulously to make you believe you're looking in on a certain time and place. All it takes is a series of distinctively 21st-century improvs to throw that into doubt for the viewer.

March 24, 2014

Field report: Bowling alley birthday party

A party at the bowling alley became a common option by the late '80s, and I went to one today, so they're still at least somewhat common. Yet in 25 years a lot has changed, reflecting the larger social/cultural changes.

First, the guests are dropped off at the bowling alley itself — not at the birthday boy's house, where they would have pre-gamed before the host parents hauled them off in a couple of mini-vans to the bowling alley. I've written a lot about the death of guest-host relationships during falling-crime / cocooning times. Each side is suspicious of the other — the hosts fear being over-run by and taken advantage of by guests, and the guests fear mistreatment (or whatever, I'm still not sure) by the hosts. It is primarily a guest-side change, just like the decline of trick-or-treating or hitch-hiking.

Now they have to meet in a third-party-supervised place like a bowling alley (or the mall / shopping district for trick-or-treating). Before, they would've met inside the host's home.

Then once the kids are dropped off, the parents will hang around to some degree until the event is over and they take their kid back. (Every kid today had a parent present.) Rather than drop them off, go do something for a while, and either pick their kid up later or have them dropped off by the hosts. Again we see parents being so paranoid that they won't just leave their kids alone in the company of the hosts. Even if it's a bowling alley with only nuclear families present, in a lily-white region and neighborhood, on a Sunday afternoon.

(I can't emphasize often enough that, just because you live in a diverse shithole part of the country, doesn't mean that everyone else does. If parental paranoia is palpable in an all-white smallish town — if it is a national phenomenon — then it does not have to do with protecting kids against the dangers of a diverse megapolis.)

At least the parents won't hover directly over their child at a party, but the assembled grown-ups will form a ring around the kids, or form a side-group of chatting grown-ups next to the kids. Line-of-sight supervision remains unbroken. They can't trust their kid to use his own brain because he doesn't have one — intuition requires experience, and helicopter parenting blocks out experience from their kid's upbringing. It's more like programming.

One of the kindergarten-aged guests politely declined eating a piece of birthday cake because it had "artificial frosting." He had no way of knowing that (for all I know, it was made of natural poison). Still, as though organic sugar wouldn't wind you up, put you in an insulin coma, rot your teeth, etc.

A lot of these brand-new food taboos that were wholly absent during the 1980s are just roundabout, rationalized ways to fragment the community. Sorry, my kid can't come to anyone's party because you'll have traces of peanuts in something and he'll die. So don't mind his absence from all celebrations that might bind a peer group together.

As for the actual bowling, you see both major societal influences at work — the self-esteem crap and the hyper-competitive crap. Quite a combination, eh? All of our kids are going to compete as though their lives depended on it, yet they're all going to enjoy the exact same rewards afterward.

The bumpers being up is not about self-esteem. That's just helping them learn, like riding a bike with training wheels at first. But everything else is. Like letting them cross the line as much as they want without penalizing them at all. Or cheering after any number of pins get knocked down — don't you think the kids can tell that they'll get praise no matter what they do? And hence they can just BS the whole thing and get full credit? Gee, I wonder if that'll pop up a little later in life...

The hyper-competitive stuff is way more visible and more offensive, though. If they knock down more than 5 pins, they're going to do some kind of victory dance, boys being worse than girls. Congratulations: it should've been a gutterball, but the bumpers let you knock down 7 whole pins. "Oh yeah! Uh-huh!"

And they're so eager to show off in general that they don't care how awful their technique is. Not like I'm even an amateur bowler, but I know that we're not doing the shotput here. The boys are again way worse than the girls on this one. (The winner today among four boys and two girls was one of the girls. She creamed the rest, not from being good, but from not crashing and burning in an attempt to show off.)

Have you guys seen kids throw a bowling ball lately? When I was their age, we stood with our legs wide apart, held the ball from behind with both hands, and rolled or lobbed it as close to the middle as possible. It's a granny-looking move, but when you're in kindergarten, you don't have the upper body strength to throw it in a more normal way. Heaven forbid you teach that lesson to kids these days, though. They're going to prove that they can do it. Only not.

They carry the ball with both hands near their chest, run up to the line with their left side forward (if right-handed), and then heave or shotput the ball with their right hand, turning their upper body to face the pins as they near the line. This must be the worst way to release a bowling ball, and if the bumpers were not up, every one of these releases would go straight into the gutter. Not meander their way into the gutter — like, not even halfway down the lane, it's already sunk.

One kid did this with such enthusiasm that after shotputting the ball, his upper body carried itself forward over his feet, and he landed on his hands and knees — over the line, every time.

So, who cares if they're not trying to achieve the goal that the game assigns them? They're showing how eager they are to display intensity ("passion," later on), and that's all that matters in a dog-eat-dog world. The rules can be bent or changed on the fly, as long as the most intense person will win. After all, the parents aren't correcting or penalizing them.

One final odd but sadly not-too-surprising sight: the setting was a college student union, yet there were only two groups bowling, both family-and-kid-related. There were a handful of students playing pool for about 10 minutes, then that was it, no replacements. No students in the arcade area. On the other side of the rec area was the food court, which was closed on Sunday but which still had about a half-dozen students spread out.

Doing what? Why, surfing the web — what else? Most had a laptop out, one was on a school-provided terminal, and one girl was reading a textbook or doing homework with her back to everyone else.

If you haven't been on a college campus in a while, you'd be shocked at how utterly dead the unions are. Like, there are stronger signs of life in a gravestone showroom. Most of the students are locked in their rooms farting around on the internet, texting, or playing video games. The few who venture out go to the library or the gym, where they can be around others and within the view of others, but still hide behind the expectation that you just don't go up and interact with people in those places. Unearned, risk-free ego validation — what's not to love for Millennials?

March 21, 2014

Another path from helicopter parenting to egocentrism

I'm visiting my nephew and got dragged into playing Legos with him. "Dragged" not because I'm heartless, but because I want to keep sufficient distance in case I need to assert authority. Remember: being your kid's friend undermines your authority. It's the kind of thing he should be playing with his peers, not a family member who's 27 years older than him.

At any rate, he's so excited to show you this thing he made and that thing he made, and what this design in the booklet looks like, and what that one looks like. I found myself the whole time saying "cool..." or "man..." or staying silent. All you get is agreement from family members, even if they don't really think it's cool. Family members have to treat each other more nicely than if one of them were an outsider.

It's your peers who would pipe up with "BORRRINNG." Like, "Hey Uncle Agnostic, you wanna watch Sponge Bob?" I can't tell him, "Nah man, that shit's boring." But one of his friends hopefully would. "Aw c'mon, change the chanelllll. Sponge Bob sucks." Then they'd engage in a little back-and-forth, give-and-take, until they compromised.

With family members in an era of family friendliness, the grown-ups tell the child that whatever they like is awesome. No need to change, improve, or compromise your interests and tastes.

This ego sheltering can only last so long, though. What happens when the kid starts to interact with his peers at age 25 or whenever it is with you damn Millennials? His whole formative years up to that point have prepared him to expect that other people will find his interests fascinating and his tastes impeccable.

Then, SLAM — your peers shrug off many of your interests and find your tastes average. That's typical, and not the end of the world. But with no preparation for it, the ego faces this challenge in such an atrophied state that it gets utterly demolished.

To pick up the pieces: "Well what do those idiots know about awesome anyway? They're probably just jealous. I don't need their confirmation anyway." Now they're headed down the path of social withdrawal and misanthropy. They'll grow suspicious of their so-called friends who aren't 100% enthusiastic about their interests.

Parents in the '70s and '80s used to view their job as preparing their kids for the tough and unpredictable world out there, not to insulate them from it. It'll slam into them at some point, so might as well make sure they've grown to withstand it. Parents stayed out of our lives even when we were children. By encouraging us to go out and make friends on the playground, or at school, or around the neighborhood, they helped us discover the shocking reality that not everybody is as interested as we are in the stuff we're interested in.

That not only taught us to negotiate and compromise with someone who wasn't on the same page as us, but also to seek out new friends who would be closer to us. That way we had our close friends, with whom we didn't have to struggle much just to get something done, and other friends or acquaintances with whom we had to make more of an effort to do things — without shutting them out because of that. Each side went back to its closer circle of friends afterward and engaged in some good-spirited gossip about how weird the other side could be sometimes.

This is a separate effect from over-praising the kids' efforts and output. That shelters their ego about their capabilities. This is about what gets their attention, what motivates them, their interests and tastes. It's much closer to the core of their identity than their capabilities, so that questioning it is far more likely to be perceived as doubting who they are as a person.

That will trigger a much more desperate defense: "What do you mean, you don't like Harry Potter? Here is a PowerPoint of the Top 10 reasons why you must, unless you're a big fat stupid idiot."

Never having your tastes questioned — not well into your late teens anyway — leads to another major problem that you see with Millennials. They can't separate objective and subjective discussions about something they like. It all boils down to the subjective. The objective is only a means toward that end, as though an objective argument would force them to decide one way or another on the subjective side of things.

To end with an example, most of them like music that is not very musical. That is an objective claim, easy to verify. If it's what floats their boat, I guess I'll just have to consider them lame, and they can look at me playing a Duran Duran album as lame. But objectively speaking, "Rio," "Save a Prayer," "Hungry Like the Wolf," etc. are more musical. More riffs, motifs, more varied song structure (intros, bridges, outros), more intricate melodic phrasing, richer instrumentation (actually hearing the bass), instrumental solos, greater pitch range for the singing, more required to write the instrumental parts, and so on.

I know some Boomers who see that spelled out, and compare what Pandora says about their Sixties faves, and respond self-deprecatingly with, "Meh, I guess I get turned on by simplistic music then!" as opposed to fancy-schmancy music. Millennials get bent out of shape, though, as though "logic has proven my tastes inferior — must re-inspect the logic."

But that's what happens when your tastes rarely get questioned during your formative years. You don't appreciate that there could be two separate ways this could happen, one objective and the other subjective. In fact, being told that your favorite TV show is boring would probably have introduced you to the objective side of things, when you asked them why they felt that way. "I dunno, it's like none of the challenges the characters face actually matters. The motivation feels empty." Ok, they wouldn't phrase it that way, but you know what I mean. Usually their response would amount to no more than, "I dunno, it just sucks."