July 31, 2009

Video game weekend: The PS3 really does suck, and the Wii really does rule

First, let me state at the outset that I don't own any of the current-generation home consoles, although I do own a Nintendo DS. My Nintendo, Super Nintendo, Sega Genesis, TurboGrafx-16, and Game Boy Player for the GameCube offer a superior library of games. So I have no stake in who wins the current console competition (let's leave the phrase "X wars" in the '90s where it belongs). Instead, I simply consider sales data from the market research group NPD, as reported in the latest issue of Game Informer -- and as hand-checked by me (5 of the 20 entries contained errors!).

The list includes only the top 20 best-selling games (in units sold) of May 2009; obviously it would be better to have a more complete list. Still, this will do. The 20 break down as 6 games for the Wii, 5 for the DS, 6 for the Xbox 360, and 3 for the PlayStation 3.

To get a better feel for a game's true, underlying quality, I excluded all games that were released in that month. The reason is simple: when a game is just released, its initial sales are mostly determined by the advertising budget, the hype it gets on the internet, and word-of-mouth exuberance in anticipation of its release. It's only after it's been out of the gate for a month or so that actual game players have had time to familiarize themselves with the game and talk about it to each other, post their opinions on the internet, and so on. Only then can we tell whether the game can survive on its merits rather than pure hype and PR.

If an overhyped game turns out to stink, people will talk about this, and its sales will crash in the next month and remain low. On the other hand, if an underrated game gets enough word-of-mouth praise, it can enter the best-selling list after its release month.

Doing this leaves 12 of the original 20 games, showing that nearly half of all best-selling games in a month probably benefit only from the producer's PR and fanboy hype, and that they likely fall off a cliff almost right away. Some of the games released in May could in fact prove strong later on, but the pattern of release dates across the list doesn't offer them much hope.

Of the 12 that have proven themselves over time, 5 are for the Nintendo DS (0 were cut), 4 are for the Nintendo Wii (2 were cut), 3 are for the Xbox 360 (3 were cut), and 0 are for the PlayStation 3 (all 3 were cut). The surviving Xbox 360 games are all below the original top 10, while 3 of the surviving Wii games and 1 of the surviving DS games are in the original top 10. The oldest game on the list is Mario Kart DS, which came out three and a half years before, and the second-oldest is New Super Mario Bros for the DS, which came out three years before. Imagine a Hollywood studio producing two movies that were in theaters for three years and counting!
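If you want to repeat this exercise with a later month's chart, the tally is trivial to automate. Here's a minimal sketch in Python, with placeholder titles and release dates standing in for the actual NPD list:

```python
from collections import Counter
from datetime import date

# Placeholder records (title, platform, release date) -- NOT the actual May 2009 NPD list.
top_20 = [
    ("Game A", "Wii", date(2009, 5, 12)),
    ("Game B", "DS", date(2005, 11, 14)),
    ("Game C", "Xbox 360", date(2008, 11, 7)),
    ("Game D", "PS3", date(2009, 5, 26)),
    # ...the remaining 16 entries would go here
]

chart_year, chart_month = 2009, 5

# Drop games released in the chart month itself -- their sales mostly reflect pre-release hype.
survivors = [g for g in top_20
             if not (g[2].year == chart_year and g[2].month == chart_month)]

print(Counter(platform for _, platform, _ in survivors))
# e.g. Counter({'DS': 1, 'Xbox 360': 1}) for the placeholder data above
```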

In a world of fickle consumers who ruthlessly heap scorn on shitty games and gush over the great ones, the staying power of Nintendo's DS and Wii games, and of Microsoft's Xbox 360 games, is as reliable an indicator of quality as we can imagine. For the same reason, the pathetic showing of PS3 games explains why its hardware sales are in last place by far -- no one wants to play the software for that system. Its game sales benefit only from pre-release hype and PR -- once people get around to playing them, and talking about them, their sales take a nose-dive.

As I said before, I don't care who wins, and that's probably why I'm not blinded by partisanship in the whole matter. It was just a matter of looking up some numbers in a magazine I happened to be flipping through. On an objective basis, we conclude that Nintendo currently puts out the best games, although the Xbox 360's games are not terribly far behind, and the PS3 is the present-day reincarnation of the overhyped Neo Geo home console from the early 1990s.

As an older-minded video game player, I find it heartening that all 5 of the best-selling handheld games have proven themselves over time -- none owe their spot to that month's hype and PR -- and for quite some time at that. It just goes to show that one system's superior hardware capabilities don't mean shit if the games for it are boring. This allows a mostly 2-D handheld system to crush a 3-D home system that has more realistic graphics. I hesitate to say "better" graphics, since maybe the average person doesn't want straight-up realism and dark shading, but rather prefers vibrant colors and a fantasy look. But innate human preferences and video game aesthetics are another topic altogether.

July 30, 2009

The decline of kids' rough-house play, as shown through Nickelodeon

With the adoption of cable TV during the 1980s, channels could target themselves toward a narrower niche than before, and one obvious way to carve up the previously heterogeneous audiences was by age. Nickelodeon aimed itself at kids roughly aged 5 to 13, I'd say. By taking a brief look at how its programming has changed, we can track changes in what parents find acceptable for their kids to watch and imitate.

As with video games, the golden age of Nickelodeon lasted from about 1986 to 1994, and a large part of that was their game shows. Now, they don't even exist -- just have a look at their current vs. previous programming by genre. The physically oriented ones more or less stop in the mid-1990s. This could be part of the larger civilizing trend that began then, whereby violent crime and child abuse started plummeting -- no more wild and crazy kids.

It's not as if physical challenges between individuals or teams are a fad, like America's Funniest Home Videos was. Game shows like Double Dare were wacky and different enough -- and short enough -- that kids could tune in for a half-hour and get into the competitive excitement. They were sports shows, just for kids. But today's helicopter parents are probably too worried about their kids trying to recreate what they see -- especially for a game like Finders Keepers where the kids go on a rampage tearing up a staged house looking for prizes.

It would be interesting to see how far this extends -- are little kids today deprived of the joy of building forts out of cardboard boxes and couch cushions? If you've been to a park recently and seen how close the parents stand next to their kids -- as opposed to being somewhere else altogether, or not even being at the park to supervise them at all -- then it doesn't sound so crazy.

Oh, go back and look at the differences in the "educational" genre of Nickelodeon's programming. In the '80s, the most popular show by far was Mr. Wizard's World -- I still vividly recall waking up each morning at 5am (I believe) to catch it. This was a general science and technology appreciation show -- show the kids how buoyancy makes some things float and others sink, what acids and bases are (using examples from around the house), and if memory serves, he even showed kids how to build some kind of toy rocket to launch in the backyard. Just try showing that on Nickelodeon today.

(Update: my memory rules, at least for cool things like setting up a rocket in your backyard -- there's a video clip of this demonstration on the DVD webpage. Check out the info pages for all of the volumes and note the several demonstrations dealing with fire, explosions, etc. Ah, it was another time.)

Now the educational programs are just a bunch of environmentalist propaganda. So much for science and education -- just try to brainwash the poor little bastards. There was a transition period during the early or mid-1990s when Beakman's World and Bill Nye the Science Guy were popular -- and Beakman's World was broadcast on Saturday morning, competing against cartoons!

Kids these days are doing basically as well as kids from previous eras did as far as science and math achievement in school. So lacking these shows isn't harming them in that way. But being deprived of role models could affect how pumped they are to enter the math, science, and technology fields. I don't mean "role models" only in the sense of people they look up to, but in the sense of someone who shows that a science or tech person can make it and get respect in popular culture. If kids think that the field or job is for losers, even the ones who could hack it will turn to something more glamorous, like working for Wall Street or the ACLU.

July 29, 2009

More moronic antitrust actions to follow

The antitrust bureaucracy, having few real threats to take on, is once again just making shit up in order to keep their cushy jobs. First they busted up the Hollywood studio system -- and output did not shoot up, and prices did not fall. (See Arthur De Vany's Hollywood Economics.) So that's one they got wrong. Then they busted up AT&T, and Bell Labs along with it -- and the output of major new innovations virtually stopped the next year. And they really let their cluelessness show in the Microsoft case. (See Liebowitz and Margolis' Winners, Losers, and Microsoft.) I've included both of these must-read books in my Amazon frame above.

I don't pretend to know everything about every industry, but that admission alone would make me a better antitrust enforcer than the idiots who run things. They assume that they know the ins and outs of an industry that they are quite ignorant of -- Hollywood and Microsoft being the two greatest examples. Still, I know enough just from reading newspapers that I can outsmart the dolts in the antitrust division. For example, here's a description of their worries about wireless phone services:

The division's wireless inquiry is looking at, among other things, whether it is legal for phone makers to offer a particular model, like the iPhone or the Palm Pre, exclusively to one phone carrier. It is examining the sharp increase in text-messaging rates at several phone companies. And it is scrutinizing obstacles imposed by the phone companies on low-price rivals like Skype.

Oh no, a sharp increase in prices -- it can only be due to a rising monopoly! Because these idiots didn't pay attention in freshman econ class, I'll remind them of the law of supply and demand: all else equal, when demand increases, so does price. It's incredibly simple. Price could also increase if supply decreased (as when a monopoly restricts output), but it sure doesn't seem like text messages are becoming a scarcer and scarcer resource. So, just going with the basic laws of economics, rather than assume something unusual, let's ask if demand for text messaging is increasing. If so, then that's it, and the government should just butt out.
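To make the textbook point concrete, here's a toy comparative-statics sketch with made-up linear demand and supply curves -- the numbers are purely illustrative, not estimates of the texting market:

```python
# Toy comparative statics: linear demand Q = a - b*P, linear supply Q = c + d*P.
# Setting them equal gives the equilibrium price P* = (a - c) / (b + d).
# All numbers are made up purely for illustration.
b, d = 2.0, 1.0   # demand and supply slopes
c = 10.0          # supply intercept

def eq_price(a):
    """Equilibrium price for a given demand intercept a."""
    return (a - c) / (b + d)

print(eq_price(a=40.0))  # baseline demand        -> P* = 10.0
print(eq_price(a=70.0))  # demand shifts outward  -> P* = 20.0: price rises with no monopoly in sight
```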

From an article just two days later about the dangers of driving while texting:

Over all, texting has soared. In December, phone users in the United States sent 110 billion messages, a tenfold increase in just three years, according to the cellular phone industry's trade group, CTIA.

What do you know, a tenfold increase in three years! There's your answer for why texting rates are shooting up -- the demand for them is too. It's probably a demographic change, in that more and more of the cell phone-owning population comes from the more recent cohorts who prefer texting over talking. Or it could be due to something else, but we sure don't need to invoke monopolistic forces.
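Just as a back-of-the-envelope check, a tenfold rise over three years means the volume more than doubled every single year:

```python
# A tenfold increase over 3 years implies an annual growth factor of 10 ** (1/3).
annual_factor = 10 ** (1 / 3)
print(round(annual_factor, 2))            # ~2.15, i.e. volume more than doubles each year
print(round((annual_factor - 1) * 100))   # ~115% annual growth in messages sent
```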

This same retarded view dominated during the housing bubble too -- the skyrocketing prices could not possibly have had anything to do with skyrocketing demand, namely from the irrational exuberance that everyone was under. "OMG, I like have to buy a second house, or buy one and flip it -- prices only go up!" No, instead we heard a bunch of malarkey about the decreasing supply -- not due to a monopoly (at least the story wasn't that stupid), but because we were supposedly running out of land everywhere, not just in the fashionable places like usual.

In the minds of the elite, there is no such thing as the effect of real, breathing human beings' demand -- i.e., their hopes, fears, and desires. They only grant causal powers to producers who can restrict supply to drive up prices -- the Big Evil Corporation screwing the little guy -- or flood the market with supply to make it cheap -- the Big Evil Corporation trying to get people hooked on cheap crap (fast food, Wal Mart furniture, whatever).

Unfortunately, the antitrust division looks like it still doesn't have a clue. At least for the one case that I could check with no more investigation than reading a few newspaper articles a day, training their crosshairs on the cell phone industry is completely bogus. All they need to do is to leave an industry that they don't understand alone, and go get a life.

July 28, 2009

Hypocrisy about sugar consumption among upper and lower status people

I don't care about uncovering the boring kind of hypocrisy, where someone has goals that they occasionally fail to meet. That's for Gen X-ers who are still stuck in their middle school goth phase. But the real, two-faced kind does irritate me -- i.e., when someone advises everyone else to do one thing, and deliberately does something else when they're not looking.

Over at my low carb blog, I just put up a post on this regarding how we treat sugar-gulpers of low vs. high social status.

July 25, 2009

Video game weekend

- If you buy loose cartridges or CDs that don't come with the manual, have a look at this site. They've scanned manuals into free PDFs, some even in color.

- My big complaint about the newer consoles is that they've turned video games into movies. I want to play, not watch. A lot of responses to that post on various discussion forums said that the movie comparison was wrong. Of course it's not -- it's obvious that video games are trying to substitute or compete with movies, since the late '90s anyway. In fact, here's the creator of Sony's PlayStation, Ken Kutaragi, on his vision, from a 2001 Wired interview:

My initial goal with the PlayStation was to expand the game experience by expanding the available entertainment content. With PS2, one of my goals is to take entertainment even further, from games to a fusion of games, music, and movies.

So there you have it -- proof that it was the 3-D era that marked the end of video games as games, especially when the graphics became good enough in the late '90s to substitute for passable CGI effects from movies.

- Assuming that you still prefer playing games to watching bad movies, you should get a Game Boy Player for your GameCube. I've put a link to it in the Amazon box at the top, under the video games section. It goes for about $10, and it allows you to play Game Boy, Game Boy Color, and Game Boy Advance games on your TV.

For people like me who tuned out of the video game world in the mid-late '90s when the direction toward movies began, this is a godsend. There are a lot of great games in the non-movie style that came out then, but they were mostly released on the handheld systems, since they didn't have 3-D graphics and therefore didn't even bother to compete with movies. Even better, they all feel fresh because I've never played them before.

Playing Double Dragon II for the thousandth time is still pretty fun, but I already know everything about that game. However, the average video game player probably didn't get around to the Game Boy games since playing them on a tiny screen isn't nearly as exciting as playing on a TV. There are three incredible Zelda games for the Game Boy, all in color (Link's Awakening DX, Oracle of Ages, and Oracle of Seasons), three Castlevania games for Game Boy and three even better ones for the Game Boy Advance, a great color sequel to Bionic Commando, a stunning enhanced remake of the original Metroid on GBA (Metroid Fusion is a lot more boring, though), a string of great Kirby games -- and the most highly rated Metal Gear game is for the Game Boy Color.

So if you're looking for a fresh gameplay experience in the classic non-movie style, you can't go wrong with a Game Boy Player. Most of the games for it are fairly cheap, and the Player itself is only $10. If you don't have a GameCube, it's cheap too -- you can probably find a used one for $20 or $30 now since it wasn't as popular as the PS2.

And of course, you could always buy the full consoles from the golden age -- circa 1992, not 1982 -- but the games can be a bit more expensive for them, at least the great ones. Some are available for download on the Wii's Virtual Console, but not even a good fraction. I'll probably start reviewing some of the ones worth buying, but I want to wait until my TurboGrafx-16 gets here on Monday. Don't think I'll be getting a Neo Geo, though -- how many different Street Fighter clones could you want?

July 22, 2009

No one will care about Henry Louis Gates' run-in with the cops

I'm sure you've heard the story by now -- famous black academic breaks into his own house, police question him about it, and he freaks out about their supposedly racist mindset. And like the suburban wigger who finally gets pulled over by the cops, he's so ecstatic about an actual encounter with the police that he plans to make a movie about it (from here):

The charge against him was dropped Tuesday, but Gates said he plans to use the attention and turn his intellectual heft and stature to the issue of racial profiling. He now wants to create a documentary on the criminal justice system, informed by the experience of being arrested not as a famous academic but as an unrecognized black man.

Borrrinnng.

Back here I showed that Google searches for various phrases associated with identity politics show downward trends over the past several years, which I take to show less and less interest in such topics. In particular, the alleged racism surrounding Hurricane Katrina, the Jena Six, and the Duke lacrosse hoax didn't catch anyone's attention. They made the news for a little bit (the lacrosse case for a little longer, just because the trial lasted longer than a hurricane), but once the event itself was done, everyone went back to business as usual. Importantly, there were no massive riots as there were during the social hysteria of the early 1990s, when the "not guilty" verdict in the Rodney King case sparked the disastrous L.A. riots of 1992, or during the Watts riots of 1965. Therefore, no one will care about Gates' run-in with the police in 2009.

In fact, here's a graph showing the frequency of the term "racial profiling" in the NYT from its first appearance in 1994 up through 2008:


In the late '90s and early 2000s, there was a hysteria about this term, but it was short-lived, and it wasn't large enough to trigger riots. If Gates plans to make a documentary about "racial profiling," he will be dealing with a nauseatingly unfashionable topic -- although you figure he'd market it to Gen X and Boomer dipshits anyway, among whom it's still a favorite buzzword. But don't expect it to catch on -- blacks have shown very little enthusiasm for taking it to the streets lately. There will probably be another massive social hysteria in the middle of the next decade, so if he wants a big audience, he should wait until then to release it.

July 20, 2009

My new subscription-driven blog and forums

Basic rationale: Starting next week, I'll be posting on the highest-traffic day -- Monday -- only at a companion blog that will run on a $5 per month subscription. (I'll post here too, but not early in the week when demand is high.) These new posts will be the ones that I put the most effort into -- not where I recount how it went at the teen dance club, but where I present something that no one's found out before, or where I popularize something that few know about.

To give newer readers a feel for the original stuff I've done before, I've given examples in the section below, but I've already decided what the inaugural post is about -- turnover rates in many genres of popular music. Basically, I went through the Billboard charts as far back as they go for each genre, and asked, "For a given week, how many weeks had the current #1 song been at the top of the charts?" and plotted this over time. Some periods show lots of turnover, while others are marked by stasis. Are there any patterns, and does genre make a difference? Subscribe and find out.
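For the curious, the measurement itself is simple. Here's a minimal sketch of one way to compute it -- counting the current #1's consecutive weeks on top -- using made-up stand-in data rather than the real Billboard history:

```python
# For each chart week, count how many consecutive weeks the current #1 has held the top spot.
# The chart is just an ordered list of #1 songs, one per week (made-up data for illustration).
number_ones = ["Song A", "Song A", "Song A", "Song B", "Song C", "Song C"]

weeks_at_top = []
streak = 0
for i, song in enumerate(number_ones):
    streak = streak + 1 if i > 0 and song == number_ones[i - 1] else 1
    weeks_at_top.append(streak)

print(weeks_at_top)  # [1, 2, 3, 1, 1, 2] -- low numbers = high turnover, high numbers = stasis
```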

Aside from a weekly top-notch post each Monday, the blog will host forums of sorts throughout the week. The way the blogosphere works now, a comment section largely turns into a discussion board, but the momentum is always halted due to the commenters having to run from one day's comment section to the next's. At the other blog, there will simply be an open thread for the entire week where commenters can go nuts within the same post. There will be one open thread per topic of interest -- race, gender / sex, generational topics, technology, health / nutrition, or whatever else interests the subscribers.

More about the blog posts

First, there will be no ads of any kind, and comments will NOT be moderated. Everyone knows how annoying it is to participate in a comment section with moderated comments -- you can't help but feel insulted by "your message is being held and will appear when the writer approves it," and it slows the discussion down, when all you want is a quick and steady fix. But when access to the site is free, retards and flamers will inevitably show up, as bums flock to crowded malls, so that ownership and control of a comment section is needed to keep it from turning into worthless chaos. Only by restricting access can these rules be loosened up, just as law-abiding children don't need a curfew.

I'm choosing Monday because that is always the highest-traffic day. If you subscribe, you'll get your fix when you want it most, while there will be only a teaser link here to start the week off.

Although blogging doesn't eat up a lot of time, the more data-intensive posts do. This is not something that most bloggers do -- most are linkers or gasbags, with some entertaining and others boring. I actually do a bit of investigation, provide data, and put it into an easy-to-read visual. Not everyone will agree with my interpretation of what I've found, but at least I've done lots of homework that others will benefit from, and that's something you find at very few places on the internet, especially if it's a new finding.

As such, it only makes sense that I get something out of it. For those entries that don't require lots of time, I'll continue to post them here for free. But if they are more original and time-draining, I need better motivation than merely having a gigantic internet following.

However, since the other site will be by subscription, I'll be much more open to reader requests. Right now, I ignore them because if I'm researching and writing something up for free, the only thing that counts is what I'm curious about. But if I'm being remunerated, I'll look into just about anything. If there are topics you're interested in, or parts of posts I've written that you wish I'd looked into more, just say so, and you'll have a bright and resourceful numbers guy on top of it. We all know how pathetic most journalism is -- if what you read isn't covering it, I will, where feasible.

Otherwise, I'll focus on the same issues I do here -- race, generational changes, health and nutrition, sex and gender, technology, video games, pop culture, and so on.

Examples of previous original and data-intensive work I've done:

The death of silly academic theories such as Marxism, psychoanalysis, and even postmodernism, using JSTOR archives. This story was picked up by the Toronto Globe and Mail, Arts and Letters Daily, and a few others I'm forgetting. (Here's a follow-up.)

How different social classes react to adolescent sex, using the GSS, and proposing a life history account of these differences.

How much different generations enjoy various music genres, using the GSS. This provides pretty clear data that you imprint on the popular music from when you were about 15 and stay that way for the rest of your life.

How the American diet has changed over the 20th C., using pretty fine-grained data such as red meat, fish, poultry, etc., rather than just "meat." There's also data showing that heart disease and obesity have only gotten worse as we've switched to a more carboholic diet since the 1970s.

How the blondness of Playboy Playmates has changed over time, as well as some speculation about why it changes the way it does.

The stagnating pace of revolutionary technological innovation, linking it to the decline in monopolistic bodies like AT&T's Bell Labs or the Defense Department.

And there's plenty more where that came from. Just browse my archives here, or search GNXP.com for "agnostic." Stuff you won't read anywhere else. To reiterate, next week's post will look at the dynamics of pop culture by using Billboard chart-topping data. Soon after that (perhaps the next week, unless there's subscriber demand for something else), I'll present data on whether or not there has been a "decline in formality" over the 20th C., or if that is even measurable. You know -- jeans and tennis shoes replacing jackets and ties.

More about the forums

While part of the reason that people read blogs is to see what the writer has to say, an even larger part is to join in or listen to the ensuing discussion. At a given blog, there are a handful of topics that people are most interested in, and comment sections inevitably become discussion boards about these topics. But by being split up over multiple posts, the momentum of the discussion will be killed because there's a new post today and a new comment section to migrate to.

By having a single open thread on some topic (women, race, science, video games, whatever), commenters will be able to go at it to their heart's content and not have to worry about lost momentum. Should things slow down, I'll pop in often enough to stir the pot -- maybe something as small as posting a link to a conversation-starting news story. Again, I'll start off with an initial group of threads, but if enough people want an open thread on a new topic, I'll start it up right away. Each Sunday, I'll post a new set of open threads, since this type of yakking has a weekly (not daily) rhythm: the momentum always dies over the weekend anyway.

Another huge benefit from this format is that you can engage in discussions on a variety of topics each day, all day. The way things are now, you talk about sex when there's a post about sex, or affirmative action when there's a post on that. I know that you all want to talk about lots of things every day, so this will allow you to do that. You could follow a bunch of free discussion boards, but by bringing everyone together under a single umbrella site, it will be closer-knit and you'll know that you have much more in common than just one interest. Plus you all sort of know each other already.

It'll be a regular Algonquin Roundtable, and you'll be able to relate ideas from various areas together. It'll be as participatory as a discussion board, but as broadly focused and free-ranging in content as the comment sections of your favorite blogs.

And again, comments will not be moderated, so there will be a much more natural pace to the discussion. Quite simply, if someone acts like a classic flamer, I will remove them without giving a refund, and they won't be allowed to re-join the site. Every meeting place that could attract losers needs a bouncer -- and lord knows that includes an internet comment section. By the same token, nightclubs that cost money to get into are generally better than those that don't charge -- it weeds out those with bad attitudes.

How to join

There is a "Subscribe" button at the top of this blog. You will need a PayPal account, and PayPal automates everything, billing you $5 every month. (You can cancel anytime via PayPal, although I won't refund money that you've already paid.) You will also need a Google account -- they're free, and you just provide them with an email address. When you subscribe, leave me a message via PayPal with the email address associated with your Google account. I need this to invite you to the blog. If you don't say so, I'll assume it's the one attached to your PayPal account. If you forget to mention it, you can always send me a correction through your PayPal account.

Once I invite you, you'll get an email that has a "join this blog" link that you click on. And with that, you're all set. You will need to be signed into your Google account, but you can stay signed in forever. I've already put up some open threads, so you can get going right away.

I expect that most subscribers will not be trolls or flamers -- they want to harass people for free -- but again, if you exhibit classical flamer behavior, you'll be kicked out with no refund. It just takes a couple people like that to ruin the vibe, so I'll be strict about that.

If you have any questions, feel free to leave a comment here or email me at icanfeelmyheartbeat at the hotmail-ish site.

July 19, 2009

Women still living beyond their means to try to look better

Here are two brief press releases from the marketing research group NPD, which show that the recession hasn't made most women re-prioritize their spending on beauty products, and that the more expensive brands' sales are actually increasing: one and two.

Unfortunately, women are junking higher-end skincare products in favor of higher-end make-up, eye make-up, and fragrances. At least skincare adds moisture to the skin, and the higher-end products tend to have vitamin A (retinol) or other things that ameliorate the effects of aging.

I cannot believe that there's an eye make-up that costs $38 and is named after Hello Kitty.

July 17, 2009

Video game weekend: a brief history of video games

Since Fridays and Saturdays are low-traffic days, I'm devoting them to lighter material that I know lots of the readers here are interested in -- video games. (Hopefully this will recapture the feeling of being done with school and renting a video game for the weekend.)

They're a huge player in the entertainment industry, yet they haven't been studied very much. At some point, movies became a serious thing to talk about, so why not video games too? And just as the evolution of Hollywood businesses tells us a lot about the cultural products they made, we can learn a lot about the quality of video games by examining the history of this industry's businesses.

I'm in the middle of reading two histories of video games from academic sources, and I'll probably post on that later. But for now, I'll simply re-direct you to a four-part YouTube series by The Gaming Goose:

Part one, part two, part three, and part four.

Most of the info can be found on Wikipedia and a few other sites, but it's a pretty good synthesis, and there's a lot of personal insight from someone who grew up experiencing the birth of video games. Plus, unlike virtually every video game reviewer on YouTube (or elsewhere), he comes across as a normal person, not an autist.

One thing he mentions is that it's nonsense that the improving quality of home console games was what killed off the arcade games -- i.e., that people could now fairly well substitute home for arcade games. That's a point that I've made before by just noting the timing of the rise and fall of arcade game sales, in relation to home console sales. Arcade games started disappearing already in the very late 1980s, years before arcade-quality graphics were available and affordable at home.

He also emphasizes that going to video arcades was a social experience that you can't duplicate by playing people online today. That's true: you're out of your house, probably at a mall or movie theater surrounded by tons of people, and you're interacting face-to-face. There's all sorts of non-verbal stuff that's missed by playing online, such as that look of excited relief that you give each other after you beat what seemed an unbeatable boss. It's a superficial bond that only lasts as long as you're blasting away a common enemy, but that's still closer than the interaction between two random mall-goers.

Also, especially for younger boys, there was the thrill of getting to hang out with the cool older kids -- one of the few places where they would tolerate your presence. Unlike sports, a younger kid can actually fare pretty well against a teenager in video games, so that you could easily prove yourself to them and earn their respect. "Damn, look at that little dude go -- he just smoked your ass!" I don't know that there are many places left now that allow this partial breaking down of the age barrier.

And just for the record, Golden Axe is the best arcade game, just ahead of Ninja Turtles and the Simpsons.

July 16, 2009

New layout

I read too many books to write about (and most of them are pretty good and worth reviewing), so rather than put yet another post on the back-burner for each book, I've opted for what you see at the top of the page. They're all really neat and worth reading; click through as normal to see their Amazon entry. If you do buy an item after entering Amazon via my link, I get a small commission that will help defray the costs of the moon-sized death ray pointed at Earth that I'm working on.

I get questions a lot about which health and nutrition books to get -- well, there they are. All you need is those two, and you're basically set.

I'll probably begin devoting Fridays, which tend to be lower-traffic days, to video game reviews -- mostly of older ones. You can play the games yourself by getting the NES / SNES combination console in the link, and Amazon marketplace now has used video game sellers too for the cartridges. If you want something newer, the Nintendo DS Lite is pretty good. I got one in order to play newer games (on the DS) as well as the good ones from Game Boy Advance (which includes some from the NES and SNES). Plus the DS Lite is a lot cheaper than the new DSi -- and the DSi doesn't play GBA games either.

If you have a GameCube, you can pick up the Game Boy Player from the link, and that'll play GBA, Game Boy, and Game Boy Color games. I just got one so that I can make up for the time I lost in the late '90s -- I didn't know they made a fair amount of good games for the GBC. I assumed they were like Game Boy but just in color. The games are cheap too -- I found Metal Gear Solid, the top-rated game for the system, for $6 at a used record store nearby.

The movies in that list are mostly ones I've reviewed here. And the music list is an introduction to really good stuff, in case you haven't already heard it. And now back to the death ray.

Sports video games sucked then and suck now -- why?

I never liked sports video games, aside from the occasional game of Baseball Simulator 1.000 for Nintendo (made by Culture Brain, a great little company). If I wanted to do something sports-related, I'd throw the football around with my friend or bike down to an abandoned tennis court to join a game of roller hockey. But maybe I just didn't play the good sports games, or perhaps sports games have gotten better -- that is, increased at a faster rate in quality than games overall.

Nope. To check whether my assessment of sports games from a while ago is correct, I went to eStarland and looked up used Super Nintendo and Sega Genesis games, sorting them from low to high price. Almost without exception, the first 100 or so of roughly 500 games are sports games. Think of all the other genres that could have been mixed in there too -- yet the bottom fifth (and maybe more) is almost entirely sports games.

Then to check on more recent consoles, I looked up used PlayStation 2 games from low to high price. Again, virtually all of the really low-priced games -- i.e., those that no one wants -- are sports games. The graphics may be better, and you may be able to customize your team more than before, but compared to other genres, sports games are still garbage.

So, what is it about sports games that makes them so unpopular? It can't be "I could do that in real life," since The Sims is incredibly popular (among girls anyway). I don't think it's that we expect sports to be physical, so that a test of our button-pressing doesn't live up to a test of coordinating our whole body. After all, the beat 'em up games are supposed to be physical, and they don't live up to the feel of actually killing 500 guys with your bare hands. Thoughts?

July 15, 2009

The real benefits of good looks

Everyone knows from their own lives that looking good will make it easier to get your foot in the door with the opposite sex. And if you've followed the media's fascination with psychology over the past 15 to 20 years -- mostly the evolutionary kind -- you've probably heard that good-looking people are given more attention even by newborns, and that they tend to earn more than their plain-looking counterparts.

But there are two even more important perks that hardly anyone mentions:

1) Should the need arise, you can fart in public without worrying -- no one around will think that the good-looking person did it. It must have been that fat ugly guy over there. (Most girls under 30 benefit from this too -- everyone assumes they never do it.)

2) You can go longer without doing laundry since your sweat not only doesn't stink, but actually smells good to the opposite sex. (See the cottage industry of "sweaty t-shirt studies.") That's a real plus during the summer. I think you have to still have a clean overall appearance, though.

The only downside to children preferring to be around good-looking adults is that they want to play with you way too much. I didn't mind that as a tutor because it meant that they'd actually behave themselves pretty well around me and not act like monsters, lest they lose my favor. But, for example, yesterday I was trying to get some reading done at Starbucks when someone's group of little kids kept hovering near me to make their stuffed animal crawl up my leg. It was cute at first, but it's hard to shoo the little boogers away when you need to get work done.

July 14, 2009

The reasons that females police promiscuity are self-interested, not patriarchal

I want to return to a previous post where I linked to and commented on a YouTube video of an 8th grader mercilessly documenting her best friend's libidinous ways, all while the friend begs her to stop and chases after her when that doesn't work. This is the familiar pattern of girls sabotaging their so-called sisters to get ahead.

Yet several commenters argued that the teaser was only acting under the influence of the patriarchal norms that she'd internalized -- if they were reared under a matriarchy, teenage girls would not act so savagely toward one another.

But let's get real.

This girl just so happens to police the sexuality of other females exactly at the time when she herself has a selfish motive to do so -- to keep them from competing with her for boys, from dragging down standards, and so on.

Think it through: if females policed others due to social brainwashing, it would show up in their behavior early on. Even by the end of elementary school, kids have absorbed most parts of culture that are not innate -- the ambient language, looking both ways before crossing the street, saying please and thank you, etc.

And yet elementary school girls don't harangue females of any age about their sexuality -- at all. They only start doing this after puberty.

Under the "social brainwashing" hypothesis, the perfect coincidence between policing and puberty is completely unaccounted for. Under the "advancing her own interests" hypothesis, it makes perfect sense. We conclude that girls tearing into other girls for acting slutty, once puberty begins, is as genetically pre-programmed as boys acting violently toward each other to attain top dog status.

File another Women's Studies theory under "so wrong a high schooler could figure it out."

July 13, 2009

And you thought the PlayStation 3 was expensive...

When it was released nearly three years ago, Sony's PlayStation 3 video game console cost either $500 or $600, depending on how big the hard drive was. It sounded like a lot of money because we assume that dollars are fixed in value, when in reality inflation or deflation changes how much a dollar is worth all the time. Setting aside the consoles that were trying to be full arcade games for the home, or that made video game playing only one piece of their larger package of multimedia features, what was the most expensive video game system after adjusting for inflation?

It was Mattel's Intellivision, a mainstream competitor of Atari that was released in 1980 for $300 -- or for $746 in 2007 dollars. That's only 23% cheaper than the Neo Geo -- a home arcade system!

I know there are already charts of how much video game consoles cost in nominal and real terms across the years, but I've never seen a good scatter-plot or time-series plot. So here they are for 45 consoles (this time including the home arcade and multimedia ones), from Magnavox's Odyssey in 1972 to Sony's PS3 in 2006 (full table here):


I adjusted for inflation by using the Consumer Price Index, since after all these are consumer goods. I got the data from this chronology of video games, which is extensively referenced.
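The adjustment is just a ratio of index levels. Here's a minimal sketch using the Intellivision example, with approximate CPI-U annual averages (which is why it lands a few dollars away from the $746 figure above):

```python
# Real price = nominal price * (CPI in the base year / CPI in the release year).
# Approximate CPI-U annual averages; using annual averages rather than a specific month
# is why this lands a few dollars off the $746 figure quoted above.
cpi = {1980: 82.4, 2007: 207.3}

def real_price(nominal, year, base_year=2007):
    return nominal * cpi[base_year] / cpi[year]

print(round(real_price(300, 1980)))  # ~755 in 2007 dollars for the $300 Intellivision
```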

Notice that the nominal prices have gone steadily up -- there doesn't seem to be nominal price "rigidity" here, as though we had some psychological barrier against the creeping-upward prices on the tag. Well, at least until a mainstream console hits $1000 -- then we'll see.

In the plot of real prices, the typical consoles have gotten cheaper over time, probably due to economies of scale, memory getting cheaper, or some other technological advancement. Also notice that it appears that prices are lowest when there's only one company with most of the market share -- in the mid-late '80s when Nintendo was the only game in town, and in the early 2000s when the PlayStation 2 cleared out Nintendo's GameCube and Microsoft's Xbox. I don't have good market share data from the 1970s, but this appears to hold there too, since Atari was introduced in 1977 and didn't see real competition until the early 1980s.

This contradicts the idea that just having a huge market share makes you act like a bad monopoly -- prices should be higher when there's only one dominant player, under that idea. Instead, the period when there was essentially one system to play -- 1986 to about 1991 -- was characterized by an incredibly low-priced console, the NES. It's not until the eruption of consoles during the 16-bit era and just after that prices start shooting up again.

And Nintendo, despite dominating the market for most of its existence, has always kept its prices low. The real price of the NES was $238. For the Super Nintendo, it was $301; $235 for the GameCube; and $257 for the Wii. Sony's consoles, by contrast, have ranged from roughly $350 to $500.

I've got similar data that I'll post soon for the handheld consoles, but it looks pretty similar.

So if you thought video game nerds today spend too much money on their vice, just imagine how much of their paycheck people forked over to buy an Intellivision. I don't remember any of it first-hand, but the video game craze of the late '70s and early '80s was something else (maybe it was the drugs). As I showed in a graph in this post, arcade game sales peaked in 1981 and have never been matched since -- and that's true even if you add in home console sales, at least as recently as 2002, notwithstanding how well the PS2 was selling then. In fact, it was only in 2001, after the PS2 caught on, that home sales returned to their level from before the video game crash of 1983. It's just too bad there weren't very many games for it -- just a bunch of movies.

July 10, 2009

Economists who aren't clueless

You may have already read Stan Liebowitz's "Anatomy of a Train Wreck" article about the mortgage meltdown (Steve Sailer publicized it), but he's written a lot of other great stuff, mostly about intellectual property and technology. Here's his webpage. If you don't know much about the real history of Microsoft's ascendancy -- and that includes just about everyone, since Microsoft is more of a folk devil than an existing company in most people's minds -- you should read through his very clear and data-packed articles there. Your library should have some of his books too.

Aside from reading about Microsoft, you can also discover that those academic urban legends about the Dvorak keyboard and Betamax tapes being superior to the QWERTY and VHS alternatives are bogus. We haven't been locked in to inferior standards at all.

July 9, 2009

Millennials watching less TV, guys trying more to lose weight

Earlier we saw that there was a sharp change in the smoking habits of high schoolers around 2001, suggesting that people born in the late 1980s are the first of the Millennials -- not "anyone born in 1980 or after," as you commonly hear. Here are two more pieces of evidence for this lifestyle change occurring in the early 2000s, also from the Youth Risk Behavior Survey.

First, there's something related to the impression people have that young guys today are more effeminate in their preferences, hopes, anxieties, looks, and so on, compared to before -- removal of body hair, for example. The main worry that a teenage girl has is that she's too fat -- that will never change -- but there's been a shift in the percent of young males who are trying to lose weight (bars show 95% confidence intervals):


There's no change in the percent of females who are trying to get thinner, but there's a clear break among males between the 1991 to 1997 values and the 2001 to 2007 values, with 1999 marking a transition year. The bars for the first period all overlap a lot, as do those for the second period, but the second period is clearly higher than the first. It's as if a switch were flipped on sometime between 1999 and 2001, again implicating the late '80s as the birth years of the new cultural group visible among the second period's high school dudes.
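If you want to eyeball "overlapping bars" for yourself, here's a minimal sketch of the standard normal-approximation 95% confidence interval for a proportion -- the inputs are placeholders, not the actual YRBS estimates, and the real survey's intervals also adjust for its sampling design:

```python
from math import sqrt

def prop_ci(p_hat, n, z=1.96):
    """Normal-approximation 95% CI for a sample proportion.
    (The real YRBS intervals also account for its complex survey design.)"""
    se = sqrt(p_hat * (1 - p_hat) / n)
    return (p_hat - z * se, p_hat + z * se)

# Placeholder numbers, not the actual survey estimates:
print(prop_ci(0.15, 5000))  # e.g. 15% of males trying to lose weight, n = 5,000
print(prop_ci(0.28, 5000))  # if the intervals don't overlap, the change is hard to wave away
```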

Elsewhere on this blog, I've documented how young people today have almost nothing to do. So you'd figure that they'd at least be watching more TV -- but nope, they're not even doing that:


Again there's a clear break, in that the 2001 to 2007 groups overlap a lot, but are essentially all below the 1999 value. Once more it's people born in the late '80s and after who have markedly shifted their lifestyle choices from those of young people before.

It's a mistake to put too much emphasis on the new technologies that some generation grew up taking for granted. That doesn't make them into a coherent generation because it says nothing about how people use them -- only that they're background features. Things like smoking, trying to lose weight, and allocating time to watching TV tell us more about what people want, or at least how they're trying to fit in. Generational analysis needs to focus more on what people are doing and thinking than on what stuff they have.

July 8, 2009

Harlem -- it still stinks

Here's a funny NYT article on the housing bust in Harlem.

In general, places that had strong fundamental value have not been hit so hard by the bust as those where the developers and transplants were betting on the idea that their supreme coolness would transform a shithole into a metropolis.

So, New York hasn't been nearly as devastated as Phoenix or Las Vegas because it's fucking New York City. But overhyping and its subsequent bursting is a fractal phenomenon that you can see at any level of zooming in or out. Thus, even within sturdy New York, the Upper East Side wasn't as overhyped and hasn't suffered as bad a hangover as Harlem -- because Harlem has never had much going for it, while the UES has.

Some of the dopes interviewed by the NYT counter that, no, really, Harlem is still on its way to becoming the Next Big Thing -- after all, look at all the hip new shopping districts we've built! Well, swell, except those are merely a reflection of the irrational exuberance of the bubble years. How many technology start-ups did you found? Or medical research labs? Or anything productive? Harlem is lucky to have Columbia University nearby, which actually does make stuff happen, or it would look even worse.

No wonder video games started sucking in the late '90s

As a follow-up on the post where I traced the horrible present state of video games to the late 1990s, here's a partial explanation: just look at who the audience has been recently, according to the Entertainment Software Association. Read the whole fact sheet -- won't take longer than a few minutes.

The average video game player today is 35 -- about 5 to 10 years past the age when visuospatial skills are at their peak. Fully one-quarter are over 50! No wonder games aren't tests of skill anymore -- that would be like raising the badminton net in the retirement home as its residents are shrinking. Adult women outnumber under-18 boys nearly two to one. And females have visuospatial skills about 1 standard deviation below males.

Put together the huge change in the age and sex make-up of video game players, and the current mediocrity is a little less mysterious. The audience isn't mostly hyperactive, imaginative kids but a bunch of boring grown-ups for whom playing a video game is like popping in a DVD and vegetating on the couch. (The grown-ups who aren't boring are probably not playing video games at all, or are playing something challenging like Blaster Master.)

One encouraging fact is that most video games sold are basically all-ages, swamping the stupid shocker games for 18+ audiences only. (If you want a real thrill, go for a joy ride, shoplift something cheap and pointless just because, or dance with girls in a club.) It's the exact opposite of Hollywood movies, where too many R movies are made, and a huge niche of G and PG movies has been left untapped.

And note that G-rated games (or whatever the rating is) aren't necessarily dopey and juvenile -- the Nintendo had the best success rate at making great games, and there's hardly any gore, sex, or swearing in any of them. Beating people up, sure, but nothing really gory. Hell, Tetris is one of the greatest video games ever, and it's about as inoffensive as you can imagine.

The only downside is that although all-ages games are flourishing, they're still geared toward adults and the entire family playing along. There needs to be a large market for all-ages games that are nevertheless about testing your skills, exploring hard-to-navigate areas, and giving a good whomp to things that get in your way. Young boys today are pretty deprived -- they've got either kiddie games that their helicopter parents won't mind ("Parents report always or sometimes monitoring the games their children play 94% of the time"), or else games that loser 30-something males use to make-believe that they're badass.

July 7, 2009

In pop culture, is demography destiny?

The perennially bothersome and wrong commenter / blogger Whiskey (aka testing99) always leaves comments at various blogs in the Steve-o-sphere suggesting that the poor fool whose post he's commenting on has clumsily neglected the One True Explanation for Everything -- changing demographics.

Demography obviously has the power to influence many aspects of culture. The question is, in a given case, is the purported demographic change true? If not, then we're done. A non-existent change cannot have caused other changes. Recently in a thread about the decline of musicals since at least the 1950s, Whiskey offered a demographic explanation:

It's not the cult of authenticity that killed musical innovation, it's demographics. Not enough White teens to both provide innovators, and the market for it.

From memory (I won't bother looking them all up), he's offered this for a lot of changes between the 1950s and '60s vs. today -- not as many white teenagers. Is this true?

Obviously teenagers make up a smaller share of the population now than they did in the 1960s because of the Baby Boom, but his argument is still wrong. That is because the period when teenagers and early 20-somethings made up the largest chunk of the population was not the 1950s or the '60s at all -- it was around 1980. As someone who reads a bunch of this stuff for graduate school work, I stupidly assume that it's part of the background knowledge of anyone who talks about demography. So let's look in just a bit of detail -- actually, you only need to see two graphs.

First, let's look at the fertility rate over time. Notice that the peak is in the late 1950s and is still within a few percent of the peak through 1960; only after 1960 does it tumble downward. By the time the cohort born in this peak period reaches high school or college, it will be the late '70s or early '80s.

Next, let's look at the age pyramid in 5-year intervals. It's animated without a pause feature, so you may have to sit through it a few times to catch it, but the year when the 15 to 19 and 20 to 24 year-old bars are at their widest is 1980. This says that our prediction based just on fertility was right -- those born during the peak around 1960 did not massively die off, and so young people made up their greatest proportion of the population around 1980. (You can see these pyramids in any good book on demography in your library.)

If such a puny fraction of a smaller population size was sufficient to make musicals and musical innovation marketable through the 1950s, then surely a larger fraction of a larger total population would make them even more marketable in 1980. Except musicals had been dead for decades by that point, and there weren't a whole lot of fundamentally new musical styles. The mistake in the argument is locating when young people made up the greatest part of the population. For the decline in musicals, there must be some non-demographic explanation. [1]

This also shows that demography isn't as all-powerful as we think in a broader sense: the period that we associate with youth rebellion, Hear the Voice of a New Generation, etc., is roughly 1957 to 1969 or so. But if it were sheer size that mattered most, we would look to 1980 as the golden age of teenagers. Not that there isn't something there in the popular awareness -- punk, new wave, Fast Times at Ridgemont High, etc. -- but we certainly don't think of the new wave era as quintessentially youth-dominant as we do The Sixties.

And it only gets worse if we look at the makers of youth culture -- they're not 15 to 24, but usually in their mid-20s. And 25 year-olds were most strongly represented in 1985, decades after The Beatles released Help! If sheer size mattered, we would look at "college rock," which exploded in the mid-'80s and was created mostly by people born in the 1958 - 1964 cohort, as the pinnacle of youth-dominant music-making.

Instead, it is the culture of the cohort that went through an intense hysteria that has persisted for the longest, rather than the culture of the cohort that was biggest in size. Going through a massive social hysteria forces solidarity on socially desperate young people (which is to say, all of them), and more solidaristic people will hold onto their culture for longer. That's why you still see Baby Boomers who look like they've just gotten out of a Weathermen meeting, and whose speech makes it sound like they're organizing a student sit-in. The same is true for Gen X people who still look like they just left a womynist slam poetry reading in a Berkeley coffee shop. They've preserved these things more because they feel a stronger connection to their generation.

The silent generations, on the other hand, never had to go through a hysteria and so don't feel such a strong affinity for their age-mates. No one born in 1959 still wears clothes that they would have at Studio 54 or CBGB's, and their disco-punk-era mannerisms and slang words are gone too. I can assure everyone that my generation, when we're in our 50s, will not still be wearing those retarded carpenter-style jeans of the late '90s, or blasting blink182 or Britney Spears out our car windows.

But I'll wager that in their middle-age years, the Millennials -- who will go through a hysteria within the next 5 to 10 years -- will complain that rock music all went downhill after My Chemical Romance and Fall Out Boy, will still be clinging to the skinny jeans and ballet flats look, and will still use "fml!!!" in their Facebook status updates.

In sum, as far as cultural production and persistence goes, the sheer size of the group matters less than the strength of its bonds. Cohorts with a strong sense of belonging to a Generation will produce more and preserve it for longer, considering it something sacred that you just don't throw away, no matter how smelly it may have gotten. And it is a generalized hysteria that brings socially defenseless young people together in tighter-knit groups.

[1] In his excellent book Hollywood Economics, Arthur De Vany shows that Hollywood consistently ignores the niche for G, PG, and even PG-13 movies, even though they have a higher return on investment than R movies. Maybe Hollywood people prefer making edgy R movies to cheesy G movies -- or whatever -- but there is a huge void there, and musicals would surely fall into this untapped niche.

July 6, 2009

Defining who Millennials are by cultural changes

Although I'm very glad not to be part of Generation X, that doesn't mean I'm automatically a Millennial. The common but wrong view of generations is that there's one big generation that lasts for some time, then another big one takes its place, and so on. In reality there is a cycle, but it is between attention-whore generations and silent generations. In fact, one of the supposedly big generations is called just that -- The Silent Generation. As another example, in between the Baby Boomers and Generation X there is another silent generation, born from roughly 1958 or '59 to 1963 or '64, a typical member being born in 1962. Steve Sailer, Alias Clio, and Barack Obama are part of this silent generation.

As a 28-year-old, I notice clearly that I'm not Gen X (again, thank god). But though they're close to me in age and I have friends among their group, I'm not part of the Millennials either. And neither are the people just below me in age. We're a silent generation too (born roughly 1979 or '80 to 1986).

Actually, no one is fully a Millennial yet, because it takes a catalyzing hysteria to make an attention-whore generation -- the mid-to-late '60s for the Baby Boomers, the early '90s for Generation X. This hysteria leads to young people demanding that everyone drop whatever they're doing and "hear the voice of a new generation." (Booorrrrinnggg.) That probably won't happen again until sometime in the middle of the next decade. Still, they're already out there, just waiting to hatch and go on another feminist, identity-politics rampage. It's actually kind of scary.

Anyway, the group that will be affected by the next hysteria -- those who will be 15 to 24 years old at the time -- will have been born between roughly 1987 and 2000. In 20 years, something like this will be common knowledge -- a cliche, even:

dude, SO glad I was born in 2005 instead of 1994 and have to go through all that leftist bullshit in college...

yeah i know, i would've been like, "omg, fml!" -- that's what they said right? buncha fags.


But before they hatch, is there any way to tell where the break in birth years falls? I think so. This will be part of an ongoing series, but for now I'll just look at smoking. Unlike violent crime and rape, drug use did not decline until about 2000 or 2001 -- and of course it's mostly young people smoking dope. Unlike crime, smoking is more of a health or lifestyle choice -- perhaps an ethnic marker that people in your group use to identify each other (whether your group smokes or doesn't).

The large, nationally representative sample from the Youth Risk Behavior Survey shows us how much trouble high schoolers are getting into -- sex, drugs, junk food, etc. There are lots of questions about smoking (see here), but they all show the same pattern over time. Here is the percent of students who have ever smoked, even just one or two puffs:

[Chart: percent of high school students who have ever smoked a cigarette, by survey year]
There's basically no change until 2001, when the smoking rate starts plummeting. Young people have made a huge shift in deciding that smoking isn't their thing -- that's something old people do. You figure it's the youngest students paving the way, since the choice to take your first puff or not comes pretty early -- say, at 14, as a high school freshman. If this pioneering group was 14 in 2001, then they were born in 1987 -- which is exactly what I guessed by forecasting the next big social hysteria and working backwards to see which birth years would be affected by it and so transformed into an attention-whore generation.
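
For anyone who wants the arithmetic spelled out, here's a minimal back-of-the-envelope sketch in Python (my choice of language, not anything from the post). The 2001 break in the trend and the age-14 first puff come from above; the 2011-2015 hysteria window is an assumption on my part, a reading of "the middle of the next decade" chosen to line up with the 1987-2000 birth range given earlier.

# Back out the earliest Millennial birth year two ways.
DECLINE_START = 2001            # year the "ever smoked" rate starts plummeting
AGE_AT_FIRST_PUFF = 14          # high school freshman facing the first-puff choice

HYSTERIA_WINDOW = (2011, 2015)  # assumed reading of "the middle of the next decade"
AFFECTED_AGES = (15, 24)        # ages swept up in a generational hysteria

# Smoking route: the pioneers who turned away from smoking at 14 in 2001.
smoking_cutoff = DECLINE_START - AGE_AT_FIRST_PUFF

# Hysteria route: whoever will be 15 to 24 during the assumed window.
hysteria_cohort = (HYSTERIA_WINDOW[0] - AFFECTED_AGES[1],
                   HYSTERIA_WINDOW[1] - AFFECTED_AGES[0])

print(smoking_cutoff)   # 1987
print(hysteria_cohort)  # (1987, 2000)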

I chose smoking rather than other things like crime because it's a lot closer to a group membership badge than stealing or killing is -- "is it gonna make me look cool or not?" There are other things I've found, which I'll write more about later, and they also point to roughly the late '80s as the earliest birth years of the Millennials.

BTW, there's some practical advice here for when a bunch of barely legal girls come up to talk to you at '80s night (or perhaps not even legal, if it's at the mall): make sure you don't smoke, or at least don't do it while they're around. It'll instantly give your age away. I've never smoked, and high school and college girls always guess that I'm 22 or 23 anyway, so no big worry. But if you're a smoker, you'll seem as out of place among them as a pipe smoker would among anyone under 80.

July 2, 2009

Second thoughts about the Terminator sequel

While making and eating dinner a while ago, I overheard most of Terminator 2: Judgment Day, which my housemate and his friend were watching in the next room. I thought, "God, was it really this stupid and preachy?" I wouldn't have noticed when I saw it as a kid, but now it was hard to ignore. I finally found the first Terminator movie and watched it tonight. It blows the sequel out of the fucking water.

The first was made in 1984, while the sequel was made in 1991, during the peak of the social hysteria -- political correctness, identity politics and Rodney King, Third Wave feminism, etc. I don't like it when critics read too much into what a work says about the larger culture it was made in, but when a movie is made during a cultural and social hysteria, the imprint is hard to ignore. It really crippled what could have been a great sequel, and below are a few off-hand examples of how the Generation X-era movie pales in comparison to the original from the New Wave / Reagan landslide period.

- Let's just get it out of the way: in T2, the computer programming genius who invents what will become the technology smart enough to take over mankind -- is black. Not Ashkenazi Jewish or South Asian -- but black. These lame attempts to "provide good role models" don't fool anyone. Even looking just at really smart blacks, they aren't very interested in applying their talent to programming computers. Being a geek is just not a black thing. You can bet that if this character were written as a mad scientist type -- rather than an unwitting creator -- he would be blonde-haired and blue-eyed.

In contrast, the only black character in the first movie is a police officer who looks after Sarah Connor and is courageous enough to lose his life trying to stop the terminator once it starts shooting the fuck out of the police station. Sounds like more of a role model to me -- but then, being a police officer wouldn't motivate young black people to get trapped in the education bubble for 4+ years, the way that programming computers would. Yale or jail.

- In the sequel, Sarah Connor has transformed from a vulnerable, feminine waitress into a muscular, hyper-disciplined warrior. Again with the positive role model bullshit. Just let girls be girls and stop twisting their arms to get them to join the army or the mechanical engineering field. Aside from how inherently irritating all feminist propaganda is, this switch ruins much of the story. After all, we are so afraid for the terminator's victims in the first movie because they're so helpless, including Sarah Connor. By making her butch, we don't feel like she's in that much danger anymore -- we're just waiting to see which evenly matched badass character will come out on top.

- That annoying wannabe Gen X-er who's supposed to be the future savior of humanity. Let's see, having to suffer his voice and attitude vs. killing him off and humanity along with him -- it's actually not an easy choice.

- Infantilizing the Arnold terminator. What they were going for here was some kind of Give Peace a Chance dipshittery -- indeed, one of the final lines is Sarah Connor saying something to the effect of, "If we can teach a machine to love, then I have hope for humanity." But they haven't really reformed or transformed him -- he starts out completely clueless and is tutored by that punk kid about what's right and wrong.

This is not at all like Frankenstein's monster, who commits horrible crimes and feels morally conflicted as a result. If they had first shown Arnold killing a bunch of innocent kids just because they got in his way, then we would believe that the Connors had truly changed him by the end. As it stands, his character is just a big robotic baby -- pathetic.

- The new evil terminator, the T-1000, isn't frightening at all. Rather, he seems like a garden-variety sociopath. In the first movie, the terminator doesn't craft a stealthy plan to kill John Connor's mother -- he simply looks up "Sarah Connor" in the phone book and blasts each of them to hell in order. That's what a fucking terminator does. That you could die in such a way is a bit unnerving, not to mention the fate of the scores of policemen and innocent bystanders in the nightclub whom the terminator mows down. But very few innocents get killed in T2, except for those who actively get in the T-1000's way, so there's very little sense of "it could happen to me."

- The sequel relies too much on gimmicky special effects to show how cruel the T-1000 is -- turning his arm into a long blade and stabbing John's foster father through the head, or turning his finger into a spike that he shoves through a security guard's eye. It's somewhat gory, but we don't feel that he's particularly cruel. The best shot that establishes how cold-hearted the terminator is in the first movie consists only of Arnold driving his car over a small boy's toy truck, crushing it. This also calls back to the shots of human skulls being run over by the tank tracks of the hunter-killer machines of the future. And as far as gore goes, punching a street punk through his gut, lifting him up, and ripping out his heart is a lot more badass.

- In general, the sequel is too optimistic -- Hope and Change. We watch a group of heroes on a mission to stop the horrible technology from being invented in the first place, and they apparently succeed. Again, it's too Si Se Puede! In the first movie, we see lots of shots of the future -- and it looks like hell. Even the present looks pretty grimy, which wasn't too difficult to do in 1984, when crime and urban decay were still on the rise. Still, 1991 was the peak year of violent crime -- they could easily have emphasized how things still appeared to be going downhill by featuring street gangs, seedy nightclubs, alcohol-blinded street urchins, and all the rest that gives the first movie its gritty feel.

The whole point of the movie is that we thoughtlessly got ourselves into a big mess and may or may not get out of it. At the end of the first movie, a boy tells Sarah Connor that there's a storm coming, and she merely says, "I know," and rides toward it. Who knows what will happen? Rather than having a clear sense that the good guys are going to prevail, and that blacks and whites will all just get along after we teach terminators to cry, we're left feeling no more certain about humanity's fate than before. The first movie didn't offer the audience any of that schmaltz.

While the sequel did much better at the box office, only the first one is listed in the National Film Registry. The third one I saw on DVD a while back, but I can't remember enough of it to comment on it. I just remember that it was forgettable. I haven't seen the new one either, but based on the reviews and word-of-mouth I've gotten, I'll wait till DVD (if then). A really good terminator movie needs a bleak cultural milieu to make it work. With things having been so great for a while now, it's unlikely that we'll get another good movie in the series anytime soon.

July 1, 2009

Quick thoughts about Fast Times at Ridgemont High
