July 13, 2009

And you thought the PlayStation 3 was expensive...

When it was released nearly three years ago, Sony's PlayStation 3 video game console cost either $500 or $600, depending on how big the hard drive was. It sounded like a lot of money because we assume that dollars are fixed in value, when in reality inflation or deflation changes how much a dollar is worth all the time. Setting aside the consoles that were trying to be full arcade games for the home, or that made video game playing only one piece of their larger package of multimedia features, what was the most expensive video game system after adjusting for inflation?

It was Mattel's Intellivision, a mainstream competitor of Atari that was released in 1980 for $300 -- or for $746 in 2007 dollars. That's only 23% cheaper than the Neo Geo -- a home arcade system!

I know there are already charts of how much video game consoles cost in nominal and real terms across the years, but I've never seen a good scatter-plot or time-series plot. So here they are for 45 consoles (this time including the home arcade and multimedia ones), from Magnavox's Odyssey in 1972 to Sony's PS3 in 2006 (full table here):


I adjusted for inflation by using the Consumer Price Index, since after all these are consumer goods. I got the data from this chronology of video games, which is extensively referenced.
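For anyone who wants to redo the adjustment, here's a minimal Python sketch of the same idea, using approximate annual-average CPI-U values that I'm plugging in myself (the table behind the charts may use a slightly different index or base month, so these outputs land within a few dollars of the figures quoted above rather than matching them exactly):

```python
# Approximate annual-average CPI-U values (my own ballpark figures).
CPI_U = {1972: 41.8, 1980: 82.4, 1990: 130.7, 2000: 172.2, 2006: 201.6, 2007: 207.3}

def to_2007_dollars(nominal_price, launch_year):
    """Scale a launch price by the ratio of the 2007 CPI to the launch-year CPI."""
    return nominal_price * CPI_U[2007] / CPI_U[launch_year]

print(round(to_2007_dollars(300, 1980)))  # Intellivision, 1980: roughly $750
print(round(to_2007_dollars(600, 2006)))  # PS3 60GB, 2006: roughly $620
```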

Notice that the nominal prices have gone steadily up -- there doesn't seem to be nominal price "rigidity" here, as though we had some psychological barrier against the creeping-upward prices on the tag. Well, at least until a mainstream console hits $1000 -- then we'll see.

In the plot of real prices, the typical consoles have gotten cheaper over time, probably due to economies of scale, memory getting cheaper, or some other technological advancement. Also notice that prices appear to be lowest when there's only one company with most of the market share -- in the mid-to-late '80s, when Nintendo was the only game in town, and in the early 2000s, when the PlayStation 2 cleared out Nintendo's GameCube and Microsoft's Xbox. I don't have good market share data from the 1970s, but this appears to hold there too, since Atari was introduced in 1977 and didn't see real competition until the early 1980s.

This contradicts the idea that just having a huge market share makes you act like a bad monopoly -- under that idea, prices should be higher when there's only one dominant player. Instead, the period when there was essentially one system to play -- 1986 to about 1991 -- was characterized by an incredibly low-priced console, the NES. It's not until the eruption of consoles during the 16-bit era and just after that prices start shooting up again.

And Nintendo, despite dominating the market for most of its existence, has always kept its prices low. The real price of the NES was $238. For the Super Nintendo, it was $301; $235 for the GameCube; and $257 for the Wii. Sony's consoles, by contrast, have ranged from roughly $350 to $500.

I've got similar data that I'll post soon for the handheld consoles, but it looks pretty similar.

So if you thought video game nerds today spend too much money on their vice, just imagine how much of their paycheck people forked over to buy an Intellivision. I don't remember any of it first-hand, but the video game craze of the late '70s and early '80s was something else (maybe it was the drugs). As I showed in a graph in this post, arcade game sales peaked in 1981 and have never been matched since -- not even by home console sales, at least through 2002, notwithstanding how well the PS2 was selling then. In fact, it was only in 2001, after the PS2 caught on, that home sales returned to their level before the video game crash of 1983. It's just too bad there weren't very many games for the PS2 back then -- just a bunch of movies.

July 10, 2009

Economists who aren't clueless

You may have already read Stan Liebowitz's "Anatomy of a Train Wreck" article about the mortgage meltdown (Steve Sailer publicized it), but he's written a lot of other great stuff, mostly about intellectual property and technology. Here's his webpage. If you don't know much about the real history of Microsoft's ascendancy -- and that includes just about everyone, since Microsoft is more of a folk devil than an existing company in most people's minds -- you should read through his very clear and data-packed articles there. Your library should have some of his books too.

Aside from reading about Microsoft, you can also discover that those academic urban legends about the Dvorak keyboard and Betamax tapes being superior to the QWERTY and VHS alternatives are bogus. We haven't been locked in to inferior standards at all.

July 9, 2009

Millennials watching less TV, guys trying more to lose weight

Earlier we saw that there was a sharp change in the smoking habits of high schoolers around 2001, suggesting that people born in the late 1980s are the first of the Millennials -- not "anyone born in 1980 or after," as you commonly hear. Here are two more pieces of evidence for this lifestyle change occurring in the early 2000s, also from the Youth Risk Behavior Survey.

First, there's something related to the impression that young guys today are more effeminate in their preferences, hopes, anxieties, looks, and so on, compared to before -- removal of body hair, for example. The main worry that a teenage girl has is that she's too fat -- that will never change -- but there's been a shift in the share of young males who are trying to lose weight (bars show 95% confidence intervals):


There's no change in the percent of females who are trying to get thinner, but there's a clear break among males between the 1991-1997 values and the 2001-2007 values, with 1999 marking a transition year. The bars within the first period all overlap a lot, as do those within the second, but the second period is clearly higher than the first. It's as if a switch were flipped on sometime during 1999 to 2001, again implicating the late '80s as the birth years of the new cultural group that's visible in the second period of high school dudes.
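For the curious, those error bars come from the usual confidence interval for a proportion. Here's a minimal sketch, assuming a simple random sample and the normal approximation -- the real YRBS uses a complex survey design, so its published intervals are wider, and the sample size below is hypothetical:

```python
import math

def wald_ci_95(p_hat, n, z=1.96):
    """95% confidence interval for a proportion under the normal approximation."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# e.g. 25% of a hypothetical 5,000 male respondents trying to lose weight
low, high = wald_ci_95(0.25, 5000)
print(f"{low:.3f} to {high:.3f}")  # roughly 0.238 to 0.262
```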

Elsewhere on this blog, I've documented how young people today have almost nothing to do. So you'd figure that they'd at least be watching more TV -- but nope, they're not even doing that:


Again there's a clear break, in that the 2001 to 2007 groups overlap a lot, but are essentially all below the 1999 value. Once more it's people born in the late '80s and after who have markedly shifted their lifestyle choices from those of young people before.

It's a mistake to put too much emphasis on the new technologies that some generation grew up taking for granted. That doesn't make them into a coherent generation because it says nothing about how people use them -- only that they're background features. Things like smoking, trying to lose weight, and allocating time to watching TV tell us more about what people want, or at least how they're trying to fit in. Generational analysis needs to focus more on what people are doing and thinking than on what stuff they have.

July 8, 2009

Harlem -- it still stinks

Here's a funny NYT article on the housing bust in Harlem.

In general, places that had strong fundamental value have not been hit so hard by the bust as those where the developers and transplants were betting on the idea that their supreme coolness would transform a shithole into a metropolis.

So, New York hasn't been nearly as devastated as Phoenix or Las Vegas because it's fucking New York City. But overhyping and its subsequent bursting is a fractal phenomenon that you can see at any level of zooming in or out. Thus, even within sturdy New York, the Upper East Side wasn't as overhyped and hasn't suffered as bad a hangover as Harlem -- because Harlem has never had much going for it, while the UES has.

Some of the dopes interviewed by the NYT counter that, no, really, Harlem is still on its way to becoming the Next Big Thing -- after all, look at all the hip new shopping districts we've built! Well, swell, except those are merely a reflection of the irrational exuberance of the bubble years. How many technology start-ups did you found? Or medical research labs? Or anything productive? Harlem is lucky to have Columbia University nearby, which actually does make stuff happen, or it would look even worse.

No wonder video games started sucking in the late '90s

As a follow-up on the post where I traced the horrible present state of video games to the late 1990s, here's a partial explanation: just look at who the audience has been recently, according to the Entertainment Software Association. Read the whole fact sheet -- won't take longer than a few minutes.

The average video game player today is 35 -- about 5 to 10 years past the age when visuospatial skills are at their peak. Fully one-quarter are over 50! No wonder games aren't tests of skill anymore -- that would be like raising the badminton net in the retirement home as its residents are shrinking. Adult women outnumber under-18 boys by nearly two to one. Females have visuospatial skills about 1 standard deviation below males.

Put together the huge change in the age and sex make-up of video game players, and the current mediocrity is a little less mysterious. The audience isn't mostly hyperactive, imaginative kids but a bunch of boring grown-ups for whom playing a video game is like popping in a DVD and vegetating on the couch. (The grown-ups who aren't boring are probably not playing video games at all, or are playing something challenging like Blaster Master.)

One encouraging fact is that most video games sold are basically all-ages, swamping the stupid shocker games for 18+ audiences only. (If you want a real thrill, go for a joy ride, shoplift something cheap and pointless just because, or dance with girls in a club.) It's the exact opposite of Hollywood movies, where too many R movies are made, and a huge niche of G and PG movies has been left untapped.

And note that G-rated games (or whatever the rating is) aren't necessarily dopey and juvenile -- the Nintendo had the best success rate at making great games, and there's hardly any gore, sex, or swearing in any of them. Beating people up, sure, but nothing really gory. Hell, Tetris is one of the greatest video games ever, and it's about as inoffensive as you can imagine.

The only downside is that although all-ages games are flourishing, they're still geared toward adults and the entire family playing along. There needs to be a large market for all-ages games that are nonetheless about testing your skills, exploring hard-to-navigate areas, and giving a good whomp to things that get in your way. Young boys today are pretty deprived -- they've got either kiddie games that their helicopter parents won't mind ("Parents report always or sometimes monitoring the games their children play 94% of the time"), or else games that loser 30-something males use to make-believe that they're badass.

July 7, 2009

In pop culture, is demography destiny?

The perennially bothersome and wrong commenter / blogger Whiskey (aka testing99) always leaves comments at various blogs in the Steve-o-sphere suggesting that the poor fool whose post he's commenting on has clumsily neglected the One True Explanation for Everything -- changing demographics.

Demography obviously has the power to influence many aspects of culture. The question is, in a given case, is the purported demographic change true? If not, then we're done. A non-existent change cannot have caused other changes. Recently in a thread about the decline of musicals since at least the 1950s, Whiskey offered a demographic explanation:

It's not the cult of authenticity that killed musical innovation, it's demographics. Not enough White teens to both provide innovators, and the market for it.

From memory (I won't bother looking them all up), he's offered this for a lot of changes between the 1950s and '60s vs. today -- not as many white teenagers. Is this true?

Obviously teenagers make up a smaller share of the population now than in the 1960s because of the Baby Boom, but his argument is still wrong. That is because the period when teenagers and early 20-somethings made up the largest chunk of the population was not the 1950s or the '60s at all -- it was around 1980. As someone who reads a bunch of this stuff for graduate school work, I stupidly assume that it's part of the background knowledge of anyone who talks about demography. So let's look in just a bit of detail -- actually, you only need to see two graphs.

First, let's look at the fertility rate over time. Notice that the peak is in the late 1950s and is still within a few percent of the peak through 1960; only after 1960 does it tumble downward. By the time the cohort born in this peak period reaches high school or college, it will be the late '70s or early '80s.

Next, let's look at the age pyramid in 5-year intervals. It's animated without a pause feature, so you may have to sit through it a few times to catch it, but the year when the 15 to 19 and 20 to 24 year-old bars are at their widest is in 1980. This says that our prediction based just on fertility was right -- those born during the peak around 1960 did not massively die off, and so young people were greatest in proportion around 1980. (You can see these pyramids in any good book on demography in your library.)
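The arithmetic behind that is trivial, but here's a tiny Python sketch of it anyway, taking the peak birth years from the fertility chart (roughly 1957 to 1961, by my reading) and asking when those cohorts occupy the 15 to 24 band:

```python
def years_in_band(birth_year, low_age=15, high_age=24):
    """Calendar years during which a birth cohort is between low_age and high_age."""
    return birth_year + low_age, birth_year + high_age

for birth_year in (1957, 1960, 1961):
    start, end = years_in_band(birth_year)
    print(f"born {birth_year}: aged 15-24 from {start} to {end}")
# born 1957: aged 15-24 from 1972 to 1981
# born 1960: aged 15-24 from 1975 to 1984
# born 1961: aged 15-24 from 1976 to 1985
# -> the 15-24 band is fullest right around 1980, matching the pyramid.
```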

If such a puny fraction of a smaller total population was sufficient to make musicals and musical innovation marketable through the 1950s, then surely a larger fraction of a larger total population would make them even more marketable in 1980. Except musicals had been dead for decades by that point, and there weren't a whole lot of fundamentally new musical styles. The mistake in the argument is misdating when young people made up the greatest part of the population. For the decline in musicals, there must be some non-demographic explanation. [1]

This also shows that demography isn't as all-powerful as we think in a broader sense: the period that we associate with youth rebellion, Hear the Voice of a New Generation, etc., is roughly 1957 to 1969 or so. But if it were sheer size that mattered most, we would look to 1980 as the golden age of teenagers. Not that there isn't something there in the popular awareness -- punk, new wave, Fast Times at Ridgemont High, etc. -- but we certainly don't think of the new wave era as quintessentially youth-dominant as we do The Sixties.

And it only gets worse if we look at the makers of youth culture -- they're not 15 to 24, but usually in their mid-20s. And 25 year-olds were represented the strongest in 1985, decades after The Beatles released Help! If sheer size mattered, we would look at "college rock," which exploded in the mid-'80s and was created mostly by people born in the 1958 - 1964 cohort, as the pinnacle of youth-dominant music-making.

Instead, it is the culture of the cohort that went through an intense hysteria that has persisted for the longest, rather than the culture of the cohort that was biggest in size. Going through a massive social hysteria forces solidarity on socially desperate young people (which is to say, all of them), and more solidaristic people will hold onto their culture for longer. That's why you still see Baby Boomers who look like they've just gotten out of a Weathermen meeting, and whose speech makes it sound like they're organizing a student sit-in. The same is true for Gen X people who still look like they just left a womynist slam poetry reading in a Berkeley coffee shop. They've preserved these things more because they feel a stronger connection to their generation.

The silent generations, on the other hand, never had to go through a hysteria and so don't feel such a strong affinity for their age-mates. No one born in 1959 still wears clothes that they would have at Studio 54 or CBGB's, and their disco-punk-era mannerisms and slang words are gone too. I can assure everyone that my generation, when we're in our 50s, will not still be wearing those retarded carpenter-style jeans of the late '90s, or blasting blink182 or Britney Spears out our car windows.

But I'll wager that in their middle-age years, the Millennials -- who will go through a hysteria within the next 5 to 10 years -- will complain that rock music all went downhill after My Chemical Romance and Fall Out Boy, will still be clinging to the skinny jeans and ballet flats look, and will still use "fml!!!" in their Facebook status updates.

In sum, as far as cultural production and persistence goes, sheer size of the group matters less than the strength of its bonds. Cohorts with a strong sense of belonging to a Generation will produce more and preserve it for longer, considering it something sacred that you just don't throw away, no matter how smelly it may have gotten. And it is a generalized hysteria that brings socially defenseless young people together in tighter-knit groups.

[1] In Arthur De Vany's excellent book Hollywood Economics, he shows that Hollywood consistently ignores the niche for G, PG, and even PG-13 movies, even though they have higher return-on-investment than R movies. Maybe Hollywood people prefer making edgy R movies over cheesy G movies -- or whatever -- but there is a huge void there, and musicals would surely fall under this untapped niche.

July 6, 2009

Defining who Millennials are by cultural changes

Although I'm very glad not to be part of Generation X, that doesn't mean I'm automatically a Millennial. The wrong and common view of generations is that there's a big generation that lasts for some time, and then another big one takes its place, and so on. In reality, there is a cycle, but it is between attention-whore generations and silent generations. In fact, one of the supposedly big generations is called just that -- The Silent Generation. As another example, in between Baby Boomers and Generation X, there is another silent generation, born from roughly 1958 or '59 to 1963 or '64, a typical one being born in 1962. Steve Sailer, Alias Clio, and Barack Obama are part of this silent generation.

As a 28 year-old, I notice clearly that I'm not Gen X (again, thank god). But though they're close to me and I have friends among their group, I'm not part of the Millennials either. And neither are people who are just below me in age. We're a silent generation too (roughly 1979 or '80 to 1986).

Actually, no one is fully a Millennial yet because it takes a catalyzing hysteria to make an attention-whore generation -- the mid-late '60s for Baby Boomers, the early '90s for Generation X. This hysteria leads to young people demanding that everyone drop everything they're doing and "hear the voice of a new generation." (Booorrrrinnggg.) This probably won't happen again until sometime in the middle of the next decade. Still, they're already out there just waiting to hatch and go on another feminist, identity politics rampage. It's actually kind of scary.

Anyway, the group that will be affected by the next hysteria -- those who will be 15 to 24 year-olds at the time -- will have been born between roughly 1987 and 2000. In 20 years, something like this will be common knowledge -- a cliche, even:

dude, SO glad I was born in 2005 instead of 1994 and have to go through all that leftist bullshit in college...

yeah i know, i would've been like, "omg, fml!" -- that's what they said right? buncha fags.


But before they hatch, is there any way to tell where the break in the birth year is? I think so. This will be part of an ongoing series, but for now I'll just look at smoking. Unlike violent crime and rape, drug use did not decline until about 2000 or 2001. And of course it's mostly young people smoking dope. Unlike crime, smoking is more of a health or lifestyle choice -- perhaps an ethnic marker that people in your group use to identify each other (whether your group smokes or doesn't).

The large nationally representative sample from the Youth Risk Behavior Survey shows us how much trouble high schoolers are getting into, for various types of trouble -- sex, drugs, junk food, etc. There are lots of questions about smoking (see here), but they all show the same pattern over time. Here is the percent of students who have ever smoked, even just one or two puffs:


There's basically no change until 2001, when the smoking rate starts plummeting. Young people have made a huge shift in deciding that smoking isn't their thing -- old people do that. You figure it's the young students paving the way, since you're faced with the choice to take your first puff or not pretty early -- say, 14, a high school freshman. If this pioneering group was 14 in 2001, then they were born in 1987 -- which is what I guessed just by forecasting the next big social hysteria and working backwards to guess what birth years would be affected by it and so be transformed into an attention-whore generation.
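Here's that back-of-the-envelope reasoning as a few lines of Python; the first-puff age and the hysteria window are my assumptions from the discussion above, not anything reported in the survey:

```python
BREAK_YEAR = 2001       # year the "ever smoked" rate starts plummeting
FIRST_PUFF_AGE = 14     # assumed age at which the smoke-or-not choice is first faced

print(BREAK_YEAR - FIRST_PUFF_AGE)  # 1987: earliest Millennial birth year, by this logic

# Working forward instead: a hypothetical hysteria running from about 2011 to 2015
# would catch 15-24 year-olds born between 2011 - 24 and 2015 - 15, i.e. 1987 to 2000.
hysteria_start, hysteria_end = 2011, 2015
print(hysteria_start - 24, "to", hysteria_end - 15)  # 1987 to 2000
```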

I chose smoking rather than other things like crime because it's a lot closer to a group membership badge than stealing or killing is -- "is it gonna make me look cool or not?" There are other things I've found, which I'll write more about later, and they also point to roughly the late '80s as the earliest birth years of the Millennials.

BTW, there's some practical advice for when a bunch of barely legal girls come up to talk to you at '80s night (or perhaps not even legal if it's at the mall): make sure you don't smoke, or at least don't do it during that time. It'll instantly give your age away. I've never smoked, and high school and college girls always guess that I'm 22 or 23 anyway, so no big worry. But if you're a smoker, you'll seem as out of place among them as a pipe smoker would among anyone under 80.

July 2, 2009

Second thoughts about the Terminator sequel

While making and eating dinner awhile ago, I overheard most of Terminator 2: Judgment Day, which my housemate and his friend were watching in the next room. I thought, "God, was it really this stupid and preachy?" I wouldn't have known when I saw it as a kid, but it was hard to ignore. I finally found the first Terminator movie and watched it tonight. It blows the sequel out of the fucking water.

The first was made in 1984, while the sequel was made in 1991, during the peak of the social hysteria -- political correctness, identity politics and Rodney King, Third Wave feminism, etc. I don't like when critics read too much into what the work says about the larger culture in which it was made, but when it was made during a cultural and social hysteria, the imprint is hard to ignore. It really crippled what could have been a great sequel, and below are a few off-hand examples of how the Generation X era movie paled in comparison to the original from the New Wave / Reagan landslide period.

- Let's just get it out of the way: in T2, the computer programming genius who invents what will become the technology that is so smart that it takes over mankind -- is black. Not Ashkenazi Jewish or South Asian -- but black. These lame attempts to "provide good role models" don't fool anyone. Even looking just at really smart blacks, they aren't very interested in applying their talent to programming computers. Being a geek is just not a black thing. You can bet that if this character were written as a mad scientist type -- rather than an unwitting creator -- he would be blonde-haired and blue-eyed.

In contrast, the only black character in the first movie is a police officer who looks after Sarah Connor and is courageous enough to lose his life trying to stop the terminator once it starts shooting the fuck out of the police station. Sounds like more of a role model to me -- but then, being a police officer wouldn't motivate young black people to get trapped in the education bubble for 4+ years, the way that programming computers would. Yale or jail.

- In the sequel, Sarah Connor has transformed from a vulnerable, feminine waitress into a muscular, hyper-disciplined warrior. Again with the positive role model bullshit. Just let girls be girls and stop twisting their arms to get them to join the army or the mechanical engineering field. Aside from how inherently irritating all feminist propaganda is, this switch ruins much of the story. After all, we are so afraid for the terminator's victims in the first movie because they're so helpless, including Sarah Connor. By making her butch, we don't feel like she's in that much danger anymore -- we're just waiting to see which evenly matched badass character will come out on top.

- That annoying wannabe Gen X-er who's supposed to be the future savior of humanity. Let's see, having to suffer his voice and attitude vs. killing him off and humanity along with him -- it's actually not an easy choice.

- Infantilizing the Arnold terminator. What they were going for here was some kind of Give Peace a Chance dipshittery -- indeed, one of the final lines is Sarah Connor saying something to the effect of, "If we can teach a machine to love, then I have hope for humanity." But they haven't really reformed or transformed him -- he starts out completely clueless and is tutored by that punk kid about what's right and wrong.

This is not at all like Frankenstein's monster, who commits horrible crimes and feels morally conflicted as a result. If they had first shown Arnold killing a bunch of innocent kids just because they got in his way, then we would believe that the Connors had truly changed him by the end. As it stands, his character is just a big robotic baby -- pathetic.

- The new evil terminator, the T-1000, isn't frightening at all. Rather, he seems like a garden variety sociopath. In the first movie, the terminator doesn't craft a stealthy plan to kill John Connor's mother -- he simply looks up "Sarah Connor" in the phone book and blasts each of them to hell in order. That's what a fucking terminator does. That you could die in such a way is a bit unnerving, not to mention the fate of the scores of policemen and innocent bystanders in the nightclub who the terminator mows through. But very few innocents get killed in T2, except for those who actively get in the T-1000's way, so there's very little sense of "it could happen to me."

- The sequel relies too much on gimmicky special effects to show how cruel the T-1000 is -- turning his arm into a long blade and stabbing John's adoptive father through the neck, or turning his finger into a spike that he shoves through a security guard's eye. It's somewhat gory, but we don't feel that he's particularly cruel. The best shot that establishes how cold-hearted the terminator is in the first movie consists only of Arnold driving his car over a small boy's toy truck, crushing it. This also calls back to the shots of human skulls being run over by the tank tracks of the hunter-killer machines of the future. And as far as gore goes, punching a street punk through his gut, lifting him up, and ripping out his heart is a lot more badass.

- In general, the sequel is too optimistic -- Hope and Change. We watch a group of heroes on a mission to stop the horrible technology from being invented in the first place, and they apparently succeed. Again, it's too Si Se Puede! In the first movie, we see lots of shots of the future -- and it looks like hell. Even the present looks pretty grimy, which wasn't too difficult to do in 1984, when crime and urban decay were still on the rise. Still, 1991 was the peak year of violent crime -- they could have easily emphasized how things appeared to be going downhill already by featuring street gangs, seedy nightclubs, alcohol-blinded street urchins, and all the rest that gives the first movie its gritty feel.

The whole point of the movie is that we thoughtlessly got ourselves into a big mess and may or may not get out of it. At the end of the first movie, a boy tells Sarah Connor that there's a storm coming, and she merely says, "I know," and rides toward it. Who knows what will happen? Rather than having a clear sense that the good guys are going to prevail, and that blacks and whites will all just get along after we teach terminators to cry, we're left feeling no more certain about humanity's fate than before. The first movie didn't offer the audience any of that schmaltz.

While the sequel did much better at the box office, only the first one is listed in the National Film Registry. The third one I saw on DVD awhile back, but I can't remember enough of it to comment on it. I just remember that it was forgettable. I haven't seen the new one either, but based on the reviews and word-of-mouth I've gotten, I'll wait till DVD (if then). A really good terminator movie needs a bleak cultural milieu to make it work. With things having been so great for awhile now, it's unlikely that we'll get another good movie in the series again.

July 1, 2009

Quick thoughts about Fast Times at Ridgemont High
