In case you haven't heard, a superstar Harvard scientist has been found responsible for research misconduct, possibly including fabrication of data, which has led to a key journal article being retracted. Rather than some minor fudge that can be easily set straight, the comments from those in the field -- primate cognition -- make it sound like an entire edifice could crumble. How could just one bad apple spoil the bunch?
The answer comes from Nassim Taleb's view on the fragility vs. robustness of complex systems, and the accelerating per-unit costs of correcting harms as those harms get bigger. Go to his webpage and read around, listen to his interviews on the EconTalk podcast, or read the afterword to the paperback edition of The Black Swan. The ideas are spread throughout. Here's how it works in this case:
With the advent of the modern research university, work on a given research problem gets concentrated into fewer and fewer hands as academics over-over-over-specialize. Before, they were generalist enough to understand most problems, have something to say about them or carry out work on them, and interpret and judge other people's thoughts and findings on them. Academics have always been prone to unintentionally seeing their data with biased eyes, to maliciously faking the data, and to having their equipment crap out and give them a faulty reading.
But before, there were, say, 100 academics working on some problem, so that if any one of them made a serious error, the others could correct it at a small cost -- by trying to replicate the results themselves, asking for the data to re-analyze it themselves, proposing rival explanations, pointing out logical flaws (which would still require enough knowledge of the problem to see), and so on. So, if the system is moving in the direction of greater truth-discovering, and we perturb it by having one of these 100 academics make an error, the system goes back to where it was and the error doesn't spiral out of control. Also, the errors from various academics don't compound each other -- that would require the errors to belong to some larger framework, vision, or coordination that wove them together and made them interact.
However, in the modern setting of hyper-specialization, there are very few qualified to correct the errors of the one academic (or maybe a small handful) who are working on the problem. They mostly have to take the raw data and the interpretations on faith -- deferring to expert consensus is out since, again, so very few are qualified that the law of large numbers cannot work to yield an unbiased consensus. Thus, when a crucial error is made, there are few external checks to dampen it, and its harm grows and grows as it becomes taken for granted among the broader academic fields that cite the erroneous article.
Moreover, these errors compound each other because they all come from the same academic, not a group of unrelated researchers. In his vision, his big ideas all mesh together and reinforce each other; most academics don't toy with a bunch of totally unrelated ideas, but instead seek to build an interlocking tower of ideas.
Let's say that a total of 4 errors are made relating to some problem, and that they go uncorrected for a while -- they're only spotted after they've been incorporated into the broader field. In the generalist world, those errors probably came from 4 unrelated researchers, so they will hit at 4 distant spots in the problem -- like 4 blows struck against a chair, one to each of the 4 legs. It will get shaken around, but it won't get upended. Now go to the hyper-specialized world: those 4 errors will all be part of an integrated vision, so they'll have a stronger total force, given that they work with each other. And they'll be more focused in where they strike, since the scope of research for one person is smaller than it is for four people combined. This is like an even stronger force striking at just one of the chair's legs -- now it will topple over.
In other words, the harm done by the 4 errors of a single, specialized academic is greater than 4 times the harm done by a single error of a generalist academic (because of compounding). We would boost the health of the academic system by breaking up research territory so that it looks more like the generalist than the specialized world. It doesn't matter if there are economies of scale, benefits of specialization (like becoming a master at one thing rather than being a jack-of-all-trades), or whatever. Those benefits all get wiped out when some series of errors gets out of control, as it inevitably will -- then the whole system crashes, and the so-called gains of specialization were illusory.
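To see the arithmetic, here's a toy simulation (my own sketch, not anything from Taleb; the superlinear exponent for interlocking errors is an arbitrary assumption) in which errors owned by the same researcher compound, while errors spread across researchers merely add up:

import random

def total_harm(n_errors, n_researchers, base_harm=1.0, synergy=0.5):
    # Toy model: k errors belonging to one researcher interlock and
    # compound (costing k ** (1 + synergy)), while errors spread across
    # researchers strike independently and just add up.
    owners = [random.randrange(n_researchers) for _ in range(n_errors)]
    harm = 0.0
    for r in set(owners):
        k = owners.count(r)  # errors belonging to researcher r
        harm += base_harm * k ** (1 + synergy)
    return harm

random.seed(0)
trials = 10000
generalist = sum(total_harm(4, 100) for _ in range(trials)) / trials
specialist = sum(total_harm(4, 1) for _ in range(trials)) / trials
print("generalist world: about %.2f units of harm" % generalist)  # ~4
print("specialist world: about %.2f units of harm" % specialist)  # 8, more than 4x one error

With 100 generalists, the 4 errors almost always land on 4 different people and do about 4 units of damage in total; with a single specialist, the same 4 errors interlock and do 8 -- more than 4 times the harm of one lone error, as claimed.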
In the generalist world, errors are less likely to go uncorrected, and they do not get compounded, so it is much more robust -- at the "cost" of greater inefficiency. Greater efficiency in the short-run only serves to boost the status of researchers, as they get praised and financially rewarded for their impressive short-term performance. The fake efficiency of specialization counts for nothing once the body of work on a problem is found to be contaminated beyond repair. *
A central source of robustness is redundancy, like having spare parts in case one stops working. An academic world where 100 people are working on a problem is more robust, even though it looks inefficient -- why not just hand it over to a handful of specialists? Well, why have two lungs and why ever back up your files? The cost of redundancy is more than outweighed by the benefit of superior survival value, whereas the benefits of specialization are more than outweighed by the cost of getting wiped out before too long.
This is surely a big part of why so little great research was done during the 20th century, especially during the second half when peer review became institutionalized (but that's another story). For thousands of years before the modern research university, people found new big problems all the time, made progress on them, and this led to solutions for other problems (because generalists can see connections between disparate problems). They were subject to fads and groupthink, but so are specialists -- the latter more so since they're so insulated -- and in any case, they weren't so fragile to the errors of a couple of researchers.
"Yeah, well this is just one case -- so what?" This is the only one that we know about here and now. We only found out because Hauser was so out there, according to his graduate students, that they couldn't stay silent any longer. Most grad students are of course a bunch of big fat pussies who value career stability over higher goals. So there are all sorts of cases, past and present, that we aren't even aware of. Sometimes people actually trace the chain of citations back to the original data and find that the original results were wrong, or got transformed in a game of broken telephone. Every field has their share of academic urban legends, and that's much more likely in a specialist world where people have no choice but to take data and interpretations on faith, and where non-specialists are likely to mutate the original finding or interpretation.
The only solution is to break up the academic territory and go back to the generalist world that prevailed before the modern research university, before publish-or-perish incentives that only made the temptation to fudge even greater, and before supposedly hard-headed academics began to take others' findings on faith.
* It's just like with banks: there's a mismatch between the time scale of performance and getting bonuses. Managers get bonuses every year, yet it could take a decade or more to see if they'd been taking dangerous risks that ultimately will wipe the company out for good. So their incentive is to take a lot of hidden risks and massage the numbers so that they get their yearly bonus. Once the hidden risks expose themselves beyond the power of massaging the data, the company goes bust -- but the banker still keeps the bonuses from those first 9 years of illusory efficiency. Same with academics, who get praised, promoted, and paid on an annual basis, but whose performance may take a decade or more to truly evaluate.
August 30, 2010
Real vs. simulated wildness
Just browsing the NYT archives and came across this snapshot of what young guys were up to for fun in 1987. As far as I know, not a single midnight launch of a video game has attracted nearly 20,000 people in one spot and ended with cars being overturned and burned. For all they rave on about the first-person shooter du jour, or how sick it is to get to play as a gang member in Grand Theft Auto, it's worth remembering that adolescent dudes today have never gotten into a real fight, or even toilet-papered someone's house -- really, when was the last time you saw that anywhere?
August 29, 2010
Music odds and ends
- After watching the new video for "California Girls" (not a Beach Boys cover), you might be sad that Katy Perry isn't dead yet. But look on the bright side: at least she's replaced Fergie as the queen of the I'm-so-hot, it's-all-about-me, look-how-exaggerated-I-can-make-my-face way of life that prevails among the women these days. Katy Perry is annoying but doesn't make you want to punch her in the face, and she looks cute rather than like a horse-faced transvestite.
I don't get the attempt to try to shock with the word "teenage" in her album title, Teenage Dream, given how lacking in sexual charge the lead single is. That reminds me of the laughably tame song by Against Me! called "I Was a Teenage Anarchist" -- yeah right, you've never gone on a joyride, shoplifted anything, openly confronted the principal... damn, you've probably never even cut school! Buncha dorks. The only cool song with "Teenage" in the title is "Teenage Kicks" by the Undertones, and the only erotic-sounding album title with the word is Seventh Dream of Teenage Heaven by Love and Rockets (not a very good album, but the title is).
- What happened to literary references in popular music? I guess this is part of the larger trend away from reading fiction, both on the part of the audience (who now won't appreciate them) and the creators (who don't have them on hand in their mind to begin with). My impression is that these used to be more common. Just a few examples from albums I've been listening to recently, not exhaustive: "Dover Beach" by the Bangles quotes "Prufrock," Iron Maiden wrote a song around "Rime of the Ancient Mariner," and Echo & the Bunnymen mention John Webster ("My White Devil"). As you can see, it wasn't just the arty people who made them, as you might expect; even the poppier and metal groups were doing it.
- Speaking of Echo & the Bunnymen, I picked up Ocean Rain -- another Sgt. Pepper's kind of album, one that's famous for being famous and for trying to turn a popular form into something Big and Serious. It's OK overall and has a few good songs, but it's overwrought. Their previous album Porcupine is so much better -- more stripped-down, vigorous, honest, and free of self-awareness. One of the best post-punk albums for sure. If you're going to get it, buy the re-release, which has an alternate version of "The Cutter" with more pleasing guitar harmonies.
- Earlier I noted that supermarkets play better music than coffee houses these days, likely a reversal of the pattern of 10 or 20 years ago. Mega-markets even have a better selection of music to buy compared to refined music-sellers like Barnes & Noble. I've been going on a buying binge lately, and B&N has had almost nothing I wanted aside from a low-priced copy of the Footloose soundtrack. There's too much fashionable stuff -- singer-songwriters on the one hand, and indie / emo and its alternative predecessors on the other. Aside from the Beatles and some David Bowie, there's almost no classic rock there, no punk, no post-punk, no college rock, dance pop, synth-pop, heavy metal, or just about anything. That stuff has had its time, and room must be made for music that "goes beyond" it -- right, more like comes up light-years short of it.
Now go to the local mega-market, and even though the size of the selection is the same (only four or five racks), the quality is infinitely better. You can find a good amount of Madonna, Prince, and other superstars who you expect to see anywhere real music is sold -- and not just some greatest hits CD but the actual albums too. A bit too much indie / emo / alternative for my tastes, and too little post-punk or college rock, but still. There's a greater variety of classic rock, dance, and disco styles, but the largest difference is that there's a lot more metal. They had plenty of albums that even my used record store, with its rows and rows of CDs, couldn't give me.
The obvious answer lies in who the average customer is at B&N vs. a Wal-Mart style mega-market: more well-to-do vs. more blue collar. Higher-status people are more concerned with one-upmanship -- with signaling to their competitors that they're not only staying au courant but are part of the avant-garde (even if they're not leading it). So retailers that cater to them have to throw out lots of great, classic material because that's not what's happening and not where things are going right now. Retailers who cater to lower-status people don't have to worry about whether something is "hot" but rather whether it's any good for its price.
...I was going to say a bit more about this greater preservation of metal classics vs. rock-for-the-college-educated, but that probably deserves a post of its own.
August 28, 2010
When girls' only guy friends are gay
We have a vague impression that tame times are more sex-segregated and wild times are more mixed-up -- the '50s picture of the girlfriend and her gal pals sitting around a Tupperware party while the boyfriend and his buddies are out having a drink, vs. the '70s picture of a bunch of boys and girls gathering to shoot the bull, get a little drunk, and then go out dancing. The segregated world shows less trust between the sexes, the mixed one more.
This pattern is hard to notice about the present. I might write up more examples later, but just consider that young girls' serious / long-termish guy friends are mostly gay. That's the only way they'll trust a guy enough to hang out with him in a variety of friend-like contexts. It's either that or just about no guy friends at all (again, ones whose relationships are as involved and even as close as those with her chick friends, not just an acquaintance).
To find out when the change happened, ideally we'd have annual survey data about how many gay friends a girl has. Unfortunately we don't. So let's look to pop culture that aims for some degree of contemporary realism. When did the "gay best friend" become a stock character? Well, there's Mean Girls from 2004, where the only guy who isn't a boyfriend to someone is flaming gay. Before that was Will & Grace, which began in 1998, although I've never watched it, so I couldn't say if there are substantial non-gay male friend roles. For sure there's Clueless from 1995, where the only guy who plays a strictly friend role is gay. The year before that produced My So-Called Life, where again the only guy in a friend role to girls is gay.
Heathers from 1989 was very insightful and captured a zeitgeist, and even shows the beginnings of the sex-segregated world that was to develop during the falling-crime times of the '90s and 2000s -- yet no gay best friend. Most of the teen movies from the '80s show mixed-sex social circles, such as Fast Times at Ridgemont High and A Nightmare on Elm Street, although not if the point was to emphasize how socially inept they were (like Weird Science or Revenge of the Nerds). The basis of the running gag of Mr. Roper's cluelessness in Three's Company was how unlikely it was in the late '70s and early '80s for a couple of cute 20-somethings to have a gay guy as their closest friend and housemate.
I can't think of too many examples to test from 1990 to 1993, but it's pretty clear that this is a rising-crime vs. falling-crime difference, with greater trust between boys and girls in the former and less in the latter. If homosexuality isn't socially acceptable, girls will hang out just with each other (1950s), whereas if it is acceptable they'll hang out with guys provided that they're gay ('90s and 2000s).
Are there groups of courageous academics?
Overall they're not courageous, obviously, but I mean relative to one another. Academics have some basic training in their field plus more specialized knowledge about their pet projects, but they don't seem to act on this knowledge very much -- perhaps because it's not very useful, so let's just stick with academics whose knowledge could be applied to real life.
Sometimes it's easy and painless to do so, like an engineering prof using the better of two building materials for an addition to their house. But what about where it would cost them in reputation or social acceptance, which is what academics care most about (not money)?
For example, how many anthropologists or historians follow a low-carb diet? Other academics may not have a clue what happened to human health after we adopted an agricultural diet, but anthropologists do, and a fair number of historians at least have the impression that it was for the worse. The most successful anthro / history book of the past several decades is Jared Diamond's Guns, Germs, and Steel, most of which chronicles how destructive agriculture was for human health. So even if this wasn't part of your specialty, you'd still recognize it as something that that Diamond guy talked about.
Conclusion for the real world: cut most carbs out. There can be no whining about taste, as how good something tastes is mostly due to fat, salt, and sugar. Most carb-eaters aren't eating cakes and ice cream all day, so they won't be missing much there. And they can eat all the fat and salt they want. Unadorned carbs taste like paper or grass -- nothing lost. The real reason academics don't do so is that it'll make them stand out as weirdos, even if they can explain why it's a healthier diet.
There is a mass suspicion about animal products in our culture today, so you can have a non-mainstream diet like vegans and not be considered weird (among academics anyway). But going the other way by relying more on animal products and eliminating grains -- now you are officially strange.
As far as I can tell, most anthropologists and historians don't apply this lesson at all in their real lives. Forget about those who say they don't care about their health (a lie almost always), and stick just with those who claim to want a healthy diet. Even they aren't majority low-carb.
Is there any group of grant-grubbers that does risk social disapproval to apply the wisdom or knowledge of their field to real life? Note the assumption -- that they possess such applicable wisdom to begin with. I'm not talking about an autistic economist who ruins his social life by contracting out the task of helping his friends to specialists because it's more "rational" and "efficient." Those outsourced favors, being less costly to the economist than doing them himself, are weaker signals of friendship, causing his friends to withdraw and so reducing his adaptedness to the highly social world we live in. This risk of social disapproval does not result from applying wisdom, as he has fundamentally misunderstood the purpose of helping out friends -- a big part of which is to give a costly (hence honest) signal of loyalty.
And I'm not talking about some sociologist who "just knows" that most of the differences in SAT scores are due to tutoring differences, and so spends a fortune for his kids to get top-rate tutors.
These people are fools whose decisions, if made commonly, would unravel humankind. I'm not ruling out anthropologists, economists, or sociologists -- but they surely wouldn't count based on the above examples. I know there are individual iconoclasts, but are there groups where most of them apply wisdom to real life despite the risk of broader disapproval?
August 27, 2010
Why Amazon hasn't replaced the college bookstore
With college classes about to begin, tons of students will get fleeced at the college bookstore for new and used textbooks. They can usually find the books at Amazon for around a quarter of the bookstore's price, so what gives? Why hasn't competition driven the college bookstore out of the textbook business?
The answer is that colleges do not force their professors to publicize the reading lists for their courses on the web before the semester starts. You have to go to the brick-and-mortar bookstore to see what books are required for what classes. By the time you do this, classes are only a few days away or have already started.
Professors don't wait to make use of these books, so if you try to shop around at Amazon or other textbook retailers, you'll be out of the loop and probably never catch up (the average student, anyway). It takes several days to find a good deal, plus 4 to 7 business days for shipping -- so you're looking at the second week of classes at the earliest, perhaps even later once that search is multiplied across 4 classes with 8 books each. Rather than risk falling far behind right up front -- and all while feeling helpless, rather than just lazy -- students shell out the big bucks for overpriced textbooks.
The simple solution is to have professors post their reading lists on the web, integrated into the online course catalog. An entry would tell you the course number, name, prof, room number, etc., and then have a link that brings up the reading list. This would have to be posted no later than, say, 2 weeks before classes start. Hell, it's not as though they don't already have the reading list made up by then. That would give students plenty of time to find cheap textbooks at Amazon or wherever else, and bypass the college bookstore.
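To make it concrete, a catalog entry would only need something like this (a hypothetical sketch -- the course, prof, and ISBN are made up, not any college's actual system):

# One entry in the online course catalog, with the reading list attached.
course = {
    "number": "HIST 101",
    "name": "Introduction to World History",
    "prof": "Jones",
    "room": "Main Hall 204",
    # Locked in at least 2 weeks before classes start, so students can
    # shop at Amazon or anywhere else instead of the campus bookstore.
    "reading_list": [
        {"title": "Example Textbook, 5th ed.", "isbn": "000-0-000-00000-0"},
    ],
}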
If professors decided not to use a book that they had already committed to on the list, they would have to reimburse the students, either financially or perhaps by giving them all A's or throwing a party for them. (Or maybe Amazon would have a refund program if students could show the book had been de-listed.) Professors could add books to the list, but they would not be allowed to make use of them in class until two weeks after classes begin, again to give students time to find cheap copies. And really, if you're adding something so last-minute, how central can it be anyway?
That's how it would work, roughly, if students paid for education on their own. They would not accept plunking down tens of thousands of dollars just for the privilege of getting ripped off every semester at the bookstore and being at the mercy of professors' whims regarding reading lists. Colleges that offered better customer service in this way would eliminate competitors of similar academic quality. As it stands, though, the customers have no way to tell the producers what they do or don't want. They more or less take whatever crappy treatment they get and hope for the best.
As much as I loathe the helicopter parent phenomenon, at least those parents create some kind of feedback and power to punish producers who are screwing up. Unfortunately parents seem to care more about ensuring that the college is spoiling the kids enough, rather than that the college isn't taking them and their kids to the cleaners. It's like medicine and hospitals -- parents put total trust in the doctors, administrators, and so on, and hope that their kid is getting treated nicely. They don't commit heresy by asking if this or that is really worth paying for.
August 26, 2010
Falling American solidarity since the '90s, immigrant surname edition
Here is an article about the decline in immigrants Anglicizing their surnames, which they say has happened within "the last few decades." So this is likely another instance of the huge social fracturing that began in the late '80s / early '90s, depending on the case.
One idea is that most of the recent immigrants such as Latin American mestizos and Asians could not pass for white and therefore see no point in trying to fool others by changing their surnames, in contrast to earlier waves of European immigrants who could blend in by looks. The article itself proves this idea wrong, so you wonder why they don't mention that. The current decline in Anglicization of surnames affects even European immigrants (one of the people featured is from the former Yugoslavia).
Plus, before the decline, the white-looking and the non-white-looking alike tried to adopt Anglo surnames. Ashkenazi Jews are a clear example of the former, and blacks of the latter. Blacks may not have been immigrants, but if there had really been a desire to emphasize their cultural difference, they would've changed their last names like Malcolm X or those who adopted Arabic surnames when they got into the Nation of Islam. Those few who did were lampooned by the majority of blacks for taking their African roots too seriously -- we're Americans first, they were saying.
And blacks could certainly not pass for white or Anglo, but only a clueless journalist or social scientist would think that's what influences the decision to adopt a certain group's surnames. Rather, it's about signaling the good faith effort they're making to leave behind the tribe they came from and join the one they've come to. It has nothing to do with signaling race.
They didn't even change them to something European but less Anglo -- they kept Jones, Jenkins, and Smith. Their first names started to diverge from whites' during the second half of the 1960s, but surnames are more important for identifying what Big Group you come from. Most people inappropriately project the '90s-era identity politics back onto The Sixties, which was about civil rights and anti-war. Black power was a fringe movement and vanished in a flash. Blacks wanted to join mainstream society so much that when three of the top ten TV shows featured all-black casts, they didn't try to blackify their surnames (or first names) at all -- they were the Evanses, the Jeffersons, and Sanford and Son. Ditto when The Cosby Show took over.
As late as 1984, we were still a very nation-oriented country, as the Los Angeles Olympics exemplified:

But the Los Angeles Games, televised by ABC, were a flag-waving, chanting USA ad...

"As a spectator at the Opening Ceremonies in 1984," NBC 2002 Opening Ceremonies producer Don Misher said recently, "I came out of the stadium euphoric. An IOC member later told me it was very nationalistic, second only to Hitler's (1936) Games."
You didn't see that at all in the Atlanta or Salt Lake City Olympics. Since the early '90s, it has become cool to be "ethnic," that is, to de-nationalize the level that you base your identity on. Most of Robert Putnam's "Bowling Alone" narrative really applies to the '90s and 2000s, not the '60s through the '80s. If during the latter period people stopped playing bridge together, having friends over for dinner, and joining the PTA, it's because bridge ended as a fad as people took up other social games and sports (like softball), because people began to eat meals together outside the home, and because parents decided to get a life rather than hover over their kids and hound their teachers.
It was an incredibly social time, not to mention nationalistic. The public accepted or even cheered on a continual series of large wars and interventions, whereas during the '90s and 2000s politicians found out that they couldn't do that anymore, so we've had no large wars, let alone a string of them. If you weren't chanting "U-S-A!" during the Olympics, you got punched in the nuts for being a traitor. Since then it's become gauche and everyone will ostracize you for "jingoism." Right through the '80s, the description "All-American" was a compliment rather than a put-down. That glowing image of the All-American showed up in popular music, too, from "California Girls" in 1965 to "Free Fallin'" in 1989.
August 23, 2010
No shampoo, and no problems
I've been thinking where else to apply the back-to-nature principle, which goes like this: if we're disturbing the outcome of natural selection, this practice is guilty until proven innocent. In some cases, a good case can be made -- like washing your hands. True we didn't evolve in a world with soap, and we are thus messing around with our natural state. However, we live in a world that's far more germ-ridden than when we all lived on the African savanna, mostly due to crowding, contact with animals, and living -- and doing our business -- in the same place over time. So, severely cutting down on the concentration of harmful junk on our hands gives a huge boost to our health.
That does not apply to shampoo, though. It has no anti-bacterial, anti-fungal, or anti-viral powers, unlike soap. (There are special shampoos if you want to zap the bacteria that contribute to dandruff or if you have lice, but I'm talking normal everyday shampoo.) All it does is remove some of the dirt and dust that sticks to the oil that your scalp makes and that coats your hair follicles (sebum), so there's no sanitation or hygiene angle here. Yet it doesn't have laser-like precision in removing these dust particles -- it strips away a good deal of the sebum too.
Now, natural selection put oil-making sebaceous glands on your scalp for a good reason -- and you don't even have to know what it is, although it's fun to speculate. It survived natural selection, so it must have been doing something good for you. And people can easily tell that their hair gets messed up in some way (too dry, too limp, whatever) after shampooing, so they then try to put some of the moisture back in with conditioner. Whenever you need a corrector for the corrector, you know you're digging yourself deep.
Washing the dirt and dust out of your hair doesn't take anything more than a thorough rinse in a shower with good water pressure. This is what humans have done for as long as they've had access to bodies of water in which to bathe, and the hair of hunter-gatherers looks just fine. They've also infused oily substances into their hair -- olive oil in the Mediterranean, coconut oil in the South Seas, a butterfat mixture among the Himba pastoralists of Namibia, and god knows what else. But whereas rubbing fatty stuff into the hair appears universal, using cleansing agents just to get out a little dust hardly shows up at all.
That's true even in the industrialized West. I searched the NYT archives for articles mentioning how often you should shampoo. There aren't any such articles before the first decade of the 1900s, when the consensus was that once a month was good, but aim for once every 2 to 3 weeks. That was true for the 1910s, too. I couldn't access the articles from most of the following decades, although there is one from the WWII days that mentions women's weekly shampoo. Brief histories I've read suggest that the advertising of the '70s led to even more frequent shampooing, and certainly I remember from personal experience that by no later than the end of the '80s, daily shampooing was expected.
As much as I rave about the '80s, no one did babes like the '60s. Everything lined up just right for them -- lots of animal products in the diet, little / infrequent shampooing, no layers of hairstyling products, normal amount of sun exposure, and little or no air conditioning = normal level of sweat, helped out by a normal mix of physical activity. God, that big bouncy hair... here is a cropped, safe picture of Cynthia Myers, Playmate of the month for December 1968. You won't see that in the era of Clueless and Mean Girls. In fact, if it weren't for styling products, I doubt you would've seen it in the '80s either.
As for guys, here's a picture comparing current star Zac Efron to Leonard Whiting, who played Romeo in Zeffirelli's 1968 movie. Whiting's hair looks thicker at the individual hair level, more voluminous overall, and more lustrous.
It's been almost two weeks since I last shampooed my hair, although I still rinse it well with just water in the shower, and it's not a big greasy mess like I imagined it would be. After a lifetime of daily shampooing, it'll take my sebaceous glands some time to adapt to the *lack* of the oil-sapping stuff and dial down their activity. From what I've read, it'll take between two weeks and two months for a complete return to normalcy, but it's worth it. Reading around, I was struck by how wimpy people are when they try this out -- "omigod i could never go for more than like three days, i'd feel so gross!" They know that it will all work out, but we've become so focused on immediate comfort rather than enduring robustness. It's considered a violation of someone's human rights to tell them to deal with it until it gets better.
There seems to be an eco-friendly movement afoot called "no 'poo" -- those damned Greens will never learn good advertising -- which aims to reduce shampoo use for some environmental reason or other. It seems like most of them still use a cleansing agent and conditioner to ameliorate the damage done by the cleanser, though. Knowing that they're eco-friendly, we can infer that they ingest little animal fat and protein, on which our hair is so dependent, so they're not the best example of what little or no shampoo looks like -- for that, have another look at those '60s honey bunnies.
Where do the art-lovers live?
GameFAQs just ran a poll about what your favorite subject was in high school. I excluded people who said "phys ed," then lumped the three artier ones together (art, language arts, foreign language) and the three sciencey ones together (math, science, social sciences). Then I found the percent of people who liked the arts classes out of everyone (excluding gym fans).
Across all of America, 32% like the arts classes. The map below shows each state's percent as a deviation from the national average, where browner means more art-minded and bluer means more science-minded:
The main pattern is east vs. west, with the middle and southeast of the country at about the national average and an outpost of science people in the middle. You might've expected it to show the northeast as the most art-loving, or perhaps bi-coastal vs. flyover country. Yet that would rely too much on things beyond a person's basic preferences -- there's the price of going to a museum, whether there are museums nearby in the first place, etc. By just asking what their favorite subject was in high school, we're only looking at differences in tastes. Compared to people who live in New York, people in Montana would rather read about western American folklore or take up an arts & crafts hobby, and wouldn't give a shit what Malcolm Gladwell and Steven Levitt are up to.
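For the curious, the tally works like this (a sketch with made-up response counts -- the real numbers came from the GameFAQs poll):

# Hypothetical response counts by favorite subject for one state; the
# real poll has an entry like this for every state.
responses = {
    "Montana": {"art": 40, "language arts": 55, "foreign language": 20,
                "math": 60, "science": 70, "social sciences": 30,
                "phys ed": 25},
}

ARTSY = {"art", "language arts", "foreign language"}

def pct_artsy(counts):
    # Drop the gym fans, then take the artsy subjects' share of the rest;
    # math, science, and social sciences make up the sciencey remainder.
    kept = {subj: n for subj, n in counts.items() if subj != "phys ed"}
    artsy = sum(n for subj, n in kept.items() if subj in ARTSY)
    return 100.0 * artsy / sum(kept.values())

NATIONAL = 32.0  # the all-America figure from the poll
for state, counts in responses.items():
    print(state, round(pct_artsy(counts) - NATIONAL, 1))  # deviation shown on the map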
August 22, 2010
Will pop culture be remembered only for its decadent phases?
In higher culture, we remember the works made during a vigorous, youthful heyday better than the later works that are overly ornamented and perhaps a bit too self-aware for their own good. Virgil has outlasted Ovid, Michelangelo Boucher, and Beethoven Stravinsky. I'm talking about people who are casual consumers up through the most erudite, not some niche group that tries to shock or stand out by telling others that Bach was a wimp and Mendelssohn was a giant.
Popular culture that lasts is very recent because it took industrialization to make mass production possible and to make prices affordable to a mass audience (through division of labor and competition among producers). So there's a lot less data to look at. Still, I wonder if we won't see the reverse pattern, where it's remembered more for its flabbier stages. Things that we hold in high esteem we remember for their greater qualities, since remembering their flaws would make the grand seem less elevated. But things that we hold in disregard we remember for their damning qualities, since remembering their strengths would dignify something base.
I ask this because I decided to get out of my popular music comfort zone and start exploring jazz. After some basic reading around and sampling songs on YouTube, I found that I like the ragtime through hot jazz era of the '20s and early '30s, that big band and swing is OK but too overwrought, and that with bebop and after, it left planet Earth, while also spawning some really soporific background music. That's just a vague impression, but the styles are so different that it doesn't take a lifelong familiarity to have a pretty strong judgment.
So off to the used record store I went to pick up a few CDs -- and found almost nothing. There was a greatest hits by Scott Joplin, but I'd already heard most of that in high school (that was my one experience with pre-'30s jazz as a teenager). Fortunately there was at least one other collection there that I picked up, a collaboration between Louis Armstrong and King Oliver from 1923. It sounds pretty good, but not quite as make-you-get-up-and-move as the songs I sampled from his Hot Five and Hot Seven band of the later '20s. There wasn't even that much big band music available. Almost everything was Miles Davis, John Coltrane, Thelonious Monk, etc. on the one hand, and a bunch of Harry Connick Jr., etc. on the other.
All of the carefree, fun-loving, don't-force-it stuff has been forgotten. Perhaps I shouldn't have been so surprised, since while reading around, very few of the names from the 1890s through the early 1930s rang a bell, whereas just about all of them from the mid-'30s onward I'd either heard of or knew some of their songs. How odd that the common understanding of jazz derives from what came after The Jazz Age.
Movie buffs may keep an appreciation for periods of grand activity -- Hollywood's Golden Age and its silver age of roughly the mid-'70s through the late '80s -- but if you look at what's on offer for new DVDs, in the used section of a record store, or on an illegal filesharing service, not to mention what movies people refer to on just about any corner of the internet, you'd think that these eras never existed. No one's memory appears to go back farther than 1993 (Groundhog Day). I'm not talking about fleeting gossip related to what's opening this weekend. I mean any discussion of movies or Hollywood whatsoever.
And don't even get me started on rock music; I've covered that enough before. The whole lifetime of the late '50s through the very early '90s has vanished from the common culture. Instead, our understanding of "rock music" has come to mean alternative / emo, indie, and singer-songwriter music, which are all as far away from rock as bebop is from Dixieland. Just think: in 50 to 100 years, will "Paranoid" by Black Sabbath be forgotten, while "Du Hast" by Rammstein springs to mind when people discuss the long-ago genre of heavy metal?
I don't read popular lit, so I couldn't say whether this pattern holds up there or not. I mean the kind that doesn't aspire to lit lit (as a bunch of sci-fi does) -- the thrillers, mysteries, etc., that get disparaged as genre fiction. Is this remembered more for its decadent works than its great ones?
TV is hard to guess about. Some say it's in a peak period right now, so we'll have to wait until it goes into its overly busy and too-serious phase. Other things to think about?
August 20, 2010
Albums that critics love, that go unappreciated, but are actually good
Most lists of "the most underrated and under-appreciated" things are a see-through way of trying to boost the list-maker's status by implying how cultured they are compared to the philistines in the audience. But sometimes they do have a legitimate point. In most high school and intro college lit classes, you'll rarely read any Marlowe, but you can be sure to slog through Arthur Miller and Ibsen, even though Marlowe is superior to both combined. Someone who complained about the lack of Marlowe in the typical lit curriculum would have a good point, while someone who whined about not teaching Alfred Jarry would just be trying to score obscurity-based status points.
Applying this idea to music albums, let's take Rolling Stone's list of the 500 greatest albums as the list of what critics hype up. Next, we eliminate the ones that may be good but aren't out-of-this-world great -- that are there just because critics have to revere them or else lose their snob cred. (This eliminates over-hyped ones like The Velvet Underground & Nico's self-titled album.) Finally, we eliminate those that the average youngish listener (say under the age of 25 or 30) has heard much of -- let's say, more than one song on the album. (This eliminates great but well known ones like Rocket to Russia by the Ramones.) If they've heard two, three, four hit singles but not the rest from it, I'd say they're still fairly aware of that album.
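In code, those cuts amount to a simple filter. Here's a toy sketch; the "truly great" flags and songs-heard counts are my own illustrative guesses standing in for the two judgment calls:

```python
# Toy sketch of the hidden-gem filter. The flags and counts are
# illustrative guesses, not real data.
albums = [
    # (title, out-of-this-world great?, songs a young listener has heard)
    ("Electric Warrior",              True,  0),  # hidden gem -> keep
    ("Rocket to Russia",              True,  3),  # great but well known -> cut
    ("The Velvet Underground & Nico", False, 1),  # over-hyped -> cut
]

hidden_gems = [title for title, truly_great, songs_heard in albums
               if truly_great and songs_heard <= 1]
print(hidden_gems)  # ['Electric Warrior']
```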
What's left is a pretty good list of "hidden gem" albums. I haven't listened to every one of the 500, so if it doesn't appear, it may be because I don't know it. Here's what I get:
Ramones (Ramones)
Purple Rain (Prince & The Revolution)
Back In Black (AC/DC)
Raw Power (The Stooges)
Pretenders (The Pretenders)
Electric Warrior (T. Rex)
1999 (Prince)
The Queen Is Dead (The Smiths)
Trans-Europe Express (Kraftwerk)
Psychocandy (The Jesus & Mary Chain)
Strange Days (The Doors)
Honorable mentions (quality isn't quite as high as the others, but still not well known by most young people):
Transformer (Lou Reed)
Siamese Dream (The Smashing Pumpkins)
Of the ones above, the T. Rex album is the best example of a hidden gem, since I'm pretty sure no one under 45 has heard even one song from it and the band is not well known regardless of which albums or songs we're talking about. The other albums would at least be recognized for one song, or for some of the band's other work.
Millennials, the second Silent Generation
I'd never read Time's original 1951 sketch of the Silent Generation until now, and it's striking (although not surprising to me) how contemporary it sounds. When it was written, middle-aged people were cooler than young people, just like today.
It was also written when crime had been falling for 18 years, just as we've seen since crime peaked in the early '90s. Recall that the crime rate soared from at least 1900 (maybe even back through the "Gay Nineties") up through 1933, after which it plummeted throughout most of the 1950s. Another crime wave began in 1959 and ended around 1992, when it started falling once again.
People who have years of memories of wild times -- especially if those were during their coming-of-age years -- are basically a different species of human from those whose memories are only of tame times. Too many people focus on "defining events" like wars, economic booms or busts, but these are all pretty minor in shaping generations. The rate of violent and property crime is a lot more influential, simply because that's what the human mind has been the most tuned into for most of our species' history -- there were no labor markets or World Wars until pretty recently. There's always been violence, so that's the cue we zoom in on.
I'll go back later and pick out some good quotes and draw the parallel in case some of them aren't obvious. It's not long, so read the whole thing. For now I'll just mention the "I-don't-give-a-damn-ism" attitude of both periods of rising crime (clear during the '60s through the '80s, but most people today have no idea how wild the '20s were), and something I suspected based on current trends but had never seen any evidence for before -- "The young American male is increasingly bewildered and confused by the aggressive, coarse, dominant attitudes and behavior of his women." Those super-soft, estrogen-dripping honey babies of the '60s, '70s, and '80s were not there during the '40s, '50s, and probably most of the '30s. And during that earlier period of tame times, young dudes were just as blindsided by the shift to bossy man-women, LOL. No more true sex symbols like you saw during the Jazz Age.
God I wish I could've been there for the Roaring Twenties! At least my mother's parents were (one born in 1914, the other in 1920), and I got to see a lot of them growing up. As with Baby Boomers today, they remained carefree and rambunctious even into old age (my grandmother is still an incorrigible prank-player). Once you get so much exposure to a wild-times environment, you shape yourself to fit in there, and that sticks more or less for life -- like learning a language. This huge effect of birth cohort can lead to the bizarre situation where it's the middle-aged who are exciting to play with and the young who are dull killjoys.
August 19, 2010
Are modernized people more feminized or more infantilized?
Although the picture goes back over 100 years, the events of the past two years have made it clear to everyone that we live in a bailout culture. This is just one way in which modern people seem fundamentally different from pre-modern people. (In Europe, the seeds may have been there somewhat earlier, but by the Enlightenment these tendencies had cleared a threshold so that they're visible to any observer.)
Most of those who are critical of the bailouts -- of homeowners, of investment banks, of people who bought crappy cars, of people who make terrible health choices, and so on -- invoke the image of a spoiled child who whines to his parents to make it all better when he gets himself into trouble in some predictable way. The message from this rhetorical frame is that we need to stop acting like babies, grow up, and behave like mature adults. Many notice how much longer it takes to reach the milestones of adulthood than it used to, which would seem to provide more evidence in support of the infantilization view.
However, couldn't we make another equally valid analogy to the spendthrift wife who, after ruining herself financially, tries to badger her husband into transferring more of his earnings to her? In this view, the problem is that we've become feminized, so that the lesson is to stop acting like girls and start taking it like a man.
In this case of our bailout mentality, the two ideas about modernization give the same result, since both bratty children and wives with a hole in their pocket scream for bailouts. So, what other changes that modern people show would help us to distinguish between the two causes? Both could be going on, but one may play a stronger role. What we want to do is find some aspect of life where children or adolescents (of either sex) go one way and females (of any age) go another way. During a late night stroll I came up with as many examples as I could think of, and it looks more like modernization = feminization, rather than infantilization. Here's what I thought up. Any other examples that would distinguish the two?
1. Violence and property crimes. Feminization. Young people, whether children or adolescents, are much more violent and destructive of property than adults, whereas females are less so than males. The trend since circa 1600 in Europe has been downward.
2. Emotional sensitivity and empathy (even including non-human animals). Feminization. Young people are more callous about these things than mature adults are, whereas females are more adept than males. The trend since at least the Enlightenment has been toward greater sensitivity.
3. Sexual behavior. Feminization. Here we can only compare adolescents to adults, but the former have wilder thoughts and behavior, whereas females behave more conservatively. The trend during the modern age is toward more vanilla sex lives. (Remember to look at the whole sweep of things, rather than compare Victorians to the Summer of Love. Don't leave out Samuel Pepys' diary, Casanova, The Canterbury Tales, etc., and don't leave out the 1940s-1950s and the '90s-2000s.)
4. Gross-out and slapstick humor. Feminization. Young people participate in and consume these forms of humor much more than adults (especially if we're talking about smaller children), whereas females are much more repulsed by it than males. The trend since the Elizabethan era and before has been sharply downward. The movie with the most bawdy humor that I've seen recently is Decameron, adapted from a book of tales written in the 14th century.
5. Respect for laws, order, structure, etc. Feminization. You all remember Lord of the Flies and Boyz n the Hood: younger people are much more rebellious against these things, whereas females cling to them more than males. The trend has been toward greater respect. I'm not talking about behavior that may or may not break the law, but about how elevated in people's minds these concepts are. Under feudalism or absolute monarchy, people were more cynical about law & order (although thankful that it protected them from harm), compared to modern people who see most law enforcers as the good guys (because they are).
6. Sense of adventure and curiosity. Feminization. Young people are more driven by the Indiana Jones impulse than adults, whereas females are less adventure-seeking than males. The trend has been toward less boldness in one's adventures. It's not that no one enjoys a good real-life adventure anymore, but compared to braving the high seas to find treasure or virgin land, crusading in exotic far-off countries, or living by transhumance pastoralism, you have to admit that we've become pretty wimpy.
7. Attention to our health. Feminization. Young people feel more invincible and behave more recklessly in health matters compared to adults, whereas females are more mindful than males. The trend has been toward a greater preoccupation with personal hygiene, diet, exercise with the explicit goal to improve health, and so on. This is not merely due to the fact that it's safer to visit doctors now than 200 years ago or earlier, since most of these changes involve lifestyle habits rather than hospital visits. And I'm not talking about specific diets -- like more meat or less meat -- but just how much we dwell on the topic and act on it no matter which particular choices we make.
8. Aspergery / autistic behavior. Infantilization. Younger people have less developed social skills, less developed "Theory of Mind" (i.e., appreciating that other people have their own minds and beliefs), are less tolerant of ambiguity, and are more literal-minded when interpreting someone else's words or actions. Females are more developed in these areas than males. The trend since at least the Enlightenment has been toward a more Aspergery personality. We seem less dexterous in our social relations, we're more likely to rule out that another human being could hold an opposite viewpoint from our own (a classic test of autism is the "false belief task"), and we're less accepting of ambiguity and open interpretations -- as shown by the proliferation of grammar Nazis since the 18th century, who were wholly absent before.
9. Faith in the supernatural. Infantilization. Younger people aren't quite so committed to belief in the afterlife, spirits, etc., as older adults are, whereas females have stronger beliefs here than males. The trend since the Scientific Revolution and the Enlightenment has been toward a more materialist view of the universe. I don't want to include a discussion of "religion" because that covers so many different beliefs and behaviors, so I focused on just one aspect.
Just by the number of cases, feminization looks like a stronger explanation, especially when you weight each case by how important the change is in defining the shift from pre-modern to modernized societies.
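To show what that weighting amounts to, here's a toy tally of the nine cases; the importance weights are purely illustrative guesses of mine, not measured quantities:

```python
# Toy weighted tally of the nine cases above. The weights are illustrative
# guesses at each change's importance, not measurements.
from collections import Counter

cases = [
    ("violence & property crime", "feminization",    3),
    ("emotional sensitivity",     "feminization",    2),
    ("sexual behavior",           "feminization",    2),
    ("gross-out humor",           "feminization",    1),
    ("respect for law & order",   "feminization",    3),
    ("sense of adventure",        "feminization",    2),
    ("attention to health",       "feminization",    1),
    ("Aspergery behavior",        "infantilization", 2),
    ("faith in the supernatural", "infantilization", 1),
]

score = Counter()
for _, verdict, weight in cases:
    score[verdict] += weight
print(score.most_common())  # feminization wins the weighted vote too
```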
You could object that these changes only show that we're more mature than infantilized -- that the infantilized view was just plain wrong to begin with, not one of several viable explanations. Still, the larger picture of how long we delay reaching milestones of adulthood -- having a sustaining job, courting and having sex, etc. -- would seem to rule out a "we're so mature now" view.
This isn't just some hairsplitting exercise, since these two ideas could predict different mechanisms for the change. For example, the change could be due to a higher or lower frequency of some genes, as the result of natural selection. The genes involved in keeping an organism in a youthful state (neoteny) are not the same as the ones that promote feminized rather than masculinized features. If we had reliable methods of detecting natural selection within just 300 or 400 years, I'd look at genes that affect the concentration of sex hormones, how sensitive their receptors are, and so on, based on the feminization idea. Someone who thought the infantilization angle was more important would look at a different, somewhat overlapping set of genes, such as those that stimulate growth.
Adults playing softball tracks crime rate
Here's another demonstration of the sociability and team-mindedness of people tracking the crime rate. The Statistical Abstract of the United States has data on the number of adult softball teams (baseball must be included in this), and I've divided them by the total population. Because the size of a softball team is basically constant across time -- free of bias with respect to time, at any rate -- this picture also tells us the fraction of people playing softball.
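The per-capita step is simple enough to sketch in Python; the team counts and populations below are hypothetical placeholders, not the actual Statistical Abstract figures:

```python
# Sketch of the teams-per-capita calculation. All numbers below are
# hypothetical placeholders, not the real Statistical Abstract data.
teams = {1970: 60_000, 1980: 110_000, 1991: 190_000, 2006: 120_000}
population = {1970: 203_000_000, 1980: 227_000_000,
              1991: 253_000_000, 2006: 298_000_000}

for year in sorted(teams):
    rate = teams[year] / population[year]
    print(year, f"{rate * 100_000:.1f} teams per 100,000 people")
```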
The per-capita figure rose steadily from at least 1970, when the data begin, through 1991. (Based on what I've seen of the 1960s, I'd guess it was rising during that decade too, maybe back to the late '50s.) Then 1992 begins a steady fall that runs through 2006. This exactly matches the movement in the violent and property crime rates.
There are influences in both directions. When people are more out-and-about (for instance playing softball), crime increases because there are more sitting ducks for criminals to exploit, unlike when everyone is locked inside. And when crime is soaring, it's always part of a rise in wild behavior overall. Being outside playing softball is more adventurous than staying inside, and it also brings together people to cooperate as a team, which they feel a greater need to do when the common threat of crime is rising.
At some point people have had enough of their exposure to crime and begin to retreat from public spaces. This drains the pool of potential victims out in plain sight, so the crime rate falls as a result, which in turn gives people less of a feeling of facing a common threat, so they don't feel so keen to form teams, socialize, and otherwise look out for one another. Now they're either hiding indoors or leaving only for a brief time to get a massage at a day-spa or stare at tire rims in a display room.
It's very hard to remember, even for those who lived through the entire period, but from the late '50s through the very early '90s, adults actually had a life. They weren't hunkered down in their homes like during the falling-crime times of most of the '30s through most of the '50s, or the more recent period of the early-mid '90s through today. Every weekend when I was only 3 or 4, and they in their late 20s, my mother would drag my dad out to a dance club where she could cut a little rug to the new wave explosion of the early-mid '80s. Back then they made use of a strange arrangement known as "finding a babysitter" (I think it was one of their friends from work).
If that had taken place during the past 15-20 years of the increasingly helicopter parent culture, Child Services probably would've robbed them of their custody. Parents aren't supposed to do anything fun anymore.
August 18, 2010
Gays in existential drift since AIDS and homophobia have been fading?
A comment in the previous post asks about asabiya among gay men -- that is, Ibn Khaldun's term for solidarity and capacity for collective action. He used it to describe what force led tribes outside the center of power (nomads, barbarians, whatever you want to call them) to band together and overrun the civilized urban elites.
I don't have much connection to the gay world, aside from having one friend on Facebook who's gay (a "Facebook-only friend," as they say, not actually in my social circle). But based on the picture of the past 20 years, in which everybody seems to be losing the solidarity and sense of purpose they had from the '60s through the '80s, I'd guess the gay community isn't as tightly knit and ready for action as it used to be. I tried looking through the General Social Survey for questions that would bear on gay solidarity, but there aren't any. Still, here's anecdotal support (via Ray Sawhill's blog) from a gay man long familiar with The Movement:
What disappoints me most about the current state of the gay movement, if you can still call it that, is that most gays have settled for this really rigid, obvious, and stereotypical idea of what it means to be a homosexual. It's become a very facile, consumerist identity without any substance, purely decorative and inert, and strangely castrated.
I think he's describing most groups of people after the early '90s fall in violence ended three and a half decades of wild and solidaristic times -- caricature, consumerism, decoration, impotence. He doesn't come right out and say that the ties that bind are pretty loose by now, but that seems a safe inference from the quote.
This massive social shift must have been even more pronounced among gays because they weren't just the recipients of random opportunistic violence, which is bad enough, but were also targeted in virtue of being gay. That will bind actual or potential victims together more than if there isn't a clear targeting of victims based on group membership. Plus AIDS looked like the Black Death, and again nothing pulls people together like a common threat.
Once the violence rate dropped in the early '90s, so must have gay-bashing. Violence is just one component of the overall wildness that started falling then, also including a decline in risky sexual behavior. After gays put a lid on their previously reckless ways, AIDS stopped wiping them out at the same rate and may have become less virulent. With those two large threats absent from a gay man's life today, he feels no need to band together with others and revive ACT UP. Instead they've done like the rest of us and fallen back into the default state of competing against each other in a variety of petty status games.
August 17, 2010
How oppressed were various groups? Judge by their history of revolt
Gay marriage is the main liberatory movement in 2010 -- which goes to show how free everyone is in America, if that's all that's left to fight over. No more slavery, no landowning requirements to vote, no foreign rule, etc.
Freedom-seeking movements will try to get the most liberatory bang for their buck, so they will target sources of oppression that are truly heinous first, then move on to not so oppressive threats, and wind up squabbling over liberation that is so minor that achieving their goal could hardly be taken seriously as a Great Unyoking.
Throughout history, when a horde of invading foreigners took over a community, that community tended to resist, often violently and for long stretches of time. Because the revolt against foreign rule is so widespread across the world, so persistent through time, and so bloody, we infer that the revolutionaries are up against a very oppressive regime.
Gay marriage is at the other end of the spectrum: it appears almost nowhere in the world, and even where it does, it's only a decade old, and none of the proponents of it are willing to stake their lives for the struggle or even put themselves in front of firehoses. Thus, the denial of marriage rights to homosexuals is scarcely oppressive at all. After all, most gay guys have no plans to ever get married in the first place because, like most straight guys, they'd prefer to cat around until they turn gray. The point about banning gay marriage applies to homophobia in all forms -- fighting it is not widespread, old, or violent, so homophobia cannot have been very oppressive.
In the middle but closer to the anti-imperialist side are anti-slavery movements, including anti-wage slavery movements. They're not quite as widespread, old, and violent, but they're pretty close, especially when they also have an anti-foreign rule angle to them like slave rebellions in the Americas. Slaves and wage slaves are less oppressed than those under the boot of foreign rule because slaveowners at least have some incentive to keep their subjects decently fed, clothed, and housed -- otherwise they won't be able to work and earn the master any money. A foreign ruler is mostly interested in stealing whatever he can, abducting the pretty young women as concubines, and levying crushing taxes on the community that he has no ties to.
In the middle but closer to the gay marriage side is feminism. It's hardly widespread, has only appeared in the past couple centuries, and can get somewhat physically confrontational but rarely at the level of a slave rebellion or an overthrow of a foreign army ruling your land. Why did women the world over and for almost all of time feel that they weren't being oppressed? Well, they surely did see many restrictions on their freedom -- who they could date or marry, whether or not they could own property, etc. -- but they saw that men faced just as severe restrictions on their freedom, albeit in different domains of life.
When a neighboring tribe or army comes through the village, guess who has no choice but to go face them in defense of their community? When someone needs to labor away at tracking down and hunting game animals to provide the animal fat and protein that we rely on, guess who gets conscripted into the hunting party? When a fight breaks out among the locals, guess who has to step in to break it up before it spills over? Men have brought home the bacon (not merely nuts and berries) and have protected women from violence and other threats to their safety, and they have no real way to shirk these duties if they want to make it in the world. Women judged the drudgery of the woman's role as no more oppressive than the unavoidable danger of the man's role, so they didn't see any point in fighting for greater personal freedoms.
Only when most threats to a woman's physical safety were eliminated, and after the economy allowed them to bring home the bacon (from the supermarket with wages they earned), did they see an imbalance in how nice overall the guys and the girls had it. And even then most women weren't dyed-in-the-wool bra-burners, so this recent imbalance must not have been very oppressive either, contrary to all of those "prisoner of suburban domesticity" narratives about the 1950s.
The typical civil libertarian sees the world in black and white -- some group is being oppressed or not, and they don't focus much on the magnitude of oppression. Minor infractions must be fought on principle. In the real world, activists should target causes that will deliver the most freedom for the least effort, and go down the list of oppressed groups in order. The fact that so many spend such a large chunk of their outrage budget on trivial matters like gay marriage proves that they do not have liberatory goals, but must be more interested in other things like signaling the purity of their worldview to those whose approval they seek. Like I said, there's not much to fight over these days, but surely something greater than gay marriage -- ending the form of state-sanctioned discrimination known as affirmative action, for example.
August 16, 2010
Guys today name their Xbox and iPod, not their car or their junk
We give nicknames to very few things around us, only those so close to us -- sustaining and enriching our lives -- that we feel the need to individuate them from all the other clutter around us. Our pets are the most obvious example. Naming newborn or unborn children is common, but not everywhere. In some places where infant mortality is high, mothers don't even name their children until they reach a certain age. Not naming them right after birth (or even before) is a way to keep from growing too attached to them and then feeling demoralized should the child die within the first couple years of life. This makes it easier for the mother to move on with things and start again.
Back when young people had a life, guys used to name their cars in the same way that they used to christen ships or give names to their horses. It was a vehicle for physical freedom, and as a token of gratitude you gave it a name to pick it out against all those other cars that weren't so special. Since the early 1990s the percent of 17 year-olds who have a driver's license has been plummeting, and within the last few years it finally slipped below 50% -- today the typical 17 year-old does not have a license, let alone access to a car, especially one that is theirs alone.
The LA Times ran a feature story on this major shift in the early 2000s, and my own experience talking to Millennial guys matches that -- namely, they view cars mostly as a burden and a hassle. Driving a car is no longer the holy grail for the freedom-seeking American teenage guy, which makes sense since, not having a life, they don't really have anywhere to go or anything to do in the back seat like guys used to. I haven't seen any survey data on this, but obviously the percent of teenage guys who give a name to a car has fallen sharply; the only question is by how much.
After some googling, I found that plenty of guys today name their Xbox video game console, their iPod, iPhone, Mac laptop, etc. I'm not talking about cases where the technology allows you to register a name for it, like how you can have your computer referred to as "Darwin" instead of "My Computer." I mean where they christen the thing and refer to it that way in their own natural speech. Again there was no survey data that I could find, so how common this is I don't know. But I do know that no one nicknamed their phone when I was a teenager, or their computer, or their video game system.
This shows the larger shift that young people have undergone: they used to value the greater physical freedom that a car could deliver, while now they value the greater informational or virtual freedom that comes from farting around on the internet via your iPhone or playing Halo (whether alone or online). They then give names to the vehicles that take them to the places they value most.
Not that I talk to Millennial guys an awful lot, but I'd wager dollars to donuts that almost none of them have given a proper name to their cock. Before, it wasn't as though everyone did that, but it was common enough to hold a place in the folklore and folkpractices of young guys. Millennial guys are so afraid of expressing their libido, though, and are generally grossed out by the physical nature of sex -- "Ewwww, she's got pubic hair! Mommy, tell her mommy to make her shave it off!" And statistics on sexual behavior show that they're far less wild than young guys used to be (the decline starts in the early 1990s, as with wild behavior in general). Put all of those pieces together, and I can't see that the old practice of naming it has survived.
For the same reasons, I doubt guys who have a girlfriend give names to her anatomy either. Obviously even in wild times this was less common than naming your own, since every guy had his own junk but not all had a girlfriend, and it's a bit of a risky move to give her parts their own names, since girls aren't as salacious as guys. * Still, today this folkpractice of young people must be entirely dead.
This decline in wildness is what's behind the observations about the lack of emotional depth that young people have in relationships today, like girls who are busy texting furiously even when they're grinding some guy on the dance floor. After all, it takes quite a close emotional bond for a girlfriend and boyfriend to share a pet name for her pussy. A subset of this phenomenon is all the worrying about the "hook-up culture," which is actually a sign of Puritanical rather than Dionysian sex lives. Among young people, there's less sex, it happens later, and with less of a sense of magic pervading it.
* To show that this happened at all, I wanted to refer to a popular movie where it happens, but I can't remember the name or the overall plot. The scene shows a woman listening to an answering machine message, where an ex-boyfriend or ex-husband is trying to win her back by reminiscing about their fun times together. (Or perhaps he's there in the room with her?) He reminds her of how they named both her left and right tit, as well as her pussy, supplying the names for all (which I forget). She has a somewhat embarrassed, somewhat amused look on her face, like Sigourney Weaver does when Bill Murray wins her over in front of Lincoln Center in Ghostbusters. I tried finding this movie through google, but you can guess what sort of results I got instead. Anyone know which one I'm talking about? Seems like it was from the '80s through the mid-'90s.
Food observations, non-meat edition
- Canned or frozen fruits and vegetables are superior to fresh ones unless you're cooking the fresh ones yourself. Most people don't do that anymore because we've set up a purity taboo about eating cooked vegetables -- they have to be "fresh," i.e. raw, in order to be pure. Even when we're allowed to cook them, it has to be some way that will highly oxidize the vegetables, such as grilling. No one merely boils them, let alone ferments them.
Eating raw fruits and vegetables is a surefire way to get food poisoning of one degree or another, especially if they have a higher sugar content that attracts flies in the supermarket. Cooked vegetables that come in cans are less risky. Ditto frozen fruit -- it won't grow fuzz in a couple of days. Many people complain about the gas they get from eating cruciferous vegetables (members of the cabbage family, named for their cross-shaped, four-petaled flowers), such as cabbage, cauliflower, Brussels sprouts, broccoli, etc. The plants are only deploying some mix of toxins, irritants, and other defenses to keep from being eaten. Fermentation apparently zaps most of that junk, as you can quickly verify by eating a bunch of sauerkraut -- no gas. I go through one or two tightly packed cans of that stuff a week, and I could easily go through four.
Extra bonus: canned vegetables are a lot cheaper since their maintenance doesn't cost the supermarket much at all, aside from some shelf or freezer space. Raw ones are much more high-maintenance, and you end up paying for that. People are willing to pay a lot in order to avoid violating food purity taboos.
- Hazelnuts are the perfect nut. Almonds are just sweet enough to be somewhat addictive, which keeps them from being ideal. I once ate about a half-pound of almonds, and I could never do that with the much more savory hazelnut. Like the almond, though, it's very low in carbs (most of these being fiber anyway) and loaded with monounsaturated fats. Macadamias are this way too, but they don't have enough of a crunchy structure to make a good snack food. They're hard at first, but you can't munch on them. Other common nuts are either too high in omega-6 polyunsaturated fats or too high in carbs.
- Real yoghurt has vanished from the shelves. I don't eat yoghurt often at all, but I remember just a couple years ago it wasn't so bad. Now more or less every variety is fat-free or low-fat. You really have to hunt to find a full-fat one, assuming it's even there. Of course the sugar content has only gone through the roof in the meantime, since it has to taste like something. By this standard of "healthy eating," the healthiest thing we could eat would be jelly beans -- no fat, tons of sugar.
- After plenty of self-experimentation, I've come to the same conclusion I did last year about the effects of cheese (especially aged varieties) on libido, provided you're also on a low-carb diet. Once guys leave their teens, they feel glad not to be so governed by their boners all the time, and to be able to think about other things than that cute girl in front of you in math class who always finds some excuse to lean over her desk. Still, going half-way back there feels like the right amount of prurience.
Hmmm, a low-carb diet with plenty of cheese -- that sounds like what distinguishes a pastoralist diet from a hunter-gatherer diet (no dairy) and a farmer diet (super high-carb, uncertain dairy). No wonder the herders in all those pastoral poems are always so amorous. It can't be due to leading semi-isolated lives and having the pressure build up -- there are plenty of semi-isolated farmers, hunter-gatherers, and modern-day laborers who don't always have love and sex on the mind. If you deprive them all of sex for the same length of time, it seems like the herders experience greater pangs of feeling lovelorn -- meaning they had higher libidos to begin with (there's no frustration where there's no desire).
August 15, 2010
Pop musicians who are too educated for their own good
For the past several weeks Starbucks has been playing one of the most doofusy indie / pop songs ever recorded. Did he just say "drinking horchata"? God, what are they singing about now? So I looked up the lyrics; it's "Horchata" by Vampire Weekend. Here are the rhyme words: horchata, balaclava, aranciata, and Masada... How much more self-consciously status-seeking could this dork possibly get? Sure enough, their Wikipedia page mentions that the band met while they were at Columbia in the mid-2000s.
The only real pardon you can get for attending an "elite school" is if it's an art school, and only provided you don't fill your lyrics with references to horchata and aranciata. So, Talking Heads, who met at RISD, get a pass. Orchestral Manoeuvres in the Dark had a few songs about things you'd learn in school -- "Enola Gay," "Joan of Arc," "Maid of Orleans" -- but they are fun and sincere instead of laboring to project erudition, which doesn't belong in pop music. Even the pioneers of geek rock, They Might Be Giants, didn't use to be so nerdily self-aware. (John Flansburgh went to Pratt.) Their first two albums fit squarely within the late '80s college rock sound. Although too interested in wordplay, at least they weren't trying to one-up the inbred clique of Stuff White People Like competitors.
Since then there have been too many pop musicians who went to elite non-art schools: The Strokes, The Bravery, Lisa Loeb, etc. Not that their songs are all bad, but still. They spent their formative years studying hard to get into a good school, and then remained insulated from the real world during their late teens and early 20s, when real rock musicians would have either started playing gigs or been engrossed in real life in other ways. Overly protected people are not going to see enough of the world to get inspiration to make good culture, whether grave or light. Instead you'll end up singing about the fruity drinks you got hooked on during your semester abroad.
(Back when the elites were highly exposed to violence, they were unable to avoid real life and made great culture.)
You'll be more able to rock out if you don't get sucked into the higher ed bubble in the first place. See for yourself.
August 13, 2010
Dark-eyed Europeans were more likely to leave for the colonies (data)
The Europeans who left for America, Canada, and Australia were obviously not a representative sample of their home populations. Since they were not plucked at random by a Martian lab scientist, there must have been something different about them that made them want to leave. We can imagine all sorts of socioeconomic factors influencing this decision, but let's not forget that some people just have a greater sense of wanderlust than others.
In populations where eye and hair color vary a decent amount, like Northern Europe, correlational studies show that children with lighter eyes or hair tend to be more socially nervous, less exploratory in social situations. When we humans domesticated animals, their coat and eye colors lightened up (even if only partially, like large patches of white). Something similar probably happened when Europeans domesticated themselves, as they had to settle down in order to farm well. Looking at it the other way, the darker-colored ones like Christopher Marlowe must have had wilder personalities compared to their countrymen.
It's not very hard to see that Americans and Australians are a lot more sprawling and rambunctious than the British, and even Canadians are closer to the former than the latter. I've always wondered why there is such a disconnect between North Americans and the British on the topic of dark Irish or Scottish beauties. We seem to have a lot of them here, while those in Scotland and Ireland swear there's no such thing, that we Americans must be delusional. (It seems like there's more agreement on the presence of handsome, darker-looking Irish and Scottish men in the UK.)
Maybe, I thought, the colonies drew away the Europeans with darker eyes and hair? It would fit with the picture of lighter hair and eyes being associated with a less roaming social nature. Fortunately GameFAQs ran a poll on the eye color of its users. These are mostly 20-something males who are into video games. I collapsed blue, gray, and green into a single "light" category. Nation-level data were available for some parts of the world, including three Northwestern European countries and three former Anglo colonies. Here is the percent who have light eyes within each:
% light eyes   Country
57 Netherlands
54 Germany
53 UK
38 Australia
37 Canada
35 US
The big, clear split is between the homeland populations and their colonial offshoots. Notice how little variation there is within each of the two clusters -- mid-50s for Northwestern Europe, upper 30s for the former colonies. The lower numbers outside of Europe are not due to the greater presence of Southern Europeans, Africans, Asians, etc. If that were the whole story -- if everyone outside the Northern European group had dark eyes -- it would imply that the Northern European populations of all three former colonies were only about 2/3 of the entire country (roughly 37 divided by 55). In reality, the US is somewhere in the 70-80% range, Canada in the 80-90% range, and Australia about 90%. (The US may look so similar to Canada and Australia in this poll, despite having more dark-eyed people, if the American users of GameFAQs are mostly white. The website has never run a poll on race or ethnicity.)
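To make that back-of-the-envelope calculation explicit, here's a minimal sketch in Python. It deliberately oversimplifies by assuming that light eyes occur only among people of Northwestern European descent, and at the same mid-50s rate as in the homelands:

# Implied Northwestern European population share, if demographics alone
# explained the colonies' lower light-eye rates. Simplifying assumption:
# only people of NW European descent have light eyes, at roughly the
# homeland rate.
HOMELAND_LIGHT_RATE = 0.55  # mid-50s average for NL / Germany / UK

def implied_nw_euro_share(colony_light_rate):
    # overall rate = share * homeland rate, so solve for share
    return colony_light_rate / HOMELAND_LIGHT_RATE

for country, rate in [("Australia", 0.38), ("Canada", 0.37), ("US", 0.35)]:
    print(country, round(implied_nw_euro_share(rate), 2))
# Australia 0.69, Canada 0.67, US 0.64 -- all near 2/3, far below the
# actual 70-90% shares, so demographics alone can't close the gap.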
So little data is collected on eye color that I'm sure this is the first demonstration that the colonists who set off from Northwestern Europe were darker-eyed than those they were leaving behind. It fits with the tiny number of studies done on eye / hair color and personality. And it also helps explain why Americans and Britons can disagree so much on the prevalence of dark Celtic beauties -- looks like we have a lot more of 'em outside of the UK.
August 12, 2010
Did you have a job as a kid?
Thinking more about the tough aspects of the real world that young people are shielded from today, having a job came to mind. The days of Fast Times at Ridgemont High -- where the average teenager desires, seeks out, and gets a job -- are long gone. I think this must be a difference between safe vs. dangerous times, as movies showed teenagers with jobs throughout the '80s, and only by the mid-late '90s did they not have jobs (e.g. Clueless and American Pie), a picture that remained through the 2000s (Mean Girls and Superbad).
It's shocking to go around today and see how few teenagers there are doing teenager jobs -- any job in the supermarket, a fast food place, movie theaters, you name it. Also, teenagers don't babysit, mow lawns, or shovel snow from driveways anymore. (I reflected on the disappearance of the babysitter in American life here.) So in the interest of setting down some oral history about a lost way of life (lost for now anyway), why don't we remind ourselves and others how common it used to be for quite young people to have jobs? -- before wimpy kids and helicopter parents both conspired to keep the young 'uns protected from the messy world of real life.
My first real paying job came from the restlessness that nearly pubescent boys naturally feel (provided they aren't total pansies), the kind that spurs them to go out there and achieve something -- doesn't matter what, but something that most of the world's cultures would recognize as a growing-up activity. Perhaps we wouldn't go slay a dragon, but we weren't just going to sit around anymore. During the summer of 1991, when I was 10 years old, a small group of friends wanted to do grown-up stuff like work. Of course, child labor laws barred us from most jobs, but my friend Robbie's parents asked around and found out that some odd jobs could still be done by youngsters with their parents' and the government's permission.
So that summer, we decided to undergo that ancient rite of passage -- spraypainting someone's address on the curb in front of their house.
Robbie's parents drove us to the municipal building, filled out some forms, and there it was in our hands -- our permit. To us it was like a college degree with a royal seal of approval. Every time we gave our sales pitch, we always boasted about how "We've even got a permit!" So don't worry, we're not total amateurs or anything. Unfortunately the permit also said we could charge no more and no less than $4, when I'm sure anyone who would've paid $4 would've paid us $5, boosting our revenues 25% right there.
On the other hand, the regulation probably made the homeowners more willing to take a chance on us. They always asked how much the service was before making a decision, and if we had set our own price, they might have felt that a bunch of punk kids were going to take them for a ride. But when we told them that the law required exactly $4, I think it made us seem more honest and trustworthy -- it was a no-haggle price, and it was set by a neutral third party. Normal market participants don't need these regulations to ensure trust, but when the offerer of a service is a bunch of 9 and 10 year-old kids, it could be necessary.
We bought some supplies at an arts & crafts store in the shopping center just a couple blocks away from Robbie's house, practiced a good deal on flat surfaces like pizza boxes -- though I don't think on actual curbs -- and once we felt comfortably trained, headed out on our daily tour through some part of our neighborhood in Upper Arlington, Ohio (a quieter suburb just outside of Columbus and the huge Ohio State campus). This was back when kids were allowed to go anywhere and do anything they wanted, all unsupervised, and often without even telling our parents we were leaving as we walked out the door. One or two of us brought a bike along, though we mostly moved on foot. We were only out canvassing for work for about 3 or 4 hours a day at most, so we rarely got tired from walking. Plus it was fun to see new parts of our neighborhood, till then invisible to us because we didn't have any friends on that street. And all in the fresh summer air!
Our approach was as naive and simple as you would think: just walk up and knock on the door, give a pretty bland and hassle-free spiel, and then either move on or get down to work. I think the fanciest thing we ever tossed out to finesse them was to note how useful it would be if acquaintances or firetrucks and ambulances needed to find their house quickly or at night. Really, though, how much can you dress up a proposal to spraypaint their address on the curb?
The painting process itself was fairly easy, although not something you could do on auto-pilot. We got into a state that I would later learn is called Flow by psychologists, where your skill level and the challenge at hand are matched so that you do pretty well, get quick feedback about how you're doing, and are focused yet unaware of time passing. We had cut out the shape of a smallish picture frame from a brown paper grocery bag, which served as the stencil for the white background. It had to be of a flexible material since we were going to press it against a rolling curb. The only trick here was pressing down all around the perimeter, not just here and there, or else the white spraypaint could slip under an area that wasn't held down fast. From heavy cardboard, we made a stencil that would block out all of the white background except for a small rectangular area where one of the numbers would go. On top of this we laid our store-bought number stencil, let the black paint fly, and after three or four times, we were done.
Of course, that doesn't mean we didn't occasionally screw up -- and learn how to deal with the inevitable failures that come from putting yourself out there. (The shielded kids of the past 15 to 20 years fail to learn how to cope with failure because they don't get any real-world practice with it.) It only happened once, as I recall, but it was pretty devastating for us. Somehow whoever painted the white background made it too big -- it wasn't just a long rectangle that tightly framed the address numbers, but yawned vertically across the entire front of the curb, making lots of ugly negative white space.
I don't remember exactly how he went about it, but I think Robbie tried to make the numbers bigger to compensate, and that looked even more hideous. Frustrated and angry, he sprayed the black paint in a wavy, scribbly motion across the whole thing, as though he were crossing out a misspelled word in his book report. The rest of us freaked out, sure that we were going to get grounded for this -- for Robbie losing his cool -- and that we'd have to repay the homeowners who knows how much. Luckily they let us use their phone to call Robbie's parents, who came with a can of paint thinner and their grown-up social skills to smooth things over with the poor bastards whose curb we'd ruined. It took at least another hour to fix it up, but it ended up looking fine. Boy did we learn first-hand that when you find yourself in a hole, stop digging!
Still, most of our offers went nowhere, perhaps because they imagined something just like this happening to their as-yet-unsullied front yard. Paying $4 for a service that had a 1% chance of lowering their property value by $10,000 must have been too much risk for them to take just to indulge some kids in their quest to grow up and be doing something meaningful. By the end of the 10 or so weeks of our summer job, we'd pocketed between $80 and $90, although adjusting for inflation that would now be $125-140. Pretty sweet for kids who just graduated 4th grade, let alone for an activity we loved anyway -- hanging out together, exploring the neighborhood, trying to talk ourselves up to the grown-ups, and occasionally even getting to work in a focused way with our hands and tools.
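For anyone who wants to redo the inflation math, it's just a ratio of price levels. A rough sketch in Python, using approximate annual CPI-U averages (the index values here are ballpark figures from memory, worth double-checking against official tables):

# Convert 1991 dollars to 2010 dollars via the ratio of price levels.
# The CPI-U annual averages below are approximate and illustrative.
CPI = {1991: 136.2, 2010: 218.1}

def to_2010_dollars(amount):
    return amount * CPI[2010] / CPI[1991]

print(round(to_2010_dollars(80)), round(to_2010_dollars(90)))
# Prints roughly 128 and 144 -- in the same ballpark as the $125-140
# figure above.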
And that was just the work day -- afterwards we almost always went to either of the two local public pools (unsupervised, naturally) to cool off, to goof around after having to be somewhat serious for hours straight, and especially to wash all that gunky spraypaint off our hands before we went home.
What did we do with the money we'd earned? Well Robbie and I did most of the work, so we had more say. And because we were friends, we didn't want to just split up the dough like some band of mercenaries -- we wanted to buy some Really Big Thing that would belong to us all and commemorate our adventure. We didn't think about it this way consciously, of course; it just came to us naturally. In June of that summer, we heard about what would become one of the greatest video games -- and at the time, one of the most expensive -- Sonic the Hedgehog for the Sega Genesis system.
None of us could have ever saved up enough money to get it on our own, so we decided to put our collective earnings toward it and play it together, trading it around whenever someone else wanted to play it (mostly just me and Robbie, though). It was the same as the premise of the Simpsons episode where three boys pool their money to buy a rare comic book, but whose friendships are torn apart by the rivalry over who has greater access to it. That may happen in the imagination of self-centered TV writers, but back on planet Earth we got along just fine. It worked because it was such a small group and we all were friendly toward each other, so enforcement of fairness wasn't hard at all. Plus that was before anyone was a self-ruined video game addict who would have wanted to play it as their second full-time job. So, it was no more disruptive to our relationships than two or three housemates sharing a bathroom.
I returned to Upper Arlington once after moving away before middle school began, but I didn't think to look for the curbs we'd painted. I'll bet they're still there, even though the houses may look funny from additions, and despite the turnover of occupants during the 20 years since. You'd have to be pretty anti-address-on-the-curb to scrub it off if it was already there. It's no work of art -- no one will behold it in 10,000 years and marvel -- but it feels bizarrely rewarding to know there's some enduring mark of our existence and our teamwork out there.
* * *
When you're a little kid, especially a little boy, you can only take it so long before you get bored of trying to sell some of your toys or other junk in a yard sale, or setting up a sidewalk stand for lemonade. (Or for milk and cookies, like my brothers tried -- my parents soon bought all the milk, which had already spoiled in the summer sun, so no one would get poisoned). You want to band together with your allies and go out there and do something. When children and adults alike pussy out of their duty to help get the kids' hands dirty with real-world experience, not too long after we'll see a generation of young adults whose personal growth has been too stunted to function as well as young adults used to when they entered college or the workforce with a history of working.
This stunted generation may not bring down the economy, but things won't sail as smoothly as if they'd been properly trained by the real world. And even if the economy did work just as well as before, there's the matter of giving meaning to young people's lives, independent of how much value they add to the economic pie. If the Millennials appear to be coasting through an existential drift, bereft of a passion for life, it's no surprise: they've been a lot more shielded from the stressors of real life throughout their development. Even during materially prosperous times, like the dangerous years of the 1960s through the '80s, kids still felt the urge to make something of themselves, and their parents encouraged them. It's a tough world out there, so they've got to learn. During safe times, this sense of urgency evaporates, kids remain insulated, and they don't develop that grab-the-world-by-the-balls attitude.
i mean, dude, i'll get around to that later. right now i'm busy kicking some noob's ass in halo, so leave me alone.
(This is true even of the 1950s, which were a safe decade. Most people falsely project the period from roughly 1958 to 1963 back onto the 1950s. But that was when rock music blew up, when Hitchcock made Psycho, and when The Sandlot is set. Those are all from wild, rising-crime times, not The Fifties. People make this mistake because the explosion of the late '60s had not happened yet, but the pre-counter-cultural days were still squarely within the post-'50s era of unsupervised wildness.)
Well, at the start I thought I'd have a chance to yak on about my second pre-teen job as a paperboy on a route that my best friend and I shared in middle school, but this has gotten pretty long, so maybe I'll save that for later.
August 11, 2010
Any area where young people live more dangerously than before?
I've been trying to think of some part of life where teenagers and young adults are bucking the overall trend toward greater short-term safety and (over-)protection that began in the early-mid 1990s. But I can't come up with any good counter-examples off the top of my head. Any ideas?
Actually, let me update that to include any greater exposure to the tougher aspects of real life. It's not just dangerous things that kids are more protected from than before, like violence and drug use, but also having a job, being in the Cub Scouts or Boy Scouts, or playing challenging video games (ones that are "Nintendo hard").
I emphasize short-term safety because I'm convinced that insulating yourself from the stressors of the real world here and now ultimately weakens and endangers your robustness to them. (See the afterword to the new paperback edition of Taleb's The Black Swan for more on this piece of lost wisdom.) Someone who has never gotten into a fight of any kind probably lives in a world where full-out brawls have become more rare -- but once he does encounter that obstacle, he'll come away much more banged up than someone who has had the occasional exposure to fights. Someone who has been shielded from interactions with exploiters probably lives in a world where these encounters are more rare -- but once they do meet an exploiter, they'll really get taken to the cleaners, like Little Red Riding Hood. The only cure for naivete is occasional exposure to real-world dangers.
August 10, 2010
When music videos were narrative
I don't know where I stand on this whole "death of the word / rise of the image" debate, but after watching a bunch of old music videos, I was reminded of how important they found it to make the video tell some kind of story. Maybe it wouldn't treat the grand themes of the human condition, and maybe it wouldn't even make use of common folklore motifs (like "hero slays monster" or "stroke of luck enriches a thief"). Still, there was some narrative, however crude, that the song was meant to accompany.
Certainly not all videos used to be like that, but it was much more common than the approach of following the band around in some setting, with no action in particular required. Now people are not interested in music videos that tell a story -- neither the band, nor the director, nor the audience. "Uh, I think that'd be taking it a little too seriously," they'd protest in ironic detachment. Meanwhile, look at how overly visually ambitious a lot of the more recent videos are. It's fine to invest a lot of thought, time, and sweat into the video's images, just not the story that they might tell.
So perhaps there's something to this "death of the narrative" business after all. Just think -- even a medium that's primarily visual (supposedly) like music videos relied heavily on storytelling, much like a movie or TV show. When MTV launched, people were afraid that the videos would become total visual spectacles and crowd out the music. That didn't happen during MTV's heyday, when videos were more like brief stories set to music.
When did the shift occur? Like with so much else, it looks as though the wake of the larger early '90s social transition marked the end of narrative music videos. Here's a brief overview using representative videos from several periods. The spectacles "Here It Goes Again" from 2006 by OK Go and "Mo Money Mo Problems" from 1997 by the Notorious B.I.G. show where we've been for awhile. Rap videos used to avoid bling and blammo and focus on stories, even as late as 1993 with Dr. Dre's and Ice Cube's videos. Guns N' Roses was probably the last big rock band to make narrative videos, but Aerosmith kept them alive somewhat in the mid-'90s. Here's their earlier and better video for 1989's "Janie's Got a Gun." New Edition's 1986 video for their cover of "Earth Angel" features a classic folklore motif -- the kind girls who are rewarded and the mean girls who are punished when a lowly man reveals himself to have been high-status all along.
Actually, early MTV stuff sometimes just showed the band in some setting, before directors figured out what to do with the new medium. So Madonna had the spectacle-only video for "Lucky Star" but also a more narrative-based one for "Borderline," both from 1984. Duran Duran made a here's-the-band video for "Planet Earth" in 1981 (the year MTV was born), as well as a more story-filled one for "Hungry Like the Wolf" in 1982.
I'm not sure if this is a rising-crime vs. falling-crime difference, though I wouldn't be surprised. It's not "word vs. image" but "narrative vs. mere spectacle," of course. There were dazzling visuals in Aliens and The Terminator, but they were there to enhance the gripping narrative; people complain that more recent action and horror movies are only visual. Same with porn: earlier, the images were there to lay flesh on the bare bones of the narrative, whereas now it is story-free spectacle.
We'd have to go back farther in history to see if rises in the violence rate saw greater interest in compelling narratives. That's my impression anyways. The Elizabethan / Jacobean people and the Romantics had more fascinating stories than the Augustans or the Victorians, who seem more interested than the former groups in showing off their literary special effects. The Romantic era of course saw a surge in running around to collect folktales like the Brothers Grimm did. And the super-violent 14th C. gave us The Divine Comedy, The Decameron, and The Canterbury Tales, which wasn't matched even closely by the look-how-clever-we-are Renaissance humanist writers of the 15th and most of the 16th centuries (that is, before the violence rate blew up again around 1580).
What's the connection? People working at the interface between evolutionary psychology and literary studies suggest that narratives help us navigate the world by letting us run a bunch of social experiments, of a sort, and gaining wisdom from their outcomes. This knowledge is worth more when the world looks a lot more dangerous, so the audience will demand more of it, and the culture-makers will boost their supply in response.