Censorship, from whatever origin, is more common in falling-crime times because the culture is more idea-focused, so that danger is seen as coming from ideas, beliefs, even words or phrases. A rising violence level reminds people of their corporeality, and turns their minds more toward dangerous substances and acts -- hence a war on drugs, alcohol, etc.
I've been meaning to write that post for a while, but for now, take just one case study of greater censorship in falling-crime times -- book-burning. Wikipedia has compiled a list, although for our purposes you should skip down at least to #29 to get into the late Middle Ages. I'm ignoring cases where one civilization burns another's texts, as that could be part of a larger program of inter-group cultural destruction, not censorship within a society. And since we only have historical homicide data for Europe, I'm ignoring China, India, etc.
Notice where there are decade-long gaps.
Somewhere between 1450 and 1550 the European homicide rate began a secular decline that continues through today. However, there have been several reversals of that decline: from ca. 1580 to 1630 (the Elizabethan-Jacobean period), 1780 to 1830 (the Romantic-Gothic period), in some places from 1900 to 1930 (the Jazz Age), and finally from 1960 to 1990 (the New Wave Age).
It's interesting to note that the last hurrah of rising-crime before the secular decline sets in is roughly the 14th C. -- perhaps no surprise if you're familiar with how disastrous in general that period was. Yet there are no book-burning incidents listed, although there are from the 13th and 15th centuries. The Renaissance and Reformation, by which time the secular decline in violence had begun, were more in favor of burning books.
There's nothing listed for most of the Elizabethan-Jacobean wave of violence until 1624, either right at the transition to the falling-crime era or perhaps outside of it. The end-points for these reversals are always "circa" of course, because we don't have annual data for homicide rates.
The censors take up their torches again during the mid-17th C. through the mid-18th C. -- the Age of Reason and Enlightenment. There's only one counter-example from the following Romantic-Gothic period, in 1787.
During the Victorian era, it was France that kept the tradition going, along with Gilded Age enthusiasm in America from the New York Society for the Suppression of Vice. During the Jazz Age there are several examples, perhaps reflecting the fact that homicide rates didn't rise across all nations. But in America, where we did have a steady crime wave, there are no incidents listed.
What everyone thinks of by now as the height of book-burning, Nazi Germany, was part of a broader trend during the falling-crime mid-century. It wasn't all political either: they burned comic books across late 1940s America, as part of the moral panic over crime and horror comics. After the last major push, the burning of Wilhelm Reich's works in the mid-late 1950s, there are no incidents from the rising-crime period up through the early '90s.
There's a scene in Footloose where some local busybodies from the church burn corrupting books, until the Reverend Shaw breaks in to tell them to knock it off and go back to looking after one another, because Satan is in your heart, not in some book. In the real world, it was Reverend Shaw's view that carried the day during the religious revival of the '60s through the '80s.
In 1988, angry Muslims did burn The Satanic Verses in two English cities, although with so little influence over society at that point, I'm reluctant to treat that as a genuine counter-example, as though a mainstream group had done so. It doesn't look like there were notable burnings during the '90s, although during the 2000s and 2010s they've burned Harry Potter books and Qurans here.
Over the centuries, the scale of each falling-crime wave of book-burning seems to have diminished. That's comforting. Still, perhaps during the past 20 years there would've been a level closer to that of the mid-century if only print and books were as dominant a form of the literate culture as they were back then. With people getting a lot more text from the internet, the censors might not see as much of a point in burning books. But even the Victorian era was a lot better than the Age of Reason / Enlightenment period, let alone the Renaissance-Reformation period.
Probably the censorship in more recent falling-crime periods has just shifted toward regulations and what we now call political correctness, and away from cruder methods like burning books. Sometime later I'll cover the history of censorship codes for popular entertainment; that's one obvious case of substitution of subtler for grosser means of censorship in recent falling-crime eras.
In any case, the main thing to take away is that censorship, judged by the proxy of book burning, rises in a falling-crime period, when people put more faith in the power of ideas and thoughts. When during a rising-crime era they are reminded of the power of weapons, artifacts, substances like drugs, and the body itself, a certain style of moral regulators stops resorting to censorship and more earnestly takes up causes like Prohibition (and during the '80s, raising the drinking age) and the war on drugs, another topic for another post.
May 29, 2012
May 26, 2012
Reflections on the cold case of the first boy on the milk carton, solved by witnesses, not forensics
May 25 has been National Missing Children's Day since 1983, when Ronald Reagan chose the date in memory of the 1979 disappearance of Etan Patz. He was 6 years old, lived in the SoHo neighborhood of Manhattan (pre-gentrification), and vanished without a trace while walking just a couple blocks from his apartment building to the school bus stop. It was the first time he walked out alone, and he had insisted on it to his parents.
In today's world where children are locked indoors all day long, it must sound strange for a 6 year-old to demand that much independence, let alone get it. But that was not uncommon back then, and neither was his parents' acquiescence. On my first day of kindergarten in 1986, I put my foot down about walking to school by myself, maybe 10 minutes away. After offering many times to accompany me, my dad finally gave in and encouraged me with, "Agnostic, you're being braaaave" -- you know, how grown-ups draw a word out when they're trying to teach you for the first time (the word, not the idea).
I got more than I'd bargained for, having to steel my nerves every time I passed the final corner where a mere chain-link fence kept back one of those dogs that was too wild to be used in Cujo. But I kept at it, and eventually could walk right by without speeding up, even looking it in the eyes for a bit, despite my heart trying to punch its way out of my ribcage.
I share that anecdote to try to put you in the mind of Etan and his parents in that time period. Back then, as the violent crime rate had been soaring for decades, somehow even us children could sense the rising danger level, perhaps from our observation of how grown-ups spoke, acted, and expressed emotion in their faces. And we wanted to prepare ourselves for the dangerous real world before it was too late. Our parents must have sensed this predicament as well -- we can't hover over them all the time, and the way the world's going, they'll have to learn to stick up for themselves sooner or later. So they grudgingly went along with our demand for greater independence.
It's only during the falling-crime era of the past 20 years that parents have begun to say, Hey, it's not like there's any pressing need for them to be independent, so let them grow up after they get their first steady job. You can't project today's world back onto the world of 1979, though, and ask why his parents could have been so careless. In a world of rising violence, it makes sense to send your kid to boxing or martial arts classes, even if there's a 1 in a million chance that he could get seriously injured from those very classes.
Etan's high-profile case would inspire the movement to put the faces of missing children on the sides of milk cartons. Numerous sources claim his was the first face on the milk carton, although I'm not sure how well documented that is. The movement began locally sometime in the early-mid-'80s and went national in 1985. Here is a clip of a news show host interviewing two men who contributed to the movement to showcase missing children's faces on milk cartons and in the phone book.
Note the mellow tone in everyone's voices, and the almost thousand-yard stare on their faces. In most real-life clips from the '80s, people sound and look almost spaced-out compared to today's hyper-alert standards. It was some combination of thick skin, humility, and concern for looking out for others that kept their self-consciousness and egotism in check. Our falling-crime environment has, in contrast, led to thin-skinned, arrogant, and avoidant types, which shows up in all the huffing, teeth-sucking outbursts about what sucks or blows, and the creaky-voiced whining about "really guys, i mean seriously?"
When did they stop showing kids' faces on milk cartons, anyway? This 1996 article from the LA Times says that those programs had ended "several years ago." Probably 1993, then, the first year of the current falling-crime period. The article says that the pictures were simply moved from milk cartons to internet sites and airport kiosks, but of course hardly anyone would have seen those back in 1993. They also began putting faces on junk mail, which could only have annoyed the person getting them.
Starting around '93, they did not just become less visible than before, as though reflecting the falling risk of children running away or being abducted or killed. Rather, they became out of sight and out of mind. In times of greater concern for others, and willingness to lend whatever help we could, we found nothing objectionable about looking at some poor missing child's face every time we went to pour a bowl of Cookie Crisp. That would seem unbearably morbid in today's culture of social-emotional distancing.
Finally, 33 years to the day after Etan's disappearance, a man was charged with second-degree murder after confessing to the police. Pedro Hernandez was a 19-year-old stock boy at a nearby bodega who lured Etan into the basement with the promise of a cold soda, strangled him, stuffed him into a bag, and dropped the body off among some garbage a couple blocks away. His body, therefore, will probably never be found.
The motive is unclear so far, but Hernandez was probably one of any number of pedophile faggots infesting downtown Manhattan. He has so far not admitted to sexual abuse. Close-up photos show that he has both ears pierced, and if he got them done anytime before the hipster doofus crew began making that popular in the 2000s, he's almost certainly gay / on the down-low. (He does have a wife and daughter.) He's been on disability since '93, apparently chain-smoking all day while his wife brings in a paycheck, so I'd guess he got his ears pierced before then. Some Puerto Rican dude (or Dominican or whatever) who lived in the New York metro area with both ears pierced in the '80s, definitely sounds gay.
[Update: The NY Post today reports that both cops and family say Hernandez has HIV. Definitely a faggot pedophile.]
Hernandez of course did not just wake up this week and feel like spilling the beans to the cops. They were led to him by a witness, his brother-in-law, who told the police that during the 1980s Hernandez had told several people that he'd "done a bad thing" and killed a child in New York. Since he was working at a nearby bodega when Etan went missing, that proved to be a good lead. So, just like that, the police received a fairly detailed confession to a murder over 30 years ago, all thanks to someone who knew something coming forward.
In fact, about a month ago the New York police ripped apart the basement of another suspect who still lives nearby, a handyman who police thought could have done it, perhaps burying the body within or under the concrete additions he'd made to his basement. Jackhammers broke open the way, samples were collected and analyzed, and absolutely nothing new was learned at the end of it all. The only good that came of it was that the press coverage of the spectacle rang a bell in the mind of a witness, prompting him to speak up.
Falling-crime eras inevitably breed a complacency and blind faith in the ability of engineering and technocracy to solve most of our problems, especially the big ones. Just put the right people in charge of those who have mastered the right technology, and presto -- problem solved.
Shows like CSI would have flopped big time back in the '80s because most people knew the score about how crimes got solved, no matter how removed they may have been from actual detective work. All the forensic evidence in the world won't point the detectives to a particular individual, unless they left their Social Security card, or a gun with a unique identifier, at the crime scene. Witness information is needed to point detectives to a suspect, whose fingerprints, shoe material, DNA, etc., can be checked against the evidence found at the crime scene. Not to mention provide details of what happened that were not recorded in the remaining physical evidence, or give reasons why the killer wanted the victim dead.
Quantitative criminologists have studied the factors that lead to higher "clearance rates" (e.g., making an arrest) for crimes like homicide. Here is one studying the mid-'90s, although from the other stuff I've read, their conclusions are typical. Table 9 lists the main reason why a homicide case got closed, and by far the leading cause is that a witness identified the offender, whether at the scene (closing 48% of all cases) or after investigators found witnesses (closing another 12% of cases). In only 2% of cases was the primary cause for closure the collection of physical evidence at the crime scene.
Table 19 shows which crime scene variables were linked to higher or lower clearance rates. Virtually nothing having to do with the CSI approach was linked to a higher clearance rate -- not the presence of evidence technicians, their number, the time they spent at the scene, nor the search for or even discovery of fingerprints and physical evidence. The feature of the crime scene that helped the most was being the type of place jam-packed with witnesses, e.g. a bar, club, or residence, rather than a public park.
Table 20 shows which witness variables were linked to higher/lower clearance rates. Most of them are helpful, especially the ones relating to information provided by a witness. Table 34 sums up all of the variables that were related to higher clearance rates, and nothing from the CSI world shows up at all, whereas the second-most powerful predictor of closing a case is having a witness who provides useful information. The strongest predictor is a tech variable, but a very low-tech one -- running a simple computer check on a suspect. Even that assumes an awful lot has already happened, such as witness information leading investigators to a list of suspects who can then be checked out in their databases.
How can our increasingly autistic society appreciate that it is the social side of police work that matters more than the technological? I think even a techno-utopian sperg would concede that the human sensory systems of witnesses, not to mention their higher-order processing of social relations (like, was the killer known to the victim), are vastly superior at recording the physical and social facts of what went down, compared to the impoverished picture recorded by a fingerprint here and a drop of blood there.
The best role for technology seems to be in creating better ways for people to communicate with other people, for example by building better databases and making it easier for all investigators involved to compare notes. In the Etan Patz case, the confessed killer was originally listed as someone worth interviewing, but no one got around to it, probably because all relevant facts and angles weren't known to everyone working on the case.
An even more infamous case of kidnapping and child murder is the Adam Walsh case, which was finally solved when someone went through all the tons of data, especially related to the killer's multiple confessions (not so much the physical evidence), and connected the dots. Listen here to an interview with detective Joe Matthews and his popularizing writer Les Standiford about how the case was solved without CSI wizardry, relying on the results of investigator interviews of witnesses and suspects.
It's disturbing to think of what would happen if you got attacked in today's cocooning society. When neighbors don't even trust each other enough to send their kids to each other's houses for trick-or-treating, and when everyone has floor-to-ceiling blinds closed all day long, you can bet they aren't keeping one eye open for what's going on around their community.
People worry about becoming the next Kitty Genovese, who gets stabbed while those in the vicinity hear something wrong but don't act, a case of "the diffusion of responsibility" or "the bystander effect". But shit, at least if you're a victim of the bystander effect, there will be bystanders who can provide leads to the police! Hopefully they'll catch the son of a bitch and fry him to avenge your death. If, however, those around you are really buried deep in their cocoons, your murder might not even get solved.
When a cold case like Etan Patz's does get solved, it brings the world closer back into balance. Not just in the egocentric sense of, "Phew, now I finally know whodunnit, and a weight's been lifted off my shoulders." But more in the sense of finally being on the same wavelength as the dead. If there's anything there on the other side, he's probably been calling out who killed him for a long time now, and we never heard it. Now we know, not because of finally hearing him but because of run-of-the-mill police work, yet still we understand what he'd been trying to get across to us all along. That ends his frustration about not being able to get through to us, and it lifts us up to share in a communion of awareness about the mystery of his death.
May 24, 2012
Long-haired dudes now keeping it up in a bun?
Over the past 20 years, the whole distribution of hair length on men has shifted toward the shorter side. Guys who would've had hair halfway down their back now grow it around shoulder-length, while more conservative guys who would've had 4 or 5-inch hair now keep it very short. Even bald men didn't totally shave it down to the scalp like they do now.
In light of all the other social and cultural changes of the past 20 years -- departures from the '60s through the '80s that return us to mid-century patterns -- I interpret this as having something to do with males competing for females less based on looks and sexuality, and more based on providing resources and emotional support.
If he's lucky, that means he's bringing up a family and keeping his wife's neuroses from overheating. If he's not, then he's one of those guys who's just funding his girlfriend or wife's silly addictions and being used as her emotional tampon. In either case, though, she's not choosing so much based on looks, good genes, etc., as she would have back in the '60s, '70s, and '80s, when girls were boy-crazy. We can tell that what's physically attractive to women hasn't changed too much because romance novels (i.e. chick porn) still feature long-haired men on the covers.
So, when choosing boyfriends, they may have a similar picture of "dreamy dude" in their mind, but are ignoring those traits more and giving greater weight to how much providing and supporting he'll deliver. Also, if she is more intent on a long-term monogamous relationship, she'll be less willing to let him wander around with long hair, in the same way that married men prefer that their wives not leave the house with flowing locks that might attract the attention of strange men.
A more recent trend shows that this applies even to the tiny minority of men who still do have long hair -- namely, wearing it up in a bun. An NYT article earlier this year noted some examples, including this picture:
That set off lots of posts from the internet peanut gallery, mostly blaming the NYT for posting another "fake trend" article based on a handful of examples. But here is an even earlier post from late 2010 by some girl who documents the trend with celebrity examples, with most of the commenters agreeing that they look hotter with the bun than with it totally down.
I've kept an informal lookout around here, and I too notice very few of the long-haired guys actually wearing it down, and a good fraction of them wearing a bun (usually lower on the head than in the picture above).
I'm guilty of that as well, though. When I started a couple weeks ago, I thought up the expected rationalizations -- it'll have to be that way to avoid the summer heat, girls tell me I should keep hair away from my face to show my "good bone structure," I won't have it blowing in my face while walking around, etc. But then I thought that my counterpart just 20 years ago would have looked like he was trying out for a part in The Lost Boys, even though they had hot summers, windy weather, and bone structure to worry about back then too.
You only have so much wiggle room within contemporary norms, and outside of that you risk getting thrown out. For ideological and artistic stuff, I could care less about staying within the bounds of our airheaded zeitgeist. Personal appearance isn't so fundamental to our identity, though, so I'm willing to concede more ground there. Besides, when the tide does eventually turn, it'll be easier to let your hair down out of a bun than to grow it out long from nothing.
May 22, 2012
Super Bowl national anthems, normal and weird
Some day -- like right now -- the kids are not going to know how drastically different things were even during the recent past. (Omigod, you mean people used to use their phones for, like, talking?!) The best way to give them illustrations is to show how some thing that's existed all along has changed over time. If something exists and then doesn't, or once did not exist and then does, I don't think they know what to make of it as far as understanding the flow of history.
The Super Bowl has been around long enough and is so popular that they'll know what you're talking about. And they've sung the national anthem just about every time. Since the major cultural shift starting in the early-mid 1990s, the performance has grown so Victorian in its overly ornamental encrustations that fairly recent performances, say during the '80s, look like they're from another planet. They're not plodding and spare, like Harry Connick Jr.'s 1992 performance. They have just the right level of ornament and oomph.
Moving away from that optimum level of ornament could go in either direction -- toward the puritanical minimalism whose most famous example is the Apple look, but also toward the super-emo warbling of much pop music of the past 20 years. There's nothing contradictory about both extremes being popular: they have in common a strong departure from the optimal middle ground.
In case you're fortunate enough not to have heard the more recent, overly embellished and drawn-out national anthems, here are a few out of many examples from the Super Bowl: the one that ignited the trend, Whitney Houston in 1991 (and even this early one doesn't have so much ornamentation), Mariah Carey in 2002, and by far the worst, Christina Aguilera in 2011. About the only halfway decent one of this period was the Dixie Chicks in 2003.
The latest one I could find before '91 was Neil Diamond in 1987 -- talk about brief and to the point! (Note for the kids: not Neil Young, the loathsome hippie/grunger.) Then there's Barry Manilow in 1984, and Cheryl Ladd in 1980. Not as bare as it would be if sung by Norah Jones, John Mayer, or any indie band, and still nowhere near as rococo as recent ones.
Perhaps the most dramatic example of how different things were is Diana Ross in 1982. She was a superstar from the heyday of '60s girl groups and disco, so based on her counterparts of the past 20 years, you might expect her to unleash her inner diva upon the audience. Yet she doesn't warble at all, and invites the crowd to sing along with her -- and you can really hear them! I didn't think about it until I saw this performance, but the more unique and unpredictable your rendition of a well known song is, the more impossible it is for the audience to join you. All you expect them to do is remain silent while they marvel at your awesomeness. If it's a guitar solo set within a larger song, that's cool. But not the whole fucking song, like it's mid-century jazz or something.
You get the picture, so I won't go through the whole history before the '80s as well. One glimpse from the '70s, though: here's the U.S. Air Force Academy Chorale in 1972. Again notice how "short" it sounds compared to the draaaaawn-ooooout ones, and how the low level of on-the-fly embellishment makes it easier for them all to harmonize.
Speaking of harmonies, here's the group that should have been in Barry Manilow's place, Huey Lewis and the News at the 1984 MLB All-Star game:
May 20, 2012
Higher support for suicide in falling-crime times
The General Social Survey asks four questions about whether a person has the right to kill themselves in different scenarios: having an incurable disease, having gone bankrupt, having dishonored their family, and being tired of living and ready to die.
They are all positively correlated with each other, so I reduced them to a single underlying factor -- suicide support -- using factor analysis, in order to compare years. This makes the value less interpretable: it's like a z-score, showing where a given year stands relative to the other years, rather than an absolute statement such as "X percent support this position." At any rate, higher scores mean higher support. Here are the results:
From 1977 to 1991, support hovers near -1, aside from a couple fluke years. All those years are in negative territory. From 1993 to 2010, all are in the positive territory, and bounce around mostly between 0 and 1.
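For anyone who wants to replicate this, here is a minimal sketch of the kind of computation involved. It is not the exact code behind the graph; the GSS item names (suicide1 through suicide4), their coding, and the CSV extract are assumptions you'd adjust to your own data.

```python
# Minimal sketch (assumptions: a GSS extract saved as gss_extract.csv with
# columns year and suicide1-suicide4, coded 1 = yes, 2 = no).
import pandas as pd
from sklearn.decomposition import FactorAnalysis

items = ["suicide1", "suicide2", "suicide3", "suicide4"]
df = pd.read_csv("gss_extract.csv").dropna(subset=items + ["year"])

# Recode so that 1 = supports the right to suicide, 0 = does not.
df[items] = (df[items] == 1).astype(int)

# Collapse the four correlated items into one underlying factor.
fa = FactorAnalysis(n_components=1, random_state=0)
df["support"] = fa.fit_transform(df[items]).ravel()

# Average the factor score by survey year. The units are z-score-like,
# relative to the pooled sample, not an absolute percentage.
print(df.groupby("year")["support"].mean().round(2))
```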
I haven't looked into popular views on suicide during the mid-century or the Jazz Age. Even with this narrower time focus, the fact that the upward shift happened in either 1992 or '93 suggests a link with the trend in the crime rate. The whole society got a lot less violent, and that includes suicide rates as well as homicide rates. People who hear less frequently about suicides might be more clueless about them, and so naively grant them more dignity than they deserve.
Still, mere cluelessness doesn't produce a bias in the more accepting direction -- people who are clueless about suicides could hypothetically be more intolerant of the unfamiliar. What makes their lack of familiarity go toward more, not less, acceptance?
Here we see the role of the cocooning trend of falling-crime times, as safer environments make people feel less of a need to connect with, rely on, and look out for each other. "You want to off yourself? Well, hey, it's your life, and I have no stake in it, so I don't care what you do with it." That's not how you respond to someone you are connected to when they bring up the idea of killing themselves.
This is a general principle: the more overlapping your lives are, the more intensely you care about what they do with theirs, because it is partly yours. They don't have 100% veto power over their life because it's embedded in the lives of many others. And reciprocally, you yourself don't have 100% veto power over your own life. So it's not like slavery, where the person has no say over what happens to them, but neither is it the pure autonomy of an atomized, hive-like society. It's a community-minded world where all concerned get to weigh in, perhaps according to how affected they will be by the decision -- like, close friends have more say than co-workers when someone they all know is thinking of suicide.
Maybe I've just been skimming or reading the wrong things all my literate life, but I can't remember reading anything useful on this topic that was philosophical or abstract. Until someone close to you (or who was once close to you) kills themselves, you won't understand that the suicide has robbed something from other people, involving life itself. He has not taken their lives, as though it were murder, but destroyed all their "shares" in his own life.
And often enough, in leaving he delivers one of the ultimate insults to those near and dear to him -- that "you guys are just too dumb, too callous, too cheery, or too whatever, to get my situation, and so it wouldn't be any use trying to reach out to you all for help." Gee, thanks for the vote of confidence, and for giving us no chance to defend and prove ourselves against the charge.
One consequence of that shift shown in the graph is that we're less likely these days to see the suicide as someone who needs to be forgiven by others for the harm and disorder that he chose to throw into their lives, and more likely to see him as a helpless victim buffeted about by "social forces." Anything to view our relationships with others in impersonal, emotionally avoidant ways.
May 16, 2012
Has the internet benefited dissidents or the thought police more?
A recent comment somewhere at Steve Sailer's blog stated a widely held view about the power of the internet to give dissenting voices a megaphone, so that dissidents are better off in the online age.
And yet here we are, with political correctness worse than you could've imagined even as recently as the early '90s. The President is black and endorsing gay marriage -- didn't see that one coming. Quite plainly the internet hasn't empowered the anti-PC people very much, and perhaps has even undercut what little power they used to have.
I haven't heard anyone touch on that before -- that the internet allows not only the dissidents but also the thought police to more effectively reach their targets. True, you're more exposed to dissenting viewpoints now than in 1990, but you were exposed enough to them back then that they weren't secret. Of course the homos brought AIDS on themselves, of course blacks commit crime at higher rates than whites, and of course part of what makes someone more successful is being smarter. You could've heard your neighbor make these observations in casual conversation.
In the social psychology literature on conformity -- the classic example being the Asch line-judgment experiments -- people tend to go along with the group's wrong answer unless there is even a single dissenter. For example, you're asked which of two lines is longer, everyone else says the short line is longer, and you tend to go along by saying the short line is longer. But when one other person says the long line is longer, suddenly you go with the right answer -- "So I wasn't crazy after all, someone else thought that the others were nuts, too!"
If the internet had in that way brought us some dissenting voices where zero had reached us before, it would have been beneficial to dissidents. I don't really see that, though. Like I said, who never heard someone say out loud that blacks are more violent than whites?
Since it seems to be more of a change in the degree of exposure to dissenting voices that we'd already heard, we also have to look at how much more we get exposed to the thought police via the internet. It's the net effect of those opposing forces that tells us what the internet has done for dissidents.
Well, just like in the offline world, the orthodox viewpoints have more megaphones, louder ones, and ones that are cross-linked or teamed up with one another. In the good old days, you had to search out a propaganda forum like the op-ed pages to find the thought police reminding you what the official story was.
These days online, it sets up base wherever you might go. You may just be doing a casual google search, and up pops a post from a discussion forum, one of those yahoo question/answer pages, blog posts, ad nauseam, that shout out some bit of PC orthodoxy. Not to mention the endless stream of bullshit from Twitter and Facebook feeds. It's also a lot easier for the foot soldiers of the Establishment to link to an op-ed or whatever by the mainstream media, which propagates it much faster than when the same pipsqueak would have had to clip out or xerox the article and paste a bunch of copies up around the office, student union, or wherever, like a jackass.
You shouldn't jump too quickly to praise some new bit of technology just because you see some way that it could make you better off. You need to think if it will make your enemies even better off. In a contest, it's only the relative gain you make against your adversary that counts, not your absolute gain seen in isolation.
May 15, 2012
Jews are uniquely non-religious, part 2
In an earlier post we saw that Jews are much less religious than Christians, even when you only look at white liberals with high enough IQs to be college material. The measures for religiosity there were belief in god, beliefs about the nature of the Bible, and frequency of attendance at religious services.
Poking around the General Social Survey, I found a couple other relevant questions (ones that still yield decent sample sizes for comparing Jews to Christians and no-religion people). These get more at the subjective, almost social relationship between a person and god. One asks how close you feel to god most of the time, and the other how often you pray.
Again I've restricted the data to whites who identify themselves as political liberals and who have an IQ of at least 120. We already know that Jews don't feel as close to god and don't pray as much as Christians. But if that's just because the Jewish group doesn't include blacks or Hispanics, or because Jews tend to be more liberal, or because they tend to be more intelligent, we shouldn't see any difference once we've controlled for those factors. Here are the results:
Just as before, the Christian groups look fairly similar to each other, the no-religion folks are the farthest away from them, and Jews are in between, although closer to the no-religionists. Those who feel "not very close" or "not close at all" to god make up 23% of Protestants, 10% of Catholics, 44% of Jews, and 47% of those who profess no religion (not counting the extra 22% of them who don't believe). My guess is that people who don't feel close to god to begin with are more likely to be drawn to Protestantism, especially the born-again types.
As for prayer, 45% of Protestants pray at least once a day, as do 55% of Catholics, compared to only 11% of no-religion people and 25% of Jews. The farther away you feel from god, the less you feel like praying to him.
So once more, the usual factors that explain religiosity are not the only reason why Jews are less religious than Christians. There is some set of genetic and cultural factors unique to them that split them off in the graphs above and that indeed place them closer to people who identify as having "no religion."
As I mentioned in the earlier post, those are primarily the genetic adaptation to the white-collar managerial ecological niche, and the cultural tradition of legalistic bickering. The first leads them to not concern themselves with god -- they figured out the right interest rate to charge because they've got big brains. And that goy hick who was trying to get away with sending less money to the tax farmers, well they ran circles around him because they're so clever. As we saw during the mid-century, managerialism and godlessness tend to go together.
The tradition of legalism makes it less necessary to have any kind of personal or quasi-social relationship with the supernatural. Just follow the rules, and you'll please god, and he'll treat you well, kind of like the school principal who you rarely meet unless you fuck up the rules big-time. It just goes to show how far the Talmudic religion has departed from the Yahwehism of the Old Testament.
Julian Jaynes discusses several groups in more recent history that show thought processes closer to those of the "bicameral mind," where people experienced a guiding god very personally -- as a hallucinated voice that commanded them, in Jaynes' view. Shamanic figures are the best example. But he didn't discuss the opposite groups -- who are the ones who've taken the bicameral breakdown ball and run with it?
That would seem to be the Talmudists. As we've seen in this series, they've gone the farthest in making a religion where the relationship to god is abstract and out-there rather than concretely embodied, impersonal rather than personal, and mediated by a squabbling scribal class rather than direct.
Some degree of freeing ourselves from that bicameral mind seems to have been for the better, but there is such a thing as taking it too far. Feeling closer to god and reaching out to him may not directly affect how things actually turn out. Still, it may introduce a sobering dose of humility when we're planning things out. It's no accident that Jews, whether Talmudist or atheist, have gone the farthest in proposing and struggling to implement the most wacked-out design specs for a human society, Karl Marx and Ayn Rand just being the tip of the iceberg.
Lord knows that non-Jewish white liberal smarty-pants have tried to push us in the same loony direction, but they do not shove with quite as much oomph as their Jewish fellow travelers do. Part of the reason why the Gentile autistic planners haven't brought mankind as close to the brink must be their Christian upbringing during the formative years, whatever became of it in adulthood, and their lesser genetic adaptedness to managerialism.
GSS variables used: race, polviews, wordsum, relig, neargod, pray.
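For anyone who wants to re-run these numbers, here's a rough sketch of the filtering and cross-tabulation in Python with pandas. The file name, the category codes, and the wordsum cutoff standing in for an IQ of at least 120 are my assumptions, not a record of exactly how the graphs above were produced.

import pandas as pd

# Hypothetical GSS extract holding the variables listed above, one row per respondent.
df = pd.read_csv("gss_extract.csv",
                 usecols=["race", "polviews", "wordsum", "relig", "neargod", "pray"])

# Whites who call themselves liberal, with high wordsum scores. The codes are the
# standard GSS codings as I understand them, and wordsum >= 9 is only an assumed
# stand-in for "IQ of at least 120."
sub = df[(df["race"] == 1) &        # 1 = white
         (df["polviews"] <= 3) &    # 1-3 = extremely liberal through slightly liberal
         (df["wordsum"] >= 9)]

# Percentage breakdown of closeness to god and prayer frequency within each religion.
for outcome in ["neargod", "pray"]:
    tab = pd.crosstab(sub["relig"], sub[outcome], normalize="index") * 100
    print(tab.round(1))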
May 14, 2012
Songs you owe an apology to for once hating?
Not only because they are better than you gave them credit for, but especially because you were listening to far worse songs at the time. A real "those in glass houses" moment, I mean.
I'll start:
Now, when "To Be with You" by Mr. Big first came out, I loved it as much as everybody else. The almost childlike sincerity must have struck a nostalgic chord in most of the audience. I was in fifth grade, though, so it felt like music made just for us as we were becoming possessed by our first big crushes. It was on the air all the time, and that's one of the earliest memories I have of pulling the radio closer and singing along, feeling like they were right there in my room.
Scarcely two years later it was 1993, with the alterna-grunge and gangsta rap trends destroying the catchy, make-you-come-alive music of whites and blacks, respectively. Every young person was tripping over themselves to prove how extreme and antagonistic they were, even though we were all just a bunch of posers and wiggers. (Sadly that was not as bad as it would get, and it's only gone downhill since then.)
When such a massive shift is underway, the triumphalist hordes of dorky teenagers will look for the easiest victims to make an example out of, so that the others understand we're doing things differently now. The antithesis of the new zeitgeist was something soft or moderate and reaching out rather than shoving away, so naturally we went after rock ballads. You might think that the acoustic, folk-influenced sound would have spared Mr. Big during the era of MTV Unplugged, but to us it just made it sound even wussier.
These still are not my favorite kind of song, and I wouldn't put most of them near the top, but again remember what the sacrificers were listening to themselves: Nirvana and Dr. Dre. Compared to that, let alone the even less listenable white and black music of today, "To Be with You" is a real breath of fresh air -- melodic, uplifting, and charming.
I recently bought a 2-disc compilation called MONSTER BALLADS, and "To Be with You" is for sure one of the most replayable tunes on there.
I'm curious to hear other people's stories, particularly if they're from a different time period. Aside from hair metal ballads, I'd guess that disco deserves the greatest apology from anyone who makes an honest reckoning of their own tastes when they were bashing it. Punk wasn't so great in itself, but more as an ingredient that was later incorporated -- along with disco -- into New Wave, sometimes by the same artist (like Billy Idol). And mainstream rock of 1980-'81 wasn't too hot either, kind of figuring out where to go after the various Seventies sounds and before the explosion of heartland rock.
Again I don't think a dispassionate look would put disco up near the top of popular music, but certainly much higher than it's been regarded, and better than what most of its detractors were listening to.
May 12, 2012
Unknown: Arcade history of the 2000s
Awhile ago, when I was just first collecting examples of how cocooning the society has become in the past 20 years, I posted on video game arcade revenues, 1977 to 2002. It shows the collapse of one of the many public spaces for young people (although the shift has affected all age groups).
The data came from an academic article, and are now nearly 10 years out of date. There isn't a follow-up on the author's website, and after an hour of googling using one of his sources, I didn't find anything new. But that wasn't exhaustive, and I didn't use Lexis-Nexis or try every search word.
So there's a little project for anyone who wants it -- what were the revenues in the US for arcade video games from 2003 to 2011? Try Lexis-Nexis if you have access (all college libraries do), or use his sources: "Amusement & Music Operators Association, Nintendo, PC Data, NPD, Veronis Suhler, Vending Times (1978-2001)." He adjusted for inflation by converting figures into 1983 dollars.
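If you do turn up the newer figures, matching that inflation adjustment is simple. Here's a minimal sketch; the article doesn't say which price index it used, so CPI-U is an assumption, and the index values below are approximate annual averages meant only as placeholders until you pull the real series from the BLS.

# Convert nominal revenue figures into 1983 dollars using annual-average CPI-U
# (1982-84 = 100). The index values here are approximate and for illustration only.
CPI_1983 = 99.6
cpi = {2003: 184.0, 2011: 224.9}

def to_1983_dollars(amount, year):
    """Deflate a nominal dollar amount from the given year into 1983 dollars."""
    return amount * CPI_1983 / cpi[year]

# Example with a made-up figure: $900 million of 2011 revenue in 1983 dollars.
print(round(to_1983_dollars(900e6, 2011) / 1e6), "million 1983 dollars")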
Post on your blog and I'll link here. Or just post comments intermittently as you come across the data.
Obviously the downward trend has only continued, but it would be nice to have numbers. And it would be worth looking for any exceptional periods -- based on the rest of the culture, I'd expect a halt or a small upward blip of arcade popularity during the mid-2000s. Isn't that when GameWorks was at its height? And the DDR kind of games too? We can't say for sure unless there are data to plot along with the first 25 years' worth.
May 10, 2012
Race-related eruptions as a modern springtime sacrifice?
As usual during spring, there've been a series of scandals, brou-ha-has, and Two Minutes Hate having to do with race. The springtime racial conflict on campus was an openly talked about phenomenon even by the spring of... 2002, I guess, the last time I was heavily involved with the activist scene in college. The Don Imus brou-ha-ha was in April of 2007, the Duke lacrosse rape hoax was April of 2006, and so on and so on.
I checked Wikipedia's list of race riots for the past 100 years, and most were in spring or summer, including the L.A. Riots of April-May 1992. Partly that's because people aren't in hibernating mode. Also men's testosterone levels are peaking, leading to more male-male competition during the mating season.
Still, I don't buy that as the main cause of the vernal battle about race. A good deal of it is not physical but verbal, so cold weather during fall and winter shouldn't keep you from verbally lashing out at others. I don't buy the testosterone and male-male competition thing because, while that would explain the physical rioting, the main form these days is again verbal, and it's mainly prissy men and bitchy women launching the attacks. Not something that stems from high testosterone.
Sticking just to the verbal conflict, it's mostly the elites on the attack, the ones who would've been part of the priestly or mandarin class in ye olden days (and aren't too different from that right now). It's not engineers or programmers, and neither is it graphic designers or songwriters.
Given the elite status of the hounders, how savagely they pounce on their target, how publicly they try to make the humiliating spectacle, and how loudly they insist that there's a greater order-preserving purpose behind it all, not to mention the seasonal regularity, perhaps the best way to make sense of it is as a springtime sacrifice. The threat of chaos ever looms on the horizon -- whites forming lynch mobs, blacks burning down the ghetto -- so that every year the priestly class must sacrifice victims to the unseen powerful forces that could sow dissension among the people by provoking race riots.
This explains why the victims in the ritual tend to be higher-status, from the same racial group as the sacrificers, and often part of that very class. They see it as a greater loss, hence something that will buy greater influence with The Forces. At the same time, they choose crooked members of the in-group in order to kill two birds with one stone, as it were, by combining the sacrifice with a crime and punishment function, not unlike the burning of witches.
Blacks and lower-status whites, when they wage a race riot in the spring or summer, are falling back on the good old days when you sacrificed a member of the out-group to propitiate the in-group's unique gods. The whites who run the country, though, are more the inheritors of the Axial Age and after, where sacrifice became something of your own that you gave up to more universal gods. This case is not as self-sacrificing as Lent, but it still culls its victims from within the racial or ethnic group, even the same social class.
This also fits well with how elite whites feel about racism, namely that it isn't just a sign of poor taste, bad manners, low status, etc., but a more apocalyptic anxiety about something that threatens to open up the pits of darkness (so to speak) and engulf the world in turmoil and chaos. Some bargain must have been struck deep within the mists of history, whereby their priestly ancestors agreed to give up some of their own, provided that The Forces maintained the integrity of the barrier protecting us from the realm of disorder. And every year that original sacrifice must be re-enacted to ensure a secure spring-summer season, when people will be coming out of hibernation and potentially stepping on each other's toes, and so potentially ready to burst into open conflict.
It doesn't seem like the elite whites view blacks as having much control over the evil that they could do, otherwise they would appeal directly to blacks themselves -- please, no matter how outraged you get, don't burn down the inner city again! That's how a good deal of them dealt with the white rioters in Vancouver last year, evidently thinking them more in control over good vs. evil behavior. Instead they make vague addresses to the gods of social justice, or whatever, that the elite whites invented on their own. They don't address the local gods of the blacks, in the same way that you might try to talk to the neighborhood hell-raiser's parents to make him behave better.
They apparently see blacks more as foot soldiers, along with lower-ranking whites, in a race battle that would truly and ultimately be unleashed by an angry group of The Forces responsible for sealing the barrier, if the elites had failed to offer them a proper sacrifice.
In general I don't find it helpful to equate ideologies with religions, since ideologies rarely touch on the sacred supernatural. But now and then there are cases where they do look like yet another case study in some religious phenomenon. I haven't thought too much about all the connections or implications in this case; it's just the first thing that popped into my mind when trying to account for why they always happen during springtime.
May 9, 2012
Obfuscation about "Judaism" and its origins
Using the single term Judaism to describe the various religions of the Jewish ethnic / racial group makes as much sense as using "Italianism" to lump together the religions of the Etruscans, the Indo-European pagan religion of the Romans, Mithraism, early sects of Christianity, Catholicism, and so on.
You wouldn't notice this unless you bothered looking into the history of religions founded by Jews, which I've finally gotten around to. I'd always assumed that there was just Judaism, then Christianity building from that, followed by the non-Jewish but still Semitic development of Islam. Old Testament, then New Testament, then the Quran.
I'm ashamed to admit that I had little idea when the Talmud was written down -- circa 500 AD -- even though I dominated quiz bowl in high school and could've told you that Buddha lived before Jesus, that Mahavira founded Jainism, when the Council of Nicaea was and what it was about, etc. etc. etc. Everything I'd read, not to mention all that I'd heard said about it, kept secret the off-shoot nature of what is now called Judaism in everyday language -- that its distinguishing sacred text, as well as the beliefs and behaviors, originated in the Early Middle Ages, not around or shortly after the era of the Old Testament, like I'd assumed.
I repeat that I never encountered this level of obfuscation when it came to any other religion or sect. Every source told you when Mohammed left Mecca for Medina, when Martin Luther nailed the 95 Theses to the church door in Wittenberg, roughly when The Analects and the Tao Te Ching were written, that the Rig Veda came long before the Upanishads, bla bla bla. Yet none of them made it a point to say, "Talmud written down c. 500 AD, ushering in Rabbinic Judaism, the form followed by Jews today." The only date and figure attached to the post-Old Testament history of non-Christian Jewish religion is for Maimonides (and maybe Martin Buber).
Rather than perpetuate the confusion about terminology, I propose we not use "Judaism" either alone or modified, just as we wouldn't use "Italianism." "Rabbinic Judaism" would require us to call Christianity "Jesusite Judaism" or "New Testament Judaism." For the religion founded with the Talmud, we should just call it Talmudism, Rabbi-ism, or something else that is transparent and that makes no reference to the Jewish ethnic group.
I don't know what to call the collection of sects from the Second Temple period -- Second Temple-ism! Just so long as it doesn't have "Judaism" in it, as that would only make people think that back then there were full-time specialists called rabbis who read a text called the Talmud. The age of the Hebrew prophets that gave birth to the Old Testament, we could call Yahweh-ism. For the time before strict monotheism, use Elohimism.
Why is there a unique deliberate effort to hide basic facts about when the religion followed by today's Jews was founded? This comes from both Jews and non-Jews, so it isn't just the in-group concealing things from the out-group. And it cannot be for fear of somehow derogating the religion by noting that it's more recent than you might have thought -- that would apply to any "newer" religion. Yet no one keeps mum about when Islam began, or when the Lutheran or Mormon sects within Christianity began.
Jews are obviously on board here so they can claim bragging rights -- they were here first, and Christianity and Islam are just spin-offs of their religion, whether they see them neutrally or as corruptions (like most spin-offs are). Even the minority of Jews who are friendly toward Christianity and Islam would still tend to say that you should be grateful to our religion for providing the foundation of your own. Has anyone ever read or heard a Jew say that their religion, like Christianity and Islam, is "just" a spin-off of Yahweh-ism, let alone admit that theirs was spun off after Christianity?
You might think that atheist Jews would be the most eager group to get the message out -- talk about the chance to debunk a seriously wrong and seriously popular view about religion, and from insiders no less! But experience tells me that their ethnic pride would over-ride their urge to demystify religion.
Why the non-Jews play along too, I don't know. The obvious first guess is something about the 20th C. effort to reach out more to other religions, inflate their egos, and to do so for Jews in particular out of guilt or embarrassment over the Holocaust. But that would mean that before then, say in Victorian or Elizabethan times, non-Jews knew the score and were open about it. I don't know enough about the history of non-Jewish perceptions of Talmudism to say, but that's not my impression.
Certainly they looked down on it, maybe thought it was superstitious and overly legalistic, but I don't recall reading European authors saying that Talmudism was a late-comer, after Christianity. Perhaps they just weren't interested enough to look into the history of Talmudism to know when it began?
To end on a suggestion for future research, as they say, someone could do a very simple study to put some meat on these observations, and try to answer some of the why questions. Ask people a question (perhaps mixed in with other irrelevant, distractor questions) about the timing of Yahweh-ism, Christianity, and Talmudism, just a simple ranking task. Forget the wording for now. Let's at least see if people really are so clueless about Talmudism coming last, and if so what fraction of people are off.
Get the standard demographic information, too. I'll bet that the number of years of education is inversely related to getting that question right. Less educated people will probably just guess, while those who've absorbed the message of the elite will be biased toward giving Talmudism an earlier date.
As a control, give them a similar question about Yahweh-ism, Christianity, and Islam. Or throw all four into the same question. I doubt most people, certainly most educated people, would have trouble placing Islam after the others. Again that would mean there's something unique to our (mis)understanding of Talmudism, that it's not simply about it being a religion that we're not so familiar with.
If the rest of the questions are all about religious knowledge, factor analysis will show there to be a single underlying variable, like "general religious knowledge," that makes you more likely to know facts about all religions. However, I suspect that the question on Talmudism would "load" very low on this general factor, and maybe even inversely. That is, knowing lots of factoids about religion would actually hurt your chances of getting the Talmudism question right.
If set in this broader context, someone could actually do this study without raising an eyebrow beforehand, and just comment on it in passing during the write-up. Then let the commenters make a lot of hay out of the results.
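For the factor-analysis step, here's a bare-bones sketch of how the loadings could be checked once the answers are scored right/wrong. The data below are random placeholders rather than survey results; the prediction is simply that the Talmudism-timing item would show a low or negative loading on the general religious-knowledge factor.

import numpy as np
from sklearn.decomposition import FactorAnalysis

# Placeholder 0/1 matrix of correct answers: 500 respondents by 10 religious-knowledge
# items, with the Talmudism-timing question as the last column. Swap in real data.
rng = np.random.default_rng(0)
responses = rng.integers(0, 2, size=(500, 10)).astype(float)

# Fit a single-factor model ("general religious knowledge") and inspect the loadings.
fa = FactorAnalysis(n_components=1, random_state=0)
fa.fit(responses)
loadings = fa.components_[0]

print("Talmudism item loading:", round(float(loadings[-1]), 3))
print("Other item loadings:", np.round(loadings[:-1], 3))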
May 7, 2012
Where did the emo scene go?
While we're on the topic of how far away the mid-2000s are, when was the last time you saw a chick with scene hair? Most of those girls I couldn't stand (attention whores), but they did bring back big hair and sported colorful accessories to the max. Their music and other cultural tastes were repulsive, their attitude sucked, but admittedly at a surface-only level they did participate in the revival of the late '70s and '80s.
That really was a reversal of the trend among the goth-dressing crowd from the '90s through today, which has gone toward monochrome black uniforms, fewer ornamental accessories, hair closer to the head and not as long, and wearing a scowl on your face, whereas the scene kids at least tried to look spunky (fake though it turned out).
Google trends shows that searches for "emo" peaked in 2008, the first 100 entries for "scene" in Urban Dictionary are overwhelmingly from 2005 and fall off afterward, and a tumblr account called fuckihatescenekids hasn't updated in 2 years. Guess even they, who overall weren't such a radical departure, have returned to the toned-down cocooning trend. They used to be one of the few hold-outs against the video game culture, still interested in music and going to live shows, shitty as they may have been to listen to.
Fun fact: looks like a lot of that sub-culture was a red state phenomenon. Google trends shows that the most likely places to search for scene hair, emo, etc., were flyover red states. Maybe blue-staters found it too garish and prole-looking?
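If anyone wants to re-check that red-state pattern, the same numbers can be pulled programmatically. Here's a sketch using the third-party pytrends package, an unofficial Google Trends wrapper, so treat the exact calls as an assumption on my part:

from pytrends.request import TrendReq

# Interest over time and by U.S. state for the scene/emo search terms.
pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(["emo", "scene hair"], timeframe="2004-01-01 2012-05-01", geo="US")

over_time = pytrends.interest_over_time()                      # weekly interest, 2004-2012
by_state = pytrends.interest_by_region(resolution="REGION")    # breakdown by state

print(over_time[["emo", "scene hair"]].idxmax())               # week each term peaked
print(by_state.sort_values("scene hair", ascending=False).head(10))  # top ten states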
Since 2005 feels so far away, I wonder if even the Millennials born up through 1990 feel like out-of-place old geezers now.
May 5, 2012
The end of the '60s - '80s music revival
One of the clearest signs of the mid-2000s euphoria, which briefly halted or slightly reversed the wussification trend of the past 20 years, was the enthusiasm for music of the later '70s and '80s. That included the original bands as well as new bands who did a halfway decent interpretation of the originals, like the Libertines, the Raveonettes, or Franz Ferdinand.
By the same token, perhaps the clearest sign that the door of history has slammed shut on that whole zeitgeist is the growing apathy toward great music from the past. To try to quantify this, here are the changing frequencies of searches for music from different decades, using Google Trends. (That is, a search for "X music," where X could be 50s, 60s, 70s, 80s, or 90s.) They are all re-drawn as a fraction of their maximum, putting them all on the same scale. And they're just from the United States.
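Here's a quick sketch of that rescaling step, assuming the weekly series have been exported from Google Trends into a CSV with one column per search term (the file name and column layout are hypothetical):

import pandas as pd

# Each decade's "X music" series is divided by its own peak, so every curve tops out
# at 1.0 and they can all be read off a single scale.
trends = pd.read_csv("decade_music_trends.csv", parse_dates=["week"], index_col="week")
rescaled = trends / trends.max()   # column-wise: fraction of each series' own maximum

# First week each series dipped below half of its peak, if it ever did.
print((rescaled < 0.5).idxmax())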
The main thing to take away is the decline in interest that starts either in 2007 or early 2008, is complete by 2010, and has remained at that low level for the past two years (well below 50% of the peak interest level). That's no matter whether the person was looking for '60s, '70s, or '80s music. The brief rise-and-fall of those music video games like Guitar Hero and Rock Band reflects this too. They also reflected the more outgoing atmosphere back then, when people used video games in a more face-to-face, out-in-public way than they had during the '90s, early 2000s, and late 2000s through today.
The only decade to bounce back is '90s music, where interest has been growing since early 2011. What a joke. That's also the time that interest in '50s music begins, as it was not part of the mid-2000s revival movements. This strengthens the idea that the mid-2000s had a somewhat more rising-crime feel in the zeitgeist, probably as a response to 9/11. Now that we no longer feel threatened, we're beginning to revive the culture of previous falling-crime eras like the '90s and '50s.
I've really noticed this change at '80s night, which I started going to every week sometime in early 2008 or maybe late '07. By the middle or end of 2009, it felt different, not quite as energetic and social. Last year I confess I played hooky from it for probably half the year, and then after going back for awhile, I've stopped going again for the last couple months. I'll give it another try during the summer. I have no trouble getting into it, if the music and the crowd are on board, but that's changed a bit, reflecting what's shown in the graph.
For one thing, the music had gotten more homogenized and stereotypical, like the songs aren't being played to get people worked up but to serve as references that the audience feels special for getting. It borders on self-consciousness and makes it hard to feel lost in the crowd.
They used to play the Cure, Pet Shop Boys, Oingo Boingo, and plenty of others that you don't hear at all anymore. Even for the bands they still play, it's only the single most famous song, whereas earlier they also mixed in "Planet Earth" by Duran Duran, "Lucky Star" by Madonna, "Strangelove" by Depeche Mode, and so on.
And the crowd has grown a lot more socially retarded (it's been mostly college-age kids all along). The guys are somewhat dorkier, but it's mostly the girls ramping up their attention whoring. The "boys are yucky" mindset is so pervasive that it's hardly necessary for their circle of chick friends to cockblock guys anymore -- she doesn't feel tempted to get close to boys in the first place. It's not as though face-to-face interaction was so common even 4 years ago, but now it's rare to see girls dancing with boys at all.
If the revival is done for good, I can't complain. '80s night has given me lots of memories over the past 4 years, not to mention when I was going out twice a week in Barcelona during 2004 and '05. Those really were the peak years. It was so fun to hear the old music they played (Joy Division, Ramones, Cure, etc.) interwoven with the new bands that sounded similar. It felt more timeless, less self-consciously retro. Plus that club was open until 5am! Once it even landed on the night Daylight Saving Time ended, so we got an extra hour still...
And anyway, there's always the likelihood that when we get back to the same point in the crime cycle, there'll be a much broader and authentic resurrection of '80s culture. I'll probably be in my late 50s or 60s by then, but I expect to still be active and ornery enough to enjoy whatever is waiting for us.
May 4, 2012
Streetwise
Here's a YouTube link to a pretty cool "documentary," Streetwise, about teenage runaways living on the streets of Seattle around 1984. I put documentary in quotes because the label has since come to mean naked propaganda -- 80 minutes of cheerleading, peppered with 10 minutes of glib dismissal of The Other Side -- whereas this was made back when movies with that label were still historical or ethnographic.
(There could be some rising vs. falling-crime era link here, as Triumph of the Will kicked off the propagandistic documentary trend of the falling-crime mid-century, followed by American documentary propaganda for the New Deal. Perhaps in more socially avoidant times, documentarians cannot get too close to reality, lest they find themselves attached to the individuals and groups being documented. Better then to intellectualize it and turn it into some abstract debate between imaginary factions.)
At any rate, the stories of the kids in Streetwise resonated with what I'd read from written ethnographies of teenage runaways during those times, such as this one. Unlike a Donahue or Geraldo show focused on the topic, or a very special episode of a sit-com devoted to it, longer-form documentaries are a lot more entertaining and real easy to learn from. You get to see an entire cast of characters, each with their own personalities, goals, strategies, and so on, instead of the creators trying to sum all of that up into the one guest star on the sit-com or the handful of guests on a talk show. And you also get to see several narratives unfold, rather than brief encounters in the sit-com or the directed interviews of the talk show.
All of this tends to check any urges that the makers might have toward melodrama, which belongs to fiction. It also broadens the appeal -- even if you aren't very interested in the culture of runaways, you'll still find the movie watchable, unlike propaganda that only the target micro-niche will get off on.
Less structured ethnographies also wind up capturing a lot of other interesting features of the world the action is set in. Although not central to the movie, you get a hint of how much more religious the country was back then by running into a Pentecostal street preacher. And you can't help but notice how out-and-about everyone was -- old and young, black and white, male and female, well-dressed and scraggly-lookin'. Young people still had their own public spaces like the always-full video game arcades. Whenever a radio is on in the background, it's playing songs from the New Wave era. Somehow all of those things go together, but you wouldn't notice that if you'd only watched a bunch of highly focused interviews.
The world of Streetwise is particularly fascinating since it wouldn't even be 10 years later that Seattle would become the poster city for mopey and withdrawn youngsters, whereas the teenagers here couldn't be more strong-willed and resilient. Then 5 to 10 years after the heyday of grunge, Seattle became known as a yuppie capital due to Microsoft and Starbucks. By that time, 1984 was hardly ancient history, but it might as well have been, lying on the other side of the early '90s chasm.
That's not to say that the movie-makers romanticize the street kids. We see how malnourished many are, such as the 16-year-old boy who's told that aside from height, his body is like that of a 12-year-old. While seeing a doctor at a clinic, a teenage prostitute lists the various encounters with VD she's had. And by the end, one of them has killed himself. Within 5 years of the movie's release, another had been stabbed to death while sticking up for a friend, and a third was murdered by the notorious Green River Killer. I'm not sure when, but another died of AIDS. Not to mention all the lesser ways that the kids sabotage themselves and betray each other.
All in all it's just a refreshingly humanistic portrayal of a group we'd probably never get to know much about first-hand, showing their joys and their even greater fuck-ups. If you thought you'd never find a documentary that could hold your attention all the way through, give Streetwise a look.
May 2, 2012
Skipping school, 1960 to present
Looking back on adolescence in the '90s, one of the things we were gypped out of was the not-so-serious attitude toward school attendance that they had even within the recent past. We still had senior skip day, but it seemed like the too-cool-for-school days were over.
Part of it may have been greater acceptance of school authority, although more worthy of examination is the idea that young people just got a lot lamer during the '90s, continuing through today. It's not like we had the same underlying desire to occasionally skip school as the generation just above us had, and that we were simply better at keeping a lid on it. I think it just didn't strike as many of us as a fun thing to do in the first place.
To figure out what the data say, I went to the Statistical Abstract and the NEA website for more recent years. They publish tables for fall enrollment and Average Daily Attendance over the years. Take the ADA as a fraction of fall enrollment, and subtract from 100% to find out what percent of students were absent on a typical day.
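In case the arithmetic is clearer in code, here's a minimal sketch in Python. The enrollment and ADA figures below are made-up placeholders, not actual Statistical Abstract or NEA numbers -- plug in the real tables.

```python
# Minimal sketch of the absence-rate calculation described above.
# The figures are hypothetical placeholders, not actual
# Statistical Abstract / NEA values.

def percent_absent(fall_enrollment, average_daily_attendance):
    """Percent of enrolled students absent on a typical day."""
    return 100.0 * (1.0 - average_daily_attendance / fall_enrollment)

# Hypothetical year: 45.0 million enrolled, 42.3 million in average daily attendance
print(round(percent_absent(45.0e6, 42.3e6), 1))  # -> 6.0
```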
Unfortunately the data on ADA don't separate the primary from secondary students -- we'd really like to know how likely the middle and high-school kids are to be absent. The data collectors sensibly don't bother trying to break out excused vs. unexcused absences, which would be a nightmare for a national group to untangle for every school. Nevertheless, we can get a good-enough feel of the school-ditching culture.
The data go back to 1870, but I've restricted them to 1960 forward. If you look at the whole series, it's just one steady decline in absences, reflecting the switch from a more agricultural to an industrial and information-based economy, where parents push their kids to stay in school forever. You can't infer from high absence rates that kids in the 1880s were in a hooky-playing mood, as they were probably working around the house or farm, or perhaps working for wages.
There are several periods when kids reversed the overall downward trend. The largest started in either 1969 or '70 and lasted through '75, the heyday of "turn on, tune in, drop out." Another reversal lasted from '77 to '79, when Cameron Crowe was doing his fieldwork for Fast Times at Ridgemont High. Rounding out the era is a reversal from '86 to '88, when Ferris Bueller's Day Off was a top-10 movie at the box office.
There's also an isolated reversal from 2003 - '05, part of the mid-2000s euphoria when the culture edged slightly back toward the good old days. As you can see by the shallow and then precipitous drop afterwards, those days of respite are long gone. Millennials born in the late '80s and 1990 enjoyed a brief glimpse of real life back then, and hopefully that influence will last (although I'm not counting on it). If they were born in 1991 or after, they missed out.
That really sharp drop beginning in 2009 (i.e. the '08-'09 school year) matches my impression of teenage culture. Along with the rest of society, they were still having fun up through the summer of 2006, and then there was a shallow decline up through the summer of 2008. Somewhere around fall '08 through today, they've totally reverted to the trend of the past 20 years toward dorky and tame behavior.
I wonder what the qualitative changes are like that are not evident in the quantitative picture. If you skipped school in the '80s, you probably went to some public space (mall, park, arcade, bowling alley, etc.), or to a friend's house who also skipped with you. I'll bet my life that today, those kids who do skip school you'd find holed up alone in their room playing Halo, Skyrim, or whatever, the day after it comes out.
Putting that aside, let's treat the act of skipping school as crossing a threshold along a continuum, with people distributed along it in a normal curve. Then compare the end of the era with three closely spaced reversals (1988) to the present day: convert each year's absence rate into the z-score of that threshold, and the difference between the two z-scores is how far the average has shifted. School kids today are about 0.22 SD further in the obedient or unexcitable direction than their counterparts less than a generation ago. That's not a sea change, but remember how fast it happened. It'd be like school kids shrinking 2/3 of an inch in height during the same time (assuming a height SD of roughly 3 inches).
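For anyone who wants to check that arithmetic, here's a minimal sketch of the threshold-model calculation, assuming a fixed cutoff on a normally distributed "willingness to skip" trait. The two absence rates are hypothetical stand-ins, since the exact 1988 and present-day percentages aren't quoted above; swap in the real figures from the tables.

```python
# Sketch of the threshold model: treat skipping as crossing a fixed cutoff
# on a normally distributed trait, and ask how far the mean must have
# shifted for the absence rate to fall as it did.
# The two rates below are hypothetical stand-ins, not the real
# Statistical Abstract / NEA figures.
from statistics import NormalDist

def mean_shift_in_sd(p_then, p_now):
    """Shift of the latent trait's mean, in SD units (positive = more obedient now)."""
    z_then = NormalDist().inv_cdf(1.0 - p_then)  # cutoff's z-position, earlier period
    z_now = NormalDist().inv_cdf(1.0 - p_now)    # cutoff's z-position, today
    return z_now - z_then

# Hypothetical example: 10% absent on a typical day in 1988 vs. 6.5% today
print(round(mean_shift_in_sd(0.10, 0.065), 2))  # ~0.23 SD toward obedience

# The height analogy: a 0.22 SD shift times a ~3-inch height SD
print(round(0.22 * 3, 2))  # ~0.66 in., i.e. roughly 2/3 of an inch
```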
If your attitude toward school attendance persists at least somewhat throughout your life, by now the teachers are probably more likely to ditch class than the students.
May 1, 2012
Lasting generational effects on vocal qualities
People have up until puberty to acquire their native language with a flawless accent, which they never have to update or reacquaint themselves with. Same goes for much of the slang you pick up in adolescence, although your lexicon is more malleable.
It seems like there's more that persists than sound rules and dictionary entries, though, including the personality clues we get from the way people talk.
I've been watching or listening to a lot of the commentaries and retrospectives on the DVDs for Heathers, A Nightmare on Elm Street, Fast Times at Ridgemont High, not to mention many more over the past year or so. It's really striking how similar the Gen X people sound in their 30s or early 40s to when they starred in the movies as teenagers or early 20-somethings.
There are some changes just from being farther along in the lifespan, like not being so high-strung, as well as those reflecting the lower-key zeitgeist of the past 20 years. And some of them sound totally contemporary, like Winona Ryder, who uses heavy vocal fry throughout a 2001 look back at Heathers. She sounds just like a typical bored and unexcitable Millennial chick. Jon Stewart is like that too -- the dorky leading the dorky.
Still, most of them talk a lot like they did when they were young. At first it sounds like they're stuck in adolescence -- still talking like they did as teenagers -- until you realize that teenagers don't talk that way anymore. It's more of a generational marker than an age marker. It's hard to say exactly what speech qualities I'm picking up on; it's more of a gestalt thing. But they're more expressive, more willing to open up and be real without fear of What Others Will Think. The women I especially notice using a much greater range of pitches, rather than the flat-toned speech of today's joyless young people. And they still giggle and laugh a lot! Not fake laughs either...
I also notice this around my department and any classes I share with undergrads. The Millennials are just about all of the avoidant attachment style, whether mousy or dismissive. Such a drag to talk to, often like pulling teeth, now that the mid-2000s euphoria is long gone. (They weren't so bad back then.) I've met a couple born in '85 or '86 who talk the way I'd consider normal. But I have the easiest time talking to people just around my age and back through Boomers, even the ones born in the late '40s, i.e. those who might have gone to Woodstock, unlike my parents who were born in the mid-'50s.
Probably the Millennials will still be talking that way into their senior years, judging from the Silent Generation people in the commentaries, etc. For example, Ronee Blakley seemed just as reserved and almost aloof as she did as Nancy's mother in A Nightmare on Elm Street. By contrast Ray Walston, or Mr. Hand from Fast Times, sounded a lot more free-wheeling and rambunctious, even when he was interviewed as an elderly man. I went to check his birth year, and sure enough he was a Greatest Gen member (b. 1914), the early 20th-C. incarnation of what would be Gen X in the latter half (remembering that similar points in the zeitgeist cycle are separated by about 60 years).
Such a bizarre world where the middle-aged have an easier time than the youngsters at just letting it all hang out and shooting the shit with each other. It's yet another way in which we're back to the mid-century. One of the great things about It's a Wonderful Life, as far as understanding the past goes, is that it shows the past of the past, which was the Jazz Age for people of Jimmy Stewart's generation. If his character had met his wife during a high school dance of the 1950s instead of the Roaring Twenties, they wouldn't have grown as close and trusting as they did. People who go through more tumultuous times are simply less likely to take each other for granted.