February 27, 2014

With all the commotion going on in Ukraine, it can be hard to notice how minimal the chaos is in one respect: the fighting is among factions who want to gain control of a strong, united nation, rather than one or more groups trying to break away violently. Turnover in leadership is one way to measure political instability, but an increase in the total number of nations in a region is an even stronger sign.
And it's not just Ukraine. Check out this map of separatist movements within EU nations, and this list of European separatist movements in general. There is strikingly little such agitation in Eastern Europe, and what breakaway groups there are would generally not result in an increase in the number of nations.
Rather, such a group is culturally more similar to the people of a nearby nation than to the nation it currently finds itself in. Perhaps it would like to join that nearby nation, and perhaps it wants small-scale sovereignty. The more likely scenario, I think, is its territory being annexed by the nearby nation: once you assert your common culture with an outside nation, joining them in a Pan-Whoever-We-Are nation comes to seem like the only sensible move.
Call this "reconfiguration" rather than "fragmentation," then. Fragmentation is more when the breakaway group does not particularly identify with other groups nearby, and wants to be left by themselves.
Ukraine actually has one of each type. Carpathian Ruthenia has a separatist movement that would fit naturally with a Central European nation, and if successful it would follow a path of reconfiguration. The Crimean Tatars, on the other hand, have no nearby nation to align themselves with, and if successful they would press for their own sovereignty and increase the number of nations.
Still, these cases are fairly marginal in Eastern Europe, whether measured by numbers of people who would be affected, the percent of the population that would change nations, or the area of land that is under dispute.
Look at Western Europe and the picture is totally different. Consider the southern part of the West: all those regions in Spain (Galicians, Catalans, Basques, and so on) don't want to join up with any other nation, since they consider themselves the furthest extent of Whoever-We-Are. Then move up north to the British Isles: Scotland wants to go it alone, Ireland has already split off on its own, and even the tiny nation of Ireland has itself been split, with Ulster (Northern Ireland) carved off from the rest of the island and still contested.
The example of Northern Ireland serves as a reminder that there is no natural lower bound on how small the splinter group can get. It's not as though Belarus has no such groups because it's already rather small and could not splinter any further after the breakdown of the Soviet Union. It sure could, but something is keeping the Slavic country together in a way that is lacking in the Celtic country.
One explanation for the greater scale of national cohesion in Eastern Europe would be Peter Turchin's model of ethnogenesis, whereby national solidarity is intensified along a meta-ethnic fault-line, where the groups on either side differ starkly in mode of subsistence, language, religion, race, clothing, and so on. Solidarity is strengthened all the more when one of those sides exerts power and influence over the other for a long stretch of time. Over time, that binds the pushed-around group together, while the formerly glorious nation rests on its laurels, becomes divided internally, and grows vulnerable to attack by the now united and motivated group on its periphery.
In this case, Eastern Europe would be glued together by the expansion of the Ottoman Empire during the middle part of the second millennium. True, the Turks are not the terrifying force they used to be, and they've been all but driven out of the Balkans. But national solidarity is slow to fade, as is the power and influence of the expanding nation that is now contracting. And before the Turks, there were the Mongols, and before them all the other waves of Steppe nomads.
At least compared to Western Europe, which has seen no such intense pressure as a whole region since the Roman and Holy Roman Empires, Eastern Europe must have a greater residue of solidarity, albeit at a decaying rate since the Steppe invaders are no longer pressing at the gates.
I think that could work for the parts of the East that were strongly subjected to such outside pressures. But Poland, the Czech Republic, Lithuania, and Finland did not bear the brunt of the Ottoman invasions, yet they are still remarkably well held together, unlike Spain, where Arab and later North African Muslims not only invaded but ruled much of the peninsula until the last of them were driven out in the 15th century.
Something additional is gluing Eastern Europeans together more than Western Europeans.
My hunch is that it's the greater degree of genetic relatedness among Easterners. A recent study looked at how much genetic variation is "identical by descent" among various European groups. That is, they are genetically similar not because they evolved independently yet similarly, but because they descend from a common ancestor population. Since that splitting-off period, they have accumulated their own distinctive genetic variants. But descending from that common group gives them a good deal that's shared too.
Which part of Europe do you think looks the most similar genetically? Yep, it's Eastern Europe, from Poland down into the Balkans, even including Greece (though to a lesser extent). Folks in Spain or Italy show greater genetic distinctiveness within their own countries, to say nothing of across countries. Just because they both speak Romance languages, were both part of the Roman Empire, and enjoy great Mediterranean weather doesn't mean they share much blood with each other. Northwestern Europe isn't quite as patchy as Southern Europe, but it is still not as uniform as Eastern Europe.
What accounts for the greater similarity in the East are the recent, massive migrations of the Slavs (or, if you want to call the source population something more pedantic, imagine I'm writing "Slavs" in quote marks). Those migrations began in the second half of the first millennium, making them more recent than any other mass migration in Europe aside from the Germanic ones of the mid-to-late first millennium. The Germanic groups, however, found people already settled in the lands that they invaded. Unless they drove them out en masse, they merely ruled over the non-Germanic groups without contributing much to the local gene pool. Germanic groups ruled Spain in the middle of the first millennium, for example, but didn't leave more than a drop in the bucket genetically.
The Slavs fanned out toward the north and east of Europe, where large sedentary civilizations were non-existent. We know that the Baltic languages used to be much more widespread in Northeastern Europe, so the Slavs must have displaced / killed their cousins the Balts when they took over what is now Russia and Poland. But the Balts were not a long-standing civilization that was difficult to dislodge, as the patchwork of groups in the Mediterranean was for the Germanic invaders.
Northeastern Europe is also very flat, being part of the Great European Plain, and it's easier to drive people out on flat terrain. It's harder when they command the high ground in more hilly / mountainous areas. That's one thing that stopped the Slavs from totally taking over the Balkans, a largely mountainous region where the Albanians and Greeks are still hanging on, each of them a sui generis branch of the Indo-European family. The effect must have been even stronger in Western Europe, where the Alps, Pyrenees, and other lesser ranges have allowed the pre-Germanic folks to hang around: some kind of Celtic and Roman mixture in the French-speaking areas, and the pre-Indo-European Basques.
In a sense, then, the Russians, Poles, and Bulgarians are not like the Portuguese, Spanish, and Italians, or the Irish, Dutch, and Swedish. It would not be too great of an exaggeration — especially in the comparative context — to speak of Slavs who have settled Russia, Slavs who have settled Poland, and Slavs who have settled Bulgaria.
Pan-Slavism nearly came to fruition just in the past century: the Soviet Union brought together the Western and Eastern groups, and Yugoslavia brought together the Southern groups. For a time, it looked like they might merge, but the plains-dwelling Soviets and the mountain-dwelling Yugoslavs were just a bit too different for that. Bulgaria remained technically outside of either, but true to its lowland geography it was more a satellite of the USSR than of Yugoslavia.
Both of those super-regional nations have broken up within the past 25 years, but don't count them out just yet. The high degree of genetic similarity gives them less of a barrier to clear in considering one another as brother-countries rather than mere neighbor-countries.
Also bear in mind how impossible this has proven in Western Europe, at least since the Holy Roman Empire of the early second millennium. You won't even get the British Isles to unite, forget about Pan-Germanism. The Nazis offered, and nobody took them up on it. Ditto for Southwestern Europe — Italians can't stand being governed along with their countrymen from too far away, and the same is true within Spain. Those two haven't been united under a strong nation since the Romans.
In that episode of Seinfeld where Kramer and Newman are playing Risk, it's not a Spaniard who objects to their game on the subway, saying "I from Spain, you not say Spain is weak." It's a Ukrainian who bellows out his defense of the fatherland.
The only thing that could show up the Slavs would be another group undergoing a mass migration, displacing the locals and swelling in numbers; they would then be even more closely related than the Slavs are. By this point, though, Europe is filled up. The Slavs were the last to enjoy a mass migration into not-so-occupied territory. Garden-variety immigration will not do the trick, since immigrants are drawn from too many sources. It would have to be a small initial group that steamrolled over everyone else and left descendants from its own smallish stock. And the opportunity for that is gone.
The meta-ethnic fault-line is well established as a potential force for gluing a people together into a nation. But genetic relatedness needs to be taken into account as well, especially when the group is not just some small clan of highlanders but nearly half a continent.
February 26, 2014
The anti-self-esteem climate of the 1980s, seen from Cosmopolitan magazine
We can learn a lot about how people's mindsets change by looking at things that are the most pandering. High culture also reflects broader changes in society, but its creators are less concerned with what most people want -- if audiences like it, good, they recognize my genius; if not, well, screw 'em. The more a creator is trying to draw in and hook a target audience, the better they have to know what makes that audience tick. For them, it's not some pointless academic debate -- if they're wrong, they're out of business.
Last year I took a look at how adolescent / young adult female nature has changed since the '80s, using the covers of Seventeen magazine. (Here is the last of three posts, which links to the first two.) What about women who are more grown up, say in their 20s and 30s?
Have a look at this cover of Cosmopolitan from 1986. I didn't search for every issue from that year or the years around it -- I was just poking around Google Images and it came up. But the fact that even one issue could try to hook readers this way shows how different the world used to be. Click to enlarge.
Now, we tend to think of Cosmo as the cheerleading guru for the modern slutty career gal, and yet read the text.
"Man-shortage statistics lie. He's out there if you don't insist he be older and more successful."
AKA, you're no princess yourself, so lower your standards and learn to settle.
"When you're ashamed of what you do with men. Is the pleasure worth the pain?"
Look how far we have devolved -- back in the '80s, the word "shame" in a sexual context could be sincerely printed on the cover of Cosmo, without dismissive scare quotes. It's not preaching fire and brimstone, but trying to reach fallen women on an understanding level.
Also note the assumption that women got pleasure from men, all without having to read inside about 50 TRICKS FOR BETTER SEX -- TONIGHT! Women's bodies were not as numb to human touch as they are these days. I don't think it's because men were better at giving it to them either, else the mags today would advertise "how to get your boyfriend to give it to you better."
"What's right and wrong in a morally muddled world? A guide to behavior in the Eighties."
Notice that they didn't say "how to get through" this topsy-turvy world of ours, in a purely utilitarian tone. The reader wanted to know about the rightness or wrongness of her behavior, and perhaps of others'. That meant she was prepared to hear that what she was doing was wrong.
"It's not easy being green. Overcoming jealousy."
Calling a girl "jealous" these days is the ultimate dismissive insult. "Whatever -- you're just jealous." And here we see the main megaphone of the sexual revolution accusing the audience of being jealous, not to dismiss them but to make them want to improve their character.
Young women back then were prepared to have their self-esteem challenged just by glancing over the cover of Cosmo. No easy ego-stroking about how you treat people, no empty reassurance that 2014 will be your best year ever, and no glib dismissal of anyone who doesn't like you as jealous -- read inside about 5 PAINLESS WAYS TO GET OVER THE HATERS.
One of the benefits of growing up in a rising-crime period is that you don't think being naive is cool. Being sheltered and clueless will make you an easier target for robbers, rapists, serial killers, kidnappers, drug dealers, and cult recruiters. The socially outgoing environment of rising-crime times gives you another set of reasons why being clueless is bad -- if you don't know or care about how others view you, how are you supposed to fit in with a crowd or belong to a community?
Losing your naivete means being prepared to hear harsh things about yourself, albeit in a sympathetic tone, and taking a practical approach to improving -- not wallowing in self-pity like an attention whore.
With falling crime rates and a cocooning orientation toward other people, kids these days think it's better to block out the potentially distressing reality about the moral quality of their behavior, and how others respond to them. After all, what cost will they pay for ignorance? They're locked indoors all day, so criminals can't exploit their cluelessness. They want to interact as little as possible with their peers, so any ostracism would be out of sight, out of mind. And they don't really think that much of their peers anyway -- "I mean, some of them are kind of cool, but not as cool as me, and that's what really matters." So who cares what a bunch of losers thinks anyway?
If you want to teach young people a little humility, you have to shove them out there into an unsupervised peer group, and let them experience ("learn") first-hand that they aren't as supremely awesome as they think they are, and that that doesn't mean the end of the world.
Categories:
Age,
Cocooning,
Crime,
Dudes and dudettes,
Generations,
Media,
Morality,
Psychology
February 25, 2014
Ransoming shoes at a wedding, an ancient Indo-European (and Caucasian) ritual
Update:
Looking into the northern Caucasian connection with Indo-European, I found descriptions of similar wedding traditions among the Circassians and Chechens (NW and NE Caucasus groups, respectively). The bride is held for ransom by her side of the family (no specific item, like a shoe or knife, is taken), and the groom's side must pay to get her out of her house. A variant has the bride's side putting up crude roadblocks and not letting her pass until the groom's side pays the ransom. (I recall reading about this roadblock variant in the Polish tradition, too.) As in the IE groups, this is all on the playful / prankster side of things, not an official bride price. So the tradition must go back to the common cultural ancestor of the northern Caucasians and Indo-Europeans. End of update.
I recently watched a YouTube video about how crazy Punjabi weddings are, one example being the ransoming of the groom's shoes. The bride's side steals his shoes, and the groom's side must get them back by paying the ransom requested by the shoe-nappers. He can't very well leave without his shoes, so paying the ransom is required for the wedding to be completed. As far as I could tell, the shoe-nappers are female, and the ransom-payers are male. The key thing is that it amounts to a playful form of bride price.
During an unrelated YouTube search for knife dances, I learned that Persians have a similar wedding ritual. Only with them, the women on the bride's side steal the knife used for cutting the wedding cake, and the men on the groom's side must pay the ransom to recover it.
That suggested a common origin among Indo-Iranians, hence perhaps as well among Indo-Europeans.
Sure enough, the Armenians have a similar ritual, where the only difference is that it's the bride's shoe that gets stolen. But it is still stolen and held for ransom by the women on the bride's side, and the men on the groom's side must pay the ransom to retrieve it.
The Armenian version is also found in all three major branches of Slavic peoples -- Poles (Western), Russians and Ukrainians (Eastern), and Serbs (Southern). Probably in the other members of each branch as well; I stopped counting once I found a member from each branch.
Among the Balts, Lithuanians have a custom of the wedding table being occupied by a fake wedding party, who require a ransom to give the table over to the real bride and groom. The groom's side (best man) carries out the negotiations with the impostors, though I couldn't find out whether the money goes to the bride's side or to the guests in general.
In Germany and Austria, the bride herself is kidnapped and whisked away to a local pub where she and her bridesmaids and friends drink until the groom finds her, after searching the area. He pays the entire tab for what they have drunk, and she is allowed to return and complete the ceremony.
Something like the German/Austrian kidnapping is found among the Romanians.
The tradition seems weaker in Southern Europe. I couldn't easily find something like this in Italy or Spain. In Portugal, the bride's shoe is passed around to the guests like a collection plate, and they leave money inside it. This does not involve kidnapping or paying a ransom (i.e., making it seem like the wedding cannot conclude until the money is paid). It also does not distinguish between the bride's side and the groom's side -- it's more about the guests giving money to the couple.
It is a bit stronger in Greece, where (at least in Epirus) the bride complains that her shoes are too loose and must be padded with money by the groom, who does so until the bride's side agrees that it's enough. It involves back-and-forth haggling as in other Eastern places.
I also could not find much in the way of ransoming, let alone of shoes, in the Celtic parts of NW Europe or in Scandinavia.
Overall, though, the phenomenon seems pretty widespread, often down to the particular item that's stolen -- shoes. What this means in the greater context of Indo-European culture, I don't know off the top of my head. "We leave this matter as a topic for future research." Still, it's neat to see a ritual being shared among people who trace their genetic and cultural heritage back so far.
Rituals tend to be conservative -- you're supposed to follow the script and not change it. And for rituals that are more common, you get more practice carrying out the script. Common rituals are less prone to error since they're something that "everybody knows" how to do. For rare rituals, you might genuinely forget how they're done and introduce mutations or scrap them altogether. It's hard to think of a more common and frequent communal ritual than weddings, so they ought to be a good place to look for conserved performance forms among people who have common ancestry going way back.
Categories:
Dudes and dudettes,
Economics,
Evolution,
Human Biodiversity,
Mythology
February 24, 2014
Why do gay men get gray hair earlier, if they're Peter Pans?
There's one big apparent counter-example to my theory that the gay syndrome is infantilization, caused by some kind of brain pathogen that slows / halts maturation in childhood (building on Greg Cochran's "gay germ" idea): gay men's health degrades much faster than normal men's. They die earlier, and they have worse prognoses for any given disease.
This is only a problem for the theory if we ignore the obvious -- their earlier onset of degradation is due to their wicked sexual behavior. It is physically corrupting, and their outcomes are proof of that. The root causes are number of partners and filthiness of the act.
Where else do we see evidence of their earlier aging, aside from the natural places to look like lifespan, prognosis for heart disease or diabetes, and so on? Their skin goes to hell a lot sooner than it does for normal men -- they start to look like mummies early on. That can be hard to quantify and find confirmation of in "the literature" or in journalism, though.
The other main sign of aging is de-pigmentation of the hair. For normal men, graying becomes noticeable during their 40s and pronounced during their 50s. Ditto for women -- that is the normal timing for both sexes. During your 30s, you shouldn't have more than the occasional gray hair here or there.
I've noticed a common weird look among queers, where the guy will be in his 30s and already be noticeably gray, or fully gray-to-white by his 40s. Their bodies haven't totally broken down yet, so it creates a very incongruous look that is hard to ignore -- 50-something hair on a 30-something, 60-something hair on a 40-something.
Think of Anderson Cooper, whose hair began graying around age 20, and had further de-pigmented to silvery white before age 40. Dan Savage, another prominent queer, had dark gray hair by about age 40. Closeted actor George Clooney showed clear graying by age 35. Another closeted actor who played one of TV's first mainstream gay characters, Kerr Smith, was fully silver before age 40.
There aren't that many famous gays out there, and the above are only a small sampling of the gray-haired ones. That suggests a huge gap between normal and gay men right there.
To investigate beyond celebrities, I looked for academic articles, which I could not find right away. However, the most prominent gay magazine, The Advocate, ran a cover story on "Silver Foxes" in 2008. Skip past the first page of the drama-queen story of how the author dealt with the discovery of his first gray hairs, and read what the people interviewed have to say. A sampling:
[T]he silver fox also encompasses guys in their 30s, even 20s, who are sexy, in shape, and have some to a lot of gray hair... As one 26-year-old told me, "It adds to their sexiness when guys in their 30s have gray hair..."
Gray hair is "cool now," says Andrew Weir, one of the fashion world's go-to guys for casting advertising campaigns and editorial features. In his work scanning the streets for "real" men and women to model for his clients' projects, he says he's noticed a definite rise in the number of silver foxes. Forty-one years old and graying, Weir is one himself...
It's a trend that New York psychotherapist Brian Lathrop has noticed among his predominantly gay male clientele too. In the last few years, he says, his clients in their 30s and 40s have shown a greater acceptance of turning gray in a way that's markedly different from just a decade ago. "They're not treating it like a symptom of aging," Lathrop says. Instead, they're making it work for them. "It's not hypermasculine, but truly masculine."...
A sales executive who cut his teeth at L'Oréal, Pestorius says his colleagues there encouraged him to dye his hair when the gray came in strong at 28. He colored it every four to six weeks but eventually grew tired of the high-maintenance routine...
So I'm unprepared when Travis Parman, a 35-year-old media relations representative for General Motors in New York, sounds as blasé about his silvery 41-year-old partner, John Davis, a health-care consultant...
But it's Justin Conner -- unabashedly gray since his college days at Vassar -- who really surprises me. As he's only 25, I suspect he's on the leading edge of a full-scale generational upheaval when it comes to men, gray hair, and age...
Jeez, unabashedly gray since college -- getting stuffed up the butt sure takes a toll on the body.
I'll take this article as confirmation of my hunch. They certainly aren't like-minded researchers whose work I'm citing because it supports my worldview. The whole piece is about how nowadays queers are starting to embrace the signs of their advanced aging and degradation -- own that mummy skin, girlfriend! Gray hair before 30? Hey, if you got it, flaunt it! But buried in this puff piece is some good on-the-ground data that you otherwise would not be able to find.
If this view is right, then the tiny number of normal men whose sex behavior resembles that of a queer should also show early graying of the hair. Porno actors are the only place to look, since no normal guy will ever have as many partners, nor dirty himself so much during sex, as a porno dude. Aside from anal sex (where, however, they are never the recipient), they also eat pussy and/or ass in just about every one of their scenes.
A year or so ago I saw something weird -- a porno actor who looked to be a normal 20-something guy, only his hair was entirely gray. "Huh," I thought. "Usually they make these guys take steroids, wax their nuts, take Viagra, etc. All signs of body dysmorphia -- you'd think they'd make a 20-something dye his silver hair a natural color." Keiran Lee, gray-haired before turning 30 (fully clothed picture). Let that be a lesson to any of you perverts who think there's nothing wrong with burying your mouth in a gash or an ass.
We already know that the main predictor of whether a man will get cancer of the oral cavity (second only to smoking) is how many chicks he's gone down on. Add head and neck cancers, too. They've even narrowed the pathogen down to HPV, the same one that causes cervical and genital cancers in women. Guys who hoover up a woman's fluids that contain HPV are in for a rude awakening.
That piece in Nature also notes that researchers are unclear if it's only oral sex that increases the risk of cancer, or if enough French kissing will do the trick too. Recall this earlier post that shows how recent a lot of our weird sexual behavior is, compared to what we did as hunter-gatherers. It wouldn't surprise me if many diseases turn out to be cryptic STDs, including from spit-swapping.
As for gray hair, I can't tell if it's due to a specific pathogen that attacks the cells that give hair its pigmentation, or some part of the brain that talks to those cells, or whether it's just one part of the broader breakdown in the face of so many stressors of the immune system. Or perhaps if the body senses that it is living an incredibly "fast" lifestyle, it withdraws resources from bodily maintenance and puts them into mating effort instead.
Whatever the mechanism turns out to be, it's clear that gray hair can now be added to the wider syndrome of advanced aging in the gay population. Condoning and even encouraging their warped behavior will only worsen their syndrome. But your typical fag-ophile is concerned less with preventing disease among gays, and more with broadcasting their own ideological purity. Being "on the right side of history" means more to them than whether or not they cluelessly egg on another AIDS epidemic.
Categories:
Age,
Dudes and dudettes,
Gays,
Health
Springtime stirring-awake music
Don't know about the rest of the country with the whole polar vortex thing, but it's a warm spring day around here. That calls for some "thank God winter is over / time to switch gears from depressive to manic" music. Just try and sit still during these songs.
New Order, "The Village"
General Public, "Never You Done That"
Big Country, "In a Big Country"
Style Council, "Speak Like a Child"
Madness, "House of Fun"
Categories:
Music
February 23, 2014
The school rock, and youth culture in small towns
One of the strongest signs of the lack of cohesion among young people is the absence of physical markings in public spaces that let everyone know, "This is our place." It is therefore one of the most invisible changes to have taken place over the past 20 or so years.
Couples and friends used to carve their initials or names into wet cement on sidewalks, assorted local kids might carve their identities onto tree trunks, and high school classes left their mark on the school rock out in front of the main building. These are all intensifiers of group cohesion -- signaling that the individual who made the mark identifies with all the others who have left theirs in the same spot. It's as though everyone is signing one great big guest book for the party of their generation.
If you keep your eyes peeled, you can still see those carvings on the sidewalk that read "Rusty + Tina '82." And if there's a walking path through the woods that's been used for long enough, you can still find those trees that are encrusted with signatures from anyone who was a teenager in the '60s, '70s, '80s, and early '90s.
The school rock, however, is more vulnerable to being reclaimed by the school authorities or removed altogether. That's harder to do with trees along a walking path or neighborhood sidewalks. Once young people lose interest in asserting their generational togetherness, the slightest cost or risk will turn them off from it. It's not as though painting on the school rock would carry a prison term. Even something like going to the principal's office and paying a nominal fine to have the rock cleaned is enough to discourage them into thinking "Meh, not worth it." A group that was more tightly glued together would feel emboldened to paint the rock, fine or no fine -- "Like what are they gonna do, throw us in jail? Fat chance, fags!"
I haven't seen a "school rock as student-run media" in decades, way back when I was a grade schooler walking by the high school. But maybe I've been in the wrong parts of the country, so I searched Google Images for "school rock" (minus "school of rock"). In the hundreds of results, I only found four showing an actual school rock, three of which were contemporary. You can search for "school parking lot," "school billboard," "school tree," "school garden," etc. and find tons of pictures. Just not for "school rock."
Not like we can learn an awful lot from a sample size of 3, but we might as well try. The first is from Alpena, Michigan -- population ~10,000, pop. density ~600 per square mile, 97% white, median age 43. The school just received the rock, as a donation; it is not part of an ongoing tradition, though hopefully this will start it up. The next is from Bonney Lake, Washington -- population ~17,000, pop. density ~2,200 per square mile, 89% white, median age 35. It looks like the rock here is used primarily for sports events, and I'm not sure if that's controlled by the students or the parents in the booster club. The last one is from Coupeville, Washington -- population ~1,800, pop. density ~1,500 per square mile, 87% white, median age 51 (damn). In the picture shown, the rock is wishing a student happy 16th birthday, from two of her friends (and clearly done in their own teenage girl hands, not their parents').
The thing they all share is being mostly white. Communal cohesion rapidly degrades when a place becomes more mixed-up.
Age structure doesn't seem to matter, when you'd expect that it would. Could be due to the small sample size here. But I have noticed something similar to that Coupeville, WA situation playing out in the Weirton-Steubenville and Wheeling metro areas in eastern Ohio and northern West Virginia, when I visit the place where my mother's side of the family is from. You can still see graffiti on the overpass proclaiming "Van Halen rules," and somewhere there's a large playground with all the good old dangerous rides still there (though not much in use). We saw a small group of middle school kids hanging out around there smoking in the afternoon. The whole region is so gray-haired and withering economically that perhaps the grown-ups feel they might as well try to give the kids some kind of fun before they leave for greener pastures after high school.
The towns with a school rock are also on the small side, with greater student freedom at smaller population sizes within the sample. The Powers That Be don't really give a shit about what goes on in small towns. They've got their eyes on the prize, something that they could stick the word "flagship" onto. Small populations will also be more variable in their outcomes. If there are only 10 families with kids, it would not be unlikely for 80% of the parents to be easy-going old school parents who let their kids form peer groups. (Although you could equally get just 20% being old school.) But if there are 700 families, the law of large numbers takes over, and that place will look like every other one that's been ruined by the helicopter parent majority.
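To put rough numbers on that law-of-large-numbers intuition, here is a minimal sketch in Python. The 50/50 base rate of old school parents is my own illustrative assumption, not a figure from anywhere; the point is only how the odds of an 80%-old-school town collapse as the town grows.

```python
from math import comb

def prob_share_at_least(n, k, p=0.5):
    """P(at least k of n families are old school), each independently with prob p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# 10 families: an 80%-old-school town is unusual but hardly shocking
print(prob_share_at_least(10, 8))    # ~0.055, about 1 in 18
# 700 families: the same 80% share is effectively impossible
print(prob_share_at_least(700, 560)) # ~1e-60 -- effectively zero
```

Even granting a different base rate, the contrast between the 10-family and the 700-family town stays just as lopsided: small samples swing to extremes, large ones hug the average.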
Going back in time, as far as I can tell, student ownership of the school rock was not a central feature of youth culture during the cocooning Mid-century. In fact, the other result from Google Images shows a newspaper photo from 1967 -- which in turn shows a school rock introduced by the class of 1917, relating the story of the rowdy rivalry between the class of '17 and the class of '18 over who could have their class year carved onto the face. (The originators prevailed.) That was back during the Ragtime and Jazz Age, when young people were allowed to form unsupervised peer groups. Hard to imagine the school rock rivalry being commonplace in the era of Leave It to Beaver. But a look at old high school yearbooks could shed some light here.
I'd thought of the "school rock" phenomenon a while back, but was reminded to do something about it while reading this recollection of its role in student life during the age of Dazed and Confused. It comes from the author of a graphic novel / memoir about going to high school with Jeffrey Dahmer, before he became completely unhinged but when there were clear early signs that something was not-quite-right about the guy.
When the news of Dahmer's crimes first exploded in July 1991, I drove down to my folks' house to rifle through the boxes of high school material stored in the basement, as art for stories in the Akron Beacon Journal, where both my wife and I worked at the time. My parents live very close to the high school and I passed it on the way. As I did, I observed that some smartass had already painted "Class of '78- Dahmer" on [the] rock, mere hours after Dahmer had been arrested! The first of many surreal moments. I stopped and snapped the photo below.
Check out how many colors there were by the early '90s. There also appear to be drawings in addition to text. This picture could have been snapped of the school rock that I used to walk by in my own neighborhood around this time, though it was a little more subdued. That was Upper Arlington, Ohio, a suburb of Columbus. Backderf and Dahmer went to school closer to de-industrializing Akron. The part of Ohio I mentioned before is on the eastern border with WV.
So perhaps the thing about small towns applies at a larger level -- entire regions that no one cares about will allow more of a thriving youth culture (allow, not necessarily encourage). The control freaks are going to head over to where there's something worth controlling, like Manhattan or San Francisco. It's striking how little those two cities contributed to popular music, especially on a per capita basis. Akron, OH circa 1980 had around 240,000 people and was already in steady decline, yet produced Chrissie Hynde of The Pretenders, The Waitresses ("I Know What Boys Like"), and Devo. Nearby rust belt city Cleveland produced The Raspberries, Dazz Band, and Nine Inch Nails.
I hope to cover more of the mood of the Seventies, as conveyed by Backderf in My Friend Dahmer, since that period doesn't get much treatment in the history of youth culture. Political and economic trends, yes; movies directed by an auteur, yes; but not for what was going on with the kids these days.
Categories:
Architecture,
Books,
Cocooning,
Crime,
Education,
Generations,
Pop culture,
Psychology
February 22, 2014
The exotic Near East in music videos
Once again Katy Perry is letting her inner transvestite get her flame on in the video for "Dark Horse," a song whose drowsiness hardly brings Ancient Egyptian grandeur to mind.
She's trying to hit two retro themes at the same time: the Mid-century (Cleopatra 1963), and the '90s (mostly Michael Jackson's "Remember the Time" video, but also Stargate and maybe Aladdin). The result is an even mix of the two -- mid-century bombast and Nineties wacky zaniness.
It can be hard to remember how widespread the fascination used to be with the pastoralist belt stretching from Northern Africa through South Asia. The Exorcist opens in Iraq, the desert landscapes in Star Wars were shot in Tunisia, and of course all of the images of Egypt (whether shot there or nearby) -- Death on the Nile, Raiders of the Lost Ark, The Jewel of the Nile, and so on. The second Indiana Jones movie was shot in Sri Lanka (meant to be India), and so were the videos for "Hungry Like the Wolf" and "Save a Prayer" by Duran Duran.
Back in the '70s and '80s, they just let the place be what it is -- tempting and thrilling, but also disorienting and dangerous. They didn't paint it only one way or the other -- either a placid vacation spot, or else a catastrophe waiting to happen. They conveyed the dual nature of the place, somehow realistic and fantastic at the same time.
Music videos didn't have big budgets like feature films did, but there were still a handful that were shot on location in the Near East. Below: "Living on the Ceiling" by Blancmange in Cairo, Egypt; and "Dominion" by Sisters of Mercy in Petra, Jordan (where the third Indiana Jones movie would be filmed).
More low-budget, though in the same spirit, is "Arabian Knights" by Siouxsie and the Banshees. In the video for "Night Boat to Cairo" by Madness, the music makes up for the low-budget visuals in setting up a Near Eastern theme. And it's too bad "Rock the Casbah" wasn't shot on location. (Film boards didn't want them to degenerate the faithful.)
Notice that it wasn't the Top 40 pop stars who wanted to evoke exotic places. It was the groups somewhat outside the mainstream who were keen on style. Now it's the mainstream that boasts a bombastic style, while the groups outside have cultivated an anti-style. Having style used to be cool.
By the mid-'90s, the Middle East only showed up in music videos as an exotic spa getaway kind of place. The harem secluded from all men, save the eunuchs -- that's become women's ideal in a cocooning age. See "My Love Is for Real" by Paula Abdul and "The Woman in Me" by Shania Twain. And then it was gone altogether, until "Dark Horse."
What's going on with the timing of greater or lesser interest in the Near East as an exotic, stylish place?
Egyptian Revival, as a stylistic phenomenon, only crops up during the later part of a rising-crime period (the early 1800s, the 1920s, and the 1980s). It's part of a broader interest in the exotic and Sublime. I'm not sure whether it's the rising crime rate itself or people's more extraverted mindset that makes them crave the excitement of strange new peoples, places, and things.
Falling-crime / cocooning periods, when they do show an interest in The Other, tend to take a distant, clinical, or ethnographic approach, and present it in a one-way-or-the-other fashion, rather than showing its dual nature. Cocooners don't mind a nice peaceful vacation, after all. But if there's the slightest threat of violence, it's overblown to the point of convincing them that they'd better stay home instead. Do not approach unless 100% harmless.
Categories:
Design,
Human Biodiversity,
Movies,
Music,
Pop culture,
Psychology
February 21, 2014
"Honestly," "literally," etc. as "end of discussion" markers
Earlier I took a look at the large and growing class of slang words that make it sound like you think everyone's going to call you a liar unless you explicitly tell them you're not. Honestly, literally, seriously, actually, I'm not gonna lie, and so on.
My interpretation was that these words have sprung up in response to the growing social and emotional distance among young people. If the listener doesn't believe the speaker to be trustworthy, then the speaker will have to make these elaborate displays of not being the boy who cried "wolf."
I still get that vibe when I hear young people talking like that. It's like "Hey, I know I don't normally interact with other people, and that you tend not to trust people who never open up. But I'm being honest here, and don't dismiss what I'm saying just because I rarely open up."
The more I hear these things, though, the more it sounds like there's something beyond an earnest appeal for the listener to give the speaker the benefit of the doubt. It's that, plus a demand to not judge or talk back. It's an attempt to shut down any possible argument before it gets started. It comes through in the aggressive, insistent tone that people use.
"OK, honestly, nobody's going to want to marry me."
The girl who says this is also saying, "...and don't try to tell me otherwise." End of discussion.
"I am literally going to freak the eff out."
...so don't try to talk me into a more calm mindset. I will be freaking out, nothing can be done about it, so just stay out of my way.
Then why bother making it seem like you're opening up and initiating or furthering along a conversation, when you just want to shut it down after you've said your little piece? It's like their pseudo-slutty way of dressing and acting. "Omigosh, just because I'm wearing butt-sculpting tights and a cleavage-baring top doesn't mean I want to have to, ick, deal with boys."
They only seem to be interested in interacting with their peers. They just want to have their little message or signal recognized, and that's it. Just move on once their ego has been validated from getting a pellet of recognition.
Somebody here pointed out (in a comment that I cannot find) a similar thing about "So," especially as the first word after someone else has been talking. It cuts them off, ends their line of thought, and dismisses it all as trivial. It's like, "So -- now that you're done blabbing, back to me and my piece." If it's an argument, every micro-clause will begin with "so," as in "ergo," even if there's no logical progression. It's like, "Hey, I began with 'so,' hence you must accept it."
Or they put "again" at the beginning of every clause, as though they've already established or proven the clause as fact before. Or were you not paying attention? / did you not get it the first time? / is your memory so poor that you've already forgotten and need reminding?
"Honestly," "literally," etc. fall into this broader category of discussion-enders. They contrast the world of maybe / maybe-not with the world of established fact. Not hypothetically -- actually. Not figuratively -- literally. Not jokingly -- honestly. We're not in the world of make-believe, so you must accept everything I've said and can't argue back. It's not as though this were some kind of hypothetical debate, imaginative storytelling, or jesting and joshing.
Under this reading, the trend is part of the intensification of status contests. Every listener is a potential opponent and must be pre-emptively shut down. If they insist on flapping their gums, they must be immediately dismissed when you get the chance to butt in with "So." I don't think there was anything like this in the '80s, though. "For real" from the '90s is about as early as it goes.
Categories:
Language,
Pop culture,
Psychology
February 20, 2014
Slavic gynocracy gets whipped in the Caucasus
In case you haven't heard, the biggest news event of the 2014 Winter Olympics is that some band of Russian pseudo-skanks protested the government and got lashed with horse whips by Cossacks. "Look at us, our band's name has the word PUSSY in it!!!"
In the Russian culture of law, the members of Pussy Riot are formally charged with violating some rule, are found guilty, and are given a prison sentence. Over in the Caucasian culture of honor, homey don't play dat. The Cossacks owe much of their character to the general way things are done in the Caucasus.
If a woman shames herself, let alone in public, she isn't simply running afoul of some silly little law -- she's polluting the public sphere with shame. And she won't get formally charged because it needs to be stopped right now. Just knock them around a little bit, and they'll stop it. Problem solved -- no need to waste time, money, and energy to shut off the stream of pollution.
It's like if some drunken SOB is fucking around with people in a bar or club -- just team up on him, knock him down, and good-bye problem. Don't bother calling the cops.
Fortunately, the topless temperance activists of the Ukrainian group FEMEN have not bothered to protest the Olympics. I guess they felt they made a big enough splash last fall when two of them crashed a runway show, shrieking about how the fashion industry exploits women's bodies -- while running around with their tits out.
They explain that they must appear topless because otherwise no one would pay attention to their anti-patriarchal message. Typical of the rationalization of a culture of law. Who cares if you degrade and dishonor yourself in the process of broadcasting your message? "The ends justify the means" comes from the focus on contracts and laws, which are worked out in order to attain the explicit goals of us mere mortals. Only a culture of honor puts up barriers against human fulfillment if our mundane goal-seeking would disturb the supernatural order.
Pussy Riot and FEMEN are just two highly visible manifestations of Slavic gynocracy, akin to East Asian gynocracy ("tiger mothers," "dragon ladies"). An earlier post looked at the less spectacular but far broader phenomenon of mail-order brides in such cultures (i.e., intensive agriculturalists rather than (agro-)pastoralists). "Hey, whatever a woman's gotta do to make ends meet." Where else do you hear that excuse?
Categories:
Crime,
Dudes and dudettes,
Human Biodiversity,
Morality,
Politics,
Psychology,
Sports
February 19, 2014
Biblical epic movies, their resurr.... their return
In many ways the cultural mood these days feels like the 1950s. Right on schedule, then, we're about to see three major Biblical epic movies released this year: Exodus, Noah, and Son of God.
The last time the genre was a serious contender was the end of the Mid-century. Biblical epics topped the annual box office revenues for 1951, '53, '56, and '59: Quo Vadis, The Robe, The Ten Commandments, and Ben-Hur. Already by '61, Exodus could only reach as high as the #3 movie, and the genre would continue to fall from there. The #1 movie of 1966 was The Bible: In the Beginning, but that was it. After that, the New Hollywood style took over, and Mid-century epics were out.
I'm not sure whether the popularity of religious epics reflects the cocooning or the falling-crime trends of the Mid-century and Millennial eras. For some reason, it only emerges in the final stretch of such a period, though -- the genre came from out of nowhere circa 1950, and all of a sudden the genre is catching on again. (The Passion of the Christ was an outlier from 10 years ago.)
Are audiences in falling-crime times better prepared for movies about religions of peace? Hardly. The Ten Commandments and Exodus were Old Testament movies, and so will be Noah -- "real wrath-of-God type stuff." And the Christian movies take place against the backdrop of violent conflict with the Roman Empire, rather than centering on Jesus winning over the crowds to seek redemption for their sins, the Apostles spreading the Good Word to the people, and so on.
It looks instead like it's linked to cocooning, in particular the need for people whose daily social and cultural lives are so uneventful to really get knocked over when they go out to the movies. The movie industry was so worried about how tame the Mid-century audiences were that they decided to go in the opposite direction and offer them something they couldn't get at home -- a panoramic aspect ratio, 3-D effects, other sensory gimmicks (vibrating seats), color, an experience that lasted for over 3 hours, stories that were epic and grand, and so on.
I covered some of these themes before. Cocooning audiences crave narratives where the stakes are Earth-shattering, as well as really really long. Biblical epics will also satisfy another preference of cocooners -- adaptations and sequels.
Earlier, I found it puzzling that in the Mid-century these things went together in the form of Biblical epics, whereas now they go together in less religious works, like the Lord of the Rings movies. Well, that puzzle is resolving itself, as the Bible is back.
Why the Bible? Why not some other source material for epic storytelling and grandiose stage dressing? Falling-crime and cocooning periods are not very friendly toward religious fervor, after all. It's rising crime that makes people more desperate to search for answers, and it's an outgoing social orientation that makes them want to regularly meet up with each other and synch up on the same emotional wavelength.
Perhaps folks are sensing how meaningless and disconnected life is becoming, and rather than pick any old epic story, go for something that might lead to ultimate meaning and belonging. This would set the stage, as it were, for the dramatic fervor that ignites whenever the crime rate starts shooting up. Whether you like them as movies or not, at least they're playing a welcome social-cultural role -- preparing the way for a real consciousness-raising movement just down the line.
It's striking how peripheral the Biblical epic was during the '20s and the '80s, both periods of intense religious fervor. I guess that when people have enough religion in real life, they don't need any further stimulation of that lobe of their brain when they sit down in the movie theater. Humility and atonement are for Sunday -- let's just laugh our asses off on Friday or Saturday.
The Last Temptation of Christ is not a Biblical epic. It's an intellectual and philosophical movie that happens to have a Biblical setting and characters. The only big Bible movie from the late '60s through the early '90s was not even a theatrical release, but the TV miniseries Jesus of Nazareth. That's been the only Bible movie I've ever responded to (both as a child and when I watched it again a couple years ago).
It just seems like the creators truly wanted to make a movie about the Jesus story, not just because they felt it would be the optimal strategy to get the butts in the seats (it was on TV, remember). It focuses on sin, atonement, redemption, salvation... y'know, the reason you'd choose to go to church. Not to see the Roman team whoop ass on the Christian team, and the Christian team score some underdog points against the Roman team. Jesus of Nazareth has more of a spiritual than a political-historical focus.
If there was a lull of Biblical movies during that time, what took its place? It's not as though audiences didn't want to see group vs. group conflict, quasi-historical tales, all with supernatural forces involved. Oddly enough, it was the pagan movie that flourished when folks were becoming more religious. That's a whole 'nother topic, though, and I'll try to get around to it sometime soon.
Categories:
Cocooning,
Crime,
Movies,
Mythology,
Psychology,
Religion,
Television
February 18, 2014
Bret Easton Ellis on Generation Wuss
From a Vice interview:
You have to understand that I’m coming to these things as a member of the most pessimistic and ironic generation that has ever roamed the earth. When I hear millennials getting hurt by "cyber bullying", or it being a gateway to suicide, it’s difficult for me to process. A little less so for my boyfriend, who happens to be a millennial of that age, but even he somewhat agrees with the sensitivity of Generation Wuss. It’s very difficult for them to take criticism, and because of that a lot of the content produced is kind of shitty. And when someone is criticised for their content, they seem to collapse, or the person criticising them is called a hater, a contrarian, a troll.
In a way it’s down to the generation that raised them, who cocooned them in praise – four stars for showing up, you know? But eventually everyone has to hit the dark side of life; someone doesn’t like you, someone doesn’t like your work, someone doesn’t love you back… people die. What we have is a generation who are super-confident and super-positive about things, but when the least bit of darkness enters their lives, they’re paralysed.
It's time we start lumping in the helicopter parents with other special interest groups whose main goal is to shield their Potential Victim Group from all criticism.
The multi-culti crybabies say we're not allowed to notice anything bad about a culture or race that we don't belong to. Feminists whine about men pointing out anything bad about female nature. And upon hearing a single slur, the homophiles are already preparing their "I love my dead gay son" speech to berate the bigots.
With the family values revolution of the past 20-some years, "my children" have become yet another sacred victim group. It's not children in general, a la "save the children" or "believe the children" from the 1980s. That came from do-gooders during a wave of child abuse, who may or may not have had small children of their own. Today's "praise the children" movement comes from parents themselves, seeking direct benefits and praise for themselves -- "Who raised an honor student? THIS GUY!"
"Tell me how awesome my kid is."
Well, what if your kid is a wuss? Or a brat? Or a social retard? Or anything else that pollutes the public sphere? Your little dear isn't like those spazzy doggies that can be kept on your side of the fence. Pretty soon these annoying little shits are going to be spazzing out all over the neighborhood.
I remember when strangers would shoot you a stinging glare if you were acting up in public, when they might even come over and pinch your ear, scolding you to "Listen to your mother!" I also remember when you could make "ching-chong" jokes, dismiss female hysteria as "just PMS-ing," and sling the word "faggot" without being prosecuted for a hate crime. Do that these days, though, and you're treated like you pulled a gun on them, or like you raped their self-esteem. Nobody wants to be regarded as a thug or a rapist, even by strangers they'll never meet again, so basic norm enforcement has vanished from public places (anywhere outside of the nuclear household, including schools).
As "my children" age into their late 20s, we're finding out that the Millennial self-esteem bubble must be maintained indefinitely -- not just to save them from the occasional scraped knee as children. Parental interference is de rigueur at college, and grade inflation rampant. Anyone who's had to read college students' essays these days knows what Ellis means when he says you can expect shitty quality once the group has been declared off-limits from all criticism.
It's striking how absent the criticism of the anti-criticism movement has been, even while so-called conservatives have been busy thundering about the politically correct dumbing down of the culture. Maya Angelou alongside John Keats in English class? Dumbing down. Frida Kahlo studied as seriously as Rembrandt? Dumbing down. Everybody gets a trophy day, "Awesome job, buddy!" -- ??? Uh, well, you know how you have to handle the sensitive mind of a developing child. A child who's 25 years old? Yeah well... what are you trying to say about my parenting, huh asshole? Just like the other full-time defenders of Potential Victims, parents these days enjoy making a noisy display about how butt-hurt they are when you offend them.
While we're at it, why do so few of the anti-"dumbing down" crowd champion the balls-to-the-wall kind of classics? Somebody whose name demands the emphasis of "fucking" -- Christopher fucking Marlowe, Cara-fucking-vaggio. This is another sign that many in the anti-PC group are overly delicate themselves.
I know the strategy of framing helicopter parents as a variant of race hustlers would not fly with the liberal half of the country, but I think it would find enough traction in the conservative half. Or at least serve to separate the shameless from the integrity-minded ones.
Categories:
Art,
Generations,
Literature,
Over-parenting,
Politics,
Pop culture,
Psychology
Links among Indo-Europeans, Caucasus peoples, and Native Americans: Genes, language, and mythology
Greg Cochran has written up some ideas about the origin and spread of the Indo-Europeans, based on new genetic data about the population waves that swept over Europe.
Way back when, there were only western hunter-gatherers (WHG). These were replaced by farmers from the Levant (EEF: Early European Farmers). Later came a third group, who are most likely the group we know as the Indo-Europeans. They were themselves a mix of two more basic genepools, the WHG and the Ancestral North Eurasians (ANE) -- a group that roamed around Central Asia / Siberia, and contributed genes to the group that would eventually leave Asia and colonize the Americas.
Greg guesses that the Indo-European group likely came from the northern or northeastern Caucasus region, as folks there have the highest signal from that ANE genepool. That region has already been considered a likely homeland of the Indo-Europeans based on linguistic evidence.
So perhaps the Chechens and the Irish are more similar than we think, and not by accident?
I left several comments, which I'll copy and paste below in case you don't want to read through a long comment thread. They all pursue the approach of seeing genes, languages, myths, and visual icons as pieces in a larger population bundle. When a group is spreading its genes by moving into a region and displacing the locals, they also set down their language, myths, and icons. To a decent extent, anyway, and there's no way to know how far in any specific case without being curious and looking.
Key point: it's not only by one group adopting another group's language, myths, and icons that those could be spread.
* * * * *
I wonder if the ANE are the genetic link to the proposed Dene-Caucasian superfamily of languages. That would bring together the Na-Dene languages of the Americas, Yeniseian in Central Siberia, and the Northern and Northeastern Caucasian languages, including Chechen (but not the Southern ones like Georgian).
Basque is also thrown into this superfamily for reasons I don’t understand. Here’s another case where genetics could help to rule on a dispute in linguistic classification. If the Dene-Caucasian languages are closely related to the ANE genepool, and Basques are more or less out of that genepool, don’t bother trying to shoe-horn Basque in with Na-Dene, Yeniseian, and N/NE Caucasian languages.
* * * * *
Iconography should be looked into as well. Like the swastika — not as just another cool geometric motif, but as something with greater meaning and sacred symbolism. Was its veneration due to the ANE?
It goes way back among the Indo-Europeans, who via Buddhism introduced it into East / SE Asia.
It’s widespread among the Nakh peoples (N/NE Caucasians, including Chechens). I didn’t know before following the genetics-inspired hunch, but they’ve got swastikas all over their oldest monuments, sculptures, grave markers, and so on. And importantly, as a higher symbol, representing purification. Under the Wiki article for Vainakh mythology, the Nakh national ornament is shown to be a variant on the swastika, mixed with a four-leaf clover and thus clearly linking them to the Celts? (Only half-kidding…)
The swastika is also central among some Native American groups, particularly the Navajo, who speak a Na-Dene language. It appears among the Hopi nation as well, Uto-Aztecan speakers who live next door to the Navajo.
If the N/NE Caucasus provided the seed for the Indo-Europeans, then perhaps the swastika was part of the package. And if the Caucasians inherited it from the ANE, that would explain its separate central role among the Navajo, who are also descended from the ANE (and maybe from there to their Hopi neighbors).
That would help to explain its otherwise puzzling and patchy global distribution.
* * * * *
Indo-European mythologists tend to gloss over Greek mythology when trying to reconstruct the Proto-mythology because it has too many elements from the Levant, i.e. the EEF heritage (such as Adonis). Of European mythologies, they lean most heavily on Celtic, Norse, and Slavic.
If myth diffusion was piggy-backing on genetic diffusion, that would explain why the mythologies of Ireland and Norway bear a closer resemblance to the Vedas than does Greek mythology. The ANE genetic signal is lower in Southern Europe.
Roman is an odd case — Indo-Europeanists lean on that too, but only somewhat. A lot is pilfered from Greece. But then there’s the founding of the folk by one twin who kills the other, the veneration of wolves, and the fact that the supreme god’s name is Ju Piter — “Father Sky” — instead of the plainer Zeus, which doesn’t include the word for “father.” In that way, it’s like Dyaus Pita in the Vedic pantheon, both from the Proto-god Dyeus.
Roman myth looks more Celto-Germanic than Greco-whatever. And Italo-Celtic is another one of those not-too-controversial groupings within the IE tree. Sticking with genes and myths as a bundle, it makes you wonder if the proto-Romans didn’t wander into Italy from a Celto-Germanic region. It wasn’t from the east, since that was the Greeks and Illyrians / Albanians.
* * * * *
Someone also needs to look into the links among the myths of the Indo-Europeans and the N/NE Caucasus groups, and try to tease out who originated what and who borrowed what from whom.
“Vainakh mythology” on Wikipedia says that Amjad Jaimoukha has noticed lots of parallels between the N/NE Caucasus myths and Celtic myths, but that the idea is not being widely discussed. Time to start on that.
John Colarusso has a book on the Nart Sagas of the Ossetians, who speak an I-E language in the Caucasus. Despite a lot of the material being I-E, there are fossils of older Caucasian myth in there, though I haven’t read the book in order to report what they are. Some of those may be seeds that left the P-I-E homeland. Who knows, some of the Ossetian / Sarmatian / Iranian myths that later entered the Caucasus might be returning home!
February 17, 2014
Irreverence toward authority in music videos
The post below on the decline of subversive office culture reminded me how widespread the attitude of irreverence was in the '80s. It wasn't defiance or hostility -- it was that plus the feel-good vibe that was everywhere back then. Carefree defiance.
The best way to see this is to look for it where you wouldn't expect. Like, it wouldn't shock you to see defiance or mockery in a heavy metal video. But how about feel-good dance-pop?
Here's the video for "Who's Johnny" by El DeBarge (who you might remember from "Rhythm of the Night").
Casual disrespect is shown toward the judge, lawyers, and law enforcement officers, in the courtroom no less. But it also strikes a humorous tone and gets them involved in the shenanigans -- that feel-good vibe again, not straight-up hostility toward authority.
The song is from the movie Short Circuit, where well-meaning defiance of authority is a central theme. (A robot designed for warfare by the US military wanders off base and befriends a woman. She tries to foil the military's plans to disassemble it, convinced that it is sentient. And the robot's designers want to give it less belligerent goals, against the Cold War aims of the brass.)
Scenes like the ones in the video could not catch on with audiences today. Most people worship authority figures -- just look at all the TV shows at the top of the ratings that feature sober and sympathetic depictions of judges, lawyers, courts, and police departments. Compare to the '80s hit sit-com Night Court, which was irreverent.
The minority of youngish people who aren't so hot for that stuff are more likely to have a bratty hostile attitude. "Fuck the police!" While never saying that to one of their faces...
I remember that during the turning point around the mid-'90s, we teenagers were still emboldened enough to say things like that when the cops were within earshot. "Hey guys, [sniff sniff], do you smell bacon around here?" "I sure could go for a DONUT!" Being a smartass requires you to be somewhat confrontational in real life, not just whine about it in your room or on the internet.
Things were turning more contemptuous and hostile at that point, though. Eighties irreverence was already shading into Nineties smartass, though not yet at 21st century emo brattiness.
I wouldn't expect to see mainstream irreverence for a long while -- it seems to mature during the end of a rising-crime period, by which time the authorities have proven themselves incapable of doing their most important job, i.e. halting the rising crime rate. So don't pay them any mind. It took a while for this attitude to come out during the early 20th-century crime wave. More during the Jazz Age, when the homicide rate was nearing its peak, and not so much in the first part of the wave, circa 1900 to WWI. It's more part of the Twenties.
Offbeat color combinations will be another way to recognize this attitude when it eventually comes back. Good-natured flouting of the rules that fashion authorities set for us. A staple of the look of the '80s, and of the '20s as well.
Categories:
Crime,
Design,
Generations,
Music,
Pop culture,
Psychology
No more subversive office culture
Recently I was looking through the ceramic mugs at the thrift store and found this one for 50 cents, which would cost between $10 and $20 on a "vintage" site like Etsy or even eBay:
To err is human...
To really screw things up
You need a computer!
He doesn't just have a retarded look on his face; his head isn't even facing the computer while his twitchy fingers hammer at the keyboard. It's like someone hired the class clown for a respectable job. Here are some others from the same time and the same company (Russ Berrie, who oddly enough invented those Troll dolls for children):
"I'm expecting a great idea any minute now" (says a dude sitting on the can).
"I'm only here 'til I win the lottery" (says the guy with a messy desk).
"I love every inch of you... Some more than others!" (says the workplace bimbo).
Satirizing the seriousness of the workplace was a key part of adult life in the 1980s, along with the "xerox lore" posted here and there around the office -- spoofed memos, black humor cartoons, and the like. These were not internal thoughts or jokes around the water cooler, but pieces that were displayed openly around the office itself.
Alan Dundes and Carl Pagter published a series of books that collected a staggering and hilarious variety of "xerox lore." That was back when such things were popular, though. The first book came out in 1975 (and included items from somewhat earlier), and the last one in 2000. I thought the ones in this edition from 1987 were the funniest.
Technological changes don't account for its demise (email, the internet, etc.) -- folks these days are just a lot more conformist and authority-worshiping than they were in the irreverent Eighties. An office worker back then would not have been content just to pass along a spoof in private, however that was done before email (passing notes, for grown-ups). They wanted it to be out in the open, to signal to everyone else that they weren't alone in their thinking, that it was OK for you to feel that way, and you should feel comfortable putting up your own defiant display. (A la the famous Solomon Asch conformity experiments.)
The coffee mugs of the '80s and early '90s were another prong of the overall attack on pretentious office culture. And they sure didn't vanish because of technological changes. People still have mugs around the office today, but they don't have text and images that take potshots at office life.
I noticed from reading through Dundes and Pagter's compilations that xerox lore started to move away from satirizing the office and bosses during the '90s. It moved more toward the themes of "people in general piss me off," or "my team is sensible and awesome, and the other team are clueless idiots." The best example of this, outside folk culture, was the comic strip Dilbert, where an egghead complains meekly that the suits don't appreciate his awesomeness. Really stickin' it to 'em...
When people enter a cocooning phase, they develop an auto-immune disorder where instead of attacking invasive parasites, they turn on the organ or body that they belong to, lacking a strong sense of what is self vs. not-self. When people interact more frequently with others (especially strangers), they develop a stronger sense of self vs. not-self, and the immune system functions well. Office workers asserted their dignity and solidarity with one another, and kept management and the workplace in general from stifling them on a day-to-day level.
What they were doing to their salaries, job security, etc. is another story, although that too was not as bad back then as it is now. I think management felt somewhat held in check by the widespread disregard for workplace authority in the '80s, and that they didn't really start hosing their workers until the '90s and after, when the cocooning trend disconnected the workers and made them want to just keep to themselves. A divided workforce is an easily governed workforce. That goes beyond racial or ethnic diversity, as all-white parts of the country are also divided, atomized, and of the "don't rock the boat" mindset. Union-busting was not a factor either, as most workers were already not unionized. It was simply the cocooning behavior closing each of the workers off from the others.
That may have played a role in keeping labor agitation to a minimum during the Gilded Age or Victorian Era, another falling-crime / cocooning period, though also one where inequality was widening and the labor movement was trying to take off but could not. Once the crime rate and outgoing behavior started rising around the turn of the century, it became much more organized and serious. Class conflict peaked around WWI, after which both sides made concessions -- no violent agitation from workers, and better treatment and conditions from the elite. (And an end to immigration, whereby the soaring supply of labor had lowered wages.)
It's a good thing it ended when it did, since people only got more outgoing and rambunctious during the Roaring Twenties. They could have really turned up the heat by the start of the Great Depression, which was the peak of the extraverted Jazz Age culture.
I wouldn't expect to see much of a dent in the micro-managing norms of today's workplace until people leave their cocoons and start interacting with one another again. It takes a team effort to protect your dignity against managerial manipulations.
To err is human...
To really screw things up
You need a computer!
He doesn't just have a retarded look on his face; his head isn't even facing the computer while his twitchy fingers hammer at the keyboard. It's like someone hired the class clown for a respectable job. Here are some others from the same time and the same company (Russ Berrie, who, oddly enough, also popularized those Troll dolls for children):
"I'm expecting a great idea any minute now" (says a dude sitting on the can).
"I'm only here 'til I win the lottery" (says the guy with a messy desk).
"I love every inch of you... Some more than others!" (says the workplace bimbo).
Satirizing the seriousness of the workplace was a key part of adult life in the 1980s, along with the "xerox lore" posted here and there around the office -- spoofed memos, black humor cartoons, and the like. These were not internal thoughts or jokes around the water cooler, but pieces that were displayed openly around the office itself.
Alan Dundes and Carl Pagter published a series of books that collected a staggering and hilarious variety of "xerox lore," back when such things were still popular. The first book came out in 1975 (and included items from somewhat earlier), and the last in 2000. I thought the ones in this edition from 1987 were the funniest.
Technological changes (email, the internet, etc.) don't account for its demise -- folks these days are just a lot more conformist and authority-worshiping than they were in the irreverent Eighties. Office workers back then would not have contented themselves with passing a spoof along in private, however that was done before email (passing notes, for grown-ups). They wanted it out in the open, to signal to everyone else that they weren't alone in their thinking, that it was OK to feel that way, and that you should feel comfortable putting up your own defiant display. (À la the famous Solomon Asch conformity experiments.)
The coffee mugs of the '80s and early '90s were another prong of the overall attack on pretentious office culture. And they sure didn't vanish because of technological changes. People still have mugs around the office today, but they don't have text and images that take potshots at office life.
I noticed from reading through Dundes and Pagter's compilations that xerox lore started to move away from satirizing the office and bosses during the '90s. It moved more toward the themes of "people in general piss me off," or "my team is sensible and awesome, and the other team are clueless idiots." The best example of this, outside folk culture, was the comic strip Dilbert, where an egghead complains meekly that the suits don't appreciate his awesomeness. Really stickin' it to 'em...
When people enter a cocooning phase, the group develops something like an auto-immune disorder: lacking a strong sense of what is self vs. not-self, its members turn on the body they belong to instead of attacking invasive parasites. When people interact more frequently with others (especially strangers), that sense of self vs. not-self sharpens, and the immune system functions well. In the outgoing '80s, office workers asserted their dignity and solidarity with one another, and kept management and the workplace in general from stifling them on a day-to-day level.
What was happening to their salaries, job security, etc. is another story, although that too was not as bad back then as it is now. I think management felt somewhat held in check by the widespread disregard for workplace authority in the '80s, and that they didn't really start hosing their workers until the '90s and after, when the cocooning trend disconnected the workers and made them want to just keep to themselves. A divided workforce is an easily governed workforce. That goes beyond racial or ethnic diversity, as all-white parts of the country are also divided, atomized, and of the "don't rock the boat" mindset. Union-busting was not a factor either, as most workers were already not unionized. It was simply cocooning behavior closing each of the workers off from the others.
Cocooning may also have played a role in keeping labor agitation to a minimum during the Gilded Age / Victorian Era, another falling-crime, cocooning period, though also one where inequality was widening and the labor movement was trying to take off but could not. Once the crime rate and outgoing behavior started rising around the turn of the century, labor agitation became much more organized and serious. Class conflict peaked around WWI, after which both sides made concessions -- no violent agitation from workers, and better treatment and conditions from the elite. (And an end to immigration, whereby the soaring supply of labor had lowered wages.)
It's a good thing the conflict ended when it did, since people only got more outgoing and rambunctious during the Roaring Twenties. They could have really turned up the heat by the start of the Great Depression, which coincided with the peak of the extraverted Jazz Age culture.
I wouldn't expect to see much of a dent in the micro-managing norms of today's workplace until people leave their cocoons and start interacting with one another again. It takes a team effort to protect your dignity against managerial manipulations.
Categories: Books, Cocooning, Design, Economics, Pop culture, Psychology
February 16, 2014
Frisky females contribute to the culture of violence
A common context for an escalation of violence toward homicide is two strange men in a public place arguing over a woman. Maybe one is the boyfriend or husband, and maybe neither is but both are interested in her. Words and stares are exchanged, each feels pressure from the woman and the public not to back down, and suddenly they're off on a trajectory that could end in death.
The woman is not just a passive bystander. In some way she allows or provokes the advances from Guy #2, regardless of whether or not Guy #1 is her boyfriend or husband. Guy #2 comes over to talk to her, and she doesn't brush him off, shoot him a cold or confused look, etc. She likes having men compete for her attention (leveling up her ego points), and she may also want to keep her sexual options open. So she is at least not put off by Guy #2, and may even turn toward him, engage him in conversation, laugh, smile, and so on.
The sense that Guy #1 could lose access to something he had expected to be his -- either his girlfriend / wife, or a strange woman who he had "dibs" on by engaging her first -- leads him to cut off the external threat. He hoped that she'd just give Guy #2 the cold shoulder, and that would be that. But dangit if she isn't turning toward him, smiling and talking. He will have to drive off the rival by himself (or with the help of his buddies).
These initial steps in the escalation toward violence do not take place when the woman is cold, brusque, and unwelcoming to strange men. Guy #2 leaves on his own because he got the hint that she couldn't be less interested in him. That saves Guy #1 the risky task of regaining sole access (at least in the near term) over what he believes to be his, against a motivated male rival. In fact, after getting the cold shoulder, Guy #2 is not motivated at all.
I think these differences in average female dating-and-mating behavior go a long way to explain differences in violence among males of one group vs. another. Rude and frigid Yankee women are unlikely to allow, let alone encourage, two or more men to literally fight over their affections. "Is that guy seriously coming over to talk to me when my boyfriend's right here? How bitchy should I sound when I tell him to 'Find someone else to tell your lame lines to, creep'?"
These women have low sex drives, and use sex as a bargaining chip to get a high-status man to settle down and invest in them and their offspring. Such a woman would never entertain thoughts of "Hey, another lover... y'know, just in case," or "Hey, I wonder if he'd be a higher-quality stud... might be only one way to find out for sure."
Hence, Yankee men still compete for access to these women, but not over who can get them the most hot and bothered, or who's man enough to displace a rival from a public place, in front of a crowd. Rather, it's who can strive for the highest status and offer the greatest material and reputational gains if she agrees to dole out sex on a regular basis.
Those Southern women, on the other hand, are frisky little kitties. They like keeping their options open (without appearing brazen -- more like having a wandering eye), having multiple suitors fighting over them, and surrendering to the moment. "In the heat of passion" describes not only her momentary lapse of fidelity, but also Guy #1's resort to violence to punish her and Guy #2 when he catches them.
There isn't much heat of passion up in Minnesota, so they don't have to worry about dangerous situations like these.
Comparing two groups is a suggestive but weak way to establish causation. Looking at how a group changes over time is better. As the causal variable changes, the response variable ought to, well, respond.
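To make that logic concrete, here is a minimal sketch in Python -- using made-up toy series, not real crime statistics -- of the kind of check it implies: if the proposed cause really drives the response, the cause series should correlate best with the response series at a positive lag of a few years, not at lag zero.

```python
# A minimal lead-lag sketch with TOY data (not real crime statistics).
# If the causal story is right, the cause series should correlate best
# with the response series at a positive lag.
import numpy as np

def lagged_corr(cause, response, max_lag=5):
    """Correlation of cause[t] with response[t + lag], for lag = 0..max_lag."""
    out = {}
    for lag in range(max_lag + 1):
        c = cause[: len(cause) - lag]
        r = response[lag:]
        out[lag] = float(np.corrcoef(c, r)[0, 1])
    return out

rng = np.random.default_rng(0)
# Hypothetical "outgoing behavior" index over 40 years (a random walk),
# and a response series that follows it with a ~3-year delay plus noise.
outgoing = np.cumsum(rng.normal(0.0, 1.0, 40))
response = np.roll(outgoing, 3) + rng.normal(0.0, 0.5, 40)
response[:3] = outgoing[:3]  # crude padding for the wrapped-around values

print(lagged_corr(outgoing, response))  # correlation should peak near lag 3
```

With real data you would want differenced series and proper significance tests; the sketch only shows the lead-lag shape of the argument.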
Frisky females seem to be part of the outgoing phase, and frigid females part of the cocooning phase of the cycle of social openness and interaction. You can see young women getting a little more out-and-about by the mid-to-late '50s, for example by joining guys in public hang-outs, which are usually felt to be "creepy" places if the girls are frigid and cocooning. Swooning over Elvis was way more scandalous than having a crush on Frank Sinatra. And the incidence of common venereal diseases had begun rising already by the late '50s.
The homicide rate didn't start increasing until 1959, and other violent crimes not until the early '60s. That timing argues against the view that frisky female behavior is merely a response to male violence; it looks instead like a contributing factor to it.
Of course, that isn't the only or even the primary cause of rising homicide rates. As far as the cocooning vs. outgoing behaviors go, I think simply having a lot more people out in public, and with their guard down / trust up, is what drives crime rates up. Criminals find easy pickin's in such an environment, and are left with little to do when everyone is locked inside their nuclear household as they were for much of the mid-century and the Millennial eras.
We should not greet this as good news -- "My daughter won't get knocked up, AND crime rates are plummeting? What's not to like?!" That has come at the cost of complete social isolation. That's why they're not dating -- minimal interest in boys and other people in general. Reality check: girls in the good old days were not sleeping with a different guy every week, and the homicide rate at its peak was around 1 in 10,000 per year (about 10 per 100,000, in the usual units). You faced a higher risk of getting robbed then than now, but you didn't leave your house with a bullet-proof vest on.
Then again, how many people care about the "cost" of social isolation when most folks are cocooners? To them, that's just one more reward -- "No awkward interactions either?!"
This also explains why certain cultures used to be romanticized, and why different ones are idealized today. It used to be Los Angeles, Dallas, and Miami; now it's DC, Manhattan, and Brooklyn. It used to be the Mediterranean; now it's Scandinavia. That echoes which cultures were prized in the Jazz Age and the Mid-century, respectively. In outgoing times, our attention is drawn to more hot-blooded cultures.
Categories: Cocooning, Crime, Dudes and dudettes, Human Biodiversity, Psychology, Violence
February 14, 2014
The simplest model of bisexual females?
Gay males are defined by psychological, and even physical, stuntedness in childhood. Click on the category label "Gays" to see the wealth of posts supporting this view. It does better than the mainstream views that gays are feminized or hyper-masculinized.
They're Peter Pans who never grow out of the "girls are yucky" phase, which brings most or all of their other distinguishing traits along with it. Homosexuality in males is not just a sexual preference, it is a broader syndrome of dysfunction and depravity.
Now that I've established that to the best of my curiosity, I think it's time to move on to the two other non-hetero groups -- lesbians and bisexual women (there being no bisexual men: most are gay, some are straights trying to act "edgy"). Lesbians are too hard for me to spot in real life, and they don't stand out so much in society at large, so intuition will be lacking there. Let's start with the easier case of bisexuals.
Ideally the causes of bisexuality in women and gayness in men would be the same. Don't complicate the model if you don't need to. So, how does the psychological stuntedness model do with bi women?
The main thing that comes to mind is that they are the "wild child" type. They're not as infantilized as gays, though; they are more like horny and headstrong adolescents -- fairly early ones, middle school brats rather than college students. And they are bratty, just not as whiny as gays.
If the "gay germ" (in Greg Cochran's model) does something to halt or slow down maturation in the male brain, it could do something similar in the female brain. Same germ, similar effects, only not so extreme in intensity. Bi women tend to be intermediate between heterosexuals and gay men when it comes to deviance.
Why not so extreme? Women are less unstable than men. If the X chromosome plays any role, women have a spare copy should one of theirs carry a harmful variant. Male fitness is way more variable, so natural selection will only "see" the upper layers of the male distribution. Hence, males are the more disposable sex. Females have been designed to be more robust to disruptions toward one extreme or the other, and are under-represented at both extremes.
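The variability point is easy to illustrate with a toy simulation (hypothetical parameters, not measurements): give two groups the same average but a modestly wider spread for one of them, and that group ends up heavily over-represented in both tails.

```python
# Toy illustration (hypothetical parameters): equal means, modestly wider
# spread for males, then count who lands beyond cutoffs in EITHER tail.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
males = rng.normal(loc=0.0, scale=1.1, size=n)    # wider distribution
females = rng.normal(loc=0.0, scale=0.9, size=n)  # narrower distribution

for cut in (2.0, 3.0):
    m_tail = np.mean(np.abs(males) > cut)    # fraction of males in the tails
    f_tail = np.mean(np.abs(females) > cut)  # fraction of females in the tails
    print(f"|x| > {cut}: males {m_tail:.4%}, females {f_tail:.4%}, "
          f"ratio {m_tail / f_tail:.1f}x")
```

Note how a roughly 20% difference in spread produces a multi-fold over-representation at the extremes -- and the further out the cutoff, the larger the ratio.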
So, if they got infected by the "gay germ," their maturation only slows down by half as much, and as adults they resemble middle schoolers rather than kindergarteners. That's if they got infected at the same age as males did. They could also just get infected at a later age, and the germ would have the same effect -- halt maturation to where it currently is. But then you'd have to explain why females got infected nearly 10 years after males did.
This post is off-the-cuff, so I'm not sure how broadly the model of "bi women as arrested middle schoolers" can account for the full suite of their characteristics. Having a high sex drive fits, and so does being willful, particularly toward authority figures. Below are more off-the-cuff thoughts on their traits that may or may not easily fit into this paradigm.
Masculinization. Bi girls are on the tomboy side in behavior, and as adults seem to have more masculine faces and perhaps skeletal structure generally (like narrower hips, broader shoulders). Testosterone levels peak in adolescence, are minimal for girls during childhood, and fall off very rapidly during their 20s and 30s as their body switches into mothering and grandmothering mode. Arresting their mind and body in adolescence would masculinize them as adults, although they wouldn't appear so relatively masculine during middle and high school, when the normal girls are also horny and headstrong.
Liking both guys and girls. This is their defining trait, so I might as well try to account for it. If their brain were arrested in childhood, they'd only like females. If it were arrested after menopause, they wouldn't really be attracted to anyone, though they'd probably prefer the company of women because they want a more peaceful and stable life, which men would disrupt. (Lesbians seem to be menopausal from an early age, but that's another topic.)
Female sexuality does seem more fluid and open than male sexuality, and early adolescent girls sometimes get practice kissing other girls at sleepovers or wherever. Girls are way more touchy-feely with each other in general at this age. This is intended, though, as practice for hooking a boy and marrying a man some day; it's only done with other girls because that makes it safer than doing so for real with a boy. Bi girls, though, don't outgrow this phase and mature into young adults who only prefer messing around with boys. Their maturation gets stuck in the phase of liking boys but feeling more comfortable playing with other girls, so that they're open to both.
That is a strong impression I get from them, BTW -- that guys to them are the real thing yet intimidating, and more of a sex object, while girls are more of a safe space where they can cuddle, make out, etc., and not worry about being judged by boys. It's very much like a sixth-grader would feel. It's not the other way around, where girls are mere sex objects while they romanticize guys.
More violent. This would stem from hormone levels being frozen near their adolescent peak. Anything that makes you think "hormonal" would follow from this general model.
There's more to look at, but those are a few random test cases that came to mind. I haven't given too much thought to this, so I'm not as convinced as I am of the "gay Peter Pan" model. Chime in with other well-known traits of theirs that need to be explained no matter which model it is.
Categories: Age, Dudes and dudettes, Gays, Psychology
Queers more likely to be cannibals
Cannibalism is so rare that Wikipedia's category of American cannibals includes only 20 entries. And yet by looking at which groups are over-represented, we can learn something about which ones are likely to be so seriously screwed up that they would eat human flesh.
Some of them were high on drugs like PCP, and it's not clear they intended to eat what they ate. A handful of others were Mountain Men who ate parts of their enemies, as a form of overkill or of disrespecting them even in death. And a couple were curious anthropologist types who wanted to "go native" for a bit.
I'm interested in the ones who were drawn to it per se, to make sure we're looking at profound psychopathology rather than these weird but non-pathological motives.
That leaves just 9 cases: Bar-Jonah, Dahmer, Fish, Toole, Chase, Cole, Kemper, Gaskins, and the Ripper Crew. All cases are male (no surprise there, as male biology is more unstable). Of the cases, 4 are homosexual (the first four listed) and 5 are heterosexual. The Ripper Crew was made up of 4 individuals, although it's unclear how many of them ate the flesh of their victims. At the level of individuals, then, between 5 and 8 are straight cannibals, and 4 are gay.
The prevalence of homosexuals in the male population overall is about 4 in 100, while among disturbed cannibals it is around 4 in 10 -- over-representation by an order of magnitude. Even with such a small sample, if gays were no more likely than normal males to be drawn to cannibalism, the expected number of gay cases among the nine would be well under one (9 × 0.04 ≈ 0.4) -- not four.
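For the skeptical, the arithmetic can be made explicit. Here is a quick sketch using the post's own figures -- a 4% base rate and nine cases -- and treating the cases as independent draws, which they strictly aren't:

```python
# Sanity-check the over-representation arithmetic with the post's own
# figures: ~4% base rate, 9 cases, 4 of them gay. Cases are treated as
# independent draws, which is a simplification.
from math import comb

p, n, k = 0.04, 9, 4

expected = n * p  # expected number of gay cases under the null (~0.36)
# P(at least k of n cases are gay) if orientation were irrelevant:
p_tail = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(f"expected gay cases: {expected:.2f}")
print(f"P(>= {k} of {n} by chance): {p_tail:.6f}")  # roughly 3 in 10,000
```

Under those assumptions, four gay cases out of nine sits far out in the tail of what chance alone would produce.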
Why are gay men less repulsed by an act that most cultures view as a disgusting taboo? I think it stems from their fundamental trait, which is being psychologically stunted in early childhood ("girls are yucky"). At that age, not all taboo behaviors have had the time to root themselves in the "ewww, disgusting" part of the brain. Infants put all kinds of sick crap in their hands and mouths as they probe their environment, and hardened disgust reflexes aren't in place until later childhood and early adolescence.
Queers are way more likely to play with piss and shit, for example, as though they never grew out of being 5 year-olds. Human flesh would then just be a special case of their broader undeveloped sense of disgust and taboo.
Of course, to reveal this tendency of theirs would require them to murder somebody, in cold blood, and not as revenge but because they enjoy it per se. That restricts the sample (gay or straight) to serial killer types. So, we will not easily observe the lower threshold that homos have for eating human flesh, because it only takes place in such infrequent contexts. But when those conditions are met -- when we look for serial killer type behavior -- then the higher level of gay depravity exposes itself.
Related: queers over-represented among serial killers by 2 to 7 times.