For all the obsession with authenticity, contemporary culture is remarkably fake. Or perhaps designed-by-committee is a better way to put it. You feel it the most in pop music -- too bland to bother turning on the radio anymore. It hasn't been worth it since about 1993, and that's being generous. More like the late '80s.
This is only one example of the stultification of culture during a period of falling-crime / cocooning. What are the pathways, though, that lead from cocooning to committee-created culture? As elsewhere, when folks don't trust one another, they can't form a cohesive group -- not even a small one, which ought to be easy (no huge "free rider" problem).
Have you ever noticed how there are hardly any bands or music groups these days? How the hit songs are performed by a lone singer, perhaps with the occasional guest vocalist? That's what you get when suspicion is the first thing people feel toward one another. Without an atmosphere of camaraderie like you find among band members, some kind of central committee will have to arrange and supervise the relationships between the relevant players. And the players will assume a more on-the-move mercenary role -- not being bound to any particular enduring group.
One easy way to see the unraveling of camaraderie in the pop music industry is to look at who's listed under the writer's credit for a hit song. If it's a talented and cohesive group, they're probably creating everything in-house. If the individuals involved are thrown together anew for each song, then the songwriting is probably outsourced to a specialist (or a team of them).
I went to the Billboard Year-End singles charts, and looked up the writer's credit for the top 20 songs of the year, from 1949 to 2009 in 5-year intervals, and adding 2012 at the end. (The year-end charts for 1949 came from this site.) The criterion for counting the song as "written by the performers" was pretty strict, but that avoids having to make lots of subjective judgement calls. Namely: everyone given credit for writing the song must be a performer on it as well.
Obviously that excludes songs whose lyrics and music were written by a songwriting team and given to a singer to perform. It also excludes songs where the performer may have contributed to the creation, but received major help from outside. For example, "Time After Time" from 1984 is credited to Cyndi Lauper, who sings it, but also to Rob Hyman of The Hooters, who played a big role in writing the music. So it is not counted. From the same year, "When Doves Cry" is credited only to Prince, who performs vocally and instrumentally, so it is counted.
It's not as though getting sole credit means the performer had no outside help or influence whatsoever -- only that the help wasn't big enough for those helpers to be added to the writer's credit.
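To make the counting rule concrete, here's a minimal sketch in Python. The data structures and example calls are my own illustration (the actual tally was done by hand from the Billboard credits): a song counts only if every credited writer also performs on it.

```python
def written_by_performers(writers, performers):
    """A song counts as 'written by the performers' only if every
    credited writer is also a credited performer on the track."""
    return set(writers) <= set(performers)

# "Time After Time" (1984): Rob Hyman shares the writing credit but
# does not perform on the track, so it is not counted.
print(written_by_performers({"Cyndi Lauper", "Rob Hyman"}, {"Cyndi Lauper"}))  # False

# "When Doves Cry" (1984): Prince is sole writer and sole performer.
print(written_by_performers({"Prince"}, {"Prince"}))  # True

def share_for_year(songs):
    """Fraction of a year's top-20 songs meeting the criterion.
    `songs` is a list of (writers, performers) pairs."""
    return sum(written_by_performers(w, p) for w, p in songs) / len(songs)
```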
The graph below shows the rise and fall of pop music stars who wrote their own songs:
The mid-century points near the bottom are a good reminder of how fragmented the society was back then, and how afraid they were to take a chance on each other -- or on themselves. Why didn't Frank Sinatra or Perry Como play an instrument, or take a shot at writing their own songs? "I'm no expert -- best to leave that to the professionals." Hey dude, it's not like "Summer of '69" is Keats for a modern audience. You don't have to be a wiz with words. Just give it a shot and see how well it goes. You never know till you try, right? But social paralysis was as powerful back then as it is now.
Not until the 1959 point do we see a breaking away from the mid-century model. And it's not really a rock 'n' roll rebellion, as most of the early rock of the '50s was written by outside songwriters. Rather, it's pop singers like Paul Anka and the Everly Brothers sticking their neck out a little bit by doing it themselves.
The mini-decline during the '60s would be even more pronounced if I had chosen a year other than 1964 for the mid-decade point. That was the year of Beatlemania, and 5 of the 7 songs shown were written and performed by Lennon-McCartney.
There appear to be shorter cycles on top of the overall rise-and-fall pattern. First a burst of enthusiasm for new writer-performers, followed by professional songwriters jumping on the bandwagon to make a more polished product. Then audiences getting tired of the polished state of things and wanting another round of Something New.
After the mid-'70s doldrums, the next surge of originality came with disco. Most people associate the style with cheesy polyester leisure suits, more than the actual quality of the music itself. It's not easy to write for so many instruments, let alone pen lyrics that will pick people's spirits up and make them feel like dancing their troubles away. But the Bee Gees and Chic made writing your own songs look easy, and they deserve more respect than they get from the nerdier sort of pop music critic.
The peak of homemade songs arrived with the New Wave / Heartland Rock / Hard Rock zeitgeist of the mid-1980s. Even then, "only" one-half of the top 20 songs were created entirely by their performers, so there was plenty of room for professional songwriters and singers to work together on a project-by-project basis. Still, it's striking how skilled the average pop star was in the '80s. Can't hand off those guitar solos to someone else -- that would be a loss of honor.
By the late '80s, there were more professional outsiders jumping on board during the twilight phase of rock and synth-pop. But unlike earlier periods, things did not bounce back with Something New during the '90s -- or the 21st century, for that matter. Something that only the performers could come up with by themselves. There was a blip in 2004, perhaps another example of the mid-2000s respite from the overall boringification of the culture since the '90s. However, a lot of those songs are rap, hence the "songwriting" quality is not impressive. No singing, for instance, just plain speech.
And now we're right back to mid-century levels of creation-by-committee. It's mind-blowing to look at writer's credits from recent years -- sometimes 3, 4, 5, 6 people being listed. Perhaps the performer is thrown in there somewhere, but they are mostly being supported by an ad hoc team of consultants. And the results are predictable. There's no real sense of passion, commitment, and camaraderie. Each song sounds like the individuals involved were just going through the motions, rehearsing a script.
The only song written by the performer in 2012 was "Somebody That I Used to Know" by Gotye, which is easily the only halfway decent song from last year -- and the most distinctive. It didn't sound like the rest of the 21st century centrally planned pop pap. It's reminiscent of The Police or Peter Gabriel, not out of blind imitation, but simply because songs sound more memorable that way when their performers have something personal invested in them.
As for the rest of it, future generations will probably resonate with it as much as they would with Rosemary Clooney and Eddie Fisher. Meanwhile, they'll still be playing the Beatles, the Bee Gees, and the Boss.
September 30, 2013
September 28, 2013
Has grade inflation struck IQ tests, too? A look at kids of New York elite parents
Via Steve Sailer, here is an NYT article on the increasingly high-stakes contest among New York parents to get their kids into elite kindergartens. The focus is on the IQ test used to sort out the top 1% from the rest. Several sources express concern that something funny is going on because the fraction of all New York kids clearing the national top 1% mark has shot up in recent years. Just 6 years ago, only 18% cleared that mark, while last year 29% did.
The IQ / HBD crowd should show some more interest in corruption, test prep, and other forms of grade inflation among the elite -- not just in the obvious places like the Atlanta public school system (wink wink).
Intra-elite competition has been ramping up for decades now, so we can't assume that the most powerful and wealthy group of parents in the country aren't making good use of that power and wealth to get artificially better outcomes for their kids, and hence lower outcomes for everyone else, where those resources are limited.
Looking through the ERB report on changes over time, it's clear that something funny is going on. There's an increase in the mean combined score of 0.375 standard deviations -- in just 5 years (fall '07 to fall '12). If we phrased it in terms of height, it's as though these kids had gotten 1.1 inches taller on average in just 5 years. Little of that is due to changes in verbal scores, which are pretty flat. The performance scores, however, have increased by 0.6 SD, almost linearly over time. In height terms, that's 1.8 inches taller on average than 5 years ago.
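For anyone wondering where those inch figures come from, here's the back-of-the-envelope conversion. It uses a factor of about 3 inches of height per standard deviation, which is what the post's own numbers imply; the exact height SD for kids this age is my assumption, not something taken from the ERB report.

```python
HEIGHT_SD_INCHES = 3.0  # assumed SD of height for young kids; illustration only, not ERB data

def sd_shift_in_inches(shift_in_sd, height_sd=HEIGHT_SD_INCHES):
    """Express a rise in test scores (in SD units) as an equivalent height gain."""
    return shift_in_sd * height_sd

print(sd_shift_in_inches(0.375))  # ~1.1 inches (combined score, fall '07 to fall '12)
print(sd_shift_in_inches(0.6))    # ~1.8 inches (performance score)
```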
Moreover, there are no changes in the national sample of children, whether for the verbal, performance, or combined scores, and whether you look at mean scores or 90th or 98th percentiles. Something is going on specifically in the New York bunch of test-takers.
Now, what would champions of the "cognitive elite" say about a group of Atlanta public school kids whose mean IQ scores had apparently risen in 5 short years by nearly 0.4 SD, and whose performance scores had risen by 0.6?
Demographics? Nope -- the ERB looks at age and sex, and there's no change. They don't mention race, but by fall 2007, New York super-parents were already fully white / Jewish / NE Asian. Environmental improvement? Nope -- those kids of Manhattan super-parents weren't starving, disease-stricken, etc., "back" in 2007, only to have recovered by now.
That leaves artificial causes. Since the verbal scores are flat, you might think that it's mostly test prep -- little human beings are already designed for verbal communication and reasoning, but not for weird new things like "what picture comes next in this sequence?" or "3 of the cells of this matrix are filled in. What goes in the missing cell?" There are larger marginal returns if you prep for the performance sub-test, and don't waste too much time on the verbal one.
That's some pretty damn good test prep, if that's all there is to the rise in scores. Remember -- 0.6 SD in just 5 years, on an IQ performance test, not some quiz of factoids. I don't doubt test prep is the new normal for Manhattanite children, and that smarter kids will get more out of it than duller ones. But not enough to produce such a huge, fast increase.
Rather, the likely causes are some mix of tacit grade inflation and outright corruption. IQ-focused nerds can join the 21st century and recognize how rampant such practices are -- including among the very top of the elite, like grade inflation and "No Child Left Without Latin Honors" at Harvard.
I have no experience with Manhattan super-parents or the local test-production and test-prep industries, so I have no intuition about what mix is the most likely. Grade inflation sounds more likely -- understood as necessary by the test-makers, lest the super-parents take their business elsewhere, which would hit the company financially and reputationally.
It's like during the last economic bubble, the ratings agencies inflated the worthiness of various financial packages because if one of them were more honest, the customer would shop it around to another ratings agency that would give it higher marks. The customer holding this batch of mortgages (or whatever) was not interested in an honest appraisal -- they wanted to pass it on to some sucker for as much money as possible, and a higher rating means more money.
Similarly, parents don't want an honest score for their kids -- they want the highest score possible, in order to pass them along to the school that will give them the greatest resource-earning potential into the future. They can't all get the maximum score, or the jig will be up. But that still leaves plenty of room for gradual, subtle rating inflation. The cumulative effect is more pronounced (like the '07 to '12 comparison), but by then your kid has already taken the test and gotten his inflated score, so what do you care if the bubble bursts in the future and wrecks the community? Gotta look out for Number One (and mini-One).
Corruption is not out of the question either, but probably less of a factor. You see that more when the test-takers are below average, and the proctors will just erase the wrong answers the students gave and write in the correct ones. Maybe with orders from above, lest the test make the public school system look bad. But not so much at the elite level -- parents in general aren't greasing the administrators' palms to get their kid into Harvard. Same with the housing bubble -- holders of the mortgages didn't plunk down however-many dollars on the conference table. They used the less detectable tactic of shopping their product around to find the highest rater, with competitive ratings agencies only too eager to accommodate the demand for inflated ratings.
I'm not sure how much revenue the ERB brings in from New York parents, in order for their shopping-around to drive possible decisions to inflate scores. I'm guessing it's more of a reputational concern -- if New York elite parents drop your test like a hot potato, you look like losers. That would hold even for an established IQ test like the SAT -- but would really slam a more unknown test that's just starting out (only existing since the mid-2000s), trying to distinguish itself from all the other tests that elite parents could choose from. Theirs is an endorsement that money can't buy. You have to make it worth their while in other ways.
Neither of these causes by itself explains why the rise in scores is most dramatic for the performance rather than the verbal scores. Perhaps the grade-inflaters know how to make it seem less noticeable. It is what they do for a living, after all. Like, if the goal is to inflate overall GPA, then (at least at first) inflate the grades for music class but not for math class, where inflation would raise more red flags. And perhaps part of what those $200-per-hour tutoring sessions get you is some first-hand knowledge of the questions through someone who's socially connected and hard-up for money in uber-expensive New York City, though restricted to just the performance sub-test. Again, don't want to give away too much and have it look obvious. And to maintain a decent bargaining position, you want to still be holding onto something they want, not just give it all up on the first date.
All of these hypotheses and their predictions should be followed up on, to see where the culprit truly lies. Though good luck figuring it out. It's not simply a case of "getting your hands on the data" -- that's just test score data. The "data" that you're really after is the behavior and relationships among the various players -- and they're not likely to reveal that much to outsiders.
There's also likely an understanding that the test is designed to get the Children Who Matter into the Schools That Matter. If they don't, why, the global economy could unravel or blow up in 20-30 years. How could America run itself without the right children getting into the right schools? So, you cut them some slack in advance appreciation for all the society-holding-together work their children will be contributing throughout their lives. Like driving home-ownership rates up to 100%, handing out mortgages to illiterate Mexican strawberry-pickers, opening the floodgates of immigration, and so on.
In summary -- time for the IQ crowd to stop pretending we have a meritocracy these days, and to stop being such naive pawns in the elite's game of self-advancement. Intra-elite competition couldn't get any fiercer (although stay tuned for some scenes from next week's episode), and evidence of "rating inflation" of one kind or another has been obvious for decades now. As elite households fight more ruthlessly against one another in naked self-promotion, they'll resort to whatever it takes. We don't live in a restrained, sportsmanlike meritocracy but in a no-holds-barred war of all against all.
Categories:
Age,
Crime,
Economics,
Education,
Human Biodiversity,
IQ,
Morality,
Over-parenting,
Politics,
Psychology
Miley Cyrus explains that youngsters' kabuki faces stem from autism
A recurring topic around here is how incapable folks these days are of expressing simple basic emotions, resulting in grotesque caricatures when they try to make any face other than that of an iPhone zombie. (See here and here.)
This broken system of expression is particularly strong among Millennials. Boomers and X-ers know how to show genuine smiles and got plenty of practice doing so back in the good old days. So when they make their weirdo face -- albeit not very frequently -- it's more like they're hiding who they really are, because you're not allowed to be yourself anymore in these stuffy, snarky times we live in.
The experience of Millennials, however, has only been with the sarcastic, glib, avoidant, and dismissive period of the past 20 or so years. They're not the pretty girl who makes herself look drab in order to avoid the inevitable assaults of beauty-hating bow-wows. They're the doughy, pizza-faced slug who slathers on juggalo make-up in order to suggest that "I can look pretty, I just don't want to." Get real, you skag.
In a new interview with Rolling Stone, pop star and suspected faggot-trapped-in-a-chick's-body Miley Cyrus explains why she always sticks her tongue out and dials the facial fakeness up to 11 when she's having her picture taken:
"I just stick my tongue out because I hate smiling in pictures. It's so awkward. It looks so cheesy."
The key word there being "awkward" -- expressing basic emotions is awkward, presenting an attentive and affectionate face for your fans is awkward, and trying to make yourself likable is awkward. Young people today are pretty fucking awkward. And yet, does warping your face into a kabuki mask make you look more normal, cheerful, and not-cheesy? Nope. Check out a gallery of this freak sticking her tongue out for the camera.
So there you have it, straight from the horse's mouth. Millennials themselves realize that they aren't taking duckface pictures, staring down at their phones, etc., because it's cool, comforting, fun, or whatever. They're aware that they're just too damn sheltered and awkward to know how to behave around people who don't come from their nuclear family.
Miley Cyrus represents the more antagonistic subset of emotional retards. But her less hostile peers like Selena Gomez, Nicki Minaj, and Kesha aren't any more developed. When they attempt a smile, it still looks smug or strained -- and irritated that you're causing them strain by expecting a normal smile.
Here's something you don't see much of nowadays. So cute and wholesome:
That looks like it's from 1987 or '88, when Debbie Gibson was only 17, yet looking more mature than bratty Millennials in their mid 20s today. Only when helicopter parents began to prevent their children from socializing with their peers did young people become incapable of even feeling basic human emotions, let alone expressing them to others. Helicopter parents have no one to blame but themselves for producing a generation of Lady Gagas and Miley Cyruses.
If they want more Debbie Gibsons and Tiffanys, they're going to have to let their kids have their own (i.e. unsupervised) social life. That's what pushes a person to grow up -- having to fit into a group, leaving behind childish egocentrism.
And it's not just the "bright bubbly smile" look that young people used to be capable of. Check out the music video for "Foolish Beat" and note how effortlessly grown-up she looks and acts. Not one of those '80s hall of fame songs, in my book, but focus on how she presents herself. No caricature or irony back then to provide the lazy / retarded way out.
"I just stick my tongue out because I hate smiling in pictures. It's so awkward. It looks so cheesy."
The key word there being "awkward" -- expressing basic emotions is awkward, presenting an attentive and affectionate face for your fans is awkward, and trying to make yourself likable is awkward. Young people today are pretty fucking awkward. And yet, does warping your face into a kabuki mask make you look more normal, cheerful, and not-cheesy? Nope. Check out a gallery of this freak sticking her tongue out for the camera.
So there you have it, straight from the horse's mouth. Millennials themselves realize that they aren't taking duckface pictures, staring down at their phones, etc., because it's cool, comforting, fun, or whatever. They're aware that they're just too damn sheltered and awkward to know how to behave around people who don't come from their nuclear family.
Miley Cyrus represents the more antagonistic subset of emotional retards. But her less hostile peers like Selena Gomez, Nicki Minaj, and Kesha aren't any more developed. When they attempt a smile, it still looks smug or strained -- and irritated that you're causing them strain by expecting a normal smile.
Here's something you don't see much of nowadays. So cute and wholesome:
That looks like it's from 1987 or '88, when Debbie Gibson was only 17, yet looking more mature than bratty Millennials in their mid 20s today. Only when helicopter parents began to prevent their children from socializing with their peers did young people become incapable of even feeling basic human emotions, let alone express them to others. Helicopter parents have no one to blame but themselves for producing a generation of Lady Gaga's and Miley Cyruses.
If they want more Debbie Gibsons and Tiffanys, they're going to have to let their kids have their own (i.e. unsupervised) social life. That's what pushes a person to grow up -- having to fit into a group, leaving behind childish egocentrism.
And it's not just the "bright bubbly smile" look that young people used to be capable of. Check out the music video for "Foolish Beat" and note how effortlessly grown-up she looks and acts. Not one of those '80s hall of fame songs, in my book, but focus on how she presents herself. No caricature or irony back then to provide the lazy / retarded way out.
Categories:
Age,
Cocooning,
Dudes and dudettes,
Generations,
Music,
Over-parenting,
Pop culture,
Psychology
September 27, 2013
Baby names that would sound trendy but not lame
I've been digging through baby name data again on the Social Security Administration's interactive site, and one thing that causes such distraction when you're coding the data is how irritating so many new names sound. Check out the top 1000 names for 2012, for instance.
"Pretentious" is the first word that comes to mind -- Dalton, Princeton, Colton, Jayden, Hunter, Archer, Aubrianna, Mackenzie, Harper, Brooklynn, Aurora, Arabella, ad nauseam.
What makes names trendy is primarily their sound-shape ("phonotactics"). Today's trendy names have lots of vowels compared to consonants, male names all end in an unstressed -n or -r, and so on with other sound-based regularities. Within this primary constraint, parents are more free to choose a name from the Old Testament, a cosmopolitan city, a Celtic surname, an elite school they hope their kid will get into, and so on.
Sadly, even the non-pretentious names that seem to be taken from pop culture allude to the culture of the Millennial era -- Ryker, Dawson, Cullen, etc. It would be nice to find some more down-to-earth references that are more wholesome and All-American, while still fitting into the primary constraints on sound shapes.
I'm pretty sure your daughter would be the only Lauper or Brinkley at school, and your son the only Delorean or Halen. (And if you didn't get that son you were hoping for, you can always name her Haylynn too.) Girls' names are always more innovative, for better or worse (usually worse). So why not...
Bonjoviana
Brianarama ("ah"), or
Breannarama (with the stressed vowels as in "Anne")
Durannabella
Ediebrickella
Divinylynn
If you're going to scar your kid by picking their name from pop culture, it may as well be something cool.
"Pretentious" is the first word that comes to mind -- Dalton, Princeton, Colton, Jayden, Hunter, Archer, Aubrianna, Mackenzie, Harper, Brooklynn, Aurora, Arabella, ad nauseam.
What makes names trendy is primarily their sound-shape ("phonotactics"). Today's trendy names have lots of vowels compared to consonants, male names all end in unstressed -n or -r, and other sound-based regulations. Within this primary constraint, parents are more free to choose a name from the Old Testament, a cosmopolitan city, a Celtic surname, an elite school they hope their kid will get into, and so on.
Sadly, even the non-pretentious names that seem to be taken from pop culture, allude to the culture of the Millennial era -- Ryker, Dawson, Cullen, etc. It would be nice to find some more down-to-earth references that are more wholesome and All-American, while still fitting into the primary constraints on sound shapes.
I'm pretty sure your daughter would be the only Lauper or Brinkley at school, and your son the only Delorean or Halen. (And if you didn't get that son you were hoping for, you can always name her Haylynn too.) Girls' names are always more innovative, for better or worse (usually worse). So why not...
Bonjoviana
Brianarama ("ah"), or
Breannarama (with the stressed vowels as in "Anne")
Durannabella
Ediebrickella
Divinylynn
If you're going to scar your kid by picking their name from pop culture, it may as well be something cool.
Categories:
Language,
Over-parenting,
Pop culture
September 24, 2013
When did students start confronting teachers about grades?
We can ignore whatever the baseline level is that's due to human nature and focus instead on when teachers and professors became more and more vocal about how students whine, complain, and attempt to negotiate about grades they think are unfair.
Searching the NYT and LexisNexis for "grade grubbing" and "grade grubbers" shows that there are two distinct usages from two different periods. It begins sometime in the 1970s and refers to college students whose sole focus is maximizing their GPA -- they forego having a social life, extra-curriculars, intellectual curiosity, and so on, in order to memorize more facts and study harder before tests. There's no suggestion, though, that this involves regular and predictable confrontations with their professors to complain about grades, and then haggle over what a truly fair grade would be.
"Grade grubbers" in the sense of pests who annoy teachers doesn't show up until the early 2000s. Presumably, the shift in the meaning was gradual, and so likely took place during the '90s.
Looking for the earliest account of an ongoing epidemic of student complaining and entitlement, I found this webpage / article from 1997. Its citations include contemporary news reports, academic articles, and first-hand stories from professors around the country. Part of it is concerned with the declining quality of the preparation that students arrive at college with, and the amount of work they put in when they get there. But it also features a lengthy review of student attitudes and behavior regarding fair grades -- by the time of the article, they'd begun to feel like just showing up to class and completing the assignments (no matter the quality) should secure a high grade, they demanded grade inflation in general, and in particular cases they would confront professors to have their test or paper score bumped up.
How far back did this go? Going through his list of over 100 citations, there are few from the 1980s, and none point to student vs. professor confrontations over grades. The only references to the decade are to the late '80s, and have more to do with falling achievement levels and growing apathy among students. Even the citations from 1990 to '92 don't support the view of "students as haggling pests." Not until the citations from 1993 to '96 do we see an explosion of that picture. Here's a quote that the author chose from a 1993 article in Harvard Magazine ("Desperately seeking summa"):
A student in a history course at Harvard “told … [the instructor] that he needed a better grade … because he needed to get into medical school. ‘I may not be very good at history,’ he said, ‘but I’m going to be a very good doctor, and I really reject the idea that you have the right to keep me out of medical school’.… Students are more likely to contest grades now. For many instructors, grading is the most distasteful thing they do. Often students help to make it unpleasant."
The author points out that the "student empowerment" movement of the 1960s can have little to do with the phenomenon of aggressive complainers since it far post-dates the heyday of the counter-culture. That's true of just about everything wrong with social relations these days -- they're a product of the 1990s, and were absent as recently as the '80s (e.g., political correctness, diversity sensitivity, multiculturalism, date rape hysteria, feminazis, homo enablers, etc.). Conservatives must wrap their brains around this fact of the timing, or else they'll cluelessly blame The Sixties and propose solutions that, whatever their worth in other respects, will do nothing to turn the tide against Millennial-era problems.
What about the link to widening inequality, and the attendant growing competition among aspiring elite members? We'd expect that kind of dog-eat-dog world to breed the aggressive grade-haggling that we see today. (See this article by Peter Turchin for a good review of the dynamics of inequality and its consequences.)
Income inequality began to grow sometime in the mid-to-late 1970s, and the higher ed bubble got going around 1980 (shown by percent of high school grads in college, and college tuition -- higher prices reflecting higher demand). Why the lag of 12-13 years from the start of the higher ed bubble until constant professor complaining about constant student complaining? I doubt it'd take that long to manifest itself, or for observers to notice. Compare the case of yuppies -- punks were already spray-painting "Die Yuppie Scum" back in the '80s, and an acclaimed movie from 1987 skewered the prevalent view on Wall Street that "Greed is good."
For that matter, forget the higher ed bubble. Once inequality starts to widen by the late '70s, why don't we read widespread reports of high schoolers -- or their parents -- pestering their teachers about boosting grades? The "grade grubber" in the '70s sense is still there -- Brian from The Breakfast Club, or the nerd in Heathers whose contribution to a discussion of a classmate's suicide is to ask whether they'll be tested on it later. Or that other grade-grubber in Heathers, whose sole purpose is maximizing his chances of early acceptance into an Ivy League school ("and please let it be Harvard," he begs God while praying at a classmate's funeral). But absent are any journalistic reports or fictional portrayals of high school kids confronting their teachers to haggle over grades.
One of the early-2000s references to confrontational grade grubbing in the press came from a high school teacher. And my sister-in-law spontaneously remarked that the complaints about grade-obsessed parents these days could just as well have been made about her high school back in 1995.
So, it seems that hostile self-promotion is a product of both the rising-inequality trend and the falling-crime / cocooning trend. It can't be just the first, or else we would've seen it during the later '70s perhaps, but definitely throughout all of the '80s. And it can't be just the second, since I don't have that impression of the last stretch of cocooning from the mid-'30s through the '50s. Students from that era were also bored, apathetic, dorky, and docile -- but not hostile and self-promoting on top of it, like today's dorky student body.
That predicts that we'd see something similar going on among students of the Victorian era in Europe, and the Gilded Age in America. I have no idea whether that's borne out or not, though. They didn't have the same widespread level of public education and higher ed that we do during this incarnation of the "worst of both worlds" zeitgeist. But, it's something that someone could look into.
Categories:
Cocooning,
Economics,
Education,
Generations,
Morality,
Over-parenting,
Politics,
Pop culture,
Psychology
September 23, 2013
Body dysmorphia and cocooning
It's something I've covered here and there before, but as occasionally happens, I wrote an entire blog post in someone else's comment section. Here is the post, about some nude model with already large breasts who's gotten them enlarged several times nevertheless. Not too different from the "sweater girl" phenomenon of the mid-century, when women exaggerated their chest size with bullet bras in order to prop up their sense of self-worth.
When people cocoon, they lack the feedback from others that tells them where they rank -- pretty, average, ugly, etc. So they take up solitary pursuits like breast augmentation to convince themselves that they're worthy -- y'know, rather than putting themselves out there for the boys and seeing how many date offers they get.
Add body dysmorphia to the list of dysfunctions in cocooning's web.
Categories:
Cocooning,
Design,
Dudes and dudettes,
Health,
Pop culture,
Psychology
"Sir" or "Mister"?
With the return of overly formal and distant/avoidant social relations, people feel less comfortable addressing strangers in friendly ways.
You still hear "dude" and "man" among youngish males, but even there it's not as common as it used to be. "Dude" is now more of an interjection, like "Wow!" Hence, "Dude! You'll never believe what I just saw..." It expresses the speaker's feelings rather than address another person in an informal way, a la "What's up dude?" "Not much, dude. How 'bout you dude?" (No one claimed that informality would make you sound smarter.) It seems like "man" is the only one left, like a fast food worker who gives you your order with a "Here you go, man." "Cool. Thanks, man."
The change that really rubs me the wrong way, however, is the ubiquity of "Sir" -- like, I'm not royalty, I'm not a potential employer who you're petitioning, and I'm not one of your superiors whose butt you have to kiss. It's not only retail workers who are required to say this -- it's panhandlers (who have no boss giving them orders) and everyday people on the train or other public spaces. In the retail setting, their bosses aren't just giving them pointless orders to address customers more formally -- it's likely because today's customers expect that level of formality from the workers.
I wish it were just a sign that I'm considered an old geezer by the young 'uns, but they all think I'm in my early or mid 20s, and besides, they address one another that way too. Generally girls call guys their own age "Sir" in the right context, while guys tend to reserve "Sir" for older men.
I'd be fine with no address at all in the retail setting, which is where you hear it most often. "Is there anything I can help you find?" -- I don't need to hear "Sir?" at the end. "Sir" makes it feel more like a patron-client relationship, rather than them being there to help out another person like themselves. Again, I don't blame the workers so much, although I'm sure that as members of today's world, they too prefer that kind of detachment rather than talking to strangers in a friendly way. It's due more to distant customers these days wanting more, well, distance between themselves and the workers.
If they're going to use some kind of address, what ever happened to "Mister"? That sounded more informal because people used it not only in a polite context -- "Need any help carrying those bags, Mister?" -- but also in a talking-down or putting-in-their-place context -- "And just where do you think you're going, Mister?" It was a leveling form of address, not used only toward superiors. Even when it was used toward elders or superiors, it just sounded more like someone you know from the neighborhood, like "Mr. Callahan from next door." It made young people sound less groveling and pathetic when they addressed their elders, and didn't make the elders feel like distant old fogies who needed to be carefully addressed in order to protect their ego.
Plus, "Mister" allowed all manner of ironic usages among folks who already knew each other. One friend tries to back out of a commitment to another one, and he replies jokingly but seriously with, "Woah, guess again, Mister. C'mon, there'll be girls there -- it'll be cool." It's like a friendly little poke in the ribs.
Or a girlfriend might refer to her boyfriend playfully as though he were a stranger. Remember at the end of Back to the Future, when Marty McFly is trying to appreciate what a sweet ride his new black truck is? His girlfriend Jennifer walks up behind him, pauses, and teasingly asks, "...How 'bout a ride, Mister?" It's such a warm way to flirt because "Mister" calls subtle attention to the fact that the two of you are in fact already close to each other.
And of course, sometimes the usage was intentionally underscoring the fact that the two were total strangers -- as in, "strangers in the night," or "relying on the kindness of strangers." Like some babe asking for a ride home: "You goin' my way, Mister?" Or when she recognizes that you're new to the nightclub or bar: "Say Mister, don't believe I've seen you around here before..." It's acknowledging the lack of a former acquaintance, while treating it as no big deal, nothing that we ought to let stand in the way of hitch-hiking, dancing, or what have you.
It really was a versatile form of address, and it'll be a sign of changing times once we start to hear it (or something like it) in common speech again. Until then, for God's sake, don't call me "Sir." It takes you right out of the moment.
You still hear "dude" and "man" among youngish males, but even there it's not as common as it used to be. "Dude" is now more of an interjection, like "Wow!" Hence, "Dude! You'll never believe what I just saw..." It expresses the speaker's feelings rather than address another person in an informal way, a la "What's up dude?" "Not much, dude. How 'bout you dude?" (No one claimed that informality would make you sound smarter.) It seems like "man" is the only one left, like a fast food worker who gives you your order with a "Here you go, man." "Cool. Thanks, man."
The change that really rubs me the wrong way, however, is the ubiquity of "Sir" -- like, I'm not royalty, I'm not a potential employer who you're petitioning, and I'm not one of your superiors whose butt you have to kiss. It's not only retail workers who are required to say this -- it's panhandlers (who have no boss giving them orders) and everyday people on the train or other public spaces. In the retail setting, their bosses aren't just giving them pointless orders to address customers more formally -- it's likely because today's customers expect that level of formality from the workers.
I wish it were just a sign that I'm considered an old geezer by the young 'uns, but they all think I'm in my early or mid 20s, and besides, they address one another that way too. Generally girls call guys their own age "Sir" in the right context, while guys tend to reserve "Sir" for older men.
I'd be fine with no address at all in the retail setting, which is where you hear it most often. "Is there anything I can help you find?" -- I don't need to hear "Sir?" at the end. "Sir" makes it feel more like a patron-client relationship, rather than them being there to help out another person like themselves. Again, I don't blame the workers so much, although I'm sure that as members of today's world, they too prefer that kind of detachment rather than talking to strangers in a friendly way. It's due more to distant customers these days wanting more, well, distance between themselves and the workers.
If they're going to use some kind of address, what ever happened to "Mister"? That sounded more informal because people used it not only in a polite context -- "Need any help carrying those bags, Mister?" -- but also in a talking-down or putting-in-their-place context -- "And just where do you think you're going, Mister?" It was a leveling form of address, not used only toward superiors. Even when it was used toward elders or superiors, it just sounds more like someone you know from the neighborhood, like "Mr. Callahan from next door." It made young people sound less groveling and pathetic when they addressed their elders, and didn't make the elders feel like distant old fogies who needed to be carefully addressed in order to protect their ego.
Plus, "Mister" allowed all manner of ironic usages among folks who already knew each other. One friend tries to back out of a commitment to another one, and he replies jokingly but seriously with, "Woah, guess again, Mister. C'mon, there'll be girls there -- it'll be cool." It's like a friendly little poke in the ribs.
Or a girlfriend might refer to her boyfriend playfully as though he were a stranger. Remember at the end of Back to the Future, when Marty McFly is trying to appreciate what a sweet ride his new black truck is? His girlfriend Jennifer walks up behind him, pauses, and teasingly asks, "...How 'bout a ride, Mister?" It's such a warm way to flirt because "Mister" calls subtle attention to the fact that the two of you are in fact already close to each other.
And of course, sometimes the usage was intentionally underscoring the fact that the two were total strangers -- as in, "strangers in the night," or "relying on the kindness of strangers." Like some babe asking for a ride home: "You goin' my way, Mister?" Or when she recognizes that you're new to the nightclub or bar: "Say Mister, don't believe I've seen you around here before..." It's acknowledging the lack of a former acquaintance, while treating it as no big deal, nothing that we ought to let stand in the way of hitch-hiking, dancing, or what have you.
It really was a versatile form of address, and it'll be a sign of changing times once we start to hear it (or something like it) in common speech again. Until then, for God's sake, don't call me "Sir." It takes you right out of the moment.
Categories:
Language,
Psychology
September 22, 2013
Gelatin in meals
"Where did Aspic go?" asks a post at chow.com. If it was dominant in the 1950s, '60s, and even into the '70s, before falling off a cliff in the '80s, then it sounds like another casualty of the low-fat, low-cal, crypto-vegetarian movement. That began in the later half of the '70s, picked up yuppie appeal during the '80s, and was a fait accompli by the '90s. Not surprisingly, all the carbs that replaced fat in the diet caused the obesity epidemic that we're still plagued by.
My mother wasn't that into the trend when I was growing up, so I still got to eat real animal foods when she cooked. But I don't remember savory gelatin dishes being one of them. Perhaps it was more of a suburban Fifties phenomenon, while she grew up in backwoods Appalachia, where they still had an outhouse into the 1960s. Though I definitely remember how popular the fruit-in-jello squares remained well into the '80s.
Here is another post on "The Icky Era of Aspic," showing pictures from old cookbooks. At the end it floats the lame sociological explanation that it was all just a way of showing off the fact that you owned a fancy, newfangled refrigerator. Doubtful -- there's all kinds of better ways to show that off. Serving cold-cuts that were still cold. Serving ice cream that still felt icy.
The fact that so many of them were made in those space-age looking molds suggests a more, well, Space Age motivation. It allowed you to signal your commitment to the ideal of the "meal in a pill" Way Of Tomorrow, without having to sacrifice basic wholesomeness. The pieces of meat and vegetables are recognizably fit for traditional human nourishment and enjoyment, while the look of being suspended in gel -- geometrically molded gel -- gives it the requisite futuristic cachet.
Finally, here are two more paleo recipes from Mark's Daily Apple, one savory and the other dessert.
If you're pressed for time or money, Knox gelatin packets are fairly cheap and easy to use to get more collagen in your diet. Just mix it with water or stock, and layer it on top of cold cuts like ham, turkey, or chicken that tend to have no fat these days. I've used an entire envelope with much less liquid to give it a more collagen-y than aspicky feel. Layer some canned carrot slices on top, drizzle with olive oil, sprinkle salt and pepper -- simple and delicious midnight / before bed snack.
I ought to try using a raw egg yolk as the liquid for the gelatin. The gelatin by itself puts your brain in a more relaxed, tranquil state, but it doesn't give you that euphoric, touchy-feely feeling that eating a couple raw egg yolks does.
Food is the one area of cultural life that was better before the '80s, at least as far as nutrients go. One thing that stands out after people began to move away from an animal-rich diet is how their hair doesn't look as lustrous as it did in the '60s or '70s. Eighties hair still had lots of volume, but it looks a bit finer or drier, and more obviously held out by tons of hairspray. Back when everyone was eating a pound or so of gelatin a week at dinner, all that collagen gave them richer looking skin and hair.
Categories:
Food,
Generations,
Health
September 17, 2013
"Honestly," "literally," "seriously" -- further mutations
An earlier post looked at the recent growth in slang words that loudly signal to the audience that you're telling the truth, worried that by default they'd think you were a liar or exaggerating. These words are a sign of how little trust young people have in each other anymore, perhaps also because they know that they themselves normally are not very trustworthy.
In the good old days, you didn't need to spastically wave a flag to let your conversation partner know that you were about to switch gears into truth-telling mode. But now that helicopter parenting has prevented peer socialization from proceeding as normal, the Millennial generation has settled into an unbreakable habit of exaggerating or lying as their default setting. Your nuclear family, guided by genetic self-interest, will always forgive you for being "the boy who cried wolf," whereas the other kids at school will ostracize you and give you an enduring stigma if that's how you act around them.
Given how recent these developments are (~ past 20 years), it's always worth checking up on their current status. They are not long-lasting features of the English language, but part of a thriving phenomenon, unfortunately.
I just spotted a new member of the family, albeit in a British context (source). Some weird dude in a clown suit is trying to shock the straights in Northampton. How does Twitter user Alex Wynick let the world know that he doesn't get the unsettled response of the townspeople?
Genuinely do not understand the fear of the Northampton Clown. It's just a guy with too much time on his hands.
I haven't heard "genuinely" in America yet, but just give it a little time for our SWPLs to begin imitating their posh patois.
Where does it go from here? I mean, not gonna lie, I feel like sometimes, I honestly don't even know how many more words like that we can use without feeling, actually-kind-of retarded. Right? "I authentically do not understand the fear of the Northampton Clown." Or perhaps, "I artisanally do not understand the fear of the Northampton Clown."
If they can't squeeze in many new members, they are embellishing on the ones already there. These words never had any semantic content for the utterance, but were more of a meta-linguistic social signal to the listener -- "Hey, be prepared, I'm going to switch into realtalk mode." So it's not surprising that they've morphed into a stylized tag at the beginning of the sentence, such as "I honestly I _____." They give a quick signal of realtalk mode, then they get to the meaning they're trying to get across.
Here are a couple examples of "I honestly I..." from separate articles found through a search of the NYT:
Searching google for "I literally I..." gives a bunch of examples. E.g.:
Googling "I seriously I..." gives:
These sound ungrammatical if you hear the "I honestly" thing as part of the meaning-expressing words of the utterance. "Honestly" etc. are adverbs, and you've already got a subject "I," so you expect to hear just a verb and whatever else afterward (the predicate). It's jarring to hear Subject Adverb Full-Sentence. But if you understand that "I honestly" is just a stylized meta-linguistic tag, it goes through the brain sounding acceptable.
I (honestly) wonder if another reason behind all of these meta-linguistic signals of the speaker's credibility, is the decline in face-to-face interactions among young people in particular. Or their stunted "emotional intelligence" even when they do interact in person. You'd think they could communicate the meta-linguistic thing through facial expression or gesture, and leave the meaning to the mouth.
Like, it's not hard to twist your face into a "joking around" or "to be taken seriously" expression. But if you get so little practice sending and interpreting subtle expressional signals like that, you might as well use speech.
Also, kids who get so little practice with facial expression and gestural modes of communication wind up over-doing it on those rare occasions where they do try to make a different face from "total apathy and lack of involvement." Hence the ubiquitous cancer of kabuki faces, from mass entertainment to portrayals of children in pop culture. Only stunted children with no experience make faces of such unrefined caricature.
Related: a post on the growth in slang words that show how suspicious you find everyone, everything, and every place.
In the good old days, you didn't need to spastically wave a flag to let your conversation partner know that you were about to switch gears into truth-telling mode. But now that helicopter parenting has prevented peer socialization from proceeding as normal, the Millennial generation has settled into an unbreakable habit of exaggerating or lying as their default setting. Your nuclear family, guided by genetic self-interest, will always forgive you for being "the boy who cried wolf," whereas the other kids at school will ostracize you and give you an enduring stigma if that's how you act around them.
Given how recent these developments are (~ the past 20 years), it's always worth checking up on their current status. They are not long-established features of the English language, but part of a phenomenon that is still thriving and mutating, unfortunately.
I just spotted a new member of the family, albeit in a British context (source). Some weird dude in a clown suit is trying to shock the straights in Northampton. How does Twitter user Alex Wynick let the world know that he doesn't get the unsettled response of the townspeople?
Genuinely do not understand the fear of the Northampton Clown. It's just a guy with too much time on his hands.
I haven't heard "genuinely" in America yet, but just give it a little time for our SWPLs to begin imitating their posh patois.
Where does it go from here? I mean, not gonna lie, I feel like sometimes, I honestly don't even know how many more words like that we can use without feeling, actually-kind-of retarded. Right? "I authentically do not understand the fear of the Northampton Clown." Or perhaps, "I artisanally do not understand the fear of the Northampton Clown."
If they can't squeeze in many new members, they are embellishing on the ones already there. These words never had any semantic content for the utterance, but were more of a meta-linguistic social signal to the listener -- "Hey, be prepared, I'm going to switch into realtalk mode." So it's not surprising that they've morphed into a stylized tag at the beginning of the sentence, such as "I honestly I _____." They give a quick signal of realtalk mode, then they get to the meaning they're trying to get across.
Here are a couple examples of "I honestly I..." from separate articles found through a search of the NYT:
I honestly I don't know when military force is justified, I think the people in the military are treated very wrong...
I honestly I don't care what the government does. They are all grown adults and they are perfectly capable of making decisions.
Searching google for "I literally I..." gives a bunch of examples. E.g.:
I literally i was crying like stupid when i saw this !
Sorry Tamsin, I literally I thought I was the only admin, and the owner said I could share my page as many times a I like because it got hacked :(
Googling "I seriously I..." gives:
Okay, I seriously I have no idea what I am doing at the moment, basically, I'm just following the example on what's given for the exercise.
Alright, I seriously I don't understand these Thieves Guild jobs ...
These sound ungrammatical if you hear the "I honestly" thing as part of the meaning-expressing words of the utterance. "Honestly" etc. are adverbs, and you've already got a subject "I," so you expect to hear just a verb and whatever else afterward (the predicate). It's jarring to hear Subject Adverb Full-Sentence. But if you understand that "I honestly" is just a stylized meta-linguistic tag, it goes through the brain sounding acceptable.
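If you want to hunt for the stylized tag in your own reading material, a throwaway regex is enough. This is just a sketch -- the word list and the helper name are arbitrary, not anyone's official tool.

```python
import re

# Throwaway sketch: flag the stylized "I honestly/literally/seriously I ..."
# tag in a chunk of text. The word list is just the family discussed above.
TAG = re.compile(r"\bI\s+(honestly|literally|seriously|genuinely)\s+I\b",
                 re.IGNORECASE)

def find_realtalk_tags(text):
    # Return every Subject-Adverb-Subject tag found in the text.
    return [m.group(0) for m in TAG.finditer(text)]

print(find_realtalk_tags("Okay, I seriously I have no idea what I am doing."))
# -> ['I seriously I']
```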
I (honestly) wonder if another reason behind all of these meta-linguistic signals of the speaker's credibility is the decline in face-to-face interactions among young people in particular. Or their stunted "emotional intelligence" even when they do interact in person. You'd think they could communicate the meta-linguistic thing through facial expression or gesture, and leave the meaning to the mouth.
Like, it's not hard to twist your face into a "joking around" or "to be taken seriously" expression. But if you get so little practice sending and interpreting subtle expressional signals like that, you might as well use speech.
Also, kids who get so little practice with facial expression and gestural modes of communication wind up over-doing it on those rare occasions where they do try to make a different face from "total apathy and lack of involvement." Hence the ubiquitous cancer of kabuki faces, from mass entertainment to portrayals of children in pop culture. Only stunted children with no experience make faces of such unrefined caricature.
Related: a post on the growth in slang words that show how suspicious you find everyone, everything, and every place.
Categories:
Age,
Cocooning,
Generations,
Language,
Media,
Over-parenting,
Pop culture,
Psychology
September 16, 2013
Blonde hair and religious affiliation
In an earlier post on the great civilizational fault-line in Europe, I proposed that one salient marker of the split between Nordo-Balto-Slavic people on the one hand and Italo-Celto-Germanic people on the other is blonde hair (more prevalent among the Plains People, less among the hilly / mountain people).
One of the main differences emphasized in that post was the strength of religious involvement -- blondes don't seem as into their religion as dark-haired people are. It seems like the blonde parts of Europe have been gradually shifting away from religious engagement since the Lutheran Reformation, with unusually large fractions of those regions being pretty uncommitted nowadays, and not holding things as sacred as they used to. Blondes seem more concretely oriented toward making a living and providing for their nuclear family, devoting less time to broader community involvement in a sacred setting, something that would really bring out the "collective effervescence" in a community.
I'll be looking in finer detail at hair color and religiosity in future posts. For now, let's start with a simpler goal -- how blonde are various religious groups? We should only look at people who come from places where hair color varies enough to have a fair number of blondes. Otherwise we'll be confounding hair color with ethnicity -- dark-haired Italians vs. blonde Swedes, for example. We want to look only at hair color, so I'm restricting the data to white Americans of English and German descent. That will also include all the major religious groupings. (I'm excluding Jews because they're not who I mean by English and German, even if they came here from there.)
The sample is the National Longitudinal Study of Youth (1979), a nationally representative group who were born from the late '50s through the mid '60s, and who began to be studied in the late '70s, with regular follow-up surveys conducted throughout their lives.
At the outset they were asked what religion they were raised in, and a little later they were categorized by hair color. I've merged "light blonde" and "blonde" into a single "blonde" group, and "dark brown" and "black" into a single "dark" group. The extremes are quite small in number, and it's not clear how meaningful the distinction was in the interviewer's mind. They were probed for clarification if they said they were raised Protestant -- e.g., Methodist, Lutheran, etc. The category here labeled "Protestant" is a catch-all for anyone not in the major denominations. I'm assuming most of them are evangelical, charismatic, and so on. "Other" I guess could mean Orthodox, Hindu, Buddhist, or whatever else parents might have been raising their kids as during the '60s and '70s.
Here is the percent of each group that has blonde hair:
The Episcopalians are more than twice as blonde as the catch-all Protestants. Lutherans are also noticeably blonder than the others. As are those children raised as godless heathens. If blonde birds of a feather flock together, this confirms the hunch that Episcopalians are the closest to atheists among mainstream groups. But it also hints at a similar streak among Lutherans, who most readers probably don't have much of a clear idea about.
We can see that it's not Protestantism per se because the catch-alls are the least blonde, with Methodists and Presbyterians being pretty dark as well. Catholics and Baptists are equally blonde.
Rather, it seems like the more historically high-commitment groups are darker haired, while the less involved and lenient groups are much blonder. Catholicism is already fairly high in commitment; the historically radical groups were even more so. One of Luther's main reforms was to shift focus away from performing acts of penance to atone for your sins, and toward faith alone as the path toward salvation. And the lax nature of Episcopalian doctrine and practice (at least by the late 1970s) is a widespread stereotype.
Is this something about non-dark hair? Nope. The table below shows the percentages of blondes, darks, and redheads in each group (light browns are the remainder, not shown). The most redheaded group is at the top.
Religion raised | Blonde (%) | Dark (%) | Red (%)
Presbyterian | 17.1 | 60.6 | 6.9
Baptist | 19.0 | 57.4 | 4.6
Methodist | 18.0 | 55.2 | 4.1
Lutheran | 24.9 | 51.5 | 3.5
Protestant | 14.2 | 61.8 | 3.3
Catholic | 19.1 | 58.1 | 3.2
Episcopalian | 29.5 | 52.4 | 2.9
None | 27.6 | 51.9 | 2.6
Other | 19.5 | 57.8 | 1.9
Redheads are the least common in godless, Other, and Episcopalian groups, but most common in Methodist, Baptist, and Presbyterian groups. Folks raised Presbyterian are nearly 3 times as redheaded as those raised with no religion. The general trend is for higher-commitment and radical groups to have more gingers. That fits the stereotype of fiery redheads, from Tacitus' description of the Germanic tribes of the ancient world, to the medieval Vikings, to present-day Yosemite Sam. In case we weren't sure before, now we know that red hair cannot be meaningfully grouped with blonde hair ("light hair") when it comes to social and cultural traits.
Remember that all these people are whites of English or German origin. Dark-haired French and blonde Lithuanians are not mixing up the results by introducing ethnicity as a confound. It does seem like there are two populations being studied, but it's hair color rather than ethnicity or nationality that distinguishes them. When the genetic mutations for blonde hair first appeared, they must have been doing something, and now it looks like a retreat from religious involvement is a key part of that new variety of homo sapiens. That really would have been a novel character trait, given how ubiquitous religion has been across the world throughout human evolution.
In future posts, we'll take a look at other hints from the NLSY that blonde hair color marked the beginning of the slow secularization of the groups that it caught on in.
NLSY variables: R00096.00 (ethnic origin), R02147.00 (race), R17741.00 (hair color), R00103.00 (religion raised in)
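For anyone who wants to reproduce the tabulation, here's a rough sketch in Python, assuming the four variables above were exported to a CSV with their R-numbers as column headers. The file name, the text labels for each code, and the friendly column names are my own assumptions, not the official NLSY export format.

```python
import pandas as pd

# Minimal sketch, assuming the NLSY79 variables listed above sit in a CSV
# keyed by their R-numbers. File name and code labels are assumptions.
df = pd.read_csv("nlsy79_extract.csv").rename(columns={
    "R00096.00": "ethnic_origin",
    "R02147.00": "race",
    "R17741.00": "hair_color",
    "R00103.00": "religion_raised",
})

# Whites of English or German descent only, so ethnicity doesn't confound hair color.
df = df[df["race"].eq("white") & df["ethnic_origin"].isin(["English", "German"])]

# Merge the extreme hair-color codes, as described in the post.
df["hair"] = df["hair_color"].replace(
    {"light blonde": "blonde", "dark brown": "dark", "black": "dark"})

# Percent blonde / dark / red within each religion-raised group.
pct = (df.groupby("religion_raised")["hair"]
         .value_counts(normalize=True)
         .mul(100)
         .unstack(fill_value=0)
         .round(1))
print(pct[["blonde", "dark", "red"]].sort_values("red", ascending=False))
```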
Categories:
Human Biodiversity,
NLSY,
Psychology,
Religion
September 15, 2013
Two different forms of trust: showing charity or feeling security
You read and hear a lot about "trust," whether it's social science literature or everyday conversation, when the topic is how people behave toward strangers and what they believe about strangers in general.
I think there are two fundamentally different types of "trust" being described in those writings and conversations, and it can confuse the audience enough that it's worth separating them.
The first kind of trust is a feeling of security -- you trust your neighbors, in the sense that you don't believe they're likely to do you harm, steal your stuff, spread nasty gossip about you, etc. Other people might do those things, but not your neighbors. You trust them. Or, if you don't trust your neighbors, it's because you believe they're likely to harm you in any of those ways.
The second kind is more about behavior than belief -- you show charity toward someone because you trust them, in the sense that you might have reasons to be suspicious, but you're setting aside those reasons, and giving them the benefit of the doubt.
Say one of your friends has a tendency to break small promises about 1/4 of the time, and now is the first time that the promise is a big one, so you can't judge based on an equivalent case in the past. If you trust him to keep his word in this unfamiliar case, you're acknowledging the risk but setting it aside (not denying that the risk actually exists). If you don't trust him, you can't bring yourself to give him the benefit of the doubt.
A sub-class of this second type is hospitality -- you always have reasons to be wary of guests you're hosting because they could take advantage of your letting your guard down. Ditto for the guest -- they always have reasons to be suspicious that their host could have ulterior motives for acting so generous. But the guest and host are putting those reasons aside, and trusting each other.
Putting aside these risks is not naive -- you recognize that they're there, but you are showing your faith that they won't blow up in this case.
Charity, hospitality, faith, compassion, mercy, forgiveness -- you get the idea of what this second kind of trust is about. Security, comfort, ease of mind, carrying on worry-free -- that's what the first kind of trust is about.
You read a lot about how the Nordic and Scandinavian countries are "high trust," but I think it must be in the sense of security and comfort. One Swede, surrounded by a lot of other Swedes, feels safe from possible harm or wrongdoing.
But what about charity and hospitality? My favorite example of hospitality in the modern world is hitch-hiking -- that's a solid example that no one will dispute as being a guest-host relationship. And you have to have pretty high trust (in the second sense) to pull it off. Otherwise, the driver thinks that the hitch-hiker might try to rob or kill them. Or the would-be hitch-hiker decides against asking for a ride because he assumes that the drivers will hit him up for money, take him hostage, kill him, or whatever.
Fortunately there's a website, Hitchwiki, that provides information to travelers about how easy or difficult it is to hitch-hike around the world. Each country has their own entry.
No strong consensus comes up for Finland. How about the liberal utopia just to the west? "Many say that hitching in Sweden sucks." Also: "Norway is not an easy country to hitch in ... Like in Sweden, foreign tourists and immigrants are more likely to pick up hitchhikers." Denmark is the only country that sounds close to Britain or America: "Most drivers are very friendly and hospitable." From my limited experience, I buy the idea that Danes are more extraverted and hospitable than Swedes or Norwegians. No clue about Finns, though.
It may not be impossible to rely on the hospitality of strangers in Scandinavia, but it sure sounds more difficult than in other Western countries. The entries for Norway and Sweden are defensive, trying to convince you that it's not hopeless, and again that most of the host drivers will not be native Scandinavians.
On the discussion page for Norway, there's a good explanation of why hospitality is not very strong in socialist Scandinavia:
...the very good welfare system in scandinavia. People are not used to help each other, because the goverment takes care and there is no need for solidarity (even the homeless in stockholm have a mobile haha) to each other.
Hospitality is a face-to-face, approaching form of trust. Scandinavians don't want unfortunate people standing on the side of the road waiting for a ride -- but they think the solution should come from the government, not from the everyday driver. This is distancing and avoidant -- outsource the job to a remote group like a bureaucracy, and the problem will be solved without the average citizen having to take part personally.
By the way, I don't think that state involvement erodes hospitality, but the other way around -- people who are uncomfortable providing hospitality find an alternate solution, creating a state bureaucracy to handle the job instead.
You see this in Sweden even more clearly with their immigration problem. They don't have a general fear of foreigners, or they wouldn't let so many in. Rather, they are OK with them arriving -- as long as they can be handed over to government workers to be taken care of. Letting a foreigner into the country who will be supported by the welfare system -- OK. Picking up a foreign hitch-hiker -- no way.
What countries do show high trust in the form of hospitality?
Iran is a very friendly country. [...] Finding a place to sleep in Iran is generally as easy as knocking the first door you come across. If you get tired of the unrelenting hospitality however, the city parks offer an excellent alternative. Many parks, even in big cities, are designated as camping zones, with toilets open all night. Camp fires are tolerated, but it's best to ask before.
Also:
Turkey is an extremely hitchhiking-friendly country. [...] You will never have to worry about lack of food in Turkey. Many truck drivers have coffee makers in their truck. Turkish people are very generous, and it is rare that you will get a lift without a driver offering you food. ... The tea (black tea or apple tea in Istanbul) is the national drink, and almost all the people that you meet offer you a tea − this is probably the most common way of showing you their hospitable culture.
The flip-side to the "culture of honor" is the culture of hospitality. You do unto others as they do unto you, though starting off with a caring gesture. If it continues, you're in the hospitality side of their culture. If you show them disrespect, you've started a possible feud, and are now in the honor-based side of their culture. Both of these seemingly different treatments are in fact the same -- an obsession with reciprocation, face-to-face.
Cultures of honor (and hospitality) are most common among pastoralist people, most famously around the Mediterranean and through the Middle East. Cultures of law (and order) are found more among sedentary agrarian farmers, with East Asia being the best example. Western Europe reflects both traditions because its people come from agro-pastoralist stock.
The two cultures have different ways of helping out a stranded stranger -- the one in a more personal, hospitable way, and the other in a more distancing and bureaucratic way. Evidently, these two approaches create two different forms of trust -- one where average people give the stranger the benefit of the doubt, and the other where they outsource it to third parties and rest safe and secure knowing that someone else is taking care of the problem.
Categories:
Cocooning,
Crime,
Economics,
Human Biodiversity,
Morality,
Politics,
Psychology
September 14, 2013
Generational stability and turnover in the American presidency
My realtalk buddy brought up something I hadn't thought about before. He said the transition from Bush I to Clinton was abrupt because there was a 22-year difference in their birth years. Here is a list of Presidents that includes their birth year.
Clinton was the first Boomer president, beginning a reign of at least 24 years for that generation. It's odd that there isn't random variation in generational membership from one president to the next -- instead, for long stretches they're all drawn from a single generation.
Had that happened before? In fact, starting with John F. Kennedy in 1961, there was a 32-year reign of Greatest Gen presidents, with 16 years separating the youngest from the oldest. And it was not a simple story of later-born members succeeding the earlier-born. Johnson was older than his predecessor Kennedy, and Reagan was older than Carter. It didn't go in reverse order either, as Nixon was younger than Johnson, Ford younger than Nixon (by less than a year), Carter younger than Ford, and Bush I younger than Reagan. It's as though voters wanted somebody from the Greatest Gen to lead the country during the '60s, '70s, and '80s, and it didn't matter where exactly in the generation they were born.
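To make the "quantum leap" idea concrete, here's a quick sketch that prints the birth-year jump from each president to the next. The list is abbreviated to the stretch discussed here, purely for illustration.

```python
# Quick sketch: birth-year jumps between successive presidents, to spot the
# quantum leaps between generational blocks. Abbreviated, illustrative list.
presidents = [
    ("Eisenhower", 1890), ("Kennedy", 1917), ("Johnson", 1908),
    ("Nixon", 1913), ("Ford", 1913), ("Carter", 1924), ("Reagan", 1911),
    ("Bush I", 1924), ("Clinton", 1946),
]

for (prev, py), (cur, cy) in zip(presidents, presidents[1:]):
    print(f"{prev} ({py}) -> {cur} ({cy}): {cy - py:+d} years")
# The +27 jump to Kennedy and the +22 jump to Clinton stand out; the moves
# within the Greatest Gen block bounce around between -13 and +13.
```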
Fun fact: there's a 2-year period from January 1987 to January '89 when all major leaders were Greatest Gen -- Reagan as President, Bush I as Vice President, Robert Byrd as Senate Majority Leader, and Jim Wright as Speaker of the House.
Notice the total lack of Silent Gen presidents -- guess they had them figured out back in 1951. And it's not for lack of contestants either -- Mondale, Dukakis, Perot, Kerry, and McCain were all Silents, spanning the entire generation from the second half of the '20s to the first half of the '40s. Yet all lost. In the next election in 2016, the youngest of them will be in their 70s, which is not unheard of, but still makes it highly unlikely that there will ever be one. (Drawing the contemporary analogy, hopefully that means that we'll never be ruled by a Millennial president.)
Again we see that it's not a simple story of succession of one cohort after another, or the Silents would've been in there at some point.
By the way, Silents have snuck into lesser offices. Five terms of Vice Presidents have been filled by them (Mondale, Cheney, and Biden). For 23 years off and on since the early 1980s, a Silent has been the Senate Majority Leader (Baker, Mitchell, Lott, and Reid). And from June of 1989 to January 2011, the Speaker of the House was a Silent. This last office seems to show the simple story of gradual succession of birth cohorts. It is also the least symbolic.
Before the Greatest Gen reign, there's a 28-year block of presidents (FDR, Truman, and Eisenhower) born between 1882 and 1890, before the quantum leap to Kennedy.
Earlier still, there's a 32-year stretch from Theodore Roosevelt to Hoover, who were born within 18 years of each other, although it was not such a huge jump from Hoover to FDR.
But there is another big gap before Theodore Roosevelt, who was 15 years younger than his predecessor McKinley, who himself was part of another block going back to Grant who were born within 21 years of each other.
Then another big gap between Grant (1822) and Johnson (1808), who is part of an earlier 28-year block of presidencies held by men born in the late 1700s and early 1800s, going back to John Tyler (1790), with Zachary Taylor (1784) being the only outlier in the block.
I don't pretend to know what any of this means, but it's worth looking into quantitatively, and more importantly trying to figure out what leaves certain generations in or out, and when they are in or out of demand. There is a pretty clear link of the timing of the blocks to the timing of the crime rate -- the rising-crime period of circa 1900 to 1933, the falling-crime period from then until 1958, the rising period from then until 1992, and the falling period since.
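As a rough way to play with that link, here's a sketch that labels which crime phase(s) overlapped a president's childhood, using the phase boundaries just given. The ages-5-to-15 window and the 2020 endpoint for the current falling phase are my own assumptions.

```python
# Sketch of the formative-years idea: which crime phase(s) covered a
# president's childhood? Phase boundaries follow the post; the age window
# (5-15) and the 2020 endpoint of the current falling phase are assumptions.
CRIME_PHASES = [
    (1900, 1933, "rising"),
    (1933, 1958, "falling"),
    (1958, 1992, "rising"),
    (1992, 2020, "falling"),
]

def formative_phases(birth_year, start_age=5, end_age=15):
    # Return every phase that overlaps the ages start_age..end_age.
    lo, hi = birth_year + start_age, birth_year + end_age
    return [label for begin, end, label in CRIME_PHASES if lo < end and hi > begin]

for name, born in [("Kennedy", 1917), ("Reagan", 1911),
                   ("Clinton", 1946), ("Obama", 1961)]:
    print(name, formative_phases(born))
```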
Rising-crime times select presidents who spent most of their childhood and perhaps adolescence in an earlier rising-crime period -- in particular, near the end / peak of the wave of violence. Theodore Roosevelt through at least Harding were born near a peak of violence in America that is most familiar from the Civil War (although it's more useful to think of the year-to-year trend in violence than of episodic wars as the defining events). Silent Cal and Hoover may have been born a little too late to experience that wave as children, but I'm not totally sure.
But FDR, Truman, and Eisenhower definitely were. They were children during the Gilded Age, though Eisenhower as an adolescent experienced the very beginnings of the early 20th-C. crime wave. Overall, though, not that much exposure to rising crime while growing up, and certainly not during phases of the crime cycle that are near the peak. This is the generation thought best to lead during the falling-crime mid-century era. We don't need a wild child like Teddy Roosevelt anymore, things have calmed down, and someone whose formative years were during the calm Gilded Age seems more appropriate.
Once the crime rate began climbing in 1959, voters thought it was time for a change -- someone whose formative years were marked by soaring violence levels and gangster rule, who might have some kind of intuition about how to handle the problem. So all of a sudden, people who grew up during the 1910s and the Roaring Twenties (which lasted into the early '30s) were the right people for the job.
As crime rates peaked in 1992, voters were ready for a return to presidents whose formative years were in calmer times. People who grew up in the post-War, Dr. Spock, Leave It to Beaver world. Obama is somewhat of an anomaly in that respect, though his early years were only at the beginning of the crime wave. The second half of a crime wave feels qualitatively different than the first half -- compare 1959 through 1975 vs. 1976 through 1992. So again it may relate to growing up closer to the peak of the wave.
When crime rates begin rising sometime around 2020, we'll probably choose a president from Generation X, who may be around 50-55 years old, or perhaps one of the rarer young presidents if he was born well into the '70s. (Of course I could be off by an election.) I predict we'll stick with Gen X presidents all through the next crime wave, just as we stuck with Greatest Gen presidents during the last wave.
And when things calm down, it'll skip the Millennials, whose sheltering during their formative years will keep them from seeming right for the highest office. It'll make another quantum leap to whatever the generation after Millennials will be, some of whom may already be born but not more than about 5 years old yet. They'll have grown up mostly in the post-9/11, neo-Dr. Spock, Return of the Leave It to Beaver world, but also have some adolescent experience with the coming crime wave.
People grossly underestimate the importance of a person's formative years -- they don't call them that for nothing.
Categories:
Age,
Crime,
Generations,
Over-parenting,
Politics,
Psychology,
Violence
September 12, 2013
Even one-night stands had distinct identities back in the '80s
Speaking of how young people don't have very strong, distinct identities anymore, what about an extreme test of the idea -- did even the most fleeting encounters leave an impression of the other person's distinctiveness? I'll leave it to another New Wave fave to settle the matter:
Can you imagine a similar song being so well crafted these days, let alone make it into the one-hit wonder hall of fame? Young people's identities are too under-developed to write songs about, and they're too distant toward each other to notice even if they did meet someone who makes an impression.
"Little Red Corvette" is a related example of how vivid an individual's identity used to be, regardless of how little time you spent with them.
(Worth reminding folks that when young people use the term "hook up," they're posing as more experienced than they really are. What actually happens when "hooking up" is what we would've called a make-out session -- kissing, but probably not sex, which only takes place in 1 out of 4 hook-ups.)
Categories:
Dudes and dudettes,
Generations,
Music,
Psychology
Why don't young people have strong identities anymore?
An identity goes beyond a temperament or a personality -- even toddlers have those. And I don't mean a subjective, conscious awareness of what makes you you. Just on a purely descriptive, observational level, young people are so damn interchangeable these days. Worse -- interchangeably dull and drab.
At best they form into loose clumps of peers that share a common interest (role-playing games, souped-up cars, cheerleading, etc.). But they don't really have their own roles or niches carved out within these already poorly held-together groups. No division of labor, no distinctiveness.
At worst they're totally socially isolated, and that includes the ones who still interact with their nuclear family members every day. That's being familial, not being social. These are the most robotic. And they are hard to distinguish from their siblings, all minor variations on the same theme.
Folks who'd grown up during the Jazz Age said the same thing about the Silent Generation in the 1950s. They didn't accuse them of acting bratty and entitled like we do today, probably because inequality was falling back then and hyper-competitiveness was taboo, while today inequality is widening and squeaky wheel-iness encouraged.
Still, their unassertiveness and lack of a strong sense of identity was a common observation. They would serve in the military if summoned to do so, and they'd go to church if that was expected of them, but none of those things (or other things) was a really integral part of their identity, something they wouldn't need to be called upon to contribute to. No passion or drive. No commitment to an identity -- just keep deferring the "decision," and you won't have to go through the awkwardness of risking failure. (Again I don't mean that any of this is conscious.)
All sorts of related changes are conspiring to produce an entire generation of quasi-Chinese humanoids:
- General climate of conformity and not sticking out. I don't think this is so strong of a factor in weakening their sense of identity, because no climate is so conformist that it allows no distinctiveness whatsoever.
- Social isolation from peers. Your own nature comes into sharper focus when contrasted against the natures of other people you interact with. Also, there are only so many spots open in a peer group for a certain role, so you may have to explore some other set of traits to find yourself. Identity formation is not merely honing what is already there, though that's a big part -- it's also veering off the intended course a little bit and adapting to an unanticipated and unfamiliar social role.
- Constant, intrusive supervision of peer groups. Even if you do interact with peers, adult authority figures could prevent the individuals from expressing themselves and developing their role, lest the rambunctious one start a chain of rough-housing, or lest the charming girl hog all the attention from boys. That would go against the supervisor's goal for stability and maximal self-esteem for all. Everybody gets a trophy, but nobody earns it for anything distinctive that they've done.
- Blank slate parenting style. Helicopter parents all believe in the blank slate, and it is their (long, thankless) task to shape the wet clay into the ideal form that they have envisioned. Their preconceived Platonic ideal of the perfect child is the standard toward which they twist and bend the raw material of their real-life children. They don't end up entirely the same because some of them begin closer to the ideal, and parents refine their sculpting technique with each child. Yet each one is pushed into the same narrow range of sports, the same narrow range of volunteer / intern activities, and the same narrow range of acceptable TV shows, movies, and video games to consume.
The Platonic ideal may vary across households, but within each household the siblings are all repeat attempts by the parents to stamp raw material into the same ideal shape -- to fit them all into the bunk-bed of Procrustes.
- Lastly, the unwillingness of adults, including parents, to serve as role models. Grown-ups these days treat kids in such a kiddie way. They lower themselves to their level and, whether it's sincere or fake, get as excited as the kid about whatever he's doing. They style themselves as the "cool parents" or "cool grown-ups" who hang out at the little kids' table at Thanksgiving, rather than model normal behavior at the grown-up's table that the kid will have to emulate in order to gain a seat there. They also pretend to give a shit about the endless parade of kiddie crap that Hollywood keeps shoveling out -- well, at least it'll please the kids. We don't need our own movies to cater to our tastes.
When grown-ups hide their identities whenever children are around, the kids have no clear target to aim for, no one who makes them say, "Oh cool, I wanna be just like him!" "Him" these days is just some generic stock character like "the jock," "the player," etc. Not an actual individual with their own identity that they're going to model themselves on, because adults refuse to be themselves around children.
With no sense that grown-up identities are so different from kid identities, the kids don't have any curiosity about what goes on in grown-up world, how they interact with each other, how they'll need to change in order to be accepted and thrive there, and so on.
Watch your home videos from the '80s (or find some on YouTube) and notice how the grown-ups are being themselves in their own adult world, the kids are being themselves in their own kid world, and they interact across the age gap as ambassadors from their separate worlds, not by the parent joining the kid world and acting like a kid themselves.
I was going to explore how this failure to provide role models is most striking in the case of sex roles, but that turned into its own post, which I'll put up later.
Categories:
Age,
Cocooning,
Dudes and dudettes,
Generations,
Over-parenting,
Psychology
September 10, 2013
Mid-century unwholesomeness: Gossip, celeb, sex scandal magazines
[See here for a list of earlier posts in this series.]
Back in the Fifties, they may not have had the internet or TV recording equipment fancy enough to film reality shows, but that doesn't mean they didn't have just as perverse an interest in the lurid details of celebrities as we do today. They made do with the state of mass media at the time and pumped out gossip magazines, the most (in)famous one being Confidential.
From a 1955 article in TIME, during the gossip mag's heyday:
In a little more than two years, a 25¢ magazine called Confidential, based on the proposition that millions like to wallow in scurrility, has become the biggest newsstand seller in the U.S. Newsmen have called Confidential ("Tells the Facts and Names the Names") everything from "scrawling on privy walls" to a "sewer sheet of supercharged sex." But with each bimonthly issue, printed on cheap paper and crammed with splashy pictures, Confidential's sale has grown even faster than its journalistic reputation has fallen.
You hear that? The biggest newsstand seller in the country. Not some niche rag that nobody paid attention to. Not to mention all of its imitators.
Folks in cocooning times become starved for gossip, since isolated people won't be getting any in real life. But now that the target is someone unknown to the receiver, someone they could never run into around the community, the receiver will feel none of the natural restraints that keep gossip from getting too lurid or too vicious. It's not like a stranger will be able to fight back. Plus you don't know them, so their feelings wouldn't really count to you anyway. Cocooning breeds a callous and lurid interest in the private lives of others.
These things were already declining by the end of the '60s, as people got involved in each other's real lives and had something personal to gossip about, though now within the natural constraints of relationships among members of a common group. By the '80s, celeb obsession was at a nadir -- thank God -- though it has since risen again and become as widespread as it was during the mid-century. TMZ.com, reality TV, bla bla bla.
You can do a Google image search for any of the big mid-century gossip magazines. Just search "hush-hush magazine 1957" or what have you. The gallery below may be a bit overkill, but I want to emphasize how pervasive the culture of lurid voyeurism was back then. All are from the Eisenhower years, yet it's uncanny and depressing how familiar they all feel. (Check the file name for the exact year.) After spending an hour looking through all the covers, I'm going to need a good '80s cocktail antidote tomorrow. Something uplifting.
'Remember that cute trick you dated? "She" was a he!'
'Now -- Surgery cures frigid wives'
'Why Liberace's theme song should be, "Mad About the Boy!" '
'Call girl-school teacher tells of her experiences--
"Love Without Men in Women's Prison" '
'The inside story on HOMOSEXUALS'
'Special report: Sex in schools!'
'Love-hungry women!
Sex swindlers'
'Men in skirts!'
'Exposed: $300,000,000 pornography racket'
'Why the nude Folies Bergere star tried to kill herself!'
'The TV star's filthy letters'
'Debra Paget: She strips to conquer'
'Scandal of our massage parlors!'
'Hollywood's latest pill kick: "Don't-give-a-damn" drugs'
'New massage parlor scandal: They are peddling sex again!'
'Are Swedish girls oversexed?'
'The bare facts about the torso trade'
'Exposed: How wife swapping clubs cheat the law!'
'A pill a day to keep the stork away'
Categories: Cocooning, Dudes and dudettes, Gays, Generations, Media, Morality, Pop culture, Psychology
September 9, 2013
Mimicry, the link between crime and cocooning trends
Criminals can be thought of as predators, and their victims as prey. Elsewhere (mostly in comments) I've described the basic dynamic between these two groups that generates cycles in both crime rates and trust levels. To recap:
First, people (i.e., potential victims, not criminals) become more trusting of others. Higher trust levels allow criminals to gain access to the prey, who are basically defenseless once access has been gained. So, second, crime rates begin rising a bit after trust levels do. Rising crime rates make people more wary of others. Hence, third, after crime rates have been going up for a while, average people begin to withdraw their trust of others. Falling trust levels make it more difficult for criminals to gain access to victims, so fourth and finally, crime rates begin to plummet a bit after trust levels begin to fall.
Of course, after crime rates have been falling for so long, average people sense less of a reason to keep others at arm's length and under suspicion. So trust levels begin rising for the first time in a long while. Now we're back to where we started, and the crime-and-trust cycle repeats itself.
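If you want to see the shape of that cycle, here's a bare-bones toy simulation in Python. Every number in it is made up purely to produce the lagged rise and fall -- nothing is fit to real crime or survey data, so treat it as a sketch of the logic, not a model of any actual decade.

    # Toy model of the crime-and-trust cycle. Crime creeps up while trust is
    # high, trust gets withdrawn once crime has risen, crime then falls, and
    # a long stretch of low crime lets trust come back. All parameters are
    # invented for illustration only.

    def simulate(years=130, trust=0.5, crime=0.1):
        history = []
        for _ in range(years):
            # criminals gain access in proportion to how trusting people are
            crime += 0.05 * (trust - 0.5)
            # people extend trust when crime feels low, withdraw it when high
            trust += 0.05 * (0.3 - crime)
            crime = min(max(crime, 0.0), 1.0)
            trust = min(max(trust, 0.0), 1.0)
            history.append((trust, crime))
        return history

    for year, (t, c) in enumerate(simulate()):
        if year % 10 == 0:
            print(f"year {year:3d}: trust={t:.2f} crime={c:.2f}")

The only thing the toy version is meant to show is that neither series tracks the other directly: each responds to where the other has recently been, and that lag alone is enough to keep the chase going around in circles.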
I've never been able to find quite the right analogy, but it finally hit me that the theory and observations about mimicry in biology are the most relevant. Criminals don't wear conspicuous signals or stand out in obvious ways to the average person. They do just the opposite -- appear to be basically trustworthy, but will take advantage of others for personal gain when they feel like it, and not feel any compunction.
Sociopaths are "aggressive mimics" of normal, trustworthy people (the "model"), and they exploit those who let their guard down around them (the "dupes"). Some of this is visual -- dressing and grooming themselves in a normal way, speaking in the tone of voice of a charming person interested in other people, using the body language of a normal person during the approach toward and around the prey (until the pounce).
They don't have to mimic the most trustworthy of all people -- they just have to convince the dupes that they're basically trustworthy, so that the dupes won't raise their guard around them. Maybe it's someone who doesn't look particularly charismatic and dashing, but hey, why should that keep you from opening your door when he knocks, pleading desperately (and convincingly) for help? OK, so he doesn't look like he's going to be canonized as a saint some day, but hey, why should that keep you from letting him babysit your child or serve as his Boy Scout troop leader?
Most people who seem trustworthy are trustworthy. Maybe that should be said the other way around: trustworthy people tend to have certain identifying traits (mostly honest signals), and we come to judge others by the presence or absence of those traits. So in general, there's little harm in letting our guard down around someone who feels trustworthy.
But some signals are capable of being faked, and anyone who can fake them will find success in exploiting average people. As their prevalence increases, though, normal people become more aware of them, causing the predators to grow at a slower and slower rate, as trust levels begin to plateau instead of continuing up up up. This is a case of frequency-dependent selection.
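To picture that frequency dependence with a crude sketch (again with invented numbers, not anything estimated from real populations): the payoff to running the fake-trustworthy strategy shrinks as the strategy becomes more common, because the dupes' guard goes up, so the mimics' share stalls at some middling level instead of taking over.

    # Frequency-dependent selection, cartoon version: the gain from faking
    # trustworthiness falls as fakers become more common and dupes wise up.
    # All numbers are invented for illustration.

    def mimic_gain(share, base_gain=1.0):
        # dupes' wariness rises with the mimics' share of the population
        return base_gain * (1.0 - share)

    share = 0.01   # initial share of the population running the mimic strategy
    cost = 0.4     # flat cost of keeping up the act
    for step in range(30):
        fitness = mimic_gain(share) - cost
        share = min(max(share + 0.5 * share * fitness, 0.0), 1.0)

    print(f"mimic share settles near {share:.2f}")   # about 0.60 with these numbers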
When people withdraw into their cocoons, it's not only a physical isolation -- even when out and interacting with others, they have their guard up. So the relevant analogy from biology is not one where the prey merely avoid the spaces where would-be predators could snag them, or hone their escape/evasive maneuvers (although that's going on too).
Cocooning folks show some fear of physical places that could be hot spots of crime ("this place looks sketchy"). But their predominant fear seems to be an anxiety that somebody they trust, or give a pass to, will turn out to be a sociopath who will take advantage of them. It's fear of deception, not of being attacked by someone you didn't even know was there. That means they're more worried about (aggressive) mimics than about stealthy and physically imposing predators.
Importantly, people are not worried about dangerous members of some foreign group swooping in to prey on them. That's more like a war zone. Crime is more local and caused by folks who look like your neighbors. Don't they always say that, about the guy who shot his wife and children one day after work -- neighbors tell reporters that he seemed like a normal, even model husband and father? No one could have told? If you can't even trust your own blood, who can you trust?
That's what worries people, not ghetto blacks riding over the hill. For another thing, most people don't live near ghetto blacks or Mexicans, either in the U.S. or in all the other Western countries that saw a crime wave from roughly 1960 to 1990, or from 1580 to 1630, or from 1780 to 1830. Crime rates rose across all 50 states, not just those with large black populations. The changing mood and response to rising crime also affected all 50 states.
To pick one example from the list in the Wikipedia article, the closest parallel looks like the system consisting of a grouper fish (the dupe) that allows a cleaner fish (the model) to get near it in order to eat parasites off the grouper's fins, while a species that mimics the cleaner fish is allowed close but then takes a bite out of the grouper's fin, enjoying a nice little meal for itself, and harming rather than helping the grouper.
The grouper and cleaner fish are in a mutualistic relationship, and that's how normal people within the same society interact with each other. The sociopath is one who can closely imitate pro-social behavior (body language, speech, dress and grooming, etc.), and take a quick bite out of a trusting person before splitting and moving on to target another unsuspecting dupe.
When the grouper is preyed on in this way by the mimic species, it raises its guard. That precludes some of the beneficial interactions it could have had from letting the true cleaner fish get close. But letting go of those benefits is just the price you have to pay these days to avoid those damn mimics. Normal people are also aware of how deprived their lives have become during the cocooning period of the past 20 or so years (or back during the mid-century). But they figure it's the price you have to pay to avoid being used by a deceiver.
Only when people feel a wide enough mismatch between their anxiety about deception and their own real-world experience (and second-hand learning) will they figure the price is no longer worth it. Like, when was the last time I or anybody I know was victimized by a sociopath? If it's been that remote, maybe I can let my guard down more around strangers -- after all, there don't seem to be that many mimics out there these days.
I think the human case has something to tell the broader biological theory of mimicry, or at least how it's presented to the public. Namely, that mimicry is unstable and will rise and fall in cycles. Typically when you read about mimicry, it's portrayed as a static thing (though obviously allowing for new cases to be discovered). The hoverflies that mimic the appearance of nasty wasps, and thereby ward off predators, must have been doing so since forever. Those cuckoos that lay eggs mimicking the eggs of the host bird species, must have been doing that since forever. That's just what they do.
Well, no it isn't. When we look for examples of mimicry now, we're only going to find those cases that are like a rising-crime period in human history. In all likelihood, 1000 years ago, 10,000 years ago, or however long ago, there were mimicry arrangements that no longer exist today, even though the mimic, model, and dupe species are still with us. It's just that, in the intervening time, the dupe wised up and mimicry no longer paid off. Presumably the same will happen with the species targeted by cuckoo birds.
The only exception is the common side-blotched lizard, where every review of its dynamics focuses on the cyclical nature of mimicry. But that ought to come up in every example presented -- when and how does the dupe catch on, and what happens to the mimic when the jig is up? Do they evolve to mimic another dupe species, close enough to the first to require minimal change? Or do they just dwindle in number? In whatever records we have of the history of these species (like fossils), do we see a period where the mimic had not yet set upon the dupe? If so, when did it all begin? Also in the record, do we see clear signs of mimicry that are no longer present in the descendant species?
These cyclical dynamics are way more fascinating than the snapshot of a successful mimicry case, where we just have a good laugh at the dupe's expense. Same with humans -- how societies cycle over time is way more fascinating than how people interact with each other during a snapshot.
Categories: Cocooning, Crime, Evolution, Human Biodiversity, Psychology, Violence