August 30, 2014

Can stronger social connections help out children from broken homes?

Data and analysis are in the post below. This is more of a conceptual overview of the topic.

In the nature vs. nurture "debate," the pendulum has swung a bit too far toward the side of nature. Somewhere along the line, fans of behavior genetics — which is not to say the researchers themselves — convinced themselves that environments don't affect how we turn out as adults. At least, not in predictable ways.

There could be random environmental differences that affect development: some molecule in the brain zigs instead of zags due to a quantum-level coin flip, and that winds up making the kid extraverted instead of introverted. Or maybe there just happened to be a frightening dog on the way to school one day, and this traumatic experience makes the kid more fearful in adulthood. These are environmental effects, but they all boil down to chance and fortune.

What we tend to mean by "environmental effects" is something like: when a parent uses corporal punishment, the child may wind up with violent tendencies in adolescence. But it could be that the parent had genetically influenced violent tendencies themselves, which expressed themselves in the parent as the use of corporal punishment, and in the kid as getting into pointless fights. Or perhaps the corporal punishment is an effect, rather than a cause, of the kid's violent tendencies — natural-born hell-raisers induce parents to use tougher discipline than little wiener kids do.

What gets lost in all this quibbling is that what normal people usually mean by "environments shape development" is that some events have a much stronger impact than merely getting whipped with a belt. A small bruise will heal, but what about things that a kid is not going to be so resilient against, like growing up in a broken home, being molested, or being emotionally neglected?

In our neo-Dickensian world, such events are hardly uncommon. Recall this post, which shows how increasingly common it is for white Americans to grow up without both of their parents in the home — up to around 40% of those born in the late '80s.

To look beyond the individual level for how such kids are affected, a follow-up post looked at the loss of connectedness that children have when their parents divorce.

Now let's try to work both of these levels together. Depression and anxiety are found more often in children of divorce than in children from intact families. But depression isn't just a free-floating individual trait — it responds to how socially integrated you are. More connected environments might alleviate the sense of disruption and having been ripped out of the ground.

We can imagine all sorts of ways to define "more connected," and here I'll look at the outgoing, as opposed to cocooning, trend in social-cultural cycles. Children from broken homes must feel even more lonely and depressed when there's a larger climate of cocooning and atomization.

When folks are more out and about, milling around in public spaces, and neighbors are looking out for neighbors, those kids might not feel such a crippling loss when their parents split up. Compared to victims of helicopter parenting, kids in an outgoing climate have far greater support from their peers, and may also find surrogate parents throughout their neighborhood and peer group, to make up for their own dysfunctional parents.

Children of divorce less depressed as young adults if they grew up in socially connected times

Introductory post here.

The General Social Survey asks a question about how happy you are, and most people say pretty happy. Since that's a fence-sitting answer, I'm going to look at the "very happy" answer (the other is "not too happy"). I've separated the respondents into those who grew up with both parents, and those who did not because of divorce or separation. Since the effects might peter out over time, I'm only looking at people who were 18 to 25 years old when they were surveyed. They're also whites only, to prevent the confound of race differences in family structure.
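The sample restriction described above can be sketched in a few lines. The variable values below (like "both" and "white") are illustrative stand-ins, not the actual GSS response codes:

```python
# Toy GSS-style records; the real analysis uses the variables listed
# at the end of the post (happy, family16, famdif16, cohort, age, race).
rows = [
    {"age": 22, "race": "white", "family16": "both", "happy": "very happy"},
    {"age": 24, "race": "white", "family16": "divorced", "happy": "pretty happy"},
    {"age": 40, "race": "white", "family16": "both", "happy": "very happy"},
    {"age": 20, "race": "black", "family16": "both", "happy": "not too happy"},
]

# Restrict to whites aged 18-25 at the time of the survey
sample = [r for r in rows if 18 <= r["age"] <= 25 and r["race"] == "white"]

def pct_very_happy(group):
    """Fraction of a group giving the 'very happy' response."""
    if not group:
        return float("nan")
    return sum(r["happy"] == "very happy" for r in group) / len(group)

intact = [r for r in sample if r["family16"] == "both"]
broken = [r for r in sample if r["family16"] != "both"]
print(pct_very_happy(intact), pct_very_happy(broken))
```

With real data, the two percentages would then be compared cohort by cohort, as in the graphs below.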

To measure how connected their world was when they were growing up, I grouped them into birth cohorts of 10 years: 1945-'54, '55-'64, '65-'74, '75-'84, and '85-'94. Each cohort is labeled by the 0-year in the middle, from 1950 to 1990.

The first cohort is early Boomers, and their 1950s childhood was fairly atomized by historical standards, although their adolescence was more outgoing in the '60s. Those in the next cohort, the late Boomers, grew up entirely in outgoing times, although when they were little this trend was only just beginning to rise. The early Gen X cohort also spent their childhood and adolescence in outgoing times, but now at a higher level than for the late Boomers. The late X-ers grew up mostly in outgoing times, up through the '80s, but went through adolescence during the cocooning '90s. (They are the inversion of the early Boomers, who also had a mixed-up upbringing.) The early Millennials grew up entirely during the cocooning period that we're still in.

Did the environment make a difference? Here are the rates of feeling very happy among children from intact vs. broken homes, who were asked in young adulthood, separated by cohort:

Notice that the red line, showing kids from intact homes, is fairly flat across all cohorts. The happiness levels don't seem to change so much, in one direction or another, for children from intact families. Perhaps a healthy home life makes them less dependent on an outgoing climate to feel great, and they will be better buffered against the atomization of the cocooning climate.

The blue line, however, rises along with the outgoing trend and begins to fall with the cocooning trend. With not so much at home to anchor their psychological stability, they are more sensitive to changes in the wider social climate. They never reach the same levels of happiness as those from intact families, but they did manage to close the gap by quite a bit when they had grown up in the more broadly connected environments of the '70s and '80s.

We can treat happiness like a trait that follows a bell-shaped curve, and use the "very happy" response as a cut-off value for that trait — kind of like using "can dunk a basketball" as a cut-off value for height. We can then work backwards, turning the fraction meeting the cut-off into a z-score, and from there figure out the distance between the average person in each of the two family-structure groups. This is like figuring out how much shorter, on average, one group is compared to another, looking only at how likely each is to dunk a basketball.
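That back-calculation takes only a few lines with the standard normal distribution. The proportions in the example call are made-up placeholders, not the GSS figures:

```python
from statistics import NormalDist

def gap_in_sd(p_above_a, p_above_b):
    """Given the fraction of each group clearing the same cut-off on a
    normally distributed trait, infer how far apart the two group
    means sit, in standard deviation units."""
    # inv_cdf(1 - p) is the z-score of a cut-off that a fraction p
    # of the group lies above.
    z_a = NormalDist().inv_cdf(1 - p_above_a)
    z_b = NormalDist().inv_cdf(1 - p_above_b)
    return z_b - z_a  # positive when group b's mean is lower

# Placeholder proportions "very happy": 40% (intact) vs. 26% (broken)
print(round(gap_in_sd(0.40, 0.26), 2))
```

This assumes the trait really is normally distributed with equal spread in both groups — the same assumption behind the dunking-a-basketball analogy.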

Here is a graph of the gap between the happiness levels of the two groups, plotted across each of the cohorts again. The units are standard deviations. For comparison, one S.D. is about 3 inches in the case of height.

Never does the gap reach zero, but it does narrow quite a bit from around 0.4 S.D. to 0.25, then to 0.15, staying around 0.15, and finally rising back to 0.25. If we treat happiness as a kind of "height," then children from broken homes were about 1.2 inches "shorter" than those from intact homes, among the early Boomer cohort. For late Boomers, the gap shortened to 0.75 inch. For both Gen X cohorts, it narrowed down to under half an inch, and then widened back to 0.75 inch among the early Millennials. I'd project that it's even wider among later Millennials, born between 1995 and 2004.

Narrowing a gap in average "height" by three-quarters of an inch in just 20 to 30 years, or one generation, is pretty good — way better than utopian attempts at social engineering. Although by the same token, the steady trend toward cocooning over the past 25 years has probably wiped out those gains. Still, it tells us that, aside from trying to keep down the rate of kids growing up in broken homes, we ought to re-evaluate the costs of cocooning.

By sealing off the nuclear household from the rest of the neighborhood and community, we leave the unfortunate kids whose parents break up the family with nowhere to turn as they try to cope with one of the most severe disruptions a person could face, and all while they're still growing up. I know that sounds incredibly emo, but these days, it really must be that devastating.

For a reminder of how manageable and totally not-emo the problem used to be back in the '80s, look at Molly Ringwald's character from Pretty in Pink. Her mother walked out on her and her father, but since it was an anti-helicopter parent era, the father doesn't think twice about her socializing with close friends, going to dances, and bonding with the surrogate maternal figure, who manages the New Wave record store where she works. She's probably not going to turn out exactly like the normal kids at school, but y'know — normal enough.

That was fairly common back then, even for younger children. If your parents were divorced, you spent most of your time outside the home, trying to latch onto something stable and healthy. And provided that you made yourself likable, you were welcome in your friends' homes by your friends' parents.

This underscores the importance of there not being too many broken families, though. If only 10% of the kids are from broken homes, maybe the other 90% can collectively handle them as guests. But what happens when the guests out-number the hosts? Nothing good can come from that situation.

GSS variables: happy, family16, famdif16, cohort, age, race

August 29, 2014

Jews over-represented among gay Congress members

According to this list, there are 16 members of Congress who were "out" while in office. All served in the House of Representatives, although one also served in the Senate. Of these 16, 4 are Jews (25%), including the only Senator (Frank and Polis are full, Cicilline is half, and Baldwin is a quarter).

During the time period that the gay ones have served -- the 1970s through today -- Jews have made up between 5% and 10% of the House. They are therefore 2 or 3 times over-represented among those members of Congress who are gay.

Or at least those who are openly gay -- we'd need a more exhaustive and reliable list of who was in the closet, and to check them by ethnic background. Being 2-to-3 times over-represented doesn't seem like it would go away even if Jews were less likely to be in the closet and we counted both closeted and open homos. At any rate, they'd still be more likely to be open about it, i.e. to make it one of the issues on their agenda.

Related post: Jews are more than twice as likely to be gay, though equally likely to be lesbian.

August 27, 2014

The decline of stoicism during status-striving times

One of the most palpable changes to the social climate during times of rising competitiveness, with its norm of me-first and dog-eat-dog, is how emotionally unrestrained people become. Each individual no longer feels like it's their duty to regulate their emotions in the presence of others, no matter how positive, negative, or neutral the feeling is.

Soooo stoked for the beginning of pumpkin spice latte season!!!!

Soooo depressed that Wendy's is discontinuing the Tuscan chicken on toasted ciabatta :((((

Fuck yeah, George Takei for the epic motherbitching win!

It's my opinion, douchenozzle, if you don't want to hear it then go somewhere else.

Older and higher-status people are no less on-edge emotionally than those who are younger and more status-insecure. Steve Jobs, John McCain, and their fans, and their haters.

In more accommodating times, the norm is reining-it-in. Don't work 100 hours a week out of ambition, if that work could have been divided into two 50-hour schedules and given someone else a job too. Don't get overly excited when you feel positive, since you'll look like a poor winner or like you're trying to lord it over those who are feeling only so-so. And when you're feeling down in the dumps, don't whine so loudly about that either, since excessive emotion may be drawing more attention to your problems than is necessary to help you out, leaving the problems of others unattended to. Thus does stoicism support a more egalitarian society.

The removal of emotional restraint was already under way in the '80s, when the strong silent type like Clint Eastwood had been replaced by high-energy loose cannons like Mel Gibson, Tom Cruise, and Bruce Willis. Eastwood was popular not only in the '60s (continuing the popularity of Westerns back through the '50s), but into the early part of the '70s as Dirty Harry, another strong silent type. Harrison Ford playing Indiana Jones was less unpredictable than other in-your-face heroes of the '80s, but he wails and grimaces in pain more than the Western stars of the Midcentury, and is more likely to unload on an enemy in a rage.

Consciously retro characters like the Fonz and Special Agent Dale Cooper derived their appeal in large part from harking back to a time when men kept it together emotionally.

The same changes can be seen in the prose styles of popular authors. During the Great Compression of roughly 1920 to 1970, Fitzgerald, Hemingway, Kerouac, and Mailer all wrote in a terse style, and kept their works fairly short, to nip sentimentality in the bud. Contrast with the verbose, florid, and treacly prose that was more typical of the Victorians, as well as the self-indulgent 800-page novels of our neo-Dickensian era.

Also from the Victorian era were public intellectuals with such nicknames as "Darwin's Bulldog" (Huxley), who have only started to come back into fashion with New Atheist types and their immediate predecessors like Richard Dawkins. In fairness, The Selfish Gene was an early example before things got as heated as they are now, and he was fairly sympathetic to religious thinking and wrote in a milder tone. Still, I don't recall there being a Fisher's Bulldog or a Wright's Bulldog in the Midcentury, although in 1930 Fisher was somewhat his own bulldog when he wrote an inflammatory second half about eugenics in his book The Genetical Theory of Natural Selection.

Returning to an earlier post about the rise of road rage, what kinds of solutions do we hear for folks with an aggressive driving problem? "Anger management." As it turns out, that phrase appears in Google's digital library (Ngram) only in the 1970s, and rises steadily through the late 2000s. A quick check of the Wikipedia article for "anger management" shows that the academic literature on the topic shares this timeline.

Andy Griffith may have had to deal with the occasional hot-head picking a fight in a bar, but not everyday road rage, and he did not have to refer one offender after another to anger management programs.

But the clearest demonstration of the link between falling competitiveness and stoicism is the original, capital-letter Stoicism of the Roman Empire. As a practical philosophical school that led by example, it flourished from Seneca, to Epictetus, to Marcus Aurelius, which coincides with a long period, from emperors Augustus to Marcus Aurelius, of increasing stability within the Empire (particularly among the elites, who had been at war with one another not so long before).

Once internal competition began to rear its head again under the reign of Commodus, Stoicism went up in a puff of smoke. We can tell that this was due to a social-cultural shift, rather than the introduction of newer, weaker blood, because Commodus received half of his genetic stock from the Stoic philosopher-emperor himself. Recall this earlier post, using Commodus as an example, about how bombastic and unrestrained the leaders become as society creeps toward civil war.

Increasing levels of rage, excitement, and so on, are symptoms of the underlying cause — the rise in competitiveness, and its accompanying norm of me-first. Don't make me contain my emotions, and I won't make you contain yours. We'll just have to see whose emotions are stronger than the other's. So, going to those anger management programs, or taking a DIY approach, probably won't do all that much long-term. We need to attack these problems at their root, which is the laissez-faire attitude and hyper-competitive behavior toward others.

August 25, 2014

Liberal vs. conservative flavors of purity as a moral intuition

As summarized by Jonathan Haidt in The Righteous Mind, the basic difference between the moral intuitions of liberals and conservatives is that the moral lobes of liberals light up in response to the factors of harm / care and fairness / justice, with much weaker responses to authority / hierarchy, in-group / community / loyalty, and purity / taboo. Conservative brains respond about equally to all five of these factors, and the largest split with liberals is over the factor of purity.

This shows up today in debates over whether something disgusting like gay butt-sex ought to be condoned or condemned. "Just because it's yucky doesn't mean we should condemn, punish, or quarantine it," says the liberal. "We find things yucky for a reason," the conservative replies: "it's Mother Nature's way of guiding us toward the healthy and wholesome, and away from the diseased and infectious."

But as Haidt and others have observed, liberals do have strongly puritanical intuitions about some things. Look at how obsessive they are about food taboos — any type of food that is not organic, that is genetically modified, that has too much fat, too much sugar, too much sodium, gluten, dairy, peanuts, or high-fructose corn syrup. They are also the type to compulsively use hand sanitizers, napalm for their kitchen countertops, and other extreme forms of hygiene.

An earlier post extended this puzzle to looking at racial differences in disgust and morality between East Asians and Europeans (and Middle Easterners). Overall, East Asians don't show as much revulsion toward as wide a variety of stimuli as Europeans do, and they are more permissive of and more likely to consume what Europeans would consider the most vile of cultural garbage — eating raw squid, watching a cartoon squid rape a 10-year-old girl, buying used panties from vending machines, and so on and so forth. This is why East Asia has not exported a moral code like Europe and the Middle East have.

At the same time, Asians follow more elaborate behaviors to ensure hygiene, such as wearing different pairs of shoes in different rooms of the house, so as to not transfer pollution from a dirtier room to a cleaner room.

What underlies the apparent exceptions is the scope of focus: liberals and Asians restrict purity concerns to the personal, while conservatives and Europeans extend it farther out into the communal. The OCD mindset and habits of the hygiene freaks are fundamentally isolating, sensing everything outside of the self as a threat. The "how dare they?" mindset and habits of standard-bearers are pro-social, attempting to maintain the purity of people, places, things, symbols, and roles against the desecration which would corrode communal identity.

Notice that liberals don't care if some random school doesn't serve organic food in the cafeteria — it's only the schools that their own children attend that matter, and even then they'll give up and pack a separate organic lunch without launching a broader campaign. Only the tiniest fraction of delusional crusaders feel like non-organic school lunch is a travesty that requires correction across the entire school system. Ditto for their OCD routines in the bathroom and kitchen — what do they care about what you do in your bathroom and your kitchen, provided you and they remain apart?

For liberals, everyone is entitled to their own set of hygiene routines and food taboos, and no norms are held for larger groups than the individual (or at most a nuclear household), let alone are they enforced by the larger-group members. Occasionally, a band of do-gooder mommies whose children attend the same school will cooperate to establish and enforce their organic salad bar ways at the school, but again this is uncommon even for liberals.

Conservatives don't act according to the norm of "I'll let you do what you want, if you leave me to do what I want." What if that means I raise the American flag each morning outside my house, while every morning you burn a new American flag in your driveway? What if some bunch of attention junkies on Halloween want to go out in public dressed up as "slutty nuns"? What happens when some of us in the neighborhood want to preserve the mini-golf course where families have been playing and bonding for decades, while others want it replaced by yet another chunk of condos with throwaway trendoid shops at street level ("mixed-use development")?

Conservative-minded folks sanctify these things — whether they are symbolic like the flag, role-based like "nun in the Church," or tangible like the mini-golf course — because they bind us to others in a community at a point in time, connect us back to those who held these things as part of their group identity, too, and will continue to link future generations to us and our predecessors.

This is also why fans of a sports team don't want the name or mascot changed — both are central to the totem with which they all identify, and altering them would sever ties to the past. Changing the team's location is almost as bad, although uprooting an intact totem is not as sacrilegious as adulterating its name and form willy-nilly.

When the focus on purity could go either way, the conservative will sense the threat to communal cohesion, while the liberal senses the threat to individual health. Alcoholism, substance abuse, sleeping around, disgusting sex acts — liberals see more and more harm being done to the individual, and the need for others to care for them until they're better. Hence their solutions follow the model of self-focused therapy. (That is, when they are not "tolerant of" i.e. callous toward others descending into degradation, although in fairness that is more of a libertarian than a liberal inclination.)

Conservatives see the pollution of the individual in these cases, but they also see how this person's decay will weaken the bonds of everyone who is connected to them. And not only in the sense of actively threatening to harm others, e.g. a drunk who begins beating his wife. Even a non-violent, apathetic drunk will weaken the bond between him and his wife, and therefore between his family and the others in the community. The video game addict isn't just "wasting his life" — which he certainly is — he's one less anchor for social ties that run throughout the community.

The end-point of a liberal-guided society is the insectoid hive found in East Asia, where each little drone in each little cell follows their OCD rituals to maintain individual cleanliness (and indirectly, public cleanliness), and where junkies are sent off to heal themselves at video game addiction camp. But also where nothing is held sacred or taboo, and so where everything is in a constant state of flux, no two drones identifying with each other, and none of them feeling securely rooted in the past.

White folks are never going to become that atomized and soulless, but normal people need to point out where the liberal path would ultimately take us, to guard against the Panglossian assurances of how great it'll be when everyone tolerates everyone else's lifestyle choices, or the fallback agnosticism about how we can't know what the effects will be unless we try it. (Hey, I know, let's eat random wild berries — we won't know which are poisonous until we swallow a bucketful.)

They might lazily object about "OMG, seriously? Slippery slope arguments in 2014, really?" But normal folks don't want to take even one more step in the insectoid direction. We've already gone down that path far enough. Pointing to the examples of East Asian societies would instead serve as a reminder of what we need to be moving away from.

August 22, 2014

Jews less forgiving than Christians and even non-religious people

Religion is designed to regulate social behavior according to moral norms that weaken the instinct toward selfishness and strengthen our resolve to "be a bigger person," the better to hold together a large society. Humility is meant to curb vainglory, charity to discourage gold-hoarding, and chastity to prevent adultery.

When it comes to physical security, forgiveness is meant to break up what would be endless cycles of bloody vengeance. Later this evolved into a general attitude toward those who have wronged you, not only those who've caused physical harm.

And yet not all religious groups place an equal emphasis on forgiveness. This wouldn't be the worst thing in the world if the less forgiving group were not over-represented at the highest levels of wealth, power, and influence. They would instead be more of a nuisance, like Gypsies.

The General Social Survey asked a series of three questions about how often your religious or spiritual beliefs lead you to forgive yourself, to forgive others who have hurt you, and to know that God forgives you. The responses can be lumped into a "frequent" group ("always" or "often") and an "infrequent" group ("seldom" or "never").

The charts below show the percent of people in each group who gave an "infrequent" response for each of the questions on forgiveness. Only the major four religious groups are included — Protestants, Catholics, Jews, and those who answered "none" for current religious affiliation.* Respondents were restricted to whites only, to control for race. No effort was made to control for ethnic differences between Protestants and Catholics since they turned out fairly similar anyway.

Across the board Jews are more unforgiving than Christians, and surprisingly, more than non-religious folks as well.

We've all gotten the impression that petty behavior is more common among atheists than among those who fill the pews on Sunday, and here we see our hunch confirmed. There could be a selection effect here, where those who were raised religious but had a more unforgiving disposition drifted away from the church, leaving the more forgiving types to continue practicing as adults. But I suspect there's also a compounding effect of no longer belonging to a higher-level group than the individual, or at most the family, and being subject to its moral norms (abstract "humanity" is a bogus claim, belied by their behavior).

How large are these differences? Let's focus on the second question that asks about forgiving others who have hurt you. If we treat forgiveness as a normally distributed trait like height, with Protestants as the baseline height, it's as though Catholics were less than half an inch "shorter," the non-religious were 1.3 inches shorter, and the Jews were 1.9 inches shorter, on average.

These are not minor differences. The gap in forgiveness between Protestants and Jews is about what the height gap is between white and Asian Americans. Not only visible in everyday life, but far more pronounced at the extremes. Asian-Americans are way more likely than whites to be 5'3 or shorter, and Jews will be more likely than the goyim to never let up on holding a grudge. (Compare the academic fields of Christian Studies vs. Judaic Studies.)

You might object that this is one of those "no duh" findings, but using the GSS we've uncovered a pattern that casual observation would not suggest. Not only are they more unforgiving toward others, they're more unforgiving toward themselves, and they either deny or are uncertain about whether or not God forgives them.

It would not be too much to assume that if they had asked a question about whether you believed others forgave you for your wrongs, Jews would be less likely to believe so than Christians or even non-religious folks.

There is a broad, general absence of forgiveness in the minds of Jews, which suggests a single underlying cause. Here it is probably their elevated levels of neuroticism. People who are angry, worry-prone, and depressive are more likely than cool-headed people to view any remark as a slight, and any bad outcome as a conspiracy. These sting badly enough for the neurotic that they cannot be truly forgiven.

Ashkenazi Jews spent the better part of a millennium adapting genetically and culturally to a managerial niche. Have you guys interacted with managers? Who can be surprised to find that they are now more petty and vindictive than their Christian and secular goy host populations?

These are not qualities that hold together a society, high IQ or low. Indeed, the example of Israel shows how pettiness and vindictiveness must be elevated into national, or rather foreign, policy in order for such corrosiveness to be directed outward. If it were not for The Eternal Muslim -- or, in a pinch, The Eternal Slav -- the Jewish nation would devour itself.

Related: two earlier posts here and here on why the Parsis are a "market-dominant minority" who are loved rather than loathed by their host society.

Humility, charity, and chastity all distinguish the Parsis from the Jews, and they do not continue to harp on their persecution in previous centuries, such as the Muslim conquest of Iran that drove them into India. They seem to have forgiven the Muslims for displacing them from their homeland and having to live as a diaspora. Somehow, I don't see the Parsis banding together to colonize Iran and forcefully reclaim their homeland a la the Zionist movement.

Jews will never collectively forgive goy Europeans, and so would never take our unsolicited advice. But they might be able to take the same advice if it came from the Parsis. Follow their example, and become viewed as a national treasure. They could not imitate them very well at the start, but in what other direction does their culture have to go?

* Jews have a small sample size of 26, but the differences between them and the others are large enough that even this small sample is good enough to give highly significant results on a chi-squared test.
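The chi-squared calculation behind that footnote is easy to reproduce by hand. The counts below are hypothetical, chosen only to illustrate that a 26-person group can clear the p < .05 bar when the split is lopsided enough:

```python
def chi2_stat(table):
    """Pearson chi-squared statistic for a contingency table given
    as a list of rows of observed counts."""
    row_totals = [sum(r) for r in table]
    col_totals = [sum(c) for c in zip(*table)]
    n = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = Jewish vs. Protestant respondents,
# columns = "infrequent" vs. "frequent" forgiveness responses
table = [[14, 12], [120, 380]]
# df = (rows - 1) * (cols - 1) = 1; critical value at p = .05 is 3.84
print(chi2_stat(table) > 3.84)
```

The point is that statistical significance depends on the size of the difference as well as the sample, so even n = 26 can suffice for a large gap.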

GSS variables: forgive1, forgive2, forgive3, relig, race

August 20, 2014

The generational swing away from "Boo taxes"

An earlier post laid out the basic finding of Gen X and Millennials being distinctly less hysterical about paying taxes than the Boomers, Silents, and Greatest Gen. Today's inequality is in large part the outcome of the tax burden being kicked down the road by earlier generations. I've already discussed what the overall dynamics seem to be behind the shift in sentiment with later generations, so now let's turn to the data for the details on just how stark the split is.

These results come from the General Social Survey, which studies a national probability sample, going back to the early 1970s. The question about taxes asks, "Do you consider the amount of federal income tax which you have to pay as too high, about right, or too low?" We're interested in the "too high" response as a signal of anti-tax sentiment. Only whites were studied, to control for race (which varies quite a bit with later generations), and to look at what's going on with the majority.

There are changes across time periods, and across age groups, that we need to isolate before looking at cohort differences. If we compare one cohort when they were young to another cohort when they were old, or if we compare one cohort during a period of heavy anti-tax sentiment and another cohort during a period of lower sentiment, we'll confuse our generational comparison with age and period comparisons.

I've grouped people into birth cohorts of 10 years, from years ending in a 5 to years ending in a 4, and labeled by the year ending in a 0 that lies in between. For example, one cohort includes those born from 1945 to 1954, and they are labeled the 1950 cohort.
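One way to implement that binning rule, assuming birth years as integers, is a one-liner:

```python
def cohort_label(birth_year):
    """Label a birth year by the 0-year in the middle of its 10-year
    cohort: 1945-1954 -> 1950, 1955-1964 -> 1960, and so on."""
    return ((birth_year + 5) // 10) * 10

print(cohort_label(1945), cohort_label(1954), cohort_label(1964))
```

Shifting by 5 before rounding down to the decade is what makes the bins run from years ending in 5 through years ending in 4, rather than from 0 through 9.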

Time periods could not be each individual survey year, as that gives sample sizes that are too small. I looked for local peaks and local valleys in the changes over time, and put the surrounding years into a single time period. This captures periods of higher or lower anti-tax sentiment. There are five periods, each made up of 4 or 5 survey years: 1976-'82, '84-'88, '89-'93, '94-2002, and '04-'12.

The plot below shows the percent of respondents who said their taxes were "too high," tracked across time periods, and shown separately for each cohort. I've restricted people to those aged 18 to 64 -- once retirement, Social Security, and Medicare are within reach, folks stop whining so much about having to pay taxes. Looking only at the 18-64 year-olds lets us feel the pulse of the main tax-paying population.
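The tabulation behind the plot can be sketched as follows; the record layout, helper names, and toy data are mine, not the GSS's:

```python
from collections import defaultdict

# Period bins from the text: 1976-'82, '84-'88, '89-'93, '94-2002, '04-'12.
PERIODS = [(1976, 1982), (1984, 1988), (1989, 1993), (1994, 2002), (2004, 2012)]

def period_of(survey_year):
    """Return the (start, end) period containing a survey year, else None."""
    for lo, hi in PERIODS:
        if lo <= survey_year <= hi:
            return (lo, hi)
    return None

def pct_too_high(rows):
    """rows: (survey_year, birth_year, response) tuples.
    Returns {(cohort, period): percent answering "too high"},
    restricted to respondents aged 18 to 64."""
    counts = defaultdict(lambda: [0, 0])  # (cohort, period) -> [too_high, total]
    for year, birth, resp in rows:
        age = year - birth
        period = period_of(year)
        if period is None or not 18 <= age <= 64:
            continue
        cohort = ((birth + 5) // 10) * 10  # e.g. 1945-1954 -> 1950
        cell = counts[(cohort, period)]
        cell[1] += 1
        cell[0] += resp == "too high"
    return {k: 100.0 * th / n for k, (th, n) in counts.items()}
```

Each cell of the plot is one (cohort, period) pair; anyone outside the 18-64 window or outside the five periods is dropped before counting.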

There are two take-away patterns in this picture. First, all cohorts follow the same general up-and-down movements over time. Everyone felt more "Boo taxes" around 2000 compared to 1990, and everyone lightened up by 2010 compared to 2000.

Second, the cohorts born from 1915 to 1964 all more or less overlap each other across time, despite their age differences within any given period. This is why their colors are similar to each other -- you don't need to distinguish them, given how similar they are. Greatest Gen is red, Silents are orange, and Boomers are yellow. The early cohort of a generation has a dashed line, the later cohort has a solid line.

The late Boomers (in solid yellow) looked like they were going to pull away in the first couple periods, but by the time they hit 30 around 1990 they had been captured into the upper band on the graph. Both cohorts within Gen X, however, are markedly set apart from that upper band, regardless of time period. Millennials are also low in anti-tax sentiment, below Gen X. They are not shown because they would only appear as a single point in the last period, so that we couldn't see their movement across at least two time periods.

Although the entire society seems to be coming closer together during the nadir of the Great Recession, I think the split will remain into the near future, based on what we can project from the age curves, which we turn to now.

Here we let back in those aged 65 and above, to see how anti-tax sentiment falls off a cliff during a person's 60s. The age groups are 18-29, 30s, 40s, 50s, 60s, and 70s and 80s together (to prevent small sample sizes among the elderly). If most of a cohort hasn't aged through most of a given age group, I left them off the chart. The 1980 cohort, for example, doesn't show up in the 30-39 age group because, as of the most recent survey in 2012, only the early members had gone through a decent amount of their 30s, and the later members had hardly even entered them.
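Those age bins can be written as a small hypothetical helper (again, an illustration of the binning rule, not the original code):

```python
def age_group(age):
    """Bin ages as in the chart: 18-29, then decades, with 70s and 80s pooled
    to prevent small sample sizes among the elderly."""
    if age < 18:
        return None
    if age <= 29:
        return "18-29"
    if age >= 70:
        return "70-89"
    decade = (age // 10) * 10
    return f"{decade}-{decade + 9}"
```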

The plot below shows how anti-tax sentiment changes across the lifespan, separated by cohort. The visual coding of the cohorts is the same as the first chart.

Again notice how similar the Greatest, Silent, and Boomer generations are. They don't appear in all age groups, but you can see them all lying along the same curve. As before, the late Boomers looked like they were going to pull away in their youth, but they've aged to be more like the other cohorts in the upper band. They could still stand farther apart in middle age, and serve as a bridge cohort. We'll have to wait and see.

We must emphasize that the early Boomers (in dashed yellow) have remained consistently "Boo taxes," ever since they were youngsters. What kind of people are complaining about taxes being too high when they're 20-somethings? Yet there they are -- hovering between 65-70% from youth through middle age. Once they leave their productive earning years, they will surely lighten up as everyone does, but it is remarkable that they have been so rabidly in favor of de-funding public goods and services for their entire productive lives -- while of course voting for more and more public goods to be doled out for their welfare (prescription drugs) and for their entertainment (kicking ass abroad).

And yet they express genuine shock when they hear how the national debt keeps ballooning -- it's not a cynical defense like, "I am shocked, shocked to find that debt is going on here!" They are sincerely perplexed at how spending and taxing have gotten so outta-whack. The utter lack of self-awareness of their role in sinking our society is breathtaking.

Just speculating now, but why is the early Boomer cohort worse than the others in the band that it belongs to? I blame the Dr. Spock climate that they were raised in -- both at home and when they were out in public. Smothering mothers, permissive parenting, etc., made the early Boomers doubly spoiled during their formative years -- in addition to being spoiled by growing up during the Great Compression, when per capita wealth kept rising, inequality kept falling, and political instability had all but vanished (not until they were around 20 would they see the slight return of organized violence circa 1970).

Moving on to Generation X, they are again in a world apart from the preceding generations. The early X-ers have already aged through their 30s, and although they were more anti-tax than they had been as 20-somethings, they did not rise anywhere near the levels of the upper band. The hints we can glean from their feelings in their 40s (keeping in mind that they haven't all aged through them) suggest that they aren't going to go near the upper band in that age group either. Neither will the late X-ers. They will show the same rise during middle age and decline during their senior years, but it is plain to see their curve lying far below that of the Greatest, Silent, and Boomer generations.

The Millennials are not shown here either because, as of 2012, most of them had not aged through their 20s. They are like Gen X, though, lying somewhat below the late X-ers. We'll have to wait and see if they form a band with Gen X as we and they age. Who knows, though? -- perhaps it will be Gen X that is the bridge generation between the Greatest / Silent / Boomer band above and a band below made up of Millennials plus whoever follows them. Time will tell.

Summing up the psychology behind these dynamics, it looks like people who grow up in periods of falling inequality take that for granted, and are not consciously aware of the norms that support it -- namely, "reining it in" rather than "let the devil take the hindmost." That leads them to break with the earlier norms -- what's the harm of putting me first in the economic and political arena? -- which then leads to rising competitiveness, inequality, and political instability. (See Peter Turchin's model of the competition-and-inequality cycle in this review.)

Those who grow up under this new norm of dog-eat-dog see from an early age how divisive and corrosive it is to the bonds that hold society together. (The last time around, this was the FDR / Eisenhower generation who grew up during the peak of competitiveness and inequality in the early 20th century.) Even as youngsters, they drop out of the mindset of succeeding and gaining status at any cost to others.

The sharp break with those born circa 1970 suggests that our impressions form when we are small children, since the '70s were the beginning of the present era of rising competitiveness and inequality. The fact that those born circa 1960 are not so heavily affected suggests that our mindsets have already hardened by adolescence. It's like the development of our native language -- infancy and childhood are crucial, adolescence not so much, and adulthood not at all.

Eventually enough of the population will be made up of folks with the "reining it in" mindset, and enough of the dog-eat-dog cohorts will have died off, that society swings back in the more egalitarian and stable direction... until the cycle repeats itself all over again.

Nevertheless, divisiveness will not be as bad this time around as it was during the long Gilded Age, up through World War I. We aren't going to have a civil war as destructive as the first one, our How the Other Half Lives won't look as filthy and depressing, and our Spanish Flu pandemic won't claim as many victims. That's about the only good news, though, aside from being able to glimpse the light at the end of the tunnel once the Boomers die off. If anything, that's the form that I'd expect a bitter civil war to take this time around -- along cohort lines.

Having to fight alongside the Millennials, if it means putting an end to Boomer (and Silent) dominance? We just might have to. Our slogan is, "Always having to clean up after someone else's mess." Resentful stewardship. It won't be that much more degrading to have to take on the Millennials as sidekicks, as long as we still get to make fun of their dorkiness when we're off the battlefield.

GSS variables: tax, age, cohort, year, race

August 18, 2014

More signs that people have abandoned the woods as a public place

Each time I head off into the woods to check on how they're doing, something new catches my attention as a symptom of their abandonment by people over the past 20-odd years. (Earlier posts here and here.) Two observations from today:

1. There are many more spider webs than there used to be, along paths and in clearings. They aren't very thick, almost invisible unless you're keeping your eyes peeled. They fall apart when you walk through a single thread, so there shouldn't be many at all in walkable areas. Their ubiquity shows that nobody is walking around back there. Back when dozens of folks passed through on any given weekend, each individual might have gotten snagged once. The cumulative result is that they cleared out the spiderwebs so that everyone could enjoy a stroll without having to peel off sticky thread from their arms every five minutes.

We're not talking about way deep in the woods either -- even ten feet in, you have to start mindlessly swinging a stick in front of you to keep from getting snagged so often. I felt like an idiot doing that -- it was definitely not one of those moments where you remember, "Oh yeah, now this experience is all coming back to me." When I was a kid, we didn't have to resort to such goofy stuff, since there weren't so many threads in our way. It felt strange having to do that right as you got into the woods: you expect cobwebs in the attic or the basement, not the entryway.

2. Related to all the underbrush that's turning the woods into a jungle, there seem to be a lot more branches and small trees that have fallen to the ground and are getting in the way, because no one goes back there to tramp them into smaller pieces. Kids, too (or boys at least) would entertain themselves by knocking off smaller and medium branches from trees, or finding a large branch already on the ground and whacking all of the smaller branches away from it with their walking stick.

It's like with spiderwebs, or treading over grass to lay down a path -- each individual's contribution is small, but added up over all those people, it made a huge difference.

The very large fallen trees look familiar -- no one would have stomped those into halves in the old days either. The small end of the spectrum looks familiar, too -- twig pieces, shredded leaves, broken acorn shells, etc. But there's way too much stuff in the middle range, like medium and large branches lying all around, often with all their dead shriveled leaves attached, instead of the branches getting snapped into smaller and smaller lengths, and the leaves getting all crumpled up into shreds. It looks like an unkempt lawn cluttered with yard waste, not the small-scale mat of debris that you think of when you picture "forest floor."

While I was back there, a guy driving a lawnmower (who surprised me by being white) was busy clearing out the field that is part of the property just outside of the woods. Public places under bureaucratic control can pay workers to mow the lawn, where that's convenient, and such places look well maintained today.

But once the public place is in a less convenient area, like the woods, the government or corporate board isn't going to pay someone to traipse around back there with a machete and weed-whacker to make it more walkable. Paths and clearings rely on the aggregate of many actions, each of small effect, from the entire nearby population. And of course you're not mindful of your unintended stewardship, given how puny each of your contributions is. You only take note of it when everyone has opted out, and suddenly the public place is overgrown and hostile.

I'm confident that the state of things will revert to how they were once we switch from the cocooning to the outgoing phase in our social-cultural cycle, sometime within the next five to ten years. But it sure would be nice if those future woods-explorers could hit the ground running, and have paths already cleared out. I make a point of carrying a stick to knock branches off and whack the taller weeds out of the way.

Still, it's too much for one person who occasionally goes back there. That would be a great shovel-ready project for the parks and recreation department.  Instead of installing free wi-fi in public parks, where folks are supposed to be unplugged, use that money to hire someone who doesn't mind cutting trails through the local woods, and making sure clearings don't get all cluttered up with large debris. It wouldn't even be as difficult as the first time, since now he could be guided by the fading traces of the original ones.

Did kids playing outside stall the arrival of road rage?

The increasing levels of competitive behavior, me-first attitudes, and dog-eat-dog morality are reflected in aggressive driving becoming the norm. The phenomenon is recent enough that its name was coined only in the 1990s. And yet if it's part of the broader trend of all things me-first and get-outta-my-way, it should have been rising since sometime in the '70s -- no later than the '80s, when careerism and the higher ed bubble were already clearly becoming the norm in the economy.

Why did interpersonal competitiveness take so much longer to show up in driving habits? It's not as though folks didn't have cars, roads, and traffic back then.

I think it was the presence of people in general, and children in particular, being more out and about in the '80s. Kids weren't just out walking and playing, they were riding around on their own little vehicles that you had to watch out for. Having kids around as potential accident victims made drivers take it a little easier, when they probably would have preferred to drive more aggressively. Even if you're a little man behind a big wheel, you're not looking for a criminal record.

Now that so few people spend any time outside their homes -- and not protected by another solid box like an office building, chain store, or automobile -- aggressive drivers feel they have less to worry about.

We ought to keep in mind how widespread me-first driving has become. It's not just maniacs weaving around other cars on the highway. It's treating stop signs like speed bumps even when they're four-way. It's peeling out when the light turns green in a slow part of town. It's turning a corner at 25 mph in a quiet residential neighborhood. And it's darting into your streetside parking space like the neighbors are going to think you're a stunt driver (they aren't). No matter how insignificant the stakes objectively are, everyone perceives their daily driving as though a NASCAR victory hung in the balance.


It's enough to make even the anti-helicopter parents feel anxious when their kids are playing outside. When my nephew and his friends were walking to-and-from one another's houses, I had to yell at them a couple of times to get out of the street if they were just hanging around. It's a quiet suburban street, but when cars do come through, they turn the corner in front of our house like they're on a highway exit ramp.

That is a definite change since I was a kid. We not only hung out in the street, with plenty of time to move to the side if a car was coming, we tossed the football around, hit the whiffleball back and forth, and slapped the street-hockey puck to each other. The last time I remember seeing that in popular culture was the scene of street hockey in Wayne's World. "Car!" *guys get out of the way, car drives past* "Game on!" *guys resume game in the street*

I remember playing like that a few years after that movie came out in '92, but it was rare enough by the end of the decade that I don't recall being on the other side as a driver and having to stop every now and then while children moved aside. Helicopter parents had already begun to lock their children indoors 24/7, aside from school and activities that were supervised and chauffeured.

The case of road rage is likely part of a larger class of me-first phenomena that should have started circa 1980, but took at least a decade longer to come racing out of the gates. An outgoing and rising-crime climate has a pro-social effect on ordinary folks, who are more interested in getting along with and looking out for one another.

The difference from the early signals of competitiveness, such as careerism, is that the feedback from political and economic me-first-ism is less personal and face-to-face. You're competing for a job with people whose faces you may never see, and voting for policies like BOO TAXES whose cascade of consequences you won't immediately feel personally or observe in the lives of others.

But when it comes to driving aggressively when there are children crossing the sidewalk on their bikes, and throwing the football around in the street, the feedback that you should dial it down is immediate and tangible. Aggressive driving is an activity whose effects on others are impossible not to put a face on. It only flourishes in a climate of anonymity -- when people are cocooned indoors within a neighborhood setting, and when they might as well be faceless on the major roads and highways. Not just in the sense of "you'll never see them again," but in the physical sense of it being hard to see into today's cars, whose cocooning don't-look-at-me drivers have ushered in a change in design where all the windows are slit-like and tinted.

August 14, 2014

The "No pro sports" test of cities with low status-striving and long-term stability

Playing sports is good for the body and mind, especially for our sense of social connection. Watching sports can solidify communal bonds if it's mostly a local phenomenon -- local players, local rivalries, local stakes. There's a danger with the mass media, however, that spectator sports can transform into epic, winner-take-all battles between superpowers that the entire nation feels a misplaced emotional investment in.

This form of idolatry also requires a climate of worshiping competitiveness, which is likely why we didn't see such behavior from the '30s through the '60s, despite the well established mass media of daily newspapers, radio, and television, all of them providing sports coverage.

Back in 1960, when the Pirates won the World Series in dramatic fashion, folks who lived in the greater Pittsburgh region didn't collapse to the ground with tears streaming down their cheeks, as though this were a sign that their lives finally had meaning, and could now rest assured. They felt cheerful and excited, not lifted up into the Rapture. And Bill Mazeroski, whose 9th-inning home run in game 7 won the Series, was remembered as a local legend -- not as a god who would conquer outside sports teams for the benefit of the Yinzer tribe at home.

Not until the Me Generation in the middle of the '70s did spectator sports slowly begin turning into such a, well, spectacle. Any particular upwardly mobile Boomer was not going to win a major sports event -- but they could affiliate with a team that stood a good chance. Everyone realized that you couldn't enhance your status through display of your personal athleticism, but through being a rabid committed fan of a strong team. Hence, who won the Super Bowl became as important as who won the Presidential election.

This new turn toward individual competitiveness, rather than fitting in with the community, led to the erosion of traditional religious denominations, just as it led to the reaction of "yawn, losers are boring" when it came to the home team. Religious denomination and sports fandom used to be "ascribed" -- something you were more or less born into, by growing up in one community rather than another. After the Me Generation broke from these shackles in the pursuit of ever-increasing status levels, religious and sports following became "achieved" -- something that you actively committed to, after making a choice among alternatives.

Still, not all places across the country -- not even all major metropolitan regions -- have taken the plunge into the gladiatorial arena with equal enthusiasm. If our age's myriad status contests are all variations on the same theme, then we can look at just one of them to gain enough insight into the big picture.

Luckily, Wikipedia already has a list of American metro areas by how many pro sports teams they host. In the table, you can sort them by clicking on the "B4" column heading, which says how many teams the city has in the four major sports leagues -- football, basketball, baseball, and hockey. I'm weighting them in that order, based on how much fervor they generate among the American public, with hockey not counting for our purposes.

Note that any city that has a pro baseball team also has a pro team for either or both of the two other major sports, so we can ignore baseball and focus only on those that have just football, just basketball, or perhaps none at all.

So, which major cities are less obsessed with status contests than the norm?

Portland only has one major team, but it's in basketball, and the team has been around for a while. Fellow hipster mecca for Western Whitopians, Salt Lake City, also has an NBA team that's been there for decades. A few other cities back toward the heart of flyover country, such as Oklahoma City and Memphis, have very recent basketball teams, but what about cities with no teams whatsoever?

Among major metro areas, only Columbus, OH, and Raleigh, NC, have no pro teams for football, basketball, or baseball. Both are recent entrants into pro hockey, but only hardcore hockey fans would know that. I've visited Columbus many times since they joined the NHL in 2000, and I've never heard anyone talk about the Blue Jackets. I'm guessing the reaction has been similar to the Hurricanes in Raleigh-Durham, although they probably found some fair-weather fans when they won the Stanley Cup in '06. Columbus also hosts a recent pro soccer team (aren't they all recent?), but that can't carry much weight in the overall assessment of sports obsession.

Both areas do have passionate fans of local college sports, of course. However, the fact that the Duke-Carolina rivalry is so intense among neighbors makes the sports culture there seem more like the Hatfields and the McCoys, jockeying for status at the neighborhood level. Buckeye football in the Columbus area ignites as much passion among fans, but it is all directed toward a real out-group (Michigan), rather than a close neighbor from the in-group.

I'm going to call this one in favor of Columbus, whose (relatively) minimal level of status-striving is even more striking when you consider that the metro area population is 50% larger than Raleigh-Durham's (and ranks 36th in the nation), which ought to tilt it toward wanting a piece of the action in big-city contests.

I grew up in a suburb of Columbus during elementary school, and no matter how many times I've visited over the intervening 20-odd years, I've been struck by how (relatively) resistant the area has been to hipster / striver colonization. Not that the neighborhood mall hasn't been converted into a lifestyle center with a Whole Foods anchor and a nearby Peet's Coffee shop. And not that downtown looks as unpretentious as it did back in the '80s, when it served as the all-American setting for Family Ties.

But it's still remarkable how few hipsters you see preening around town, how relatively unobtrusive the faggot presence makes itself, and how plain-looking the customers are dressed even at trendy hot-spots like the Mongolian and Brazilian barbecue restaurants. Most decent-sized mom-and-pop grocers have long gone out of business, but Huffman's Market, across the street from my elementary school, is still drawing loyal customers despite the nearby Whole Foods. In that same shopping center, there is a real Midcentury diner (Chef-O-Nette) that has continued to pull a crowd since 1955, the local residents seeing no point in a pretentious, over-the-top fake '50s diner like you find in much of the rest of the country.

Does Columbus simply benefit from being located in Ohio / the Midwest rather than North Carolina / the Southeast? Somewhat, but not really. Columbus actually is distinct from the surrounding region in never having jumped on the status-striving bandwagons of one age or another, chanting we're #1 at ____! Nearby Cleveland, Cincinnati, and Pittsburgh are all victims of the Rust Belt -- they all grew way too fast and specialized way too much in out-competing the other big cities, back in the last heyday of competitiveness during the Gilded Age.

Columbus never drank the Kool-Aid about growing as big as possible, as fast as possible, and specializing as much as possible in some grand get-rich-quick scheme, whether it was manufacturing or high tech. As a result, it was never over-built, did not fall victim to the Rust Belt collapse, and right up through the current recession it has fared far better than the rest of Ohio and nearby metro areas.

Not surprisingly, the three big metro areas nearby also show a greater obsession with pro sports. Cleveland has teams in all of the big three leagues, while Cincinnati and Pittsburgh have football and baseball teams. (Pittsburgh also has a hockey team, though that's less important.)

And, strange as it sounds to link hipsters with pro sports fans, those other areas are much greater targets of hipster colonization. Cleveland, and the broader region including Akron, has long been home to arty bands such as Devo and the Waitresses, as well as Chrissie Hynde from the Pretenders. Pittsburgh has become a local mecca for hipsters -- was it the ironic tribute in the movie Wonder Boys that drew everyone's attention? (Or its sincere portrayal in My So-Called Life?) I can't say about Cincinnati for sure, but my sense is that it isn't quite as bad there (from what I hear / don't hear). Nearby Dayton is fairly hipster-free and pro sports-free, but it's much smaller.

Transient cultural one-uppers, just like folks who derive much of their group identity from affiliating with a pro sports team that is made up of outsider mercenaries, reflect a profound lack of communal bonds that stretch back into the past and will grow on into the future. The relative absence of such groups is a sign of a healthy, self-sustaining city. Looking into the sociology of folk culture not only helps us understand our history, it points us toward better choices regarding the future.

August 6, 2014

Can kids these days distinguish between lies and mistakes? With a look back at youth disillusionment across the generations

I've been struck several times by how my 6 year-old nephew interprets the errors made by others as lies rather than mistakes.

For example, his friend screams that he hears the ice cream truck. When my nephew races out with some money and there's no ice cream truck music, he gets pissed and says, "He lied!" I was there, and his friend was clearly confused and acted hastily in excitement, rather than playing a trick. I heard his friend accuse him of lying, too, in a similar context. So it's not some flaw unique to my nephew.

If you make a prediction or a plan that doesn't end up taking place, and in a way that makes him upset, he'll accuse you of lying, on the assumption that you had perfect knowledge of the future.

Those are cases of informational or factual errors, but he behaves this way just as well when someone is breaking a rule or norm. At day care a group of kids were playing the board game Sorry, and one girl played as though the rules allowed you to "slide" on your own color, which they do not. She could have been ignorant of the rule, or maybe they play by modified rules at her house. But my nephew accused her of lying.

Lies are errors made with the knowledge that they are errors, while honest mistakes are made with the belief that they are correct. Why can't kids who are already in their early grade school years draw this distinction? I don't recall accusing others of lying on such a regular basis, at such a late age, as kids do today.

My first thought was that kids today are more in the autistic direction than those born from the late 1950s through the early '80s.

One core feature of autism is the inability to take another person's point-of-view, in a purely factual sense. The "false belief task" has a child watch a grown-up and another child playing with a toy. The grown-up puts the toy in one place, which the other child sees. The other child leaves the room, and the grown-up switches the toy to another place. The target child is asked, When the other child comes back into the room, where will he look for the toy?

Before their mind-reading abilities have matured, children say that the other child will look in the second location, when of course he will look in the original spot. The target child cannot take the perspective of the other child, who has incorrect beliefs. He expects the other child to have changed his beliefs in perfect sync with the changes in the state of the world, between the time he originally formed the beliefs and now when he's applying them. Small children cannot appreciate that other people may hold beliefs that are out of touch with recent changes.

So perhaps kids these days are just maturing a lot slower than they used to. They should be able to tell the basic difference between lies and mistakes by age 3 or 4, but it looks like the earliest they'll reach that state is by 7 or 8 — or later still. It could be one of those "early sensitive windows" of development, where lack of early experience results in a failure to develop much at all.

The lack of social interaction here is caused by the general climate of cocooning, and especially the helicopter parents preventing their kids from playing with other kids. Even when they do allow social contact, much of the communication and interaction is mediated through and controlled by the grown-ups. They're simply carrying out the orders of the adult supervisor, rather than figuring out how to behave on their own, with experienced adults serving only to set the broad guidelines and boundaries within which self-directed interactions take place.

It's like the parents doing their kid's math homework for them, so that the kid doesn't learn addition and subtraction. Although unlike arithmetic, basic social skills cannot be picked up at any age. The blank slate view of helicopter parents leads them to believe that developmental delays are just drawing out the inevitable — you can stop having to schedule play-dates when you're 30 — rather than depriving their children of necessary early experience, which will warp their development for good, to a greater or lesser degree.

The other possibility is that kids today are growing up in a climate of increasing competitiveness, status-striving, and dog-eat-dog morals. They might not have a cognitive difficulty in taking the other person's perspective — they may want to assume the worst, to make the other person look bad and demote their status in the zero-sum competition.

Also, if advancing my own awesomeness is the main goal, then when someone else's predictions or plans fall through, hindering my gratification, they must have intended to trip me up. Haters gonna hate.

We can pit the two possibilities against each other by looking at the early Baby Boomers. They grew up in a fairly socially isolated world, especially regarding children playing unsupervised with one another, and having to figure out other people's thoughts rather than have the grown-up supervisor tell them the answer. Yet their formative years in the '50s and '60s also saw the minimum of status-striving and dog-eat-dog morality (although they and the Silents would usher in a decisive break from those ways during the '70s, with the Me Generation).

I don't have a good feel for how they would've interacted in everyday social situations where they had to distinguish lies from mistakes. Watching old episodes of Leave It to Beaver or Dennis the Menace might fill in some gaps there.

But there is plenty of enduring evidence from how they responded to the Vietnam War, which then spread into other areas of public affairs. The general tone among youngish people was that they were being knowingly lied to, not that the officials and lever-pullers were ignorant, clueless, naive, and so forth. "Don't trust anyone over 30" meant that they were in some way out to get you, knowingly misleading you.

Certainly there were plenty of lies going around about the war and the state of things in general. But those were more like attempts to alleviate an acute symptom — some embarrassing or damning event happens, and we've got to cover it up to make things look all hunky-dory. Overall, though, the best and the brightest were true believers that they had all the knowledge they needed, that they were powerful enough to do whatever they wanted, and that blowback could be easily contained. They were full of hubris, not deception, and the youngsters were wrong to accuse them of systematically lying.

On the side of the youngish people, they seemed truly, earnestly mystified at being told so many falsehoods. If they were accusing their elders of lying as part of a dog-eat-dog status contest, I don't think they would have been so genuinely surprised that other people could so firmly hold inaccurate beliefs, just like the 3-year-old who can't believe that the other kid is going to look in the original spot for the hidden toy! The reaction would have come off as a cynical, petty campaign to beat their elders with whatever flimsy stick was lying within reach, as we see these days when political party A acts all shocked, shocked! to discover unseemly behavior among party B, and disingenuously vilifies them through the press.

We see an attitude similar to the one of circa 1970 in youngish people's reactions today about the false promises of higher education and student loan debt. It seems to be one of accusing their parents and teachers of having lied to them. "We followed all the rules and advice, yet here we are stuck in a rut. I mean, what the heck, how else do you explain it other than our parents and teachers blatantly lying to us?"

As in the late '60s, young people's accusations of lying don't feel like a petty attempt to slander others in a game of one-upmanship. They seem genuinely surprised that their elders could so firmly hold a set of false beliefs regarding the risks and rewards of higher ed — beliefs that came from a time when the risks were lower and the rewards higher. This points to a root cause that is more autistic than Machiavellian.

Where does that leave good ol' Generation X? Our disillusionment unfolded much earlier in life, right around puberty, rather than in young adulthood — another sign of how quickly folks used to mature. And we were less likely to accuse our elders of lying to us about the state of the world and how it worked. Rather, when they were committed to a set of false beliefs, we thought that they were clueless, naive, and out of touch, perhaps dangerously so. They grew up in the squeaky-clean 1940s, '50s, and early '60s — not the gritty '70s, '80s, and early '90s. Not only could we take their perspective, we could intuit where their mistaken views of the world came from ("that was then, this is now").

They were making honest mistakes, and however much we felt harmed by those mistakes, we didn't attribute them to malice, which would have required righteous indignation to correct. We just wrote off their presumed wisdom — "You don't have to be old to be wise" — and pragmatically tried to adapt the best we could to the changing ways of the world, relying mostly on our fellow young people, who were not already committed to inaccurate beliefs about how the world works.

Little of this had to do with large-scale political and economic events, by the way. It was mostly about the crime wave and related phenomena — drug use, child abuse, teenage pregnancy, suicide, and all the other staple topics for "very special episodes" of 1980s television and pop music.

Perhaps there's a difference when young people are faced with a set of mistaken beliefs that their elders hold in the large-scale political-economic realm, as opposed to the on-the-ground social-cultural realm. But the simplest explanation is that in the absence of helicopter parenting, Gen X and the late Boomers had richer social experiences at young ages, allowing the social lobes of their brains to develop early and fully by adolescence.

That not only gave us a more mature mind when it came to taking other people's perspectives, but also when it came to dealing with the false beliefs of others. If you find out they're mistaken, make a note of it, ignore them in that area of life, and move on. Don't get all furious at them as though they were deliberately lying, and don't keep harping resentfully on their wrong-doing. Never attribute to malice that which can be explained by incompetence.

August 4, 2014

Conservative criticism of day care should emphasize the lack of community

Although I haven't really dived into the literature on the effects of day care, it seems clear that the major studies look at the kids' cognitive and emotional development as the outcome variables.*

That is too narrow a focus, since socialization does not merely mean "getting along well with generalized/abstract others," but becoming integrated into a community of one's particular peers and elders. We could go further to include becoming integrated into a particular neighborhood or region -- a particular environment that the child feels rooted in.

An earlier post about the lasting effects of divorce on children made the point that social scientists ought to move beyond strictly individual traits (the best example being intelligence) to include social relationships at varying levels above the individual (like integration within a community).

How can this approach be applied to the divisive topic of day care for children?

I think it says a lot that no one has many fond memories of day care, in contrast to grade school. School is not a glorified form of babysitting, since schools make some kind of effort to enculturate the students -- in both works of formal culture and informal folkways -- and to get them to feel and act as part of a cohesive group of youngsters.

Day care administrators and workers hold no pretense about the children interacting with each other outside of the center, or inside for that matter. No effort is made to make sure the kids know each other's names. They don't expect kids to remember the "counselors" either. They operate more like a pet hotel, in keeping with parents nowadays viewing and treating their children as pets. The social atmosphere is atomized and high-density -- how are normal human beings supposed to come out on the other side?

I have a very good memory, and went through several years of day care. Yet I don't remember anyone's name, whether a student's or a worker's. I didn't meet any of the students outside of center hours.

I remember exactly two faces: that one girl who invited me under the table during naptime for a game of "I'll show you mine if you show me yours," and one of the workers who gave me a warm hug during a field trip. I didn't need it at the moment, it was more of a maternal "just cuz" sort of gesture. That stuck out in my mind because typically the day care workers don't see you as an emotional creature -- more like something that has to be distracted ("entertained") and ordered around until it's claimed by its owner.

I don't recall many experiences other than those two above, although I do remember more about the physical environment (the indoor rooms and the playground area outside). Can't say I have any fondness for those places, though.

I never wanted to go to day care, and always felt that pick-up time was like being rescued.

In all of these ways, day care was the opposite of school. Most people like going to school, however much they may bitch about certain aspects of it. There's a certain anticipation before each school day starts, and definitely by the end of summer when you can't wait any longer. It was not uncommon to feel like hanging around school after the day was formally over, and getting picked up by your folks did not feel like they were rescuing you.

We can't begin to count all of the major and minor experiences we remember taking place in and around school, and beyond its walls in the company of our friends from school. We remember all sorts of gross and fine details about the physical environment at school, and generally have fond memories of those places.

We remember scores of names and faces, both from our peers and our teachers, and have yearbooks just to make sure. And our teachers cared more for us, whether they were gentle or stern, than day care workers. We made friends with our schoolmates, our parents kept in touch with other parents -- and teachers -- and there was a phone directory to facilitate these interactions outside of the school setting.

Not to mention the teams, mascots, school colors, and so on and so forth that gave us a more palpable group identity.

This stark contrast between schools and day care centers should be emphasized by conservatives. Jonathan Haidt's work on variety in moral frameworks shows that cons are more sensitive to a framework based on valuing and strengthening the in-group, whereas libs are either numb to the sense of belonging, or are actively hostile to it. (The former are standard liberals, the latter are more like libertarians.)

If one of the goals of socialization is to give youngsters a place to fit in within a larger group, day care begins to look way worse than it already did, a place that had left even liberals nervous. It's atomized, each kid is looking out for himself, and only the Dickensian supervisors keep open conflict from breaking out. Prison, pet hotel -- take your pick of metaphor, but none of them bring to mind a cohesive in-group.

Day care would also make for a good place to find common ground with the minority of liberals who have half a lick of common sense. It's difficult even for rationalizers to make day care look like anything other than a way for parents (especially mothers) to pursue greater levels of status-striving, unfettered by having to pay any attention to their kids' immediate needs, and during such a sensitive and impressionable period in the child's life.

Ideally, we wouldn't have to subsidize mothers with "maternity leave" payments to stay home with their young children. Status-striving would ideally be at a low enough level that it wouldn't occur to mothers that they'd need to be bribed into doing so. But I wouldn't be above granting an increase in maternity leave in the short-term, as long as it had a phasing-out built into it. We have to get there from here, and mothers are still pretty careerist.

That would also be a nice way to start diverting public funds from the Silents and Boomers, who have been absorbing them way more than any other group in history, for decades now, and direct more of it toward the financially unstable generations after them, who will also be faced with paying down the massive debts of their "boo taxes" elders.

Regardless of how it works out in policy, the movement against day care seems like a no-brainer for folks who are sick of the morally lax society that we have become.

* Not surprisingly, kids who spend a lot of time in day care have more emotional problems, which last into adolescence (and who knows, perhaps longer). Those could be due to selection bias, whereby troubled kids are more likely to get dumped for long hours in day care by parents who'd rather not deal with them directly.

Whatever the cause, it's still a fact that sending your kid to day care amounts to giving him a peer group that is more defiant, argumentative, and prone to acting out than the one he'd get from socializing around the neighborhood while spending the day at and around his own home.

Then again, most psychological liberals don't see anything beyond the individual, such as quality of peer group. They're only concerned with how day care may or may not help in making the kid smarter and better behaved.

August 2, 2014

Jurassic Park as an intro to Studio Era visual style

While I was watching Jurassic Park with my 6 year-old nephew today, I noticed how long some of the shots lasted, and how sparingly close-up framing was used -- there were far more long and faraway shots than in a typical movie from the past 40 years. It seemed like every other shot had multiple characters in frame, who were also moving around to create interesting spatial arrangements (rather than moving purely naturalistically).

Since most viewers are unfamiliar with the movement and pacing of Studio Era Hollywood, they need a reminder to anchor the style they're seeing to an old movie they have seen. As the towering, torch-lit gates to the park are opening up, one of the characters asks, "What have they got in there, King Kong?"

Here is a short article by Warren Buckland that quantifies these aspects of the movie, while comparing it to The English Patient, which is more typical of contemporary filmmaking in having shorter shots, closer-up framing, and less camera movement.

Whether you want to give your kid something to watch that won't warp his attention span, or you're looking to get more comfortable with the old approach yourself, Jurassic Park is a great introduction.