There's a buzz in the blogosphere about three opinion articles that Charles Murray has written in the WSJ relating to intelligence, education, and responsibility. The first asks us to come to grips with the boring truth that for normally distributed traits like IQ, half of the population will be below average, and that this will constrain our ability to have all students reach a decent level of academic achievement. The second moots the merits of over-emphasizing a four-year college education for those who will continue their education after high school. And the third focuses on how and why we should attend more carefully to cultivating the cognitive elite who will run society.
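That "half below average" point is just a property of any symmetric distribution: for a normal curve, the mean equals the median. A quick simulation (my own illustration, not anything from Murray's articles; the mean-100, SD-15 parameters are simply the conventional IQ scaling) makes it concrete:

```python
import random
import statistics

# Illustration only: for a normally distributed trait, the mean equals
# the median, so about half of any large sample falls below the average.
random.seed(0)

# Simulate IQ-like scores on the conventional scale: mean 100, SD 15.
scores = [random.gauss(100, 15) for _ in range(100_000)]

mean_score = statistics.fmean(scores)
below = sum(score < mean_score for score in scores) / len(scores)
print(f"mean = {mean_score:.1f}, fraction below mean = {below:.3f}")
```

No amount of schooling changes this arithmetic: raise everyone's score and the average rises with it, leaving half the population below it all the same.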
Last June I wrote an entry on what makes a good teacher, which pretty much covered Murray's first two points, so I'll just make a few comments on them. He's surely exaggerating, or perhaps worded his statement poorly, when he says in the first article, "If you do not have a lot of g when you enter kindergarten, you are never going to have a lot of it." Most IQ tests aren't well validated for people under 16, let alone for 5-year-olds, and it's really by late adolescence or early adulthood that a person's IQ reaches its adult value. Five-year-olds are also unaccustomed to testing procedures and strategies, and especially in the case of young boys, their disposition is to tell the teacher to go eat it. Thus, other measures should be used to make educated guesses about who will turn out by late adolescence to have a high IQ -- perhaps elementary schools could hold Tetris competitions. Boys love competing against each other in video games, and though a crude measure, skill at Tetris must surely involve a large amount of visuospatial intelligence. For girls, indicators of verbal intelligence would better discover the smarties -- perhaps a short-story writing contest. Math skill should not enter into the process in either case.
Also, again though not a pure measure of g, creativity does depend on intelligence, so perhaps the PTA could send out questionnaires asking parents if their kid is obsessed with creative endeavors. When I was about 8 or 9, I thought I was going to be a video game designer when I grew up, so I made up a video game loosely based on Bonk's Adventure: there was a rough plot of goals and obstacles, a cast of characters, fully mapped out levels, and an instruction manual that I stapled together. OK, so it wasn't like Mozart composing symphonies at the same age, but this streak of mine was clearly a better predictor of my adult GRE scores than the SAT score I got at age 12. I don't attribute the failure of the latter just to the fact that I couldn't have been expected to know high school math and vocabulary at that age, but also to the fact that I hadn't taken a standardized test before, and that my personality programs me to hate and rebel against such testing.
Now, if a 12-year-old does score highly on the SAT, that can't be due to chance, but we can't infer much from a failure to score highly on the SAT or other IQ tests with a substantial loading on crystallized knowledge such as mathematics, vocabulary, and academic trivia (like world capitals). Raven's Progressive Matrices, a pure pattern-recognition test, should be used instead, though if the kid hates taking pointless tests, again simply noting whether your son is a whiz at Tetris, or whether your daughter writes short stories, could be enough information to make a good guess that they'll be more suited to honors or gifted classes once they begin secondary schooling.
Turning to the second article, Murray is spot-on when he notes that many who attend four-year colleges would be better served by vocational schools. Incidentally, I was thinking not so much of the "paralegal" or "craftsman" type of vocational school, but of the various Master's degree programs that are increasingly popular (the ones advertised in subway cars, for instance). Unlike doctoral and professional programs like those leading to a J.D., most of these Master's programs don't depend on four years of undergraduate study. Think of what a waste it is to major in education or communications: these are not rigorous fields where undergrads learn lots of math in preparation for graduate study, or where classicists must memorize the history of civilizations and master dead languages, so interested students should be able to skip straight to MA programs, study their subject intensively for a year or two, accrue far less debt, and enter the workforce four years earlier.
Cheerleaders of the four-year liberal arts education for the majority of post-secondary students would claim that, even if a graduate of such a program is ill-equipped to enter the workforce, at least he'll have learned the art of critical analysis and persuasive exposition. But they would be wrong. Remember, we're only talking about graduates from non-rigorous programs -- for example, English literature or political science -- where rational thinking and cogent presentation have been dethroned in favor of allegiance to theoretical fashion norms, obscurantist gobbledygook, mealy-mouthed "where is the thesis?" theses, and prescriptions founded on "assume a can-opener" premises. Such undergraduate programs -- among the most popular, mind you -- should be viewed not as an enrichment of the mind but rather as a clinically diagnosed form of brain trauma, and most students would be better off never setting foot in the classroom of the average English or PoliSci professor.
Lastly, Murray's third article advocates "a revival of the classical definition of a liberal education, serving its classic purpose: to prepare an elite to do its duty." I quite agree with him that we should emphasize to the cognitively gifted that they should consider themselves fortunate for having been blessed by the actions of their genes and good luck in development; this emphasis counteracts their inclination to believe they're smarter than average because they worked so hard, while others are just lazy or don't value education as much as they do. This is the secular version of a privileged person looking upon a beggar, thinking "There but for the grace of God go I," and feeling obliged, out of empathy, to help. The more we obscure the central role of general intelligence in attaining high social status, and the central role of genes and lucky development in attaining high intelligence, the more likely we are to react callously to the financial hardships of the cognitively short-changed. We're likely to say, "They brought it upon themselves," or "Why don't they just go to college and make something of themselves like I did?"
However, I'm less optimistic than Murray seems to be about how far this cultivation of responsibility will go. Restrict our attention, for argument's sake, to those with IQs in the top 10% and the jobs they will fill as adults. As long as there is variation in personality traits among these smarties, as well as in the personality traits selected for by different professions, I think it likely that those in positions of power will continue to be selfish bastards. The retiring types will shuffle themselves into a career in computer programming or accounting, while sons-of-bitches will beat a trampling path toward a career in social control, either of the direct variety (politics, executive-level management) or the indirect variety (PR, advertising, and other "managers of public opinion"). Being sons-of-bitches, they will be nearly impossible to persuade to rule in the public interest. Most progress in this area of social life has been made by devising restraints on their desire to screw others over in order to enhance their power and wealth even more (checks and balances, limited terms served, election by a constituency, and so on).
It should be easier to instill a sense of responsibility in other groups of smarties -- say, engineers (I'm not an engineer). Most people will tell you that engineers and other MIT / CalTech geeks are basically amoral: they don't care much about others, but they're not power-hungry megalomaniacs. Moreover, they are in a much better position to positively change society -- who invented the wheel, air conditioning, the internet, hygiene products, nuclear & solar power, and perhaps before long genetic enhancement? These two facts are far more promising initial conditions for someone who wants to take up Murray's task. Consider just the last example I gave, that of genetic enhancement: strangely, this is the only method of exploiting our biology for the better that many object to. No one cries foul when we research which vitamins, minerals, and other foodstuffs are more likely to give us better health and longer lives. When we take prescription drugs to soothe arthritis or ease the strain on our hearts, again not a peep from the bioethicists. Ditto for administering vaccines or antibiotics to combat infectious disease. Or, for that matter, brushing our teeth and seeing the dentist twice a year.
Yet no other improvement of our biology has more power to render our worries about inequality obsolete than genetic enhancement does. If the population were close enough to monomorphic for genes related to IQ, then Murray's entire first article would go out the window. Not being a geneticist, I don't know how close this is to becoming a reality for most people. The point remains, though: it is a hard, but not intractable, problem to solve with genetics, while it is an impossible one to solve with education, whose effects are limited by the pupil's underlying biology. At the same time, that day is not tomorrow, so until then we must face the issues that Murray has brought up. In my next post, I'll explore these questions in more depth for a particular case: that of math education.