November 26, 2014

Immigration policy — for cheap labor or cultural replacement?

In the search to track down the traitors who are selling out our country to hordes of foreigners, conservatives can mislead themselves into targeting primarily "cultural Marxists" — those who loathe Western, white, male, hetero culture, and want to replace it with something they deem superior, based on its opposites.

Not that that crowd isn't on board with amnesty and immigration — of course they are. But a bunch of limp-dick intellectuals in San Francisco don't have the wealth, power, and influence to control political and economic activity at the highest level. They only serve the powerful by providing an intellectual basis for the policies that were going to take place anyway, to make them sound like a logical necessity rather than a naked power play.

Culture-war conservatives should sober up by looking at the twin policy of immigration — off-shoring, especially of labor but also of tax status. It isn't only hordes of foreigners flooding in, but boatloads of jobs setting sail for far-flung dirt-floor countries. If immigration policy were primarily about replacing the native population with a foreign population more to the liking of the powerful, then why not bring all those beneficiaries of off-shoring right here to the USA?

The answer is that sometimes it's cheaper for their employers to bring the foreigners here, and sometimes cheaper to keep them where they are over there. A contractor who hires dry-wall workers cannot off-shore those jobs to Mexico, India, or China, because the dry-wall work must be done right here. Same with strawberry pickers, meat packers, fast food workers, lawn cutters, and leaf blowers.

But if the work can be done at a distance, the employers are happy to send the work overseas without the whole troublesome business of importing foreign workers to America (like having to pay them wages pegged to the American rather than the Indian cost of living). Answering phone calls to customer service and writing computer code naturally lend themselves to distance work. So does manufacturing, as long as shipping the goods here isn't too expensive (shipped in bulk, protected and organized efficiently through modern containerization). This includes industrial products, consumer electronics, and pharmaceutical drugs.

While the cultural replacement view cannot account for such a dramatic split between the twin policies of immigration and off-shoring, it follows straightforwardly from the cheap labor view that we normally associate with leftist or liberal criticism of immigration (such as it is).

There are other clear signs that the powerful don't care that much about replacing the native culture with a foreign one. Why aren't schoolchildren compelled to be bilingual in one of the languages of our new neighbors or trading partners from Central America, China, and India? Foreign language classes are a total joke, are only required for a couple of years, and students are not tested for proficiency at all. If replacement were the goal, school boards would eliminate French and German in favor of Cantonese and Hindi. The powerful may want us to be more sensitive to and aware of foreign cultures, but not to actually become a foreign culture, which would require a lingua franca.

Deserting the battle over cheap labor, immigration, and off-shoring in this second Gilded Age of ours will earn conservatives a one-way ticket to irrelevance and impotence in the broader culture. So how can they present their criticism in a distinctly conservative rather than leftist way?

I think the main difference is that leftists blame only the shareholders and managers of Big Business for cheap labor policies. As the agents who bully the government into opening the gates, they certainly deserve a good deal of the blame.

But what about ordinary consumers who clamor for ever cheaper products and services — and to hell with it if that means the companies they buy from will employ workers from dirt-floor countries, whether bringing them here or sending the work over there? It's not as though the bulk of the American middle and lower classes would even consider, let alone carry out, a boycott of companies that provide cheap junk made by careless foreigners.

"Hey, it's cheap, isn't it? It does the basic thing it's supposed to do, doesn't it? Then who cares if Chinese or Indians or Mexicans had to make it. Now I can buy ten times as much junk. If Americans made it, I could only afford one-tenth of the junk pile that I currently enjoy."

Middle-class callousness toward the consequences of their everyday purchases of goods and services on the demand side is almost as responsible for the cheap labor policies as Big Business greed is on the supply side. Not to mention the phenomenon of middle-class individuals employing cheap foreign labor as lawn cutters, dry-wallers, and babysitters in their own homes. That's not the outcome of a corporate board meeting on Wall Street.

The conservative response in the battle over cheap labor will not target only the wealthy in a class war, but will also try to humble the middle and lower classes, holding them accountable for the callous preferences that have provided the fuel for the greed of Big Business.

Now, blaming everyone instead of a small easy target may seem like a losing strategy, but as long as it's based on humility and redemption, it can catch on at the grassroots level. An ordinary individual or family cannot meet with a politician the way that a corporate lobbyist can, but they can passively change their consumer practices and actively boycott companies that go against those wishes. "Boycott Chinese junk" would go a long way toward returning that work to American soil.

The leftist response to cheap labor, aimed only at the very top of society, is ultimately more hopeless. It relies on corporate containment policies at the very highest levels of government, or else violent disruption of the shareholders' and managers' lives. During the last peak in inequality circa 1920, we saw armed strikers shooting it out against paramilitary armies, as well as anarchists lobbing bombs on Wall Street and assassinating politicians.

During the Great Compression, when inequality reversed and economic and political life became more stable, there were definitely large-scale regulatory programs by the government to rein in the greed and manipulation of Big Business, not to mention much higher income tax rates than we have seen since the '80s. That is the slice of Midcentury life that leftists and liberals can warm up to.

What they don't see is the grassroots change in preferences toward solidifying the culture through excluding foreigners and not buying stuff made in the third world, even if it meant more expensive products and services.

By the Midcentury, the days of hiring cheap servants recently arrived from Ireland, or cheap steel mill workers fresh off the boat from Poland, were long gone. As detailed in this profile from Fortune magazine in 1955, even elite executives chose to live in more modest houses and to employ fewer or no servants, compared to the decadent ways of the early 20th century — then still in living memory.

Middle-class preferences began to take account of the socially corrosive consequences of acquiring as much stuff as cheaply as possible. And they came to view such pursuits as debasing to the individual. Those who still tried to cling to the old ways, a la Pottersville from It's a Wonderful Life and Norma Desmond's palazzo from Sunset Boulevard, were subjected to shaming in popular culture.

Liberals see selfishness only in others, while conservatives see it as part of the universal human tendency toward sin. Emphasizing this difference will keep the battle over cheap labor from descending into class war against the rich.

November 22, 2014

Wannabes and absentees: Do It Yourself and Pay Someone Else

During the shift toward status-striving of the past 30-odd years, there have been two huge changes in the way that services are performed. One is to Do It Yourself, the other is to Pay Someone Else. At the level of outsourcing vs. doing something in-house, these two are opposites, so there must be a unifying common theme at a higher level. That's where the link to status-striving lies.

Where have people increasingly opted for the DIY "solution"?

Home improvement — beyond simple maintenance and repairs, homeowners now remodel and build additions. They are also inclined to build their own furniture and fashion their own small decorations.

Specialty mechanics and electronics — modding your car, modding your computer or video game console or phone (perhaps tinkering with the hardware, but usually futzing around with software settings). "Developing" your own digital image captures (hours of dicking around in Photoshop), repairing your own intricate camera and lenses ("a little WD-40 ought to silence that squeaking..."), and photographing something complicated and important. Putting together a full mechanics' workshop in the garage.

Health — except for major trauma, diagnosing and treating malaise or illness is now done through researching a bit online, then drawing up a list of the right mix of foods, vitamins, supplements, and pills. Faddish psychobabble therapies can be added if the pain is mostly emotional.

These services require years of acquired knowledge, experience, and skills, not to mention tools and materials that are relatively expensive, hard to find, and difficult to understand and use. They are done — or used to be done — by artisans and professionals, and confer status on the DIY-ers.

What about the trend toward Pay Someone Else?

Child-rearing — daycare workers, nannies, school teachers (substitute mothers), and coaches and tutors of varying specialties (substitute fathers) now perform most of the day-to-day and face-to-face activities of raising children. That's in addition to parking your kids in front of a glowing media screen, an even more flagrant form of outsourcing your parenting duties.

Meal cooking — hardly anyone cooks meals at home anymore; food comes instead from fast food chains, microwave meals, and already-prepared meals / hot bar items at the supermarket.

Housekeeping and yard work — women who sweep, vacuum, clean counters, and scrub toilets and showers, as well as men who ride lawnmowers and aim leafblowers.

These services are unskilled and require the use of tools and materials that are cheap, plentiful, and simple to use. Their main input is labor (time and effort), so they would subtract status if the Pay-Someone-Elsers were to do it themselves.

The psychology is easy to understand. Status-strivers want to do the professional work themselves, to reap the benefits of branding themselves as artisans. They shed the status deadweight of unskilled work through outsourcing, even if it means neglecting their familial duties.

There are far broader implications for the economy, though, not just changes in how annoyingly grandiose and shamefully neglectful your fellow neighbors, co-workers, and citizens have become.

The DIY movement has wiped out man-hours of work that could have gone to true artisans and skilled workers, and lowered the asking price of the labor they can still sell to customers, who now expect dirt-cheap services since "I could always just do it myself, y'know." If the service proves too complicated to DIY, the customer will just opt for replacement rather than repair (fueling planned obsolescence). That eliminates the once common fix-it shops as a way of making a living.

The Pay Someone Else movement has swollen the number of man-hours going into unskilled labor, which already pays poorly, offers little to no benefits, is temporary / high-turnover, and comes with uncertain job security.

The outcome is widening inequality, as skilled jobs are replaced by unskilled. We usually associate that with heartless managers of large companies decomposing a skilled task into separate rote tasks through mechanization. But here we see just how deep the rot goes — even ordinary individual consumers are such self-regarding skinflints that skilled tradesmen must debase themselves into unskilled laborers in order to satisfy the status-enhancing lifestyles of today's middle class.

Sure, skilled artisans can still find work with the top-top-top level elite, whose budgets are unlimited and who are more inclined toward conspicuous leisure. But that's not a large market. It was the middle class market for skilled services that used to support an electronics repair shop, photography studio, and carpenter's workshop.

Unlike the decadent and parasitic elite, the middle class actually produces for a living, and can't indulge so much in conspicuous leisure. So, conspicuous consumption it is. Middle-class folks have never been more profligate with their disposable income, loans, and credit — yet they would take it as a personal defeat to have to hire a carpenter to remodel their cabinets. All that status item spending has drained the portion of their income that could have gone toward skilled services and production.

After all, why buy one good pair of shoes made by skilled Americans when you can buy five pairs made by unskilled Salvadorans? Owning a single pair of shoes prevents you from participating in the fashion treadmill. So does owning a single professionally made dishwasher for 30 years — if the workers are barely skilled and their product breaks down every five years, that's just an opportunity to UPGRADE DAT SHIT and impress your friends. For status-strivers, crummy products are the gift that keeps on giving.

November 18, 2014

Zen and the art of trail maintenance

For the past couple weeks I've been spending three to five hours most days on a project to restore an abandoned trail that my peers and I took for granted in middle and high school, but that has since fallen into ruin.

Clearing thorn bushes and the ubiquitous invasive vines along the edge of the woods, so that the portal into the trail can be seen and easily walked into. Logging the fallen and leaning trees (and some dead standing ones), plus all the branches strewn along the tread. Re-positioning logs so they don't dam up a bunch of leaves and water whenever rain flows downhill. Dislodging large hanging branches so that every step doesn't feel like you've got the sword of Damocles dangling over your head. Raking away all the debris that not only obscures the tread but makes it slippery to walk over — leaves, rocks, sticks, etc. Clearing leaves out of drains...

I've noticed for a while how abandoned the woods have become, but now that I've started to try doing something about it, in a place that I knew well as a teenager, my mind struggles to comprehend how many areas need attention, and how effortless it used to be in the good old days when everybody pitched in here and there.

Part of the cause is the status-striving and inequality trend. Government funding for trail maintenance and similar programs that benefit ordinary middle-class people has dried up, when those funds could instead go toward subsidizing risky mortgages, or bailing out the Wall Street firms that bet on those mortgages.

Status-striving also leads to a withdrawal from civic participation (time, money, and effort go toward self-advancement instead), a la Bowling Alone, so don't expect to see legions of volunteers regularly pitching in. Even the Boy Scouts these days seem to be more about fundraising for the Boy Scouts (how many bags of popcorn should we put you down for?), so they can attend the national Boy Scout jamboree, than about practicing stewardship in the communities they live in.

The collapse of deliberate and concerted maintenance wasn't so noticeable in the context of woodland trails during the '80s and early '90s (well into the era of civic disengagement), because the society was in its outgoing phase of the cocooning-and-crime cycle. Every trail-goer who kicked a branch out of their way or bashed up a thorn bush in their way kept the trail in decent shape. Since the return of cocooning and helicopter parenting over the past 20-25 years, though, hardly anybody wanders back through there, so there aren't even the unwitting volunteers to keep it people-friendly.

I haven't been posting or reading comments here during this time because hours and hours of barely skilled labor in the woods is one of the most fulfilling activities I've ever done. It reminds me of my childhood when my Pap used to take me back into the Appalachian woods and we'd cut back thorns, push over dead trees, and take care of other public-space groundskeeping.

Although ideas for posts strike me while I'm out working, and I even elaborate them into fuller thoughts out there, I just don't feel like plugging my brain into the internet when I get home, nor would I feel like plugging in before I left for the day, putting me in the wrong state of mind.

I'll try to write some of this up soon, but in case it takes a while, here is the gist of several posts, to get people thinking and talking again.

- Trail creation and maintenance is rooted in transhumance (not nomadic) pastoralism. Why don't Slavs, East Asians, Hispanics, and blacks seem to care about the very presence of trails through nature, let alone practice stewardship over them?

- The long stretches of unskilled manual labor are rewarding because there's a point, and the effects can be clearly seen and appreciated. Working out at the gym has no point, other than toning your buns. Even the paleo exercises that are more in touch with the tasks we're adapted to do are in the end still pointless leisure. If you want to be really paleo, get some damn work done while you're exerting yourself. It'll give you a sense of accomplishment and pride that you must otherwise force / con yourself into believing when it's just doing X many reps for Y many sets, or the equivalent end-points for a paleo routine.

- The hiker / rockclimber / outdoors culture is also purely leisure-based. Conspicuous leisure and conspicuous consumption (their "gear" is more encrusted with logos than a middle schooler's). And focused entirely on advancing the self in a status contest, rather than stewardship of shared common spaces. Look at how few tools those stores sell.

- Conservatives who do nothing to preserve, in whatever little way they can, the shared common spaces that made this country great, need to answer for their negligence, or be brushed aside as worthless whiners. "There's no point — the country is changing, so what's the point in preserving something that the Americans of 2050 won't appreciate and will allow to fall back into disrepair?" Well why don't we just burn the whole place to the ground, then, including your own house with you still in it? Americans are still going to be around in 50 years, and they're going to be counting on us to keep things as well preserved as possible. Otherwise they'll just go to what is being preserved, i.e. Walmarts. Innumerable people before your time shaped the places you connect with, so you've got to do your part too.

- Getting less political, what accounts for the split between woodsmen, woodworkers, and carpenters on the one side, and mechanics on the other? In the hardware section of Sears (burn down Home Depot), I was struck by how many mechanics' tools there were, and how little any of them resonated with me (my mother's father was a carpenter). People-oriented vs. thing-oriented? Wood in the shop or in the woods as a natural substance, hence more people-like than exhaust pipes and drive trains? At least there's a preference for natural vs. artificial objects to work on.

- Warning coloration to ward off wildlife that may want to tangle with you. There was a decent-sized buck staring me down from about five yards away, who had a high-ground advantage over me, where I also had no maze of trees or anything to run back into or up into. Usually they'll walk a few steps at a time to see if you back away. This time, I'd taken off my coat since it had gotten warm, and I had on a t-shirt with a very busy black-and-white Southwest Indian tribal print. That was the only time so far that I won in a staring contest up that close, at such a disadvantage. He blinked first, turned his head first, and walked away uphill. I wonder if having such a high-contrast pattern, like a skunk or badger on cocaine, played a role in driving him off.

- Why are both the outdoors and hunting scenes so averse to wearing "gear" from animal sources? It seems like they're finally learning about this ancient invention called wool, but it is damn rare to find animal skins or furs at a hunting store, outdoors store, or army/navy surplus store. The answer is not price, since their over-engineered Franken-fabrics cost an arm and a leg. I put it down to the trait of following natural vs. engineered solutions. Animals that have been shaped by millions of years of natural selection to adapt to cold, wet, thorny, outdoors conditions are going to have superior protection compared to whatever the latest lab fad is with consumers who have too much disposable income. We ought to copy what adaptations those animals have evolved — and if we can't copy them, we can just steal them.

November 6, 2014

Ohio court to gay couples: Drop dead

In a landmark decision that will hopefully drive most of Ohio's gay-enabling Millennial generation out of the state, a federal appeals court in Cincinnati has allowed four states (Ohio, Kentucky, Tennessee, and Michigan) to treat gay marriages as illegitimate, following the sentiment of the people.

This may force a decision with the Supreme Court, which may rule in favor of gay marriage. But even if that happens, conservatives in the region should not snatch defeat from the jaws of victory. A ruling against condoning gay marriage all the way up at the appellate level is already sending shockwaves throughout the region (see all the whiny Twitter reactions in the Dispatch article).

Now it is official: no matter what the Supreme Court ultimately decides, Ohio and its Appalachian neighbors have chosen to stand on the wrong side of history. Anybody who wants to stand on the right side can defect and join the liberal transplant hive in a more gay-friendly state.

If you think that gays and their apologists are going to forget this decision when/if the Supreme Court reverses it, think again. Look at how well people still remember the resistance in the Deep South to desegregation in the 1950s. That example is instructive: although local resistance was ruled unconstitutional by the Supreme Court, blacks still figured it wasn't worth the hassle of living there anymore, and continued migrating toward more liberal Midwestern areas.

Letting a group know that they aren't welcome, or at least that they can't push their agenda over the majority, goes a long way toward not having to live with their problems anymore. On the flipside, letting a group know that they are welcomed unconditionally, and that the majority will take all the narrow-interests abuse that can be dished out by the guests, makes it certain that the hosts will have to put up with the newcomers' problems for a very long time.

Chicago only shed large numbers of blacks when they told them that even better welfare policies awaited them up in Minnesota and Wisconsin. There were also enough Irish and Italians in Chicago to give the blacks a little boost out of the state, whereas Minneapolis and Milwaukee have only Nordic pansies standing guard.

Ohio, though, is proving to be less and less Midwestern over time. We see that now from a regional high court more or less giving the finger to the number one trendoid human rights cause du jour. There is a fault-line running through the state from southwest to northeast, with the southern and eastern strip being hillbillies, the southwest being more akin to Louisville, Kentucky, the center area drawing a variety of folks, and the northern and western area being part of the freezing industrial Midwest, now the Rust Belt.

Over the past two to three generations, the hillbillies have been leaving the rural areas and settling down more in the center near Columbus, or further south toward Cincinnati and Louisville. Cleveland in the northeast and Toledo in the northwest keep losing population, mostly out of state to transplant havens in Arizona, North Carolina, etc. Slowly but surely the Appalachian influence is on the rise, and the Midwestern on the decline.

It can be hard for folks not acquainted with flyover country to picture where the rough boundaries of Appalachia are, so here is a map of its counties according to the Appalachian Regional Commission. Most people know that the country is flat along the East Coast, flat in the Midwest, and is hilly or mountainous somewhere in between, but think only of West Virginia.

Notice how much of Ohio is hillbilly territory. You don't see that out in the Platonic ideal Midwestern states like Iowa or Minnesota. (Also notice how much of Pennsylvania is hilly once you get away from Philadelphia on the East Coast.)

As the me-first impulse carries individuals away from their home town and to wherever they identify and affiliate with, the initial disparities will widen within fault-line states like Ohio. People who want to be on the right side of migration history will high-tail it out of the state toward Colorado, Arizona, North Carolina, Virginia, Florida, etc. And they'll take their "right side of history" politics with them.

The remainder, who pay no mind to how trendy their place of residence and origin is, won't care how trendy their policies are either.

November 2, 2014

Grandiose gravestones in status-striving times

Crossing over to the afterlife is the final rite of passage that we make, and like other such rites, it is marked by a ceremony to publicly and collectively acknowledge the irrevocably altered status of the deceased.

Ceremonies in general are ripe targets for elaboration during status-striving times — we get to show off before a captive audience. When the climate becomes more about accommodating others instead of me-first, ceremonies take on a more restrained and self-effacing tone.

During the Gilded Age and early 20th C., status-striving and inequality were soaring toward a peak that maxed out circa 1920. The wealthy could afford more of everything, and given their impulse toward excess, it's no surprise to see their grave monuments continuing to tower over the others in cemeteries across the country to this day. Visit a few local places that have graves going back through the 1800s, and you'll see it for yourself.

The main differences I've noticed are that they are much taller (easily exceeding human height), have more elaborate working (more than one typeface, semi-circular ruling for the text, images carved on a flat surface, and even relief sculpture), and tend to have bold messages about this life being over but the next one beginning — being re-born rather than truly passing away, triumphant over Death.

Here is a monument from 1879 and a mausoleum from 1911, both typical among the wealthy of their time:

These features begin to dwindle already during the '20s and '30s, and are more or less absent throughout the '70s. Headstones rise no higher than a few feet, are block-like in shape, have simple working (at most a floral pattern carved around the sides and upper corners of the border), and contain no messages whatsoever — only the person's name (sometimes only the surname) and the dates of their birth and death. Not what their role or status in the community was, not what their status was in their extended family, not their job, or anything else. And no declaration that the show isn't really over / don't count me out just yet.

Those folks didn't lack confidence that the deceased would be thriving in the afterlife, nor did they believe that there was nothing to be said about their various roles and statuses in the domains of life. They just didn't feel like saying it — it would have struck them as vainglory.

Here is a typical tombstone from the end of the humble Great Compression, circa the '60s and '70s. If a cemetery began after 1920 and filled up before 1990, this is the only kind of marker you are likely to see:

Sometime during the '80s and '90s there was a shift back toward the Gilded Age pattern: taller, monumental styles; carved images and likenesses; relief sculpture; and copious text, likely including a list of the deceased's various social achievements and proclamations about how they are too great to submit to Death, and are actually living it up in the Great Beyond.

I can't say from casual impressions when the reversal occurred — I did see a couple like that from the '80s, but it seemed like the real growth came during the '90s. At any rate, by the 21st century, the shift is crystal clear, as seen in this recent example:

Somehow, our neo-Gilded Age climate has revived the grandiose style of grave markers. What are the links?

The taller height and more elaborate working speak for themselves.

Listing their social roles — father, officer, musician — is close to bragging about what they accomplished, even if it's not as obvious as the bumper stickers about "my kid is an honor student at Junior Genius pre-school," or the "fruit salad" decorations that military leaders now wear.

Inscribing a mini-eulogy is a bit odd — it was already said before those who knew the deceased, during the funeral service. Broadcasting it forever to random passersby is bordering on presumptuous. It also feeds an arms race of whose marker has more to grab our attention.

The bold messages about the non-finality of death do not strike me as meant to comfort and reassure those who have survived the deceased, but more of a statement of how great and powerful they were to have risen above death, more like a demi-god than a mere mortal.

This topic could easily be explored quantitatively, and even snuck into a mainstream outlet as long as it had a title like "Inequality in the Graveyard." Plenty of folks have researched the temporal changes in funeral monuments, but none that I could find have looked at the link to the status-striving and inequality cycle.

And as hinted at the beginning, this approach could be broadened to look at all of the ceremonies that mark life's milestone transitions. Debutante balls long ago, which then vanished, but have been revived as Sweet Sixteen extravaganzas. Weddings (holy shit). Bearing children — how much stuff do you have to buy to welcome them into the world, and to let the public know that you now have a kid?

These changes have already been noticed and discussed, although not necessarily how they're reviving the ways of the Gilded Age and Downton Abbey period. Now we see that these changes include the ceremonies surrounding the final of life's major transitions.

Addendum: here is an article about similar changes in Germany from the early 20th C., Midcentury, and Millennial periods. It's not just an American thing, but wherever the status-striving and inequality cycle is more or less in sync.

October 31, 2014

Extended family contact by transplant vs. native residency

In an earlier post on differing levels of contact with extended family across regions, I stated that one factor underlying the pattern was the differing share of transplants in each region, since moving away cuts down on how often you keep in contact with family back home.

How much less contact do transplants actually have with their families? I looked at levels of contact with three different types of extended family groups for both natives and transplants. Racial groups have different patterns of migration and family contact, but it turned out not to affect the split between natives and transplants in level of extended family contact. So I left all races in.

Here's the breakdown for whether they've been in contact with the following groups during the past four weeks (among those who have living relatives of the type):

Cousins -- 50% of natives, 40% of transplants

Uncles and aunts -- 53% of natives, 40% of transplants

Nieces and nephews -- 70% of natives, 52% of transplants

Remember that the questions don't specify whether you kept in contact by meeting face to face, or by writing or calling. With nearly half of transplants saying they kept in contact with these groups as recently as the past four weeks, despite probably not living nearby, a good deal of those respondents must have taken it to include mediated contact.

These results understate the difference in levels of face-to-face contact, which could be closer to zero for transplants, if it's already this low for mediated contact.
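For anyone who wants to replicate this kind of split, here is a minimal sketch in plain Python. The field names mimic the GSS variables cited in this post, but the toy records are invented for illustration only; swap in a real GSS extract before drawing any conclusions.

```python
# Hypothetical sketch of the native vs. transplant breakdown above.
# Field names (regtrans, cousins) mimic the GSS variables the post cites;
# the toy records below are made up purely for illustration.

records = [
    {"regtrans": "native",     "cousins": 1},  # 1 = any contact in past 4 weeks
    {"regtrans": "native",     "cousins": 1},
    {"regtrans": "native",     "cousins": 1},
    {"regtrans": "native",     "cousins": 0},  # 0 = no contact
    {"regtrans": "transplant", "cousins": 1},
    {"regtrans": "transplant", "cousins": 0},
]

def contact_rate(records, group, item):
    """Percent of `group` respondents reporting any contact on `item`."""
    vals = [r[item] for r in records if r["regtrans"] == group]
    return round(100 * sum(vals) / len(vals))

print(contact_rate(records, "native", "cousins"))      # 75
print(contact_rate(records, "transplant", "cousins"))  # 50
```

The same function applied to the uncaunts and niecenep items would give the other two rows of the breakdown.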

It's easy to blame technology for isolating us from those we ought to be in contact with, especially in person. But here we see a vivid reminder of how simple it is to sever the ties to your extended family -- just move away, or perhaps they will. As long as the split is not acrimonious -- you're just leaving to better yourself -- no one will be bitter about the diluted and fragmented family web. It'll be one of those things that just happen, mysteriously and uncontrollably.

I don't see things changing course due to a change in attitudes toward family ties. There's too strong of an impulse toward self-enhancement, rather than maintenance and enhancement of everything else that made you.

But we may not have to wait for a change in attitudes. There's more than one way to keep people from moving away -- saturated real estate and job markets, and general lack of preparation for life after college (where they goofed off for four years) among Millennials. "Boomerang kids" who live at home well into their 20s and 30s are becoming more of a reality, and reversing the trend of being a transplant during one's 20s.

They'll be in contact with their extended family more than earlier generations during that stage of life, whether they like it or not.

GSS variables: cousins, uncaunts, niecenep, regtrans (created from region and reg16)

October 30, 2014

Millennials reversing the trend of being a transplant during one's 20s

With gloomier job and housing prospects facing the most sheltered generation in world history, the Millennials are becoming "boomerang kids" who leave for four years of goofing off at college, and return home for their 20s, maybe longer.

One unnoticed but important side-effect of this shift is that they won't be contributing to the transplant phenomenon as much as earlier generations did during the same stage in life.

The General Social Survey asks questions about what region of the country you were living in at age 16, and where you're living at the time of the survey. I created a transplant variable that looks for a mismatch between the two answers, and looked at age and cohort patterns.

Cohorts are five years long, and the age group was 23 to 29, in order to make sure they were out of their college years. Only whites were studied, as races show different migration patterns, and sample sizes are not very large for non-white groups when restricted to such a narrow age range across multiple cohorts.
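The two derived variables described above boil down to a mismatch test and a five-year bin. A sketch, assuming string-coded regions and a birth-year field (the names and codes here are illustrative, not the GSS's actual coding scheme):

```python
# Sketch of the derived variables described above. A respondent counts as
# a transplant when the Census region lived in at age 16 (reg16) differs
# from the region at interview (region). Cohorts are binned into five-year
# groups by birth year. Region strings here are illustrative placeholders.

def regtrans(reg16, region):
    """Transplant flag: mismatch between region at age 16 and current region."""
    return "transplant" if reg16 != region else "native"

def cohort_bin(birth_year):
    """Start year of the five-year birth cohort, e.g. 1987 -> 1985."""
    return birth_year - (birth_year % 5)

print(regtrans("Middle Atlantic", "Pacific"))  # transplant
print(cohort_bin(1987))                        # 1985 (the '85-'89 cohort)
```

Cross-tabulating the transplant flag against the cohort bin, restricted to ages 23-29, reproduces the comparison in the next paragraph.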

The eight cohorts within the Boomers (1945-64) and X-ers (1965-84) all had close to a 20% chance of being a transplant during their post-college 20s. With the '85-'89 births, there's a sudden drop to 12%, cutting their chances nearly in half. The results do not depend on whether you look at people who didn't go to college, or those who had at least one year of college.

The recession is a non-starter: many other generations faced recessions during their 20s, yet didn't hang around their home region (let alone their home). The sudden drop suggests that a clear breaking point has been reached for the broader socioeconomic structure, like the higher ed, job, and real estate markets, not simply recession or no-recession. Presumably those born in the '90s will be even less likely to bother chasing fame and fortune by leaving behind their native region.

Massive, unregulated, me-first migration patterns not only dislocate individuals from the social networks that they're most attached to, they destabilize the broader ecosystem -- both where they came from, which is losing natives and their native ways, as well as where they're moving to, which cannot cope with such an influx of outsiders and their outside ways.

Not that Millennial strivers wouldn't love to play their part in the transplant royal rumble -- they're simply less able to make it happen, being even more unprepared for real life than those who came before them, and with so many of the spots already taken up. Perhaps they'll rationalize the situation they've been forced into, and come to prefer living in the same general region that they grew up in. And perhaps they'll pass along this attitude and received wisdom to the generation after them.

The great big transplant shoving match may therefore come to an end, not by consciousness raising but by over-saturation.

GSS variables: regtrans (created from reg16 and region), cohort, age, race, educ

October 27, 2014

New Urbanism hijacked for leisure-class contests, and the plague of cars turning suburban streets into one-way roads

Not for the first time, I wrote what started as comments but soon morphed into an entire post on another site (this post at Uncouth Reflections reviewing a documentary on New Urbanism). This seems more likely when I've had a drink and am only focused on the now. You've heard of drunk texting -- this is drunk comment-spamming.

I'll just copy & paste the comments, rather than edit and fill them out into full posts. There's plenty more to say, so just riff on them in the comments, and I'll chime in again. (Like how I forgot to mention how much worse the parked car plague is where spics live. Six cars lining the curb in front of a "one-family" suburban house -- that's a sign of the Mexican invasion for sure. But I digress...)

The first is about how New Urbanism has turned out in reality, all these years after being an unheard-of movement, and being so widely adopted by the right kinds of people living in the right kinds of places. It promised a return to Main Street, but has built only playgrounds for the leisure class to publicly indulge in their status contests.

The second is more focused, on the topic of how clogged with parked cars the typical residential street is nowadays in suburban America, how recent of a change that has been, and what this example shows about the power of design and public planning to shape behavior when attitudes of individuals are pushing in the opposite direction.

* * *

The lack of reflection this far into the craze for New Urbanism is unsettling. Y’know, it’s not 1992 anymore, and the movement isn’t some underdog vanguard but the Next Big Thing that every SWPL enclave has been pushing through for at least the past 5, and more like 10 or 15 years.

That photo of the public square in New York sums up what’s gone wrong (or has revealed what had always been wrong from the start): New Urbanism has become (always was?) a brainstorming session / policy bandwagon for how to make the wealthiest neighborhoods in the wealthiest cities even more insanely epic playgrounds for the sponges who dwell nearby. That could either be loafer / hipster sponges, or finance / Big Law / PR / other bullshit sector sponges making a ton of money from parasitic professions.

There’s absolutely nothing civic, communal, cohesive, or enriching about these large playground oases in the urban jungle. Just a bunch of sponges sitting around indulging in some conspicuous consumption (where’s your coffee from? where’s your panini from?) and conspicuous leisure (1pm and I’m lounging in public, with designer clothes and perfect hair — jealous much?). There’s never any connection or awareness of the other people in these places. They’re all drones vibrating in their own little cell within the larger hive.

Don’t be fooled by the pairs of people who appear to be interacting with another person. The other person is just a social image prop, and gets no attention, which is instead directed at the hive in general.

You ever notice how loud and over-sharing their conversations are, and how their eyes are always darting around to see how many other drones are giving unspoken “likes” to the speaker? When they aren’t talking, they are dead silent for hours at a stretch, never looking up toward the other, glued to their private glowing screen. No affection or closeness — they only “interact” when their speech and mannerisms can suck in attention from the hive.

Apart from the psychological segregation, contra the intimacy the New Urbanist cheerleaders promised we’d have, there’s the naked leisure-class nature of all the surrounding “small shops,” invariably 90% quirky foodie joints, and 10% quirky yoga, quirky doggie spas, and quirky clothing. Somehow that’s not what my grandfather would have imagined when New Urbanists spoke of a return to Main Street. These preening useless faggots would have gotten food thrown at them from passing cars back in those days.

Where do they buy their household tools? From a mom & pop hardware store? No — by ordering some Chinese piece of shit from Home Depot’s website. Where do they buy their music and movies? From iTunes (if they’re old) or more likely from some online streaming service. Consumer electronics? Amazon, or once a year a trip to the Apple Store where they actually buy something.

It’s pathetic how little variety there is in areas struck by the New Urbanist craze, and how much all of that stuff has migrated online due to airheaded consumer choice. I could have sampled a wider variety of stuff from a mall back in the ’80s — and they had professionals’ offices there too.

Defenders of New Urbanism will say that it wasn’t intended, that this is a hijacking or adulteration by wealthy interests, that the originators were more populist. Maybe — maybe not. The point is: this is what the mania has produced in reality, and it’s time to start taking stock of that, and coming up with ways to wipe out all of this airheaded elitist shit and return city and town life to more populist and enriching ways. Not by continuing to cheerlead for the craze like these designers and architects do.

It’ll be better if the new movement doesn’t have the words “new” or “urbanism,” to avoid confusion and tainting.

It would greatly help matters to identify designers, architects, and policy makers by one of three types, so we know who we’re dealing with and how to treat them.

1) Kool-Aid drinkers. These people truly get an endorphin rush from turning entire neighborhoods into leisure-class playgrounds. Crazy, not worth trying to talk some common sense into.

2) Sell-outs. These individuals started off with the populist Main Street ideal as their model, but quickly figured out that egalitarian small-town ecosystems are not exactly gonna fly off the shelves in a climate of such intense status-striving and inequality. A fella’s gotta eat and pay rent, so whaddayagonnado? Not worth trying to convert, since they only worship the almighty dollar, and they will not fall for the lie / clueless naive suggestion that somehow, someway the Main Street model could be made to be as profitable, or more, than the leisure-class playground model.

3) Frustrated idealists. Bitter, overlooked, unappreciated, disgusted by what the formerly idealistic movement has devolved into (or again, how the hidden variation among the originators has made itself manifest). They feel sick for being a part of a movement that has swept aside the variety of stores that used to be found in suburban strip centers as recently as 25 years ago, all in the name of converting the place into a “lifestyle center” with food, drink, food, drink, food, food, food, spa, salon, crappy cell phone outlet, food, and food. All chains, all oriented toward leisure-class strivers.

Naturally only the last group is worth the time for ordinary people to talk to. But if they won’t identify themselves, and their distaste for where the New Urbanist craze has gone, it will be hard to start cleaning house.

* * *

The designer in the documentary is Danish, so I don’t expect him to be in touch with American trends. But New Urbanists have overlooked the most pedestrian-unfriendly car phenomenon of the 21st century — suburban streets that are narrowed into de facto one-lane paths because residents park their cars all along the curb, at every house.

This is not a design / planning problem, since just 25 years ago roughly the same number of cars, belonging to roughly the same number of residents on a suburban street, were parked in the driveway, carport, or garage. It was normal for two-car houses to have both parked one behind the other in the driveway, and for someone to have to get out and move the back one whenever another person wanted to take the front one out. I remember doing that in the ’90s, though it was also starting to become common to park one in the driveway and one on the street.

What changed were attitudes toward private vs. public welfare. Individual convenience is maximized by parking one in the driveway and one or more on the street. Say goodbye to those unbearable 30 seconds of car-shuffling. But when everyone feels and acts that way, suddenly the whole street is clogged with parked cars. The two-way street is now one-way, and pedestrians who could have walked along the side of the road (the way we all used to) have nowhere to walk, unless there’s a sidewalk.

(Sidewalks are not the most common thing in suburbs, sadly, and even if there is one — how drone-like to have to follow a sidewalk in a quiet residential neighborhood, when you’re supposed to be walking through the streets because you own them, and only moving aside when you see a car approaching.)

This state of affairs points to the larger problem that is rarely discussed in New Urbanist forums — how easily does design change attitudes, and can a change in attitudes overturn the utopian design plan? (Answer: yes.) Driveways, carports, and garages were a design solution to the problem of streets clogged with parked cars — provided that folks who lived in multi-car houses put the good of the community above their own stingy quest for maximum convenience. You don’t see cars parked in driveways in the city — it was supposed to be a way that suburbanites could lick one of the city’s worst problems.

In the end, though, attitudes trumped design plans.

October 26, 2014

The etiology of women who seem like gay men: a look at Anne Hathaway

During the trailer for Interstellar, there's a shot of Anne Hathaway looking sideways with her mouth agape that struck me as something you'd see from a creepy homosexual camping out at Starbucks to scope out the latte-sipping twinks. I've never been a fan of hers and don't have a strong sense of her range of facial expressions, so I investigated a little on Google Images. The hunch paid off. Here are just a handful of shots of her showing gay-face:

The over-smiling, overly eager open eyes, raised eyebrows, and slackjaw are all hallmarks of the campy gay-face. All resemble caricatures of a child's expressions of "surprise" and "I'm such a little stinker." The basis of male homosexuality is stunting during the "ewww, girls are so yucky" phase of development (gays as Peter Pans), hence their more neotenous (child-like / infantilized) appearance and behavior.

I've covered these features at length elsewhere, but here we see something similar in a heterosexual woman. In fact she doesn't just look a lot more like a gay man than 99% of women do, she shares their emotional and behavioral tendencies as well. Thin-skinned, breaking down over the most trivial happy or sad causes. Naturally campy and caricatured, not acting that way to be ironic. Loose and into drugs during college. No shame in using her sexuality to get a rise out of others. Childishly naive, easily fooled and taken advantage of by her first husband. And most importantly, no apparent desire to have children and nurture them as a mother.

Granted, women are more child-like to begin with, but she is way off the charts for how naive, kiddie, weepy, campy, and non-maternal she is.

How did she get that way? In the case of men, it's probably Greg Cochran's idea of a "gay germ" that strikes in childhood. My take on that is that its effects are a broad developmental stunting — a Peter Pan syndrome — and not merely a narrow change in sex role behavior, sexual preference, etc.

That raises the question: if it can strike boys and produce gay effects, what would happen if it struck girls? I think the answer is women like Anne Hathaway. (It would not produce lesbians, since they are characterized by the opposite pattern — not childish, but menopausal.)

As it turns out, her brother is gay, so we know that she would have been at a similar environmental risk of exposure to the gay germ in childhood. And being so closely related, she would have had a similar genetic susceptibility to the germ's effects.

The plot thickens with her second marriage. After being scammed by an apparently sociopathic first husband, she decided to swear off heterosexual men altogether and got married to an obviously homosexual nobody, named Adam Shulman. When every A-lister these days is part of a power couple, it's just a little bit strange for her husband to be a complete unknown, to have been a close friend beforehand (i.e., her gay BFF), and to look and act so effeminate and mincing, as though he were her kid brother rather than her lover and protector.

Other celebrity women have served as beards for closeted A-list men — Kim Kardashian for Kanye West, Cindy Crawford for Rande Gerber (Clooney's butt buddy), Julianne Hough for Ryan Seacrest, Jada Pinkett for Will Smith, and so on. But in these typical cases, the sham husband has wealth, influence, or looks that would "enhance the brand" of the sham wife.

In Anne's case, it would be inaccurate to call it a sham marriage, since it was not a cynical brand-enhancing contract, but an earnest attempt to elevate the status of her gay BFF-ship, in the same way that childish naive faggies believe that simply throwing a wedding will make their bond normal or special.

I don't mean to delve into so much celebrity gossip, but because their lives are so well documented, they do provide a window into a topic that would otherwise be completely opaque. Can you imagine getting funding to study how gay (not lesbian) the female relatives of gay men are? Maybe if you could spin it in some pro-homo way, but the whole topic of "what causes male homosexuality" is too radioactive these days.

Miley Cyrus is another example worth looking into. She comes off as a flaming queer trapped in a girl's body. And on Google Images, her brother Braison does a pretty good impression of a twink. But she's a little young to see whether or not she prefers getting married to a gay BFF. I give it greater than 50% chance, though.

October 22, 2014

NFL players got bigger once competitiveness became sanctified

In a comment, Feryl said he saw a chart about how NFL players started getting heavier and heavier circa the 1970s, linking it to the status-striving and inequality trend.

After some googling, I found this post with a series of graphs showing the evolution of body size among NFL players from 1950 to present. Most of the change has occurred since the '70s, and players on the whole tended to be similar in height and weight during the '50s and '60s, with slow decelerating growth at most. That's important for showing that this is not just some long-term trend that goes back to the very beginning of the sport, but one that began at the same time that competitiveness in general became glorified throughout society.

Some positions have gotten taller, but most show modest or no change in height. The real change has been in weight, particularly where the sumo wrestling takes place, among centers, tackles, guards, and linebackers. They appear to have gotten a full two standard deviations heavier, and even a good deal of the other positions have gotten about one standard deviation heavier. That is a huge change in less than two generations.

An offensive and a defensive lineman who both weigh 200 lbs will have the same balance of forces as a pair who both weigh 300 lbs -- evenly matched. But somewhere along the way, some "greed is good" coach decided to put slightly heavier linemen up against the prevailing standard-weight linemen, giving his own men a slight edge. Everyone else quickly caught on and imitated the strategy, the initial edge was eroded, and all were suddenly caught up in an escalating arms race toward 300-pound linemen.

Although the balance of forces is the same with a pair of 200-pound linemen and a pair of 300-pound linemen, the variance is not. Imagine a lighter pair engaged in an evenly matched tug-of-war, when someone cuts the rope and both fall backward. Now imagine two giants lumbering around when thrown off-balance. Or imagine the force of impact when the evenly matched lighter pair vs. heavier pair slam into each other.

A simple weight regulation to prevent a pointless arms race that endangers the players will not be enacted until the social mood changes away from sanctifying competition. For now, proposing one would only enrage the braindead fans whose sole meaning in life is to squabble over whose squad of transplant gorillas-for-hire is more vicious.

Back when sports fans were not bloodsport junkies, and sports players were not outsider mercenary apes, this wasn't a problem. But now that competition for its own sake has reached sacrosanct status, who are we to get in the way of more and more ridiculous, bombastic bread-and-circus entertainment?

And it won't be the fans who have to pay the costs of the arms race -- they're not the ones taking all those hits from 300-pound hulks. Fortunately for the game, the players aren't exactly known for their future orientation, and don't mind fucking themselves up for big money today, at the cost of living as a cripple for the rest of their lives. The coaches only want to win, and the NFL only wants advertising dollars, hence eyeballs and butts in seats. With no checks anywhere throughout the entire football ecosystem, the whole place is headed to hell in a handbasket.

Sadly, when a sector cannot police itself, and is becoming ever more bombastic and economically parasitic on ordinary folks, only people from outside the pro sports ecosystem can do anything about it. Not shut it down, but dial it back by regulations to where it was in the Midcentury.

That would have sounded like a pipe dream in the '90s, when the two sports-crazy generations -- Silents and Boomers -- were dominant. But sooner rather than later, they'll either be retired, senile, or dead, and the average member of Gen X and Millennials could give a shit about the barbaric religion of gladiator worship (UFC fandom still being a very niche identity). I can't think of too many causes that would so effortlessly unite X-ers and Millennials of all political persuasions. Beating the money-changers out of the athletic temple is one of them.

Strength of extended family ties by region

Rootedness in a place has both a vertical dimension stretching back through time, as well as a lateral one linking a person to others in that place during a given period of time. Let's start with the state of affairs today, and compare levels of rootedness across some of the major geographic-cultural regions of America.

The strongest form of investment in a place, as opposed to feeling like nothing is holding you back from picking up and heading off for greener pastures, is extended family ties. Community ties with non-kin are worth examining, too, but blood is thicker than water.

The General Social Survey asked a series of questions about how much you keep in contact with three groups of extended family -- cousins, uncles and aunts, and nieces and nephews. It didn't specify whether "contact" was in-person or mediated. It was asked in 2002, so most respondents probably assumed it meant in-person or talking over the phone.

I excluded respondents who said they had no living relatives of that type, and lumped the two affirmative responses together (some having a little more contact over the past four weeks, and some a little less), against those who said they had no contact. The ranking was so similar for each of the three family groups that I averaged them into a single index.
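The recoding just described can be sketched in a few lines. The numeric response codes assumed here (1 and 2 = the two affirmative answers, 3 = no contact) are illustrative only -- check the actual GSS codebook before reusing this.

```python
# Sketch of the contact index described above. Response codes are assumed
# for illustration: 1 and 2 = some contact in the past four weeks (the two
# affirmative answers, lumped together), 3 = no contact.

def any_contact(code):
    """Collapse the two affirmative responses against 'no contact'."""
    return 1 if code in (1, 2) else 0

def family_index(cousins, uncaunts, niecenep):
    """Average the three binary contact items into one 0-to-1 index."""
    items = [any_contact(c) for c in (cousins, uncaunts, niecenep)]
    return sum(items) / len(items)

# Contact with two of the three family groups -> index of 2/3
print(family_index(1, 3, 2))
```

Averaging is defensible here only because, as noted above, the regional ranking came out nearly the same for each of the three items on its own.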

Non-whites have much more contact with their extended family than whites do, and regions vary a lot in how non-white they are. However, looking at whites only vs. everyone did not change the ranking in this case, so I left all races in.

The chart below ranks the regions by how likely their residents are to have had any contact with their extended family during the past four weeks. The states which make up each region are listed below, in descending order by population size within each region, to give a feel for which states are more influential on the region's score. The GSS uses Census regions, and some of them have confusing names, which I've tried to re-name more helpfully (if changed, the original names are in parentheses below).

South Atlantic - Florida, Georgia, North Carolina, Virginia, Maryland, South Carolina, West Virginia, Delaware, District of Columbia

Southern Appalachia (E.S. Central) - Tennessee, Kentucky, Alabama, Mississippi

Eastern Midwest (E.N. Central) - Illinois, Ohio, Michigan, Indiana, Wisconsin

New England - Massachusetts, Connecticut, Maine, New Hampshire, Rhode Island, Vermont

Lower Mississippi (W.S. Central) - Texas, Louisiana, Oklahoma, Arkansas

Middle Atlantic - New York, Pennsylvania, New Jersey

Pacific - California, Washington, Oregon, Hawaii, Alaska

Mountain - Arizona, Colorado, Utah, Nevada, New Mexico, Idaho, Montana, Wyoming

Western Midwest (W.N. Central) - Missouri, Minnesota, Iowa, Kansas, Nebraska, South Dakota, North Dakota

About 20 percentage points between high and low -- we're not talking minor differences around the country.

The main divide is between the eastern vs. western half of the country. All these years later, the Mississippi River continues to be a major cultural barrier. As Americans have moved farther out west, they have left behind their extended family, only some of whom would have also been heading out west. That certainly makes it impossible to stay in face-to-face contact, but I'll bet that it cuts down on mediated contact as well -- out of sight, out of mind.

Among western regions, why do the furthest west have higher family contact? Because the coast was settled earlier. The Plains and Mountain states may have been reached first, but folks tended not to put down roots there for very long. It's the most desolate real estate in the country, so that over history there has been a lot more coming-and-going there compared to the desirable West coast, where people come more than they go.

Also, by the 21st century, a good deal of those living in the Mountain states are first or second-generation transplants from all over, and refugees from the over-saturated West coast, who won't have family nearby. The Plains states show the opposite problem -- the locals abandoning ship.

After the east-west divide, there's also a secondary cline from more familial southerners to less familial northerners. I doubt it's due to more favorable weather conditions that allow folks to get out of their homes comfortably. The GSS is administered during the summer, and more or less all of the eastern US is a humid hellhole then, and if anything, worse in the South.

My only hunch is historical ethnic conflict serving to strengthen clan ties. All else equal, larger groups defeat smaller ones, so ethnic conflict pressures folks into joining larger groups -- in the context of kin, extended vs. nuclear families. Blacks vs. whites in the Southeast compared to the Northeast, Mexicans and Indians vs. whites in Texas compared to Minnesota. Hence also why religious membership is more important in southern regions, it being the main way that we cement bonds with non-kin.

Since the major split is between earlier and later settled regions, we'll need to look into how rootedness has changed over time in these regions. Once started, do roots continue to grow, or are they uprooted every generation? Then the link between rootlessness and status-striving will become clearer.

GSS variables: uncaunts, cousins, niecenep, region

October 20, 2014

The geography of striver boomtowns, AKA future ghost towns

An NYT article reviews a report on how recent college grads are multiplying like cancer cells, I mean fueling the engine of growth in cities across America, particularly in the most central areas of the city. They are shying away from Establishment cities and gentrifying second and third-tier cities where the rents are cheaper -- until the word gets out and the next twenty waves of transplants bid up the rents to Establishment levels.

The article and report refer to 25-34 year-olds with a B.A. as talented, creative, etc., without any proof required other than the fact that their brains are young, that their credential being bought and paid for (rather than earned) allowed them to goof off for four years, and that they spent that time cultivating unique quirky tastes shared by 90% of their age-mates (craft breweries bla bla bla).

These cases illustrate some of the themes I've started to develop here about the geographic and generational differences in status-striving. The Gen X and Millennial subjects they're tracking are moving away from Establishment cities because the Silent and Boomer incumbents refuse to vacate the prime real estate and above-poverty-level jobs.

More broadly (see this post), materialist and career competition have become too saturated by Silents and Boomers, leaving X-ers and Millennials to pursue lifestyle competition instead. That will not only affect which cities they flock to, but the character of their lives once they arrive -- they will turn the place into one great big playground for lifestyle status contests, lots of drinking, and the occasional random drunken hook-up. Or, College: The Sequel (total run-time to be determined).

And they aren't picking just any old sub-Establishment cities but ones that are already growing at a fair clip, and are already -- and have historically been -- quite large in population. When they look for a city in the Northeast to take the place of New York or Boston, do they settle on Albany? No, it has to be the second-largest city in New York -- Buffalo. Feeling squeezed out of Chicago and Dallas in the Midwest and Philadelphia and DC in the east? Well, you could shoot for Wheeling, WV or Grand Rapids, MI -- but why not aim as high as you can (within your budget), and resurrect the Gilded Age empires of Pittsburgh and Cleveland?

Fortunately, nobody involved at the grassroots or the academic and journalistic levels has any knowledge of history, so what's to temper the enthusiasm for pushing Buffalo and Cleveland as up-and-coming boomtowns (at least among the Creative Class)? It's not as though they've already been through the over-hype and hollowing-out cycle before. But if you want more sustainable long-term growth, you'd have to settle for cities that are smaller, historically less important, and culturally less thrill-seeking. Off-the-radar cities.

On the whole, though, the striver boomtowns are not in the Rust Belt but in the Sun Belt, i.e. where mainstream America has already been heading for decades. There are now enough earlier transplants who can actually run a business and create jobs that the Creative Class can play catch-up and jump on the Sun Belt bandwagon without starving and "living outside". Will the Sun Belt soon turn into the next Rust Belt? Impossible -- growth only increases, worst-case at a slowing rate, but there will never be a mass desertion of entire swaths of the country that had been over-hyped, over-built, and over-indulged.*

It comes as no surprise, then, that the cities with the greatest percentage growth in 25-34 college grads fail the test of egalitarianism outlined in this post -- not having a pro sports team. Only Austin passes (I didn't claim the test was perfect). That proves that they are not simply seeking refuge from the rising competitiveness and widening inequality that blight the Establishment cities. Otherwise they'd be heading to some place where it would never occur to the locals to allow their public coffers to be parasitized by a big-league team that could PUT THEM ON THE MAP.

Among the ranks of Millennial boomtowners, is there any awareness of how illusory all this rapid growth is? Let's ask one of them living in Denver:

“With lots of cultural things to do and getting away to the mountains, you can have the work-play balance more than any place I’ve ever lived,” said Colleen Douglass, 27, a video producer at Craftsy, a start-up with online classes for crafts. “There’s this really thriving start-up scene here, and the sense we can be in a place we love and work at a cool new company but not live in Silicon Valley.”

How can start-ups be thriving? You don't thrive until you're mature. All those dot-com start-ups sure seemed to be thriving in the late '90s -- what happened to them after that, I'll have to order a history book on inter-library loan, since I'm too retarded to remember.

Online classes for crafts, or where higher ed meets e-tailing. Two great bubbles that burst great together!

BTW, her LinkedIn profile shows that she went to the University of Dayton, which I don't recall being very close to Denver. All the talk about youngsters choosing the cities and bustling city cores over the dull suburbs papers over the reality that these kids aren't shopping around at the urban vs. suburban level, but at the entire metro area level -- which city will maximize my number of likes and followers? She didn't choose downtown Dayton over an attractive suburb of Dayton like Beavercreek -- she wanted to ditch dear, dirty Dayton altogether.

Group identities that are constructed consciously by first-generation adherents who merely affiliate with a city will be weak, shallow, and fleeting compared to those that are inherited unwillingly by multi-generational descendants who are rooted there. The western half of the country has long been more plagued by vice and social decay than the eastern half, and its history of rootlessness provides the central explanation.

Another long-established fact about urban growth and migration is that cities do not grow except by wave after wave of even greater fools pouring into them. City folk are so caught up in their status contests (whether based on career or lifestyle) that they forget to get married and have kids. By the time their career is established, or their reputation on Instagram suitably impressive, it's too late to start. Cities have been fertility sink-holes ever since they began thousands of years ago, and migration from the countryside was all that fed their growth.

What will happen to these 30 year-olds when the contests over who's sampled the most esoteric food truck fare begin to get old? They won't have any family or community life to fall back on; everything up till then has been based on lifestyle status competition. They will face the choice between staying stuck on the status treadmill forever and dropping out in isolation, where they'll indulge their vices until their bodies and brains loosen into mush. Sadly, that will begin by the time they're 40, when the final relief of death is still very far away.

Gosh, you guys, what the heck. I hate to be such a downer, but all this mindless enthusiasm for urban cancer is not only getting tiresome, but by now disturbing.

* During a brief refractory period, the cheerleading reporter lets some sobering facts slip by:

Atlanta, one of the biggest net gainers of young graduates in the 1990s, has taken a sharp turn. Its young, educated population has increased just 2.8 percent since 2000, significantly less than its overall population. It is suffering the consequences of overenthusiasm for new houses and new jobs before the crash, economists say.

Good thing that Ben Bernanke ordered an anti-hype fence to be built around Atlanta, lest the overenthusiasm ruin other Sun Belt boomtowns.

October 19, 2014

Wide-open eyeglasses for a non-ironic look

Earlier I showed how different the impression is when someone wears glasses that are narrow as opposed to wide. Narrow eyes, whether now or during the cocooning Midcentury, make people look aloof and self-conscious. Wide eyes like you saw back in the '70s and '80s look inviting and other-directed.

Of course today's narrow glasses are more off-putting than the Midcentury originals because there's now a level of irony on top of it all. Get it -- retro Fifties, geek chic! Yep, we get it.

It makes you wonder whether people could ironically wear wide-eye glasses (other than sunglasses). I've been wearing a pair from several decades ago since the summer, and haven't gotten any winking approval looks like "Oh I see what you did there, Seventies glasses FTW!" They're pleasantly inconspicuous.

I was searching Google Images to try to identify the drinking glasses that my parents used to own around the time I was born, with only pictures to go from. "Vintage glasses orange brown" turned up this result:

It's meant to be part of an ironic "hot for teacher" costume for Halloween, but it doesn't succeed in the ironic department. Somehow, wearing wide-eye glasses makes someone look inviting, kind, and sincere, even when they're aiming for ironic.

Contrast the effect with those "sexy nerd" glasses with narrow eyes and thick rims, where the girl just looks sassy and self-absorbed. Wearing glasses like the ones above makes her look refreshingly tuned in to other people instead of herself.

October 18, 2014

Extended family structure as an influence on generational membership

Why is it that even among people born in the same year, some of them identify more strongly with an older cohort, some with their own cohort, and some with a younger cohort? If generational membership were only a matter of when you were born, and what the environment was like along each step of your development, we shouldn't see this kind of variation among folks who were born in the same year.

Going solely off of a hunch from personal experience, it could be due to differences in the generational make-up of a person's extended family.

I was born in 1980, part of a late '70s / early '80s cohort that either gets lumped in as the tail-end of Gen X or is given its own tiny designation, Gen Y, between X-ers and Millennials. I've always felt and acted closer to core X-ers than to core Millennials (who to me seem like they come from another planet), although a good fraction of people in my cohort would tilt more toward the Millennial side. We all recognize that we're neither core X-ers nor core Millennials, yet when pushed off of the fence-sitting position, some of us fall closer to an earlier generation and some to a later generation.

Since we spend quite a bit of time socializing with family members, though, perhaps we should look into that source of influence as well. If they're not related to you, much older and much younger people typically are blind to you, and reject hanging out with you if you try to make yourself seen. But blood is thicker than water, and those much older or younger kids will interact with you and pass along their ways in a family setting.

I've only rarely interacted with the extended family on my dad's side, so I'll stick to the maternal side. Although my mother was born in the mid-'50s, she is unusually young among her three siblings, who were born in the early, mid, and late '40s — more typical of the parents of core X-ers. My cousins through them are also all older than me: of those I met regularly growing up, one is a late '60s birth, two are early '70s births, and one is a mid-'70s birth. Our grandparents are also more typical of core X-ers, with one born in the mid-1910s and the other in the early '20s.

I would have to ask around, but I suspect the people in my cohort who tilt more toward the Millennial side of the fence have cousins who are more centered around their own age, aunts and uncles centered around their parents' age ('50s births), and grandparents who are Silents (late '20s / early '30s births). That extended family profile is closer to a Millennial's than an X-er's.

Those are the blood relationships, but when you count the affines (those who marry in), you get the same result as long as there aren't wild age differences in dating and marriage. Growing up, I only got to know the girlfriends (and eventual wives) of two of my cousins, but they were both core X-ers (late '60s or early '70s births). And the uncle-by-marriage that I knew well growing up was a Silent.

In short, if you look at my family tree and cover up my birth year, and my parents', it would look like a typical Gen X tree. The lateral influences from my cousins (and once they were old enough, their girlfriends and wives), as well as vertical influences from aunts and uncles and grandparents, are more typical of someone born in the early '70s than the early '80s.

Granted, the less time you spend with your extended family growing up, the weaker this effect will be. And people in my cohort had parents who were part of the Me Generation who didn't mind moving away from their siblings and parents, and who expected us to do so as well once we set off for college and whatever career we wanted to pursue. Status was to be more important than extended family cohesion.

But some of us didn't grow up so isolated from our extended families. My mother's sister and her husband lived only a few blocks away from us when I was in elementary school, and that was a second home for me and my brothers. By that time, her children had moved out, but still visited frequently, and brought their girlfriends, so we weren't so distant from them either. And I spent long portions of off-time at my grandparents' home, during the summer and winter.

Nowadays, with extended family ties being moderately strong at best, generational membership is going to be primarily shaped by your own birth year, period. That determines who your peers will be in school, and there's your generation. But that still leaves secondary influence for the generational make-up of your extended family, and in cases where you belong to a cohort that is neither here nor there, this secondary influence could push you clearly into one or the other clearly defined generation on either side of your own.

October 16, 2014

The generational divide among grunge musicians

Grunge music was a flash-in-the-pan phenomenon of the early 1990s, serving as a bridge between the longer and more stable periods of college rock throughout the '80s and alternative rock throughout the '90s. In fact, there was a generational bridging underneath the stylistic bridging.

I finally came upon a copy of the Temple of the Dog album with "Hunger Strike" on it. (Posted from my thrift store cardigan.) That song has always struck me as aging better, and agreeing with me better, since it first became a hit over 20 years ago. Not one of those perfect pop songs, but one worth buying the album for.

Like many other late Gen X adolescents, I was into grunge when it was the next big thing, but quickly moved on — or backward — to punk, ska, and college rock from the late '70s and '80s. (My friends and I hated the lame alternative, post-grunge, or whatever it's called music that defined the mid-'90s through the early 2000s, even when it was novel.) A good deal of what I used to like, I began not-liking, but there are some songs like "Hunger Strike" that still sound cool and uplifting.

As it turns out, the grunge groups that I find more agreeable were made up mostly or entirely of late Boomers, born in the first half of the '60s, while those I don't relate to as much anymore were made up mostly or entirely of early X-ers, born in the second half of the '60s. The late Boomers are the ones shown in Fast Times at Ridgemont High — abandoning themselves to whatever feels good — while the early X-ers are shown a little later in the John Hughes movies — consciously torn between wanting to be impulsive while seeking the comfort of stability.

The abandon of the late Boomers gives them a clear advantage when it comes to jamming within a group, improvising, and going wherever the moment is taking you without questioning it. This was most clearly on display when glam metal bands went mainstream in the '80s, ushering in the golden age of the virtuoso guitar solo and near-operatic vocal delivery. But it showed up also in the era's cornucopia of cheerful synth-pop riffs, as well as jangly, joyful college rock.

When the early X-ers took up songwriting, rock's frontmen were suddenly from a more self-conscious and ironic generation. Stylistically, it meant that the shaman-like performance of the spellbinding guitar solo was over, and that vocal delivery would be more aware of its own emotional state, or more affected — twee rather than carefree on the upbeat side, angsty rather than tortured on the downer side.

During this transition, along came grunge. Temple of the Dog was made up of members from Soundgarden and Pearl Jam, before either group exploded in popularity. Pursuing a hunch, I found out that singers Chris Cornell and Eddie Vedder are both late Boomers. Pearl Jam was roughly half Boomers and half X-ers, while Soundgarden was all Boomers aside from the bassist.

And sure enough, Soundgarden always felt like the evolution of '80s metal, which was created by their generation-mates, albeit at an earlier stage of their lives. Pearl Jam sounded more of-the-Nineties (more self-aware, less abandoned), though more rooted in the sincerity of college rock bands from the '80s than sharing the irony of '90s alternative rock.

Which groups had a solid Gen X basis? Nirvana had no Boomers — no surprise there. Neither did Alice in Chains. Stone Temple Pilots were all X-ers aside from their guitarist. This was the angsty side of grunge (self-consciously angry), with the funky riff of "Man in the Box" pointing the way toward the aggro, rap-influenced metal of the late '90s (Korn, Limp Bizkit, etc.).

Screaming Trees were equally Boomer and X-er, and "Nearly Lost You" sounds pretty easygoing by alternative standards.

And other Boomer-heavy groups? The girl groups, as it turns out. In L7 and Babes in Toyland, only the bassists were X-ers; the rest were Boomers. On their first, grungier album, Hole consisted of Boomers (I couldn't find the birth year for the drummer, though). Recall an earlier post which showed all-female bands peaking in popularity during the '80s — the girl grunge bands were a fading generational echo.

The more self-conscious mindset of women in Gen X made it difficult or impossible to get into a state of abandon needed for grunge music, which was only partly introspective — and partly keeping the free-wheeling spirit of the '80s alive. When I think of the prototypical wild child, she's a late Boomer like the girls in Fast Times, the women of carefree '80s porn, and the real-life basis for the protagonist of Story of My Life by Jay McInerney.

Generations keep their ways well beyond their formative years, almost like a language that they were surrounded by and continue to speak, regardless of what new languages may have shown up in the meantime. If cultural change were only a matter of a changing zeitgeist, then Pearl Jam and Nirvana should have sounded much more similar than they did. And if those differences were a matter of being at different life stages at the time, why were the older guys more free-wheeling and the younger guys more reserved? It came down to a changing of the generational guard.

Today's immigrants are revolutionizing the way we experience disease — again

With ebola in the news, it's worth placing it in the broader context of exotic epidemic diseases cursing status-striving societies, where laissez-faire norms open the floodgates to all manner of vile pollution from outside.

The last peak of status-striving, inequality, and immigration was circa 1920, right when the Spanish flu pandemic struck. During the Great Compression, when immigration had nearly ground to a halt, epidemic diseases looked to become a thing of the past. That sunny man-on-the-moon optimism of the 1960s would be undone by the Me Generation of the '70s. Not coincidentally, old diseases began rearing their ugly heads once more.

See this earlier post that examined the rise and fall and rise of epidemic diseases and pests, in tandem with the trends of inequality and immigration. Vaccines, hygiene, public health initiatives, etc., seem to have little or nothing to do with the fall, which in several cases was well underway before the vaccine was even discovered, let alone administered across the population.

It could have boiled down to something simple like the body not being burdened by as much stress as in hyper-competitive times, and keeping the ecosystem of diseases manageable by not allowing in boatload after boatload of new strains. Since ancient times, global migration has spread contagious diseases, but the world became less intertwined during the Great Compression.

Ebola will not become the pandemic of our neo-Gilded Age because it doesn't spread so easily. But ebola is just the tip of the iceberg of what is pouring into our country, and Western countries generally. It is not an isolated curiosity, but part of a larger population of pathogens being trucked and flown into this country every day.

One of those bugs, as of now an unseen up-and-comer, will soon enjoy its glory days just as the Spanish flu did 100 years ago. You can't say that America doesn't encourage the underdogs to give it their all.

October 2, 2014

When a girl gets taken advantage of, liberals cry rape, pseudo-cons shrug shoulders

There's really nobody to cheer for in the ongoing battle about "rape culture." The hysterical liberals / feminists are cheapening the charge of rape when they apply it to situations where a girl, usually intoxicated, gets taken advantage of. That involves a betrayal of trust, not the threat or use of violence.

Liberals used to concern themselves with matters of unfair treatment, injustice, one party taking advantage of another, and so on. You'd think the common-enough case of a drunk girl getting taken advantage of would be right up their alley. Hard to think of a better textbook example of someone using a higher bargaining position (clear-minded vs. intoxicated) to take advantage of another person.

Yet liberals these days don't talk much about the little person being taken advantage of (umm, vote for the Tea Party much? Cuh-reeeepy...) Most of them became numb to populism a couple decades ago. Hence every objection must be about preventing harm and providing care, no matter how unfitting this approach may be in any given case.

On the other hand, much of the so-called conservative reaction is to say, "Meh, what was she expecting? Anyway, no crime, no punishment. Next topic." While at least seeing past the blubbering about violence and harm, this response still shows a callousness toward a growing problem of young women getting taken advantage of while intoxicated, and while away from anyone who would look out for their interests, i.e. their families.

Sure doesn't sound like a world any of us want to live in, but pseudo-cons are so concerned with kneejerk reactions against the side of political correctness that they won't admit how unwholesome the situation is.

More and more, in fact, the pseudo-con position on any aspect of our fucked up world can be simplified as: "Unwholesomeness -- if you can't get used to it, you're a pussy." Real I-like-Ike kind of values.

This is the gist of some comments, copypasted below, that I left at this post at Uncouth Reflections about how hysterical accusations of rape are getting, that they're claiming ever less harm-based forms of not-so-consensual sex as rape.

That post was in turn motivated by this item at Time about Lena Dunham's new book, in which she relates a story about being "raped" in college. She was drunk and/or on Xanax, was insistent about going back with the guy who took advantage of her -- over an explicit attempt by a hapless white knight to steer her away from it -- and talked dirty to the guy during the act.

It never occurs to her that she was raped (she's thinking of actual rape, including force / violence). Her friend is the one who tells her she was raped, and convinces her. They are now thinking of metaphorical rape, and literal getting-taken-advantage-of. In the comments below I speculate about how our society has gotten here, making use of Haidt's framework of the moral foundations that liberals vs. conservatives draw on in their moral intuitions.

* * *

Incidents like Dunham’s are obviously not rape, they are getting taken advantage of. Why can’t today’s leftoids speak out against vulnerable people getting taken advantage of? Partly because the women make themselves vulnerable in the first place by blasting their brain with so many hard substances in rapid succession, in a public place where they know strangers only have one goal on their mind.

But there must be something more to it. We would object to a sleazy pawn shop owner offering a couple bucks for a pristine 1970s receiver if it came in from a clearly stoned-out customer who says he just wants some cash to feed the munchies, man, you gimme anything for this thing?

These cases involve not harm or force but violation of trust. She trusted the guys at the party not to be that type of sleazeball; the stoner trusted the shopkeeper to give him honest treatment. When they wake up the next morning worrying, “Oh God, what did I do?” they feel betrayed or lied to.

Why insist on framing it as harm when it is not? Liberals are becoming so infantilized that it’s the only moral foundation they can appeal to anymore. “Mommy, that person hurt me!” No, they took advantage of you — it’s still wrong, but different from harming you. Kids don’t get it because they’re naive and don’t know about the possibility of having their trust betrayed.

Grown-ups should, though, and the fact that liberals cannot even appeal to their second-favorite moral foundation — fair treatment — shows how stunted this society has become.

Betrayal also taps into the moral foundation of community or in-group cohesion, but liberals are numb to that. If you’re both members of the same community (and typically they are even closer, being at least acquaintances), how could you think of taking advantage of her? Shame on you. Get out, and don’t come back until you’ve made up for it.

But liberals value hedonism, laissez-faire, individual advancement, and the other quasi-libertarian leanings that most Republicans hold. Hence they cannot object on the basis of wanting to prevent people from taking advantage of others. Under hedonism, that is guaranteed. And if laissez-faire and non-judgementalism are sacrosanct, who’s really to say that we’re in a place to judge a mere sleazeball who takes advantage of others? I mean, it’s not like he’s using force or violence.

Therefore, yes, he must have used force or threatened violence — that’s the farthest that the liberal is willing to draw the boundary. If they have an instinctive revulsion, it must be framed as some kind of harm, because anything less severe than that but still repugnant (like taking advantage of a drunk chick at a party) lies on the “fair game” side of the moral boundary line. Cognitive dissonance kicks in, so they rationalize what happened as harm / rape.

I alluded to it by calling Republicans quasi-libertarians, but let me say that too many conservatives don’t know how to react to these scenarios either. They know that the liberals are hysterically over-exaggerating, and like to invent new classes of victimhood, so their instinct is to dismiss these cases altogether.

But a drunk girl getting taken advantage of at a party isn’t a brand-new victim class. I’m sure that was a concern in Biblical times when the wine began flowing. It’s not as though she were a black who was denied admission to law school due to low LSAT scores, or a mentally ill tranny who feels robbed because ObamaCare won’t fund his castration / mangina surgery.

Brushing the Dunham type cases aside, like “Bitch deserved what she got for getting drunk,” or “Bitches need to learn to fend for themselves when drunk at college parties,” is too far in the anti-PC direction, however understandable the revulsion toward PC is.

I don’t sense any of that here — that’s more of a shrill Men’s Rights thing — but it’s worth emphasizing that it isn’t only liberals who are dumbfounded when they try to articulate their reaction to “drunk chick gets taken advantage of.” They both rely too heavily on laissez-faire and hedonistic norms to be able to say there’s something wrong with someone betraying another’s trust.

September 30, 2014

The crappy digital look: Demystifying the lie about how sensors vs. film handle light sensitivity

In this installment of an ongoing series (search "digital film" for earlier entries), we'll explore another case of digital photography offering supposedly greater convenience at the cost of compromised image quality. The end result is pictures that are too harshly contrasting, more pixelated, and color-distorted where it should be white, black, or shades of gray.

This time we'll look into the properties of the light-sensitive medium that records the visual information being gathered into the camera by the lens. I have read only one off-hand comment that grasps the true nature of the differences between digital and film at this stage of capture, amid mountains of misinformation. The only good article I've read is this one from Digital Photo Pro, "The Truth About Digital ISO," although it is aimed at readers who are already fairly familiar with photographic technology.

Given how inherent the difference is between the two media, and how much it influences the final look, this topic is in sore need of demystifying. So hold on, this post will go into great detail, although all of it is easy to understand. By the end we will see that, contrary to the claims about digital's versatility in setting the light-sensitivity parameter, it can do no such thing, and that its attempt to mimic this simple film process amounts to what used to be last-resort surgery at the post-processing stage. One of the sweetest and most alluring selling points of digital photography turns out to be a lie that has corrupted our visual culture, both high and low.

To capture an image that is neither too dark nor too bright, three inter-related elements play a role in both film and digital photography.

The aperture of the lens determines how much light is gathered in the first place: when it is wide open, more light passes through; when it is closed down like a squint, less light passes through. But light is not continually being allowed in.

The shutter speed regulates how long the light strikes the light-sensitive medium during capture: a faster shutter speed closes faster after opening, letting in less light than a shutter speed that is slower to close, which lets in more light.

Last but not least, the light-sensitive medium may vary in sensitivity: higher sensitivity reacts faster and makes brighter images, lower sensitivity reacts slower and makes dimmer images, other things being equal. This variable is sometimes labeled "ISO," referring to the name of a set of standards governing its measurement. But think of it as the sensitivity of the light-reactive material that captures an image. This scale increases multiplicatively, so that going from 100 to 200 to 400 to 800 is 3 steps up from the original. Confusingly, the jargon for "steps" is "stops."

A proper exposure requires all three of these to be in balance — not letting in too much or too little light through the lens aperture, not keeping the shutter open too long or too briefly, and not using a medium that is over-sensitive or under-sensitive. If you want to change one setting, you must change one or both of the other settings to keep it all in balance. For example, opening up the lens aperture lets in more light, and must be compensated for by a change that limits exposure — a faster closing of the shutter, and/or using a medium that is less light-sensitive.
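The balancing act above is just arithmetic in stops, i.e. base-2 logarithms. Here is a minimal Python sketch of the trade-off (my own illustration, not from the original article; every function name is invented for the purpose):

```python
import math

def aperture_stops(f_number, base=1.0):
    """Stops of light lost relative to a base f-number.

    Light admitted scales with the aperture area, i.e. with 1/f_number^2,
    so each one-stop step multiplies the f-number by sqrt(2).
    """
    return 2 * math.log2(f_number / base)

def shutter_stops(seconds, base=1.0):
    """Stops of light gained relative to a base shutter time."""
    return math.log2(seconds / base)

def iso_stops(iso, base=100):
    """Stops of sensitivity gained relative to a base ISO (100 -> 200 is +1)."""
    return math.log2(iso / base)

def exposure_stops(f_number, seconds, iso):
    """Total exposure, in stops, relative to f/1, 1 s, ISO 100."""
    return shutter_stops(seconds) + iso_stops(iso) - aperture_stops(f_number)

# Opening the aperture from f/8 to f/5.6 adds roughly one stop of light;
# shortening the shutter from 1/60 s to 1/125 s takes roughly one stop away,
# so the overall exposure stays (almost exactly) in balance.
before = exposure_stops(8.0, 1/60, 100)
after = exposure_stops(5.6, 1/125, 100)
```

Note how ISO 100 to 800 comes out to exactly three stops, matching the 100-200-400-800 progression described above.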

Between digital and film, there are no major differences in two of those factors. The lenses can be opened up and closed down to the same degree, whether they are attached to camera bodies meant for one format or the other. And the shutter technology follows the same principles, whether it is opening up in front of a digital or film recording medium. (Digital cameras may offer slightly faster maximum shutter speeds because they are more recent and incorporate improvements in shutter technology, not because of digital properties per se.)

However, the two formats could not be more different regarding light-sensitivity of the recording medium.

Film cameras use rolls of film, which are loaded into and out of the camera on a regular basis. Load a roll, take however-many pictures, then unload it and send it off to the lab for development. The next set of pictures will require a new roll to be loaded. Digital cameras have a light-sensitive digital sensor which sends its readings to a memory card for later development and archiving. The sensor is hardwired into the camera body, while the memory card is removable.

Thus, no matter how many pictures you take with a digital camera, it is always the exact same light-sensitive piece of material that captures the visual information. With a film camera, every image is made on a new frame of film.

A digital sensor is like an Etch-a-Sketch that is wiped clean after each image is made, and used over and over again, while frames and rolls of film are like sheets of sketching paper that are never erased to be re-used for future drawings. The digital Etch-a-Sketch is just hooked up to a separate medium for storing its images, i.e. memory cards. Frames of film are both an image-capturing and an image-storage medium wrapped up into one.

Whether the light-sensitive material is always fresh or fixed once and for all has dramatic consequences for how it can be made more or less reactive to light — the third crucial element of proper exposure.

Film manufacturers can make a roll of film more reactive to light by making the light-sensitive silver halide crystals larger, and less reactive by making the crystals smaller. Hence slow films produce fine grain, and fast films large grain. What's so great is that you can choose which variety of film you want to use for any given occasion. If you're worried about too much light (outdoors on a sunny summer afternoon), you can load a slowly reacting film. If you're worried about not getting enough light (indoors in the evening), you can load a fast reacting film.

It's like buying different types of sketching paper depending on how much response you want there to be to the pencil lead — smooth and frictionless or bumpy and movement-dampening. Depending on the purpose, you're able to buy sketchpads of either type.

What was so bad about the good old way? The complaints boil down to:

"Ugh, sooo inconvenient to be STUCK WITH a given light sensitivity for the ENTIRE ROLL of film, unable to change the sensitivity frame-by-frame. What if I want to shoot half a roll indoors, and the other half outdoors?"

Well, you can just buy and carry two rolls of film instead of one — not much more expensive, and not much more to be lugging around. And that's only if you couldn't compensate for changes in location through the other two variables of aperture size and shutter speed. For the most part, these were not big problems in the film days, and served only as spastic rationalizations for why we absolutely need to shift to a medium that can alter the light-sensitivity variable on a frame-by-frame basis, just as aperture size and shutter speed can be.

That was the promise of digital sensors, which turns out to be a fraud that the overly eager majority have swallowed whole, while enriching the fraudsters handsomely.

Digital cameras do offer a means for making the image look as though it had been captured by a material that was more sensitive or less sensitive to light, and this variable can be changed on a frame-by-frame basis. But unlike film rolls that may have larger or smaller light-sensitive crystals, the photodiodes on the digital sensor have only one level of sensitivity, inherent to the material they are made from.

Because this sensitivity is baked into the materials, it certainly cannot be altered by the user, let alone on a frame-by-frame basis. And because the sensor is not removable, the user also has no recourse to swap it out for another with a different level of sensitivity.

How then do digital cameras attempt to re-create the many degrees of sensitivity that film offers? They choose a "native" sensitivity level for the photodiodes, which can never be changed, but whose electronic output signal can be amplified or dampened to mimic being more or less sensitive in the first place. In practice, they set the native (i.e. sole) sensitivity to be low, and amplify the signal to reach higher degrees, because dampening a highly sensitive "native" level leads to even lower quality.

Most digital cameras have a native (sole) sensitivity of ISO 100 or 160, meant to evoke the slowly reacting less sensitive kinds of film, and allow you to amplify that signal frame-by-frame, say to ISO 800, 3200, and beyond. But remember: it is never changing the "ISO" or sensitivity of the light-reactive material in the sensor, only amplifying its output signal to the memory card.
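The gain arithmetic behind that amplification can be sketched in a few lines of toy Python. To be clear, the numbers, variable names, and the simple additive-noise model below are illustrative assumptions, not any real camera's processing pipeline:

```python
NATIVE_ISO = 100  # assumed fixed "native" sensitivity of the sensor

def capture(photons, iso, read_noise=0.0):
    # Toy model: the photodiode's response to light never changes;
    # a higher "ISO" setting only multiplies the output by a gain
    # factor, which boosts electronic noise along with the signal.
    gain = iso / NATIVE_ISO          # "ISO 800" is just an 8x gain
    return (photons + read_noise) * gain

# A dim scene (few photons) with a little fixed electronic noise:
print(capture(5, 100, read_noise=2.0))   # at native ISO: 7.0 raw units
print(capture(5, 3200, read_noise=2.0))  # amplified: 224.0 — the noise is now 32x larger
```

The point of the sketch is that the ratio of noise to signal never improves: multiplying after the fact only makes the picture brighter and noisier at the same time, which is why high "ISO" digital frames look the way they do below.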

It is like always recording sound at a low volume, and then using a dial on an amplifier to make it louder for the final listening, rather than record at different volume levels in the initial stage. And we all know how high-quality our music sounds when it's cranked up to 11. It does not sound "the same only louder" — it is now corrupted by distortions.

We should expect nothing less from digital images whose "ISO" was dialed up far beyond the native (sole) sensitivity of 100 or 160.

Below are some online digital test shots taken with the lens cap fully in place, blocking out most light, with higher and higher settings for the faux-sensitivity ISO setting. Now, these images should have remained black or gray the whole way through. The only change that would have occurred if they were shot on more and more highly sensitive film material is a grainier texture, owing to the larger film crystals that make film more sensitive, and an increase in brightness, since what little light was sneaking in past the lens cap would have produced a stronger reaction.

And yet look at the outcome of a digital sensor trying to see in darkness:

Not only does the texture get grainier and the light level brighter when the native (sole) sensitivity is amplified; there are now obvious color distortions, with a harsh blue cast emerging at higher levels of sensor amplification.

What's worse is that different cameras may produce different kinds of color distortions, requiring photographers to run "noise tests" on each camera they use, rather than knowing beforehand what effects will be produced by changing some variable, independent of what particular camera they're using.

The test shots above were from a Canon camera. Here's another set from a Pentax, showing a different pattern of color distortions.

Now it's red instead of blue that emerges at higher levels of amplification. Red and blue are at opposite ends of the color spectrum, so that shooting a digital camera without test shots is like ordering a pizza, and maybe it'll show up vegetarian and maybe it'll show up meat lover's. Unpredictable obstacles — just what a craft needs more of.

These distortions can be manipulated in Photoshop back toward normal-ish, but now you've added an obligatory extra layer of corrections in "post" just because you want to be able to fiddle with light-sensitivity frame-by-frame, which you're not really doing anyway. Convenience proves elusive yet again.

So, if amplification of the native (sole) light sensitivity is not like using film rolls of different sensitivities, what is it like? As it turns out, it is almost exactly like a treatment from the film era called push-processing, which was a last-ditch rescue effort in the developing stage after shooting within severe limitations in the capturing stage.

Suppose you were shooting on film, and your only available rolls were of sensitivity ISO 100, which is a slowly reacting film best suited for outdoors in sunlight. Suppose you wanted to shoot an indoor or night-time scene, which might call for faster reacting film, say ISO 400. Could it still be done with such low-sensitivity film? You decide to shoot in the evening with a slow film, effectively under-exposing your film by 2 stops, worried the whole time that the images are going to come back way too dark.

Lab technicians to the rescue! ... kind of. If you let them know you under-exposed your whole roll of film by 2 stops, they can compensate for that by allowing your film to soak in the chemical developing bath for a longer time than normal, allowing more of those darkened details to turn brighter. (The film starts rather dark and the developing bath reveals areas of brightness over time.) Taking 100 film and trying to make it look as sensitive as 400 film is "pushing" its development by 2 stops.
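The stop arithmetic behind "pushing by 2 stops" is just a base-2 logarithm of the ISO ratio. A minimal sketch (the function name is mine):

```python
import math

def stops_of_push(shot_as_iso, rated_iso):
    # Each photographic stop is a doubling of sensitivity, so the gap
    # between two ISO ratings is the base-2 log of their ratio.
    return math.log2(shot_as_iso / rated_iso)

print(stops_of_push(400, 100))   # ISO 100 film shot as 400 → 2.0 stops
print(stops_of_push(3200, 400))  # ISO 400 film shot as 3200 → 3.0 stops
```

The lab tech's longer soak in the developer is what pays back those stops of under-exposure — at a cost, as Kodak explains below.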

But if that were all there were to it, nobody would've bothered using films of different sensitivities in the capturing stage — they would've let the lab techs worry about that in the developing stage. The costs of push-processing are various reductions in image quality, which Kodak's webpage on the topic summarizes in this way (click the link for fuller detail):

Push processing is not recommended as a means to increase photographic speed. Push processing produces contrast mismatches notably in the red and green sensitive layers (red most) compared to the blue. This produces reddish-yellow highlights, and cyan-blue shadows. Push processing also produces significant increases in film granularity. Push processing combined with under exposure produces a net loss in photographic speed, higher contrast, smoky shadows, yellow highlights and grainy images, with possible slight losses in sharpness.

Not a bad description of the signature elements of the digital look, is it? Blue shadows are exactly what the Canon test shots showed earlier.

Interestingly, Kodak notes that although push-processing produces less sharp images, those images may subjectively appear to be normally sharp, given the increase in contrast. Sure, if a subject is wearing a normal red shirt and normal blue jeans, and you crank up the contrast parameter, the picture looks more defined — ultra-red juxtaposed against ultra-blue. But we're only fooling ourselves. Sharpness means how clear and crisp the details are, and push-processing and its obligatory counterpart in the digital world are actually losing details, while distracting us with more strongly contrasting colors.

Remember, this is what a digital camera is doing each time it takes a picture outside of its native (sole) sensitivity level of 100 or 160, i.e. when you shoot indoors, at night, or on cloudy days. In the digital world, every image is immediately rushed into emergency surgery.

Is there a way to compare side-by-side a film image that was processed both normally and with push-processing? Unfortunately, no, since developing the negative image from the latent image on the film cannot be undone, and then done a different way. I suppose you could take a shot of the same scene, with two identical cameras and two identical rolls of film, but with one camera set to the true sensitivity and the other set inaccurately, then develop the normal one normally and the under-exposed one with push-processing. That sounds like a bit too much just to make a perfect textbook comparison of normal vs. push-processed images, and I couldn't find any examples online.

But there are examples of film that has been push-processed. Although we can't compare them side-by-side with normally developed versions of the same film frame, at least we can pick up on some of the typical traits that push-processing introduces. Below is an example from this series at a photographer's website. The film is ISO 400, but was push-processed to look like ISO 3200. That is 3 stops of pushing, whereas Kodak and other photography guidebooks advise never pushing past 2 stops of over-development.

It's disturbing how digital this film photograph looks. It looks like someone opened a digital image in Photoshop and cranked up the contrast and saturation settings. Look for details on the man's shirt and pants, like folds and creases. They're hard to make out because push-processing renders the image less sharp. But we're distracted by how striking the contrast is between these overly rich yellows and reds and the cooler blues. It looks more defined, but is poorer in detail.

It's almost like a child drew an outline of pants and hit "fill" with yellow on MS Paint. Very little detail. The yellow pole also looks like a crude "fill" job. Even worse, these pictures were shot on medium-format film, which has a far higher resolution than the 35mm film we're all used to. It ought to have detail so fine that you could blow it up into a poster or banner without blurring of the details.

We also see the familiar blown-out sky from digital Earth, rather than the blue one we know and love. Other white areas look like intense spotlights, too. I can't tell if they have the red-yellow tint to them as Kodak warned, although they do look kind of bright pale yellow. There aren't many dark shadows to tell if they have the bluish tint warned about, although the asphalt on the road looks blue-gray. The color distortions might be more obvious if we had the same scene captured and developed normally, for comparison.

The ultra-contrasty, overly saturated, harshly blown-out bright areas are hard to miss, though. And they look like something straight from a digital camera plus Photoshop settings dialed up to 11.

You might object that, hey, this guy knows what he's doing, and he's using push-processing to give the pictures a flamingly dramatic style (he's gay). That misses the point: these kinds of distortions and reductions in image quality are built in with digital photography's light-sensitivity technology. They aren't going to be chosen purposefully for some intended artistic effect. They're just going to make ordinary people's pictures look cartoony and crappy because they don't know about them before buying a digital camera, and won't mind anyway because digital is all about convenience over quality.

Even Hollywood movies shot by pros will be subject to these digital distortions, although they'll have much better help cleaning them up in post — for a price. Good luck scrubbing your digital images that clean on your own with Photoshop.

In the end, is digital really more convenient, all things considered? All of these distortions require laborious and expensive corrections, which may well offset the efficiency gains that were hoped for at the beginning. Or those corrections simply won't be done, and greater convenience will have been traded off against poorer quality. Either way, one of the fundamental promises of digital photography turns out to be a big fat lie.