October 31, 2014

Extended family contact by transplant vs. native residency

In an earlier post on differing levels of contact with extended family across regions, I stated that one factor underlying the pattern was the differing share of residents who are transplants to the region, since being a transplant cuts down on how often you keep in contact with family back home.

How much less contact do transplants actually have with their families? I looked at levels of contact with three different types of extended family groups for both natives and transplants. Racial groups have different patterns of migration and family contact, but that turned out not to affect the split between natives and transplants in level of extended family contact, so I left all races in.

Here's the breakdown for whether they've been in contact with the following groups during the past four weeks (among those who have living relatives of the type):

Cousins -- 50% of natives, 40% of transplants

Uncles and aunts -- 53% of natives, 40% of transplants

Nieces and nephews -- 70% of natives, 52% of transplants
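For anyone who wants to reproduce a breakdown like the one above from a GSS extract, here's a minimal sketch, assuming a pandas DataFrame containing the variables listed at the end of the post (cousins, uncaunts, niecenep, region, reg16). The file name and the response codings in the comments are assumptions, so check them against the GSS codebook before trusting the output.

```python
import pandas as pd

# Hypothetical GSS extract; variable names follow the post, but the
# response codings below are assumptions -- verify against the codebook.
df = pd.read_csv("gss_extract.csv")

# Transplant = current region differs from the region lived in at age 16.
df["transplant"] = df["region"] != df["reg16"]

for var in ["cousins", "uncaunts", "niecenep"]:
    # Assume codes 1-2 = some contact in the past four weeks, 3 = no contact,
    # and code 4 = no living relatives of that type (excluded here).
    alive = df[df[var].isin([1, 2, 3])]
    any_contact = alive[var].isin([1, 2])
    rates = any_contact.groupby(alive["transplant"]).mean()
    print(var, f"natives: {rates[False]:.0%}", f"transplants: {rates[True]:.0%}")
```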

Remember that the questions don't specify whether you kept in contact by meeting face to face, or by writing or calling. With nearly half of transplants saying they kept in contact with these groups as recently as the past four weeks, despite probably not living nearby, a good deal of those respondents must have taken it to include mediated contact.

These results understate the difference in levels of face-to-face contact, which could be closer to zero for transplants, if contact of any kind -- mediated included -- is already this low.

It's easy to blame technology for isolating us from those who we ought to be in contact with, especially in person. But here we see a vivid reminder of how simple it is to sever the ties to your extended family -- just move away, or perhaps they will. As long as the split is not acrimonious -- you're just leaving to better yourself -- no one will be bitter about the diluted and fragmented family web. It'll be one of those things that just happen, mysteriously and uncontrollably.

I don't see things changing course due to a change in attitudes toward family ties. There's too strong of an impulse toward self-enhancement, rather than maintenance and enhancement of everything else that made you.

But we may not have to wait for a change in attitudes. There's more than one way to keep people from moving away -- saturated real estate and job markets, and a general lack of preparation for life after college (where they goofed off for four years) among Millennials. "Boomerang kids" who live at home well into their 20s and 30s are becoming more of a reality, reversing the trend of being a transplant during one's 20s.

They'll be in contact with their extended family more than earlier generations during that stage of life, whether they like it or not.

GSS variables: cousins, uncaunts, niecenep, regtrans (created from region and reg16)

October 30, 2014

Millennials reversing the trend of being a transplant during one's 20s

With gloomier job and housing prospects facing the most sheltered generation in world history, the Millennials are becoming "boomerang kids" who leave for four years of goofing off at college, and return home for their 20s, maybe longer.

One unnoticed but important side-effect of this shift is that they won't be contributing to the transplant phenomenon as much as earlier generations did during the same stage in life.

The General Social Survey asks questions about what region of the country you were living in at age 16, and where you're living at the time of the survey. I created a transplant variable that looks for a mismatch between the two answers, and looked at age and cohort patterns.

Cohorts are five years long, and the age group was 23 to 29, in order to make sure they were out of their college years. Only whites were studied, as races show different migration patterns, and sample sizes are not very large for non-white groups when restricted to such a narrow age range across multiple cohorts.
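As a rough illustration of that setup (not the exact GSS recodes), here's a sketch of how the transplant rate per cohort could be computed, assuming a pandas DataFrame with the variables listed at the end of the post; the race coding (1 = white) and the file name are assumptions.

```python
import pandas as pd

# Hypothetical GSS extract; variable names follow the post (region, reg16,
# cohort, age, race), but the race coding (1 = white) is an assumption.
df = pd.read_csv("gss_extract.csv")

# regtrans: mismatch between region at age 16 and region at the interview.
df["regtrans"] = (df["region"] != df["reg16"]).astype(int)

# Whites only, restricted to the post-college 20s (ages 23-29).
sub = df[(df["race"] == 1) & (df["age"].between(23, 29))]

# Five-year birth cohorts: 1945-49, 1950-54, ..., 1985-89.
sub = sub.assign(cohort5=(sub["cohort"] // 5) * 5)

# Share of each cohort that was a transplant during that age window.
print(sub.groupby("cohort5")["regtrans"].mean().round(2))
```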

The eight cohorts within Boomers (1945-64) and X-ers (1965-84) all had a nearly 20% chance of being a transplant during their post-college 20s. With the '85-'89 births, there's a sudden drop to 12%, cutting their chances roughly in half. The results do not depend on whether you look at people who didn't go to college, or those who had at least one year of college.

The recession is a non-starter: many other generations faced recessions during their 20s, yet didn't hang around their home region (let alone their home). The sudden drop suggests that a clear breaking point has been reached for the broader socioeconomic structure, like the higher ed, job, and real estate markets, not simply recession or no-recession. Presumably those born in the '90s will be even less likely to bother chasing fame and fortune by leaving behind their native region.

Massive, unregulated, me-first migration patterns not only dislocate individuals from the social networks that they're most attached to, they destabilize the broader ecosystem -- both where they came from, which is losing natives and their native ways, as well as where they're moving to, which cannot cope with such an influx of outsiders and their outside ways.

Not that Millennial strivers wouldn't love to play their part in the transplant royal rumble -- they're simply less able to make it happen, being even more unprepared for real life than those who came before them, and with so many of the spots already taken up. Perhaps they'll rationalize the situation they've been forced into, and come to prefer living in the same general region that they grew up in. And perhaps they'll pass along this attitude and received wisdom to the generation after them.

The great big transplant shoving match may therefore come to an end, not by consciousness raising but by over-saturation.

GSS variables: regtrans (created from reg16 and region), cohort, age, race, educ

October 27, 2014

New Urbanism hijacked for leisure-class contests, and the plague of cars turning suburban streets into one-way roads

Not for the first time, I wrote what started as comments but soon morphed into an entire post on another site (this post at Uncouth Reflections reviewing a documentary on New Urbanism). This seems more likely when I've had a drink and am only focused on the now. You've heard of drunk texting -- this is drunk comment-spamming.

I'll just copy & paste the comments, rather than edit and fill them out into full posts. There's plenty more to say, so just riff on them in the comments, and I'll chime in again. (Like how I forgot to mention how much worse the parked car plague is where spics live. Six cars lining the curb in front of a "one-family" suburban house -- that's a sign of the Mexican invasion for sure. But I digress...)

The first is about how New Urbanism has turned out in reality, all these years after starting out as an unheard-of movement and then being so widely adopted by the right kinds of people living in the right kinds of places. It promised a return to Main Street, but has built only playgrounds for the leisure class to publicly indulge in their status contests.

The second is more focused, on the topic of how clogged with parked cars the typical residential street is nowadays in suburban America, how recent of a change that has been, and what this example shows about the power of design and public planning to shape behavior when attitudes of individuals are pushing in the opposite direction.

* * *

The lack of reflection this far into the craze for New Urbanism is unsettling. Y’know, it’s not 1992 anymore, and the movement isn’t some underdog vanguard but the Next Big Thing that every SWPL enclave has been pushing through for at least the past 5 years, and more like 10 or 15.

That photo of the public square in New York sums up what’s gone wrong (or has revealed what had always been wrong from the start): New Urbanism has become (always was?) a brainstorming session / policy bandwagon for how to make the wealthiest neighborhoods in the wealthiest cities even more insanely epic playgrounds for the sponges who dwell nearby. That could either be loafer / hipster sponges, or finance / Big Law / PR / other bullshit sector sponges making a ton of money from parasitic professions.

There’s absolutely nothing civic, communal, cohesive, or enriching about these large playground oases in the urban jungle. Just a bunch of sponges sitting around indulging in some conspicuous consumption (where’s your coffee from? where’s your panini from?) and conspicuous leisure (1pm and I’m lounging in public, with designer clothes and perfect hair — jealous much?). There’s never any connection or awareness of the other people in these places. They’re all drones vibrating in their own little cell within the larger hive.

Don’t be fooled by the pairs of people who appear to be interacting with another person. The other person is just a social image prop, and gets no attention, which is instead directed at the hive in general.

You ever notice how loud and over-sharing their conversations are, and how their eyes are always darting around to see how many other drones are giving unspoken “likes” to the speaker? When they aren’t talking, they are dead silent for hours at a stretch, never looking up toward the other, glued to their private glowing screen. No affection or closeness — they only “interact” when their speech and mannerisms can suck in attention from the hive.

Apart from the psychological segregation, contra the intimacy the New Urbanist cheerleaders promised we’d have, there’s the naked leisure-class nature of all the surrounding “small shops,” invariably 90% quirky foodie joints, and 10% quirky yoga, quirky doggie spas, and quirky clothing. Somehow that’s not what my grandfather would have imagined when New Urbanists spoke of a return to Main Street. These preening useless faggots would have gotten food thrown at them from passing cars back in those days.

Where do they buy their household tools? From a mom & pop hardware store? No — by ordering some Chinese piece of shit from Home Depot’s website. Where do they buy their music and movies? From iTunes (if they’re old) or more likely from some online streaming service. Consumer electronics? Amazon, or once a year a trip to the Apple Store where they actually buy something.

It’s pathetic how little variety there is in areas struck by the New Urbanist craze, and how much all of that stuff has migrated online due to airheaded consumer choice. I could have sampled a wider variety of stuff from a mall back in the ’80s — and they had professionals’ offices there too.

Defenders of New Urbanism will say that it wasn’t intended, that this is a hijacking or adulteration by wealthy interests, that the originators were more populist. Maybe — maybe not. The point is: this is what the mania has produced in reality, and it’s time to start taking stock of that, and coming up with ways to wipe out all of this airheaded elitist shit and return city and town life to more populist and enriching ways. Not by continuing to cheerlead for the craze like these designers and architects do.

It’ll be better if the new movement doesn’t have the words “new” or “urbanism,” to avoid confusion and tainting.

It would greatly help matters to identify designers, architects, and policy makers by one of three types, so we know who we’re dealing with and how to treat them.

1) Kool-Aid drinkers. These people truly get an endorphin rush from turning entire neighborhoods into leisure-class playgrounds. Crazy, not worth trying to talk some common sense into.

2) Sell-outs. These individuals started off with the populist Main Street ideal as their model, but quickly figured out that egalitarian small-town ecosystems are not exactly gonna fly off the shelves in a climate of such intense status-striving and inequality. A fella’s gotta eat and pay rent, so whaddayagonnado? Not worth trying to convert, since they only worship the almighty dollar, and they will not fall for the lie / clueless naive suggestion that somehow, someway the Main Street model could be made to be as profitable, or more, than the leisure-class playground model.

3) Frustrated idealists. Bitter, overlooked, unappreciated, disgusted by what the formerly idealistic movement has devolved into (or again, how the hidden variation among the originators has made itself manifest). They feel sick for being a part of a movement that has swept aside the variety of stores that used to be found in suburban strip centers as recently as 25 years ago, all in the name of converting the place into a “lifestyle center” with food, drink, food, drink, food, food, food, spa, salon, crappy cell phone outlet, food, and food. All chains, all oriented toward leisure-class strivers.

Naturally only the last group is worth the time for ordinary people to talk to. But if they won’t identify themselves, and their distaste for where the New Urbanist craze has gone, it will be hard to start cleaning house.

* * *

The designer in the documentary is Danish, so I don’t expect him to be in touch with American trends. But New Urbanists have overlooked the most pedestrian-unfriendly car phenomenon of the 21st century — suburban streets that are narrowed into de facto one-lane paths because residents park their cars all along the curb, at every house.

This is not a design / planning problem, since just 25 years ago, roughly the same number of cars belonging to roughly the same number of residents on a suburban street were parked in the driveway, carport, or garage. It was normal for two-car houses to have both parked one behind the other in the driveway, and for someone to have to get out and move the back one if someone wanted to take the front one out. I remember doing that in the ’90s, though it was also starting to become common to park one in the driveway and one on the street.

What changed were attitudes toward private vs. public welfare. Individual convenience is maximized by parking one in the driveway and one or more on the street. Say goodbye to those unbearable 30 seconds of car-shuffling. But when everyone feels and acts that way, suddenly the whole street is clogged with parked cars. The two-way street is now one-way, and pedestrians who could have walked along the side of the road (the way we all used to) have nowhere to walk, unless there’s a sidewalk.

(Sidewalks are not the most common thing in suburbs, sadly, and even if there is one — how drone-like to have to follow a sidewalk in a quiet residential neighborhood, when you’re supposed to be walking through the streets because you own them, and only moving aside when you see a car approaching.)

This state of affairs points to the larger problem that is rarely discussed in New Urbanist forums — how easily does design change attitudes, and can a change in attitudes overturn the utopian design plan? (Answer: yes.) Driveways, carports, and garages were a design solution to the problem of streets clogged with parked cars — provided that folks who lived in multi-car houses put the good of the community above their own stingy quest for maximum convenience. You don’t see cars parked in driveways in the city — it was supposed to be a way that suburbanites could lick one of the city’s worst problems.

In the end, though, attitudes trumped design plans.

October 26, 2014

The etiology of women who seem like gay men: a look at Anne Hathaway

During the trailer for Interstellar, there's a shot of Anne Hathaway looking sideways with her mouth agape that struck me as something you'd see from a creepy homosexual camping out at Starbucks to scope out the latte-sipping twinks. I've never been a fan of hers and don't have a strong sense of her range of facial expressions, so I investigated a little on Google Images. The hunch paid off. Here are just a handful of shots of her showing gay-face:


The over-smiling, overly eager open eyes, raised eyebrows, and slackjaw are all hallmarks of the campy gay-face. All resemble caricatures of a child's expressions of "surprise" and "I'm such a little stinker." The basis of male homosexuality is stunting during the "ewww, girls are so yucky" phase of development (gays as Peter Pans), hence their more neotenous (child-like / infantilized) appearance and behavior.

I've covered these features at length elsewhere, but here we see something similar in a heterosexual woman. In fact she doesn't just look a lot more like a gay man than 99% of women do, she shares their emotional and behavioral tendencies as well. Thin-skinned, breaking down over the most trivial happy or sad causes. Naturally campy and caricatured, not acting that way to be ironic. Loose and into drugs during college. No shame in using her sexuality to get a rise out of others. Childishly naive, easily fooled and taken advantage of by her first husband. And most importantly, no apparent desire to have children and nurture them as a mother.

Granted, women are more child-like to begin with, but she is way off the charts for how naive, kiddie, weepy, campy, and non-maternal she is.

How did she get that way? In the case of men, it's probably Greg Cochran's idea of a "gay germ" that strikes in childhood. My take on that is that its effects are a broad developmental stunting — a Peter Pan syndrome — and not merely a narrow change in sex role behavior, sexual preference, etc.

That raises the question: if it can strike boys and produce gay effects, what would happen if it struck girls? I think the answer is women like Anne Hathaway. (It would not produce lesbians, since they are characterized by the opposite pattern — not childish, but menopausal.)

As it turns out, her brother is gay, so we know that she would have been at a similar environmental risk of exposure to the gay germ in childhood. And being so closely related, she would have had a similar genetic susceptibility to the germ's effects.

The plot thickens with her second marriage. After being scammed by an apparently sociopathic first husband, she decided to swear off heterosexual men altogether and got married to an obviously homosexual nobody named Adam Shulman. When every A-lister these days is part of a power couple, it's just a little bit strange for her husband to be a complete unknown, to have been a close friend beforehand (i.e., her gay BFF), and to look and act so effeminate and mincing, as though he were her kid brother rather than her lover and protector.

Other celebrity women have served as beards for closeted A-list men — Kim Kardashian for Kanye West, Cindy Crawford for Rande Gerber (Clooney's butt buddy), Julianne Hough for Ryan Seacrest, Jada Pinkett for Will Smith, and so on. But in these typical cases, the sham husband has wealth, influence, or looks that would "enhance the brand" of the sham wife.

In Anne's case, it would be inaccurate to call it a sham marriage, since it was not a cynical brand-enhancing contract, but an earnest attempt to elevate the status of her gay BFF-ship, in the same way that childish naive faggies believe that simply throwing a wedding will make their bond normal or special.

I don't mean to delve into so much celebrity gossip, but because their lives are so well documented, they do provide a window into a topic that would otherwise be completely opaque. Can you imagine getting funding to study how gay (not lesbian) the female relatives of gay men are? Maybe if you could spin it in some pro-homo way, but the whole topic of "what causes male homosexuality" is too radioactive these days.

Miley Cyrus is another example worth looking into. She comes off as a flaming queer trapped in a girl's body. And on Google Images, her brother Braison does a pretty good impression of a twink. But she's a little young to see whether or not she prefers getting married to a gay BFF. I give it a greater than 50% chance, though.

October 22, 2014

NFL players got bigger once competitiveness became sanctified

In a comment, Feryl said he saw a chart about how NFL players started getting heavier and heavier circa the 1970s, linking it to the status-striving and inequality trend.

After some googling, I found this post with a series of graphs showing the evolution of body size among NFL players from 1950 to present. Most of the change has occurred since the '70s, and players on the whole tended to be similar in height and weight during the '50s and '60s, with slow decelerating growth at most. That's important for showing that this is not just some long-term trend that goes back to the very beginning of the sport, but one that began at the same time that competitiveness in general became glorified throughout society.

Some positions have gotten taller, but most show modest or no change in height. The real change has been in weight, particularly where the sumo wrestling takes place, among centers, tackles, guards, and linebackers. They appear to have gotten a full two standard deviations heavier, and even a good deal of the other positions have gotten about one standard deviation heavier. That is a huge change in less than two generations.

An offensive and defensive lineman who both weigh 200 lbs will have the same balance of forces as a pair who are both 300 lbs -- evenly matched. But somewhere along the way, some "greed is good" coach decided to put slightly heavier linemen up against the prevailing standard-weight linemen, giving his own a slight edge. Everyone else quickly caught on and imitated the strategy, the initial edge was eroded, and all were suddenly caught up in an escalating arms race toward 300-pound linemen.

Although the balance of forces is the same with a pair of 200-pound linemen and a pair of 300-pound linemen, the variance is not. Imagine a lighter pair engaged in an evenly matched tug-of-war, when someone cuts the rope and both fall backward. Now imagine two giants lumbering around when thrown off-balance. Or imagine the force of impact when the evenly matched lighter pair vs. heavier pair slam into each other.

A simple weight regulation to prevent a pointless arms race that endangers the players will not be enacted until the social mood changes away from sanctifying competition. For now, it would only enrage the braindead fans whose sole meaning in life is to squabble over whose squad of transplant gorillas-for-hire is more vicious.

Back when sports fans were not bloodsport junkies, and sports players were not outsider mercenary apes, this wasn't a problem. But now that competition for its own sake has reached sacrosanct status, who are we to get in the way of more and more ridiculous, bombastic bread-and-circus entertainment?

And it won't be the fans who have to pay the costs of the arms race -- they're not the ones taking all those hits from 300-pound hulks. Fortunately for the game, the players aren't exactly known for their future orientation, and don't mind fucking themselves up for big money today, at the cost of living as a cripple for the rest of their lives. The coaches only want to win, and the NFL only wants advertising dollars hence eyeballs and butts in seats. With no checks anywhere throughout the entire football ecosystem, the whole place is headed to hell in a handbasket.

Sadly, when a sector cannot police itself, and is becoming ever more bombastic and economically parasitic on ordinary folks, only people from outside the pro sports ecosystem can do anything about it. Not shut it down, but dial it back by regulations to where it was in the Midcentury.

That would have sounded like a pipe dream in the '90s, when the two sports-crazy generations -- Silents and Boomers -- were dominant. But sooner rather than later, they'll either be retired, senile, or dead, and the average member of Gen X and Millennials could give a shit about the barbaric religion of gladiator worship (UFC fandom still being a very niche identity). I can't think of too many causes that would so effortlessly unite X-ers and Millennials of all political persuasions. Beating the money-changers out of the athletic temple is one of them.

Strength of extended family ties by region

Rootedness in a place has both a vertical dimension stretching back through time, as well as a lateral one linking a person to others in that place during a given period of time. Let's start with the state of affairs today, and compare levels of rootedness across some of the major geographic-cultural regions of America.

The strongest form of investment in a place, as opposed to feeling like nothing is holding you back from picking up and heading off for greener pastures, is extended family ties. Community ties with non-kin are worth examining, too, but blood is thicker than water.

The General Social Survey asked a series of questions about how much you keep in contact with three groups of extended family -- cousins, uncles and aunts, and nieces and nephews. It didn't specify whether "contact" was in-person or mediated. It was asked in 2002, so most respondents probably assumed it meant in-person or talking over the phone.

I excluded respondents who said they had no living relatives of that type, and lumped the two affirmative responses together (some having a little more contact over the past four weeks, and some a little less), against those who said they had no contact. The ranking was so similar for each of the three family groups that I averaged them into a single index.

Non-whites have much more contact with their extended family than whites do, and regions vary a lot in how non-white they are. However, looking at whites only vs. everyone did not change the ranking in this case, so I left all races in.
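For readers who want to follow along with a GSS extract, here's a minimal sketch of how the averaged index could be built and the regions ranked, assuming a pandas DataFrame with the variables listed at the end of the post; the file name and response codings are assumptions to verify against the codebook.

```python
import pandas as pd

# Hypothetical sketch of the regional contact index described above; variable
# names follow the post, response codings are assumptions to check.
df = pd.read_csv("gss_extract.csv")

rates = {}
for var in ["cousins", "uncaunts", "niecenep"]:
    alive = df[df[var].isin([1, 2, 3])]      # drop "no living relatives" (assumed code 4)
    any_contact = alive[var].isin([1, 2])    # any contact in the past four weeks
    rates[var] = any_contact.groupby(alive["region"]).mean()

# Average the three contact rates into one index and rank the regions.
index = pd.concat(rates, axis=1).mean(axis=1)
print(index.sort_values(ascending=False).round(2))
```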

The chart below ranks the regions by how likely their residents are to have had any contact with their extended family during the past four weeks. The states which make up each region are listed below, in descending order by population size within each region, to give a feel for which states are more influential on the region's score. The GSS uses Census regions, and some of them have confusing names, which I've tried to re-name more helpfully (if changed, the original names are in parentheses below).


South Atlantic - Florida, Georgia, North Carolina, Virginia, Maryland, South Carolina, West Virginia, Delaware, District of Columbia

Southern Appalachia (E.S. Central) - Tennessee, Kentucky, Alabama, Mississippi

Eastern Midwest (E.N. Central) - Illinois, Ohio, Michigan, Indiana, Wisconsin

New England - Massachusetts, Connecticut, Maine, New Hampshire, Rhode Island, Vermont

Lower Mississippi (W.S. Central) - Texas, Louisiana, Oklahoma, Arkansas

Middle Atlantic - New York, Pennsylvania, New Jersey

Pacific - California, Washington, Oregon, Hawaii, Alaska

Mountain - Arizona, Colorado, Utah, Nevada, New Mexico, Idaho, Montana, Wyoming

Western Midwest (W.N. Central) - Missouri, Minnesota, Iowa, Kansas, Nebraska, South Dakota, North Dakota

About 20 percentage points between high and low -- we're not talking minor differences around the country.

The main divide is between the eastern vs. western half of the country. All these years later, the Mississippi River continues to be a major cultural barrier. As Americans have moved farther out west, they have left behind their extended family, only some of whom would have also been heading out west. That certainly makes it impossible to stay in face-to-face contact, but I'll bet that it cuts down on mediated contact as well -- out of sight, out of mind.

Among western regions, why do the furthest west have higher family contact? Because the coast was settled earlier. The Plains and Mountain states may have been reached first, but folks tended not to put down roots there for very long. It's the most desolate real estate in the country, so over its history there has been a lot more coming-and-going there than on the desirable West coast, where people come more than they go.

Also, by the 21st century, a good deal of those living in the Mountain states are first or second-generation transplants from all over, and refugees from the over-saturated West coast, who won't have family nearby. The Plains states show the opposite problem -- the locals abandoning ship.

After the east-west divide, there's also a secondary cline from more familial southerners to less familial northerners. I doubt it's due to more favorable weather conditions that allow folks to get out of their homes comfortably. The GSS is administered during the summer, when more or less all of the eastern US is a humid hellhole -- if anything, worse in the South.

My only hunch is historical ethnic conflict serving to strengthen clan ties. All else equal, larger groups defeat smaller ones, so ethnic conflict pressures folks into joining larger groups -- in the context of kin, extended vs. nuclear families. Blacks vs. whites in the Southeast compared to the Northeast, Mexicans and Indians vs. whites in Texas compared to Minnesota. Hence also why religious membership is more important in southern regions, it being the main way that we cement bonds with non-kin.

Since the major split is between earlier and later settled regions, we'll need to look into how rootedness has changed over time in these regions. Once started, do roots continue to grow, or are they uprooted every generation? Then the link between rootlessness and status-striving will become clearer.

GSS variables: uncaunts, cousins, niecenep, region

October 20, 2014

The geography of striver boomtowns, AKA future ghost towns

An NYT article reviews a report on how recent college grads are multiplying like cancer cells, I mean fueling the engine of growth in cities across America, particularly in the most central areas of the city. They are shying away from Establishment cities and gentrifying second and third-tier cities where the rents are cheaper -- until the word gets out and the next twenty waves of transplants bid up the rents to Establishment levels.

The article and report refer to 25-34 year-olds with a B.A. as talented, creative, etc., without any proof required other than the fact that their brains are young, that their credential being bought and paid for (rather than earned) allowed them to goof off for four years, and that they spent that time cultivating unique quirky tastes shared by 90% of their age-mates (craft breweries bla bla bla).

These cases illustrate some of the themes I've started to develop here about the geographic and generational differences in status-striving. The Gen X and Millennial subjects they're tracking are moving away from Establishment cities because the Silent and Boomer incumbents refuse to vacate the prime real estate and above-poverty-level jobs.

More broadly (see this post), materialist and career competition have become too saturated by Silents and Boomers, leaving X-ers and Millennials to pursue lifestyle competition instead. That will not only affect which cities they flock to, but the character of their lives once they arrive -- they will turn the place into one great big playground for lifestyle status contests, lots of drinking, and the occasional random drunken hook-up. Or, College: The Sequel (total run-time to be determined).

And they aren't picking just any old sub-Establishment cities but ones that are already growing at a fair clip, and are already and have historically been quite large in population. When they look for a city in the Northeast to take the place of New York or Boston, do they settle on Albany? No, it has to be the second-largest city in New York -- Buffalo. Feeling squeezed out of Chicago and Dallas in the Midwest and Philadelphia and DC in the east? Well, you could shoot for Wheeling, WV or Grand Rapids, MI -- but why not aim as high as you can (within your budget), and resurrect the Gilded Age empires of Pittsburgh and Cleveland?

Fortunately, nobody involved at the grassroots or the academic and journalistic levels has any knowledge of history, so what's to temper the enthusiasm for pushing Buffalo and Cleveland as up-and-coming boomtowns (at least among the Creative Class)? It's not as though they've already been through the over-hype and hollowing-out cycle before. But if you want more sustainable long-term growth, you'd have to settle for cities that are smaller, historically less important, and culturally less thrill-seeking. Off-the-radar cities.

On the whole, though, the striver boomtowns are not in the Rust Belt but in the Sun Belt, i.e. where mainstream America has already been heading for decades. There are now enough earlier transplants who can actually run a business and create jobs, that the Creative Class can play catch-up and jump on the Sun Belt bandwagon without starving and "living outside". Will the Sun Belt soon turn into the next Rust Belt? Impossible -- growth only increases, worst-case at a slowing rate, but there will never be a mass desertion of entire swaths of the country that had been over-hyped, over-built, and over-indulged.*

It comes as no surprise, then, that the cities with the greatest percentage growth in 25-34 college grads fail the test of egalitarianism outlined in this post -- not having a pro sports team. Only Austin passes (I didn't claim the test was perfect). That proves that they are not simply seeking refuge from the rising competitiveness and widening inequality that blight the Establishment cities. Otherwise they'd be heading to some place where it would never occur to the locals to allow their public coffers to be parasitized by a big-league team that could PUT THEM ON THE MAP.

Among the ranks of Millennial boomtowners, is there any awareness of how illusory all this rapid growth is? Let's ask one of them living in Denver:

“With lots of cultural things to do and getting away to the mountains, you can have the work-play balance more than any place I’ve ever lived,” said Colleen Douglass, 27, a video producer at Craftsy, a start-up with online classes for crafts. “There’s this really thriving start-up scene here, and the sense we can be in a place we love and work at a cool new company but not live in Silicon Valley.”

How can start-ups be thriving? You don't thrive until you're mature. All those dot-com start-ups sure seemed to be thriving in the late '90s -- what happened to them after that, I'll have to order a history book on inter-library loan, since I'm too retarded to remember.

Online classes for crafts, or where higher ed meets e-tailing. Two great bubbles that burst great together!

BTW, her LinkedIn profile shows that she went to the University of Dayton, which I don't recall being very close to Denver. All the talk about youngsters choosing the cities and bustling city cores over the dull suburbs papers over the reality that these kids aren't shopping around at the urban vs. suburban level, but at the entire metro area level -- which city will maximize my number of likes and followers? She didn't choose downtown Dayton over an attractive suburb of Dayton like Beavercreek -- she wanted to ditch dear, dirty Dayton altogether.

Group identities that are constructed consciously by first-generation adherents who merely affiliate with a city will be weak, shallow, and fleeting compared to those that are inherited unwillingly by multi-generational descendants who are rooted there. The western half of the country has long been more plagued by vice and social decay than the eastern half, and its history of rootlessness provides the central explanation.

Another long established fact about urban growth and migration is that cities do not grow except by wave after wave of even greater fools pouring into them. City folk are so caught up in their status contests (whether based on career or lifestyle), that they forgot to get married and have kids. By the time their career is established, or their reputation on Instagram suitably impressive, it's too late to start. Cities have been fertility sink-holes ever since they began thousands of years ago, and migration from the countryside was all that fed their growth.

What will happen to these 30 year-olds when the contests over who's sampled the most esoteric food truck fare begin to get old? They won't have any family or community life to fall back on; everything up till then has been based on lifestyle status competition. They will face the choice between staying stuck on the status treadmill forever, or dropping out in isolation, where they'll indulge their vices until their bodies and brains loosen into mush. Sadly, that will begin by the time they're 40, when the final relief of death is still very far away.

Gosh, you guys, what the heck. I hate to be such a downer, but all this mindless enthusiasm for urban cancer is not only getting tiresome, but by now disturbing.

* During a brief refractory period, the cheerleading reporter lets some sobering facts slip by:

Atlanta, one of the biggest net gainers of young graduates in the 1990s, has taken a sharp turn. Its young, educated population has increased just 2.8 percent since 2000, significantly less than its overall population. It is suffering the consequences of overenthusiasm for new houses and new jobs before the crash, economists say.

Good thing that Ben Bernanke ordered an anti-hype fence to be built around Atlanta, lest the overenthusiasm ruin other Sun Belt boomtowns.

October 19, 2014

Wide-open eyeglasses for a non-ironic look

Earlier I showed how different the impression is when someone wears glasses that are narrow as opposed to wide. Narrow eyes, whether now or during the cocooning Midcentury, make people look aloof and self-conscious. Wide eyes like you saw back in the '70s and '80s look inviting and other-directed.

Of course today's narrow glasses are more off-putting than the Midcentury originals because there's now a level of irony on top of it all. Get it -- retro Fifties, geek chic! Yep, we get it.

It makes you wonder whether people could ironically wear wide-eye glasses (other than sunglasses). I've been wearing a pair from several decades ago since the summer, and haven't gotten any winking approval looks like "Oh I see what you did there, Seventies glasses FTW!" They're pleasantly inconspicuous.

I was searching Google Images to try to identify the drinking glasses that my parents used to own around the time I was born, with only pictures to go on. "Vintage glasses orange brown" turned up this result:


It's meant to be part of an ironic "hot for teacher" costume for Halloween, but it doesn't succeed in the ironic department. Somehow, wearing wide-eye glasses makes someone look inviting, kind, and sincere, even when they're aiming for ironic.

Contrast the effect with those "sexy nerd" glasses with narrow eyes and thick rims, where the girl just looks sassy and self-absorbed. Wearing glasses like the ones above makes her look refreshingly tuned in to other people instead of herself.

October 18, 2014

Extended family structure as an influence on generational membership

Why is it that even among people born in the same year, some of them identify more strongly with an older cohort, some with their own cohort, and some with a younger cohort? If generational membership were only a matter of when you were born, and what the environment was like along each step of your development, we shouldn't see this kind of variation among folks who were born in the same year.

Going solely off of a hunch from personal experience, it could be due to differences in the generational make-up of a person's extended family.

I was born in 1980, part of a late '70s / early '80s cohort that either gets lumped in as the tail-end of Gen X or is given its own tiny designation, Gen Y, between X-ers and Millennials. I've always felt and acted closer to core X-ers than to core Millennials (who to me seem like they come from another planet), although a good fraction of people in my cohort would tilt more toward the Millennial side. We all recognize that we're neither core X-ers nor core Millennials, yet when pushed off of the fence-sitting position, some of us fall closer to an earlier generation and some to a later generation.

Since we spend quite a bit of time socializing with family members, though, perhaps we should look into that source of influence as well. If they're not related to you, much older and much younger people typically are blind to you, and reject hanging out with you if you try to make yourself seen. But blood is thicker than water, and those much older or younger kids will interact with you and pass along their ways in a family setting.

I've only rarely interacted with the extended family on my dad's side, so I'll stick to the maternal side. Although my mother was born in the mid-'50s, she is unusually young among her three siblings, who were born in the early, mid, and late '40s — more typical of the parents of core X-ers. My cousins through them are also all older than me: of those I met regularly growing up, one is a late '60s birth, two are early '70s births, and one is a mid-'70s birth. Our grandparents are also more typical of core X-ers, with one born in the mid-1910s and the other in the early '20s.

I would have to ask around, but I suspect the people in my cohort who tilt more toward the Millennial side of the fence have cousins who are more centered around their own age, aunts and uncles centered around their parents' age ('50s births), and grandparents who are Silents (late '20s / early '30s births). That extended family profile is closer to a Millennial's than an X-er's.

Those are the blood relationships, but when you count the affines (those who marry in), you get the same result as long as there aren't wild age differences in dating and marriage. Growing up, I only got to know the girlfriends (and eventual wives) of two of my cousins, but they were both core X-ers (late '60s or early '70s births). And the uncle-by-marriage that I knew well growing up was a Silent.

In short, if you look at my family tree and cover up my birth year, and my parents', it would look like a typical Gen X tree. The lateral influences from my cousins (and once they were old enough, their girlfriends and wives), as well as vertical influences from aunts and uncles and grandparents, are more typical of someone born in the early '70s than the early '80s.

Granted, the less time you spend with your extended family growing up, the weaker this effect will be. And people in my cohort had parents who were part of the Me Generation who didn't mind moving away from their siblings and parents, and who expected us to do so as well once we set off for college and whatever career we wanted to pursue. Status was to be more important than extended family cohesion.

But some of us didn't grow up so isolated from our extended families. My mother's sister and her husband lived only a few blocks away from us when I was in elementary school, and that was a second home for me and my brothers. By that time, her children had moved out, but still visited frequently, and brought their girlfriends, so we weren't so distant from them either. And I spent long portions of off-time at my grandparents' home, during the summer and winter.

Nowadays, with extended family ties being moderately strong at best, generational membership is going to be primarily shaped by your own birth year, period. That determines who your peers will be in school, and there's your generation. But that still leaves secondary influence for the generational make-up of your extended family, and in cases where you belong to a cohort that is neither here nor there, this secondary influence could push you into one or the other clearly defined generation on either side of your own.

October 16, 2014

The generational divide among grunge musicians

Grunge music was a flash-in-the-pan phenomenon of the early 1990s, serving as a bridge between the longer and more stable periods of college rock throughout the '80s and alternative rock throughout the '90s. In fact, there was a generational bridging underneath the stylistic bridging.

I finally came upon a copy of the Temple of the Dog album with "Hunger Strike" on it. (Posted from my thrift store cardigan.) That song has always struck me as aging better, and agreeing with me better, since it first became a hit over 20 years ago. Not one of those perfect pop songs, but one worth buying the album for.

Like many other late Gen X adolescents, I was into grunge when it was the next big thing, but quickly moved on — or backward — to punk, ska, and college rock from the late '70s and '80s. (My friends and I hated the lame alternative, post-grunge, or whatever it's called music that defined the mid-'90s through the early 2000s, even when it was novel.) A good deal of what I used to like, I began not-liking, but there are some songs like "Hunger Strike" that still sound cool and uplifting.

As it turns out, the grunge groups that I find more agreeable were made up mostly or entirely of late Boomers, born in the first half of the '60s, while those I don't relate to as much anymore were made up mostly or entirely by early X-ers, born in the second half of the '60s. The late Boomers are the ones shown in Fast Times at Ridgemont High — abandoning themselves to whatever feels good — while the early X-ers are shown a little later in the John Hughes movies — consciously torn between wanting to be impulsive while seeking the comfort of stability.

The abandon of the late Boomers gives them a clear advantage when it comes to jamming within a group, improvising, and going wherever the moment is taking you without questioning it. This was most clearly on display when glam metal bands went mainstream in the '80s, ushering in the golden age of the virtuoso guitar solo and near-operatic vocal delivery. But it showed up also in the era's cornucopia of cheerful synth-pop riffs, as well as jangly, joyful college rock.

When the early X-ers took up songwriting, rock's frontmen were suddenly from a more self-conscious and ironic generation. Stylistically, it meant that the shaman-like performance of the spellbinding guitar solo was over, and that vocal delivery would be more aware of its own emotional state, or more affected — twee rather than carefree on the upbeat side, angsty rather than tortured on the downer side.

During this transition, along came grunge. Temple of the Dog was made up of members from Soundgarden and Pearl Jam, before either group exploded in popularity. Pursuing a hunch, I found out that singers Chris Cornell and Eddie Vedder are both late Boomers. Pearl Jam was roughly half Boomers and half X-ers, while Soundgarden was all Boomers aside from the bassist.

And sure enough, Soundgarden always felt like the evolution of '80s metal, which was created by their generation-mates, albeit at an earlier stage of their lives. Pearl Jam sounded more of-the-Nineties (more self-aware, less abandoned), though more rooted in the sincerity of college rock bands from the '80s than sharing the irony of '90s alternative rock.

Which groups had a solid Gen X basis? Nirvana had no Boomers — no surprise there. Neither did Alice in Chains. Stone Temple Pilots were all X-ers aside from their guitarist. This was the angsty side of grunge (self-consciously angry), with the funky riff of "Man in the Box" pointing the way toward the aggro, rap-influenced metal of the late '90s (Korn, Limp Bizkit, etc.).

Screaming Trees were equally Boomer and X-er, and "Nearly Lost You" sounds pretty easygoing by alternative standards.

And other Boomer-heavy groups? The girl groups, as it turns out. Only the bassists in L7 and Babes in Toyland were X-ers; the rest were Boomers. On their first, grungier album, Hole consisted of Boomers (I couldn't find the birth year for the drummer, though). Recall an earlier post which showed all-female bands peaking in popularity during the '80s — the girl grunge bands were a fading generational echo.

The more self-conscious mindset of women in Gen X made it difficult or impossible to get into a state of abandon needed for grunge music, which was only partly introspective — and partly keeping the free-wheeling spirit of the '80s alive. When I think of the prototypical wild child, she's a late Boomer like the girls in Fast Times, the women of carefree '80s porn, and the real-life basis for the protagonist of Story of My Life by Jay McInerney.

Generations keep their ways well beyond their formative years, almost like a language that they were surrounded by and continue to speak, regardless of what new languages may have shown up in the meantime. If cultural change were only a matter of a changing zeitgeist, then Pearl Jam and Nirvana should have sounded much more similar than they did. And if those differences were a matter of being at different life stages at the time, why were the older guys more free-wheeling and the younger guys more reserved? It came down to a changing of the generational guard.

Today's immigrants are revolutionizing the way we experience disease — again

With ebola in the news, it's worth placing it in the broader context of exotic epidemic diseases cursing status-striving societies, where laissez-faire norms open the floodgates to all manner of vile pollution from outside.

The last peak of status-striving, inequality, and immigration was circa 1920, right when the Spanish flu pandemic struck. During the Great Compression, with immigration nearly ground to a halt, epidemic diseases looked to become a thing of the past. That sunny man-on-the-moon optimism of the 1960s would be undone by the Me Generation of the '70s. Not coincidentally, old diseases began rearing their ugly heads once more.

See this earlier post that examined the rise and fall and rise of epidemic diseases and pests, in tandem with the trends of inequality and immigration. Vaccines, hygiene, public health initiatives, etc., seem to have little or nothing to do with the fall, which in several cases was well underway before the vaccine was even discovered, let alone administered across the population.

It could have boiled down to something simple like the body not being burdened by as much stress as in hyper-competitive times, and keeping the ecosystem of diseases manageable by not allowing in boatload after boatload of new strains. Since ancient times, global migration has spread contagious diseases, but the world became less intertwined during the Great Compression.

Ebola will not become the pandemic of our neo-Gilded Age because it doesn't spread so easily. But ebola is just the tip of the iceberg of what is pouring into our country, and Western countries generally. It is not an isolated curiosity, but part of a larger population of pathogens being trucked and flown into this country every day.

One of those bugs, as of now an unseen up-and-comer, will soon enjoy its glory days just as the Spanish flu did 100 years ago. You can't say that America doesn't encourage the underdogs to give it their all.

October 2, 2014

When a girl gets taken advantage of, liberals cry rape, pseudo-cons shrug shoulders

There's really nobody to cheer for in the ongoing battle about "rape culture." The hysterical liberals / feminists are cheapening the charge of rape when they apply it to situations where a girl, usually intoxicated, gets taken advantage of. That involves a betrayal of trust, not the threat or use of violence.

Liberals used to concern themselves with matters of unfair treatment, injustice, one party taking advantage of another, and so on. You'd think the common-enough case of a drunk girl getting taken advantage of would be right up their alley. Hard to think of a better textbook example of someone using a higher bargaining position (clear-minded vs. intoxicated) to take advantage of another person.

Yet liberals these days don't talk much about the little person being taken advantage of (umm, vote for the Tea Party much? Cuh-reeeepy...). Most of them became numb to populism a couple decades ago. Hence every objection must be about preventing harm and providing care, no matter how unfitting this approach may be in any given case.

On the other hand, much of the so-called conservative reaction is to say, "Meh, what was she expecting? Anyway, no crime, no punishment. Next topic." While at least seeing past the blubbering about violence and harm, this response still shows a callousness toward a growing problem of young women getting taken advantage of while intoxicated, and while away from anyone who would look out for their interests, i.e. their families.

Sure doesn't sound like a world any of us want to live in, but pseudo-cons are so concerned with kneejerk reactions against the side of political correctness that they won't admit how unwholesome the situation is.

More and more, in fact, the pseudo-con position on any aspect of our fucked up world can be simplified as: "Unwholesomeness -- if you can't get used to it, you're a pussy." Real I-like-Ike kind of values.

This is the gist of some comments, copypasted below, that I left at this post at Uncouth Reflections about how hysterical accusations of rape are getting, now claiming ever less harm-based forms of not-so-consensual sex as rape.

That post was in turn motivated by this item at Time about Lena Dunham's new book, in which she relates a story about being "raped" in college. She was drunk and/or on Xanax, was insistent about going back with the guy who took advantage of her -- over an explicit attempt by a hapless white knight to steer her away from it -- and talked dirty to the guy during the act.

It never occurs to her that she was raped (she's thinking of actual rape, including force / violence). Her friend is the one who tells her she was raped, and convinces her. They are now thinking of metaphorical rape, and literal getting-taken-advantage-of. In the comments below I speculate about how our society has gotten here, making use of Haidt's framework of the moral foundations that liberals vs. conservatives draw on in their moral intuitions.

* * *

Incidents like Dunham’s are obviously not rape, they are getting taken advantage of. Why can’t today’s leftoids speak out against vulnerable people getting taken advantage of? Partly because the women make themselves vulnerable in the first place by blasting their brain with so many hard substances in rapid succession, in a public place where they know strangers only have one goal on their mind.

But there must be something more to it. We would object to a sleazy pawn shop owner offering a couple bucks for a pristine 1970s receiver if it came in from a clearly stoned-out customer who says he just wants some cash to feed the munchies, man, you gimme anything for this thing?

These cases involve not harm or force but violation of trust. She trusted the guys at the party not to be that type of sleazeball; the stoner trusted the shop owner to give him honest treatment. When they wake up the next morning worrying, “Oh God, what did I do?” they feel betrayed or lied to.

Why insist on framing it as harm when it is not? Liberals are becoming so infantilized that it’s the only moral foundation they can appeal to anymore. “Mommy, that person hurt me!” No, they took advantage of you — it’s still wrong, but different from harming you. Kids don’t get it because they’re naive and don’t know about the possibility of having their trust betrayed.

Grown-ups should, though, and the fact that liberals cannot even appeal to their second-favorite moral foundation — fair treatment — shows how stunted this society has become.

Betrayal also taps into the moral foundation of community or in-group cohesion, but liberals are numb to that. If you’re both members of the same community (and typically they are even closer, being at least acquaintances), how could you think of taking advantage of her? Shame on you. Get out, and don’t come back until you’ve made up for it.

But liberals share the hedonism, laissez-faire, individual advancement, and other quasi-libertarian leanings that most Republicans have. Hence they cannot object on the basis of wanting to prevent people from taking advantage of others. Under hedonism, that is guaranteed. And if laissez-faire and non-judgementalism are sacrosanct, who’s really to say that we’re in a place to judge a mere sleazeball who takes advantage of others? I mean, it’s not like he’s using force or violence.

Therefore, yes, he must have used force or threatened violence — that’s the farthest that the liberal is willing to draw the boundary. If they have an instinctive revulsion, it must be framed as some kind of harm, because anything less severe than that but still repugnant (like taking advantage of a drunk chick at a party) lies on the “fair game” side of the moral boundary line. Cognitive dissonance kicks in, so they rationalize what happened as harm / rape.

I alluded to it by calling Republicans quasi-libertarians, but let me say that too many conservatives don’t know how to react to these scenarios either. They know that the liberals are exaggerating hysterically, and like to invent new classes of victimhood, so their instinct is to dismiss these cases altogether.

But a drunk girl getting taken advantage of at a party isn’t a brand-new victim class. I’m sure that was a concern in Biblical times when the wine began flowing. It’s not as though she were a black who was denied admission to law school due to low LSAT scores, or a mentally ill tranny who feels robbed because ObamaCare won’t fund his castration / mangina surgery.

Brushing the Dunham type cases aside, like “Bitch deserved what she got for getting drunk,” or “Bitches need to learn to fend for themselves when drunk at college parties,” is too far in the anti-PC direction, however understandable the revulsion toward PC is.

I don’t sense any of that here — that’s more of a shrill Men’s Rights thing — but it’s worth emphasizing that it isn’t only liberals who are dumbfounded when they try to articulate their reaction to “drunk chick gets taken advantage of.” They both rely too heavily on laissez-faire and hedonistic norms to be able to say there’s something wrong with someone betraying another’s trust.

September 30, 2014

The crappy digital look: Demystifying the lie about how sensors vs. film handle light sensitivity

In this installment of an ongoing series (search "digital film" for earlier entries), we'll explore another case of digital photography offering supposedly greater convenience at the cost of compromised image quality. The end result is pictures that are too harshly contrasting, more pixelated, and color-distorted where they should be white, black, or shades of gray.

This time we'll look into the properties of the light-sensitive medium that records the visual information being gathered into the camera by the lens. I have read only one off-hand comment about the true nature of the differences between digital and film at this stage of capture, amid mountains of misinformation. The only good article I've read is this one from Digital Photo Pro, "The Truth About Digital ISO," although it is aimed at readers who are already fairly familiar with photographic technology.

Given how inherent the difference is between the two media, and how much it influences the final look, this topic is in sore need of demystifying. So hold on, this post will go into great detail, although all of it is easy to understand. By the end we will see that, contrary to the claims about digital's versatility in setting the light-sensitivity parameter, it can do no such thing, and that its attempts to mimic this simple film process amount to what used to be last-resort surgery at the post-processing stage. One of the sweetest and most alluring selling points of digital photography turns out to be a lie that has corrupted our visual culture, both high and low.

To capture an image that is neither too dark nor too bright, three inter-related elements play a role in both film and digital photography.

The aperture of the lens determines how much light is gathered in the first place: when it is wide open, more light passes through; when it is closed down like a squint, less light passes through. But light is not continually being allowed in.

The shutter speed regulates how long the light strikes the light-sensitive medium during capture: a faster shutter speed closes faster after opening, letting in less light than a shutter speed that is slower to close, which lets in more light.

Last but not least, the light-sensitive medium may vary in sensitivity: higher sensitivity reacts faster and makes brighter images, lower sensitivity reacts slower and makes dimmer images, other things being equal. This variable is sometimes labeled "ISO," referring to the name of a set of standards governing its measurement. But think of it as the sensitivity of the light-reactive material that captures an image. This scale increases multiplicatively, so that going from 100 to 200 to 400 to 800 is 3 steps up from the original. Confusingly, the jargon for "steps" is "stops."

A proper exposure requires all three of these to be in balance — not letting in too much or too little light through the lens aperture, not keeping the shutter open too long or too briefly, and not using a medium that is over-sensitive or under-sensitive. If you want to change one setting, you must change one or both of the other settings to keep it all in balance. For example, opening up the lens aperture lets in more light, and must be compensated for by a change that limits exposure — a faster closing of the shutter, and/or using a medium that is less light-sensitive.
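
To make that balancing act concrete, here is a small illustrative Python sketch (mine, not anything a camera actually computes) that counts exposure in "stops": doubling the shutter time, doubling the sensitivity number, or opening the lens by one full f-stop each adds one stop of light. It's the same arithmetic behind the 100-200-400-800 film-speed scale, where each doubling is one stop.

    import math

    def exposure_stops(f_number, shutter_seconds, iso):
        # Relative exposure in stops (log base 2): doubling the shutter time,
        # doubling the ISO, or opening the aperture by one full f-stop each
        # adds one stop of light.
        return math.log2(shutter_seconds) + math.log2(iso) - 2 * math.log2(f_number)

    # A sunny-afternoon baseline: f/8, 1/250 s, ISO 100.
    base = exposure_stops(8, 1/250, 100)

    # Open the lens one stop (f/8 -> f/5.6) and compensate with a
    # shutter that closes twice as fast (1/250 s -> 1/500 s).
    compensated = exposure_stops(5.6, 1/500, 100)

    print(round(compensated - base, 1))  # ~0.0 -- still in balance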

Between digital and film, there are no major differences in two of those factors. The lenses can be opened up and closed down to the same degree, whether they are attached to camera bodies meant for one format or the other. And the shutter technology follows the same principles, whether it is opening up in front of a digital or film recording medium. (Digital cameras may offer slightly faster maximum shutter speeds because they are more recent and incorporate improvements in shutter technology, not because of digital properties per se.)

However, the two formats could not be more different regarding light-sensitivity of the recording medium.

Film cameras use rolls of film, which are loaded into and out of the camera on a regular basis. Load a roll, take however-many pictures, then unload it and send it off to the lab for development. The next set of pictures will require a new roll to be loaded. Digital cameras have a light-sensitive digital sensor which sends its readings to a memory card for later development and archiving. The sensor is hardwired into the camera body, while the memory card is removable.

Thus, no matter how many pictures you take with a digital camera, it is always the exact same light-sensitive piece of material that captures the visual information. With a film camera, every image is made on a new frame of film.

A digital sensor is like an Etch-a-Sketch that is wiped clean after each image is made, and used over and over again, while frames and rolls of film are like sheets of sketching paper that are never erased to be re-used for future drawings. The digital Etch-a-Sketch is just hooked up to a separate medium for storing its images, i.e. memory cards. Frames of film are both an image-capturing and an image-storage medium wrapped up into one.

Whether the light-sensitive material is always fresh or fixed once and for all has dramatic consequences for how it can be made more or less reactive to light — the third crucial element of proper exposure.

Film manufacturers can make a roll of film more reactive to light by making the light-sensitive silver halide crystals larger, and less reactive by making the crystals smaller. Hence slow films produce fine grain, and fast films large grain. What's so great is that you can choose which variety of film you want to use for any given occasion. If you're worried about too much light (outdoors on a sunny summer afternoon), you can load a slowly reacting film. If you're worried about not getting enough light (indoors in the evening), you can load a fast reacting film.

It's like buying different types of sketching paper depending on how much response you want there to be to the pencil lead — smooth and frictionless or bumpy and movement-dampening. Depending on the purpose, you're able to buy sketchpads of either type.

What was so bad about the good old way? The complaints boil down to:

"Ugh, sooo inconvenient to be STUCK WITH a given light sensitivity for the ENTIRE ROLL of film, unable to change the sensitivity frame-by-frame. What if I want to shoot half a roll indoors, and the other half outdoors?"

Well, you can just buy and carry two rolls of film instead of one — not much more expensive, and not much more to be lugging around. And that's only if you couldn't compensate for changes in location through the other two variables of aperture size and shutter speed. For the most part, these were not big problems in the film days, and served only as spastic rationalizations for why we absolutely need to shift to a medium that can alter the light-sensitivity variable on a frame-by-frame basis, just as aperture size and shutter speed can be.

That was the promise of digital sensors, which turns out to be a fraud that the overly eager majority have swallowed whole, while enriching the fraudsters handsomely.

Digital cameras do offer a means for making the image look as though it had been captured by a material that was more sensitive or less sensitive to light, and this variable can be changed on a frame-by-frame basis. But unlike film rolls that may have larger or smaller light-sensitive crystals, the photodiodes on the digital sensor have only one level of sensitivity, inherent to the material they are made from.

Because this sensitivity is baked into the materials, it certainly cannot be altered by the user, let alone on a frame-by-frame basis. And because the sensor is not removable, the user also has no recourse to swap it out for another with a different level of sensitivity.

How then do digital cameras attempt to re-create the many degrees of sensitivity that film offers? They choose a "native" sensitivity level for the photodiodes, which can never be changed, but whose electronic output signal can be amplified or dampened to mimic being more or less sensitive in the first place. In practice, they set the native (i.e. sole) sensitivity to be low, and amplify the signal to reach higher degrees, because dampening a highly sensitive "native" level leads to even lower quality.

Most digital cameras have a native (sole) sensitivity of ISO 100 or 160, meant to evoke the slowly reacting less sensitive kinds of film, and allow you to amplify that signal frame-by-frame, say to ISO 800, 3200, and beyond. But remember: it is never changing the "ISO" or sensitivity of the light-reactive material in the sensor, only amplifying its output signal to the memory card.

It is like always recording sound at a low volume, and then using a dial on an amplifier to make it louder for the final listening, rather than record at different volume levels in the initial stage. And we all know how high-quality our music sounds when it's cranked up to 11. It does not sound "the same only louder" — it is now corrupted by distortions.

We should expect nothing less from digital images whose "ISO" was dialed up far beyond the native (sole) sensitivity of 100 or 160.
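
To see why the amplifier analogy holds, here is a toy numpy simulation (purely illustrative, not a model of any particular sensor): the photosites collect a dim signal at their one fixed sensitivity, the readout electronics add some noise on top, and "raising the ISO" three stops just multiplies the whole readout by 2^3.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy model: a dim scene delivers few photons per photosite, and the
    # readout electronics add a fixed amount of noise regardless of signal.
    mean_photons = 20.0
    signal = rng.poisson(mean_photons, 100_000).astype(float)
    read_noise = rng.normal(0.0, 5.0, 100_000)

    raw = signal + read_noise

    # "Setting ISO 800" on a native-ISO-100 sensor is three stops of gain,
    # i.e. multiplying the readout by 2**3 after the light has been captured.
    pushed = raw * 2**3

    for label, frame in [("native", raw), ("+3 stops of gain", pushed)]:
        print(label, "brightness:", round(frame.mean(), 1),
              "noise (std):", round(frame.std(), 1))

    # The gained-up frame is brighter, but its noise grew by the same factor;
    # nothing about the light-sensitive material became more sensitive.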

Below are some online digital test shots taken with the lens cap fully in place, blocking out most light, with higher and higher settings for the faux-sensitivity ISO setting. Now, these images should have remained black or gray the whole way through. The only change that would have occurred if they were shot on more and more highly sensitive film material is a grainier texture, owing to the larger film crystals that make film more sensitive, and an increase in brightness, since what little light was sneaking in past the lens cap would have produced a stronger reaction.

And yet look at the outcome of a digital sensor trying to see in darkness:


Not only does the texture get grainier and the light level brighter when the native (sole) sensitivity is amplified, but there are now obvious color distortions, with a harsh blue cast emerging at higher levels of sensor amplification.

What's worse is that different cameras may produce different kinds of color distortions, requiring photographers to run "noise tests" on each camera they use, rather than know beforehand what effects will be produced by changing some variable, independent of what particular camera they're using.

The test shots above were from a Canon camera. Here's another set from a Pentax, showing a different pattern of color distortions.


Now it's red instead of blue that emerges at higher levels of amplification. Red and blue are at opposite ends of the color spectrum, so that shooting a digital camera without test shots is like ordering a pizza, and maybe it'll show up vegetarian and maybe it'll show up meat lover's. Unpredictable obstacles — just what a craft needs more of.

These distortions can be manipulated in Photoshop back toward normal-ish, but now you've added an obligatory extra layer of corrections in "post" just because you want to be able to fiddle with light-sensitivity frame-by-frame, which you're not really doing anyways. Convenience proves elusive yet again.

So, if amplification of the native (sole) light sensitivity is not like using film rolls of different sensitivities, what is it like? As it turns out, it is almost exactly like a treatment from the film era called push-processing, which was a last-ditch rescue effort in the developing stage after shooting within severe limitations in the capturing stage.

Suppose you were shooting on film, and your only available rolls were of sensitivity ISO 100, which is a slowly reacting film best suited for outdoors in sunlight. Suppose you wanted to shoot an indoor or night-time scene, which might call for faster reacting film, say ISO 400. Could it still be done with such low-sensitivity film? You decide to shoot in the evening with a slow film, effectively under-exposing your film by 2 stops, worried the whole time that the images are going to come back way too dark.

Lab technicians to the rescue! ... kind of. If you let them know you under-exposed your whole roll of film by 2 stops, they can compensate for that by allowing your film to soak in the chemical developing bath for a longer time than normal, allowing more of those darkened details to turn brighter. (The film starts rather dark and the developing bath reveals areas of brightness over time.) Taking 100 film and trying to make it look as sensitive as 400 film is "pushing" its development by 2 stops.
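
The "pushing" arithmetic is the same stop scale as before. A quick illustrative snippet (my own, just to show the bookkeeping):

    import math

    def push_stops(film_iso, exposed_as_iso):
        # How many stops of over-development the lab must apply.
        return math.log2(exposed_as_iso / film_iso)

    print(push_stops(100, 400))   # 2.0 -- the scenario above
    print(push_stops(400, 3200))  # 3.0 -- the example shown later in this post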

But if that were all there were to it, nobody would've bothered using films of different sensitivities in the capturing stage — they would've let the lab techs worry about that in the developing stage. The costs of push-processing are various reductions in image quality, which Kodak's webpage on the topic summarizes in this way (click the link for fuller detail):

Push processing is not recommended as a means to increase photographic speed. Push processing produces contrast mismatches notably in the red and green sensitive layers (red most) compared to the blue. This produces reddish-yellow highlights, and cyan-blue shadows. Push processing also produces significant increases in film granularity. Push processing combined with under exposure produces a net loss in photographic speed, higher contrast, smoky shadows, yellow highlights and grainy images, with possible slight losses in sharpness.

Not a bad description of the signature elements of the digital look, is it? Blue shadows are exactly what the Canon test shots showed earlier.

Interestingly, they note that although push-processing produces less sharp images, they may subjectively appear to be normally sharp, given the increase in contrast. Sure, if a subject is wearing a normal red shirt and normal blue jeans, and you crank up the contrast parameter, the picture looks more defined — ultra-red juxtaposed against ultra-blue. But we're only fooling ourselves. Sharpness means how clear and crisp the details are, and push-processing and its obligatory counterpart in the digital world are actually losing details, while distracting us with more strongly contrasting colors.
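
A toy example of that trade-off, in Python (illustrative only): boosting contrast on 8-bit pixel values spreads the mid-tones apart, but the extremes clip to pure black and pure white, so distinct tonal steps — i.e. detail — are lost even though the result looks punchier.

    import numpy as np

    # A toy 8-bit tonal ramp: 17 distinct steps from shadow to highlight.
    tones = np.linspace(40, 215, 17)

    def boost_contrast(pixels, factor):
        # Stretch values away from middle gray (128) and clip to the 0-255 range.
        return np.clip((pixels - 128) * factor + 128, 0, 255).round()

    punchy = boost_contrast(tones, 2.5)

    print("distinct tones before:", len(np.unique(tones)))   # 17
    print("distinct tones after: ", len(np.unique(punchy)))  # 11 -- the ends clipped away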

Remember, this is what a digital camera is doing each time it takes a picture outside of its native (sole) sensitivity level of 100 or 160, i.e. when you shoot indoors, at night, or on cloudy days. In the digital world, every image is immediately rushed into emergency surgery.

Is there a way to compare side-by-side a film image that was processed both normally and with push-processing? Unfortunately, no, since developing the negative image from the latent image on the film cannot be undone, and then done a different way. I suppose you could take a shot of the same scene, with two identical cameras and two identical rolls of film, but with one camera set to the true sensitivity and the other set inaccurately, then develop the normal one normally and the under-exposed one with push-processing. That sounds like a bit too much just to make a perfect textbook comparison of normal vs. push-processed images, and I couldn't find any examples online.

But there are examples of film that has been push-processed. Although we can't compare them side-by-side with normally developed versions of the same film frame, at least we can pick up on some of the typical traits that push-processing introduces. Below is an example from this series at a photographer's website. The film is ISO 400, but was push-processed to look like ISO 3200. That is 3 stops of pushing, whereas Kodak and other photography guidebooks advise never pushing past 2 stops of over-development.


It's disturbing how digital this film photograph looks. It looks like someone opened a digital image in Photoshop and cranked up the contrast and saturation settings. Look for details on the man's shirt and pants, like folds and creases. They're hard to make out because push-processing renders the image less sharp. But we're distracted by how striking the contrast is between these overly rich yellows and reds and the cooler blues. It looks more defined, but is poorer in detail.

It's almost like a child drew an outline of pants and hit "fill" with yellow on MS Paint. Very little detail. The yellow pole also looks like a crude "fill" job. Even worse, these pictures were shot on medium-format film, which has a far higher resolution than the 35mm film we're all used to. It ought to have detail so fine that you could blow it up into a poster or banner without blurring of the details.

We also see the familiar blown-out sky from digital Earth, rather than the blue one we know and love. Other white areas look like intense spotlights, too. I can't tell if they have the red-yellow tint to them as Kodak warned, although they do look kind of bright pale yellow. There aren't many dark shadows to tell if they have the bluish tint warned about, although the asphalt on the road looks blue-gray. The color distortions might be more obvious if we had the same scene captured and developed normally, for comparison.

The ultra-contrasty, overly saturated, harshly blown-out bright areas are hard to miss, though. And they look like something straight from a digital camera plus Photoshop settings dialed up to 11.

You might object that, hey, this guy knows what he's doing, and he's using push-processing to give the pictures a flamingly dramatic style (he's gay). That misses the point: these kinds of distortions and reductions in image quality are built in with digital photography's light-sensitivity technology. They aren't going to be chosen purposefully for some intended artistic effect. They're just going to make ordinary people's pictures look cartoony and crappy because they don't know about them before buying a digital camera, and won't mind anyway because digital is all about convenience over quality.

Even Hollywood movies shot by pros will be subject to these digital distortions, although they'll have much better help cleaning them up in post — for a price. Good luck scrubbing your digital images that clean on your own with Photoshop.

In the end, is digital really more convenient, all things considered? All of these distortions require laborious and expensive corrections, which may well off-set the efficiency gains that were hoped for at the beginning. Or those corrections simply won't be done, and greater convenience will have been traded off against poorer quality. Either way, one of the fundamental promises of digital photography turns out to be a big fat lie.

September 28, 2014

Status contests and the shift from involuntary to voluntary identities

Transplants claiming to be New Yorkers. Whites trying to pass themselves off as blacks. Men who insist that they're really women. And denizens of the 21st century who dress up as though they belonged to the fedora-sporting Forties.

These and many other related phenomena have been noticed and detailed on their own, but as far as I'm aware, there has been no unified treatment of them, for either description or explanation.

What the phenomena have in common is a shift toward all forms of group membership being determined by deliberate choices to "identify" or affiliate with the group, rather than having belonged to that group for reasons beyond your control, say by being born into it.

Sociologists refer to "ascribed" status, which you are born into, raised in, or otherwise given involuntarily, vs. "achieved" status that you gain through your own doing. Membership in a race is ascribed, while membership in a fraternity is achieved. Being a child of divorce is ascribed, being a divorced adult is achieved.

Some forms of status could hypothetically go either way. Does membership in a regional culture stem from your birth, upbringing, and extended family roots? Or can you choose to identify with a region that you did not spend your formative years in, but have moved into as an adult?

When regional membership is ascribed, all that matters is birth, upbringing, and family roots — even if you have spent most of your adult life in a region that you were not raised in, you are still a guest within a host or adoptive culture. When membership is achieved, you're perfectly allowed to claim the regional identity of your adoptive place, after a suitable series of rites of passage, which may be tacit or explicit.

For example, when you first move to New York City, how long of a residence does it take until you're "really" a New Yorker? How numbed to the odor of piss does your nose have to become (in the old days), or how long do you have to use a monthly subway card rather than touristy tokens (in the new days), before you have gone through the trials and rituals that earn you admission into the club of "real" New Yorkers?

Notice that when status is achieved, the aspiring joiners will appeal to as many criteria as they can rationalize in their favor. Ascribed status constrains the debate. Sure, folks may still bicker about how many generations back the person's roots need to go, or how many kin they must have who are also New Yorkers, but that is still limited to just two criteria.

Thus, ascribed status largely speaks for itself, while achieved status encourages rattling off one qualification after another from the self-promoter's endless list. Status contests are limited in scope when status is ascribed — were you born here or not? — but turn into ever-escalating games of one-upmanship when it is achieved.

This suggests that in status-striving times, group membership will shift toward being more and more achieved, while in accommodating and egalitarian times it will shift toward being ascribed.

The prevailing norms in status-striving times are me-first and laissez-faire — who's to stop me from claiming a New Yorker identity if I work hard enough at it? If you work hard enough for it, you've earned it. Rags-to-riches and rugged individualism are other staples of the zeitgeist in status-striving times.

In accommodating times, the norms favor regulating interactions so that conflict is minimized. If we let one guy pursue New Yorker status as though it could be an accomplishment, then we open the floodgates to thousands of other combatants in a spiraling status war. Instead, individuals will attribute their various group memberships to the circumstances of their birth and upbringing — beyond their own control, and therefore pointless to change, and change, and change, according to whatever fashion battle they're engaged in at the moment.

In fact, you might as well make do with those circumstances and take a little pride in them. Upstate New York, the Ohio River Valley, Michigan — all these places used to carry a certain level of regional pride, no matter whether the person stayed or moved somewhere else. Now people from those places are more likely to identify with the metro area that they have chosen to move into, probably embarrassed about where they came from.

Returning to the examples at the beginning of this post, let's spell out just how extreme our status contests have become. They have moved far beyond groups whose membership could be either ascribed or achieved, to the point where ascribed status should be indisputable, but where strivers are waging wars to make it achieved. They do not have to make up a majority of the status contests of our age — the fact that they are even happening at all proves how psychotic the climate has gotten.

Sex is entirely ascribed, yet the tranny movement asserts that men can identify as women or vice versa, and that the rest of society ought to assign them the sex status that the trannies insist on, rather than it being ascribed at birth. Tranny psychos are so status-striving that they whore for attention more than the others in the feminist and women's groups, and are always ready to start rattling off the top 100 reasons why I'm just as much of a woman as you (or more). They also viciously compete against each other to see who's unlocked the most achievements in the sim game of pretending to be a woman.

Generational membership is also determined by birth, yet we see more and more people cosplaying and LARP-ing as though they belonged to another generation. And not one that's just on the other side of their own, where honest disagreements might be had, but a generation whose formative years unfolded long before the person was even born.

Gen X-ers pretending to hail from the Midcentury, Millennials pretending to belong to the Boho vintage-y Seventies, not to mention legions of geeks placing themselves in the old timey Victorian era — steampunk conventions, going to night clubs wearing black corsets or black tailcoats, and so on. These are not occasional costumes worn as a fun break from routine, but part of their ongoing identity which they take (and craft) very seriously.

Similar widespread movements involve members of one race pretending to belong to another. OK, so they don't actually have the DNA test to back it up — but are we seriously going to rely only on bloodlines? The wigger is not an "honorary black," but someone who acts as though they were black, merely by aping real blacks. In the '90s, this term was a portmanteau of "white nigger," alluding to the lily-white suburban area that this dork actually came from. Now that races other than whites pretend to be black, it means "wannabe nigger," including East Asians and Indians who act that way.

Blacks have tried to push back against this attempt to make membership in the black race (or ethnic group) achieved rather than ascribed, but that hasn't stopped the wigger phenomenon from growing. It's just like women feminists trying to push back against mentally ill trannies trying to make membership in the female sex achieved rather than ascribed. Such efforts are ultimately doomed in a laissez-faire climate because they are seen as pleas for special or unfair treatment — to carve out race, or sex, as a domain where status is ascribed. But if status is to be achieved in so many other areas, it will play out that way for race and sex too, no matter how ridiculous it feels to normal people.

What were the counterparts of these extreme forms during the previous period of rising competitiveness and inequality, the Victorian era and the turn of the 20th century?

Fin-de-siecle England was not only plagued by out-of-the-closet faggots (search Google Images for "gay Victorian photographs" — safe for work, they just show couples sitting together embracing). Trannies also had their own subculture and nightlife haunts that were raided by police.

Then there were Orientalists who LARP-ed as members of an exotic race or ethnic group, one that they were not rooted in one bit. As with today's wiggers, they did not merely dress up every once in a while for fun, or borrow certain design elements to spice up their otherwise native style. They were constantly leveling up their identity as The Other, as close to 100% max stats as they could manage. They always dressed in the exotic style, and tried to re-create a foreign architectural style on English soil.

Finally there were various strains of anti-modernists who affiliated not with somewhat earlier generations or zeitgeists, but all the way back to the Gothic and Medieval periods from their nation's history. The best-known group was the Pre-Raphaelite Brotherhood of painters. They were not merely seeking contact with the past, or Luddites who hated where all this new-fangled technology was taking society. They chose to base their very identity on affiliation with the Medieval period.

In our second Gilded Age, everything old is new again.

September 25, 2014

Convenience as neglect, disloyalty, and desecration

A recent comment about digital cameras marveled at how remarkable technology is for giving us cheaper, faster, and generally more convenient ways to take pictures. But that convenience has come at a cost to image quality and to the emotional significance or resonance of our pictures, both of which have devolved in the digital age. This trade-off between convenience and some kind of quality is general, not only regarding cameras, so it's worth looking into.

These days the principle of convenience is so worshiped by so many people in so many contexts that we can hardly recognize how strange it is. From Walmart to Amazon to Redbox to Facebook, convenience has proven to be the most important value to 21st-century man (or more accurately, guy).

Yet convenience resonates with only one of the "moral foundations" in the Haidtian framework, namely liberty — freeing up the individual to pursue whatever they wish they had more time, money, and effort to devote towards.

On all other foundations, it offends rather than pleases our moral sensibilities. In matters of care and harm, it manifests as neglect; in the domain of fairness, as rule-bending and corner-cutting; in authority, as abdication at the top and shirking at the bottom; in group loyalty, as opting out; and in purity, as debasement.

Convenience is thus a libertarian rather than liberal or conservative value, and its pervasiveness reveals the callous laissez-faire norm that governs our neo-Dickensian Gilded Age v.2.0.

In politics it appeals mostly to so-called moderates or independents, who shop around for whichever candidate can offer them the most convenient quid pro quo if elected to office. Likewise in religion it appeals to the denominationally unaffiliated, who shop around for the most convenient arrangement of investment from the pew-filler and reward in self-fulfillment. Longer-term concerns about party or church stability, or indeed stewardship of anything outside of the individual's little existence, are utterly foreign to the convenience shopper.

As mundane as it sounds, there could hardly be a sharper ideological fault-line to wage a battle over than convenience, which prizes puny gains to the individual over substantial blows to group cohesion, whether it be the family, community, workplace, or nation. Or put the other way around, tolerating puny costs to the individual in order to hold these groups together is what makes us the successful social species that we are.

It is tolerance of inconveniences which compels us to care for the sick when we are healthy, to play fair, to carry out our duties to superiors and subordinates alike, to honor the wishes of the community, and to preserve purity from adulteration.

September 23, 2014

Transplant governors

Having studied the rooted vs. rootless connection that Senators have to the states they represent, let's turn now to governors.

As before, "rooted" means that they graduated from high school in the state that they're in charge of. If we had better data, we could count how many years from, say, age 5 to 20 they spent living in the state, but what you can find online isn't that fine-grained. High school graduation is the most convenient milestone for our purposes.

An appendix below contains the full list of where each governor was living across several milestones -- birth, high school, college, and any advanced degrees they took.

On to the findings. The following states have transplant governors: Hawaii, Arizona, Colorado, New Mexico, North Dakota, Ohio, Florida, Maryland, Virginia, New Hampshire, and Vermont.

Transplants account for 11 of our 50 governors. That may appear to be a lower rate than the 29 of 100 Senators, but this difference is not statistically significant. Carpetbagging behavior appears to be independent of which branch of the government the politician pursues their career in.
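
For anyone who wants to check that significance claim, here's a quick sketch with scipy, using the counts quoted here (11 of 50 governors) and in the earlier Senate post (29 of 100): a Fisher's exact test on the 2x2 table comes back far above the usual 0.05 cutoff.

    from scipy.stats import fisher_exact

    #                transplant   rooted
    # governors          11         39
    # senators           29         71
    table = [[11, 39], [29, 71]]

    odds_ratio, p_value = fisher_exact(table)
    print(round(p_value, 2))  # well above 0.05: the gap could easily be chance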

Rootlessness is independent of party, a result also found among Senators. However, as we saw with Senators, the Republican transplants headed off for states where they would not disrupt the partisan status quo, such as Arizona or North Dakota, whereas the Democrat transplants are part of the ongoing disruption of "swing states" that used to be red but are turning blue, such as Colorado and Virginia. It would be a mistake to blame the politicians themselves: they only represent the will of the voters, a rising share of whom are carpetbaggers themselves, having fled expensive blue states to gentrify red states, where the competition for status is less saturated.

The main regions affected by transplants are rural New England, where a brain drain has left the local hold-outs willing to hire outsiders to preserve their fading regional culture; and the Mountain states, where boomtown growth has brought in truckloads of transplant citizens, and where a long history of frontier rootlessness has left the region vacant of an entrenched elite that aspiring office-holders would have to overcome (if political) or kow-tow to (if economic).

But the worst offenders did not even live in their state's general region for any of their four milestones -- Hickenlooper in Colorado, who is a total East Coaster, and Scott in Florida, who is from the Midwest / southern Plains.

Brewer in Arizona has only a weak connection -- a locally earned technical certificate, not four years of college, after moving from California. Ditto for Abercrombie in Hawaii, who left behind lifelong roots in New York to take an advanced degree locally. McAuliffe in Virginia isn't exactly from around the place either -- born and raised in upstate New York, college and after in DC.

Other transplants are not such flagrant outsiders. O'Malley in Maryland is from just over the border with Northwest DC, Martinez in New Mexico is from just over the Texas border in El Paso, Kasich in Ohio is from just over the border in the Pittsburgh metro area, and Shumlin in Vermont attended prep school just over the border in northern Massachusetts. While not from next-door, some are not from too far away either: Dalrymple in North Dakota is from Minneapolis, and Hassan in New Hampshire is from Boston.

The most resistant region is the Deep South, a pattern we saw in the legislative branch earlier. Their historical memory of the original carpetbaggers during the original Gilded Age has made their immune system more robust this time around.

The following states have governors who were born, raised, and educated entirely locally: Alabama, Arkansas, Georgia, Mississippi, Missouri, South Carolina, Iowa, Indiana, Kentucky, West Virginia, Kansas, Texas, Idaho, Utah, Michigan, Maine, and New York.

Most of these are part of "flyover country," but as with Senators, competition is so stiff in the power centers of New York and Texas that local roots are one of the few things that can decide a contest among competitors who all have impressive credentials and sociopathic ambition. Illinois (i.e. Chicago) is a tough nut to crack, too: Quinn only left the state for college. Similarly in California, Brown only left the state for his law degree. But the closer you get to the historical Establishment in New York, the more you'll need every point for local roots that you can claim, to win out over identically impressive credentials.

That's about all the major patterns I see; let us know if you see any others.

Please also chime in if you have advice about how to continue this study for the judicial branch. I figure the state Supreme Court is the place to look, but that makes for 5 to 9 judges per state = too many for a casually interested person. Counting the chief justice alone would make the data manageable, but I'm not sure how meaningful that position is across states. Is it the top of the top, or is it like the chair of a college department that rotates through people who don't want it?

So stay tuned, though it may take a while before I can analyze what's going on with the rootedness of judges.

Appendix: Rootedness of American Governors, 2014

This table lists where each governor was born, graduated high school, graduated college, and took an advanced degree; a "--" cell means there is no entry for that milestone (e.g., no advanced degree). The final column marks whether or not they're a transplant -- 1 for yes, blank for no. The table is sorted first by transplant status, and then alphabetically by state.


State  Governor           Party  Birth  HS grad  Uni grad  Adv grad  Transplant
AZ     Jan Brewer         R      CA     CA       AZ        --        1
CO     John Hickenlooper  D      PA     PA       CT        --        1
FL     Rick Scott         R      IL     MO       MO        TX        1
HI     Neil Abercrombie   D      NY     NY       NY        HI        1
MD     Martin O'Malley    D      DC     DC       DC        MD        1
ND     Jack Dalrymple     R      MN     MN       CT        --        1
NH     Maggie Hassan      D      MA     MA       RI        MA        1
NM     Susana Martinez    R      TX     TX       TX        OK        1
OH     John Kasich        R      PA     PA       OH        --        1
VA     Terry McAuliffe    D      NY     NY       DC        DC        1
VT     Peter Shumlin      D      VT     MA       CT        --        1
AK     Sean Parnell       R      CA     AK       WA        WA
AL     Robert Bentley     R      AL     AL       AL        AL
AR     Mike Beebe         D      AR     AR       AR        AR
CA     Jerry Brown        D      CA     CA       CA        CT
CT     Dannel Malloy      D      CT     CT       MA        MA
DE     Jack Markell       D      DE     DE       RI        IL
GA     Nathan Deal        R      GA     GA       GA        GA
IA     Terry Branstad     R      IA     IA       IA        IA
ID     Butch Otter        R      ID     ID       ID        --
IL     Pat Quinn          D      IL     IL       DC        IL
IN     Mike Pence         R      IN     IN       IN        IN
KS     Sam Brownback      R      KS     KS       KS        KS
KY     Steve Beshear      D      KY     KY       KY        --
LA     Bobby Jindal       R      LA     LA       RI        England
MA     Deval Patrick      D      IL     MA       MA        MA
ME     Paul LePage        R      ME     ME       ME        ME
MI     Rick Snyder        R      MI     MI       MI        MI
MN     Mark Dayton        D      MN     MN       CT        --
MO     Jay Nixon          D      MO     MO       MO        MO
MS     Phil Bryant        R      MS     MS       MS        MS
MT     Steve Bullock      D      MT     MT       CA        NY
NC     Pat McCrory        R      OH     NC       NC        --
NE     Dave Heineman      R      NE     NE       NY        --
NJ     Chris Christie     R      NJ     NJ       DE        NJ
NV     Brian Sandoval     R      CA     NV       NV        OH
NY     Andrew Cuomo       D      NY     NY       NY        NY
OK     Mary Fallin        R      MO     OK       OK        --
OR     John Kitzhaber     D      WA     OR       NH        OR
PA     Tom Corbett        R      PA     PA       PA        TX
RI     Lincoln Chafee     D      RI     MA       RI        --
SC     Nikki Haley        R      SC     SC       SC        --
SD     Dennis Daugaard    R      SD     SD       SD        IL
TN     Bill Haslam        R      TN     TN       GA        --
TX     Rick Perry         R      TX     TX       TX        --
UT     Gary Herbert       R      UT     UT       --        --
WA     Jay Inslee         D      WA     WA       WA        OR
WI     Scott Walker       R      CO     WI       WI        --
WV     Earl Ray Tomblin   D      WV     WV       WV        WV
WY     Matt Mead          R      WY     WY       TX        WY