March 29, 2015

Flashdance: Darkest lit mainstream movie ever?

I caught most of Flashdance on TV the other night, and was struck by how dark the lighting is. Not just a scene here or there, but the whole movie. See the gallery at the end of this post, and many more pictures in a post about Pittsburgh in film from What Price Glory.

The plot, dialog, and character development are nothing to write home about, but it is worth checking out for its look and sound.

Some scenes are evenly dim to convey the cloudy and dingy atmosphere that the characters are struggling to emerge from. On the other hand, many scenes have high-contrast lighting to make their lives look more stylized. This chiaroscuro can heighten the tone of romance or intimacy, as well as suggest the almost other-worldly nature of the nightlife environment.

Released in 1983, it represents a bridge between the gritty naturalism of the '70s and the stylized music-video look of the '80s. Just six years earlier, the similar movie Saturday Night Fever featured few scenes with high-contrast lighting, strongly back-lit shots that make people look like shadows, smoke giving the light a hazy quality, and so on.

I don't recall any shots from Flashdance that are destined for the cinematography hall of fame, but I appreciated the effort to sustain a dark look from one scene to the next for the entire duration. It gives the movie a distinct sense of place and time, on top of being shot on location in the Steel City during the recession of the early '80s.

The iconic shot of an exotic dancer in a prop chair being doused with water shows how much the movie's look and feel depends on dark and high-contrast lighting. You can find many re-creations of this shot on Google Images, but they all tend to have brighter and more even lighting, so that the girl doesn't look like a shadow in profile. It just looks like a cheesecake photo from any random lad mag. The shadowy look of the original shot obscures the details of her body, so it doesn't come across quite as pornographic as it would have with standard lighting.

It's rare to find such an unusual visual approach in such a popular movie (it ranked 3rd at the box office for 1983). Can anyone think of another hit movie that is so distinctly dark, for both interior and exterior shots, and for daytime as well as nighttime shots?

Here are 20 images that show how pervasive the dim, chiaroscuro look is throughout Flashdance.





















March 25, 2015

Higher suicide rates out West due to rootlessness?

From Wikipedia's article on the epidemiology of suicide by region, here's a map of suicide rates among white men during the peak of the violent crime rate circa 1990:


This map left out Alaska, but it has a very high rate as well.

The main effect is being out West, which I interpret as a symptom of rootlessness and transplant living. Not only does being rooted in your place give you more support to cope with life's troubles before you even begin thinking about suicide, it also reminds you how disrespectful it would be to others to just off yourself. When roots are shallow, removing yourself from the local social web doesn't feel like you'd be inflicting such a great loss on them.

There are pockets of relatively lower rates along the Pacific coast, particularly away from its major cities. Sure enough, the West Coast is relatively more rooted than the Intermountain West -- once you reach California, you tend not to want to leave (especially before the '90s), whereas nobody feels like staying in Nevada for very long.

That still wouldn't explain why Utah's rates are so high: Mormons are more deeply rooted than others in the region, and more religious.

Back East, there's a noticeable line between higher rates in the South and the southern Midwest, and lower rates farther north (aside from a pocket of higher rates in northern New England). Perhaps this reflects ethnic diversity, with whites more likely to kill themselves when they live near large black populations -- Missouri, western Tennessee, and most of the Deep South.

If so, that would be another influence of rootedness, because even if your own group's roots go back as far as they do among Southern whites, they aren't rooted in the entire local region because that territory is occupied by the roots of the region's historically large black population.

Black roots are shallow in the industrial Midwest, Mid-Atlantic, and southern New England, so perhaps that's why they don't crowd out the feeling of rootedness among whites in Chicago, Philadelphia, and Boston (at least back in 1990). Whites don't like living next to black populations that just showed up within some residents' own memories -- the Great Migration that began in the 1910s -- but the sense that blacks are recent transplants from slave-land allows whites to still claim the whole region as the land of their own roots.

March 24, 2015

Were the original domestic cats not so defiant?

I came across a series of these "cat shaming" pictures that show a cat with a note explaining what trouble it got into, and why it's not sorry.


One thing stood out about these cat mugshots -- hardly any natural-looking cats (tabbies). The few that were shown had not crossed much of a boundary with their owner -- licking some butter, for instance, rather than tearing up a roll of toilet paper. At one site that had a sample of 30, 10 of them were either black or black-and-white. Orange ones were there, too, as were calicos. The ones with tabby coloring almost always have large swaths of white as well, not fully natural-looking tabbies.

Today's tabby cats look like the wildcats that were domesticated thousands of years ago, so I'm guessing that those first however-many generations did not steal and hide sharp objects around the yard, did not sneak their way into the granary and treat it like a great big litter box, and did not climb up on the roof and start shredding the thatching.

If they were wary of human beings, they must have still had a more helpful and submissive attitude, at least compared to the terrorists in the cat-shaming pictures. It's hard to imagine hardscrabble farmers continuing to domesticate an animal that only wanted to flop down on their work space so they couldn't get anything done.

The mischievous ones look more like what you'd see, not in nature, but in an artificial urban environment where feral colonies form. There are plenty of black, black-and-white, orange, and heavily white-spotted cats there, much more so than tabbies. These are the cats that have evolved to thrive in a setting where human feeders and caretakers can be taken for granted, hence where good manners aren't very necessary.

In fact, it must have paid off in Darwinian terms to be a little more pushy toward their adopters, who had begun to take care of them out of a sense of duty or stewardship (and so, more likely to thrive among a pastoralist group). They would therefore care for the creatures on a more unconditional basis, compared to the early domesticators who tolerated cats on a quid pro quo basis, such as keeping the mice away from the grains. If your relationship with the semi-wild animal is conditional, you won't find it difficult to shoo it away if it starts acting too big for its britches and assuming an air of authority and entitlement.

Appearance and temperament are linked in all animals, so that selecting for certain personality traits will alter their looks as well (see Belyaev's foxes for the classic case study). As cat adopters shifted from farmers who allowed barn cats to hunt the mice on their land, to the urban crazy cat lady who takes care of the feral colony no matter how badly they misbehave, the tabby coloring has been slowly weeded out.

Black coloring is generally associated with aggressiveness in animals (including Homo sapiens), so it may not come as a surprise to see the change move in a darker rather than lighter direction. I'm not sure why the shift has also selected for heavy patches of white -- perhaps to increase visual contrast and be more attention-getting in a dense, competitive urban setting.

Whatever the reasons, it's worth bearing in mind if you decide to adopt in the future. Tabby color looks like a good predictor of not acting like a dictator around the home.

March 22, 2015

'70s snapshot: Booze and drugs at a middle school hangout site

You ever wonder what the characters from Fast Times at Ridgemont High were like in middle school? Was little Jeff Spicoli already stoned out of his mind in seventh grade? It looks like he was.

The middle school age group isn't as exciting to study as high school, so there tends to be very little record left in the public imagination, only personal memories. But suburban archaeologists can still re-visit the original scene of the crime and see if there are any traces left of what the kids were up to way back when.

I always keep my eyes peeled when strolling around trails for signs of the good ol' days, especially when they're near schools, where young people would have been hanging out. The good weather this afternoon brought me to a wooded area behind a middle school and adjacent to residential housing.

I've seen plenty of relics from the wild times when wandering around near high schools, but those kids could have been near the age of majority. Middle schoolers should have had a lot more difficulty getting their hands on beer, pot, and the like. Then again, times were different back then.

As it turned out, there were at least half a dozen beer cans lying around the middle-school hangout. Every time I do this, there's a brewery so old I've never even heard of it -- this time it was Ballantine ("premium lager beer"). It was a can with the old pull tab, so it must have been from the '70s, and this label from 1977 looks like the one it had:


There was a Budweiser can still in colorful condition nearby, also with the old pull tab. There were a few push-in type cans by Coors, Milwaukee's Best, and Red Bull, though they all had the narrow mouth opening and anti-litter warnings on the top like you would have seen in the later '70s and '80s. If you find ones with the narrow mouth but no anti-litter warnings, those are from the '80s and '90s. You may never have paid any mind to these details of beer can design, but when you're trying to date a hangout site, they can give you a very close estimate.

Atypically, there were no collections of cans of the same type, as though they had all come from the same six-pack. It's not uncommon to find the remnants of an entire six-pack behind a high school or off of a trail where older teenagers and young adults would have gone. Near the middle school, I didn't find two cans by the same maker. My hunch is that each of the kids lifted a single beer from their home's fridge, or shoplifted a single can, or found a sympathetic older person to get them "just one" can of beer. That would keep the pre-high-school drinkers under the radar and account for the mismatched assortment of cans.

As usual, there were tree carvings with initials, dates, etc. "PARTYED HERE," read one of them, with the misspelling being an honest sign of the barely mature, barely literate make-up of the site's habitual occupants. A large graphic carving showed a bong with a stylized plume of smoke wafting up.

Unusually for what I've seen around high schools, most of the dates were from a narrow range -- the mid-to-late '70s, whereas those near a high school would have gone into the '80s and early-mid '90s. One kid left his full name and the three academic years he was at the school. (I'm not sure just how academic those years were for him, given the big-ass pot leaf engraving that he also left.) He was there from, I think, '74 to '77.

Few or no dates from the '80s -- that really struck me. The middle school opened in 1965 and is still packed with students. There were signs that students still walk through and around the area -- a recent wide-mouth lemonade can, bags from Lay's chips (chip bags degrade quickly and must be recent), a notice from the principal to parents dated December 2014, and so on.

This recent stuff looked like trash that somebody chucked to the side while walking straight through, though, not collections of cans or bottles that would have been left over from students hanging out for a little while before moving on.

The period when the middle schoolers actually made the site their own hangout area, drank beer, and occasionally got high, was a small blip in the 50-year history of the school. Who were those students? They would have been born mostly in the first half of the '60s -- the late Boomers who would go on to a Fast Times kind of high school.

Their wild upbringing shows up just about anywhere you look, and it has had lifelong effects: they have always been over-represented among the homeless, for example, whether it's at the beginning of the homelessness phenomenon in the '80s or today. Anecdotal reports from early Gen X-ers suggest that the late Boomers were way more heavily into drugs during college, when the two may have shared a fraternity.

And it's reflected in the pop cultural record, where teen movies of the mid-'80s portray high schoolers who are noticeably more introspective and wary of just doing whatever feels good, man, in contrast to the uninhibited teenagers of Fast Times earlier in the decade.*

Now it looks like that lack of inhibition began earlier than their high school years, at least by the start of adolescence. I wonder how their wild attitude showed up in elementary school, when they were still too young to score beer.

* This may be one pathway from the outgoing to the cocooning mindset. The first generation that's born and raised entirely within outgoing / rising-crime times is going to turn out a little too wild, so that the next cohort after them, looking up at what the older kids are like, is going to decide that maybe that's too far, and that they should dial it down just a bit themselves. And not to take being cool as the be-all end-all of youth, if that pursuit could lead to disaster. Make fun of trying to act cool.

This more self-aware and ironic mindset and behavioral style is already evident by the late '80s in the quintessential mock-ethnography of Generation X, Heathers, whose characters would have felt out of place at Ridgemont High.

March 20, 2015

Homo population size across metro areas (Gallup survey)

Gallup surveyed 50 major metro areas to uncover what percentage of residents identify as not heterosexual. Here are their results (scroll to bottom for the full list of 50).

No surprise that the West is the gayest region. The rootless inheritance of the Frontier makes it attractive for people who want to erase their personal history and associate only with folks who don't know their past. Gays won't have bad memories of bullies if they uproot themselves from the community that shunned them. The laissez-faire norms of the Wild West also weaken any attempt to contain deviance.

For those following along with the bizarre nature of Mormon culture, you won't be surprised to see Salt Lake City landing in the top 10, at #7, edging out their regional rimjob rival of Denver, CO, as well as Los Angeles. I wonder if that'll be the next big thing for the Utah tourism commission -- "Salt Lake: More gay than LA! (It's official)"

Also not shocking to see how many queers there are in New England, although there aren't quite so many farther down the Bos-Wash corridor.

What might upset your expectations is how gay the plantation South is. New Orleans is all the way up at #4; Virginia Beach and Jacksonville land in spots 11-20. Atlanta is a bit more upland than the plantation plains, but still part of the Deep South. Miami, Orlando, and Tampa are also above the median in this list. The only low-ranking metros are Richmond, Raleigh, and Charlotte (although it's fairly upland).

The least gay region is in fact Appalachia, not the Deep South. The most fag-free metros in the nation are Birmingham and Pittsburgh, the southern and northern poles of hillbilly country. Other nearby but lower-elevation areas that rank pretty low include Memphis, Cincinnati, Nashville, and Charlotte. The only nearby areas that rank somewhat high are Louisville, Indianapolis, and Columbus (we've got to do something about that).

Much to the disappointment of Midwestern strivers, the region is about as devoid of homosexuals as Appalachia. The only somewhat high spots are those three sites on the border between the Heartland and Appalachia (Louisville, Indianapolis, and Columbus). In order to appear relevant to the homocentric media, SJWs from wholesome flyover country would have to resort to hoaxes a la "my dead gay son" from Sherwood, Ohio.

It's worth comparing the ranking of how gay the population is with the ranking in this earlier post about how prevalent gay culture is. In other words, two cities may have the same concentration of homos in the population, but one may have a much more in-your-face gay culture.

Those differences would reflect the relative strength of enabling vs. containing forces from the surrounding normal population. They would not reflect differences in the dispositions of the gays themselves, since they are everywhere attention whores by inclination. Like how two cities may have similar fractions of the population being black, yet different crime rates, reflecting differences in the strength of the surrounding whites to contain black violence.

For the degree of gay culture, some metro areas rank about where you would expect based on how gay the population is. Salt Lake City is full of fags and has a "vibrant" gay culture, while Cincinnati has few of them and not much of one.

But other places have a much less palpable gay culture than you'd expect from their somewhat high ranking on percent of the population being queer. Boston, Providence, Columbus, and Las Vegas, for example. Las Vegas is too steeped in the commercialization of hetero vice to allow much room for gay culture. Boston's surrounding culture is sober and Puritanical, ditto for Providence. In their classical liberal view, as long as you don't let it show in public, what you do behind closed doors is no one else's business. And Columbus is too happily Middle-American to encourage its weirdos to fly their freak flags. (It didn't make the ranking of gay culture at all.)

On the other hand, several places in the Midwest are host to gay cultures that are far outsized for the tiny gay populations that live there, such as Minneapolis and St. Louis. Madison, Wisconsin was not surveyed by Gallup, but would probably not prove to be very much gayer than nearby Milwaukee. Yet it ranks sixth among cities for signs of gay culture, above San Francisco and Long Beach.

I doubt this is due to the reported gay populations being smaller than the actual size because of self-censorship. These are all liberal bastions that support gay marriage, so respondents would have little reason to lie about who they are.

Rather, it seems like another case of Nordic permissiveness run amok, along with Scandinavian insecurity about how others view them, and exaggerating their credentials so that the elites will accept them. "We have pride parades, too!" "Des Moines is the new Brooklyn!" Pathetic.

So, in trying to figure out what factors make for a wholesome regional culture, we need to consider not just how common deviants like homosexuals are in the population, but also how enabling or containing the surrounding culture is of abnormality. This clearly tips the balance in favor of Appalachia over the Midwest as the beacon of cohesion to the rest of the nation that still cares.

March 18, 2015

No more clingy girlfriend songs in our cocooning age?

Still poking around the Billboard Year-End charts to see how things have changed since the '90s. Society had already entered the cocooning phase, but it was only several years into it, rather than 20-odd years into it.

Also, change doesn't always affect every individual -- it's not as though every pop star of the '90s was a watered-down version of their counterpart from the '80s. Some individuals still showed signs of the '80s climate, they were just fewer and fewer in number each year.

Looking over the charts from '93-'95, you can still see a remnant of the outgoing and socially connected world of the '80s -- the clingy girlfriend song. Torch songs wouldn't be popular if young people didn't really care that much about connecting, whether due to mousiness and celibacy or glibness and promiscuity.

Some examples, whether traditionally sentimental or with a then-contempo indie / alternative dressing. I wasn't very into rap or R&B, so won't remember any examples from that growing domain of pop music.

"I Will Always Love You" by Whitney Houston

"I'll Never Get Over You (Getting Over Me)" by Expose

"Again" by Janet Jackson

"Stay (I Missed You)" by Lisa Loeb

"Linger" by the Cranberries

"Take a Bow" by Madonna

"You Oughta Know" by Alanis Morissette

Musically these aren't as catchy as the clingy girlfriend songs from the New Wave heyday like "Goodbye To You," "Only the Lonely," and "Johnny Are You Queer". I'm just talking about the tone of the lyrics revealing that there was still a residual sign of people wanting to connect with each other, and feeling loss if that bond were broken.

Twenty years further into the cocooning phase, female singers don't even talk about the aftermath of a relationship, since everyone in the audience is too socially awkward and frightened to "reach out" in the first place.

The most popular songs all convey a profound fear and dread about the very beginning when you're only asking someone out on a date, getting to know them, and so on. Merely dating somebody has become this looming apocalyptic scenario, where if the other person rejects you outright or it fizzles before anything happens, you'd be so mortified that the world might as well explode.

It sounds like the singer is a 6th-grader blasting "Carmina Burana" in her room to pump herself up to ask her girl friend if she'll ask her crush if he likes her back. "Dark Horse," "Boom Clap," "Blank Space," etc etc etc. It's all middle school apocalypse music.

Where the torch song showed a level of maturity that allowed a relationship to fully run its course, and a desire to trust others and keep them close even afterward, the emo anthems of today show how stunted the audience is -- stuck at the stage of development when you're still too awkward to open up to the opposite sex. They also show a distrust of others (MUST NOT EVER BE REJECTED), including your peers, who you feel like keeping a safe distance from, except to scratch the occasional lust itch (or maybe not even then).

Keep your ears open for signs that the cocooning phase is winding down. By the latter half of the '50s, when folks were leaving their Midcentury drive-in cocoons, they were in the mood for a sincere torch song like "Making Believe" by Kitty Wells (#2 on the Country charts). I prefer the version from further into the outgoing phase, performed more tenderly by Emmylou Harris:


March 16, 2015

With emphasis on good looks, pop music in cocooning periods has few black women

An earlier post showed that audiences in cocooning times prefer singers to be attractive, while in outgoing times it's whoever can sing well. Quite a few stars of the '80s, like Phil Collins and Bonnie Tyler, could never have made it today, when pop stars are chosen more for looks, as they were during the '50s (see the post for pictures of Midcentury pop singers).

With the unabated attempt by the cultural Powers That Be to get a '90s nostalgia movement going, I decided to look into where 1995 stood on this shift, judging from the Billboard Year-End singles chart. There were some attractive singers (Madonna, Mariah Carey, Sheryl Crow), but still a fair showing of homely women as well.

Something struck me about the homely ones, though: a good number were black. Rap and R&B were big at the time, were dominated by blacks, and featured about as many women as men. So, homely black female singers were a core part of pop music back in the mid-'90s.

Fast-forward to 2014, when there were only three black women on the Year-End charts -- Rihanna, Beyonce, and Nicki Minaj. Compared to most blacks, they are lighter-skinned, mulatto / exotic-looking, with the kind of mixture you'd find in the Caribbean. They're not bombshells, but they're clearly more attractive than their homely predecessors in the '90s -- TLC, Da Brat, Monica, Brandy, Des'ree, etc.

In this way, pop music has returned to the cocooning Midcentury, when there weren't many black female singers before the '60s. It wasn't until 1963 that the black girl-groups really took over R&B. And although they could sing, they were homely -- the Chiffons, Martha and the Vandellas, the Ronettes.

How about during the so-called image-obsessed decade of the '80s? Black female singers were well represented, and as usual were on the homely side. Several were also middle-aged. From '83-'84, there was Patti Austin, Donna Summer, Dionne Warwick, Roberta Flack, Tina Turner, Deniece Williams, the Pointer Sisters, and Shannon. Irene Cara was the only exotic Caribbean-looking mulatto. Back then, it was still "can she carry a tune?" rather than "how much sex appeal does she have?"

You might think it's odd that these changes haven't been noticed before, especially given the potential angle of "dat's raciss!" and "dat's sexiss!" But it would require black women admitting in public that they aren't very good-looking, so that trends toward good-looking pop culture stars will have a "disparate impact" on them.

Or it would require white liberals risking ostracism by trying to explain the lack of black women in sex-appeal sectors, pointing out that they aren't very attractive. White conservatives don't pay enough attention to any of these matters to notice.

March 15, 2015

Picture of grandparenting Appalachian style from the early '80s

How many signs of the good ol' days can we count here? This is me and my Pap at his home in Jefferson County, Ohio around 1981.


- Absence of helicopter parenting (not just from the grandfather in the picture, but also from the mother standing by who's taking the picture).

- Grandparents being a part of their grandkids' growth, teaching them about important things, eventually imparting valuable skills (although I had to wait until I was about 9 before he would teach me how to fire that gun).

- People in their late 60s being retired and freed up to play these grandparental roles, rather than still trapped in the workforce in order to strive-and-spend until death.

- Children dressed in distinctly children's clothing, not as mini-teenagers.

- Encouraging kids to walk on their own as soon as possible, not carting them around in strollers well into elementary school.

- Unpretentious, homey kitchen.

March 14, 2015

WaPo reporter shocked that gay rights bill passed by new-age cult out West

Still can't keep herself from insulting flyover hicks' aping of bi-coastal political fashions:

"Utah — yes, Utah — passes landmark LGBT rights bill" (link)

None of this will be news to readers here (search earlier posts for gay Utah).

People not familiar with the region assume that having your state colored red means more or less the same thing no matter where you are. But the entire West used to be colored red, despite its Frontier inheritance of footloose novelty-seeking. It has never been a place that valued tradition and rootedness. But hey, they voted for Nixon and Reagan, so that's all we need to know.

The inter-mountain West is rapidly being absorbed by the West Coast, largely through attitude changes from within, although being exacerbated by all the West Coast refugees colonizing the cheaper land away from the beach.

(I think the shift from the surf culture to the outdoors / ski / snowboard culture over the past several decades stems from West Coast culture-makers rationalizing their exodus out of their suntanned paradise and into the snowcapped Rockies. Framing it as a life mission to discover the most epic peak to ski down must alleviate cognitive dissonance better than admitting that your old panoramic shorelines got too over-crowded, over-priced, and over-spicked.)

So, which regions does that leave as relative sanctuaries for common sense and deafness to the call toward status-striving? Check out two maps showing gay anti-discrimination laws by state: one for employment and one for housing.

The Plains states are still in the clear, although you worry about the Dakotas being pulled within the orbit of their Scandinavian weenie center-of-mass next door in Minnesota. Nebraska is also home to too many indie record labels to be confident that it won't drift within the orbit of their "please love us" farmer-striver neighbors in Iowa. Kansas and Oklahoma seem safer, although Texas is a bit too materialistic and libertarian to view it as a bastion of conservatism.

The good ol' Deep South is holding out very well for now, but the pro-homo marriage ruling in Alabama points to the difficulty in maintaining solidarity among like-minded folks in the midst of toxic diversity levels in the environment.

That leaves Appalachia, which has ignored the legal trend in housing, and in employment has allowed no special treatment, or special treatment only for state employees rather than all employees, and in Ohio only for fags and not trannies. If you review the history, several states in this region have actually repealed earlier successes of pro-homo legislation, resulting in more of a tug-of-war rather than the final defeat in the flanking regions of the East Coast and Midwest.

States to watch are those torn between the core of the Midwest that runs from the Twin Cities to Milwaukee to Chicago to St. Louis, and those with much of their land in the Appalachian chain that runs from Pittsburgh to Knoxville to Birmingham.

Indiana and Michigan are not as airheadedly desperate for acceptance as the area farther west, whose main football teams are thinly veiled promotions of homosexuality -- the Packers and the Bears. But then they're not as ornery and ready-to-fight as the hillbillies to their south and east. Ohio so far has been leaning more toward its hillbilly and Amish side than to its defeatist Midwestern side.

The good thing to result from this national assault on common sense, normality, and tradition is that it is putting each locality to a test and revealing their inner nature as they choose one reaction or another. This will facilitate the de-nationalization of our over-bloated economy and polity into more sensible regions. Showing their true colors makes it simple to identify who your people are, and who are not.

March 13, 2015

Campus protests as sibling rivalry among infantilized Millennials, whining for intervention by surrogate helicopter parents

Here's a comment I left at Uncouth Reflections about the campus protests at Oklahoma over some frat bros singing a song with "nigger" in it.

The radically different climate on campuses these days compared to the Vietnam War era is a sorely overlooked change. People see a campus protest and write it off as the legacy of the Vietnam era, but they're too different these days to lump into the same phenomenon.

* * * * *

Youth activism sure has come a long way since the ’60s. Back then, it was students protesting actions by the government. And they tried to enlist as many of their peers in the movement as they could.

Now it’s one subculture of students protesting against another group of their peers. And it’s over speech rather than actions. And they’re eager to receive the help of the government, the school administration, and other authority figures, in their so-called struggle.

It goes to show how infantilized the Millennials are. These gay college slapfights are sibling rivalries, with the whinier sibling squealing as loud as possible to the parents to intervene and make the mean sibling stop saying mean things because their very self-esteem is at stake.

If we run into Marty McFly this year, we must go back to 1985 and abort the Millennial generation.

March 12, 2015

"It Follows": The anti-'80s horror film about not trusting anybody and only looking out for Number One

There's a lot of buzz about the retro vibe of a new horror film, It Follows, but its variations are inversions of the classic themes of the slasher movies of the '70s and '80s.

I don't think I'll be seeing it, and so can't say whether it succeeds on its own terms. I'm more interested in how people, especially so-called film buffs, perceive the past and how it compares with the present. With all the talk about it being a radically fresh incarnation of the '80s slasher flick, it looks like they've totally missed the message.

Here is the movie's trailer, and a full plot synopsis from Wikipedia. Let's look at just how opposite its treatment is of the major themes of the slasher / horror genre during its heyday in the '80s.

Who or what is the danger? Ultimately, it's some supernatural stalker that slowly pursues you and kills you once it reaches you. But the stalker has no direction of its own, unlike Freddy Krueger, who wanted revenge on the children of the adults who fire-bombed his house after the justice system failed to lock up the serial child rapist-murderer. Or unlike a psycho who picks victims on a whim, where it's still his choice, however arbitrary the choice may strike us.

Instead, the stalker is passed along from one victim to the next like a curse. After the current victim has sex with someone, the stalker drops the current target like a hot potato and turns single-mindedly toward the person they had sex with. In order to escape the stalker, your only hope is to pass it along to someone else after the most intimate kind of encounter. Since even hinting at your ulterior motives would make it impossible to make it with the next victim, your goal is to dupe them.

Thus, the true danger is not a supernatural entity, but anybody who might possibly be interested in you sexually, including all of your opposite-sex peers. You can never know which ones are just trying to dupe you into becoming the next victim in order to save their own skin.

With time being of the essence, you'll choose the quickest and easiest victim to dupe. Since that means somebody who already trusts you, you will naturally go after one of your own friends and acquaintances to pass it along to, rather than a stranger. A stranger would be wary of a random horndog guy trying to get into her pants, or a too-good-to-be-true case of a cute girl you don't even know throwing herself at you.

The real enemy, with a real motive, is therefore a close insider rather than an outsider. In the '80s slasher movies, it was someone within the neighborhood or community, but not within your most narrow and intimate social circle. That made it possible to band together with your peers against a common enemy, and it left a fairly large social circle that could be trusted as a sanctuary from evil.

In the world of It Follows, there is no minimal social circle that you can trust. You are utterly on your own, and if you find yourself stalked by the entity, you are only going to look out for Number One by cynically and deceitfully passing it on to someone else.

According to the movie's rules, you cannot even sacrifice yourself to spare others, as the stalker will continue backward along the chain of transmission once it claims its current victim. Trying to take one for the team by allowing it to kill you would spare potential future targets, but would not protect those who came before you in the chain.

In the movie's logic, cooperation and altruism are pointless.

These are not minor, nitpick-y differences. They get at the fundamental themes of the horror genre -- what is the source of danger, how can we prepare for it before it finds us, how can we deal with it when it does show up, and how can we cope with its aftermath? In the classic slasher movies, these themes all led to pro-social solutions. In the Millennial version, they are anti-social.

Taking a broader, objective view of the history of horror, is this really such a new inversion anyway? Not really: the classic '90s anti-slasher movie Scream had already located the source of danger within one's most intimate social circle.

However, It Follows has turned up the dial. In Scream, the idea that evil was so close that you couldn't trust your closest friends and partners was only revealed in a shock ending. Throughout most of the movie, you felt as though it were another case of a psycho killer coming from outside the circle of friends. It Follows lays out the anti-social paranoia from the get-go. Also, in Scream the killer's motive was revenge for his mother, which is at least somewhat pro-social. Mindless, cynical self-preservation is the only motive in It Follows.

During the bridge period of the early '90s, Twin Peaks left the killer's identity an open mystery -- during the captivating episodes, anyway. The teenagers may have suspected one another, but they may also have suspected an adult from the community, an outsider, or a supernatural force. Unlike straight horror movies where the evil entity is known from early on, the unresolved mystery in Twin Peaks led to a tension between trusting and suspecting your closest friends and community members.

Lurid plots involving the closest of friends coldly and psychopathically killing each other have also long been a staple on Law & Order: SVU.

The main innovation of It Follows is the logic of how the evil entity "selects" its targets, but that's just a gimmicky plot device. It's still largely of a piece with the Scream-and-after era of horror movies.

The change in approaches to these themes follows straightforwardly from the phases of the social cycle, which alternates between an outgoing / trusting phase (roughly the '60s through the '80s) and a cocooning / suspicious phase (the '90s through today).

I find it mind-boggling that film nerds compare stuff like this to classic slasher movies, all because it has an eerie synth soundtrack. In narrative substance, It Follows could not be any more of a bizarro '80s movie.

March 11, 2015

Thrift store finds rather than family hand-me-downs

One unusual sign of the status-striving climate is the boom in the thrift store sector. Shouldn't their success be interpreted as a signal of, well, thriftiness and preserving traditions, rather than going into debt up to your eyeballs to afford showy new stuff?

But all that stuff on the shelves of thrift stores came from somewhere. Someone decided to throw out a bunch of old stuff, and left it out for a charity group to collect, instead of the garbage truck. Why throw it out? Because they were upgrading to something newer and showier. That the new stuff doesn't function as well as the old stuff, and breaks down faster, doesn't matter -- the point is to stay on the fashion treadmill, so you trade off quality for novelty.

Why not pass it on to someone in your family? Possibly they're strivers as well, and wouldn't welcome a gift of old stuff. But your average thrift store shopper isn't wealthy enough to look a gift horse in the mouth. They may want a brand new microwave, but a free hand-me-down from a family member beats paying retail -- and it beats buying second-hand, too, since the family item costs nothing.

It seems like the root cause has more to do with the abandonment of stewardship in status-striving times. No time to take care of people, places, and things when we're each super busy advancing our position on the totem pole. Just drop off any unwanted stuff at a third party, and let them deal with it. Commercial interests will find a more efficient way to collect your stuff than a church or school, but again what do you care if some company makes money off of your old stuff?

The libertarian, laissez-faire norms that undergird the status-striving climate also make it awkward to redistribute things among family members. Notions about the highest bidder, the price that the market will bear, and so on, are foreign to family relationships. So, just donate them to a commercial enterprise, and let them allocate your things to unseen and unknown buyers according to the market rate.

Weird as it may seem, perhaps the only way a person today could come into possession of the things that their parents owned is by scouring the thrift stores to see if someone their parents' age has recently donated such things, and paying for them.

Most of those things won't even be very expensive, so it's not as though the major weirdness comes down to paying an arm and a leg vs. getting it for free. Thrift store finds might as well be free. It's that you have to navigate a cryptic web of donors and re-allocators in a commercial setting, rather than interact with folks you know, likely face to face, as part of the gift culture, where receiving a gift puts you in the donor's debt somehow.

Relationship duties are a drag on uber-efficient status-striving, though, so forget giving and receiving gifts. We'll just pay a nominal finder's fee and come away with not only the item, but a completely blank slate of obligations afterward. Your only obligation is to pay the thrift store the stated amount; after that, you're in the clear, and they expect nothing further from you.

This parallels the charity's lack of indebtedness to you after you give them your second-hand stuff. You don't have to monitor them to see if they're behaving like a gracious gift recipient. They give you a voucher to get a tax write-off, and that's the end of it.

However much we may appreciate the kind of stuff that we can easily and cheaply score at the thrift store, we should bear in mind how symptomatic these stores are of the frayed social fabric, and try to go through family relationships before commercial transactions.

License plate design and status striving

Came across this collection of license plates by state, from the late '60s through today.

The designs were simple and functional when status striving was falling, and turned toward show-off-y and bragging styles once the striver impulse set in during the '80s.

There's still variation in the timing across the states, though, with the Plains and Mountain states tending to adopt striver plates as early as the '70s. Partly it's an effort to distinguish themselves to the rest of the nation that regards them as indistinct Flyover Country. But it's also a reflection of their Frontier inclination toward novelty and razzle-dazzle rather than tradition and simplicity.

In fairness, fancy plates also reflect the difference between wealth-based striving vs. lifestyle striving. The least pretentious plates are actually found in the most viciously competitive lands, along the Bos-Wash corridor. There, regional status contests focus on displays of power and wealth -- our state has more Fortune 500 companies than yours, more media control, more political control. We don't need to advertise how powerful our state is on our license plates, since wealth and power speak for themselves.

Out West, license plates serve as tourist brochures, with one or more advertising slogans, a claim to fame, gaudier typefaces, and scenic depictions of what makes us awesome. In Utah, it's "Life elevated," home to the "greatest snow on Earth".

The Midwest, Appalachia, and the South stayed unpretentious for a longer period, and didn't adopt brochure plates until the '90s and 2000s.

The less elite parts of New England like Maine and New Hampshire, beyond Greater Boston, have adopted brochure plates as well.

California has kept relatively simple plates, akin to the power centers of the East Coast.

The overly encrusted look and the whiff of desperation must be starting to grate on the nerves by now, as "retro" simplified plates have popped up here and there (Texas, Montana). Who knows if they'll catch on as options, or become standardized as the base plate, but it's a hopeful development that was not out there in the 2000s.

The online collection also has Canadian and Mexican plates to browse through. Overall, Canada has simple plates like the Midwestern US, except for gaudy brochure plates along the Atlantic coast, similar to the backwoods New England plates of Maine and New Hampshire.

Mexican plates are almost universally gaudy, outside of the power center around Mexico City.

March 9, 2015

No more interest in portraits for wall adornment

Browsing through old stuff at thrift stores can reveal how much tastes have changed over the past several decades. When the change has gone against your own preferences, thrift stores offer an oasis of things that you'd buy if only retailers still sold them.

In the "wall decor" section, you're bound to find portraits, whether original works by a local painter or photographer, or mass-produced prints. Two that I recently picked up were a print of Señora Sabasa García by Goya, and a wooden plaque with The Carpenter by folk religious artist Frances Hook.



The plaque with The Carpenter is from the early-to-mid '80s, and the Goya print looks to have been made in the '70s (judging from the dark, faux-grain wooden frame). Back then, portraits were in high enough demand among the public that mass-produced copies sold well. Now they're stuffing the shelves at thrift stores. Each of the pick-ups was under five bucks, a sign of how little demand there is today.

Interest in people wanes in cocooning times, so it's no surprise to see autistic contemporary shoppers ignoring portraits when choosing what to put on their walls. If you browse the best-selling items at art.com or allposters.com, you'll see very few portraits. It doesn't matter whether you search all categories of posters, art, photography, etc.

When human subjects are shown at all, it's usually as part of an activity, where their animation is determined by the action they're engaging in, rather than a more probing look into their personality and inner nature. Then there are the scenes where they look like lifeless dolls, an emo approach to glorify or glamorize passivity and fatalism. See these two top-sellers from allposters.com:



These are in the same vein as other popular dorm-room high art, such as The Kiss by Klimt and any Pre-Raphaelite work.

Also popular are landscapes and cityscapes (suitably devoid of people), abstract or impressionistic buildings and flowers, personality-free animals (owls, ostriches), and the odd piece of technology (vintage cameras).

For a look into how dorm room walls used to be adorned, see this image-packed post at Business Insider for pictures from the 1890s through today, all at the same university. Portraits were a staple on dorm room walls in the '60s, '70s, and '80s. Even when animals rather than people are the animate subject, they are shown in an attempt at a character study, as though the animal had its own personality and nature.

Pictures from the previous cocooning phase of the Midcentury do not show anything at all on dorm room walls, perhaps a sign of the "don't show off" norm of the time. But I already covered the decline in portraits on the covers of the Saturday Evening Post (see here). There was a peak during the 1910s and '20s, then a decline through the '50s, when they were replaced by the types of scenes that have become popular again today (activity scenes, landscapes, technology, etc.).

This investigation shows the comfort around vs. distrust of strangers in outgoing vs. cocooning times. But I think you see the same mindset applying to people you know and are even related to by blood. Back in the '80s, every living room had a full array of photographic portraits on the walls (some of them must have been hanging there since the '60s and '70s, since they were taken back then).

Now it's considered cheesy to see close-up pictures of people in someone's living room. It's not a reaction against over-sharing, since these are your own private domestic spaces, and in cocooning times you are very rarely going to be visited by guests, let alone for a long time or on a frequent basis. It's you yourself who are weirded out by close-up pictures of your own flesh and blood.

Again, you may be comfortable displaying action / activity scenes -- here I am, touring Rome; here I am, skydiving for the first time; here I am, having drinks on my birthday; and here I am, building a house with Habitat for Humanity. Actual portraits, whether of you or someone you know, trigger the intimacy alarm.

It's such a strange mindset compared to 30 years ago, when the living room had portraits of many different individuals, and multiple portraits of a given individual to capture the full richness of their personality.

Related: a three-part look at how Seventeen magazine covers have changed between the '80s and the 2010s (here, here, and here). The covers from 1985 are all portraits, while the girl on the current covers is only meant to display clothing and to wear a kabuki face that shows how epically you'll be crushin' it if you buy These 8 Must-Have Smartphone Accessories.

March 7, 2015

Was disco the least gay genre, on the creation side?

Rappers and boy band members are the most likely types in pop music to be gay, usually closeted.

It sounds like an odd kinship: rappers try to project a persona of machismo, thug life, etc., while boy banders try to project a sensitive, non-threatening image. Rap targets more urban, lower-class, and less-white audiences; boy bands target suburban middle-class whites. Rap is a staple at night clubs; boy bands never are.

What gives?

The strongest similarity between the two genres is the reliance on vocals rather than instrumentation. Boy banders sing, and rappers merely talk, but it's still all about vocals. Nobody plays an instrument, and they don't bother hiring session musicians either.

This seems to be another symptom of gay Peter Pan syndrome.

Playing an instrument requires a certain level of maturity -- not only the time necessary to practice and master it well enough to make a living from playing it, but from having the mindset of wanting to practice in order to improve. Small children are more trapped in the present, and are more apt to say "forget this, I hate this" and melt down if they aren't immediate experts.

Small children must be bossed around by parents into practicing an instrument. By the time they're adolescents, they pick up their own internal motivation to keep at it, assuming they're musically inclined to begin with.

Trapped in the "eww, girls are yucky" stage of development, gays never develop that level of motivation to master an instrument (other than the skin flute). If they join a musical group, it will almost always be as a singer -- singing comes naturally to children, and doesn't have to be learned and practiced just to get the basics down. Practicing singing only refines native talent.

This leads to one of the great ironies of pop music history: the near absence of homos in disco. Disco instrumentation was the most diverse and complex in popular music since the Big Band era, so if you didn't play an instrument, you were out.

I cross-checked Wikipedia's list of gay musicians against those whose entries mention "disco" somewhere, and only two matched -- the Village People (obviously) and Sylvester. No surprise that they were campy novelty acts rather than serious bands like Chic or KC and the Sunshine Band.

Like disco, heavy metal is oriented more toward mastery of instruments than vocals, and is also relatively fag-free, other than Rob Halford from Judas Priest, but then he's a singer. The only gay instrumentalist I could uncover is Roddy Bottum (real name), the keyboardist from Faith No More (more alterna-metal than heyday metal).

Hard rock only had Freddie Mercury from Queen, again a singer. Rock music in any of its forms has almost no homos in it.

Synth-pop and dance pop have lots of gay singers, although they're paired with hetero instrumentalists. New wave was more instrumental than vocal, and wasn't nearly as gay as synth-pop.

The mainstream view is that the descendants of disco were the synth-pop groups, but just like the name says, it was more like pop music with synthesizers. Disco's true inheritors were the new wavers, with Nile Rodgers from Chic passing the baton to John Taylor from Duran Duran. As in disco bands, it was common for there to be black and white members in a new wave band, but not so much in synth-pop groups.

Disco gets a bum rap, in no small part because of its popularity with gay audiences -- at least back then, not so much these days since gays are too slave-to-fashion to conserve what is good from the past. Whit Stillman tried to portray how normal and mainstream the New York disco crowd was, although I don't think he convinced anyone who was already committed to the view of disco as only for gays and gals.

The other source of its bad reputation is people only remembering "Y.M.C.A." and therefore thinking that half or more of the disco groups must have been gay. This little investigation shows how off-base that is, and provides a good reason why -- gays don't play instruments, and disco was one of the most heavily instrumental genres then or now.

March 4, 2015

Teens react to the Apple IIe

Are Boomers as grandparental as their own parents were in that role?

A recurring theme in the rise of the Me Generation is their benefiting from an established set of rules, and then altering them -- even reversing them -- once they could no longer benefit, and would be expected to play the role of the benefactor.

The series of posts I wrote on incumbency highlights this pattern the best. Boomers went all "don't trust anyone over 30" when they were upstarts, insisting that their superiors be cast aside to make way for new blood. After all, those old fogies didn't have a fancy-ass degree like an MBA, the way we do.

The older Greatest Generation largely went along with the coup, seeing it as in poor taste yet still necessary to keep social mobility going for the next generation.

Of course, once the Boomers became the Establishment, their credo switched to "beware everyone under 30," and if the upstarts have fancy little MBA's, well, big whoop -- so do we Boomer incumbents. If they have more education than us, that's just pointless over-qualification. If they want good jobs, they should only have the level of education that we had, but back when we had it, not now, because we are already as educated as we are.

They've been all "boo, taxes!" since their 20s, yet now that they're set to start collecting Social Security retirement checks (without retiring), "we" are going to have to tax and spend to meet our promise to our nation's senior citizens (or aging Americans, or whatever the Boomer euphemism for it is).

I wonder how this dynamic is playing out in the kinship realm. Boomers received loads of help from their parents when they had kids. Greatest Gen grandparents like mine consisted of a grandmother who was a homemaker, and perhaps had been so for decades and decades, and a grandfather who was a man-of-the-house. The grandmother donated endless time minding the grandchildren, freeing the Boomer mother to do whatever else, and the grandfather donated time and effort instructing and passing along know-how.

Some summers as children, my brothers and I would spend entire weeks at my grandparents' home in the middle of nowhere, nearest city Wheeling, West Virginia. Our grandmother watched over us, cooked us meals, made us take baths, and performed all the other maternal duties of a typical day. Our grandfather would teach us how to hold someone in a full nelson, chop firewood, find our way around the woodland trails, shoot a .22, steer a tractor, and all the rest of the things you need to teach a growing boy.

I don't think my parents spent the whole time in frivolous vacation mode. They just had more time to take a breather, and to finally get to all those millions of little things that need to be done around the house, at their jobs, and planning for the near future, that are hard to do while also trying to tame three wild kids.

Are the Boomers now taking on the burden of grandparenting their children's children in the same way? It doesn't look like it. There's no data to check in the General Social Survey, unfortunately, so this is personal observation. You just don't see your Gen X friends posting pictures of dropping their kids off at Camp Grandma for weeks on end during the summer, or pictures of the kids' excitement when they get to return home with mom and dad. No status updates to that effect.

They post all sorts of kid pictures on Facebook, so if they don't include lots of ones with children and grandparents, it's because they aren't really there. The only exception is if the Gen X-er or Millennial is living with their Boomer parent.

Now, some of the distance between today's grandkids and grandparents could be deliberate on the part of the Gen X parents, most of whom either have trust "issues" with their parents, or at least recall the lack of supervision of their own childhoods, and don't want grandma to behave that way again around her grandkids.

Still, it seems like most of the distance is from a lack of will on the part of the Boomer grandparents. Parents today, as cocooning and paranoid as they are about other people being around their kids, are still stretched too thin for time, and would enjoy a sanity-restoring break during the summer. And grandparents don't need to be researched, checked out, and paid.

My sense is that Boomer grandparents play more of an absentee role, not spending as much time and effort nurturing their grandkids and teaching them know-how. My grandmother never ordered a pizza or went out for fast food when it came time to feed us. We got home fries straight from a cast iron pan that must have been a pain to clean afterward. With the Me Generation being so single-mindedly focused on their careers, they have little time and energy left over for caring for grandkids.

Boomers were happy to ask for and receive grandparental help, but are loath to give it now that it's their turn. It's another case of re-writing the rules to benefit themselves in whatever life stage they're currently in. Being tight-fisted is one thing, but when you yourself benefited so much from asking for generosity when you were an upstart, it makes the hypocrisy unbearable.

Not that it's the most serious consequence, but it's also serving to widen inequality. When the established sacrifice in order to free up the status-insecure to work and earn more money, the extremes move more toward the middle. When the established are reluctant, the gap remains wide. Boomers have enjoyed a double boost to their status -- they got lots of free help when they were young, and they aren't doling much out when they're old.

March 1, 2015

Will Smith never could act

With yet another flopperino from the Fresh Prince (Focus), critics are starting to realize maybe this actor isn't all he's cracked up to be, y'know, acting-wise. Their unironic, "Where did it all go wrong?" post-mortems of his career are more puzzling, though, than the fact that this one-note goofus is still taken seriously as an actor.

Back in the '80s, closeted gay funnymen like Eddie Murphy were limited to prankster roles, and it worked fine. Can you imagine, after his comedic success in Beverly Hills Cop, casting Murphy as the detective in Basic Instinct? The idea that somebody green-lit an erotic thriller starring a crypto-homo joker just goes to show how desperate Hollywood studios have become by 2015.

I still remember how mind-bending it was to see Smith land all those big roles in the '90s, when his only talent was mugging for yuks as the Fresh Prince. Did audiences really take him seriously? Or was his appeal a meta- kinda thing, like they realized he couldn't play any other role than Will Smith (TM), but how hilarious is it to see Will Smith (TM) cosplaying as a caricature of a soldier, a g-man, or a lover of women?


Nothing says stoic badass like kabuki-esque face-scrunching


Gay adopter Peter Pan-ishly mirrors his toddler's expression


"The Greatest" as a brooding gay bullycide victim

His case generalizes to all closeted homo actors: lacking grown-up empathy, they can only play one role -- themselves -- and it will therefore be one variation or another on the theme of Peter Pan (the defining trait of gay men).

Eddie Murphy and Will Smith are adorable lil' stinker pranksters. Cary Grant and George Clooney are mirror-gazing playboys who, for whatever reason, can never be charmed into a long-term relationship with a woman. Tom Cruise plays a slightly more grown-up role, as a 12-year-old ultra-intense, ultra-panicky action LARPer.

Fortunately for Hollywood, audiences these days don't care how hamfisted the on-screen performances are, as long as the star has instant brand recognition. Indeed, the fundamental appeal of these stars is that their brand of acting is so limited that you know exactly what to expect. Hence the box office bomb when Will Smith isn't doing Will Smith (TM).

Viewers no longer want to go in with an open mind and feel like being pleasantly surprised, accepting the characters on their own terms. Nope: the actors are only meant to be action figure dolls who do exactly what the spectators had already wanted them to do before entering the theater.

It used to be said that contemporary video games were a pale imitation of film, but now we see that film has become a pale imitation of video games. No controller required, folks -- we already know what actions you'd make the characters perform. Just sit back and enjoy the game playing itself.

February 25, 2015

Mormon paganism: Downplaying the concept of sin

EDIT: I'm editing the intro section in this first of two posts on sin, in order to clarify the distinction being drawn here.

In this first post, the matter is whether man's inclination toward sin is fundamentally a bad thing (the Stoic, Buddhist, Second Temple Israelite, and Christian view), or a blessing disguised as a curse (the Mormon view). Another post on sin will explore the difference between an emphasis on inner nature vs. outward acts, which is a separate way in which Mormonism eschews the Axial Age concept of sin.

Uniquely among religions that are at all common in the West, Mormonism views the Fall of Man as not such a bad thing -- indeed, as actually a good thing when you look at it the right way. The reasons are twofold: the Fall made procreation possible, and it allowed mortal beings to better learn to choose good and avoid evil.

First, in the Mormon view, immortal beings like Adam and Eve before the Fall could not produce mortal children. So, remaining in their immortal Edenic existence would have prevented them from obeying God's commandment to be fruitful and multiply. Torn between two seemingly contradictory commandments, they chose to disobey the commandment not to eat the fruit of the Tree of Knowledge of Good and Evil, thereby becoming mortal yet now able to fulfill the more important commandment of populating the world.

"Adam fell that men might be; and men are, that they might have joy." So reads the Book of Mormon's narrative about the Garden of Eden (2 Nephi 2:25). See the footnote for fuller context. *

Mormonism is certainly unique in forcing a contradiction between God's two commandments -- not to know good and evil, and to be fruitful and multiply. On the mainstream reading, Adam and Eve could have continued innocent of sin, and so produced offspring that were like themselves -- immortal and innocent. Mormonism, however, finds that naive Edenic state incompatible with its concept of the "eternal progression" from naive spirit children, to mortal beings who experience good and evil, to post-mortal "exalted" beings whose learning is completed and who will no longer choose evil. Hence the need for a contradiction between God's two commandments, with the knowledge of good and evil being the lesser of two evils, so to speak.

Remember that the end goal of Mormonism is for a family to become united in the post-mortal stage of the eternal progression, when they hope to be exalted beings -- immortal bodies of flesh and bone, bound as "families forever." Since these exalted beings must first pass through the mortal stage of existence, Mormonism holds the populating of the world by the original mortal parents to be more important than Adam and Eve remaining free of sin.

Indeed, Mormons downplay the notion of Original Sin more than any other existing strain of Western religion that includes the Garden of Eden narrative in its sacred texts. For centuries, the mainstream Christian view has been that Original Sin altered the inner nature of mankind, inclining it toward sinful acts -- not that people were responsible for the original sinful acts of Adam and Eve. The latter, however, is the straw man that Mormons argue against in their dismissal of the importance of Original Sin.

Moreover, they don't concede that the alteration of our inner nature toward sinfulness is a bad thing. Mormons view mortal existence as only a brief journey between the vast stages of pre-mortal spirit existence and post-mortal existence. Our purpose during the mortal stage is to acquire the knowledge, skills, and experience that will allow us to make for effective gods in the post-mortal stage. If we remained as naive spirit children forever, we would never acquire any of that.

Crucially, we must acquire the knowledge about what is good and what is evil, and how to choose good over evil. (I'm unclear on whether Mormons emphasize this knowledge being explicit or intuitive, but it doesn't matter here.) We wouldn't make for very just gods if we were ignorant of the distinction or how to administer justice based on it.

Mormons believe that learning is best done through the experience of contrasts -- the notion is emphasized over and over again. You can't appreciate what pleasure is like without also experiencing pain now and then. Likewise, you can't appreciate what is good without experiencing what is evil, nor can you fully learn how to choose the good without also making mistakes by choosing evil now and then.

Thus, sinful acts are not all bad things that vary only in the degree of depravity, as in the Christian framework. On the contrary, in Mormonism those occasional not-too-severe sinful acts are for the greater spiritual good, allowing you to learn from your mistakes through trial-and-error, so that you'll be a more capable god in the post-mortal stage.

The role of sin reveals a profound difference in the orientations of Christianity and Mormonism. For Christians, the goal is to return to our sin-free state of being before the Fall. We may have inherited a nature of already-lost innocence, and we continue to sin, but we're doing our best to keep from sinning, and to attain a state of restored sin-free existence. Christian living is a sometimes Sisyphean struggle toward an ideal, and sliding back downward is never a good thing.

For Mormons, the goal is not to restore mortal mankind to a state free from sin -- that would prevent all the important learning about good and evil during our mortal stage, and handicap us as gods in the post-mortal stage. In that ultimate exalted form, our bodies will become immune to the tendency toward sin, to sickness, to decay, and to death. But we can only become that way from having learned through the experience of contrasts (good and evil) during mortal life. Righteous and sinful acts are not ones that elevate us or sink us along the upward path toward an ideal sin-free state, but merely the successes and mistakes that are both necessary for learning and maturation to progress toward completion.

Adam and Eve's "transgression," in the Mormon euphemism, did not curse their offspring but enabled their development toward godlike exaltation.

In this way, Mormonism has undone the Axial Age focus on our flawed inner nature, and the goal to correct this inner nature to point our outward acts in a more righteous direction. It does not celebrate sinful acts, let alone encourage its followers to indulge in them however they please. But it has removed the taboo on sin or vice, and reduced it to the concept of mistakes made during the course of learning and maturation.

Indeed, there is now a certain duty to commit sinful acts -- not necessarily on purpose, nor especially severe or frequent, nor without trying to learn from them (a notion that is still different from atoning for them, as though they were wicked). Followers are reassured by the Mormon take on the Garden of Eden narrative, in which the original sin is pardoned as the lesser of two evils, necessary to populate the world. We are meant to follow the example set by our first parents, not to clean up after their mess. Adam and Eve could only set the eternal progression into motion on Earth by disobeying one of God's commandments.

In a future post, we will explore a related pagan development in Mormonism relating to the shift away from inner nature and toward outward acts -- a more legalistic moral framework, rather than the cultivation of inner righteousness or virtue.

* Here is the fuller rationalization of Adam and Eve's transgression in the Book of Mormon, along with the emphasis on growth through experiencing contrasts (2 Nephi 2:22-25).

22 And now, behold, if Adam had not transgressed he would not have fallen, but he would have remained in the garden of Eden. And all things which were created must have remained in the same state in which they were after they were created; and they must have remained forever, and had no end.

23 And they would have had no children; wherefore they would have remained in a state of innocence, having no joy, for they knew no misery; doing no good, for they knew no sin.

24 But behold, all things have been done in the wisdom of him who knoweth all things.

25 Adam fell that men might be; and men are, that they might have joy.

February 23, 2015

Is the queen of rom-coms a lesbo?

This item at Blind Gossip mentions an actress who was playing all touchy-feely with her man at the Oscars, but only when the cameras were on them. During the pre- and after-party, when the cameras were not rolling, she was with her girlfriend. Her public relationship is just a PR stunt.

She will not come out of the closet because she fears that she'll lose her fan base and industry support.

The only way a closeted lesbian could lose a male fan base by coming out is if she were a sex bomb that they're all jerking off to. Nobody in Hollywood is that hot. No one who gets into the Oscars, anyway, which excludes mere eye candy actresses. In fact, coming out might actually titillate a sex-crazed male fan base, who'd start picturing her getting it on with some other celebrity babe.

That leaves a female fan base, who have invested so much of their lives into following her as a role model. How would coming out as a lesbian shatter their faith in her? Her roles had primarily been about a down-on-her-luck kinda gal who meets Mr. Right and everything works out great in the end. Discovering that they had been trying to imitate the love life of a lesbian all along would ruin the last hope they had of finding their prince.

The only good guess in the comments at the BG post is Jennifer Aniston. Most of them are moronically guessing Oprah, who is not an actress. One clue pointing to Aniston is the phrase "so sweet," perhaps referring to her recent drama Cake. Another clue is that "she really wants to win an Oscar more than she wants you to know the truth!" Someone pointed out that Aniston made headlines recently with the phrase "We know our own truth," referring to her relationship / engagement to seemingly closeted homosexual Justin Theroux.

Pictures of them show zero chemistry, which technically just means they are a sham couple, not necessarily that she is incapable of chemistry with men altogether in real life.

Still, check Google Images for pictures of Aniston and Selena Gomez, whether at the recent Oscars after-party or on numerous other occasions. They look way more into each other than Aniston and Theroux do. There's another set of pictures from Sunday's after-party of Aniston and Amy Adams looking tender together.

The blind item didn't say that the actress' girlfriend was famous herself, so I don't claim that Gomez or Adams is Aniston's girlfriend. It's just to point out how much more open and warmed-up she evidently feels around other women.

I'm happy to say that I never did get the whole Jennifer Aniston craze. They tried to make her a sex symbol during the '90s, and I even watched a few episodes of Friends just to see what all the hubbub was about. Nothing. Talk about being over-hyped. I didn't know any other guys who were into her either, but she must have had a niche following among doormat types.

I do understand her appeal to aging single women, and that makes me think it's her. She would lose industry support because her whole schtick is the rom-com princess, something that absolutely does not allow for a lesbian actress.

The Hollywood executives would lose so much money that could've been scored from churning out a dozen more inane Aniston rom-coms. And her fans would feel betrayed and led astray. "No wonder she has such trouble finding and holding onto a man!"

Lesson number one in finding Mr. Right: don't imitate the personality of a lesbian.