February 25, 2015

Mormon paganism: Downplaying the concept of sin

EDIT: I'm editing the intro section in this first of two posts on sin, in order to clarify the distinction being drawn here.

In this first post, the matter is whether man's inclination toward sin is fundamentally a bad thing (the Stoic, Buddhist, Second Temple Israelite, and Christian view), or a blessing in the guise of a curse (the Mormon view). Another post on sin will explore the difference between an emphasis on inner nature vs. outward acts, which is a separate way in which Mormonism eschews the Axial Age concept of sin.

Uniquely among religions that are at all common in the West, Mormonism views the Fall of Man as not such a bad thing; indeed, it's actually a good thing when you look at it the right way. The reasons are twofold: the Fall made procreation possible, and it allowed mortal beings to better learn to choose good and avoid evil.

First, in the Mormon view, immortal beings like Adam and Eve before the Fall could not produce mortal children. So, remaining in their immortal Edenic existence would have prevented them from obeying God's commandment to be fruitful and multiply. Torn between two seemingly contradictory commandments, they chose to disobey the commandment not to eat the fruit of the Tree of Knowledge of Good and Evil, thereby becoming mortal yet now able to fulfill the more important commandment of populating the world.

"Adam fell that men might be; and men are, that they might have joy." So reads the Book of Mormon's narrative about the Garden of Eden (2 Nephi 2:25). See the footnote for a fuller context. *

Mormonism is certainly unique in forcing a contradiction between God's two commandments to not know good and evil and to be fruitful and multiply. Adam and Eve could have continued innocent of sin, and so produced offspring that were like themselves -- immortal and innocent. Mormonism, however, finds that naive Edenic state incompatible with its concept of the "eternal progression" from naive spirit children, to mortal beings who experience good and evil, to post-mortal "exalted" beings whose learning is completed and who will no longer choose evil. Hence the need for a contradiction between God's two commandments, with the knowledge of good and evil being the lesser of two evils, so to speak.

Remember that the end goal of Mormonism is for a family to become united in the post-mortal stage of the eternal progression, when they hope to be exalted beings -- immortal bodies of flesh and bone, bound as "families forever." Since these exalted beings must first pass through the mortal stage of existence, Mormonism holds the populating of the world by the original mortal parents to be more important than Adam and Eve remaining free of sin.

Indeed, Mormons downplay the notion of Original Sin more than any other existing strain of Western religion that includes the Garden of Eden narrative in its sacred texts. For centuries, the mainstream Christian view has been that Original Sin altered the inner nature of mankind, inclining it toward sinful acts -- not that people were responsible for the original sinful acts of Adam and Eve. The latter, however, is the straw man that Mormons argue against in their dismissal of the importance of Original Sin.

Moreover, they don't concede that the alteration of our inner nature toward sinfulness is a bad thing. Mormons view mortal existence as only a brief journey between the vast stages of pre-mortal spirit existence and post-mortal existence. Our purpose during the mortal stage is to acquire the knowledge, skills, and experience that will allow us to make effective gods in the post-mortal stage. If we remained as naive spirit children forever, we would never acquire any of that.

Crucially, we must acquire the knowledge about what is good and what is evil, and how to choose good over evil. (I'm unclear on whether Mormons emphasize this knowledge being explicit or intuitive, but it doesn't matter here.) We wouldn't make for very just gods if we were ignorant of the distinction or how to administer justice based on it.

Mormons believe that learning is best done through the experience of contrasts -- the notion is emphasized over and over again. You can't appreciate what pleasure is like without also experiencing pain now and then. Likewise, you can't appreciate what is good without experiencing what is evil, nor can you fully learn how to choose the good without also making mistakes by choosing evil now and then.

Thus, sinful acts are not all bad things that vary only in the degree of depravity, as in the Christian framework. On the contrary, in Mormonism those occasional not-too-severe sinful acts are for the greater spiritual good, allowing you to learn from your mistakes through trial-and-error, so that you'll be a more capable god in the post-mortal stage.

The role of sin reveals a profound difference in the orientations of Christianity and Mormonism. For Christians, the goal is to return to our sin-free state of being before the Fall. We may have inherited a nature of already-lost innocence, and we continue to sin, but we're doing our best to keep from sinning, and to attain a state of restored sin-free existence. Christian living is a sometimes Sisyphean struggle toward an ideal, and sliding back downward is never a good thing.

For Mormons, the goal is not to restore mortal mankind to a state free from sin -- that would prevent all the important learning about good and evil during our mortal stage, and handicap us as gods in the post-mortal stage. In that ultimate exalted form, our bodies will become immune to the tendency toward sin, to sickness, to decay, and to death. But we can only become that way by having learned through the experience of contrasts (good and evil) during mortal life. Righteous and sinful acts are not ones that elevate or sink us along the upward path toward an ideal sin-free state, but merely the successes and mistakes that are both necessary for learning and maturation to progress toward completion.

Adam and Eve's "transgression," in the Mormon euphemism, did not curse their offspring but enable their development toward godlike exaltation.

In this way, Mormonism has undone the Axial Age focus on our flawed inner nature, and the goal to correct this inner nature to point our outward acts in a more righteous direction. It does not celebrate sinful acts, let alone encourage its followers to indulge in them however they please. But it has removed the taboo on sin or vice, and reduced it to the concept of mistakes made during the course of learning and maturation.

Indeed, there is now a certain duty to commit sinful acts -- not necessarily on purpose, nor of great severity, nor frequently, nor without trying to learn from them (a notion that is still different from atoning for them, as though they were wicked). Followers are reassured by the Mormon take on the Garden of Eden narrative, in which the original sin is pardoned as the lesser of two evils, necessary to populate the world. We are meant to follow the example set by our first parents, not to clean up after their mess. Adam and Eve could only set the eternal progression into motion on Earth by disobeying one of God's commandments.

In a future post, we will explore a related pagan development in Mormonism relating to the shift away from inner nature and toward outward acts -- a more legalistic moral framework, rather than the cultivation of inner righteousness or virtue.

* Here is the fuller rationalization of Adam and Eve's transgression in the Book of Mormon, along with the emphasis on growth through experiencing contrasts (2 Nephi 2:22-25).

22 And now, behold, if Adam had not transgressed he would not have fallen, but he would have remained in the garden of Eden. And all things which were created must have remained in the same state in which they were after they were created; and they must have remained forever, and had no end.

23 And they would have had no children; wherefore they would have remained in a state of innocence, having no joy, for they knew no misery; doing no good, for they knew no sin.

24 But behold, all things have been done in the wisdom of him who knoweth all things.

25 Adam fell that men might be; and men are, that they might have joy.

February 23, 2015

Is the queen of rom-coms a lesbo?

This item at Blind Gossip mentions an actress who was playing all touchy-feely with her man at the Oscars, but only when the cameras were on them. During the pre- and after-party, when the cameras were not rolling, she was with her girlfriend. Her public relationship is just a PR stunt.

She will not come out of the closet because she fears that she'll lose her fan base and industry support.

The only way a closeted lesbian could lose a male fan base by coming out is if she were a sex bomb that they're all jerking off to. Nobody in Hollywood is that hot. No one who gets into the Oscars, anyway, which excludes mere eye candy actresses. In fact, coming out might actually titillate a sex-crazed male fan base, who'd start picturing her getting it on with some other celebrity babe.

That leaves a female fan base, who have invested so much of their lives into following her as a role model. How would coming out as a lesbian shatter their faith in her? If her roles had primarily been about a down-on-her-luck kinda gal who meets Mr. Right and everything works out great in the end, then discovering that they had been trying to imitate the love life of a lesbian all along would ruin the last hope they had of finding their prince.

The only good guess in the comments at the BG post is Jennifer Aniston. Most of them are moronically guessing Oprah, who is not an actress. One clue pointing to Aniston is the phrase "so sweet," perhaps referring to her recent drama Cake. Another clue is that "she really wants to win an Oscar more than she wants you to know the truth!" Someone pointed out that Aniston made headlines recently with the phrase "We know our own truth," referring to her relationship / engagement to seemingly closeted homosexual Justin Theroux.

Pictures of them show zero chemistry, which technically just means they are a sham couple, not necessarily that she is incapable of chemistry with men altogether in real life.

Still, check Google Images for pictures of Aniston and Selena Gomez, whether at the recent Oscars after-party or in numerous other occasions. They look way more into each other than Aniston and Theroux do. There's another set of pictures from Sunday's after-party of Aniston and Amy Adams looking tender together.

The blind item didn't say that the actress' girlfriend was famous herself, so I don't claim that Gomez or Adams is Aniston's girlfriend. It's just to point out how much more open and warmed-up she evidently feels around other women.

I'm happy to say that I never did get the whole Jennifer Aniston craze. They tried to make her a sex symbol during the '90s, and I even watched a few episodes of Friends just to see what all the hubbub was about. Nothing. Talk about being over-hyped. I didn't know any other guys who were into her either, but she must have had a niche following among doormat types.

I do understand her appeal to aging single women, and that makes me think it's her. She would lose industry support because her whole schtick is the rom-com princess, something that absolutely does not allow for a lesbian actress.

The Hollywood executives would lose so much money that could've been scored from churning out a dozen more inane Aniston rom-coms. And her fans would feel betrayed and led astray. "No wonder she has such trouble finding and holding onto a man!"

Lesson number one in finding Mr. Right: don't imitate the personality of a lesbian.

February 22, 2015

Cost-cutting at clinic by Jewish owner brought AIDS into America

Here's some interesting news from the never-ending campaign to re-direct the public's view of AIDS away from its homosexual propagators, and toward the original enablers who introduced the exotic disease into the modern Western world, before it went gay.

A new pop sci book, The Chimp and the River, is trying out a new trick of journalistic sleight-of-hand in shifting the blame for the AIDS epidemic -- in the very beginning, it was those white colonial powers who kept re-using dirty needles while treating Africans for tropical diseases. Transmission through re-used needles, in this view, spread the disease at decent levels before sexual transmission took over. If only those skinflint colonialists in the '50s had splurged for new needles every time, AIDS could have been nipped in the bud.

Now, AIDS didn't enter America, and from there the rest of the world, directly from Africa. It came from Haiti.

And in Haiti, mass recycling of needles was not part of a WASP-y campaign to treat third-worlders for tropical diseases. Rather, it was part of an efficiency-maximizing, cost-cutting business model designed by a Jewish New York stockbroker, where the needles were re-used during the collection of blood from "donors," i.e. poor Haitians selling their blood to the clinic, who then sold it on the American hospital market.

And in any case, merely recycling needles would not have introduced the disease to the gigantic North American population, but only kept the disease afloat among Haitians. The sale of contaminated blood on the American market, though, would have easily done the trick of spreading it beyond Haiti's borders.

(Remember that the next time you eat seafood that says Product of China.)

I had not realized who was behind "clinic zero," called Hemo-Caribbean, until reading this review in the NY Post of The Chimp and the River. Of course, the review didn't say who ran the clinic, but I smelled a rat in its vague description (my comment in brackets, and my emphasis):

But how does one infected Haitian lead to an outbreak that, according to 1982 blood tests, results in 7.8 percent of women in a Port-au-Prince slum having HIV? Again, [re-used] needles. In the early 1970s, a plasma-donation clinic, run by a Miami investor, opened in Haiti offering residents $3 per liter. Shared needles at this clinic likely increased the infection rates in Haiti and shipped the disease to the United States in frozen blood plasma. Research indicates that just a single migration of the virus — ­either one infected person or one container of plasma — accounted for bringing AIDS to America. “That sorry advent had occurred in 1969, plus or minus about three years,” Quammen writes.

You'd think that a non-profit or charitable group would operate a place like a plasma donation clinic. This one was "run by a Miami investor," i.e. one looking to buy low and sell high.

But it's hard to drive up the price that hospitals will pay for the blood being collected by your clinic, and easy to just pay a lot less to your plasma donor suppliers. He was only willing to shell out three bucks for a full liter of your human blood, or $1.42 per pint, the standard size of a donation. Adjusting for inflation, that would still only be about $8.66 per pint today. In fact, donors today receive around $30 per pint, so that penny-pincher was only paying about 30% of what today's going rate is.

Cutting costs means lowering quality, and the operator of "clinic zero" was obtaining and selling cruddy Haitian blood, some of which turned out to be contaminated with HIV. Oops. "Hey, how was I supposeda know? I told the hospitals: let the buyer beware! (Or maybe I left that part out...)"

All signs point to an Ashkenazi Jewish peddler, not a WASP-y colonial administrator type. The obsession with maximizing efficiency, cutting costs, shunning healthy first-world products in favor of shoddy third-world products, pulling a bait-and-switch on the unsuspecting hospitals who bought from him, and callous disregard for imperiling his host society. These traits are found among the goyim as well, but with nowhere near the same prevalence -- especially before the "greed is good" era -- and Jews are over-represented in the positions that could make such things happen.

Sure enough, the founder of the clinic zero was Joseph Gorenstein (I've also seen it spelled Gorinstein). His name has largely been lost to history, appearing in connection with Hemo-Caribbean clinic mostly in a handful of articles in the 1970s, before the AIDS epidemic.

In mainstream history since then, the blame has gone instead to Luckner Cambronne, a high-ranking official in the regime of Papa Doc Duvalier. But Cambronne was just the Haitian co-owner who was brought on board by the American investor-founder. Cambronne was a corrupt official of a toilet nation that had just given Gorenstein's clinic a 10-year concession contract to harvest Haitian blood for export to the US. Gorenstein was the founder and brains behind the operation; Haitians can't finance or operate much on their own, let alone found a new industry.

Here is a synopsis of the unlicensed and unregulated blood bank industry of the 1970s, from an opinion piece in the Pittsburgh Courier in '75 (my comments in brackets):

The procurers [of blood] offer immediate cash for donations. The collected plasma is sold at 400 per cent profit to hospitals and pharmaceutical companies. All too frequently, the donors are drug addicts and alcoholics who can transmit such diseases as hepatitis and malaria [and as it turned out, AIDS].

Because of the demand for more rigid inspection [i.e., here in America], the speculators turned to Haiti and other Caribbean countries. According to Perez, in 1971, a New York-Miami stockbroker, Joseph Gorenstein, opened the Hemo-Caribbean Co., with a 10-year concession to export plasma from Haiti, with the support of the Haitian government.

Business flourished until 1972 when adverse publicity prompted Haitian authorities to re-examine the company's operation. The contract was canceled when gross improprieties were found.

Those gross improprieties spread AIDS into North America, and from there it spread sexually to the whole world. We have uncovered the dark triad at the origin of AIDS: slimeball Jewish businessmen, corrupt black politicians, and promiscuous gay fudgepackers.

Epidemics like AIDS are a great example of what Nassim Taleb calls "black swan" phenomena. There's a small probability that something bad will happen, but the magnitude of that bad thing is unknowably large, and the bad thing fuels its own destructive growth rather than quickly burning out. Ignoring such huge risks just to save a few dollars is not just moronic but dangerous, as well as being unjust when those costs will be imposed on society at large rather than the individual decision-makers who brought them into being.

Taleb points to efficiency-maximizing as a recurring source of black swan blow-ups. The decision-makers believe they can reasonably quantify the expected costs of their plan, and redeem those costs by deriving an equal or greater benefit from the plan. But when the costs are unknowably large, they can do no such balancing act, and the prudent decision is simply not to "go there".

With epidemic diseases, usually it's a decision by politicians to let in hordes of foreigners and their exotic germs. This is an indirect influence of efficiency-maximizing business owners, who lobby the politicians for cheap foreign labor to keep their costs down. So what if it brings in the next Spanish Flu pandemic?

With AIDS, though, it's a clear case of a greedy profiteer directly introducing an epidemic disease through his efficiency-maximizing business model. "I gotta keep my costs down, so where else am I supposed to turn but Haitian ghettos for cheap plasma? Oy, it's hard out there to make a buck as a blood-monger!"

The Ashkenazi mind has been shaped by evolution for short-term profit in a foreign host society, so it will naturally produce these kinds of economic and political practices. Not in Israel, of course, where they would imperil their own kind. Their own nation is governed by the opposite standard of what they propose in the West -- fencing off the border, deporting African immigrants, restricting citizenship for ethnic in-group members, and so on.

Gentile minds are also capable of such behavior, but it is not as widespread or as callous and reckless, given that their own society will be affected.

The final solution is to not allow these kinds of practices at all -- sorry, Americans don't need cruddy Haitian blood, regardless of who is procuring and peddling it. But discussing shady business practices that can trigger such destructive outcomes as an AIDS epidemic requires being able to talk about how Jewish operators operate, compared to Gentiles, given their shallower attachment to America, and their outsized role in elite affairs.

February 19, 2015

Part 2 of the micro-geography of diversity: Detail on density and mixture

Intro post here. Full set of map images here.

Let's look into how the first two spatial factors of ethnic diversity affect the local civic society. The examples will come from the black-white eastern half of the country, since there is a longer history to judge from, compared to the Mexican-(black)-white western half.

1) Density

Sparse is better, for whites too, but especially for blacks and Mexicans. Any social phenomenon that requires two or more people teaming up is proportional to the density of the individual members (the "law of mass action"). If there are 10 blacks in a neighborhood, who are spread out one to a block, they will have greater difficulty gathering together to play a game of basketball than if all 10 of them were packed densely into the same block. Substitute playing basketball with forming a gang, roving in packs, loitering while drunk / high, or other blessing of diversity.

Contrast Houston with Chicago. In both cities, roughly 1/3 of the population is white, black, and Mexican. However, the blacks and Mexicans are both densely packed into their territories in Chicago, vs. being thinly spread out in Houston. This shows up as the dots looking darker when clustered, and lighter when spread out. Chicago has a far worse reputation for being a ghetto mess than Houston, so sparse settlement wins.

(In all these maps, whites are red dots, blacks are blue, Hispanics are orange, Asians are green, and Others are yellow.)

Houston

Chicago

2) Mixture

Segregated is better. This follows from Putnam's research showing that diversity erodes trust within a neighborhood.

Contrast Atlanta with Virginia Beach. In Atlanta, blacks are over 50%, but it's highly segregated, with blacks and whites only overlapping a lot in the (sparse) northwest part. Most whites are fairly insulated in the northeast part, while blacks remain mostly in the southern part. In Virginia Beach, blacks are only 20%, which on its own ought to make for a happier white population than in Atlanta. However, black and white areas are mixed to varying degrees all around the city, with no clear "our side" and "their side".

Atlanta

Virginia Beach

Virginia's coastal Tidewater cities in general (Richmond, Newport News, etc.) are more mixed than segregated, leaving whites nowhere to run to. Virginians (i.e., excluding DC metro transplants) are more likely to complain about black behavior than Atlantans, because they have to see it everywhere they go. White Atlantans have no illusions about the black areas being anything but ghetto / hood culture, however they're more likely to just laugh it off and get back to their happy-go-lucky segregated living. Out of sight, out of mind.

Because Putnam found that trust is eroded even among members of the same race when there are high levels of diversity, we should expect whites in Tidewater Virginia to be less engaged in civic life with their fellow whites, compared to whites in Atlanta.

One way that shows up is the lower level of religious attendance in Virginia than in Atlanta, Birmingham, and other typically segregated parts in the South. Virginia lies just outside of the Bible Belt. See the map below for rates of weekly / near weekly church attendance (from a Gallup survey in 2009).


Religious attendance is not just a Southern Baptist redneck thing, since the largely Nordic and segregated population of the Dakotas and western Minnesota are frequent participants in the Lutheran Church, as are their Puritanical Yankee Mormon counterparts in Utah.

And of course blacks also attend church more frequently in the segregated Deep South compared to mixed Virginia. Promoting "separate but equal" living zones is not only good for white civic society, but also for black civic engagement, such as it is. While black civic participation may not be up to white levels in the South, it is downright abysmal farther north along the Mid-Atlantic, as well as out in the Midwest.

During the '60s, blacks in the South co-ordinated a series of non-violent protests, largely organized through churches, in which Virginia did not play much of a role. There was the Montgomery bus boycott, the Birmingham campaign (where blacks took the dousing by firehoses), the Little Rock Nine integrating Central High School, and so on. When Rosa Parks declined to give up her seat, she didn't jump up into a boxing stance, pulling hair and shouting "Worl' stah!"

In contrast, blacks in the more mixed areas of the East Coast and Midwest broke out into full riot-and-loot mode. Unlike the church-organized plans of blacks in the South, an atmosphere of anarchy and mob rule prevailed among the race rioters in the desegregated parts of the country.

As far as I know, no one has applied Putnam's "corrosive diversity" findings to failed civic institutions among blacks, let alone in the past, to show that neighborhood-level diversity is bad for them too. I haven't dug into the literature here, just stating an impression from what I (haven't) heard so far. The lessons of history teach us that it's better for all groups in a diverse society to keep apart from each other rather than overlap in territory.

The micro-geography of diversity: Density, mixture, and enclosure

In trying to figure out the best way to deal with America's too-high levels of diversity, we have trouble communicating to one another because we tend to only know our own neck of the woods, and maybe a handful of other areas.

That allows us to talk about blacks (or Mexicans) in terms of their sheer numbers as well as their share of the population, since those factors are not tied down to the physical space of my town, your town, or his town.

The upshot is a no-brainer, bearing in mind Robert Putnam's research on the corrosive effects of ethnic diversity on trust: greater percentages of blacks and Mexicans threaten homogeneity, sowing the seeds of distrust, bringing the place closer to being a shithole. The practical implications are also obvious -- keep their numbers and percentages down -- but harder to do much about here and now, such as preventing immigration and transplanting.

There are a host of other factors, though, that are more malleable in the short-to-medium term, and less likely to sound the alarm bells of "that's racist!" at a national level, where the federal government might get involved. These are spatial factors of residence, which only folks in the area would pick up on.

They also show a striking level of variety around the country. We should study these spatial patterns and see how they're linked to patterns in the things we care about, like crime rates, trust, civic participation, racial tension and riots, and so on. After all of these natural experiments, we can move away from the failed models in the future and try to re-shape them where they have already been carried out.

What are these spatial factors? From the maps I've looked at, there appear to be three:

1) Density. How densely packed or sparsely spread-out are people within a fixed boundary (neighborhood, "part" of a city, entire city, metro area).

2) Mixture. How much do the two (or more) groups overlap in territorial boundaries? The two could be separated, the smaller group's territory could be mixed within the larger group's territory, or two groups of roughly equal size could occupy the same territory.

3) Enclosure. In cases where the two groups are fairly separated, what is the shape of their boundary? I idealize this as four degrees of enclosure: 0-degree, where the boundary is a straight line; 1-degree, where the line is bent or curved at one point, forming an open alligator mouth; 2-degree, where the line is bent at two points, forming a U shape; and 3-degree, where one group is surrounded by the other.

To see how these spatial patterns vary across the country, check out this series of maps by Eric Fischer, drawn from 2010 Census data. (See this related series, using 2000 data. The list of cities is mostly the same, although not entirely, so check the other list if you don't see a city you're interested in.) Each dot shows 25 people of a given race: whites are red, blacks are blue, Hispanics are orange, Asians are green, and Others are yellow. There are maps for over 100 cities large and small across the country.

The problem in studying them is that it is easy to tell how each city scores on the three spatial variables, but requires digging through other data sources to link them to a crime rate, level of racial tension, history of race riots, and the like. We're mostly going by our impressions from what we've heard about other places -- or whether we haven't heard much about them (no news is good news).

A research project could code each city for all the relevant variables, draw any associations, and give us the upshot. That'll take awhile, and others may be more interested in this than I am, so I'll just give some overall impressions of what spatial patterns are better or worse for civic society.

After this intro post, two follow-up posts will go into greater detail.

February 15, 2015

Weakened by diversity, succumbing to the gay marriage epidemic: A contrast between Alabama and Ohio

A federal judge in Alabama has struck down the state's ban on gay marriage, and by now a majority of the state's counties are complying.

I definitely would not have predicted Alabama to be the first domino to fall in the Deep South, given that Atlanta is one of the gayest cities on Earth. The waves of laissez-faire transplants flooding Atlanta also make it more deviance-friendly.

First I figured the federal judge must be some blue-state transplant to a college town, appointed by Clinton or Obama. But nope: she's Southern born and raised, went to college in the South, holds court in Mobile (a Gulf Coast redneck town), and was appointed by Bush Jr. Her grandfather, Richard Rives, was a native Alabamian lawyer who furthered the Civil Rights movement during his tenure as a federal appellate judge.

You'd think if she were a lone voice, there would be greater resistance. But the counties caved in pretty quickly. Especially since the Chief Justice of the state Supreme Court said that the go-ahead for gay marriage was unconstitutional, and told his state's judges not to comply. That would have been a halfway decent source of plausible deniability -- just following orders from the state Chief Justice. Most local judges must not have felt strong opposition to gay marriage to begin with. "Finally, the chance to over-ride those braindead voters!"

I looked over the timing of when each county began to comply, and it looked like the area around Montgomery in the south caved quicker, and that there was somewhat of a cluster of hold-outs in the northeastern part of the state, although it wasn't a very strong pattern. The northeast part of the state is the most heavily Appalachian and the whitest, while the southern and western parts are more Deep South / Plantation, with a more balanced mix of blacks and whites, or majority black.

Perhaps we're seeing the trouble with otherwise conservative folks defending their values if they don't make up such a solid majority of the regional population. Most folks who've never visited the Deep South think it's just full of white rednecks who outnumber a tiny handful of blacks. (This false impression comes from viewing the entire South as a never-ending lynch mob.) But the Plantation region in the southeastern U.S. has been substantially or majority black for centuries, and whites have only ever been a dominant force in the hilly and mountainous parts at the southern end of Appalachia, like Birmingham and Atlanta.

Putnam's research on the sowing of distrust within ethnically diverse cities suggests that a similar pattern holds at the level of states and regions. The Plantation South is too deeply divided racially to withstand attempts to screw over the ordinary whites for the benefit of some other group -- blacks, queers, whoever.

Too-high levels of diversity make it hard for whites to stick up for themselves at a higher level than the personal. An individual white man in Alabama can stare down or hurl curse words at blacks. But if many whites try to organize to protect their own interests -- "Wow, seriously? Are we, like, back in the KKK era again?"

It doesn't matter which piece of their culture they're trying to protect -- prayer in school, a statue of the Ten Commandments in the state Supreme Court building, a ban on gay marriage, whatever. Those things are all highly distinctive of white culture, so any attempt to protect one is seen as a covert attempt to promote their ethnic group over the other. They can't win.

Farther north in West Virginia, those same cultural elements would still be found among whites, but since there are no blacks around, they are not seen as distinctly white. So, protecting them up there is not seen as an attempt to promote one's own ethnic group against The Other.

I don't think federal pressure plays much of a role. White Alabamians don't give a damn what the federal gubmint thinks of them, nor do they care about the opinions of the Puritanical Yankees, corn farmers, or surfer fags. It must be a more regional concern, like a civil war could erupt within Alabama or the Deep South itself, largely on racial lines as it did during the Civil Rights era.

West Virginians don't have to worry about an ethnic civil war, since up there it's a matter of whether you're Scotch-Irish, Welsh, Border English, or German. Even "big" cities nearby like Wheeling or Pittsburgh would expose you to a more exotic mix of Hunkies, Polacks, Dagoes, and Wops, but that's about it. Pittsburgh, a city of 300,000, is 65% non-Hispanic white and just 25% black.

Now, this is not to say that The Other side in Alabama is a big fan of gay marriage. Obviously the blacks are against it too. You'd think both races could come together and stand firm against the homo agenda.

But that ignores the climate of distrust, both within groups and between groups, that stems from a highly diverse population. This vacuum of willpower, organization, and evangelism will be filled by elite, organized outsiders (and some elite natives) pushing their own special interests.

Black and white working-class men couldn't unite for labor rights, so that collective bargaining has been abysmal, thwarted by Big Business interests. Blacks and whites worry about who would benefit more from a decent provision of government services, so that basic welfare, infrastructure, and education have taken a back seat to elite groups, who are only too happy to pocket that tax money for themselves and their cronies rather than spread the wealth around. And now we see blacks and whites failing to unite against gay marriage, so that a federal judge expressing a minority opinion can single-handedly expose the whole state to the epidemic.

More and more, I'm coming to believe that Appalachia and its neighbors are the only hope for a cohesive white culture that is even halfway conservative. Recall from this earlier post that the only federal judge to give the thumbs-up to state bans on gay marriage came from Cincinnati, benefiting the citizens of Ohio, Michigan, Kentucky, and Tennessee.

New England and the Upper Midwest are still highly white, which allows them to achieve their goals easily -- but those happen to be liberal SWPL goals. No one will cry racism if some high school in Vermont wants to fund a ski team ("blacks don't ski! that's only for white folks!"). And the northern parts of the Mountain states are still pretty white, but they have more of a laissez-faire morality stemming from their anything-goes Frontier roots.

That leaves the region that cuts across many state borders, but stretches mostly from a northern pole around Pittsburgh through a southern pole around Atlanta or Birmingham (Birmingham is not nicknamed "the Pittsburgh of the South" for nothing). The jury's still out on whether Atlanta will be saved on Judgment Day, since it's plunged headlong into the "do whatever" morality in order to grow as big as possible, as fast as possible.

In the interesting times ahead, we ought to keep in mind how individual desires may not percolate up into collective action if the region is too ethnically diverse. Remember Putnam's most troubling discovery: on top of making people distrust outside ethnic groups, a diverse area also makes them distrust members of their own group. Without that basic level of trust, don't expect their shared individual desires to take collective form.

February 13, 2015

Mormon paganism: Heavenly Mother, co-creator of spirit children

A key feature of pagan creation myths is that in the heavenly realm, reproduction takes a similar form to mundane reproduction, through the union of two sexually differentiated deities.

In Greek mythology, the original male sky god (Uranus) mates with the original female earth goddess (Gaia) and produces the generation of gods known as the Titans. A brother and sister among the Titans, Cronos and Rhea, mate and produce the first six of the Olympian gods. A brother and sister among these six Olympians, Zeus and Hera, mate and produce further Olympian gods.

With the shift toward transcendent monotheism during the Axial Age, no role was left for goddesses, since there was no longer a pantheon of gods to be created, perhaps over several generations. The sole god (or two, in a dualistic religion like Zoroastrianism) had always existed and will always exist. He needs no story about who his mother and father gods were.

Like Christianity, Mormonism posits a variety of not-godly yet not-mortal beings, such as angels and demons. They are all of the same genus -- Heavenly Father's spirit children who have not yet been born on Earth. Reminder: Mormonism teaches that the soul of each mortal person came from a spirit being in the pre-mortal stage of existence.

Lucifer is one of these spirit children, although he was denied birth on Earth because of his rebellion. So is the pre-mortal spirit form of Jesus Christ, who before being born is called Jehovah in Mormonism (his, and our, Heavenly Father is called Elohim). Spirit children also include the demons -- those pre-mortal spirits who sided with Lucifer against Jehovah (pre-mortal Jesus) during the War in Heaven -- and angels such as the archangel Michael.

Unlike Christianity, Mormonism does not hold this cast of spirit characters to be the production of the sole creator god. Instead, they all came from a physical marital union between the god Heavenly Father and his wife, the goddess Heavenly Mother. Reminder: gods in Mormonism have physical, flesh-and-bone bodies that are human in appearance, although in glorified form that is not subject to sinfulness, decay, death, and so on.

Given the emphasis that Mormonism places on the corporeal similarities between mortals and gods, the natural conclusion is that the act which created our world's spirit children resembled the act by which any mortal father and mother create mortal children. Yep, bumping their glorified uglies is how Heavenly Father and Heavenly Mother created our pre-mortal spirit forms.

This gives Mormon women quite a bit more to strive for than in other religions. Where else can they become a creator goddess in their own right? Recall that in Mormonism our ultimate goal is to reach the highest degree of "exaltation" and become creator gods ourselves, producing a race of spirit children, shaping their world, receiving their worship, and so on.

Men can expect to play the role of Heavenly Father, overseeing the shaping of the world and the creation of life and mankind. Women cannot expect to play those roles, but women are bored anyway by creating worlds out of Lego blocks, playing Sim City, and looking after an ant farm. Women can look forward to a role that truly matters to them: giving birth to the spirit children of their world, and nurturing them during their child-like pre-mortal spirit existence -- and without having to change diapers or breastfeed in the middle of the night!

Plus, women will be married to the head honcho god of their world -- not too shabby. Even better, that head honcho won't be just any old god, as though it were a Cinderella story. In fact, their godly husband will be the exalted form of the man to whom they were married during their mortal lives, continuing their Earthly marriage in exalted form for eternity.*

Because of the blatantly pagan nature of their Heavenly Mother goddess, Mormons try not to draw too much attention to her, as that might raise suspicion among outsiders. They are instructed by their leaders not to pray to her or worship her, and to mention her as little as possible in church meetings and in lay discussions. And being inveterate rule-followers, they go along with it.

That's not to say that she receives no worship, however. The lyrics to an early Mormon hymn, "O My Father" (written by a woman), summarize their beliefs in the "eternal progression" from spirits to mortals to gods, organized around the theme of birth to loving parents, growing up away from them, and returning home to them in maturity.

In two separate verses it mentions Heavenly Mother in addition to Heavenly Father, making an argument from common sense that our creator god must have a creator goddess as his wife (a prime example of the Mormon aversion to mystery):

In the heav'ns are parents single?
No, the thought makes reason stare!
Truth is reason; truth eternal
Tells me I've a mother there.

When I leave this frail existence,
When I lay this mortal by,
Father, Mother, may I meet you
In your royal courts on high?

Because of the distinctly non-Christian nature of their theology and theogony (account of the creation of gods), Mormons try not to draw too much attention by setting it to music and broadcasting the message to outsiders. To minimize suspicion, the world-renowned Mormon Tabernacle Choir mostly sings and records Christian standards, especially for Christmas. To end, here is a rare exception: the choir singing "O My Father":



* The highest degree of exaltation requires that you be married and "sealed" to each other in a Mormon temple ceremony. This goes beyond a normal wedding, and is more of an initiation ritual -- only this rite of passage initiates a couple rather than an individual.

February 10, 2015

Returning epidemics driven by geographic inter-connectedness (study)

Now that there's another measles outbreak in the news, and an apparent epidemic of whooping cough in California, it's worth returning to this earlier post on the rise and fall -- and rise again -- of epidemic diseases. They seem to track the inequality cycle, with rising inequality going along with rising prevalence of epidemic diseases.

Public health measures like vaccination programs and educating people in better hygiene seem to have little to do with the changes up and down over time. For many of the major epidemic diseases, you can't really see the introduction and adoption of the vaccine in the historical pattern of the disease. In many cases, a precipitous decline was already under way when the vaccine was introduced, and the vaccine didn't appear to accelerate that decline.

That's not to say that the vaccine doesn't help an individual avoid the disease, or that a higher vaccination rate wouldn't contribute to "herd immunity" and help the entire group avoid epidemics of the disease. It's just to say that those effects are clearly swamped by something else at the macro level -- something related to the status-striving and inequality cycle.

Earlier, the main factor that I pointed to was immigration, which follows the inequality cycle. Immigrants are strivers themselves, leaving behind their country in order to make more money in America. And employers become more cost-cutting in striving times, hence more anxious to import cheap foreign labor. When competitive striving begins to decline, immigration gets cut off, as during the 1920s, after a peak in competitiveness circa WWI. Immigration stayed low throughout the Great Compression, and only became turbo-charged during the 1980s, with the return of competitive striving.

I didn't see it earlier, but a similar factor is transplant-ism within a large country. That's a form of immigration at a smaller scale, across states within a nation. If you look at the proportion of a state's residents that were born in some other state, it too follows the striving and inequality cycle. Transplants became more and more common during the Gilded Age and early 20th century, then folks settled down during the Great Compression. With the return of striving, they have decided to head off for greener pastures and revived the footloose, get-rich-quick transience of the Gold Rush era.

These two levels of greater demographic inter-connectedness are far more important for the spread of epidemic diseases because they increase the effective population size of the group of individuals who could spread the disease to one another. Small, sparse populations like the Bushmen hunter-gatherers of the Kalahari Desert are not subject to endemic person-to-person diseases like measles, smallpox, and so on. Large, dense populations like ancient and modern Egypt are.

In epidemiology, the "critical community size" is the number of people needed for a disease to become a regular epidemic. If the population is too small, isolated cases or even outbreaks may occur, but not self-sustaining epidemics, let alone ones that recur time after time. The critical size differs across diseases because some diseases last longer in the sick person, and some diseases are more easily transmitted, both factors allowing a smaller population to sustain an epidemic.

The critical size has to be estimated from real-world data, and it is a clear threshold below which the disease won't become a regular epidemic. It's not that vaguely defined "small" populations are the ones that will be more disease-free. They just have to be smaller than the critical size, as estimated from empirical data.

As anyone who's read Plagues and Peoples will remember, increasing global connectedness has historically brought with it the spread of ravaging epidemics. That's because the global population changes from a high number of small closed groups to a small number of large webs. Epidemics are far more likely to propagate through a large web than within a small enclave.

In fact, a research team has shown the effect of inter-connectedness among an international "meta-population" on the persistence of epidemic diseases, looking at how island nations fare compared to mainland nations. Free full article here (if you are numerate, you can read it without any background in epidemiology or ecology).

Controlling for other relevant factors, islands are about twice as likely to see the extinction of an epidemic disease. Quarantine the whole outside world, and you've got it made in the shade.

Vaccination increased the chance of eradicating a disease, but vaccination rates haven't shifted radically over the past several decades, and are unlikely to explain the recent return of epidemic diseases. Transplant-ism, immigration, and other forms of demographic connectedness have shot through the roof, however, and are a much more plausible cause of whooping cough coming back from the dead.

Their study also looks at the role of what are euphemistically called "rescue" effects -- the disease is "rescued" from extinction by being introduced from an outside source where it flourishes. Inter-connectedness brings epidemics back from the dead. When relatively isolated nations come close to eradicating a disease, such rescue effects will play a larger role in the disease's persistence in those nations.

The proportion of the population who are migrants also has a strong negative effect on eradicating a disease. So, just being an island may not be a silver bullet -- if you happen to be globally connected. Singapore will fare worse than Vanuatu.

Their unstated conclusion (stating it would get them fired for racism): if a developed country has more or less fixed an epidemic disease, it shouldn't be allowing much demographic contact with outside nations -- even those where the disease isn't that common (that would still boost the effective population size, perhaps by an order of magnitude). But especially so when those nations have the disease running rampant.

Corollary: if a region within a nation has extinguished a disease, they shouldn't allow much demographic contact with other regions. Again, even if each region is relatively disease-free, joining them all into one great big swirl would boost their effective population size by an order of magnitude or more. But especially when one of those regions is more disease-ridden than another.

Stating that corollary would not only piss off the liberals, for whom freedom of movement is paramount; it would also unnerve those conservatives who are on board with limiting immigration. You mean we aren't even supposed to allow internal geographic churning either? Well then, how am I supposed to leave my flyover region and make shitloads of money in Silicon Valley or Wall Street?

A false model unsettles one faction of the powerful; the true model disturbs them all.

We ought to bear these lessons in mind when we look at vaccination rates, and ask, "Is that rate defined all the way up to the relevant level?" California's state-wide vaccination rates mean little when the state is part of a much greater effective population whose web of disease includes Central American and Pacific Rim populations.

And looking just within the nation, Utah's high vaccination rates will be effectively diluted by their demographic connectedness with Colorado next door, where vaccination rates are among the lowest in America.

By the same token, a tiny handful of Jenny McCarthy wannabes in the upper class will not be responsible for a return of whooping cough. The study estimates the critical size of unvaccinated births at around 50,000 for whooping cough to go epidemic, and perhaps 5 million unvaccinated in the population as a whole. Upper-class fads cannot introduce that many unvaccinated people into the population that quickly. Immigration, transplant-ism, and frequent travel (between and within nations) can do so easily. (The corresponding estimates for measles are 500,000 unvaccinated births or around 15 million unvaccinated total population.)
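The threshold logic above can be sketched in a few lines of code. This is only an illustrative back-of-the-envelope check using the rough figures quoted from the study; the function and variable names are my own, not anything from the paper.

```python
# Illustrative sketch: does an unvaccinated pool clear the critical sizes
# quoted above? Figures are the rough estimates cited in the text
# (Metcalf et al. 2013); names here are made up for illustration.

CRITICAL_SIZE = {
    # disease: (unvaccinated births, total unvaccinated population)
    "whooping cough": (50_000, 5_000_000),
    "measles": (500_000, 15_000_000),
}

def can_sustain_epidemic(disease, unvaccinated_births, unvaccinated_total):
    """True only if BOTH pools meet or exceed the quoted critical sizes."""
    births_needed, total_needed = CRITICAL_SIZE[disease]
    return (unvaccinated_births >= births_needed
            and unvaccinated_total >= total_needed)

# A small upper-class anti-vax fad falls well short of the threshold...
print(can_sustain_epidemic("whooping cough", 10_000, 800_000))    # False
# ...but a large, well-mixed effective population clears it.
print(can_sustain_epidemic("whooping cough", 60_000, 6_000_000))  # True
```

The point of the sketch is that the threshold applies to the whole inter-connected effective population, not to any one school, neighborhood, or even state considered in isolation.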

It would be a great help in the fight to eradicate disease to shift the discussion away from the micro-level of vaccinating individuals, and even from the semi-micro-level of herd immunity within a school or neighborhood. Our communities are unfortunately far more broadly connected webs than that. The focus ought to be on the macro-level processes that have re-introduced once-rare diseases, and on the macro-level solutions that could send them back to the grave -- namely, sealing off our communities into smaller enclaves.

Not to the extent of everyone hunkering down in a private nuclear fallout shelter, but small enough in population that each enclave falls below the threshold for the major epidemic diseases. We could still communicate at larger scales if we wanted to, or ship goods across enclaves. But sustained demographic inter-mixing ought to be kept low. End immigration, and curtail transplant-ism as much as possible.

Citation: Metcalf CJE, Hampson K, Tatem AJ, Grenfell BT, Bjørnstad ON (2013). Persistence in Epidemic Metapopulations: Quantifying the Rescue Effects for Measles, Mumps, Rubella and Whooping Cough. PLoS ONE 8(9):e74696. doi:10.1371/journal.pone.0074696

February 9, 2015

Have Mormons been bred for gullibility?

When the Mormons left behind the Midwest and set off for the Rocky Mountains in the middle of the 19th century, they were not a random sample of Americans. Neither were other groups of frontier settlers, like the type who could make it in the Wild West. Today, the offspring of the Wild West settlers are among the most criminal whites, with Arizona leading the nation in the incarceration rate of the white population.

Mormons, though, were not selected for rugged individualism and a willingness to solve problems with violence. What traits did set them far apart from the average of their day? Whatever would incline a person to put so much blind faith in a New Age cult ("new religious movement") that they would trek out toward the inhospitable and untamed frontier because their leaders said that's where the cult needed to go for the insulation to practice plural marriage. Too bad if it meant leaving families and communities behind.

Sounds legit.

They would also have had to believe that their latter-day prophet had translated an ancient religious book written in Egyptian on golden plates (the Book of Mormon). And that the Garden of Eden of the Old Testament was really located in the American state of Missouri, and that the holy Israelite temple of Zion in the Old Testament was also to be restored in America.

A lazy atheist response would be that all followers of religion are gullible sheeple -- how are Mormons any worse?

In fact, neither of the two major world religions today -- Christianity and Islam -- owed their initial growth to their founding prophet claiming to have discovered and properly interpreted a physical object, one that no one else saw, and using a translation method that no one else could verify. That should have alerted any normal person at the time to the strong possibility that Joseph Smith was a con man.

Jesus didn't claim to have discovered a long-lost tablet that would give the Israelites the true fullness of Mosaic laws. He preached his own take on existing laws, and listeners were either persuaded or not of his interpretation, moved or not moved by his exhortations. He wasn't trying to pull a fast one on his audience.

Mohammed's appeal was based on revelations that came to him from angels -- that's a supernatural claim that can't be verified by curious listeners, unlike the physical basis that Smith claimed for his prophecy. Either you believed Mohammed's claims about angelic revelation or you didn't, but it wasn't because he was employing flimflam techniques.

This suggests that present-day Mormons are the offspring of a population that was unusually suggestible and impressionable (to use more neutral terms than gullible), to be hooked in the first place. And to have stayed with the cult for so long in such inhospitable circumstances, they would also have had to be unusually eager to please their social superiors and to dread "making waves" against the collective.

Before turning to some hard data, let's take an impressionistic look at a Mormon woman explaining some of her religion's beliefs (video here). She's from southern California, but would blend right in among Mormon women of her generation in Utah. Overall impression: ditzy (not pejorative), free-spirited, anxious about fitting in, hopes that you like her by the end of her presentation.

That's way different from the stereotype that folks east of the Rocky Mountains have about stuffy, stern, sober Mormons. They were originally from Yankeedom, but they're more like a New Age splinter group of Puritans. New Englanders are infamous for being dour and crotchety, whereas Provo, Utah is "Happy Valley" -- even if used dismissively, the nickname still shows the peachy-keen Stepford Wives atmosphere that is more typical out West.

Getting more quantitative, we can look at the rates of con-man schemes, which the government keeps a close enough eye on for there to be national data. Google "Utah fraud per capita" and peruse the endless news reports of the Mormon capital being the most vulnerable to Ponzi schemes, "affinity fraud" (trust me, I'm one of you guys), and the like. These articles go back regularly to at least the 1990s.

Fraud per capita

Utah: Where the con is on

Utah County is hotbed of white-collar crimes

Affinity fraud: Fleecing the flock

Utah named a top Ponzi state -- again

And so on.

The scammers target LDS Church members preferentially, so it does seem to be related to Mormons specifically, and not their ethnically similar neighbors around the state. It doesn't seem to reduce to "Scandinavians are high-trust and vulnerable to con men". It's particular to followers of Mormonism.

And it doesn't generalize to all religious groups, as though the Catholics, Presbyterians, and Baptists were just as easily duped by con men.

That suggests that Mormons have been selected for suggestibility and eagerness to please.

We'll see those traits come up again when we look at their temple initiation rites, which have been a constant selection pressure for those traits, right up to the present day.

February 4, 2015

Mormon paganism: Gods that are created and mortal for a time, not transcendent

Perhaps one of the most striking features of Mormon theology, and certainly one of the most original among religions with a good following in the West, is that its major gods are created beings, not a god that has always existed in its fully godlike state.

First let's look at the Mormon view of creation, to see how its gods fit into the larger pattern of created beings. A person like you or me (and even a plant or animal) was once a spirit being created by Heavenly Father, dwelling with him in Heaven, unable to be harmed or die, but also not yet living in a physical body.

At some point, this spirit is born on our world in a physical body, subject to pain, decay, and death. When we die, the spirit separates from the body and returns to the spirit world, generally back to Heaven to re-join Heavenly Father if the person lived a righteous life.

Eventually, these post-mortal spirit beings will undergo a bodily resurrection, only now this eternal body will not be subject to pain, sinfulness, death, and so on. Depending on how righteously they lived as mortals, these resurrected bodies will occupy higher or lower levels of heavenly bliss. If they make it to the highest level, they will become gods themselves like their Heavenly Father, and thereby become capable of ruling over a world of their own, creating a population of spirit beings that will be born on their world, and receiving the worship of the mortals on their world.

Mormonism had been fairly clear about human beings who reached the highest level of Heaven becoming gods in this sense -- creator gods who would then oversee their own world. As it has sought more mainstream acceptance over the past 30 years, the idea has been downplayed but not denied. It is now framed as something that is possible, that we may speculate about and even hope for, but not necessarily something we can be totally certain about -- or at least not in front of the non-Mormon majority, who might find the whole idea a bit out-there.

The state of eternally residing with our Heavenly Father in bodily form -- both him and us -- is called "exaltation," and is the last stage of the "eternal progression" that began when we were pre-mortal spirits. A being in that state is called "exalted".

You may be thinking that a return to our creator for eternity may not sound too different from the Christian view of the afterlife. But here's what makes Mormonism unique: it holds that our Heavenly Father has gone through the eternal progression himself. He began as a pre-mortal spirit, lived as a righteous mortal, was resurrected in flesh and bone, and attained the highest level of Heaven. In that most exalted state, he created all the spirits that will ever be born on our world.

Heavenly Father in Mormonism is not a figurative role model but a literal one -- if he made it, we could too. Or in the pithy phrasing of Lorenzo Snow, who would rise to Church President at the turn of the 20th century:

As man now is, God once was:
As God now is, man may be.

In Mormon theology, Heavenly Father, his spirit children, and the mortal beings they become are all of the same genus, only at different stages along the eternal progression. Hence there is no unbridgeable metaphysical chasm between mankind and its creator. Indeed, we are the caterpillars, and he is the butterfly.

Just as with the forward-looking idea -- can human beings become gods and populate a spirit world of their own? -- the backward-looking idea -- is Heavenly Father an exalted man who was once mortal like us? -- was clearly portrayed through most of LDS history. With the recent attempt to boost its mainstream appeal, this idea is being downplayed but not denied. I conclude that the view is still mainstream within their theology, given how common it has always been, up to the highest levels of Church office, and given that today's leaders are not clarifying by saying "No, those earlier prophets were off the mark, and Heavenly Father did not go through his own eternal progression."

All right, but how does that merit a term like "pagan"? Well, call it whatever you want, but that is a crucial part of pagan mythology -- gods are created, or perhaps emerge from the cosmic background, so that there was a time when they did not exist, or at least when they did not exist as gods. They may in fact stop being gods, if their god-like status is stripped away from them by another god, or if they switch from divine to mortal form in order to interact with human beings, when they would be susceptible to bodily harm.

Unlike pagan gods, whose divine vs. mortal status was subject to flux, mankind's creator god in Mormonism cannot have his divinity stripped away, or assume mortal nature after becoming a god. The eternal progression only flows one way.

Nevertheless, the fact remains that Heavenly Father has not existed as an entity forever back into the past, let alone as a divine entity. In fact, he is of the same genus as mankind, only at a more advanced stage along the eternal progression, having reached the highest degree of exaltation.

This central feature of Mormon theology marks it as a throwback to a stage in the evolution of religion earlier than the Axial Age and its off-shoots, including Second Temple Israelite religion, Christianity, Islam, Zoroastrianism, Buddhism, Taoism, and Greek Golden Age philosophy.

That shift circa 500 BC, give or take a few centuries on either side, forever did away with the notion of gods as created beings who looked and behaved anthropomorphically. The pantheon was whittled down to at most two creator gods (Zoroastrianism), usually one (the Abrahamic religions, Xenophanes, Aristotle's Unmoved Mover), or perhaps to no god at all, only an impersonal guiding force (Buddhism, Taoism). And importantly, these creator gods were not themselves created or emergent from a primordial cosmic stew. They were creators at the very dawn of time, and have always been transcendent.

Add to this departure regarding Heavenly Father, the view that human beings may become their own creator gods of other worlds, and you have an even further deviation from the Axial Age tradition. No matter how the nature of God was perceived during that time, it was certainly not possible for a mortal man to become such a god, or for God to have existed in mortal form before attaining godhood. The religious revolution of the mid-1st millennium BC separated God and mankind into two states of nature that were impossible to cross between.

In 19th-century America, Mormonism restored the much earlier pagan view of our creator as just one example of a created being, one who had existed in earlier mortal form, and whose godlike nature it was possible for us mortals to attain. This view continued to be mainstream among LDS Church leaders right through their fastest period of recent growth, roughly 1960 to 1990. Only since then has it been toned down, without disappearing or being denied, let alone denounced as a heresy.

This post has only touched on one aspect of God's nature in Mormonism -- his theogony (an account of the genesis of gods). It's already been hinted that he is much more anthropomorphic than Axial Age gods, but that will be explored in another post.

Mormons as the Parsis, not the Jews, of America

Before I get too far into exploring the return toward the pagan worldview in Mormonism, it's worth providing another concrete example of a religious minority group who are likable people, upright citizens, capable builders and managers of institutions, and honest rather than parasitic in professional work -- and who are nevertheless from an earlier religious orientation than the transcendent monotheism of the world's two major religions, Christianity and Islam.

(It needs to be emphasized that the purpose of this series is not to smugly dismiss Mormonism just because it is in many ways more primitive than Christianity.)

The closest parallel to the Mormons, albeit not their identical twin, seems to be the Parsis of India. They are ethnic Persians who practice Zoroastrianism, a non-Abrahamic religion of ancient Persia.

These earlier posts here and here reviewed the role of the Parsis in their Indian host society, contrasting it sharply with that of the Ashkenazi Jews in the West (and the Han Chinese in Southeast Asia). The upshot of those posts was that there is nothing inherent in being a "market-dominant minority" that leads the host population toward distrust, hatred, and persecution of them. That only happens if the ethnic-cultural minority who specialize in middlemen activities are just looking out for themselves, either personally or as an ethnic group, and acting callously toward individuals in the host society, and toward the host culture as a whole.

If the market-dominant minority acts the opposite way -- giving away much of their wealth to charity, running charity hospitals, and managing industries in an honest and wholesome way -- they are welcomed and favored by the host society.

Thus have the Ashkenazi Jews and Han Chinese been loathed and persecuted wherever they have set up camp, while the Parsis are valued so highly in India that the government is willing to pay to boost their fertility rates -- the exact opposite policy of a pogrom.

A popular but lazy comparison (e.g. by Amy Chua) adds the Latter-day Saints to the list of persecuted market-dominant minorities. But Mormons hardly belong in the same camp as the Jews. They were only persecuted during their cult-like and polygamous initial stages, from the mid-19th century up through the early 20th century.

Since then they have been left alone, and in fact valued as more capable and productive citizens than the American mainstream. They are the fastest-growing religion in America, particularly out West, with no signs of persecution despite their exponentially increasing numbers. Their much higher-than-average birth rates would frighten a society that held them in suspicion, yet the average American doesn't seem to care that Mormons have large nuclear families. One of them was nearly voted President of the United States.

In short, viewing them as followers of an off-beat or weird religion with a shady past has not led the mainstream to loathe or persecute the Mormons. The acceptance and even fondness for their presence and growth seem all but certain for the future. They may be the butt of kindhearted jokes, but not the object of curses as the Jews are. The two groups could not be farther apart, deflating "market-dominant minorities" as an insightful model.

If both the Mormons and the Parsis are a kind of bizarro-Jewish or bizarro-Chinese group, that suggests a deeper affinity between their religions, which are the main component of their cultural identity. (They are only distantly related genetically.) I'll save a comparison of their religions for a follow-up, though, to keep these posts digestible.

February 3, 2015

The Mormon return to pagan culture: Introduction

Mormonism is the fastest growing religion in America, and one of the fastest growing in the world. Although still a small minority, its exponential growth means that in the coming decades and centuries, it will play a large role in American life. Explosive growth also reveals in which direction the mainstream is heading -- what appeals to them, and what does not.

In trying to distill what the Mormon explosion heralds for our society, the simplest way I can think of to put it is that Mormonism is a return to a pagan, pre-Christian worldview and orientation toward life. I don't mean that in a purely Romantic "noble savage" way, nor in a purely derogatory "godless heathen" way. It's "for better or worse".

It's not as though pre-Christian Europe was always and everywhere a den of iniquity.

Tacitus came away with a fairly sympathetic appraisal of Germanic moral codes and behavior, however backward he may have found their material culture. He himself came from a highly advanced civilization whose development did not depend on transcendent monotheism as found in the Israelite religion of the Second Temple period, or as in nascent Christianity. And the era in which he lived, the beginning of the reign of the Five Good Emperors, testified to the ability of Roman polytheism to support a society that enjoyed stability in its civic institutions and modesty in its everyday moral conduct.

It's crucial for moderns to remember, though, that both the Germanic tribes and the Roman Empire were thoroughly religious. When we think of alternatives to Christianity today, what comes to mind is either a lack of religious affiliation and participation, or outright atheism. So when I say that Mormons are returning to a pre-Christian pagan worldview and orientation, I don't mean that they have dropped out of religion, that religion doesn't play a major role in their lives, or that their religion is empty.

I'm not suggesting that they are returning to paganism in every major aspect of belief and practice. Using magic for divination, for example, doesn't seem to play much of a role in their religion today, although it did to some degree during its initial phase of revelation. Their return is more "on the whole".

I'm also not suggesting that they are consciously restoring or reviving earlier features of specific pagan religions, or of pagan religion in the abstract. Rather, their religion has drawn and will continue to draw people whose gut intuitions make them uncomfortable with transcendent monotheism, who resonate with a religion that is more down-to-earth, corporeal, and polytheistic. From there, Mormonism will naturally evolve in a pagan direction without anyone having paganism per se on their mind.

This will be an ongoing series of posts, some about doctrine and beliefs, others about ritual and practices; some about their collective ethnic identity, and others about their individual personalities. These ideas are still inchoate, so I don't want to press any of the claims too hard, and part of posting them here is to refine my hunches and get feedback.

No substantial claims have been made in this post because the series will otherwise seem to come too far out of left field. There needs to be a basic tone-setting post before any claims are introduced and explored. No outline of the claims has been given because, as I said, these ideas are still coming together, and an outline right now would only include a few examples (although striking ones).

So let the basic proposal sink in, and I'll be adding to this series off and on over the next week or so, perhaps longer depending on how much there is to be uncovered.

January 30, 2015

2014 in film: No change from existing trends

Not exactly the most attention-grabbing news ever reported, but it is worth keeping our eyes peeled for signs of change back toward the film-making culture that we all love from the '70s and '80s. The current trends have been going on for over two decades now, and will run out of steam sometime in the next 5 to 10 years. But so far, there are no observable signs of change away from the dullification of movies.

A quick check of the traits that characterize the top box office draws in 2014 shows a mindless continuation of four major existing trends:

1. Unoriginal storytelling (earlier post here with data since 1936). Whether the stories are adaptations of existing stories, or sequels to existing movies, none of the top 10 movies in America were original.

You might try to excuse The LEGO Movie, since it was not a sequel and did not adapt a clearly defined existing story. But you weren't going to see that for the narrative or character arcs. You went to see the Lego-style visual animation. Its visual style was entirely familiar to the audience and adapted from the toys, video games, and cartoons done in the Lego style, so I count that movie as an adaptation. Also the pandering title shows that audiences would only be drawn by instant brand recognition -- it doesn't hint at what the movie is about, or offer a mysterious title to pique our curiosity. It's just: "You think Legos look cool? Well here they are, in a movie!"

Broadening the view to include the top 20 movies doesn't help. There are only 2 original stories in the 11-20 spots -- Interstellar and Neighbors. Only 2 in the top 20, or 1 in 10 original stories. Pretty sad.

2. Disappearance of separate movie cultures for children, adolescents, and adults (earlier post here on the MPAA ratings of top movies since 1969). Almost everything is PG-13 or PG.

There were no G-rated movies in the top 20, in contrast to the late '70s when The Muppet Movie was the #10 movie and rated G. Last year's LEGO Movie, however, was rated PG because they had to cram in adult-ish stuff to entertain the parents as well as the kids, rather than let kids have their own autonomous culture.

At the other extreme, there was only 1 movie rated R in the top 10 for 2014, although there were 4 in the top 20, or a rate of 2 in 10. A far cry from the '70s and '80s when mature themes could be treated without parents throwing a fit because they didn't plan on bringing their kids to those R-rated movies. Today, just as children are not allowed to have their own culture, neither are adults.

Helicopter parenting demands popular culture that is "fun for the whole family" (AKA bore the whole family), because parents won't see movies on their own anymore. That would involve leaving the kids under someone else's watch for a few hours, and y'know how that's bound to end up -- finding them bound, raped, and murdered in a ditch on the drive back home.

3. Comedies are still rare (earlier post here on the popularity of the comedy genre since 1915). None of the top 10 movies were comedies, although 2 of the top 20 were, for a rate of 1 in 10. I don't count kiddie movies because that isn't comedy, but rather cutesy and clowny humor with the occasional yuk-yuk gag. Even counting 22 Jump Street and Neighbors is being generous, since those are just juvenile yuk-yuk movies, not ones where some comedic dynamic runs throughout the movie.

Since comedies are most popular in rising-crime times, they seem to fulfill the need for catharsis and resilience during such topsy-turvy times. In a world that is becoming safer and safer, folks aren't as likely to be in a state of physiological arousal, and don't have as much need for comedic relief in their lives.

Also worth noting that the two comedies for 2014 did not pair a light comedic tone with darker themes, as used to be the norm in the '80s. Back then, a comedy was always an action comedy, war comedy, horror comedy, or drama comedy. That pairing of light and dark themes emphasized the role of comedy as a relief from situations in life that would otherwise be depressing, frightening, or overwhelming.

4. Running times are still very long, especially considering how juvenile the subject matter is (earlier post here on running times since 1921). Using the top 10 or top 20 didn't matter. Average running time was 2 hrs 6 min, median was 2 hrs 7 min, minimum was about 1 hr 37 min, and max was about 2 hrs 45-50 min.

As with Midcentury cocooners, today's cocooners require something spectacular to get them out of their domestic fortresses, for it to really be "worth it". In cocooning times, people also seem to prefer drawn-out experiences rather than ones that pack a cathartic punch. In the Midcentury, serial dramas on the radio were more important than movies, just as serial dramas on TV these days are more important than movies. One mode is for folks who are generally bored during the day, the other for ones who already have other exciting stories to participate in, in real life.

So there you have it: if you sensed that movies in 2014 have been continuing the trend toward tediousness, you were right. I confess that I only saw two new movies last year, Interstellar and Transcendence, and it doesn't look like I missed much. I began tuning out of new releases during the second half of the '90s, and have rarely felt regret when I've caught up later on. There are simply too many movies from the good old days to feel deprived by choosing to see "new" movies from the past rather than from the present.

January 28, 2015

To restore humanizing architecture, end the transplant phenomenon

A comment that I left at this post on "How to Create a Beautiful City" over at Uncouth Reflections:

- - - - -

The main source of awful public spaces is sociological and demographic rather than technological or artistic — the transplant phenomenon.

When you are born in a place, live there your whole life, will raise any children you have in that place, and your ancestors stretch back into the past in that place, you feel a level of respect for its natural and built environment. They are not completely inviolable, but altering them willy-nilly is taboo.

It is part of you and you are part of it. You would no more alter its substance and appearance than you would your own — some cosmetic things here and there, maybe a knee replacement if your original one gets too banged up, but never anything major and frivolous like a sex change operation.

When a place draws most of its population from transplants, or people whose roots go back no further than a single generation, its features are not treated as sacred. They’re just neat things that earlier waves of transplants found it fit to build in their day, but which we might not find so neat in our day, and may very well have to erase and replace to suit the living rather than the dead (those two being alien to each other when transplant-ism is the norm).

That’s the basic weakness — not feeling that the natural and built environment are sacred. It lets you treat the whole city like one great big Lego bucket or dollhouse for playing around with, to dress it up in one artificial identity or another.

If you’re lucky, the prevailing fashions will give the city Art Deco rather than the International Style or the International Style: The Sequel. But trying to analyze the differences at the technical level, and propose policies that could steer architecture back toward good ol’ Art Deco, is missing the big picture — that the constant demographic churning makes it impossible to hold something in place. You are reduced to trying to argue for why Art Deco should make a comeback in the fashion cycle, why the neo-Mies look is like so tired by 2015.

That’s why rural towns tend not to be so afflicted by all the things that trad architecture folks decry. They are not being constantly swamped by wave after wave of transplants bringing their own outside ideas and inclinations about what would make for a totally awesome city, as though it were wet clay rather than a living organism.

And that’s why some cities show greater levels of affliction than other cities. As much as New York transplants may always be complaining about “there goes the neighborhood,” the city and its population are more deeply rooted than a place like Houston or Phoenix.

How do you keep the transplant invasion at bay? The trick is to not host the institutions that draw status-strivers — globally competitive industries (Wall Street, Hollywood), globally competitive cultural institutions (Harvard, Sundance Film Festival), and so on and so forth.

Trad architecture misses these larger points because most of the critics are striving transplants themselves. They want to have their competitive career and the wealth and creature comforts it affords, while preserving their adopted city’s traditional character. But the two are incompatible. You can choose one end of the trade-off spectrum or the other.

It would be best for the return of traditional, human-scale places to discourage the transplant phenomenon, to remind people that they’ll feel more connected to their place if they grew up there and haven’t seen it change radically. That creates a deeper and more enduring sense of belonging than shopping around for a city and tweaking its skin, as though you were purchasing a customized costume for a masquerade ball.

January 27, 2015

Mormon church officially endorses the gay agenda, with anti-retroviral proviso

In a desperate attempt to score moralistic status points against the not-so-Scandinavian parts of the country, the Mormon church has come out of the closet about their support for sweeping pro-homo legislation up to the national level. In typically naive Mormon fashion, they are holding to a "just the tip" strategy that stops short of fully inserting gay marriage into the body politic. But pretty much the whole rest of the diseased load is all cleared for "go".

Reporting from the Salt Lake Tribune here.
"We call on local, state and the federal government," Oaks said in a news release, "to serve all of their people by passing legislation that protects vital religious freedoms for individuals, families, churches and other faith groups while also protecting the rights of our LGBT citizens in such areas as housing, employment and public accommodation in hotels, restaurants and transportation — protections which are not available in many parts of the country."
The sleight-of-hand language about "protecting religious freedoms" is meant to suggest that religiously motivated bigots will somehow be magically protected against the BOO BIGOTS spirit of the laws. As if. Brigham Young University may be granted permission to keep gay orgies from taking place in their student dorms, but if your hotel isn't owned by the Church, then good luck keeping the drug-fueled poz parties off your property.

If your non-Mormon restaurant doesn't want to host and cater a gay civil union ceremony, TS. If you're a cabbie who doesn't feel like picking up a couple of coked up queers who are going to leave who knows what kinds of germs in your taxi, TS.
The push for gay rights was prompted by "centuries of ridicule, persecution and even violence against homosexuals," [Marriott] said. "Ultimately, most of society recognized that such treatment was simply wrong [my emphasis], and that such basic human rights as securing a place to live should not depend on a person's sexual orientation."
Got that, meanies? Ridicule leads to persecution leads to violence. Back on planet Earth, the gays dropped the AIDS bomb on themselves. Normal society played no role in the Swallow-caust.
As a matter of doctrine, the LDS Church does not support same-sex marriage, Marriott said. "But God is loving and merciful. His heart reaches out to all of his children equally, and he expects us to treat each other with love and fairness."
God doesn't want any of his wayward children to ever find The Way again, because judging them to be straying from The Way would be mean, which would contradict his unconditionally permissive love and mercy. This is just spineless "God as a helicopter parent" theology. Sadly in this case, beliefs have consequences, and religious folks will soon no longer be allowed to steer their destabilized communities back toward normality. Because God wants us all to love and not-judge each other, even as His straying flock tumbles over the cliffside.
Above all, the LDS leaders said, the debate about balancing religious and gay rights — often a polarizing predicament — should be civil and respectful.

"Nothing is achieved," Holland said, "if either side resorts to bullying, political point scoring or accusations of bigotry."
I wonder which "either side" will resort more to "accusations of bigotry" and "bullying" (i.e. mean language)? The other either-side is just supposed to keep its mouth shut, while getting shouted down for being "simply wrong".

So there you have it. Mormon morality amounts to little more than secular liberalism, stereotypically blinkered to any concerns other than harm and fairness. Notions of purity, sanctity, and taboo are not invoked, nor is the threat to communal cohesion when deviance is promoted.

"Harm" now includes anything that makes someone feel upset, not only physical harm. And "fairness" now applies in contexts where discrimination is necessary, e.g. when one group is fundamentally abnormal and the other group normal, not only where two normal groups are in a relationship of majority/minority, center/periphery, and so on (as in the black/white focus of the Civil Rights era).

You might try to find a silver lining in their not assuming a raging, antagonistic tone, but the passive-aggressive behavior of this Minnesota in the Mountains will only allow the gay enablers to steamroll right over their culture. Fire-breathing liberals at least serve to alert and galvanize normal people who may not have been paying much attention. Meek liberals aren't going to trip off the alarm system so easily.

You might also try to play down Mormon liberalism by pointing to the generally Progressive nature of mainline Protestant denominations. But unlike those liberal churches, there is no conservative counter-balancing Mormon church, let alone one that swamps the liberal one in numbers and influence. It is more like the Catholic church, only from a dystopian world where it has become terminally corrupted by the homosexual contagion.

Recall that Salt Lake City is the gayest city per capita in the nation, whether or not the patrons of its elaborate gay culture would be officially "out" on a demographic survey. Recall that Mormon fat chicks feel no shame in sham-marrying Mormon faggots and bearing their children, as well as bald-facedly declaring that their gay husband is not gay. And recall that unlike Mormon country, where the ban on gay civil unions was struck down by a federal judge in Utah, such a ban was upheld by a federal judge in Ohio.

We should not expect much more from the Church of the Frontier, or the Church of the West, which historically attracts transient status-strivers. Where else could be the natural Zion for a Gilded Age cult that transplanted its way from New England all the way to the rootless Rockies, after getting driven out of one cohesive town after another in the East, the Midwest, and the Plains?

As degenerate as East Coast Catholics may think their church is becoming, at least its population is still hot-blooded enough, owing to the Irish and Italians, to not just lay down and wait their turn for AIDS-raping by the gay enabling movement. They need only look to the Episcopalians and the Mormons to see how much worse things could be, if their stock were drawn from spineless Saxons and dickless Scandinavians.

January 26, 2015

Bratty toddler anthem tops the adult contemporary charts, signaling generational change

If any of the places that you frequent plays an adult contemporary radio station over the PA, you are still hearing that annoying Taylor Swift song "Shake It Off," currently #1 on the AC charts. Like an earlier AC #1, "Roar" by Katy Perry, it's basically an anthem for bratty Millennials who don't want to ever change themselves in the slightest to fit in better with their social environment -- to have to adapt.

Anyone who doesn't like you 100% the way you are, and tries to re-shape you so that your behavior will be more pleasing to others, is just a hater. Glib dismissal is the Millennials' ideal response to haters, so that they never take any criticism to heart, however small and however accurate. No adaptation, no growth. Perpetual toddlers.

Childish emoting can also be heard in the AC #1's "Roar," "Stay with Me," "Home," etc etc etc.

When did the adult charts become so kiddie?

Going back 10 years, most of the year saw two songs at #1. "Breakaway" by Kelly Clarkson and "Lonely No More" by Rob Thomas are nothing to write home about musically, although in tone they're merely adolescent or young-adult rather than twee or bratty.

Another 10 years back, and the songs are adult for the most part, again whether you dig the music or not (probably not). "Take a Bow" by Madonna assumes an audience familiar with the give-and-take in adult relationships, beyond bratty tantrums or adolescent infatuation. "Kiss from a Rose" by Seal is a little more adolescent, based on the theme of love as a drug. "I'll Be There for You" by the Rembrandts is more young-adult than fully mature. It assumes a social network, but the challenges that the speaker faces are just bad days, nothing really deep.

Then we arrive safely back in good ol' 1985, where none of the adult contemporary #1's sound kiddie or even adolescent. "Careless Whisper," "Smooth Operator," "Everytime You Go Away," and "Saving All My Love for You" all come from a mature stage of social development, with all its trials and complications. Even the upbeat dance hits are made for grown-ups -- "Rhythm of the Night" and "Axel F". There's also the soft rock ballad "You're the Inspiration" by Chicago, which however super-cheesy and grating it is, nevertheless is made by and for adults.

What does this change reveal about generational differences? The target audience for adult contemporary is 25 to 44 years old.

So it was the Boomers and late Silents who drove the success of truly adult hits on the chart back in the '80s. The late Boomers and early X-ers, as suggested by their AC tastes in '95, were mostly OK with growing up, though still preferring periodic indulgence in the adolescent or young-adult mindset. Early X-er preferences show up again in the 2005 hits, although the introduction of late X-ers into the target audience has made the hits more adolescent in focus. By 2015, bringing Millennials into the core audience has made them downright infantile.

The main factor here seems to be the level of cocooning or connection that the generations enjoyed while growing up. Social connection causes personal change, in a pro-social direction, i.e. growth or maturity. Folks who grew up entirely within the outgoing / rising-crime period of roughly 1960 to 1990 are the most comfortable with adult life. That would be the late Boomers and the earliest X-ers.

Once the cocooning climate began to set in circa 1990, growth after that point would not be as strong as if it had taken place during the '60s, '70s, or '80s. Someone born around 1970 would have the tail end of their formative years stunted by cocooning, but the end result was not too severe -- it was only the tail end of the developmental window, and the degree of cocooning wasn't so high at the very beginning of the shift.

The generation born around 1980 would be affected more during their young-adult years than adolescence. These people seem more inclined toward a teenage mindset. By the time you get to those born around 1990, they only grew up during the cocooning / helicopter parenting period, and have hardly matured at all, not even to the adolescent stage of wanting social connections, being willing to engage in the give-and-take, and honing their people-reading abilities. Again, you can hardly expect a different outcome from people who were socially deprived for their entire formative years.

You can go further back and see the cheesier, schmaltzier AC hits of the '60s, when they were catering to the Silent Gen, who like the Millennials grew up socially cut-off during the previous peak of "smothering mothers" and Dr. Spock-inspired shielding from threats to the ego. Competitiveness was at a minimum back then, though, so their kiddie sounding AC hits were not bratty, but twee and saccharine. Case in point: the mawkish "Can't Help Falling in Love" by Elvis way back in '62, which sounded much more emotionally adjusted and pulled-together when covered by UB40 in '93.