March 28, 2020

Restlessness under quarantine shows excitement cycle has left refractory phase, entered warm-up phase

From what I can tell -- and it's only online, but from normies on Facebook to groypers on Twitter -- nobody is responding to the quarantine and social distancing in a positive, embracing, relieved way. They're not rejoicing as though it were a nationwide, indefinite snow day home from school. As though it were just the excuse they'd been looking for to burrow away in their cocoon of video games, porn, streaming TV, podcasts, parasocial online media, and so on and so forth.

You'd think at least the groypers would be taking advantage of the situation to spread the gospel of the cozy NEET lifestyle. Where are their memes showing how snug and comfy they are, being holed up inside the home next to the fireplace, while the normies and Boomers are outside dropping like flies from coronavirus? Why aren't they holding Tea Tunes every night, to amplify their warm, fuzzy get-to-stay-inside-forever vibes?

Nope: even the NEETs have become restless, and are venting their frustration at the people telling them to stay home, to practice social distancing, and the rest. A few years ago, these directives would have instead been met with ironic mockery -- and adoption. "Gee, you really had to twist my arm to make me cozy up by the fireplace and keep my distance from the normies whenever I go out..." Today, they're itching to leave the home and take part in non-parasocial relationships in real life.

The difference between now and the past several years is which phase of the excitement cycle we're in. From about 2015-'19, we were mired in the refractory phase (vulnerable, mellow), where all social stimuli feel painfully over-the-top. As of the last several months, though, we've begun shifting into the restless warm-up phase of the cycle, right on schedule for 2020-'24. (The next manic phase should hit around 2025-'29, and then back into the refractory phase around 2030-'34.)

I chronicled some of the initial changes in the popular mood during the early part of this year, and even, to some extent, the final part of last year. But now we're getting an unambiguous signal of this change of phase from the popular response to the quarantine. The fact that such a wide swath of society is bristling under the quarantine, and so many are saying they're going crazy and need to get out and interact with other people already, is a telltale sign of people who are eager to come out of their shells.

If they were still in the refractory phase, they would have the opposite reaction -- silently thanking God, or passing out from relief about not having to painfully be around and interact with others.

People's reaction is all the more telling when you consider that they'd face grave danger by giving in to their restlessness and going outside and mingling. They could come down with debilitating symptoms of the coronavirus themselves, or they could saddle other people with those costs by transmitting it to them. It has taken a literal global pandemic to keep them caged up in the home -- that's the magnitude of the external force required to keep us inside and isolated for a week, during a phase in the excitement cycle where we're itching to come out of our shells already.

If it were just harsh weather, those external conditions would be no match for the newly awakening desire to get out and socialize. I still remember the blistering heat wave of 2013 out West -- and it being the manic phase of the cycle, everyone was out and about at all hours of the day, with not a care in the world. Open your shirt -- take off your shirt -- whatever you gotta do, and just go with the triple-digit flow.

Being saturated with sweat didn't make you feel icky and sticky -- it gave your skin a healthy glisten as from vigorous activity, amping up your raw animal attractiveness to the opposite sex. Girls were buying up those sea salt sprays for beachy waves, mimicking the look and feel of dried sweaty hair (thick, salty, crunchy).

Contrast that with the Indian Summer we just had during the vulnerable phase in 2019, and how people felt frustrated and heat-avoidant, rather than invincible and adventurous.

There is no universal trend in responding to major external events -- the popular reaction always depends upon which phase of the relevant cycle society is in. The excitement cycle, the status-striving / inequality cycle, the cocooning / crime rate cycle, or whatever else.

And rather than nipping the warm-up phase in the bud, the corona quarantine is only revealing how restless we have become to come out of our shells and start mixing it up with each other, after five insufferable years of a refractory phase, Me Too hysteria, moral panics in general, and the rest of that seemingly endless vulnerability.

If, or when, we're finally allowed to go back outside and live normal lives again, there is going to be so much pent-up sociability, proving that we had changed phases as the quarantine began. We will not react to the end of the quarantine as refractory-phase people would -- complaining and sulking, having to be dragged kicking and screaming back into our social routine. Just make sure you don't let your body atrophy in the meantime -- you wouldn't want to get caught flat-footed right as the neo-neo-neo-disco era comes alive.





March 26, 2020

Coronavirus severity and foreign-born population: Asia / Africa vs. Europe & colonies

My old co-blogger from Gene Expression, Razib, has good corona-coverage on Twitter, including a retweet of the following puzzle about Tokyo vs. New York:


The major difference is the size of the immigrant population: 4% of Tokyo vs. 40% of New York City. Ten-fold difference.

Then you have to take into account how globally connected their immigrants are: most of the immigrants in Japan are from neighboring South Korea, vs. the immigrants of New York coming more uniformly from all over the world.

Then you have to take into account how globally connected those sending nations are: South Korea itself only has an immigrant population of 3%, vs. a bit more (maybe 5-10%) for each of the sending nations of New York's immigrants.

So, when you go through all the links in the various chains of transmission that end in Japan, you don't get much global integration. When you go through all the links in the various chains of transmission that end in the NYC metro area or the US as a whole, it integrates so much more of the world.
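To make that chain reasoning concrete, here's a toy back-of-the-envelope calculation, using the rough shares quoted above (4% vs. 40% for the cities, ~3% vs. ~5-10% for the sending nations). These are illustrative figures, not census data, and the two-link product is a deliberately crude score:

```python
# Toy model: "effective global integration" of a city as the product of
# immigrant shares along each link of a two-step transmission chain.
# Figures are the rough ones quoted above, not census data.

def chain_integration(city_share, sending_nation_share):
    """Crude two-link score: a city's foreign-born share, weighted by
    how internationally mixed its immigrants' home countries are."""
    return city_share * sending_nation_share

tokyo = chain_integration(0.04, 0.03)    # ~4% immigrants, mostly from South Korea (~3%)
nyc   = chain_integration(0.40, 0.075)   # ~40% immigrants, sending nations ~5-10%

print(f"Tokyo: {tokyo:.4f}  NYC: {nyc:.4f}  ratio: {nyc / tokyo:.0f}x")
```

Even this crude two-link product puts New York at roughly 25 times Tokyo's level of global integration -- and summing over longer chains would only widen the gap.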

See here for a list of nations that can be sorted by the share of their population that is foreign-born. East Asia, even its developed countries, has a small immigrant population -- under 5% in each nation. Ditto for Africa, perhaps a reason why the coronavirus doesn't seem to be destroying them as badly as the Europeans. European developed countries and their off-shoots like the US have much higher immigrant populations -- well over 5%, and more like 15% in the US.

* * *

At least people are now allowed to talk about global interconnectedness as a major factor in the spread of contagious diseases. And closing the borders to foreign travelers is a sensible place to start once a pandemic is under way.

But as I emphasized in a recent review post on global integration providing the conditions for pandemics, it's not so much travelers as immigrants who spread disease. Travelers are inside a host country for a brief time, and within a narrow spatial range -- the tourist traps. Immigrants are inside a host country indefinitely -- weeks, months, years, decades -- and ranging over a far broader space -- home, work, leisure, etc., all of which could be in different parts of the city.

In between those two extremes would be recurring contacts, like those on a trade route -- not permanently residing in the host country, but not just stopping by briefly on one occasion.

A foreign military could play any degree of this role, depending on how long they're hunkered down in the host country. Just passing through, or a conflict that results in a quick defeat -- they're more like travelers. Occupying the area, laying siege, etc., in a protracted stalemate -- more like immigrants. Visiting every now and then to collect tribute from a defeated group -- more like traders.

The point is that you have to add up the foreigners' influence over time and over space, within the host country. That's why the share of the total population that is foreign-born gives you a better idea of what will happen, than how much international travel and tourism there is into that host country.

Why do elites easily discover the role of foreign travelers, but not the role of foreign residents? Because BEEP BOOP: xenophobia detected, in the case of immigrants. Elites are fine allowing the cheap foreign labor to continue flowing in and remain as residents indefinitely. That props up the elites' standard of living -- less to spend on workers, more to spend on themselves.

But if they had to cut off all travel into their own country by foreigners? Meh, no big deal. The elites would still have plenty of cultural and genetic diversity to experience from the foreign residents. And they don't like tourists, no matter where they come from. And it wouldn't necessarily prevent the elites from traveling outside their country -- just keeping others from traveling into theirs. It wouldn't lower their standard of living or ranking in their status contests, except for those whose wealth and status derives mainly from the tourism sector.

Populists, including those who want the highest state of public health (not just elite well-being), are the opposite -- they'd rather close the border to foreign residents, and open it to foreign travelers, if one had to be open and the other closed.

A closed border to would-be foreign residents also means they could not compete in our labor market and lower the (individual and collective) bargaining power of domestic workers. An open border to foreign travelers has no impact on domestic workers' bargaining power -- some foreigner visiting Times Square for a few days isn't going to take any American citizen's job.

The impact on the housing market is similar. Foreign-born residents add to the demand and drive up the price of housing. Foreign travelers add to the demand for hotel rooms, where they compete with other tourists (perhaps some of whom are citizens here), but not with citizen residents of the area, who live in apartments or houses. Only if tourism got way out of control would it drive up the price of citizens' housing, if apartments were demolished to make room for hotels, shrinking the supply of residential housing.

* * *

Perhaps it's the poisonous influence of identity politics, but it seems like most educated people don't understand that contagious diseases are contagious -- meaning, they can be transmitted from the source country to some other country through any number of intermediate countries. New York residents could be picking up the Chinese Virus from the Chinese themselves, but also from South Koreans, Iranians, or any other non-Chinese nation that the Chinese have had contact with.

(And that's only the chains with 1 intermediate step between China and New York. Go through all chains with more intermediate steps, e.g. China to Italy to Britain to New York.)

The elites appear to be essentializing the pathogen as though it did not just originate in China, but can only be spread by the Chinese, a permanent and unique trait of that population, part of their identity. That goes for right-wing members of the elite, too, who might be fine closing the border to China but not to our cousins in Britain, who are also stricken with the coronavirus.

The elites are not blinded to these disease dynamics when it's interpersonal, though, like STDs. They all know the phrase, "When you sleep with someone, you're sleeping with everyone who they have slept with." (And again, that phrase only captures the chains with just 1 intermediate step.)

Well, it's the same with nations and borders -- when you open your borders to another country, you're opening your borders to every other country that they have opened their own borders to. Only in this national context, the elite brain senses something being said about groups, and potentially having to protect one collective from another collective, and BEEP BOOP: xenophobia detected. ERROR: answer is not a non-racist number.
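That transitive logic is just graph reachability: treat open borders as a directed graph, and the countries you're effectively exposed to are everything reachable through any chain of links, not just your direct neighbors. A minimal sketch, with made-up country links purely for illustration:

```python
# Sketch: "open borders" as a directed graph. Exposure = every country
# reachable from yours through any chain of open-border links (BFS).
# The particular links below are illustrative, not real policy.

from collections import deque

def exposure(borders, start):
    """All countries reachable from `start` via open-border links."""
    seen, queue = {start}, deque([start])
    while queue:
        country = queue.popleft()
        for neighbor in borders.get(country, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen - {start}

borders = {
    "US":      ["Britain", "Italy"],
    "Britain": ["Italy"],
    "Italy":   ["China"],
    "China":   [],
}

print(sorted(exposure(borders, "US")))  # ['Britain', 'China', 'Italy']
```

Opening a border to Britain alone still exposes you to Italy and China, because reachability follows every chain, not just the direct link.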

Any group of people whose minds are contaminated by identity politics will at best have to struggle in order to protect the public health from the usual threats to it, and at worst will actively make the environment worse by delaying action, sending out messages that quarantines don't work, and so on. Unfortunately that includes most educated people in the First World, and they're the ones who are in charge, to the extent that anyone is these days.

So, expect few major lessons to be learned after the damage from this current pandemic has been done. We'll only get major changes where it's needed after the relative strength of the elites vs. the commoners is severely weakened, and we're not going to get to that phase of the cycle for several decades at the earliest.

Right now, the elites continue to be highly over-produced, and they will likely be less decimated by the coronavirus than the commoners will be -- meaning the end-result will be an even more top-heavy society with even more intense intra-elite competition, and even less collective power for the commoners. In that phase, the elites will start to wipe each other out in one way or another, and only after their herds are thinned out can the commoners exert any real influence over society's direction.

March 23, 2020

Did ancient Akkadians refer to the head as the "yak-yak" part of the body? Support for Julian Jaynes' view of the history of consciousness

Nassim Taleb posted a diagram of the body with the parts labeled in Akkadian, an ancient Semitic language that is the earliest attested member of the family, spoken during the Akkadian Empire circa the late 3rd millennium BC, in what is now northern and eastern Iraq:


There's something very strange about the word for 'head' being "qaqqadum," which bears no resemblance to the widely attested words for 'head' in all other Semitic languages, which derive from "raʔš" in Proto-Semitic (although Akkadian apparently also had a separate word for 'head' that derived from this usual root). It is not a loanword from the nearby, non-Semitic Sumerian language, whose word for 'head' is "sag".

What's going on here? "-um" is just a suffix that can be ignored, leaving "qaqqad". But that can't be it, because Semitic languages almost never allow for a pair of adjacent positions in a root to be the same sound, unless it's the middle and final positions in the triconsonantal root -- not the first and middle positions. For example, h-l-l is allowed, but not h-h-l.

Yet here we seem to have the "q" in the first and middle positions. Turns out it's a reduplication of the single syllable "qad," giving an intermediate form "qadqad," with the cluster "dq" apparently not allowed, so that the "d" assimilates into the following sound, becoming a "q". (Not an uncommon phenomenon across languages.)
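The proposed derivation can be sketched as a couple of string operations -- a toy illustration of the steps just described, not an Akkadian morphology engine:

```python
# Toy derivation of "qaqqadum" from the syllable "qad", following the
# steps above: reduplicate, assimilate the disallowed "dq" cluster,
# then attach the "-um" suffix. Illustrative only.

def derive_head_word(syllable="qad", suffix="um"):
    reduplicated = syllable + syllable               # "qad" -> "qadqad"
    # Regressive assimilation: "d" before "q" becomes "q" ("dq" -> "qq")
    assimilated = reduplicated.replace("dq", "qq")   # -> "qaqqad"
    return assimilated + suffix                      # -> "qaqqadum"

print(derive_head_word())  # qaqqadum
```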

OK, at least "qadqad" does not violate basic Semitic phonotactics, or the rules governing which sounds can be combined in which ways. But that only complicates things because it means the word for such a basic, primitive, all-important word like 'head' is morphologically complex -- it's a reduplication of "qad," giving "qadqad". Why not just make it its own primitive word, instead of deriving it from some other word? It's not like there's anything figurative or conceptual about it -- it's a literal head, from your physical, tangible, visible body.

Well, perhaps there's something even more semantically primitive about "qad," whose roots would seem to be q-d. That's morphologically primitive, only two consonants instead of the usual three. And the "q" and "d" in Akkadian only come from those same sounds in Proto-Semitic, so the Proto-Semitic root would also be q-d.

But what does the q-d root mean in Semitic languages -- anything having to do with the head? According to the article "Statistics of Language Morphology Change: From Biconsonantal Hunters to Triconsonantal Farmers" by Agmon & Bloch (2013), Table S1, row 1.10, q-d refers to 'burn' and 'ignite'. In other words, nothing to do with 'head' by even a generous stretch of the imagination, let alone where 'burn' is the primitive form and 'head' is the derived form.

Saying it a few times out loud, I realized that "qadqad" is a lot like "yak-yak" or "bla-bla" or "bar-bar" (the origin of "barbarian"). Or "yadda-yadda". Then "qad" is just a case of onomatopoeia -- referring to a generic, all-purpose speech syllable, and the reduplicated form "qadqad" suggests a stream or longer utterance of such syllables, just like yadda-yadda and the others.

That is the only way for the word for 'head' to be morphologically complex. Whatever it's derived from must be more primitive -- and how do you get more simple, basic, and primitive than 'head'? It's impossible if it's derived from semantic concepts (like the putative 'burn'). But it is possible if it's derived from an onomatopoeia, which is purely physical rather than conceptual, and therefore simpler and more primitive.

Under this interpretation, the Akkadian word for 'head' originally meant something like "the part of the body that does the yak-yak activity". The bla-bla part. The yadda-yadda part. Not so different from modern languages like English referring to a body part by the activity that it performs, like "yapper" for 'mouth'. In Akkadian, they might have done this to refer to the head as a whole, not just the mouth. After all, you need to refer to some activity that is distinctive of the head -- what else is there?

Remember, this is before the emergence in history of the self-awareness of one's own internal mental states (or consciousness), to borrow from Julian Jaynes' seminal work "The Origin of Consciousness in the Breakdown of the Bicameral Mind". (Now there's a book to stimulate your stir-crazy mind during the quarantine). It's 20-some-hundred BC, not 500 BC, give or take. Or pre- vs. post-Axial Age, in structural terms rather than absolute time. So internal mental processes like thinking and feeling are not options for the Akkadians to choose from. BTW, I've written before about that book and model -- use Google to search this blog for "Jaynes".

External processes connected to the brain might work, but they are fairly passive and not active enough to be the most salient activities. Seeing, hearing, smelling -- not very evocative. Eating is more active, but it involves too much that lies outside the head (most of the digestive tract), and you only eat every so often. Speaking is the only thing that is distinctive of the head, external and physical rather than internal and psychological, and done often enough during the day, to be the most salient activity for the head (before the rise of consciousness).

These two views -- a certain etymology of 'head' in Akkadian, and Jaynes' history of consciousness -- are compatible with each other. I'm not supposing one to prove the other. More like using the etymology of "qaqqadum" as yet another piece of evidence for Jaynes' view of the history of consciousness. I don't need Jaynes' view to argue for the etymology of "qaqqadum," which stands on its own, and I don't see any obvious alternative explanations for such a strange word for 'head' in a Semitic language. And Jaynes' view doesn't rise or fall on the basis of this one proposed etymology.

But they do support each other. If the Akkadians were a more modern people, they probably would have derived their word for 'head' from a more purely cerebral activity, internal to the mind. "Oh that part of the body? Y'know, that's the, er, the thinking-part!" However, if they were pre-conscious (in Jaynes' sense), then they would have chosen a more physical and outwardly directed activity localized above the neck -- the 'head' is the speaking-part of the body! And not just "speech" in some abstract cerebral sense, but in the corporeal onomatopoetic sense, employing a specific example of a speech syllable (reduplicated to suggest a stream of such syllables). For pre-conscious people, the 'head' is not the thinking-part but the yak-yak part.

March 19, 2020

Upcoming political / cultural baby names, based on phonetic trends

For a little comic relief and black humor during the pandemic, and to discharge my stir-crazy mind, I got to thinking about what popular baby names there will soon be, based on the existing phonetic trends.

Fashion in baby names has nothing to do with semantics, or the meaning behind a name. Most names don't mean anything to begin with. Rather, they rise and fall according to their phonotactic properties -- the rules governing what sounds can be combined in what ways. People may tell rationalizing stories after the fact about their choice, like "my uncle had that name" or "I love that character from my favorite book". But these were already pre-screened for fitting into the existing trends for how they sound, and only within those rigid guidelines did they choose one with this meaning or that significance. (See Lieberson's A Matter of Taste: How Names, Fashion, and Culture Change.)

For example, if current sound trends say that a boy's name must be two syllables, and you have ancestors whose surnames were Lloyd and Morgan, then only Morgan could be adapted to the current sound trends. You will rationalize about the meaning of family traditions, but the first choice you made was conforming to the sound patterns du jour. If none of your ancestors' surnames fit into today's sound trends, you will ignore them all (so much for that story about the importance of family roots), and find something else that's traditional in meaning yet still -- and most important of all -- obeying current sound trends.

One of the most rigid rules recently is for male names, and some female names, to have two syllables, stress on the first syllable, and second syllable ending with "n" or, somewhat less commonly, "r". Devin, Bryson, Grayden, Archer, Taylor, etc. And the vowel in the stressed syllable is mainly restricted to front vowels rather than back vowels. High back vowels are especially avoided. (See here for a diagram of vowel space.) In simple terms, "i" "e" and "a" sounds are fine, but "o" and "u" sounds are not, and "uh" is just barely tolerated (Hunter).
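For fun, those sound rules can be approximated with a crude orthographic filter. Spelling is only a rough proxy for sound (a name like Connor, with its "ah" spelled as "o", would be misclassified), so treat this as a sketch, not phonetics:

```python
# Crude orthographic check of the sound rules above: two vowel groups
# (a rough proxy for two syllables), final letter "n" or "r", and a
# first vowel letter from the "front" set. Spelling-based, so only
# approximate -- a sketch, not a phonetic analysis.

import re

FRONT_VOWEL_LETTERS = set("aei")  # rough stand-ins for front vowel sounds

def fits_trend(name):
    name = name.lower()
    vowel_groups = re.findall(r"[aeiouy]+", name)
    return (len(vowel_groups) == 2
            and name[-1] in ("n", "r")
            and vowel_groups[0][0] in FRONT_VOWEL_LETTERS)

for candidate in ["Devin", "Archer", "Taylor", "Hooper", "Truman"]:
    print(candidate, fits_trend(candidate))
```

Devin, Archer, and Taylor pass; Hooper and Truman fail on their back stressed vowels, just as the rule predicts.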

The other major tendency is for names to rhyme with each other, as parents try to put their own little variation on the existing theme. It's a form of status-striving, trying to sound unique yet recognizable to other strivers, unlike the conformist masses. But there are only so many variations on a theme, so they all end up sounding conformist anyway. E.g., Aiden, Jayden, Brayden, Cayden, Hayden, Grayden, etc.

The inspiration for this exercise came from listening to a recent episode of the Red Scare podcast, where one of the ladies said that if she had a kid, she wanted to name him "Honor," and gave a semantic rationalization -- it's trad, it's important to be honorable. In reality, it's a rhyming variation on Connor -- a name already highly popular among the elite class, but without any popular rhyming variations yet. (Donner sounds right, but would get your kid teased for being named after a reindeer.) These names adhere to the current rules of two syllables, stress on the first, second ends with "n" or "r," and stressed vowel is a front one ("ah").

Below are lists of some names that adhere to current sound trends, but that also have meaning for those whose minds are molded by today's political and cultural atmosphere (the usual place to search for names, outside of your own family roots). Some are for the Left, some for the Right, and some for overall lifestyle and persona-construction trends. I put existing names that rhyme in parentheses for comparison. Almost all conform to the rules detailed above, plus a girl's name inspired by the popular stress pattern of "DA-da-DA-da", where both stressed vowels are front ones.

Finally, there's a list of "almost, but not quite" names that could not catch on because their stressed vowel is a back one. They still have two syllables, stress on the first, second ending in "n" or "r," but that stressed vowel makes all the difference in the end. A few names could work phonotactically, but would be avoided because they sound like other names that have been downtrending for a long time. (E.g., if a new girl's name rhymed with Karen or Linda, it could not catch on because others would interpret it as a rhyming variation on an ancient, downtrending name.)

Lighten up for a bit, and enjoy.

* * *

The Left

Larper (Harper)

Recker

Planner (Tanner)

Warren, Warron, Warynn (NB: "wahr" fits better than "wor")

Byden (Bryden)

Berner

Strasser

Raddie (nickname for Radisson, from Madison / Maddie)

Brooklynn (already done)



The Right

Darwin (secular right only)

Bannon

Fasher (Asher)

Nigger (a boy named Sue, a redneck named Nigger)



Lifestyle / Persona

Bacon (Aiken; allows Macon Bacon)

Grynder

Jennder (Ender)

Hater (close to Hayden)

Texter (Dexter)

Fiver

Striver

Patron, Patrynn

Annarexis / Rexi (Annabella, Alexis / Lexi)

Rexi! How many times have I told you not to spoil your appetite!



Almost, but not quite

Groyper

Zoomer, Doomer, Coomer

Uber

Truther

Loser

Poser

Joker, Broker, Woker, Bespoker

Tucker

Trumper

Buster, Bustin (downtrending: Buster, Justin, Dustin)

Trad, Traddison / Traddie (Tr___ is downtrending for girls, e.g. Tracy, Trixie, Trina; Tr___ is trending for boys, e.g. Tripp, Trent, Trace, but Trad rhymes with only downtrending names, e.g. Brad, Chad, Tad, Thad)

March 18, 2020

Not just coronavirus: ecological conditions (open borders) allow other contagions to go pandemic too -- and coronavirus 2.0 (evolution)

None of the science experts you're hearing from has ever taken a course in ecology and evolution, or a general course in mathematical biology. (And if they did, it sure didn't leave an impression on them.)

This is crucial to keep in mind when you hear them analyze what is currently going on with the coronavirus pandemic, or when they describe what they think are the likely outcomes of various possible decisions that we could make. Most of their training is highly specialized, the most lucrative careers are in the micro rather than macro level, and therefore their views are rarely informed by holistic frameworks like ecology and evolution.

For example, an ecologist knows that when a set of pressures (or "incentives") continue to exist, the roles that they select for will continue to be filled. In a virgin niche with abundant resources, somebody is going to start gobbling them up. That first lone organism (if we're comparing different organisms), or that first species (if we're comparing different species), is like an actor playing a role.

The role is written there in the screenplay -- hero, villain, etc. If the finale involves a clash between hero and villain -- some actor is going to perform the good role, some other actor is going to perform the bad role, and their interactions will play out in a spectacle of good vs. evil, no matter which particular actors are in the roles.

The role is a variable rather than a specific number. It is an abstract concept, not the particular thing out there in reality that behaves according to its traits and rules. Interactions between roles are also an abstract concept -- predators vs. prey, hosts vs. pathogens, sick vs. infected vs. recovered, etc. -- not the particular individuals or species that interact in such a way in reality (lions vs. zebras, or whatever).

In fact, "epidemic / pandemic disease" is a variable, since there are all sorts of pathogens out there that could play such a role. When we're talking about the coronavirus, we ought to emphasize that we want to prevent all such nasty pandemic diseases from spreading -- not just snuff out this specific one. Who cares if we magically eradicated COVID-19, only to be wiped out a few years down the line by some other nasty pathogen?

In short, we want to eliminate a certain role from the screenplay altogether, not just lobby for some particular actor to be fired from the role (and implicitly, for some other actor to take their place).

We must therefore analyze the current pandemic in abstract terms, and make decisions that will prevent similar pandemics from occurring in the future. This also means looking over the history of epidemic / pandemic disease to discover similar environments to our own -- and environments that were spared of pandemics -- in order to abstract away from the particulars and figure out some general principles, which can then be applied in the future.

As covered in the recent summary post on the science and history of pandemic disease, the main factor is demographic interconnectedness across many different population clusters. In other words, open borders. It doesn't matter what's causing this linkage network across various groups -- trade routes, military expansion, mass migration, etc. They all serve to spread germs far and wide, and crucially to ramp up the population size in which the germs are spreading, because several populations are being combined into a single meta-population.

The larger the population size (really, the density), the easier it is for pathogens to explode in epidemics, instead of quickly snuffing themselves out. That's why hunter-gatherers were never stricken by crowd diseases like measles or the flu, which only emerged in the large (and dense) populations of sedentary agricultural societies.
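The density effect is the standard epidemic threshold from the textbook SIR model: with density-dependent transmission, a single case only grows into an outbreak when R0 = beta * N / gamma exceeds 1, which requires a large enough N. A minimal sketch with purely illustrative parameter values:

```python
# Minimal density-dependent SIR threshold: transmission scales with
# population size N, so the outbreak condition R0 = beta * N / gamma > 1
# is crossed only at large N. Parameter values are illustrative only.

def epidemic_takes_off(population, beta=0.00005, gamma=0.1):
    """True if a single infected case grows into an outbreak (R0 > 1)."""
    r0 = beta * population / gamma
    return r0 > 1

print(epidemic_takes_off(500))      # small band: False
print(epidemic_takes_off(50_000))   # dense settlement: True
```

A band of 500 sits far below the threshold, while a settlement of 50,000 sails past it -- the same germ, fizzling in one setting and exploding in the other.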

So, what happens if by some miracle we eradicated COVID-19, the specific disease afflicting the world right now, without altering the ecological structures and pressures that gave rise to this pandemic in the first place? Obviously, some other pathogen will step in to do the job -- if you fire one actor from a role, some other actor will end up in that role, performing the same behavior that is called for in the screenplay.

Of course there may be a delay, the next one might not step in right away. And of course, they could end up having their own different take or interpretation on the role, no two actors will perform a role exactly the same way. But the point is, if the role is written into the screenplay, sooner or later, and somehow or other, it's going to be performed.

If you eliminate the current predator of some prey species, then some other species will take their place as predator. Sooner or later. And perhaps with their own different take on the role (maybe they're more vicious than the last predator, or maybe less vicious -- but still predators against the prey).

Therefore, if we eradicated COVID-19 (big IF), but then went back to the demographic flow between populations that we had before it emerged, we would simply set ourselves up for another pandemic. Maybe it would be another influenza-like bug, or maybe it would be from some entirely different class of pathogens. There are so many aspiring actors just champing at the bit to play this breakout role! Thus, our solution to the current pandemic must minimize demographic flow between groups.

Then there is the evolutionary angle -- suppose there were no other pathogens out there to step in and take the role from COVID-19. It's the only actor possible to hire for the role. But we somehow block it from playing that role -- we isolate foreigners, practice some degree of social distancing, trace contacts of those who were infected and quarantine them, maybe administer a vaccine or other treatment, and so on and so forth. Problem solved, right?

Wrong! Much like a frustrated but talented actor who got rejected after the first audition, the virus will do its damnedest to change in order to adapt to what the casting director wants. The actor would dye their hair, bulk up their physique, speak with a different accent, go more method, whatever it took. Just so, COVID-19 will mutate and evolve in order to adapt to the barriers we try to place between it and us.

Suppose we figured out empirically that we needed to keep a 6-foot distance from other people, and that that was all it took -- no further lockdowns or anything, just stay 6 feet away from others. What if one of the individual viruses has a genetic mutation that allows it to spread at a distance of 10 feet? Then staying 6 feet away won't contain the disease, and it will spread -- only now the spreaders will be the descendants of that one mutant individual, the one able to transmit at a somewhat farther distance. We'll be stricken by a pandemic nevertheless.

Then we'd be locked into an evolutionary arms race. We would not have much time to evolve genetic defenses against it in the short term, since the virus's generation time is far shorter than ours. But we could evolve cultural defenses (social practices, technological fixes, etc.). In either case, that simply puts pressure on the pathogen to adapt yet again -- and on and on it would go.

But unlike the use of, say, antibiotics -- the widespread use of which just selects for an antibiotic-resistant strain of the pathogen over time -- there's nothing the pathogen can do to adapt to closed borders. Thus, closed borders are a superior solution compared to other behavioral and technological fixes.

After all, the germ can't propel itself over tens, hundreds, or thousands of miles. It needs a vector. In the extreme case where all nations have sealed shut their borders, it's literally impossible for the disease to become pandemic. Where the flow is just minimal, albeit not literally zero, we are at least selecting for a more benign pathogen. If it has a harder time reaching the next host, then it must treat its current host decently well, lest its transportation break down and leave it stranded to die out.

With all population centers large and small being connected by just hours of cheap air travel, the pathogen can treat hosts pretty badly and not get stranded. Just make sure the host doesn't drop dead in less than two hours after infection, and the pathogen can still leave them to drop dead the next day or week.

Worse, first-world elite employers of cheap labor, or their puppets in the government, are paying the cost themselves to fly in foreigners by the millions, to exploit them instead of giving domestic workers a higher standard of living. That is effectively free air travel for the foreign vector of an imported disease. Nothing spreads disease like free, fast travel.

And unlike foreign tourists, who might only spew their exotic germs for a short while in the country they're visiting, immigrants who come to live and work in their host country are going to be spreading their diseases for far longer, increasing the chance that they'll catch on. Depending on the disease, even brief tourism could do the job by itself, but immigration makes it all the more likely.

That's why in the academic article on the role of demographic interconnectedness in pandemic diseases, one key predictor of a disease being "rescued" from extinction was the share of the population who are foreigners -- not simply tourists, but foreign-born residents who immigrated. Your country could have no tourism industry to speak of, but if you've imported a decent share of your laboring class, you've got a big problem with pandemic diseases.

Of course, if you have both, like Saudi Arabia -- worldwide Muslim pilgrims to Mecca every year, plus these days nearly 40% of residents being cheap foreign labor -- then you're just asking for pandemics. And naturally, they've been hit by plague, cholera, and others in the past, so they're closing Mecca to pilgrims during the current coronavirus pandemic.

For both ecological and evolutionary reasons, nothing beats the solution of minimizing demographic flow between population clusters, or "closed borders" to be concise and dramatic. That's why we didn't need much else, if anything, during the Great Compression / New Deal period of 1920 to 1980 in order to live normal lives largely free of pandemic threats.

Conversely, that's how easily we have been stricken by periodic waves of pandemic disease since 1980, in the neoliberal period -- all we had to do was throw open the borders, in the interest of elites who profit from ramping up demographic interconnectedness, particularly those who control labor-intensive economic sectors and would reap massive free profits if they could import hordes of cheap foreign labor.

March 15, 2020

Closed borders allowed people to live normal lives, free of pandemic threats; Return to open-borders Gilded Age means return of life-disruption

Just because these extreme levels of social distancing, shutdowns, and lockdowns are necessary to mitigate the damage done by a pandemic disease, does not mean that we should consider them in any way normal. What kind of fucked up world do we live in where that's something the entire country has to do for so long?

But we will come to consider it normal -- if we have adapted to utterly pointless responses to foreign threats, like the draconian TSA screening in the wake of Arabian jihadism, then we will surely adapt to responses that actually do reduce harm to us from foreign threats such as pandemic diseases.

That's why the macro picture -- that pandemics are caused by global integration, and that they will only go away with de-globalization -- is so important to keep alive in the public consciousness. The natural tendency otherwise will be to simply go along with this stuff like there's nothing less disruptive to our lives that can be done to protect us. But there is -- closing the damn border. That doesn't disturb our lives at all, only the profit streams of those elites who depend on globalized supply chains, cheap foreign labor, and military expansion.

When we enjoyed a closed border from roughly 1920 to 1980, nobody had to worry about any of these lockdowns, social distancing, canceling of all meetings, and so on and so forth. This is not a case of younger generations finally being subjected to the hardships that their grandparents had to suffer through in their own time.

Bullshit -- my grandparents were born in 1914 and various years of the 1920s. They didn't have to live through a national or regional lockdown due to pandemic disease. My mother's father was a small child during the Spanish Flu pandemic, which did claim a very young family member of his. Other than that single case, though, no they did not have to live through a dystopia like ours -- they lived through the earthly paradise of the Great Compression and New Deal era.

We'd have to go back to the middle of the 19th century, during the Gilded Age, to find ancestors for whom it was a given that their lives would be periodically disrupted and locked down by threats of epidemic and pandemic disease. Especially if yours were Ellis Islanders: they could still have been back in Europe, which was ravaged by one wave after another of cholera outbreaks during the Gilded Age.

But our ancestors who lived most of their lives from 1920 to 1980? No way -- they had it made in the shade. Naturally that did not come easy, or for free -- they had to band together and threaten or actually exert collective leverage against the elites, who backed down and kept the borders closed in the interest of national peace.

That still involved individuals sacrificing for the greater good -- but unlike us, they weren't making a massive sacrifice like giving up their normal lives, and the magnitude of their benefit was not just "mitigating pandemic disease". Wow, what a high bar for collective accomplishment -- mere survival! Instead, they sacrificed their puny, worthless individual ambition in order to enjoy the prosperity, egalitarian distribution of wealth, purity from nasty diseases, societal harmony, and achievement of one scientific and cultural milestone after another.

How did we ever fall from such heights? Because of the return of individualist ambition and status-striving, which poisons our attempts at collective action. Specifically, the generations born into that socially accommodating environment took its benefits for granted. Prosperity, egalitarianism, public health, amazing achievements -- as though those things simply happen all by themselves, and no care needs to be taken by any of us to ensure that they continue. And you can understand them -- without forgiving them -- since they had known no other kind of environment, nothing like the 1830 to 1920 period of misery, dystopia, and decadence.

Once the Silents and Boomers took over the demographic pyramid, and with it the elite levers of power in all domains of society, it was game over for prosperity, egalitarianism, health, and accomplishment. Seeing no harm in the unfettered pursuit of individual ambition (the Me Generation phenomenon of the 1970s), they upended the New Deal norms and ushered in the neoliberal shithole society that we are still descending further down into.

And no, Gen X-ers, Millennials, and Gen Z-ers are not going to make things better. Going back to the Gilded Age comparison, the generation who started turning society around circa the 1920s was born circa the 1880s. We are currently in the 1850s stage of the cycle, meaning our society's saviors won't even be born for another 30 years. And of course we'll have to wait further until they've grown up into full adults in order to see major changes made to society.

In the meantime, it is crucial for us to preserve the knowledge, for our future saviors, of what has gone right vs. wrong at the macro level over many centuries' worth of historical cycles. That way they won't have to figure it all out for themselves. The knowledge per se won't change anything -- we already have it now, and have had it for some time, yet look at how much lower we keep descending since 1980. But once the social conditions are ripe for creating a generation of saviors, it'll all catch on like wildfire if we have kept the embers alive in the meantime.

March 13, 2020

Pandemics are caused by global integration; de-globalization lags by decades, creating a century of misery

Let's begin a series of posts on the coronavirus pandemic by noting that it is only the latest in a series of global disease catastrophes that will continue on for at least another 50 years, as globalization integrates more of the world's people into a great big demographically interconnected meta-population, which is the condition for diseases to go pandemic.

I've written three posts on this specific topic over the years, and the situation keeps getting worse -- a bitter reminder that knowledge, no matter how widespread, changes nothing, since the forces pushing in the current direction are economic and political, not academic, scientific, or rhetorical.

I will not fully rehearse these in this post -- each one is required reading to understand the topic, and you can read them by clicking links below. You're holed up in your home with nothing to do for the next several weeks anyway, what are three posts going to cost you? If all you're looking for is retarded emotional takes or short-term-only technocratic "solutions", you already know where Twitter dot com is located.

The original post was in 2013, going in depth on the historical trends in both epidemic diseases and status-striving and inequality. As status-striving and inequality rose, so did epidemic disease burdens; as they fell, so did disease; and as they've begun rising again since about 1980, so have disease burdens. I covered epidemic diseases like whooping cough (microscopic person-to-person), bedbugs (macroparasites), and salmonella (food poisoning).

Importantly, vaccines prove to have nothing to do with these trends -- whooping cough was in steep decline before the first vaccine was discovered, let alone introduced, and the introduction of two new vaccines has done nothing to slow or halt the disease's resurgence since about 1980. You can quibble about their role at a secondary, tertiary, or lower level -- but at the primary level, vaccines are invisible in the historical data on disease. They are swamped by forces like population density, urbanization, immigration, and so on and so forth -- none of which are static; they move in cycles, which then produce cycles in epidemic disease burdens.

After that, a brief post in 2014 simply provided a topical update in the wake of Ebola being in the news.

The most recent post was in 2015, reviewing an academic article showing that geographic -- really, demographic -- interconnectedness was behind the spread of pandemic diseases, and that some diseases that had been dying out in a given country were brought back to life by migration into it. One of the key variables in predicting how awful a disease will get is the share of the population who are foreigners, which is still rising off the charts in this country and the other rich nations.

This is not about population density, a well known variable in epidemics and pandemics, but demographic flow between population clusters (the article focuses on between-nations, but I point out that this applies within-nations as well, in region-to-region flow). I discuss the links between long-term trends in pandemic disease and in status-striving and inequality, updated with a discussion of within-country trends -- when the borders were closed between nations, they were also largely closed between regions within a nation, preventing diseases from catching on via movement across regions within a single country.
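To make the "rescue" effect concrete, here's a toy sketch of my own -- not from the academic article, and with every parameter (patch size, infection and recovery probabilities, migration rate) an arbitrary assumption. Two population clusters: patch A is a permanently endemic source region, patch B starts disease-free and can only be seeded by infected migrants from A. With zero demographic flow, B stays clean forever; with even a small per-step chance of an infected arrival, the disease keeps getting re-established there.

```python
import random

def two_patch_infected_steps(migration_prob, steps=1000, seed=42):
    """Toy stochastic SIS model (illustrative sketch only).
    Patch A is a permanently endemic source; patch B starts
    disease-free and is seeded only by migrants from A.
    Returns how many time steps patch B had any infection."""
    rng = random.Random(seed)
    N = 100                      # individuals per patch (assumed)
    beta, gamma = 0.3, 0.1       # per-step infection / recovery probs (assumed)
    inf_a, inf_b = 10, 0         # infected counts; A starts endemic
    infected_steps_b = 0
    for _ in range(steps):
        for patch in ("a", "b"):
            inf = inf_a if patch == "a" else inf_b
            # each susceptible faces the patch's current infection pressure
            new = sum(1 for _ in range(N - inf) if rng.random() < beta * inf / N)
            rec = sum(1 for _ in range(inf) if rng.random() < gamma)
            inf += new - rec
            if patch == "a":
                inf_a = max(inf, 5)   # the source region never fully clears
            else:
                inf_b = inf
        # demographic flow: occasionally an infected migrant seeds patch B
        if inf_a > 0 and rng.random() < migration_prob:
            inf_b = min(inf_b + 1, N)
        if inf_b > 0:
            infected_steps_b += 1
    return infected_steps_b
```

Running it with `migration_prob=0.0` gives zero infected steps in patch B, while any positive flow keeps re-lighting the fire there -- the point being that the between-cluster flow, not anything internal to patch B, is what sustains its disease burden.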

* * *

The closest analogy to the present situation -- i.e., as of roughly 1980 -- is the series of cholera outbreaks during the 19th and early 20th centuries, the last era during which the West was subjected by its elites to open borders, laissez-faire economics, and partisan polarization.

These societal conditions began around 1830, with the rise to dominance of the Whigs in Britain and the Jacksonian Democrats in America, lasted through the realignment around 1860 toward the Liberals in Britain and Lincolnian Republicans in America, and began turning around circa 1920, in the wake of WWI, before vanishing altogether by the New Deal period. This perfectly bookends the period of cholera outbreaks in Europe, beginning with the Second outbreak circa 1830 (the First was confined to Asia), and ending with the Sixth that lasted into the early 1920s (the Seventh was mainly confined to Asia).

The pathogen causing cholera comes from India, especially the Ganges River delta near what is now Bangladesh, where some of the filthiest water in the world is to be found. It is commonly said to be spread by contaminated water supplies -- but again, the real spread is from its geographical source in India to faraway places like Russia and Britain by demographic interconnectedness. You can't get a pandemic from a localized water supply that's polluted, and simply needs to be purified.

This is the worst case of science-y types learning the wrong lesson, simply because it's more technocratic (a boost to the demand for nerds' services), and more suitable for data visualizations (the nerd bible on data visualization prizes the map drawn by Dr. Snow to locate the proximate cause of a cholera outbreak in London during the Third pandemic). The real origin was India -- not the Broad Street water pump in Soho -- and the real solution was to minimize demographic flow between Britain and India, which only came about decades later when Britain began decolonization instead of imperial expansion, and economic nationalism instead of mercantilism.

But that is too macro of a view, and nerds always zoom in as microscopically as possible, missing the forest for the trees. "Wow guys, all we have to do is clean up the water supply!" -- as it's being assaulted non-stop by demographic flow from India to Britain, through all sorts of different routes. If "all we have to do" is clean up the water, and this was proven in the 1850s, why were cholera epidemics still striking Europe up to 50 years later?

Worse, even if some nerd had figured out that cholera's pathogen originated in India, and that the only way to stop it was cutting off demographic flow from there -- what group of wealthy, powerful British leaders would have followed that advice during the expansion of the British Empire? "Sorry, nerd -- but if losing a few percent of our population to cholera is the price we have to pay for maintaining our hold over India, then so be it." Those making profits hand over fist in the mercantilist relationship with India, not to mention those getting rich off of colonial military occupation, are easily going to sacrifice the poor bastards who can't drink from purified water sources in London.

Perhaps the most retarded view from the so-called dissident Right is that "pathological altruism" among first-world elites underlies globalization. Gee, our elites are just not greedy enough! The British mercantile elites who used India as a cheap off-shore production site for raw materials that fed into a British-owned supply chain, and the military elites who subjugated India, were not altruistic but selfish. It is pathological greed, not altruism, that opens a nation's borders.

That was true of the British Empire during the Victorian era, the American Empire during the neoliberal era, Classical Athens during Athenian hegemony (culminating in the Plague of Athens), the Roman Empire near its peak (culminating in the Antonine Plague), the Byzantine Empire near its peak (culminating in the Justinian Plague), the Mongol Empire that spread the Black Death, and the Spanish and later British Empires that spread Old World epidemic diseases to the New World natives.

* * *

The basic pattern of pandemics being spread by population movements (whether military expansion, trade routes, or mass migration) used to be conventional wisdom back in the New Deal era. The best-known popular book by a highly regarded historian is Plagues and Peoples by William H. McNeill -- written all the way back in 1976, before the Reagan Revolution, though just as its libertarian vanguard was picking up steam.

This knowledge was profoundly watered down and obfuscated by the time that Guns, Germs, and Steel by Jared Diamond was published in 1997, squarely within the neoliberal era. There, it's only in service of the overall question of how Eurasians subjugated non-Eurasian societies (in large part by introducing contagious diseases against which they themselves had some degree of defense, but which the natives did not, owing to their total novelty). It's part of the race-and-power discourse, which is not per se about public health, political economy, and imperial expansion -- as these dynamics may play out within a race (e.g., Athens vs. other Greek city-states in the Classical period).

In fact, most of history's plagues and pandemics have taken the form of Eurasian civilization vs. Eurasian civilization -- right up through the cholera pandemics of the Gilded Age and now the East Asian influenza-related pandemics of our Second Gilded Age.

In the neoliberal era, intellectuals serving those with concentrated wealth and power are required to shift away from material frameworks like public health, political economy, and geopolitical expansion -- and toward recasting these matters in more cultural, racial, and identitarian frameworks. Closing our borders to China would not be a matter of public health, economic nationalism, or minimizing our military imperial footprint -- it would be racist, xenophobic, and probably the first step on a slippery slope toward genocide.

Again, the real problem is the economic elites who benefit materially from open borders, not their servile propagandists in the universities, think tanks, media, etc. Knowledge has no power of its own, and winning an informational war does nothing ipso facto to improve our standard of living (although it's worth waging to purify the informational sectors of bullshit, for its own intellectual sake).

* * *

How much longer will we have to suffer through these miseries? Peter Turchin's historical dynamics work shows that we're only in about the 1850s, for the striving-and-inequality cycle. We are nowhere close to turning the corner, which is a good 50 years away at the earliest. Everyone alive today can count on the rest of their lives being stricken by periodic waves of pandemic diseases, as globalization continues apace.

Only when the commoners become unified and rowdy enough will the elites feel the pressure to dial down globalization, as they did circa 1920. But civic society organizations among ordinary people, especially labor unions, have been falling off a cliff for decades and are not coming back anytime soon. Still, that is the ultimate goal to be working toward, as well as identifying which factions of the elite stratum are more vs. less amenable to building alliances with in the battle against globalization (the topic for another post).

Getting bogged down in the details of how to mitigate a pandemic's impact is missing that bigger picture, and is a palliative rather than a cure. Sure, it's worthwhile, but we always have to be working toward the prevention of pandemics in the first place -- and that can only happen from de-connecting demographically from the rest of the world, like we enjoyed during the New Deal era.

Specifically, scale down our military to a purely defensive one to protect the 50 states, rather than occupying the entire world. Re-industrialize our economy, rather than off-shoring production to cheap labor colonies. Minimize immigration from outside, and minimize transplanting and carpetbagging within our own borders, to the rates of the New Deal era.

March 7, 2020

TikTok and the return of dance-step fever during warm-up phase of excitement cycle, like late 2000s online culture

The explosive popularity of the mobile app TikTok coincides with the transition out of the vulnerable phase of the 15-year excitement cycle, and into the warm-up phase when energy levels return to baseline from a refractory level, and people feel restless to get moving and interactive again. It reminds me of the late 2000s internet culture, the last time we were in the warm-up phase of the cycle.

The app lets you upload 15-second audio-visual clips to share with other users, and by far the main tendency is to have music playing on the audio, with the video saved for quick comedy sketches and sight gags, lip sync, or dance routines. It's most popular with people under age 25.

This first post will focus on TikTok's link to the return of dance mania, and a second post will focus on the return of non-parasocial media and its absence of personas.

To give a feel for what the app is being used for, and how it's spreading its influence into the broader pop culture ecosystem, we'll look at a dance routine for the song "Say So" by Doja Cat, a current top 40 hit in the US. The music is reminiscent of the disco era, and the late '70s was another warm-up phase of the cycle (naturally marked by dance fever).

First, a compilation of TikTok clips from Haley Sharpe, the originator of a dance routine set to the song, then a series of clips of other TikTok users adopting the routine, and finally the music video for the song which incorporated the routine and the originator herself. (All videos are small in order to not disrupt the flow of reading; click fullscreen to see them in a larger size.)

If you can't see these videos, try reloading the page -- that fixes it, at least for me.







The moves in this routine recall my discussion of the nature of dance crazes, from the original post on the restless warm-up phase of the excitement cycle:

The most distinctive feature of this phase is, not surprisingly, dance crazes. I don't mean music that is highly danceable -- but music with accompanying dances that are so simple, repetitive, and color-by-numbers, that even someone who's barely emerging from a refractory period can get into them. Even those who are just getting out of their emo mindset from the vulnerable phase can get social enough to do these dances.

These dances are so rule-defined that they have their own names, and a list of them shows that they do in fact occur mostly during the third phase of the cycle.

It is not a spontaneous, free-form routine that requires a high level of skill and comfort-in-your-body to adapt on-the-fly. There's a small finite number of easily distinguishable moves or steps, each of which is simple enough to do and to carry out in a certain order. Painting-by-numbers, assembling building blocks according to the instruction manual, etc. It's like the Watusi, the YMCA, the Macarena, or the Cupid Shuffle (all from warm-up phases: early '60s, late '70s, early '90s, late 2000s).

Also like those examples, the song and the dance for "Say So" are linked together in a pair -- this routine goes with this song, that routine goes with that song. That way, there's no ambiguity or uncertainty -- and no social anxiety or awkwardness -- when the hit song comes on and everyone has to figure out how to move along to it. It's simple: you have no choice but to follow the routine that everyone else already knows, rather than trying to come up with your own individual moves.

Again, that's not so much about individual vs. collective orientation -- since plenty of these song/dance pairs emerged from the individualist, hyper-competitive neoliberal era -- but just a consequence of the routine's simplicity, widespread popularity, and the desire to not stand out or feel socially awkward when people are just starting to come out of their low-energy vulnerable-phase cocoons.

TikTok reminds me so much of YouTube during the late 2000s, when dancing and DIY music videos were among the most popular kinds of videos. It may be hard to remember, but way back in the olden days, there were no 20-minute monologues, no reactions to current events in any domain of society and culture, no characters, no false sense of intimacy -- just brief entertaining clips to get you in a good mood and feel excited. It was simple, unpretentious, engaging, and fun.

That state of YouTube did not last into the 2010s -- it was part of the warm-up phase of the late 2000s.

The main dance sensation was twerking, which was only just blowing up in popularity. And one YouTube dancer mesmerized millions with her instruction -- Patty Mayo. Lots of girls uploaded clips of themselves shaking their ass at the camera, but she had gymnastic training and a greater level of corporeal / kinesthetic skill, making her dances more of a proper performance. True to the zeitgeist, she did not put any effort into a visual brand (costume, staging the set, etc.), never spoke to the camera, and never revealed anything about her personal life. Just some random teenage girl cutting loose in her suburban home.

She was not just a viral video creator at the time -- google her name (along with "dancer") and you'll find large numbers of people asking whatever happened to her, from the early 2010s to the present. YouTube has tons of copies of her videos, although she only kept a few of them up on her own channel. Below is her dancing to "Cyclone" by Baby Bash. Again, the main dance move she's modeling for others to imitate is twerking, but there are also moves that tie in to the song title: holding her hands above her head and undulating head to toe like a cyclone.



From the artistically inclined teenagers, there was the DIY music video for an existing hit song. These were less corporeal than a full dance routine, and less focused on the body moving in time to the rhythm. But they still incorporated a kind of choreography -- discrete, identifiable moves or gestures executed in a sequence -- especially if the creator edited the shots to transition in tandem with the beat. This is harder to do with TikTok due to the 15-second time limit, but if they relax that to a few minutes, it could come right back to popularity.

I first stumbled upon Olivia Parenteau's DIY music videos when "Hot N Cold" by Katy Perry came out. It was big in the dance club I spent most of my weekends at, so I figured there would be an official video for it that I could listen to outside of the club (I've never used iTunes, Pandora, Spotify, etc., and never will). At first, though, there was only some random teenager's DIY video for the song -- but the stop-motion-inspired editing made it so infectiously fun to watch!

Below are her videos for "Hot N Cold," and "7 Things" by Miley Cyrus, each with millions of views (equivalent to hundreds of millions today). They were as popular as Patty Mayo videos, and just like them, have left their original fans returning to leave comments about how central they were to their pop culture experience of the late 2000s, from the early 2010s up through the present.





Of course, the original viral video on YouTube was a DIY music video for "Hey" by Pixies, made by two college babes way back in 2005, complete with choreography (if not dance-floor moves):



And then there was the 2007 viral video of two teenage girls dancing to "Harder, Better, Faster, Stronger" by Daft Punk, choreographing not only their moves to the rhythm, but timing the display of the body parts on which certain lyrics were written to the moments when those lyrics are sung in the song. Like the video for "Hey," this one has tens of millions of views, equivalent to billions today. Its moves and its "lyric on body part" concept are simple enough that it spawned numerous imitators.



Although these once highly popular forms of YouTube video did not last into the 2010s, they can easily be reborn in the current warm-up phase of the excitement cycle, though it may take a few years for them to reach their peak, and TikTok -- or some successor platform -- will have to relax the length limit to a few minutes instead of just 15 seconds. Welcome back to an online atmosphere in which no one will care about your feelings, reactions, or persona construction -- where people will simply want to get into a groove and have fun with other people.

March 3, 2020

Social healing anthems come from manic pixie dream girls

The role of the manic pixie dream girl is to coax socially wary men out of their refractory-phase shells, in order to get both sexes re-acquainted with each other during the warm-up phase of the excitement cycle.

It's a form of social integration: undoing the social isolation of everyone huddling under a pile of blankets during their emo vulnerable phase, and getting them to join a social whole -- whether it's the two sexes interacting in ordinary public places, a dance club, or actually dating and mating.

The MPDG role is a kind of social healer, nursing others out of their state of social isolation, which heals the collective back into an integrated functioning whole. I don't mean "social healing" in any other sense (social justice, etc.).

We've already seen that these roles are overwhelmingly played by women who were born during the manic phase of the 15-year excitement cycle. They imprinted on an atmosphere of carefree invincibility regarding social relations, and are the most unflappable in that domain of life -- and so, the most capable of encouraging others that there's nothing to be afraid of about social integration.

In their movie roles, the specific kind of social integration they encourage is forming a couple between the male protagonist and a love interest (perhaps the MPDG herself). But there are other kinds of social integration that people who are just coming out of a refractory phase might be wary about pursuing, and may be tempted to continue wallowing in isolation.

An earlier post looked at resilience anthems, which appear during the warm-up phase as people start to come out of their emo cocoons. Only two of those songs are by women, and they're distinct from the male songs in having a more personal, intimate, direct, one-on-one address, as though the singer were a therapist or nurse to the listener. The kind of social integration here is all-purpose -- to get over self-doubt and self-pity, in order to form fulfilling relationships in general.

Both of these songs are performed by manic-phase births -- all three members of Wilson Phillips ("Hold On") are late '60s births, and Avril Lavigne ("Keep Holding On") is an early '80s birth. As a bonus, on the same album as "Hold On," Wendy Wilson was given a song ("Impulsive") in which she exhibits spontaneous, free-spirited traits that cement her in the MPDG role during the early '90s warm-up phase.

But the new genre I want to showcase in this post is a certain kind of therapeutic, self-empowerment anthem -- the "you are not alone," "things will get better" type. It comes in the form of a direct, intimate, one-on-one address, again like a therapist or nurse. It's empathetic toward the listener's depression, doubt, etc., portraying it as a kind of social isolation. After trusting in the encouragement of the singer-nurse, the listener will feel better -- not in some generic sense, but specifically by being accepted by a larger group, by fitting in. It's encouraging social integration over isolation. The listener initially doubts that this is possible because of their gloomy view of their own individual quirks and traits -- but the singer-nurse assures them that being who they are will not get them ostracized, but will make them welcomed by the group.

The four examples are "True Colors" (1986) by Cyndi Lauper, "Hero" (1993) by Mariah Carey, "Beautiful" (2002) by Christina Aguilera, and "Firework" (2010) by Katy Perry. Lest I be accused of cherry-picking, I got them from Wikipedia's list of gay anthems. But they are universal in appeal -- they don't make any reference to the listener as marginalized, outcast, misfit, freak, weirdo, etc. All sorts of people may feel socially isolated, not just the tiny minority who are actively marginalized (or marginalize themselves).

Only one of these comes from a warm-up phase ("Hero"), one from a manic phase ("Firework"), and two from vulnerable phases ("True Colors" and "Beautiful"). So they are not linked by the phase they appear in, nor therefore what particular function they play within the excitement cycle. The tone is different according to their phase, with the vulnerable-phase ones sounding much more emo, and the manic-phase one sounding the most upbeat and carefree. But they do share the common theme described above.

All four of these songs were performed by manic-phase births: Lauper was born in the early '50s, Carey in the late '60s,* and both Aguilera and Perry in the early '80s. No matter what phase they found themselves in, they carried an imprint stemming from their birth (whose phase repeated when they were 15 and hitting their social stride as adolescents). They were meant to play a social healing role, whatever atmosphere they found themselves in at the time.
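The birth-year-to-phase assignments used throughout these posts follow from simple modular arithmetic on the 15-year cycle. As an illustrative sketch (the `cycle_phase` function and the 1980 anchor year are my own formalization, inferred from the periodization given elsewhere in the blog: manic phases in the early '50s, late '60s, early '80s, etc.):

```python
# Sketch of the 15-year excitement cycle's phase arithmetic.
# Assumed periodization: manic 1980-'84, vulnerable 1985-'89,
# warm-up 1990-'94, repeating every 15 years in both directions.

def cycle_phase(year):
    """Return the excitement-cycle phase a given year falls in."""
    offset = (year - 1980) % 15  # 1980 assumed to open a manic phase
    if offset < 5:
        return "manic"
    elif offset < 10:
        return "vulnerable"
    else:
        return "warm-up"

# Birth years cited in this post:
print(cycle_phase(1953))  # Lauper, early '50s
print(cycle_phase(1969))  # Carey, late '60s
print(cycle_phase(1984))  # Aguilera / Perry cohort, early '80s
```

Running this classifies all four performers' birth years as "manic", and correctly places late-'70s births (Pink, Bareilles) in the warm-up phase and late-'80s births (Gaga, Kesha) in the vulnerable phase.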

Since only one song is from a warm-up phase, it's not accurate to refer to these as manic pixie dream girl songs, but the performers do possess the free-spirited and spontaneous traits associated with the role. It may be hard to remember, but Mariah Carey had a different persona back in the '90s (see the video for "Dreamlover," during a warm-up phase, where she could not look more like an MPDG). And Katy Perry was still fairly MPDG-like: compare the video for "Simple" (2005) with the video for "Firework," and see how similar her expressions are, looking directly into the camera with a sympathetic and encouraging look, reaching out her hand for reassurance, etc. Not a sex-bomb persona.

Finally, I'll reiterate that these are universal in their appeal, rather than "let your freak flag fly" anthems, which are a separate sub-genre on the gay anthem list, and are more apropos for deviants. I include "Brave" by Sara Bareilles in that category, since it's specifically about an outcast, someone getting bullied who ought to stand up to their bullies. Those don't show a bias toward performers born in any particular phase: Pink and Bareilles are warm-up phase births (late '70s), Kelly Clarkson ("People Like Us") is a manic-phase birth (early '80s), and Lady Gaga and Kesha are vulnerable-phase births (late '80s).

I looked for examples of social healing anthems before the 1980s but couldn't find any. They appear to reflect the neoliberal era's alienation and social isolation, and the need that isolation creates for reassurance that one will belong to a healthy group at some point. The hyper-competitive norm of the status-striving era also means that people are more likely to cut others loose, and to focus more on their own needs than on including others in the group. The songs above serve to ameliorate that sense of hopelessness and isolation.

"Lean on Me" from 1972 is not an example, since it's about solidarity among equals within a community, rather than a relationship between nurse / therapist and patient. Its ethos belongs to the New Deal era, before neoliberalism. But for what it's worth, Bill Withers is also a manic-phase birth (late '30s).

* Wikipedia says Mariah Carey was born in either '69 or '70, with claims made in the media on both sides. But in a discrepancy over the year a woman says she was born, always go with the earlier one -- especially if it straddles a decade boundary, and might make her seem a full decade younger by being born "in the '70s" rather than "in the '60s".

All of Mariah Carey's personality traits and facial expressions, from her heyday during the '90s, resemble those of manic-phase births -- carefree, wholesome, and free-spirited, not an instigating provocative wild child (warm-up phase births), and not grave, tragic, or emo (vulnerable phase births).

February 26, 2020

After nursing others to health during warm-up phase, manic pixie dream girls pursue their own needs during manic phase of excitement cycle

So far we've been looking at the role that manic pixie dream girls play within the context of the restless warm-up phase of the 15-year excitement cycle -- coaxing wary guys out of their shells, so that the sexes can get reacquainted with each other, after 5 long years of refractory-phase hyper-sensitivity.

What happens to these girls once that role has been fulfilled, though, and the cycle enters the manic phase where everyone feels invincible and carefree? There's no longer a need for an earthly guardian angel to lift a guy up out of a deep psychological hole.

Roles are adaptive within a phase, and do not stay constant over time. New environment, new roles. Still, some may be better suited to a particular role than others are, based on their birth and development.

Once the wary people have been rescued from their emo funk by the manic pixie dream girls, during the warm-up phase, the MPDGs are then free to pursue their own social and emotional needs and fulfillment during the manic phase. They've been caring for others for the past 5 years -- it's time for a little vacation, a little break, a little "me time".

They were glad to care for others in the previous phase, and don't resent that at all. But now that that work has been done, it's time to play. They are not abdicating all duties and responsibilities, they're simply going on vacation. And it's not in a hedonistic, degenerate way -- they just want to shake off their role of nurse and get footloose and fancy-free for a while, in a wholesome way. As MPDGs, they were validating others -- now it's their turn to receive validation from others.

That's what was behind the backlash against the MPDG role during the early 2010s, after its heyday during the late 2000s: everyone understood that the MPDGs' function had been successfully accomplished, and now it was time for them -- and everyone else -- to move on to new roles during the manic phase. They were going to have more of a social life of their own, fulfill their own emotional needs, have others validate them rather than vice versa, and have some wholesome fun on their little vacation.

This change to the MPDG role shows up in a new focus on social independence in manic-phase movies, whose characters, in the previous warm-up phase, may have been straightforward MPDGs. The girl in Ruby Sparks (2012) gets a life of her own, separate from her author-creator. The operating system in Her (2013) socializes with other OS's and leaves the human social ecosystem entirely. And the mermaid from Splash (1984) ends up leaving the human ecosystem of her love interest, taking him back to her own world under the sea.

This change was foreshadowed already at the tail-end of the MPDG heyday, in 500 Days of Summer (2009), where half the movie explores the MPDG leading a fulfilling married life of her own with another man, after having nursed the male protagonist out of his stagnant depression. It's not that manic pixies are fickle gypsies -- but that roles change along with the phases of the excitement cycle, and some other type of person may need her attention in a different phase (or she may need attention herself).

A more concise, impressionistic display of the changing of roles is this ad for Magnum ice cream with former MPDG Rachel Bilson from 2011 (infectious enough that I still remember it, despite watching minimal TV during my adult life). No longer nursing others through emotional rehab, she's now free to pursue a wholesome carefree treat of her own, on her own:



But the most intense signal of the changing roles is the "it's time for a little me-time" anthem that explodes during the manic phase of the cycle. These are not hedonistic, about cutting all social ties and responsibilities, egocentric, etc. They clearly place the desire for a little vacation and validation for themselves within the context of having already fulfilled their duties to others and behaved responsibly. It's simply time to take a break, catch their breath, and replenish their own emotional stores after having given to others, by having some carefree me-time fun. They will get back to their responsibilities to others, just after a brief rejuvenating vacation.

These anthems were performed by women who were born during a manic phase, just like the MPDGs were (early '50s, late '60s, early '80s). Also like the MPDGs, they socially imprinted on the manic phase environment when they were hitting their adolescent stride at age 15. And in one case, Avril Lavigne, she'd already played the MPDG role during the previous warm-up phase (the late 2000s, in "Keep Holding On" and "Girlfriend").

It's too bad that Cyndi Lauper and Shania Twain didn't have big hits from the late '70s and early '90s, when they would've been naturals in the MPDG role, to show the evolution across phases like Avril did. I assume that they at least resonated with the MPDG role during the warm-up phase, like other manic-phase births do in such an environment. By the time the manic phase rolls around, they certainly show signs of having been through a MPDG role recently -- I've taken care of others, now it's time for a validating vacation of my own.

They don't treat the generic topic of "me-time," though: it's a specifically feminine form of needing to unwind and receive some emotional validation from others. And that, too, is after having fulfilled a specifically feminine role -- nurturing others. That's why these fall into the broader "girl power" trend that characterizes the manic phase of the cycle. (There are different forms of girl power from women who were born during different phases, but that's a matter for a separate post.)

I searched the late '60s manic phase for examples, but came up empty-handed. The "girl power" songs from back then were more about social / political change, as the sudden eruption of the women's lib movement overshadowed the more mundane changing of phases in the excitement cycle. Without such a momentous one-time social revolution under way, I assume there would've been one of these anthems back then as well. Alternatively, in the pre-neoliberal era, it might have been unnatural to make songs that, in however qualified of a way, glorified me-time as opposed to couples-time, family-time, community-time, or country-time.

In any case, these anthems all made the year-end Billboard Hot 100 charts, and are some of the most iconic of the manic-phase zeitgeist.

"Girls Just Want to Have Fun" by Cyndi Lauper (1983)



"Man! I Feel Like a Woman" by Shania Twain (1997)



"What the Hell" by Avril Lavigne (2011)



February 23, 2020

Lars and the Real Girl, a transition between robo-gf and manic pixie dream girl trends of the excitement cycle

On a whim last night I watched this critically acclaimed box-office disappointment, and it resonated so well with some earlier posts here on the topic of manic pixie dream girls and their place in the 15-year cultural excitement cycle.

First, recall that during the vulnerable refractory phase of the cycle, there's a retreat into the fantasy of obtaining a made-to-order robo-gf -- one who won't require all that painful social stimulation in order to court and woo.

Then, recall that during the restless warm-up phase that follows, the manic pixie dream girl archetype appears out of nowhere, as a kind of guardian angel to coax the male protagonist out of his vulnerable-phase cocoon, lifting him out of the emo funk that he'd been mired in throughout the previous phase.

Lars and the Real Girl came out in 2007, during a restless warm-up phase that should not have had the robo-gf and should have had the manic pixie dream girl. Instead of featuring solely one of those two types, the movie shows both, but in a way that is consonant with the warm-up phase -- leaving the emotional crutch robo-gf behind, and welcoming the charms of the manic pixie dream girl, as the protagonist works his way out of a deep dreary depression.

Even when the robo-gf is the focus of the plot early on, the protagonist is never depicted as enjoying a fulfilling retreat into fantasy (unlike The Stepford Wives, Weird Science, and the like). His attachment to his robo-gf is clearly shown as forced on his part, plainly an emotional crutch, and is treated as pathological by the other characters, who still want to help him through this awkward stage. This is the only way the robo-gf archetype can exist during the warm-up phase when people are itching to leave behind their emo-phase cocoons.

The manic pixie dream girl, for her part, doesn't get as much screen time as in other movies from the most recent peak of the type (the late 2000s). But it's clear what role she plays vis-a-vis the protagonist; she does not exist so much for her own character arc across the narrative, and she has the usual eccentricities in personality and appearance that are associated with the type.

A post on the birth phases of manic pixie dream girls showed that they overwhelmingly were born during a manic phase of the cycle. They imprinted on a social-cultural atmosphere of invincibility and carefree social relations during their introduction to the world -- and then again during their adolescence (around age 15), when they're hitting their stride socially. And sure enough, the actress playing the manic pixie dream girl in this movie was born in 1984, during the early '80s manic phase. It rarely fails!

Strangely, she is left off of lists of manic pixie dream girls. I'd been looking over them and watching as many as I could lately, to get a better feel for this character type, now that she'll be coming back during the early 2020s warm-up phase. But this movie had totally eluded my radar until mindlessly scrolling Amazon Prime.

She must have been left off because the type of people who write those lists are obsessed with individual personas, both because they're spergy nerds who don't understand social relations, and because they're status-striving types who see things as contests among individuals rather than a holistic superorganic social ecosystem. (The movie does a great job of portraying this aspect of real communities like small-town Wisconsin.)

What makes a character a manic pixie dream girl is not her individual traits that could be listed on a trading card, or an online dating app profile -- it is her relationship to the protagonist, how their social interactions drive the plot of him coming out of his emo funk. She is his earthly guardian angel, not just some isolated free spirit who wears barrettes in her hair and is generally in an upbeat mood.

Another reason may be the total lack of irony or self-awareness in the movie's tone. If every other example of the character was played in a movie that was ironic or self-aware in tone, then how could this one be a true example of the type? Because tone has nothing to do with the relationship between the characters. Again, cultural critics are just doing superficial analysis, ignoring social relations and roles, and emphasizing stylistic choices like the degree of irony struck in the tone.

A final reason why it's ignored in discussions of the character type or overall genre, is that the characters are not metropolitan professionals. In the striver critic's mind, who else but yuppies and current private school kids could ever be going through a funk and need to be coaxed out of their cocoons to fulfill some higher purpose that requires social integration? Certainly not office drones in flyover country small towns.

All these exceptions recommend the movie over most of the more well known examples of the genre. The focus on the holistic social ecosystem, the sincere tone, and the humanistic portrayal of ordinary people from unglamorous walks of life -- really makes it feel like a throwback to before the current status-striving / neoliberal era. Unfortunately that meant it couldn't succeed much with audiences, but it's definitely worth watching.

February 22, 2020

Dancers show that corporeal women are ass women, not boob women

We return to an ongoing investigation that has shown that corporeal people are ass men / women, while cerebral people are boob men / women. Where else can we look for evidence? Dancing -- there are few activities more corporeal than that (there's basically no cerebral component to it).

After playing "Red Light Special" for the previous post, and knowing that I'm a fan of dancing, YouTube's algorithm suggested I watch this choreography video set to the song. "Choreography" may be stretching it -- stripperography is closer to what it is (no stripping or nudity, but NSFW).

Videos like these provide a large sample size of people that clarifies statistical impressions that might otherwise elude someone. There are dozens of women in the room, whether actually dancing or watching nearby. And there are dozens of other videos on that channel and others like it, which all show a pretty similar profile of dancers.

Notice that hardly any of them are large-breasted, and that if anything they tend to have more shapely ass-hips-and-thighs.

Aside from their static shape, which parts of their body do they emphasize while actively dancing? Far more from the waist down -- and when the upper body is involved, it's the arms and hands, not the chest region. They play up their facial expressions, whip their hair around, touch their hair or head or face with their hands -- literally the only part of their body that they avoid emphasizing is their tits.

Now, this is exotic dancing, and there are plenty of moves that are highly sexual. So it's not that they avoid drawing attention to their chests because it's too refined to indulge in such vulgar displays. If they all had big boobs, they would absolutely be heaving them around, bouncing them up and down, and touching or slapping them -- since that's what they're doing with their ass, hips, and thighs.

This tendency is true even for the women who are about equal above and below the waist. It's not just a matter of emphasizing whichever is larger.

It is already well understood that ballet dancers tend to have small-to-modest breasts, but the case of exotic dancers helps to rule out all of the existing theories about ballet dancers. It's already known that small-chested women tend to be the ones who join ballet to begin with -- not that ballet transforms large chests into small ones.

But the answers offered all suppose that it's about ballet's high degree of athleticism (at least in modern times), which would weed out women whose breasts were large enough to get in the way of the gymnastic-like movements. Or something -- it's not clear what the arguments are for large boobs being an impediment per se to athletics -- there are plenty of guys with flabby bellies and man-boobs who are on football, baseball, and hockey teams, not to mention bowling and golf.

My hunch is rather that large-breasted women are not as athletically gifted as a matter of their inner kinesthetic sense. I predict that if they removed their large breasts, and small-breasted women got large implants, their success in athletic training and performance would mirror that of other women with their original chest size. Big boobs and lower athleticism is correlation, not causation (both are correlated with kinesthetic sense). That applies to all athletics -- dancing, gymnastics, cheerleading, soccer, softball, volleyball, whatever.

In any case, the exotic dancers show that athleticism is irrelevant. The kinds of dances you see in that video do not require minimal body fat, intense activity, explosive movements, or gymnastic-level skills. If it were the high degree of athleticism that kept large-breasted women out of ballet, then why are they also kept out of exotic dancing?

It clearly is not the physical requirements of the activity. The weeding-out process is at the level of "are you corporeal or cerebral?" or "do you have a high or low kinesthetic sense?" If big boobs don't make the cut at this basic level, it means that "ass woman or boob woman?" is related to the fundamental trait of "corporeal or cerebral," not just certain athletic activities that a corporeal person might pursue.

You could also ask them how clumsy they are. I'll bet boob women are clumsier on average than ass women, even at tasks that do not involve either region -- catching a ball, for instance. Related: rhythmic skill (one aspect of kinesthetic sense). I'll bet ass women have better rhythm, even at tasks that don't involve dancing -- tapping back or humming back a rhythm they've just heard.

Finally, there's the matter of the target male audience for dancing -- are they ass men or boob men? I'm guessing ass men, judging again from the moves of the dancers that overwhelmingly emphasize the fertility region rather than the mammary region. (This dance fan is definitely an ass man.)

If boob men wanted to see the chest emphasized, they wouldn't watch a dance routine, but something simple like women jumping on trampolines (a la The Man Show), jogging, jumping rope, or whatever. Or an upper-body striptease. But nothing that would fall under "dancing".

February 20, 2020

Gay Peter Pan-isms: Aversion to baby-making / quiet storm songs

Back in 2012-'13 I solved the mystery of describing gay syndrome -- that is, the broad correlated pattern of traits that distinguish male homosexuals from heterosexuals of either sex. The standard frameworks were both retarded -- that gays are a kind of hyper-masculine male, or a kind of effeminate male.

The simplest framework is that they are a kind of pre-pubescent male child, one whose mindset is signaled by the view that "Ewww, girls are yucky!" That's the stage that their broad psychological, and to some extent physical, development is mostly stuck in.

It is only to the extent that adult females are neotenous (resembling children), that gays appear effeminate -- they both resemble children, gays far more strongly. But post-pubescent females have all sorts of mature traits that mark them apart from children, and from gays -- namely, anything related to motherhood (the maternal instinct, pregnancy, birth, breastfeeding, nurturing children, and so on).

The framework of "hyper-masculine gays" never had much support -- relative to heterosexual men, gays are smaller, weaker, more easily frightened, cry more, and are totally uninterested in girls, to name the most obvious non-masculine traits. Those are all explained by resembling a child of the age that still feels that "girls are yucky".

The only piece of evidence was that gays are highly promiscuous, which is more male than female-typical. I showed how that is easily explained in the "gays as pre-pubescent boys" framework -- consider what would happen if you gave an adult hormonal sex drive to a boy who was still mostly pre-social. He wouldn't feel strongly attached to anyone outside his household, and could easily cycle through friends and acquaintances. Boys and girls do this normally with friends at that age; gays differ only in having a sex drive attached to it, so they cycle through "friends and acquaintances who they get it on with," rather than just friends.

It's only during adolescence that the social sense fully develops, and people become more bonded to their peers and behave in a give-and-take way to maintain durable social circles. And since gays are still in the "girls are yucky" stage, their sex drive targets males by default -- the sex that boys are not averse to interacting with socially.

Long-time readers remember this; newer readers can search the blog for "gay" and "Peter Pan," "pedomorphy," or "neoteny". The evidence is extensive. Nobody had proposed the theory before, either professionals or laymen, and either from the left, right, or center.

* * *

Having reviewed the overall framework for the first time in awhile, let's add another example to the pattern. I wasn't even planning on looking for it, it just showed up when I was looking at different music genres.

There is a very well established genre called the gay anthem. It's not just disco-dancing stuff, as there are weepy torch songs in there too. There is major overlap between "hoe anthems" and gay anthems.

And yet there is a related kind of doing-it song that does not show up at all -- the baby-making song, or the broader genre of quiet storm song. To show how little overlap there is, Wikipedia's lists of gay anthems and quiet storm songs have around 200 songs apiece, and yet there's only 1 song that belongs to both -- "I Will Always Love You" by Whitney Houston. And that's more of a torch song than a getting-it-on, baby-making song. Only 1 out of over 200 in common? That is not a minor difference.

There are a few near misses, where the same artist and album are on each list, but different songs from the album show up on either list (gay anthem first, quiet storm song second):

1985 - Whitney Houston, "How Will I Know" vs. "Saving All My Love For You"

1994 - TLC, "Waterfalls" vs. "Red Light Special"

1998 - Brandy & Monica, "The Boy Is Mine" vs. "(Everything I Do) I Do It For You" or "Angel of Mine"

The case of TLC is revealing since you'd think "Waterfalls" would be the more normie-friendly song, and that "Red Light Special" would be the one for the over-sexed group, but it's the other way around. In the YouTube comments to the video below, unlike for a hoe anthem like "Gimme More" by Britney Spears, there's no stampede of gays rushing in to identify with the song, and the girls all refer to baby-making rather than letting their inner hoe shine.



What distinguishes baby-making songs (mainly a sub-genre of quiet storm) from the seemingly similar hoe anthems, or torch songs, that make up much of the gay anthem genre? It's the juvenile vs. mature form of social interaction that is assumed. An immature person can feel attraction, infatuation, be sexually active with another person, split apart afterward, and feel lonesome after the parting of ways. But only socially mature people feel the romance and one-on-one intimacy that goes along with long-term monogamy, marriage, raising families, and so on.

Baby-making songs are not about risking pregnancy by sleeping with just any old guy -- it's the special, unique one who you're invested in, and who is invested in you, to the degree that you wouldn't mind eventually forming a family together. If it's just about fucking any random hot guy, that's a hoe anthem, not a baby-making song.

If gays are socially-psychologically like pre-pubescent boys, then of course they don't resonate with the tone of quiet storm songs, which assume a couple that is adult, romantic, and pair-bonded. But hoe anthems and torch songs would certainly work for gays: in their pre-adolescent and mainly pre-social state, little boys (and girls) already cycle through friends and acquaintances "promiscuously." Gays differ only in having an adult sex drive attached to this process, so they are sexually promiscuous while cycling through them, perhaps feeling all woe-is-me after the acquaintanceship inevitably breaks apart.

* * *

In typical libtard fashion, the push for gay marriage assumed that equality before the law required equality in nature -- "just like us," "same love," etc. But gays could not be more different from normal men in the relevant domains of life. If the crusaders wanted to allow them to get married, it should have been in terms of giving them the privilege despite being so different, as a form of tolerance. I don't support it, but that's the only way to do it if you do.

Instead we just got a bunch of risible propaganda making claims about how the world works, which normies already know is bogus. It politicized and weaponized social science, as part of a polarizing culture war, rather than a civil liberties approach (tolerance, admitting the vast differences in nature).

What if you got the social science wrong? Then is it OK to revoke the rights and privileges that you based on it? Libtards never stop to think about that. If scientific discoveries -- or basic common sense -- disprove your claims about nature, and those claims are the foundation for your rights argument, then your argument is incredibly fragile. A robust argument is made independently of whatever the state of nature is, a topic to which we'll return in future posts.