When the current wave of white-collar crime was in its early stage during the 1980s, one of its most notorious figures — Michael Milken, the junk bond king — was scarcely 40 years old. Fast-forward 20 years, and the leader of one of the largest Ponzi schemes ever — Bernie Madoff — had passed 70.
Do these two examples suggest a graying of the white-collar criminal class? Where are all the enterprising mega-swindlers in their late 30s and early 40s nowadays?
Wikipedia has a category page listing roughly 100 American white-collar criminals. Strikingly, something like 80-90% of them are members of the Silent and Baby Boomer generations. Some may be minor figures in the huckster hall of shame, but most are infamous enough to merit inclusion in such a list.
The Silents and Boomers are not simply carrying on a tradition handed down from earlier generations. The list includes only two real examples born before 1925. Charles Keating was born in '23, but his crimes belonged to the savings-and-loan crisis of the late '80s — not to some earlier era whose torch the Silents and Boomers merely carried. Only Louis Wolfson, born in 1912, disgraced himself before the current wave, back in the late '60s. (Sam Gilbert, born 1913, was not in finance.)
How about on the other side of the Silent/Boomer cohort — Gen X? I count only one who is American, worked in finance / Ponzi schemes, caused massive damage, and was born after 1964: Laura Pendergest-Holt (born '74). To be fair, birth years from the first half of the '60s appear to be uncommon on the list as well.
Browsing other "top 10" lists of the worst white-collar criminals doesn't turn up many further examples beyond the Wikipedia list (Nicholas Cosmo, born '71, is one). And of course those other lists do include Silents and Boomers who aren't on Wikipedia's list.
It seems that no matter whether the high-level crimes were committed during the '70s, '80s, '90s, or the 21st century, it has almost always been Silents and Boomers behind them — back then as upstarts, now as the Establishment. They were the original "Me Generation" of the 1970s, whose thirst for ever higher status installed dog-eat-dog morality in place of the Midcentury norms of making do and reining it in, norms which kept individual ambitions from destroying social cohesion.
It can come as no surprise that these generations have produced the worst offenders throughout the entirety of the white-collar crime wave that has been escalating for the past 30 to 40 years.
Fortunately, though, this bodes well for the future since the Silents and Boomers aren't going to be living for too much longer. And their Gen X successors do not appear to be so driven by greed to take on the mantle. If anything, their guiding purpose has been to expose hypocrisy rather than to rationalize the dog-eat-dog morality. And the Millennials after them don't seem to have an ambitious bone in their bodies — not exactly the source of the Next Big Thing, but not the engineers of the next big Ponzi scheme either.
June 30, 2014
Historical analysis of horoscope advice to reveal the popular mood?
Horoscopes give the life advice that the readers want to hear. By shifting responsibility onto the stars, the reader can follow their own plans without feeling as though their motives were self-centered. Hey, it's what the star chart recommended!
But what readers want to do, and hence want to hear recommendations for, is not a constant over time. The range in typical mindsets today is way different from those in earlier periods. I doubt there was much horoscope advice in 1974 that nudged readers toward participating in a real estate bubble to get rich quick, although that wouldn't have been surprising advice in 2004. Romantic advice from 1984 would have been phrased to make something happen, while in 2014 it would be geared more toward soaking up attention.
How can we study the changes over time? I couldn't quickly find any sources online, nor any references in Google Scholar that suggest this line of thinking has been pursued in "the literature" before.
Tabloid newspapers would probably be too hard to track down, either online or in real life. Women's magazines are readily available in university library archives, and the entire history of Vogue has been digitally archived. Mass market horoscope books have been published, and would be easy to hunt around for.
I might poke around some of the public libraries and thrift stores in my area. If anyone can point me to online sources, though, I can dig through those as well.
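If digitized sources do turn up, the analysis could begin as simple keyword counts per theme per year. A minimal sketch in Python — the theme keywords below are placeholders I made up for illustration; a real coding scheme would have to be derived from the corpus itself:

```python
from collections import Counter

# Hypothetical theme keywords -- a real study would develop these
# from the horoscope corpus, not assume them up front.
THEMES = {
    "getting_rich": {"invest", "property", "opportunity", "wealth"},
    "attention_seeking": {"admirers", "spotlight", "charm"},
}

def theme_counts(horoscope_text, themes=THEMES):
    """Count how many theme keywords appear in one horoscope entry."""
    words = horoscope_text.lower().split()
    counts = Counter()
    for theme, keywords in themes.items():
        counts[theme] = sum(1 for w in words if w.strip(".,!?") in keywords)
    return counts

def counts_by_year(corpus):
    """Aggregate theme counts per year; corpus is (year, text) pairs."""
    yearly = {}
    for year, text in corpus:
        yearly.setdefault(year, Counter()).update(theme_counts(text))
    return yearly
```

Plotting each theme's yearly share would then show whether, say, get-rich-quick advice really does spike around 2004.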
Categories:
Media,
Pop culture,
Psychology
June 28, 2014
Today's grandmothers can free children from helicopter parenting
While I'm home for the summer, I'm having many more opportunities to observe the parenting culture, as my 6-year-old nephew is staying here without his parents. Just his grandmother and Uncle Agnostic.
Parenting styles appear to form right around the time a child is born, and remain more or less frozen from then on. Even though helicopter parenting was in full swing during the '90s for small children (the Millennials), the parents of teenagers still let them have their own life, and did not constantly hound them ("touching base") when they left for college. That's because these parents had their kids in the '70s or early '80s, and retained the mindset of that time, which encouraged doing things on your own, without endless supervision.
I'm seeing this again, where my mother is taking care of my nephew more like the way she raised us, and less like today's helicopter parents (including my brother). A major part of that is letting him play by himself, or making friends with other kids his age in the neighborhood — without "play dates," just interacting spontaneously amongst each other.
How could the other kids' parents allow their own child to participate in such dangerous activities? Well, it turns out the parents aren't home. One boy is the grandson of the woman who lives a few doors up, and another girl from across the street is being babysat by her grandmother while her parents are away on vacation.
Of course — grandmothers! How else could small children be allowed to interact with each other by simply visiting each other's houses and asking if so-and-so wants to come out and play? Today's helicopter parents are too paranoid about their fellow community members, so leave it to those whose parenting style was formed back in the '70s and first half of the '80s.*
Kids have to learn how to treat others, and how to respond to others' treatment of them, away from authority figures mediating their interactions. That's called preparing for real life. Shelter your kids, and they cannot mesh into any normal social setting outside of their nuclear household.
If you would like to do something about your OCD parenting, but think you're too committed to hovering, then just bite the bullet and send the kids away to Camp Grandma for the summer.
Lots of X-ers don't exactly have the warmest relationship with their parents, but you can get over it for the benefit of your kid. They need an environment where they can take hard falls, schedule their own social activities, and face the consequences of their actions. And if you're like most parents today, you cannot bring yourself to give that to them directly.
* This would not have had the same effect when you yourself were a child, since it was your parents who let you enjoy an unsupervised life growing up. Your own grandmother was probably a worrywart, like mine was (and still is). She was bringing up your parents during the Midcentury heyday of Dr. Spock, "smothering mothers," etc., and that mindset stuck with her right up through her grandparenting years in the '70s, '80s, and '90s.
Categories:
Age,
Cocooning,
Generations,
Kinship,
Over-parenting,
Psychology
June 25, 2014
When music videos were shot on film
Checking out the videos on Totally '80s today on VH1 Classic, I was struck by how common it was to shoot on film back then, despite the fact that video technology was not only available but cheaper than film, and already becoming the standard for shooting news and pornography.
Shouldn't music videos have joined in with other lesser media like reporting and porno, and chosen to shoot on video? They could have, but then they wouldn't have that stylized look that film gives.
Video is shot at a higher frame rate (capturing more motion per second), gives more desaturated colors, and has a more restricted dynamic range of brightness levels. It's more photorealistic and ordinary, making it better suited to media where the viewer wants that "you are there" feeling -- such as news reporting and porno.
For music videos, this format was generally chosen when the idea was to put the viewer in the audience of an ordinary, real-world live performance by the band. It's as if a documentary crew went to shoot a small gig that the band was playing that night.
Below are screenshots from the videos for "Any Way You Want It" by Journey, "Start Me Up" by the Rolling Stones, and "I Want Candy" by Bow Wow Wow, all of which were shot on video. (Click on the song titles to see the full videos.) No real reason for these particular examples, except that they're fresh in my mind from today, and are all from the early '80s -- to show how early the format was adopted for the ordinary/documentary approach. (Click to enlarge.)
Film gives lusher colors, more striking dark-bright contrast, more texture of the medium itself (film grain), and stylized motion by shooting fewer frames per second.
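The motion difference comes down to sampling rates. A rough comparison, assuming the era's standard rates of 24 frames per second for film and 59.94 interlaced fields per second for NTSC video (PAL regions used 25 fps / 50 fields instead):

```python
# Rough numbers behind the two "looks": film vs. NTSC-era video.
FILM_FPS = 24                  # distinct images per second on film
VIDEO_FIELDS_PER_SEC = 59.94   # interlaced NTSC fields per second

# Video samples motion roughly two and a half times as often as film,
# which is why it looks smoother and more "live."
motion_ratio = VIDEO_FIELDS_PER_SEC / FILM_FPS
print(f"Video samples motion about {motion_ratio:.1f}x as often as film")
```

That coarser temporal sampling is a large part of why 24 fps film motion reads as stylized rather than "you are there."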
Here are some screenshots from "Papa Don't Preach" by Madonna, "Rhythm of the Night" by DeBarge, and "Love in an Elevator" by Aerosmith. No real reason for these either, except that they're fresh in my mind, and are from the second half of the '80s -- to show how film was still going strong well after it had been abandoned for video in news and porno. It didn't even need to be a narrative video like the one by Madonna. The other two feature a lot of live performance footage, but the setting is supposed to be larger than life and out of the ordinary, requiring a more stylized look.
Now that music videos are so rarely made, let alone watched, and even then are shot on digital, you wonder what effect it will have on the visual expectations of today's young generation. Will they expect the sky to be white rather than blue, will they find black shadows too dark, and will they feel comfortable only with either washed-out or caricatured/campy colors rather than ones that are warm and lush?
After all, it's not as though they have replaced music videos with another medium that has a film-y look and feel. The major new visual medium for them is video games, which they prefer to look more pale, blandly colored, and evenly lit than a news broadcast.
Categories:
Design,
Media,
Music,
Pop culture,
Technology,
Television
June 22, 2014
How can "mercy" killings of pets be justified, when they never attempt suicide?
One of the more galling rationalizations for killing a pet whose near future looks dark is that, in some way, the animal is ready for death and perhaps even sending a signal to the owner that it's time to go and please do the job for me.
Their movement gets stiff, they don't walk or jump around as much or at all, they aren't as playful, they stop eating and drinking — they're just lying around, waiting for you to put them out of their listless misery. It's only humane of you to honor their request and have a doctor come pump barbiturates into their veins to shut off their nervous system.
Wait just a second — if they're so miserable and beyond all hope, why aren't they trying to do the job themselves? Not like it would be hard for most pets. Over a lifetime, they have cultivated a good sense of what things and places are dangerous, so instead of avoiding them, they could go toward them. Y'know, walk out into busy traffic, climb someplace high and go splat on the ground below, pick a fight with a nasty predator or rival... anything, really.
In fact, animals never commit suicide (which is not to say that they don't sometimes behave in a way that results in their own death). Their survival instinct is too strong. I suspect the same is true for human beings, and that suicide is a maladaptive disease of civilization (the exact mechanism being irrelevant for now). I don't recall ever hearing of hunter-gatherers taking their own lives. In any case, we sure don't see it in animals, particularly not pets who are winding down their last days.
Lying still, breathing slowly, and refusing food and water are their way of conserving rather than wasting energy so that they'll live as long as possible in the final stage of life. Yes, even trying to eat, digest, and excrete food would be a waste if the bodily systems required are performing poorly or winding down.
If they seem sad, it is because they sense the end is near — not sad because you're taking too long to bump them off already. Like any other living creature, they want to go out on their own terms, and not have their lives taken from them by some paternalistic authority. They want to be respected and leave with their dignity intact, not to get snuffed out in a final disrespectful humiliation.
Categories:
Morality,
Pets,
Politics,
Psychology,
Violence
June 21, 2014
Are veterinarians biased against cats?
We cloak professional healers in an aura of sanctity, as though they were guardian angels and miracle workers. But they are fallible human beings with their own set of motivations, in addition to wanting to help patients.
The most disturbing example must be the veterinarians, who turn out to be biased against an entire class of their patients — the cats.
Here are the results of a survey of vets and vet technicians. The majority express no preference for either, but that's just giving them the easy fence-sitting answer. Psychological studies show that people who own both dogs and cats are more like dog people than cat people. Push them in a real-life setting, and they'd more likely come down in favor of dog patients. Of those who do express a preference, the vets are 2 to 1 in favor of their dog patients, and the vet techs are even more biased at 3 to 1.
Here are even more extensive statistics, painting the same basic picture of preferences, but also revealing how these manifest themselves in the hospitals and clinics themselves.
The two main reasons seem to be that dogs give cues that are easier for humans to read, and they aren't as feisty. Dogs are more closely adapted to interacting with people, but I still wonder how much of the greater difficulty of "reading" cats is due to the vet not being a cat lover in the first place. To each their own in their private life, but this is like a pediatrician who doesn't care for kids.
Aside from the bias against cats in itself, dog people also tend to be more liberal, in the sense of having less respect for purity and sanctity (click on the "pets" tag below to review earlier posts on this topic). I expect they're more likely to callously favor euthanasia, whereas cat people would set a higher threshold for putting a pet down.
And given the greater influence that the vet has in this decision — not being as easily persuadable as a panic-stricken pet owner — this likely results in more pets (canine and feline) having their lives taken than there should be.
Part of the belief in "healers as angels" is that we shouldn't research and choose which hospital and which doctor we should bring our loved one to. Aren't all angels equally angelic? In the case of taking care of cats, though, you ought to check into these things beforehand, to at least make sure you'll be seeing a cat person with the proper respect for life, and not a dog-loving mercy killer.
Categories:
Economics,
Morality,
Pets,
Psychology
June 19, 2014
The generational structure of status contests: Competing over careers vs. lifestyles
Periods of relative economic and political stability are marked by a pervasive code of reining-it-in and making-do, which prevents individual ambitions from overgrazing the commons. The last such period was the Great Compression of the 1920s through the 1970s.
Peter Turchin's analysis of the dynamics of cycles in ideological climate and in material conditions suggests that popular attitudes change first, followed by their aggregate material effects. The Great Compression was preceded, for example, by the Progressive Era. Likewise, the period of rising inequality since circa 1980 was preceded by a popular push away from the ethos of reining-it-in.
In 1976, Tom Wolfe summed up this decisive shift in attitudes by labeling the trailblazers "the Me Generation." Why should I have to rein it in, when I deserve better? I see what I want, I'm going to take it, and everybody else better get out of my way. After all, I deserve better.
These people in the early and middle stages of their working years, who were going to shake up the stability of the older incumbents, were the Silent Generation and the Baby Boomers. And ever since, they have made careerism their preferred mode of status competition. That's the closest to what we mean by "status" — job prestige, income and wealth, and the necessary credentials.
Once they began aging into the middle and later stages of their careers, did they gradually work less and then retire to make way for younger generations — in the way that the older generation did for them, when they were starting out? Hell no. They've dug themselves in like ticks on the political-economic body.
This has saturated careerism as an arena for status competition. Later generations can certainly try to break into that arena and do battle with the incumbents, but their success will be far smaller than back when the incumbents in the economy and government didn't put up much of a fight, since in those days they were glad to retire and give the next generation an opportunity to control things.
What options does this leave for the status strivers among later cohorts such as Generation X and the Millennials? Compete over your leisure pursuits, rather than pursuing your career.
I've eaten at more trendy food trucks than you guys have. I've heard of more obscure music groups than you guys have. I've unlocked more achievements in some video game than you guys have. And I'm more up-to-date on some epic TV show than you guys are. You jelly?
This does not boil down to an effect of age, as though young people will always have difficulty competing in the career arena and will therefore invest more of their efforts in lifestyle competition. Remember that in the '70s and '80s, the Silents and Boomers faced almost no pushback from the incumbents. The dazzling success of the Me Generation was not necessarily due to some greater talent they had, but perhaps due to the incumbents following a different, not-too-competitive code. It doesn't take much of a soldier to wipe out a bunch of pacifists, now does it?
An economic study by Erica Segall of age, period, and cohort effects on consumption patterns did find a significant cohort effect of being a member of Gen X or later when it came to what portion of your budget goes to consumer spending.
Now, the Me Generation has always indulged in contests of conspicuous consumption, but only to the extent that they honestly signal the competitors' superior job prestige and earning potential or accumulated wealth. If they compete over food, it will be based on the price of admission and dining per se — not steering the vanguard fad of vegan egg creams.
In general, then, their consumerism will be limited to longer-lasting and higher-priced items such as cars, real estate, and trophy wives. With all of their energies focused on dominating their career, they just don't have enough time and effort to compete over their possessions. Rather, they'll set aside a huge amount, in payments that can be made monthly and automatically instead of having to be attended to on an hourly basis, and for items that are obvious to everyone as status symbols.
The later generations who compete primarily on consumerism don't have much wealth to flaunt, but it doesn't take that much money to enter the lifestyle competition. Let's say your weekly foodie excursion sets you back $50, and that you take two weeks off a year. That's only $2,500 for the whole year — a sum that you'd have to fork over every month to rent in New York City, and even then only in a shoebox far away from all the action.
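The arithmetic behind that figure, as a quick sanity check:

```python
# Back-of-the-envelope cost of entering the foodie lifestyle contest,
# using the figures from the text.
weekly_cost = 50          # one foodie excursion per week, in dollars
weeks_per_year = 52 - 2   # taking two weeks off a year

annual_lifestyle_budget = weekly_cost * weeks_per_year
print(annual_lifestyle_budget)  # 2500 -- about one month's NYC rent
```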
Competing as a fashionista or as a gadget geek will set you back a little more, but still nowhere near the cost of luxury cars and real estate, or credit cards to keep the trophy wife.
In fact, as long as you're competing more on how trendy rather than how wealthy you are, why not just buy clothes, gadgets, and meals that aren't very expensive at all, provided they run through fashion cycles fast enough? Cheap and static doesn't allow for any kind of competition, but cheap and high-turnover opens up a whole 'nother arena for poor strivers to climb their way to the top of some pyramid, and then another pyramid, and another, and another.
Hence, purchasing the top du jour at Forever 21, the app du jour at the Apple Store, the burger du jour at Wendy's, and to wash it down, the microbrew du jour at Whole Foods.
Then there's consumption's twin, leisure. The early waves of strivers kicked off the higher ed bubble, but it was purely to obtain credentials that would let them shove aside the incumbents in the economy and government, who didn't have an MBA or whose JD was not from Harvard or Yale. Even at the middle level, as long as it served as a launching pad toward a higher-earning career, the Me Generation had no problem going to State U.
For Gen X and the Millennials, however, choosing which college to attend was influenced more by what the choice would tell the world about their lifestyle. Nobody's paying for college out of pocket, so which school they go to reveals little about their wealth. Rather, they're trying to show how much time they researched what the different schools are like, and which one matched up the closest with their lifestyle, and would signal their commitment to competing in that lifestyle.
Not to mention that college is pure leisure these days. Kids do no work and get full credit. Degrees are bought and sold, so as long as your student loans don't bounce, you're in the clear as far as your studies go. That frees up more time for you to compete with the other students over lifestyle pursuits, whether that's being a shopaholic or a video game addict.
The "year abroad" during college is a big deal for the same reason — do you know of a more trendy yet less spoiled location than the other year-abroaders? Ditto for the unpaid internship: nobody is earning money, nor will anybody's gig lead anywhere afterward, so you try to score a more trendy and enviable spot for making yourself busy.
In daily life, the post-Me generations spend a lot of time in coffee shops, foodie joints, and cafeterias at Whole Foods style supermarkets. They're the minority, though. The cocooning majority hangs out online, where status preening takes place on websites where each competitor is assigned an ID card that shows how many points they've racked up within that domain — likes on Facebook, followers on Twitter or Instagram, gamerscore on Xbox Live, elite posting status on HuffPo / Amazon / IGN / Rotten Tomatoes, and so on and so forth. Climbing these ladders doesn't cost much money, but if all you've got is time and effort, you too can achieve internet immortality.
Thus, from the Me Generation to the Stuff White People Like Generation.
There are some interesting distinctions even within the career vs. lifestyle groups. Silents seem to be driven more by wealth, Boomers by influence and power, although both are careerists. And although both are lifestyle competitors, Gen X wants to be cool, Millennials want to be famous. I attribute these splits to how the cocooning vs. outgoing cycle has affected them. Accumulating wealth or having a bunch of followers are less social (you don't interact with fans), while influencing and controlling others is more interactive, and so is membership in a scene that's cool (not lame).
This has been a rough outline, as there's still a lot more to be said about the effects and implications of a generational split in status competition. But I'll save those for further posts, rather than try to cover everything all at once.
Peter Turchin's analysis of the dynamics of cycles in ideological climate and in material conditions suggests that popular attitudes change first, followed by their aggregate material effects. The Great Compression was preceded, for example, by the Progressive Era. Likewise, the period of rising inequality since circa 1980 was preceded by a popular push away from the ethos of reining-it-in.
In 1976, Tom Wolfe summed up this decisive shift in attitudes by labeling the trailblazers "the Me Generation." Why should I have to rein it in, when I deserve better? I see what I want, I'm going to take it, and everybody else had better get out of my way.
The people then in the early and middle stages of their working years, the ones who would shake up the stability of the older incumbents, were the Silent Generation and the Baby Boomers. And ever since, they have made careerism their preferred mode of status competition. That's the closest to what we usually mean by "status": job prestige, income and wealth, and the necessary credentials.
Once they began aging into the middle and later stages of their careers, did they gradually work less and then retire to make way for younger generations — in the way that the older generation did for them, when they were starting out? Hell no. They've dug themselves in like ticks on the political-economic body.
This has saturated careerism as an arena for status competition. Later generations can certainly try to break into that arena and do battle with the incumbents, but their success will be far smaller than back when the incumbents in the economy and government didn't put up much of a fight, since in those days they were glad to retire and give the next generation an opportunity to control things.
What options does this leave for the status strivers among later cohorts such as Generation X and the Millennials? Compete over your leisure pursuits rather than over your career.
I've eaten at more trendy food trucks than you guys have. I've heard of more obscure music groups than you guys have. I've unlocked more achievements in some video game than you guys have. And I'm more up-to-date on some epic TV show than you guys are. You jelly?
This does not boil down to an effect of age, as though young people will always have difficulty competing in the career arena and will therefore invest more of their efforts in lifestyle competition. Remember that in the '70s and '80s, the Silents and Boomers faced almost no pushback from the incumbents. The dazzling success of the Me Generation was not necessarily due to some greater talent they had, but perhaps due to the incumbents following a different, not-too-competitive code. It doesn't take much of a soldier to wipe out a bunch of pacifists, now does it?
An economic study by Erica Segall of age, period, and cohort effects on consumption patterns did find a significant cohort effect of being a member of Gen X or later when it came to what portion of your budget goes to consumer spending.
Now, the Me Generation has always indulged in contests of conspicuous consumption, but only to the extent that they honestly signal the competitors' superior job prestige and earning potential or accumulated wealth. If they compete over food, it will be based on the price of admission and dining per se — not steering the vanguard fad of vegan egg creams.
In general, then, their consumerism will be limited to longer-lasting and higher-priced items such as cars, real estate, and trophy wives. With all of their energies focused on dominating their career, they just don't have enough time and effort to compete over their possessions. Rather, they'll set aside a huge amount, in payments that can be made monthly and automatically instead of having to be attended to on an hourly basis, and for items that are obvious to everyone as status symbols.
The later generations who compete primarily on consumerism don't have much wealth to flaunt, but it doesn't take that much money to enter the lifestyle competition. Let's say your weekly foodie excursion sets you back $50, and that you take two weeks off a year. That's only $2,500 for the whole year — a sum that you'd have to fork over every month to rent in New York City, and even then only in a shoebox far away from all the action.
Competing as a fashionista or as a gadget geek will set you back a little more, but still nowhere near the cost of luxury cars and real estate, or credit cards to keep the trophy wife.
In fact, as long as you're competing more on how trendy rather than how wealthy you are, why not just buy clothes, gadgets, and meals that aren't very expensive at all, provided they run through fashion cycles fast enough? Cheap and static doesn't allow for any kind of competition, but cheap and high-turnover opens up a whole 'nother arena for poor strivers to climb their way to the top of some pyramid, and then another pyramid, and another, and another.
Hence, purchasing the top du jour at Forever 21, the app du jour at the Apple Store, the burger du jour at Wendy's, and to wash it down, the microbrew du jour at Whole Foods.
Then there's consumption's twin, leisure. The early waves of strivers kicked off the higher ed bubble, but it was purely to obtain credentials that would let them shove aside the incumbents in the economy and government, who didn't have an MBA or whose JD was not from Harvard or Yale. Even at the middle level, as long as it served as a launching pad toward a higher-earning career, the Me Generation had no problem going to State U.
For Gen X and the Millennials, however, choosing which college to attend was influenced more by what the choice would tell the world about their lifestyle. Nobody's paying for college out of pocket, so which school they go to reveals little about their wealth. Rather, they're trying to show how much time they researched what the different schools are like, and which one matched up the closest with their lifestyle, and would signal their commitment to competing in that lifestyle.
Not to mention that college is pure leisure these days. Kids do no work and get full credit. Degrees are bought and sold, so as long as your student loans don't bounce, you're in the clear as far as your studies go. That frees up more time for you to compete with the other students over lifestyle pursuits, whether that's being a shopaholic or a video game addict.
The "year abroad" during college is a big deal for the same reason — do you know of a more trendy yet less spoiled location than the other year-abroaders? Ditto for the unpaid internship: nobody is earning money, nor will anybody's gig lead anywhere afterward, so you try to score a more trendy and enviable spot for making yourself busy.
In daily life, the post-Me generations spend a lot of time in coffee shops, foodie joints, and cafeterias at Whole Foods-style supermarkets. Those who venture out are the minority, though. The cocooning majority hangs out online, where status preening takes place on websites where each competitor is assigned an ID card showing how many points they've racked up within that domain — likes on Facebook, followers on Twitter or Instagram, gamerscore on Xbox Live, elite posting status on HuffPo / Amazon / IGN / Rotten Tomatoes, and so on. Climbing these ladders doesn't cost much money, so if all you've got is time and effort, you too can achieve internet immortality.
Thus, from the Me Generation to the Stuff White People Like Generation.
There are some interesting distinctions even within the career vs. lifestyle groups. Silents seem to be driven more by wealth, Boomers by influence and power, although both are careerists. And although both are lifestyle competitors, Gen X wants to be cool, Millennials want to be famous. I attribute these splits to how the cocooning vs. outgoing cycle has affected them. Accumulating wealth or having a bunch of followers are less social (you don't interact with fans), while influencing and controlling others is more interactive, and so is membership in a scene that's cool (not lame).
This has been a rough outline, as there's still a lot more to be said about the effects and implications of a generational split in status competition. But I'll save those for further posts, rather than try to cover everything all at once.
Categories: Age, Cocooning, Design, Economics, Education, Food, Generations, Media, Music, Politics, Pop culture, Psychology, Technology, Television, Video Games
June 13, 2014
What killed New Hollywood, what took its place, and why?
Great post by Paleo Retiree and comment thread over at Uncouth Reflections, centering around William Friedkin's movie Sorcerer but wandering outward to touch on New Hollywood's rise and fall.
Few topics are in such dire need of revisionist history in art and entertainment as the demise of New Hollywood. I won't try to do all of that in a single post. I will, however, copy and paste the two long-winded but hopefully insightful comments I left over there.
The standard story is that the grimy, hardcore-realistic, no-tidy-endings, fate's-a-bitch movies of the '70s were dethroned by crass commercial "fun for the whole family" movies like Star Wars and E.T. In reality, the '70s-style thrillers were not dethroned, jettisoned, or blasted into outer space. The tradition was extended by taking things in a more science-fiction direction, with a more satisfying catharsis by the end, a la Alien and RoboCop.
Examples from the later evolution of the grimy thriller are not optimistic, uplifting, fun for the whole family, or disposed toward mass merchandising and product tie-ins. ("Hey kids, this week only at McDonalds -- get a free CHESTBURSTER ALIEN toy when you buy a happy meal!")
This means the whole "Boo Spielberg" line of the pop history of New Hollywood is missing the big picture, and is just scapegoating the sci-fi chart-toppers like E.T. and Ghostbusters. The path from The Parallax View to RoboCop is more like an evolution of a species than it is a great extinction event.
* * *
I can’t comment on Sorcerer, but you can see why the grimy realism movement of the ’70s only lasted for a moment — it wasn’t stylized enough.
Fortune plays a huge role in real life, and there typically isn’t a memorable, satisfying resolution to most stories. Fictional narratives ought to stylize those details of reality so that the random texture of events is still perceptible, but not calling attention to itself the whole time. And they ought not to get in the way of the larger structure of beginning, middle, and end, including a climax for the audience to feel catharsis, followed by some resolution to bring their elevated state back down to the mundane level.
Movies where you’re going through a variety of different positions, sometimes easing into it and other times jack-hammering away, have to push you over the top and let you come back down to enjoy that refractory period. Otherwise it feels like your partner is fucking around with you, and wasn’t it so cool how she just up and left without any ultimate climax or resolution to the night’s roll in the hay?
Hey, I like a lot of the grimy, fate’s-a-bitch kind of movies, but you have to admit there’s a reason so few of them worked at the time, and why the movement didn’t have much momentum to keep it going longer. The range of stories that lend themselves to the approach is severely limited, and they’d more or less run through them all by the mid-’70s. Not every narrative calls for Sisyphus and Sado-Masochism, y’know? Not even a majority of them.
Posing this as a New Hollywood vs. corporate blockbuster thing is going too easy on New Hollywood’s superstar directors. It’s more like, stochastic and fatalistic blockbuster vs. purposeful and cathartic blockbuster, where a hero’s efforts achieve something.
The waning interest in attempted follow-ups to Chinatown owes more to the constraints imposed by human psychology on the viewer’s side, than to economic motives on the creator’s side.
* * *
Pursuing the topic of New Hollywood burning out and what replaced it — it wasn’t Star Wars, Star Trek, E.T., Ghostbusters, Indiana Jones, or any of the other optimistic blockbusters that the whole family could enjoy.
Those movies are so different in tone from Chinatown, Taxi Driver, and The Parallax View that, if they had truly displaced the ’70s thrillers, we’d have to conclude there was an abrupt U-turn in public tastes (or artists’ inclinations) around that time. Yet in general the second half of the ’70s isn’t so different from the first half of the ’80s — not one of those U-turn periods, artistically or pop culturally.
What represented the next step forward from the fate’s-a-bitch blockbusters of the New Hollywood period was the sci-fi thriller, whether dystopian or at least in the tradition of “Oh shit, mankind doesn’t belong in this hostile environment.” Alien, Escape from New York, Blade Runner, Videodrome, The Terminator, Predator, RoboCop, Total Recall.
They portrayed a grimy realistic setting (not an unconvincing emo caricature a la contempo sci-fi thrillers), where the deck is stacked against the puny heroes, and where cruel fate generally has its way with ordinary background characters and featured protagonists alike. Buuuut, where we sense a progression toward an ultimate climax of redemption — and where that is paid off by the end. This allows even the schlockier examples like Predator to satisfy the viewer in a way that the more pretentious examples in the Sisyphean / S&M approach cannot.
And by telling the story within science fiction, they could get out from the “is this story REALLY plausible?” constraint of the hardcore realism of their ’70s forerunners. Sci-fi is just plausible enough.
There was a related movement toward slasher thrillers in horror. Grimy, realistic, unrelenting hostile forces wipe out just about everyone, despite their best efforts and teamwork — but not everyone. There’s at least a Pyrrhic victory for the lone survivor. The supernatural element allows the story-tellers to move outside of the strict boundaries imposed by hardcore realism.
These related inheritors of the ’70s thriller both reached their end in the early ’90s, when Total Recall styled itself as the sci-fi thriller to end all sci-fi thrillers, and Twin Peaks styled itself as the supernatural thriller to end all supernatural thrillers.
Categories: Art, Literature, Movies, Psychology
June 12, 2014
Are Millennials consciously reversing their sheltered upbringing when they raise kids of their own?
An earlier post looked at the dynamics of parenting styles: folks who grew up mostly in rising-crime times choose to lock their kids up from the outside world, while those locked-up kids, having grown up in falling-crime times, don't see the harm in letting their own children lead a more unsupervised life.
We saw this during the last wave of outgoing behavior and communal focus. The Silent Generation, who were locked up by smothering mothers during the cocooning Midcentury, begat the late Boomers and first half of Generation X, who couldn't have enjoyed a less supervised childhood and adolescence. This continued with the second half of Gen X, whose parents were mostly early Boomers — who for their part spent a good deal of childhood in the Dr. Spock climate of the Midcentury.
I was taken aback during an episode of the Real Housewives of Orange County (which I sometimes tune into, for sociological insight), where the daughter of one of the housewives is beginning to raise a family of her own. The housewife Vicki is a late Boomer, the daughter Briana an early Millennial ('87).
Briana has decided to move away from Orange County, way out to Oklahoma, where her money will go a lot farther than it could in southern California. There's Steve Sailer's "affordable family formation" unfolding in clear terms.
Then she added that she wanted to raise her kids where they could run around in the driveway out front, and run off to go play with the other neighborhood kids. She revealed that in the 12 years that she lived in her family's house in Orange County, they'd only known two of their neighbors. The area is white and upper-middle class, so don't bother trying to excuse the helicopter parents there on the basis of dangerous ghetto Mexicans. It's just good old paranoia.
It may be only one data point, but you can tell when someone is speaking more or less as a representative of their group.
Before, I noted that Millennials feel nostalgia for not having had a life as children. Now that they are starting to have kids, this frustrated nostalgia has developed into a reflection on how deprived they were of social contact outside the home, from birth until college (by which time it's too late to cram 15 years of social maturation into the time you've got). Every generation remembers the negative side of its upbringing more than the positive, and tries to correct it when raising kids of its own.
Before long, then, we'll see a reversal of the helicopter parenting trend that began about 25 years ago. Probably not for another five years or so, since the late X-ers are still busy raising kids, and lord knows they remember how dangerous childhood used to be. Who would've guessed that Beavis and Butt-head would become such over-protective fathers?
Categories: Age, Cocooning, Generations, Over-parenting, Psychology, Television