Cats these days seem more bratty and dependent than I remember growing up. And more housebound and supervised by their caretakers — perhaps the cause of the greater brattiness and dependency, a la the children of helicopter parents? Just as there has been a huge change in parenting styles over the past 25 years, so has there been with pet-caring styles, and in the same direction.
I'm inclined to rule out genetic change, not on principle, only because the change has been so sudden.
Fortunately, cats are less obedient than children toward over-protective owners, so they still get out a good bit. But it still feels like they have a weaker public presence than they did in the '80s. One of our cats used to flop down into a deep sleep right on the sidewalk out front. Once he even got accidentally picked up by animal control because they thought he'd been hit by a car and was lying dead on the curb. Another cat used to climb trees and walk over our first-story roof.
Come to think of it, it's been a while since I've seen the "cat stuck up in a tree" scenario, in real life or in pop culture. The last I remember was having to climb up a tree and dislodge our cat, who liked to run up there, at least until the birds began dive-bombing him. That was the mid-to-late 1990s.
I'm not sure how to investigate this idea quantitatively, but it would be worth looking into by animal psychologists. There may be surveys of pet-owners over time about whether their cat is an indoor or outdoor cat.
March 30, 2014
Pre-screening peers on Facebook
Different generations use the same technology in different ways. Nothing new about that: as teenagers, the Silent Gen huddled around the radio set to listen to drama programs, quiz shows, and variety hours — akin to parking yourself in front of the idiot box — whereas Boomers and especially X-ers turned on the radio to get energized by music.
After reading more and more first-hand accounts by Millennials about how they use Facebook, it's clear that there's more social avoidance going on than there first appears. Refusing to interact face-to-face, or even voice-to-voice, is an obvious signal to older folks of how awkward young people are around each other these days. As recently as the mid-1990s, when chat rooms exploded in popularity, we never interacted with our real-life friends on AOL. It was a supplement to real life, where we chatted with people we'd never meet. Folks who use Facebook, though, intend for it to be a substitute for real-life interactions with their real-life acquaintances.
But then there's their behavior, beyond merely using the service, that you can't observe directly and can only read or hear about. Like how the first thing they do when they add an acquaintance to their friend list is to perform an extensive background check on them using the publicly viewable info on their profile. Who are they friends with? Birds of a feather flock together. Who, if anyone, are they dating? Where do they go to school, and where do they work? Where did they go to school before their current school, and where did they work for their previous five jobs? What's being left on their wall? What do their status updates reveal? Pictures? Pages liked? Etc.?
Damn, you've hardly met the person and you're already sending their profile pics to the lab in case there's a match in the sex offenders database. Back when trust was just beginning to fray, people (i.e., girls) would have maybe used the internet to check for criminal behavior, but that would be it. Now it's far worse: they scrutinize every little detail on your profile, and every trace you leave on other people's profiles. They're going far beyond checking for far-out-there kinds of deviance, and are trying to uncover every nuance of your life. Rather than, y'know, discovering that first-hand, or at most through word-of-mouth gossip.
Aside from feeling invasive ("creepy"), it betrays a profound lack of trust in all other people. After all, it's not this or that minority of folks who are subjected to the screening. Nope, it's like the fucking TSA waving each Facebook friend to step up to that full-body scanner. "OK, open up your profile, and hold still while we scan it... All right, step forward... and... you're clear." What if I don't want to "hit you up on Facebook," and keep that part of me private? "In that case, you can choose the alternative of a full background check by a private investigator, and provide three letters of reference." Just to hang out? "Well, you can never be too safe. It makes me feel more at ease." Yeah, you sure seem at ease...
Moreover, it destroys any mystery about a person. Remember: mystery is creepy, in the Millennial mind. An unknown means that something's going to pop out of the closet and scare them. They have to open every door and shine a flashlight over every micro-crevice of the home of your being, in order to feel secure.
It also delays the getting-to-know-you process, and interrupts the momentum whenever it gets going.
Trust greases the wheels of sociability, and it used to be normal to meet someone for the first time in the afternoon and feel like you knew them well by the end of the evening. Not just boys and girls pairing off, but also same-sex peers making quick friends, particularly in an intense setting like a concert or sports game. These days, there's this nervous laughter, sideways stare, and lack of touchy-feely behavior (for boys and girls) or rowdiness (for same-sex peers).
I think it's probably too late for the Millennials to be saved from their awkward, pre-screening behavior on social networks. They're already in their 20s and set in their techno ways. Hopefully by the time today's post-Millennial kindergartners become teenagers, they'll look at the choice of Facebook as a real-life substitute, and let out a great big BOORRRRRRINNNNNG. Locking yourself indoors and huddling around the radio set listening to soap operas died off fairly suddenly, and there's no reason that the next generation can't kill off Facebook just as quickly.
Categories:
Dudes and dudettes,
Generations,
Psychology,
Technology
March 29, 2014
Were you into vinyl before it was cool?
By "being into," I mean that it was a conscious choice among alternative formats. And browsing through the NYT archive for articles about "vinyl," it looks like it became cool during the 2000s. Before then, when it was also a conscious choice, is roughly the mid-to-late 1980s through the '90s.
Today I was reminded of the two main things that turned me on to records in the mid-'90s: selection and price. Nothing romantic.
Not the sound quality — it sounds great, and distinct from CDs, but not a difference that would make me want to convert my CD collection into vinyl. And not the status points — back then, there were no status points to be gained, and even now I would gain little once I revealed what it was that I bought on vinyl (not classic rock, punk, grunge, or indie).
I'm visiting home and stopped by the local thrift store, which had several dozen crates full of records. I hadn't even gone there looking for them; I just figured why not browse around after having scored something that I did come looking for, a vintage afghan (black with gold and cream patterns). Right away they started popping up, and within ten minutes, I had a handful of albums that are difficult to find on CD, let alone for a buck apiece in mint condition. The Bangles, David Bowie, Bonnie Tyler, The Go-Go's, Paul Young, Bryan Adams, and Stacey Q (only a 12" single, but more than I've ever seen on CD).
The only one that I've even seen before on CD ("in the wild") was All Over the Place, the debut power-pop album by the Bangles. I have that on CD, but the hits from the others I only have on compilations or greatest hits: "Blue Jean," "Total Eclipse of the Heart," "Every Time You Go Away," "Two of Hearts"... and I don't think I have "Head Over Heels" or "Summer of '69" on anything. Yet for the price of one crappy download on iTunes — with 95% of the sound compressed out — I got the entire album that each hit song came from.
My purchases may be new wave and synth-pop today, but 15 to 20 years ago I was likewise bored by contempo music and looking back for something entertaining. As now, it was mostly the '80s, a decent amount from the '70s, and a handful from the '60s. Only more on the avant-garde or experimental side, compared to the more well-known stuff that I dig now. Starting in college, you should be getting over your adolescent insecurity about needing to prove how obscure and unique your tastes are, and that's the only real change in my vinyl-buying habits — who the groups are, not the larger reasons of selection and price.
Some of that stuff was available on CD back then, but it was expensive. On the CD racks at Tower Records, there were several columns of just Frank Zappa, but they were closer to $20 than $15. If you dropped by any used record store, though, you could find them used and in good condition for about five to ten bucks.
And other material was either never released on CD, out of print, or otherwise damn rare to find in that format. Yet I had no problem finding a couple of albums and a single by Snakefinger, the guitarist who frequently collaborated with The Residents. But unlike the most famous obscure band in history, Snakefinger was actually a working musician instead of a performance artist, and was a superior songwriter and performer.
As an aside, if you're looking for something unheard-of, but can't stand how weird the typical weird band sounds, check out his stuff from the late '70s and early '80s. He recorded a cover of "The Model" by Kraftwerk that's more uptempo and danceable yet also more haunting than the original. (None of his own music is very dance-y, BTW, in case you're allergic to moving your body.)
Anyway, it struck me as odd that someone would be into vinyl for practical reasons. There really is a lot out there that can be had for cheap if you buy records, without compromising sound quality.
It's not an analog vs. digital thing either. Tapes are analog, but they sound pathetic compared to either records or CDs. Video tapes are the same way. Does that make laser discs the next target for hipster status contests? If so, better hit up your thrift stores soon before they're scavenged by the status-strivers. At the same place today, I found a perfect copy of the director's cut of Blade Runner on laser disc for five bucks. Don't know when I'll be able to actually play it, but...
Categories:
Music,
Pop culture,
Technology
March 25, 2014
Don't improvise in period movies
Got around to seeing American Hustle tonight, and liked it better than most new movies I see. One thing took me out of the moment several times, though -- the improv scenes.
In a period movie, actors should stick close to the script because they are unlikely to be able to improvise and maintain the period's authenticity on the fly. When the acting is more spur-of-the-moment, it will come from the actor's gut, which is tuned to the here-and-now.
There was a scene where Irving Rosenfeld asks "Really?" in a slightly flat and miffed voice, in response to someone else's ridiculous behavior. That's too specifically from the 2010s. At the end, when Richie DiMaso does a mock performance of his namby-pamby boss, it's so over-the-top, and so much laughing-at rather than laughing-with, that it feels more like a frat pack movie from the 21st century. Ditto the scene where DiMaso is trying to convince Edith to finally have sex, where the dialog sounds like it's from a doofus rom-com by Judd Apatow.
These and other scenes should have been in the outtakes -- wackiness ensues when the actors break character! When they're left in, it creates jarring shifts in tone, as though an actor who'd been speaking with an English accent switched to Noo Yawka for half a minute, then switched back to English (all for no apparent reason).
Improvising and wandering slightly outside of the character's range isn't that jarring. Like, maybe you're just seeing a slightly different and unexpected side of them in this scene. But anachronism is not so easy to suspend disbelief about -- it definitely does not belong to that period. If they turn on the radio in 1980 and it's playing Rihanna, that kills the verisimilitude.
It's even more baffling in American Hustle, where the costume, make-up, and production design have been so meticulously worked to make you believe you're looking in on a certain time and place. All it takes is a series of distinctively 21st-century improvs to throw that into doubt for the viewer.
Categories:
Movies,
Psychology
March 24, 2014
Field report: Bowling alley birthday party
A party at the bowling alley became a common option by the late '80s, and I went to one today, so they're still at least somewhat common. Yet in 25 years a lot has changed, reflecting the larger social/cultural changes.
First, the guests are dropped off at the bowling alley itself — not at the birthday boy's house, where they'd pre-game before the host parents hauled them off in a couple of mini-vans to the bowling alley. I've written a lot about the death of guest-host relationships during falling-crime / cocooning times. Each side is suspicious of the other — the hosts fear being overrun and taken advantage of by guests, and the guests fear mistreatment (or whatever, I'm still not sure) by the hosts. It is primarily a guest-side change, just like the decline of trick-or-treating or hitch-hiking.
Now they have to meet in a third-party-supervised place like a bowling alley (or the mall / shopping district for trick-or-treating). Before, they would've met inside the host's home.
Then, once the kids are dropped off, the parents hang around to some degree until the event is over and they take their kid back. (Every kid today had a parent present.) Rather than drop them off, go do something for a while, and either pick their kid up later or have them dropped off by the hosts. Again we see parents being so paranoid that they won't just leave their kids alone in the company of the hosts. Even if it's a bowling alley with only nuclear families present, in a lily-white region and neighborhood, during a Sunday afternoon.
(I can't emphasize often enough that, just because you live in a diverse shithole part of the country, doesn't mean that everyone else does. If parental paranoia is palpable in an all-white smallish town — if it is a national phenomenon — then it does not have to do with protecting kids against the dangers of a diverse megalopolis.)
At least the parents won't hover directly over their child at a party, but the assembled grown-ups will form a ring around the kids, or form a side-group of chatting grown-ups next to the kids. Line-of-sight supervision remains unbroken. They can't trust their kid to use his own brain because he doesn't have one — intuition requires experience, and helicopter parenting blocks out experience from their kid's upbringing. It's more like programming.
One of the kindergarten-aged guests politely declined a piece of birthday cake because it had "artificial frosting." Not that he could have known that (for all I know, it was made of natural poison). Still, as though organic sugar wouldn't wind you up, put you in an insulin coma, rot your teeth, etc.
A lot of these brand-new food taboos that were wholly absent during the 1980s are just roundabout, rationalized ways to fragment the community. Sorry, my kid can't come to anyone's party because you'll have traces of peanuts in something and he'll die. So don't mind his absence from all celebrations that might bind a peer group together.
As for the actual bowling, you see both major societal influences at work — the self-esteem crap and the hyper-competitive crap. Quite a combination, eh? All of our kids are going to compete as though their lives depended on it, yet they're all going to enjoy the exact same rewards afterward.
The bumpers being up is not about self-esteem. That's just helping them learn, like riding a bike with training wheels at first. But everything else is. Like letting them cross the line as much as they want without penalizing them at all. Or cheering after any number of pins get knocked down — don't you think the kids can tell that they'll get praise no matter what they do? And hence they can just BS the whole thing and get full credit? Gee, I wonder if that'll pop up a little later in life...
The hyper-competitive stuff is way more visible and more offensive, though. If they knock down more than 5 pins, they're going to do some kind of victory dance, boys being worse than girls. Congratulations: it should've been a gutterball, but the bumpers let you knock down 7 whole pins. "Oh yeah! Uh-huh!"
And they're so eager to generally show off that they don't care how awful their technique is. Not like I'm even an amateur bowler, but I know that we're not doing the shotput here. The boys are again way worse than the girls on this one. (The winner today among four boys and two girls was one of the girls. She creamed the rest, not from being good, but from not crashing and burning in an attempt to show off.)
Have you guys seen kids throw a bowling ball lately? When I was their age, we stood with our legs wide apart, held the ball from behind with both hands, and rolled or lobbed it as close to the middle as possible. It's a granny-looking move, but when you're in kindergarten, you don't have the upper body strength to throw it in a more normal way. Heaven forbid you teach that lesson to kids these days, though. They're going to prove that they can do it. Only not.
They carry the ball with both hands near their chest, running up to the line with their left side forward (if right-handed), and then heaving or shotputting the ball with their right hand, turning their upper body to face the pins when they're near the line. This must be the worst way to release a bowling ball, and if the bumpers were not up, every one of these releases would go straight into the gutter. Not meander their way into the gutter — like, not even halfway down the lane, it's already sunk.
One kid did this with such enthusiasm that after shotputting the ball, his upper body carried itself forward over his feet, and he landed on his hands and knees — over the line, every time.
So, who cares if they're not trying to achieve the goal that the game assigns them? They're showing how eager they are to display intensity ("passion," later on), and that's all that matters in a dog-eat-dog world. The rules can be bent or changed on the fly, as long as the most intense person wins. After all, the parents aren't correcting or penalizing them.
One final odd but sadly not-too-surprising sight: the setting was a college student union, yet there were only two groups bowling, both family-and-kid-related. There were a handful of students playing pool for about 10 minutes, then that was it, no replacements. No students in the arcade area. On the other side of the rec area was the food court, which was closed on Sunday but which still had about a half-dozen students spread out.
Doing what? Why, surfing the web — what else? Most had a laptop out, one was on a school-provided terminal, and one girl was reading a textbook or doing homework with her back to everyone else.
If you haven't been on a college campus in a while, you'd be shocked at how utterly dead the unions are. Like, there are stronger signs of life in a gravestone showroom. Most of the students are locked in their rooms farting around on the internet / texting or video games. The few who venture out go to the library or the gym, where they can be around others and within the view of others, but still hide behind the expectation that you just don't go up and interact with people in those places. Unearned, risk-free ego validation — what's not to love for Millennials?
Categories:
Age,
Cocooning,
Over-parenting,
Pop culture,
Psychology,
Sports
March 22, 2014
Another path from helicopter parenting to egocentrism
I'm visiting my nephew and got dragged into playing Legos with him. "Dragged" not because I'm heartless, but because I want to keep sufficient distance in case I need to assert authority. Remember: being your kid's friend undermines your authority. It's the kind of thing he should be playing with his peers, not with a family member who's 27 years older than he is.
At any rate, he's so excited to show you this thing he made and that thing he made, and what this design in the booklet looks like, and what that one looks like. I found myself the whole time saying "cool..." or "man..." or staying silent. All you get is agreement from family members, even if they don't really think it's cool. Family members have to treat each other more nicely than if one of them were an outsider.
It's your peers who would pipe up with "BORRRINNG." Like, "Hey Uncle Agnostic, you wanna watch Sponge Bob?" I can't tell him, "Nah man, that shit's boring." But one of his friends hopefully would. "Aw c'mon, change the chanelllll. Sponge Bob sucks." Then they'd engage in a little back-and-forth, give-and-take, until they compromised.
With family members in an era of family friendliness, the grown-ups tell the child that whatever they like is awesome. No need to change, improve, or compromise your interests and tastes.
This ego sheltering can only last so long, though. What happens when the kid starts to interact with his peers at age 25 or whenever it is with you damn Millennials? His whole formative years up to that point have prepared him to expect that other people will find his interests fascinating and his tastes impeccable.
Then, SLAM — your peers shrug off many of your interests and find your tastes average. That's typical, and not the end of the world. But with no preparation for it, the ego faces this challenge in such an atrophied state that it gets utterly demolished.
To pick up the pieces: "Well what do those idiots know about awesome anyway? They're probably just jealous. I don't need their confirmation anyway." Now they're headed down the path of social withdrawal and misanthropy. They'll grow suspicious of their so-called friends who aren't 100% enthusiastic about their interests.
Parents in the '70s and '80s used to view their job as preparing their kids for the tough and unpredictable world out there, not to insulate them from it. It'll slam into them at some point, so might as well make sure they've grown to withstand it. Parents stayed out of our lives even when we were children. By encouraging us to go out and make friends on the playground, or at school, or around the neighborhood, they helped us discover the shocking reality that not everybody is as interested as we are in the stuff we're interested in.
That not only taught us to negotiate and compromise with someone who wasn't on the same page as us, but also to seek out new friends who would be closer to us. That way we have close friends with whom we don't have to struggle much just to get something done, and other friends or acquaintances with whom we have to make more of an effort to do things — without shutting them out because of that. Each side goes back to their closer circle of friends afterward and engages in some good-spirited gossip about how weird the other side can be sometimes.
This is a separate effect from over-praising the kids' efforts and output. That shelters their ego about their capabilities. This is about what gets their attention, what motivates them, their interests and tastes. It's much closer to the core of their identity than their capabilities, so that questioning it is far more likely to be perceived as doubting who they are as a person.
That will trigger a much more desperate defense: "What do you mean, you don't like Harry Potter? Here is a PowerPoint of the Top 10 reasons why you must, unless you're a big fat stupid idiot."
Never having your tastes questioned — not until well into your late teens anyway — leads to another major problem that you see with Millennials. They can't separate objective and subjective discussions about something they like. It all boils down to the subjective. The objective is only a means toward that end, as though an objective argument would force them to decide one way or another on the subjective side of things.
To end with an example, most of them like music that is not very musical. That is an objective claim, easy to verify. If it's what floats their boat, I guess I'll just have to consider them lame, and they can look at me playing a Duran Duran album as lame. But objectively speaking, "Rio," "Save a Prayer," "Hungry Like the Wolf," etc. are more musical. More riffs, motifs, more varied song structure (intros, bridges, outros), more intricate melodic phrasing, richer instrumentation (actually hearing the bass), instrumental solos, greater pitch range for the singing, more required to write the instrumental parts, and so on.
I know some Boomers who see that spelled out, compare it to what Pandora says about their Sixties faves, and respond self-deprecatingly with, "Meh, I guess I get turned on by simplistic music then!" as opposed to fancy-schmancy music. Millennials get bent out of shape, though, as though "logic has proven my tastes inferior — must re-inspect the logic."
But that's what happens when your tastes rarely get questioned during your formative years. You don't appreciate that there could be two separate ways that this could happen, one objective and the other subjective. In fact, being told that your favorite TV show is boring would probably have introduced you to the objective side of things, when you asked them why they felt that way. "I dunno, it's like none of the challenges the characters face actually matters. The motivation feels empty." Ok, they wouldn't phrase it that way, but you know what I mean. Usually their response would amount to more than just, "I dunno, it just sucks."
Categories:
Age,
Cocooning,
Generations,
Music,
Over-parenting,
Psychology
March 21, 2014
Ancestry in India revealed more by dance than by language
This continues an earlier post, which gave evidence that traditional dance styles are a better predictor of genetic ancestry than the language spoken.
Recall that Hungarians speak a non-Indo-European language, yet their folk dances place them squarely within the Central European -- and even Indo-European -- norm of step dancing. The style is characterized by fancy, percussive footwork and by keeping the body's vertical axis more or less in place.
Language is a little too utilitarian of a trait to be free from cultural selection pressures, hence the handful of cases like the Hungarians who adopted the language of genetic outsiders for one good reason or another. (To trade, to serve as clients to their patrons, to move up the status ladder dominated by foreigners, and so on.)
Dance styles do not play such a utilitarian role in people's lives, and there is a very strong sense of "this is how the dance is done," so they are more conservative against selection pressures to adopt foreign traits. Cultural and genetic ancestries will tend to overlap, but they overlap much more closely when dance is singled out as the cultural trait, and not quite so closely when we look to language.
Now for the picture on the other end of the Indo-European world. There are two major language families represented in India, Indo-European brought by one branch of the, er, Indo-Europeans, and Dravidian, which better reflects the state of things before their arrival. Here is their distribution today, with the I-E languages in yellow and orange, and Dravidian in green:
Judging from language would lead us to expect I-E genepools everywhere but the southern third of the country and a fringe along the Himalayas. But when we look at the frequency of key genetic markers of the I-E migrants, such as lactase persistence, we see them confined more tightly to the northwestern regions (see this broader discussion by Razib):
The Indo-Europeans were (agro-)pastoralists, so they would have been under strong selection pressures to get more calories out of milk through metabolizing its sugar. Before the I-E folks showed up, there was a large-scale sedentary civilization based on agriculture in the Indus Valley, and you don't need to digest lactose if you're subsisting on grains. So, the blue regions above are more likely to be the descendants of that pre-IE civilization, driven somewhat to the south and east by the I-E pastoralists who settled in the northwest.
Notice that the language map vastly overstates how genetically Indo-European the Indians are. However, looking at language as a utilitarian trait, rather than a neutral one, we infer that large swaths of the Subcontinent who are genetically not-so-Indo-European have adopted the languages of foreigners for one good reason or another.
This disconnect is also evident in outward appearances. Take an eastern group such as the Bihari, who speak I-E languages. They have fairly dark skin, noses that are wide at the bottom, and low-lying nasal bridges. They would not look out of place in Southeast Asia or the Pacific Islands. Contrast that with the relatively lighter-skinned and hawk-nosed people of Pakistan, who would not look so out of place in Iran.
But external appearance is not a cultural trait, so let's turn to dance. Of the roughly 10 canonical forms of Indian classical dance, only one is native to the northwest -- Kathak (see an example clip here). It is a spirited step dance straight out of the Indo-European playbook. In fact, it looks a lot like Flamenco (example clip here for those living under a rock), carried into Europe by the Gypsies, who also speak an I-E language and who also hail from northwestern India -- having left some 1,500 years ago. The common cultural ancestor of today's Flamenco and Kathak must go back even further. Here is a clip of a dancer from each style comparing and contrasting their similar styles.
All of the other classical dances of India belong to the blue regions in the lactase map. They ought to look quite different from Kathak, and they do: the movements are slower, limber poses are held as though the dancers were sculptures, and the body's vertical axis is routinely bent out of alignment. This is what most of us think of when we hear "Indian classical dance." They are also likely to feature costumes in a variety of garish colors, ornate headgear, and masks or make-up with exaggerated expressions. All of these features may tie together in a narrative / opera form, not just dance by itself. See this clip of the Bharatanatyam form, and this clip of the Kathakali form.
Those features place almost all Indian classical dances with those of Southeast Asia, from Thailand down into Bali. In fact, some of the distinguishing movements would not look out of place in a crowd of people practicing Tai Chi, nor would the costumes look so strange to a crowd used to Chinese opera or Japanese Noh theater.
Once again, dance styles are more faithful informants about a group's genetic ancestry than language is. You may adopt an Indo-European language in order to trade with, apprentice under, etc., a newly arrived powerful bunch of foreigners. But when it comes to expressing your race's identity in a communal setting, the dancing will be Dravidian.
Categories:
Dance,
Evolution,
Human Biodiversity,
Language
March 19, 2014
How superficial and bratty were the Silent Gen as youngsters?
A recent post looked at how Millennials (the females anyway, but probably also the males) have a shocking level of superficiality regarding the opposite sex. Girls don't want a guy because he's "cool" but because he's "hot." I attributed this to the helicopter parenting and cocooning environment that they grew up in -- social avoidance and distance lead you to only notice and value people's appearances.
Did something like this happen during the previous heyday of cocooning?
The Mid-century was part of the Great Compression, when competitiveness and squeaky-wheel entitled-ness were steadily falling. So, young people back then -- the Silent Generation -- did not live by a social code of dog-eat-dog. And yet, their formative years were part of the Dr. Spock / smothering mothers approach to treating children, which seems to be a feature of cocooning and falling-crime times (cocoon the children for their safety, no matter what it turns them into).
Hence, we expect them to look like Millennials, only in a way where everything is not a status contest.
Recall this post about how Mid-century singers and actors / actresses were required to be quite attractive. Throw in the election of JFK, and even politicians fit into that pattern, to a lesser extent. That had totally changed by the Eighties, when major entertainers looked more like Bill Murray, Sigourney Weaver, Phil Collins, and Ann Wilson from Heart. And our President looked like Ronald Reagan.
Since the '90s, it's swung back toward the Mid-century norm of superficiality, with Justin Bieber and Katy Perry as today's major entertainers (or attractive TV news readers, or attractive sideline reporter chicks, or attractive spokeswomen for breast cancer, or...). I don't know how good-looking Obama is perceived to be, but he was elected on superficial grounds -- he's got dark skin, so it'd be like My Cool Black Friend running the country! Don't bother checking the individual in question to see if he really is the cool-black-friend type or not... And Republicans are no different, with Mitt Romney and Sarah Palin being much better-looking than major aspiring politicians from the '70s or '80s.
What about brattiness among Silents? I haven't gone through all the major hits of the Mid-century, but I can think of two just off the top of my head: Lesley Gore's back-to-back hits of 1963, "It's My Party" ("and I'll cry if I want to, cry if I want to, cry if I want to") and "Judy's Turn to Cry". The first hit #1, the second #5, and both made the year-end Billboard charts.
You might object that these were for 14-year-olds, who would've been early Boomers, not Silents. True, but I'm talking about young people during the Mid-century more than about a specific generation. If that included some early Boomers toward the end, that doesn't change the focus.
Finally, check out this sit-com / product placement special from 1952, "Young Man's Fancy". The Silent Gen daughter, Judy (that name again), acts like a spoiled brat toward her mother, though her callous and dismissive attitude is more calm and carefree than the hostile tone that her Millennial descendant would take.
When she hears that her brother's friend from college is coming over for dinner, she gets all disgusted by the sound of his name (Alexander Phipps) -- with a name like that, he's bound to be an ugly geek! He turns out to be good-looking, so she naturally responds by cursing her brother for not warning her that he was bringing home a hot guy (or whatever she calls him). Now she has to make herself up with no time. When she yaks on the phone with her friend about him, it's only his cuteness (or whatever the slang was) that she spazzes out about.
That spastic reaction reminded me so much of how Millennial girls respond to the Random Hot Guy type. His appearance is the only thing they notice, not his character or other appeal. They only want to get noticed and thereby ego-validated by the Random Hot Guy, not actually open up to, get close to, or do anything with him. And they flip out when they see one, as though they hardly get to enjoy the opportunity. Because everyone is so sheltered and doesn't run into each other so often? Or because by choosing only based on looks, there just aren't as many who make them feel butterflies in the stomach? Beats me.
Eighties babes were more collected around guys, whether because it wasn't such a rare thing to be next to and interacting with them, or because all sorts of guys appealed to them. It was also ultimately uncool to be seen as a spazz, so they must have developed greater skill at composing themselves around others.
I realize that the Fifties sit-com doesn't feature the best acting, isn't a documentary, and was produced mostly for product placement. (Gee, how neat is it to be watching the program on an ELECTRIC computer screen?) Yet the character type they were going for, to portray "kids these days," was a callous, superficial airhead. And I'm sure the actress had experience with that type of person, at least from observing and interacting with her peers, and perhaps from personal inclination too.
Teenagers in the Eighties were not shown that way -- they were more concerned, thoughtful, and interested in getting to know what made someone tick. I get a similar impression of youngsters from the Twenties based on Fitzgerald's short stories, and plot synopses of silent films about the Flaming Youth of the time.
In both eras, they had an anti-authoritarian kind of attitude, but that is not callous dismissiveness directed toward all people who are not hot. And in both eras, young people were all about having fun and living life -- and good looks may only get you so far along that path. Wanting to have fun in a social setting makes young people focus more on what others are like inside -- are they an instigator or a killjoy?
Clearly, "further research is needed" in the matter, but even this cursory look has turned up more evidence that you would've expected for a superficial and bratty mindset among young people during the Mid-century.
Categories:
Age,
Cocooning,
Dudes and dudettes,
Generations,
Media,
Movies,
Music,
Over-parenting,
Pop culture,
Psychology
Forget it Jake, it's Malaysia-town
The near complete inability of the nations in eastern Asia to reveal what information they may have and cooperate during an urgent, life-and-death matter should temper the enthusiasm that many Westerners feel for glorious Asia — what with their high IQ levels (by global standards) and their low homicide rates.
Rather, this groping-through-the-labyrinth of an investigation should serve to revive an older description — "the inscrutable Oriental." It's all about keeping your guard up, wearing an unreadable stone face, and covering your ass / saving face above all else.
And it is not only in lending a hand that the Asian is stingy. Merely requesting help from others is seen as a sign of weakness, incompetence, "Why you so lazy?!" etc., and must only be resorted to long after it has become obvious that you're not as all-powerful and all-knowing as you'd thought. The level of hubris among Asians is astounding: no matter what the calamity, the in-over-their-heads team is bound to respond with, "Back off, man — I got this."
Why any Westerner would want to import large numbers of denizens from a black hole of trust, is beyond me. But probably boils down to them being a nerd who lives in the abstract and the hypothetical, where BRAINS + DOCILITY = MAX STATS, and having little connection with the real world, where these ghost people are among the worst neighbors and citizens you could ask for.
Categories:
Human Biodiversity,
IQ,
Morality,
Politics,
Psychology,
Violence
March 18, 2014
Corporate inability to discriminate
Last night I had nothing to eat in the house, and I hadn't eaten in eight hours (and then, not that much). So I walked over to the nearest McDonald's, where the drive-thru is open 24 hours. Shouldn't matter if I don't show up in a car, right?
Well, the cashier pokes his head out of the window while serving the truck ahead of me, looks over, and asks if he can help me. "Can I help you?" = Go away. I asked if I could still order without being in a car, and he said he couldn't do it. When I pressed him politely, he said he got yelled at by his manager the last time.
There was no one behind me, I was not dressed like a bum, I would have gotten off their property after ordering, etc. And of course they would have made a little profit off of the order. Multiply that by all the other normal people who would patronize the drive-thru window on foot, and that's a decent chunk of change they're leaving on the table. They're already open, and business is always slow during the late hours, so what gives?
They clearly have some kind of policy, since the guy's manager is strictly enforcing it. Presumably that comes from higher up the chain of command. Who knows what the reasoning is — perhaps keeping away bums, robbers, and so on? The point is: it's obvious I am not one of those people, so let me order my damn bacon McDoubles and give you money.
Or maybe the goal is to avoid lawsuits if I got hit — except that there were no cars behind me, or even in sight pulling into the parking lot. Just the truck ahead of me, which I stood a good several feet behind. How could I sue for getting hit by a ghost car?
No matter what hypothetical reasons are given, none applied in this concrete case. This silly example nevertheless reveals something rotten about the mega-corporate model — the inability of any of the workers, from top to bottom, to use their brain and discriminate.
Like, "Yeah, we don't want drugged-out bums or robber-looking dudes showing up, but you look fine." (And anyway, what's to stop a robber from showing up to the window in a getaway car?) Or "Well, normally we don't want people standing in between a line of cars with impatient drivers, but since there's no cars behind you, I guess it's OK."
Corporations instead insist on a culture of law — given how enormous and broadly distributed they are, and how top-down in command structure, their efficiency would get thrown off if every actor were allowed to pause, reflect, and discriminate based on what his local concrete circumstances appeared to warrant. Rather, the decisions are codified in an explicit set of rules or laws that applies to abstract categories of people — if the customer arrives in a car, let him order; if not, turn him away. That must be where the profit-left-on-the-table is made up for — this petty legalism is what allows the whole mega-corporate machine to operate in the first place.
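To make that contrast concrete, here is a toy sketch in Python (purely illustrative; the function names, the customer fields, and the policy itself are hypothetical, not anything McDonald's actually codifies). The first function is the culture-of-law version, which checks only the abstract category; the second is the local-judgment version, which weighs the concrete circumstances the worker can actually see.

    # Hypothetical sketch: rule-by-abstract-category vs. judgment-by-concrete-circumstance.

    def codified_rule(customer):
        # The rulebook version: the only fact that matters is "arrived in a car or not."
        return customer["in_car"]  # True = serve, False = turn away

    def local_judgment(customer, cars_waiting_behind):
        # The discretionary version: weigh what the worker can observe on the spot.
        looks_safe = not customer["appears_threatening"]
        no_traffic_risk = cars_waiting_behind == 0 and customer["standing_clear_of_cars"]
        return looks_safe and no_traffic_risk

    walk_up = {"in_car": False, "appears_threatening": False, "standing_clear_of_cars": True}
    print(codified_rule(walk_up))      # False: turned away no matter what the circumstances are
    print(local_judgment(walk_up, 0))  # True: served, since nothing concrete argues against it

The point of the sketch is that the second version only works if the worker is trusted to make the call, and that trust is exactly what the rulebook is designed to remove.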
I don't care what the most rational or profit-maximizing way is for McDonald's to run its business. I don't own any stock in the company, and I don't know anybody who works there and depends on them for their welfare. But I do resent being treated like some formless representative of an abstract category (the customer). And the more that large top-down corporations take over the industries that serve us in one way or another, the more we can expect to get disrespected by some pantywaist who says he's just following the rules he's given.
You hear rationalizations like, "If we let you order here, we'd have to let everybody order here." Why? Are you completely blind, where every customer appears equally risky, despite you knowing that most are safe and only some are dangerous? I don't care where you set your threshold for "this customer appears too sketchy to allow to get close," but it sure as hell ain't going to include me, or most folks. That's paranoid and disrespectful. Judge me like a particular human being, not some abstract type that you can discover no information about.
Use your brain, not your mind.
Everyone knows how dehumanizing it becomes for workers within a corporation after a while. This is different, and worse: they take the same clueless, legalistic approach with their customers too. Taking into account all corporations, and looking at both workers and customers, that's... uh, just about everybody who's affected.
And no, I didn't go all Falling Down on the McDonald's cashier. Felt like it would've been pointless with no one around to hear it and agree.
Categories:
Economics,
Food,
Morality,
Psychology
March 17, 2014
Death of the high school dance
Here is a nice little report from the NY Post about the decline of high school dances in Westchester County, an affluent suburban area outside of New York City.
This time they dug up some numbers to put on the phenomenon — although when the schools don't hold any dances except prom, and even then the kids are eager to just get it over with, perhaps leaving early, you can already tell the situation is dire.
It wasn't because of blacks, poors, etc. Kids these days just don't feel comfortable being around other people, no matter if it's their peers or so-called friends. The article tries to rationalize such profound anti-social tendencies as the only logical response to a world of texting on personal phones, snuggling up in bed with a laptop to use Facebook, Instagram, or whatever.
Well, I don't do any of that stuff, nor do most people over a certain age. We're just as bombarded by all of that crap; it simply doesn't resonate very much with us. We're what you call "halfway normal." Only deeply distrustful people would choose to adopt the texting and social networking technologies and use them as substitutes for the real thing, rather than as a supplement.
LOL at the author's attempt to confer an air of rebelliousness on today's youngsters, suggesting that their use of texting and internet exchanges allows them to interact outside the watchful eye of their parents. Their exchanges may not be monitored, but they're still locked up in the home with nowhere to go and no one to interact with for real.
It's depressing that they don't think to disobey their parents — which you ought to do when they've gone off the deep end — but perhaps after 15 to 20 years of solid helicopter parenting, you might feel just as apathetic too? I doubt it, but perhaps.
Dances are needed to bond a group together; every culture in the world does it. Now, parents conspire to prevent their kids from establishing bonds with their peers, and the kids themselves are too full of themselves to feel like it'd even be worth "forming bonds with my peers — as if they were cooler than moi. Seriously."
When both the parents and the kids are against the community, what is the community to do?
Categories:
Cocooning,
Dance,
Dudes and dudettes,
Education,
Over-parenting,
Psychology
March 16, 2014
Gen X vs. Millennials compared in a single book
Greeting me as I walked through the door at Urban Outfitters today was a book on display, X vs. Y: A Culture War, A Love Story. It compares and contrasts the two generations, mostly from the vantage point of the media experiences of their formative years. (Here is the book's website, and here are some excerpts.)
That framing right there overlooks some of the major differences, since Millennials prefer things and virtual reality over people and real life. Much of what makes Gen X different was their largely unmediated upbringing. And not because of technological change — they had TV, movies, novels, magazines, comic books, video games, portable music players, etc. They just didn't live their entire lives in the world of media.
Those are the most striking changes I've documented over the past three or four years, off and on. Not hanging out with friends to play a sport, not getting a driver's license, not interacting with the opposite sex, not passing along folk / oral culture (schoolyard songs, games, urban legends), not spending your childhood outside, and so on. What cartoons were like then and now — sure, that's changed too, but that's not primary.
At any rate, the book seemed like a decent read from only having flipped through and skimmed pieces. It's not meant as an academic or journalistic book that treats things at an abstract level. It's almost all nitty-gritty details about what makes the two generations (mostly) different or (sometimes) similar, across a variety of cultural domains.
The authors are half-sisters born 14 years apart, one in '71 or '72, the other in '85 or '86. The co-authorship and family relationship makes the tone more sympathetic toward the other side — whether they deserve it or not. And it's not one of those generic, lazy tones that reassure us we're all formed from the same mold, just somewhat differently. They recognize and detail how different the gens are, they are just trying to make love, not war. Given how different — how opposite — the two groups are, though, that's a bit naive.
The older sister, Eve, is at the ground zero of her generation, but her sister Leonora is one of the earliest Millennials and probably not the most representative. Seems to me you need to get to late '80s / early '90s births before they feel more palpably Millennial. This is another weakness of the sister-sister co-authorship — to get a prototypical member of each gen would have required the younger author to be born about 5 years later. On the other hand, it's worth having a younger author who is still old enough to get reflective about the course of their generation's experiences.
I was surprised to see, at the very beginning of the book where they define their terms, that they echo my take on those born between '79 and '84. Canonical Gen X ends at '78 births for me, and I used to distinguish the '79-'84 cohort as Gen Y, with Millennials coming after them. But it seemed silly to have such a small "generation," so I just lumped them in with Gen X. Nothing unusual about sub-grouping within a gen — look at the early and later Boomers, for instance.
Why not lump them in with Millennials on the other side? I dunno, they just don't fall that way when you push them. The authors make the same claim, having asked around to lots of folks in that mini-range of Gen X. So I'm not the only member of that cohort who's noticed that we are more like peripheral, Johnny-come-lately X-ers, not really our own gen, and DEFINITELY not part of the Millennials.
We feel uncomfortable with our designated membership, not because we have beef with (canonical) X-ers, but because we always thought of "Generation X" as the cool older kids when we were 13 or 14, not us. Perhaps that was the influence of Gen X as advertising brand rather than as social group.
The '79-'84 cohort is distinguished by going through puberty right at the moment when the entire social-cultural momentum was grinding to a halt, and swinging back in the opposite direction, circa 1992. These folks spent half of their formative years (childhood) in the good old days, but another half (adolescence) in the Nineties. And yet they are not evenly pulled in both directions — they are more akin to people who went through puberty in 1982 than in 2002.
Evidently, childhood exerts a stronger influence on the developing, impressionable brain. Language is an obvious example, and also fits with the theme of socialization, a child figuring out what his community's norms are and fitting in with them. Switching languages at age 13 is not impossible, but is not easy, and will leave an accent.
Actually, the language analogy understates how hard it is, because it's not like your first language was somehow the opposite of the second language. Most of the norms you're internalizing fall on a continuum, and they were swinging from one extreme toward the other during the early-to-mid-'90s. We're expected to be interested in people — no, just kidding! Interested in things! We're supposed to be sincere and open — no, just kidding! Ironic and closed-off! Girls are boy-crazy — no, just kidding! They suspect you all as crypto date-rapists! And so on.
I'm at a loss to convey just how dizzying of a mind-fuck that changing-gears period was for someone undergoing the transition from childhood to adolescence. Childhood experiences are supposed to prepare you for adolescence, so you can hit the social ground running. Then right as you're about to hit the ground, down becomes up, hot becomes cold, near becomes far.
Categories:
Books,
Generations,
Pop culture,
Psychology
Is the homo presence the most alienating form of diversity?
Everyday experience and empirical studies show that greater diversity of social groups leads to communal fragmentation.
You just can't feel like your group membership is stable when there are so many competing sets of beliefs, norms, and practices. Sure, you're fine with your own group -- but there are so many other groups out there. It's like not being sure whether to drive on the left or right side of the road depending on which neighborhood you're driving through.
Most of these studies look at racial and ethnic diversity (like the big one by Robert Putnam), but class diversity or inequality is another example (a la Charles Murray's story in Coming Apart).
Though it may come as a shock to feminists, men and women are not an example of diverse social groups whose proximity breeds alienation. Our most joyous and festive occasions are marked by the participation of both sexes -- weddings, feasts, parties, religious services -- unlike more mundane tasks like running errands, earning a living, mothering, and so on, where we're happy to segregate.
In fact, it feels unnatural if men and women occupy two wholly separate worlds, as has been taking place over the past 20 or so years. Contrast that with the natural process of segregation along racial/ethnic and class lines.
What about the Big Political Topic du jour -- diversity of sexual orientations? Unless you live in a city, you may not be familiar with such diversity in real life (consider yourself lucky). But having had to endure exposure to such diversity for years now -- and getting worse and worse every year -- I've concluded that it is far more alienating than racial or class diversity.
A racial or ethnic group that I don't belong to is not necessarily perverse, corrupted, deviant, and infantilized. Ditto for a class layer that I don't belong to. Their mere presence as a social group does not appear weird and abnormal, however much I don't like them pushing for their own group's benefit at mine's expense.
When faggots begin to infest an entire city, though, it strikes a normal person as though a bunch of schizos had been let loose from the nut-house. The gay syndrome of traits is so warped, manifold, and stark that nothing short of a comparison to abnormals will do.
You don't even have to find their syndrome disgusting (though it's hard to see how some folks do not). You don't find schizos disgusting either. There's just something so obviously outta-whack about their entire infantilized pattern of thinking, feeling, and behaving.
When there's just one of them, your brain doesn't register a very strong signal of deviance, just like if you only see one schizo muttering to himself on a park bench. But increase their concentration -- and they don't need to be operating as a single pack -- and suddenly you sense that there's an entirely different world that's breaking through into the normal world. Like homos, schizos are not even "that" numerous -- several percent of the population, so in no way could they overwhelm society on their own.
If you've ever seen five or more druggie / wacko bums in an area at once, you know what I mean. It feels like a breach in the very fabric of our normal wholesome world. It is no different with queers. Concentrated deviance feels more threatening to the stability of a community than a foreign ethnic group or an economic class far above or below our own. The latter two are at least from the same realm as ours -- the normal!
Which scenario gives you a greater feeling of communal breakdown? Seeing a woman from your ethnic group dating a man from another? Seeing a woman from your class dating far above (uppity, too good for us) or far below (wasting her prospects)? Or how about seeing heterosexual girls preferentially choosing gays as their male friends?
I don't mean which one gives the average person a greater fight-or-flight response, or anger, or etc., since those are priming us to start an Us vs. Them battle that, over evolutionary time scales, would have been about ethnic conflicts (and for less time, class conflicts). It feels like the in-group member is acting like a traitor, and you want to make sure they remain loyal only to their own group.
I mean which scenario, when you're exposed to it long enough, sows the seeds of doubt that your community is hanging together. With a white dude who tries to act black, at least he's joining a group that is part of the range of normal, whatever else you think about it. Ditto the striver who wants to betray his roots and join a much higher class.
But what if a good chunk of mentally normal folks only preferred the company of schizos? Now they're not looking to exchange one more or less normal group for another one. It's an inversion of what is preserving -- shunning the normal and welcoming the warped. Forget higher levels of norm enforcement -- the most basic one is that we treat the sick as sick and the healthy as healthy. If we cannot manage to enforce even that, then we are no community at all. Too many of "us" just don't give a fuck, so normlessness prevails.
Many conservative commentators give the name "cultural suicide" to our immigration policy, or other policies with similar effects. When you think about it, though, it's not so much suicide as metamorphosis -- one day America is European, the next day it's Amerindian.
In the context of social groups, I think "suicide" goes better for cases like girls associating with gays. Not only because such women (and their gay BFFs, naturally) will have fewer children and wipe themselves out over the long evolutionary haul. But because they're obliterating something normal (guy-gal friendships) and leaving only something warped in its place ("I don't trust normal men, only kiddie-acting cocksuckers can step close").
It is scarcely less pathetic, deviant, and alienating than if people only dated and slept with robots. No matter how advanced the AI, that's just not a normal person. Come to think of it, has any sci-fi writer ever premised a dystopia on women only associating with queers? Or is truth stranger than fiction?
I don't know how or when we're going to get out of this, but I do have hope that we will, based on history. The late Victorian era was in the same ballpark as ours for how heavy the homo presence was, and not just high-profile cases like a major cultural figure such as Oscar Wilde trying to turn buggery into a dissident political act. Go to Google Images and search "gay victorian". They were "out" enough to sit for photographs as a couple. Here is a pretty scandalous one. Here is an example of even greater deviance, two Victorian trannies.
That whole fin-de-siecle atmosphere was not merely decadent, but specifically homosexual.
Yet somehow that began to fade away during the first several decades of the 20th C. Homo deviance seems to be part of rising-inequality periods, where the dog-eat-dog mentality means that nobody can be bothered to enforce basic communal norms, even if they're breeding vice, decay, and perversion.
That all turned around during the Progressive Era, which was twinned with the Temperance movement. By the middle of the 1910s, the red light districts had been shuttered, they were about to enact Prohibition, cut off immigration, and... well, somewhere in that suite of changes must have been one that killed off the lackadaisical attitude toward gay decadence and deviance of the fin-de-siecle period just before.
Maybe there's a book on that somewhere, I'll have to remember to check later. Point is: they reversed it before, so we can reverse it again. And it may not have required a specific movement like Prohibition. Perhaps the gays were spooked into the closet by the entire sweeping away of the dog-eat-dog / whatever-it-takes mindset of the Gilded Age. "Oh no, moral judgement is making a comeback -- run!"
Categories:
Cocooning,
Dudes and dudettes,
Economics,
Gays,
Human Biodiversity,
Morality,
Psychology
March 14, 2014
Millennial memoirs #1: Superficiality
Instead of a great big post on what you can learn about the Millennial generation from Katie Heaney's memoir Never Have I Ever: My Life (So Far) Without a Date, I'm going to break it up into separate posts on a single theme or observation.
For reference, she was born in '86, and is somewhere between 25 and 27 when writing the book.
I've read from the first chapter that discusses her crushes in kindergarten up until the last chapter in the "college years" section. From age 5 to 20, the only thing she notices about boys is whether they are cute, hot, beautiful, dreamy, adorable, sexy, etc. It is the only thing that attracts her to a boy, and it is the only aspect about them where she notices details — eye color, hair color and style, height, tanned or not, which celebs they resemble, and so on. She also recalls how they dressed — sweatpants, sideways hat, emo earrings, etc. To reiterate: what she notices about boys does not mature at all from kindergarten to senior year of college.
Hers is the most superficial mind I've ever found myself inside of. Young girls not too long ago certainly noticed whether a guy was cute, hot, or whatever, but that was only one (perhaps big) part of what drew them to him. There were all those other components of "how he makes me feel," his personality, demeanor, and character. Only one guy (so far) is shown with any detail on these dimensions, but she is uninterested in him because he's not cute / hot / sexy. He even confesses that he likes her, but not hot = not bf material. They remain friends who occasionally hold hands at drunk parties.
Girls also used to notice physical states about a guy that were not generic enduring traits like hot, sexy, etc., but were mannerisms and idiosyncrasies. The way they stand, or sit, the way their voice comes out, or how it sounds in different contexts, how they fuss around with their hair, or whatever. The things that make him, him. There is absolutely no detail about these things, as though she didn't even observe them in the first place. It's like being color-blind — only identity-blind. She is numb to any sign of distinctiveness about the guys she sees. They are just cartoon cut-outs of the pattern "random hot guy."
That's certainly been my experience with them, on the whole. When I was tutoring them, they'd call me cute, gorgeous, etc., but rarely wanted to open up or get me to open up. At '80s night, they sent their representative over to tell me, "My friend thinks you're hot!" rather than "is interested in you," or something more normal-sounding. A few years back, a 15 year-old strutted right up out of the blue and said, "...Can I have your phone number?" I tried making a playful joke back: "Why...? What are you gonna do with it?" She didn't respond with "I think it'd be fun to hang out" or "You seem like a cool guy," but "Cuzzzzzz... you're CUTE!"
It's odd how Millennial girls don't try to stroke your male ego when coming onto you. Guys don't evaluate themselves so much on how dreamy they look, so hearing catcalls is not as ego-boosting as the traditional forms of flattery, which play up their qualities in general.
That also goes along with a recent comment here that Millennial girls feel awkward calling you by your name. At one point in college, she and her friend walk into a coffee shop and see the dreamiest looking guy working there. Although they learn his name is Sam, they refer to him amongst themselves as "Barista Boy," a generic term that would cover any old hot coffee shop worker. He is a type, not an individual.
You might have thought that a generation that only notices each other on a superficial level would be somehow more decadent, libidinous, and so on. All they say about each other is "cute boy" and "hot girl." Sounds like they got sex on the brain. And yet it's the same generation that is too awkward to do much with each other.
When you think about it, though, only noticing the most superficial aspects of other people is a childish trait. Fourth grade girls can go crazy over what Justin Bieber looks like, and give clear details about what they like, yet they couldn't begin to give a good character portrait. You need more social experience and connections during adolescence to be able to pick up on those parts of a person, and to articulate them.
Note: I'm talking about merely noticing those idiosyncratic character traits and mannerisms. Some young adults might not value them in addition to looks alone, but they could all at least pick up on them. The Millennials are not good observers about such things in the first place.
That solves the puzzle, then. Since Millennials are more infantilized than other generations, they only pick up on superficial qualities in other people, they don't have much of an honest sex drive, and they have a bratty self-regard that leads them to dismiss connecting with others, who are all so totally not worthy.
It's important to bear in mind that Heaney is not a stuck-up cheerleader type, or whatever you might imagine her as, given how superficial she is. She's a socially awkward goody two-shoes, who holds intermission every once in a while to deliver a feminist manifesto. Somehow those traits support each other, though — like if you fundamentally do not trust or value men, you block out all of the things that make them human and reduce them to their surfaces, indeed going further to treat all hot guys as only slightly different carbon copies of the Platonic hot-guy essence.
For a comparison with Generation X, see Kerry Cohen's book Loose Girl: A Memoir of Promiscuity. I won't review that one in detail here, only to say that it's the opposite in every way. She was born in the early '70s, and the events take place during the '80s and early '90s.
It's not about sluttiness, brazen-ness, etc. It's about her need to connect emotionally with a guy, and believing that sleeping with him will make that intimate connection happen more easily. Usually that doesn't happen, so she tries again with a new guy. There are real-life characters with non-physical traits that distinguish them from one another. If anything she notices too much personal detail about others, almost to the point of being intrusive. She gets excited when she sees cute guys, but the main thing she wants is for them to want and need her emotionally. She is clingy rather than avoidant.
But this society could use some girls who are toward the clingy side (if not out at the extreme). Seeking closeness leads to getting to know people for who they really are, while avoidance and distance lead to superficiality.
For reference, she was born in '86, and is somewhere between 25 and 27 when writing the book.
I've read from the first chapter, which discusses her crushes in kindergarten, up through the last chapter of the "college years" section. From age 5 to 20, the only thing she notices about boys is whether they are cute, hot, beautiful, dreamy, adorable, sexy, etc. It is the only thing that attracts her to a boy, and it is the only aspect of them where she notices details — eye color, hair color and style, height, tanned or not, which celebs they resemble, and so on. She also recalls how they dressed — sweatpants, sideways hat, emo earrings, etc. To reiterate: what she notices about boys does not mature at all from kindergarten to senior year of college.
Hers is the most superficial mind I've ever found myself inside of. Young girls not too long ago certainly noticed whether a guy was cute, hot, or whatever, but that was only one (perhaps big) part of what drew them to him. There were all those other components of "how he makes me feel," his personality, demeanor, and character. Only one guy (so far) is shown with any detail on these dimensions, but she is uninterested in him because he's not cute / hot / sexy. He even confesses that he likes her, but not hot = not bf material. They remain friends who occasionally hold hands at drunk parties.
Girls also used to notice physical states about a guy that were not generic enduring traits like hot, sexy, etc., but were mannerisms and idiosyncrasies. The way they stand, or sit, the way their voice comes out, or how it sounds in different contexts, how they fuss around with their hair, or whatever. The things that make him, him. There is absolutely no detail about these things, as though she didn't even observe them in the first place. It's like being color-blind — only identity-blind. She is numb to any sign of distinctiveness about the guys she sees. They are just cartoon cut-outs of the pattern "random hot guy."
That's certainly been my experience with them, on the whole. When I was tutoring them, they'd call me cute, gorgeous, etc., but rarely wanted to open up or get me to open up. At '80s night, they sent their representative over to tell me, "My friend thinks you're hot!" rather than "is interested in you," or something more normal-sounding. A few years back, a 15-year-old strutted right up out of the blue and said, "...Can I have your phone number?" I tried making a playful joke back: "Why...? What are you gonna do with it?" She didn't respond with "I think it'd be fun to hang out" or "You seem like a cool guy," but "Cuzzzzzz... you're CUTE!"
It's odd how Millennial girls don't try to stroke your male ego when coming on to you. Guys don't evaluate themselves so much on how dreamy they look, so hearing catcalls is not as ego-boosting as the traditional forms of flattery, which play up their qualities in general.
That also goes along with a recent comment here that Millennial girls feel awkward calling you by your name. At one point in college, Heaney and her friend walk into a coffee shop and see the dreamiest-looking guy working there. Although they learn his name is Sam, they refer to him amongst themselves as "Barista Boy," a generic term that would cover any old hot coffee shop worker. He is a type, not an individual.
You might have thought that a generation that only notices each other on a superficial level would be somehow more decadent, libidinous, and so on. All they say about each other is "cute boy" and "hot girl." Sounds like they got sex on the brain. And yet it's the same generation that is too awkward to do much with each other.
When you think about it, though, only noticing the most superficial aspects of other people is a childish trait. Fourth-grade girls can go crazy over what Justin Bieber looks like, and give clear details about what they like, yet they couldn't begin to give a good character portrait. You need more social experience and connections during adolescence to be able to pick up on those parts of a person, and to articulate them.
Note: I'm talking about merely noticing those idiosyncratic character traits and mannerisms. Some young adults might not value them beyond looks, but they could all at least pick up on them. The Millennials are not good observers of such things in the first place.
That solves the puzzle, then. Since Millennials are more infantilized than other generations, they only pick up on superficial qualities in other people, they don't have much of an honest sex drive, and they have a bratty self-regard that leads them to dismiss connecting with others, who are all so totally not worthy.
It's important to bear in mind that Heaney is not a stuck-up cheerleader type, or whatever you might imagine her as, given how superficial she is. She's a socially awkward goody two-shoes, who holds intermission every once in a while to deliver a feminist manifesto. Somehow those traits support each other, though — like if you fundamentally do not trust or value men, you block out all of the things that make them human and reduce them to their surfaces, indeed going further to treat all hot guys as only slightly different carbon copies of the Platonic hot-guy essence.
For a comparison with Generation X, see Kerry Cohen's book Loose Girl: A Memoir of Promiscuity. I won't review that one in detail here, only to say that it's the opposite in every way. She was born in the early '70s, and the events take place during the '80s and early '90s.
It's not about sluttiness, brazenness, etc. It's about her need to connect emotionally with a guy, and believing that sleeping with him will make that intimate connection happen more easily. Usually that doesn't happen, so she tries again with a new guy. There are real-life characters with non-physical traits that distinguish them from one another. If anything she notices too much personal detail about others, almost to the point of being intrusive. She gets excited when she sees cute guys, but the main thing she wants is for them to want and need her emotionally. She is clingy rather than avoidant.
But this society could use some girls who are toward the clingy side (if not out at the extreme). Seeking closeness leads to getting to know people for who they really are, while avoidance and distance lead to superficiality.
Categories:
Age,
Books,
Cocooning,
Dudes and dudettes,
Generations,
Psychology
Deregulation did not affect airfare prices, and Neoliberal cluelessness generally
The airline industry used to be under government regulation, whereby the state could have a say in what cities would be served, which cities could be connected with which others, what the ticket prices could be, and so on. Then the government deregulated the industry, letting the airline companies make those decisions on their own.
Neoliberal economic theory says that this will open up competition among airline companies, which will lower prices for consumers. The details of how this happens do not matter -- in the economist's mind, real-world outcomes must be special cases of a general principle. Here, "deregulation leads to greater competition among producers, which will lead to greater savings for consumers." No need to check facts or run case studies -- the theory says it will be so, so it must be so. The only open question is the magnitude.
Without any pre-conceived notions, look at the red line in the following graph showing the real cost per mile of airline travel, from 1950 to 2005, and point out where the deregulation event occurred. It should pop out if the theory holds water.
You don't see the event at all, because deregulation had no impact on prices. What you see is a single downward trend with no acceleration (that is, no point where the decline becomes sharper and sharper). Your best guess would be around 1962, since prices had been creeping up for five years before undergoing a long and steady reversal. In fact, the Airline Deregulation Act was signed in 1978. When a major cause is invisible in a graph showing the trend in the effect, then it is not the cause of the effect after all.
From a Kevin Drum post on this topic:
Beyond that, you might be surprised to learn that it's an open question whether deregulation was such a boon for the flying public in the first place. In a 2007 paper, David Richards looked at airline fares since 1950 and concluded that deregulation accomplished little. Fares had been going down before 1978 and they kept going down afterward. Yield per passenger-mile showed no change before and after deregulation (see chart on right). Growth in passenger miles traveled actually slowed after deregulation, and fares were mostly about the same as they would have been under the old CAB formulas. His conclusion: "This paper makes clear that the grant of pricing freedom to the airline industry has generally resulted in average prices being higher than they would have been had regulation continued under the DPFI rate-setting policies."
What did happen, Richards found, was that fares decreased on routes between big cities and increased on routes between smaller cities. That may or may not have been a good thing on net, but it's certainly a far different story than the one we usually hear about deregulation. For more on this, see "Terminal Sickness," a fascinating look at airline deregulation and the death of the mid-tier market by Phillip Longman and Lina Khan.
There's the link to rising inequality, BTW: it's cheaper for elites to fly between big cities, but harder for the rabble to fly between smaller cities.
Referring back to the graph, in the years surrounding 1980, the cost per mile was about 12 cents, and 20 years later it was around 7 cents. Prices fell 5 cents in 20 years. Presto, "deregulation worked"! -- just ask this glib article in The Atlantic.
Not so fast -- from 1950 to 1970, prices fell from about 23 cents to about 16 cents, a decline of 7 cents in 20 years. That's a larger drop than 5 cents in 20 years; the slope was 40% steeper in the earlier period than in the later one. Hence, bad news for neoliberal theory: prices fell at a faster rate before deregulation than after.
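For anyone who wants to check that arithmetic, here is a minimal sketch in Python. The cents-per-mile figures are just eyeballed readings off the graph above, not the underlying data series, so treat the output as a sanity check rather than a precise estimate.

```python
# Back-of-the-envelope comparison of the pre- vs. post-deregulation decline rates.
# The dollar figures are approximate values read off the graph, not official data.

periods = {
    "pre-deregulation (1950-1970)":  (0.23, 0.16),  # real cost per mile, dollars
    "post-deregulation (1980-2000)": (0.12, 0.07),
}

rates = {}
for label, (start, end) in periods.items():
    drop = start - end          # total decline over the 20-year window
    rate = drop / 20            # average decline per year
    rates[label] = rate
    print(f"{label}: fell {drop * 100:.0f} cents, or {rate * 100:.2f} cents per year")

pre, post = rates.values()
print(f"The pre-deregulation slope is {pre / post - 1:.0%} steeper than the post-deregulation slope.")
```

Same conclusion as the eyeball version: the decline was steeper before 1978 than after.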
I'm not claiming that deregulation caused the rate of decline to slow down. Maybe that fact about the history of the airline industry would have taken shape regardless of what Congress did or did not sign into law. But if we are going to link deregulation with price trends at all, we would have to say that it was harmful to consumers in slowing down the decline that had been going on for damn near 30 years before deregulation.
(Skip down to the last three paragraphs to see what non-effects or harmful effects deregulation has had in other major cases.)
I also have no solid idea about why prices have been falling since at least 1950. My hunch is that there were low-hanging tech improvements for such a brand-new industry -- remember that the early consumer airliners were adapted from WWII-era military aircraft -- and that making things more efficient allowed some of the savings to be passed on to consumers. After several decades, the improvements get harder and harder to find, and are more costly to implement, so not as much savings gets passed along as before.
But why would airline corporations bother making their businesses more efficient and passing some of those savings along to consumers, when they were protected by government regulation, barriers to entry, price controls, and so on? Maybe businessmen had a different psychology back in the '50s and '60s compared to the '80s and '90s. Of course they did: the former were in the final stage of the Great Compression, whereas the latter were in the beginning of the Great Divergence. The spirit of the former was to rein in greed (even if it could not be wiped out), while the spirit of the latter is dog-eat-dog.
These psychological changes would have applied to the regulators as well -- the former leaning on airlines to make their businesses more efficient and to keep cutting prices in step with that, and the latter looking the other way since they came to view regulatory power as bogus.
And I'm sure consumers would have felt differently back then too, shunning or slandering the airline companies if they did not make things run better and cut prices. Consumers and producers were supposed to deal with each other in good faith, and not improving things would have been seen as laziness or stinginess.
Today's consumers are content to get walked all over by the airlines -- the elimination of cute stewardesses, the elimination of meals, being crammed in like cattle, fees for everything, and ever-growing restrictions on what you can do, what you can bring, and how early you have to show up at the airport.
It's not surprising that sensible explanations like this would not occur to neoliberal economists, easily the most autistic camp within social science (already a nerdy bunch). They'll allow that people are not all identical robots, but they still do view them as pieces on a chessboard -- all sets of pieces obeying the same rules and constrained into the same range of behavior no matter what chessboard they appear on, whether a 1950s chessboard or a 21st-century chessboard.
Those of them who have read their Adam Smith like to cut down the arrogance of social engineers (who in their mind are government regulators) by comparing them to chess players who see people as chess pieces -- oblivious to or dismissive of the idea that those people might have their own drives, which the chess player cannot simply override by lifting them into the air and swinging them over to another square.
Yet the neoliberals themselves have that same mindset, only instead of "prudent government regulation" it's "getting the incentives right." Just make the currents point West, and the ships will sail West. The notion that the boat might have a motor of its own, and is not some silly little raft, or that the current of policy incentives might not run very strong, doesn't matter to them.
Nor does it occur to them that what they believe are the right incentives might be the wrong incentives. No humility. The airline deregulation case is a textbook case of this -- whatever wise changes in incentives there were circa 1980 seemed to make the airlines greedier and consumers worse off than under the earlier set of incentives.
The programs flying under the banner of "deregulation" have mostly served to enrich the top and squeeze the bottom, so the standard story about airlines should have sounded suspicious from the start.
Deregulating the labor market has gone along with falling wages, ditto for deregulating residence and labor of immigrants (falling wages and higher housing costs). Deregulating finance has nearly blown up the global economy. Deregulating cable TV hasn't stopped prices from doubling. And as with airlines, real prices had already been falling in the telephone industry long before the divestiture of AT&T during 1982-'84, and even before the 1974 anti-trust lawsuit against them. (See this graph, where the gap between prices for phone services and for all items was already closing by the late '60s).
This all jibes with how most folks feel about how much worse off they are economically than their parents' or grandparents' generations (except the elite, which has swollen in numbers and enjoyed greater wealth). It's not that people think having a toilet is worse than having an outhouse, but that progress was just zipping along for their grandparents and parents, while now that trend has slowed down, flatlined, or reversed, depending on the domain of life. I can't think of an uplifting way to end this post. Just that looking at the reality, rather than the theory, vindicates how most people feel about the pace of progress these days.
Categories:
Economics,
Politics,
Technology
March 13, 2014
Millennial vs. Gen X memoir titles
Which is which? Both are by women.
Loose Girl: A Memoir of Promiscuity (link)
Never Have I Ever: My Life (So Far) Without a Date (link)
I plan on comparing them once I finish the second one.
Categories:
Books,
Cocooning,
Dudes and dudettes,
Generations
Nerds getting self-righteous about helping the Third World
As part of my ongoing experiment in rolling computer tech back to the '90s or earlier, I thought of using a web browser that doesn't have so much distraction, and that doesn't hog memory. (My laptop in general, and web surfing in particular, works just fine on half a gig of RAM — it only starts to slow down when all kinds of cyber-sludge is being fed into the system.)
The Dillo browser seemed like what I was looking for — no Java, no Flash, not much of anything other than displaying text and images. The only bummer is no frames, so everything is arranged in one very long column. Yet it has features that the browsers of 20 years ago did not, like tabs and a Google search bar next to the URL bar.
It's been fun poking around websites that usually take a long time to load and that play ads, and seeing only the text and images. The New York Times website, for instance. And good ol' Drudge Report looks exactly like it does in a fancy browser, only with no ads. The best part is that with one tab open, it's only using 15 megs of RAM, compared to 150 in Firefox. Plus it only takes up 1.27 MB of hard drive space.
Not the best browser ever made, but it has its uses, and I could really see this being great on older PCs that wouldn't be able to run any of the popular browsers.
In fact, that was exactly the thinking behind Dillo — create a web browser that has a minimal footprint on memory usage and the hard drive (shoot, it's small enough to fit on a floppy disk), to democratize web access around the world.
However, gaining access to Dillo was not so democratically easy and open for me, since I'm running Windows. And allowing a browser to run on Windows would only be encouraging the monopolistic yadda yadda yadda. So the developers insisted on not providing a Windows version, although thankfully a handful of others have done so on their own (naming it D+). Yet the original is available for Mac — guess they didn't have any problems with boosting the hardware and software costs there. Nothing monolithic or closed-off about Apple, after all.
Thus, in the mind of the typical geek, they're going to democratize web access by preventing their browser from running on the operating system that everybody uses. It must serve the higher goal of Open Source, and if that means that nobody — especially not in the Third World — will end up using it, then that's just not fair.
I see the same ego-stroking propaganda whenever I open up the Vim text editor. Some message about how to help starving children in Africa. Or how the Linux distribution on my office computer is named Ubuntu, after a Zulu and Xhosa word for "humanity toward others." Cuz, y'know, Linux nerds totally hang with black Africans. They were trying to democratize computer usage in Africa by pushing Linux there too, though I don't know if that's still going on — how could you tell anyway? After their efforts, are Africans starting to eat into the code monkey job market dominated by Indians?
B-B-But... it was Open Source! Our technology didn't come from some rich, giant corporation! You aren't forced to use a graphical user interface — the command line allows you to think more freely! Yeah, well, most users aren't interested in thinking. The average consumer just wants to get some basic tasks done and not have to think about it.
That's where Microsoft came in, delivering a user-friendly system to the average person. How else did it become so damn common? Microsoft was not driven by an abstract ideology but by concrete concerns about whether users would buy it or not. And whaddaya know, Windows works way better for starving black Africans than Linux does. Once again, practical demolishes theoretical.
Are nerds going to start designing open source medicines to treat tropical diseases? If their intentions are pure, then the result would have to be better than the medicines currently in use by the pharmaceutical industry. I bet they'd come up with some Rube Goldberg cocktail / regimen that someone with malaria would stand no chance of understanding or wanting to follow through with. The sick person would just ask for a pill from the Peace Corps volunteer, pop it in their mouth, and try to get back to their daily life.
In summary, nerds don't want to democratize access to what is truly best for the average Third Worlder. Rather, they want to develop and disseminate an alternative to the so-called best — something that would prove how idiotic, clumsy, and evil the corporate developers were, and how smart, clever, and noble the Open Source crusaders were.
They wouldn't care if winning their crusade against Microsoft ruined more lives in the Third World than it improved. The Third World is like a new baby in the family that the two older siblings are trying to get to imitate them. If sibling A prevails, that just proves what a loser sibling B is. "C'mon, whose room do you want to play in? I know it's mine, cuz you like me better, don't you? Yes, yes you do!" It's puerile.
What would really help the Third World, technologically? Making it easier for them to get a copy of Windows. The Chinese have gone the route of piracy to make that happen. But it wouldn't have to be that way if the version of Windows were old enough. Windows 95 was pretty easy to get the hang of. They haven't sold that in forever, so it's not like spreading it around the Third World would eat into sales.
True, they wouldn't be able to do all the weird stuff that we can with newer operating systems and more powerful browsers, but then you're expecting too much of the Third World technologically. It would do well enough for them to write their thoughts down in a text editor, keep track of any trades, dealings, and small business with a spreadsheet, and kill time playing card games. Only a nerd so removed from reality would want to squash that in order to push Open Source stuff on them that they would never use.
Categories:
Human Biodiversity,
Morality,
Technology
March 11, 2014
Young people not working reflects cocooning, not economic factors
Here is an article on labor force participation rates among young people, using data from the Bureau of Labor Statistics.
We all know that young people these days aren't as eager to work as they used to be, but how far back does that go? And is their non-participation due to economic factors such as rising inequality and related trends, or to social factors like cocooning?
The data for 20-24 year-olds goes back to 1948 and runs up through 2013. The cycles of inequality and status-striving, studied in the greatest depth by Peter Turchin, are independent of the cycles of cocooning and crime. This is great, because with a long enough period of data we can test which of the two cycles a given phenomenon maps onto. Here is the graph from the article; click to enlarge.
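To make that "which cycle does it map onto" test concrete, here is a minimal sketch. The participation values are illustrative readings eyeballed from the BLS chart rather than the actual series, and the turning-point years for the two cycles are my own rough assumptions; the sketch just asks which cycle's turning point the series' own peak lands closer to.

```python
# Crude version of the "which cycle does this series map onto?" test.
# Participation values are illustrative readings off the chart, not the actual BLS series.

# (year, labor force participation of 20-24 year-olds, percent) -- eyeballed
series = [(1950, 66), (1960, 65), (1970, 72), (1980, 77), (1989, 79),
          (2000, 78), (2013, 71)]

peak_year = max(series, key=lambda point: point[1])[0]

# Rough turning points: inequality / status-striving turns upward in the late '70s;
# the outgoing-to-cocooning shift hits around 1990.  Both dates are assumptions.
candidate_turns = {"inequality cycle": 1978, "cocooning cycle": 1990}

for name, turn_year in candidate_turns.items():
    gap = abs(peak_year - turn_year)
    print(f"{name}: series peak in {peak_year} is {gap} years from its turning point ({turn_year})")

# The peak sits right on top of the cocooning turn and a decade off the inequality turn,
# which is the point of the post.
```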
The Great Compression, from the 1920s through some time in the 1970s, saw the top falling and the bottom rising within the economic pyramid. It was easier to get jobs (aside from the Great Depression), they paid better at the bottom and median, and young people were not status-strivers who planned to go through a lengthy credentialing process before getting their first job.
And yet that does not show up in the picture above, which is only for young people. The late '40s and most of the '50s show falling or stagnant levels of labor force participation. It's not until the later '50s that things start to steadily increase, and they continue up up up through a peak in the late '80s. Since the early '90s, it's been almost all downhill, notwithstanding a hiccup in the late '90s.
Those movements map more closely onto the cocooning vs. outgoing social mood. Getting your first job as a young adult is one of those things that isn't too hard to set up, provided you don't mind leaving the security of home life.
Shoot, in outgoing periods, youngsters not only do not mind, they are damn eager to head off on the path toward independence from parental discipline. Plus you get to meet new people on the job, at least your co-workers and perhaps customers as well if you're in that line of work. Sure, you'll meet some retards and assholes, but you've got to take the good with the bad, right? You'll add new acquaintances, romantic partners, confidants, and activity buddies.
And -- somebody you can blow off steam to. I think a large part of the anxiety that grips a cocooning society is not having anyone you can vent to throughout the day. A little vent here and a little vent there, keeps the pressure from building up too much. Especially when the listener has gone through the same frustration. It lets you know you're not the only one in a bind, and you get to laugh it off. "Another old lady holding up the line to write out a check..." "Lady bosses, eh?" "Paper jam again? Sometimes I think we're all part of some sick psychology experiment..."
However, all of those benefits of working -- aside from the material ones -- require you to be open to interacting with other people. For folks in a cocooning society, that starts to feel like too heavy a price to pay. "Ugh -- other people." Older adults just have to suck it up, which is why the economic inequality cycle shows a rosy picture for workers overall during the Great Compression, despite a good chunk of it taking place during the cocooning Mid-century.
But the Silent Gen youngsters didn't have to sink or swim like the Greatest Gen adults did during the Mid-century. They could indulge their preference for social isolation for a little while longer before taking the plunge into the work force.
This is one of the hardest things to keep straight when looking at history -- that even though there is a prevailing zeitgeist, it influences and is influenced by different generations in different ways. Baby Boomers and Gen X did not have the same experiences in the '80s, even controlling for their age difference. During a cocooning period, a generation whose upbringing was fairly social will weather the storm better than a generation that is a product of cocooning and over-parenting.
Finally, I realize that more 20-24 year-olds are in higher ed today than during the good old days, but that doesn't explain what's going on either. The higher ed bubble began circa 1980, so if studying rather than working were the cause, the graph should show a dip during the '80s, when in reality it shows a rise toward the peak. College students back then studied and held a part-time job. By the early 2000s, I remember feeling weird for having a 15-to-20-hour-a-week job during junior and senior year of college, as most of our peers not only did not work, but were not even interested in working.
Furthermore, going to college these days is not a demanding thing that would eat into work time. Grade inflation means you'll pass all your courses. If you paid for the credits, you'll get the credits -- that's how the administration views customer service and growing its consumer base. It's mostly a screw-off time. Not, however, in a social way -- visit any college these days, other than the most hardcore party schools, and you'll be struck by how atomized the student body is. And how they like it that way. "Other people -- um, cuh-reeepy!" Rather, they're all plugged into their phones, laptops, or video games.
Spending time in higher ed when you are not intending to secure an academic job, or are not cut out for one, is just as much an outlet for cocooning as it is a grab for higher status through credentialing.
Related post: Did you have a job as a kid? My first job was at age 10, during the summer of 1991. In '92 and '93, I got lassoed into splitting the labor (and the pay) with my best friend, who had a paper route. Judging from the graph above, jobs couldn't have remained common for kids that age much longer.
Categories:
Age,
Cocooning,
Economics,
Generations,
Over-parenting
Keyboards that you can feel and hear working
Next up on the list of things to help focus attention and boost productivity when working at the computer: a clicky keyboard. You know, the kind that folks used to type with before computers became goof-off devices?
Over the past five or so years, it seems like more and more people are starting to realize how awful the typical computer set-up is for getting work done. There was a huge shift in the mid-'90s when the internet became the cocooning society's go-to outlet for wasting time indoors rather than enjoying a social life, playing a sport, tending to a hobby, and so on.
(The technology did not cause cocooning, rather cocooners adopted and shaped internet usage to fit their preference for isolation.)
Ever since then, the suite of computer-related stuff that is readily available — both the functions of physical devices and software, as well as their look and feel — has become entirely focused on optimizing computer use for distracting ourselves. The computer as the new idiot box has displaced the computer as productivity machine.
The keyboard is the main way in which we interact with the computer for productive work. You can browse around the web and play simple games largely by pointing and clicking with the mouse, but not for anything else. We need to know that the message we intended to send the computer actually reached it. And that requires feedback.
The clearest way to get feedback is to look at the screen and see if what you meant to type did in fact show up. This requires too much conscious monitoring, though, when you really want the thoughts to be flowing from your mind to your fingertips automatically. Proofreading as you type is like trying to listen to your own voice while you speak, primed to catch errors and fix them on the spot. Keeping an ear open for gross errors, fine; but not constant monitoring.
Brains don't like being in two incompatible modes at once ("multi-tasking"), especially at such a fundamental level as conscious vs. automatic. Hence it is better to receive a more unconscious form of feedback.
Typing is a tactile activity, so the brain wants tactile feedback that the correct key was pressed. That's more of a muscle memory thing — you'll just sense that you hit the wrong key. Then you may have chosen the correct key, but the brain still wants feedback that the key was pressed enough to register. It's like your heel striking the ground when you're walking — each heel strike lets the brain know that you successfully took a step. If not, something went wrong, and you'd better try to steady yourself or brace for impact.
For me, hearing the clicks plus feeling the keys pressed is better than the tactile response alone. It gives the feedback signal a second, independent channel, i.e. redundancy: if I don't feel it, maybe I'll at least hear it, and vice versa.
Today's soft and quiet keyboards are too mushy to give good feedback about whether the key was pressed or not. There's no clear feeling of "OK, there it went, now on to the next key." The worst are those chiclet-style keyboards that Apple has popularized — a good rule of thumb is that if Apple popularized a feature, it serves the idiot-box use rather than the productivity use of the computer. Give it a year or so, and we'll all be typing on touch-screens that provide no tactile feedback at all.
I'm fortunate enough to have an IBM ThinkPad from the mid-2000s, which has a fairly stiff keyboard by contempo standards (a feature of the series that was junked as of the 2012 models, I hear). Still, for office / home-office work on a desktop, I need a proper "mechanical" keyboard, as they're called. But which kind to choose?
Here is a nice video review of the main types from Tek Syndicate on YouTube, and here is a much more in-depth look at them. None of them seem to be better than the others overall, but are simply specialized for different tasks — playing video games, typing, etc.
They are definitely more expensive than the cheap, awful junk that comes packed in or built in, ranging from about $80 to $130. But they'll last way longer, do their job way better, and make being productive more enjoyable, so that's not a bad investment.
I've used the Das Keyboard a lot before, and it feels and sounds great. It is toward the high end of the price range, though. What's caught my eye is the latest incarnation of the old "model M" keyboards by IBM that used to be everywhere in the '80s and early '90s. The spring-based mechanism is somewhat different from the newfangled ones like Das Keyboard, but they still provide great feedback to the fingers and ears. They're now manufactured by a company called Unicomp, and made in the USA if you can believe it. Most of their models look to be between $80 and $100. Browse their product line here.
Check out this one. Its buttons are meant for use with a Mac, but I can get over that given how cool the colors are. The two-tone keys in beige and light brown stand out against the dark black tray.
Something with a little warm color keeps the keyboard from looking so harsh and design-y. Like, it's just a keyboard. Compare to the ubiquitous Apple aesthetic of silver, gray, and white. The whole Zen Buddhist / Space Age look just leaves me cold and puts me to sleep.
Related posts on re-establishing the computer as productivity machine:
This one about light-on-dark color schemes for word processors, spreadsheets, etc.
This one about distraction-free text editors. (I'm using Q10 to type this post, and it really is striking how free of interruptions the task has been.)
Categories:
Design,
Psychology,
Technology