MY MIND IS SET (Until Further Notice)
Here’s a Mindset List® Blog, Updated From Time to Time
To respond, write to mcbridet@beloit.edu
Today’s Kids Are Terrific!
The older generation often despairs of the young Millennials. They fiddle with their smart phones and never look an elderly person in the eye. They spend countless hours idling on the Internet.
By these accounts the kids are terrible, and America is in trouble.
But the kids are OK.
Do they text all the time? Yes, of course, but their ability to squeeze a lot of information into a small space is the basis for superb academic discipline. No less than Sherlock Holmes said, of the brief telegram, that it forced him to state exactly what he meant. Millennials are ready to learn the art of concise and precise prose.
Do they surf all the time? Yes, of course. But they’re getting subtle practice in how to make connections between different websites, data sets, opinions, and concepts. The ability to make sound and creative connections is the mark of an educated person.
They’re also great collaborators. They’ve established generational solidarity online. They’re ready for the cooperative working groups that can solve problems by pooling insights.
In fact, when you think about it, there’s no reason why these texting, surfing, and sharing Millennials, given some hard and imaginative brain exercise from their teachers, can’t become one of the most educable generations in history.
Does Time Speed Up As You Grow Old?
A typical comment from an older person (50 plus) is that he blinks and ten years go by. He blinks again, and ten more years go by. It seems that as we grow old, time speeds up. Is this true? Or is it an illusion? Or is there a difference?
One study suggests that it’s true in objective terms. Older people estimate that 75 seconds have gone by when only a minute actually has. Younger people are much closer to being accurate about when a minute is up. This may have to do with brain wiring as the noggin ages. But scientists can’t say what the wiring change is.
Another approach is psychological. As we do new things—given the wonder and complexity of them—time slows down. Think of your younger days when you were having to get used to middle school or were trying to learn the guitar. As you concentrate fiercely on the novelty, time seems to accommodate itself to your task: it slows down so that you can focus on new details. But as we get older, we do fewer and fewer new things. Life becomes routine. There’s little to concentrate on as our existence goes more and more on automatic pilot. Time gallops.
And then there’s just arithmetic. If you’re 60 and expect to live to 80, then 3/4ths of your life is over. You see the future as short and quick. If you’re twenty, then you’ve got 3/4ths of your life to go. This vast stretch of time will seem very long, and your entire conception of time will seem not only extensive in length but also deliberate in speed. It’s the difference between how you think of walking a mile as opposed to five miles. Which will be faster?
This is all rather sad in a way. Just when we’d like for time to slow down, we oldsters find it speeding up. We find ourselves hurtling towards Oblivion. But there’s hope. You can slow time down by doing something new. Learn how to play the viola. Learn how to speak Farsi. Ah, but there’s a catch. As we get older, we have less capacity for learning new things.
Well, that’s that then. I’ll see you in Oblivion—and soon!
How To End Sexual Assaults On Campus
Observers disagree about how many of these terrible incidents occur, but all agree that there are too many. A useful thought experiment might be to ask how they could be stopped altogether—on college campuses.
If we go back sixty years, to 1955, we would find that sexual assaults on our nation’s campuses were very few in number. Most colleges and universities had an “hours” policy that required college women to be in their residence halls after 10 PM. I can recall—this betrays my age—a policy called “three on the floor,” which meant that even during daytime visiting hours college men and women had to leave the door of their rooms open and keep at least three of four total feet on the floor (translation: not on the bed).
That, you will say, was a more innocent time; a more repressed time; a period before the Sexual Revolution and the Pill. And you are right. But it was also a time of very, very few sexual assaults. We could virtually eliminate sexual assaults if we went back (or forward) to 1955.
Of course we can’t go back to 1955, and we aren’t going to. “Hours” policies also existed during a time when young women were viewed as highly vulnerable creatures (like children) that needed protection or as easily tempted creatures that needed to be saved from their own perilous passions. Both attitudes are inconsistent with today’s progressive view that young women are free, responsible, and highly competent agents.
The lesson here is that we can’t solve the problems of 2015 by any wistful longing for the “good old days.” The problem with the good old days is that they really weren’t all that good; and the problem with the “bad new days” is that they aren’t quite as bad as they seem. Meanwhile, college administrators will continue to fight sexual assaults with the means available to them in 2015: regulating booze on campus, changing policies about reporting and consent, and modifying the mindset of students.
“Hours” policies aren’t in the cards.
Christ and Chuck
A Short Conversation Between Jesus and Charles Darwin
Christ: Many of my followers are quite upset with you. They think your explanation for animals and plants leaves God out of the process. But I’m a peace-loving man, as you know, so I’m here to see if there’s common ground.
Charles: Well, there might not be any common ground where Creation is concerned—though my explanation doesn’t rule out Divine Management—but perhaps we can find some agreement about Sin.
Christ: Sin? Well, that happens when you disobey God’s commandment to love Him and to love others as you would love yourself.
Charles: Right. But Sin is inevitable, you see. My theory of competition for limited resources precludes loving others as much as you love yourself.
Christ: I see that. But in Eden there were unlimited resources. Adam and Eve were ejected from there because they disobeyed my Father. Limited resources became part of their punishment.
Charles: Well, now we’re getting somewhere. This is why there’s the doctrine of Original Sin. Sin is a contagion. Your followers believe that, and they’re right. Once you leave Eden and go into a world of finite resources, then you are bound to sin some more trying to garner enough to live on. But you’re never secure, and soon we have greed and cheating and war and all sorts of other sins.
Christ: So there really is something called Original Sin—and it goes back to limited means of food and shelter. You’re right, Charles. We really do agree. But how then do we stop sinning? My answer is that beneath Heaven we really can’t. We need a supernatural boost in order to stop sinning.
Charles: That’s beyond my pay scale. I only deal in nature. But I’d agree: only Super-nature can deliver us from the dilemma of limited resources. Shall we go to the church social together?
Christ: Let’s. They’re serving that apple punch that I like—but they don’t have unlimited amounts of it, so let’s hurry!
Charles: Agreed!
Don’t Treat A Millennial Like a Suitcase
I recommend a fabulous book published by Harvard, co-written by Beloit College alum Peter Brown. It’s called Make It Stick: The Science of Successful Learning. It’s a great guide to advanced research in learning and memory and especially astute in refuting the idea that the brain is some repository into which we are supposed to pack stuff—facts and data and even ideas. The brain is not a suitcase to be crammed full of information. And indeed, the best way to learn and recall is not by rote memory but by retrieval of what you think you’ve learned a short time after you think you’ve learned it.
This retrieval will test the accuracy and clarity of learning. It can even improve that learning, and Millennials are in a unique position to benefit from such a finding. Let’s say that a Millennial classroom has recently studied what is often called the “principal-agent” problem in economics. This problem arises when someone whom you entrust with your financial fortunes has his own agenda—not because he’s a crook but because his interests and yours do not entirely coincide.
This is a complex issue, and let’s say the teacher covered it on Monday. Then on Friday the teacher says, “O.K. Write me two paragraphs expressing your comprehension of the principal-agent problem. Once you’ve finished it, send it to your designated partner.”
Then the student and her partner will read each other’s mini-essays, their retrievals of understanding. Each student can test her understanding and can test her partner’s understanding; and the two of them can work together to sharpen one another’s understanding.
Now this method of learning is distinctively possible thanks to the ubiquity of word processing and electronic mail. In the old days, when paper was a tiger and the typewriter was king, it would have been much harder to manage. Now it can be brought off beautifully. Here’s another example of how digital natives can become beneficiaries of both advanced learning research and the blessings of high tech.
It’s just one more instance of how exciting it is to teach these vital digital natives.
Why Flipping A Classroom Is Great for Millennials
Millennials watch television everywhere except on television sets. Visual media, once split among newspapers, television, and movies, are now converging into a single digital source. Well. If they watch videos of movies and TV shows on line, they can sure as hell watch lectures on line. And with online lectures students can enter what’s known now as a “flipped classroom.”
The flipped classroom offers the most stimulating possibilities for higher education in years, and Millennials will find it right in their wheelhouse (to use a low-tech term). Why? Because the flipped classroom can bring students to the application of knowledge years earlier than in days of yore.
A few years ago a former student said to me, “Tom, these kids have simply got to realize that it isn’t just knowledge but the application of knowledge that gets you through the world. But I didn’t realize that until I was out of college.” Yes. She’s right. But with the flipped classroom we can show Millennials the truth of this wisdom much sooner. Here’s a brief example.
Say I’m teaching the rhetorical theories of Kenneth Burke, a seminal thinker, who analyzed rhetoric in terms of a five-point drama that included such concepts as act, actor, and milieu. I put my lectures about Burke on line, and Millennials do their “classwork” (listening to the lectures) at home on the computer or tablet or smart phone. Then in class we gather together. We plan our strategy and then go out into the world of coffee shops and bars and subways and Facebook, and we listen to conversations: everyday conversations.
And we test Burke’s theories. Do they hold up when analyzing ordinary discourse rather than formal occasions? Is there a rhetoric of everyday life? We compare answers. We publish them on our own website and blogs, and we’ve had a sublime educational experience.
Millennials are always worried about applications, especially questions of “how does this knowledge help me get a job and do it well?” It’s a generation that, given globalization and automation and recession, is rather suspicious of useless knowledge. The flipped classroom is a way to convince them that many forms of knowledge they thought were useless are really analytically powerful in everyday, non-campus life.
The point is to demonstrate to Millennials how even so-called useless knowledge can become quite useful in context.
And the flipped classroom, made possible by digital technology, can alert Millennials to that insight years earlier than in times of yesteryear, when graduates had to learn the hard way that application is the name of the game.
Why Internet Surfing Prepares Millennials for College
Don’t Millennials spend countless idle hours just surfing randomly from site to site? They absolutely do! But what are the educational implications of this apparently addictive habit? Well, here’s one answer, in the form of a recommended exercise. It shows why surfing, which seems to be a waste of time, is actually a great preparation for college.
Assign Millennial students a starting website. It might be a site of American presidential firsts or an overview of flora and fauna in New England or an index to worldwide oil production or a categorization of earthquakes and tsunamis on the planet in the last fifty years. Then ask students to go to nine other sites, the first of which is prompted by something they found on the original one. The information on a starting site leads to information on the next one, which leads to further information on succeeding ones.
Then ask students to write an essay linking what they have found. To be sure, there remain questions about factual reliability, but checking sources is not the aim of this exercise. The purpose is to give Millennials, who like links, some real practice in linking: to test and hone their ability to make connections between disparate sources—connections that could lead to far more exhaustive investigations later, organized around a central question or thesis.
And you thought the Internet was just an addiction! Well, it is, but it can also become a very useful one!
Why Texting Preps Millennials For College
If you’re like me, you have a daughter who can text the whole of War and Peace while you’re still trying to text her that you miss her. Even so, text messages tend to be short, and Twitter messages must be—no more than 140 characters including spaces.
This would seem to be the precise opposite of what is entailed by higher learning. The brevity of these practices seems to augur nothing but superficiality, right?
Wrong!
Sherlock Holmes, no slouch when it came to intellectual brilliance, once said to Doctor Watson that the telegram was a wonderful form of discipline, for with the concise demands of the telegram, one is forced to focus precisely on what one needs to say. And now telegrams have come back. They’re called texts. Anyone who, like me, has spent a career working with college students to narrow their focus and to find a central, briefly stated idea that will govern their essays and make them coherent in substance and a pleasure to read can also find texts and tweets to be a golden educational opportunity. It can be a wonderful cure for that great bane of student writing and analysis: what we might call, adapting Pirandello, Thirty-Five Sentences Looking for a Thesis.
For instance, as Millennial students struggle with the organizing idea of a passage from Wittgenstein, or the social theories of Max Weber, or the implications of a finding about purchases of macaroni and cheese, or the ramifications of an experiment on sinusoidal flow, asking them to state these ideas in very few words—so that every word has to tell—is a fine idea indeed. Asking students to organize themselves into working groups and then text each other the central findings of these groups—in no more than 100 words—and then having each group text back to the other groups with a 100-word critique, can be a marvelous way to concentrate the mind.
Concise texting, like the prospect of being hanged, can force the mind to shed the extraneous and find the unifying, order-giving idea.
Millennials already have plenty of practice in this sort of thing. They text. Now is an exciting time to show them the intellectual possibilities of their previous activity in conciseness. And by the way, if you have a grammatically challenged student who blames his difficulty on textspeak, pay him scant attention. All Millennials text, and many of them know the difference between a semicolon and a comma fault.
Why Millennials Are the Most Educable Generation in History
3 Reasons
The older generation often despairs of today’s Millennials. Why are they always fiddling with their smart phones and never looking an elder in the eye? Don’t they spend countless idle hours surfing pointlessly? Why, they can’t even spell “you.” Instead they text it as “u.”
But today’s Millennials are the most educable generation in history, and here are three (3) reasons why.
#1. They Text & They Tweet. They have had lots of practice squeezing information into a small space. Thus they are well prepared to learn how to be concise in their prose, and how to write pithy propositions that will organize their essays. Past generations of college students have been prone to pad their prose in order to escape taking a definite point of view. Their essays have too often consisted of Thirty Sentences Looking for a Thesis. Today’s educators can tell Millennials, “You’re the texting and tweeting generation. Good. Now we’re going to put you to work with even more challenging lessons in economy and order.”
#2. They surf. Don’t they waste time doing this? They do. But they are also getting swell practice in moving among disparate information and websites. Now educators can say to them, “You surf all around the global web. Good. Now we’re going to put you to work finding subtle connections between highly diverse data sets, concepts, and opinions. Finding links is a key to becoming educated.”
#3. They share. In fact, The Mindset List® guys have even dubbed them “the sharing generation.” They share profiles and posts. They text. They have created this great virtual community of collaboration and generational solidarity. So now educators can say to them, “You’ve learned to share ideas on line better than any generation before you. Good. Now we’re going to challenge you to do something much tougher: use that great sense of collaboration to solve challenging academic problems of interpretation, hypothesizing, and creativity.”
Please note that these three habits of Millennials are not, standing alone, enough to educate them. Educators must take advantage of these habits and move them to higher steps of learning. In subsequent postings on MY MIND IS SET I’ll discuss more specific applications. If we educators are imaginative enough, we can transform this texting, tweeting, surfing, and sharing generation into a great community of learners. In fact, it’s happening already.
The Strange Case of the Beer-Guzzling Billionaires
So five billionaires—let’s say Bill Gates, Warren Buffett, T. Boone Pickens, Carlos Slim, and Donald Trump—decide to head to the beach, have a cookout, and throw down a few beers. They get to the steamy coastline, admire the view and soak up the rays—but then they realize that they’ve totally forgotten the beer. And they need several dozen six-packs, since billionaires guzzle the stuff just as thoroughly as some of us more penurious sorts do.
But when they drive their limos to the nearby convenience store, they find that the owner, aware of their riches and the fact that he’s got the only store around, tells the five that they can have all the six-packs they want, but that they will cost them $50 each.
What do the five billionaires do?
Robert Frank, an economist at Cornell, once raised this question. He guessed that the billionaires, now angry at being bilked, would simply do without the beer or drive a long while elsewhere in order to get it at a fair price. But why would they do that? They can surely afford the $50-a-pack price. They’re billionaires!
Here is where the irrational is rational, for in the long run it serves all of us to advertise that we will not tolerate being cheated. We advertise that we are fair, and we project the reputation of being unwilling to accept unfairness in others. Is that because of some abstract love of justice? No, it isn’t. Rather, it’s a matter of projecting a rep: “Don’t mess with me.”
Here’s an experiment you can try: Keep a journal, say over a week, of all the times you present yourself as a fair person, with whom others can do business reliably. You’ll be amazed, I predict, at how often you try to come off as fair, and sometimes unwilling to tolerate lack of fairness in others—even in your friends, spouses, and relatives.
So for you, and me, and Bill Gates and Warren Buffett, it really does pay to be a little irrational sometimes. Bill and Warren may have to do without their beer, but their long-standing reputations for “ruthless reliability” have otherwise served them well—and constitute one reason why they could afford that costly beer in the first place (if they were willing to buy it).
Same Sex Marriage and the Plight of the Poor
Oh, Yes: There’s A Connection
Over the past fifteen years the United States has seen two trends: the rising acceptance of gay marriage and the dramatic decline in the net incomes of the poor—an almost $2,000 drop. Now 3 out of every 10 Americans live in the bottom 20% of the economy.
Is there a link between these two trends? I think there is—but please understand: the rise of same-sex marriage as an institution has not caused an increase in poverty.
Still, there is a connection, for those advocating gay marriage are an example of the American tendency to “fly solo,” which is the enemy of grouping together to fight economic injustice. They are leaving the traditional ideas of marriage in favor of something quite new. And flying solo means that, when they blaze these new trails, they are incurring risks. The same goes for many Americans, especially younger ones, who are eager to live in big cities and leave small towns; design their own websites; embrace a multi-cultural world; and manage their own savings (if any) in a pension-free world.
Today the United States is a land of expanded personal freedoms but also of outsourced risks. “You’re on your own” is the mantra. You’re on your own to pursue same sex marriage if you wish (great), but you’re also on your own when it comes to planning for retirement (don’t count on a pension or even Social Security: not so great).
Once upon a time in America the poorest among us organized to protest their plight, and this led to the Progressive Movement, the New Deal, and the Great Society. But it was more than just shared interests that propelled this solidarity. It was also propelled within communities, such as mill towns and black churches, where people had known each other for years and years. There was not only an alliance of interests. There was also a kinship of relatives and friends. Occupy Wall Street, in contrast, was a meeting of strangers that soon fizzled out.
As Americans come more and more to “bowl alone” and live in crowded cities and shout their eccentricities on Facebook—and pursue their own, highly individualistic, lifestyles—the prospects for solidarity against economic inequality wither. This is not the only reason for the rise of inequality. But it’s a big one.
Cloning Insurance & Confederate Flags
Suppose you’re a wife in 2095 and your husband dies. This would normally be a sad occasion: you’ve been married for years and years and have been both companionable and loving. He’s gone, but wait! He’s taken out cloning insurance.
Thus about a week after the funeral you hear a knock at the door and greet your brand new husband: an exact copy of Jim or Ted or whatever his name was. Thank goodness Jim or Ted took out the insurance!
This seems absurd, but then so did, once upon a time, cures for TB and instant messaging and virtual colonoscopies. There are laws against experiments with human cloning, but already Dolly the Sheep has been cloned. And 2095 is a whole eighty years away. Eighty years ago there was no TV, no Internet, no space stations, no MRI, and no antibiotics. Cloning and “cloning insurance” may not be far-fetched at all.
But is the “new” Jim or Ted—the new husband—the same “person” as the old one? Apparently he is, but is a “person” simply a blend of electrical impulses and precise genes? What is a person? Is a one month old fetus a person? How about an individual now in a vegetative state? For that matter, are you the same “person” you were in high school? George Orwell once looked at his picture in a prep school yearbook and said, “That guy is George Orwell but has nothing to do with me now.”
And then there’s another form of “personhood”: what we call identity politics. The Confederate flag, about which there’s been so much dispute of late, is what Senator Lindsey Graham called “part of who we (South Carolinians) are,” while conceding that for many in the state and elsewhere it is a hurtful symbol of racist terror. For those of whom Senator Graham spoke in the first instance, the flag represents some sort of pride, ancestry, and honor. For them, apparently, the flag cannot be separated from this other form of “personhood.”
We are hardly done with the politics of personhood. Abortion remains a divisive issue. No doubt cloning will be, too. The question of when to “pull the plug” is fraught with peril, and it too goes back to when someone ceases to be a “person.” Antonio Damasio, a great neuroscientist, describes an actual individual with brain damage who can converse with us but has no idea of past or future and does not even know his own name. Is he a person? What are his rights in court?
Long after the Confederate flag is put to rest we will have many other controversies over “who we really are.” A bumpy ride—what else is new—awaits our children and grandchildren. They will need to fasten their seat belts.
Brian Williams and the Secret of Silicon Valley
We recently learned that Brian Williams will return to NBC on the cable network MSNBC, where he will be breaking and reporting news. Apparently MSNBC wants to upgrade its news coverage activity, and Williams will play a key role.
He recently had an interview with Matt Lauer of Today in which he blamed his ego for the various exaggerations in his stories but stopped short of admitting that he lied. For some this was insufficient. But he is coming back to test his credibility with a smaller TV audience.
Almost forgotten is the past of another network and cable regular, Marv Albert, who survived a sex scandal (barely remembered by most) to remain a mainstay of sports broadcasting. Williams is so “relatable” and smooth, and Albert such a mixture of excited description and iron discipline, that it has been hard to keep them off the air. That Williams also had a ten million dollar contract with NBC (they owed him the money) no doubt helped his prospective comeback get into the starting gate.
This is an American story—second chances—and it’s reflected in our bankruptcy laws. In Europe these laws are badges of disgrace, and they make it very hard for people to try again. Not so in America, where many a successful Silicon Valley company has declared bankruptcy almost routinely, paid its creditors by court order as best it could, and then reorganized.
The mantra is “Fail; fail again; fail better; succeed.” Brian Williams in his interview with Lauer was declaring a form of bankruptcy. He admitted that he had failed. Let us hope he fails better next time and then succeeds. Many Americans don’t want to waste good talent in a miasma of shaming.
(Full Disclosure: Brian Williams endorsed our first book The Mindset Lists of American History.)
Whatever Happened to the Mad Scientist?
Recently TCM showed the 1958 horror classic The Fly, about a scientist who learns how to transport matter from place to place but forgets that, when he tries to transport himself, there’s a fly in the chamber with him. Thus his atoms and the fly’s atoms get mixed up. The scientist ends up with a giant fly’s head while the poor little fly ends up with a human head. The scientist’s wife discovers that she’s married a fly, while the fly screams for help when a spider is about to eat him.
This scientist was not strictly mad, but he was close to it. He was obsessed with his project and couldn’t stop tinkering with it. He closed himself off from family. He was sure that his discovery would save humankind, because anything destroyed could be reintegrated. Sci-fi horror films are full of scientists with grandiose ideas. They go too far and end up destroying themselves and others. The first one, of course, was Dr. Frankenstein, whose giant, slapdash creature became a menace to Europe. So much is Dr. Frankenstein associated with this creature that some people still call the monster “Frankenstein.”
But the mad scientist seems to have dropped out of circulation. Why not bring him back?
Why not a mad scientist who discovers how to make it possible for people all over the globe to “friend” one another in real time? What will happen? Well, the scientist will spend all his time on his invention, “friending” one stranger after another. He can no longer tell who his real friends are. He is closeted away with his new invention. His family never sees him. He forgets to eat and almost starves to death. He goes crazy.
But actually he was mad to begin with: mad to think that his great discovery would save us all.
Millennial Manhood and the Charleston Murders
Dylann Roof is twenty-one years old, and that makes him a member of the Millennial Generation. He sat with a prayer group for an hour in a historically black church and then shot nine worshipers dead. Later he said they were so nice to him that he almost changed his mind. But he couldn’t because, he told the parishioners, blacks were taking over, raping white women.
Because Roof is a Millennial, news outlets have turned to the question of whether this generation is as racially tolerant as everyone seems to think they are. After all, they are the generation that has come of age with an African-American president, African-American secretaries of state, and continuous reverence for Martin Luther King, Jr. Yet research by the National Opinion Research Center (NORC) reveals that Millennials as a whole aren’t much less racist than Boomers and Generation Xers are. About 31% of them think that blacks are lazier and dumber than whites are; the number for Boomers is 35%.
Not much progress here. I have been unable to find studies that measure whether or not Millennials in college are less racist than Millennials who, like Dylann Roof, are not. I suspect that college Millennials, exposed as they are to promotion of multi-racial tolerance, are less likely than non-college Millennials to think that blacks are lazier and dumber than whites are. Yet there have been racist incidents on campus, too, both in the North and the South.
Dylann Roof is still an outlier. Among the 31% of Millennials who are racist he’s the one who got the gun and killed nine African-Americans in a house of worship. But Roof is more than just a racist. He’s also an angry young man, and it’s worth asking about the implications.
Manhood is a huge, though tacit, issue for Millennials. On campus, Millennial males have been told that one traditional marker of “being a man,” sexual license, is now fraught with ideological and legal complications. Other traditional milestones of “manhood,” such as getting a responsible and good-paying job, have been affected by a dynamic economy that is closed to Millennials, such as Dylann Roof, who drop out of high school and have no sophisticated skills to offer. My guess is that Mr. Roof’s implicit message is, “I might not have a job or an education or a future, but if I have a gun and can shoot my scapegoats, I’m a man.” This is unspeakably tragic folly.
“Manhood” is a loaded term. Some readers will object to the very usage of the term as fraught with prejudice. I disagree. We need more messages promoting the idea of manhood as founded on disciplined compassion, deferred gratification, and community contribution. “I dare do all that may become a man,” said Macbeth to his spouse as he pushed back against her urging him to murder King Duncan. He murdered the King anyhow.
But Macbeth got it right the first time: Being a real man means being civilized. This was an idea lost on Dylann Roof. We need to make sure it isn’t lost on others, whether they’re on campus or not.
We need to bring back the gentleman.
Is Death a Lot Like Decatur, Illinois?
It was the philosopher Ludwig Wittgenstein who first said that death is not a human experience, and at first you may disagree. Death, you will say, is a universal human experience: everyone dies.
That isn’t what Wittgenstein meant. He proposed something different: That all our human experiences come from human lives. Once death comes to us, you and I have no more human experiences. Thus for all the great differences in the world, the one between life and death is the greatest: far greater than the difference between animal and mineral, liberal and conservative, or Asian and African. There is nothing greater than the contrast between life and death.
Of course we can always imagine what it’s like to be dead. We can conjure up ghosts in sheets or beatific angels playing their harps on streets of gold—or even think of our worst enemy burning in Hell. But these are notions of death that come from the living. No one really knows what it’s like to be dead. There is no human experience of death.
We are stuck with an unbeatable fact. No one we know ever comes back from the dead. Someone can come back from Tibet or Java and tell us what it’s like there. No one can tell us what it’s like to be dead. Even Christ had nothing to say (that’s recorded anyhow) about what it was like during those three days in the tomb.
If future science, however, can truly bring someone back from the dead—can, say, restart a beating heart after two or three days of total cardiac stillness—then we might be able to ask someone: “So what do you recall when you were dead? What was it like—was it similar to France in any way?”
Imagine the anti-climax if the risen dead man says, “No, it was a lot like central Illinois, sort of flat.”
Roger Sterling: The Dark Side of the Greatest Generation
Roger Sterling is one of the four most famous characters on Mad Men—probably the second best-known male after Don Draper. Roger has stylish and attractive white hair, is lithe and lean of frame, and is irreverent in tone and attitude. He is charming and raffish. He is a hard drinker and an incessant womanizer. He divorces his first wife Mona and also splits with his second, Jane, who had been his 20-year-old secretary. He has a long-time affair with the voluptuous office manager Joan Holloway, whom he makes great with child.
Roger is crude and crass. He invites the divorced Don to Thanksgiving dinner where there will be beautiful young women. “If you find a turkey you like,” he tells Don, “you can stuff her.”
Roger’s personal qualities are often put down to his temperament but more often to the attitudes of the 1960s, when men were king and could do whatever they wanted. Roger is Exhibit A, right after Don, for why the women’s movement had to happen. Men were behaving badly and getting away with it.
But there’s another explanation for Roger: He was a Navy vet in the Pacific, part of the Greatest Generation that came of age in World War II. Roger’s generation won the war, and it decided it had a right to celebrate. Roger’s sense of entitlement comes from having been part of the Greatest Generation. Hadn’t he been part of the cohort that helped make America, post 1945, the most powerful and wealthy nation in history? So who is anyone to tell him that he can’t cheat and divorce and drink and say off-color things?
Roger Sterling is the Greatest Generation’s dark side.
Look to Your Thermostat for the Secret of Happiness!
The secret of happiness is found in balance between yourself and your environment. And if your thermostat on the wall could talk, it would say the same thing.
But the language would be about “inputs” and “outputs.” For the thermostat an input is cold or warm air in its vicinity. An output is a signal to the cooler or furnace that it needs to condition that air so it’s always at a set temperature—your choice.
I personally like 72 in summer and 68 in winter, but that’s just me. 72 in July and 68 in January make me happy.
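For the literal-minded, here is a minimal sketch, in Python, of the input-and-output loop the thermostat runs all day. The setpoint and the sample readings are made up for illustration.

SETPOINT_F = 72    # the temperature I choose (my input to the device)
TOLERANCE = 1      # how far the room may drift before the thermostat acts

def respond(room_temp_f):
    # Turn an input (the room temperature) into an output (a signal to the equipment).
    if room_temp_f > SETPOINT_F + TOLERANCE:
        return "signal the air conditioner: cool"
    if room_temp_f < SETPOINT_F - TOLERANCE:
        return "signal the furnace: heat"
    return "do nothing: the room is in balance"

# Three imaginary readings on a July afternoon.
for reading in (78.0, 72.5, 66.0):
    print(reading, "->", respond(reading))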
So what’s the lesson here? Let’s take something that makes you unhappy, like the loss of a job or a loved one or a lover. Consider the loss an input. Here’s where, if the thermostat could talk, it would teach you the secret of happiness: you need to match the miserable input with a proper output. And what is that?
It’s learning to live with the loss and figuring out how it might even become a good thing. It’s the Serenity Prayer: serenity to accept what I can’t change, courage to change what I can, and wisdom enough to know the difference. It’s finding a balance between the inputs (environments) and outputs (you) in life. Thermostats can do this automatically.
It takes us, well, a little longer.
But remembering our reliable friend on the wall makes it easier.
Coming Soon: Fake Orgasms That Feel Just Like The Real Thing
One of the great thought experiments in the history of artificial intelligence is the Turing Test, designed in the 1950s by the great mathematician Alan Turing, who proposed the principle on which the computer is based and is the subject of the recent film The Imitation Game. Turing’s question was simple: If you were in conversation with a computer, could you tell it was a computer and not a human being?
Well, now we have the Meloy Test, named here in honor of a leading pioneer in pain management, Dr. Stuart Meloy. The Orgasmatron is a by-product of his research. The test is for those who sail into the seas of virtual sex—can they tell the difference between it and the old-fashioned flesh version?
Appropriate technology is already well under way. Consider “haptic gloves,” which allow the user not only to manipulate items on screen but also to “feel” them; or kiss transmitters (toothbrush-like devices that allow the user to enjoy a fabulous smooch). As for the Orgasmatron, it entails the surgical insertion of electrodes (neurological implants) near the spinal cord to trigger orgasms neurologically.
Can “printable” erotic odors be far behind? Tomorrow’s consumer of virtual sex might not be able to tell the difference between virtual and “real” sex, and would not care to. Think of the problems this could solve. You really like this person, but he or she is just not that great in bed. But now your steady companion, with whom you might want to spend your life, is someone who doesn’t have to supply you with sexual satisfaction. You can go “elsewhere” without being unfaithful.
Well, you might be “unfaithful” with Orgasmatronic implants. But that’s not the same as sleeping with another man or woman. Is it? Am I right?
Alfred Hitchcock, Psychiatrist!
The roots of Alfred Hitchcock’s sensibility are traceable to his father’s having him locked up. Poor little Alfred, not yet the portly figure of later years, was sent to a jail cell and left alone. I think Alfred didn’t just perceive that he himself was in solitary. He also realized that outside the jail, in London, all sorts of people were doing all sorts of things without him, as though he didn’t exist.
This is as close as we can get to conceiving of ourselves as dead—for that’s what being dead is: Everyone goes on without you, and after a while no one remembers you at all, and yet they do just fine. It’s as though you never existed.
Hitchcock put this idea into his films. Take one of his last great pictures, Frenzy, in which a woman is strangled to death with a necktie, whereupon the camera backs out of the second story where she lies dead, down the stairs, and out into the teeming London street, where everyone is carrying on as always. She will never be part of them again. Or take that deadeye close-up of Janet Leigh, after her terrifying death in the shower in Psycho. That eye will never see anything again, and then the camera pans to the money she stole, as though to say, “Whatever happens to that money will be of no consequence to her: not now, not ever.”
This is depressing, but Hitchcock also shows us how to cheer ourselves up. In this sense he’s a great psychiatrist without charging us any money. What do I mean? Well, go out to a coffee shop or sit on a park bench by yourself. Watch all the people go by, doing their thing, and pretend that you are dead and that they’re doing it all without you, as though you never were. For about ten minutes you’ll feel awful. Then leave the shop or the bench and be cheerful, for it isn’t true: You aren’t dead yet. Celebrate!
I’ll bet you feel better already! But don’t thank me. Thank Alfred.
The Three Great Fallacies of Life
Here are three widespread but mistaken beliefs through which we human earthlings ruin, or even prematurely end, our lives. If you are reading this, you are alive. It’s not too late to change.
The Fallacy of Knowledge. We go around in life pretending that we know what we’re doing. We criticize others when they err, and we act as though they, too, should know what they’re doing about luck, kids, relationships, careers, decisions, and so forth. It isn’t true. Someone once said that living life was like taking violin lessons at the same time you’re performing in Carnegie Hall. We make it up as we go along. We bumble along. Life has far too many surprises and complexities for us to be good at dealing with them. If we learn how to do one or two specialized things well, we’re doing a relatively good job. As for the rest of it, we’re likely to screw up. Remember this. Stop being so hard on yourself. Forgive yourself. And bumble some more.
The Fallacy of Quality. One of the most commonly stated bromides is this: “It’s not the length of life but the quality of life that counts.” This is untrue. Once life goes, everything else goes with it. The distinction between moon and earth, liberal and conservative, or physics and poetry is nothing compared to the distinction between life and death. The difference between life and death is the Whole Ball Game. So if you are tempted to do something really dangerous and risky on the basis of the fact that “it isn’t length but quality that counts,” then think again. A dull life that goes to 80 just might be preferable to a perilously exciting one that goes to just 20. When people die young, we console ourselves with this quality versus length business. It’s a false consolation. As Shakespeare’s Falstaff said about the foolhardy Sir Walter Blunt, who died honorably in battle, “I like not such honor as Sir Walter has.” As the Bee Gees put it, “Stayin’ Alive” is everything.
The Fallacy of Skill. You often hear that life’s success is made up of skill, not luck. That’s another fallacy. Yes, skill is important, but luck is far more significant. For one thing, you have to be born under lucky circumstances in order to acquire skill. If you grow up in a prosperous household with a big vocabulary you’re much more likely to do well in school and in life. This is well established by social scientists over and over again. And then there are all those skillful people who had the bad luck of developing leukemia when they were young, or of being in the wrong place in the wrong car at the wrong time, and who had their lives snuffed out young. If you’re good at something, you may think you’re in charge in life. Don’t believe it. Be grateful for your good luck, and pray that it continues.
I wish you good living!
The Curious Case of the Happy Colonoscopy
Anyone who’s endured a colonoscopy (a tubular probe of the intestines to screen for cancer) knows that it is an unpleasant experience, an unhappy one of fasting and purging and being stuck in the arm with an anesthetic. During the prep and right before the procedure, it’s miserable. But an experiment showed that those who got a little extra stretch of mild pain at the very end, when they were awake after the anesthetic, remembered their colonoscopy as a happier experience.
Why? Because they could contrast the earlier severe pain with the mild later pain.
They could recall the end of the experience more vividly—“it’s really not bad at all at the finish line.” And that trumped the overall misery. These folks were more likely to sign up for a colonoscopy later when it was time to have one.
Happiness depends much less on how a thing feels at the moment than on how things play out and are recalled. The experiment is cited by Daniel Kahneman, a psychologist who won the Nobel Prize in economics, in his great book Thinking, Fast and Slow.
Pain or pleasure comes in two forms: the feelings of the moment and the memory later. The memory trumps the moment. How happy are you? That depends on how many happy endings you remember. And this is proved by The Curious Case of the Happy Colonoscopy.
Masochists of the world, unite! You have nothing to lose but your pains!
You’re A Victim, but I’m An Even Bigger One
The best short thing ever written about victimization was by the great French writer Jean de La Fontaine in his little fable, “The Wolf and the Lamb.” The wolf accuses the little lamb of spoiling the drinking water in the river, whereupon the lamb says that’s impossible, for “I was just born.” But the wolf persists in his accusation. The lamb says, “I’m standing downstream from you, so how could I have ruined the water?” The wolf then says that even if the lamb himself didn’t spoil the water, the other sheep have done so, whereupon the wolf eats the lamb without any further ritual of “justice.”
The wolf wanted to eat the lamb, but he decided that first he had to claim the lamb was victimizing him. Notice: the wolf is the victim here. In her best seller about the Norwegian mass murderer Anders Breivik, the journalist Åsne Seierstad shows how Breivik thought of himself as a victim—of Muslims and feminists. Once he had convinced himself that he was a victim, he could, like the wolf, take action. And so he killed over seventy people.
Hitler thought of himself as a victim—of Jews. Stalin thought himself a victim—of followers of Leon Trotsky. Once victims, they soon became killers.
Åsne Seierstad calls her book on Breivik One of Us, and indeed he is one of us, for this is a human tendency. We think we have the right to defend ourselves as victims. The trick is to make sure enough people know we’re victims—and sometimes the only person we have to convince is ourselves. Humanity is sometimes pretty ugly—and pretty apt to stay that way for a pretty long time.
How To Be Bratty With Your GPS
My own experience with GPS is limited. I once borrowed a friend’s to help me return from Minneapolis to my home near Madison, Wisconsin; and my partner Ron Nief has one on his smart phone that he uses to navigate us when we rent a car to travel to speeches for the Mindset List. Other than that, I have scant experience, except for travel with others who use it all the time. I’m familiar with the pleasant lady who orders the driver to turn left on Exit 129 in one half mile.
Because I’ve not used GPS much, I’ve only set it for myself one time. Then I got a sense that I am ordering the device and that the device is not ordering me. I am the one giving commands, and although “she” seems to be commanding me to turn left or right, in fact she is really carrying out my own instructions. This is reassuring.
Still, something about all that being-told-what-to-do is disconcerting. Maybe I just have a problem with authority, and have had ever since I argued with my Sunday School teacher about the weapon David used to kill Goliath (she said it was a stone; I said it was a brick—she was right). Still, how about turning right instead of left? What would the GPS do? Would he, she, or it say, “You can’t do that; now go back and do it right this time”? Suppose you want to pause for a cup of coffee and make an unauthorized stop. Does the GPS tell you that you can’t get out of the car? To stop for needed coffee is necessary, but to stop just to get a rise out of the GPS is bratty.
But I don’t really think the problem with all these voice commands is that we are going to become slaves to our technology and need to rebel. No. The problem is that we’ll be giving orders to our smart phones (“Get me a photo of Meryl Streep in a blue dress”) and taking their compliance for granted. We’ll become so used to getting our way that we’ll think that that’s how life should always be.
There are worse things than being bratty with your GPS; there’s also expecting your GPS to be your humble servant, whom you don’t even have to thank. And pretty soon you’ll start treating your spouse that way. Mark my word: with the rise of voice commands on high technology will come a terrible spike in the divorce rate. When it happens, ask (don’t tell) your smart phone to look it up. Make sure you’re polite, too.
3 Reasons We All Have To Die
The common sense answer to the question is that we get old, like a washing machine or computer that’s seen better days. But let’s probe beneath common sense. If we do, we find three deeper reasons why death is universal.
*Our natural selection is mediocre. Natural selection has done great things for us human beings. We have such fine brains that it is we who study and regulate kangaroos, rather than vice-versa. But natural selection is pretty lousy when it comes to preventing death. Our tissues are soft enough so that a sharp knife or a bad car crash can do us in. And apparently our genes don’t have much use for us once we reach the age of grandparents. We can help take care of our children’s children, but we aren’t needed to take care of our kids’ kids’ kids. Having no need of us, our genes have had no interest in evolving to make us last longer. Even high medical technology will not be able to put off our deaths forever.
*We need to go to a much-improved place. The idea is that in order to live forever in the afterlife, we must shed our bodies (die). Heaven is a perfect place, but there are lots of imperfections in our bodies with their aches and pains and wrinkles and moles. Only the soul can be perfect, so only a soul can live in a locale (Paradise) totally blemish-free. This is not a scientific argument because there’s no way to falsify it. But unless you believe that science alone can tell us the truth about everything, the possibility of an afterlife cannot be ruled out, even if you don’t believe in it.
*We need to get out of the way. Whenever we die we leave behind a network of relatives and friends. It’s apparently fated that we die while they get to live. Thus they get on with their lives. Life is for the living. If we died and then came back to life, then they’d hardly welcome us. We’d be in the way. They’ve gotten along without us: re-married, made new friends, gotten over the grieving, spent the inheritance, and so on. This is a better argument for why we stay dead once we die than for why we die in the first place, but those left behind might consider that it was fate that we died, for our death has opened up new possibilities (including some good ones).
Death remains a mystery. All three of these reasons may even be true at once. But as a man in the funeral parlor business once said to me, “Folks are dying who’ve never died before.”
Harper Lee and The Deadheads
We’re about to witness two major cultural events: The Fare Thee Well tour of the Grateful Dead and the release of a second Harper Lee novel, called Go Set a Watchman. These two happenings reveal a persistent desire: If you liked a great party once upon a time, no matter how long ago, try to capture the same old magic again by re-throwing it.
The surviving members of The Grateful Dead include Bob Weir and three others. Weir says that while some old Deadheads will want them to play for five hours, they won’t and at their age they can’t. Once they did play for five hours, Weir adds, but just once. The word spread that they always did, but they haven’t done so since the 70s. The lead guitarist for this tour won’t be Jerry Garcia but Trey Anastasio from Phish.
In other words, it won’t be the same.
Harper Lee is now quite old and living in an Alabama nursing home. She has always eschewed becoming a public person—no Tonight show appearances for her. It’s not yet clear whether her new novel will provide the same blend of sentiment and rectitude, adventure story and family coherence, that has made To Kill A Mockingbird an enduringly fascinating American classic.
But readers should be prepared for the possibility that this party won’t have the mystique of the previous one. They should be grateful to have had the original party the first time.
Harper Lee does have one advantage over The Grateful Dead: she isn’t a public performer. She’s a writer, so her “live” appearances occur every time you read one of her (two) books. She can be dead and still be “live.” Not so, the Grateful Dead. Of course there are all the recordings, but it’s not the same as a live performance.
You don’t need Harper Lee to read To Kill A Mockingbird out loud to you in order to love it.
But we still miss Jerry Garcia and always will.
3 Reasons Why Caitlyn Jenner Should Read King Lear
As everyone not on Mars knows by now, the athlete and celebrity Bruce Jenner has changed his sex. She is now Caitlyn Jenner. No doubt Ms. Jenner has a lot to do these days, as she is more famous than ever. But she might like to know that Shakespeare got to the topic of Caitlyn’s new identity first, especially in King Lear.
*Boys. As in all of Shakespeare’s plays, the original production of Lear had boys playing the parts of women. These precocious lads, with their piping high voices, portrayed Cordelia, Regan, and Goneril, Lear’s three daughters. This cross-dressing was part of the show—yet another magical feature of the stage. Ms. Jenner might consider that it’ll take a while for her to be thought of as just another woman and not a personality of show biz.
*Nature. The theory behind Ms. Jenner’s sex change is that if you aren’t comfortable being a man, you can become a woman. You don’t have to bend to genetic determinism—or surrender to nature. In King Lear the word “nature” appears many times. Sometimes its poor characters are caught in nature’s cruel indifference—Lear is an old king tossed into the storm by two of his daughters—while at other moments nature seems to be supported by a kindly God. Ms. Jenner must have felt that nature played a nasty joke on her by making her a man, but that there’s a Higher Nature, which gives her the right to become whichever sex is the authentic Jenner.
*Fate. Finally, in the play Lear discovers that you can’t count yourself happy until you’re dead. Here he is, an honored and retiring king, yet because he bets on the wrong daughters his worst days are ahead of him. By play’s end he’s ready to die just to get relief. One wishes Ms. Jenner the best in her new life adventure. Yet it is when we get what we want, even through marvelous biotechnology, that we might be most vulnerable. The Fates lure us into overconfidence, just as they did Lear.
I once met someone who knew Bruce Jenner as a child. She told me he wasn’t much of a reader. It’s not too late for Caitlyn.
I’m A Freak, and So Are You!
I’m a freak. Perhaps this judgment is one with which my wife would agree, but it happens to be true anyhow. What is a freak? It’s something that did not have to happen, and that required all sorts of other things to happen before it could happen. Take my very existence. Think of all the people (grandparents and great grandparents and so on) who had to meet and copulate before my Mother and Father could even exist, and then of course they had to meet, too, and well, you get the idea. All of us owe our existences to a series of coincidences. There were so many ways by which we might never have been born. And yet we were. We’re freaks, all of us. I’m one; you’re one.
This is how life works. Take mass murderers. Killing our fellow human beings is really abnormal, and it’s not easy to do. That’s why in war the military works so hard to create a mindset—where have I heard that word before?—by which to make killing enemies no big deal: by rendering them sub-human, the foe, the Hun, or the Jap, or the Yank or whatever.
So if you take mass murderers in peacetime—like the guy who killed all those unlucky people in a Colorado movie theater or the one who killed all those poor kids at a Norway day camp—they are really freakish. Yes, they had a hard time in life, but lots of us do. Yes, they were full of hatred, but lots of us are. Yet for them everything came together to the point where they actually—with a mindset of war—killed all those people.
Everything came together. These people are freaks in that they killed multiple others. We’re freaks in that we were born in the first place.
The world is a much freakier place than we might like to admit. Is that comforting—or scary?
Disgusting! Why Dirty Toilets Make Us Do Wicked Things
We've all seen dirty toilets, and I'll not go into details except to say that such filth is repellent. Now a new study reveals that when folks have seen a disgustingly unclean toilet, as featured in the film Trainspotting, they are much more likely to cheat in a game they play shortly afterwards. The theory, according to social science correspondent Shankar Vedantam, is that disgust triggers self-preservation, which in turn triggers cheating.
So a repulsively dirty toilet can make you do wicked things, even though there is no logical connection whatever between the toilet and the person you’re cheating. Score one for Freud, who long maintained that our lives are unconsciously motivated. But even more important, score one for Darwin.
Towards the end of his career Darwin made a study of human emotions, including such dominant feelings as disgust and joy. Why would Darwin decide to study emotions? And the answer is at once simple and subversive: Because emotions are human adaptations. They are the motivations we human beings evolved in order to beat the competition in a state of nature.
Suppose you encounter, in the woods or on the savannah, something truly ugly and threatening. Now if you are a member of Species A, you might feel sorry for this threat and try to help it. If you are a member of Species B, you will try to destroy it, even if that means cheating it to death! Which species is more likely to survive: A or B? If you answered B, then BINGO!
And human beings are Species B. Human emotions can be wonderful or horrible. Take joy. We can absorb the joy of the group and then become inspired to murder Gypsies or feed hungry children. Unfortunately, the habits we developed in order to get here at all are also the ones that sometimes propel us to do awful things. This was Darwin’s great insight. It’s a disturbing one. And it’s a major reason why even those who think they understand Darwin don’t really want to take him seriously. And that’s…disgusting!
Americans Are Narcissists—and It’s a Damned Good Thing, Too!
In the 1830s the Frenchman Alexis de Tocqueville visited the most democratic country on earth. America sounded the death knell for the old idea of aristocracy, in which an elite few set standards for the obedient many. By the time Tocqueville arrived in the United States there had already been a ferocious revolution in France that displaced the old order, and there would be others in Europe as well. Britain didn't have one, but it was a near thing.
Tocqueville had a lot to say about the new country—he noted our love of making money and worried that mass public opinion might replace kings as a form of tyranny. But he was struck above all by how individual Americans were excited by one thing: "the survey of themselves." Each American, he thought, is terribly interested in his own significance, however "paltry" it may be. And the result, he thought, was a peculiar kind of grandeur.
The pronoun “his” is not accidental, for women, Indians, and black men were given much less permission to be enthralled by their own importance. But we should be glad about how right Tocqueville otherwise was. He almost predicted the uniquely American skyscraper, where many live insipid lives in their own little units but are proud of the great American building, which gives them a personal plus they otherwise wouldn’t have. Perhaps Walt Whitman said it best:
I celebrate myself, and sing myself,
And what I assume you shall assume,
For every atom belonging to me as good belongs to you.
This would be narcissism: an exaggerated sense of one’s own noteworthiness. And yet you can’t help but wonder how many overachieving Americans–like Thomas Edison or Sojourner Truth or Bill Gates or Martin Luther King, Jr.—were propelled by the idea that in America a single person can be a huge creator of differences. Narcissists don’t always have good “reality testing,” a lack of which has often gotten the United States in trouble. But if King had told anyone in 1955 that in ten years blacks would be able to vote all over the South or Edison had told anyone in 1865 that there would be “talking machines,” they’d have both been declared crazy narcissists. And they were narcissists.
They were also Americans.
Where Is Everybody? 4 Reasons We’re Alone in the Universe
In 1950 the great scientist Enrico Fermi asked the great scientist Edward Teller, “Where is everybody?” Are we truly alone in the entire, virtually limitless universe? Such a big world out there: Why haven’t we heard from anyone else? Here are four reasons why we’re alone:
*There isn’t anyone else. The conditions by which life came to be on our planet are unique: just the right blend of sunlight, air pressure, soil, and so forth. They are beyond unlikely to be duplicated anywhere else. Theists think this proves God had a hand; skeptics think that statistical anomalies just happen.
*There was somebody else, but they've died off. There may be at least one planet in every solar system that once had a civilization but now is barren. Warning: It will one day happen to us, given enough time. Even if we survive climate change and nuclear holocaust, someday the swelling sun will engulf the earth, the biggest nuclear furnace of them all.
*There is somebody else, and they're trying to reach us—but they never will, because it's too hard. Take our own plans for a manned mission to Mars. The trip will take nine months in super-cramped, closely confined conditions. Even upon landing, human beings may have to wait three more months for temperatures on the planet to cool enough so that our astronauts can go outside. Living in the planet's greatly reduced air pressure is difficult. When dogs have been put into a vacuum they bulged so much beyond normal size that some of them died. Yet Mars is comparatively just a short trip: it's after all still in our solar system.
*There is somebody else, but who knows when they’ll reach us? In this scenario you’ll just have to be patient. It could happen anytime, or never. Meanwhile, if you’re feeling lonely, you’d better compromise and call on that grouchy neighbor. Waiting for your pals from the planet Perseus to show up is probably inefficient.
Wanna Live Forever? Join AARP ASAP!
The other day I heard a woman on the radio discuss her recent experience of trying on “a shapely dress” in a department store dressing room. She looked in the mirror and saw a wrinkled body with liver spots. She went on to say that she had accepted her advanced age and had concluded that what she lacked in beauty she had gained in wisdom.
Good for her, but that is not the perspective of the American Association of Retired Persons. If you google AARP Magazine covers over the past several years, whom will you find? The still-vibrant Brad Pitt, Susan Sarandon, Gloria Estefan, Harrison Ford, Michael Douglas, Sharon Stone, and Jack Nicholson, among others. To be sure, at least one of these, Michael Douglas, has had some major health problems. But he still looks good and has no doubt a Hollywood retinue to make sure of it for as long as possible.
The AARP is not the AAOP—the American Association of Old People. You only have to break 50 in order to become a member of AARP. It seems the organization, shrewdly, has realized that there is not a big market for old people in the United States: the wise, wrinkled village elder is not a popular meme in America. We want to be young in this country, and for as long as possible. We are devoted to the idea of the vital retired person, who still looks good, works out every day, takes Viagra if necessary, and is sailing the ocean blue on his private ship when he isn’t still doing some occasional consulting. If there is wisdom, it is linked with (still somewhat flourishing) beauty. You can have both a Merry Christmas and a Happy New Year!
It was not always thus. Death used to come earlier, and it was a much more public affair. Family members would gather round to watch Uncle John breathe his last. Funeral processions often went through the center of town. I can recall attending the funerals of elderly people in Central Texas who were praised for their wise piety, which they could only have gained by getting old to the point of dying.
Before he himself died, author John Updike said that Americans want to be young. As a keen observer of American life, he was right. AARP: take notice. But it already has.
Would Skype, DNA, and Body Cams Have Ruined Shakespeare?
One of the great mysteries confronted by Hamlet is whether or not his uncle killed his father. Most readers conclude that Uncle Claudius was indeed guilty of poisoning Hamlet's father and taking over the throne, thus shoving Hamlet himself aside. But the play's alluring enigma is that we never quite know that for sure. Uncle Claudius is guilty of something—he says so—but it's never entirely clear that it's murder and hence regicide.
Had DNA been around, there would have been no mystery. No doubt Claudius would have left his genetic signature on the scene: mystery solved.
King Lear's problem is that by the time his kindly daughter Cordelia finds out how mistreated the old man is, it's too late. His wicked daughters have sadistically rendered him frail and mad—a condition from which he never recovers. Cordelia is off in France, and in those pre-digital centuries messages took a long time to travel. If only Lear could have just Skyped Cordelia, she would have arrived in time to save him.
Othello thinks his wife Desdemona is having an affair with one of his underlings, Cassio, and eventually murders her out of jealousy. Here DNA wouldn’t have proved anything, for Cassio had certainly spent some face time with Desdemona. But wait: she could have proposed to the jealous Othello that she wear a body camera all day. This would have documented her faithfulness. The problem of suspicion would have been solved.
The difficulties of jealousy, suspicion, and distance would have been solved. But the plots would have been spoiled. There would have been no mystery and no tragedy. Right?
But before we get down on high technology as the ruination of Shakespeare, let’s think again. Claudius would have had to agree to allow DNA testing, and Claudius was a tyrant. The good Cordelia’s digital records could have been scooped up by her sisters’ intelligence network. She would have been stopped at the border. And Othello might easily have killed Desdemona before she could calm him down enough to propose the body cams.
High tech changes everything—except human nature, about which Shakespeare was an expert.
Texans Who Think They’re Sam Houston
Laure Murat has written a book called The Man Who Thought He Was Napoleon, a history of madness during and after the French Revolution. Murat describes the rising incidence of insanity during a time of revolutionary change: as a king was beheaded, ancient institutions were struck down, and bread was redistributed, things were changing very, very fast indeed. Some Frenchmen, it seems, simply went nuts. Some of them worried, needlessly, that they would have their own heads chopped off. The wealthy, those who had supported the King, were not only in fear of their lives. They also had major breakdowns as they saw themselves losing everything overnight to the revolutionary state.
Paranoia and real enemies were hard to tease apart.
This was also a time when madness was linked to politics. Those with politically strange views were declared insane and locked up in an asylum. In the 1930s the same thing happened in Stalin’s Russia: those who challenged Marxist orthodoxy, if they weren’t shot, were confined for the rest of their days to ghastly loony bins. And as French fortunes rose and fell with subsequent revolutions and wars, some Frenchmen decided that they were Napoleon, come back to rescue France from chaos that only the great military genius of the past could cure. Would-be Napoleons cropped up faster than Elvis impersonators. (Old Joke: A man walks into a psychiatrist’s office dressed as Napoleon and says to the shrink: “I’m here because I’m worried about my crazy brother.”)
When I was a lad growing up in Texas, we had our own Napoleon: Sam Houston, who was not only the leader of the Republic of Texas in the 1830s but also the hero who had defeated the Mexican Army at the Battle of San Jacinto. It was Sam Houston who created Texas independence. One of my most delightful memories was when my 7th grade Texas history teacher did a one-man recreation of San Jacinto.
Independence was vital to Texans, and still is. The “Don’t Mess With Texas” signs warn against littering the highway, but they also refer to a love of self-reliance and liberty that is peculiarly Texan. Now Texans are faced with allegations that the United States Army, as it begins drills in the state, is plotting to take it over.
Texans have lived through what they regard as revolutionary times: a president with a strange-sounding name who mandates that everyone must buy health insurance and thinks climate change is more important than oil drilling. And so, just as did the French during their revolutionary period (which really was a revolutionary period), Texans also look to a more heroic past. For the French it was Napoleon. For Texans it’s tacitly Sam Houston, who would have known what to do with those invading Obama soldiers.
For the record, the United States Army has denied it wants to take over Texas. But it’s not a bad idea for the spirit of Sam Houston to keep an eye on them, just in case. Napoleon himself might have agreed, and he’s probably around somewhere, too!
Millennials: The Digital Spirituals
Not long ago in this series I concluded that "Millennials are quite the opposite of young Africans and Asians in their attitudes towards religion because Millennials are progressive, rational, consumerist, and 'infertile' (in the sense that they do not wish to have a lot of children). They are un-African and un-Asian."
But I might have added that if Millennials aren’t religious—in the sense of being formally affiliated with denominations—they do say they are “spiritual.” More than half claim to have been in touch with God in some way. On our campus, for instance, there has been a meditation room for over a decade.
Even if we have an idea of why they aren’t religious, why should they be spiritual instead?
My explanation starts with a fact hidden in plain sight: When Millennials are “connected” via texts or Skype or email, they are individually alone. They do not “do” face-to-face. Each of them is in solitude with their smart phone or tablet.
One function of religion was to provide a gathering place for meeting like-minded people and visiting with them. The unalterable reality was that in order to meet and visit in those pre-digital days, two bodies had to be in the same place at the same time. That is no longer necessary. You can meet and visit with anyone now, in real time, and be many miles away from them. And this, I think, is a key to Millennial spirituality.
Millennials are used to “visiting alone,” so why shouldn’t they be attracted to “worshipping alone”? Is it any wonder that they find God in meditation and solitude, and not in the company of other bodies and souls? Of course we oldsters text, too, but we are digital immigrants. We were honed on face-to-face. Millennials, digital natives, were not.
They find God just as they text: by themselves. dear god: do u want 2 meditate 2day?
Everything You Need To Know Can Be Learned From A Cat
In my household I have what’s sometimes called “one of each,” though by that I don’t mean “a boy and a girl” (I have that, too, but they’ve moved away). No, I mean I have a dog and a cat. They are both lively and get along well together, but I think the cat is more instructive about life.
My cat sometimes jumps into my lap, kneads her paws, and purrs. Then she curls up and goes to sleep. This supplies me with immense pleasure. Some people might assume that the cat just wants a warm place to snooze. Not me: I assume she is recognizing my good looks, sharp mind, and wise perspective. The cat's got good taste.
But here’s the thing: She won’t do this every day. Sometimes I even sit on the couch and suggest that she join me, in perfect English by the way. Sometimes she does, but generally she doesn’t. Often, when she does climb onto my lap, she does so unbidden. This is where she begins to teach me about life, for life itself is like my cat. Sometimes it singles you out for flattery, praise, and warmth. It rarely does so if you ask it to. Most of the time it is indifferent. Every now and then, when you least expect it, it pays you a welcome visit.
That’s not quite all there is to life. Sometimes it’s good to you, sometimes indifferent—but sometimes bad as well. Even here, a cat has been a useful reminder. Years ago, we were trying to breed our Persian cat Madame Bovary to a male named Freak. The breeder told us to lock them in the bathroom in order to promote their friendship. Freak had gotten stuck in a corner, so I tried to pick him up and help him out. He wasn’t grateful. He helped me to several stitches in the hospital. The doctor even thought I might need an overnight stay for an antibiotic drip.
I didn’t, but this proves that when life isn’t treating you well, it’s ignoring you; and when it isn’t doing that, it gives you a bit of shipwreck. Life is a cat. Her name’s Dori, by the way. And yes, right now she’s paying me no mind whatever.
Millennials: Never “Be Yourselves!”
A recent book about James Boswell, the famous biographer of Dr. Samuel Johnson, details how disappointed in Boswell his father was. This was Scotland in the 1700s, where Calvinism was the norm. The elder Boswell was a devout member of the Church of Scotland. He had a dark and dour disposition. He thought his fellow human beings were sinners in the hands of an angry God. Boswell, Senior, was a rich and powerful man. He didn’t need to put on airs for anyone. He could be a righteous grump every day.
But Boswell, Junior, was different. He traveled all over Europe in search of the famed philosophers like Voltaire and Rousseau. He even tried to make an appointment with the Pope. Finally, he befriended the profound and wise Dr. Johnson and wrote his great and entertaining life story. He played the witty Johnson’s straight man. He was a young man on the make.
Boswell, Senior, looked upon his son and decided he was a phony. Boswell, Junior was a pliant, adaptable man. In order to ingratiate himself with the greats of Europe, he had to play roles: the flatterer, the boon companion, or the listener. He had to put on airs and try to get as much information as possible for his books and journalism.
Boswell, Senior is an anachronism. Boswell, Junior is not—in fact, he was an early version of us. We live in a world of diverse persons and viewpoints. We are thrown together with these varieties all the time. We have to adapt to all sorts of people in order to win that job or close that deal. Boswell, Senior lived in a time when a single religion dictated most lifestyles. Junior, the next generation, lived in an era of growing capitalism: our world.
Senior wanted Junior to “be himself.” My parents used to give me this advice. It’s both misleading and dishonest. Millennials: Take heed. But it does help if, when you’re putting on your acts, you can convince yourself that it’s really you.
Why Our Puritan Forebears Were Squishy Liberals About Abortion
In 1729 a Philadelphia publisher named Samuel Keimer revealed the methods by which colonial women ended pregnancies. They included "immoderate Evacuations, violent Motions, sudden Passions, Frights … violent Purgatives and in the general anything that tends to promote the Menses." They also included herbs such as nutmeg, slippery elm, rue, and squills. They sometimes involved much more poisonous stuff, such as turpentine. Women might take hot baths and drink lots of gin. They might take extreme exercise (jumping up and down for long periods was a favorite). They might even fall down the stairs and hope it was a controlled descent.
Were these precarious methods legal in the American colonies–even in Puritan New England? They were, as long as women used them before they felt movement in the womb—otherwise known as "quickening." Not until one hundred years after Keimer's article did a state (New York) make "pre-quickening" abortions illegal—a misdemeanor. But from then on, abortions increasingly became illegal across the United States, and stayed that way until Roe versus Wade in 1973.
Legally, the U.S. was a no-abortions-allowed zone for over one hundred years. Even today, some Americans are opposed to all abortions, and as such are “to the right” even of our Puritan forebears.
So why were all abortions outlawed for over a century? Two reasons: As doctors became more scientific, they decided that “amateur” abortions were unsafe and sought to have all of them made illegal. And as concern about immigrants from Eastern and Southern Europe began to mount, more traditional Americans thought we needed more “true American” babies. Even feminists in the nineteenth century were opposed to abortions. For them, control over their reproduction consisted of what they called “voluntary motherhood”—which meant abstinence, whatever their husbands might think.
So why did even pious Puritans think pre-quickening abortions were OK? The likely answer is that before the fetus could move in the womb, it was considered to be not really alive. Thus colonial America, as well as contemporary America, came back to the same old question: When does “life” begin?
Shakespeare and the Terrible Wonders of 9-11
The Twin Towers of the World Trade Center were spectacular, and so was their fiery demise. Put the two together—construction and destruction—and you have two wonders of the modern world: skyscrapers and jet propulsion. Although we have become jaded about both, they are both still marvelous. Imagine looking down at people the size of ants. Imagine going to sleep in L.A. and awakening in Tokyo.
The human wish for wonder has not diminished. We all like to be enthralled. Although the adjective “wonderful” is overworked, there are things in this world that are genuinely full—to the brim—with wonder: unremitting spectacle and magic that strikes one and all with awe. Think about the grand finale of firecrackers on the 4th of July when the audience goes “wow” in unison and suddenly starts to clap.
But what did the human race do before tall buildings and suicidal jetliners—before incredible surgery and instant information? Well, there was something called art and something called religion. We will leave religion for another time. Let’s take art. Let’s go to the top: Shakespeare.
Was, or is, Shakespeare "wonderful?" He is—but why? Here are a few clues. One of his characters, a superstitious Welshman named Glendower, says he can "call spirits from the vasty deep"—summon ghosts from Hell. Another of his characters, Theseus, the Duke of Athens, says that poets can give almost nothing at all "a local habitation and a name"—an artist like Shakespeare can make you see castles where there are none and feel jealousy that is entirely make-believe.
Moderns build skyscrapers with brick and steel and mortar. Terrorists tear them down with stolen flying machines. Shakespeare was a builder, too—the very term playwright means “play builder.” But Shakespeare uses words to build the doubt and destruction of Hamlet, or the numb fury and terrible comeuppance of Macbeth. Once upon a time human beings thought Shakespeare was as wonderful as the Twin Towers used to be. But back then there were no bypass operations or virtual sexual climaxes or even skyscrapers. Folks just had the likes of Shakespeare, and he was (still is) wonderful indeed.
Idealistic Robots: Be Very Afraid!
The increasing place of robots in our society is assured. But there’s a risky misunderstanding about the dangers they pose. Most observers think that robots are perilous because they can pilot drones and help us evade responsibility for killing our fellow human beings. Or they think robots are going to take away our jobs.
But the real problem with robots is that they might become idealistic.
The real problem with robots is that they might become…like us. Yikes!
One of the great human ideals is “making others happy.” Sometimes we do that with a helping hand. It seems harmless and even noble. But Hitler also wanted to “make others happy” by killing millions of Jews. Stalin wanted to “make others happy” by killing off “anti-Soviet elements.” We think that curing cancer is a way to make others happy. But Hitler thought the murder of Jews was an extermination of undesirable elements and that non-Jewish Germany would be much healthier as a result.
It's one thing for robots to be programmed for specific tasks, such as washing dishes or driving cars. But if robots are programmed with higher value ideals, such as "making others happy," in time they too will acquire a sinister variety of methods. They will conclude that "making others happy" includes the destruction of their former human "masters." And they'll be much better at this sort of "happiness-making destruction" than we are. Remember the computers that beat Jeopardy and chess champions?
Robots with ideals will mark the end of humankind. Make a note of it.
Jock Straps in the Band Hall
OK, right up front I've got to admit it: When I was a young teen I played a game of catch with athletic supporters (jock straps) in my high school band hall. I have no recollection of where or how I came upon them. But I do recall tossing them to and fro. I wasn't the only one. I can't remember where the band director was. Some fellow students, no doubt outraged and embarrassed, reported us, and we all got paddled by the principal—yes, corporal punishment was quite permitted in those days.
There was a specific assignment of fault. We snot-nosed boys were sinners. We had done a bad thing because we were bad people. I have no recollection of ever saying, to the principal, “I made a mistake,” much less the passive and evasive “Mistakes were made.” A “mistake” was an error in Latin translation or an unsolved algebra problem. There were no “mistakes” in the matters of morality.
Of course, as we say in the Mindset List® business, that was then but this is now. When a prominent NFL running back punched out his wife on an elevator—it was caught on video—he said he had made a bad mistake. He did not say he was a bad person. We are told that the proud Bush family, rather dynastic, has difficulty admitting mistakes. And then there was the Catholic doctrine of Papal Infallibility. So we go from admitting a mistake (but not evil) to having enormous difficulty conceding a mistake to the idea that the highest church official simply cannot make a mistake. We’ve come a long way here from bad boys having done a heinous thing by tossing jock straps in the band hall.
Today, when all sorts of people are exposed thanks to high-tech information—people from governors to members of Congress to football stars—the apology has become a rhetorical art form, but apologies rarely take the form of "I am a bad person," perhaps because bad people would have to give up politics or football, and our apologists don't want to do that.
Homer Simpson was more honest when he admitted that he hated Ned Flanders, his pious neighbor, for one simple reason: “Because he’s a better person than I am.”
Anne Boleyn and Jeb Bush
Dynastic politics are all about sperm and eggs. The acclaimed PBS series Wolf Hall, based on Hilary Mantel's novel, traces the unfortunate fall of Anne Boleyn, whom Henry VIII took as his second wife (over the Pope's objections) because his first wife could not give him a male heir. In the event neither could Anne, though she did give birth to Princess Elizabeth, who became the greatest English monarch. Of course her tyrannical and mercurial father could not have known about his daughter's brilliant future. All he knew was that she was female.
Why the need for a male heir? The tragic career of Mary, Queen of Scots will tell us why. As a female monarch, Mary had a couple of mating choices. She could marry a foreign royal, who would then wish to take over the country. Or she could marry one of her nobles, who would then be plotted against by all the other, jealous nobles. Mary took both these routes, and both became disasters for her. Her cousin Elizabeth wisely took lovers on the side and proclaimed herself a “virgin queen.” Such political smarts also tell us why she was so able a monarch. Mary, Queen of Scots, however, lost her head on the chopping block, and it was Elizabeth who agreed to let it happen.
Anne Boleyn was also decapitated. A male heir wasn't absolutely necessary, but Henry figured that a daughter queen would more likely turn out like Mary than like Elizabeth. Henry needed a third wife, so he executed the second one on trumped-up charges of adultery and treason.
Dynastic politics can't escape the luck of sperm and egg. Take Jeb Bush. Just as Anne Boleyn couldn't choose whether she had a daughter or a son, Jeb couldn't choose his siblings, of whom one happens to have been the nation's 43rd president. Being a Bush is a great advantage for Jeb until it comes to talking about his older brother's Iraq policy. Then he is caught between the unpopularity of that policy and his blood loyalty to George W. He gets tongue-tied and has only recently been able to talk plainly.
But losing an election is far better than losing one’s head.
Amtrak and the Future of Morality
By now nearly everyone in the United States has heard about the tragic passenger train accident in Philadelphia. An Amtrak train was going around a bend at one hundred miles an hour, while the normal and recommended speed limit for that stretch is only fifty miles an hour. It seems likely that the excess speed had a lot to do with the deaths of six people.
Commentary so far has made two points: that the engineer was inattentive or distracted and that the whole sorry thing could have been prevented had the engine only had what’s known as “positive train control.” This is an automated system that knows the track ahead and adjusts the train speed to the circumstances.
This raises a moral question. Who is responsible—the engineer who was careless or the owners who failed to put the positive train control in place? And what might this portend for the future? There are now devices that prevent a drunk from getting behind the wheel. If the driver fails a built-in Breathalyzer test, the ignition locks and the car won't start. But let's suppose a drunk runs over two people and kills them. Who is responsible—the drunk or the car manufacturer who failed to put the controls on the car? Or would that be the drunk for failing to purchase the controls?
Let's suppose that in about 2050 someone robs a store and shoots the owner dead. And let's say that he puts it all down to impulse, but adds, "I couldn't afford to purchase a positive morality control implant for my brain, you see. So I'm not responsible." Poverty has been blamed for immorality before. Will poverty eventually be blamed because someone couldn't afford the ethical version of "positive train control"?
American Millennials: The New Un-Africans
The Pew Research Center has once again reported a consistent trend: Young people between 18 and 30 are becoming less and less affiliated with religious denominations. Among aging Baby Boomers only 1 in 6 is not affiliated. Among Millennials that number is 1 in 3, and rising. How do we explain this?
The most common answer is that Millennials tend to be progressive in their outlook, while evangelical Protestant churches and traditional Catholic churches are not. Millennials tend to be in favor of same sex marriage and abortion; conservative churches are not, and have politicized these issues. Millennials are turned off.
Millennial secularism parallels rising secularism in Western Europe. Even the United States, where belief in God is much more likely than in England or France, is becoming less religious. But that is not what’s going on nearly everywhere else. In non-Western countries religious fervor is boiling. As a result, the growth rate of Islam now exceeds the growth rate of the world’s population as a whole. Evangelical Christianity is also on the rise. Both sub-Saharan Africa and South Asia are now cauldrons of religious enthusiasm.
But why? Here are some possible answers, and each offers an instructive contrast with the religious trends of American Millennials.
*Africa and Asia are economically developing areas. Countries "on the make" in the world's economy tend to be more religious because faith sustains those working hard to get ahead. Look, for instance, at how the Puritan emphasis on hard work and thrift helped sustain Americans during our own economic rise in the 19th and early 20th centuries.
*Africa and Asia, unlike the secular West, have rising birth rates. They don’t just convert; they also reproduce. American Millennials, on the other hand, are strong believers in contraceptives and the right to an abortion. Is there a connection between those convictions and their indifference to religion? Probably there is.
*It is much harder to break away from Islam, the fastest growing religion. Those who do may be stoned to death. In the United States, on the other hand, church shopping is common even among the faithful—a sort of religious consumerism.
*Evangelical Christianity and fundamentalist Islam promise otherworldly experiences, such as “being born again” and “jihad.” In countries less grounded in rational science and technology, these promises have much greater appeal.
We might then conclude that Millennials are quite the opposite of young Africans and Asians in their attitudes towards religion because Millennials are progressive, rational, prosperous, consumerist, and “infertile” (in the sense that they do not wish to have a lot of children). They are un-African and un-Asian. There’s nothing like a little global perspective to help us understand who we are and what we believe.
Otherwise, we’ll become so “normal” to ourselves that we’ll barely realize what we’re really all about.
Why Twitter Is Great In The Classroom
Any number of professors would argue that Twitter is the bane of the classroom, for tweeting is yet another thing students do, on their smart phones or laptops, when they should be listening to the lecture. But there is still some truth to the adage that joining them is better than failing to beat them.
Tweeting is here to stay, and one sign of its omnipresence is its capacity to occur at almost any time in almost any place where one is linked to the digital ether. As much as we might like to banish it from classrooms and churches and temples, places of reverence for learning or faith, we cannot do so. But there is good news: Tweeting is a great classroom exercise.
Tweeting depends on economy, and economy is an intellectual virtue. Dr. Watson wrote of Sherlock Holmes, "He has never been known to write where a telegram would serve." Holmes, no mean mind, told Watson that sending a telegram was a good thing because one is forced to reduce matters to their essentials. College students have trouble with essentials. Anyone who has read a lot of student papers knows that their great vice is padding. Students will write a great deal of polysyllabic language in order to complete the page length requirement, or to avoid saying anything so clearly that the professor might disagree with it, or to hide the fact that they don't really know what they are talking about.
Most student essays are fifteen paragraphs in search of a thesis.
One of the hardest tasks of the educator is getting a student to put her main, governing point into a simple sentence or two, and then letting that point guide the rest of the paper. This is where Twitter comes to the rescue. Tell students that they must tweet the essence of their argument to classmates—and to the teacher–in 140 characters or fewer.
Dr. Samuel Johnson once said, archly, that hanging concentrates the mind. Well, so does Twitter. Of course it is possible to be merciful. Tell students that if they can’t reduce their thesis to 140 characters, then you will allow them 280, or even 420—but no more than that. Force them to be concise. Force them to focus. A semi-tweetable proposition, once it is discovered, can unify a student essay. It can regulate it. It can prevent padding. Oh, and make sure that each student Tweeting A Thesis also knows this: the main idea should be clear enough that even a fellow student, or tweetee, can understand it.
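A minimal sketch of that length check in Python may make the exercise concrete; the function name and the two sample sentences below are made up for illustration, not taken from any real assignment:

```python
def check_thesis(thesis: str, limit: int = 140) -> str:
    """Report whether a draft thesis fits within the character limit."""
    length = len(thesis)
    if length <= limit:
        return f"OK: {length} characters (limit {limit})."
    return f"Too long: {length} characters; cut {length - limit} to fit {limit}."

# A padded draft versus a concise one (both sentences are invented examples).
padded = ("In this paper I will attempt to explore and examine the various "
          "ways in which, broadly speaking, social media may or may not "
          "affect today's college students and their study habits.")
concise = "Twitter's character limit trains students to state a thesis plainly."

print(check_thesis(padded))
print(check_thesis(concise))
```

If you decide to be merciful, the limit can simply be raised to 280 or 420, as suggested above; the discipline of counting remains the same.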
Twitter: It’s not just a distraction any more!
The Philosophy of Mass Murderers
When we think of mass murderers—such as James Holmes, who killed twelve people in a Colorado movie theater—we think of psychology, not philosophy. What were James Holmes's motives? Well, he had just flunked out of his graduate program and had been dropped by his girlfriend. He was angry, depressed, and vengeful. Or maybe he was just delusional—out of his mind—as his defense lawyers would have a jury in Colorado think.
That's psychology, not philosophy. But mass killers have their philosophical side, too, and it rests in a distinction Ludwig Wittgenstein made between "reason" and "cause." If you have a hankering for Dove Bars and go to the store in search of one, then the reason you did so is that, well, you have this yen for a succulent Dove Bar. But the cause of your doing so is a complex blend of neurological signals, which include the signals that get you up and walking to the car, turning on the key, and driving to the supermarket.
Holmes will be found either guilty or insane and will go to prison or an asylum. If the jury buys the idea that he did his deed for a reason—fury at being dropped from his graduate program—they will also conclude that his reason is hardly a justification for killing twelve people, and they will find him guilty. If they find him insane, then they will say he did what he did due to a cause—a neurological mechanism—over which he had no control.
It's unlikely that the jury will find him insane. Machines and butterflies do things due to causes; humans have their reasons. And Holmes is a human being who took the lives of other human beings. Someone needs to pay for that. It's almost certain that the jury, in the end, will opt for reasons over causes—and they will have made a philosophical decision as well as a legal one.
The more we know about human behavior, the more "caused" we might conclude it to be. But that time has not yet fully come. Now if you'll excuse me, the reptilian part of my brain is causing synapses to create movement towards some roasted Theobroma cacao seeds—in other words, I want a Dove Bar.
Students Don’t Revere Professors Any More—and It’s No Wonder!
In this Sunday’s New York Times (May 10, 2015) Professor Mark Bauerlein laments the declining role of college professors. You can read his thoughts at the address below. In his view, once upon a time (1960) professors gave only fifteen percent A’s; were sought out by students eager for their wisdom; and were enforcers of high standards who rejected the lousy work of their charges. Nowadays, says Professor Bauerlein, professors give 43% of their students A’s and are rarely engaged in face time with their students. Professors are no longer mentors; they have become accreditors.
It’s quite unlikely that the changes noted here have occurred in a vacuum. Just as nature abhors a vacuum, changes in student attitudes abhor happening in the absence of a larger context:
*Students do not visit professors in person as often as they once did because they can email them—so far, tweeting and texting professors seems forbidden, while email has become the “new formal.”
*Students do not revere their professors as role models as much as they used to because all young people today are much more exposed to peer influences. These are beyond the control of authority figures: Millennial-friendly websites and media, password protected Facebook pages, and private smartphone accounts.
*College has grown more and more expensive, and yet it is more and more essential for successful participation in a globalized and automated economy. Is it any wonder that students are thinking like consumers? Seeing your professor as a fount of wisdom just might be the privilege of those who are much more assured about their futures.
*The 1960s marked a revolution in a great many things, including approaches to teaching. The old top-down methods gave way to more friendly modes of instruction. Just as peer influences became more important among students, so did profs begin to cast themselves as their students’ peers and co-learners. The old stentorian lectern was replaced by “let’s get the chairs in a circle and discuss.”
When I was a college student, back in the antediluvian times that Professor Bauerlein loves, a single comma splice got me a failing grade on a student essay—no second chances. I might have enjoyed talking to my writing professor about the meaning of life, but he would still give me an F for my grammatical evil. Today he or she would be more likely to give me extra help—and second chances—on my punctuation malfeasance until I got…yes, an A! Chalk up one point for grade inflation.
I have mastered the art of the semicolon, thank you; I also prefer a world where students have a chance to do so before they flunk out due to a zeal for “high standards.” Yes, times have changed. Professors have lost a certain standing and a certain role. But it would be silly to conclude that their loss means that electronic mail, consumer savvy about education, and second chances about semicolons are all bad. Times change, but it’s not always clear that they have changed for the worse, or for that matter, the better. Meanwhile, let us not go gaga over the good old days.
http://www.nytimes.com/2015/05/10/opinion/sunday/whats-the-point-of-a-professor.html?ref=opinion&_r=1
Is Empathy Bad for Desperate Kids?
I have previously discussed the link between seeing what’s in front of you and doing something about it. Thus: knowing gay Methodists makes it more likely that you will support gay marriage; and seeing the dire effects of global warming will make it more likely that you try to do something to stop it.
But in a new book called The Most Good You Can Do the philosopher Peter Singer argues that this method of taking ethical action is flawed. He cites studies in which donors exposed to a photo of one desperately ill child are more likely to give than if exposed to a photo of eight such children. For Singer this sort of direct, eye-to-victim empathy is a poor guide to ethical activity. He prefers that we use reason instead.
For instance, say you had a million dollars and could either build a new wing in an art museum or help blind children regain partial eyesight. Which should you do? The answer, Singer says, is to do some calculations. Don't trust your feelings; trust your math. And if you do, you might find that a gift to the museum will enable one hundred thousand more museumgoers to see Monet and Renoir, while the same money would provide at least partial sight to one thousand children. Thus your money is better spent on the children. For Singer this is a rational calculation.
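To see what the "math" amounts to, here is a back-of-the-envelope sketch in Python; the dollar figure and the two beneficiary counts come from the paragraph above, and the per-person costs simply fall out of the division, so they are illustrative only:

```python
# Back-of-the-envelope version of Singer's comparison, using the essay's figures.
donation = 1_000_000          # dollars available to give away

museumgoers_helped = 100_000  # extra visitors who would get to see Monet and Renoir
children_helped = 1_000       # blind children who would regain partial sight

cost_per_museum_visit = donation / museumgoers_helped  # works out to $10
cost_per_child = donation / children_helped            # works out to $1,000

print(f"${cost_per_museum_visit:,.0f} per museum visit enabled")
print(f"${cost_per_child:,.0f} per child given partial sight")

# The division alone settles nothing: you still have to decide how much one
# museum visit is worth relative to one child's sight, which is the essay's
# objection to calling this a purely "rational" calculation.
```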
But it is no such thing. One is still stuck with the subjective question of whether art lovers seeing Monet is better than children seeing at all. And we can easily imagine a world where the donor can see, every day, all those museumgoers marveling at modern art—but never does see blind children in Africa because he has never been there. Eyeball empathy may be misleading, but it’s nearly all we’ve got when it comes to taking ethical action.
Just for the record, I am colorblind and would give my million dollars to the children.
Starbucks, Laptops, and the Great Recession
Two very different stories have emerged about Millennials in the workplace. But for all their divergences, the two stories are also alike.
The first one concerns a new Starbucks program that gives its workers deeply discounted tuition for on-line courses. The second involves indications that Millennials want more “flex time” in the workplace: longer maternity leaves and more chances to work from home.
The first story suggests that Starbucks “gets it” about the larger life needs of its workers. The second story suggests that Baby Boomer bosses, used to the idea that people report to work in their cubicles, don’t get it. But scratch the surface and you find that these stories are remarkably similar.
They are alike in two ways. First, the Great Recession prompts both stories. Parents are less able to help their children with college tuition. The kids incur greater debt and have to drop out of college. Starbucks is trying to help them out. The Great Recession is also behind the second story, as it has caused fewer workers to be hired and longer hours demanded of them. Millennials who want some flexible time off are finding it hard to get.
Second, both stories illustrate the digital revolution. The Starbucks program is possible only because student baristas don't have to attend physical classes. They can study and write papers after work, or even during work if there are lulls, on their laptops. The Millennial desire for flexibility occurs because workers can indeed do their jobs without "being there." One Millennial told his boss he would continue with the company only if he could do all his work from his laptop anywhere. The boss surprisingly agreed. He left the company anyhow.
A Great Recession and a Great (Digital) Revolution: How many other important trends do they lie behind?
The Vowel Movements of American Presidents
I speak of vowels, not bowels. Little is known of the bowel movements of American presidents other than the claim that John F. Kennedy, who suffered from many ailments, was also sometimes miserable with irritable bowel syndrome. Vowels are another matter. Recently Carly Fiorina, formerly of Hewlett-Packard, announced that she would be running for president. Her last name ends in a pronounced vowel. If she is elected, she will be only the second American president to have that distinction. The other is the current holder of the office, President Obama, whose last name comes from his Kenyan father.
A quick survey of American presidents will reveal that until Mr. Obama not one had a last name ending in a pronounced vowel (the silent "e" of a Monroe, a Pierce, or a Fillmore doesn't count), with the rather spurious exceptions of William McKinley and John F. Kennedy, for "y" is technically a vowel, and it is pronounced. And we do recall our schoolhouse lesson that the vowels are "a, e, i, o, u, and sometimes y and w." But we can hardly substitute an "i" for McKinley's "y." "McKinlei" would be an odd last name indeed, combining as it does a Scotch-Irish "Mc" with some indefinite remainder. Would "Kinlei" be Czech? Slovenian?
In fact, William McKinley is of Scotch-Irish or Ulster descent, as are one-third of our presidents. And what is the consonant most common to the ending of presidents' last names? It's "n," of course—Wilson, Jackson, Johnson, Clinton, Jefferson—and these are Scotch-Irish last names. It is hardly astounding to learn that so many of our presidents have Ulster heritage, given our nation's history of linkage with the British Isles. But what is astonishing is that all our presidents—save one (Martin Van Buren)—are related to King John of England, who in 1215 signed, under duress, the Magna Charta granting more powers to nobles and fewer powers to himself. And yes, President Obama is also descended from King John on his mother's side.
I have no idea whether Carly Fiorina is related to King John. Her last name, with its radically pronounceable vowel, comes from her husband Frank, an AT&T executive. She herself was born with the last name “Sneed” (of English origin). Not until an Andrew Cuomo is elected president will we likely break the King John cycle. In the likes of “Cuomo” is a pronounceable vowel at the end and a clearly documented Italian origin. No King Johns! Ted Cruz, though “z” is not a vowel, is of Hispanic origin. But his mother was a Wilson—back to King John, I fear!
With regard to King John, we don’t seem to be getting anywhere!
Why Shamelessness Is Good For American Politics
Recently Steve Benen on MSNBC opined that Karl Rove, President Bush 43’s leading political consultant, is either confused or shameless. Why? Well, because Rove in the Wall Street Journal argues that President Obama has done all the bad things that…in reality President Bush did. Rove says Obama has founded his presidency on a permanent campaign. But Benen says Bush under Rove did the same. Rove says Obama is leaving terrible violence in the Middle East to his successor, but Benen says Bush left terrible Middle Eastern violence to…Obama.
There’s no need to go on. You get the idea. It’s unlikely that Rove is bewildered. He’s just shameless. And it’s a good thing, too.
Rove is a political operative. It’s his job to spin arguments that ignore inconvenient truths. He’s a rabble-rouser, not a philosopher. Political parties aren’t big on intellectual honesty. They seek power, not truth. (Once in power you just hope that they take a more complex view of affairs.) Years ago a former student told me about going to a Christian summer camp that decided to err on the skeptical side, so around the campfire they sang, “Possibly, Probably, He Rose From the Grave.” But this is highly unusual. The resurrection of Christ is non-negotiable for passionately believing Christians. And “Obama is bad” is non-negotiable for passionate Republicans, especially going into an election.
These shame-free zones in both parties are necessary. Guilt-free hacks are essential. If you are an independent-minded voter who believes in a vigorously contested two-party system of elections, you have to put up with it. If you like football, then you must accept that Chicago Bears fans, before big games, are not going to yell, “Go Bears! On Some Days We Are Slightly Better than Tampa Bay!” And politics isn’t wiffle ball. It’s much more like football.
Three Trends That Will Better Your Life (Maybe)
This is called the Information Age, but it isn't just that so much information is readily accessible. It's also a matter of what information can be delivered, how it can be used, and how it can bypass middlemen. Here are three upcoming trends:
*Calling Your Stove. Within the next twenty years you'll be able to use your smart phone to turn on your lights, water your lawn, unlock your door, and turn on your stove—all before you arrive at the house. If you have a casserole waiting, just phone your stove forty-five minutes before you're ready to eat it, tell the stove what temperature to reach and for how long, and enjoy your meal when you arrive (a rough sketch of what such a command might look like follows this list). Upside: It's convenient; you don't have to wait until you get home to cook your meal. Downside: You could burn up your house if something goes wrong. There's something to be said for "being there." If you are, then you can use your smartphone to call your fire extinguisher!
*Emailing Your Money. Ever tried to get money to a son or daughter in, say, Timbuktu or Malaysia? You go to the bank, explain your problem, fill out forms in quadruplicate, pay a big fee, and hope the money goes through. That's one way. There are others, but they all involve having to go through a middleman (a bank, Western Union, PayPal). But with bitcoins you'll be able to email the money directly. In places like Argentina, where banks have a history of being iffy, bitcoins are the new new thing. It's virtual money. Upside: You can avoid the bank. Downside: You don't enjoy the bank's security system against hackers.
*Monitoring Your Self. New wearable technology—dresses, shirts, necklaces—can record daily what muscles you're using most when you walk, how often you use the word "like" in conversation, your heart rate during sex, and much more. You can get a daily read-out on your smartphone about "your day." It's a life of quantity, not of quality. Upside: It's a great way to keep track of muscles you may strain and words you want to eliminate in conversation because they make you seem inarticulate. Downside: It's not just a life of quantity—it's also a life of narcissistic quantity. There's something to be said for not knowing—or caring—too much about yourself.
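As promised above, here is a purely hypothetical sketch of the kind of structured message a "called" stove might someday accept; the class name, fields, and numbers are invented for illustration and do not correspond to any real smart-home product or API:

```python
# Illustrative only: what "phoning your stove" might reduce to under the hood.
from dataclasses import dataclass

@dataclass
class OvenCommand:
    temperature_f: int       # target oven temperature, in Fahrenheit
    minutes: int             # how long to cook before dinner
    dish: str = "casserole"  # label the phone can echo back as a reminder

    def as_message(self) -> str:
        """Render the command as the plain-text instruction sent to the oven."""
        return (f"Preheat to {self.temperature_f}F and cook the "
                f"{self.dish} for {self.minutes} minutes.")

# Phone the stove about forty-five minutes before you plan to eat.
print(OvenCommand(temperature_f=350, minutes=45).as_message())
```

The point is only that "calling your stove" comes down to sending a small, structured instruction like this one, which is exactly why a bug in it could also burn the house down.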
Why Betting On Losers Is Good For You
In my life I’ve known lots of writers, and some of them have been, in my humble view, sorry excuses for human beings: vain, pretentious, back-stabbing, and selfish. I’ve also thought that among these odious people the quality of their art was not sufficient recompense for their flaws as human beings.
But my whole way of thinking about this phenomenon has been wrong, for in any human endeavor (including artistry) there are going to be winners and losers, and this mixture is a necessary part of the whole thing. Thus: If you think poetry or fiction is good for the human race, then you should be prepared to put up with the second-raters, even with the second-raters who are terrible people.
In other words, life is all about a series of great tasks that will include both winners and losers. Betting on losers is unavoidable, and without willingness to bet on losers there would never be: children, inventions, painting, poetry, or education.
Far too many of us try to run away from this truth. Taxpayers, for instance, don’t want to fund schools that produce “losers,” yet the whole enterprise of education is premised on the inescapable fact that schooling will reach some while failing others—or others will fail it. The rise of charter schools and voucher programs is an attempt to congregate all the winners in one place, but that just compounds how much the remaining losers, bereft of funding support, are going to lose. And it is possible, yes, for some losers to lose more severely and dangerously than other losers do.
That is the greatest truth of all: that most of us are not pure winners or losers but blends of the two. So in public schools it is possible for a kid to "lose relatively well"—that is, get enough education to become part of the working poor. This is something that goes unreported in the media: all those half-victories that keep things from becoming worse. We tend to hear only about the great winners and the great losers in life: the winners of the lottery and the Nobel Prizes or the horrific people who shoot people in theaters or the Congressmen who get caught stealing or sexting.
So: bet on losers, for they won’t all be simply losers, and if you aren’t willing to bet on losers, you probably aren’t willing to live life much at all. And that would be foolish, would it not?
A Toad’s More Handsome Than A Snake
Years ago I was falsely accused. It was in grammar school when one of my classmates said that I had shoved Bobby Burney to the ground, a report that Bobby himself supported. Eventually this prevaricating classmate, whose name was Wayne, admitted that he'd lied. He'd seen me do no such thing. At this point the authorities assumed I was innocent of all charges of shoving Bobby Burney to the ground. They didn't even check with Bobby.
And I was innocent. But the authorities weren’t being logical. Just because Wayne had lied didn’t mean that I was innocent. Yet it’s well known that when a false witness fesses up to having lied, the accused is then cleared. All the sympathy goes to the accused, and all the hatred goes to the false witness. But just because the accused was lied about doesn’t really make the accused innocent.
Here is a great human lesson. It’s summarized thusly: Even a toad’s handsome next to a snake. So it’s very important in life to make your adversary seem to be the snake, for then everyone will side with you, the toad.
Two nights before the shootings of four students at Kent State University in 1970, protesters, presumably student radicals, had burned down the ROTC building. Most Americans supported the National Guardsmen who shot the students because it was the students who were the snakes—after all, they had committed arson—while the Guardsmen were the (relatively handsome) toads. Yet during the early days of the Civil Rights Movement, Martin Luther King, Jr., had insisted on non-violent protest in order to make the racist Southern sheriffs seem to be the snakes. And he succeeded.
During the late 1960s the American public was hardly in love with the stiff, disingenuous President Nixon. But at least Nixon was trying not to lose the Vietnam War, while the longhaired anti-war protestors seemed not to care. Nixon was a toad, and later he was even proved to be a snake. But then he seemed to be the toad, while the protestors seemed to be the snakes. So the general public supported Nixon.
You can supply your own examples. But remember: in life, you don’t need to be handsome or beautiful. You just need to look better than the snake with which you are being compared. So make sure everyone knows: It’s the other person who’s the snake!
Picasso and Seinfeld
It has been nearly eighty years since Pablo Picasso first unveiled his movie-screen-sized mural Guernica in Paris. It ranks among the most famous paintings of all time, perhaps rivaled only by the Mona Lisa or American Gothic. Reams have been written about its origins, beginning with the fact that the Spanish town of Guernica was the first town in history to be utterly incinerated by aerial firebombs. Guernica (the mural) is part outraged protest and part public propaganda.
But it is mostly modern art.
Picasso had struggled with the need to make art both “modern” and “public.” He didn’t think the two went together. His idea of a good time included vases that looked like supple women, and bulls that looked rather like agonizingly defeated men, and tongues that looked like knives (based partly on the tongue of his hated, estranged wife). Picasso had this private mythology that he translated into the abstract: figures both animate and inanimate at the same time.
As he began to struggle with how to complete a huge mural for the Spanish pavilion in Paris, he thought he would start with a hand holding up the hammer and sickle, the symbol of the Soviet Union, which supported Picasso’s side in the Spanish Civil War. He soon spurned this idea. The hammer and sickle was too trite, too public, too recognizable, and too predictable. Somehow Picasso had to make his art abstract, modernist, and personal—yet also a great public monument.
I think he succeeded, but you might not agree, so Google away if you haven’t seen Guernica. It is justly famous, but if you go into the nearest Wal-Mart and ask the patrons which one they’ve heard of—Guernica or Seinfeld—you know the answer will be Seinfeld.
If Guernica is great modern art, Seinfeld is great popular entertainment. Right? Not so fast.
Future generations may find in Seinfeld the stuff of great comic art. They may find the scene in which George refuses to allow the Boy in the Bubble—that desperately sick, isolated lad—to win at Trivial Pursuit because of a misprint on the answer card to be a classic piece of comedy. Isn’t it about the fine line between laughter and cruelty? Isn’t that a profound human insight?
We sometimes forget that Shakespeare was a great commercial artist. He was also a great artist. It’s time to get over the idea that art must be “difficult” in order to be great. I have read poems by fifth-rate modernist poets that were difficult all right—impossible to understand—but hardly great or even good.
Picasso was a great artist. So was Shakespeare.
There may yet be room for Larry David.
And The Award Goes To…Air Conditioning!
It’s impossible to say which is the most important technology of the twentieth century. Is it radio with the advent of instant, live communications; or TV with the advent of image over print; or the computer, with the advent of immediate, unlimited information? All three of these would have their supporters, though radio is now such an old technology that we forget how revolutionary it was when Franklin Roosevelt could go over the heads of Congress and talk directly to Americans “by the fireside.”
A less often nominated candidate for “most important technology” might be air conditioning. As a Texan of a certain age I am well aware of this revolution. When I was a kid, pre-A.C., my family had what was called an “attic fan” in order to beat the heat. The idea was to turn it on once the sun began to go down. It would suck all the hot air out and allow the cooler air to come in. You can still use this method: Just put your box fan in the window and turn it towards the out-of-doors at around 8 PM. You’ll be surprised at how quickly the heat abandons the house. Put the fan on high and make sure it fits the window as snugly as possible.
But this isn’t nearly as good as air conditioning.
Not until the late 1950s did A.C. come into the mass market, first in the form of room units. My aunt in Houston had a single unit, in the kitchen. When we visited, people would talk in the kitchen until midnight rather than face the smothering remainder of her big house. With air conditioning came the modern Sun Belt. Before A.C. it was much harder to have factories and offices in the American South. After A.C. the nation rushed below the Mason-Dixon line. Taxes were lower. Labor unions were weaker. The modern conservative movement in the Republican Party would have been impossible without A.C., for it was based in the emerging power of the cowboy-minded Sun Belt. From 1976 on, nearly every American president came from there—Barack Obama was the first northerner to be elected president since John F. Kennedy.
When I was a kid, we had no A.C. I would go out each day in bare feet to explore my little Texas town. The bottoms of my feet were calloused. It didn’t feel all that hot—I didn’t know any better. But my subjective experiences aside, it was hot. We sat in the backyard most summer nights, talking and waiting for it to cool off inside. It was too hot for the South to become a major economic and political engine. A.C. changed all that, and thus gets, for today, my award for The Most Important Technology of the American 20th Century.
Gay Marriage and Underwater Manhattan
Twenty or so years ago two brothers here in Janesville, Wisconsin, heard that Beloit, Wisconsin (15 miles south) had a college. They laughed. Beloit has a college? Longtime Janesville residents, they had grown up looking down upon their sister city. Beloit couldn’t possibly have a college, and if it did, it couldn’t be a good one. Everyone in Janesville thought so. In the event, both these brothers visited that college in Beloit and enrolled (I taught them). They acquitted themselves well at one of the top liberal arts colleges in the land (if you can believe U.S. News and other sources—and you can).
The brothers’ original attitudes illustrate the power of mindset. Our minds are set when we believe something that everyone around us says must be true. Woe be unto the persons who challenge mindsets, such as Galileo, who said everyone was wrong about the centrality of the Earth, and Martin Luther King, Jr., who said everyone in the South was wrong to believe segregation was forever. People like Galileo and King get into trouble. History also celebrates them as great.
One of the most changeable mindsets in recent years has involved gay marriage. In a notably short period of time, Americans have gone from disapproving of it to approving of it. What’s changed? Here’s a clue. In 1983 only twenty-four percent of Americans said they knew someone gay. Today seventy-five percent of Americans say they do. As gays have come out, more and more heterosexual Americans are meeting them. A Methodist minister who had risked his career to perform same-sex marriages told me that what most changed the minds of his congregation was actually meeting gay persons and discovering that they are not weird or corrupt or deadbeats or subversives. They are just, well, Methodists.
So the mindset about same-sex marriage has changed. What about climate change? The current majority mindset about climate change in the United States seems to be this: Climate change is caused by human activity, and it might well be a long-term threat. It is not, however, at the top of my political agenda and cannot compete with my personal economic concerns. So what would change this mindset into something like this: Climate change is the number one issue facing us today, far more important than my own dollars and cents, and government and business must unite to combat it?
The answer comes from what happened with same-sex marriage. People met gays and decided they were all right. We are wired to respond to what we actually see in front of us. This is why Groucho Marx was so funny when he asked, “Who are you going to believe, me or your lying eyes?” We think our eyes don’t lie. Thus: if and when our eyes see eighty percent of the nation’s corn crop burning up and see Manhattan under twenty feet of water, then we’ll change our mindsets about climate change.
It’s never too late to approve of gay marriage, based on what our eyes tell us. But it might someday be too late to trust our eyes about climate change. By the time we see that the change is profoundly real, we might be re-naming New York City Venice and hoping for the best, glub-glub, as we swim over to Bloomingdale’s for one of those fashionable new snorkels.
Sunday Schools and Railroad Trains
The New York Times columnist David Brooks has just published a new book called The Road to Character (Random House), in which he says we should downplay self-promotion and focus instead on inner character. One of his more pithy thoughts is that when we are resume building we are attending to our strengths; when we are character building we are attending to our weaknesses.
There seems little doubt that ours is a self-promoting age. Brooks points out that none of President Eisenhower’s cabinet members wrote memoirs, while about 10 of President Reagan’s did. This seems to be an epoch of self-important “listen to me” or “look at me.” It’s a time of Kim Kardashian and video resumes. I’ve already confessed, myself, to being a Facebook narcissist. Please “friend” me so you can “like” my selfies!
One reason for this self-promotion is sheer competition. It’s much harder to get a hearing these days. There is so much information, and there are so many memes, that it’s difficult to get yours noticed. The economic counterpart is rivalry for high-paying jobs. So how can I make sure my vita stands out? This is especially intense for young people entering a still-pinched labor force.
But if I’m a Facebook narcissist I’m also an expert on character building, for one simple reason: I am a graduate of Baptist Sunday School in central Texas. There we were taught to obey our parents, respect our elders, pray for others, and be aware of ourselves as sinners. But this character formation didn’t occur in a vacuum. It was the 1950s, a time of fantastic American power and prosperity. In my town was the rich Missouri Pacific Railroad. Kids in Sunday School didn’t have to worry about self-promotion. They could focus on Bible studies, partly because there was an automatic brakeman or fireman job waiting for them.
David Brooks is right about the need for character. Many years ago a student’s parent told me he didn’t care whether his daughter got a job after graduating from college. “Nobody in America starves. The important thing is for Annie to develop her character through learning.” But that too was a far more assured time. I wonder if he’d say that today. George Bernard Shaw once said that you cannot talk to a man about God when his stomach is growling. We live in an age of viral self-promotion. We also live in an age of queasy tummies.
Hell Is (Still) Other People
Globalization, defined as the speedy movement of goods, services, and people, is one of the salient facts of our time. Even before the Internet, transportation advances had made the travel of people much easier. The great immigrations to the United States in the 1800s were not enabled by airplanes but by steamships.
Still, the true dominance of globalization had to wait for trade treaties and cyberspace. The former meant that owners of companies were free to purchase cheap labor elsewhere to build their products. The latter meant that tonight you and I might see a sleeper Polish movie on Netflix. It means we can Skype almost for free with someone in Albania or Tibet.
Globalization has also meant that we spend our lives trying to take advantage of what we like about it and avoid what we don’t. It’s a little like death: Without the penalty of death we human beings would never have evolved adaptations to become rulers of the earth, but death is also something we’d just as soon have happen to somebody else for a while. We benefit from a system that we also try to avoid.
Like it or not, globalization doesn’t just mean the fast transport of information and movies and websites and messages. It also means the movement of people. In Europe and North America, where folks love the Internet, they don’t always love mass immigration. Yet it keeps on coming, much like information on the World Wide Web, as seen in the poor Libyans who drowned this week off the Italian coast.
Much is made of virtual reality versus the “real” thing. And there’s a lot to that. Globalization when it involves “products” (like videos and data) is virtual and welcome. When it involves people, not so much. Jean-Paul Sartre never lived to see laptops and smart phones. But he famously knew (and wrote) that “Hell is other people.” He knew something about human nature, which so far the World Wide Web has not canceled.
Lady Gaga’s Narrow-Casted Poker Face
One of the most popular songs of all time is Lady Gaga’s “Poker Face.” Yet if you hear the word “gaga” without thinking about her Ladyship and her best-selling song, then you’re probably of a certain age (by no means under 30). For those who don’t know, the song’s scene involves a man having oral sex with a woman, but he doesn’t know that she would rather be having this sex with another woman. Hence, the man cannot tell by her “poker face” what her real desires are. The play on “poker face” also alludes to strip poker and to the vulva. But you should do your own research about the song.
This is one of the most downloaded songs of all time. That does not mean, however, that a majority of Americans knows about it. And that’s my point.
Even very popular songs and shows nowadays are not popular the way The Ed Sullivan Show or Perry Mason or I Love Lucy used to be. Entertainment can be quite popular today in a narrow-casted way: Shows are popular with a particular slice of the consumer population but not with everyone else. The shows are a great success. But everyone’s no longer watching them on Monday nights on CBS.
In fact, there are very few I Love Lucys around these days. The Super Bowl comes to mind, but most entertainment now is for a niche audience. On Saturday nights one niche will watch Mike Huckabee on Fox News; another will watch Girls on HBO. Both shows will make money. Totally different cohorts will be viewing them.
And here’s another thing: For many older Americans Lady Gaga’s ribald song would be a subject of disapproval, just as the sexual revolution in the 60s was. But now, nearly half a century after the sexual revolution, such erotic provocation isn’t even known to much of the population. The kids today have their sexual revolution online. The revolution is also private. Today’s Millennials have their own channels, their own downloads, their own smart phones. Whatever they’re up to, the moral majority might not even know about it. Fifty years ago John Q. Middle Class knew all about promiscuity on campus and was furious about it. Nowadays John Q. Middle Class, Jr. may not even know who Lady Gaga is.
Confessions of a Facebook Narcissist
One of the most misleading descriptions is that of Facebook as a social network. It is a social network—an electronic place where friends meet and greet and exchange news and views. But that’s only half true. It’s also a high-tech version of what we saw before Facebook and even before the Internet.
I speak of the t-shirt and the vanity plate. Back in the 70s came the personalized t-shirts with such memorable slogans (memorable at least to me) as “I Turned My Parents In For Doing Drugs, and All I Got Was This Lousy T-Shirt” and “I Survived the Blizzard of ’79.” One of my students had a memorable one, too: “Stewart Pumps Suck.” And then there were the vanity license plates. I heard of one person who wanted to have a vulgar plate, but the state wouldn’t let him. He had to settle for “ANI” (the plural of anus). What an individualist!
The plates and the shirts were a form of self-advertising. Before the 70s advertising was largely something professionals did on behalf of products and companies. But then Norman Mailer wrote this book called Advertisements for Myself. Mailer was always a good trend-spotter, and he could see that more and more regular people wanted to share their self-love with others. But all they had, unless they were celebrities or entertainers, were t-shirts and vanity plates.
Facebook changed all that. Your Facebook page is really one big self-advertisement. Using Facebook’s template, it’s a snazzy page, too. Many magazine and newspaper adverts of an earlier era could not compete, in layout, contrast, and sheen, with your very own FB page. You are better sold than a box of cornflakes or detergent.
I must confess to spending a lot of time on my own FB page, not just posting on it but also fiddling with it, especially the profile pic. My selfies have run from my wearing a sinister maroon hoodie to my donning blue everything and smiling in ecstasy (“Rhapsody in Blue”). My FB page is a social network page in the sense of “But enough about me. Let’s talk about you. What do you think about me?” I’m trying to say that it isn’t much of a social networking page at all.
But in that regard I’m hardly alone.
Yes, Hillary Staying With Bill Was a Smart Feminist Move
A birthright of college Millennials is feminism. This generation has grown up with women professors, deans, and presidents as normal. The percentage of women attending college has outpaced the percentage of men. Major concern about sexual assaults on campus women is now a given. There have been women combat soldiers and astronauts. In just the last twenty years three Secretaries of State have been women. Young, unmarried, college-educated women have become a significant voting bloc.
But if feminism, a strong belief in gender equality, is flourishing, there is still disagreement about what it means in different situations, and how its goals are to be attained. Nothing better illustrates this perplexity than Hillary Clinton, often criticized as un-feminist or even anti-feminist for not leaving her philandering husband in the 70s, 80s, or 90s. Isn’t feminism about self-respecting, well-educated women not having to put up with that sort of thing any longer?
It is. But had Hillary Clinton divorced her husband for adultery, he would never have been elected president. She would not have been First Lady, the most active, policy-fluent one in our history. She would never have had a powerful platform from which to run for the Senate, which she won, and the presidency, which she lost. She would never have become Secretary of State, and she would not be by far the most likely woman to become president. If having a woman elected president is also a feminist goal, then Hillary’s staying with Bill was a profoundly important and potent feminist move.
Without Bill Clinton, even if she had never met him, Hillary Clinton would have done very well in life. She is very able. But it is unlikely that she would be running as the front-runner for president of the U.S.
Here it is useful to recall the insights of the great Italian political philosopher Machiavelli, who wrote that private and public morals exist on different planets. Keeping one’s word is a most ethical thing to do in one’s married life. It may be a disaster to do so if one is a politician, most especially when other politicians have no intention of keeping theirs. Leaving a philandering husband is a deeply feminist thing to do in private life. But the Clintons are public people with public goals. They have public ambitions. It is when we confuse our politicians (of either party) with soap opera characters that we fail to see what game they are really playing, indeed, what game they really have to play.
Babes, Beasts, and the Logic of History
Yesterday I read of an Armenian assassin who, in 1921, murdered a high Turkish official who had directed the genocide against his people.
The year 1921 came up again when I was searching findagrave.com and discovered that one of my childhood neighbors, Corrine Wayland, had been born that same year. Her husband Jack was two years older. They were married in 1939. Within a couple of years Jack was in the U.S. Army; World War II was on. Jack survived, but Corrine had to worry about him every single day.
Hitler, it seems, had read of the Armenian assassination in 1921. He noted that while the killer was acquitted, nothing much came of the Armenian protest against the Turks. The world paid no heed. Hitler filed this information away and decided that any genocide of his own making would be ignored, too. He could get away with it.
Thus in 1921, unknown to baby Corrine, there was this beast, Hitler, reading about genocides and assassinations and starting to make his own plans. And that’s the logic of history. Babes are born every day, and they have no knowledge of the beasts out there, formulating awful ideas that will affect these babes when they mature, if not sooner.
Corrine Wayland would grow up to marry a man who had to fight Hitler. The birth of a newborn is cause for celebration, except that there are beasts out there, of whom the child knows nothing, who may cause grief and loss in the years to come.
The logic of history is as cruel as it is redundant. The poet Yeats put it best when he asked what “rough beast…slouches towards Bethlehem to be born.” What beasts await the kids born in 2015? We, and they, will find out, probably too late.
Mad Men and the Seven Deadly Sins
When I was studying Renaissance literature one of my professors showed me an easy way to remember the Seven Deadly Sins—with the acronym PAW’S LEG. Thus: Pride, Avarice, Wrath, Sloth, Lechery, Envy, and Gluttony. These were deadly sins because, if not repented of, they would send you to the burning inferno forever.
The Seven Deadly Sins organize the plot of Mad Men. But nobody pays any attention to them. They are in plain sight but never noticed. Instead, everyone concentrates on other vices, those of the 1960s: racism, sexism, and homophobia. Bert Cooper won’t allow a black woman to occupy the front desk of the advertising agency. A character asks Joan Holloway what she does around the office other than dress like she wants to be raped (he is soon fired, however). Sal, on the creative side of the agency, is fired because he, a secretly gay man, won’t sleep with a Lucky Strike executive who can, says Don Draper, “turn out the lights around here.” The women in the series have few choices: stay at home and tend the kids while suffering over hubby’s adultery (Betty Draper); endure constant slights while trying to get ahead in the office (Peggy Olson); or get a slice of the agency by sleeping with a client (Joan Holloway).
We like the idea that our society has overcome these forms of discrimination. That’s progress. But a second look will reveal that on the matter of the Seven Deadly Sins we’ve made no progress at all. Advertising itself is about promoting envy: others have that snazzy new car, so why not you? Lechery is rampant, then and now. Don Draper is cool but also quite lecherous; the sexism of his era has since been repudiated, but his lechery has not. Avarice and greed propel both the advertisers and their customers. Other than spoiled Pete Campbell, however, nobody loses his or her temper much—Wrath is probably the least pervasive of the Seven Deadly Sins on Mad Men. Sloth (laziness) isn’t in much evidence either!
This blend of social vices, which we’ve made progress in overcoming, and spiritual vices, which continue, explains Mad Men’s appeal. We like to view the present through the prism of the discriminatory past, but we also like to see what always fascinates us: the enduring human vices of others. Those free of the Seven Deadly Sins are good but also, from an artistic viewpoint, boring. Don and Pete and Joan and Roger are hardly boring. Yet until we begin to combat the deadly sins with the same devotion with which we have combated social discrimination, we are hardly entitled to report much progress in the human condition. Thus we can expect, all of us, to continue being mad, both men and women.
How To Cope With Internet Shaming (In 4 Easy Steps)
Being shamed virally on the World Wide Web is now, as befits the medium, a worldwide phenomenon. I hope it never happens to you, but if it does, here’s how to manage:
Step 1: Avoid going viral. A byte of prevention is worth megabytes of cure. This means you should avoid refusing to pick up your dog’s poop on the subway or tweeting that you hope you, a white person, don’t get AIDS while you are traveling through Africa. The former was caught on a smart phone camera, while the latter got her tweets re-tweeted and “favorited” all over North America. By the time she landed in Kenya millions of people had read her little 140-character joke; half of them hoped she got AIDS as soon as possible. These things are hard to live down.
Step 2: Get your “prime trender” to write you a letter of recommendation. I just made up the term “prime trender,” but this is the person who gets your viral shaming started in the first place. A young man had tweeted something incorrigibly sexist, and his prime trender was a distinguished academic. Soon his reputation was ruined; no one would hire him—just Googling his name held him up to pervasive opprobrium. But he got the distinguished academic who’d started it all to write a recommendation letter for him. She said he was an “unpleasant young man who did not deserve, however, to become unemployable.”
Step 3: Hire a reputation management firm. You need big bucks for this, but some of them are supposedly good. They can threaten to sue the owners of websites that continue to pile on the shame, and game the search engines’ algorithms so that the shaming items start to appear further down the page when people look you up on the web.
Step 4: Trust to time—and hope. In time, internet shamers will tire of you and go on to the Next Big Shameful Person or Thing. Even so, your shame will never go away. On the World Wide Web little if anything is ever forgotten. One European sued Google to get his shaming off the web; Google fought him; the European courts sided with the complainant. But while Google removed the notice of the man’s debt, it did so only on its European servers. You can still read about him everywhere else. But I’ll not reveal his name. No shamer am I! If I were, I’d be ashamed of myself.
Whatever Happened to the Village Wise Man?
The answer is that in the USA he’s hardly had a chance to exist. The same goes for the village wise woman. John Updike, in an interview before his death, said that Americans like youth, not age. He reminded us that the great writers of the twentieth century, Fitzgerald and Hemingway, wanted to be young—the former an audacious dancer and drunk, the latter a brave young soldier of fortune and big game hunter.
Why do Americans like youth so much? One answer is that we are the most important country in the “New” World—it’s a phrase you don’t hear much any more, but those who came here left behind the old, traditional European world. As recently as World War I Americans complained about how Europeans were stuck in their ways, and that’s why they fought all those terrible wars—it was all due to ancient alliances they couldn’t get out from under. But above all, I think, America likes youth because only young people can adjust to the sort of society America has: constantly changing, technologically updated, culturally trendy. And the American quest for the new can’t be de-linked from the American drive to make money.
American traditions are at war with American capitalism.
Traditionalists want to keep the country from being overrun with illegal immigrants, but the success of American capitalism makes the country all too tempting for Mexican immigrants who want to get ahead in life. Traditionalists oppose same-sex marriage, but it was business interests that pressured Indiana to amend a recent law that seemed anti-same sex marriage. Traditionalists don’t like smut on television, but other people like to watch it, so those seeking money are going to promote it. Traditionalists wish parents had more control over their children, but the moneymakers have given our children smart phones and more privacy than ever.
You can find the village wise man in the Vatican (the Popes) or Saudi Arabia (the Sheiks). Don’t look for him over here.
The Problem With Information (Isn’t That There’s Too Much Of It)
Recently we learned that a driverless car traversing the country used 30% of the information found in the Library of Congress. This is the sort of stat that astounds people. Another is that your smart phone, smaller than some seashells, has more information in it than did an old complete set of the World Book Encyclopedia. Daniel Dennett, the philosopher, says that there’s so much information now, and so few brains in comparison, that we will have to “forget” Berlioz (who?) in order to “remember” the Beatles. Of course the Internet will “never forget” Berlioz, but almost no one will visit him on the Web.
But this sheer volume of information isn’t the problem. Rather, the first problem with information is that it’s often asymmetrical. The guy selling you an insurance policy has information that you don’t. And that’s why we need databases and open access to them, and watchdog agencies and good people to run them. It’s also why people who are educated have advantages over those who aren’t. They know more, and know more about how to find information about used cars, social policy, new books—you name it.
The second problem with information is that we are losing the ability to rank it. There’s information about the performance record of Maytag washing machines, and there’s information such as “everything I ever needed to know I learned in kindergarten,” to quote the title of a once best-selling book. Now which information would you rather have? Which is more valuable? While I’m not sure that kindergarten taught me everything I needed to know about life—back in my home town we called it “expression school”—I do believe this: that while it’s important to know which washing machine to buy, it’s even more important to know how to live when your washing machine—and other aspects of life—fail you.
The problem with information isn’t its size: it’s unequal access to it, and the inability to figure out what information is crucial and what information isn’t.
***
The End of American Tragedy
The last great American tragedy occurred on November 22, 1963, when President Kennedy was murdered. It’s worth asking why we haven’t had one since.
By “tragedy” I mean the classical sort: the death of someone highborn but noble, accomplished but also lucky. Kennedy fit the bill. Born with wealth, good looks and brains, he was brought down in early middle age. What have we had since?
Well, we’ve had our share of collective tragedies in America, such as the war in Vietnam. A historian even wrote a book called The Tragedy of Lyndon Johnson, in which Eric Goldman tried to make tragic an outsized president devoted to domestic reform yet caught in a war he thought he had to fight. But Johnson did not die in office, and he was too secretive and dislikable to be tragic. He wasn’t highborn either. And then came Watergate, another tragedy in America. But Richard Nixon was far too sleazy to qualify as an American tragic figure.
And there was Ronald Reagan, who did as much as anyone to make us forget the searing sorrow of American tragedy. Like Kennedy, Reagan was shot. Unlike Kennedy, he survived and even joked about the ordeal. Reagan turned the great American tragedy into the great American comedy. Affable and charming, he seemed to have the smiling good luck of the Irish. If a leftist fanatic of the Cold War murdered Kennedy, Reagan was president as that war wound down: from tragic to comic ending.
Since that black day in November ’63, then, we’ve had no more classical American tragedies. That’s good. No president has died in office.
But there are two other elements in the mix. First, the Cold War is over; we will never again have the great and handsome warrior against the Soviet Union that Kennedy was. With the end of the Cold War, nothing unites us on foreign policy; even Israel is now a polarizing issue. Second, we know far too much nowadays about our leaders’ private lives. Even Kennedy, it turns out, was recklessly promiscuous, a small smirch upon his tragic stature.
***
Justin Bieber, War Hero
My own acquaintance with pop singer Justin Bieber needs some work. But the other day a friend said to me that it was a good thing North America didn’t have to depend on the likes of him when it faced Hitler. Bieber in contemporary eyes is not made of stern stuff—not war hero material.
A quick check of young Mr. Bieber’s profile will reveal that he likes to affect a rapper look, complete with hoodies; that he has been criticized for an androgynous appearance; that he claims to have spoken to Jesus and does not believe in sex with someone you don’t really love; that though he talks with Jesus he also prays a Jewish prayer before concerts; and that this teen heartthrob likes what he calls the “swag” of black culture.
Are these the hallmarks of a war hero? Could we imagine Mr. Bieber going over the top at the Battle of the Bulge? Well, no, we can’t. But there’s a simple reason: because he was born in 1994, not 1924.
We forget how much our personalities and trajectories are determined by when and where we were born. Bieber’s love of black styles, his ecumenical blend of Jesus chats and Jewish prayers, and his androgyny: they are all possible because he was born into a digital culture where nearly every possible lifestyle is up for grabs. A 1924 Justin would have come into a world where rappers and androgyny and ecumenism didn’t exist. He would have been drafted into the Canadian military and quite possibly have been at the ready on D-Day. He might have become a decorated soldier.
We don’t choose our eras. We take our cue from them and then do what we have to do within their strictures. My friend was wrong. Justin could have been a flying ace or platoon leader after all. He might well have made a good one. –Tom McBride