Wednesday, November 18, 2009

POWER.

The Rules of Power.
1. Never outshine the master.
Always make those above you feel comfortably superior. In your desire to please or impress them, do not go too far in displaying your talents or you might accomplish the opposite: inspire insecurity. Make your masters appear more brilliant than they are and you will attain the heights of power.

2. Never put too much trust in friends; learn how to use enemies.
Be wary of friends; they will betray you more quickly, for they are easily aroused to envy. They also become spoiled and tyrannical. But hire a former enemy and he will be more loyal than a friend, because he has more to prove. In fact, you have more to fear from friends than from enemies. If you have no enemies, find a way to make them.

3. Conceal your intentions.
Keep people off-balance and in the dark by never revealing the purpose behind your actions. If they have no clue what you are up to, they cannot prepare a defense. Guide them far enough down the wrong path, envelop them in enough smoke, and by the time they realize your intentions, it will be too late.

4. Always say less than necessary.
When you are trying to impress people with words, the more you say, the more common you appear, and the less in control. Even if you are saying something banal, it will seem original if you make it vague, open-ended, and sphinxlike. Powerful people impress and intimidate by saying less. The more you say, the more likely you are to say something foolish.

5. So much depends on reputation. Guard it with your life.
Reputation is the cornerstone of power. Through reputation alone you can intimidate and win; once it slips, however, you are vulnerable and will be attacked on all sides. Make your reputation unassailable. Always be alert to potential attacks and thwart them before they happen. Meanwhile, learn to destroy your enemies by opening holes in their own reputations. Then stand aside and let public opinion hang them.

6. Court attention at all costs.
It is better to be attacked and slandered than to be ignored. Do not discriminate between different types of attention; in the end, all of it will work in your favor. Welcome personal attacks and feel no need to defend yourself. Court controversy, even scandal. Never be afraid or ashamed of the qualities that set you apart or draw attention to you.

Law 6: Court Attention at all Cost


Be ostentatious and be seen... What is not seen is as though it did not exist... It was light that first caused all creation to shine forth. Display fills up many blanks, covers up deficiencies, and gives everything a second life, especially when it is backed by genuine merit. (Baltasar Gracian, 1601-1658)
Everything is judged by its appearance; what is unseen counts for nothing. Never let yourself get lost in the crowd, then, or buried in oblivion. Stand out. Be conspicuous, at all cost. Make yourself a magnet of attention by appearing larger, more colorful, more mysterious, than the bland and timid masses.
Why Fame Is Important In Every Field Of Work
Burning more brightly than those around you is a skill that no one is born with. You have to learn to attract attention. At the start of your career, you must attach your name and reputation to a quality, an image, that sets you apart from other people. This image can be something like a characteristic style of dress, or a personality quirk that amuses people and gets talked about. Once the image is established, you have an appearance, a place in the sky for your star.

Court of Louis XIV

The court of Louis XIV contained many talented writers, artists, great beauties, and men and women of impeccable virtue, but no one was more talked about than the singular Duc de Lauzun. The duke was short, almost dwarfish, and he was prone to the most insolent kinds of behavior—he slept with the king's mistress, and openly insulted not only other courtiers but the king himself. Louis, however, was so beguiled by the duke's eccentricities that he could not bear his absences from the court. It was simple: The strangeness of the duke's character attracted attention. Once people were enthralled by him, they wanted him around at any cost.

Thomas Edison - The Greatest Inventor in the World

The great scientist Thomas Edison knew that to raise money he had to remain in the public eye at any cost. Almost as important as the inventions themselves was how he presented them to the public and courted attention. Edison would design visually dazzling experiments to display his discoveries with electricity. He would talk of future inventions that seemed fantastic at the time—robots, and machines that could photograph thought—and that he had no intention of wasting his energy on, but that made the public talk about him. He did everything he could to make sure that he received more attention than his great rival Nikola Tesla, who may actually have been more brilliant than he was but whose name was far less known. In 1915, it was rumored that Edison and Tesla would be joint recipients of that year's Nobel Prize in physics. The prize was eventually given to a pair of English physicists; only later was it discovered that the prize committee had actually approached Edison, but he had turned them down, refusing to share the prize with Tesla. By that time his fame was more secure than Tesla's, and he thought it better to refuse the honor than to allow his rival the attention that would have come even from sharing the prize.

6 Ways You Can Become Famous
Let's look at six ways you can become famous and make your ideas more popular than the competition:
1. Attack The Sensational/Scandalous
If you find yourself in a lowly position that offers little opportunity for you to draw attention, an effective trick is to attack the most visible, most famous, most powerful person you can find.

Pietro Aretino

When Pietro Aretino, a young Roman servant boy of the early sixteenth century, wanted to get attention as a writer of verses, he decided to publish a series of satirical poems ridiculing the pope and his affection for a pet elephant. The attack put Aretino in the public eye immediately. A slanderous attack on a person in a position of power would have a similar effect. Remember, however, to use such tactics sparingly after you have the public's attention, when the act can wear thin.

2. Keep Reinventing Yourself
Once in the limelight you must constantly renew it by adapting and varying your method of courting attention. If you don't, the public will grow tired, will take you for granted, and will move on to a newer star. The game requires constant vigilance and creativity.

3. Be Unpredictable
People feel superior to the person whose actions they can predict. If you show them who is in control by playing against their expectations, you will gain their respect and tighten your hold on their fleeting attention.

Pablo Picasso - The Greatest Painter In The World

Pablo Picasso never allowed himself to fade into the background; if his name became too attached to a particular style, he would deliberately upset the public with a new series of paintings that went against all expectations. Better to create something ugly and disturbing, he believed, than to let viewers grow too familiar with his work.

4. Create an Air of Mystery
In a world growing increasingly banal and familiar, what seems enigmatic instantly draws attention. Never make it too clear what you are doing or about to do. Do not show all your cards. An air of mystery heightens your presence; it also creates anticipation—everyone will be watching you to see what happens next. Use mystery to beguile, seduce, even frighten.

If you do not declare yourself immediately, you arouse expectation... Mix a little mystery with everything, and the very mystery stirs up veneration. And when you explain, be not too explicit... In this manner you imitate the Divine way when you cause men to wonder and watch. (Baltasar Gracian, 1601-1658)

5. Better to Be Attacked/Slandered Than Ignored
It is a common mistake to imagine that this peculiar appearance of yours should not be controversial, that to be attacked is somehow bad. Nothing could be further from the truth. To avoid being a flash in the pan, and having your notoriety eclipsed by another, you must not discriminate between different types of attention; in the end, every kind will work in your favor. Welcome personal attacks and feel no need to defend yourself.

P.T. Barnum - The Greatest Entertainer in the World

P.T. Barnum learned how to make any kind of attention work in his favor. Any form of publicity would benefit his entertainment business, no matter if it was bad publicity. He promoted his shows of curiosities to audiences with all kinds of gimmicks. He would offer "Free Music for Millions," but hire bad musicians, so the crowd would end up buying tickets to the show just to escape the bands. He planted articles in newspapers and even sent anonymous letters to keep his name in the limelight.

6. Make Yourself Appear Larger Than Life.
Society craves larger-than-life figures, people who stand above the general mediocrity. Never be afraid, then, of the qualities that set you apart and draw attention to you. Court controversy, even scandal. It is better to be attacked, even slandered, than ignored. All professions are ruled by this law, and all professionals must have a bit of the showman about them.
Source: http://48laws-of-power.blogspot.com/2011/05/law-6-court-attention-at-all-cost.html

7. Get others to do the work for you, but always take the credit.
8. Make others come to you. Use bait if necessary.
9. Win through your actions, never through argument.
10. Infection: Avoid the unhappy and unlucky.
11. Learn to keep people dependent on you.
12. Use selective honesty and generosity to disarm your victim.
13. When asking for help, appeal to people's self-interest, never to their mercy or gratitude.
14. Pose as a friend; work as a spy.
15. Crush your enemy totally.
16. Use absence to increase respect and honor.
17. Keep others in suspended terror: Cultivate an air of unpredictability.
18. Do not build fortresses to protect yourself. Isolation is dangerous.
19. Know who you are dealing with; do not offend the wrong person.
20. Do not commit to anyone.
21. Play a sucker to catch a sucker. Seem dumber than your mark.
22. Use the surrender tactic: Transform weakness into power.
23. Concentrate your forces.
24. Play the perfect courtier.
25. Re-create yourself.
26. Keep your hands clean.
27. Play on people's need to believe to create a cultlike following.
28. Enter action with boldness.
29. Plan all the way to the end.
30. Make your accomplishments seem effortless.
31. Control the options. Get others to play with the cards you deal.
32. Play to people's fantasies.
33. Discover each person's thumbscrew.
34. Be royal in your own fashion. Act like a king to be treated like one.
35. Master the art of timing.
36. Disdain things you cannot have. Ignoring them is the best revenge.
37. Create compelling spectacles.
38. Think as you like, but behave like others.
39. Stir up waters to catch fish.
40. Despise the free lunch.
41. Avoid stepping into a great man's shoes.
42. Strike the shepherd and the sheep will scatter.
43. Work on the hearts and minds of others.
44. Disarm and infuriate with the mirror effect.
45. Preach the need for change, but never reform too much at once.
46. Never appear too perfect.
47. Do not go past the mark you aimed for. In victory, learn when to stop.
48. Assume formlessness.
49. Thank God; count your blessings.

Judge London Steverson
London Eugene Livingston Steverson (born March 13, 1947) was one of the first two African Americans to graduate from the United States Coast Guard Academy in 1968. Later, as chief of the newly formed Minority Recruiting Section of the United States Coast Guard (USCG), he was charged with desegregating the Coast Guard Academy by recruiting minority candidates. He retired from the Coast Guard in 1988 and in 1990 was appointed to the bench as a Federal Administrative Law Judge with the Office of Hearings and Appeals, Social Security Administration.

Early Life and Education
Steverson was born and raised in Millington, Tennessee, the oldest of three children of Jerome and Ruby Steverson. At the age of 5 he was enrolled in the E. A. Harrold elementary school in a segregated school system. He later attended the all-black Woodstock High School in Memphis, Tennessee, graduating as valedictorian.
A Presidential Executive Order issued by President Truman had desegregated the armed forces in 1948,[1] but the service academies were lagging in officer recruiting. President Kennedy specifically challenged the United States Coast Guard Academy to tender appointments to Black high school students. London Steverson was one of the Black students offered such an appointment, and when he accepted the opportunity to be part of the class of 1968, he became the second African American to enter the previously all-white military academy. On June 4, 1968, Steverson graduated from the Coast Guard Academy with a BS degree in Engineering and a commission as an ensign in the U.S. Coast Guard.
In 1974, while still a member of the Coast Guard, Steverson entered The National Law Center of The George Washington University and graduated in 1977 with a Juris Doctor degree.

USCG Assignments.
Steverson's first duty assignment out of the Academy was in Antarctic research logistical support. In July 1968 he reported aboard the Coast Guard Cutter (CGC) Glacier[2] (WAGB-4), an icebreaker operating under the control of the U.S. Navy, and served as a deck watch officer and head of the Marine Science Department. He traveled to Antarctica during two patrols from July 1968 to August 1969, supporting the research operations of the National Science Foundation's Antarctic Research Project in and around McMurdo Station. During the 1969 patrol the CGC Glacier responded to an international distress call from the Argentine icebreaker General San Martin, which the Glacier freed.
From 1970 to 1972 he was assigned to Juneau, Alaska, as a Search and Rescue Officer. Before being certified as an Operations Duty Officer, he had to become thoroughly familiar with the geography and topography of the remote Alaskan sites. Along with his office mate, Ltjg Herbert Claiborne "Bertie" Pell, the son of Rhode Island Senator Claiborne Pell, Steverson was sent on a familiarization tour of Coast Guard, Navy, and Air Force bases. The bases visited were Base Kodiak, Base Adak Island, and Attu Island, in the Aleutian Islands.[3]
Steverson was the Duty Officer on September 4, 1971, when an emergency call was received that an Alaska Airlines Boeing 727 passenger plane was overdue at Juneau airport. It was a Saturday; the weather was foggy with drizzling rain, and visibility was less than one-quarter mile. The 727 was en route from Anchorage, Alaska to Seattle, Washington, with a scheduled stop in Juneau, and there were 109 people on board. Steverson received the initial alert message and began coordinating the search and rescue effort. In a matter of hours the wreckage was located on the side of a mountain about five miles from the airport; there were no survivors. For several weeks the body parts were collected and reassembled in a staging area in the National Guard Armory, only a few blocks from the Search and Rescue Center where Steverson had first received the distress broadcast.[4] A subsequent investigation by the National Transportation Safety Board determined that the cause of the accident was equipment failure.[5]
Also noteworthy was Steverson's involvement as an Operations Officer during the seizure of two Russian fishing vessels, the Kolevan and the Lamut, for violating an international agreement prohibiting foreign vessels from fishing in United States territorial waters. The initial attempts at seizing the Russian vessels almost precipitated an international incident when the vessels refused to proceed to a U.S. port and instead sailed toward the Kamchatka Peninsula. Russian MIG fighter planes were scrambled, as well as American fighter planes from Elmendorf Air Force Base, before the Russian vessels changed course and steamed back.


Thursday, November 12, 2009

The End Of History.

SINGAPORE -- The 20th anniversary of the fall of the Berlin Wall has just been celebrated. For many, that momentous event marked the so-called "end of history" and the final victory of the West. This week, Barack Obama, the first Black president of the once-triumphant superpower in that Cold War contest, heads to Beijing to meet America's bankers -- the Chinese Communist government -- a prospect undreamt of 20 years ago. Surely, this twist of the times is a good point of departure for taking stock of just where history has gone during these past two decades.

Let me begin with an extreme and provocative point to get the argument going: Francis Fukuyama's famous essay "The End of History" may have done some serious brain damage to Western minds in the 1990s and beyond. Fukuyama should not be blamed for this brain damage. He wrote a subtle, sophisticated and nuanced essay. However, few Western intellectuals read the essay in its entirety. Instead, the only message they took away was an equation of two phrases found in it: The End of History = The Triumph of the West.

Western hubris was thick in the air then. I experienced it. For example, in 1991 I heard a senior Belgian official, speaking on behalf of Europe, tell a group of Asians, "The Cold War has ended. There are only two superpowers left: the United States and Europe." This hubris also explains how Western minds failed to foresee that instead of the triumph of the West, the 1990s would see the end of Western domination of world history (but not the end of the West) and the return of Asia.

There is no doubt that the West has contributed to the return of Asia. As I document in my book The New Asian Hemisphere: The Irresistible Shift of Global Power to the East, several Asian societies have succeeded because they finally understood, absorbed and implemented the seven pillars of Western wisdom, namely free-market economics, science and technology, meritocracy, pragmatism, culture of peace, rule of law and education.

Notice what is missing from the list: Western political liberalism, despite Fukuyama's claim that "The triumph of the West, of the Western idea, is evident first of all in the total exhaustion of viable systematic alternatives to Western liberalism."

The general assumption in Western minds after reading Fukuyama's essay was that the world would in one way or another become more Westernized. Instead, the exact opposite has happened. Modernization has spread across the world. But modernization has been accompanied by de-Westernization, not Westernization. Fukuyama acknowledges this today. As he said in a recent interview with Global Viewpoint editor Nathan Gardels: "The old version of the idea of modernization was Euro-centric, reflecting Europe's own development. That did contain attributes which sought to define modernization in a quite narrow way."

In the same interview, Fukuyama was right in emphasizing that the three components of political modernization were: the creation of an effective state that could enforce rules, the rule of law that binds the sovereign, and accountability. Indeed, these are the very traits of political modernization that many Asian states are aspiring to achieve. Asians surely agree that no state can function or develop without an effective government. We feel particularly vindicated in this point of view after the recent financial crisis. One reason why the United States came to grief was the deeply held ideological assumption in the minds of key American policymakers, like Alan Greenspan, that Ronald Reagan was correct in saying that "Government is not a solution to our problem; government is the problem." Fortunately, Asians did not fall prey to this ideology.

Consequently, in the 21st century, history will unfold in the exact opposite direction of what Western intellectuals anticipated in 1991. Then they all assumed that The End of History = The Triumph of the West. Instead, we will now see that The Return of History = The Retreat of the West. One prediction I can make confidently is that the Western footprint on the world, which was hugely oversized in the 19th and 20th centuries, will retreat significantly in the 21st century.

This will not mean a retreat of all Western ideas. Indeed many key ideas like free-market economics and rule of law will be embraced ever more widely. However, few Asians will believe that the Western societies are best at implementing these Western ideas. Indeed, the general assumption of Western competence in governance and management will be replaced by awareness that the West has become quite inept at managing its economies. A new gap will develop. Respect for Western ideas will remain, but respect for Western practices will diminish, unless Western performance in governance improves again.

Sadly, in all the recent discussions of "The End of History" 20 years after its publication, few Western commentators have dared to address the biggest lapse in Western practice. The fundamental underlying assumption of "The End of History" thesis was that the West would remain the "beacon" for the world in democracy and human rights. In 1989, if anyone had dared to predict that within 15 years, the foremost "beacon" of human rights would become the first Western developed state to reintroduce torture, everyone would have shouted "impossible." Yet the impossible happened!

Few in the West understand how much shock Guantanamo has caused in non-Western minds. Hence, many are puzzled that Western intellectuals continue to assume that they can portray themselves and their countries as models to follow when they speak to the rest of the world on human rights. Fukuyama is right to emphasize the importance of "accountability." Yet no one in the West has been held accountable for Guantanamo.

Consequently, what moral authority does the West have to speak on the issues of human rights anymore? This loss of moral authority is the exact opposite of the outcome that Western minds expected when they celebrated the fall of the Berlin Wall in 1989.

Does this mean we should give up hope? Will the world become a sadder place? Probably few in the West will remember what Fukuyama wrote in the last paragraph of his essay. He wrote: "The end of history will be a very sad time. The struggle for recognition, the willingness to risk one's life for a purely abstract goal, the worldwide ideological struggle that called forth daring, courage, imagination and idealism, will be replaced by economic calculation, the endless solving of technical problems, environmental concerns, and the satisfaction of sophisticated consumer demands. In the post-historical period there will be neither art nor philosophy, just the perpetual caretaking of the museum of human history."

Here, too, as the 21st century unfolds, we will see the exact opposite outcome. The return of Asia will be accompanied by an astonishing Asian renaissance in which many diverse Asian cultures will rediscover their lost heritage of art and philosophy. There is no question that Asians will celebrate the return of history in the 21st century. The only question is: Will the West join them in these celebrations, or will they keep waiting for the end to come?

Kishore Mahbubani, dean of the Lee Kuan Yew School of Public Policy at the National University of Singapore, is the author of The New Asian Hemisphere: The Irresistible Shift of Global Power to the East.

Tuesday, October 27, 2009

How To Define "Critical Thinking".

Definitions of Critical Thinking:

Robert H. Ennis, Author of The Cornell Critical Thinking Tests
"Critical thinking is reasonable, reflective thinking that is focused on deciding what to believe and do."

A SUPER-STREAMLINED CONCEPTION OF CRITICAL THINKING
Robert H. Ennis, 6/20/02

Assuming that critical thinking is reasonable reflective thinking focused on deciding what to believe or do, a critical thinker:

1. Is open-minded and mindful of alternatives
2. Tries to be well-informed
3. Judges well the credibility of sources
4. Identifies conclusions, reasons, and assumptions
5. Judges well the quality of an argument, including the acceptability of its reasons, assumptions, and evidence
6. Can well develop and defend a reasonable position
7. Asks appropriate clarifying questions
8. Formulates plausible hypotheses; plans experiments well
9. Defines terms in a way appropriate for the context
10. Draws conclusions when warranted, but with caution
11. Integrates all items in this list when deciding what to believe or do

Critical Thinkers are disposed to:

1. Care that their beliefs be true, and that their decisions be justified; that is, care to "get it right" to the extent possible. This includes the dispositions to

a. Seek alternative hypotheses, explanations, conclusions, plans, sources, etc., and be open to them
b. Endorse a position to the extent that, but only to the extent that, it is justified by the information that is available
c. Be well informed
d. Consider seriously other points of view than their own

2. Care to present a position honestly and clearly, theirs as well as others'. This includes the dispositions to

a. Be clear about the intended meaning of what is said, written, or otherwise communicated, seeking as much precision as the situation requires
b. Determine, and maintain focus on, the conclusion or question
c. Seek and offer reasons
d. Take into account the total situation
e. Be reflectively aware of their own basic beliefs

3. Care about the dignity and worth of every person (a correlative disposition). This includes the dispositions to

a. Discover and listen to others' views and reasons
b. Avoid intimidating or confusing others with their critical thinking prowess, taking into account others' feelings and level of understanding
c. Be concerned about others' welfare

Critical Thinking Abilities:

Ideal critical thinkers have the ability to
(The first three items involve elementary clarification.)

1. Focus on a question

a. Identify or formulate a question
b. Identify or formulate criteria for judging possible answers
c. Keep the situation in mind

2. Analyze arguments

a. Identify conclusions
b. Identify stated reasons
c. Identify unstated reasons
d. Identify and handle irrelevance
e. See the structure of an argument
f. Summarize

3. Ask and answer questions of clarification and/or challenge, such as,

a. Why?
b. What is your main point?
c. What do you mean by…?
d. What would be an example?
e. What would not be an example (though close to being one)?
f. How does that apply to this case (describe a case, which might well appear to be a counterexample)?
g. What difference does it make?
h. What are the facts?
i. Is this what you are saying: ____________?
j. Would you say some more about that?

(The next two involve the basis for the decision.)

4. Judge the credibility of a source. Major criteria (but not necessary conditions):

a. Expertise
b. Lack of conflict of interest
c. Agreement among sources
d. Reputation
e. Use of established procedures
f. Known risk to reputation
g. Ability to give reasons
h. Careful habits

5. Observe, and judge observation reports. Major criteria (but not necessary conditions, except for the first):

a. Minimal inferring involved
b. Short time interval between observation and report
c. Report by the observer, rather than someone else (that is, the report is not hearsay)
d. Provision of records
e. Corroboration
f. Possibility of corroboration
g. Good access
h. Competent employment of technology, if technology is useful
i. Satisfaction by observer (and reporter, if a different person) of the credibility criteria in Ability #4 above

(The next three involve inference.)

6. Deduce, and judge deduction

a. Class logic
b. Conditional logic
c. Interpretation of logical terminology in statements, including
(1) Negation and double negation
(2) Necessary and sufficient condition language
(3) Such words as "only", "if and only if", "or", "some", "unless", "not both".

7. Induce, and judge induction

a. To generalizations. Broad considerations:
(1) Typicality of data, including sampling where appropriate
(2) Breadth of coverage
(3) Acceptability of evidence
b. To explanatory conclusions (including hypotheses)
(1) Major types of explanatory conclusions and hypotheses:
(a) Causal claims
(b) Claims about the beliefs and attitudes of people
(c) Interpretation of authors’ intended meanings
(d) Historical claims that certain things happened (including criminal accusations)
(e) Reported definitions
(f) Claims that some proposition is an unstated reason that the person actually used
(2) Characteristic investigative activities
(a) Designing experiments, including planning to control variables
(b) Seeking evidence and counter-evidence
(c) Seeking other possible explanations
(3) Criteria, the first five being essential, the sixth being desirable
(a) The proposed conclusion would explain the evidence
(b) The proposed conclusion is consistent with all known facts
(c) Competitive alternative explanations are inconsistent with facts
(d) The evidence on which the hypothesis depends is acceptable
(e) A legitimate effort should have been made to uncover counter-evidence
(f) The proposed conclusion seems plausible

8. Make and judge value judgments. Important factors:

a. Background facts
b. Consequences of accepting or rejecting the judgment
c. Prima facie application of acceptable principles
d. Alternatives
e. Balancing, weighing, deciding

(The next two abilities involve advanced clarification.)

9. Define terms and judge definitions. Three dimensions are form, strategy, and content.

a. Form. Some useful forms are:
(1) Synonym
(2) Classification
(3) Range
(4) Equivalent expression
(5) Operational
(6) Example and non-example
b. Definitional strategy
(1) Acts
(a) Report a meaning
(b) Stipulate a meaning
(c) Express a position on an issue (including "programmatic" and "persuasive" definitions)
(2) Identifying and handling equivocation
c. Content of the definition

10. Attribute unstated assumptions (an ability that belongs under both clarification and, in a way, inference)

(The next two abilities involve supposition and integration.)

11. Consider and reason from premises, reasons, assumptions, positions, and other propositions with which they disagree or about which they are in doubt -- without letting the disagreement or doubt interfere with their thinking ("suppositional thinking")

12. Integrate the other abilities and dispositions in making and defending a decision

(The first twelve abilities are constitutive abilities. The next three are auxiliary critical thinking abilities: Having them, though very helpful in various ways, is not constitutive of being a critical thinker.)

13. Proceed in an orderly manner appropriate to the situation. For example:

a. Follow problem solving steps
b. Monitor one's own thinking (that is, engage in metacognition)
c. Employ a reasonable critical thinking checklist

14. Be sensitive to the feelings, level of knowledge, and degree of sophistication of others

15. Employ appropriate rhetorical strategies in discussion and presentation (orally and in writing), including employing and reacting to "fallacy" labels in an appropriate manner.

Examples of fallacy labels are "circularity," "bandwagon," "post hoc," "equivocation," "non sequitur," and "straw person."


Dewey, John
Critical thinking is "active, persistent, and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends (Dewey 1933: 118)."


Glaser
Critical thinking involves: (1) an attitude of being disposed to consider in a thoughtful way the problems and subjects that come within the range of one's experiences, (2) knowledge of the methods of logical inquiry and reasoning, and (3) some skill in applying those methods. Critical thinking calls for a persistent effort to examine any belief or supposed form of knowledge in the light of the evidence that supports it and the further conclusions to which it tends. (Glaser 1941, pp. 5-6)

Abilities include: "(a) to recognize problems, (b) to find workable means for meeting those problems, (c) to gather and marshal pertinent information, (d) to recognize unstated assumptions and values, (e) to comprehend and use language with accuracy, clarity and discrimination, (f) to interpret data, (g) to appraise evidence and evaluate statements, (h) to recognize the existence of logical relationships between propositions, (i) to draw warranted conclusions and generalizations, (j) to put to test the generalizations and conclusions at which one arrives, (k) to reconstruct one's patterns of beliefs on the basis of wider experience; and (l) to render accurate judgments about specific things and qualities in everyday life." (p.6)


MCC General Education Initiatives
"Critical thinking includes the ability to respond to material by distinguishing between facts and opinions or personal feelings, judgments and inferences, inductive and deductive arguments, and the objective and subjective. It also includes the ability to generate questions, construct, and recognize the structure of arguments, and adequately support arguments; define, analyze, and devise solutions for problems and issues; sort, organize, classify, correlate, and analyze materials and data; integrate information and see relationships; evaluate information, materials, and data by drawing inferences, arriving at reasonable and informed conclusions, applying understanding and knowledge to new and different problems, developing rational and reasonable interpretations, suspending beliefs and remaining open to new information, methods, cultural systems, values and beliefs and by assimilating information."


Nickerson, Perkins and Smith (1985)
"The ability to judge the plausibility of specific assertions, to weigh evidence, to assess the logical soundness of inferences, to construct counter-arguments and alternative hypotheses."


Moore and Parker, Critical Thinking
Critical Thinking is "the careful, deliberate determination of whether we should accept, reject, or suspend judgment about a claim, and the degree of confidence with which we accept or reject it."


Delphi Report
"We understand critical thinking to be purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based. CT is essential as a tool of inquiry. As such, CT is a liberating force in education and a powerful resource in one's personal and civic life. While not synonymous with good thinking, CT is a pervasive and self-rectifying human phenomenon. The ideal critical thinker is habitually inquisitive, well-informed, trustful of reason, open-minded, flexible, fair-minded in evaluation, honest in facing personal biases, prudent in making judgments, willing to reconsider, clear about issues, orderly in complex matters, diligent in seeking relevant information, reasonable in the selection of criteria, focused in inquiry, and persistent in seeking results which are as precise as the subject and the circumstances of inquiry permit. Thus, educating good critical thinkers means working toward this ideal. It combines developing CT skills with nurturing those dispositions which consistently yield useful insights and which are the basis of a rational and democratic society."

A little reformatting helps make this definition more comprehensible:

We understand critical thinking to be purposeful, self-regulatory judgment which results in

interpretation
analysis
evaluation
inference
as well as explanation of the

evidential
conceptual
methodological
criteriological
contextual
considerations upon which that judgment is based.


Francis Bacon (1605)
"For myself, I found that I was fitted for nothing so well as for the study of Truth; as having a mind nimble and versatile enough to catch the resemblances of things … and at the same time steady enough to fix and distinguish their subtler differences; as being gifted by nature with desire to seek, patience to doubt, fondness to meditate, slowness to assert, readiness to consider, carefulness to dispose and set in order; and as being a man that neither affects what is new nor admires what is old, and that hates every kind of imposture."

A shorter version is "the art of being right."

Or, more prosaically: critical thinking is "the skillful application of a repertoire of validated general techniques for deciding the level of confidence you should have in a proposition in the light of the available evidence."

Thursday, October 15, 2009

No Good Deed Goes Unpunished. Race and Rage.

It is a sign of our weird political moment that the award of the Nobel Peace Prize to President Obama will probably hurt him among some of his fellow citizens.

His opponents are describing the award as premature. The deeper problem is that the Nobel will underscore the extent to which Obama is a cosmopolitan figure, much loved in European capitals because he is the change they have been looking for.

Most Americans will probably be happy to have a leader who wins acclaim around the globe. But, paradoxically, a decision made in Oslo to honor Obama's peaceable intentions may make it more difficult for him to reconcile a body politic roiled by years of cultural warfare, partisan animosity and ideological extremism.

The effort to understand where Obama hatred comes from has been one of the few growth areas in the American economy.


There is no doubt that some of the anger is fueled by racial feeling, which is not the same as saying that all opposition to Obama is explained by racism. Most Obama opponents are simply conservative Republicans who disagree with him. But there are too many racist signs at rallies and too many overtly racial pronouncements in the fever swamps of the right-wing media to deny that racism is part of the anti-Obama mix.

Obama can't do much about those who are against him because of his race. Even a 1 percent unemployment rate wouldn't change the minds most scarred by prejudice. But there is a second level of angry opposition to which Obama needs to pay more attention. It involves the genuine rage of those who felt displaced in our economy even before the great recession and who are now hurting even more.

These Americans are sometimes written off as "angry white men." In analyzing anti-Obama feeling, commentators have taken to rummaging around in the work of historian Richard Hofstadter from the 1950s and '60s, focusing on his theory that "status anxiety" helps explain the rise of movements on the far right. The idea is that extremism takes hold in groups that feel their "status" is threatened by new groups on the rise in society.

The problem with status-anxiety theory is that it focuses on feelings and psychology, thus easily crossing into condescension. It implies that the victims of status anxiety should be doing a better job accepting their new situations and plays down the idea that they might have something real to be angry about.

In fact, many who now feel rage have legitimate reasons for it, even if neither Obama nor big government is the real culprit. The September 2009 unemployment numbers told the story in broad terms: Among men 20 and over, unemployment was 10.3 percent; among women, the rate was 7.8 percent.

Middle-income men, especially those who are not college graduates, have borne the brunt of economic change bred by globalization and technological transformation. Even before the recession, the decline in the number of well-paid jobs in manufacturing hit the incomes of this group of Americans hard. The trouble in the construction industry since the downturn began has compounded the problem.

This is not a uniquely American problem. Last week I caught up with Australia's deputy prime minister, Julia Gillard, who was visiting Washington for a conference on education. Though Gillard diplomatically avoided direct comment on American politics, she said what's happening here reminded her of the rise of Pauline Hanson, a politician who caused a sensation in Australian politics in the 1990s by creating One Nation, a xenophobic and protectionist political party tinged with racism.

Gillard, a leader of Australia's center-left Labor Party, argues that high unemployment, particularly the displacement of men from previously well-paid jobs, helped unleash Hansonism and "the politics of the ordinary guy versus these elites, the opera-watching, latte-sipping elites." Hansonism collapsed, partly because the Australian economy boomed. Gillard argued that the key to battling the politics of rage is to acknowledge that it is driven by "real problems" and not simply raw feelings.

No doubt some who despise Obama will see the judges in Norway as part of that latte-sipping crowd and will hold their esteem for the president against him. He can't do much about this. What he can do -- and perhaps then deserve the domestic equivalent of a peace prize -- is reach out to the angry white men with policies that address their grievances, and do so with an understanding that what matters to them is not status but simply a chance to make a decent living again.
(E J Dionne Jr)

The Norwegian Nobel Committee has taken it in the neck for awarding this year's Peace Prize to a nine-month-old American presidency. There's been much mockery of pencil-necked Norwegian academics in faraway Oslo. This is unfair.

The committee said it chose Barack Obama for his "vision of . . . a world without nuclear weapons" and for "meeting the great climatic challenges the world is confronting." I'd say that completes the argument over old and new Europe. This is a Nobel of decadence.

Let's be clear. This decadence isn't primarily about Roman Polanski or Silvio Berlusconi's playboy club or French culture minister Frederic Mitterrand's adventures in Thailand. Though these are not irrelevant.

This Nobel is about political decadence.

"Decadence," an enduring word, emerged from the Latin "de-cadere," which means "to fall down." Decadence stripped bare means decay.

When it was a vibrant garden of ideas, Europe gave the world more good things than one can count. Then it discovered the pleasures of the welfare state.

Old Europe now lives in a world of unpayable public pension obligations, weak job creation for its youngest workers, below-replacement birth rates, fat agricultural subsidies for farms dating to the Middle Ages, high taxes to pay for the public high-life, and history's most crucial proof of decay—the inability to finance one's armies. Only five of the 28 nations in NATO (the U.S., the U.K., France, Turkey and Greece) achieve the minimum defense-spending benchmark of 2% of GDP.

The effect of arriving at a state of political decadence, of no longer being able to rise in the world, is that many people increasingly discover that soft moralism is a more congenial pastime than producing answers for the hard questions. As when David Cameron, the Tory leader and likely next British prime minister, wonders: "The insatiable consumption and materialism of the past decade; has it made us happier or more fulfilled?"

This isn't to say that soft moralism is about nothing. But when matters such as climate change become life's primary concerns, it means one is going to spend more time preaching, which is easy, than doing, which is hard. One thinks of Nobelist Al Gore's unstoppable sermons.

Among the hardest questions Europe faced after World War II was the placement of anti-Soviet Pershing missiles on Europe's soil in 1983. Led by Helmut Kohl and Maggie Thatcher, Europe did something hard: It overcame its pacifists. A decade later, with the siege of Sarajevo, old Europe came to understand that making the hardest decisions was now beyond its reach.

Current hard questions include Pakistan and Afghanistan. Darfur is a hard question. Where to hold captured terrorists is a hard question.

Americans heard often over the past four years how much Europe "hated" us because of that most complex of hard questions, the Iraq war. Unpopular wars cause bad feelings to be sure, but past some point Europe's antipathy toward the U.S. over Iraq began to sound a lot like moralistic decadence. It is a neurotic resentment of a superpower merely because it possesses the resources to do something Europe can no longer do, for good or ill.


What we are in the process of discovering is just how much President Obama's worldview coincides with that of the continent that claims to have seen itself reflected in him and its Peace Prize.

Mr. Obama is at a crossroads in his presidency. As George W. Bush departed the White House, he said his successor would one day arrive at the need to make a decision that made clear the reality of being the American president. That moment has arrived. It is the pending troop-deployment decision for Afghanistan, a very hard one.

After that, Mr. Obama will go to Oslo Dec. 10 to receive the Prize itself. That will occur in the middle of the Dec. 7-18 United Nations Climate Conference in Copenhagen, whose goal is among the explicit reasons why Mr. Obama was given the Nobel Peace Prize.

Between Afghanistan and Oslo, we're going to get some clarity about the Obama presidency.

Perhaps the most intriguing onlooker to this education is Europe's Nicolas Sarkozy. On his good days, France's president seems aware of the political and economic decay he has inherited. So it was striking at the United Nations last month when Mr. Sarkozy said that Mr. Obama "dreams of a world without nuclear arms." Then, describing Iran's nuclear threat, he said, "At a certain moment hard facts will force us to make decisions."

By "us" he means that the U.S. must lead. In the West, only the U.S. president can still make decisions based on hard facts rather than recede into soft moralism. The day that is no longer true, the U.S. will finally deserve a decadent Nobel.

Saturday, September 26, 2009

Old Men/Young Women, A Good Thing?

It turns out that older men chasing younger women contributes to human longevity and the survival of the species, according to new findings by researchers at Stanford and the University of California-Santa Barbara.

Evolutionary theory says that individuals should die of old age when their reproductive lives are complete, generally by age 55 in humans, according to demographer Cedric Puleston, a doctoral candidate in biological sciences at Stanford. But the fatherhood of a small number of older men is enough to postpone the date with death because natural selection fights life-shortening mutations until the species is finished reproducing.

"Rod Stewart and David Letterman having babies in their 50s and 60s
provide no benefit for their personal survival, but the pattern [of
reproducing at a later age] has an effect on the population as a
whole," Puleston said. "It's advantageous to the species if these
people stick around. By increasing the survival of men you have a
spillover effect on women because men pass their genes to children of
both sexes."

"Why Men Matter: Mating Patterns Drive Evolution of Human Lifespan,"
was published Aug. 29 in the online journal Public Library of Science
ONE. Shripad Tuljapurkar, the Morrison Professor of Population Studies
at Stanford; Puleston; and Michael Gurven, an assistant professor of
anthropology at UCSB, co-authored the study in an effort to understand
why humans don't die when female reproduction ends.

Human ability to scale the so-called "wall of death"—surviving beyond the reproductive years—has been a center of scientific controversy for more than 50 years, Puleston said. "The central question is: Why should a species that stops reproducing by some age stick around afterward?" he said. "Evolutionary theory predicts that, over time, harmful mutations that decrease survival will arise in the population and will remain invisible to natural selection after reproduction ends." However, in hunter-gatherer societies, which likely represent early human demographic conditions and mating patterns, one-third of people live beyond 55 years, past the reproductive lifespan for women. Furthermore, life expectancy in today's industrialized countries is 75 to 85 years, with mortality increasing gradually, not abruptly, following female menopause.

Grandmother hypothesis

In 1966, William Hamilton, a British evolutionary biologist, worked out the mathematics describing the "wall of death." Since then, the most popular explanation for why humans don't die by age 55 has been termed the "grandmother hypothesis," which suggests that women enhance the survival of their children and grandchildren by living long enough to care for them and "increasing the success of their genes," Puleston said. However, Hamilton's work has been difficult to express as a mathematical and genetic argument explaining why people live into old age.

Unlike previous research on human reproduction, this study—for the first time—includes data on males, a tweak that allowed the researchers to begin answering the "wall of death" question by matching it to human mortality patterns. According to Puleston, earlier studies looked only at women, because scientists can reproduce good datasets for humans entirely based on information related to female fertility and survival rates.

"Men's fertility is contingent on women's fertility—you have to figure
out how they match up. We care about reproduction because that is a
currency by which force of selection is counted. If we have not
accounted for the entire pattern of reproduction, we may be missing
something that's important to evolution."

Men and longevity

In the paper, the researchers analyzed "a general two-sex model to show that selection favors survival for as long as men reproduce." The scientists presented a "range of data showing that males much older than 50 years have substantial realized fertility through matings with younger females, a pattern that was likely typical among early humans." As a result, Puleston said, older male fertility helps to select against damaging cell mutations in humans who have passed the age of female menopause, consequently eliminating the "wall of death."

"Our analysis shows that old-age male fertility allows evolution to
breach Hamilton's wall of death and predicts a gradual rise in
mortality after the age of female menopause without relying on
'grandmother' effects or economic optimality," the researchers say in
the paper.

The scientists compiled longevity and fertility data from two hunter-gatherer groups, the Dobe !Kung of the Kalahari and the Ache of Paraguay, one of the most isolated populations in the world. They also looked at the forager-farmer Yanomamo of Brazil and Venezuela, and the Tsimane, an indigenous group in Bolivia. "They're living a lifestyle that our ancestors lived and their fertility patterns are probably most consistent with our ancestors," Puleston said about the four groups. The study also looked at several farming villages in Gambia and, for comparison, a group of modern Canadians.

In the less developed, traditional societies, males were as much as 5 to 15 years older than their female partners. In the United States and Europe, the age spread was about two years. "It's a universal pattern that in typical marriages men are older than women," Puleston said. "The age gaps vary by culture, but in every group we looked at men start [being reproductive] later. At the end of reproduction, male fertility rates taper off gradually, as opposed to the fairly sharp decline in female fertility by menopause." Despite small differences based on marriage traditions, all women and most men in the six groups stopped having children by their 50s, the researchers found. But some men, particularly high-status males, continued to reproduce into their 70s. The paper noted that the age gap is most pronounced in societies that favor polygyny, where a man takes several wives, and in gerontocracies, where older men monopolize access to reproductive women. The authors also cite genetic and anthropological evidence that early humans were probably polygynous as well.

Older male fertility also exists in societies supporting serial monogamy, because men are more likely to remarry than women. "For these reasons, we argue that realized male fertility was substantial at ages well past female menopause for much of human history and the result is reflected in the mortality patterns of modern populations," the authors say. "We conclude that deleterious mutations acting after the age of female menopause are selected against … solely as a result of the matings between older males and younger females."

According to Puleston, the "grandmother hypothesis" may be true, but the real pattern of male fertility extends beyond this explanation. "The key question is: Does the population have a greater growth rate if men are reproducing at a later age? The answer is 'yes.' The age of last reproduction gets pushed into the 60s and 70s if you add men to the analysis. Hamilton's approach was right, but in a species where males and females have different reproductive patterns, you need a two-sex model. You can't correctly estimate the force of selection if you leave men out of the picture. As a man myself, it's gratifying to know that men do matter."
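To make the force-of-selection point concrete, here is a minimal Python sketch of Hamilton's logic. It is not the authors' two-sex model or their data; the survivorship curve, fertility rates, and age cutoffs are invented numbers chosen only to illustrate why adding late male fertility keeps selection positive past female menopause.

# Toy illustration of Hamilton's "force of selection" argument.
# Not the study's model or data: all numbers below are invented.

def force_of_selection(survival, fertility):
    # Selection against a mutation that kills at age a is proportional
    # to the reproduction still expected after a:
    #   s(a) = sum over x > a of l(x) * m(x)
    # Once s(a) reaches 0, such mutations are invisible to selection
    # (Hamilton's "wall of death").
    ages = range(len(fertility))
    return [sum(survival[x] * fertility[x] for x in ages if x > a)
            for a in ages]

# Fraction surviving to each age, declining roughly linearly to age 80.
l = [max(0.0, 1.0 - 0.011 * age) for age in range(81)]

# One-sex schedule: fertility ends at menopause (about age 50).
female_only = [0.2 if 15 <= age < 50 else 0.0 for age in range(81)]

# Two-sex schedule: adds male reproduction with younger partners to about 70.
two_sex = [0.2 if 15 <= age < 50 else (0.1 if 50 <= age < 70 else 0.0)
           for age in range(81)]

s_female = force_of_selection(l, female_only)
s_two = force_of_selection(l, two_sex)

for age in (40, 55, 65, 75):
    print("age %2d: female-only s = %.2f, two-sex s = %.2f"
          % (age, s_female[age], s_two[age]))

# Female-only: s hits 0 at age 50 -- the "wall of death."
# Two-sex: s stays positive until male reproduction ends (about 70), so
# selection keeps removing mutations that act between 50 and 70, and
# mortality rises gradually rather than abruptly after menopause.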

Grants from the U.S. National Institute on Aging supported this study.

Wednesday, September 16, 2009

Is Justice Just For Us?



Suppose three children—Anne, Bob, and Carla—quarrel over a flute. Anne says it's hers because she's the only one who knows how to play it. Bob counters that he's the poorest and has no toys, so the flute would at least give him something to play with. Carla reminds Anne and Bob that she built the darn thing, and no sooner did she finish it than the other two started trying to take it away.

Intuitions clashing yet? Need something more complex to tingle your justice antennae—perhaps a puzzler from game theory? The example is Amartya Sen's, from the Nobel-Prize-winning economist's just-published The Idea of Justice (Belknap Press/Harvard University Press), his magnum opus on a line of work he's long addressed and now thoroughly re-examines: justice theory. And what a growth industry it's been since John Rawls revived the subject with his classic, A Theory of Justice (1971), and colleague Robert Nozick made its core principles into an Emerson Hall battle with his libertarian Anarchy, State, and Utopia (1974). Since Rawls, one hardly ranks as a political theorist without a whack at the J-word. Sen's stepping into the fray should keep things hopping, but justice theory is one subsidiary of philosophy that never really suffers a bad century.

Back in Homeric times, life was simpler. Justice largely meant personal vengeance. Complications began when Plato famously pinned on Thrasymachus the view that justice is simply the will of the stronger, and on Glaucon and Callicles the idea that justice is conventional. Plato argued, through his familiar Socratic ventriloquy, that justice is divine, an ideal to which human justice can only haltingly aspire. Aristotle then introduced a formal criterion of justice that still wins the greatest agreement, perhaps because it's merely formal: Treat equals equally and unequals unequally.

From then on, follow the history of philosophers' sentences that begin "Justice is … " and you hit so many diverse endings you wonder whether anyone, including the lady in the blindfold, knows what justice is.

To Aquinas, it's "a certain rectitude of mind whereby a man does what he ought to do in the circumstances confronting him." To Hume, it's "nothing but an artificial invention." To Sir Edward Coke, it's "the daughter of the law, for the law bringeth her forth." To 20th-century American jurist Learned Hand, it's "the tolerable accommodation of the conflicting interests of society." Do a survey, and about the only thinker who invites instant agreement is Belgian philosopher of law Chaim Perelman. According to Perelman, justice is simply "a confused concept."

One reason theories of justice abound is the range of the concept, applied to decisions, people, procedures, laws, actions, events. Justice is usually considered a positive thing, yet some rank it below mercy. It's divine for some, purely human for others. It's supposedly majestic, yet many complain of its quotidian banality and everyday scarcity. Recall the old lawyer's joke:

Petitioner: "Justice, justice, I demand justice!"

Judge: "Silence or I'll have you removed! This is a court of law!"

When Rawls declared justice "the first virtue of social institutions, as truth is of systems of thought," and began his painstaking probe of the conditions of just institutions, he re-established a modern tradition dating back to Hobbes: using social-contract theory to articulate ideal forms of social justice, sometimes in quasi-syllogistic form. But there was also a longstanding, skeptical, antisystematic tradition in justice theory. One of the suspenseful aspects of Sen's book is how its author, personally close to Rawls (who died in 2002) but more expansive and historical in regard to justice, walks a difficult line between the analytic foundationalism Rawls and Nozick practiced and the sensitivity to real-world justice in people's lives that Sen and Martha Nussbaum argue for and describe as the "capabilities" conception of justice.

Although Sen mentions neither the late philosopher Robert C. Solomon, author of A Passion for Justice (1995), nor the very-much-with-us Elizabeth H. Wolgast, author of The Grammar of Justice (1987), both deserve credit for adumbrating ideas in justice theory that Sen, with his enormous intellectual prestige and cachet as a star in Harvard's firmament, may finally infiltrate into elite Ivy League and Oxbridge political theory.

Solomon wrote in A Passion for Justice that justice is "a complex set of passions to be cultivated, not an abstract set of principles to be formulated. … Justice begins with compassion and caring, not principles or opinions, but it also involves, right from the start, such 'negative' emotions as envy, jealousy, indignation, anger, and resentment, a keen sense of having been personally cheated or neglected, and the desire to get even." In time, suggested Solomon, "the sense of justice emerges as a generalization and, eventually, a rationalization of a personal sense of injustice."

That common-sense attempt at causal explanation—taking seriously how feelings of injustice spur the intellectual drive toward a theory of justice—had also been observed by Wolgast, who argued in The Grammar of Justice that injustice "grammatically" precedes justice. Sen's Harvard colleague, Michael J. Sandel, at the outset of his new Justice: What's the Right Thing to Do?—not quite The Idiot's Guide to Justice but, unlike Sen's work, mainly a summary for general readers of key ideas in justice theory—notes, "At the heart of the bailout outrage was a sense of injustice."

Might our concept of justice arise when society's normal moral inertia, the tendency to accept traditions and status quo ethical procedures without challenge, is itself challenged?

Sen inclines to that view. He begins The Idea of Justice by quoting Pip in Charles Dickens's Great Expectations: "In the little world in which children have their existence, there is nothing so finely perceived and finely felt, as injustice." Sen adds, "The identification of redressable injustice is not only what animates us to think about justice and injustice, it is also central … to the theory of justice."

Thus the great economist, who long ago transcended the bounds of his discipline, goes full-frontal with justice—and John Rawls. Displaying his customary mix of erudition and worldliness, his irritation at the "parochial" slighting of Eastern thought (see The Argumentative Indian) and resistance to (despite mastery of) purely formal approaches to justice, Sen both praises Rawls profusely for his "rightly celebrated" work and nicks him with a score of cuts.

"Justice," Sen writes, "is ultimately connected with the way people's lives go, and not merely with the nature of institutions surrounding them." Two concepts from early Indian jurisprudence, niti (strict organizational and behavioral rules of justice) and nyaya (the larger picture of how such rules affect ordinary lives), provide a better prism for justice than Rawls's obsession with the characterization of just institutions. Indeed, Sen writes in a killer sum-up: "If a theory of justice is to guide reasoned choice of policies, strategies, or institutions, then the identification of fully just social arrangements is neither necessary nor sufficient."

It was Solomon, in A Passion for Justice, who voiced the problem that hangs over ostensibly rigorous justice theory, which Sen plainly finds unconvincing yet never quite denounces. Speaking of the enormous technical literature spawned by Rawls, Nozick, and their acolytes, Solomon wrote: "The positions have been drawn, defined, refined, and redefined again. The qualifications have been qualified, the objections answered and answered again with more objections, and the ramifications further ramified. … But the hope for a single, neutral, rational position has been thwarted every time." Solomon complained that justice theory had "become so specialized and so academic and so utterly unreadable that it has become just another intellectual puzzle, a conceptual Gordian knot awaiting its academic Alexander."

Will Sen be that Alexander? In repeatedly bringing back into the discussion Adam Smith's Theory of Moral Sentiments, Sen signals the need for justice theory to reconnect to realistic human psychology, not the phony formal rationalism that infects modern economics or the for-sake-of-argument altruism that anchors Rawls's project. (In A Theory of Justice, Rawls writes that in his well-ordered society, "Everyone is presumed to act justly.") By declaring his desire "to address questions of enhancing justice and removing injustice, rather than to offer resolutions of questions about the nature of perfect justice," Sen sinks a knife into the heart of the latter utopian program.

On the other hand, Sen's own understanding of his aim in The Idea of Justice hardly dismisses formal resources or careful reasoning. He cites an alternative tradition to social-contract theory, one he identifies as extending from Smith to Mill and beyond and characterizes as "comparative" in its measuring of the justice actually experienced by individuals. That countertradition issued, Sen explains, in the "analytical—and rather mathematical—discipline of social-choice theory" developed by Kenneth Arrow in the mid-20th century.

Alas, Sen spends some of the most arid sections of his book arguing how its insights can aid "enhancement of justice." He's far more convincing when he sticks to nonformal arguments. Nothing would be sadder than if The Idea of Justice, like A Theory of Justice, were to generate a fresh industry of acolyte-driven justice literature without moving political actors to improve people's lives (surely the author's paramount goal).

Still, one should never underestimate the influence in philosophy of a big book by a Harvard or Princeton luminary that impeaches an intellectual tradition, however politely. Richard Rorty successfully undermined the pretensions of analytic epistemology (except among its practitioners) because he was an ex-analyst whistle-blower. Sen may be just the inside man to redirect philosophical thinking about justice to that real-world "capabilities approach" he and Nussbaum urge.

One irony is that the famously media-shy Rawls had a complicated human relationship to "justice" few students of the magisterial system-builder understood. With the 2007 publication of Thomas Pogge's John Rawls: His Life and Theory of Justice, we learned that Rawls evolved away from Christianity and toward his secular theory of justice from deep feelings about concrete injustices such as the Holocaust. Another challenge to justice—the chanciness of life—occurred closer to home and similarly left a profound impact on him. In the Philippines during World War II, an assignment from a superior officer that might have gone to Rawls or another soldier went to the other man, who was killed.

"Reasoning," writes Sen early on, "is a robust source of hope and confidence in a world darkened by murky deeds." In The Idea of Justice, Sen provides us with a stunning model despite his eternally ambiguous and imperfectible subject. As he so winningly adds, "The remedy for bad reasoning is better reasoning."
(C Romano)

Wednesday, September 9, 2009

Thrifty vs Cheap.

Journalist Lauren Weber knows a little something about being cheap. When she was growing up, her father refused to set the heat above 50 degrees during the winter in New England.

He turned out the lights, even if someone had left a room for just a moment. And for a little while he even tried to ration the family's use of toilet paper. Seriously.

Rather than traumatize Weber, all that — and more — made her the perfect person to explore the roots of frugality in the United States.

She documents that exploration in her new book, In Cheap We Trust.

"When I started working on the book," says Weber, "a friend of mine suggested I call it 'Thrift: A Short History of a Dying Virtue,' but the more I did reporting on it, the more every person I talked to would say, 'Oh, you've got to interview my father, my brother, my wife, or you should interview me.'"

That's when Weber realized being cheap isn't dead or even dying; it's just been hiding underground for quite a while. Think of it as the frugal silent majority that Weber hopes will surface again soon.

America's Cheap Roots

Some trace the roots of frugality in this country to the Puritans, and it was indeed an important part of Puritan ideology. But Weber believes it was elevated into a national virtue during the American Revolution by the likes of Ben Franklin, Thomas Jefferson and John Adams.

Not only was it good for the soul, it was good for the American economy.

"They believed this was the way the United States could be less dependent on Europe for all of its trade and all of its goods," says Weber. "The patriots believed that if Americans could be industrious, work hard and save their money, that would provide them the capital to then till a new field or open a new workshop or hire more apprentices. We would be able to cut off more trade with Britain and become more self-sufficient here."

And Weber believes it was a strategy that worked. But, she adds, frugality never was the most popular ideal. "I like to say that thrift was a virtue Americans couldn't wait to relinquish. That's as true in the period right after the revolution as it was in the period in the 1950s when credit cards were brought into being."

When Cheap Went Out of Style

If you want to pinpoint the moment when being cheap went out of style, look no further than the end of World War II. American economists, manufacturers and politicians were worried the post-war nation would fall into a deep recession. The Great Depression was still a fresh memory after all, and the war had amounted to the largest government stimulus program the country had seen at that time.

So what could keep the postwar economy afloat? The American consumer.

"So right before the war ended there was a lot of talk about how Americans needed to buy more washing machines, buy cars, get prepared for the consumer economy that was coming and Americans really took that to heart," Weber says.

And, she says, pop culture offers proof:

"It's not ironic that Scrooge McDuck, the famous miserly uncle of Donald Duck, emerged in 1947. And Jack Benny's famous cheapskate character on radio and TV came in the late '40s and early '50s. Thrift went from being a national virtue to being kind of a punch line."

Cheap Thrills

Cheap. Cheap suit. Cheap date. Cheap shot. It's a dirty word, rife with negative associations. We hear the word cheap and we think, miser, whore, Wal-Mart, made in China, something that's going to fall apart. It's an insult, almost any way you look at it. An eighty-four-year-old man heard about my interest in cheapness and got so excited that he offered himself up for an interview about his frugal ways. At the end of our conversation, he said, sheepishly, "Please don't use my name ... I don't want people to think I'm cheap."

My father has been called cheap for most of his life, by family members, friends, colleagues, and me. Dad is an economist in both senses of the word. He was an economics professor for thirty-three years at the U.S. Coast Guard Academy in New London, Connecticut. And he's also a master economizer, a legend in our extended family. I remember him dashing around the house turning lights off all the time, even if the room's occupant had just left to make a brief phone call. If I was in the shower for longer than a few minutes, I'd hear a knock on the bathroom door, followed by my father's voice saying, "Laur, you're using too much water." He refuses to use the dishwasher; instead, he insists on washing all the plates and cutlery by hand. At some point, we discovered that he was using cold water and no soap; that explained why the knives and forks were often encrusted with the remnants of recent meals.

His latest conceit? He doesn't like to use the brakes on his car because he doesn't want to wear them out. So he coasts when he's approaching a red light, or employs a series of light taps and thrusts, a system he believes minimizes brake wear and tear. Dad also prefers to use hand signals out the window instead of the car's turn signals.

I recently learned that he uses his tea bags not three or four times, like most proud cheapskates, but ten or twelve times. ("I just dip it in for a few seconds, until the water gets a little color," he says.)

I spent my girlhood doing homework at the kitchen table, sock-clad feet nestling on the radiator, hands resting on the oven as it cooled down from dinner. This was the only way to stay warm during New England winters, when my father forbade us from turning the thermostat above fifty degrees. Cold? "Put on another sweater," he'd snap in his native Queens accent. Once he even tried to ration toilet paper, sitting the family down after dinner to tell us how much we could use for each bodily function. Proving too hard to enforce, however, these rules were eventually forgotten.

It's easy to mock these extremes of thrift, to marvel at the amount of time, thought, and emotional energy that some people will expend just to save a few dollars, even a few pennies. We call them eccentrics. We call them irrational. If we're related to them, and even if we're not, we complain bitterly about how cheap they are.

Then we turn into them — at least, I did. And maybe we realize they were onto something.

* * *

This book is a reconsideration of cheapness. It asks why we malign and make fun of people who save money. After all, when we as a nation and as individuals are so dangerously overleveraged, when we've watched our global financial system teeter and then tumble because of greed and ill-considered spending, when all of us could use a little more parsimony in our daily lives, why is it an insult to be called cheap?

The word cheap actually started out with a positive spin. It derives from the Latin word caupo, or tradesman; evolved into the noun ceap (a trade) in Old English; and came to be used in Middle English mostly in the phrase "good chepe," meaning "good bargain" or "good price." The opposite phrase was not "bad chepe" but "dear chepe," which referred to high prices. By the sixteenth century, cheap was employed, without judgment, as a synonym for "inexpensive."

But soon the word began assuming more malignant meanings. When the Earl of Clarendon wrote in 1674 of "the cheap laughter of all illiterate men," he referred to laughter too inexpensive, too easily obtained, and thus worthless. Cheapness came to indicate not just a low price but low quality as well. In 1820s America, a "cheap John" or "cheap Jack" was a man who peddled flimsy pots, ill-fitting suits, and inferior merchandise of all types, asking unreasonably high prices to begin with and gradually letting his customers haggle him down. And in 1880, a story in Harper's Magazine mentioned a character named Isaacson, "a traveling cheap-John who had opened a stock of secondhand garments for ladies and gentlemen in a disused sh**-house on the wharf." It was a contemptible profession, and through that usage, the word cheap entered our vocabulary as a term of derision, an adjective synonymous with miserly or stingy.

Every culture, it seems, sustains a deep discomfort with the figure of the miser. A Native American legend tells of a hunter who refused to bring his kills back to share with his hungry tribe; he was punished by the gods. Jamie and Edmund Tyrone, the two sons in Eugene O'Neill's play Long Day's Journey into Night, blame their father's penny-pinching for their mother's drug addiction and the other woes they've endured. And Mr. Burns, from The Simpsons, lives by himself in his vast mansion, hatching schemes to multiply his fortune. He's a tragicomic figure of bewildered loneliness and pecuniary preoccupations, a modern version of Dickens' Scrooge.

The miser of legend offends the approved values of our religious and social traditions by favoring the base satisfactions of his pile of money over the rewards of spiritual wealth. He chooses his own personal comforts over love and charity. But even as we scorn the miser, we admire the kindred qualities of saving and prudence. Miserliness has always been the dishonorable cousin of thriftiness, the habit of playing it close with money. So when writers, ministers, and wealthy elites in the nineteenth century admonished the poor and middle classes to be thrifty, they nearly always included a complementary warning against miserliness. But where is the line between hoarding and prudence? That's never been clear. Instead we're told, Save your money, but don't hoard it. Be thrifty and prudent, but generous as well. There is a narrow and nebulous band of acceptable behavior: spend "too much" (relative to one's circumstances, or perhaps to one's peers) and be labeled irresponsible; spend "too little" and be labeled parsimonious and stinting.

As cheap as he is, my father is also one of the most generous people I know. All my life, he has given time and money to causes he cares about, from homelessness and hunger to AIDS and political campaigns. He rarely passes a panhandler on the street without giving him the coins in his pocket. He's scrupulously honest; if a restaurant undercharges him, he points out the error and pays the higher check. After my grandmother's savings ran out, he and my mother began bankrolling all of her expenses at the assisted-living facility where she resides, rather than see her move into a Medicaid-funded home. When it came time for my sister and brother and me to attend college, we were never told to limit our sights to a state school or other lower-cost option. Instead, our parents sent us to some of the best and priciest schools in the country. Neither our father nor our mother ever complained about the high tuition. That was not only generosity in the extreme, but also a sign of my parents' priorities. The house may be freezing, and my father may still wear the maroon polyester blazer that he's owned for thirty years, but he and my mother provided three kids with priceless educations, and they continue to set an example of decency, generosity, and openhearted engagement with the world around them.

So is my father cheap or thrifty when he turns off lights, or when he refuses to use the car for what he calls "dippy-sh** trips," like a quick run to the grocery store for a single forgotten item? Is he cheap or thrifty when he notes which station sells gas for a penny or two less than the competition? He always seems to be branded with the former adjective, even by relatives and friends who don't have two nickels to rub together and who could learn a thing or two from my father about low-cost living. Money stirs up fierce and deeply uncomfortable emotions, emotions like resentment, envy, guilt, self- righteousness, anxiety. It is a source of conflict; we war with ourselves and with others about it. And we feel a peculiar pleasure in judging what other people do with their money — how they spend it, how they save it, how and what they consume.

Since the earliest civilizations, philosophers and friends, prophets and lovers, have carped and quibbled about other people's excesses or austerities. Confucius warned that "he who will not economize will have to agonize," but the Greeks countered indirectly with the proverb "A miser is ever in want." The censure we heap on others for their expenditures is so intense that it calls to mind Freud's theories about projection. Teasing or censuring my father for being cheap seems to neutralize some of the confusion or shame his critics feel about their own relationships with money, and perhaps helps them validate their own choices. I'd call that a cheap shot.

Part of the reason I embrace the word cheap is that it embodies some of the contradictions, ambivalence, and confusion we feel about money. Our culture bombards us with a schizophrenic range of messages about how we steward our cash. On the one hand, personal finance experts tell us we're wildly unprepared for retirement, that we have to practice restraint and save for the future. On the other hand, presidents George W. Bush and Barack Obama, along with the Democrat-led Congress, passed "fiscal stimulus" plans in the last two years, giving us all tax rebates and telling us to spend them in order to keep the economy afloat. Editorial writers like David Brooks lament the lost virtue of thrift, but their employers — newspapers and magazines — stay in business by advertising pricey diamond jewelry and peddling images of a glamorous life that's far out of reach to most of us. Is it any wonder we're confused? We hold in our minds and in our culture an unrelenting tension between the twin imperatives of spending (for personal gratification, for economic growth) and saving (for our future security, for moral uplift).

If it's any consolation, this confusion is not new. When the sociologists Robert and Helen Lynd set out to capture the personality of America in the 1920s, they traveled to Muncie, Indiana, and studied the dynamics of this "typical" American town. In their seminal 1929 book, Middletown, they wrote that the local newspaper printed editorials over the course of a single year that offered perfectly contradictory opinions on the importance of both saving and spending. One editorial opined that "the American citizen's first importance to his country is no longer that of citizen but that of consumer. Consumption is a new necessity." But another editorial soon after the first expressed the opposite message: "Better start saving late than never," it warned. "If you haven't opened your weekly savings account with some local bank, trust company, or building and loan, today's the day."

Given the dire economic straits of the last couple of years, many commentators have been calling for a "return to thrift."

This phrase, "a return to thrift," leaves me cold.

For one thing, I'm not convinced that ordinary Americans ever truly valued thrift in the first place. When I started researching the history of frugality, I assumed, as most Americans probably do, that we were once a thrifty nation but that we had become lazy and spoiled over time, a condition abetted by a cabal of corporations and advertising firms. I began this book with a single question in my mind: What happened to thrift in America? But as I got deeper into the subject I came to realize that what we consider old-fashioned thrift — some combination of resourcefulness, prudence, simplicity, and aversion to debt — was more the result of circumstance than virtue. Thrift was determined by necessity in the early days of the republic. Goods were scarce and often prohibitively expensive for the average family, so stockings were darned, clothes were patched, fruits were preserved and stored. People made do with what they had until the stuff fell apart or was used up. Very little went to waste when each cord of firewood or linen nightshirt was the product of one's own hard labor.

But when industrialization and financial innovation brought Americans opportunities to make their lives easier and more comfortable — through new technologies like railroads and refrigerators, and the emergence of installment plans, mail-order shopping, and credit cards — by and large, they took advantage of them. Indeed, the truest story of America is not, as we might like to believe, the story of political freedom — slavery and the Japanese internments of World War II, for instance, put the lie to that — but the story of an ever-rising standard of living. Benjamin Franklin and John Adams envisioned a nation of thrifty, industrious farmers and artisans. But after the Revolutionary War, even many of the patriots who fought for independence from England celebrated Americans' new opportunities to get rich and spend lavishly. By the beginning of the twentieth century, politicians and ordinary consumers alike crowed about how the United States boasted the world's highest living standard, as measured by the consumption of modern conveniences like canned vegetables, washing machines, and automobiles. Thrift was a "virtue" many Americans couldn't wait to relinquish.

The idea, too, that our ancestors shunned debt and lived within their means turns out to be false. In his wonderful 1999 study of consumer credit called Financing the American Dream, the historian Lendol Calder refers to this misconception as "the myth of lost economic virtue." In fact, as Calder writes, "A river of red ink runs through American history." Debt has vexed Americans since the first European ships landed on these shores. Hard currency was scarce in the colonies and the British forbade Americans from printing their own money, so almost all business was transacted on credit. This led to a steady stream of lawsuits against delinquent debtors, and thousands of Americans were thrown in jail — debtors' prison — for taking on obligations they couldn't repay. While many of these were business debts (a shopkeeper restocking his inventory or a farmer buying an expensive piece of equipment), consumer debt piled up too, as early as the late 1700s and the first decades of the 1800s. Upwardly mobile planters and townspeople purchased pianos and parlor furniture on credit, ever confident that their circumstances would eventually rise to match their consumption. Similarly, loan sharks and pawnbrokers operated lively and lucrative businesses in the early twentieth century, offering advances to cash-strapped workers as well as to people who simply wanted to own the markers of the emerging "American dream": a radio, a car, a family vacation. In every era, Americans have indulged the temptation to live beyond their means.

This is not to say that our cultural values haven't changed at all. Our attitudes toward debt have certainly relaxed further over the last hundred years. To many Americans, into the middle of the twentieth century, the only debt a person might respectably hold was a home mortgage. But new forms of credit invented by retailers, manufacturers, and financial institutions — installment plans in the mid-1800s, credit cards after World War II — made borrowing easier and more convenient. By the 1920s, even once-conservative banks were giving out loans for luxury items like jewelry and new suites of furniture. And the refinancing boom of the early 2000s led millions of Americans to use their homes as ATMs. Today, mortgage holders owe, on average, a balance of $129,789 on their homes. Credit card debt averages $3,161 per American man, woman, and child. And our personal savings rate plunged from around 10 percent in the early 1980s to less than 1 percent for much of the past decade. Recent generations have grown up with expectations of abundance and ready access to the fruits of consumer capitalism, not the experience of scarcity that drove our ancestors to conserve and save.

The other reason I dislike the phrase "return to thrift" is that it looks backward with the gauzy imprecision of nostalgia. No one can argue with thrift itself, as a word or a concept. It's solid, unimpeachable, respectable, like a sturdy old rocking chair or a pair of good jeans. But a "return to thrift" sounds stale, sober, and even a bit dour. It speaks of New England Puritans and Ben Franklin's folksy maxims, of religious sermons and laments over lost virtues. Like most of the things for which we feel nostalgic, the old-fashioned thrift some pundits long for exists more in the realm of myth than reality.

I imagine cheapness as a new framework for low-cost living, a twenty-first-century version of thrift. I'm not talking about getting a lot of stuff at low prices, even though I love a bargain as much as anyone else. I'm talking about living cheaply, consuming less, scaling down our needs and wants to modest levels that are economically and ecologically sustainable. Cheapness requires practical knowledge — a set of skills and strategies, such as knowing how to fix a leaky faucet or how to tile your own kitchen floor, driving at the speed limit to conserve gas, eating by the seasons to take advantage of bumper harvests, replenishing your wardrobe at thrift stores, or cooking lentils thirty different ways. But cheapness is also a mind-set, a habit of asking yourself, "Do I need this?" and "Is this a good use of my hard-earned money?"

Cheapness doesn't necessarily require abstinence and austerity — simply a thoughtfulness and care about how we live, and a skepticism toward the messages peddled by the retail-industrial complex. It means seeing oneself as an outsider in a world that values instant gratification and promotes the idea that we can understand and express our identities through the products we consume. It means embracing and even cultivating an adversarial relationship with consumer culture. It means rejecting the belief that spending money is the route to feeling good about ourselves or feeling better than, or the same as, or different from other people, that it can help us fulfill our longings or soothe our hearts.

This book is also a history of thrift, frugality, and cheapness in America. As you'll see, though, it's not a linear tale moving from our thrifty past to our profligate present. Though I tell the story chronologically, the history of thrift is cyclical; this is a virtue we've abandoned and then circled back to over and over again in the last four hundred years. Americans' interest in frugality has waxed and waned, usually in response to economic events like financial panics or political calamities such as the two world wars of the last century. We Americans are an intemperate and mercurial people, but perhaps these qualities also help us adapt to unfavorable circumstances. History shows that in hard times, we hunker down and make do with less. It also shows that as soon as the danger passes, we cheerfully reset our appetites a notch or two higher than before.

I tell the story of cheapness and thrift as a subjective, lived experience and also as an idea, one that has meant different things to different people over the course of our history. At certain moments, it has been proposed as a panacea for sin, luxury, moral corruption, poverty, alcoholism, marital discord, war, and urban vice and depravity. At other times, like the present, it's been blamed for recessions and for choking off the consumption needed to keep the economy chugging along.

I also explore the ways that thrift has been used and, at times, deployed like a weapon to judge, praise, and condemn those who were or weren't conforming to shifting standards of patriotism, Protestant Christianity, bourgeois convention, or psychological self-control. The history of thrift in America is as much about social conflict, class anxiety, and hardscrabble conditions as it is about evolving cultural values.

Thrift has always been a morally charged category, used to define a vision of the good life and to separate out the upright and righteous from the prodigal and wayward. Advocacy in favor of thrift can be roughly divided into two types: traditional, religiously based appeals that classify consumption in terms of vice and virtue, and pragmatic appeals couched in the language of social mobility, budgeting, and financial management. The former type dominated in the Puritan days and remained potent in the nineteenth and early twentieth centuries. The latter type found an audience in the 1800s and through the early 1900s as the Industrial Revolution fueled interest in concepts like efficiency and scientific management. No matter what the wording, though, one fact is undeniable: thrift advocacy has always carried a whiff and often a stench of preachiness.

Wealthy reformers in the nineteenth century, for instance, urged sailors, milliners, and other laboring people to practice thrift and put their cash away in newly opened savings banks. Doing so, promised the well-meaning but out-of-touch upper crust, would not only provide workers a measure of financial security, but would also induce in them piety, sobriety, and high moral standards. Such preaching papered over the reality that working people lived exceedingly precarious lives, vulnerable to the convulsions and insecurities of an unregulated economy. If many of them lived on the knife-edge of poverty, it was as much a result of their low wages and unsteady employment as any "thriftless" habits of alcoholism or gambling.

There have also been those who celebrated simplicity — thrift, frugality, nonmaterialism, cheapness — without trying to impose a way of life onto anyone else. Henry David Thoreau helped build the Transcendentalist movement of the mid-1800s on the premise that plain living cleared time for truth-seeking and spiritual satisfaction (though like just about every other thrift advocate, Thoreau didn't always live up to his own ideals). In the post-World War II era, the philosopher Paul Goodman recommended "decent poverty" as the aim of an equitable society, one that would allow individuals the leisure and freedom to pursue their creative goals. As the essayist Richard Todd has written, "There has seldom been an era in America where someone hasn't argued that the simple life is the good life."

I've grappled with this history of moralizing and sanctimoniousness, in part because with this book I insert myself into the long lineage of writers who have complained that Americans place too much stock in the ability of material pleasures to bring happiness. I've tried to look skeptically both at my antecedents and at the current incarnations of thrift advocacy, whether at the compulsive aspects of my father's and my cheap personalities or at the politics of a band of anticapitalist activists that I profile.

I've also attempted to take some of the moralizing out of the topic of cheapness and thrift. When I analyze the consumer nation that America has become, I try not to suggest that whether one is a cheapskate or a spendthrift reflects on one's moral or spiritual fiber. Instead, I approach these issues from a practical standpoint, arguing that cheapness is a virtue in the sense that it creates financial security at a time when government is exiting the business of providing economic safety nets to individuals. That it's better for the planet. That, in unhinging us from a cycle of working and spending to provide for our basic needs and our more extravagant desires, it can provide us with the freedom to pursue our true passions. The virtues of cheapness are autonomy, independence, a deeper sense of fulfillment, financial security, and a lighter footprint on the world around us.

But I don't mean to hold myself up as the embodiment of frugal virtue. After all, our uses of money are personal, often eccentric, and deeply inconsistent, adhering to a unique calculus we each concoct. I'll walk thirty minutes out of my way to get to my own bank's ATM (and thus avoid fees), but I use a French face cream that costs $60 an ounce. Like most of the people I write about — including the most fervent nineteenth- and twentieth-century moralists — I'm prone to temptation and too easily seduced. Not long ago I was visiting my sister in Chicago and we stopped in at a chic atelier. I spied a beautiful pair of shoes by a designer whose products I'd coveted but had never been able to afford. They were high-heeled oxfords, tooled from soft brown leather and crisscrossed with thick laces. The shoes were marked down from $360 to $99. I tried them on and traversed the store, admiring my feet in every mirror. Should I buy them? I asked my sister. I was midway through writing this book, and nearing the bottom of my advance. I went back and forth, talking myself into and then out of buying the shoes. There was absolutely no reason to fork over 99 bucks for a pair of shoes I certainly didn't need.

Reader, I bought them.

I'm not suggesting that anyone should be a purist or live in ascetic isolation from worldly pleasures. I don't believe one has to disavow all attachment to money and the things it can buy. Even my father believes in the occasional splurge, unable to give up his country drives despite gas prices that at one point rose north of $4 a gallon. But I am talking about moderation, about living below one's means, about spending less money and buying less stuff, about casting a critical eye over the exigencies of our late-capitalist consumer economy.

When I started working on this book, my friend Darcy suggested I call it "Thrift: A Short History of a Dying Virtue." But thrift — or cheapness, or frugality— is not dying at all. In fact, it's alive and well. I heard about a doctor who uses surgical forceps to hang up his tea bags so he can reuse them two or three times; about a computer programmer who ate peanuts for lunch every day, shells and all (he didn't want the shells to go to waste); about millionaires who refuse to turn on the air-conditioning in their enormous homes; about fathers and husbands and grandmothers who wash and dry their tinfoil; about my accountant's brother, a lawyer, who will spend a half hour in New York City traffic looking for a free parking spot rather than give in to a pricey garage.

I wrote sections of this book while staying (cheaply!) in East Hampton, New York. Even in this gilded zip code, I saw signs of cheapness everywhere. It was there in the line of Mercedeses and BMWs following the same weekend yard sale circuit as I, and in the crowds of women who responded to a newspaper ad for a designer shoe sale. Everyone, even the rich, maybe especially the rich, loves a bargain.

The Internet has breathed new life into the long tradition of Americans sharing strategies for saving money. Decades ago, this information passed seamlessly from father to son and mother to daughter, and was traded at general stores and cattle auctions. It filled the pages of women's magazines and, later, mimeographed newsletters. In 1990, a Maine housewife named Amy Dacyczyn (pronounced "Decision") began penning a monthly bulletin called The Tightwad Gazette, which garnered more than 100,000 subscribers during its seven-year run. In it, she offered instructions for making jump ropes from old bread bags and volleyball nets from plastic six-pack rings. She suggested cutting out the serrated edges of wax-paper boxes and using them to make the sawtooth hangers on the backs of picture frames. She gave recipes for making chalk. She calculated that you can cook up your own chocolate syrup for 3 cents per ounce, less than half the cost of Hershey's.

Though no one can take the place of the Frugal Zealot, as Dacyczyn called herself, there are now dozens of websites and blogs devoted to the same kind of thrifty living. Many fall into the "frugal mommy" category; they are written by stay-at-home moms who manage the family budget and swear by chest freezers as the number one secret to conserving cash. Others are penned by reformed debtors, environmentalists, and those who have embraced their "inner cheapskate." One of my favorites, www.Fallenfruit.org, maps out public fruit trees in Los Angeles and encourages readers to gather up the bounty. Another, www.Swaporamarama.org, lists clothing swaps nationwide for people who want a new wardrobe without spending a dime. All these sites offer some combination of ingenious new ideas and tried-and-true methods for living on less.

My own cheapness hit its apex during the writing of this book. I've long been an advocate of washing and reusing plastic bags, and I'll sometimes walk thirty blocks rather than spend $2 on the subway. I also had dial-up Internet service until November 2007, when I finally upgraded to DSL, and, too frequently, I save money by skipping lunch if I'm too busy to make a sandwich before leaving my apartment (I've been a little better about that since a friend looked at me as though I was out of my mind and asked, "Is it because you think you don't deserve to eat?").

But the moment when I realized I had truly veered into the Cheap Dimension came not long ago, when I decided to cut my budget by using up some of the older items in my kitchen pantry. I knew there was a can of baby clams in there, which I'd had for five years (okay, maybe seven). On my way home from the library one day, I bought spaghetti, parsley, and a lemon. These were all simmering on the stovetop that evening, along with garlic, olive oil, and white wine from my pantry, when I opened the can of clams. Inside, the mollusks were greenish blue. I sniffed; they smelled like old pennies. I considered throwing them away, but I just couldn't, not after I'd already cooked up the other ingredients. Forging ahead, I tried to cover up the metallic taste of the clams with extra Parmesan and salt. A few minutes later, I was sitting at my kitchen table eating the pasta with my laptop open in front of me, looking up the symptoms of botulism, just in case. The thought occurred to me, I may have gone too far this time.

Indeed, cheapness can become compulsive, pathological even. An acquaintance of mine briefly dated a man who weaseled out of taking her to dinner by claiming to have an eating disorder. I still haven't totally forgiven my father for keeping the house so cold that I could sometimes see my breath, and neither has my mother. I asked her recently how she had managed to coexist with Dad's stinginess. "I should've divorced him a long time ago," she said bitterly (turned out they'd had an argument about something else that morning, which partly explains the acrimony). According to a possibly apocryphal story, Hetty Green, the early-twentieth-century millionaire who was named by the Guinness Book of Records as the world's "greatest miser," refused to take her son to the hospital when he had gangrene, a decision that eventually cost him his leg.

On a broader scale, too, cheapness can come with a high price. Congress passed a national minimum-wage law in 1938 because employers, in a slack labor market, will happily offer their workers sub-subsistence pay if they can get away with it.

One hundred and forty-six women died in the 1911 Triangle Shirtwaist Factory fire in New York City because of poor ventilation and a lack of fire exits. That tragedy and others like it forced a reckoning with the fact that manufacturers, in pursuit of high profits, often skimp on safety precautions. It inspired the state legislature to pass regulations mandating fire exits, fire drills, and sprinkler systems, but there are innumerable examples of corporations still cutting corners to swell profits at the expense of employees, customers, or the environment. According to labor groups and even former executives, Wal-Mart has long fought workers' attempts to unionize, which has the effect of depressing wages and benefits and thereby keeping prices low. An investigation conducted after an explosion killed fifteen workers at a Texas oil refinery found that the owner, the British company BP, had severely reduced its capital and maintenance spending in the years before the accident. And in 2007, parents removed Thomas & Friends train sets from their kids' shelves because the toys were coated in lead paint. Why? Lead paint costs about half as much as lead-free paint in China, where the toys were made.

In addition, we've damaged our planet through our addiction to cheap commodities such as oil, food, and water. Gasoline at $1.50 a gallon should seem like a cheapskate's dream. But that low price is an illusion; it doesn't factor in long-term costs related to pollution and climate change, or the political violence and instability sparked by competition for energy supplies. And we're unlikely to steward our resources carefully when, like the plastic necklaces thrown from floats at Mardi Gras, they're cheap and plentiful. Only when the various costs of these commodities shoot up for good will we start getting really serious about challenges like global warming and energy independence.

So what's a tightwad to do? I opt for something I call ethical cheapness: reconciling my goal to live cheaply with my desire to consume conscientiously, shopping in a way that supports my values. One has to know which corners can be cut safely, on an individual level and on a broader scale. Laws and regulations that require employers to conserve resources and provide all Americans — ideally, all workers around the world — with healthful working conditions and a living wage might drive up the cost of consumer goods, but they also prove the maxim that "what's most dear is most cheap": environmental and labor regulations are more economical in the long run than a laissez-faire approach that burdens us with the costs of global warming and a deeply unbalanced distribution of resources.

But I can't tell you how to live. In a 2007 article in the New York Times, Carl Pope, the executive director of the Sierra Club, talked about his group's brand of environmental activism, and said, "We'll encourage companies to make more efficient SUVs, and we'll encourage consumers to buy them, but we do not find lecturing people about personal consumption choices to be effective." Hundreds of years of history tell us the same thing. Americans famous and unfamous have been preaching on the evils of overconsumption since the days of the Puritan minister Cotton Mather, and where has it gotten us? Our national debt rings up at more than $11 trillion and we've accumulated another $14 trillion in mortgage and consumer debt.

Americans slash their spending en masse only in times of national crisis: wars, economic depressions, periods of high inflation. Hard experience teaches quick lessons. Unfortunately, that may be one of the few silver linings of our current recession. Perhaps we will find we can live without four hundred cable channels. Perhaps we'll decide that big houses are too costly to heat, and we'll demand denser developments of apartment buildings and small homes. Perhaps we'll turn our rooftops into gardens that insulate our houses and provide low-cost vegetables. Perhaps I'll finally take up quilting so I can put all the fabric scraps from my old clothes to good use.

Perhaps, perhaps. As we figure out the way forward through these hard times and through the next cycles of prosperity and pain, we'll be reminded that there's no simple formula for finding the right mix of frugality and comfort, self-denial and self-indulgence, lentils and high-heeled oxfords. We've each got to craft our own solutions. In the meantime, I hope this book can help generate a new respect for values like prudence, resourcefulness, and economy. A new respect, even, for the cheapskates among us.