Tuesday, March 30, 2010

Dobson's Laws (Part 2)

Herewith another collection of my daily SideWise Insights. Enjoy — and be sure to drop them into casual conversation.

Leadership and Motivation

Leadership skills are not fungible. Eisenhower was a great leader; Patton was a great leader; neither could have done the other's job.

While it's useful to tame what can be tamed, most of the management world lives where the wild things are.

The Old Yeller Rule: you have to know when and how to shoot your own dog. Sometimes it's even an act of mercy.

Realism isn't cynicism. A cynic is disappointed that things are what they are. Realists accept the facts and go from there.

It's not enough to learn the lessons an event teaches; you have to *not* learn the lessons it *doesn't* teach.

If someone spends more time and energy scheming to get out of work than it would take to do it, is that person unmotivated?

Just as expenses rise faster than income, responsibility rises faster than authority.

Competition imposes constraints that aren't under your control. This suggests that a portion of your resources be devoted to intelligence.

You have to pay people to get them to work: you personally, not the organization. Respect, gratitude, and support make great paychecks.

"Lessons learned" aren't pleasant, but they're essential for growth. Make watching the game film as pleasant as possible.

There are two very different reasons to delegate: (a) to get stuff off your desk and (b) to train other people. Do some of each every week.

Another proud graduate of the Blanche Dubois School of Leadership: "We rely on the kindness of strangers!"

The job of leaders is making bad decisions, not good ones. When all options are rotten, the decision goes up the ladder.

When I first became a supervisor, I was so naive I actually believed my title meant people would do what I said.

Work is infinite. Resources are finite. Many management problems derive from this essential truth.

Operational definition of quality: It ain't dog food if the dog don't eat it. If no customer or boss wants it, why is it a requirement?

Power and Politics

Your role power is delegated by other people, but your respect power is something you own personally.

Here's a simple test to see if you have office politics in your organization: Do a headcount. If the result is 3 or higher, the answer is yes.

Power is energy that overcomes resistance to achieve work. The corollary: no power, no work.

The power to say "no" is held at lower organizational levels than the power to say "yes." Don't ask someone whose only answer is negative.

If there are multiple stakeholders, the 500 lb. gorilla wins. If there's more than one 500 lb. gorilla, conflict is inevitable.

Your negotiation power is greatest right before you say "yes." As soon as you've said it, your power plummets.

Pick your fights carefully. Some are necessary, some even desirable, but others should be avoided at all costs.

If you tell people they can have what they want, you're a genius. Tell them "no," you're a moron. Spin your "no" so it sounds like "yes."

If the wasps are already swarming, maybe you shouldn't be riling them up even more.

Military retreats don't garner kudos, but managing one takes a lot of planning. If you expect cutbacks, don't wait for the announcement.

Power follows failure like white corpuscles follow disease. Look for the last major failure to find out which department is most powerful.

Keep your friends close, and your customers closer. You need to manage them.

Project Management and Risk Management

Movement up and down the organizational stovepipe is relatively easy; unfortunately, project managers usually need to work sideways.

A project begins life as a gap between where you are and where you want to be. If the project doesn't close the gap, it's a failure.

There are three main project gaps: the official gap, the underlying gap, and one or more hidden agendas. You need to know them all.

All wars are projects, though not all projects are wars. Borrow the best military thinking, but don't confuse problems with actual enemies.

It's not just one damn thing after another, it's usually the same damn thing over and over again. Attack repetitive crises at the roots.

In spite of a quarter-century of project management professionalism, studies show nearly 70% of all projects fail...and the trend is getting worse.

It's well known project management needs to be scaled, but it also needs to be stretched.

Two reasons projects fail: stuff nobody expected and didn't prepare for, and stuff everybody expected...and didn't prepare for.

Success at the project level doesn't mean success at the program level. "The operation was a success, but the patient died."

It's often an advantage to organize your project in stages. Actual results build enthusiasm and commitment.

Identify bad projects early by looking at stakeholder interests and conflicts. Are you being set up as the scapegoat for inevitable failure?

What makes a project challenging? Complexity, constraints, and (un)certainty. Of them, uncertainty is the toughest to manage.

Megaprojects inevitably have megaproblems. Take scale and complexity into account before judging disaster too harshly.

Some objectives are easier to achieve than others, even if they aren't central. It's often smart to pick low-hanging fruit.

Hidden or unstated objectives may be politically sensitive, and can't be spoken out loud without creating problems.

Some key goals are assumed, not stated, but that doesn't mean you're off the hook. Listen carefully to what people don't say.

"Nothing's impossible" implies unlimited resources and time and really flexible performance goals. None of these apply to project managers.

Earning a PMP only means you know the basics. No one ever finishes learning to be a good project manager.

A risk evaluation prices a risk, but price alone doesn't always tell you whether the risk is worth running.

There is no reliable correlation between short-term and long-term outcomes. Bad early can be great later, and vice versa.

If you do something stupid and get lucky, it doesn't validate the quality of your original decision.

Don't drive carpet tacks with a sledgehammer. Most formal systems are way too robust for most normal projects.

The most overlooked question is "Why?" If you don't know, even if you're on time, on budget, and to spec, you haven't done the job.

All wars are projects, but not all projects are wars. Wars have conscious opponents. Don't confuse ordinary risk with actual malice.

You are never the only game in town. What other projects are going on? How's the overall organization's health? Adjust accordingly.

The project isn't necessarily what they tell you it is. It isn't necessarily even what they think it is. Your job is to figure out what it really is.

Projects live in a finite universe, bounded by the triple constraints of time, cost, and performance.

The triple constraints of time, cost, and performance are never equally constraining. What drives your project? What is most flexible?

On any project, make sure you know where "good enough" is. Even Tiger Woods needs to know what par is.

Why People Don’t Do What You Want

Performance problems come in three varieties: "don't know," "can't do," or "won't do." Each has a different solution.

"Don't know" problems are communications failures. Don't expect your team to perform the Vulcan mind meld.

"Can't do" problems may require training, tools, someone else, or you may have to change what you're asking.

"Won't do" problems are about motivation. Everyone's motivated. Some work harder to get out of work than it would take to do it.

There are three reasons for a “Won’t Do”:

If performance is punished (reward for a bad project is an even worse one), expect motivation to drop quickly.

If failure is rewarded (screw up a job, get an easier one), expect failure.

If performance doesn't seem to matter, people put it on the bottom of the "to do" list, as they should.

Whenever someone isn't doing what you want, remember there are only these three reasons: don't know, can't do, and won't do.


Copyright © 2010 Michael Dobson, and made freely available under the Creative Commons attribution license.

Tuesday, March 23, 2010

I Know You Are But What Am I? (Part 10 of Cognitive Biases)

A cognitive bias is a pattern of deviation in judgment that occurs in particular situations, and boy howdy, are there a lot of them! In today’s installment, we’ll focus on illusions: the illusion of asymmetric insight, the illusion of control, illusory superiority, as well as impact bias, information bias, ingroup bias, and irrational escalation.

The illustration is by Baker & Hill from my new book Creative Project Management.

Illusion of asymmetric insight

Think about the people you know. How well do you know them? How much insight do you have into the way they think, their strengths and weaknesses, and the reasons they behave the way they do?

Now think about how well they know and understand you. Do they understand you as well as you understand them, or are their insights about you more likely to be wrong, shallow, or incomplete?

The illusion of asymmetric insight is the common belief that we understand other people better than they understand us. It happens both with individuals and with groups — do you think you understand, say, the culture of the Middle East better than Middle Easterners understand the culture of the United States?

A 2001 report (Pronin et al.) in the Journal of Personality and Social Psychology on the illusion of asymmetric insight cited six different studies confirming this widespread cognitive bias. As with most cognitive biases, your best strategy is self-awareness. Be more modest about how well you know others, and assume you’re more transparent to them than you think.

The Johari Window is a good tool to help you. It’s a model for mapping how well you understand yourself, how well other people understand you, and how to be more self-aware. By taking the test (and asking others to take your test as well), you’ll learn about your four selves: a public arena (known to you and to others), a blind spot (known to others but not to you), a façade (known to you but not to others), and an unknown self (hidden from all).

Related to the illusion of asymmetric insight is the illusion of transparency, the extent to which people overestimate the degree their personal mental state is known by others: “Can’t you tell I’m really upset?” This tends to be most pronounced when people are in a personal relationship.

Illusion of control

When rolling dice in craps (or, presumably, in role-playing games), studies have shown that people tend to throw harder when they want high numbers and throw softer for low ones. That’s the illusion of control, the tendency of people to believe they can control (or at least influence) outcomes even when it’s clear they cannot.

Like a lot of cognitive biases, this particular one has advantages as well as disadvantages. It’s been argued that the illusion of control is an adaptive behavior because it tends to increase motivation and persistence, and in fact the illusion of control bias is found more commonly in people with normal mental health than in those suffering from depression.

But it’s not all good news. In a 2005 study of stock traders, those who were prone to high illusion of control had significantly worse performance in analysis, risk management, and profitability, and earned less as well.

Illusory superiority

“The trouble with the world is that the stupid are cocksure and the intelligent are full of doubt,” wrote Bertrand Russell. The cognitive bias he describes is known as illusory superiority.

In a 1981 survey, students were asked to compare their driving safety and skill to other students in the same experiment. For driving skill, 93% of the students put themselves in the top 50%. For safety, 88% put themselves in the top 50%.

In intelligence, illusory superiority shows up in the Downing effect: the tendency of people with below-average IQs to overestimate their intelligence, and of people with above-average intelligence to underestimate theirs.

Incompetence and stupidity also play into the Dunning-Kruger effect, a series of demonstrations that incompetents tend to overestimate their own skill, fail to recognize genuine skill in others, and fail to recognize their own inadequacies. As in the Downing effect, people of much higher competency levels are perversely much more self-critical.

The danger, alas, is that people tend to judge the competence of others by their degree of self-esteem, leading to situations in which incompetence can actually increase someone’s ability to get a good job.

Impact bias

Imagine that you’ve just learned your lotto ticket is the big winner, and you’ve just become a multi-millionaire. How would you feel, and how long would you feel that way?

Now imagine that instead of winning the lotto, you’ve just lost your job. How would you feel, and how long would you feel that way?

According to studies of impact bias, you’ve probably overestimated how long you’d be elated at the lotto win, and how long it’ll take you to recover emotionally from getting laid off.

People tend to have a basic “happiness set-point.” Although good and bad events can dramatically change your level of happiness, most people tend to return fairly rapidly to their emotional base states.

Information bias

"We need more study before we make a decision." Well, sometimes we do, but the big question is what good the information will do us. In an experiment involving medical students and fictitious diseases, the students looked at a diagnostic problem:

A patient’s presenting symptoms and history suggest a diagnosis of globoma, with about an 80% probability. If it isn’t globoma, it’s either popitis or flapemia. Each disease has its own treatment, which is ineffective against the other two diseases. A test called the ET scan would certainly yield a positive result if the patient had popitis, and a negative result if she had flapemia. If the patient has globoma, a positive and a negative result are equally likely.

If the ET scan was the only test you could do, should you do it? Why or why not?

The majority of students opted for the ET scan, even when they were told it was costly, but the truth is that the result of the scan can't change the decision. Here's why:

Out of 100 patients, 80 will have globoma. Since a globoma patient is equally likely to test positive or negative, 40 of them will show a positive ET scan and 40 a negative one.

The remaining 20 patients have popitis or flapemia between them. So a positive scan points to 40 globoma patients versus at most 20 with popitis; a negative scan points to 40 globoma patients versus at most 20 with flapemia. Whatever the scan shows, globoma remains the most likely diagnosis, and the treatment decision is the same.

More information doesn’t always make a better decision. If the information isn’t relevant, more of it doesn’t help.
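The head-count argument above can also be checked with Bayes' rule. Here's a minimal Python sketch; note that the split between popitis and flapemia isn't specified in the example, so the 10%/10% prior below is an assumption for illustration (any split gives the same conclusion):

```python
# Does the ET scan result change the most likely diagnosis?
# Prior: 80% globoma; the remaining 20% split between popitis and
# flapemia (assumed 10%/10% here for concreteness).
priors = {"globoma": 0.80, "popitis": 0.10, "flapemia": 0.10}

# P(positive ET scan | disease): certain for popitis, impossible
# for flapemia, a coin flip for globoma.
p_positive = {"globoma": 0.5, "popitis": 1.0, "flapemia": 0.0}

def posterior(result):
    """Posterior P(disease | ET scan result) via Bayes' rule."""
    likelihood = {
        d: (p_positive[d] if result == "positive" else 1 - p_positive[d])
        for d in priors
    }
    unnorm = {d: priors[d] * likelihood[d] for d in priors}
    total = sum(unnorm.values())
    return {d: p / total for d, p in unnorm.items()}

for result in ("positive", "negative"):
    post = posterior(result)
    best = max(post, key=post.get)
    print(result, {d: round(p, 2) for d, p in post.items()}, "->", best)
# Whichever way the scan comes out, globoma stays the most probable
# diagnosis, so the test can't change the treatment decision.
```

Running it shows the posterior probability of globoma is 80% after either result, which is exactly why the costly scan adds nothing.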

Ingroup bias

Most of us recognize the tendency to give preferential treatment to people we perceive to be members of our own groups. What’s interesting is the extent to which ingroup bias works even when the groups that link us are random and arbitrary: having the same birthday, having the same last digit in a Social Security number, or being assigned to a group based on the same flip of a coin.

Ingroup bias is one of the root causes of racism and other forms of prejudice, so it’s dangerous indeed. However, as with most cognitive biases, there’s an upside as well. We’re not part of a single group (black/white, American/Chinese, rich/poor) but of many different ones. That means we’re almost always able to define each other as members of at least one of our ingroups. That builds connections.

Irrational escalation

There’s the old joke about the man who accidentally dropped a quarter in the outhouse, and immediately took out a $20 bill and threw it down the hole as well. When asked why, he replied, “If I gotta go in after it, it had better be worth my while.”

An example of irrational escalation is the dollar auction experiment. The setup involves an auctioneer who volunteers to auction off a dollar bill with the following rule: the dollar goes to the highest bidder, who pays the amount he bids. The second-highest bidder also must pay the highest amount that he bid, but gets nothing in return.

Suppose that the game begins with one of the players bidding 1 cent, hoping to make a 99 cent profit. He or she will quickly be outbid by another player bidding 2 cents, as a 98 cent profit is still desirable. Three cents, same thing. And so the bidding goes forward.

As soon as the bidding reaches 99 cents, there's a problem. If the other player bid 98 cents, he or she now has the choice of losing the 98 cents or bidding $1.00, for a profit of zero. Now the other player is faced with a choice of either losing 99 cents or bidding $1.01, and only losing one cent. After this point the two players continue to bid the value up well beyond the dollar, and neither stands to profit.

The dollar auction is often used as a simple illustration of the irrational escalation of commitment. By the end of the game, both players stand to lose money, yet they keep bidding the price up well past the dollar's value; the difference between the winner's loss and the loser's loss is negligible, and both are fueled onward by their past investment.
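The escalation dynamic can be sketched as a toy simulation. The bidding rule here (each player raises one cent whenever raising looks cheaper than conceding, up to an arbitrary budget cap) is an assumed, simplified model of the players' reasoning, not part of the original experiment:

```python
# Toy model of the dollar auction: two myopic bidders raise in
# one-cent steps as long as raising costs less than walking away.
# Amounts are in cents; the prize is a 100-cent dollar bill.
def dollar_auction(prize=100, quit_threshold=500):
    """Run the auction; returns (winning_bid, losing_bid) in cents."""
    bids = [0, 0]          # current high bid for each player
    turn = 0               # whose turn it is to respond
    while True:
        my_bid, rival_bid = bids[turn], bids[1 - turn]
        raise_to = rival_bid + 1
        # Conceding costs my current bid outright; raising costs
        # (raise_to - prize) if I end up winning. The myopic bidder
        # raises whenever that looks cheaper than conceding -- which,
        # past 99 cents, it always does, so only a budget cap stops it.
        if raise_to - prize < my_bid and raise_to < quit_threshold:
            bids[turn] = raise_to
            turn = 1 - turn
        else:
            return bids[1 - turn], bids[turn]

winner, loser = dollar_auction()
print(winner, loser)  # both bids end up far above the 100-cent prize
```

Under this rule the bidding never stops on its own: once both bids pass the prize value, raising one more cent always beats conceding everything already bid, so the only brake is the players' budget.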

Previous Installments

Part 1 — Bias blind spot, confirmation bias, déformation professionnelle, denomination effect, moral credential effect.
Part 2 — Base rate fallacy, congruence bias, experimenter’s bias
Part 3 — Ambiguity aversion effect (Ellsberg paradox), choice-supportive bias, distinction bias, contrast effect
Part 4 — Actor-observer bias, anchoring effect, attentional bias, availability cascade, belief bias
Part 5 — Clustering illusion, conjunction fallacy, cryptomnesia
Part 6 — Disposition effect, egocentric bias, endowment effect, extraordinarity bias
Part 7 — False consensus effect, false memory, Forer effect, framing, fundamental attribution error
Part 8 — Gambler’s fallacy, halo effect
Part 9 — Hawthorne effect, herd instinct, hindsight bias, hyperbolic discounting

Tuesday, March 16, 2010

Dobson's Laws (Part 1)

I've been practicing my skills as an aphorist through daily tweets since last August, and I'm grateful for the many insightful responses I've received here, on Twitter, and on Facebook. Herewith a collection of the first 123 of Dobson's Laws, presented in two parts.

Dobson's Laws are copyright © 2010 by Michael Dobson under the terms of the Creative Commons Attribution license.


Career Management and Personal Growth

Your strength in one situation can be a weakness elsewhere. You must know when you're operating out of your vulnerabilities and biases.

Never say it can't be done in a first meeting, no matter how sure you are. People tend to think you aren't even trying.

Sweat key small stuff. There's often something that makes the customer disproportionately happy, and you can always use more good will.

There is no situation so bad that you cannot make it worse.

Your attitude changes the world around you. If you look at the world through rose-colored glasses, it's amazing how often you get roses.

Research shows pessimists see the world more clearly than optimists, but optimists make more money and live longer. Take your pick.

Things are what they are, but that doesn't make you helpless. The world is filled with both opportunity and danger.

Failure can sometimes be turned into success. If Pisa's tower didn't lean, no one would visit it.

Fight the temptation to take on interesting projects that exceed your performance bandwidth.

If you age your work properly, lots of it will turn out to be unnecessary or irrelevant. This is the great secret of time management.

Most of us are not-so-good Samaritans. But being a good neighbor is in your best interest, too. Someday you might be the one in the ditch.

How often do you describe your workplace as a war zone? Taking flak, being shot down, or out for blood...pay attention to violent metaphors.

Faking can be dishonest, but it can also be a form of practice. I've faked liking people so long that now I actually do.

We correlate age with wisdom, but that's wrong. Age provides experience; wisdom is learning from it.


Communications, Cognitive Bias, Perception, Influence

Realism isn't cynicism. A cynic is disappointed that things are what they are. Realists accept the facts and go from there.

When people rate their own decisions as "95% certain," research shows they're wrong approximately 40% of the time.

You have three kinds of blind spots: ones you don't know you have, ones you embrace or accept, and ones you try to overcome.

The fundamental flaw of almost all management thinking is the assumption that we are all rational.

You want someone to know something, to do something, or to feel something - there are no other reasons to communicate.

In English, gratitude and ingratitude are opposites, but flammable and inflammable are synonyms. Language is a leading cause of fires.

When there aren't standards for empirical proof, our common ground turns into scorched earth.

Telling people it's going to be OK often influences the likelihood it will be. That's not lying; it's premature truth-telling.

The customer is always right only at the end of the project, when they decide if they're happy and want to pay you.

Say back to them what they said to you before arguing. Until they hear their words in your mouth, they don't believe you listened.

A customer only needs two qualifications: a need, and the wherewithal to pay for it. You have to figure out the rest and then supply it.

Every communications medium has some special virtue nothing else can replace. After 6,000 years, we still chisel some messages into stone.

In the South, an honest politician is one who stays bought. But real politicians are always dependable. Their word is their stock in trade.

Seminars on dealing with difficult people are mostly filled with difficult people. Try looking in the mirror.

Jokes reveal pain and offer insight. Read the cartoons people tape to cubicle walls. They're often cries for help.


Creative Thinking, Problem Solving, Decision-Making

What do you know now that you wish you had known earlier? You can't replay the past, but the lesson might be useful in the future.

Failure is essential to all creative endeavors. Learn to fail early, fail often, and fail cheaply.

What's half of thirteen? A mathematician would say 6.5, but a graphic designer might say "thir" or "teen." The answer depends on the goal.

There's always a question that will illuminate key problems if asked early enough. Try on lots of questions to find the right one.

Fortunately, doing the right thing and the smart thing are usually the same thing. Ethics and self-interest often go hand-in-hand.

An often overlooked way to overcome procrastination: delegation. Is there some way you can get someone else to do it for you?

You are not there to do what the customer (or your boss) says. You're there to do what he or she *wants.* They aren't always identical.

Real life seldom conforms to the clean, crisp edges of a model. Models are useful, but don't confuse the map with the territory.

Always identify the "good enough" point, even if you don't settle for it. How can you exceed expectations if you don't know what they are?

There are two ways to learn from experience: Have an experience, learn. Or find someone else who's had the experience and learn from that.

SideWise Thinkers know that reality is nuanced and complex. Beware the person who claims to explain it all in 140 characters.

"Known knowns, known unknowns, unknown unknowns." Rumsfeld missed one: unknown knowns, things to which our perceptual biases blind us.

You don't procrastinate because you're a "procrastinator," you procrastinate for a reason. Knowing why is essential to overcoming the block.

Creativity trainers mostly teach you how to generate ideas. Useful, but inspiration's only 1%. The rest (the hard part) is follow through.

The Godzilla Principle: Baby monsters are easier to kill than the full-grown variety. Some solutions come with expiration dates.

If a job's worth doing, it's worth doing badly. That's why we practice what we care about: we start bad, then work up.

Models aren't true or false; they're useful or not useful. A map of Chicago may well be accurate, but in New York City it's not very useful.

Inertia, friction, and entropy are universal: they affect people and organizations as well as physical objects.

Think "both-and" instead of "either-or." People want seemingly opposite things all the time. Often, they achieve them.


Friends and Enemies

Map the political environment around you by identifying allies, opponents, neutrals, fellow travelers, and enemies.

Two factors determine how people treat you: (a) the quality of the relationship and (b) the degree of common interest.

Allies have common interests and a good relationship, so they tend to win when you win. Use them wisely.

Opponents have conflicting interests, but a good relationship. They're valuable; always treat them with respect and fairness.

Fellow travelers have a common interest but a poor relationship. Trust them only as far as their own self-interest takes them.

Enemies have conflicting interests and a poor relationship. Negotiate interests in the short term; build relationships over time.

Neutrals shade in all four directions. Some are best left on the fence; others need to be lured into the game.


Tuesday, March 9, 2010

Do, Know, Feel: The Goals of Communication


Why do we communicate? The simple reason is because we don’t do the Vulcan Mind Meld very well. We cannot know directly the thoughts in someone else’s head, nor transmit our thoughts to them. As a result, we have to go through a fairly complex process to get our messages across.

  • Encoding is the process of turning our thoughts into something that can be transmitted. Language is an obvious way to encode messages, but we also add information (consciously or subconsciously) by our tone of voice and our body language. Notice this is the first — but definitely not the last — chance you’ll have to get it wrong.
  • Broadcasting is the process of transmission itself. Whether it’s face-to-face or over email, whether it’s synchronous (real-time) communication or asynchronous (a Facebook conversation), there’s always a physical transmission that accompanies the movement of information from Point A to Point B.
  • Noise and distortion normally occur during broadcasting. They can take the form of a distraction that keeps the listener from receiving the message, or of a message that arrives garbled or never arrives at all.
  • Decoding takes place in the mind of the listener. The transmitted message (which may be distorted or corrupted) gets turned back into thought. This is a very complicated process.

Dr. Albert Mehrabian, in his groundbreaking book Silent Messages (1971), concluded that tone of voice and facial expression are how feelings and attitudes get communicated (often cited as the 7% words/38% voice/55% expression rule). When tone of voice or facial expression conflicts with the words (“This soup is really delicious,” said with a sour look on your face), people tend to believe that tonality and facial expression convey the deeper truth.

Added to the problem of decoding are stereotyping and prejudice. If you have a preexisting idea about an individual or a group, you naturally tend to use that idea as a filter. Race, religion, ethnicity, accent, political orientation, sexual orientation, career — any difference that people notice affects how they interpret what you say.

Because communication is an artificial process, it’s not a good idea to assume your message has gotten across just because you sent it. That means you need a measure of success. How do you know if you’ve communicated effectively?

Well, in project management, success gets benchmarked against your goal. Did you accomplish what you set out to accomplish, and did the underlying problem or issue get solved? These aren’t automatically the same thing, as we all know. Sometimes we communicate our hurt feelings and anger with remarkable effectiveness — but the ultimate outcome may not be to our liking.

The way to think about communication, therefore, is to start with the goal and work backward to the technique. With that in mind, here's the key question: Why are you communicating? On inspection, there turn out to be three goals.

  • We want somebody to do something (Take action)
  • We want somebody to know something (Convey information)
  • We want somebody to feel or believe something (Persuade)

Do. Know. Feel. That’s it. That’s why we communicate. These three goals often interact, of course — the process of getting someone to take some action may sometimes involve conveying information along with persuasion.

Start every communication by thinking about the outcome. What is it that you want to happen with the other person – do, know, feel?

Often, simply thinking clearly about what you want is enough all by itself to show you the way. And when it isn’t, then you know it’s clearly worth some effort to map out a plan. There are great tools available — if you know you've got a problem.

Tuesday, March 2, 2010

You're Not Being Reasonable!


I’m embarrassed to admit that I’ve been getting myself into more online arguments about politics and religion lately, and I’m not happy with my own behavior or anyone else’s. All the cognitive biases are on display, and hardly anyone actually speaks to the other side. Unreasonableness is rampant.

The problem is that what’s reasonable tends to be subjective. Obviously, I’m going to be biased toward thinking people who agree with me are more reasonable than those lunkheads who don’t. But that doesn’t mean there aren’t objective standards for being reasonable.

To find out what those standards are, I needed to compensate for my bias, so I identified three groups of people to study.

  • I looked for people I agreed with who I thought were unreasonable
  • I looked for people I disagreed with who I thought were reasonable
  • I looked for people arguing about stuff I didn’t care about, and picked which ones sounded more reasonable

In addition, I looked at some of the classical standards for reason, and also reviewed some of the basic communications concepts we use in business.

I learned some of the following through observation, and most of it through the contrary experience of doing it wrong. You’ve heard some of the advice elsewhere, but a reminder every once in a while comes in handy.

1. You’re not being reasonable if you don’t construct your arguments using reason.

Classical argument operates by rules. There is a proper structure for proof. Some proofs are invalid, riddled with structural errors. Failure to follow the rules of reason is, by definition, unreasonable. A good description of classical rules for arguments and a list of common fallacies can be found here.

2. You’re not being reasonable if you don’t acknowledge your own biases and blind spots.

Before you deal with the mote in your brother’s eye, deal with the plank in your own. The problem with that advice is that it’s hard, by definition, to see a blind spot. Because some biases are universal, you can at least acknowledge that you have those.

Remember, too, that just because a decision is biased doesn’t mean it’s automatically wrong. I have a bias toward trusting people. I don’t think that’s a fundamental error; it is, however, a risk. I don’t want to get rid of the bias; I simply want to be aware of it so I can correct my own observations.

3. You’re not being reasonable if you don’t take the time to find out what the other person really means.

If there’s a misunderstanding about what they mean, remember that they, not you, are the official judge of that issue. It’s fair for them to acknowledge that their initial statement may have been infelicitous, but they’re entitled to revise their thoughts for greater clarity and accuracy. Once they change the phrasing, you have to let the old one go.

In addition, reasonable people take the time to find out why their arguments are rejected, and in the future either use a different argument or at least address the identified deficiency. If you keep repeating the same argument, and fail to adjust it when you find out other people aren’t buying it, eventually you stop being reasonable. (Trying an argument a few times to work out the kinks is completely different. A failed argument can be an opportunity to do a better job next time.)

4. You’re not being reasonable if you don’t take the time to figure out why the other side believes what they believe.

While it’s instructive to define our differences, it’s fundamental to probe the underlying reasons we each believe what we believe. If the difference between us is a fundamental value, it’s not subject to contrary proof by logical argument. Acknowledge the difference, and move on to the next phase, which may be walking away or getting ready to rumble.

If, on the other hand, our fight is over which of two roads is the best way to our shared destination, there’s no need to get hostile about it. One or more of us might in fact be wrong, but we all want the same thing.

5. You’re not being reasonable if you don’t separate emotional outbursts from logical reasoning.

If the matter is serious enough, emotions are going to break through. It’s not practical to regard such lapses as evidence of moral failure. What does constitute failure to be reasonable is failing to curb the outbursts before they get out of hand, and failing to apologize (or failing to accept an apology, even a grudging one) when you’ve stepped over the line of good manners.

Labeling an emotional outburst an emotional outburst helps, but doesn’t undo all the potential consequences, any more than labeling a bomb “BOMB” constitutes an acceptable safety program.

6. You’re not being reasonable if you only expose yourself to one type of information.

Always read at least one news source that strongly contradicts your worldview, and make sure you understand what they actually believe and why they believe it. If you have no idea why they think what they think in the first place, what makes you believe you can come up with a persuasive argument to change their minds?

7. You’re not being reasonable if you don’t acknowledge your mistakes and apologize generously.

Factual errors, misrepresentations of the other side’s opinions, violations of good manners — people who own up quickly and generously are considered a lot more reasonable than those who don’t.

An accusation that you’ve done something wrong isn’t automatically proof that you have done so. Or maybe you did and still believe you’re justified. If there’s doubt, a good judge is someone known to be reasonable who leans toward the other side. If that person thinks you’re out of line, maybe you should listen. If a reasonable person on the other side thinks you’re being reasonable, that’s a fairly encouraging sign.

If the situation’s mixed, you can apologize for your fair share (err on the side of generosity) without having to own all the blame. If the other person tries to shove it down your throat, understand that other reasonable people will be more inclined toward you if you don’t return the aggression.

8. You’re not being reasonable if you don’t separate what you know from what you believe.

You don’t know it if you can’t prove it by empirical, external means. Facts can generally be proved to the satisfaction of someone else. Beliefs aren’t necessarily subject to the need for external proof, but you can’t demand someone accept your belief the same way you can demand someone accept a demonstrable fact.

9. You’re not being reasonable if you don’t stay out of fights that aren’t any of your business.

‘Nuff said.



And yes, I’ve been guilty of violating many of these rules myself. As La Rochefoucauld said, hypocrisy is the homage that vice pays to virtue.

Which rules of reasonableness have I missed?