Effective altruists and rationalists know about the Straw Vulcan stereotype of rationality: that rationality is cold, devoid of wisdom, and somehow misses the big picture. On this view, rationality can lead one to make transparently bad decisions, as Spock did in Julia Galef's example, when he expected irrational beings to act rationally.
This isn't the definition of rationality that EAs use, though. Rather, rationality refers to making a decision "that is not just reasoned, but that is also optimal for achieving a goal or solving a problem." The rational decision is by definition the right decision, whatever it is. If a car is speeding toward you, the rational decision is to jump out of the way, not to stand there calculating how much time you have or enumerating your options. This second conception of rationality seems pretty foolproof to me; nobody should want to identify with the first.
It is not rare for EAs, when accused of missing something or making some mistake, to reach for the Straw Vulcan defence: "You're assuming that we hold the Straw Vulcan position here - but actually most EAs don't believe we should do that."
I think that defence is sometimes a cop-out. The fact is that EAs walk a fine line between rationality and Vulcan rationality. When trying to optimize, it is very easy to make a mistake or overlook factors and wind up with worse results than if you had been less intent on optimizing. That isn't to say that EA is based on Straw Vulcan rationality - it isn't - but the risk of occasionally stepping into Straw Vulcan territory is very real and should be taken seriously. In real life, we should expect this to happen somewhat regularly among those hoping to maximize the altruistic impact of their lives - even if they know about the Straw Vulcan stereotype. Knowing the name of a problem is not the same as knowing how to solve it.