Talk:Omission bias
This article is rated Stub-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:
Untitled
It's worth pointing out that it's a disputed issue in philosophy as to whether this "bias" is in fact rational. The article should be changed to reflect this. --gregorya--
This has a lot to do with Utilitarian and Kantian ethics, so maybe it should include a link to these articles. 67.37.231.80
What seems biased may not actually be biased. I have read the article, and it seems merely unsupported. Being used to living in the backward countryside would explain a lot of this. Jimmycyang1 13:54, 9 June 2007 (UTC)
NPOV
These seem heavily biased to me:
"What people often do not realize is that not acting is also a choice and thus actions and inactions should be judged equally. If an omission causes as much harm as an action, then both choices are equally bad."
"the parent is clearly making the erroneous decision"
Did the author of these passages even consider NPOV?
- It's not biased; the author was arguing from the point of view of expected utility theory, which is the normative theory that suggests these judgments are biased.
> I agree with the above. The article should merely explain the bias, not pass judgment on it. Many people can reasonably argue with what it currently presumes. Mushroom Pi 03:19, 12 March 2007 (UTC)
Unregistered User: Comment
The included example of vaccination is biased. The risks are misquoted. The decision to vaccinate depends on the *overall* risks to the child. This includes not just the immediate risk of either having or not having the vaccination (even these risks are disputed for some types of vaccination, and published contraindications are frequently omitted). It also requires consideration of the overall probability of contracting the disease a person is being vaccinated against, together with the combined risk arising from either having or not having a given vaccination.
Example: one might be vaccinated against a particularly rare but virulent and deadly tropical disease, where the risk factor for the vaccination may be moderately high - but still "low" when compared to the severity of the disease (where our hypothetical disease is, say, fatal in 90% of known cases). However, saying that a person ought, therefore, "logically" to be vaccinated against such diseases is actually quite illogical, since it fails to factor in the probability of contracting this disease in the first place. In our hypothetical example and many real-world examples, even though the consequences are severe, the overall probability of contracting the disease is so low that the risk of vaccination is not justified.
This is perhaps similar to the Precautionary Principle, which, if not checked, would appear to lead to psychotic behaviour in which individuals or groups of people lose touch with real-world risks and start to see imagined threats everywhere that require immediate action to prevent. This, of course, is a useful technique for government and has been thoroughly documented.
The above may not be clear, so I will add ...
The flaw is focusing only on the actual risk of the vaccination versus the risk of death or serious illness if the disease is contracted by an unvaccinated child. Commonly, risks are expressed as "X percent of children who contract disease Y suffer serious illness as a consequence, whereas the risk of damage to health due to vaccination is only Q"; however, this neglects what percentage of children contract the disease in the first place. If only 1% of children previously contracted the disease, but 95% are then forced to endure the risk of vaccination, the overall risk factor is skewed. I'd say you multiply the risk factors together.
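To make the arithmetic described above concrete, here is a minimal sketch in Python. All the numbers are entirely hypothetical, chosen only to mirror the rare-but-90%-fatal example above, and the calculation ignores the population-level effects raised in the reply below.
# Hypothetical illustration of the "multiply the risk factors" point above.
# Every probability here is made up for the sake of the example.
p_contract = 0.0001     # chance an unvaccinated child ever contracts the rare disease
p_harm_if_ill = 0.90    # chance of death or serious illness once the disease is contracted
p_harm_vaccine = 0.002  # chance of serious harm from the vaccination itself
# Overall risk without vaccination is the product of the two disease factors.
risk_unvaccinated = p_contract * p_harm_if_ill   # 0.00009
risk_vaccinated = p_harm_vaccine                 # 0.002 (residual disease risk ignored)
print(risk_unvaccinated, risk_vaccinated)
# With these made-up numbers the vaccination itself carries the larger risk,
# which is the comment's point; different numbers would flip the conclusion.
Quoting only p_harm_if_ill against p_harm_vaccine, as the article's example is said to do, leaves p_contract out entirely, which is the skew described above.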
Additional Comment - I can't comment from any position of knowledge, but is the discussion of vaccination not itself very complex, and does it therefore not need to be resolved before it can be used for illustration? The above comment says that the possibility of contracting the eventual condition needs to be taken into account. However, vaccination isn't only about individuals; it's a systemic process. Surely, if *everyone* stopped vaccinating for a specific but rare condition that had not been completely wiped out, then the disease might well start to spread again. Surely we don't want to give a disease that is 90% fatal any chance to gain a foothold again? Does the comment itself not display a bias that the needs of a specific individual are more important than the systemic needs of the group? —Preceding unsigned comment added by 81.149.182.243 (talk) 12:51, 22 November 2007 (UTC)
I too agree that this page has an NPOV problem. For one thing, the use of a controversial topic to illustrate the subject matter is unnecessary. Yes, I understand that the anti-vaccination crowd is a little nutty, but the fact that they exist at all is good enough reason to choose a different example. As is, this example just invites hostility from a well-known vocal minority, as if it were "trolling" on a message board.
And aside from that, the page still seems to presume that the distinction between an act and an omission is irrational or illogical. The entire American legal system, for instance, has very deliberately and consciously made a distinction between acts and omissions. The former can often lead to criminal or civil liability, whereas the latter cannot absent some special relationship or obligation that was voluntarily accepted by the person whose omission is at issue. For instance, I can walk by a man stranded in a pit and do nothing to aid him (even at ZERO cost to myself), and the law passes no judgment on me for doing so. If, however, I were to induce him to go into the pit in the first place by promising to haul him back out, then my omission could expose me to some liability if he were injured as a result.
Rejecting the act/omission distinction leads to all sorts of really kooky conclusions, e.g. Peter Singer and his ilk, who would suggest that every time you buy a television set you are starving 100 African children. In fact, one might suggest that the argument that acts and omissions are morally equivalent is as fringe a belief as the Pauline Christian position that acts and thoughts are morally equivalent.
If the act/omission distinction is reasonable, as I and many others believe it to be, then the connotations that come with being labelled a "cognitive bias" are inappropriate. I therefore believe it should not be listed in this category at all. —Preceding unsigned comment added by 69.92.248.171 (talk) 20:49, 23 March 2008 (UTC)
Maybe?
209.197.190.240 13:22, 29 June 2007 (UTC): I was wondering if this is actually part of the bias or just someone's idea:
- "When it comes to making a decision, this bias is similar to the Status quo bias, because they both favor the default, which in the case of the Omission Bias is not acting."
Because sometimes taking an action is the default. An example would be retaliating, or a reflex, which is sometimes the default. Maybe either way is a point of view?
POV tag added
This article describes 'omission bias' as a 'cognitive bias', which from the link in the article means: "a person's tendency to make errors in judgment based on cognitive factors", so the article is implying that the result of the 'omission bias' is an error. We can't do that. 'Omission bias', as described here, pertains to judgments of morality. A WP article simply cannot call any position on morality erroneous, because of WP:NPOV. The article is biased.
Take, for example, the example included in the article. It portrays the majority position - "John’s action of recommending the allergenic [was] more immoral than John’s inaction of not informing the opponent of the allergenic substance" - as an example of the result of 'omission bias', which the introduction established (by describing it as a cognitive bias) as an 'error in judgement'. We can't do this, because to say that this position is incorrect constitutes a qualitative judgment, and a violation of WP:NPOV. —Preceding unsigned comment added by 219.73.21.218 (talk) 05:28, 14 November 2008 (UTC)
- I edited the article a bit, added a reference to the Stanford Encyclopedia of Philosophy on doing vs. allowing harm, and pointed out that while the distinction would be inconsistent under a consequentialist framework, deontologically it is normally argued that there is a distinction. Dismissing one very large side in a very contentious and important philosophical debate as a mere bias is rather shaky, doubly so for an encyclopedia. Still not particularly happy with the article. Really, to establish this as a bias, you'd have to show people displaying it who signed up to a moral doctrine which didn't draw the distinction (and make sure no other morally relevant differences were affecting it). Then you'd have evidence of a cognitive bias.
- Some of you have been talking about this deriving from Expected Utility Theory? EUT relates to dealing with individual preferences under risk (or certainty). It does not say anything about how you can weigh up or compare preferences/utilities between individuals. That would require a substantive moral theory about summing and comparing preferences, at which point you are likely to import consequentialism. Incidentally, there are ways of comparing and summing preferences, using EUT, which would lead to treating harmful action as worse than inaction - just use deontological assumptions. 92.29.126.29 (talk) 13:32, 17 December 2009 (UTC)
This omits the mathematical concept of omission bias, which has economic and scientific utility. In fact, that definition previously existed on this site and has been purposefully deleted. I accuse the authors of omission bias. —Preceding unsigned comment added by 174.116.124.161 (talk) 12:30, 4 September 2010 (UTC)