Watch Out For Those Biases

The scientific method is the most rigorous route to knowledge, but academic discourses on terrorism reveal some pitfalls.

A few years ago, I gave a presentation on the psychology of terrorism at a seminar. I had decided not to name individuals or organizations as terrorists in order to stay neutral. For anyone familiar with academia, this isn’t surprising. It was my naive attempt at objectivity through value-neutrality. Historically speaking, objectivity in examining ideas is an ideal most valued in science. In fact, this claim of objectivity is part of the reason why science is influential in society. The assumptions, methods, results, and dissemination of science, among other aspects, are expected to be impartial.

However, scientists are prone to many human biases. For example, they can be ethnocentric, using assumptions of (and about) their own cultures while formulating theories that are meant to apply universally. Western theories generally emphasize idiosyncratic values, a rather narrow lens through which to look at the world, yet they are believed to be the predominant way of looking at it.

For instance, many of the academic discourses on terrorism and counter-terrorism focus on ‘Islamic terrorism’ alone. However, it can be argued that there is very little consensus about what exactly terrorism is. There are as many definitions as there are experts, which is indicative of how value-laden the term is. Even so, who gets labelled a ‘terrorist’ or a ‘terrorist organization’, even in the scientific literature, is mostly based on a narrow view of the term, making generalisations difficult. Similarly, when the motives of terrorist behaviour are discussed, such narrow perspectives shine through. The motive, in fact, depends on whose perspective we take: one man’s terrorist is, indeed, another’s freedom fighter. For instance, nationalist-separatist terrorists (those who want to secure varying degrees of self-determination) are typically regarded as risking their lives for social welfare, while revolutionary terrorists (those who want to replace a given political system with another) attack their society of origin. This kind of outlook could be a problem for science.

Besides, scientists aren’t inoculated against confirmation bias; they are just as prone to seek out, interpret, promote, recall, and, of course, believe information that confirms their pre-existing beliefs, including their prejudices. Or as Simon and Garfunkel put it, “A man hears what he wants to hear, and disregards the rest.” The scientific literature on terrorism, for instance, is muddied with questionable arguments. There are clear undertones of dehumanizing non-Westerners as ‘barbaric savages’ who should be offered political tutelage. Further, those who argue for Western notions of modernity miss the point that they are talking of liberal, educated, and relatively well-off individuals, and condescendingly imagine the West as a unified totality.

For example, Monroe and Kreidie (1997), in trying to explain how fundamentalists see the world, propose that the “absolute dedication to an Islamic way of life” is the answer to their issues of dual identity, formed by the contradiction between traditional values and modernizing Western culture. While it can be argued that they discuss only Islamic fundamentalism, the idea that ‘Western is modern’ is blatant in this work, especially in their interviews with non-fundamentalist Muslims. Moreover, the depiction of an Islamic fundamentalist is often one of a downtrodden, withdrawn individual who is idealistic and has obstinately doctrinaire beliefs, an inferiority complex, and an ascetic lifestyle filled with struggle and sacrifice. Despite the stereotype, however, fundamentalists are often university graduates, frequently in the physical sciences, who are driven by misguided moral imperatives (not by self-interest) and hold a worldview different from Western ones.

The ethnocentrism of scientists can then be further aggravated by the false consensus effect, i.e., believing that the way one sees the world is the way most people see it. A blatant example comes in the form of culture-blaming by drawing a clean divide between individualism and collectivism. In such an argument, collectivism is treated as a prerequisite for terrorism. According to some, individualists are more likely to attack members of other individualistic cultures, while collectivists are more likely to attack “foreigners.” Further, those from individualistic cultures apparently feel morally restrained from attacking innocents, whereas collectivists have two moral principles (one for their in-group, another for their out-group), making them uninhibited about attacking innocents as long as those innocents belong to the out-group. Thus, the “cultural value of collectivism” has been proposed as a predictive factor for terrorism.

The problem with such arguments stems from flawed methodology. For example, individualism and collectivism in one of the studies were measured using data from IBM employees. But are IBM employees representative of their cultures? There is also a dearth of data behind some of the claims: the argument in this study about different moral principles, for instance, is not backed by data. Such assertions also trivialize the fact that subcultures exist within these cultural dimensions, making group-based arguments unsound. It can be debated, for example, whether nations have a unified culture at all. Besides, the division of cultures along the individualism/collectivism dimension has drawn criticism, mainly for painting a simplistic picture of cross-cultural differences, and its ability to explain extreme behaviours has been questioned.

Biases that color scientific enquiry are particularly harmful when their implications reach beyond the scientific community, as is true of the literature on terrorism: not only is it of interest to the public at large, it also makes for eye-catching headlines. When such biases influence the way data is interpreted or disseminated, objectivity is lost. The same is true when scientists cherry-pick which results to focus on and report, or when they are more interested in the wow-factor of a particular study.

None of this implies a direct threat to the institution of science, however. The scientific method is still the most rigorous route to knowledge; it is a self-correcting mechanism that constantly revises its claims. In fact, there are groups working to make science more reliable and transparent. There is real hope for how practices within the field will evolve. So there is no need to throw out the champagne with the cork.

About the author

Arathy Puthillam

Arathy Puthillam is a Research Assistant at the Department of Psychology, Monk Prayogshala, a not-for-profit research organisation based in Mumbai, India. Her research interests include Evolutionary Psychology, Social Cognition, and Psychological Methods.