Last December I interviewed Steve Fuller at Warwick University in England. He spoke broadly and very eloquently; I was engaged – not surprisingly, as he holds the Auguste Comte Chair in Social Epistemology in the Department of Sociology. I asked him to provide an article for H+ Magazine, and he obliged – so here it is, ‘The Moral Equivalent of War’ (and the Right to Science). Note: I will finish editing and uploading an interview series with him entitled ‘Humanity 2.0’ at a later stage (watch H+ Magazine for an article on that), but in the meantime I have embedded a couple of snippets of the interview along with his article on the Right to Science – I do hope you enjoy 🙂 — Adam Ford
The Right to Science
In 1906 the great American pragmatist philosopher William James delivered a public lecture entitled, ‘The Moral Equivalent of War’. James imagined a point in the foreseeable future when states would rationally decide against military options to resolve their differences. While he welcomed this prospect, he also believed that the abolition of warfare would remove an important pretext for people to think beyond their own individual survival and toward some greater end, perhaps one that others might end up enjoying more fully. What then might replace war’s altruistic side?
It is telling that the most famous political speech to adopt James’ title was US President Jimmy Carter’s 1977 call for national energy independence in response to the Arab oil embargo. Carter characterised the battle ahead as really about America’s own ignorance and complacency rather than some Middle Eastern foe. While Carter’s critics pounced on his trademark moralism, they should have looked instead to his training as a nuclear engineer. Historically speaking, nothing can beat a science-led agenda to inspire a long-term, focused shift in a population’s default behaviours. Louis Pasteur perhaps first exploited this point by declaring war on the germs that he had shown lay behind not only human and animal disease but also France’s failing wine and silk industries. Moreover, Richard Nixon’s ‘war on cancer’, first declared in 1971, continues to be prosecuted on the terrain of genomic medicine, even though arguably a much greater impact on the human condition could have been achieved by equipping the ongoing ‘war on poverty’ with comparable resources and resoluteness.
Science’s ability to step in as war’s moral equivalent has less to do with whatever personal authority scientists command than with the universal scope of scientific knowledge claims. Even if today’s science is bound to be superseded, its import potentially bears on everyone’s life. Once that point is understood, it is easy to see how each person could be personally invested in advancing the cause of scientific research. In the heyday of the welfare state, this understanding was widespread. Thus, in The Gift Relationship, perhaps the most influential work in British social policy of the past fifty years, Richard Titmuss argued, by analogy with voluntary blood donation, that citizens have a duty to participate as research subjects – but not because of the unlikely prospect that they might directly benefit from the particular experiments in which they take part. Rather, citizens should participate because they will have already benefitted from experiments involving their fellow citizens and will continue to benefit similarly in the future.
However, this neat fit between science and altruism has been undermined over the past quarter-century on two main fronts. One stems from the legacy of Nazi Germany, where the duty to participate in research was turned into a vehicle to punish undesirables by studying their behaviour under various ‘extreme conditions’. Indicative of the horrific nature of this research is that even today few are willing to discuss any scientifically interesting results that might have come from it. Indeed, the pendulum has swung the other way. Elaborate research ethics codes enforced by professional scientific bodies and university ‘institutional review boards’ protect both scientist and subject in ways that arguably discourage either from having much to do with the other. Even defenders of today’s ethical guidelines generally concede that had such codes been in place over the past two centuries, science would have progressed at a much slower pace.
The other and more current challenge to the idea that citizens have a duty to participate in research comes from the increasing privatisation of science. If a state today were to require citizen participation in drug trials, as it might jury duty or military service, the most likely beneficiary would be a transnational pharmaceutical firm capable of quickly exploiting the findings for profitable products. What may be needed, then, is not a duty but a right to participate in science. This proposal, advanced by Sarah Chan at the University of Manchester’s Institute for Bioethics, looks like a slight shift in legal language. But it marks the difference between science appearing to the ordinary citizen as an obligation and as an opportunity. In the latter case, one does not simply wait for scientists to invite willing subjects. Rather, potential subjects are invited to organize themselves and lobby the research community with their specific concerns. The cost of this proposal is that scientists may no longer exert final control over their research agenda, but the benefit is that they can be assured of steady public support for their work.
Steve Fuller holds the Auguste Comte Chair in Social Epistemology at the Department of Sociology, University of Warwick, UK. He has recently published three books on the future of humanity, all with Palgrave Macmillan: Humanity 2.0: What It Means to Be Human Past, Present and Future (2011); Preparing for Life in Humanity 2.0 (2012); The Proactionary Imperative: A Foundation for Transhumanism (with Veronika Lipinska, 2013). The last book develops further the argument in this paper. Website: http://bit.ly/q3GBmi
For more videos of lectures and interviews with thought leaders, please subscribe to Adam Ford’s YouTube Channel.