Neuroscience of Morality

Principal Researchers:  Prof. J. Savulescu, Prof. W. Sinnott-Armstrong, Prof. N. Levy, Prof. B. Fulford.

Further Researchers:  Dr N. Shea, Dr B. Foddy.

In the last decade, neuroscientists and psychologists have produced a substantial body of empirical evidence that challenges established views of morality and rationality. This evidence may be incompatible with the central methodology of practical ethics, which gives weight to intuitions in ethical reflection (Rawls 1951, 1972; Daniels 1996).

Employing neuroimaging and psychological experiments, Haidt (2001), Hauser (2006) and others have documented pervasive unconscious influences on moral judgement, with little input from conscious reasoning. In one influential study, Greene et al. (2001) used fMRI to study the neural correlates of responses to moral dilemmas, showing that subjects who responded in a non-utilitarian manner exhibited strong activation in brain areas associated with emotion. These claims have been supported by studies of patients with frontal damage (Koenigs et al. 2007; but see Kahane & Shackel 2008). Other studies show that the reasons people adduce to justify moral judgements are often merely post-hoc rationalizations (Haidt 2007). Finally, surveys of the intuitions of lay persons have shown that moral judgements vary across cultures and classes (Haidt 2001). Such research has been claimed to support far-reaching conclusions, such as the denial of the viability of virtue ethics (Harman 1999; Doris 2002) and of common views about killing (Greene 2003), risk, punishment, and reproduction (Sunstein 2005). Utilitarians such as Peter Singer (2005) claim that it shows that opposition to utilitarianism is due to irrational emotions. However, there remain serious philosophical questions about the methodology and interpretation of this empirical research, its bearing on normative claims, and its implications for ethical practice (Levy 2006, 2007).

Further ethical questions are raised by recent research exploring means of manipulating judgement through hypnosis, psychopharmaceuticals, or brain stimulation. Scientists have used hypnosis to trigger moral disapproval of innocent actions (Wheatley & Haidt 2005), oxytocin to increase trust (Kosfeld et al. 2005), and transcranial magnetic stimulation to induce acceptance of unfair offers (Knoch et al. 2006). These possibilities raise urgent questions about the ethical status of the manipulation of will, character and belief.

We will develop a systematic account of the ways in which neuroscientific research bears on the practice and substance of applied ethics. We shall identify methods for reducing moral bias and assess the implications for policy.

1. Do these scientific findings show some normative beliefs and practices to be defective, and if so, can we develop ways of improving moral judgement?


a. Can we identify: (i) criteria for sound interpretation of the ethical significance of empirical findings; (ii) ways in which empirical findings might undermine confidence in moral practices?

b. Does the involvement of emotion undermine moral cognition, or might emotions facilitate accurate cognition? What are the implications of evidence of abnormal moral judgement and decision-making by psychopaths and by patients with frontotemporal dementia or orbitofrontal lobe damage (Mendez et al. 2005; Ciaramelli et al. 2007)?

c. How might we implement findings about normative bias to enhance moral reasoning (Bostrom & Ord 2006)? Might there be reasons to preserve some forms of normative bias/heuristics?

2. To the extent that it becomes possible to enhance or manipulate rationality or moral judgement, what ethical principles and constraints should govern such interventions? 

a. If it is permissible to enhance normative judgment, what principles should guide such enhancement?

b. How should we rethink the practice of ethics, medical decision-making, and questions about consent?

A secondary arm of the research involves initiating novel collaborative empirical research driven by our ethical and conceptual analysis. Our team has already promoted such research: Levy has initiated a study by Jesse Bering (Queen’s University) of the extent to which intuitions about enhancement are driven by intuitive dualism, and Kahane and Wiech have initiated an fMRI study of moral reasoning which is nearly complete. We will seek funding from other bodies to conduct this research.

Principal Researchers:  Prof W. Sinnott-Armstrong, Prof. N. Levy, Dr G. Kahane, Prof. J. Savulescu, Prof. I. Tracey.

Further Researchers:  Prof. J. Radcliffe Richards, Prof. N. Bostrom, Dr N. Shea, Dr S. Clarke, Dr K. Wiech, Dr N. Shackel.


References

Bostrom, N. & Ord, T. (2006), 'The Reversal Test: Eliminating Status Quo Bias in Applied Ethics,' Ethics, 116: 656-679.

Ciaramelli, E., Muccioli, M., Làdavas, E. & di Pellegrino, G. (2007), 'Selective deficit in personal moral judgment following damage to ventromedial prefrontal cortex,' Social Cognitive and Affective Neuroscience, 2: 84-89.

Daniels, N. (1996), Justice and Justification: Reflective Equilibrium in Theory and Practice, Cambridge: Cambridge University Press.

Doris, J. M. (2002), Lack of Character, Cambridge: Cambridge University Press.

Greene, J. (2003), 'From neural ‘is’ to moral ‘ought’: what are the moral implications of neuroscientific moral psychology?' Nature Reviews Neuroscience, 4: 847-850.

Greene, J., Sommerville, R. B., Nystrom, L. E., Darley, J. M. & Cohen, J. D. (2001), 'An fMRI investigation of emotional engagement in moral judgment,' Science, 293: 2105-2108.

Haidt, J. (2001), 'The emotional dog and its rational tail: A social intuitionist approach to moral judgment,' Psychological Review, 108: 814-834.

Haidt, J. (2007), 'The new synthesis in moral psychology,' Science, 316: 998-1002.

Harman, G. (1999), 'Moral Philosophy Meets Social Psychology: Virtue Ethics and the Fundamental Attribution Error,' Proceedings of the Aristotelian Society, 99: 315-331.

Hauser, M. (2006), Moral Minds: How Nature Designed Our Universal Sense of Right and Wrong, New York: Ecco/HarperCollins.

Kahane, G. & Shackel, N. (2008), 'Do abnormal responses show utilitarian bias?' Nature, 452: E5.

Knoch, D., Pascual-Leone, A., Meyer, K., Treyer, V. & Fehr, E. (2006), 'Diminishing reciprocal fairness by disrupting the right prefrontal cortex,' Science, 314: 829-832.

Koenigs, M., Young, L., Adolphs, R., Tranel, D., Hauser, M., Cushman, F. & Damasio, A. (2007), 'Damage to the prefrontal cortex increases utilitarian moral judgments,' Nature, 446: 908-911.

Kosfeld, M., Heinrichs, M., Zak, P. J., Fischbacher, U. & Fehr, E. (2005), 'Oxytocin increases trust in humans,' Nature, 435: 673-676.

Levy, N. (2006), 'Cognitive Scientific Challenges to Morality,' Philosophical Psychology, 19: 567-587.

Levy, N. (2007), Neuroethics: Challenges for the 21st Century, Cambridge: Cambridge University Press.

Mendez, M. F., Anderson, E. & Shapira, J. S. (2005), 'An investigation of moral judgement in frontotemporal dementia,' Cognitive and Behavioral Neurology, 18: 193-197.

Rawls, J. (1951), 'Outline of a decision procedure for ethics,' Philosophical Review, 60: 177-197.

Rawls, J. (1972), A Theory of Justice, Cambridge, MA: Harvard University Press.

Singer, P. (2005), 'Ethics and Intuitions,' The Journal of Ethics, 9: 331-352.

Sunstein, C. R. (2005), 'Moral Heuristics,' Behavioral and Brain Sciences, 28: 531-542.

Wheatley, T. & Haidt, J. (2005), 'Hypnotic disgust makes moral judgments more severe,' Psychological Science, 16: 780-784.