AI, Ethics and Morality Workshop

"AI, Ethics and Morality: Re-evaluating moral and systematic theology in light of Artificial Intelligence" will take place on Thursday 20th June 2024 at Lambeth Palace Library, London, SE1 7JU.

Please register for the workshop, which will be held in the Bancroft Room at Lambeth Palace Library. Charles Larkin, Director of Research at the Institute for Policy Research, University of Bath, is organising and chairing the day.

The event will discuss the challenges and opportunities of AI, first in a technical sense and then broadening out into the ethical, moral and theological conversation. The objective is to act as a colloquium of different views and ideas in this domain, creating a transdisciplinary community of practice that can produce a synthesis of ethics, technology and theology, so as to inform technologists, policymakers and religious leaders on the role, practice and policy of accountable, responsible and transparent artificial intelligence. The event is intended as the first step in developing this community of practice.

The speakers will give brief papers, followed by two panel discussions, one in the morning and one in the afternoon. Outputs from the day will be published and will be eligible for submission to the special issue of the journal Practical Theology.

Confirmed Speakers

Malcolm Brown, Director of Mission and Public Affairs for the Archbishops’ Council of the Church of England

Michael Burdett, Associate Professor of Christian Theology, University of Nottingham

Katrin Hatzinger, Director of the Brussels office of the Protestant Church in Germany

Peter Phillips, Tutor/Director, Centre for Digital Theology, Spurgeon's College

Esther Reed, Professor of Theological Ethics, Department of Classics and Ancient History, University of Exeter

Isaac Sharp, Director of Online and Part-time Programs and Visiting Assistant Professor, Union Theological Seminary, New York

Simeon Xu, Postdoctoral Research Fellow in Theology and Ethics of AI, University of Edinburgh

Register here

Registration is £26. The deadline for registration is 4th June 2024, although registration may close earlier if maximum numbers are reached.

Schedule for the day

08:30 Arrival Tea & Coffee and Registration

09:00 Welcome & Opening Address

09:30 Malcolm Brown, Director of Mission and Public Affairs for the Archbishops’ Council of the Church of England

10:00 Katrin Hatzinger, Director of the Brussels office of the Protestant Church in Germany

10:30 Michael Burdett, Associate Professor of Christian Theology, University of Nottingham

11:00 Esther Reed, Professor of Theological Ethics, Department of Classics and Ancient History, University of Exeter

11:30 Coffee break

12:00 Panel 1 (Malcolm Brown, Katrin Hatzinger, Michael Burdett, Esther Reed)

13:00 Lunch break, lunch provided

14:00 Peter Phillips, Tutor/Director, Centre for Digital Theology, Spurgeon's College

14:30 Isaac Sharp, Director of Online and Part-time Programs and Visiting Assistant Professor, Union Theological Seminary, New York

15:00 Simeon Xu, Postdoctoral Research Fellow in Theology and Ethics of AI, University of Edinburgh

15:30 Coffee break

16:00 Panel 2 (Peter Phillips, Isaac Sharp, Simeon Xu)

16:45-17:00 Closing


Malcolm Brown


A key question surrounding many AI applications is how to negotiate trust. Trust in what one cannot prove or fully understand is also a central concept in religions, and in both AI and religion trust can be manipulated and abused. Considering abuses of trust in religious contexts, can we derive criteria for assessing trustworthiness in AI applications? For example, safeguarding failures in religious contexts have often arisen when ministers elide their own role with that of God and seek trust in themselves rather than promoting trust in the divine. This "message/messenger" distinction could be applied to AI creators and their products to reveal what, or whom, we are being asked to trust, helping to negotiate trust in AI. Baldur Bjarnason has compared Large Language Models with the techniques of bogus spiritualists: can reflection on the differences between mainstream religion and religious charlatanry help calibrate the degree of trust that should be accorded to chatbots and other AI applications?


The Revd Dr Malcolm Brown is Director of Faith and Public Life for the Church of England, responsible for the team which leads the church's national work on public policy and ethics, relationships with Parliament and other external relationships. His academic background is in Christian Ethics, with a particular interest in the ethics of the market economy. He has taught ethics and Practical Theology at a number of universities and is currently Visiting Lecturer with the ART-AI CDT at the University of Bath. His publications include Tensions in Christian Ethics (2010), Anglican Social Theology (2014), a forthcoming work on theologies of taxation and a jointly authored chapter on AI and Just War Theory.

Michael Burdett


Grounding AI Governance Theologically: Prudence, Providence and the Common Good


One of the most pressing contemporary issues in AI ethics is governance. The initial focus on developing AI ethical principles has given way to the question of how to operationalise such principles and ensure, throughout the design, development and deployment of an AI system, that the principles are being followed and that the system exhibits the kinds of values that mitigate harms and risks and maximise benefits. I attempt here to ground AI governance in theological terms. I argue that one of the most substantial interpretations of the image of God valorises human agency in guiding and cultivating all creatures to their proper and good end, which includes AI. This guidance recognises that each creature has a purpose independent of humanity and that any human providential activity must be modelled on and derived from divine providential activity. That guidance has volitional and intellectual requirements that, in turn, must stem from particular virtues that human beings need to exhibit in such governance. Two of the most important theological terms considered here in relation to human governance of AI are prudence and the common good.


Dr Michael Burdett is Associate Professor of Christian Theology at the University of Nottingham. He is a series editor for the Routledge Science and Religion series and has written several relevant books, including Technology and the Rise of Transhumanism (Grove, 2014), Eschatology and the Technological Future (Routledge, 2015) and Finding Ourselves After Darwin (Baker Academic, 2018). He has helped lead several grant projects totalling over £2.5 million, including "Co-creating Ourselves?: Deification and Creaturehood in an Age of Biotechnological Enhancement" (JTF), "Bridging the Two Cultures of Science and Humanities" (TRT and Blankemeyer) and "Christian Flourishing in a Technological World" (Issachar).

Katrin Hatzinger


Ethical and human-centric? A Protestant view on the AI Act of the European Union


The EKD office in Brussels has closely followed the political debates on regulating artificial intelligence (AI) in the European Union from an ethical perspective. Whilst recognising the importance of technical innovation and its potential benefits, the EKD office has supported from the beginning the risk-based approach, which focuses on the potentially harmful effects of AI applications for individuals, societies and the entire creation. The aim of the AI Act, as adopted this year, is to promote the introduction of human-centred and trustworthy AI, while ensuring a high level of protection of health, safety and fundamental rights. The EKD believes that a dialogue platform with equal representation, including theologians, should be set up to accompany the implementation of the legislation, to monitor developments and to foster dialogue and exchange within society.


Ms Katrin Hatzinger has been Director of the Brussels Representation of the Protestant Church in Germany (EKD) to the EU since May 2008. She studied law in Bielefeld, specialising in European and international law. After her second state exam, she joined the EKD office in Brussels as a legal advisor in 2003. She is, inter alia, a member of the Ethics Advisory Board of the Community of Protestant Churches in Europe (CPCE), a member of the Commission for European Affairs of the Council of the EKD and a permanent guest on the Advisory Board of the Commissioner for Refugee Issues of the Council of the EKD. She is the publisher and editor of the biannual magazine EKD Europa-Informationen.

Esther Reed


On the Ethics of Weapons Control: Resisting Babel Aspirations in AI-enabled Technologies


Orthodox scholar John S. Romanides writes: ‘Without a correct understanding of the fall of mankind (sic), an Orthodox interpretation of the dogma of redemption is impossible’ (Romanides, 1998, 17). In a spirit of ecumenical sharing, this paper launches from Romanides’ challenge to understand the power of sin and evil potentially facilitated in/by AI-enabled technologies. Mindful of tensions between Orthodoxy and Augustinian views of original/ancestral sin, this paper posits that an Augustinian-inspired approach to the idolatry of Babel aspirations illumines contemporary ethical challenges associated with AI-enabled technologies – notably regarding how responsibility for the deployment of AI-enabled weapons systems exceeds the individual and potentially evades regulable accountability. ‘How did they expect to raise this lofty mass against God…?’ asks Augustine (City of God XVI, 4). ‘Let us confound their speech’ (City of God XVI, 5). As questions of responsibility and accountability for the use of AI-enabled technologies become increasingly uncertain, this essay interprets present-day confusion and hindrances to communication as, in effect, the Babel-like consequences of human impiety.


Esther Reed is Professor of Theological Ethics at the University of Exeter, UK. She is currently working at the interface between military ethics and moral injury, and on the ethics of weapons control. Recent publications include 'Accountability for the Taking of Human Life with LAWS in War', Ethics & International Affairs 2023;37(3), and 'On Limited Force: Prudence Below the Threshold of War', Studies in Christian Ethics, forthcoming.

Peter Phillips


AI Worldviews :: Internet Culture :: Religious Practice


This talk starts by reflecting on the impact of social media culture on the kinds of Bible texts shared online, as explored in the speaker's 2019 research. Should we read the potential impact of AI worldviews into this model as well? The talk will look briefly at issues with using limited prompts, and at the problem of AI hallucinations and gaps, in order to investigate these worldview issues. Do AI worldviews, developed through AI training processes, have potentially deterministic impacts on future religious knowledge and practice?


Peter Phillips is currently Tutor and Director of the Centre for Digital Theology at Spurgeon's College in London, an Honorary Research Fellow in the Department of Theology and Religion at Durham University, and Minister in the Thames Valley Methodist Circuit. He teaches courses and supervises research students in Digital Theology, New Testament Studies and Greek. His current research focuses on the impact of digital culture on theology and on contemporary religious practice, especially around the digital transformation of religious practice and thinking. His recent publications have explored the concept of 'Digital Being', Elaine Graham's exploration of the 'imago dei' and the metaphysics of information.

Isaac Sharp


Beyond Rules for Robots: Mapping and Triaging AI Ethics


Artificial Intelligence has become something of a catchall term, often used to describe a dizzying array of technological developments, computational processes, philosophical problems, historical research agendas, and contemporary fields of study. The burgeoning sub-field of AI ethics similarly encompasses a host of complex concerns ranging from rules for robots to the singularity. Although the twin temptations of techno-utopianism and techno-pessimism—with their attendant visions of either transcendence or catastrophe—are great for fundraising, AI hype obscures a variety of the more mundane, yet no less concerning, social-ethical implications associated with the various technologies proliferating under the umbrella of artificial intelligence. In this paper, I survey some of the historical and contemporary concerns of AI ethics to develop a provisional map of the field. I conclude with some social-ethical “triage,” highlighting a few of the most pressing areas of concern for ethicists and moral theologians.


Isaac Sharp is the author of The Other Evangelicals: A Story of Liberal, Black, Progressive, Feminist, and Gay Christians—and the Movement that Pushed Them Out. He currently serves as Faculty Director of Online and Part-time Programs and Visiting Assistant Professor at Union Theological Seminary in the City of New York. He has coedited Evangelical Ethics: A Reader in the Library of Theological Ethics series (Westminster John Knox, 2015) as well as Christian Ethics in Conversation (Wipf & Stock, 2020).

Simeon Xu


A Christian Hope for the Future of AI: Christian Morality and Ethical AI




Ximian (Simeon) Xu (PhD, University of Edinburgh) is Duncan Forrester Fellow at the Institute for Advanced Studies in the Humanities and the Centre for Theology and Public Issues, and Postdoctoral Research Fellow in Theology and AI Ethics at the Centre for Technomoral Futures and the School of Divinity, University of Edinburgh, UK. He is the author of Theology as the Science of God: Herman Bavinck's Wetenschappelijke Theologie for the Modern World (Vandenhoeck & Ruprecht, 2022). His new monograph, The Digitalised Image of God: Artificial Intelligence, Liturgy, and Ethics, is forthcoming in the Routledge Science and Religion Series.

Directions to Lambeth Palace Library, Lambeth Palace Road, London, SE1 7JU

Lambeth Palace Library is located on Lambeth Palace Road, opposite the Evelina Children's Hospital / St Thomas' Hospital and neighbouring the north entrance to Archbishop's Park. Entrance is via the automatic glass doors at street level on Lambeth Palace Road.

By Bus: Lambeth Palace Road: C10, 77. Lambeth Road: 3, 344. The bus stops are identified as Lambeth Palace and are a five-minute walk from the Library.

Underground: Lambeth North (Bakerloo Line): walk towards St Thomas' Hospital and along Lambeth Palace Road. Vauxhall (Victoria Line): walk north along the Albert Embankment towards Westminster, then along Lambeth Palace Road. Westminster (Circle, District and Jubilee Lines): cross the bridge and walk around St Thomas' Hospital along Lambeth Palace Road.

By Car: Lambeth Palace Library is in the central London Congestion Zone and ULEZ. Limited parking is available for blue-badge holders at the Library. If you require parking please contact us to check availability.


Event Info

Date 20.06.2024
Start Time 9:00am
End Time 5:00pm
