
2nd Session of the Group of Governmental Experts (GGE) on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems (LAWS)


Statement of H.E. Ettore Balestrero, Permanent Observer of the Holy See to the United Nations and Other International Organizations in Geneva

to the Second Session of the 2024 Group of Governmental Experts (GGE) on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems (LAWS)

General Exchange of Views 

Geneva, 26 August 2024

 

 

Mr. Chair,

 

At the outset, please allow me to thank you for all the preparatory work that you have conducted in advance of this second session of the Group of Governmental Experts (GGE). In particular, my Delegation wishes to thank you for the “rolling text” that you have provided, which constitutes a valuable foundation upon which to build a shared understanding.

 

Speaking to the G7 leaders gathered in Italy last June, Pope Francis urged them to “reconsider the development and use of devices like the so-called ‘lethal autonomous weapons’ and ultimately ban their use. This starts from an effective and concrete commitment to introduce ever greater and proper human control. No machine should ever choose to take the life of a human being.”[1]

 

For the Holy See, given the pace of technological advancements and research on the weaponization of artificial intelligence, it is of the utmost urgency to deliver concrete results in the form of a solid legally binding instrument and, in the meantime, to establish an immediate moratorium on the development and use of such weapons. In this regard, it is profoundly distressing that, adding to the suffering caused by armed conflicts, battlefields are also becoming testing grounds for ever more sophisticated weapons.

 

Mr. Chair,

 

This Delegation supports your approach of analyzing the potential functions and technological aspects of autonomous weapon systems. Identifying those systems that are wholly or partially incompatible with international humanitarian law (IHL) and other existing international obligations could be of great benefit in adequately characterizing the systems under consideration, so as to establish prohibitions and restrictions accordingly, while taking into account broader ethical considerations.

 

For the Holy See, autonomous weapons systems cannot be considered as morally responsible entities. The human person, endowed with reason, possesses a unique capacity for moral judgement and ethical decision-making that cannot be replicated by any set of algorithms, no matter how complex.[2] Therefore, this Delegation appreciates the references to both “appropriate control” and “human judgement” in your rolling text, although we would welcome more clarity and common understanding of these terms.

 

In this regard, it is useful to recall the difference between a “choice” and a “decision”. While pointing out that machines merely produce technical algorithmic choices, Pope Francis recalled that “human beings, however, not only choose, but in their hearts are capable of deciding. A decision is what we might call a more strategic element of a choice and demands a practical evaluation […] Moreover, an ethical decision is one that takes into account not only an action’s outcomes but also the values at stake and the duties that derive from those values.”[3]

 

Mr. Chair,

 

The Holy See deems it of fundamental importance to retain references to human dignity and ethical considerations at the core of our deliberations. It is necessary “to ensure and safeguard a space for proper human control over the choices made by artificial intelligence programs: human dignity itself depends on it.”[4]

 

In this regard, this Delegation welcomes the prominent role given to ethical considerations at the recent conference “Humanity at the Crossroads: Autonomous Weapons Systems and the Challenge of Regulation”, held in Vienna on 29-30 April 2024. This and other similar conferences are further indications of an ever-growing awareness of the ethical concerns raised by the weaponization of AI. Such public awareness represents a remarkable and growing “conscience publique” that cannot be ignored.

 

In conclusion, the development of ever more sophisticated weapons is certainly not the solution. The undoubted benefits that humanity will be able to draw from the current technological progress will depend on the degree to which such progress is accompanied by an adequate development of responsibility and values that place technological advancements at the service of integral human development and of the common good.[5]

 

Thank you, Mr. Chair.

 



[1] Pope Francis, Address to the G7 Session on Artificial Intelligence, Borgo Egnazia, Italy, 14 June 2024.

[2] Cf. Document CCW/CONF.VI/WP.3, “Translating Ethical Concerns into a Normative and Operational Framework for Lethal Autonomous Weapons Systems”, submitted by the Holy See to the Sixth Review Conference of the CCW, 13-17 December 2021.

[3] Pope Francis, Address to the G7 Session on Artificial Intelligence, Borgo Egnazia, Italy, 14 June 2024.

[4] Ibid.

[5] Cf. Pope Francis, Laudato Si’: Encyclical Letter On Care For Our Common Home, n. 105.