An interview with Sr. Helen Alford, OP, the new President of the Pontifical Academy of Social Sciences
Sister Helen Alford, the new president of the Pontifical Academy of Social Sciences, talks about how the Church can provide a different vision on technology and artificial intelligence.
Interview by Isabella H. de Carvalho (I.MEDIA – Aleteia)
This article is republished with permission from: I.MEDIA – Aleteia
On April 1, 2023, Sister Helen Alford, a 58-year-old Dominican nun, was named president of the Pontifical Academy of Social Sciences, an institution founded in 1994 by John Paul II to enrich the Holy See’s engagement with issues related to the Church’s social doctrine, particularly in the areas of economics, law, political science, and sociology.
With a PhD in engineering from Cambridge University in the UK, Sister Alford is also an expert in economics and business ethics and has taught in the United States. Since 2021, she has been Dean of the Faculty of Social Sciences at the Pontifical University of St. Thomas Aquinas in Rome, commonly known as the Angelicum. She had also previously held this position from 2001 to 2013.
As head of the Pontifical Academy of Social Sciences, Sister Helen Alford will have to encourage reflection on the technological advancements that the world is currently experiencing and show that the Church can provide a vision for the digital revolution underway.
We spoke with her about this new mission.
In a secularized world where Catholic social doctrine may not be taken into consideration, how does the Pontifical Academy of Social Sciences, and more generally the Church, carve out a space for itself?
Many people experience this sense that the Church is not a major player, but being here in Rome I think you get a different perspective. International organizations, such as the United Nations or the European Union, are rather open, and that is not a new thing. When John Paul II published Sollicitudo Rei Socialis in 1987, the UN held a seminar in New York on the encyclical. Shortly after, in 1990, the first Human Development Report was produced, and I think the encyclical had some impact on it. Countries started measuring themselves based on the Human Development Index, rather than only on gross domestic product (GDP).
When Laudato Si' came out in 2015, two big agreements followed shortly after: the Paris Climate Accords and the UN's Sustainable Development Goals. Everyone recognizes that Pope Francis' encyclical was a crucial factor in those agreements.
I would say the Church is important and influential in some ways and areas and in others it isn’t. It is a normal situation for the Church; we are making an appeal and some people are interested and some aren’t.
What do you think are the most important topics that the Pontifical Academy of Social Sciences has to address today?
Our number one source is Pope Francis. We need to be supporting him and the topics he wants to work on. I haven’t had a chance to talk to him yet, but we know from his writing and his ministry what he is generally concerned about.
I would say it is this double crisis, social and environmental, which is the focus of Laudato Si'. Another aspect is bringing our communal dimension back into the center of our thinking and action, which is the idea of Fratelli Tutti. We are not just individuals achieving our own goals; rather, we are fundamentally building our society together.
I like to think of it as we try to advance the frontiers of the common good against the throwaway society. We know that social systems can be modified by our actions and we can work together for more justice in the world. To do this we must also preach the Gospel and make it present in the world. It is a continuation of the Incarnation if you like, a sharing of our faith in a very practical way.
Artificial intelligence is a major trending topic today. AI-generated images of Pope Francis in a fashionable jacket went viral across the world. However, this is just the tip of the iceberg. How does the Church analyze the development of this technology?
Technological development is in our hands. Technology isn't like science, which is something we discover because it concerns the principles of the natural order. Technology is like culture: it is something we create.
Thus there are two main trajectories of technological development. The first is the one we experience today as the dominant one, which some would call technocentric or monotechnic. It puts the machine at the center, with society having to adapt itself to it, which is what we are seeing with artificial intelligence. A certain group of people benefits from this type of development, and the rest of society is constrained to change itself to fit.
However, there is another form of technological development, which we can call human- or life-centered. This focuses on making a particular way of life more productive, rich, and full, and is not about one group dominating another. An example of this comes from the early Industrial Revolution, when two different types of spinning machines were developed: the first in the 1770s by a skilled spinner, and the second in the 1830s by professional engineers commissioned by the owners of the weaving machines, the big capitalists. The two machines were equally productive at the beginning, but there was a huge difference in how they affected the people operating them. In the end it was the second one, which could be operated by anyone, that received investment.
With artificial intelligence we are dealing with a problem that has been created because the first type of technological development has gone ahead and not the human-centered kind, which is perfectly possible on a structural level. The issue is that there hasn’t been investment in this second option. Technological development can be good. We could do much better with artificial intelligence and other technologies we have so that they can support life and humans as best as possible.
An open letter also recently emerged, signed by Elon Musk among others, calling for a moratorium on artificial intelligence research. How can the Church, and the Academy specifically, respond to these concerns?
A lot of dialogues are going on. The Pontifical Academy for Life, for example, created a document in 2020 called the Rome Call for AI Ethics, which was signed by big corporations such as Microsoft. This year they called for other religions to sign it to try and give it an interreligious dimension.
At the Vatican, there are other conversations with specialists, such as in the Dicastery for Culture and Education, which has a department dedicated to digital culture. Historically, there have also been important figures at the Dicastery for Integral Human Development. A transversal group across the Holy See is working on different aspects of artificial intelligence.
One of the key things the Church can do is give people a different vision. If people can see the world differently they could change the parameters to design technology differently. The Church can open up people’s minds.
In a way that is what the Gospel has always been doing: giving people the sense that there is another world. In the end we know there is going to be a final other world, but we want to show that even in our world today things can be better because of the presence of grace, of the teaching of Christ, of the community of faith and more.
Is this work and vision of interest to the people who run technological companies?
I think people who are high up in these technological organizations appreciate what we do. They realize they need another vision. They may not necessarily know how to put it into practice, but they can start thinking about it.
In the end it has to be the engineers who put this into practice, but if they don't have the inspiration to try, they are not going to; that is where we come in. If Mark Zuckerberg and his equivalents asked their engineers to create an interface that generated profit for their company but also enhanced human dignity, it could be done. However, this isn't the objective these workers were given, and many of them are now leaving these companies because they don't like what they do and what they see.
Others live with a tension within them where they continue working in these fields, even though they see the negative effects, for example, that social media has on their children. There are strong incentives in our economic system for many to carry on working in the technological world. To deal with that we need grace, prayer, and divine help and we talk about that in the Pontifical Academy as well.
The Academy focuses a lot of its attention on developing the idea of a fraternal economy, with a focus on human wellbeing. How does artificial intelligence fit within this economic vision?
We could have an artificial intelligence that focuses on making human skill more productive, just like the spinning jenny did at the end of the 1700s. We do have some forms of AI that are already doing that, like expert systems that are helping doctors diagnose illnesses more effectively.
Anywhere you need to amass a lot of information to come up with the best possible result, AI systems are fantastic, as they can do it more cheaply and quickly. Just as steam lifted things where human or horse muscle couldn't, AI can help us make our scale and capacity more effective.
Many are concerned that advanced artificial intelligence tools could replace certain types of jobs; the chatbots that have emerged can produce, for example, very convincing media articles or contracts. What do you think are the implications for employment, work culture, and ethics with the rise of artificial intelligence?
I think whenever a major technology has emerged, it has destroyed some jobs but created others. First it was steam, then electricity, then IT; now AI is probably another influential technology that is going to have very across-the-board effects. People have always worried when a new technology emerged.
It is true that right now studies have given quite pessimistic projections concerning the jobs that would be affected by AI. However, maybe we can reorient this development so that it supports human development, and so that the focus is on making human skill more productive rather than stripping the skill out of jobs.
The crucial point is that we have choices; we are not constrained to develop technology only in a way that benefits a certain group of people. We need the confidence to raise questions. The criterion should be: what is this technological development doing to support life and the goodness of creation as a whole? If the criterion is "let's just make as much money as we can, whatever the cost to anybody else," then, as it has in the past, it will tend to do a lot of damage, and will most certainly do so again. This is the technocratic mindset that Pope Francis talks about.