Reflecting on key dilemmas about ensuring human rights in the age of technology
The age of technology is shaping the current and future challenges to human rights protection. Academia can foster spaces for dialogue and exchange of views, such as the 2023 Euregio Summer School, organised in cooperation with the Global Campus of Human Rights, which focused on the interplay between human dignity, human rights and digital technologies.
Digital transformation is the defining theme of the current era. Digitalisation is not just a particular subject area, whether in terms of its technical components or its conceptual framework: it is shaping reality itself. It is both the consequence and a manifestation of a dominant understanding of nature, machines and human beings that originated in early modernity. The digital age has transformed how people access and share information, form opinions, debate and mobilise, deeply reshaping the public sphere. However, technology has also been used to suppress, limit and violate rights, for instance through surveillance, censorship, online harassment, algorithmic bias and automated decision-making systems. The misuse of digital technologies disproportionately affects marginalised individuals and groups, deepening inequality and discrimination.
These challenges call for further discussion of possible solutions. In this regard, the academic sector fosters spaces for dialogue and exchange of views, such as the 2023 Euregio Summer School, organised in cooperation with the Global Campus of Human Rights. The initiative brought together students and experts from the fields of IT, law and the social sciences to discuss human rights in the digital age through a mixture of expert presentations and interactive group work, addressing the impact of technology on human dignity. Taking an interdisciplinary approach, the gathering included discussions of legislative developments, democratic institutions and processes, as well as the implications of open data for scientific research and of these developments for human freedom overall.
In particular, the lectures and discussions were organised around three disciplines: philosophy, information technology and law. Ivo De Gennaro and Robert Simon presented some thoughts on dignity and human rights from a philosophical perspective; Robert Simon then addressed the age of technology from an IT point of view; Domenico Rosani discussed issues related to digital media, criminal law and human rights; and Marya Akhtar focused on the need to guarantee human rights in automated decision-making, public sector profiling and digital platforms.
A focus on Akhtar’s lecture
During her lecture on ‘human rights and technology’, Marya Akhtar from the Danish Institute for Human Rights analysed the intricate overlap between technology and human rights. She discussed the obstacles presented by automated decision-making in fields such as public administration, finance and law enforcement. The presentation drew attention to issues related to discretionary legal judgments, emphasising the significance of transparent algorithmic procedures and the fine line between non-discrimination and the biases that algorithms can introduce.
Akhtar thoroughly examined the notion of achieving a ‘favourable equilibrium’, drawing inspiration from legal philosopher Alf Ross, who emphasised the skill of reaching mutually agreeable solutions. Her exploration also encompassed the potential for automating discretion, the distinction between discretionary and capricious decisions, and the significance of reasoning in decision-making processes. Discussions further revolved around transparency concerns intrinsic to automated decision-making (ADM) systems, analysing the trade-off between explainability and accuracy and highlighting the challenges of comprehending intricate algorithms. In relation to discrimination within ADM, Akhtar identified three distinct levels of potential harm: repetition, amplification and trade-offs. She discussed in depth the intricate nature of safeguarding against various manifestations of bias, also exploring the influence that proxy information can exert in this regard.
The discussion then turned to the influence of technology giants on human rights. Akhtar examined their complex role as facilitators of free expression and as carriers of harmful content such as hate speech and disinformation. Intrusive practices related to data collection and sharing, and the potential manipulation of user thoughts, were also emphasised. Although tech giants are not directly bound by enforceable obligations, the question of their accountability under international human rights law was raised. In examining the international legal instruments available for safeguarding human rights, she highlighted the importance of striking a balance between voluntary adherence and obligatory responsibilities. She introduced risk-based methodologies as an alternative, drawing parallels to the principle of proportionality in human rights legislation and citing instruments such as the GDPR, the DSA and the AIA.
All in all, Akhtar underscored the importance of a balanced relationship between self-regulation and binding obligations in the technology industry in order to protect human rights. Her presentation provided valuable insights into the evolving landscape of how technology affects human rights, advocating for increased transparency, accountability and a nuanced approach to regulation.
A focus on Rosani’s lecture
Another relevant topic addressed during the Euregio Summer School was self-generated child pornography. This widespread phenomenon is strongly connected to the digital age, in which technology shapes daily life and plays an important role in some of the most personal and intimate aspects of our existence. Younger generations, in particular, tend to consider the Internet and IT tools a natural part of their romantic and sexual lives. Practices like sexting are common among teenagers and are increasingly part of the exploration of sexuality. Domenico Rosani from the University of Utrecht addressed self-generated child pornography from a criminal law perspective, with a special focus on European legislation.
First, Rosani highlighted the lack of a clear definition of self-generated child pornography at the international level. In particular, he focused on the difficulties legislators encounter in choosing a definition that protects minors while avoiding their criminalisation. At the European level, the Council of Europe (CoE) addressed the topic in the 2007 Convention on the Protection of Children against Sexual Exploitation and Sexual Abuse (Lanzarote Convention), while the European Union (EU) dealt with it in Council Framework Decision 2004/68/JHA and Directive 2011/93/EU. More recently, the UN further addressed the topic in General Comment No. 25 of the Committee on the Rights of the Child, and the CoE in the 2019 Opinion and the 2022 Implementation Report of the Lanzarote Committee. The resulting legal framework allows European states not to criminalise consensual self-generated material in specific circumstances.
The example of self-generated child pornography shows how the digital age not only creates new human rights threats but also reproduces old debates in new contexts. Indeed, the online diffusion of self-generated child pornographic material raises new aspects of the concept of evolving capacities and of the balance between the protection of minors and their autonomy. It also raises questions about the age of sexual consent and the concept of private use of content. However, European states have not yet found common ground on these topics and have adopted different solutions, creating fragmentation and uncertainty.
Looking ahead
Addressing the dilemmas of ensuring human rights in the age of technology requires further steps towards legal harmonisation and coordinated practice across all sectors of society. The interplay between human dignity and digitalisation calls for new provisions on the obligations of state and non-state actors in the technology sector to protect people’s rights. In this regard, the rich discussions at the 2023 Euregio Summer School offered insightful exchanges from relevant interdisciplinary backgrounds, aiming for increased transparency, accountability and a sound legal framework on the evolving topic of technology.
Written by Cristina Giacomin, Pamela Peralta and Eirini Tsordia
Cristina Giacomin holds a master’s degree in law from the University of Trento. She is currently a master’s student in Human Rights and Democratisation (EMA) at the Global Campus of Human Rights. She is interested in the relationship between human rights and new technologies.
Pamela Peralta holds a degree in law from the Universidad Católica “Nuestra Señora de la Asunción”. She is currently a master’s student in Human Rights and Democratisation (EMA) at the Global Campus of Human Rights. She works as a consultant at the Centre for Civil and Political Rights.
Eirini Tsordia is a master’s student of Human Rights and Democratisation (EMA) at the Global Campus of Human Rights. She is interested in the intersection of law and technology, women and minority rights. She explores the impact of digital transformation on human rights, focusing on privacy, freedom of expression, and access to information.
Cite as: Giacomin, Cristina; Peralta, Pamela; Tsordia, Eirini. "Reflecting on key dilemmas about ensuring human rights in the age of technology", GC Human Rights Preparedness, 21 September 2023, https://gchumanrights.org/gc-preparedness/preparedness-science-technology/article-detail/reflecting-on-key-dilemmas-about-ensuring-human-rights-in-the-age-of-technology.html
This site is not intended to convey legal advice. Responsibility for opinions expressed in submissions published on this website rests solely with the author(s). Publication does not constitute endorsement by the Global Campus of Human Rights.