Social Acceptance: the Role of Subjectivity and Stakeholder Participation

Published in Social Acceptance by


This article is part of a series on the social acceptance topic. Stay tuned for more!

When we talk about the social acceptance of technologies, we should first consider a series of psychological, anthropological, and sociological factors that concern the concept, origin, and earliest experience of social acceptance. This acceptance is initially individual, then collective, and finally, it can refer to objects or tools, as in the case of technologies. A more precise analysis of the philosophical foundations of technologies’ acceptability (their inherent “goodness of fit” within a given context) versus their acceptance (the state of their adoption after being introduced in a given context) would lead us to fundamental questions about the constitution of personal identity, the structuring of social norms and conventions, and the laws (explicit or implicit) that govern human relations and relations with the world of objects. Here, instead, we intend to focus on the relationship between the development of new technologies and the debate on the social acceptability of these technologies.

New technologies are changing the way we live and relate to the world. Objectives that in the past seemed impossible or unimaginable have become real, even commonplace; today, most skills and creative activities involve (increasingly intelligent) technologies. However, these same technologies pose, as we have noted in previous reflections, new risks, problems, and doubts of a social and ethical nature. Among these problems is, for example, that of access to the resources that technology enables, and therefore the gap and inequality between those who can benefit from certain tools and those without them. If we think of a technology like 5G, it is easy to understand that its benefits can be decisive only for those with access to it.

Once again, it is easy to understand why social acceptance is a fundamental consideration when weighing the risks and opportunities connected to the adoption of new technologies, and why it must be present in any ethical debate.

Establishing good practices for social acceptability is more difficult – and less frequent – than establishing, through feedback and results, the bad practices that have led to the limited acceptance of a technology. The difficulty in establishing precise criteria for social acceptability is also due to the fact that these change in relation to time, space, and specific social, political, and cultural contexts.

The subjectivity that determines whether a technology or group of technologies is accepted in a given context, however, cannot be reduced to an individual dimension. In fact, it is immediately transformed into a collective subjectivity, an entity that possesses its own nature and is often influenced by massifying and depersonalised dynamics. When we speak of collectivity, therefore, we do not mean only the sum of the subjects that constitute it, but a separate force. For this reason, it is possible and frequent that what in one society is considered acceptable, in another – due to a different cultural background, values, or religious beliefs – is considered unacceptable. Social acceptability must be considered with reference to both these levels: the subjectivity as belonging to the individual subject, and the subjectivity of a social collectivity that is a kind of super-structure in relation to the subjects that constitute it.

Some of the best-known and most likely causes of non-acceptance are the following:

fear for one’s privacy;
fear of losing control of one’s self-image;
misunderstanding of a technological product or service’s purpose.
In addition to the above, issues of an ethical nature – such as concerns about the discriminatory impact of some technologies or their lack of availability to all – may preclude technologies from acceptance. Sometimes such ethical issues are set aside by final decision-makers during a technology’s development.

However, borrowing a concept from juridical philosophy, some scholars of social acceptance speak of “procedural justice” as a means to reduce any perceived unethical qualities of technologies. Two necessary conditions for a process to be perceived as just are transparent communication and the active involvement of as many categories of stakeholders as possible. The following example from the contemporary technological arena should clarify the point. A company designing a new ATM interface should consult diverse clusters of stakeholders beyond primary beneficiaries – such as the elderly, the blind, and disabled persons – in order to validate the purpose of and the experience connected with the tool. Shared decisions in social contexts tend to favour more sustainable dynamics in the long term.

Involvement is not the only factor (especially when the balance between different societies and nation states is at stake); the potential interests of the protagonists in decision-making processes cannot be overlooked. Recent tensions between the US and China have shown how geopolitical considerations can matter even more than technological ones.

Underlying these practices and definitions are, as already mentioned, philosophical and sociological assumptions that give rise to ethical considerations. This implies a need, now increasingly recognised, to develop interdisciplinary methods and research – what we do at CyberEthics Lab. – for the ethical evaluation of technological development, its impacts, and its risks. Furthermore, decision-making procedures need to be better defined for many existing technologies, so that a definition and practice of social acceptance can be established.


Service involved

Social acceptance of technologies assessment
Connected, disruptive technologies permeate all aspects of our daily lives and pose challenges to the very foundation of human rights, such as the right to privacy or the freedom of speech. One could say that human values such as trust, accountability, and dignity are mutually influenced by the social acceptance of technologies. We support our clients in conceiving a novel way of aligning the thus-far divergent concepts of sustainability, ethical impact, and technological innovation. By combining these three concepts, we respond to the need for a socially responsible innovation ecosystem with a tailored methodology for assessing users’/citizens’ social acceptance of technologies, a fundamental driver of technology market adoption. Our social acceptance framework measures and assesses social acceptability across six fundamental dimensions (i.e. perception, motivation, trust, awareness, capacity enabling, and accountability) through a two-step approach: an online Sentiment Analysis (SA), to create structured and actionable knowledge from the web, and the engagement of our client’s stakeholders (e.g. relevant target groups, citizens’ associations, domain operators, decision makers, etc.) for technology co-creation and communication regarding its social acceptance.
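As a minimal illustrative sketch – not the actual methodology – the first step of such an assessment could aggregate sentiment signals into a per-dimension acceptance profile. The six dimension names come from the framework above; the input format, scoring scale, and function name are hypothetical assumptions for illustration:

```python
from statistics import mean

# The six dimensions named in the framework above.
DIMENSIONS = ("perception", "motivation", "trust",
              "awareness", "capacity_enabling", "accountability")

def acceptance_profile(labelled_posts):
    """Aggregate sentiment scores (-1.0 .. 1.0) per dimension.

    `labelled_posts` is a hypothetical input: an iterable of
    (dimension, sentiment_score) pairs, as might be produced by an
    upstream sentiment-analysis step over web sources.
    """
    buckets = {d: [] for d in DIMENSIONS}
    for dimension, score in labelled_posts:
        if dimension in buckets:
            buckets[dimension].append(score)
    # Average each dimension; None marks dimensions with no signal yet.
    return {d: (round(mean(scores), 2) if scores else None)
            for d, scores in buckets.items()}

posts = [("trust", 0.6), ("trust", -0.2), ("perception", 0.4)]
profile = acceptance_profile(posts)
```

Dimensions left at `None` would then be covered by the second step – direct stakeholder engagement – rather than inferred from sparse online data.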
Responsible Research & Innovation
We love discovering and staying on top of new research, continuously advancing our knowledge and transforming it into responsible innovation that takes into account effects and potential impacts on ethics, privacy, and data protection. We help national and international partners handle the ethical, legal, and cybersecurity concerns of both the research process and project outcomes, through legal support for the involvement of human beings in research activities, analysis of the national and regional legal framework applicable to the technology being implemented, and recommendations for the secure and compliant development of technology. We are a multidisciplinary team that promotes the inclusion of legal and ethical concerns in the design of technology, researching and producing new knowledge and best practices towards a conscious and transparent adoption of technology.