March 03, 2025
Social Justice
Technology doesn't shape society; rather, society, with all its values and diverse ideas, shapes technology.
The voice of Diletta Huyskes, author of "Technology of the Revolution", interviewed by Chiara Pedrocchi and Marco Biondi
In her book "Technology of the Revolution", the author examines how technology is closely linked to the dominant social constructs underlying the system that creates it, from its genesis to the use we make of it. Chiara Pedrocchi and Marco Biondi interviewed her to better understand her perspective on this topic.
Q: Diletta, can you introduce yourself and tell us about the path that led you to write the book "Technology of the Revolution", published by Il Saggiatore?
A: I do many things focusing on technology and artificial intelligence, with a critical and analytical eye on their complexity. I come from the worlds of activism and research, and this book is the result of both perspectives. In it I tried to answer the question: why do technologies reproduce certain logics, and how can this be limited? Writing it was a great moment of awareness for me.
Q: The microwave is one of the first technologies you discuss in your book. Why did you decide to include this appliance?
A: The microwave doesn’t occupy an important place in this book as a technological object, but as a method for studying technology. In the early 1990s, Cynthia Cockburn and Susan Ormrod, two feminist sociologists, conducted a study entitled "Gender and Technology in the Making" to investigate how gender relations enter into technological design. They chose the microwave as a case study because it was a very popular consumer technology at the time, much as generative AI would be if we were doing the research today. They set out to understand what role gender relations played at every stage, from the microwave’s design to its commercialization. Through this social research, based on an ethnographic methodology, they found that men were the main actors in the design of the technology, while women were only the end users of the product. In the book I discuss an interview with a sales director of the company that produced the microwave, who described the end user as a specific type of woman with a chignon and a skirt: the conventional image of the bourgeois housewife. The manufacturers designed the product for this hypothetical end user. In reality, however, they ended up selling the microwave as an object intended for men, because the reasoning was turned upside down: women already knew how to cook and didn’t need something to speed up and automate cooking; it was men who needed that.
It should be noted that the design of this product encoded a set of norms and prejudices about gender relations. This is the lesson I aim to draw from that work, and I suggest adopting it for the study of other technologies.
Q: In this book you talk a lot about the concept of technological determinism. What does it mean? What is your position on this construct?
A: Technological determinism is the paradigm by which technology has been interpreted for a very long time, the classic narrative used since the first industrial revolutions, according to which technology creates society and not vice versa.
I am against this paradigm. The alternative model that I promote is "social constructivism": everything we experience or create is a social construct, the product of a set of beliefs, values, ideas, choices and interactions that take place within society.
For example, classical determinism suggests that the creation of the microwave oven is responsible for all the kitchen habits that followed its introduction. The same happens with the washing machine: before it, women spent many hours of the day washing clothes and could manage only one load because of the length of the process. The washing machine was supposed to automate that process and free up time for women, but in reality it raised standards and expectations of productivity. We no longer do one load a day; we now do several loads per day in less time.
Nowadays, the companies that control the market in North America push us to identify a specific technology with the companies that produce it. If we think of artificial intelligence, for example, we think of OpenAI, the company that produces it. But then another company, DeepSeek, which makes generative AI in China, started to offer its own version of the same technology. This showed us that there is more than one way to make generative AI, and that we don't have to spend billions of dollars, take over territory in other countries to set up data centres, and consume all the planet's resources. These technologies might not be the best, but they show us that it's not technology that makes a society, but society that "makes technology" based on its own values and ideas.
Q: In your book you also talk about the different feminist approaches to technology. Could you walk us through them?
A: Before feminist studies of technology, we did not have literature or a body of research that accounted for the impact of technologies on different social groups. The larger feminist movements, and the most prominent feminist scholars who studied technology, split when faced with this question. There were two parallel tendencies. One leaned towards boundless enthusiasm for the new technological objects entering the home, from the washing machine to the Internet, which would help women emancipate themselves. The other saw technologies as an additional form of control, rooted in the belief that technology increased control over women's bodies by the actors who created it, i.e. men, who continue to control it today.
The contraceptive pill, for example, convinced some within the feminist movements of the liberating and emancipatory potential of this technology, which allowed women to control pregnancy and the menstrual cycle; others, however, perceived the same technology as a subtle instrument of control over women's bodies because, like all other technologies, it was conceived and designed by men. Furthermore, the history of the pill was built on a terrible process of exploitation of the bodies of many women in the global South for the benefit of other women in the global North. It was very interesting to trace how feminism collided and divided when faced with the scenarios opened up by technology. What emerges is that there are many ways and means of doing technology, and that the way it is done is often oppressive and problematic for some social categories.
Q: Another theme addressed in the book is the racist mentality that characterizes many people involved with these technologies, and how their biases have an impact on technology tools and their use. The main example is racial profiling, which is the categorization of people on the basis of their physical characteristics or ethnicity. How do racial oppression and technology intersect?
A: We approached the issue through the lens of gender because it was second-wave feminism that first brought this analysis to light. In the 1980s and 1990s we discovered intersectionality: the problem is no longer just gender discrimination, but other interwoven forms of oppression. For example, one of the categories most often used by artificial intelligence to discriminate is race or ethnicity, often linked to people's socio-cultural background. Unfortunately, the operation is always the same: these technologies learn from statistical bases and historical data, so they learn what is most recurrent. As a result, they exclude certain social categories or include others too broadly. If we look at crime statistics in the US, for instance, we know that those who are culturally different from the white North American majority are more oppressed and more controlled by the state, and therefore will be more represented in the statistics. If this data is fed to an artificial intelligence tool, it will inevitably discriminate, because that is how it is set up: it learns from the most frequent events. This is how machine learning works; we cannot expect these technologies to behave differently, because it is humans who have set them up. It is not artificial intelligence that discriminates, but the discriminating society that created it in the first place.
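The mechanism described above, a model reproducing the skew of its training data, can be illustrated with a minimal sketch. The data, group labels, and "risk" logic below are entirely hypothetical toy assumptions, not real statistics or a real policing system; the point is only that a purely frequency-based learner turns over-policing into an over-prediction.

```python
from collections import Counter

# Hypothetical toy data: arrest records over-represent group "B"
# because group B was policed more, not because it offends more.
historical_records = ["A"] * 10 + ["B"] * 40

# A purely statistical learner estimates "risk" from recorded frequency.
counts = Counter(historical_records)
total = sum(counts.values())
risk_estimate = {group: counts[group] / total for group in counts}

print(risk_estimate)
# The sampling bias in the records becomes the model's "prediction":
# group B is scored as four times riskier than group A.
```

Nothing in the computation is malicious; the discrimination is inherited entirely from how the input data was produced, which is the point the interview makes about machine learning.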
Q: In the book you also talk about how this could lead us to shirk responsibility and clear our consciences. Is that a real risk?
A: Exactly, there’s a risk. First we have to work on ourselves and on the discrimination that we ourselves create, and only then on the machines. "What exists will be repeated; what does not exist will probably never exist" is, in probabilistic terms, the reasoning machines follow, and it is something we must keep in mind in order to make the human actors who build these technologies accountable.
Q: What if the control of technologies is in the hands of people who do not care about social issues and values of equality and justice?
A: Technologies used in this way are undeniably tools of power, but they could be guided by frameworks alternative to control, such as care. The most advanced technologies are designed to surveil individuals rather than, for instance, to identify which categories most need the state's attention or to make the distribution of public resources more efficient. It is about changing the way we make technology. What I am trying to do, in my small way, is to show that technology can be built differently. If there’s one message I hope people take away from my book, it’s that nothing in technological development is inevitable.
Q: Another thing we should dismantle is the individualistic rhetoric of the lone mad genius.
A: Back in the '80s, the constructivist studies of Donald MacKenzie and Thomas Hughes warned us against the rhetoric of the single inventor who, from one day to the next, built the thing we needed. There is never a single person behind a technology; there are much more complex processes, and everyone on the team makes choices that contribute to a particular outcome. Changing even a single factor would lead to a different result.