Gloria González Fuster is a keynote speaker. She is a Research Professor at the Vrije Universiteit Brussel (VUB)’s Faculty of Law and Criminology, and Co-Director of the Law, Science, Technology and Society (LSTS) Research Group. We invited Gloria to tell us more about herself ahead of the conference.

Gloria, you investigate legal issues related to privacy, personal data protection and security. Can you briefly tell us about your research and what you are currently working on?

At VUB’s LSTS Research Group we carry out interdisciplinary research on law, science, technology and society, and the interconnections between them. Personally, I have a background in law, but also in communication sciences. For many years, I have been studying privacy and data protection law, primarily from a European perspective. More recently, I have also been researching data law and data policies in a broader sense.

Much of my research is about trying to bring the role of individuals meaningfully into these discussions. Data protection law respectfully calls them ‘data subjects’, but it nevertheless often appears to rely on an ill-defined conception of what such ‘data subjects’ actually might need, know, or be able and willing to do.

Somewhat similarly, many discussions on AI appear to concede there might be some problems concerning, for instance, discrimination or issues related to ‘groups’, while individuals as such tend to be erased from the debate. Part of my research is about how these shifts may affect personal freedom.

With the rise of social media, privacy feels like the issue of the moment, but back in 1999 Sun Microsystems CEO Scott McNealy said that consumer privacy issues were a “red herring” and that “you have zero privacy anyway”. A decade later, in 2010, Zuckerberg declared that privacy was “no longer a social norm”. There seems to be a conflict: social media users want to share information about themselves in public arenas, yet with an expectation of privacy – by which they probably mean they don’t want this information used by companies or employers without their permission. Is that a reasonable expectation or are we all being naïve?

I understand one could see a conflict, but it is only superficial. My son asks me similar questions: ‘Mum, how can you criticize Google, and then go and use Google all the time?’ Well, it is precisely because I use Google, and he does, and you probably also do, and in any case most of us do, or will, that it is critical that Google fully complies with the law. It is because companies such as Facebook, Microsoft, Apple, Amazon, etc. are so popular and important – for our economy, for politics, for science – that it is not only reasonable, but essential, to expect complete respect for fundamental rights from them. What would be naïve is for anybody to imagine that, by using this or that online service, we would be lightly agreeing to give up any of our society’s fundamental values.

In East Germany the Stasi regarded the family as “the last bastion of disloyalty, requiring ever-greater efforts to penetrate its secrets” (Privacy: A Short History, p. 108). Yet today we are inviting listening devices such as Alexa into our most private and intimate spaces – our homes, our families. What’s your take on this?

The family is a fascinating object of research from a privacy perspective! Indeed, from a certain viewpoint family life is a privileged space deserving protection, deeply connected to the respect for private life, and to the respect for autonomy and personal development of its members, including their sexual lives – which would indeed lead one to think the State must definitely be kept out of the picture. On the other hand, historically the protection of the domestic sphere has also been misused to inappropriately keep the State out of issues such as domestic violence – leading to a rethinking of the relevant boundaries.

More recently, it has also emerged that some parents are, for instance, particularly inclined to over-share pictures and data about their children, or to engage in certain tracking practices of children, or even spouses, that defy the limits of what seems acceptable, not only socially but also legally. This invites us to remember that the family is, in relation to privacy, certainly not a homogeneous group, but rather a composite set of interests.

Devices such as Alexa indeed bring an extra layer of complexity into this, by placing data extraction from all family members, by multinational private companies, at the core of many homes. In addition to this type of device, others that might look less invasive are just as challenging – I am thinking, for instance, of game consoles and the millions of players connected to them through microphones that can potentially record the voice and sounds of anybody else in the same room.

We’d love to ask you more questions but we’re conscious of your time! Is there anything else you’d like to tell us or say?

I am looking forward to the discussions at the Conference.

Why do you think technologists should attend the conference?

I won’t try to speak on their behalf…

Why are you excited to attend the conference and what do you hope to get from attending?

I find the idea of engaging in an interdisciplinary discussion that acknowledges the value of anthropology very interesting and promising.

Thanks so much, Gloria, we can’t wait to meet you on 3 October! In the meantime, check out Gloria’s blog, including her most recent post, The EU rights to privacy and personal data protection: 20 years in 10 questions.

Image credit: Photo by ev on Unsplash