Usable Security and Privacy Lab

Photo: a young man in an experimental setup with mobile phone, laptop and overhead camera – working on the “digital guardian angel”

At the beginning of October, the “ATHENE Usable Security and Privacy Lab” (USP Lab) at h_da began its work. The laboratory is run by the User-Centered Security research group and supported by ATHENE, the National Research Center for Applied Cybersecurity. The aim of the research work conducted at the lab is to make digital security technology more user-friendly so that people can understand it and use it more easily in their everyday lives. In this interview for impact, computer scientist Professor Andreas Heinemann explains why this is important and why data security is not only a technical topic but also a social one.

Interview: Christina Janssen, 8.11.2025

impact: If I put my mobile phone down in front of you with WhatsApp, work e-mails, my Gmail account, Instagram, and so on – would that bother you?

Professor Andreas Heinemann: No. What’s important is whether you have consciously decided which apps to use and whether you know the risks. Many people install apps because “everyone else” has them. That is different from making your own informed decision.

impact: What’s the problem with using something because everyone else does?

Heinemann: People want to communicate; that’s their primary objective. As long as I can communicate, I’m happy for the time being. At that moment, whether messages are secure or encrypted is secondary. There are a lot of communication apps, and they have very different security standards. If I don’t take a closer look at them, they all seem the same to me.

impact: And what can happen in the worst case?

Heinemann: Personal data can be lost or misused. Someone could sell the data – movement profiles, contact details or date of birth. Or someone could impersonate me on the Internet in order to harm me. From a social perspective, there is a risk that we are influenced in our everyday lives and our political decision-making without noticing it. Smartphones like this one here constantly collect data: location, movements, contacts, communication. The more apps are installed, the more data can be extracted.

impact: Can I be sure that WhatsApp does not misuse my data?

Heinemann: The messages themselves are end-to-end encrypted, so I’m not particularly worried on that score. However, what ends up on Meta’s servers is contact information, for example, from which it is possible to derive social relationships and networks: Who is connected with whom? Let’s take our university as an example: I’ve got the president’s mobile number in my phone and so have you. We’re both on WhatsApp, and we send each other messages, which reveals to WhatsApp that we know each other. If my phonebook and your phonebook are synchronised on the server, WhatsApp then also learns that we both know the president. This means that information is generated that we perhaps didn’t intend to disclose. It happens automatically when I choose to use convenience features such as displaying real names instead of just telephone numbers. If I do that, I give up some of my privacy.
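To make the kind of inference Heinemann describes concrete, here is a minimal sketch of how a server that receives synchronised phonebooks could detect mutual acquaintances. The phone numbers and the simple set-intersection rule are invented for illustration; this is not WhatsApp’s actual implementation.

```python
from itertools import combinations

# Contact lists as uploaded during phonebook synchronisation
# (hypothetical numbers; "+49-170-0000099" stands for the president)
uploaded_phonebooks = {
    "+49-170-0000001": {"+49-170-0000002", "+49-170-0000099"},
    "+49-170-0000002": {"+49-170-0000001", "+49-170-0000099"},
}

def mutual_contacts(books):
    """For every pair of users, list the contacts they have in common."""
    shared = {}
    for a, b in combinations(books, 2):
        common = books[a] & books[b]
        if common:
            shared[(a, b)] = common
    return shared

# The server can now infer that both users know +49-170-0000099,
# although neither user ever stated that relationship explicitly.
print(mutual_contacts(uploaded_phonebooks))
```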

impact: Are you yourself on WhatsApp?

Heinemann: Yes, and I’m aware of the risks. There is information that I would share via WhatsApp and other information that I wouldn’t.

impact: Are there apps you would categorically advise against?

Heinemann: I would always ask how the app is financed. When a service is free, I generally pay with my data. You can see this with a lot of supermarket apps. You get discounts, but at the same time your purchasing behaviour is analysed in detail. We have known this principle ever since Payback was introduced, only today it’s more personalised. I personally don’t use such apps.

impact: What is the USP Lab’s goal in this context?

Heinemann: The lab is part of ATHENE, the National Research Center for Applied Cybersecurity. We examine security from the user’s perspective: What do people need so that they can use digital services safely and without worrying about the potential consequences? The lab provides the infrastructure to study such questions in an experimental setting: rooms, technical equipment and staff. At the same time, we develop new research methods and incorporate the results into teaching.

impact: Can you please describe the research being conducted in the lab?

Heinemann: We work with eye tracking, for example. A participant works with specific software while an eye tracker records their eye movements. In this way, we can see where the user is looking, where they get stuck or which functions they don’t even notice. This helps to improve user interfaces – for example, when a button for activating encryption is difficult to spot. That is then not only a matter of usability but also of security: if a user can’t find a security function, they can’t use it either.
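As a simplified illustration of this kind of analysis, the following sketch checks whether any recorded gaze fixation falls inside the bounding box of a hypothetical encryption button (an “area of interest”). The coordinates and the recording are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float          # horizontal screen position in pixels
    y: float          # vertical screen position in pixels
    duration_ms: int  # how long the gaze rested at this point

# Bounding box of a hypothetical "activate encryption" button
ENCRYPT_BUTTON = (820, 40, 900, 70)  # x_min, y_min, x_max, y_max

def dwell_time(fixations, box):
    """Total fixation time (in ms) spent inside the area of interest."""
    x0, y0, x1, y1 = box
    return sum(f.duration_ms for f in fixations
               if x0 <= f.x <= x1 and y0 <= f.y <= y1)

recording = [Fixation(400, 300, 230), Fixation(410, 310, 180)]
if dwell_time(recording, ENCRYPT_BUTTON) == 0:
    print("The participant never looked at the encryption button.")
```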

impact: What other methods are you using in the USP Lab?

Heinemann: We try to understand how different groups use IT systems and apps. Younger people, for example, don’t use computers or mobile phones in the same way as adults or older people. To understand this, we work with focus groups: we invite representative users and conduct interviews with them, develop solutions with them and ultimately show how a product might look. Or we ask them what they think about a prototype. This enables us to learn how applications need to be designed so that they are accepted and safe to use.

impact: Who are the USP Lab team members?

Heinemann: At present, a computer scientist and a psychologist are working in the lab, which allows us to integrate both technical and user-oriented aspects. The lab is part of the UCS – User-Centered Security research group at h_da.

impact: Who is permitted to use the lab?

Heinemann: In the first instance, researchers at all the institutions participating in ATHENE. I also use the lab in my teaching. But other colleagues from the Faculty of Computer Science are also welcome to use it and, depending on capacity, from other faculties as well. We have already received expressions of interest from Darmstadt Business School, the Business Psychology study programme, and several others.

Andreas Heinemann is Professor of Computer Networks and IT Security at h_da’s Faculty of Computer Science. He heads the User-Centered Security research group. His research interests lie in usable security, security for ubiquitous computing and privacy in opportunistic networks. He represents h_da on the board of the National Research Center for Applied Cybersecurity – ATHENE and is a member of the management board of the Competence Center for Applied Security Technology (CAST).

impact: The goal of the USP Lab is therefore to make security more of an “everyday” thing?

Heinemann: Exactly. We don’t run any training courses; instead, we develop applications that support users in their everyday lives. One example is our concept of a “Privacy Buddy”: a digital companion that is by my side to help me protect my privacy – alerting me, for example, when an app demands more permissions than it needs, or advising me in which situations it would be better to send a message in encrypted form. A kind of digital guardian angel.
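The “Privacy Buddy” is a research concept rather than a finished product, but one conceivable rule is easy to sketch: flag apps whose granted permissions exceed what their category plausibly needs. The category baselines below are invented for the example.

```python
# Invented baselines: permissions an app category plausibly needs
EXPECTED_PERMISSIONS = {
    "flashlight": {"CAMERA"},  # the camera permission controls the LED
    "messenger": {"CONTACTS", "NOTIFICATIONS", "MICROPHONE"},
}

def unnecessary_permissions(category, granted):
    """Permissions granted beyond the baseline for the app's category."""
    return granted - EXPECTED_PERMISSIONS.get(category, set())

suspicious = unnecessary_permissions(
    "flashlight", {"CAMERA", "LOCATION", "CONTACTS"}
)
if suspicious:
    print("Privacy Buddy: this app also requests "
          f"{sorted(suspicious)} - does it really need them?")
```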

impact: Your aim is to develop a “digital guardian angel”?

Heinemann: That’s what we’re looking at. Whether or not it becomes an actual product is a decision that industry will make in the future. Our job is to deliver scientific findings. When I started at h_da in 2013, we still had to change our passwords every six months. Research has shown, however, that constantly changing passwords for no apparent reason does not make them any more secure. At some point, the Federal Office for Information Security took note of this finding, and many organisations, including our university, have abandoned the obligation to change passwords on a regular basis. This is an example of how research findings are transferred into practice.

impact: How does AI influence your work? Isn’t it also an “enemy” as far as your research is concerned because it enables attackers to act faster and more ingeniously?

Heinemann: AI is a huge topic in IT security. AI can automate and optimise attacks, but it can also be used for defence. In “usable security”, our field of research, I see it above all as a support tool.

impact: Do applications exist where privacy and security have already been optimally resolved?

Heinemann: There is no one-size-fits-all answer to that question. Security is always a negotiation process. The COVID-19 contact tracing and warning app is a good example: the aim was to track chains of infection while at the same time protecting privacy. This led to a solution that consciously dispensed with certain types of data analysis. There is always a tension between security and usability: if I want maximum anonymity, I limit functionality; if I want maximum functionality, I relinquish privacy.
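The privacy-preserving design Heinemann alludes to follows a decentralised pattern, as in DP-3T and the German Corona-Warn-App: phones broadcast rolling random identifiers, and all matching happens on the device. The following is a heavily simplified sketch of that idea, not the actual protocol.

```python
import secrets

def new_ephemeral_id():
    """Rolling random identifier a phone broadcasts over Bluetooth."""
    return secrets.token_bytes(16)

# Identifiers this phone has overheard nearby - stored only on-device
heard_locally = {new_ephemeral_id() for _ in range(3)}

# Identifiers later published by users who tested positive
published_positive = {next(iter(heard_locally))}  # simulate one match

# The intersection is computed on the phone itself, so no central
# server ever learns who met whom.
if heard_locally & published_positive:
    print("Possible exposure - the warning is shown locally.")
```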

impact: Would a “data protection label” for apps be conceivable, similar to Nutri-Score, the nutritional rating system for food?

Heinemann: Yes, some research work has already been conducted on this. A simple traffic light system would help users make informed decisions: this mobile phone costs less because it collects a lot of data, that one costs more because it collects less. Something like this would be desirable, but so far it has not caught on.
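For illustration, a traffic-light privacy label of the kind described could reduce a few data-collection metrics to a colour. The metrics and thresholds below are entirely invented, since, as Heinemann notes, no such scheme has caught on yet.

```python
def privacy_label(data_types_collected, shared_with_third_parties):
    """Map simple data-collection metrics to a traffic-light label."""
    if data_types_collected <= 2 and not shared_with_third_parties:
        return "green"
    if data_types_collected <= 5 and not shared_with_third_parties:
        return "yellow"
    return "red"

print(privacy_label(1, False))  # green
print(privacy_label(8, True))   # red
```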

impact: Are people in Germany not sufficiently interested in this topic?

Heinemann: At present, the need for security outweighs the need for freedom. Many people would be willing to relinquish privacy if it made them feel safer – think of the debates about more video surveillance. How far this will go is difficult to predict. In her novel “Corpus Delicti” (published in English as “The Method”), Juli Zeh describes a dictatorship in which people are rewarded for eating healthily and getting lots of physical exercise. If they don’t comply, they are punished.

impact: In China, the “social credit system” is already a reality. How can we embed a greater awareness of such issues in our society?

Heinemann: I would like us, as a university based here in Darmstadt, to establish closer links between academia and the urban community. For a while, Darmstadt had a citizens’ panel, a project at the Faculty of Social Sciences led by Professor Daniel Hanß. Reactivating something like that with a view to discussing digital topics with the people of Darmstadt would be a good idea. The USP Lab could play a role here.

Contact our Editorial Team

Christina Janssen
Science Editor
University Communications
Tel.: +49.6151.533-60112
Email: christina.janssen@h-da.de

Translation: Sharon Oranski
Photography: Gregor Schuster