COPENHAGEN, Denmark — If you’ve been feeling freaked out by your smartphone lately, you’re not alone. Fascinating findings out of Denmark reveal that people can’t help but continue using their favorite apps — regardless of how creeped out they feel about the amount of personal data those apps collect.
It’s no secret that mobile apps collect a wide range of data on their users, and studies continue to show that most people aren’t the biggest fans of data collection. Surprisingly, study authors from the University of Copenhagen report that despite surveys showing users experience emotional stress over app-based data collection, most simply continue scrolling.
“It seems that people accept this uneasy feeling almost as a part of the user experience. Somehow, we have been trained to live with being uncomfortable. But you may ask how it can be defensible to treat people and their emotional states so terribly,” says study co-author Irina Shklovski, Professor at the Department of Computer Science (DIKU), in a university release.
Moreover, Prof. Shklovski and her co-authors have developed a specialized tool that measures the degree of discomfort tech users feel.
“I think most of us have experienced feeling uneasy when downloading apps, but most often you can’t really put your finger on what the problem might be. So, we decided to create a way of measuring the degree of discomfort,” the professor adds.
The study breaks down a ‘creepy app’ into three distinct parts:
- Violating the boundaries of the user
- Violating those boundaries in an unexpected manner
- Possessing ambiguity of threat
Study authors say high scores in all three of those categories would make for one very creepy app.
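For illustration only, the three-part breakdown could be sketched as a simple composite score. The field names, the 1–7 rating range, and the averaging are assumptions made for this sketch, not the actual instrument published in the paper:

```python
from dataclasses import dataclass

@dataclass
class CreepinessRating:
    """Hypothetical rating along the study's three dimensions (1 = low, 7 = high)."""
    boundary_violation: int  # does the app violate the user's boundaries?
    unexpectedness: int      # does it violate them in an unexpected manner?
    threat_ambiguity: int    # is the nature of the threat ambiguous?

    def composite(self) -> float:
        """Average the three dimensions; high scores on all three mark a very creepy app."""
        return (self.boundary_violation + self.unexpectedness + self.threat_ambiguity) / 3

# An app scoring high on every dimension yields a high composite "creepiness" score.
rating = CreepinessRating(boundary_violation=7, unexpectedness=6, threat_ambiguity=7)
print(round(rating.composite(), 2))
```

Again, this is only a toy rendering of the idea; the researchers’ published scale may weight or score the dimensions quite differently.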
“Notably, we are talking about emotional response here. Even in a situation where objectively everything is fine, for instance if a technical solution guarding against misuse of personal data is in place, the user may still feel discomfort,” Prof. Shklovski emphasizes.
Now that they’ve developed a creepiness scale, researchers can investigate how various changes or modifications may alter users’ experiences and feelings.
This project included a total of 751 participants divided into groups, with each group receiving a different version of a fictitious app to download. The app, called “Remember Music,” functioned just like many real apps and served to identify songs or tunes.
“Just like in the real world, the participants would have to agree to a license agreement, and again just like in the real world they would click accept without thinking twice,” Prof. Shklovski notes.
One version of the app collected users’ locations, while another version almost immediately began making music suggestions based on previously identified songs and artists. A third version even automatically posted what users were listening to on Facebook. Some users in this group had control over the Facebook settings.
“We had expected the group with control to feel more comfortable, but surprisingly they didn’t,” the study author comments, adding that this is a major discovery. “Lawyers and organizations working to improve data privacy are often focused on improving user control. While this may be desirable for other reasons, sadly our research shows that the emotional stress to users will not be relieved.”
Digital literacy leads people to take greater risks
Each person in the experiment also had to rate themselves on digital literacy.
“We normally assume people who have a high degree of digital literacy to be more critical towards the apps, but again surprisingly, the opposite is true. The more you see yourself as digitally literate, the more likely you are to continue using an app that is invasive,” Prof. Shklovski says.
Study authors say this work suggests that it’s on the app developers, not users, to correct these creepy issues.
“Industry and public bodies will argue that this is a question of personal data hygiene. In other words, that as users become more digitally aware they will favor less intrusive apps over the more intrusive. Based on the data from our study, we can say that trying to shift responsibility to the user in this way will not work. That horse has bolted. If we want things to get better, we need developers and policy makers to change the scene,” Shklovski concludes.
The study is published in Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems.