The Dark Side of User Interface Design

Essay by Rosa Llop & Ariel Guersenzvaig.

With the appearance of the first graphical interfaces, the notion of “interface” came to be associated with the idea of a “window”. All operating systems began to facilitate interaction through manipulable frames and, in no time at all, the metaphor of the interface as a “window to the world” was popularised. While the allegories of the desktop, folders and the recycle bin are still in use, the metaphors applied to graphical interfaces are being called into question. Nonetheless, we continue to define the interface as a transparent surface that facilitates contact with a system. By using a noun to represent the interface (layer, window, surface…), we give the concept a neutrality it doesn’t deserve. We trust it completely and believe that the information it provides is honest. This apparent neutrality leads us to feel we can act freely, that we are the masters of this communication. Interfaces, however, like any other design, are subject to a context and therefore respond to the aims of that context. And, for the time being, the methods for achieving those aims are neither regulated nor subject to any guarantees. In this article, we describe some of the practices that demonstrate that interfaces are not transparent, and we argue for the importance of changing this metaphor.

In 2015, a California court ruling ordered the professional social network LinkedIn to pay 13 million dollars for tricking the users of its service through its registration interface. The aim of the scam was to gain access to users’ e-mail address books and to obtain permission to send e-mails in their name. To expose the scam, Google product manager Dan Schlosser published an article on Medium outlining 16 different interfaces LinkedIn used to fraudulently appropriate address books. The strategies he describes range from deliberately misleading messages to dialogue boxes designed so that certain buttons stand out above others.
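That second strategy, buttons that stand out above others, is straightforward to reproduce. As a minimal, hypothetical sketch (TypeScript in the browser; the labels and styles are ours, not LinkedIn’s actual markup), consider how styling alone can steer a choice:

```typescript
// Hypothetical illustration of a biased dialogue: the action that benefits
// the company gets size, colour and contrast; the action that benefits the
// user is styled so that it is easy to overlook.

const connect = document.createElement("button");
connect.textContent = "Add connections";
Object.assign(connect.style, {
  background: "#0073b1", // saturated brand colour draws the eye
  color: "#ffffff",
  fontSize: "18px",
  padding: "12px 32px",
  border: "none",
});

const skip = document.createElement("a");
skip.textContent = "Skip this step";
skip.href = "#";
Object.assign(skip.style, {
  color: "#cccccc", // low contrast against a white page
  fontSize: "11px", // far smaller than the primary action
  textDecoration: "none",
});

document.body.append(connect, skip);
```

Both actions are present, so the dialogue is technically honest; the deception lies entirely in the visual weighting.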

Practices like these, which deceive users in the course of an interaction, are called dark patterns: strategies designed to manipulate users and trick them into performing actions that benefit the company. In LinkedIn’s case, the result was a rapid increase in the number of users on the network. Readers have most likely received repeated e-mails inviting them to connect through this network. What they probably do not realise is that these invitations are sent by the platform itself, autonomously and without the explicit permission of the users who appear to be their senders.

LinkedIn is not the only company that uses dark patterns. Apple also drew scorn when, in 2013, it was discovered that the company had introduced a dark pattern into the iPhone operating system, with the aim of ensuring that users scarcely had the chance to disable the identifier that tracks their browsing activity: an identifier that infringes on user privacy and exists so that advertisers can serve contextual advertising based on users’ web browsing behaviour. Specifically, Apple used one of the most underhand dark patterns of all: the double negative, which makes a message treacherously difficult to understand. In this example, when users arrive at the window where they can, in theory, disable ad tracking, the dialogue box shows the following message:

“Limit ad tracking: Off”.

Innocently, users could easily believe that tracking has been disabled when, in reality, it is the limiting of tracking that has been turned off: tracking itself remains fully active. These kinds of practices are more common than they might seem. In fact, they have their own category within the dark patterns: so-called Tricky Questions. LinkedIn’s deceitful practice also has its own category: Friend Spam. There is even a category named in honour of Facebook founder Mark Zuckerberg: Privacy Zuckering, a pattern Facebook used in its early days to hinder access to the privacy settings menu, thereby tricking users into sharing more information, and with more people, than they really wanted to. Fortunately, thanks to complaints from Facebook users, this pattern was removed, making it easier to manage privacy levels for each post. Which is not to say that Facebook is free from sin.
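Returning to the “Limit ad tracking” example, the mechanics of the Tricky Question are easy to see in code. In this minimal sketch (the names are hypothetical, not Apple’s actual implementation), the stored flag encodes the limiting of tracking rather than the tracking itself:

```typescript
// Hypothetical sketch of a double-negative setting. The flag records
// whether tracking is LIMITED, so showing it as "Off" means the limitation
// is off, i.e. tracking is fully active.

interface PrivacySettings {
  limitAdTracking: boolean;
}

const settings: PrivacySettings = { limitAdTracking: false }; // displayed as "Off"

const trackingIsActive = !settings.limitAdTracking;
console.log(trackingIsActive); // true, despite the reassuring "Off"

// A transparent alternative phrases the option positively,
// so that "Off" really means no tracking:
interface ClearerSettings {
  allowAdTracking: boolean;
}
```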

Other well-known companies such as YouTube, Pinterest, Netflix and Evernote also try to trick their users with interfaces that provide a simple subscription process while hiding the fact that it is nigh on impossible to unsubscribe. This pattern is called the Roach Motel, and if readers don’t believe it, they can consult the list of web services that hinder account closure at www.justdelete.me. There are more than a few!

Harry Brignull is the designer responsible for cataloguing these and many other dark patterns at www.darkpatterns.org, with the aim of denouncing these practices and making users more aware of them.

As designers and professionals with a sense of ethics, we are most outraged by the dark patterns that use tricks of visual perception to deliberately focus the user’s attention on one thing and distract it from another that would be more useful to them. They outrage us because these practices demand expert knowledge of how colour, the position of an object in space and its size influence the way a message is understood. The aspects of image theory that researchers and teachers strive to decipher and transmit to their students should not be used to manipulate users for fraudulent ends, but to make visual communication functionally more effective, never to the detriment of the user.

This information asymmetry, which allows a designer and a company to manipulate users into clicking on an advert, buying the most expensive ticket or finding themselves unable to close an account, flies in the face of professional ethics, which should play a far more central role. As designers, we should be aware that the user interface is not merely a transparent window, and that whoever dominates the interface establishes the rules of the game, creating a strong asymmetry in favour of the manipulator.

Our users don’t ask what the interface is not showing them; they don’t ask why they are given certain options and not others, or why it might be in the provider’s interest to present particular information. They cannot be expected to know that there is hidden information more useful than what they first see. They don’t imagine, for example, that if they search for a flight and later return to the same page in the same browser, the airline may have raised its prices slightly, interpreting the repeat visit as a sign of heightened interest and assuming the client is therefore prepared to pay more. Users simply trust that they will not be manipulated, that they are being offered the best option and that they don’t need to be overly suspicious when browsing the Internet. But as dark patterns clearly demonstrate, Internet services do not deserve such trust.
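The flight example can be made concrete. A minimal sketch of the logic (entirely hypothetical, not any airline’s actual code) might look like this:

```typescript
// Hypothetical repeat-visit pricing: a cookie counts how often the same
// browser has searched for this fare, and each repeat search nudges the
// quoted price upward on the assumption of growing intent to buy.

const BASE_FARE = 120; // base price in euros (hypothetical)

function quoteFare(searchCount: number): number {
  // +3% per repeat search from the same browser
  return Math.round(BASE_FARE * Math.pow(1.03, searchCount));
}

console.log(quoteFare(0)); // 120: first search
console.log(quoteFare(3)); // 131: same flight, three searches later
```

Nothing on the page changes; the user has no way of knowing that their own persistence is what raised the price.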

Readers may be tempted to think that dark patterns are simply marketing tactics, and that anything goes when it comes to achieving business aims. Here, it is worth remembering not only that there are court rulings against these practices, but also that consumers are increasingly aware of them and penalise the brands that use them. There is no doubt that consumers appreciate brands offering a good user experience. When such practices are employed, it is not only the brand’s reputation that is at stake, but also that of the entire ecosystem of digital products.

Technological determinism assumes that technology has inherent effects on society that reach every social sphere, from cultural values to the ways individuals relate to one another. This theory, defended by thinkers such as Heidegger and McLuhan, holds that the form technology takes determines the character and behaviour of society. If we fill the market with products whose interfaces manipulate and trick users, what kind of society can we hope for, when users are constantly given reasons for distrust and suspicion? Technological determinism offers one reason to change the metaphor for the interface: to stop treating it as a noun with neutral traits (a window) and start seeing it for what it really is, a mediator with the capacity to significantly influence the construction of our society.

At the other extreme of thought regarding technology, there is also a reason to change the metaphor and to defend our ethical responsibility. Technological constructivism refutes technological determinism, arguing that technology does not unilaterally shape society but is itself a social construct; society, therefore, has the capacity to intervene and decide what that technology should be like. As designers, then, we have an obligation to recognise our responsibility in the construction of the digital society and to promote user interfaces that reflect our ethical values, protect our rights and guarantee our freedoms. Technologies undoubtedly condition us, enabling certain possibilities and denying others, and it is in this space that there is room for individual, professional and entrepreneurial freedom and ethics. Technology is part of our existence, but it does not determine who we are. If, as designers, we pay attention only to sales margins, conversion rates or result-maximising criteria, we forget that we have other obligations, not just to our clients but to society. If we apply these dark patterns, we may sell more or gain an extra subscriber, but we will be failing a society that trusts in us. If we believe that design is a profession, that the designer is a professional and not just a technician, we should act like professionals, and that means assuming our responsibilities. If our designs undermine trust, one of society’s most precious assets, can we continue to consider ourselves professionals?

