Addictive behavior and mental health
It has been suggested that using UXD artifacts could result in addictive behavior and dependence, which could lead to problems such as emotional stress, damaged relationships, and attention deficit disorder (Shuib et al., 2015b). Further, stimulating experiences can sometimes lead to uncontrolled use, despite negative repercussions on one's personal and social life (Noë et al., 2019). Good examples of stimulation from UXD are social media applications that are designed to prolong users' usage. For example, some designs include infinite scrolling interfaces that lack any stopping cues, while others encourage users to return via notifications to check who liked their picture or to see the latest news (Noë et al., 2019). These and other functions of UXD artifacts could lead to excessive attention to artifacts, uncontrolled dedication, and a preoccupation with them, all of which are factors of possible addictive behavior (Coyne et al., 2019). Although the APA (American Psychiatric Association) has not yet included smartphone or internet addiction in its official list of mental disorders (the Diagnostic and Statistical Manual of Mental Disorders, DSM-5), the phenomena have been widely researched, and studies have suggested correlations between addiction and excessive, uncontrolled use of smartphones and UXD artifacts (Coyne et al., 2019; Trowbridge et al., 2018). Furthermore, it has been suggested that artifacts like social media could contribute to or worsen mental disorders such as depression, anxiety, and sleep and eating disorders (Coyne et al., 2019). Fear of missing out or the need for touch could result in excessive use of mobile phones and thus worsen pre-existing mental disorders (Elhai et al., 2016).
Memory
With the ongoing development of UXD artifacts such as search engines and other databases, we have constant access to information, and in just a few short steps we can get solutions to complex mathematical tasks or answers to natural phenomena we are curious about. This human-technology relation now enables us to offload into transactive memory much more of what we stored internally before the invention of the internet. Although digital databases alleviate our cognitive load because we do not have to remember everything, there are also, arguably, potential downsides to overuse of and reliance on them. Research shows that people are more prone to remember where to find specific information than to know it, and that they tend to forget items they think will be available externally while remembering those they think are not (Sparrow et al., 2011). One could say that these are positive effects of offloading transactive memory; however, concerns could be raised that our relationships with technology are becoming so symbiotic that our own cognition, such as memory and attention, is being altered by them. It has been suggested that people who grew up with internet technology, also known as 'digital natives,' display different cognitive profiles from those who adopted it later in life, called 'digital immigrants.' Digital natives were shown to gravitate towards shallow information processing and to have increased distractibility and poor executive control abilities, which could lead to structural changes in the brain (Loh & Kanai, 2016).
Distraction and attention
Research has suggested that UXD artifacts could affect attention; they can act as a distraction from daily tasks like working, studying, or simply living, and can even become hazardous, creating a risk of collisions while walking or driving. Social media tends to provide interruptions (especially because of its mobility) and could therefore be associated with multitasking (Primack & Escobar-Viera, 2017), i.e., divided attention. This kind of interrupted attention has been associated with negative cognitive outcomes: where two or more tasks are performed simultaneously, performance is worse than when the tasks are performed separately (Farmer et al., 2018). Furthermore, multitasking has been related to a decreased ability to sustain attention, poor academic performance, decreased subjective well-being, higher levels of depression and anxiety (Primack & Escobar-Viera, 2017), and deficient self-regulation that could lead to addictive behavior (David et al., 2015).
When it comes to artifacts affecting drivers, it has been shown that if a driver checks their phone apps while driving, their visual information processing is delayed, which could be a major contributor to car accidents (Ishida & Matsuura, 2001). It has also been shown that drivers who used mobile phones missed more red lights than drivers without distractions (Strayer & Johnston, 2001). Besides drivers, distraction-related accidents are also very common among pedestrians, causing injuries like concussions, fractures, or sprains (Nasar & Troyer, 2013).
Privacy, tracking, and psychological targeting
UXD can affect a person's privacy by, for example, creating artifacts that track their activity while they use them, collecting digital footprints such as comments and likes, or even performing background experiments based on users' behavior. It is known that private companies generate a lot of money from the collection, use, and sale of personal data, and data brokers collect user information gathered from social media and other footprints to sell for marketing purposes (Beake, 2014). If one knows the unique psychological characteristics and motivations of a person, this can be used for psychological persuasion, such as influencing a person to vote for a specific candidate, buy a certain product more often, or click on advertisements tailored to them (Matz et al., 2017). Therefore, creators can design their artifacts in a way that takes advantage of this kind of knowledge. To make UXD artifacts even more attractive and usable, user experience designers experiment with techniques such as A/B testing, where different versions of an artifact are shown to different people and their behavior is then tracked to see which variation is more effective at converting them into customers. These experiments usually take place in the background without users' knowledge and have proved successful in political campaigns, using images or buttons that testing showed to have the greatest success (Siroker & Koomen, 2013).
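As an illustration only (not drawn from the cited sources), the core mechanics of such an A/B test can be sketched in a few lines: each user is deterministically assigned to a variant, and conversions are tallied per variant. All names here (VARIANTS, assign_variant, conversion_rates) are hypothetical.

```python
import hashlib

# Two hypothetical versions of an interface element, e.g., a signup button.
VARIANTS = ["A", "B"]

def assign_variant(user_id: str) -> str:
    """Hash the user ID so the same user always sees the same variant."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

def conversion_rates(events):
    """events: iterable of (user_id, converted) pairs logged in the background."""
    seen = {v: 0 for v in VARIANTS}
    converted = {v: 0 for v in VARIANTS}
    for user_id, did_convert in events:
        v = assign_variant(user_id)
        seen[v] += 1
        if did_convert:
            converted[v] += 1
    return {v: converted[v] / seen[v] if seen[v] else 0.0 for v in VARIANTS}
```

Note that, as described above, nothing in this loop requires the user's awareness or consent: assignment and tracking happen entirely server-side, which is precisely what raises the privacy concern.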
Malicious user experience and user interface
The benefits of user experience design are often directed toward the ends of someone other than the user (e.g., the artifact's originator). In this section, I will describe some of the malicious techniques by which user experience designers (or other agents) implement deceptive functionality that is not in the best interest of users.
Exploiting pre-attentive processing by distraction
This refers to attracting the user's attention away from their current task by exploiting perception, particularly pre-attentive processing (Conti & Sobiesk, 2010). In a user interface, this can be achieved using movement (blinking or moving content, especially advertisements), distracting audio, intense colors, and size (e.g., tricking users into clicking a prominent red button), to name a few (Gray et al., 2018).
Information architecture can also be structured to guide the user towards the designer's goals (Conti & Sobiesk, 2010). For example, paid subscription screens may appear in freshly installed applications, even though a free version is available.
Sneaking and lying
Hiding or disguising important information from the user. The intention is to make users perform an action that they might object to if they had knowledge of it (Gray et al., 2018). The most common example is continuing a subscription without informing the user, or presenting users with free trials without informing them when the paid subscription will start. Other examples are disingenuous behaviors such as sneaking items into the shopping basket, installing additional software the user did not ask for, or advertising a monthly price for a product while switching to a yearly price at checkout (Conti & Sobiesk, 2010; Gray et al., 2018).
Hiding desired information (Conti & Sobiesk, 2010); for example, when one tries to cancel a subscription or delete an account, the user's desired options are rendered in light grey, while the button for continuing the subscription is large and red.
Presenting content as ‘never-ending,’ i.e., lacking any stopping cues, which could lead to prolonged usage of the artifact (Noë et al., 2019).
Spreading false information
Search engines and social media allow us to find information quickly and easily. However, there is no guarantee that this information will be true. Serious examples include the spreading of false beliefs about the way vaccinations work and their effects on people, stating false facts about certain health risks, or generally nurturing false beliefs about diseases, symptoms, treatment, and prevention (Wu & McCormick, 2018). There are also many false news reports in the media with fabricated, misleading, or negligently written content; reasons for false reporting range from simple accidental mistakes and negligent reporting to planned, strategic manipulation (Vos et al., 2019).
Cyberbullying
While UXD has given users the ability to exercise their freedom of speech, this ability is a double-edged sword. The ability to be anonymous or partially hidden means that there is a heightened opportunity for harmful action. The phenomenon of cyberbullying can include posting derogatory comments, posting humiliating pictures, or threatening someone electronically. Cyberbullying goes one step further than regular bullying because it can reach an unlimited audience, and visible actions can remain in place for long periods of time (Kowalski et al., 2014; Nixon, 2014). The most affected group is adolescents (Craig et al., 2020), among whom increased depressive affect, loneliness, suicidal behavior, anxiety, and other somatic symptoms have been reported (Nixon, 2014).