Cybersecurity
Coined by Harry Brignull in 2010, the term dark pattern (or deceptive pattern) describes subtle UX decisions meant to manipulate or trick users into actions they wouldn't normally take.
A common manipulation technique is hiding questionable privacy consent requests inside lengthy agreements, which the vast majority of users skip before hitting "I agree". Few people take the time to read terms of service, and this habit of overlooking and blindly trusting is exactly where the danger frequently lies.
It's no secret that profit is the primary objective of any company, and every corporate action is based on this premise. Sometimes good, responsible design gets in the way of profit, and that's when dark patterns become the strategy. Dark patterns are designed to facilitate profit, directly or indirectly. From making it harder to close a pop-up on a page, to requesting an excessive amount of personal data in a sign-up form, these tactics are unethical and dangerous: they train users to overlook warning signs, which is what cyber attackers find most appealing. Like the boy who cried wolf, repeated false alarms teach everyone to stop paying attention.
Companies like Meta (Facebook at the time of the incident) have been accused of compromising user privacy with dark patterns. The Cambridge Analytica scandal, for instance, occurred when millions of user profiles worldwide were harvested without consent and used to influence voters' political choices through targeted ads. According to investigations, consent for the use of sensitive data was buried in the lines of an updated privacy agreement, enabling further abuse. The consequences reached several countries and gave CEO Mark Zuckerberg his very own eponymous technique:
"Privacy zuckering" is the act of tricking users into disclosing more personal information than they intend to.
Yes... dark patterns!
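To make the mechanism concrete, here is a minimal, hypothetical sketch of how privacy zuckering shows up in a sign-up flow: optional data-sharing toggles default to "on", so users who click through the defaults disclose more than they intended. The form fields and names below are invented for illustration, not taken from any real product.

```python
from dataclasses import dataclass

# Hypothetical sign-up form: the dark pattern is that the optional
# data-sharing toggles are pre-checked (default True), so sharing
# happens unless the user actively opts out.
@dataclass
class SignupForm:
    email: str                                  # required for the service
    password: str                               # required for the service
    share_contacts_with_partners: bool = True   # pre-checked "opt-out"
    allow_ad_profiling: bool = True             # pre-checked "opt-out"

def collected_data(form: SignupForm) -> list:
    """Return the categories of data the service ends up collecting."""
    data = ["email", "password"]
    if form.share_contacts_with_partners:
        data.append("contact list")
    if form.allow_ad_profiling:
        data.append("behavioral profile")
    return data

# A user who accepts the defaults discloses far more than intended.
default_user = SignupForm(email="a@example.com", password="hunter2")
print(collected_data(default_user))
```

A privacy-respecting design would invert the defaults (both toggles `False`), so that extra disclosure requires an explicit, informed choice.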
From a cybersecurity perspective, dark pattern mechanics aren't that different from a social engineering attack: both abuse human behavior to obtain advantages that wouldn't be achieved otherwise, such as accessing personally identifiable information (PII) and sensitive PII (SPII), or opening doors for financial abuse. Thus, the question here is:
How do we define where an unethical action ends and a criminal one starts?
The cornerstone of user experience is to empower users, so that their interaction with technology feels natural and instinctive. It's no coincidence that a foundational work of UX literature is titled Don't Make Me Think. However, by reducing users' attention spans across experiences, design can also be used as a tool to exploit vulnerabilities rooted in our own humanity, leaving an open road for malicious exploitation.
Bear in mind that dark patterns are, in most cases, intentional. Their use is a corporate decision. Therefore, the development of genuinely secure digital products demands that we expose manipulative design practices which go beyond just "clever optimization."
The weakest link in cybersecurity is always the human. Designers can—and should—take part in risk assessment processes.