Designing privacy: 13 projects to raise awareness and resist digital surveillance

In the digital society, our personal data has become the new currency: here is how design is stepping up to protect vulnerable privacy and raise awareness of issues that are often underestimated or barely known.

- Rei Morozumi
- 24 January 2023

The ubiquitous cameras in our cities and increasingly sophisticated biometric technologies keep us safe, but they can also violate our rights as citizens. These devices collect huge amounts of personal data – often without our consent – while we remain unaware of where it goes and how it is used. The ever-present smartphones, voice assistants, wearable devices, and the many other ‘smart’ objects that the Internet of Things (IoT) has brought into our private lives make everything easier by connecting everything, but they also track our movements and actions, both online and offline.

The data that these everyday devices extract from us is, in the words of Shoshana Zuboff, the “oil” of a new form of capitalism, which the scholar calls “surveillance capitalism” – also the title of her acclaimed book, where she explains how new means of production, namely artificial intelligence, process this raw material into the product that feeds a new, rapidly evolving market: predictions of our behavior.

Another key issue is the discriminatory component of data – a problem that stems from the illusion that these devices are objective and free of the biases we instinctively ascribe to humans as opposed to machines. Kate Crawford – artificial intelligence expert and curator of the exhibition ‘Excavating AI: The Politics of Images in Machine Learning Training Sets’ – explains that data are political interventions. They are not as neutral and objective as they seem but are organized into datasets that inevitably embody the ideological biases of those who create them. Their management can therefore lead to the proliferation of ‘automated discriminations’ based on race, gender, social status, financial situation, and even lifestyle, taste, and personality.
Faced with this complex and ever-changing scenario, design has not sat idly by. In addition to conceiving simple objects such as “privacy shutters” – webcam-blocking devices that are now often built into laptops – design, in its critical dimension, has proposed provocative projects that aim first and foremost to raise awareness of the threat that digital tracking poses to privacy. However, these projects can also serve as marketable objects that users can adopt to escape the control of their devices, thus protecting themselves without retreating into an anachronistic anti-digital sanctuary. The projects use different methods to protect individual privacy. Some, for example, isolate the user from tracking systems, while others interfere with the various technologies through data overload or falsification. The gallery presents a selection of projects that focus on two main themes – defense against facial recognition and protection against the surveillance of smart objects – thus addressing both the public and private spheres of today’s citizens.
Privacy Mask, presented at Anonymous, an exhibition organized by the HKU (University of the Arts Utrecht), is one of the objects designed to protect privacy from the invasiveness of facial recognition cameras, which are becoming increasingly widespread and can recognize people more efficiently than the human eye. Facial recognition is used in smart city management and for security reasons, including, especially in some regimes, the identification of participants in protests and social unrest. The purpose of the mask is to provide protection while preventing the wearer from completely losing their expressivity and identity in the real world. The material is in fact a transparent PMMA that allows the face to remain visible while shielding it from cameras thanks to its ribbed patterns and lines.
Face Projector, showcased at Milan Design Week 2017, is also one of the prototypes created for HKU’s Anonymous exhibition, which explores the defense of individual identity in a dystopian future that feels less and less distant, where facial recognition tools are ubiquitous. It consists of a wearable device, worn on the head like a visor, that projects images of random faces onto that of the user in order to confuse facial recognition technologies. The project regained popularity in 2019, when fake news spread on social media decontextualizing the product and describing it as a ploy used by protesters during the Hong Kong protests to avoid being identified by the police.
Adam Harvey, an American artist and independent researcher based in Berlin, is one of the most active voices in the context of privacy and surveillance. CV Dazzle, whose name is inspired by the naval camouflage techniques used during the First World War, is arguably one of the best-known and most interesting projects when it comes to blocking facial detection. It explores how make-up and hairstyles can be used to avoid being recognized by cameras, while still remaining recognizable to people. Using a variety of make-up and avant-garde hairstyles, CV Dazzle breaks the continuity of a face and alters the spatial relationships of its main features, disrupting and blocking facial detection. Harvey points out that the project should not be understood as a finished product or pattern, but rather as a camouflage strategy that must be specifically designed in relation to an algorithm and a unique face.
The Manifesto Collection, by Cap_able – an Italian fashion-tech start-up focused on ethics and technology – features a selection of cotton knitwear that transfers adversarial patches onto fabric using an innovative, patented knitting technique. At first glance, adversarial patches look like mere abstract patterns, but they are in fact carefully designed graphics that can block recognition or cause facial biometrics to be associated with the wrong categories – animals instead of people, for example – thus deceiving the detectors. The technique incorporates these algorithm-generated patterns into the pieces’ texture to ensure a perfect fit – the knitwear exquisitely adapting to the body’s volumes – without compromising protection. Its effectiveness was tested against YOLO, one of the fastest and most widely used real-time object detection systems. The pieces are not merely provocations: they can be bought and worn on the street, offering another unobtrusive tool against facial recognition.
With Tricking Biometrics, a project presented at Dutch Design Week 2014, Alix Gallet wittily addresses the invasiveness of biometric technologies by proposing a collection of accessories consisting of fake body parts (ears, noses, fingers). By wearing them, the artist explains, one can modify, move or multiply facial features and elements in order to confuse recognition systems and prevent identification. The aesthetics of the accessories, which are reminiscent of carnival costumes, and the idea behind the project are in line with the artist’s aim of highlighting the absurdity and contradictions of the development of these technologies, which collect an infinite amount of data on our physical identity without any guarantee as to how and by whom they will be used.
OFF Pocket, another project by Adam Harvey, stems from the artist’s personal exploration of privacy and focuses on the smartphone – perhaps the most widely used device in everyday life and certainly the one that creates and shares the most data. Even when we are not using it, the smartphone keeps tracking our position, not only through GPS but also through Wi-Fi and cellular networks. It can therefore log all our movements: where we go, how we move, where we stop, and so on. The project consists of a fabric smartphone case that acts as a Faraday cage, shielding all signals from 700 MHz to 5 GHz and isolating the phone from tracking and eavesdropping. It is part of the collection of the Design, Architecture and Digital department of the Victoria and Albert Museum in London, but it is not currently on the market due to excessive production costs.
Designed by the Austrian architects of studio Coop Himmelb(l)au and presented at the ‘Abiti da Lavoro’ exhibition at the Milan Triennale in 2014, the Jammer Coat is a cloak that shields individuals from the surveillance of smart devices, allowing them to disconnect and safeguard their privacy. The coat is made of metallic fabrics that block radio, internet, and smartphone signals, thus acting as a shield against surveillance. Stored in the jacket’s inside pockets, mobile phones and tablets remain isolated and virtually unreachable. The design of the jacket itself is intended to evoke the principle of the individual’s physical freedom, hiding the shape of the wearer’s body with an overabundance of grooves that, together with patterned trails of black dots and lines, create the illusion of strange multiple body parts.
The Kovr project, by Marcha Schagen and Leon Baauw, was also created with the aim of designing purchasable clothing items capable of protecting the individual from surveillance by smart devices and allowing them to disconnect. The project consists of a jacket made of several layers of metallic fabrics that shield and isolate incoming and outgoing signals, working on the principle of the Faraday cage. However, the jacket still offers the opportunity to stay connected thanks to unshielded black pockets that can be used when one wants to be reachable. This gives users the freedom to choose when to connect and when to isolate themselves. In addition to the jacket, backpacks and bags have been designed in the same material, to shield the many digital devices we carry every day.
Accessories For The Paranoid explores an alternative approach to digital privacy, proposing a series of objects designed to counteract the surveillance carried out by many very common devices that operate in the domestic and private context – reading, collecting, and storing an impressive amount of user information. The accessories are parasitic objects that, once attached to a device, generate random data with the aim of ‘polluting’ the user’s own data and thus falsifying the profiles that algorithms build of them. One of these objects, for example, sits on the computer’s webcam and works like a toy camera, feeding the webcam random photographs that confuse it. Another interferes with Echo, Amazon’s speaker featuring the Alexa voice assistant, disrupting it with white noise or random conversations that likewise confuse the algorithms. Yet another connects to the computer and browses autonomously, randomly visiting sites such as Google, Facebook, YouTube, Twitter, or Amazon, leaving false traces.
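The last of these parasites – the one that browses autonomously to leave false traces – boils down to a randomized visit schedule. The sketch below illustrates that idea only; the site list, timing ranges, and function names are hypothetical, and a real device would issue actual requests on such a schedule rather than just printing it.

```python
import random

# Hypothetical pool of popular sites the parasite "visits" to pollute a profile.
SITES = ["google.com", "facebook.com", "youtube.com", "twitter.com", "amazon.com"]

def noise_schedule(n_visits, seed=None):
    """Generate a random sequence of (delay_seconds, site) visit events."""
    rng = random.Random(seed)  # seeded for reproducible demos
    return [(rng.randint(30, 600), rng.choice(SITES)) for _ in range(n_visits)]

# Print a small fake browsing plan.
for delay, site in noise_schedule(5, seed=42):
    print(f"wait {delay}s, then visit {site}")
```

Because the choices are uniform over unrelated sites, the resulting traces carry no signal about the real user – which is precisely the point of the pollution strategy.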
In recent years, voice assistants have invaded the homes and offices of users who are often unaware of these devices’ ability to listen to everything said around them, recognize who is speaking and their tone of voice, and track all their interactions with connected objects. Project Alias consists of a filter, installed on top of the voice assistant, that continuously emits white noise to prevent the device from hearing what is going on around it and thus from monitoring the user. When users want to use the voice assistant, they interact with Alias instead, which is activated by a trigger word chosen by the user during configuration. At that point, Alias disables the background white noise and allows the user to interact directly with the voice assistant. When the interaction is over, Alias reactivates the white noise to ensure privacy.
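The gating behavior described above can be mirrored in a toy state machine. This is only a conceptual sketch of the jam-then-passthrough cycle; the class name, trigger phrase, and return strings are illustrative, not Alias’s actual firmware.

```python
class Alias:
    """Toy model of Project Alias: jam the assistant's microphone with
    white noise until the user's chosen trigger word is heard, pass
    speech through during the interaction, then resume jamming."""

    def __init__(self, trigger="hey alias"):
        self.trigger = trigger
        self.jamming = True  # default state: assistant hears only noise

    def hear(self, phrase):
        if self.jamming:
            if phrase == self.trigger:
                self.jamming = False   # wake word: stop the noise
                return "noise off"
            return "white noise"       # everything else is masked
        return phrase                  # passthrough to the assistant

    def done(self):
        self.jamming = True            # interaction over: jam again

a = Alias()
print(a.hear("alexa, play music"))     # masked while jamming
print(a.hear("hey alias"))             # trigger word opens the channel
print(a.hear("alexa, play music"))     # now heard verbatim
```

The key design point is that the default state is jamming: privacy is the baseline, and listening is the explicitly requested exception.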
Project Seen aims to raise awareness of the tracking of the textual content we share via social media, messaging, email, and so on. In particular, it shows how certain words, known as trigger words, immediately make us a target for tracking by security systems and others. Emil Kozole has therefore created a font, Seen, which comes pre-loaded with a set of sensitive words that the NSA and other agencies use to screen the documents they check. These trigger words are struck through with a black line as soon as they are typed: the resulting text shows the user which topics make them most vulnerable to surveillance, while at the same time obscuring those words. Seen can be used in all popular software and in the browser.
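Conceptually, what the typeface does can be approximated in a few lines of code: overlay a combining strikethrough character (U+0336) on any word found in a watch list. The word list below is a made-up placeholder, not the set actually shipped with Seen.

```python
STRIKE = "\u0336"  # combining long stroke overlay, drawn through the preceding character

# Hypothetical trigger words; the real font reportedly uses terms drawn
# from published agency keyword lists.
TRIGGER_WORDS = {"surveillance", "encryption", "leak"}

def mark_seen(text):
    """Strike through any trigger word, leaving other words untouched."""
    out = []
    for word in text.split():
        if word.lower().strip(".,!?") in TRIGGER_WORDS:
            # interleave the combining stroke after every character
            word = "".join(ch + STRIKE for ch in word)
        out.append(word)
    return " ".join(out)

print(mark_seen("strong encryption resists surveillance"))
```

Unlike this sketch, the font does the marking at the rendering level, so the flagging happens live in any application as the user types.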
Captcha Tweet also addresses the issue of tracking and recognizing textual content on the web, particularly on social media, where our tweets and comments can easily be analyzed to extract behavioral and emotional data about us, building an increasingly detailed profile. The app allows users to post tweets in the form of a Captcha – the famous test used to distinguish human users from bots – thus protecting the message from artificial intelligences that cannot read it. The project treats the Captcha as a symbol of the relationship between human and computer vision and applies it to social media communication.
Unfit Bits addresses the theme of self-tracking, focusing on the devices that record our physical and sporting activity. The project’s name is a play on Fitbit, one of the most important companies in the wearables sector, acquired by Google in 2021. It consists of a series of everyday solutions developed by Brooklyn-based engineer, artist, and investigative journalist Surya Mattu to spoof fitness trackers – wearable devices that monitor physical activity and share and sell the data we generate by being active. Some insurance companies have offered fitness trackers to customers in exchange for discounts on life insurance policies, since the devices let insurers track the user’s physical activity. Unfit Bits, in addition to guaranteeing privacy, makes it possible to simulate hitting the daily activity targets set by each policy using very simple means: the tracker can be attached to a drill, whose rotation simulates running, or to a metronome, which produces a different pattern of movement. Each solution yields a different profile of physical activity.
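The trick works because a naive step counter only sees periodic acceleration peaks, wherever they come from. A minimal simulation (with made-up sampling parameters, not any real tracker’s firmware) shows a 2 Hz rotation registering the cadence of a brisk walk:

```python
import math

def fake_motion(freq_hz, duration_s, sample_rate=50):
    """Simulate the periodic 'acceleration' a tracker would see when
    strapped to a rotating drill (hypothetical parameters)."""
    n = int(duration_s * sample_rate)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate) for i in range(n)]

def count_steps(signal, threshold=0.5):
    """Naive pedometer: count upward crossings of the threshold."""
    steps = 0
    for prev, cur in zip(signal, signal[1:]):
        if prev < threshold <= cur:
            steps += 1
    return steps

# A 2 Hz "drill" worn for 60 seconds yields one peak per rotation,
# which the counter duly logs as 120 steps.
print(count_steps(fake_motion(2.0, 60)))  # → 120
```

Since the counter cannot distinguish a drill from a jog, any source of regular peaks at walking frequency earns the policy discount.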