In 2024, increased adoption of biometric surveillance systems, such as the use of AI-based facial recognition in public places and for access to government services, will spur biometric identity theft and anti-surveillance innovations. Individuals attempting to steal biometric identities to commit fraud or gain unauthorized access to data will be bolstered by generative AI tools and the abundance of facial and voice data published online.
Voice clones are already being used for scams. Take, for example, Jennifer DeStefano, an Arizona mother who heard the panicked voice of her daughter crying “Mom, these bad men have me!” after receiving a call from an unknown number. The scammer demanded money. DeStefano was eventually able to confirm that her daughter was safe. This hoax is a precursor to more sophisticated biometric scams that will target our deepest fears, using the images and voices of our loved ones to coerce us into doing the bidding of whoever wields these tools.
In 2024, some governments are likely to adopt biometric mimicry to support psychological torture. In the past, a person of interest could be fed false information with little to support the claims beyond the interrogator’s words. Today, a person being questioned may have been arrested on the basis of a false facial recognition match. Dark-skinned men in the United States, including Robert Williams, Michael Oliver, Nijeer Parks, and Randall Reid, have been wrongfully arrested, detained, and jailed for crimes they did not commit because of erroneous facial identification. They are among the groups, including the elderly, people of color, and gender-nonconforming people, at higher risk of facial misidentification.
Generative AI tools also give intelligence agencies the ability to fabricate evidence, such as a video of an alleged accomplice confessing to a crime. Perhaps just as troubling, the power to create digital doubles is not limited to entities with big budgets. The availability of open source generative AI systems that can produce humanlike voices and fake videos will increase the circulation of revenge porn, child sexual abuse material, and more on the dark web.
By 2024 we will have a growing number of “excoded” communities and people, those whose life chances have been negatively altered by AI systems. At the Algorithmic Justice League, we have received hundreds of reports of compromised biometric rights. In response, we will witness the rise of the faceless, those committed to keeping their biometric identities hidden in plain sight.
Since biometric rights will vary around the world, fashion choices will reflect regional biometric regimes. Face coverings, like those worn for religious purposes or the medical masks used to prevent the spread of viruses, will be adopted as both fashion statements and anti-surveillance garments where they are permitted. In 2019, when protesters began destroying surveillance equipment while concealing their appearance, the Hong Kong government banned face masks.
In 2024, we will begin to see a bifurcation between mass surveillance territories and free territories, areas governed by laws such as the provision in the proposed EU AI Act that prohibits the use of live biometric data in public places. In those places the anti-surveillance trend will flourish. After all, facial recognition can be applied retroactively to video feeds. Parents will fight to protect children’s right to be “biometric naive,” meaning not to have any of their biometric data, such as face prints, voice prints, or iris patterns, scanned or stored by government agencies, schools, or religious institutions. New eyewear companies will offer lenses that distort cameras’ ability to easily capture ocular biometric information, and pairs of glasses will come with prosthetic extensions that alter the shape of the nose and cheeks. 3D printing tools will be used to make facial prosthetics at home, though depending on where you are in the world, this may be prohibited. In a world where the face is the final frontier of privacy, looking upon another person’s unaltered face will be a rare intimacy.