When: June 9-20
Admission: Free (donations accepted)
Discriminator, director Brett Gaylor’s new independent film about facial recognition software, comes as close to a horror movie as anything Hollywood has ever produced — yet there’s nary a ghost, serial killer, or unexplained phenomenon anywhere in sight.
The Victoria filmmaker’s short film about how technology companies are worming their way into our lives has a doomsday quality to it, and should leave viewers with an overwhelming sense of dread. Will it? Probably not, which is a scarier proposition than much of what appears in Discriminator.
The film had its world premiere online Wednesday as part of the Immersive Lineup at this year’s Tribeca Film Festival in New York. From now through June 20, the film is available for free on www.discriminator.film. Those who watch it online have the option to try out facial recognition software, a startling first-person example of what Gaylor discovered during his investigation (and no, he won’t collect your data in the process).
The idea for the film came to the Galiano Island-born Gaylor, 44, when he discovered that pictures of his honeymoon he’d uploaded to photo-sharing site Flickr were being used by companies looking to skirt copyright law under the banner of Creative Commons licensing. In short, these companies took his photos because they could, and had free rein to use them at will.
That isn’t necessarily what bothered Gaylor. He had willingly uploaded his photos to the Internet, where ownership erodes pixel by pixel. The binding agreement between users and platforms in the online world? What was once yours is now theirs. Don’t like those terms? Don’t log on in the first place.
But when he discovered, through Freedom of Information Act requests, that photos of him and photos he had taken of others — including shots of him and his wife on their honeymoon — had been used by a variety of companies more than 5,000 times, including some in the military, he knew the situation deserved a deep dive.
Netflix documentary The Social Dilemma tackled the topic of unethical behaviour on social media. Gaylor mines similar territory, and came upon equally sinister discoveries — such as an inherent racial bias in facial recognition software, one that could eventually have lasting effects.
“By the shape of a person’s face, and their skull, they can detect race, and they can monitor that race to see what they’re doing and limit their movements. We have to say, ‘No, you can’t do that,’ in the same way X-ray goggles would violate privacy and be invasive. We have to think about what that looks like long-term. You can’t give that consent.”
Nothing in the world of facial recognition happens by accident, Gaylor said. With closed-circuit TV cameras on street corners in almost every city, citizens of the world are being filmed 24/7. These images become part of a universal “megaface” database. While some might disagree about the effective use of this information, few laws currently exist to protect those unknowingly being filmed.
“What we find is that oftentimes what you thought your image was going to be used for ends up being something that’s wildly different than you anticipated,” Gaylor said. “In my case, that spanned investment firms, the Chinese military, Google, the parent company of TikTok, and advertising companies. Literally every major player in artificial intelligence downloaded that dataset, which included pictures of me and my family.”