
B.C. 'sextortion' victims have new powers to scrub intimate images from the Internet

Tribunal tasked with assessing applications, seeking compensation

As artificial intelligence-generated nude photos of pop star Taylor Swift drew international attention to the problem of online sextortion, a new B.C. law that came into force Monday gives victims more power to get non-consensual intimate images scrubbed from the Internet.

AI-generated photos purporting to show the singer-songwriter in sexually suggestive positions were viewed tens of millions of times this past week on social media platform X before being taken down.

“If Taylor Swift is not immune from this, certainly British Columbians are not,” said Premier David Eby, speaking to reporters from Ottawa.

“The troubling developments over the weekend in relation to one of the most popular pop stars that has ever lived and recognizing that even someone with the wealth and authority and power that she has, that this could happen, is a call for all governments and tech companies to have a look at the laws and the frameworks and the safeguards that they have in place.”

A Victoria-based online security expert said artificial intelligence is increasingly being used by online predators to extort victims out of money and he praised B.C.’s legislation for keeping up with technology.

“Taylor Swift and her notoriety around the world has really shone a light on this issue, to bring it to the attention of parents and adults who can become a target for this type of crime,” said Darren Laur, a former Victoria police officer and online security consultant who advised the Attorney General’s Ministry as the law was being drafted.

The Intimate Images Protection Act provides expedited options for victims to get orders for images to be taken down and destroyed, and allows victims to claim compensation from people who posted the photos without permission.

Under the act, B.C.'s civil resolution tribunal has expanded its online portal to give people information on their rights and self-help tools to begin remedial action, as well as to connect them to community and mental-health supports.

The tribunal has the power to order a person, social media company or website to stop distribution and remove an image from its platform. These applications can be made without notice and without naming a respondent.

Laur’s team tested out an AI app that takes someone’s image and manipulates the photo so it appears the person is naked. Even though it is not the person’s actual body, the fake image can be used to extort them.

“It’s scary what [the app] can do,” he said. “The new Intimate Images Protection Act that we have here in the province, it’s the only legislation in Canada that deals with this issue.”

Laur acknowledged it is more challenging to go after someone who is unknown to the victim or who lives outside Canada. However, he said most of the cases he investigated involved someone known to the victim.

Eby announced last week the B.C. NDP will introduce legislation this spring allowing the government to sue social media giants such as Facebook and Meta for damages over the harms they cause.
