
AI makes deepfake pornography more accessible, as Canadian laws play catch-up

The technology required to create convincing fake pornography has existed for years, but experts warn it's faster and more accessible than ever, creating an urgent challenge for Canadian policymakers.

B.C. recently became the latest province to pass laws allowing people to take down explicit content of themselves online

Explicit digitally-altered photos of Taylor Swift, shown here attending an NFL game on Dec. 31, 2023, have renewed calls for better laws involving deepfakes. (Ed Zurga/The Associated Press)

Underage Canadian high school girls are being targeted with AI tools that create fake explicit photos that spread online. Google searches bring up multiple free websites capable of "undressing" women in a matter of minutes.

The technology required to create convincing fake pornography has existed for years, but experts warn it's faster and more accessible than ever, creating an urgent challenge for Canadian policymakers.

Advances in artificial intelligence have made it possible to do with a cellphone what once would have required a supercomputer, said Philippe Pasquier, a professor of creative AI at Simon Fraser University in B.C.

Pasquier said society has "lost the certainty" of what is real and what is altered.

This image, made from a fake video featuring former U.S. president Barack Obama, shows elements of facial mapping technology that lets anyone make videos of real people appearing to say things they've never said. (The Associated Press)

"The technology got a little better in the lab, but mostly the quality of the technology that anyone and everyone has access to has got better," he said.

"If you increase the accessibility of the technology, that means good and bad actors are going to be much more numerous."

Across Canada, legislators have been trying to keep up. Eight provinces have enacted intimate image laws, but only half of them refer to altered images.

B.C. recently became the latest, joining Prince Edward Island, Saskatchewan and New Brunswick.

The B.C. law, which came into effect on Jan. 29, allows people to go to a civil resolution tribunal to get intimate images taken down, regardless of whether they are real or fake, and go after perpetrators and internet companies for damages.

Individuals will be fined up to $500 per day and websites up to $5,000 a day if they don't comply with orders to stop distributing images that are posted without consent.

B.C. Premier David Eby said no one was immune to deepfake 'attacks,' where images of individuals are used to make fake, explicit versions of those images. (Ben Nelms/CBC)

Premier David Eby said the recent sharing of fake images of pop star Taylor Swift proved no one was immune to such "attacks."

Attorney General Niki Sharma said in an interview that she is concerned people don't come forward when they are the victim of non-consensual sharing of intimate images, real or not.

"Our legal systems need to step up when it comes to the impacts of technology on society and individuals, and this is one part of that," she said of the new legislation.

The province said it couldn't provide specific data about the extent of AI-altered images and deepfakes.

But cases have occasionally been made public elsewhere.

In December, a Winnipeg school notified parents that AI-generated photos of underage female students were circulating online.

At least 17 photos taken from students' social media were explicitly altered using artificial intelligence. School officials said they had contacted police and made supports available for students directly or indirectly affected.

Manitoba has intimate image laws, but they don't refer to altered images.

WATCH | Why 'sextortion' continues to be an issue in Canada:

Internet safety expert Brandon Laur speaks with CBC News: Compass host Louise Martin about the dangers of sharing intimate images online. (Duration: 6:13)

Victoria-based internet safety company White Hatter recently conducted an experiment and found it took only minutes using free websites to virtually undress an image of a fully clothed woman, something CEO Brandon Laur called "shocking."

The woman used in the experiment wasn't real; she was also created with AI.

WATCH | Taylor Swift deepfakes taken offline. It's not so easy for regular people:

Fake, AI-generated sexually explicit images of Taylor Swift were feverishly shared on social media until X took them down after 17 hours. But many victims of the growing trend lack the means, clout and laws to accomplish the same thing. (Duration: 1:47)

"It's pretty surprising," Laur said in an interview. "We've been dealing with cases [of fake sexual images]since the early 2010s, but back then it was all Photoshop.

"Today, it's much simpler to do that without any skills."

Legal avenues, new and old

Angela Marie MacDougall, executive director of Battered Women's Support Services, said her organization was consulted about the B.C. legislation.

She said Swift's case underscored the urgent need for comprehensive legislation to combat deepfakes on social media, and applauded the province for making it a priority.

But the legislation targets non-consensual distribution of explicit images, and the next "crucial step" is to create legislation targeting creators of non-consensual images, she said.

Angela Marie MacDougall with Battered Women's Support Services says the province needs to make it so anyone can apply to get any images taken down, not just explicit ones. (CBC News)

"It's very necessary," she said. "There's a gap there. There's other possibilities that would require having access to resources, and the women that we work with wouldn't be able to hire a lawyer and pursue a legal civil process around the creation of images because, of course, it costs money to do that."

But other legal avenues may exist for victims.

Suzie Dunn, an assistant law professor at Dalhousie University in Halifax, said there were several laws that could apply to deepfakes and altered images, including those related to defamation and privacy.

"There's this new social issue that's coming up with AI-generated content and image generators and deepfakes, where there's this kind of new social harm that doesn't fit perfectly in any of these existing legal categories that we have," she said.

She said some forms of fakery could deserve exceptions, such as satire.

"As technology evolves, the law is constantly having to play catch-up and I worry a bit with this, that there might be some catch-up with this generative AI."

Deepfakes 'accelerating' misrepresentation

Pablo Tseng, an intellectual property lawyer in Vancouver, said deepfakes are "accelerating" an issue that has been around for decades: misrepresentation.

"There's always been a body of law that has been targeted towards misrepresentation that's been in existence for a long time, and that is still very much applicable today to deepfakes, (including) the torts of defamation, misrepresentation or false light, and the tort of misappropriation of personality."

But he said specific laws, like the B.C. legislation, are steps in the right direction.

WATCH | B.C. introduces act to take down intimate online images:

Attorney General Niki Sharma announced two new services coming into effect on Jan. 29 through the Intimate Images Protection Act. The services will help stop the distribution of explicit images without people's consent, as well as provide victims with easy access to support and legal tools. (Duration: 2:35)

Tseng said he knew of one Quebec case that showcased how the misuse of deepfake technology could fall under child pornography laws. That case led to a prison sentence of more than three years for a 61-year-old man who used AI to produce deepfake child pornography videos.

But Tseng said he wasn't aware of any judgment in which the technology is referenced in the context of misrepresentation.

"It's clear that just because no judgment has been rendered doesn't mean that it isn't happening all around us. Taylor Swift is but the latest example of a string of other examples where celebrities' faces and personalities and portraits have simply been misused," he said.

British Columbians who have had intimate images of themselves posted online without their consent now have a civil process to have those images expeditiously taken down. (iHaMoo/Shutterstock)

Dunn said she believed content moderation by websites was likely the best way forward.

She called on search engines like Google to de-index websites primarily focused on creating sexual deepfakes.

"At a certain point, I think some people just give up, even people like Scarlett Johansson or Taylor Swift, because there's so much content being produced and so few opportunities for legal recourse because you would have to sue every individual person who reshares it," Dunn said.

She said while most video deepfakes involve celebrities, there are cases of "everyday women" being targeted.

"All you need to have is one still image of a person, and you can feed it into these nude image generators and it just creates a still image that looks like they're naked, and most of that technology only works on women."