
No criminal charges laid after AI-generated fake nudes of girls from Winnipeg school posted online

Case highlights gap in Canadian law around sexualized deepfakes, expert says

Doctored photos of female students at Collège Béliveau, a Grade 7-12 French-immersion school in the city's Windsor Park area, were discovered by school officials after students came forward to report that the images were being shared online in December. (Travis Golby/CBC)

Police say no charges have been laid after an investigation into AI-generated nude photos of underage girls that circulated at a Winnipeg school late last year.

The doctored photos of female students at Collège Béliveau, a Grade 7-12 French-immersion school in the city's Windsor Park area, were discovered by school officials after students came forward to report that the images were being shared online in December.

The school said at the time that the original photos appeared to have been gathered from publicly accessible social media, then explicitly altered using artificial intelligence.

Winnipeg police said this week their investigation into the incident has concluded and no charges have been laid.

Winnipeg Police Service spokesperson Const. Dani McKinnon said that, generally speaking, charges may not be laid for a number of reasons, including potential evidence issues, victims' desire to move forward with the case, the likelihood of conviction and the nuances of AI-related crimes and the law.

"At the end of the day, it is our understanding that all involved parties were satisfied with this final decision," McKinnon said in an email.

The Louis Riel School Division, which includes Collège Béliveau, would not say how many photos were shared, how many girls were victimized or whether the person or people involved in creating the images had been identified.

Suzie Dunn, an assistant law professor at Dalhousie University's Schulich School of Law, said the case highlights a gap in Canadian law when it comes to addressing sexualized deepfakes: images that have been doctored using artificial intelligence.

"Many of the early intimate image laws were created before deepfakes were in the public presence, and so many of them didn't include the term 'altered images,'" said Dunn, whose research examines laws and policies around sexual violence facilitated by technology.

Dunn said an argument could be made that Canada's existing laws against sharing intimate images without consent should also cover deepfakes, but she hasn't seen anyone try that yet, in part because incidents like what happened at Collège Béliveau are so rare.

The only similar case she's seen so far happened in Quebec last year, when a 61-year-old man was sentenced to more than three years in prison for using artificial intelligence to produce synthetic videos of child pornography.

But those laws don't quite cover the Winnipeg incident, she said.

The Louis Riel School Division would not say how many photos were shared, how many girls were victimized or whether the person or people involved in creating the images had been identified. (Evan Mitsui/CBC)

Other factors in the lack of charges could include the possibility that those responsible were also minors themselves, Dunn said, since police can often use discretion in deciding whether to charge young people with crimes, or the possibility that victim co-operation was a challenge.

However, the latter wouldn't necessarily be a dealbreaker in a case with digital evidence like deepfake photos, she said.

'Hopefully we can learn lessons' from case: expert

The mother of one girl whose altered photos were among those circulated said she was disappointed to hear the update.

The mother, whom CBC News is not naming because that could identify her daughter, said she hopes to see more focus on educating students on the effects of sharing intimate and altered images, to avoid similar cases in the future.

Kaitlynn Mendes, an associate professor of sociology at Western University and Canada Research Chair in inequality and gender, said Manitoba is among many provinces whose curriculum does not recognize that sexual violence can happen online or address the possible legal consequences of online behaviour.

"It's really heartbreaking to hear about what happened in Manitoba, but hopefully we can learn lessons and use it as an opportunity to start having these conversations and also start pushing for change," said Mendes, who recentlyco-authored a reporton the subject, along with Dunn.

Shesaid that meansstudents in the province are "very likely to be ill-equipped" to respond to situations like the Collge Bliveau one.

Kaitlynn Mendes is an associate professor of sociology at Western University and Canada Research Chair in inequality and gender. (Prasanjeet Choudhury/CBC)

"They won't understand what rights they have. They won't understand places where they can go for help or support, if or when things go wrong," Mendes said.

"What I would say to teachers is that even though these topics may not be in the official curriculum, there can still be really great opportunities to talk to young people about what's going on."

In an emailed statement, the Louis Riel School Division said the incident has "underscored the importance of comprehensive learning about safe/healthy relationships and conduct online."

The division said it plans to provide in-class presentations about responsible internet use and consent for students, and expects to organize presentations for parents in the coming weeks.

Officials previously said explicitly altered images received by the school would be uploaded to Project Arachnid, a tool operated by the Winnipeg-based Canadian Centre for Child Protection, which can help get them deleted.