
This Toronto teen was careful online, but she was still targeted with deepfake porn

According to 16-year-old Ruby from Toronto, one of the worst things that can happen to a person is finding a naked photo of themselves online.

And that’s exactly what happened to her, and it wasn’t her fault.

“All of a sudden I found myself in the worst-case scenario,” she said.

CBC News is not revealing Ruby’s last name because she was the victim of a worrying new trend – sexually explicit deepfakes of minors.

A deepfake is an image or video that has been altered or generated, usually with artificial intelligence, and is difficult to distinguish from the real thing.

Last year Ruby received a series of messages from someone saying there were pictures of her on the internet and asking her to click a link to see them. She asked to see them and was sent a topless deepfake of herself. The original photo of her fully clothed was taken when she was 13.

Lindsay Lobb, left, of the Canadian Centre for Child Protection, says deepfakes are often used to blackmail, harass or bully minors, and are easy to make because of the many sites and apps that will ‘nudify’ an image. (Mia Sheldon/CBC)

Ruby was taught how to be safe online and she does not post under her real name.

“I didn’t do anything wrong,” she said. “It just happened… I was filled with so much anger.”

“I have no idea who this person is.”

Ruby’s parents called Cybertip.ca, a national hotline for people to report sexually explicit images of minors. The tip line processed 4,000 explicit deepfakes last year, the first year it has tracked that number.

Lindsay Lobb, operations director of support services for the Canadian Centre for Child Protection, which operates Cybertip.ca, said the problem “continues to grow and evolve.”

Deepfakes are often used to blackmail, harass or bully minors, she says, and are easy to make because of the many sites and apps that will “nudify” an image.

‘Massive generational leaps’

There have been several high-profile cases of sexually explicit deepfakes circulating in Canadian and U.S. high schools.

In Canada, sexually explicit images of a minor, deepfake or not, are illegal and considered child pornography. Online and social media platforms say they have reported images found on their sites to police.

Online safety trainer Brandon Laur says it will become increasingly difficult to distinguish deepfakes from the real thing.

“Every year we will see huge generational leaps in the authenticity of these images,” he said.

Even now, he says, parents don’t always believe their children when they tell them the images aren’t real.

WATCH | Real or deepfake?

Can you spot a deepfake? How is artificial intelligence threatening elections?

Fake videos created by artificial intelligence are used for fraud and internet pranks, but what happens when they are created to interfere with elections? CBC’s Catharine Tunney explains how the technology could be weaponized and looks at whether Canada is ready for a deepfake election.

Laur says it’s unrealistic to expect people not to post online, but he wants to raise awareness that once an image is published, it’s nearly impossible to control what happens to it, even with strict privacy settings.

The RCMP and other police services have expressed concern about the emergence of such images. But legal recourse can be difficult, says Molly Reynolds, a lawyer at Torys LLP in Toronto who represents adult victims in civil cases.

Deepfakes can be made by former partners for revenge, by colleagues and students to threaten and bully, or by strangers in other countries.

“If a stranger takes your photo anywhere in the world and turns it into a deepfake, it can be very difficult to find a legal way to stop that in Canada,” Reynolds said.

Reynolds says victims can submit a takedown request to the site hosting the image (like Google or Meta).

After that, “there may be civil or criminal law avenues to allege harassment,” she said.

In Ruby’s case, police found no evidence that the image was distributed online. They believe the person contacting her was trying to break into her iCloud with an elaborate phishing scheme.

She’s still shaken and wants people to know this can happen to them.

“What we’re still taught about cybersecurity is that nothing ever leaves the internet, and to be safe, don’t take nude photos,” she said. “And that’s true. But now it’s a whole different game.”