Calgary teen accused of using AI to create sexually explicit deepfakes of female students
Posted Dec 3, 2025 10:20 am.
Last Updated Dec 3, 2025 7:29 pm.
A Calgary teenager is facing charges after police say he created sexually explicit deepfakes of female students from across the city.
The Alberta Law Enforcement Response Team (ALERT) says the 17-year-old used AI to sexualize photos of teen girls who attended several different Calgary high schools.
Officials began investigating in October after receiving a tip about materials being uploaded to social media. A search warrant was executed at a house in Calgary on Nov. 13, and officers seized two cell phones, a tablet, and a laptop.
“Anything to create child sexual abuse exploitation material is an offence,” says Staff Sgt. Mark Auger with ALERT ICE. “You can draw it, you can talk about it, you can write stories, or use digital means to create it, it is still a criminal offence.”
Auger says suspects in these types of incidents usually capture a picture of someone from social media and then use software to “nudify” the image.
“AI will then give a very accurate assessment of your body type, your skin colour, and make a near impossible to distinguish nude image with a face attached,” says Auger.
The teen suspect is facing several charges, including making, possessing, and distributing child sexual abuse material and exploitation materials, as well as criminal harassment.
He was released on numerous court-ordered conditions, including no contact with anyone under the age of 18 unless incidental through work or school, and a ban on possessing any internet-capable electronics except for work or school purposes. He is scheduled to appear in court on Jan. 8.
ALERT’s Internet Child Exploitation (ICE) unit isn’t identifying the suspect or the high schools involved in order to protect the identities of the victims.
The victims have been provided with support services, police say.
In 2024, Parliament passed an amendment replacing the term “child pornography” in the Criminal Code with “child sexual abuse and exploitation material.” Child advocates have argued that the term “pornography” implies consent, while the new term more accurately conveys the abuse and exploitation offenders inflict on victims. The provision came into effect in October 2025.