People caught posting deepfakes on social media could face tougher penalties under a proposed law updated by a London MP.
Earlier this week, Parliament approved changes to Bill C-16 put forth by Andrew Lawton to address the posting of non-consensual “nude” and “nearly nude” A.I. images.
“We have to put victims front and centre and understand how technologies are jeopardizing online and offline safety,” said Lawton, the Conservative MP for south London, Elgin and St. Thomas.
Bill C-16 addresses a range of intimate partner violence, but Lawton’s amendments were focused on the online issues that A.I. technology has made more urgent. Under his changes, if Bill C-16 passes, social media platforms would need to remove any non-consensual images of “nude” or “nearly nude” people within 48 hours.
The amendments also introduced definitions of what types of images are considered illegal. The bill originally required sexual organs to be visible for an image to be illegal, but perpetrators found a loophole: covering even a single pixel of the image would make the post legal. That’s why Lawton wanted to expand the definition to “nearly nude.”
Perpetrators were mainly using the X/Twitter A.I. engine Grok to make the type of photos being banned, he said.
“People were using Grok because it was very easy to use and took no technology skill or knowledge to make these very sexual images of people who never consented to them,” Lawton said. “Thankfully Grok had to pull back on these features, but we saw it be very widespread.”
Lawton told XFM he had a personal connection to the issue of deepfakes being posted online: one of his friends was targeted by someone who disliked her, and that person used an A.I. engine to create an image that was “very embarrassing and sexually explicit.”
His amendment also introduced stiffer penalties for anyone caught making images or videos depicting sexual assault, he said.
The issue of A.I. being used to exploit victims has been on the radar of anti-gender-based violence organizations. Human traffickers have been using A.I. and other technology to exploit and control their victims in London by tracking victims remotely and threatening to release pornographic photos and videos that may not be real, the leader of Anova said last week.