Sexualised deepfakes created ‘every day’ in Northern Ireland, Assembly told
By Gráinne Ní Aodha, Press Association
Sexualised deepfake images are being created “every day” in Northern Ireland, the Assembly has been told.
NI justice minister Naomi Long said that she was “deeply concerned about the exponential rise” in non-consensual sexualised deepfakes, and said she would legislate against the creation of such images of adults by March.
She said that the use of sexualised deepfakes by minors against their peers is already illegal, and warned it “could have serious consequences for them for the rest of their lives”.
A motion on artificial intelligence (AI) and online violence against women and girls, brought by SDLP MLAs Cara Hunter and Colin McGrath, was passed by the Assembly on Monday.
Ms Hunter has spoken out previously about how she feared a deepfake pornographic video of her shared online in April 2022 would ruin her political career.
The motion expresses “grave concern” at the use of AI to perpetrate online violence against women and girls and condemns the creation and sharing of non-consensual intimate images, including AI-generated deepfakes, as a serious form of abuse.
It also calls on Ms Long to ensure that effective criminal offences, enforcement powers, and victim protections are in place no later than those in England and Wales.
There has been strong criticism of AI after tech billionaire Elon Musk’s chatbot Grok started producing non-consensual sexualised deepfakes of women and minors on the social media site X.
Speaking in the chamber on Monday, Ms Hunter said the motion sought to recognise a “new and rapidly evolving entry point for that violence, one that this Assembly can no longer afford to ignore”.
“Deepfake and so-called nudification apps represent a profound abuse of technology; they are designed to violate, humiliate and control,” she said.
She said that while women and girls are the main targets, the apps can also be used against men and boys.
“These apps are being used every day in Northern Ireland, in our schools, in our workplaces and sadly, in our communities.
“Children are having their images manipulated and shared before they fully understand what is actually happening. Lives are being altered in minutes, sometimes irreversibly.”
She shared some testimonies from people who had been victims of sexualised deepfakes, including a student and TV presenter Grainne Seoige, who last week said a deepfake of her was “deeply traumatising”.
Ms Long said Grok had brought into focus how technology could also be used by those with “malign intent” against women and girls.
She said she had introduced protections for women and girls, both online and in person, and new laws would strengthen measures to tackle abusive imagery.
“Sadly, those with malign intent always will find new ways to continue this abuse,” she said.
“Now women and girls have to face the growing threat of having their image sexually manipulated online and I am deeply concerned about the exponential rise in the proliferation of non-consensual sexually explicit deepfake image abuse online and, unfortunately, AI tools have made it easier to create those abusive images.”
She said this was a form of sexual abuse targeted mostly at women and girls, and that the images were created to “humiliate, to harm and to abuse them”.
She added: “Currently there is no legislation governing sexually explicit deepfake images of adults, but I am determined to address this gap and the threat and harm caused by these images.
“I will introduce a new deepfake offence by way of an amendment. It will be tabled at Consideration stage of the Justice Bill which is currently before the Assembly.
“The aim is to have the offence completed in March.”