STISA calls on hotlines, helplines and operational services to demand the tools needed to support survivors of online sexual violence:
If you don't ask, you won't receive.
Survivors of IBSA have an immediate and pressing need: to ensure that their intimate content, whether authentic or fabricated (with AI, in the case of deepfakes), is removed and not republished in any public space online.
To achieve this, they’re advised to contact the platforms where the content is published, seek guidance from a helpline, and file a police report. Hotlines can help take the content down, with a success rate of up to 90%.
That said, as of 2024 there is no robust global mechanism that ensures content reported in one place can be removed everywhere, at all times.
Case management for survivor support
For processing requests for the removal of abusive content, STISA recommends SCARt and HARt, a state-of-the-art, open-source, cloud-based case management solution developed by S3Group and built on the trusted AbuseIO framework. The system has proven effective at managing complex abuse reports securely and efficiently, particularly in child sexual exploitation cases. It has also been shown to reduce manual workload, improve data accuracy, and enable secure information sharing with trusted partners and authorities. SCARt/HARt has proven to be 5× more efficient than the tools currently in use (e.g. spreadsheets and email).
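As an illustration only (the names, fields, and statuses below are hypothetical and do not reflect SCARt/HARt's actual schema), a takedown request in such a system can be modelled as a structured case record with an auditable status lifecycle, replacing the spreadsheets and email threads mentioned above:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class CaseStatus(Enum):
    RECEIVED = "received"      # report submitted by survivor or helpline
    VERIFIED = "verified"      # analyst confirmed the content is abusive
    NOTIFIED = "notified"      # takedown request sent to the platform
    REMOVED = "removed"        # platform confirmed removal
    ESCALATED = "escalated"    # forwarded to a trusted partner or authority


@dataclass
class TakedownCase:
    case_id: str
    content_url: str
    platform: str
    status: CaseStatus = CaseStatus.RECEIVED
    history: list[tuple[datetime, CaseStatus]] = field(default_factory=list)

    def advance(self, new_status: CaseStatus) -> None:
        """Record every status change with a timestamp, so each case
        carries a complete, shareable audit trail."""
        self.history.append((datetime.now(timezone.utc), new_status))
        self.status = new_status


# Example: one report moving through the lifecycle.
case = TakedownCase("CASE-0001", "https://example.com/abusive-post", "example.com")
case.advance(CaseStatus.VERIFIED)
case.advance(CaseStatus.NOTIFIED)
case.advance(CaseStatus.REMOVED)
print(case.status.value, len(case.history))
```

The point of structuring cases this way is that status, platform, and history become queryable fields rather than free text, which is where the efficiency gain over spreadsheets and email comes from.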
Preventing re-upload and re-victimisation
To prevent the re-upload of verified abusive content, STISA endorses Nexus, a hash database developed by Videntifier. Each image or video is digitally fingerprinted ("hashed"), allowing participating platforms to detect and remove matching content without the image itself ever being exposed. Control over what is hashed and shared always remains with the hotline or helpline, and content is shared only with the survivor's explicit consent. Nexus has proven to be over 6,000× more efficient at detecting videos than the most commonly used technology for harmful video identification.
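A minimal sketch of the underlying idea, in Python. Nexus relies on Videntifier's visual-fingerprinting technology, which matches perceptually similar content; the exact-match SHA-256 hashing below is a deliberate simplification to show the privacy property: platforms compare opaque digests, never the images themselves.

```python
import hashlib

# Digests of verified abusive content, shared by the hotline with the
# survivor's consent. Only these opaque fingerprints are distributed;
# the images and videos themselves never leave the hotline.
BLOCKED_HASHES: set[str] = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def fingerprint(data: bytes) -> str:
    """Exact-match digest. Production systems use perceptual or visual
    fingerprints that survive re-encoding, resizing, and cropping."""
    return hashlib.sha256(data).hexdigest()


def should_block_upload(data: bytes) -> bool:
    """Check an incoming upload against the shared hash list."""
    return fingerprint(data) in BLOCKED_HASHES


# The sample hash above is the SHA-256 digest of b"test".
print(should_block_upload(b"test"))  # True
```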
With new technologies come new threats, but also new ways to protect against harm. Artificial intelligence is abused to create fake intimate images and videos that harm the people depicted in them, yet it is also essential for detecting newly produced harmful content.
STISA and the members of its network look forward to developing and implementing new technologies to make the world safer.
So, if you have a great technology or solution that can make the world safer from IBSA, contact us and explore with us how we can collaborate.