Survivors of IBSA have an immediate and pressing need: to ensure that their intimate content, whether authentic or fabricated (with AI, in the case of deepfakes), is removed and not republished in any public space online.
To achieve this, survivors are advised to contact the platforms where the content is published, seek guidance from a helpline, and file a police report. Helplines can help take the content down, with success rates of up to 90%.
That said, as of 2024, there is no robust global mechanism that allows content to be reported in one place and removed everywhere, at all times.
STISA is partnering with Videntifier to establish an actionable data platform that assists in the identification and removal of IBSA content across platforms.
With the support of the STISA Network, whose members receive daily reports of IBSA, we will develop the first centralized hash database of adult IBSA content. This database will be shared with interested platforms for effective removal of the content. Videntifier has extensive experience in this area, having already built such a hash list in collaboration with hotlines working in the field of child sexual abuse material (CSAM).
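To illustrate the general idea of hash-based matching, here is a minimal sketch in Python. All names and data in it are hypothetical, and it uses a cryptographic hash (SHA-256), which only catches exact copies; production systems such as Videntifier's use robust visual fingerprints that also survive re-encoding, cropping, and other edits.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of raw file bytes as a hex string."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical shared hash list distributed to partner platforms.
# In practice this would be populated from verified survivor reports.
known_ibsa_hashes = {sha256_hex(b"reported-content-bytes")}

def should_block(upload: bytes) -> bool:
    """Flag an upload whose hash appears in the shared list."""
    return sha256_hex(upload) in known_ibsa_hashes

print(should_block(b"reported-content-bytes"))  # True: exact match
print(should_block(b"unrelated-content"))       # False
```

The key design point is that only hashes, never the images themselves, are shared with platforms: the content cannot be reconstructed from a digest, so the database can circulate without re-exposing survivors.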
With new technologies come new threats, and new ways to protect against harm. Artificial intelligence is abused to create fake intimate images and videos that harm the people depicted in them, but it is also essential for detecting newly produced harmful content.
STISA and the members of its network look forward to developing and implementing new technologies to make the world safer.
So, if you have a great technology or solution that can make the world safer from IBSA, contact us to explore how we can collaborate.
Copyright © 2024 Survivors & Tech Solving Image Based Sexual Abuse - All Rights Reserved.
STISA is a fiscally sponsored project of Global Impact (501c3 organization).