‘Revenge porn’ has become a pressing issue on social media. Given that nearly everyone has a profile on some platform, it is important that users are protected from this kind of abuse.
In response, Facebook has launched a pilot program to test an updated intimate-image detection system. The new system identifies images with the help of advanced artificial-intelligence technology, and the social media giant developed it in consultation with victims and experts.
Facebook’s pilot program is aimed at the problem of revenge porn: its purpose is to remove non-consensual intimate images before they are even reported by users, improving privacy and the overall experience. The filter has also been rolled out on Instagram, Facebook’s photo-sharing social network.
In recent years, the leaking of non-consensual intimate images has been a serious problem across the internet, and Facebook has come under heavy scrutiny for failing to take immediate, decisive action. The new image filters should help the company keep inappropriate pictures off its platforms.
Holly Jacobs, founder of the Cyber Civil Rights Initiative, said, “We are thrilled to expand the pilot to include more women’s safety organizations all over the world, because we have many requests from victims.”
The new AI-based filters will flag inappropriate content, after which a specially trained member of Facebook’s Community Operations team will review the post and take it down. The pilot program also gives users an emergency option: they can proactively submit intimate photos they do not want shared, so that those images can be blocked before they ever appear.
So far, Facebook has rolled out this new feature only in Australia.