The founder of a popular dating app has developed a feature to protect its users from receiving inappropriate images.
Andrey Andreev, who founded the dating app group which contains Bumble, has announced the brand new feature this week.
The technology will identify inappropriate images and automatically blur them, while alerting the user to the nature of the content.
Said to identify lewd images with 98% accuracy, the technology will be rolled out to all Bumble users from June 2019.
Users of Badoo, Lumen and Chappy (which are part of the same dating group) will all benefit from the same feature integrated into their app experience.
“The safety of our users is without question the number one priority in everything we do and the development of ‘Private Detector’ is another undeniable example of that commitment,” said Andreev.
“The sharing of lewd images is a global issue of critical importance and it falls upon all of us in the social media and social networking worlds to lead by example and to refuse to tolerate inappropriate behaviour on our platforms.”
Whitney Wolfe Herd, founder and CEO of Bumble, is also working with Texas state lawmakers to develop a bill that would make sharing indecent images a punishable crime.
“The digital world can be a very unsafe place overrun with lewd, hateful and inappropriate behaviour. There’s limited accountability, making it difficult to deter people from engaging in poor behaviour,” added Wolfe Herd.
“The ‘Private Detector,’ and our support of this bill are just two of the many ways we’re demonstrating our commitment to making the internet safer.”
This isn’t Bumble’s only innovation in the dating app world: earlier this year it launched a star sign filter that suggests whether you are romantically compatible based on your zodiac sign.