Bumble has released the source code for their AI algorithm for spotting unsolicited nudes

Bumble has been using machine learning to safeguard its members from indecent photographs since 2019. The feature, dubbed Private Detector, scans photos sent by matches to determine whether they contain inappropriate content. It was built primarily to catch unsolicited nude photos, but it can also flag shirtless selfies and images of firearms, both of which are prohibited on Bumble. When a positive match is found, the app blurs the offending image, giving you the option to view it, block it, or report the sender.

Bumble revealed in a recent blog post that it is open-sourcing Private Detector, making the framework available on GitHub. “We hope that the functionality will be embraced by the larger tech community as we work together to make the internet a safer place,” the company stated, while noting that it is only one of many players in the online dating industry.
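The article does not detail how the released code is meant to be wired into an app, but the general shape of such a filter is a binary image classifier sitting in front of the chat UI: score the incoming photo, then blur it and offer view/block/report actions if the score crosses a threshold. The sketch below illustrates that idea in Python with TensorFlow; the model path, input size, threshold, and output interpretation are assumptions for illustration, not details taken from the Private Detector repository.

```python
# Minimal sketch of a "blur or show" gate built around a binary image
# classifier. All paths, sizes, and thresholds below are illustrative
# assumptions, not values from Bumble's Private Detector release.
import tensorflow as tf

MODEL_DIR = "saved_model/private_detector"  # assumed location of an exported model
IMAGE_SIZE = 480                            # assumed input resolution
THRESHOLD = 0.5                             # assumed decision threshold

def preprocess(path: str) -> tf.Tensor:
    """Decode an image file, resize it, scale to [0, 1], and add a batch axis."""
    data = tf.io.read_file(path)
    image = tf.io.decode_image(data, channels=3, expand_animations=False)
    image = tf.image.resize(image, (IMAGE_SIZE, IMAGE_SIZE))
    image = tf.cast(image, tf.float32) / 255.0
    return tf.expand_dims(image, axis=0)    # shape: (1, H, W, 3)

def should_blur(model: tf.keras.Model, path: str) -> bool:
    """Return True when the classifier's score for the image crosses the threshold."""
    score = float(model.predict(preprocess(path), verbose=0)[0][0])
    return score >= THRESHOLD

if __name__ == "__main__":
    # Assumes the checkpoint was exported as a Keras-compatible SavedModel.
    model = tf.keras.models.load_model(MODEL_DIR)
    if should_blur(model, "incoming_photo.jpg"):
        print("Blur the photo and offer view / block / report.")
    else:
        print("Display the photo normally.")
```

In a production app, the classification would typically run server-side or on-device before the photo is rendered, with the threshold tuned to balance false positives against missed detections.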

Unwanted sexual advances are a common occurrence for many women, both online and offline. According to a 2016 study, 57 percent of women reported being harassed on dating apps, and a 2020 survey from the United Kingdom found that 76 percent of girls aged 12 to 18 had received unsolicited nude images. The issue extends beyond dating apps, with platforms such as Instagram developing their own remedies.
