Bumble Partners with SWGfL to Help Prevent the Sharing of Private Images

At Bumble Inc., the safety of our community is our top priority. While the sharing of nude or sexual images without consent is unfortunately prevalent across the internet, we don’t allow this behavior across our apps. 

To reinforce these values, we’ve joined forces with U.K.-based nonprofit SWGfL on a project to help stop non-consensual intimate images from being shared online, alongside industry peers TikTok, Facebook, and Instagram.

This partnership represents the first worldwide initiative of its kind, and aims to support people concerned that photos or videos featuring nudity or sexual content may be shared without their consent. The tool uses first-of-its-kind hashing technology to prevent private, intimate images from being shared across the tech platforms participating in the initiative. The intention is to empower members of these communities to regain control from perpetrators: those being threatened with intimate image abuse can create unique identifiers of their images, also known as hashes or digital fingerprints.

If you’re concerned that an intimate image has been posted (or might be posted) to online platforms including Bumble or Badoo, you can create a case through the tool to proactively detect these photos.

The tool features hash-generating technology that assigns a one-off numerical code to an image, creating a secure digital fingerprint without ever having to share the photo or video in question with anyone. Tech companies participating in the initiative receive that “hash” and can use it to detect whether someone has shared (or is trying to share) those images on their platforms.
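To make the idea concrete, here is a minimal sketch of computing a digital fingerprint of an image locally. This is illustrative only: it uses a standard SHA-256 cryptographic hash as a stand-in, whereas the real tool may use different (e.g. perceptual) hashing technology. The key property shown is that only the short fingerprint string would ever be shared, never the image bytes themselves.

```python
import hashlib


def image_fingerprint(path: str) -> str:
    """Return a hex fingerprint of an image file without sharing its contents.

    Illustrative sketch: SHA-256 stands in for whatever hashing the real
    tool uses. The image bytes never leave the device; only this short
    hex digest would be submitted to participating platforms.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so even large photos/videos hash without
        # loading the whole file into memory.
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

A platform holding this fingerprint can compare it against hashes of uploaded content: an exact byte-for-byte copy of the image produces the same digest, so a match flags the upload without the platform ever needing the original image.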

While participating companies use the hash they receive to identify images that someone has shared (or is trying to share) on their platforms, the original image never leaves the owner’s device, be it a smartphone or laptop. Only hashes, not the images themselves, are shared with participating tech platforms. This prevents further circulation and keeps those images securely in the possession of the owner.

The tool is for adults over 18 years old who think an intimate image of them may be shared, or has already been shared, without their consent. For people under 18, other resources and organizations can offer support, including the National Center for Missing & Exploited Children (NCMEC).