Jacksonville Jaguars Fan Forums

Full Version: Facebook wants your nude photos
Somebody at Facebook will be "reviewing" the photos.

According to Facebook, the review will be carried out by one specifically trained member of their community operations safety team; however, the post doesn't describe what their training will consist of, or what the review will entail.

Let's hope this one person doesn't get overwhelmed "reviewing" thousands of nudes per day, or ever take a day off, or ever go on vacation.

Facebook says they will protect users by "hashing" the photos to create a unique "fingerprint".  Then, if an ex-partner tries to upload the photo for revenge, the system will detect the matching "hash" and prevent the upload.

Sounds good, right?  What could go wrong?  Let's do some math...

Photos taken with an iPhone 6 are 3264 x 2448, or 7,990,272 pixels.  There are 24 bits in 1 pixel, so we're talking about 191,766,528 bits in one photo.

IF EVEN 1 OF THOSE 191,766,528 BITS HAS BEEN ALTERED, THE HASH OF THE 2 PHOTOS WILL NOT MATCH, AND UPLOAD OF THE PHOTO WOULD NOT BE BLOCKED.
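You can demonstrate this "avalanche" property of a cryptographic hash in a few lines. This sketch uses SHA-1 (the same algorithm as the hashes shown below) on a stand-in byte string rather than a real photo; flipping a single bit produces a completely different digest:

```python
import hashlib

# A small byte string standing in for real 24-bit-per-pixel image data.
original = bytearray(b"example image bytes, not an actual photo")
altered = bytearray(original)
altered[0] ^= 0x01  # flip a single bit in the first byte

h1 = hashlib.sha1(bytes(original)).hexdigest()
h2 = hashlib.sha1(bytes(altered)).hexdigest()

print(h1)
print(h2)
print(h1 == h2)  # False: one flipped bit yields an entirely different hash
```

That's exactly what a cryptographic hash is designed to do, which is why an exact-match hash would be trivially defeated by the tiniest edit.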

Let's test it with a tiny photo containing only 7,600 pixels, roughly 1,000x fewer than the typical iPhone image:

 [Image: vPM6sm]  hash = 3f4ed12cfb96e07b4d4e26bdff734ec5ce783325


 [Image: tCd27Z]  hash = 3cf29540a38b64cf35b72006946ac5ba0fa421a3

Can you tell the difference between the 2 pictures?  Me neither.
Yet nothing would prevent someone from uploading the 2nd photo even if the first photo was blocked.

Nice try, Pervbook.
I think that if you are taking nudes and posting them online anywhere, even in a PM, you are basically publishing your own porn. Digital nudes = lots of problems later.
I doubt that the kind of people who upload nudes as revenge porn would have the mental capacity (or know-how) to alter the pixels in the photo, personally.

I mean, I don't know how to and I'm not even a (total) moron.
Forgive me for believing that people who take 400 selfies a day might be able to hit a "brightness" or "resize" button on the photo app.
(06-03-2018, 01:36 PM)Byron LeftTown Wrote: Forgive me for believing that people who take 400 selfies a day might be able to hit a "brightness" or "resize" button on the photo app.


You’re forgiven.


Sent from my iPhone using Tapatalk
Of course they want them,  I'm a magnificent specimen of human masculinity.
I already sent Zuckerberg my nude photos. Are those still good or do I need to resend more?
[Image: r6BFR3V.jpg]
Wants? Really... I don't think that's the best desire.
I don't usually venture into the Political section of the boards, but let's clear up some critical misconceptions regarding this.

1) A victim of NCII (non-consensual intimate imagery) would be in contact with a victim support group (either via direct contact or the relevant authorities would help make the connection).
2) The victim would work with this group to securely send images to Facebook's dedicated team (the suggestion here is that the reviewer would be the same sex as the victim, and someone who has passed the highest level of background checks and accreditation).
3) The reviewer validates that the images are NCII (vs some picture that a government doesn't want spread or other nonsense).
4) A perceptual hash is created from the image and the original is deleted. Perceptual hashes are quite a bit more sophisticated than a simple SHA1 or something; see http://www.phash.org/ for an example.

[Image: attachment.php?aid=73]
The RADISH algorithm reviewed the two photos and decided they were 99.9936% similar, which seems pretty accurate.
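pHash's RADISH algorithm is far more involved, but the core idea behind perceptual hashing can be sketched with a toy "average hash" (this is an illustration of the general technique, not Facebook's or pHash's actual algorithm; the sample pixel values are made up). Each hash bit records whether a pixel is brighter than the image's average, so uniformly brightening the image, which would change every bit of a SHA-1, leaves the perceptual hash identical:

```python
def average_hash(pixels):
    """Toy perceptual 'average hash': one bit per pixel, set when the
    pixel is brighter than the image's mean brightness."""
    avg = sum(pixels) / len(pixels)
    return [1 if p > avg else 0 for p in pixels]

def hamming_similarity(a, b):
    """Fraction of hash bits that match between two equal-length hashes."""
    return sum(1 for x, y in zip(a, b) if x == y) / len(a)

# A tiny "grayscale image" as a flat list of brightness values (0-255).
image = [10, 12, 200, 198, 11, 13, 201, 199, 10, 14, 202, 197]
# The same image uniformly brightened by 5: every bit of a cryptographic
# hash would change, but the bright/dark pattern does not.
brightened = [p + 5 for p in image]

print(average_hash(image) == average_hash(brightened))  # True
print(hamming_similarity(average_hash(image), average_hash(brightened)))  # 1.0
```

Real perceptual hashes work on frequency-domain features rather than raw pixels, which is how they also survive resizing, cropping, and recompression.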


NCII is a pretty significant issue, right alongside child exploitation images.

I'll bet you didn't know that Microsoft created and donated PhotoDNA in 2009 to the International Centre for Missing & Exploited Children. Microsoft, for free, hosts the service to safely and accurately detect images of child exploitation. It is used by Bing, OneDrive, Google Gmail, Twitter, Facebook, Adobe Systems and the National Center for Missing & Exploited Children.
Almost the exact same process is followed, with the exception that the photos are reviewed by ICMEC & NCMEC before being included in the database (a human must review each image to confirm its validity prior to it being broken down into the hash and added to the database).
More information on PhotoDNA: https://en.wikipedia.org/wiki/PhotoDNA and https://www.microsoft.com/en-us/photodna

Source on the NCII Facebook stuff: my own research & this thread (click through for the entire thread): https://twitter.com/alexstamos/status/99...4709415937

Yes, Facebook is a pretty big dumpster fire of a situation, but when a company is as big as Facebook, they think in a whole separate world and deal with an entirely separate set of problems. And no, they don't want your nudes, but they do want to make their life easier (and the world a slightly better place) by preemptively banning images so their team of moderators doesn't have to see them.

Think about it: people are already uploading the pictures. Then they get flagged, and a non-specialized moderation team has to look at the content and delete it. That takes a few hours and puts the image in front of hundreds of people. So the image will be seen either way; should it be seen by one or two dedicated specialists, or left public for a few hours while a dozen moderators review it?