Put the knife down and take a green herb, dude.
One feller's views on the state of everyday computer science & its application (and now, OTHER STUFF) who isn't rich enough to shell out for www.myfreakinfirst-andlast-name.com. Using 89% of the same design the blog had in 2001.
FOR ENTERTAINMENT PURPOSES ONLY!!!
Back-up your data and, when you bike, always wear white. As an Amazon Associate, I earn from qualifying purchases.
Saturday, August 07, 2021
There's been a lot of pixels spilt regarding Apple's plans to sniff out pictures of child exploitation on personal devices and, if I understand correctly, silently report someone to the National Center for Missing and Exploited Children if Apple thinks its algorithms have found such pictures. A couple of quick reactions:

First, the rhetorical power of child exploitation as a cover discourse for something else entirely over the last decade or so would be unbelievable to someone from the 1980s. What the absolute heck is going on? Claiming to stop exploitation has become a theoretical get-out-of-jail-free card, used to justify whatever someone already wants to be doing. And, in some cases thankfully -- like the fellow with the assault rifle at the pizza store -- it still appears to be a practical get-into-jail-quick card if you actually act on that coopted rhetoric. Still, it's a bizarre, collective neurosis.

Second, let's talk about what Apple's doing. It seems antithetical to their "privacy is in our DNA" claim, even though some Pizzagates are tragically real. Why take an anti-privacy stand now? Let's be frank: Apple has not looked especially privacy-minded this year. They gave up the privacy high ground when they announced, just last month, that they plan to literally start selling AAA privacy once iOS 15 ships. You don't pay, you don't get to be fully private on iOS.
Let's also admit that Apple isn't going to be able to build a successful system for sniffing out evil images for years. If you've been reading this blog for a while, you know I believe Apple can't QA software to save their life (QA is "Quality Assurance", here meaning the ability to test software to make sure it works well even in unanticipated situations). Here's one example:
Things will go wrong. Someone will end up suing Apple over a false positive. And one or two of those people may honestly have their lives ruined.

How hashing an image works

To understand why someone who shouldn't be charged is going to get charged, I want to describe how hashing and fingerprinting work, though I really don't want to get into the weeds. So let's grossly oversimplify and say it works like this: Take a seven-digit number, say 1234567. To "hash" it, keep only its 3rd, 5th, and 7th digits and throw the rest away. 1234567 hashes to 357.
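If you'd rather see that than read it, here's the toy scheme as a few lines of Python -- my own illustration for this post, nothing like a real cryptographic or perceptual hash:

    # Toy hash: keep only the 3rd, 5th, and 7th digits of a seven-digit
    # number. Purely illustrative -- not anything Apple actually ships.
    def toy_hash(n: int) -> str:
        digits = f"{n:07d}"                        # 1234567 -> "1234567"
        return digits[2] + digits[4] + digits[6]   # 3rd, 5th, 7th -> "357"

    print(toy_hash(1234567))  # "357"

    # Count every seven-digit value that collides with our target's hash
    # (brute force over all ten million values; takes a few seconds):
    matches = sum(1 for n in range(10_000_000) if toy_hash(n) == "357")
    print(matches)            # 10000 -- 1234567 itself plus 9,999 collisions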
In our case, matching our hash or fingerprint of 357 means only a 1 in 10,000 chance of actually having 1234567. That's a horribly large chance of a false positive. You could also have 0030507. Or 3335577. Or 2136567. We don't know for sure. Each of the 9,999 matches that isn't 1234567 is a collision. Even so, that's only 10,000 values out of ten million that we need to check behind those three hashed digits. Huge potential time savings.

Now, when "cyber-fingerprinting" large files like images, the numbers are VERY large (well over seven digits) and the hashing algorithm, though it will still have some collisions, is MUCH more exact. The chance of a false positive with true fingerprinting is, let's guess, about the same as or worse than winning the lottery. In any event, false positives are very rare. And you should appreciate that.
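For the curious, here's a sketch of one well-known fuzzy image fingerprint, the "average hash". To be clear, this is a generic stand-in of my own choosing -- emphatically not Apple's NeuralHash -- just enough to show what a fuzzy fingerprint even is:

    # A toy perceptual fingerprint ("average hash"), using Pillow.
    from PIL import Image

    def average_hash(img: Image.Image) -> int:
        """64-bit fuzzy fingerprint: shrink, grayscale, threshold on the mean."""
        small = img.convert("L").resize((8, 8))   # 64 coarse grayscale pixels
        pixels = list(small.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (p > mean)       # one bit per pixel
        return bits

    def hamming(a: int, b: int) -> int:
        """How many of the 64 bits differ between two fingerprints."""
        return bin(a ^ b).count("1")

Two visually similar images land within a few bits of each other, so a "match" usually means the Hamming distance is under some small threshold. That fuzziness is the feature -- and, as we're about to see, it's exactly what a deliberate spoof exploits.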
But eventually people do win the actual lottery, and, given enough people, someone will have a photo fingerprint collision. Someone will have a picture that, once hashed, matches the fingerprint of a known evil [no hyperbole intended at all] image. And their life could be ruined in a way that will make some identity thefts feel quaint.

Worse, given Apple's software record, there's going to be some bug that says the equivalent of "any number with a single 3, 5, or 7 in it matches," the National Center is going to receive thousands upon thousands of false reports, and we're going to bring down, at least briefly, the very system we're trying to support.

And if someone games the system, well, all bets are off. Someone is going to make a meme image specifically spoofed to match a fuzzy fingerprint from the database, it's going to get popular, and suddenly there are Pizzagates everywhere! No, really, no joke. It's going to happen.

Apple's true (and legitimate) motivation

Perhaps false positives are worth it to expose those who do exploit children. Certainly, in theory, I think a few ruined lives are worth the good that can come out of this if there's any meaningful reduction in exploitative imagery. And Apple has a clear motivation for doing this, an angle nobody's mentioned yet (that I've heard): Apple is hosting child pornography on their servers right now. Not maybe. They are. I can't say that with 100% certainty, but statistically, given a billion active devices, you know they are. There are too many sickos out there, sickos have phones, they have evil on those phones, and some of those phones are backed up to iCloud. That's a huge issue for Apple. That has to be Apple's motivation.

My guess is that the people with serious problems know other ways to maintain their privacy that Apple won't catch. Apple sniffing Photos (the app) will catch some of the less deliberate criminals. But even with 100% foolproof iCloud sniffing, Apple won't stop exploitation. Should Apple delete apps like those from the App Store too? Maybe! Tough question, but cut from the same cloth.

I mean, what a freaking mess. I can't imagine all the smut people likely have on their phones. Heck, Brett Favre allegedly (almost certainly did, right?) sent pics of, well, you know, to a female reporter while he was with the Jets. Very few have written him off as a habitual recidivist, and I bet most NFL fans still have a mostly positive view of Favre in spite of his having and sharing NSFW pics and being a sexual harasser. (Could iOS stop these sorts of pics from being shared? Would that be bad? What if I weren't a Puritan at heart -- would I still think it's bad? That is, consenting adults can exchange NSFW pictures, right? Right? Ewww.)

It's just that this passive "we're looking through your phone and taking actions based off of its contents without your involvement" is what's scariest to the layperson, I think. To jump all the way from absolute, objective evil, let's go right to the other end of the grey area, where it's almost harmless:

Wait. Before I venture much further, let me stop completely to say something: If someone has 1,000 matches against a database of child exploitation imagery, even at 95% accuracy (insanely low accuracy, I'd think), statistics say that they've definitely got non-trivial amounts of illegal imagery. If they have 100, they have illegal images. If they have 10? I've got to think they probably do. I have some practical privacy and Fifth Amendment itches somewhere, but here, they're unimportant. You're using a private company to store illegal goods. Apple sniffed out those illegal images just like a storage company could catch a cocaine stash with a drug-sniffing dog. You should get turned in with no warning and let the judicial system (at least in the US) figure out where the chips should fall.
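If you want to see why I'm comfortable saying "definitely," run the numbers yourself. Here's a back-of-the-envelope in Python, assuming a 5% per-image false-positive rate and independent matches -- both my made-up assumptions, not Apple's numbers:

    import math

    # Odds that EVERY match is an innocent collision at a 5% per-image
    # false-positive rate. 5% is a deliberately awful, made-up number;
    # real fingerprinting should be far, far better.
    FP_RATE = 0.05

    for matches in (10, 100, 1000):
        log10_p = matches * math.log10(FP_RATE)   # log10 avoids float underflow
        print(f"{matches} matches, all collisions: ~1e{log10_p:.0f}")

    # 10 matches   -> ~1e-13
    # 100 matches  -> ~1e-130
    # 1000 matches -> ~1e-1301
    # At 1,000 matches, "they're all false positives" isn't a real number.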
Back to the grey area discussion... What if you have too many pictures of jaywalking? Movies taken from cars that were speeding? Should you get fines in the mail as if you'd been caught by a red-light camera? How many jaywalking pictures before something must be done? It's more than 2. Is it less than 1,000?

In all of these cases, what we're talking about is the practical loss of privacy, at least compared to the situation that came before. This practical, day-to-day loss is starkly different from losing the theoretical right to privacy, which Apple hasn't changed at all without some serious mental gymnastics -- you could argue that today's First World requires a cell phone, and that if Android starts doing this photo sniffing too, you're trapped in a duopoly; but you also have other options for taking and storing pictures. Again, this is only an argument because Apple is hosting your images on their hardware. But wow, it feels like a slippery slope, with a dangerous rhetoric of absolute evil attached to not tripping down it.

Labels: apple, Other Stuff, privacy
posted by ruffin at 8/07/2021 04:31:00 PM