- cross-posted to:
- technews@radiation.party
I still think it’s not very nice to break into people’s devices for the sole purpose of seeing what they do in private. Maybe this is just another of my unpopular opinions, I’m known to think differently about key subjects, but I don’t need or want spyware that might misjudge a naked picture of my gf and either send it to some stranger to look at or, even worse, flag me as something I’m not until I have to show them the picture for human review myself, at which point the damage is already done. This already happened to a guy using MS’s OneDrive (which is why I emptied out my OneDrive and stay away from it these days). Please keep my local computer storage free from surveillance… 😕
I get that child abuse needs to stop (or preferably should never have existed in the first place), but nothing justifies surveilling private information where there is no intent to do harm. I would even argue that it’s the buying/selling of it, and of course the creation of it, that does the harm, not merely having a misjudged file you found one drunk winter night on the free web and stored somewhere. I still think these people are sick, and I share the population’s general urge to castrate every last one of them, but actually doing that would go against too many of my other base principles, of which “you do you if you don’t bother anyone else” is a big one… I’d say leave the people who (however disgusting their tastes are) don’t do actual harm alone, and focus on the ones who actually hurt kids. It’s easy to ignore, but the facts seem to show that (with exceptions, of course, given human diversity) it’s an unchosen sexual preference rather than bad intent. If they don’t act on it, it’s just something they can’t help having. And if they can’t help having it, it doesn’t feel right disrupting lives and defaming people on that basis alone.
Anyway, vote me down if you must, I’m used to holding the unpopular opinion, but please don’t get me wrong: I’m not advocating for anyone who actually has CSAM stored, but for the rest of us who don’t want the possible wrongful implication (and the unnecessary show-and-tells about private pictures that come with it) either. I’m not that interested in punishing people for what they ‘might’ do wrong, while it already seems impossible to get rid of all those who actually do/try/intend… 🤷‍♂️
Sacrificing everyone’s privacy to fight crime is always a very Orwellian thing to do.
My family has many, many photos of us as kids that these tools would likely flag. What nonsense.
It’s nothing more than pushing intrusions using “think about the children” fear mongering.
My device, my stuff, keep the grubby corporate paws off it.
Yeah, soon we’re gonna have to hard-encrypt our personal files on our own devices (with the hassle of manually decrypting them with a password when needed, see the sketch below) because we might get accused of storing baby pictures of ourselves or our children…
Ridiculous, really…
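For what it’s worth, the kind of local, password-based encryption described above is already straightforward to do yourself. Here is a minimal sketch in Python, assuming the third-party `cryptography` package; the file names, salt handling, and iteration count are illustrative choices, not a recommendation of a particular setup.

```python
# Minimal sketch: password-based encryption of a local file.
# Assumes: pip install cryptography. Paths and the iteration count are placeholders.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(password: bytes, salt: bytes) -> bytes:
    """Turn a password into a 32-byte urlsafe Fernet key via PBKDF2-HMAC-SHA256."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(password))


def encrypt_file(path: str, password: str) -> None:
    salt = os.urandom(16)
    with open(path, "rb") as f:
        token = Fernet(derive_key(password.encode(), salt)).encrypt(f.read())
    # Store the salt next to the ciphertext so the key can be re-derived on decryption.
    with open(path + ".enc", "wb") as f:
        f.write(salt + token)


def decrypt_file(enc_path: str, password: str) -> bytes:
    with open(enc_path, "rb") as f:
        blob = f.read()
    salt, token = blob[:16], blob[16:]
    return Fernet(derive_key(password.encode(), salt)).decrypt(token)


if __name__ == "__main__":
    encrypt_file("family_photo.jpg", "correct horse battery staple")
    photo_bytes = decrypt_file("family_photo.jpg.enc", "correct horse battery staple")
```

The manual step being complained about is the `decrypt_file` call: nothing scanning the disk can read the photo without the password, which is exactly the hassle (and the protection).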
How else would they, uhhh… “improve the database” then?
(/s just in case)
They’re not the images of your gf, they’re hashes. It basically takes elements from the photo (its raw pixel data and metadata), computes a fingerprint, and checks it against a database of hashes of known CSAM.
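To make the hash idea concrete: PhotoDNA and NeuralHash are proprietary, but the general technique is perceptual hashing, where visually similar images produce hashes that differ in only a few bits. Below is a rough, purely illustrative sketch using the open-source `imagehash` package; the hash database, file name, and distance threshold are made-up placeholders, and real systems use different, secret hash functions with server-side matching.

```python
# Illustrative sketch of perceptual-hash matching; NOT the real PhotoDNA/NeuralHash.
# Assumes: pip install pillow imagehash. The known_bad_hashes set, the file name
# and MATCH_THRESHOLD are hypothetical placeholders.
import imagehash
from PIL import Image

# Pretend database of perceptual hashes of known flagged images (hex strings).
known_bad_hashes = {
    imagehash.hex_to_hash("8f373714acfcf4d0"),
    imagehash.hex_to_hash("d1d1d1d1e0e0e0e0"),
}

MATCH_THRESHOLD = 5  # maximum Hamming distance still counted as a "match"


def is_flagged(photo_path: str) -> bool:
    """Hash the photo and compare fingerprints; only hashes are compared here."""
    h = imagehash.phash(Image.open(photo_path))  # 64-bit perceptual hash
    return any(h - bad <= MATCH_THRESHOLD for bad in known_bad_hashes)


if __name__ == "__main__":
    print(is_flagged("some_photo.jpg"))  # compares fingerprints, not the picture
```

Tolerating near-matches is the whole point of a perceptual hash (a resized or recompressed copy should still match), and it is also exactly where the false-positive worry in this thread comes from.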
In any case, the biggest issue is the potential for misuse, because it all hinges on governments, and most don’t have their citizens’ backs.
It’s basically building a back door into a corner of your system and handing the keys to every government agency, regardless of how authoritarian they are. It will start off with noble intentions but quickly devolve into a shitshow of civil rights violations.
Would this ever really work? I don’t want to be pessimistic, and I’d be extremely happy if they actually built something that functions. But how are they going to deal with all the edge cases? I know my mom still has photos of me scantily dressed from when I was younger; would those be flagged? How about actual porn where the actress just looks underage (not a fan of it), would that be flagged? Etc., etc.
I just see so many issues with something like this. Besides the privacy concerns, of course.