I have never liked Apple and lately even less. F… US monopolies
It’s not data harvesting if it works as claimed. The data is sent encrypted and not decrypted by the remote system performing the analysis.
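For anyone curious what "analysed without being decrypted" means mechanically: homomorphic encryption lets a server compute on ciphertexts so that the decrypted result equals the computation on the plaintexts. Apple's deployment uses a different, lattice-based scheme; the classic Paillier cryptosystem below is just a toy sketch of the core idea (tiny, deliberately insecure parameters, purely illustrative, and in no way Apple's actual code):

```python
import math, random

# Toy Paillier keygen with tiny primes (illustration only, NOT secure)
p, q = 17, 19
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)   # Carmichael function of n
g = n + 1                      # standard simple choice of generator
mu = pow(lam, -1, n)           # valid shortcut because g = n + 1

def encrypt(m):
    # pick a random r coprime to n for ciphertext randomisation
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(x) = (x - 1) // n, then unblind with mu
    return ((pow(c, lam, n2) - 1) // n * mu) % n

a, b = encrypt(20), encrypt(22)
product = (a * b) % n2          # server multiplies ciphertexts...
assert decrypt(product) == 42   # ...which decrypts to the SUM of the plaintexts
```

A server holding only `a` and `b` can compute `product` without ever learning 20 or 22; only the key holder can decrypt the result. Production schemes support richer operations than addition, but the privacy property being claimed is this one.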
What if I don’t want Apple looking at my photos in any way, shape or form?
I don’t want Apple exfiltrating my photos.
I don’t want Apple planting their robotic minion on my device to process my photos.
I don’t want my OS doing stuff I didn’t tell it to do. Apple has no business analyzing any of my data, let alone on an “opt out” basis.
Yeah I was gonna say… I’ll defend Apple sometimes but ultimately this should only be opt-in and they are wrong for not doing that. Full stop.
What if I don’t want Apple looking at my photos in any way, shape or form?
Then you don’t buy an iPhone. Didn’t they say a year or two ago that they’re going to scan every single picture using on-board processing to look for images and videos that could be child porn and anything suspicious would be flagged and sent to human review?
Well, the other cloud services did server-side CSAM scanning long before Apple, and they do it with less respect for your privacy than Apple does.
Apple wanted to improve the process the way the EU wants it, so that no illegal data could be uploaded to Apple’s servers and make them liable. That is why they wanted to scan on-device.
But anyone who has used Spotlight in the last 4 years should have noticed that it can find pictures from words. This is nothing new; Apple Photos has been analysing photos with AI for a very long time.
TLDR edit: I’m supporting the above comment, i.e. I do not support Apple’s actions in this case.
It’s definitely good for people to learn a bit about homomorphic computing, and let’s give some credit to apple for investing in this area of technology.
That said:
- Encryption in the majority of cases doesn’t actually buy absolute privacy or security; it buys time - see NIST’s criterion of ≥30 years for AES. It will almost certainly be crackable one day, whether by weakening or other advances… How many people are truly able to give genuine informed consent in that context?
- Encrypting something doesn’t always work out as planned, see example:
“DON’T WORRY BRO, ITS TOTALLY SAFE, IT’S ENCRYPTED!!”
Yes, Apple is surely capable enough to avoid simple, documented mistakes such as the above, but it’s also quite likely some mistake will be made. And note, Apple is also very likely capable of engineering a leak and concealing it or making it appear accidental (or, even if truly accidental, leveraging it later on).
Whether they’d take the risk, and whether their (un)official internal policy would support or reject that, is of course in the realm of speculation.
That they’d have the technical capability to do so isn’t at all unlikely. Same goes for a capable entity with access to apple infrastructure.
- The fact they’ve chosen to act questionably regarding users’ ability to meaningfully consent, or even to consent at all(!), suggests there may be some issues with assuming good faith on their part.
How hard is it to grasp that I don’t want Apple doing anything on my cellphone I didn’t explicitly consent to?
I don’t care what technology they develop, or whether they’re capable of applying it correctly: the point is, I don’t want it on my phone in the first place, any more than I want them to set up camp in my living room to take notes on what I’m doing in my house.
My phone, my property, and Apple - or anybody else - is not welcome on my property.
Sorry for my poor phrasing, perhaps re-read my post? I’m entirely supporting your argument. Perhaps your main point aligns most with my #3? It could be argued they’ve already begun from a position of probable bad faith by taking this data from users in the first place.
Oh yeah I kinda missed your last point. Sorry 🙂