I have never liked Apple and lately even less. F… US monopolies
I don’t really understand the purpose of the feature — GPS tags are already embedded in the photo by the phone, so it knows the location of each picture. The phone also analyzes faces of people you’ve identified so you can search for people you know. What else does this new feature add?
It lets you type “eiffel tower” into search and get those pictures. Rather than all the other unspeakable things you did in Paris that night
Current implementation seems like overkill. Why not just:
- Search “Eiffel tower”
- Send the search term to an Apple server that already exists (Apple Maps)
- Server returns GPS coordinates for that term
- Photos app displays photos ordered by distance to those coordinates
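The proposed flow is simple enough to sketch. This is a hypothetical illustration, not how Photos actually works: the geocoding steps (send term to server, get coordinates back) are mocked with hardcoded coordinates, the photo list and field names are made up, and ranking uses a plain haversine distance on the EXIF GPS tags.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def photos_near(photos, target, limit=10):
    """Rank photos (each carrying a 'gps' = (lat, lon) tag) by distance to target."""
    return sorted(photos, key=lambda p: haversine_km(*p["gps"], *target))[:limit]

# Pretend the geocoding round-trip resolved "Eiffel tower" to these coordinates.
eiffel = (48.8584, 2.2945)
library = [
    {"name": "louvre.jpg",       "gps": (48.8606, 2.3376)},
    {"name": "tower_selfie.jpg", "gps": (48.8580, 2.2950)},
    {"name": "notre_dame.jpg",   "gps": (48.8530, 2.3499)},
]
print([p["name"] for p in photos_near(library, eiffel)])
# tower_selfie.jpg ranks first: it is closest to the resolved coordinates
```

This also makes the objections below concrete: the ranking only sees coordinates, so a photo of a duck taken under the tower scores identically to a photo of the tower itself.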
Because then they don’t have an excuse to move all your data to Apple servers and scan it for later use.
Because you took two selfies in a restaurant near there, made a huge stunning collage of a duck below the tower, and took a couple of photos from a ways away to get the whole tower in view.
I’m running this tech at home, because we had the same use case. Except for me it’s running on a NAS, not Apple’s servers. The location-based solution doesn’t work as well when you’re an avid photographer.
If you read the article, you would know that the hard work is done locally on your iPhone, not on Apple’s servers.