On Apple’s Perceptio acquisition

Bloomberg:

The company’s leaders, Nicolas Pinto and Zak Stone, are both established AI researchers who specialize in developing image-recognition systems using deep learning. Deep learning is an approach to artificial intelligence that lets computers learn to identify and classify sensory input.

And:

Perceptio’s goals were to develop techniques to run AI image-classification systems on smartphones, without having to draw from large external repositories of data. That fits Apple’s strategy of trying to minimize its usage of customer data and do as much processing as possible on the device.

I see this as a huge leap forward for Apple’s Photos app, adding Google Photos-level search without accumulating data from those photos on Apple’s servers.

One of the benefits of Google Photos is the ability to search for elements within a photo without having to tag anything manually. For example, you can ask Google Photos to search one of your albums for all pictures containing cats.

Google Photos does this by running image analysis on all of your photos and automatically generating tags for each one, then keeping that information on Google’s servers for use in your searches (good) and for other purposes (not so good).
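To make the idea concrete, here’s a minimal sketch of what tag-based photo search looks like once a classifier has labeled every photo. This is not Google’s actual implementation; the Photo and PhotoIndex types are hypothetical, purely to illustrate the “generate tags, then search the tags” pattern:

```swift
import Foundation

// Hypothetical sketch, not Google's implementation: an inverted index
// mapping auto-generated tags to the photos that carry them.
struct Photo {
    let id: String
    let tags: Set<String>   // e.g. ["cat", "sofa"], produced by image analysis
}

struct PhotoIndex {
    private var tagToPhotos: [String: [Photo]] = [:]

    mutating func add(_ photo: Photo) {
        for tag in photo.tags {
            tagToPhotos[tag.lowercased(), default: []].append(photo)
        }
    }

    // "Show me all pictures containing cats" becomes a simple index lookup.
    func search(tag: String) -> [Photo] {
        tagToPhotos[tag.lowercased()] ?? []
    }
}

var index = PhotoIndex()
index.add(Photo(id: "IMG_0001", tags: ["cat", "sofa"]))
index.add(Photo(id: "IMG_0002", tags: ["beach", "sunset"]))
print(index.search(tag: "cat").map(\.id))   // ["IMG_0001"]
```

The privacy question is simply where that index lives: in Google’s model, the tags and the index sit on Google’s servers.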

Apple’s current Photos app does let you tag your photos manually, and it does some semi-automated facial recognition. Seems to me this acquisition will allow Apple to leapfrog Google’s efforts, performing similar automated image analysis, but on-device, without requiring data about your personal pictures to make a trip back to the company’s servers.
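For a sense of what on-device classification looks like in practice, here’s a rough sketch using Apple’s Vision framework. This is not Perceptio’s technology or the Photos app’s actual pipeline, just an illustration of the approach: the model runs locally, and nothing about the photo leaves the device.

```swift
import Foundation
import Vision

// Sketch of on-device image classification: the built-in classifier runs
// locally, so the photo and its generated tags never leave the device.
func tags(for photoURL: URL, minimumConfidence: Float = 0.7) throws -> [String] {
    let handler = VNImageRequestHandler(url: photoURL, options: [:])
    let request = VNClassifyImageRequest()   // built-in, on-device classifier
    try handler.perform([request])

    let observations = request.results ?? []
    return observations
        .filter { $0.confidence >= minimumConfidence }
        .map { $0.identifier }                // e.g. ["cat", "animal", "pet"]
}

// Usage: label a local photo entirely on-device.
// let labels = try tags(for: URL(fileURLWithPath: "/path/to/photo.jpg"))
// print(labels)
```

Pair that with a local index like the earlier sketch and you get tag-free search without the server-side data collection.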

Apple continues to put its money where its mouth is when it comes to its commitment to privacy.