How Apple might incorporate PrimeSense 3D tech into its products

Juli Clover’s take on the 3D object-sensing technology Apple just acquired with its purchase of PrimeSense. As much as she sees, my guess is Apple sees much more. Object-sensing technology has been around since the 1970s, when Patrick Winston first wrote about computer vision and described the artificial intelligence algorithms needed for a computer to distinguish the corners that make up a room.

Devices like the PS4, Xbox One, and the smart TVs to which they send their signals can interpret the world around them, from facial features to body movements to subtle hand gestures. The interfaces are solidifying, reliability increasing. This stuff is consumer ready, about to make the leap from the niche gamer markets to the wider world around us. Imagine going to a store and gesturing to a device to find your brand of frozen broccoli, or waving to a parking meter to indicate how long you plan on staying and how you’d like to pay.

To me, this technology was made for the brains at Apple. I’m looking forward to seeing what they do with it.

  • Apple has facial recognition built into iPhoto, so it can scan pictures and recognize who is in them. And the iPhone camera has basic facial recognition, where it can tell that there is a face in the frame that’s perhaps worth focusing on. This newly acquired tech could be the bridge that brings true individual facial recognition to the iPhone camera. Easy to imagine that was Apple’s trajectory all along.

  • I hope they do something “special” with it and not just copy Samsung with the touchless display.

• yes, because apple is known for copying crappier products with matching crappy products. not like it’s completely the other way around, or anything…

      get real, man.

      • Wow, you must really be hiding in the sand. Apple is well known for taking others’ ideas: smart cover, notifications, control center, iOS 7 multitasking UI, Swiss Clock design, Braun, etc, etc, etc. Shall I continue?

Just Google [or Bing] it. Go through the past 5 years and search “apple copied”. You could use “apple copies” too.

In no way was my comment meant to go down this path, but you opened yourself up to that slaughter.

  • dejager

    Considering how our brains use gestures already for processing language, conceptualizing abstract ideas, and communicating emotional state, it’ll be interesting to see if gesture sensing technology will have an effect on the way we speak.

  • Sushifan007

I have a big feeling this is going to be less about a Kinect-style interface and more about creating revolutionary new APIs for businesses and developers. Think of it this way: Apple bought a company that specializes in indoor mapping through Wifi. Now, with this new technology, we have the opportunity to see some really killer augmented reality features. Imagine being an interior designer with an iPhone 6S running iOS 9. You would be able to use a specialized app to scan and digitize a room and then do your mockup for a potential customer right there. Or imagine enhancing iBeacon with this; shopping apps could pinpoint killer deals. Personally, I’d love to see Apple use this technology in a fleet of Apple cars that digitally map and chart roads throughout the world, like Street View but a more enhanced version, like the one Nokia discussed.

  • Les_S

    I’d love to have a FaceTime chat that incorporates this technology in a way that would allow me to navigate in 3D the environment on the other side of the chat.