With iPadOS 13.4, Apple brings trackpad support to iPad, giving customers an all-new way to interact with their iPad. Rather than copying the experience from macOS, trackpad support has been completely reimagined for iPad. As users move their finger across the trackpad, the pointer elegantly transforms to highlight user interface elements. Multi-Touch gestures on the trackpad make it fast and easy to navigate the entire system without users ever lifting their hand.
This feels like a huge leap to me, beyond the trackpad experience on any existing computer. It makes me wonder whether we’ll see changes to Mac trackpad support that bring some of this new experience to macOS. That seems logical to me, especially for iPad apps ported to the Mac via Mac Catalyst.
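For developers, the pointer behavior described above surfaces through new UIKit APIs in iPadOS 13.4. Here's a minimal sketch of adopting them; the class and view names are illustrative, not from Apple's release:

```swift
import UIKit

// Hypothetical view controller adopting the new pointer APIs (iPadOS 13.4).
class CardViewController: UIViewController, UIPointerInteractionDelegate {
    let cardView = UIView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Attach a pointer interaction so the pointer responds to this view.
        cardView.addInteraction(UIPointerInteraction(delegate: self))
    }

    // Return a pointer style for the region under the pointer; the
    // .highlight effect is what makes the pointer morph around the view.
    func pointerInteraction(_ interaction: UIPointerInteraction,
                            styleFor region: UIPointerRegion) -> UIPointerStyle? {
        guard let view = interaction.view else { return nil }
        return UIPointerStyle(effect: .highlight(UITargetedPreview(view: view)))
    }
}
```

Standard controls like UIButton can opt in even more simply, via `isPointerInteractionEnabled`, which is presumably how most apps will pick up the adaptive pointer for free.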
As to LiDAR:
The breakthrough LiDAR Scanner enables capabilities never before possible on any mobile device. The LiDAR Scanner measures the distance to surrounding objects up to 5 meters away, works both indoors and outdoors, and operates at the photon level at nano-second speeds. New depth frameworks in iPadOS combine depth points measured by the LiDAR Scanner, data from both cameras and motion sensors, and is enhanced by computer vision algorithms on the A12Z Bionic for a more detailed understanding of a scene. The tight integration of these elements enables a whole new class of AR experiences on iPad Pro.
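Concretely, those depth frameworks surface to developers through ARKit 3.5's scene reconstruction. A rough sketch of opting in, assuming a view controller that already owns an ARKit `session`:

```swift
import ARKit

// Sketch: enable LiDAR-driven scene reconstruction (ARKit 3.5, iPadOS 13.4).
let configuration = ARWorldTrackingConfiguration()

// Scene reconstruction is only available on LiDAR-equipped devices,
// so gate on runtime support before enabling it.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
}

session.run(configuration)

// ARKit then delivers the reconstructed geometry as ARMeshAnchor
// instances via the session delegate's session(_:didAdd:) callback.
```

The mesh anchors give apps a polygonal model of the room's surfaces, which is what enables the instant placement and improved occlusion Apple is touting.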
The stage is set for AR. This feels like an inflection point to me: Apple introducing key new technology that will mark a sea change in future user experiences.
And no small thing, Apple also delivered an amazing new keyboard case, the Magic Keyboard. It comes with a built-in trackpad, smooth laptop-like viewing-angle adjustment, and a USB-C port (which charges the iPad Pro) built into the hinge. The case is pricey: $299 for the 11-inch model, $349 for the 12.9-inch.
The line between iPad and Mac has never been more blurred. Will Apple port Xcode to iPad, giving iPad users the ability to build apps on device?