
What is browser fingerprinting and how does it work?

Electronic Frontier Foundation:

When a site you visit uses browser fingerprinting, it can learn enough information about your browser to uniquely distinguish you from all the other visitors to that site. Browser fingerprinting can be used to track users just as cookies do, but using much more subtle and hard-to-control techniques.

And:

By using browser fingerprinting to piece together information about your browser and your actions online, trackers can covertly identify users over time, track them across websites, and build an advertising profile of them. The information that browser fingerprinting reveals typically includes a mixture of HTTP headers (which are delivered as a normal part of every web request) and properties that can be learned about the browser using JavaScript code: your time zone, system fonts, screen resolution, which plugins you have installed, and what platform your browser is running on.

And:

When stitched together, these individual properties tell a unique story about your browser and the details of your browsing interactions. For instance, yours is likely the only browser on central European time with cookies enabled that has exactly your set of system fonts, screen resolution, plugins, and graphics card.

The linked/quoted article is long and detailed, an enlightening read. But the bits about browser fingerprinting are incredibly important. And this is as good an explanation as I’ve seen.
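To make the mechanics concrete, here's a minimal sketch (in Python, rather than the JavaScript a real tracker would run in the browser) of how a handful of otherwise-innocuous properties can combine into a near-unique identifier. The property names and values here are hypothetical:

```python
import hashlib

def fingerprint(properties: dict) -> str:
    """Combine browser properties into a single stable identifier."""
    # Sort the keys so the same set of properties always hashes the same way.
    canonical = "|".join(f"{k}={properties[k]}" for k in sorted(properties))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical values a tracker might read via JavaScript and HTTP headers.
visitor = {
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14)",
    "timezone": "Europe/Berlin",
    "screen": "2560x1440",
    "fonts": "Helvetica,Lucida Grande,Menlo",
    "platform": "MacIntel",
}
print(fingerprint(visitor))
```

Each property alone is shared by millions of people; together, as the EFF notes, they tell a story that may match only you. Change any one property (say, the time zone) and the identifier changes completely, which is also why Safari's "simplified system information" approach works: fewer distinguishing inputs, more people sharing each fingerprint.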

At WWDC, Apple declared war on browser fingerprinting and related techniques. From Apple’s Mojave press release:

As with all Apple software updates, enhanced privacy and security remain a top priority in macOS Mojave. In Safari, enhanced Intelligent Tracking Prevention helps block social media “Like” or “Share” buttons and comment widgets from tracking users without permission. Safari now also presents simplified system information when users browse the web, preventing them from being tracked based on their system configuration.

And that’s a good thing.

Why Apple’s AirPower wireless charger is taking so long to make

Mark Gurman, Bloomberg:

Apple said in September that the iPhone X and iPhone 8 could be charged wirelessly. It recommended charging hubs from Mophie and Belkin, an unusual move for the consumer-hardware specialist. Apple also announced its own AirPower charger, but said it wouldn’t be released until 2018.

And:

Company engineers have been toiling away to address problems. One challenge is making sure the charger doesn’t overheat. Another is the complexity of the circuitry, according to people familiar with the device’s development.

And:

Unlike wireless chargers on the market today, the AirPower is designed to charge three devices simultaneously: an iPhone, Apple Watch, and AirPods with a still-to-be-released wireless charging case.

And, the point I think is the heart of the problem:

Apple also wants users to be able to place any of their devices anywhere on the charging mat to begin a charge. That ambitious goal requires the company to pack the AirPower with multiple charging sensors, a process that has proven difficult, the people said.

If you take apart a Qi wireless charger, you’ll find a coil of fabric-coated wire, the induction coil behind the physics of wireless charging. That coil is always round, and the chargers you buy are typically round as well, keeping the case design at its smallest form factor.

Here’s a video showing a tear-down of a Samsung Qi charger. Jump to about 3:58 in to see the coil.

Apple’s AirPower charger is oblong, not the same shape as the existing, circular Qi chargers. Some physics to solve for there. There’s also the complexity of multiple objects placed in unpredictable proximity on the oblong coil; it seems understandable that this is a tricky problem to solve.

Add to that:

The AirPower charger is also more advanced than the current competition because it includes a custom Apple chip running a stripped down version of the iOS mobile operating system to conduct on-device power management and pairing with devices. Apple engineers have also been working to squash bugs related to the on-board firmware, according to the people familiar.

This is a complex piece of engineering.

UPDATE: Interesting tweet from Jeff Guilfoyle, with a picture of overlapping coils, the idea being that the controlling circuitry would switch between coils as needed. Interesting.

Apple wants to replace your car keys with an iPhone

Mikey Campbell, Apple Insider:

The Car Connectivity Consortium, which counts Apple among its charter members, on Wednesday announced the publication of a new “digital key” standard that allows drivers to actuate vehicle systems like door locks and the engine via an NFC-enabled smartphone.

And:

With its technology, aptly dubbed the Digital Key Release 1.0 specification, the CCC aims to bring automotive manufacturers and mobile device makers together to create an interoperable digital key standard.

The system operates in much the same way as first-party digital keys currently available from a handful of vehicle OEMs. Users with authenticated smart devices are able to lock, unlock, start the engine of and share access to a specific car. Unlike some remote control solutions that leverage Wi-Fi or Bluetooth communications, however, Release 1.0 appears intrinsically tied to short-range technology like NFC.

Here’s the consortium press release with all the details.

Apple is figuring out what’s next

Neil Cybart rolls out a smart, detailed look at where Apple has its future focus. Lots of interesting bits here. A few highlights:

While Apple management will never admit it, the company has been thinking and looking beyond iPhone for years. The Apple Watch’s ongoing march to iPhone independency is clear evidence of this post-iPhone thinking.

And:

Management isn’t driven by the goal to come up with something that is more profitable than iPhone. Instead, the focus is on coming up with something that makes technology more personal and handling new workflows that were never able to be handled by iPhone.

And:

While AR makes for a cool on-stage demo, having to hold an iPhone or iPad up as an AR viewfinder for long periods of time isn’t ideal. Items like Siri Shortcuts and Siri Suggestions are interesting on iPhone and iPad although they are incredibly more appealing on mobile displays worn on our bodies. ML applications on iPhone and iPad are useful, but the predictive and proactive nature of the technology can work wonders when combined with mobile cameras and screens that we don’t have to hold. Apple is announcing new technologies that make more sense on form factors that currently don’t exist.

And:

It’s easy to think that Apple may simply be biding its time until the world is ready for AR glasses. However, WWDC gave us a glimpse of how Apple is busy behind the scenes, preparing for what comes next. With ARKit, Apple is using hundreds of millions of iPhone and iPads to inspire 20 million developers with the potentials found with AR. A similar dynamic is at play in getting customers comfortable with items like Animoji and Memoji – items that will likely one day be available via a pair of smart glasses.

This is a wonderful exploration of where Apple is heading and their strategy for getting there. Don’t miss the chart in the middle of the post, specifically that yellow line showing Apple Watch growth.

Apple inks partnership with Sesame Workshop, the non-profit behind Sesame Street

Wall Street Journal:

Under the terms of the contract, Apple has ordered multiple series from Sesame Workshop, the nonprofit media and educational platform best known for the long-running show “Sesame Street.” Shows will be live-action, animated as well as one featuring puppets, according to a person close to Apple.

Sesame Street itself isn’t part of the deal. This jibes with what Jim and I were discussing on the latest Dalrymple Report (should pop up later today): Apple’s stated aim of focusing on family-friendly programming, avoiding edgier, R-rated stuff.

Jony Ive’s favorite color is orange

[VIDEO] Before I watched this video (embedded in the main Loop post), I was skeptical, could only think of a few cases of orange used in a modern Apple product design. But wow, there really is a lot of it.

Walkie-Talkie on Apple Watch (watchOS 5 beta 2)

[VIDEO] When I saw the watchOS 5 Walkie-Talkie announcement in the WWDC keynote, I got a little excited, had a little nostalgia buzz full of campouts and whispered late night push-to-talk conversations.

The video, embedded in the main Loop post, is Jeff Benjamin doing what he does best, taking you on a tour through the latest shiny, in this case, a step-by-step on the watchOS 5 Walkie-Talkie app.

Is this purely for fun? Or is there a use case? The performance seemed a bit laggy, clearly laggier than real-world walkie-talkies, which had no analog-to-digital conversion and were straight real-time radio transmission.

If the answer is, don’t be grumpy, just have fun with it, cool. Just want to be sure I’m not missing the value here.

More ARKit 2.0 with image tracking

First check out the video in this tweet:

https://twitter.com/osfalmer/status/1008736572185903105

The concept is familiar: a real-world object, in this case a business card, that expands when seen through an augmented reality lens, tracking to the original object while adding views and controls that enhance it.

To me, there’s no question that this approach has tremendous potential. Imagine picking up an item in the grocery store and having a pane appear with buttons like “find best value” or “find cheapest” and having arrows appear on nearby shelves marked with appropriate alternatives.

Or a “convert” button that translates the price into a common format. For example, if you buy paper towels, it might show you cents per foot, so you can compare differently priced products, which range from $/roll to $/package to cents/sheet (with different sheet sizes).
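That “convert” idea is just arithmetic under the hood: pick a common unit and divide. A rough sketch of the normalization, with made-up product numbers:

```python
def cents_per_sheet(price_dollars, rolls, sheets_per_roll):
    """Normalize a package price to cents per sheet for comparison."""
    total_sheets = rolls * sheets_per_roll
    return round(price_dollars * 100 / total_sheets, 2)

# Two hypothetical paper towel products with different packaging:
big_rolls = cents_per_sheet(8.99, rolls=6, sheets_per_roll=110)
small_rolls = cents_per_sheet(5.49, rolls=4, sheets_per_roll=55)
```

The hard part for an AR shopping app isn’t the math, it’s reliably reading the price and package details off the shelf tag in the first place.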

I can definitely see the advantage of wearing a pair of glasses when immersed in an AR environment. It would get old constantly having to hold my phone up as a lens as I walk through a store. I wear glasses, so it’d be interesting to see how Apple will deal with the corrective lens issue. Will we someday see AR glasses that automatically correct my vision as well as offering an AR overlay?

Mission Impossible theft of Apple gear from Best Buy

WSB Atlanta:

Dunwoody police said burglars took a page out of the movie ‘Mission Impossible’ when they stole more than $100k worth of Apple products from a Best Buy Store.

Police said the thieves rappelled through a hole in the ceiling at the store on Hammond Drive.

Reading this story, I can’t help but picture this scene.

A look at the iPad-specific features in iOS 12

[VIDEO] Have an iPad? This is a terrific walk through what’s coming in iOS 12, a chance to wrap your head around the new gestures before you are plunked square in the middle of them with time pressures and work to do. Per usual, the video is embedded in the main Loop post.

What you need to do when you inherit a Mac

Glenn Fleishman weighs in with some excellent advice on what to do if you inherit or buy a Mac, to make sure you don’t end up with an unusable doorstop down the line. Worth a scan, just to get the gist of the issue, and a more detailed read if you are in that situation.

Supply chain report suggests Apple expects 6.5-inch ‘iPhone X Plus’ to be most popular 2018 iPhone model

Benjamin Mayo, 9to5Mac:

A report from supply chain sources, via Korean language publication The Bell, suggests that the largest of Apple’s 2018 iPhone lineup will be the most popular. Apple is set to announce an ‘iPhone X Plus’, or whatever Apple ends up branding it, with a 6.46-inch OLED screen, packing a ~6.5-inch screen into roughly the same size as the existing 5.5-inch iPhone 8 Plus.

And:

The Bell report says Apple has ordered more screen panels for the X Plus than any other model. It forecasts 45 million 6.46-inch panels, about 25 million panels for the 5.8-inch iPhone X successor, and 30 million 6.04-inch LCD screens for the new lower-priced flagship.

Not hard to believe the rumors of an iPhone X Plus, also not hard to believe that Apple will go with that name, if they do ship that phone. And not hard to believe it will become the most popular phone.

A 6.5-inch iPhone X Plus would be a huge upgrade to the iPhone 8 Plus, the form factor it would be replacing. Better screen, more pixels, what’s not to like? To me, the real question is one of price.

The iPhone X sold very well at its $999 price point last year, but the ‘super cycle’ of upgraders did not materialise in the way some investors expected.

The iPhone 8 starts at $699 and the iPhone 8 Plus at $799, a $100 step up. Applying that same step to the iPhone X’s $999 leads to an iPhone X Plus entry price of $1,099. Is that too high, too soon? We shall see.

Free trials from Apple’s perspective

Drew McCormack:

Apple currently allows free trials in two forms: if you sell subscriptions, you can give customers a free month to try the app; and, you can give your app away free, and offer a free In-App Purchase (IAP) to unlock all features for a fixed period of time.

So why does Apple allow these forms, but not offer a more formal version of free trials?

And:

Think for a moment about how a ‘formal’ free trial system would work. What would you see in the App Store? Probably something along the lines of a button with the text “$50 with Free Trial”. Now take your average iOS customer, who has never heard of free trials as they exist outside the App Stores. I suspect many will already be confused by this.

Drew goes on to explain that confusion, with specific questions like:

  • If I click the button, will I be charged $50 now?
  • What happens when my trial is up: will I be charged automatically then?

Not sure that confusion can’t be addressed by better wording. And if Apple did go down that road, I think they would try to make sure all those questions were answered before the user had to make that decision.

I also think, and this is a nitpick on the post’s title, it’s impossible for anyone outside Apple to truly know Apple’s logic on this without either a clear statement from Apple or being inside the room.

To be clear, I do like this post. The two points above are my instant reaction, don’t want them to be left unsaid. Don’t let those points derail you, though. Drew’s post is worth reading.

Moving on:

So why are the existing options any better? Let’s take the free IAP system. Firstly, there is no fear about downloading an app — it is free to download. There is a nice big “Get” button to indicate that. Second, once you have the app, you are told there is a free trial, and you are given a clear choice to opt-in. Because it is an IAP, and not a subscription, you know there can be no charge at the end of the trial. There is a second IAP to purchase the app; it is equally clear that you don’t pay until you activate that IAP, and that you can do that any time. Everything is driven by the customer, and all opt-in. No uncertainty.

To me, that’s the core. Apple chose a clear, straightforward solution. Not one that will satisfy everyone, but one that won’t confuse users.
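The opt-in flow Drew describes boils down to very simple state logic: a paid unlock always wins, an active trial unlocks everything, and an expired trial silently reverts to the limited free tier, with no way to charge anyone automatically. A rough sketch, with a hypothetical 14-day trial length:

```python
from datetime import date, timedelta

TRIAL_DAYS = 14  # hypothetical; the real length is the developer's choice

def features_unlocked(purchased, trial_started, today):
    """Opt-in trial logic: nothing here can ever charge the customer.
    purchased: True once the unlock IAP has been bought.
    trial_started: date the free trial IAP was activated, or None."""
    if purchased:
        return True
    if trial_started is not None:
        return today <= trial_started + timedelta(days=TRIAL_DAYS)
    return False
```

Contrast that with a subscription-style trial, where the end of the trial is exactly the moment a charge happens; the simplicity above is what removes the uncertainty Drew is pointing at.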

This is an interesting take on the free trial issue, and a good balance to Daniel Jalkut’s excellent Ersatz Free Trials post from a few weeks ago.

[H/T Dman228]

James Corden teases Paul McCartney Carpool Karaoke episode

This came out last week, but I just ran across it over the weekend, thought it worth a share.

I’m a huge Beatles nerd, McCartney fan. This is a definite yes for me, a solid thumbs up. Gimmicky perhaps, but I’ll be watching nonetheless.

The wonder of image tracking in ARKit 2

Last week, we shared this example of image tracking using ARKit 2. Here’s another one:

https://twitter.com/nathangitter/status/1008397365018005504

Wonderful. I get that, perhaps, our AR future will be seen through glasses. But examples like these are useful even seen through the lens of your iPhone. To me, a relatively short AR transaction works just fine on an iPhone. And I do agree that a more immersive experience will require glasses or (way in the future) connected contact lenses.

Apple’s original content is further along than you think

Gene Munster, Loup Ventures:

At the helm of the company’s content efforts are Jamie Erlicht and Zack Van Amburg, who Apple hired away from Sony in 2017. Erlicht and Van Amburg ran Sony’s primetime series division since 2005. They will report directly to Eddie Cue, who runs Apple’s Services business. Apple has also hired an array of industry veterans from a range of backgrounds including streaming platforms like Hulu and Amazon Studios, and mainstay media companies like WGN America and Legendary Entertainment.

Nice rollup of Apple’s content efforts to date. Amazing to see it all together like this.

And this comparison with Netflix:

At first glance, it appears Netflix’s lead in original content is insurmountable. Netflix will end 2018 with close to 1,000 original titles and spend an estimated $3.5 billion on new titles this year. Keep in mind that almost half of that content is outside of the U.S. That compares to Apple, which has 2 titles out today and another 16 in the works (to be released in 2019 at the earliest), expecting to spend about $900 million this year.

But:

However, history is on Apple’s side, given that just five years ago Netflix had 13 original titles including the debut season of House of Cards. In other words, with the right resources, which Apple has, Apple’s original content titles can ramp from just under two dozen to potentially over one hundred. We note that Apple has stated they are focused on quality vs. quantity.

To me, that last is the key. Can Apple figure out how to deliver the quality? If I was looking at a model for how to do this, I would start with Netflix, but then move on to HBO. Netflix has plenty of swings and misses, HBO less so. If I was on the Apple team, I’d be asking the question, “What is HBO’s secret sauce?”

How Apple can fix 3D Touch

Eliz Kılıç:

3D Touch is missing the most obvious thing to be mainstream. Visual cues.

This. So much this. There is nothing in the interface that signals to a user that a particular element will respond to force/3D touch. The only way to tell is by trial and error. And then, once you’ve figured it out, you have to remember what works, or trial-and-error all over again.

And what’s great about this writeup is that Eliz not only identified the problem, but came up with an elegant solution. Check the last three images in the article and see if you can tell which controls are force touchable.

Eliz tied this up with a bow, handed it to Apple. Here’s hoping someone is listening.

iOS 12 lets you securely and automatically share your emergency location with 911

Apple:

iPhone users in the United States who call 911 will be able to automatically and securely share their location data with first responders beginning later this year with iOS 12, providing faster and more accurate information to help reduce emergency response times.

The way it works, prior to iOS 12:

To address this challenge, Apple launched HELO (Hybridized Emergency Location) in 2015, which estimates a mobile 911 caller’s location using cell towers and on-device data sources like GPS and WiFi Access Points.

And the new process:

Apple today announced it will also use emergency technology company RapidSOS’s Internet Protocol-based data pipeline to quickly and securely share HELO location data with 911 centers, improving response time when lives and property are at risk. RapidSOS’s system will deliver the emergency location data of iOS users by integrating with many 911 centers’ existing software, which rely on industry-standard protocols.

And:

The FCC requires carriers to locate callers to within 50 meters at least 80 percent of the time by 2021. iOS location services are capable of exceeding this requirement today, even in challenging, dense, urban environments. This new feature allows Apple to make these benefits available to local 911 centers now rather than years from now.

Not sure of the details, but sounds like a more direct, efficient process, yielding more accurate locations well ahead of the FCC required date.

Downside to Siri shortcut “trigger phrases” in iOS 12, but one with a solution already in place

From Reddit:

When I was using an Amazon Echo, my biggest complaint was that each third party “skill” has a specific voice command associated with it, and any deviation from that syntax would cause Alexa to not recognize what I was asking for (Haven’t used one in about a year, so this may have changed). I always found this frustrating in comparison to Siri, which can make sense of natural language. i.e. Siri can hear “get me directions to…” or “take me to…” or “how do I get to…” and either way it knows you want help with navigation. Apple made a big deal of this capability when Siri first launched.

But with Shortcuts, Siri behaves more like Alexa in that even though the trigger phrases are customized by the user (which is a one-up on the echo), Siri still requires the exact phrase every time.

And:

Remembering one or two custom phrases isn’t a big deal. But if this is the way Apple is going to open Siri up to third party apps, requiring users to remember dozens of specific trigger phrases (custom or not) is, I think, a step backwards for Siri.

First things first, there is a muddying of the waters at work here. The term Siri Shortcuts is associated with the coming Shortcuts app, which lets you build custom workflows that you can fire off as you like. You can assign a trigger phrase to a shortcut which, as the Reddit user points out, must be an exact match for Siri to fire it.

If you build a lot of these, you might run into a problem, but this is a problem with an easy solution. Apple maintains a list of all your trigger phrases, in Settings > Siri > My Shortcuts. [H/T Marcus Mendes]

Interesting point, though. I wonder if Siri will eventually be able to “machine learning” its way to an educated guess as to the shortcut you wanted if you are pretty close.
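For what it’s worth, that kind of “pretty close” guessing doesn’t necessarily require heavy machine learning; even simple fuzzy string matching gets you part of the way. A sketch using Python’s standard difflib, with hypothetical trigger phrases:

```python
import difflib

# Hypothetical trigger phrases a user might have recorded.
trigger_phrases = ["get pizza", "text mom", "heading home", "surf report"]

def guess_shortcut(heard, phrases, cutoff=0.6):
    """Return the closest trigger phrase, or None if nothing is close enough."""
    matches = difflib.get_close_matches(heard.lower(), phrases, n=1, cutoff=cutoff)
    return matches[0] if matches else None
```

So “get a pizza” would still land on the “get pizza” shortcut, while an unrelated request falls through to normal Siri handling. The real problem is harder (the matching has to happen on the speech-recognition side, against intents rather than strings), but the exact-match requirement does seem like a solvable limitation.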

Apple, Grayshift whac-a-mole

From this New York Times article:

Apple said it was planning an iPhone software update that would effectively disable the phone’s charging and data port — the opening where users plug in headphones, power cables and adapters — an hour after the phone is locked. While a phone can still be charged, a person would first need to enter the phone’s password to transfer data to or from the device using the port.

And from the Elcomsoft blog:

In the second beta of 11.4.1 released just days ago, activating the SOS mode enables USB restrictions, too. This feature was not present in the first 11.4.1 beta (and it is not part of any other version of iOS including iOS 12 beta). In all other versions of iOS, the SOS mode just disables Touch/Face ID. The SOS feature in iOS 11.4.1 beta 2 makes your iPhone behave exactly like if you did not unlock it for more than an hour, effectively blocking all USB communications until you unlock the device (with a passcode, as Touch ID/Face ID would be also disabled).
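Putting the two reports together, the policy itself is simple to state, even if the engineering isn’t. A sketch of the described behavior, where the one-hour window comes from the Times report and the immediate SOS lockout from Elcomsoft’s findings:

```python
USB_LOCKOUT_SECONDS = 3600  # one hour, per the reported iOS 11.4.1 behavior

def usb_data_allowed(seconds_since_unlock, sos_activated):
    """Sketch of the described policy: data over the Lightning port is
    blocked once the phone has been locked for an hour, or immediately
    when SOS mode is activated. Charging is unaffected either way."""
    if sos_activated:
        return False
    return seconds_since_unlock < USB_LOCKOUT_SECONDS
```

In other words, a seized phone becomes a charge-only device within an hour, and a user who squeezes the SOS buttons closes the window instantly.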

And this from Motherboard, with the title Cops Are Confident iPhone Hackers Have Found a Workaround to Apple’s New Security Feature:

“Grayshift has gone to great lengths to future proof their technology and stated that they have already defeated this security feature in the beta build. Additionally, the GrayKey has built in future capabilities that will begin to be leveraged as time goes on,” a June email from a forensic expert who planned to meet with Grayshift, and seen by Motherboard, reads, although it is unclear from the email itself how much of this may be marketing bluff.

And:

A second person, responding to the first email, said that Grayshift addressed USB Restricted Mode in a webinar several weeks ago.

My instinct is that this is, indeed, a marketing bluff. But one without teeth if it doesn’t work.

Whac-a-mole (note the spelling, a trademark thing, I think).

Apple launches new wave of Mac ads

[VIDEO] All of these ads (embedded in the main Loop post) are posted under the campaign slogan Behind the Mac. I’ll post the short YouTube writeup for each ad, followed by the ad itself. Each ad ends with the phrase Make something wonderful, followed by Behind the Mac.

On the sad state of Macintosh hardware, with a twinkle of hope

Quentin Carnicelli, Rogue Amoeba blog, posts this list of last updates from the indispensable MacRumors Buyer’s Guide:

  • iMac Pro: 182 days ago
  • iMac: 374 days ago
  • MacBook: 374 days ago
  • MacBook Air: 374 days ago
  • MacBook Pro: 374 days ago
  • Mac Pro: 436 days ago
  • Mac Mini: 1337 days ago

And:

Worse, most of these counts are misleading, with the machines not seeing a true update in quite a bit longer. The Mac Mini hasn’t seen an update of any kind in almost 4 years (nor, for that matter, a price drop). The once-solid Mac Pro was replaced by the dead-end cylindrical version all the way back in 2013, which was then left to stagnate. I don’t even want to get started on the MacBook Pro’s questionable keyboard, or the MacBook’s sole port (USB-C which must also be used to provide power).

As if by magic, Apple released four new Mac ads yesterday, obviously a coincidence, but a good sign nonetheless.

Follow the money. We recently posted this article quoting numbers from Apple’s last holiday quarter:

  • iOS revenue: $68 billion
  • Mac revenue: $6.9 billion
  • iOS units sold: 90.4 million
  • Mac units sold: 5.1 million

Going purely by the numbers, clearly iOS should have Apple’s attention. But the Mac remains a vital part of Apple’s ecosystem. Given the WWDC announcement of the effort to port iOS apps to the Mac, and the new ad campaign, I have to feel a bit optimistic that Apple is turning their massive battleship back towards the Mac.
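A quick back-of-the-envelope on those quarterly figures: iOS dwarfs the Mac in total revenue, but each Mac sold brings in nearly twice the revenue of the average iOS device.

```python
# Holiday-quarter figures from the list above.
ios_revenue, ios_units = 68_000_000_000, 90_400_000
mac_revenue, mac_units = 6_900_000_000, 5_100_000

ios_asp = ios_revenue / ios_units  # average revenue per iOS device sold
mac_asp = mac_revenue / mac_units  # average revenue per Mac sold
print(round(ios_asp), round(mac_asp))
```

Roughly $750 per iOS device versus roughly $1,350 per Mac, which is one more reason the Mac is worth the battleship turn.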

Hands on with macOS Mojave

[VIDEO] Per usual, Jeff Benjamin does a wonderful job walking through macOS Mojave (embedded in the main Loop post). So much new stuff. Love the new screenshot capabilities. Another tick towards iOS with the screenshot hanging around in a floating window for you to edit.

The various flavors of Siri shortcuts

Rene Ritchie, iMore:

Developers can tap into the Continuity-derived user activity to make locations available within their apps. And they can use a new Intents API to let the system know, more expansively, the actions available in the app.

Once that’s done, Siri keeps track of what you do with them and when you do it, and tries to guess when you’ll do it next.

Rene clarifies this with examples:

For example, if you always order pizza before the game on Sunday, instead of having to go to the pizza app, pick your favorite, and place your order, it’ll have a banner waiting for you right on your Lock screen ready with your favorite order.

If you always text your child to say you’re on your way home from work, instead of having to go to messages, find the conversation with your child in the list, and tap to start a new message, a banner will be waiting for you, ready and able to send that message with a single tap.

Rene’s article is long and full of interesting detail. But the part that struck me was the way he distinguished among the shortcuts you create yourself (using the Shortcuts app, rebranded from Workflow), the voice triggers you create to label shortcuts (Hey Siri, “Get pizza”), and the shortcuts Siri creates (driven by user activity reported by various apps) and suggests to you.

I’ve been using the iOS 12 beta for a week now. In that time, my Lock screen has offered to put my phone into Do Not Disturb when a Wallet pass, OpenTable, and even simply iMessage indicated I might be having dinner or breakfast.

It hasn’t offered to let me order my usual Philz Mint Mojito, because I don’t have the Shortcuts-enabled version of that app — yet! — but it has offered me directions to Philz after I used Maps for walking directions the first couple days of the conference.

Read Rene’s post to take advantage of his iOS 12 experience, wrap your head around what’s coming. Good stuff.

How the 12.9-inch iPad Pro took me by surprise and replaced my laptop

Paul Stamatiou:

Against my better judgement, I decided to give tablets one more chance. On the last day of a vacation that started in Rwanda and ended in the UK, I walked into the Regent Street Apple Store in London and purchased a 12.9″ iPad Pro and Smart Keyboard.

That was a few months ago. A few months in which my 13″ MacBook Pro has not even been powered up once. Any new gadget novelty has long since worn off and I’m still loving and using this iPad Pro daily.

What changed this time around?

Let me be clear about something. Though I often write about why I am still on a MacBook and the things that prevent me from moving full-time to an iPad Pro, I would love to make that move. I would love for an iPad to fill all my needs. I own a number of iPads and use them all the time.

Every time I read one of these stories, I dig down to see if, perhaps, the time has arrived. I do see us getting closer, but there are still a few things that make the MacBook my central computing device.

From Paul:

The viewing angle of the iPad Pro is not adjustable. You just get the two modes and that’s it. It’s okay most of the time but on a few occasions (usually when I’m slouching in a chair…) I have found myself stuffing something behind the iPad Pro to prop it up a bit more.

And:

Rather trivial but it’s hard to use the keyboard in a more relaxed, casual couch setting without placing a hard surface underneath.

The MacBook is its own platform. You could balance it on your lap, a small tray table (think airplane), even on a soft patch of grass. The iPad-keyboard combo is not stiff enough to work on non-rigid surfaces. Sure, I can use my iPad anywhere, but to type at speed, I need the keyboard, and the iPad-keyboard combo requires a rigid surface.

More from Paul:

Repetitively placing a cursor or selecting text is a chore. It’s tedious to constantly move your hand from the keyboard up to the middle of the screen as opposed to a closer adjacent mouse as you have become accustomed to with a computer.

Text editing is the one thing I can’t get past. I would love to write a Loop post on my iPad. But typing and editing anything more than a paragraph there is a chore. I wish I could solve this. I want to believe!

All told, Paul made the transition. Terrific read, lots and lots of interesting detail, all written on the iPad Pro.

Apple tries to stop developers sharing user data

Bloomberg:

Apple Inc. changed its App Store rules last week to limit how developers use information about iPhone owners’ friends and other contacts, quietly closing a loophole that let app makers store and share data without many people’s consent.

The move cracks down on a practice that’s been employed for years. Developers ask users for access to their phone contacts, then use it for marketing and sometimes share or sell the information — without permission from the other people listed on those digital address books.

Glad Apple made this move. Amazing to me that Apple continues to embrace privacy, with the constant lure of moving to the dark side.

A beautiful example of iOS 12 and ARKit 2

Watch the video embedded in this tweet:

https://twitter.com/nathangitter/status/1008397365018005504

To me, this really puts the augmented in augmented reality. This is a taste of what’s coming.