October 18, 2018
Written by Dave Mark
Nick Corasaniti, New York Times:
> IT’S MEMORIAL DAY WEEKEND, 1976, and nearly 1,000 people pack a tiny club in Asbury Park, N.J., to watch a local band and a local legend named Bruce Springsteen share their mix of rock and soul with a wider world that had all but written off this struggling seaside city for good.
And:
> Since it opened in 1974, the club, the Stone Pony, has been the beating heart of Asbury Park, a beacon for musicians and fans alike. But its survival, much like that of its host city, has been a constant battle, a story of resilience and revival, of sold-out shows.
>
> Here is the renowned club’s history, as told by the owners, musicians, staff and fans who have called its dark black interior and low-slung stage home.
Growing up in New Jersey, Springsteen was almost a religion. And the Stone Pony was the center of his universe. This is a brilliantly told tale of a major branch of the rock family tree.
Written by Shawn King
CNET:
> The Pixel 3 camera holds its own against Apple’s iPhone XS despite having one camera tied behind its back. It all but dispenses with the camera’s flash, using new low-light shooting abilities instead. And it offers enthusiasts a radically new variety of raw image that opens up photographic flexibility and artistic freedom.
>
> It’s all possible because of a field called computational photography, a term invented in 2004 by Google distinguished engineer Marc Levoy while he was at Stanford, before he moved full-time to Google Research. Long gone are the days when photography was all about glass lenses and film chemistry. Fast receding are first-generation digital cameras that closely mirror the analog approach.
>
> Now our cameras rely as much on computers as optics. And what we’ve seen so far is only the beginning.
I post this less to say one camera is better than the other than to point out that, as advancements happen in computational photography, everyone who shoots photos with a smartphone benefits as manufacturers leapfrog each other with each release.
I think there will always be a place for DSLRs, but for more and more people, not only is their smartphone “good enough,” it’s become a damn good camera in its own right.
October 16, 2018
I usually write my review of the new iPhones about a week after I get one of the new devices, but this time around, I just kept using and trying out the different features of the iPhone. Now, I want to give you my thoughts on the iPhone XS and XS Max.
When it comes right down to it, the XS and Max are exactly the same iPhone, except for the size. All of the internals, cameras, and design provide you with the same experience. The only real difference is the display size, but that’s a big difference.
For me, there is a clear winner in the size comparison between these iPhones—it’s the XS Max. The Max is the perfect iPhone for me—the bigger screen allows me to see everything I need, and a lot of the time I don’t even need to put on my reading glasses to see everything.
My favorite iPhone before this was the “Plus” versions because of the bigger size, but the Max takes that to a whole new level. The case size of the Plus and Max are basically the same, but the Max is all screen, whereas the Plus had that prominent chin and forehead.
Some people don’t like the “compromise” of losing one-handed control with the XS Max and prefer the XS model. That’s fair, I guess, but when I look at how I use the iPhone, I don’t tend to use one hand for anything but scrolling. Ultimately, the amount of time I spend using one hand doesn’t outweigh the extra screen size I get with the Max.
When I type on the iPhone, I use two hands regardless of the model I’m using. I scroll using my thumb on the same hand I’m holding the iPhone with, and other tasks like pinch and zoom require both hands. Size wins out for me.
The camera on both XS models is magnificent. If you’ve listened to my podcasts, you know I’m not a great photographer, so I need all the help I can get.
Part of the reason I didn’t post this sooner is that I was waiting for a live concert to take some pictures. I was scheduled to see Ozzy Osbourne last weekend, but he canceled, so I don’t have any low light shots to show, but I do have some experience in a different lighting situation.
I went to a San Jose Sharks game on opening night with some friends. If you’ve been to a hockey game, you know it’s terrible for lighting—actually, there is too much light on a white ice surface, which just destroys your pictures.
We all took turns with the XS Max camera and the iPhone X camera—the difference was stunning. Using the iPhone X, the picture of my friends turned out okay, but the background was really bright white, and you couldn’t make out the people in the crowd on the other side of the rink. They were just blotted out by the white lights, and I assume reflections from the ice.
The Max was able to minimize the effects of the white ice and lighting, so you could easily make out people on the other side of the rink, and even signs they were holding. Even in this unnatural environment, the picture seemed more natural. (I don’t have these pics to show either—another long story.)
The bottom line with the camera is that the Smart HDR Apple developed for the iPhone cameras really works. Even the most average of photographers will benefit from this camera.
Of course, there are a lot of other advanced features with the new cameras like bokeh (background blur), depth control, and Quad-LED True Tone flash. All of these features are designed to bring you the best pictures possible, and they do.
As someone who listens to music on my iPhone a lot, I was impressed with the new stereo speakers. The speakers are loud, clear, and they provide you with a broader stereo sound than any previous generation. I even found myself turning the volume down a bit on the Max.
As an aside, a friend asked why I listen to music on my iPhone so much. The only reason I can think of is so that when I’m listening to a great playlist or station, and I go for a walk or a drive, I can just continue listening. I don’t actually know why I listen to music so much on my iPhone, but I do.
I would like to touch on one thing—the iPhone XR. I’ve heard people say that the iPhone XR will hurt sales of the XS and Max. I don’t believe that to be true. I think the majority of people who were going to buy the XS models are still going to buy those models. There are a few who will want to be a little different and get an XR for the color option, and that’s fine.

The iPhone XR is going to bring in a whole new group of people who want a large-screen iPhone but don’t want to pay the additional cost of the XS. The XR is a fantastic iPhone in its own right, but I don’t think sales of one model are going to hurt the others.
There’s a lot to like about this year’s three iPhone models. There are a ton of upgraded features, from the processor to the camera and the display. There is nothing I’ve seen in my time using these devices that would make me hold back on recommending either of the two new iPhones.
Written by Dave Mark
Craig Grannell:
> The standard macOS interface has quite a few semi-transparent elements, which like frosted glass provide a glimpse of what’s beneath them. At Apple events, execs go giddy about how pretty this is. In use, these elements vary from being distracting to outright dangerous. For example, if you have a motion-sickness issue and an animating web page is sitting behind a semi-transparent element, it can take a while before you realise it’s affecting you, by which time it’s too late and you’re already dizzy.
And:
> “Fine”, says Apple, grumpily, “so just turn on Reduce transparency”. Only it’s not that simple. Because when you do, Apple designers get in a strop and hurl logic out of the window. What you’d expect to happen is for macOS to remove the semi-transparent bits. So instead of Finder sidebars or the macOS app switcher showing what’s beneath them, they’d just have a neutral solid background. Nope. Instead, in its infinite wisdom, Apple’s decided those components should instead be coloured by your Desktop background.
Stephen Hackett put together a few screenshots to show off this effect.
To me, this sort of thing happens due to a lack of a specific branch of testing. Seems to me, someone at Apple should reach out to Craig Grannell (and other leaders in the accessibility community) to beta test new software/hardware early in the cycle, so they have time to address these sorts of issues. I believe accessibility testing would be greatly enhanced by the voices, hands, and eyes of real-world experience.
Written by Dave Mark
Follow the link to see Apple’s original bagel emoji, and the new “fixed” version. While the new version is undoubtedly better (everything is better with a shmear of cream cheese), it still (IMO) falls far short of truly reflecting a real-life bagel.

That said, I do recognize how trivial this is. But I was born with a deep, familial appreciation of bagels, so this hits home for me.
I think the bagel Wikipedia page has some images that might be a good starting point for rev 3. The key is texture.
Written by Jim Dalrymple
My thanks to Bare Bones Software for sponsoring The Loop this week. Do you sling code or compose with words? Whether you’re an app developer, web developer, systems admin or just want a powerful writing tool that stays out of your way, BBEdit is worth checking out.
I’ve been using BBEdit since 1995, so I know firsthand that it can handle any job I throw at it.
BBEdit is crafted in response to the needs of writers, web authors, and software developers, providing an abundance of high-performance features for editing, searching, and the manipulation of text.
BBEdit 12 is 64-bit ready. Download and try it today!
Written by Shawn King
VentureBeat:

> Google today announced that the camera within the Translate app for iOS and Android is now able to translate 13 new languages including Arabic, Bengali, Hindi, Thai, and Vietnamese. The update is available starting today and will be rolled out to Translate users worldwide in the coming days, a company spokesperson told VentureBeat in an email.
>
> Translation of text seen in photos or in real-time on billboards or menus was added for Google Translate in 2015, and started with 27 languages. To carry out visual translations, simply open the Translate app and choose the camera icon. Translations can be carried out in real time.
As I do research for a (belated) honeymoon in Florence, Italy, and my photography workshop in Lisbon, Portugal, I’ve been testing the Google Translate app, and I’m very impressed with how well it does. I don’t speak Italian or Portuguese, so I don’t know how accurate the translations are, but they’re definitely good enough to get the basic idea.