Petition to make Logic Pro more accessible for blind users

I don’t usually post about petitions trying to get Apple (or any company) to do something, but I can get behind this one. With all of the articles on accessibility that The Loop Magazine has published over the last few issues, I believe this is an important issue that should be addressed.



  • Sigivald

    Can you provide any insight into what an accessible Logic Pro UX would even be?

    I mean, I can’t think of how you’d even start to make something as visually driven as Logic Pro “accessible” to the blind.

    (With a webpage, it’s easy – there’s text. Text can be read out.)

    But what does the tracking/mastering UI end up being if it’s not visual?

    Without even a hint of an answer to that, demanding that Apple “just make it happen somehow” seems a bit unreasonable.

    Thought experiment: Can Adobe make an “accessible” Photoshop?

    Second thought experiment: Would it ever possibly be worth the development expense to do so?

    • Sigivald

      To clarify, I don’t mean that you’re obliged to have a UX plan to ask Apple to try.

      I do mean that it’s a really tough problem – and might be one with no good solution at all.

      And that nobody involved in proposing it seems – judging by the lack of detail in the petition – to have put any thought into those problems.

    • Scott

      The specifics of how to do it probably weren’t included in the petition because Apple themselves have already designed the screen reader software that blind people would be using to access Logic, as well as provided third-party developers with the APIs it feeds off and documentation on how to use them. Essentially, the problem boils down to Apple not following their own guidelines. If they were, the bulk of the GUI elements in Logic would be exposed to accessibility out of the box, automagically.

      To answer the question of how it would work as a user experience, though: that complex visual interface you’re used to looking at can easily be thought of as just a table of text content. When you see a fader, you know which track it applies to and can gauge the volume by judging its position visually. You might recognise the state of mute or solo buttons by their colour, etc. The computer knows the same things as an actual dB value for the volume and an on/off status for toggles like mute or solo buttons, and that’s all simple stuff for screen reading software to speak (there’s a rough sketch of what that looks like in code below).

      To address the Photoshop comparison: ironically, there are a lot of apps out there where the actual GUI is largely accessible, even if the content that GUI lets you work with is inherently visual. That isn’t the case with Logic, though, because the content you’re working with is audio, not photos. The fastest pro audio editors rely on a combination of keyboard shortcuts, their ears, and looking at waveforms. Nobody is expecting Apple to invent a way for blind people to do the latter, but to be honest, looking at waveforms is the least essential part of that workflow.

      This isn’t a pipe dream, because there are blind people making decent music every day using DAWs on other platforms. Sure, you could say “well, if another platform works, they should stick with that”, but is Apple’s own product being usable with Apple’s own screen reader really that much to ask? I think not.
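
      For the curious, here’s a minimal sketch of what exposing a fader could look like on the developer side, using AppKit’s standard accessibility hooks. The class and property names (FaderView, gainDB, trackName) are made up for illustration; the override points are the real NSAccessibility methods that VoiceOver reads:

      import AppKit

      // Hypothetical custom fader view. FaderView, gainDB and trackName are
      // illustrative names, not anything from Logic's actual code base.
      final class FaderView: NSView {

          // The "real" state behind the visual fader cap: gain in decibels.
          var gainDB: Double = 0.0 {
              didSet {
                  needsDisplay = true
                  // Tell the screen reader the value changed so it can announce it.
                  NSAccessibility.post(element: self, notification: .valueChanged)
              }
          }

          var trackName = "Track 1"

          // Mark the view as something VoiceOver should land on.
          override func isAccessibilityElement() -> Bool { true }

          // Report what kind of control this is...
          override func accessibilityRole() -> NSAccessibility.Role? { .slider }

          // ...which track it belongs to...
          override func accessibilityLabel() -> String? { "\(trackName) volume" }

          // ...and the underlying dB value, instead of the cap's pixel position.
          override func accessibilityValue() -> Any? {
              String(format: "%.1f dB", gainDB)
          }

          // Let VoiceOver users nudge the fader with the standard
          // increment/decrement actions, no mouse or eyes required.
          override func accessibilityPerformIncrement() -> Bool {
              gainDB = min(gainDB + 1.0, 6.0)
              return true
          }

          override func accessibilityPerformDecrement() -> Bool {
              gainDB = max(gainDB - 1.0, -96.0)
              return true
          }
      }

      With those overrides in place, VoiceOver can announce something like “Track 1 volume, slider, minus 6.0 dB” and let the user adjust it from the keyboard. That’s the whole point: the information is already in the app’s model; it just has to be handed to the accessibility API instead of only being drawn to the screen.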