How open is macOS, really?


Alan Mendelevich

Pre-conception

Until recently I only tangentially touched the Apple ecosystem, mostly by helping family members with their iPhones and iPads and, occasionally, by dealing with my web projects misbehaving in Safari. At the same time, I follow the tech industry closely, and a substantial chunk of that scene consists of either avid Apple users or plain Apple fanboys.

The mantra I always hear from these people about Apple’s “computer-like” ecosystem goes something like this:

If you want a tightly controlled ecosystem, get an iPad. Want an open system — get a Mac.

Sounds logical and I had no reason to doubt this statement. Until I decided to…

Try to make an app

For the past 10+ years I have mostly worked on web-based software. You get used to your stuff working across the board (save for a few browser quirks you get to resolve). But for my recent idea a native app sounded like the most logical option.

For someone whose recent experience is mostly in web technologies (and, one may argue, for anyone really), Electron is a logical platform choice for an app that should run natively across the major desktop platforms.

And that’s what I did. I made a fairly simple app for myself in a couple of weeks and decided to share it with the world as well. I was scratching my own itch and there’s no business model behind it. I just thought people may find it useful, and as it doesn’t cost me anything to run, I’m happy to share it, not expecting anything in return.

Let’s set aside the point that you need access to Mac hardware just to compile the app for the Mac; that could be a different post, but it’s beside the point of this one. Once I was done with the first pass of the app, I uploaded it to the server and tried to install it on a Mac from there. When I tried to run it, I got a dialog saying the app “is damaged and can’t be opened. You should move it to the Trash.”

Note the wording here. The message just tells the user that the file is “damaged” and that there’s nothing to be done but move it to the trash. Silly me thought some error had happened during the upload or download of the file. Or maybe during packaging. But it didn’t take long for me to realize that this is “by design”.

Turns out that to be installable on macOS Sequoia (and, presumably, later versions) an app has to be both code-signed and notarized by Apple. Or in human terms, as a modern macOS user you have only these two options for getting apps:

  1. Install apps from the App Store only, or

  2. Install apps from the App Store and from “known developers”.

There’s no “user friendly” way around this. On a supposedly “open” system.

Gatekeeping

The subsystem handling this is literally called Gatekeeper. And while as a user you may [rightfully] think that it’s keeping you safe from malicious software, you probably never think about it keeping all the useful enthusiast-made and open-source software outside those gates as well.
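To be fair, there is an escape hatch, though hardly a user-friendly one: as far as I can tell, Gatekeeper only intervenes for files carrying the com.apple.quarantine extended attribute that browsers attach to downloads. A technically-inclined user can clear it from Terminal (the app path below is a placeholder):

```shell
# List extended attributes to confirm the quarantine flag is present
xattr -l /Applications/MyApp.app

# Recursively remove the quarantine attribute from the bundle;
# after this, Gatekeeper no longer treats it as an untrusted download
xattr -r -d com.apple.quarantine /Applications/MyApp.app
```

Expecting ordinary users to run Terminal commands on a file the OS just called “damaged” is, of course, not a realistic distribution strategy.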

For me to give you my non-commercial software for free from my own server (we are not talking about the App Store here) I must pay Apple $100.

OK, then. Sunk cost fallacy be damned.

Not being familiar with the ecosystem, I figured that since the code-signing certificate Apple issues is valid for 5 years, I could pay them once, never publish anything in the App Store, and stretch that $100 over 5 years. Still a random expense, but OK, whatever. Even here I was wrong.

Apparently, code-signing is not enough. Each time you want to publish a new version, you also have to “notarize” the package (in addition to signing it). In practical terms this means your app is uploaded to Apple, they run a malware scan on it, and they issue a “stamp” declaring it malware-free.

As you may have guessed by now, this notarization step requires an active Apple Developer account. Meaning that it’s not $100 for 5 years but $100 for each year when you want to release an update for your macOS app.
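For concreteness, here is roughly what that per-release ritual looks like from the command line, sketched with Apple’s standard tooling. The signing identity, keychain profile name, and file names are placeholders, and the whole thing presupposes an active (paid) Apple Developer account:

```shell
# Sign the app with a Developer ID certificate and the hardened runtime
codesign --force --options runtime \
  --sign "Developer ID Application: Jane Doe (TEAM123456)" MyApp.app

# Zip the bundle and submit it to Apple's notary service, waiting for a verdict
ditto -c -k --keepParent MyApp.app MyApp.zip
xcrun notarytool submit MyApp.zip --keychain-profile "notary-profile" --wait

# Staple the returned ticket so the app passes Gatekeeper even when offline
xcrun stapler staple MyApp.app
```

Tools like electron-builder can automate these steps as part of the packaging flow, but they still need the same paid credentials under the hood.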

Abandonware

It’s not impossible but quite unlikely that I will get into the business of making native macOS or iOS apps. This mail merge app may end up being all I do in this space until this time next year. Suppose you find a bug in it 12+ months from now. How likely do you think it is that I will pay another $100 to Apple for them to allow me to fix a bug in my tiny free app?

As if people losing motivation to work on free software weren’t enough, let’s add a financial penalty to the injury.

What’s up on the other side? (Windows)

Windows has a similar technology called SmartScreen, but it comes with two key differences:

  1. If you have sufficient privileges on the PC, you can override it right there: the SmartScreen prompt itself offers a “Run anyway” option.

  2. As far as I understand (correct me if I’m wrong), the “malware” scan happens on the client, so there’s no “notarization” step when you decide to pay for code-signing your apps.

To be fair though, the process of code-signing non-store apps on Windows seems to be more complicated and likely more expensive than it is on Apple’s side. But at least it’s not mandatory (at least for now).

At the same time, if you don’t mind some extra ceremony, you can now publish to the Microsoft Store absolutely free. And that should presumably take care of all the code-signing headaches.

Web is the future

All said and done, my conclusion is to avoid native app development like the plague, unless it’s absolutely necessary or you have extra resources to deal with the burdens. In this day and age, the web can handle 95%+ of software use cases, with fewer headaches for you and, likely, a better experience for your users. Especially if you consider update-cycle predictability part of the UX (I do).

For the remaining 5% of cases: don’t believe the hype about the “openness” of modern commercial operating systems, and be ready to deal with the setbacks.
