macOS App Security Histrionics

November 16, 2020

Nick Heer, with an excellent breakdown of last week’s drama regarding the macOS signature verification process that caused my Thursday panic attack:

For a few hours on Big Sur’s launch day, Apple’s overwhelmed servers prevented a macOS process called trustd from quickly verifying signatures using the Online Certificate Status Protocol, or OCSP. This affected many versions of macOS and manifested as applications taking forever to launch, and some general slowness.

This problem sucked, but it was resolved quickly. I hope a future macOS update has a patch for whatever bug created this misbehaviour. But, this being the internet, it somehow snowballed into a crisis — macOS is apparently spying on users, it’s worse in Big Sur, and that means Apple’s new M1 products that run nothing but Big Sur are evil surveillance devices that should not be bought by anyone. Or, at least, that’s what you would think if you read Jeffrey Paul’s article that hit the top of Techmeme and Hacker News[.]
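The behaviour Heer describes can be sketched in miniature. The snippet below is a hypothetical illustration, not Apple’s code: it models the soft-fail pattern of these checks, where an unreachable responder does no harm, but a server that is up yet overloaded makes every launch wait out the timeout before the check is abandoned. The function names and the 5-second timeout are assumptions for the sketch.

```python
# Illustrative sketch (NOT Apple's implementation) of why a slow OCSP
# responder stalls app launches even though the check "fails open".
import time

def check_signature_ocsp(responder, timeout=5.0):
    """Ask a (simulated) OCSP responder whether a certificate is revoked.

    Returns True (allow the launch) when the responder answers 'good' OR
    when the check cannot complete in time -- a soft-fail policy.
    """
    try:
        status = responder(timeout)
    except TimeoutError:
        return True  # soft fail: responder unreachable, allow the launch anyway
    return status == "good"

def healthy_responder(timeout):
    return "good"  # answers immediately: no user-visible delay

def overloaded_responder(timeout):
    # Server is reachable but slow, so the client waits, then gives up.
    delay = 3.0  # hypothetical queue delay on an overwhelmed server
    if delay > timeout:
        time.sleep(timeout)  # the user-visible stall before soft-failing
        raise TimeoutError
    time.sleep(delay)
    return "good"

# Both launches are ultimately allowed, but the second one only after
# the full timeout has elapsed -- apps "taking forever to launch".
print(check_signature_ocsp(healthy_responder))
print(check_signature_ocsp(overloaded_responder, timeout=0.2))
```

The point of the sketch is that the failure mode was latency, not denial: the check never blocks an app permanently, but every launch pays the waiting cost while the responder is overwhelmed.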

This is another case where the first article to spread around the web is a bit overblown and sensational, while the truth is less interesting or flashy, so it doesn’t get as much coverage. The main issue in the aftermath of the event was not that the service went down, but rather the concern that Apple is sending usage information back to its servers to “keep track” of your computer usage, what apps you run, and from where.

Apple posted a support article over the weekend to clarify the security procedure:

These security checks have never included the user’s Apple ID or the identity of their device. To further protect privacy, we have stopped logging IP addresses associated with Developer ID certificate checks, and we will ensure that any collected IP addresses are removed from logs.

In addition, over the next year we will introduce several changes to our security checks:

  • A new encrypted protocol for Developer ID certificate revocation checks
  • Strong protections against server failure
  • A new preference for users to opt out of these security protections

Call me naive, but I believe Apple and take their word in this instance. Apple is probably the biggest corporate advocate for user privacy, and it makes no sense for its business to break user trust in this case. This incident was clearly a mistake that exposed some areas of the infrastructure that need improvement, and I’m happy to see Apple taking the opportunity to make things better in the long run.