The Robservatory

Robservations on everything…


macOS quality as measured by update release rate

There’s a lot of chatter out there that High Sierra is potentially the worst macOS release ever, in terms of bugs and broken or missing functionality. From the recent “Month 13 is out of bounds” log-spewage problem to the root “no password required” bug (whoops!) to a variety of other glitches, High Sierra has presented many users, myself included, with a near-constant stream of issues.

But is it actually any worse than prior macOS/OS X releases? (I’ll just call it macOS from here on.) There’s really not a lot of information to go on, given Apple’s very private development process and non-public bug tracker.

However, the one data source I do have is a list of every macOS release date. With 10.13.2 having just been released, I thought it might be interesting to see how quickly the third update arrived on each version of macOS. If High Sierra is worse than usual, I’d expect that the time required to reach its third update would be notably less than that of other releases.

After some fiddling in Excel, the data supported my hypothesis, with some caveats and observations…

Here’s a visual comparison that makes High Sierra’s place in the history of macOS releases really stand out (click for a much larger version)… (This used to read “perfectly clear,” but then I explained it for a paragraph, so it’s not really perfectly clear.)

With only 72 days between the release of the OS and its third update, High Sierra becomes the third-most-quickly-updated macOS release ever. What makes that even worse is that the first-place release is the original Mac OS X 10.0, which one would expect to receive a lot of frequent updates (and it did). And High Sierra trails the second-place release, Mac OS X 10.8, by only one day.

So if you ignore the original release, High Sierra is basically tied for first in a contest that you don’t really want to win.
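The “days to third update” figure is just date arithmetic. Here’s a minimal sketch of the calculation, using the High Sierra dates from Apple’s public release history (assumed here, though they agree with the 72-day figure above):

```python
from datetime import date

# Release dates per Apple's public release history (assumed here).
high_sierra_release = date(2017, 9, 25)   # macOS 10.13.0 ships
third_update = date(2017, 12, 6)          # macOS 10.13.2 ships

# Subtracting two dates yields a timedelta; .days gives the gap.
days_to_third_update = (third_update - high_sierra_release).days
print(days_to_third_update)  # → 72
```

The same subtraction, repeated for each macOS version’s release and third-update dates, produces the data behind the chart.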

Now for the caveats and observations…

  • Obviously, there’s no proof that release rate is a measure of software quality. But I think it’s a not-all-bad proxy, barring any additional data sources (of which I have none). If the OS is stable and solid, releases will be few and far between, because nothing needs updating and Apple doesn’t usually add any features between major releases.
  • Regardless of the viability of the method, the chart shows three clear outliers, one of which is High Sierra.
  • One of the three 10.13 releases was a supplemental update, so should that count as a “regular” release? There have been other supplemental updates, including two in 10.8, so I felt it fair to include it. And the fact that it was required implies there was an issue that didn’t get caught in testing. But even if you exclude the update, High Sierra’s rate for the other two updates is still incredibly quick when viewed against prior releases.
  • Apple switched to annual releases with OS X 10.8, so you could argue that comparisons before and after that date shouldn’t be made. But if you look at 10.9 and 10.10, they went roughly the same number of days until their third updates as did many of the pre-annual-cycle releases.
  • Four of the five “quickest to three” releases have happened since Apple made the switch to annual releases. To me, this implies that quality has dropped as a result of the transition to an annual schedule. I am all for a return to major updates every two years, with a primary emphasis on fixing bugs between major releases.
  • The trend line starting from 10.9 through 10.13 is generally down, with a slight uptick for 10.12. That’s also disconcerting, as it implies that things are continuing to get worse, not better.

I admit this is about as scientific as tossing marshmallows at a cup while blindfolded, and concluding that cups magically reject marshmallows because so few wind up in the cup. I still find it interesting, though, and I plan to keep updating this graph, both for additional dot releases within 10.13, and for 10.14 and beyond.

4 Comments

  1. I love the marshmallow metaphor. Anyway, one could argue that this is what you would actually expect as companies switch to a more agile development model. It would be my preference to see shorter, smaller release cycles. There is likely a balance between:
    a) more features released with more bugs that are fixed more quickly
    b) fewer features and longer-lasting bugs
    For now I am pretty happy with the pace of innovation and its rather slight costs.

    What I am less happy with is that earlier versions of iPhoto and iTunes were in many ways better from a feature perspective than the latest versions. Or perhaps I should say that, with those products, the features that were taken away (because they added complexity and were not used by the majority of users) were a very important part of the feature set for me. That upsets me way more than excessive log entries, etc.

  2. Great analysis! What fun! I appreciate your appropriate disclaimer at the end about marshmallows and cups! (You could have said, “cups of hot chocolate”!)

    Now, to address the only issue I saw in your analysis, and it’s a nitpick.

    You wrote:

    “Here’s a visual comparison that makes High Sierra’s place in the history of macOS releases perfectly clear.”

    “perfectly clear” was a gross exaggeration, especially as its place in history was not readily apparent and you had to take an entire column to explain all the caveats, exceptions, counterpoints, etc.!!

    1. Good point. I was trying to say that the graph made it clear that High Sierra was an outlier, like 10.0 and 10.8, but I overly shortened it. I’ll edit it and add a footnote explaining what I changed.

      thx;
      -rob.

  3. I also learned something about Apple’s nomenclature! That it’s spelled “macOS” and not “MacOS”! Thanks!


The Robservatory © 2018