18169, "bradcray", "Sanity check download counts scripting / logic / values", "2021-08-05T20:20:34Z"
Our download count graphs have been reporting stellar numbers over the past few releases, though sometimes I worry that they are unrealistically stellar. For example, given the apparent growth in downloads, we haven't seen a similar growth in user questions or involvement. Other things seem puzzling too, like the relatively low number of Homebrew downloads for 1.24 as reported by GitHub (~100) compared to previous releases or other means of getting the release. And do DockerHub users really account for such a large fraction of our downloads, given that we almost never seem to hear about users installing through Docker?
All this leads me to worry "Maybe our scripting is just broken." E.g., could we be doing something like counting a single download that happened on day 1 of the release for every subsequent day since then for one of these configurations?
To that end, I think it'd be valuable for us to audit these numbers: manually determine, as best we can, what we think the count for each technology should be, and compare that to what we're graphing. If we see disparate numbers, we should examine the scripts for bugs, or for changes over time in what's being reported to us, either of which could account for some of the puzzling figures above.
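For the GitHub portion of that audit, one low-effort cross-check is to tally the per-asset `download_count` fields that the GitHub releases API reports and compare those totals against what our graphs show. A minimal sketch of that tallying step (the `chapel-lang/chapel` repo name and the sample payload are illustrative; the field names follow the public GitHub REST API response shape):

```python
import json
import urllib.request


def asset_download_counts(releases):
    """Sum per-asset download_count values, keyed by release tag."""
    counts = {}
    for release in releases:
        total = sum(asset.get("download_count", 0)
                    for asset in release.get("assets", []))
        counts[release.get("tag_name", "?")] = total
    return counts


def fetch_releases(owner, repo):
    """Fetch release metadata from the GitHub REST API (network required)."""
    url = f"https://api.github.com/repos/{owner}/{repo}/releases"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Live audit (uncomment to hit the API):
    # print(asset_download_counts(fetch_releases("chapel-lang", "chapel")))

    # Hypothetical sample payload, just to show the expected shape:
    sample = [{"tag_name": "1.24.0",
               "assets": [{"name": "chapel-1.24.0.tar.gz",
                           "download_count": 100}]}]
    print(asset_download_counts(sample))
```

If the API totals don't match the graphed totals for a given release, that would point at the scripting rather than at the upstream data. DockerHub exposes a `pull_count` per repository rather than per release, which is coarser, so a day-over-day diff of that counter would be the analogous check there.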