Audit Log beyond 3-4 months of past data

We recently moved over 70 of our GitHub Teams organizations into our Enterprise, so those orgs are now on GitHub Enterprise Cloud, which enables access to the audit log APIs.

For archival and historical purposes, I would love to kick off a slow background job to backfill the roughly 10 years of data we have on these organizations, in case we ever need it, without having to parse and process the exported audit log data.

Is it possible that, in the future, there may be a heavily rate-limited option to get data beyond the 3-4 month window the docs describe for GraphQL and the audit log? Or should I just start parsing the massive JSON/CSV files that are exported instead?


Hi @jeffwilcox :wave:

Even with Enterprise, the 90-day limit does apply, but you can use the `created` qualifier in the UI.
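For example, narrowing the audit log search in the UI to a specific window might look something like this (the exact qualifier values here are illustrative, assuming GitHub's standard date-range search syntax):

```
created:2021-01-01..2021-03-31 action:repo.create
```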

Exporting the audit logs should fetch logs since the Org was created, but we can’t fully guarantee the completeness of the log data beyond 90 days.

As it stands, the export process is the only real way to get what you're after.
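If you do go the export route, the post-processing doesn't have to be painful. Here's a minimal sketch of summarizing a JSON export, assuming the file is a JSON array of entry objects with an `action` string and an `@timestamp` in epoch milliseconds; adjust the field names to whatever your export actually contains:

```python
import json
from collections import Counter
from datetime import datetime, timezone

def summarize_audit_export(path):
    """Count events per action and report the time range an export covers.

    Assumes a JSON array of entries, each with an `action` string and an
    `@timestamp` in epoch milliseconds (an assumption about the export
    shape -- check your own files and adjust).
    """
    with open(path) as f:
        entries = json.load(f)

    # Tally how many times each audit action appears in the export.
    actions = Counter(e["action"] for e in entries)

    # Convert millisecond timestamps to UTC datetimes to find the range.
    stamps = [e["@timestamp"] / 1000 for e in entries if "@timestamp" in e]
    first = datetime.fromtimestamp(min(stamps), tz=timezone.utc)
    last = datetime.fromtimestamp(max(stamps), tz=timezone.utc)
    return actions, first, last
```

From there it's easy to shard the summaries per org and archive them alongside the raw exports.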

The idea you're suggesting, a heavily rate-limited endpoint for gathering historic data, would be amazing! If you're willing, would you mind submitting it via our feedback form?

That goes directly to our Product Org for review and this is something that’s definitely been requested before. The more folks asking for it, the better!