Currently, the biggest complaint Google’s PageSpeed has about my pages is that the cache TTL for static assets is ten minutes. Rightly so! Given that sites hosted on GitHub Pages are by design static, ten minutes seems low to begin with; for static assets, it’s even worse.
The community seems to suggest using CDNs for such assets. For a small, personal site I would like to avoid that, and I’m wary of adding third-party dependencies on principle.
I’m not sure whether it is possible, in principle, to invalidate browser caches only when the page has actually been redeployed.
As a best-effort workaround, it would be great to be able to mark certain folders – such as those holding images, stylesheets, etc. – for a larger TTL. I would imagine adding a simple text file to such directories which either acts as a flag, or maybe even provides a way to tune the TTL to the use case at hand.
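To illustrate, here is a sketch of what I have in mind. The file name and format are purely hypothetical, not an existing GitHub Pages feature: a small text file dropped into an asset directory, naming the desired TTL in seconds.

```
# Hypothetical file: assets/images/.cache-ttl
# Requests a one-week cache lifetime for everything in this directory.
max-age=604800
```

The server would then respond for files in that directory with a correspondingly longer header, e.g. `Cache-Control: public, max-age=604800` instead of the current `max-age=600`.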
Please consider adding such a feature.