Rongchai Wang
Jan 17, 2026 09:16
GitHub introduces rate limiting for Actions cache entries at 200 uploads per minute per repository, addressing system stability concerns from high-volume uploads.
GitHub has rolled out a new rate limit on its Actions cache system, capping uploads at 200 new cache entries per minute per repository. The change, announced January 16, 2026, targets repositories that were hammering the cache system with rapid-fire uploads and causing stability problems across the platform.
Downloads stay unaffected. In case your workflows pull present cache entries, nothing adjustments. The restrict particularly targets the creation of recent entries—a distinction that issues for groups operating parallel builds that generate recent cache knowledge.
Why now? GitHub cited "cache thrash" as the culprit. Repositories uploading large volumes of cache entries in short bursts were degrading performance for everyone else on the shared infrastructure. The 200-per-minute cap gives heavy users enough headroom for legitimate use cases while stopping the kind of abuse that was destabilizing the system.
Part of a Broader Actions Overhaul
This rate limit arrives amid several significant changes to GitHub Actions economics. Earlier this month, GitHub reduced pricing on hosted runners by 15% to 39% depending on size. But the bigger news hits March 1, 2026, when self-hosted runner usage in private repos starts costing $0.002 per minute, a new charge that is pushing some teams to rethink their CI/CD architecture entirely.
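To put the new self-hosted runner charge in concrete terms, here is a quick back-of-the-envelope calculation. The $0.002-per-minute rate is from the announcement above; the monthly usage figures are hypothetical examples, not GitHub data.

```python
# Rough monthly cost of the self-hosted runner charge for private repos
# ($0.002/minute, effective March 1, 2026). Usage figures are illustrative.

RATE_PER_MINUTE = 0.002  # USD, per the announced pricing


def monthly_cost(minutes_per_month: int) -> float:
    """Return the USD cost for a month of self-hosted runner minutes."""
    return minutes_per_month * RATE_PER_MINUTE


for minutes in (10_000, 100_000, 1_000_000):
    print(f"{minutes:>9,} min/month -> ${monthly_cost(minutes):,.2f}")
```

A team burning 100,000 self-hosted minutes a month would see roughly $200 in new charges, which is the scale at which rethinking CI/CD architecture starts to make sense.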
The cache system itself got an upgrade in late 2025, with repositories now able to exceed the previous 10 GB limit through pay-as-you-go pricing. Every repo still gets 10 GB free, but heavy users can now buy more rather than constantly fighting eviction policies.
What Teams Should Check
Most workflows won't notice this limit. But if you're running matrix builds that generate unique cache keys across dozens of parallel jobs, do the math. A 50-job matrix completing concurrently could theoretically hit 200 cache uploads in under a minute if each job creates multiple entries.
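The math above is simple enough to sketch. The job count, entries per job, and completion window below are illustrative assumptions, not measurements of any real workflow.

```python
# Estimate peak cache uploads per minute for a parallel matrix build.
# All parameter values here are hypothetical for illustration.

def peak_uploads_per_minute(jobs: int, entries_per_job: int,
                            completion_window_min: float) -> float:
    """Uploads per minute if every job finishes within the given window."""
    return jobs * entries_per_job / completion_window_min


# 50 parallel jobs, 4 cache entries each, all finishing inside one minute:
rate = peak_uploads_per_minute(jobs=50, entries_per_job=4,
                               completion_window_min=1.0)
print(f"{rate:.0f} uploads/min")  # exactly at the new 200/min cap
```

At four entries per job, a 50-job matrix sits exactly at the cap; a fifth entry per job, or a tighter completion window, would push it over.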
The fix is straightforward: consolidate cache keys where possible, or stagger job completion if you're genuinely bumping against the ceiling. GitHub hasn't announced any monitoring dashboard for cache upload rates, so teams concerned about hitting limits will need to audit their workflow logs manually.
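A manual audit can be as simple as bucketing cache-save log lines by minute. A minimal sketch, assuming you have downloaded workflow logs with GitHub's ISO 8601 timestamp prefix; the "Cache saved" match is an assumption about what your cache step prints, so adjust the pattern to your own logs.

```python
# Sketch of a manual audit: count cache-save events per minute in
# downloaded workflow logs. The "Cache saved" phrase is an assumed
# marker; substitute whatever your cache step actually logs.
import re
from collections import Counter

SAVE_PATTERN = re.compile(
    r"^(?P<minute>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}):\d{2}\.\d+Z\s+.*Cache saved"
)


def uploads_per_minute(log_text: str) -> Counter:
    """Bucket cache-save log lines by minute (YYYY-MM-DDTHH:MM)."""
    counts: Counter = Counter()
    for line in log_text.splitlines():
        match = SAVE_PATTERN.match(line)
        if match:
            counts[match.group("minute")] += 1
    return counts


sample = """\
2026-01-16T12:00:01.100Z Cache saved successfully
2026-01-16T12:00:30.500Z Cache saved successfully
2026-01-16T12:01:05.000Z Cache saved successfully
"""
print(uploads_per_minute(sample))
```

Any minute bucket approaching 200 is a candidate for key consolidation or staggered completion.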
Image source: Shutterstock