What is compression?
Compression is the process of taking a file and reducing its size. There are many methods through which this is accomplished, but the method utilized by gzip takes advantage of the fact that many files have long sequences of data that are exactly the same. For example, a data file might contain 8 bytes in a row of the binary value "1" (00000001). This sequence of bytes can be represented symbolically in one byte with the value "8" (00001000), thus reducing the size of that sequence by a factor of 8.
The algorithms used by gzip and other compression utilities are far more complex than explained above and are able to utilize numerous other statistical quirks to achieve even greater compression levels.
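As a rough illustration of the idea above, the following Python sketch (not part of any StackPath tooling) compresses a highly repetitive buffer with the standard library's gzip module and prints the before and after sizes:

    import gzip

    # 10,000 identical bytes -- an extreme example of the repetition gzip exploits
    data = b"A" * 10_000
    compressed = gzip.compress(data)

    print(len(data))        # 10000
    print(len(compressed))  # only a few dozen bytes, because the input is so repetitive

Real files are far less uniform than this, so typical savings are smaller, but text-heavy assets such as HTML, CSS, and JavaScript still compress very well.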
What Gzip level does StackPath support?
Currently, gzip supports compression up to level 9, with the default being level 6. StackPath supports compression levels 1-6 because higher levels offer significantly diminishing returns: the extra CPU time required yields only marginally smaller files.
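To get a feel for how the level setting trades CPU time for output size, here is a small, self-contained Python sketch; the synthetic payload and the exact numbers it prints are illustrative only:

    import gzip
    import time

    # Synthetic, text-like payload; real assets (HTML, CSS, JS) behave similarly.
    sample = b"<div class='item'>Hello, world!</div>\n" * 5_000

    for level in (1, 6, 9):
        start = time.perf_counter()
        out = gzip.compress(sample, compresslevel=level)
        elapsed = (time.perf_counter() - start) * 1000
        print(f"level {level}: {len(out)} bytes in {elapsed:.2f} ms")

Running something like this on your own content is an easy way to see how quickly the size savings flatten out as the level increases.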
How is Gzip used by the CDN?
Modern browsers are capable of fetching files in a compressed format and decompressing them on the fly, which reduces the amount of data the end user must fetch from the server when loading a web page. This can lead to significant performance gains in site load time.
The Stackpath CDN is capable of fetching uncompressed assets from the origin, compressing them in cache, and serving them in compressed gzip format to browsers that advertise they are capable of handling compressed assets. By default, the StackPath CDN uses gzip compression level 1, but this can be adjusted in the control portal to any level from 1 through 6.
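Browsers advertise gzip support with the Accept-Encoding request header, and a compressed response is marked with the Content-Encoding: gzip response header. The sketch below mimics that negotiation from Python; the URL is a placeholder, so substitute an asset on your own CDN site:

    import urllib.request

    # "https://www.example.com/" is a placeholder; use an asset on your CDN site.
    req = urllib.request.Request(
        "https://www.example.com/",
        headers={"Accept-Encoding": "gzip"},  # advertise gzip support, as a browser does
    )
    with urllib.request.urlopen(req) as resp:
        # "gzip" here means the server or edge returned a compressed copy.
        print(resp.headers.get("Content-Encoding"))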
Update the StackPath gzip compression setting by logging into the StackPath control portal, selecting a CDN site, and opening the Cache Settings section.
By default, compression will be applied to all assets ingested by the CDN for a given site, but it can be limited to specific assets by file extension and/or MIME type.
Once compression has been enabled on a site, all assets matching the policy will be compressed in cache after they are ingested again from the origin.
Assets already in cache when the policy is enabled will not be compressed, so it is important to purge the host after the policy is applied to ensure that the assets are compressed.
Another important note is that if an asset needs to be compressed but is not already in cache, the CDN will fetch the asset from the origin, proxy it to the user immediately in uncompressed format, and queue the asset for compression in the background. For this reason, the first few requests for an asset may not be served in gzip format until this background compression has completed.
What if files are Gzipped on my Origin?
Assets that are already in gzip format (or any other compressed format) on the origin will be treated as normal assets and served straight to the end user without any in-cache processing.
When enabled, are all files Gzipped instantly?
No. For files not yet in cache, the CDN will continue to serve an asset uncompressed for the first few requests after it is ingested from the origin, until the background gzip process has completed. Files already in cache before the compression policy was enabled will not be compressed until they are ingested from the origin again (purging the file(s) is a great way to accomplish this).
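If you want to confirm when an asset starts being served compressed, a simple check like the following can help. This is an illustrative script rather than a StackPath utility, and the URL is a hypothetical placeholder:

    import time
    import urllib.request

    URL = "https://cdn.example.com/styles.css"  # hypothetical CDN asset URL

    for attempt in range(5):
        req = urllib.request.Request(URL, headers={"Accept-Encoding": "gzip"})
        with urllib.request.urlopen(req) as resp:
            encoding = resp.headers.get("Content-Encoding", "identity")
        print(f"attempt {attempt + 1}: Content-Encoding = {encoding}")
        if encoding == "gzip":
            break
        time.sleep(2)  # give the background compression a moment to finish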
If you'd like to read more about gzip compression, we recommend this article: https://www.rootusers.com/gzip-vs-bzip2-vs-xz-performance-comparison/