Legacy GZip files before transfer to a CDN

Status
Not open for further replies.

furnival

Customer
I ran some tests at webpagespeedtest.org. They showed that most of my JavaScript and CSS files were GZipped before I started using the CDN feature of VBOptimise Pro. Now my files are being served from Amazon S3 without GZip compression, so they load slower than they did before! Could you incorporate a way to GZip compressible files during the synchronisation process with the CDN, please?
 
Gzip is on-the-fly compression/decompression supported at the server level. Gzipping files before upload would change their extension to .gz, and they would not function the way you expect.

In other words, this is something that needs to be supported by the server you're hosting files on.
 
As you support Amazon S3 as an option, I think you should incorporate a gzipping step into the file-synchronisation part of VBOptimise's CDN feature. Otherwise I am going to have to find some way to GZip the files myself and add a metadata header telling Amazon S3 that they are gzipped.
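A rough sketch of the workflow the customer describes, using Python's standard gzip module for illustration (the S3 upload call itself is omitted; the `Content-Encoding` entry is the metadata header in question, and the CSS string is an invented example):

```python
import gzip

# Hypothetical stylesheet contents to be pre-compressed before upload.
css = b"body { margin: 0; color: #333; }"

# Compress the file contents in memory; the object keeps its .css name.
compressed = gzip.compress(css)

# Metadata to send alongside the upload so the object is served correctly:
# keep the original MIME type and declare the compression separately.
headers = {
    "Content-Type": "text/css",
    "Content-Encoding": "gzip",
}

# Sanity check: the compressed bytes round-trip to the original.
assert gzip.decompress(compressed) == css
```

With both headers set, a browser receiving the object decompresses it transparently and still treats it as CSS, which is why the extension and MIME type never need to change to .gz.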
 
I'm not sure how Amazon S3 support relates to whether their servers gzip content before serving it. Could you elaborate, please?
 
As per the first link, this is not something that can be done programmatically. It requires command-line access, which, if you allow it for PHP scripts, is a gigantic security issue.
 
The difference is that .gz files can't be served as CSS or JS; they have their own MIME type.
 
OK, I think I see what you mean. However, PHP can also gzip strings, so I believe it could open .js and .css files, gzip the entire text of each file as one long string, and save the result back under the original filename.
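The "open, gzip the contents as one string, save it back" idea above can be sketched as follows, in Python for illustration (the thread discusses PHP, whose gzencode() works the same way on an in-memory string; the file name and contents are invented):

```python
import gzip
import os
import tempfile

# Hypothetical script file to compress in place.
src = os.path.join(tempfile.mkdtemp(), "app.js")
with open(src, "wb") as f:
    f.write(b"function hello() { return 'hi'; }")

# Read the whole file as one string of bytes.
with open(src, "rb") as f:
    data = f.read()

# Overwrite the file with its gzipped contents; the .js name is unchanged.
with open(src, "wb") as f:
    f.write(gzip.compress(data))

# Verify the stored bytes decompress back to the original source.
with open(src, "rb") as f:
    assert gzip.decompress(f.read()) == data
```

This only handles the compression half; as the replies note, the server still has to tell browsers that the bytes are gzipped, or the file will not be interpreted as its original content type.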
 
That would not help browsers interpret those files as their original content type.
 