From: Joey Hess
Date: Mon, 19 May 2008 03:05:09 +0000 (-0400)
Subject: web commit by http://harningt.eharning.us/: Added potential workitem regarding Amazon...
X-Git-Url: https://scripts.mit.edu/gitweb/www/ikiwiki.git/commitdiff_plain/13d21938b202b344e86a455b64aee01ae8a18ed3

web commit by http://harningt.eharning.us/: Added potential workitem regarding Amazon S3 and other items WRT static-like-hosting
---

diff --git a/doc/todo/for_amazon_s3_pre-gzip-encode_safe_files.mdwn b/doc/todo/for_amazon_s3_pre-gzip-encode_safe_files.mdwn
new file mode 100644
index 000000000..c0c0c12ab
--- /dev/null
+++ b/doc/todo/for_amazon_s3_pre-gzip-encode_safe_files.mdwn
@@ -0,0 +1,17 @@
+Regarding the [[Amazon_S3_Plugin|plugins/amazon_s3]]:
+
+Amazon S3 doesn't seem to support automatically GZIP-encoding content (such as HTML, JavaScript, and CSS) that a full-capability webserver might compress. (I'll also note that NearlyFreeSpeech.NET doesn't support compressing outgoing files on the fly.) However, Amazon S3 does support setting some response headers, such as Transfer-Encoding and the like.
+
+One possibility for decreasing bandwidth costs/download sizes would be to GZIP all content on the site and set the necessary header; however, there are certain browser-compatibility issues to navigate.
+
+Another potentially useful side item would be a config option to create a mapping of files that can be gzipped under an alternate name...
+
+For example:
+
+    gzipped_files => {
+        js => "js.gz"
+    }
+
+This would take all js files and gzip them with the altered extension. *This* could allow using JavaScript to customize which other JS/CSS code gets loaded, based on browser-detection JS code.
+
+--[[harningt]]
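The `gzipped_files` mapping proposed in the diff could be applied as a small post-render step over ikiwiki's output directory. A minimal sketch in Python (ikiwiki itself is Perl; the function name and the fixed mapping here are illustrative assumptions, not part of the plugin):

```python
import gzip
import shutil
from pathlib import Path

# Hypothetical mapping mirroring the proposed gzipped_files config:
# every file ending in "js" gets a gzipped copy ending in "js.gz".
gzipped_files = {"js": "js.gz"}

def gzip_site(destdir):
    """After the site is rendered into destdir, write a pre-gzipped
    copy of every file whose extension appears in gzipped_files,
    leaving the uncompressed original in place as a fallback."""
    for src_ext, dst_ext in gzipped_files.items():
        for src in Path(destdir).rglob(f"*.{src_ext}"):
            # "app.js" -> "app.js.gz" (swap the extension suffix)
            dst = src.with_name(src.name[: -len(src_ext)] + dst_ext)
            with open(src, "rb") as fin, gzip.open(dst, "wb") as fout:
                shutil.copyfileobj(fin, fout)
```

The gzipped copies would then be uploaded to S3 alongside the originals, so browser-detection JS could pick whichever variant the client can handle.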