Google Pagespeed Image Optimization Updated 2017

2017 – Google updated its criteria for optimizing images for Google Pagespeed Insights

… And didn’t tell anyone!

I noticed several people on other sites who thought, like me, that they were either going crazy or that Google Pagespeed Insights was a bit broken. Had Google started using ‘caches’ of images it had previously fetched from our pages? Surely not – that would mean any image optimizations we had made in the meantime would go unnoticed! I messed around with .htaccess settings to tell Google to fetch the images afresh, tried different browsers, and even used a VPS so the Google server doing the fetching would be in a different part of the world, but all to no avail. JPEGs and PNGs compressed with the highest lossless settings were still being flagged as able to lose another 10-20% – WTF?

But we all initially missed the cause of this ‘problem’: Google had quietly switched to recommending lossy compression by default. What I hadn’t noticed was that they had dropped the word ‘losslessly’ from their results page when stating how the images could be compressed.

So sometime in early 2017 (or perhaps late 2016 – the change may have been phased in), Google moved the goalposts and a few of us doubted our sanity.

So how ‘lossy’ does Google now recommend?

If you find a definitive answer, please let me know!

They didn’t exactly announce their change of criteria, nor have they said how much you need to compress images to still achieve the maximum score.

I did some experimentation and the answer is far from obvious. I tried 85% quality on one image, as that figure is mentioned somewhere on Google’s mod_pagespeed site, and while that was sufficient for one JPEG image, for another I had to reduce it to 64% quality before Google stopped complaining about it!
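If you want to find the sweet spot for a particular image yourself, a quick-and-dirty approach is to recompress copies of it at a few quality levels, compare the file sizes (and how they look), and re-test the page with Pagespeed Insights. A rough sketch, with photo.jpg as a placeholder filename:-

# photo.jpg is a placeholder; substitute one of your own images
for q in 85 80 75 70 65; do
  cp photo.jpg "photo-q$q.jpg"
  jpegoptim -m $q --strip-all "photo-q$q.jpg"
done
# list the copies, largest first, to compare sizes
ls -lS photo-q*.jpg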

I have one theory about why they are doing this (more on that later), but as far as the lossiness percentage is concerned, it’s fair to say that it’s an arbitrary figure they have come up with; it may depend on the size of the image, the number of colours used, and even advanced image analysis techniques including AI, but I’ll stop there, as down that route lies madness.

Revisiting compressing images with ‘lossy’ enabled

So here were my recommendations for reducing JPEGs losslessly (cd into the directory containing the files and run the following commands):-

jpegoptim --strip-all *.jpg

That optimizes all the JPEG files using its best normal compression method. You can then also run:-

jpegoptim --strip-all --all-progressive *.jpg

But let’s add the lossy parameter to the mix:-

jpegoptim -m 85 --strip-all *.jpg

and again as larger images sometimes give better results with the progressive option:-

jpegoptim -m 85 --strip-all --all-progressive *.jpg

That goes through them all again, this time trying the progressive method of compression to see if it is more efficient for each image; if it is, the file is recompressed with that method, and if not, it is left as normal.
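The commands above only touch the current directory. If your images are spread across subdirectories, a find-based variant saves cd-ing into each one; this is just a sketch, with the path as a placeholder:-

# /path/to/images is a placeholder for your own image directory
find /path/to/images -iname "*.jpg" -exec jpegoptim -m 85 --strip-all --all-progressive {} +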

Then for PNG files: optipng is designed to give lossless compression, so I just repeat the default settings from before:-

optipng *.png

or to tweak every last byte (be warned, this may take ages!):-

optipng -o7 *.png

The second command takes much longer and is very processor intensive, but it can sometimes yield an extra few % of compression on larger images, for those who are fanatical about such things.

But if Google complains about your PNGs after this, you may need to consider a lossy PNG optimiser. For this purpose, I tried pngquant, which works by rendering the image down to a palette of 256 colours or fewer and can produce some huge file size savings. Unfortunately, with a 256-colour palette, some images show noticeable degradation, but not as much as you might expect.
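For reference, a basic run with the settings used in the results below looks like this. Note that pngquant writes new files alongside the originals by default; the --ext/--force options here are my own addition to overwrite in place, so only run this on copies:-

# --ext .png --force overwrites the originals in place, so work on copies, not your masters
pngquant --quality 85 --speed 1 --ext .png --force *.png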


Results based on images supplied with blacknoir theme described in Xara article.

142 PNGs:

Original                            515.0 KB
WEBP                                360.0 KB
optipng                             296.4 KB
optipng -o7                         295.7 KB
pngquant --quality 100 --speed 1    210 KB
pngquant --quality 85 --speed 1     98.9 KB

79 JPEGs:

Original                            2.1 MB
jpegoptim                           2.1 MB (2% smaller)
<some results missing>

Sample code for batch conversion of files

find . -name "*.png" | parallel --eta pngquant -v --quality 100 --speed 1 {}
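The same pattern (my own addition, reusing the jpegoptim settings from earlier) works for the JPEGs:-

find . -name "*.jpg" | parallel --eta jpegoptim -m 85 --strip-all --all-progressive {}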


If the files you have just optimized are local copies rather than the ones on your server, you need to upload the optimized versions to the server now.
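One way to do that is with rsync; this is only a sketch, and the hostname and paths are placeholders for your own setup:-

# user@example.com and both paths are placeholders; -a preserves file attributes, -z compresses in transit
rsync -avz ./images/ user@example.com:/var/www/example.com/images/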

So back to my theory

Is this all just a ploy for Google to push its WEBP image format? With an average of 20%+ savings on image sizes, plus the convenience of animation and transparency support, it seems ideal to finally have one image format that combines all the best bits of PNG, JPEG and GIF in a single file type (albeit as different variants within that file type).

While it is well implemented in Chrome, Opera and the Android browsers, Firefox and Apple’s browsers have dug their heels in and are reluctant to implement it, even though, in Firefox’s case, the code is already written and in place; it just needs enabling!


One step further – WEBP

Why is it that when you come to fix a problem, you end up wanting to improve the situation rather than just fix it? Well, somewhere around 70% of browser page visits now support WEBP, so the savings to be made by delivering images in WEBP format could be substantial.

Say a webpage sends 1MB of image data. If those images could be compressed into WEBP format with an average saving of 20% (a typical figure compared to minimally optimized PNGs and JPEGs), that would translate into an average 14% saving on bandwidth and image loading time across all visitors (70% of visits × a 20% saving). If the saving were 50% (probably more typical compared to unoptimized PNGs and JPEGs), that would translate into a 35% saving. And as more browsers eventually add support, those figures will only rise!

The downsides of serving WEBP images?

Well, WEBP, like any other image format, is not really suitable for creating on the fly (the whole idea is to free up server resources), so we need pre-compressed WEBP versions of all our images. Allow somewhere between 50% and 80% on top of your existing image storage for these. That’s not a problem for most sites, so then we need a mechanism to decide whether the browser visiting our page supports WEBP or not. Luckily, that is no longer a problem, as browsers that support WEBP now say so in the Accept header they send to your server when requesting the page.
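Generating those pre-compressed versions is straightforward if you have the cwebp encoder from Google’s libwebp tools installed. Here is a rough sketch that creates a .webp file alongside each JPEG; the quality figure of 80 is just my assumption, so tune it to taste:-

# cwebp is part of Google's libwebp tools; {.} is GNU parallel's "input filename without extension"
find . -name "*.jpg" | parallel --eta cwebp -q 80 {} -o {.}.webp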

I am currently writing my own WEBP-serving scripts, which should be compatible with most websites, but in the meantime some popular CDN networks are now transparently offering WEBP support, so that might be an easier solution.
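If you want to check whether your server or CDN is already doing this, a quick test is to request the same image with and without image/webp in the Accept header and compare the Content-Type that comes back; the URL below is just a placeholder:-

# example.com is a placeholder; a WEBP-aware server or CDN should return image/webp for the second request
curl -sI https://example.com/images/photo.jpg | grep -i '^content-type'
curl -sI -H 'Accept: image/webp' https://example.com/images/photo.jpg | grep -i '^content-type'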