
There’s a tool that can cut your image filesizes in HALF without meaningfully degrading their appearance. But hardly anyone seems to be using it.

TL;DR: TRY IMAGEOPTIM!

[Disclaimer: I had intended this to be a lengthier and more technically detailed post about image optimization. I saved a draft two months ago… then got too busy with other things. I’m finally publishing this abridged version on 2013-11-19.]

I’ve spent a few years paying close attention to web performance optimization best practices as they evolve. Some techniques are counterintuitive, complex or subtle. But despite the ongoing, rapid changes in front-end engineering, the most obvious approach — no, simpler than that: just sending fewer bytes over the wire — remains one of the most effective ways to speed up load times. Wait, no em-dashes; let me put it plainly: the most obvious approach, simply sending fewer bytes over the wire, remains one of the most effective ways to speed up load times. Looking at a typical web page, images account for the largest share of the total bytecount:

Before (20,803 bytes)

So it makes sense to try harder to shave bytes off our image assets. If you are relying on Photoshop’s “Save for Web” alone, you’re missing a huge opportunity to make your website load faster. That hurts the UX, and in the enterprise it hurts your bottom line too, through unnecessarily inflated storage and bandwidth costs. A tool like smush.it will reduce the bytecount by another 5 or 10%, which is a good start and certainly helps. But I recently stumbled upon a tool that is radically better at trimming the fat. Enter ImageOptim. Here is the same image you see above, after being run through ImageOptim:

After (14,298 bytes)

The second image is about 31% smaller than the first, but they look identical. Under the hood, ImageOptim chains together several image compression utilities, going well beyond stripping metadata and color profiles. In some cases what it does is not technically lossless… yet the images look the same, even to keen designers’ eyes on huge high-res, high-DPI monitors.

At my workplace, running ImageOptim against tens of thousands of images reduced filesizes by about 56%! That is just unbelievable. I shared these exciting results with my brother (who is in a fairly senior position at Facebook), and he told me it started a discussion leading to changes in FB’s asset pipeline, with anticipated bandwidth cost savings in the millions of dollars per month. This is a big deal.

Some of the underlying utilities (like pngquant) have improved by leaps and bounds in recent months, but so far that has gone unnoticed by the community at large. Webperf professionals work hard to shave 3–5% off rendering times… yet here’s a tool that makes gains going way, way beyond that. Cutting the filesize of all your images in half is a game-changer.
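If you want to experiment with one of those underlying utilities directly, here’s a minimal sketch that batch-runs pngquant over a folder of PNGs and tallies the savings. It assumes pngquant is installed and on your PATH; the optimize_dir function and the asset path are hypothetical names of my own, and you should check the flags against your installed version’s documentation.

    # Rough sketch: batch-run pngquant (one of the utilities ImageOptim wraps)
    # over a directory of PNGs and report the total savings.
    # Assumes pngquant is on your PATH; names and paths are illustrative.
    import subprocess
    from pathlib import Path

    def optimize_dir(src):
        total_before = total_after = 0
        for png in Path(src).rglob("*.png"):
            before = png.stat().st_size
            out = png.with_suffix(".min.png")
            # --quality=65-80 allows lossy palette quantization in that range;
            # pngquant exits non-zero if it can't meet the quality floor.
            result = subprocess.run(
                ["pngquant", "--quality=65-80", "--force",
                 "--output", str(out), str(png)])
            if result.returncode != 0 or not out.exists():
                continue  # skipped: quality floor not reachable for this file
            total_before += before
            total_after += out.stat().st_size
        if total_before:
            saved = 100 * (1 - total_after / total_before)
            print(f"Total savings: {saved:.1f}%")

    optimize_dir("assets/images")  # hypothetical path

ImageOptim itself runs this kind of utility (plus JPEG and GIF optimizers) for you behind a drag-and-drop GUI, which is why it’s such an easy win.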

Note, the image I used here came from httparchive.org, a fantastic resource for understanding web technology trends. It was created by Steve Souders, one of the world’s best-known web performance experts. Yet even that site’s assets are under-optimized. It seems to me there’s a huge opportunity here, one that’s being missed by the vast majority of the webdev community. I’m not sure why that is.

Finally, here’s a detailed breakdown of various tools, showing ImageOptim to be a pretty clear winner in most cases:
http://jamiemason.github.io/ImageOptim-CLI/
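Whichever tool wins on your corpus, the comparison ultimately boils down to total bytes before versus after. Here’s a tiny sketch for measuring that on your own assets; the directory names are hypothetical, so point them at an original and an optimized copy of your images:

    # Compare the total size of an original vs. an optimized copy of a directory.
    # Directory names are hypothetical; point them at your own assets.
    from pathlib import Path

    def dir_size(path):
        return sum(p.stat().st_size for p in Path(path).rglob("*") if p.is_file())

    before = dir_size("assets/original")
    after = dir_size("assets/optimized")
    print(f"{before:,} -> {after:,} bytes ({100 * (1 - after / before):.1f}% smaller)")

Run against the before/after example above (20,803 → 14,298 bytes), this works out to the roughly 31% figure quoted earlier.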

Happy optimizing! 🙂
