Fast websites are important: page speed affects user experience, search rankings, and ultimately your business. Here’s how to install jpegoptim on Ubuntu (and CentOS) to losslessly compress your JPEGs and improve your PageSpeed score.

Step 1: Installing jpegoptim in Ubuntu

First things first: before we can optimize our images, let’s install the tool jpegoptim. Google recommends this tool in its PageSpeed documentation, and I believe they may use it on their own sites.

sudo apt-get install jpegoptim

Step 1 (Alternative): Installing jpegoptim in CentOS (Skip If Not Applicable)

If you are like me, you run Ubuntu at home and CentOS at work or on your servers. So let’s make sure we also install this on CentOS.

We will compile it from the source tarball. Make sure to check for the latest version here: http://www.kokkonen.net/tjko/src/

cd /path/to/scripts

Download the tar archive:

wget http://www.kokkonen.net/tjko/src/jpegoptim-1.4.1.tar.gz

Let’s extract this stuff:

tar -xzf jpeg*

Go into the folder:

cd ./jpeg*

Let’s compile this:

./configure
make 
make install

You should get a response similar to this:
[Screenshot: compiling jpegoptim on CentOS]

Let’s quickly test that it works:

jpegoptim

You should get back “file arguments missing” or a similar message, depending on the version.

Step 2: Let’s Get This Done And Optimize Images Losslessly

Now the easy part. All we have to do is cd into the directory containing our images and run a single command.

cd /path/to/images/

The command below compresses all jpg images stored under /my/path LOSSLESSLY, which means image quality will not suffer (note that --strip-all removes metadata such as EXIF tags). In any case, it is a good idea to make a backup of your original image folder before you execute this (just in case).

cp -R /images/ /imagesLL/
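As a sketch, the backup step can also verify itself before you touch anything. The paths here are placeholders (the demo uses a temporary folder); in practice, point SRC at your real image directory:

```shell
#!/bin/sh
# Sketch: back up the image folder before running jpegoptim.
# For this demo we create a temporary folder; in practice set SRC to
# your real image directory (e.g. /path/to/images).
SRC=$(mktemp -d)
echo "fake jpeg data" > "$SRC/photo.jpg"

DEST="${SRC}LL"            # e.g. /path/to/imagesLL
cp -R "$SRC" "$DEST"

# Verify the backup matches the original before touching anything.
if diff -r "$SRC" "$DEST" >/dev/null; then
    echo "backup verified"
else
    echo "backup differs - do not proceed" >&2
    exit 1
fi
```

The diff -r check is cheap insurance: if the copy was interrupted or the disk filled up, you find out now rather than after the originals have been overwritten.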

Then run the lossless compression:

find /my/path -name '*.jpg' -type f -print0 | xargs -0 jpegoptim -o --strip-all
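If you want to see what the command will touch before running it, a small sketch like this (the directory is a placeholder, populated here with dummy files) counts the matching files first, using the exact same find selection:

```shell
#!/bin/sh
# Sketch: dry-run the file selection before optimizing.
# IMG_DIR is a placeholder - point it at your real image folder.
IMG_DIR=$(mktemp -d)                     # demo folder for this sketch
touch "$IMG_DIR/a.jpg" "$IMG_DIR/b.jpg"
mkdir -p "$IMG_DIR/sub"
touch "$IMG_DIR/sub/c.jpg" "$IMG_DIR/notes.txt"

# Same selection as the optimization command: *.jpg files, recursively.
COUNT=$(find "$IMG_DIR" -name '*.jpg' -type f | wc -l | tr -d ' ')
echo "would optimize $COUNT files"

# In production you would then run (assuming jpegoptim is installed):
#   find "$IMG_DIR" -name '*.jpg' -type f -print0 | xargs -0 jpegoptim -o --strip-all
```

Note that the .txt file is excluded and the subfolder is included, which is exactly what the real command will do.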

Note the quotes around the extension pattern; without them the shell expands the glob itself and you may get a “paths must precede expression” error from find.

Step 3: Let’s Upload Compressed Images to Our S3 Bucket / CDN

Lastly, grab a copy of s3cmd from s3cmd.com and execute this command:

s3cmd put --recursive /path/to/your/lossless/images s3://bucket/lossless/

Done. Rename your original image folder to BAK (backup) and rename your newly created folder to the original folder name.
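The swap itself can be sketched like this (the base path and folder names are placeholders; the demo uses throwaway folders). mv keeps the originals around as a .bak folder in case anything went wrong:

```shell
#!/bin/sh
# Sketch: swap the optimized folder into place, keeping a backup.
# For this demo we create throwaway folders; in practice BASE would be
# something like your web root, with folders "images" and "imagesLL".
BASE=$(mktemp -d)
mkdir -p "$BASE/images" "$BASE/imagesLL"
echo original  > "$BASE/images/pic.jpg"
echo optimized > "$BASE/imagesLL/pic.jpg"

mv "$BASE/images"   "$BASE/images.bak"   # keep the originals as a backup
mv "$BASE/imagesLL" "$BASE/images"       # optimized set takes their place

echo "swap complete"
```

Using two mv commands keeps the window where no images folder exists very short, since a rename on the same filesystem is nearly instantaneous.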

Step 4: Let’s Set Up a Cronjob

One last step. If you want to optimize your images on a schedule and you use a push-zone, you need to create several cronjobs:

1. Download the images from the CDN
2. Optimize them
3. Re-upload them

Here’s the alternative:

Create a pull-zone that will fetch files from your server and serve them from your CDN. I recommend MaxCDN for this, because they are cheaper than Amazon AWS and have a very reliable pull-zone CDN.

What is the difference between a pull-zone and a push-zone? A pull-zone is a zone your CDN uses to fetch files automatically. You no longer have to upload them manually or via batch scripts; everything is handled by the CDN.

After you have created a pull-zone, you can then set up a cronjob on your server to handle the image optimization with jpegoptim as explained above.
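As a sketch, that cronjob could be a crontab entry like the one below (edit with crontab -e). The path is a placeholder, and the flags match the command used above:

```shell
# Sketch of a crontab entry: run the lossless optimization nightly at 3:00.
# /path/to/images is a placeholder - use your real image directory.
0 3 * * * find /path/to/images -name '*.jpg' -type f -print0 | xargs -0 jpegoptim -o --strip-all >/dev/null 2>&1
```

Since jpegoptim is lossless by default and skips files it cannot shrink, re-running it nightly on the same folder is harmless; newly uploaded images simply get picked up on the next run.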

Any questions? I am here to help you out. You can also post questions on our forums.