My thoughts on SEO URLs

I originally compiled this in response to a question about the merits and pitfalls of including directories and subdirectories in SEO URL paths in Opencart shopping cart software.

“You may get a lot of opinion on this but here’s my 2cents.
* realize search engines are striving to be as human-like as possible. Search engines may tweak their algorithms but this is the direction they are heading.
* any keywords in the URL are scored highly. Putting long paths in the URL dilutes this effect
* Too much duplication dilutes the effect
* Significant related keywords boost the scoring but again, too much duplication may dilute this.

So relating this to your question, I say keep it simple and relevant, don’t overuse keywords or try to use ‘tricks’. Concentrate on the specifics of your category for the category, and the specifics of the product for the product and let the search engines worry about how to arrange their results.
Having category/subcategory in path should be fine in general but if you wanted to fine tune and have complete control, I would say no folder paths and repeat parts of category where required.
So rather than audio/mp3players/ipod8gb
you could have no folders and
mp3-ipod8gb
If someone searches for ‘mp3 ipod 8gb’, this would give better product relevance than folder paths, and searches for ‘mp3 players’ would be more likely to locate your category.”

 

Optimizing a Xara Website For Google Pagespeed

 

About Google Pagespeed Insights and Xara

By default, websites generated by Xara Web Design or Xara Designer Pro software are not server optimized. Quite correctly, in order to be universally compatible, there is no attempt to enable compression or use specific .htaccess settings, for example. The images themselves are also not fully optimized (compressed) and include metadata which is unnecessary for display purposes. In general, Xara is optimized for speed, and squeezing every last bit of compression from an image is a time-consuming process, so again this is understandable. On the plus side, the mobile-ready themes are generally scored as mobile friendly by google, so there is little we need to do for that aspect. Since page speed is increasingly being taken into account in google ranking, it makes sense to optimize it as much as possible and of course to make the most efficient use of resources.

There are various elements to page speed according to google, plus there are other factors rated highly by other page speed analysis websites, but we are going to concentrate on google's criteria for obvious reasons!

Also, this article is based on a standard LAMP setup (Linux, Apache, MySQL, PHP) and a cPanel-based hosting account, although for other setups much can be reused.

Google Pagespeed Insights Ranking Criteria

For an overview of the aspects that google uses to rank your website page speed, please see the following article, but I cover most of it as I go through.

Overview of Google Pagespeed Insights Ranking Criteria

 

Optimizing a Xara Website for Google

So let's go! We are going to use default themes with no modifications for our testing, on a shared server with caching switched off. You will need ftp access to your website (and be familiar with its use) and have made a backup before starting! You will also need a plain text or code editor (notepad for example); don't use a word processor or rich text editor as it may generate additional hidden code.

Hopefully, the suggestions listed here will apply to most themes, but there may be specifics outside the scope of this article.

We are using the Google Pagespeed Insights website for the scores, although extra functionality can be achieved through API access.

I am doing two themes in parallel, Black Noir and Applab.

Here are the lousy default scores with no optimization. (I switched off some optimizations that are normally enabled by default on my hosting so as to better show the improvement of each stage but also because certain other optimizations can conflict with each other. Your initial results may be better or worse than these!).

BLACK NOIR – 44/100 mobile, 64/100 Desktop.

APPLAB – 38/100 mobile, 54/100 Desktop.

5 out of the 10 criteria are flagged up as problems, the 5 that pass at this stage are:-

Avoid Landing page redirects

Self-explanatory: you are simply visiting a page, so no redirects should occur. If you add a www. redirect or https in the future, ensure you update all incoming links and sitemaps to the full new URLs to avoid redirects as much as possible.
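If you want to check whether a given URL triggers a redirect, here is a quick command-line sketch (assuming curl is installed, and using example.com as a placeholder for your own domain):

# Show the HTTP status code and any redirect target for the landing page
curl -s -o /dev/null -w '%{http_code} %{redirect_url}\n' http://example.com/

A 200 with an empty redirect target means no landing page redirect; a 301/302 followed by a URL means an extra round trip happens before your page is served.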

Prioritize Visible Content

Fortunately on the themes I am going to use this is taken care of. Watch out in future for poorly designed themes that break this or other rules.

Reduce Server Response Time.

By default google expects a page to start producing output within 0.2 seconds. A good host should achieve this. As there are no database calls to be made and the pages generated by Xara are pre-generated html, this should never be a problem; if it is a problem on your hosting for a simple basic website, you might want to consider changing your hosting, as it will only become more likely to be an issue as you add more to your website.
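If you want to measure this yourself, here is a rough sketch of a time-to-first-byte check (assuming curl is available; substitute your own URL for example.com):

# Print the time in seconds until the server starts sending the response
curl -s -o /dev/null -w 'time to first byte: %{time_starttransfer}s\n' http://example.com/

Anything consistently well above 0.2 seconds for a static page suggests a slow host rather than a slow site.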

Minify HTML.

I was actually surprised that google didn't complain about this. Looking at the HTML code, it appeared at first glance that there are a lot of line breaks and it's spaced out quite well, although there is very little indentation. Inspecting it on GTmetrix, it reckons it can only be minified by 3%, so google was sensible enough not to complain about this. So well done Xara for keeping it readable yet optimized! In fact, this message usually gets negated when we enable compression anyway.

Minify Javascript.

Another well done Xara! It looks like the Javascript libraries that are included must already be minified. If other themes include libraries that are not minified, look at the Minify CSS section below as the same method can be used to rectify this.

The issues we need to try and fix.

So, working through the 5 issues that are flagged as a problem.

The top of the list as a priority to fix is Enable Compression, so let's add a few lines to .htaccess and see how it improves. As there is no .htaccess generated by Xara by default, you may need to create this file. If there is already one there, make a backup of it first and add this code towards the bottom. There are numerous variations of this code, but this one seems to allow for old browsers and, most importantly, exclude (already compressed) images. Source (http://stackoverflow.com/questions/2835818/how-do-i-enable-mod-deflate-for-php-files)

## SWITCH COMPRESSION ON
<IfModule mod_deflate.c> 
    SetOutputFilter DEFLATE 
    <IfModule mod_setenvif.c> 
        # Netscape 4.x has some problems... 
        BrowserMatch ^Mozilla/4 gzip-only-text/html 
  
        # Netscape 4.06-4.08 have some more problems 
        BrowserMatch ^Mozilla/4\.0[678] no-gzip 
  
        # MSIE masquerades as Netscape, but it is fine 
        # BrowserMatch \bMSIE !no-gzip !gzip-only-text/html 
  
        # NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48 
        # the above regex won't work. You can use the following 
        # workaround to get the desired effect: 
        BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html 
  
        # Don't compress images 
        SetEnvIfNoCase Request_URI .(?:gif|jpe?g|png)$ no-gzip dont-vary 
    </IfModule> 
  
    <IfModule mod_headers.c> 
        # Make sure proxies don't deliver the wrong content 
        Header append Vary User-Agent env=!dont-vary 
    </IfModule> 
</IfModule>
## END SWITCH COMPRESSION ON

Here are the new results now

BLACK NOIR –  49/100 Mobile 71/100 Desktop.

APPLAB –  42/100 Mobile 63/100 Desktop.

That’s a modest improvement already, and the Desktop warning is now orange instead of red on both themes!

So we are down to 4 issues remaining.

Next on the list is Leverage browser caching, so let's add a few more lines to .htaccess (there are various variations of these lists, but I have tweaked them to satisfy google; you can adjust to your own preference).

## LEVERAGE BROWSER CACHING
<IfModule mod_expires.c>
    # Only apply these rules if mod_expires is available
    ExpiresActive On
    ExpiresByType image/jpg "access 1 week"
    ExpiresByType image/jpeg "access 1 week"
    ExpiresByType image/gif "access 1 week"
    ExpiresByType image/png "access 1 week"
    ExpiresByType text/css "access 1 month"
    ExpiresByType application/pdf "access 1 month"
    ExpiresByType application/x-javascript "access 1 month"
    ExpiresByType application/javascript "access 1 month"
    ExpiresByType application/x-shockwave-flash "access 1 month"
    ExpiresByType image/x-icon "access 1 month"
    ExpiresDefault "access 1 week"
</IfModule>
## END LEVERAGE BROWSER CACHING

Here are the scores after that optimization

BLACK NOIR – 67/100 Mobile,  86/100 Desktop.

APPLAB – 58/100 Mobile 74/100 Desktop.

WOW- that’s a huge improvement, particularly for Black Noir Theme! Mobile is now orange and Desktop is green!

Next we’ll deal with Minify CSS. When we click on the Show How To Fix link underneath, for the Black Noir theme it reveals that the highslide.css file could be optimized by 22%. On the Applab theme it shows ani.css could be optimized by 11% (even though this file is also in the Black Noir theme, google does not flag it up there). There are online and offline minifiers for this type of thing, but google makes this easy and does it for you. Look for the link:

Download optimized image, JavaScript, and CSS resources for this page.

This gives a zip file with several optimized resources in it for your convenience. Open the zip file and look in the css folder. So let's take the minified versions of the css files and substitute them for the originals in the index_html_files folder. Take a backup of the existing css files for your own peace of mind in case of problems, then extract the minified versions and upload them to the index_html_files directory.
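If you would rather minify locally than use google's zip, one offline option is the clean-css-cli package (this is just an illustrative sketch; it assumes Node.js/npm is installed and uses highslide.css as the example file):

# Install the minifier once (requires Node.js/npm)
npm install -g clean-css-cli
# Write a minified copy of the stylesheet
cleancss -o highslide.min.css highslide.css

You would then upload the minified copy over the original filename in index_html_files (keeping your backup), since the page references the original name.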

Let's see what difference that has made

BLACK NOIR – 67/100 Mobile,  86/100 Desktop.

APPLAB – 58/100 Mobile 74/100 Desktop.

None at all, it seems! Never mind, there are only 2 things it's complaining about now.

So now we will deal with Optimize Images

  • UPDATE – As of 2017, Google now defaults to ‘lossy’ image optimization, whereas previously it was ‘lossless’. This means that some image quality will be lost during the optimization process. So if you choose the option to use the google generated optimized images, you will be using images that have slightly less quality than the original. See my new article for more information.

This step is a bit more involved. There are several ways this can be approached; I will detail 2 methods here. First off, backup your images just in case. Just note that the optimization strips meta information from the files, so if you need that retained, you need to omit this step or investigate an option that retains it.

First option, replace your images with the optimized versions given by google!

So we start by replacing our images with the optimized ones that google just provided us. In the zip of optimized files we got the css from, there is also a folder called image with optimized versions of our images in it. They will all be in there with their original filenames. So we simply need to upload these files into the index_html_files directory, as with the css. Now google only gives a limited number of files per page, so you will need to repeat this process several times.

In testing, I hit a point where the same optimized files kept getting produced by google even though I had replaced them! For example, for Black Noir, google still complained about the 14@2x.jpg image even though I had just uploaded it! After some head scratching, I realized that google changes the @ to an _ in the filenames, so you just need to extract the files, then change each filename google gives with an _ in it back to an @ symbol if that is what the filename should be!
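To save renaming files one by one, here is a small bash sketch that puts the @ back into the names google exports with an underscore (run it inside the extracted image folder; it assumes every _2x suffix should really be @2x, so check the results before uploading):

# Rename files like 14_2x.jpg back to 14@2x.jpg
for f in *_2x.jpg *_2x.png; do
    [ -e "$f" ] || continue        # skip patterns that matched nothing
    mv -- "$f" "${f/_2x/@2x}"      # bash substitution: _2x becomes @2x
done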

Anyway, this option does get there eventually, but the following option is quicker for many files.

Second Option, install jpegoptim and optipng (or similar) and manually optimize!

This option involves either installing these two applications (or others of your choice) on your local machine, or preferably on your server subject to permissions of your hosting. If you install them on your local machine, you will need to download all of the jpg and png files to your local machine, optimize them then re-upload them to your hosting.

So you need to go into the directory where the files are and use the following command line commands:-

jpegoptim --strip-all *.jpg

That optimizes all the Jpeg files using its best normal compression method. Plus you can then also use:-

jpegoptim --strip-all --all-progressive *.jpg

and that goes through them all again, this time trying the progressive method of compression to see if it is more efficient for each image; if so, it will recompress them with that method, if not, it will leave them as they are.

Then for png files, this uses default settings;

optipng *.png

or to tweak every last byte (be warned this may take ages!)

optipng -o7 *.png

the second command takes much longer and is very processor intensive but can sometimes yield an extra few % compression for some larger images for those who are fanatical about such things.

If the files you optimized were not on the server, you need to upload the optimized versions to the server now.

It still complained to me that a few files were not fully optimized. I haven't fully investigated this; it may be that google tolerates a small amount of imperceptible image degradation whereas the command line tools I recommend above do not.

  • UPDATE – I have now investigated this matter and, as of 2017, Google now defaults to ‘lossy’ image optimization, whereas previously it was ‘lossless’. This means that some image quality will be lost during the optimization process. So if you choose the option to use the google generated optimized images, you will be using images that have slightly less quality than the original.
  • Please see my new article – Google Pagespeed Image Optimization Updated 2017

So I just used the technique in Option 1 above of replacing the remaining files with the google optimized ones, noting the quirk of @ being changed to _ in the filenames that google produces and changing them accordingly.

Finally, the image warning disappeared. Now let's see how much that has improved the situation:-

BLACK NOIR – 68/100 Mobile,  88/100 Desktop.

APPLAB – 68/100 Mobile,  88/100 Desktop.

Well, a modest improvement for Black Noir, but Applab has finally caught up. I suspect the difference between the two was always that Applab has more images or that the images it uses were less optimized.

So that just leaves the one aspect being reported by google;

Eliminate render-blocking JavaScript and CSS in above-the-fold content

Now, I have looked into this aspect, but it starts getting very complicated! I have gained some improvement by inlining some of the css and asyncing some of the Javascript, but doing this wrong can break some functionality of your website or make it not appear correctly. And as these inclusions vary so much from theme to theme, I am not able to post anything that can be used in a generic way on other themes. Also, every time a site changes, even slightly, the files get regenerated and so any amendments made would need to be repeated. Realistically, this is something that would be better addressed by Xara themselves, as this type of thing is much better determined at the point of generating the code rather than coming along afterwards and trying to re-organize it.

So that’s where I will leave it. Take from it what you will, apply it to your Xara or other websites (but take a backup first, I accept no responsibility for breaking your website etc.) and enjoy the reduced load on your server resources, quicker loading times for your website visitors and a better ranking on google because your site has been optimized to its liking!

Additional:

It is really only recommended to perform these optimizations when a site is complete or near completion.

Enabling the cache, for example, will load images from your browser cache, so if you then change an image on the website, your browser will still show the old (cached) image (new visitors will get the newest uploaded image). You can refresh all images in your browser with F5 or Ctrl-F5.

To prevent the optimized images and CSS files being overwritten each time the site is published, ensure you select Fast Update (Changed files only) in the ftp settings in Xara. It should then only upload elements in the site that have changed since last time it was published.

If you upload manually to your server, you need to only update files that are new or changed since last time.

If something does change, or new elements are added, you will need to repeat the relevant section for optimal results.

The .htaccess changes should be preserved as Xara does not touch this.

Overview of Google Pagespeed Insights Ranking Criteria

As I am writing several articles about Optimizing websites for Google Pagespeed, rather than repeat this section in each article, I am including it here as a standalone article.

The Google Website page is https://developers.google.com/speed/pagespeed/insights/ and there are reference links from that page for more information.

Leverage Browser Caching – This allows static files to be stored by your local browser so that for subsequent pageloads the cached files are used. This is quite easy to do by adding a few lines to your .htaccess

Enable Compression – This is the method of using a compressor (typically gzip) to compress files so that they are as small as possible while they are being transferred, then decompressing them before they are rendered. Although most images etc. are already compressed and therefore do not recompress much more, text resources such as css, javascript and html are very compressible, even if they have been minified. Again, .htaccess changes can switch compression on.
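A quick way to confirm compression is actually being served (a sketch assuming curl, with example.com standing in for your own domain):

# Request the page with gzip allowed and check the response headers
curl -s -o /dev/null -D - -H 'Accept-Encoding: gzip' http://example.com/ | grep -i 'content-encoding'

A "Content-Encoding: gzip" line in the output means the server compressed that response.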

Optimize images – As stated above, all modern images are already stored in a compressed format, however there are often many unnecessary segments of information left in the image files (such as camera settings and information) and the default compression algorithms of most major software packages are not optimal, so further compression can be made. Also, thumbnails that are dynamically generated by php are optimized for speed of generation rather than smallest possible size. Gzip and similar compression is not efficient with already compressed images, so re-processing the images using specialized software is the best way to have all images optimized. This is rarely done on-the-fly as it's a time-consuming process, so it is often performed by some triggering mechanism when an image is uploaded or changed, or periodically in a batch-optimize process.
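As an illustration of the batch approach (a sketch assuming jpegoptim and optipng are installed and you run it from the folder containing the images), something like a nightly cron job could run:

# Losslessly optimize every jpg and png below the current folder
find . -type f -iname '*.jpg' -exec jpegoptim --strip-all {} +
find . -type f -iname '*.png' -exec optipng {} +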

Server Response Time – The time for your server to serve the main html document page; this excludes images and resources such as Javascript libraries, CSS and fonts.

Prioritize Visible Content – Best to concentrate on items that are going to be rendered visible on the current device on the initial page load. Other resources that would only be visible on a mobile version of the site (if viewing on a desktop) or vice versa, plus content under tabs that are not selected by default etc. should be treated as low priority and rendered after the priority content.

Minify CSS – Pretty CSS has lots of spaces, indents etc. These all add precious bytes and are only necessary for readability in case the CSS needs to be edited. In the case of standard libraries, minified versions are usually served (often with a .min.css extension). Although the byte saving is somewhat negated when compression is used, it is still good practice to use minified versions of css libraries when available for optimal results.

Minify Javascript – Again, spacing, indentation, carriage returns etc. all go to make code more readable but can usually be stripped to reduce the number of bytes to be transferred. Again, this is somewhat negated by having file compression, but when minified libraries are available, you may as well use them.
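As an example of an offline Javascript minifier (an illustrative sketch assuming Node.js is installed and using script.js as a stand-in filename), the terser package can be used like this:

# Produce a minified copy of a script, compressing and mangling names
npx terser script.js -o script.min.js --compress --mangle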

Minify HTML – Contentiously, as html is typically dynamically produced, the overhead of minifying the html code, with so many quirks on different browsers etc., is rarely worth the effort. Fortunately, having compression enabled negates the need for this, and google seems to tolerate un-minified html as long as compression is enabled.

Eliminate Render-Blocking Javascript and CSS in the above-fold content – Much Javascript and CSS is purely aesthetic and can be left until the main body has been displayed before being loaded. Good examples of these are fonts, layouts, animations etc. Even though the page may be momentarily rendered slightly differently until a resource is fully loaded (such as a font), google feels it is most important to get the first 600 pixels (what is called above-fold content) rendered, even in a basic form, before fine tuning and completing the full page render with all bells and whistles. This helps low-bandwidth and particularly mobile users not waste time and bandwidth fully loading a page before being able to see it; they can cancel or go back quickly this way. Pages that are dependent on libraries to render the above-fold content (such as Jquery) must have these libraries fully loaded before this section can be rendered. The best way to achieve this is to either inline the important code or have it load asynchronously (not waiting for the current page to finish loading before loading it in parallel). The rest of the libraries can be left until the main body has finished before being loaded, perhaps at the bottom by the </body> tag. Alternatively, there are scripts available that load these resources once the main body of the page has finished loading.

Avoid landing page redirects – If you have a redirection from an old site to your new one, or to force www. on your url, or to force https for example, each of these redirects adds an extra request round trip before the real page is served, which takes precious time. Redirects are very useful in many circumstances like those listed, but it is important to rectify the source of the incorrect urls and change them to the full corrected ones. So if you have a sitemap generating urls without the www. and you have a redirect to force the www. to be added, you need to change it to generate the full urls including the www. as a priority.
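To count how many hops a visitor actually goes through before reaching the final page (a sketch assuming curl, with example.com as a placeholder for the URL you publish):

# Follow any redirect chain and report how many redirects occurred
curl -s -I -L -o /dev/null -w 'redirects: %{num_redirects}\n' http://example.com/

Ideally this reports 0 for the URL you put in links and sitemaps.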

Please refer to my individual articles for how I approach optimizing these aspects for various platforms.

Optimizing Opencart website for Google Pagespeed

Optimizing your Whole Opencart Website for Google Pagespeed

This article assumes you have read and applied my previous article on optimizing your Opencart front page for Google Pagespeed

Optimizing an Opencart front page for google pagespeed 100/100

So now we are going to see what else we can do to optimize the rest of the website.

Let's try a category page to start. On the demo we'll try the Desktops category (Show All Desktops).

Aargh! Here’s the results

74/100 Mobile,  97/100 Desktop.

What went wrong? Well, on both Mobile and Desktop it's showing Optimize Images, so we know that there are some more images that need optimizing. But judging by the Desktop results, that must only account for 3%. Let's quickly take care of that by doing the same process as before, replacing the images on the site with the optimized images provided by Google.

75/100 Mobile,  100/100 Desktop.

Desktop improved by 3% but Mobile only improved by 1%? Let's not worry about that; now we know that the remaining problem it's really complaining about on the Mobile site is

Eliminate Render-Blocking Javascript and CSS in the above-fold content

Here is the extra info it gives.
“Your page has 3 blocking script resources and 6 blocking CSS resources. This causes a delay in rendering your page.
Approximately 40% of the above-the-fold content on your page could be rendered without waiting for the following resources to load. Try to defer or asynchronously load blocking resources, or inline the critical portions of those resources directly in the HTML”

Wow, that's 9 resources, no wonder it's complaining.

More to follow..

Optimizing an Opencart front page for google pagespeed 100/100

About Google Pagespeed Insights and Opencart

Out of the box, Opencart 2 is mobile friendly and is one of the better performing shopping carts out there, but since page speed is increasingly being taken into account in google ranking, it makes sense to optimize it as much as possible and of course to make most efficient use of resources.

There are various elements to page speed according to google, plus there are other factors rated highly by other page speed analysis websites, but we are going to concentrate on google's criteria for obvious reasons!

Also, this article is based on a standard LAMP setup (Linux, Apache, MySQL, PHP) and a cPanel-based hosting account, although for other setups much can be reused.

Google Pagespeed Insights Ranking Criteria

For an overview of the aspects that google uses to rank your website page speed, please see the following article, but I cover most of it as I go through.

Overview of Google Pagespeed Insights Ranking Criteria

Optimizing an Opencart Website front page for Google

So let's go! We are starting with a fresh install of Opencart 2.3.0.2 (the results are virtually identical on all 2.x versions) on a shared server with caching switched off, default theme.

* UPDATE – This method has also been tested with the latest version 3.0.0.0 alpha 1 with the same results!! There are a couple of changes though, as version 3 uses a twig template system, so I will post an update after it gets officially released! *

We are using the Google Pagespeed Insights website for the scores, although extra functionality can be achieved through API access.

Here are the lousy scores for our initial install with no optimizations. (I actually switched off some optimizations that are normally enabled by default on my hosting so as to better show the improvement of each stage but also because certain other optimizations can conflict with each other. Your initial results may be better or worse than these!).

58/100 mobile, 64/100 Desktop.

7 out of the 10 criteria are flagged up as problems, the 3 that pass at this stage are:-

Avoid Landing page redirects

Self-explanatory: you are simply visiting a page, so no redirects should occur. If you add a www. redirect or https in the future, ensure you update all incoming links and sitemaps to the full new URLs to avoid redirects as much as possible.

Prioritize Visible Content

Fortunately on the default Opencart theme this is taken care of already by the default structure. Watch out in future for poorly written themes or extensions that break this or other rules.

Reduce Server Response Time.

By default google expects a page to start producing output within 0.2 seconds. A good host should achieve this, although I know even some hosting companies advertised on the Opencart website fail this task miserably! As your store grows and more db queries etc. are required to render a page, this will become more likely to flag up as an issue. You may consider optimizing db tables and implementing a caching solution to reduce this issue.
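On the database side, one simple way to optimize all the store's tables from the command line (a sketch assuming shell access; substitute your own database name and user, or use the equivalent option in phpMyAdmin) is:

# Optimize every table in the store database (prompts for the password)
mysqlcheck --optimize --user=youruser --password yourdatabase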

The issues we need to fix.

So, working through the 7 issues that are flagged as a problem.

The top of the list as a priority to fix is Enable Compression, so let's add a few lines to .htaccess and see how it improves. There are numerous variations of this code, but this one seems to allow for old browsers and, most importantly, exclude (already compressed) images. Source (http://stackoverflow.com/questions/2835818/how-do-i-enable-mod-deflate-for-php-files)

## SWITCH COMPRESSION ON
<IfModule mod_deflate.c> 
    SetOutputFilter DEFLATE 
    <IfModule mod_setenvif.c> 
        # Netscape 4.x has some problems... 
        BrowserMatch ^Mozilla/4 gzip-only-text/html 
  
        # Netscape 4.06-4.08 have some more problems 
        BrowserMatch ^Mozilla/4\.0[678] no-gzip 
  
        # MSIE masquerades as Netscape, but it is fine 
        # BrowserMatch \bMSIE !no-gzip !gzip-only-text/html 
  
        # NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48 
        # the above regex won't work. You can use the following 
        # workaround to get the desired effect: 
        BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html 
  
        # Don't compress images 
        SetEnvIfNoCase Request_URI .(?:gif|jpe?g|png)$ no-gzip dont-vary 
    </IfModule> 
  
    <IfModule mod_headers.c> 
        # Make sure proxies don't deliver the wrong content 
        Header append Vary User-Agent env=!dont-vary 
    </IfModule> 
</IfModule>
## END SWITCH COMPRESSION ON

Here are the new results now

68/100 Mobile 76/100 Desktop.

That’s a huge improvement already and the warnings are now orange instead of red!

Wait, we also just got rid of Minify HTML & Minify CSS. Of course because they are all now being transferred in compressed form, even though they have not been minified, Google is now happy with these! So we are down to 4 issues remaining.

Next on the list is Leverage browser caching, so let's add a few more lines to .htaccess (there are various variations of these lists, but I have tweaked them to satisfy google; you can adjust to your own preference).

## LEVERAGE BROWSER CACHING
<IfModule mod_expires.c>
    # Only apply these rules if mod_expires is available
    ExpiresActive On
    ExpiresByType image/jpg "access 1 week"
    ExpiresByType image/jpeg "access 1 week"
    ExpiresByType image/gif "access 1 week"
    ExpiresByType image/png "access 1 week"
    ExpiresByType text/css "access 1 month"
    ExpiresByType application/pdf "access 1 month"
    ExpiresByType application/x-javascript "access 1 month"
    ExpiresByType application/javascript "access 1 month"
    ExpiresByType application/x-shockwave-flash "access 1 month"
    ExpiresByType image/x-icon "access 1 month"
    ExpiresDefault "access 1 week"
</IfModule>
## END LEVERAGE BROWSER CACHING

Here are the scores after that optimization

77/100 Mobile, 84/100 Desktop.

Another huge improvement and we are already about half way there.

Next we’ll deal with

Eliminate render-blocking JavaScript and CSS in above-the-fold content

Now it starts getting a little more complicated. We are going to need to change how some files are loaded to get rid of the blocking Javascript. In this instance I will inline the jquery.js file, but there are more elegant ways to resolve this issue without the extra overhead of this inlined code being loaded with every single page.

Locate catalog/view/theme/default/template/common/header.tpl and find the following line:

<script src="catalog/view/javascript/jquery/jquery-2.1.1.min.js" type="text/javascript"></script>

and replace it with

<script type="text/javascript"> 
<?php $jq = file_get_contents("catalog/view/javascript/jquery/jquery-2.1.1.min.js"); echo $jq;?> 
</script>

Now here are the scores, another significant improvement!

84/100 Mobile, 87/100 Desktop.

What's left? We just have Optimize Images and Minify Javascript. Let's deal with the Javascript first. There are online and offline minifiers for this type of thing, but google even does this for you. Look for the link:

Download optimized image, JavaScript, and CSS resources for this page.

This gives a zip file with several optimized resources in it for your convenience. Open the zip file and look in the js folder. So let's take the minified version of common.js and substitute that version in the cart. Extract that file, rename it to common.min.js and upload it to the catalog/view/javascript folder. Then edit the header.tpl file we edited above, find

<script src="catalog/view/javascript/common.js" type="text/javascript"></script>

and replace it with

<script src="catalog/view/javascript/common.min.js" type="text/javascript"></script>

Now the scores are very healthy. Interestingly, Mobile and Desktop are now the same, as Desktop did not change with that last one; it obviously penalizes un-minified Javascript on Mobile only.

87/100 Mobile, 87/100 Desktop.

A slight improvement on the mobile so that just leaves the Optimize Images now.

The next (final) step is a little more complicated..

Let's start by replacing the images with the optimized ones that google just provided us. In the zip of optimized files we got the Javascript from, there is also a folder called image with optimized versions of our images in it. They will all be in there with their original filenames, just not in subfolders, so we will have to search to find where they all belong. If we look in the media information of the website (Firefox – click the info icon just to the left of the url bar, then > on the right, More Information and then the Media tab; Chrome – hover over the list in Networking Activity in the Developer Tools) we can see the relative paths to copy them into. The default installation has images in:- image/catalog, image/cache/catalog, image/cache/catalog/demo, image/cache/catalog/demo/banners, image/cache/catalog/demo/manufacturers
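Another way to see which folders the images live in (a rough sketch assuming shell access, with example.com standing in for your store's URL) is to pull the rendered front page and list the image paths it references:

# List the unique image paths referenced by the front page
curl -s http://example.com/ | grep -oE 'image/[A-Za-z0-9_/.@-]+\.(png|jpe?g|gif)' | sort -u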

Replace the files in their native folders and retest..

Result

We made it! – Didn’t we?

What's that I hear you say? What about the rest of the website? That's cheating! What about other images?

Well, the title of the article does say the Front Page and I'll be honest now, this article alone is not going to get every single dynamic page on your Opencart website to 100%, especially if you've made modifications. But most of what I've shown already will apply to the rest of your website, so it's probably a significant improvement on what it was previously. But you've made a fair point, so I'm following this with another article to see what else we can do and what else we can consider for getting the rest as good as possible.

Optimizing Opencart website for Google Pagespeed

Top 3 Tips for People Setting up an online business

Having run several websites since 2004, I wanted to share some tips that may be of benefit to other small online businesses!

Get Decent Website Hosting!

It's really not worth getting budget hosting for a website these days. As a minimum for any website, I would recommend getting cPanel hosting and making sure it has more than one processor and allows for multiple accounts in case of expansion. Here are some more things to look out for.

Website/Hosting Administration

This is administration software for your website(s). You can log into it from any browser and administer various aspects from anywhere: things like email accounts, FTP accounts, MySQL databases, website statistics, etc.

cPanel

cPanel software will be licensed by the hosting company and should be inclusive in your hosting package. There are a few cPanel alternatives out there and I have tried some, but I have found to date that it is worth paying the extra for cPanel, so it is the only one I will recommend.

Hosting Package Types

Basic Shared Hosting

– Usually based on a single or small number of domains; if you can ever foresee the possibility of needing more than one website, this would be best avoided. There are likely to be restrictions on the number of databases and email accounts you can have, which can quickly become restrictive. If you add another website in the future, you may have to get this as an extra, and it may be on a different server and require different logins etc. You will most likely have hundreds of other people hosting their websites on the same server, so if some of those websites are using lots of resources, or get attacked for some reason, your website will also suffer. Packages can be found from just £1 per month, but I would recommend budgeting at least £5 a month if your business is going to depend on your website; that should at least give you a decent amount of resources and the ability to host more than one website.

Reseller

– This is the minimum I would recommend these days. It should allocate you much more bandwidth, hard disk space and other resources than a basic shared hosting package. Even if you only want the extra resources for yourself (multiple website domains etc.), the extra resources and the power to administer multiple accounts from a central interface are a huge benefit. As with basic shared hosting, there will be other resellers on the same server as you, and there is still the risk that their websites can adversely affect yours. Packages start from around £20 per month, but be wary of companies that offer unlimited resources, as they are offering this to everyone else on that server as well!

Server (Virtual)

– This is the bee's knees as far as hosting is concerned. Not only do you get everything in a reseller hosting package, you have access to everything on the server: you should get root access and have powerful commands and resources at your disposal. If you are reading this article, you would most likely want to choose a managed server option, as this hides some of the powerful functionality that you don't normally need and might unintentionally do damage with. You can usually sell hosting or even reseller packages from a server package! Unlike reseller or shared hosting, the server is self-contained for you, so there is no danger of anyone else on the server slowing down your websites. You should get WHM to manage the server (which is like a parent of cPanel) and it is all powerful. (Several virtual servers will actually be running on a single machine, but these days that is not an issue; see below.) Typical costs for a Virtual Private Server start at £40 per month.

Server Types

Cloud Server

This is basically a virtual server (see below) but everything is virtualized. The storage can exist totally separately from the processing, and the processing power is shared across a vast array of websites. This is great for small websites but is often not as quick as a dedicated server.

VDS (Virtual Dedicated Server)

All but the most powerful websites can run happily on a VDS these days. The technology has advanced so much and the virtualization is so optimized that you would be hard pushed to tell if a server is dedicated or virtual. As mentioned in the hosting packages section, a VDS will be one of several running on a physical machine, but the technology is now so advanced that this is no longer an issue. They are hardware independent, so if there is a problem with the hardware it is running on, the VDS can be quickly moved to another physical machine, and you will be allocated minimum guaranteed resources.

Dedicated Server

This was the bee's knees until a few years ago (and still is), but the massive expense of having a dedicated machine constantly running for you in some data centre somewhere is just a luxury these days and unnecessary for most people, unless you have specific requirements such as massive storage for video streaming etc. Also, if the hardware fails, your entire server is down and has to be repaired or moved to another (identical) physical server. This would usually take hours but could take days to resolve, particularly if there is an intermittent problem.

Summary

In summary, there are still some cheap and nasty hosting providers out there. Many will offer free or cheap websites as an incentive, but a lot of these cheap companies are the same ones I constantly read about on forums where people are having problems.

My recommendation:

Siteground.com
Hosting with lots of important extras and fantastic 24/7 support!

Phone numbers and VOIP!

08xx numbers – grrr!

Another pet hate of mine is 0845 or 0800 type numbers! These can cost a small fortune from mobiles, don’t work from abroad and I personally avoid calling them at all costs. If you want non-geographical numbers or specific geographical numbers that will not change if your business location moves, see VOIP below.

VOIP

I cannot recommend VOIP enough. It's basically phone lines using the internet. You can have multiple VOIP phones that can call each other for free, and they can be located anywhere with an internet connection, such as home and business, and calls can be transferred easily!

The requirements for VOIP are;

A decent broadband connection

Either really fast like fibre or at least one that will not have others on it using all the bandwidth watching videos or uploading etc.

VOIP phones or adapters

Special VOIP phones that plug into a standard RJ45 network port. They can often have multiple lines available from the same handset and many other advanced features not found on a regular phone system, such as conference calls etc. Adapters can be bought to convert standard phones into VOIP phones. I haven’t used these myself but I can see how they would be handy, particularly if you still want to access an existing landline and the adapter handles both simultaneously.

* NEW – Mobile/Computer Voip *

Since first writing this article, mobile data and apps have improved significantly, so much so that I would now recommend Zoiper as a VOIP client to run on your smartphone. I find this great for picking up calls when set up as an additional extension to my VOIP setup. (Of course, if you are in Wifi range, your phone should be set up to use that in preference to mobile data.) There are alternatives such as Jitsi, but Zoiper is the most professional I have tried. There are also desktop computer applications; I have used Jitsi on Linux with some success but prefer the mobile app. Also, these apps can have some cool extra features like call recording that you may otherwise have to pay for separately.

A VOIP provider

Although it's possible to set up your own VOIP server, I have used a couple of VOIP providers and find them excellent. They can start with a small monthly charge and most allow you to expand as your business grows.

My recommendation:

Voipfone.com
Click here for their totally Free Trial Offer!

Decent email management

It's one of my pet hates to see joegwill@btinternet.com or whatever as a business email address, especially when they have their own website! Are they blissfully unaware that they can easily have multiple email addresses through their website domain? So joe@joegwill.com, sales@joegwill.com etc. It looks so much more professional and has other benefits!

These can either be set up as mailboxes in their own right, or individually redirected to other email accounts elsewhere! So joe@joegwill.com can easily be redirected to joegwill@btinternet.com using cPanel. And if they ever move away from btinternet, they can just change the redirection to the new email address provided by their new provider!

They could also be redirected to, or collected by, a centralized client such as gmail (which allows access to multiple external accounts through a single gmail account). I don't like this option myself, as I don't like google profiting from my email data; however, this or similar may be a great option for those requiring the convenience of something like gmail and its apps. But don't just move to gmail; keep the power and control yourself by using an email address through your own website domain forever!

My recommendation:

Empower yourself by Hosting your own email!

Sculptex Opencart Support

I will be sharing my knowledge and experience here for everyone's benefit!

I have been programming since the 1980s and I have a wealth of experience with many different platforms and types of programming. Since 2004, I have been customizing and maintaining php shopping cart software for my own family business.

In 2015, I moved over to Opencart software for my family business and am now able to share the benefit of that experience with others. Of course, I am familiar with a lot of other common php software out there, such as WordPress and phpList.

I am offering a selection of my own mods for sale and I will be supporting these through the Opencart community and through this website.

I am also making some of my smaller mods and contributions available free of charge. Of course I will support these as well, but I will treat any paid-for mods as priority.