Google Panda & low quality pages


By Joost de Valk

While reviewing websites, we quite often find sites that have already fallen victim to Google's Panda algorithm, or run the risk of getting "Pandalized". A site runs that risk when it has a lot of low quality pages. In this post I want to give you some quick insights into what Google Panda is and how you can prevent your site from getting hit by it.

What is Google Panda?

A quick intro by yours truly on what Google Panda is: [embedded video]

How to fix low quality pages

In short, Panda usually hits your site because low quality pages make up too large a proportion of your total number of pages. So you need to fix those low quality pages to make that ratio healthy again. There are two ways to do that:

You improve the content and quality of these pages. That means adding more well-written content, but it often also means reducing the number of ads on the page and improving its UX.
You remove these low quality pages, or deny search engines access to them. This second method is often called a "Panda diet".
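In practice, a "Panda diet" often starts with a list of your thinnest URLs and a set of matching robots.txt rules. A minimal sketch of generating those rules (the example.com paths are hypothetical placeholders for your own thin pages):

```python
# Sketch: build a robots.txt block for a "Panda diet".
# The URLs below are hypothetical examples of thin pages you might block.
from urllib.parse import urlparse

def disallow_rules(thin_urls, user_agent="*"):
    """Return a robots.txt block that blocks crawling of the given URLs."""
    lines = [f"User-agent: {user_agent}"]
    for url in thin_urls:
        # robots.txt rules match on paths, so strip the scheme and host.
        path = urlparse(url).path or "/"
        lines.append(f"Disallow: {path}")
    return "\n".join(lines)

thin_pages = [
    "https://example.com/tag/misc/",
    "https://example.com/2009/old-thin-post/",
]
print(disallow_rules(thin_pages))
```

Keep in mind that robots.txt only blocks crawling; pages that are already in Google's index need a noindex directive, or outright removal, to actually drop out of it.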

Hit by Panda? Afraid of being hit? Order a website review and we’ll give you a ton of practical tips and tricks on how to prevent that! »

Identifying which low quality pages to fix

Whether you decide to improve your content or to remove your low quality pages, you need to know where to start. The best way to identify pages that need fixing is to look at pages on your site that have very few visitors and/or a very high bounce rate. These are pages that aren't ranking, or that aren't doing anything for your site's overall performance because people leave them too quickly.

There are a couple of tools that can help you identify which low quality pages need fixing. The one I like most for small to medium-sized sites is Screaming Frog. Once you've opened it, you can connect it to a site's Google Analytics data and then have Screaming Frog crawl the site, after which you can sort by bounce rate. There are also filters that show you pages with a bounce rate above 70% (which really is too high) and pages that have no GA data at all because nobody visited them.
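The same triage can be scripted against any analytics export. A minimal sketch, assuming a CSV with url, sessions and bounce_rate columns (the column names and sample rows are made up for illustration; the 70% threshold mirrors the filter described above):

```python
# Sketch: flag potential low-quality pages from an analytics export.
# Pages with no traffic at all, or a bounce rate above the threshold,
# are flagged for manual review.
import csv, io

SAMPLE_CSV = """url,sessions,bounce_rate
/good-guide/,1200,0.42
/thin-tag-page/,3,0.85
/orphaned-page/,0,
"""

def flag_low_quality(csv_text, max_bounce=0.70):
    flagged = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        sessions = int(row["sessions"] or 0)
        bounce = float(row["bounce_rate"]) if row["bounce_rate"] else None
        if sessions == 0 or (bounce is not None and bounce > max_bounce):
            flagged.append(row["url"])
    return flagged

print(flag_low_quality(SAMPLE_CSV))
# → ['/thin-tag-page/', '/orphaned-page/']
```

These flagged URLs are only candidates: as the next paragraph explains, a high bounce rate isn't always a problem, so each page still needs a human look.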

It's important to note that a bounce doesn't always mean something's wrong. On our knowledge base, a page with a high bounce rate usually means the page solved the visitor's problem: people didn't do anything else because they didn't have to. So you still need to go into the pages you've identified with these methods and evaluate their quality individually.

Improve or remove?

Once you've identified which pages on your site need fixing, you need to decide whether to improve or remove them. Usually, you'll end up doing a bit of both. Pages that target keywords that truly matter to you should simply be improved. If other pages target keywords that aren't interesting enough to your business, or don't target anything at all, get rid of them. You can choose to noindex them, preventing Google from showing them in its search results, but I honestly think you're usually better off deleting pages like that.
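If you do choose to noindex pages rather than delete them, it's worth verifying that the directive actually made it into the HTML. A small sketch using Python's standard-library HTML parser (the sample markup is invented):

```python
# Sketch: check whether a page's HTML carries a robots "noindex" directive.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # Look for <meta name="robots" content="...noindex...">.
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

def has_noindex(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex

sample = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(sample))  # → True
```

Run this against the pages you've noindexed (for example, by fetching them after a deploy) to confirm Google is actually being told to drop them.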
