There’s a common misconception about how to rank in search engines for several similar searches – and it’s rooted in how search engines used to work. Back in the day, you could create loads of somewhat similar pages to optimize for small search variations. Today, however, SEO experts advise against this.
Let's look at the problem with creating too many pages – or, even worse, pages that are created automatically. It all started when we found this question on Quora:
"I want to make a page which consists of a search result on my website, the reason is I want search results from my website to appear on google search engine. Let’s say someone searches for "barbie for sale" on a search engine, I want to have a page on my classified site, which consists of a search for "barbie". I read that to do this you need to create a dynamically generated page, how is that done?"
This is actually not an uncommon strategy. We’ve come across several e-commerce websites that generate pages dynamically, many of which are regretting it today.
How to create thousands of website pages
The goal is to create pages that target every search that seems relevant to the business. Let’s say a website sells shoes. If someone searches for “red Converse Chuck Taylor All Stars size 10” in Google and reaches the website, that search automatically generates a unique page with a unique URL.
If someone then searches for “red Converse Chuck Taylor All Stars size 9”, this also becomes a page. Then there’s a virtual avalanche of new pages: shoe brands, models, sizes, colours, search variations, misspellings and so on all generate pages intended to target small search-query variations.
If the strategy is “successful,” thousands of pages are created every week, and suddenly the poor webmaster has a site with 100,000+ pages to maintain. And the number is constantly growing.
Why you shouldn’t do it
Five or ten years ago, the URL string was a far more important ranking factor. A keyword-matched URL could be enough for you to rank for a search query. Today, Google cares a lot less about the actual words in the URL string. Optimizing pages this way – without unique, quality content on the page – is simply not enough, especially if the competition is tough.
At the same time, Google is nowadays more inclined to penalise websites with a lot of duplicate content. And since this strategy relies on a machine creating duplicate content automatically, the search engine will not know which of the many pages is most relevant, ultimately leading to none of them ranking at all.
This can also hurt the user experience on the site. A user doesn’t want to navigate between different pages to see which colours and sizes are available; they expect to see everything on one page, using faceted filters to find the right model, size and colour. Read more about this in our Guide to User-Experience Design.
What to do with all those duplicate pages
- If you want to remove a page, set up a permanent redirect (301) to the page you intend to be the main page for the topic. If any of the redirected pages have been linked to from other sources, that link strength is passed on.
- If you have to keep a page, use canonical tags to signal to search engines that you have similar content under different URLs although they are essentially the same page.
- If you want to make sure pages are excluded from the search results, add a "noindex" metatag and/or a "nofollow" metatag.
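As a rough sketch, the last two options look like this in the page’s HTML head – note that the URL below is a made-up example, and the 301 redirect from the first option is typically configured on the server or in your CMS rather than in markup:

```html
<!-- Canonical tag: this URL is a duplicate variant; tell search engines
     which page is the main one. The href is a hypothetical example. -->
<link rel="canonical" href="https://www.example.com/shoes/converse-chuck-taylor" />

<!-- Robots meta tag: keep this page out of the index entirely ("noindex")
     and tell crawlers not to follow its links ("nofollow"). -->
<meta name="robots" content="noindex, nofollow" />
```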
The RIGHT way to do it
As we've written about earlier, it is very helpful to use your site-search data to optimise your SEO.
"On-site search data, unlike GA data, provides great insight into the most relevant search phrases regarding your business. These might be words you didn't know people use to describe your products, products you may not even carry, and questions you didn’t know customers had. While Google Analytics provides data for keywords you already rank for and/or buy, site search provides data on what people look for once they have entered your website. It provides a picture of the customers' perception of your brand."

Shortlist the most popular converting search queries that essentially mean the same thing (i.e. lead to the same search results) and create one indexable URL for that page.
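The shortlisting step can be sketched in a few lines of Python. The log format and the queries below are made up for illustration; in practice the rows would come from your site-search analytics export:

```python
from collections import Counter

# Hypothetical site-search log: (raw query, did the visit convert?) pairs.
search_log = [
    ("barbie for sale", True),
    ("Barbie  for sale", True),
    ("barbie for sale", False),
    ("barbie doll", True),
    ("barbie doll", False),
]

def normalize(query: str) -> str:
    """Collapse case and whitespace so near-duplicate queries group together."""
    return " ".join(query.lower().split())

# Count conversions per normalized query.
conversions = Counter(
    normalize(query) for query, converted in search_log if converted
)

# The most popular converting queries are candidates for one indexable page each.
for query, count in conversions.most_common(3):
    print(query, count)
```

Queries that normalize to the same phrase (and lead to the same results) get one indexable URL, rather than one page per raw variation.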
So the simple answer to the Quora user is: Don’t do it.
And to webmasters who have created a monster: Pull the plug!