Google Algorithm Change: Who Are The Biggest Losers?

Last week Google dealt a big blow to content farms, scraper sites that copy content from others, and, it seems, blogs in general. The algorithm change, which I reckon you have heard about, read about, and experienced yourself by now, aims to move sites with “original” content to the top of search results and push sites with “low quality” content to the bottom.

The trouble is - there is no clear way to distinguish sites that carry articles with “in-depth reports” and “thoughtful analysis” from the ones that don’t. As a result, a large number of blogs and innocent sites have been negatively impacted by the algorithm change.

From what I could gather from Twitter and discussion forums, many legitimate sites have reported organic traffic drops as high as 50% to 60%, and a 25% drop seems to be “normal”. The change has currently been rolled out only in the US. Eventually, it will reach the rest of the regions, which means even worse days are ahead for bloggers and webmasters.

What everybody wants to know now is whether the content farms were affected by this change.

German SEO consulting company Sistrix, which specializes in tracking the link-building progress of its customers’ domains and those of their competitors, took up this task. The company analyzed one million keywords and noted the rankings of websites in the U.S. index before and after the update. It then ranked the domains by a “visibility” index value that Sistrix created, which takes into account the number of keyword positions lost, the specific ranking positions, and the estimated clickthrough rate from those positions.
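
Sistrix has not published its exact formula, but to make the idea concrete, here is a rough sketch in Python of how such a visibility index might work: weight every keyword position a domain holds by an estimated clickthrough rate for that position, sum the weights, and compare the totals before and after the update. The position-to-CTR table and the example keywords below are made-up assumptions for illustration, not Sistrix’s real data.

    # Toy sketch of a "visibility" score in the spirit of Sistrix's index.
    # The position weights and keywords below are illustrative assumptions.

    # Rough estimated clickthrough rates by first-page position (assumed values).
    CTR_BY_POSITION = {1: 0.35, 2: 0.17, 3: 0.11, 4: 0.08, 5: 0.06,
                       6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

    def visibility(rankings):
        """Sum estimated clicks across all keywords a domain ranks for.

        rankings: dict mapping keyword -> ranking position (1-based).
        Positions beyond the first page contribute nothing.
        """
        return sum(CTR_BY_POSITION.get(pos, 0.0) for pos in rankings.values())

    # A domain's loss is the difference between pre- and post-update scores.
    before = {"how to tie a tie": 2, "best budget laptop": 5, "fix slow pc": 9}
    after  = {"how to tie a tie": 8, "best budget laptop": 14, "fix slow pc": 30}

    print(f"visibility before: {visibility(before):.2f}")  # 0.25
    print(f"visibility after:  {visibility(after):.2f}")   # 0.03
    print(f"absolute loss:     {visibility(before) - visibility(after):.2f}")

Under a scheme like this, a domain that slips from the top of the first page to the second page loses nearly all of its estimated clicks, which is why position losses on high-traffic keywords would dominate an absolute-loss table.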

Below is Sistrix’s table of the 25 biggest losers by absolute visibility loss.

[Table: Sistrix’s 25 biggest losers by visibility index]

Over at Search Engine Land, Danny Sullivan obtained from Sistrix the full list of 331 domains that were found to have lost rankings and carried out his own analysis, resulting in some new tables. These tables show the top 100 domains that suffered losses, sorted by total number of keyword positions lost, both in absolute numbers and by percentage.

Most of the usual suspects are there - Associated Content, Ezine Articles, Suite101.com, HubPages, Mahalo, and FindArticles, among others. On the surface, the algorithm update looks successful, but what has not been taken into account is the vast number of smaller but good-quality sites that got nuked. Collateral damage, you might argue, is bound to happen. But we are talking about a change affecting 11.8% of all searches - a big number, and consequently big damage to innocent bystanders.

Looking back at the list, you will notice that eHow and The Huffington Post escaped. In fact, eHow’s ranking actually went up, along with YouTube, Facebook, and eBay. How did that happen? One possibility is that the sites ranked above eHow got devalued, automatically boosting eHow’s position.

So has the quality of Google search improved since the algorithm change? Not necessarily, as is evident from eHow’s case. Many legitimate sites have disappeared from the search pages, and their places have been taken by other low-quality sites. I believe it won’t be long before content farms find another way to beat the system and work their way back into Google.

Comments

  1. The update only made matters worse. The search results were already flooded with spam, and now we see thousands of genuinely original, content-based websites being pushed out of the search results, which only makes things easier for those spammers.

  2. Right.

    I want to know the signals that Google considers when flagging a site as low quality or spam, because if you go through the list at Search Engine Land you can see a couple of genuinely good sites getting affected.

    I know almost all blogs have had their traffic reduced to some degree, but getting into a top-100 list of affected sites is terrible. It’s like landing on a spam blog list.
