Google Releases 52 Search Quality Changes for April
April was a busy month for the minions over at Google. With Panda updates, the introduction of a Penguin, and a parked domain classifier error (I guess even Google makes mistakes), one would think the search engine would take a small break.
But, alas, Google is still changing, tweaking, and plotting, and in an effort to continue its support of transparency it released 52 more changes and updates.
We don’t know exactly when Google implemented these changes, though, or whether they coincided with Panda or Penguin. (Does anyone else get sick of talking in zoo-animal code?)
Here are the changes we feel are important for our readers. If you want the unabridged version, click here.
Every month, Google makes changes to its Freshness update. I can’t say I am sad about it. I am not a fan of the first page of search results spitting out five-year-old articles about how reciprocal linking is a great linking strategy.
Google introduced three freshness updates involving search results and ranking signals. Breaking news topics, other new content, and “fresh documents” may all see a boost.
We aren’t entirely sure what this means, but Google did mention it excluded websites identified as “low quality” content from the classifier it uses to promote fresh content.
No freshness boost for low-quality content. [launch codename “NoRot”, project codename “Freshness”] We have modified a classifier we use to promote fresh content to exclude fresh content identified as particularly low-quality.
The next update discusses authoritative content:
More authoritative results. We’ve tweaked a signal we use to surface more authoritative content.
Can anyone say “ambiguous”? Every webmaster thinks their content is authoritative, so what does this really mean?
The web geeks over at SearchEngineWatch seem to think Google will improve the ranking of older domains that have strong link profiles and those that have refrained from questionable (spammy) techniques.
The next update may coincide with the Penguin update:
Keyword stuffing classifier improvement. [project codename “Spam”] We have classifiers designed to detect when a website is keyword stuffing. This change made the keyword stuffing classifier better.
Penguin centered on keyword stuffing, so we may have already seen the results of this one.
How many keywords are considered stuffing? We will never know. As a rule of thumb, write your content with no thought about keywords. After it is complete, go back and take a count. If the content is focused on the keyword topic, most likely you will have used your keywords appropriately.
Tip: This is completely unscientific, so take it for what it is worth. When my mind starts to wonder if there are too many keywords, I know I have added too many.
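If you want to make that “go back and take a count” step concrete, here is a minimal sketch. The sample text, the phrase, and the per-100-words density convention are all my own illustration; Google has never published a stuffing threshold, so treat the numbers as a sanity check, not a rule.

```python
import re

def keyword_density(text: str, phrase: str) -> tuple[int, float]:
    """Count case-insensitive occurrences of `phrase` in `text` and
    return (count, occurrences per 100 words). The 100-word basis is
    an arbitrary convention, not a Google threshold."""
    words = re.findall(r"[a-z']+", text.lower())
    count = len(re.findall(re.escape(phrase.lower()), text.lower()))
    density = 100 * count / max(len(words), 1)
    return count, round(density, 1)

# A deliberately stuffed sample: 4 hits in 20 words is 20 per 100 words,
# which should make anyone's mind "start to wonder."
sample = ("Cheap widgets are our specialty. Buy cheap widgets today, "
          "because cheap widgets from our cheap widgets store are the best.")
print(keyword_density(sample, "cheap widgets"))  # → (4, 20.0)
```

Reading your draft aloud remains the better test; the counter just tells you where to look.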
The next update:
Improvements to how search terms are scored in ranking. [launch codename “Bi02sw41”] One of the most fundamental signals used in search is whether and how your search terms appear on the pages you’re searching. This change improves the way those terms are scored.
Matt McGee over at SearchEngineLand guesses that this change, along with the keyword stuffing update, is related to “spun” content, although it could refer to a number of things. It definitely alludes to the misuse of keywords. Spinning content and adding keyword-specific links that have nothing to do with the content would most likely fall under that blanket. Any thoughts?
If you know what “spinning” content is and you are doing it, reconsider your linking strategy; it’s not worth the risk. If you don’t know what it is, don’t worry about it.
More updates include changes to how Google categorizes paginated documents so they don’t take over the pages of search results. The search engine also announced it would focus on publishing more diverse results by removing excess results from the same domain.
Local search may get a boost, even for websites that are not optimized for location terms. Here is Google’s first update:
Improvements to local navigational searches. [launch codename “onebar-l”] For searches that include location terms, e.g. [dunston mint seattle] or [Vaso Azzurro Restaurant 94043], we are more likely to rank the local navigational homepages in the top position, even in cases where the navigational page does not mention the location.
Google is trying to improve its ability to detect a local business’s location even if the home page does not mention a specific locale. The next update follows in the same vein, though it involves countries:
Country identification for webpages. [launch codename “sudoku”] Location is an important signal we use to surface content more relevant to a particular country. For a while we’ve had systems designed to detect when a website, subdomain, or directory is relevant to a set of countries. This change extends the granularity of those systems to the page level for sites that host user generated content, meaning that some pages on a particular site can be considered relevant to France, while others might be considered relevant to Spain.
Google is digging deeper into a site to detect additional locations from user-generated content, since certain pages may be relevant to users in one country while other pages may focus on a different country.
Last year, the SEO industry was “up in arms” over Google’s announcement that it would change title tags in the search results as it sees fit.
The newest April update improved Google’s ability to change page titles. According to the search giant, “you’ll find more informative titles and/or more concise titles with the same information.”
Many SEOs are irate over this. What do you think?
Google announced four changes to sitelinks and “megasitelinks,” the links that display below a website’s listing and lead to deeper parts of the website. Sub-sitelinks will now replace text snippets, and Google improved the ranking of megasitelinks by “providing a minimum score for the sitelink based on a score for the same URL used in general ranking.”
- Indexing – Google increased the number of documents served by its main index by 15%. It also launched a new index “tier.”
- Instant preview changes
- Changes to how Google interprets the intention behind search queries by using users’ “last few searches.”
- Improved user interface for searches related to breaking news topics.
- Anchors bug fix
So, we’ve had a lot of fun trying to decipher Google’s code, and as always I’d love to hear your opinions.
Do you have any other suggestions about the updates? Has your website been affected? Let us know in the comments.