Changes to Google’s Search Engine Results

A lot of noise has been made about the recent changes Google has made to its search algorithms. In 2011 Google launched its ‘Panda’ update and has revised it repeatedly since, most recently with Panda 3.9.1, released earlier this month.

This has dramatically changed the way Google ranks and displays results, and, together with the ‘Penguin’ update, means users can expect a more tailored results page. Google has also launched what it terms the ‘Knowledge Graph’, which everyone else refers to as ‘semantic search’. In essence, Google uses all of the information it has about a user to modify that user’s search results.

Google does this in a number of ways. One is its aggressive push to sign people up to Google+. Once you’ve created a Google+ account, Google uses the information in your profile, along with your connections and your friends’ likes, to produce a results page specific to you. Google’s ownership of YouTube also helps, as it knows the kinds of videos you watch and the interests you have. Localised results, meanwhile, give users information relevant to their surrounding area.
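
To make the idea of personalised results more concrete, here is a minimal sketch in Python of a generic interest-based re-ranker. It is purely illustrative: the profile data, topics and scores are all invented for the example, and it does not represent Google’s actual ranking method.

```python
# Illustrative only: a toy re-ranker that boosts results matching a user's
# declared interests, in the spirit of personalised ("semantic") search.
# The profile, topics, scores and URLs below are invented examples.
from dataclasses import dataclass


@dataclass
class Result:
    url: str
    topics: set[str]
    base_score: float  # generic relevance before personalisation


def personalise(results: list[Result], interests: set[str], boost: float = 0.5) -> list[Result]:
    """Re-rank results, adding a boost for each topic shared with the user's interests."""
    def score(r: Result) -> float:
        return r.base_score + boost * len(r.topics & interests)
    return sorted(results, key=score, reverse=True)


if __name__ == "__main__":
    profile_interests = {"photography", "travel"}  # e.g. drawn from a social profile
    results = [
        Result("example.com/camera-reviews", {"photography", "shopping"}, 0.60),
        Result("example.org/local-news", {"news"}, 0.70),
        Result("example.net/travel-guides", {"travel"}, 0.65),
    ]
    for r in personalise(results, profile_interests):
        print(r.url)
```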

The biggest change users may notice, if they aren’t tracking SERP positions, is that Google has moved from a 10-result page to a 7-result page for approximately 20% of queries. In other words, for 1 in 5 search queries your audience will see fewer results per page, and those results will also be tailored to the searcher’s needs.

Several members of Webmasterworld.com have noted that sitelinks, results that include links to further pages within a domain, are more likely to appear at the top of 7-result SERPs (Search Engine Result Pages). The initial ‘Panda’ updates back in March/April left many webmasters indignant when they saw their optimised pages drop down the SERPs in favour of blogs or keyword-heavy domains rather than legitimate business sites. Google seems to have rectified this for most queries, and with each patch it has issued announcements reassuring webmasters that some fluctuation is to be expected whenever the search engine alters itself. Ultimately, Google is trying to make results better for users by showing appropriate sites with fresh, new information.

Many companies may find that refreshing their website copy helps them rank higher on Google after these new patches. Blogs are similarly affected, and writers are having to adapt. While keywords and tags in titles still matter for appearing on SERPs, writers are finding that the old ‘write and forget’ mentality no longer works: they need to revisit content frequently to make sure it keeps ranking high on results pages.
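
As a rough sketch of what that ongoing revisiting can look like in practice, the hypothetical helper below uses the widely available requests and BeautifulSoup libraries to pull a page’s title and meta description and report which target keywords are missing from them. The URL and keywords are placeholders, and the workflow is an assumption for illustration rather than anything described above.

```python
# A hypothetical audit helper, assuming the requests and beautifulsoup4
# packages are installed. It fetches a page and checks whether target
# keywords still appear in the title and meta description.
import requests
from bs4 import BeautifulSoup


def audit_page(url: str, keywords: list[str]) -> dict:
    """Return the page's title, meta description, and any keywords missing from them."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "").strip() if meta else ""

    text = f"{title} {description}".lower()
    return {
        "title": title,
        "description": description,
        "missing_keywords": [k for k in keywords if k.lower() not in text],
    }


if __name__ == "__main__":
    # Placeholder URL and invented keywords, for demonstration only.
    print(audit_page("https://example.com", ["search engine", "results"]))
```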

This is all in an effort to weed out keyword-spammed articles and hastily rewritten content so that web users get the best possible experience. These changes, from large overhauls to small tweaks, are here to stay, and we can expect more soon. One thing is certain: businesses will have to adapt to the new SERP algorithms. With Google accounting for 91% of search engine traffic, they can’t afford not to.
