

Google launches Search Transparency Blog Series

Friday, February 10th, 2012

The small changes and tweaks Google makes to its search algorithm are rarely conveyed to users around the world, despite Google having published over 1,000 official blog posts and 400 webmaster videos. Hence the Search Transparency blog series was born.

This series documents the alterations Google introduces to its search engine, which number roughly 500 per year. The so-called "algorithm changes" are sometimes nothing more than a revision of how results are displayed. For instance, Google made minor changes to the colors and layout of results on tablets and mobile phones running the Android operating system. This makes sense: we expect results on our phones to be as fast and clear as on a desktop, without compromising on content.

Another interesting tweak is designed especially for sports fans: scores and schedules for the major soccer league spanning the USA and Canada are now displayed. Users would benefit even more if Google made this a geographical concept, tailoring results to the local season and type of sport, but that feature is still only speculation.

The prediction algorithm has also been a boon. Autocomplete is now flexible enough to accommodate alternative suggestions without losing your original search query, as it used to.
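
As a toy illustration of the general idea, the sketch below returns prefix-based suggestions while keeping the original query as the first option. The suggestion list and the autocomplete function are made up for the example and have nothing to do with Google's actual prediction algorithm.

# Illustrative only: a toy prefix suggester that keeps the user's
# original query as the first option; not Google's prediction algorithm.
import bisect

SUGGESTIONS = sorted([
    "google algorithm",
    "google algorithm changes",
    "google analytics",
    "google android",
])

def autocomplete(query, limit=3):
    """Return the original query first, followed by up to `limit` prefix matches."""
    start = bisect.bisect_left(SUGGESTIONS, query)
    matches = []
    for candidate in SUGGESTIONS[start:]:
        if not candidate.startswith(query):
            break  # sorted order means no later entry can match the prefix
        if candidate != query:
            matches.append(candidate)
    return [query] + matches[:limit]

print(autocomplete("google al"))
# ['google al', 'google algorithm', 'google algorithm changes']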

One major change acknowledges that two queries differing by only one word can return very different results. Huffman states, "[This] makes it less likely that these results will rank highly if the original query had a rare word that was dropped in the alternate query."

Google also decided to prevent "parked domains" from dominating the results. Parked domains are websites that carry little more than advertisements and no original content. This is certainly a blow to developers who neglect their duty to keep their websites updated with fresh content.

Good news for bloggers: the Blog Search index has started focusing on fresh, comprehensive blogs. This is a great opportunity for new, undiscovered blogs with quality work to show up in the first few search results. In a similar manner, "scraper" sites, those that post duplicate content, are identified from the signals Google receives, a change tuned to weed out fraudulent sites.
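
To give a rough sense of how duplicate content can be spotted in general, here is a minimal sketch using word shingles and Jaccard similarity. This is a common generic technique, not Google's actual scraper-detection signal, and the sample strings are made up.

# Illustrative only: a generic "shingling" check for near-duplicate text,
# not Google's actual scraper-detection signal.

def shingles(text, n=3):
    """Set of overlapping n-word shingles in a piece of text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a, b):
    """Jaccard similarity of two shingle sets (1.0 means identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original = "Google launches a blog series explaining changes to its search algorithm."
scraped = "Google launches a blog series explaining changes to its search algorithm today."

score = jaccard_similarity(shingles(original), shingles(scraped))
print(f"similarity: {score:.2f}")  # prints 0.73 here; near-duplicates score close to 1.0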

The "top result selection code rewrite" was taken up by Google's engineers to reduce redundant results from the same site on the first page. The top result deserves more processing than the others, since it is the one that ends up in first place. For large companies such as Coca-Cola, which run many ancillary sites and social media pages, this modification is an absolute necessity so that the first page is not crowded with a single company's pages.
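
As a loose illustration of the underlying idea, the sketch below caps how many results any single domain can contribute to the first page. It is a generic "host crowding" limit with made-up URLs and a hypothetical max_per_host parameter, not Google's actual selection code.

# Illustrative only: a generic cap on results per domain ("host crowding"),
# not Google's actual top-result selection code.
from urllib.parse import urlparse

def diversify(results, max_per_host=2):
    """Keep the ranking order but allow at most max_per_host results per domain."""
    counts = {}
    page = []
    for url in results:
        host = urlparse(url).netloc
        if counts.get(host, 0) < max_per_host:
            counts[host] = counts.get(host, 0) + 1
            page.append(url)
    return page

ranked = [
    "https://www.coca-cola.com/",
    "https://www.coca-cola.com/brands",
    "https://www.coca-cola.com/careers",
    "https://en.wikipedia.org/wiki/Coca-Cola",
    "https://twitter.com/CocaCola",
]
print(diversify(ranked))
# the third coca-cola.com page is dropped from the first page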

A web developer's prime concern is to rank well in the SERPs. A legitimate developer who follows these rules and keeps pace with the multitude of changes Google makes will find it much easier to adopt better SEO techniques.

