Archive December 2015

SEO limitations that penalise rich-content websites


Why do some websites get penalised with a lower page rank even with good content?

The important aspect of SEO is making your website visible to search engines. Ever wondered why your web page didn’t get the expected page rank? Why it didn’t turn out #1 in the SERP despite its rich content?

SEO is limited by the decoding mechanism of bots and crawlers. The problem boils down to one common issue: Google’s bots and crawlers don’t have the intelligence of a human being.

Reasons that penalise rich-content websites

  1. Penalisation due to online forms

    The bots have the capability to span the entire internet, follow millions of links and index billions of pages. But bots cannot fill out online forms, such as a login form protected by a captcha. This simply means that the content sitting behind such forms is never reached by the crawlers, and pages that are unreachable by the crawlers fail to show up in the SERP. So what do we do now? Very little can be done from our side other than prevention: take care not to put major content behind online forms.

  2. Errors in crawling directives [robots.txt file]

    Errors in a website’s crawling directives can be major trouble-makers. When the robots.txt file is misconfigured, the bots get confused and the crawlers fail to report your website. That simply means your website goes out of scope for the search engine. A minimal sketch of how a crawler reads these directives is given after this list.

  3. Poor construction of linking structures

    When your website’s link structure is poor, it’s time for your dream called ‘page rank’ to remain just a dream. By poor link structure, I mean broken links, poorly interlinked web pages and a complicated linking structure that is not user-friendly. The bots report the poor link structure, and there is a very good chance of losing your dream page rank.

  4. Excess use of media and too little text

    Believe me, bots do have the capability to read media such as images, but they face problems while processing and parsing them. Media like images, videos, Flash files, plugin content, etc. largely gets either ignored or wrongly processed by bots. Bots are far better at reading simple HTML text. Building pages entirely out of media (even if they carry rich content) is a bad habit, so use a textual approach wherever possible. Remember, it is the bots that compute your page rank!

  5. Mismatched keywords in your content

    This is the problem faced by most users. It can be rectified easily, but the task is time-consuming. Plan your keywords carefully: use tools like the AdWords Keyword Planner to find out search volumes. This simple on-page optimization can raise your page rank. Pick 5-6 core keywords and build the base of the post on them.
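
As a quick illustration of point 2 above, here is a minimal sketch of how a crawler interprets robots.txt directives, using Python's standard-library robotparser. The directives and the example.com URLs are hypothetical placeholders.

    from urllib.robotparser import RobotFileParser

    # Hypothetical crawling directives; a stray "Disallow: /" here would
    # hide the entire site from well-behaved crawlers.
    rules = """
    User-agent: *
    Disallow: /private/
    Allow: /
    """

    parser = RobotFileParser()
    parser.parse(rules.splitlines())

    print(parser.can_fetch("*", "https://example.com/blog/post"))     # True
    print(parser.can_fetch("*", "https://example.com/private/page"))  # False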

Remember! Your website should not satisfy the bots alone. Your content must satisfy both the viewer and the bots.

Perfect SEO – 5 basic google webmaster guidelines

Perfect SEO for beginners

Everyone wants to make it to the top of the Google SERP. What requirements should your site meet in order to compete with others? What do you think can make your website top the SERP? To help us out, the Google webmasters have come up with some basic guidelines for perfect SEO.

If you are busy trying to crack the complicated Google algorithms, you are taking a step backwards from your goal. Following the webmaster guidelines for SEO instead can be of much more help.

The 5 basic Google webmaster guidelines for perfect SEO are listed below for better ranking:

1. Make web pages for users, not for search engines:

Avoid cloaking.

Cloaking: a popular black-hat SEO technique in which the content served to users is different from the content served to the search engine spider.

Google loves websites that don’t resort to black-hat techniques. Make your web pages as rich in content as possible. Give your site a meaning: concentrate on your website’s core idea and work on it. If you are running a blog, concentrate on the focus keyword of every post.

2. Make your site navigation perfect:

Make sure that all public pages on your website are fully accessible to the user. A navigation menu that helps visitors reach every page is essential; a universal menu bar is preferred.

The navigation bar shown below is my universal navigation bar. It links to all the web pages in my blog.
[Image: Universal site navigation]

3. Choose suitable names for your site’s category and menu items:

The Google AdWords Keyword Planner comes to your rescue: use it to choose your category names. A competition level of low or moderate is a good sign to start with. With high-competition keywords, on the other hand, you have the harder task of producing richer and more unique content.

4. If you are blogging, make sure your article word count exceeds 300:

Google’s primary aim is to provide the user with more accurate results. A content-rich article should have explanations that run beyond 300 words. Google is good at sensing synonyms, i.e. words that carry a meaning similar to the focus keyword of the blog post. Avoid constant recurrence of the focus keyword and use alternative keywords that convey the same sense.
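
As a rough illustration of the 300-word guideline, the minimal sketch below strips the tags from a draft and counts the remaining words; the sample HTML is a hypothetical placeholder.

    import re

    draft_html = "<h1>Review of novels</h1><p>The story, the plot and the author ...</p>"

    # Drop the markup and count only the visible words.
    text = re.sub(r"<[^>]+>", " ", draft_html)
    word_count = len(text.split())

    if word_count < 300:
        print(f"Only {word_count} words - aim for at least 300.")
    else:
        print(f"{word_count} words - the length looks fine.")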

5. Make your title, h1 and h2 tags contain the focus keyword of the article:

Having the focus keyword of your web page in the title, h1 and h2 tags is highly essential. It gives your web page a slightly higher preference in the SERP (Search Engine Results Page). The h1 and h2 tags create the basic impression that you have content specific to the keyword listed, so this step is vital while building web pages.
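
A minimal sketch of this check, using only Python's standard-library HTML parser: it reports whether a focus keyword shows up in the title, h1 and h2 tags. The page and keyword are hypothetical.

    from html.parser import HTMLParser

    class HeadingChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.current = None
            self.found = {"title": "", "h1": "", "h2": ""}

        def handle_starttag(self, tag, attrs):
            if tag in self.found:
                self.current = tag

        def handle_endtag(self, tag):
            if tag == self.current:
                self.current = None

        def handle_data(self, data):
            if self.current:
                self.found[self.current] += data

    page = """<html><head><title>Perfect SEO for beginners</title></head>
    <body><h1>Perfect SEO</h1><h2>Basic webmaster guidelines</h2></body></html>"""

    checker = HeadingChecker()
    checker.feed(page)

    keyword = "perfect seo"
    for tag, text in checker.found.items():
        status = "contains" if keyword in text.lower() else "is missing"
        print(f"<{tag}> {status} the focus keyword: {text.strip()!r}")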

Perfect SEO makes your website stand out for its content. It is your ticket to showcasing your website to the right audience.

2007 Google Algorithm update


The 2007 Google algorithm updates consisted of the Buffy update and Universal Search. Buffy was a collection of minor updates, and Universal Search was a mind-blowing concept that shaped the future of the SERP.

Buffy Update: [ June 2007 google algorithm update ]

After almost a year (2006) without big updates, Google produced Buffy, the first major update of 2007. The Jagger update [October 2005 update] had prepared the ground for the infrastructure change brought in by the Big Daddy update [December 2005 update]. Buffy had a huge effect on single-word search results and turned out to be a collection of various small tweaks and minor modifications.
The update was named ‘Buffy’ in honor of Vanessa Fox, who had just left the team, as the webmasters declared. To read the complete message from the webmasters, click here!

Universal search: [ May 2007 google algorithm update ]

Before Universal Search, Google displayed web pages with text content separately, images separately and so on; you had to search for everything separately. With Universal Search, Google found the right junction for bringing most of its services, such as YouTube, Maps, Images, Local and Books, onto the search engine results page [SERP]. The old 10-listing SERP was officially declared dead and the new SERP took over!

Universal Search did not limit itself to the Google platform: videos, images and documents from other websites also showed up on the universal search engine results page.

2005 Google Algorithm Update


The 2005 Google algorithm updates brought the efficiency of the search engine to a higher level. Spam via fake backlinks was drastically reduced.

NoFollow attribute: [January 2005 google algorithm update ]

In January 2005, the search engines dominating the market – Google, Yahoo and MSN Search (now Bing) – joined hands. Together they introduced the ‘nofollow’ attribute, a new indexing directive that ensured fake backlinks were no longer counted.
Earlier, spammers were able to post their own website links in the comment sections of any highly rated blog, forum or other website. This generated backlinks to the spammer’s web page, in turn boosting its page rank. With this attribute, links in comment sections carry a default nofollow value, and a nofollow attribute nullifies the backlink detected by Google, so it goes unaccounted!
This is not technically an algorithm update, but the introduction of nofollow had a huge impact on the link graph!
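
A minimal sketch of how the attribute works in practice: the standard-library parser below flags which links in a hypothetical blog comment carry rel="nofollow" and therefore pass no link credit.

    from html.parser import HTMLParser

    class LinkAudit(HTMLParser):
        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            attrs = dict(attrs)
            rel = (attrs.get("rel") or "").lower()
            flag = "ignored (nofollow)" if "nofollow" in rel else "counted as a backlink"
            print(f"{attrs.get('href')}: {flag}")

    comment_html = """
    <p>Great post! Visit <a href="https://spammer.example" rel="nofollow">my site</a>
    and the <a href="https://example.com/docs">official docs</a>.</p>
    """

    LinkAudit().feed(comment_html)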

Allegra Update: [February 2005 google algorithm update ]

The Allegra update focused mainly on LSI [Latent Semantic Indexing] algorithms. Very few details were known about the tweaks to the indexing. Speculation suggested that Google had started to penalise websites for fake backlinks from irrelevant websites.

Bourbon Update: [May 2005 google algorithm update ]

The Bourbon update stretched over a week, with several tweaks and enhancements. A figure of ‘3.5 improvements in search results’ was mentioned during the release, implying that the improvements to the search algorithm were not yet complete and would keep rolling out through the week. Webmasters speculated that Google changed the way non-canonical URLs were treated.

Personalized search and XML site maps: [June 2005 google algorithm update ]

• Personalized search was the next attempt by the search engine giant. When a user is logged in, the web history of that user’s profile is reviewed, and websites the user has already visited get higher priority in the search results.
Note: this applies to that particular Google account only. Once logged out, the personalized-search effect goes away.
• After the introduction of XML sitemaps, Google gained a very clear picture of every website’s navigation structure. Site owners have the option to submit the XML sitemap of their website directly; if they don’t, worry not – Google’s bots and crawlers are always out spanning the internet, and they build a sitemap of the site themselves. Site navigation was given high importance, as better navigation makes the user more comfortable. (A minimal sitemap sketch follows this list.)

• The changes to XML sitemaps, and their rate of change, were used to predict the core activity of the website.
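
A minimal sketch of the kind of XML sitemap a site owner might submit, built here with Python's standard library; the URLs and dates are hypothetical placeholders.

    import xml.etree.ElementTree as ET

    pages = [
        ("https://example.com/", "2015-12-01"),
        ("https://example.com/blog/seo-basics", "2015-12-05"),
    ]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod

    # Writes sitemap.xml listing each page's location and last-modified date.
    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)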

Gilligan update: [September 2005 google algorithm update ]

The Gilligan update was rather termed a ‘false update’. Matt Cutts, the then head of Google’s anti-spam team, mentioned that the search algorithms had received no major update. Gilligan was all about the toolbar PageRank, which gets updated once every 3 months or so.

Jagger Update: [October 2005 google algorithm update ]

The Jagger update made a clear impact on fake backlinks. Jagger targeted low-quality links – namely link farms, reciprocal links and paid backlinks! It rolled out over a period of about 3 months, i.e. Jagger received fine-tuning updates over time. Webmasters were advised to remove unused pages and to set up 301 redirects from non-canonical URLs to the canonical form of the URL, along with several other tweaks.

Integration of Google local and maps: [October 2005 google algorithm update]

‘Google Local’ was successfully launched, and maps.google.com became a one-stop solution for mapping and local search [location-based search for nearby places]. ‘Google Local’ and ‘Google Maps’ were integrated.
Know more about the integration of Google Local and Maps!

Big Daddy Update: [December 2005 google algorithm update]

The Big Daddy update was termed the ‘infrastructure update’. Big Daddy changed the way URL canonicalization and 301/302 redirects were handled.

Google Algorithm update 2004


The Google algorithm updates of 2004 set the efficiency of the search engine a notch higher. The updates in 2004 included only the Austin and Brandy updates. Google had stopped its big monthly monolithic index updates with Esmeralda [June 2003 update] and started working on polylithic, day-to-day index updates.

Austin Update: [January 2004 google algorithm update ]

Following Florida [November 2003 update], Austin made Google smarter and better. The Hilltop algorithm, combined with Google’s PageRank algorithm, was the main game-changer [Google acquired the Hilltop algorithm in February 2003]. At the initial stage, not all keywords gave perfect results; Google kept fine-tuning, and a few weeks later the efficiency of Google search had increased!

Brandy Update: [February 2004 google algorithm update ]

• Keyword analysis moved into its next phase with the Brandy update: Latent Semantic Indexing (LSI) was introduced.

Significance of LSI:
LSI deals with close semantic matching, which helps Google realize the topic and concept of your web page.
The main course of action was synonyms, implying that a close analysis of words other than the keyword was performed. When the content surrounding the keyword is relevant to the keyword, the web page gets preferred.
For example, let’s talk about a ‘review of novels’. Here, LSI looks for words or links relevant to the plot, story, author, awards, etc. The more relevant they are, the greater the chance of showing up in the SERP.

• ‘Neighborhood’ was given complete focus!

The concept of ‘buying backlinks’ – a spam act according to Google – was given little chance of survival. The term ‘neighborhood’ refers to the other web pages that link to your website. Google started penalizing websites that had backlinks from irrelevant sources, thus making the search engine more efficient and appealing to users.

• Synonyms were given priority!
Instead of expecting the focus keyword everywhere, Google started analyzing its synonyms as keywords too. Good title tags are said to use synonyms rather than repeating the focus keyword, and avoiding repetition of the focus keyword was part of the changelog.

Google IPO: [August 2004 – not algorithm update]

On August 19, 2004, Google sold about 19.6 million shares at $85 each in its IPO. Google had aimed to raise $2,718,281,828 through the public offering and fell short of that goal. The shortfall can be put down to bad luck as much as anything, and the share value soon more than doubled. The IPO left Google with a market value of around $23 billion.

Google Algorithm Update 2003


2003 was the year Google started to name its updates. The following article provides a clear insight into the effective changes that were made to the algorithm.

Google Algorithm Update 2003

February 2003 Update: [called Boston Update]

The first named update in the Google algorithm series was the Boston update. From 2003 onwards, Google started naming its search algorithm updates alphabetically.

The Google Dance during Boston update:

The term ‘Google dance’ refers to the period during which Google rebuilds every website’s ranking, thereby refreshing the search results. During this period, the SERP shows huge variation in the results for the same keyword from time to time. (SERP – Search Engine Results Page.)

The Boston update went through a Google dance due to a combination of algorithm changes and a refresh of the search engine index.

April 2003 Update: [called Cassandra Update]

The Cassandra update of the Google algorithm focused on perfecting relevance.
1. The hit-point of the update was Google’s new ability to detect hidden text and hidden links. Hidden links and text were a major annoyance to Google: web developers could fool Google by including words and links that helped them reach the top positions in the SERP. While hidden text makes the same website rank higher, hidden links were a nuisance because they acted as backlinks to other web pages.

2. The second major change in Cassandra was Google’s ability to discount backlinks between co-owned domains. Google built the power to exclude backlink counts from and to domains owned by a single user, making such fake backlinks disappear from the picture.

May 2003 Update: [called Dominic Update]

The exact functional changes of the Dominic update are still unknown. Google’s bots and crawlers – ‘Freshbot’ and ‘Deepcrawler’ – spanned the internet, and bounces in the results were reported. This update is said to have drastically changed the way backlinks were counted and reported.

June 2003 Update: [called Esmeralda Update]

A major infrastructure change in the Google algorithm showed up with Esmeralda. The Google dance was replaced with ‘Everflux’.

Everflux: Instead of making huge monthly monolithic changes to the search index, google started making polylithic day-to-day changes.

Everflux eased the Google dance effect and enhanced the SERP very effectively.
Esmeralda was the last monthly update released by Google. Following Esmeralda, Google algorithm updates became a continuous process.

July 2003 Update: [called Fritz Update]

The Fritz update finally ended the Google dance that had started with the Boston update. Fritz changed the way Google indexed: the index started changing daily instead of on a monthly basis.

September 2003 Update: [called supplemental Index Update]

Google created a new supplemental index as a small step towards making indexing faster and better. Some results were split off into this supplemental index, the main aim being to reduce storage space: less relevant search results were moved into the separate index. After a lot of testing, the two-index method was found to be a burden, the problem being that some pages with rich content ended up in the supplemental index. The supplemental index was later removed, and Google follows a single-index pattern now.

November 2003 Update [called Florida Update]

Google’s war on spam truly began with the Florida update. Florida turned out to be a nightmare for many websites: Google demanded more keyword focus and better-developed SEO tactics to bring a website into the search results. This rendered the early-2000s SEO techniques almost useless. Adapting to the new algorithm by producing rich content was the only option left for survival.

Thus, Google improved the SERP greatly through the 2003 algorithm updates, and its efforts didn’t go in vain: they laid the perfect platform for the future Google algorithm. For a brief history of algorithm updates, click here for the wiki!

Evolution of Search Engine Algorithm – Google

Evolution of Google algorithm

IN THIS SECTION: Evolution of Search Engine Algorithm and Ranking Techniques

In the past 10 years, the intelligence of the Google search engine has gathered pace quickly. Earlier, Google ranked pages based merely on the presence of the search keyword. Blah! That’s the most basic attempt possible. As a result, a lot of spam pages took their place on page 1 of Google.

Later, due to the lack of proper results, backlinks were included in the basic algorithm.
Backlinks: links to your website from other websites; organic backlinks imply that your website is content-rich. Google therefore adopted the approach of counting the total number of backlinks: the higher the number of backlinks, the greater the chance of your website showing up in the search results. This got the attention of spammers, who then created as many backlinks as possible to get their websites to show up in the search results.

Search Engines at present

Nowadays, search algorithms are much more optimized and appealing. They are also ‘secure’ – Google never reveals its search algorithm to the public.
From early 2003, Google started naming its major updates – Boston, Cassandra, Dominic and so on – in alphabetical order. Every year Google changes its algorithm around 300 times. Predicting the changes is difficult, yet proper, valid and rich content on your website will suffice.

For brief History of Algorithm Updates: Click here!

SEO and working mechanism of Search Engines


SEO 101: BASICS OF SEARCH ENGINE

IN THIS SECTION:
WE WILL UNDERSTAND AND ANALYSE THE WORKING OF SEARCH ENGINES!

By owning a domain, everyone takes the first step towards their business or online presence. Wait! Does owning a domain imply you are visible to the world? Certainly not. The bitter truth is that no one gets to see your web page among the millions of other web pages!

So, how do people get traffic to their web pages? The simple trick is making your website ‘search engine optimized’. Google has been a major game-changer in the world of search engines. Let’s go through the course of optimizing your website for search engines.

Understanding Structure of Search Engine:

Search engines employ bots that crawl the internet. The search engine finds every web page, deciphers its code and saves a short description of the page in its database. Data centers have been constructed all over the world to accomplish the high-end task of storing these billions of records.
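
A heavily simplified sketch of that crawl-decipher-store loop, using only Python's standard library. Real search-engine bots are vastly more sophisticated; the start URL is just a placeholder.

    from html.parser import HTMLParser
    from urllib.request import urlopen

    class PageIndexer(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []   # links the bot would follow next
            self.text = []    # visible text to store as the page description

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)

        def handle_data(self, data):
            if data.strip():
                self.text.append(data.strip())

    html = urlopen("https://example.com/").read().decode("utf-8", "ignore")
    indexer = PageIndexer()
    indexer.feed(html)

    print("Stored description:", " ".join(indexer.text)[:160])
    print("Links queued for crawling:", indexer.links)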

Understanding Function of Search Engine:

The crawling bots index every website and hence produce the search results users expect. Every search engine assigns web pages and websites a rank. Nowadays, relevance for a search engine is not simply displaying pages that contain similar search terms – though earlier it was.
The more valuable information a website possesses, the more opportunity it has to show up in the search results; that is how a website or blog becomes popular. Popularity and relevance are not calculated manually (whoa, you know that’s one hell of a task!). Search engines formulate algorithms, aka mathematical equations, that calculate the rank of any website. These algorithms are termed “ranking factors”.


More than a hundred ranking factors are employed by Google. Every year, Google refines and refurbishes the algorithm, thereby making it more efficient at providing relevant search results.
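
Google never publishes these equations, so the sketch below is purely hypothetical: a made-up weighted blend of a relevance signal, a capped backlink count and a freshness signal, just to show what a ‘ranking factor’ formula could look like in spirit.

    def rank_score(keyword_relevance, backlink_count, freshness, weights=(0.6, 0.3, 0.1)):
        """Blend a few made-up signals (each scaled 0-1) into a single score."""
        w_rel, w_pop, w_fresh = weights
        popularity = min(backlink_count / 100, 1.0)   # cap the backlink signal
        return w_rel * keyword_relevance + w_pop * popularity + w_fresh * freshness

    pages = {
        "rich, relevant page": rank_score(0.9, 40, 0.8),
        "keyword-stuffed page": rank_score(0.4, 300, 0.2),
    }
    for name, score in sorted(pages.items(), key=lambda kv: -kv[1]):
        print(f"{score:.2f}  {name}")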

Every time a user performs a search –

• The Google algorithm analyses the search terms

• The Google algorithm finds relevant matches among its indexed web pages

• Results are displayed as per ‘page rank’ [which varies from time to time].
