On-page SEO keyword optimisation

Previously, we saw that keyword density is largely a myth and is not a major ranking factor. On-page keyword optimisation is instead based on the placement (orientation) of keywords. Yes, keyword density still matters, but only to a small extent.

On-page SEO keyword optimisation rules:

The following general rules are suggested for good on-page SEO optimisation:

Rule 1: Proximity – the relative distance between keywords in the document is considered.

Rule 2: Distribution – the places where the focus keyword appears in the document body.

Rule 3: Co-occurrence – the link structure of the specific webpage with other webpages.

Rule 4: The important tags – the focus keyword is expected to be present in the header tags, the URL, etc.

Example of best on-page SEO keyword optimisation

The above rules are necessary and must be followed. Keyword cannibalisation – having multiple pages or links (external or internal) compete for the same keyword target – must be avoided. Also, linking multiple pages to a single page is preferable to linking one page out to multiple pages via anchor tags.

  • The focus keyword must be present in the title tag at least once. Frame the title so that the focus keyword is near the beginning of the title.
  • Make the content of the web page more than 300 words, especially in the case of a blog post.
  • Make sure the keyword is present in your body content at least two or three times, and that the occurrences are distributed rather than clustered close together.
  • Make sure the keyword is present in the alt text of the images in your post. This has a high chance of bringing good organic traffic via Google Image Search.
  • Make sure you are using canonical URL formats. The presence of the focus keyword in the URL is highly desirable.
  • Make sure the focus keyword is present in the meta description of the web page. The meta description gives Google a short gist of the page, and the absence of the focus keyword there can hurt your SEO. (A small check sketch follows this list.)
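
To make the checklist concrete, here is a minimal sketch, assuming a made-up page and the hypothetical focus keyword "organic gardening", that inspects a page's HTML for the placements listed above (title tag, meta description, image alt text and URL). It uses only Python's standard library and is an illustration, not a complete SEO audit tool.

```python
# Minimal sketch: check a few on-page keyword placements in a sample HTML page.
# The sample page, URL and focus keyword below are all hypothetical.
from html.parser import HTMLParser

class KeywordPlacementChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = ""
        self.alt_texts = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content", "")
        elif tag == "img":
            self.alt_texts.append(attrs.get("alt", ""))

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def report(url, html, keyword):
    checker = KeywordPlacementChecker()
    checker.feed(html)
    kw = keyword.lower()
    print("keyword in URL:   ", kw.replace(" ", "-") in url.lower())
    print("keyword in title: ", kw in checker.title.lower())
    print("keyword in meta:  ", kw in checker.meta_description.lower())
    print("keyword in an alt:", any(kw in alt.lower() for alt in checker.alt_texts))

sample_html = """
<html><head>
  <title>Organic Gardening for Beginners</title>
  <meta name="description" content="A starter guide to organic gardening at home.">
</head><body>
  <img src="beds.jpg" alt="raised beds for organic gardening">
</body></html>
"""
report("https://example.com/organic-gardening-guide", sample_html, "organic gardening")
```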

Following the above-mentioned rules greatly enhances your SEO through keyword optimisation. Any doubts? Fire away in the comments section below.

How SEO keyword abuse and spam are handled?

SEO keyword abuse and spam

Since the invention of search engines, spammers have been a real problem. Google has made great efforts to prevent SEO keyword abuse and spam, and it employs a separate team in charge of handling spam. Nowadays, the filters and algorithms that search engines employ can penalise most spammers with ease.

One such black-hat technique is keyword abuse. This method involves stuffing irrelevant or fake keywords into the body content, URL, image alt text and meta tags, and is termed "keyword stuffing".

A myth named ‘Keyword density’:

Earlier, SEO was based more on "keyword density" – the ratio of the number of occurrences of a specific keyword to the total number of words on the page. Spammers had a great time misleading search engines with very high keyword densities, until Google removed keyword density as a key criterion for ranking.

How keyword density can distort search results:

Let us say two pages P1 and P2 have 300 and 150 words in total, and the focus keyword appears 6 times in P1 and 3 times in P2. Both have the same keyword density of 0.02, yet the figure is unreliable: P1 might still have much better content than P2.
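
As a quick illustration of the calculation above, here is a tiny Python sketch that reproduces the P1/P2 example with placeholder text; both pages come out at the same density of 0.02 despite being very different in length.

```python
# Minimal sketch of the keyword-density calculation, using placeholder text that
# mirrors the hypothetical pages P1 (300 words) and P2 (150 words) above.
def keyword_density(text: str, keyword: str) -> float:
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words)

# P1: 6 occurrences in 300 words; P2: 3 occurrences in 150 words.
p1 = ("seo " * 6 + "filler " * 294).strip()
p2 = ("seo " * 3 + "filler " * 147).strip()
print(keyword_density(p1, "seo"), keyword_density(p2, "seo"))  # 0.02 0.02
```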

On-page optimisation is what has come into effect nowadays, i.e., the placement (orientation) of the focus keywords plays the important role. Any excessive spam attempt results in your website being blacklisted immediately. Understand that repeating keywords is not going to help you secure a better place; proper placement and related content will.

Learn the on-page optimisation and keyword techniques that make your website rank higher! Trusted source – Taken Mind

You can check any keyword overview here with ease!

Deciding your SEO keyword usage for your webpage

SEO Keyword Usage: The Fundamentals of the Search Mechanism

Keywords are the basic unit of operation in a search engine. When a user searches for a query, retrieval from the database is based on the keywords of the indexed web pages. While indexing content, the bots and crawlers store web pages in keyword-based indexes. This avoids the time lag that would occur if some 30 billion web pages were kept in one single database; instead, search engines maintain millions of smaller indexes, one per keyword or phrase, which makes the SERP load much faster.
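
The keyword-based index described above is essentially an inverted index. The toy sketch below (with three made-up pages) shows the idea: the engine looks up each query word and intersects the matching page sets instead of scanning every stored page.

```python
# Toy illustration of a keyword-based (inverted) index: instead of scanning every
# stored page, the engine looks up the query words and intersects the page sets.
from collections import defaultdict

pages = {
    "page1": "laughing clown at the circus",
    "page2": "history of the laughing clown",
    "page3": "circus history",
}

index = defaultdict(set)
for page_id, text in pages.items():
    for word in text.split():
        index[word].add(page_id)

def search(query):
    sets = [index.get(word, set()) for word in query.lower().split()]
    return set.intersection(*sets) if sets else set()

print(search("laughing clown"))   # {'page1', 'page2'}
print(search("circus history"))   # {'page3'}
```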

During a search, keyword usage drastically changes the results on the SERP (Search Engine Results Page). Even the order of the words changes the results; for example, "Laughing Clown" and "Clown Laughing" may produce different results. Apart from word order, punctuation, spelling, capitalisation and so on also produce different sets of web pages on the SERP.

In general, using more specific keywords helps you narrow down the competition on the Search Engine Results Page, which means your website has a better chance of achieving a higher page rank.

How using specific keywords helps narrow down the competition:

Imagine you write about the World War with the keyword 'History', a friend writes on the same topic with the keyword 'War', and a second friend uses the keyword 'World War'. When a search is performed for "World War", the second friend gets higher preference in the SERP. That is how using a specific keyword helps you gain a higher position in the results.

You can check any keyword overview here with ease!

Keyword abuse: it might get you blacklisted with ease

Beware:

In SEO, keyword usage is fundamental. Phew! But keywords are risky when used inappropriately. Having stressed the importance of keywords, it is just as important to take care that your web page doesn't get blacklisted for spam activity: excessive keyword usage might cause your web page to be flagged as spam!

Learn how to prevent the keyword spam that can get you blacklisted. NOW!

How search engines see webpages: SEO strategy

As I mentioned in my previous post, search engines don't see what you and I see when browsing a webpage. The bots and crawlers of a search engine see only the basic HTML, i.e., they don't see the CSS styling. Take a look at how search engines see webpages.

How human users see your webpage:

Here, I have grabbed a screenshot of www.takenmind.com as seen by human users.

Vision of a webpage by human beings like you and me

How search engines see webpages:

Vision of a webpage by the bots and crawlers of a search engine

From the images above, you can see that humans visualise a webpage with CSS styling and a high level of user experience. Bots, on the other hand, read the plain HTML and work out whether your webpage contains richer content than others. This implies that search engines rely heavily on the alt text of your images, so make sure it is relevant to the content of each image. Even though a search engine can fetch a page's images, parsing errors can occur, which makes it difficult to work out details about an image without alt text.
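
As a rough illustration of this "bot's-eye view", the sketch below strips a made-up HTML snippet down to its visible text plus image alt text, ignoring styling and scripts. It is a simplification of what real crawlers do, using only Python's standard library.

```python
# Rough sketch of a bot's-eye view: keep the visible text and image alt text of a
# page, ignore <style> and <script>. The sample HTML is made up for illustration.
from html.parser import HTMLParser

class BotView(HTMLParser):
    SKIP = {"style", "script"}

    def __init__(self):
        super().__init__()
        self.skip_depth = 0
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.skip_depth += 1
        elif tag == "img":
            alt = dict(attrs).get("alt")
            self.text.append(f"[image: {alt or 'NO ALT TEXT'}]")

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth and data.strip():
            self.text.append(data.strip())

sample = """<html><head><style>h1 {color: red}</style></head>
<body><h1>Taken Mind</h1><img src="logo.png" alt="Taken Mind SEO logo">
<p>On-page SEO basics.</p></body></html>"""

view = BotView()
view.feed(sample)
print("\n".join(view.text))
```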

With the help of this difference in vision, let me explain the effect of Flash content on search engine bots.

How human users see your webpage with Flash content:

Here, I have taken a screenshot of a completely Flash-based website, www.cloudsovercuba.com.

Flash content in a website as seen by humans

How search engines see your webpage:

How Flash content is seen by search engines

The pair of images above shows how strongly search engines object to Flash content. Search engines do not evaluate Flash content, because their bots and crawlers are not intelligent enough to parse it.

For your own website: want to know how search engines see your webpages? Worry not. Visit www.seo-browser.com and see how search engines see your webpage.

SEO limitations that penalise rich-content websites

Why do some websites get penalised with a lower page rank even with good content?

The most important aspect of SEO is making your website visible to the search engines. Ever wondered why your web page didn't get the expected page rank? Why it didn't turn out #1 in the SERP despite its rich content?

SEO is limited by the decoding mechanism of bots and crawlers. It boils down to one common issue: Google's bots and crawlers do not have the intelligence of a human being.

Reasons that penalise rich-content websites

  1. Penalisation due to online forms

    The bots have the capability to span the entire internet, follow millions of links and index billions of pages. But they cannot fill out online forms, such as a login form protected by a captcha! This simply means that content sitting behind such forms is never crawled, and pages that are unreachable by the crawlers fail to show up in the SERP. So what do we do? Very little beyond prevention: take care not to put major content behind online forms.

  2. Errors in crawling directives [robots.txt file]

    Errors in a website's crawling directives can be a major source of trouble. If the bots get confused by the robots.txt file, they may fail to crawl and report your website, which simply means your website goes out of scope for the search engine. (A small robots.txt check sketch follows this list.)

  3. Poor construction of linking structures

    When your website’s link structure is poor, it’s time for your dream called ‘Page Rank’ to end up as a dream. By Poor link structure, I refer to broken links, internally linked web pages, a complicated linking structure that’s not user-friendly. The bots report the poor link structure and there’s a very bad chance of losing your dream page rank.

  4. Excessive use of media and too little text

    Believe me! Bots have some capability to read media such as images, but they face problems processing and parsing it. Media such as images, videos, Flash content, Flash files and plugin content gets either ignored or wrongly processed by bots to a large extent. Bots are good at reading simple HTML text! Building pages entirely out of media (even if the content in them is rich) is a bad habit; use a textual approach wherever possible. Remember, it is the bots that compute your page rank!

  5. Mismatched keywords in your content

    This is the problem most users face. It can be rectified easily, but the task is time-consuming. Plan your keywords carefully: use tools like the AdWords Keyword Planner to find search volumes. This simple on-page optimisation can raise your page rank. Find 5-6 core keywords and base your post on them.
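
For point 2, here is a small sketch, using Python's standard urllib.robotparser and a hypothetical robots.txt, showing how a single misplaced Disallow line can silently hide a whole section of a site from crawlers.

```python
# Small sketch, assuming a hypothetical site: check whether a crawling directive in
# robots.txt accidentally blocks an important page, using the standard library.
from urllib.robotparser import RobotFileParser

# In practice you would call rp.set_url(".../robots.txt") and rp.read(); here the
# directives are fed in directly so the example runs offline.
rp = RobotFileParser()
rp.modified()  # mark the rules as freshly loaded before querying them
rp.parse("""
User-agent: *
Disallow: /private/
Disallow: /blog/
""".splitlines())

# A misplaced "Disallow: /blog/" silently hides every blog post from crawlers.
print(rp.can_fetch("Googlebot", "https://www.example.com/blog/on-page-seo"))  # False
print(rp.can_fetch("Googlebot", "https://www.example.com/about"))             # True
```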

Remember! Your website should not satisfy the bots alone. Your content must satisfy both the viewer and the bots.

Perfect SEO – 5 basic Google webmaster guidelines

Everyone wants to make it to the top of the Google SERP. What requirements should your site meet in order to compete with others? What do you think can make your website top the SERP? To help us out, the Google webmasters have come up with some basic guidelines for perfect SEO.

If you are trying to crack Google's complicated algorithms, you are taking a step backwards from your goal. Following the webmaster guidelines for SEO is of much more help.

The 5 basic Google webmaster guidelines for perfect SEO and better ranking are listed below:

1. Make webpages for users, not for search engines:

Avoid cloaking.

Cloaking: a popular black-hat SEO technique where the content provided for users to browse is different from the content provided to the search engine spider.

Google loves websites that don't try black-hat techniques. Make your web pages as rich in content as possible. Give your site a meaning: concentrate on your website's core idea and work on it. If you are working on a blog, concentrate on the focus keyword of every post.

2. Make perfect site navigation:

Make sure that all public pages on your website are fully accessible to the user. A navigation menu that can lead to every page is essential; a universal menu bar is preferred.

The navigation bar shown below is my universal navigation bar. It navigates to all the web pages in my blog.
Universal site navigation

3. Choose suitable names for your site categories and menu items:

Google AdWords Keyword Planner comes to your rescue: use it to choose your category names. A competition level of low or moderate is a good sign to start with. With high-competition keywords, on the other hand, you have the harder task of producing richer and more unique content.

4. In the case of blogging, make sure your article's word count exceeds 300:

Google's primary aim is to provide the user with more accurate results. A content-rich article should have explanations that exceed 300 words. Google is good at sensing synonyms, i.e., words that have a meaning similar to the focus keyword of the blog post. Avoid repeating the focus keyword; use alternative keywords that convey the same sense.

5. Make your title, h1 and h2 tags contain the focus keyword of the article:

Having the focus keyword of your web page in the title, h1 and h2 tags is highly essential. It gives your web page a slightly higher preference in the SERP (Search Engine Results Page). The h1 and h2 tags create the basic impression that you have content specific to the keyword listed, so this step is essential when building web pages.
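
As a quick sanity check, the sketch below uses a simple regular expression (sufficient for illustration, not for real HTML parsing) to confirm that a hypothetical focus keyword appears in a page's title, h1 and h2 tags.

```python
# Quick sketch: confirm that a hypothetical focus keyword appears in the page's
# <title>, <h1> and <h2> tags. A simple regex is enough for this illustration.
import re

def keyword_in_tags(html: str, keyword: str, tags=("title", "h1", "h2")) -> dict:
    result = {}
    for tag in tags:
        contents = re.findall(rf"<{tag}[^>]*>(.*?)</{tag}>", html, re.IGNORECASE | re.DOTALL)
        result[tag] = any(keyword.lower() in c.lower() for c in contents)
    return result

page = """<title>Perfect SEO guide</title>
<h1>Perfect SEO in five steps</h1>
<h2>Why webmaster guidelines matter</h2>"""
print(keyword_in_tags(page, "perfect seo"))  # {'title': True, 'h1': True, 'h2': False}
```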

Perfect SEO makes your website stand out for its content. It is your ticket to showcasing your website to the right audience.

2007 Google Algorithm Update

The 2007 Google algorithm updates consisted of the Buffy update and Universal Search. Buffy was a collection of minor updates, and Universal Search was a mind-blowing concept that shaped the future of the SERP.

Buffy update: [June 2007 Google algorithm update]

After a quiet year (2006) with no big updates, Google produced Buffy, the first major update of 2007. The Jagger update [October 2005] had prepared the ground for the infrastructure change of the Big Daddy update [December 2005]. Buffy had a huge effect on single-word search results, and it turned out to be a collection of various small tweaks and minor modifications.
The update was named 'Buffy' in honour of Vanessa Fox, who was leaving the team at the time, as declared by the Webmasters. To read the complete message from the Webmasters, click here!

Universal Search: [May 2007 Google algorithm update]

Before Universal Search, Google displayed webpages with text content separately, images separately, and so on; you had to search each category separately. With Universal Search, Google found the right junction for bringing most of its services, such as YouTube, Maps, Images, Local and Books, onto the search engine results page [SERP]. The previous 10-listing SERP was officially declared dead, and the new SERP took over!

Universal Search did not limit itself to Google's own platforms; videos, images and documents from other websites also showed up on the Universal Search results page.

2005 Google Algorithm Update

The 2005 Google algorithm updates brought the efficiency of the search engine to a higher level, and spam via fake backlinks was drastically reduced.

NoFollow attribute: [January 2005 Google algorithm update]

In January 2005, the search engines dominating the market – Yahoo, MSN (later Bing) and Google – joined hands and together introduced the 'nofollow' attribute. This new indexing directive ensured that fake backlinks no longer paid off.
Earlier, spammers could post their own website links in the comment sections of any highly rated blog, forum or other website. This generated backlinks to the spammer's webpage and in turn boosted its page rank. With this attribute, links in comment sections carry a default nofollow value, and a nofollow attribute nullifies the backlink, so Google leaves it unaccounted!
This was not technically an algorithm update, but the introduction of nofollow had a huge impact on the link graph!
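
To illustrate what the attribute looks like in practice, here is a minimal sketch (a naive regex rewrite, not production HTML handling) that adds rel="nofollow" to every anchor in a user-submitted comment, so the links pass no ranking credit.

```python
# Minimal sketch: add rel="nofollow" to every anchor in user-submitted comment HTML.
# A naive regex rewrite for illustration; a real site would use an HTML sanitiser.
import re

def nofollow_comment_links(comment_html: str) -> str:
    return re.sub(r'<a\s+', '<a rel="nofollow" ', comment_html, flags=re.IGNORECASE)

print(nofollow_comment_links('Nice post! <a href="http://spammer.example">my site</a>'))
# -> Nice post! <a rel="nofollow" href="http://spammer.example">my site</a>
```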

Allegra update: [February 2005 Google algorithm update]

The Allegra update focused mainly on LSI [Latent Semantic Indexing] algorithms. Very few details were known about the tweaks to the indexing. Rumours suggested that Google started to penalise websites for fake backlinks from irrelevant websites.

Bourbon update: [May 2005 Google algorithm update]

The Bourbon update stretched over a week with several tweaks and enhancements. "3.5 improvements in search results" were mentioned during the release, which meant that the improvements to the search algorithm were not yet complete and the roll-out lasted about a week. Webmasters speculated that Google changed the way non-canonical URLs were treated.

Personalized search and XML sitemaps: [June 2005 Google algorithm update]

• Personalized search was the next attempt by the search engine giant. When a user is logged in, the web history of that user profile is reviewed, and websites the user has already visited get higher priority in the search results.
Note: this applies to that particular Google account only. Once logged out, the personalized search effect goes away.
• After the introduction of XML sitemaps, Google had a very clear picture of the navigation structure of every website. Users have the option to submit the XML sitemap of their website directly; if not, worry not – Google's bots and crawlers are always spanning the internet, and they will build an XML sitemap and submit it to Google. Site navigation was given high importance, as better navigation makes the user more comfortable. (A small sitemap sketch follows these points.)

• The change and rate of change of XML sitemaps were used to predict the core activities of the website.
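
As a small illustration of the format, the sketch below builds a tiny XML sitemap for a couple of hypothetical URLs using Python's standard library; the urlset/url/loc/lastmod structure follows the sitemaps.org protocol.

```python
# Minimal sketch: generate a tiny XML sitemap for a couple of hypothetical URLs.
# The <urlset>/<url>/<loc>/<lastmod> structure follows the sitemaps.org protocol.
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages; the resulting file would be submitted via Search Console
# or referenced from robots.txt.
print(build_sitemap([("https://www.example.com/", "2024-01-01"),
                     ("https://www.example.com/blog/", "2024-01-15")]))
```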

Gilligan update: [September 2005 Google algorithm update]

The Gilligan update was rather termed a "false update". Matt Cutts, then the head of Google's anti-spam team, mentioned that the search algorithms had seen no major update; Gilligan was all about the toolbar PageRank, which gets updated once every 3 months or so.

Jagger update: [October 2005 Google algorithm update]

The Jagger update made a clear impact on fake backlinks, targeting low-quality links – namely link farms, reciprocal links and paid backlinks. Jagger rolled out over a period of about 3 months, i.e., it received further tweaks that fine-tuned it over time. Removing unused pages and adding 301 redirects from non-canonical URLs to their canonical form, along with several other tweaks, were advised.

Integration of Google Local and Maps: [October 2005 Google algorithm update]

‘Google Local’ was successfully launched, and maps.google.com became a one-stop solution for mapping and local search [location-based search for nearby places]. ‘Google Local’ and ‘Google Maps’ were integrated.
Know more about the integration of Google Local and Maps!

Big Daddy update: [December 2005 Google algorithm update]

The Big Daddy update was termed the 'infrastructure update'. Big Daddy changed the way URL canonicalisation and 301/302 redirects were handled.