
The Basics of Search Engine Friendly Design and Development

admin stator

Posts : 1628
Join date : 2010-09-04

Post The Basics of Search Engine Friendly Design and Development   Wed Aug 31, 2011 8:13 pm


The work of search engines is limited to crawling the web, interpreting content and returning results. It is therefore important to understand the basic technical aspects of building and modifying web pages, so that they work both for search engines and for human visitors.

In order to be listed in the search engines, your content should be in HTML text format. Despite advances in crawling technology, images, Flash files, Java applets and other non-text content are virtually invisible to search engine spiders. The easiest way to make your words and phrases visible is to place them in HTML text. Images, Flash files, Java applets and other non-text content can still be represented by providing the engines with a text description of the visual content.
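As a minimal sketch of this idea (the page content and file names here are invented for illustration), the markup below pairs each piece of non-text content with a text description the spider can read:

```html
<!-- Hypothetical page mixing HTML text with non-text content -->
<h1>Handmade Leather Bags</h1>
<p>Browse our collection of handmade leather bags.</p>

<!-- The image pixels are invisible to the spider; the alt text is not -->
<img src="bag-photo.jpg" alt="Brown handmade leather messenger bag">

<!-- Plug-in content can carry an HTML text fallback inside the tag -->
<object data="showcase.swf" type="application/x-shockwave-flash">
  <p>Product showcase: handmade leather bags in brown, black and tan.</p>
</object>
```

The engines index the headings, paragraph text, alt text and fallback text; without those, only the file references would be visible.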

The search engines need to see content in order to list pages in their massive keyword-based indices. Using tools such as Google's cache, the MozBar or Yellowpipe, you can observe which elements of your content are visible and indexable to the engines. If you are curious about exactly which terms and phrases a search engine sees on a web page, a term extractor tool will display the words and phrases it finds in order of frequency. However, it is wise not only to check the text content but also to use a tool like SEO Browser to double-check that the pages you are building are visible to the engines.

Crawlable Link Structure

Besides seeing the content of pages, the search engines need access to a crawlable link structure — one that lets their spiders browse the pathways of a website and find all of its pages. Hundreds of thousands of sites make the critical mistake of hiding or obscuring their navigation in ways that search engines cannot access, which holds back their ability to get pages listed. Without crawlable links, the spider cannot reach the important pages, no matter how good the content, keyword targeting and marketing are.
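To make the contrast concrete (the link destinations here are hypothetical), a spider can follow a standard anchor tag, but navigation that hides the destination behind script or a plug-in leaves it with no pathway to follow:

```html
<!-- Crawlable: a plain anchor tag with the destination in href -->
<a href="/products/leather-bags.html">Leather Bags</a>

<!-- Not reliably crawlable: no href; the destination exists only in script -->
<span onclick="window.location='/products/leather-bags.html'">Leather Bags</span>

<!-- Not crawlable: the entire menu is locked inside a Flash movie -->
<object data="menu.swf" type="application/x-shockwave-flash"></object>
```

If the only path to a page runs through the second or third pattern, that page is effectively invisible to the spider even though human visitors can reach it.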

Keyword Usages & Targeting

Keywords, the building blocks of language, are fundamental to the search process. As the engines crawl and index the content of pages around the web, they keep track of those pages in keyword-based indices. Thus, rather than storing 25 billion web pages in one database, the engines maintain millions of smaller databases, each centred on a particular keyword. This makes it possible for the engines to retrieve the data they need within a fraction of a second.

Keyword Domination

Keywords dominate our search intent and our interaction with the engines. When a search is performed, the engine knows which pages to retrieve based on the words entered into the search box. For obvious reasons, search engines measure how keywords are used on pages to help determine the 'relevance' of a particular document to a query. One of the best ways to optimise a page's ranking is, therefore, to ensure that the keywords are prominently used in the title, text and meta data.

Myth of Keyword Density

The question of keyword density arises whenever the topics of keyword usage and search engines come together, but it rests on a false assumption. Keyword density is not part of how the search engines rank pages, and it provides far worse results than many other methods of keyword analysis. The notion of a keyword density value predates all commercial search engines and the Internet, and can hardly be considered an information retrieval concept. Keyword density plays no role in how commercial search engines process text, index documents or assign weight to terms. What, then, is its value to the optimiser? Unfortunately, keyword density does not help in optimising page rankings, and there is little chance of creating a density formula that will be helpful for true optimisation.

On-Page Optimisation

Keyword usage is only a small fraction of the search engines' ranking strategy, but there are still a number of best practices that go a long way towards optimising pages. When working on one of your own sites, the following process is recommended.

First, use the keyword in the title tag at least once, or twice if it makes sense. Try to keep the keyword as close to the beginning of the title tag as possible. You may find some additional value in using the keyword more than three times, but in reality, adding more repetitions tends to have little or no impact.

Secondly, use the keyword at least once in bold, using either the <strong> or <b> tag, as they are considered equal by the search engines.

Thirdly, use the keyword once in the URL.

Fourthly, use it at least once in the meta description tag.
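The four steps above can be sketched on a single page. This is an illustrative example only — the page title, keyword ("leather bags") and URL path are invented, not prescribed values:

```html
<head>
  <!-- Step 1: keyword near the beginning of the title tag -->
  <title>Leather Bags - Handmade in Italy | Example Store</title>
  <!-- Step 4: keyword at least once in the meta description tag -->
  <meta name="description"
        content="Handmade leather bags, stitched in Italy and shipped worldwide.">
</head>
<body>
  <h1>Handmade Leather Bags</h1>
  <!-- Step 2: keyword once in bold; <strong> and <b> are treated as equal -->
  <p>Our <strong>leather bags</strong> are cut and stitched by hand.</p>
  <!-- Step 3: keyword once in the URL -->
  <a href="/leather-bags">See all leather bags</a>
</body>
```

Note that every placement uses the same target phrase, which is what lets the engine connect the title, body text, URL and description to one topic.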

The future may be uncertain, but in the world of search, change is constant and inevitable. For this reason, search engine marketing will remain a staple in the diet of those who wish to remain competitive on the web. Those with the best knowledge and experience of ranking will receive the greatest benefits in traffic and visibility.

