Search engines use robots called crawlers to discover the web pages that feed their search algorithms. If your pages are linked from pages that are already indexed, they are picked up automatically. Certain search engines, such as Yahoo, have required paid submission of URLs; such paid services guarantee inclusion in the index but do not guarantee a higher rank, which still depends on review. Google, on the other hand, offers webmaster tools through which an XML sitemap can be created and submitted free of cost.
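As a rough illustration, a minimal XML sitemap listing two pages might look like the following (the URLs and dates are placeholders); a file like this can then be submitted to Google through its webmaster tools.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2024-01-15</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/about</loc>
        <lastmod>2024-01-10</lastmod>
      </url>
    </urlset>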

Crawlers weigh a number of factors when deciding whether to crawl a page, and one important factor is the distance of a page from the site's root. Webmasters can keep crawlers away from certain pages using a robots.txt file: pages that should not be placed in the index are listed in this text file. robots.txt is the first file a crawler fetches, and its rules instruct the robots not to crawl the listed pages. Specific meta tags can likewise prevent search engines from indexing particular pages; login pages are the most common example. If you want search engines to crawl your site often, you need to update its content regularly. The content should include appropriate keywords, but they should not be crammed into the pages; they should be spread naturally throughout.
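For example, a simple robots.txt placed at the root of the site might disallow crawling of login and admin pages (the paths here are only illustrative):

    User-agent: *
    Disallow: /login/
    Disallow: /admin/

Similarly, a page that should stay out of the index can carry a robots meta tag in its head section:

    <meta name="robots" content="noindex">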

Apart from content, cross links from one page to another within a website also increase visibility. SEO techniques can be classified into black hat and white hat techniques. The former are illegitimate ways of optimizing pages for better results; they are not permitted by the search engines, and sites caught using them are banned immediately. One of the more common black hat techniques is spamdexing. White hat techniques, by contrast, are the conventional, organic means of promoting pages through good design, accurate keywords and so on. They follow the search engines' guidelines, and the common rule of thumb is that pages should be created with the target audience in mind, not the search engines. If this principle is followed, the resulting traffic will naturally place the site higher in the searches.
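As a small illustration of the internal cross linking mentioned above, one page might point to a related page on the same site with descriptive anchor text (the URL is hypothetical):

    <a href="/guides/keyword-research">Read our keyword research guide</a>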