New sites do not need to be “submitted” to search engines to be listed. A simple link from an established site will prompt the search engines to visit the new site and spider its contents. Once such a link exists, it rarely takes more than a few days for all the main search engine spiders to visit and index the new site.
Webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Standard practice requires a search engine to check this file upon visiting the domain. The web developer can use this feature to prevent pages such as shopping carts or other dynamic, user-specific content from appearing in search engine results.
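A robots.txt policy of this kind can be tested with Python's standard `urllib.robotparser` module. The sketch below uses an illustrative policy for a hypothetical domain (the `/cart/` and `/checkout/` paths are assumptions, not part of any real site):

```python
from urllib import robotparser

# Hypothetical robots.txt for example.com: block the shopping-cart and
# checkout directories from all spiders, allow everything else.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Allow: /
"""

rp = robotparser.RobotFileParser()
# parse() accepts the file's lines directly, so no network fetch is needed.
rp.parse(ROBOTS_TXT.splitlines())

# A normal product page may be crawled; a user-specific cart page may not.
print(rp.can_fetch("*", "https://example.com/products/widget"))   # True
print(rp.can_fetch("*", "https://example.com/cart/session123"))   # False
```

In practice the file would simply be served at `https://example.com/robots.txt`; compliant spiders request it before crawling anything else on the domain.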
For search engines that offer their own paid submission program (such as Yahoo!), paying a nominal fee for submission may save some time.