Crawlers:
A crawler is an automated program that systematically browses the Internet looking for new web pages. Google and other search engines use web crawlers to keep their search indexes up to date; each search engine that maintains its own index also runs its own web crawler.
Crawling: the task of a crawler is to visit a web page, read it, and follow its links to the other pages of the site. Each time the crawler visits a web page, it makes a copy of the page and adds its URL to the index. After adding the URL, it revisits the site regularly, for example every month or two, to look for updates or changes.
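A minimal sketch of this crawl loop, using only the Python standard library. The seed URL, page limit, and function names are assumptions for illustration; a production crawler would also add politeness delays, robots.txt checks, and fuller error handling.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Visit pages breadth-first: copy each page, index its URL, follow links."""
    index = {}                      # URL -> copy of the page (the 'index' above)
    queue = deque([seed_url])
    while queue and len(index) < max_pages:
        url = queue.popleft()
        if url in index:
            continue                # already visited this page
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue                # skip unreachable pages
        index[url] = html           # make a copy and add the URL to the index
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:   # follow the links to other pages
            absolute, _ = urldefrag(urljoin(url, link))
            queue.append(absolute)
    return index


if __name__ == "__main__":
    pages = crawl("https://example.com")
    print(f"Indexed {len(pages)} pages:", list(pages))
```

The queue makes this a breadth-first traversal, so pages closest to the seed URL are indexed first; the `index` dictionary doubles as the visited set, which keeps the crawler from fetching the same URL twice.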
Robots.txt: Google and other search engines give site owners choices about how bots crawl their sites. Owners can provide detailed instructions about how crawlers should process the pages of their sites, and can indicate whether particular pages should be crawled or not. These instructions live in a file called robots.txt.
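A short sketch of how a crawler can honour such rules, using Python's standard urllib.robotparser module. The sample rules and the "MyCrawler" user-agent name are hypothetical; a real crawler fetches robots.txt from the site's root URL.

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt contents a site owner might publish (assumed):
sample_rules = """
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(sample_rules)

# The crawler checks each URL against the rules before fetching it.
print(parser.can_fetch("MyCrawler", "https://example.com/index.html"))  # True
print(parser.can_fetch("MyCrawler", "https://example.com/private/x"))   # False
```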