Software robots that "crawl" Web pages, gathering information for search engine databases. Also known as search engine crawlers.
Automated programs (sometimes called web crawlers) that crawl the World Wide Web, gathering web pages for search engines. Large search engines employ many spiders. Spiders are a type of robot.
Search engines use special programs, called spiders or robots, that catalogue web sites for use in their searches.
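The behaviour described above — fetching a page, extracting its links, and following them to gather further pages — can be sketched in a few lines. This is a minimal illustration using only the Python standard library, not any real search engine's crawler; the `fetch` function, the tiny in-memory "web", and all URLs are hypothetical stand-ins for actual HTTP requests.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects absolute href targets from the anchor tags on one page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page they appear on.
                    self.links.append(urljoin(self.base_url, value))

def crawl(start_url, fetch, max_pages=100):
    """Breadth-first crawl: fetch each page, follow its links, skip seen URLs."""
    seen = {start_url}
    queue = deque([start_url])
    pages = {}
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        html = fetch(url)          # in a real spider this is an HTTP GET
        pages[url] = html          # gathered pages go to the search index
        parser = LinkExtractor(url)
        parser.feed(html)
        for link in parser.links:
            if link not in seen:   # avoid re-crawling the same page
                seen.add(link)
                queue.append(link)
    return pages

# A tiny in-memory "web" standing in for real HTTP fetches (hypothetical URLs).
site = {
    "http://example.com/":  '<a href="/a">A</a> <a href="/b">B</a>',
    "http://example.com/a": '<a href="/">home</a>',
    "http://example.com/b": '<a href="/a">A</a>',
}
pages = crawl("http://example.com/", fetch=lambda url: site.get(url, ""))
```

A production spider would add politeness rules (robots.txt, rate limits), error handling, and distributed queues — which is why large search engines run many spiders in parallel.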