Web robots (also known as crawlers or spiders) are programs that traverse the Web automatically; search engines use them to index the Web, or parts of it.
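Well-behaved robots honor the Robots Exclusion Standard: before crawling, they fetch a site's robots.txt file and skip any paths it disallows for their user agent. A minimal sketch of checking those rules with Python's standard-library `urllib.robotparser` (the robots.txt content and the bot name "MyBot" are illustrative):

```python
import urllib.robotparser

# An example robots.txt, parsed from a string for illustration;
# a real crawler would fetch it from https://example.com/robots.txt.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# The parser applies the rules on behalf of a given user agent.
print(rp.can_fetch("MyBot", "/private/page"))  # False: disallowed path
print(rp.can_fetch("MyBot", "/public/page"))   # True: no rule blocks it
```

In practice a crawler would call `rp.set_url(...)` and `rp.read()` to load the live file, then consult `can_fetch` before each request.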
Information on the robots.txt Robots Exclusion Standard and other articles about writing well-behaved Web robots.
Search Tools Consulting explains how the search engine programs called "robots" or "spiders" work, and reviews related sites.
A standard under development on behalf of content publishers to communicate permissions information more extensively than robots.txt allows. Project documents, implementations and background information.
This large database lists user agents in categories and distinguishes between robots and browsers.
An alphabetical list of user agents and the deployer behind them, compiled by Christoph Rüegg.
A searchable database of user-agents with information about their type, purpose and origin.
John A. Fotheringham presents data in tabular form on the robots sent by search engines and other sites to read and index Web pages: their origins, names and IP addresses.
Tool from ASAP Consulting s.r.o. for detailed user agent string analysis using an online form. Includes databases of browsers and robots.
Contains a database of user-agents for crawlers, spiders, browsers; tools for user-agent lookup and tools for user-agent string search.
Last update: March 10, 2020 at 5:35:07 UTC