A Simple Crawler Using C# Sockets

Introduction

A web crawler (also known as a web spider or ant) is a program that browses the World Wide Web in a methodical, automated manner.

Web crawlers are mainly used to create a copy of all the visited pages for later processing by a search engine, which indexes the downloaded pages to provide fast searches. Crawlers can also be used to automate maintenance tasks on a web site, such as checking links or validating HTML code. They can likewise be used to gather specific types of information from web pages, such as harvesting e-mail addresses (usually for spam).
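To make the idea concrete, the following is a minimal sketch of the page-fetching step that a socket-based crawler performs: resolve a host name, open a TCP connection on port 80, send an HTTP GET request, and read the response until the server closes the connection. The host name and path are placeholders for illustration, not values taken from the article's own code.

using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

class PageFetcher
{
    static void Main()
    {
        const string host = "example.com";   // placeholder host, not from the article
        const string path = "/";             // placeholder path

        // Resolve the host name to an IPv4 address.
        IPAddress address = Array.Find(
            Dns.GetHostEntry(host).AddressList,
            a => a.AddressFamily == AddressFamily.InterNetwork);

        using (Socket socket = new Socket(AddressFamily.InterNetwork,
                                          SocketType.Stream, ProtocolType.Tcp))
        {
            // Connect to the web server on the standard HTTP port.
            socket.Connect(new IPEndPoint(address, 80));

            // HTTP/1.0 asks the server to close the connection after the
            // document is sent, which simplifies reading the response.
            string request = "GET " + path + " HTTP/1.0\r\nHost: " + host + "\r\n\r\n";
            socket.Send(Encoding.ASCII.GetBytes(request));

            // Read until the server closes the connection.
            var response = new StringBuilder();
            var buffer = new byte[4096];
            int received;
            while ((received = socket.Receive(buffer)) > 0)
            {
                response.Append(Encoding.ASCII.GetString(buffer, 0, received));
            }

            // A real crawler would add error handling and a politeness delay here.
            Console.WriteLine(response.ToString());
        }
    }
}

A full crawler would go on to parse the returned HTML for links, queue them for later visits, and repeat, while avoiding pages it has already seen; those parts are omitted from this sketch.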