A Web Spider Library in C#

Posted 9 Sep 2006, updated 18 Sep 2007
An article about a spider library to grab websites and store them locally

Sample Image - ZetaWebSpider.png

Don't fear, it's just a web spider ;-)


Today, while looking through some older code, I came across a set of classes I wrote at the beginning of this year for a customer project.

The classes implement a basic web spider (also called "web robot" or "web crawler") to grab web pages (including resources like images and CSS), download them locally and adjust any resource hyperlinks to point to the locally downloaded resources.

While this is not a full-featured article with the detailed explanations I usually like to write, I still want to put the code online with this short write-up. Maybe a reader can take some ideas from this code and use it as a starting point for their own project.


The classes allow for synchronous as well as asynchronous download of web pages, with several configurable options such as the hyperlink depth to follow and proxy settings.

The downloaded resources get new file names based on the hash code of the original URL. I did this to keep things simple (for me as the programmer).
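The naming scheme can be sketched roughly like this. Note that the helper name below is hypothetical and not part of the library's actual API; it only illustrates the idea of deriving a flat local file name from the URL's hash code while keeping the extension:

```csharp
using System;
using System.IO;

static class LocalFileNaming
{
    // Hypothetical sketch: derive a flat local file name from the hash code
    // of the original URL, keeping the extension so that the file type
    // (e.g. ".css", ".png") is still recognizable.
    public static string GetLocalFileName(Uri url)
    {
        string extension = Path.GetExtension(url.AbsolutePath);
        int hash = url.AbsoluteUri.GetHashCode();

        // Format the hash as hex, e.g. "1a2b3c4d.css".
        return string.Format("{0:x8}{1}", hash, extension);
    }
}
```

One caveat of this approach: `String.GetHashCode` is not guaranteed to be stable across .NET versions or runs, so hash-based names are only suitable when the whole site is downloaded in one session, as is the case here.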

To parse a document, I am using the SGMLReader DLL from the GotDotNet website.

Also, since I did not need them for the project, the library does not honor "robots.txt", throttle its requests, or offer similar features.

Using the Code

The download for this article contains the library ("WebSpider") and a testing console application ("WebSpiderTest"). The testing application is short and should be easy to understand.

Basically, you create an instance of the WebSiteDownloaderOptions class, configure several parameters, create an instance of the WebSiteDownloader class, optionally connect event handlers, and then tell the instance to start processing the given URL either synchronously or asynchronously.
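The steps above can be sketched as follows. Only the WebSiteDownloaderOptions and WebSiteDownloader class names come from the article; the specific property, event, and method names shown here are assumptions, so consult the downloaded sources for the actual members:

```csharp
using System;

class Program
{
    static void Main()
    {
        // 1. Create and configure the options (property names are assumed).
        var options = new WebSiteDownloaderOptions();
        options.BaseUri = new Uri("http://www.example.com");
        options.MaximumLinkDepth = 2;   // how deep to follow hyperlinks

        // 2. Create the downloader from the options.
        var downloader = new WebSiteDownloader(options);

        // 3. Optionally connect event handlers to observe progress
        //    (event and argument names are assumed).
        downloader.ProcessingUrl += (sender, e) =>
            Console.WriteLine("Downloading {0}...", e.Uri);

        // 4. Start processing the given URL synchronously; an asynchronous
        //    variant would be started in a similar way.
        downloader.Process();
    }
}
```

The WebSpiderTest console application in the download follows this same pattern and is the best place to see the real member names.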

History

  • 2007-09-17: Fixed several issues
  • 2006-09-10: Initial release of the article


This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


About the Author

Uwe Keim
Chief Technology Officer Zeta Producer Desktop CMS
Germany Germany
Uwe has been programming since 1989, with experience in Assembler, C++, MFC, and lots of web and database work, and now uses ASP.NET and C# extensively, too. He has also taught programming to students at the local university.

In his free time, he goes climbing, running and mountain biking. Recently he became the father of a cute boy.



Article Copyright 2006 by Uwe Keim