Ever thought of trying to validate each URL listed in your sitemap file?
I have a site with dynamically generated page links. Those links are generated from a page title, which can be any combination of letters, numbers, and symbols. Of course, the site removes all forbidden characters from the page title before generating its URL, and trims and shortens it a bit; however, errors still occur from time to time. For example, due to the specifics of my URL conversion, a page titled "...IS_BROKEN" will end up with the following URL: /.IS_BROKEN+ There are thousands of pages, so clearly I cannot verify each separate page in the site's database by hand.
From the list of dynamically generated URLs I generate a sitemap.xml file, which contains all of the site's pages. So each time the map file is generated, I need to ensure that there are no repeated items (this may happen if different pages have the same title) and that each separate URL is accessible, i.e. does not produce a bad request, a 404, or any similar error. So I created a C# program that walks through each URL listed in the sitemap.xml file and tries to access it. It logs all errors that occur to an output file, so it's easy to track down problem pages.
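The duplicate check for repeated titles can be sketched with a HashSet that keeps the first occurrence of each URL. This is a minimal sketch; the method name and sample URLs are hypothetical, not the site's actual code:

```csharp
using System;
using System.Collections.Generic;

class SitemapDedup
{
    // Returns the URLs with duplicates removed, preserving first-seen order.
    // Duplicates can appear when different pages share the same title.
    public static List<string> Deduplicate(IEnumerable<string> urls)
    {
        var seen = new HashSet<string>();       // ordinal comparison: URL paths are case-sensitive
        var unique = new List<string>();
        foreach (var url in urls)
        {
            if (seen.Add(url))                  // Add returns false if the URL was already seen
                unique.Add(url);
        }
        return unique;
    }

    static void Main()
    {
        var urls = new[] { "/page-a", "/page-b", "/page-a" };
        foreach (var u in Deduplicate(urls))
            Console.WriteLine(u);
    }
}
```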
I use the XmlDocument class to load sitemap.xml, and the WebRequest and WebResponse classes to determine whether each URL exists.
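The overall loop can be sketched like this: load the sitemap with XmlDocument, select the `<loc>` elements, and issue a request per URL, logging any failure. This is a minimal sketch of the approach, not the author's actual program; the file names are assumptions, and only the sitemaps.org namespace URI is fixed by the sitemap format itself:

```csharp
using System;
using System.IO;
using System.Net;
using System.Xml;

class SitemapChecker
{
    static void Main()
    {
        // Load the sitemap (the path is an assumption for the sketch).
        var doc = new XmlDocument();
        doc.Load("sitemap.xml");

        // Sitemap files place their elements in the sitemaps.org namespace,
        // so an XmlNamespaceManager is needed for SelectNodes to match them.
        var ns = new XmlNamespaceManager(doc.NameTable);
        ns.AddNamespace("sm", "http://www.sitemaps.org/schemas/sitemap/0.9");

        using (var log = new StreamWriter("errors.log"))   // output file name is an assumption
        {
            foreach (XmlNode loc in doc.SelectNodes("//sm:loc", ns))
            {
                var url = loc.InnerText.Trim();
                try
                {
                    var request = (HttpWebRequest)WebRequest.Create(url);
                    request.Method = "HEAD";               // we only need the status, not the body
                    using (var response = (HttpWebResponse)request.GetResponse())
                    {
                        if (response.StatusCode != HttpStatusCode.OK)
                            log.WriteLine("{0}: {1}", url, response.StatusCode);
                    }
                }
                catch (WebException ex)
                {
                    // Bad requests, 404s, etc. surface as WebException,
                    // usually with the HTTP response attached.
                    var http = ex.Response as HttpWebResponse;
                    var status = http != null ? http.StatusCode.ToString()
                                              : ex.Status.ToString();
                    log.WriteLine("{0}: {1}", url, status);
                }
            }
        }
    }
}
```

Using a HEAD request keeps the check cheap, since the page body is never downloaded; if a server rejects HEAD, falling back to GET for that URL is a straightforward extension.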