Hi there,

I am working on a spin-off from automating a web page: an application that can be handed any URL and then "click" the links on that page. I am coding in C# in Visual Studio 2008 Express.

E.g.:
1) Google a topic.
2) Grab the list of URLs from the first page of results.
3) Open each page one at a time and automatically click every link on it.

This would all happen in the background with no visible web browser window/frame.

The application must cater for all kinds of page design. E.g. each page could have a different way of serving a file download: JavaScript, a redirect, or a direct link.

I have been able to get as far as point 2: I can scrape the HTML and get a list of URLs, and I have been able to use a WebClient to download files from pages that link to them directly. I am still a bit shaky on determining what kind of web page I am dealing with in order to pick the right "download" process.
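
For reference, here is roughly the shape of what I have working so far (a minimal sketch; the class and method names are mine, and the regex is a fragile stand-in for a real HTML parser such as the HTML Agility Pack):

C#
using System;
using System.Collections.Generic;
using System.Net;
using System.Text.RegularExpressions;

class LinkScraper
{
    // Step 2: pull the href values out of a page's HTML.
    static List<string> GetLinks(string pageUrl)
    {
        using (var client = new WebClient())
        {
            string html = client.DownloadString(pageUrl);
            var links = new List<string>();
            foreach (Match m in Regex.Matches(html,
                "href\\s*=\\s*[\"']([^\"']+)[\"']", RegexOptions.IgnoreCase))
            {
                links.Add(m.Groups[1].Value);
            }
            return links;
        }
    }

    // The "direct link" case: this only works when the URL
    // points straight at the file.
    static void DownloadDirect(string fileUrl, string localPath)
    {
        using (var client = new WebClient())
        {
            client.DownloadFile(fileUrl, localPath);
        }
    }
}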

I am having major difficulty with scripted pages, where the file download happens via JavaScript or a redirect.
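
The redirect case can at least be probed without a browser. This sketch (the method name is mine) uses HttpWebRequest, which follows HTTP redirects by default, to find out where a link finally lands and whether it serves a file; the JavaScript case is the genuinely hard part, since nothing at the HTTP level runs the script:

C#
using System;
using System.Net;

class TargetInspector
{
    // Resolve a link's final target and report what it serves.
    static void InspectTarget(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.AllowAutoRedirect = true; // the default, shown for clarity

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            Console.WriteLine("Final URL:    " + response.ResponseUri);
            Console.WriteLine("Content-Type: " + response.ContentType);
            // A non-HTML content type (e.g. application/zip) suggests a
            // downloadable file rather than another page to crawl.
        }
    }
}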

Can anyone help please?

Thanks in advance!!!
1 solution

Use WatiN for such a task.
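
WatiN drives a real Internet Explorer instance, so the page's JavaScript actually runs, which is what makes script-triggered downloads workable; it can also keep the window hidden. A minimal sketch (the "Download" link text is a placeholder, and the MakeNewIeInstanceVisible setting should be checked against your WatiN version; note that WatiN requires an STA thread):

C#
using System;
using WatiN.Core;

class Clicker
{
    [STAThread] // WatiN requires a single-threaded apartment
    static void Main()
    {
        // Assumption: this setting hides new IE windows in WatiN 2.x.
        Settings.MakeNewIeInstanceVisible = false;

        using (var browser = new IE("http://www.example.com"))
        {
            // Enumerate every link on the page.
            foreach (Link link in browser.Links)
            {
                Console.WriteLine(link.Url);
            }

            // Click one link by its text; Find.ByUrl and Find.ById
            // work the same way.
            browser.Link(Find.ByText("Download")).Click();
        }
    }
}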
 