I am developing a project for college and I need some suggestions on the development. It is a website that shows information gathered from other websites, such as links, images, etc.
I have prepared the following model for the website:
A Home.aspx page that shows data from SQL Server tables.
I have coded a crawler in C# that can crawl (fetch) the required data from target websites (see the simplified sketch below).
I want some way to run the crawler in the background at a fixed time interval so that it inserts updates into the tables. The goal is that the database stays up to date and Home.aspx always shows current information. (It is like a smaller version of the Google News website.)
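For reference, here is roughly what the crawler does today. This is only a simplified sketch; the `Links` table, its columns, and the connection string are placeholders for my actual schema and configuration, and the real code uses a proper HTML parser rather than a regex.

```csharp
using System;
using System.Data.SqlClient;
using System.Net;
using System.Text.RegularExpressions;

public class Crawler
{
    // Placeholder connection string; the real one lives in web.config.
    private const string ConnectionString =
        "Server=.;Database=CollegeProject;Integrated Security=true;";

    public void CrawlAndStore(string url)
    {
        // Download the raw HTML of the target page.
        string html;
        using (var client = new WebClient())
        {
            html = client.DownloadString(url);
        }

        // Very simplified link extraction for illustration only.
        var matches = Regex.Matches(html, "<a[^>]+href=\"([^\"]+)\"", RegexOptions.IgnoreCase);

        using (var connection = new SqlConnection(ConnectionString))
        {
            connection.Open();
            foreach (Match match in matches)
            {
                // "Links" and its columns stand in for my actual tables.
                using (var command = new SqlCommand(
                    "INSERT INTO Links (Url, FetchedOn) VALUES (@url, @fetchedOn)", connection))
                {
                    command.Parameters.AddWithValue("@url", match.Groups[1].Value);
                    command.Parameters.AddWithValue("@fetchedOn", DateTime.UtcNow);
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}
```

The question is really about how to get `CrawlAndStore` to run on a schedule once the site is deployed.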
I want to host the website in a shared hosting environment (i.e. a third-party hosting provider, most likely running on IIS).
I posted a similar question to different .NET forums and communities, and they suggested a lot of different things, such as:
- Create a web service (is it really necessary?)
- Use WCF
- Create a console application and run it via Windows Task Scheduler (is that workable with an ASP.NET Web Forms website on shared hosting? See the sketch after this list.)
- Run the crawler on my local machine and update the database from there. (No, I want everything online.)
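Of these, the console application option is the one I understand best. A minimal sketch of what I imagine, reusing the `Crawler` class from the sketch above (the source URL is a placeholder), would be something like this, with Task Scheduler launching the executable every N minutes:

```csharp
using System;

public class Program
{
    public static void Main(string[] args)
    {
        // Runs one crawl pass and exits; Windows Task Scheduler (or whatever
        // scheduled-task feature the host offers) would launch this repeatedly.
        var crawler = new Crawler();
        crawler.CrawlAndStore("http://example.com/news"); // placeholder source URL
        Console.WriteLine("Crawl finished at " + DateTime.Now);
    }
}
```

My doubt is whether a shared hosting provider will even let me schedule and run such an executable alongside the website.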
Please suggest a clear way forward so that I can complete the task, and please elaborate on the technologies and methods that best suit my project.