Hi,
I am currently working on rewriting my company's web service logging.
The reason for doing this is that our production server is experiencing heavy load during peak hours. The current setup is very intensive on our database.
Before I dig into the details, I'm posting this so that more experienced programmers can weigh in on my setup. I can post actual code at a later stage.
Right now I am looking for architectural advice.
Current setup:
ASMX Webservices, SQL Server 2008 R2 Enterprise Edition.
Logging is handled by our own logging class, dbLog.cs; basically every error, SOAP fault, and successful request is logged to the database.
Currently, in order to keep track of which items each user requests, a DataSet (ds) is populated on every request with everything that user has ever requested.
The items in the current request are then compared against the ds, and any items not requested before are written to a usage table in the database.
On top of that, info about the current request is logged - userId, time of the request, method name, etc.
There is also argument logging, i.e. the parameters sent to the web service, per user.
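To make the per-request flow above concrete, here is a rough sketch of what the usage check does today. All names (connection string, UsageLog table, InsertUsageRow helper) are made up for the sketch; the real code lives in dbLog.cs.

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch of the current per-request usage check (names are assumptions).
static void LogNewItems(string connectionString, int userId, int[] requestedItems)
{
    // Loads the user's ENTIRE request history on every single call.
    var ds = new DataSet();
    using (var conn = new SqlConnection(connectionString))
    using (var da = new SqlDataAdapter(
        "SELECT ItemId FROM UsageLog WHERE UserId = @userId", conn))
    {
        da.SelectCommand.Parameters.AddWithValue("@userId", userId);
        da.Fill(ds); // grows with the user's history
    }

    foreach (int itemId in requestedItems)
    {
        // One in-memory scan per item, plus one INSERT round-trip per new item.
        if (ds.Tables[0].Select("ItemId = " + itemId).Length == 0)
            InsertUsageRow(userId, itemId); // hypothetical helper in dbLog.cs
    }
}
```

The history DataSet and the per-item round-trips are what I suspect drive the connection and memory load.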
My thoughts:
I am no expert, but this setup seems to open a lot of database connections and use a lot of memory. It also has no way of showing the actual SOAP envelopes being sent to the server.
Future (possible) setup:
Read each incoming request via a SOAP extension and save each SOAP envelope to an XML file on disk, using a GUID as the file name.
Log some info about the request - userId, time of the request, method name, etc. - to the database, along with the GUID for that request.
That's it.
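A minimal sketch of that SOAP extension, following the standard SoapExtension wire-logging pattern (insert a MemoryStream into the stream chain, copy the raw request at BeforeDeserialize). The log folder is an assumption, and the dbLog call is left as a comment:

```csharp
using System;
using System.IO;
using System.Web.Services.Protocols;

// Sketch: saves every incoming SOAP envelope to disk under a GUID file name.
public class EnvelopeLogExtension : SoapExtension
{
    private Stream oldStream;
    private MemoryStream newStream;
    private Guid requestId;

    // Insert our buffer into the stream chain so we can see the raw bytes.
    public override Stream ChainStream(Stream stream)
    {
        oldStream = stream;
        newStream = new MemoryStream();
        return newStream;
    }

    public override object GetInitializer(Type serviceType) { return null; }
    public override object GetInitializer(LogicalMethodInfo methodInfo,
                                          SoapExtensionAttribute attribute) { return null; }
    public override void Initialize(object initializer) { }

    public override void ProcessMessage(SoapMessage message)
    {
        switch (message.Stage)
        {
            case SoapMessageStage.BeforeDeserialize:
                // Copy the raw request from the network stream into our buffer.
                oldStream.CopyTo(newStream); // Stream.CopyTo needs .NET 4+

                // Save the envelope under a GUID file name (folder is an assumption).
                requestId = Guid.NewGuid();
                string path = Path.Combine(@"D:\SoapLog", requestId + ".xml");
                using (FileStream file = File.Create(path))
                    newStream.WriteTo(file);

                // Rewind so the framework can deserialize from our buffer.
                newStream.Position = 0;

                // One small DB row per request: GUID, user, time, method, e.g.
                // dbLog.LogRequest(requestId, userId, DateTime.UtcNow, methodName);
                break;

            case SoapMessageStage.AfterSerialize:
                // The response was serialized into our buffer; pass it on unchanged.
                newStream.Position = 0;
                newStream.CopyTo(oldStream);
                break;
        }
    }
}
```

The extension would then be registered for all services in web.config under soapExtensionTypes. Note that because ChainStream replaces the stream, the AfterSerialize copy is required even though we only log requests; otherwise responses never reach the client.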
Now, in order to keep track of usage, I will write an application that runs each night, when the load on the server is low.
This application will loop over each userId in the database, grab the corresponding GUIDs, fetch the matching XML files from disk, and read each file into a DataSet (using one XSLT per web service) in order to analyse usage.
Write the output to a separate .csv file for each userId.
Then ZIP each .csv along with the corresponding XML file(s) for the current date.
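The nightly steps above could be sketched roughly like this. The folder layout and GetRequestsPerUser (which would query the request-log table for the day's userId/GUID pairs) are assumptions, and the actual per-service XSLT analysis is elided; ZipFile needs .NET 4.5 and a reference to System.IO.Compression.FileSystem:

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.IO;
using System.IO.Compression;

class NightlyUsageJob
{
    static void Main()
    {
        string logDir = @"D:\SoapLog"; // assumed envelope folder
        string outDir = Path.Combine(@"D:\SoapLog\Reports",
                                     DateTime.Today.ToString("yyyy-MM-dd"));
        Directory.CreateDirectory(outDir);

        // Assumed helper: userId -> GUID file names logged for that user today.
        foreach (KeyValuePair<string, List<string>> user in GetRequestsPerUser())
        {
            // One .csv per userId.
            string csvPath = Path.Combine(outDir, user.Key + ".csv");
            using (var csv = new StreamWriter(csvPath))
            {
                csv.WriteLine("guid,method,item");
                foreach (string guid in user.Value)
                {
                    var ds = new DataSet();
                    // One XSLT per web service would normalize the envelope first;
                    // the sketch just reads the raw XML.
                    ds.ReadXml(Path.Combine(logDir, guid + ".xml"));
                    // ... analyse ds and append one CSV row per requested item ...
                }
            }
        }

        // ZIP the day's output folder in one go.
        ZipFile.CreateFromDirectory(outDir, outDir + ".zip");
    }

    static Dictionary<string, List<string>> GetRequestsPerUser()
    {
        // Placeholder: would query the request-log table written during the day.
        return new Dictionary<string, List<string>>();
    }
}
```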
Okay, what do you think about this?
I believe it would take some load off the server during peak hours, and it would also let us inspect all incoming SOAP envelopes, which could be useful for future statistics on requests. There are probably better ways to do this, though - I'm not very experienced when it comes to programming.
Please share your thoughts on this!
Thanks for reading!
/D