Hi,

I am currently working on rewriting my company's web service logging.

The reason for doing this is that our production server is experiencing heavy load during peak hours. The current setup is very intensive on our database.

Before I dig into things, I'm posting this so that more experienced programmers can read it and share their thoughts on my setup; I've included a few rough sketches below, and proper code can follow at a later stage.

Right now I am looking for architectural advice.

Current setup:
ASMX Webservices, SQL Server 2008 R2 Enterprise Edition.
Logging is handled in our own logging class, dbLog.cs; basically every error, SOAP error or successful request is logged to the database.
Currently, in order to keep track of which items are being requested by which user, a DataSet (ds) is populated on each request.

This ds is populated with every item the current user has ever requested.
The current item request is then compared against the ds, and any new items, not requested before, are written to a usage table in the database.
Information about the current request is also logged: userId, request time, method name, etc.
Finally, there is argument logging, i.e. the parameters being sent to the web service, for each user.
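
To make the discussion concrete, here is a minimal sketch of the current per-request flow as I understand it. dbLog.cs itself isn't posted here, so the method and table names (LogRequest, UsageHistory, RequestLog) are my assumptions, not our real code:

using System;
using System.Data;
using System.Data.SqlClient;

public static class DbLogSketch
{
    public static void LogRequest(string connStr, int userId,
                                  string methodName, int[] requestedItems)
    {
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();

            // 1. Pull the user's entire request history into a DataSet.
            var ds = new DataSet();
            using (var da = new SqlDataAdapter(
                "SELECT ItemId FROM UsageHistory WHERE UserId = @uid", conn))
            {
                da.SelectCommand.Parameters.AddWithValue("@uid", userId);
                da.Fill(ds, "History");
            }

            // 2. Insert any item the user has never requested before.
            foreach (int itemId in requestedItems)
            {
                if (ds.Tables["History"].Select("ItemId = " + itemId).Length == 0)
                {
                    using (var cmd = new SqlCommand(
                        "INSERT INTO UsageHistory (UserId, ItemId) VALUES (@uid, @item)",
                        conn))
                    {
                        cmd.Parameters.AddWithValue("@uid", userId);
                        cmd.Parameters.AddWithValue("@item", itemId);
                        cmd.ExecuteNonQuery();
                    }
                }
            }

            // 3. Log the request itself (userId, timestamp, method name).
            using (var cmd = new SqlCommand(
                "INSERT INTO RequestLog (UserId, RequestTime, MethodName) VALUES (@uid, @time, @method)",
                conn))
            {
                cmd.Parameters.AddWithValue("@uid", userId);
                cmd.Parameters.AddWithValue("@time", DateTime.UtcNow);
                cmd.Parameters.AddWithValue("@method", methodName);
                cmd.ExecuteNonQuery();
            }
        }
    }
}

As you can see, step 1 grows with the user's history, which is why I suspect it is the main culprit.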

My thoughts:
I am no expert, but this setup seems to open a LOT of database connections and use a lot of memory. The current setup also has no way of actually showing the SOAP envelopes being sent to the server.

Future (possible) setup:
Read each incoming request via a SOAP extension, save each SOAP envelope to an XML file on disk, and give it a GUID as its file name.
Log some info (userId, request time, method name, etc.) along with the corresponding GUID for the current request to the database.
That's it.
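
For the capture step, the standard SoapExtension stream-chaining pattern should work. This is only a sketch: the log directory is a placeholder and the database write is left as a TODO:

using System;
using System.IO;
using System.Web.Services.Protocols;

public class EnvelopeLogExtension : SoapExtension
{
    private Stream oldStream;
    private Stream newStream;

    // Intercept the wire stream so the raw envelope can be copied.
    public override Stream ChainStream(Stream stream)
    {
        oldStream = stream;
        newStream = new MemoryStream();
        return newStream;
    }

    public override void ProcessMessage(SoapMessage message)
    {
        if (message.Stage == SoapMessageStage.BeforeDeserialize)
        {
            // Buffer the incoming request, dump it to disk under a GUID,
            // then rewind so deserialization can proceed normally.
            Copy(oldStream, newStream);
            newStream.Position = 0;

            Guid id = Guid.NewGuid();
            using (var file = File.Create(Path.Combine(@"C:\SoapLog", id + ".xml")))
            {
                Copy(newStream, file);
            }
            newStream.Position = 0;

            // TODO: insert (id, userId, DateTime.UtcNow, message.MethodInfo.Name)
            // into the request-log table here.
        }
        else if (message.Stage == SoapMessageStage.AfterSerialize)
        {
            // Pass the (unmodified) response through to the real stream.
            newStream.Position = 0;
            Copy(newStream, oldStream);
        }
    }

    private static void Copy(Stream from, Stream to)
    {
        var reader = new StreamReader(from);
        var writer = new StreamWriter(to);
        writer.Write(reader.ReadToEnd());
        writer.Flush();
    }

    public override object GetInitializer(Type serviceType) { return null; }
    public override object GetInitializer(LogicalMethodInfo methodInfo,
        SoapExtensionAttribute attribute) { return null; }
    public override void Initialize(object initializer) { }
}

The extension would then be registered under <soapExtensionTypes> in web.config so it runs for every request.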

Now, in order to keep track of usage, I will write an application that runs each night when the load on the server is low.
This application will loop over each userId in the database, grab the corresponding GUIDs, fetch the matching XML files from disk, and read each XML file into a DataSet (using one XSLT per web service). It will then loop over all users and XML files in order to analyse usage.
The output goes to a separate .csv file for each userId.
Finally, each .csv is ZIPped along with the corresponding XML file(s) for the current date.
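
A rough sketch of that nightly job, assuming .NET 4.5's System.IO.Compression.ZipFile for the archiving; the connection string, paths and log-table schema are placeholders, and the XSLT/analysis step is only stubbed out:

using System;
using System.Data.SqlClient;
using System.IO;
using System.IO.Compression; // .NET 4.5+, reference System.IO.Compression.FileSystem

class NightlyUsageJob
{
    static void Main()
    {
        string connStr = "...";              // placeholder
        string envelopeDir = @"C:\SoapLog";  // where the GUID-named XML files live
        string workDir = Path.Combine(@"C:\UsageReports",
                                      DateTime.Today.ToString("yyyy-MM-dd"));
        Directory.CreateDirectory(workDir);

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            using (var cmd = new SqlCommand(
                "SELECT UserId, EnvelopeGuid FROM RequestLog " +
                "WHERE RequestTime >= @since ORDER BY UserId", conn))
            {
                cmd.Parameters.AddWithValue("@since", DateTime.Today.AddDays(-1));
                using (var rdr = cmd.ExecuteReader())
                {
                    while (rdr.Read())
                    {
                        int userId = rdr.GetInt32(0);
                        Guid id = rdr.GetGuid(1);
                        string xmlPath = Path.Combine(envelopeDir, id + ".xml");

                        // Real version: transform the envelope with the
                        // per-service XSLT, load it into a DataSet and
                        // analyse usage; here we just note it in the CSV.
                        File.AppendAllText(
                            Path.Combine(workDir, userId + ".csv"),
                            id + ";" + xmlPath + Environment.NewLine);

                        // Keep the envelope next to the CSV for zipping.
                        File.Copy(xmlPath, Path.Combine(workDir, id + ".xml"), true);
                    }
                }
            }
        }

        // One archive per day; per-user archives would work the same way.
        ZipFile.CreateFromDirectory(workDir, workDir + ".zip");
    }
}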

Okay, what do you think about this?

I believe it would take some load off the server during peak hours. It would also let us inspect all incoming SOAP envelopes, which could be nice for future statistics on requests. However, there might be better ways to do this (there probably are); I'm not very experienced when it comes to programming.

Please share your thoughts on this!
Thanks for reading!

/D
Comments
fjdiewornncalwe 29-Jan-13 14:34pm    
An interesting idea, provided you can live with the delay in logging information getting into the database. I'm not sure you're saving as much as you think you are by doing this, though, because you are still generating objects for logging and spending the I/O time and disk space on writing them. It would be an interesting test if you were to set up some timing metrics on the current system and then set up comparable metrics on your proposed system. I would think it worth doing that and then using whichever one is more efficient.
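
Something as simple as a Stopwatch wrapper would do for those metrics; LogRequest below is just a stand-in for whatever your dbLog call actually is:

using System;
using System.Diagnostics;

static class LogTimer
{
    // Wraps any logging action and reports how long it took, so the old
    // and new implementations can be compared under the same load.
    public static void Time(string label, Action logAction)
    {
        var sw = Stopwatch.StartNew();
        logAction();
        sw.Stop();
        Trace.WriteLine(label + " took " + sw.ElapsedMilliseconds + " ms");
    }
}

// Usage: LogTimer.Time("dbLog", () => DbLogSketch.LogRequest(/* current args */));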
Daniel Carlström 29-Jan-13 14:43pm    
Thanks for your input, you might be right. I certainly have to compare the two setups in order to really know. Regarding writing the SOAP envelopes to disk: do you think I/O would be lower if they were written to an XML column in the db instead?
BR
/D
fjdiewornncalwe 29-Jan-13 16:44pm    
I'm not sure, because I've never generated any metrics on that; but I'm curious, so it would be awesome if you could let us know what you find out.
