Hi all :)
I want to handle very large amounts of data in SQL Server; if anyone knows a good technique, kindly guide me.
The application runs 24 hours a day on a server and fetches protocol data from a router into the database. The database initially consists of one table, and within one hour that table contains millions of records due to the heavy traffic load.
The table also has the required indexes, but after one day it becomes very hard to search, so what techniques should I use to control this amount of data?
Any suggestions from seniors will be appreciated... :)
Comments
PIEBALDconsult 14-Jun-14 11:25am    
What sort of data? Do you have an example? I hope you're not trying to log data sent across the network.
Jawad Ahmed Tanoli 15-Jun-14 5:21am    
The application captures protocols on the server, and the table consists of CaptureTime, SourceHwAddress, DestinationHwAddress, IPType, SourceAddress, DestinationAddress, protocol, SourcePort, DestinationPort, and ProtocolReadAbleData.
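For reference, a minimal T-SQL sketch of a capture table with those columns; the data types and the clustered index on CaptureTime are assumptions, since the post does not specify them:

-- Sketch only: column types and the CaptureTime index are assumptions.
CREATE TABLE dbo.PacketCapture
(
    CaptureTime          DATETIME2     NOT NULL,
    SourceHwAddress      VARCHAR(17)   NOT NULL,  -- MAC address text, e.g. 00:1A:2B:3C:4D:5E
    DestinationHwAddress VARCHAR(17)   NOT NULL,
    IPType               TINYINT       NOT NULL,  -- e.g. 4 or 6
    SourceAddress        VARCHAR(45)   NOT NULL,  -- wide enough for IPv6
    DestinationAddress   VARCHAR(45)   NOT NULL,
    protocol             VARCHAR(16)   NOT NULL,
    SourcePort           INT           NULL,
    DestinationPort      INT           NULL,
    ProtocolReadAbleData NVARCHAR(MAX) NULL
);

-- Time-ordered clustered index so new rows append at the end
-- and range searches on capture time stay cheap.
CREATE CLUSTERED INDEX IX_PacketCapture_CaptureTime
    ON dbo.PacketCapture (CaptureTime);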
Jörgen Andersson 15-Jun-14 15:28pm    
What kind of data mining are you trying to do?

1 solution

SQL Server is not well suited to this kind of high-volume, high-velocity data usage. You should look into a high-performance NoSQL engine (or even plain text log files) for ingesting the data, and then post-process that data for the queries you want to run.
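For example, with the text-log-file route, the capture process could append to a new file every hour, and when you need to query a particular time window you bulk-load only those files into a staging table and query that, instead of one ever-growing live table. A rough T-SQL sketch, where the file path, delimiters, and staging table are assumptions about how the files would be written:

-- Hypothetical staging table with the same columns as the capture log.
CREATE TABLE dbo.PacketCaptureStaging
(
    CaptureTime          DATETIME2     NOT NULL,
    SourceHwAddress      VARCHAR(17)   NOT NULL,
    DestinationHwAddress VARCHAR(17)   NOT NULL,
    IPType               TINYINT       NOT NULL,
    SourceAddress        VARCHAR(45)   NOT NULL,
    DestinationAddress   VARCHAR(45)   NOT NULL,
    protocol             VARCHAR(16)   NOT NULL,
    SourcePort           INT           NULL,
    DestinationPort      INT           NULL,
    ProtocolReadAbleData NVARCHAR(MAX) NULL
);

-- Load only the hour(s) you actually need to query;
-- path and delimiters are assumptions about the log format.
BULK INSERT dbo.PacketCaptureStaging
FROM 'C:\captures\capture_2014-06-15_14.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);

-- The staging table stays small, so ad-hoc queries remain fast.
SELECT SourceAddress, DestinationAddress, COUNT(*) AS Packets
FROM dbo.PacketCaptureStaging
GROUP BY SourceAddress, DestinationAddress
ORDER BY Packets DESC;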
 
 
Comments
Jawad Ahmed Tanoli 15-Jun-14 5:23am    
So if I choose text files to store the data, how should I run queries to retrieve it?
