I developed a project for a trading company using C# 2.0 and SQL Server 2005.
The project is working perfectly; there are no issues with the code itself.

But I am afraid of one thing.

They enter a huge number of records daily: around 3,000 rows go into different tables each day.

Their hard disk size is 1 TB.
I am thinking that after a few months there will be several million records. Will my software then become slow, or will it keep working as it does now?

If it will become slow, what is the remedy?
[I have implemented primary-key indexes in some tables, but not all. Please suggest any other solution.]
[The network connection is not the issue.]

Please answer me.
Thanks in advance.
Posted
Updated 16-Aug-12 10:45am
v2

Solution 3

One thing you should probably consider is using CTEs to perform paging when the table gets bigger. Have a look at Data Paging Using CTE (Common Table Expression)[^]

This will allow you to browse the full table, even if you end up with 100 million records or more.
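As a rough illustration of the technique (not the linked article's exact code): the usual CTE-paging pattern numbers the rows with `ROW_NUMBER()` and then selects one page of them. The linked article targets SQL Server 2005; the sketch below uses Python's built-in `sqlite3` module (SQLite 3.25+ supports window functions) so it is self-contained, and the `Trades` table and its columns are invented for the example.

```python
import sqlite3

# Hypothetical trading table; schema invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Trades (Id INTEGER PRIMARY KEY, Symbol TEXT)")
conn.executemany("INSERT INTO Trades (Symbol) VALUES (?)",
                 [("SYM%d" % i,) for i in range(1000)])

page_size, page_number = 50, 3   # 1-based page number

# CTE paging: number every row once, then fetch only the wanted page.
rows = conn.execute("""
    WITH Numbered AS (
        SELECT Id, Symbol,
               ROW_NUMBER() OVER (ORDER BY Id) AS RowNum
        FROM Trades
    )
    SELECT Id, Symbol FROM Numbered
    WHERE RowNum BETWEEN ? AND ?
""", ((page_number - 1) * page_size + 1,
      page_number * page_size)).fetchall()

print(len(rows), rows[0][0])   # 50 rows; page 3 starts at Id 101
```

The same `WITH ... ROW_NUMBER() OVER (ORDER BY ...)` shape works in T-SQL on SQL Server 2005, which is what makes the approach usable for the asker's setup.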

Best regards
Espen Harlinn
   
Comments
Kenneth Haugland 16-Aug-12 18:39pm
   
Hey, I haven't gotten that far in the book yet. Kidding, 5'ed it :)
Espen Harlinn 16-Aug-12 18:41pm
   
Thank you, Kenneth :-D
Abhinav S 16-Aug-12 23:21pm
   
5!
Espen Harlinn 17-Aug-12 4:14am
   
Thank you, Abhinav :-D
Wendelius 16-Aug-12 23:45pm
   
Good point!
Espen Harlinn 17-Aug-12 4:14am
   
Thank you, Mika :-D
lw@zi 17-Aug-12 0:57am
   
+5!
Espen Harlinn 17-Aug-12 4:14am
   
Thank you, d@nish :-D
StianSandberg 17-Aug-12 3:11am
   
5'ed
Espen Harlinn 17-Aug-12 4:15am
   
Thank you, AlluvialDeposit :-D
Sergey Alexandrovich Kryukov 18-Aug-12 12:57pm
   
I was wondering whether you personally would respond to this question, which doesn't provide enough information about the problem; and, if you would, how?

And yes, this is a reasonable idea for working around slow processing while improving the user experience. My 5.
--SA
Espen Harlinn 18-Aug-12 13:06pm
   
3000 records a day isn't exactly a huge amount of data, but in 10 days he'll have 30,000 records, which will be cumbersome to handle in a web application.

Professionally, I would use DevExpress XpoDataSource with ServerMode set to true. We use this with tables containing nearly 100 million records; it works pretty well and was implemented in about an hour.
Sergey Alexandrovich Kryukov 18-Aug-12 13:23pm
   
Right...

Solution 2

A table having a million rows isn't big (in my opinion) and shouldn't cause any gray hairs. The things you should consider include at least (in no specific order):

  • proper indexing: not just the primary key, but indexes that satisfy your frequent/important operations
  • avoid over-indexing, to keep modifications running smoothly
  • use a good database design, with proper normalization and so on
  • use correct data types
  • avoid casts
  • design and test the SQL statements well
  • monitor the usage
  • update the statistics frequently enough
  • defragment the database from time to time
  • utilize the database to take care of the logic when possible, meaning triggers, procedures, constraints, etc.
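The first two bullets can be sketched concretely. The point is to index the columns that your frequent queries filter on, and to confirm the index is actually used by checking the execution plan. The sketch below uses Python's `sqlite3` as a stand-in for SQL Server (where you would look at the graphical execution plan instead of `EXPLAIN QUERY PLAN`); the `Orders` table, its columns, and the index name are invented for illustration.

```python
import sqlite3

# Hypothetical orders table; schema invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE Orders (
    Id INTEGER PRIMARY KEY,
    CustomerId INTEGER,
    OrderDate TEXT,
    Amount REAL)""")

# Frequent query: all orders for one customer in a date range.
# A composite index matching the WHERE clause lets the engine
# seek directly instead of scanning the whole table.
conn.execute(
    "CREATE INDEX IX_Orders_Customer_Date "
    "ON Orders (CustomerId, OrderDate)")

plan = conn.execute("""EXPLAIN QUERY PLAN
    SELECT Amount FROM Orders
    WHERE CustomerId = ? AND OrderDate >= ?""",
    (42, "2012-01-01")).fetchall()

# The plan text names the index, i.e. a seek rather than a full scan.
print(plan)
```

On SQL Server 2005 the equivalent check is `SET SHOWPLAN_TEXT ON` or the Management Studio execution plan; the composite-index idea carries over unchanged.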
   
Comments
Espen Harlinn 16-Aug-12 18:36pm
   
5'ed!
Wendelius 16-Aug-12 23:44pm
   
Thanks :)
lw@zi 17-Aug-12 0:57am
   
+5!
Wendelius 17-Aug-12 1:01am
   
Thanks d@nish :)
[no name] 18-Aug-12 12:10pm
   
Thank you very much for your suggesstion.
Wendelius 18-Aug-12 15:40pm
   
You're welcome :)

Solution 1

If these guys could make it work, I would guess you can too:
http://aws.amazon.com/rds/sqlserver/[^]

As for SQL Server, there are a number of different ways you can configure it so that this is not a problem:
http://support.microsoft.com/kb/319942[^]

Hope that helps :)
   
Comments
[no name] 16-Aug-12 16:57pm
   
Thanks for your quick response,
but the first link is not useful at all,
and the second link is about Windows NT and 2000.
The priority concept is good, though; thanks for that.
A to-the-point answer is required.
Please help.
Kenneth Haugland 16-Aug-12 17:00pm
   
I only gave the link to show you that SQL Server is used for government and huge business purposes, so you should not be afraid that it might be slow. With the proper configuration (and there are lots of possibilities) it would work on anything. SQL Server is also the backbone of SAP, a tool used by many of the big oil firms in the world.
Wendelius 16-Aug-12 17:06pm
   
Don't quite understand the downvote. Countered.
Espen Harlinn 16-Aug-12 18:25pm
   
Nice links :-D
Kenneth Haugland 16-Aug-12 18:27pm
   
Thanks, I just started reading a book about SQL Server and didn't think it would be a big issue :)
[no name] 18-Aug-12 13:34pm
   
but what happens after a few years?

Solution 4

The most obvious way of determining whether it will work, and how to improve response time if needed, would be to... test for it.

Stuff the DB with a couple of million records and profile it.
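A minimal sketch of that suggestion, under stated assumptions: the schema, row counts, and query below are invented, and Python's `sqlite3` stands in for SQL Server so the example is self-contained. The idea carries over directly: bulk-load synthetic data, time a representative query, then time it again after adding an index.

```python
import sqlite3
import time

# Invented schema and sizes; scale N up (e.g. to a few million)
# on a real test machine against the real database.
N = 200_000
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE Trades (Id INTEGER PRIMARY KEY, "
    "Symbol TEXT, Qty INTEGER)")
conn.executemany("INSERT INTO Trades (Symbol, Qty) VALUES (?, ?)",
                 (("SYM%d" % (i % 500), i) for i in range(N)))

def timed(query, *args):
    """Run a query and return (elapsed seconds, result rows)."""
    start = time.perf_counter()
    rows = conn.execute(query, args).fetchall()
    return time.perf_counter() - start, rows

# Same query before and after indexing the filtered column.
t_scan, rows = timed("SELECT * FROM Trades WHERE Symbol = ?", "SYM123")
conn.execute("CREATE INDEX IX_Trades_Symbol ON Trades (Symbol)")
t_seek, rows2 = timed("SELECT * FROM Trades WHERE Symbol = ?", "SYM123")

print(len(rows))   # same result set either way; compare t_scan vs t_seek
```

Profiling on the real system (SQL Profiler, in the asker's case) then tells you which queries actually need attention, instead of guessing.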
   
Comments
[no name] 18-Aug-12 13:36pm
   
Not a proper answer.
At least give some links on how to test,
or something like that.
barneyman 18-Aug-12 20:52pm
   
Hardly - you've been given some very good 'generic' solutions for improving the speed of a SQL database; however, your opening question is still a thought experiment ...

"I am Thinking after few month there will be some million of records, then will my Software work slow? or it will work as it is now"

We don't know the answer to that question, and it would appear you're not interested in finding out either - my suggestion was to try it out! Fill the database with 5 years of data and empirically *determine* how it behaves, and *then* look at improving it

No number of SQL optimisations will help if the code is inefficient, the tables aren't normalised properly, or you have some serious contention issues in there

I'd suggest you test it by automating what your application does, insertions, reports and auditing

You measure response by charting response times as the data grows, and you can also use SQL profiler to find any bottlenecks
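The "chart response times as the data grows" step above can be sketched as follows. Everything here is invented for illustration (schema, batch sizes, query), and Python's `sqlite3` stands in for SQL Server and SQL Profiler; the pattern is simply to repeat the same measurement at increasing table sizes and keep the numbers.

```python
import sqlite3
import time

# Invented audit table; grow it in steps and time the same
# query at each size to see how latency trends with volume.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Audit (Id INTEGER PRIMARY KEY, Detail TEXT)")

timings = []
for step in range(1, 5):
    conn.executemany("INSERT INTO Audit (Detail) VALUES (?)",
                     (("row %d" % i,) for i in range(50_000)))
    start = time.perf_counter()
    conn.execute(
        "SELECT COUNT(*) FROM Audit WHERE Detail LIKE '%999%'"
    ).fetchone()
    timings.append((step * 50_000, time.perf_counter() - start))

for size, secs in timings:
    print(size, round(secs, 4))   # (table size, query latency) pairs
```

Plotting those pairs (or just eyeballing them) shows whether the query degrades linearly, stays flat, or blows up, which is exactly the question the asker wants answered.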
[no name] 19-Aug-12 10:43am
   
Thank you so much for your big suggestion.
"We don't know the answer to that question"
Then why do you people answer?
It's true that I got some solutions,
but I am still searching for a shorter and more effective way to solve the problem.
I don't think it's a crime to look for an easy way to my destination.
There are still people on CodeProject who have the answer but are not posting it.
My mistake was thinking you were one of them.
But you are someone who can only say it can't be done!
barneyman 19-Aug-12 20:49pm
   
I assume there are some translation issues at play here

I've created SQL-based systems that had to run at 10,000 inserts per minute - the soak-test harnesses took as long to write as the actual product, and they uncovered problems we had never anticipated

I wish you the best of luck
[no name] 21-Aug-12 14:52pm
   
thanks

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)




CodeProject, 503-250 Ferrand Drive Toronto Ontario, M3C 3G8 Canada +1 416-849-8900 x 100