Hi friends,

I have a SQL database table with almost 20 million records. Four of the columns are already indexed. I want to select part of the table for calculations in my application.

For example: SELECT Id, name, description FROM Table WHERE Active = 1

This takes around 2 minutes to produce the result. Is there any faster way to retrieve the data from the SQL table and load it into a DataSet (ASP.NET)?

Please suggest a search mechanism, something like Elasticsearch!
Updated 12-Mar-14 0:39am
Comments
RahulRana723 12-Mar-14 6:55am    
Are you using a subquery for the calculation?
Karthik Jaganathan 12-Mar-14 7:11am    
No subquery!
I want to select all the rows of the table.
It's a simple SELECT statement with one WHERE condition.
RahulRana723 12-Mar-14 8:10am    
Do you want to display all the records in one go, or page through them, say 100 at a time?

This is not a simple thing; there's a lot to do. Check this answer, where I have explained each topic:
How to improve performance of a website?[^]

Check Data Loading optimization & Database optimization for your issue.
 
 
Comments
Maciej Los 12-Mar-14 18:03pm    
+5!
1. Don't use UPDATE instead of CASE
This issue is very common, and though it's not hard to spot, many developers often overlook it because using UPDATE has a natural flow that seems logical. Instead of inserting rows and then running a separate UPDATE pass per condition to fix up a value, derive the value inline with a single CASE expression and touch the table only once.
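To make the contrast concrete, here is a minimal sketch using Python's sqlite3 module (the `orders` table and its columns are invented for illustration): three per-condition UPDATE passes collapse into one UPDATE with a CASE expression.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, tier TEXT)")
conn.executemany("INSERT INTO orders (id, amount) VALUES (?, ?)",
                 [(1, 50.0), (2, 500.0), (3, 5000.0)])

# Anti-pattern: several UPDATE passes, each scanning the same table.
# conn.execute("UPDATE orders SET tier = 'low'  WHERE amount < 100")
# conn.execute("UPDATE orders SET tier = 'mid'  WHERE amount BETWEEN 100 AND 999")
# conn.execute("UPDATE orders SET tier = 'high' WHERE amount >= 1000")

# Better: one pass with a CASE expression.
conn.execute("""
    UPDATE orders SET tier = CASE
        WHEN amount < 100  THEN 'low'
        WHEN amount < 1000 THEN 'mid'
        ELSE 'high'
    END
""")
print(conn.execute("SELECT id, tier FROM orders ORDER BY id").fetchall())
# → [(1, 'low'), (2, 'mid'), (3, 'high')]
```

On a 20-million-row table the difference is one scan instead of three.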

2. Don't blindly reuse code
This issue is also very common. It's very easy to copy someone else's code because you know it pulls the data you need. The problem is that quite often it pulls much more data than you need, and developers rarely bother trimming it down, so they end up with a huge superset of data. This usually comes in the form of an extra outer join or an extra condition in the WHERE clause. You can get huge performance gains if you trim reused code to your exact needs.
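As a small sketch of what "trimming reused code" means in practice (sqlite3 again; the `customers`/`invoices` schema is made up): the reused query drags in a join this caller never uses, and the trimmed version keeps only what it actually needs.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, active INTEGER);
CREATE TABLE invoices  (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
INSERT INTO customers VALUES (1, 'Ann', 1), (2, 'Bob', 0);
INSERT INTO invoices  VALUES (10, 1, 99.0), (11, 2, 42.0);
""")

# Reused query: an extra outer join this caller never looks at.
reused = """SELECT c.id, c.name, i.total
            FROM customers c LEFT JOIN invoices i ON i.customer_id = c.id
            WHERE c.active = 1"""

# Trimmed to this caller's actual need: active customer names only.
trimmed = "SELECT c.id, c.name FROM customers c WHERE c.active = 1"

print(conn.execute(trimmed).fetchall())  # → [(1, 'Ann')]
```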

3. Do pull only the number of columns you need
This issue is similar to issue No. 2, but it's specific to columns. It's all too easy to code all your queries with SELECT * instead of listing the columns individually. The problem again is that it pulls more data than you need. I've seen this error dozens and dozens of times. A developer does a SELECT * query against a table with 120 columns and millions of rows, but winds up using only three to five of them. At that point, you're processing so much more data than you need it's a wonder the query returns at all. You're not only processing more data than you need, but you're also taking resources away from other processes.

4. Don't double-dip
Here's another one I've seen more times than I should have: A stored procedure is written to pull data from a table with hundreds of millions of rows. The developer needs customers who live in California and have incomes of more than $40,000. So he queries for customers that live in California and puts the results into a temp table; then he queries for customers with incomes above $40,000 and puts those results into another temp table. Finally, he joins both tables to get the final product.

Are you kidding me? This should be done in a single query; instead, you're double-dipping into a superlarge table. Don't be a moron: Query large tables only once whenever possible -- you'll see how much better your procedures perform.

A slightly different scenario is when a subset of a large table is needed by several steps in a process, which causes the large table to be queried each time. Avoid this by querying for the subset and persisting it elsewhere, then pointing the subsequent steps to your smaller data set.
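Sketching the California-customers example from above in sqlite3 (schema and values invented): both conditions belong in one WHERE clause, so the big table is scanned once instead of twice.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, state TEXT, income REAL)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                 [(1, 'CA', 50000), (2, 'CA', 30000), (3, 'NY', 90000)])

# Double-dip anti-pattern: one temp table for state = 'CA', another for
# income > 40000, then a join of the two -- two full scans of the big table.

# Single pass: both conditions in one WHERE clause.
rows = conn.execute(
    "SELECT id FROM customers WHERE state = 'CA' AND income > 40000"
).fetchall()
print(rows)  # → [(1,)]
```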

5. Do know when to use temp tables
This issue is a bit harder to get a handle on, but it can yield impressive gains. You can use temp tables in a number of situations, such as keeping you from double-dipping into large tables. You can also use them to greatly decrease the processing power required to join large tables. If you must join a table to a large table and there's a condition on that large table, you can improve performance by pulling out the subset of data you need from the large table into a temp table and joining with that instead. This is also helpful (again) if you have several queries in the procedure that have to make similar joins to the same table.
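A minimal sqlite3 sketch of that temp-table technique (the `big`/`small` tables are made up): pull the filtered subset of the large table once into a temp table, then let later steps join against the small subset instead of rescanning the large table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE big   (id INTEGER PRIMARY KEY, region TEXT, amount REAL);
CREATE TABLE small (id INTEGER PRIMARY KEY, big_id INTEGER, note TEXT);
INSERT INTO big   VALUES (1, 'west', 10), (2, 'east', 20), (3, 'west', 30);
INSERT INTO small VALUES (1, 1, 'a'), (2, 2, 'b'), (3, 3, 'c');
""")

# Query the large table's subset once...
conn.execute("CREATE TEMP TABLE west AS "
             "SELECT id, amount FROM big WHERE region = 'west'")

# ...and point every subsequent join at the small temp table.
rows = conn.execute("""
    SELECT s.note, w.amount
    FROM small s JOIN west w ON w.id = s.big_id
""").fetchall()
print(rows)  # → [('a', 10.0), ('c', 30.0)]
```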
 
 
Use paging in the SELECT query -- LIMIT/OFFSET in MySQL, or OFFSET ... FETCH in SQL Server 2012 and later -- so the application loads only one page of rows at a time instead of all 20 million.
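Here is a small paging sketch in Python's sqlite3 (SQLite supports the LIMIT/OFFSET form; on SQL Server the equivalent is ORDER BY ... OFFSET n ROWS FETCH NEXT m ROWS ONLY; the table contents are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(1, 251)])

page_size, page = 100, 2  # fetch the second page of 100 rows
rows = conn.execute(
    "SELECT id, name FROM t ORDER BY id LIMIT ? OFFSET ?",
    (page_size, (page - 1) * page_size),
).fetchall()
print(rows[0], rows[-1])  # → (101, 'row101') (200, 'row200')
```

Note that with very large offsets the engine still has to skip the leading rows, so keyset paging (WHERE id > last_seen_id ORDER BY id LIMIT n) scales better for deep pages.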
 
 

This content, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


