Best Practices for ORM

16 Oct 2013
I want to share some of the best practices that need to be followed while dealing with any ORM (Object-Relational Mapper)

Introduction

Although everyone puts in their best effort to develop good software, sometimes bad things happen. To be on the safer side, we can take some precautionary measures. Today I want to share some of the best practices that need to be followed while dealing with any ORM (Object-Relational Mapper). Most of these are common to many relational databases.

  1. Exception handling: Handle only those exceptions that actually need handling, and let all others pass through. For example, if you want to handle a connection issue, catch the specific exception type; don't add a catch-all on Exception "just in case", because swallowing every exception hides real errors and can also hurt performance.
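
    For illustration, here is a minimal ADO.NET sketch (System.Data.SqlClient); connectionString is assumed to be defined elsewhere:
    C#
    try
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            // ... execute the commands that may fail ...
        }
    }
    catch (SqlException ex)
    {
        // handle the specific database/connection failure: log, retry or rethrow
        Console.Error.WriteLine(ex.Message);
        throw;
    }
    // note: no catch (Exception) "just in case"; unexpected exceptions pass through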
     
  2. using: As many of us know, the using statement automatically handles the dispose functionality. But it doesn't always behave as expected, especially in the case of database connections; if you want to see this for yourself, run some overnight processes with Windows services. Although using handles it most of the time, it is always good practice to close database handles/connections explicitly as well.
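
    A minimal sketch of what this looks like in practice (connectionString is assumed to be defined elsewhere):
    C#
    using (var connection = new SqlConnection(connectionString)) // System.Data.SqlClient
    {
        connection.Open();
        // ... do the database work ...
        connection.Close(); // explicit close, even though Dispose would normally close it too
    }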
     
  3. Prefer compiled LINQ queries: Compiling a query once and executing it several times can lead to significant performance gains. Such queries are especially useful when the same query has to be executed many times with different parameters. The syntax for that is:
    CompiledQuery.Compile((DataContext db, <input parameters>) => <LINQ query>);
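
    A fuller LINQ to SQL sketch of the same idea; NorthwindDataContext, its Customers table and the minBalance parameter are illustrative names:
    C#
    // requires: using System.Linq; using System.Data.Linq;
    static readonly Func<NorthwindDataContext, decimal, IQueryable<Customer>> RichCustomers =
        CompiledQuery.Compile((NorthwindDataContext db, decimal minBalance) =>
            db.Customers.Where(c => c.Balance > minBalance));

    // compiled once, executed many times with different parameters:
    var result = RichCustomers(dataContext, 1000m).ToList();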
  4. Always prefer IQueryable over IEnumerable: When using LINQ together with SQL, it is advisable to go for IQueryable<T> rather than IEnumerable<T>. The reason is that IQueryable<T> is designed so that the query can be translated and executed by the database, whereas IEnumerable<T> is the generalized form that enumerates objects in memory and works for every data source. The difference can be verified with a profiler. Let's take a simple example:
    C#
    IQueryable<Customer> custs = …;
    var richCustomers = custs.Where(c => c.Balance > 1000);

    The above snippet is translated into SQL that selects only the customers with a balance greater than 1,000. The snippet below, on the other hand, behaves quite differently:

    C#
    IEnumerable<Customer> custs = …;
    var richCustomers = custs.Where(c => c.Balance > 1000);

    Here the query runs against the database without the filter, every customer is returned, and the Where clause then discards the customers with a balance of 1,000 or less in memory. So one of the important differences is that, in most cases, working with IQueryable<T> saves us from returning too many rows from the database.

    Another key advantage of IQueryable<T> shows up when doing pagination: if we use Take and Skip on an IQueryable<T>, only the requested number of rows is retrieved; doing the same on an IEnumerable<T> causes all the rows to be loaded into memory first, which is again a very BIG performance hit. The same can be verified with SQL Profiler.

    At the end of the day, which one to prefer depends entirely on what the underlying data source is.
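
    A minimal paging sketch, assuming custs is the IQueryable<Customer> from above, that Customer has an Id column, and that pageIndex and pageSize are supplied elsewhere:
    C#
    // Skip/Take on IQueryable<T> is translated into SQL paging (e.g. TOP or OFFSET/FETCH),
    // so only one page of rows is pulled from the database.
    var page = custs.OrderBy(c => c.Id)
                    .Skip(pageIndex * pageSize)
                    .Take(pageSize)
                    .ToList();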

  5. Avoid implicit transactions: It is good practice to perform CRUD operations within an explicitly defined transaction. If we don't create a transaction ourselves, the DB engine falls back into implicit/autocommit mode, in which every statement is treated as its own transaction, which costs performance and makes it harder to keep related changes consistent.
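
    A minimal sketch using TransactionScope from System.Transactions; context stands for whatever data/object context is in use:
    C#
    // requires a reference to System.Transactions
    using (var scope = new TransactionScope())
    {
        // ... perform the related insert/update/delete operations here ...
        context.SaveChanges();   // 'context' is an assumed ObjectContext/DbContext
        scope.Complete();        // commit; disposing without Complete() rolls everything back
    }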
     
  6. Avoid using SELECT *. Always be specific about which columns need to be retrieved and fetch only those. This helps return results faster with reduced network traffic.
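
    In LINQ terms this means projecting only the needed columns instead of materializing whole entities; db.Customers and its Id/Name properties are assumed:
    C#
    // The generated SQL selects only Id and Name, not every column of the table.
    var names = db.Customers
                  .Select(c => new { c.Id, c.Name })
                  .ToList();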
     
  7. Always refer to fields by their names instead of their ordinal positions.
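
    For example, with a data reader (reader is an assumed IDataReader and CustomerName an assumed column):
    C#
    var name = (string)reader["CustomerName"]; // survives changes to the column order
    // var name = reader.GetString(2);         // breaks as soon as the SELECT list is reordered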
     
  8. Prefer stored procedures to inline SQL statements.
     
  9. Queries with the NoTracking option: If change tracking and relationship tracking are not required, then the NoTracking option works well and avoids the overhead of the state manager.
    C#
    var result = query.Execute(MergeOption.NoTracking).ToList();
  10. Always follow the best practice of verifying the row count when performing DELETE operations. It confirms that the delete actually removed the intended rows.
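
    A minimal ADO.NET sketch; deleteCommand and expectedCount are assumed to be set up elsewhere:
    C#
    int rowsAffected = deleteCommand.ExecuteNonQuery(); // returns the number of rows affected
    if (rowsAffected != expectedCount)
    {
        // the DELETE did not remove the expected rows: log it or raise an error
        throw new InvalidOperationException(
            string.Format("Expected {0} row(s) deleted, got {1}.", expectedCount, rowsAffected));
    }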
     
  11. It is always preferable to validate the data before performing any CRUD operation, in order to avoid unnecessary DB round trips.
     
  12. Use of functions in WHERE clauses: Avoid applying functions to a table column inside the WHERE clause. Because the function has to be evaluated at runtime for every row, the database engine cannot use an index on that column and ends up scanning the entire table to get the results.
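
    The same idea applies when the WHERE clause is generated from LINQ; db.Orders and its OrderDate column are assumed:
    C#
    // Non-sargable: the computation on the column is translated into a function call
    // (e.g. DATEPART) applied to every row, so an index on OrderDate cannot be used.
    var slow = db.Orders.Where(o => o.OrderDate.Year == 2013);

    // Sargable: compare the raw column against precomputed bounds so an index seek is possible.
    var start = new DateTime(2013, 1, 1);
    var end = new DateTime(2014, 1, 1);
    var fast = db.Orders.Where(o => o.OrderDate >= start && o.OrderDate < end);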
     
  13. Wherever possible, supply the primary key in the WHERE clause. This is especially useful when a single record is expected as output.
     
  14. Avoid using a leading wildcard with the LIKE clause (for example, LIKE '%term'). It prevents the DB engine from using an index during the search, which may lead to poor performance.
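
    In LINQ terms the same distinction looks like this; db.Customers and its Name column are assumed:
    C#
    // StartsWith typically translates to LIKE 'Smi%' and can still use an index on Name.
    var canSeek = db.Customers.Where(c => c.Name.StartsWith("Smi"));

    // Contains translates to LIKE '%Smi%', so the engine usually has to scan the whole table.
    var mustScan = db.Customers.Where(c => c.Name.Contains("Smi"));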
     
  15. Avoid multi-purpose tables: Always refrain from using tables that serve several purposes, and use normalization to get rid of them. It will save us from unwanted complexity and confusion.
     
  16. Avoid saving the same data across multiple tables, and make sure that every column in a table relates to the table's key. In short, make proper use of normalization while designing databases.
     
  17. Avoid burdening tables with a huge number of columns, although there is no hard and fast rule on the number of columns.
     
  18. Avoid SELECT N+1: loading a list of parent entities and then issuing one additional query per parent to fetch its children.
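
    A sketch of the problem and one common fix, eager loading; db.Orders and the OrderDetails navigation property are assumed, and Include is the Entity Framework string overload:
    C#
    // N+1: one query for the orders, then one extra query per order for its details.
    foreach (var order in db.Orders.ToList())
    {
        Console.WriteLine(order.OrderDetails.Count); // lazy load fires on every iteration
    }

    // Better: eager-load the children so everything comes back in a single round trip.
    var orders = db.Orders.Include("OrderDetails").ToList();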
     
  19. Wherever possible, try to pass stored procedure parameters in their declared sequence/position when calling. This can speed up SP execution.
     
  20. Use triggers only for enforcing data integrity and processing business rules, not for returning information, as triggers are very costly in terms of performance.
     
  21. Always follow the best practice of checking the global variable @@ERROR after each data-modification statement in your T-SQL code.
     
  22. It is better to use uncorrelated subqueries rather than correlated subqueries wherever possible. That way, the inner SELECT query runs only once, regardless of the number of rows in the outer query.
     
  23. Limit the size of the EDMX file: Although EF is smart enough to handle large Entity Data Models (EDMs), this comes with an associated cost, which you will notice when dealing with highly interconnected data models. If the schema file is large, the time it takes to parse it and create the in-memory model for its metadata will also be high, so a model that is too large should be broken up into multiple .edmx files. Decide this factor very carefully, though, because creating multiple .edmx files can sometimes lead to poor performance as well: whenever a change is made in the DB, each of these .edmx files needs to be synchronized.
     
  24. Create foreign key constraints for every foreign key relationship used: If you do not have foreign key constraints defined, define them before attempting to implement any ORM tool.
     
  25. Avoid using 'smart' columns: A smart column contains data that needs to be parsed, for example XML. Storing XML in a plain text column doesn't make much sense, as there is nothing to ensure that the content is valid XML conforming to the correct grammar. So here I would suggest going for the XML data type to store XML in the DB.
     
  26. One more, and the most IMPORTANT, thing: use profilers. It is always good to run a profiler periodically to check the performance of an application.

I hope the above tips will be useful.

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


