Hi All,

Our company is about to develop a product (Web application).

Scenario / Requirement:

The application will be used by multiple concurrent users (approximately 50 to 70). Its basic feature is search across different entities (name, place, organization, etc.), and these entities are interlinked.

Issue / Challenge:

To get better performance and fewer DB interactions, we plan to store these entity objects in server RAM (in memory) and update them periodically. We have not been able to finalize an approach for this. Some of the options we are considering:

1) ASP.NET data caching (see the sketch after this list)

2) In-memory databases, e.g. http://www.mongodb.com/what-is-mongodb

3) Something else.
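
For option 1, here is a minimal sketch of ASP.NET data caching against HttpRuntime.Cache. The Entity shape, SearchRepository.LoadEntities() and the 10-minute expiration are hypothetical placeholders for illustration, not part of the question:

using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public class Entity
{
    public string Name;
    public string Place;
    public string Organization;
}

public static class SearchRepository
{
    // Hypothetical stub; the real implementation would query the database.
    public static IList<Entity> LoadEntities()
    {
        return new List<Entity>();
    }
}

public static class EntityCache
{
    private const string CacheKey = "SearchEntities"; // illustrative key

    public static IList<Entity> GetEntities()
    {
        // Serve from server RAM when possible to avoid a DB round trip.
        var entities = HttpRuntime.Cache[CacheKey] as IList<Entity>;
        if (entities == null)
        {
            entities = SearchRepository.LoadEntities();
            // Absolute expiration gives the "periodically update" behaviour:
            // after 10 minutes the entry drops out and is reloaded on next use.
            HttpRuntime.Cache.Insert(CacheKey, entities,
                null,                            // no cache dependency
                DateTime.UtcNow.AddMinutes(10),  // absolute expiration
                Cache.NoSlidingExpiration);
        }
        return entities;
    }
}

HttpRuntime.Cache is thread-safe for reads and inserts; two concurrent cache misses may load the data twice, which is usually acceptable at 50-70 users.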



Kindly provide your inputs on this.

Any thoughts / comments are highly appreciated.



Thanks in advance

1 solution

I believe that with 50-70 concurrent users the database will be relatively small, so an in-memory option is realistic and can give by far the best performance.
Both document databases (like MongoDB) and SQL Server (2014 only, via its In-Memory OLTP feature) can map tables into memory and speed up access by roughly 30-fold.
However, if your usage prediction turns out to be wrong, you can be in deep trouble when you try to apply the new design.
As a second-best solution, you can combine a high-speed SSD drive with a large memory cache; both SQL and document databases support caching, of course.
Picking the right type of database depends on the data you intend to collect for your search. You have to analyse that data and decide whether it is purely table-oriented or fits the document model better.
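
To make that last point concrete, here is a hedged sketch (all names are hypothetical) of the same interlinked entity modelled both ways. If a search hit naturally embeds its linked entities, the document shape fits; if entities are shared and joined by keys, the table shape fits:

using System.Collections.Generic;

// Document-oriented shape (e.g. MongoDB): linked entities are embedded,
// so a single read returns a complete search hit.
public class PersonDocument
{
    public string Name;
    public string Place;
    public List<string> Organizations;
}

// Table-oriented shape (e.g. SQL Server): links are foreign keys resolved
// by joins; better when entities are shared and updated independently.
public class PersonRow
{
    public int Id;
    public string Name;
    public int PlaceId;
    public List<int> OrganizationIds; // in SQL this would be a junction table
}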