The Lounge is rated PG. If you're about to post something you wouldn't want your
kid sister to read then don't post it. No flame wars, no abusive conduct, no programming
questions and please don't post ads.
As the others have said: buy new. Upgrading RAM is a PITA: you normally find that the old RAM is unsuitable to run with the new, so you have to chuck that. And then, does the motherboard take the amount you want? And then there's the processor...and the hard disk...and the video card...and an SSD would be nice...and hang on, does the PSU supply enough power? You end up paying more for the upgrade than you would have for the new system most of the time, particularly when you are trying to catch up 5 years!
The only drawback is that a new system is likely to come with Win8 - but you may be able to pay a little extra for an upgrade to Win7...
Those who fail to learn history are doomed to repeat it. --- George Santayana (December 16, 1863 – September 26, 1952) Those who fail to clear history are doomed to explain it. --- OriginalGriff (February 24, 1959 – ∞)
A 5 year old box is past the point where substantial upgrades make sense; time to replace it entirely. You're probably only a year or two from starting to see component failures that will junk it. No sense throwing good money after bad now.
If you're planning to keep your system more than a few years and use it for either gaming or dev work, I'd suggest going all the way to 16GB of ram. Windows will use it as disk cache now for a minor speedup and as your software continues its inevitable bloating it'll protect you from swapping. DDR4 is expected to start appearing in high end systems late this year and in mainstream ones sometime next year, so the DDR3 the box you build now will use isn't going to get much cheaper in the future; easier just to buy the extra ram now and not worry about it.
In the short term, an i7 probably doesn't offer a whole lot. Relatively few games can take advantage of more than 4 cores (although with the PS4/XBOne both having 8-core CPUs this may change in a few years); and unless you're building really large applications you won't be able to parallelize the build enough to benefit much from 8 threads. If you can afford it, I'd still lean toward an i7; it's only $100 on top of a $1k+ system, just for future-proofing. But if you need to cut costs for wife approval, this is a relatively safe place to do so.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, waging all things in the balance of reason? Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful? --Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies. -- Sarah Hoyt
My personal recommendation to my customers is that if the machine was originally running Windows XP, you may as well buy new. Newer hardware generally has different, and often greater, ventilation requirements.
If you want more than about 3 GB of usable memory (the practical limit of 32-bit Windows) then you will need to go to 64-bit. There are also certain protections built into the 64-bit version that are not in the 32-bit version and can help protect the system.
As far as the processor goes, you have to look very closely at the part numbers to decide what you need and want. Neither Intel nor AMD makes it easy to break through the hype on what is what. I even tried to contact Intel through their partner chat.
From some of the research I have done while looking for a new laptop: if you are going to be multitasking, you may want to go with an i7. Again, look at the part numbers; they are not all the same, and most new systems put the less effective ones in. Take note of the wattage too, or "Max TDP".
I would also say that buying a new machine is a better bet. You can give the old one to someone else in your family who has less intensive needs, use it for a backup file server or just keep it as a reserve, which you can't do if you gut it, and it is likely to be of a similar cost if not cheaper.
I play Planetside 2 which is fairly intensive, and the community view there is that a high end i5 is as effective as an i7, because the game engine isn't fully capable of using 4 cores to the max. That's probably true for other modern games too. A large part of the game experience is dependent on the GPU, extra money is probably better spent there.
But when a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government, and provide new Guards for their future security.
Anything that is unrelated to elephants is irrelephant Anonymous ----- The problem with quotes on the internet is that you can never tell if they're genuine Winston Churchill, 1944 ----- I'd just like a chance to prove that money can't make me happy. Me, all the time
Today I came across a scenario where we had a discussion on this topic.
We have some work on the database side where data from batch files is inserted (the row count is several thousand per operation), some text files are generated by Windows services, flag statuses are maintained based on the data inserted per operation, and many other small operations happen during this process. Obviously correctness of the operation is a must, but at the same time the performance of the operation has come into the picture. Some teammates say that because a lot of processing happens per operation, it will simply take time to complete, while others believe something can be done to reduce the time. The client is okay even if it takes time, as for him accuracy is more important than the time taken. How do you manage such scenarios? Beyond a certain level we cannot optimize the queries further, as everything that is written is necessary. Has anyone faced something like this? How do you manage things to improve the performance?
The client is okay even if it takes time, as for him accuracy is more important than the time taken.
So it appears that from the client's point of view there is not a problem with 'performance' - so who is pushing for 'performance'?
Beyond a certain level we cannot optimize the queries further, as everything that is written is necessary.
There are always tweaks one can make to speed things up - proper analysis with regards to indexes, table structure, partitioning, staging the data from the batch files and indexing those staged tables, improvements at the hardware level...
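To make the staging idea concrete, here is a minimal sketch in Python using the standard-library sqlite3 module; the table and column names (`staging_orders`, `status`) are hypothetical stand-ins for whatever the batch files contain. Loading the raw batch into an unindexed staging table in a single transaction, and building the index only afterwards, usually beats row-by-row inserts into a fully indexed table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table: load the raw batch rows first, without indexes,
# so the bulk insert is not slowed down by index maintenance.
cur.execute("CREATE TABLE staging_orders (order_id INTEGER, status TEXT)")

rows = [(i, "NEW") for i in range(5000)]  # stand-in for a parsed batch file

# executemany() inside one transaction is far faster than committing
# each row individually.
with conn:
    cur.executemany("INSERT INTO staging_orders VALUES (?, ?)", rows)

# Build the index only after the bulk load is complete.
cur.execute("CREATE INDEX idx_staging_status ON staging_orders(status)")

count = cur.execute("SELECT COUNT(*) FROM staging_orders").fetchone()[0]
print(count)  # 5000
```

The same pattern (bulk-load into a bare staging area, then index, then merge into the real tables) applies to most server databases, though the exact bulk-load command differs per engine.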
You have not given enough information to be able to answer the question.
“That which can be asserted without evidence, can be dismissed without evidence.”
At a more theoretical level: split your operation into smaller chunks (it seems you have done this already, but see if there is more to split), and examine every step on its own. Also consider asynchronous handling. Sometimes you do not need all the data in real time, so you may use multiple (synchronized) databases, one for writing and one for reading operations...and many more things, most of which depend on your specific solution...
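As one possible illustration of chunking plus asynchronous handling, here is a sketch with Python's standard queue and threading modules; the chunk size and the in-memory "write" are placeholders for a real database call:

```python
import queue
import threading

def chunked(items, size):
    """Split a large batch into smaller chunks."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

work = queue.Queue()
results = []

def writer():
    # Simulates the asynchronous write side: chunks are persisted in the
    # background while the producer keeps reading input.
    while True:
        chunk = work.get()
        if chunk is None:          # sentinel: no more work
            break
        results.extend(chunk)      # stand-in for an executemany() call

t = threading.Thread(target=writer)
t.start()

batch = list(range(10_000))        # stand-in for rows parsed from a batch file
for chunk in chunked(batch, 1000): # hand off in 1,000-row chunks
    work.put(chunk)

work.put(None)                     # signal completion
t.join()
print(len(results))  # 10000
```

Because there is a single writer consuming a FIFO queue, the insert order is preserved, which matters if later steps (like the flag-status updates in the question) depend on it.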
I'm not questioning your powers of observation; I'm merely remarking upon the paradox of asking a masked man who he is. (V)
We have some work on the database side where data from batch files is inserted
So is it done by a scheduled night job or something similar? If it is, is there a risk that other things might take too long or not get their data (other night jobs that need data from the previous job, etc.)?
If it is a separate night job with no other dependencies (and it doesn't take a very long time, say more than 3-6 hours), leave it be; it's better to focus on accuracy. But it all comes down to what is done and what the import achieves.
If a user presses a button to start the job and expects the outcome fairly rapidly, then 3-6 hours is way too long and you need to work on both accuracy and performance.
Basically you can answer this question yourself, by asking yourself a bunch of questions:
Is the process a critical part of the application? Depending on how critical the process is, you can do several things to optimize performance. For example, if the process isn't that important to the basic operation, then you can try to push the process in the background, i.e. give it a lower priority.
On the other hand, if it is a critical operation, you have to find other means to optimize the process. And that can be very hard, especially if the code has to keep its accuracy. In that case you have to decide whether the work required to optimize the code is worth your client's money, because that is what it comes down to.
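A minimal sketch of pushing a non-critical step into the background, using Python's concurrent.futures; `generate_report` is a hypothetical stand-in for a lower-priority step such as the text-file generation mentioned in the question:

```python
from concurrent.futures import ThreadPoolExecutor

def generate_report(data):
    # Stand-in for a non-critical step (e.g. writing the text files
    # produced by the Windows services in the original post).
    return f"report over {len(data)} rows"

data = list(range(100))

with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(generate_report, data)  # runs in the background
    # The critical insert path continues immediately here, without
    # waiting for the report...
    inserted = len(data)
    report = future.result()  # collect the background result when needed

print(inserted, report)
```

The critical path stays synchronous and accurate; only work whose timing the client doesn't care about is deferred.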
Are there opportunities for optimizations/micro-optimizations in several parts of the code? This will probably cause a heated discussion, but micro-optimizations do work if there are enough opportunities, i.e. benefit > time. The basic rule here is that roughly +5% performance or more should be worth the time. In reality those figures come down to a 2-3% gain, but depending on how long your process takes, the total benefit can be huge.
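A small illustration of the "measure before you micro-optimize" point, in Python; the per-row work and data are hypothetical stand-ins:

```python
import timeit

# A toy micro-optimization: hoisting a repeated method lookup out of a
# hot loop. Both versions must produce identical results.
rows = [{"status": "NEW"} for _ in range(1000)]

def naive():
    out = []
    for r in rows:
        out.append(r["status"].lower())   # lookup repeated every iteration
    return out

def hoisted():
    lower = str.lower                     # bind the method once
    return [lower(r["status"]) for r in rows]

assert naive() == hoisted()               # correctness must not change

# Always measure: if the gain is only a couple of percent, it is rarely
# worth the review and maintenance cost unless the process runs for hours.
t_naive = timeit.timeit(naive, number=100)
t_hoisted = timeit.timeit(hoisted, number=100)
print(f"naive={t_naive:.4f}s hoisted={t_hoisted:.4f}s")
```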
Can a technology change improve performance? This is a big one, a very big one. Before you even consider switching to a different technology, you have to do the math. Is it even possible to migrate the "old" storage to the "new" one? How much time, and in the end money, does it cost? Is the benefit high enough? And so on, and so forth.
This is a step I usually don't recommend, but it can pay off in the long run.
Is that what the client wants? In the end, the client has the last word. If he is satisfied with the performance of the application and doesn't want any optimizations, then the situation is very clear, even if other people (colleagues) think otherwise.