At my last place, the head of IT took zero notice, when filling out the inventory, that my machine was 6+ years old and running Win7, or that the Dell model was not Win10 supported; it takes only a moment to check that the graphics chipset had no Win10 support.
So what did they do? Force a Win10 upgrade overnight, because it was listed as Win7. Not the result promised months before, which was that it would be replaced with a new machine.
Come in. Oh, looks different. Only one monitor working, odd?
Install driver; driver crashes. Reverting to the Windows basic display driver.
One monitor helped me focus. Good thing my job had no need for a dedicated graphics processor.
Okay, well, I had work to crack on with.
Getting a new development machine was not on any list of requirements when I was changing companies, but looking down at an XPS 15 laptop, with 16 GB of RAM and all the CPU cores, does help point out the slivers of time saved by speeding past.
Especially when you add in that Windows 10 complains about reboots once a week, whereas my Win7 box could power through for two months before I even considered rebooting for updates.
My current employer-supplied rig is only 6 months old. The "main" personal rig is about to turn 4 years old.
I have a Dell Inspiron 7000 (circa 1999), later upgraded to 7500 guts, that I kept on hand for an old .NET 1.0 app. The client wasn't interested in migrating/updating the app at the time. For legacy .NET 1.0/2.0 apps, it was easier to just keep VS2005 on it instead of going the VM route. Up until about 3 years ago, it was also a QA/file server, as I had to maintain an app that needed XP compatibility. Now, thankfully, ALL of those old client apps have been buried, with the clients finally moving past Windows XP.
About the only thing that unit is used for today is the occasional bout of nostalgia when I want to pull out the library of Iomega ZIP 100 disks; it was THE original source-code repo backup, as I never did get an optical writer. It's also the only way I can still access those ZIP disks.
Dell Precision T5810 - 32 GB RAM, eight 3.5 GHz cores - Win7 (by general developer demand). They bought four. I don't feel so special.
It has more power than I'll need while I continue to work for them. Only a couple of years old.
My personal pick would have been more conventional - an i7 gamer box (using the then comparatively new DDR4 for ease of future compatibility). The view was that, since brute power was irrelevant at this point, it would be easier to maintain, repair, and upgrade. They all nodded "Yes - good thinking" and then went and ignored it.
They now have me working on web development (vs. the original C++) - so what do I get with all that power and speed? Pretty much the same performance as a $200 Dell refurb! The only place I really make any use of it at all is photo editing - which runs without a pause. I do that "around here", but really, the less expensive and more serviceable i7 gamer would probably be better with the video.
My view was to get something with components that would be easy to maintain (repairs, upgrades) for a long time. No special memory, etc.
Where all that horsepower could have done some good is as a server.
Fortunately not mine, but another division here purchased new stations for their group. The person in charge decided to save a couple hundred dollars on each machine by getting the i7-5000 series (circa mid-2014) instead of i5-8000 processors.
I know what you're saying. An 8th gen i5 will run circles around those 5th gen i7 ones.
I have a feeling that companies like Intel are banking on the ignorance of common people. People who don't know any better will think "i7 is better than i5" without looking at the specs, so Intel gets to sell more of the old sh*t, so why not milk that i7 name.
I think that an unconditional "A is better than B" is an overly simplistic approach.
I bought a machine around the time the first i7 arrived. It was the first x86 to have three high-performance memory channels, and I was working with tools that were shuffling tons of data around in memory but were not very CPU intensive, so e.g. clock speed had little effect on performance.
Later, I have been on projects with data that could almost fit in cache, but requiring a lot of CPU work, so clock speed and microarchitecture turned out to be essential; memory channels were "almost idle".
Software architecture and peripheral software may have a great impact. The SketchUp 3D drawing system came in a new version that delegated lots of the graphics processing to your advanced graphics card - if you had one. I bought one, and drawing times were reduced to a fraction, while CPU load dropped to a fraction. (The previous SU version made very little use of the graphics card's functionality; it didn't matter much whether you had a top-rate or an El Cheapo card.)
Looking back at my old university: they ran a Cray-1 for a number of years. It was replaced not to get more CPU power, but to get more I/O capacity. When running e.g. weather forecasts, even 20 DMA channels were a bottleneck.
You may have other tradeoffs. What is the cost of interrupt handling? That depends on how much you depend on interrupts. Can the CPU support a chipset that provides the I/O-standards that you need? Can the CPU support the amount of physical RAM that you need? Is the power consumption low enough to run the machine on a battery, or does it require water cooling?
Lots of such variations. For one specific application you may certainly conclude that "A is better than B", but not as a general unqualified statement.
Not necessarily. I've still got a 25+ year old Win 3.11 PC sitting in my home office, for the reason that it has a 5.25" floppy drive. Or rather, the last time I turned it on was because a friend of mine wanted me to check whether his old floppies were still readable (they were!). I've got a couple of other rather outdated interfaces in it as well, such as a MIDI interface card, an interface card for a cassette tape unit, and one for a hand scanner. The cards are for old bus systems, so I can't move them to a newer machine. (For the MIDI, I could of course buy a brand new USB-based MIDI interface, but not for the other two.)
I am not doing any development on this machine, but if you work as a consultant, you should be prepared for the strangest requests. If I were offered a six man-month project adding new functions to a 16-bit Windows program system (that is entirely possible, even today!), I guess I would take it. Then this Win 3.11 machine would be my development environment (it has a C++ compiler and debugger - though I have forgotten the debugger's name!), and for a period my main development machine would be 25+ years old.
I haven't had such a project, but about fifteen years ago a fellow came with a pile of 8" floppies (I guess most of you have never seen a real-life 8" floppy!). "I think these may contain some essential data, but I don't know the format...". He couldn't tell what kind of machine or software had created them, their age (except that they were old!), nothing. He gave us a cost limit (which turned out to be high enough to get his data out).
At that time, we had an old mini machine (not a PDP-11, but the same class, from around 1980) with an 8" drive and a good selection of drivers for various track/sector formats, and I managed to get a binary dump of the floppies. Then I could start poking around. The blocks looked like line noise, so I suspected they might be encrypted in some way. In those days, some people were still using primitive encryption (like XORing or code shifting). Octet values were indeed unevenly distributed. By plain luck I came across an EBCDIC table and saw that the two most common octet values were the EBCDIC codes for 'e' and 't'. Most of the contents turned out to be plain text in EBCDIC coding. The customer confirmed that the floppies might come from an old IBM system.
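That frequency check is easy to reproduce today. Here's a minimal sketch in Python, assuming cp037 (the US EBCDIC variant, which may differ from whatever the IBM system actually used); the function name `looks_like_ebcdic` is just illustrative, not anything from the original project:

```python
from collections import Counter

# EBCDIC (cp037) code points for two of the most frequent English letters.
EBCDIC_E = 0x85  # 'e'
EBCDIC_T = 0xA3  # 't'

def looks_like_ebcdic(blob: bytes) -> bool:
    """Crude heuristic: are the EBCDIC codes for 'e' and 't'
    among the most common byte values in the dump?"""
    common = [byte for byte, _ in Counter(blob).most_common(5)]
    return EBCDIC_E in common and EBCDIC_T in common

# Simulate a raw dump of EBCDIC-encoded text.
dump = "the data set they sent over".encode("cp037")
if looks_like_ebcdic(dump):
    print(dump.decode("cp037"))  # recover the plain text
```

On a real dump you would run the histogram first, then try decoding a block with each EBCDIC variant and eyeball the result, exactly as the manual table lookup did back then.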
For this project, we at least made use of a 20+ year old machine to dump the floppies to a hard drive. Not quite "software development", but it illustrates that some projects may call for that kind of equipment.