|
Even a fairly old video card - I use an NVIDIA GTX 660 Ti - will support at least two displays (but it's worth checking what inputs your monitors will accept). You can't "split" a single HDMI output into two different images, only duplicate the same picture on both.
Trust me, multiple monitors make things so much easier: just having one for the internet and one for VS while coding (or one for VS and one for your app while debugging) is worth its weight in gold.
I run three now: internet (Portrait), VS (Landscape) and a spare for Email & everything else!
Go for it: get a cheap two- or three-output card, and disable the onboard graphics. It'll free up your system RAM and accelerate Windows display work at the same time.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
I wouldn't be without a dual display for work. As others have said, the productivity gains are significant.
The point of this post is to put in a word for the multiple desktops feature in Windows. I rely on this heavily and have been up to a virtual 8 or 10 monitors with it - two IDEs open with a lot of stuff in each, best spread over two monitors per IDE, plus documentation, browser windows, email, and discreet sanity-preserving displacement activities that shouldn't be left visible to passers-by...
If it weren't for this feature, I'd be looking at more and bigger monitors and would still find them constraining on occasion.
Once the extra desktops have been created (via the Task View icon, which for me is to the right of the Start menu icon, lower left), panning right and left through the desktop array is [Ctrl]+[Windows key]+[Right cursor] or +[Left cursor].
|
|
|
|
|
While I like the desktop feature, these kids don't seem to understand the Rubik's Cube variant...
I want to use my multiple monitors, with desktops, and organize them in 3D! So I can FLIP UP/DOWN as well as left/right.
This is like workbooks of related tools.
The other frustration is that I have a nice VM setup, but when it is full screen, it captures the keystrokes I want to apply outside of it. [I ended up making a keyboard macro that triggers the right Ctrl key, then triggers the desktop change, to escape out to the parent OS]...
For my large monitor, I love using AquaSnap Pro for repositioning and resizing my windows as though I have 4 monitors, when I just have a HUGE 4K monitor.
|
|
|
|
|
Two DisplayPorts if you want to be a bit more future-proof.
One DisplayPort should be enough since you can daisy-chain monitors from it, but not many monitors support that, so you'd need to check that out.
|
|
|
|
|
My choice exactly! And if you buy a new computer, make sure it has an NVMe M.2 SSD, not one of the older SATA SSDs. NVMe drives are significantly faster: no SATA link in the way, they plug directly into the PCIe bus.
Get me coffee and no one gets hurt!
|
|
|
|
|
I agree about using DisplayPort. You'll almost certainly get better resolution support out of it than HDMI.
|
|
|
|
|
For as long as Windows has supported multiple displays, people have generally recommended against mixing video chipsets from different manufacturers, which would mean disabling the onboard video and instead using a video card that supports two independent output ports. In reality--and I can only speak for myself--I've personally never experienced any display problem that I could attribute to mixing video cards from different manufacturers.
Maybe I've just been lucky, but I can definitely see how there could be room for strange behavior when using video drivers and add-on display software from different manufacturers. YMMV, but IMO if you're going to get a card anyway to provide a second display, then it would make sense to get a card that can provide two outputs and then just disable the built-in one.
Another possibility (not necessarily as cheap as a low-end video card): there are USB-to-VGA adapters that act as a video card and let you send video to a monitor through a USB port if, for some strange reason, adding a video card is not an option. You can chain them with no problem - at one point, just for the hell of trying it out, I had 3 of them hooked up to a system that already had 2 "regular" displays, for a total of 5 monitors. But I will point out that the key here is to use a USB 3 port - when testing the configuration I mentioned, I could have 5 different HD videos playing full-screen, independently of each other, without a stutter. Full-screen HD video over USB 2 did NOT work as smoothly; even just moving a single window on a USB 2 port showed the image getting clipped as it was being dragged.
And TBH that was years ago, when HDMI still wasn't all that common as a PC connector. I can't imagine there wouldn't be equivalent USB-to-HDMI adapters nowadays.
[Edit]
...and sure enough, they exist, and they're even cheaper (CAD$25) than what I paid for the USB-to-VGA adapters I experimented with.
|
|
|
|
|
dandy72 wrote: For as long as Windows has supported multiple displays, people have generally recommended against mixing video chipsets from different manufacturers, which would mean disabling the onboard video and instead using a video card that supports two independent output ports. In reality--and I can only speak for myself--I've personally never experienced any display problem that I could attribute to mixing video cards from different manufacturers.
If you go back far enough (Win95? NT4????) you actually had to use different brands, because something in the driver model choked on two instances of the same GPU.
I ran into NVidia vs. ATI driver problems with XP (Vista?) years ago. I was running 3 monitors in an era when GPUs only had 2 outputs; when I upgraded my main card from NVidia to ATI, I ended up having to buy a second cheap ATI card because I could never get my main card and the low-end NVidia ones to play nicely. On more modern systems I've never had an issue mixing Intel with AMD/NVidia/USB; I've never tried mixing AMD and NVidia, though. (My current systems are Intel, and since cards started offering more outputs I haven't needed to double up just for that.)
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
|
|
|
|
|
No doubt the problems existed; they were being reported for a reason. I was just saying I was fortunate enough not to have run across them in any of my own little experiments.
|
|
|
|
|
What OS?
Without waxing too descriptive, then: get a PCIe video card, but know that purchasing one will require some homework on your part, mainly brushing up on the types of connections you'll be supporting once you decide how much you're going to spend and why.
Why "what OS?" ... because the blue screen of death has evolved recently.
And with that warning, hang on to your old DB-15 (VGA) monitor for a while - or at least until you've done everything in your power to resist the nostalgia of sticking with an old operating system just because your hardware seems to run optimally on it.
I'll end this here.
But remember: the onboard VGA connection CAN BE USURPED by your BIOS.
modified 4-Feb-21 16:44pm.
|
|
|
|
|
I've run dual monitors for many, many years and I now find using a computer with a single monitor very restrictive.
In my case I've always used a PCIe graphics card with multiple outputs and that has worked well for me.
I'm considering going to three monitors (two for primary work area, one for systems monitoring, tv, or debug output, depending on what else I'm doing) and I'll probably get another graphics card to do it.
This is currently on Windows 10 (and worked the same on previous Windows versions).
|
|
|
|
|
Personally, I don't like any setup with fewer than two monitors. I guess I'm spoiled that way. I have had 3 monitors in the past for work, but the third one was normally just dedicated to keeping an eye on my email.
One of the fellows I worked with had 5 monitors. I think he was using them to simulate a complete environment, via virtual machines, for testing and development purposes.
Another guy set up a complete ring of monitors to turn MS Flight Simulator into a 360-degree simulation in his basement. He ended up getting his pilot's license in record time.
INTP
"Program testing can be used to show the presence of bugs, but never to show their absence." - Edsger Dijkstra
"I have never been lost, but I will admit to being confused for several weeks. " - Daniel Boone
|
|
|
|
|
I agree with those before me: extra monitors are a significant improvement in efficiency, and after years of using them I can hardly get any work done on a single monitor. Sometimes when I travel I bring extra-long DisplayPort and HDMI cables with me so I can use the TV in the room as a second monitor.
I would suggest a laptop with a dock and at least two external monitors; skip the 1920x1080 FHD versions and go directly to 2K monitors.
Consider the Dell XPS or the Precision mobile workstations. The latter can be had inexpensively, as companies near tech centers don't hold on to them long - they upgrade to get the latest and greatest every few years.
|
|
|
|
|
Just an odd thought to throw into the mix: working remotely when you have two displays in the shop is a real pain. Instead of two displays, I upped the size of the display at work to allow space for an output area, and working remotely became painless. I have a coworker with a two-display setup at work, and he is constantly complaining - reminding me of what I'm not missing.
|
|
|
|
|
I'm not sure I'm following. Are you saying you have two displays at work, but prefer a single-display setup when working remotely?
|
|
|
|
|
Yes. The problem with two at work and one at home is that any time you move your mouse against the side where the second monitor is, the screen on your local display scrolls in that direction. If the mouse move was just a mouse move, you then have to scroll back to where you were working - a delay and a break in the train of thought.
|
|
|
|
|
Let me clarify a bit more. I ONLY have a single large display at home. Having dual displays at work caused the problem; replacing the two smallish displays with a single larger display at work solved the scrolling issue. I have no room at home for a second display, which would also have solved the problem. A 27" display was < $200 USD, so it was a solution my owner approved of.
|
|
|
|
|
Right... I despise working on a laptop because I'm used to multiple displays, so I find one display cramped to the point of being unusable... but I don't feel the solution is to stick with one display everywhere for the sake of a consistent experience no matter where I am. If that's what you're saying.
And are you saying 27" is "large"? What was the pair of "smallish" displays that got replaced at work?
|
|
|
|
|
I offered this as an alternative, not as the be-all and end-all. My dual displays were a 21" and an old 19", so 27" is large. The company I work for has 25 employees, so being the third developer means cast-offs.
I read the other responses and saw many bits of wisdom, and I offered this as an alternative that might be useful to you, not as an insult. What works for me might not work for you, but I thought you might consider it if it helped.
|
|
|
|
|
Well...size is not all that matters; resolution matters too.
I have a 40" 4K display as my primary monitor, and I'll make the claim that it's a better replacement than 4 1080p monitors (equivalent resolution) could provide. That being said, when I got it, I did not get rid of the other two 1080p monitors I already had.
|
|
|
|
|
Well, that depends on how you're doing the remote access.
For example, with MS's RDS, your displays are driven by the remote machine.
With Google's, the resolution and number of monitors are driven by the host machine.
Horses for courses.
|
|
|
|
|
Funny - it was only a couple of weeks ago that I found out none of my remote team in India and Bangladesh were using dual displays, so I had them all go out and buy large monitors so they could have two monitors - their laptop screen and their external monitor.
Then, 2 days ago, I was on a remote session with one of my guys and it looked like he was using only his laptop screen. He assured me that he was using his beautiful large monitor (32 inches), but the resolution looked terrible, and it was clear that he was using only one screen.
It turned out he had dutifully plugged in his monitor and then closed his laptop, using an external mouse and keyboard - forcing the external monitor to the same crummy resolution as the laptop. We went through the steps to show him how to use both simultaneously, which tripled his working desktop space: double resolution on the external monitor plus the original laptop screen. We put the debug window on the laptop, ran his program, and instantly found that his web system was making an extraneous SQL call for every action, plus a separate bug where a server call was being made when the data was already in memory.
I cannot overstate the benefit of being able to have a live console output/debug log for applications, especially web apps.
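To make the idea concrete, here's a minimal sketch of that kind of live SQL logging. This is not the actual system from the story - just an illustration using Python's built-in sqlite3 module and its `set_trace_callback` hook, with made-up table and column names. The same pattern (log every statement, count repeats) is how an extraneous per-action call jumps out of a debug console:

```python
import sqlite3
from collections import Counter

# Collect every SQL statement the app issues, so repeats stand out.
statements = []

conn = sqlite3.connect(":memory:")
conn.set_trace_callback(statements.append)  # fires once per executed statement

conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'dev')")

# Simulate the bug: the same lookup fired twice for a single user action.
for _ in range(2):
    conn.execute("SELECT name FROM users WHERE id = 1")

# Any statement seen more than once per action is a candidate for the bug.
dupes = {sql: n for sql, n in Counter(statements).items() if n > 1}
print(dupes)
```

In a real web app you'd route the same kind of trace output to the console on the second screen instead of collecting it in a list, but the principle is identical.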
And for me, after using double/triple monitors for years, I don't care what the outputs are. VGA, DVI, HDMI, DisplayPort - my primary tasks are writing and coding, so speed is of very little concern to me. Once I needed a dual-link DVI cable to get better resolution on my ultra-high-res monitor, but that was it.
<hr>
"Qulatiy is Job #1"
|
|
|
|
|
That's an "attaboy" post if I ever saw one.
Agreed wholeheartedly. I've been using at least dual displays for over a decade, and there's just no going back. A laptop is just something to get by with when I'm away from my desk. There's no way I could sit down and try to do something useful with one display.
|
|
|
|
|
David Carta wrote: I had them all go out and buy large monitors so they could have two monitors
Wow, your employees are very lucky. In my experience, it would take serious grovelling to get most employers to agree to budget for new hardware like that.
|
|
|
|
|
Probably a benefit of having a CEO who started the company as the primary developer.
IMO, hardware expenditures, even extravagant ones, are so minuscule compared to employee salaries that I generally don't dicker when an employee needs better hardware, whatever it might be: computer, RAM, SSD, monitors, software, etc. It makes the employee happy and usually more productive, which far outweighs the cost.
Even when some of your dev team is in relatively cheap places like India or Bangladesh!
<hr>
"Qulatiy is Job #1"
|
|
|
|
|