Help displaying on 2 monitors

RollieJoe Member UncommonPosts: 451

Hi, hopefully this is a fairly simple problem someone can help me with.  I want to display my PC on two monitors.  One is a 22" and one is a 47", but I want to display the exact same thing on both monitors - same resolution, same refresh rate, same everything.  Both monitors support 1920x1080 at 60 Hz, and I can easily hook my PC up to either monitor right now with no problem.

 

Further, my video card has multiple outputs, but here's the thing - I don't want my video card to have to work any harder, even a tiny bit harder, to display on both monitors.  So I need a solution that doesn't force the video card to do extra work.

 

With all that in mind, what's my cheapest option?  My video card has DVI-I, DVI-D, HDMI, and DisplayPort outputs.  My 22" monitor has a VGA and a DVI-D input.  My 47" monitor has a VGA and an HDMI input.

 

So to recap, I need a way to send the same signal from my computer's video card to two monitors:

1) without making the video card work any harder, not even a little bit

2) without degrading image quality - I need to be able to display on both monitors at 1920x1080, 60 Hz

3) as cheap as possible

 

Thanks! 

Comments

  • Quizzical Member LegendaryPosts: 25,348

    It depends on what hardware you have, as duplicating displays has to be done through video drivers.  If you dig around for a while through your driver package, you can probably find the option for it.
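
    For what it's worth, Windows itself can also toggle this without digging through vendor drivers. A minimal sketch, assuming Windows 7 or later (DisplaySwitch.exe ships with the OS, and pressing Win+P does the same thing interactively):

        # Switch Windows into "Duplicate" (clone) mode from a script.
        # Assumes Windows 7 or later; DisplaySwitch.exe ships with the OS.
        import subprocess

        # /clone mirrors the primary display onto the other connected displays;
        # /extend, /internal, and /external are the other accepted switches.
        subprocess.run(["DisplaySwitch.exe", "/clone"], check=True)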

    There's no way to make it so that the video card doesn't have to work any harder at all.  It can render an image once, but still has to send it to a monitor twice, and that's extra work.

  • RollieJoe Member UncommonPosts: 451


    Originally posted by Quizzical

    It depends on what hardware you have, as duplicating displays has to be done through video drivers.  If you dig around for a while through your driver package, you can probably find the option for it.

    There's no way to make it so that the video card doesn't have to work any harder at all.  It can render an image once, but still has to send it to a monitor twice, and that's extra work.

    Pretty sure all of this is incorrect. Duplicating displays should be possible with just the appropriate splitters/adapters, and since the card is only sending one signal, it shouldn't need to work any harder.

    To clarify, I wasn't really asking "is this possible" (again, I'm mostly certain that it is) but rather "what's the best/cheapest way to do this".

    At the moment I'm thinking of splitting my video card's DVI output with:

    http://www.amazon.com/Tripp-Lite-Splitter-1920x1200-B116-002A/dp/B003SG9IOC

    and then converting one of the now split DVI signals into HDMI with:

    http://www.amazon.com/AmazonBasics-HDMI-DVI-Adapter-Cable/dp/B001TH7T2U/ref=sr_1_1?s=electronics&ie=UTF8&qid=1424573241&sr=1-1&keywords=dvi+to+hdmi

    This seems like it would allow me to connect DVI to my 22" and HDMI to my 47", both in 1080p, with no quality loss and no extra strain on the video card. However, it's an ~$80 solution. Any other suggestions?

  • Quizzical Member LegendaryPosts: 25,348
    Originally posted by RollieJoe

     


    Originally posted by Quizzical

    It depends on what hardware you have, as duplicating displays has to be done through video drivers.  If you dig around for a while through your driver package, you can probably find the option for it.

    There's no way to make it so that the video card doesn't have to work any harder at all.  It can render an image once, but still has to send it to a monitor twice, and that's extra work.

     

    Pretty sure all of this is incorrect. Duplicating displays should be possible with just the appropriate splitters/adapters, and since the card is only sending one signal, it shouldn't need to work any harder.

    To clarify, I wasn't really asking "is this possible" (again, I'm mostly certain that it is) but rather "what's the best/cheapest way to do this".

    At the moment I'm thinking of splitting my video card's DVI output with:

    http://www.amazon.com/Tripp-Lite-Splitter-1920x1200-B116-002A/dp/B003SG9IOC

    and then converting one of the now split DVI signals into HDMI with:

    http://www.amazon.com/AmazonBasics-HDMI-DVI-Adapter-Cable/dp/B001TH7T2U/ref=sr_1_1?s=electronics&ie=UTF8&qid=1424573241&sr=1-1&keywords=dvi+to+hdmi

    This seems like it would allow me to connect DVI to my 22" and HDMI to my 47", both in 1080p, with no quality loss and no extra strain on the video card. However, it's an ~$80 solution. Any other suggestions?

    Why would you want to do that externally rather than just having your video card handle it?  I'm not sure if a video card with a duplicated signal can read the frame buffer once and send it to both monitors, or if it would have to read the frame buffer twice to send it to the monitors independently.  But even if it's the latter, that's only about an extra 500 MB/s of bandwidth--or about 0.2% of what a high end video card has available.  Extra strain on the monitor outputs isn't going to reduce your frame rate.
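
    The arithmetic behind that 500 MB/s figure, as a rough sanity check (the ~250 GB/s card figure is an assumption, roughly a GTX 980 / R9 290X class card of the era):

        # Back-of-envelope scanout bandwidth for one extra 1080p60 stream,
        # compared against an assumed ~250 GB/s of high-end memory bandwidth.
        width, height = 1920, 1080
        bytes_per_pixel = 4                # 32-bit frame buffer
        refresh_hz = 60

        scanout = width * height * bytes_per_pixel * refresh_hz  # bytes/second
        print(f"extra scanout: {scanout / 1e6:.0f} MB/s")        # ~498 MB/s

        card_bandwidth = 250e9             # assumed high-end card, circa 2015
        print(f"share of memory bandwidth: {scanout / card_bandwidth:.2%}")  # ~0.20%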

  • RollieJoe Member UncommonPosts: 451
    Originally posted by Quizzical

    Why would you want to do that externally rather than just having your video card handle it?  I'm not sure if a video card with a duplicated signal can read the frame buffer once and send it to both monitors, or if it would have to read the frame buffer twice to send it to the monitors independently.  But even if it's the latter, that's only about an extra 500 MB/s of bandwidth--or about 0.2% of what a high end video card has available.  Extra strain on the monitor outputs isn't going to reduce your frame rate.

    I've never hooked multiple monitors up to multiple outputs on one video card - is the extra strain on the video card really negligible?  I know that if the outputs are sending different signals, for example a dual-screen setup with a different image on each screen, it adds significant extra strain to the video card, effectively making it run at double the resolution.  But if you are just sending the same signal, is this not the case?

     

    If it's really just a fraction of a percent of extra strain, that does beat spending $80.  But I didn't want to sacrifice performance, since I'll be displaying to an audience on the larger screen.

  • Quizzical Member LegendaryPosts: 25,348
    Originally posted by RollieJoe
    Originally posted by Quizzical

    Why would you want to do that externally rather than just having your video card handle it?  I'm not sure if a video card with a duplicated signal can read the frame buffer once and send it to both monitors, or if it would have to read the frame buffer twice to send it to the monitors independently.  But even if it's the latter, that's only about an extra 500 MB/s of bandwidth--or about 0.2% of what a high end video card has available.  Extra strain on the monitor outputs isn't going to reduce your frame rate.

    I've never hooked multiple monitors up to multiple outputs on one video card - is the extra strain on the video card really negligible?  I know that if the outputs are sending different signals, for example a dual-screen setup with a different image on each screen, it adds significant extra strain to the video card, effectively making it run at double the resolution.  But if you are just sending the same signal, is this not the case?

     

    If it's really just a fraction of a percent of extra strain, that does beat spending $80.  But I didn't want to sacrifice performance, since I'll be displaying to an audience on the larger screen.

    It depends very heavily on what you're displaying.  If you're rendering two completely independent images, then that can double the load.  Rendering one image at double the resolution increases the load by quite a bit, but often a lot less than double.  If you're just sending exactly the same image to two different monitors, I'd be very surprised if video cards aren't smart enough to render the image once and send it twice.

    It also depends on what each monitor is showing.  Running two different and very demanding games on two different monitors is going to put a lot of load on the video card.  Displaying the desktop on six monitors isn't much more demanding than displaying it on only one.  Anything that video cards could do at all fifteen years ago is probably nearly free on modern gaming cards.

    It's reasonably common to have multiple monitors and play a game on one while the other(s) do something simple like showing a spreadsheet or browser.  That's something that GPU developers very much anticipated, and the extra monitor(s) don't hamper game performance much, so long as they're not doing GPU-intensive stuff.

    What are you trying to do, anyway?

  • ferdmert Member Posts: 10
    I've got 4 24-inch monitors running at 1920x1200 on one video card.  You are way overthinking this.  Not sure what video card you have, but anything from Nvidia or ATI can handle what you've got.  Hook the TV up with DisplayPort or HDMI, and your monitor with DVI or whichever one you didn't use for the TV.  The card will be just fine.
  • [Deleted User] CommonPosts: 0
    The user and all related content has been deleted.
  • Ridelynn Member EpicPosts: 7,383

    Just plug them both into your video card, and set the video card to mirror mode.

    It won't tax your video card any extra.

    Sure, it has to drive two displays, but that doesn't take GPU "work" power; it just takes the card driving the interface.

    Even extending your desktop, which technically does add extra strain, is entirely negligible on a discrete GPU - the desktop takes up so little resource/power on a modern card that the effect of extending it across any number of monitors you can physically plug into the card is difficult to even measure.

    Now, if you Eyefinity across those and actually use them for additional gaming space - then yeah, you've significantly added to your GPU load. But for driving a desktop, the memory your desktop background photograph takes up is the most significant additional resource used.
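
    If you'd rather measure than take that on faith, you can sample GPU load with one display connected and then again while mirroring. A rough sketch, assuming an NVIDIA card with nvidia-smi on the PATH (AMD would need a different tool):

        # Sample GPU utilization for ~10 seconds; run once on a single display
        # and once while mirroring, then compare the averages.
        import subprocess
        import time

        def gpu_utilization() -> int:
            """Current GPU utilization in percent, as reported by nvidia-smi."""
            out = subprocess.check_output([
                "nvidia-smi",
                "--query-gpu=utilization.gpu",
                "--format=csv,noheader,nounits",
            ])
            return int(out.decode().strip().splitlines()[0])

        samples = []
        for _ in range(10):
            samples.append(gpu_utilization())
            time.sleep(1)
        print(f"average GPU load: {sum(samples) / len(samples):.1f}%")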

  • Nightlite Member UncommonPosts: 227

    Let me put it this way...

     

    How much processing do you think that little Tripp Lite box is doing to split the signal to two displays? None - it's all hardware.

     

  • Ridelynn Member EpicPosts: 7,383


    Originally posted by Dren_Utogi
    Your graphics card needs 1 HDMI and 1 DVI out, or 2 HDMI outs.

    Right-click your desktop and open your graphics card's software. If it is ATI: click on Catalyst > Desktop Management > Creating and Arranging Desktops. Both displays should be present. Under the big box showing the display will be smaller boxes with the displays. Click the black arrow on the second display and choose "duplicate". This will duplicate monitor 1 onto monitor 2.

    For Nvidia, no clue.

    Essentially the same process on nVidia.
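
    And if you ever need to script it instead of clicking through a control panel (on Linux/X11, at least), xrandr can do the mirroring. A sketch where the output names DVI-I-1 and HDMI-1 are placeholders - run "xrandr -q" to see yours:

        # Mirror one output onto another on Linux/X11 via xrandr.
        # DVI-I-1 and HDMI-1 are placeholders; "xrandr -q" lists the real names.
        import subprocess

        subprocess.run(
            ["xrandr", "--output", "HDMI-1", "--same-as", "DVI-I-1"],
            check=True,
        )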
