I have a GeForce GTX 750 Ti running the 352.63 proprietary drivers. It works fine on my monitor over a DVI cable, and I have now added my HDTV to the video card over HDMI.
xrandr shows:
DVI-I-1 connected primary 1920x1080+0+0 (normal left inverted right x axis y axis) 531mm x 298mm
   1920x1080     60.0 +  144.0  120.0*   99.9
   1440x900     119.9
   1280x1024    120.0   75.0   60.0
   1024x768     120.0   75.0   60.0
   800x600      120.0   75.0   60.3
   640x480      120.0   75.0   59.9
HDMI-0 connected 1920x1080+1920+0 (normal left inverted right x axis y axis) 1124mm x 627mm
   1920x1080     60.0*+  59.9   50.0   30.0   25.0   24.0   60.1   60.0   50.0
   1680x1050     60.0
   1600x900      60.0
   1440x900      59.9
   1366x768      59.8
   1280x1024     75.0   60.0
   1280x800      59.8
   1280x720      60.0   59.9   50.0
   1152x864      75.0
   1024x768      75.0   70.1   60.0
   800x600       75.0   72.2   60.3
   720x576       50.0   50.1
   720x480       59.9   60.1
   640x480       75.0   72.8   59.9
When I open nvidia-settings, it shows both monitors and I can configure them fine, but after I hit Apply, nothing ever shows up on the second HDMI monitor.
Does anyone know the correct X Server display settings, or how to get the second monitor to show up?
Thanks, and let me know if you need more information.
How do I activate the HDMI output in nvidia-settings?
I forgot to note that the only thing I want to do is play Steam games on the HDMI television (the second monitor), so I was thinking it should just be a clone of the desktop monitor.
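If a terminal command is easier, I assume cloning would look something like this, going by the output names in the xrandr listing above (just my guess, untested on my end):

   xrandr --output HDMI-0 --mode 1920x1080 --same-as DVI-I-1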
If you have both displays listed in the X Server Display Configuration and you set them to clone, they should be active.
Did you also try extended mode instead of clone?
I was using an HDMI switch connected to my TV to share one input between the PlayStation 3 and my Netrunner computer, since I was out of HDMI inputs on the TV. I disconnected the switch and plugged the computer directly into the TV, and it works perfectly!
The product is a Fosman switch I bought on Amazon, which I guess doesn't pass the graphics card's signal through properly. Too bad. I'm wondering if there are any out there that would work, so I don't need to keep manually swapping the HDMI cables.