Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
21st January 2015, 16:05 | #3 | Link |
Registered User
Join Date: Mar 2011
Posts: 4,829
Was he referring to the video output levels or the "global" output levels?
"DVI for computers" and "HDMI for video" doesn't really work in that context as lots of video cards have HDMI out rather than DVI, while TVs have dedicated HDMI/PC inputs and computers are used to play video. Remicade, Is there a setting to change the video output levels in the Nvidia control panel under the Video section? Is it available for both HDMI and DVI? If so, do either work? You weren't very specific with your question in respect to whether you need help changing the levels or if either method produces the "correct" levels or whether you were referring only to the video levels. I've read lots of complaints in the past regarding HDMI always being limited range with Nvidia drivers, but I think it's only a problem if you're running a version of Windows newer than XP. Apparently the Nvidia drivers automatically select a "global output range" based on whether the chosen resolution is from the TV or PC list of resolutions in the Nvidia Control Panel. There's a thread here including a link to a utility for forcing full range video output if you need it and a post explaining how the "global output range" effects the video levels, although I'm not sure I've got my head around it. It seems like it might be a good idea. Connect at PC resolutions and refresh rates and Windows runs in full range mode as it's always done, or connect at TV resolutions and refresh rates and everything is scaled back to limited range.... Although there's not much of a line between them any more. This post says there's now a "global levels" option under Desktop Colour Settings in the control panel. That might effect the way the Video output range setting works but I don't really know. My old Nvidia card is dual VGA/DVI and by default they both output 16-235 for video (Windows XP). I've always thought it silly they don't expand video to full range by default, but at least it's pretty simple. 
Windows full range, video levels expanded to full range or not, some basic colour adjustments and that's about it. Last edited by hello_hello; 22nd January 2015 at 00:45. |
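For anyone wondering what "expanding the video levels" actually involves, here's a minimal sketch of the usual 16-235 to 0-255 conversion. Plain Python, purely illustrative; the function name is my own:

```python
def limited_to_full(y):
    """Expand an 8-bit limited-range luma value (16-235) to full range (0-255).

    16 maps to black (0) and 235 maps to white (255); anything outside
    the nominal range is clipped rather than preserved.
    """
    return max(0, min(255, round((y - 16) * 255 / 219)))

# Black and white reference levels:
#   limited_to_full(16)  -> 0
#   limited_to_full(235) -> 255
```

Going the other way (full to limited) is the same mapping in reverse, which is where the scaling and rounding mentioned above come in.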
21st January 2015, 17:46 | #4 | Link | |
Registered User
Join Date: Mar 2009
Location: Germany
Posts: 5,769
Quote:
That some TVs have DVI inputs and some monitors have HDMI inputs doesn't change this. I carried out a simple test: on Amazon there are currently 170 HDMI monitors for PC versus 1427 DVI ones (I ignored models that offer both; those must number fewer than 170 anyway). Similarly, the fact that a particular DVD player can play DivX doesn't change the fact that DivX was created for computers, not for TVs.
__________________
Born in the USB (not USA) |
21st January 2015, 19:34 | #5 | Link | |
Life's clearer in 4K UHD
Join Date: Jun 2003
Location: Notts, UK
Posts: 12,227
Quote:
Cheers
__________________
| I've been testing hardware media playback devices and software A/V encoders and decoders since 2001 | My Network Layout & A/V Gear |
21st January 2015, 19:35 | #6 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
The new Nvidia drivers offer range options for each display. It's amazing: after only a decade or two, it's finally possible to set the range correctly for each display simply using the driver control panel!
You do not want to use the setting SeeMoreDigital posted; that one only changes the video range. You can change the range for the entire display on the display properties page. Sorry, I'm not on an Nvidia computer at the moment. Last edited by Asmodian; 21st January 2015 at 19:37.
21st January 2015, 20:51 | #7 | Link |
Registered User
Join Date: Dec 2009
Location: Romania
Posts: 98
|
Well, right now I use an HDMI-HDMI connection. I have RGB and Dynamic Range Limited (16-235) in the Nvidia Control Panel and I cannot switch to 0-255; I hit Apply and the setting goes back to 16-235. The video card is connected to a Samsung 48H5030 LCD TV. I use madVR and I adjusted brightness and contrast with the AVS HD 709 patterns. If I switch to DVI-HDMI, will there be any benefit? I only watch movies on the TV; I don't play games or use it as a monitor.
21st January 2015, 21:25 | #8 | Link |
Angel of Night
Join Date: Nov 2004
Location: Tangled in the silks
Posts: 9,559
|
DVI-HDMI takes away all of the YUV options (and audio) and leaves you with only RGB. The system trying to force you to use YUV (which also subsamples colour to 4:2:0... check out black text on a red background) is Windows and the video driver being too clever for their own good, trying to force the display to be a TV instead of a monitor. There are tricks you can use to fix it, like hacking the monitor .inf file in the Windows drivers folder to input your own EDID, but that's a huge pain and only works sometimes. Using DVI makes all the pain go away.
Of course, if you're using it as a TV and not a monitor, you don't need very crisp text or the most accurate colour, so it probably won't matter to you either way. The fact that it is a TV might mean overscan is applied on DVI, so you'll have to look into disabling or mitigating that, and it might also mean the TV can only process YUV internally and will convert from RGB to YUV and back even on DVI. TVs are a pain.
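On the point above about subsampled chroma and black text on a red background: here's a rough sketch of what horizontal chroma averaging (one part of 4:2:0 subsampling) does to a one-pixel colour detail. Plain Python, purely illustrative; the function and the sample values are my own simplification:

```python
def subsample_420_row(cr):
    """Average a chroma row in horizontal pairs (half of 4:2:0 subsampling),
    then duplicate each average back out, as a naive upsampler would."""
    out = []
    for i in range(0, len(cr), 2):
        avg = (cr[i] + cr[i + 1]) // 2
        out += [avg, avg]
    return out

# One black pixel (neutral Cr = 128) inside red text (Cr around 240):
row = [240, 240, 128, 240]
# After subsample + upsample the 1-pixel chroma detail is smeared:
#   [240, 240, 184, 184]
```

The luma plane stays at full resolution, so the shape of the text survives, but the colour edge gets averaged over two pixels, which is why fine coloured text looks fringed.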
21st January 2015, 22:06 | #9 | Link |
Life's clearer in 4K UHD
Join Date: Jun 2003
Location: Notts, UK
Posts: 12,227
|
In short... no! Given that all commercial video is currently 4:2:0, just use HDMI to HDMI.
__________________
| I've been testing hardware media playback devices and software A/V encoders and decoders since 2001 | My Network Layout & A/V Gear |
21st January 2015, 23:33 | #10 | Link | |
Registered User
Join Date: Mar 2011
Posts: 4,829
|
Quote:
Video card manufacturers have blurred the line between DVI and HDMI for quite a while. DVI cards can often output HDMI signalling. The video card in this PC must be at least six years old, and it's capable of YCbCr 4:4:4 output, while DVI is officially RGB only. TVs may be predominantly HDMI, but HDMI supports DVI signalling, making dedicated DVI inputs on TVs a bit redundant. Nvidia have a system where the output is considered either HDMI or DVI, I think depending on resolution. This video card has no HDMI out, only DVI, yet the Nvidia control panel shows it as being connected to my TV via HDMI.
DVI is going the way of the dodo. Chances are HDMI and/or DisplayPort will succeed it; just ask Intel, who said as much four years ago.
It doesn't stop the majority of DVD/Bluray players on the planet from playing DivX video though, does it?
21st January 2015, 23:41 | #11 | Link | |
Registered User
Join Date: Mar 2011
Posts: 4,829
|
Quote:
See my previous post. You had to dare me..... http://en.wikipedia.org/wiki/Radeon_...Other_features Each DVI output includes dual-link HDCP encoder with on-chip decipher key. HDMI was introduced, supporting display resolutions up to 1,920×1,080, with integrated HD audio controller with 5.1-channel LPCM and AC3 encoding support. Audio is transmitted via DVI port, with specially designed DVI-to-HDMI dongle for HDMI output that carries both audio and video. |
21st January 2015, 23:55 | #12 | Link | |
Angel of Night
Join Date: Nov 2004
Location: Tangled in the silks
Posts: 9,559
|
Quote:
22nd January 2015, 00:03 | #13 | Link | |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
|
Quote:
And you specifically need to force YUV output on those cards; they all default to RGB. Also, many NVIDIA cards can just use a passive generic DVI->HDMI adapter without any change in functionality otherwise; they detect automatically whether a TV or PC display is connected. AMD needs a proprietary adapter, but otherwise should offer similar things.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
22nd January 2015, 00:15 | #14 | Link | ||
Registered User
Join Date: Mar 2011
Posts: 4,829
|
Quote:
I can tell my TV to expect PC levels, which of course means I've got to expand the video levels for it just like a PC monitor, but what happens if the TV/monitor is set to expect TV levels? I'm asking mainly in respect to image quality with everything reduced to limited range. Does it make a noticeable difference? Quote:
It may even be possible there's some sort of communication going on between the TV and PC when using HDMI, where the TV says "hey, I want TV levels". I don't know if it can, but there are still a lot of variables. When you compare DVI and HDMI in respect to the way video looks, does it look different? I know you said HDMI is limited levels and DVI is full range, but I'm just wondering what the TV's doing.

Apparently Nvidia drivers can send "content type information" over HDMI that says "be a desktop", and then when you run a video full screen it says "be a TV", or something like that, or you can set it manually. Apparently it's under Display/Adjust Colour Settings. If it can get the TV to switch modes, I could only imagine what else it might do.

As a side note, here's an example of what I meant in an earlier post when I said Nvidia seem to treat HDMI and DVI as interchangeable, not dependent on using a DVI or HDMI out: http://nvidia.helpmax.net/en/display...splay-content/
"This instruction applies to HDMI monitors (not treated as DVI) and GPUs that support AVI infoframes on Windows Vista and later."
It strongly implies you can connect via HDMI and the drivers will "treat a monitor as DVI" under some circumstances, and logically DVI can be treated as HDMI, as seems to be the case for my card and TV. If that's what happens, I don't know how it works though. Last edited by hello_hello; 22nd January 2015 at 00:40.
22nd January 2015, 03:06 | #16 | Link | |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
Quote:
Last edited by Asmodian; 22nd January 2015 at 04:31. Reason: added image |
22nd January 2015, 06:15 | #17 | Link | |
Registered User
Join Date: Mar 2011
Posts: 4,829
|
Quote:
Cheers. |
22nd January 2015, 08:00 | #18 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
The option in "Adjust video image settings" doesn't change bitmaps, web pages, the OS, thumbnails, etc. The new one does. To me it's instantly noticeable, but I notice shadow crush pretty quickly; it's a very obvious change toggling between them. This option is screen specific, so your normal monitor stays full range. Nvidia finally gave an option to override the default global output levels for HDMI, DVI, etc. With DisplayPort becoming mainstream, I assume they couldn't decide what a good default was, so they added the option.
My desktop image looks so much better at the correct range. As I never do much browsing on a TV the biggest benefit for me is thumbnails and the OS in general. The faint shades of gray are visible again. |
22nd January 2015, 21:33 | #19 | Link | |
Registered User
Join Date: Mar 2011
Posts: 4,829
|
Quote:
Setup 1: TV set to expect PC levels, Nvidia global output range PC levels. Windows displays correctly.
Setup 2: TV set to expect TV levels, Nvidia global output range TV levels. Windows displays correctly.

Now comes the question...... do they look the same? I'm wondering if the scaling has a negative effect on the way Windows and programs look. I'm pretty sure I've read posts in the past where someone claimed that expanding video levels to PC levels for a PC monitor increases the likelihood of banding, which seems plausible if there are rounding errors, but I'm not sure I've seen a difference between PC levels in and out and TV levels in and out in that respect. It's almost impossible to make valid comparisons without two identical TVs and two identical video cards etc., so you could compare them side by side.

It would. I don't have the "global" levels option (still using XP), but I can set the output to YCbCr 4:4:4, and when I do, the TV defaults to expecting TV levels and won't let me change it. Video looks fine without expanding the levels, but Windows itself looks a bit off/crushed/dark. I assume it's still full range.
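On the banding point: one concrete reason expanding levels in 8 bits can aggravate banding is that 220 limited-range codes can't cover all 256 full-range codes, so some output values are simply never produced, and smooth gradients pick up gaps. A quick sketch, plain Python, my own illustration:

```python
def limited_to_full(y):
    """Expand 8-bit limited range (16-235) to full range (0-255), with clipping."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

# Map every legal limited-range code and count the distinct results:
used = {limited_to_full(y) for y in range(16, 236)}
# 220 inputs produce 220 distinct outputs, leaving 36 of the 256
# output codes unused; in a smooth gradient those gaps show as banding.
```

Whether the gaps are actually visible depends on the display and on any dithering the renderer applies, which may be why side-by-side differences are hard to pin down.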
23rd January 2015, 02:46 | #20 | Link | |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
Quote: