Old 17th May 2019, 06:48   #56281  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 408
Quote:
Originally Posted by Warner306 View Post
This chart appears to be fairly reliable and is probably a good guide for perceived resolution:

Resolution: Screen Size vs. Viewing Distance
My test was for the difference between NGU/Lanczos and Nearest at distances greater or less than 5 feet.

Past that point, with a 1080p file, expensive upscaling is a bit wasteful.

As for the choice of movie resolution: 4K is ALWAYS better than 1080p at ANY distance due to 4:2:0 subsampling; it's the only way to get full 1080p chroma.
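
(Rough numbers, as a quick Python sketch, assuming standard 4:2:0 subsampling where each chroma plane is half the width and half the height of the luma plane:)

Code:
# Chroma plane sizes under 4:2:0 subsampling (chroma is halved on both axes).
def chroma_plane(width, height):
    return width // 2, height // 2

print(chroma_plane(1920, 1080))   # 1080p source: chroma is only 960x540
print(chroma_plane(3840, 2160))   # 4K source:    chroma is 1920x1080
# A 4K 4:2:0 encode carries a full 1920x1080 chroma plane, which a 1080p 4:2:0
# encode never can; hence "the only way to get 1080p chroma".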
__________________
Ghetto | 2500k 5Ghz
tp4tissue is offline   Reply With Quote
Old 17th May 2019, 14:03   #56282  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Things are different with a projector-sized screen. Projector owners see a lot of benefit with 1080p -> 4K upscaling.
Warner306 is offline   Reply With Quote
Old 17th May 2019, 14:08   #56283  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 408
Well, yeah, a projected image is usually much larger, with bigger pixels.

Although from what I've seen of NON-native 4K projectors, the pixel shifting has an effect similar to analog upscaling (scanning). In that case, I'd say again that NGU may not be so necessary, since the projector isn't reproducing the intended NGU output anyway.

For most TV sizes/distances, it's OK to go Green.
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 17th May 2019 at 14:11.
tp4tissue is offline   Reply With Quote
Old 17th May 2019, 14:15   #56284  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Yes, you'd want a native 4K projector if you placed a premium on upscaling. Even then, many are using very high settings for image upscaling and derive a lot more visual improvement than they would with, say, a 55" 4K TV.
Warner306 is offline   Reply With Quote
Old 18th May 2019, 12:29   #56285  |  Link
Bl4ze87
Registered User
 
Join Date: Aug 2018
Posts: 11
Sorry for the question, but which Nvidia driver version is best?
I have a 1050 Ti.
Bl4ze87 is offline   Reply With Quote
Old 18th May 2019, 13:24   #56286  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 224
Bl4ze87

https://forum.doom9.org/showthread.php?t=176013
__________________
Windows 10-1909 | i5-3570k | GTX 1070 Windforce OC Rev2 8GB : 430.64 | Yamaha RX-V377 | Philips 65PUS6703 - 65"
madjock is offline   Reply With Quote
Old 19th May 2019, 04:41   #56287  |  Link
SirMaster
Registered User
 
Join Date: Feb 2019
Posts: 47
Quote:
Originally Posted by SirMaster View Post
Anyone have any suggestions for this issue?

I am not sure where to ask it, but I think it's probably a Windows or maybe NVidia thing?

I set up an HTPC for a friend, but for some reason, whenever a video opens or closes (like starting a video in MPC-HC or closing MPC-HC itself), his display has to re-acquire its signal.

This is pretty annoying as he has an older JVC (RS600) with a near 20-second sync time.

Now, I am completely familiar with refresh rate switching and such, but I have all of that completely disabled. I have the HTPC set for 3840x2160 23Hz RGB 12-bit (tried 8-bit as well). Refresh rate switching is disabled in madVR and disabled in MPC-HC.

Any ideas of how I can have it so that the JVC doesn't need to re-sync every time I open and close a video?
Well, I figured out the screen re-sync issue.

It was a combination of software and hardware.

It happened with "Report BT2020 to display" enabled in madVR's calibration tab and the video sent through a Marantz 8802.

If we send the video to the JVC RS600 directly, it never blanks with "Report BT2020 to display" enabled; likewise, if we send the video through the Marantz 8802 with "Report BT2020 to display" disabled, it never blanks.

But send the signal into the Marantz with "Report BT2020 to display" enabled and it blanks every time a video is opened or closed through madVR.

Weird!
SirMaster is offline   Reply With Quote
Old 19th May 2019, 09:02   #56288  |  Link
svengun
Registered User
 
Join Date: Jan 2018
Location: Barcelona
Posts: 49
Quote:
Originally Posted by SirMaster View Post
Well, I figured out the screen re-sync issue.
I also have that option enabled and send my signal through a Marantz SR6010, but I don't have that issue. I also use an RTX 2070 (latest driver).

So double weird :-)
__________________
Livingroom: Ryzen 7 1700@3.9ghz - Win Insiders Fast Ring - MSI RTX 2700 Gaming - Philips 65OLED803 | Bedroom Ryzen 3 1200 - Win 8.1 - GTX1060 - LG OLED EG920V 55" > All with MadVR latest test build

Last edited by svengun; 19th May 2019 at 09:05.
svengun is offline   Reply With Quote
Old 19th May 2019, 16:56   #56289  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 733
Quote:
Originally Posted by SirMaster View Post
Well, I figured out the screen re-sync issue.
Is your Marantz set to HDMI enhanced in the HDMI config? Otherwise it reduces your bandwidth to 10Gb/s for compatibility.

I have an X8500H, and HDMI Enhanced needs to be enabled to get the full 18Gb/s bandwidth.
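
(As a rough sanity check in Python, assuming the "Standard" HDMI setting caps the AVR at roughly the old 10.2Gb/s HDMI 1.4 level and that the driver uses the standard CTA-861 timing of a 297MHz pixel clock for 2160p23:)

Code:
# Approximate TMDS data rate over the 3 HDMI channels (10 bits on the wire per
# 8-bit character), with the usual deep-colour clock scaling per bit depth.
def hdmi_tmds_gbps(pixel_clock_mhz, bits_per_component):
    scale = {8: 1.0, 10: 1.25, 12: 1.5}[bits_per_component]
    return pixel_clock_mhz * scale * 3 * 10 / 1000

print(f"2160p23  8-bit RGB: {hdmi_tmds_gbps(297, 8):.2f} Gb/s")   # ~8.9,  fits "Standard" (10.2)
print(f"2160p23 12-bit RGB: {hdmi_tmds_gbps(297, 12):.2f} Gb/s")  # ~13.4, needs "Enhanced" (18)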

If you don't enable BT2020 when using 12bits, the native gamut isn't selected and your calibration is all wonky.

It needs to be enabled in 12bits to get the correct gamut in SDR BT2020.

Not a problem with 8bits.
__________________
Win10 Pro x64 b1903 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 436.48 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25
Denon X8500H>HD Fury Maestro>JVC RS2000
Manni is offline   Reply With Quote
Old 20th May 2019, 17:26   #56290  |  Link
SirMaster
Registered User
 
Join Date: Feb 2019
Posts: 47
Quote:
Originally Posted by Manni View Post
Is your Marantz set to HDMI enhanced in the HDMI config? Otherwise it reduces your bandwidth to 10Gb/s for compatibility.

I have an X8500H, and HDMI Enhanced needs to be enabled to get the full 18Gb/s bandwidth.

If you don't enable BT2020 when using 12bits, the native gamut isn't selected and your calibration is all wonky.

It needs to be enabled in 12bits to get the correct gamut in SDR BT2020.

Not a problem with 8bits.
It is set to Enhanced.

Also, to solve the problem we ended up just running two cables from the HTPC: one goes straight to the JVC for video, and the other goes straight to the Marantz for audio. Lip sync is then set manually to line up correctly.

This way we can have "Report BT2020" enabled without any issues, because we are using 2160p 23Hz 12-bit RGB on the RS600.

Last edited by SirMaster; 20th May 2019 at 17:34.
SirMaster is offline   Reply With Quote
Old 20th May 2019, 17:28   #56291  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 733
Quote:
Originally Posted by SirMaster View Post
It is set to Enhanced.

Also, to solve the problem we ended up just running two cables from the HTPC: one goes straight to the JVC for video, and the other goes straight to the Marantz for audio. Lip sync is then set manually to line up correctly.

This way we can have "Report BT2020" enabled without any issues, because we are using 2160p 23Hz 12-bit RGB on the RS600.
As you know, using 12bits on the JVCs isn't recommended because it forces YCC422 behind the renderer's back.

Worth trying 8bits to see if you still have these issues. You won't get any banding (if madVR is set to 8-bit dithering) and you'll get better chroma.
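
(A quick illustration of the no-banding point, as a rough numpy sketch; it uses simple random dither rather than madVR's actual ordered/error-diffusion dithering:)

Code:
# Quantise a shallow gradient to 8 bits with and without dither and measure the
# longest run of identical pixels (a long flat run reads as a visible band).
import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(0.0, 16 / 255, 3840)   # spans only 16 output levels across a row

plain    = np.round(ramp * 255) / 255                                      # no dither
dithered = np.round(ramp * 255 + rng.uniform(-0.5, 0.5, ramp.size)) / 255  # 1 LSB dither

def longest_flat_run(row):
    changes = np.flatnonzero(np.diff(row) != 0)
    edges = np.concatenate(([-1], changes, [row.size - 1]))
    return int(np.diff(edges).max())

print("longest flat run, plain 8-bit:   ", longest_flat_run(plain), "px")
print("longest flat run, dithered 8-bit:", longest_flat_run(dithered), "px")
# Without dither the gradient collapses into ~240-pixel-wide bands; with dither
# the flat runs are far shorter and the noise averages out at viewing distance.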
__________________
Win10 Pro x64 b1903 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 436.48 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25
Denon X8500H>HD Fury Maestro>JVC RS2000
Manni is offline   Reply With Quote
Old 20th May 2019, 17:48   #56292  |  Link
SirMaster
Registered User
 
Join Date: Feb 2019
Posts: 47
Quote:
Originally Posted by Manni View Post
As you know, using 12bits on the JVCs isn't recommended because it forces YCC422 behind the renderer's back.

Worth trying 8bits to see if you still have these issues. You won't get any banding (if madVR is set to 8-bit dithering) and you'll get better chroma.
Yes, but again this is a friend's HT that I am configuring, and it's an RS600, so 12-bit RGB works fine.

Yes, on my own HT with my NX5 I of course use 8-bit with madVR error-diffusion dithering, and I don't need to report BT2020. But it also does not cause a re-sync through my Denon X4200W anyway, in case JVC ever fixes 12-bit.
SirMaster is offline   Reply With Quote
Old 20th May 2019, 18:27   #56293  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 733
Quote:
Originally Posted by SirMaster View Post
Yes, but again this is a friend's HT that I am configuring, and it's an RS600, so 12-bit RGB works fine.

Yes, on my own HT with my NX5 I of course use 8-bit with madVR error-diffusion dithering, and I don't need to report BT2020. But it also does not cause a re-sync through my Denon X4200W anyway, in case JVC ever fixes 12-bit.
12bits doesn't work fine on the RS600; it does exactly the same re chroma (forces YCC422 with RGB or YCC444 at 12bits). But you can't use 8bits without getting the magenta bug, so that's not an option either unless you only use 60p. You have to live with the forced YCC422 (or select YCC422 to start with).
__________________
Win10 Pro x64 b1903 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 436.48 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25
Denon X8500H>HD Fury Maestro>JVC RS2000
Manni is offline   Reply With Quote
Old 20th May 2019, 18:30   #56294  |  Link
SirMaster
Registered User
 
Join Date: Feb 2019
Posts: 47
Quote:
Originally Posted by Manni View Post
12bits doesn't work fine on the RS600; it does exactly the same re chroma (forces YCC422 with RGB or YCC444 at 12bits). But you can't use 8bits without getting the magenta bug, so that's not an option either unless you only use 60p. You have to live with the forced YCC422 (or select YCC422 to start with).
Oh, I thought I remembered you saying 12-bit RGB worked on your RS500.

So how are you saying he should run his HTPC to the RS600 for tone-mapped HDR? What should we set in nVidia? YCC422, so that the conversion isn't happening behind madVR's back?

Last edited by SirMaster; 20th May 2019 at 18:41.
SirMaster is offline   Reply With Quote
Old 20th May 2019, 18:43   #56295  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 733
Quote:
Originally Posted by SirMaster View Post
Oh, I thought I remembered you saying 12-bit RGB worked on your RS500.

So how are you saying he should run his HTPC to the RS600 for tone-mapped HDR? What should we set in nVidia? YCC422, so that the conversion isn't happening behind madVR's back?
Yes, 12bits "works" with the rs500, and I had no choice because of the magenta bug in 8bits.

You should test chroma with various options. Given that the RS600 will force YCC422 with any colorspace in 12bits, I would suppose that selecting this in the driver would be best, but when I do this with the rs2000 it's still a worse result than using 8bits.
__________________
Win10 Pro x64 b1903 MCE
i7 3770K@4.0Ghz 16Gb@2.18Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 436.48 RGB Full 8bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25
Denon X8500H>HD Fury Maestro>JVC RS2000

Last edited by Manni; 22nd May 2019 at 08:27.
Manni is offline   Reply With Quote
Old 21st May 2019, 13:29   #56296  |  Link
miroslav22
Registered User
 
Join Date: May 2011
Posts: 5
Just started using madVR, very impressed!

I just had a couple of questions I'm struggling to find answers to. Apologies if I'm not fully understanding this!

The issues I have are:

I want to use madVR to upscale TV at 50Hz to 2160p; however, switching to 50Hz only works if I set the colour output in the Nvidia control panel to 4:2:0 chroma/luma sampling OR 8-bit precision (due to the bandwidth limitations of HDMI 2.0). For some reason I can only select 8/12-bit for precision and not 10-bit. My understanding is that HDMI 2.0 has enough bandwidth for full RGB/10-bit output at 50Hz, so it's annoying the option doesn't seem to be available. (Is there a reason for this? See the rough numbers at the end of this post.)

But I also want HDR output for films at 23p, so I guess I am forced to set the Nvidia control panel to output 12-bit and not 8-bit? If that's the case, I am also forced to use 4:2:0 chroma/luma to allow 50Hz to work.

So I guess my question is: are you 'stuck' with what you set in the Nvidia control panel for all refresh rates, or can/does madVR change the colour precision/subsampling dynamically along with the refresh rate? I'm running in DX11 FSE mode.

i.e. can/does madVR switch the output to 10/12-bit RGB at 2160p23 and 8-bit RGB (or even 12-bit 4:2:0) at 50Hz?

If not, is the quality difference of always outputting 4:2:0 vs RGB negligible for HTPC use?
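
On the bandwidth point, here are the rough numbers I mentioned (a Python sketch, assuming the standard CTA-861 timing of a 594MHz pixel clock for 2160p50 and HDMI 2.0's 600MHz TMDS ceiling; please correct me if those assumptions are wrong):

Code:
# Does 2160p50 fit HDMI 2.0's 600 MHz TMDS limit at a given bit depth, assuming
# standard CTA-861 timing (594 MHz pixel clock) and deep-colour clock scaling?
PIXEL_CLOCK_2160P50 = 594.0          # MHz
HDMI20_MAX_TMDS = 600.0              # MHz
SCALE = {8: 1.0, 10: 1.25, 12: 1.5}  # clock multiplier for RGB / 4:4:4

for bits, mult in SCALE.items():
    clk = PIXEL_CLOCK_2160P50 * mult
    verdict = "fits" if clk <= HDMI20_MAX_TMDS else "does not fit"
    print(f"2160p50 RGB {bits:2d}-bit: {clk:.1f} MHz TMDS, {verdict}")
# 8-bit fits (just), 10- and 12-bit don't; 4:2:0 halves the clock again, which
# is why 12-bit 4:2:0 at 50Hz still works.

If that timing assumption is right, anything above 8-bit RGB wouldn't fit at 50Hz anyway.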
miroslav22 is offline   Reply With Quote
Old 21st May 2019, 17:10   #56297  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,712
Why is 12 bit important to you? 4:2:0 at 12 bit is much lower quality than RGB at 8 bit.
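
(One rough way to see why, counting samples per pixel; just a sketch that treats RGB as 4:4:4:)

Code:
# Data carried per pixel: chroma resolution matters more than chroma bit depth.
def samples_and_bits(subsampling, bits):
    chroma_fraction = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[subsampling]
    samples = 1 + 2 * chroma_fraction        # one luma plane + two chroma planes
    return samples, samples * bits

for label, sub, bits in (("RGB 8-bit", "4:4:4", 8), ("YCC 4:2:0 12-bit", "4:2:0", 12)):
    s, b = samples_and_bits(sub, bits)
    print(f"{label:17s}: {s:.1f} samples/pixel, {b:.0f} bits/pixel")
# RGB keeps full-resolution chroma (3.0 samples, 24 bits per pixel); 4:2:0 keeps
# only 1.5 samples (18 bits), and extra bit depth can't restore the chroma
# resolution that was discarded.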

I suggest you simply use 8 bit RGB all the time. madVR cannot change the bit depth but if you set 23Hz as 12bit the driver will use 12 bit when madVR switches to 23Hz.

Please do some visual tests; if 12 bit looks the same as 8 bit to you, please do not obsess about getting 12 bit. 12 bit really is the least important improvement and not worth sacrificing anything else for.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 22nd May 2019, 00:41   #56298  |  Link
miroslav22
Registered User
 
Join Date: May 2011
Posts: 5
Thanks for the reply... Sorry, I was under the impression that the display needed to be configured to 10 or 12 bit for HDR to work. Is that not the case? That was the only reason I was trying to use it.

If it does require it, then it seems I must choose between having HDR available (but using 4:2:0) and full RGB, as I can't have both (due to needing to switch to 50Hz for TV).
miroslav22 is offline   Reply With Quote
Old 22nd May 2019, 01:53   #56299  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,712
No, HDR does not need 10 bit, at least on Nvidia with old and new drivers. There were some driver versions in between that did require 10 bit.

AMD does require 10 bit for HDR.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 22nd May 2019, 13:53   #56300  |  Link
miroslav22
Registered User
 
Join Date: May 2011
Posts: 5
Many thanks, I understand much better now. I'd assumed the extra bits were needed to specify the greater luminance range for HDR, but after a bit more reading I've realised that isn't how it works.

You're right: 10-bit -> 8-bit dithered by madVR looks almost identical to native 10-bit close up, and is completely indistinguishable at viewing distance. Certainly not worth worrying about!

Thanks again!
miroslav22 is offline   Reply With Quote