Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old Yesterday, 16:39   #48561  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 187
Quote:
Originally Posted by jasonwc18 View Post
What's the relative quality difference between using NGU Sharp in High vs. Very High mode for luma doubling? They both look excellent on my 120" screen from an 11' seating distance, but it's hard to compare them without doing A/B comparisons with individual frames. With Very High, you get High quality (as opposed to Medium) compression artifact removal, which is a perk. The madVR GUI shows that the Very High setting yields a sharpness improvement and slightly less ringing, but it's not clear to what extent this is visible. For example, x264's "placebo" preset also yields a small improvement but massively increases the performance impact. I'm wondering if NGU Very High is similar.

Also, as a general setting, does it make sense to turn on debanding? I currently have it set to Low/High, but since I generally watch high quality Blu-ray sources, I only see a benefit for fade-ins/outs. It does make a significant improvement at the beginning of the Planet Earth II episodes on Blu-ray (one of the nice things about the UHD Blu-rays is that the 10 bit color eliminates this banding - the difference is clearly visible on the Planet Earth II UHD BDs). I read that the higher settings remove detail, so I'm hesitant to use a higher default debanding strength.
Debanding: you lose detail even on Low. It should be off, and only used for really poor quality content.
Old Yesterday, 16:41   #48562  |  Link
el Filou
Registered User
 
Join Date: Oct 2016
Posts: 178
Quote:
Originally Posted by jasonwc18 View Post
They both look excellent on my 120" screen from an 11' seating distance but it's hard to compare them without doing A/B comparisons with individual frames.
Then do comparisons with individual frames.
MPC-HC can go directly to a frame number, I guess other players have that feature too.
It's always best to compare yourself because every setup is different and every person has different preferences.
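Going by frame number works in MPC-HC; for players that only seek by time, converting a frame index to a timestamp gets you to the same frame in both configurations. A minimal sketch in Python (the default fps and the helper name are illustrative assumptions, not any player's API):

```python
def frame_to_timestamp(frame: int, fps: float = 24000 / 1001) -> str:
    """Convert a zero-based frame number to an HH:MM:SS.mmm timestamp."""
    seconds = frame / fps
    hours, rem = divmod(seconds, 3600)
    minutes, secs = divmod(rem, 60)
    return f"{int(hours):02d}:{int(minutes):02d}:{secs:06.3f}"

# Frame 1439 of a 23.976 fps film lands roughly one minute in.
print(frame_to_timestamp(1439))
```

Seek both setups to the printed timestamp, pause, and screenshot for the A/B comparison.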
__________________
HTPC: E7400, GeForce 1050 Ti, DVB-C TV, Panasonic GT60 | Desktop: 4690K, Radeon 7870, Dell U2713HM | Windows 1703, MediaPortal/MPC-HC, LAV Filters, ReClock, madVR | Laptop: i5-2520m, Windows Insider
Old Yesterday, 16:46   #48563  |  Link
jasonwc18
Registered User
 
Join Date: Jan 2018
Posts: 4
Quote:
Originally Posted by mclingo View Post
debanding - you lose detail even on low, this should be off and only used for really poor quality content.
Good to know; then I might as well keep my current settings. With Luma set to NGU Sharp Very High, Chroma set to NGU AA Medium, artifact removal at High, and a 3D LUT in use, I'm getting render times of around 25 ms on a GTX 1070. Debanding brought that over 30 ms.

I have been avoiding the deringing filter for the same reason you mentioned. I read it shouldn't be used on high quality sources.
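For context on those render times: madVR only has to finish each frame within the source's frame interval, so you can sanity-check the headroom with simple arithmetic (nothing madVR-specific, just a sketch):

```python
def frame_budget_ms(fps: float) -> float:
    """Per-frame render budget in milliseconds for a given source frame rate."""
    return 1000.0 / fps

# Film runs at 24000/1001 fps, so the budget is about 41.7 ms per frame;
# 25 ms (or even 30+ ms with debanding) fits comfortably.
film = frame_budget_ms(24000 / 1001)

# 60 fps material only allows about 16.7 ms, so the same settings
# would drop frames there.
sixty = frame_budget_ms(60.0)

print(f"24p budget: {film:.2f} ms, 60p budget: {sixty:.2f} ms")
```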
Old Yesterday, 21:01   #48564  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 2,817
It is pretty small; I would compare it to placebo in x264. It is better, and so is x264's placebo mode, but each step up in NGU Sharp matters less. I think the quality vs. performance trade-off of NGU very high compared to high is better than x264's placebo compared to veryslow, but not drastically so. The choice is also a bit different because x264's placebo takes a lot more time (and thus power), while NGU very high simply takes more power. You need all NGU modes to run in real time to be able to use them at all.

Medium is a lot better than low, high is noticeably better than medium, and very high is a bit better than high. At least this is what I noticed during my testing. For luma I always run very high if I can.

For chroma I was convinced to stay at high or below all the time; I have never noticed quality improvements during playback with very high for chroma, and there have even been a few examples where very high seemed a little worse. There isn't much fine detail in chroma to begin with (the sub-sampling is very damaging), so the extra power used for very high is simply wasted.
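The "not much fine detail in chroma" point follows directly from the numbers: with 4:2:0 sub-sampling, the chroma planes are half the luma resolution in each direction. A tiny illustration (plain arithmetic, not tied to madVR):

```python
def chroma_plane_size(width: int, height: int) -> tuple[int, int]:
    """Chroma plane dimensions for 4:2:0 sub-sampled video (half per axis)."""
    return width // 2, height // 2

# A 1080p Blu-ray stores chroma at only 960x540, a quarter of the pixels,
# so heavy chroma upscaling has little fine detail to recover.
print(chroma_plane_size(1920, 1080))
```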
__________________
madVR options explained
Old Yesterday, 21:17   #48565  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 2,605
Asmodian's comments match my own findings exactly.
Old Yesterday, 21:19   #48566  |  Link
jasonwc18
Registered User
 
Join Date: Jan 2018
Posts: 4
Quote:
Originally Posted by Asmodian View Post
It is pretty small; I would compare it to placebo in x264. It is better, and so is x264's placebo mode, but each step up in NGU Sharp matters less. I think the quality vs. performance trade-off of NGU very high compared to high is better than x264's placebo compared to veryslow, but not drastically so. The choice is also a bit different because x264's placebo takes a lot more time (and thus power), while NGU very high simply takes more power. You need all NGU modes to run in real time to be able to use them at all.

Medium is a lot better than low, high is noticeably better than medium, and very high is a bit better than high. At least this is what I noticed during my testing. For luma I always run very high if I can.

For chroma I was convinced to stay at high or below all the time; I have never noticed quality improvements during playback with very high for chroma, and there have even been a few examples where very high seemed a little worse. There isn't much fine detail in chroma to begin with (the sub-sampling is very damaging), so the extra power used for very high is simply wasted.
Thanks, this was very helpful!
Old Yesterday, 21:19   #48567  |  Link
Rexian
Registered User
 
Join Date: Apr 2004
Posts: 9
Quote:
Originally Posted by nevcairiel View Post
I don't even like it in the cinema, so what chance does home have?
You really need to try it with a 2016 OLED (C/E/G6). Of course, the bigger the better when it comes to 3D (I saw a demo on a 77" G6 and it was much better than the theater), but even on my 65" E6 it looks awesome - probably because I sit right in front of it, whereas in a theater you don't have much control. The active 3D on a Samsung sucked and gave me a headache at a friend's place; I almost wrote off 3D entirely.
Old Yesterday, 21:38   #48568  |  Link
James_b
Registered User
 
Join Date: Jan 2018
Posts: 1
madVR meets an old-school CRT projector from the '90s (1920 x 816 / 71.952 Hz), 2.2 m screen.

MPC + madVR with DXVA downscaling, Sharpen Complex 2 as a post-scaling shader.

Nvidia edge enhancement 47%, 0-255 + Nvidia video gamma ramp at 1.24 ---> Moome HDMI card in the projector, 0-255

CPU: 2.4 GHz quad-core. Video card: MSI GT 710 2GD3H LP @ ~900 MHz





Attachments Pending Approval
File Type: jpg madvr_0-255-0-255.jpg
File Type: jpg 6.jpg
File Type: jpg 5.jpg

Last edited by James_b; Yesterday at 22:28.
Old Yesterday, 21:42   #48569  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 2,605
Unfortunately, though, having a good 3D experience on a single outdated TV doesn't really make the solution terribly attractive xD

James, upload those to an image host and link 'em; they'll be pending 4dayz.
Old Today, 05:09   #48570  |  Link
Dodgexander
Registered User
 
Join Date: Jul 2008
Posts: 157
Questions

1. Does madVR support hardware acceleration for HEVC?
There is a lot of info out there stating that madVR doesn't support hardware decoding for HEVC, even with 9xx/10xx series Nvidia cards.

2. What is the minimum recommended video card now for upscaling to UHD? I don't care so much about using advanced scalers, just basic ones.

3. Will it be possible to output over two HDMI ports, one to an older receiver and one to a UHD display, without any issues? I read HDMI clone/extend will work for this purpose. I have a non-Atmos AVR but want to keep UHD whilst retaining TrueHD/DTS-HD sound.
Old Today, 05:21   #48571  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 2,817
1) LAV Video does the hardware decoding, madVR has nothing to do with it. Hardware decoding of HEVC with LAV+madVR works very well on my Titan X (Pascal).

2) 1050 Ti, or maybe anything with 3 or more GB of VRAM.

3) Yes
__________________
madVR options explained
Old Today, 05:22   #48572  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 4,354
Quote:
Originally Posted by Dodgexander View Post
1. Does madVR support hardware acceleration for HEVC?
There is a lot of info out there stating that madVR doesn't support hardware decoding for HEVC, even with 9xx/10xx series Nvidia cards.
madVR has nothing to do with acceleration support. LAV Filters can decode HEVC and you can render the result with madVR.
madVR doesn't have decoders anymore.
Quote:
2. What is the minimum recommended video card now for upscaling to UHD? I don't care so much about using advanced scalers, just basic ones.
A 1050 Ti, or you risk running out of memory.
Quote:
3. Will it be possible to output over two HDMI ports, one to an older receiver and one to a UHD display, without any issues? I read HDMI clone/extend will work for this purpose. I have a non-Atmos AVR but want to keep UHD whilst retaining TrueHD/DTS-HD sound.
This should work, but I'm not going to guarantee issue-free playback at this point.
You don't have to clone the displays; you just have to send an image to the AVR, that's it.
Old Today, 05:26   #48573  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 2,817
Quote:
Originally Posted by James_b View Post
Nvidia edge enhancement 47%, 0-255 + Nvidia video gamma ramp at 1.24 ---> Moome HDMI card in the projector, 0-255
Ouch.

Do that in madVR! Nvidia's stuff is terrible.
__________________
madVR options explained
Old Today, 05:57   #48574  |  Link
Dodgexander
Registered User
 
Join Date: Jul 2008
Posts: 157
Thanks guys, I was surprised how much I had to search to find this answer.

I am an old-time madVR user from back when the ATI HD 4870 was good for scaling to 1080p. I take it AMD doesn't have any viable alternative to Nvidia's 1050 Ti?

Is anyone using PowerDVD 17 with UHD Blu-rays? I doubt it will work with a separate HDMI audio out, so I've resigned myself to getting a UHD Blu-ray player for discs.
Old Today, 06:32   #48575  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 4,354
If you can find an RX 560 4 GB you could use that one.
Old Today, 06:46   #48576  |  Link
Dodgexander
Registered User
 
Join Date: Jul 2008
Posts: 157
The best I own is a GTX 660, which sadly has only 2 GB of VRAM.
Old Today, 08:21   #48577  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 4,354
An RX 560 4 GB.

You need at least a Maxwell or Pascal (or newer) card to get an HEVC 10-bit decoder (and not all of those cards have it).
Old Today, 09:05   #48578  |  Link
nsnhd
Registered User
 
Join Date: Jul 2016
Posts: 56
Quote:
Originally Posted by huhn View Post
If you can find an RX 560 4 GB you could use that one.
The RX 560 4 GB doesn't have a VP9 profile 2 hardware decoder and is slow with madVR's NGU, right? What else counts against it in comparison to a GTX 1050 Ti? Is the RX 560 less powerful than the GTX 1050 Ti in general?
Old Today, 09:07   #48579  |  Link
Dodgexander
Registered User
 
Join Date: Jul 2008
Posts: 157
Quote:
Originally Posted by huhn View Post
An RX 560 4 GB.

You need at least a Maxwell or Pascal (or newer) card to get an HEVC 10-bit decoder (and not all of those cards have it).
That's what I read, and you confirm my findings. Basically the 1050 Ti is the way forward.

From my research, the 7xx series has a very limited hardware decoder for HEVC.

The 9xx series has a full decoder.
The 10xx series the same.

But that isn't the issue here: madVR's scaling requires more shader power, and for that it seems a 1050 Ti is optimal right now.
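The per-series summary above can be jotted down as a lookup table. This only encodes what the posts in this thread claim (support varies by chip within a series, so treat it as a rough guide, not a spec):

```python
# HEVC hardware-decode support by Nvidia series, as summarized in this
# thread. Within a series the exact chips differ (not every card has the
# full fixed-function decoder), so this is a rough guide only.
HEVC_DECODE = {
    "7xx": "very limited",
    "9xx": "full decoder",
    "10xx": "full decoder",
}

def hevc_support(series: str) -> str:
    """Look up the claimed HEVC decode support for a GPU series."""
    return HEVC_DECODE.get(series, "unknown")

print(hevc_support("10xx"))
```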
Old Today, 09:36   #48580  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 4,354
Quote:
Originally Posted by nsnhd View Post
The RX 560 4 GB doesn't have a VP9 profile 2 hardware decoder and is slow with madVR's NGU, right? What else counts against it in comparison to a GTX 1050 Ti? Is the RX 560 less powerful than the GTX 1050 Ti in general?
Generally yes, it's slower (though I'm not sure if that's the case for madVR outside of NGU), but it's supposed to be cheaper. Well, GPU prices are not what they used to be... the 1050 Ti is not heavily affected yet.

It doesn't have a VP9 decoder at all, sorry; a hard-to-use hybrid decoder is not what you want if you are going to use madVR anyway (AFAIK you can't use it outside a browser).

AMD was usually faster with madVR, but there is the Polaris NGU problem, and the fact that you can't buy a performance AMD GPU for a reasonable price.