Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 25th November 2017, 15:54   #47301  |  Link
ABDO
Registered User
 
Join Date: Dec 2016
Posts: 65
Quote:
Originally Posted by madshi View Post
I wonder if I should maybe remove the "quality" option and auto pick, based on the strength?
That would hurt my GTX 150 Ti Boost; I really do not know.
ABDO is offline   Reply With Quote
Old 25th November 2017, 17:26   #47302  |  Link
cork_OS
Registered User
 
cork_OS's Avatar
 
Join Date: Mar 2016
Posts: 160
Quote:
Originally Posted by madshi View Post
Thanks for the feedback, it matches my own test results. I wonder if I should maybe remove the "quality" option and auto pick, based on the strength?
Please don't remove this option, RCA high is too slow for GTX 1060 and below.
__________________
I'm infected with poor sources.
cork_OS is offline   Reply With Quote
Old 25th November 2017, 18:05   #47303  |  Link
Werewolfy
Registered User
 
Join Date: Feb 2013
Posts: 137
Quote:
Originally Posted by madshi View Post
Thanks for the feedback, it matches my own test results. I wonder if I should maybe remove the "quality" option and auto pick, based on the strength?
Unfortunately, I don't think so, because the GPU load is considerably higher with high quality. Even on my GeForce GTX 1080 I can see a difference, so I suspect weaker GPUs will struggle with this.
__________________
Windows 8.1 and 10 x64 - Intel Core i5-4670K (4.2 GHz) - 8 GB DDR3 - MSI Geforce GTX 1080 8 GB - Sony KD-55A1 - Denon AVR-X3600H
Werewolfy is offline   Reply With Quote
Old 25th November 2017, 18:58   #47304  |  Link
Fabulist
Registered User
 
Join Date: Oct 2015
Posts: 29
Quote:
Originally Posted by madshi View Post
Some sharpeners bloat more than others. E.g. "crispen edges" doesn't bloat by design, but LumaSharpen does. Using anti-bloating removes the bloating, but that also reduces the overall sharpening effect, so you may have to increase the sharpening strength when using anti-bloating. Using 100% seems like a safe choice, but you can pick any value that looks good to your eyes.
Thank you for taking the time to reply. Does Sharpen Edges really do any bloating on 1080p+ sources? I mean objectively and in technical terms, by a specific amount or percentage?

I ran tests on various sets and sources and I am unable to see any kind of bloating while running Sharpen Edges at 4 (a strength where other sharpeners do bloat) on screens up to 65 inches. Is there something I am missing, or am I not looking at what I should be looking at? Does it bloat in a different way than other sharpeners do, one which I cannot identify, like a different kind of video distortion?
Fabulist is offline   Reply With Quote
Old 25th November 2017, 21:28   #47305  |  Link
Blackwalker
Registered User
 
Blackwalker's Avatar
 
Join Date: Dec 2008
Posts: 239
Quote:
Originally Posted by Razoola View Post
Well, you could try, but really it needs to be on, not off. If you have an AV receiver between the PC and TV, you should also make sure that it supports deep color and that the HDMI port you're using supports it.

You could also try a different HDMI port on the TV / AV receiver.
Hi, the HTPC HDMI is connected directly to the TV!


Quote:
Originally Posted by madshi View Post
My impression is that you have a tendency to ignore (or accidentally miss) what people say. It has been suggested to you that you should use an older Nvidia driver (e.g. 385.2x), but the last time you reported your Nvidia driver version, it was still 387.xx or 388.xx. Also make sure you have the OS option "HDR and Advanced Color" turned off.
Thank you madshi,

Sorry, I'm not ignoring it; I missed it, again.
I'll try the 385.2x driver and check that "HDR and Advanced Color" is turned off.
Thanks for your time and your work on this unique and fantastic tool.

Last edited by Blackwalker; 26th November 2017 at 11:46.
Blackwalker is offline   Reply With Quote
Old 25th November 2017, 21:58   #47306  |  Link
Gopa
Registered User
 
Join Date: Oct 2017
Posts: 27
RCA high

Quote:
Originally Posted by Werewolfy View Post
Unfortunately, I don't think so because the GPU load is quite higher with high quality. Even on my Geforce GTX1080 I can see a difference so I supsect weaker GPU will struggle with this.
Agreed. With a GTX 1070, using RCA high quality usually means compromising on some other madVR enhancement: fewer upscaling refinements and/or a lower luma image upscaling setting are usually required to run RCA high quality with a higher strength setting. RCA high quality is often a very, very good madVR enhancement for average-quality 720p and 1080p upscaled anime (sometimes RCA medium quality, or reduce random noise, is just enough). Maybe something between medium and high quality RCA might be nice, if possible? Lower GPU load? madshi says I will love the next madVR version, because of my preference for RCA high quality!
Gopa is offline   Reply With Quote
Old 25th November 2017, 22:02   #47307  |  Link
yukinok25
Registered User
 
Join Date: May 2015
Posts: 17
Quote:
Originally Posted by madshi View Post
It looks like a crash in the Nvidia OpenCL driver. Try using NGU-Anti-Alias instead of NNEDI3. That should fix the problem. NGU AA looks better than NNEDI3 IMHO, anyway, and is faster at the same time.
Thank you madshi.

It's strange though; I am actually using NGU Sharp at medium for chroma, so why would the crash report NNEDI3?

Is it because I left a "let madVR decide" option on somewhere?
yukinok25 is offline   Reply With Quote
Old 25th November 2017, 22:50   #47308  |  Link
Gopa
Registered User
 
Join Date: Oct 2017
Posts: 27
Anime + GTX1070 + 4k monitor + madVR

Newbie questions: I have a 65" 4K TV, so upscaling 720p anime results in a heavy GPU load: direct 4x luma upscaling to 2880p, then downscaling to 2160p. Why is it not possible to just upscale 720p to 2160p (3x instead of 4x)? Is this not possible at all, or is it a limitation of madVR image upscaling? I think my Samsung TV does this (direct 3x upscaling of 720p?), although its upscaling quality is significantly lower compared to madVR upscaling + downscaling (or maybe my TV upscales 4x and then downscales, like madVR?). I prefer 720p anime (original resolution) over upscaled 1080p because of madVR 4x upscaling, so pre-upscaling anime to 1080p is not the best option unless I could improve the quality (fewer artifacts/sharper image, without overall quality loss). Any suggestions, other than upgrading my graphics card? And yeah, I am simply striving for perfection: a GTX 1070 is almost perfect for what I am viewing. Very pleased with madVR, especially RCA high quality and luma image upscaling; the limitation on using image refinements is a fairly minor issue (a little sharper image would be nice).
Gopa is offline   Reply With Quote
Old 26th November 2017, 01:17   #47309  |  Link
thighhighs
Registered User
 
Join Date: Sep 2016
Posts: 70
NGU Soft/Standard/Sharp can upscale 2x and direct 4x; 3x is not supported. There is also no direct quadrupling for NGU AA; it works only in doubling-twice mode.
The last I remember about an NGU 3x variant, its speed/quality ratio was not good versus NGU 4x + downscaling. But I hope NGU 3x is not dead.
These features are exactly what I would like to see in madVR:

- NGU direct 3x
- NGU AA direct 4x and 3x
- Better SLI support: a) SLI-friendly madVR without negative scaling, b) extra performance from a second GPU (just a dream)
thighhighs is offline   Reply With Quote
Old 26th November 2017, 02:39   #47310  |  Link
Gopa
Registered User
 
Join Date: Oct 2017
Posts: 27
Quote:
Originally Posted by thighhighs View Post
NGU Soft/Standard/Sharp can upscale 2x and direct 4x; 3x is not supported. There is also no direct quadrupling for NGU AA; it works only in doubling-twice mode.
The last I remember about an NGU 3x variant, its speed/quality ratio was not good versus NGU 4x + downscaling. But I hope NGU 3x is not dead.
These features are exactly what I would like to see in madVR:

- NGU direct 3x
- NGU AA direct 4x and 3x
- Better SLI support: a) SLI-friendly madVR without negative scaling, b) extra performance from a second GPU (just a dream)
GTX 1070, average-quality 720p anime: RCA high quality (including strength 10) + chroma NGU Sharp low/medium + luma NGU Sharp direct quadruple high, plus this compromise: refine the image only once instead of after every 2x upscaling (thin edges, soften edges, LumaSharpen, anti-bloating, anti-ringing), plus a few low-strength image enhancement sharpeners if more sharpening is still needed. Refining the image only once is not quite as smooth as refining after every 2x; otherwise, this seems to be a pretty good compromise for using RCA high quality along with NGU high 4x luma upscaling. Images look great (to me). Likely no compromises at all with a GTX 1070 Ti or GTX 1080? Better sources would mean no need for RCA high quality, of course, and RCA medium quality, or reduce random noise, uses a lot less rendertime. RCA high quality + NGU Sharp 4x high luma looks better to me than NGU Sharp very high luma without RCA high quality, most likely because of less-than-great source material. Upscaling 720p to 4K without madVR is not something I care to do ever again! Thanks for the information about past attempts at 3x upscaling. I once thought SLI would be the answer; a more powerful graphics card makes more sense to me now, unless I were also using SLI for gaming. Very happy with a GTX 1070. Yeah, just a little bit more power would still be nice! All of this is because I chose to get a 4K TV instead of a 1080p TV (better overall features with 4K TVs). Not so sure this was the right decision, though. A 65" 1440p monitor might be best, if any existed. I like sitting pretty close, and this is not possible with a 1080p monitor. Always open to suggestions for alternative settings.
Gopa is offline   Reply With Quote
Old 26th November 2017, 06:23   #47311  |  Link
heiseikiseki
Registered User
 
Join Date: Jan 2015
Posts: 37
Quote:
Originally Posted by madshi View Post
As mentioned by heiseikiseki, Intel seems to plan to replace their internal GPUs with AMD soon, so maybe Intel NUCs might do the job then. But the GPU chip alone won't be enough, we also need fast VRAM, and ideally 4GB+ for 4K.
Some Kaby Lake-G parts use 4 GB of HBM2 memory for the GPU,
so I think it would be OK for 4K.

Here is some leaked benchmark information:
https://wccftech.com/intel-kaby-lake...pecifications/

The performance is about 3.3 TFLOPS.

It seems at least much, much stronger than a GTX 750 Ti or GTX 960.
heiseikiseki is offline   Reply With Quote
Old 26th November 2017, 06:32   #47312  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,650
Quote:
Originally Posted by madshi View Post
Good testing. But annoying test results, because the source code changes between 0.91.11 and 0.92.1 are pretty large.

What is the easiest way to reproduce this problem?
Play anything, pause, set screen power-off to 1 minute, and wait. Let the screen blank, then wait a bit; most times, video (but not audio) playback is frozen upon resuming.
For anyone else having this issue (is it Nvidia-only?), try setting LAV to software decoding (hardware decoding "None") until a fix is found.
ryrynz is offline   Reply With Quote
Old 26th November 2017, 06:46   #47313  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,925
Quote:
Originally Posted by heiseikiseki View Post
Some Kaby Lake-G parts use 4 GB of HBM2 memory for the GPU,
so I think it would be OK for 4K.

Here is some leaked benchmark information:
https://wccftech.com/intel-kaby-lake...pecifications/

The performance is about 3.3 TFLOPS.

It seems at least much, much stronger than a GTX 750 Ti or GTX 960.
It is still not clear whether the iGPU gets replaced by the AMD GPU or whether both are on the chip. I don't have to remind anyone here how terrible Nvidia "Optimus" is.

And most importantly, these chips are going to be expensive;
HBM2 is very expensive.
It's a faster 560, which is decent, no question. But really, wait until they are out; there is so much missing information.
huhn is offline   Reply With Quote
Old 26th November 2017, 06:57   #47314  |  Link
heiseikiseki
Registered User
 
Join Date: Jan 2015
Posts: 37
Hello madshi,
I have a small problem with the OSD that has confused me for a long time, and I don't know whether it is a madVR bug or not.
I thought it was caused by my computer being an Optimus system, but I recently bought a new desktop with a GTX 1060 and the problem is still there.

Here are some screenshots.

First, I set presented frames to 16


And here is the OSD


The decoder queue and upload queue seem to work well, at about 17-18 (I set 18).

Some questions:

1. The render queue never reaches 18.
2. I set the present queue to 16, but the maximum shown is 15 (if I set N, it shows N-1).
3. I set the hardware decoder to D3D11 in LAV, but the OSD shows DXVA11.

Is this all normal?
heiseikiseki is offline   Reply With Quote
Old 26th November 2017, 07:37   #47315  |  Link
Nyago123
Registered User
 
Join Date: Jan 2016
Posts: 5
My notes on nVidia driver 388.31 + madVR 0.92.9 & playing HDR on a Win10 1709+LG E6+GTX 1080+Denon X4300H AVR:

Originally, after the Fall Creators Update, the default Windows-included Nvidia drivers wouldn't set HDR for me (MPC-BE). With the reports here on the problems, I went with a 385.xx version, which was OK.

But I decided to play around with the most recent driver release (388.31). What I found is:

1. HDR works correctly with Zoom Player - it kicks in when starting the player, and disables when the player exits at least 90% of the time. One time it started with incorrect colors and one time the player hung. For the former, I just double toggled HDR in the Win 10 Display Settings; for the latter I just killed the process and tried again (to be honest, ZP freezes from time to time on me in general - nothing to do with HDR - so I can't tell if this was just "one of those times").

2. HDR works with MPC-HC but stays on when the player exits. I have to use the Win 10 Display Settings dialog and toggle on/off to get back to SDR.

3. HDR doesn't start at all with MPC-BE

I should probably spend a little more time looking at how LAV might be playing into this, but for now Zoom Player is my player of choice for HDR content until nVidia+MS can get this fixed.
Nyago123 is offline   Reply With Quote
Old 26th November 2017, 10:47   #47316  |  Link
chros
Registered User
 
chros's Avatar
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by madshi View Post
For SDR screenshots, please compare F5 with PrintScreen. Is there a difference? You can configure F5 behaviour in the new madVR "screenshots" settings page.
Yes, there is, using F5 with different HDR->SDR conversions:
- pixel shader conversion is OK
- 3dlut conversion is not (the image is much darker)

PrintScreen in windowed mode works fine for both of them.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
chros is offline   Reply With Quote
Old 26th November 2017, 11:48   #47317  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by ABDO View Post
That would hurt my GTX 150 Ti Boost; I really do not know.
Quote:
Originally Posted by cork_OS View Post
Please don't remove this option, RCA high is too slow for GTX 1060 and below.
Quote:
Originally Posted by Werewolfy View Post
Unfortunately, I don't think so, because the GPU load is considerably higher with high quality. Even on my GeForce GTX 1080 I can see a difference, so I suspect weaker GPUs will struggle with this.
Ok!

Quote:
Originally Posted by Fabulist View Post
Thank you for taking the time to reply. Does Sharpen Edges really do any bloating on 1080p+ sources? I mean objectively and in technical terms, by a specific amount or percentage?

I ran tests on various sets and sources and I am unable to see any kind of bloating while running Sharpen Edges at 4 (a strength where other sharpeners do bloat) on screens up to 65 inches. Is there something I am missing, or am I not looking at what I should be looking at? Does it bloat in a different way than other sharpeners do, one which I cannot identify, like a different kind of video distortion?
Sharpen Edges internally does some supersampling, which reduces bloating. It's all somewhat subjective; please trust your eyes and pick values that look good to you. None of this is really scientific. If you want scientific, you'd have to use deconvolution instead of sharpening, but even then, which deconvolution kernel would you use? Gaussian or something else? Linear light or gamma light?
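To make the sharpening-vs-bloating tradeoff concrete, here is a generic unsharp-mask sketch with a crude anti-bloat clamp. This is illustrative only: it is not madVR's actual shader code, and the `anti_bloat` blend below is a hypothetical stand-in for madVR's anti-bloating option.

```python
import numpy as np

def gaussian_kernel1d(sigma, radius):
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-(x * x) / (2 * sigma * sigma))
    return k / k.sum()

def blur(img, sigma=1.0):
    # separable Gaussian blur on a 2-D luma plane
    k = gaussian_kernel1d(sigma, radius=int(3 * sigma))
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

def unsharp(img, strength=1.0, anti_bloat=0.0):
    soft = blur(img)
    # classic unsharp mask: boost the high-frequency residual,
    # which overshoots ("blooms") on the bright side of edges
    sharp = img + strength * (img - soft)
    if anti_bloat > 0:
        # crude anti-bloat: pull overshoots back toward the local
        # envelope [min(img, blur), max(img, blur)] -- note this also
        # reduces the visible sharpening, so strength may need raising
        lo, hi = np.minimum(img, soft), np.maximum(img, soft)
        sharp = (1 - anti_bloat) * sharp + anti_bloat * np.clip(sharp, lo, hi)
    return np.clip(sharp, 0.0, 1.0)
```

On a step edge, the plain unsharp result overshoots the brighter plateau; with `anti_bloat=1.0` the overshoot is clamped away, which is the tradeoff described above.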

Quote:
Originally Posted by Blackwalker View Post
I'll try the 385.2x driver and check that "HDR and Advanced Color" is turned off.
Hope it will work for you!

Quote:
Originally Posted by yukinok25 View Post
It's strange though; I am actually using NGU Sharp at medium for chroma, so why would the crash report NNEDI3?

Is it because I left a "let madVR decide" option on somewhere?
"Let madVR decide" will never switch to NNEDI3 if you haven't selected it. In your bug report there were 2 crashes. One was not clear. The other one pointed to a crash in the Nvidia OpenCL driver. madVR uses OpenCL only for NNEDI3, so my conclusion was that you probably used NNEDI3. Please check the OSD (Ctrl+J) to confirm that NNEDI3 is really not used, neither for chroma upscaling nor for image upscaling. If you're really not using NNEDI3, then I'm at a loss as to why the Nvidia OpenCL driver crashed!

Quote:
Originally Posted by Gopa View Post
Newbie questions: I have a 65" 4K TV, so upscaling 720p anime results in a heavy GPU load: direct 4x luma upscaling to 2880p, then downscaling to 2160p. Why is it not possible to just upscale 720p to 2160p (3x instead of 4x)?
It's possible, but difficult to do, and I'm not sure it would actually produce competitive results. It might look worse than x4 + downscaling. Furthermore, it would probably be only barely faster than x4 upscaling + downscaling. Anyway, x3 is still on my list of things to look at, but it's a low priority atm, because I doubt its usefulness.
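For reference, a plain arithmetic sketch of the scale factors under discussion (the resolutions are just the standard 720p and 2160p frame sizes):

```python
# Why 720p -> 2160p is exactly a 3x job, while NGU's doubling passes
# naturally give 2x and 4x (so madVR downscales the 4x output afterwards).
src_w, src_h = 1280, 720      # 720p source
dst_w, dst_h = 3840, 2160     # 4K (2160p) target

factor = dst_h / src_h
print(factor)                  # 3.0 -> a direct 3x scaler would hit 4K exactly

# NGU doubling: each pass doubles both axes
after_2x = (src_w * 2, src_h * 2)   # (2560, 1440) -- not enough for 4K
after_4x = (src_w * 4, src_h * 4)   # (5120, 2880) -- overshoots, hence the
                                    # extra 2880p -> 2160p downscaling step
```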

Quote:
Originally Posted by heiseikiseki View Post
some of Kabylake-G use 4GB HBM2 memory for the GPU
so I think it would be ok for 4k.
Yes, that looks promising, actually! We'll have to wait and see how it works in reality, but it *could* be suitable for madVR!

Quote:
Originally Posted by huhn View Post
It is still not clear whether the iGPU gets replaced by the AMD GPU or whether both are on the chip. I don't have to remind anyone here how terrible Nvidia "Optimus" is.
FWIW, the key problem with Optimus is that there are really 2 totally separate GPUs, each of which has its own HDMI/DVI driver inside. And what is worse: the actual driving of the display (and HDMI output) is done by the Intel GPU, while the rendering is done by the Nvidia GPU, so the 2 GPUs have to work together. Basically, Nvidia has to transport the rendered frames to the Intel GPU to be sent to the display. Practically this means both drivers have to work together. That's like driver bugs ^ 2!

If the new Intel CPUs with integrated AMD GPUs only have one GPU instead of two, most (or even all) of the Optimus problems should not occur.

Quote:
Originally Posted by ryrynz View Post
Play anything, pause, set screen power-off to 1 minute, and wait. Let the screen blank, then wait a bit; most times, video (but not audio) playback is frozen upon resuming.
For anyone else having this issue (is it Nvidia-only?), try setting LAV to software decoding (hardware decoding "None") until a fix is found.
So it only occurs with hardware decoding? Does it occur with DXVA2 copyback *and* native? How about D3D11?

Quote:
Originally Posted by heiseikiseki View Post
Hello madshi,
I have a small problem with the OSD that has confused me for a long time, and I don't know whether it is a madVR bug or not.
I thought it was caused by my computer being an Optimus system, but I recently bought a new desktop with a GTX 1060 and the problem is still there.

The decoder queue and upload queue seem to work well, at about 17-18 (I set 18).

Some questions:

1. The render queue never reaches 18.
2. I set the present queue to 16, but the maximum shown is 15 (if I set N, it shows N-1).
3. I set the hardware decoder to D3D11 in LAV, but the OSD shows DXVA11.

Is this all normal?
That's all perfectly normal. The queues don't have to be 100% full. As long as they're nearly full, everything's fine. The key thing to look at is that you don't get frame drops/repeats or presentation glitches increasing all the time during playback.
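As an aside on the N-1 maximum: a classic reason a queue of capacity N tops out at N-1 entries is a ring buffer that distinguishes "empty" from "full" by leaving one slot unused. Whether madVR's present queue actually works this way is an assumption here; the sketch below is purely illustrative.

```python
class RingQueue:
    """Fixed-size ring buffer that tells 'empty' from 'full' by
    leaving one slot unused -- so capacity N holds at most N-1 items."""
    def __init__(self, n):
        self.buf = [None] * n
        self.head = self.tail = 0
    def push(self, item):
        nxt = (self.tail + 1) % len(self.buf)
        if nxt == self.head:          # advancing would look 'empty' -> full
            return False
        self.buf[self.tail] = item
        self.tail = nxt
        return True
    def __len__(self):
        return (self.tail - self.head) % len(self.buf)

q = RingQueue(16)
filled = sum(q.push(i) for i in range(20))
print(filled, len(q))  # 15 15 -- a 16-slot queue tops out at 15 entries
```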

Quote:
Originally Posted by Nyago123 View Post
My notes on nVidia driver 388.31 + madVR 0.92.9 & playing HDR on a Win10 1709+LG E6+GTX 1080+Denon X4300H AVR:

Originally after the Fall Creator's update, the default Windows-included nVidia drivers wouldn't set HDR for me (MPC-BE). With the reports here on the problems, I went with a 385.xx version which was OK.

But I decided to play around with the most recent driver release (388.31). What I found is:

1. HDR works correctly with Zoom Player - it kicks in when starting the player, and disables when the player exits at least 90% of the time. One time it started with incorrect colors and one time the player hung. For the former, I just double toggled HDR in the Win 10 Display Settings; for the latter I just killed the process and tried again (to be honest, ZP freezes from time to time on me in general - nothing to do with HDR - so I can't tell if this was just "one of those times").

2. HDR works with MPC-HC but stays on when the player exits. I have to use the Win 10 Display Settings dialog and toggle on/off to get back to SDR.

3. HDR doesn't start at all with MPC-BE

I should probably spend a little more time looking at how LAV might be playing into this, but for now Zoom Player is my player of choice for HDR content until nVidia+MS can get this fixed.
Interesting. But why not stick with 385.xx until the issue is fixed?

Quote:
Originally Posted by chros View Post
Yes, there is, using F5 with different HDR->SDR conversions:
- pixel shader conversion is OK
- 3dlut conversion is not (the image is much darker)

PrintScreen in windowed mode works fine for both of them.
Can you give me a few more details? How is the 3dlut conversion not OK? Is the 3dlut not applied to the screenshot? Or is it applied incorrectly? What happens if you disable the 3dlut? Are F5 and PrintScreen identical then for SDR movies?
madshi is offline   Reply With Quote
Old 26th November 2017, 11:53   #47318  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,650
Quote:
Originally Posted by madshi View Post
So it only occurs with hardware decoding? Does it occur with DXVA2 copyback *and* native? How about D3D11?
Occurs with all of those, haven't been able to reproduce on Intel.
ryrynz is offline   Reply With Quote
Old 26th November 2017, 12:08   #47319  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
In case you guys wonder how NGU Sharp compares to mpv's latest FSRCNN(X), here's a little comparison:

Blu-Ray screenshot
downscaled (PNG) | (JPG, 100 quality)
latest FSRCNN32
latest FSRCNNX32
NGU Sharp - High
NGU Sharp - Very High

To make things as fair as possible I've downscaled the image with Bicubic/Catrom, which is exactly what FSRCNN and FSRCNNX were trained for.

Here are benchmark numbers, for 720p doubling:

Code:
Nvidia 1070:
FSRCNN16: 15.270 ms
FSRCNNX16: 26.397 ms
FSRCNN32: 46.290 ms
FSRCNNX32: ? (estimated: 80.021 ms)
NGU-Sharp High: 3.940 ms
NGU-Sharp Very High: 11.800 ms
Code:
AMD 560:
FSRCNN16: 14.289 ms
FSRCNNX16: 24.412 ms
FSRCNN32: 45.235 ms
FSRCNNX32: ? (estimated: 77.282 ms)
NGU-Sharp High: 12.970 ms
NGU-Sharp Very High: 37.100 ms
These are very weird benchmark results, to say the least. We know that NGU doesn't run as well as it should on AMD Polaris GPUs. But FSRCNN(X) running (ever so slightly) faster on my AMD 560 than on my Nvidia 1070 is just plain weird.
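For easier comparison, here are the relative speeds implied by the numbers above (the `bench` dict simply restates the measured milliseconds per frame from the two benchmark listings):

```python
# Speed ratios from the posted 720p-doubling benchmarks (ms per frame).
bench = {
    "Nvidia 1070": {"FSRCNN32": 46.290, "NGU-Sharp High": 3.940,
                    "NGU-Sharp Very High": 11.800},
    "AMD 560":     {"FSRCNN32": 45.235, "NGU-Sharp High": 12.970,
                    "NGU-Sharp Very High": 37.100},
}

for gpu, t in bench.items():
    for ngu in ("NGU-Sharp High", "NGU-Sharp Very High"):
        ratio = t["FSRCNN32"] / t[ngu]
        # e.g. "Nvidia 1070: FSRCNN32 / NGU-Sharp High = 11.7x faster"
        print(f"{gpu}: FSRCNN32 / {ngu} = {ratio:.1f}x faster")
```

So on the 1070, NGU-Sharp High is roughly an order of magnitude faster than FSRCNN32, while on the 560 even NGU-Sharp Very High still edges it out.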

Last edited by madshi; 26th November 2017 at 13:29.
madshi is offline   Reply With Quote
Old 26th November 2017, 12:26   #47320  |  Link
ABDO
Registered User
 
Join Date: Dec 2016
Posts: 65
Quote:
Originally Posted by madshi View Post
In case you guys wonder how NGU Sharp compares to mpv's latest FSRCNN(X)
Well, no surprise here: NGU Sharp looks best, crisp and closest to the
Blu-ray.

Edit:
I see that FSRCNN brings some color shift.

Last edited by ABDO; 26th November 2017 at 12:32.
ABDO is offline   Reply With Quote