Old 5th January 2018, 19:29   #48081  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Peak nit measurement works with D3D11VA without copyback.
You're apparently right regarding detelecine; I forgot about that (I don't consider it important for myself). Black bar cropping could also be done on the GPU. Well, madshi and many users were more interested in new video filters (which is of course absolutely legit)...
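The detection half of black bar cropping is simple enough either way; a minimal NumPy sketch of the usual row-luma approach (purely illustrative, not madVR's actual code):

Code:
import numpy as np

def detect_black_bars(frame, threshold=18):
    """Return (top, bottom) row bounds of the active picture area.

    frame: 2D array of 8-bit luma values (height x width).
    threshold: mean luma below which a row counts as a black bar
               (video black is 16 in limited-range YCbCr).
    """
    row_means = frame.mean(axis=1)
    active = np.where(row_means > threshold)[0]
    if active.size == 0:  # fully black frame: keep everything
        return 0, frame.shape[0]
    return active[0], active[-1] + 1

# 1080-line frame with 140-pixel letterbox bars (2.40:1 inside 16:9):
frame = np.zeros((1080, 1920), dtype=np.uint8)
frame[140:940] = 60  # grey "picture" region
print(detect_black_bars(frame))  # -> (140, 940)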

Last edited by aufkrawall; 5th January 2018 at 19:34.
Old 5th January 2018, 19:43   #48082  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,921
SVP and other such tools need it too.

And all of this could be done on the GPU, but it isn't.
Old 5th January 2018, 20:55   #48083  |  Link
Plutotype
Registered User
 
Join Date: Apr 2010
Posts: 235
Guys, does madVR do automatic gamut conversion? UHD BDs are in BT.2020 (P3); my display is Rec.709.
__________________
System: Intel Core i5-6500, 16GB RAM, GTX1060, 75" Sony ZD9, Focal speakers, OS Win10 Pro, Playback: madvr/JRiver
Old 5th January 2018, 21:07   #48084  |  Link
mytbyte
Registered User
 
Join Date: Dec 2016
Posts: 212
Quote:
Originally Posted by Plutotype View Post
Guys, does madVR do automatic gamut conversion? UHD BDs are in BT.2020 (P3); my display is Rec.709.
Yes. I think by default it assumes a BT.709 gamut and 2.2 gamma if the calibration options are set to "disable...".
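For the curious, a gamut conversion is a 3x3 matrix applied in linear light, i.e. after undoing the transfer function. A minimal NumPy sketch; the coefficients are the standard BT.2020-to-BT.709 matrix from ITU-R BT.2087, and the hard clip at the end is a naive stand-in for whatever gamut mapping the renderer actually does:

Code:
import numpy as np

# Linear-light RGB conversion from BT.2020 to BT.709 primaries
# (coefficients per ITU-R BT.2087; out-of-gamut colors go negative
# or above 1.0 and must be clipped or gamut-mapped).
BT2020_TO_BT709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

def convert_gamut(rgb_linear):
    """rgb_linear: (..., 3) array of linear-light BT.2020 RGB in [0, 1]."""
    out = rgb_linear @ BT2020_TO_BT709.T
    return np.clip(out, 0.0, 1.0)  # naive clip; real renderers gamut-map

# A fully saturated BT.2020 red lies outside BT.709, so it clips:
print(convert_gamut(np.array([1.0, 0.0, 0.0])))  # -> [1. 0. 0.]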
Old 6th January 2018, 16:55   #48085  |  Link
Plutotype
Registered User
 
Join Date: Apr 2010
Posts: 235
Did some comparisons between the BD of Dunkirk and the UHD BD of Dunkirk with madVR's HDR to SDR conversion at the 226, 255 and 280 nit settings. Compressing the HDR highlights was set to OFF.

https://drive.google.com/file/d/1d1e...ew?usp=sharing

The nit setting is kind of tricky, because SDR displays are calibrated to 100-120 nits, but here you can see that going below a 200 nit setting clips the bright highlights unnaturally. In this shot, a nit setting around 250-260 produces a result very close to the SDR Blu-ray.
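To see what the nit setting controls: with highlight compression off, the conversion essentially decodes the PQ signal to nits, scales by the target, and hard-clips whatever lands above it. A toy Python sketch (the PQ constants are from SMPTE ST 2084; the plain scale-and-clip mapping is a simplification, not madVR's exact curve):

Code:
def pq_to_nits(e):
    """SMPTE ST 2084 (PQ) EOTF: signal value in [0,1] -> luminance in nits."""
    m1, m2 = 0.1593017578125, 78.84375
    c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
    p = e ** (1.0 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1.0 / m1)

def to_sdr(signal, target_nits):
    """Toy mapping with highlight compression OFF: scale linearly, hard-clip."""
    return min(pq_to_nits(signal) / target_nits, 1.0)  # 1.0 = SDR peak white

# A highlight mastered at ~244 nits (PQ code value ~0.60):
for target in (120, 226, 255, 280):
    print(target, round(to_sdr(0.60, target), 3))
# 120 -> 1.0 (clipped), 226 -> 1.0 (clipped), 255 -> 0.958, 280 -> 0.872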
__________________
System: Intel Core i5-6500, 16GB RAM, GTX1060, 75" Sony ZD9, Focal speakers, OS Win10 Pro, Playback: madvr/JRiver

Last edited by Plutotype; 6th January 2018 at 17:11.
Old 6th January 2018, 18:24   #48086  |  Link
Sm3n
Registered User
 
Join Date: Jul 2012
Posts: 94
Quote:
Originally Posted by Sm3n View Post
Yep, USING it. But now I think it's all good. I don't know how or why, but I was able to change and save the "optimized" setting for 23fps. I sometimes even managed to reach "no frame drops/repeats expected" estimates of more than several days.

Thanks again for pointing me to this new tutorial and for your time.
Hi again,

Unfortunately, I don't know why, but now - and I didn't change anything - my frame drops are stuck at 1 frame every 3 minutes.

I restarted my custom mode from scratch but can't figure out what the issue could be, and the result is the same every time.

Last time, the same day I asked for help, I updated my NVIDIA driver - I can't say if it was before or after - could that be the reason?

I'll keep trying to fix it, but if someone has an idea, please share it.

Last edited by Sm3n; 6th January 2018 at 19:01.
Old 6th January 2018, 19:07   #48087  |  Link
jkauff
Registered User
 
Join Date: Oct 2012
Location: Akron, OH
Posts: 491
Quote:
Originally Posted by Sm3n View Post
Last time, the same day I asked for help, I updated my NVIDIA driver - I can't say if it was before or after - could that be the reason?

I'll keep trying to fix it, but if someone has an idea, please share it.
When you updated the driver, did you do a "clean install" or use DDU to remove the old driver? If so, go into the Nvidia Control Panel and make sure the power setting is "Adaptive" and that Vsync is turned on.
Old 6th January 2018, 20:02   #48088  |  Link
Sm3n
Registered User
 
Join Date: Jul 2012
Posts: 94
Quote:
Originally Posted by jkauff View Post
When you updated the driver, did you do a "clean install" or use DDU to remove the old driver? If so, go into the Nvidia Control Panel and make sure the power setting is "Adaptive" and that Vsync is turned on.
Nope, no clean install or DDU, but Adaptive and Vsync are activated. Do you think I should do a proper reinstall?

There is one thing I'm not 100% certain I'm doing right: the first step, the "define timing parameters" selection.
EDID is recommended in both the tutorial and the window, but I noticed that in some cases I have to select CVT v1, v2 or CRT. Does it matter? I think that depending on what I choose, I later can or can't switch to the "optimized pixel clock" mode (a sketch of the arithmetic behind that pixel clock is below).

Here are my native modes:

[screenshot]
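For context on that "optimized pixel clock" step: a mode's refresh rate is just the pixel clock divided by the total pixel count (visible plus blanking), so nudging the clock in small steps lets the optimizer land the display's real refresh rate exactly on the video frame rate, which is what drives the drops/repeats estimate. A minimal sketch using the standard CTA-861 2160p24 timing (illustrative only, not the tool's actual code):

Code:
def refresh_hz(pixel_clock_hz, h_total, v_total):
    """Vertical refresh rate implied by a display timing."""
    return pixel_clock_hz / (h_total * v_total)

# Standard CTA-861 2160p24 timing: 5500 x 2250 total pixels, 297 MHz clock
print(refresh_hz(297_000_000, 5500, 2250))  # -> 24.0 Hz exactly
# The NTSC-style 24/1.001 rate needs a slightly lower pixel clock:
print(refresh_hz(296_703_000, 5500, 2250))  # -> ~23.976 Hz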
Last edited by Sm3n; 6th January 2018 at 21:37.
Old 7th January 2018, 03:22   #48089  |  Link
Magik Mark
Registered User
 
Join Date: Dec 2014
Posts: 666
Is there a way to save only the refresh rate optimized values?

I reset my madVR settings from time to time, and the optimized values get reset as well. I would like to prevent this from happening.
__________________
Asus ProArt Z790 - 13th Gen Intel i9 - RTX 3080 - DDR5 64GB Predator - LG OLED C9 - Yamaha A3030 - Windows 11 x64 - PotPlayerr - Lav - MadVR
Old 7th January 2018, 03:56   #48090  |  Link
psyside
Registered User
 
Join Date: Nov 2016
Posts: 46
Hi guys, can anyone help me with a decision on a budget GPU?

Which is the best, and why?

I will use it for 4K mobile phone clips in madVR. I have an S8 and I want them to look the best possible; my monitor is a U2414H IPS, calibrated decently.

So: GTX 960 vs GTX 750 Ti 4GB vs RX 460/560 4GB vs GTX 950 2GB?

The GTX 750 Ti I can get for $90, the GTX 960 for $160, the RX 460/560 for around $130, and the GTX 950 for around $100. Which is the best for the money, and why? What are the differences between these cards in terms of features? Should I also look into the 1050 2GB? Thanks.
Old 7th January 2018, 05:09   #48091  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,650
960 or 1050 Ti ideally on a budget, but that 950 is looking good for the price. You're downscaling, so it doesn't take much processing power unless you're doing 60fps or using SSIM 2D. I'd set it to Bicubic 150 and probably not bother with any other enhancements, since you have tons of detail already.

Last edited by ryrynz; 7th January 2018 at 05:14.
Old 7th January 2018, 11:16   #48092  |  Link
mytbyte
Registered User
 
Join Date: Dec 2016
Posts: 212
The 950 can't do HEVC in hardware, as I understand it... I'd say the 960 if it's cheaper, the 1050 otherwise (the Ti is more expensive).
Old 7th January 2018, 11:43   #48093  |  Link
Plutotype
Registered User
 
Join Date: Apr 2010
Posts: 235
Forget the 950 or 960, they don't have HDMI 2.0b, only HDMI 2.0. Grab a GTX 1050 Ti (Palit makes a passive one).
__________________
System: Intel Core i5-6500, 16GB RAM, GTX1060, 75" Sony ZD9, Focal speakers, OS Win10 Pro, Playback: madvr/JRiver
Old 7th January 2018, 11:47   #48094  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,650
Why would 2.0b even matter for madVR? Yeah, 960/1050+ for HEVC and decent performance, for sure.

Last edited by ryrynz; 7th January 2018 at 12:09.
Old 7th January 2018, 12:33   #48095  |  Link
mytbyte
Registered User
 
Join Date: Dec 2016
Posts: 212
2.0a and 2.0b only add new HDR metadata to pass to the display; perhaps the latest drivers fix that on the GTX 960 (can't confirm, I don't have an HDR TV). But if that's not important to @psyside, this graphics card is quite capable... I'm getting 2160p60 @ 8-bit RGB or @ 10-bit 4:2:0 to the TV.
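Incidentally, those two modes are just what HDMI 2.0's bandwidth allows at 2160p60, as a back-of-envelope Python check shows (average payload only, ignoring HDMI's special 4:2:0 clocking):

Code:
# HDMI 2.0 TMDS: 3 lanes x 6 Gbit/s = 18 Gbit/s raw; 8b/10b coding
# leaves ~14.4 Gbit/s for actual pixel data.
EFFECTIVE_LIMIT = 18e9 * 8 / 10

clock = 594e6  # pixel clock of the CTA-861 2160p60 timing, blanking included
for label, bpp in (("8-bit RGB", 24), ("10-bit RGB", 30), ("10-bit 4:2:0", 15)):
    rate = clock * bpp  # bits per second
    verdict = "fits" if rate <= EFFECTIVE_LIMIT else "does not fit"
    print(f"{label}: {rate / 1e9:.2f} Gbit/s -> {verdict}")
# 8-bit RGB:    14.26 Gbit/s -> fits (barely)
# 10-bit RGB:   17.82 Gbit/s -> does not fit
# 10-bit 4:2:0:  8.91 Gbit/s -> fits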

Last edited by mytbyte; 7th January 2018 at 12:36.
Old 7th January 2018, 12:59   #48096  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,565
Quote:
Originally Posted by mytbyte View Post
The 950 can't do HEVC in hardware, as I understand it... I'd say the 960 if it's cheaper, the 1050 otherwise (the Ti is more expensive).
GTX 950 has the same hardware decoder as GTX 960.
Old 7th January 2018, 13:11   #48097  |  Link
mytbyte
Registered User
 
Join Date: Dec 2016
Posts: 212
Quote:
Originally Posted by sneaker_ger View Post
GTX 950 has the same hardware decoder as GTX 960.
OK. At the time I eventually purchased the GTX 960, there was a net-wide consensus that it was the lowest GTX to feature an HEVC decoder.
Old 7th January 2018, 13:15   #48098  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,565
The GTX 950 was released about 7 months after the GTX 960, so you probably bought yours in between.
Old 7th January 2018, 13:53   #48099  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,921
You can generally ignore minor HDMI version differences for GPUs. It doesn't matter; what kind of 0s and 1s get sent over the cable is entirely up to the driver.

The things a GPU driver can't fix are HDCP, which doesn't matter to us madVR users, and the bandwidth of the HDMI connector.
HDMI 2.0a-z doesn't matter as long as the bandwidth is there to send the data and the driver is able to create that data. There is a reason Kepler cards can do limited HDMI 2.0.
Old 7th January 2018, 14:50   #48100  |  Link
Sm3n
Registered User
 
Join Date: Jul 2012
Posts: 94
Quote:
Originally Posted by Sm3n View Post
Nope, no clean install or DDU, but Adaptive and Vsync are activated. Do you think I should do a proper reinstall?

There is one thing I'm not 100% certain I'm doing right: the first step, the "define timing parameters" selection.
EDID is recommended in both the tutorial and the window, but I noticed that in some cases I have to select CVT v1, v2 or CRT. Does it matter? I think that depending on what I choose, I later can or can't switch to the "optimized pixel clock" mode.

Here are my native modes:

So, I decided to completely remove the GPU driver using DDU in safe mode. I just reinstalled it, and there is no improvement.
I really don't understand it.