Old 17th December 2018, 03:21   #53901  |  Link
seiyafan
Registered User
 
Join Date: Feb 2014
Posts: 162
Thanks for the explanation!
Old 17th December 2018, 06:56   #53902  |  Link
giulianoprs
Registered User
 
Join Date: Mar 2018
Posts: 39
Ah!!! I thought "smooth motion" was like the IFC function of my Panasonic plasma, which gives a strange look when activated, so I always keep IFC OFF!
I'll try enabling "smooth motion" and see what looks different.
"_Please trust your eyes_" hahahaha, you're right: watching movies or TV series I don't notice any judder or irregularity, but when I look at the OSD and see those few "dropped frames" I get nervous, hahaha.
Thanks for all the advice.
Old 17th December 2018, 15:49   #53903  |  Link
chros
Registered User
 
 
Join Date: Mar 2002
Posts: 2,323
Custom resolution mode

I created one custom mode for 23p with madVR (the rest of the modes are fine on my system), and it works fine.
- The interesting part is that it created a 24p entry in the nvidia control panel, not 23p. Is that expected? (See the sketch after this list.)
-- It can still switch to 23p and 24p properly, but ONLY if the display mode is different at the start of playback (e.g. 60p).
- However, if the display is already set to 23p "manually", then madVR can't switch to the custom "23p" mode. Is that expected as well?
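On the 24p-vs-23p naming, here is a quick sketch, under the assumption that the custom mode uses the exact NTSC-film rate and that the control panel simply rounds to whole hertz:

Code:
from fractions import Fraction

# "23p" normally means the NTSC-film rate 24000/1001 Hz, so a control panel
# that rounds to whole hertz will list the very same timing as "24p".
ntsc_film = Fraction(24000, 1001)

print(float(ntsc_film))         # 23.976023976023978 Hz (the actual timing)
print(round(float(ntsc_film)))  # 24 -> shows up as a "24p" entry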

The second point normally isn't an issue, since at the end of playback, when the player closes, madVR switches back to the original mode (e.g. 60p).
But if I set "After playback: Sleep" in MPC-BE, then madVR can't switch the mode back, and the next time I turn on the TV I have to set it back to 60p manually (since it's still in the custom 23p mode).
Is there any workaround for this?
Thanks!
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
Old 17th December 2018, 20:36   #53904  |  Link
nghiabeo20
Registered User
 
Join Date: Jun 2013
Posts: 32
Hi, can someone help me with the calibration settings?

I had my laptop monitor calibrated with an i1Display Pro, but the technician only gave me an ICC file, which I've already loaded in Windows. I then used DisplayCAL to convert it to a madVR 3DLUT (source profile: BT.709, target profile: my ICC file) and loaded the output file in madVR's BT.709 section. Does this mean the correction is applied twice and therefore worsens the image quality?

Thanks!

Update: I think the answer is yes, it worsens the image. But even so, the image quality still doesn't convince me.
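To illustrate the "applied twice" concern, here is a toy sketch with purely made-up numbers (a single gamma instead of a real 3DLUT): if the same correction sits in the chain twice, once via the Windows-loaded profile and once via the 3DLUT, the result no longer matches what was intended.

Code:
# Toy illustration only (assumed gamma values, not the real 3DLUT math).
native_gamma = 2.4   # pretend the display's native response is gamma 2.4
target_gamma = 2.2   # pretend the content expects gamma 2.2

def light_output(level, corrections):
    """Apply the gamma-ratio correction 'corrections' times, then the panel's own response."""
    for _ in range(corrections):
        level = level ** (target_gamma / native_gamma)    # the "calibration" step
    return level ** native_gamma                          # what the panel actually emits

grey = 0.5
print(f"intended        : {grey ** target_gamma:.4f}")             # 0.2176
print(f"corrected once  : {light_output(grey, 1):.4f}")            # 0.2176 (matches)
print(f"corrected twice : {light_output(grey, 2):.4f} <- lifted")  # 0.2471 (too bright)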

Original image: https://drive.google.com/file/d/1buw...ew?usp=sharing

Disable calibration: (screenshot)

Enable calibration with external 3dlut: (screenshot)
How to solve this? Thanks!

Last edited by nghiabeo20; 17th December 2018 at 20:50.
Old 17th December 2018, 21:16   #53905  |  Link
x7007
Registered User
 
Join Date: Apr 2013
Posts: 315
There is an issue again with nvidia 417.35: it doesn't engage the NVidia HDR mode in 10 bit, only 8 bit, no matter what I do, even when I change the Output Color Depth to 10 bit. It worked fine before, I think with one of the 4xx drivers, I don't remember which.

Enabling OS HDR and changing to 10 bit fixes the issue. I need it to be 10 bit because I am using something called DreamScreen, which is like Philips Ambilight: it has LEDs on the back of the TV that match what you see on screen. With 8 bit the colors aren't accurate, because the device doesn't receive the data it expects; with 10 bit the colors are correct.
Old 17th December 2018, 21:33   #53906  |  Link
border.community
Registered User
 
Join Date: Dec 2018
Posts: 10
Quote:
Originally Posted by x7007 View Post
There is an issue again with nvidia 417.35: it doesn't engage the NVidia HDR mode in 10 bit, only 8 bit, no matter what I do, even when I change the Output Color Depth to 10 bit.


same here, only displaying 8-bit since update


is it only displaying 8-bit, or is it sending 8-bit as well?

Last edited by border.community; 17th December 2018 at 22:45.
Old 17th December 2018, 21:47   #53907  |  Link
x7007
Registered User
 
Join Date: Apr 2013
Posts: 315
Quote:
Originally Posted by border.community View Post
same here, only 8-bit since update
Can someone with a 417.xx driver older than .35 please check whether 10 bit works with the NV HDR mode?
Old 17th December 2018, 23:07   #53908  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 155
For me, any driver after 416.81 doesn't trigger NV HDR, not even at 8-bit. The TV doesn't even attempt (and fail) to switch to HDR like it did on previous broken drivers; nothing happens at all.

Win10 1809, 1050 Ti, LG C8
Old 18th December 2018, 02:38   #53909  |  Link
x7007
Registered User
 
Join Date: Apr 2013
Posts: 315
Quote:
Originally Posted by j82k View Post
For me, any driver after 416.81 doesn't trigger NV HDR, not even at 8-bit. The TV doesn't even attempt (and fail) to switch to HDR like it did on previous broken drivers; nothing happens at all.

Win10 1809, 1050 Ti, LG C8
Yes, so the last working driver for NV HDR is 416.81. Thanks.

I guess I'll just have to change the NVidia color settings to YCbCr 4:2:2 10 bit limited every time and enable Windows HDR.

The question is how it will work for games like Far Cry 5. Would the bug be there as well? They switch to HDR automatically without you enabling Windows HDR, so it's like the NV mode... interesting. How can we check?

Last edited by x7007; 18th December 2018 at 02:41.
Old 18th December 2018, 08:10   #53910  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
I need FSE to trigger HDR with current drivers, but then it works fine. Win10 1809, 2080 Ti, 417.35, LG C7, RGB 8 bit full.

Edit: Actually, I seem to need madVR set to output 10 bit for the drivers to switch to HDR; the GPU can output 8 bit, just not madVR.
__________________
madVR options explained

Last edited by Asmodian; 18th December 2018 at 08:33.
Old 18th December 2018, 08:19   #53911  |  Link
kitame
Registered User
 
Join Date: May 2012
Posts: 85
Is TFLOPS still a reliable indicator of which card would be better for madVR, or has that changed?

e.g.
RX Vega 56 = 10.5 TFLOPs
GTX 1070 = 7.2 TFLOPs

In reviewers' game benchmarks both perform roughly the same.
I really wish reviewers added madVR to their benchmark lists.

Last edited by kitame; 18th December 2018 at 08:22.
Old 18th December 2018, 08:50   #53912  |  Link
Betroz
Is this for real?
 
 
Join Date: Mar 2016
Location: Norway
Posts: 168
Like someone else suggested here earlier, for those who run games and use madVR on the same PC, just run a dual-boot setup with Win 8.1 for madVR and Win10 for games, or two installs of Win10 if that's possible.
__________________
My HTPC : i9 10900K | nVidia RTX 4070 Super | TV : Samsung 75Q9FN QLED
Old 18th December 2018, 10:57   #53913  |  Link
chros
Registered User
 
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by nghiabeo20 View Post
Hi, can someone help me with the calibration settings?
This is your topic: https://forum.doom9.org/showthread.php?t=172783&page=14
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
Old 18th December 2018, 12:06   #53914  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,923
Quote:
Originally Posted by kitame View Post
Is TFLOPS still a reliable indicator of which card would be better for madVR, or has that changed?

e.g.
RX Vega 56 = 10.5 TFLOPs
GTX 1070 = 7.2 TFLOPs

In reviewers' game benchmarks both perform roughly the same.
I really wish reviewers added madVR to their benchmark lists.
Sadly, it was never an indicator of madVR performance between different architectures.

TFLOPS figures are now fundamentally flawed too: the 1070's number is quoted at stock clocks, but all properly cooled cards boost to around 1900 MHz, not the ~1500 MHz they are sold at.
Old 18th December 2018, 12:54   #53915  |  Link
kitame
Registered User
 
Join Date: May 2012
Posts: 85
Quote:
Originally Posted by huhn View Post
Sadly, it was never an indicator of madVR performance between different architectures.

TFLOPS figures are now fundamentally flawed too: the 1070's number is quoted at stock clocks, but all properly cooled cards boost to around 1900 MHz, not the ~1500 MHz they are sold at.
The 7.2 TFLOPS figure is actually at 1900 MHz; the max boost is listed as 1900 in the wiki too.
From what the wiki describes, TFLOPS is calculated purely from the number of shader/CUDA cores and their clock speed.
That means the GTX 1070, with only 1920 CUDA cores, should be way slower than the RX Vega 56 with its 3584 shaders.
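For reference, a rough sketch of the arithmetic behind those paper figures: FP32 TFLOPS is roughly 2 FLOPs per FMA * shader count * clock in GHz, divided by 1000. The clocks below are the official rated boost specs; whether a given card actually runs at them is exactly the point made above.

Code:
# Back-of-the-envelope FP32 throughput: 2 (FMA) * cores * clock in GHz gives
# GFLOPS; divide by 1000 for TFLOPS. Core counts are the official specs.
def tflops(cores, clock_ghz):
    return 2 * cores * clock_ghz / 1000

print(f"GTX 1070   @ 1.683 GHz (rated boost)  : {tflops(1920, 1.683):.1f} TFLOPS")  # ~6.5
print(f"GTX 1070   @ 1.9 GHz (typical boost)  : {tflops(1920, 1.90):.1f} TFLOPS")   # ~7.3
print(f"RX Vega 56 @ 1.471 GHz (rated boost)  : {tflops(3584, 1.471):.1f} TFLOPS")  # ~10.5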


In any case, is there any way to get a rough estimate of their actual madVR performance?

Edit: by the way, in certain "compute benchmarks" the Vega series was shown outperforming even the GTX 1080, so its TFLOPS rating isn't entirely baseless.
https://www.anandtech.com/show/11717...d-56-review/17

Last edited by kitame; 18th December 2018 at 13:03.
Old 18th December 2018, 12:58   #53916  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
Quote:
Originally Posted by kitame View Post
In any case, is there any way to get a rough estimate of their actual madVR performance?
Not from paper stats like that. You can roughly use gaming performance as an indicator, but that's not going to be fully accurate either, since game rendering uses far more parts of a GPU than just image shading.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 18th December 2018, 13:04   #53917  |  Link
kitame
Registered User
 
Join Date: May 2012
Posts: 85
Quote:
Originally Posted by nevcairiel View Post
Not from paper stats like that. You can roughly use gaming performance as an indicator, but that's not going to be fully accurate either, since game rendering uses far more parts of a GPU than just image shading.
Is there a particular benchmark that would reflect madVR's workload, or at least come close to it?
Old 18th December 2018, 13:11   #53918  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,923
Sadly, no.

For example, Polaris is fast in games on paper but slow when madVR is used with NGU; only with NGU, though, with "any" other scaler/feature it performs as expected.

It was kind of the same with NNEDI3, so who knows with the next algorithm. Maybe you'll need an RTX card with tensor cores, no one knows.
Old 18th December 2018, 13:36   #53919  |  Link
kitame
Registered User
 
Join Date: May 2012
Posts: 85
I see. However, RTX cards are currently way overpriced, and there were rumors that this was intentional, to clear out the remaining Pascal cards.

Edit: speaking of benchmarks though, I wonder if madshi would be interested in making a benchmark for madVR.

Last edited by kitame; 18th December 2018 at 13:39.
Old 18th December 2018, 13:38   #53920  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 155
Ok so with newer nvidia drivers (tested with 417.35) when starting an HDR movie:

madVR native display bitdepth set to 8-bit --> nothing happens, TV stays in SDR mode
madVR native display bitdepth set to 10-bit --> HDR triggers correctly

After starting a movie with 10-bit I can change it back to 8-bit and HDR stays active, but when closing MPC-HC it doesn't return to SDR mode. Only closing it with 10-bit selected returns the TV to SDR mode.

Doesn't this point to it being a madVR problem, then?
Why would the bitdepth selected in madVR even make a difference to whether HDR triggers or not?