Old 22nd March 2018, 23:15   #49741  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by Warner306 View Post
Your Vertex doesn't need to trigger the projector's HDR mode to get the gamma right?
No, the projector's HDR10 mode is buggy, so you don't want the crappy gamma that the manufacturer forces when it detects HDR metadata.

So the Vertex doesn't send the HDR metadata, hence the display believes it's receiving SDR, but the full HDR content is sent unchanged. I've designed a better mode that uses RS-232 control to let the Vertex select the best possible gamma curve for the content. So instead of the manufacturer's buggy HDR mode, the Vertex selects a user mode and a custom gamma curve suited to the content. This mode has a BT2020 calibration and an ST2084 gamma curve optimised for the content with the appropriate tonemapping, taking into account the actual peak white of the projector (which the manufacturer has no idea about, as we're talking about a projector here).
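
For illustration only, here is a rough sketch of the kind of curve this involves: decode the ST2084 (PQ) signal to absolute luminance, roll highlights off into the projector's measured peak, and re-encode. The PQ constants are the standard ones; the 120-nit peak, knee position and roll-off shape are my own placeholder assumptions, not what the Vertex/RS-232 setup actually does.

Code:
# Minimal sketch of ST2084 (PQ) decoding plus a simple peak-luminance roll-off.
# The knee position, roll-off shape and 120-nit peak are illustrative assumptions,
# not the actual curve used by the Vertex/projector setup described above.

# SMPTE ST2084 constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(e):
    """Decode a normalised PQ signal value (0..1) to absolute luminance in nits."""
    ep = e ** (1 / M2)
    return 10000.0 * (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1 / M1)

def nits_to_pq(y):
    """Encode absolute luminance in nits (0..10000) back to a PQ signal value."""
    yp = (y / 10000.0) ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

def tone_map(y, display_peak=120.0, knee=0.75):
    """Compress content luminance into the display's range: below knee*display_peak
    the curve is untouched, above it highlights roll off smoothly towards the peak
    instead of clipping."""
    k = knee * display_peak
    if y <= k:
        return y
    return k + (display_peak - k) * (1 - 1 / (1 + (y - k) / (display_peak - k)))

for code in (0.1, 0.3, 0.5, 0.7, 0.9):
    nits = pq_to_nits(code)
    mapped = tone_map(nits)
    print(f"PQ {code:.1f}: {nits:8.1f} nits -> {mapped:6.1f} nits "
          f"(re-encoded PQ {nits_to_pq(mapped):.3f})")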

This is completely off topic here (and unrelated), so I won't elaborate on this. You'll find more info in the Vertex links at the end of the "useful links" section in the first post of this thread: http://www.avsforum.com/forum/24-dig...500-rs600.html, especially this post for a background explanation of the issue on this specific projector model: http://www.avsforum.com/forum/24-dig...l#post55097696
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 22nd March 2018 at 23:18.
Manni is offline   Reply With Quote
Old 22nd March 2018, 23:21   #49742  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
I'm pretty sure I'm familiar with the programmer you quote, though it could be someone different. He was adjusting color corrections via the player and saving the settings per title, claiming HDR was a marketing gimmick and that he was achieving the same thing with SDR and HDR titles without ever engaging HDR mode. Each title had to be previewed and tailored, and the settings were then saved in a file stored beside the title, so users effectively had to tailor their own metadata. He later integrated it with KODI, put up a YouTube video, and then asked for money for it. Rather than pay for software that automates some of this, you can test the theory yourself by playing an SDR or HDR title and adjusting the color settings in your player. I don't know what any of that had to do with Windows getting it right and displays not.

I know how much mclingo and his AMD cards like me. I'm sure he'll be taking his rig over to his neighbor with a Samsung 10bit any moment now. Maybe an AMD card will land in my lap. Good suggestion, one I hadn't thought of. Hmmmm?
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 22 MPC-HC/BE 82" Q90R Denon S720W
brazen1 is offline   Reply With Quote
Old 22nd March 2018, 23:27   #49743  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by brazen1 View Post
I'm pretty sure I'm familiar with the programmer you quote, though it could be someone different. He was adjusting color corrections via the player and saving the settings per title, claiming HDR was a marketing gimmick and that he was achieving the same thing with SDR and HDR titles without ever engaging HDR mode. Each title had to be previewed and tailored, and the settings were then saved in a file stored beside the title, so users effectively had to tailor their own metadata. He later integrated it with KODI, put up a YouTube video, and then asked for money for it. Rather than pay for software that automates some of this, you can test the theory yourself by playing an SDR or HDR title and adjusting the color settings in your player. I don't know what any of that had to do with Windows getting it right and displays not.

I know how much mclingo and his AMD cards like me. I'm sure he'll be taking his rig over to his neighbor with a Samsung 10bit any moment now. Maybe an AMD card will land in my lap. Good suggestion, one I hadn't thought of. Hmmmm?
Did he mention whether Windows does the tone mapping? That's the impression I got: it does trigger HDR mode but alters the metadata. The upcoming Windows HDR calibration feature seems to indicate that.
Warner306 is offline   Reply With Quote
Old 22nd March 2018, 23:40   #49744  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,344
Quote:
Originally Posted by brazen1 View Post
madVR has a native display bitdepth adjustment. We can select 1-9 and the next option is 10 or better.

What if madVR had exclusive adjustments for 10,11,12, and so on? This way no matter what the GPU sent (12 in this case), it would adhere to the display bitdepth limit we select, in this case, 10.

'auto' still selects what the GPU is sending (12) and some of us 10bit native display owners are encountering epic failures because of poorly implemented display modes that don't handle 12bit
madVR can only send two formats to the GPU, 8-bit RGB or 10-bit RGB. The only other commonly supported format on GPUs is 16-bit, and madshi has previously said that the 16-bit format is rather "weird" and doesn't necessarily work right.

So that means that you cannot have any special handling for 12-bit output, since you cannot give the GPU 12-bit data, and dithering to 12-bit makes no sense if you transport 10-bit data afterwards.

As a consequence, this also means that dithering to 10-bit is pointless if you send 8-bit data to the GPU, and I would hope that madVR is actually smart enough to automatically dither to 8-bit if it's sending 8-bit output, even if the device settings are set to 10-bit.
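
To spell out the clamping logic in a few lines of Python (the names and the fixed format list are my own, this is not madVR's actual code):

Code:
# "Dither no deeper than the weakest link": dithering beyond either the transport
# format or the configured display depth just adds precision that gets truncated.
OUTPUT_FORMATS_BITS = (8, 10)  # per the post: madVR hands the GPU 8- or 10-bit RGB

def effective_dither_depth(display_setting_bits, gpu_output_bits):
    if gpu_output_bits not in OUTPUT_FORMATS_BITS:
        raise ValueError(f"unsupported output format: {gpu_output_bits}-bit")
    return min(display_setting_bits, gpu_output_bits)

print(effective_dither_depth(display_setting_bits=10, gpu_output_bits=8))   # -> 8
print(effective_dither_depth(display_setting_bits=10, gpu_output_bits=10))  # -> 10
print(effective_dither_depth(display_setting_bits=12, gpu_output_bits=10))  # -> 10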
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 22nd March 2018, 23:53   #49745  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
I don't know how far it progressed. I was the one persuading him early on to trigger the Windows advanced HDR and color switch and apply HDR through Windows, because there was nothing else at the time. No one could replicate a shortcut to that feature; had there been one, I'd have done my own thing. Then madshi beat him to it, avoiding Windows altogether and going straight to the display once NVIDIA released its private API. I prefer GPU-initiated HDR. Early on, testing Windows HDR against GPU HDR was night and day imo. I know the poster who has lately been testing the problem we've been tiring of tried it recently too, to help debug. To this day, I don't think that switch is automated or triggered by anything; it's always on or always off until you change it. I'm very glad madshi waited and chose the direction he did.

Thank you nevcairiel, that answers that. It seems my fate is in the hands of driver development to introduce 10-bit RGB, since special 12-bit handling isn't possible for 10-bit sources and the only other choice, 8-bit, is a step backwards for HDR. Evidently AMD is able to output it; NVIDIA needs to step it up. I'm about to convert.....
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 22 MPC-HC/BE 82" Q90R Denon S720W

Last edited by brazen1; 23rd March 2018 at 00:18.
brazen1 is offline   Reply With Quote
Old 22nd March 2018, 23:55   #49746  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 896
Quote:
Originally Posted by huhn View Post
it should still be possible to force nvidia to output 10 bit with a custom edid.
Could you elaborate on that? Would the graphics card 'fall back' to 10 bpc by hiding 12 bpc from the accepted formats maybe?
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40
el Filou is offline   Reply With Quote
Old 23rd March 2018, 00:00   #49747  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,344
Quote:
Originally Posted by el Filou View Post
Could you elaborate on that? Would the graphics card 'fall back' to 10 bpc by hiding 12 bpc from the accepted formats maybe?
Of course. It's not like NVIDIA doesn't support 10-bit; they just only offer 8-bit and the highest format the display supports in the settings. My PC screen here, which only supports 10-bit, also makes NVIDIA offer that in the control panel.
So you could take the EDID from your screen as-is, just remove the 12-bit modes, and it should offer you 10-bit options instead.
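
For the curious, here is a rough sketch of what such an EDID override does at the byte level: find the HDMI vendor-specific data block in the CEA-861 extension, clear the 36-/48-bit deep-colour flags so only 30-bit (10 bpc) remains advertised, and fix the block checksum. The offsets follow the CEA-861 / HDMI 1.4 VSDB layout as I understand it, and the file names are just examples; in practice a tool like CRU does this for you, so treat this as illustration only and verify against your own EDID dump.

Code:
# Sketch: strip the 12-bit (and 16-bit) deep-colour flags from an EDID dump so the
# driver should fall back to offering 10-bit. Illustration only; double-check the
# offsets against your own EDID.

HDMI_OUI_LE = bytes((0x03, 0x0C, 0x00))  # HDMI Licensing OUI, stored little-endian
DC_48BIT = 0x40  # bit 6 of the VSDB flags byte
DC_36BIT = 0x20  # bit 5

def strip_12bit(edid):
    edid = bytearray(edid)
    # Walk the 128-byte extension blocks; CEA-861 extensions carry tag 0x02.
    for base in range(128, len(edid), 128):
        if edid[base] != 0x02:
            continue
        dtd_offset = edid[base + 2]   # where the detailed timings start in this block
        i = base + 4                  # data block collection starts at byte 4
        while i < base + dtd_offset:
            tag, length = edid[i] >> 5, edid[i] & 0x1F
            payload = edid[i + 1:i + 1 + length]
            # Tag 3 = vendor-specific data block; match the HDMI OUI, patch the flags.
            if tag == 3 and payload[:3] == HDMI_OUI_LE and length >= 6:
                edid[i + 6] &= ~(DC_48BIT | DC_36BIT) & 0xFF
            i += 1 + length
        # Each 128-byte block must sum to 0 mod 256; recompute the checksum byte.
        edid[base + 127] = (-sum(edid[base:base + 127])) % 256
    return bytes(edid)

with open("display.bin", "rb") as f:        # raw EDID dump, path is an example
    patched = strip_12bit(f.read())
with open("display_10bit.bin", "wb") as f:
    f.write(patched)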
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 23rd March 2018, 00:23   #49748  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by e-t172 View Post
That's completely unsurprising if your "5 year old monitor" has lower contrast than your "2016 LG Oled", which seems extremely likely. The higher the contrast, the more visible banding is.
No, not really; it's a known processing error on LG OLED screens.

OLED only has much "bigger" steps in the dark part; the rest is very close to even 1000:1 TVs. So no, higher contrast doesn't generally mean more banding. And LG claims it's a 10-bit panel with 4 subpixels, one of which is white, giving it in theory 20 bits for luma... so NO.
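
A quick back-of-the-envelope way to see the "bigger steps only in the dark part" point: the relative size of one 10-bit code-value step under a BT.1886 curve, for an ideal-black panel versus a 1000:1 one. The black levels and the use of BT.1886 here are illustrative assumptions, not measurements of any particular TV.

Code:
# Relative luminance jump between adjacent 10-bit codes (a rough Weber fraction)
# under BT.1886, comparing an ideal-black panel to a 1000:1 panel. Near black the
# ideal-black panel has much larger relative steps; at mid-grey they are nearly equal.
GAMMA = 2.4

def bt1886(v, lw=100.0, lb=0.0):
    """ITU-R BT.1886 EOTF: signal v in 0..1 -> luminance in nits."""
    a = (lw ** (1 / GAMMA) - lb ** (1 / GAMMA)) ** GAMMA
    b = lb ** (1 / GAMMA) / (lw ** (1 / GAMMA) - lb ** (1 / GAMMA))
    return a * max(v + b, 0.0) ** GAMMA

def step_ratio(code, lb):
    lo, hi = bt1886(code / 1023, lb=lb), bt1886((code + 1) / 1023, lb=lb)
    return (hi - lo) / lo

for code in (30, 60, 120, 512):
    print(f"code {code:4d}: ideal black {step_ratio(code, 0.0) * 100:5.2f}%/step, "
          f"1000:1 panel {step_ratio(code, 0.1) * 100:5.2f}%/step")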
huhn is offline   Reply With Quote
Old 23rd March 2018, 00:28   #49749  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 155
Quote:
Originally Posted by e-t172 View Post
That's completely unsurprising if your "5 year old monitor" has lower contrast than your "2016 LG Oled", which seems extremely likely. The higher the contrast, the more visible banding is.
While contrast certainly plays a role, it's not about that. The gradation on OLEDs is just bad. Looking at a grayscale step test pattern, it's simply not evenly graded: the difference from one step to the next is sometimes too small and sometimes too big, and the near-black steps are extremely bad.

What's funny is that setting madVR to 5-bit makes the gradation look almost perfect, but that of course causes too much noise and other issues.

Last edited by j82k; 23rd March 2018 at 00:30.
j82k is offline   Reply With Quote
Old 23rd March 2018, 07:55   #49750  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,646
Makes me wonder if madshi couldn't cook up some special OLED dithering xD. Sonys are superior in this area, as reviews have highlighted.
Panasonic and LG both have some improvements in gradients: this year's Panasonic OLEDs feature an updated LUT, and LG's C series and above feature the α9 processor to better deal with this.

Last edited by ryrynz; 23rd March 2018 at 08:21.
ryrynz is offline   Reply With Quote
Old 23rd March 2018, 08:08   #49751  |  Link
x7007
Registered User
 
Join Date: Apr 2013
Posts: 315
Which HDR mode will I want?

NV HDR or OS HDR? Does it matter? NV HDR happens when it auto-detects and enables TV HDR automatically when I run a movie with PotPlayer + madVR. OS HDR happens when I open the movie with HDR enabled in the Windows display settings.
x7007 is offline   Reply With Quote
Old 23rd March 2018, 11:54   #49752  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 1,348
OLED gradation has been discussed to death on AV forums; it's not as simple as the panel itself. Early panels had issues where poorly compressed material would create quantisation errors and blocky noise in dark areas, but a movie with no compression artifacts looks perfect in dark areas. Later panels dealt with this with what we assumed to be better dithering algorithms, but most issues on OLED are caused by poor source material: push great in, get great out; push crap in..... I don't notice any banding at all on my OLED, even at near black, unless it's in the source, and mine is gen one.

There were some differences in the panels: the later 2015 models tended to be better, and a few lucky people like me got a really good example, although I had to send two back with terrible near-black vertical banding, which is the real Achilles' heel of OLED. It can still be a bit of a panel lottery, but it's a lot better than it was. I'd still buy my current OLED over any LCD, FALD or otherwise, as I'm a dark-room viewer and OLED can't be beaten in this field. Once you go OLED there is no going back, unfortunately.
mclingo is offline   Reply With Quote
Old 23rd March 2018, 12:07   #49753  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,646
Yeah, there ain't no beating that perception of depth and clarity that OLED brings. Every day I see the C7, A1 and EZ950, and Sammy's Q8, while good, doesn't compare. I'll never buy an LCD TV :P
ryrynz is offline   Reply With Quote
Old 23rd March 2018, 12:18   #49754  |  Link
bran
Registered User
 
Join Date: Jun 2009
Location: Stockholm
Posts: 28
Quote:
Originally Posted by Warner306 View Post

It would be useful to have a user with an AMD card and the same display. Then you could determine if its the extra 2-bits that is causing the problem.
THIS.

Exactly what I've done - exchanged nvidia (12b) for AMD (10b) on my 2015 flagship Samsung JS9505 (FALD). Banding much reduced.
__________________
HTPC: Samsung 65JS9505, Yamaha A2070, Sonus Faber, RX580
bran is offline   Reply With Quote
Old 23rd March 2018, 12:36   #49755  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
was there still banding?
huhn is offline   Reply With Quote
Old 23rd March 2018, 13:44   #49756  |  Link
bran
Registered User
 
Join Date: Jun 2009
Location: Stockholm
Posts: 28
Quote:
Originally Posted by huhn View Post
was there still banding?
Some, but not nearly to the same extent. It's also hard to rule out the material: I tested using Planet Earth II, where some reviewers have noticed banding in the skies too.
__________________
HTPC: Samsung 65JS9505, Yamaha A2070, Sonus Faber, RX580
bran is offline   Reply With Quote
Old 23rd March 2018, 14:06   #49757  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 155
I'd suggest using test patterns and not movies for testing stuff like banding. With movies you never know if the source is already bad.
This site has a ton of HDR10 test patterns and they're free.

https://yadi.sk/d/RPrX2C7l3HEjPq
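
If you'd rather roll your own, here's a minimal sketch that writes a plain SDR 10-bit grey ramp as a 16-bit PGM (resolution and filename are arbitrary, and this is not one of the HDR10 patterns above, just a basic banding check):

Code:
# Horizontal 10-bit grey ramp, 0..1023 left to right, stored as a 16-bit PGM
# with the 10-bit codes shifted into the upper bits.
import struct

WIDTH, HEIGHT = 3840, 2160

with open("ramp_10bit.pgm", "wb") as f:
    f.write(f"P5\n{WIDTH} {HEIGHT}\n65535\n".encode("ascii"))
    row = bytearray()
    for x in range(WIDTH):
        code10 = x * 1023 // (WIDTH - 1)       # 10-bit code for this column
        row += struct.pack(">H", code10 << 6)  # PGM 16-bit samples are big-endian
    f.write(bytes(row) * HEIGHT)               # every row is identical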
j82k is offline   Reply With Quote
Old 23rd March 2018, 15:25   #49758  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by bran View Post
Some, but not nearly to the same extent. It's also hard to rule out the material: I tested using Planet Earth II, where some reviewers have noticed banding in the skies too.
So what happens if you choose 8-bit? Does the banding get even better? Some banding sounds strange; the processing still isn't doing its job.
Warner306 is offline   Reply With Quote
Old 23rd March 2018, 15:26   #49759  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by x7007 View Post
Which HDR mode will I want?

NV HDR or OS HDR? Does it matter? NV HDR happens when it auto-detects and enables TV HDR automatically when I run a movie with PotPlayer + madVR. OS HDR happens when I open the movie with HDR enabled in the Windows display settings.
I would go with Nvidia HDR. Otherwise, you will be flipping the OS switch every time you watch an HDR video.
Warner306 is offline   Reply With Quote
Old 23rd March 2018, 16:13   #49760  |  Link
GCRaistlin
Registered User
 
GCRaistlin's Avatar
 
Join Date: Jun 2006
Posts: 350
These screenshots were taken in MPC-HC with Alt-I: with EVR, with madVR. No noticeable differences.
These screenshots were taken with PrtScr: with EVR, with madVR. You can see that the EVR one has more grain (look at the sleeve). Why?

(Tinypic changes picture size, nothing can be done about that. Here are the original images).
__________________
Windows 8.1 x64

Magically yours
Raistlin
GCRaistlin is offline   Reply With Quote