Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 9th March 2014, 19:44   #24681  |  Link
e-t172
Registered User
 
Join Date: Jan 2008
Posts: 589
Quote:
Originally Posted by e-t172 View Post
So I got frame drop/presentation glitch issues when using madVR with a 780 Ti. To be fair it's almost perfect, but I still get at least one frame drop or other discontinuity every 10 minutes or so (it seems to happen at random times). I tried every imaginable combination of flushing settings/queue depths/use separate device for presentation/disable composition (well, almost) without any success. I'm out of ideas and asking for help.
Quote:
Originally Posted by Incriminated View Post
Try to check if your card operates within the normal range with tools like MSI-Afterburner.
Tried that, nothing seems out of the ordinary. I tried putting the GPU in high performance mode (in the NVidia per-program power management settings) and setting the Windows power options to performance mode as well, no luck.

Quote:
Originally Posted by iSunrise View Post
Personally, I would try the 327.23 drivers first
AFAIK 327.23 doesn't support the 780 Ti, it's too old.

madshi: are you interested in a log for this issue? I'm all out of ideas here. Even the old rendering path has the exact same issue on my system.
e-t172 is offline   Reply With Quote
Old 9th March 2014, 20:06   #24682  |  Link
Shiandow
Registered User
 
Join Date: Dec 2013
Posts: 753
Quote:
Originally Posted by Ver Greeneyes View Post
Just use the table given by dispcal and interpolate between the values (this would be easiest in dispcal itself, since it also knows what the target gamma curve was, including any viewing conditions transformation). As long as the values given by dispcal are monotonic it shouldn't be hard.
I agree with that method, but a linear interpolation won't be enough. I saw you suggested some other methods in your previous comment; those should probably be better. Basically, what I was trying to say is that it might be a good idea to do this interpolation in log-log space. The main reason I think so is that the Rec. 709 and sRGB gamma curves are largely linear in log-log space, which should make interpolation easier.
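For illustration, a minimal sketch of the log-log idea (the measurement values below are invented for the sketch, not real dispcal output):

```python
import numpy as np

# Hypothetical grayscale measurements: input level -> luminance (cd/m2).
# Real values would come from dispcal; these are made up.
levels = np.array([0.05, 0.10, 0.25, 0.50, 0.75, 1.00])
luminance = np.array([0.15, 0.70, 5.5, 26.0, 66.0, 120.0])

def interp_loglog(x, xs, ys):
    """Piecewise-linear interpolation in log-log space.

    A pure power curve y = k * x^g is exactly a straight line in
    log-log coordinates, so gamma-like curves interpolate well this
    way. Zero (true black) has to be handled separately, since
    log(0) is undefined.
    """
    return np.exp(np.interp(np.log(x), np.log(xs), np.log(ys)))

# Estimate the luminance at an unmeasured level:
print(interp_loglog(0.4, levels, luminance))
```

Between the 0.25 and 0.5 measurement points, the estimate follows the local power-law segment rather than a straight line in linear space, which is exactly what makes the interpolated curve behave like a gamma curve.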
Shiandow is offline   Reply With Quote
Old 9th March 2014, 20:20   #24683  |  Link
XMonarchY
Guest
 
Posts: n/a
Quote:
Originally Posted by madshi View Post

[Screenshots: gamma | original | linear (pure power 1/0.45) | linear (sRGB 2.4) | linear (BT.709 1/0.45)]

Again, make sure you watch these at 100%. Having your browser zoom these will totally screw up everything.

The key point to take from this screenshot comparison is that dithering in gamma light is not correct. I hope everybody can agree with that now? Which image looks best to you will directly depend on how your display is calibrated. The image created with the transfer function nearest to your display calibration should look nearest to the original 8bit image to your eyes.
My TV is calibrated to Rec709/sRGB with BT.1886 gamma curve using ArgyllCMS LUT. The best image that mimics the original is the Linear BT.709 and the next best one is Gamma. Others are too dark. I know Graeme firmly considers BT.1886 gamma to be THE gamma to use, but I am not sure how it relates to dithering gamma...
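For reference, BT.1886 is not a fixed power curve; it is anchored to the display's measured white and black levels, which is why it interacts with calibration this way. A minimal sketch of the standard formula (the Lw/Lb defaults are example numbers, not a recommendation):

```python
def bt1886_eotf(V, Lw=120.0, Lb=0.1):
    """ITU-R BT.1886 reference EOTF.

    Maps a normalized signal V in [0, 1] to luminance in cd/m2.
    Lw and Lb are the display's measured white and black luminance;
    the 120 / 0.1 defaults are just plausible example values.
    """
    g = 2.4
    n = Lw ** (1 / g) - Lb ** (1 / g)
    a = n ** g                      # scaling term
    b = Lb ** (1 / g) / n           # black-level lift
    return a * max(V + b, 0.0) ** g

# The endpoints land exactly on the measured black and white levels:
print(bt1886_eotf(0.0), bt1886_eotf(1.0))
```

The black-level lift term b is what distinguishes BT.1886 from a plain power 2.4 curve on any display whose black is not perfectly zero.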
  Reply With Quote
Old 9th March 2014, 20:32   #24684  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
As we can see, people voted for every single one of the screenshots as looking closest to their monitor gamma.
The question remains, how will madshi solve these differences in preferred transfer function and make everybody happy (again)?

Time has shown that madshi does not disappoint when it comes to satisfying our videophile needs (or the differences between them).
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 9th March 2014 at 20:36.
James Freeman is offline   Reply With Quote
Old 9th March 2014, 20:38   #24685  |  Link
XMonarchY
Guest
 
Posts: n/a
Quote:
Originally Posted by James Freeman View Post
As we can see, people voted for every single one of the screenshots as looking closest to their monitor gamma.
The question remains, how will madshi solve these differences in preferred transfer function and make everybody happy (again)?

Time shows madshi does not disappoint satisfying our videophile needs.
Why can't dithering use the same exact transfer function as the video? Is it because we do not know the exact gamma curve that was used when the content was mastered? Isn't there some kind of linear gamma default?
  Reply With Quote
Old 9th March 2014, 20:40   #24686  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
Quote:
Originally Posted by James Freeman View Post
The question remains, how will madshi solve these differences in preferred transfer function and make everybody happy (again)?
IMHO people need to stop judging too much at low bitdepth.
At 8-bit, any of the linear ones will probably solve the issue that Gamma presents, since the effect is minuscule there.
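A toy calculation makes both halves of this point concrete: a display averages light (linear), not code values, so dithering in gamma light is wrong in principle; but for adjacent 8-bit codes the discrepancy is tiny. A pure power 2.2 curve is assumed purely for illustration:

```python
def to_linear(code, gamma=2.2):
    """Convert an 8-bit code value to linear light, assuming a
    pure power 2.2 display curve (an illustration, not any
    particular display)."""
    return (code / 255.0) ** gamma

# Dithering 50/50 between codes 100 and 101 emits the average of
# their linear-light outputs:
dithered = 0.5 * (to_linear(100) + to_linear(101))

# The light a fictional "code 100.5" would emit:
ideal = to_linear(100.5)

# The two differ (averaging code values overshoots slightly, because
# the power curve is convex), but only by roughly 0.01% here.
print(dithered, ideal)
```

At lower bit depths the adjacent codes are much farther apart, so the same convexity error becomes visible, which is why the 2-bit/3-bit comparison screenshots show it so clearly.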
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 9th March 2014, 20:41   #24687  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by XMonarchY View Post
Why can't dithering use same exact transfer function as the video? Is it because we do not know the exact gamma tone that was used for whatever content mastering? Isn't there a linear gamma default of some kind?
As strange as it sounds, it's because of your display device's unique gamma curve, not the video content's gamma curve; that is why one user selects one picture and another user selects a different one.

Quote:
Originally Posted by nevcairiel
IMHO people need to stop judging too much at low bitdepth.
At 8-bit, any of the linear ones will probably solve the issue that Gamma presents, since the effect is minuscule there.
True, but...
We could have said the same when we tested dithering, but then we would not have gotten to where we are now.
Probably we would have been satisfied with one of the first Direct Compute builds.

Let me post a small phrase from the Audiophile/Videophile book (which I just made up):
"Hearing or Seeing means nothing; it's KNOWING* that really counts..."
*Can easily be transferred from person to person by (relatively weak) subconscious suggestion.


Actually, it does not matter.
IMO madVR's goal is to produce the best picture quality, whether a less observant (or indifferent) user sees it or not.
It's not about "good enough"; it's about "the best of the best", for which madVR currently holds the medal.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 9th March 2014 at 20:59.
James Freeman is offline   Reply With Quote
Old 9th March 2014, 23:33   #24688  |  Link
MistahBonzai
Registered User
 
Join Date: Mar 2013
Posts: 101
Quote:
Originally Posted by Ver Greeneyes View Post
Thanks! Out of these, sRGB preserves the most detail but also brightens the source somewhat. 709 is most correct in terms of hue, saturation and brightness, but still looks noticeably darker than the original near black. Unfortunately I don't have an easy way of looking at the images from far away while still switching between them quickly, so I had to make do with flipping back and forth between the original and the bit-reduced images from fairly up close. Both sRGB and 709 are pretty close to the original for me, which matches my somewhat hybrid calibration (Rec.709 adjusted for sRGB viewing conditions).

I should note that I viewed these in my browser (Firefox) with my Rec.709-focused ICC profile active (gfx.color_management.mode = 1) and 1D curves loaded into my videoLUT.
Belated input here... Having an apparently similar 'monitor calibration', I saw the images much as you did when I expanded them in the Chrome browser by pressing the little "+". Otherwise they appeared much as they did to the others who said that "avatar2bitgamma" was closest; in fact it was almost an exact match between it and the original 8-bit image (excluding dither, of course).

So... having never revisited browser color management since discovering many years ago that it was a crap shoot (I always download images to perform evaluations), I figured the time had come to take another look. Like you, I set gfx.color_management.mode = 1 in the latest Firefox (Aurora 29.0a2) and defined my CalibratedMonitorProfile.icc profile as the default.

While cross-checking between Chrome and Aurora using the test files I could see no discernible difference, nor between them and a local copy displayed via Picasa Image Viewer. If I'm wrong, I'm consistently wrong.
MistahBonzai is offline   Reply With Quote
Old 9th March 2014, 23:48   #24689  |  Link
cyberbeing
Broadband Junkie
 
Join Date: Oct 2005
Posts: 1,859
Quote:
Originally Posted by e-t172 View Post
AFAIK 327.23 doesn't support the 780 Ti, it's too old.
If you want to use OpenCL, try the following driver with the modded inf I created. Q-the-STORM, who also has a GTX 780 Ti, had success installing it with the modified inf on Win7 x64:

Quote:
Originally Posted by cyberbeing View Post
Quadro 321.10 (December 17, 2013) + Modified Inf for GTX 780 Ti
I can confirm it works with madVR OpenCL.
cyberbeing is offline   Reply With Quote
Old 10th March 2014, 01:21   #24690  |  Link
markanini
Registered User
 
Join Date: Apr 2006
Posts: 299
Am I the only one who thinks the perceived tone response is downright atrocious with gamma in the avatar comparison?
markanini is offline   Reply With Quote
Old 10th March 2014, 01:45   #24691  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Originally Posted by Ver Greeneyes View Post
A 12-bit table (3x4096) would cover the smallest differences that humans can discern, but that might have a bigger impact on performance (I don't know). 11-bit seems to be the limit of what madVR's dithering can do as far as calibration is concerned.
I bet a 12-bit table (3x4096) would have a rather negative effect on performance seeing as it would be 8 GB.


Does anyone see shadows that are too dark with linear light dithering in 8 bit? I assume no one noticed shadows that were too light in 8 bit with gamma dithering? I know I didn't.

If we cannot notice the difference between the two extremes in 8 bit wouldn't any of the linear light dither options, being mathematically more correct, be fine for normal use?

Best of the best can be taken too far...

linear (pure power 1/0.45) looks the best to me, or maybe sRGB.

Last edited by Asmodian; 10th March 2014 at 03:25.
Asmodian is offline   Reply With Quote
Old 10th March 2014, 01:56   #24692  |  Link
Anime Viewer
Troubleshooter
 
Anime Viewer's Avatar
 
Join Date: Feb 2014
Posts: 339
Quote:
Originally Posted by seiyafan View Post
Just want to double check, in MPC under LAV video decoder, what's a good hardware decoder to use? Is it none?
Here is a strong argument for the None choice.

https://forums.geforce.com/default/topic/527075/geforce-drivers/hardware-acceleration-decoding-issue/1/

Run it with each of the different settings you can choose for LAV hardware acceleration. Chances are you'll experience pixelation (but not dropped frames) when you run the video clip there with any of the hardware acceleration options selected, but you won't have the issue if you choose None.

Granted, just because DXVA2 may cause pixelation in LAV doesn't mean you couldn't use it for upscaling or downscaling in madVR. As you'll see if you use it in any of the madVR scaling settings and play the same video clip, it shouldn't pixelate.
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.
Anime Viewer is offline   Reply With Quote
Old 10th March 2014, 02:04   #24693  |  Link
Ver Greeneyes
Registered User
 
Join Date: May 2012
Posts: 447
Quote:
Originally Posted by Asmodian View Post
I bet a 12-bit table (3x4096) would have a rather negative effect on performance seeing as it would be 8 GB.
I'm just talking about 3 1D curves, not a 4096^3 3DLUT :P Inverting the whole 3DLUT and increasing its resolution would be pretty insane
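A back-of-the-envelope size comparison makes the difference in scale obvious (assuming 16-bit entries throughout; this is just arithmetic, not a statement about madVR's actual internal formats):

```python
# Three 1D curves with 4096 entries each, 16 bits (2 bytes) per entry:
bytes_1d = 3 * 4096 * 2
print(bytes_1d)                 # 24576 bytes, i.e. 24 KiB

# A full-resolution 3DLUT: one entry per 12-bit RGB triplet, with
# three 16-bit outputs per entry:
bytes_3d = 4096 ** 3 * 3 * 2
print(bytes_3d // 2 ** 30)      # 384 GiB
```

Three 1D curves fit in a few cache lines; the full 4096-per-axis 3DLUT would not fit in any GPU's memory, which is why only the 1D case is worth discussing.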
Ver Greeneyes is offline   Reply With Quote
Old 10th March 2014, 03:00   #24694  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by nevcairiel View Post
IMHO people need to stop judging too much at low bitdepth.
At 8-bit, any of the linear ones will probably solve the issue that Gamma presents, since the effect is miniscule there.
I like to nitpick as much as the next guy, but I just watched "Out of the Furnace" on BD and the PQ was outstanding with mono-static A4... and God forbid, that was in GL.
leeperry is offline   Reply With Quote
Old 10th March 2014, 04:13   #24695  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Originally Posted by cyberbeing View Post
I strongly encourage you to consider an option for loading a GPU gamma ramp with madVR via shaders for Windowed & FSE mode as well:

[X]Disable GPU Gamma Ramp Globally
....[X]Reload GPU Gamma Ramp via Shaders
I do get slightly better results even within 16-235 using collink "-H" and running madVR in overlay mode, compared to using "-a" in FSE or overlay. I use -r256 in collink as well. I notice better white and black points with -r256, especially when using "-a", but it takes a long time.

Quote:
Originally Posted by Ver Greeneyes View Post
I'm just talking about 3 1D curves, not a 4096^3 3DLUT :P Inverting the whole 3DLUT and increasing its resolution would be pretty insane
madVR already uses 16-bit 1D LUTs; they only need to be inverted.

That might be overkill for 8-bit, but then everyone with a good calibration could watch in 4-bit without noticing, as long as they were not close to the screen.

Last edited by Asmodian; 10th March 2014 at 04:23.
Asmodian is offline   Reply With Quote
Old 10th March 2014, 04:43   #24696  |  Link
Ver Greeneyes
Registered User
 
Join Date: May 2012
Posts: 447
Quote:
Originally Posted by Asmodian View Post
madVR already uses 16 bit 1D LUTs, only need to invert them.
That's pretty much what I'm suggesting. The problem is that as discussed, you'd need more than 256 entries for it to have an effect (so you need to apply some sort of interpolation to generate more points), and you'd have to apply the calibration's target curve on top of them to get linear RGB. And then madVR still needs to actually use it during dithering. Since madshi doesn't want to implement something complicated like that for very little gain (and I don't blame him), and since he mentioned the possibility of loading shaders directly into madVR before, I was suggesting that he could allow a custom shader to override the transfer function used during dithering.
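The inversion step itself is simple once the curve is known to be monotonic; the following is a sketch of the idea only, using a stand-in power curve rather than real calibration data:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 256)   # 256 input signal levels
lut = x ** (1 / 2.2)             # stand-in for a measured monotonic curve

# np.interp with the axes swapped inverts a monotonically
# increasing curve; sampling on a finer grid generates the
# extra points discussed above.
fine = np.linspace(0.0, 1.0, 4096)
inverse = np.interp(fine, lut, x)

# Compare against the analytic inverse y -> y**2.2; the error is
# largest near black, where the forward curve is steepest.
err = np.max(np.abs(inverse - fine ** 2.2))
print(err)
```

With real data the only extra requirement is enforcing monotonicity first: np.interp assumes its xp sequence is increasing and does not check it.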

Last edited by Ver Greeneyes; 10th March 2014 at 05:41.
Ver Greeneyes is offline   Reply With Quote
Old 10th March 2014, 05:01   #24697  |  Link
iSunrise
Registered User
 
Join Date: Dec 2008
Posts: 496
Quote:
Originally Posted by leeperry View Post
I like to nitpick as much as the next guy but I just watched "Out of the Furnace" on BD and PQ was outstanding with mono-static A4...and God forbid, that was in GL
Just asking, but how is that post even helpful when all you did was watch the movie at 8-bit gamma light without comparing it to linear light? At least provide us with something we can work with.

Which one is gamma and which one is linear (both at 8bit):



Which one looks closer to the previous ones:



It doesn't matter whether you inspect the pixels themselves or look at them from a distance; there are very obvious differences.

Quote:
Originally Posted by XMonarchY View Post
My TV is calibrated to Rec709/sRGB with BT.1886 gamma curve using ArgyllCMS LUT. The best image that mimics the original is the Linear BT.709 and the next best one is Gamma. Others are too dark. I know Graeme firmly considers BT.1886 gamma to be THE gamma to use, but I am not sure how it relates to dithering gamma...
Yes, same thing I'm seeing here (when I'm on Rec709). So it seems that for BT.1886/709, "linear BT.709" is the only really useful option, whereas for sRGB displays it's the sRGB curve (it would be interesting to see 2.2 gamma on sRGB too, though).

The "gamma" example is way too bright, whereas the "linear" example is a bit too dark. If I switch to sRGB, though, the linear is more accurate, while the gamma one still retains the elevated (brighter) look.

Last edited by iSunrise; 10th March 2014 at 08:25.
iSunrise is offline   Reply With Quote
Old 10th March 2014, 05:01   #24698  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Originally Posted by Ver Greeneyes View Post
The problem is that, as discussed, you'd need more than 256 entries for it to have an effect (so you need to apply some sort of interpolation to generate more points)
Ah sorry, I understand. I think I missed a page.
Asmodian is offline   Reply With Quote
Old 10th March 2014, 08:44   #24699  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by iSunrise View Post
....whereas for sRGB displays, it's the sRGB curve (it would be interesting to see 2.2 gamma on sRGB, too, though).

The "gamma" example is way too bright, whereas the "linear" example is a bit too dark. If I switch to sRGB, though, the linear is more accurate, while the gamma one still retains the elevated (brighter) look.
I'm doing the tests in 3/4 bit to see the difference more clearly.

On my display:
Linear Power 2.2 is too dark.
Linear sRGB 2.4 is almost there, but just a little too dark.
Linear BT.709 is a little too bright.
Gamma is nothing like the original image, very bright.

So my display falls somewhere between sRGB 2.4 & BT.709.
I think sRGB 2.2 might be it for my display.

I experimented with the Gamma slider in Nvidia Control Panel and managed to make my display match (+/-) the Linear Power 2.2 and sRGB 2.4 images madshi posted.
I use a 3DLUT in madVR calibrated to power 2.2, so the gamma curve will be perfect 2.2 there.
I do not load VideoLUT curves (or include them in the 3DLUT) because they create banding, even in madVR when GPU gamma ramp is disabled.
I just "Profile" the monitor (with 128 neutral patches) without "Calibrating" and create a 3DLUT (2.2 Relative); this gives me a 2.2 curve with no banding whatsoever.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 10th March 2014 at 08:48.
James Freeman is offline   Reply With Quote
Old 10th March 2014, 12:03   #24700  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by iSunrise View Post
all you did was watch the movie at 8bit gamma light, when you didn't compare it with linear light
LL looks terribly dull on my rig; maybe the internal post-processing of my TV isn't compatible for some reason. My point is that, as nev said, you guys seem to focus a tad too much on low bit depths, and LL isn't quite a magic bullet for everyone.
leeperry is offline   Reply With Quote