Old 3rd October 2013, 10:11   #20201  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
I've been using ArgyllCMS for quite some time now.
What are the benefits of using madTPG over the built-in one?

P.S.
Over a long email exchange, sending huge verbose calibration reports back and forth and receiving fixed beta versions in return, Graeme Gill managed to fix the black input offset problems when calibrating to the BT.1886 standard (in fact, anything using the -f0 option).
Expect a new version soon.

Last edited by James Freeman; 3rd October 2013 at 10:17.
James Freeman is offline   Reply With Quote
Old 3rd October 2013, 17:06   #20202  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by James Freeman View Post
Graeme Gill managed to fix the black input offset problems when calibrating to the BT.1886 standard (in fact, anything using the -f0 option).
Expect a new version soon.
Goodie, so I'll soon be able to use the following?
Code:
dispcal -P 0.5,0.5,2,2 -v -yl -t6500 -f0 -G2.4 -qm bla
I can't make the test pattern area too big, otherwise those silly Sammy TVs turn off their backlight completely when sent pure black. I'll get into the process of building a 3DLUT out of it later. mVR can also trigger Argyll now; I wonder if it could do -f0 -G2.4, need to look that up.
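For the 3DLUT, if I understand the new ArgyllCMS madVR support correctly, the sequence would look something like this (a sketch only; file names are placeholders and I still need to verify the exact profiling and collink options):
Code:
dispcal -dmadvr -v -t6500 -f0 -G2.4 -qm bla
rem ... profile the display (targen / dispread -dmadvr / colprof) to get bla.icm ...
collink -v -3 madvr -ir Rec709.icm bla.icm bla_3dlut.icm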

Quote:
Originally Posted by NewmanHD View Post
Will the best picture settings prove adequate for an HD7730?
I'm pulling the trigger on a 2GB HD7850; hard to refuse for 110€ shipped, with some serious cooling too.

It's only barely slower than a 660, so that should be the bee's knees with mVR for quite a while. I'll reconfirm.
leeperry is offline   Reply With Quote
Old 3rd October 2013, 18:40   #20203  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by StephaneM View Post
So you can have madVR added to the graph in the UI thread and have the graph in another thread (preferably with a message pump). Then the only thing you have to do is place some madVR calls on the main UI thread (well, I had to do it only for SetWindowPosition, so I guess anything that somehow changes properties on the madVR window needs to be called on the UI thread).
All of this might change some time in the future. I might at some point offer a windowless mode, similar to EVR. That should make things like this a bit less painful. But it will probably be some time before I get to things like that...

Quote:
Originally Posted by NewmanHD View Post
Thanks for your reply. Could you give me a link to the post? I cannot find it.
Sorry, I don't have a link; I'd have to search myself. Just search for posts by the user name I gave you and look at the last 100 posts or so. There should be something useful there.

Quote:
Originally Posted by andybkma View Post
Win7 64-bit, with FSE on (using the new path, "present several frames in advance" checked), HDMI or VGA out to an external monitor or projector: if I put the computer into sleep (standby) and then wake it, the first time (and only the first time) I go into FSE mode I seem to lose graphics output to the external monitor (the monitor goes into sleep mode) for about 3-4 seconds before it finally enters FSE. If I use the old FSE path ("present several frames in advance" unchecked) then I have this problem when coming out of FSE mode instead of going into it. Very odd. This happens only with sleep; if I reboot the computer I don't have this problem.
Not sure what this is, but it doesn't sound like too much of a problem. I don't think madVR has anything to do with it. You could check whether games behave the same way; then you'd have proof that it's not madVR's fault.

Quote:
Originally Posted by ryrynz View Post
Would absolutely love to get a test build with the deband feature. Pretty excited for this one based on what I've read and seen over on the flash3kyuu thread.
Alright, but no guarantees whatsoever:

http://madshi.net/madVRdeband.rar

You can toggle it by pressing Ctrl+Alt+D, or by using the file name tag "deband=low/high". Not available in the settings dialog yet.
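For example, a tagged file name would look something like this:
Code:
The Hobbit deband=high.mkv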

FWIW, this test build also has a nice improvement for FSE mode: rendering times no longer increase when the backbuffer queue is full. In older versions a full backbuffer queue somewhat slowed rendering down. Not anymore (Vista and newer only).

Quote:
Originally Posted by SHaKOL View Post
I have a problem with .avi files when I choose madVR as the video renderer, like this:

[...]

but when I choose Enhanced Video Renderer (custom presenter) it works normally
That's a bug in the decoder which just doesn't show with other renderers. Basically, the renderer sends the decoder buffers to store decoded frames in, and this specific decoder requires the buffer to contain the contents of the previously decoded frame; only then does the image look correct. madVR works differently: it does not initialize the buffers in any way, and DirectShow does not require it to. I could make the decoder work, but at the cost of quite a bit of extra CPU consumption, so I won't do that. Use a better decoder. Maybe you can get LAV Video Decoder or ffdshow to handle this file instead of the AVI Decompressor.
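To illustrate the buffer problem, here's a hypothetical C++ sketch of such a delta-frame decoder (a toy format, not the actual decoder code):
Code:
#include <cstdint>
#include <cstring>

// Delta-frame decoding in the spirit of CRAM / MS Video 1: "skip" blocks
// write nothing and assume dst still holds the previous frame's pixels.
constexpr int kBlockSize = 16; // bytes per block in this toy format

void decodeFrame(const uint8_t* bitstream, int numBlocks, uint8_t* dst)
{
    for (int i = 0; i < numBlocks; i++)
    {
        if (*bitstream++ == 0)
        {
            // skipped block: leave dst untouched. Correct only if dst still
            // contains the previous frame. madVR hands out uninitialized
            // buffers (which DirectShow allows), so this shows garbage.
            continue;
        }
        // changed block: copy fresh pixels from the bitstream
        std::memcpy(dst + i * kBlockSize, bitstream, kBlockSize);
        bitstream += kBlockSize;
    }
}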

Quote:
Originally Posted by StephaneM View Post
I need some help regarding external pixel shaders: no matter what I try, I fail to apply a shader.

No matter what I do, I get 0x80004005 as the result...
Do external pixel shaders work when using MPC-HC?

You'll need to have D3D9 installed and updated to the latest version. IIRC it's not installed by default, at least not with all the helper DLLs etc.
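As a sanity check, a minimal MPC-HC-style pixel shader that just inverts the picture should be enough to see whether external shaders get applied at all (a sketch, compiled as ps_2_0):
Code:
sampler s0 : register(s0);

// invert the picture; if nothing visibly changes,
// the external shader isn't being applied at all
float4 main(float2 tex : TEXCOORD0) : COLOR
{
    float4 c = tex2D(s0, tex);
    return float4(1.0 - c.rgb, c.a);
}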

Quote:
Originally Posted by Duffy Moon View Post
After toggling a few settings, the problem seems to disappear after switching the decoding matrix to BT.601 instead of BT.709, which is what madVR autodetects.

I was getting weird effects: in, say, Breaking Bad season 3 episode 1, the clouds in the Mexican sky had red blotches.
I don't see how switching between BT.601 and BT.709 could cause or fix red blotches in a blue sky. Sure, switching between BT.601 and BT.709 should change colors a bit. But *not* from blue to red!

Quote:
Originally Posted by James Freeman View Post
I've been using ArgyllCMS for quite some time now.
What is the benefits of using madTPG over the built in one?
(1) madTPG dithers, which helps dispcal accuracy and speed.
(2) You run everything through the same pipeline you later use for video playback, so there's no potential difference (in colors, levels, whatever) between the test pattern generator pipeline and the video rendering pipeline.
(3) You can run ArgyllCMS on one PC (e.g. your laptop) and madTPG on another (e.g. your HTPC). Such a remote calibration works automatically with madTPG, provided you allow the network communication to pass any firewalls (see the example below). With ArgyllCMS's built-in test pattern generator you must run ArgyllCMS and the test pattern generator on the same PC, which can be bothersome, depending on your setup.
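With the ArgyllCMS madTPG support, selecting the display should look something like this (an assumption on my side that the release keeps this syntax; madTPG instances on the local PC and on the network are found automatically):
Code:
dispcal -dmadvr -v -t6500 -qm mycal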

Quote:
Originally Posted by James Freeman View Post
Over a long email exchange, sending huge verbose calibration reports back and forth and receiving fixed beta versions in return, Graeme Gill managed to fix the black input offset problems when calibrating to the BT.1886 standard (in fact, anything using the -f0 option).
Expect a new version soon.
Cool. How much of an improvement did the fixes bring? Just slightly better? Or is the result near perfect now?
madshi is offline   Reply With Quote
Old 3rd October 2013, 18:51   #20204  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
-f0 -G2.4
Yes, now it will give you BT.1886 without elevating the 0 black step, thus not sacrificing contrast ratio (tested).

As I have already mentioned, I am not a fan of BT.1886 (on my 870:1 IPS monitor).
I can see compression artifacts on black steps 0 to 3 (out of 255), even on the cleanest Blu-rays like The Hobbit.
They look like random noise or squares.
For me BT.1886 is much too aggressive given my Dell U2410's poor contrast ratio.

On the other hand, the better the monitor's contrast ratio, the closer a BT.1886-calibrated flat panel will look to a good CRT.
IMO™, with anything below 1000:1, stick with power law + BLC.
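For reference, BT.1886 defines the EOTF as:
Code:
L = a * max(V + b, 0)^2.4
a = (Lw^(1/2.4) - Lb^(1/2.4))^2.4
b = Lb^(1/2.4) / (Lw^(1/2.4) - Lb^(1/2.4))
The higher the display's black level Lb (i.e. the lower its contrast), the bigger the offset b, and the more the near-black steps get lifted, which is exactly where I see the artifacts.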


Cheers.

Last edited by James Freeman; 3rd October 2013 at 19:02.
James Freeman is offline   Reply With Quote
Old 3rd October 2013, 18:55   #20205  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by madshi View Post
Cool. How much of an improvement did the fixes bring? Just slightly better? Or is the result near perfect now?
Near perfect.
The problem was that -f0 raised my native blacks from 0.23 to 0.29 (after calibration), thus destroying the contrast ratio (same with L*).
Now my contrast ratio stays native, matching the values I see in the "Report Uncalibrated Display" log.

I'll quote one of the emails:
Quote:
> Can you please explain what did you do to make the calibration more
> accurate,
> for example: tightened the lower 10% dE?

Hi,
not exactly. There was some code that adjusted the targets (for purposes that are no longer clear, perhaps to help convergence on difficult devices?) that I removed, as well as weighting the stopping criteria to err on the side of the luminance being lower than the target rather than higher for the near-black points.
Not that I actually understood it, but I'm still very grateful.

Last edited by James Freeman; 3rd October 2013 at 19:07.
James Freeman is offline   Reply With Quote
Old 3rd October 2013, 20:06   #20206  |  Link
StephaneM
Registered User
 
Join Date: Jun 2006
Posts: 15
Quote:
Originally Posted by madshi View Post
All of this might change some time in the future. I might at some point offer a windowless mode, similar to EVR. That should make things like this a bit less painful. But it will probably be some time before I get to things like that...
This would be an improvement, though once you know the restriction exists it's easy to handle (finding it out can be tricky).


Quote:
Do external pixel shaders work when using MPC-HC?

You'll need to have D3D9 installed and updated to the latest version. IIRC it's not installed by default, at least not with all the helper DLLs etc.
External pixel shaders are working with MPC-HC (well, shaders are applied when madVR is used as the renderer).

For now I'm doing my tests on Windows 8, so I suppose D3D9 is up to date on this platform...
StephaneM is offline   Reply With Quote
Old 3rd October 2013, 21:19   #20207  |  Link
Thunderbolt8
Registered User
 
Join Date: Sep 2006
Posts: 2,197
I got a new laptop with a GTX 760M, but I found that I cannot change the graphics processor to NVIDIA for Media Player Classic, because that option is greyed out (WHQL 327.23). How can I solve this issue?

Edit: managed to change it with NVIDIA Inspector; I had to set "Enable application for Optimus" to the value "SHIM_RENDERING_MODE_ENABLE". Now I can change all the sub-settings as well.
__________________
Laptop Lenovo Legion 5 17IMH05: i5-10300H, 16 GB Ram, NVIDIA GTX 1650 Ti (+ Intel UHD 630), Windows 10 x64, madVR (x64), MPC-HC (x64), LAV Filter (x64), XySubfilter (x64) (K-lite codec pack)

Last edited by Thunderbolt8; 3rd October 2013 at 21:38.
Thunderbolt8 is offline   Reply With Quote
Old 3rd October 2013, 21:40   #20208  |  Link
Thunderbolt8
Registered User
 
Join Date: Sep 2006
Posts: 2,197
Regarding the NVIDIA settings: it's been said that leaving everything at default is recommended. But what about the "texture filtering quality" setting? Shouldn't that affect the image quality of movies as well? It's set to "high" by default/globally, but can be set to "very high".
Does this actually improve anything? In combination with it, "trilinear optimization" can also be set to off for best image quality.


Also, something I noticed: when I use CUVID in LAV Video as hardware acceleration, my GPU and CPU usage are both lower (according to Task Manager and NVIDIA Inspector) than when using no hardware acceleration at all. Can this be right? Shouldn't at least the GPU load be higher, because the NVIDIA card is then being utilized by LAV Video and madVR at the same time?
__________________
Laptop Lenovo Legion 5 17IMH05: i5-10300H, 16 GB Ram, NVIDIA GTX 1650 Ti (+ Intel UHD 630), Windows 10 x64, madVR (x64), MPC-HC (x64), LAV Filter (x64), XySubfilter (x64) (K-lite codec pack)

Last edited by Thunderbolt8; 3rd October 2013 at 22:05.
Thunderbolt8 is offline   Reply With Quote
Old 3rd October 2013, 22:02   #20209  |  Link
SHaKOL
Registered User
 
Join Date: Jun 2013
Posts: 6
Quote:
Originally Posted by madshi View Post
That's a bug in the decoder which just doesn't show with other renderers. Basically, the renderer sends the decoder buffers to store decoded frames in, and this specific decoder requires the buffer to contain the contents of the previously decoded frame; only then does the image look correct. madVR works differently: it does not initialize the buffers in any way, and DirectShow does not require it to. I could make the decoder work, but at the cost of quite a bit of extra CPU consumption, so I won't do that. Use a better decoder. Maybe you can get LAV Video Decoder or ffdshow to handle this file instead of the AVI Decompressor.
How do I use LAV Video Decoder or ffdshow instead of the AVI Decompressor? I opened LAV Video Decoder (Formats) and didn't find AVI.

Last edited by SHaKOL; 3rd October 2013 at 22:05.
SHaKOL is offline   Reply With Quote
Old 3rd October 2013, 23:15   #20210  |  Link
6233638
Registered User
 
Join Date: Apr 2009
Posts: 1,019
Quote:
Originally Posted by madshi View Post
Alright, but no guarantees whatsoever:

http://madshi.net/madVRdeband.rar

You can toggle it by pressing Ctrl+Alt+D, or by using the file name tag "deband=low/high". Not available in the settings dialog yet.
Debanding is okay, I guess...

[before/after screenshots]
(brightened in Photoshop)
6233638 is offline   Reply With Quote
Old 4th October 2013, 02:55   #20211  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by Thunderbolt8 View Post
Regarding the NVIDIA settings: it's been said that leaving everything at default is recommended. But what about the "texture filtering quality" setting? Shouldn't that affect the image quality of movies as well? It's set to "high" by default/globally, but can be set to "very high".
Does this actually improve anything? In combination with it, "trilinear optimization" can also be set to off for best image quality.
Those do not affect madVR; they are for rendering textures at an angle (like the ground in a 3D game).

Quote:
Originally Posted by Thunderbolt8 View Post
Also, something I noticed: when I use CUVID in LAV Video as hardware acceleration, my GPU and CPU usage are both lower (according to Task Manager and NVIDIA Inspector) than when using no hardware acceleration at all. Can this be right? Shouldn't at least the GPU load be higher, because the NVIDIA card is then being utilized by LAV Video and madVR at the same time?
CUVID doesn't use the GPU's 3D cores to decode video. It uses the VPU, which is a separate chip on the card with its own sensor in GPU-Z or NVIDIA Inspector (I believe it defaults to not being displayed in NVIDIA Inspector). CUVID will also use some memory bandwidth on the video card.
Asmodian is offline   Reply With Quote
Old 4th October 2013, 05:01   #20212  |  Link
turbojet
Registered User
 
Join Date: May 2008
Posts: 1,840
Deband works well here in some quick tests. Is there a way for it to save its state?
What is the F2 save icon shown when toggling deband?
Speaking of keys: Ctrl+J conflicts with a hardcoded PotPlayer key (open webcam); Ctrl+Alt+J has no conflicts.
__________________
PC: FX-8320 GTS250 HTPC: G1610 GTX650
PotPlayer/MPC-BE LAVFilters MadVR-Bicubic75AR/Lanczos4AR/Lanczos4AR LumaSharpen -Strength0.9-Pattern3-Clamp0.1-OffsetBias2.0
turbojet is offline   Reply With Quote
Old 4th October 2013, 06:02   #20213  |  Link
sheppaul
Registered User
 
Join Date: Sep 2004
Posts: 146
turbojet, you can disable the hotkey in preferences.
sheppaul is offline   Reply With Quote
Old 4th October 2013, 07:33   #20214  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,646
Quote:
Originally Posted by madshi View Post
You can toggle it by pressing Ctrl+Alt+D, or by using the file name tag "deband=low/high". Not available in the settings dialog yet.
Awesome. I'm pretty impressed with the low setting: it not only smooths out those strong bands but cleans up smaller areas too, giving an overall impressive smoothing. It also cleans up a fair amount of artifacts, so it's definitely worth keeping active even on 10-bit content, which may not suffer as much from banding but still has encoding artifacts that are improved by having this enabled.

I find the high setting a little strong for my tastes. Is it worthwhile to implement the improved (more demanding) algorithm option with the low setting as well?

Quote:
Originally Posted by turbojet View Post
Is there a way for it to save its state?
What is the F2 save icon shown when toggling deband?
Funnily enough, that is exactly what you're after: F2 saves the changed state when you press it. As you can maybe tell, it doesn't actually work for deband at this stage, which is fair enough for a test build.
I too would really like a build where this setting could be saved; we'll just have to use the filename tags for now.

Last edited by ryrynz; 4th October 2013 at 10:57.
ryrynz is offline   Reply With Quote
Old 4th October 2013, 10:43   #20215  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by James Freeman View Post
As I have already mentioned, I am not a fan of BT.1886 (on my 870:1 IPS monitor).
I can see compression artifacts on black steps 0 to 3 (out of 255), even on the cleanest Blu-rays like The Hobbit.
They look like random noise or squares.
I think with a display with a higher native contrast ratio you should see the compression artifacts just as well, if your ambient light level allows it. At least that's how I understand BT.1886.

Quote:
Originally Posted by StephaneM View Post
[...]
http://www.microsoft.com/en-us/download/details.aspx?id=34429

Quote:
Originally Posted by SHaKOL View Post
How do i use LAV Video Decoder or ffdshow instead of the AVI Decompressor? because i opened LAV Video Decoder (formats) and didn't find avi .
AVI is the container, not the codec. Judging from your screenshot, the codec is CRAM. Try simply blocking the "AVI Decompressor".

Quote:
Originally Posted by 6233638 View Post
Debanding is okay, I guess...
Not sure: Is that irony, or did you really expect better?

Quote:
Originally Posted by turbojet View Post
Deband works well here with some quick tests
Comparable to your favorite f3kdb() settings? Or better/worse?

Quote:
Originally Posted by ryrynz View Post
I find the high setting is a little strong for my tastes
Well, for good quality content it is probably too strong. But it can be useful for really bad content, especially for anime. With anime, image edges are usually very sharp while non-edge areas are rather flat, so even on the "high" setting there won't be much (if any) detail loss. I could probably even add an "ultra" setting for anime with even stronger debanding. But I guess "high" should do the trick for most situations already.

Quote:
Originally Posted by ryrynz View Post
is it worthwhile to implement the improved (more demanding) algorithm option with the low setting as well?
The improved algorithm consists of 5 checks instead of 1. The "high" setting has all 5 checks in it; the "low" setting still has 4, so it's still improved over the original f3kdb() algorithm. But I've removed the one check which consumes most of the performance, and that cut the rendering time in half. With the test images I can't really see a difference between having this one check on/off for the "low" setting, so I believe cutting the rendering times in half is the better solution.
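To give a rough idea, one such per-pixel "check" looks something like this (a simplified C++ sketch of the f3kdb-style idea, not my actual shader code):
Code:
#include <algorithm>
#include <cstdint>
#include <cstdlib>

// Sample a mirrored pair of reference pixels at a random offset; if the
// differences stay below a threshold, treat the area as a false gradient
// and replace the pixel with the local average.
uint8_t debandPixel(const uint8_t* plane, int w, int h, int x, int y,
                    int radius, int threshold)
{
    int dx = rand() % (2 * radius + 1) - radius;
    int dy = rand() % (2 * radius + 1) - radius;
    auto at = [&](int px, int py) {
        px = std::clamp(px, 0, w - 1);
        py = std::clamp(py, 0, h - 1);
        return (int)plane[py * w + px];
    };
    int c  = at(x, y);
    int r1 = at(x + dx, y + dy);
    int r2 = at(x - dx, y - dy);
    // the "check": only smooth if the neighbourhood looks flat/banded
    bool flat = std::abs(c - r1) < threshold && std::abs(c - r2) < threshold;
    return (uint8_t)(flat ? (r1 + r2 + 1) / 2 : c);
}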
madshi is offline   Reply With Quote
Old 4th October 2013, 11:34   #20216  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,646
Quote:
Originally Posted by madshi View Post
Well, for good quality content it is probably too strong. But it can be useful for really bad content, especially for anime. With anime, image edges are usually very sharp while non-edge areas are rather flat, so even on the "high" setting there won't be much (if any) detail loss. I could probably even add an "ultra" setting for anime with even stronger debanding. But I guess "high" should do the trick for most situations already.
I guess it depends on how well this fifth check holds up at stronger settings. An ultra setting would likely get very limited use, and I do think high is strong enough, but if you felt so inclined, or other people would prefer an ultra setting, I wouldn't mind seeing how well it does.

Quote:
Originally Posted by madshi View Post
With the test images I can't really see a difference between having this one check on/off for the "low" setting, so I believe cutting the rendering times in half is the better solution.
For the general user I totally agree. Do you happen to have any of the test images with this extra check vs. without?
I guess with a GPU upgrade coming soon, I wouldn't mind throwing extra GPU power at even a small increase in PQ. Would you entertain a build with the fifth check enabled on the low setting? I'd understand if you dismissed it. Maybe one that includes the ultra setting too?

BTW, your low setting isn't that far off the detail retention of my flash3kyuu settings. It's a bit softer, but it does a better job overall. I can honestly say I'll be moving flash3kyuu out of the AviSynth chain.
I like that a lot, because now I can deband 10-bit content without dropping down to 8-bit. Pretty cool. I'm going to watch my first madVR-debanded episodes tonight.
ryrynz is offline   Reply With Quote
Old 4th October 2013, 11:55   #20217  |  Link
GCRaistlin
Registered User
 
GCRaistlin's Avatar
 
Join Date: Jun 2006
Posts: 350
As I wrote above, sometimes madVR incorrectly changes, or on the contrary doesn't change, the monitor refresh rate. Would it be possible to add a way to change the refresh rate manually from madVR's tray icon menu?
__________________
Windows 8.1 x64

Magically yours
Raistlin
GCRaistlin is offline   Reply With Quote
Old 4th October 2013, 12:31   #20218  |  Link
DarkSpace
Registered User
 
Join Date: Oct 2011
Posts: 204
Quote:
Originally Posted by ryrynz View Post
I can honestly say I'll be moving flash3kyuu out of the AviSynth chain.
I like that a lot, because now I can deband 10-bit content without dropping down to 8-bit.
Um, just for your information: dropping down to 8-bit is not necessary; there are the Dither tools for high-bit-depth AviSynth output...
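For example, roughly like this (parameter names from memory, so check the f3kdb and Dither documentation):
Code:
Dither_convert_8_to_16()
f3kdb(Y=64, Cb=64, Cr=64, input_mode=1, output_mode=1)
DitherPost()
That processes at 16-bit (stacked) precision and only dithers down to 8-bit at the very end.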

Also, the debanding works well here, but I think I'm getting some kind of spike in rendering times (default ~14ms, spike ~154ms) every time it is activated from its disabled state or increased from low to high... I suppose that is expected?
Just to be clear, I get maybe one or two dropped frames because of this, if at all, and the rendering times go down again relatively fast; I just want to make sure it's not a bug.
DarkSpace is offline   Reply With Quote
Old 4th October 2013, 12:43   #20219  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by ryrynz View Post
Do you happen to have any of the test images with this extra check vs. without?
Yes, see the f3kdb() thread.

Quote:
Originally Posted by ryrynz View Post
BTW, your low setting isn't that far off the detail retention of my flash3kyuu settings. It's a bit softer, but it does a better job overall.
Is the whole image softer, or just certain areas? In the first case it's probably not the debanding itself: f3kdb() by default adds random grain/noise to the image, for no benefit other than making the image noisier. I've removed that part from madVR because madVR adds noise in the final dithering step anyway. If you like the added noise, you could simply switch your display to 7 bit instead of 8 bit in the madVR device settings; that will increase the amount of noise madVR adds.

Quote:
Originally Posted by GCRaistlin View Post
As I wrote above, sometimes madVR incorrectly changes, or on the contrary doesn't change, the monitor refresh rate.
I've already replied to your "above" post.

Quote:
Originally Posted by DarkSpace View Post
Also, the debanding works well here, but I think I'm getting some kind of spike in rendering times (default ~14ms, spike ~154ms) every time it is activated from its disabled state or increased from low to high... I suppose that is expected?
Just to be clear, I get maybe one or two dropped frames because of this, if at all, and the rendering times go down again relatively fast; I just want to make sure it's not a bug.
When switching any setting, madVR resets the whole rendering setup, which can take a little time, so rendering times may spike. But changing debanding settings shouldn't take more time than e.g. changing scaling algorithms.
madshi is offline   Reply With Quote
Old 4th October 2013, 13:08   #20220  |  Link
SHaKOL
Registered User
 
Join Date: Jun 2013
Posts: 6
Quote:
Originally Posted by madshi View Post
AVI is the container, not the codec. Judging from your screenshot, the codec is CRAM. Try simply blocking the "AVI Decompressor".
Hmm, I blocked the "AVI Decompressor", and when I play the video it says "cannot render the file" and shows this:

SHaKOL is offline   Reply With Quote