Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 20th October 2011, 20:53   #10281  |  Link
dansrfe
Registered User
 
Join Date: Jan 2009
Posts: 1,210
Crash on close here as well with v0.75
Old 20th October 2011, 20:57   #10282  |  Link
TheShadowRunner
Registered User
 
TheShadowRunner's Avatar
 
Join Date: Feb 2004
Posts: 399
It seems pretty clear that the exit crash only occurs on Windows 7.
__________________
XP SP3 / Geforce 8500 / Zoom Player
Old 20th October 2011, 21:17   #10283  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
I hate stupid C++. In Delphi accessing a two dimensional array can be done by writing "arr[0, 0]" or "arr[0][0]". In C++ the compiler also allows both, but I only get correct results when using "arr[0][0]". When using "arr[0,0]" in C++ I get very weird results. What a stupid compiler. This was causing the crash.
Old 20th October 2011, 21:21   #10284  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
madVR v0.76 released

http://madshi.net/madVR.zip

Code:
* fixed: crash on MPC-HC exit / close file
* fixed: OSD didn't show properly or didn't show at all
* fixed: subtitles showed with a black background
* fixed: trade quality -> use 10bit chroma option was used for luma sometimes
* unfortunately subtitles are not running through 3dlut, anymore
It seems that older GPUs are not capable of performing alpha blending on 16bit (per component) textures. My ATI 6850 has no problem doing that, but my NVidia 9400 can't. As a consequence this means that subtitles drawn by the internal MPC-HC subtitle renderer can not run through the 3dlut. At least not when using 16bit textures and older GPUs. So we're back to square one on that one...
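To illustrate what's at stake here: the blending in question is just the standard "over" operation applied per component. A minimal C++ sketch on normalized floats (illustrative helper, not madVR code):

```cpp
// Standard "over" alpha blend per component, values normalized to [0,1].
// Older GPUs can run this on 8bit render targets but not on 16bit
// (per component) ones -- that's the capability gap described above.
inline float blend_over(float sub, float alpha, float video) {
    return sub * alpha + video * (1.0f - alpha);
}
```

For example, a 25% opaque white subtitle pixel over black video comes out as 0.25; it's performing exactly this at 16 bits per component that the older hardware refuses.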
Old 20th October 2011, 21:25   #10285  |  Link
HeadlessCow
Registered User
 
Join Date: Nov 2002
Posts: 131
Quote:
Originally Posted by madshi View Post
I hate stupid C++. In Delphi accessing a two dimensional array can be done by writing "arr[0, 0]" or "arr[0][0]". In C++ the compiler also allows both, but I only get correct results when using "arr[0][0]". When using "arr[0,0]" in C++ I get very weird results. What a stupid compiler. This was causing the crash.
The comma operator in C++ just ignores the value on the left and returns the value on the right. arr[0,0] is the same as just writing arr[0].
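A minimal C++ snippet illustrating the trap (not madVR's actual code):

```cpp
// In C++ (before C++23), "0, 1" inside brackets is the comma operator:
// 0 is evaluated and discarded, 1 is the result. So arr[0, 1] means
// arr[1] -- the second *row*, not an element. Delphi's arr[0, 1]
// syntax simply has no C++ equivalent.
inline int comma_subscript_demo() {
    int arr[2][2] = {{1, 2}, {3, 4}};
    int element = arr[0][0];      // genuine two-dimensional indexing: 1
    int* row    = arr[0, 1];      // comma operator: identical to arr[1]
    return element * 10 + row[0]; // 1 and 3 combine to 13
}
```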
Old 20th October 2011, 21:26   #10286  |  Link
nand chan
( ≖‿≖)
 
Join Date: Jul 2011
Location: BW, Germany
Posts: 380
Quote:
Originally Posted by madshi View Post
madVR v0.76 released [...] It seems that older GPUs are not capable of performing alpha blending on 16bit (per component) textures. [...] So we're back to square one on that one...
Run them through the .3dlut separately on an 8-bit texture (round the .3dlut output down since dithering isn't enabled; that doesn't matter as much for subtitles), and/or make the option opt-in so only those with modern GPUs can enable it. I doubt many users would care about color-accurate subtitles other than me and a few other specialists; those few may as well be forced to use a modern GPU.

Quote:
Originally Posted by madshi View Post
I hate stupid C++. In Delphi accessing a two dimensional array can be done by writing "arr[0, 0]" or "arr[0][0]". In C++ the compiler also allows both, but I only get correct results when using "arr[0][0]". When using "arr[0,0]" in C++ I get very weird results. What a stupid compiler. This was causing the crash.
(I've never had this problem in C#. What a silly language you're using.)
__________________
Forget about my old .3dlut stuff, just use mpv if you want accurate color management
Old 20th October 2011, 21:26   #10287  |  Link
ikarad
Registered User
 
Join Date: Apr 2008
Posts: 546
Quote:
Originally Posted by madshi View Post
madVR v0.76 released [...]
Madshi, can we expect a deband filter and a TIVTC filter (like the telecide filter) to be added to madVR?
Old 20th October 2011, 21:31   #10288  |  Link
nand chan
( ≖‿≖)
 
Join Date: Jul 2011
Location: BW, Germany
Posts: 380
Quote:
Originally Posted by ikarad View Post
Madshi, can we expect a deband filter and a TIVTC filter (like the telecide filter) to be added to madVR?
General support for HLSL (or assembly) pixel shaders would be much nicer imo; then we can add our own deband and deinterlacing filters without needing madshi to do so, and if they're good enough he can just seamlessly integrate them.
__________________
Forget about my old .3dlut stuff, just use mpv if you want accurate color management
Old 20th October 2011, 21:41   #10289  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by HeadlessCow View Post
The comma operator in C++ just ignores the value on the left and returns the value on the right. arr[0,0] is the same as just writing arr[0].
Ouch. How can a compiler treat arr[0,0] as arr[0]? That doesn't make *any* sense to me whatsoever.

Quote:
Originally Posted by nand chan View Post
Run them through the .3dlut separately
Can't. I don't even draw the subtitles myself. I just tell MPC-HC to draw them. I have to let MPC-HC draw them directly onto the video image, that's the only way it works with the ISR.

Quote:
Originally Posted by nand chan View Post
or make the option opt-in so only those with modern GPUs can enable it.
Yeah, something like that is probably the best workaround.

Quote:
Originally Posted by nand chan View Post
I've never had this problem in C#. What a silly language you are using
There's no such problem in Delphi, either. I don't like C++ at all. But it was the "best" choice for writing a video renderer.

Quote:
Originally Posted by ikarad View Post
Madshi, can we expect a deband filter and a TIVTC filter (like the telecide filter) to be added to madVR?
Maybe.

Quote:
Originally Posted by nand chan View Post
General support for HLSL (or assembly) pixel shaders would be much nicer imo; then we can add our own deband and deinterlacing filters without needing madshi to do so, and if they're good enough he can just seamlessly integrate them.
Yeah, support for external pixel shaders is on my to do list. Like so many other things.
Old 20th October 2011, 21:44   #10290  |  Link
mr.duck
quack quack
 
Join Date: Apr 2009
Posts: 259
Quote:
Originally Posted by nand chan View Post
General support for HLSL (or assembly) pixel shaders would be much nicer imo; then we can add our own deband and deinterlacing filters without needing madshi to do so, and if they're good enough he can just seamlessly integrate them.
Are there any half-decent pixel-shader-based deinterlacers? The one in MPC-HC, "Deinterlace (blend)", is junk.
__________________
Media Player Classic Home Cinema Icon Library: NORMAL VERSION / GLOWING VERSION
Old 20th October 2011, 21:46   #10291  |  Link
nand chan
( ≖‿≖)
 
Join Date: Jul 2011
Location: BW, Germany
Posts: 380
Quote:
Originally Posted by madshi View Post
Ouch. How can a compiler treat arr[0,0] as arr[0]? That doesn't make *any* sense to me whatsoever.
C++ doesn't make *any* sense whatsoever.

Quote:
Can't. I don't even draw the subtitles myself. I just tell MPC-HC to draw them. I have to let MPC-HC draw them directly onto the video image, that's the only way it works with the ISR.
Allocate a new 8-bit texture then and have MPC-HC draw on it instead?

Quote:
There's no such problem in Delphi, either. I don't like C++ at all. But it was the "best" choice for writing a video renderer.
Only because DirectShow is so horribly tied to C++. There are a number of projects attempting to port the headers and interop to other languages (e.g. D, Go, C#) that don't require selling your soul to use. There's still hope for the future.

Have you considered using C++ only for a thin wrapper around a back-end library written in whatever language you prefer?

Quote:
Yeah, support for external pixel shaders is on my to do list. Like so many other things.
Well, it should be rather high priority, as it would replace a great many other requests. In fact, most of the gamma correction / color management can be done through pixel shaders. You can even do linear-light rescaling and such using pixel shaders. A decent pixel shader API enables so much that it reduces the importance of most other feature requests afterwards. You can probably even deinterlace through pixel shaders.
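The per-pixel math such shaders would run is simple enough to sketch on the CPU. Here in C++, assuming a pure power curve with gamma 2.2 (illustration only, not madVR's actual processing chain):

```cpp
#include <cmath>

// Gamma 2.2 transfer functions (pure power curve assumed for brevity).
inline float to_linear(float v) { return std::pow(v, 2.2f); }
inline float to_gamma(float v)  { return std::pow(v, 1.0f / 2.2f); }

// Rescaling/blending in linear light: decode, interpolate, re-encode.
inline float lerp_linear(float a, float b, float t) {
    float la = to_linear(a), lb = to_linear(b);
    return to_gamma(la + (lb - la) * t);
}
```

The midpoint of black and white is 0.5 in gamma space but about 0.73 after linear-light interpolation, which is why linear-light scaling looks visibly different.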

Quote:
Originally Posted by mr.duck View Post
Are there any half-decent pixel-shader-based deinterlacers? The one in MPC-HC, "Deinterlace (blend)", is junk.
Write one according to your requirements. All you need is access to the current and previous frames. I don't think MPC-HC lets you have the previous frame, but madVR certainly could if designed that way.
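For reference, a "blend"-style deinterlacer really is that simple; a CPU sketch on an 8-bit luma plane (hypothetical helper, not MPC-HC's shader):

```cpp
#include <cstdint>
#include <vector>

// "Blend" deinterlacing sketch: each interior line becomes the average
// of its two neighbours, which hides combing between the interleaved
// fields at the cost of ghosting. The first and last lines are kept.
std::vector<std::uint8_t> deinterlace_blend(
        const std::vector<std::uint8_t>& frame, int width, int height) {
    std::vector<std::uint8_t> out(frame);
    for (int y = 1; y + 1 < height; ++y)
        for (int x = 0; x < width; ++x)
            out[y * width + x] = static_cast<std::uint8_t>(
                (frame[(y - 1) * width + x] +
                 frame[(y + 1) * width + x] + 1) / 2);
    return out;
}
```

A motion-adaptive deinterlacer would additionally compare against the previous frame, which is exactly why access to it matters.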
__________________
Forget about my old .3dlut stuff, just use mpv if you want accurate color management

Last edited by nand chan; 20th October 2011 at 21:50.
Old 20th October 2011, 22:04   #10292  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by nand chan View Post
Allocate a new 8-bit texture then and have MPC-HC draw on it instead?
The ISR does not fill the target texture's alpha channel. So if I did as you suggest, after the subtitles are drawn, I would have no alpha channel I could use to blend the subtitles onto the real video image.

Quote:
Originally Posted by nand chan View Post
Only because DirectShow is so horribly tied to C++. There are a number of projects attempting to port the headers and interop to other languages (e.g. D, Go, C#) that don't require selling your soul to use. There's still hope for the future.
madFlac is written in Delphi. However, writing a video renderer goes a lot deeper than writing a "simple" decoder. That's why I (reluctantly) chose C++ for the video renderer.

Quote:
Originally Posted by nand chan View Post
Have you considered using C++ only for a thin wrapper around a back-end library written in whatever language you prefer?
There's a reason why madVR ships with many dlls and a madHcCtrl.exe. They're all written in Delphi. Only madVR.ax itself is written in C++.

Quote:
Originally Posted by nand chan View Post
Well, it should be rather high priority, as it would replace a great many other requests. In fact, most of the gamma correction / color management can be done through pixel shaders. You can even do linear-light rescaling and such using pixel shaders. A decent pixel shader API enables so much that it reduces the importance of most other feature requests afterwards. You can probably even deinterlace through pixel shaders.
It's not as simple as it sounds. If you deinterlace video content properly, the video framerate doubles. If I allow external shaders to add new video frames to madVR's internal frame queues, things can become ugly fast. E.g. would such frames get into the "decoder queue"? Or would I have to add a new "deinterlacer queue"? (These are rhetorical questions). Supporting external scaling would be somewhat easier. Supporting simple processing with no framerate and resolution change would be rather easy in comparison.

Anyway, everybody has his own idea about which of the many things on my to do list should have which priority. In the end I will do things in the order that I consider best. Shader support is not at the very top of my priority list, but it's not too far down, either.
Old 20th October 2011, 22:11   #10293  |  Link
nand chan
( ≖‿≖)
 
Join Date: Jul 2011
Location: BW, Germany
Posts: 380
Quote:
Originally Posted by madshi View Post
The ISR does not fill the target texture's alpha channel. So if I did as you suggest, after the subtitles are drawn, I would have no alpha channel I could use to blend the subtitles onto the real video image.
Oh, okay. That sucks. The only contrived work-around I can think of at this point is to have the ISR render everything twice, once on a black background and once on a white background, and then use the difference between the two renders to recreate the missing alpha channel. But that would require twice as much rendering time.
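For what it's worth, the arithmetic behind that black/white trick is straightforward; a sketch on normalized values (assumed helper, untested against the ISR):

```cpp
// Recover color and alpha of a subtitle pixel from two composites of
// the same frame, one over black and one over white (values in [0,1]):
//   over black:  B = C * a
//   over white:  W = C * a + (1 - a)
// Subtracting gives W - B = 1 - a, hence a = 1 - (W - B) and C = B / a.
struct Recovered { float color, alpha; };

inline Recovered recover_alpha(float over_black, float over_white) {
    float a = 1.0f - (over_white - over_black);
    float c = (a > 0.0f) ? over_black / a : 0.0f;  // fully transparent: color is moot
    return {c, a};
}
```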

I think a simpler solution may just be to have one of the MPC-HC devs go through the ISR code and make it fill the alpha channel (optionally, of course) as well.
Have you considered forking VSFilter and rendering subtitles yourself? (Well, of course you've considered it, but how immediate/easy would it be?)

Quote:
madFlac is written in Delphi. However, writing a video renderer goes a lot deeper than writing a "simple" decoder. That's why I (reluctantly) chose C++ for the video renderer. There's a reason why madVR ships with many dlls and a madHcCtrl.exe. They're all written in Delphi. Only madVR.ax itself is written in C++.
Makes sense, yeah.

Quote:
It's not as simple as it sounds. If you deinterlace video content properly, the video framerate doubles. If I allow external shaders to add new video frames to madVR's internal frame queues, things can become ugly fast. E.g. would such frames get into the "decoder queue"? Or would I have to add a new "deinterlacer queue"? (These are rhetorical questions). Supporting external scaling would be somewhat easier. Supporting simple processing with no framerate and resolution change would be rather easy in comparison.
I suppose you're right about this one; I didn't think of changing framerates. I agree at this point that external processing filters (e.g. ffdshow) should be used here.
__________________
Forget about my old .3dlut stuff, just use mpv if you want accurate color management
Old 20th October 2011, 22:15   #10294  |  Link
iSunrise
Registered User
 
Join Date: Dec 2008
Posts: 496
Quote:
Originally Posted by nevcairiel View Post
I actually get glitches if I use 16 buffers; 12 seems to be the sweet spot for me right now.
Combined with "use a separate device for presentation" (but not DX11), flushing: don't/flush/flush/don't, and "limit rendering times to avoid glitches", it's pretty much perfect now.

Default settings used to glitch on non-matched refresh rates, be it 25p at 50Hz, or even 24p at 60Hz, but these are perfect.

The only problem I have right now is that after playing some files consecutively without closing the player, exclusive mode fails. Restarting the player fixes it ...

This is on a GTS450
These settings must've taken you quite some time to figure out, but they are just perfect, nevcairiel, thanks a lot for sharing them. I just went through a lot of files and these are rock-solid even when playing back 23.976/24fps content at 24Hz. 12 buffers indeed seems to be the sweet spot; everything else keeps on glitching.
Old 20th October 2011, 22:16   #10295  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by nand chan View Post
I think a simpler solution may just be to have one of the MPC-HC devs go through the ISR code and make it fill the alpha channel (optionally, of course) as well.
Have you considered forking VSFilter and rendering subtitles yourself? (Well, of course you've considered it, but how immediate/easy would it be?)
Forking VSFilter myself would probably be problematic in terms of the GPL etc. After all, madVR is closed source. I'd rather have a look at libass instead. But that's all future talk. It's really too bad that the alpha blending doesn't work on 16bit textures with older GPUs. That's a problem I didn't anticipate. It would have been so easy/nice.
Old 20th October 2011, 22:24   #10296  |  Link
nand chan
( ≖‿≖)
 
Join Date: Jul 2011
Location: BW, Germany
Posts: 380
Quote:
Originally Posted by madshi View Post
Forking VSFilter myself would probably be problematic in terms of the GPL etc. After all, madVR is closed source. I'd rather have a look at libass instead. But that's all future talk. It's really too bad that the alpha blending doesn't work on 16bit textures with older GPUs. That's a problem I didn't anticipate. It would have been so easy/nice.
Possible work-around: Dither down to 8 bit, then render subtitles, then color correct back into 16-bit. Sure, you dither twice, but I've tested previously whether this actually results in a noticeable difference (I have not been able to find a test case for which it does).
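The "dither down to 8 bit" step itself is cheap. A minimal rectangular-PDF dither sketch in C++ (illustrative only, certainly not madVR's actual dithering):

```cpp
#include <cstdint>

// Dither a 16-bit sample down to 8 bits: add noise below the
// truncation threshold before shifting, so the discarded fractional
// part survives statistically instead of vanishing outright.
inline std::uint8_t dither_16_to_8(std::uint16_t v, std::uint32_t& seed) {
    seed = seed * 1664525u + 1013904223u;    // LCG noise source
    std::uint32_t noise = seed >> 24;        // 8 bits of noise
    std::uint32_t sum = v + noise;
    if (sum > 0xFFFFu) sum = 0xFFFFu;        // clamp at white
    return static_cast<std::uint8_t>(sum >> 8);
}
```

Averaged over many pixels the output recovers the 16-bit value's fractional part, which a plain `v >> 8` truncation would lose entirely.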

As an optional setting in the tweaks section this could work for now, for those who want color accurate subtitles *and* have an older GPU.
__________________
Forget about my old .3dlut stuff, just use mpv if you want accurate color management
Old 20th October 2011, 22:26   #10297  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,348
If there is work going into a new subtitle filter, it should imho be designed in such a way that it's not locked into one renderer (in other words, defining a new/better interface to a sub-rendering filter, and using that instead of hard-wiring it).
If one day the time comes, I would be available to also work on that; the existing subtitle solutions have really been annoying me lately.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 20th October 2011, 22:26   #10298  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by nand chan View Post
Possible work-around: Dither down to 8 bit, then render subtitles, then color correct back into 16-bit. Sure, you dither twice, but I've tested previously whether this actually results in a noticeable difference (I have not been able to find a test case for which it does).

As an optional setting in the tweaks section this could work for now, for those who want color accurate subtitles *and* have an older GPU.
I'd rather suggest using 10bit textures then. That has already been reported as a working solution.

Last edited by madshi; 20th October 2011 at 22:28.
Old 20th October 2011, 22:30   #10299  |  Link
nand chan
( ≖‿≖)
 
Join Date: Jul 2011
Location: BW, Germany
Posts: 380
Quote:
Originally Posted by madshi View Post
I'd rather suggest using 10bit textures then. That has already been reported as a working solution.
10 bit would in fact even be perfect, since at that level the dithering is basically going to make no visible difference compared to 16 bit. Either way, can I go ahead and assume this is how you're going to implement it for the time being? It would be a great compromise solution, i.e.:

Default: No color management on subs, same as now
Optional 1: Color management for subs
Optional 2: (Only available when previous selected) Dither to 10 bit before rendering subtitles
__________________
Forget about my old .3dlut stuff, just use mpv if you want accurate color management
Old 20th October 2011, 22:37   #10300  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by nevcairiel View Post
If there is work going into a new subtitle filter, it should imho be designed in such a way that it's not locked into one renderer (in other words, defining a new/better interface to a sub-rendering filter, and using that instead of hard-wiring it).
If one day the time comes, I would be available to also work on that; the existing subtitle solutions have really been annoying me lately.
Well, I would be happy if there were a new external subtitle renderer available. That would save me from writing my own. I have to say, though: if I do feel the need to write my own subtitle renderer at some point in the future (due to a lack of available alternatives), then I will definitely make it a fixed part of madVR.