Old 14th October 2017, 03:14   #46521  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
For better results, don't use D3D11 native.
__________________
madVR options explained
Old 14th October 2017, 03:36   #46522  |  Link
Klaus1189
Registered User
 
Join Date: Feb 2015
Location: Bavaria
Posts: 1,667
Quote:
Originally Posted by Klaus1189 View Post
Tested with 0.92.4. Currently I'm not in Germany, so I can't test 0.92.5 and 0.92.6 with my equipment.
Do not forget about me and my concern.
Old 14th October 2017, 12:47   #46523  |  Link
rivera
Registered User
 
Join Date: Apr 2016
Posts: 25
Is there any step-by-step explanation of how to get a proper 23.976 Hz mode with Nvidia?

My TV (Panny 65VT60) already has a native "1920x1080@23" mode in Nvidia Control Panel.
According to madVR stats, this mode is actually 23.972.

I read a manual (http://madvr.com/crt/CustomResTutorial.html) but it gives a very poor explanation for my case.
Old 14th October 2017, 13:48   #46524  |  Link
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 656
Quote:
Originally Posted by rivera View Post
I read a manual (http://madvr.com/crt/CustomResTutorial.html) but it gives a very poor explanation for my case.
I have to say, having written the previous guide that adopted the same approach, back when there was no direct madVR support, that I found it hard to understand how to apply the optimization results.

I followed the instructions but could not see results. And I completely understand the underlying logic. It's just that I do not see the things that should be happening... well, happen.
Old 14th October 2017, 15:27   #46525  |  Link
steakhutzeee
Registered User
 
steakhutzeee's Avatar
 
Join Date: May 2015
Posts: 225
Hi, I use madVR on an 8-bit monitor, an ASUS VC239H. Can I play back 10-bit H.265 on it? I think it's a dumb question :/
__________________
Intel i5-4590 - MSI R9 270X 2GB - 8GB RAM
Old 14th October 2017, 15:36   #46526  |  Link
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
Quote:
Originally Posted by rivera View Post
Is there any step-by-step explanation of how to get a proper 23.976 Hz mode with Nvidia?

My TV (Panny 65VT60) already has a native "1920x1080@23" mode in Nvidia Control Panel.
According to madVR stats, this mode is actually 23.972.

I read a manual (http://madvr.com/crt/CustomResTutorial.html) but it gives a very poor explanation for my case.
It's really very simple, provided your monitor supports the new refresh rates. Using .095.6.1:

1) Make sure the monitor that is to receive the new refresh rate is not currently at the refresh rate you want to alter (so not at 23Hz in the nvidia control panel).
2) In the madVR custom resolution options, select the predefined 23Hz refresh rate in the list and click edit (that applies in your case).
3) Click 'test mode'.
4) The monitor will switch to the new mode; confirm that you can see it. If it's fine, click save.
5) Reboot the PC.

When you play a video now, it should be closer to 23.976. Once madVR has played a video in the new resolution for 30 minutes or so, you will be able to go back into the custom resolution and use the optimise option. In my case that was not needed, as it was already bang on 23.976 with no need to optimise (1 dropped frame every 12 1/2 hours is good enough for me).
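For reference, the drop interval quoted above falls straight out of the mismatch between the display refresh rate and the video frame rate; a quick sketch of the arithmetic (Python, numbers illustrative):

```python
# Interval between dropped/repeated frames caused by a refresh rate
# mismatch: one frame "slips" per cycle of the beat frequency, so the
# interval is 1 / |display_hz - video_fps| seconds.

def drop_interval_seconds(display_hz: float, video_fps: float) -> float:
    """Seconds between dropped (or repeated) frames for a given mismatch."""
    return 1.0 / abs(display_hz - video_fps)

VIDEO_FPS = 24000 / 1001  # NTSC film rate, ~23.976 fps

# An unoptimized 23.972 Hz mode drops a frame roughly every 4 minutes:
print(drop_interval_seconds(23.972, VIDEO_FPS) / 60)  # ~4.1 minutes

# One drop every 12.5 hours corresponds to a residual error of only ~2e-5 Hz:
print(1 / (12.5 * 3600))  # ~2.2e-5 Hz
```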

Last edited by Razoola; 14th October 2017 at 15:42.
Old 14th October 2017, 15:36   #46527  |  Link
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 656
Quote:
Originally Posted by ashlar42 View Post
I followed the instructions but could not see results. And I completely understand the underlying logic. It's just that I do not see the things that should be happening... well, happen.
It appears that, in my case (GTX 660, Windows 10 x64, September drivers), a reboot was mandatory. I've managed to bring the 23Hz mode to "no frame drops/repeats expected" in two passes. I like the "perfectly optimized" description. It gives that warm and fuzzy feeling we videophiles seek so much. Before, I was simply resetting the GPU (which had worked for me with CRU); rebooting worked instead.

Even though I've activated the "D3D hack to use 24Hz and 60Hz", I'm getting the "The GPU rejected this mode, for unknown reasons." error when trying to input the original EDID's values for timings (pressing Edit the first time). And I do that while using a different refresh rate.

Last edited by ashlar42; 14th October 2017 at 15:45.
Old 14th October 2017, 15:39   #46528  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by Neo-XP View Post
You should see it clearly with a little zoom on all the external edges of Jack's shirt (Source / madVR RRA + RDH / FineDehalo)
Thanks, I see it now. Do you have more images like this? Would help...

Quote:
Originally Posted by Ver Greeneyes View Post
Hmm, I don't think I'd be particularly broken up about not having separate strengths for Luma and Chroma. I don't know if my results generalize to other videos, but if chroma generally benefits from a bit more strength than luma you could always use a multiplier internally to set the strength of chroma deblocking (e.g. 1.67x the luma strength). But this would probably need more testing than my subjective opinion on a single set of videos :P

With only 1 strength setting, you could just have checkboxes for luma and chroma that can be toggled independently.
Quote:
Originally Posted by Clammerz View Post
My initial thought would be to make:
[] Enable deblock
Luma Strength <-------->
Chroma Strength <------> [] Lock Chroma

When "Lock Chroma" is selected, chroma slider/number will change relative to luma. So if a user has Luma 1, Chroma 2, then locks Chroma, and then slides luma to 2, chroma will change to 3.
Kind of like Ver Greeneyes' multiplier suggestion, only probably more annoying to implement.
Chroma would initially be locked by default.
I haven't tested, and have no idea, if individual settings are worth it, sorry.
Of course all this would be possible. I just wonder how useful it would really be. How many users are going to enter the madVR settings dialog and try different luma + chroma deblock strengths? If you look through the thread, many users commented that they are too lazy to even *enable* deblocking on a per-movie basis, let alone carefully tweak separate luma and chroma strengths!

As always, we need a good compromise: an easy, intuitive GUI that doesn't take too many useful options away.

I'm not sure right now what I'll do...
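As an aside, the "Lock Chroma" behavior Clammerz describes boils down to preserving the chroma-minus-luma offset once locked; a minimal sketch (the class and names are mine, nothing from madVR's actual code):

```python
# Sketch of the "Lock Chroma" idea: when locked, the chroma strength
# follows the luma slider, preserving the offset set before locking.
# Purely illustrative; not madVR code.

class DeblockStrengths:
    def __init__(self, luma: float, chroma: float):
        self.luma = luma
        self.chroma = chroma
        self.locked_offset = None  # None means chroma moves independently

    def lock_chroma(self) -> None:
        """Remember the current chroma-minus-luma offset."""
        self.locked_offset = self.chroma - self.luma

    def set_luma(self, value: float) -> None:
        """Move the luma slider; drag chroma along if locked."""
        self.luma = value
        if self.locked_offset is not None:
            self.chroma = self.luma + self.locked_offset

# The example from the post: Luma 1, Chroma 2, lock, slide luma to 2.
s = DeblockStrengths(luma=1, chroma=2)
s.lock_chroma()
s.set_luma(2)
print(s.chroma)  # 3, matching the behavior described in the post
```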

Quote:
Originally Posted by jmonier View Post
It fixes it for me. It no longer says that you can't save the active mode.
Quote:
Originally Posted by Sarlaith View Post
Yes that fixes the issue!

Madshi
Quote:
Originally Posted by Razoola View Post
Yes it does, and I have to say it's now much easier adding a custom resolution with madVR than through the nvidia control panel.
Quote:
Originally Posted by jkauff View Post
I was FINALLY able to set up custom resolutions in madVR on my GTX1060.

A couple of things I learned:
  • Rebooting works better than Reset GPU.
  • Before you start, make sure the OS is showing only the native resolutions for your display and GPU.
  • Once the OS is aware of the new custom resolution you created, set the display to the new resolution via the OS to prepare for optimization.
  • After playing the video for 30 minutes, change the resolution on the display to a native resolution in the OS. Now when you click the Optimize button for the new resolution, you will finally get the optimization dialog.

All of this may not be necessary for everyone, but it worked for me on Win 10 Creators Update and the latest Nvidia driver. No more "The GPU has rejected this mode for some reason" errors.

The custom modes work perfectly, and I LOVE the results!
Glad to hear that!

Quote:
Originally Posted by mzso View Post
Well, I've been using an old version (2017-01) for a long time now, and until recently I didn't experience this. So something's changed.
Edit: After downgrading to v0.91.10 the problem stops happening. (v0.92.3 is also bugged. I had these lying around; I didn't try other versions.)
Yes, the problem is related to the new screenshot functionality which was recently introduced. *However*, although the problem was triggered by a madVR feature improvement, the actual bug is in PotPlayer, and as I already said, it should be fixed in the next PotPlayer build.

Quote:
Originally Posted by mzso View Post
I don't use a newer version because of the aforementioned hang on file opening with madVR. (Which is far more likely to happen with newer versions)
Did you see anything useful in the debug logs?
Hmmmm... Did I miss a post of yours? I'm not sure...

Quote:
Originally Posted by clsid View Post
Having the user do the calculations manually would be unfeasible, except for maybe a select few here. Most newcomers already struggle with creating simple resolution based preset scripts.

Hence the idea to move the intelligence and complexity into madvr itself, and provide a single value that can be used as an indicator for video quality.

MediaInfo can show which % of the file is video data. I checked a few files and those were between 75 and 95. So let's assume on average 85% of the file is video. It doesn't need to be super accurate.
Or maybe just subtract audio stream sizes. Those usually have a known (average) bitrate. Subtitles have negligible size.
Fair enough. I could assume 85% of the file is video. But what's the next step? What exact bitrates should I treat as which quality level? The codec would need to be involved, but also the video resolution and framerate, chroma subsampling etc. And bitrate requirements don't scale linearly with video resolution. So judging which bitrate is "high enough" for a specific quality level seems pretty hard to me.

Anyway, are you willing to do the legwork? That means creating an exact "formula" which outputs a quality level, based on:

- codec
- resolution
- frame rate
- chroma subsampling
- bitrate
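For what it's worth, here is a rough sketch of what such a formula could look like. Every constant below (codec efficiency factors, chroma factors, bits-per-pixel thresholds) is a made-up placeholder, not a calibrated value; as madshi notes, real thresholds would also have to account for bitrate demand not scaling linearly with resolution:

```python
# Hypothetical quality-level heuristic built from the inputs listed above.
# All constants here are illustrative guesses, not tuned values.

CODEC_EFFICIENCY = {"mpeg2": 0.5, "h264": 1.0, "hevc": 1.6, "av1": 1.9}
CHROMA_FACTOR = {"4:2:0": 1.0, "4:2:2": 1.15, "4:4:4": 1.3}

def quality_level(codec, width, height, fps, chroma, video_bitrate_bps):
    """Return a coarse 0..4 quality level from stream parameters."""
    # Normalize to bits per pixel per frame, weighted by codec efficiency
    # and chroma subsampling.
    bpp = video_bitrate_bps / (width * height * fps)
    score = bpp * CODEC_EFFICIENCY[codec] / CHROMA_FACTOR[chroma]
    # Placeholder thresholds; real ones would need empirical tuning.
    for level, threshold in enumerate((0.02, 0.05, 0.10, 0.20)):
        if score < threshold:
            return level
    return 4

# A 1080p24 H.264 stream at 8 Mbit/s:
print(quality_level("h264", 1920, 1080, 24, "4:2:0", 8_000_000))
```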

Quote:
Originally Posted by huhn View Post
After some testing I found this so far.

RCA is very useful, but some more steps between 1 and 4 would be nice. NGU sharp and standard still look pretty bad with files that "need" RCA, even with RCA.

RCA softens the image a lot, so I tried to compensate with a sharpener. Sharpen edges does nothing when scaling a DVD to UHD, while adaptive sharpening is still effective; too effective, in fact. A smaller step between 0.0 and 0.1 would be nice, and "add grain after sharpening" would be highly preferred too, as it is really difficult to use with adaptive sharpening.

RRN is hard to judge; I find it useful together with RCA but not much alone.
Thanks. There'll be a lower RCA quality step in the next build, maybe it helps, and also lower RRN steps, and a choice between 2 different RCA variants. AdaptiveSharpen will be tweaked once more, too.

I may add more RCA quality steps in between, but it's not as easy as it might seem, so before I do that, I'll need more feedback first about all these new changes.

Quote:
Originally Posted by austinminton View Post
What is the right nvidia setting to set in the control panel? I have been using RGB 4K@60 8bit till now, but I wonder if that's incorrect for playing back 10bit files. When madVR switches to 4K@23, does it also set 10bit? I have looked through all my TV/AVR settings and can't find anything that confirms bit depth and color format, unfortunately.

I have set "10bit and above" in madVR for the panel (Sony 75Z9D). Should I set the control panel to YCbCr 422 and 10bit?

I also have a file with 4K@60, 4:2:0, 10bit HDR. Yes, it's a 60Hz UHD. This one really confuses me, since I know RGB 4K@60 10bit won't work over HDMI. So somewhere it's either getting converted to 8bit or to lower chroma? It plays fine and looks very good to my eyes, but I 'need' to know if some unnecessary conversion is being done somewhere. The madVR OSD says P010, 10-bit, 4:2:0, but I am not sure if that's what the GPU finally passes to the panel?

Card is an nvidia 1080 (385.41), latest madVR + LAV Filters. I am using FSE and madVR auto resolution switching for 4K@23, 4K@24, 4K@60. DXVA copy-back in LAV.

Also I would like to say that I think my PC with madVR has a much better picture than my UHD player. Thanks madshi.
madVR saying P010, 10bit, 4:2:0 is what the decoder delivers to madVR; it has no bearing on what is sent from the GPU to the display.

HDMI 2.0 can't do 4Kp60 RGB with more than 8bit. So even if you tell the GPU *and* madVR to output 10bit RGB at 4Kp60, it won't happen. Someone somewhere will have to drop it down to 8bit. Probably it will be the GPU driver, and I'm not sure they'll apply proper dithering. So it's better if you take care of this yourself, e.g. by using profiles.

My recommendation is to always output RGB. 10bit is nice if your TV fully supports it (and benefits from it), otherwise 8bit is just fine, too, even for HDR content, thanks to madVR's high quality dither.
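The 4Kp60 limitation madshi mentions follows from the HDMI 2.0 bandwidth budget; a back-of-the-envelope check (the 594 MHz figure is the standard CTA-861 4Kp60 pixel clock):

```python
# HDMI 2.0: 18 Gbit/s raw TMDS, 8b/10b encoding -> 14.4 Gbit/s of video data.
HDMI20_DATA_GBPS = 18.0 * 8 / 10

# Standard 4Kp60 timing (3840x2160 active, 4400x2250 total with blanking)
# uses a 594 MHz pixel clock.
PIXEL_CLOCK_HZ = 594e6

def fits_hdmi20(bits_per_component: int) -> bool:
    """Does 4Kp60 RGB at this bit depth fit in the HDMI 2.0 data rate?"""
    gbps = PIXEL_CLOCK_HZ * 3 * bits_per_component / 1e9
    return gbps <= HDMI20_DATA_GBPS

print(fits_hdmi20(8))    # True:  14.256 Gbit/s fits
print(fits_hdmi20(10))   # False: 17.82 Gbit/s does not
```

So 10-bit RGB at 4Kp60 simply cannot cross the link; something upstream must drop it to 8 bit (or to subsampled chroma), which is exactly why madshi recommends handling the conversion with madVR's dithering rather than leaving it to the driver.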

Quote:
Originally Posted by j1731630 View Post
Is there option to disable image downscale, if source resolution exceeds monitor/tv?
Will be useful to capture UHD frames on FullHD display.
You mean for screenshots? Simply set the new madVR "rendering -> screenshots" settings to "100% view". Then your screenshots will automatically be done at original resolution. So this way you can very easily create UHD screenshots even if your TV is only FullHD.

Quote:
Originally Posted by Q-the-STORM View Post
The Custom Mode is still active after reboot...
So does the tweaked madHcCtrl.exe which I made available a couple posts ago fix the problem?

Quote:
Originally Posted by VAMET View Post
1. let madVR decide
2. convert HDR content to SDR by using pixel shader math
3. process HDR content by using pixel shader math

When using option 1, I have darker colors, when 2 bright colors and 3 washed out colors. Which one is the most similar experience to real HDR effect?
You can use 1 or 2. Option 1 should detect that your projector doesn't support HDR, and thus should automatically activate "convert HDR content to SDR by using pixel shader math". So basically options 1 and 2 should be identical.

There's one key difference, though: If you choose option 2, you can select a number of options for how madVR should do the HDR -> SDR conversion. If you choose option 1, you don't have these options, so madVR chooses itself.

You're saying you have darker colors when using option 1. That means you probably have a lower "display peak nits" number selected than madVR chooses for option 1. If you increase the "display peak nits" number, you'll probably get the same image as when using option 1.

Quote:
Originally Posted by omw2h View Post
I really preferred the previous Adaptive Sharpen; the new one enhances grain/fine detail too much and may alias edges more than the previous one.
Quote:
Originally Posted by ryrynz View Post
Yeah.. If we stick with this I think I want more control over what it shapes than just ticking the box.. Madshi what's the chance of exposing some options for the new AA? If not I'll have to sit down and revisit my sharpener settings.
"new AA"? You mean new "AS"? I don't plan on offering options at this point, if there's any way to avoid it. Let's first try to make you happy without offering options, ok? The next build will have a tweaked AdaptiveSharpen algorithm, which sharpens grain a bit less than the current build. Maybe you will like that version more? Please let me know...

New build out maybe later today (6-7 hours from now), or maybe tomorrow.

Quote:
Originally Posted by leeperry View Post
After more testing, RCA@1 is fantastic for low-res hopeless files that could use some smoothing anyway but indeed a far lower value would be ideal for files that come with both details and blocking. Anywhere between 0.1 and 0.5 might do the magic for those blocky and yet not quite hopeless videos
There'll be a new lower setting. It's as low as I can go, I think.

Quote:
Originally Posted by valdeski View Post
With madVR v0.92.6 I often get video player freezes when launching another DX11 application, like a game.
I'm using MPC-HC Nightly, Lav filters nightly with D3D11 Automatic (Native).
Video freezes but audio keeps playing, also d3d11 native switches to d3d11 copyback in lav filter video decoder window setting.
Is this a new problem with madVR v0.92.6 which didn't occur before? If so, can you please try to find out which exact madVR build introduced this problem? You can download older builds here:

https://www.videohelp.com/software/m...sions#download

Furthermore: Does this problem only occur when using the new "D3D11" decoder in LAV? Or does it also occur when e.g. using software decoding?

Quote:
Originally Posted by Klaus1189 View Post
If I use madvr and enable "delay playback until renderer is full" the taskbar indicator (is it called that way?) displays pause and yellow color progress of file. Is that something that can be fixed? In MPC-BE or madvr?
That taskbar indicator is not known to me. It's probably MPC-BE's? If so, you may need to talk to the MPC-BE devs about this problem.

Quote:
Originally Posted by ashlar42 View Post
I have to say, having written the previous guide that adopted the same approach, back when there was no direct madVR support, that I found it hard to understand how to apply the optimization results.

I followed the instructions but could not see results. And I completely understand the underlying logic. It's just that I do not see the things that should be happening... well, happen.
I'm not sure I understand what you mean. Are you saying you're not fully happy with the tutorial text I've written? Or with the optimization procedure/logic? Or both? I'm open for improvement suggestions.
Old 14th October 2017, 15:50   #46529  |  Link
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 656
Quote:
Originally Posted by madshi View Post
I'm not sure I understand what you mean. Are you saying you're not fully happy with the tutorial text I've written? Or with the optimization procedure/logic? Or both? I'm open for improvement suggestions.
I simply needed to reboot instead of resetting the GPU. Maybe it would be worth pointing out in the guide that resetting, in some cases, simply doesn't do anything.

See the other problem I described for 24Hz and 60Hz. I was writing at the same time as you.
Old 14th October 2017, 15:55   #46530  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by madshi View Post
There'll be a new lower setting. It's as low as I can go, I think.
Great, and please provide some fine-tuning between the current 1 & 2; the more headroom the better.
Old 14th October 2017, 16:18   #46531  |  Link
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
@madshi,

I know this was brought up years ago (I think it was leeperry who did), but have you any plans to add customisable enhancements to madVR in relation to 'near black' processing? I'm thinking of something along the lines of being able, for example, to boost or manipulate brightness in the 0%-5% range. This would be for those with panels where, no matter how well they can be calibrated, shadow details are sometimes lost due to ABL or whatever else.
Old 14th October 2017, 16:22   #46532  |  Link
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 656
And now I get the "GPU driver rejected mode for unknown reasons" for 59Hz too...
Old 14th October 2017, 16:39   #46533  |  Link
mzso
Registered User
 
Join Date: Oct 2009
Posts: 930
Quote:
Originally Posted by madshi View Post
Yes, the problem is related to the new screenshot functionality which was recently introduced. *However*, although the problem was triggered by a madVR feature improvement, the actual bug is in PotPlayer, and as I already said, it should be fixed in the next PotPlayer build.


Hmmmm... Did I miss a post of yours? I'm not sure...
I don't know. You asked for a debug log here. And I provided it here.

But I've been getting hangs with madVR for ages and have complained a few times here. I also did to the PP dev and the ProgDVB dev.

It's just that it's a far more common occurrence with newer versions of PP, where I can reproduce a hang in seconds by quickly opening files one after another for playback.

Now if I get a different issue with older PP and new madVR, I'm pretty screwed.
Old 14th October 2017, 16:42   #46534  |  Link
MuriKha
Registered User
 
Join Date: Jan 2017
Posts: 2
What is a good madVR setting for a 4K display?

Hi, this is my first post here. There are several articles related to madVR settings, but I was wondering what the optimal settings are for my configuration.

Specs:
i7 4770k @4.2Ghz
Zotac GTX 1070 amp extreme SLI
16gb ram

I am using potplayer with lav filters.
I specifically want to know how important the artifact removal and image enhancement settings are, and when I should use them.
Old 14th October 2017, 16:49   #46535  |  Link
MuriKha
Registered User
 
Join Date: Jan 2017
Posts: 2
Quote:
Originally Posted by valdeski View Post
With madVR v0.92.6 I often get video player freezes when launching another DX11 application, like a game.
I'm using MPC-HC Nightly, Lav filters nightly with D3D11 Automatic (Native).
Video freezes but audio keeps playing, also d3d11 native switches to d3d11 copyback in lav filter video decoder window setting.
This also happened to me, but it's not there anymore for me. It started to happen only after the latest nvidia driver update with madVR v0.92.x.

If I disable D3D11 for presentation, everything starts to work fine again. Right now I don't have the problem anymore, and the only thing that changed for me is that I updated MSI Afterburner and RivaTuner to the latest versions.

Using Win 10 RS3, Nvidia driver 387.92.
Old 14th October 2017, 16:49   #46536  |  Link
Neo-XP
Registered User
 
Neo-XP's Avatar
 
Join Date: Mar 2016
Location: Switzerland
Posts: 140
Quote:
Originally Posted by madshi View Post
Thanks, I see it now. Do you have more images like this? Would help...
Sure, I did a test here some time ago: https://forum.doom9.org/showthread.p...53#post1789253

I will try to find more.
Old 14th October 2017, 17:11   #46537  |  Link
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
Quote:
Originally Posted by madshi View Post
I'm not sure I understand what you mean. Are you saying you're not fully happy with the tutorial text I've written? Or with the optimization procedure/logic? Or both? I'm open for improvement suggestions.
As far as nvidia is concerned, I think the only change needed now for custom resolutions is for madVR to check the current refresh rate at the point the user presses the 'test mode' button (either on first adding or on optimise). If the display is already using the refresh rate to be altered, madVR should do one of the following:

1) Automatically change the current refresh rate to something other than the refresh rate setting to be tested.

2) Tell the user that madVR first needs to change the current display refresh rate, which then happens after the user presses ok.

3) Stop with an error telling the user to first change the current refresh rate to something else.

Last edited by Razoola; 14th October 2017 at 17:21.
Old 14th October 2017, 17:35   #46538  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 896
Regarding clsid's idea for a quality variable: in the meantime I've found a workaround where I include the TV channel name in the filename of the recording, and then I've set rules in madVR that match specific channel names to an "awful source" profile.
The only thing is that this cannot work for live TV, only recordings.
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40
Old 14th October 2017, 19:26   #46539  |  Link
valdeski
Registered User
 
Join Date: Nov 2016
Posts: 10
@madshi
I did some testing, with all madVR settings at default except D3D11 fullscreen windowed on a second monitor. I deleted settings.bin and reset the settings every time, just in case.
v0.92.02
DXVA2(Copy-Back) No issues.
D3D11 Automatic (Native) sometimes freezes for 2 or 3 seconds but no actual crashes and playback resumes like usual.
D3D11 cb direct(Hardware Device selected) No issues.

v0.92.03
DXVA2(Copy-Back) No issues.
D3D11 Automatic (Native) Crashes, no error messages, MPC-HC freezes and CPU usage goes crazy.
D3D11 cb direct(Hardware Device selected) No issues.

v0.92.04
DXVA2(Copy-Back) No issues.
D3D11 Automatic (Native) Crashes, no error messages, MPC-HC freezes and CPU usage goes crazy.
D3D11 cb direct(Hardware Device selected) No issues.

v0.92.05
DXVA2(Copy-Back) Crashes with "MPC-HC has stopped working" window.
D3D11 Automatic (Native) Crashes with "MPC-HC has stopped working" window.
D3D11 cb direct(Hardware Device selected) Crashes with "MPC-HC has stopped working" window.
All three happen just from going from in-game fullscreen to windowed mode; definitely a lot more sensitive than the previous releases.
I actually got a madVR crash error window with this version but got no log file (I assume it goes to the desktop, right?).

v0.92.06
DXVA2(Copy-Back) No issues.
D3D11 Automatic (Native) Crashes, no error messages, MPC-HC freezes and CPU usage goes crazy, sometimes behaves similar to v0.92.02 with 2-3 second freezes and playback continues.
D3D11 cb direct(Hardware Device selected) No issues.


All of the above happens when launching a game, when closing the game, or when going from in-game fullscreen to windowed and vice versa. If you manage to get in-game without MPC-HC crashing, then you're not very likely to encounter issues unless you go into windowed mode or hit a loading screen.
Some files seem to be more susceptible to crashing but it's not an issue of only happening with "x" specific file or "x" specific game.
GPU is Nvidia GTX 970, driver 385.69. Game was Battlefield 4 Fullscreen exclusive(it happens in other games too).

Last edited by valdeski; 14th October 2017 at 21:20. Reason: Fomating
Old 14th October 2017, 20:27   #46540  |  Link
PurpleMan
Registered User
 
Join Date: Oct 2003
Posts: 273
@madshi

I'm not sure whether this has been previously reported or nobody noticed, but I see some very annoying issues with the "move subtitles to the bottom of the screen" feature (which, as you know, is extremely useful for 2.35:1 movies).

(I can't comment on whether it was okay in previous versions as I only now started using madVR with XYsubfilter).

Issue 1 - It seems that every subtitle is displayed in a slightly different vertical position (more than a few pixels up or down).

Issue 2 - Unrelated to issue 1, which happens regardless: I think you're treating the subtitle as an image and just moving it down, without accounting for the actual text or characters. It's a little complicated to explain, but I'll try -

Consider the following different subpictures:

SUB 1 - aaaaaaaaaaaaaaaaa

SUB 2 - aaaaaaaaaaapaaaaaaaaaaq

Because the lowest pixel in SUB1 is at the bottom of the 'a' characters, the entire text is aligned at a certain vertical position.

In SUB2, the lowest point is at the bottom of the 'q' and 'p' characters, and the entire text is aligned based on that, so the rest of the characters (the 'a' characters) end up in a different position in SUB2 than they were in SUB1.

I know this example makes it seem like a rare case, but in some languages, and even in English, this is EXTREMELY noticeable, and as subtitles change quickly it makes it seem that the entire subtitle stream is just moving up and down between subs.

Issue 3 - Some sub pictures are 1 line while others are 2 lines. When rendering subtitles, there's a fixed virtual position for each line (row 1 and row 2), so if you have just 1 line it's displayed on row 2 (the lower row) and if you have 2 of them they're displayed on both rows. What I'm seeing is that if there's a single line, it's actually displayed in a third location seemingly in between row 1 and row 2.

I couldn't find a better way to describe this, so let me know if you want me to include some screenshots to clarify. What I can tell you is that if you turn off the 'move subtitles to bottom' feature in madVR, while the subtitles are then rendered on the video itself instead of in the black bars area, the positioning is perfect and you don't have any of the above issues, thus eliminating xysubfilter as the cause of this.
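Issue 2 can be shown numerically: aligning each subtitle bitmap by its lowest opaque pixel moves lines containing descenders ('p', 'q') relative to lines without them, while reserving a fixed descent keeps the baseline stable. A toy sketch, with invented glyph metrics:

```python
# Toy illustration of issue 2: bounding-box alignment vs baseline
# alignment. The glyph metrics below are invented for demonstration only.

# Glyph extent below the text baseline, in pixels (descenders reach lower).
DESCENT = {"a": 0, "p": 4, "q": 4}

def baseline_y_bbox_aligned(text: str, bottom_y: int) -> int:
    """Align the bitmap's lowest pixel to bottom_y (what issue 2 describes)."""
    return bottom_y - max(DESCENT[c] for c in text)

def baseline_y_fixed(text: str, bottom_y: int, max_descent: int = 4) -> int:
    """Reserve room for the deepest possible descender: a stable baseline."""
    return bottom_y - max_descent

# With bbox alignment, the line containing 'p'/'q' sits 4 px higher:
print(baseline_y_bbox_aligned("aaaa", 1000))  # 1000
print(baseline_y_bbox_aligned("aapq", 1000))  # 996

# With a fixed descent reservation, both lines share one baseline:
print(baseline_y_fixed("aaaa", 1000))         # 996
print(baseline_y_fixed("aapq", 1000))         # 996
```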

Thanks for all your hard work, much appreciated!