Old 25th March 2014, 23:10   #25321  |  Link
wolfman2791
Registered User
 
wolfman2791's Avatar
 
Join Date: Jan 2014
Posts: 63
Quote:
Originally Posted by Mangix View Post
Wrong link. You'd get better results by using CRU since you can adjust the timings: http://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU

That's just wrong and sounds like a symptom of not using CRU. The EVGA Pixel Clock utility works by creating a custom resolution through the Nvidia Control Panel. It's just another GUI for it. What this means is that Windows treats this new higher refresh rate resolution as non-standard and unsupported by the monitor. This is why there are problems with video games.

CRU on the other hand edits the EDID and makes Windows think that 75Hz is a built-in resolution, and therefore problem solved. The only thing to make sure of if you want 75Hz in game is to launch the game while your desktop is at 75Hz. Works for me with Source games.
Brilliant!
Thank you.
wolfman2791 is offline   Reply With Quote
Old 26th March 2014, 00:51   #25322  |  Link
kalston
Registered User
 
Join Date: May 2011
Posts: 164
Quote:
Originally Posted by wolfman2791 View Post
You can pretty easily overclock a 60Hz monitor to 75Hz...
Easily if the monitor supports it, yes. Not all LCDs do (I still find that a good number do, but I haven't extensively tested all of them for frame drops or fake numbers). Some monitors also have issues other than frame drops when doing that, such as a corrupted display, weird scan lines, etc. It's not THAT simple.

It's really easy to test whether it works or not, though. Fire up an old game (or any game that runs at a very high framerate on your system, preferably an FPS) with v-sync on, run around with Fraps and/or some other reliable fps meter... and see what happens. There are other methods now, but that's the one I'm used to (since I've been "overclocking" LCDs for quite a while).
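
If you'd rather script the check than eyeball a game, a rough sketch like this works too — it just times vsynced buffer flips (assuming pygame 2.x is installed and the driver actually honours the vsync request; if it keeps reporting ~60 while Windows claims 75, the monitor is probably faking or dropping frames):

Code:
# Rough refresh-rate check: time a few hundred vsynced flips and report the average.
# Not authoritative - vsync is only a request and some drivers/window modes ignore it.
import time
import pygame

pygame.init()
pygame.display.set_mode((640, 480), pygame.SCALED, vsync=1)

N = 300
pygame.display.flip()                  # warm up / sync to the first vblank
start = time.perf_counter()
for _ in range(N):
    pygame.event.pump()                # keep the window responsive
    pygame.display.flip()              # should block until the next vblank
elapsed = time.perf_counter() - start
print(f"~{N / elapsed:.2f} Hz measured over {N} flips")
pygame.quit()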

And yes, use CRU or do a good old EDID override if you want to use it in games and such. My LCD, for example, is running at 76.5Hz & 1080p* (I made Windows treat that as its native resolution), which works with absolutely everything, even applications that give you no options to change any advanced settings (most just run with your desktop resolution and refresh rate by default anyway).
*the max you can fit in DVI single link with a 165MHz pixel clock and tight timings.
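
(For anyone wondering how that works out: max refresh = pixel clock / (horizontal total x vertical total). The blanking totals below are just illustrative guesses, not my monitor's exact timings:)

Code:
# Max refresh on single-link DVI = pixel_clock / (horizontal_total * vertical_total).
# The timing totals are example values only (roughly CVT-RB vs. tighter hand-made
# timings), not the exact ones my monitor uses.
PIXEL_CLOCK = 165_000_000  # Hz, single-link DVI limit

for label, htotal, vtotal in [
    ("CVT reduced blanking (approx.)", 2080, 1111),
    ("tighter custom timings (example)", 1976, 1088),
]:
    print(f"{label}: {PIXEL_CLOCK / (htotal * vtotal):.2f} Hz max at 1920x1080")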

And I added 71.928Hz and 75Hz for madVR, of course.


I must say I haven't had any issues with video playback for a long time now (well, ever since I got my Nvidia GTX 670), using madVR/LAV/JRiver MC. It's all been perfection thanks to madshi and nevcairiel (and the JRiver team for the audio part).

Oh, and the upscaling is absolutely fantastic (using settings mostly similar to 6233638) and it makes some old films very enjoyable to watch on my 22" LCD monitor, which is quite a feat considering how close to it I am sitting.

I haven't really tried the new NNEDI3 thingy though, maybe I should. I've been using error diffusion option 2 (the one that says it produces less noise), but I'm not sure I can see much of a difference between the dithering settings anyway (my eyes may not be good enough for that kind of thing, though).

Last edited by kalston; 26th March 2014 at 00:55.
kalston is offline   Reply With Quote
Old 26th March 2014, 00:55   #25323  |  Link
wolfman2791
Registered User
 
wolfman2791's Avatar
 
Join Date: Jan 2014
Posts: 63
Yeah... I guess mine does :lol: I played Skyrim at 75 fps without a problem, whereas without CRU it was totally wigging out and I had to push it down to 60 fps to make it work.

Have you noticed that the brightness increases at the higher refresh rates? I had to bring down my brightness a tad.
wolfman2791 is offline   Reply With Quote
Old 26th March 2014, 01:34   #25324  |  Link
seiyafan
Registered User
 
Join Date: Feb 2014
Posts: 162
I think I found a bug. When I enable BOTH NNEDI3 chroma upscaling and luma doubling, I get frame drops (2 frames per second) even when the rendering time is less than the drop threshold (~41ms). However, if I choose a chroma upscaler other than NNEDI3, I don't get frame drops even if the rendering time is longer than in the NNEDI3 case, as long as it stays below the threshold.
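
(For what it's worth, ~41ms is simply the per-frame budget at ~24Hz — assuming that's what the drop threshold corresponds to, which is my guess, not a statement about madVR internals:)

Code:
# Where a ~41 ms threshold plausibly comes from: the frame interval at ~24 Hz.
for hz in (23.976, 24.0, 59.94, 60.0):
    print(f"{hz:>7.3f} Hz -> {1000.0 / hz:6.2f} ms per frame")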
seiyafan is offline   Reply With Quote
Old 26th March 2014, 01:48   #25325  |  Link
Mangix
Audiophile
 
Join Date: Oct 2006
Posts: 353
Quote:
Originally Posted by kalston View Post
Not all LCDs do (I still find that a good number do, but I haven't extensively tested all of them for frame drops or fake numbers)
In my experience, most LCD monitors do support 75Hz but not past that. I have two such monitors from ~2004 and ~2007 that are like that.

Modern LCDs *should* be able to go higher. CRU was developed as a tool for pushing the refresh rate of the cheap Korean eBay monitors up to 120Hz.
Mangix is offline   Reply With Quote
Old 26th March 2014, 08:25   #25326  |  Link
bugmen0t
Banned
 
Join Date: May 2012
Location: _Lies|Greed|Misery_
Posts: 114
@cyberbeing
You use 'don't flush' for everything in FSE. I can hardly remember, but didn't madshi say something needs 'flush & wait' to work properly, e.g. reporting dropped frames in the OSD or something else?
bugmen0t is offline   Reply With Quote
Old 26th March 2014, 14:09   #25327  |  Link
Cinemancave
Registered User
 
Join Date: Dec 2012
Posts: 33
Since I've been using NNEDI3 and Error Diffusion I have really noticed a jump in quality in my front projection setup. I have just purchased an Oppo 103D and plan to use its Darbee processing in combination with MadVR. Now I've just read that the Darbee in the Oppo automatically converts any signal to YCbCr 4:2:2, where it does its internal processing (I earlier thought that the processing was done in the color space that was sent into the unit, but apparently only the standalone Darblet does this).

So my question is: is using MadVR to output RGB to the 103D worthless because of the 103D's internal color space conversion? Or am I still going to take advantage of some of MadVR's greatness?
Cinemancave is offline   Reply With Quote
Old 26th March 2014, 14:24   #25328  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Darbee can't do anything PS scripts can't provide for free IMHO, and the latter work at much higher resolution (16-bit RGB).

Last edited by leeperry; 26th March 2014 at 14:26.
leeperry is offline   Reply With Quote
Old 26th March 2014, 14:36   #25329  |  Link
Cinemancave
Registered User
 
Join Date: Dec 2012
Posts: 33
Quote:
Originally Posted by leeperry View Post
Darbee can't do anything PS scripts can't provide for free IMHO, and the latter work at much higher resolution (16-bit RGB).
What scripts are you referring to exactly? I have often seen these kinds of comments, but I've never actually seen any references showing them producing a better result than the Darbee. But I am all ears.

Otherwise, does anyone know how much the 103D reconversion is going to "damage" MadVR's results?
Cinemancave is offline   Reply With Quote
Old 26th March 2014, 14:40   #25330  |  Link
cyberbeing
Broadband Junkie
 
Join Date: Oct 2005
Posts: 1,859
Quote:
Originally Posted by bugmen0t View Post
@cyberbeing
You use 'don't flush' for everything in FSE. I can hardly remember, but didn't madshi say something needs 'flush & wait' to work properly, e.g. reporting dropped frames in the OSD or something else?
Flush & Wait is needed if you want the madVR OSD to report Render & Present times. madVR doesn't use this information itself. Setting everything to 'don't flush' is equivalent to madVR only flushing in pre-determined critical sections, instead of being arbitrarily forced to flush at additional locations by end-user settings. NVIDIA GPUs with desktop composition enabled do not require any additional flushing to run smoothly, as of the r304 driver release from a couple years ago which contained numerous vsync stuttering fixes. The settings exist in madVR mainly for troubleshooting, with less flushing being preferred if your GPU can handle it without stuttering, glitches, or GPU queue abnormalities.

Last edited by cyberbeing; 26th March 2014 at 14:48.
cyberbeing is offline   Reply With Quote
Old 26th March 2014, 14:48   #25331  |  Link
*Touche*
Registered User
 
Join Date: May 2008
Posts: 84
Quote:
Originally Posted by leeperry View Post
Darbee can't do anything PS scripts can't provide for free IMHO, and the latter work at much higher resolution (16-bit RGB).
Ouch, their examples look terrible. I would never want to do that to my picture.
*Touche* is offline   Reply With Quote
Old 26th March 2014, 15:42   #25332  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by Cinemancave View Post
What scripts are you referring to exactly? [..]
Otherwise, does anyone know how much the 103D reconversion is going to "damage" MadVR's results?
Considering what Darbee does to the picture, the RGB/YUV conversions aren't your biggest problem IMO.
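
If you want to put a rough number on the conversion step itself, a quick numpy sketch like this will do. It's purely illustrative — full-range BT.709 matrices and naive nearest-neighbour chroma resampling — and I have no idea what filters the 103D actually uses:

Code:
# Toy RGB -> YCbCr 4:2:2 -> RGB round trip to gauge what the conversion alone costs.
# Assumes full-range BT.709 and naive chroma resampling; random noise is a worst case,
# real content loses far less.
import numpy as np

# BT.709 RGB -> YCbCr (full range)
M = np.array([[ 0.2126,  0.7152,  0.0722],
              [-0.1146, -0.3854,  0.5000],
              [ 0.5000, -0.4542, -0.0458]])

rng = np.random.default_rng(0)
rgb = rng.random((64, 64, 3))                    # synthetic full-range RGB frame

ycc = rgb @ M.T                                  # to YCbCr
cb, cr = ycc[..., 1], ycc[..., 2]
cb_sub, cr_sub = cb[:, ::2], cr[:, ::2]          # 4:2:2 = halve horizontal chroma
ycc[..., 1] = np.repeat(cb_sub, 2, axis=1)[:, :cb.shape[1]]   # naive upsample
ycc[..., 2] = np.repeat(cr_sub, 2, axis=1)[:, :cr.shape[1]]

back = ycc @ np.linalg.inv(M).T                  # back to RGB
print("max abs RGB error: ", np.abs(back - rgb).max())
print("mean abs RGB error:", np.abs(back - rgb).mean())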

Not too sure; when I want subtle EE I simply enable Gamma Light Dithering in mVR... for strong halo-based EE you could try your luck with those 2 PS script packs: 1 (LumaSharpen might be right up your alley) & 2
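
And if you're wondering what "halo-based" actually means in practice, here's a tiny numpy unsharp-mask toy (LumaSharpen itself is an HLSL pixel shader and differs in detail) — note the under/overshoot it creates on both sides of a hard edge, that's the halo:

Code:
# Minimal unsharp mask on a hard edge; the over/undershoot next to the edge is the halo.
import numpy as np

def box_blur(x, r=1):
    pad = np.pad(x, r, mode="edge")              # tiny separable box blur
    k = 2 * r + 1
    out = sum(pad[i:i + x.shape[0], :] for i in range(k)) / k
    out = sum(out[:, i:i + x.shape[1]] for i in range(k)) / k
    return out

def unsharp(luma, amount=0.8, r=1):
    return np.clip(luma + amount * (luma - box_blur(luma, r)), 0.0, 1.0)

edge = np.full((5, 8), 0.2)
edge[:, 4:] = 0.8                                # dark left half, bright right half
print(np.round(unsharp(edge)[2], 2))             # -> [0.2 0.2 0.2 0.04 0.96 0.8 0.8 0.8]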

TMT went through the same kind of commercial fluff with their CUDA-based sharpener; long story short, it didn't quite live up to their promises.

What we all crave is Super Resolution and at this point NNEDI is as close as it's gonna get. Of course there is impressive software doing exactly that, but it still doesn't run in realtime: Ikena Forensic’s super-resolution reconstruction algorithm

If you're willing to go Avisynth, this might help too.

You can rest assured that the reviewers raving about Darbee were either on their payroll or very easy to impress.
leeperry is offline   Reply With Quote
Old 26th March 2014, 17:34   #25333  |  Link
Cinemancave
Registered User
 
Join Date: Dec 2012
Posts: 33
Quote:
Originally Posted by leeperry View Post
Considering what Darbee does to the picture, the RGB/YUV conversions aren't your biggest problem IMO.

Not too sure; when I want subtle EE I simply enable Gamma Light Dithering in mVR... for strong halo-based EE you could try your luck with those 2 PS script packs: 1 (LumaSharpen might be right up your alley) & 2

TMT went through the same kind of commercial fluff with their CUDA-based sharpener; long story short, it didn't quite live up to their promises.

What we all crave is Super Resolution and at this point NNEDI is as close as it's gonna get. Of course there is impressive software doing exactly that, but it still doesn't run in realtime: Ikena Forensic’s super-resolution reconstruction algorithm

If you're willing to go Avisynth, this might help too.

You can rest assured that the reviewers raving about Darbee were either on their payroll or very easy to impress.
Thanks for the tips, I'll look into some of them.

As for the rest of the Darbee comments, no one serious about AV uses the thing at 100% effect. DO NOT look at their example pictures and judge it by them; they all look awful. Most seem to use it at 20-30%, which doesn't introduce many artifacts at all, but enhances "perceived depth" in the image. I have a JVC X90 (RS65) and it's supposed to work well with its MPC settings combined with the Darbee at lower levels (according to numerous people over at AVS, who take this hobby very seriously). The Darbee really is a way for me to try to squeeze every last bit out of my setup, even if it means only using it at a very low level.
Cinemancave is offline   Reply With Quote
Old 26th March 2014, 18:08   #25334  |  Link
kalston
Registered User
 
Join Date: May 2011
Posts: 164
Quote:
Originally Posted by wolfman2791 View Post
Yeah... I guess mine does :lol: I played Skyrim at 75 fps without a problem, whereas without CRU it was totally wigging out and I had to push it down to 60 fps to make it work.

Have you noticed that the brightness increases at the higher refresh rates? I had to bring down my brightness a tad.
Nope, mine doesn't do that. Like I said different monitors will behave differently when you overclock them.

Mine has no side effects besides a slight buzzing noise when displaying some content (such as this forum). It's very subtle though; I need to have my head close to it to hear it, but it is quite a bit louder at 75+Hz compared to 60Hz. Still too quiet to be an issue.

Btw, don't play Skyrim at more than 60fps or you'll get messed-up physics, the game's internal clock will go mad, and bugs will appear. People playing on 120Hz monitors with vsync found out the hard way. 75 isn't a big increase from 60, but over long gaming sessions you'll still end up having issues because of how silly this game engine is (thank god TESO doesn't have that issue).
kalston is offline   Reply With Quote
Old 26th March 2014, 18:36   #25335  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by Cinemancave View Post
Most seem to use it at 20-30% which doesn't introduce many artifacts at all, but enhances "perceived depth" in the image. I have a JVC X90 [..] The Darbee really is a way for me to try to squeeze every last bit out of my setup, even if it means only using it at a very low level.
NNEDI+Lanczos is pretty darn sharp, turn on Gamma Light dithering on top of it and that'll look plenty "deep" IMHO.

Overcoming the LCD panels' poor convergence in your JVC projector by oversharpening the picture might not be your best option IMHO... maybe there are settings to fine-tune their alignment, possibly in a service menu.

I've read that the latest 2160p projectors from SONY are still misconverged OOTB and you have to do your homework and go through the tedious task of aligning them by hand duh...and if the misalignment is not linear good luck with that.

mVR should provide R/G/B alignments sooner or later. OTOH, if they look fine and you're actually after slight sharpening, then I kinda rest my case that mVR can already look very sharp on its own and PS scripts would nail it down.
leeperry is offline   Reply With Quote
Old 26th March 2014, 19:15   #25336  |  Link
Cinemancave
Registered User
 
Join Date: Dec 2012
Posts: 33
Quote:
Originally Posted by leeperry View Post
NNEDI+Lanczos is pretty darn sharp, turn on Gamma Light dithering on top of it and that'll look plenty "deep" IMHO.

Overcoming the LCD panels' poor convergence in your JVC projector by oversharpening the picture might not be your best option IMHO... maybe there are settings to fine-tune their alignment, possibly in a service menu.

I've read that the latest 2160p projectors from SONY are still misconverged OOTB and you have to do your homework and go through the tedious task of aligning them by hand duh...and if the misalignment is not linear good luck with that.

mVR should provide R/G/B alignments sooner or later. OTOH, if they look fine and you're actually after slight sharpening, then I kinda rest my case that mVR can already look very sharp on its own and PS scripts would nail it down.
I'll give the Gamma light dithering a try, thanks.

Actually, the convergence on my unit is pretty darn good, especially after some alignment adjustments. But when MadVR lets me do it in software I'll be the first to try it and put the JVC at its default setting.

I am actually perfectly happy with the sharpness right now; I am more interested in the Darbee's ability to slightly adjust luminance levels to increase perceived contrast in the image. Is there any filter/script that I can use with MadVR to emulate that effect? (This goes against being true to the source material, but I am fine with that; it does vary from movie to movie, though.)
Cinemancave is offline   Reply With Quote
Old 26th March 2014, 19:56   #25337  |  Link
Niyawa
Registered User
 
Niyawa's Avatar
 
Join Date: Dec 2012
Location: Neverland, Brazil
Posts: 169
Quote:
Originally Posted by madshi View Post
Unfortunately in the meanwhile user reports show the OpenCL <-> D3D9 interop cost seems to vary greatly from one PC to the next, even with the same GPU. Which means that I fear that your tests might only be true for your PC. It's possible that NNEDI3 will run faster or slower on somebody else's PC...
Interesting. I guess no one here has any sort of idea where we can set a deviation to that change...

Quote:
Originally Posted by nevcairiel View Post
I think such "levels" are not such a great idea when it comes to NNEDI, there are just so many different options that spreading such pre-defined levels can cause more confusion than it helps to solve.
Before, defining the performance levels of the scalers alone was OK, as most people agreed on the order of quality/performance, and if they didn't they usually knew why, but now..

There are so many cases where you might be able to use NNEDI on SD with a "worse" image scaler, but not on 720p, where you want a better image scaler. "Levels" are not going to cut it.
Not to mention the huge differences between NVIDIA and AMD at this time (in performance, and with NVIDIA even only working with certain drivers)
At the very least it's good to see people are parsing data. Even if it's useless to label them at this point, it's important to keep testing and looking for more details on how to handle NNEDI3 in the majority of situations. Just having an idea that an R9 270x has the potential to be enough for the lowest doubling setting (16 neurons) will let us get closer to a standard.

Quote:
Originally Posted by 6233638 View Post
And yet I somehow manage to see it every time there is a frame drop, and I find even a single dropped frame very noticeable.
This should vary from person to person. My own experience tells me that without some sort of eye training, most of us don't notice such things.

Quote:
Originally Posted by StinDaWg View Post
Wow, thanks for sharing. I'm not sure about his choice of bilinear though. I noticed he's using DXVA as well; I wonder if the results would differ in any worthwhile way with software decoding instead.

Also, a question for madshi (or anyone else who's able to answer this). Have you ever thought of implementing an 'on-demand rendering algorithms' option for chroma upscaling in madVR? To be clear, it would be some sort of automatic-scaling filter that activates/deactivates depending on the rendering situation madVR is in. Say we want to use Jinc for chroma upscaling, but it doesn't make much sense to have that enabled when there's no upscaling being done, so it would only be used when madVR actually upscales anything. I'm not a developer, so I have no idea about the flexibility of this option, but it could help within a margin of reason.

One can say that the shortcut keys already do this, but I'd like an answer to make sure whether the idea is a realistic approach or not.
__________________
madVR scaling algorithms chart - based on performance x quality | KCP - A (cute) quality-oriented codec pack

Last edited by Niyawa; 26th March 2014 at 20:04.
Niyawa is offline   Reply With Quote
Old 26th March 2014, 20:18   #25338  |  Link
Guest
Guest
 
Join Date: Jan 2002
Posts: 21,901
Quote:
Originally Posted by Niyawa View Post
My own experience tells me that without some sort of eye training most of us don't realize such things.
I respectfully disagree. Your own experience tells you about only...your own experience. And what is "eye training" anyway? 6233638 says he can easily see single frame drops, and I can too. I think it would be the exceptional case to find someone that cannot. Why do you think people complain about telecine judder? It's only a single field jerk compared to a full frame drop, and yet it is easily detected. Human vision is finely tuned for tracking motion; it's not surprising that we can detect gross discontinuities like frame drops.

Last edited by Guest; 26th March 2014 at 20:27.
Guest is offline   Reply With Quote
Old 26th March 2014, 20:40   #25339  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by Cinemancave View Post
I am more interested in the Darbee's ability to slightly adjust luminance levels to increase perceived contrast in the image. Is there any filter/script that I can use with MadVR to emulate that effect?
Maybe Jan would have something for you, but dynamic contrast usually goes along with "pumping" effects during high-contrast scene transitions.
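
To illustrate the pumping problem (and the usual band-aid, smoothing the stretch limits over time), here's a throwaway numpy toy — not what Darbee or madVR actually do internally:

Code:
# Naive per-frame contrast stretching "pumps" at scene cuts; smoothing the stretch
# limits with an exponential moving average adapts gradually instead. Made-up frames.
import numpy as np

def stretch(x, lo, hi):
    return np.clip((x - lo) / max(hi - lo, 1e-6), 0.0, 1.0)

rng = np.random.default_rng(1)
frames = [np.clip(rng.normal(0.35 if i < 5 else 0.65, 0.1, (32, 32)), 0, 1)
          for i in range(10)]                    # dark scene, hard cut to bright scene

lo_s, hi_s, alpha = 0.0, 1.0, 0.15               # smoothed limits + smoothing factor
for i, f in enumerate(frames):
    lo, hi = np.percentile(f, 1), np.percentile(f, 99)
    lo_s += alpha * (lo - lo_s)                  # exponential moving average
    hi_s += alpha * (hi - hi_s)
    # watch where a fixed mid-grey element (e.g. a logo at 0.5) lands on screen
    print(f"frame {i}: naive {stretch(0.5, lo, hi):.2f}   smoothed {stretch(0.5, lo_s, hi_s):.2f}")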

Darbee is prolly a nice toy when you run a standalone BD player but mVR's far more versatile than anything it could ever do IMHO, NNEDI and error diffusion dithering being just two things that will leave Darbee in the dust.
leeperry is offline   Reply With Quote
Old 26th March 2014, 20:41   #25340  |  Link
Boltron
Registered User
 
Boltron's Avatar
 
Join Date: May 2011
Posts: 94
I for one can't see half the detail many here argue about for days, but I definitely do see judder and frame drops, even just one or two.
Boltron is offline   Reply With Quote