Old 18th February 2014, 00:54   #23381  |  Link
turbojet
Registered User
 
Join Date: May 2008
Posts: 1,840
Quote:
Originally Posted by XMonarchY View Post
Would enabling my i7 3770K HD4000 CPU graphics to work along with my GeForce GTX 770 improve madVR rendering performance? Would I need to get the Lucid Logix software to make it happen?
I'm pretty sure madVR must render on the GPU used for the display.

Quote:
Finally, I set LAV Video to use nVidia CUVID acceleration for decoding. I know decoding is not the same as rendering, but since CUVID uses GPU, could it be reducing madVR rendering that also needs GPU power?
Absolutely. You could use software decoding instead, or, I think, setting up a fake display on the IGP and using QuickSync decoding also works.
__________________
PC: FX-8320 GTS250 HTPC: G1610 GTX650
PotPlayer/MPC-BE LAVFilters MadVR-Bicubic75AR/Lanczos4AR/Lanczos4AR LumaSharpen -Strength0.9-Pattern3-Clamp0.1-OffsetBias2.0
turbojet is offline   Reply With Quote
Old 18th February 2014, 00:58   #23382  |  Link
cyberbeing
Broadband Junkie
 
Join Date: Oct 2005
Posts: 1,859
madshi, I noticed today that madVR produces diagonal distortion with uncompressed 4096x2304 v210 in AVI saved from VirtualDub. Uncompressed v210 in AVI below 4K has no issue. 4096x2304 v210 from LAV Video also has no problem, so the issue is specifically when madVR opens the raw video directly. If you don't believe this is a trivial fix, I'll stick it up on your bug tracker.
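For what it's worth, a back-of-the-envelope sketch of why a 4096-wide v210 frame is a corner case (this is only the packing math, not a diagnosis of where madVR's raw reader goes wrong): v210 packs 6 pixels into 16 bytes and rows are normally padded to a multiple of 128 bytes, so 1920 and 3840 pack exactly while 4096 needs end-of-row padding; a reader that assumed a tightly packed stride would shift each row slightly, which shows up as diagonal skew.
Code:
import math

def v210_stride(width_px: int) -> int:
    """Row stride in bytes for v210 (10-bit 4:2:2), assuming the usual 128-byte row alignment."""
    tight = math.ceil(width_px * 16 / 6)      # 6 pixels are packed into 16 bytes
    return math.ceil(tight / 128) * 128       # rows padded up to a 128-byte boundary

for w in (1920, 3840, 4096):
    print(w, v210_stride(w), w * 16 / 6)      # padded stride vs. naive tight packing

# 1920 and 3840 divide evenly (5120 and 10240 bytes); 4096 does not (11008 padded vs. 10922.67 tight).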

Last edited by cyberbeing; 18th February 2014 at 01:05.
cyberbeing is offline   Reply With Quote
Old 18th February 2014, 01:07   #23383  |  Link
MistahBonzai
Registered User
 
Join Date: Mar 2013
Posts: 101
Quote:
Originally Posted by leeperry View Post
When you compare NL6 and A4, don't you see the major decrease in noise? Are you running Reclock in 24/60Hz?
.
.
.
Right up front: I almost never watch actual content unless it is to judge image quality. I'm a confirmed 'pixel peeper'.

I utilize ReClock with 23/24/25/30p for image calibration. Otherwise, in the real world, I use AviSynth plugins to image-double (InterFrame2) or apply other smoothing techniques. Yeah, I know they destroy the image from a purist's perspective, but that's part of the challenge.

So far I have limited my comparisons to adaptive4 versus linear light. I figured there should be a significant difference between them, no? Update: I have fallen back to the basic approach of capturing, cropping, zooming and eyeballing. It consists of controlled print-screen captures of the gradient-perceptual-v2.mkv video offloaded to PaintShop Pro, cropping a suitable sample of the lower right-hand corner and viewing it zoomed to 400%. The differences are readily apparent; no contest, the ED capture created with adaptive4 wins hands down on smoothness and consistency.

I agree with your corrective lens observations. My 'distance glasses' are adaptive photo-grey. I don't wear corrective lenses while performing critical viewing because of the undesirable image shifting they introduce. I assume you do, and that yours were made to minimize light diffraction and tinting. Or are they a special tool used to control the light environment associated with emissive-light-source image calibration?

In any case, very useful info, because the cataract in my right eye ain't getting better.

I have misplaced (lost) my contrast meter - it was a pro model used in the display service procedures for Kodak imaging systems (KIMS) - so I'm no longer able to measure post-calibration peak light output or calculate contrast.

I have constructed my own 'white balance' calibration tool: a black shoe box partitioned lengthwise, equipped with a 6500K light source and a photo-grey card on the left and a view port on the right. I can attach ND filter(s) to balance the brightness of the calibration source against the display image when needed. Crude, yes, but it works very well. Well, back to pixel peeping.

Last edited by MistahBonzai; 18th February 2014 at 01:14. Reason: clarity..?
MistahBonzai is offline   Reply With Quote
Old 18th February 2014, 01:31   #23384  |  Link
The 8472
Registered User
 
Join Date: Jan 2014
Posts: 51
Quote:
Originally Posted by XMonarchY View Post
Finally, I set LAV Video to use nVidia CUVID acceleration for decoding. I know decoding is not the same as rendering, but since CUVID uses GPU, could it be reducing madVR rendering that also needs GPU power?
I think the video decoders use ASICs and not the shader cores, so they shouldn't affect compute performance itself. But they might consume some memory or PCIe bandwidth, especially with copyback.
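For rough numbers (a back-of-the-envelope sketch assuming 1080p24 8-bit 4:2:0, not a measurement):
Code:
# Rough copy-back bandwidth estimate for 1080p24 NV12 (8-bit 4:2:0) video.
width, height, fps = 1920, 1080, 24
bytes_per_frame = width * height * 3 // 2      # NV12: 1 byte luma + 0.5 byte chroma per pixel
round_trip = 2 * bytes_per_frame * fps         # GPU -> system RAM for filters, then back up to render
print(f"{bytes_per_frame / 1e6:.1f} MB per frame, {round_trip / 1e6:.0f} MB/s round trip")
# ~3.1 MB/frame, ~150 MB/s: small next to PCIe 2.0 x16 (~8 GB/s), but not zero, and it also costs memory copies.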
The 8472 is offline   Reply With Quote
Old 18th February 2014, 01:43   #23385  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
Quote:
Originally Posted by cyberbeing View Post
madshi, I noticed today that madVR produces diagonal distortion with uncompressed 4096x2304 v210 in AVI saved from VirtualDub. Uncompressed v210 in AVI below 4K has no issue. 4096x2304 v210 from LAV Video also has no problem, so the issue is specifically when madVR opens the raw video directly. If you don't believe this is a trivial fix, I'll stick it up on your bug tracker.
There is a problem with 4096-wide 16-bit PNGs too. I was waiting for the final 87.x version to see if it's still there, but it is most likely related.

Just try this PNG: http://media.xiph.org/sintel/sintel-4k-png16/00000162.png - I get a red screen.
The SD 16-bit version works fine: http://media.xiph.org/sintel/sintel-1k-png16/00000162.png

Quote:
Originally Posted by XMonarchY View Post
Would enabling my i7 3770K HD4000 CPU graphics to work along with my GeForce GTX 770 improve madVR rendering performance? Would I need to get the Lucid Logix software to make it happen?
I tried it out a year or longer ago; of course this doesn't work, and yeah, you would need the Lucid Logix software to use it. The software is not free anymore... ignore it. It's mostly for tearing-free playback, and there are no real tearing problems with madVR.

Quote:
Finally, I set LAV Video to use nVidia CUVID acceleration for decoding. I know decoding is not the same as rendering, but since CUVID uses GPU, could it be reducing madVR rendering that also needs GPU power?
Why should it decrease render times? It increases them, if only very little. What you are seeing is easy to explain: CUVID is CUDA-based (at least the upload...), and CUDA forces your GPU into the highest possible power state, which is just a waste of power.

Because the GPU is now "faster", render times are lower. Render times are pretty unreliable anyway; they depend on the GPU power states.

Just use DXVA: it uses the same decoder block in the GPU but doesn't force your GPU into the highest power state, which is simply better, at least with newer cards with DXVA2 support (VP4+).

Didn't I tell you the same thing on AVSForum, where you stated that CUVID looks a lot better than DXVA or software decoding...?
huhn is offline   Reply With Quote
Old 18th February 2014, 02:00   #23386  |  Link
Mfusick
Registered User
 
Join Date: Nov 2013
Posts: 1
Quote:
Originally Posted by Asmodian View Post
madVR does not use SLI. Actually simply enabling SLI has a huge negative performance hit, at least on my system.

For example at my 720p settings:
If I disable SLI I get 38ms rendering times, 84% GPU0 (1071Mhz), 0% GPU1 (324Mhz).
If I enable SLI I get 84ms rendering times, 79% GPU0 (1071Mhz), 30% GPU1 (836Mhz).

Oddly while in SLI mode the memory usage of both cards is identical but the memory controller for GPU1 is idle.

If I force a SLI rendering mode (AFR1 or AFR2) I get similar performance and horrible flickering (the second GPU's frames are black?). This flickering has happened with all versions of madVR I have ever used when forced into a SLI rendering mode.

I am using Titans and running drivers 327.23.



I have a 2560x1440 monitor, if you are using a lower resolution you might be able to get away with higher settings.

I needed to set up profiles, but for <720p I use:
32 neuron NNEDI3 chroma 4:2:0 -> 4:4:4
128 neuron NNEDI3 luma doubling if scaling is >= 2.0
32 neuron NNEDI3 luma quadrupling if scaling is >= 4.0
Jinc3AR image upsampling (very small performance hit relative to Lanczos, so might as well)
Catmull-Rom AR+LL image downscaling

For >= 720p I use:
Same as above, except 64 neuron NNEDI3 luma doubling if scaling is >= 2.0

No "trade quality for performance" options. Smooth motion on when watching 24fps on a 60Hz display.
Is it really true that you can't use dual-GPU cards with madVR?
Mfusick is offline   Reply With Quote
Old 18th February 2014, 03:23   #23387  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Well, you can use madVR on SLI systems; it is just much slower than if you disable SLI. I updated my drivers again, so I am not using NNEDI3 at the moment (I usually watch Blu-rays, and 1080p->1440p with NNEDI3 isn't much better), so I do not notice or care that rendering is slower. EDIT: I will probably switch back soon though; NNEDI3 chroma upsampling is nice all the time.

It is probably only an issue when using NNEDI3 or if you have SLI'ed two low end cards.

I would love to hear reports from other SLI and Crossfire users as to rendering times with and without SLI/Crossfire. Use NNEDI3 to put a real load on the GPU, otherwise the cards sit in a lower power state and ramp up their clock speeds more in SLI instead of taking longer to render.

Last edited by Asmodian; 18th February 2014 at 03:26.
Asmodian is offline   Reply With Quote
Old 18th February 2014, 04:35   #23388  |  Link
Stereodude
Registered User
 
Join Date: Dec 2002
Location: Region 0
Posts: 1,436
Quote:
Originally Posted by leeperry View Post
I guess 700:1 is your bottleneck especially if there's a thick pearly anti-glare layer on your monitor. I'm always dubious when I see computer geeks raving about 1440p 24/27" monitors that use IPS panels and come with a very blurry anti-glare layer....they are not getting 1440p by a long shot IMO due to the blurriness.
Only if he doesn't use an oxygen free cable that's at least 24 gauge not longer than 42cm. The skin effect and jitter really bloom out of control once it gets longer than that. Oh yeah, you've got to use an isolation transformer too. If you don't it totally ruins the immersive quality.
Stereodude is offline   Reply With Quote
Old 18th February 2014, 04:51   #23389  |  Link
MistahBonzai
Registered User
 
Join Date: Mar 2013
Posts: 101
Quote:
Originally Posted by Stereodude View Post
Only if he doesn't use an oxygen free cable that's at least 24 gauge not longer than 42cm. The skin effect and jitter really bloom out of control once it gets longer than that. Oh yeah, you've got to use an isolation transformer too. If you don't it totally ruins the immersive quality.
Wow... I didn't know that! The things I learn @doom9 just blow me away...
MistahBonzai is offline   Reply With Quote
Old 18th February 2014, 05:12   #23390  |  Link
GREG1292
Registered User
 
Join Date: Aug 2007
Location: Fort Wayn,Indiana
Posts: 52
A4 works for me and my 799.00 Mitsubishi HC7900. I am an end user and this build
seems to have it all. Just when I think I am happy the bar is raised even higher.
Since last week I have tested all the builds to date and put 36 hours on my projector while
testing them. Job well done!
GREG1292 is offline   Reply With Quote
Old 18th February 2014, 08:58   #23391  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by Stereodude View Post
Only if he doesn't use an oxygen free cable that's at least 24 gauge not longer than 42cm. The skin effect and jitter really bloom out of control once it gets longer than that. Oh yeah, you've got to use an isolation transformer too. If you don't it totally ruins the immersive quality.


leeperry,

That just goes to show that you are way overboard with your eagle-eye theory.
You are probably the only one who can actually see the difference clearly with your own eyes and use terms like "pop", "looks like 3D", "looking through a dirty window", etc...
The difference is not THAT drastic between the builds (without enhancements).

I wonder if you would see the difference in a real blind test with the current builds, without enhancements.

Moreover,
if you are testing the builds with uncompressed (original or remux) Blu-ray content,
dare I say, madVR dithering does absolutely nothing in terms of visible banding/error correction, so you could disable it and not see the difference.
This is for the simple reason that the video has already been heavily dithered from the original high-bit master down to 16-235 Blu-ray (about 7.8 bits).
Now, compressed "internet content" that destroys/blends this native dithering into solid colors, or anime, is a whole different story.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 18th February 2014 at 09:08.
James Freeman is offline   Reply With Quote
Old 18th February 2014, 09:30   #23392  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,926
Quote:
Originally Posted by James Freeman View Post


leeperry,

That just goes to show that you are way overboard with your eagle-eye theory.
You are probably the only one who can actually see the difference clearly with your own eyes and use terms like "pop", "looks like 3D", "looking through a dirty window", etc...
The difference is not THAT drastic between the builds (without enhancements).

I wonder if you would see the difference in a real blind test with the current builds, without enhancements.

Moreover,
if you are testing the builds with uncompressed (original or remux) Blu-ray content,
dare I say, madVR dithering does absolutely nothing in terms of visible banding/error correction, so you could disable it and not see the difference.
This is for the simple reason that the video has already been heavily dithered from the original high-bit master down to 16-235 Blu-ray (about 7.8 bits).
Now, compressed "internet content" that destroys/blends this native dithering into solid colors, or anime, is a whole different story.
You should take this back...

madVR upscales the chroma in 16 bit, so there is now 16-bit information in the chroma. Then the 16-235 -> 0-255 correction is done in 16 bit, and this can create banding. And then there's the BT.709 to sRGB or 3D LUT correction... this creates even more information in those 16 bits, and you want to just clip it all away? And you really think that is lossless/invisible?

About leeperry, I can't take him seriously. I mean, good 1440p displays are 10-bit professional displays; they are at the top... if not the best of the best...
huhn is offline   Reply With Quote
Old 18th February 2014, 09:30   #23393  |  Link
ryrynz
Registered User
 
ryrynz's Avatar
 
Join Date: Mar 2009
Posts: 3,650
Quote:
Originally Posted by James Freeman View Post
An Image(s) worth a thousand words:

Original (Untouched).

No Dithering (Low bitrate).

Random Dithering (Low bitrate dithered with MadVR Random Dithering).

ED Adaptive 4 (Low bitrate dithered with MadVR Adaptive 4).
Incredible, but how come there's dithering on the black bars? Can you add an untouched screenshot of Adaptive 4 please?

Last edited by ryrynz; 18th February 2014 at 09:33.
ryrynz is offline   Reply With Quote
Old 18th February 2014, 09:36   #23394  |  Link
iSunrise
Registered User
 
Join Date: Dec 2008
Posts: 496
Quote:
Originally Posted by ryrynz View Post
Incredible, but how come there's dithering on the black bars? Can you add an untouched screenshot of Adaptive 4 please?
Please read the last 2 pages; this has already been answered.
iSunrise is offline   Reply With Quote
Old 18th February 2014, 10:00   #23395  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by huhn View Post
You should take this back...

madVR upscales the chroma in 16 bit, so there is now 16-bit information in the chroma. Then the 16-235 -> 0-255 correction is done in 16 bit, and this can create banding. And then there's the BT.709 to sRGB or 3D LUT correction... this creates even more information in those 16 bits, and you want to just clip it all away? And you really think that is lossless/invisible?
Don't get me wrong, I'm very thankful for madshi and madVR.
That's why I specifically said visible difference & see the difference.

Yeah, I am completely aware of the 16-bit processing chain.
What you explain in your example is that a smooth, undithered 16-235 video (a test pattern, for example)
WILL have very visible errors when stretched to 0-255, whether after all the 16-bit processing (LUT, gamma, etc...) or without it.
BUT content that is natively dithered to 16-235 will have almost no visible errors when stretched to 0-255.
See?

What I meant was "You can't make sand look different after adding more sand".
Hmm... maybe that's a bad analogy.

I'll try to explain it again:
by dithering again (to 0-255) the already randomly dithered 16-235 pixels, you are fixing random, pixel-by-pixel noise errors, which is almost imperceptible.

Another try:
where do you encounter banding caused by bit-conversion errors?
In smooth-gradient (skies, for example), undithered content whose bit depth does not match that of your display device (8-bit), be it 6-bit, 7.8-bit or 16-bit.
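A tiny numerical sketch of that stretch (my own toy example, not madVR's actual pipeline, which works in 16 bit with much better dithering): 16-235 carries about log2(220) ≈ 7.8 bits, and expanding it to 0-255 puts adjacent codes roughly 1.164 apart, so converting back to 8-bit integers means either rounding (a structured, repeating error pattern, i.e. banding) or adding a little noise first (the same error, but spread out as dither).
Code:
import numpy as np

rng = np.random.default_rng(0)
limited = np.arange(16, 236, dtype=float)               # an 8-bit limited-range (16-235) ramp
stretched = (limited - 16) * 255.0 / 219.0              # full-range values, steps of ~1.164, non-integer

rounded  = np.round(stretched)                                         # plain rounding back to 8 bit
dithered = np.round(stretched + rng.uniform(-0.5, 0.5, limited.size))  # noise added before rounding

print(np.unique(np.diff(rounded)))    # steps of 1 and 2 in a fixed, repeating pattern -> banding
print(np.unique(np.diff(dithered)))   # a mix of small steps in no fixed pattern -> the error becomes noise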


In the future, madVR will be a very big thing, when the native bit depth of Blu-ray (v2.0) is 10/12 bit, for people who will still be using 8-bit panels.


Quote:
Originally Posted by ryrynz
Incredible, but how come there's dithering on the black bars? Can you add an untouched screenshot of Adaptive 4 please?
Backing up what I have already written: you will not see the difference with or without dithering.
The Hobbit is already a heavily dithered Blu-ray (it's my reference Blu-ray disc), like most properly converted Blu-rays.

Yeah, I'll post some untouched images anyway (No Dithering, Random, Adaptive 4).
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 18th February 2014 at 12:09.
James Freeman is offline   Reply With Quote
Old 18th February 2014, 10:06   #23396  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by MistahBonzai View Post
I use AVIsynth plugins [..] I know they destroy the image
8-bit processing is a big no-no; it will increase both banding and the noise floor. You are throwing away quite a lot of useful data that mVR could put to good use.

Quote:
Originally Posted by MistahBonzai View Post
The differences are readily apparent - no contest the ED capture created with adaptive4 wins hands down based on smoothness and constancy.
Yup hard to deny, even Stevie Wonder can see that

Quote:
Originally Posted by MistahBonzai View Post
I don't wear corrective lenses while performing critical viewing cuz of the undesirable image shifting they introduce. I assume you do and they were created to minimize light diffraction and tinting. Or are they a special tool used to control the light environment associated with emissive light source image calibration?
I either use 59-Abbe mineral glass lenses with a pretty sharp sprayed multi-coated anti-glare layer or, better, Focus Dailies AquaComfort contact lenses, which provide the best sharpness but are hard to bear in a pitch-black room.

Quote:
Originally Posted by MistahBonzai View Post
I have misplaced (lost) my contrast meter[..] no-longer able to measure post calibration peak light output or calculate contrast.
Argyll can very easily measure the color temperature and native contrast using:
Code:
dispcal -P 1,1,5.5,5.5 -r -yl -v -Y p
Quote:
Black level = 0.0386 cd/m^2
50% level = 30.18 cd/m^2
White level = 134.78 cd/m^2
Aprox. gamma = 2.26
Contrast ratio = 3488:1
White chromaticity coordinates 0.3104, 0.3265
White Correlated Color Temperature = 6651K, DE 2K to locus = 4.5
White Correlated Daylight Temperature = 6651K, DE 2K to locus = 0.1
White Visual Color Temperature = 6481K, DE 2K to locus = 4.3
White Visual Daylight Temperature = 6656K, DE 2K to locus = 0.1
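(The contrast figure is simply the white level divided by the black level; a trivial check using the values above:)
Code:
# Contrast ratio from the dispcal report: peak white divided by black level.
white_cd = 134.78    # cd/m^2, "White level"
black_cd = 0.0386    # cd/m^2, "Black level"
print(f"Contrast ratio ~ {white_cd / black_cd:.0f}:1")
# Prints ~3492:1 vs. dispcal's 3488:1; the small gap is presumably just rounding in the displayed values.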
Quote:
Originally Posted by MistahBonzai View Post
I can attach ND filter(s) to balance the brightness of calibration source versus the display image when needed. Crude yes, but it works very well.
Every optical filter in the light path will tamper with sharpness, even the TOTL multi-coated ones, AFAIK.

Quote:
Originally Posted by GREG1292 View Post
A4 works for me and my 799.00 Mitsubishi HC7900. I am an end user and this build seems to have it all. Just when I think I am happy the bar is raised even higher.
Yaah, kinda makes you wonder if that's ever gonna end ^^

Quote:
Originally Posted by James Freeman View Post
I wonder if you would see the difference in a real blind test with the current builds without enhancements.
Again, make it interesting and I'll happily prove you wrong.

BTW, I see that you've literally become a self-made prime expert at scrutinizing screenshots overnight. I'm impressed; keep up the good work.

Quote:
Originally Posted by huhn View Post
1440p displays are 10-bit professional displays; they are at the top... if not the best of the best
Who gives a damn about 10-bit if it's a 600:1 IPS panel, huh(n)?

Last edited by leeperry; 18th February 2014 at 11:24.
leeperry is offline   Reply With Quote
Old 18th February 2014, 11:45   #23397  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
To back up this argument:
by dithering again (to 0-255) the already randomly dithered 16-235 pixels, you are fixing random, pixel-by-pixel noise errors, which is almost imperceptible.


007 Bond:
No Dithering

Frodo:
No Dithering
Random Dithering
Adaptive 4

Bilbo:
No Dithering
Random Dithering
Adaptive 4

Moon:
No Dithering
Random Dithering
Adaptive 4

Bilbo (Clipped Whites to level 30):
No Dithering
Random Dithering
Adaptive 4

Bilbo (MadVR Brightness + Photoshop Treatment):
No Dithering
Random Dithering
Adaptive 4

Moon (MadVR Brightness + Photoshop Treatment):
No Dithering
Random Dithering
Adaptive 4


Hopefully you can see that dithering already-dithered content yields almost no visible difference (the Moon & 007 shots show it best).
You can clearly see the heavy dithering that the original Blu-rays have.
Note that the Moon shot is completely CGI (no camera noise, unlike the other shots), so why the ugly dithering? Maybe it's the AVC compression (unlikely, because that does just the opposite)?

Even after enhancements (Bilbo Clipped Whites), there is (almost) no visible madVR dithering, but the native colorful RGB dithering (or camera noise?) becomes visible on the walls & fireplace.

The last two sets of images are with brightness lowered to -100 and gamma set to 2.60 in madVR (no contrast change = no clipped/dithered blacks), then with the Output Levels midpoint lifted in Photoshop.
Here you can clearly see madVR dithering at work (which is effing perfect, I might add).
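If you want to reproduce that inspection step without Photoshop, here is a rough equivalent (only a sketch; the file name is made up and the curve merely approximates the brightness/gamma + Output Levels treatment described above):
Code:
from PIL import Image

# Exaggerate the near-black region of a screenshot so dither patterns become obvious.
img = Image.open("bilbo_adaptive4.png").convert("RGB")   # hypothetical capture file
lifted = img.point(lambda v: min(255, v * 8))            # map 0-31 onto the full 0-255 range; brighter pixels clip to white
lifted.save("bilbo_adaptive4_lifted.png")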


I just wanted to say: whoever can actually discern the differences between the DC, NL, Adaptive, or any other ED test builds with real Blu-ray content (leeperry), I take my hat off to you.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 18th February 2014 at 13:26.
James Freeman is offline   Reply With Quote
Old 18th February 2014, 15:58   #23398  |  Link
turbojet
Registered User
 
Join Date: May 2008
Posts: 1,840
http://www.anandtech.com/show/7764/t...view-maxwell/9

Given that this is an 'entry level' card that may see some passive models and can do just about everything with madVR (NNEDI3 TBD), Maxwell may be very nice for madVR.

There's a bug report in the last paragraph that I haven't seen in this thread.
__________________
PC: FX-8320 GTS250 HTPC: G1610 GTX650
PotPlayer/MPC-BE LAVFilters MadVR-Bicubic75AR/Lanczos4AR/Lanczos4AR LumaSharpen -Strength0.9-Pattern3-Clamp0.1-OffsetBias2.0

Last edited by turbojet; 18th February 2014 at 16:03.
turbojet is offline   Reply With Quote
Old 18th February 2014, 16:10   #23399  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by turbojet View Post
There's a bug report in the last paragraph that I haven't seen in this thread.
Gotta love reviewers who can't be hassled to make official bug reports.

The OpenCL benchmark result doesn't look too good, sub-HD7790 territory.

Last edited by leeperry; 18th February 2014 at 16:14.
leeperry is offline   Reply With Quote
Old 18th February 2014, 16:38   #23400  |  Link
turbojet
Registered User
 
Join Date: May 2008
Posts: 1,840
Where's the OpenCL benchmark?

The last paragraph at AnandTech isn't worded very well and is missing some information. To me it sounds like the card can do Jinc3AR chroma and image upscaling plus NNEDI3 chroma and luma doubling on a 1080p display with source resolutions up to 1080p. Even if it can only do that with a 720p source, it's pretty impressive for the price. It would be nice if they fixed their tables and used a more comparable NVIDIA card for the madVR tests, like a 650 Ti.
__________________
PC: FX-8320 GTS250 HTPC: G1610 GTX650
PotPlayer/MPC-BE LAVFilters MadVR-Bicubic75AR/Lanczos4AR/Lanczos4AR LumaSharpen -Strength0.9-Pattern3-Clamp0.1-OffsetBias2.0
turbojet is offline   Reply With Quote