Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 30th April 2016, 19:52   #37641  |  Link
baii
Registered User
 
Join Date: Dec 2011
Posts: 180
Quote:
Originally Posted by Georgel View Post
I was thinking of buying a laptop with desktop GPU GTX980. (Not 980M). My question regarding madVR, is: would an i7-6700K + GTX980 be able to run NNEDI at 256 Neurons and add edge sharpening + edge thinning + debanding + 4K upscaling with AR and sigmoidal light using Jinc ?

I am a bit curious if anyone has tried, because I was thinking of getting a P870 from Clevo, and while it is a powerhouse of a laptop, I am afraid that for using madVR in these conditions I should probably wait a bit more for the new generation of GPUs to appear. Especially since I will always be upscaling FHD to 4K.
256-neuron luma NNEDI3? No, let alone with all the enhancements on top.

With no enhancements, a 980 Ti / Fury X can do 128 neurons for 1080p while gimping everything else. Anyway, cranking all the options on doesn't always make it "look better".

Btw, if you push the GPU hard on a laptop, the fan noise can be pretty annoying unless you have headphones on all the time.
Old 30th April 2016, 19:53   #37642  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by Patrik G View Post
dont you need a tv with 10 bit panel to see steps above 255?
that panasonic plasma is only 8 bit right ?
sure it will be clipping.

get a 500M instead
true 10bit panel
Plasmas are technically 1-bit and noisy compared to LCD screens.

And of course you don't need more bit depth to show steps higher than X; that's not how this works.
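The point about bit depth can be shown with a toy sketch (illustrative Python, nothing madVR-specific): code values are fractions of the same output range, so a higher bit depth adds finer steps in between, not "higher" steps.

```python
# Illustrative only: 255 in 8-bit and 1023 in 10-bit are the same peak
# level; extra bits only shrink the spacing between adjacent levels.
def normalize(code, bits):
    """Map an integer code value to a 0.0-1.0 fraction of full scale."""
    return code / (2 ** bits - 1)

assert normalize(255, 8) == normalize(1023, 10) == 1.0  # same white
step_8bit = normalize(1, 8)    # 1/255
step_10bit = normalize(1, 10)  # 1/1023, roughly 4x finer
assert step_10bit < step_8bit
```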
Old 30th April 2016, 20:04   #37643  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by Georgel View Post
I was thinking of buying a laptop with desktop GPU GTX980. (Not 980M). My question regarding madVR, is: would an i7-6700K + GTX980 be able to run NNEDI at 256 Neurons and add edge sharpening + edge thinning + debanding + 4K upscaling with AR and sigmoidal light using Jinc ?
1080p NNEDI3 32-neuron doubling already demands a lot of performance; a GTX 980 can do 1080p 30fps NNEDI3 64 doubling plus some very lightweight postprocessing at most. When you output at 4K resolution, max-quality debanding isn't light on resources either, and neither is sharpen edges.
Don't buy an expensive notebook now; probably as soon as June or July you'll be able to buy notebooks with a Polaris 10 or GP104/106 GPU, which will be a huge technology update over current GPUs in general (HEVC 10-bit, HDR, DisplayPort 1.3...).
Old 30th April 2016, 20:15   #37644  |  Link
Uoppi
Registered User
 
Join Date: Oct 2015
Posts: 99
Quote:
Originally Posted by Asmodian View Post

The only other option is madVR 16-235 and the GPU full, but this clips your desktop.
This is the recommended combination if desktop colors are of no concern, right?

I'm getting confused once again because, on the other hand, some say to use madVR full and GPU full.

I have madVR limited, GPU full and HDMI black level "low" on my Samsung TV (can't remember whether low meant full or limited, but I'm guessing low = full). Can someone confirm whether this is optimal? Black levels look alright on test clips at least.
Old 30th April 2016, 20:25   #37645  |  Link
baii
Registered User
 
Join Date: Dec 2011
Posts: 180
Yet another TV color range question, so confusing, grr.

Samsung TV hdmi in
Intel HD

I have a full-range GPU, PC mode on the TV (should be full, IDK...), HDMI normal, madVR set to full; the AVS HD 709 black clipping bars flash at 17+.
HDMI low with full on the GPU will crush black (no flashing).

Does this sound right~?
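That behavior can be sanity-checked with a toy model (a sketch with assumed numbers, not anything madVR-specific): a TV input set to "low"/limited expects 16-235 and stretches it to full, so a full-range signal loses everything below 16, which is exactly the "no flash" black crush.

```python
# Toy model of a TV HDMI input set to "low" (limited) range: it assumes
# video black = 16 and white = 235, stretches that span to 0-255, and
# clips anything outside. The blacker-than-black bars (below 16) in the
# AVS HD 709 clipping pattern all land on 0, so they stop flashing.
def tv_low_expand(code):
    out = (code - 16) * 255 / (235 - 16)
    return max(0, min(255, round(out)))

assert tv_low_expand(16) == 0      # reference black
assert tv_low_expand(235) == 255   # reference white
assert tv_low_expand(10) == 0      # below-black detail is crushed (no flash)
assert tv_low_expand(17) > 0       # bars at 17+ stay visible
```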

And 1080p 59hz vs 60hz?

I hate TVs...
Seems like I'm in the same boat as Uoppi, lol.

edit
Going to go with this~~ from AVS zoyd
http://www.avsforum.com/forum/139-di...l#post33427002

For me it would be the second one.

Last edited by baii; 30th April 2016 at 21:05.
Old 30th April 2016, 21:58   #37646  |  Link
trueunborn
Registered User
 
Join Date: Feb 2014
Posts: 2
Help with 3D

Hi all. I got the 3D player working great with the latest LAV and madVR, but ReClock does not correctly detect the refresh rate that madVR is forcing: ReClock thinks it is 24 Hz while madVR forces 23 Hz (x2 in reality...), and the result is a lot of dropped frames. I've tried a lot of things with no luck. Any ideas?

BTW, I have an Nvidia card with drivers 364.47.
Old 30th April 2016, 22:21   #37647  |  Link
JamPS
Registered User
 
Join Date: Jul 2005
Posts: 9
Quote:
Originally Posted by trueunborn View Post
...Reclock thinks it is 24hz while MAdvr forces 23hz (x2 in reality...)... the result is a lot of dropped frames. Tried a lot of things with no luck.Any idea?
Try setting ReClock on manual fps.
Old 30th April 2016, 22:34   #37648  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by Uoppi View Post
This is the recommended combination if desktop colors are of no concern, right?

I'm getting confused once again because, on the other hand, some say to use madVR full and GPU full.

I have madVR limited, GPU full and HDMI black level "low" on my Samsung TV (can't remember whether low meant full or limited, but I'm guessing low = full). Can someone confirm whether this is optimal? Black levels look alright on test clips at least.
I'd set everything to full range so desktop levels remain accurate and the GPU doesn't have to do a range conversion for the display -- after madVR, the output is straight passthrough.

Your configuration is considered optimal because madVR doesn't clip anything and all output is passed through as-is to the display. But I doubt it has any real advantage over setting everything to full range, especially when your desktop levels will be off and look distracting.

Can anyone claim superiority for setting madVR to 16-235?
Old 30th April 2016, 23:12   #37649  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by Warner306 View Post
Can anyone claim superiority for setting madVR to 16-235?
There is some banding introduced by expanding 16-235 to 0-255, but madVR's dithering prevents this from being an issue (it results in extra dither noise instead of banding), at least on most displays.
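The trade-off can be shown numerically (a toy 8-bit sketch, not madVR's actual high-precision pipeline): expanding 220 limited-range steps onto 256 full-range codes leaves gaps between output codes, and adding dither noise before rounding smears those gaps out instead of leaving visible bands.

```python
import random

def expand_16_235_to_full(code, dither=False):
    """Toy 8-bit levels expansion; real renderers work at higher precision."""
    out = (code - 16) * 255 / 219
    if dither:
        out += random.uniform(-0.5, 0.5)  # crude stand-in for real dithering
    return max(0, min(255, round(out)))

# Without dithering, the 220 input steps hit only 220 of the 256 output
# codes; the 36 codes that are never produced show up as banding.
used = {expand_16_235_to_full(c) for c in range(16, 236)}
assert len(used) == 220

# With dithering, a flat patch becomes a mix of adjacent codes whose
# average is the exact target level, i.e. noise instead of a band edge.
patch = [expand_16_235_to_full(100, dither=True) for _ in range(10000)]
```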
__________________
madVR options explained
Old 30th April 2016, 23:53   #37650  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by Asmodian View Post
There is some banding introduced by expanding 16-235 to 0-255 but madVR's dithering prevents this being an issue (it results in extra dither noise instead of banding), at least on most displays.
If the display is set to 16-235, does it clip 0-15 or 236-255? I'm still editing this section in my own guide.

Last edited by Warner306; 1st May 2016 at 00:16.
Old 1st May 2016, 00:56   #37651  |  Link
Georgel
Visual Novel Dev.
 
Georgel's Avatar
 
Join Date: Nov 2015
Location: Bucharest
Posts: 200
Quote:
Originally Posted by baii View Post
256-neuron luma NNEDI3? No, let alone with all the enhancements on top.

With no enhancements, a 980 Ti / Fury X can do 128 neurons for 1080p while gimping everything else. Anyway, cranking all the options on doesn't always make it "look better".

Btw, if you push the GPU hard on a laptop, the fan noise can be pretty annoying unless you have headphones on all the time.
I thought that an ultra-fast GPU could do a bit more.

I do wear headphones almost all the time. I am one of those audiophools who invest in headphones too. After all, my project is about audio reproduction and music quality and fidelity.
Old 1st May 2016, 00:57   #37652  |  Link
Georgel
Visual Novel Dev.
 
Georgel's Avatar
 
Join Date: Nov 2015
Location: Bucharest
Posts: 200
Quote:
Originally Posted by Warner306 View Post
If the display is set to 16-235, does it clip 0-15 or 236-255? I'm still editing this section in my own guide.
If you have the display set to 16-235, it normally clips all values outside that range, so yes, they should be clipped.
Old 1st May 2016, 01:02   #37653  |  Link
Georgel
Visual Novel Dev.
 
Georgel's Avatar
 
Join Date: Nov 2015
Location: Bucharest
Posts: 200
Quote:
Originally Posted by aufkrawall View Post
1080p NNEDI3 32-neuron doubling already demands a lot of performance; a GTX 980 can do 1080p 30fps NNEDI3 64 doubling plus some very lightweight postprocessing at most. When you output at 4K resolution, max-quality debanding isn't light on resources either, and neither is sharpen edges.
Don't buy an expensive notebook now; probably as soon as June or July you'll be able to buy notebooks with a Polaris 10 or GP104/106 GPU, which will be a huge technology update over current GPUs in general (HEVC 10-bit, HDR, DisplayPort 1.3...).
I have been studying the new generation of GPUs. I am a member of NotebookReview, and their Pascal thread seems to be about the most fired-up one on the internet.

The 1080M sadly looks like one of those huge letdowns you wish you could get over: all the news points to the 1080M being released a bit later, and doesn't point to a much bigger jump in power.

Well, I mainly need a laptop with a 4K display because I am creating visual novels and need that color accuracy.

But I also abuse my day-to-day laptop to watch anime, listen to music and do programming, so a laptop that has a 4K screen but on which I can also do serious gaming and use madVR at its full potential really sounds sweet. So far I am able to use everything except the NNEDI algorithms with my GTX 860M, so it is not a major letdown, but I am not watching videos in 4K. I am a bit worried that when I do get to 4K I might be disappointed in the performance of that GPU (desktop GTX 980).

I want to be able to fluently play games and watch movies in native 4K: 60fps for games, and no dropped frames for madVR.

I also hope that the new link they use for Pascal could make SLI (or whatever it will be named) easier to implement in madVR (or automatic, why not?).
Old 1st May 2016, 02:04   #37654  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
NVLink is for professional HPC, not consumer GPUs.
16nm FF+ will also lower power consumption remarkably compared to 28nm. This is especially useful in notebooks or HTPCs; I can't imagine notebooks with a full GM204 chip being anywhere close to silent.

Also, with good 1080p sources there isn't really any need to use NNEDI3 over super-xbr (anti-bloat) + SuperRes, which requires much less GPU performance. Polaris 10 will most likely be slower than GP104 in games, but it will also cost less and still be sufficient for a lot of things.
GM204 isn't able to decode HEVC 10-bit or VP9; it will be outdated soon.
Old 1st May 2016, 02:43   #37655  |  Link
Stereodude
Registered User
 
Join Date: Dec 2002
Location: Region 0
Posts: 1,436
Quote:
Originally Posted by Patrik G View Post
dont you need a tv with 10 bit panel to see steps above 255?
that panasonic plasma is only 8 bit right ?
sure it will be clipping.

get a 500M instead
true 10bit panel
FWIW, I can see an improvement with 10-bit vs. 8-bit on my Samsung plasma with a gradient ramp (madVR dithering turned off).
Old 1st May 2016, 02:49   #37656  |  Link
Georgel
Visual Novel Dev.
 
Georgel's Avatar
 
Join Date: Nov 2015
Location: Bucharest
Posts: 200
Quote:
Originally Posted by aufkrawall View Post
NVLink is for professional HPC, not consumer GPUs.
16nm FF+ will also lower power consumption remarkably over 28nm. This is especially useful in notebooks or HTPCs, I can't imagine notebooks with full GM204 chip can actually be close to silent.

Also, with good 1080p sources there's not really any need to use NNEDI3 over super-xbr (anti-bloat) + SuperRes, which requires much less GPU performance. Polaris 10 will most likely be slower than GP104 in games, but it will also cost less and still be sufficient for a lot of things.
GM204 isn't able to decode HEVC 10 bit or VP-9, it will be outdated soon.
That is useful to know.

I will make sure to wait until they release the desktop version of the 1080 and see how much better it is compared to the desktop 980. I need to feel motivated to buy the thing.

Also, the Clevo P870 is not exactly your average laptop: it is pretty silent, built like a tank, accepts both a desktop CPU and a desktop GPU, and rocks one's world.

On the other hand, it is quite expensive and heavy. Really heavy.

That being said, I would still consider the P870, because I can get it with G.Skill RAM and Samsung PRO SSDs and other options that you cannot get on non-custom laptops.
Old 1st May 2016, 03:33   #37657  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by Georgel View Post
If you have the display set to 16-235, it normally clips all values outside that range, so yes, they should be clipped.
Right. To properly calibrate for 16-235 you would adjust the display until only 16-235 is visible, so everything above and below is effectively clipped.

madVR 16-235, GPU 0-255, and adjust the display until only 16-235 is visible.
__________________
madVR options explained

Last edited by Asmodian; 1st May 2016 at 03:40.
Old 1st May 2016, 04:03   #37658  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by Patrik G View Post
dont you need a tv with 10 bit panel to see steps above 255?
that panasonic plasma is only 8 bit right ?
sure it will be clipping.

get a 500M instead
true 10bit panel
This isn't how 10-bit vs. 8-bit works.

255 in 8-bit = 1023 in 10-bit.

64-940 in 10-bit is simply 16-235 in 8-bit; the problem is bad banding with a gradient from near 0 to 10,000(!?) nits with only 220 steps.

10,000 nits is crazy, but if it were possible you would need at least 12-bit for 3505 steps... hopefully we can get rid of limited range by then too, so we could use all 4096 steps.
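The code-value relationships above can be sketched directly (assuming the standard limited ranges, 16-235 in 8-bit and 64-940 in 10-bit): the legal 8-bit codes map onto the 10-bit grid times 4, so a 10-bit panel covers the same black-to-white range with roughly four times as many steps.

```python
# Standard limited-range video levels at the two bit depths:
BLACK_8, WHITE_8 = 16, 235     # 8-bit: 220 legal codes
BLACK_10, WHITE_10 = 64, 940   # 10-bit: 877 legal codes, same range

def limited_8_to_10(code8):
    """A legal 8-bit limited code lands on the 10-bit grid at 4x its value."""
    return code8 * 4

assert limited_8_to_10(BLACK_8) == BLACK_10    # 16 -> 64, same black
assert limited_8_to_10(WHITE_8) == WHITE_10    # 235 -> 940, same white
# Same span, exactly 4x as many steps in between:
assert WHITE_10 - BLACK_10 == 4 * (WHITE_8 - BLACK_8)
```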
__________________
madVR options explained
Old 1st May 2016, 04:58   #37659  |  Link
JarrettH
Registered User
 
Join Date: Aug 2004
Location: Canada
Posts: 860
Is there a string value for HEVC or H.265?

Basically what I want to do is create a profile to disable artifact removal if the file format is hevc.

Edit: Maybe this works for now...?

Quote:
if (not h264) or (not MPEG2) "x265"
OK, that isn't working... what do I need to fix?

Last edited by JarrettH; 1st May 2016 at 05:07.
Old 1st May 2016, 05:03   #37660  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by Asmodian View Post
Right, to properly calibrate for 16-235 you would adjust the display until only 16-235 were visible so everything above and below would be effectively clipped.

MadVR 16-235, GPU 0-255, and adjust the display until only 16-235 are visible.
Ok, I'm finally clear on this topic. Thanks.