Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 1st November 2018, 14:01   #53521  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,052
Quote:
Originally Posted by nevcairiel View Post
If it was truly trained on taking high-res originals and downscales of those, then that is entirely expected. If you do such a training, the filter doesn't learn to "upscale", it learns to un-do the downscale - which in theory sounds similar, but in practice can end up quite different.

Low-quality content does not qualify for that particular type, since even if it was downscaled once, those attributes were destroyed by over-compression, noise, or whatever makes it "low quality".

That's really the hard part with training (outside the computational requirements). You need to be careful how you train it, or you might bias the algorithm. If you only train on pristine downscales of high-quality high-res images, then that's what it'll be good at, and only that.
But of course, where do you get a set of medium- to low-quality images and high-quality upscales of those to train an algorithm? Someone would have to upscale those in the first place. Or you downscale images and then artificially degrade them, but unless you do that very carefully, the algorithm might once again just learn to un-do your degradation, and not in a very generic sense.
Yeah, I don't know how the training could be done for poor content with consistent results. It is difficult to carefully degrade a source and then repair it. I'm not sure if it's worth it when you can just choose something like NGU Anti-Alias that handles poor sources fairly well. But I've seen some odd results with NGU Sharp on some material.
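The pair-generation pitfall described above can be sketched with a toy routine (pure Python, 1-D "images"; all names here are hypothetical illustrations, not madVR's actual training code):

```python
import random

def downscale(img, factor=2):
    """Box-downscale a 1-D "image" by averaging non-overlapping blocks."""
    return [sum(img[i:i + factor]) / factor for i in range(0, len(img), factor)]

def degrade(img, noise=0.05):
    """One fixed, synthetic degradation (uniform noise). If every training
    pair is built with this exact step, the model tends to learn to invert
    THIS degradation, not 'low quality' in a generic sense."""
    return [px + random.uniform(-noise, noise) for px in img]

def make_training_pair(hi_res, simulate_low_quality=False):
    """Return (input, target). Training only on pristine downscales teaches
    the filter to un-do the downscale - the bias described above."""
    lo_res = downscale(hi_res)
    if simulate_low_quality:
        lo_res = degrade(lo_res)
    return lo_res, hi_res

hr = [float(i % 7) for i in range(16)]   # stand-in for a high-res original
lr, target = make_training_pair(hr)
print(len(lr), len(target))              # → 8 16
```

Either way the network only ever sees the degradations you chose to synthesize, which is exactly why content degraded by real-world compression falls outside the training distribution.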
Old 1st November 2018, 14:03   #53522  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,052
Quote:
Originally Posted by yok833 View Post
Hi guys,
With an LG OLED 2017, should I set 700 for the target peak nits or should I stick with 400?
I use the last test build with the measurement tool and the result is already amazing!!
You should be losing brightness at 400 nits if the display is actually 700 nits.
Old 1st November 2018, 14:04   #53523  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,052
Quote:
Originally Posted by madjock View Post
Regarding what to set the nits to, for the last two posts.

It seems subjective, depending on what film you are watching and your own brightness preference. From what I have read, the brighter you make it, the more chance you have of losing detail.

I think it will be another madVR setting tuned to personal taste, to a point.
The higher the target, the less chance of losing detail, because less compression is involved.
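The trade-off can be illustrated with a toy roll-off curve (a hypothetical sketch, not madVR's actual tone-mapping algorithm): the higher the target peak, the more of the signal passes through untouched and the gentler the compression of what remains.

```python
def tone_map(nits, target_peak, source_peak=1000.0):
    """Toy highlight compression: pass through below a knee at 75% of the
    target, then squeeze [knee, source_peak] into the remaining headroom."""
    knee = 0.75 * target_peak
    if nits <= knee:
        return nits  # untouched: no detail can be lost here
    t = (nits - knee) / (source_peak - knee)
    return knee + t * (target_peak - knee)

# A 500-nit highlight from a 1000-nit master:
print(tone_map(500.0, target_peak=700))  # → 500.0 (passes through unchanged)
print(tone_map(500.0, target_peak=400))  # → ~328.6 (everything above the knee is flattened)
```

With a 700-nit target the highlight survives intact; with a 400-nit target it is squeezed into far less headroom, which is where detail gets lost.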
Old 1st November 2018, 14:10   #53524  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,587
NGU Sharp is well known to underperform on bad sources,
and that's where RCA (reduce compression artifacts) comes into play, which helps NGU Sharp tremendously.
Old 1st November 2018, 16:45   #53525  |  Link
SirSwede
Registered User
 
Join Date: Nov 2017
Posts: 67
Quote:
Originally Posted by huhn View Post
NGU Sharp is well known to underperform on bad sources,
and that's where RCA (reduce compression artifacts) comes into play, which helps NGU Sharp tremendously.
What is best for "bad sources" then?

Would a GTX 1050 Ti be able to handle NGU and RCA?


Last edited by SirSwede; 1st November 2018 at 17:32.
Old 1st November 2018, 17:42   #53526  |  Link
mytbyte
Registered User
 
Join Date: Dec 2016
Posts: 198
Quote:
Originally Posted by kostik View Post
I have the same question but for LG OLED C8:
I'd go for your display's measured 2% window peak nits as the peak nits in madVR. It ought to be particularly impactful in dark scenes with small, bright single lights, or in space scenes; in overall brighter scenes the pupil contracts to compensate, so the impact of those peak highlights would get lost anyway, even if they were presented at the original peak brightness...

Last edited by mytbyte; 1st November 2018 at 17:45.
Old 1st November 2018, 18:07   #53527  |  Link
cork_OS
Registered User
 
cork_OS's Avatar
 
Join Date: Mar 2016
Location: Minsk (Blr)
Posts: 136
Quote:
Originally Posted by SirSwede View Post
Would a GTX 1050 Ti be able to handle NGU and RCA?
RCA is free for NGU Sharp. It's called NGU fusion.
__________________
I'm infected with poor sources.
Old 1st November 2018, 18:16   #53528  |  Link
SirSwede
Registered User
 
Join Date: Nov 2017
Posts: 67
Quote:
Originally Posted by cork_OS View Post
RCA is free for NGU Sharp. It's called NGU fusion.
NGU Sharp would be too sharp for me, as I am using Sharpen Complex 2 in MPC-HC.

Would the GTX 1050 Ti, 4GB GDDR5 be able to handle something like: NGU AA, RCA and SC2 for upscaling 720p to 4K?
Old 1st November 2018, 18:20   #53529  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,497
No, NGU AA + RCA is quite hard.
__________________
madVR options explained
Old 1st November 2018, 20:35   #53530  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 758
@madshi - to come back to an issue AMD users have: overblown colours for SDR material. I've made some further progress on this with help from another user with the same problem, SPENCERFORD.
I've now managed to reproduce this using MPC-BE, so it does look like it's either driver- or Direct3D-related. However, I've now found that this only happens with 8-bit 1080p material which is upscaled to 2160p and output in 10-bit using Direct3D. If I play an SDR movie which is native 2160p, it's fine, so it seems to be the upscaling part which is confusing something in the driver/output chain. All 2160p HDR material is also unaffected.
So to reproduce this you need to:
• Play a 1080p 8-bit SDR movie
• Have 10-bit or higher enabled on the graphics card and in madVR
• Set the movie player or madVR to upscale to 2160p
• Ensure Direct3D 11 is enabled in madVR or the other movie player
• HDR-capable TV/monitor/projector * unsure about this one

I have logged a ticket with AMD with this new information, but I logged one before and nothing came of it. Do you have a contact at AMD you could mention this to, maybe?
Now that I have this locked down, can some other AMD users see if they can reproduce this? For some reason it's a lot more noticeable in The Phantom Menace, about 32 minutes in; see screenshot.

https://1drv.ms/u/s!AgvFafeelEBigP8JoUs9wKjRu8Ajjw
__________________
OLED EF950-YAMAHA RX-V685-Win101809-4K 444 RGB 60hz-AMD RX580 19.2.2
KODI DS - MAD/LAV 92.14/0.72.2 - FSE / 3D / DIRECT3D11 / 10bit
Old 2nd November 2018, 01:08   #53531  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 216
For your SDR problem, have you tried the BT.2020, DCI-P3, and BT.709 options to see if that oversaturated red gets controlled? I use a profile for BT.2020 and another for BT.709 for calibration. Also, have you tried 8-bit or auto instead of 10-bit for 8-bit SDR? I use an 8-bit and a 10-bit profile keyed on source properties.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W10v1809 X5690 9604GB RGB 4:4:4 8bit Desktop @60Hz 8,10,12bit @Matched Refresh Rates
KODI MPC-HC/BE PDVD DVDFab
65JS8500 UHD HDR 3D
Old 2nd November 2018, 01:27   #53532  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,587
That's the core problem: you are not supposed to need that.
Old 2nd November 2018, 02:26   #53533  |  Link
HillieSan
Registered User
 
Join Date: Sep 2016
Posts: 119
Quote:
Originally Posted by brazen1 View Post
For your SDR problem, have you tried BT.2020, DCI-P3, and BT.709 options to see if that oversaturated red gets controlled? I use a profile for BT.2020 and another for BT.709 for calibration. Also 8bit or auto instead of 10bit for SDR 8bit? I use an 8bit and a 10bit profile for properties.
You are right. I am using DCI-P3 and have no problems with my AMD RX card.
Old 2nd November 2018, 08:08   #53534  |  Link
NoTechi
Registered User
 
Join Date: Mar 2018
Location: Germany
Posts: 53
What gamma are you using for HDR content? So far I have tested a 3D LUT for DCI-P3 with power gamma 2.2-2.4, with nice results. Now I tried a 3D LUT with madVR set to output in HDR format with ST 2084, the JVC also on ST 2084, and Guardians of the Galaxy looked fantastic. But when I looked at Iron Man 1, it was not watchable with it. That might be caused by just recalculating the 3D LUT instead of creating a new one with the JVC on ST 2084 and the HDR profile.

I was thinking I should go for:
1. A 3D LUT for BT.709 content with gamma 2.2 or a bit higher, using the JVC 2020 profile, since the calibration was made in DCI-P3
2. A 3D LUT for DCI-P3/BT.2020 with ST 2084 gamma, using the JVC HDR profile (which it would switch to automatically on my 7900)

Does this make sense?
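For reference, the SMPTE ST 2084 (PQ) transfer function these HDR profiles target decodes a non-linear signal value to absolute luminance; a minimal sketch with the constants from the spec:

```python
# SMPTE ST 2084 (PQ) EOTF constants, as defined in the standard
M1 = 2610 / 16384       # ≈ 0.1593
M2 = 2523 / 4096 * 128  # ≈ 78.8438
C1 = 3424 / 4096        # = 0.8359375
C2 = 2413 / 4096 * 32   # ≈ 18.8516
C3 = 2392 / 4096 * 32   # = 18.6875

def pq_eotf(e):
    """Decode a PQ-encoded value e in [0, 1] to luminance in cd/m2 (nits)."""
    p = e ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(0.0))  # → 0.0 (signal floor)
print(pq_eotf(1.0))  # → 10000.0 (PQ tops out at 10,000 nits)
```

Unlike a power gamma, PQ is an absolute curve: the same code value always means the same nit level, which is why the display and the LUT have to agree on it.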

NoTechi

Last edited by NoTechi; 2nd November 2018 at 08:12.
Old 2nd November 2018, 09:49   #53535  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 758
Quote:
Originally Posted by brazen1 View Post
For your SDR problem, have you tried BT.2020, DCI-P3, and BT.709 options to see if that oversaturated red gets controlled? I use a profile for BT.2020 and another for BT.709 for calibration. Also 8bit or auto instead of 10bit for SDR 8bit? I use an 8bit and a 10bit profile for properties.

Hi mate, luckily yes, that does work; I've been using it as a workaround for a good while now. I use BT.2020, as BT.709 still gives quite a red image.

Like huhn says, though, this shouldn't be required, which is the core problem. Many thanks for your input though.

Old 2nd November 2018, 09:53   #53536  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 758
Quote:
Originally Posted by HillieSan View Post
You are right. I am using PCI-P3 and no problems with my AMD RX card.

Hi, I find that calibrating for BT.2020 resolves the issue completely for me, and I don't need separate profiles for any other material. No other material seems affected by this for me, which in itself is odd, suggesting whatever is happening happens towards the end of the processing chain, maybe.

It's quite a bizarre problem, but I guess it's nice to know I'm not the only one who has it and that it's not an issue with my own setup.
Old 2nd November 2018, 10:12   #53537  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 450
Quote:
Originally Posted by mclingo View Post
If I play an SDR movie which is native 2160p its fine so it seems to be the upscaling part which is confusing something in the driver / output chain. All 2160p HDR material is also unaffected.
What is the native colour space of SDR 2160p movies? BT.709 or DCI-P3/BT.2020?
Quote:
Originally Posted by huhn View Post
that's the core problem you are not supposed to need that.
Why not? Wouldn't UHDTV displays that conform to Rec.2020 expect the content to be in BT.2020 when the definition is 2160p SDR (Edit: https://en.wikipedia.org/wiki/Rec._2...em_colorimetry), just like HD displays expected BT.709 when being fed > 576p? Maybe the Radeon is enforcing the standard really hard? (Edit: i.e. indicating BT.2020 in the HDMI metadata when it's outputting 2160p even though madVR is sending BT.709?)
Just like 8-bit HDR or even 8-bit UHD, maybe BT.709 UHD is technically accepted by some displays but not following the standard and therefore not tested/considered by manufacturers?
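The scenario described above - BT.709 pixels reaching a display that is in its BT.2020 mode without being remapped - is exactly what produces oversaturation, because the remap step shrinks the RGB values. A sketch using the linear-light conversion matrix from ITU-R BT.2087:

```python
# ITU-R BT.2087: linear-light BT.709 RGB -> BT.2020 RGB
BT709_TO_BT2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def convert(rgb):
    """Remap a linear BT.709 RGB triple for display in a BT.2020 container."""
    return [sum(row[i] * rgb[i] for i in range(3)) for row in BT709_TO_BT2020]

# Pure BT.709 red, correctly remapped:
print(convert([1.0, 0.0, 0.0]))  # → [0.6274, 0.0691, 0.0164]
# If this remap is skipped, the display instead shows its own [1, 0, 0] -
# the much more saturated native BT.2020 red, i.e. overblown reds.
```

This would be consistent with the driver signalling BT.2020 over HDMI while the renderer is still sending unconverted BT.709 values.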
__________________
HTPC: W10 1809, E7400, 1050 Ti, DVB-C, Denon 2310, Panasonic GT60 | Desktop: W10 1809, 4690K, HD 7870, Dell U2713HM | MediaPortal 1/MPC-HC, LAV Filters, ReClock, madVR

Last edited by el Filou; 2nd November 2018 at 10:15.
Old 2nd November 2018, 10:20   #53538  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,713
Quote:
Originally Posted by el Filou View Post
What is the native colour space of SDR 2160p movies? BT.709 or DCI-P3/BT.2020?
Both exist and are valid. You'll probably find more BT.709 content still though. If you have a BT.2020/DCI-P3 capable screen with a decent color volume, I would probably calibrate for DCI-P3 (or even BT.2020) and let madVR change all content to that.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 2nd November 2018, 13:05   #53539  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 758
Quote:
Originally Posted by el Filou View Post
What is the native colour space of SDR 2160p movies? BT.709 or DCI-P3/BT.2020?Why not? Wouldn't UHDTV displays that conform to Rec.2020 expect the content to be in BT.2020 when the definition is 2160p SDR (Edit: https://en.wikipedia.org/wiki/Rec._2...em_colorimetry), just like HD displays expected BT.709 when being fed > 576p? Maybe the Radeon is enforcing the standard really hard? (Edit: i.e. indicating BT.2020 in the HDMI metadata when it's outputting 2160p even though madVR is sending BT.709?)
Just like 8-bit HDR or even 8-bit UHD, maybe BT.709 UHD is technically accepted by some displays but not following the standard and therefore not tested/considered by manufacturers?

Hi, if this were the case you'd surely see this with NVIDIA cards too. I have no issues with my 1050 card; I've even tried two different HDR-capable TVs.
Old 2nd November 2018, 13:56   #53540  |  Link
mytbyte
Registered User
 
Join Date: Dec 2016
Posts: 198
Quote:
Originally Posted by mclingo View Post
Hi,if this were the case you'd see this with NVIDA cards too though surely, I have no issues with my 1050 card, i've even tried two different HDR capable TVs
Hmmm... the other day, when I connected my very old computer with an ATI HD6570 to my plasma, it reported an x.v.Color signal, which is kind of like the old wide-gamut "standard", and I don't know how this happened; I never activated it, if I recall correctly. I haven't looked closer into this, and I can't tell if I get a wider gamut and more saturated colours, because I didn't have time to look into how to deactivate it. I could connect it to my UHD over the weekend, but perhaps this is a clue for you: perhaps BT.2020 is signalled to the TV without you knowing about it (like a sort of AMD-specific user-friendliness), and the TV switches to its wide-gamut mode (probably its native colourspace).

Last edited by mytbyte; 2nd November 2018 at 14:01.