25th July 2020, 19:15 | #1 | Link |
Registered User
Join Date: Jul 2020
Posts: 76
|
Transparent encodes of fake 4k blurays
My goal when encoding is for the result to be indistinguishable from the source under my viewing conditions, with a bit of quality buffer should those conditions change (closer seating, larger TV, etc.).
I haven't started encoding my 4K Blu-rays just yet, only 1080p, as I want to get a Ryzen 4000 processor first to speed things up a tad. But I wanted to know: should I downsample them to 1080p? Obviously true 4K I should leave as is, but for fake 4K (titles mastered at 2K and then upscaled to 4K), should I keep the full 4K, or will a 1080p encode look the same since that's what it was mastered at? I would likely save a whole lot more space doing this.
27th July 2020, 02:21 | #2 | Link |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,871
|
Above 1080p doesn't matter so much with SDR, but it absolutely can with HDR, where the extra resolution helps preserve sharp specular highlights.
There isn't always a hard line between fake and real 4K, as titles may combine 4K elements with 2K VFX shots. The amount of grain and motion blur has a big impact on potential detail as well.
27th July 2020, 07:25 | #3 | Link |
brainless
Join Date: Mar 2003
Location: Germany
Posts: 3,653
|
Also, 2K-upscaled UHD Blu-rays benefit from better chroma resolution (near 4:4:4 relative to the 2K content).
When downscaling a UHD Blu-ray with 2K content to 1080p, you reduce the chroma to a quarter of its original resolution.
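To put numbers on that, here's a quick plane-size sanity check in Python (plain arithmetic, nothing source-specific; 4:2:0 stores each chroma plane at half the luma resolution in both directions):
Code:
# Chroma plane dimensions for 4:2:0 video
uhd_chroma = (3840 // 2, 2160 // 2)    # 1920x1080 per chroma plane on a UHD disc
hd_chroma  = (1920 // 2, 1080 // 2)    # 960x540 per chroma plane after a 1080p downscale

uhd_samples = uhd_chroma[0] * uhd_chroma[1]    # 2,073,600
hd_samples  = hd_chroma[0] * hd_chroma[1]      #   518,400
print(hd_samples / uhd_samples)                # 0.25 -> a quarter of the chroma samples
So for a 2K-mastered title, the UHD disc's chroma planes are already at the content's native 1920x1080, which is why they are effectively 4:4:4 relative to the 2K picture.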
__________________
Don't forget the 'c'! Don't PM me for technical support, please. |
1st August 2020, 10:11 | #5 | Link |
Angel of Night
Join Date: Nov 2004
Location: Tangled in the silks
Posts: 9,560
|
Before you make a choice that will hugely impact both encoding time and storage space, just watch a few titles downsampled to 1080p and then upscaled back again to see if you can tell the difference. Some people can, many can't, and there's not much point if you can't.
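If you'd rather do that comparison in a script than with test encodes, a minimal VapourSynth sketch of the round trip could look like this (the source filter, file name and the Spline36 choice are just placeholders, not a recommendation):
Code:
import vapoursynth as vs
core = vs.core

# Placeholder source filter and file; any indexer works
src = core.lsmas.LWLibavSource("uhd_title.mkv")        # 3840x2160

# Round trip: down to 1080p and back up to 2160p with the same kernel
down = core.resize.Spline36(src, 1920, 1080)
back = core.resize.Spline36(down, 3840, 2160)

# Interleave original and round-tripped frames so you can flip between them
cmp = core.std.Interleave([src, back])
cmp.set_output()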
1st August 2020, 14:09 | #6 | Link |
Pig on the wing
Join Date: Mar 2002
Location: Finland
Posts: 5,799
|
This. I use the clever tool Zopti to determine optimal (well, at least better than using the same values for every source) b and c parameters for BicubicResize when downsampling. There's a huge difference compared to just using Spline36 or Lanczos. For native 4K I go for 1440p, and upscaled 2K goes down to 1080p.
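For reference, the VapourSynth equivalent of a tuned bicubic downscale looks like this; the b/c values below are pure placeholders, the whole point being that a tool like Zopti searches for the pair that scores best against each particular source:
Code:
import vapoursynth as vs
core = vs.core

src = core.lsmas.LWLibavSource("fake4k_title.mkv")     # placeholder source filter/file

# Bicubic b and c map to filter_param_a / filter_param_b in the resize namespace.
# These values are only placeholders for whatever the optimizer settles on per source.
b, c = 0.0, 0.5

down = core.resize.Bicubic(src, 1920, 1080, filter_param_a=b, filter_param_b=c)
down.set_output()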
__________________
And if the band you're in starts playing different tunes I'll see you on the dark side of the Moon... |
11th August 2020, 01:13 | #8 | Link |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,871
|
And note that even "uprezzed" 2K to 4K titles don't use some fire-and-forget algorithim like bicubic. There is a lot of shot- and scene-based tweaking of parameters to get best results. For major titles from major studios, it's more like a lightweight remastering. And if it was a DCI-P3 master getting converted to HDR, it really IS a remaster. What you'd get scaling back down to 1080p isn't going to be the same as the original 2K master (which was 2048x, not 1920x; ~14% more source pixels).
Note a cinema "4K" projector is 4096x, not the home video/broadcast 3840x. |
14th August 2020, 23:37 | #9 | Link |
Broadcast Encoder
Join Date: Nov 2013
Location: Royal Borough of Kensington & Chelsea, UK
Posts: 3,032
|
Not necessarily. If you don't care about playback compatibility, the idea would be to reverse the upscale on the luma by inverting the kernel (i.e. Debilinear, Debicubic, etc.) back to the original resolution while leaving the chroma as it is. That way you end up with a Full HD 4:4:4 10-bit file from an upscaled UHD 4:2:0 10-bit source, which is a win-win.
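A minimal VapourSynth sketch of that idea, assuming the third-party descale plugin and a title that really was a bicubic upscale from 1920x1080 (the kernel, the b/c values, the source filter and the file name are all assumptions that have to be verified per title):
Code:
import vapoursynth as vs
core = vs.core

src = core.lsmas.LWLibavSource("uhd_420_10bit.mkv")    # placeholder; 3840x2160 YUV420P10

# Luma: invert the presumed upscale back to 1920x1080. The kernel and b/c must match
# whatever was actually used to upscale the master, otherwise this just adds ringing.
y = core.std.ShufflePlanes(src, planes=0, colorfamily=vs.GRAY)
y = core.resize.Point(y, format=vs.GRAYS)              # work in 32-bit float for the descale
y = core.descale.Debicubic(y, 1920, 1080, b=0, c=0.5)
y = core.resize.Point(y, format=vs.GRAY10)

# Chroma: the 4:2:0 chroma planes of a UHD source are already 1920x1080, so keep them.
u = core.std.ShufflePlanes(src, planes=1, colorfamily=vs.GRAY)
v = core.std.ShufflePlanes(src, planes=2, colorfamily=vs.GRAY)

# Recombine three 1920x1080 10-bit planes into 1080p YUV 4:4:4 10-bit
out = core.std.ShufflePlanes([y, u, v], planes=[0, 0, 0], colorfamily=vs.YUV)
out.set_output()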
20th August 2020, 02:22 | #10 | Link | |
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,871
|
Screen recordings, sure. Turn off ClearType before doing screen recordings, people! It messes up the chroma for people with different types of panels. |
20th August 2020, 07:50 | #11 | Link | |
brainless
Join Date: Mar 2003
Location: Germany
Posts: 3,653
|
It bothers me. That's why I try to use madVR's chroma reconstruction/upsampling most of the time I use the PC for movie playback. Anyway, YUV 4:4:4 files won't be very compatible with smart TVs or Blu-ray players, so we are stuck with 4:2:0 chroma.
__________________
Don't forget the 'c'! Don't PM me for technical support, please. |
24th August 2020, 18:40 | #12 | Link | |||
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,871
|
Put another way, 1440p 4:2:0 is fewer samples-per-second than 1080p 4:4:4, and will look better for 4K source content. |
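The per-frame sample counts back that up (plain arithmetic, counting luma plus two chroma planes):
Code:
# Total samples per frame = luma + two chroma planes
def samples_per_frame(w, h, chroma_fraction):     # chroma_fraction: 1.0 for 4:4:4, 0.25 for 4:2:0
    return int(w * h * (1 + 2 * chroma_fraction))

print(samples_per_frame(2560, 1440, 0.25))   # 5,529,600  (1440p 4:2:0)
print(samples_per_frame(1920, 1080, 1.0))    # 6,220,800  (1080p 4:4:4)
At the same frame rate, 1440p 4:2:0 therefore carries fewer samples per second than 1080p 4:4:4.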
24th August 2020, 18:55 | #13 | Link | |
Cary Knoop
Join Date: Feb 2017
Location: Newark CA, USA
Posts: 398
|
It would be much more efficient for a codec to decide what perceptual compression to apply instead of being fed an already chroma-subsampled source.

There are five things I believe need to change in video processing:
1. Move to a float32 code value standard
2. Abandon the difference between full and limited range (no clipping, while float 0.0 to 1.0 is always the visible range)
3. No more chroma subsampling (but allow it for legacy sources)
4. No more interlaced sources (but allow interlaced for legacy sources)
5. Allow multiple frame rates in the same source (each frame gets an absolute time/duration marker)

This will make life a lot simpler!

Last edited by Cary Knoop; 24th August 2020 at 19:07.
24th August 2020, 22:49 | #14 | Link | ||
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,871
|
Doing 1080p30 in your proposed format would have the same memory requirements as 4Kp60 does today (8-bit 4:2:0). Even 10-bit HDR would still be about 6x more efficient using 4:2:0.

4. is pretty much accomplished at this point. There is no interlacing happening at resolutions beyond 1080i or with HDR, and H.264 was the last codec really engineered for interlaced efficiency (HEVC doesn't have MBAFF, for example). Interlaced is already a shrinking legacy technology, thank goodness.

5. We've had media formats that just give each frame a duration for ages and ages; supporting a fixed frame rate was itself an innovation that didn't get broadly implemented until about 15 years ago. The real challenge with variable frame rate video is that technologies like HDMI don't handle it gracefully. Over a 60p connection, switching between 24 and 60 has intrinsic judder. Having everything 120 Hz would be a lot easier for NTSC countries; PAL countries would still be mixing 24/25/30/50/60. The least common multiple of all the standard frame rates is 600, and that's not accounting for the NTSC 29.97/30 timing headache.
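The rough numbers behind that, assuming uncompressed frames, float32 stored as 4 bytes per sample, and 10-bit samples tightly packed (real pipelines often pad 10-bit to 16 bits, which changes the ratio):
Code:
import math

# Raw bandwidth: width * height * samples per pixel * bytes per sample * fps
def rate(w, h, samples_per_pixel, bytes_per_sample, fps):
    return w * h * samples_per_pixel * bytes_per_sample * fps

print(rate(1920, 1080, 3, 4, 30))      # 1080p30 float32 4:4:4 -> 746,496,000 bytes/s
print(rate(3840, 2160, 1.5, 1, 60))    # 2160p60 8-bit 4:2:0   -> 746,496,000 bytes/s (identical)

# Per pixel: float32 4:4:4 vs tightly packed 10-bit 4:2:0
print((3 * 4) / (1.5 * 10 / 8))        # 6.4 -> roughly the "6x" figure

# Least common multiple of the standard integer frame rates
print(math.lcm(24, 25, 30, 50, 60))    # 600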
25th August 2020, 00:47 | #15 | Link | |
Cary Knoop
Join Date: Feb 2017
Location: Newark CA, USA
Posts: 398
|
I believe that when someone makes a documentary with mixed frame rate footage, it should be possible not to worry about conforming everything to a single frame rate, but to allow multiple segments. There is absolutely no technical reason why modern monitors cannot switch frame rates on the fly (or get a change signal a few frames ahead). HDMI is an awful, protectionist, and very limited technology.
25th August 2020, 02:13 | #16 | Link |
Registered User
Join Date: Oct 2012
Posts: 8,116
|
Adaptive sync for gaming is not a special feature anymore; HDMI can do it, but it's rarely used with HDMI displays. Usually only a range of 48-60 Hz is supported, which means a practical range of about 50-58, in short not useful at all yet.

HDMI 1.3 was always able to do 1080p 60 Hz 12-bit, TVs were able to accept that type of signal, and it was used on PCs with consumer-grade hardware. HDMI 2.0 is the first widely used version that couldn't do the target refresh rate/resolution at 12 bit; for 23p, 12 bit is a default feature. And if I take into consideration that every TV I have tested in my life except one performs much worse in terms of banding when 10 bit or more is sent to the display instead of 8 bit, I don't even know what the point of it is. Guess what, my new X900H gets "destroyed" when 10 bit is sent into it instead of 8 bit, and I wasn't expecting anything else.

Hasn't it been shown that downscaling is a terrible compression algorithm, and that 4:4:4 with a higher chroma QP offset setting is generally better for quality, even for luma, than 4:2:0, because more bandwidth is spent on luma? The chroma scalers in madVR weren't created for fun, and they were discussed with real-world examples. If I really have to I can hunt some down... or we just take an esports computer game; gaming is already bigger than Hollywood, so there's lots and lots of content where 4:4:4 matters a lot. Pretty much every new non-low-end LG TV can do, or is planned to do, 120 Hz 4K 4:4:4. Just for fun: NVIDIA can do 4:4:4 hardware "encoding"; it's limited, but it can do it.

Modern computers are very good at 16-256 bit, so I don't see a huge issue here, and aren't people already doing this with AviSynth and VapourSynth? We are talking about processing at 16-32 bit here, right, not encoding?
26th August 2020, 18:34 | #17 | Link | |||
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,871
|
It's a fine idea, and I could see something like the PS5 or XBox Series X being able to support it eventually. It might "just work" under a few specific high-end Windows gaming setups. But that'd be <0.1% of the installed base at best, today. Sounds like a potential SMPTE spec.
26th August 2020, 20:00 | #18 | Link |
Registered User
Join Date: Oct 2012
Posts: 8,116
|
Does 4:4:4 decoding really cost more silicon, when a hardware decoder can do 8K 4:2:0 yet 4:4:4 is limited to 4K in this case?
BTW, FreeSync and G-Sync screens work out of the box with pretty old hardware, and even entry-level screens often have support for it (mostly useless, like TVs). Players seem to support it too, so a high-end gaming PC is clearly not needed. https://github.com/mpv-player/mpv/issues/6137 https://forums.blurbusters.com/viewtopic.php?t=3509
27th August 2020, 18:03 | #19 | Link | ||
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,871
|
But the broader ecosystem challenge is that there are plenty of devices that can't play the content back, and not much content exists that could truly take advantage of a variable frame rate. Most titles that used sources with different frame rates had those assets conformed to the project's fps before editing. Getting end-to-end VFR working would require big overhauls to editing and content creation software upstream and then a new generation of content using those technologies. Still, it'd make it possible to have the 48p versions of The Hobbit movies work on home video, which hasn't been possible to date as 48 fps support is far from universal, and isn't a broadcast or HDMI standard. |
27th August 2020, 20:33 | #20 | Link |
Registered User
Join Date: Oct 2012
Posts: 8,116
|
For 1080p it's pretty easy to show a "major" difference between chroma scalers.
For UHD there is no 4:4:4 source I know of, so I'd have to take game footage to show the difference, which again is not that hard. Games are simply different from movies.
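If anyone wants to reproduce that kind of comparison, a minimal VapourSynth sketch is to upsample the same 4:2:0 clip to 4:4:4 with two different kernels and interleave the results (the file name and the kernel choices are just examples):
Code:
import vapoursynth as vs
core = vs.core

src = core.lsmas.LWLibavSource("test_1080p_420.mkv")   # placeholder 4:2:0 source

# Upsample chroma to 4:4:4 with two different kernels; luma is untouched
bilinear = core.resize.Bilinear(src, format=vs.YUV444P10)
spline   = core.resize.Spline36(src, format=vs.YUV444P10)

# Interleave so frame N and N+1 show the same picture with different chroma upsampling
cmp = core.std.Interleave([bilinear, spline])
cmp.set_output()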