1st August 2019, 02:00 | #21
Registered User
Join Date: Nov 2016
Posts: 11
I think I will hold off on hme and hme-search until I see them supported in the standard Handbrake download. I would be willing to use the hme-search 1,2,3 setting once that happens, though. I'd have to see what I think of hme-search 1,3,3 to decide whether I would incorporate that one into my presets.

Taking your word on rc-lookahead=60. I think Ma's post referenced in my OP is probably the only place I've seen it set to 120. The tu-inter and tu-intra settings seem like no-brainers; thanks for not letting me miss out on those.

Are there any settings here that would be significantly affected by the output resolution? That is, are there settings that become more important as resolution gets higher, or any that are definitely no benefit at 480p or 360p output?

I'll have to search these forums for info on RipBot264. I could make use of the workload distribution it's capable of, but the one time I tried to test that it crashed, and I found the UI frustrating compared to what I'm used to (Handbrake). I really appreciate all the helpful info.

Last edited by Ischemia24; 1st August 2019 at 12:07.
1st August 2019, 05:52 | #22
Pig on the wing
Join Date: Mar 2002
Location: Finland
Posts: 5,718
A higher qcomp basically means that the encoder will use more bits in the more difficult parts of the frame. This should prevent cases where things get too blurry in high motion etc.
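For reference, qcomp is set like any other rate-control option on the x265 command line. This fragment is purely illustrative — the file names and the 0.7 value are mine, not a recommendation from this thread:

```shell
# Illustrative only: raise qcomp above its 0.6 default so rate control
# flattens QP less across easy/hard sections, leaving relatively more
# bits for complex, high-motion parts of the stream.
x265 --input source.y4m --crf 22 --qcomp 0.7 --output out.hevc
```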
__________________
And if the band you're in starts playing different tunes I'll see you on the dark side of the Moon...
1st August 2019, 08:57 | #23
Lost my old account :(
Join Date: Jul 2017
Posts: 322
And tbh, I think you are overcomplicating this a bit; the presets are pretty decent at picking appropriate settings based on encoding-time trade-offs. All these settings you are picking up and adding might not be great for your use case or your content, and would need tweaking to be appropriate. It's imo better to keep it a bit simpler, at least as a baseline: use the slowest preset you think is realistic for a real-life scenario, then look into settings that affect the specific characteristics you are not pleased with.

This is what I personally would use for 720p24 and 1080p24 rec709 SDR material if I were targeting a bitrate on the lower side (I would guesstimate a bitrate around 2-3 Mbps). I might play around with the different AQ modes, and if I were on a 3900X like you I might bump 'slow' up to 'slower'. I would recommend you use something similar as your baseline to evaluate against.

--preset slow --profile main10 --ctu 32 --merange 26 --crf 22 --keyint 240 --min-keyint 24 --rc-lookahead 48 --bframes 8 --no-sao --deblock -1:-1 --colorprim bt709 --transfer bt709 --colormatrix bt709 --range limited

Last edited by excellentswordfight; 1st August 2019 at 18:27.
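In case it helps, here is one way that baseline could be fed to the standalone x265 encoder. The ffmpeg pipe and the file names are my own assumptions, not part of the suggestion above:

```shell
#!/bin/sh
# Sketch: decode the source with ffmpeg and pipe y4m into x265 using the
# baseline options suggested above. Input/output names are placeholders.
BASELINE="--preset slow --profile main10 --ctu 32 --merange 26 --crf 22 \
--keyint 240 --min-keyint 24 --rc-lookahead 48 --bframes 8 --no-sao \
--deblock -1:-1 --colorprim bt709 --transfer bt709 --colormatrix bt709 \
--range limited"

# -strict -1 lets ffmpeg emit 10-bit y4m; x265 reads y4m from stdin.
ffmpeg -i source.mkv -f yuv4mpegpipe -strict -1 - \
  | x265 --y4m $BASELINE --input - --output out.hevc
```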
1st August 2019, 22:24 | #24
Registered User
Join Date: Jun 2016
Posts: 116
I consider Placebo to be the max setting I should use. So for subme, I wouldn't go above 5, because placebo doesn't even use more than that... You'll waste a lot of time for about a 0.1% difference. As others have mentioned, you'll find your encodes speed up if you drop some settings like these: ctu 32, me-range 26, qg-size 16. However, there is a small chance you could end up with larger files. It's a speed:size trade-off.

I'd look into analysis save and load. You can create a file that saves a lot of the decisions the encoder made during your first attempt at encoding a file. Then, if you don't like the outcome and want to make minor changes like bitrate/CRF, you can read that file back and the encode will take hours to complete instead of days. There is a lot to it, so you should read up on it first.

https://x265.readthedocs.io/en/defau...ision-analysis
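Since the exact flags have changed between x265 versions, treat this as a sketch of the analysis save/load workflow rather than a definitive invocation (check the docs link above against your build; older builds spelled it --analysis-mode/--analysis-file, and file names here are placeholders):

```shell
# First pass: encode and save the encoder's decisions alongside the output.
x265 --input src.y4m --preset slow --crf 22 \
     --analysis-save analysis.dat --output first.hevc

# Second pass: change only the rate target (e.g. CRF 20) and reuse the
# saved decisions, skipping most of the expensive analysis work.
x265 --input src.y4m --preset slow --crf 20 \
     --analysis-load analysis.dat --output second.hevc
```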
2nd August 2019, 00:25 | #25
Moderator
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
The one thing missing from placebo that I've seen be helpful in special cases is --tskip. Not much for natural images, but it can help with very sharp synthetic edges like text, graphics, or CGI-generated cel-style animation.
2nd August 2019, 16:45 | #26
Registered User
Join Date: Nov 2016
Posts: 11
My aim for 1080p encodes is pretty much to reach a bitrate as low as ~1200 to 1500 kbps while still looking impressively good to my eyes; 700 to 760 kbps (maybe try pushing down to 650 kbps) for 720p. 340 kbps is often a struggle for some 480p content, and I need to do more testing to determine what range to target for 360p (guessing I can't go much lower than 300 kbps). Basically I want to end up with content that is both impressively small and impressively good looking, but I will sometimes settle for "eh, it looks pretty good."

I don't disagree with you that I am overcomplicating it, but I do want to skew things toward maximum compression, beyond what most consider a sensible compression/speed balance. Thank you for the suggested baseline. Is there significant efficiency to be gained by specifying the colorspace and limiting the range?
I'm convinced; I will lower subme to 5.

I haven't been specifying qg-size (it looks like it defaulted to 32 in a recent test), so I will go with 16 as a sane value. What is your sense of the time/benefit trade-off if I were to go with 8?

Ctu seems content-dependent: it seems like it could be lowered without efficiency loss for something like a complex computer-generated scene where every part of the frame is busy and detailed in the vast majority of frames, but it would be beneficial to keep it at 64 when large numbers of frames have a focused subject and minimal to no detail (like a flat color with no gradient at all) in a large part of the frame. Do I have that right?

I'm thinking for my goals I should probably lower me-range to 26 for 480p and 360p. I'm undecided on 720p, and I feel like I should leave it at the default 57 for 1080p. I can see how it could have a significant effect on speed when I'm using me=sea.
This is great; I've learned more about H.265 settings since starting this thread than I have in the past three years. It looks like I need to compare a CRF 22 encode using the suggested baseline above against my best attempt at maximum-compression settings:

rc-lookahead=60:bframes=16:bframe-bias=5:ref=6:min-keyint=24:me=sea:subme=5:tu-inter=4:tu-intra=4:qg-size=8:deblock=-1:-1:no-sao:colorprim=1:transfer=1:colormatrix=1:range=limited

And I need to determine how I feel about lowering ctu and merange at 720p. I think I'll use the Tears of Steel clip from the challenge thread.

Last edited by Ischemia24; 2nd August 2019 at 17:00. Reason: Handbrake -> H.265
2nd August 2019, 17:59 | #27
Pig on the wing
Join Date: Mar 2002
Location: Finland
Posts: 5,718
In presets faster than 'Slower', the tu-intra and tu-inter depths are both 1, so they don't cause excessive calculations. In 'Slower', the depth is changed to 3, and limit-tu 4 is set to restore some of the lost performance.
I think that --rect and --amp could be useful for your use case, since they should help with edges. Just set --limit-modes and --limit-refs 3 to fight the resulting slowdown. I also recommend trying the new --hme --hme-search x,y,z instead of --me sea.

You can lower merange to 26 if you use CTU 32 and a search method other than hex. Based on the docs, the default value of 57 is calculated from CTU 64: "The default is derived from the default CTU size (64) minus the luma interpolation half-length (4) minus maximum subpel distance (2) minus one extra pixel just in case the hex search method is used."
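That derivation from the docs is simple enough to sketch as arithmetic (the helper function name is mine):

```shell
# Default merange = CTU size - luma interpolation half-length (4)
#                 - max subpel distance (2) - 1 extra pixel for hex search
merange_for_ctu() { echo $(( $1 - 4 - 2 - 1 )); }

merange_for_ctu 64   # prints 57, the documented default for CTU 64
merange_for_ctu 32   # prints 25, in line with using ~26 at CTU 32
```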
__________________
And if the band you're in starts playing different tunes I'll see you on the dark side of the Moon...
2nd August 2019, 18:09 | #28
Registered User
Join Date: Jun 2016
Posts: 116
What are your thoughts on --rskip? I tend to like it and leave it on... |
2nd August 2019, 18:36 | #29
Registered User
Join Date: Jun 2016
Posts: 116
When doing 4K encodes I set qg-size to 64. qg-size can help with compression, so I wouldn't go too low with it. You'll find a lot of people recommend ctu 32 and qg-size 16 for 1080p content. I usually leave both at 64: better compression, with a very minor loss in quality.
My personal usage would be this:

4K - ctu and qg-size = 64
1080p - ctu and qg-size = 64
720p - ctu and qg-size = 32
<720p - ctu = 32, qg-size = 16

To each their own. There are always trade-offs...

CTU size has to do with the number of pixels the encoder will look at. The encoder can break each block down into smaller pieces based on your settings; --tu-intra and --tu-inter determine how far it can go. The larger the value, the further down the tree it can go and the longer the encode can take.

Read this link about how HEVC and encoders work in general. It is a great read!
https://forum.doom9.org/showthread.php?t=167081
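That per-resolution list could be sketched as a small helper that emits the corresponding flags for a given output height; the mapping is just a restatement of the poster's personal preference above, not an x265 rule:

```shell
# Emit ctu/qg-size flags per output height, following the preference
# table above (4K/1080p -> 64/64, 720p -> 32/32, below 720p -> 32/16).
ctu_flags() {
  case $1 in
    2160|1080) echo "--ctu 64 --qg-size 64" ;;
    720)       echo "--ctu 32 --qg-size 32" ;;
    *)         echo "--ctu 32 --qg-size 16" ;;  # below 720p
  esac
}

ctu_flags 1080   # prints: --ctu 64 --qg-size 64
ctu_flags 480    # prints: --ctu 32 --qg-size 16
```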
2nd August 2019, 20:01 | #30
Registered User
Join Date: Nov 2016
Posts: 11
2nd August 2019, 21:44 | #31
Registered User
Join Date: Jun 2016
Posts: 116
Generally speaking, yes, larger CTUs will be used for areas with little to no motion and static/flat images.
No, a smaller qg-size doesn't automatically mean higher quality. This is taken from the docs:

--qg-size <64|32|16|8>
Enable adaptive quantization for sub-CTUs. This parameter specifies the minimum CU size at which QP can be adjusted, i.e. the Quantization Group size. Allowed range of values are 64, 32, 16, 8 provided this falls within the inclusive range [maxCUSize, minCUSize]. Default: same as maxCUSize
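The constraint in that docs excerpt can be expressed as a quick validity check (the helper name and its argument order are mine):

```shell
# qg-size must be one of 64/32/16/8 and lie within [minCUSize, maxCUSize].
qg_valid() {  # usage: qg_valid <qg-size> <maxCUSize/ctu> <minCUSize>
  case $1 in 64|32|16|8) ;; *) return 1 ;; esac
  [ "$1" -le "$2" ] && [ "$1" -ge "$3" ]
}

qg_valid 16 32 8 && echo valid         # 16 works with ctu 32
qg_valid 64 32 8 || echo out-of-range  # 64 exceeds a ctu of 32
```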
2nd August 2019, 22:42 | #32
Registered User
Join Date: Nov 2016
Posts: 11
2nd August 2019, 23:17 | #33
Lost my old account :(
Join Date: Jul 2017
Posts: 322
Having that baseline might be a good thing either way, because then you can decide whether the lower bitrate you're aiming for is worth the trade-off in image quality.

No, the flags are not there for compression reasons; it's imo best practice to specify them, especially for HEVC, since the codec is used for such a big range of different color formats.

Last edited by excellentswordfight; 2nd August 2019 at 23:25.
9th August 2019, 01:19 | #34
Registered User
Join Date: Nov 2016
Posts: 11
Currently testing with a couple of high-profile movies. I will do a bunch more testing with Tears of Steel as well and submit my CQ- and bitrate-focused settings for feedback.

Last edited by Ischemia24; 9th August 2019 at 01:21. Reason: incorrect user reference
10th August 2019, 04:38 | #36
Registered User
Join Date: May 2009
Posts: 328
You do realize that with the amount of time and energy you are wasting on this, it would still be far cheaper and easier to build a bigger system than your ITX box. You're eventually going to run out of space; then what will you do? Upgrade your server now and save yourself the hassle. It's pretty obvious you still have no idea what you want and what you will find acceptable, and you're going to have to encode movies MULTIPLE times before you reach a visual level you deem acceptable, as one command line will not work across the board for your desires.
Sorry, but that's what I'm getting from all this. It's futile.