Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 14th October 2014, 01:32   #41  |  Link
LoRd_MuldeR
Software Developer
 
 
Join Date: Jun 2005
Location: Last House on Slunk Street
Posts: 13,196
Quote:
Originally Posted by foxyshadis View Post
If you only use the preset option and not --ref directly, that doesn't apply, but only in the command-line:
Code:
    /* Automatically reduce reference frame count to match the user's target level
     * if the user didn't explicitly set a reference frame count. */
If you use libx264 directly, you're trusted to know what you're doing and not set it incorrectly unless you want to.
Yes, if you don't use "--ref" at all, then x264 will automatically set the number of reference frames according to Level specified via "--level".

But the point is that x264 does not enforce the restrictions of the Level set via the "--level" switch, in contrast to the "--profile" switch. That's because other things, such as the video's resolution and frame rate, also affect Level compliance. Thus, setting "--level X" may produce a stream that complies with Level X, but it may just as well produce a stream that does not comply with Level X (yet would still be tagged, wrongly, as compliant with Level X).

So again: Why would you explicitly add the "--level" switch? If, by default (that is: without "--level"), your stream came out at a Level that is above the Level you had intended, then you cannot reliably make your stream conform to the intended Level just by adding the "--level" switch! Instead, you should figure out what exactly is preventing your stream from conforming to the intended Level and fix that. And if, by default (that is: without "--level"), your stream came out at a Level that is even below the Level you had intended, you can just be happy and don't have to change anything. The only reason to set "--level" explicitly that comes to my mind is when x264 actually got the Level wrong.
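To illustrate the point, here is a sketch of the two cases on the command line (input/output names and the CRF value are made up for this example; 62500 kbps is the commonly cited High-profile maximum bitrate for Level 4.1):

```shell
# "--level 4.1" alone only *tags* the stream; bitrate peaks in difficult
# scenes can still exceed Level 4.1's limits.
x264 --preset slow --crf 18 --level 4.1 -o tagged.264 input.y4m

# Adding VBV constraints actually caps the peak bitrate, so the stream
# can genuinely conform to Level 4.1.
x264 --preset slow --crf 18 --level 4.1 \
     --vbv-maxrate 62500 --vbv-bufsize 62500 -o conformant.264 input.y4m
```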
__________________
There was of course no way of knowing whether you were being watched at any given moment.
How often, or on what system, the Thought Police plugged in on any individual wire was guesswork.



Last edited by LoRd_MuldeR; 14th October 2014 at 01:43.
Old 14th October 2014, 09:16   #42  |  Link
fvisagie
Registered User
 
Join Date: Aug 2008
Location: Isle of Man
Posts: 588
Quote:
Originally Posted by LoRd_MuldeR View Post
Yes, if you don't use "--ref" at all, then x264 will automatically set the number of reference frames according to Level specified via "--level".
Quote:
The only reason to set "--level" explicitly that comes to my mind is when x264 actually got the Level wrong.
There is another, very important, reason. Due to the subtlety you allude to in the first quote, --level is needed to ensure --ref is set correctly. If the command line omits --level, --ref defaults to 16. I just retested that to confirm.

That means all encodes that need --ref values other than 16, and for which the appropriate --level is not specified, will be non-level-conformant.

Therefore, for an encode to conform to whatever level its parameters and bitrate translate to, one needs to:
  • either specify the corresponding --level (for which one obviously needs to correctly anticipate it)
  • or to let x264 default the level and to manually specify --ref (for which one needs to correctly anticipate the level in any case)
In other words, to ensure an encode is level-compliant, it seems there is no easier way than having to anticipate the level, and to specify it.
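For what it's worth, the --ref side of that anticipation can be sketched as a small calculation from the spec's decoded-picture-buffer limits (a sketch only, not x264's actual code; the MaxDpbMbs values are from H.264 Table A-1, and the helper name is made up):

```shell
# Maximum reference frames allowed for 1920x1080 at a given level:
# floor(MaxDpbMbs / frame_size_in_macroblocks), capped at 16.
max_ref_1080p() {
    case "$1" in
        4.0|4.1) dpb_mbs=32768 ;;
        4.2)     dpb_mbs=34816 ;;
        5.0)     dpb_mbs=110400 ;;
        5.1)     dpb_mbs=184320 ;;
        *) echo "unsupported level" >&2; return 1 ;;
    esac
    frame_mbs=8160                        # (1920/16) * ceil(1080/16) = 120 * 68
    refs=$((dpb_mbs / frame_mbs))
    if [ "$refs" -gt 16 ]; then refs=16; fi   # the H.264 spec caps refs at 16
    echo "$refs"
}

max_ref_1080p 4.1    # prints 4  (what x264 auto-picks for --level 4.1)
max_ref_1080p 5.1    # prints 16 (why omitting --level leaves --ref at 16)
```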

@r0lZ,
In addition, where an encode needs to be constrained to a maximum level as is the case with you, you'd also need to compare the specified maximum level to the anticipated encoding level as suggested in post #34.

For what it's worth, attached is an example encoding batch file that uses h264levl.bat posted above to anticipate the encoding level. Note that this batch file does not currently constrain the encoding level as you want to, although that would be trivial to add as mentioned.
Attached Files
File Type: zip FFx264.zip (4.3 KB, 6 views)
Old 14th October 2014, 09:33   #43  |  Link
r0lZ
PgcEdit daemon
 
 
Join Date: Jul 2003
Posts: 7,393
Quote:
Originally Posted by sneaker_ger View Post
Yes, the level is written at the start and will not be corrected if it turns out wrong.
Damn! The management of the level in x264 is rather complex! Not only is it impossible to limit the encoder to a maximum level (without forcing a specific level), but even when you let x264 decide what level to use, it can tag the file with a wrong level! A very big limitation, plus a big bug. It's a pain for users.

So, I have no other choice, for BD3D2MK3D, than to add 2 options:

1. Let x264 decide what level to use, and it may produce streams incompatible with your hardware if you use a slow preset. In addition, it may even tag the stream erroneously. IMO, that's the solution to use when the user encodes with any fast or medium preset, because the resulting level is 4.0 (or perhaps 4.1 or even 4.2) for 1080p, and those levels are compatible with most players. That can be easily explained in a help dialog.

2. Or force a specific level, such as 4.2, with the --level and --vbv-* arguments. In that case, it is possible that the specified level is actually too high for the selected preset, and the stream will have been encoded at a lower level but tagged erroneously with the specified level. Its compatibility with players will therefore be needlessly reduced. But at least, it is possible to encode with any slow preset without breaking compatibility with many hardware players.

There is obviously no good solution, but I will have to implement the two options above, because IMO the worst solution is to encode blindly at any level without giving the user the possibility to be sure that his file will be compatible with his player.

Thanks to everybody. I now know the limitations and problems of x264, and how to offer a solution to the casual user of BD3D2MK3D, even if that solution is far from perfect.

Of course, if there is still a problem in the methods explained above, please let me know.

[EDIT] Oh, yes, there is also the possibility to anticipate the level, as explained by fvisagie above. But since the bitrate depends mainly on the "complexity" of the scenes in the input file, and it is impossible to analyse that with a batch file, the result of any anticipation based on the analysis of the input file and parameters may not be totally accurate. Given that all files encoded by BD3D2MK3D are 1080p @ 23.976 fps, I suppose I can simply assume that the levels are 4.0 (or a bit above in case of bitrate peaks) for any preset <= medium, and 5.0 or 5.1 (or a bit more) for the slow presets. There is normally no need to constrain the level when the user encodes with a fast preset, so finally, in the case of BD3D2MK3D, it is possible to offer a relatively simple solution.
__________________
r0lZ
PgcEdit homepage (hosted by VideoHelp)
BD3D2MK3D A tool to convert 3D blu-rays to SBS, T&B or FS MKV

Last edited by r0lZ; 14th October 2014 at 09:45.
Old 14th October 2014, 09:35   #44  |  Link
detmek
Registered User
 
Join Date: Aug 2009
Posts: 475
My encoding experience is that I use 4 reference frames for every encode, no matter what preset I use. 4 reference frames is the maximum for level 4.1 at 1080p30. That way I don't have to worry that the stream will exceed level 4.1, but it will be flagged with a lower level if the resolution is lower.

Last edited by detmek; 14th October 2014 at 09:44.
Old 14th October 2014, 09:42   #45  |  Link
Kurtnoise
Swallowed in the Sea
 
 
Join Date: Oct 2002
Location: Aix-en-Provence, France
Posts: 5,195
Level is not a tag in the bitstream...Anyway, there are some tools to override this.

What do you mean by compatibility with many hardware players ? Why not put these switches --bluray-compat --vbv-maxrate 40000 --vbv-bufsize 30000 --level 4.1 in your command line then ? That should be fine for all players.
Old 14th October 2014, 09:49   #46  |  Link
fvisagie
Registered User
 
Join Date: Aug 2008
Location: Isle of Man
Posts: 588
Quote:
Originally Posted by r0lZ View Post
So, I have no other choice, for BD3D2MK3D, than to add 2 options:

1. Let x264 decide what level to use, and it may produce streams incompatible with your hardware if you use a slow preset. In addition, it may even tag the stream erroneously. IMO, that's the solution to use when the user encodes with any fast or medium preset, because the resulting level is 4.0 (or perhaps 4.1 or even 4.2) for 1080p, and those levels are compatible with most players. That can be easily explained in a help dialog.

2. Or force a specific level, such as 4.2, with the --level and --vbv-* arguments. In that case, it is possible that the specified level is actually too high for the selected preset, and the stream will have been encoded at a lower level but tagged erroneously with the specified level. Its compatibility with players will therefore be needlessly reduced. But at least, it is possible to encode with any slow preset without breaking compatibility with many hardware players.

There is obviously no good solution, but I will have to implement the two options above, because IMO the worst solution is to encode blindly at any level
The third (and in my view more suitable) option would be for your program to anticipate the resultant encoding level as suggested above, to compare that to the maximum specified, to proceed if within limits or, in case not, to terminate with a level warning (and optionally to suggest trying a lower bitrate, that being directly under user control). In all cases where your encode does proceed, you would need to specify the --level as anticipated to ensure --ref is set to the corresponding level limit.
Old 14th October 2014, 09:54   #47  |  Link
fvisagie
Registered User
 
Join Date: Aug 2008
Location: Isle of Man
Posts: 588
Quote:
Originally Posted by r0lZ View Post
But since the bitrate depends mainly on the "complexity" of the scenes in the input file, and it is impossible to analyse
That is not the case for constant bitrate encoding (i.e. either two-pass encoding or using the vbv parameters). If your program generates constant bitrate encodes, you can safely use the specified bitrate to anticipate and/or limit the resulting encoding level.
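As a sketch of how that anticipation could look for the BD3D2MK3D case (1080p, High profile; bitrate in kbps), the specified bitrate maps to a minimum level roughly like this. The limits are MaxBR from H.264 Table A-1 scaled by the High-profile factor of 1.25, and the helper name is hypothetical:

```shell
# Minimum H.264 level whose High-profile bitrate ceiling covers a given
# 1080p target bitrate (kbps). Frame-size/fps limits are ignored here
# because 1080p @ 23.976 already fits level 4.0.
min_level_1080p() {
    kbps=$1
    if   [ "$kbps" -le 25000 ];  then echo "4.0"
    elif [ "$kbps" -le 62500 ];  then echo "4.1"   # 4.2 shares the same MaxBR
    elif [ "$kbps" -le 168750 ]; then echo "5.0"
    else                              echo "5.1"
    fi
}

min_level_1080p 40000    # prints 4.1 -> a 40 Mbps encode needs at least 4.1
```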
Old 14th October 2014, 10:12   #48  |  Link
r0lZ
PgcEdit daemon
 
 
Join Date: Jul 2003
Posts: 7,393
Quote:
Originally Posted by Kurtnoise View Post
Level is not a tag in the bitstream...Anyway, there are some tools to override this.

What do you mean by compatibility with many hardware players ? Why not put these switches --bluray-compat --vbv-maxrate 40000 --vbv-bufsize 30000 --level 4.1 in your command line then ? That should be fine for all players.
Can you give a link to one of these tools?

I don't want compatibility with Blu-ray players, but with any player (hardware or software) that any user may have. As explained in a post earlier in this thread, a good example of this is my TV: it supports level 4.2 or below, but not 5.0. And I want to be able to use the slow presets, because they are supposed to compress better, but limit them to a level that is compatible with the user's target player. (I can't limit to 4.1 anyway, because many users of my program want 5.0 or more.)
__________________
r0lZ
PgcEdit homepage (hosted by VideoHelp)
BD3D2MK3D A tool to convert 3D blu-rays to SBS, T&B or FS MKV
Old 14th October 2014, 10:19   #49  |  Link
r0lZ
PgcEdit daemon
 
 
Join Date: Jul 2003
Posts: 7,393
Quote:
Originally Posted by fvisagie View Post
The third (and in my view more suitable) option would be for your program to anticipate the resultant encoding level as suggested above, to compare that to the maximum specified, to proceed if within limits or, in case not, to terminate with a level warning (and optionally to suggest trying a lower bitrate, that being directly under user control). In all cases where your encode does proceed, you would need to specify the --level as anticipated to ensure --ref is set to the corresponding level limit.
Yes, I know that it's a good solution, but BD3D2MK3D has several limitations. For example, I can't launch an encode just to verify that the level doesn't exceed the limit, because the program extracts the AVC, MVC, audio and subtitle streams to a "project" directory, and builds a batch file to encode. When the batch file is launched by the user, there is no way to interrupt it, and if, in the end, the level really used exceeds what the user wants, he will have to redo everything from the beginning, including the long demux process. Well, he can also edit the batch file and some other files, but I want a solution for inexperienced users, and editing a batch file is really too hard for many people! ;-)
__________________
r0lZ
PgcEdit homepage (hosted by VideoHelp)
BD3D2MK3D A tool to convert 3D blu-rays to SBS, T&B or FS MKV
Old 14th October 2014, 10:20   #50  |  Link
r0lZ
PgcEdit daemon
 
 
Join Date: Jul 2003
Posts: 7,393
Quote:
Originally Posted by fvisagie View Post
That is not the case for constant bitrate encoding (i.e. either two-pass encoding or using the vbv parameters). If your program generates constant bitrate encodes, you can safely use the specified bitrate to anticipate and/or limit the resulting encoding level.
That's what I intend to do, at least when the preset is a slow one. For the faster ones, there should be no problem.
__________________
r0lZ
PgcEdit homepage (hosted by VideoHelp)
BD3D2MK3D A tool to convert 3D blu-rays to SBS, T&B or FS MKV
Old 14th October 2014, 10:22   #51  |  Link
r0lZ
PgcEdit daemon
 
 
Join Date: Jul 2003
Posts: 7,393
Quote:
Originally Posted by detmek View Post
My encoding experience is that I use 4 reference frames for every encode, no matter what preset I use. 4 reference frames is the maximum for level 4.1 at 1080p30. That way I don't have to worry that the stream will exceed level 4.1, but it will be flagged with a lower level if the resolution is lower.
That may be a very good solution indeed. What are the drawbacks? Worse compression with slow presets, I suppose?
__________________
r0lZ
PgcEdit homepage (hosted by VideoHelp)
BD3D2MK3D A tool to convert 3D blu-rays to SBS, T&B or FS MKV
Old 14th October 2014, 10:22   #52  |  Link
r0lZ
PgcEdit daemon
 
 
Join Date: Jul 2003
Posts: 7,393
I have another question about the presets and levels.

When I encoded my test clip with all presets (without other arguments) to verify what level x264 prints in its log, I kept the encoded files, just to compare their sizes. Strangely, it's the veryfast preset that has the best compression! That was a big surprise for me!

I know that the compression depends greatly on the content of the input file, and on the complexity of the scenes it contains, but I would have expected better compression with the slow presets.

For your information, here is the directory with all files, sorted by size:
Code:
Size      Preset
3.244.187 01_ultrafast.264
2.143.208 02_superfast.264
1.732.023 05_fast.264
1.672.734 04_faster.264
1.633.941 07_slow.264
1.603.109 06_medium.264
1.566.318 08_slower.264
1.518.301 10_placebo.264
1.443.274 09_veryslow.264
1.334.144 03_veryfast.264
As you can see, the veryfast preset is the winner!

So, I wonder if it is useful to use slow presets and waste much time on interminable encodings if an extremely fast encode gives better results in terms of file size. Hence my questions. How is that possible? And, more importantly, how could I select the "best" preset for a given file? Are there rules such as "if there is much noise, prefer preset X", or "for computer-animated films, prefer preset Y"? If so, do you know a good site that explains those rules?
__________________
r0lZ
PgcEdit homepage (hosted by VideoHelp)
BD3D2MK3D A tool to convert 3D blu-rays to SBS, T&B or FS MKV

Last edited by r0lZ; 14th October 2014 at 10:26.
Old 14th October 2014, 10:29   #53  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,079
I don't think judging the preset by its file size alone is a very good metric; the quality inside could be quite different, as the encoder becomes more efficient at slower speeds.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 14th October 2014, 10:55   #54  |  Link
fvisagie
Registered User
 
Join Date: Aug 2008
Location: Isle of Man
Posts: 588
Quote:
Originally Posted by r0lZ View Post
I can't launch an encode just to verify that the level doesn't exceed the limit
That is unnecessary. To calculate the encoding level, one needs profile, frame size, framerate, B-frames, B-pyramid and bitrate. These are all either known, specified or can be inspected before starting the encode. That is how h264levl.bat works - it does not even attempt an encode.

If some video parameter varies between encodes and/or isn't provided by the user, say frame size, you can use something like ffprobe to inspect the input video for that. ffx264.bat provides an example of such usage.
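A hypothetical ffprobe invocation for that inspection step might look like this (the field names are ffprobe's own; the input file name is made up):

```shell
# Read the frame size and frame rate of the input video so the level
# anticipation can run before any encode is started.
ffprobe -v error -select_streams v:0 \
        -show_entries stream=width,height,avg_frame_rate \
        -of default=noprint_wrappers=1 input.mkv
```

For a 1080p23.976 source this prints key=value lines along the lines of width=1920, height=1080 and avg_frame_rate=24000/1001.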

To emphasise the point, anticipating the encoding level can be fully completed prior to starting the encode. Hence the word "anticipating".

Of course simpler short-cuts or encoding rules of thumb are available. However, you'll invariably find that those will either limit the generality and usefulness of your program, and/or the encoding quality of its output.
Old 14th October 2014, 11:26   #55  |  Link
r0lZ
PgcEdit daemon
 
 
Join Date: Jul 2003
Posts: 7,393
Quote:
Originally Posted by nevcairiel View Post
I don't think judging the preset by its file size alone is a very good metric; the quality inside could be quite different, as the encoder becomes more efficient at slower speeds.
In CRF mode, the quality of the resulting file should be approximately identical, regardless of the preset used, no?

Anyway, my question is still valid. Are there guidelines to select one preset rather than another given the type of movie to encode (such as action film, cartoon, computer graphics, old film with much grain, and so on)?
__________________
r0lZ
PgcEdit homepage (hosted by VideoHelp)
BD3D2MK3D A tool to convert 3D blu-rays to SBS, T&B or FS MKV
Old 14th October 2014, 11:43   #56  |  Link
r0lZ
PgcEdit daemon
 
 
Join Date: Jul 2003
Posts: 7,393
Quote:
Originally Posted by fvisagie View Post
That is unnecessary. To calculate the encoding level, one needs profile, frame size, framerate, B-frames, B-pyramid and bitrate. These are all either known, specified or can be inspected before starting the encode. That is how h264levl.bat works - it does not even attempt an encode.
OK, then that means that if the anticipated bitrate is greater than what is allowed for the "limit" level, I have to use the --level and --vbv-* parameters to limit the level to the specified one; otherwise I can leave the command line as it is, without specifying additional parameters, and I have the guarantee that the final stream will use a level <= the "maximum level" that the user allows. Right?

If it's so easy, why does x264 print a possibly wrong level when it starts to encode the file? It knows exactly all the parameters, and it should be able to estimate the bitrate and therefore the level with the same precision. Why doesn't it use the same formula?

Quote:
Originally Posted by fvisagie View Post
If some video parameter varies between encodes and/or isn't provided by the user, say frame size, you can use something like ffprobe to inspect the input video for that. ffx264.bat provides an example of such usage.
Luckily, with BD3D2MK3D, the characteristics of the file to encode are known before launching the encode, since it's the same program that demuxes the AVC and MVC streams from the BD3D. And, as far as I know, the size and frame rate MUST always be 1080p @ 23.976. That greatly simplifies the process.

Quote:
Originally Posted by fvisagie View Post
Of course simpler short-cuts or encoding rules of thumb are available. However, you'll invariably find that those will either limit the generality and usefulness of your program, and/or the encoding quality of its output.
I understand that, but since my program always uses the same picture size and fps, I suppose that in the absence of additional parameters such as B-frames and B-pyramid, I can simplify the process greatly and assume that the levels printed by x264 in my tests are correct.

Anyway, I will still study your batch files, and see how I can integrate them in my program. Thanks again.
__________________
r0lZ
PgcEdit homepage (hosted by VideoHelp)
BD3D2MK3D A tool to convert 3D blu-rays to SBS, T&B or FS MKV

Last edited by r0lZ; 14th October 2014 at 11:55.
Old 14th October 2014, 15:11   #57  |  Link
fvisagie
Registered User
 
Join Date: Aug 2008
Location: Isle of Man
Posts: 588
Quote:
Originally Posted by r0lZ View Post
OK, then that means that if the anticipated bitrate is greater than what is allowed for the "limit" level, I have to use the --level and --vbv-* parameters to limit the level to the specified one; otherwise I can leave the command line as it is, without specifying additional parameters, and I have the guarantee that the final stream will use a level <= the "maximum level" that the user allows. Right?
No, the roles of bitrate, --vbv* and --level are somewhat different. You (or your program) specify a bitrate to the encoder. For a given input video and encoder settings, bitrate is probably the most direct way the user or program controls/limits encoding level. Having specified the bitrate, if you then use two-pass encoding or the vbv parameters to roughly "limit the bitrate" to the specified bitrate (i.e. constant bitrate encoding), that specified bitrate can then reliably be used in calculations for anticipated encoding level (which is what h264levl.bat does). If the specified bitrate then results in an anticipated encoding level higher than the maximum, encoding should not proceed if you want to ensure compliance with the maximum level you specified.

Of course you could hard-code an encoding bitrate that you've confirmed beforehand to result in an acceptable encoding level in typical cases (although that would limit generality of the implementation), but a maximum level test would still be a good safe-guard.

Once you have determined what level your encode will result in, use --level to mark it accordingly, and to also ensure --ref gets set appropriately. Then you can encode.
Old 14th October 2014, 15:28   #58  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,570
Quote:
Originally Posted by Kurtnoise View Post
What do you mean by compatibility with many hardware players ? Why not put these switches --bluray-compat --vbv-maxrate 40000 --vbv-bufsize 30000 --level 4.1 in your command line then ? That should be fine for all players.
It will never work in all players; there are too many different devices out there. From reports I've read, a PS3 can choke on such an encode because it needs slices for efficient decoding (like Blu-ray requires --slices 4 for level 4.1).

Quote:
Originally Posted by r0lZ View Post
OK, then that means that if the anticipated bitrate is greater than what is allowed for the "limit" level, I have to use the --level and --vbv-* parameters to limit the level to the specified one; otherwise I can leave the command line as it is, without specifying additional parameters, and I have the guarantee that the final stream will use a level <= the "maximum level" that the user allows. Right?
Without --vbv-XXXX you will never have the guarantee that the chosen level will not be exceeded.

Quote:
Originally Posted by fvisagie View Post
No, the roles of bitrate, --vbv* and --level are somewhat different. You (or your program) specify a bitrate to the encoder. For a given input video and encoder settings, bitrate is probably the most direct way the user or program controls/limits encoding level. Having specified the bitrate, if you then use two-pass encoding or the vbv parameters to roughly "limit the bitrate" to the specified bitrate (i.e. constant bitrate encoding), that specified bitrate can then reliably be used in calculations for anticipated encoding level (which is what h264levl.bat does). If the specified bitrate then results in an anticipated encoding level higher than the maximum, encoding should not proceed if you want to ensure compliance with the maximum level you specified.
Wrong. Two-pass encoding without --vbv-XXX is variable, not constant bit rate. (You definitely don't want to use constant bit rate!) Without --vbv-XXXX you cannot reliably achieve a certain level.

Quote:
Originally Posted by r0lZ View Post
If it's so easy, why does x264 print a possibly wrong level when it starts to encode the file? It knows exactly all the parameters, and it should be able to estimate the bitrate and therefore the level with the same precision. Why doesn't it use the same formula?
See above.
Old 14th October 2014, 16:51   #59  |  Link
r0lZ
PgcEdit daemon
 
 
Join Date: Jul 2003
Posts: 7,393
@fvisagie: I don't want to specify a bitrate (and certainly not a constant bitrate - that was good for MPEG-1, but we are in the 21st century!). The user may want to do a 2-pass encode, and in that case the bitrate is specified, but in most cases the CRF mode is much better, and I want to keep the possibility to use it. And anyway, even in 2-pass, there is no guarantee that the bitrate will never exceed the limit, since it is variable. If I understand correctly, the encoder may exceed the maximum level if the maximum bitrate for the specified level is exceeded during some peaks, in "difficult" scenes (and this regardless of whether CRF or 2-pass mode is used). It should therefore be sufficient to limit it with --vbv-maxrate. And I see no good reason to force a specific bitrate for the whole movie, except of course in 2-pass, where it is mandatory.

Also, if I understand correctly, what you explain is exactly the opposite of what I need to do. I don't want to compute the bitrate to know exactly what level the encoder will use, in order to tag the file correctly. I want to specify (or, better, limit) the encoder to a specific level, and therefore limit the bitrate accordingly. I know what bitrates are accepted by what levels, and therefore I suppose that I can simply limit the bitrate to obtain that level. (I know that I must also specify --level or --ref to obtain the right level, but that's another problem.) However, I may use your method afterwards, to tag the file with the correct level when it has NOT been specified by the user.
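That "limit the bitrate to obtain that level" step could be sketched as a lookup from the user's maximum level to the VBV switches (a sketch under the assumption of High profile; the kbps values are the commonly cited High-profile maxima, and the helper name is made up):

```shell
# Map a user-selected maximum level to the x264 VBV switches that keep a
# CRF encode within that level's peak-bitrate/buffer limits (High profile).
vbv_for_level() {
    case "$1" in
        4.0)     echo "--vbv-maxrate 25000 --vbv-bufsize 25000" ;;
        4.1|4.2) echo "--vbv-maxrate 62500 --vbv-bufsize 62500" ;;
        5.0)     echo "--vbv-maxrate 168750 --vbv-bufsize 168750" ;;
        5.1)     echo "--vbv-maxrate 300000 --vbv-bufsize 300000" ;;
        *) echo "unsupported level" >&2; return 1 ;;
    esac
}

# Usage sketch: x264 --preset slow --crf 18 --level 4.1 $(vbv_for_level 4.1) ...
vbv_for_level 4.1
```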

Quote:
Originally Posted by sneaker_ger View Post
Without --vbv-XXXX you will never have the guarantee that the chosen level will not be exceeded.
I know that. That's why my intention is to use the --vbv parameters, unless the user accepts ending up with whatever level the characteristics of the input file produce.
__________________
r0lZ
PgcEdit homepage (hosted by VideoHelp)
BD3D2MK3D A tool to convert 3D blu-rays to SBS, T&B or FS MKV

Last edited by r0lZ; 14th October 2014 at 16:55.
Old 14th October 2014, 17:32   #60  |  Link
Sharc
Registered User
 
Join Date: May 2006
Posts: 3,822
Quote:
Originally Posted by r0lZ View Post
In CRF mode, the quality of the resulting file should be approximately identical, regardless of the preset used, no? .....
No.
CRF means constant rate factor, which is not an absolute measure of "quality".
- The same CRF with different presets will normally result in different quality. Hence your experience with the unexpected file sizes (compression), which were even better for faster presets.
- With CRF mode one can however expect constant viewing quality (for the given CRF and given preset) throughout the particular movie.