Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 3rd August 2020, 06:35   #29801  |  Link
Lathe
Registered User
 
 
Join Date: Aug 2005
Posts: 1,014
Quote:
Originally Posted by jdobbs View Post
One thing that might cause that: Did you import the file before NVENCC was implemented in BD-RB and then use NVENCC for the encode? If so, the NVENCC adjustments wouldn't be found in the PSEUDO.INF file (created during import) and the resizing and/or padding wouldn't occur.

Just throwing out possibilities.
Hmmm, thanks for the thought, but no... I just took the MKV file, used TSMuxer to convert it to a BDMV/CERT format, then used that as the source like I would any ripped Blu-ray for BDRB. I then set the output size slightly smaller so that it would force a re-encode (the original size of the BDMV folder was around 18 GB). I used the LAV internal encoder as usual. I didn't change anything that I normally would do.

No biggie really, as long as I know now that BDRB will not automatically detect the improper AR within the BDMV folder, I will just add the AVS script from now on if I have to do that. I don't do that very often, only when I have a pretty high resolution file that has the lossless audio (I know, I know... ) but I want to convert the video to a playable Blu-ray format.

It just occurred to me too that maybe if I simply imported the original MKV file into BDRB and let IT create the pseudo BDMV folder, then perhaps it would detect the non-compliant AR in the MKV file. I guess I was just trying to skip having BDRB do that step.
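(For anyone following along: the resize-and-pad arithmetic behind such an AVS script is straightforward. The sketch below is purely illustrative -- `pad_to_1080p` is a hypothetical name, not anything in BD-RB -- it just computes the values you would feed to AviSynth's `LanczosResize()` and `AddBorders(left, top, right, bottom)`.)

```python
# Hypothetical helper (not part of BD-RB): compute the resize-and-pad
# values needed to fit a non-compliant picture into a compliant
# 1920x1080 frame while preserving its aspect ratio.
def pad_to_1080p(width, height, target_w=1920, target_h=1080):
    scale = min(target_w / width, target_h / height)
    new_w = int(round(width * scale / 2) * 2)   # keep dimensions mod-2
    new_h = int(round(height * scale / 2) * 2)
    pad_x, pad_y = target_w - new_w, target_h - new_h
    left, top = pad_x // 2, pad_y // 2
    # returned in AviSynth AddBorders(left, top, right, bottom) order
    return new_w, new_h, (left, top, pad_x - left, pad_y - top)

print(pad_to_1080p(3840, 1600))   # -> (1920, 800, (0, 140, 0, 140))
```

Those values drop straight into an AVS line like `LanczosResize(1920, 800).AddBorders(0, 140, 0, 140)`.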
Old 3rd August 2020, 21:01   #29802  |  Link
cartman0208
Registered User
 
Join Date: Jun 2010
Location: Germany
Posts: 133
Quote:
Originally Posted by SquallMX View Post
It's almost impossible for a 35mm live-action film to get those CRF values for a 4K BD-25. For reference, when using nVidia HW-accelerated encoding I got a value of 22.15.
Sure, but did you actually complete the encode (in your log it was aborted)?
It might be a display issue and another value is used ... just see what output size you get...
Old 3rd August 2020, 23:12   #29803  |  Link
jdobbs
Moderator
 
Join Date: Oct 2001
Posts: 20,675
Status update: Got pretty much everything reported in bug reports fixed. Currently working on HDR10+ support. I'll probably be releasing the next version in a day or two.
__________________
Help with development of new apps: Donations.
Website: www.jdobbs.net
Old 3rd August 2020, 23:17   #29804  |  Link
jdobbs
Moderator
 
Join Date: Oct 2001
Posts: 20,675
Quote:
Originally Posted by SquallMX View Post
CRF Prediction is broken for x265.



CRF 2 is insanely low. I loaded the prediction AVS script / M2TS in VDub; it seems to be broken at I-frames, so it's mainly static frames, which explains the low CRF value.

This is the sample file:
https://mega.nz/file/ATpg3Q6J#XK09Bq...KhTTZVPpz4w9Es
The AVS isn't used for HEVC prediction because the SelectRangeEvery() filter just doesn't seem to be able to find keyframes on HEVC. You also can't look at the M2TS used as input because a player (but not an encoder) will have trouble with the crazy DTS/PTS values.

You pretty much have to look at the output of the prediction to actually see how it went.
__________________
Help with development of new apps: Donations.
Website: www.jdobbs.net
Old 4th August 2020, 08:07   #29805  |  Link
Lathe
Registered User
 
 
Join Date: Aug 2005
Posts: 1,014
Quote:
Originally Posted by jdobbs View Post
Status update: Got pretty much everything reported in bug reports fixed. Currently working on HDR10+ support. I'll probably be releasing the next version in a day or two.
Thanks Boss!
Old 4th August 2020, 08:18   #29806  |  Link
Lathe
Registered User
 
 
Join Date: Aug 2005
Posts: 1,014
On a side note, with my new build everything works really well, except when I try to utilize more of the CPU. I wonder if the Ryzen 5s are known to overheat easily? It's ONLY when I try to do an encode, which of course tries to utilize most of the CPU. If it is running full out at 90%+, it gradually gets hotter and hotter until it reaches about 90 degrees and shuts the computer off. The case I got has FIVE bloody fans in it too, for goodness' sake! With a huge one on the side to draw in cooler air. I'm taking it back to where they built it, and as added help I bought an aftermarket CPU cooler. But it really shouldn't be doing that anyway.

The only workaround I could come up with: when I do an encode using x264, either with or without BDRB, the only way I can keep the CPU from overheating is to deselect cores/threads in the 'Affinity' setting when you right-click the x264 process in Task Manager. So, if I only use 1/4 of the 'cores' or whatever they are and the CPU is only running at 25%, I can just BARELY keep it under the redline temperature. Sure is frustrating. It's still a lot faster than my old one, but it would be kind of nice to be able to use the entire potential of the CPU.
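(Aside: the same restriction can be applied at launch with Windows' `start /affinity <hexmask>`, which saves reopening Task Manager each run. A minimal sketch for computing the mask -- the helper name is made up, and it assumes you want the first n logical processors:)

```python
# Hypothetical helper: compute the hex affinity mask expected by
# Windows' "start /affinity" for the first n logical processors.
def affinity_mask(n_cores):
    # one bit per logical processor, lowest bits = first cores
    return hex((1 << n_cores) - 1)

print(affinity_mask(6))   # -> 0x3f  (6 of 24 threads, i.e. 25%)
```

So `start /affinity 3f x264.exe ...` would pin x264 to six threads (note that `start` takes the mask without the `0x` prefix).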

I'm HOPING they will find what exactly is causing that (hopefully NOT a bad CPU!), because just adding the aftermarket cooler alone will not fix it. I guess I'll just hafta see what happens...
Old 4th August 2020, 15:03   #29807  |  Link
cartman0208
Registered User
 
Join Date: Jun 2010
Location: Germany
Posts: 133
Quote:
Originally Posted by Lathe View Post
On a side note, with my new build everything works really well, except when I try to utilize more of the CPU. I wonder if the Ryzen 5s are known to overheat easily? It's ONLY when I try to do an encode, which of course tries to utilize most of the CPU. If it is running full out at 90%+, it gradually gets hotter and hotter until it reaches about 90 degrees and shuts the computer off. The case I got has FIVE bloody fans in it too, for goodness' sake! With a huge one on the side to draw in cooler air. I'm taking it back to where they built it, and as added help I bought an aftermarket CPU cooler. But it really shouldn't be doing that anyway.
It's like cars with a lot of power and crappy tires ... can't get the power on the street
One hint I can give: thermal conductive paste, not too little, not too much...
If the heat from the CPU can't get to the heat spreader, you can have the best cooling solution ever but it won't cool.

I, personally, use AIO watercoolers ... easy setup, lots of space left in the case, way cooler than all the aircoolers I had and not even that expensive...
Old 4th August 2020, 22:03   #29808  |  Link
gonca
Registered User
 
Join Date: Jul 2012
Posts: 1,026
@Lathe
The coolers supplied by Intel and AMD with their CPUs are not the greatest, to be kind.
Like cartman0208 said, get an AIO watercooler
I like Corsair, and don't get one with a small radiator
Old 5th August 2020, 04:33   #29809  |  Link
MrVideo
Registered User
 
 
Join Date: May 2007
Location: Wisconsin
Posts: 1,712
Quote:
Originally Posted by gonca View Post
@Lathe
The coolers supplied by Intel and AMD with their CPUs are not the greatest, to be kind.
Like cartman0208 said, get an AIO watercooler
I like Corsair, and don't get one with a small radiator
You've still got to get the paste correct.
Old 5th August 2020, 10:42   #29810  |  Link
BuddTX
Registered User
 
Join Date: Mar 2006
Posts: 71
Quote:
Originally Posted by MrVideo View Post
You've still got to get the paste correct.
Check out this IC Graphite Pad; it's supposed to work just as well as the best thermal pastes, but with no drying out or installation errors:
https://youtu.be/YpphKzmDiJM
https://www.amazon.com/dp/B07CKVW18G
Old 5th August 2020, 16:40   #29811  |  Link
SquallMX
Special SeeD
 
Join Date: Nov 2002
Location: Mexico
Posts: 288
Quote:
Originally Posted by cartman0208 View Post
Did you try a complete encode?
I had CRF Values of 1.00 but the output was not oversized ...
Yes I did:

Code:
[08/03/20] BD Rebuilder v0.61.09
[21:37:28] Source:  THE_FAST_AND_THE_FURIOUS_2001
  - Input BD size: 57.81 GB
  - Approximate total content: [02:36:40.348]
  - Target BD size: 24.41 GB
  - Windows Version: 6.2 [9200]
  - Quality: Better (Faster), CRF
  - Decoding/Frame serving: FFMPEG
  - Audio Settings: AC3=0 DTS=0 HD=1 Kbs=640
[21:37:32] PHASE ONE, Encoding
 - [21:37:32] Processing: VID_00165 (1 of 4)
 - [21:37:32] Extracting A/V streams [VID_00165]
 - [21:37:37] Reencoding video [VID_00165]
   - Source Video: HEVC, 3840x2160
   - Rate/Length: 23.976fps, 576 frames
 - [21:37:37] Performing CRF Prediction...
   - Analyzing 15.80 15.30 14.95 [14.95]
 - [21:37:53] Encoding using constant rate factor.
 - [21:38:40] Video Encode complete
 - [21:38:40] Processing audio tracks
   - Track 4352 (eng): Keeping original audio
 - [21:38:40] Multiplexing M2TS
 - [21:38:44] Blanking: VID_00252 (2 of 4)
 - [21:38:44] Blanking: VID_00253 (3 of 4)
 - [21:38:44] Processing: VID_00294 (4 of 4)
 - [21:38:44] Extracting A/V streams [VID_00294]
 - [21:48:00] Reencoding video [VID_00294]
   - Source Video: HEVC, 3840x2160
   - Rate/Length: 23.976fps, 153,721 frames
 - [21:48:00] Performing CRF Prediction...
   - Analyzing 15.90 8.45 4.72 2.86 1.93 1.46 1.23 1.11 1.06 1.03 1.02 1.01 [1.01]
 - [21:49:38] Encoding using constant rate factor.
   - Performing size-correcting second pass...
 - [09:34:24] Video Encode complete
 - [09:34:24] Processing audio tracks
   - Track 4352 (eng): Keeping original audio
   - Track 4354 (spa): Keeping original audio
   - Track 4357 (eng): Keeping original audio
 - [09:34:24] Multiplexing M2TS
[09:35:24]PHASE ONE complete
[09:35:24]PHASE TWO - Rebuild Started
 - [09:35:24] Rebuilding BD file Structure
[09:35:44] - Encode and Rebuild complete
Code:
[Status]
LABEL=THE_FAST_AND_THE_FURIOUS_2001
VERSION=v0.61.09
SOURCE_SIZE=62073708041
SOURCE_VIDEO_SIZE=60550053888
TARGET_SIZE=26214400000
REDUCTION=.407774134977166
RESIZE_1080=0
RESIZE_1440=0
AUDIO_TO_KEEP=all
KEEP_HD_AUDIO=-1
SUBS_TO_KEEP=all
BACKUP_MODE=0
MOVIEONLY_TYPE=0
USE_LAVF=0
INSTANCES=1
DGDECNV=0
DGDECIM=0
FRIMSOURCE=0
FFMS2=0
SSIF_MODE=0
UHD_V3_MODE=0
QUICK=0
ENCODE_STEP=0
COMPLETED=4
REBUILD_COMPLETE=1
[00165]
AUDIO=1
PGS=
VIDEO2=0
V2MBRATE=0
M2TS_TARGET=78560708
NSTART=27000000
NEND=28081079
NSIZE=80283648
FLINK=0
MLINK=0
[00294]
AUDIO=101001
PGS=1111111111111
VIDEO2=0
V2MBRATE=0
M2TS_TARGET=24612185139
NSTART=27000000
NEND=315515101
NSIZE=23796996096
FLINK=0
MLINK=0
After creating an enormous 40+ GB .hevc file, BD-RB did a 2-pass which resulted in a properly fitting 23 GB output, but that pretty much negates the point of a CRF mode, which is getting similar quality to 2-pass mode in a single pass. In fact, because 2-pass mode uses a "fast" first pass, the encoding time was longer (8 hours [normal 2-pass mode] vs 12 hours [CRF mode + size-correcting second pass]).

So software-encoding CRF prediction is definitely buggy, at least in version 61.09. It looks like it's using FFMPEG to get the sample data, but the results are static images (crazy DTS/PTS values?), so the prediction calculates a very low CRF value:

Code:
"D:\Archivos de Programa\BD Rebuilder UHD\tools\ffmpeg.exe" -ss 5998.7456 -i 
"L:\BLU-RAYS\4K\THE FAST AND THE FURIOUS 2001\BDMV\STREAM\00294.m2ts" 
-frames 48 -an -sn -codec copy -f mpegts - >> "C:\TEMP\WORKFILES\00294.AVS.SMPL.m2ts"
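(The "Analyzing 15.90 8.45 ... [1.01]" lines in the log read like an iterative search: encode a sample at a trial CRF, project the full-title size, and step the CRF until the projection fits the target. A toy sketch of that idea -- this is a guess at the shape of the algorithm, not BD-RB's actual code, and every name in it is hypothetical:)

```python
# Toy sketch of a CRF-prediction loop (NOT BD-RB's actual algorithm):
# project the full-title size from a sample encode at a trial CRF and
# step the CRF until the projection lands within tolerance of target.
def predict_crf(sample_size_at, target_bytes, crf=16.0, tol=0.02, max_steps=12):
    """sample_size_at(crf) -> projected full-title output size in bytes."""
    history = []
    for _ in range(max_steps):
        history.append(round(crf, 2))
        ratio = sample_size_at(crf) / target_bytes
        if abs(ratio - 1.0) <= tol:
            break                      # projection fits: done
        # raise CRF when oversized, lower it when undersized
        crf = max(0.0, min(51.0, crf + 6.0 * (ratio - 1.0)))
    return history

# Toy size model (an assumption, not measured encoder behaviour):
# output halves for every +6 CRF, anchored at 40 GB for CRF 10.
model = lambda c: 40e9 * 0.5 ** ((c - 10) / 6)
print(predict_crf(model, 24e9))   # -> [16.0, 15.0, 14.61, 14.48]
```

A broken sample (the static frames described above) makes the projected size absurdly small, so a loop like this keeps walking the CRF down -- which matches the 15.90 → 1.01 collapse in the log.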

Last edited by jdobbs; 6th August 2020 at 12:59. Reason: Too wide
Old 6th August 2020, 01:33   #29812  |  Link
MrVideo
Registered User
 
 
Join Date: May 2007
Location: Wisconsin
Posts: 1,712
Please use code wrap, not quote wrap, for output text.
Old 6th August 2020, 01:34   #29813  |  Link
MrVideo
Registered User
 
 
Join Date: May 2007
Location: Wisconsin
Posts: 1,712
Quote:
Originally Posted by BuddTX View Post
Check out this IC Graphite Pad; it's supposed to work just as well as the best thermal pastes, but with no drying out or installation errors:
https://www.amazon.com/dp/B07CKVW18G
Neat stuff. Thanks for the tip. I ordered one from ebay. 50 cents more, but free shipping.

Last edited by MrVideo; 6th August 2020 at 01:42.
Old 6th August 2020, 12:50   #29814  |  Link
jdobbs
Moderator
 
Join Date: Oct 2001
Posts: 20,675
Quote:
Originally Posted by Lathe View Post
Hmmm, thanks for the thought, but no... I just took the MKV file, used TSMuxer to convert it to a BDMV/CERT format, then used that as the source like I would any ripped Blu-ray for BDRB. I then set the output size slightly smaller so that it would force a re-encode (the original size of the BDMV folder was around 18 GB). I used the LAV internal encoder as usual. I didn't change anything that I normally would do.

No biggie really, as long as I know now that BDRB will not automatically detect the improper AR within the BDMV folder, I will just add the AVS script from now on if I have to do that. I don't do that very often, only when I have a pretty high resolution file that has the lossless audio (I know, I know... ) but I want to convert the video to a playable Blu-ray format.

It just occurred to me too that maybe if I simply imported the original MKV file into BDRB and let IT create the pseudo BDMV folder, then perhaps it would detect the non-compliant AR in the MKV file. I guess I was just trying to skip having BDRB do that step.
Yes. That would be the problem. If you create the source manually (using TSMUXER) then BD-RB has no way of knowing the AR is noncompliant and noting it for adjustment. TSMUXER will set the CLPI's flags to the nearest compliant resolution. Since there is no PSEUDO.INI file, BD-RB thinks it's a standard BD and assumes the CLPI is correct.
__________________
Help with development of new apps: Donations.
Website: www.jdobbs.net

Last edited by jdobbs; 6th August 2020 at 12:55.
Old 6th August 2020, 13:04   #29815  |  Link
jdobbs
Moderator
 
Join Date: Oct 2001
Posts: 20,675
Quote:
Originally Posted by SquallMX View Post
After creating an enormous 40+ GB .hevc file, BD-RB did a 2-pass which resulted in a properly fitting 23 GB output, but that pretty much negates the point of a CRF mode, which is getting similar quality to 2-pass mode in a single pass. In fact, because 2-pass mode uses a "fast" first pass, the encoding time was longer (8 hours [normal 2-pass mode] vs 12 hours [CRF mode + size-correcting second pass]).

So software-encoding CRF prediction is definitely buggy, at least in version 61.09. It looks like it's using FFMPEG to get the sample data, but the results are static images (crazy DTS/PTS values?), so the prediction calculates a very low CRF value:
Yes. It needs work -- that's the purpose of releasing a test version, so I can get feedback. But as I said before -- the "static images" you mention AREN'T a problem when encoding for prediction... that is an issue with whatever you are using in playback.

The reason I think it is important to get CQM & CRF working well ISN'T just speed. My testing shows that you get BETTER QUALITY at the same size using CQM/CRF than you do with one-pass VBR bitrate encoding. NVENCC has no true two-pass option and X265 is so slow on UHD that you want to try and avoid two-pass. So the time comparison shouldn't be between one-pass and CQM/CRF -- but between CQM/CRF and a two pass encode (averaging across multiple discs -- not just one). A second pass in CQM/CRF mode only occurs when the final target size is exceeded to a point where it won't fit on the target disc.

It's also important to point out that the reason BD-RB has to create an alternate prediction method for HEVC is that AVISYNTH doesn't handle HEVC well. The problems stem from the fact that when trying to seek (SelectRangeEvery) with AVISYNTH on an HEVC source, you don't seem to land on (or adjust from) i-frames and you get garbage output -- making the method used for AVC prediction (upon which I spent a considerable amount of time) a non-starter. Part of the purpose of the "test" releases is to figure out a good way to predict without using AVISYNTH. So the new method (for HEVC) uses FFMPEG to pull out sections of the source M2TS and combines them into an M2TS to use as the prediction source.

With all that said... I'm working on trying to make the initial CQM/CRF value more accurate. But it ain't easy... and that's why you don't see CQM/CRF prediction algorithms lying around waiting for someone to pick one up to use.
__________________
Help with development of new apps: Donations.
Website: www.jdobbs.net

Last edited by jdobbs; 6th August 2020 at 13:44.
Old 6th August 2020, 16:00   #29816  |  Link
jdobbs
Moderator
 
Join Date: Oct 2001
Posts: 20,675
BD Rebuilder v0.61.10

Unless I've missed something in testing, I think the latest version is very close to release for public consumption. Please download and do some testing on this release:

BD Rebuilder v0.61.10

Summary of changes:
Code:
- Added support for NVIDIA NVENC encoding.
- Corrected an error that could cause BD-RB
  to exit in failure when attempting a size-
  correcting second pass in CRF mode while
  doing a UHD backup.
- Added code that will update the HDR flags
  in the MPLS file for sources in which HDR
  was not established until after reencoding 
  on full backups.
- Fixed an error in which MPEG2 sources may
  not make video adjustments detected during
  import when reencoding. This could result
  in stretched or compressed images.
- Created a workaround for a problem with
  converting PAL to NTSC flags during import 
  which could confuse TSMUXER's recognition
  of video resolution.
- Added code to adapt to UHD sources that are
  not sized to either 3840(h) or 2160(v).
- Corrected an issue in which video sources
  that are an odd size during import might
  use the wrong resizing when HEVC encoding
  is enabled.
- Fixed a bug in which large audio offsets
  detected during import were not being
  interpreted correctly.
- Fixed a bug that could cause undersizing
  when an AC3 stream is kept intact because
  the original is smaller than the selected
  reencoder bitrate.  This typically only
  happens on DVD imports.
- Fixed an error that could result in wrong
  aspect ratio on imported sources that are
  being resized.
- Corrected an issue in which some command
  line settings were not being set when a
  non-UHD source is output as V3.
- Added a routine to ensure formatting of
  CRF values is consistent.
- Updated CQM prediction routines.
- Fixed an issue that can cause the CQM/CRF
  sample file to have a zero length when
  used in other-than-US regions.
- Modified NVENCC options to eliminate vbrhq
  from the command line, as it is targeted
  to be deprecated.  Replaced by "vbr" and
  manually enabling "--multipass 2pass-full"
  (the newer equivalent of vbrhq).
- Rewrote the "Bitstream Exception" TSMUXER
  error workaround routine so it now uses
  a more efficient method -- and better
  adjusts audio sync.  It also now adjusts
  for Dolby Vision exception errors.
- Fixed an issue in which BT709 sources were
  not being encoded with proper settings.
- Increased the accepted value of the THREADS
  hidden option from 16 to 128. Note:  While
  128 is accepted for x264 -- realistically 
  you should never set it that high.
- Added support for HDR10+. A JSON file is
  created concurrently with video extraction.
  HDR10+ streams are now identified in the
  streams list by a "+" next to "HEVC". Note
  that "*" indicates Dolby Vision. "*+"=both.
- Changed the ALTERNATE settings so creating
  a preset that outputs HEVC to MP4 is now
  allowed.
- Updated the included version of MP4BOX to
  support HEVC.
- Updated the included versions of X265 to a
  newer (v3.2.1) release.
- Other minor corrections and cosmetic fixes.
__________________
Help with development of new apps: Donations.
Website: www.jdobbs.net
Old 6th August 2020, 16:37   #29817  |  Link
SquallMX
Special SeeD
 
Join Date: Nov 2002
Location: Mexico
Posts: 288
Quote:
Originally Posted by jdobbs View Post
Yes. It needs work -- that's the purpose of releasing a test version, so I can get feedback. But as I said before -- the "static images" you mention AREN'T a problem when encoding for prediction... that is an issue with whatever you are using in playback.
I have to disagree: the prediction file generated by BD-RB is also only 48 frames. Here is a direct feed from LASTCMD.TXT:

Quote:
C:\Windows\System32>"D:\Archivos de Programa\BD Rebuilder UHD\tools\ffmpeg.exe" -probesize 100MB -i "C:\TEMP\WORKFILES\00294.AVS.SMPL.m2ts" -frames 48 -an -pix_fmt yuv420p10le -f yuv4mpegpipe -strict -1 - | "D:\Archivos de Programa\BD Rebuilder UHD\tools\x265-64.exe" - --preset faster --profile main10 --uhd-bd --repeat-headers --vbv-bufsize 45000 --vbv-maxrate 48000 --hdr --chromaloc 2 --colorprim bt2020 --transfer smpte2084 --colormatrix bt2020nc --master-display "G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,50)" --max-cll "1000,221" --fps 23.976 --slow-firstpass --pass 1 --sar 1:1 --qpfile "C:\TEMP\WORKFILES\VID_00294.CHP" --keyint 24 --crf 2.86 --y4m --no-strong-intra-smoothing --no-sao --stats "C:\TEMP\WORKFILES\TEMP.265.stats" --output "C:\TEMP\WORKFILES\TEMP.265"
ffmpeg version 3.4 Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 7.2.0 (GCC)
configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-bzlib --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-cuda --enable-cuvid --enable-d3d11va --enable-nvenc --enable-dxva2 --enable-avisynth --enable-libmfx
libavutil 55. 78.100 / 55. 78.100
libavcodec 57.107.100 / 57.107.100
libavformat 57. 83.100 / 57. 83.100
libavdevice 57. 10.100 / 57. 10.100
libavfilter 6.107.100 / 6.107.100
libswscale 4. 8.100 / 4. 8.100
libswresample 2. 9.100 / 2. 9.100
libpostproc 54. 7.100 / 54. 7.100
Input #0, mpegts, from 'C:\TEMP\WORKFILES\00294.AVS.SMPL.m2ts':
Duration: 00:00:02.00, start: 1.525122, bitrate: 4537470 kb/s
Program 1
Metadata:
service_name : Service01
service_provider: FFmpeg
Stream #0:0[0x100]: Video: hevc (Main 10) ([36][0][0][0] / 0x0024), yuv420p10le(tv, bt2020nc/bt2020/smpte2084), 3840x2160 [SAR 1:1 DAR 16:9], 23.98 fps, 23.98 tbr, 90k tbn, 23.98 tbc
Stream mapping:
Stream #0:0 -> #0:0 (hevc (native) -> wrapped_avframe (native))
Press [q] to stop, [?] for help
[hevc @ 00000207dc1208e0] First slice in a frame missing.
Last message repeated 6 times
[hevc @ 00000207dcd200a0] First slice in a frame missing.
Last message repeated 6 times
[yuv4mpegpipe @ 00000207e22d2020] Warning: generating non standard YUV stream. Mjpegtools will not work.
Output #0, yuv4mpegpipe, to 'pipe:':
Metadata:
encoder : Lavf57.83.100
Stream #0:0: Video: wrapped_avframe, yuv420p10le, 3840x2160 [SAR 1:1 DAR 16:9], q=2-31, 200 kb/s, 23.98 fps, 23.98 tbn, 23.98 tbc
Metadata:
encoder : Lavc57.107.100 wrapped_avframe
y4m [info]: 3840x2160 fps 23976/1000 i420p10 sar 1:1 unknown frame count
raw [info]: output file: C:\TEMP\WORKFILES\TEMP.265
x265 [info]: HEVC encoder version 3.2.1+3-b4b2ecac21f6
x265 [info]: build info [Windows][GCC 9.2.0][64 bit] 10bit
x265 [info]: using cpu capabilities: MMX2 SSE2Fast LZCNT SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
x265 [warning]: uhd-bd: Turning off open GOP
x265 [warning]: uhd-bd: keyframeMin is always 1
x265 [info]: Main 10 profile, Level-5.1 (High tier)
x265 [info]: Thread pool created using 24 threads
x265 [info]: Slices : 1
x265 [info]: frame threads / pool features : 4 / wpp(34 rows)
x265 [info]: Coding QT: max CU size, min CU size : 64 / 8
x265 [info]: Residual QT: max TU size, max depth : 32 / 1 inter / 1 intra
x265 [info]: ME / range / subpel / merge : hex / 57 / 2 / 2
x265 [info]: Keyframe min / max / scenecut / bias: 1 / 24 / 40 / 5.00
x265 [info]: Lookahead / bframes / badapt : 15 / 4 / 0
x265 [info]: b-pyramid / weightp / weightb : 1 / 1 / 0
x265 [info]: References / ref-limit cu / depth : 2 / on / on
x265 [info]: AQ: mode / str / qg-size / cu-tree : 2 / 1.0 / 32 / 1
x265 [info]: Rate Control / qCompress : CRF-2.9 / 0.60
x265 [info]: VBV/HRD buffer / max-rate / init : 45000 / 48000 / 0.900
x265 [info]: tools: rd=2 psy-rd=2.00 early-skip rskip signhide tmvp fast-intra
x265 [info]: tools: lslices=8 deblock stats-write
frame= 48 fps= 13 q=-0.0 Lsize= 1166400kB time=00:00:02.00 bitrate=4772803.0kbits/s dup=1 drop=0 speed=0.532x
video:25kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 4712628.500000%
x265 [info]: frame I: 4, Avg QP:8.74 kb/s: 113388.31
x265 [info]: frame P: 10, Avg QP:8.47 kb/s: 77779.35
x265 [info]: frame B: 34, Avg QP:11.63 kb/s: 42043.62
x265 [info]: Weighted P-Frames: Y:90.0% UV:90.0%
x265 [info]: consecutive B-frames: 35.7% 0.0% 7.1% 0.0% 57.1%

encoded 48 frames in 6.38s (7.53 fps), 55433.96 kb/s, Avg QP:10.73

C:\Windows\System32>
And here is the output file from that prediction pass:

https://mega.nz/file/NXpgmSJb#LPjgIr...0bdzKUTdTOTmog

It looks like even FFMPEG is incapable of properly reading the "00294.AVS.SMPL.m2ts" file created for CRF prediction, and that is the reason for the ultra-low values that result in an oversized file.

Quote:
Originally Posted by jdobbs View Post
With all that said... I'm working on trying to make the initial CQM/CRF value more accurate. But it ain't easy... and that's why you don't see CQM/CRF prediction algorithms laying around waiting for someone to pick one up to use.
Thanks for your hard work; it's what sets this apart from lesser-quality bloatware (cough, cough, DVDFa... cough, cough).

EDIT: I think I found the culprit: "-frames 48". Removing it from lastcmd.txt creates a proper output file!!!

Looks like FFMPEG is actually capable of properly decoding the prediction file, but for some reason BD-RB is passing an incorrect "-frames" parameter that cuts the prediction file off too early.

EDIT 2: Still present in 0.61.10 (obviously).

Last edited by jdobbs; 6th August 2020 at 20:11. Reason: Fix?
Old 6th August 2020, 18:46   #29818  |  Link
cartman0208
Registered User
 
Join Date: Jun 2010
Location: Germany
Posts: 133
Quote:
Originally Posted by jdobbs View Post
Unless I've missed something in testing, I think the latest version is very close to release for public consumption. Please download and do some testing on this release:

BD Rebuilder v0.61.10
Long list of changes ... thanks a lot for the hard work ... testing ...
Old 6th August 2020, 20:31   #29819  |  Link
jdobbs
Moderator
 
Join Date: Oct 2001
Posts: 20,675
@SquallMX

The "-frames 48" is a bit confusing. I'll have a look at it. If it were taken from the command line that is creating the SAMPLE it would make more sense.

The sample file is created using groups of 48 frames (about 2 seconds each), each starting with an i-frame (the start of which is pulled from the CLPI's EP_map table). FFMPEG is run multiple times, outputting 48 frames with each iteration, giving an approximately 1% sample of the entire source file (by default; it is adjustable with HIDDENOPTS, and it can be a larger percentage on smaller files).
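(The scheme described above can be sketched roughly as follows. This is an illustration only -- the function name is invented, and snapping to a multiple of 48 frames merely stands in for the real EP_map i-frame lookup:)

```python
# Illustrative sketch of the sampling scheme (not BD-RB's actual code):
# pick ~1% of the title as 48-frame groups, start points spread evenly.
def sample_points(total_frames, fps=23.976, group=48, fraction=0.01):
    groups = max(1, round(total_frames * fraction / group))
    span = total_frames / groups
    # snap each start down to a group boundary -- a stand-in for
    # snapping to a real i-frame via the CLPI's EP_map table
    starts = [int(i * span) // group * group for i in range(groups)]
    return [(s, round(s / fps, 4)) for s in starts]   # (frame, seconds for ffmpeg -ss)

# The log's VID_00294 (153,721 frames) would yield 32 groups of 48
# frames, about 2 seconds each -- roughly 1% of the running time.
points = sample_points(153721)
print(len(points), points[1])
```

Each (frame, seconds) pair corresponds to one FFMPEG invocation like the `-ss ... -frames 48 -codec copy` command shown earlier in the thread, with the pieces appended into the single prediction M2TS.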

Why it is in the command line used to encode the sample makes me think there has to be a bug somewhere. The question is "why is it not happening to me?" I assume the source being encoded is larger than 4800 frames?

Can you send me (or post) your settings (the contents of BDREBUILDER.INI). Maybe something in the settings is triggering the glitch.

[Edit] Never mind. I found it.

It was a mistake in coding and you were absolutely right. No wonder the prediction was so far off. It appears to be left over from a test I was doing to compare encoding 48 frames at a time as opposed to extracting them with FFMPEG beforehand and then encoding the group. Since most of my recent testing has been using NVENCC (the logic flow of which was copied from the X265 routine) -- I guess I just outright failed to switch it back from the test code in the X265 routine.

Thanks for the help in identifying that bug.

I'll put in a fix, test it, and get another version up. I'm also working on improving the prediction algorithm on small streams -- so it will probably be a day or two (at the most).
__________________
Help with development of new apps: Donations.
Website: www.jdobbs.net

Last edited by jdobbs; 7th August 2020 at 01:32.
Old 6th August 2020, 21:05   #29820  |  Link
jedihyte
Registered User
 
Join Date: Mar 2010
Posts: 10
Quote:
Originally Posted by jdobbs View Post
If it only happens on a software player and never on a hardware player, it would lead me to believe that the software player is at fault, not the encode.
Thanks for the reply. Yeah, it's something I've just had to deal with for several years; I was just trying to find out if there were any new developments. BTW, it only happens if it goes through BD-Rebuilder first, as the software player (PowerDVD) plays the original 3D discs fine without artifacts. I know 3D is not a priority and is dying, but I'm still a huge 3D fan. Thanks for all the cool BD Rebuilder developments over the years.