Old 13th January 2013, 04:26   #2212  |  Link
Sparktank
I have a question about the ConvertToYV12() function.

Sometimes I'll remux a Blu-ray movie from disc and downscale it to 480p (with a Rec.709 -> Rec.601 colorimetry conversion) to feed to AVStoDVD.
Frameserving with FFMS2 and using VirtualDub to save as an AVI with Lagarith compression.
The "Color Depth" options in VD are both set to "4:2:0 planar YCbCr (YV12)" and I'll even throw in ConvertToYV12() in the AVS script before converting to a LAG.avi file.
Lagarith settings are: Mode=YV12, Prevent Upsampling When Decoding=Yes
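For reference, the workflow above might be sketched as an AVS script like the one below. The filename and target resolution are placeholders, and this assumes the ColorMatrix plugin is installed for the Rec.709 -> Rec.601 step (ConvertToYV12's matrix argument only applies when converting from RGB, not YUV -> YUV):

```avisynth
# Hypothetical sketch; path and sizes are placeholders.
FFVideoSource("movie.mkv")            # FFMS2 source; Blu-ray remux, already YUV
ColorMatrix(mode="rec.709->rec.601")  # third-party plugin; fixes the colorimetry
Spline36Resize(720, 480)              # downscale toward DVD resolution
ConvertToYV12()                       # no-op if the clip is already YV12
```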

Using DirectShowSource on the LAG.avi with Info() shows that it's already YV12.
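That check is just a two-line script, along these lines (filename is a placeholder):

```avisynth
# Hypothetical sketch; path is a placeholder.
DirectShowSource("movie_LAG.avi")  # load the Lagarith-compressed AVI
Info()                             # overlays clip properties; should report YV12
```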

But every time I load it into AVStoDVD, the autoscript will always show ConvertToYV12().
Is this normal? Or will it affect the colorimetry?
Am I doing something wrong?

I find this also happens with Ut Video Codec (YUV420 (ULY0) VCM x86).