First of all: why worry so much about this? The bitrate of a video stream is almost never a fixed value anyway; it fluctuates constantly. So when we talk about "the" bitrate of a video stream, what we actually mean is the stream's average bitrate. And even in 2-pass mode, x264 does not hit the target (average) bitrate perfectly accurately. If, for example, you encode with a target bitrate of 515 kbps, you will almost certainly end up with a slightly higher or lower average bitrate.
Code:
Creating encoder process:
x264_8bit_x64.exe --bitrate 515 --pass 2 --stats city.704x576.stats --output city.704x576.264 city.704x576.avi
ffms [info]: 704x576p 0:1 @ 60062/1001 fps (vfr)
x264 [info]: using cpu capabilities: MMX2 SSE2Fast SSSE3 Cache64 SlowShuffle
x264 [info]: profile High, level 3.1
[...]
encoded 600 frames, 92.02 fps, 514.07 kb/s
Final file size is 627.5 KB.
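To see where x264's reported figure comes from, here is a small Python sketch (the helper name is my own) that recomputes the average bitrate from the numbers in the log above: 627.5 KB of output (taken as kibibytes, i.e. 1024 bytes, which is an assumption) over 600 frames at 60062/1001 fps:

```python
def average_bitrate_kbps(size_bytes: float, frame_count: int, fps: float) -> float:
    """Average bitrate in kb/s, where 1 kb = 1000 bits (x264's convention)."""
    duration_s = frame_count / fps            # playback duration in seconds
    return size_bytes * 8 / duration_s / 1000 # bits per second -> kb/s

# Figures taken from the x264 log above
kbps = average_bitrate_kbps(627.5 * 1024, 600, 60062 / 1001)
print(f"{kbps:.2f} kb/s")  # -> 514.07 kb/s, matching x264's report
```

Note that this matches the log's 514.07 kb/s only because the raw .264 output has no container overhead; for a muxed file (MP4, MKV) the container size would be slightly larger than the raw stream.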
Secondly: why do the programs show different bitrates? Reasons I can think of include different rounding strategies ("round to nearest" vs. "truncation") when converting from bits/s to kilobits/s. It is also quite possible that one program simply shows the "nominal" bitrate stored in the file's header, while the other computes the actual average bitrate of the stream. As we have seen above, these are not necessarily the same...
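As a hypothetical illustration of the rounding point: the same measured bitrate, reduced from bit/s to whole kb/s by the two strategies, can land on two different numbers. The value 514,900 bit/s below is made up, chosen so the strategies disagree:

```python
measured_bps = 514_900  # hypothetical measured bitrate in bits per second

nearest   = round(measured_bps / 1000)  # "round to nearest" -> 515
truncated = measured_bps // 1000        # "truncation"       -> 514

print(nearest, truncated)  # -> 515 514
```

So one tool could legitimately report 515 kb/s and another 514 kb/s for the exact same stream.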