Well, I would imagine that framerate defines how many frames are displayed per second (i.e. the playback speed). So if I take a 1-minute video at 30 fps and have the encoder change it to 60 fps, I should obviously end up with a 30-second video at 60 fps, shouldn't I?
Instead, the encoder hands me back a full 1-minute video at 60 fps, and I'm just super confused about where those extra frames came from. Are they repeats of the same frames? And what in the world would be the point of that, unless it's some SVP-style interpolation feature?
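Just to lay out the arithmetic behind my assumption (a rough sketch, not anything the encoder documentation says):

```python
# My assumption: the same frames, just reinterpreted at a faster rate.
total_frames = 30 * 60               # 1 minute at 30 fps = 1800 frames
retimed_duration = total_frames / 60 # played back at 60 fps -> 30.0 seconds

# What I actually got: still a full minute, but at 60 fps.
output_frames = 60 * 60              # 1 minute at 60 fps = 3600 frames
extra_frames = output_frames - total_frames  # 1800 frames from... somewhere?

print(retimed_duration)  # 30.0
print(extra_frames)      # 1800
```

So half the frames in the output would have to come from somewhere, which is exactly the part I don't get.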