What's the difference between r_frame_rate and avg_frame_rate?
Hi, I want to know the significance of r_frame_rate and avg_frame_rate. These are output parameters.
r_frame_rate is "the lowest framerate with which all timestamps can be represented accurately (it is the least common multiple of all framerates in the stream)."
avg_frame_rate is total duration / total # of frames
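To make the distinction concrete, here is a small Python sketch with made-up numbers for a hypothetical variable-frame-rate stream (one second at 30 fps followed by one second at 24 fps). The segment rates and durations are assumptions for illustration, not values from any real file:

```python
from fractions import Fraction
from math import lcm

# Hypothetical VFR stream: first second at 30 fps, second second at 24 fps.
segment_rates = [30, 24]

# r_frame_rate: the lowest rate whose tick grid contains every timestamp,
# i.e. the least common multiple of the segment rates.
r_frame_rate = lcm(*segment_rates)
print(r_frame_rate)  # 120

# avg_frame_rate: total frames divided by total duration.
total_frames = 30 + 24        # one second of each segment
total_duration = 2            # seconds
avg_frame_rate = Fraction(total_frames, total_duration)
print(avg_frame_rate)         # 27
```

Note how different the two can be: r_frame_rate (120) is much higher than any actual segment rate, while avg_frame_rate (27) sits between them.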
Minor correction. I think avg_frame_rate is total # of frames / total duration.
The numerator and the denominator are swapped. (30 frames in 1 sec = 30 fps)
And I want to know the source about the avg_frame_rate. I was able to find this thread, but I couldn't find the same info from the documentation.
I think it is more accurate to say that avg_frame_rate is (nb_frames / time_base) / duration_ts.
Essentially the total duration is calculated from the time_base (usually tbn) multiplied by the largest presentation timestamp in the file (duration_ts).
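A worked example of that formula, using hypothetical numbers for a stream with the common 90000-tick timebase (the values of nb_frames and duration_ts here are invented for illustration):

```python
from fractions import Fraction

# Hypothetical ffprobe-style values for a 10-second, 30 fps stream.
nb_frames   = 300                  # frames in the stream
time_base   = Fraction(1, 90000)   # seconds per tick (tbn)
duration_ts = 900_000              # largest presentation timestamp, in ticks

# Total duration in seconds: ticks times seconds-per-tick.
duration_seconds = duration_ts * time_base        # 10 s

avg_frame_rate = nb_frames / duration_seconds
print(avg_frame_rate)                             # 30

# Same thing written the way the post above does:
assert avg_frame_rate == (nb_frames / time_base) / duration_ts
```

Using Fraction keeps the result exact, which matches how ffprobe reports these rates as rationals (e.g. 30/1) rather than floats.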
Sorry to re-awaken this thread, but...
nb_frames is the number of frames stored in a stream, not the number of frames that will actually be displayed from it: some frames can be "hidden" by edit lists, and a stream can contain frames past the end of its duration. duration_ts, on the other hand, is the duration for which only the non-hidden frames play. Because of that mismatch, avg_frame_rate can be nonsense for some files.
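A toy illustration of how that mismatch skews the result. All numbers here are invented: the stream is assumed to store 330 frames, of which an edit list hides the first 30, so only 300 frames are presented over 10 seconds:

```python
from fractions import Fraction

# Hypothetical stream: 330 frames stored, but an edit list hides the
# first 30, leaving 300 frames actually presented over 10 seconds.
nb_frames        = 330                 # frames stored in the stream
presented_frames = 300                 # frames that actually play
time_base        = Fraction(1, 90000)  # seconds per tick
duration_ts      = 900_000             # 10 s of *presented* material, in ticks

# Formula from the earlier post, fed the stored frame count:
computed = (nb_frames / time_base) / duration_ts
print(computed)    # 33  -- overstates the rate

# What the viewer actually sees:
actual = (presented_frames / time_base) / duration_ts
print(actual)      # 30
```

So the formula is only as good as its inputs: once nb_frames counts frames the edit list never shows, the resulting avg_frame_rate no longer describes playback.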