The goal here is to check for audio backends in a sensible order. On
macOS, let's just always use Audio Unit (and thus skip the PulseAudio
checks entirely, to reduce needless/confusing build log noise). We also
now fall back to the Qt audio backend only when no other backend was
found, rather than basing that decision solely on whether Pulse is
present.
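For illustration only, the fallback order this aims for looks roughly
like the following; the macro names are hypothetical, and the actual
selection happens in the build system:

    // Hypothetical sketch of the backend selection order, not the real build logic.
    #if defined(AK_OS_MACOS)
    #    define AUDIO_BACKEND_AUDIO_UNIT // macOS: always Audio Unit, no Pulse checks
    #elif defined(HAVE_PULSEAUDIO)
    #    define AUDIO_BACKEND_PULSE // use PulseAudio when it was found
    #else
    #    define AUDIO_BACKEND_QT // Qt backend only when nothing else was found
    #endif
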
We had numerous NIH implementations of audio formats and metadata that
we no longer need: we either don't make use of the code at all, or its
functionality is now provided by FFmpeg.
This loader supports whatever formats libavformat and libavcodec can
handle. Currently only seekable streams are supported, and there are
still some limitations on the number of channels and the sample formats
we accept.
Plays all non-streaming audio files at:
https://tools.woolyss.com/html5-audio-video-tester/
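As a rough sketch of the flow such an FFmpeg-backed loader follows (this
is not the actual loader; error handling and channel/sample-format
conversion are omitted, and the function name is made up):

    // Minimal sketch of decoding an audio file with libavformat/libavcodec.
    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    }

    bool decode_audio(char const* path)
    {
        // Open the (seekable) input and probe its streams.
        AVFormatContext* format_context = nullptr;
        if (avformat_open_input(&format_context, path, nullptr, nullptr) < 0)
            return false;
        if (avformat_find_stream_info(format_context, nullptr) < 0)
            return false;

        // Pick the best audio stream and open a decoder for its codec.
        int stream_index = av_find_best_stream(format_context, AVMEDIA_TYPE_AUDIO, -1, -1, nullptr, 0);
        if (stream_index < 0)
            return false;
        AVCodecParameters* parameters = format_context->streams[stream_index]->codecpar;
        AVCodec const* codec = avcodec_find_decoder(parameters->codec_id);
        AVCodecContext* codec_context = avcodec_alloc_context3(codec);
        avcodec_parameters_to_context(codec_context, parameters);
        if (avcodec_open2(codec_context, codec, nullptr) < 0)
            return false;

        // Demux packets and feed them to the decoder, collecting raw frames.
        AVPacket* packet = av_packet_alloc();
        AVFrame* frame = av_frame_alloc();
        while (av_read_frame(format_context, packet) >= 0) {
            if (packet->stream_index == stream_index) {
                avcodec_send_packet(codec_context, packet);
                while (avcodec_receive_frame(codec_context, frame) == 0) {
                    // frame->nb_samples samples per channel are now available;
                    // a real loader would convert/resample them here.
                }
            }
            av_packet_unref(packet);
        }

        av_frame_free(&frame);
        av_packet_free(&packet);
        avcodec_free_context(&codec_context);
        avformat_close_input(&format_context);
        return true;
    }
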
H.264 in Matroska can have blocks with unordered timestamps. Unless
these are passed as the presentation timestamps into the FFmpeg decoder,
the frames will not be returned in chronological order.
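As a rough illustration of that change, assuming an FFmpeg-backed
decoder (the function and parameter names below are made up, not the
actual decoder wrapper):

    // Sketch: forward the Matroska block's presentation timestamp as the
    // packet pts, so decoded frames come back carrying presentation order.
    extern "C" {
    #include <libavcodec/avcodec.h>
    }
    #include <cstdint>

    void decode_block(AVCodecContext* codec_context, uint8_t* data, int size, int64_t presentation_timestamp)
    {
        AVPacket* packet = av_packet_alloc();
        packet->data = data;
        packet->size = size;
        packet->pts = presentation_timestamp; // the block's timestamp from the demuxer
        avcodec_send_packet(codec_context, packet);
        av_packet_free(&packet); // safe: the decoder copies or references the data as needed

        AVFrame* frame = av_frame_alloc();
        while (avcodec_receive_frame(codec_context, frame) == 0) {
            // frame->pts now carries the presentation timestamp; the caller can
            // attach it to the frame it hands off to playback.
        }
        av_frame_free(&frame);
    }
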
VideoFrame will now carry a timestamp that PlaybackManager uses
directly, rather than PlaybackManager assuming that a frame's timestamp
is the same one returned by the demuxer.
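A minimal sketch of the shape of this, with deliberately simplified
types (the real VideoFrame and PlaybackManager classes look different):

    // Hypothetical, simplified shapes: the frame carries its own timestamp,
    // so playback stops assuming it matches the demuxer's timestamp.
    #include <chrono>

    struct VideoFrame {
        // ... pixel data, dimensions, etc. ...
        std::chrono::microseconds timestamp; // presentation time set by the decoder
    };

    struct PlaybackManager {
        void enqueue_frame(VideoFrame const& frame)
        {
            // Use the timestamp stored on the frame itself, not the timestamp
            // the demuxer reported for the block the frame was decoded from.
            schedule_presentation(frame.timestamp);
        }

        void schedule_presentation(std::chrono::microseconds) { /* elided */ }
    };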