
AVAssetReader, how to use with a stream rather than a file?

AVAssetReader is fantastic, but I can only see how to use it with a local asset: a file or, I guess, a composition.

So,

assetReader = try AVAssetReader(asset: self.asset)
...
assetReader.addOutput(readerOutput)

and so on,
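For reference, here is a minimal sketch of the local-file setup the snippet above comes from; `fileURL` and the BGRA output settings are illustrative choices, not anything mandated by the API:

```swift
import AVFoundation

// Sketch: read decoded video frames from a *local* asset with AVAssetReader.
// `fileURL` is assumed to point at a local media file.
func readFirstVideoTrack(from fileURL: URL) throws {
    let asset = AVAsset(url: fileURL)
    let assetReader = try AVAssetReader(asset: asset)

    guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }

    // Ask the reader to decode to BGRA pixel buffers.
    let readerOutput = AVAssetReaderTrackOutput(
        track: videoTrack,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                            kCVPixelFormatType_32BGRA])
    assetReader.add(readerOutput)
    assetReader.startReading()

    // Pull sample buffers until the track is exhausted.
    while let sampleBuffer = readerOutput.copyNextSampleBuffer() {
        // ... process sampleBuffer ...
        _ = sampleBuffer
    }
}
```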

Say you have an arriving stream (for example, Apple's sample .M3U8 streams at https://developer.apple.com/streaming/examples/ ).

In fact, can AVAssetReader be used for streams? Or only local files?

I just plain cannot find this explained anywhere. (Maybe it's obvious if you're more familiar with it. :/ )

It's not obvious. Patching together the header file comments for both AVAssetReader and AVComposition gives the strong impression of an API designed only for local assets, although the language does not explicitly rule out non-local assets.

From the AVAssetReader header file:

Instances of AVAssetReader read media data from an instance of AVAsset, whether the asset is file-based or represents an assembly of media data from multiple sources, as is the case with AVComposition.

and from AVComposition:

An AVComposition combines media data from multiple local file-based sources in a custom temporal arrangement, in order to present or process media data from multiple sources together. All local file-based audiovisual assets are eligible to be combined, regardless of container type.

If you're interested in video only, and don't mind processing as part of playback, you can capture frames from a remote asset by adding an AVPlayerItemVideoOutput to your AVPlayerItem . If you're interested in audio, you're up a creek.
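A rough sketch of that video-only workaround, assuming a hypothetical stream URL; the polling would typically be driven from a `CADisplayLink` callback rather than called ad hoc:

```swift
import AVFoundation
import CoreVideo

// Sketch: capture frames from a remote (e.g. HLS) item during playback
// by attaching an AVPlayerItemVideoOutput. The URL is illustrative.
let url = URL(string: "https://example.com/stream.m3u8")!
let item = AVPlayerItem(url: url)

let output = AVPlayerItemVideoOutput(pixelBufferAttributes: [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
])
item.add(output)

let player = AVPlayer(playerItem: item)
player.play()

// Poll for new frames, e.g. once per display refresh:
func copyFrameIfAvailable(at hostTime: CFTimeInterval) -> CVPixelBuffer? {
    let itemTime = output.itemTime(forHostTime: hostTime)
    guard output.hasNewPixelBuffer(forItemTime: itemTime) else { return nil }
    return output.copyPixelBuffer(forItemTime: itemTime,
                                  itemTimeForDisplay: nil)
}
```

Note this only yields frames as they are played back, not faster than real time the way AVAssetReader can for local files.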

