
NSInputStream from PHAsset url - iOS photos framework

I am trying to create an NSInputStream from the AVURLAsset URL for a video file (or from a PHAsset URL for a photo) from the Photos framework. My code goes as follows:

mAsset = [NSInputStream inputStreamWithFileAtPath:[murl path]];
[mAsset open];

The URL is file:///var/mobile/Media/DCIM/100APPLE/IMG_0519.JPG

Now when I read from the stream:

NSInteger readLength = [mAsset read:(uint8_t *)data maxLength:maxSize];

the readLength returned is -1. I think it has something to do with permissions for iOS photo assets.

If this approach is not correct, is there another way I can stream data from a video or image file backed by a Photos framework asset? Any help will be appreciated.

Although the question is a bit old, I'm going to explain how I solved it, since I ran into the same problem and never found any working solution on the internet that works with the Photos framework.

Because of how the Apple APIs are designed, it's indeed not possible to upload directly from ALAsset and PHAsset source files. So let me start by discussing how this problem was solved back in the day with the old (and now deprecated) API, AssetsLibrary.

ALAssetRepresentation has one awesome method, getBytes:fromOffset:length:error:, that translates directly to NSInputStream's read:maxLength:. This gives you a couple of options for producing a stream from an instance of ALAsset: you may either create a bound pair of input and output streams, or you may choose the slightly trickier path of subclassing NSInputStream.
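To illustrate the subclassing route, here is a minimal sketch (in Swift) of an InputStream subclass backed by a random-access, getBytes-style byte source. The class name and the readBytes closure are my own inventions, not part of either framework; with AssetsLibrary the closure would simply forward to getBytes:fromOffset:length:error:.

```swift
import Foundation

// Hypothetical InputStream subclass backed by a random-access byte source,
// mirroring ALAssetRepresentation's getBytes:fromOffset:length:error:.
final class ByteSourceInputStream: InputStream {
    // (buffer, offset, length) -> number of bytes actually copied, or throws
    private let readBytes: (UnsafeMutablePointer<UInt8>, Int, Int) throws -> Int
    private let totalLength: Int
    private var offset = 0
    private var status: Stream.Status = .notOpen

    init(length: Int,
         readBytes: @escaping (UnsafeMutablePointer<UInt8>, Int, Int) throws -> Int) {
        self.totalLength = length
        self.readBytes = readBytes
        super.init(data: Data())
    }

    override func open() { status = .open }
    override func close() { status = .closed }
    override var streamStatus: Stream.Status { status }
    override var hasBytesAvailable: Bool { offset < totalLength }

    override func read(_ buffer: UnsafeMutablePointer<UInt8>, maxLength len: Int) -> Int {
        guard status == .open, offset < totalLength else { return 0 }
        do {
            let copied = try readBytes(buffer, offset, min(len, totalLength - offset))
            offset += copied
            if offset == totalLength { status = .atEnd }
            return copied
        } catch {
            status = .error
            return -1  // same convention the asker observed: -1 signals a read error
        }
    }
}
```

Consumers then use it exactly like any other NSInputStream: open it, loop over read:maxLength:, and close it.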

So in regards of working with Photos Framework this gives you the first solution: you may try to get an ALAsset URL from a PHAsset and after that just fall back to creating a stream from the good old ALAssetRepresentation . 因此,在使用Photos Framework时,这为您提供了第一个解决方案:您可以尝试PHAsset获取ALAsset URL, PHAsset再回到从旧的ALAssetRepresentation创建流。 Yes, this URL conversion is not documented, and yes, AssetsLibrary is now deprecated, but hey - it's an option. 是的,此URL转换未记录,是的,AssetsLibrary现已弃用,但嘿 - 这是一个选项。 And there is an article on Medium that suggest that it's indeed a working solution. 还有一篇关于Medium的文章表明它确实是一个有效的解决方案。
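For the record, this undocumented conversion is usually done by rebuilding an assets-library:// URL from the PHAsset's localIdentifier. A hedged sketch follows; the URL format is not guaranteed by Apple and may break in any iOS release, and the helper name and extension guess are mine:

```swift
import Photos

// Undocumented fallback: rebuild an assets-library:// URL from a PHAsset's
// localIdentifier so the deprecated ALAssetsLibrary APIs can open the asset.
// Apple does not document or guarantee this URL format.
func assetsLibraryURL(for asset: PHAsset) -> URL? {
    // localIdentifier looks like "91B1C271-.../L0/001"; the leading UUID is
    // what the assets-library URL scheme expects as its "id" parameter.
    guard let uuid = asset.localIdentifier.components(separatedBy: "/").first else {
        return nil
    }
    // Assumption for illustration only: the real file extension may differ.
    let ext = asset.mediaType == .video ? "MOV" : "JPG"
    return URL(string: "assets-library://asset/asset.\(ext)?id=\(uuid)&ext=\(ext)")
}
```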

Now let's move on to the Photos framework.

With iOS 9, Apple introduced a new class, PHAssetResourceManager, that is suitable for our purposes. Its lengthy method requestDataForAssetResource:options:dataReceivedHandler:completionHandler: a) progressively provides you with chunks of the asset's data; b) gives direct access to those underlying data resources and doesn't require any additional file-system space if the photo is present on the phone (i.e. not only in iCloud). Side note: the statement in b) is not actually documented, but it proved to be correct in practice; you can fill up the device's storage, invoke this method, and it will still work nicely.

However, there are a few caveats with PHAssetResourceManager: it delivers the data asynchronously, and the chunks of data are of arbitrary size. It's quite understandable why this new API looks the way it does: with the Photos framework you use the same methods to work with local and iCloud assets. But all in all, this new method doesn't translate to NSInputStream's interface as nicely as ALAssetRepresentation's getBytes:fromOffset:length:error: did. Rest easy, though: there's one feature of this method that we can exploit to make it consumer-friendly, so that it will look just like the old getBytes:fromOffset:length:error: method.
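A minimal sketch of driving this API (the wrapper function and the handleChunk consumer are my own names, not part of the framework):

```swift
import Photos

// Sketch: stream an asset's bytes with PHAssetResourceManager. Chunks arrive
// asynchronously, in order, and are of arbitrary size.
func streamResource(of asset: PHAsset,
                    handleChunk: @escaping (Data) -> Void,
                    completion: @escaping (Error?) -> Void) {
    guard let resource = PHAssetResource.assetResources(for: asset).first else {
        completion(nil)  // nothing to read for this asset
        return
    }
    let options = PHAssetResourceRequestOptions()
    options.isNetworkAccessAllowed = true  // permit downloading iCloud-only assets
    PHAssetResourceManager.default().requestData(
        for: resource,
        options: options,
        dataReceivedHandler: handleChunk,
        completionHandler: completion)
}
```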
This method, requestDataForAssetResource:options:dataReceivedHandler:completionHandler:, delivers its data on a serial queue in sequential order. That means we can use a bounded blocking queue to build a synchronous method that looks like func nextChunk() throws -> Data?. And once we have such a method, getting the asset's bytes is super easy.
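The bounded blocking queue itself needs nothing from Photos. A minimal, Foundation-only sketch (the class and method names are mine) could look like this, with the dataReceivedHandler enqueueing chunks and a nil marking end-of-stream:

```swift
import Foundation

// Bounded blocking queue bridging an asynchronous chunk producer to a
// synchronous consumer. The producer blocks when the queue is full, which
// applies backpressure; nil marks end-of-stream.
final class BlockingChunkQueue {
    private var chunks: [Data?] = []
    private let condition = NSCondition()
    private let capacity: Int

    init(capacity: Int = 4) { self.capacity = capacity }

    // Called from the asynchronous dataReceivedHandler.
    func enqueue(_ chunk: Data?) {
        condition.lock()
        while chunks.count >= capacity { condition.wait() }
        chunks.append(chunk)
        condition.broadcast()
        condition.unlock()
    }

    // Blocks the caller until a chunk (or the nil end marker) is available.
    func nextChunk() -> Data? {
        condition.lock()
        while chunks.isEmpty { condition.wait() }
        let chunk = chunks.removeFirst()
        condition.broadcast()
        condition.unlock()
        return chunk
    }
}
```

The consumer then simply loops `while let chunk = queue.nextChunk() { ... }`, which reads exactly like the old synchronous getBytes-style API.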

And actually that's exactly what I did in my library, PHAssetResourceInputStream. It takes care of all the heavy lifting behind getting the bytes of assets out of the Photos framework and provides you with a simple API, so I hope it'll be helpful for someone who runs into the same problem.

TL;DR

PHAssetResourceManager will make you happy.


 