
How to capture an image in ARKit and send binary data?

I want to capture an image in ARKit and send the byte array to a TCP server.

This is my code:

    @IBOutlet weak var sceneView: ARSCNView!

    @IBAction func sendButtonAction(_ sender: Any) {
        let captureImage: UIImage = self.sceneView.snapshot()
    }

I can get the image with snapshot(), but I don't know how to convert it to a byte array containing the pixel R, G, B data.

I tried to convert the UIImage to binary data like this:

    let imageData: NSData = UIImagePNGRepresentation(captureImage)! as NSData

but this is not what I need, because imageData's size changes every time I snapshot (PNG output is compressed) :(

My goal is to get a Byte (UInt8) array from captureImage whose size is exactly Width * Height * 3 (R, G, B) bytes.
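One way to get a fixed-size buffer like that is to redraw the snapshot into a CGContext with a known layout and then strip the alpha channel. This is only a sketch with minimal error handling, and the function name is mine:

    import UIKit

    /// Draws the image into a fixed-size RGBA bitmap context, then strips
    /// the alpha byte, yielding exactly width * height * 3 bytes (R, G, B).
    func rgbBytes(from image: UIImage) -> [UInt8]? {
        guard let cgImage = image.cgImage else { return nil }
        let width = cgImage.width
        let height = cgImage.height
        let bytesPerRow = 4 * width          // RGBA in the working buffer
        var rgba = [UInt8](repeating: 0, count: height * bytesPerRow)

        guard let context = CGContext(
            data: &rgba,
            width: width,
            height: height,
            bitsPerComponent: 8,
            bytesPerRow: bytesPerRow,
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
        ) else { return nil }

        context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))

        // Drop the alpha byte from every pixel: RGBA -> RGB.
        var rgb = [UInt8]()
        rgb.reserveCapacity(width * height * 3)
        for pixelStart in stride(from: 0, to: rgba.count, by: 4) {
            rgb.append(rgba[pixelStart])       // R
            rgb.append(rgba[pixelStart + 1])   // G
            rgb.append(rgba[pixelStart + 2])   // B
        }
        return rgb
    }

Because the buffer size is determined by the bitmap dimensions rather than by compression, the result is the same length for every snapshot of the same view size.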

If you have any ideas to solve this problem, please help me.
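For the TCP side, a minimal sketch using Apple's Network framework might look like the following; the host, port, and `rgb` array are placeholders for whatever the app actually uses:

    import Network

    // Placeholder endpoint; replace with the real server address and port.
    let connection = NWConnection(host: "192.168.0.10", port: 5000, using: .tcp)
    connection.start(queue: .global())

    // `rgb` is assumed to be the [UInt8] pixel array built earlier.
    connection.send(content: Data(rgb), completion: .contentProcessed { error in
        if let error = error {
            print("send failed:", error)
        }
    })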

I'm a little confused about what exactly isn't working for you with UIImagePNGRepresentation; sorry if I'm missing something, or maybe you could clarify.

But for one thing, PNG expects RGBA, so you might want to try UIImageJPEGRepresentation instead, since JPEG doesn't support an alpha channel.
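For example (the 0.8 quality value and `sendToServer` are placeholders; note the output is still variable-length, since JPEG is compressed):

    if let jpegData = UIImageJPEGRepresentation(captureImage, 0.8) {
        // Hypothetical helper standing in for the poster's TCP send.
        sendToServer(jpegData)
    }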

And if you're trying to get an actual NSMutableArray, see if this old answer helps: https://stackoverflow.com/a/29734175/8895191
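If all you need is a Swift byte array from the encoded Data rather than an NSMutableArray, a one-liner does it (here `imageData` is the NSData from the question):

    let bytes = [UInt8](imageData as Data)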
