
Pass video from camera intent to webView

I am trying to implement a WebView that starts a video capture intent and then returns the recorded video to the WebView.

What I'm trying to do:

1) Java - add a WebAppInterface that opens a video capture intent:

mWebView = (WebView) findViewById(R.id.webView);
mWebView.addJavascriptInterface(webAppInterface, "Android");

public class WebAppInterface {
    ...
    @JavascriptInterface  // required so the page can call this method (API 17+)
    public void dispatchTakeVideoIntent() {
        Intent takeVideoIntent = new Intent(MediaStore.ACTION_VIDEO_CAPTURE);
        takeVideoIntent.putExtra(MediaStore.EXTRA_DURATION_LIMIT, 10);

        if (takeVideoIntent.resolveActivity(mContext.getPackageManager()) != null) {
            ((AppCompatActivity) mContext).startActivityForResult(takeVideoIntent, REQUEST_VIDEO_CAPTURE);
        }
    }
    ...
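
(For completeness: this assumes webAppInterface was built with the Activity context and that JavaScript is enabled on the WebView. A rough sketch of that wiring, not the exact code - the constructor shown here is an assumption:)

mWebView = (WebView) findViewById(R.id.webView);
mWebView.getSettings().setJavaScriptEnabled(true); // without this the page cannot see window.Android
mWebView.addJavascriptInterface(new WebAppInterface(this), "Android");

public class WebAppInterface {
    private final Context mContext;

    public WebAppInterface(Context context) {
        mContext = context; // later cast to AppCompatActivity for startActivityForResult
    }
    ...
}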

2) JavaScript - Call it from the webview:

Android.dispatchTakeVideoIntent()

3) Java - Get the Uri and send the path to my webView

 public void onActivityResult(int requestCode, int resultCode, Intent intent) {
    if (requestCode == REQUEST_VIDEO_CAPTURE && resultCode == Activity.RESULT_OK) {
        Uri videoUri = intent.getData();
        wView.loadUrl("javascript:test('" + videoUri.getPath() + "')");
    }
}

4) JavaScript - Get the path in my webView

window.test = (videoUriPath) => {
    ...
}

My question is, how to access the video?

And maybe there is a totally different way to go about it?

By "accessing the video" I assume you mean playing it in the WebView. Put a video element in your HTML (say its id is 'my-video'); then your JavaScript would be:

window.test = (videoUriPath) => {
 var video = document.getElementById('my-video');
 var source = document.createElement('source');

 source.setAttribute('src', videoUriPath);

 video.appendChild(source);
 video.load();
 video.play();
}
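
One caveat: the Uri returned by ACTION_VIDEO_CAPTURE is usually a content:// Uri, so videoUri.getPath() alone is probably not a path the page can load. Passing the full Uri string and keeping content access enabled on the WebView is more likely to work (a rough sketch; depending on the page's origin the WebView may still refuse to load it):

// Java side - pass the whole content:// URI instead of just the path
wView.getSettings().setAllowContentAccess(true); // true by default, shown here for clarity
wView.loadUrl("javascript:test('" + videoUri.toString() + "')");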

OK, I found a solution. It's a little overkill, but it's working...

1) JAVA: convert the video to a byte array

// 'is' is an InputStream opened from the captured video's Uri (see the sketch below)
byte[] bytes;
byte[] data = new byte[16384];
int bytesRead;
ByteArrayOutputStream output = new ByteArrayOutputStream();
while ((bytesRead = is.read(data)) != -1) {
   output.write(data, 0, bytesRead);
}
bytes = output.toByteArray();
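
(The stream "is" above is assumed to be opened from the Uri returned in onActivityResult; that part is not shown here, but it would look roughly like this, exception handling omitted:)

// Assumption: videoUri is the Uri from onActivityResult above.
// openInputStream works for both content:// and file:// URIs.
try (InputStream is = getContentResolver().openInputStream(videoUri)) {
    // ... the read loop from step 1 goes here ...
}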

2) JAVA: send base64-encoded chunks to the webView

int startIndex = 0;
int chunkSize = 16384;
while (startIndex < bytes.length) {
     // clamp the end index so copyOfRange does not pad the last chunk with zero bytes
     int endIndex = Math.min(startIndex + chunkSize, bytes.length);
     byte[] newArray = Arrays.copyOfRange(bytes, startIndex, endIndex);
     startIndex = endIndex;
     // NO_WRAP keeps the encoded chunk on one line so the javascript: URL stays intact
     String encodedString = Base64.encodeToString(newArray, Base64.NO_WRAP);
     wView.loadUrl("javascript:g_sendFile_f('" + encodedString + "')");
}
wView.loadUrl("javascript:g_sendFile_f('" + "finish" + "')");

3) JAVASCRIPT: receive the encoded chunks, combine them, and create a Blob

let bytesArrFinal_an = []
window.g_sendFile_f = (msg) => {
     // last call
     if (msg === "finish") {
         let blob = new Blob(bytesArrFinal_an, {type: "video/mp4"})
         this.test_videoUrl = URL.createObjectURL(blob);
         console.log("finish")
         return
     }

     // add bytes to final array
     let bytesArr_an = this.b64toByteArr(msg)
     bytesArrFinal_an = bytesArrFinal_an.concat(bytesArr_an);
     console.log(msg)
}

If someone has a more elegant solution I will be happy to see it!
