
Android ICS and MJPEG using AsyncTask

I modified the MJPEG viewer code from Android and MJPEG to use an AsyncTask (and thus work on Ice Cream Sandwich (ICS), 4.0.4); here is my code.

If anyone has suggestions on how to optimize, clean up, or otherwise improve the code, please let me know. There are two issues I'd appreciate help with:

  • If the device is playing a stream and you lock and then unlock the screen, playback does not resume until you either kill and restart the app or rotate the screen. All my attempts to handle this in onResume() resulted in app crashes.

  • In particular, I'd like to move the AsyncTask back into MjpegInputStream.java, but I was not able to get that to work.

MjpegActivity.java:

package com.demo.mjpeg;

import java.io.IOException;
import java.net.URI;

import org.apache.http.HttpResponse;
import org.apache.http.client.ClientProtocolException;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.DefaultHttpClient;

import com.demo.mjpeg.MjpegView.MjpegInputStream;
import com.demo.mjpeg.MjpegView.MjpegView;
import android.app.Activity;
import android.os.AsyncTask;
import android.os.Bundle;
import android.util.Log;
import android.view.Window;
import android.view.WindowManager;
import android.widget.Toast;

public class MjpegActivity extends Activity {
    private static final String TAG = "MjpegActivity";

    private MjpegView mv;

    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        //sample public cam
        String URL = "http://trackfield.webcam.oregonstate.edu/axis-cgi/mjpg/video.cgi?resolution=800x600&amp%3bdummy=1333689998337";

        requestWindowFeature(Window.FEATURE_NO_TITLE);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, 
        WindowManager.LayoutParams.FLAG_FULLSCREEN);

        mv = new MjpegView(this);
        setContentView(mv);        

        new DoRead().execute(URL);
    }

    public void onPause() {
        super.onPause();
        mv.stopPlayback();
    }

    public class DoRead extends AsyncTask<String, Void, MjpegInputStream> {
        protected MjpegInputStream doInBackground(String... url) {
            //TODO: if camera has authentication deal with it and don't just not work
            HttpResponse res = null;
            DefaultHttpClient httpclient = new DefaultHttpClient();     
            Log.d(TAG, "1. Sending http request");
            try {
                res = httpclient.execute(new HttpGet(URI.create(url[0])));
                Log.d(TAG, "2. Request finished, status = " + res.getStatusLine().getStatusCode());
                if(res.getStatusLine().getStatusCode()==401){
                    //You must turn off camera User Access Control before this will work
                    return null;
                }
                return new MjpegInputStream(res.getEntity().getContent());  
            } catch (ClientProtocolException e) {
                e.printStackTrace();
                Log.d(TAG, "Request failed-ClientProtocolException", e);
                //Error connecting to camera
            } catch (IOException e) {
                e.printStackTrace();
                Log.d(TAG, "Request failed-IOException", e);
                //Error connecting to camera
            }

            return null;
        }

        protected void onPostExecute(MjpegInputStream result) {
            mv.setSource(result);
            mv.setDisplayMode(MjpegView.SIZE_BEST_FIT);
            mv.showFps(true);
        }
    }
}

MjpegInputStream.java:

package com.demo.mjpeg.MjpegView;

import java.io.BufferedInputStream;
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.util.Log;

public class MjpegInputStream extends DataInputStream {
    private static final String TAG = "MjpegInputStream";

    private final byte[] SOI_MARKER = { (byte) 0xFF, (byte) 0xD8 };
    private final byte[] EOF_MARKER = { (byte) 0xFF, (byte) 0xD9 };
    private final String CONTENT_LENGTH = "Content-Length";
    private final static int HEADER_MAX_LENGTH = 100;
    private final static int FRAME_MAX_LENGTH = 40000 + HEADER_MAX_LENGTH;
    private int mContentLength = -1;

    public MjpegInputStream(InputStream in) {
        super(new BufferedInputStream(in, FRAME_MAX_LENGTH));
    }

    private int getEndOfSeqeunce(DataInputStream in, byte[] sequence) throws IOException {
        int seqIndex = 0;
        byte c;
        for(int i=0; i < FRAME_MAX_LENGTH; i++) {
            c = (byte) in.readUnsignedByte();
            if(c == sequence[seqIndex]) {
                seqIndex++;
                if(seqIndex == sequence.length) {
                    return i + 1;
                }
            } else {
                seqIndex = 0;
            }
        }
        return -1;
    }

    private int getStartOfSequence(DataInputStream in, byte[] sequence) throws IOException {
        int end = getEndOfSeqeunce(in, sequence);
        return (end < 0) ? (-1) : (end - sequence.length);
    }

    private int parseContentLength(byte[] headerBytes) throws IOException, NumberFormatException {
        ByteArrayInputStream headerIn = new ByteArrayInputStream(headerBytes);
        Properties props = new Properties();
        props.load(headerIn);
        return Integer.parseInt(props.getProperty(CONTENT_LENGTH));
    }   

    public Bitmap readMjpegFrame() throws IOException {
        mark(FRAME_MAX_LENGTH);
        int headerLen = getStartOfSequence(this, SOI_MARKER);
        reset();
        byte[] header = new byte[headerLen];
        readFully(header);
        try {
            mContentLength = parseContentLength(header);
        } catch (NumberFormatException nfe) { 
            nfe.getStackTrace();
            Log.d(TAG, "catch NumberFormatException hit", nfe);
            mContentLength = getEndOfSeqeunce(this, EOF_MARKER); 
        }
        reset();
        byte[] frameData = new byte[mContentLength];
        skipBytes(headerLen);
        readFully(frameData);
        return BitmapFactory.decodeStream(new ByteArrayInputStream(frameData));
    }
}

MjpegView.java:

package com.demo.mjpeg.MjpegView;

import java.io.IOException;

import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.graphics.PorterDuff;
import android.graphics.PorterDuffXfermode;
import android.graphics.Rect;
import android.graphics.Typeface;
import android.util.AttributeSet;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

public class MjpegView extends SurfaceView implements SurfaceHolder.Callback {
    private static final String TAG = "MjpegView";

    public final static int POSITION_UPPER_LEFT  = 9;
    public final static int POSITION_UPPER_RIGHT = 3;
    public final static int POSITION_LOWER_LEFT  = 12;
    public final static int POSITION_LOWER_RIGHT = 6;

    public final static int SIZE_STANDARD   = 1; 
    public final static int SIZE_BEST_FIT   = 4;
    public final static int SIZE_FULLSCREEN = 8;

    private MjpegViewThread thread;
    private MjpegInputStream mIn = null;    
    private boolean showFps = false;
    private boolean mRun = false;
    private boolean surfaceDone = false;    
    private Paint overlayPaint;
    private int overlayTextColor;
    private int overlayBackgroundColor;
    private int ovlPos;
    private int dispWidth;
    private int dispHeight;
    private int displayMode;

    public class MjpegViewThread extends Thread {
        private SurfaceHolder mSurfaceHolder;
        private int frameCounter = 0;
        private long start;
        private Bitmap ovl;

        public MjpegViewThread(SurfaceHolder surfaceHolder, Context context) {
            mSurfaceHolder = surfaceHolder;
        }

        private Rect destRect(int bmw, int bmh) {
            int tempx;
            int tempy;
            if (displayMode == MjpegView.SIZE_STANDARD) {
                tempx = (dispWidth / 2) - (bmw / 2);
                tempy = (dispHeight / 2) - (bmh / 2);
                return new Rect(tempx, tempy, bmw + tempx, bmh + tempy);
            }
            if (displayMode == MjpegView.SIZE_BEST_FIT) {
                float bmasp = (float) bmw / (float) bmh;
                bmw = dispWidth;
                bmh = (int) (dispWidth / bmasp);
                if (bmh > dispHeight) {
                    bmh = dispHeight;
                    bmw = (int) (dispHeight * bmasp);
                }
                tempx = (dispWidth / 2) - (bmw / 2);
                tempy = (dispHeight / 2) - (bmh / 2);
                return new Rect(tempx, tempy, bmw + tempx, bmh + tempy);
            }
            if (displayMode == MjpegView.SIZE_FULLSCREEN){
                return new Rect(0, 0, dispWidth, dispHeight);
            }
            return null;
        }

        public void setSurfaceSize(int width, int height) {
            synchronized(mSurfaceHolder) {
                dispWidth = width;
                dispHeight = height;
            }
        }

        private Bitmap makeFpsOverlay(Paint p, String text) {
            Rect b = new Rect();
            p.getTextBounds(text, 0, text.length(), b);
            int bwidth  = b.width()+2;
            int bheight = b.height()+2;
            Bitmap bm = Bitmap.createBitmap(bwidth, bheight, Bitmap.Config.ARGB_8888);
            Canvas c = new Canvas(bm);
            p.setColor(overlayBackgroundColor);
            c.drawRect(0, 0, bwidth, bheight, p);
            p.setColor(overlayTextColor);
            c.drawText(text, -b.left+1, (bheight/2)-((p.ascent()+p.descent())/2)+1, p);
            return bm;           
        }

        public void run() {
            start = System.currentTimeMillis();
            PorterDuffXfermode mode = new PorterDuffXfermode(PorterDuff.Mode.DST_OVER);
            Bitmap bm;
            int width;
            int height;
            Rect destRect;
            Canvas c = null;
            Paint p = new Paint();
            String fps;
            while (mRun) {
                if(surfaceDone) {
                    try {
                        c = mSurfaceHolder.lockCanvas();
                        synchronized (mSurfaceHolder) {
                            try {
                                bm = mIn.readMjpegFrame();
                                destRect = destRect(bm.getWidth(),bm.getHeight());
                                c.drawColor(Color.BLACK);
                                c.drawBitmap(bm, null, destRect, p);
                                if(showFps) {
                                    p.setXfermode(mode);
                                    if(ovl != null) {
                                        height = ((ovlPos & 1) == 1) ? destRect.top : destRect.bottom-ovl.getHeight();
                                        width  = ((ovlPos & 8) == 8) ? destRect.left : destRect.right -ovl.getWidth();
                                        c.drawBitmap(ovl, width, height, null);
                                    }
                                    p.setXfermode(null);
                                    frameCounter++;
                                    if((System.currentTimeMillis() - start) >= 1000) {
                                        fps = String.valueOf(frameCounter)+" fps";
                                        frameCounter = 0; 
                                        start = System.currentTimeMillis();
                                        ovl = makeFpsOverlay(overlayPaint, fps);
                                    }
                                }
                            } catch (IOException e) {
                                e.getStackTrace();
                                Log.d(TAG, "catch IOException hit in run", e);
                            }
                        }
                    } finally { 
                        if (c != null) {
                            mSurfaceHolder.unlockCanvasAndPost(c); 
                        }
                    }
                }
            }
        }
    }

    private void init(Context context) {
        SurfaceHolder holder = getHolder();
        holder.addCallback(this);
        thread = new MjpegViewThread(holder, context);
        setFocusable(true);
        overlayPaint = new Paint();
        overlayPaint.setTextAlign(Paint.Align.LEFT);
        overlayPaint.setTextSize(12);
        overlayPaint.setTypeface(Typeface.DEFAULT);
        overlayTextColor = Color.WHITE;
        overlayBackgroundColor = Color.BLACK;
        ovlPos = MjpegView.POSITION_LOWER_RIGHT;
        displayMode = MjpegView.SIZE_STANDARD;
        dispWidth = getWidth();
        dispHeight = getHeight();
    }

    public void startPlayback() { 
        if(mIn != null) {
            mRun = true;
            thread.start();         
        }
    }

    public void stopPlayback() { 
        mRun = false;
        boolean retry = true;
        while(retry) {
            try {
                thread.join();
                retry = false;
            } catch (InterruptedException e) {
                e.getStackTrace();
                Log.d(TAG, "catch InterruptedException hit in stopPlayback", e);
            }
        }
    }

    public MjpegView(Context context, AttributeSet attrs) { 
        super(context, attrs); init(context); 
    }

    public void surfaceChanged(SurfaceHolder holder, int f, int w, int h) { 
        thread.setSurfaceSize(w, h); 
    }

    public void surfaceDestroyed(SurfaceHolder holder) { 
        surfaceDone = false; 
        stopPlayback(); 
    }

    public MjpegView(Context context) { 
        super(context);
        init(context); 
    }

    public void surfaceCreated(SurfaceHolder holder) { 
        surfaceDone = true; 
    }

    public void showFps(boolean b) { 
        showFps = b; 
    }

    public void setSource(MjpegInputStream source) { 
        mIn = source;
        startPlayback();
    }

    public void setOverlayPaint(Paint p) { 
        overlayPaint = p; 
    }

    public void setOverlayTextColor(int c) { 
        overlayTextColor = c; 
    }

    public void setOverlayBackgroundColor(int c) { 
        overlayBackgroundColor = c; 
    }

    public void setOverlayPosition(int p) { 
        ovlPos = p; 
    }

    public void setDisplayMode(int s) { 
        displayMode = s; 
    }
}

Nice work! For your problem with onResume(): isn't it enough to move the following code from onCreate() to onResume()?

    //sample public cam 
    String URL = "http://trackfield.webcam.oregonstate.edu/axis-cgi/mjpg/video.cgi?resolution=800x600&amp%3bdummy=1333689998337"; 

    mv = new MjpegView(this); 
    setContentView(mv);         

    new DoRead().execute(URL); 

Then you simply recreate the View and a new instance of the AsyncTask each time. I tried it and it works for me.
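
A minimal sketch of the rearranged MjpegActivity under that assumption (it reuses the question's MjpegView and DoRead classes and imports unchanged, so only the lifecycle methods are shown):

    public class MjpegActivity extends Activity {
        private MjpegView mv;

        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            requestWindowFeature(Window.FEATURE_NO_TITLE);
            getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                    WindowManager.LayoutParams.FLAG_FULLSCREEN);
        }

        @Override
        protected void onResume() {
            super.onResume();
            // Recreate the view and start a fresh AsyncTask every time the activity
            // returns to the foreground, so playback also restarts after the screen
            // has been locked and unlocked.
            String url = "http://trackfield.webcam.oregonstate.edu/axis-cgi/mjpg/video.cgi?resolution=800x600&amp%3bdummy=1333689998337";
            mv = new MjpegView(this);
            setContentView(mv);
            new DoRead().execute(url);
        }

        @Override
        protected void onPause() {
            super.onPause();
            mv.stopPlayback();
        }

        // DoRead is the same inner AsyncTask shown in the question.
    }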

For newcomers: if you want to access an IP camera that requires a username and password, add the following to your DefaultHttpClient and the code above will also work for cameras that require authentication:

    CredentialsProvider provider = new BasicCredentialsProvider();
    UsernamePasswordCredentials credentials = new UsernamePasswordCredentials("yourusername", "yourpassword");
    provider.setCredentials(AuthScope.ANY, credentials);
    DefaultHttpClient httpclient = new DefaultHttpClient();
    httpclient.setCredentialsProvider(provider);
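
As a rough sketch of how this could slot into the question's DoRead.doInBackground() (the credential classes come from the org.apache.http.auth, org.apache.http.client and org.apache.http.impl.client packages of the HttpClient library already used above; "yourusername"/"yourpassword" are placeholders):

    // additional imports needed in MjpegActivity.java
    import org.apache.http.auth.AuthScope;
    import org.apache.http.auth.UsernamePasswordCredentials;
    import org.apache.http.client.CredentialsProvider;
    import org.apache.http.impl.client.BasicCredentialsProvider;

    protected MjpegInputStream doInBackground(String... url) {
        // attach the camera's credentials to the client before executing the request
        CredentialsProvider provider = new BasicCredentialsProvider();
        provider.setCredentials(AuthScope.ANY,
                new UsernamePasswordCredentials("yourusername", "yourpassword"));

        DefaultHttpClient httpclient = new DefaultHttpClient();
        httpclient.setCredentialsProvider(provider);
        try {
            HttpResponse res = httpclient.execute(new HttpGet(URI.create(url[0])));
            return new MjpegInputStream(res.getEntity().getContent());
        } catch (IOException e) {
            Log.d(TAG, "Request failed", e);
        }
        return null;
    }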

Thanks for the code, it's very helpful.

I want to suggest a few optimization tips, which I already use in my own code; overall performance can easily be increased several times over.

  1. I removed memory allocations during frame reading where possible:

     private final static int HEADER_MAX_LENGTH = 100;
     private final static int FRAME_MAX_LENGTH = 200000 + HEADER_MAX_LENGTH;
     private final String CONTENT_LENGTH = "Content-Length:";
     private final String CONTENT_END = "\r\n";
     private final static byte[] gFrameData = new byte[FRAME_MAX_LENGTH];
     private final static byte[] gHeader = new byte[HEADER_MAX_LENGTH];
     BitmapFactory.Options bitmapOptions = new BitmapFactory.Options();

     public Bitmap readMjpegFrame() throws IOException {
         mark(FRAME_MAX_LENGTH);
         int headerLen = getStartOfSequence(SOI_MARKER);
         if (headerLen < 0) return null;
         reset();
         readFully(gHeader, 0, headerLen);
         int contentLen;
         try {
             contentLen = parseContentLength(gHeader, headerLen);
         } catch (NumberFormatException nfe) {
             nfe.getStackTrace();
             Log.d(TAG, "catch NumberFormatException hit", nfe);
             contentLen = getEndOfSequence(EOF_MARKER);
         }
         // decode straight out of the reusable buffer and let BitmapFactory reuse the bitmap
         readFully(gFrameData, 0, contentLen);
         Bitmap bm = BitmapFactory.decodeByteArray(gFrameData, 0, contentLen, bitmapOptions);
         bitmapOptions.inBitmap = bm;
         return bm;
     }
  2. Optimizing parseContentLength, removing String operations as much as possible:

     byte[] CONTENT_LENGTH_BYTES;
     byte[] CONTENT_END_BYTES;

     public MjpegInputStream(InputStream in) {
         super(new BufferedInputStream(in, FRAME_MAX_LENGTH));
         bitmapOptions.inSampleSize = 1;
         bitmapOptions.inPreferredConfig = Bitmap.Config.RGB_565;
         bitmapOptions.inPreferQualityOverSpeed = false;
         bitmapOptions.inPurgeable = true;
         try {
             CONTENT_LENGTH_BYTES = CONTENT_LENGTH.getBytes("UTF-8");
             CONTENT_END_BYTES = CONTENT_END.getBytes("UTF-8");
         } catch (UnsupportedEncodingException e) {
             e.printStackTrace();
         }
     }

     private int findPattern(byte[] buffer, int bufferLen, byte[] pattern, int offset) {
         int seqIndex = 0;
         for (int i = offset; i < bufferLen; ++i) {
             if (buffer[i] == pattern[seqIndex]) {
                 ++seqIndex;
                 if (seqIndex == pattern.length) {
                     return i + 1;
                 }
             } else {
                 seqIndex = 0;
             }
         }
         return -1;
     }

     private int parseContentLength(byte[] headerBytes, int length) throws IOException, NumberFormatException {
         int begin = findPattern(headerBytes, length, CONTENT_LENGTH_BYTES, 0);
         int end = findPattern(headerBytes, length, CONTENT_END_BYTES, begin) - CONTENT_END_BYTES.length;
         // converting string to int
         int number = 0;
         int radix = 1;
         for (int i = end - 1; i >= begin; --i) {
             if (headerBytes[i] > 47 && headerBytes[i] < 58) {
                 number += (headerBytes[i] - 48) * radix;
                 radix *= 10;
             }
         }
         return number;
     }

There could be mistakes in the code since I was rewriting it for Stack Overflow; originally I am using two threads, one reading frames and another rendering.
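
To make that two-thread layout concrete, here is a minimal, hypothetical sketch (the FramePipeline class, the queue size, and the view's drawFrame() method are made up for illustration and are not the answer's actual code):

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    import android.graphics.Bitmap;

    // One thread pulls frames off the network, the other renders them;
    // a small bounded queue keeps a slow draw from blocking the socket read.
    public class FramePipeline {
        private final BlockingQueue<Bitmap> queue = new ArrayBlockingQueue<Bitmap>(2);
        private volatile boolean running = true;

        public void start(final MjpegInputStream in, final MjpegView view) {
            Thread reader = new Thread(new Runnable() {
                public void run() {
                    while (running) {
                        try {
                            Bitmap frame = in.readMjpegFrame();
                            if (frame != null) {
                                queue.put(frame); // blocks briefly if the renderer falls behind
                            }
                        } catch (Exception e) {
                            running = false; // stream error or interrupt ends the pipeline
                        }
                    }
                }
            });

            Thread renderer = new Thread(new Runnable() {
                public void run() {
                    while (running) {
                        try {
                            view.drawFrame(queue.take()); // drawFrame() is a hypothetical method on the view
                        } catch (InterruptedException e) {
                            running = false;
                        }
                    }
                }
            });

            reader.start();
            renderer.start();
        }

        public void stop() {
            running = false;
        }
    }

Note that if frames are decoded into a reused bitmap (as with the inBitmap optimization above), each queued frame would need to be copied or double-buffered before handing it off to the renderer.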

I hope it will help someone.
