
Draw Rectangle Around Face on Live CameraPreview in Real-Time Face Detection (Android)

I am creating an app that opens the phone's camera. I am using FaceDetector (Google ML Kit) for real-time face detection, and I want to draw a rectangle around the detected face on the live camera preview. I have not been able to find a working answer. Can anyone please help me? Thanks in advance. I am attaching my code for reference.

This is my activity_main.xml file:

<LinearLayout
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools">

    <FrameLayout
        android:id="@+id/previewView_container"
        android:layout_width="match_parent"
        android:layout_height="match_parent">

        <androidx.camera.view.PreviewView
            android:id="@+id/previewView"
            android:layout_width="match_parent"
            android:layout_height="match_parent"/>
    </FrameLayout>
</LinearLayout>
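One common way to host a rectangle overlay is to stack a transparent custom view on top of the `PreviewView` inside the existing `FrameLayout`. A hypothetical sketch of the layout (the `com.example.FaceOverlayView` class is an assumption, a custom view you would implement yourself with an overridden `onDraw()`):

```xml
<FrameLayout
    android:id="@+id/previewView_container"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <androidx.camera.view.PreviewView
        android:id="@+id/previewView"
        android:layout_width="match_parent"
        android:layout_height="match_parent"/>

    <!-- Hypothetical custom view drawn on top of the preview.
         It would store the latest bounding box and draw it in onDraw(). -->
    <com.example.FaceOverlayView
        android:id="@+id/faceOverlay"
        android:layout_width="match_parent"
        android:layout_height="match_parent"/>
</FrameLayout>
```

Because both children fill the `FrameLayout`, the overlay's coordinate space matches the preview's, so a rectangle drawn at mapped view coordinates lines up with the camera image.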

Below is my MainActivity.java file:

public class MainActivity extends AppCompatActivity{
FaceDetector detector;
private ListenableFuture<ProcessCameraProvider> cameraProviderFuture;
PreviewView previewView;
CameraSelector cameraSelector;
boolean start = true,flipX=false;
int cam_face=CameraSelector.LENS_FACING_FRONT;
ProcessCameraProvider cameraProvider;

@Override
protected void onCreate(Bundle savedInstanceState) {
   super.onCreate(savedInstanceState);
   setContentView(R.layout.activity_main);


   FaceDetectorOptions highAccuracyOpts =
          new FaceDetectorOptions.Builder()
               .setPerformanceMode(FaceDetectorOptions.PERFORMANCE_MODE_ACCURATE)
               .setContourMode(FaceDetectorOptions.CONTOUR_MODE_ALL)
               .build();
   detector = FaceDetection.getClient(highAccuracyOpts);

   cameraBind();
}
  

private void cameraBind(){

    cameraProviderFuture = ProcessCameraProvider.getInstance(this);
    previewView=findViewById(R.id.previewView);
    cameraProviderFuture.addListener(() -> {
        try {
            cameraProvider = cameraProviderFuture.get();
            bindPreview(cameraProvider);
        } catch (ExecutionException | InterruptedException e) {
            // No errors need to be handled for this in Future.
            // This should never be reached.
        }
    }, ContextCompat.getMainExecutor(this));
}

void bindPreview(@NonNull ProcessCameraProvider cameraProvider) {

    Preview preview = new Preview.Builder()
            .build();

    cameraSelector = new CameraSelector.Builder()
            .requireLensFacing(cam_face)
            .build();

    preview.setSurfaceProvider(previewView.getSurfaceProvider());
    ImageAnalysis imageAnalysis = new ImageAnalysis.Builder()
                    .setTargetResolution(new Size(640, 480))
                    .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
                    .build();

    Executor executor = Executors.newSingleThreadExecutor();
    imageAnalysis.setAnalyzer(executor, new ImageAnalysis.Analyzer() {
        @Override
        public void analyze(@NonNull ImageProxy imageProxy) {

            InputImage image = null;
            @SuppressLint("UnsafeExperimentalUsageError")
            Image mediaImage = imageProxy.getImage();

            if (mediaImage != null) {
                image = InputImage.fromMediaImage(mediaImage, imageProxy.getImageInfo().getRotationDegrees());
            }
            if (image != null) {
                detector.process(image)
                    .addOnSuccessListener(
                        new OnSuccessListener<List<Face>>() {
                            @Override
                            public void onSuccess(List<Face> faces) {
                                if (faces.size() != 0) {
                                    Face face = faces.get(0);
                                    Bitmap frame_bmp = toBitmap(mediaImage);
                                    int rot = imageProxy.getImageInfo().getRotationDegrees();
                                    Bitmap frame_bmp1 = rotateBitmap(frame_bmp, rot, flipX, false);
                                    RectF boundingBox = new RectF(face.getBoundingBox());
                                    Bitmap croppedFace = getCropBitmapByCPU(frame_bmp1, boundingBox);
                                    Bitmap scaled = getResizedBitmap(croppedFace, 112, 112);
                                    // will pass this scaled bitmap to model

                                    // Attempted overlay drawing; this never shows
                                    // because the Canvas is not backed by any view:
                                    // Canvas canvas = new Canvas();
                                    // Paint paint = new Paint();
                                    // paint.setColor(Color.GREEN);
                                    // paint.setStyle(Paint.Style.STROKE);
                                    // paint.setStrokeWidth(3);
                                    // canvas.drawRect(boundingBox, paint);

                                    try {
                                        Thread.sleep(100);
                                    } catch (InterruptedException e) {
                                        e.printStackTrace();
                                    }
                                }

                            }
                        })
                    .addOnFailureListener(
                        new OnFailureListener() {
                            @Override
                            public void onFailure(@NonNull Exception e) {
                            }
                        })
                    .addOnCompleteListener(new OnCompleteListener<List<Face>>() {
                        @Override
                        public void onComplete(@NonNull Task<List<Face>> task) {
                            imageProxy.close();
                        }
                    });
            }
        }
    });
    cameraProvider.bindToLifecycle((LifecycleOwner) this, cameraSelector, imageAnalysis, preview);
}

private Bitmap toBitmap(Image image) {

    byte[] nv21=YUV_420_888toNV21(image);
    YuvImage yuvImage = new YuvImage(nv21, ImageFormat.NV21, image.getWidth(), image.getHeight(), null);
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    yuvImage.compressToJpeg(new Rect(0, 0, yuvImage.getWidth(), yuvImage.getHeight()), 100, out);
    byte[] imageBytes = out.toByteArray();
    return BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
}

private static byte[] YUV_420_888toNV21(Image image) {

    int width = image.getWidth();
    int height = image.getHeight();
    int ySize = width*height;
    int uvSize = width*height/4;
    byte[] nv21 = new byte[ySize + uvSize*2];
    ByteBuffer yBuffer = image.getPlanes()[0].getBuffer();
    ByteBuffer uBuffer = image.getPlanes()[1].getBuffer();
    ByteBuffer vBuffer = image.getPlanes()[2].getBuffer();
    int rowStride = image.getPlanes()[0].getRowStride();
    assert(image.getPlanes()[0].getPixelStride() == 1);
    int pos = 0;
    if (rowStride == width) {
        yBuffer.get(nv21, 0, ySize);
        pos += ySize;
    }
    else {
        long yBufferPos = -rowStride;
        for (; pos<ySize; pos+=width) {
            yBufferPos += rowStride;
            yBuffer.position((int) yBufferPos);
            yBuffer.get(nv21, pos, width);
        }
    }
    rowStride = image.getPlanes()[2].getRowStride();
    int pixelStride = image.getPlanes()[2].getPixelStride();
    assert(rowStride == image.getPlanes()[1].getRowStride());
    assert(pixelStride == image.getPlanes()[1].getPixelStride());
    if (pixelStride == 2 && rowStride == width && uBuffer.get(0) == vBuffer.get(1)) {
        byte savePixel = vBuffer.get(1);
        try {
            vBuffer.put(1, (byte)~savePixel);
            if (uBuffer.get(0) == (byte)~savePixel) {
                vBuffer.put(1, savePixel);
                vBuffer.position(0);
                uBuffer.position(0);
                vBuffer.get(nv21, ySize, 1);
                uBuffer.get(nv21, ySize + 1, uBuffer.remaining());
                return nv21;
            }
        }
        catch (ReadOnlyBufferException ex) {
            // unfortunately, we cannot check if vBuffer and uBuffer overlap
        }
        vBuffer.put(1, savePixel);
    }
    for (int row=0; row<height/2; row++) {
        for (int col=0; col<width/2; col++) {
            int vuPos = col*pixelStride + row*rowStride;
            nv21[pos++] = vBuffer.get(vuPos);
            nv21[pos++] = uBuffer.get(vuPos);
        }
    }
    return nv21;
}

private static Bitmap rotateBitmap(Bitmap bitmap, int rotationDegrees, boolean flipX, boolean flipY) {

    Matrix matrix = new Matrix();
    matrix.postRotate(rotationDegrees);
    matrix.postScale(flipX ? -1.0f : 1.0f, flipY ? -1.0f : 1.0f);
    Bitmap rotatedBitmap = Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), matrix, true);
    if (rotatedBitmap != bitmap) {
        bitmap.recycle();
    }
    return rotatedBitmap;
}

public Bitmap getResizedBitmap(Bitmap bm, int newWidth, int newHeight) {

    int width = bm.getWidth();
    int height = bm.getHeight();
    float scaleWidth = ((float) newWidth) / width;
    float scaleHeight = ((float) newHeight) / height;
    Matrix matrix = new Matrix();
    matrix.postScale(scaleWidth, scaleHeight);
    Bitmap resizedBitmap = Bitmap.createBitmap(bm, 0, 0, width, height, matrix, false);
    bm.recycle();
    return resizedBitmap;
}

private static Bitmap getCropBitmapByCPU(Bitmap source, RectF cropRectF) {

    Bitmap resultBitmap = Bitmap.createBitmap((int) cropRectF.width(), (int) cropRectF.height(), Bitmap.Config.ARGB_8888);
    Canvas canvas = new Canvas(resultBitmap);
    Paint paint = new Paint(Paint.FILTER_BITMAP_FLAG);
    paint.setColor(Color.WHITE);
    canvas.drawRect(new RectF(0, 0, cropRectF.width(), cropRectF.height()), paint);
    Matrix matrix = new Matrix();
    matrix.postTranslate(-cropRectF.left, -cropRectF.top);
    canvas.drawBitmap(source, matrix, paint);
    if (source != null && !source.isRecycled()) {
        source.recycle();
    }
    return resultBitmap;
}
}

After getting the bounding box from the detected face, you can draw a rectangle around it on an overlay view. Please refer to the ML Kit vision quickstart sample app to see how to apply such a UI to detected faces.
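The step the sample apps handle that is easy to miss is mapping the bounding box from analysis-image coordinates into view coordinates: the analysis frame (e.g. 640x480) and the `PreviewView` have different sizes, and with `PreviewView`'s default `FILL_CENTER` scale type the image is center-cropped to fill the view. A minimal plain-Java sketch of that mapping, assuming a `FILL_CENTER`-style center-crop and a horizontally mirrored front-camera preview (the class and method names here are illustrative, not from any library):

```java
// Maps a face bounding box from (rotated) analysis-image coordinates into
// preview-view coordinates, assuming the image is scaled with a center-crop
// (FILL_CENTER) to fill the view, as PreviewView does by default.
public class FaceBoxMapper {

    // Returns {left, top, right, bottom} in view coordinates.
    public static float[] mapBox(float left, float top, float right, float bottom,
                                 int imageWidth, int imageHeight,
                                 int viewWidth, int viewHeight,
                                 boolean mirror) {
        // Center-crop: scale by the larger ratio, then center the overflow.
        float scale = Math.max((float) viewWidth / imageWidth,
                               (float) viewHeight / imageHeight);
        float dx = (viewWidth - imageWidth * scale) / 2f;
        float dy = (viewHeight - imageHeight * scale) / 2f;

        float l = left * scale + dx;
        float r = right * scale + dx;
        float t = top * scale + dy;
        float b = bottom * scale + dy;

        if (mirror) {
            // Front-camera previews are mirrored horizontally, so flip
            // the box across the vertical center line of the view.
            float flippedLeft = viewWidth - r;
            float flippedRight = viewWidth - l;
            l = flippedLeft;
            r = flippedRight;
        }
        return new float[]{l, t, r, b};
    }

    public static void main(String[] args) {
        // Image and view the same size: mapping is the identity.
        float[] box = mapBox(100, 200, 300, 400, 480, 640, 480, 640, false);
        System.out.println(box[0] + " " + box[1] + " " + box[2] + " " + box[3]);
    }
}
```

In the app, the `onSuccess` callback would call something like this mapper with `face.getBoundingBox()`, pass the result to the overlay view, and call `invalidate()` so `onDraw()` repaints; `imageWidth`/`imageHeight` must be the frame's dimensions after applying the rotation degrees from `imageProxy.getImageInfo()`.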
