The Mobile Vision API provides a framework for recognizing objects in photos and videos. The framework includes detectors, which locate and describe visual objects in images or video frames, and an event-driven API that tracks the position of those objects in video.
The objects that can be tracked by the Mobile Vision API include facial features, text, and barcodes.
For our purposes, we will do real-time tracking of faces using a custom camera. So let's get started.
First, create a new project and add the following dependency to your app-level build.gradle file (app/build.gradle).
```groovy
dependencies {
    /* ... other dependencies ... */
    implementation 'com.google.android.gms:play-services-vision:17.0.2'
}
```
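The camera permission must also be declared in the app's AndroidManifest.xml, otherwise the runtime permission request shown later in this post can never be granted. A minimal declaration would look like the following (placed inside the `<manifest>` element; the `uses-feature` entry is optional):

```xml
<!-- Required: declares the camera permission that MainActivity requests at runtime. -->
<uses-permission android:name="android.permission.CAMERA" />
<!-- Optional: advertises front-camera usage without making it mandatory for install. -->
<uses-feature android:name="android.hardware.camera.front" android:required="false" />
```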
Next, open res/values/strings.xml and add the following strings.
```xml
<string name="take_photo">Blink your eyes to capture photo</string>
<string name="permission_required">Permission Required</string>
<string name="permission_message">You must grant permission to access the camera to run this application.</string>
<string name="permission_warning">All permissions are required.</string>
```
Now let's build the layout. Open activity_main.xml and add the code below.
```xml
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <SurfaceView
        android:id="@+id/surfaceView"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:visibility="gone" />

    <TextView
        android:id="@+id/tv_capture"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_gravity="center_horizontal|bottom"
        android:background="#8cffffff"
        android:padding="20dp"
        android:text="@string/take_photo"
        android:textStyle="bold"
        android:visibility="gone" />

</FrameLayout>
```
Here the SurfaceView hosts the preview for the camera instance that we will create later in this post. Both views start out hidden and are made visible once the camera permission is granted.
Now add the code below to MainActivity.java.
```java
package com.test.camerademo.ui;

import android.Manifest;
import android.app.AlertDialog;
import android.content.DialogInterface;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.os.Bundle;
import android.os.Handler;
import android.os.Looper;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.support.v4.content.ContextCompat;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.widget.Toast;

import com.google.android.gms.vision.CameraSource;
import com.google.android.gms.vision.face.FaceDetector;
import com.google.android.gms.vision.face.LargestFaceFocusingProcessor;
import com.test.camerademo.R;

import java.io.IOException;
import java.util.ArrayList;

import static android.Manifest.permission.CAMERA;

public class MainActivity extends AppCompatActivity implements SurfaceHolder.Callback, CameraSource.PictureCallback {

    public static final int CAMERA_REQUEST = 101;
    public static Bitmap bitmap;

    private SurfaceHolder surfaceHolder;
    private SurfaceView surfaceView;
    private String[] neededPermissions = new String[]{CAMERA};
    private FaceDetector detector;
    private CameraSource cameraSource;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        surfaceView = findViewById(R.id.surfaceView);
        detector = new FaceDetector.Builder(this)
                .setProminentFaceOnly(true) // optimize for a single, relatively large face
                .setTrackingEnabled(true)   // enable face tracking
                .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS) // eyes-open and smile probabilities
                .setMode(FaceDetector.FAST_MODE) // fine when tracking one face
                .build();
        if (!detector.isOperational()) {
            Log.w("MainActivity", "Detector dependencies are not yet available");
        } else {
            Log.w("MainActivity", "Detector dependencies are available");
            if (surfaceView != null) {
                if (checkPermission()) {
                    setViewVisibility(R.id.tv_capture);
                    setViewVisibility(R.id.surfaceView);
                    setupSurfaceHolder();
                }
            }
        }
    }

    private boolean checkPermission() {
        ArrayList<String> permissionsNotGranted = new ArrayList<>();
        for (String permission : neededPermissions) {
            if (ContextCompat.checkSelfPermission(this, permission) != PackageManager.PERMISSION_GRANTED) {
                permissionsNotGranted.add(permission);
            }
        }
        if (!permissionsNotGranted.isEmpty()) {
            boolean shouldShowAlert = false;
            for (String permission : permissionsNotGranted) {
                shouldShowAlert = shouldShowAlert
                        || ActivityCompat.shouldShowRequestPermissionRationale(this, permission);
            }
            if (shouldShowAlert) {
                showPermissionAlert(permissionsNotGranted.toArray(new String[permissionsNotGranted.size()]));
            } else {
                requestPermissions(permissionsNotGranted.toArray(new String[permissionsNotGranted.size()]));
            }
            return false;
        }
        return true;
    }

    private void showPermissionAlert(final String[] permissions) {
        AlertDialog.Builder alertBuilder = new AlertDialog.Builder(this);
        alertBuilder.setCancelable(true);
        alertBuilder.setTitle(R.string.permission_required);
        alertBuilder.setMessage(R.string.permission_message);
        alertBuilder.setPositiveButton(android.R.string.yes, new DialogInterface.OnClickListener() {
            public void onClick(DialogInterface dialog, int which) {
                requestPermissions(permissions);
            }
        });
        alertBuilder.create().show();
    }

    private void requestPermissions(String[] permissions) {
        ActivityCompat.requestPermissions(MainActivity.this, permissions, CAMERA_REQUEST);
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        if (requestCode == CAMERA_REQUEST) {
            for (int result : grantResults) {
                if (result == PackageManager.PERMISSION_DENIED) {
                    Toast.makeText(MainActivity.this, R.string.permission_warning, Toast.LENGTH_LONG).show();
                    checkPermission();
                    return;
                }
            }
            setViewVisibility(R.id.tv_capture);
            setViewVisibility(R.id.surfaceView);
            setupSurfaceHolder();
        }
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    }

    private void setViewVisibility(int id) {
        View view = findViewById(id);
        if (view != null) {
            view.setVisibility(View.VISIBLE);
        }
    }

    private void setupSurfaceHolder() {
        cameraSource = new CameraSource.Builder(this, detector)
                .setFacing(CameraSource.CAMERA_FACING_FRONT)
                .setRequestedFps(2.0f)
                .setAutoFocusEnabled(true)
                .build();
        surfaceHolder = surfaceView.getHolder();
        surfaceHolder.addCallback(this);
    }

    public void captureImage() {
        // Delay the capture by 200 ms so the captured frame is stable.
        // The Handler is bound to the main looper, so clickImage() runs on the UI thread.
        new Handler(Looper.getMainLooper()).postDelayed(new Runnable() {
            @Override
            public void run() {
                clickImage();
            }
        }, 200);
    }

    private void clickImage() {
        if (cameraSource != null) {
            cameraSource.takePicture(null, this);
        }
    }

    @Override
    public void surfaceCreated(SurfaceHolder surfaceHolder) {
        startCamera();
    }

    private void startCamera() {
        try {
            if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
                return;
            }
            // Focus on the largest face in view and forward its updates to our tracker.
            detector.setProcessor(new LargestFaceFocusingProcessor(detector, new GraphicFaceTracker(this)));
            cameraSource.start(surfaceHolder);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder surfaceHolder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
        cameraSource.stop();
    }

    @Override
    public void onPictureTaken(byte[] bytes) {
        bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
        // Save or display the image as per your requirements. Here we display it in PictureActivity.
        Intent intent = new Intent(this, PictureActivity.class);
        startActivity(intent);
    }
}
```
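The core of checkPermission() above is filtering the needed permissions down to those not yet granted. That logic can be exercised in plain Java, away from the Android framework; here the granted set stands in for what ContextCompat.checkSelfPermission would report on a device:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class PermissionFilter {

    // Returns the subset of needed permissions that are absent from the granted set.
    static List<String> notGranted(String[] needed, Set<String> granted) {
        List<String> missing = new ArrayList<>();
        for (String permission : needed) {
            if (!granted.contains(permission)) {
                missing.add(permission);
            }
        }
        return missing;
    }

    public static void main(String[] args) {
        Set<String> granted = new HashSet<>(Arrays.asList("android.permission.INTERNET"));
        String[] needed = {"android.permission.CAMERA", "android.permission.INTERNET"};
        // Only CAMERA should be reported as missing.
        System.out.println(notGranted(needed, granted));
    }
}
```

An empty result means every permission is already granted and the camera can be set up immediately, which mirrors the `return true` branch in checkPermission().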
Here we create a custom Tracker, GraphicFaceTracker, which tracks facial features. We use it to detect eye blinks.
Below is the code for GraphicFaceTracker.java.
```java
package com.test.camerademo.ui;

import android.util.Log;

import com.google.android.gms.vision.Tracker;
import com.google.android.gms.vision.face.Face;
import com.google.android.gms.vision.face.FaceDetector;

public class GraphicFaceTracker extends Tracker<Face> {

    private static final float OPEN_THRESHOLD = 0.85f;
    private static final float CLOSE_THRESHOLD = 0.4f;

    private final MainActivity mainActivity;
    private int state = 0;

    GraphicFaceTracker(MainActivity mainActivity) {
        this.mainActivity = mainActivity;
    }

    private void blink(float value) {
        switch (state) {
            case 0:
                if (value > OPEN_THRESHOLD) {
                    // Both eyes are initially open
                    state = 1;
                }
                break;
            case 1:
                if (value < CLOSE_THRESHOLD) {
                    // Both eyes become closed
                    state = 2;
                }
                break;
            case 2:
                if (value > OPEN_THRESHOLD) {
                    // Both eyes are open again
                    Log.i("Camera Demo", "blink has occurred!");
                    state = 0;
                    mainActivity.captureImage();
                }
                break;
            default:
                break;
        }
    }

    /**
     * Update the position/characteristics of the face within the overlay.
     */
    @Override
    public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) {
        float left = face.getIsLeftEyeOpenProbability();
        float right = face.getIsRightEyeOpenProbability();
        if ((left == Face.UNCOMPUTED_PROBABILITY) || (right == Face.UNCOMPUTED_PROBABILITY)) {
            // One of the eyes was not detected.
            return;
        }
        float value = Math.min(left, right);
        blink(value);
    }
}
```
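The blink logic above is a three-state machine: wait for the eyes to be open, then closed, then open again. Because it only depends on a stream of float probabilities, it can be replicated and verified in plain Java without a device; the class below is a stand-alone sketch using the same thresholds:

```java
// Plain-Java replica of the GraphicFaceTracker state machine, so the
// open -> closed -> open transition can be verified off-device.
public class BlinkDetector {

    private static final float OPEN_THRESHOLD = 0.85f;
    private static final float CLOSE_THRESHOLD = 0.4f;

    private int state = 0;
    private int blinkCount = 0;

    // Feed the minimum of the two eye-open probabilities for each frame.
    void onEyeOpenProbability(float value) {
        switch (state) {
            case 0: // waiting for the eyes to be open
                if (value > OPEN_THRESHOLD) state = 1;
                break;
            case 1: // eyes were open; waiting for them to close
                if (value < CLOSE_THRESHOLD) state = 2;
                break;
            case 2: // eyes were closed; reopening completes one blink
                if (value > OPEN_THRESHOLD) {
                    blinkCount++;
                    state = 0;
                }
                break;
        }
    }

    int getBlinkCount() {
        return blinkCount;
    }

    public static void main(String[] args) {
        BlinkDetector detector = new BlinkDetector();
        // One full open -> closed -> open cycle across six frames.
        float[] frames = {0.9f, 0.95f, 0.3f, 0.1f, 0.92f, 0.9f};
        for (float f : frames) {
            detector.onEyeOpenProbability(f);
        }
        System.out.println(detector.getBlinkCount()); // prints 1
    }
}
```

Note that a half-open eye (a value between the two thresholds) changes nothing in any state: the gap between CLOSE_THRESHOLD and OPEN_THRESHOLD acts as hysteresis, so noisy frames near a single threshold cannot register spurious blinks.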
Once the image has been captured, we display it in PictureActivity.
Below is the code for activity_picture.xml.
```xml
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".ui.PictureActivity">

    <ImageView
        android:id="@+id/img"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:scaleType="fitXY"
        android:src="@mipmap/ic_launcher" />

</RelativeLayout>
```
And the code for PictureActivity.java is below.
```java
package com.test.camerademo.ui;

import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.widget.ImageView;

import com.test.camerademo.R;

public class PictureActivity extends AppCompatActivity {

    private ImageView imageView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_picture);
        imageView = findViewById(R.id.img);
        imageView.setImageBitmap(MainActivity.bitmap);
    }
}
```
Now run the app: when you blink your eyes, an image is captured and displayed.
You can also read our blog post on the topic "Custom Camera using SurfaceView".
InnovationM is a global mobile app development company offering Android, iOS, and hybrid app development services.
That’s all for this post. Hope you enjoyed learning. 🙂