Android Detect Face
In this post I want to describe how to detect faces in an image using Android. I will use the camera to take the picture, and then the app will draw rectangles around the faces detected in the image. To do it we simply use two classes from the Android API:
- FaceDetector, used to detect faces in the bitmap
- FaceDetector.Face, which holds the information about a detected face
So we have to make three steps in our app:
- Take the picture
- Detect faces in the picture
- Draw rectangles around the faces detected
Take the picture with camera using Android
This step is quite simple because we have simply to invoke an intent and get ready to receive the result. The Intent is very simple:
Intent cameraIntent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(cameraIntent, CAMERA_RETURN_CODE);
To retrieve the result (the picture taken) we have to override a method in our activity:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode == CAMERA_RETURN_CODE) {
        Bitmap cameraBmp = (Bitmap) data.getExtras().get("data");
    }
}
Inside the callback we retrieve our picture from the intent extras using the "data" key. Note that the bitmap stored under this key is only a small thumbnail of the capture; a full-size image would require passing an output URI via MediaStore.EXTRA_OUTPUT.
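The callback can also be made a bit more defensive; the RESULT_OK and null checks below are my additions, not part of the original example:

```java
// Sketch: same callback as above, with guards against a cancelled
// capture (resultCode != RESULT_OK) or a missing result intent.
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == CAMERA_RETURN_CODE && resultCode == RESULT_OK && data != null) {
        Bitmap cameraBmp = (Bitmap) data.getExtras().get("data");
        // hand cameraBmp to the face detection step
    }
}
```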
Detect faces in the picture
At this point, we use the FaceDetector API, which has shipped with the Android SDK since API level 1. To implement our logic we create a custom component that extends ImageView; we call it FaceImageView.
public class FaceImageView extends ImageView {
    ...
}
To detect the faces, we first have to convert the Bitmap to the RGB_565 format; as stated in the API documentation, the detector cannot process the image otherwise. So we have:
Bitmap tmpBmp = image.copy(Config.RGB_565, true);
Now we have our image in the right format, we create an instance of the FaceDetector:
FaceDetector faceDet = new FaceDetector(tmpBmp.getWidth(), tmpBmp.getHeight(), MAX_FACES);
passing the image width, the image height, and the maximum number of faces we want to detect (in our case a simple constant). Now we invoke the findFaces method to detect the faces; it fills an array of Face instances whose length must be at least the number of faces we want to detect:
faceDet.findFaces(tmpBmp, faceList);
where faceList is our array.
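The array has to be allocated by the caller before the call, and findFaces also returns how many faces it actually found. A minimal sketch, reusing faceDet, tmpBmp, and the MAX_FACES constant from the code above:

```java
// Sketch: the caller allocates the result array; findFaces fills it
// and returns the number of faces actually found.
FaceDetector.Face[] faceList = new FaceDetector.Face[MAX_FACES];
int found = faceDet.findFaces(tmpBmp, faceList);
// Entries from index `found` onward remain null, so each item
// must be null-checked before use.
```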
Now we have to analyze each item in our array and get the result. We simply want to get the face midpoint and the eyes distance. We use these two information to draw the rectangle for each face detected:
for (int i = 0; i < faceList.length; i++) {
    Face face = faceList[i];
    Log.d("FaceDet", "Face [" + face + "]");
    if (face != null) {
        Log.d("FaceDet", "Face [" + i + "] - Confidence [" + face.confidence() + "]");
        PointF pf = new PointF();
        face.getMidPoint(pf);
        Log.d("FaceDet", "\t Eyes distance [" + face.eyesDistance() + "] - Face midpoint [" + pf + "]");
        RectF r = new RectF();
        r.left = pf.x - face.eyesDistance() / 2;
        r.right = pf.x + face.eyesDistance() / 2;
        r.top = pf.y - face.eyesDistance() / 2;
        r.bottom = pf.y + face.eyesDistance() / 2;
        rects[i] = r;
    }
}
For each detected face we compute the rectangle edges from the midpoint and the eyes distance, and we store them in a RectF object. To run the app we can use a real smartphone or an AVD configured in our dev environment; in the latter case, be sure the AVD is configured with a camera.
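The rectangle math itself is plain arithmetic and can be checked outside Android. Below is a small self-contained Java sketch of the same computation, using a float array instead of RectF so it runs on a plain JVM; the class and method names are mine, not from the original source:

```java
// Minimal sketch of the face-rectangle math used in the loop above.
public class FaceRectDemo {
    // Returns {left, top, right, bottom} for a square centered on the
    // face midpoint whose side length equals the eyes distance.
    static float[] faceRect(float midX, float midY, float eyesDistance) {
        float half = eyesDistance / 2;
        return new float[] { midX - half, midY - half, midX + half, midY + half };
    }

    public static void main(String[] args) {
        float[] r = faceRect(100f, 80f, 40f);
        System.out.println(r[0] + " " + r[1] + " " + r[2] + " " + r[3]);
        // prints "80.0 60.0 120.0 100.0"
    }
}
```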
Running the example and looking at the log, we notice that the app detected one face, while the other array entries are null because no more faces exist in the picture.
Draw rectangles around the faces detected
The last step is drawing rectangles around the faces detected. In this case we can use the information, retrieved before, in this way:
@Override
protected void onDraw(Canvas canvas) {
    super.onDraw(canvas);
    Paint p = new Paint();
    canvas.drawBitmap(image, 0, 0, p);
    Paint rectPaint = new Paint();
    rectPaint.setStrokeWidth(2);
    rectPaint.setColor(Color.BLUE);
    rectPaint.setStyle(Style.STROKE);
    for (int i = 0; i < MAX_FACES; i++) {
        RectF r = rects[i];
        if (r != null)
            canvas.drawRect(r, rectPaint);
    }
}
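For onDraw to have something to render, the bitmap and the rectangles computed earlier must be handed to the view and a redraw requested. A possible sketch (the field and method names here are my assumption, not from the original source):

```java
// Inside FaceImageView: fields filled by the activity; invalidate()
// schedules a call to the onDraw() method shown above.
private Bitmap image;
private RectF[] rects;

public void setFaces(Bitmap image, RectF[] rects) {
    this.image = image;
    this.rects = rects;
    invalidate(); // triggers a redraw
}
```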
- Source code available @ github