Sometimes we need to detect faces and facial features in an image. In iOS 5 this takes only a few lines of code: we can detect how many faces are visible in an image, and also the position of each face's eyes and mouth.
1) Import the CoreImage framework.
#import <CoreImage/CoreImage.h>
2) Create a CIImage from the picture already loaded into the image view.
CIImage* image = [CIImage imageWithCGImage:_imageView.image.CGImage];
3) Create a face detector.
CIDetector* detector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh forKey:CIDetectorAccuracy]];
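CIDetectorAccuracyHigh gives the best results but is slower. Core Image also accepts CIDetectorAccuracyLow, which may be worth trying if detection feels too slow; this is just a variation on the line above, not something the rest of the tutorial relies on.

CIDetector* fastDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyLow forKey:CIDetectorAccuracy]]; // faster, less precise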
4) Ask the detector for an array containing all the detected faces.
NSArray* features = [detector featuresInImage:image];
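The array holds one CIFaceFeature object per detected face, so its count tells us how many faces are in the picture. For example, to simply log it (a small addition to the steps above):

NSLog(@"Number of faces detected: %lu", (unsigned long)[features count]);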
5) We can iterate through every detected face. The CIFaceFeature class gives us the bounds of the face, the position of each eye and of the mouth, and BOOL properties indicating whether each eye and the mouth were actually found for that face.
Here we draw a red border around each face and a green circle over the mouth.
for(CIFaceFeature* faceFeature in features) {
    CGFloat faceWidth = faceFeature.bounds.size.width;

    UIView* faceView = [[UIView alloc] initWithFrame:faceFeature.bounds];
    faceView.layer.borderWidth = 1;
    faceView.layer.borderColor = [[UIColor redColor] CGColor];
    [_imageView addSubview:faceView];

    if(faceFeature.hasMouthPosition) {
        UIView* mouth = [[UIView alloc] initWithFrame:CGRectMake(faceFeature.mouthPosition.x - faceWidth*0.2,
                                                                 faceFeature.mouthPosition.y - faceWidth*0.2,
                                                                 faceWidth*0.4,
                                                                 faceWidth*0.4)];
        [mouth setBackgroundColor:[[UIColor greenColor] colorWithAlphaComponent:0.3]];
        [mouth setCenter:faceFeature.mouthPosition];
        mouth.layer.cornerRadius = faceWidth*0.2;
        [_imageView addSubview:mouth];
    }
}
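One caveat: Core Image reports these coordinates with the origin in the bottom-left corner, while UIKit views use a top-left origin, so the overlays can appear vertically mirrored. A minimal sketch of one way to compensate, assuming the image is displayed at its natural size inside _imageView (this flip is an addition, not part of the original code):

// Core Image uses a bottom-left origin, UIKit a top-left origin,
// so flip the detected geometry vertically before building the overlay views.
// Assumes the image is shown 1:1 inside _imageView.
CGAffineTransform flip = CGAffineTransformMakeScale(1, -1);
flip = CGAffineTransformTranslate(flip, 0, -_imageView.bounds.size.height);
CGRect faceRect = CGRectApplyAffineTransform(faceFeature.bounds, flip);
CGPoint mouthPoint = CGPointApplyAffineTransform(faceFeature.mouthPosition, flip);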