Face detection in iOS

Sachin Khard
Sep 17, 2019 · 2 min read

iOS gives you two ways to detect faces and face features, track objects, and more: the Core Image CIDetector API, which this post uses, and the newer Vision framework.
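For reference, here is a minimal sketch of what the Vision route (iOS 11+) looks like; `image` is assumed to be a UIImage backed by a CGImage:

import Vision

// Minimal sketch of face detection with the Vision framework (iOS 11+).
// Assumes `image` is a UIImage backed by a CGImage.
let request = VNDetectFaceRectanglesRequest { request, _ in
    guard let results = request.results as? [VNFaceObservation] else { return }
    for face in results {
        // boundingBox is normalized to 0...1, origin at the bottom-left.
        print("Found face at \(face.boundingBox)")
    }
}
let handler = VNImageRequestHandler(cgImage: image.cgImage!, options: [:])
try? handler.perform([request])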

In this blog, though, we will stick with CIDetector and see how to detect a face in an image:

  • First, you need an image to detect a face in. The image can come from the camera or from the Photos library on your iPhone; you can use UIImagePickerController for both:

let picker = UIImagePickerController()
picker.delegate = self

let alert = UIAlertController(title: nil, message: nil, preferredStyle: .actionSheet)

if UIImagePickerController.isSourceTypeAvailable(.camera) {
    alert.addAction(UIAlertAction(title: "Camera", style: .default, handler: { action in
        picker.sourceType = .camera
        self.present(picker, animated: true, completion: nil)
    }))
}

alert.addAction(UIAlertAction(title: "Photo Library", style: .default, handler: { action in
    picker.sourceType = .photoLibrary
    self.present(picker, animated: true, completion: nil)
}))

alert.addAction(UIAlertAction(title: "Cancel", style: .cancel, handler: nil))

alert.popoverPresentationController?.sourceRect = self.view.frame
self.present(alert, animated: true, completion: nil)
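Note that camera access requires an NSCameraUsageDescription entry in your app's Info.plist, and depending on the iOS version you target, photo library access may require NSPhotoLibraryUsageDescription as well.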

  • Now implement the delegate methods below to fetch the picked image:

extension ViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        dismiss(animated: true, completion: nil)
        // Avoid a force cast: the info dictionary may not contain an image for other media types.
        guard let pickedImage = info[.originalImage] as? UIImage else { return }
        self.image = pickedImage
    }
}
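You will usually also want to dismiss the picker when the user cancels. A minimal sketch of the companion delegate method (it belongs in the same extension):

// Called when the user taps Cancel without choosing an image.
func imagePickerControllerDidCancel(_ picker: UIImagePickerController) {
    dismiss(animated: true, completion: nil)
}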

  • After fetching the image, you can run face detection on it using CIDetector, like below:

let personciImage = CIImage(cgImage: image.cgImage!)

let accuracy = [CIDetectorAccuracy: CIDetectorAccuracyHigh]
let faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: accuracy)

// 5 is an EXIF orientation value; adjust it to match your image's orientation.
let imageOptions: [String: Any] = [CIDetectorImageOrientation: NSNumber(value: 5)]
let faces = faceDetector?.features(in: personciImage, options: imageOptions)

if let face = faces?.first as? CIFaceFeature {
    print("found bounds are \(face.bounds)")

    let alert = UIAlertController(title: "Say Cheese!", message: "We detected a face!", preferredStyle: .alert)
    alert.addAction(UIAlertAction(title: "OK", style: .default, handler: nil))
    alert.addAction(UIAlertAction(title: "Cancel", style: .cancel, handler: nil))
    self.present(alert, animated: true, completion: nil)
} else {
    let alert = UIAlertController(title: "No Face!", message: "No face was detected", preferredStyle: .alert)
    alert.addAction(UIAlertAction(title: "OK", style: .default, handler: nil))
    self.present(alert, animated: true, completion: nil)
}
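If you want to draw the detected rectangle over the photo, keep in mind that Core Image and UIKit use different coordinate systems. A minimal sketch, assuming the image is displayed at its natural size in an image view (imageView is a hypothetical outlet):

// Core Image reports face.bounds with the origin at the bottom-left;
// UIKit views use a top-left origin, so flip the rectangle before drawing.
// Assumes `face` is the CIFaceFeature found above and `imageView` is a
// hypothetical outlet showing the picked image at its natural size.
var transform = CGAffineTransform(scaleX: 1, y: -1)
transform = transform.translatedBy(x: 0, y: -personciImage.extent.height)
let faceViewBounds = face.bounds.applying(transform)

let faceBox = UIView(frame: faceViewBounds)
faceBox.layer.borderWidth = 2
faceBox.layer.borderColor = UIColor.red.cgColor
faceBox.backgroundColor = .clear
imageView.addSubview(faceBox)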

  • With the help of CIFaceFeature, we can also check whether the face is smiling, locate the eyes, and more (note the extra detector options described after this snippet):

if face.hasSmile {
    print("face is smiling")
}

if face.hasLeftEyePosition {
    print("Left eye position is \(face.leftEyePosition)")
}

if face.hasRightEyePosition {
    print("Right eye position is \(face.rightEyePosition)")
}
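One important detail: CIDetector only analyzes smiles and eye blinks if you opt in when requesting features. A minimal sketch of the opted-in call:

// Without CIDetectorSmile / CIDetectorEyeBlink in the options,
// hasSmile and the blink-related flags will always be false.
let featureOptions: [String: Any] = [
    CIDetectorSmile: true,
    CIDetectorEyeBlink: true,
    CIDetectorImageOrientation: NSNumber(value: 5)
]
let features = faceDetector?.features(in: personciImage, options: featureOptions)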

Hope you have learned the basics of face detection in iOS. Please let me know if you run into any issues so that I can assist you further on this topic.
