Get started with the Azure AI Vision Face UI SDK for iOS

In this sample, you will learn how to build and run the face liveness detection application.

Contents

API Reference Documentation

Prerequisites

  1. An Azure Face API resource subscription.
  2. A Mac with an iOS development environment (Xcode 13+) and an iPhone (iOS 14+).
  3. An Apple developer account to install and run development apps on the iPhone.

Step 1: Set up the environment

  1. For the best experience, do not open the sample project in Xcode until you have completed the environment setup.
  2. If this is your first time using your Mac for development, build a sample app from About Me — Sample Apps Tutorials | Apple Developer Documentation and run it on your phone before attempting to build the app here. This helps ensure that your development environment has been set up properly.
  3. Get the access token to access the release artifacts. More details can be found in GET_FACE_ARTIFACTS_ACCESS.md.
  4. Prepare Git LFS

      # install with homebrew
      brew install git-lfs
      # verify and initialize
      git lfs --version
      git lfs install
    
  5. The sample app project has been preconfigured to reference the SDK through Swift Package Manager (SPM). Configure the authorization of the git repository from which SPM will pull the package:

    1. Open your global git config file.
      # the following command shows the file path; open that file in your editor
      git config --global --show-origin --list | head -1
      # alternatively, the following command opens it in your default editor
      git config --global --edit
    
    2. Add the following lines to the global git config file. You may leave out the comments; they are provided here only for completeness.
      [credential "https://msface.visualstudio.com"]
              username = pat
              helper =
              helper = "!f() { test \"$1\" = get && echo \"password=INSERT_PAT_HERE\"; }; f"
    
              # get PAT from GET_FACE_ARTIFACTS_ACCESS.md and paste ^^^^^^^^^^^^^^^ above, replacing "INSERT_PAT_HERE".
              # username does not matter for PAT so long as it is not left blank.
              # the first blank helper line is necessary to override existing helpers and is not a typo.
    
  • For other dependency methods, such as CocoaPods, or for other methods of git authentication, please refer to the FAQ section of this document.
  6. If Xcode Command Line Tools has never been installed on your machine, install it first, following the instructions on the Apple Developer website.

Step 2: Build and run sample app

Build the sample

  1. Download the sample app folder and extract it, but do not open it yet.
  2. Run the following command in Terminal from the directory containing your .xcodeproj. It resolves the package through your system Git, which should already have Git LFS configured, as described in the Prerequisites section.

    xcodebuild -scmProvider system -resolvePackageDependencies
    
  3. Open the .xcodeproj file.
    Sample project opened

  4. Verify the package dependency through Swift Package Manager or other methods as described above.

  5. In Xcode → Targets → Signing & Capabilities, set the App bundle identifier and developer team.

    Signing & Capabilities

  6. Connect your iPhone to the Mac, then trust the Mac when prompted.

    Trust This Computer Enter Passcode to Trust

  7. Select your iPhone in the Xcode top bar.

    Select your iPhone

  8. Build and run the app.

Run the sample

  1. Allow camera permission when prompted.
  2. This sample creates the session token on the client, so it needs the API configuration. In a production scenario, this will not be necessary. For now, go to the settings page and configure:
    • API endpoint
    • Subscription key
  3. Try one of the buttons (such as “Liveness”) to begin testing.
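For reference, a client-side session-token request might look like the sketch below. This is for demos only; in production your app server should create the session and hand the short-lived token to the client. The REST path, request body, and the "authToken" response field are assumptions that may differ by API version; the obtainToken function in the sample's AppUtility.swift is authoritative.

```swift
import Foundation

// Demo-only sketch: create a liveness session token directly from the client,
// as this sample app does. Path and field names below are assumptions.
func createLivenessSessionToken(endpoint: String,
                                subscriptionKey: String,
                                completion: @escaping (String?) -> Void) {
    // hypothetical REST path; check the Face API version you are using
    guard let url = URL(string: "\(endpoint)/face/v1.1-preview.1/detectLiveness/singleModal/sessions") else {
        completion(nil)
        return
    }
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue(subscriptionKey, forHTTPHeaderField: "Ocp-Apim-Subscription-Key")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try? JSONSerialization.data(withJSONObject: ["livenessOperationMode": "Passive"])
    URLSession.shared.dataTask(with: request) { data, _, _ in
        var token: String? = nil
        if let data = data,
           let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any] {
            // "authToken" is an assumed response field name
            token = json["authToken"] as? String
        }
        completion(token)
    }.resume()
}
```

Moving this call to your app server keeps the subscription key off the device entirely, which is why the settings-page configuration above is needed only for this sample.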

Test out key scenarios

Liveness

  1. Tap “Liveness” then “Start” and show your face.
  2. The screen flashes for liveness analysis.
  3. Observe the Real/Spoof status.

LivenessWithVerify

  1. Tap “LivenessWithVerify” then select a reference face image.
  2. Show your face to the camera.
  3. Observe the Real/Spoof status, verification status, and confidence score.

Step 3: Integrate face liveness detection into your own application

  1. Configure your Xcode project

    1. In Xcode → Targets → Build Settings → Swift Compiler - Language, set C++ and Objective-C Interoperability to C++ / Objective-C++

      C++ / Objective-C++
    2. In Xcode → Targets → Info → Custom iOS Target Properties, add Privacy - Camera Usage Description.

      Privacy - Camera Usage Description
  2. Add the package dependency. For Swift Package Manager, add AzureAIVisionFaceUI.xcframework in Xcode → File → Add Package Dependencies. See FAQ for other dependency management tools.

    Xcode → File → Add Package Dependencies

    1. In the Search or Enter Package URL text box, enter https://msface.visualstudio.com/SDK/_git/AzureAIVisionFaceUI.xcframework.

      https://msface.visualstudio.com/SDK/_git/AzureAIVisionFaceUI.xcframework
    2. You will be prompted for credentials. Insert the token from GET_FACE_ARTIFACTS_ACCESS.md as the password. The username does not matter here, so long as it is not left blank.

      https://msface.visualstudio.com credentials
    3. Add the package

      Add Package
    4. Package resolution will fail. Select Add Anyway.

      Add Anyway
    5. AzureAIVisionFaceUI.xcframework will show up under Package Dependencies with a red crossed-out icon.

      Package not resolved
    6. Close Xcode window of your project.
    7. Run the following command in Terminal from the directory containing your .xcodeproj. It resolves the package through your system Git, which should already have Git LFS configured, as described in the Prerequisites section.
      xcodebuild -scmProvider system -resolvePackageDependencies
    
    8. After the command succeeds, open the project again in Xcode. The package should resolve properly.

      Package resolved
  3. Insert FaceLivenessDetectorView and respond to updates of the binding passed into your View. In the MainView.swift example, the View uses onChange(of:perform:) to demonstrate a more imperative way of handling the result, but you can also handle the result in a more declarative, SwiftUI-style way, like:

   struct HostView: View {
       @State var livenessDetectionResult: LivenessDetectionResult? = nil
       var token: String
       var body: some View {
           if livenessDetectionResult == nil {
               FaceLivenessDetectorView(result: $livenessDetectionResult,
                                        sessionAuthorizationToken: token)
           } else if let result = livenessDetectionResult {
               VStack {
                   switch result {
                   case .success(let success):
                       /// <#show success#>
                   case .failure(let error):
                       /// <#show failure#>
                   }
               }
           }
       }
   }
  4. Obtain the session authorization token from your service and update the view accordingly. See the obtainToken function in the sample, used in LaunchView.swift and defined in AppUtility.swift, for a basic token retrieval demo.

  5. Compare the digest from the client's LivenessDetectionSuccess instance with the digest in the service response to ensure integrity. For more details, see DeviceCheck | Apple Developer Documentation.
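As a sketch, the integrity check itself can be a straightforward equality comparison between the two digests. The parameter names below are placeholders for this illustration, not actual SDK fields; consult the LivenessDetectionSuccess API and your service's session-result response for the exact names.

```swift
// Placeholder sketch: "clientDigest" comes from the LivenessDetectionSuccess
// instance on the device; "serviceDigest" comes from your app server's query
// of the session result. Accept the session only if both digests match.
func sessionIntegrityVerified(clientDigest: String, serviceDigest: String) -> Bool {
    return clientDigest == serviceDigest
}
```

Performing this comparison on your app server, rather than trusting the client's verdict alone, is what makes the digest useful as an integrity check.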

FAQ

Q: How do we use CocoaPods or other package managers?

Add the following lines to your project's Podfile. 'YourBuildTargetNameHere' is an example target name; replace it with your actual target. You can also specify a version requirement as needed.

# add repo as source
source 'https://msface.visualstudio.com/SDK/_git/AzureAIVisionFaceUI.podspec'
target 'YourBuildTargetNameHere' do
   # add the pod here, optionally with version specification as needed
   pod 'AzureAIVisionFaceUI'
end

Also read: CocoaPods (CocoaPods Guides - Getting Started)

For other package managers, please consult their documentation and clone the framework repo manually.

Q: Are there alternatives for access authorization?

There are situations where keeping the example plaintext token inside the global git config may not be suitable, such as on automated build machines.

If you are using git-credential-manager, credential.azreposCredentialType needs to be set to pat.

The example above uses the credential.helper approach of git config. Aside from storing the token directly inside the config file, there are alternative ways to provide it to credential.helper; read the custom helpers section of the gitcredentials documentation for more information.
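For example, one hypothetical pattern for build machines is to read the PAT from an environment variable inside the helper instead of hard-coding it. FACE_SDK_PAT is a made-up variable name used here for illustration; in CI, inject it from your secret store.

```shell
# Sketch of the same helper shape used in Step 1, but reading the PAT from an
# environment variable rather than a hard-coded plaintext value.
FACE_SDK_PAT="example-pat"   # in CI, export this from your secret store

# git invokes the configured helper with "get" when it needs credentials:
f() { test "$1" = get && echo "password=${FACE_SDK_PAT}"; }

f get   # prints: password=example-pat
```

The corresponding helper line in the git config keeps the same `!f() { ... }; f` shape but echoes `password=${FACE_SDK_PAT}`; the variable must be exported in the environment in which git runs.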

To use the http.extraHeader approach of git config, you need to Base64-encode the token. Refer to the Use a PAT section of the Azure DevOps documentation. Note that instead of the git clone invocation shown in that example, you should call:

MY_PAT=accessToken
B64_PAT=$(printf "%s" ":${MY_PAT}" | base64)
git config --global http.https://msface.visualstudio.com/SDK.extraHeader "Authorization: Basic ${B64_PAT}"

For other types of Git installation, refer to the Credentials section of Git FAQ.

Q: How do I provide localization?

The SDK provides default localization for 75 locales. The strings can be customized for each localization by following this guide by Apple: Localizing and varying text with a string catalog. Please refer to this document for the keys of the strings.

Q: How do I customize the displayed strings?

Please refer to the localization FAQ answer above.