Background Blur Fails with Unable to load TensorFlowLiteSegmentationProcessor Error #714

Open
luketehira opened this issue Mar 2, 2025 · 0 comments


luketehira commented Mar 2, 2025

Description

Use Case

I am implementing video background blur in an iOS app using the Amazon Chime SDK. The goal is to enable a blurred background during video calls using BackgroundBlurVideoFrameProcessor, which depends on the AmazonChimeSDKMachineLearning framework.

Steps to Reproduce

  1. Set up the project:
      • Create an iOS project in Xcode.
      • Integrate the Amazon Chime SDK with background blur support using one of the following methods (I tried both approaches):
          • Manual integration: Add AmazonChimeSDK.xcframework, AmazonChimeSDKMedia.xcframework, and AmazonChimeSDKMachineLearning.xcframework (versions 0.24.1 to 0.27.0) to the project under "Frameworks, Libraries, and Embedded Content" with "Embed & Sign".
          • Swift Package Manager (SPM): Add the Amazon Chime SDK package and include the AmazonChimeSDK, AmazonChimeSDKMedia, and AmazonChimeSDKMachineLearning products.
  2. Implement the background blur code, using the following function to set up the background blur processor:

private func addBackgroundBlurProcessor(blurStrength: BackgroundBlurStrength) {
    guard backgroundBlurProcessor == nil else { return }

    // Create the blur configuration and processor.
    // The error below is logged from this initializer (see Logs).
    let configuration = BackgroundBlurConfiguration(
        logger: ConsoleLogger(name: "BackgroundBlurProcessor"),
        blurStrength: blurStrength
    )
    let processor = BackgroundBlurVideoFrameProcessor(backgroundBlurConfiguration: configuration)
    self.backgroundBlurProcessor = processor

    // Chain the camera capture source into the blur processor, which is then
    // used as the video source for local video.
    cameraCaptureSource.addVideoSink(sink: processor)
    processor.videoContentHint = .none
}
      • Call the function with a medium blur strength: addBackgroundBlurProcessor(blurStrength: .medium).
      • Start the local video using the processor as the source: audioVideo.startLocalVideo(source: backgroundBlurProcessor). (The full call sequence is sketched after this list.)
  3. Run the app:
      • Build and run the app. I did this on an iPad Air (5th generation) with iOS 18.3.
      • Enable video in the app and trigger the background blur functionality.
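
For completeness, the overall call sequence looks roughly like the sketch below. This is a minimal sketch, assuming the surrounding class already owns audioVideo, cameraCaptureSource, and the backgroundBlurProcessor property shown above; startBlurredLocalVideo is a hypothetical helper name of my own.

// Minimal sketch of the call sequence. Assumes the surrounding class owns
// `audioVideo`, `cameraCaptureSource`, and `backgroundBlurProcessor`, and
// lives alongside addBackgroundBlurProcessor(blurStrength:) above.
func startBlurredLocalVideo() {
    // Build the blur pipeline (camera -> blur processor).
    addBackgroundBlurProcessor(blurStrength: .medium)

    // Begin capturing frames from the camera.
    cameraCaptureSource.start()

    // Use the blur processor (not the camera) as the local video source.
    if let processor = backgroundBlurProcessor {
        audioVideo.startLocalVideo(source: processor)
    }
}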

Logs

When the addBackgroundBlurProcessor function executes, the following error is logged:

[ERROR] BackgroundBlurProcessor - Unable to load TensorFlowLiteSegmentationProcessor. See `Update Project File` section in README for more information on how to import `AmazonChimeSDKMachineLearning` framework and the `selfie_segmentation_landscape.tflite` as a bundle resource to your project.
  • The error occurs specifically on the line where BackgroundBlurConfiguration is initialized:
let configuration = BackgroundBlurConfiguration(logger: ConsoleLogger(name: "BackgroundBlurProcessor"), blurStrength: blurStrength).

Expected Behaviour

  • The background blur processor should initialise successfully without errors.
  • The AmazonChimeSDKMachineLearning framework should load the required TensorFlow Lite model (selfie_segmentation_landscape.tflite) from its bundle.
  • When audioVideo.startLocalVideo(source: backgroundBlurProcessor) is called, the video feed should display with a blurred background at the specified strength (e.g., medium).

Actual Behaviour

  • The app logs the error [ERROR] BackgroundBlurProcessor - Unable to load TensorFlowLiteSegmentationProcessor when attempting to create the BackgroundBlurConfiguration.
  • The background blur feature fails to activate, and the video feed either remains unprocessed or does not start.
  • The error message suggests that the AmazonChimeSDKMachineLearning framework cannot find or load the selfie_segmentation_landscape.tflite file, despite the framework being included in the project.

Additional Details

  • Device: iPad Air (5th generation)
  • OS: iOS 18.3
  • SDK Versions Tested: Amazon Chime SDK versions 0.24.1 to 0.27.0

Integration Methods Tried:

  • Manually embedding XCFrameworks (v0.24.1 to v0.27.0).
  • Adding via Swift Package Manager.
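
For the SPM attempt, the package was added through Xcode's package UI. Expressed as a Package.swift manifest, the dependency looked roughly like the sketch below; the repository URL, product names, and the MyChimeApp target name are written from memory and should be treated as assumptions.

// Package.swift sketch of the SPM integration attempt. The package URL and
// product names are assumptions and may need checking against the
// amazon-chime-sdk-ios-spm repository.
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyChimeApp",
    platforms: [.iOS(.v15)],
    dependencies: [
        .package(url: "https://github.com/aws/amazon-chime-sdk-ios-spm", from: "0.27.0"),
    ],
    targets: [
        .target(
            name: "MyChimeApp",
            dependencies: [
                .product(name: "AmazonChimeSDK", package: "amazon-chime-sdk-ios-spm"),
                .product(name: "AmazonChimeSDKMedia", package: "amazon-chime-sdk-ios-spm"),
                .product(name: "AmazonChimeSDKMachineLearning", package: "amazon-chime-sdk-ios-spm"),
            ]
        ),
    ]
)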

Observations

  • The AmazonChimeSDKMachineLearning.xcframework is present in the project and embedded correctly (verified in the Xcode project navigator).
  • I also tried manually adding selfie_segmentation_landscape.tflite to the project so that it ended up in the root of the app bundle, but this did not help (see the diagnostic sketch after this list).
  • The README’s "Update Project File" section presumably describes additional steps (e.g., linking the .tflite file as a bundle resource), but it is unclear without further guidance.
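
To see where (if anywhere) the model ends up at runtime, I used a small diagnostic along the lines of the sketch below. The fully qualified class name passed to NSClassFromString is an assumption based on the error text, and logSegmentationModelLocation is a hypothetical helper of my own.

// Diagnostic sketch: print where (if anywhere) the segmentation model can be
// found at runtime. The class name passed to NSClassFromString is an
// assumption taken from the error message.
import Foundation

func logSegmentationModelLocation() {
    // Check the app's main bundle (where I copied the .tflite manually).
    let mainPath = Bundle.main.path(forResource: "selfie_segmentation_landscape", ofType: "tflite")
    print("Model in main bundle: \(mainPath ?? "not found")")

    // Try to resolve the processor class dynamically, the way the error
    // suggests the SDK loads it, and inspect that class's bundle.
    if let processorClass = NSClassFromString("AmazonChimeSDKMachineLearning.TensorFlowLiteSegmentationProcessor") {
        let mlBundle = Bundle(for: processorClass)
        let frameworkPath = mlBundle.path(forResource: "selfie_segmentation_landscape", ofType: "tflite")
        print("ML framework bundle: \(mlBundle.bundlePath)")
        print("Model in ML framework bundle: \(frameworkPath ?? "not found")")
    } else {
        print("TensorFlowLiteSegmentationProcessor could not be resolved at runtime")
    }
}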

Request

  • Could this be a bug in the SDK where the selfie_segmentation_landscape.tflite file is not properly bundled in the XCFramework?
  • If not, what specific steps are required to include selfie_segmentation_landscape.tflite as a bundle resource when using XCFrameworks or SPM? The README instructions are vague on this point.
  • Any insights or fixes to resolve the Unable to load TensorFlowLiteSegmentationProcessor error would be greatly appreciated.

Similar issue

#621

@luketehira luketehira changed the title [ERROR] BackgroundBlurProcessor - Unable to load TensorFlowLiteSegmentationProcessor. See Update Project File section in README for more information on how to import AmazonChimeSDKMachineLearning framework and the selfie_segmentation_landscape.tflite Background Blur Fails with Unable to load TensorFlowLiteSegmentationProcessor Error Mar 4, 2025