The Ultralytics YOLO iOS App makes it easy to experience the power of Ultralytics YOLO object detection models directly on your Apple device. Explore real-time detection capabilities with various models, bringing state-of-the-art computer vision to your fingertips.
Getting started with the Ultralytics YOLO iOS App is straightforward. Follow these steps to install and run the app on your iOS device.
Ensure you have the following before you begin:
- Xcode: The app requires Xcode installed on your macOS machine. You can download it from the Mac App Store.
- iOS Device: An iPhone or iPad running iOS 14.0 or later is needed for testing.
- Apple Developer Account: A free Apple Developer account is sufficient for testing on your device.
- Clone the Repository: Use `git` to clone the repository to your local machine.

  ```bash
  git clone https://github.com/ultralytics/yolo-ios-app.git
  cd yolo-ios-app  # Navigate into the cloned directory
  ```
- Open the Project in Xcode: Locate the `YOLO.xcodeproj` file within the cloned directory and open it using Xcode. In Xcode, navigate to the project's target settings. Under the "Signing & Capabilities" tab, select your Apple Developer account to sign the app.
- Add YOLO11 Models: You need models in the CoreML format to run inference on iOS. Export INT8 quantized CoreML models using the `ultralytics` Python package (install via `pip install ultralytics` - see our Quickstart Guide) or download pre-exported models from our GitHub release assets. Place the `.mlpackage` files into the corresponding `YOLO/{TaskName}Models` directory within the Xcode project (e.g., `YOLO/DetectModels`). Refer to the Ultralytics Export documentation for more details on exporting models for various deployment environments. A quick sanity check for an exported model is sketched after these installation steps.

  ```python
  from ultralytics import YOLO

  # Loop through different YOLO11 model sizes (nano, small, medium, large, x-large)
  for size in ("n", "s", "m", "l", "x"):
      # Load a YOLO11 PyTorch model for detection
      model = YOLO(f"yolo11{size}.pt")

      # Export the model to CoreML INT8 format with NMS layers (recommended for detection)
      # Ensure imgsz matches expected input size, e.g., [640, 384] for landscape video
      # INT8 quantization reduces model size and speeds up inference with minimal accuracy trade-off
      model.export(format="coreml", int8=True, nms=True, imgsz=[640, 384])

      # Example exports for other tasks (segmentation, classification, pose, OBB)
      # Note: For tasks other than detection, export without NMS (nms=False)

      # Segmentation: https://docs.ultralytics.com/tasks/segment/
      # seg_model = YOLO(f"yolo11{size}-seg.pt")
      # seg_model.export(format="coreml", int8=True, nms=False, imgsz=[640, 384])

      # Classification: https://docs.ultralytics.com/tasks/classify/
      # cls_model = YOLO(f"yolo11{size}-cls.pt")
      # cls_model.export(format="coreml", int8=True, nms=False, imgsz=[224, 224])  # Typical classification input size

      # Pose Estimation: https://docs.ultralytics.com/tasks/pose/
      # pose_model = YOLO(f"yolo11{size}-pose.pt")
      # pose_model.export(format="coreml", int8=True, nms=False, imgsz=[640, 384])

      # Oriented Bounding Box (OBB): https://docs.ultralytics.com/tasks/obb/
      # obb_model = YOLO(f"yolo11{size}-obb.pt")
      # obb_model.export(format="coreml", int8=True, nms=False, imgsz=[640, 384])
  ```
- Run the App: Connect your iOS device via USB. Select your device from the list of run targets in Xcode (next to the stop button). Click the Run button (▶) to build and install the app on your device.
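Before adding an exported model to Xcode, you can optionally sanity-check it from Python. The snippet below is a minimal sketch, assuming you run it on macOS (CoreML inference is only available on Apple hardware) and that `yolo11n.mlpackage` from the export script above sits in your current working directory; the test image URL is simply an illustrative example.

```python
from ultralytics import YOLO

# Load the exported CoreML package (assumes the export script above was run
# and yolo11n.mlpackage is in the current working directory)
coreml_model = YOLO("yolo11n.mlpackage")

# Run a quick prediction to confirm the package loads and produces detections
# (CoreML inference requires macOS / Apple hardware)
results = coreml_model("https://ultralytics.com/images/bus.jpg")
print(results[0].boxes)  # Print the detected bounding boxes
```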
The Ultralytics YOLO iOS App offers an intuitive user experience:
- Real-Time Inference: Launch the app and point your device's camera at objects. The app will perform real-time object detection, instance segmentation, pose estimation, image classification, or oriented bounding box detection depending on the selected task and model.
- Flexible Task Selection: Easily switch between different computer vision tasks supported by the loaded models using the app's interface.
- Multiple AI Models: Choose from a range of pre-loaded Ultralytics YOLO11 models, from the lightweight YOLO11n ('nano') optimized for edge devices to the powerful YOLO11x ('x-large') for maximum accuracy. You can also deploy and use custom models trained on your own data after exporting them to the CoreML format, as shown in the sketch below.
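Exporting a custom model uses the same call as the pre-trained weights. The sketch below is a minimal example; `path/to/best.pt` is a placeholder for your own trained checkpoint, and the image size should match what the app expects.

```python
from ultralytics import YOLO

# Load your own trained detection checkpoint (placeholder path - replace with your weights)
model = YOLO("path/to/best.pt")

# Export to INT8 CoreML with NMS, matching the settings used for the bundled detection models
model.export(format="coreml", int8=True, nms=True, imgsz=[640, 384])

# Then drag the resulting .mlpackage into the YOLO/DetectModels directory in Xcode
```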
The YOLO iOS App includes a suite of unit and integration tests to ensure functionality and reliability.
The test suite is designed to run with or without the actual CoreML model files:
- Without Models: Set `SKIP_MODEL_TESTS = true` in the test target's build settings (under `Build Settings` > `User-Defined`). This allows running tests that don't require model inference, such as UI tests or utility function tests.
- With Models: Set `SKIP_MODEL_TESTS = false` and ensure the required model files are added to the project in their respective directories as described below. This enables the full test suite, including tests that perform actual model inference.
To execute the complete test suite (with `SKIP_MODEL_TESTS = false`), include the following INT8 quantized CoreML models in your project:

- Detection: `yolo11n.mlpackage` (place in `YOLO/DetectModels`)
- Segmentation: `yolo11n-seg.mlpackage` (place in `YOLO/SegmentModels`)
- Pose Estimation: `yolo11n-pose.mlpackage` (place in `YOLO/PoseModels`)
- OBB Detection: `yolo11n-obb.mlpackage` (place in `YOLO/OBBModels`)
- Classification: `yolo11n-cls.mlpackage` (place in `YOLO/ClassifyModels`)
You can export these models using the Python script provided in the Installation section (a condensed sketch for just the nano variants follows below) or download them directly from the releases page.
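If you only need the nano models for testing, the following condensed sketch adapts the Installation script; the NMS and image-size settings mirror the comments in that script and may need adjusting for your setup.

```python
from ultralytics import YOLO

# Export the YOLO11 nano model for each task required by the test suite.
# Detection uses NMS; other tasks are exported without NMS, per the Installation script.
tasks = {
    "": dict(nms=True, imgsz=[640, 384]),       # Detection -> YOLO/DetectModels
    "-seg": dict(nms=False, imgsz=[640, 384]),  # Segmentation -> YOLO/SegmentModels
    "-pose": dict(nms=False, imgsz=[640, 384]), # Pose Estimation -> YOLO/PoseModels
    "-obb": dict(nms=False, imgsz=[640, 384]),  # OBB Detection -> YOLO/OBBModels
    "-cls": dict(nms=False, imgsz=[224, 224]),  # Classification -> YOLO/ClassifyModels
}

for suffix, kwargs in tasks.items():
    model = YOLO(f"yolo11n{suffix}.pt")
    model.export(format="coreml", int8=True, **kwargs)
```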
- Open the `YOLO.xcodeproj` project in Xcode.
- Navigate to the Test Navigator tab (represented by a diamond icon) in the left sidebar.
- Select the tests you wish to run (e.g., the entire `YOLOTests` suite or individual test functions).
- Click the Run button (play icon) next to your selection to execute the tests on a connected device or simulator.
Review the test files located within the `YOLOTests` directory for specific implementation details and test coverage.
Contributions power the open-source community! We welcome your involvement in improving Ultralytics projects. Your efforts, whether reporting bugs via GitHub Issues, suggesting features, or submitting code through Pull Requests, are greatly appreciated.
- Check out our Contributing Guide for detailed instructions on how to get involved.
- Share your feedback and insights on your experience by participating in our brief Survey.
- A big thank you 🙏 to all our contributors who help make our projects better!
Ultralytics provides two licensing options to accommodate different use cases:
- AGPL-3.0 License: Ideal for students, researchers, and enthusiasts who want to experiment, learn, and share their work openly. This OSI-approved license promotes collaboration and knowledge sharing within the open-source community. See the LICENSE file for the full terms and conditions.
- Enterprise License: Designed for commercial applications where integrating Ultralytics software into proprietary products or services is necessary. This license allows for commercial use without the open-source requirements of AGPL-3.0. If your project requires an Enterprise License, please contact us through the Ultralytics Licensing page.
For bug reports, feature requests, and contributions specifically related to the YOLO iOS App:
- Submit issues on GitHub Issues. Please provide detailed information for effective troubleshooting.
For general questions, support, and discussions about Ultralytics YOLO models, software, and other projects:
- Join our vibrant community on Discord. Connect with other users and the Ultralytics team.