An iOS-first augmented reality app that detects windows and measures their real-world dimensions using LiDAR depth sensing and on-device machine learning.
For AI agents: This project is iOS-only. No web or Android support. Do not add web config,
`react-native-web`, `react-dom`, or platform guards for web. Use `expo run:ios --dev-client` for development.
- iPhone 12 Pro / 13 Pro / 14 Pro / 15 Pro or iPad Pro (LiDAR required)
- Xcode 15+
- macOS
- Node.js LTS
- pnpm or npm
```sh
# Install dependencies
pnpm install

# Build custom dev client
npx expo run:ios --dev-client

# Start dev server
npx expo start --dev-client
```

Scan the QR code on your device to connect to the dev server.
- Window (ExecuTorch) – Single tab: `react-native-executorch`, `window.pte` model, live camera frames. A TFLite adapter (`window-detector.ts` + `detector-adapter.ts`) exists in the codebase for optional/alternate use.
Camera Frame → Detector (ExecuTorch or TFLite)
↓
WindowBoundingBox2D (normalized coordinates)
↓
Overlay + confidence display
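The detector emits normalized coordinates, so the overlay must scale them to view points before drawing. A minimal sketch of that mapping — the `WindowBoundingBox2D` shape below is an assumption for illustration; the real type lives in `types.ts`:

```typescript
// Hypothetical shape; the actual definition lives in src/features/measurement/types.ts.
interface WindowBoundingBox2D {
  x: number;      // left edge, normalized 0..1
  y: number;      // top edge, normalized 0..1
  width: number;  // normalized 0..1
  height: number; // normalized 0..1
  confidence: number;
}

// Scale a normalized detection into view coordinates for the SVG overlay.
function toViewRect(
  box: WindowBoundingBox2D,
  viewWidth: number,
  viewHeight: number,
): { left: number; top: number; width: number; height: number } {
  return {
    left: box.x * viewWidth,
    top: box.y * viewHeight,
    width: box.width * viewWidth,
    height: box.height * viewHeight,
  };
}
```

Keeping detector output normalized means the same box works regardless of camera resolution or screen size; only the overlay knows the view dimensions.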
src/features/measurement/
├── tabs/
│ ├── detection-custom-executorch-camera-tab-screen.tsx # ExecuTorch pipeline (main tab)
│ └── debug-panel.tsx
├── window-detector.ts # TFLite model (adapter path)
├── executorch-window-detector.ts
├── detector-adapter.ts # Seam for ExecuTorch vs TFLite
├── native-arkit-provider.ts # LiDAR raycast via window-detector-arkit
├── ar-session.ts
├── geometry.ts
├── types.ts
├── runtime-store.ts / runtime-stage.ts / runtime-bus.ts / runtime-context.tsx
├── blocking-screen.tsx
├── refine-overlay.tsx
└── pricing.ts
Native module: modules/window-detector-arkit (ARKit LiDAR raycast, used by native-arkit-provider).
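The real-world measurement comes from raycasting two screen points (e.g. the bounding box's left and right edges) into world space via the ARKit module, then taking the distance between the hit points. A hedged sketch of that last step — the `Vec3` type and hit-point naming are assumptions, not the native module's actual API:

```typescript
// Assumed shape of a LiDAR raycast hit in world space (meters).
interface Vec3 { x: number; y: number; z: number; }

// Euclidean distance between two world-space hit points, in meters.
function distanceMeters(a: Vec3, b: Vec3): number {
  const dx = b.x - a.x;
  const dy = b.y - a.y;
  const dz = b.z - a.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// e.g. window width ≈ distanceMeters(leftEdgeHit, rightEdgeHit)
```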
- ✅ ExecuTorch detection tab (window.pte, live camera)
- ✅ TFLite adapter in codebase (window-detector.ts + detector-adapter.ts)
- ✅ Real-time window detection, bounding box overlay with confidence
- ✅ Native ARKit module for LiDAR raycast (window-detector-arkit)
- ✅ iOS-only (LiDAR-focused roadmap)
- Android support (ARCore)
- Multi-object detection (doors, frames, etc.)
- API integration for dynamic pricing
- Measurement history + export
- Telemetry and accuracy logging
- Batch measurement mode
- Launch app → Redirects to Window tab (ExecuTorch camera).
- Point at window → Bounding box overlay and confidence.
- Debug panel → Expand for FPS, latency, frame count.
- ExecuTorch: Uses `window.pte` (see `executorch-window-detector.ts` and the tab screen).
- TFLite: Adapter in `window-detector.ts`; the model URL/config lives there if enabling the TFLite path.
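The seam works by coding the tab screen against a single detector interface and swapping backends behind it. A minimal sketch of what `detector-adapter.ts` could look like — the names below are illustrative, not the file's actual exports:

```typescript
// Detection result shared by both backends (illustrative).
interface Detection {
  x: number; y: number; width: number; height: number; // normalized 0..1
  confidence: number;
}

// The seam: both the ExecuTorch and TFLite detectors implement this.
interface WindowDetector {
  detect(frame: Uint8Array): Promise<Detection[]>;
}

// Pick a backend once at startup; callers only ever see WindowDetector.
function createDetector(
  backend: "executorch" | "tflite",
  impls: Record<"executorch" | "tflite", WindowDetector>,
): WindowDetector {
  return impls[backend];
}
```

Because the UI depends only on `WindowDetector`, switching inference runtimes is a one-line change and neither backend leaks into the tab screen.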
- Grant camera permissions
- Use a physical iOS device (not the simulator)
- Angle the camera more directly at the window (avoid extreme angles)
- Ensure the window is fully visible in the frame
- Better lighting may improve detection confidence
```sh
# Clean and rebuild
rm -rf ios/Pods ~/Library/Developer/Xcode/DerivedData
pnpm install
npx expo run:ios --dev-client
```

```sh
pnpm run lint       # ESLint (expo lint)
pnpm run typecheck  # TypeScript (tsc --noEmit)
```

ML training (optional): `pnpm run ml:prep`, `pnpm run ml:train`, `pnpm run ml:export`, etc. See `package.json` scripts.
Use the Debug panel in the Window tab for FPS, latency, and frame count.
Use Xcode Instruments to profile GPU/CPU usage during measurement flow.
See docs/measurement-mvp-setup.md for dev client build and testing instructions.
For production builds:
```sh
npx eas build --platform ios --profile production
```

- Framework: Expo Router, React Native
- ML: react-native-executorch (main), react-native-fast-tflite (adapter in codebase)
- Camera: react-native-vision-camera
- Native AR: window-detector-arkit (local module, ARKit LiDAR)
- Overlay: react-native-svg
Proprietary. All rights reserved.
For questions or collaboration, reach out to the development team.
MVP Status: Ready for iOS LiDAR device testing. Last updated: Feb 2026.