I.R.I.S transforms iPhone LiDAR data into simple haptic navigation for blind users. Using morse-code patterns (dots for left, dash for right), it enables independent navigation without sight.
๐จ FOR JUDGES: Click here for detailed setup instructions
- Simple Haptic Patterns: 4 dots = left, 1 dash = right, 2 taps = straight
- Real-time LiDAR Scanning: 60Hz obstacle detection up to 5 meters
- Eye-Level Detection: Smart filtering to ignore ground/ceiling
- Multi-Zone Awareness: Separate feedback for left, center, right
- Object Classification: Detects walls, furniture, doors, and obstacles
- Path Planning Engine: A* pathfinding with dynamic obstacle avoidance
- Location Fingerprinting: Identifies rooms using WiFi BSSID, obstacle patterns, and spatial layout
- Temporal Learning: Learns time-based patterns (e.g., office door open 9-5)
- Predictive Scanning: Uses cached obstacles in known locations, saving 70% battery
- Adaptive Modes: Switches between aggressive/normal/conservative/predictive scanning
- Permanence Scoring: Distinguishes temporary obstacles (chairs) from permanent (walls)
- <5% per hour in known locations (predictive mode)
- Dynamic Scan Frequency: Adjusts from 10Hz to 60Hz based on familiarity
- Smart Caching: Reuses obstacle data in familiar environments
- ARM64 Optimizations: NEON instructions for vector math
- VoiceOver Compatible: Full screen reader support
- Voice Announcements: Optional audio guidance for key events
- Haptic-First Design: Designed for eyes-free navigation
- No Internet Required: All processing happens on-device with privacy
- Device: iPhone 12 Pro or later (LiDAR required)
- iOS: 15.0 or later
- Xcode: 14.0 or later (for building)
- Clone repository:
git clone https://github.com/CodeWithInferno/I.R.I.S.git
cd I.R.I.S/LiDARObstacleDetection- Open in Xcode:
open LiDARObstacleDetection.xcodeproj- Connect iPhone, select it as target, and press Run
First time? See SETUP_FOR_JUDGES.md for detailed instructions including trust settings.
- โข โข โข โข (4 dots) = Turn LEFT
- โโโ (1 dash) = Turn RIGHT
- โข โข (2 taps) = Go STRAIGHT
- ~~~~~ (continuous) = STOP
- Hold phone at chest height
- Point forward like taking a photo
- Keep phone steady for best results
- ARKit Scene Reconstruction: Real-time mesh generation with LiDAR
- Scene Depth API: High-fidelity depth mapping at 60Hz
- Core Haptics Engine: Custom morse-code patterns for navigation
- SQLite: Local spatial memory database with location fingerprinting
- Core Location: WiFi-based indoor positioning (no GPS)
The secret sauce that sets I.R.I.S apart:
-
Location Fingerprinting
- WiFi BSSID scanning for unique location IDs
- Spatial hash of obstacle distribution
- Room dimension analysis (ceiling height, bounds)
- 95%+ accuracy in familiar environments
-
Obstacle Permanence Analysis
- Tracks how often obstacles appear in same location
- Permanence score: 0.0 (temporary chair) to 1.0 (wall)
- Only caches obstacles with >0.7 permanence
- Automatic cleanup of outdated data
-
Temporal Pattern Learning
- Records time-of-day for each observation
- Learns patterns like "door closed at night"
- Confidence scoring based on pattern consistency
- Smart predictions for known time patterns
-
Adaptive Scanning Strategy
- Aggressive Mode (new locations): 60Hz, 100% coverage
- Normal Mode (semi-familiar): 30Hz, 60% coverage
- Conservative Mode (familiar): 15Hz, 30% coverage
- Predictive Mode (very familiar): 10Hz verification only
- Automatic mode switching based on confidence
- ARM64 NEON SIMD: 4x faster vector math for spatial calculations
- Spatial Hashing: O(1) obstacle lookups using 3D grid
- Frame Skipping: Intelligent frame dropping in familiar locations
- SQLite with Indexes: <1ms location fingerprint matching
- Metal Shaders: GPU-accelerated depth processing
- Latency: <16ms response time (60Hz scanning)
- Battery: <5% per hour (predictive mode), ~8% (aggressive mode)
- Range: 0.5m - 5m detection range
- CPU: <15% utilization (ARM optimized)
- Memory: ~120MB active, ~50MB for spatial database
- Location Recognition: 95%+ accuracy in known environments
- Path Planning: Real-time A* with <5ms compute time
- Dynamic Path Finding: A* algorithm calculates optimal routes in real-time
- Obstacle Memory Integration: Uses cached obstacles for faster planning
- Safe Zone Detection: Identifies clear walking paths between obstacles
- Turn-by-Turn Guidance: Haptic feedback guides user along planned route
- Adaptive Replanning: Automatically recalculates if new obstacles detected
- ARKit Scene Classification: Identifies walls, floors, doors, furniture
- Confidence Scoring: Only acts on high-confidence detections (>70%)
- Spatial Clustering: Groups nearby depth points into coherent objects
- Size Estimation: Calculates object dimensions for better navigation
- Type-Specific Feedback: Different haptic patterns for different obstacle types
- Set up obstacle course with chairs/tables
- Put on blindfold
- Hold phone at chest height
- Follow haptic feedback to navigate
- Enable VoiceOver in iOS Settings
- Launch app
- Verify all UI elements are announced
- Test navigation with screen reader active
Built with โค๏ธ at hackathon by Team I.R.I.S Members: -Pratham Patel -Kunga Lama Tamang -Riwaz Shrestha
MIT License - See LICENSE file
- Apple ARKit team for incredible LiDAR APIs
- Our test users who provided invaluable feedback
- Hackathon organizers and mentors
For Judges: Please see SETUP_FOR_JUDGES.md for complete setup instructions.
For Users: Download from TestFlight (coming soon) or build from source.