Exploring immersive realities and the intersection of XR & AI
A selection of XR prototypes integrated with AI and real-time APIs
Prototypes delivered for the 2024 XR Bootcamp, featuring AWS Console XR, Hive Smart Home integration, and XR Thames AIS real-time boat tracking.
Gemini Live running on Meta Quest 3, with integrated Mapbox mapping capabilities.
An XR AI workbench built on Unity 6.2, featuring integration with multiple AI workflows and providers.
An Android application integrating Google's Gemini Live API (free tier) for real-time AI interactions on Meta Quest devices. Features real-time video streaming to Gemini Live, audio processing, and custom system prompt configuration. Built with Android Studio and the Meta Spatial SDK.
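A minimal sketch of how such a session can be opened from Kotlin: the Gemini Live API is a WebSocket (BidiGenerateContent) endpoint whose first message carries the model, response modality, and system instruction. The model name, API key, and prompt below are placeholders, and the message shape follows Google's public docs rather than this app's actual code.

```kotlin
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.Response
import okhttp3.WebSocket
import okhttp3.WebSocketListener
import org.json.JSONObject

// Public Gemini Live API WebSocket endpoint (v1beta BidiGenerateContent).
const val LIVE_URL =
    "wss://generativelanguage.googleapis.com/ws/" +
    "google.ai.generativelanguage.v1beta.GenerativeService.BidiGenerateContent"

fun openLiveSession(apiKey: String, systemPrompt: String): WebSocket {
    val request = Request.Builder().url("$LIVE_URL?key=$apiKey").build()

    val listener = object : WebSocketListener() {
        override fun onOpen(webSocket: WebSocket, response: Response) {
            // First message must be the session setup: model, response modality,
            // and the custom system prompt. Model name may differ in practice.
            val setup = """
                {
                  "setup": {
                    "model": "models/gemini-2.0-flash-exp",
                    "generation_config": { "response_modalities": ["AUDIO"] },
                    "system_instruction": {
                      "parts": [ { "text": ${JSONObject.quote(systemPrompt)} } ]
                    }
                  }
                }
            """.trimIndent()
            webSocket.send(setup)
        }

        override fun onMessage(webSocket: WebSocket, text: String) {
            // Server events (setupComplete, serverContent with audio chunks, etc.)
            // would be parsed here and routed to the headset's audio/video pipeline.
        }
    }
    return OkHttpClient().newWebSocket(request, listener)
}
```

After setup, the app would stream camera frames and microphone audio as `realtime_input` messages over the same socket; that part is omitted here.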
A Unity XR AI library providing a standardized API for testing AI providers and models in XR projects. Features multiple AI model pipelines including text-to-image, image-to-text, object detection, speech-to-text, and text-to-speech, with plugins for OpenAI, Google, Groq, Nvidia, Stability AI, YOLO, and Roboflow.
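To illustrate the idea of a standardized pipeline API, here is a brief sketch (in Kotlin for brevity; the actual library is C#/Unity and its names differ, so everything below is hypothetical): each task is an interface, each provider ships an implementation, and calling code never touches a provider SDK directly.

```kotlin
// Hypothetical per-task pipeline interfaces; an OpenAI, Google, or Groq plugin
// would each supply its own implementation of the same contract.
interface ImageToTextPipeline {
    suspend fun describe(imageBytes: ByteArray, prompt: String): String
}

interface TextToSpeechPipeline {
    suspend fun synthesize(text: String): ByteArray  // raw audio handed back to the XR layer
}

// Calling code depends only on the interfaces, so providers and models can be
// swapped for testing without changing this class.
class SceneNarrator(
    private val vision: ImageToTextPipeline,
    private val tts: TextToSpeechPipeline,
) {
    suspend fun narrate(frame: ByteArray): ByteArray {
        val caption = vision.describe(frame, "Describe what the user is looking at.")
        return tts.synthesize(caption)
    }
}
```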