Generated Reality is an early-stage project exploring how to link VR motion tracking with AI video generation. Using a VR headset’s hand and head tracking plus a text prompt, the system generates first-person video that roughly matches the user’s movements.
Right now, the output runs at around 11 frames per second and the visual quality is still low compared to modern image models. Even so, the demo suggests a path toward interactive AI-generated environments that respond to player motion and instructions in real time.
The team says code is coming soon, which could open the door to experimentation for VR developers and researchers building new kinds of immersive experiences.