Jennifer Story
Remote AR
Intelligent and interactive real/remote visual & audio content in VR
ROLE
UX design, visual mockups for the developing technology, and video editing
As the lead designer, worked with software engineers
DURATION
Project & design research and development, Mar. 2016 ~ Sep. 2017
Project Introduction
- Fully immersive visual content that can transport users to real or virtual remote locations in VR
- Objects in the immersive visual field are augmented with intelligent AR visual annotations and interactive social connectivity
Challenges
- Make the VR world a new space where users can interact and engage with the real world
- Support social interaction and connectivity features for a multi-user, shared experience (spatial placement of annotations; see the data-structure sketch below)
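One way to picture the shared spatial placement challenge is as a small data structure: each annotation is anchored to a 3D position in the remote scene and stored on a board that every connected user reads from and writes to. The sketch below is illustrative only; the names (SpatialAnnotation, AnnotationBoard) and the in-memory store are assumptions for this write-up, not the project's actual implementation.

```python
# Hypothetical sketch: spatially anchored annotations shared between users.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple
import time
import uuid


@dataclass
class SpatialAnnotation:
    """A note pinned to a 3D position in the shared remote scene."""
    position: Tuple[float, float, float]   # anchor point in scene coordinates
    text: str                               # annotation content shown in the overlay
    author: str                             # user who placed it
    created_at: float = field(default_factory=time.time)
    annotation_id: str = field(default_factory=lambda: uuid.uuid4().hex)


class AnnotationBoard:
    """In-memory store that every connected user reads from and writes to."""

    def __init__(self) -> None:
        self._annotations: Dict[str, SpatialAnnotation] = {}

    def place(self, annotation: SpatialAnnotation) -> str:
        self._annotations[annotation.annotation_id] = annotation
        return annotation.annotation_id

    def near(self, point: Tuple[float, float, float], radius: float) -> List[SpatialAnnotation]:
        """Return annotations within `radius` of a viewer's gaze or controller point."""
        def dist2(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return [a for a in self._annotations.values()
                if dist2(a.position, point) <= radius ** 2]


# Example: two users share one board; the second user sees the first user's note.
board = AnnotationBoard()
board.place(SpatialAnnotation(position=(1.2, 0.8, -2.0), text="Check this painting", author="userA"))
print([a.text for a in board.near(point=(1.0, 1.0, -2.0), radius=0.5)])
```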
Remote AR experimental App UI
Remote AR experience App Homescreen Design
Developed key technology features
- Object/scene detection and segmentation masks: multiple detectors/trackers/segmenters (see the pipeline sketch after this list)
- Classification labels: multiple classifiers/recognizers
- Augmented information for the immersive environment
- Face detection and pose estimation
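To show how these pieces fit together, the sketch below walks through a minimal detect-then-classify flow that turns raw detections into labeled, contextual overlays. The functions detect_objects and classify_crop and the small knowledge base are placeholders assumed for illustration; the actual system used multiple detectors, segmenters, and classifiers as listed above.

```python
# Hypothetical sketch of the detect-then-classify annotation flow.
# detect_objects() and classify_crop() are placeholders, not real APIs from the project.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Detection:
    box: Tuple[int, int, int, int]   # x, y, width, height in the 360 frame
    mask_id: int                     # id of the segmentation mask for this object


@dataclass
class Annotation:
    box: Tuple[int, int, int, int]
    label: str                       # classification label shown in the overlay
    info: str                        # contextual text attached to the detected object


def detect_objects(frame) -> List[Detection]:
    """Placeholder detector/segmenter; a real system would run one or more models here."""
    return [Detection(box=(120, 80, 200, 150), mask_id=0)]


def classify_crop(frame, box: Tuple[int, int, int, int]) -> str:
    """Placeholder classifier/recognizer for the cropped region."""
    return "dog"


def annotate_frame(frame, knowledge_base: dict) -> List[Annotation]:
    """Turn raw detections into labeled, contextual overlays for the VR view."""
    annotations = []
    for det in detect_objects(frame):
        label = classify_crop(frame, det.box)
        info = knowledge_base.get(label, "")
        annotations.append(Annotation(box=det.box, label=label, info=info))
    return annotations


# Example: one frame from the remote 360 stream plus a small fact base.
facts = {"dog": "Golden Retriever, a popular family dog breed."}
for ann in annotate_frame(frame=None, knowledge_base=facts):
    print(ann.label, ann.box, ann.info)
```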
Developed UX features
- AI contextual information overlay on detected objects
- Real-time human face analysis
- Navigation
- Link to external video for storytelling
- Automatic annotations/bookmarks
- Social interaction in the VR environment: avatars, games, an item store, and interactive tools between multiple users
- Hand, voice, and eye-tracking interfaces
- Haptic feedback
- POV change
Object Detection & Visual Annotation (Masks-Object)
Link to the External Video for Storytelling
Multi-user Social Sharing & Visual Annotation
Domain-specific Annotation
Demo Video
Extensive Applications
- Domain-specific annotations (sports, art, food, animals, etc.)
- Time-based annotations (related to a particular time period)
- MR annotations (annotations visualized as virtual objects placed on or behind the real world)
- Monetized annotations (pay-per-view, advertising)
- Educational annotations (zoo, museum, language, remote learning, etc.)