Sense the Unseen

Created a multisensory social VR experience that helps users perceive others beyond their field of view, extending social presence through visual, auditory, haptic, and scent-based cues.

Type

Solo Project

Tools

Blender, Autodesk Fusion 360, Arduino, KeyShot

Timeframe

Sept - Dec 2024

Role

Interaction Designer

Problem

In social VR, it is difficult to perceive the movement and presence of others outside the field of view, which weakens social presence and connection.

Low awareness of people outside the field of view (FoV) in social VR

While attending a class in Horizon Workrooms, I turned my head and suddenly found another user much closer than expected. The unexpected proximity broke immersion and revealed a broader issue: in social VR, people outside your FoV are hard to perceive, which disrupts personal-space awareness and smooth communication.

A vision–audio bias limits the multisensory experience

In VR, the presence of others is conveyed almost exclusively through sight and sound. With few non-visual or non-auditory cues, such as touch, vibration, airflow, or warmth, it is hard to read subtle social signals like distance, approach direction, and speed. As a result, the quality of communication and co-presence falls short of face-to-face interaction.

Solution

Introducing Sense the Unseen

A multisensory interface that makes out-of-view movement perceptible in social VR.

Primary Research

Turning to my own cohort to understand how people experience others beyond the field of view in social VR

I interviewed 12 members of my cohort who had attended classes together in Horizon Workrooms. 10 participants described moments when another user suddenly appeared closer than expected after they turned their head, which often disrupted immersion and made personal space feel uncertain. 9 said they relied almost entirely on voice to infer where others were, and 8 expressed a desire for richer non-visual cues such as vibration, airflow, or spatial feedback to better sense direction, distance, and approach. Across interviews, participants described out-of-view awareness as a missing layer of social presence rather than just a navigation issue.

“In Workrooms, I often knew someone was there, but I did not feel their presence until I turned and looked.”

Secondary Research

Understanding why social presence breaks down beyond the field of view in social VR

Social VR depends on co-presence

Social VR is not only about sharing a virtual space, but about feeling present with others through real-time awareness and interaction.

Social presence comes from sensing others in space

A strong sense of social presence depends on perceiving where others are, how close they are, and how they move within a shared environment.

Limited field of view weakens social awareness

Because VR headsets restrict peripheral vision, users outside the field of view become harder to notice, reducing awareness of direction, distance, and approach.

Precedent Analysis

Reviewing how existing systems communicate what lies beyond the field of view

I reviewed how existing 2D and 3D systems represent off-screen information, but found that most focus on location and direction rather than conveying another person's presence in a socially meaningful way.

2D

Dynamic Camera Adjustments

Arrows, Off-screen Indicators

Sound Cues

Context-Sensitive Alerts

Radar

Miniature Map

3D

Halo

Wedge

EyeSee360

3D Radar

These findings revealed a clear opportunity: not just to indicate where someone is beyond the field of view, but to make their presence perceptible through multisensory cues!

Design Concept

Design a multisensory interface system that integrates the VR environment with physical interfaces, enabling users to perceive the presence of others outside their field of view.

Multisensory Mapping

Translates out-of-view movement into particles, vibration, sound, and scent, making others’ presence perceptible through multiple sensory channels.

Social, Not Just Spatial

Goes beyond showing location by conveying how another person moves in relation to the user, making out-of-view awareness feel more social and relational.

Virtual–Physical Integration

Synchronizes VR events with physical devices so presence is experienced across both virtual and physical space.

User Flow

Placing out-of-view movement in a social VR scenario reveals how presence can be translated across virtual and physical space

This flow shows how Sense the Unseen turns another user’s unseen movement into particles in VR and synchronized scent, sound, and vibration in the physical world. As the other user gets closer, the feedback becomes stronger, helping the seated user sense direction, distance, and approach beyond the field of view.
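
As a concrete illustration of this distance-to-feedback mapping, here is a minimal C++ sketch. The distance limits are my own illustrative assumptions, not values documented in the project; it simply shows how closeness could be normalized, inverted, and quantized into the single 0 to 255 intensity value used by the physical interfaces described later.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

// Illustrative range limits (assumptions, not project-calibrated values).
constexpr float kMinDist = 0.5f;  // at or below this distance, feedback is maximal
constexpr float kMaxDist = 5.0f;  // at or beyond this distance, feedback fades out

// Map inter-user distance to a 0-255 intensity byte: closer means stronger.
uint8_t distanceToIntensity(float distMeters) {
    float t = (distMeters - kMinDist) / (kMaxDist - kMinDist);
    t = std::clamp(t, 0.0f, 1.0f);  // normalize to [0, 1]
    return static_cast<uint8_t>(std::lround((1.0f - t) * 255.0f));
}
```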

Final Outcome

Social VR Space | Visualizing unseen movement inside the "Virtual Environment"

Two users enter the social VR space, select avatars, and remain seated facing forward so they cannot directly see one another. Within this environment, out-of-view movement is translated into dust-like particles, allowing social presence to be perceived visually before it is extended through physical feedback.

Shared spatial setup

Two seated users face forward in fixed positions, while nearby virtual users and one moving user outside the field of view provide the motion data that drives the experience.

Upper body movement appears as subtle particles

Head, hand, and torso movement from users outside the field of view appears as dust-like motion near the lower left and right edges of the screen, helping users sense nearby activity beyond direct sight.

Positional movement expands across the screen

When an unseen user changes location, dust-like particles flow across the wider visual field in the same direction as that movement, making larger spatial shifts perceptible in VR.
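
The project does not name the engine or the particle implementation, but the underlying geometry is straightforward. Below is a minimal C++ sketch of one way to decide whether another user is outside the field of view and which lower screen edge the dust-like particles should appear on; the half-FoV constant and the coordinate convention (+x to the viewer's right) are assumptions for illustration.

```cpp
#include <cmath>

struct Vec2 { float x, z; };  // positions on the horizontal (ground) plane

// Signed bearing (radians) of a target relative to the viewer's forward axis.
// Convention: +x is to the viewer's right when facing +z, so a positive
// bearing means the target is on the right.
float signedBearing(Vec2 viewerPos, Vec2 viewerFwd, Vec2 targetPos) {
    Vec2 to{targetPos.x - viewerPos.x, targetPos.z - viewerPos.z};
    float cross = viewerFwd.z * to.x - viewerFwd.x * to.z;
    float dot   = viewerFwd.x * to.x + viewerFwd.z * to.z;
    return std::atan2(cross, dot);
}

enum class Edge { None, LowerLeft, LowerRight };

// Pick the edge where particles should spawn for a given user, or None if
// the user is already visible. kHalfFov is illustrative, not a measured value.
Edge particleEdge(Vec2 viewerPos, Vec2 viewerFwd, Vec2 targetPos) {
    constexpr float kHalfFov = 0.93f;  // ~53 degrees of horizontal half-FoV
    float bearing = signedBearing(viewerPos, viewerFwd, targetPos);
    if (std::fabs(bearing) <= kHalfFov) return Edge::None;  // in view: no cue
    return (bearing < 0.0f) ? Edge::LowerLeft : Edge::LowerRight;
}
```

The same bearing can also steer the particles' flow direction when the unseen user changes location, so the dust streams across the view in the direction of that movement.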

Final Outcome

Physical Interface | Translating virtual distance into sound, vibration, and scent with "Stools"

Sense the Unseen connects VR events to two physical interfaces: the Sound Stool and the Scented Stool. As user distance is calculated in real time, the system sends those values to an Arduino and converts them into synchronized physical feedback.
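
The transport between the VR application and the Arduino is not documented here; a USB serial link is a typical choice. A rough host-side sketch under that assumption (POSIX serial API, with a hypothetical device path and baud rate):

```cpp
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstdint>

// Open the Arduino's serial port (device path and 9600 baud are assumptions).
int openArduino(const char* path /* e.g. "/dev/ttyUSB0" */) {
    int fd = open(path, O_WRONLY | O_NOCTTY);
    if (fd < 0) return -1;
    termios tio{};
    tcgetattr(fd, &tio);
    cfsetospeed(&tio, B9600);          // match the board's Serial.begin rate
    tio.c_cflag |= (CLOCAL | CREAD);   // ignore modem lines, enable receiver
    tio.c_cflag &= ~PARENB;            // no parity (8N1)
    tcsetattr(fd, TCSANOW, &tio);
    return fd;
}

// Send one 0-255 intensity byte per update tick.
bool sendIntensity(int fd, uint8_t intensity) {
    return write(fd, &intensity, 1) == 1;
}
```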

Sound Stool

Scented Stool

Distance is mapped in real time

The distance between users in the social VR space is continuously measured on a scale from 0 to 255 and grouped into five levels for physical output.
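
On the receiving side, this quantization is a few lines of Arduino code. A minimal sketch, assuming the 0 to 255 value arrives as a single serial byte and that each stool's actuator is driven from a PWM pin (pin numbers and the level-to-duty scaling are illustrative assumptions):

```cpp
const int SOUND_PIN = 9;   // PWM pin driving the Sound Stool's speaker circuit
const int SCENT_PIN = 10;  // PWM pin driving the Scented Stool's vibration motor

void setup() {
  Serial.begin(9600);
  pinMode(SOUND_PIN, OUTPUT);
  pinMode(SCENT_PIN, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    int intensity = Serial.read();  // incoming 0-255 proximity value
    int level = intensity / 51;     // bucket into five levels, 0 (far) to 4 (near)
    if (level > 4) level = 4;       // 255 / 51 == 5, so clamp the top edge
    int duty = level * 63;          // scale the level back up to a PWM duty cycle
    analogWrite(SOUND_PIN, duty);   // stronger drive: pebbles strike harder
    analogWrite(SCENT_PIN, duty);   // stronger ripple: scent diffuses more widely
  }
}
```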

Sound Stool turns proximity into tactile sound

In the Sound Stool, speaker vibrations make pebbles strike the metal surface, producing both vibration and sound. Stronger feedback indicates that the other user is closer.

Scented Stool turns proximity into olfactory motion

In the Scented Stool, vibration creates ripples in water and changes the way scent diffuses across the surface, allowing proximity to be felt through movement and smell.

Takeaways

Sense the Unseen shows how multisensory feedback can make out-of-view presence perceptible in social VR, expanding social awareness beyond sight alone.

  • Sense the Unseen translated unseen movement into a connected system of particles, vibration, sound, and scent across virtual and physical space. By mapping distance, direction, and approach into multisensory cues, the project demonstrated a new way to support social awareness when others are outside the field of view.

  • Most social VR systems still rely heavily on visual and auditory feedback, which makes it difficult to perceive others beyond what is directly visible. This project suggests that social presence can be strengthened when movement is not only seen, but also felt through the body, opening new possibilities for more intuitive and human-centered interaction in immersive environments.

  • This project could be expanded by refining the sensory mapping, testing the system with more users, and exploring how different combinations of feedback affect comfort, readability, and emotional connection. Future work could also examine how multisensory presence cues might support other contexts such as remote collaboration, education, and shared virtual communication.

Achievements

Exhibited at the 2025 SIGGRAPH Immersive Pavilion
