VR Sound Visualizer

Preview Project View Code

What

The VR Sound Visualizer converts non-speech sounds, such as birdsong or ambient audio, into 3D visual cues and haptic feedback.

  • Provide an accessible experience for Deaf or Hard-of-Hearing people by adding a rich visual and physical layer to their environment.
  • Enhance immersion and interaction in VR through multi-modal feedback.
  • Combine audio, visuals, and haptic signals to create a unified, interactive, and customizable experience.

How

Framework:
Built with JavaScript, Three.js, and Vite, and designed to run smoothly in a VR headset’s WebGL environment.

Features:

  • Direction Indicator:
    Displays the direction of a nearby audio source in real time.
    (Code: src/audioVisualizers/DirectionIndicator.js)
  • Spectrogram:
    Shows a 3D spectrum of the audio signals to visualize pitch and volume.
    (Code: src/audioVisualizers/SpectrogramModelController.js)
  • Haptic Feedback:
    Sends vibrations to VR controllers based on the proximity of the audio sources, adding a physical dimension to the experience.
    (Code: src/haptics.js)
  • Customization:
The environment, including the number of flowers and the time of day, can be configured through JSON files in public/configs/*.json.
    Example:
    • Increase blue flowers: testConfigurations/manyBlueFlowers.json
    • Sunset view: testConfigurations/sunset.json
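The direction indicator boils down to a small piece of geometry: given the listener's and a sound source's positions, compute the yaw angle the indicator should point at. The following is a minimal, hypothetical sketch of that math (the project's actual logic lives in src/audioVisualizers/DirectionIndicator.js; the function name and coordinate conventions here are assumptions):

```javascript
// Hypothetical sketch of the direction-indicator math. Given the listener's
// and a sound source's positions on the horizontal (x/z) plane, return the
// yaw angle, in radians, that the indicator arrow should point toward.
function directionToSource(listener, source) {
  const dx = source.x - listener.x;
  const dz = source.z - listener.z;
  // Three.js uses a right-handed coordinate system where -z is "forward",
  // so a source straight ahead yields a yaw of 0.
  return Math.atan2(dx, -dz);
}

console.log(directionToSource({ x: 0, z: 0 }, { x: 0, z: -5 })); // 0 (straight ahead)
```

In the real visualizer this angle would be recomputed every frame from the headset's pose and applied to the indicator mesh's rotation.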
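A 3D spectrogram typically reads magnitudes from a Web Audio AnalyserNode and turns each FFT bin into a bar whose height tracks volume at that pitch. The sketch below shows that mapping under stated assumptions (the project's actual controller is src/audioVisualizers/SpectrogramModelController.js; the helper names, sample rate, and FFT size here are illustrative):

```javascript
// Hypothetical helpers for mapping AnalyserNode FFT data into 3D bars.
// Bin i of an analyser covers frequencies around i * sampleRate / fftSize.
function binFrequency(binIndex, sampleRate = 48000, fftSize = 2048) {
  return (binIndex * sampleRate) / fftSize;
}

// getByteFrequencyData() yields magnitudes in 0-255; scale one into a
// world-space bar height between 0 and maxHeight.
function barHeight(byteMagnitude, maxHeight = 2) {
  return (byteMagnitude / 255) * maxHeight;
}
```

Per frame, the controller would call `analyser.getByteFrequencyData(buffer)` and update one bar per bin, giving the pitch-versus-volume view described above.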
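Proximity-based haptics can be reduced to a distance-to-intensity curve that is then sent to the controller's haptic actuator. Here is a minimal sketch, assuming a linear falloff (the project's real implementation is src/haptics.js; the function name and falloff radius are assumptions):

```javascript
// Hypothetical proximity-to-vibration mapping: full strength (1.0) at the
// sound source, fading linearly to zero at maxDistance.
function hapticIntensity(distance, maxDistance = 10) {
  return Math.max(0, Math.min(1, 1 - distance / maxDistance));
}

// Inside a WebXR session, the value would drive the controller via the
// Gamepad haptics extension, e.g.:
//   inputSource.gamepad?.hapticActuators?.[0]?.pulse(hapticIntensity(d), 50);
```

Clamping to [0, 1] matters because `pulse()` rejects intensities outside that range; a nonlinear curve (e.g. quadratic) could make nearby sources feel more pronounced.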

Results

  • The VR Sound Visualizer successfully merges audio, visual, and haptic components into a unified, interactive experience.
  • It lets users:
    • See the direction and spectrum of nearby sounds.
    • Feel their surroundings through controller vibrations.
    • Customize their environment for a personalized experience.
  • The lightweight architecture runs smoothly both on VR headsets and in a standard WebGL view, making it a practical foundation for interactive, multi-sensory VR applications.

Contributors
