Collaborative virtual environments for scientific visualization
Brahma-XR is a library and framework for building collaborative WebXR rooms with spatial data. The same code runs in the browser and is interoperable across Apple Vision Pro, Meta Quest 3, and Varjo headsets. It is great for rapid prototyping and on-the-fly demos, and is built on many people's research and software contributions.
It is purposely designed to require minimal computation and to work across every headset.
It comes with many immersive features out of the box, such as:
- Locomotion, using an accessible Grab-And-Pull method
- Raycasting, enabling pointing with the mouse, controller, hands, or eye tracking
- Basic spatial data rendering
- Geospatial Views
- Information Visualization Views
- 3D model views (coming soon!)
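To illustrate the idea behind grab-and-pull locomotion, here is a minimal sketch (the names and structure are illustrative assumptions, not Brahma-XR's actual API): while the grab button is held, the user's rig is translated opposite to the hand's motion each frame, so pulling the controller toward you moves you forward through the world.

```javascript
// Hypothetical sketch of grab-and-pull locomotion (not Brahma-XR's real API).
// begin() anchors the grab at the controller's current position; update() returns
// the per-frame translation to apply to the user's rig, which is the inverse of
// the hand's movement since the last frame.
function createGrabPull() {
  let grabStart = null; // controller position when the grab began (or last frame)
  return {
    // call when the grab button is pressed, with the controller's world position
    begin(controllerPos) {
      grabStart = { ...controllerPos };
    },
    // call every frame while the grab is held
    update(controllerPos) {
      if (!grabStart) return { x: 0, y: 0, z: 0 };
      const offset = {
        x: grabStart.x - controllerPos.x,
        y: grabStart.y - controllerPos.y,
        z: grabStart.z - controllerPos.z,
      };
      grabStart = { ...controllerPos }; // re-anchor so offsets are per-frame deltas
      return offset;
    },
    // call when the grab button is released
    end() {
      grabStart = null;
    },
  };
}
```

This per-frame delta approach is a common way to keep locomotion comfortable and accessible, since the user moves only as fast as they move their own hand.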
Collaborate with others!
- Avatar Embodiment
- Note: voice chat and text chat are not included!
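Avatar embodiment in a shared room generally means broadcasting each user's head and hand poses every frame and applying the latest message to that user's remote avatar. The sketch below shows one way this could look; the function names and message format are assumptions for illustration, not Brahma-XR's wire protocol.

```javascript
// Illustrative avatar pose sync (hypothetical names, not Brahma-XR's API).
// Each client encodes its head and hand poses into a small message; peers decode
// it and update the matching avatar in a Map keyed by user ID.
function encodePose(userId, head, leftHand, rightHand) {
  // a flat array keeps the message compact for frequent network sends
  return JSON.stringify({ userId, poses: [head, leftHand, rightHand] });
}

function applyPose(message, avatars) {
  const { userId, poses } = JSON.parse(message);
  const avatar = avatars.get(userId) ?? { head: null, leftHand: null, rightHand: null };
  [avatar.head, avatar.leftHand, avatar.rightHand] = poses;
  avatars.set(userId, avatar);
  return avatar;
}
```

In practice the transport (WebSocket, WebRTC data channel, etc.) and interpolation between updates matter as much as the message format.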
For HCI researchers and managers, telemetry data options are provided:
- Proxemics
- Event logs
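As a sense of the kind of metric proxemics telemetry might record, the sketch below computes the interpersonal distance between two users' head positions and appends it to a timestamped event log. This is an illustrative assumption about what such logging could look like, not Brahma-XR's actual telemetry schema.

```javascript
// Illustrative proxemics logging (hypothetical, not Brahma-XR's schema).
// headDistance() is the straight-line distance between two head positions;
// logProxemics() appends a timestamped entry pairing two users with that distance.
function headDistance(a, b) {
  const dx = a.x - b.x;
  const dy = a.y - b.y;
  const dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

function logProxemics(log, t, userA, userB, posA, posB) {
  log.push({ t, pair: [userA, userB], distance: headDistance(posA, posB) });
  return log;
}
```

Sampling this at a fixed interval yields a time series researchers can analyze for patterns like approach, avoidance, or clustering.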
Using this library requires solid development practices in Three.js and some knowledge of how WebXR works. Recommended resources for both are on the wiki.
This project is in its early stages, so feel free to reach out to the contributors with questions about it.
Thanks goes to these wonderful people (emoji key):
- Samir Ghosh 💻
- Kajal Jotwani 💻
This project follows the all-contributors specification. Contributions of any kind welcome!
Support for this project came in many ways from