LuminaVerseAI
Immersive VRM Companion
VRM Model Viewer
Use Settings to load a model
LuminaVerseAI Chat
Local LLM / Offline ready
Hello! I'm your AI companion. How can I help you today?
Settings
Reset
VRM Model
Load a .vrm file from disk or use the sample URL for testing
Load VRM (file)
Or load sample VRM URL
Load
Sample: https://cdn.jsdelivr.net/gh/pixiv/three-vrm@master/examples/models/AliciaSolid.vrm
Scale
Current:
1.00
x
Vertical offset (Y)
Current:
0.00
m
Rotation speed
Current:
0.25
Expression Preset
Neutral
Happy
Sad
Surprised
Angry
Save
Reset
Background Color
Lighting intensity
Current:
1.00
Light color
TTS voice (browser)
Lip-sync (toggle)
Enable lip-sync when speaking
This app runs locally; loading a model from disk makes no outbound network requests. To use an offline LLM, point the app at your local endpoint via your environment configuration and wire it into the chat send handler.
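The note above leaves the chat wiring to you. A minimal sketch, assuming an OpenAI-compatible chat-completions endpoint served locally (the URL, model name, and the `buildChatRequest`/`sendChat` helpers are illustrative assumptions, not part of the app):

```javascript
// Hypothetical wiring for the chat send handler against a local,
// OpenAI-compatible endpoint (e.g. a llama.cpp or Ollama server).
// The endpoint URL and model name below are placeholders -- adjust
// them to match your own environment configuration.

// Pure helper: build the request payload from the chat history.
function buildChatRequest(messages, model = "local-model") {
  return {
    model,
    messages: [
      { role: "system", content: "You are a friendly VRM companion." },
      ...messages,
    ],
    stream: false,
  };
}

// Send-handler sketch: POST the payload and return the reply text.
async function sendChat(
  messages,
  endpoint = "http://localhost:11434/v1/chat/completions"
) {
  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(messages)),
  });
  if (!res.ok) throw new Error(`LLM endpoint returned ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Because `buildChatRequest` is a pure function, the payload shape can be unit-tested without a running server, and swapping endpoints only means changing the `endpoint` argument.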
No model loaded