This project contains the code for the Sketching into the Metaverse demo presented at AIUK2023. In this demo, the user draws a 3D sketch in a VR environment, and the backend search engine retrieves the closest matching shapes from the database based on the sketch. The retrieval model currently used is from the paper *Structure-Aware 3D VR Sketch to 3D Shape Retrieval*, so only the chair class is used as an example.
After running the demo on the host machine, the user puts on the headset and enters a virtual living room. In front of them is a cube labeled `Sketch Space`.
- Sketch: The user can use the right-hand controller to draw a chair sketch freely in the sketch space by pressing the `Sketch` trigger. During this process, the user can press the `Grab` trigger at any time to rotate the entire sketch space, and press the `Undo` button on the left-hand controller to undo the last stroke. (Please refer to the operation guide below for the triggers and buttons.)
- Search: Once finished, the user can click the `Search` button on the TV using the `Click Button` on the right-hand controller to trigger a search. The top-1 search result immediately appears in the `Sketch Space` cube. To see more results, the user can press the `More Results` button on the left-hand controller; pressing it again hides the additional results.
- The user can then select any model using the `Click Button`, and the chosen model will be displayed in the green area next to the table.
- If the search results are unsatisfactory, the user can continue drawing on the existing sketch and search again. Alternatively, they can click the `Clear` button on the TV to delete the current sketch and start over.
- End: When finished with the game, click the `Exit` button on the door.
Please refer to the demo video for the complete process after running the demo.
The controller operation guide is shown in the following figure and is also visible in the virtual room.
Platform:
- Windows system: Unity + Visual Studio Code
- Oculus Rift: 1 headset + 2 hand controllers
The demo project consists of two parts:
- `retrieval_inference`: backend inference code based on Python
- `Sketch_VR`: VR interface built with Unity
Open `retrieval_inference` in Visual Studio Code. Create your own conda environment, then install the necessary packages by running:

```
pip install -r requirements.txt
```

Run `main.py` from `retrieval_inference`.
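For reference, the backend setup above can be run as a single shell session; the environment name `sketch_vr` and the Python version below are assumptions, not fixed by the project:

```shell
# Create and activate a fresh conda environment
# (the name "sketch_vr" and Python 3.8 are assumptions, not project requirements)
conda create -n sketch_vr python=3.8 -y
conda activate sketch_vr

# Install the backend dependencies from the repository
cd retrieval_inference
pip install -r requirements.txt

# Start the backend inference process
python main.py
```

The backend should be left running so the Unity front end can send sketches to it for retrieval.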
First, set up the Oculus environment and ensure it is functioning properly.

Second, download the chair object files (password: `wjh9`) and unzip the downloaded `ShapeNetCore.v2.zip` under the current `Sketch_VR_demo` directory.

If you want to run the demo directly, you can download the executable file and extract the downloaded `game.zip` into the current `Sketch_VR_demo` directory. Then start the game by running `VR Sketch.exe`.
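Assuming both archives have already been downloaded into the `Sketch_VR_demo` directory, the extraction and launch steps might look like this (the `unzip` tool is an assumption; any archive utility works):

```shell
# Run from inside the Sketch_VR_demo directory, with both zips already downloaded
unzip ShapeNetCore.v2.zip   # chair models land under ShapeNetCore.v2/03001627
unzip game.zip              # the game build lands under game/

# On Windows, launch the demo executable:
#   game\"VR Sketch.exe"
```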
The correct directory hierarchy is as follows:
- retrieval_inference
- Sketch_VR_demo
- game
- VR Sketch.exe: The executable file of this game.
- ...
- ShapeNetCore.v2
- 03001627: chair category of ShapeNetCore.v2 dataset
- demo_savedir: The location where the VR sketch is saved.
- Sketch_VR: The code repository for Unity game development.
- ...
If you want to continue editing this demo, open the `Sketch_VR` subdirectory in Unity. You can also download the original project from Baidu Disk (password: `b4qp`).