neospark314 / godot_oculus_quest_toolkit
An easy to use VR toolkit for Oculus Quest development using the Godot game engine.
License: MIT License
Hello
I recently got a Quest 2 and managed to install and run this kit perfectly. I love it!
The BeepSaber demo runs fine on PC, but when sideloaded to the Quest everything works except that I'm unable to start the game: I get an error saying the JSON file can't be loaded from the .dat file.
Any ideas why this might be? Any hint would be appreciated.
I'm using Godot 3.2.1 and created a new project using only the Beep Saber folder, with the plugin taken from this repo (not the Godot Asset Library). I had to fix a lot of dependencies manually, so maybe I missed something there.
I already checked that the file exists on the device and experimented with the files directory to ensure proper access, but still no clue.
I think it has something to do with the manifest, but I'm not sure, and I'm no expert in the Android manifest.
Thank you for sharing and any help you can provide
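For debugging this kind of failure, one sanity check (a sketch; the path is a placeholder for whatever .dat file the game actually loads) is to test each stage of the load separately on the device:

```gdscript
# Sketch: log exactly which stage fails when loading a JSON file on the Quest.
func _debug_check_json_load(path):
	var f = File.new()
	if not f.file_exists(path):
		print("File does not exist on device: ", path)
		return
	if f.open(path, File.READ) != OK:
		print("File exists but could not be opened: ", path)
		return
	var text = f.get_as_text()
	f.close()
	var result = JSON.parse(text)
	if result.error != OK:
		print("JSON parse error at line ", result.error_line, ": ", result.error_string)
	else:
		print("JSON loaded OK: ", path)
```

A common cause of "works on PC, fails on Quest" is that non-resource files such as `*.dat` are not packed into the APK by default; they have to be added under Project → Export → Resources → "Filters to export non-resource files".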
Hello,
with a higher refresh rate for the Godot physics engine, the game runs a lot smoother for me.
It might be better to set the engine's physics rate to a higher value by default, derived from the display refresh rate.
In the autoload-script:
```gdscript
func _refresh_settings():
	log_info("_refresh_settings()");
	set_display_refresh_rate(oculus_mobile_settings_cache["display_refresh_rate"]);
	...
	Engine.iterations_per_second = oculus_mobile_settings_cache["display_refresh_rate"] * 2
	_need_settings_refresh = false;

var oculus_mobile_settings_cache = {
	"display_refresh_rate" : 90,
	"boundary_visible" : false,
	"tracking_space" : ovrVrApiTypes.OvrTrackingSpace.VRAPI_TRACKING_SPACE_LOCAL_FLOOR,
	"default_layer_color_scale" : Color(1.0, 1.0, 1.0, 1.0),
	"extra_latency_mode" : ovrVrApiTypes.OvrExtraLatencyMode.VRAPI_EXTRA_LATENCY_MODE_ON,
	"foveation_level" : FoveatedRenderingLevel.Off,
	"foveation_dynamic" : 0,
	"swap_interval" : 1,
	"clock_levels_cpu" : 2,
	"clock_levels_gpu" : 2,
}
```
Hi, some of the glTF files in OQ_Toolkit/OQ_ARVRController/models3d contain links to texture files that do not exist in the tree:
This is happening intermittently in Feature_HandModel.gd, in _track_average_velocity, both in the demo scene and in my own incorporation of the script.
```gdscript
average_velocity = Vector3(0, 0, 0);
for i in range(0, _velocity_buffer_size):
	average_velocity += _velocity_buffer[i]
average_velocity = average_velocity * (1.0 / (_dt * _velocity_buffer_size));  # <-- divides by zero when _dt == 0
_last_velocity_position = global_transform.origin;
```
I'm not sure if the right solution here is to add a very small buffer to _dt or if the function call should be skipped altogether if _dt == 0. Thoughts?
Godot version: 3.2 rc-1
toolkit version: head
edit: added screenshot with Godot version
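One possible guard, based on the snippet above (the function signature is an assumption): skip the update when the timestep is zero rather than adding an epsilon, since an epsilon would still produce a huge spurious velocity.

```gdscript
# Sketch: guard _track_average_velocity against a zero timestep.
func _track_average_velocity(_dt):
	if _dt <= 0.0:
		return  # zero-length step: keep the previous average_velocity
	average_velocity = Vector3(0, 0, 0)
	for i in range(0, _velocity_buffer_size):
		average_velocity += _velocity_buffer[i]
	average_velocity *= 1.0 / (_dt * _velocity_buffer_size)
	_last_velocity_position = global_transform.origin
```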
I wasn't able to get it to work, no matter whether I ran it before or after initialize in my root node script. The FFR level won't change.
(I'm using GLES3, if that matters.)
I had the step offset set to 0.4, and when I held a grabbed object towards my body, the character assumed I was standing on it and moved me up; that moved the held object up as well, and this repeated until I was high up in space.
Hello! I have recently started VR development, and found your toolkit to be really helpful. I made a test world scene to add all the VR features: player, RigidBodies, StaticBodies, etc. Everything worked perfectly except the climbing mechanic. Grabbing objects works by itself. Climbing works by itself. But when I add both mechanics, instead of grabbing and throwing a RigidBody, you climb on the RigidBody while being able to rotate it at the same time. I tried reprogramming a few things, but none of my attempts worked. Is there a fix, a workaround, or something I missed? Will there be an update to the toolkit that fixes this issue? Also, is there a way to make the climbing mechanic only work on certain StaticBodies? I tried groups, classes, and even attempted to make a custom node. If you can answer any of my questions, I'll be really grateful.
When you install from the asset library and try to run any example scene you get a "Parser Error: Expected a constant expression." on a line such as:
```gdscript
export(vr.CONTROLLER_BUTTON) var grab_button = vr.CONTROLLER_BUTTON.GRIP_TRIGGER;
```
I remember from last time that it took quite a while to find out you have to add vr_autoload.gd as an autoload singleton named vr in the project. This problem was made worse because I was new and didn't know what autoloading was.
Is there a way to either (1) make this startup error useful by telling you to add the autoload, (2) put this requirement in bold in the plugin description, or (3) programmatically load the singleton at the start of each demo so this doesn't need to be done manually?
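For reference, adding the singleton by hand amounts to one entry in project.godot (assuming the default toolkit path; the same can be done via Project Settings → AutoLoad):

```
[autoload]

vr="*res://OQ_Toolkit/vr_autoload.gd"
```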
Also, there's a typo on line 20 of OQ_UI2DLabel.gd.
```gdscript
onready var ui_color_rect : CenterContainer = $Viewport/ColorRect
```
should be
```gdscript
onready var ui_color_rect : ColorRect = $Viewport/ColorRect
```
First, let me thank you for putting this repo together. This is truly inspiring and extremely useful.
I'm trying to get the demos running. With the HandTrackingDemoScene, I can only see a black label telling me "Please enable hand tracking for gesture detection!".
I've looked around in different Godot project properties and the documentation you have for this project. But I can't seem to find how to activate this. Is there something I'm missing?
Godot Version: 3.2 beta5
Hi, I recently came back to playing around with the Godot Oculus Quest Toolkit. On my Quest 1, using Godot 3.4 and Quest Toolkit 0.4.2 (also tried the current git version), neither the demo scenes nor my own scenes show the controller models. They are visible in Godot's desktop preview, but not on the actual Quest. I see no error messages in the debug log, and the controllers themselves track fine; just the models are invisible. I also tried explicitly setting the controller model type to "Quest 1" in the controller nodes, and tried both GLES2 and GLES3.
Hi, when running the demo scene in Godot 3.2.2 beta 3, the debugger complained about this line with a mismatch between CenterContainer and ColorRect. Weird that this didn't come up when running the demo in Godot 3.2.1 before.
Either I am not understanding this feature correctly, or you forgot to apply delta_position to the grabbed object after calculating it.
Failed to initialize libOVR error
```
Godot Engine v3.2.stable.official - https://godotengine.org
OpenGL ES 3.0 Renderer: GeForce MX150/PCIe/SSE2
Project is missing: C:/Users/Ivan/Documents/Oculus Quest Button Testing/project.godot
Editing project: C:/projects/godot_oculus_quest_toolkit-master (C:::projects::godot_oculus_quest_toolkit-master)
Godot Engine v3.2.stable.official - https://godotengine.org
OpenGL ES 2.0 Renderer: GeForce MX150/PCIe/SSE2
Failed to initialize libOVR.
Compiling blit shader
Linking blit shaders
ERROR: initialize: No library set for this platform
   At: modules/gdnative/gdnative.cpp:290
ERROR: does not have a library for the current platform.
   At: modules/gdnative/nativescript/nativescript.cpp:1483
ERROR: does not have a library for the current platform.
   At: modules/gdnative/nativescript/nativescript.cpp:1483
```
We need to be able to query the device type:
The ovrDeviceType value, VRAPI_DEVICE_TYPE_OCULUSQUEST2, has been added to the API.
The upstream oculus_mobile plugin does it like this:
```cpp
inline int get_device_type() {
	return ovrmobile::get_device_type(OvrMobileSession::get_singleton_instance());
}

inline bool is_oculus_quest_1_device() {
	return ovrmobile::is_oculus_quest_1_device(OvrMobileSession::get_singleton_instance());
}

inline bool is_oculus_quest_2_device() {
	return ovrmobile::is_oculus_quest_2_device(OvrMobileSession::get_singleton_instance());
}

ovrDeviceType get_device_type(OvrMobileSession *session) {
	return check_session_initialized<ovrDeviceType>(
			session,
			[&]() {
				auto device_type = static_cast<ovrDeviceType>(vrapi_GetSystemPropertyInt(
						session->get_ovr_java(), VRAPI_SYS_PROP_DEVICE_TYPE));
				return device_type;
			},
			[]() { return VRAPI_DEVICE_TYPE_UNKNOWN; });
}

bool is_oculus_quest_1_device(OvrMobileSession *session) {
	ovrDeviceType device_type = get_device_type(session);
	return device_type >= VRAPI_DEVICE_TYPE_OCULUSQUEST_START &&
			device_type <= VRAPI_DEVICE_TYPE_OCULUSQUEST_END;
}

bool is_oculus_quest_2_device(OvrMobileSession *session) {
	ovrDeviceType device_type = get_device_type(session);
	return device_type >= VRAPI_DEVICE_TYPE_OCULUSQUEST2_START &&
			device_type <= VRAPI_DEVICE_TYPE_OCULUSQUEST2_END;
}
```
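Once wired through, GDScript usage might look like the following (a sketch; whether these calls are reachable via vr.ovrBaseAPI is an assumption of this example):

```gdscript
# Sketch: branch on the detected headset. The is_oculus_quest_*_device()
# calls mirror the upstream plugin functions quoted above; their exposure
# point on the toolkit's API object is assumed here.
func _apply_device_defaults():
	if vr.ovrBaseAPI.is_oculus_quest_2_device():
		vr.set_display_refresh_rate(90)  # Quest 2 supports higher refresh rates
	elif vr.ovrBaseAPI.is_oculus_quest_1_device():
		vr.set_display_refresh_rate(72)
	else:
		vr.log_info("Unknown device type; keeping defaults.")
```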
Hi, it seems the OVRMobile addon is no longer up to date, as the recent version contains more functions (like vibrate_controller, etc.). It would be nice to update it.
In addition, there's Godot's upcoming switch to the newer Android plugin system, which might be worth a look for making the toolkit future-proof; the GodotVR driver is already compatible.
How do I make a UI label that always faces the camera and always stays on-screen? I want to use it to display the number of points.
I got labels working via an OQ_UI2DLabel instance. But it only faces one direction and doesn't move around. I tried placing it as a child of the controller (works great in the VR simulator), camera, origin, etc., but it simply stays in place.
To be clear, I would like it to always be on-screen and facing the camera, like any other HUD element. In 2D, I could achieve this with a CanvasLayer and a label sub-control.
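A common workaround (a sketch, not a toolkit feature; vr.vrCamera as the toolkit's ARVRCamera reference is an assumption) is to re-place the label in front of the camera every frame instead of parenting it:

```gdscript
# Sketch: pin a world-space label (e.g. an OQ_UI2DLabel) in front of the
# HMD like a HUD element. Attach this to the label's root Spatial.
extends Spatial

export var distance = 1.5  # metres in front of the camera

func _process(_delta):
	var cam = vr.vrCamera.global_transform  # assumed toolkit camera reference
	global_transform.origin = cam.origin - cam.basis.z * distance
	look_at(cam.origin, Vector3.UP)  # may need a 180° Y flip depending on the mesh
```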
As in this function:
https://github.com/NeoSpark314/godot_oculus_quest_toolkit/blob/master/OQ_Toolkit/OQ_ARVRController/scripts/Feature_HandModel.gd#L247
It looks like a cleaner work-around/transform is to apply the inverse of the rotation of the rest position before assigning the pose value:
Instead of:
```gdscript
skeleton.set_bone_pose(_vrapi2hand_bone_map[i], Transform(_vrapi_bone_orientations[i]));
```
do
```gdscript
var j = _vrapi2hand_bone_map[i]
var bone_rest = skeleton.get_bone_rest(j);
skeleton.set_bone_pose(j, Transform(bone_rest.basis.inverse() * Basis(_vrapi_bone_orientations[i])))
```
This might work more within the grain of how the bone posing system works. I'm sure there's a method in its madness, though I can't see it yet.
Hey, I've been using your toolkit for a while now, and I first want to thank you a lot for your work. It's my go-to plugin when creating a new project and some of the features are absolute life savers.
I understand that Godot is moving to OpenXR and that the VRApi support is going to end during 2022. I'd love to move to OpenXR but I'm worried it will break compatibility with this addon. Can you tell me if you already support OpenXR? If yes, what steps should I take to replace the VRApi? And if not, do you think it's possible?
First, thanks for doing this project, it's a really great idea!
Sorry if I'm doing something very dumb; I'm still new to Godot and game dev in general. I installed the toolkit through the asset store, set GameMain.tscn as the main scene, and dragged demo_scenes/GodotSplash.tscn in as a child of the root node. When I click the play button to run the game, it errors in GameMain.gd at the first reference to the vr object, vr.button_just_released(). I don't quite get how this vr object is supposed to be created or inherited from.
This is on godot 3.2.1 on macOS. Happy to provide additional details. I see the wiki is a WIP, I'd be happy to pitch in there a little more once I figure it out myself.
Great job!
We deployed the toolkit successfully, thanks. But the hand models are missing in the hand-tracking demo scene. Can anyone help solve this issue?
I thought I'd open the discussion about this here.
I can imagine my project working with hand tracking, except for my need for smooth (non-teleporting) artificial locomotion.
Currently I use the Stick locomotion from this project.
I don't currently have any (good) ideas for what that would look like.
The physics interaction (standing and falling) happens against a body that is completely invisible.
In Godot XR Tools I was able to put a squashed cylinder mesh into Function_Direct_movement -> KinematicBody to represent my foot on the ground, which is useful for debugging.
When I put an object into the Feature_PlayerCollision, it is fixed relative to the playing-area origin instead of to the player position.
I was just setting up a new project with OQT and thought it might be nice for new users to make things a little more automatic with a plugin. It could automatically add the vr singleton for them and remove one more step.
Hello again. So far I've been using the toolkit for small personal projects, mainly for experimentation and learning.
However, I recently published a small project on SideQuest, and I was wondering if it could be added to the featured projects section here.
https://sidequestvr.com/app/2519
Any help will be of course greatly appreciated
I'm sure I'm just using it wrong, but what did I mess up? While I'm here, I might as well also ask: how much of the toolkit is actually needed, and what is just demos? (What do I need to keep to have a clean project?)
The right-hand saber's orientation is incorrect, making it unplayable. It shouldn't be a hard fix, I think?
On my Oculus Quest 2, the is_oculus_quest_2_device() method returns false and is_oculus_quest_1_device() returns true.
Maybe the AndroidManifest.xml needs to be modified to detect the correct device type on the Oculus Quest 2.
This would make it fit a bit better if it's going to be put on the asset library at some point.
Hi (this is a discussion, not an issue),
I'm just playing around with the library, and I want to know: how can I make an object collide with other objects after grabbing it with the controller? Take, for example, the table tennis bat: when we throw it in the scene, it collides with other objects, but while it is grabbed it does not collide at all.
Please suggest something. (I tried the demo game and files, but did not find anything.)
Thanks
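One pattern that keeps collisions alive while holding an object (a sketch of a velocity-driven grab, not how the toolkit's grab features currently work) is to avoid locking the body to the controller and instead steer it toward the hand each physics step:

```gdscript
# Sketch: instead of parenting the grabbed body to the controller (which
# effectively makes it kinematic), keep it a normal RigidBody and drive it
# toward the hand, so the physics engine still resolves collisions.
var held_body = null  # RigidBody currently grabbed, or null

func _physics_process(delta):
	if held_body != null:
		var to_hand = global_transform.origin - held_body.global_transform.origin
		# exact-follow velocity; consider clamping it to avoid violent pushes
		held_body.linear_velocity = to_hand / delta
		held_body.angular_velocity = Vector3.ZERO  # simplification for the sketch
```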
Installed Release 0.4.2 on Godot 3.3 (Mono Build) on Windows 10.
What happens next is that SteamVR starts up and the OpenVR plugin is detected. The headset tracking works, and I can see it working properly in the preview on my PC monitor; however, nothing shows up in the headset itself (i.e. it's black). I tried changing the engine target FPS to 72 to match the Quest 1, but it made no difference.
Note: I have verified the VR demo in the Godot Tutorial works just fine on the Oculus Link.
```
--- Debugging process started ---
Godot Engine v3.3.stable.mono.official - https://godotengine.org
OpenGL ES 2.0 Renderer: AMD Radeon RX 6800 XT
OpenGL ES Batching: ON
Initializing VR (Toolkit version 0.4.2)
Available Interfaces are [{id:0, name:OpenVR}, {id:1, name:Native mobile}]:
Found OpenVR Interface.
Success initializing OpenVR Interface.
_perform_switch_scene to res://demo_scenes/GodotSplash.tscn
switching to scene 'res://demo_scenes/GodotSplash.tscn'
ERROR: : get_tracking_space(): no ovrBaseAPI object.
Tracking space is: -1
ERROR: : get_boundary_oriented_bounding_box(): no ovrBaseAPI object.
get_boundary_oriented_bounding_box is: [1, 0, 0, 0, 1, 0, 0, 0, 1 - 0, 0, 0, (1.93, 2.5, 2.25)]
Engine.target_fps = 72
Switching model for controller 'oculus_quest_controller_left_1' (id 1)
WARNING: Unable to automatically determine controller model type.
Switching model for controller 'oculus_quest_controller_right_2' (id 2)
WARNING: Unable to automatically determine controller model type.
_perform_switch_scene to res://demo_scenes/UIDemoScene.tscn
switching to scene 'res://demo_scenes/UIDemoScene.tscn'
Changed move speed to 1.000000
Changed smooth turn speed to 90
Changed click turn angle to 45
Switching model for controller 'oculus_quest_controller_left_1' (id 1)
WARNING: Unable to automatically determine controller model type.
Switching model for controller 'oculus_quest_controller_right_2' (id 2)
WARNING: Unable to automatically determine controller model type.
ERROR: : MixedRealityCapture currently requries a special build with the method 'VisualServer.viewport_get_color_texture_id'
```
Teleport locomotion is a fundamental system needed in many different VR applications. We need such a system as a Feature_TeleportLocomotion inside the toolkit.
It should be a simple drag-and-drop solution comparable to the Feature_StickLocomotion and provide a few parameters to customize. The visual cue for the teleport location and the arc should be rendered in a performant way on the Oculus Quest.
The hit detection is likely going to use a ray-cast based system.
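The ray-cast hit detection could be sketched roughly like this (function and parameter names are illustrative, not the eventual toolkit API):

```gdscript
# Sketch: cast a ray from the controller and accept only near-flat
# surfaces as teleport destinations.
func _find_teleport_target(controller, max_distance = 10.0):
	var origin = controller.global_transform.origin
	var dir = -controller.global_transform.basis.z  # controller forward
	var space_state = controller.get_world().direct_space_state
	var hit = space_state.intersect_ray(origin, origin + dir * max_distance)
	if hit.empty():
		return null  # nothing hit within range
	if hit.normal.dot(Vector3.UP) < 0.7:
		return null  # too steep to stand on
	return hit.position
```

Note that intersect_ray has to run during physics processing; the arc variant would step a simulated projectile and ray-cast each segment the same way.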
Hi, sifting through the code, I stumbled over a minor typo in vr_autoload.gd, beginning here, should be ovrPerformance, not ovrPerfromance ;)
This line: https://github.com/NeoSpark314/godot_oculus_quest_toolkit/blob/master/OQ_Toolkit/OQ_ARVRController/scripts/Feature_UIRayCast.gd#L30
```gdscript
if (vr.ovrHandTracking):
```
causes an error because ovrHandTracking is not listed anywhere in
https://github.com/NeoSpark314/godot_oculus_quest_toolkit/blob/master/OQ_Toolkit/vr_autoload.gd
It's not obvious what it is supposed to be, or if there is a huge block of code missing.
The node name in Feature_Quest2ControllerModel_Right, "Feature_Quest2ControllerModel_Right", is a typo.
Hi, I'm still new to Godot, so I might have overlooked something, but it seems the only way to use the toolkit right now is via GDScript. It would be great to make the classes etc. available to C# scripts as well.
There is an example of it at this point in the video https://youtu.be/BswmgR18tZ8?t=1100 (You can also experience it in the free intro for The Under Presents.)
This method is the most pleasant and efficient VR locomotion I have seen, even though it is for some reason not listed in: https://locomotionvault.github.io/
The mechanism is as follows:
- Hold the A button and pull back towards your chest.
- Release the A button and the world snaps around you to meet the moved centre of your field of view.
Apparently this could be implemented by a vertex shader, but I don't know how or when it would get slotted into the rendering pipeline in Godot (TUP is implemented in Unity). Would it be a parameter that gets applied to every material? It should not affect the lighting geometry, because it is a distortion that applies just prior to rendering.
The technique could be faked at first with a vignette (darkening of the peripheral field of view) and by moving the camera to the final pulled-back position, which would capture many of the advantages in speed and efficiency, even if it didn't look quite as wonderful.
We need arm-swing-based movement in this; walk-in-place is rather difficult, I must say.
I got it working! This is a miracle of cross-platform compatibility.
This needs a debug warning to install the OpenVR plugin if no devices are found and you have one plugged in. This blocked me for about an hour until I figured it out. (I didn't give up because I remembered it worked in another project.)
The controllers are at the wrong angle for playing Beep Saber: they poke up instead of outwards. I succeeded in playing by holding the controllers upside down.
This is the feature exhibited in the System area of the Quest when you have hand-tracking.
Any time the confidence is below 1 the hand stops moving and fades away nicely.
Although you can use hands with confidence between 0 and 1 there's a tendency for them to dance around and not give a very good experience.
By taking the view of the hands away when the tracking is less than excellent the player can easily be trained to keep their hands apart and visible to the cameras, and thus operate the hand controls much more reliably and with less frustration.
This is a good thing.
I implemented this feature in my app here:
https://github.com/goatchurchprime/tunnelvr/blob/master/HandScript.gd#L153
One thing that makes it more complicated in the OQ_toolkit than in my system is that the hand meshes are under the transform nodes of the ARVRControllers instead of being in a separate node under the ARVROrigin. This means that to keep them stationary during the fade-out (much less distracting than watching them jump about), we're going to have to undo the controller transform.
You had to do exactly this on the line with vr.ovrBaseAPI.get_pointer_pose(), which is given relative to the Origin, not the Controller:
https://github.com/NeoSpark314/godot_oculus_quest_toolkit/blob/master/OQ_Toolkit/OQ_ARVRController/scripts/Feature_UIRayCast.gd#L31
I took this as a cue that the pointer didn't belong under the Controller node and was something independent.
It would also be nice to get the transparent glass effect you get with the hands in the System area of the Quest, but that's something much more complicated than simply assigning a translucent material.
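The fade itself can be sketched as follows (the confidence value and material variable are assumptions for illustration; the real implementation is in the linked HandScript.gd):

```gdscript
# Sketch: fade the hand mesh out when tracking confidence drops below 1.
# `confidence` is assumed to come from the hand-tracking API each frame;
# `hand_material` is assumed to be a SpatialMaterial with transparency enabled.
export var fade_speed = 4.0  # alpha change per second
var _alpha = 1.0

func _update_hand_fade(confidence, delta, hand_material):
	var target = 1.0 if confidence >= 1.0 else 0.0
	_alpha = move_toward(_alpha, target, fade_speed * delta)
	hand_material.albedo_color.a = _alpha
	visible = _alpha > 0.0
```

The material needs its transparency flag enabled for the alpha change to show.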
Hello!
I'm currently working on a project where I'm using the button and joystick inputs of the touch controllers fairly frequently.
I'm currently using a mix of signals, but this very quickly gets out of hand, since I'm doing a lot of prototyping and debugging outside of VR.
I'd very much like to be able to just map the Touch Controller Button and Joysticks in the Input Map, and haven't seen any information on this.
Thanks for all of your work! Getting everything up and running was very easy with the tools provided.
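In Godot 3 an ARVRController also registers as a joypad, so its buttons can in principle be bound to Input Map actions. A runtime sketch (the action name is just an example, and the grip constant should be verified for the Touch controllers):

```gdscript
# Sketch: bind a VR controller button to an InputMap action at runtime so
# Input.is_action_pressed("grab") works alongside keyboard bindings used
# for out-of-VR prototyping.
func _register_grab_action(controller):
	if not InputMap.has_action("grab"):
		InputMap.add_action("grab")
	var ev = InputEventJoypadButton.new()
	ev.device = controller.get_joystick_id()  # the controller's joypad id
	ev.button_index = JOY_VR_GRIP  # built-in constant for the grip button
	InputMap.action_add_event("grab", ev)
```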
@Wavesonics reported this issue.
The change from enum to var for remapping controller buttons broke the use of button names/assignments in the editor.
The error is, for example:
```
res://OQ_Toolkit/OQ_ARVRController/scripts/Feature_AreaObjectGrab.gd:21 - Parse Error: invalid index 'CONTROLLER_BUTTON' in constant expression
modules/gdscript/gdscript.cpp:599 - Method failed. Returning: ERR_PARSE_ERROR
```
I had to comment out the two lines at startup in vr_autoload that do the following logging:
```gdscript
log_info(str(" is_oculus_quest_1_device: ", is_oculus_quest_1_device()));
log_info(str(" is_oculus_quest_2_device: ", is_oculus_quest_2_device()));
```
It looks like this could be an updated function in a more recent version of the godot_ovrmobile addon than the one that comes from the AssetLib (v3.0.1)? Have I got that right?
I prefer not to commit binary libraries into my source code repo. Is there a way to read the version of the loaded library and give a warning if it is less than what the toolkit expects?
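Short of a version number, one workable runtime check (a sketch; it assumes the functions are exposed on the toolkit's ovrBaseAPI object) is to probe for the newer methods before calling them:

```gdscript
# Sketch: degrade gracefully when the bundled godot_ovrmobile library is
# older than the toolkit expects, instead of erroring at startup.
func _log_device_type_if_supported():
	if vr.ovrBaseAPI and vr.ovrBaseAPI.has_method("is_oculus_quest_2_device"):
		vr.log_info(str(" is_oculus_quest_1_device: ", vr.ovrBaseAPI.is_oculus_quest_1_device()))
		vr.log_info(str(" is_oculus_quest_2_device: ", vr.ovrBaseAPI.is_oculus_quest_2_device()))
	else:
		vr.log_info("godot_ovrmobile library too old for device-type queries; skipping.")
```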
I was looking for some docs on what OQClass_Tool is, as well as how to use it. It looks like there is no documentation for this in the wiki.
The infotext instructions are useful at first, but too large and get in the way when you are debugging.
This feature should already work, but for some reason the visibility of the 3D node doesn't influence the visibility of the 2D node created below it. It doesn't even matter if you set the ColorRect 2D node to visible = false; something keeps setting it back to true (is this a bug?).
The quickest way to get it to work is to insert `if visible:` above the line that inserts the node:
https://github.com/NeoSpark314/godot_oculus_quest_toolkit/blob/master/OQ_Toolkit/OQ_ARVROrigin/scripts/Feature_VRSimulator.gd#L81
3D dev noob here. Sorry if this is something obvious (flipped normals?) but I can't figure it out.
I created a new project/scene with a fork of one of the others (BeatSaber, I think) and imported three different FBX models; one of them is this apple one from TurboSquid: https://www.turbosquid.com/3d-models/apple-cartoon-3d-1495154.
I created a new scene by right-clicking, picking "new inherited scene," and I can verify that the mesh appears normally. I save an instance, go to my game scene, and add the newly-saved instance; it shows up correctly.
The really puzzling thing: if I run the regular (Windows) target/debugger, the mesh appears fine. But if I deploy and run on the Quest, the mesh is simply missing.
Things I tried:
Two things that worked:
What am I missing / doing wrong?
It would be pretty nice to have two separate projects here.
VR Toolkit
This would contain things like the buttons, UiCanvas, and lots of other things that are just solutions to general VR problems.
Quest Toolkit
This would contain the Quest-specific utilities, setup, models, etc.
I think lots of VR projects will want to support both Quest and PC VR. Separating these out would allow anyone to use the VR Toolkit.