
ubiq's Introduction

Welcome to Ubiq

Picture of Avatars Waving

Ubiq is a Unity networking library for research, teaching and development, maintained by the Virtual Environments and Computer Graphics group at UCL. Ubiq is 100% free and open source.

Features

Ubiq's goal is to enable your networked project. It includes message passing, room management, rendezvous and matchmaking, object spawning, shared binary blobs, multiple synchronisation models, lightweight XR interaction examples, customisable avatars and voice chat across Windows, Linux, Android, macOS, and JavaScript running in the browser.

For Researchers

Instructions for setting up your own server are included. Ubiq does not rely on any third-party services, making it GDPR-safe for your experiments.

Supported Unity Versions

Ubiq supports Unity 2021.3.22 LTS or later. If you are building for WebXR, we recommend using Unity 2022.3.16 due to an issue with later minor versions.

Quick Start

  1. Add Ubiq using the UPM, with the "Install package from git url..." option:
https://github.com/UCL-VR/ubiq.git#upm
  2. Select Ubiq from the package list and import the Demo (XRI) sample.

  3. Open the Assets/Samples/Ubiq/x.y.z/Demo (XRI)/Demo.unity scene.

  4. Click Play.

You're connected! For next steps see the Getting Started guide in our docs at https://ucl-vr.github.io/ubiq/.
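
To give a flavour of the message passing model, below is a minimal sketch of a networked component. It is based on the Ubiq.Messaging types visible in the stack traces later on this page; the NetworkContext handle and the SendJson/FromJson helpers are assumptions, so check the Getting Started guide for the real API.

using UnityEngine;
using Ubiq.Messaging;

// Minimal sketch: broadcast this object's colour to all peers.
// NetworkContext, SendJson and FromJson are assumed helpers.
public class NetworkedColor : MonoBehaviour
{
    private NetworkContext context;

    private struct Message
    {
        public Color color;
    }

    private void Start()
    {
        context = NetworkScene.Register(this); // join the message graph
    }

    public void SetColor(Color color)
    {
        GetComponent<Renderer>().material.color = color;
        context.SendJson(new Message { color = color });
    }

    public void ProcessMessage(ReferenceCountedSceneGraphMessage message)
    {
        var m = message.FromJson<Message>();
        GetComponent<Renderer>().material.color = m.color;
    }
}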

Do you use Ubiq?

Having a list of projects using Ubiq helps when applying for funding.

If you use Ubiq in any way, please consider telling us using the form below. It only takes 30 seconds, all responses are anonymous, all fields optional, and the data will only be used in aggregate for applications.

https://forms.gle/DsnFZVA3RvtNhge37

ubiq's People

Contributors

bnco-dev, fjthiel, k-lara, nsalminen, sebjf, thebv


ubiq's Issues

Quest 3 microphone input garbled in build when using Unity WebRTC plugin

Quest 3 microphone input for VOIP appears to be gathered or transmitted incorrectly. Audio received by other peers from the Quest 3 peer is played back on a short loop and misses samples.

As a workaround, we now fall back to the old dotnet WebRTC implementation on Android. This is the case as of 2ca2da6 (and so v1.0.0-pre.3).

@k-lara suggested it may be a sample rate issue, which is a good place to start!
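
As a quick diagnostic for the sample rate theory, something like the following could log the device's microphone capabilities against the output rate. This uses only standard Unity APIs, nothing Ubiq-specific.

using UnityEngine;

// Logs the default microphone's supported sample rates next to the
// output sample rate, to check for a mismatch on the Quest 3.
public class MicRateProbe : MonoBehaviour
{
    private void Start()
    {
        Microphone.GetDeviceCaps(null, out var minFreq, out var maxFreq);
        Debug.Log($"Mic caps: {minFreq}-{maxFreq} Hz; output rate: {AudioSettings.outputSampleRate} Hz");
    }
}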

Related info:

  • Audio the Quest 3 peer receives from other peers is played back correctly.
  • Quest 2 works fine, send and receive.

Network IDs of pre-existing scene objects are always the same

I've been trying to create a scene with different objects placed around it. So far I've created a Ball script and Frisbee script (and their respective prefabs) with some differences regarding their Grasp and Use behaviour. My problem is that I cannot seem to get them to have different Network IDs (and thus be synchronized as different objects on different peers).

Note that this doesn't apply in the case of spawning, as spawned objects have different IDs assigned to them, while the pre-existing ones always have a series of zeroes separated by a dash. How should I assign different IDs to them?
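
For context, a sketch of what assigning explicit ids could look like. It assumes NetworkScene.Register has an overload taking an explicit NetworkId, and that NetworkId can be constructed from a number; both are assumptions about the API rather than confirmed usage.

using UnityEngine;
using Ubiq.Messaging;

// Sketch: give each pre-placed scene object its own fixed NetworkId.
// The Register overload and NetworkId constructor are assumptions.
public class FixedIdObject : MonoBehaviour
{
    public uint objectId = 1001; // set a distinct value per object in the Inspector

    private void Start()
    {
        NetworkScene.Register(this, new NetworkId(objectId));
    }

    public void ProcessMessage(ReferenceCountedSceneGraphMessage message)
    {
        // handle synchronisation messages here
    }
}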

Issue with HTC Vive pro

Hi,

I have an issue getting what is inside Unity to play on the HTC Vive Pro.
I cannot see the screen inside VR. Could you give some help?
How do I use Ubiq with the HTC Vive Pro?

Thanks
Ihshan

Hand tracking issues with Quest 2

Hand tracking doesn't seem to be working at all with the Quest 2.
The hands sit at around waist level and you can't really move them at all. They do seem to change rotation and position ever so slightly, but not in a meaningful way.

I've tried messing with the floor height and various other settings but to no avail.

.NET compatibility issues with Unity version 2021.3.X

When using Unity 2021.3.X we can run and build the project without any problems.

We do, however, get some errors in the editor, and we also can't debug the application because of them.
E.g.:

The type 'Span<T>' exists in both 'System.Memory, Version=4.0.1.1, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51' and 'netstandard, Version=2.1.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51'

I wouldn't have mentioned this, since Ubiq still targets 2020.X, but a recent commit addressed some other issues with 2021.X. See ca6d8ea

I've resolved these issues before by removing the plugin .dlls, since things seem to work without them, but maybe I'm missing something. See ubiq-fork

So I assume it might have to do with the existing plugin libraries targeting a different .NET Standard? I'm not sure though.

More access to audio samples

It would be great to have access to the audio samples for the Unity WebRTC implementation similar to the .Net implementation.
Just read access would be enough.

It would also be helpful to access the local player's microphone input even when not connected to other users.
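
Until such an API exists, one Unity-level workaround is a read-only tap via OnAudioFilterRead on whatever AudioSource carries the microphone signal; where that source lives in Ubiq's WebRTC setup is an assumption here.

using UnityEngine;

// Workaround sketch: read-only access to samples flowing through the
// AudioSource on this GameObject. Runs on the audio thread.
public class AudioSampleTap : MonoBehaviour
{
    public event System.Action<float[], int> OnSamples;

    private void OnAudioFilterRead(float[] data, int channels)
    {
        OnSamples?.Invoke(data, channels); // read only; do not modify data
    }
}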

Node sample app logcollectorservice not working

It seems that the LogCollector service isn't working anymore, most likely due to recent changes on the server side.
It can't connect to rooms anymore, and after getting it to connect to a room (after some modifications), the listeners just don't work either.

Object despawning doesn't work (on peer objects)

When using the NetworkSpawnManager to despawn an object that was previously spawned with it, it only despawns locally but not on the other clients.
After some time debugging, it seems to me that the update done on the local peer never gets correctly sent to the others.

When rejoining the room the object that should've been despawned doesn't get spawned again, so everything then works as intended.

I haven't tested this with room-scope objects.
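
For clarity, a minimal repro sketch of the behaviour described above. The namespace and the Find/SpawnWithPeerScope/Despawn names are assumptions about the NetworkSpawnManager API.

using UnityEngine;
using Ubiq.Spawning; // assumed namespace

public class DespawnRepro : MonoBehaviour
{
    public GameObject prefab;

    public void SpawnThenDespawn()
    {
        var manager = NetworkSpawnManager.Find(this); // assumed lookup helper
        var go = manager.SpawnWithPeerScope(prefab);   // appears on all peers
        manager.Despawn(go);                           // observed: removed locally only
    }
}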

Minor Fixes

Below is a set of minor fixes that should be made before the start of term.
These are based on observations from last year.
Hopefully we can try and do these next week (they are very minor!)

  1. FollowGraspable should have a flag saying whether it's grasped (see the sketch after this list)
  2. It should be easier to delete the Bot from the Local Loopback scene
  3. Make sure all events (especially in AvatarManager) are all initialised (not relying on the consumer to do so)
  4. RoomClient.Find should be able to be called before NetworkScene.Register, or any other method
  5. Turn off Graphic Raycaster on the canvas by default
  6. Save and Mirror Project settings should be stored on a per project basis
  7. NetworkScene should throw an error when attempting to register an uninitialised NetworkId
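
For item 1, a sketch of the flag. The Grasp/Release entry points and their parameter are assumptions about FollowGraspable's current shape.

using UnityEngine;

// Sketch: expose whether the object is currently grasped.
public class FollowGraspable : MonoBehaviour
{
    public bool IsGrasped { get; private set; }

    public void Grasp(Component controller) { IsGrasped = true; }

    public void Release(Component controller) { IsGrasped = false; }
}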

HoloLens 2 support

Hello,
we would like to use Ubiq with the HoloLens 2 instead of virtual reality HMDs, to connect multiple users in AR. It seems like there is currently no HoloLens support, or at least nothing documented. In a first attempt to set it up, none of the demo scenes seem to work with the MRTK; at least, the menus and buttons are not interactable with your own hands. I am also unsure how to set up the avatars for AR use. Is there any more information or documentation regarding using Ubiq with a HoloLens 2?

Errors stop building to APK

Here are the steps I took to encounter the errors:

  1. Clone the code
  2. Open with Unity Hub and download the recommended Unity version
  3. Switch to the Android platform
  4. Player Settings -> XR Plug-in Management -> tick Oculus
  5. Open the Demo scene in the Samples -> Demo (XR) folder
  6. Go to Build Settings, add the open scene, tick the Demo scene
  7. Click Build
  8. Name the APK
  9. After 3-5 seconds, the console showed errors and the build task was cancelled
    (screenshot of the console errors)

I also tested the release version and had the same problem

Packaging errors

The version at https://github.com/UCL-VR/ubiq.git#upm has a number of packaging errors out of the box, which are serious enough to prevent compilation or attaching a debugger.

2023.2.0b9

  • "Platform name 'VisionOS' not supported."**

This is a known bug in Unity: the 'VisionOS' platform option is not available in some versions, despite having been introduced in 2022. This causes problems for the null VOIP asmdef, which prevents compilation of projects outside very specific versions.

2021.3.38 LTS

  • "Platform name 'QNX' not supported."
  • "An error occurred while resolving packages"

The latter is because Ubiq now relies on Shader Graph 14, whereas pre-2022 versions of Unity ship with version 12 built in. Ubiq core should probably not rely on Shader Graph at all.

2022.3.18 works OK.

Is it possible to implement remote rendering on ubiq?

Hello, our team is planning to implement a remote-rendering-based social VR application, where the rendering workload is offloaded to the server. We are wondering whether it is possible to instrument the code of Ubiq to implement this.

Specifically, we need to design a rendering pipeline on the server side and transmit the rendered content to the client. We want to implement our scheme based on ubiq. Do you have any suggestions about our idea?

NetworkSpawner Catalogues

This is the counterpart to #40, which is the less minor thing that keeps coming up.

The NetworkSpawner Catalogue is included with the Samples, i.e. it is maintained upstream. However, users need to add their own Prefabs to it.
The NetworkSpawner should be modified so users don't have to change any Sample files to make their projects work.

One way to do this is to make the NetworkSpawner take a List of Catalogues. Users can create a Catalogue for their project, and add it to the Spawner, where that association will be saved with the Scene.

The downside of this is that they will need the same configuration across all Scenes for the indexing to work.
So the referencing also needs to be changed so that the identity is taken from a property of the Prefab that is project-wide (where possible).

Additionally, when spawning at the Room level, there should be an API that supports co-routines or callbacks, so users can reference the instance after it is created in one function.
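
A sketch of the proposed shape, with all names hypothetical:

using System;
using System.Collections.Generic;
using UnityEngine;

// Hypothetical catalogue asset; the real catalogue type may differ.
public class PrefabCatalogue : ScriptableObject
{
    public List<GameObject> prefabs;
}

// Hypothetical multi-catalogue spawner, as proposed above.
public class NetworkSpawner : MonoBehaviour
{
    // Users add their own catalogue per project; saved with the Scene.
    public List<PrefabCatalogue> catalogues;

    // Room-scope spawn with a callback, so callers can reference the
    // instance as soon as it is created.
    public void SpawnWithRoomScope(GameObject prefab, Action<GameObject> onSpawned)
    {
        // Resolve the prefab to a project-wide identity across catalogues,
        // send the spawn message, then invoke onSpawned with the instance.
    }
}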

MissingReferenceException when leaving and rejoining a room using the Social Network Scene prefab

When using the Social Network Scene prefab, joining an existing room, leaving it, and rejoining it will result in the following error:

MissingReferenceException: The object of type 'MeshRenderer' has been destroyed but you are still trying to access it.
Your script should either check if it is null or you should not destroy the object.
Ubiq.Samples.NetworkedMainMenuIndicator.SetVisibility (System.Boolean visible) (at Assets/ubiq/Unity/Assets/Samples/_Common/UI/Scripts/NetworkedMainMenuIndicator.cs:83)
Ubiq.Samples.NetworkedMainMenuIndicator.OnRecv () (at Assets/ubiq/Unity/Assets/Samples/_Common/UI/Scripts/NetworkedMainMenuIndicator.cs:134)
Ubiq.Samples.NetworkedMainMenuIndicator.ProcessMessage (Ubiq.Messaging.ReferenceCountedSceneGraphMessage message) (at Assets/ubiq/Unity/Assets/Samples/_Common/UI/Scripts/NetworkedMainMenuIndicator.cs:123)
Ubiq.Messaging.NetworkScene.ReceiveConnectionMessages () (at Assets/ubiq/Unity/Assets/Runtime/Messaging/NetworkScene.cs:270)
Ubiq.Messaging.NetworkScene.Update () (at Assets/ubiq/Unity/Assets/Runtime/Messaging/NetworkScene.cs:246)

Exact setup to reproduce:
Client 1: Creates room
Client 2: Joins room
Client 2: Uses the "Leave" button to leave the room
Client 2: Joins the same room again

Adding a null check to that part of the code fixes this (sketched below), but a similar issue arises at line #135/136
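
The null check in question, sketched against the SetVisibility frame from the stack trace above (the field name is an assumption):

using UnityEngine;

public class NetworkedMainMenuIndicator : MonoBehaviour
{
    [SerializeField] private MeshRenderer meshRenderer; // assumed field name

    private void SetVisibility(bool visible)
    {
        // Unity's overloaded null check also covers destroyed objects.
        if (meshRenderer != null)
        {
            meshRenderer.enabled = visible;
        }
    }
}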

This was tested on the ubiq version from the 11th of November

Max user capacity

Hi there!
After reading through (almost) the entire docs, I got to say, I really like the idea and documentation for this! It's a great project!

I'm looking for a platform dynamic enough to host a number of users and a number of interactable shared objects.
Are there any known limitations, or guesses as to how many VR users we could hook together?

Thanks for creating and sharing this!
Best, Florian

Keyboard and Mouse Controller

Some thoughts about reintroducing the Desktop (keyboard & mouse) controller, which has been on the list since the XRIT upgrade...

Goals

  • To provide an intuitive interface for 2D desktop users
  • Transparent to developers using the XRIT (i.e. drives XR Interactables without any additional setup)

Scope

  • Should work with PC, WebGL & WebXR (outside of VR/AR mode)
  • Touchscreens are out of scope
  • Should support all the interactions in the XR Interactable Toolkit Demo scene sample:
    • Grab
    • Poke
    • Gaze
    • UI

Comparison with Device Simulator

The Component is not a replacement for the XR Device Simulator. Whereas the Device Simulator can emulate all typical XR Controller Inputs, the Desktop interface will be more limited. It will only support the minimum input required to engage the Interactables mentioned above (that is, trigger the Hover, Select and Activate events).

It is expected that applications intending to support the Desktop interface are developed with this in mind, sticking to Interactable setups that are amenable to being driven with a keyboard and mouse, even though the Desktop controller itself requires no active development effort or additional control flow to be configured.

Other applications will still need the XR Device Simulator to emulate more complex interactions, such as multi-interactor behaviour.

Proposed User Story

If an XR device is active, it overrules the Desktop interface. However, if an XR Device is connected but inactive, the Desktop interface should function automatically without any configuration change (1).

Users interact from a first-person perspective. Using WASD will translate the viewpoint. Holding the Right Mouse Button while not over an Interactable will allow rotating the Camera; otherwise the cursor behaves normally (2).

Users use the mouse both to look around and to Select or Grab items. Items are activated with the Left Mouse Button, and selected with the Right Mouse Button (CTRL + Mouse Button on Mac). Moving the cursor over an item should activate Hover. Ideally, items should respond to Hover themselves (e.g. with an XRInteractableAffordanceStateProvider), but we will also colour the reticle so that it responds to Interactable objects, in the way the current Ray Interactor does.

When the Right Mouse button is activated over an Interactable, it is locked and will not change the orientation until released.

If an Interactable is selected, like below, we will show a spline between the cursor and the focal point, to emulate static or heavy items.

(mock-up image)

Users can interact with world-space UI Canvas elements by clicking normally, and text boxes by typing.

(1) This is so users can use the Desktop interface to prototype their scenes with headsets connected, instead of having to keep putting on and taking off a headset.
(2) The reason for masking the orientation change is so users can use the cursor to more easily interact with on screen keyboards and other UI elements.

An alternative to clutching is that the cursor rotates the camera by default, but stops moving when over a UX element. This might be quite nice in XR-focused apps, but could also become odd if the cursor ends up far away from the centre of the screen.

Implementation Options

Almost every interaction can be achieved with the XR Device Simulator's Left Controller and Grab or Trigger button emulation.

One possibility, then, is to create our own InputDevice subclasses and hook into the InputSystem in the same way the Device Simulator does. One problem with this is that the behaviour when multiple XR devices are added appears to be undefined. For example, we can easily take control of the hand from the Device Simulator with an XR controller, but not vice versa.
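
For reference, a sketch of this first option, assuming XRIT's XRSimulatedController and its state struct (the same types the XR Device Simulator uses):

using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.LowLevel;
using UnityEngine.XR.Interaction.Toolkit.Inputs.Simulation;

// Sketch: add a simulated controller device and drive it from the mouse.
public class DesktopControllerDevice : MonoBehaviour
{
    private XRSimulatedController controller;
    private XRSimulatedControllerState state;

    private void OnEnable()
    {
        controller = InputSystem.AddDevice<XRSimulatedController>();
    }

    private void OnDisable()
    {
        if (controller != null) { InputSystem.RemoveDevice(controller); }
    }

    private void Update()
    {
        if (controller == null || Mouse.current == null) { return; }
        // Left mouse -> trigger (Activate); right mouse -> grip (Select).
        state.trigger = Mouse.current.leftButton.isPressed ? 1f : 0f;
        state.grip = Mouse.current.rightButton.isPressed ? 1f : 0f;
        InputState.Change(controller, state);
    }
}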

Another possibility is to manually trigger the Input Action References on the XR Controller scripts. This will provide more control and also mean we can repurpose the existing hands.

Another possibility is to introduce a new GameObject in parallel to the hands, which maintains its own Interactor components, set up specifically for the desktop case. The advantage of this is that it would probably be easier to set up dedicated visuals and configurations more suitable for the Desktop interface. We would not have to worry about fighting over control of the Hand Controllers, because the XR Input Devices can keep tracking the hands while the Desktop component deals with its own GameObject.
For locomotion, the Desktop interface would translate the XR rig and orient the head using the Transforms.

ARM64 Build crashes on startup

I've recently had problems building my Ubiq project for standalone VR use with ARM64. I've built the project before using ARMv7 with no issues, but with ARM64 the project seems to crash as soon as you enter the game, right after the "Made with Unity" splash screen.

After plenty of debugging I seem to have hit a dead end, so I tried building the Start Here sample with ARM64 to see if even the original sample has this problem, and in my case it does. Here are the steps to reproduce my scenario:

  • Using Unity version 2022.3.10f1
  • Create a fresh Unity project using the 3D preset
  • Install the Ubiq package from the package manager using the git url
  • Import the samples and open the Start Here scene
  • Open the build settings, switch to Android
  • Add the open scene to the "Scenes in Build" section
  • Go to Project Settings > Player > Other settings
  • Select "linear" color space
  • Change the scripting backend to IL2CPP
  • Untick ARMv7 and tick ARM64
  • Untick "Auto Graphics API" and remove Vulkan (keep OpenGLES3)
  • Go to Project Settings > XR Plug-in Management and select Oculus under the Android tab
  • Build and sideload onto a Quest 2 headset

If you try to keep Vulkan as your preferred Graphics API, you just get an infinite black loading screen. The build seems to crash most of the time (sometimes it inexplicably loads with no issues) and adb logcat reports a Vulkan-related problem if you use Vulkan, but if you switch to OpenGLES3 you just get a crash due to a null-pointer dereference (which I have no idea how to debug).

My biggest concern is that Meta only accepts ARM64 applications on the store, but more importantly this issue prevents me from building with OpenXR, which is needed to create a build that supports the first Quest headset. I'd be very glad if you tried to reproduce this and told me your results (even with a different Unity version, since it shouldn't make a difference?)

Ubiq Node server not starting

Simply put, running npm i followed by npm run start won't actually start the server.
Instead I get the following error:

H:\Repos\Uni\ubiq-dev\Node\node_modules\ts-node\dist-raw\node-internal-modules-esm-get_format.js:93
        throw new ERR_UNKNOWN_FILE_EXTENSION(ext, fileURLToPath(url));
              ^
CustomError: ERR_UNKNOWN_FILE_EXTENSION .ts H:\Repos\Uni\ubiq-dev\Node\node_modules\ubiq\index.ts

Hint:
ts-node is configured to ignore this file.
If you want ts-node to handle this file, consider enabling the "skipIgnore" option or adjusting your "ignore" patterns.

The main issue is that we are installing the submodules as dependencies, but these are never transpiled to JavaScript.
So we'd need to tell ts-node that it's allowed to also transpile TypeScript files, by adding the skipIgnore flag.
E.g. ts-node-esm --skipIgnore app.ts

I'm not fully convinced that that's the best solution, especially since this also requires us to add @types dependencies, since they are required by the ubiq module.

Maybe I'm missing something though.
Otherwise I'd suggest that it might be a good idea to add a build step to the project that builds the various modules.
