microsoft / mixedrealitytoolkit-unity

This repository is for the legacy Mixed Reality Toolkit (MRTK) v2. For the latest version of the MRTK please visit https://github.com/MixedRealityToolkit/MixedRealityToolkit-Unity

Home Page: https://aka.ms/mrtkdocs

License: MIT License

C# 97.03% ShaderLab 0.80% PowerShell 1.17% HTML 0.01% JavaScript 0.03% CSS 0.01% GLSL 0.86% Python 0.09% HLSL 0.01%
holotoolkit holotoolkit-unity mixed-reality hololens mixedrealitytoolkit mrtk openvr mixedrealitytoolkit-unity unity

mixedrealitytoolkit-unity's Introduction

Important

With the creation of the new Mixed Reality Toolkit organization, there now exist two MRTK repositories: one for version 3+ and a legacy one for version 2.

MRTK v3+

New versions of the Mixed Reality Toolkit will be released by the Mixed Reality Toolkit organization using a repository at https://github.com/MixedRealityToolkit/MixedRealityToolkit-Unity. Visit this repository for the latest version of the MRTK project, and when creating new issues or discussion topics for MRTK version 3 or later.

MRTK v2 (Legacy)

The old MRTK2 repository, https://github.com/microsoft/MixedRealityToolkit-Unity, remains under Microsoft's management and will stay on version 2. Microsoft is committed to the next version of MRTK and recommends that applications move to MRTK version 3 or later. However, Microsoft will continue to support and address critical MRTK2 issues until MRTK2 is deprecated. Please open MRTK2 issues and discussion topics using the old repository.



Mixed Reality Toolkit


What is the Mixed Reality Toolkit

MRTK-Unity is a Microsoft-driven project that provides a set of components and features, used to accelerate cross-platform MR app development in Unity. Here are some of its functions:

  • Provides the cross-platform input system and building blocks for spatial interactions and UI.
  • Enables rapid prototyping via in-editor simulation that allows you to see changes immediately.
  • Operates as an extensible framework that provides developers the ability to swap out core components.
  • Supports a wide range of devices:
XR SDK plugins (Unity XR Plugin Management providers) and the devices they support:

  • Unity OpenXR Plugin (Unity 2020 or 2021 LTS; Mixed Reality OpenXR Plugin required for certain features on certain devices): Microsoft HoloLens 2, Windows Mixed Reality headsets, Meta Quest, devices running SteamVR via OpenXR
  • Windows XR Plugin: Microsoft HoloLens, Microsoft HoloLens 2, Windows Mixed Reality headsets
  • Oculus XR Plugin (Unity 2019 or newer LTS): Meta Quest (via Oculus Integration Package)
  • ARCore XR Plug-in: Android (via AR Foundation)
  • ARKit XR Plug-in: iOS (via AR Foundation)

Additional devices supported:

NOTE: We have introduced the public preview of MRTK3, the next chapter of MRTK. For documentation, please go to the MRTK3 documentation. For code, please go to the mrtk3 branch.

Getting started with MRTK

If you're new to MRTK or Mixed Reality development in Unity, we recommend you start at the beginning of our Unity development journey in the Microsoft Docs. The Unity development journey is specifically tailored to walk new developers through the installation, core concepts, and usage of MRTK.

IMPORTANT: The Unity development journey currently uses MRTK version 2.8.2, Mixed Reality OpenXR plugin version 1.6.0 and Unity 2020.3.42+.

If you're an experienced Mixed Reality or MRTK developer, check the links in the next section for the newest packages and release notes.

Documentation

Starting from MRTK 2.6, we are publishing both conceptual docs and API references on docs.microsoft.com. For conceptual docs, please visit our new landing page. For API references, please visit the MRTK-Unity section of the .NET API explorer. Existing content will remain here but will not be updated further.

  • Release Notes
  • MRTK Overview
  • Feature Guides
  • API Reference

Build status

Branch CI Status Docs Status
main CI Status Docs Validation (MRTK2)

Required software

  • Windows SDK
  • Unity 2018/2019/2020 LTS
  • Visual Studio 2019
  • Emulators (optional)

Please refer to the Install the tools page for more detailed information.

Feature areas

  • Input System
  • Hand Tracking (HoloLens 2)
  • Eye Tracking (HoloLens 2)
  • Profiles
  • Hand Tracking (Ultraleap)
  • UI Controls
  • Solvers
  • Multi-Scene Manager
  • Spatial Awareness
  • Diagnostic Tool
  • MRTK Standard Shader
  • Speech & Dictation
  • Boundary System
  • In-Editor Simulation
  • Experimental Features

UX building blocks

  • Button - A button control which supports various input methods, including HoloLens 2's articulated hand
  • Bounds Control - Standard UI for manipulating objects in 3D space
  • Object Manipulator - Script for manipulating objects with one or two hands
  • Slate - 2D style plane which supports scrolling with articulated hand input
  • System Keyboard - Example script of using the system keyboard in Unity
  • Interactable - A script for making objects interactable with visual states and theme support
  • Solver - Various object positioning behaviors such as tag-along, body-lock, constant view size and surface magnetism
  • Object Collection - Script for laying out an array of objects in a three-dimensional shape
  • Tooltip - Annotation UI with a flexible anchor/pivot system, which can be used for labeling motion controllers and objects
  • Slider - Slider UI for adjusting values, supporting direct hand tracking interaction
  • MRTK Standard Shader - MRTK's Standard shader supports various Fluent design elements with performance
  • Hand Menu - Hand-locked UI for quick access, using the Hand Constraint Solver
  • App Bar - UI for Bounds Control's manual activation
  • Pointers - Learn about various types of pointers
  • Fingertip Visualization - Visual affordance on the fingertip which improves the confidence for direct interaction
  • Near Menu - Floating menu UI for near interactions
  • Spatial Awareness - Make your holographic objects interact with the physical environment
  • Voice Command / Dictation - Scripts and examples for integrating speech input
  • Progress Indicator - Visual indicator for communicating a data process or operation
  • Dialog [Experimental] - UI for asking for the user's confirmation or acknowledgement
  • Hand Coach - Component that helps guide the user when the gesture has not been taught
  • Hand Physics Service [Experimental] - Enables rigid body collision events and interactions with articulated hands
  • Scrolling Collection - An Object Collection that natively scrolls 3D objects
  • Dock [Experimental] - Allows objects to be moved in and out of predetermined positions
  • Eye Tracking: Target Selection - Combine eyes, voice and hand input to quickly and effortlessly select holograms across your scene
  • Eye Tracking: Navigation - Learn how to auto-scroll text or fluently zoom into focused content based on what you are looking at
  • Eye Tracking: Heat Map - Examples for logging, loading and visualizing what users have been looking at in your app

Tools

  • Optimize Window - Automate configuration of Mixed Reality projects for performance optimizations
  • Dependency Window - Analyze dependencies between assets and identify unused assets
  • Build Window - Configure and execute an end-to-end build process for Mixed Reality applications
  • Input Recording - Record and play back head movement and hand tracking data in the editor

Example scenes

Explore MRTK's various types of interactions and UI controls through the example scenes. You can find example scenes under the Assets/MRTK/Examples/Demos folder.

Example Scene

MRTK examples hub

With the MRTK Examples Hub, you can try various example scenes in MRTK. On HoloLens 2, you can download and install MRTK Examples Hub through the Microsoft Store app.

See Examples Hub README page to learn about the details on creating a multi-scene hub with MRTK's scene system and scene transition service.

Example Scene

Sample apps made with MRTK

  • Periodic Table of the Elements - an open-source sample app which demonstrates how to use MRTK's input system and building blocks to create an app experience for HoloLens and immersive headsets. Read the porting story: Bringing the Periodic Table of the Elements app to HoloLens 2 with MRTK v2
  • Galaxy Explorer - an open-source sample app that was originally developed in March 2016 as part of the HoloLens 'Share Your Idea' campaign. Galaxy Explorer has been updated with new features for HoloLens 2, using MRTK v2. Read the story: The Making of Galaxy Explorer for HoloLens 2
  • Surfaces - an open-source sample app for HoloLens 2 which explores how we can create a tactile sensation with visual, audio, and fully articulated hand-tracking. Check out the Microsoft MR Dev Days session Learnings from the Surfaces app for the detailed design and development story.

Session videos from Mixed Reality Dev Days 2020

  • Tutorial on how to create a simple MRTK app from start to finish. Learn about interaction concepts and MRTK's multi-platform capabilities.
  • Deep dive on MRTK's UX building blocks that help you build beautiful mixed reality experiences.
  • An introduction to performance tools, both in MRTK and external, as well as an overview of the MRTK Standard Shader.

See Mixed Reality Dev Days to explore more session videos.

Engage with the community

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

Useful resources on the Mixed Reality Dev Center

  • Discover - Learn to build mixed reality experiences for HoloLens and immersive headsets (VR).
  • Design - Get design guides. Build user interfaces. Learn interactions and input.
  • Develop - Get development guides. Learn the technology. Understand the science.
  • Distribute - Get your app ready for others and consider creating a 3D launcher.

Useful resources on Azure

  • Spatial Anchors - a cross-platform service that allows you to create mixed reality experiences using objects that persist their location across devices over time.
  • Speech Services - discover and integrate Azure-powered speech capabilities like speech to text, speaker recognition, or speech translation into your application.
  • Vision Services - identify and analyze your image or video content using capabilities like computer vision, face detection, emotion recognition, or video indexer.

Learn more about the MRTK project

You can find our planning material on our wiki under the Project Management Section. You can always see the items the team is actively working on in the Iteration Plan issue.

How to contribute

Learn how you can contribute to MRTK at Contributing.

For details on the different branches used in the Mixed Reality Toolkit repositories, check the Branch Guide.

mixedrealitytoolkit-unity's People

Contributors

adammitchell-ms, cameron-micka, cdiaz-ms, cre8ivepark, fast-slow-still, johnppella, julenka, kenjakubzak, keveleigh, killerantz, lukastoennems, luval-microsoft, macborow, maxwang-ms, menelvagormilsom, mrw-eric, radicalad, railboy, ritijain, rogpodge, rolandsmeenk, simondarksidej, sostel, stephenhodgson, thalbern, troy-ferrell, vaoliva, wiwei, yoyozilla, zee2


mixedrealitytoolkit-unity's Issues

Keyboard shortcuts

I'm wondering if you guys think it would be useful to add a keyboard shortcut to the KeywordManager.
I find it easier to debug/test my actions in the Unity player rather than deploy all the way to the HoloLens (or emulator).
So to that end I was thinking of adding the ability to trigger a keyword action by pressing a key on the keyboard.
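A minimal sketch of the idea. The keyword/response pairing mirrors HoloToolkit's KeywordManager, but the class, the KeyCodeShortcut field, and all member names here are illustrative, not the actual API:

```csharp
using UnityEngine;
using UnityEngine.Events;

public class KeywordShortcutSketch : MonoBehaviour
{
    [System.Serializable]
    public struct KeywordAndResponse
    {
        public string Keyword;
        public KeyCode KeyCodeShortcut;   // proposed: pressing this key fires Response
        public UnityEvent Response;
    }

    public KeywordAndResponse[] KeywordsAndResponses;

    void Update()
    {
        // Lets you trigger keyword actions in the Unity player without
        // deploying to the HoloLens or the emulator.
        foreach (var entry in KeywordsAndResponses)
        {
            if (Input.GetKeyDown(entry.KeyCodeShortcut))
            {
                entry.Response.Invoke();
            }
        }
    }
}
```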

Separate background thread for PlaneFinding

Clarification on blurb from readme:

NOTE: In the interest of simplicity, this test script calls the PlaneFinding APIs directly from the main Unity thread in Update(). In a real application, the PlaneFinding APIs MUST be called from a background thread in order to avoid stalling the rendering thread and causing a drop in frame rate.

Separate background thread - using something like Loom? Further clarification on background thread, please..
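One way to sketch the background-thread pattern without a helper like Loom: run the work on the thread pool and hand results back to the main thread through a queue. The PlaneFinding.FindPlanes call, the MeshData and BoundedPlane types are assumed from HoloToolkit and may differ in signature; all other names are illustrative.

```csharp
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;
using UnityEngine;

public class BackgroundPlaneFinder : MonoBehaviour
{
    private readonly ConcurrentQueue<BoundedPlane[]> results =
        new ConcurrentQueue<BoundedPlane[]>();
    private bool requestInFlight;

    public void RequestPlanes(List<PlaneFinding.MeshData> meshData)
    {
        if (requestInFlight) return;   // one request at a time
        requestInFlight = true;

        Task.Run(() =>
        {
            // The heavy work runs on a thread-pool thread, so the
            // rendering thread is never stalled.
            BoundedPlane[] planes = PlaneFinding.FindPlanes(meshData);
            results.Enqueue(planes);
            requestInFlight = false;
        });
    }

    void Update()
    {
        // UnityEngine objects may only be touched on the main thread,
        // so consume finished results here.
        BoundedPlane[] planes;
        if (results.TryDequeue(out planes))
        {
            OnPlanesFound(planes);
        }
    }

    private void OnPlanesFound(BoundedPlane[] planes) { /* use planes */ }
}
```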

Conflicting architecture build

The following directories contain some binary objects imported from the HoloToolkit project. They are provided for both x64 and x86, which causes Unity to throw an error because the editor can only support a single architecture. To fix this I needed to remove the x64 binaries.

Would it be better to include this content as a git submodule rather than a binary build?
If a binary build is needed then what is the proper way to prevent these errors?

Directories containing x64 binaries...

External/HoloToolkit/Sharing/Tools/Profiler/
External/HoloToolkit/Sharing/Tools/SessionManager
Assets/HoloToolkit/Sharing/Plugins

Make classes more extendable

Scenarios:
Gina is a developer. She wants to extend the GazeManager to provide a property for a point directly in front of the user's gaze.
Desired code

private Vector3 gazeOrigin;
private Vector3 gazeDirection;
public Vector3 AheadOfGaze { get { return gazeOrigin + gazeDirection; } }

Problem:

  1. By changing the GazeManager Gina needs to propagate this change to all customers using her scripts.
  2. Changing parts of the HoloToolkit itself locks Gina out from future updates and requires clients to change their HoloToolkit too (removing the 'drop-in-and-run' factor)

Proposed solution:
Make classes partial. Partial classes allow Gina to extend them using her own scripts and export her packages as a drop-in component working on top of the base toolkit. (Note that this requires the toolkit's classes to be declared partial as well, and the extension files must live in the same assembly as the toolkit source.)
In GinaGazeManager.cs:

public partial class GazeManager
{
    public Vector3 AheadOfGaze { get { return gazeOrigin + gazeDirection; } }
}

The VertexLit shaders fail to Build in Unity for the UWP Win 10 HoloLens with light baking

The HoloToolkit VertexLit shaders fail to build in Unity for the UWP Windows 10 HoloLens target when light baking is turned on:

Error building Player: Shader error in 'HoloToolkit/Vertex Lit Configurable': invalid subscript 'texcoord1' at Assets/HoloToolkit/Shaders/VertexLitConfigurable.cginc(67) (on d3d11)

Compiling Vertex program with DIRECTIONAL SHADOWS_OFF LIGHTMAP_ON DIRLIGHTMAP_COMBINED DYNAMICLIGHTMAP_OFF
Platform defines: UNITY_NO_SCREENSPACE_SHADOWS UNITY_ENABLE_REFLECTION_BUFFERS UNITY_PBS_USE_BRDF3 UNITY_HARDWARE_TIER1

Consider adding TapToPlaceParent like functionality

Scenario:

Adam is a developer who wants an easy way to move his objects around and place them where they matter to him.
He wants a quick way to tap on an object, gaze at a different location, and then tap again to place the object at that location.

Proposed Solution:

Add a new script called TapToPlaceParent or something similar to either the Input or Utilities folder.
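The proposal could be sketched roughly like this; the class name comes from the proposal, but all member names are illustrative, and it assumes OnSelect is delivered by the gesture system via SendMessage and that the script sits on a child of the object being placed:

```csharp
using UnityEngine;

public class TapToPlaceParent : MonoBehaviour
{
    private bool placing;

    // First tap picks the parent up, second tap drops it in place.
    void OnSelect()
    {
        placing = !placing;
    }

    void Update()
    {
        if (!placing) return;

        // Follow the gaze: move the parent to wherever the gaze ray
        // hits the spatial mapping mesh (or any other collider).
        Transform head = Camera.main.transform;
        RaycastHit hit;
        if (Physics.Raycast(head.position, head.forward, out hit))
        {
            transform.parent.position = hit.point;
        }
    }
}
```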

Crash when disabling spatialmappingmanager's visual meshes

In my project, setting DrawVisualMeshes to false gives me an exception. In this case, I'm triggering it from a voice command where I stop the observer and stop the drawing of the meshes like so:

_spatialMM.StopObserver();
_spatialMM.DrawVisualMeshes = false;

And I get this exception:

Exception thrown: 'System.NullReferenceException' in WinRTBridge.winmd
NullReferenceException: Exception of type 'System.NullReferenceException' was thrown.
at WinRTBridge.Utils.ThrowNewNullReferenceException(String message)
at UnityEngineProxy.InternalCalls.PInvokeCalls.Renderer_Set_Custom_PropEnabled(IntPtr param_0, Boolean param_1)
at UnityEngineProxy.InternalCalls.Renderer_Set_Custom_PropEnabled(Object self, Boolean paramValue)
at HoloToolkit.Unity.SpatialMappingManager.UpdateRendering(Boolean Enable)
at GameManager.ToggleObserver(Boolean on)
at GameManager.StartPlacement()
at GUIManager.OnPlayCommnand()
at UnityEngine.Events.InvokableCall.Invoke(Object[] args)
at UnityEngine.Events.InvokableCallList.Invoke(Object[] parameters)
at HoloToolkit.Unity.KeywordManager.KeywordRecognizer_OnPhraseRecognized(PhraseRecognizedEventArgs args)
at UnityEngine.Windows.Speech.PhraseRecognizer.InvokePhraseRecognizedEvent(String text, ConfidenceLevel confidence, SemanticMeaning[] semanticMeanings, Int64 phraseStartFileTime, Int64 phraseDurationTicks)
at UnityEngine.Windows.Speech.PhraseRecognizer.$Invoke6(Int64 instance, Int64* args)
at UnityEngine.Internal.$MethodUtility.InvokeMethod(Int64 instance, Int64* args, IntPtr method)
(Filename: Line: 0)

The program '[4280] HoloToolkit-Unity.exe' has exited with code -1 (0xffffffff).

Documentation is hard to navigate and collaborate on.

Right now we've got documentation spread throughout the README and the Wiki. This causes a few problems:

  • The front page of our GitHub is polluted with in-depth scripting documentation.
  • The wiki documentation is hard to collaborate on.
  • Documentation is hard to navigate and doesn't connect well.

After investigating using pages ( see comment below ) I'm recommending we simply reorganize the wiki to look something like this:

  • Manual
    • Logical descriptions of systems, like the sections of the wiki we have now
    • Descriptions of prefabs and the exposed editor data, should include screenshots from the editor
    • Tutorials
    • Should contain links to relevant Scripting API sections.
  • Scripting API
    • API descriptions of script types and their API
    • All the descriptions that currently live in the README
    • Should contain links to relevant Manual sections.

It'd be nice to auto-generate the Scripting API section someday, which should be possible given some VS XML documentation to Markdown Open Source projects out there.

Custom UI/Default shader causing confusion with Unity's UI/Default shader

HoloToolkit has a cursor asset that uses a shader called UI-Default.shader in Assets/HoloToolkit/Input/Models/Cursor/UI-Default.shader. As a result, Unity confuses its own UI/Default shader with HoloToolkit's custom UI/Default shader.

To fix this, the custom shader should be renamed and placed in the HoloToolkit/Utilities/Shaders folder.

GestureManager to call OnSelect on focused object from Unity

Hi all

Another small change that would make it easier to work straight from the editor.
It would be great to call the "OnSelect" message when, in the editor, we either right-click or press a specific key (Enter or Space), like in the emulator.
Any thoughts?

-s
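The suggestion above could be sketched as follows. GestureManager.Instance.FocusedObject is assumed here (see the separate FocusedObject proposal); the class name and key bindings are illustrative:

```csharp
using UnityEngine;

public class EditorSelectSimulator : MonoBehaviour
{
#if UNITY_EDITOR
    void Update()
    {
        // Simulate the select gesture with Space or a right-click,
        // so OnSelect handlers can be tested without deploying.
        if (Input.GetKeyDown(KeyCode.Space) || Input.GetMouseButtonDown(1))
        {
            GameObject focused = GestureManager.Instance.FocusedObject;
            if (focused != null)
            {
                focused.SendMessage("OnSelect",
                    SendMessageOptions.DontRequireReceiver);
            }
        }
    }
#endif
}
```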

Spatial Mapping Renderer = Hot Pink

After adding the HoloToolkit to a new Unity project and assigning it to a collection of holograms, the inside of my office is rendered in spectacular hot pink. By default, my holograms should be occluded by my office walls. I suspect that the renderer is not locating the occlusion material.

My fix:

  1. Set the Renderer Mode to Material, and
  2. Drag and drop HoloToolkit/SpatialMapping/Materials/Occlusion on to Render Material.

Make HoloToolkit feature folders more modular

Scenarios:

  1. Gina is a developer. She wants to add a script related to Cursors.
    She should be able to add the script file, prefabs, tests, related materials, etc. all into the Assets\HoloToolkit\Input folder, aka the feature folder. This keeps the folder compartmentalized while contributing.
  2. David is a designer and he only wants to keep specific feature folders from the HoloToolkit.
    The folder structure should be modular enough that users can easily delete the folders they don't need.

Proposed folder structure:

Assets\HoloToolkit
    Input
        Scripts
        Materials
        Prefabs
        Tests
    Sharing
        Scripts
        Editor
        Prefabs
        Plugins
        Tests

Other folders will match a similar layout.
Dependencies between folders will be documented.

TextToSpeechManager Unity scene should be a text asset

Right now it's a binary asset, not sure how that happened. We should just pull in the conversion to a text asset.

@jbienz - any idea how you got a binary asset here? Curious so we can try to avoid this in the future, though I should have caught it when reviewing the PR.

Deprecate remote spatial mapping classes

The preferred method for getting spatial mapping data into Unity is to use the Windows Dev Portal for capturing a 3D model of the room and then loading the meshes from that model. We're currently updating the Spatial Mapping course to show this new flow and will no longer need the following classes in HoloToolkit for sending meshes to Unity:
RemoteMapping.prefab
MeshSaver.cs
RemoteMappingManager.cs
RemoteMeshSource.cs
RemoteMeshTarget.cs
SimpleMeshSerializer.cs

Reorganize SpatialSound folder

Make SpatialSound folder match the other folders (ex: SpatialMapping) by adding Scripts folder.
Within Scripts, create UAudioManager folder to make it easy to opt in/out of including the component.

Create a test scene or a prefab for the FPSDisplay script

Monitoring performance and frame rate is a pretty important part in developing a HoloLens app.

I quickly found the FPSDisplay script in the toolkit but there wasn't a lot of documentation on how to use it. So I wrote a prefab to make it really easy to integrate in your scene.

You guys think it would be a good contribution to the project? Maybe also use it in one of the demo/test scenes? If so, any suggestions where?

I wrote a blog post with details and a link to the file; not sure if it's OK to post it here, but you can find it on my profile page.

Cannot send KeywordManager events to dynamically created prefab instances

The problem is that KeywordManager allows sending keyword-triggered events only to static objects that are already in the scene. Sending events to dynamically instantiated prefabs requires additional custom code, so the proposal is for HoloToolkit to provide this functionality out of the box.

KeywordManager breaks if disabled and re-enabled

The KeywordManager is set up and potentially started in Start, and stopped and destroyed in OnDestroy. However, Start can run again after the object is re-enabled, so the keyword recognizer can throw errors if the game object is disabled and re-enabled.

The KeywordRecognizer should either be set up and started in Awake (which happens once in the object's lifetime), be stopped in OnDisable, or have a check so a new KeywordRecognizer is not created if one is already set up.
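A sketch of the first suggested fix, using Unity's UnityEngine.Windows.Speech.KeywordRecognizer. The class and field names are illustrative, not KeywordManager's actual members:

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech;

public class KeywordManagerSketch : MonoBehaviour
{
    public string[] Keywords;

    private KeywordRecognizer keywordRecognizer;

    void Awake()
    {
        // Awake runs once per object lifetime, so re-enabling the
        // GameObject will not create a second recognizer.
        keywordRecognizer = new KeywordRecognizer(Keywords);
        keywordRecognizer.OnPhraseRecognized += OnPhraseRecognized;
    }

    void OnEnable()  { keywordRecognizer.Start(); }
    void OnDisable() { keywordRecognizer.Stop(); }

    void OnDestroy()
    {
        keywordRecognizer.OnPhraseRecognized -= OnPhraseRecognized;
        keywordRecognizer.Dispose();
    }

    private void OnPhraseRecognized(PhraseRecognizedEventArgs args)
    {
        // Dispatch the matching response here.
    }
}
```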

Build window to support HTTPS communication

We currently only support local IP (IPOverUSB) due to lack of HTTPS support. We would like to add secure communication to the REST calls used by this window (the Portal class) so that remote IPs can be used as well.

`-unsafe` compiler option required

To use HoloToolkit it is necessary to set the -unsafe compiler option. I'm new to C# so I'm not sure what this means, but I did find the following text on the Unity forums:

"Again, using unsafe doesn't give you much advantages, it just opens up for a huge variety of errors, mistakes and unexpected behaviour. Unless you know exactly what you do and you use pointers sparsely you shouldn't go for an unsafe context. " (http://answers.unity3d.com/questions/804103/how-to-enable-unsafe-and-use-pointers.html)

Is it wise to require unsafe? If this needs to remain, then it should be documented in the readme (see the link above).
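For reference, Unity of that era could pick up extra compiler flags from a response file in the Assets folder; the exact file name (smcs.rsp, gmcs.rsp, or csc.rsp) depends on the Unity version and scripting backend, so treat the path below as an assumption. The file contains just the flag:

```
-unsafe
```

Placing this one-line file at, e.g., Assets/smcs.rsp should enable the unsafe context project-wide without per-machine editor settings.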

GazeManager sets the normal incorrectly when the Raycast does not hit a hologram

This is noticeable if you create a "CursorOffHolograms" model that does not have a "back" side to render. When no collision is detected, the normal is set in the direction of the user's gaze, which turns the model's face away from the user. The solution is to set the normal in the opposite direction of the user's gaze (back toward the user).
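A sketch of the fix; the member names are illustrative, not GazeManager's actual fields:

```csharp
using UnityEngine;

public class GazeNormalSketch : MonoBehaviour
{
    public Vector3 HitNormal { get; private set; }

    void Update()
    {
        Transform gaze = Camera.main.transform;
        RaycastHit hitInfo;

        if (Physics.Raycast(gaze.position, gaze.forward, out hitInfo))
        {
            HitNormal = hitInfo.normal;
        }
        else
        {
            // No hologram hit: point the normal back toward the user so
            // a single-sided cursor model still faces the camera.
            HitNormal = -gaze.forward;
        }
    }
}
```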

Consider using Time.unscaledDeltaTime

Just a minor observation: I would recommend considering the use of Time.unscaledDeltaTime in scripts instead of Time.deltaTime, especially input-related scripts.

One of the simpler ways for Unity games to pause is to set Time.timeScale = 0, which stops FixedUpdate from being called and sets Time.deltaTime to 0. Even when we pause the game we still want the HoloLens scripts, specifically input scripts, to work properly.

One could also put an inspector variable in such scripts so the developer can choose deltaTime or unscaledDeltaTime.

So far for input, I only see GazeManager using Time.deltaTime, but I also see UAudioManagerBase and the Interpolator script using it as well. Not sure about the audio stuff, but Interpolator will be affected if I need to pause the game and want my pause menu UI to tag along.
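The suggested inspector toggle could look like this (a sketch; the class and field names are illustrative):

```csharp
using UnityEngine;

public class PauseSafeRotator : MonoBehaviour
{
    [SerializeField]
    private bool useUnscaledTime = true;   // keep working when timeScale == 0

    private float DeltaTime
    {
        get { return useUnscaledTime ? Time.unscaledDeltaTime : Time.deltaTime; }
    }

    void Update()
    {
        // Keeps animating even while the game is paused with
        // Time.timeScale = 0 (e.g. a tag-along pause menu).
        transform.Rotate(0f, 45f * DeltaTime, 0f);
    }
}
```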

Bug in Sharing - CustomMessages implementation?

I noticed this code in the CustomMessages.cs class.

void InitializeMessageHandlers()
{
    SharingStage sharingStage = SharingStage.Instance;
    if (sharingStage != null)
    {
        serverConnection = sharingStage.Manager.GetServerConnection();
        connectionAdapter = new NetworkConnectionAdapter();
    }

    connectionAdapter.MessageReceivedCallback += OnMessageReceived;
...

If sharingStage is null, we don't instantiate the connectionAdapter and the code will crash when we try to set the MessageReceivedCallback.

Unless I'm missing something, I think we should put the last line of code inside the if statement.
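The suggested fix, applied to the snippet above (a sketch of the change, not the actual patched file):

```csharp
void InitializeMessageHandlers()
{
    SharingStage sharingStage = SharingStage.Instance;
    if (sharingStage != null)
    {
        serverConnection = sharingStage.Manager.GetServerConnection();
        connectionAdapter = new NetworkConnectionAdapter();

        // Only wire the callback when the adapter was actually created,
        // so a missing SharingStage no longer causes a crash here.
        connectionAdapter.MessageReceivedCallback += OnMessageReceived;
    }
    // ...
}
```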

Add new control mode to ManualCameraControl

Right now the ManualCameraControl script provides a fly-mode control, which can make it hard to simulate how you would move if you were in an HL experience.
This looks like an interesting and useful change that should make it easier to design HL apps straight from Unity.
I propose adding a new control option that only allows you to move on the X/Z plane and on the Y axis (to simulate squatting).
An enum would be added so it's possible to use the current fly-mode control (which will remain the default option) or the new walk-mode control.
Any thoughts?

GazeManager logs error messages for objects lacking OnGazeEnter / OnGazeLeave

Unity will spam the console error log with the text "SendMessage OnGazeEnter has no receiver!" if no method for OnGazeEnter or OnGazeLeave has been created for the object. This can be mitigated by using the SendMessageOptions.DontRequireReceiver option as the second parameter to the SendMessage method on the GameObject.
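The mitigation looks like this (a sketch; focusedObject is an illustrative name for whatever GameObject GazeManager is notifying):

```csharp
// No console error is logged if the target defines neither handler.
focusedObject.SendMessage("OnGazeEnter",
    SendMessageOptions.DontRequireReceiver);
```

The same option applies to the OnGazeLeave call site.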

Unable to build: errors in Unity HoloToolkit

Assets/HoloToolkit/SpatialMapping/Scripts/SpatialMappingComponent/Support/SMBaseAbstract.cs(332,88): error CS1501: No overload for method `Update' takes `1' arguments

Assets/HoloToolkit/SpatialMapping/Scripts/SpatialMappingComponent/Support/SMBaseAbstract.cs(481,33): error CS1501: No overload for method `RequestMeshAsync' takes `2' arguments

SurfacePlane plane information is cleared

When Start is called on a SurfacePlane, the plane member is assigned a new BoundedPlane. Since this method is called after instantiation by SurfaceMeshesToPlanes, you'll find that the GameObject is in the right spot, but accessing the BoundedPlane gives you zeroed-out information.

HoloLens needs game controller support in Unity

There does not seem to be any support for game controllers as of yet. Bluetooth controllers will pair with the HoloLens but do not function inside Unity apps. The same controller will work paired to the computer, inside the editor or in a standalone version of the game, which leads me here for solutions.

There's no way to configure where builds from the Build menu are output to.

I've already run into issues where I've wanted the output build folder from BuildsCommands.BuildForHololens to differ from the hardcoded value. I'm thinking I'll add a new section to the preferences for the HoloToolkit and then we'll have a nice place to configure this and any other data that should be configurable.

Add 'FocusedObject' property to GestureManager

I suggest adding the following property to GestureManager.cs:

public GameObject FocusedObject { get { return focusedObject; } }

This allows you to get the currently focused object. In fact the Origami academy tutorial even added this.

Need an easy way to do Text to Speech

Windows 10 supports text to speech via SpeechSynthesizer but this isn't easily used from Unity because of the complexities in converting from SpeechSynthesisStream to AudioClip. I propose a new component called TextToSpeechManager (which I've already written) to do this work for us.

Expose MicStreamSelector to Unity

Hi guys

Just wondering if someone has been working on making the HoloToolKit MicStreamSelector available from Unity ?
Something we would need here soonish so I thought I should ask in case there's already something available privately.

Thanks in advance
-s

Tagalongs need to face camera

Isn't it kind of weird how the tagalongs don't orient towards the camera? If you turn about 90 degrees off, the tagalong will be facing so far away from you it's impossible to access.

I tried putting a simple lookat in the Tagalong's update (and restricted rotation to just Y) but I think it's messing up the visibility check with the bounding box so it floats out of view more often...but I'm not totally sure.

Any best practices for keeping a tagalong facing the player--or is this something we need to fix in the tagalong scripts?
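One of the simpler best practices is a Y-axis-only billboard, kept separate from the Tagalong script's own visibility logic so the bounding-box check is unaffected. A sketch (class name illustrative; flip the look direction if your content's front faces the other way):

```csharp
using UnityEngine;

public class FaceUserYOnly : MonoBehaviour
{
    void LateUpdate()
    {
        // Direction from the user's head to this object, flattened so we
        // only ever rotate around the Y axis.
        Vector3 toObject = transform.position - Camera.main.transform.position;
        toObject.y = 0f;

        if (toObject.sqrMagnitude > 0.0001f)
        {
            transform.rotation = Quaternion.LookRotation(toObject);
        }
    }
}
```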

Create unified Spatial Mapping Components

We should have components to handle rendering and creating colliders for spatial mapping.

These components should be usable by default but customizable to what most people need. Additionally, we should have heuristics to handle Tracking Lost and moving far away from a placed observer intelligently instead of just removing the meshes.

Deploying to device is cumbersome and not available directly from Unity

I'd like to add a sub menu to the HoloToolkit Unity menu that allows a single operation for building and deploying to device or the emulator.

I'm thinking something like:

  • HoloToolkit
    • Deploy to Device

Here's the process I see:

  1. WSA Build from Unity
  2. Nuget restore of VS Solution
  3. MSBuild build of VS Solution
  4. WinAppDeployCmd of appx.

I'd prefer a solution that only required visual studio, but I'm running into these issues:

  1. VS cmd line /build doesn't do the nuget restore (the documentation says it does exactly what Build Solution does, but that's a lie).
  2. VS cmd line /build doesn't generate a deployable appx, just binaries.
  3. VS cmd line /deploy doesn't let me set the deployment target. (There is no documentation I can find for controlling the target via the command line).
