
hololabinc / mixedrealitytoolkit-unity


This project is forked from microsoft/mixedrealitytoolkit-unity.


MixedRealityToolkit-Unity uses code from the base MixedRealityToolkit repository and makes it easier to consume in Unity.

Home Page: https://microsoft.github.io/MixedRealityToolkit-Unity

License: MIT License

C# 97.02% ShaderLab 1.54% HTML 0.01% JavaScript 0.03% CSS 0.01% GLSL 0.26% PowerShell 1.09% Python 0.04% HLSL 0.01%

mixedrealitytoolkit-unity's Introduction

Mixed Reality Toolkit

What is the Mixed Reality Toolkit

MRTK-Unity is a Microsoft-driven project that provides a set of components and features used to accelerate cross-platform MR app development in Unity. Here are some of its features:

  • Provides the cross-platform input system and building blocks for spatial interactions and UI.
  • Enables rapid prototyping via in-editor simulation that allows you to see changes immediately.
  • Operates as an extensible framework that provides developers the ability to swap out core components.
  • Supports a wide range of platforms, including
    • Microsoft HoloLens
    • Microsoft HoloLens 2
    • Windows Mixed Reality headsets
    • OpenVR headsets (HTC Vive / Oculus Rift)
    • Ultraleap Hand Tracking
    • Mobile devices such as iOS and Android

Getting started with MRTK

If you're new to MRTK or Mixed Reality development in Unity, we recommend you start at the beginning of our Unity development journey in the Microsoft Docs. The Unity development journey is specifically tailored to walk new developers through the installation, core concepts, and usage of MRTK.

IMPORTANT: The Unity development journey currently uses MRTK version 2.4.0 and Unity 2019.4.

If you're an experienced Mixed Reality or MRTK developer, check the links in the next section for the newest packages and release notes.

Documentation

  • Release Notes
  • MRTK Overview
  • Feature Guides
  • API Reference

Build status

| Branch | CI Status | Docs Status |
| :--- | :--- | :--- |
| mrtk_development | CI Status | Docs Status |

Required software

  • Windows SDK 18362+
  • Unity 2018.4.x
  • Visual Studio 2019
  • Emulators (optional)

To build apps with MRTK v2, you need the Windows 10 May 2019 Update SDK. To run apps on immersive headsets, you need the Windows 10 Fall Creators Update. The Unity 3D engine provides support for building mixed reality projects on Windows 10. Visual Studio is used for code editing, and for deploying and building UWP app packages. The emulators allow you to test your app in a simulated environment without a device.

Feature areas

  • Input System
  • Hand Tracking (HoloLens 2)
  • Eye Tracking (HoloLens 2)
  • Profiles
  • Hand Tracking (Ultraleap)
  • UI Controls
  • Solvers
  • Multi-Scene Manager
  • Spatial Awareness
  • Diagnostic Tool
  • MRTK Standard Shader
  • Speech & Dictation
  • Boundary System
  • In-Editor Simulation
  • Experimental Features

UX building blocks

  • Button: A button control which supports various input methods, including HoloLens 2's articulated hand
  • Bounds Control: Standard UI for manipulating objects in 3D space
  • Object Manipulator: Script for manipulating objects with one or two hands
  • Slate: 2D-style plane which supports scrolling with articulated hand input
  • System Keyboard: Example script showing how to use the system keyboard in Unity
  • Interactable: A script for making objects interactable, with visual states and theme support
  • Solver: Various object positioning behaviors such as tag-along, body-lock, constant view size, and surface magnetism
  • Object Collection: Script for laying out an array of objects in a three-dimensional shape
  • Tooltip: Annotation UI with a flexible anchor/pivot system, which can be used for labeling motion controllers and objects
  • Slider: Slider UI for adjusting values, supporting direct hand-tracking interaction
  • MRTK Standard Shader: MRTK's Standard shader supports various Fluent design elements with good performance
  • Hand Menu: Hand-locked UI for quick access, using the Hand Constraint Solver
  • App Bar: UI for Bounds Control's manual activation
  • Pointers: Learn about the various types of pointers
  • Fingertip Visualization: Visual affordance on the fingertip which improves confidence in direct interaction
  • Near Menu: Floating menu UI for near interactions
  • Spatial Awareness: Make your holographic objects interact with the physical environment
  • Voice Command / Dictation: Scripts and examples for integrating speech input
  • Progress Indicator: Visual indicator for communicating data processing or an ongoing operation
  • Dialog [Experimental]: UI for asking for the user's confirmation or acknowledgement
  • Hand Coach [Experimental]: Component that helps guide the user when a gesture has not been taught
  • Hand Physics Service [Experimental]: Enables rigid-body collision events and interactions with articulated hands
  • Scrolling Collection: An Object Collection that natively scrolls 3D objects
  • Dock [Experimental]: Allows objects to be moved in and out of predetermined positions
  • Eye Tracking: Target Selection: Combine eye, voice, and hand input to quickly and effortlessly select holograms across your scene
  • Eye Tracking: Navigation: Learn how to auto-scroll text or fluently zoom into focused content based on what you are looking at
  • Eye Tracking: Heat Map: Examples for logging, loading, and visualizing what users have been looking at in your app
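As a rough sketch of how these building blocks are typically wired together, the following hypothetical Unity script makes an object grabbable and movable with articulated hands using Object Manipulator. It assumes the MRTK v2 Foundation package is imported; the component and namespace names are from MRTK v2.4 and may differ in other versions.

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Illustrative setup script (not part of MRTK itself): attach to any
// GameObject to make it movable with one or two articulated hands.
public class GrabbableObjectSetup : MonoBehaviour
{
    private void Start()
    {
        // A collider is required so pointers and near interaction can hit the object.
        if (GetComponent<Collider>() == null)
        {
            gameObject.AddComponent<BoxCollider>();
        }

        // ObjectManipulator provides one- and two-handed move/rotate/scale.
        gameObject.AddComponent<ObjectManipulator>();

        // NearInteractionGrabbable enables direct grab with articulated hands.
        gameObject.AddComponent<NearInteractionGrabbable>();
    }
}
```

The same components can of course be added in the editor's Inspector instead of from code; the script form is shown only to make the required pieces explicit.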

Tools

  • Optimize Window: Automate configuration of mixed reality projects for performance optimizations
  • Dependency Window: Analyze dependencies between assets and identify unused assets
  • Build Window: Configure and execute an end-to-end build process for mixed reality applications
  • Input Recording: Record and play back head movement and hand tracking data in the editor

Example scenes

Explore MRTK's various types of interactions and UI controls in this example scene.

You can find other example scenes under the Assets/MixedRealityToolkit.Examples/Demos folder.

Example Scene

MRTK Examples Hub

With the MRTK Examples Hub, you can try the various example scenes in MRTK. You can find pre-built app packages for HoloLens (x86), HoloLens 2 (ARM), and Windows Mixed Reality immersive headsets (x64) under the Release Assets folder. Use the Windows Device Portal to install apps on HoloLens. On HoloLens 2, you can download and install the MRTK Examples Hub through the Microsoft Store app.

See the Examples Hub README page to learn the details of creating a multi-scene hub with MRTK's scene system and scene transition service.

Example Scene

Sample apps made with MRTK

Periodic Table of the Elements / Galaxy Explorer / Surfaces

Periodic Table of the Elements is an open-source sample app which demonstrates how to use MRTK's input system and building blocks to create an app experience for HoloLens and immersive headsets. Read the porting story: Bringing the Periodic Table of the Elements app to HoloLens 2 with MRTK v2.

Galaxy Explorer is an open-source sample app that was originally developed in March 2016 as part of the HoloLens 'Share Your Idea' campaign. Galaxy Explorer has been updated with new features for HoloLens 2, using MRTK v2. Read the story: The Making of Galaxy Explorer for HoloLens 2.

Surfaces is an open-source sample app for HoloLens 2 which explores how to create a tactile sensation with visuals, audio, and fully articulated hand tracking. Check out the Microsoft MR Dev Days session Learnings from the Surfaces app for the detailed design and development story.

Session videos from Mixed Reality Dev Days 2020

  • Tutorial on how to create a simple MRTK app from start to finish, covering interaction concepts and MRTK's multi-platform capabilities.
  • Deep dive on MRTK's UX building blocks that help you build beautiful mixed reality experiences.
  • An introduction to performance tools, both in MRTK and external, as well as an overview of the MRTK Standard Shader.

See Mixed Reality Dev Days to explore more session videos.

Engage with the community

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

Useful resources on the Mixed Reality Dev Center

  • Discover: Learn to build mixed reality experiences for HoloLens and immersive (VR) headsets.
  • Design: Get design guides. Build user interfaces. Learn about interactions and input.
  • Develop: Get development guides. Learn the technology. Understand the science.
  • Distribute: Get your app ready for others and consider creating a 3D launcher.

Useful resources on Azure

  • Spatial Anchors: A cross-platform service that allows you to create mixed reality experiences using objects that persist their location across devices over time.
  • Speech Services: Discover and integrate Azure-powered speech capabilities like speech-to-text, speaker recognition, or speech translation into your application.
  • Vision Services: Identify and analyze your image or video content using Vision Services like computer vision, face detection, emotion recognition, or Video Indexer.

Learn more about the MRTK project

You can find our planning material on our wiki under the Project Management Section. You can always see the items the team is actively working on in the Iteration Plan issue.

How to contribute

Learn how you can contribute to MRTK at Contributing.

For details on the different branches used in the Mixed Reality Toolkit repositories, see the Branch Guide.

mixedrealitytoolkit-unity's People

Contributors

adammitchell-ms, cameron-micka, cdiaz-ms, cre8ivepark, johnppella, julenka, kenjakubzak, keveleigh, killerantz, kircher1, lukastoennems, luval-microsoft, macborow, maxwang-ms, menelvagormilsom, mrw-eric, radicalad, railboy, ritijain, rogpodge, rolandsmeenk, simondarksidej, sostel, stephenhodgson, thalbern, troy-ferrell, vaoliva, wiwei, yoyozilla, zee2


mixedrealitytoolkit-unity's Issues

[v2.1.0] Input/Overview.md

diff --git a/Documentation/Input/Overview.md b/Documentation/Input/Overview.md
index 58eaabcde..551fd968f 100644
--- a/Documentation/Input/Overview.md
+++ b/Documentation/Input/Overview.md
@@ -19,4 +19,6 @@ Controllers can have [**Pointers**](Pointers.md) attached to them that query the
 <img src="../../Documentation/Images/Input/MRTK_Input_EventFlow.png" width="200px" style="display:block;margin-left:auto;margin-right:auto;">
 <sup>Event flow.</sup>
 
-While you can handle input events directly in UI components it is recommended to use pointer events to keep the implementation device-independent.
+While you can handle [input events directly in UI components](InputEvents.md) it is recommended to use [pointer events](pointers.md#pointer-event-interfaces) to keep the implementation device-independent.
+
+MRTK also provides several convenience methods to query input state directly in a device-independent way. See [Accessing Input State in MRTK](InputState.md) for more details.

[v2.1.0] MixedRealityConfigurationGuide

diff --git a/Documentation/MixedRealityConfigurationGuide.md b/Documentation/MixedRealityConfigurationGuide.md
index f4304c298..219a3d5d4 100644
--- a/Documentation/MixedRealityConfigurationGuide.md
+++ b/Documentation/MixedRealityConfigurationGuide.md
@@ -53,8 +53,6 @@ From here you can navigate to all the configuration profiles for the MRTK, inclu
 
 These configuration profiles are detailed below in their relevant sections:
 
-From here you can navigate to all the configuration profiles for the MRTK, including:
-
 ---
 <a name="experience"/>
 
@@ -146,7 +144,7 @@ An optional but highly useful feature of the MRTK is the plugin diagnostics func
 
 <img src="../Documentation/Images/MixedRealityToolkitConfigurationProfileScreens/MRTK_DiagnosticsSystemSelection.png" width="650px" style="display:block;">
 
-The diagnostics profile provides several simple systems to monitor whilst the project is running, including a handy On/Off switch to enable / disable the display pane in the scene.
+The diagnostics profile provides several simple systems to monitor whilst the project is running, including a handy On/Off switch to enable / disable the display panel in the scene.
 
 <img src="../Documentation/Images/MixedRealityToolkitConfigurationProfileScreens/MRTK_DiagnosticsProfile.png" width="650px" style="display:block;">
 

[v2.1.0] GettingStartedWithTheMRTK

diff --git a/Documentation/GettingStartedWithTheMRTK.md b/Documentation/GettingStartedWithTheMRTK.md
index 1828caad3..e1fbb6bb9 100644
--- a/Documentation/GettingStartedWithTheMRTK.md
+++ b/Documentation/GettingStartedWithTheMRTK.md
@@ -21,10 +21,12 @@ To get started with the Mixed Reality Toolkit you will need:
 
 ## Getting started tutorials
 
-If you are new to MRTK, or MR development, we recommend you check out the [Getting started tutorials](https://docs.microsoft.com/en-us/windows/mixed-reality/mrlearning-base) which uses MRTK v2.
+If you are new to MRTK, or MR development, we recommend you check out the [Getting started tutorials](https://docs.microsoft.com/en-us/windows/mixed-reality/mrlearning-base) which uses MRTK v2. Check out [MRTK 101: How to use Mixed Reality Toolkit Unity for Basic Interactions (HoloLens 2, HoloLens, Windows Mixed Reality, Open VR)](https://docs.microsoft.com/en-us/windows/mixed-reality/mrtk-101) to learn about core building blocks.
 
 ## Add MRTK to your Unity Project
 
+Mixed Reality Toolkit is now available for download on NuGet.org, for details see [MRTK NuGet Package](MRTKNuGetPackage.md).
+
 ### Get the latest MRTK Unity packages
 
 1. Go to the  [MRTK release page](https://github.com/Microsoft/MixedRealityToolkit-Unity/releases).
@@ -73,34 +75,36 @@ Some prefabs and assets require TextMesh Pro, meaning you have to have the TextM
 
 [![HandInteractionExample scene](../Documentation/Images/MRTK_Examples.png)](README_HandInteractionExamples.md)
 
-The [hand interaction examples scene](README_HandInteractionExamples.md) is a great place to get started because it shows a wide variety of UX controls and interactions in MRTK. To get started we will import MRTK, open the example scene, and explore the scene in the editor.
+The [hand interaction examples scene](README_HandInteractionExamples.md) is a great place to get started because it shows a wide variety of UX controls and interactions in MRTK. 
 
-1. Create a new Unity project and then import both the **Foundation** and **Examples** unity packages following [the steps above](#import-mrtk-packages-into-your-unity-project).
-2. Open the HandInteractionExamples scene under `Assets\MixedRealityToolkit.Examples\Demos\HandTracking\Scenes\HandInteractionExamples`
+1. Open the **HandInteractionExamples** scene under `Assets\MixedRealityToolkit.Examples\Demos\HandTracking\Scenes\HandInteractionExamples`
 
-3. You may get a prompt asking you to import "TMP Essentials".
+2. You may get a prompt asking you to import "TMP Essentials".
 
 ![TMP Essentials](../Documentation/Images/getting_started/MRTK_GettingStarted_TMPro.png)
 
 If you get such a prompt, select "Import TMP essentials" button. "TMP Essentials" refers to Text Mesh Pro plugin, which some of the MRTK examples use for improved text rendering. (See [Text in Unity](https://docs.microsoft.com/en-us/windows/mixed-reality/text-in-unity) for more detailed information)
 
-4. Close the TMP dialog. After this you need to reload the scene. You can do this by double clicking the scene in the project tab.
+3. Close the TMP dialog. After this you need to reload the scene. You can do this by double clicking the scene in the project tab.
 
-5. Press the play button.
+4. Press the play button.
 
 ## Using the In-Editor Hand Input Simulation to test a scene
 
 The in-editor input simulation allows you to test virtual object behavior given a specific type of input such as [hands](InputSimulation/InputSimulationService.md#hand-simulation) or [eyes](EyeTracking/EyeTracking_BasicSetup.md#simulating-eye-tracking-in-the-unity-editor).
 
 How to move around in the scene: 
-- Use W/A/S/D keys to move the camera forward/left/back/right.
-- Press and hold the right mouse to rotate the camera.
+- Use **W/A/S/D** keys to move the camera forward/left/back/right.
+- Use **Q/E** to move the camera vertically.
+- Press and hold the **right mouse button** to rotate the camera.
 
 How to simulate hand input:
-- Press and hold the space bar to enable the right hand. 
+- Press and hold the **space bar** to enable the right hand. 
 - While holding the space bar, move your mouse to move the hand.
-- Use the middle mouse scroll to adjust the depth of the hand.
-- Click the left mouse to switch gestures.
+- Use the mouse **scroll wheel** to adjust the depth of the hand.
+- Click the **left mouse button** to simulate pinch gesture.
+- Use **T/Y** keys to make the hand persistent in the view.
+- Hold **CTRL** key and move the mouse to rotate the hand.
 
 Have fun exploring the scene! You can learn more about the UI controls [in the hand interaction examples guide](README_HandInteractionExamples.md). Also, read through [input simulation docs](InputSimulation/InputSimulationService.md) to learn more about in-editor hand input simulation in MRTK.
 
@@ -126,7 +130,7 @@ Click "OK".
 
 ![MRTK Select Configure Dialog](../Documentation/Images/MRTK_SelectConfigurationDialog.png)
 
-> **NOTE**: Note that if you are getting started on the HoloLens 2, you should choose the "DefaultHoloLens2ConfigurationProfile" instead.
+> **NOTE**: Note that if you are getting started on the HoloLens or HoloLens 2, you should choose the "DefaultHoloLens1ConfigurationProfile" or "DefaultHoloLens2ConfigurationProfile" instead.
 > See the [profiles](Profiles/Profiles.md#hololens-2-profile) for more information on the differences between 
 > DefaultMixedRealityToolkitConfigurationProfile and DefaultHoloLens2ConfigurationProfile.
 
@@ -136,8 +140,8 @@ You will then see the following in your Scene hierarchy:
 
 Which contains the following:
 
-* Mixed Reality Toolkit - The toolkit itself, providing the central configuration entry point for the entire framework.
-* MixedRealityPlayspace - The parent object for the headset, which ensures the headset / controllers and other required systems are managed correctly in the scene.
+* **Mixed Reality Toolkit** - The toolkit itself, providing the central configuration entry point for the entire framework.
+* **MixedRealityPlayspace** - The parent object for the headset, which ensures the headset / controllers and other required systems are managed correctly in the scene.
 * The Main Camera is moved as a child to the Playspace - Which allows the playspace to manage the camera in conjunction with the SDKs
 
 **Note** While working in your scene, **DO NOT move the Main Camera** (or the playspace) from the scene origin (0,0,0).  This is controlled by the MRTK and the active SDK.
@@ -151,34 +155,14 @@ You are now ready to build and deploy to device! Follow the steps instructions a
 
 Here are some suggested next steps:
 
-* Add a [PressableButton](README_Button.md) to your scene (we recommend using the `PressableButtonPlated` prefab to start)).
-* Add a cube to your scene, then make it movable using the [ManipulationHandler](README_ManipulationHandler.md) component.
-* Learn about the UX controls available in MRTK in [building blocks for UI and interactions](#building-blocks-for-ui-and-interactions).
-* Read through [input simulation guide](InputSimulation/InputSimulationService.md) to learn how to simulate hand input in editor.
+* Check out [MRTK 101: How to use Mixed Reality Toolkit Unity for Basic Interactions](https://docs.microsoft.com/en-us/windows/mixed-reality/mrtk-101) to learn about how to achieve common spatial interactions such as grab, move, scale, and rotate.
+* Learn about the UX controls available in MRTK in [UI and interaction building blocks](../README.md#ui-and-interaction-building-blocks).
+* Try [MRTK Examples Hub](README_ExampleHub.md) (pre-built app packages are included in the release page for your convenience)
 * Learn how to work with the MRTK Configuration profile in the [mixed reality configuration guide](MixedRealityConfigurationGuide.md).
-
-## Building blocks for UI and interactions
-
-|  [![Button](Images/Button/MRTK_Button_Main.png)](README_Button.md) [Button](README_Button.md) | [![Bounding Box](Images/BoundingBox/MRTK_BoundingBox_Main.png)](README_BoundingBox.md) [Bounding Box](README_BoundingBox.md) | [![Manipulation Handler](Images/ManipulationHandler/MRTK_Manipulation_Main.png)](README_ManipulationHandler.md) [Manipulation Handler](README_ManipulationHandler.md) |
-|:--- | :--- | :--- |
-| A button control which supports various input methods including HoloLens 2's articulated hand | Standard UI for manipulating objects in 3D space | Script for manipulating objects with one or two hands |
-|  [![Slate](Images/Slate/MRTK_Slate_Main.png)](README_Slate.md) [Slate](README_Slate.md) | [![System Keyboard](Images/SystemKeyboard/MRTK_SystemKeyboard_Main.png)](README_SystemKeyboard.md) [System Keyboard](README_SystemKeyboard.md) | [![Interactable](Images/Interactable/InteractableExamples.png)](README_Interactable.md) [Interactable](README_Interactable.md) |
-| 2D style plane which supports scrolling with articulated hand input | Example script of using the system keyboard in Unity  | A script for making objects interactable with visual states and theme support |
-|  [![Solver](Images/Solver/MRTK_Solver_Main.png)](README_Solver.md) [Solver](README_Solver.md) | [![Object Collection](Images/ObjectCollection/MRTK_ObjectCollection_Main.png)](README_ObjectCollection.md) [Object Collection](README_ObjectCollection.md) | [![Tooltip](Images/Tooltip/MRTK_Tooltip_Main.png)](README_Tooltip.md) [Tooltip](README_Tooltip.md) |
-| Various object positioning behaviors such as tag-along, body-lock, constant view size and surface magnetism | Script for lay out an array of objects in a three-dimensional shape | Annotation UI with flexible anchor/pivot system which can be used for labeling motion controllers and object. |
-|  [![App Bar](Images/AppBar/MRTK_AppBar_Main.png)](README_AppBar.md) [App Bar](README_AppBar.md) | [![Pointers](Images/Pointers/MRTK_Pointer_Main.png)](/Input/Pointers.md) [Pointers](/Input/Pointers.md) | [![Fingertip Visualization](Images/Fingertip/MRTK_FingertipVisualization_Main.png)](README_FingertipVisualization.md) [Fingertip Visualization](README_FingertipVisualization.md) |
-| UI for Bounding Box's manual activation | Learn about various types of pointers | Visual affordance on the fingertip which improves the confidence for the direct interaction |
-|  [![Slider](Images/Slider/MRTK_UX_Slider_Main.jpg)](README_Sliders.md) [Slider](README_Sliders.md) | [![MRTK Standard Shader](Images/MRTKStandardShader/MRTK_StandardShader.jpg)](README_MRTKStandardShader.md) [MRTK Standard Shader](README_MRTKStandardShader.md) | [![Hand Joint Chaser](Images/HandJointChaser/MRTK_HandJointChaser_Main.jpg)](README_HandJointChaser.md) [Hand Joint Chaser](README_HandJointChaser.md) |
-| Slider UI for adjusting values supporting direct hand tracking interaction | MRTK's standard shader supports various fluent design elements with performance | Demonstrates how to use solver to attach objects to the hand joints |
-|  [![Eye Tracking: Target Selection](Images/EyeTracking/mrtk_et_targetselect.png)](EyeTracking/EyeTracking_TargetSelection.md) [Eye Tracking: Target Selection](EyeTracking/EyeTracking_TargetSelection.md) | [![Eye Tracking: Navigation](Images/EyeTracking/mrtk_et_navigation.png)](EyeTracking/EyeTracking_Navigation.md) [Eye Tracking: Navigation](EyeTracking/EyeTracking_Navigation.md) | [![Eye Tracking: Heat Map](Images/EyeTracking/mrtk_et_heatmaps.png)](EyeTracking/EyeTracking_ExamplesOverview.md#visualization-of-visual-attention) [Eye Tracking: Heat Map](EyeTracking/EyeTracking_ExamplesOverview.md#visualization-of-visual-attention) |
-| Combine eyes, voice and hand input to quickly and effortlessly select holograms across your scene | Learn how to auto scroll text or fluently zoom into focused content based on what you are looking at| Examples for logging, loading and visualizing what users have been looking at in your app |
-
-
-## Tools
-|  [![Optimize Window](Images/MRTK_Icon_OptimizeWindow.png)](Tools/OptimizeWindow.md) [Optimize Window](Tools/OptimizeWindow.md) | [![Dependency Window](Images/MRTK_Icon_DependencyWindow.png)](Tools/DependencyWindow.md) [Dependency Window](Tools/DependencyWindow.md) | ![Build Window](Images/MRTK_Icon_BuildWindow.png) Build Window | [![Input recording](Images/MRTK_Icon_InputRecording.png)](InputSimulation/InputAnimationRecording.md) [Input recording](InputSimulation/InputAnimationRecording.md) |
-
-| :--- | :--- | :--- | :--- |
-| Automate configuration of Mixed Reality projects for performance optimizations | Analyze dependencies between assets and identify unused assets |  Configure and execute end-to-end build process for Mixed Reality applications | Record and playback head movement and hand tracking data in-editor |
+* Learn about the [MRTK's Architecture](../Documentation/Architecture/Overview.md)
+* Learn about the [MRTK's Input System](../Documentation/Input/Overview.md)
+* Learn about the [MRTK's Tools](../README.md#tools) that will empower your mixed reality design and development.
+* Read through [input simulation guide](InputSimulation/InputSimulationService.md) to learn how to simulate hand input in editor.
 
 ## Upgrading from the HoloToolkit (HTK/MRTK v1)
 

[v2.1.0] toc.yml

diff --git a/Documentation/toc.yml b/Documentation/toc.yml
index 4c675bc7f..b3bb4f020 100644
--- a/Documentation/toc.yml
+++ b/Documentation/toc.yml
@@ -3,7 +3,7 @@
   items:
   - name: Upgrading from HTK
     href: HTKToMRTKPortingGuide.md
-  - name: Updating from RC2
+  - name: Updating from earlier versions
     href: Updating.md
   - name: Release Notes
     href: ReleaseNotes.md
@@ -27,6 +27,8 @@
       href: Architecture/InputSystem/CoreSystem.md
     - name: Controllers, Pointers, and Focus
       href: Architecture/InputSystem/ControllersPointersAndFocus.md
+  - name: Systems, Extension Services and Data Providers
+    href: Architecture/SystemsExtensionsProviders.md
 - name: Feature Overviews
   items:
   - name: Profiles
@@ -47,6 +49,8 @@
       href: Input/Controllers.md
     - name: Pointers
       href: Input/Pointers.md
+    - name: How to Add Near Interaction
+      href: Input/HowToAddNearInteractivity.md
     - name: Gestures
       href: Input/Gestures.md
     - name: Speech(Voice command)
@@ -54,11 +58,13 @@
     - name: Dictation
       href: Input/Dictation.md
     - name: Hands
-      href: InputSystem/HandTracking.md
+      href: Input/HandTracking.md
     - name: Gaze
       href: Input/Gaze.md
     - name: Eyes
       href: EyeTracking/EyeTracking_Main.md
+    - name: Creating an input data provider
+      href: Input/CreateDataProvider.md
   - name: In-Editor Input Simulation
     href: InputSimulation/InputSimulationService.md
   - name: UX Building Blocks
@@ -97,14 +103,16 @@
     href: README_MRTKStandardShader.md
   - name: Spatial Awareness
     items:
-    - name: Spatial Awareness Overview
+    - name: Getting Started
       href: SpatialAwareness/SpatialAwarenessGettingStarted.md
-    - name: Configuring the Spatial Awareness Mesh Observer
+    - name: Configuring Observers for Device
       href: SpatialAwareness/ConfiguringSpatialAwarenessMeshObserver.md
-    - name: Spatial Object Mesh Observer
+    - name: Configuring Observers for Editor
       href: SpatialAwareness/SpatialObjectMeshObserver.md
-    - name: Usage Guide
+    - name: Controlling Observers via Code
       href: SpatialAwareness/UsageGuide.md
+    - name: Creating a custom Observer
+      href: SpatialAwareness/CreateDataProvider.md
   - name: Multi Scene System
     items:
     - name: Multi Scene System Overview
@@ -139,6 +147,8 @@
       href: MixedRealityServices.md
     - name: What are the MixedRealityServiceRegistry and IMixedRealityServiceRegistrar
       href: ServiceUtilities/MixedRealityServiceRegistryAndIMixedRealityServiceRegistrar.md
+    - name: Extension services
+      href: Extensions/ExtensionServices.md
   - name: Packages
     items:
     - name: MRTK Packages
@@ -158,6 +168,10 @@
         href: InputSimulation/InputAnimationFileFormat.md
     - name: Extension Service Creation Wizard
       href: Tools/ExtensionServiceCreationWizard.md
+    - name: Runtime tools
+      items:
+      - name: Controller Mapping tool
+        href: Tools/ControllerMappingTool.md 
   - name: Scene Transition Service
     href: Extensions/SceneTransitionService/SceneTransitionServiceOverview.md
   - name: Experimental Features

Translated documentation: v2.5.0 support

git diff v2.4.0..v2.5.0 --name-status .

M	Documentation/Architecture/InputSystem/ControllersPointersAndFocus.md
M	Documentation/Architecture/InputSystem/CoreSystem.md
M	Documentation/Authors.md
M	Documentation/CameraSystem/WindowsMixedRealityCameraSettings.md
M	Documentation/Contributing/CodingGuidelines.md
M	Documentation/Contributing/DocumentationGuide.md
M	Documentation/Contributing/ExperimentalFeatures.md
M	Documentation/Contributing/Roadmap.md
M	Documentation/Contributing/UnitTests.md
M	Documentation/CrossPlatform/LeapMotionMRTK.md
A	Documentation/CrossPlatform/OculusQuestMRTK.md
M	Documentation/CrossPlatform/UsingARFoundation.md
M	Documentation/Diagnostics/ConfiguringDiagnostics.md
M	Documentation/Diagnostics/DiagnosticsSystemGettingStarted.md
A	Documentation/Elastics/ElasticSystem.md
M	Documentation/EyeTracking/EyeTracking_BasicSetup.md
M	Documentation/EyeTracking/EyeTracking_EyesAndHands.md
M	Documentation/GettingStartedWithMRTKAndXRSDK.md
M	Documentation/GettingStartedWithTheMRTK.md
A	Documentation/Images/BoundsControl/MRTK_BoundsControl_Assign.png
A	Documentation/Images/BoundsControl/MRTK_BoundsControl_Constraints.png
A	Documentation/Images/BoundsControl/MRTK_BoundsControl_Elastics.png
A	Documentation/Images/BoundsControl/MRTK_BoundsControl_Events.png
A	Documentation/Images/BoundsControl/MRTK_BoundsControl_Examples.png
A	Documentation/Images/BoundsControl/MRTK_BoundsControl_HandleStyles1.png
A	Documentation/Images/BoundsControl/MRTK_BoundsControl_HandleStyles1_NoHighlight.png
A	Documentation/Images/BoundsControl/MRTK_BoundsControl_HandleStyles2.png
A	Documentation/Images/BoundsControl/MRTK_BoundsControl_Main.png
A	Documentation/Images/BoundsControl/MRTK_BoundsControl_Migrate.png
A	Documentation/Images/BoundsControl/MRTK_BoundsControl_ObjectManipulator.png
A	Documentation/Images/BoundsControl/MRTK_BoundsControl_Proximity.png
A	Documentation/Images/ConfigurationDialog/EnableMSB4UPrompt.png
A	Documentation/Images/ConfigurationDialog/MSB4UMenuItems.png
A	Documentation/Images/ConstraintManager/AutoSelection.png
A	Documentation/Images/ConstraintManager/ManualSelection.png
A	Documentation/Images/ControllerMappingTool/InputFeatureUsages.png
A	Documentation/Images/CrossPlatform/CloneInputSystemProfile.png
R100	Documentation/Images/CrossPlatform/LeapMotion/LeapProfileClone.png	Documentation/Images/CrossPlatform/CloneProfile.png
A	Documentation/Images/CrossPlatform/InputConfigurationProfile.png
D	Documentation/Images/CrossPlatform/LeapMotion/LeapDeviceManagerDesk.png
D	Documentation/Images/CrossPlatform/LeapMotion/LeapDeviceManagerHeadset.png
D	Documentation/Images/CrossPlatform/LeapMotion/LeapDeviceManagerProfileBeforeClone.png
A	Documentation/Images/CrossPlatform/LeapMotion/LeapMotionDeviceManagerDesk.png
A	Documentation/Images/CrossPlatform/LeapMotion/LeapMotionDeviceManagerHeadset.png
A	Documentation/Images/CrossPlatform/LeapMotion/LeapMotionDeviceManagerProfile.png
A	Documentation/Images/CrossPlatform/LeapMotion/LeapMotionIntegrateMenu.png
A	Documentation/Images/CrossPlatform/OculusQuest/AndroidToolsConfig.png
A	Documentation/Images/CrossPlatform/OculusQuest/OculusAddDataXRSDKProvider.png
A	Documentation/Images/CrossPlatform/OculusQuest/OculusExpectedBuildErrors.png
A	Documentation/Images/CrossPlatform/OculusQuest/OculusIntegrationAsmdef.png
A	Documentation/Images/CrossPlatform/OculusQuest/OculusIntegrationControllerAndHands.png
A	Documentation/Images/CrossPlatform/OculusQuest/OculusPluginProvider.png
A	Documentation/Images/CrossPlatform/OculusQuest/OculusRunDevice.png
A	Documentation/Images/CrossPlatform/OculusQuest/OculusSeparationAsmdef.png
A	Documentation/Images/CrossPlatform/OculusQuest/OculusXRPluginPackage.png
D	Documentation/Images/Diagnostics/DiagnosticsProfile.png
M	Documentation/Images/Diagnostics/DiagnosticsSelectSystemType.png
D	Documentation/Images/Diagnostics/MRTKConfig_Diagnostics.png
A	Documentation/Images/Elastics/Elastics_Example_Scene.png
A	Documentation/Images/Elastics/Elastics_Main.gif
A	Documentation/Images/Elastics/Elastics_Main1.gif
A	Documentation/Images/Elastics/Elastics_Rotation.gif
A	Documentation/Images/Elastics/Elastics_Volume_Bounds.gif
A	Documentation/Images/Elastics/Elastics_Volume_Snap.gif
M	Documentation/Images/Input/Pointers/GrabPointer_MRTKProfile.png
A	Documentation/Images/InputSimulation/ArticulatedHandJoints.png
A	Documentation/Images/InputSimulation/MRTK_Core_Input_Hands_JointNames.png
A	Documentation/Images/InputSimulation/MRTK_Core_Input_Hands_JointNames_Dark.png
D	Documentation/Images/InputSimulation/MRTK_Core_Input_Hands_JointVisualizerPrefabs.png
A	Documentation/Images/Logo_MRTK_Unity_Badge.png
A	Documentation/Images/Logo_MRTK_Unity_Banner.png
A	Documentation/Images/MRDL_Surfaces.jpg
A	Documentation/Images/MRDevDays_Session1.png
A	Documentation/Images/MRDevDays_Session2.png
A	Documentation/Images/MRDevDays_Session3.png
A	Documentation/Images/MRTK-Doc-Versions.png
A	Documentation/Images/MRTK_Icon_ReleaseNotes.png
A	Documentation/Images/Packaging/MRTK_ExamplesUpm.png
A	Documentation/Images/Packaging/MRTK_FoundationUPM.png
A	Documentation/Images/ReleaseNotes/FixSceneTransitionProfile.png
A	Documentation/Images/RiggedHandVisualizer/MRTK_RiggedHandVisualizer_ControllerVisualizationSettings.png
A	Documentation/Images/RiggedHandVisualizer/MRTK_RiggedHandVisualizer_InputSimulation.gif
A	Documentation/Images/RiggedHandVisualizer/MRTK_RiggedHandVisualizer_Leapmotion.gif
A	Documentation/Images/RiggedHandVisualizer/MRTK_RiggedHandVisualizer_Main.png
A	Documentation/Images/RiggedHandVisualizer/MRTK_RiggedHandVisualizer_Orientation.png
A	Documentation/Images/RiggedHandVisualizer/MRTK_RiggedHandVisualizer_PrefabSetup.png
R100	Documentation/Images/ScrollingCollection/MRTK_UX_ScrollingCollection_Main.jpg	Documentation/Images/ScrollingCollection/ScrollingCollection_Main.jpg
A	Documentation/Images/ScrollingCollection/ScrollingObjectCollection.png
A	Documentation/Images/ScrollingCollection/ScrollingObjectCollection_ExampleScene.png
A	Documentation/Images/ScrollingCollection/ScrollingObjectCollection_GridLayout.png
A	Documentation/Images/ScrollingCollection/ScrollingObjectCollection_Prefabs.png
A	Documentation/Images/ScrollingCollection/ScrollingObjectCollection_ViewableArea.png
M	Documentation/Images/Solver/TapToPlace/TapToPlaceInspector2.png
M	Documentation/Input/HandTracking.md
M	Documentation/Input/HowToAddNearInteractivity.md
M	Documentation/Input/Overview.md
M	Documentation/Input/Pointers.md
M	Documentation/InputSimulation/InputSimulationService.md
A	Documentation/Installation.md
M	Documentation/LargeProjects.md
D	Documentation/MRTKNuGetPackage.md
M	Documentation/MRTK_Configuration_Dialog.md
D	Documentation/MRTK_PackageContents.md
A	Documentation/MRTK_and_managed_code_stripping.md
M	Documentation/Packaging/MRTK_Packages.md
M	Documentation/Performance/PerfGettingStarted.md
M	Documentation/README_AppBar.md
M	Documentation/README_BoundingBox.md
A	Documentation/README_BoundsControl.md
A	Documentation/README_ConstraintManager.md
M	Documentation/README_HandInteractionExamples.md
M	Documentation/README_ManipulationHandler.md
M	Documentation/README_NearMenu.md
M	Documentation/README_ObjectManipulator.md
A	Documentation/README_ScrollingObjectCollection.md
M	Documentation/README_Slate.md
M	Documentation/README_TapToPlace.md
M	Documentation/README_Toolbox.md
M	Documentation/ReleaseNotes.md
M	Documentation/Rendering/HoverLight.md
M	Documentation/Tools/ControllerMappingTool.md
M	Documentation/Tools/HolographicRemoting.md
A	Documentation/Tools/InputFeatureUsageTool.md
M	Documentation/Updating.md
A	Documentation/WelcomeToMRTK.md
M	Documentation/toc.yml
A	Documentation/usingupm.md

[v2.1.0] README_MRTKStandardShader.md

diff --git a/Documentation/README_MRTKStandardShader.md b/Documentation/README_MRTKStandardShader.md
index 73358e249..ec87f6ccd 100644
--- a/Documentation/README_MRTKStandardShader.md
+++ b/Documentation/README_MRTKStandardShader.md
@@ -19,10 +19,31 @@ You can find a comparison scene to compare and test the MRTK/Standard shader aga
 
 The MRTK/Standard shading system is an "uber shader" that uses [Unity's shader program variant feature](https://docs.unity3d.com/Manual/SL-MultipleProgramVariants.html) to auto-generate optimal shader code based on material properties. When a user selects material properties in the material inspector they only incur performance cost for features they have enabled.
 
-A custom material inspector exists for the MRTK/Standard shader called **MixedRealityStandardShaderGUI.cs**. The inspector automatically enables/disables shader features based on user selection and aides in setting up render state. For more information about each feature please hover over each property in the Unity Editor for a tooltip.
+## Material Inspector
+
+A custom material inspector exists for the MRTK/Standard shader called [`MixedRealityStandardShaderGUI.cs`](xref:Microsoft.MixedReality.Toolkit.Editor.MixedRealityStandardShaderGUI). The inspector automatically enables/disables shader features based on user selection and aids in setting up render state. For more information about each feature, **please hover over each property in the Unity Editor for a tooltip.**
 
 ![Material Inspector](../Documentation/Images/MRTKStandardShader/MRTK_MaterialInspector.jpg)
 
+The first portion of the inspector controls the material's render state. *Rendering Mode* determines when and how a material will be rendered. The aim of the MRTK/Standard shader is to mirror the [rendering modes found in the Unity/Standard shader](https://docs.unity3d.com/Manual/StandardShaderMaterialParameterRenderingMode.html). The MRTK/Standard shader also includes an *Additive* rendering mode and *Custom* rendering mode for complete user control.
+
+| Rendering Mode | Description |
+|----------------|-------------|
+| Opaque         | (Default) Suitable for normal solid objects with no transparent areas. |
+| Cutout         | Allows creation of transparent effects that have hard edges between the opaque and transparent areas. In this mode, there are no semi-transparent areas; the texture is either 100% opaque or invisible. This is useful when using transparency to create the shape of materials such as vegetation. |
+| Fade           | Allows the transparency values to entirely fade an object out, including any specular highlights or reflections it may have. This mode is useful if you want to animate an object fading in or out. It is not suitable for rendering realistic transparent materials such as clear plastic or glass because the reflections and highlights will also be faded out. |
+| Transparent    | Suitable for rendering realistic transparent materials such as clear plastic or glass. In this mode, the material itself will take on transparency values (based on the texture's alpha channel and the alpha of the tint color); however, reflections and lighting highlights will remain visible at full clarity, as is the case with real transparent materials. |
+| Additive       | Enables an additive blending mode which sums the previous pixel color with the current pixel color. This is the preferred transparency mode to avoid transparency sorting issues. |
+| Custom         | Allows every aspect of the rendering mode to be controlled manually. For advanced usage only. |
+
+![Rendering Modes](../Documentation/Images/MRTKStandardShader/MRTK_RenderingModes.jpg)
+
+| Cull Mode | Description |
+|-----------|-------------|
+| Off       | Disables face culling. Culling should only be set to Off when a two-sided mesh is required. |
+| Front     | Enables front face culling. |
+| Back      | (Default) Enables [back face culling](https://en.wikipedia.org/wiki/Back-face_culling). Back face culling should be enabled as often as possible to improve rendering performance. |
+
 ## Performance
 
 One of the primary advantages to using the MRTK Standard shader over the Unity standard shader is performance. The MRTK Standard Shader is extensible to only utilize the features enabled. However, the MRTK Standard shader has also been written to deliver comparable aesthetic results as the Unity Standard shader but at a much lower cost. One simple way to compare shader performance is via the number of operations that needs to be performed on the GPU. Of course, the magnitude of calculations may fluctuate by features enabled and other rendering configurations. But, in general, the MRTK Standard shader performs significantly less computation than the Unity Standard shader.
@@ -56,11 +77,11 @@ For static lighting the shader will respect lightmaps built by Unity's [Lightmap
 
 ### Hover Light
 
-A Hover Light is a Fluent Design System paradigm that mimics a "point light" hovering near the surface of an object. Often used for far away cursor lighting the application can control the properties of a Hover Light via the [**HoverLight.cs**](xref:Microsoft.MixedReality.Toolkit.Utilities.HoverLight). Up to 3 Hover Lights are supported at a time.
+A Hover Light is a Fluent Design System paradigm that mimics a "point light" hovering near the surface of an object, often used for far-away cursor lighting. The application can control the properties of a Hover Light via the [`HoverLight.cs`](xref:Microsoft.MixedReality.Toolkit.Utilities.HoverLight) component. Up to 3 Hover Lights are supported at a time.
 
 ### Proximity Light
 
-A Proximity Light is a Fluent Design System paradigm that mimics a "gradient inverse point light" hovering near the surface of an object. Often used for near cursor lighting the application can control the properties of a Proximity Light via the [**ProximityLight.cs**](xref:Microsoft.MixedReality.Toolkit.Utilities.ProximityLight). Up to 2 Proximity Lights are supported at a time.
+A Proximity Light is a Fluent Design System paradigm that mimics a "gradient inverse point light" hovering near the surface of an object, often used for near cursor lighting. The application can control the properties of a Proximity Light via the [`ProximityLight.cs`](xref:Microsoft.MixedReality.Toolkit.Utilities.ProximityLight) component. Up to 2 Proximity Lights are supported at a time.
 
 ## Lightweight Scriptable Render Pipeline Support
 
@@ -72,13 +93,23 @@ To perform the MRTK upgrade select: **Mixed Reality Toolkit -> Utilities -> Upgr
 
 After the upgrade occurs the MRTK/Standard shader will be altered and any magenta (shader error) materials should be fixed. To verify the upgrade successfully occurred please check the console for: **Upgraded Assets/MixedRealityToolkit/StandardAssets/Shaders/MixedRealityStandard.shader for use with the Lightweight Render Pipeline.**
 
+## UGUI Support
+
+The MRTK Standard shading system works with Unity's built-in [UI system](https://docs.unity3d.com/Manual/UISystem.html). On Unity UI components, the unity_ObjectToWorld matrix is not the transformation matrix of the local transform the Graphic component lives on, but that of its parent Canvas. Many MRTK/Standard shader effects require object scale to be known. To solve this issue, the [`ScaleMeshEffect.cs`](xref:Microsoft.MixedReality.Toolkit.Input.Utilities.ScaleMeshEffect) will store scaling information into UV channel attributes during UI mesh construction.
+
+Note: when using a Unity Image component, it is recommended to specify "None (Sprite)" for the Source Image to prevent Unity UI from generating extra vertices.
+
+A Canvas within the MRTK will prompt for the addition of a [`ScaleMeshEffect.cs`](xref:Microsoft.MixedReality.Toolkit.Input.Utilities.ScaleMeshEffect) when one is required:
+
+![scale mesh effect](../Documentation/Images/MRTKStandardShader/MRTK_ScaleMeshEffect.jpg)
+
 ## Texture Combiner
 
 To improve parity with the Unity Standard shader per pixel metallic, smoothness, emissive, and occlusion values can all be controlled via [channel packing](http://wiki.polycount.com/wiki/ChannelPacking). For example:
 
 ![channel map example](../Documentation/Images/MRTKStandardShader/MRTK_ChannelMap.gif)
 
-When you use channel packing, you only have to sample and load one texture into memory instead of four separate ones. When you write your texture maps in a program like Substance or Photoshop, you can pack hand pack them like so:
+When you use channel packing, you only have to sample and load one texture into memory instead of four separate ones. When you write your texture maps in a program like Substance or Photoshop, you can hand pack them like so:
 
 | Channel | Property             |
 |---------|----------------------|
@@ -97,26 +128,65 @@ This windows can be automatically filled out by selecting a Unity Standard shade
 
 Below are extra details on a handful of the features available with the MRTK/Standard shader.
 
+### Primitive Clipping
+
 Performant plane, sphere, and box shape clipping with the ability to specify which side of the primitive to clip against (inside or outside). You can find a scene that demonstrates advanced usage of clipping primitives in the  **ClippingExamples** scene under: [MixedRealityToolkit.Examples/Demos/StandardShader/Scenes/](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_release/Assets/MixedRealityToolkit.Examples/Demos/StandardShader/Scenes)
 
 ![primitive clipping](../Documentation/Images/MRTKStandardShader/MRTK_PrimitiveClipping.gif)
 
-[**ClippingPlane.cs**](xref:Microsoft.MixedReality.Toolkit.Utilities.ClippingPlane), [**ClippingSphere.cs**](xref:Microsoft.MixedReality.Toolkit.Utilities.ClippingSphere), and [**ClippingBox.cs**](xref:Microsoft.MixedReality.Toolkit.Utilities.ClippingBox) can be used to easily control clipping primitive properties.
+[`ClippingPlane.cs`](xref:Microsoft.MixedReality.Toolkit.Utilities.ClippingPlane), [`ClippingSphere.cs`](xref:Microsoft.MixedReality.Toolkit.Utilities.ClippingSphere), and [`ClippingBox.cs`](xref:Microsoft.MixedReality.Toolkit.Utilities.ClippingBox) can be used to easily control clipping primitive properties.
 
 ![primitive clipping gizmos](../Documentation/Images/MRTKStandardShader/MRTK_PrimitiveClippingGizmos.gif)
 
+### Mesh Outlines
+
+Many mesh outline techniques are done using a [post processing](https://docs.unity3d.com/Manual/PostProcessingOverview.html) technique. Post processing provides great quality outlines, but can be prohibitively expensive on many Mixed Reality devices. You can find a scene that demonstrates usage of mesh outlines in the **OutlineExamples** scene under: [MixedRealityToolkit.Examples/Demos/StandardShader/Scenes/](https://github.com/microsoft/MixedRealityToolkit-Unity/tree/mrtk_release/Assets/MixedRealityToolkit.Examples/Demos/StandardShader/Scenes)
+
+<img src="../Documentation/Images/MRTKStandardShader/MRTK_MeshOutline.jpg" width="900">
+
+[`MeshOutline.cs`](xref:Microsoft.MixedReality.Toolkit.Utilities.MeshOutline) and [`MeshOutlineHierarchy.cs`](xref:Microsoft.MixedReality.Toolkit.Utilities.MeshOutlineHierarchy) can be used to render an outline around a mesh renderer. Enabling this component introduces an additional render pass of the object being outlined, but it is designed to run performantly on mobile Mixed Reality devices and does not utilize any post processes. Limitations of this effect include that it does not work well on objects which are not watertight (or are required to be two-sided), and that depth sorting issues can occur on overlapping objects.
+
+The outline behaviors are designed to be used in conjunction with the MRTK/Standard shader. Outline materials are usually a solid unlit color, but can be configured to achieve a wide array of effects. The default configuration of an outline material is as follows:
+
+<img src="../Documentation/Images/MRTKStandardShader/MRTK_OutlineMaterial.jpg" width="450">
+
+1. Depth Write - should be disabled for outline materials to make sure the outline does not prevent other objects from rendering.
+2. Vertex Extrusion - needs to be enabled to render the outline.
+3. Use Smooth Normals - this setting is optional for some meshes. Extrusion occurs by moving a vertex along a vertex normal; on some meshes, extruding along the default normals will cause discontinuities in the outline. To fix these discontinuities, you can check this box to use another set of smoothed normals which are generated by [`MeshSmoother.cs`](xref:Microsoft.MixedReality.Toolkit.Utilities.MeshSmoother).
+
+[`MeshSmoother.cs`](xref:Microsoft.MixedReality.Toolkit.Utilities.MeshSmoother) is a component which can be used to automatically generate smoothed normals on a mesh. This method groups vertices in a mesh that share the same location in space then averages the normals of those vertices. This process creates a copy of the underlying mesh and should be used only when required.
+
+<img src="../Documentation/Images/MRTKStandardShader/MRTK_SmoothNormals.jpg" width="450">
+
+1. Smooth normals generated via [`MeshSmoother.cs`](xref:Microsoft.MixedReality.Toolkit.Utilities.MeshSmoother).
+2. Default normals used; notice the artifacts around the cube corners.
+
+### Stencil Testing
+
 Built in configurable stencil test support to achieve a wide array of effects. Such as portals:
 
 ![stencil test](../Documentation/Images/MRTKStandardShader/MRTK_StencilTest.gif)
 
+### Instanced Color Support
+
 Instanced color support to give thousands of GPU instanced meshes unique material properties:
 
 ![instanced properties](../Documentation/Images/MRTKStandardShader/MRTK_InstancedProperties.gif)
 
+### Triplanar Mapping
+
 Triplanar mapping is a technique to programmatically texture a mesh. Often used in terrain, meshes without UVs, or difficult to unwrap shapes. This implementation supports world or local space projection, the specification of blending smoothness, and normal map support. Note, each texture used requires 3 texture samples, so please use sparingly in performance critical situations.
 
 ![triplanar](../Documentation/Images/MRTKStandardShader/MRTK_TriplanarMapping.gif)
 
+### Vertex Extrusion
+
+Vertex extrusion in world space. Useful for visualizing extruded bounding volumes or transitioning meshes in/out.
+
+![vertex extrusion](../Documentation/Images/MRTKStandardShader/MRTK_VertexExtrusion.gif)
+
+### Miscellaneous
+
 A checkbox to control albedo optimizations. As an optimization albedo operations are disabled when no albedo texture is specified. This is useful for controlling [remote texture loading](http://dotnetbyexample.blogspot.com/2018/10/workaround-remote-texture-loading-does.html).
 
 Simply check this box:
@@ -127,10 +197,6 @@ Per pixel clipping textures, local edge based anti aliasing, and normal map scal
 
 ![normal map scale](../Documentation/Images/MRTKStandardShader/MRTK_NormalMapScale.gif)
 
-Vertex extrusion in world space. Useful for visualizing extruded bounding volumes or transitions in/out meshes.
-
-![normal map scale](../Documentation/Images/MRTKStandardShader/MRTK_VertexExtrusion.gif)
-
 ## See also
 
 - [Interactable](README_Interactable.md)
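The smoothing approach described above for `MeshSmoother.cs` (vertices that share a position get one averaged normal) can be sketched as follows. This is an illustrative pure-Python model, not the Unity C# implementation; a real version operates on Unity `Mesh` data.

```python
# Illustrative sketch of MeshSmoother-style normal smoothing:
# group vertices by position, then average and renormalize their normals.
from collections import defaultdict
import math

def smooth_normals(positions, normals):
    """positions/normals: parallel lists of (x, y, z) tuples.
    Returns a new normal list where vertices at the same position
    share one averaged, renormalized normal."""
    groups = defaultdict(list)
    for i, p in enumerate(positions):
        groups[p].append(i)
    smoothed = list(normals)
    for indices in groups.values():
        sx = sum(normals[i][0] for i in indices)
        sy = sum(normals[i][1] for i in indices)
        sz = sum(normals[i][2] for i in indices)
        length = math.sqrt(sx * sx + sy * sy + sz * sz) or 1.0
        avg = (sx / length, sy / length, sz / length)
        for i in indices:
            smoothed[i] = avg
    return smoothed

# Two duplicated corner vertices with perpendicular face normals:
pos = [(0, 0, 0), (0, 0, 0), (1, 0, 0)]
nrm = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
out = smooth_normals(pos, nrm)
# The duplicated vertex pair now shares one diagonal normal,
# which removes the extrusion discontinuity at that corner.
```

As the docs note, this duplicates data (here a new list; in Unity a copy of the mesh), so it should only be used when the default normals cause visible outline artifacts.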

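The channel-packing workflow from the Texture Combiner section above can be sketched offline. This is a hypothetical helper, not part of MRTK, and it assumes the common layout of red = metallic, green = occlusion, blue = emission, alpha = smoothness:

```python
# Hypothetical helper (not part of MRTK): pack four grayscale maps into
# one RGBA texture so the shader samples one texture instead of four.
def pack_channels(metallic, occlusion, emission, smoothness):
    """Each argument is a flat list of 0-255 grayscale values of equal
    length. Returns a flat list of RGBA tuples."""
    assert len(metallic) == len(occlusion) == len(emission) == len(smoothness)
    return [
        (m, o, e, s)
        for m, o, e, s in zip(metallic, occlusion, emission, smoothness)
    ]

# A 2x1 "texture": metallic/occluded/smooth on the left, dull on the right.
packed = pack_channels([255, 0], [255, 128], [0, 0], [200, 10])
print(packed)  # [(255, 255, 0, 200), (0, 128, 0, 10)]
```

In practice the same packing is done in an authoring tool such as Substance or Photoshop, or via the Texture Combiner window mentioned in the diff.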
[v2.1.0] SpatialAwareness/SpatialAwarenessGettingStarted.md

diff --git a/Documentation/SpatialAwareness/SpatialAwarenessGettingStarted.md b/Documentation/SpatialAwareness/SpatialAwarenessGettingStarted.md
index e73754d42..cd92fa4e9 100644
--- a/Documentation/SpatialAwareness/SpatialAwarenessGettingStarted.md
+++ b/Documentation/SpatialAwareness/SpatialAwarenessGettingStarted.md
@@ -2,66 +2,70 @@
 
 ![Spatial Awareness](../../Documentation/Images/SpatialAwareness/MRTK_SpatialAwareness_Main.png)
 
-The Spatial Awareness system provides real-world environmental awareness in mixed reality applications. When introduced on Microsoft HoloLens, spatial awareness provided a collection of meshes, representing the geometry of the environment, which allowed for compelling interactions between holograms and the real-world.
+The Spatial Awareness system provides real-world environmental awareness in mixed reality applications. When introduced on Microsoft HoloLens, Spatial Awareness provided a collection of meshes, representing the geometry of the environment, which allowed for compelling interactions between holograms and the real-world.
 
-## Getting Started
-
-Adding support for spatial awareness requires two key components of the Mixed Reality Toolkit: the spatial awareness system and a supported platform provider.
+> [!NOTE]
+> At this time, the Mixed Reality Toolkit does not ship with Spatial Understanding algorithms as originally packaged in the HoloToolkit. Spatial Understanding generally involves transforming Spatial Mesh data to create simplified and/or grouped Mesh data such as planes, walls, floors, ceilings, etc.
 
-1. [Enable](#enable-spatial-awareness) the spatial awareness system
-2. [Register](#register-observers) and [configure](#configure-observers) one or more spatial observers
-3. [Build and deploy](#build-and-deploy) to a platform that supports spatial awareness
+## Getting Started
 
-### Enable Spatial Awareness
+Adding support for Spatial Awareness requires two key components of the Mixed Reality Toolkit: the Spatial Awareness system and a supported platform provider.
 
-The spatial awareness system is managed by the MixedRealityToolkit object (or another [service registrar](xref:Microsoft.MixedReality.Toolkit.IMixedRealityServiceRegistrar) component). The steps below are not necessary for users of the default profile (DefaultMixedRealityToolkitConfigurationProfile) which has this system already enabled. The following steps presume use of the MixedRealityToolkit object. Steps required for other service registrars may be different.
+1. [Enable](#enable-the-spatial-awareness-system) the Spatial Awareness system
+2. [Register](#register-observers) and [configure](ConfiguringSpatialAwarenessMeshObserver.md) one or more spatial observers to provide mesh data
+3. [Build and deploy](#build-and-deploy) to a platform that supports Spatial Awareness
 
-> [!NOTE]
-> The spatial awareness system is disabled by default on the default HoloLens 2 profile (DefaultHoloLens2ConfigurationProfile), and the intent of this is to avoid the visual overhead of calculating and rendering the meshes.
+### Enable the Spatial Awareness system
 
-1. Select the MixedRealityToolkit object in the scene hierarchy.
+The Spatial Awareness system is managed by the MixedRealityToolkit object (or another [service registrar](xref:Microsoft.MixedReality.Toolkit.IMixedRealityServiceRegistrar) component). Follow the steps below to enable or disable the *Spatial Awareness system* in the *MixedRealityToolkit* profile.
 
-![MRTK Configured Scene Hierarchy](../../Documentation/Images/MRTK_ConfiguredHierarchy.png)
+The Mixed Reality Toolkit ships with a few default pre-configured profiles. Some of these have the Spatial Awareness system enabled or disabled by default. The intent of this pre-configuration, particularly when disabled, is to avoid the visual overhead of calculating and rendering the meshes.
 
-2. Navigate the Inspector panel to the Spatial Awareness System section and check *Enable Spatial Awareness System*
+| Profile | System Enabled by Default |
+| --- | --- |
+| [DefaultHoloLens1ConfigurationProfile](https://github.com/microsoft/MixedRealityToolkit-Unity/blob/mrtk_development/Assets/MixedRealityToolkit.SDK/Profiles/HoloLens1/DefaultHoloLens1ConfigurationProfile.asset) | False |
+| [DefaultHoloLens2ConfigurationProfile](https://github.com/microsoft/MixedRealityToolkit-Unity/blob/mrtk_development/Assets/MixedRealityToolkit.SDK/Profiles/HoloLens2/DefaultHoloLens2ConfigurationProfile.asset) | False |
+| [DefaultMixedRealityToolkitConfigurationProfile](https://github.com/microsoft/MixedRealityToolkit-Unity/blob/mrtk_development/Assets/MixedRealityToolkit.SDK/Profiles/DefaultMixedRealityToolkitConfigurationProfile.asset) | True |
 
-![Enable Spatial Awareness](../../Documentation/Images/SpatialAwareness/MRTKConfig_SpatialAwareness.png)
+1. Select the MixedRealityToolkit object in the scene hierarchy to open in the Inspector Panel.
 
-3. Select the Spatial Awareness System implementation
+    ![MRTK Configured Scene Hierarchy](../../Documentation/Images/MRTK_ConfiguredHierarchy.png)
 
-![Select the Spatial Awareness System Implementation](../../Documentation/Images/SpatialAwareness/SpatialAwarenessSelectSystemType.png)
+1. Navigate to the *Spatial Awareness System* section and check *Enable Spatial Awareness System*
 
-### Register observers
+    ![Enable Spatial Awareness](../../Documentation/Images/SpatialAwareness/MRTKConfig_SpatialAwareness.png)
 
-Before the spatial awareness system can provide applications with data about the real-world, at least one spatial observer must be registered. Spatial observers are generally platform specific components that may vary in the type(s) of data (ex: meshes) provided.
+1. Select the desired Spatial Awareness system implementation type. The [`MixedRealitySpatialAwarenessSystem`](xref:Microsoft.MixedReality.Toolkit.SpatialAwareness.MixedRealitySpatialAwarenessSystem) is the default provided.
 
-1. Open or expand the Spatial Awareness System profile
+    ![Select the Spatial Awareness System Implementation](../../Documentation/Images/SpatialAwareness/SpatialAwarenessSelectSystemType.png)
 
-![Spatial Awareness System Profile](../../Documentation/Images/SpatialAwareness/SpatialAwarenessProfile.png)
-
-2. Click "Add Spatial Observer"
-3. Select the Spatial Observer implementation
+### Register observers
 
-![Select the Spatial Observer Implementation](../../Documentation/Images/SpatialAwareness/SpatialAwarenessSelectObserver.png)
+Services in the Mixed Reality Toolkit can have [Data Provider services](../Architecture/SystemsExtensionsProviders.md) that supplement the main service with platform specific data and implementation controls. An example of this is the Mixed Reality Input System which has [multiple data providers](../Input/InputProviders.md) to get controller and other related input information from various platform-specific APIs.
 
-> [!NOTE]
-> Users of the default profile (DefaultMixedRealitSpatialAwarenessSystemProfile) will have the spatial awareness system pre-configured to use the [WindowsMixedRealitySpatialMeshObserver](xref:Microsoft.MixedReality.Toolkit.WindowsMixedReality.SpatialAwareness.WindowsMixedRealitySpatialMeshObserver) from the Mixed Reality Toolkit Windows Mixed Reality Provider package.
+The Spatial Awareness system is similar in that data providers supply the system with mesh data about the real world. The Spatial Awareness profile must have at least one Spatial Observer registered. Spatial Observers are generally platform-specific components that act as the provider for surfacing various types of mesh data from a platform-specific endpoint (e.g. HoloLens).
 
-#### Configure observers
+1. Open or expand the *Spatial Awareness System profile*
 
-Once the spatial observer(s) have been registered with the system, they can be configured to provide the desired data. When configuring a spatial observer, many implementations will auto-populate the observer's configuration profile with common default values.
+    ![Spatial Awareness System Profile](../../Documentation/Images/SpatialAwareness/SpatialAwarenessProfile.png)
 
-1. Open or expand the Spatial Observer profile
+1. Click the *"Add Spatial Observer"* button
+1. Select the desired *Spatial Observer implementation type*
 
-![Spatial Mesh Observer Profile](../../Documentation/Images/SpatialAwareness/SpatialAwarenessMeshObserverProfile.png)
+    ![Select the Spatial Observer Implementation](../../Documentation/Images/SpatialAwareness/SpatialAwarenessSelectObserver.png)
 
-2. Configure the desired options
+1. [Modify configuration properties on the observer](ConfiguringSpatialAwarenessMeshObserver.md) as necessary
 
-The illustration in the previous step shows the configuration profile for a spatial mesh observer. Please see [Configuring the Spatial Awareness Mesh Observer](ConfiguringSpatialAwarenessMeshObserver.md) for more information pertaining to the specific settings available to mesh observers. Other observers may have similar settings.
+> [!NOTE]
+> Users of the [DefaultMixedRealityToolkitConfigurationProfile](https://github.com/microsoft/MixedRealityToolkit-Unity/blob/mrtk_development/Assets/MixedRealityToolkit.SDK/Profiles/DefaultMixedRealityToolkitConfigurationProfile.asset) will have the Spatial Awareness system pre-configured for the Windows Mixed Reality platform, which uses
+the [`WindowsMixedRealitySpatialMeshObserver`](xref:Microsoft.MixedReality.Toolkit.WindowsMixedReality.SpatialAwareness.WindowsMixedRealitySpatialMeshObserver) class.
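
Observers can also be controlled from application code once the profile is configured. The following is a minimal sketch, assuming the Spatial Awareness system is enabled in the active MRTK profile and accessed through the `CoreServices` helper:

```csharp
using Microsoft.MixedReality.Toolkit;
using UnityEngine;

// Minimal sketch: toggles spatial mesh observation at runtime,
// for example once the user has finished scanning the room.
public class SpatialScanToggle : MonoBehaviour
{
    // Suspend all registered spatial observers.
    public void StopScanning() =>
        CoreServices.SpatialAwarenessSystem?.SuspendObservers();

    // Resume all registered spatial observers.
    public void StartScanning() =>
        CoreServices.SpatialAwarenessSystem?.ResumeObservers();
}
```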
 
 ### Build and Deploy
 
-Once the spatial awareness system is configured with the desired observer(s), the project can be built and deployed to the target platform.
+Once the Spatial Awareness system is configured with the desired observer(s), the project can be built and deployed to the target platform.
+
+> [!IMPORTANT]
+> If targeting the Windows Mixed Reality platform (ex: HoloLens), it is important to ensure the [Spatial Perception capability](https://docs.microsoft.com/en-us/windows/mixed-reality/spatial-mapping-in-unity) is enabled in order to use the Spatial Awareness system on device.
 
 > [!WARNING]
 > Some platforms, including Microsoft HoloLens, provide support for remote execution from within Unity. This feature enables rapid development and testing without requiring the build and deploy step. Be sure to do final acceptance testing using a built and deployed version of the application, running on the target hardware and platform.
@@ -69,5 +73,8 @@ Once the spatial awareness system is configured with the desired observer(s), th
 ## See Also
 
 - [Spatial Awareness API documentation](xref:Microsoft.MixedReality.Toolkit.SpatialAwareness)
-- [Configuring the Spatial Awareness Mesh Observer](ConfiguringSpatialAwarenessMeshObserver.md)
-- [Spatial Object Mesh Observer](SpatialObjectMeshObserver.md)
\ No newline at end of file
+- [Configuring Observer for Device](ConfiguringSpatialAwarenessMeshObserver.md)
+- [Configuring Observer for Editor](SpatialObjectMeshObserver.md)
+- [Creating a custom Observer](CreateDataProvider.md)
+- [Spatial Mapping Overview WMR](https://docs.microsoft.com/en-us/windows/mixed-reality/spatial-mapping)
+- [Spatial Mapping in Unity WMR](https://docs.microsoft.com/en-us/windows/mixed-reality/spatial-mapping-in-unity)

[v2.1.0] README_Button

diff --git a/Documentation/README_Button.md b/Documentation/README_Button.md
index ec50ab2ae..5004a4417 100644
--- a/Documentation/README_Button.md
+++ b/Documentation/README_Button.md
@@ -1,12 +1,21 @@
-# Button #
+# Button
 
 ![Button](../Documentation/Images/Button/MRTK_Button_Main.png)
 
 A button gives the user a way to trigger an immediate action. It is one of the most foundational components in mixed reality. MRTK provides various types of button prefabs.
 
-## Button prefabs in MRTK ##
+## Button prefabs in MRTK
+
 Examples of the button prefabs are available under the ``MixedRealityToolkit.SDK/Features/UX/Interactable/Prefabs`` folder
 
+### Unity UI Image/Graphic based buttons
+
+* [`PressableButtonUnityUI.prefab`](https://github.com/Microsoft/MixedRealityToolkit-Unity/blob/mrtk_release/Assets/MixedRealityToolkit.SDK/Features/UX/Interactable/Prefabs/PressableButtonUnityUI.prefab)
+* [`PressableButtonUnityUICircular.prefab`](https://github.com/Microsoft/MixedRealityToolkit-Unity/blob/mrtk_release/Assets/MixedRealityToolkit.SDK/Features/UX/Interactable/Prefabs/PressableButtonUnityUICircular.prefab)
+* [`PressableButtonHoloLens2UnityUI.prefab`](https://github.com/Microsoft/MixedRealityToolkit-Unity/blob/mrtk_release/Assets/MixedRealityToolkit.SDK/Features/UX/Interactable/Prefabs/PressableButtonHoloLens2UnityUI.prefab)
+
+### Collider based buttons
+
 |  ![PressableButtonHoloLens2](../Documentation/Images/Button/MRTK_Button_Prefabs_HoloLens2.png) PressableButtonHoloLens2 | ![PressableButtonHoloLens2Unplated](../Documentation/Images/Button/MRTK_Button_Prefabs_HoloLens2Unplated.png) PressableButtonHoloLens2Unplated | ![PressableButtonHoloLens2Circular](../Documentation/Images/Button/MRTK_Button_Prefabs_HoloLens2Circular.png) PressableButtonHoloLens2Circular |
 |:--- | :--- | :--- |
 | HoloLens 2's shell-style button with backplate which supports various visual feedback such as border light, proximity light, and compressed front plate | HoloLens 2's shell-style button without backplate  | HoloLens 2's shell-style button with circular shape  |
@@ -19,11 +28,22 @@ Examples of the button prefabs under ``MixedRealityToolkit.SDK/Features/UX/Inter
 
 The [`Button.prefab`](https://github.com/Microsoft/MixedRealityToolkit-Unity/blob/mrtk_release/Assets/MixedRealityToolkit.SDK/Features/UX/Interactable/Prefabs/Button.prefab) is based on the [Interactable](README_Interactable.md) concept to provide easy UI controls for buttons or other types of interactive surfaces. The baseline button supports all available input methods, including articulated hand input for the near interactions as well as gaze + air-tap for the far interactions. You can also use voice command to trigger the button.
 
-[`PressableButtonHoloLens2.prefab`](https://github.com/Microsoft/MixedRealityToolkit-Unity/blob/mrtk_release/Assets/MixedRealityToolkit.SDK/Features/UX/Interactable/Prefabs/PressableButtonHoloLens2.prefab) is HoloLens 2's shell style button that supports the precise movement of the button for the direct hand tracking input. It combines `Interactable` script with `PressableButton` script. 
+[`PressableButtonHoloLens2.prefab`](https://github.com/Microsoft/MixedRealityToolkit-Unity/blob/mrtk_release/Assets/MixedRealityToolkit.SDK/Features/UX/Interactable/Prefabs/PressableButtonHoloLens2.prefab) is HoloLens 2's shell-style button that supports precise movement of the button for direct hand tracking input. It combines the `Interactable` script with the `PressableButton` script.
+
+## How to use pressable buttons
+
+### Unity UI based buttons
+
+Create a Canvas with:
+
+* Render Mode set to World Space
+* A scale of 0.001
+* CanvasUtility component
 
-## How to use pressable buttons ##
+Then drag [`PressableButtonUnityUI.prefab`](https://github.com/Microsoft/MixedRealityToolkit-Unity/blob/mrtk_release/Assets/MixedRealityToolkit.SDK/Features/UX/Interactable/Prefabs/PressableButtonUnityUI.prefab), [`PressableButtonUnityUICircular.prefab`](https://github.com/Microsoft/MixedRealityToolkit-Unity/blob/mrtk_release/Assets/MixedRealityToolkit.SDK/Features/UX/Interactable/Prefabs/PressableButtonUnityUICircular.prefab), or [`PressableButtonHoloLens2UnityUI.prefab`](https://github.com/Microsoft/MixedRealityToolkit-Unity/blob/mrtk_release/Assets/MixedRealityToolkit.SDK/Features/UX/Interactable/Prefabs/PressableButtonHoloLens2UnityUI.prefab) onto the canvas.
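
The canvas setup above can also be done from code. A hedged sketch follows; the factory name is illustrative, and `CanvasUtility` is assumed to live in the `Microsoft.MixedReality.Toolkit.Input.Utilities` namespace:

```csharp
using Microsoft.MixedReality.Toolkit.Input.Utilities;
using UnityEngine;

// Illustrative sketch: creates a world-space canvas scaled for MRTK UnityUI buttons.
public static class MrtkCanvasFactory
{
    public static Canvas CreateButtonCanvas()
    {
        var go = new GameObject("ButtonCanvas");
        var canvas = go.AddComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;      // required for placement in the scene
        go.transform.localScale = Vector3.one * 0.001f; // 1 canvas unit = 1 mm
        go.AddComponent<CanvasUtility>();               // routes MRTK input to Unity UI
        return canvas;
    }
}
```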
 
-Simply drag [`PressableButtonHoloLens2.prefab`](https://github.com/Microsoft/MixedRealityToolkit-Unity/blob/mrtk_release/Assets/MixedRealityToolkit.SDK/Features/UX/Interactable/Prefabs/PressableButtonHoloLens2.prefab) or [`PressableButtonHoloLens2Unplated.prefab`](https://github.com/Microsoft/MixedRealityToolkit-Unity/blob/mrtk_release/Assets/MixedRealityToolkit.SDK/Features/UX/Interactable/Prefabs/PressableButtonHoloLens2Unplated.prefab) into the scene. These button prefabs are already configured to have audio-visual feedback for the various types of inputs, including articulated hand input and gaze.
+### Collider based buttons
+
+Simply drag [`PressableButtonHoloLens2.prefab`](https://github.com/Microsoft/MixedRealityToolkit-Unity/blob/mrtk_release/Assets/MixedRealityToolkit.SDK/Features/UX/Interactable/Prefabs/PressableButtonHoloLens2.prefab) or [`PressableButtonHoloLens2Unplated.prefab`](https://github.com/Microsoft/MixedRealityToolkit-Unity/blob/mrtk_release/Assets/MixedRealityToolkit.SDK/Features/UX/Interactable/Prefabs/PressableButtonHoloLens2Unplated.prefab) into the scene. These button prefabs are already configured to have audio-visual feedback for the various types of inputs, including articulated hand input and gaze.
 
 The events exposed in the prefab itself as well as the [Interactable](README_Interactable.md) component can be used to trigger additional actions. The pressable buttons in the [HandInteractionExample scene](README_HandInteractionExamples.md) use Interactable's *OnClick* event to trigger a change in the color of a cube. This event gets triggered for different types of input methods such as gaze, air-tap, hand-ray, as well as physical button presses through the pressable button script.
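
An *OnClick* handler can also be attached from code instead of the Inspector. A hedged sketch (the component and field names below are illustrative, not part of MRTK):

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Illustrative sketch: changes a renderer's color whenever the button is clicked.
public class ButtonColorChanger : MonoBehaviour
{
    [SerializeField] private Interactable button = null;
    [SerializeField] private Renderer target = null;

    private void OnEnable()
    {
        // OnClick fires for gaze, air-tap, hand-ray, and physical presses alike.
        button.OnClick.AddListener(SetRandomColor);
    }

    private void OnDisable() => button.OnClick.RemoveListener(SetRandomColor);

    private void SetRandomColor() => target.material.color = Random.ColorHSV();
}
```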
 
@@ -37,7 +57,7 @@ To leverage specific articulated hand input state information, you can use press
 
 <img src="../Documentation/Images/Button/MRTK_Button_HowTo_PressableButton.png" width="450">
 
-## Interaction States ##
+## Interaction States
 
 In the idle state, the button's front plate is not visible. As a finger approaches or a cursor from gaze input targets the surface, the front plate's glowing border becomes visible. There is additional highlighting of the fingertip position on the front plate surface. When pushed with a finger, the front plate moves with the fingertip. When the fingertip touches the surface of the front plate, it shows a subtle pulse effect to give visual feedback of the touch point.
 
@@ -45,7 +65,7 @@ In the idle state, the button's front plate is not visible. As a finger approach
 
 The subtle pulse effect is triggered by the pressable button, which looks for *ProximityLight(s)* that live on the currently interacting pointer. If any proximity lights are found, the `ProximityLight.Pulse` method is called, which automatically animates shader parameters to display a pulse.
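
The same pulse can be triggered manually from script, for example to acknowledge a programmatic press. A sketch, assuming `ProximityLight` is in the `Microsoft.MixedReality.Toolkit.Utilities` namespace:

```csharp
using Microsoft.MixedReality.Toolkit.Utilities;
using UnityEngine;

// Illustrative sketch: fires the pulse animation on a referenced ProximityLight.
public class PulseOnDemand : MonoBehaviour
{
    [SerializeField] private ProximityLight proximityLight = null;

    // Animates the shader parameters just as a fingertip touch would.
    public void Pulse() => proximityLight.Pulse();
}
```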
 
-## Inspector properties ##
+## Inspector properties
 
 ![Button](../Documentation/Images/Button/MRTK_Button_Structure.png)
 
@@ -68,19 +88,41 @@ Unity audio source for the audio feedback clips.
 Required to make any object touchable with articulated hand input.
 
 ## Prefab Layout
+
 The *ButtonContent* object contains the front plate, text label, and icon. The *FrontPlate* responds to the proximity of the index fingertip using the *Button_Box* shader. It shows glowing borders, proximity light, and a pulse effect on touch. The text label is made with TextMesh Pro. *SeeItSayItLabel*'s visibility is controlled by [Interactable](README_Interactable.md)'s theme.
 
 ![Button](../Documentation/Images/Button/MRTK_Button_Layout.png)
 
-## Voice command ('See-it, Say-it') ##
+## How to change the icon and text
+
+To change the text of the button, update the *Text* component of the *TextMeshPro* object under *IconAndText*. To change the icon, replace the material assigned to the *UIButtonSquareIcon* object. By default, *HolographicButtonIconFontMaterial* is assigned.
+
+<img src="../Documentation/Images/Button/MRTK_Button_IconUpdate1.png">
+
+To create a new icon material, duplicate one of the existing icon materials, which can be found under the ``MixedRealityToolkit.SDK/Features/UX/Interactable/Materials`` folder.
+
+<img src="../Documentation/Images/Button/MRTK_Button_IconUpdate2.png"  width="350">
+
+Create a new PNG texture and import it into Unity, using the existing icon PNG files under ``MixedRealityToolkit.SDK/Features/UX/Interactable/Textures`` as a reference.
+
+Drag and drop the newly created PNG texture onto the *Albedo* property of the material.
+
+<img src="../Documentation/Images/Button/MRTK_Button_IconUpdate3.png">
+
+Assign the material to the *UIButtonSquareIcon* object.
+
+<img src="../Documentation/Images/Button/MRTK_Button_IconUpdate4.png">
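
The label can also be changed at runtime. A minimal sketch, assuming the prefab's label is the first TextMeshPro component under the button root:

```csharp
using TMPro;
using UnityEngine;

// Illustrative sketch: updates a button's text label from code.
public class ButtonLabel : MonoBehaviour
{
    public void SetLabel(string text)
    {
        // The prefab's label lives on a TextMeshPro child object.
        GetComponentInChildren<TextMeshPro>().text = text;
    }
}
```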
+
+
+## Voice command ('See-it, Say-it')
 
 **Speech Input Handler**
-The [Interactable](README_Interactable.md) script in Pressable Button already implements `IMixedRealitySpeechHandler`. A voice command keyword can be set here. 
+The [Interactable](README_Interactable.md) script in Pressable Button already implements `IMixedRealitySpeechHandler`. A voice command keyword can be set here.
 
 <img src="../Documentation/Images/Button/MRTK_Button_Speech1.png" width="450">
 
 **Speech Input Profile**
-Additionally, you need to register the voice command keyword in the global *Speech Commands Profile*. 
+Additionally, you need to register the voice command keyword in the global *Speech Commands Profile*.
 
 <img src="../Documentation/Images/Button/MRTK_Button_Speech2.png" width="450">
 
@@ -89,17 +131,19 @@ The pressable button prefab has a placeholder TextMesh Pro label under the *SeeI
 
 <img src="../Documentation/Images/Button/MRTK_Button_Speech3.png" width="450">
 
-## How to make a button from scratch ##
+## How to make a button from scratch
+
 You can find examples of these buttons in the **PressableButtonExample** scene.
 
 <img src="../Documentation/Images/Button/MRTK_PressableButtonCube0.png">
 
 ### 1. Creating a Pressable Button with Cube (Near interaction only)
+
 1. Create a Unity Cube (GameObject > 3D Object > Cube)
 2. Add `PressableButton.cs` script
 3. Add `NearInteractionTouchable.cs` script
 
-In the `PressableButton`'s Inspector panel, assign the cube object to the **Moving Button Visuals**. 
+In the `PressableButton`'s Inspector panel, assign the cube object to the **Moving Button Visuals**.
 
 <img src="../Documentation/Images/Button/MRTK_PressableButtonCube3.png" width="450">
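
Steps 1–3 above can also be performed from code. A hedged sketch (the factory name is illustrative; *Moving Button Visuals* is still assigned in the Inspector):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Illustrative sketch: builds the minimal pressable cube from the steps above.
public static class PressableCubeFactory
{
    public static GameObject Create()
    {
        // 1. Create a Unity Cube.
        var cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.localScale = new Vector3(0.1f, 0.1f, 0.02f);

        // 2. Add the PressableButton script.
        cube.AddComponent<PressableButton>();

        // 3. Add the NearInteractionTouchable script.
        cube.AddComponent<NearInteractionTouchable>();

        // Assign the cube to Moving Button Visuals in the Inspector.
        return cube;
    }
}
```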
 
@@ -114,6 +158,7 @@ When you press the button, it will move and generate proper events exposed in th
 <img src="../Documentation/Images/Button/MRTK_PressableButtonCubeRun1.jpg">
 
 ### 2. Adding visual feedback to the basic cube button
+
 MRTK Standard Shader provides various features that make it easy to add visual feedback. Create a new material and select the shader `Mixed Reality Toolkit/Standard`, or use or duplicate one of the existing materials under `/SDK/StandardAssets/Materials/` that use the MRTK Standard Shader.
 
 <img src="../Documentation/Images/Button/MRTK_PressableButtonCube4.png" width="450">
@@ -125,6 +170,7 @@ Check `Hover Light` and `Proximity Light` under **Fluent Options**. This enables
 <img src="../Documentation/Images/Button/MRTK_PressableButtonCubeRun2.jpg">
 
 ### 3. Adding audio feedback to the basic cube button
+
 Since the `PressableButton.cs` script exposes events such as TouchBegin(), TouchEnd(), ButtonPressed(), and ButtonReleased(), we can easily assign audio feedback. Simply add Unity's `Audio Source` component to the cube object, then assign audio clips by selecting AudioSource.PlayOneShot(). You can use the MRTK_Select_Main and MRTK_Select_Secondary audio clips under the `/SDK/StandardAssets/Audio/` folder.
 
 <img src="../Documentation/Images/Button/MRTK_PressableButtonCube7.png" width="450">
@@ -132,10 +178,10 @@ Since `PressableButton.cs` script exposes events such as TouchBegin(), TouchEnd(
 <img src="../Documentation/Images/Button/MRTK_PressableButtonCube6.png" width="450">
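
The event wiring described above can equally be done in code. A sketch (field names are illustrative; the clips referenced in the comments are the ones shipped with MRTK):

```csharp
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Illustrative sketch: plays audio clips on the PressableButton events.
[RequireComponent(typeof(AudioSource))]
public class ButtonAudioFeedback : MonoBehaviour
{
    [SerializeField] private PressableButton button = null;
    [SerializeField] private AudioClip pressClip = null;   // e.g. MRTK_Select_Main
    [SerializeField] private AudioClip releaseClip = null; // e.g. MRTK_Select_Secondary

    private void Start()
    {
        var source = GetComponent<AudioSource>();
        button.ButtonPressed.AddListener(() => source.PlayOneShot(pressClip));
        button.ButtonReleased.AddListener(() => source.PlayOneShot(releaseClip));
    }
}
```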
 
 ### 4. Adding visual states and handle far interaction events
+
 [Interactable](README_Interactable.md) is a script that makes it easy to create visual states for the various types of input interactions. It also handles far interaction events. Add `Interactable.cs` and drag and drop the cube object onto the **Target** field under **Profiles**. Then, create a new Theme with the type **ScaleOffsetColorTheme**. Under this theme, you can specify the color of the object for specific interaction states such as **Focus** and **Pressed**. You can also control Scale and Offset. Check **Easing** and set a duration to make the visual transition smooth.
 
- <img src="../Documentation/Images/Button/MRTK_PressableButtonCube8.png" width="450">
-  <img src="../Documentation/Images/Button/MRTK_PressableButtonCube9.png" width="450">
+![Select profile theme](Images/Button/mrtk_button_profiles.gif)
 
 You will see the object respond to both far (hand ray or gaze cursor) and near (hand) interactions.
 
@@ -154,4 +200,7 @@ Each piano key has a `PressableButton` and a `NearInteractionTouchable` script a
 
 <img src="../Documentation/Images/Button/MRTK_Button_Custom3.png" width="450">
 
+## See also
 
+* [Interactable](README_Interactable.md)
+* [Visual Themes](VisualThemes.md)

Updating the documentation for v2.1.0

Changes (files under Documentation, excluding images)

M Documentation/Architecture/InputSystem/ControllersPointersAndFocus.md
M Documentation/Architecture/InputSystem/Terminology.md
M Documentation/Architecture/SpatialAwareness.md
A Documentation/Architecture/SystemsExtensionsProviders.md
M Documentation/Contributing/Roadmap.md
M Documentation/Contributing/UnitTests.md
A Documentation/Extensions/ExtensionServices.md
M Documentation/EyeTracking/EyeTracking_BasicSetup.md
M Documentation/EyeTracking/EyeTracking_TargetSelection.md
M Documentation/GettingStartedWithTheMRTK.md
A Documentation/Input/CreateDataProvider.md
R091 Documentation/InputSystem/HandTracking.md Documentation/Input/HandTracking.md
A Documentation/Input/HowToAddNearInteractivity.md
M Documentation/Input/InputActions.md
M Documentation/Input/InputEvents.md
M Documentation/Input/InputProviders.md
A Documentation/Input/InputState.md
M Documentation/Input/Overview.md
M Documentation/Input/Pointers.md
M Documentation/Input/Speech.md
M Documentation/InputSimulation/InputAnimationRecording.md
M Documentation/InputSimulation/InputSimulationService.md
A Documentation/MRTKNuGetPackage.md
M Documentation/MixedRealityConfigurationGuide.md
M Documentation/README_BoundingBox.md
M Documentation/README_Button.md
A Documentation/README_ExampleHub.md
M Documentation/README_Interactable.md
M Documentation/README_MRTKStandardShader.md
A Documentation/README_NearMenu.md
M Documentation/README_Solver.md
M Documentation/ReleaseNotes.md
M Documentation/SpatialAwareness/ConfiguringSpatialAwarenessMeshObserver.md
A Documentation/SpatialAwareness/CreateDataProvider.md
M Documentation/SpatialAwareness/SpatialAwarenessGettingStarted.md
M Documentation/SpatialAwareness/SpatialObjectMeshObserver.md
M Documentation/SpatialAwareness/UsageGuide.md
A Documentation/Tools/ControllerMappingTool.md
M Documentation/Tools/ExtensionServiceCreationWizard.md
M Documentation/Tools/OptimizeWindow.md
M Documentation/Updating.md
A Documentation/VisualThemes.md
M Documentation/toc.yml

Update to v2.3.0

Japanese translation updated

M Documentation/Architecture/FrameworkAndRuntime.md
M Documentation/Architecture/InputSystem/Terminology.md
M Documentation/BuildAndDeploy.md
M Documentation/DetectingPlatformCapabilities.md
M Documentation/Diagnostics/ConfiguringDiagnostics.md
M Documentation/Diagnostics/DiagnosticsSystemGettingStarted.md
M Documentation/Diagnostics/UsingVisualProfiler.md
M Documentation/DownloadingTheMRTK.md
M Documentation/EyeTracking/EyeTracking_Positioning.md
M Documentation/EyeTracking/EyeTracking_ExamplesOverview.md
M Documentation/GettingStartedWithTheMRTK.md
M Documentation/HTKToMRTKPortingGuide.md

M Documentation/Input/Controllers.md
M Documentation/Input/Dictation.md
M Documentation/Input/Gaze.md
M Documentation/Input/Gestures.md
M Documentation/Input/HandTracking.md
M Documentation/Input/Overview.md
M Documentation/Input/Speech.md
M Documentation/InputSimulation/InputSimulationService.md
M Documentation/MixedRealityConfigurationGuide.md

M Documentation/Packaging/MRTK_Packages.md
M Documentation/Profiles/Profiles.md

M Documentation/README_AppBar.md
M Documentation/README_Button.md
M Documentation/README_ExampleHub.md
M Documentation/README_HandInteractionExamples.md
M Documentation/README_HandJointChaser.md
M Documentation/README_Interactable.md
M Documentation/README_ManipulationHandler.md
M Documentation/README_ObjectCollection.md
D Documentation/README_Pointers.md
M Documentation/README_Slate.md
M Documentation/README_Sliders.md
M Documentation/README_Solver.md
M Documentation/README_SystemKeyboard.md
M Documentation/README_TextPrefab.md
M Documentation/README_Tooltip.md
M Documentation/ServiceUtilities/MixedRealityServiceRegistryAndIMixedRealityServiceRegistrar.md
M Documentation/SpatialAwareness/SpatialAwarenessGettingStarted.md
M Documentation/TeleportSystem/Overview.md
M Documentation/Tools/DependencyWindow.md
M Documentation/hologram-stabilization.md
M Documentation/toc.yml

Japanese translation unchanged

M Documentation/Architecture/InputSystem/ControllersPointersAndFocus.md
M Documentation/Architecture/InputSystem/CoreSystem.md
M Documentation/Architecture/SpatialAwareness.md
M Documentation/Architecture/SystemsExtensionsProviders.md

M Documentation/Authors.md
M Documentation/Boundary/BoundarySystemGettingStarted.md
M Documentation/Boundary/ConfiguringBoundaryVisualization.md

M Documentation/Contributing/BreakingChanges.md
M Documentation/Contributing/CONTRIBUTING.md
M Documentation/Contributing/CodingGuidelines.md
M Documentation/Contributing/DevDocGuide.md
M Documentation/Contributing/DocumentationGuide.md
M Documentation/Contributing/ExperimentalFeatures.md
M Documentation/Contributing/Feature_Contribution_Process.md
M Documentation/Contributing/PullRequests.md
M Documentation/Contributing/Roadmap.md
M Documentation/Contributing/UnitTests.md

M Documentation/CameraSystem/CameraSystemOverview.md
M Documentation/CameraSystem/CreateSettingsProvider.md
M Documentation/CameraSystem/UnityArCameraSettings.md
M Documentation/CameraSystem/WindowsMixedRealityCameraSettings.md
M Documentation/CrossPlatform/UsingARFoundation.md
M Documentation/Extensions/ExtensionServices.md
A Documentation/Extensions/HandPhysicsService/HandPhysicsServiceOverview.md
M Documentation/Extensions/SceneTransitionService/SceneTransitionServiceOverview.md
M Documentation/EyeTracking/EyeTracking_BasicSetup.md
M Documentation/EyeTracking/EyeTracking_EyeGazeProvider.md
M Documentation/EyeTracking/EyeTracking_EyesAndHands.md
M Documentation/EyeTracking/EyeTracking_IsUserCalibrated.md
M Documentation/EyeTracking/EyeTracking_Main.md
M Documentation/EyeTracking/EyeTracking_Navigation.md
M Documentation/EyeTracking/EyeTracking_TargetSelection.md
A Documentation/GettingStartedWithMRTKAndXRSDK.md

M Documentation/Input/CreateDataProvider.md
M Documentation/Input/HowToAddNearInteractivity.md
M Documentation/Input/InputActions.md
M Documentation/Input/InputEvents.md
M Documentation/Input/InputProviders.md
M Documentation/Input/InputState.md
M Documentation/Input/Pointers.md
M Documentation/InputSimulation/InputAnimationFileFormat.md
M Documentation/InputSimulation/InputAnimationRecording.md
M Documentation/MRTKNuGetPackage.md
M Documentation/MRTK_PackageContents.md
M Documentation/Packaging/MRTK_Modularization.md
M Documentation/Performance/PerfGettingStarted.md
M Documentation/README_BoundingBox.md
M Documentation/README_FingertipVisualization.md
M Documentation/README_LostTrackingService.md
M Documentation/README_MRTKStandardShader.md
M Documentation/README_NearMenu.md

M Documentation/ReleaseNotes.md
M Documentation/Rendering/MaterialInstance.md
M Documentation/SceneSystem/SceneSystemContentLoading.md
M Documentation/SceneSystem/SceneSystemGettingStarted.md
M Documentation/SceneSystem/SceneSystemLightingScenes.md
M Documentation/SceneSystem/SceneSystemLoadProgress.md
M Documentation/SceneSystem/SceneSystemSceneTypes.md

M Documentation/SpatialAwareness/ConfiguringSpatialAwarenessMeshObserver.md
M Documentation/SpatialAwareness/CreateDataProvider.md

M Documentation/SpatialAwareness/SpatialObjectMeshObserver.md
M Documentation/SpatialAwareness/UsageGuide.md

M Documentation/Tools/ControllerMappingTool.md
M Documentation/Tools/ExtensionServiceCreationWizard.md
M Documentation/Tools/HolographicRemoting.md
M Documentation/Tools/OptimizeWindow.md
M Documentation/Tools/ScreenshotUtility.md
M Documentation/Updating.md
M Documentation/VisualThemes.md

[v2.1.0] InputSimulation\InputSimulationService

diff --git a/Documentation/InputSimulation/InputSimulationService.md b/Documentation/InputSimulation/InputSimulationService.md
index 896f747f4..2ac6bb1dd 100644
--- a/Documentation/InputSimulation/InputSimulationService.md
+++ b/Documentation/InputSimulation/InputSimulationService.md
@@ -17,42 +17,47 @@ Input simulation is enabled by default in MRTK.
 
 Input simulation is an optional [Mixed Reality service](../MixedRealityServices.md). It can be added as a data provider in the [Input System profile](../Input/InputProviders.md).
 * __Type__ must be _Microsoft.MixedReality.Toolkit.Input > InputSimulationService_.
-* __Platform(s)__ should always be _Windows Editor_ since the service depends on keyboard and mouse input.
-* __Profile__ has all settings for input simulation.
+* __Platform(s)__ by default includes all _Editor_ platforms, since the service uses keyboard and mouse input.
 
-> [!WARNING]
-> Any type of profile can be assigned to services at the time of this writing. If you assign a different profile to the service, make sure to use a profile of type _Input Simulation_ or it will not work!
+## Input simulation tools window
+
+Enable the input simulation tools window from the _Mixed Reality Toolkit > Utilities > Input Simulation_ menu. This window provides access to the state of input simulation during play mode.
+
+## Viewport Buttons
 
-<a target="_blank" href="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_InputSystemDataProviders.png">
-  <img src="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_InputSystemDataProviders.png" title="Full Hand Mesh" width="80%" class="center" />
-</a>
+A prefab for in-editor buttons to control basic hand placement can be specified in the input simulation profile under __Indicators Prefab__. This is an optional utility; the same features can be accessed in the [input simulation tools window](#input-simulation-tools-window).
+
+> [!NOTE]
+> The viewport indicators are disabled by default, as they currently sometimes interfere with Unity UI interactions, see issue [#6106](https://github.com/microsoft/MixedRealityToolkit-Unity/issues/6106). To enable, add the InputSimulationIndicators prefab to __Indicators Prefab__.
 
-Open the linked profile to access settings for input simulation.
 
-<a target="_blank" href="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_InputSimulationProfile.png">
-  <img src="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_InputSimulationProfile.png" title="Full Hand Mesh" width="80%" class="center" />
-</a>
+Hand icons show the state of the simulated hands:
+* ![Untracked hand icon](../../Documentation/Images/InputSimulation/MRTK_InputSimulation_HandIndicator_Untracked.png "Untracked hand icon") The hand is not tracked. Click to enable the hand.
+* ![Tracked hand icon](../../Documentation/Images/InputSimulation/MRTK_InputSimulation_HandIndicator_Tracked.png "Tracked hand icon") The hand is tracked, but not controlled by the user. Click to hide the hand.
+* ![Controlled hand icon](../../Documentation/Images/InputSimulation/MRTK_InputSimulation_HandIndicator_Controlled.png "Controlled hand icon") The hand is tracked and controlled by the user. Click to hide the hand.
+* ![Reset hand icon](../../Documentation/Images/InputSimulation/MRTK_InputSimulation_HandIndicator_Reset.png "Reset hand icon") Click to reset the hand to its default position.
 
 # Camera Control
 
 Head movement can be emulated by the Input Simulation Service.
 
-<a target="_blank" href="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_CameraControlSettings.png">
-  <img src="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_CameraControlSettings.png" title="Full Hand Mesh" width="80%" class="center" />
-</a>
-
-## Rotating the camera
+### To rotate the camera:
 
 1. Hover over the viewport editor window.
+    _You may need to click the window to give it input focus if button presses don't work._
+1. Press and hold the __Mouse Look Button__ (default: Right mouse button).
+1. Move the mouse in the viewport window to rotate the camera.
+1. Use the scroll wheel to roll the camera around the view direction.
 
-   _You may need to click the window to give it input focus if button presses don't work._
+Camera rotation speed can be configured by changing the __Mouse Look Speed__ setting in the input simulation profile.
 
-2. Press and hold the __Mouse Look Button__ (default: Right mouse button).
-3. Move the mouse in the viewport window to rotate the camera.
+Alternatively use the __Look Horizontal__/__Look Vertical__ axes to rotate the camera (default: game controller right thumbstick).
 
-## Moving the camera
+### To move the camera:
 
-Press and hold the movement keys (W/A/S/D for forward/left/back/right).
+Use the __Move Horizontal__/__Move Vertical__ axes to move the camera (default: WASD keys or game controller left thumbstick).
+
+Camera position and rotation angles can be set explicitly in the tools window as well. The camera can be reset to its default using the __Reset__ button.
 
 <iframe width="560" height="315" src="https://www.youtube.com/embed/Z7L4I1ET7GU" class="center" frameborder="0" allow="accelerometer; encrypted-media; gyroscope; picture-in-picture" allowfullscreen />
 
@@ -60,11 +65,9 @@ Press and hold the movement keys (W/A/S/D for forward/left/back/right).
 
 The input simulation supports emulated hand devices. These virtual hands can interact with any object that supports regular hand devices, such as buttons or grabbable objects.
 
-<a target="_blank" href="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_HandSimulationMode.png">
-  <img src="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_HandSimulationMode.png" title="Full Hand Mesh" width="80%" class="center" />
-</a>
+## Hand Simulation Mode
 
-The __Hand Simulation Mode__ switches between two distinct input models.
+In the [input simulation tools window](#input-simulation-tools-window) the __Hand Simulation Mode__ setting switches between two distinct input models. The default mode can also be set in the input simulation profile.
 
 * _Articulated Hands_: Simulates a fully articulated hand device with joint position data.
 
@@ -80,41 +83,21 @@ The __Hand Simulation Mode__ switches between two distinct input models.
 
 ## Controlling hand movement
 
-<a target="_blank" href="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_HandControlSettings.png">
-  <img src="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_HandControlSettings.png" title="Full Hand Mesh" width="80%" class="center" />
-</a>
-
-Press and hold the _Left/Right Hand Manipulation Key_ (default: Left Shift/Space for left/right respectively) to gain control of either hand. While the manipulation key is pressed, the hand will appear in the viewport. Mouse movement will move the hand in the view plane.
-
-Once the manipulation key is released the hands will disappear after a short _Hand Hide Timeout_. To toggle hands on permanently, press the _Toggle Left/Right Hand Key_ (default: T/Y for left/right respectively). Press the toggle key again to hide the hands again.
+Press and hold the __Left/Right Hand Control Key__ (default: Left Shift/Space for left/right respectively) to gain control of either hand. While the manipulation key is pressed, the hand will appear in the viewport. Once the manipulation key is released the hands will disappear after a short __Hand Hide Timeout__.
 
-<a target="_blank" href="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_HandPlacementSettings.png">
-  <img src="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_HandPlacementSettings.png" title="Full Hand Mesh" width="80%" class="center" />
-</a>
+Hands can be toggled on permanently in the [input simulation tools window](#input-simulation-tools-window) or by pressing the __Toggle Left/Right Hand Key__ (default: T/Y for left/right respectively). Press the toggle key again to hide the hands.
 
-Hands can be moved further or closer to the camera using the _mouse wheel_.
-By default the hand will move somewhat slowly in response to mouse scroll,
-and this can be made faster by changing the *Hand Depth Multiplier* to a
-larger number.
+Mouse movement will move the hand in the view plane. Hands can be moved further or closer to the camera using the __mouse wheel__.
 
-The initial distance from the camera that the hand appears at is controlled by
-*Default Hand Distance.*
+To rotate hands using the mouse, hold both the __Left/Right Hand Control Key__ (shift/space) _and_ the __Hand Rotate Button__ (default: right mouse button). Hand rotation speed can be configured by changing the __Mouse Hand Rotation Speed__ setting in the input simulation profile.
 
-By default, the simulated hand joints will be perfectly still. Note that on devices there
-will always be some amount of jitter/noise due to the underlying hand tracking.
-You can see this on the device when you have hand mesh or joints enabled (and
-see how it has slightly jitter even if you have your hand perfectly still). It's possible
-to simulate jitter by changing *Hand Jitter Amount* to a positive value (for example, 0.1
-as is shown in the image above).
+All hand placement can also be changed in the [input simulation tools window](#input-simulation-tools-window), including resetting hands to default.
 
-<a target="_blank" href="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_HandRotationSettings.png">
-  <img src="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_HandRotationSettings.png" title="Full Hand Mesh" width="80%" class="center" />
-</a>
+## Additional profile settings
 
-Hands can be rotated when precise direction is required.
-* Yaw rotates around the Y axis (default: E/Q keys for clockwise/counter-clockwise rotation)
-* Pitch rotates around the X axis (default: F/R keys for clockwise/counter-clockwise rotation)
-* Roll rotates around the Z axis (default: X/Z keys for clockwise/counter-clockwise rotation)
+* __Hand Depth Multiplier__ controls the sensitivity of the mouse scroll wheel depth movement. A larger number will speed up hand zoom.
+* __Default Hand Distance__ is the initial distance of hands from the camera. Clicking the __Reset__ button will also place hands at this distance.
+* __Hand Jitter Amount__ adds random motion to the hands. This can be used to simulate inaccurate hand tracking on device and to verify that interactions work well with noisy input.
 
 <iframe width="560" height="315" src="https://www.youtube.com/embed/uRYfwuqsjBQ" class="center" frameborder="0" allow="accelerometer; encrypted-media; gyroscope; picture-in-picture" allowfullscreen />
 
@@ -122,10 +105,6 @@ Hands can be rotated when precise direction is required.
 
 Hand gestures such as pinching, grabbing, poking, etc. can also be simulated.
 
-<a target="_blank" href="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_HandGestureSettings.png">
-  <img src="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_HandGestureSettings.png" title="Full Hand Mesh" width="80%" class="center" />
-</a>
-
 1. First enable hand control using the manipulation keys (Left Shift/Space)
 
    Alternatively toggle the hands on/off using the toggle keys (T/Y).
@@ -152,27 +131,22 @@ Each of the mouse buttons can be mapped to transform the hand shape into a diffe
 For manipulating objects with two hands at the same time the persistent hand mode is recommended.
 
 1. Toggle on both hands by pressing the toggle keys (T/Y).
-2. Manipulate one hand at a time:
+1. Manipulate one hand at a time:
   1. Hold _Space_ to control the right hand
-  2. Move the hand to where you want to grab the object
-  3. Press mouse button to activate the _Pinch_ gesture. In persistent mode the gesture will remain active when you release the mouse button.
-3. Repeat the process with the other hand, grabbing the same object in a second spot.
-4. Now that both hands are grabbing the same object, you can move either of them to perform two-handed manipulation.
+  1. Move the hand to where you want to grab the object
+  1. Press the mouse button to activate the _Pinch_ gesture. In persistent mode the gesture will remain active when you release the mouse button.
+1. Repeat the process with the other hand, grabbing the same object in a second spot.
+1. Now that both hands are grabbing the same object, you can move either of them to perform two-handed manipulation.
 
 <iframe width="560" height="315" src="https://www.youtube.com/embed/Qol5OFNfN14" class="center" frameborder="0" allow="accelerometer; encrypted-media; gyroscope; picture-in-picture" allowfullscreen />
 
 ## GGV Interaction
 
 1. Enable GGV simulation by switching __Hand Simulation Mode__ to _Gestures_ in the [Input Simulation Profile](#enabling-the-input-simulation-service)
-
-<a target="_blank" href="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_SwitchToGGV.png">
-  <img src="../../Documentation/Images/InputSimulation/MRTK_InputSimulation_SwitchToGGV.png" title="Full Hand Mesh" width="80%" class="center" />
-</a>
-
-2. Rotate the camera to point the gaze cursor at the interactable object (right mouse button)
-3. Hold _Space_ to control the right hand
-4. Click and hold _left mouse button_ to interact
-5. Rotate the camera again to manipulate the object
+1. Rotate the camera to point the gaze cursor at the interactable object (right mouse button)
+1. Hold _Space_ to control the right hand
+1. Click and hold _left mouse button_ to interact
+1. Rotate the camera again to manipulate the object
 
 <iframe width="560" height="315" src="https://www.youtube.com/embed/6841rRMdqWw" class="center" frameborder="0" allow="accelerometer; encrypted-media; gyroscope; picture-in-picture" allowfullscreen />
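As a rough mental model for the __Hand Jitter Amount__ setting described above (a conceptual sketch only, not MRTK's actual implementation), jitter can be thought of as bounded random noise added to each simulated joint position every frame:

```python
import random

def apply_jitter(position, jitter_amount, rng=random):
    """Add bounded random noise to a simulated 3D joint position.

    A jitter_amount of 0 leaves the position unchanged; larger values
    mimic noisier hand tracking. (Conceptual sketch, not MRTK code.)
    """
    return tuple(p + rng.uniform(-jitter_amount, jitter_amount) for p in position)

# With jitter_amount = 0.1, each coordinate moves by at most 0.1,
# which is why small positive values are enough to test interactions
# against tracking noise.
noisy = apply_jitter((0.0, 1.5, 0.4), 0.1)
```

A value such as 0.1 is enough to see whether buttons and manipulation handlers still behave sensibly when joint positions wobble slightly.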

Document the translation workflow in the Wiki

MRTK Translation

Workflow

  • Files targeted for translation
  • Branches for translation work
    • Base branch: feature/mrtk_documentation_jp
  • Cut a working branch named doc_ja/<target file>.md from the branch above and do the translation work there
  • When the translation is done, send a pull request to feature/mrtk_documentation_jp
  • If you want someone to review your work in progress, send a Pull Request with [WIP] in the title
    • Anything with [WIP] in the title must not be merged
    • Remove [WIP] from the title when the work is complete

Translation guidelines

  • Use the polite です/ます style
  • Pressable button → 押しボタン (Pressable button)
    (with a half-width space between 押しボタン and the opening parenthesis)
  • Put a half-width space between English and Japanese text
  • How to write in-page anchor links
    • English → lowercase
    • Spaces → -
Example
[上記の手順](#mrtk-のパッケージを-unity-プロジェクトにインポートする)
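The anchor rule above (English lowercased, spaces replaced with hyphens) can be sketched as a tiny helper. This is only a sketch of the guideline; real DocFx/GitHub slug generation also strips punctuation, which this deliberately ignores:

```python
def heading_to_anchor(heading: str) -> str:
    """Build an in-page anchor from a heading: lowercase the English
    (ASCII) letters and replace spaces with hyphens. Japanese text
    passes through unchanged. (Sketch of the wiki guideline only.)"""
    return "#" + heading.lower().replace(" ", "-")

# "MRTK のパッケージを Unity プロジェクトにインポートする"
#   -> "#mrtk-のパッケージを-unity-プロジェクトにインポートする"
```

This reproduces the example link target shown above for the heading "MRTK のパッケージを Unity プロジェクトにインポートする".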

Glossary

  • Configuration: 設定
  • Setup: セットアップ
  • ...

DocFx

Installing DocFx

> docfx docfx.json --serve

Running this lets you view the documentation pages locally at http://localhost:8080

Other

  • If you find something wrong in the original English documentation, send a Pull Request to the upstream repository
  • If unsure, ask in the internal Slack channel #技術_mrtk
