
arfoundation-samples's Introduction

AR Foundation Samples

Example AR scenes that use AR Foundation 6.0 and demonstrate its features. Each feature is used in a minimal sample scene with example code that you can modify or copy into your project.

This sample project depends on four Unity packages:

  • AR Foundation
  • Google ARCore XR Plug-in (Android)
  • Apple ARKit XR Plug-in (iOS)
  • Apple visionOS XR Plug-in (visionOS)

Which version should I use?

The main branch of this repository uses AR Foundation 6.0 and is compatible with Unity 6 (6000.0) and newer. To access sample scenes for previous versions of AR Foundation, refer to the table below for links to other branches.

Unity Version AR Foundation Version
Unity 6 (6000.0) 6.0 (main)
2023.2 5.1
2022.3 5.1
2021.3 4.2

How to use these samples

Build and run on device

You can build the AR Foundation Samples project directly to device, which can be a helpful introduction to using AR Foundation features for the first time.

To build to device, follow the steps below:

  1. Install Unity 6 (6000.0) or later and clone this repository.

  2. Open the Unity project at the root of this repository.

  3. As with any other Unity project, go to Build Settings, select your target platform, and build this project.

Understand the sample code

All sample scenes in this project can be found in the Assets/Scenes folder. To learn more about the AR Foundation components used in each scene, see the AR Foundation Documentation. Each scene is explained in more detail below.

Table of Contents

Sample scene(s) Description
Simple AR Demonstrates basic Plane detection and Raycasting
Camera Scenes that demonstrate Camera features
Plane detection Scenes that demonstrate Plane detection
Image tracking Scenes that demonstrate Image tracking
Object tracking Demonstrates Object tracking
Face tracking Scenes that demonstrate Face tracking
Body tracking Scenes that demonstrate Body tracking
Point clouds Demonstrates Point clouds
Anchors Demonstrates Anchors
Meshing Scenes that demonstrate Meshing
Environment Probes Demonstrates Environment Probes
Occlusion Scenes that demonstrate Occlusion
Check support Demonstrates checking for AR support on device
Configuration Chooser Demonstrates AR Foundation's Configuration Chooser
Debug Menu Visualize trackables and configurations on device
ARKit ARKit-specific sample scenes
ARCore session recording Demonstrates the session recording and playback functionality available in ARCore

Simple AR

This is a good starting sample that enables point cloud visualization and plane detection. There are buttons on screen that let you pause, resume, reset, and reload the ARSession.

When a plane is detected, you can tap on the detected plane to place a cube on it. This uses the ARRaycastManager to perform a raycast against the plane. If the plane is in TrackingState.Limited, it will highlight red. In the case of ARCore, this means that raycasting will not be available until the plane is in TrackingState.Tracking again.

Action Meaning
Pause Pauses the ARSession, meaning device tracking and trackable detection (e.g., plane detection) is temporarily paused. While paused, the ARSession does not consume CPU resources.
Resume Resumes a paused ARSession. The device will attempt to relocalize and previously detected objects may shift around as tracking is reestablished.
Reset Clears all detected trackables and effectively begins a new ARSession.
Reload Completely destroys the ARSession GameObject and re-instantiates it. This simulates the behavior you might experience during scene switching.
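
The tap-to-place behavior described above boils down to a screen-point raycast against detected planes. Below is a minimal sketch of that pattern using legacy touch input; the class and field names are illustrative, not the sample's exact code.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Attach to a GameObject in the scene and assign the references in the Inspector.
public class TapToPlaceCube : MonoBehaviour
{
    [SerializeField] ARRaycastManager m_RaycastManager;
    [SerializeField] GameObject m_CubePrefab;

    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();
    GameObject m_SpawnedCube;

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        var touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;

        // Raycast only against detected planes, within their boundary polygon.
        if (m_RaycastManager.Raycast(touch.position, s_Hits, TrackableType.PlaneWithinPolygon))
        {
            // Hits are sorted by distance; use the closest one.
            var hitPose = s_Hits[0].pose;
            if (m_SpawnedCube == null)
                m_SpawnedCube = Instantiate(m_CubePrefab, hitPose.position, hitPose.rotation);
            else
                m_SpawnedCube.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
        }
    }
}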

Camera

CPU Images

This sample shows how to acquire and manipulate textures obtained from AR Foundation on the CPU. Most textures in AR Foundation (e.g., the pass-through video supplied by the ARCameraManager, and the human depth and human stencil buffers provided by the AROcclusionManager) are GPU textures. Computer vision or other CPU-based applications often require the pixel buffers on the CPU, which would normally involve an expensive GPU readback. AR Foundation provides an API for obtaining these textures on the CPU for further processing, without incurring the costly GPU readback.

The relevant script is CpuImageSample.cs.
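
A condensed sketch of the acquire-and-convert flow that script relies on is shown below; error handling and texture reallocation on camera configuration changes are omitted.

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class CpuImageReader : MonoBehaviour
{
    [SerializeField] ARCameraManager m_CameraManager;
    Texture2D m_Texture;

    void Update()
    {
        // Acquire the latest camera image on the CPU; dispose it when done.
        if (!m_CameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        using (image)
        {
            var conversionParams = new XRCpuImage.ConversionParams
            {
                inputRect = new RectInt(0, 0, image.width, image.height),
                outputDimensions = new Vector2Int(image.width, image.height),
                outputFormat = TextureFormat.RGBA32,
                transformation = XRCpuImage.Transformation.MirrorY
            };

            if (m_Texture == null)
                m_Texture = new Texture2D(image.width, image.height, TextureFormat.RGBA32, false);

            // Convert directly into the texture's backing buffer, then upload it to the GPU.
            var rawTextureData = m_Texture.GetRawTextureData<byte>();
            image.Convert(conversionParams, rawTextureData);
            m_Texture.Apply();
        }
    }
}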

The resolution of the camera image is affected by the camera's configuration. The current configuration is indicated at the bottom left of the screen inside a dropdown box which lets you select one of the supported camera configurations. The CameraConfigController.cs demonstrates enumerating and selecting a camera configuration. It is on the CameraConfigs GameObject.

Where available (currently iOS 13+ only), the human depth and human stencil textures are also available on the CPU. These appear inside two additional boxes underneath the camera's image.

Basic Light Estimation

Demonstrates basic light estimation information from the camera frame. You should see values for "Ambient Intensity" and "Ambient Color" on screen. The relevant script is BasicLightEstimation.cs.
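
Under the hood, these values come from the camera manager's frameReceived event. A hedged sketch of reading them (the actual BasicLightEstimation.cs may differ in detail):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class LightEstimationLogger : MonoBehaviour
{
    [SerializeField] ARCameraManager m_CameraManager;

    void OnEnable()  => m_CameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => m_CameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        var light = args.lightEstimation;

        // Each value is nullable because support varies by platform and session configuration.
        if (light.averageBrightness.HasValue)
            Debug.Log($"Ambient intensity: {light.averageBrightness.Value}");

        if (light.colorCorrection.HasValue)
            Debug.Log($"Ambient color: {light.colorCorrection.Value}");
    }
}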

HDR Light Estimation

This sample attempts to read HDR lighting information. You should see values for "Ambient Intensity", "Ambient Color", "Main Light Direction", "Main Light Intensity Lumens", "Main Light Color", and "Spherical Harmonics". Most devices only support a subset of these 6, so some will be listed as "Unavailable." The relevant script is HDRLightEstimation.cs.

On iOS, this is only available when face tracking is enabled and requires a device that supports face tracking (such as an iPhone X, XS or 11). When available, a virtual arrow appears in front of the camera which indicates the estimated main light direction. The virtual light direction is also updated, so that virtual content appears to be lit from the direction of the real light source.

When using HDRLightEstimation, the sample will automatically pick the supported camera facing direction for you, for example World on Android and User on iOS, so it does not matter which facing direction you select in the ARCameraManager component.

Background Rendering Order

Produces a visual example of how changing the background rendering mode between BeforeOpaqueGeometry and AfterOpaqueGeometry affects a rudimentary AR application. It leverages occlusion, where available, to demonstrate AfterOpaqueGeometry support for AR occlusion.

Camera Grain (ARKit)

This sample demonstrates the camera grain effect. Once a plane is detected, you can place a cube on it with a material that simulates the camera grain noise in the camera feed. See the CameraGrain.cs script. Also see CameraGrain.shader, which animates and applies the camera grain texture (through linear interpolation) in screen space.

This sample requires a device running iOS 13 or later and Unity 2020.2 or later.

EXIF Data

This sample demonstrates how to access the camera frame's EXIF metadata. You should see values for all the supported EXIF tags on screen. Refer to ExifDataLogger.cs for more details.

This sample requires iOS 16 or newer.

Image Stabilization (ARCore)

This sample shows how to toggle the Image Stabilization feature on and off, and requires an ARCore-supported device with Google Play Services for AR version 1.37 or newer.

Plane Detection

Toggle Plane Detection

This sample shows how to toggle plane detection on and off. When off, it will also hide all previously detected planes by disabling their GameObjects. See PlaneDetectionController.cs.
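
The toggle itself comes down to enabling or disabling the ARPlaneManager and hiding the plane GameObjects it has already created; a simplified sketch of that idea:

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaneDetectionToggle : MonoBehaviour
{
    [SerializeField] ARPlaneManager m_PlaneManager;

    // Hook this up to a UI Button's onClick event.
    public void TogglePlaneDetection()
    {
        // Disabling the manager stops plane detection but does not hide existing planes.
        m_PlaneManager.enabled = !m_PlaneManager.enabled;

        // Hide or show all previously detected planes to match the manager's state.
        foreach (var plane in m_PlaneManager.trackables)
            plane.gameObject.SetActive(m_PlaneManager.enabled);
    }
}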

Plane Masking

This sample demonstrates basic plane detection, but uses an occlusion shader for the plane's material. This makes the plane appear invisible, but virtual objects behind the plane are culled. This provides an additional level of realism when, for example, placing objects on a table.

Move the device around until a plane is detected (its edges are still drawn) and then tap on the plane to place/move content.

Image Tracking

There are two samples demonstrating image tracking. The image tracking samples are supported on ARCore and ARKit. To enable image tracking, you must first create an XRReferenceImageLibrary. This is the set of images to look for in the environment. Click here for instructions on creating one.

You can also add images to the reference image library at runtime. This sample includes a button that adds the images one.png and two.png to the reference image library. See the script DynamicLibrary.cs for example code.
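
Adding images at runtime hinges on the reference library being a MutableRuntimeReferenceImageLibrary, which not every provider supports. A hedged sketch of the underlying API (the texture reference, image name, and physical width are placeholders, not the sample's exact setup):

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class RuntimeImageAdder : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager m_TrackedImageManager;
    [SerializeField] Texture2D m_ImageToAdd;   // e.g. one.png, imported with Read/Write enabled

    public void AddImage()
    {
        // Only some providers support mutable libraries; check before casting.
        if (m_TrackedImageManager.referenceLibrary is MutableRuntimeReferenceImageLibrary mutableLibrary)
        {
            // The physical width (in meters) helps the provider estimate pose sooner; pass null if unknown.
            mutableLibrary.ScheduleAddImageWithValidationJob(m_ImageToAdd, "one", 0.1f);
        }
    }
}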

Run the sample on an ARCore or ARKit-capable device and point your device at one of the images in Assets/Scenes/ImageTracking/Images. They can be displayed on a computer monitor; they do not need to be printed out.

Basic Image Tracking

At runtime, AR Foundation will generate an ARTrackedImage for each detected reference image. This sample uses the TrackedImageInfoManager.cs script to overlay the original image on top of the detected image, along with some metadata.
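
The overlay pattern relies on reacting to tracked-image lifecycle events; a minimal sketch using the trackedImagesChanged event (newer AR Foundation versions also expose a generic trackablesChanged event, but the idea is the same):

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class TrackedImageLogger : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager m_TrackedImageManager;

    void OnEnable()  => m_TrackedImageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => m_TrackedImageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var trackedImage in args.added)
            Debug.Log($"Detected {trackedImage.referenceImage.name} " +
                      $"({trackedImage.size.x:F2} x {trackedImage.size.y:F2} m)");

        foreach (var trackedImage in args.updated)
        {
            // The ARTrackedImage's transform follows the physical image, so content
            // parented to it (like an overlay quad) needs no manual repositioning here.
            if (trackedImage.trackingState != TrackingState.Tracking)
                Debug.Log($"{trackedImage.referenceImage.name} is no longer actively tracked");
        }
    }
}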

Image Tracking With Multiple Prefabs

With the PrefabImagePairManager.cs script, you can assign a different prefab to each image in the reference image library.

You can also change prefabs at runtime. This sample includes a button that switches between the original and an alternative prefab for the first image in the reference image library. See the script DynamicPrefab.cs for example code.

Object Tracking

Similar to the image tracking sample, this sample detects a 3D object from a set of reference objects in an XRReferenceObjectLibrary. Click here for instructions on creating one.

To use this sample, you must have a physical object the device can recognize. The sample's reference object library is built using two reference objects. The sample includes templates that can be printed on 8.5x11 inch paper and folded into a cube and a cylinder.

Alternatively, you can scan your own objects and add them to the reference object library.

This sample requires iOS 12 or above.

Face Tracking

There are several samples showing different face tracking features. Some are ARCore specific and some are ARKit specific.

Face Pose

This is the simplest face tracking sample and simply draws an axis at the detected face's pose.

This sample uses the front-facing (i.e., selfie) camera.

Face Mesh

This sample instantiates and updates a mesh representing the detected face. Information about the device support (e.g., number of faces that can be simultaneously tracked) is displayed on the screen.

This sample uses the front-facing (i.e., selfie) camera.

Face Regions (ARCore)

"Face regions" are an ARCore-specific feature which provides pose information for specific "regions" on the detected face, e.g., left eyebrow. In this example, axes are drawn at each face region. See the ARCoreFaceRegionManager.cs.

This sample uses the front-facing (i.e., selfie) camera.

Blend Shapes (ARKit)

"Blend shapes" are an ARKit-specific feature which provides information about various facial features on a scale of 0..1. For instance, "wink" and "frown". In this sample, blend shapes are used to puppet a cartoon face which is displayed over the detected face. See the ARKitBlendShapeVisualizer.cs.

This sample uses the front-facing (i.e., selfie) camera.
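
Reading the coefficients goes through the ARKit-specific face subsystem. A hedged sketch, assuming an iOS build with ARKit face tracking enabled:

#if UNITY_IOS
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

public class BlendShapeLogger : MonoBehaviour
{
    [SerializeField] ARFaceManager m_FaceManager;

    void Update()
    {
        // Blend shape coefficients are only exposed by the ARKit face subsystem.
        if (m_FaceManager.subsystem is not ARKitFaceSubsystem arkitFaceSubsystem)
            return;

        foreach (var face in m_FaceManager.trackables)
        {
            using var coefficients =
                arkitFaceSubsystem.GetBlendShapeCoefficients(face.trackableId, Allocator.Temp);

            // Each coefficient is in [0, 1], e.g. ARKitBlendShapeLocation.EyeBlinkLeft.
            foreach (var c in coefficients)
                Debug.Log($"{c.blendShapeLocation}: {c.coefficient:F2}");
        }
    }
}
#endif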

Eye Lasers, Eye Poses, and Fixation Point (ARKit)

These samples demonstrate eye and fixation point tracking. Eye tracking produces a pose (position and rotation) for each eye in the detected face, and the "fixation point" is the point the face is looking at (i.e., fixated upon). EyeLasers uses the eye pose to draw laser beams emitted from the detected face.

This sample uses the front-facing (i.e., selfie) camera and requires an iOS device with a TrueDepth camera.

Rear Camera (ARKit)

iOS 13 adds support for face tracking while the world-facing (i.e., rear) camera is active. This means the user-facing (i.e., front) camera is used for face tracking, but the pass-through video uses the world-facing camera. To enable this mode in ARFoundation, you must enable an ARFaceManager, set the ARSession tracking mode to "Position and Rotation" or "Don't Care", and set the ARCameraManager's facing direction to "World". Tap the screen to toggle between the user-facing and world-facing cameras.

The sample code in DisplayFaceInfo.OnEnable shows how to detect support for these face tracking features.

When using the world-facing camera, a cube is displayed in front of the camera whose orientation is driven by the face in front of the user-facing camera.

This feature requires a device with a TrueDepth camera and an A12 bionic chip running iOS 13.

Body Tracking

Body Tracking 2D

This sample demonstrates 2D screen space body tracking. A 2D skeleton is generated when a person is detected. See the ScreenSpaceJointVisualizer.cs script.

This sample requires a device with an A12 bionic chip running iOS 13 or above.

Body Tracking 3D

This sample demonstrates 3D world space body tracking. A 3D skeleton is generated when a person is detected. See the HumanBodyTracker.cs script.

This sample requires a device with an A12 bionic chip running iOS 13 or above.

Point Clouds

This sample shows all feature points over time, not just the current frame's feature points as the "AR Default Point Cloud" prefab does. It does this by using a slightly modified version of the ARPointCloudParticleVisualizer component that stores all the feature points in a Dictionary. Since each feature point has a unique identifier, it can look up the stored point and update its position in the dictionary if it already exists. This can be a useful starting point for custom solutions that require the entire map of point cloud points, e.g., for custom mesh reconstruction techniques.
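
The accumulation technique keys each point by its identifier and overwrites the stored position whenever the point is seen again; a minimal sketch of the idea (the sample's visualizer additionally drives a particle system):

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PointCloudAccumulator : MonoBehaviour
{
    [SerializeField] ARPointCloud m_PointCloud;

    // Every point ever observed, keyed by its stable identifier.
    readonly Dictionary<ulong, Vector3> m_Points = new Dictionary<ulong, Vector3>();

    void OnEnable()  => m_PointCloud.updated += OnPointCloudUpdated;
    void OnDisable() => m_PointCloud.updated -= OnPointCloudUpdated;

    void OnPointCloudUpdated(ARPointCloudUpdatedEventArgs args)
    {
        // Both collections are nullable, and identifiers are index-aligned with positions.
        if (!m_PointCloud.positions.HasValue || !m_PointCloud.identifiers.HasValue)
            return;

        var positions = m_PointCloud.positions.Value;
        var identifiers = m_PointCloud.identifiers.Value;

        for (int i = 0; i < positions.Length; i++)
            m_Points[identifiers[i]] = positions[i];   // add a new point or update an existing one
    }
}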

This sample has two UI components:

  • A button in the lower left which allows you to switch between visualizing "All" the points and just those in the "Current Frame".
  • Text in the upper right which displays the number of points in each point cloud (ARCore & ARKit will only ever have one).

Anchors

This sample shows how to create anchors as the result of a raycast hit. The "Clear Anchors" button removes all created anchors. See the AnchorCreator.cs script.

This script can create two kinds of anchors, as the sketch after this list shows:

  1. If a feature point is hit, it creates a normal anchor at the hit pose using the GameObject.AddComponent<ARAnchor>() method.
  2. If a plane is hit, it creates an anchor "attached" to the plane using the AttachAnchor method.
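
A hedged sketch of both paths, condensed from that approach (AnchorCreator.cs also tracks the anchors it creates so the "Clear Anchors" button can remove them):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class AnchorFromHit : MonoBehaviour
{
    [SerializeField] ARAnchorManager m_AnchorManager;

    // Call with the closest ARRaycastHit returned by ARRaycastManager.Raycast.
    public ARAnchor CreateAnchor(ARRaycastHit hit)
    {
        // If the hit trackable is a plane, create an anchor attached to that plane.
        if (hit.trackable is ARPlane plane)
            return m_AnchorManager.AttachAnchor(plane, hit.pose);

        // Otherwise (e.g. a feature point), create a free-standing anchor by adding
        // an ARAnchor component to a GameObject placed at the hit pose.
        var anchorObject = new GameObject("Anchor");
        anchorObject.transform.SetPositionAndRotation(hit.pose.position, hit.pose.rotation);
        return anchorObject.AddComponent<ARAnchor>();
    }
}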

Meshing

These meshing scenes use features of some devices to construct meshes from scanned data of real world surfaces. These meshing scenes will not work on all devices.

For ARKit, this functionality requires at least iPadOS 13.4 running on a device with a LiDAR scanner.

Classification Meshes

This scene demonstrates mesh classification functionality. With mesh classification enabled, each triangle in the mesh surface is identified as one of several surface types. This sample scene creates submeshes for each classification type and renders each mesh type with a different color.

This scene only works on ARKit.

Normal Meshes

This scene renders an overlay on top of the real world scanned geometry illustrating the normal of the surface.

Occlusion Meshes

At first, this scene may appear to be doing nothing. However, it is rendering a depth texture on top of the scene based on the real-world geometry, which allows the real world to occlude virtual content. The scene has a script on it that fires a red ball into the scene when you tap. To see the occlusion working, fire a few red balls into a space, then move the device so that some other real-world object comes between the camera and the balls; the virtual red balls will be occluded by the real-world object.

Environment Probes

This sample demonstrates environment probes, a feature which attempts to generate a 3D texture from the real environment and applies it to reflection probes in the scene. The scene includes several spheres which start out completely black, but will change to shiny spheres which reflect the real environment when possible.

Occlusion

SimpleOcclusion

This sample demonstrates occlusion of virtual content by real world content through the use of environment depth images on supported Android and iOS devices.

Depth Images

This sample displays the raw depth textures produced by several different methods:

  • Environment depth (certain Android devices and Apple devices with the LiDAR sensor)
  • Human stencil (Apple devices with an A12 bionic chip (or later) running iOS 13 or later)
  • Human depth (Apple devices with an A12 bionic chip (or later) running iOS 13 or later)

Check Support

Demonstrates checking for AR support and logs the results to the screen. The relevant script is SupportChecker.cs.
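
The check itself is a small coroutine built around ARSession.CheckAvailability; a minimal sketch (assumes the ARSession component starts disabled in the scene):

using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ArSupportCheck : MonoBehaviour
{
    [SerializeField] ARSession m_Session;

    IEnumerator Start()
    {
        // Asynchronously determine whether the device supports AR at all.
        yield return ARSession.CheckAvailability();

        if (ARSession.state == ARSessionState.NeedsInstall)
            yield return ARSession.Install();   // e.g. install or update Google Play Services for AR

        if (ARSession.state == ARSessionState.Unsupported)
            Debug.Log("AR is not supported on this device.");
        else
            m_Session.enabled = true;   // Ready (or better): start the session
    }
}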

Configuration Chooser

Demonstrates how to use the AR Foundation session's ConfigurationChooser to swap between rear and front-facing camera configurations.

Debug Menu

The AR Foundation Debug Menu allows you to visualize trackables and configurations on device.

ARKit

These samples are only available on iOS devices.

Coaching Overlay

The coaching overlay is an ARKit-specific feature which will overlay a helpful UI guiding the user to perform certain actions to achieve some "goal", such as finding a horizontal plane.

The coaching overlay can be activated automatically or manually, and you can set its goal. In this sample, we've set the goal to be "Any plane", and for it to activate automatically. This will display a special UI on the screen until a plane is found. There is also a button to activate it manually.

The sample includes a MonoBehaviour to define the settings of the coaching overlay. See ARKitCoachingOverlay.cs.

This sample also shows how to subscribe to ARKit session callbacks. See CustomSessionDelegate.

This sample requires iOS 13 or above.

Thermal State

This sample contains the code required to query an iOS device's thermal state from C# game code, and illustrates how the thermal state can be used to disable AR Foundation features and reduce the device's thermal load.

AR World Map

An ARWorldMap is an ARKit-specific feature which lets you save a scanned area. ARKit can optionally relocalize to a saved world map at a later time. This can be used to synchronize multiple devices to a common space, or for curated experiences specific to a location, such as a museum exhibition or other special installation. Read more about world maps here. A world map will store most types of trackables, such as reference points and planes.

The ARWorldMapController.cs performs most of the logic in this sample.

This sample requires iOS 12 or above.

Geo Anchors

ARKit's ARGeoAnchors are not yet supported by ARFoundation, but you can still access this feature with a bit of Objective-C. This sample uses a custom ConfigurationChooser to instruct the Apple ARKit XR Plug-in to use an ARGeoTrackingConfiguration.

This sample also shows how to interpret the nativePtr provided by the XRSessionSubsystem as an ARKit ARSession pointer.

This sample requires an iOS device running iOS 14.0 or later, an A12 chip or later, location services enabled, and cellular capability.

AR Collaboration Data

Similar to an ARWorldMap, a "collaborative session" is an ARKit-specific feature which allows multiple devices to share session information in real time. Each device will periodically produce ARCollaborationData which should be sent to all other devices in the collaborative session. ARKit will share each participant's pose and all reference points. Other types of trackables, such as detected planes, are not shared.

See CollaborativeSession.cs. Note there are two types of collaboration data: "Critical" and "Optional". "Critical" data is available periodically and should be sent to all other devices reliably. "Optional" data is available nearly every frame and may be sent unreliably. Data marked as "optional" includes data about the device's location, which is why it is produced very frequently (i.e., every frame).

Note that ARKit's support for collaborative sessions does not include any networking; it is up to the developer to manage the connection and send data to other participants in the collaborative session. For this sample, we used Apple's MultipeerConnectivity Framework. Our implementation can be found here.

You can create reference points by tapping on the screen. Reference points are created when the tap results in a raycast which hits a point in the point cloud.

This sample requires iOS 13 or above.

High Resolution CPU Image

This sample demonstrates high resolution CPU image capture on iOS 16 and newer. See the High Resolution CPU Image package documentation to learn more about this feature.

Camera Exposure

This sample shows how to lock the device camera and set the camera exposure mode, duration, and ISO. See CameraExposureController.cs for example code.

This sample requires iOS 16 or newer and a device with an ultra-wide camera.

Camera White Balance

This sample shows how to lock the device camera and set the camera white balance mode and gains. See CameraWhiteBalanceController.cs for example code.

This sample requires iOS 16 or newer and a device with an ultra-wide camera.

Camera Focus

This sample shows how to lock the device camera and set the camera focus mode and lens position. See CameraFocusController.cs for example code.

This sample requires iOS 16 or newer and a device with an ultra-wide camera.

ARCore Session Recording

This sample demonstrates the session recording and playback functionality available in ARCore. This feature allows you to record the sensor and camera telemetry during a live session, and then replay it at a later time. When replayed, ARCore runs on the target device using the recorded telemetry rather than live data. See ARCoreSessionRecorder.cs for example code.

Additional demos

While no longer actively maintained, Unity has a separate AR Foundation Demos repository that contains some larger samples including localization, mesh placement, shadows, and user onboarding UX.

Community and feedback

Refer to the sections below to understand how to provide different kinds of feedback to Unity.

AR Foundation bug reports

To report a bug in AR Foundation, please file a bug. You may also submit a GitHub issue, but we will close your GitHub issue if it does not contain an official bug ID number. The best way to ensure that your issue is addressed is to file a bug using Unity's official bug reporting process.

AR Foundation feature requests

To request a new feature in AR Foundation or related packages, use Unity's XR Roadmap. Click on the AR Foundation tab, then scroll down to Submit a New Idea.

Contributions to this repository

We are not accepting pull requests at this time. If you find an issue with the samples or would like to request a new sample, please submit a GitHub issue.


arfoundation-samples's Issues

Building Android with Gradle Export project causes manifest error and camera failure

Using Unity 2018.1.7f1

If we build for Android using Gradle project export, the build will throw an error that it can't find the manifest path, like this:

DirectoryNotFoundException: Could not find a part of the path "D:\Projects\PROJECTNAME\client\UnityProject\Builds\GeneratedApkTemp\src\main\AndroidManifest.xml".
System.IO.FileStream..ctor (System.String path, FileMode mode, FileAccess access, FileShare share, Int32 bufferSize, Boolean anonymous, FileOptions options) (at /Users/builduser/buildslave/mono/build/mcs/class/corlib/System.IO/FileStream.cs:292)
System.IO.FileStream..ctor (System.String path, FileMode mode, FileAccess access, FileShare share)
(wrapper remoting-invoke-with-check) System.IO.FileStream:.ctor (string,System.IO.FileMode,System.IO.FileAccess,System.IO.FileShare)
System.Xml.XmlUrlResolver.GetEntity (System.Uri absoluteUri, System.String role, System.Type ofObjectToReturn)
Mono.Xml2.XmlTextReader.GetStreamFromUrl (System.String url, System.String& absoluteUriString)
Mono.Xml2.XmlTextReader..ctor (System.String url, System.Xml.XmlNameTable nt)
System.Xml.XmlTextReader..ctor (System.String url, System.Xml.XmlNameTable nt)
System.Xml.XmlDocument.Load (System.String filename)
UnityEditor.XR.ARCore.ARCoreManifest.OnPostGenerateGradleAndroidProject (System.String path) (at C:/Users/jlander/AppData/Local/Unity/cache/packages/packages.unity.com/[email protected]/Editor/ARCoreBuildProcessor.cs:193)

The problem is the actual path in the built project has the project name in there
"client\UnityProject\Builds\GeneratedApkTemp\PROJECTNAME\src\main\AndroidManifest.xml"

Manually adding the project name into line 202 of ARCoreBuildProcessor fixes the issue:
string manifestPath = path + "/PROJECTNAME" + k_AndroidManifestPath;

I am not sure how to do this correctly.

Linear Color Space for Camera Background

I just started using the foundation framework to see how it works out.

During testing I wondered how to use a proper linear camera background shader instead of the default one. It seems like the background image is too bright when the player settings are set to linear color space.

I tried to plug the ARKit camera shader into a material and use it as the override, but this gives me a black background.

Is there a sample shader available that will work?

Black Screen and no camera permission

Hi, I tried AR Foundation a month ago and it worked only once.
Since then I have tried several times with different versions of Unity, but the app does not ask me for camera permission and does not show the camera background.
There are no errors in the debug log.

I tried to allow the permission in the app settings, but nothing changed. Any idea why this happens?

Black screen on startup after installing the APK built with arcore-unity-sdk-v1.5.0.unitypackage

Phone model: Huawei P20 Pro
Phone ARCore version: 1.5.180910096

Log output:
11-12 18:29:23.990 10718-10744/? E/native: session_c_api.cc:289 ArConfig_setUpdateMode: session was passed NULL.
session_c_api.cc:272 ArConfig_setPlaneFindingMode: session was passed NULL.
session_c_api.cc:237 ArConfig_setLightEstimationMode: session was passed NULL.
session_c_api.cc:307 ArConfig_setCloudAnchorMode: session was passed NULL.
session_c_api.cc:374 ArConfig_setFocusMode: session was passed NULL.
session_c_api.cc:550 ArSession_configure: session was passed NULL.
session_c_api.cc:1158 ArFrame_acquireCamera: session was passed NULL.
session_c_api.cc:1027 ArCamera_getTrackingState: session was passed NULL.
session_c_api.cc:1158 ArFrame_acquireCamera: session was passed NULL.
session_c_api.cc:1027 ArCamera_getTrackingState: session was passed NULL.

Why is that? I built it according to this tutorial: https://developers.google.cn/ar/develop/unity/quickstart-android

Block ARSessionOrigin.Raycast with UI

Is it possible to block the raycast from ARSessionOrigin in PlaceOnPlane.cs with a UI component? It appears as though the raycast ignores all non-Trackable objects at the moment.

I feel that having a UI element overlaid on an AR view is not an uncommon use case, and surely people don't want the AR scene reacting to taps when all they wanted to do was press a button.

ARKit Package Import Fail

Updating to the new packages seems to give this error:

Assertion failed: Removing Packages/com.unity.xr.arkit/npm-debug.log because the asset does not exist
This causes the package manager to uninstall the ARKit package. Running 2018.1.3f1 on Mac.

Flickering when playing video on Android

When using the Unity VideoPlayer component in the SampleScene, every other background frame is the video texture.

This is with the VideoPlayer's Render Mode set to Material Override, and also with Render Texture.

When the AR Camera Background component's material is applied to another object, the texture still shows the same flickering.

Trying to set the override material for the AR Camera Background results in a black background.

Tracking Broken on iOS 12

Tested on an iPhone 6s and 7+, iOS 12 beta 1 and beta 2.

After a few seconds Unity will stop receiving updates from ARKit and all graphical elements (camera, detected planes, pointcloud, hit test object) will be frozen on the screen. The color background image will keep updating.

There are repeated messages that "World tracking performance is being affected by resource constraints" every few seconds leading up to the freeze. No error messages are present in XCode.

This is running the AR Foundation sample scene with no additions or changes. AR Foundation does not freeze in iOS 11.4. The standalone UnityARKitPlugin runs fine on both devices in iOS 12.

Black camera view on iPhone 6

I've got an app in production that was approved without a hitch by Apple.
Now, two separate people have reported the camera-view is black on iPhone 6 devices (not 6s).
All other devices seem to have no issues.

I've ordered an iPhone 6 to reproduce, but thought I'd share this anyway. Will report my findings once I get the device.

I've also submitted a TSI to Apple, but haven't heard from them yet (sent 5 days ago).

To get LWRP working with the camera, I did apply the fix mentioned here: https://nolanscobie.com/2018/07/unity-mobile-ar-with-lwrp/ - without that fix, the camera is black on all devices. So maybe that could be causing it - that fix not being applied somehow. It's just weird it only happens on one specific device.

The app in question can be installed from App Store or Google Play, in case you want to reproduce/see it.

Build info:
Unity: Unity 2018.2.3f1, Lightweight Render Pipeline, AR Foundation, ShaderGraph
iOS: macOS High Sierra 10.13.6, Mac mini Late 2014, Xcode 9.4.1 (9F2000)
Android: Windows 10.0.17134, min API level 24, target API 27, .NET 3.5 eq. runtime, Mono and .NET 2.0 subset, ARMv7
manifest.json:

{
  "dependencies": {
    "com.unity.package-manager-ui": "1.9.11",
    "com.unity.render-pipelines.core": "3.0.0-preview",
    "com.unity.render-pipelines.lightweight": "3.0.0-preview",
    "com.unity.shadergraph": "3.0.0-preview",
    "com.unity.textmeshpro": "1.2.4",
    "com.unity.xr.arcore": "1.0.0-preview.18",
    "com.unity.xr.arfoundation": "1.0.0-preview.17",
    "com.unity.xr.arkit": "1.0.0-preview.14",
    "com.unity.modules.animation": "1.0.0",
    "com.unity.modules.assetbundle": "1.0.0",
    "com.unity.modules.audio": "1.0.0",
    "com.unity.modules.director": "1.0.0",
    "com.unity.modules.imageconversion": "1.0.0",
    "com.unity.modules.imgui": "1.0.0",
    "com.unity.modules.jsonserialize": "1.0.0",
    "com.unity.modules.particlesystem": "1.0.0",
    "com.unity.modules.physics": "1.0.0",
    "com.unity.modules.physics2d": "1.0.0",
    "com.unity.modules.screencapture": "1.0.0",
    "com.unity.modules.ui": "1.0.0",
    "com.unity.modules.uielements": "1.0.0",
    "com.unity.modules.umbra": "1.0.0",
    "com.unity.modules.unityanalytics": "1.0.0",
    "com.unity.modules.unitywebrequest": "1.0.0",
    "com.unity.modules.unitywebrequestassetbundle": "1.0.0",
    "com.unity.modules.unitywebrequestaudio": "1.0.0",
    "com.unity.modules.unitywebrequesttexture": "1.0.0",
    "com.unity.modules.unitywebrequestwww": "1.0.0",
    "com.unity.modules.video": "1.0.0",
    "com.unity.modules.vr": "1.0.0",
    "com.unity.modules.xr": "1.0.0"
  }
} 

Crash when starting ARCore on Samsung S8 related to permissions

Hello,
I'm having an issue with getting ARFoundation to work on Android (a Samsung S8): it crashes when it tries to ask for camera permissions. (It's not entirely clear to me why it needs to ask for camera permissions, since according to the Unity manual permissions are supposed to be set at install.) Part of the design of my app requires being able to toggle AR on and off within the same scene, and I have noticed that ARFoundation can be a bit finicky when used this way. However, I have no issues building and running on iOS.

Below is the relevant section of the log:

09-07 16:12:53.431: E/mono(16776): Unhandled Exception:

09-07 16:12:53.431: E/mono(16776): UnityEngine.AndroidJavaException: java.lang.ClassNotFoundException: com.unity3d.plugin.UnityAndroidPermissions

09-07 16:12:53.431: E/mono(16776):   at UnityEngine.AndroidJNISafe.CheckException () [0x00091] in <923839a08fb841a3aae5c73693b946f8>:0 

09-07 16:12:53.431: E/mono(16776):   at UnityEngine.AndroidJNISafe.CallStaticObjectMethod (System.IntPtr clazz, System.IntPtr methodID, UnityEngine.jvalue[] args) [0x00011] in <923839a08fb841a3aae5c73693b946f8>:0 

09-07 16:12:53.431: E/mono(16776):   at UnityEngine.AndroidJavaObject._CallStatic[ReturnType] (System.String methodName, System.Object[] args) [0x002d6] in <923839a08fb841a3aae5c73693b946f8>:0 

09-07 16:12:53.431: E/mono(16776):   at UnityEngine.AndroidJavaObject.CallStatic[ReturnType] (System.String methodName, System.Object[] args) [0x00001] in <923839a08fb841a3aae5c73693b946f8>:0 

09-07 16:12:53.431: E/mono(16776):   at UnityEngine.AndroidJavaObject.FindClass (System.String name) [0x0001d] in <923839a08fb841a3aae5c73693b946f8>:0 

09-07 16:12:53.431: E/mono(16776):   at UnityEngine.AndroidJavaObject._AndroidJavaObject (System.String className, System.Object[] args) [0x00020] in <923839a08fb841a3aae5c73693b946f8>:0 

09-07 16:12:53.431: E/mono(16776):   at UnityEngine.AndroidJavaObject..ctor (System.String className, System.Object[] args) [0x00007] in <923839a08fb841a3aae5c73693b946f8>:0 

09-07 16:12:53.431: E/mono(16776):   at UnityEngine.XR.ARCore.ARCorePermissionManager.get_permissionsService () [0x00014] in <57129082ff43417e8b08c23e3f1db4f7>:0 

09-07 16:12:53.431: E/mono(16776):   at UnityEngine.XR.ARCore.ARCorePermissionManager.IsPermissionGranted (System.String permissionName) [0x0000c] in <57129082ff43417e8b08c23e3f1db4f7>:0 

09-07 16:12:53.431: E/mono(16776):   at UnityEngine.XR.ARCore.ARCorePermissionManager.RequestPermission (System.String permissionName, System.Action2[T1,T2] callback) [0x00011] in <57129082ff43417e8b08c23e3f1db4f7>:0 

This behaviour occurs whether I build with Mono or IL2CPP. I have a suspicion it could have something to do with my player settings, because deploying the samples project displays the permissions and works correctly. The only difference I can see in the sample project's player settings is that it is set to use .NET 3.x (which won't work for my purposes, and I can't see how this would be the problem), and the text under Scripting Define Symbols is "UNITY_POST_PROCESSING_STACK_V2", which I don't understand either.

Attached are the player settings for my project. Also attached is the full log from the session, but I couldn't really find anything else of note.

projectsettings1

projectsettings2

ProjectSettings.txt

samsungfulllog.txt

Playing in Unity

Hello,

I've enjoyed developing and playing in a Unity project based on https://github.com/Unity-Technologies/experimental-ARInterface, editing in Play mode right in the editor without needing to build the app to a device every time.
Now I would like to switch to AR Foundation, but these frameworks are not substitutes.
The main pain is the need to build to a device every time to see new changes.
Are you planning to add support for playing in Unity without building to a device? Or can I achieve behaviour similar to developing with ARInterface without writing all the supporting code? I mean viewing mocked AR surfaces and moving the camera with the mouse and keyboard.

Thank you.

PlaceOnPlane script missing error

Hi all,

I have a clean copy that I just cloned, running Unity 2018.2.14f1, and modified my JSON manifest to support the remote/mock package. I am getting an error in the editor on "PlaceOnPlane" saying the associated script cannot be found. I checked the class name and it matches the source file. I didn't modify this file at all. Any ideas? I tried searching here and in the forums and didn't see anyone else with this issue.

iOS camera freezes after one frame

When attempting to run an app with ARFoundation on iOS, the first time it asks for camera permission and works fine. Any subsequent time you try to start the app, the camera only takes exactly one picture and then freezes, forcing you to uninstall the app to make it work again.

Anyone else had this problem? Is there an easy fix or a fundamental problem somewhere?

Creating persistent reference points and scale of the objects

Hi, I have seen that this package has just been released, and I want to use this instead of native ARCore SDK for my project, but I have some questions.

From the samples I know you can call m_SessionOrigin.Raycast to detect the point of the plane you are tapping on. I also know you can create a reference point with the method TryAttachReferencePoint(ARPlane, Pose) from ARReferencePointManager. The pose could be retrieved directly from the raycast but how do I retrieve the ARPlane?

In my game there are multiple levels, and I want to keep a persistent reference point as the origin position for the levels. In the first scene I want to place the origin by tapping on a point of a plane, then create the anchor there, so that when a level loads it moves to the anchor's position. With ARCore I was able to keep a persistent object with DontDestroyOnLoad(anchor.gameObject). To do the same with ARFoundation, do I need to call DontDestroyOnLoad() on the reference point, or does it become a child of the ARSessionOrigin when it is created?

And when I load a new level, if I need it to be in the scale of the AR Session Origin, do I need to make it a child of the trackablesParent from m_SessionOrigin?

Combine ARFoundation with Vuforia

Helloo,

I am trying to integrate Vuforia functionality into a project also containing the ARFoundation.

So far everything seems to be running fine in the editor, and it compiled fine, but I receive a black screen.

My thoughts here are that both SDKs are trying to access the camera at the same time, but I could be completely wrong.

So I am wondering if there is a way to supply AR Foundation with the same camera feed that Vuforia is receiving?

Alternatively, is it possible to access ARKit 1.5/2 or ARCore 1.2 image tracking (and other functionality) via AR Foundation?

Thank you in advance!

Oliver

EDIT

Okay I eat my words,

It seems as though I can successfully get the Vuforia SDK to start by attaching the Vuforia scripts to the AR Foundation camera.

The AR Foundation functionality all seems to work fine, but the issue now is that although Vuforia has initialized and the console says the session has started, no tracking is recognized.

So unless anybody has anything they might want to add to point me in the right direction for getting tracking to work with Vuforia correctly, I would assume this is an issue to post to the Vuforia forums (god forbid...), and this can be closed?

If anybody is interested in having a look at the project I have up and running, in case they might have some ideas, you can find it here: https://github.com/oliverellmers/arfoundation-samples/tree/Testing

Cheers

Crash on LG G6

Hi, I've been experiencing strange crashes on an LG G6. It'll sometimes work and then other times it just crashes before or sometimes just after starting the camera feed, logs attached.

arprestolog.txt
fulllog.txt

Plane type detection

Hi guys! How can I change plane detection (horizontal, vertical, both or none)?

Linear color space shader

Hi Unity,
The ARKit plugin has two shaders, for gamma color space and linear color space. AR Foundation only provides a gamma color space shader. Unfortunately, my project uses linear space. How can I solve this problem?

About scaling content and how to reset the plane

Hi Unity,
Question 1: I want to scale the content rendered by ARSessionOrigin with arsessionOrigin.transform.localScale = Vector3.one * scaleValue, but it affects the point cloud particle effects. It looks strange; is it something I missed?

Question 2: After disabling the ARSession and then enabling it again, the previously detected planes still appear and cannot be cleared. I need to implement enabling AR, resetting AR, and closing AR; how do I need to set this up?
Thanks!

ARSubsystemManager.systemStateChanged never reaches ARSystemState.SessionTracking on iOS

After the ARSession is deactivated and reactivated (ARSession.enabled), the status ARSystemState.SessionTracking is never reached on iOS.

reproduction:

  • Open the SampleScene from ARFoundation
  • Add a event on ARSubsystemManager.systemStateChanged which logs the state on Call
  • Add two buttons to the scene
  • one button to disable ARSession
  • another button to enable the ARSession
  • build the app for ios (with 2018.1.9f1)
  • On the first run, the state changes from Ready to SessionInitializing to SessionTracking. After the switch, it only changes from Ready to SessionInitializing, but it is still functional: it scans the environment and changes and adds planes. Only the point cloud isn't visible.

On Android all is working fine.

Detection not working on OnePlus 6

I've got an app in production that seems to work just fine on most Android devices.

But I've had a report from a friend, that plane detection doesn't work on his OnePlus 6.
The marker particles never show up, it's just like a regular camera view. Had him try many different locations, good lighting, surfaces that should show marker particles. Nothing.

Have tried all other troubleshooting I could think of, including uninstalling, checking if camera works in other apps, etc. It does work in the Ikea Place app*. Also had him manually reinstall and update to latest version of ARCore with no luck.

*Update: The Ikea Place app does allow him to place something, but it does not follow with the camera. It's placed where he pointed his camera at first. So it would seem like there's some issues there too.

The app in question can be installed from App Store or Google Play, in case you want to reproduce/see it.

Build info:
Unity: Unity 2018.2.3f1, Lightweight Render Pipeline, AR Foundation, ShaderGraph
iOS: macOS High Sierra 10.13.6, Mac mini Late 2014, Xcode 9.4.1 (9F2000)
Android: Windows 10.0.17134, min API level 24, target API 27, .NET 3.5 eq. runtime, Mono and .NET 2.0 subset, ARMv7
Other: Applied fix to get LWRP working with the camera (black camera view without it): https://nolanscobie.com/2018/07/unity-mobile-ar-with-lwrp/

manifest.json:

{
  "dependencies": {
    "com.unity.package-manager-ui": "1.9.11",
    "com.unity.render-pipelines.core": "3.0.0-preview",
    "com.unity.render-pipelines.lightweight": "3.0.0-preview",
    "com.unity.shadergraph": "3.0.0-preview",
    "com.unity.textmeshpro": "1.2.4",
    "com.unity.xr.arcore": "1.0.0-preview.18",
    "com.unity.xr.arfoundation": "1.0.0-preview.17",
    "com.unity.xr.arkit": "1.0.0-preview.14",
    "com.unity.modules.animation": "1.0.0",
    "com.unity.modules.assetbundle": "1.0.0",
    "com.unity.modules.audio": "1.0.0",
    "com.unity.modules.director": "1.0.0",
    "com.unity.modules.imageconversion": "1.0.0",
    "com.unity.modules.imgui": "1.0.0",
    "com.unity.modules.jsonserialize": "1.0.0",
    "com.unity.modules.particlesystem": "1.0.0",
    "com.unity.modules.physics": "1.0.0",
    "com.unity.modules.physics2d": "1.0.0",
    "com.unity.modules.screencapture": "1.0.0",
    "com.unity.modules.ui": "1.0.0",
    "com.unity.modules.uielements": "1.0.0",
    "com.unity.modules.umbra": "1.0.0",
    "com.unity.modules.unityanalytics": "1.0.0",
    "com.unity.modules.unitywebrequest": "1.0.0",
    "com.unity.modules.unitywebrequestassetbundle": "1.0.0",
    "com.unity.modules.unitywebrequestaudio": "1.0.0",
    "com.unity.modules.unitywebrequesttexture": "1.0.0",
    "com.unity.modules.unitywebrequestwww": "1.0.0",
    "com.unity.modules.video": "1.0.0",
    "com.unity.modules.vr": "1.0.0",
    "com.unity.modules.xr": "1.0.0"
  }
} 

Get detected plane's normal always pointing to camera's side.

There is an issue I face mostly when I work on vertical surfaces. The AR tool detects a couple of planes on the same surface, and one of these planes has its normal inverted, i.e., the normal points away from the side the camera is on, so the spawned object looks upside down.

Edit: Issue occurs on horizontal surfaces too.

Black screen after building to Android

Hello,

The first time I downloaded and built this project I had no problems with my camera. But I removed that project because, after a later build, the camera no longer appeared. That is why I have now downloaded this project again, but again, after the build, there is no pop-up to accept the camera permission. If I allow the permission manually in the settings and restart the app, it still does not work. How do I solve this?

iOS freezes upon loading other scene and sometimes at random

Unity 2018.2.14, Xcode 10.1 (as well as older versions).

Happens about half the time when loading a scene from the ARKit scene, across different devices and iOS versions (iPads/iPhones).

Comes with a complimentary 'EXC_BAD_ACCESS (code=1, address=0xf000000011d7097f)'

Not always in the same file, but usually in something camera-texture related. In the attached pastebin it happened in CameraImageApi::ImageManager::~ImageManager()

https://pastebin.com/QF47c9sb

Error when building on iOS

Unity 2018.2.0f2
macOS High Sierra 10.13.5
Xcode 9.4.1

Getting the below error when trying to build. I've tried many combinations of building for release/debug, script debugging, appending and replacing, etc. I keep getting the error.

Exception: The required file: '' does not exist
UnityEditor.iOS.PostProcessiPhonePlayer.InstallIncludedFiles (UnityEditor.iOS.IncludedFileList includedFiles, System.String installPath, BuildSettings bs, UnityEditor.Build.Reporting.BuildReport buildReport) (at /Users/builduser/buildslave/unity/build/PlatformDependent/iPhonePlayer/Extensions/Common/BuildPostProcessor.cs:1526)
UnityEditor.iOS.PostProcessiPhonePlayer.UpdateInstallLocation (UnityEditor.iOS.ProjectPaths paths, BuildSettings bs, UnityEditor.iOS.IncludedFileList includedFiles, UnityEditor.Build.Reporting.BuildReport buildReport) (at /Users/builduser/buildslave/unity/build/PlatformDependent/iPhonePlayer/Extensions/Common/BuildPostProcessor.cs:1463)
UnityEditor.iOS.PostProcessiPhonePlayer.PostProcess (BuildSettings bs, UnityEditor.iOS.ProjectPaths paths, UnityEditor.RuntimeClassRegistry usedClassRegistry, UnityEditor.Build.Reporting.BuildReport buildReport) (at /Users/builduser/buildslave/unity/build/PlatformDependent/iPhonePlayer/Extensions/Common/BuildPostProcessor.cs:773)
UnityEditor.iOS.PostProcessiPhonePlayer.PostProcess (PostProcessorSettings postProcessorSettings, BuildPostProcessArgs args) (at /Users/builduser/buildslave/unity/build/PlatformDependent/iPhonePlayer/Extensions/Common/BuildPostProcessor.cs:611)
UnityEditor.iOS.iOSBuildPostprocessor.PostProcess (BuildPostProcessArgs args) (at /Users/builduser/buildslave/unity/build/PlatformDependent/iPhonePlayer/Extensions/Common/ExtensionModule.cs:37)
UnityEditor.Modules.DefaultBuildPostprocessor.PostProcess (BuildPostProcessArgs args, UnityEditor.BuildProperties& outProperties) (at /Users/builduser/buildslave/unity/build/Editor/Mono/Modules/DefaultBuildPostprocessor.cs:27)
UnityEditor.PostprocessBuildPlayer.Postprocess (BuildTargetGroup targetGroup, BuildTarget target, System.String installPath, System.String companyName, System.String productName, Int32 width, Int32 height, BuildOptions options, UnityEditor.RuntimeClassRegistry usedClassRegistry, UnityEditor.Build.Reporting.BuildReport report) (at /Users/builduser/buildslave/unity/build/Editor/Mono/BuildPipeline/PostprocessBuildPlayer.cs:287)
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

Light estimation not working

I have enabled light estimation on the AR session but it does not react to the lights. I'm using a Samsung Galaxy S8

XR Mock preview 6 error in Unity 2018.3.0b8

XR Mock is awesome, hope there is an update soon!
Thanks.

Unity 2018.3.0b8
"com.unity.xr.arcore": "1.0.0-preview.21",
"com.unity.xr.arfoundation": "1.0.0-preview.19",
"com.unity.xr.arkit": "1.0.0-preview.16",
"com.unity.xr.mock": "0.0.1-preview.6",

Error:
Library\PackageCache\[email protected]\com.unity.xr.remoting\Editor\EditorRemoting.cs(761,31): error CS0311: The type 'UnityEngine.Experimental.XR.XRDepthSubsystemDescriptor' cannot be used as type parameter 'TDescriptor' in the generic type or method 'EditorRemoting.CreateSubsystem<TDescriptor, TSubsystem>(List<TDescriptor>, string)'. There is no implicit reference conversion from 'UnityEngine.Experimental.XR.XRDepthSubsystemDescriptor' to 'UnityEngine.Experimental.SubsystemDescriptor<UnityEngine.Experimental.XR.XRDepthSubsystem>'.

Doesn't compile for iOS

I cloned this repo and tried to build the application for iOS but I got the following error:

Failed running /Applications/Unity/Hub/Editor/2018.1.0f2/Unity.app/Contents/il2cpp/build/UnityLinker.exe --api=NET_2_0 -out="/Users/rkc/Projects/arfoundation-samples/Temp/StagingArea/Data/Managed/tempStrip" -l=none -c=link --link-symbols -x="/Applications/Unity/Hub/Editor/2018.1.0f2/PlaybackEngines/iOSSupport/Whitelists/Core.xml" -f="/Applications/Unity/Hub/Editor/2018.1.0f2/Unity.app/Contents/il2cpp/LinkerDescriptors" -x "/Users/rkc/Projects/arfoundation-samples/Temp/StagingArea/Data/Managed/../platform_native_link.xml" -x "/var/folders/gk/02w1vv9n47v422v47dt2rtk40000gn/T/tmp27667db.tmp" -x "/var/folders/gk/02w1vv9n47v422v47dt2rtk40000gn/T/tmp594a621d.tmp" -x "/var/folders/gk/02w1vv9n47v422v47dt2rtk40000gn/T/tmp2514e7b2.tmp" -x "/Users/rkc/Projects/arfoundation-samples/Assets/link.xml" -d "/Users/rkc/Projects/arfoundation-samples/Temp/StagingArea/Data/Managed" -a "/Users/rkc/Projects/arfoundation-samples/Temp/StagingArea/Data/Managed/Assembly-CSharp.dll" -a "/Users/rkc/Projects/arfoundation-samples/Temp/StagingArea/Data/Managed/Unity.XR.ARKit.dll" -a "/Users/rkc/Projects/arfoundation-samples/Temp/StagingArea/Data/Managed/Unity.XR.ARFoundation.dll" -a "/Users/rkc/Projects/arfoundation-samples/Temp/StagingArea/Data/Managed/Unity.XR.ARExtensions.dll" -a "/Users/rkc/Projects/arfoundation-samples/Temp/StagingArea/Data/Managed/UnityEngine.SpatialTracking.dll" -a "/Users/rkc/Projects/arfoundation-samples/Temp/StagingArea/Data/Managed/UnityEngine.Analytics.dll"

Any help would be appreciated.

Build details:

Unity Version: 2018.1.0f2

Xcode version: Version 9.4.1 (9F2000)

iOS Version: 11.4.1

Player Setting

Vertical Planes UVs are stretched

The horizontal planes are fine but the vertical ones seem to have some issues with the UVs


I used the "FeatheredPlaneScene" scene and changed the shader of the ARPlane's material to the one in this tutorial.

If there is a quick fix on my side please let me know.

Thank you

Support for Google Cloud Anchors

Hi Unity,

I was curious if you are planning to add support for Google Cloud Anchors. I'm currently working off of the GoogleARCore package but I find it would be more advantageous in the long-term to leverage ARFoundation to build for cross-platform.

Thanks!

Plane texture not visible

Hi, I wanted to change the plane texture to an opaque seamless texture of my choice, but when I change the material only its base color is visible, not the texture.
Does this API work with plane materials differently, or am I missing something?

gradle build failed on android

Hello friends:
I want to build for Android with Unity 2018.2.1f1, but the Gradle build fails.
Details follow:

CommandInvokationFailure: Gradle build failed.
D:/Program Files/Java/jdk1.8.0_25\bin\java.exe -classpath "D:\lzh\unity2018.2.1\Editor\Data\PlaybackEngines\AndroidPlayer\Tools\gradle\lib\gradle-launcher-4.2.1.jar" org.gradle.launcher.GradleMain "-Dorg.gradle.jvmargs=-Xmx2048m" "assembleRelease"

stderr[

FAILURE: Build failed with an exception.

  • What went wrong:
    A problem occurred configuring root project 'gradleOut'.

Could not resolve all files for configuration ':classpath'.
Could not find intellij-core.jar (com.android.tools.external.com-intellij:intellij-core:26.0.1).
Searched in the following locations:
https://jcenter.bintray.com/com/android/tools/external/com-intellij/intellij-core/26.0.1/intellij-core-26.0.1.jar

  • Try:
    Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.

  • Get more help at https://help.gradle.org

BUILD FAILED in 0s
]
stdout[

]
exit code: 1
UnityEditor.Android.Command.WaitForProgramToRun (UnityEditor.Utils.Program p, UnityEditor.Android.WaitingForProcessToExit waitingForProcessToExit, System.String errorMsg)
UnityEditor.Android.Command.Run (System.Diagnostics.ProcessStartInfo psi, UnityEditor.Android.WaitingForProcessToExit waitingForProcessToExit, System.String errorMsg)
UnityEditor.Android.AndroidJavaTools.RunJava (System.String args, System.String workingdir, System.Action1 progress, System.String error) UnityEditor.Android.GradleWrapper.Run (UnityEditor.Android.AndroidJavaTools javaTools, System.String workingdir, System.String task, System.Action1 progress)
Rethrow as GradleInvokationException: Gradle build failed
UnityEditor.Android.GradleWrapper.Run (UnityEditor.Android.AndroidJavaTools javaTools, System.String workingdir, System.String task, System.Action`1 progress)
UnityEditor.Android.PostProcessor.Tasks.BuildGradleProject.Execute (UnityEditor.Android.PostProcessor.PostProcessorContext context)
UnityEditor.Android.PostProcessor.PostProcessRunner.RunAllTasks (UnityEditor.Android.PostProcessor.PostProcessorContext context)
UnityEditor.BuildPlayerWindow:BuildPlayerAndRun()

I tried to open the link https://jcenter.bintray.com/com/android/tools/external/com-intellij/intellij-core/26.0.1/intellij-core-26.0.1.jar
and it shows this:
{
"errors" : [ {
"status" : 404,
"message" : "Could not find resource"
} ]
}

I am not sure how to fix this.

Hoping for your feedback.

Doesn't build (link) on iOS...?

Not sure if this is related to the other iOS build issue. However, I downloaded the package and it fails to link:

  "_OBJC_METACLASS_$_ARAnchor", referenced from:
      _OBJC_METACLASS_$_ARPlaneAttachmentAnchor in UnityARKit.a(ARKitXRReferencePointProvider.o)
  "_OBJC_CLASS_$_ARAnchor", referenced from:
      _OBJC_CLASS_$_ARPlaneAttachmentAnchor in UnityARKit.a(ARKitXRReferencePointProvider.o)
      objc-class-ref in UnityARKit.a(ARKitXRReferencePointProvider.o)
  "_OBJC_CLASS_$_ARPlaneAnchor", referenced from:
      objc-class-ref in UnityARKit.a(ARKitXRReferencePointProvider.o)
      objc-class-ref in UnityARKit.a(ARKitXRPlaneProvider.o)
  "_OBJC_CLASS_$_ARSession", referenced from:
      objc-class-ref in UnityARKit.a(ARKitXRSessionProvider.o)
  "_OBJC_CLASS_$_ARWorldTrackingConfiguration", referenced from:
      objc-class-ref in UnityARKit.a(ARKitXRSessionProvider.o)
ld: symbol(s) not found for architecture arm64
clang: error: linker command failed with exit code 1 (use -v to see invocation)

Unity 2018.2.2f1 Xcode 9.4.1 (9F2000), build target iOS11 universal.

Can't build for Android - Missing UnityARCore\classes.jar file

I am using Unity 2018.2.3f1
Updated my Android SDK (SDK Manager) to the latest version
Cloned the repo and Opened the project from Unity Hub
Build Settings -> Build
The first image shows a warning I get when I opened the project. I am not sure how I can help the editor and do it manually. (Not sure if it's related, so I mentioned it.)


The second image shows the actual build error that appears. I am not sure where to download this missing file or where to put it.

If you need any more details, let me know.
Thanks

Aborted on iOS

This sample app aborts on iOS due to the lack of the key 'NSCameraUsageDescription' in the Info.plist.

black screen when used as AAR lib

When I build an APK from Unity, or deploy directly from the Unity-generated Android project, the AR(Core) scene/view works.
But when I instead package the Unity-generated Android project as an AAR library and use it in another Android project, the camera/screen in the AR scene stays black (it only shows the Unity UI).

When using the Unity content as an AAR library, I host it in a UnityHolderActivity, and there it throws the error:
Unable to find UnityARCore
UnityHolderActivity E/Unity: DllNotFoundException: UnityARCore
at (wrapper managed-to-native) UnityEngine.XR.ARCore.Api:UnityARCore_setCameraPermissionProvider (UnityEngine.XR.ARCore.Api/CameraPermissionRequestProvider)
at UnityEngine.XR.ARCore.ARCoreCameraExtension.Register () [0x00000] in :0

Black Screen

Unity 2018.1.5f
I have built and run the demo scene with a cube successfully before. Now, the build only shows a black screen, and the camera permission prompt is never raised when starting the build for the first time. Does anyone have any ideas about what might have changed?

Using Pixel XL, ARCore 1.3.180604066

throws error when not being able to overwrite aar libs

When one deploys an Android project to a location where the Android project was already deployed before, it throws a bunch of errors for AAR library files it can't replace (because they are already there).

Like

Trying to add file ...unityandroidpermissions.aar to the list of ouptut files in the build report, but a file at that path has already been added.
UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr)

It shouldn't show errors for these, since it is expected that the files are already there on redeploy.
