google-ar / codelab-webxr
Building an augmented reality application with the WebXR Device API
Home Page: https://codelabs.developers.google.com/codelabs/ar-with-webxr/
License: Apache License 2.0
I'm getting an Error 404 page when trying to access the codelabs website.
We should pass a third argument, false, to setSize so that we explicitly do not attempt to update the canvas styling: this.renderer.setSize(viewport.width, viewport.height, false);
In the latest Chrome Canary (70.0.3646.0), the getDevicePose() function (in app.js:151) has been renamed to getViewerPose() and needs to be edited in app.js.
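A small compatibility shim (my own sketch, not from the codelab) can cover both names during the transition:

```javascript
// Sketch: resolve the viewer pose under both the old and new API names.
// Chrome Canary ~70 renamed XRFrame.getDevicePose() to getViewerPose().
function getPose(frame, referenceSpace) {
  if (typeof frame.getViewerPose === 'function') {
    return frame.getViewerPose(referenceSpace); // new name
  }
  return frame.getDevicePose(referenceSpace); // old name
}
```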
Hello, I've done extensive testing and found the following: in step 4, part 4 of "Add a reticle", the tutorial shows this code:
async onSessionStarted() {
  // ...
  // Setup an XRReferenceSpace using the "local" coordinate system.
  this.localReferenceSpace = await session.requestReferenceSpace("local");
  // Add these lines:
  // Create another XRReferenceSpace that has the viewer as the origin.
  this.viewerSpace = await this.session.requestReferenceSpace("viewer");
  // Perform hit testing using the viewer as origin.
  this.hitTestSource = await this.session.requestHitTestSource({ space: this.viewerSpace });
  // ...
}
and it doesn't work; it throws the onNoXRDevice exception.
The functional code, found in the step-04 folder, uses this.xrSession instead:
// Setup an XRReferenceSpace using the "local" coordinate system.
this.localReferenceSpace = await this.xrSession.requestReferenceSpace('local');
// Create another XRReferenceSpace that has the viewer as the origin.
this.viewerSpace = await this.xrSession.requestReferenceSpace('viewer');
// Perform hit testing using the viewer as origin.
this.hitTestSource = await this.xrSession.requestHitTestSource({ space: this.viewerSpace });
via: https://twitter.com/jerome_etienne/status/1039480467668566016
We may be hitting a compositing issue -- we can possibly move this into WebGL.
I had a gltf, and I changed the link here:
window.gltfLoader.load("https://immersive-web.github.io/webxr-samples/media/gltf/sunflower/sunflower.gltf", function(gltf) {
const flower = gltf.scene.children.find(c => c.name === 'sunflower')
flower.castShadow = true;
window.sunflower = gltf.scene;
});
to something like http://mysite.com/mymodel.gltf
and it is not working. I have no idea how to change the sunflower properly. Right now I can successfully instantiate nothing on the screen 😂
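For what it's worth, the callback above looks for a child named 'sunflower', so a custom model whose nodes have different names will make children.find() return undefined and the castShadow line throw. A hedged sketch of a fallback (the helper name and fallback behavior are my own, not from the codelab):

```javascript
// Find the named child in a loaded glTF scene, falling back to the whole
// scene when no child matches (e.g. a custom model with different names).
function findModelRoot(gltf, name) {
  const named = gltf.scene.children.find((c) => c.name === name);
  return named || gltf.scene;
}

// Usage inside the loader callback, with a custom model URL:
// window.gltfLoader.load('https://mysite.com/mymodel.gltf', function (gltf) {
//   const model = findModelRoot(gltf, 'mymodel'); // 'mymodel' is a guess
//   model.castShadow = true;
//   window.sunflower = gltf.scene;
// });
```

Note also that loading an http:// model URL from an https:// page is blocked as mixed content, and the model's host needs CORS headers; either can make a swapped-in model silently fail to appear.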
Before the API is available in Canary, where can I get the build that was available to the I/O attendees?
Thanks.
When starting any of the demos, Chrome Canary unexpectedly exits after clicking the "START AUGMENTED REALITY" button.
Device: Samsung S7 (SM-G9308)
Android version 8.0.0
Chrome Canary version 73.0.3636.2
ARCore version 1.5.180910133
I have turned on the #webxr and #webxr-hit-test flags and nothing else.
If I remove the option environmentIntegration: true
for requestSession(), Chrome renders the page without the camera stream.
Any idea what is causing the problem?
Hi, I'm getting the "browser not supported" message when trying the examples. Tried on Chrome version 81.0.4044.138, Android 10.
chrome://version reports:
Google Chrome: 81.0.4044.138 (Official Build) (32-bit)
Revision: 8c6c7ba89cc9453625af54f11fd83179e23450fa-refs/branch-heads/4044@{#999}
OS: Android 10; Pixel 2 XL Build/QQ2A.200405.005
How do you load an HTML5 page that uses WebXR in an app you developed yourself? Loading the page in Chrome works, but loading it directly in an Android WebView component prompts that the device does not support it.
The demo is using outdated code and an outdated version of the WebXR API.
The camera is not working properly after updating Canary from version 70 to 72.0.3589.0 on a Pixel 2 with Android 9.
Portrait mode: the screen blinks different colors after clicking the Start Augmented Reality button.
Landscape mode: the camera detects the surroundings but displays them at a different angle.
Everything was working fine in Canary 70.0.3538.2 (https://youtu.be/dXODcxYQMtM).
It's a question, sorry; I don't know where the right place for that is. WebXR looks very promising, but I cannot test it with any officially available version of Canary. Is there a (serious) place where I can find a working (72) version? The more relevant question is: when will current versions of Canary work with the available WebXR demos (https://web-education-ar-demo.appspot.com/)? Maybe that's a stupid question, but is there any information online about a time schedule for that?
Thanks, Matthias
While projecting my phone screen onto my glasses, I don't want to see the camera stream on top of the real world, so
is it possible to remove the camera stream from the screen while AR mode is still working?
Hi there! I've been trying my best to integrate the animation code into my project, but it still does not work.
I added this.animate(); to the top of my code, under this.init. To be honest, I don't know if the 'this.' should be there, but I'm guessing it should. I read the article provided on async and promises, but I still don't understand how exactly it relates to integrating animation in my project.
Then under this.scene.add(this.model); I added -
async animate (); {
requestAnimationFrame( animate );
this.model.rotation.y+=1;
this.renderer.render(this.scene, thiscamera ); //But this already happens in 'if (pose)' and
//if I delete it from there, the app stops working.
}
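For reference, a minimal corrected sketch (assuming the codelab's class-based app; the 0.01 step size and the arrow-function binding are my choices): async is not needed, the `();` after the method name is a syntax error, the callback must be referenced as this.animate, and thiscamera is missing a dot. Since the codelab already renders inside its XR frame callback, the loop only needs to update the rotation:

```javascript
// Sketch of a working rotation loop for the codelab's App class.
class App {
  constructor(model) {
    this.model = model; // the loaded three.js object
  }

  // An arrow function keeps `this` bound when passed as a callback.
  animate = () => {
    // Schedule the next frame (guarded so the sketch also runs headless).
    if (typeof requestAnimationFrame === 'function') {
      requestAnimationFrame(this.animate);
    }
    this.model.rotation.y += 0.01; // small step in radians per frame
    // No renderer.render() here: the codelab already renders inside its
    // XR frame callback (`if (pose)`), so rendering twice is unnecessary.
  };
}
```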
Last question: I know that A-Frame has a Slack channel that specifically addresses these types of issues for noobs like myself, and also allows the community to work together. Do you have one as well? Or will you be implementing one in the future?
Thanks so much!
Hi,
Although the phone I am using fulfills all requirements, I get the "Unsupported Browser" screen when opening the test website. I have experimented with AR apps before (e.g. three.ar.js via the older ARCore version), so it should in principle work.
Having successfully run the codelab examples yesterday, I've found that after today's Canary update I'm getting the above error on my S8 running Canary 73.0.3665.4 and Android 8.
The result is that the grey background and GIF remain while the camera feed is missing.
if let hiResCaptureVideoFormat = ARWorldTrackingConfiguration.recommendedVideoFormatForHighResolutionFrameCapturing {
    // Assign the video format that supports hi-res capturing.
    config.videoFormat = hiResCaptureVideoFormat
}
// Run the session.
session.run(config)
Hey,
I'm trying to implement hit testing like it's shown here. In the onSessionStarted function I set the frame of reference, but I'm forced to use 'head-model'. When I try to set it to 'eye-level' I get this error: Unhandled Promise Rejection: Only head-model hit testing is supported
This is my onSessionStarted function:
onSessionStarted = async () => {
  this.setState({ isARSessionStarted: true, arMessage: 'session started' });
  this.renderer = new THREE.WebGLRenderer({
    alpha: true,
    preserveDrawingBuffer: true,
  });
  this.scene = new THREE.Scene();
  this.renderer.autoClear = false;
  this.renderer.shadowMap.enabled = true;
  this.renderer.shadowMap.type = THREE.PCFSoftShadowMap;
  this.gl = this.renderer.getContext();
  await this.gl.setCompatibleXRDevice(this.session.device);
  this.session.baseLayer = new window.XRWebGLLayer(this.session, this.gl);
  const framebuffer = this.session.baseLayer.framebuffer;
  this.renderer.setFramebuffer(framebuffer);
  // this.camera.matrixAutoUpdate = false;
  // this.camera.position.z = 100;
  this.camera = new THREE.PerspectiveCamera();
  this.frameOfRef = await this.session.requestFrameOfReference('eye-level');
  this.session.requestAnimationFrame(this.onXRFrame);
};
So the behavior of my app is strange: when I want to add an object on a surface, it instead "sticks to the display" and moves with the camera. Is this caused by 'head-model'? Is there a way to fix this?
I am on iOS in the XRViewer Browser.
Thanks!
Using Chrome Canary 72.0.3626.0, Android 9, Pixel 2. ARCore is installed and works with native apps. Canary has camera permission (verified with a getUserMedia demo). webxr and webxr-hit-test flags are enabled.
Visiting the code lab demo link: https://googlecodelabs.github.io/ar-with-webxr/final
On pressing the "Start Augmented Reality" button, I see a toast saying "Installing AR Module..." and then nothing else. Tried re-installing, re-tapping the link, force closing, reinstalling ARCore. No dice.
Is it just me?
In the latest three.js, there's a WebGLRenderer.prototype.setFramebuffer
function that works around this issue.
I have two Android devices (one is the latest Samsung phone, the other is a Nexus 5X with Oreo).
I had to install an older version of Chrome because of this example's limitations.
In the end, nothing worked.
I lost too much valuable time.
Hello,
I saw the demo at https://web-education-ar-demo.appspot.com/ -- very cool and smooth.
I started digging into browser AR tech and found this repository, which looks like a good starting point for me.
I followed the tutorial at https://codelabs.developers.google.com/codelabs/ar-with-webxr/#0 and successfully served your app via the Web Server for Chrome extension, but if I visit the page using my smartphone it says that the browser is not supported.
If I visit your demo at https://googlecodelabs.github.io/ar-with-webxr/final/ it works. It lags a lot, but at least it works.
From Canary I get this info:
App version: 69.0.3469.0
OS version: Android 8.0.0, SM-G930F Build/R16NW
Working demo screenshot: https://photos.app.goo.gl/PUnAkHRipBr9jUfb7
Not Working demo screenshot: https://photos.app.goo.gl/RNe9hbXZPzGEnnBE7
Android device: Pixel 2 XL with Android 9
ARCore 1.9.190422066
I have installed many versions of Chrome Canary/Dev (v70-73) and enabled the WebXR flags, but none work.
It shows "unsupported browser". I found that the "xr" attribute doesn't exist on the navigator object.
If anyone has a solution or knows which Chrome version can run the samples, please tell me! Thank you!
Hi,
Thanks for the great tutorial.
I have set up Canary the way you describe in the tutorial. I then tried to host the website using Web Server for Chrome, which seemed to work, but the webpage doesn't load properly and it says "Unsupported Browser".
But apparently if I visit https://googlecodelabs.github.io/ar-with-webxr/final/ from the same device and same browser, I can access the AR mode.
I have briefly compared the app.js in the two folders but cannot find any visible difference. I have also tried to debug the webpage on Android using Chrome DevTools and found that, at line 40 in app.js, navigator.xr is not defined (only on the website I serve with Web Server for Chrome; your final link works fine).
I am not sure what that means.
Am I missing something obvious, or is there something odd about it?
Thanks in advance
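One known cause of exactly this symptom: navigator.xr is only exposed in secure contexts, so a page served over plain http:// from Web Server for Chrome will not get it, while the https:// GitHub Pages copy of the same code will. A rough sketch of the check (the function is mine, and it simplifies the real secure-context rules):

```javascript
// Approximates whether an origin counts as a secure context, which WebXR
// requires before navigator.xr is exposed. Real browsers have more rules
// (e.g. file://, *.localhost); this covers the common cases.
function isLikelySecureOrigin(protocol, hostname) {
  if (protocol === 'https:') return true;
  return hostname === 'localhost' || hostname === '127.0.0.1';
}
```

Serving over https, or forwarding a localhost port to the device through Chrome DevTools remote debugging, is the usual way around this during development.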
Hi! I am pretty new to three.js and Google AR, and I have been trying to find a way to animate the OBJs that I import into the final example. I have mainly been looking at three.js tutorials on how to do this; however, none of the tutorials I've viewed use quite the same syntax as this tutorial, making it a little hard for me to translate the code to fit my needs. Are there any other resources for learning how to rotate and translate the OBJs on the reticle? I know that I cannot export an animation already on the OBJ, as OBJs do not support that, so I am trying to animate in code.
Thanks so much!
Maybe here is a better place for my problem:
(immersive-web/webxr#457)
I need your help. I have done the WebXR tutorial with enthusiasm. (https://codelabs.developers.google.com/codelabs/ar-with-webxr/#0)
Now I would like to display animated objects. With Blender I animated an object in FBX format and loaded it into an HTML page with the FBXLoader of the three.js library. That works!
Unfortunately, I did not succeed in using the FBX object with the same FBXLoader in the AR example of the tutorial.
Is there an example for this? Are glTF objects more usable than FBX objects? What needs to be changed in the code so I can use the loader?
Thanks a lot!
three.js FBX Loader:
var mixer; // declared outside the callback so the render loop can reach it
var loader = new THREE.FBXLoader();
loader.load( 'models/fbx/AR_Tutorial.fbx', function ( object ) {
  mixer = new THREE.AnimationMixer( object );
  var action = mixer.clipAction( object.animations[ 0 ] );
  action.play();
  object.traverse( function ( child ) {
    if ( child.isMesh ) {
      child.castShadow = true;
      child.receiveShadow = true;
    }
  } );
  scene.add( object );
} );
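One detail that's easy to miss when moving this into the AR example: the AnimationMixer has to be advanced every frame, or the clip never visibly plays. A hedged sketch of the missing step (the helper name is mine; mixer and clock would be a THREE.AnimationMixer and THREE.Clock in the real app):

```javascript
// Advance a three.js animation mixer by the elapsed time since the last
// frame. Call this from the render loop (e.g. the codelab's onXRFrame),
// just before renderer.render(scene, camera).
function advanceAnimation(mixer, clock) {
  if (mixer) {
    mixer.update(clock.getDelta()); // step the animation forward
  }
}
```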
The instructions say that it does not work in the latest Canary and that Canary 70-72 is needed. However, no link to those versions of Canary can be found, and Google searches for a version in that range turn up empty.
On https://codelabs.developers.google.com/codelabs/ar-with-webxr/#0, the link to the WebXR Device API should be
https://immersive-web.github.io/webxr/
instead of
https://immersive-web.github.io/webxr/spec/latest/, which returns a 404.
Here is a video showing a problem I have with these demos:
https://youtu.be/htXAzbYrAkU
When starting any of the demos, the background of the webpage turns gray and the camera live feed stops. The tracking seems to still work when moving the phone.
Device: Samsung S8 (SM-G950F) and Samsung S7 Edge
Both devices have:
The "phone-ar-hit-test.html" and "phone-ar.html" demos work nicely on the immersive-web github page:
https://github.com/immersive-web/webxr-samples/tree/master/proposals
Also the Chacmool demo works at: https://web-education-ar-demo.appspot.com/
Any idea what is causing the problem?