cesiumgs / gltf-pipeline

Content pipeline tools for optimizing glTF assets. :globe_with_meridians:

License: Apache License 2.0
As we continue adding pipeline stages, it is going to be important to keep a uniform template for pipeline stage functions so that they can be used in the same way.
I would propose the following:
/**
* Performs some operation on a glTF hierarchy.
*
* @param {Object} gltf An object holding a glTF hierarchy.
* @param {Object} [options] Defines more specific behavior for this stage.
*
* @returns {Promise} A promise that resolves when the stage completes.
*/
function stage(gltf, options) {
...
}
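As a sketch, a concrete stage following this template could look like the following (the stage name and the `verbose` option are invented for illustration):

```javascript
// Hypothetical stage following the proposed template. The stage name and
// the 'verbose' option are made up for illustration only.
function exampleStage(gltf, options) {
  options = options || {};
  var verbose = options.verbose || false;
  return new Promise(function (resolve) {
    // ... perform some operation on the glTF hierarchy ...
    if (verbose) {
      console.log('exampleStage complete');
    }
    resolve(gltf);
  });
}
```

Because every stage takes `(gltf, options)` and returns a promise, stages can be chained uniformly, e.g. `stageA(gltf).then(function (gltf) { return stageB(gltf); })`.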
For #130:
I would classify the following as "stages"; they should be converted to use this template. Anyone can feel free to do them in any order; it doesn't need to be one big pull request.
Here is a rough list of tasks for bringing this repo up to production. Feel free to edit or add new items to the list:
Rather than creating the options object with `{}` in a lot of places, just do it the Cesium way and create it if it's not defined with `options = defaultValue(options, defaultValue.EMPTY_OBJECT);`.
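For reference, a minimal standalone version of that Cesium pattern (in the real code, `defaultValue` comes from the cesium package; `someFlag` below is illustrative):

```javascript
// Minimal standalone sketch of Cesium's defaultValue helper.
function defaultValue(a, b) {
  if (a !== undefined && a !== null) {
    return a;
  }
  return b;
}
defaultValue.EMPTY_OBJECT = Object.freeze({});

// A stage can then normalize its options in one line:
function someStage(gltf, options) {
  options = defaultValue(options, defaultValue.EMPTY_OBJECT);
  var someFlag = defaultValue(options.someFlag, false); // 'someFlag' is illustrative
  return gltf;
}
```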
Especially when gltf-pipeline is run on a server, we'll want to be able to provide progress events to users so they know how much time is left, especially for slow stages like AO baking.
Currently, when loading URIs, if a file is missing, the pipeline throws an error. If the missing file is a texture, the reference should remain the same and perhaps a warning should be printed.
Duplicate: see #12
Build is currently based on browserify which seems to have issues with some of our code dependencies. @lasalvavida suggested it may be an issue he encountered where browserify can't handle dynamically loaded dependencies.
"build": "browserify index.js --standalone gltfPipeline -o build/gltf-pipeline.js",
Richard@Richard-PC /cygdrive/c/Users/Richard/src/gltf-pipeline
$ npm run build
> [email protected] build C:\Users\Richard\src\gltf-pipeline
> browserify index.js --standalone gltfPipeline -o build/gltf-pipeline.js
Error: Cannot find module './process' from 'C:\Users\Richard\src\gltf-pipeline\node_modules\cesium\node_modules\requirejs\bin'
at C:\Users\Richard\AppData\Roaming\npm\node_modules\browserify\node_modules\resolve\lib\async.js:55:21
at load (C:\Users\Richard\AppData\Roaming\npm\node_modules\browserify\node_modules\resolve\lib\async.js:69:43)
at onex (C:\Users\Richard\AppData\Roaming\npm\node_modules\browserify\node_modules\resolve\lib\async.js:92:31)
at C:\Users\Richard\AppData\Roaming\npm\node_modules\browserify\node_modules\resolve\lib\async.js:22:47
at FSReqWrap.oncomplete (fs.js:82:15)
npm ERR! Windows_NT 6.1.7601
npm ERR! argv "C:\\Program Files\\nodejs\\node.exe" "C:\\Program Files\\nodejs\\node_modules\\npm\\bin\\npm-cli.js" "run" "build"
npm ERR! node v4.2.4
npm ERR! npm v2.14.12
npm ERR! code ELIFECYCLE
npm ERR! [email protected] build: `browserify index.js --standalone gltfPipeline -o build/gltf-pipeline.js`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the [email protected] build script 'browserify index.js --standalone gltfPipeline -o build/gltf-pipeline.js'.
npm ERR! This is most likely a problem with the gltf-pipeline package,
npm ERR! not with npm itself.
npm ERR! Tell the author that this fails on your system:
npm ERR! browserify index.js --standalone gltfPipeline -o build/gltf-pipeline.js
npm ERR! You can get their info via:
npm ERR! npm owner ls gltf-pipeline
npm ERR! There is likely additional logging output above.
npm ERR! Please include the following file with any support request:
npm ERR! C:\Users\Richard\src\gltf-pipeline\npm-debug.log
For a test model and discussion, see CesiumGS/cesium#1754
For the unit tests, please make a few test models to cover all the cases instead of reusing that model. Note that we originally supported DAGs in glTF, but the final spec does not, so we need to convert it to a tree as part of the pipeline.
I did not implement this, but here are my notes, which I'm pretty confident will work:
Given a Directed Acyclic Graph (in this case, a glTF node hierarchy), how do we convert it into a tree such that nodes with multiple incident edges are duplicated to only have one incident edge? This is used for converting a 3D model data structure to a data structure used for rendering, e.g., to compute transforms based on each node's ancestors.
Assuming each node has a `visited` property, initialize it to `false` for all nodes. Traverse the graph breadth-first. If a node's `visited` property is `true`, duplicate it and the subgraph it is the root of (all nodes in this subgraph should have `visited === false`), and update its parent's pointer. Otherwise, set `visited` to `true`.

For example, `D` is visited twice below:

```
  A    // Pretend these have downward-facing arrows
 / \
B   C
 \ /
  D
```
And the converted tree is:

```
  A
 / \
B   C
|   |
D   D1   // D1 is the only copy made
```
In the next graph, `G` is visited four times:

```
  A
 / \
B   C
 \ /
  D
 / \
E   F
 \ /
  G
```
The converted tree is:

```
    A
   / \
  B   C
  |   |
  D   D1
 / \  / \
E   F E1 F1
|   | |   |
G  G1 G2  G3
```
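A sketch of those notes, assuming nodes of the form `{ name, children }` (the numeric suffixes on copies follow the diagrams, though copies may attach at different positions):

```javascript
// Sketch of the duplication scheme described above. `visited` flags start
// absent (falsy), and duplicated subgraphs get numeric name suffixes.
var copyCount = {};

function duplicateSubgraph(node) {
  var n = (copyCount[node.name] || 0) + 1;
  copyCount[node.name] = n;
  return {
    name: node.name + n,
    children: node.children.map(duplicateSubgraph)
  };
}

function dagToTree(root) {
  var queue = [root];
  while (queue.length > 0) {
    var node = queue.shift();
    node.children = node.children.map(function (child) {
      if (child.visited) {
        // Reached via another parent already: duplicate the whole subgraph
        // (its nodes are all unvisited copies) and repoint this parent.
        return duplicateSubgraph(child);
      }
      child.visited = true;
      queue.push(child);
      return child;
    });
  }
  return root;
}
```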
How to reproduce:
./bin/gltf-pipeline.js ... --ao
This actually happens if you put any flag without a value last. But it fails on `--ao` because the typeof check in `bin/gltf-pipeline.js` doesn't work, so the flag just gets assigned to be the path.
As discussed offline, a lot of potentially useful gltf-pipeline helpers aren't being exposed right now. We can keep a running list here. Some ideas for exposing them:
- Add a `utils.js` file to gltf-pipeline that exposes useful utils, to avoid clutter in `index.js`.
Wishlist:
How do we get the MBR from a glTF? The MBR's format is below:
"root": {
"boundingVolume": {
"region": [
-1.2960028825819643,
0.7068325938171807,
-1.286308749045535,
0.7141023380151126,
-11.892070104139751,
547.7591871983744
]
)
To build on #157, most doc need
This was suggested for obj2gltf, but it would be nice to support it here for multiple converters to use.
@leerichard42 this is something we could consider writing and submitting in the fall if you are interested. JCGT is a great pragmatic journal. The paper would need to be very data focused, showing before/after size/performance for various models with various pipeline stages.
Use Cesium's attribute compression to pack vec3 normals and vec2 texture coordinates into vec2 and float types respectively.
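In Cesium this lives in `AttributeCompression`; a self-contained sketch of the underlying octahedral encoding for unit normals (vec3 to two values in [-1, 1]) looks like:

```javascript
// Self-contained sketch of octahedral normal encoding, the technique behind
// Cesium's AttributeCompression. A unit vec3 maps to two values in [-1, 1].
function signNotZero(v) {
  return v >= 0.0 ? 1.0 : -1.0;
}

function octEncode(x, y, z) {
  var invL1 = 1.0 / (Math.abs(x) + Math.abs(y) + Math.abs(z));
  var u = x * invL1;
  var v = y * invL1;
  if (z < 0.0) {
    // Fold the lower hemisphere over the diagonals.
    var pu = u;
    var pv = v;
    u = (1.0 - Math.abs(pv)) * signNotZero(pu);
    v = (1.0 - Math.abs(pu)) * signNotZero(pv);
  }
  return [u, v];
}

function octDecode(u, v) {
  var x = u;
  var y = v;
  var z = 1.0 - (Math.abs(u) + Math.abs(v));
  if (z < 0.0) {
    var px = x;
    var py = y;
    x = (1.0 - Math.abs(py)) * signNotZero(px);
    y = (1.0 - Math.abs(px)) * signNotZero(py);
  }
  var len = Math.sqrt(x * x + y * y + z * z);
  return [x / len, y / len, z / len];
}
```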
From the #1 roadmap.
- Since we already use `bluebird`, there's no reason to also depend on `promise`.
- `fs-extra` means we don't need `mkdirp`.
- `underscore`
- `object-values` is literally 11 lines of code and provides no useful benefit.
- `datauri` usage is unnecessary and can be replaced with one line of code.

There might be more, but this is good low-hanging fruit for anyone looking to contribute.
Details: #32 (comment)
Potential workaround: #32 (comment)
Passing a glTF with custom attributes that aren't `VEC2`, `VEC3`, or `VEC4` currently breaks the pipeline. Example model.

Part of the problem is `packArray` in `gltfPrimitiveToCesiumGeometry`, but it might also be elsewhere. Since `packArray` uses `readAccessor`, it should handle every type that `readAccessor` can process.
`removeUnusedVertices`, and by proxy `mergeDuplicateVertices`, are a major speed bottleneck in the main pipeline right now.

My impression is that `removeUnusedVertices` is so slow because `mergeDuplicateVertices` ends up creating lots of holes in the buffer that need to be closed. In order to close those holes, the entire buffer has to be moved over, which can be very time-consuming for large models.

This could be faster by dividing the buffer up into chunks, closing the gaps in the smaller chunks, then concatenating the used parts of the chunks. It may also be faster to iteratively read through and copy chunks into a new buffer, though that is probably less memory efficient.
I can do these at the code sprint if we don't get to them sooner.
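The copy-into-a-new-buffer idea can be sketched as follows (a hypothetical helper, not the repo's actual API): only the used byte ranges are copied, so each hole costs one small copy instead of shifting the entire tail of the buffer.

```javascript
// Sketch: compact a buffer by copying only the used ranges into a new
// buffer. `usedRanges` is a sorted, non-overlapping list of [start, end)
// byte ranges still in use. Returns the compacted buffer plus a map from
// old offsets to new offsets, for fixing up bufferViews afterwards.
function compactBuffer(buffer, usedRanges) {
  var totalLength = usedRanges.reduce(function (sum, r) {
    return sum + (r.end - r.start);
  }, 0);
  var compacted = Buffer.alloc(totalLength);
  var offsets = new Map(); // old start offset -> new offset
  var writeOffset = 0;
  for (var i = 0; i < usedRanges.length; i++) {
    var range = usedRanges[i];
    buffer.copy(compacted, writeOffset, range.start, range.end);
    offsets.set(range.start, writeOffset);
    writeOffset += range.end - range.start;
  }
  return { buffer: compacted, offsets: offsets };
}
```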
Move `OptimizationStatistics.prototype.print` to a function in `bin/gltf-pipeline.js` since it doesn't make sense in the context of the browserify build.

https://github.com/AnalyticalGraphicsInc/gltf-pipeline/blob/master/lib/removeUnusedImages.js#L22-L42

This will require replacing properties like `stats.numberOfImagesRemoved` with something like `stats.numberRemoved['image']`.
Also remove it as a dependency in package.json.
See the discussion at #151 (comment) for details.
In short, we've decided not to use underscore in this project.
When running a billboard model through the pipeline, I noticed that when `combineNodes` runs, the processed model is incorrect. Is it because the root node has a non-identity matrix?
I've been noticing a pattern where glTFs that don't have shaders/techniques do not load in Cesium when converted with quantization enabled. There must be some interaction going on between these two steps. @lasalvavida can you take a look into this?
test/dragon-no-shaders.gltf -q
Testing the gltfPipeline with more involved models than the CesiumBox, I found that `removeUnusedVertices` seems to be removing all primitives except the main primitive of the model (is my terminology correct here? Still getting a feel for glTF).
See attached images
This causes all models coming out of the pipeline to fail validation using the glTF-Validator.
Brought up in CesiumGS/obj2gltf#11
In Node v0.10.42, the `Array.prototype.fill` function, likely among other things, is not supported. If it's not too much trouble, we should support these older versions since they are the latest in some package managers.
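A trivial fallback, assuming we only need the simple fill-the-whole-array case:

```javascript
// Portable replacement for Array.prototype.fill for older Node versions.
// Only covers the fill-the-entire-array case.
function fillArray(array, value) {
  for (var i = 0; i < array.length; i++) {
    array[i] = value;
  }
  return array;
}
```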
I found out that if you run the pipeline on a model that has accessors with `byteStride` set to `0` but `type` set to `VEC3`, `compressBuffers` in `removeUnusedVertices` strips out too much information and causes an `indexOutOfBounds` error soon after.

From my reading of the glTF spec (this part), it seems like the pipeline should be able to infer from `"byteStride": 0` and `"type": "VEC3"` that the data is tightly packed vec3s.
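Under that reading, inferring the effective stride is a small calculation (a sketch using the glTF/WebGL componentType enums; not the repo's actual code):

```javascript
// Sketch: compute the effective stride of a glTF 1.0 style accessor.
// A byteStride of 0 means the data is tightly packed, so the stride is
// the size of one element.
var COMPONENT_BYTES = { 5120: 1, 5121: 1, 5122: 2, 5123: 2, 5125: 4, 5126: 4 };
var TYPE_COMPONENTS = { SCALAR: 1, VEC2: 2, VEC3: 3, VEC4: 4, MAT2: 4, MAT3: 9, MAT4: 16 };

function effectiveByteStride(accessor) {
  if (accessor.byteStride > 0) {
    return accessor.byteStride;
  }
  return COMPONENT_BYTES[accessor.componentType] * TYPE_COMPONENTS[accessor.type];
}
```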
This is not defined in the glTF specification
The model in question is the chinese dragon model from here.
I ran it through obj2gltf to produce this valid gltf.
`removeUnusedVertices` outputs this.
Running a glTF model without indices through the pipeline currently crashes at the `cacheOptimization` stage.
I tried to convert a simple OBJ model to glTF, using https://github.com/AnalyticalGraphicsInc/OBJ2GLTF
The resulting fragment shader code contained the line

```
vec3 l = normalize(czm_sunDirectionEC);
```

but the variable `czm_sunDirectionEC` has not been declared at this point. After a bit of tracing and backward searching, it appears that this variable is inserted into the shader by the gltf-pipeline, here:

https://github.com/AnalyticalGraphicsInc/gltf-pipeline/blob/master/lib/modelMaterialsCommon.js#L417

(At least this variable name appears nowhere else, and I can't imagine how the shader constructed there could be valid when this line is inserted.)
Probably related to #83
buffer.js:835
    throw new TypeError('value is out of bounds');
    ^
TypeError: value is out of bounds
at checkInt (buffer.js:835:11)
at Buffer.writeUInt16LE (buffer.js:893:5)
at mergeAccessors (C:\Users\rtaglang\AGI\git\gltf-pipeline\lib\combinePrimitives.js:233:36)
at combinePrimitiveGroup (C:\Users\rtaglang\AGI\git\gltf-pipeline\lib\combinePrimitives.js:91:36)
at combineMeshPrimitives (C:\Users\rtaglang\AGI\git\gltf-pipeline\lib\combinePrimitives.js:49:33)
at combinePrimitives (C:\Users\rtaglang\AGI\git\gltf-pipeline\lib\combinePrimitives.js:19:43)
at processJSONWithExtras (C:\Users\rtaglang\AGI\git\gltf-pipeline\lib\gltfPipeline.js:49:5)
at C:\Users\rtaglang\AGI\git\gltf-pipeline\lib\gltfPipeline.js:63:9
at C:\Users\rtaglang\AGI\git\gltf-pipeline\lib\readGltf.js:42:13
at C:\Users\rtaglang\AGI\git\gltf-pipeline\lib\loadGltfUris.js:18:9
Currently, it just reads the values off of the buffer, even though the normals might be oct-encoded, the texture coordinates might be compressed, or any attribute may be quantized.
This creates bad geometry data and as a result, processes like ao that depend on geometry will produce bad results or fail entirely.
If you try to run ao on a quantized box model, the pipeline hangs.
`JOINT` attributes, for example, are integer values that are currently stored as floats in a lot of models. A `compressIntegerAttributes` stage could be written that checks whether an accessor is marked as `float` but only contains integers, then chooses a suitable int size (and sign) to contain the range.
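Choosing the int size from the value range could look like this (a sketch; the constants are the glTF/WebGL componentType enums):

```javascript
// Sketch: pick the smallest glTF componentType that can hold every value
// in [min, max]. Returns undefined if no byte/short type fits.
var BYTE = 5120;
var UNSIGNED_BYTE = 5121;
var SHORT = 5122;
var UNSIGNED_SHORT = 5123;

function chooseIntegerComponentType(min, max) {
  if (min >= 0) {
    if (max <= 255) return UNSIGNED_BYTE;
    if (max <= 65535) return UNSIGNED_SHORT;
  } else {
    if (min >= -128 && max <= 127) return BYTE;
    if (min >= -32768 && max <= 32767) return SHORT;
  }
  return undefined; // range too large; leave as float
}
```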
`combineNodes` doesn't oct-decode or unquantize the existing normals before applying transforms to them. One solution is to do some checks in `combineNodes` to decode them, transform, then re-encode. Another way could be to store the decoded normals in the extras when the glTF is loaded.
@lasalvavida could you check this out?
For the below model some nodes may be combined unnecessarily. Is this a bug or just a side effect of the current optimization pipeline?
test.gltf.txt
test-optimized.gltf.txt
More discussion here: https://groups.google.com/forum/#!topic/cesium-dev/2LStvibUv2Y
The pipeline should be able to analyze an image to check whether it has transparency, and if so, edit the techniques for alpha blending. In the obj2gltf project I took a simple approach to this here: https://github.com/AnalyticalGraphicsInc/OBJ2GLTF/blob/master/lib/image.js. However, there are also many PNGs that have alpha channels but don't actually use them, so that approach alone wouldn't work.
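One way around the unused-alpha-channel problem is to scan the actual pixel values rather than only checking for the channel's presence. A minimal sketch, assuming decoded RGBA pixel data (e.g. from a PNG decoder):

```javascript
// Sketch: decide whether an image actually uses transparency by scanning
// the alpha channel of decoded RGBA data, rather than only checking that
// an alpha channel exists.
function usesTransparency(rgbaPixels) {
  for (var i = 3; i < rgbaPixels.length; i += 4) {
    if (rgbaPixels[i] < 255) {
      return true;
    }
  }
  return false;
}
```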
Using a model that I ran through the obj2gltf converter, I'm seeing an issue where the `removeUnusedVertices` stage causes a buffer view to have a negative `byteLength` in the optimized glTF. This seemed to happen before #64 as well.
Here's the model I'm using:
@lasalvavida can you investigate this?
- Reuse `TEX_COORD` instead of a new varying (`VEC4` positions, `VEC3`/`VEC4` normals)

`-o example.glb` should output Binary glTF, not regular glTF with a `.glb` extension.
Throw an exception if the `-b` option is provided and the extension doesn't match.
When the output of `gltfPipeline.js` is JSON and not a file, the `extras._pipeline.source` contents are not written back to the glTF. We should call `writeSource` in `processFile` and `processJSON`.
@JoshuaStorm Do you want to look into this?
This is something we didn't really account for much when originally writing the pipeline, and something that I have tried to be better about more recently.

Particularly, `AccessorReader` should be used where possible instead of `readAccessor`. The difference is that `AccessorReader` reads accessor elements one by one, while `readAccessor` reads the entire accessor at once. Sometimes that is necessary, when a whole view of the accessor data is required, but if we're operating on an accessor one element at a time, it would be best not to allocate space for all of those elements.
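A hypothetical sketch of the element-at-a-time pattern (the real `AccessorReader`/`readAccessor` APIs live in this repo and may differ): computing a per-component minimum this way never materializes the whole accessor in memory.

```javascript
// Sketch: stream over accessor elements one at a time, reusing a single
// scratch element, instead of reading the entire accessor into an array.
// `readElement(index, out)` fills `out` in place, AccessorReader-style.
function minPerComponent(readElement, elementCount, componentsPerElement) {
  var min = [];
  var element = new Array(componentsPerElement);
  for (var j = 0; j < componentsPerElement; j++) {
    min.push(Number.POSITIVE_INFINITY);
  }
  for (var i = 0; i < elementCount; i++) {
    readElement(i, element);
    for (var k = 0; k < componentsPerElement; k++) {
      min[k] = Math.min(min[k], element[k]);
    }
  }
  return min;
}
```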
accessor
buffer
bufferView
camera
image
material
mesh
node
program
sampler
shader
skin
technique
texture
Move `modelMaterialsCommon` from obj2gltf and Cesium to here.
For example, the CesiumMilkTruck.png from the glTF sample models is 902 KB.
If you run it through Jimp and re-encode it, it ends up as 1,441 KB.
This is obviously not ideal, and it would also be useful to be able to optimize texture images for the web in general.
I did take a brief look at imagemin this morning, but I was getting a failure that I think is probably due to AGI's firewall. The imagemin-optipng module has a vendor folder on git with native executables in it that is missing when I get it as an npm module.
Imagemin seems to be the most widely used solution for this, but if we can avoid it, I'd really rather not depend on node modules that have native dependencies.
We often stash data in a glTF object's `extras` object, e.g.,

bufferView.extras.id = // ...

This is the right approach, but it could overwrite application-specific data stored in `extras` if an app's content pipeline adds metadata to `extras` objects before using the glTF pipeline. To avoid this, let's create and use a `_pipeline` sub-object, e.g.,

bufferView.extras._pipeline.id = // ...

This could also simplify cleanup: we could have a final stage that walks all glTF objects and deletes `extras._pipeline`, and also deletes `extras` if removing `extras._pipeline` makes it empty. This way we would not need a lot of custom cleanup code in each stage (an exception will be if a large amount of data is stored in `extras._pipeline`; we would want to delete that right away).
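That final cleanup stage could be a simple recursive walk (a sketch under the assumptions above, not the repo's actual code):

```javascript
// Sketch of the proposed final cleanup stage: recursively walk the glTF
// and delete extras._pipeline, removing extras entirely if that leaves it
// empty, so application-specific extras survive untouched.
function removePipelineExtras(object) {
  if (object === null || typeof object !== 'object') {
    return;
  }
  if (object.extras && typeof object.extras === 'object') {
    delete object.extras._pipeline;
    if (Object.keys(object.extras).length === 0) {
      delete object.extras;
    }
  }
  var keys = Object.keys(object);
  for (var i = 0; i < keys.length; i++) {
    removePipelineExtras(object[keys[i]]);
  }
}
```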
While testing my own branch, I have found a model that is mangled in my branch as well as master:
Quickly commenting out each step and checking the results shows that the model is getting mangled by the `combinePrimitives` step.
@leerichard42 let's do this next. It is a great performance optimization since it reduces the number of draw calls required at runtime and some exporters can generate really sub-optimal models. For example, I've seen the SketchUp export COLLADA where each triangle needed a separate draw call! After running an optimization like this, the model only needed one draw call.
- Combine `primitives` in a node (if they have the same attribute formats, material, and mode; this will also require combining accessors/bufferViews/buffers)
- Combine `meshes` in a node

@tfili any additional advice here?
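The "same attribute formats, material, and mode" condition can be captured by grouping primitives under a compatibility key before merging (a sketch; the key construction here is illustrative):

```javascript
// Sketch: group a mesh's primitives by a compatibility key (same mode,
// material, and attribute semantics) so each group can later be merged
// into a single primitive, and therefore a single draw call.
function groupCombinablePrimitives(primitives) {
  var groups = new Map();
  for (var i = 0; i < primitives.length; i++) {
    var primitive = primitives[i];
    var key = JSON.stringify([
      primitive.mode,
      primitive.material,
      Object.keys(primitive.attributes).sort()
    ]);
    if (!groups.has(key)) {
      groups.set(key, []);
    }
    groups.get(key).push(primitive);
  }
  return Array.from(groups.values());
}
```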
Fails when run through npm, passes when run directly.
Richard@Richard-PC /cygdrive/c/Users/Richard/src/gltf-pipeline
$ npm run coverage
> [email protected] coverage C:\Users\Richard\src\gltf-pipeline
> ./node_modules/istanbul/lib/cli.js cover -x **/specs/** ./node_modules/jasmine-node/bin/jasmine-node specs
'.' is not recognized as an internal or external command,
operable program or batch file.
npm ERR! Windows_NT 6.1.7601
npm ERR! argv "C:\\Program Files\\nodejs\\node.exe" "C:\\Program Files\\nodejs\\node_modules\\npm\\bin\\npm-cli.js" "run" "coverage"
npm ERR! node v4.2.4
npm ERR! npm v2.14.12
npm ERR! code ELIFECYCLE
npm ERR! [email protected] coverage: `./node_modules/istanbul/lib/cli.js cover -x **/specs/** ./node_modules/jasmine-node/bin/jasmine-node specs`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the [email protected] coverage script './node_modules/istanbul/lib/cli.js cover -x **/specs/** ./node_modules/jasmine-node/bin/jasmine-node specs'.
npm ERR! This is most likely a problem with the gltf-pipeline package,
npm ERR! not with npm itself.
npm ERR! Tell the author that this fails on your system:
npm ERR! ./node_modules/istanbul/lib/cli.js cover -x **/specs/** ./node_modules/jasmine-node/bin/jasmine-node specs
npm ERR! You can get their info via:
npm ERR! npm owner ls gltf-pipeline
npm ERR! There is likely additional logging output above.
npm ERR! Please include the following file with any support request:
npm ERR! C:\Users\Richard\src\gltf-pipeline\npm-debug.log
Richard@Richard-PC /cygdrive/c/Users/Richard/src/gltf-pipeline
$ ./node_modules/istanbul/lib/cli.js cover -x **/specs/** ./node_modules/jasmine-node/bin/jasmine-node specs
....................
Finished in 0.046 seconds
20 tests, 62 assertions, 0 failures, 0 skipped
=============================================================================
Writing coverage object [C:\Users\Richard\src\gltf-pipeline\coverage\coverage.json]
Writing coverage reports at [C:\Users\Richard\src\gltf-pipeline\coverage]
=============================================================================
=============================== Coverage summary ===============================
Statements : 73.41% ( 243/331 )
Branches : 48.63% ( 89/183 )
Functions : 91.3% ( 21/23 )
Lines : 73.41% ( 243/331 )
================================================================================