Comments (13)
The way it was solved in the Nodes ECS was that RenderTexture and CameraSystem were parents, duplicated per viewport with their own render systems below. I could then have the PBR renderer in the left viewport and the basic renderer in the right viewport.
Currently, even if I make two cameras and two viewports, and use tags to have PBR entities on the left and cloned entities tagged with unlit=true on the right, the helper system would need to draw after those two, twice, and "on top of the viewports". How would the depth buffer be shared? And what about combining a deferred PBR renderer with a forward helpers renderer?
from pex-renderer.
One idea that I wanted to do in Nodes but never had time for was to decouple the Renderer from materials/techniques. I could then have rendering technique "providers" (PBR, ThickLines, UnlitWithShadows) upstream or as inputs, and a RenderExecutionSystem that would use the entities and the appropriate rendering technique providers to draw into the current viewport.
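A minimal sketch of what such decoupling could look like. Everything here is hypothetical (`createRenderExecutionSystem`, the `technique` field on entities, the provider shape) and not part of pex-renderer or Nodes:

```javascript
// Sketch only: hypothetical API, not the actual pex-renderer or Nodes code.

// A technique provider knows how to draw the entities that opt into it.
const pbrProvider = {
  technique: "pbr",
  draw: (entities) => entities.map((e) => `pbr:${e.name}`),
};
const unlitProvider = {
  technique: "unlit",
  draw: (entities) => entities.map((e) => `unlit:${e.name}`),
};

// The execution system only routes entities to providers; it owns no shaders.
function createRenderExecutionSystem(providers) {
  const byTechnique = new Map(providers.map((p) => [p.technique, p]));
  return {
    update({ entities }) {
      const drawCalls = [];
      for (const [technique, provider] of byTechnique) {
        const batch = entities.filter((e) => e.technique === technique);
        if (batch.length) drawCalls.push(...provider.draw(batch));
      }
      return drawCalls;
    },
  };
}

const renderExec = createRenderExecutionSystem([pbrProvider, unlitProvider]);
const calls = renderExec.update({
  entities: [
    { name: "helmet", technique: "pbr" },
    { name: "gizmo", technique: "unlit" },
  ],
});
console.log(calls); // → ["pbr:helmet", "unlit:gizmo"]
```

The point of the shape is that adding a new look is registering another provider, not touching the execution system.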
Currently pex-renderer@3's systems.renderer draws all active cameras one by one (hoping they have different viewports). Maybe it should actually be the other way around, with the camera view calling a renderer 🤔...
Something something render graphs
- Entity-Component-Systems and Rendering [WebArchive] (screenshot is from here)
- High-Level Rendering Using Render Graphs [WebArchive]
Maybe Related
- https://ourmachinery.com/post/borderland-between-rendering-and-editor-part-1/
- https://ourmachinery.com/post/borderland-part-2-picking/
- https://ourmachinery.com/post/borderland-part-3-selection-highlighting/
ThreeJS manually calls renderer.render(scene, camera) 3x https://threejs.org/examples/webgl_multiple_views.html
The need for a render graph is there even before multiple views. I'm not sure how to avoid ending up with a generalized graph library not much different from Nodes itself. The main challenge is still having a way for both a PBR Renderer and a ScreenSpaceLineRenderer (and a ParticlesRenderer and an SDFRenderer) to contribute to e.g. the same shadow map.
The way OurMachinery was doing it is to specify attachment points where more passes can be added before final graph execution. In the graph below those would be the GBuffer Pass, the Shadowmap Pass, and the Depth Prepass.
Green - passes from graph / rendering algorithm
Purple - passes from systems
Blue - textures
Graph made in knotend
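A toy version of that attachment-point idea, with made-up names (`createFrameGraph`, `attachTo`); the real OurMachinery API is more involved:

```javascript
// Sketch only: hypothetical attachment-point API, not OurMachinery's real one.

function createFrameGraph() {
  // Fixed passes from the rendering algorithm (the "green" nodes above).
  const passes = [
    { name: "Depth Prepass", extensions: [] },
    { name: "Shadowmap Pass", extensions: [] },
    { name: "GBuffer Pass", extensions: [] },
    { name: "Lighting Pass", extensions: [] },
  ];
  return {
    // Systems attach extra work to a named pass before execution
    // (the "purple" nodes contributed by systems).
    attachTo(anchorName, pass) {
      const anchor = passes.find((p) => p.name === anchorName);
      if (!anchor) throw new Error(`No attachment point: ${anchorName}`);
      anchor.extensions.push(pass);
    },
    execute() {
      const log = [];
      for (const pass of passes) {
        log.push(pass.name);
        for (const ext of pass.extensions) log.push(`  + ${ext.name}`);
      }
      return log;
    },
  };
}

const graph = createFrameGraph();
// e.g. a particle system adds its own shadow casters to the shadowmap pass
graph.attachTo("Shadowmap Pass", { name: "Particle Shadows" });
console.log(graph.execute());
```

This is how two renderers could contribute to the same shadow map: both attach to the same named pass instead of owning it.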
Would an API like that be acceptable? That doesn't even touch the RenderGraph yet. If you want to start customizing things or have an advanced use case, things will get "manual" pretty fast. I wonder how much of that should be hidden? We could even remove world.addSystem/world.update completely.
// can't just automatically render all the systems as we have two views
// world.update();
geometrySys.update()
transformSys.update()
cameraSys.update()
skyboxSys.update()

// draw left side, debug view
// no clue how to pass the camera here; in the Nodes ECS we pass entities manually
// and can therefore filter / select the camera before the list reaches the renderer
view1.draw(() => {
  rendererSys.update({ debugRender: 'directLightingOnly' })
  helperSys.update()
})

// draw right side
view2.draw(() => {
  rendererSys.update()
})
It kind of leads to the conclusion that there is no world and you just pass an entities list around, for maximum flexibility and compatibility with Nodes.
const entities = []
entities.push({ ... })
geometrySys.update({ entities })
transformSys.update({ entities })
Or you keep the world and pass it instead of the entities list, even though there is nothing more in there ATM.
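For comparison, a sketch of both call styles side by side, with stand-in systems (nothing here is the real pex-renderer API):

```javascript
// Sketch only: stand-in systems illustrating the two call styles.

function createTransformSystem() {
  return {
    // Style 1: the system receives a bare entities array.
    update(entities) {
      for (const e of entities) e.worldPosition = e.position ?? [0, 0, 0];
    },
  };
}

// Style 2: a thin "world" wrapper that currently holds nothing but the list,
// leaving room to later add shared state (time, gl context, resource caches).
function createWorld() {
  const world = { entities: [] };
  world.add = (entity) => (world.entities.push(entity), entity);
  world.update = (systems) => systems.forEach((s) => s.update(world.entities));
  return world;
}

const transformSys = createTransformSystem();

// Style 1: bare list
const entitiesList = [{ position: [1, 2, 3] }];
transformSys.update(entitiesList);

// Style 2: world wrapper delegating to the same systems
const world = createWorld();
world.add({ position: [1, 2, 3] });
world.update([transformSys]);
console.log(world.entities[0].worldPosition); // → [1, 2, 3]
```

The wrapper costs almost nothing now, but keeps a place to hang shared state later without changing every system's signature.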
Actual working example
const view1 = createView([0, 0, 0.5, 1]);
const view2 = createView([0.5, 0, 0.5, 1]);
...
geometrySys.update(entities);
transformSys.update(entities);
skyboxSys.update(entities);
//draw left side, debug view
view1.draw((view) => {
  const aspect = view.viewport[2] / view.viewport[3];
  entities
    .filter((e) => e.camera)
    .forEach((e) => {
      e.camera.aspect = aspect;
      e.camera.dirty = true;
    });
  cameraSys.update(entities);
  rendererSys.update(entities);
  helperSys.update(entities);
});
//draw right side
view2.draw((view) => {
  const aspect = view.viewport[2] / view.viewport[3];
  entities
    .filter((e) => e.camera)
    .forEach((e) => {
      e.camera.aspect = aspect;
      e.camera.dirty = true;
    });
  cameraSys.update(entities);
  rendererSys.update(entities);
});
This is very much a library approach rather than a framework, but I guess that's a good thing?
Could that be abstracted into a view system then?
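One possible shape for such a view system, sketched with a hypothetical `createViewSystem` and fake systems; it just moves the per-view camera fix-up from the working example above behind a single `draw()` call:

```javascript
// Sketch only: createViewSystem and its options are hypothetical,
// not an actual pex-renderer API.

function createViewSystem({ viewport, systems }) {
  return {
    // viewport = [x, y, width, height] in normalized 0..1 coordinates
    draw(entities, { canvasWidth, canvasHeight }) {
      const pixelWidth = viewport[2] * canvasWidth;
      const pixelHeight = viewport[3] * canvasHeight;
      const aspect = pixelWidth / pixelHeight;
      // Same per-view camera fix-up as the working example, now in one place.
      entities
        .filter((e) => e.camera)
        .forEach((e) => {
          e.camera.aspect = aspect;
          e.camera.dirty = true;
        });
      // Each view owns the systems it runs, so the left view can include
      // helpers while the right view does not.
      systems.forEach((s) => s.update(entities));
      return aspect;
    },
  };
}

// Fake systems, just enough to exercise the flow.
const cameraSys = { update: () => {} };
const rendererSys = { update: () => {} };

const leftView = createViewSystem({
  viewport: [0, 0, 0.5, 1],
  systems: [cameraSys, rendererSys],
});
const entities = [{ camera: { aspect: 1, dirty: false } }];
const aspect = leftView.draw(entities, { canvasWidth: 1280, canvasHeight: 720 });
console.log(aspect); // ≈ 0.89 for the left half of a 1280x720 canvas
```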
Conclusion of the render graphs research:
- nobody knows what they are doing
- Bevy (a Rust/WASM ECS engine) rewrote their renderer 4 times in the last two years; all documentation is out of date
- OurMachinery rewrote their renderer 2-3 times
- both of them ended up with duplicated legacy abstractions doing the same thing in different ways
- neither of them has a single image of a graph / drawing of a real-world scene in their docs, so it's hard to know what they are talking about
- all of them have a CPU scene graph plus a GPU render graph created dynamically
- all of them operate on some Camera/View abstraction that can render a scene but also a shadow map
- all of them have pre-defined Render Phases / Named Passes that renderers/materials can add themselves to, e.g. to render skinned meshes and particles during the ShadowMapRenderPass; that's done using either passes in the renderer or techniques/annotated functions in the shader file itself
- all of them end up with some micro task/job system where you add Jobs or SubGraphs on the fly and execute them later, e.g. to blur a texture
- the Frostbite engine (uses a FrameGraph) and the Destiny game renderer (has a similar RenderGraph) split everything into thousands of pieces (e.g. cloth touches 9 systems / render phases) and are probably overkill
- something something "complex systems are grown, not built", so the best approach would be to keep the renderer as it is, duplicate on-demand resources from rg.* nodes and gl.ResourceCache nodes, add pass dependencies and a pass runner (aka RenderGraph), and start small, e.g. shadow mapping, or a split-screen camera and a single-pass postpro
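The "pass dependencies and pass runner" bullet could start as small as a topological sort over named passes. A sketch of just that, with all names hypothetical:

```javascript
// Sketch only: a minimal pass runner (aka RenderGraph), nothing more than
// dependency-ordered execution of named passes.

function createRenderGraph() {
  const passes = new Map();
  return {
    addPass(name, { dependsOn = [], execute }) {
      passes.set(name, { dependsOn, execute });
    },
    run() {
      const done = new Set();
      const order = [];
      const visit = (name, trail = new Set()) => {
        if (done.has(name)) return;
        if (trail.has(name)) throw new Error(`Cycle at pass: ${name}`);
        trail.add(name);
        const pass = passes.get(name);
        for (const dep of pass.dependsOn) visit(dep, trail);
        done.add(name);
        order.push(name);
        pass.execute();
      };
      for (const name of passes.keys()) visit(name);
      return order;
    },
  };
}

const graph = createRenderGraph();
const log = [];
graph.addPass("shadowmap", { execute: () => log.push("shadowmap") });
graph.addPass("forward", {
  dependsOn: ["shadowmap"],
  execute: () => log.push("forward"),
});
graph.addPass("postpro", {
  dependsOn: ["forward"],
  execute: () => log.push("postpro"),
});
console.log(graph.run()); // → ["shadowmap", "forward", "postpro"]
```

Split-screen or shadow mapping then become a matter of declaring which pass reads which pass's output, rather than hardcoding call order.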
The biggest challenges remain the same:
- using multiple renderers in one view (PBR + thick lines)
- using the same renderer in multiple places (shadowmap + final render)
- reusing a renderer across different views (use PBR to render a spinning cube to a texture, then use PBR to render a room with that texture on the wall)
Started a new issue about render graphs: #315