
Comments (13)

vorg commented on June 16, 2024

The way it was solved in Nodes ECS was that RenderTexture and CameraSystem were parents, duplicated per viewport, with their own render systems below them. I could then have a PBR Renderer in the left viewport and a Basic Renderer in the right viewport.

Currently, even if I make two cameras and two viewports, and use tags to have PBR entities on the left and cloned entities tagged with unlit=true on the right, the helper system would still need to draw after those two, twice, and "on top of the viewports". How would the depth buffer be shared? And what about combining a deferred PBR renderer with a forward helpers renderer?

vorg commented on June 16, 2024

One idea I wanted to do in Nodes but never had time for was to decouple the Renderer from materials/techniques. I could have rendering technique "providers" (PBR, ThickLines, UnlitWithShadows) upstream or as inputs, and then a RenderExecutionSystem that would use the entities and the appropriate technique providers to draw in the current viewport.
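
A minimal sketch of that split, assuming hypothetical provider objects with a filter and a draw function (none of these names are existing pex-renderer API):

// Hypothetical technique providers; each one knows how to draw the entities
// it is responsible for into the current viewport. The draw functions are
// stubs standing in for real draw-call submission.
const drawPBR = (entities, view) => {};
const drawThickLines = (entities, view) => {};
const drawUnlitWithShadows = (entities, view) => {};

const providers = [
  { name: "pbr", filter: (e) => e.material && !e.unlit, draw: drawPBR },
  { name: "thickLines", filter: (e) => e.thickLine, draw: drawThickLines },
  { name: "unlitWithShadows", filter: (e) => e.unlit, draw: drawUnlitWithShadows },
];

// Hypothetical RenderExecutionSystem: it owns no techniques itself, it only
// matches entities to providers and executes them for the given view.
const renderExecutionSys = {
  update({ entities, view, providers }) {
    for (const provider of providers) {
      const subset = entities.filter(provider.filter);
      if (subset.length > 0) provider.draw(subset, view);
    }
  },
};

// Per viewport, with whatever providers that view needs:
// renderExecutionSys.update({ entities, view: view1, providers });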

vorg commented on June 16, 2024

Currently pex-renderer@3's systems.renderer draws all active cameras one by one (hoping they have different viewports). Maybe it should actually be turned upside down, with the camera/view calling a renderer 🤔...

vorg commented on June 16, 2024

Something something render graphs
[Image: ecs-1]

Maybe Related

vorg commented on June 16, 2024

Three.js manually calls renderer.render(scene, camera) 3x: https://threejs.org/examples/webgl_multiple_views.html
[Image: Screenshot 2022-07-15 at 10 25 29]
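
The relevant part of that pattern boils down to a scissored viewport plus one explicit render call per view. A trimmed-down sketch (renderer, scene and the three cameras are assumed to already exist):

// Each view gets its own camera and a normalized rectangle of the canvas.
const views = [
  { left: 0.0, bottom: 0, width: 0.5, height: 1, camera: cameraA },
  { left: 0.5, bottom: 0, width: 0.25, height: 1, camera: cameraB },
  { left: 0.75, bottom: 0, width: 0.25, height: 1, camera: cameraC },
];

function render() {
  for (const view of views) {
    const x = Math.floor(window.innerWidth * view.left);
    const y = Math.floor(window.innerHeight * view.bottom);
    const w = Math.floor(window.innerWidth * view.width);
    const h = Math.floor(window.innerHeight * view.height);

    // Restrict drawing (and clearing) to this view's rectangle.
    renderer.setViewport(x, y, w, h);
    renderer.setScissor(x, y, w, h);
    renderer.setScissorTest(true);

    view.camera.aspect = w / h;
    view.camera.updateProjectionMatrix();

    renderer.render(scene, view.camera); // one render call per view
  }
}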

vorg commented on June 16, 2024

The need for a render graph is there even before multiple views. I'm not sure how to avoid ending up with a generalized graph library not unlike Nodes itself. The main challenge is still having a way for both the PBR Renderer and a ScreenSpaceLineRenderer (and a ParticlesRenderer and an SDFRenderer) to contribute to e.g. the same shadowmap.

The way Our Machinery was doing it is to specify attachment points where more passes can be added before the final graph execution. In the graph below those would be the GBuffer Pass, the Shadowmap Pass and the Depth Prepass (a sketch of such attachment points follows after the graph).

Green - passes from the graph / rendering algorithm
Purple - passes from systems
Blue - textures

[Image: knotend-3 1]

Graph made in knotend
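
A minimal sketch of what such attachment points could look like here, assuming a hypothetical renderGraph object where systems register draw callbacks into named passes before the graph is executed (not an existing pex-renderer API):

// Hypothetical render graph with pre-declared attachment points.
const renderGraph = {
  passes: {
    "depth-prepass": [],
    "shadowmap-pass": [],
    "gbuffer-pass": [],
  },
  // Systems attach their draw callbacks to a named pass.
  addTo(passName, draw) {
    this.passes[passName].push(draw);
  },
  // Final graph execution: run every callback attached to each pass, in order.
  execute(frameState) {
    for (const [passName, draws] of Object.entries(this.passes)) {
      for (const draw of draws) draw(frameState, passName);
    }
  },
};

// The PBR renderer and a particles renderer can both contribute to the
// same shadowmap without knowing about each other.
renderGraph.addTo("shadowmap-pass", (frame) => {
  /* draw opaque PBR meshes into the depth target */
});
renderGraph.addTo("shadowmap-pass", (frame) => {
  /* draw particles as depth-only billboards */
});
renderGraph.execute({ /* cameras, render targets, ... */ });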

vorg commented on June 16, 2024

Would an API like that be acceptable? That doesn't even touch the RenderGraph yet. But if you want to start customizing things or have an advanced use case, things will get "manual" pretty fast. I wonder how much of that should be hidden? We could even remove world.addSystem / world.update completely.

  // can't just automatically render all the systems as we have two views
  // world.update();

  geometrySys.update()
  transformSys.update()
  cameraSys.update()
  skyboxSys.update()
  
  //draw left side, debug view
  //no clue how to pass camera here as in Nodes ECS
  //we pass entities manually and therefore we can filter / select camera before entities list reaches renderer
  view1.draw(() => {
    rendererSys.update({ debugRender: 'directLightingOnly' })
    helperSys.update()
  })

  //draw right side
  view2.draw(() => {
    rendererSys.update()
  })

vorg commented on June 16, 2024

It kind of leads to the conclusion that there is no world and we just pass the entities list around, for maximum flexibility and compatibility with Nodes.

const entities = []
entities.push({ ... })

geometrySys.update({ entities })
transformSys.update({ entities })

Or you keep the world and pass it instead of the entities list, even though there is nothing more in it ATM.
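
The second option would look almost identical, just with one extra level of indirection (a sketch, assuming the world is nothing more than a bag holding the entities array for now):

const world = { entities: [] };
world.entities.push({ /* ... */ });

// Systems receive the whole world; today that only means world.entities,
// but it leaves room for shared per-frame state later without changing
// every system's signature.
geometrySys.update(world);
transformSys.update(world);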

vorg commented on June 16, 2024

[Image: Screenshot 2022-08-04 at 14 53 12]

Actual working example

const view1 = createView([0, 0, 0.5, 1]);
const view2 = createView([0.5, 0, 0.5, 1]);

...

  geometrySys.update(entities);
  transformSys.update(entities);
  skyboxSys.update(entities);

  //draw left side, debug view
  view1.draw((view) => {
    const aspect = view.viewport[2] / view.viewport[3];
    entities
      .filter((e) => e.camera)
      .forEach((e) => {
        e.camera.aspect = aspect;
        e.camera.dirty = true;
      });
    cameraSys.update(entities);
    rendererSys.update(entities);
    helperSys.update(entities);
  });

  //draw right side
  view2.draw((view) => {
    const aspect = view.viewport[2] / view.viewport[3];
    entities
      .filter((e) => e.camera)
      .forEach((e) => {
        e.camera.aspect = aspect;
        e.camera.dirty = true;
      });
    cameraSys.update(entities);
    rendererSys.update(entities);
  });

This is very much a library approach, much more than a framework, but I guess that's a good thing?
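
For reference, the createView helper used above could be as small as this (a sketch assuming a raw WebGL context gl; the actual helper may well look different):

// viewport is a normalized [x, y, width, height] rectangle in 0..1.
function createView(viewport) {
  return {
    viewport,
    draw(cb) {
      const x = Math.floor(viewport[0] * gl.drawingBufferWidth);
      const y = Math.floor(viewport[1] * gl.drawingBufferHeight);
      const w = Math.floor(viewport[2] * gl.drawingBufferWidth);
      const h = Math.floor(viewport[3] * gl.drawingBufferHeight);

      // Clamp all drawing to this view's rectangle.
      gl.viewport(x, y, w, h);
      gl.enable(gl.SCISSOR_TEST);
      gl.scissor(x, y, w, h);

      // Hand the pixel-space viewport to the callback so it can compute aspect.
      cb({ viewport: [x, y, w, h] });

      gl.disable(gl.SCISSOR_TEST);
    },
  };
}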

dmnsgn commented on June 16, 2024

Could that be abstracted in a view system then?

vorg commented on June 16, 2024

Conclusions from the render graphs research:

  • nobody knows what they are doing
  • Bevy (Rust/WASM ECS engine) rewrote their renderer 4 times in the last two years; all documentation is out of date
  • Our Machinery rewrote their renderer 2-3 times
  • Both of them ended up having duplicated legacy abstractions doing the same thing in different ways
  • None of them have a single image of a graph / drawing of a real-world scene in their docs, so it's hard to know what they are talking about
  • All of them have a CPU scene graph + a GPU render graph created dynamically
  • All of them operate on some Camera/View abstraction that can render the scene but also a shadowmap
  • All of them have pre-defined Render Phases / Named Passes that renderers/materials can add themselves to, e.g. to render a skinned mesh and particles in a ShadowMapRenderPass. That's done using either passes in the renderer or techniques/annotated functions in the shader file itself
  • All of them end up with some micro-task job system where you add Jobs or SubGraphs on the fly and execute them later, e.g. to blur a texture
  • The Frostbite engine (which uses a FrameGraph) and the Destiny game renderer (which has a similar RenderGraph) split everything into thousands of pieces (e.g. cloth touches 9 systems / render phases) and are probably overkill
  • Something something "complex systems are grown, not built", so the best would be to keep the renderer as it is, duplicate on-demand resources from the rg.* nodes and gl.ResourceCache nodes, add pass dependencies and a pass runner (aka RenderGraph), and start small, e.g. shadow mapping, or a split-screen camera and a single-pass postprocess (a minimal sketch of such a pass runner follows after this list)
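
As mentioned in the last point, here is a minimal sketch of such a pass runner, assuming passes only declare which named resources they read and write so the runner can order writers before readers (hypothetical names, not an existing API):

// Each pass declares the resources it reads and writes; the runner executes
// a pass only once every pass that writes one of its inputs has run.
function createRenderGraph() {
  const passes = [];
  return {
    addPass(pass) {
      passes.push(pass); // { name, reads: [], writes: [], execute() }
    },
    execute() {
      const done = new Set();
      while (done.size < passes.length) {
        const ready = passes.filter(
          (pass) =>
            !done.has(pass.name) &&
            pass.reads.every((r) =>
              passes.every((p) => !p.writes.includes(r) || done.has(p.name))
            )
        );
        if (ready.length === 0) throw new Error("Cyclic pass dependency");
        for (const pass of ready) {
          pass.execute();
          done.add(pass.name);
        }
      }
    },
  };
}

// e.g. render the shadow map before the forward pass that samples it:
const rg = createRenderGraph();
rg.addPass({ name: "shadow", reads: [], writes: ["shadowMap"], execute: () => {} });
rg.addPass({ name: "forward", reads: ["shadowMap"], writes: [], execute: () => {} });
rg.execute();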

vorg commented on June 16, 2024

The biggest challenges remain the same:

  • using multiple renderers in one view (PBR + thick lines)
  • using the same renderer in multiple places (shadowmap + final render; see the sketch after this list)
  • reusing a renderer across different views (use PBR to render a spinning cube to a texture, then use PBR to render a room with that texture on the wall)
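
For the second and third points, the renderer would essentially need to accept an explicit camera and render target instead of assuming the screen. A sketch reusing the rendererSys and entities names from the earlier example, with a purely hypothetical options argument (not the current API):

// Hypothetical stand-ins for resources, cameras and entities created elsewhere.
const shadowMapTarget = { /* depth texture + framebuffer */ };
const cubeTexture = { /* color texture + framebuffer */ };
const lightCamera = {};
const cubeCamera = {};
const mainCamera = {};
const cubeEntities = [];

// Same renderer, different place: render scene depth into the shadow map
// from the light's point of view.
rendererSys.update(entities, {
  camera: lightCamera,
  target: shadowMapTarget,
  depthOnly: true,
});

// Reuse across views: render the spinning cube into an offscreen texture...
rendererSys.update(cubeEntities, { camera: cubeCamera, target: cubeTexture });

// ...then render the room, whose wall material samples that texture,
// to the default framebuffer.
rendererSys.update(entities, { camera: mainCamera, target: null });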

vorg commented on June 16, 2024

Started a new issue about Render Graphs: #315
