
Comments (17)

MolotovCherry commented on May 20, 2024

@mdrejhon It's switched now to MIT

from virtual-display-rs.

mdrejhon commented on May 20, 2024

@mdrejhon It's switched now to MIT

Fantastique! This just opened maybe three times as many developer options -- in a very difficult "nichengineering" project. Thank you!

Keep in mind this is a self-funded passion project, so a bit of time may pass before I successfully move forward on getting the chess board setup. I will inform you when I do.


mdrejhon commented on May 20, 2024

I'm having difficulty with the labels; it's not letting me label this as "enhancement".

P.S. I have an EV Code Signing Certificate, which I volunteer to use for betas/public-releases of this feature.

EDIT: I sponsored @MolotovCherry $100 to show goodwill -- no commitment to follow through. I'm very happy people are at least playing more often with virtualized display drivers (which the world badly needs, for the reasons above), and it will inspire more and more indies to do the same. We indies want to produce stuff that enhances displays beyond what the manufacturers intended them to do.

You can see how I straddle indies and business -- as a passionate indie-helping-manufacturers, I help manufacturers with some elements of the refresh rate race. Especially since 120-vs-240 is much more visible to the mainstream on OLED than LCD, but there are drawbacks (e.g. strobing/BFI removed from OLED, plus bad ClearType, etc). Some of these pick-your-poisons can be partially solved by a virtualized display driver with temporal filtering/reprocessing capabilities -- which also helps convince manufacturers to add features to the actual hardware! (While, concurrently, niche features continue to be added DIY by end users.)

P.S. Anybody who wants to read the best content I've blogged, see Area 51 on Blur Busters for the curated best articles. A more manageable read than Google Scholar.


MolotovCherry commented on May 20, 2024

Hi!

Thanks for the feature request! Also thanks for the goodwill sponsorship!

This sounds very interesting! I never imagined that GPU shader plugin support in something like this would be a much desired feature.

I will give this a serious look-over when I have some time and see if it's possible for me to implement such a shader system¹. I'm still not sure how many of those items are doable with my current skill-set, but I will look into it. For non-sensitive discussion, I would like communication to stay in this open issue, if that's alright with you.

Question: Are you the only developer, @MolotovCherry?

Yes, I'm the sole developer and owner of this project (all the code was made by me, so there are no loose threads)

Note: It could even be a fork of this project if this is too dramatically different. The original project can remain GPL3; just the fork qualifying for the bounty would need to be under one of the permissive licenses, to support both the indies and the businesses, as per above. You could even backport the Apache/MIT version into your GPL3 fork (one-way compatibility), if you want -- all good.

I'm not opposed to changing my license on the project to a more friendly one, or offering a special re-licensed version which is favorable to the interested party, on a case-by-case basis, if circumstances are favorable to me (of which this is an aforementioned favorable circumstance). My only real concern with my current driver was that I didn't want it to be taken and used with zero credit / sharing of source code, since I put so much hard work into making it.

Footnotes

  1. It would fit well within the GUI I've been developing.
    (screenshot)


mdrejhon commented on May 20, 2024

Thank you for the follow up!
I'll post a few messages to add more information; to help you (and other readers) think.

Purpose / Precedents

Why does the world need this?

Lots of people are doing lots of work in this sphere already! But most of it is FRAME-BASED processing.

There are, however, many algorithms that require REFRESH-CYCLE-BASED processing: processing that runs on every refresh cycle independently of the underlying frame rate, and that also works outside of games...

Existing Precedents of FRAME-BASED processing Systems

  1. The BFI support now built into SpecialK (though it only works inside games, as a Present() hook), using a custom workflow they've created to allow a refresh-cycle-based system to work with a frame-based system;
  2. Frame-based filter systems like ReShade/SweetFX/NVIDIA FreeStyle/etc! But they are not suitable for things that need refresh-cycle granularity, and/or need to run outside games;
  3. Gamescope, built into Steam for Linux, does some frame-based processing too!
  4. Although not a processor (but a monitor controller), some software such as DisplayFusion lets you use hotkeys or command-line options to enable/disable displays in Control Panel, so there's precedent.

Not suitable for REFRESH-CYCLE-BASED processing

Refresh cycle based processing (independent of underlying frame rate) requires a virtual video driver -- like yours. That's why I posted here.

While many of these could serve as plugins for a virtual display driver (to allow filters to be used outside of a game, for the whole Windows desktop) -- unfortunately, the majority of them don't support the algorithms that only work with REFRESH-CYCLE-BASED processing (independently of frame rate).

Some drivers for PCVR headsets are also relevant (some headsets actually connect as displays, but their specialized drivers "hide" them from becoming visible as regular monitors). The Oculus Rift DK1 showed up as a regular display, but by the time of the Oculus Rift CV1, headsets were hidden displays that could only be accessed through Oculus APIs.

I realize you don't have time now, and maybe someone else needs to take the baton, but your software is the closest thing to a code skeleton that could be used for this sort of stuff!


mdrejhon commented on May 20, 2024

Interim "Specs" Discussion / Brainstorm For Temporal Processing

I realize a lot of things need to be clarified, and I'm happy to flesh them out. I don't have the skills to write a Windows driver (I'm not an expert there), but I am full of display-processing ideas! (That's where my expertise lies.)

Some of these things MAY not be possible; this needs to be verified.

Bounty Requirements need to be Spec'd/Simplified

I realize that a lot of thinking is needed to find the simplest way to proceed, because we have a bunch of potential input/output settings. Most of these should be possible, and these values should be accessible to the shader processor.

For example, simulated VRR might be removed from the requirements if VRR is too difficult to implement, with alternative capabilities implemented instead.

Tentative/Unconfirmed Requirements

Loadable temporal shader filter modules

  • Loadable filter modules should be easily importable/exportable packages, preferably 100% text based for easy customizability (ala JSON or INI or TXT or CFG), but I am open to binary formats if easier to implement.
  • A loadable filter module should contain the following (the filter profile could be text-based to allow easy customizing by advanced users):
    • Title (e.g. "Software BFI")
    • Optional prerequisites/constraints on mode metrics
      • Custom warning-league constraints (filter runs degraded, such as stutters)
      • Custom fail-league constraints (can't run filter at all)
      • Example: "Output Hz must be an integer multiple of input Hz" or "hzOutput==2*hzInput"
    • Settings
      • Data type [boolean,integer,float], allowed range of data, and default value
      • Custom data constraint [if possible]
      • Examples: "Black frame count, defined as unsigned integer", "Phosphor decay simulation, defined as float, range 0.0 to 1.0", etc.
    • Shader text file (that driver compiles if not already compiled)
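As a sketch, here is what such a text-based filter profile could deserialize into, modeled as plain Rust. All names (`FilterProfile`, `SettingValue`, etc.) are hypothetical illustrations, not part of virtual-display-rs:

```rust
// Hypothetical in-memory model of a loadable filter-module profile.
#[derive(Debug, Clone, PartialEq)]
enum SettingValue {
    Bool(bool),
    Int(i64),
    Float(f64),
}

#[derive(Debug, Clone)]
struct Setting {
    name: String,
    default: SettingValue,
    min: Option<f64>, // optional range constraint on numeric settings
    max: Option<f64>,
}

#[derive(Debug, Clone)]
struct FilterProfile {
    title: String,                  // e.g. "Software BFI"
    constraints: Vec<String>,       // e.g. "hzOutput == 2 * hzInput", checked at load time
    settings: Vec<Setting>,
    shader_source_path: String,     // shader text the driver compiles if needed
}

impl FilterProfile {
    // Look up the default value of a named setting, if it exists.
    fn setting_default(&self, name: &str) -> Option<&SettingValue> {
        self.settings.iter().find(|s| s.name == name).map(|s| &s.default)
    }
}
```

A profile like this maps one-to-one onto the bullet list above (title, constraints, typed settings with ranges, shader source), regardless of whether the on-disk format ends up being JSON, INI, or something else.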

Possible Static Shader Variables

  • [boolean] Whether input is fixed-Hz or variable (e.g. VRR / VSYNC OFF)
  • [boolean] Whether output is fixed-Hz or variable (e.g. VRR / VSYNC OFF)
  • [float/double] Input refresh rate (aspirational if variable)
  • [float/double] Output refresh rate (aspirational if variable)
  • [enum?] Actual sync technology of actual FSE output (VSYNC ON, VSYNC OFF, VRR)
  • [integer] The input horizontal resolution
  • [integer] The input vertical resolution
  • [integer] The output horizontal resolution
  • [integer] The output vertical resolution
  • [?] The input format (16, 24, 32, FP16 HDR, FP48 HDR10, etc)
  • [?] The output format (16, 24, 32, FP16 HDR, FP48 HDR10, etc), typically same but can be different
  • Other accessible settings may be needed, if easy to include (e.g. multimonitor coordinate of the upper-left corner of virtualized screen)
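The static variables above could be handed to the shader as a single constants block. A minimal Rust sketch (struct and field names are illustrative assumptions, not an existing API):

```rust
// Hypothetical sync-technology enum for the FSE output.
#[derive(Debug, Clone, Copy, PartialEq)]
enum SyncTech {
    VsyncOn,
    VsyncOff,
    Vrr,
}

// Hypothetical bundle of the static shader inputs listed above.
#[derive(Debug, Clone, Copy)]
struct StaticShaderInputs {
    input_fixed_hz: bool,   // false = VRR / VSYNC OFF input
    output_fixed_hz: bool,
    input_hz: f64,          // aspirational value if variable
    output_hz: f64,
    output_sync: SyncTech,  // actual sync technology of the FSE output
    input_width: u32,
    input_height: u32,
    output_width: u32,
    output_height: u32,
    // pixel formats elided; they would be enums like SyncTech
}

impl StaticShaderInputs {
    // e.g. a BFI filter can query the output:input Hz ratio up front.
    fn hz_ratio(&self) -> f64 {
        self.output_hz / self.input_hz
    }
}
```

These would only need to be re-initialized on startup or on a mode change, matching the "static" lifetime described later in this post.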

Possible Dynamic Shader Variables

Basically, shader variables that change once a frame / once a refresh cycle.

  • [framebuffers/textures] Previous unreprocessed "refresh cycle" (1 by default, configurable)
  • [framebuffers/textures] Current unreprocessed "refresh cycle"
  • [framebuffers/textures] Previous post-processed "refresh cycle" (1 by default, configurable)
  • [framebuffers/textures] Current post-processed "refresh cycle"
  • [numeric] Current frame presentation time (e.g. timestamp of the last Present event); this requires a barebones presentation hook to capture timestamps for the refresh cycle processor.
  • [framebuffers/textures] Previous unreprocessed frames and their frame presentation time (independent of refresh cycles), for processing frames/framerates independently of refresh cycle.
  • [integer] Monotonically increasing frame counter (since load of driver)
  • [integer] Monotonically increasing refresh cycle counter (since load of driver)
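The "previous refresh cycle" buffers and the monotonic counter could be kept in a small bounded history with configurable depth. A sketch, with made-up names (`RefreshHistory`, `push_refresh`):

```rust
use std::collections::VecDeque;

// Hypothetical bounded history of recent refresh cycles, plus the
// monotonically increasing refresh-cycle counter described above.
struct RefreshHistory<T> {
    capacity: usize,          // how many previous cycles to retain (1 by default)
    frames: VecDeque<T>,      // oldest at the front, newest at the back
    refresh_counter: u64,     // monotonic since driver load
}

impl<T> RefreshHistory<T> {
    fn new(capacity: usize) -> Self {
        Self {
            capacity,
            frames: VecDeque::with_capacity(capacity),
            refresh_counter: 0,
        }
    }

    // Called once per refresh cycle with the new (unprocessed) framebuffer.
    fn push_refresh(&mut self, frame: T) {
        if self.frames.len() == self.capacity {
            self.frames.pop_front(); // drop the oldest retained cycle
        }
        self.frames.push_back(frame);
        self.refresh_counter += 1;
    }

    // The previous refresh cycle = second-newest retained entry.
    fn previous(&self) -> Option<&T> {
        if self.frames.len() >= 2 {
            self.frames.get(self.frames.len() - 2)
        } else {
            None
        }
    }
}
```

In a real driver `T` would be a GPU texture handle rather than a CPU-side value; the same structure works for the separate frame history (keyed by presentation time) as well.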

Output should support Full Screen Exclusive (FSE)

  • FSE also prevents the output and input refresh rate from throttling each other.
  • This also lowers input latency of filter processing too.
  • The output mode should have its sync technology configurable (VSYNC ON, VSYNC OFF, VRR).

ADVANCED NOTE (doesn't have to be part of bounty at first): If you have a Present() hook too and concurrently support both frame processing and refresh cycle processing, it is possible to have virtually lagless processing + VSYNC OFF mirroring! (VSYNC OFF on input Hz, VSYNC OFF on output Hz). In this case, the only lag is how fast the filter executes.

Configurable constraints on a per-filter basis

  • A temporal processing module (settings + shader) could specify what it needs / what it does not need
    • An overdrive lookup table system will require access to previous refresh cycles (but not frames or frame presentation times)
    • Virtualized VRR on non-VRR will require access to frame presentation times and previous frames (preferably all frames since previous refresh cycle, up to a max)
  • Ideally I'd like HDR and SDR to be able to diverge too.
    • An HDR nits-booster shader for SDR, will require input to be SDR and output to be HDR
    • An HDR analysis shader for SDR (e.g. heatmap debug), will require input to be HDR and output to be SDR
  • Optional exact constraints like "input Hz is always half output Hz", or "output Hz is always twice input Hz" or "output horizontal resolution is twice input horizontal resolution".
  • Some algorithms such as BFI may require "output Hz is always an integer multiple of input Hz".
    • Maybe a startup-time-executable math formula that is eval()'d upon driver startup or loadable-module startup, or maybe a separate startup-only shader function -- whichever is easiest for implementing a very customizable settings-constraints system for a specific filter.
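The warning-league vs fail-league idea can be sketched as plain functions; an "output Hz must be an integer multiple of input Hz" style constraint reduces to an integer-multiple test. Function and enum names here are hypothetical:

```rust
// Hypothetical check: is output_hz an integer multiple (>= 1x) of input_hz?
fn is_integer_multiple(output_hz: f64, input_hz: f64) -> bool {
    if input_hz <= 0.0 {
        return false;
    }
    let ratio = output_hz / input_hz;
    ratio >= 1.0 && (ratio - ratio.round()).abs() < 1e-6
}

// Hypothetical three-level constraint outcome, per the warning/fail leagues above.
enum ConstraintResult {
    Ok,
    Degraded(String), // filter runs, but degraded (e.g. stutters)
    Fail(String),     // filter can't run at all
}

// Example constraint set for a BFI-style filter.
fn check_bfi_constraints(input_hz: f64, output_hz: f64) -> ConstraintResult {
    if !is_integer_multiple(output_hz, input_hz) {
        ConstraintResult::Fail("output Hz must be an integer multiple of input Hz".into())
    } else if output_hz / input_hz < 2.0 {
        ConstraintResult::Degraded("BFI needs at least 2x output Hz for one black frame".into())
    } else {
        ConstraintResult::Ok
    }
}
```

An eval()'d formula engine would generalize this, but even hard-coded per-filter checks like these cover the examples given above ("hzOutput==2*hzInput", etc.).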

Input data for shader processing

  • Input shader data/variables considered "static" can be initialized on startup or during mode changes (e.g. resolution changes, refresh rate changes). Anytime you need to change these variables (such as on a screen mode change), it is generally OK to essentially "reboot" the driver (e.g. reload the shader module from scratch).
  • If possible, support for dynamic (once-per-Hz) variable updates is desired. Input shader variables considered "dynamic" need to be refreshed every refresh cycle (e.g. frame presentation time). This is needed for some algorithms like virtualized software VRR (ala https://www.testufo.com/vrr) running on a fixed-Hz display. This may require creativity to pull off without shader recompiles (I'm not 100% familiar with what forces a shader recompile).

API for driver, for a settings application, and for screwup-recovery, etc.

  • Some filters may accidentally make the display unintelligible, so an API that provides a recovery mechanism may be needed.
  • An API should be built into the driver to allow applications to control it, such as switching filters. This would make unit testing easier, as well as loading data, messup recovery, and a settings applet / system tray / etc.
  • It would make it possible to have a system tray app with a hotkey that reverts to simple mirroring after a shader messup (e.g. a shader making the display unreadable), or some other reasonable recovery mechanism. With a driver supporting an API of some kind, a system tray app can monitor a hotkey and send an API call to put the driver into a safe state -- and prevent things like the refresh rate getting too low (e.g. <1 Hz), or other problematic conditions that would otherwise require the user to hard-poweroff the computer.
  • The API should ideally also be able to enable/disable the virtual monitor, reverting to the non-virtual display (e.g. #19). Sort of like switching between two DisplayFusion profiles with two separate hotkeys, which can disable one monitor while enabling a different one. I could use DisplayFusion for this instead, if that's not possible.
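A minimal sketch of this recovery-capable control API, assuming a command/state model (all names are hypothetical; a real driver would expose this over IPC to the tray app):

```rust
// Hypothetical commands a settings app / tray app could send to the driver.
#[derive(Debug, PartialEq)]
enum DriverCommand {
    LoadFilter(String), // switch to a named filter module
    SetSafeMirror,      // hotkey-triggered recovery: plain mirroring, no filter
    SetRefreshRate(f64),
}

#[derive(Debug, PartialEq)]
enum DriverState {
    Mirroring,
    Filtering(String),
}

// Reject refresh rates that would effectively hard-lock the display.
const MIN_HZ: f64 = 1.0;

struct Driver {
    state: DriverState,
    refresh_hz: f64,
}

impl Driver {
    fn apply(&mut self, cmd: DriverCommand) -> Result<(), String> {
        match cmd {
            DriverCommand::LoadFilter(name) => {
                self.state = DriverState::Filtering(name);
                Ok(())
            }
            DriverCommand::SetSafeMirror => {
                // Always succeeds: this is the screwup-recovery path.
                self.state = DriverState::Mirroring;
                Ok(())
            }
            DriverCommand::SetRefreshRate(hz) if hz < MIN_HZ => {
                Err(format!("refresh rate {} Hz is below the safe minimum", hz))
            }
            DriverCommand::SetRefreshRate(hz) => {
                self.refresh_hz = hz;
                Ok(())
            }
        }
    }
}
```

The key design point is that `SetSafeMirror` is unconditional: whatever state a broken shader left the driver in, one hotkey-driven API call returns it to a readable screen.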

Settings applet of some sort (whether in Windows CP or system tray, etc)

  • Settings application could handle the import/export responsibility for loadable filter modules
  • Existence of API could allow a separate settings application / control panel application to make configuring the driver much more user friendly. Could be part of an existing Windows Control Panel workflow, or could be a launchable system tray style application.
  • Ideally, the settings should be dynamically adjustable (within constraints) via the API mentioned above. This would allow a settings application to show real-time results as the user adjusts. If that's not practical, understandable; but it would be ideal. Just as other software lets you live-adjust a gamma setting for a gamma-enhancing filter, I would be able to adjust an overdrive setting, a phosphor decay setting, or another temporal-filter setting in real time.

This is just a proposal

Other possible workflows might be better and merit discussion; this is just how I've visualized it so far.

External filter support

You can try testing SmoothVideoProject attachment support; maybe it already works out-of-the-box, because it can (in theory) attach to any DirectX framebuffer (Direct3D, DirectShow). If there isn't enough compute, just use the crappy laggy interpolation as a validation test -- if that works, then the massively better AI-based interpolation might work too.

  • ReShade
  • SmoothVideoProject

Multiple GPU support

Also, given that some filters (e.g. RIFE or future 10:1 reprojection algorithms) can be more compute-heavy than usual, 2-GPU systems might need to be supported -- so one GPU can be used for the game (primary) and a secondary GPU for the compute-heavy filter workloads. So architectural decisions (long term) may need to eventually allow a "Select Preferred GPU to run Shaders/Filters On", even different from the rendering GPU.

Multiple Video Output Support (Refresh Rate Combining):

This will be useful for refresh rate multiplication experiments. A virtual display driver can also support refresh rate multiplication (a Blur Busters breakthrough, which I'm still formulating a white paper for) through two methods:

  • Two strobed monitors in a beam splitter mirror;
  • Multiple strobed projectors pointing to the same projection screen;

So you can produce a 960 Hz screen by combining four 240 Hz, eight 120 Hz, or sixteen 60 Hz projectors pointed at the same screen.

A virtual Windows display driver virtualizes a higher refresh rate, and outputs the frames round-robin to multiple lower-Hz GPU outputs.
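The round-robin distribution reduces to simple modular arithmetic; a sketch with hypothetical helper names:

```rust
// Hypothetical helpers for round-robin refresh rate combining:
// frame N of the virtual high-Hz display goes to physical output N mod K.

// Which physical output receives this virtual frame.
fn output_for_frame(frame_counter: u64, num_outputs: u64) -> u64 {
    frame_counter % num_outputs
}

// Effective combined refresh rate, e.g. 4 x 240 Hz = 960 Hz virtual.
fn combined_hz(per_output_hz: f64, num_outputs: u32) -> f64 {
    per_output_hz * num_outputs as f64
}
```

Each physical output then only needs to sustain its own native Hz, while the virtual display advertises the combined rate; the hard part (VBI offsetting/genlock, described below) is in the timing, not the routing.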

There are some additional complicated techniques to projection-map and/or offset the VBIs, which involve simulating fixed Hz via VRR, and using VRR as a software genlock to slew the offsetted blanking intervals (but leave those details to me). To make it easier for multiple projectors pointing at the same screen, even a custom user-written shader could do the projection mapping with settings adjustments (keystone/distortion correction) and specify which output the next framebuffer should go to (or something like that) -- but leave those details to the shader.

What I need is a filter-supporting driver framework that makes all of this possible in the end; which is part of the reason for the bounty.

Epilogue

If only a subset can be supported, that's negotiable for the bounty (you can contact me privately at mark [at] blurbusters.com first, if necessary, before we discuss publicly here).

Some of the above is easy, and some of the above is hard. A task breakdown might need to be brainstormed (stage 1, stage 2, stage 3...)

Just putting some known factors down, to help brainstorming, task-breakdown, and architecture... You are welcome to keep implementing subsets of these ideas without doing a bounty. Basically, you're welcome to use a subset of my ideas (for less than the bounty) if the full scope isn't feasible -- at least I'll have helped improve a project. But a full shader-customizable driver is the goal, if possible.


mdrejhon commented on May 20, 2024

External Processors (As additional support other than plug-ins)

Although built-in shader plugins are the desired route, the framework should nominally also be able to support external processors (mostly spatial processors), and this does not necessarily have to be part of the bounty. But it is desirable to try to include support for popular external filter processors, which usually amounts to simple API calls:

Another purpose: add optional support for SmoothVideoProject filters -- there are some custom interpolation engines (e.g. RIFE 4.6 NN) that do near-flawless AI-based interpolation for retrogaming.

Currently this requires an HDMI capture card like an Elecard, but a virtual graphics driver supporting SmoothVideoProject and its most advanced interpolation plugins would eliminate that requirement.

Interpolation is not well liked by most gamers, but an RTX 4090 can run RIFE 4.6 NN in realtime to create 1080p/120 that looks really "native" (relatively artifact-free and low latency) out of 1080p/60 content, while still having enough compute left over to render the game itself.

I think something (roughly like this) is already in the works already by a few people I know on Discord!

Now... a lot of things have improved since this thread was created! Especially AI-based interpolation.

For absolutely stunning, near-flawless interpolation of 2D games (like RTS-style), I recommend an Elecard capture card + the RIFE 4.6 NCNN interpolation engine + an RTX 4000-series GPU. It can do some of the world's best NN-AI-based interpolation to 1080p/120 in real time, using a compute-heavy interpolation/extrapolation algorithm that requires the latest RTX cards.

It's a convoluted setup, but it makes retro games look like true native 120fps, with only rare artifacts.

The NN-AI "learns" the backgrounds of games while you play them, and uses them for background-reveal or parallax-infill (sometimes pixel-perfectly, sometimes not, depending on the game). So it's a more flawless interpolation for retrogaming, platformers, and RTS games. For an interpolation "black box in the middle" setup that is not integrated into the engine, it is the most native-looking realtime interpolation today's compute can get you.

There's input lag, but it's not too shabby. In theory only +1 frame! (excluding capture card overhead).

Long term, I want to see somebody implement this into a Windows Virtual Display Driver so that you can omit the capture card (Elecard).

For more information about artificial-intelligence (NN) based interpolation, which is the state of the art, see https://www.svp-team.com/wiki/RIFE_AI_interpolation -- it's essentially supercomputing-league interpolation, light years beyond any interpolation algorithm I've ever seen. Recommended if you /must/ use interpolation in retrogaming.

It's nigh native-looking on most retrogaming content! Sadly, it has to be piped through an Elecard capture card (lossless HDMI video input); I hope this changes so we can do it natively as a filter plugin (e.g. ReShade/SweetFX style, though it would be better to use a virtualized graphics driver so it works on any content).

So from this come some possible additional ideas -- not all universally part of the bounty, but nominal support (e.g. multiple-output support) that makes a shader-filter programming task possible -- which I've edited into the previous posts.


daiaji commented on May 20, 2024

https://eligao.com/enabling-freesync-on-unsupported-displays-f90ce7e8089346d2bbbe9275b21ba3ca

By design, FreeSync does not require any special hardware as G-Sync does. Wouldn't it be nice to get it to work on my other non-FreeSync monitors?

VRR seems to only need to modify the EDID?
I use LookingGlass to play games on a virtual machine, but my graphics card RX550 does not support HDMI VRR (AMD said this graphics card does not support it), so I only have one DP interface that supports VRR, so VRR support on a virtual monitor is very useful.


mdrejhon commented on May 20, 2024

VRR seems to only need to modify the EDID?

Depends. It requires the panel to be VRR-tolerant.

I've seen VRR forced over DVI and VGA and old-HDMI versions before, through this ToastyX trick on AMD GPUs, as it's simply a variable-size blanking interval on a standard video signal, to vary the interval between refresh cycles (unchanged horizontal scanrate, varying vertical refresh rate).

There are many panels that are intolerant of VRR being forced upon them (they go blank).

Also, panels with generic adaptive-sync and no compensations will do a poor job of VRR quality. They usually won't have good overdrive compensation for VRR. This means you may get LCD ghosting artifacts at certain framerates/refresh rates, which is more problematic for motion quality than with the high-level FreeSync Premium & G-SYNC Native certifications.

Example of an asymmetric LCD ghosting artifact; this can appear/disappear and worsen/improve at different VRR frame rates:
image

Motion-quality enthusiasts usually have a preference to pay extra for better/tested/premium certified VRR motion quality.

But, yes, it's also nice to be able to add VRR as a bonus feature to a non-VRR panel; as long as expectations are tempered.

I use LookingGlass to play games on a virtual machine, but my graphics card RX550 does not support HDMI VRR (AMD said this graphics card does not support it), so I only have one DP interface that supports VRR, so VRR support on a virtual monitor is very useful.

Unfortunately, virtual machines are usually incapable of VRR.
However, there are three steps that can trick a VM into working with VRR:

  1. Force the virtual machine to run in tearing VSYNC OFF mode (make tearing appear whenever VRR is disabled on the host OS and monitor).
  2. Force the host OS into VRR mode (whether natively or via the ToastyX trick).
  3. Turn VRR ON in the monitor menus.

Then, sometimes, games running inside virtual machines will work fine with VRR.

In other words, if you have to force VRR, you have to do it at the native-host-OS level. For example, a Windows VM on a Windows host would work with this trick. As long as you successfully got the VM to show tearing in step 1, you can then convert the tearing into successfully working VRR. But it would not work with a Windows VM on a Mac, even if you were able to create a custom EDID via SwitchResX (the Mac equivalent of ToastyX), because the M1/M2/M3 doesn't support tearing during VSYNC OFF.

Technical Reason Why It Works: The first step, tearing, is a good indicator that VRR will work once VRR is enabled at the host OS + monitor level. Random tearing (as a temporary visual debugging method) means you've successfully disconnected frame-present timing from the fixed-Hz refresh rate. Tearing is simply a mid-refresh-cycle splice of a new frame, indicating that asynchronous delivery of new frames to the display is now functioning. Once that's solved, letting the refresh rate float (enable VRR in host drivers/host OS/host monitor) seamlessly "converts" the VSYNC OFF into successfully working VRR, even if the VM software is VRR-unaware. The VRR-unaware software is simply configured as VSYNC OFF, something that existed long before VRR, as a backwards-compatibility technique. That also happens to be how pre-VRR game software (e.g. 2004's Half-Life 2 and the like) successfully worked with raster VRR (which arrived in the 2010s): you simply used VSYNC OFF on the game side to get VRR working on a VRR-supported OS+GPU+display. The same is true for VRR-unaware VM software that still supports VSYNC OFF with visible tearing artifacts whenever VRR is turned off.

Sorry about the sidetrack -- Blur Busters (which I founded) just happens to know this stuff, and I wanted to follow up.

</Temporary Side Track>


daiaji commented on May 20, 2024

Looking Glass

An extremely low latency KVMFR (KVM FrameRelay) implementation for guests with VGA PCI Passthrough.

Basically I use a virtual machine with GPU passthrough to play games and have been using VRR for a while, so the issue doesn't seem to be there.


mdrejhon commented on May 20, 2024

Looking Glass
An extremely low latency KVMFR (KVM FrameRelay) implementation for guests with VGA PCI Passthrough.

Basically I use a virtual machine with GPU passthrough to play games and have been using VRR for a while, so the issue doesn't seem to be there.

Ah, GPU passthrough is different! That's easier.


daiaji commented on May 20, 2024

Looking Glass
An extremely low latency KVMFR (KVM FrameRelay) implementation for guests with VGA PCI Passthrough.

Basically I use a virtual machine with GPU passthrough to play games and have been using VRR for a while, so the issue doesn't seem to be there.

Ah, GPU passthrough is different! That's easier.

So for virtual displays, to achieve VRR, just edit EDID?


mdrejhon commented on May 20, 2024

So for virtual displays, to achieve VRR, just edit EDID?

In theory. There could be some gotchas, as virtual display drivers may do other things that prevent VRR from working.
But now we're getting offtopic; I want to focus on the GitHub item.


mdrejhon commented on May 20, 2024

Hi!
Thanks for the feature request! Also thanks for the goodwill sponsorship!

Any update? I'd love to at least get access to an Apache/MIT codebase of a virtual display driver this year.

May you contact me at [email protected] and we can negotiate -- I might have third-party resources that may be able to help, and I may be able to submit some improvements in due time.


MolotovCherry commented on May 20, 2024

I'd love to at least get access to an Apache/MIT codebase of a virtual display driver this year.

That's fine, I'll switch the license to a more permissive one soon.

May you contact me at [email protected] and we can negotiate

I assume you were referring to the license when you said this, right? (So that should be covered by my above statement?) Or did you have something else in mind?

I might have third party resources that may be able to help and I may be able to submit some improvements in due time.

That does sound great! Contributions are most welcome.


mdrejhon commented on May 20, 2024

I assume you were referring to the license when you said this, right? (So that should be covered by my above statement?) Or did you have something else in mind?

Either or both.

I need maximum flexibility to figure out how to improve the codebase to meet multiple needs. As well as submit it back to the open source community (but also be able to use it in future proprietary Blur Busters software too).

I'm open to many kinds of arrangement, but simply switching licenses would also be the simplest if you have no time to do the feature requests -- at least I'd have more options, whether via both funded developer and volunteer developer routes.

I'd still want to contribute a bunch of changes back somehow (even if I used a funded software developer to do it), or make a forked open source project.

Also, custom blur-busting software is a very niche field requiring niche skills, so additional options are welcome! Obviously, if I am unable to find developer resources, it may be a while; but the sooner it is Apache/MIT, the sooner I can find non-unobtainium developer options within the next several months.

For those who want to contact me: for the actual work, I may have a modicum of funding available for an open source software developer, who would be allowed to recontribute it back to any relevant open source project (but only if it's an MIT/Apache codebase -- I have simultaneous open source & proprietary needs that I'm trying to make compatible).


MolotovCherry commented on May 20, 2024

I need maximum flexibility to figure out how to improve the codebase to meet multiple needs. As well as submit it back to the open source community (but also be able to use it in future proprietary Blur Busters software too).

I'm open to many kinds of arrangement, but simply switching licenses would also be the simplest if you have no time to do the feature requests -- at least I'd have more options, whether via both funded developer and volunteer developer routes.

You can operate under the assumption that it's switched to MIT already (I'll get around to it soon; by the time you find anything, it'll already be switched, so it's not an issue).

