Comments (4)
I tried implementing this (changing poll_pending to use a fixed value instead of an address) locally this morning, but that didn't fully solve the issue. With the change, the poll_pending value from an async function defined in a different library is recognized, but the function never completes.
I added print statements throughout the relevant mlua functions and Delay (from the Tokio tutorial) to figure out what's wrong, and it seems that the waker does not get set correctly in this scenario. So when the Future impls for async functions defined in a different library call wake, they're doing so on the default noop waker. For async functions defined in the same library, the waker is set and used correctly.
The WakerGuard does set the waker, but at some point in the process the waker is lost. It seems to happen somewhere between calling resume_inner on the Thread and calling get_poll in the Lua chunk created by create_async_callback.
I'm not very familiar with mlua internals or the Lua C API, but it seems that coroutines may have their own Lua sub-state and ExtraData. So I'm thinking the waker isn't getting copied into the sub-state, but I can't figure out why that would be the case for async functions defined in another library but not for ones defined in the same library.
from mlua.
After digging through the mlua docs and code, I believe I've tracked the issue down to the fact that the address of ASYNC_POLL_PENDING is used for the Lua::poll_pending value rather than a constant value.
This means every binary/library has a different value for poll_pending.
Yeah, this is by design; it's exactly how it should be.
Modules and the main app do not share internals: everything they have on the Rust side lives in their own state, isolated and unshared.
This applies not only to futures but also to registered userdata objects and so on.
Is there any specific reason (such as possible security concerns) that the address of ASYNC_POLL_PENDING is used for poll_pending rather than a constant value?
Rust does not provide a stable ABI; the only API we can rely on is the Lua C API, which is stable and consistent.
Modules and the main app can be compiled by different compiler versions, and they can use different libraries (such as Tokio). Even if the libraries are exactly the same, there is no guarantee that struct layouts are not reordered by the LLVM optimizer just because it decided to apply some optimization to this specific code.
> For async functions defined in the same library, the waker is set and used correctly. The WakerGuard does set the waker, but at some point in the process, the waker is lost. It seems like it's somewhere between calling resume_inner on the Thread and calling get_poll in the lua chunk created by create_async_callback.
For the same reason as explained above: WakerGuard is an internal entity and cannot be shared between modules and the main app.
> I'm not very familiar with mlua internals/the lua c api, but it seems like maybe coroutines have their own lua sub-state and ExtraData. So I'm thinking that the waker isn't getting copied into the sub-state, but I can't figure out why that'd be the case for async functions defined in another library, but not ones defined in the same library.
Each module and the main app have their own independent state, the ExtraData object. The Waker object for polling futures lives inside that non-shared state, which is why it's visible inside its own library but not externally.
The best way to integrate async functionality between loadable modules and the main app is to integrate independent event loops together using stable primitives.
The idea is to poll coroutines normally through the Lua API but recognize their ASYNC_POLL_PENDING state, and pass/receive oneshot channels to share the futures' state. When the remote future is ready, you receive a message through the channel and only then continue polling the coroutine.
What you use as a channel is up to you; it can be pipes or Tokio channels, which can operate in a mixed sync/async mode (sync on the foreign side and async on the native side).
The same principle is used to integrate async Rust with async C++, and async Lua apps (such as OpenResty or Neovim) with async Rust modules.
Integrating futures through FFI is not easy; you can take a look at https://docs.rs/async-ffi/latest/async_ffi/ for some other solutions outside of the Lua world (just to understand the scope of the problem!).