Comments (9)
Could you give some insight into what exactly the time is used for – is it some limit until the computation is deemed too expensive and the error recovery is aborted?
In essence, yes. [It's slightly more complex than that: error recovery is unbounded in time and space. It turns out that using a time limit is a fairly easy way of stopping us from using too much memory!]
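To illustrate the idea, here is a minimal sketch (my own illustration, not lrpar's actual implementation): the repair search re-checks a wall-clock budget on every iteration, so an explosive search is cut off in bounded time, which indirectly bounds how much memory it can allocate.

```rust
use std::time::{Duration, Instant};

// Sketch only: the real lrpar recoverer is far more sophisticated.
// The point is the shape of the loop: each iteration re-checks a
// wall-clock budget, so a pathological search terminates anyway.
fn search_for_repairs(budget: Duration) -> Option<u64> {
    let start = Instant::now();
    let mut candidates_tried = 0u64;
    while start.elapsed() < budget {
        candidates_tried += 1;
        // Pretend a repair sequence is found after some work.
        if candidates_tried == 10_000 {
            return Some(candidates_tried);
        }
    }
    None // budget exhausted: give up on error recovery
}
```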
If it doesn't complicate the code base too much, maybe the timing feature could be guarded behind a cargo feature flag?
I think we need some way of specifying time (for the reasons above). If we can find an unobtrusive way of doing this, I guess it's OK. For example, perhaps we could conditionally include a file along the lines of rust-lang/rust#48564 (comment); then we wouldn't make the rest of the code base any worse, since everything would be isolated to one file.
There might be a quick alternative, if you don't need error recovery, that might prevent panics: if you add `recoverer(RecoveryKind::None)` to your builder in `build.rs`, you won't get any recovery. I admit that, off hand, I can't remember if that will still call `Instant::now()` somewhere, but a) it's probably worth trying; b) if it does still call `Instant::now()`, it might be easy to alter the code not to when the recoverer is `RecoveryKind::None`.
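For reference, the suggestion above looks roughly like this in a `build.rs`. This is a sketch based on grmtools' documented builder API; exact method names (e.g. `grammar_in_src_dir`) vary between versions, so check the docs for the release you depend on.

```rust
// build.rs -- sketch only; the key call is `.recoverer(RecoveryKind::None)`,
// which disables error recovery entirely. Lexer setup omitted for brevity.
use lrpar::{CTParserBuilder, RecoveryKind};

fn main() {
    CTParserBuilder::new()
        .grammar_in_src_dir("grammar.y")
        .unwrap()
        .recoverer(RecoveryKind::None)
        .build()
        .unwrap();
}
```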
from grmtools.
Having thought about it, I agree that this issue should be resolved at the platform/backend level rather than in the library.
Implementing syscalls for Wasm doesn't seem to get too much traction, though: https://internals.rust-lang.org/t/what-is-the-plan-regarding-libstd-and-wasm-syscalls/8497.
For my needs I think I'll just fall back to patching the grmtools dependency in my project.
I'll close this issue for now since I don't have any actionable input moving forward. I'll keep an eye on how the Wasm / Wasi / syscalls story progresses.
This might be relevant https://crates.io/crates/instant.
This would be an alternative if you want to add support for Wasm directly to this library and consider adding that dependency to be elegant enough.
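If that dependency were deemed acceptable, the change would be small: per its documentation, `instant` re-exports `std::time::Instant` on native targets and uses the JavaScript timer on `wasm32` (with its `wasm-bindgen` feature enabled), so it is intended as a drop-in replacement:

```rust
// Sketch: swap the import; the timing code itself stays unchanged.
// (Assumes the `instant` crate with its `wasm-bindgen` feature on wasm32.)
use instant::Instant;

fn elapsed_ms_demo() -> f64 {
    let start = Instant::now();
    // ... do some work ...
    start.elapsed().as_secs_f64() * 1000.0
}
```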
lrpar uses elapsed time to know how long to try recovering from errors, so it does need some notion of wall-clock time. I wonder if you can include rust-lang/rust#48564 (comment) in your project? Or whether wasm32-wasi is a suitable target for your usage?
lrpar uses elapsed time to know how long to try recovering from errors [...]
Could you give some insight into what exactly the time is used for – is it some limit until the computation is deemed too expensive and the error recovery is aborted?
I wonder if you can include rust-lang/rust#48564 (comment) in your project?
Unfortunately, that won't affect the `lrpar` library code.
Or whether wasm32-wasi is a suitable target for your usage?
That sounds interesting, however multiple parts of the toolchain don't seem to support it yet. Also, if I understood correctly, `wasm32-wasi` is a target that runs in a standalone runtime outside of the browser. My particular use case is an application running in the browser.
If it doesn't complicate the code base too much, maybe the timing feature could be guarded behind a cargo feature flag?
Thank you, adding `recoverer(RecoveryKind::None)` helped mask the symptoms for now.
The time limit seems like a plausible solution regarding computation time (and setting a hard limit for when the user can expect a result). However, one could consume more memory than desired when execution is vastly faster on the target system.
I wonder if tracking the number of iterations / recursion depth and limiting the execution accordingly would achieve the same?
The only possibly better mechanism I can think of is to track memory usage, but that might be hard on some platforms. The current time limit is safe on every grammar I know, and single-threaded performance would probably have to improve by a factor of 4-5x before it became an issue. For better or worse, such an increase in performance seems unlikely to happen any time soon ;)
The only possibly better mechanism I can think of is to track memory usage, but that might be hard on some platforms.
Wouldn't tracking some sort of stack depth be a good approximation for that?
For better or worse, such an increase in performance seems unlikely to happen any time soon
True that.
Before I head into the adventure of making those changes – what do you think of the following approach: along with the `RecoveryKind`, a `get_cost` function and a `recovery_budget` constant can be provided. For the existing case, this would just be `Instant::now` and `500`.
In my case, I then could conveniently provide an equivalent.
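To make the proposal concrete, here is a hedged sketch (my own illustration; none of these names are grmtools API) of what a pluggable budget could look like: the existing behaviour becomes a wall-clock budget, and a deterministic step-count budget could be substituted on platforms without a working `Instant::now()`.

```rust
use std::time::{Duration, Instant};

// Illustration only: trait and struct names are invented for this sketch.
trait RecoveryBudget {
    /// Called once per unit of recovery work; returns true while the
    /// search may continue.
    fn step(&mut self) -> bool;
}

/// The existing behaviour: stop when a wall-clock limit is exceeded.
struct WallClockBudget {
    start: Instant,
    limit: Duration,
}

impl RecoveryBudget for WallClockBudget {
    fn step(&mut self) -> bool {
        self.start.elapsed() < self.limit
    }
}

/// A deterministic alternative: stop after a fixed number of steps.
struct StepBudget {
    remaining: u64,
}

impl RecoveryBudget for StepBudget {
    fn step(&mut self) -> bool {
        if self.remaining == 0 {
            false
        } else {
            self.remaining -= 1;
            true
        }
    }
}
```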
Wouldn't tracking some sort of stack depth be a good approximation for that?
Unfortunately not. [Previous work has tried vaguely similar ideas, and it's easy to construct grammars that cause them to explode.]
Before I head into the adventure of making those changes – what do you think of the following approach: along with the `RecoveryKind`, a `get_cost` function and a `recovery_budget` constant can be provided. For the existing case, this would just be `Instant::now` and `500`.
I'm reluctant to expose knobs like this to users because it's going to confuse them unnecessarily: it's got to be grmtools' job to get this right IMHO.
One obvious question occurs to me: can we change the Rust WASM backend to call the JavaScript timer? That would solve the problem for grmtools and other libraries too!
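Until the backend does that, one unobtrusive option (along the lines of the conditionally-included file mentioned earlier) is a tiny shim module: on non-Wasm targets it re-exports `std::time::Instant`; on `wasm32` it could delegate to a host-supplied clock. The wasm branch below is a sketch with an invented import name (`host_now_ms`); how such a function is actually provided depends entirely on the embedding.

```rust
// Hypothetical shim; only the non-wasm half is real std API.
#[cfg(not(target_arch = "wasm32"))]
pub use std::time::Instant;

#[cfg(target_arch = "wasm32")]
mod wasm {
    // `host_now_ms` is an invented name: the embedder (e.g. JavaScript
    // glue code) would need to supply a monotonic millisecond clock.
    extern "C" {
        fn host_now_ms() -> f64;
    }

    #[derive(Clone, Copy)]
    pub struct Instant(f64);

    impl Instant {
        pub fn now() -> Self {
            Instant(unsafe { host_now_ms() })
        }

        pub fn elapsed(&self) -> std::time::Duration {
            std::time::Duration::from_secs_f64((Self::now().0 - self.0) / 1000.0)
        }
    }
}

#[cfg(target_arch = "wasm32")]
pub use wasm::Instant;
```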