Mate is a library for parsing and calculating arithmetic expressions given as a string (&str). It uses a Lexer structure (similar to an interpreted language's lexer) to parse the string input into a token list, and a Calculator structure to compute the final result from that token list. It also provides a general wrapper structure that combines the Lexer and Calculator, making it easy to calculate an arithmetic expression's result directly, without dealing with parsing and calculating manually.
This crate is published on crates.io and can be used by adding mate-rs to the dependencies in your project's Cargo.toml:

[dependencies]
mate-rs = "0.1.0"
Mate is the general wrapper structure for Lexer and Calculator. It has a single method, which calculates the result from a string (&str) input.
use mate_rs::mate::Mate;

let result = Mate::calculate("6 * 7");
match result {
    Ok(v) => assert_eq!(v, 42.0),
    Err(_) => {
        // Do something ...
    }
};
Lexer is the main structure that parses the string input into a token list. Calculator is the structure used to compute the final result from the Lexer's output.
use mate_rs::{calculator::Calculator, lexer::Lexer};

// Generated tokens will be something like:
//  | Token(type: NUMBER, literal: "-2"),
//  | Token(type: PLUS, literal: "+"),
//  | Token(type: NUMBER, literal: "2"),
//  | Token(type: PLUS, literal: "+"),
//  | Token(
//  |     type: SUBEXP,
//  |     tokens: [
//  |         Token(type: NUMBER, literal: "6"),
//  |         Token(type: PRODUCT, literal: "*"),
//  |         Token(type: NUMBER, literal: "7"),
//  |     ],
//  | ),
let tokens = Lexer::lex(" - 2 + 2 + 6 * 7").unwrap(); // should also handle the error case

// The result is calculated from the tokens by the X/O/Y algorithm.
let result = Calculator::calculate(tokens);
match result {
    Ok(v) => assert_eq!(v, 42.0),
    Err(_) => {
        // Do something ...
    }
};
Mate is built around two main structures, [Lexer] and [Calculator]. [Lexer] takes care of parsing the given string expression, and [Calculator] takes care of calculating the final result from the parsed tokens.
The lexer loops through the given input string and converts each character into a [Token] structure. The main token types are:

ILLEGAL - an illegal character.
NUMBER - a number.
MINUS, PLUS, PRODUCT, DIVIDE - operations.
LPAREN, RPAREN - parentheses.
SUBEXP - a sub-expression: an expression inside parentheses, or a combination of division and multiplication.
The Lexer's lex function converts each character into one of these tokens. It combines the tokens of a multiplication or division operation into one sub-expression, to keep the operation priority right, and nests parentheses with a custom level-to-expression algorithm. The level-to-expression algorithm is a mapping algorithm that maps each concrete expression to its nesting level.
For example, if the given input is (2 + 5) : (5 - 9 / (8 - 5)), the generated mapping will be:

level 1 -> (2 + 5) and (5 - 9 / (8 - 5))
level 2 -> (8 - 5)
By doing that, we make it easy to keep the operation priority correct.
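As an illustrative sketch of the idea (not mate-rs internals; the function name and return type here are hypothetical), parenthesized sub-expressions can be mapped to their nesting levels in a single pass with a stack of open-parenthesis positions:

```rust
// Hypothetical sketch: map each parenthesized sub-expression to its
// nesting level. Illustrates the level-to-expression idea only; this is
// not the crate's actual implementation.
fn levels(input: &str) -> Vec<(usize, String)> {
    let mut result = Vec::new();
    let mut stack: Vec<usize> = Vec::new(); // byte indices of open '('
    for (i, c) in input.char_indices() {
        match c {
            '(' => stack.push(i),
            ')' => {
                if let Some(start) = stack.pop() {
                    // Depth remaining after the pop + 1 = nesting level
                    // of this pair (outermost parentheses are level 1).
                    result.push((stack.len() + 1, input[start..=i].to_string()));
                }
            }
            _ => {}
        }
    }
    result
}

fn main() {
    for (level, expr) in levels("(2 + 5) : (5 - 9 / (8 - 5))") {
        println!("level {} -> {}", level, expr);
    }
}
```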
The Calculator takes the parsed token list and calculates its final result. It uses a custom X/O/Y algorithm, a.k.a. X/OPERATION/Y, where X and Y are numbers and O is an operation. If X or Y cannot be found, it is taken as zero.
╭────────╮ ╭───────────╮ ╭────────╮
│ NUMBER │ │ OPERATION │ │ NUMBER │
╰────────╯ ╰───────────╯ ╰────────╯
╰───╮ │ ╭───╯
▼ ▼ ▼
X [+, -, *, /] Y
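The X/O/Y rule can be sketched as follows (a minimal illustration of the idea, not the crate's internal code; the function name is hypothetical). A missing operand defaults to zero, which is how a leading minus such as "- 2" evaluates to -2:

```rust
// Hypothetical sketch of the X/O/Y rule: apply an operation between two
// optional operands, treating a missing X or Y as zero.
fn xoy(x: Option<f64>, op: char, y: Option<f64>) -> f64 {
    let x = x.unwrap_or(0.0); // missing X is taken as zero
    let y = y.unwrap_or(0.0); // missing Y is taken as zero
    match op {
        '+' => x + y,
        '-' => x - y,
        '*' => x * y,
        '/' => x / y,
        _ => 0.0,
    }
}

fn main() {
    assert_eq!(xoy(Some(6.0), '*', Some(7.0)), 42.0);
    // "- 2" has no X, so it is evaluated as 0 - 2:
    assert_eq!(xoy(None, '-', Some(2.0)), -2.0);
}
```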