xampprocky / tokei
Count your code, quickly.
License: Other
To add further formats, a serde implementation that can handle the current format needs to be available. If anyone has any other formats they'd like to see, I'll add them to the list.
Input:
(* Title: Tools/induct.ML
Author: Markus Wenzel, TU Muenchen
Proof by cases, induction, and coinduction.
*)
signature INDUCT_ARGS =
sig
  val cases_default: thm
  val atomize: thm list
  val rulify: thm list
  val rulify_fallback: thm list
  val equal_def: thm
  val dest_def: term -> (term * term) option
  val trivial_tac: Proof.context -> int -> tactic
end;
Output:
$ tokei induct.ML
-------------------------------------------------------------------------------
Language Files Total Blanks Comments Code
-------------------------------------------------------------------------------
OCaml 1 16 2 14 0
-------------------------------------------------------------------------------
Total 1 16 2 14 0
-------------------------------------------------------------------------------
Tokei counts files with the extension .in as Rust code.
Tokei version: 2.1.1 (installed via cargo install on stable rustc/cargo).
echo "a\n" > code.in
tokei code.in
code.in is counted as Rust code, but it should be recognized as an autoconf input file.
Could you add Markdown support?
Whether you use -V or --version, it returns nothing.
Isabelle theories want something like this:
Language::new("--", "{*,(*,‹,\\<open>", "*},*),›,\\<close>"),
Where all of those comment styles are used pretty ubiquitously.
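To make the request concrete, here is a hedged sketch of how a language with several paired comment delimiters could be modeled. CommentSyntax, ISABELLE, and opening_delimiter are illustrative names, not tokei's actual API:

```rust
// Illustrative model of a language with multiple paired multi-line comment
// delimiters, as Isabelle needs. Not tokei's real Language type.
struct CommentSyntax {
    line: &'static str,
    multi: &'static [(&'static str, &'static str)],
}

const ISABELLE: CommentSyntax = CommentSyntax {
    line: "--",
    multi: &[("{*", "*}"), ("(*", "*)"), ("‹", "›"), ("\\<open>", "\\<close>")],
};

// If `text` begins with any multi-line opener, return the matching closer.
fn opening_delimiter(syntax: &CommentSyntax, text: &str) -> Option<&'static str> {
    syntax
        .multi
        .iter()
        .find(|&&(open, _)| text.starts_with(open))
        .map(|&(_, close)| close)
}
```

Keeping the open/close pairs together (rather than two parallel comma-separated strings) also makes it harder for the two lists to drift out of sync.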
It would be very convenient to have such an option when an extension is unknown for a given language or when an extension is ambiguous (e.g. .cgi, .inc).
It could be used to override a default mapping, or even discard one; for instance, I have a .pro file which is a Qt Creator project and not a Prolog file. But maybe in the latter case it would be cleaner to have a dedicated option to ignore a given extension.
I'll try to submit a PR, but any guidance would be greatly appreciated :)
If you use -l or --languages, it returns the error "the following required arguments were not supplied: '...'". But you should not have to provide input when viewing the list of supported languages.
GCC's dependency .d files left over from a build are treated as D lang source files. Recommend make clean somewhere?
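One purely hypothetical idea (not something tokei currently does): disambiguate the extension by sniffing the content, since a GCC dependency file is essentially a single make rule. looks_like_make_depfile is an illustrative name:

```rust
// Hypothetical heuristic for telling a GCC dependency file from D source:
// a .d file emitted by `gcc -MD` is a make rule like `foo.o: foo.c foo.h`,
// typically with backslash continuations and no D syntax.
fn looks_like_make_depfile(contents: &str) -> bool {
    let first = match contents.lines().find(|l| !l.trim().is_empty()) {
        Some(line) => line,
        None => return false,
    };
    // `target.o: prerequisites...` on the first non-blank line
    first.contains(".o:") || first.trim_end().ends_with(':')
}
```

A heuristic like this would only need to run for extensions known to be ambiguous.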
$ tokei -s comments src/
error: 'comments' isn't a valid value for '--sort <sort>'
[values:blanks code commments files total]
Did you mean 'commments' ?
$ tokei -s commments src/
-------------------------------------------------------------------------------
Language Files Lines Code Comments Blanks
-------------------------------------------------------------------------------
thread '<main>' panicked at 'internal error: entered unreachable code', /tmp/cargo-install.yLVjW5kEQlUb/release/build/tokei-8af4b02509125ebd/out/main.rs:3567
note: Run with `RUST_BACKTRACE=1` for a backtrace.
For some files (plain text, markdown), tokei reports 0 lines in total, but more than 0 lines of code.
For example, for README.md in the tokei repository, which has 252 lines, the output looks like this:
> tokei README.md
-------------------------------------------------------------------------------
Language Files Lines Code Comments Blanks
-------------------------------------------------------------------------------
Markdown 1 0 252 0 0
-------------------------------------------------------------------------------
Total 1 0 252 0 0
-------------------------------------------------------------------------------
Add capability to report both physical lines of code and logical lines of code
Please add a LICENSE file so we know if it is GPLv3 or MIT or so on.
I didn't know what this crate did before I looked at the README, and it took me a rather long time to figure it out. "Count code quickly" is pithy but nonobvious. I think an above-the-fold example would explain the crate's purpose without requiring a change in the verbiage.
Currently the wildcard (*) does not work with the --exclude flag.
If a Cargo.lock exists in a Rust project, it should be possible to also include the dependencies from the local .cargo directory in offline mode, probably via a --deps switch or similar. Not sure about online mode or other languages. Have you considered this functionality?
Now that handling of comments while in quotes (and out of them) has been included, all existing languages need to be updated with their string type, as currently the default is (", "), which obviously doesn't apply to all languages. All that is needed is to go to src/lib/language/languages.rs and use the set_quotes() method.
For example:
$ cat hello.json
{
  "name": "tokei",
  "height": 150
}
$ tokei hello.json
-------------------------------------------------------------------------------
Language Files Total Blanks Comments Code
-------------------------------------------------------------------------------
JSON 1 4 0 4 0
-------------------------------------------------------------------------------
Total 1 4 0 4 0
-------------------------------------------------------------------------------
By the way, cloc behaves as follows:
$ cloc --version
1.66
$ cloc hello.json
1 text file.
1 unique file.
0 files ignored.
https://github.com/AlDanial/cloc v 1.66 T=0.01 s (135.7 files/s, 542.9 lines/s)
-------------------------------------------------------------------------------
Language files blank comment code
-------------------------------------------------------------------------------
JSON 1 0 0 4
-------------------------------------------------------------------------------
Sorry for my broken English.
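The core of the fix is tracking whether the scanner is inside a string before honouring a comment marker. A minimal, self-contained sketch, assuming only double quotes with backslash escapes and // line comments (tokei's real logic is per-language, and is_code_line is an illustrative name):

```rust
// Minimal sketch: classify a single line as code if any non-whitespace
// character falls outside both string literals and line comments.
fn is_code_line(line: &str) -> bool {
    let chars: Vec<char> = line.chars().collect();
    let mut in_quotes = false;
    let mut i = 0;
    while i < chars.len() {
        match chars[i] {
            '\\' if in_quotes => i += 1, // skip the escaped character
            '"' => in_quotes = !in_quotes,
            '/' if !in_quotes && chars.get(i + 1) == Some(&'/') => {
                // rest of the line is a comment; it's code only if we saw
                // any non-whitespace before the comment started
                return chars[..i].iter().any(|c| !c.is_whitespace());
            }
            _ => {}
        }
        i += 1;
    }
    chars.iter().any(|c| !c.is_whitespace())
}
```

With this kind of tracking, `"name": "tokei",` in the JSON above would be classified as code rather than a comment.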
There should be a file for each language that covers every edge case. The file should contain every variant of comments and quotes, as well as a comment at the top of the file containing the manually verified lines, code, comments, and blanks, e.g. // 39 lines 32 code 2 comments 5 blanks. A good example of a test file is tests/data/rust.rs.
// 39 lines 32 code 2 comments 5 blanks
/* /**/ */
fn main() {
    let start = "/*";
    loop {
        if x.len() >= 2 && x[0] == '*' && x[1] == '/' { // found the */
            break;
        }
    }
}
fn foo() {
    let this_ends = "a \"test/*.";
    call1();
    call2();
    let this_does_not = /* a /* nested */ comment " */
        "*/another /*test
call3();
*/";
}
fn foobar() {
    let does_not_start = // "
        "until here,
test/*
test"; // a quote: "
    let also_doesnt_start = /* " */
        "until here,
test,*/
test"; // another quote: "
}
fn foo() {
    let a = 4; // /*
    let b = 5;
    let c = 6; // */
}
Example foo.rs:
Foo
/*
Bar
/*
Baz
*/
asdf
'asdf
asdf
asdf
asdf
asdf
asdf
asdf
asd
fasdf
asd
f
*/
$ tokei foo.rs
-------------------------------------------------------------------------------
Language Files Total Blanks Comments Code
-------------------------------------------------------------------------------
Rust 1 24 5 5 14
-------------------------------------------------------------------------------
Total 1 24 5 5 14
-------------------------------------------------------------------------------
It counts the comment lines after the nested */ as code.
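Handling this correctly means tracking nesting depth rather than a boolean "in comment" flag. A hedged sketch that counts non-blank comment-only lines (it ignores string literals and line comments for brevity, and comment_line_count is an illustrative name, not tokei's implementation):

```rust
// Sketch: count non-blank lines that are entirely inside /* */ comments,
// honouring Rust-style nesting with a depth counter instead of a boolean.
fn comment_line_count(source: &str) -> usize {
    let mut depth: usize = 0;
    let mut comment_lines = 0;
    for line in source.lines() {
        let b = line.as_bytes();
        let mut saw_code = false;
        let mut i = 0;
        while i < b.len() {
            if i + 1 < b.len() && b[i] == b'/' && b[i + 1] == b'*' {
                depth += 1; // entering (or nesting deeper into) a comment
                i += 2;
            } else if i + 1 < b.len() && b[i] == b'*' && b[i + 1] == b'/' && depth > 0 {
                depth -= 1; // closing one level of nesting
                i += 2;
            } else {
                if depth == 0 && !b[i].is_ascii_whitespace() {
                    saw_code = true; // non-whitespace outside any comment
                }
                i += 1;
            }
        }
        if !saw_code && !line.trim().is_empty() {
            comment_lines += 1;
        }
    }
    comment_lines
}
```

With a depth counter, the inner */ in the foo.rs example above only decrements the depth to 1, so the following lines are still classified as comments.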
It seems piping output around doesn't update the file total; also, only the file types from the final command are being tallied.
~$ tokei usercorn-unstable/
-------------------------------------------------------------------------------
Language Files Lines Code Comments Blanks
-------------------------------------------------------------------------------
Go 149 9163 7886 298 979
Makefile 1 26 18 0 8
Markdown 3 254 254 0 0
Python 2 154 125 2 27
YAML 1 20 17 0 3
-------------------------------------------------------------------------------
Total 156 9617 8300 300 1017
-------------------------------------------------------------------------------
~$ tokei go/src/github.com/
-------------------------------------------------------------------------------
Language Files Lines Code Comments Blanks
-------------------------------------------------------------------------------
BASH 1 8 6 1 1
C Header 6 3435 2939 400 96
Go 170 29590 25755 1492 2343
Markdown 13 1090 1090 0 0
Python 2 130 101 9 20
YAML 6 69 65 0 4
-------------------------------------------------------------------------------
Total 198 34322 29956 1902 2464
-------------------------------------------------------------------------------
~$ tokei -o json go/src/github.com/ | tokei -i stdin usercorn-unstable/
-------------------------------------------------------------------------------
Language Files Lines Code Comments Blanks
-------------------------------------------------------------------------------
Go 149 38753 33641 1790 3322
Makefile 1 26 18 0 8
Markdown 3 1344 1344 0 0
Python 2 284 226 11 47
YAML 1 89 82 0 7
-------------------------------------------------------------------------------
Total 163 43939 38256 2202 3481
-------------------------------------------------------------------------------
When you look at the result, it takes time to find the most important data, the lines of code. It should be at the right.
See the output of gocloc: https://github.com/hhatto/gocloc#basic-usage
$ gocloc .
-------------------------------------------------------------------------------
Language files blank comment code
-------------------------------------------------------------------------------
Markdown 3 8 0 18
Go 1 29 1 323
-------------------------------------------------------------------------------
TOTAL 4 37 1 341
-------------------------------------------------------------------------------
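The reordering itself is just a matter of the row format string; a hypothetical sketch with the code column right-most, as requested (format_row is an illustrative name, not tokei's real printer):

```rust
// Sketch: a summary row with Code as the last (right-most) column,
// using width and alignment specifiers in the format string.
fn format_row(language: &str, files: usize, blanks: usize, comments: usize, code: usize) -> String {
    format!(
        "{:<18} {:>8} {:>10} {:>10} {:>10}",
        language, files, blanks, comments, code
    )
}
```

Since the column order lives in one place, offering it as a display option would not affect the counting logic at all.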
.pm files are Perl modules.
I checked in src/lib/language/language_type.rs and there's no conflict with another language.
The modification seems trivial; I can make a PR if you want.
$ tokei -f tokeitest_123/
-------------------------------------------------------------------------------
Language Files Lines Code Comments Blanks
-------------------------------------------------------------------------------
Rust 3 15 12 0 3
-------------------------------------------------------------------------------
|keitest_123/123456789.rs 5 4 0 1
|okeitest_123/12345678.rs 5 4 0 1
tokeitest_123/1234567.rs 5 4 0 1
-------------------------------------------------------------------------------
Total 3 15 12 0 3
-------------------------------------------------------------------------------
In the second case, |okeitest_123/12345678.rs and tokeitest_123/12345678.rs take the same amount of space, so it's unnecessary to replace the first character.
Hi,
I'm trying to build tokei, but an error occurred.
Platform: OS X 10.11.6 (15G1208)
Command: cargo build --verbose --release --features=all
rustc 1.13.0 (2c6933acc 2016-11-07)
Result
Compiling tokei v4.5.2 (file:///Users/neo/Desktop/tokei)
Running `rustc src/main.rs --crate-name tokei --crate-type bin -C opt-level=3 --cfg feature=\"hex\" --cfg feature=\"default\" --cfg feature=\"toml-io\" --cfg feature=\"io\" --cfg feature=\"serde_codegen\" --cfg feature=\"serde_yaml\" --cfg feature=\"all\" --cfg feature=\"cbor\" --cfg feature=\"serde_json\" --cfg feature=\"serde_cbor\" --cfg feature=\"serde\" --cfg feature=\"toml\" --cfg feature=\"json\" --cfg feature=\"yaml\" -C metadata=dc2f5d3a69030dfc --out-dir /Users/neo/Desktop/tokei/target/release --emit=dep-info,link -L dependency=/Users/neo/Desktop/tokei/target/release/deps --extern clap=/Users/neo/Desktop/tokei/target/release/deps/libclap-9178c8b70f5c5456.rlib --extern serde_yaml=/Users/neo/Desktop/tokei/target/release/deps/libserde_yaml-3758ab135f6f7d90.rlib --extern rayon=/Users/neo/Desktop/tokei/target/release/deps/librayon-c3557c1c242af173.rlib --extern ignore=/Users/neo/Desktop/tokei/target/release/deps/libignore-e66729014aae0f37.rlib --extern log=/Users/neo/Desktop/tokei/target/release/deps/liblog-bf16bb9a4912b11d.rlib --extern serde=/Users/neo/Desktop/tokei/target/release/deps/libserde-97f01bf227222121.rlib --extern serde_json=/Users/neo/Desktop/tokei/target/release/deps/libserde_json-b812e04ba18930d3.rlib --extern toml=/Users/neo/Desktop/tokei/target/release/deps/libtoml-cf3bfced9e77aba4.rlib --extern regex=/Users/neo/Desktop/tokei/target/release/deps/libregex-36c8e259ac5ba542.rlib --extern maplit=/Users/neo/Desktop/tokei/target/release/deps/libmaplit-50ef67709de53e77.rlib --extern hex=/Users/neo/Desktop/tokei/target/release/deps/libhex-c4311871abf53460.rlib --extern lazy_static=/Users/neo/Desktop/tokei/target/release/deps/liblazy_static-7f1b96a3a3eb529d.rlib --extern encoding=/Users/neo/Desktop/tokei/target/release/deps/libencoding-804c203c6e9b8b1a.rlib --extern serde_cbor=/Users/neo/Desktop/tokei/target/release/deps/libserde_cbor-7aefca53f259cf88.rlib --extern 
env_logger=/Users/neo/Desktop/tokei/target/release/deps/libenv_logger-c716af707f2027e1.rlib --extern tokei=/Users/neo/Desktop/tokei/target/release/deps/libtokei.rlib`
error: Could not compile `tokei`.
Caused by:
process didn't exit successfully: `rustc src/main.rs --crate-name tokei --crate-type bin -C opt-level=3 --cfg feature="hex" --cfg feature="default" --cfg feature="toml-io" --cfg feature="io" --cfg feature="serde_codegen" --cfg feature="serde_yaml" --cfg feature="all" --cfg feature="cbor" --cfg feature="serde_json" --cfg feature="serde_cbor" --cfg feature="serde" --cfg feature="toml" --cfg feature="json" --cfg feature="yaml" -C metadata=dc2f5d3a69030dfc --out-dir /Users/neo/Desktop/tokei/target/release --emit=dep-info,link -L dependency=/Users/neo/Desktop/tokei/target/release/deps --extern clap=/Users/neo/Desktop/tokei/target/release/deps/libclap-9178c8b70f5c5456.rlib --extern serde_yaml=/Users/neo/Desktop/tokei/target/release/deps/libserde_yaml-3758ab135f6f7d90.rlib --extern rayon=/Users/neo/Desktop/tokei/target/release/deps/librayon-c3557c1c242af173.rlib --extern ignore=/Users/neo/Desktop/tokei/target/release/deps/libignore-e66729014aae0f37.rlib --extern log=/Users/neo/Desktop/tokei/target/release/deps/liblog-bf16bb9a4912b11d.rlib --extern serde=/Users/neo/Desktop/tokei/target/release/deps/libserde-97f01bf227222121.rlib --extern serde_json=/Users/neo/Desktop/tokei/target/release/deps/libserde_json-b812e04ba18930d3.rlib --extern toml=/Users/neo/Desktop/tokei/target/release/deps/libtoml-cf3bfced9e77aba4.rlib --extern regex=/Users/neo/Desktop/tokei/target/release/deps/libregex-36c8e259ac5ba542.rlib --extern maplit=/Users/neo/Desktop/tokei/target/release/deps/libmaplit-50ef67709de53e77.rlib --extern hex=/Users/neo/Desktop/tokei/target/release/deps/libhex-c4311871abf53460.rlib --extern lazy_static=/Users/neo/Desktop/tokei/target/release/deps/liblazy_static-7f1b96a3a3eb529d.rlib --extern encoding=/Users/neo/Desktop/tokei/target/release/deps/libencoding-804c203c6e9b8b1a.rlib --extern serde_cbor=/Users/neo/Desktop/tokei/target/release/deps/libserde_cbor-7aefca53f259cf88.rlib --extern 
env_logger=/Users/neo/Desktop/tokei/target/release/deps/libenv_logger-c716af707f2027e1.rlib --extern tokei=/Users/neo/Desktop/tokei/target/release/deps/libtokei.rlib` (signal: 9, SIGKILL: kill)
Compiling also fails with cargo install tokei --features all.
rustc 1.13.0 (2c6933acc 2016-11-07)
binary: rustc
commit-hash: 2c6933acc05c61e041be764cb1331f6281993f3f
commit-date: 2016-11-07
host: x86_64-apple-darwin
release: 1.13.0
NeoMacBook-Pro:~ neo$ RUST_BACKTRACE=1 tokei Development --output json
thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: Syntax(KeyMustBeAString, 0, 0)', ../src/libcore/result.rs:799
stack backtrace:
1: 0x105efa0a8 - std::sys::backtrace::tracing::imp::write::h6f1d53a70916b90d
2: 0x105efd2ff - std::panicking::default_hook::{{closure}}::h137e876f7d3b5850
3: 0x105efc395 - std::panicking::default_hook::h0ac3811ec7cee78c
4: 0x105efc9a6 - std::panicking::rust_panic_with_hook::hc303199e04562edf
5: 0x105efc844 - std::panicking::begin_panic::h6ed03353807cf54d
6: 0x105efc762 - std::panicking::begin_panic_fmt::hc321cece241bb2f5
7: 0x105efc6c7 - rust_begin_unwind
8: 0x105f287b0 - core::panicking::panic_fmt::h27224b181f9f037f
9: 0x105d71b1a - core::result::unwrap_failed::h9916702f61d59849
10: 0x105da3f30 - tokei::main::h788afbac7a995aa4
11: 0x105efd8ba - __rust_maybe_catch_panic
12: 0x105efbed6 - std::rt::lang_start::h538f8960e7644c80
And sorry for the inconvenience of posting this in the wrong place.
Could you add TOML support?
It is common to have a man page for command line utilities.
After the update to the has_trailing_comments function in #35, the tests now don't cover the new functionality of nested=false.
The -e flag does not work. The --exclude flag only works if you set it equal to what you are excluding (--exclude=dirname/), but it should also work as --exclude dirname/, as noted in the help.
Today I thought I'd test Tokei before running it on my code, and I found some cases that it doesn't handle correctly. I'll include them in one issue because I assume they will be fixed by the same change, but I can split them into multiple issues if you'd like.
There are 8 lines of Rust code here, but Tokei reports 3:
fn main() {
    let start = "/*";
    loop {
        if x.len() >= 2 && x[0] == '*' && x[1] == '/' { // found the */
            break;
        }
    }
}
There are 6 lines of code here, but Tokei reports 5:
fn foo() {
    let no_comment = "
// another test
";
    func();
}
There are 9 lines of code here, but Tokei panics:
fn foo() {
    let this_ends = "a \"test/*.";
    call1();
    call2();
    let this_does_not = /* a /* nested */ comment " */
        "*/another /*test
call3();
*/";
}
There are 10 lines of code here, but Tokei reports 7:
fn foobar() {
    let does_not_start = // "
        "until here,
test/*
test"; // a quote: "
    let also_doesnt_start = /* " */
        "until here,
test,*/
test"; // another quote: "
}
There are 5 lines of code here, but Tokei reports 2:
fn foo() {
    let a = 4; // /*
    let b = 5;
    let c = 6; // */
}
There are 5 lines of D code here, but Tokei reports 2:
void main() {
    auto x = 5; /+ a /+ nested +/ comment /* +/
    writefln("hello");
    auto y = 4; // */
}
I typed these out without trying to compile them, so sorry if I made a mistake.
I have written real world code where this kind of implementation would miscount a lot of code, so that's why I check out LOC counters before using them.
OCaml, Coq, etc. comments are for some reason not counted.
I basically want what ripgrep has, but I don't really understand all the requirements for submitting to these package managers. Any help would be very appreciated.
See below for reproduction steps. Once the "Counting files..." portion finishes if you scroll down to view the first couple files and then quit the panic will be shown. I haven't been able to see if the same issue occurs on Linux yet.
Likely any print! and println! invocations should be changed to write! or writeln! so that the Result can be handled.
$ cargo install tokei
$ RUST_BACKTRACE=1 tokei . --files | less
thread 'main' panicked at 'failed printing to stdout: The pipe is being closed. (os error 232)', ../src/libstd\io\stdio.rs:617
thread '<unnamed>' panicked at 'failed printing to stdout: The pipe is being closed. (os error 232)', ../src/libstd\io\stdio.rs:617
stack backtrace:
0: 0x13f4e00ec - <unknown>
1: 0x13f4df6f9 - <unknown>
2: 0x13f4cd05d - <unknown>
3: 0x13f4e272b - <unknown>
4: 0x13f4cdf5f - <unknown>
5: 0x13f4d62e3 - <unknown>
6: 0x13f2a22ad - <unknown>
7: 0x13f4df0cc - <unknown>
8: 0x13f4e27c1 - <unknown>
9: 0x13f4dee04 - <unknown>
10: 0x13f2cfdc9 - <unknown>
11: 0x13f4ec1d8 - <unknown>
12: 0x76d459bc - BaseThreadInitThunk
https://github.com/rust-lang/rust/blob/1.11.0/src/libstd/io/stdio.rs#L617
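A sketch of the suggested change, assuming the writes go through a helper that treats BrokenPipe as a clean shutdown. write_line and print_report are hypothetical names, not tokei's actual code:

```rust
use std::io::{self, Write};

// Route output through a Result instead of panicking inside println!.
fn write_line(out: &mut impl Write, line: &str) -> io::Result<()> {
    writeln!(out, "{}", line)
}

fn print_report(lines: &[&str]) -> io::Result<()> {
    let stdout = io::stdout();
    let mut handle = stdout.lock();
    for line in lines {
        if let Err(e) = write_line(&mut handle, line) {
            if e.kind() == io::ErrorKind::BrokenPipe {
                // the reader (e.g. `less`) hung up; stop printing quietly
                return Ok(());
            }
            return Err(e);
        }
    }
    Ok(())
}
```

Piping into `less` and quitting early then ends the program normally instead of panicking in both the main and worker threads.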
There are multiple issues with Fortran that prevent it from being used with tokei. First, two syntaxes exist: the modern (or free-form) one and the legacy (or fixed-form) one. They can be considered as two different languages in tokei. The main issues are:
For the modern syntax, you can use the extensions .f90, .F90, .f95, .F95, .f03, .f08 (and maybe .F03 and .F08, but I never found them). For the legacy syntax, you can use .f, .F, .f77, .F77, .fpp, .FPP.
In the legacy syntax, the c, C, or * characters indicate a comment, but only if found in columns 1 to 7.
The & character is a line continuation, and it is used a lot in the legacy syntax, due to the 72-character-per-line limit.
So, how can I bypass these issues?
A line_continuation field in the Language struct could help, but I do not know the usual way of doing it in cloc-like programs: whether to ignore or to merge continued lines.

I just tried installing tokei using cargo install tokei, but was unable to compile due to conflicting versions of the syntex crate; it looks like both 0.32.0 and 0.33.0 are being fetched:
$ cargo install tokei
Updating registry `https://github.com/rust-lang/crates.io-index`
Downloading tokei v2.1.1
Downloading rayon v0.3.1
Downloading serde_json v0.7.1
Downloading serde v0.7.6
Downloading clap v2.5.2
Downloading serde_cbor v0.3.3
Downloading maplit v0.1.3
Downloading serde_yaml v0.2.4
Downloading deque v0.3.1
Downloading yaml-rust v0.3.2
Downloading serde_codegen v0.7.6
Downloading syntex v0.32.0 <-------------- Here
Downloading syntex_syntax v0.33.0
Downloading syntex v0.33.0 <-------------- And here
Downloading aster v0.17.0
Downloading quasi v0.11.0
Downloading quasi_codegen v0.11.0
Compiling serde v0.7.6
Compiling log v0.3.6
Compiling glob v0.2.11
Compiling winapi-build v0.1.1
Compiling bitflags v0.5.0
Compiling num-traits v0.1.32
Compiling unicode-xid v0.0.3
Compiling winapi v0.2.7
Compiling vec_map v0.6.0
Compiling byteorder v0.3.13
Compiling ansi_term v0.7.2
Compiling rustc-serialize v0.3.19
Compiling yaml-rust v0.3.2
Compiling libc v0.2.11
Compiling kernel32-sys v0.2.2
Compiling unicode-width v0.1.3
Compiling term v0.2.14
Compiling walkdir v0.1.5
Compiling rand v0.3.14
Compiling num_cpus v0.2.12
Compiling strsim v0.4.1
Compiling maplit v0.1.3
Compiling deque v0.3.1
Compiling rayon v0.3.1
Compiling clap v2.5.2
Compiling serde_json v0.7.1
Compiling serde_cbor v0.3.3
Compiling serde_yaml v0.2.4
Compiling syntex_syntax v0.32.0
Compiling syntex_syntax v0.33.0
Compiling syntex v0.32.0
Compiling syntex v0.33.0
Compiling quasi v0.11.0
Compiling aster v0.17.0
Compiling quasi_codegen v0.11.0
Compiling serde_codegen v0.7.6
Compiling tokei v2.1.1
/Users/nm46057/.cargo/registry/src/github.com-88ac128001ac3a9a/tokei-2.1.1/build.rs:15:29: 15:42 error: mismatched types:
expected `&mut syntex::Registry`,
found `&mut syntex::Registry`
(expected struct `syntex::Registry`,
found a different struct `syntex::Registry`) [E0308]
/Users/nm46057/.cargo/registry/src/github.com-88ac128001ac3a9a/tokei-2.1.1/build.rs:15 serde_codegen::register(&mut registry);
^~~~~~~~~~~~~
/Users/nm46057/.cargo/registry/src/github.com-88ac128001ac3a9a/tokei-2.1.1/build.rs:15:29: 15:42 help: run `rustc --explain E0308` to see a detailed explanation
/Users/nm46057/.cargo/registry/src/github.com-88ac128001ac3a9a/tokei-2.1.1/build.rs:15:29: 15:42 note: Perhaps two different versions of crate `syntex` are being used?
/Users/nm46057/.cargo/registry/src/github.com-88ac128001ac3a9a/tokei-2.1.1/build.rs:15 serde_codegen::register(&mut registry);
^~~~~~~~~~~~~
error: aborting due to previous error
error: failed to compile `tokei v2.1.1`, intermediate artifacts can be found at `/Users/nm46057/projects/rust/target-install`
Caused by:
Could not compile `tokei`.
To learn more, run the command again with --verbose.
For example:
tokei --files D:\tmp\新建文件夹\tokei
-------------------------------------------------------------------------------
Language Files Lines Code Comments Blanks
-------------------------------------------------------------------------------
...
-------------------------------------------------------------------------------
thread 'main' panicked at 'index 17 and/or 41 in `D:\tmp\新建文件夹\tokei\CHANGELOG.md` do not lie on character boundary', ../src/libcore\str/mod.rs:1721
stack backtrace:
0: 0xd98f9b - std::rt::lang_start::haaae1186de9de8cb
1: 0xd9989a - std::panicking::rust_panic_with_hook::hb1322e5f2588b4db
2: 0xd99736 - std::panicking::begin_panic_fmt::h4fe9fb9d5109c4bf
3: 0xd995eb - rust_begin_unwind
4: 0xda05f7 - core::panicking::panic_fmt::h4395919ece15c671
5: 0xda12c0 - core::str::slice_error_fail::h9ad582f282290cd6
6: 0xd0c602 - <tokei::stats::Stats as core::fmt::Display>::fmt::h4eaabf7e66bef4ca
7: 0xcf5bae - __ImageBase
8: 0xda25af - core::fmt::write::h300d6e605e327781
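The panic comes from slicing the path at a fixed byte index that falls inside a multi-byte character. A hedged sketch of a boundary-safe truncation helper (truncate_for_display is a hypothetical name, not the actual fix):

```rust
// Sketch: truncate a path for display without panicking on multi-byte
// characters, by backing the cut index up to the nearest char boundary.
fn truncate_for_display(s: &str, max_bytes: usize) -> &str {
    if s.len() <= max_bytes {
        return s;
    }
    let mut end = max_bytes;
    while !s.is_char_boundary(end) {
        end -= 1;
    }
    &s[..end]
}
```

For the path above, a CJK character occupies three bytes, so cutting at byte 17 or 41 must be rounded down to a boundary first.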
❯ tree pkg/migrations/
pkg/migrations/
├── migrations.go
└── migrations_test.go
0 directories, 2 files
❯ tokei -e test pkg/migrations
-------------------------------------------------------------------------------
Language Files Lines Code Comments Blanks
-------------------------------------------------------------------------------
Go 2 598 479 70 49
-------------------------------------------------------------------------------
Total 2 598 479 70 49
-------------------------------------------------------------------------------
.sc, .scala
comment ::=
‘/*’ “any sequence of characters; nested comments are allowed” ‘*/’ |
‘//’ “any sequence of characters up to end of line”
Please tag the last 3 releases (for packagers).
Right now, tokei just silently ignores errors (like invalid encoding) when processing source files.
This leads to the user seeing incorrect results in these cases, without knowing that they are incorrect because errors happened when processing the files.
Stuff like the following should be 5 lines of code.
<<<END
comment
//
/*
END
Just like HTML, but in addition {{! ... }} is also a comment. Extensions: .hbs, .handlebars.
An option like tokei --file-list, which would just print the list of files used, would be useful for checking the exclude parameter.
Right now, tokei just silently ignores files that are not UTF-8, giving incorrect line counts compared to cloc.
github.com/AlDanial/cloc v 1.70 T=0.47 s (74.1 files/s, 56982.6 lines/s)
-------------------------------------------------------------------------------
Language files blank comment code
-------------------------------------------------------------------------------
C++ 22 2344 912 23023
C/C++ Header 13 119 7 520
-------------------------------------------------------------------------------
SUM: 35 2463 919 23543
-------------------------------------------------------------------------------
[snake@neko mog]$ tokei sources/
-------------------------------------------------------------------------------
Language Files Lines Code Comments Blanks
-------------------------------------------------------------------------------
C Header 13 646 0 527 119
C++ 14 6671 73 5854 744
-------------------------------------------------------------------------------
Total 27 7317 73 6381 863
-------------------------------------------------------------------------------
The files that tokei didn't count are encoded in ISO-8859.
Even if we decide explicitly not to support encodings other than UTF-8, tokei should warn when ignoring files due to not being UTF-8, instead of silently giving incorrect results. #54
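A minimal sketch of the warning behaviour suggested here, assuming files are read as raw bytes first (decode_or_warn is a hypothetical helper, not tokei's code):

```rust
// Sketch: attempt UTF-8 decoding and emit a warning on stderr (instead of
// silently skipping) when a file's bytes are not valid UTF-8.
fn decode_or_warn(path: &str, bytes: &[u8]) -> Option<String> {
    match std::str::from_utf8(bytes) {
        Ok(text) => Some(text.to_owned()),
        Err(e) => {
            eprintln!(
                "tokei: warning: {} is not valid UTF-8 (error at byte {}); skipping",
                path,
                e.valid_up_to()
            );
            None
        }
    }
}
```

Even without transcoding support, surfacing the skipped files would explain the large gap between the tokei and cloc totals above.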
The link to the homepage is broken (as in 404) on crates.io.
When running tokei dirname/, it returns a line with the D language, but there is no D code in my directory.