
tokei's Issues

Add more formats for I/O.

To add further formats, a serde implementation that can handle the current format needs to be available. If anyone has any other formats they'd like to see, I'll add them to the list.

  • TOML (will probably be in the next version)
  • XML (currently waiting on serde-xml to support serialization)
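As a sketch of what a new format adapter amounts to (the `Report` type and `to_json` function here are hypothetical, not tokei's API), each output format is essentially a function from the collected statistics to a string:

```rust
use std::collections::BTreeMap;

// Hypothetical shape: per-language totals as they might be handed to a
// format serializer, as (code, comments, blanks).
type Report = BTreeMap<String, (usize, usize, usize)>;

// A minimal hand-rolled JSON emitter standing in for a serde-based one;
// supporting another output format (TOML, XML, ...) amounts to adding
// one more function with this signature.
fn to_json(report: &Report) -> String {
    let entries: Vec<String> = report
        .iter()
        .map(|(lang, &(code, comments, blanks))| {
            format!(
                "\"{}\":{{\"code\":{},\"comments\":{},\"blanks\":{}}}",
                lang, code, comments, blanks
            )
        })
        .collect();
    format!("{{{}}}", entries.join(","))
}
```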

OCaml doesn't seem to count anything as code

Input:

(*  Title:      Tools/induct.ML
    Author:     Markus Wenzel, TU Muenchen

Proof by cases, induction, and coinduction.
*)

signature INDUCT_ARGS =
sig
  val cases_default: thm
  val atomize: thm list
  val rulify: thm list
  val rulify_fallback: thm list
  val equal_def: thm
  val dest_def: term -> (term * term) option
  val trivial_tac: Proof.context -> int -> tactic
end;

Output:

$ tokei induct.ML
-------------------------------------------------------------------------------
 Language            Files        Total       Blanks     Comments         Code
-------------------------------------------------------------------------------
 OCaml                   1           16            2           14            0
-------------------------------------------------------------------------------
 Total                   1           16            2           14            0
-------------------------------------------------------------------------------
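The fix needs a nesting-aware scanner for `(* ... *)`. A minimal sketch of the required state machine follows (an illustration, not tokei's implementation; string literals are ignored for brevity):

```rust
// Classify each line of OCaml source as code, comment, or blank.
// OCaml (* ... *) comments nest, so a depth counter is required
// rather than a simple "inside a comment" flag.
fn count_ocaml(source: &str) -> (usize, usize, usize) {
    let (mut code, mut comments, mut blanks) = (0, 0, 0);
    let mut depth = 0usize;
    for line in source.lines() {
        let trimmed = line.trim();
        if trimmed.is_empty() {
            blanks += 1;
            continue;
        }
        let bytes = trimmed.as_bytes();
        let mut saw_code = false;
        let mut i = 0;
        while i < bytes.len() {
            if bytes[i..].starts_with(b"(*") {
                depth += 1;
                i += 2;
            } else if depth > 0 && bytes[i..].starts_with(b"*)") {
                depth -= 1;
                i += 2;
            } else {
                // Any non-whitespace byte outside a comment is code.
                if depth == 0 && !bytes[i].is_ascii_whitespace() {
                    saw_code = true;
                }
                i += 1;
            }
        }
        if saw_code { code += 1; } else { comments += 1; }
    }
    (code, comments, blanks)
}
```

On the sample above, a scanner like this yields non-zero code lines instead of classifying everything after the header comment as comments.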

Autoconf input file mistaken for Rust code

Tokei counts files with the extension .in as Rust code.

Tokei version: 2.1.1 (installed via cargo install on stable rustc/cargo)

Steps to reproduce:

  • echo "a\n" > code.in
  • tokei code.in

Actual Results:

code.in is counted as Rust code

Expected Results:

code.in should be recognised as an autoconf input file.

Add an option to map an extension to a language

It would be very convenient to have such an option when an extension is unknown for a given language, or when an extension is ambiguous (e.g. .cgi, .inc).
It could be used to override a default mapping, or even discard one; for instance, I have a .pro file which is a Qt Creator project and not a Prolog file. But maybe in the latter case it would be cleaner to have a dedicated option to ignore a given extension.
I'll try to submit a PR, but any guidance would be greatly appreciated :)
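A sketch of how a user-supplied override table could layer on top of the built-in extension table (all names here are hypothetical, not tokei's API):

```rust
use std::collections::HashMap;

// A user override table consulted before the built-in
// extension -> language table. An override value of `None` means
// "ignore this extension entirely".
fn resolve_language<'a>(
    builtin: &HashMap<&'a str, &'a str>,
    overrides: &HashMap<&'a str, Option<&'a str>>,
    extension: &str,
) -> Option<&'a str> {
    match overrides.get(extension) {
        Some(&Some(lang)) => Some(lang), // user remapped the extension
        Some(&None) => None,             // user asked to discard it
        None => builtin.get(extension).copied(),
    }
}
```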

Language flag does not work

If you use -l or --languages, tokei returns the error the following required arguments were not supplied: '...'. But you should not have to provide an input path just to see the list of supported languages.

-s comments option broken

$ tokei -s comments src/
error: 'comments' isn't a valid value for '--sort <sort>'
    [values:blanks code commments files total]

    Did you mean 'commments' ?

$ tokei -s commments src/
-------------------------------------------------------------------------------
 Language            Files        Lines         Code     Comments       Blanks
-------------------------------------------------------------------------------
thread '<main>' panicked at 'internal error: entered unreachable code', /tmp/cargo-install.yLVjW5kEQlUb/release/build/tokei-8af4b02509125ebd/out/main.rs:3567
note: Run with `RUST_BACKTRACE=1` for a backtrace.

0 Lines reported for some files

For some files (plain text, Markdown), tokei reports 0 lines in total, but more than 0 lines of code.

For example, for README.md in the tokei repository, which has 252 lines, the output looks like this:

> tokei README.md 
-------------------------------------------------------------------------------
 Language            Files        Lines         Code     Comments       Blanks
-------------------------------------------------------------------------------
 Markdown                1            0          252            0            0
-------------------------------------------------------------------------------
 Total                   1            0          252            0            0
-------------------------------------------------------------------------------
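Whatever the root cause, the columns should satisfy lines = code + comments + blanks. A sketch of deriving the total instead of tracking it separately (the Stats shape here is hypothetical):

```rust
// Hypothetical per-file statistics: deriving `lines` from the other
// three columns makes a "0 lines but 252 code" row impossible.
struct Stats {
    code: usize,
    comments: usize,
    blanks: usize,
}

impl Stats {
    fn lines(&self) -> usize {
        self.code + self.comments + self.blanks
    }
}
```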

Add License File

Please add a license file so we know whether the project is under GPLv3, MIT, or some other license.

README does not clearly describe crate

I didn't know what this crate did before I looked at the README, and it took me a rather long time to figure it out. "Count code quickly" is pithy but nonobvious. I think an above-the-fold example would explain the crate's purpose without requiring a change in the verbiage.

Total rust lines of code

If a Cargo.lock exists in a Rust project, it should be possible to also include the dependencies from the local .cargo directory in offline mode, probably via a --deps switch or similar.

Not sure about online mode or other languages. Have you considered this functionality?

Update existing languages with their string literals.

Now that comments are handled differently inside and outside of quotes, all existing languages need to be updated with their string delimiters, as the current default is (", ") which obviously doesn't apply to all languages. All that is needed is to go to src/lib/language/languages.rs and use the set_quotes() method.

  • ActionScript
  • Ada
  • Assembly
  • Autoconf
  • BASH
  • Batch
  • C
  • C Header
  • Clojure
  • CoffeeScript
  • ColdFusion
  • ColdFusion CFScript
  • Coq
  • C++
  • C++ Header
  • C#
  • C Shell
  • CSS
  • D
  • Dart
  • Device Tree
  • Erlang
  • Forth
  • FORTRAN Legacy
  • FORTRAN Modern
  • Go
  • Handlebars
  • Haskell
  • HTML
  • Idris
  • Isabelle
  • JAI
  • Java
  • JavaScript
  • Julia
  • JSON
  • JSX
  • Kotlin
  • LESS
  • LD Script
  • LISP
  • Lua
  • Makefile
  • Markdown
  • Mustache
  • Nim
  • Objective C
  • Objective C++
  • OCaml
  • Oz
  • Pascal
  • Perl
  • Polly
  • PHP
  • Protocol Buffers
  • Prolog
  • Python
  • QCL
  • R
  • Ruby
  • Ruby HTML
  • Rust
  • Sass
  • Scala
  • Standard ML
  • SQL
  • Swift
  • TeX
  • Plain Text
  • TOML
  • TypeScript
  • Vim Script
  • Unreal Script
  • Wolfram
  • XML
  • YAML
  • Zsh
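As a sketch of what the per-language quote configuration might look like (set_quotes() is named above, but the struct shape here is illustrative, not tokei's real one):

```rust
// Illustrative shape only: each language carries the string delimiters
// that suspend comment detection while inside a literal.
#[derive(Debug)]
struct Language {
    name: &'static str,
    quotes: Vec<(&'static str, &'static str)>,
}

impl Language {
    fn new(name: &'static str) -> Self {
        // The default the issue describes: plain double quotes.
        Language { name, quotes: vec![("\"", "\"")] }
    }

    fn set_quotes(mut self, quotes: Vec<(&'static str, &'static str)>) -> Self {
        self.quotes = quotes;
        self
    }
}
```

A language with richer literals, such as Python with its triple-quoted strings, would override the default with its full delimiter set.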

JSON files are counted as all comment lines?

For example:

$ cat hello.json
{
    "name": "tokei",
    "height": 150
}
$ tokei hello.json
-------------------------------------------------------------------------------
 Language            Files        Total       Blanks     Comments         Code
-------------------------------------------------------------------------------
 JSON                    1            4            0            4            0
-------------------------------------------------------------------------------
 Total                   1            4            0            4            0
-------------------------------------------------------------------------------
  • Is this behavior as per the specification?
  • Markdown files show the same behavior, too.

By the way, cloc behaves as follows:

$ cloc --version
1.66
$ cloc hello.json
       1 text file.
       1 unique file.
       0 files ignored.

https://github.com/AlDanial/cloc v 1.66  T=0.01 s (135.7 files/s, 542.9 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
JSON                             1              0              0              4
-------------------------------------------------------------------------------

Sorry for my broken English.

Improve tokei's language test coverage.

There should be a file for each language that covers every edge case. The file should contain every variant of comments and quotes, as well as a comment at the top of the file listing the manually verified lines, code, comments, and blanks, e.g. // 39 lines 32 code 2 comments 5 blanks. A good example of a test file is tests/data/rust.rs.

// 39 lines 32 code 2 comments 5 blanks

/* /**/ */
fn main() {
    let start = "/*";
    loop {
        if x.len() >= 2 && x[0] == '*' && x[1] == '/' { // found the */
            break;
        }
    }
}

fn foo() {
    let this_ends = "a \"test/*.";
    call1();
    call2();
    let this_does_not = /* a /* nested */ comment " */
        "*/another /*test
            call3();
            */";
}

fn foobar() {
    let does_not_start = // "
        "until here,
        test/*
        test"; // a quote: "
    let also_doesnt_start = /* " */
        "until here,
        test,*/
        test"; // another quote: "
}

fn foo() {
    let a = 4; // /*
    let b = 5;
    let c = 6; // */
}

Languages

  • ActionScript
  • Ada
  • Agda
  • Alex
  • ASP
  • ASP.NET
  • Assembly
  • AutoHotKey
  • Autoconf
  • BASH
  • Batch
  • C
  • C Header
  • CMake
  • C#
  • C Shell
  • Cabal
  • Cassius
  • Ceylon
  • Clojure
  • ClojureScript
  • CoffeeScript
  • Cogent
  • ColdFusion
  • ColdFusion CFScript
  • Coq
  • C++
  • C++ Header
  • Crystal
  • CSS
  • D
  • Dart
  • Device Tree
  • Dockerfile
  • Emacs Lisp
  • Elixir
  • Elm
  • Emacs Dev Env
  • Erlang
  • F#
  • Fish
  • Forth
  • FORTRAN Legacy
  • FORTRAN Modern
  • F*
  • GDScript
  • GLSL
  • Go
  • Hamlet
  • Handlebars
  • Happy
  • Haskell
  • HEX
  • HTML
  • Idris
  • Intel HEX
  • Isabelle
  • JAI
  • Java
  • JavaScript
  • JSON
  • JSX
  • Julia
  • Julius
  • Kotlin
  • Lean
  • LESS
  • LD Script
  • LISP
  • Lua
  • Lucius
  • Madlang
  • Makefile
  • Markdown
  • Module-Definition
  • MSBuild
  • Mustache
  • Nim
  • Nix
  • OCaml
  • Objective C
  • Objective C++
  • Org
  • Oz
  • PSL Assertion
  • Pascal
  • Perl
  • PHP
  • Polly
  • Processing
  • Prolog
  • Protocol Buffers
  • PureScript
  • Python
  • QCL
  • QML
  • R
  • Rakefile
  • Razor
  • ReStructuredText
  • Ruby
  • Ruby HTML
  • Rust
  • SRecode Template
  • Sass
  • Scala
  • Scons
  • Shell
  • Standard ML (SML)
  • Specman e
  • Spice Netlist
  • SQL
  • SVG
  • Swift
  • SystemVerilog
  • TCL
  • TeX
  • Plain Text
  • TOML
  • TypeScript
  • Unreal Script
  • Ur/Web
  • Ur/Web Project
  • Vala
  • Verilog
  • Verilog Args File
  • VHDL
  • Vim Script
  • Visual Basic
  • Wolfram
  • XAML
  • XML
  • Xtend
  • YAML
  • Zsh
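The verification comment each test file starts with could be checked mechanically. A sketch of a parser for the // 39 lines 32 code 2 comments 5 blanks format (a hypothetical helper, not part of tokei):

```rust
// Parse a verification header such as
// "// 39 lines 32 code 2 comments 5 blanks" into
// (lines, code, comments, blanks). Returns None when the line
// does not contain exactly four numbers.
fn parse_expected(header: &str) -> Option<(usize, usize, usize, usize)> {
    let nums: Vec<usize> = header
        .split_whitespace()
        .filter_map(|token| token.parse().ok())
        .collect();
    match nums.as_slice() {
        &[lines, code, comments, blanks] => Some((lines, code, comments, blanks)),
        _ => None,
    }
}
```

A test harness could then compare these four numbers against tokei's own counts for the same file.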

Nested comments not handled

Example foo.rs:

Foo 
/*

       Bar

       /*

       Baz

       */

    asdf
    'asdf
    asdf
    asdf
    asdf
    asdf
    asdf
    asdf
    asd
    fasdf
    asd
    f
    */
$ tokei foo.rs
-------------------------------------------------------------------------------
 Language            Files        Total       Blanks     Comments         Code
-------------------------------------------------------------------------------
 Rust                    1           24            5            5           14
-------------------------------------------------------------------------------
 Total                   1           24            5            5           14
-------------------------------------------------------------------------------

It counts the comment lines after the nested */ as code.

Number of files tallying doesn't work

It seems piping output around doesn't update the file total; in addition, only the file types from the final command are tallied.

~$  tokei usercorn-unstable/
-------------------------------------------------------------------------------
 Language            Files        Lines         Code     Comments       Blanks
-------------------------------------------------------------------------------
 Go                    149         9163         7886          298          979
 Makefile                1           26           18            0            8
 Markdown                3          254          254            0            0
 Python                  2          154          125            2           27
 YAML                    1           20           17            0            3
-------------------------------------------------------------------------------
 Total                 156         9617         8300          300         1017
-------------------------------------------------------------------------------


~$  tokei go/src/github.com/
-------------------------------------------------------------------------------
 Language            Files        Lines         Code     Comments       Blanks
-------------------------------------------------------------------------------
 BASH                    1            8            6            1            1
 C Header                6         3435         2939          400           96
 Go                    170        29590        25755         1492         2343
 Markdown               13         1090         1090            0            0
 Python                  2          130          101            9           20
 YAML                    6           69           65            0            4
-------------------------------------------------------------------------------
 Total                 198        34322        29956         1902         2464
-------------------------------------------------------------------------------

~$  tokei -o json go/src/github.com/ | tokei -i stdin usercorn-unstable/
-------------------------------------------------------------------------------
 Language            Files        Lines         Code     Comments       Blanks
-------------------------------------------------------------------------------
 Go                    149        38753        33641         1790         3322
 Makefile                1           26           18            0            8
 Markdown                3         1344         1344            0            0
 Python                  2          284          226           11           47
 YAML                    1           89           82            0            7
-------------------------------------------------------------------------------
 Total                 163        43939        38256         2202         3481
-------------------------------------------------------------------------------

Code column at right

When you look at the result, it takes time to find the most important data, the lines of code. It should be at the right.

See the output of gocloc: https://github.com/hhatto/gocloc#basic-usage

$ gocloc .
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Markdown                         3              8              0             18
Go                               1             29              1            323
-------------------------------------------------------------------------------
TOTAL                            4             37              1            341
-------------------------------------------------------------------------------

Add .pm to Perl extensions

.pm files are Perl modules.
I checked in src/lib/language/language_type.rs and there's no conflict with another language.
The modification seems trivial, I can make a PR if you want.

tokei -f: Unnecessary `|` notation in cases where filename barely fits

$ tokei -f tokeitest_123/
-------------------------------------------------------------------------------
 Language            Files        Lines         Code     Comments       Blanks
-------------------------------------------------------------------------------
 Rust                    3           15           12            0            3
-------------------------------------------------------------------------------
 |keitest_123/123456789.rs            5            4            0            1
 |okeitest_123/12345678.rs            5            4            0            1
 tokeitest_123/1234567.rs             5            4            0            1
-------------------------------------------------------------------------------
 Total                   3           15           12            0            3
-------------------------------------------------------------------------------

In the second case, |okeitest_123/12345678.rs and tokeitest_123/12345678.rs take the same amount of space, so it's unnecessary to replace the first character.
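A sketch of the suggested fix: only replace the first character with | when characters were actually dropped (a hypothetical helper; char-counted here, so wide characters are still measured naively):

```rust
// Fit `path` into `width` columns, prefixing '|' only when characters
// were actually dropped; a path that exactly fits is left untouched.
fn fit_path(path: &str, width: usize) -> String {
    let total = path.chars().count();
    if total <= width {
        path.to_string()
    } else {
        // Keep the last width-1 characters and mark the cut with '|'.
        let keep = width.saturating_sub(1);
        let tail: String = path.chars().skip(total - keep).collect();
        format!("|{}", tail)
    }
}
```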

cargo build fails with --features=all

Hi,

I'm trying to build tokei, but an error occurred.

Platform: OS X 10.11.6 (15G1208)
Command: cargo build --verbose --release --features=all

rustc 1.13.0 (2c6933acc 2016-11-07)

Result

  Compiling tokei v4.5.2 (file:///Users/neo/Desktop/tokei)
     Running `rustc src/main.rs --crate-name tokei --crate-type bin -C opt-level=3 --cfg feature=\"hex\" --cfg feature=\"default\" --cfg feature=\"toml-io\" --cfg feature=\"io\" --cfg feature=\"serde_codegen\" --cfg feature=\"serde_yaml\" --cfg feature=\"all\" --cfg feature=\"cbor\" --cfg feature=\"serde_json\" --cfg feature=\"serde_cbor\" --cfg feature=\"serde\" --cfg feature=\"toml\" --cfg feature=\"json\" --cfg feature=\"yaml\" -C metadata=dc2f5d3a69030dfc --out-dir /Users/neo/Desktop/tokei/target/release --emit=dep-info,link -L dependency=/Users/neo/Desktop/tokei/target/release/deps --extern clap=/Users/neo/Desktop/tokei/target/release/deps/libclap-9178c8b70f5c5456.rlib --extern serde_yaml=/Users/neo/Desktop/tokei/target/release/deps/libserde_yaml-3758ab135f6f7d90.rlib --extern rayon=/Users/neo/Desktop/tokei/target/release/deps/librayon-c3557c1c242af173.rlib --extern ignore=/Users/neo/Desktop/tokei/target/release/deps/libignore-e66729014aae0f37.rlib --extern log=/Users/neo/Desktop/tokei/target/release/deps/liblog-bf16bb9a4912b11d.rlib --extern serde=/Users/neo/Desktop/tokei/target/release/deps/libserde-97f01bf227222121.rlib --extern serde_json=/Users/neo/Desktop/tokei/target/release/deps/libserde_json-b812e04ba18930d3.rlib --extern toml=/Users/neo/Desktop/tokei/target/release/deps/libtoml-cf3bfced9e77aba4.rlib --extern regex=/Users/neo/Desktop/tokei/target/release/deps/libregex-36c8e259ac5ba542.rlib --extern maplit=/Users/neo/Desktop/tokei/target/release/deps/libmaplit-50ef67709de53e77.rlib --extern hex=/Users/neo/Desktop/tokei/target/release/deps/libhex-c4311871abf53460.rlib --extern lazy_static=/Users/neo/Desktop/tokei/target/release/deps/liblazy_static-7f1b96a3a3eb529d.rlib --extern encoding=/Users/neo/Desktop/tokei/target/release/deps/libencoding-804c203c6e9b8b1a.rlib --extern serde_cbor=/Users/neo/Desktop/tokei/target/release/deps/libserde_cbor-7aefca53f259cf88.rlib --extern 
env_logger=/Users/neo/Desktop/tokei/target/release/deps/libenv_logger-c716af707f2027e1.rlib --extern tokei=/Users/neo/Desktop/tokei/target/release/deps/libtokei.rlib`
error: Could not compile `tokei`.

Caused by:
  process didn't exit successfully: `rustc src/main.rs --crate-name tokei --crate-type bin -C opt-level=3 --cfg feature="hex" --cfg feature="default" --cfg feature="toml-io" --cfg feature="io" --cfg feature="serde_codegen" --cfg feature="serde_yaml" --cfg feature="all" --cfg feature="cbor" --cfg feature="serde_json" --cfg feature="serde_cbor" --cfg feature="serde" --cfg feature="toml" --cfg feature="json" --cfg feature="yaml" -C metadata=dc2f5d3a69030dfc --out-dir /Users/neo/Desktop/tokei/target/release --emit=dep-info,link -L dependency=/Users/neo/Desktop/tokei/target/release/deps --extern clap=/Users/neo/Desktop/tokei/target/release/deps/libclap-9178c8b70f5c5456.rlib --extern serde_yaml=/Users/neo/Desktop/tokei/target/release/deps/libserde_yaml-3758ab135f6f7d90.rlib --extern rayon=/Users/neo/Desktop/tokei/target/release/deps/librayon-c3557c1c242af173.rlib --extern ignore=/Users/neo/Desktop/tokei/target/release/deps/libignore-e66729014aae0f37.rlib --extern log=/Users/neo/Desktop/tokei/target/release/deps/liblog-bf16bb9a4912b11d.rlib --extern serde=/Users/neo/Desktop/tokei/target/release/deps/libserde-97f01bf227222121.rlib --extern serde_json=/Users/neo/Desktop/tokei/target/release/deps/libserde_json-b812e04ba18930d3.rlib --extern toml=/Users/neo/Desktop/tokei/target/release/deps/libtoml-cf3bfced9e77aba4.rlib --extern regex=/Users/neo/Desktop/tokei/target/release/deps/libregex-36c8e259ac5ba542.rlib --extern maplit=/Users/neo/Desktop/tokei/target/release/deps/libmaplit-50ef67709de53e77.rlib --extern hex=/Users/neo/Desktop/tokei/target/release/deps/libhex-c4311871abf53460.rlib --extern lazy_static=/Users/neo/Desktop/tokei/target/release/deps/liblazy_static-7f1b96a3a3eb529d.rlib --extern encoding=/Users/neo/Desktop/tokei/target/release/deps/libencoding-804c203c6e9b8b1a.rlib --extern serde_cbor=/Users/neo/Desktop/tokei/target/release/deps/libserde_cbor-7aefca53f259cf88.rlib --extern 
env_logger=/Users/neo/Desktop/tokei/target/release/deps/libenv_logger-c716af707f2027e1.rlib --extern tokei=/Users/neo/Desktop/tokei/target/release/deps/libtokei.rlib` (signal: 9, SIGKILL: kill)

Compilation also fails with cargo install tokei --features all.

tokei panics when the output param is set

rustc 1.13.0 (2c6933acc 2016-11-07)
binary: rustc
commit-hash: 2c6933acc05c61e041be764cb1331f6281993f3f
commit-date: 2016-11-07
host: x86_64-apple-darwin
release: 1.13.0
NeoMacBook-Pro:~ neo$ RUST_BACKTRACE=1 tokei Development --output json
thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: Syntax(KeyMustBeAString, 0, 0)', ../src/libcore/result.rs:799
stack backtrace:
   1:        0x105efa0a8 - std::sys::backtrace::tracing::imp::write::h6f1d53a70916b90d
   2:        0x105efd2ff - std::panicking::default_hook::{{closure}}::h137e876f7d3b5850
   3:        0x105efc395 - std::panicking::default_hook::h0ac3811ec7cee78c
   4:        0x105efc9a6 - std::panicking::rust_panic_with_hook::hc303199e04562edf
   5:        0x105efc844 - std::panicking::begin_panic::h6ed03353807cf54d
   6:        0x105efc762 - std::panicking::begin_panic_fmt::hc321cece241bb2f5
   7:        0x105efc6c7 - rust_begin_unwind
   8:        0x105f287b0 - core::panicking::panic_fmt::h27224b181f9f037f
   9:        0x105d71b1a - core::result::unwrap_failed::h9916702f61d59849
  10:        0x105da3f30 - tokei::main::h788afbac7a995aa4
  11:        0x105efd8ba - __rust_maybe_catch_panic
  12:        0x105efbed6 - std::rt::lang_start::h538f8960e7644c80

And sorry for the inconvenience of having posted this in the wrong place.

Add Man page

It is common to have a man page for command line utilities.

Test coverage in fsutil.rs

After the update to the has_trailing_comments function in #35, the tests no longer cover the new nested=false functionality.

Exclude flag does not work as noted in help

The -e flag does not work. --exclude only works if you set it equal to what you are excluding (--exclude=dirname/), but it should also work as (--exclude dirname/), as noted in the help.

Inaccurate counts

Today I thought I'd test Tokei before running it on my code, and I found some cases that it doesn't handle correctly. I'll include them in one issue because I assume they will be fixed by the same change, but I can split them into multiple issues if you'd like.

There are 8 lines of Rust code here, but Tokei reports 3:

fn main() {
    let start = "/*";
    loop {
        if x.len() >= 2 && x[0] == '*' && x[1] == '/' { // found the */
          break;
        }
    }
}

There are 6 lines of code here, but Tokei reports 5:

fn foo() {
    let no_comment = "
    // another test
    ";
    func();
}

There are 9 lines of code here, but Tokei panics:

fn foo() {
    let this_ends = "a \"test/*.";
    call1();
    call2();
    let this_does_not = /* a /* nested */ comment " */
                        "*/another /*test
    call3();
    */";
}

There are 10 lines of code here, but Tokei reports 7:

fn foobar() {
    let does_not_start = // "
        "until here,
        test/*
        test"; // a quote: "
    let also_doesnt_start = /* " */
        "until here,
        test,*/
        test"; // another quote: "
}

There are 5 lines of code here, but Tokei reports 2:

fn foo() {
    let a = 4; // /*
    let b = 5;
    let c = 6; // */
}

There are 5 lines of D code here, but Tokei reports 2:

void main() {
    auto x = 5; /+ a /+ nested +/ comment /* +/
    writefln("hello");
    auto y = 4; // */
}

I typed these out without trying to compile them, so sorry if I made a mistake.

I have written real-world code where this kind of implementation would miscount a lot of code, which is why I check out LOC counters before using them.

Add package repositories.

I basically want what ripgrep has, but I don't really understand all the requirements for submitting to these package managers. Any help would be very appreciated.

  • Arch Linux
  • Chocolatey
  • Debian/Ubuntu
  • Fedora 24+
  • FreeBSD
  • Homebrew
  • Gentoo
  • MacPorts
  • Nix
  • RHEL/CentOS

Panic when piping output to less and quitting before all output displayed on Windows

See below for reproduction steps. Once the "Counting files..." portion finishes, scroll down to view the first couple of files and then quit; the panic will be shown. I haven't been able to check whether the same issue occurs on Linux yet.

Likely, any print! and println! invocations should be changed to write! or writeln! so that the Result can be handled.

$ cargo install tokei
$ RUST_BACKTRACE=1 tokei . --files | less
thread 'main' panicked at 'failed printing to stdout: The pipe is being closed. (os error 232)', ../src/libstd\io\stdio.rs:617
thread '<unnamed>' panicked at 'failed printing to stdout: The pipe is being closed. (os error 232)', ../src/libstd\io\stdio.rs:617
stack backtrace:
   0:        0x13f4e00ec - <unknown>
   1:        0x13f4df6f9 - <unknown>
   2:        0x13f4cd05d - <unknown>
   3:        0x13f4e272b - <unknown>
   4:        0x13f4cdf5f - <unknown>
   5:        0x13f4d62e3 - <unknown>
   6:        0x13f2a22ad - <unknown>
   7:        0x13f4df0cc - <unknown>
   8:        0x13f4e27c1 - <unknown>
   9:        0x13f4dee04 - <unknown>
  10:        0x13f2cfdc9 - <unknown>
  11:        0x13f4ec1d8 - <unknown>
  12:         0x76d459bc - BaseThreadInitThunk

https://github.com/rust-lang/rust/blob/1.11.0/src/libstd/io/stdio.rs#L617
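A sketch of the suggested change, writing through a handle so a closed pipe ends output cleanly (a hypothetical helper, not tokei's code):

```rust
use std::io::{self, Write};

// Write rows through a Write handle; a broken pipe (the reader, e.g.
// `less`, exited early) ends output quietly instead of panicking the
// way a bare println! does.
fn print_rows<W: Write>(out: &mut W, rows: &[&str]) -> io::Result<()> {
    for row in rows {
        if let Err(e) = writeln!(out, "{}", row) {
            if e.kind() == io::ErrorKind::BrokenPipe {
                return Ok(());
            }
            return Err(e);
        }
    }
    Ok(())
}
```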

Fortran, my friend!

There are multiple issues with Fortran that prevent it from being used with tokei. First, two syntaxes exist: the modern (or free-form) one, and the legacy (or fixed-form) one. They can be considered two different languages in tokei. The main issues are:

  1. Multiple file extensions. For the modern syntax, the following file extensions are recognised: .f90, .F90, .f95, .F95, .f03, .f08 (and maybe .F03 and .F08, but I have never encountered them). For the legacy syntax, you can use .f, .F, .f77, .F77, .fpp, .FPP.
  2. Multiple comment characters: in the legacy syntax, the c, C, or * characters indicate a comment, but only if found in the columns from 1 to 7.
  3. Line continuation characters: in all syntaxes, the & character is a line continuation, and it is used a lot in the legacy syntax due to the 72-character line limit.

So, how can I work around these issues?

  1. Could there be a way to group multiple extensions under one language in the final count?
  2. Maybe use regexes for comment detection, if needed?
  3. I believe this exists for lots of other languages, so it may be useful to have a line_continuation field in the Language struct. But I do not know the usual way cloc-style programs handle it: whether to ignore or to merge continued lines.
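The fixed-form comment rule from point 2 is easy to state in code. A simplified sketch checking only column 1 (an illustration: tokei's current comment model has no notion of column positions):

```rust
// Legacy (fixed-form) Fortran: treat a line as a comment when 'c',
// 'C', or '*' appears in column 1, rather than anywhere on the line.
fn is_fixed_form_comment(line: &str) -> bool {
    matches!(line.chars().next(), Some('c') | Some('C') | Some('*'))
}
```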

Unable to compile due to conflicting syntex versions

I just tried installing tokei using cargo install tokei, but it was unable to compile due to conflicting versions of the syntex crate; it looks like both 0.32.0 and 0.33.0 are being fetched:

$ cargo install tokei
    Updating registry `https://github.com/rust-lang/crates.io-index`
 Downloading tokei v2.1.1
 Downloading rayon v0.3.1
 Downloading serde_json v0.7.1
 Downloading serde v0.7.6
 Downloading clap v2.5.2
 Downloading serde_cbor v0.3.3
 Downloading maplit v0.1.3
 Downloading serde_yaml v0.2.4
 Downloading deque v0.3.1
 Downloading yaml-rust v0.3.2
 Downloading serde_codegen v0.7.6
 Downloading syntex v0.32.0 <-------------- Here
 Downloading syntex_syntax v0.33.0
 Downloading syntex v0.33.0 <-------------- And here
 Downloading aster v0.17.0
 Downloading quasi v0.11.0
 Downloading quasi_codegen v0.11.0
   Compiling serde v0.7.6
   Compiling log v0.3.6
   Compiling glob v0.2.11
   Compiling winapi-build v0.1.1
   Compiling bitflags v0.5.0
   Compiling num-traits v0.1.32
   Compiling unicode-xid v0.0.3
   Compiling winapi v0.2.7
   Compiling vec_map v0.6.0
   Compiling byteorder v0.3.13
   Compiling ansi_term v0.7.2
   Compiling rustc-serialize v0.3.19
   Compiling yaml-rust v0.3.2
   Compiling libc v0.2.11
   Compiling kernel32-sys v0.2.2
   Compiling unicode-width v0.1.3
   Compiling term v0.2.14
   Compiling walkdir v0.1.5
   Compiling rand v0.3.14
   Compiling num_cpus v0.2.12
   Compiling strsim v0.4.1
   Compiling maplit v0.1.3
   Compiling deque v0.3.1
   Compiling rayon v0.3.1
   Compiling clap v2.5.2
   Compiling serde_json v0.7.1
   Compiling serde_cbor v0.3.3
   Compiling serde_yaml v0.2.4
   Compiling syntex_syntax v0.32.0
   Compiling syntex_syntax v0.33.0
   Compiling syntex v0.32.0
   Compiling syntex v0.33.0
   Compiling quasi v0.11.0
   Compiling aster v0.17.0
   Compiling quasi_codegen v0.11.0
   Compiling serde_codegen v0.7.6
   Compiling tokei v2.1.1
/Users/nm46057/.cargo/registry/src/github.com-88ac128001ac3a9a/tokei-2.1.1/build.rs:15:29: 15:42 error: mismatched types:
 expected `&mut syntex::Registry`,
    found `&mut syntex::Registry`
(expected struct `syntex::Registry`,
    found a different struct `syntex::Registry`) [E0308]
/Users/nm46057/.cargo/registry/src/github.com-88ac128001ac3a9a/tokei-2.1.1/build.rs:15     serde_codegen::register(&mut registry);
                                                                                                                   ^~~~~~~~~~~~~
/Users/nm46057/.cargo/registry/src/github.com-88ac128001ac3a9a/tokei-2.1.1/build.rs:15:29: 15:42 help: run `rustc --explain E0308` to see a detailed explanation
/Users/nm46057/.cargo/registry/src/github.com-88ac128001ac3a9a/tokei-2.1.1/build.rs:15:29: 15:42 note: Perhaps two different versions of crate `syntex` are being used?
/Users/nm46057/.cargo/registry/src/github.com-88ac128001ac3a9a/tokei-2.1.1/build.rs:15     serde_codegen::register(&mut registry);
                                                                                                                   ^~~~~~~~~~~~~
error: aborting due to previous error
error: failed to compile `tokei v2.1.1`, intermediate artifacts can be found at `/Users/nm46057/projects/rust/target-install`

Caused by:
  Could not compile `tokei`.

To learn more, run the command again with --verbose.

`--files` flag may cause panic when file path has non-ASCII characters

For example:

tokei --files D:\tmp\新建文件夹\tokei

-------------------------------------------------------------------------------
 Language            Files        Lines         Code     Comments       Blanks
-------------------------------------------------------------------------------
...
-------------------------------------------------------------------------------
thread 'main' panicked at 'index 17 and/or 41 in `D:\tmp\新建文件夹\tokei\CHANGELOG.md` do not lie on character boundary', ../src/libcore\str/mod.rs:1721
stack backtrace:
   0:   0xd98f9b - std::rt::lang_start::haaae1186de9de8cb
   1:   0xd9989a - std::panicking::rust_panic_with_hook::hb1322e5f2588b4db
   2:   0xd99736 - std::panicking::begin_panic_fmt::h4fe9fb9d5109c4bf
   3:   0xd995eb - rust_begin_unwind
   4:   0xda05f7 - core::panicking::panic_fmt::h4395919ece15c671
   5:   0xda12c0 - core::str::slice_error_fail::h9ad582f282290cd6
   6:   0xd0c602 - <tokei::stats::Stats as core::fmt::Display>::fmt::h4eaabf7e66bef4ca
   7:   0xcf5bae - __ImageBase
   8:   0xda25af - core::fmt::write::h300d6e605e327781

The `--exclude` option doesn't work.

❯ tree pkg/migrations/
pkg/migrations/
├── migrations.go
└── migrations_test.go

0 directories, 2 files

❯ tokei -e test pkg/migrations
-------------------------------------------------------------------------------
 Language            Files        Lines         Code     Comments       Blanks
-------------------------------------------------------------------------------
 Go                      2          598          479           70           49
-------------------------------------------------------------------------------
 Total                   2          598          479           70           49
-------------------------------------------------------------------------------
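For context: both files are still counted even though `-e test` was passed; the expected result is that `migrations_test.go` is excluded. One possible explanation (an assumption, not confirmed by this report) is that the exclude term is treated as a glob pattern that must match the file name, so a bare `test` matches nothing while `*test*` would. The difference can be sketched in std-only Rust (both functions are illustrative, not tokei's implementation):

```rust
// Illustrative only: the difference between treating an exclude term as a
// bare substring vs. as a glob pattern that must match the whole name.

fn substring_match(pattern: &str, path: &str) -> bool {
    path.contains(pattern)
}

// Tiny glob matcher supporting only `*` (matches any sequence of chars).
fn glob_match(pattern: &str, text: &str) -> bool {
    let parts: Vec<&str> = pattern.split('*').collect();
    if parts.len() == 1 {
        return pattern == text; // no wildcard: exact match required
    }
    let mut rest = text;
    if !rest.starts_with(parts[0]) {
        return false; // leading literal must anchor at the start
    }
    rest = &rest[parts[0].len()..];
    let last = parts[parts.len() - 1];
    if !rest.ends_with(last) {
        return false; // trailing literal must anchor at the end
    }
    rest = &rest[..rest.len() - last.len()];
    for part in &parts[1..parts.len() - 1] {
        match rest.find(part) {
            Some(i) => rest = &rest[i + part.len()..],
            None => return false, // middle literals must appear in order
        }
    }
    true
}

fn main() {
    let file = "migrations_test.go";
    assert!(substring_match("test", file)); // substring: would be excluded
    assert!(!glob_match("test", file));     // glob `test`: not excluded
    assert!(glob_match("*test*", file));    // glob `*test*`: excluded
}
```

If that is the cause, the fix on the user side is `tokei -e '*test*' pkg/migrations`; on the tool side, documenting (or changing) the matching semantics would resolve the confusion.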

Add Scala Language

  • Name of the language: Scala
  • File extensions: .sc, .scala
  • Comment syntax: C syntax. Multi-line comments can be nested. From the spec:
comment ::=  
  ‘/*’ “any sequence of characters; nested comments are allowed” ‘*/’  |  
  ‘//’ “any sequence of characters up to end of line”
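Because Scala block comments nest, the scanner needs a depth counter rather than a boolean "in comment" flag; a flag would close the comment at the first `*/` and count the rest of the outer comment as code. A minimal std-only sketch (illustrative, not tokei's parser; line comments omitted for brevity):

```rust
// Illustrative sketch of nested block-comment handling: a depth counter
// instead of a boolean flag. `//` line comments are omitted for brevity.

fn code_without_block_comments(src: &str) -> String {
    let mut out = String::new();
    let mut depth = 0u32;
    let mut chars = src.chars().peekable();
    while let Some(c) = chars.next() {
        if c == '/' && chars.peek() == Some(&'*') {
            chars.next(); // consume '*': one level deeper
            depth += 1;
        } else if c == '*' && chars.peek() == Some(&'/') && depth > 0 {
            chars.next(); // consume '/': one level shallower
            depth -= 1;
        } else if depth == 0 {
            out.push(c);
        }
    }
    out
}

fn main() {
    let src = "val x = 1 /* outer /* inner */ still comment */ + 2";
    assert_eq!(code_without_block_comments(src), "val x = 1  + 2");
}
```

With a non-nesting flag, the first `*/` would have ended the comment and ` still comment */ + 2` would have leaked into the code count.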

Tag releases

Please tag the last 3 releases (for packagers).

Notify user about errors when processing files

Right now, tokei just silently ignores errors (like invalid encoding) when processing source files.
This leads to the user seeing incorrect results in these cases, without knowing that they are incorrect because errors happened when processing the files.
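A sketch of what surfacing these errors could look like (std-only and illustrative, not tokei's code): keep counting the files that can be read, and emit a warning on stderr for each file that fails instead of dropping it silently.

```rust
use std::fs;

// Illustrative sketch: report per-file errors instead of swallowing them.
fn count_lines(path: &str) -> Result<usize, String> {
    let bytes = fs::read(path).map_err(|e| format!("{}: {}", path, e))?;
    let text = String::from_utf8(bytes)
        .map_err(|_| format!("{}: not valid UTF-8, file skipped", path))?;
    Ok(text.lines().count())
}

fn main() {
    for path in std::env::args().skip(1) {
        match count_lines(&path) {
            Ok(n) => println!("{}: {} lines", path, n),
            // A warning on stderr keeps the table output clean while
            // telling the user the totals may be incomplete.
            Err(e) => eprintln!("warning: {}", e),
        }
    }
}
```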

Support encodings other than UTF-8

Right now, tokei just silently ignores files that are not UTF-8, giving incorrect line counts compared to cloc.

github.com/AlDanial/cloc v 1.70  T=0.47 s (74.1 files/s, 56982.6 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
C++                             22           2344            912          23023
C/C++ Header                    13            119              7            520
-------------------------------------------------------------------------------
SUM:                            35           2463            919          23543
-------------------------------------------------------------------------------
[snake@neko mog]$ tokei sources/
-------------------------------------------------------------------------------
 Language            Files        Lines         Code     Comments       Blanks
-------------------------------------------------------------------------------
 C Header               13          646            0          527          119
 C++                    14         6671           73         5854          744
-------------------------------------------------------------------------------
 Total                  27         7317           73         6381          863
-------------------------------------------------------------------------------

The files that tokei didn't count are encoded in ISO-8859.

Even if we decide explicitly not to support encodings other than UTF-8, tokei should warn when ignoring files due to not being UTF-8, instead of silently giving incorrect results. #54
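For ISO-8859-1 specifically, a dependency-free fallback is possible, because its 256 byte values map one-to-one onto the first 256 Unicode code points. Full encoding support would likely want detection plus a proper transcoding crate, but a minimal std-only sketch (illustrative, not tokei's code) looks like this:

```rust
// Decode as UTF-8 when valid; otherwise fall back to ISO-8859-1, which
// never fails because every byte is a valid Latin-1 code point.
fn decode_utf8_or_latin1(bytes: &[u8]) -> String {
    match std::str::from_utf8(bytes) {
        Ok(s) => s.to_owned(),
        Err(_) => bytes.iter().map(|&b| b as char).collect(),
    }
}

fn main() {
    let latin1 = b"caf\xe9"; // "café" encoded as ISO-8859-1
    assert_eq!(decode_utf8_or_latin1(latin1), "café");
    assert_eq!(decode_utf8_or_latin1("déjà".as_bytes()), "déjà");
}
```

A blind fallback like this would still misdecode other legacy encodings (e.g. Shift-JIS), so warning the user whenever the fallback is taken remains important, as proposed in #54.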
