uDSV's People

Contributors

joshuaslate, leeoniya

uDSV's Issues

Crashing when adding to transform stream

I'm no stream expert, but given the examples, it seemed like it would be super easy to create a transform stream with your module.

I can verify that it works fine when run like the example in the readme, but when I port it into a custom Transform stream, it ends super early, never reaches the Writable, and I get no errors. There's a good chance this isn't a problem with uDSV, but it's strange; given your APIs, this should work.

import fs from 'node:fs';
import { pipeline, Writable } from 'node:stream';
import { inferSchema, initParser } from 'udsv';
// (The Parser type, ProgressBarStream, SimpleUDSVTransform, and filePath are
// defined elsewhere in my project / uDSV's type definitions and are omitted here.)

// lines = 16_182_971 , 100% finished in 13.1 seconds = 1_235_341 rows per second
// 597.5 MB  13.1s   45.5MB/s
const streamedCSV = async () => {
  const fileSize = await fs.promises.stat(filePath).then((stats) => stats.size)
  const stream = fs.createReadStream(filePath).pipe(new ProgressBarStream(fileSize))

  let parser: Parser;
  for await (const buffer of stream) {
    const strChunk = buffer.toString();
    parser ??= initParser(inferSchema(strChunk));
    parser.chunk<string[]>(strChunk, parser.typedArrs, (batch) => {
      const x = batch // reaches here fine
    });
  }
  parser!.end();
}

const transform = async () => {
  const fileSize = await fs.promises.stat(filePath).then((stats) => stats.size)
  return new Promise<void>((resolve, reject) => {
    pipeline(
      fs.createReadStream(filePath),
      new SimpleUDSVTransform(),
      new Writable({
        write(chunk, encoding, callback) {
          const x = chunk // never reaches here
          callback()
        }
      }),
      (err) => {
        if (err) {
          reject(err) // never reaches here
        } else {
          resolve() // never reaches here
        }
      }
    )
  })
}
(async () => {
  await streamedCSV() //works
  await transform() //breaks
})()
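
For what it's worth, a wrapping Transform usually has to push parsed batches downstream from inside the chunk callback and finalize the parser in _flush. Below is a minimal sketch of what such a class might look like (UDSVTransformSketch is a hypothetical name; it assumes parser.chunk() accepts a batch callback as its third argument, as in streamedCSV() above, and that object-mode output is acceptable):

import { Transform } from 'node:stream';
import { inferSchema, initParser } from 'udsv';

class UDSVTransformSketch extends Transform {
  constructor() {
    super({ readableObjectMode: true }); // emit row batches, not bytes
    this.parser = null;
  }

  _transform(chunk, _enc, callback) {
    const strChunk = chunk.toString();
    this.parser ??= initParser(inferSchema(strChunk));
    // push each parsed batch downstream so the Writable actually receives data
    this.parser.chunk(strChunk, this.parser.stringArrs, (batch) => this.push(batch));
    callback();
  }

  _flush(callback) {
    // finalize; whether end() returns a final batch here is an assumption left unhandled
    this.parser?.end();
    callback();
  }
}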

avoid string concat

I think it will reduce memory use a lot if we just .slice() from start to end and do a .replaceAll() to remove the escaped commas or quotes. I'm not sure this can work for incremental parsing, but it probably can. I think this will improve overall perf as well, since memory allocation / GC is the slowest thing in JS.
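
To illustrate the idea with a sketch (not uDSV's current code): instead of accumulating a field character by character, the scanner would keep only the field's start/end offsets and materialize the value in one slice plus one replaceAll. The offsets below are hand-picked for the example string:

const csvStr = '"say ""hi"" there",123\n';

// suppose the scanner found the quoted field's content between offsets 1 and 17;
// one slice + one replaceAll instead of repeated string concatenation
const value = csvStr.slice(1, 17).replaceAll('""', '"'); // 'say "hi" there'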

make both node & web stream example the same

Node.js streams and web streams are both async iterable, and both yield Uint8Arrays. Both environments also have TextDecoderStream in the global namespace.

So there is no need to use an EventEmitter or to write different code for each.

so this

let stream = fs.createReadStream(filePath);

let parser = null;
let result = null;

stream.on('data', (chunk) => {
  // convert from Buffer
  let strChunk = chunk.toString();
  // on first chunk, infer schema and init parser
  parser ??= initParser(inferSchema(strChunk));
  // incremental parse to string arrays
  parser.chunk(strChunk, parser.stringArrs);
});

stream.on('end', () => {
  result = parser.end();
});

and this:

let stream = fs.createReadStream(filePath);

let webStream = Stream.Readable.toWeb(stream);
let textStream = webStream.pipeThrough(new TextDecoderStream());

let parser = null;

for await (const strChunk of textStream) {
  parser ??= initParser(inferSchema(strChunk));
  parser.chunk(strChunk, parser.stringArrs);
}

let result = parser.end();

could be written as:

for await (const chunk of stream) {
  const strChunk = chunk.toString();
  parser ??= initParser(inferSchema(strChunk));
  parser.chunk(strChunk, parser.stringArrs);
}

for await (const strChunk of textStream) {
  parser ??= initParser(inferSchema(strChunk));
  parser.chunk(strChunk, parser.stringArrs);
}
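
Taking the suggestion to its conclusion, one helper can cover both environments, since each just hands you an async iterable of Uint8Array chunks. A sketch under that assumption (parseDSVStream is a hypothetical name, not a uDSV export; filePath/url are placeholders):

import { inferSchema, initParser } from 'udsv';

async function parseDSVStream(byteStream) {
  const decoder = new TextDecoder(); // stateful, so multi-byte chars can span chunks
  let parser = null;

  for await (const chunk of byteStream) {
    const strChunk = decoder.decode(chunk, { stream: true });
    parser ??= initParser(inferSchema(strChunk));
    parser.chunk(strChunk, parser.stringArrs);
  }

  return parser?.end(); // undefined if the stream produced no chunks
}

// const rows = await parseDSVStream(fs.createReadStream(filePath)); // Node stream
// const rows = await parseDSVStream((await fetch(url)).body);       // web stream (where async iteration is supported)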

probe every chunk for quoted vs unquoted?

Currently, the choice of whether to use the quoted or unquoted codepath is statically configured in the schema, which is usually inferred from the first chunk.

However, if there are no quotes in the first chunk but they exist in subsequent chunks, we'll get mis-parses. So the only safe path for incremental parsing is to force quoted mode all the time, but this is not done automatically right now.

We need to measure whether there is any benefit to the unquoted path during streaming; maybe not, since it appears to be I/O-bound.
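
For context, probing each chunk is cheap; the hard part is acting on a late positive, since the codepath is fixed once the schema is inferred from the first chunk. A call-site sketch of the detection only, not of uDSV internals (filePath as in the examples above):

import fs from 'node:fs';
import { inferSchema, initParser } from 'udsv';

let parser = null;
let anyQuotes = false;

for await (const chunk of fs.createReadStream(filePath)) {
  const strChunk = chunk.toString();
  anyQuotes ||= strChunk.includes('"'); // the per-chunk probe this issue proposes
  parser ??= initParser(inferSchema(strChunk));
  parser.chunk(strChunk, parser.stringArrs);
}

// today a late `anyQuotes === true` can only be detected here, not acted on,
// because the quoted/unquoted choice was already baked into the inferred schema
let result = parser.end();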

Unparsing?

Sorry for the issue spam (maybe you could open up the Discussions feature?), but I was wondering if you have plans for, or interest in, implementing unparsing (JS collection to CSV). I love the approach you have taken here, but I am looking for a combined parse/unparse solution in my project. I would be happy to contribute toward development.
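
For clarity about what is being requested (uDSV does not currently ship this), here is a rough sketch of a standalone serializer; toCSV is a hypothetical name:

// serialize an array of row arrays to CSV, quoting fields that contain the
// delimiter, a quote, or a newline, and doubling embedded quotes
function toCSV(rows, delim = ',') {
  const quoteField = (v) => {
    const s = String(v);
    return s.includes(delim) || /["\r\n]/.test(s)
      ? '"' + s.replaceAll('"', '""') + '"'
      : s;
  };
  return rows.map((row) => row.map(quoteField).join(delim)).join('\n');
}

// toCSV([['a', 'b'], [1, 'say "hi"']]) === 'a,b\n1,"say ""hi"""'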

Flaw in the benchmark?

I was a bit surprised to see PapaParse being twice as fast as csv42 in many of the benchmarks. I did similar benchmarks before, and there csv42 was always at least a bit faster than PapaParse. Now, it may be that I'm overlooking something here and comparing apples with pears, but I would love to understand what's going on, since these benchmarks should give similar results.

I created a minimal benchmark to try to figure out what's going on: https://github.com/josdejong/csv-benchmark

The results show csv42 being twice as fast as PapaParse with, for example, the HPI_master.csv file, and also with another file:

udsv x 9.42 ops/sec ±3.74% (28 runs sampled)
csv42 x 4.38 ops/sec ±3.70% (16 runs sampled)
papaparse x 2.33 ops/sec ±2.47% (10 runs sampled)

Could it be that PapaParse isn't configured correctly? When the header: true setting is omitted, it creates arrays instead of objects, which is about 3 times faster and gives results similar to those shown here. I'm not sure, but PapaParse isn't configured with header: true here: https://github.com/leeoniya/uDSV/blob/main/bench/non-streaming/untyped/PapaParse.cjs.

Did you verify that the output is indeed what you expect for all the libraries in all the benchmarks? (i.e. an object with parsed numbers and booleans).
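
For reference, the configuration difference in question looks roughly like this (Papa Parse's documented header and dynamicTyping options; the file path is illustrative):

import fs from 'node:fs';
import Papa from 'papaparse';

const csvStr = fs.readFileSync('./HPI_master.csv', 'utf8');

// header: true -> array of objects keyed by column name (slower)
const asObjects = Papa.parse(csvStr, { header: true, dynamicTyping: true });

// header omitted -> array of row arrays (about 3x faster, per the observation above)
const asArrays = Papa.parse(csvStr, { dynamicTyping: true });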

Inconsistent benchmark samples

First of all you put together an awesome benchmark suite, thanks for that! Now I might have to try to optimize my library a bit more 🤣

I see some samples that look strange to me, and I'm not sure why, for example:

  1. For litmus_ints.csv I see most implementations returning [["-8399","-5705","-5076","8780","8650","-55","741, but uDSV returns [["6287","-60","7062","-3411","4613","-5840","209". Isn't that incorrect output?
┌───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│ litmus_ints.csv (1 MB, 20 cols x 10K rows)                                                                                                                                                    │
├────────────────────────┬────────┬─────────────────────────────────────────────────────────────┬─────────────────────────────────┬────────┬────────────────────────────────────────────────────┤
│ Name                   │ Rows/s │ Throughput (MiB/s)                                          │ RSS above 68 MiB baseline (MiB) │ Types  │ Sample                                             │
├────────────────────────┼────────┼─────────────────────────────────────────────────────────────┼─────────────────────────────────┼────────┼────────────────────────────────────────────────────┤
│ uDSV                   │ 2.41M  │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 248 │ ░░░░ 34.5                       │ string │ [["6287","-60","7062","-3411","4613","-5840","209" │
│ String.split()         │ 1.6M   │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 164                   │ ░░░░░ 37.8                      │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ csv-simple-parser      │ 1.59M  │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 163                   │ ░░░░░ 42.1                      │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ d3-dsv                 │ 1.31M  │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 134                          │ ░░░░░░░ 58                      │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ but-csv                │ 981K   │ ░░░░░░░░░░░░░░░░░░░░░░░ 101                                 │ ░░░░░░ 48.8                     │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ PapaParse              │ 727K   │ ░░░░░░░░░░░░░░░░░ 74.8                                      │ ░░░░░░░░░░░░░░ 128              │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ ACsv                   │ 717K   │ ░░░░░░░░░░░░░░░░░ 73.8                                      │ ░░░░░░░░░░░░░░░░ 146            │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ csv-rex                │ 617K   │ ░░░░░░░░░░░░░░░ 63.5                                        │ ░░░░░░░░░░░░░░░░░░ 162          │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ @gregoranders/csv      │ 555K   │ ░░░░░░░░░░░░░ 57.1                                          │ ░░░░░░░░ 73.4                   │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ node-csvtojson         │ 517K   │ ░░░░░░░░░░░░ 53.2                                           │ ░░░░░░░░░░░░░░░░ 150            │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ achilles-csv-parser    │ 508K   │ ░░░░░░░░░░░░ 52.2                                           │ ░░░░░░░░░░░░░░░░░░ 162          │ string │ [{"zYs0DQRw06":"6287","E5mPdSri":"-60","Cku78":"70 │
│ csv-js                 │ 498K   │ ░░░░░░░░░░░░ 51.2                                           │ ░░░░░░░░░░░░░░░░░░ 164          │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ comma-separated-values │ 412K   │ ░░░░░░░░░░ 42.4                                             │ ░░░░░░░░░░░░░░░░░░ 167          │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ csv42                  │ 338K   │ ░░░░░░░░ 34.7                                               │ ░░░░░░░░░░░░░░░░░ 151           │ string │ [{"zYs0DQRw06":"6287","E5mPdSri":"-60","Cku78":"70 │
│ dekkai                 │ 315K   │ ░░░░░░░░ 32.4                                               │ ░░░░░░░░░░░░░░░░░ 157           │ string │ [["6287","-60","7062","-3411","4613","-5840","209" │
│ SheetJS                │ 304K   │ ░░░░░░░ 31.3                                                │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░ 245 │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ CSVtoJSON              │ 280K   │ ░░░░░░░ 28.8                                                │ ░░░░░░░░░░░░░░░░░░░░░░ 198      │ string │ [{"zYs0DQRw06":"6287","E5mPdSri":"-60","Cku78":"70 │
│ @vanillaes/csv         │ 264K   │ ░░░░░░░ 27.2                                                │ ░░░░░░░░░░░░░░░░ 145            │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ csv-parser (neat-csv)  │ 207K   │ ░░░░░ 21.2                                                  │ ░░░░░░░░░░░░░░░░░░░░ 186        │ string │ [{"zYs0DQRw06":"6287","E5mPdSri":"-60","Cku78":"70 │
│ csv-parse/sync         │ 157K   │ ░░░░ 16.1                                                   │ ░░░░░░░░░░░░░░░ 134             │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ jquery-csv             │ 104K   │ ░░░ 10.7                                                    │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░ 253 │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ utils-dsv-base-parse   │ 100K   │ ░░░ 10.3                                                    │ ░░░░░░░░░░░░░ 116               │ string │ [["-8399","-5705","-5076","8780","8650","-55","741 │
│ @fast-csv/parse        │ 95.8K  │ ░░░ 9.85                                                    │ ░░░░░░░░░░░░░░░░░ 152           │ string │ [{"zYs0DQRw060":"6287","E5mPdSri1":"-60","Cku782": │
│ json-2-csv             │ 25.9K  │ ░ 2.67                                                      │ ░░░░░░░░░░░░░░ 122              │ string │ [{"zYs0DQRw06":"6287","E5mPdSri":"-60","Cku78":"70 │
└────────────────────────┴────────┴─────────────────────────────────────────────────────────────┴─────────────────────────────────┴────────┴────────────────────────────────────────────────────┘
  2. Same for litmus_quoted.csv:
┌───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│ litmus_quoted.csv (1.9 MB, 20 cols x 10K rows)                                                                                                                                                │
├────────────────────────┬────────┬─────────────────────────────────────────────────────────────┬─────────────────────────────────┬────────┬────────────────────────────────────────────────────┤
│ Name                   │ Rows/s │ Throughput (MiB/s)                                          │ RSS above 74 MiB baseline (MiB) │ Types  │ Sample                                             │
├────────────────────────┼────────┼─────────────────────────────────────────────────────────────┼─────────────────────────────────┼────────┼────────────────────────────────────────────────────┤
│ uDSV                   │ 1.62M  │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 301 │ ░░░ 45.1                        │ string │ [["JitoL7gXbT","-3677"," NoqmZHgaSk14","1997","mtW │
│ csv-simple-parser      │ 1.23M  │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 228              │ ░░░ 43.8                        │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ but-csv                │ 895K   │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 166                         │ ░░░░ 58.3                       │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ achilles-csv-parser    │ 395K   │ ░░░░░░░░░░░░░░ 73.1                                         │ ░░░░░░ 81.8                     │ string │ [{"DRadpUJ9 TCnUO0":"JitoL7gXbT","92ZnE3J8IME4Ru": │
│ d3-dsv                 │ 370K   │ ░░░░░░░░░░░░░ 68.5                                          │ ░░░░░░░░░░░ 156                 │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ csv-rex                │ 363K   │ ░░░░░░░░░░░░░ 67.1                                          │ ░░░░░░░░░░░ 164                 │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ csv-js                 │ 287K   │ ░░░░░░░░░░ 53.1                                             │ ░░░░░░░░░░░░ 176                │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ PapaParse              │ 282K   │ ░░░░░░░░░░ 52.1                                             │ ░░░░░░░░░░░░ 170                │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ comma-separated-values │ 242K   │ ░░░░░░░░░ 44.9                                              │ ░░░░░░░░░░░ 155                 │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ dekkai                 │ 232K   │ ░░░░░░░░ 43                                                 │ ░░░░░░░░░░░░░░ 212              │ string │ [["JitoL7gXbT","-3677"," NoqmZHgaSk14","1997","mtW │
│ SheetJS                │ 212K   │ ░░░░░░░░ 39.2                                               │ ░░░░░░░░░░░░░░░░ 229            │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ ACsv                   │ 208K   │ ░░░░░░░░ 38.6                                               │ ░░░░░░░░░░ 150                  │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ @vanillaes/csv         │ 208K   │ ░░░░░░░░ 38.5                                               │ ░░░░░░░░░ 133                   │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ csv42                  │ 201K   │ ░░░░░░░ 37.1                                                │ ░░░░░░░░░░░░░ 187               │ string │ [{"DRadpUJ9 TCnUO0":"JitoL7gXbT","92ZnE3J8IME4Ru": │
│ node-csvtojson         │ 189K   │ ░░░░░░░ 35                                                  │ ░░░░░░░░░░░ 161                 │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ CSVtoJSON              │ 178K   │ ░░░░░░░ 33                                                  │ ░░░░░░░░░░░░░ 197               │ string │ [{"\"DRadpUJ9TCnUO0\"":"JitoL7gXbT","\"92ZnE3J8IME │
│ csv-parser (neat-csv)  │ 162K   │ ░░░░░░ 30                                                   │ ░░░░░░░░░░░░ 180                │ string │ [{"DRadpUJ9 TCnUO0":"JitoL7gXbT","92ZnE3J8IME4Ru": │
│ utils-dsv-base-parse   │ 120K   │ ░░░░░ 22.2                                                  │ ░░░░░░░░░ 132                   │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ @gregoranders/csv      │ 117K   │ ░░░░ 21.6                                                   │ ░░░░░░░░░░ 143                  │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ csv-parse/sync         │ 108K   │ ░░░░ 20.1                                                   │ ░░░░░░░░░ 134                   │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ @fast-csv/parse        │ 74.2K  │ ░░░ 13.7                                                    │ ░░░░░░░░░░░░░░░ 228             │ string │ [{"DRadpUJ9 TCnUO00":"JitoL7gXbT","92ZnE3J8IME4Ru1 │
│ jquery-csv             │ 57.9K  │ ░░ 10.7                                                     │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░ 411 │ string │ [["UPwF","5742","TWB88","-2020","ih2q9F5ijs","-864 │
│ json-2-csv             │ 23.5K  │ ░ 4.35                                                      │ ░░░░░░░░░░░░ 170                │ string │ [{"DRadpUJ9 TCnUO0":"JitoL7gXbT","92ZnE3J8IME4Ru": │
└────────────────────────┴────────┴─────────────────────────────────────────────────────────────┴─────────────────────────────────┴────────┴────────────────────────────────────────────────────┘
  3. Similar for data-large2.csv, but basically uDSV seems to be rounding numbers differently. The first data row in the CSV is 1370044800,4819440.062,4645092.555,382.8436706,383.0596235,0,2,13935372.4,4230328.642,358.828813,375.6540512,45.01385132,191.3370908,5872264.787,5487537.33,346.35,351.65,0,0,1,2500,2498,192,197,0,0,0,0,0,2489,2485,196,193,0.2,0,0,0, so for example I'd expect to see 4819440.062 in the output, but uDSV gives 4869044.81. Isn't that a bug?
┌─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│ data-large2.csv (36 MB, 37 cols x 130K rows)                                                                                                                                                    │
├────────────────────────┬────────┬─────────────────────────────────────────────────────────────┬───────────────────────────────────┬────────┬────────────────────────────────────────────────────┤
│ Name                   │ Rows/s │ Throughput (MiB/s)                                          │ RSS above 282 MiB baseline (MiB)  │ Types  │ Sample                                             │
├────────────────────────┼────────┼─────────────────────────────────────────────────────────────┼───────────────────────────────────┼────────┼────────────────────────────────────────────────────┤
│ uDSV                   │ 525K   │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 145 │ ░░░░░░░░░░░░░░░░░░░░░░░ 1.69K     │ string │ [["1370045100","4869044.81","4630605.41","382.8270 │
│ String.split()         │ 446K   │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 124         │ ░░░░░░░░░░░░░░░░░░░░░░░░ 1.76K    │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ ACsv                   │ 387K   │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 107               │ ░░░░░░░░░░░░░░░░░░░░░░░░ 1.75K    │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ PapaParse              │ 385K   │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 107               │ ░░░░░░░░░░░░░░░░░░░░░░░░ 1.72K    │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ csv-simple-parser      │ 373K   │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 104                │ ░░░░░░░░░░░░░░░░░░░░░░░ 1.68K     │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ d3-dsv                 │ 372K   │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 103                 │ ░░░░░░░░░░░░░░░░░░░░░░░░ 1.75K    │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ but-csv                │ 365K   │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 101                 │ ░░░░░░░░░░░░░░░░░░░░░░░░ 1.69K    │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ csv-rex                │ 334K   │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ 92.7                   │ ░░░░░░░░░░░░░░░░░░░░░░ 1.57K      │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ comma-separated-values │ 228K   │ ░░░░░░░░░░░░░░░░░░░░░░░░ 63.3                               │ ░░░░░░░░░░░░░░ 1.01K              │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ csv-js                 │ 185K   │ ░░░░░░░░░░░░░░░░░░░░ 51.3                                   │ ░░░░░░░░░░░░░░░ 1.05K             │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ csv42                  │ 178K   │ ░░░░░░░░░░░░░░░░░░░ 49.4                                    │ ░░░░░░░░░░░░░░░░░░░░░░░ 1.65K     │ string │ [{"A":"1370045100","B":"4869044.81","C":"4630605.4 │
│ achilles-csv-parser    │ 173K   │ ░░░░░░░░░░░░░░░░░░░ 48                                      │ ░░░░░░░░░░░░░░░░░░░░░░░ 1.66K     │ string │ [{"A":"1370045100","B":"4869044.81","C":"4630605.4 │
│ SheetJS                │ 157K   │ ░░░░░░░░░░░░░░░░░ 43.6                                      │ ░░░░░░░░░░░░░░░░░░░░░ 1.48K       │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ CSVtoJSON              │ 157K   │ ░░░░░░░░░░░░░░░░░ 43.4                                      │ ░░░░░░░░░░░░░░░░░░░░░░░ 1.63K     │ string │ [{"A":"1370045100","B":"4869044.81","C":"4630605.4 │
│ @gregoranders/csv      │ 151K   │ ░░░░░░░░░░░░░░░░ 41.7                                       │ ░░░░░░░░░░░░ 872                  │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ dekkai                 │ 146K   │ ░░░░░░░░░░░░░░░░ 40.5                                       │ ░░░░░░░░ 569                      │ string │ [["1370045100","4869044.81","4630605.41","382.8270 │
│ @vanillaes/csv         │ 143K   │ ░░░░░░░░░░░░░░░ 39.6                                        │ ░░░░░░░░░░░░ 871                  │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ node-csvtojson         │ 120K   │ ░░░░░░░░░░░░░ 33.2                                          │ ░░░░░░░░░░░░ 845                  │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ csv-parser (neat-csv)  │ 109K   │ ░░░░░░░░░░░░ 30.1                                           │ ░░░░░░░░░░░░░░░ 1.07K             │ string │ [{"A":"1370045100","B":"4869044.81","C":"4630605.4 │
│ csv-parse/sync         │ 70.9K  │ ░░░░░░░░ 19.7                                               │ ░░░░░░░░░ 594                     │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ jquery-csv             │ 44.5K  │ ░░░░░ 12.3                                                  │ ░░░░░░░░░░░░░░░░░░░░░░░░░░░ 1.99K │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
│ @fast-csv/parse        │ 43.8K  │ ░░░░░ 12.1                                                  │ ░░░░░░░░░░░░ 839                  │ string │ [{"A0":"1370045100","B1":"4869044.81","C2":"463060 │
│ utils-dsv-base-parse   │ 38.2K  │ ░░░░░ 10.6                                                  │ ░░░░░ 336                         │ string │ [["1370044800","4819440.062","4645092.555","382.84 │
└────────────────────────┴────────┴─────────────────────────────────────────────────────────────┴───────────────────────────────────┴────────┴────────────────────────────────────────────────────┘

Unable to parse CSV with only one line

example: inferSchema('a,b')

After execution, an error will occur: "Cannot read properties of null (reading '1')"

It must have a line break to be parsed correctly. I currently work around it with if (!csvStr.includes('\n')) csvStr += '\n';. I hope a new version can be released with a fix.
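
The workaround described above, wrapped as a small helper (safeInferSchema is a hypothetical name; the underlying behavior is taken from this report):

import { inferSchema, initParser } from 'udsv';

// inferSchema currently needs at least one line break in its sample,
// so append one when the input is a single line
function safeInferSchema(csvStr) {
  if (!csvStr.includes('\n')) csvStr += '\n';
  return inferSchema(csvStr);
}

const parser = initParser(safeInferSchema('a,b')); // no longer throws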
