
ipfs-car's Issues

v0.9.2 types

Hi, due to project requirements I have to use CJS with TypeScript. Unfortunately, TypeScript is complaining about missing types. Are they installed via @types, or are they provided in some other way?

Thanks

Is unpacking files supported on browser?

Hey, thanks for open sourcing this library.

It seems that CarIndexedReader is strictly for Node.js, with no implementation for the browser.

Our use case in plebbit-js is fetching IPNS/IPFS files from a gateway using a regular fetch and then verifying their validity in case the gateway is not being honest. One thing to note: we also fetch files inside directories, like https://ipfs.io/ipfs/<folder-cid1>/<folder-cid2>/update.

Is it possible to implement verification for both Node and the browser for our use case?
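For what it's worth, a minimal verification sketch that works in both Node and the browser could bypass CarIndexedReader entirely and use @ipld/car's CarReader plus multiformats to re-hash each block. This is an illustration, not part of the ipfs-car API; both packages are browser-compatible:

```javascript
// Hedged sketch: verify a CAR fetched from a gateway by re-hashing every block.
// Assumes @ipld/car and multiformats are installed.

// Pure helper: constant-time byte comparison (runs anywhere).
function sameDigest (a, b) {
  if (a.length !== b.length) return false
  let diff = 0
  for (let i = 0; i < a.length; i++) diff |= a[i] ^ b[i]
  return diff === 0
}

async function verifyCar (carBytes) {
  // dynamic imports so this sketch parses even where the deps are absent
  const { CarReader } = await import('@ipld/car')
  const { sha256 } = await import('multiformats/hashes/sha2')
  const reader = await CarReader.fromBytes(carBytes)
  for await (const { cid, bytes } of reader.blocks()) {
    // re-hash the block bytes and compare against the CID's multihash digest
    const hash = await sha256.digest(bytes)
    if (!sameDigest(hash.digest, cid.multihash.digest)) {
      throw new Error(`block ${cid} does not match its bytes`)
    }
  }
  return (await reader.getRoots()).map(String)
}
```

You would fetch `https://ipfs.io/ipfs/<cid>?format=car` and pass `new Uint8Array(await res.arrayBuffer())` to `verifyCar`. Caveats: this only handles sha2-256 blocks, and it verifies block integrity only; checking that a path like `<folder-cid1>/<folder-cid2>/update` resolves to the returned bytes additionally requires walking the dag-pb/UnixFS links.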

Error: 524: Origin Time-out

Hello,

I tweaked the example code (https://docs.web3.storage/) to put files on IPFS.

const storage = new Web3Storage({ token })
const files = []
for (let i = 1; i < 1003; i++) {
  const pathFiles = await getFilesFromPath(`D:/jsonFolder/${i}.json`)
  files.push(...pathFiles)
}

console.log(`Uploading ${files.length} files`)
const cid = await storage.put(files);
console.log('Content added with CID:', cid)

It returns

Error: 524: Origin Time-out
at pRetry.retries (file:///D:/node_modules/web3.storage/src/lib.js:155:23)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at async RetryOperation._fn (D:\node_modules\p-retry\index.js:50:12) {
attemptNumber: 6,
retriesLeft: 0
}

I do not know why I got this error; it was fine with 500 files.

Please give me some advice
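One hedged workaround sketch (not an official web3.storage recommendation): upload in smaller batches so each request stays well under the origin timeout. The batch size of 100 is a guess, and note that each `put()` call produces its own root CID:

```javascript
// Split an array into fixed-size batches (pure helper).
function toBatches (items, size) {
  const batches = []
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size))
  }
  return batches
}

// Hypothetical wrapper: upload files in batches via an existing
// Web3Storage client instead of one 1000+ file put().
async function putInBatches (storage, files, size = 100) {
  const cids = []
  for (const batch of toBatches(files, size)) {
    // smaller request bodies are less likely to hit a 524 origin timeout
    cids.push(await storage.put(batch))
  }
  return cids
}
```

If you need a single root CID for all 1000+ files, batching won't give you that; packing a CAR locally and uploading the CAR would be the alternative.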

Unsupported UnixFS type object for path

Command below results in error: Unsupported UnixFS type object for path: bafyreihhdxf6vr5vembqausvrsukarksvms3vob5jzj7kgwe76rurjp7he

ipfs-car unpack /tmp/bafyreihhdxf6vr5vembqausvrsukarksvms3vob5jzj7kgwe76rurjp7he.car --output /tmp

Is there a way to unpack this CAR file? Why is this error happening with this file in particular?

Incorrect ipfs-car dependency paths when installing web3.storage via npm

Installing the web3.storage dependency via npm automatically pulls in several ipfs-car-related packages.

Running

npm run serve

fails with

ERROR  Failed to compile with 3 errors

These dependencies were not found:

* ipfs-car/blockstore/memory in ./node_modules/web3.storage/src/platform.web.js
* ipfs-car/pack in ./node_modules/web3.storage/src/lib.js
* ipfs-car/unpack in ./node_modules/web3.storage/src/lib.js

To install them, you can run: npm install --save ipfs-car/dist/types/blockstore/memory ipfs-car/pack ipfs-car/unpack

Looking at the source file

/app/node_modules/web3.storage/src/platform.web.js

it begins with

import { MemoryBlockStore } from 'ipfs-car/blockstore/memory'

Inspecting the ipfs-car sources shows:

the memory file path is /app/node_modules/ipfs-car/dist/types/blockstore/memory.d.ts
the pack file path is /app/node_modules/ipfs-car/dist/types/pack/index.d.ts
the unpack file path is /app/node_modules/ipfs-car/dist/types/unpack/index.d.ts

So I modified /app/node_modules/web3.storage/src/platform.web.js to

import { MemoryBlockStore } from 'ipfs-car/dist/types/blockstore/memory'

Running

npm run serve

again fails with

ERROR  Failed to compile with 3 errors

These dependencies were not found:

* ipfs-car/dist/types/blockstore/memory in ./node_modules/web3.storage/src/platform.web.js
* ipfs-car/pack in ./node_modules/web3.storage/src/lib.js
* ipfs-car/unpack in ./node_modules/web3.storage/src/lib.js

To install them, you can run: npm install --save ipfs-car/dist/types/blockstore/memory ipfs-car/pack ipfs-car/unpack

Question

Is this an official bug? Could someone explain what is going on? Thanks!

`packToStream` different input options

The packToStream function only accepts file paths as input. It would be helpful to also be able to specify non-file-path inputs, such as streams.

npx error : crypto is not defined

Hi,

Does the command still work with npx?
I always get this error:

npx ipfs-car --pack test.txt --output test.car
ReferenceError: crypto is not defined
  at Hasher.encode (/home/me/.npm/_npx/68468/lib/node_modules/ipfs-car/node_modules/multiformats/cjs/src/hashes/sha2-browser.js:7:56)
  at Hasher.digest (/home/me/.npm/_npx/68468/lib/node_modules/ipfs-car/node_modules/multiformats/cjs/src/hashes/hasher.js:16:35)
  at persist (/home/me/.npm/_npx/68468/lib/node_modules/ipfs-car/node_modules/ipfs-unixfs-importer/src/utils/persist.js:29:42)
  at /home/me/.npm/_npx/68468/lib/node_modules/ipfs-car/node_modules/ipfs-unixfs-importer/src/dag-builder/file/buffer-importer.js:47:20
  at /home/me/.npm/_npx/68468/lib/node_modules/ipfs-car/node_modules/it-parallel-batch/index.js:32:16
  at Array.map (<anonymous>)
  at parallelBatch (/home/me/.npm/_npx/68468/lib/node_modules/ipfs-car/node_modules/it-parallel-batch/index.js:27:26)
  at processTicksAndRejections (internal/process/task_queues.js:97:5)
  at async buildFileBatch (/home/me/.npm/_npx/68468/lib/node_modules/ipfs-car/node_modules/ipfs-unixfs-importer/src/dag-builder/file/index.js:45:20)
  at async batch (/home/me/.npm/_npx/68468/lib/node_modules/ipfs-car/node_modules/it-batch/index.js:20:20)

Thanks for help :)

Env :

Proposal: less generic / confusing interface

I find myself going back to the docs and examples to figure out what the various options do and to make sure I get things right. In my experience this is an opportunity to improve the API so that it is difficult to misuse or misinterpret.

Given a bit of experience here are few suggestions:

  1. Expose separate APIs per use case:

    • pack a file - Most cases I've encountered so far involve a single file, in which case the following options are confusing and serve little purpose:
      1. multiple inputs
      2. path of the input
      3. wrapWithDirectory
    • pack a directory of files - This could also be simplified by:
      1. Getting rid of the wrapWithDirectory option (if you want to wrap a file with a directory, use packDirectory and pass a single file).
  2. Let user normalize input instead

    I think this is an idiosyncrasy of IPFS that is really not worth replicating. The input type is extremely complex: flexible enough to take almost anything, yet it is not always obvious how each form is treated.

    I think the API could be a lot nicer if it let the user deal with input normalization and was itself super simple:

    • packFile({ content: Blob, ... })
    • packDirectory({ files: File[], ... })

    Files and Blobs can easily be created from all those input types, which also weeds out bad inputs.

  3. Allow deciding what output should be afterwards

    Right now you have to pick between blob, fs and stream packer. I would suggest instead having single result value Pack that is:

    interface WebPack extends Blob {
        type: 'File' | 'Directory'
        readonly maxChunkSize: number
        readonly blockstore: BlockStore
    }

    And following in node:

    interface NodePack extends WebPack {
       // mimics https://nodejs.org/dist/latest-v17.x/docs/api/fs.html#fswritefilefile-data-options-callback
       writeFile(path:string, options:import('fs').WriteFileOptions):Promise<void>
       // mimics https://nodejs.org/dist/latest-v17.x/docs/api/fs.html#fscreatereadstreampath-options
       createReadStream():import('stream').Readable
    }

    Also, given that web streams and blobs are coming to Node, I'm not sure it's even worth doing Node-specific things.

Have `ipfs-car --pack` print the output filename and root CID by default

How about something like:

ipfs-car --pack pinpie.jpg
root CID: bafkreiajkbmpugz75eg2tmocmp3e33sg5kuyq2amzngslahgn6ltmqxxfa
output: pinpie.car

It'd be nice to see the root CID, so folks can use it to quickly generate a CID for a file, rather than having to read the CAR back in with --list-roots.

packToBlob support

Before 1.0 there was a packToBlob function in the API.
Will you add support for it back? Or provide an example of the proper way to recreate packToBlob?

What steps are required to build this?

Heya, I was looking into possibly contributing to this repo as I'm working on similar things, but I can't seem to get it to build. Here's what I tried:

git clone git@github.com:web3-storage/ipfs-car.git
cd ipfs-car
npm install
npm run build

Everything is smooth sailing up until npm run build:

disco@MacBook-Air ipfs-car % npm run build

> [email protected] build
> npm run compile:esm && npm run compile:cjs


> [email protected] compile:esm
> ttsc -p tsconfig.json && echo '{ "type" : "module" }' > dist/esm/package.json

/Users/disco/Programming/ipfs-car/node_modules/ttypescript/lib/loadTypescript.js:29
    var _e = ts.versionMajorMinor.split('.'), major = _e[0], minor = _e[1];
                                  ^

TypeError: Cannot read properties of undefined (reading 'split')
    at Object.loadTypeScript (/Users/disco/Programming/ipfs-car/node_modules/ttypescript/lib/loadTypescript.js:29:35)
    at Object.<anonymous> (/Users/disco/Programming/ipfs-car/node_modules/ttypescript/lib/tsc.js:8:27)
    at Module._compile (node:internal/modules/cjs/loader:1246:14)
    at Module._extensions..js (node:internal/modules/cjs/loader:1300:10)
    at Module.load (node:internal/modules/cjs/loader:1103:32)
    at Module._load (node:internal/modules/cjs/loader:942:12)
    at Module.require (node:internal/modules/cjs/loader:1127:19)
    at require (node:internal/modules/helpers:112:18)
    at Object.<anonymous> (/Users/disco/Programming/ipfs-car/node_modules/ttypescript/bin/tsc:2:1)
    at Module._compile (node:internal/modules/cjs/loader:1246:14)

Node.js v19.6.0

Should I be doing something differently?

disco@MacBook-Air ipfs-car % uname -a
Darwin MacBook-Air.local 22.1.0 Darwin Kernel Version 22.1.0: Sun Oct  9 20:15:52 PDT 2022; root:xnu-8792.41.9~2/RELEASE_ARM64_T8112 arm64

CAR and SPLITTING seems fine, but uploading to nft.storage isn't liking it.

I posted on Discord too but no response yet; I'm hoping someone here can help. I created the CAR file, I can list its contents, and it recognizes everything. I also used split with no problems. The problem is when I upload to nft.storage: I can upload the FILE-0.CAR file fine (the first one), but I get "Error Uploading: Missing block for alskj23k4023nklsjfa" or whatever. The file is there, but it errors while pinning with that message. I thought it was because the other files weren't there yet, but it will NOT let me upload any other files; they error out as soon as I press the upload button. So I'm thinking I'm missing a step in the process.

Cannot packToBlob with content that is a File in Node.js...

import { File } from 'nft.storage'
import { packToBlob } from 'ipfs-car/pack/blob'
import fs from 'fs'

const file = new File([fs.readFileSync('../skate-movie.mp4')], 'skate-movie.mp4', { type: 'video/mp4' })
const { root, car } = await packToBlob({ input: [{ path: 'skate-movie.mp4', content: file }] })
Error: Unexpected input: [object Blob]
    at toAsyncIterable (node_modules/ipfs-core-utils/src/files/normalise-input/normalise-content.js:71:17)
    at toAsyncIterable.next (<anonymous>)
    at validateChunks (node_modules/ipfs-unixfs-importer/src/dag-builder/validate-chunks.js:14:20)
    at validateChunks.next (<anonymous>)
    at fixedSizeChunker (node_modules/ipfs-unixfs-importer/src/chunker/fixed-size.js:15:20)
    at fixedSizeChunker.next (<anonymous>)
    at bufferImporter (node_modules/ipfs-unixfs-importer/src/dag-builder/file/buffer-importer.js:16:18)
    at bufferImporter.next (<anonymous>)
    at batch (node_modules/it-batch/index.js:20:20)
    at batch.next (<anonymous>)

require('stream-to-it') causes error when used in node esm

// tslint:disable-next-line: no-var-requires needs types
const toIterable = require('stream-to-it')

āÆ ./bin.js --get bafybeidd2gyhagleh47qeg77xqndy2qy3yzn4vkxmk775bg2t5lpuy7pcu --api http://127.0.0.1:8787                                                                  15:28:34
file:///Users/oli/Code/web3-storage/web3.storage/node_modules/ipfs-car/dist/esm/unpack/fs.js:7
const toIterable = require('stream-to-it');
                   ^

ReferenceError: require is not defined in ES module scope, you can use import instead
This file is being treated as an ES module because it has a '.js' file extension and '/Users/oli/Code/web3-storage/web3.storage/packages/cli/package.json' contains "type": "module". To treat it as a CommonJS script, rename it to use the '.cjs' file extension.

Why does packToBlob return web-std/blob in the browser

Using Vite to bundle for the browser, I'm seeing packToBlob return a web-std/blob instead of the browser-global Blob. This is problematic: if you pass a web-std/blob impl as the body to fetch, you don't get the file-data upload behaviour you want; you just get the body set to [object Blob].

The implementation of web-std/blob should mean that the browser-global Blob is used in browser contexts. It's not yet clear whether this problem is limited to Vite, is caused by something we're doing in ipfs-car, or is a problem with web-std/blob.

fs_1.default.promises.rm is not a function

I get this error when packing a CAR. The resulting CAR file seems sane, but we need to be able to trust the exit status of the command, as we'll be running this on 50k+ items in the archive.

parkan@ytterbium:~/archive_org/filecoin/staging$ time ipfs-car --unpack Qmdk4jahNe1gSRWdYS96Jmas9CD6c1Rdp8twSWkErgc6jK.car --output test/

real	0m15.917s
user	0m13.457s
sys	0m7.452s
parkan@ytterbium:~/archive_org/filecoin/staging$ time ipfs-car --pack test --output ipfs-car-out.car
TypeError: fs_1.default.promises.rm is not a function
    at FsBlockStore.close (/usr/lib/node_modules/ipfs-car/dist/cjs/blockstore/fs.js:57:41)
    at Object.packToFs (/usr/lib/node_modules/ipfs-car/dist/cjs/pack/fs.js:27:26)
    at async handleInput (/usr/lib/node_modules/ipfs-car/dist/cjs/cli/cli.js:90:36)

real	0m17.625s
user	0m27.249s
sys	0m15.474s

ipfs-car cannot get hash of large car file it created

I created a car file of a large folder using ipfs-car. However, when I later call ipfs-car to get the hash of that file, I get an error that the file is too large.

Steps to reproduce:

~/data
āÆ ipfs-car pack mit-ocw --output mit-ocw-hash-test.car
bafybeifbua4aqsuiuqobeu7gxccyqx26ckxzxu5fdxermjrahimp65aiay

~/data 24s
āÆ ipfs-car hash mit-ocw-hash-test.car
node:internal/fs/promises:526
    throw new ERR_FS_FILE_TOO_LARGE(size);
          ^

RangeError [ERR_FS_FILE_TOO_LARGE]: File size (16911159405) is greater than 2 GiB
    at readFileHandle (node:internal/fs/promises:526:11)
    at async Module.hash (file:///Users/i/.nvm/versions/node/v21.2.0/lib/node_modules/ipfs-car/cmd/hash.js:13:13) {
  code: 'ERR_FS_FILE_TOO_LARGE'
}

Node.js v21.2.0

~/data
āÆ ls -l
-rw-r--r--@   1 i  staff  16911159405 Nov 29 14:04 mit-ocw-hash-test.car
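The underlying limitation is that fs.promises.readFile refuses files over 2 GiB. A stdlib-only sketch of the streaming alternative (it produces a plain sha-256 hex digest; ipfs-car's own `hash` command additionally encodes the digest as a CID, which this sketch does not do):

```javascript
// Hedged sketch: hash a file of any size by streaming it through
// crypto.Hash.update() instead of reading it into one Buffer.
import { createHash } from 'node:crypto'
import { createReadStream } from 'node:fs'

async function sha256OfFile (path) {
  const hash = createHash('sha256')
  // createReadStream yields bounded chunks, so memory use stays flat
  // regardless of file size (no ERR_FS_FILE_TOO_LARGE)
  for await (const chunk of createReadStream(path)) {
    hash.update(chunk)
  }
  return hash.digest('hex')
}
```

The same incremental pattern would apply inside the `hash` command: feed the hasher chunk by chunk, then wrap the final digest in whatever multihash/CID encoding the command emits.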

Example from README "fetch and locally verify file..." fails

Example reads:

Fetch and locally verify files from a IPFS gateway over http

curl "https://ipfs.io/ipfs/bafybeidd2gyhagleh47qeg77xqndy2qy3yzn4vkxmk775bg2t5lpuy7pcu?format=car" | ipfs-car unpack

Error: 🔴
Not a file - specify output path with --output


complete example for `packToBlob`

the example for the README explains usage of packToBlob as follows:

import { packToBlob } from 'ipfs-car/pack/blob'
import { MemoryBlockStore } from 'ipfs-car/blockstore/memory' // You can also use the `level-blockstore` module

const { root, car } = await packToBlob({
  input: [new Uint8Array([21, 31, 41])],
  blockstore: new MemoryBlockStore()
})

Unfortunately I cannot use this example to generate a valid CAR file; I specifically struggle with the input param.
It would be helpful to provide a more comprehensive example showing how the input data should be provided. The README says the input should be an ImportCandidateStream and links to utils.ts, but I don't understand how to construct an ImportCandidateStream from that. Maybe it's obvious and I'm missing something here.

Thank you so much in advance.
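For what it's worth, here is a hedged sketch of one input shape the 0.x pack API appears to accept (path + content entries, as in other issues on this page); the wrapWithDirectory flag and the dynamic imports are illustrative, not an official example:

```javascript
// Input entries pair a UnixFS path with content
// (Uint8Array, Blob, or an async iterable of bytes).
const input = [
  { path: 'hello.txt', content: new TextEncoder().encode('hello world\n') },
  { path: 'dir/nested.txt', content: new TextEncoder().encode('nested\n') }
]

// Hypothetical helper, assuming the ipfs-car 0.x API;
// dynamic imports so the sketch parses even without the package installed.
async function makeCar () {
  const { packToBlob } = await import('ipfs-car/pack/blob')
  const { MemoryBlockStore } = await import('ipfs-car/blockstore/memory')
  const { root, car } = await packToBlob({
    input,
    blockstore: new MemoryBlockStore(),
    wrapWithDirectory: true // keep the paths under a root directory
  })
  return { root: root.toString(), car }
}
```

With `wrapWithDirectory: true` the root CID refers to a directory containing `hello.txt` and `dir/`, which is usually what you want when you supply paths.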

Add support for specifying multiple files on the command line and preserve file paths

In other words, work like tar.

For packing:

ipfs-car --pack a1/b1/c1.dat a2/b2/c2.dat f1.dat --output out.car

This avoids the need to link disparate files into a common root directory.

On unpacking:

ipfs-car --unpack out.car

would result in the following in the current working directory:

a1/b1/c1.dat
a2/b2/c2.dat
f1.dat

This avoids the need to track the full file paths separately.

Provide useful argv for ps, etc

Currently, ps shows the ipfs-car args as ?.

top shows them as a little car (as in automobile) icon.

Showing the actual command line arguments is useful for admins etc.
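One possible fix sketch: Node's process.title is writable, so the CLI entry point could label itself before parsing args (platform-dependent length limits apply, and `labelProcess` is a hypothetical helper name, not existing ipfs-car code):

```javascript
// Hedged sketch: set process.title early in the CLI so ps/top show
// the tool name and its arguments instead of "?" or an icon.
function labelProcess (argv) {
  // argv[0] is the node binary, argv[1] the script path; keep the rest
  const title = ['ipfs-car', ...argv.slice(2)].join(' ')
  process.title = title
  return title
}
```

On some platforms the title is truncated to the length of the original argv, so very long argument lists would be cut off.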

Flaky tests on Windows

Sometimes we have windows tests failing with

pack
       with MemoryBlockStore
         "after each" hook for "can packToBlob":
     Error: EPERM: operation not permitted, lstat '\\?\D:\a\ipfs-car\ipfs-car\test\pack\tmp\dir.car'

This likely started happening after a change in the Windows environment used by the jobs. I believe some Windows machines have a different set of permissions that do not allow deletes, given we usually have at least one passing run.

feature: programmable chunker

I'd like to put up some grants for an ipfs-car-fastcdc and an ipfs-car-ffmpeg for application-specific chunking. It would make these a lot easier to write if there were a simple API for passing a custom chunker into ipfs-car 😁

Automate Releases

Currently releases are done manually, we should get them automated

Directories containing only directories are removed from path (different from go-car)

If I pack a file in a nested directory, the intermediate directory is removed from the .car file. This contradicts how go-car behaves.

mkdir -p root/intermediate
echo "hello world" > root/intermediate/test.txt
ipfs-car pack root -o result.car

Looking at the .car file, it does not contain the intermediate directory; it contains the file directly.


The problem lies with the files-from-paths package, which removes the common path, as stated in its usage description.

// Output:
// [
//   { name: 'file.txt', stream: [Function: stream] },
//   { name: 'dir/b.pdf', stream: [Function: stream] },
//   { name: 'dir/images/cat.gif', stream: [Function: stream] },
// ]
// Note: common sub-path ("path/to/") is removed.

ERROR - No command specified

PS E:\Coding Stuff\nftmarketplace\nftmarketplaceapp> npx ipfs-car --pack images --output images.car

ERROR
No command specified.

Run $ ipfs-car --help for more info.
I have .jpg images in the images folder but am still getting the error.

Error: Invalid CAR version: 2

After doing a successful retrieval from a client, I get an error when unpacking.

Unpacking the same file with go-car works.

npx ipfs-car --list 1674816302356116587.car
Error: Invalid CAR version: 2
at readHeader (/root/node_modules/@ipld/car/cjs/lib/decoder.js:59:11)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at async decodeIndexerComplete (/root/node_modules/@ipld/car/cjs/lib/indexer.js:37:28)
at async Function.fromFile (/root/node_modules/@ipld/car/cjs/lib/indexed-reader.js:73:22)
at async listFilesInCar (/root/node_modules/ipfs-car/dist/cjs/cli/lib.js:11:23)

Why won't it skypack

trying to use skypack like:

<!DOCTYPE html>
<head>
  <meta charset="utf-8" />
</head>
<pre id="out"></pre>
<script type="module">
  import { NFTStorage } from 'https://cdn.skypack.dev/nft.storage'
  import { packToBlob } from 'https://cdn.skypack.dev/ipfs-car/pack/blob'

  const endpoint = 'https://api.nft.storage' // the default
  const token =
    new URLSearchParams(window.location.search).get('key') || 'API_KEY' // your API key from https://nft.storage/manage

  function log(msg) {
    msg = JSON.stringify(msg, null, 2)
    document.getElementById('out').innerHTML += `${msg}\n`
  }

  async function main() {
    const store = new NFTStorage({ endpoint, token })
    const data = 'Hello nft.storage!'

    // locally chunk'n'hash the data to get the CID and pack the blocks in to a CAR
    const { root, car } = await packToBlob({
      input: [new TextEncoder().encode(data)],
    })
    const expectedCid = root.toString()
    console.log({ expectedCid })

    // send the CAR to nft.storage, the returned CID will match the one we created above.
    const cid = await store.storeCar(car)

    // verify the service is storing the CID we expect
    const cidsMatch = expectedCid === cid
    log({ data, cid, expectedCid, cidsMatch })

    // check that the CID is pinned
    const status = await store.status(cid)
    log(status)
  }
  main()
</script>

does not work. You get a

Uncaught TypeError: null has no properties
    <anonymous> https://cdn.skypack.dev/-/uint8arrays@v2.1.3/dist=es2020,mode=imports/unoptimized/to-string.js:4
to-string.js:4:23

from:

https://cdn.skypack.dev/-/uint8arrays@v2.1.3/dist=es2020,mode=imports/unoptimized/to-string.js

but there is no obvious reason why it should pull in version 2.1.3 when there is a 2.1.5 version of uint8arrays that does not have the problem. The older version uses TextEncoder from web-encoding (achingbrain/uint8arrays@v2.1.3...v2.1.5), which fails to bundle; the newer 2.1.5 doesn't. WHY IS SKYPACK USING THE OLDER BAD ONE?

413 Request Entity Too Large error

Trying to upload a car file returns this error:

curl -X POST --data-binary @example.car -H 'Authorization: Bearer API_KEY' https://api.nft.storage/upload

<html>
<head><title>413 Request Entity Too Large</title></head>
<body>
<center><h1>413 Request Entity Too Large</h1></center>
<hr><center>cloudflare</center>
</body>
</html>

From the FAQ this shouldn't happen:

Are there any size restrictions for stored NFTs?
NFT.storage can store NFTs up to 32GB in size! (There was previously a 100MB limit due to Cloudflare workers but NFT.storage now supports chunked uploads, allowing files bigger than 100MB to be uploaded! šŸŽ‰)

How to specify v0 for CIDs?

Hello, I am trying to migrate over 100,000 files off of Pinata. A significant portion of them unfortunately are v0 CIDs and have to stay available that way (with the exact same v0 CIDs) because of the way they are recorded in the blockchain. I tried the pinning service for nft.storage; that seems to not work at all (pins just get stuck indefinitely, like maybe they can't be retrieved or something?), and it would be a single file at a time anyway.

I am having success with new pins with CAR files made using default options. They can use any type of CID so the default v1s that come out are great and we just record those in the new NFT records.

I think I can make CAR files with ipfs dag export and then putCar each one individually, and that should work. But it feels like a last resort, because again, that's over 100K requests.

What I really want is to use ipfs-car on about 100 files at a time, but somehow get it to give me v0 CIDs. I could then look at the CAR and verify they match the old v0 CIDs that are on Pinata and in the NFT records (which we can't change).
Is there any way to get it to put v0 CIDs in the CAR?

I found https://github.com/ipfs/js-ipfs-unixfs/tree/master/packages/ipfs-unixfs-importer and am hoping, if there is no other way, that I can tweak that to give me v0 CIDs.

Is there some old CAR program that just always does v0 by default maybe?

Thanks for any advice.
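A hedged sketch of the ipfs-unixfs-importer route mentioned above: the importer takes cidVersion and rawLeaves options, which together should yield Qm... (v0) CIDs. Option names may vary between versions, and `importAsV0` is a hypothetical helper, so treat this as a starting point rather than a recipe:

```javascript
// CIDv0 settings for the UnixFS importer.
const v0Options = {
  cidVersion: 0,   // emit CIDv0 (dag-pb, base58btc "Qm..." strings)
  rawLeaves: false // CIDv0 cannot address raw leaf blocks
}

// Hypothetical helper: run the importer directly with v0 settings,
// collecting the resulting CIDs. Dynamic import so the sketch parses
// even where ipfs-unixfs-importer is not installed.
async function importAsV0 (files, blockstore) {
  const { importer } = await import('ipfs-unixfs-importer')
  const cids = []
  for await (const entry of importer(files, blockstore, v0Options)) {
    cids.push(entry.cid.toString())
  }
  return cids
}
```

To reproduce Pinata's exact CIDs you would also need to match its chunk size and layout settings; comparing a few known files first would confirm the parameters before running the full 100K migration.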

Crash on packing

Crash when trying to pack a 19GB folder

<--- Last few GCs --->

[10721:0x4721100]   579166 ms: Scavenge (reduce) 2045.5 (2050.4) -> 2044.9 (2051.9) MB, 9.2 / 0.0 ms  (average mu = 0.293, current mu = 0.262) allocation failure 
[10721:0x4721100]   584654 ms: Mark-sweep (reduce) 2045.8 (2053.9) -> 2038.9 (2053.9) MB, 5467.1 / 0.2 ms  (+ 1.7 ms in 4060 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 6128 ms) (average mu = 0.238, current mu = 0.18

<--- JS stacktrace --->

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
 1: 0xa04200 node::Abort() [🚘]
 2: 0x94e4e9 node::FatalError(char const*, char const*) [🚘]
 3: 0xb7978e v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [🚘]
 4: 0xb79b07 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [🚘]
 5: 0xd34395  [🚘]
 6: 0xd34f1f  [🚘]
 7: 0xd42fab v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [🚘]
 8: 0xd46b6c v8::internal::Heap::AllocateRawWithRetryOrFailSlowPath(int, v8::internal::AllocationType, v8::internal::AllocationOrigin, v8::internal::AllocationAlignment) [🚘]
 9: 0xd1524b v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationType, v8::internal::AllocationOrigin) [🚘]
10: 0x105b23f v8::internal::Runtime_AllocateInYoungGeneration(int, unsigned long*, v8::internal::Isolate*) [🚘]
11: 0x1401219  [🚘]

Received "aborted" signal

Specify CID version?

Hi guys! This is more of a question:

Is there a way to specify CID version during CAR creation?

Context:

I need to obtain CAR files for IPFS content that already has CIDs assigned. Therefore, I need to make sure the hashes match the original CIDs when using this CLI tool. How can I achieve this?

Root CID does not match js/go-ipfs when packing a dir with 10k sub dirs

Reported by @obo20

Singular file uploads went fine, and a folder of 100 files went fine, but when I tried a folder of 10k files (a super common use case), I received a different CID from adding to IPFS (go-ipfs/js-ipfs) than from the ipfs-car output.

the cid I get from go-ipfs and js-ipfs is: bafybeihq6az265aar27wuhzltxrgge5ywwllcgux7wui4z3ddq4i2cskky
the cid I get from ipfs-car is: bafybeigww4x6shkc7vbp7c5slmnw3vo6ioj4gnar6ign5eqbkfpijcavk4

large-folder-10k.zip

Support packing deterministic CAR files

Write the graph out in deterministic graph-traversal order instead of in the order the files are parsed.

Current state

The current implementation of ipfs-car writes the CAR file blocks in no particular order, as follows:

  • receives a glob source
  • relies on ipfs-unixfs-importer for layout and chunking of the source, storing each generated UnixFS block in a temporary blockstore
  • iterates over the blocks stored in the blockstore and writes them into the CAR file

This means we currently produce a different output for the same file than go-ipfs and js-ipfs, which do an ordered walk.

Motivation

Supporting deterministic output will enable ipfs-car to produce the same CAR as the core IPFS implementations and move us towards supporting other use cases, like interacting directly with Filecoin (and perhaps offline deals).

Implementation

Given we currently have two iterations (unixfs importer + blockstore iteration), we can support a deterministic output by getting the root and traversing the graph as in https://github.com/ipld/js-datastore-car/blob/master/car.js#L198-L221

We should make this optional and pluggable, given we will need to add codecs and hashers, which would increase the dependency footprint for users who do not need deterministic CAR files.

We could alternatively support a different function where we skip the two iterations and keep everything in memory. This would be faster, and some users could be OK with the extra memory consumption. But I would say the write performance when creating the CAR file is not the biggest concern; we have been focusing efficiency more on reads than on writes.

cc @rvagg @olizilla @mikeal @alanshaw
