This repository was archived by the owner on Apr 29, 2020. It is now read-only.
feat: convert to async/await #21
Merged
Commits (11)
- 579311b feat: convert to async/await (achingbrain)
- eea8423 chore: readme update (achingbrain)
- e1334bb chore: fix linting (achingbrain)
- 1c7456b feat: convert internals to be async/await (achingbrain)
- 468cdfb test: increase test coverage (achingbrain)
- 220de8a chore: remove uncessary await (achingbrain)
- 3679e1d feat: add export depth and recursive exports (achingbrain)
- ef5499b chore: address PR comments (achingbrain)
- 747a664 chore: update ipld formats (achingbrain)
- a193651 chore: PR comments (achingbrain)
- 6a73635 chore: standardise error codes (achingbrain)
```js
'use strict'

module.exports = {
  karma: {
    browserNoActivityTimeout: 1000 * 1000
  }
}
```
## Table of Contents

- [ipfs-unixfs-exporter](#ipfs-unixfs-exporter)
- [Lead Maintainer](#lead-maintainer)
- [Table of Contents](#table-of-contents)
- [Install](#install)
- [Usage](#usage)
- [Example](#example)
- [API](#api)
- [`exporter(cid, ipld)`](#exportercid-ipld)
- [UnixFS V1 entries](#unixfs-v1-entries)
- [Raw entries](#raw-entries)
- [CBOR entries](#cbor-entries)
- [`entry.content({ offset, length })`](#entrycontent-offset-length)
- [`exporter.path(cid, ipld)`](#exporterpathcid-ipld)
- [`exporter.recursive(cid, ipld)`](#exporterrecursivecid-ipld)
- [Contribute](#contribute)
- [License](#license)

## Install
### Example

```js
// import a file and export it again
const importer = require('ipfs-unixfs-importer')
const exporter = require('ipfs-unixfs-exporter')

const files = []

for await (const file of importer([{
  path: '/foo/bar.txt',
  content: Buffer.from([0, 1, 2, 3])
}], ipld)) {
  files.push(file)
}

console.info(files[0].cid) // Qmbaz

const entry = await exporter(files[0].cid, ipld)

console.info(entry.cid) // Qmqux
console.info(entry.path) // Qmbaz/foo/bar.txt
console.info(entry.name) // bar.txt
console.info(entry.unixfs.fileSize()) // 4

// stream content from the UnixFS node
const bytes = []

for await (const buf of entry.content({
  offset: 0, // optional offset
  length: 4 // optional length
})) {
  bytes.push(buf)
}

const content = Buffer.concat(bytes)

console.info(content) // 0, 1, 2, 3
```
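The collect-into-an-array pattern used in the example above recurs for every entry type. It can be factored into a small helper; this is a sketch, not part of this module's API (the `it-all` package on npm provides the same utility):

```javascript
// Drain any async iterable into an array. Works with entry.content(),
// exporter.path() and exporter.recursive() alike.
async function all (iterable) {
  const results = []

  for await (const item of iterable) {
    results.push(item)
  }

  return results
}
```

With it, the content-reading step above collapses to `const content = Buffer.concat(await all(entry.content({ offset: 0, length: 4 })))`.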
#### API

```js
const exporter = require('ipfs-unixfs-exporter')
```
### `exporter(cid, ipld)`

Uses the given [js-ipld instance][] to fetch an IPFS node by its CID.

Returns a Promise which resolves to an `entry`.
#### UnixFS V1 entries

Entries with a `dag-pb` codec `CID` return UnixFS V1 entries:

```javascript
{
  name: 'foo.txt',
  path: 'Qmbar/foo.txt',
  cid: CID, // see https://github.com/multiformats/js-cid
  node: DAGNode, // see https://github.com/ipld/js-ipld-dag-pb
  content: function, // returns an async iterator
  unixfs: UnixFS // see https://github.com/ipfs/js-ipfs-unixfs
}
```
If the entry is a file, `entry.content()` returns an async iterator that yields one or more buffers containing the file content:

```javascript
if (entry.unixfs.type === 'file') {
  for await (const chunk of entry.content()) {
    // chunk is a Buffer
  }
}
```
If the entry is a directory or hamt shard, `entry.content()` returns further `entry` objects:

```javascript
if (entry.unixfs.type.includes('directory')) { // can be 'directory' or 'hamt-sharded-directory'
  for await (const child of entry.content()) {
    console.info(child.name)
  }
}
```
#### Raw entries

Entries with a `raw` codec `CID` return raw entries:

```javascript
{
  name: 'foo.txt',
  path: 'Qmbar/foo.txt',
  cid: CID, // see https://github.com/multiformats/js-cid
  node: Buffer, // see https://nodejs.org/api/buffer.html
  content: function // returns an async iterator
}
```
`entry.content()` returns an async iterator that yields a buffer containing the node content:

```javascript
for await (const chunk of entry.content()) {
  // chunk is a Buffer
}
```

Unless you pass an options object containing `offset` and `length` keys to `entry.content()`, `chunk` will be equal to `entry.node`.
#### CBOR entries

Entries with a `dag-cbor` codec `CID` return JavaScript object entries:
```javascript
{
  name: 'foo.txt',
  path: 'Qmbar/foo.txt',
  cid: CID, // see https://github.com/multiformats/js-cid
  node: Object // see https://github.com/ipld/js-ipld-dag-cbor
}
```
There is no `content` function for a `CBOR` node.
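Because only file and raw entries expose `content()`, code that walks a DAG containing mixed codecs can branch on its presence. A sketch assuming the entry shapes documented above (`readEntry` is a hypothetical helper, not part of this module):

```javascript
// Hypothetical helper: return a CBOR entry's decoded object, or the
// concatenated bytes of a file/raw entry's content() iterator.
async function readEntry (entry) {
  if (typeof entry.content !== 'function') {
    // dag-cbor entries have no content() - the decoded object is on entry.node
    return entry.node
  }

  const bufs = []

  for await (const chunk of entry.content()) {
    bufs.push(chunk)
  }

  return Buffer.concat(bufs)
}
```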
#### `entry.content({ offset, length })`

When `entry` is a file or a `raw` node, `offset` and/or `length` arguments can be passed to `entry.content()` to return slices of data:

```javascript
const bufs = []

for await (const chunk of entry.content({
  offset: 0,
  length: 5
})) {
  bufs.push(chunk)
}

// `data` contains the first 5 bytes of the file
const data = Buffer.concat(bufs)
```
|
||
If `entry` is a directory or hamt shard, passing `offset` and/or `length` to `entry.content()` will limit the number of files returned from the directory. | ||
|
||
```javascript | ||
const entries = [] | ||
|
||
for await (const entry of dir.content({ | ||
offset: 0, | ||
length: 5 | ||
})) { | ||
entries.push(entry) | ||
} | ||
|
||
// `entries` contains the first 5 files/directories in the directory | ||
``` | ||
|
||
### `exporter.path(cid, ipld)`

`exporter.path` will return an async iterator that yields entries for all segments in a path:

```javascript
const entries = []

for await (const entry of exporter.path('Qmfoo/foo/bar/baz.txt', ipld)) {
  entries.push(entry)
}

// entries contains 4x `entry` objects
```
### `exporter.recursive(cid, ipld)`

`exporter.recursive` will return an async iterator that yields all entries beneath a given CID or IPFS path, as well as the containing directory:

```javascript
const entries = []

for await (const child of exporter.recursive('Qmfoo/foo/bar', ipld)) {
  entries.push(child)
}

// entries contains all children of the `Qmfoo/foo/bar` directory and its children
```
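One way to consume the recursive iterator is to index the yielded entries by path. A sketch assuming the entry shape documented above (`indexByPath` is hypothetical, not part of this module):

```javascript
// Hypothetical helper: build a Map from entry.path to entry so exported
// files can be looked up without re-walking the DAG.
async function indexByPath (entries) {
  const index = new Map()

  for await (const entry of entries) {
    index.set(entry.path, entry)
  }

  return index
}
```

For example, `await indexByPath(exporter.recursive('Qmfoo/foo/bar', ipld))` would let you fetch any exported entry by its full path.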
[dag API]: https://github.com/ipfs/interface-ipfs-core/blob/master/SPEC/DAG.md