This repository was archived by the owner on Apr 29, 2020. It is now read-only.

Commit 0d316ac (parent: 60ae4d5)

chore: removes importer code

Bonus: exporter tests now run in the browser!

File tree: 225 files changed, +1415 −57607 lines


README.md

Lines changed: 41 additions & 160 deletions
@@ -1,18 +1,17 @@
-IPFS unixFS Engine
-==================
+# ipfs-unixfs-exporter

 [![](https://img.shields.io/badge/made%20by-Protocol%20Labs-blue.svg?style=flat-square)](http://ipn.io)
 [![](https://img.shields.io/badge/project-IPFS-blue.svg?style=flat-square)](http://ipfs.io/)
 [![](https://img.shields.io/badge/freenode-%23ipfs-blue.svg?style=flat-square)](http://webchat.freenode.net/?channels=%23ipfs)
 [![standard-readme compliant](https://img.shields.io/badge/standard--readme-OK-green.svg?style=flat-square)](https://github.com/RichardLitt/standard-readme)
-[![Jenkins](https://ci.ipfs.team/buildStatus/icon?job=ipfs/js-ipfs-unixfs-engine/master)](https://ci.ipfs.team/job/ipfs/job/js-ipfs-unixfs-engine/job/master/)
-[![Codecov](https://codecov.io/gh/ipfs/js-ipfs-unixfs-engine/branch/master/graph/badge.svg)](https://codecov.io/gh/ipfs/js-ipfs-unixfs-engine)
-[![Dependency Status](https://david-dm.org/ipfs/js-ipfs-unixfs-engine.svg?style=flat-square)](https://david-dm.org/ipfs/js-ipfs-unixfs-engine)
+[![Jenkins](https://ci.ipfs.team/buildStatus/icon?job=ipfs/js-ipfs-unixfs-exporter/master)](https://ci.ipfs.team/job/ipfs/job/js-ipfs-unixfs-exporter/job/master/)
+[![Codecov](https://codecov.io/gh/ipfs/js-ipfs-unixfs-exporter/branch/master/graph/badge.svg)](https://codecov.io/gh/ipfs/js-ipfs-unixfs-exporter)
+[![Dependency Status](https://david-dm.org/ipfs/js-ipfs-unixfs-exporter.svg?style=flat-square)](https://david-dm.org/ipfs/js-ipfs-unixfs-exporter)
 [![js-standard-style](https://img.shields.io/badge/code%20style-standard-brightgreen.svg?style=flat-square)](https://github.com/feross/standard)
 ![](https://img.shields.io/badge/npm-%3E%3D3.0.0-orange.svg?style=flat-square)
 ![](https://img.shields.io/badge/Node.js-%3E%3D8.0.0-orange.svg?style=flat-square)

-> JavaScript implementation of the layout and chunking mechanisms used by IPFS to handle Files
+> JavaScript implementation of the exporter used by IPFS to handle Files

 ## Lead Maintainer

@@ -22,181 +21,80 @@ IPFS unixFS Engine

 - [Install](#install)
 - [Usage](#usage)
-  - [Example Importer](#example-importer)
-  - [Importer API](#importer-api)
-    - [const add = new Importer(dag)](#const-add--new-importerdag)
-  - [Example Exporter](#example-exporter)
-  - [Exporter: API](#exporter-api)
-    - [new Exporter(hash, dagService)](#new-exporterhash-dagservice)
+  - [Example](#example)
+  - [API](#api)
+    - [exporter(cid, ipld)](#exportercid-ipld-options)
 - [Contribute](#contribute)
 - [License](#license)

 ## Install

 ```
-> npm install ipfs-unixfs-engine
+> npm install ipfs-unixfs-exporter
 ```

 ## Usage

-### Importer
-
-#### Importer example
-
-Let's create a little directory to import:
-
-```sh
-> cd /tmp
-> mkdir foo
-> echo 'hello' > foo/bar
-> echo 'world' > foo/quux
-```
-
-And write the importing logic:
-
-```js
-const Importer = require('ipfs-unixfs-engine').Importer
-
-// You need to create and pass an ipld-resolve instance
-// https://github.com/ipld/js-ipld-resolver
-const filesAddStream = new Importer(<ipld-resolver instance>)
-
-// An array to hold the return of nested file/dir info from the importer
-// A root DAG Node is received upon completion
-
-const res = []
-
-// Import path /tmp/foo/bar
-const rs = fs.createReadStream(file)
-const rs2 = fs.createReadStream(file2)
-const input = { path: '/tmp/foo/bar', content: rs }
-const input2 = { path: '/tmp/foo/quxx', content: rs2 }
-
-// Listen for the data event from the importer stream
-filesAddStream.on('data', (info) => res.push(info))
-
-// The end event of the stream signals that the importer is done
-filesAddStream.on('end', () => console.log('Finished filesAddStreaming files!'))
-
-// Calling write on the importer to filesAddStream the file/object tuples
-filesAddStream.write(input)
-filesAddStream.write(input2)
-filesAddStream.end()
-```
-
-When run, the stat of DAG Node is outputted for each file on data event until the root:
-
-```js
-{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
-  size: 39243,
-  path: '/tmp/foo/bar' }
-
-{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
-  size: 59843,
-  path: '/tmp/foo/quxx' }
-
-{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
-  size: 93242,
-  path: '/tmp/foo' }
-
-{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
-  size: 94234,
-  path: '/tmp' }
-
-```
-
-#### Importer API
-
-```js
-const Importer = require('ipfs-unixfs-engine').Importer
-```
-
-#### const import = new Importer(dag [, options])
-
-The `import` object is a duplex pull stream that takes objects of the form:
-
-```js
-{
-  path: 'a name',
-  content: (Buffer or Readable stream)
-}
-```
-
-`import` will output file info objects as files get stored in IPFS. When stats on a node are emitted they are guaranteed to have been written.
-
-`dag` is an instance of the [`IPLD Resolver`](https://github.com/ipld/js-ipld-resolver) or the [`js-ipfs` `dag api`](https://github.com/ipfs/interface-ipfs-core/blob/master/SPEC/DAG.md)
-
-The input's file paths and directory structure will be preserved in the [`dag-pb`](https://github.com/ipld/js-ipld-dag-pb) created nodes.
-
-`options` is an JavaScript option that might include the following keys:
-
-- `wrap` (boolean, defaults to false): if true, a wrapping node will be created
-- `shardSplitThreshold` (positive integer, defaults to 1000): the number of directory entries above which we decide to use a sharding directory builder (instead of the default flat one)
-- `chunker` (string, defaults to `"fixed"`): the chunking strategy. Now only supports `"fixed"`
-- `chunkerOptions` (object, optional): the options for the chunker. Defaults to an object with the following properties:
-  - `maxChunkSize` (positive integer, defaults to `262144`): the maximum chunk size for the `fixed` chunker.
-- `strategy` (string, defaults to `"balanced"`): the DAG builder strategy name. Supports:
-  - `flat`: flat list of chunks
-  - `balanced`: builds a balanced tree
-  - `trickle`: builds [a trickle tree](https://github.com/ipfs/specs/pull/57#issuecomment-265205384)
-- `maxChildrenPerNode` (positive integer, defaults to `174`): the maximum children per node for the `balanced` and `trickle` DAG builder strategies
-- `layerRepeat` (positive integer, defaults to 4): (only applicable to the `trickle` DAG builder strategy). The maximum repetition of parent nodes for each layer of the tree.
-- `reduceSingleLeafToSelf` (boolean, defaults to `true`): optimization for, when reducing a set of nodes with one node, reduce it to that node.
-- `dirBuilder` (object): the options for the directory builder
-  - `hamt` (object): the options for the HAMT sharded directory builder
-    - bits (positive integer, defaults to `8`): the number of bits at each bucket of the HAMT
-- `progress` (function): a function that will be called with the byte length of chunks as a file is added to ipfs.
-- `onlyHash` (boolean, defaults to false): Only chunk and hash - do not write to disk
-- `hashAlg` (string): multihash hashing algorithm to use
-- `cidVersion` (integer, default 0): the CID version to use when storing the data (storage keys are based on the CID, _including_ it's version)
-- `rawLeaves` (boolean, defaults to false): When a file would span multiple DAGNodes, if this is true the leaf nodes will not be wrapped in `UnixFS` protobufs and will instead contain the raw file bytes
-- `leafType` (string, defaults to `'file'`) what type of UnixFS node leaves should be - can be `'file'` or `'raw'` (ignored when `rawLeaves` is `true`)
-
-### Exporter
-
-#### Exporter example
+### Example

 ```js
 // Create an export source pull-stream cid or ipfs path you want to export and a
 // <dag or ipld-resolver instance> to fetch the file from
-const filesStream = Exporter(<cid or ipfsPath>, <dag or ipld-resolver instance>)
+const exporter = require('ipfs-unixfs-exporter')
+const pull = require('pull-stream/pull')
+const { stdout } = require('pull-stdio')

-// Pipe the return stream to console
-filesStream.on('data', (file) => file.content.pipe(process.stdout))
+const options = {}
+
+pull(
+  exporter(cid, ipld, options),
+  collect((error, files) => {
+    if (error) {
+      // ...handle error
+    }
+
+    // Set up a pull stream that sends the file content to process.stdout
+    pull(
+      // files[0].content is a pull-stream that contains the bytes of the file
+      files[0].content,
+      stdout()
+    )
+  })
+)
 ```

-#### Exporter API
+#### API

 ```js
-const Exporter = require('ipfs-unixfs-engine').Exporter
+const exporter = require('ipfs-unixfs-exporter')
 ```

-### new Exporter(<cid or ipfsPath>, <dag or ipld-resolver>, <options>)
+### exporter(cid, ipld, options)

-Uses the given [dag API][] or an [ipld-resolver instance][] to fetch an IPFS [UnixFS][] object(s) by their multiaddress.
+Uses the given [dag API][] or an [ipld-resolver instance][] to fetch an IPFS [UnixFS][] object(s) by their CID.

-Creates a new readable stream in object mode that outputs objects of the form
+Creates a new pull stream that outputs objects of the form

 ```js
 {
   path: 'a name',
-  content: (Buffer or Readable stream)
+  content: <pull stream>
 }
 ```

 #### `offset` and `length`

-`offset` and `length` arguments can optionally be passed to the reader function. These will cause the returned stream to only emit bytes starting at `offset` and with length of `length`.
+`offset` and `length` arguments can optionally be passed to the exporter function. These will cause the returned stream to only emit bytes starting at `offset` and with length of `length`.

-See [the tests](test/reader.js) for examples of using these arguments.
+See [the tests](test/exporter.js) for examples of using these arguments.

 ```js
-const exporter = require('ipfs-unixfs-engine').exporter
+const exporter = require('ipfs-unixfs-exporter')
 const pull = require('pull-stream')
 const drain = require('pull-stream/sinks/drain')

 pull(
-  exporter(cid, ipldResolver, {
+  exporter(cid, ipld, {
     offset: 0,
     length: 10
   })
@@ -206,31 +104,14 @@ pull(
 )
 ```

-#### Errors
-
-Errors are received by [pull-stream][] sinks.
-
-```js
-const exporter = require('ipfs-unixfs-engine').exporter
-const pull = require('pull-stream')
-const collect = require('pull-stream/sinks/collect')
-
-pull(
-  exporter(cid, ipldResolver)
-  collect((error, chunks) => {
-    // handle the error
-  })
-)
-```
-
 [dag API]: https://github.com/ipfs/interface-ipfs-core/blob/master/SPEC/DAG.md
 [ipld-resolver instance]: https://github.com/ipld/js-ipld-resolver
 [UnixFS]: https://github.com/ipfs/specs/tree/master/unixfs
 [pull-stream]: https://www.npmjs.com/package/pull-stream

 ## Contribute

-Feel free to join in. All welcome. Open an [issue](https://github.com/ipfs/js-ipfs-unixfs-engine/issues)!
+Feel free to join in. All welcome. Open an [issue](https://github.com/ipfs/js-ipfs-unixfs-exporter/issues)!

 This repository falls under the IPFS [Code of Conduct](https://github.com/ipfs/community/blob/master/code-of-conduct.md).
js-ipfs-unixfs-exporter

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
+../../js-ipfs-unixfs-exporter

package.json

Lines changed: 13 additions & 32 deletions
@@ -1,12 +1,11 @@
 {
-  "name": "ipfs-unixfs-engine",
+  "name": "ipfs-unixfs-exporter",
   "version": "0.34.0",
-  "description": "JavaScript implementation of the unixfs Engine used by IPFS",
+  "description": "JavaScript implementation of the UnixFs exporter used by IPFS",
   "leadMaintainer": "Alex Potsides <alex.potsides@protocol.ai>",
   "main": "src/index.js",
   "browser": {
-    "fs": false,
-    "rabin": false
+    "fs": false
   },
   "scripts": {
     "test": "aegir test",
@@ -22,60 +21,42 @@
   },
   "repository": {
     "type": "git",
-    "url": "git+https://github.com/ipfs/js-ipfs-unixfs-engine.git"
+    "url": "git+https://github.com/ipfs/js-ipfs-unixfs-exporter.git"
   },
   "keywords": [
     "IPFS"
   ],
   "license": "MIT",
   "bugs": {
-    "url": "https://github.com/ipfs/js-ipfs-unixfs-engine/issues"
+    "url": "https://github.com/ipfs/js-ipfs-unixfs-exporter/issues"
   },
   "engines": {
     "node": ">=8.0.0",
     "npm": ">=3.0.0"
   },
-  "homepage": "https://github.com/ipfs/js-ipfs-unixfs-engine#readme",
+  "homepage": "https://github.com/ipfs/js-ipfs-unixfs-exporter#readme",
   "devDependencies": {
     "aegir": "^17.0.0",
     "chai": "^4.2.0",
+    "detect-node": "^2.0.4",
     "dirty-chai": "^2.0.1",
-    "ipfs-block-service": "~0.15.1",
-    "ipfs-repo": "~0.25.0",
+    "ipfs-unixfs-importer": "~0.34.0",
     "ipld": "~0.20.0",
-    "mkdirp": "~0.5.1",
-    "multihashes": "~0.4.14",
-    "ncp": "^2.0.0",
-    "pull-generate": "^2.2.0",
+    "ipld-dag-pb": "~0.15.0",
+    "pull-pushable": "^2.2.0",
     "pull-stream-to-stream": "^1.3.4",
     "pull-zip": "^2.0.1",
-    "rimraf": "^2.6.2",
-    "sinon": "^7.1.0"
+    "sinon": "^7.1.0",
+    "stream-to-pull-stream": "^1.7.2"
   },
   "dependencies": {
     "async": "^2.6.1",
     "cids": "~0.5.5",
-    "deep-extend": "~0.6.0",
     "ipfs-unixfs": "~0.1.16",
-    "ipld-dag-pb": "~0.15.0",
-    "left-pad": "^1.3.0",
-    "multihashing-async": "~0.5.1",
-    "pull-batch": "^1.0.0",
-    "pull-block": "^1.4.0",
     "pull-cat": "^1.1.11",
-    "pull-pair": "^1.1.0",
     "pull-paramap": "^1.2.2",
-    "pull-pause": "0.0.2",
-    "pull-pushable": "^2.2.0",
     "pull-stream": "^3.6.9",
-    "pull-through": "^1.0.18",
-    "pull-traverse": "^1.0.3",
-    "pull-write": "^1.1.4",
-    "sparse-array": "^1.3.1",
-    "stream-to-pull-stream": "^1.7.2"
-  },
-  "optionalDependencies": {
-    "rabin": "^1.6.0"
+    "pull-traverse": "^1.0.3"
   },
   "contributors": [
     "Alan Shaw <alan@tableflip.io>",
