This repository was archived by the owner on Aug 12, 2020. It is now read-only.

Commit d0485ef (1 parent: 04e7483)

update the readme

File tree: 1 file changed

README.md (+33 −36 lines changed)
````diff
@@ -1,4 +1,5 @@
-# IPFS unixFS Engine
+IPFS unixFS Engine
+==================
 
 [![](https://img.shields.io/badge/made%20by-Protocol%20Labs-blue.svg?style=flat-square)](http://ipn.io)
 [![](https://img.shields.io/badge/project-IPFS-blue.svg?style=flat-square)](http://ipfs.io/)
````
````diff
@@ -48,19 +49,19 @@ And write the importing logic:
 ```js
 // Dependencies to create a DAG Service (where the dir will be imported into)
 const memStore = require('abstract-blob-store')
-const ipfsRepo = require('ipfs-repo')
-const ipfsBlock = require('ipfs-block')
-const ipfsBlockService = require('ipfs-block-service')
-const ipfsMerkleDag = require('ipfs-merkle-dag')
+const Repo = require('ipfs-repo')
+const Block = require('ipfs-block')
+const BlockService = require('ipfs-block-service')
+const MerkleDag = require('ipfs-merkle-dag')
 const fs = require('fs')
 
-const repo = new ipfsRepo('', { stores: memStore })
-const blocks = new ipfsBlockService(repo)
-const dag = new ipfsMerkleDag.DAGService(blocks)
+const repo = new Repo('', { stores: memStore })
+const blockService = new BlockService(repo)
+const dagService = new MerkleDag.DAGService(blockService)
 
 
-const Importer = require('ipfs-unixfs-engine').importer
-const add = new Importer(dag)
+const Importer = require('ipfs-unixfs-engine').Importer
+const filesAddStream = new Importer(dagService)
 
 // An array to hold the return of nested file/dir info from the importer
 // A root DAG Node is received upon completion
@@ -76,26 +77,24 @@ const input2 = {path: /tmp/foo/quxx, content: rs2}
 
 // Listen for the data event from the importer stream
 
-add.on('data', (info) => {
+filesAddStream.on('data', (info) => {
   res.push(info)
 })
 
 // The end event of the stream signals that the importer is done
 
-add.on('end', () => {
-  console.log('Finished adding files!')
-  return
+filesAddStream.on('end', () => {
+  console.log('Finished adding files!')
 })
 
 // Calling write on the importer to add the file/object tuples
 
-add.write(input)
-add.write(input2)
-add.end()
+filesAddStream.write(input)
+filesAddStream.write(input2)
+filesAddStream.end()
 ```
````
````diff
 
 When run, the stat of each DAG Node is output on the data event for every file, up to the root:
-
 ```
 { multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
   size: 39243,
````
````diff
@@ -143,38 +142,37 @@ Nodes.
 ### Example Exporter
 
 ```
-const ipfsRepo = require('ipfs-repo')
-const ipfsBlock = require('ipfs-block')
-const ipfsBlockService = require('ipfs-block-service')
-const ipfsMerkleDag = require('ipfs-merkle-dag')
+const Repo = require('ipfs-repo')
+const Block = require('ipfs-block')
+const BlockService = require('ipfs-block-service')
+const MerkleDag = require('ipfs-merkle-dag')
 
-const repo = new ipfsRepo('', { stores: memStore })
-const blocks = new ipfsBlockService(repo)
-const dag = new ipfsMerkleDag.DAGService(blocks)
+const repo = new Repo('', { stores: memStore })
+const blockService = new BlockService(repo)
+const dagService = new MerkleDag.DAGService(blockService)
 
 // Create an export readable object stream with the hash you want to export and a dag service
 
-const exportEvent = Exporter(hash, dag)
+const filesStream = Exporter(<multihash>, dagService)
 
 // Pipe the return stream to console
 
-exportEvent.on('data', (result) => {
-  result.stream.pipe(process.stdout)
+filesStream.on('data', (file) => {
+  file.content.pipe(process.stdout)
 })
 ```
````
````diff
 
 ### Exporter: API
+
 ```js
-const Exporter = require('ipfs-unixfs-engine').exporter
+const Exporter = require('ipfs-unixfs-engine').Exporter
 ```
````
````diff
 
 ### new Exporter(hash, dagService)
 
-Uses the given [DAG Service][] to fetch an IPFS [UnixFS][] object(s) by their
-multiaddress.
+Uses the given [DAG Service][] to fetch IPFS [UnixFS][] objects by their multihash.
 
-Creates a new readable stream in object mode that outputs objects of the
-form
+Creates a new readable stream in object mode that outputs objects of the form
 
 ```js
 {
@@ -183,8 +181,7 @@ form
 }
 ```
 
-Errors are received as with a normal stream, by listening on the `'error'` event
-to be emitted.
+Errors are received as with a normal stream, by listening for the `'error'` event.
 
 
 [DAG Service]: https://github.com/vijayee/js-ipfs-merkle-dag/
````
