This repository was archived by the owner on Aug 12, 2020. It is now read-only.

feat: dag-api direct support #171

Merged · 1 commit · May 25, 2017
4 changes: 2 additions & 2 deletions .travis.yml
@@ -7,7 +7,7 @@ matrix:
env: CXX=g++-4.8
- node_js: 6
env:
- - SAUCE=true
+ - SAUCE=false
- CXX=g++-4.8
- node_js: stable
env: CXX=g++-4.8
@@ -34,4 +34,4 @@ addons:
sources:
- ubuntu-toolchain-r-test
packages:
- - g++-4.8
\ No newline at end of file
+ - g++-4.8
78 changes: 16 additions & 62 deletions README.md
@@ -12,8 +12,6 @@ IPFS unixFS Engine
![](https://img.shields.io/badge/npm-%3E%3D3.0.0-orange.svg?style=flat-square)
![](https://img.shields.io/badge/Node.js-%3E%3D4.0.0-orange.svg?style=flat-square)

- [![Sauce Test Status](https://saucelabs.com/browser-matrix/ipfs-unixfs-engine.svg)](https://saucelabs.com/u/ipfs-unixfs-engine)

> JavaScript implementation of the layout and chunking mechanisms used by IPFS

## Table of Contents
@@ -29,20 +27,10 @@ IPFS unixFS Engine
- [Contribute](#contribute)
- [License](#license)

- ## BEWARE BEWARE BEWARE there might be 🐉

- This module has passed through several iterations and still is far from a nice and easy understandable codebase. Currently missing features:

- - [ ] tar importer
- - [x] trickle dag exporter
- - [ ] sharding (HAMT)

## Install

With [npm](https://npmjs.org/) installed, run

```
- $ npm install ipfs-unixfs-engine
+ > npm install ipfs-unixfs-engine
```

## Usage
@@ -51,56 +39,35 @@ $ npm install ipfs-unixfs-engine

Let's create a little directory to import:
```sh
- $ cd /tmp
- $ mkdir foo
- $ echo 'hello' > foo/bar
- $ echo 'world' > foo/quux
+ > cd /tmp
+ > mkdir foo
+ > echo 'hello' > foo/bar
+ > echo 'world' > foo/quux
```

And write the importing logic:
```js
- // Dependencies to create a DAG Service (where the dir will be imported into)
- const memStore = require('abstract-blob-store')
- const Repo = require('ipfs-repo')
- const Block = require('ipfs-block')
- const BlockService = require('ipfs-block-service')
- const MerkleDag = require('ipfs-merkle-dag')
const fs = require('fs')

- const repo = new Repo('', { stores: memStore })
- const blockService = new BlockService(repo)
- const dagService = new ipfsMerkleDag.DAGService(blocks)

const Importer = require('ipfs-unixfs-engine').Importer
- const filesAddStream = new Importer(dagService)
+ const filesAddStream = new Importer(<dag or ipld-resolver instance>)

// An array to hold the return of nested file/dir info from the importer
// A root DAG Node is received upon completion

const res = []

// Import path /tmp/foo/bar

const rs = fs.createReadStream('/tmp/foo/bar')
const rs2 = fs.createReadStream('/tmp/foo/quux')
- const input = {path: /tmp/foo/bar, content: rs}
- const input2 = {path: /tmp/foo/quxx, content: rs2}
+ const input = { path: '/tmp/foo/bar', content: rs }
+ const input2 = { path: '/tmp/foo/quux', content: rs2 }

// Listen for the data event from the importer stream

- filesAddStream.on('data', (info) => {
-   res.push(info)
- })
+ filesAddStream.on('data', (info) => res.push(info))

// The end event of the stream signals that the importer is done

- filesAddStream.on('end', () => {
-   console.log('Finished filesAddStreaming files!')
- })
+ filesAddStream.on('end', () => console.log('Finished filesAddStreaming files!'))

// Call write on the importer stream with the file/object tuples

filesAddStream.write(input)
filesAddStream.write(input2)
filesAddStream.end()
```
@@ -129,7 +96,7 @@ When run, the stat of DAG Node is outputted for each file on data event until the
### Importer API

```js
- const Importer = require('ipfs-unixfs-engine').importer
+ const Importer = require('ipfs-unixfs-engine').Importer
```

#### const add = new Importer(dag)
@@ -173,24 +140,11 @@ In the second argument of the importer constructor you can specify the following
### Example Exporter

```js
- const Repo = require('ipfs-repo')
- const Block = require('ipfs-block')
- const BlockService = require('ipfs-block-service')
- const MerkleDAG = require('ipfs-merkle-dag')
-
- const repo = new Repo('', { stores: memStore })
- const blockService = new BlockService(repo)
- const dagService = new MerkleDag.DAGService(blockService)

// Create an export readable object stream with the hash you want to export and a dag service
- const filesStream = Exporter(<multihash>, dag)
+ const filesStream = Exporter(<multihash>, <dag or ipld-resolver instance>)

// Pipe the return stream to console

- filesStream.on('data', (file) => {
-   file.content.pipe(process.stdout)
- }
+ filesStream.on('data', (file) => file.content.pipe(process.stdout))
```

### Exporter: API
@@ -199,9 +153,9 @@
```js
const Exporter = require('ipfs-unixfs-engine').Exporter
```

- ### new Exporter(hash, dagService)
+ ### new Exporter(<hash>, <dag or ipld-resolver>)

- Uses the given [DAG Service][] to fetch an IPFS [UnixFS][] object(s) by their multiaddress.
+ Uses the given dag API or ipld-resolver instance to fetch IPFS [UnixFS][] objects by their multihash.

Creates a new readable stream in object mode that outputs objects of the form

@@ -215,7 +169,7 @@ Creates a new readable stream in object mode that outputs objects of the form
Errors are received as with a normal stream, by listening on the `'error'` event to be emitted.


- [DAG Service]: https://github.com/vijayee/js-ipfs-merkle-dag/
+ [IPLD Resolver]: https://github.com/ipld/js-ipld-resolver
[UnixFS]: https://github.com/ipfs/specs/tree/master/unixfs

## Contribute
25 changes: 13 additions & 12 deletions package.json
@@ -39,11 +39,12 @@
},
"homepage": "https://github.com/ipfs/js-ipfs-unixfs-engine#readme",
"devDependencies": {
- "aegir": "^11.0.1",
+ "aegir": "^11.0.2",
"chai": "^3.5.0",
"dirty-chai": "^1.2.2",
- "ipfs-block-service": "^0.9.0",
- "ipfs-repo": "^0.13.0",
+ "ipfs": "^0.24.0",
+ "ipfs-block-service": "^0.9.1",
+ "ipfs-repo": "^0.13.1",
"ncp": "^2.0.0",
"pre-commit": "^1.2.2",
"pull-generate": "^2.2.0",
@@ -52,27 +53,27 @@
"split": "^1.0.0"
},
"dependencies": {
- "async": "^2.1.5",
+ "async": "^2.4.1",
"cids": "^0.5.0",
- "deep-extend": "^0.4.1",
+ "deep-extend": "^0.5.0",
"ipfs-unixfs": "^0.1.11",
"ipld-dag-pb": "^0.11.0",
- "ipld-resolver": "^0.11.0",
+ "ipld-resolver": "^0.11.1",
"is-ipfs": "^0.3.0",
"left-pad": "^1.1.3",
"lodash": "^4.17.4",
"multihashes": "^0.4.5",
"multihashing-async": "^0.4.5",
"pull-batch": "^1.0.0",
- "pull-block": "^1.1.0",
+ "pull-block": "^1.2.0",
"pull-cat": "^1.1.11",
"pull-pair": "^1.1.0",
- "pull-paramap": "^1.2.1",
+ "pull-paramap": "^1.2.2",
"pull-pause": "0.0.1",
- "pull-pushable": "^2.0.1",
- "pull-stream": "^3.5.0",
+ "pull-pushable": "^2.1.1",
+ "pull-stream": "^3.6.0",
"pull-traverse": "^1.0.3",
- "pull-write": "^1.1.1",
+ "pull-write": "^1.1.2",
"sparse-array": "^1.3.1"
},
"contributors": [
@@ -87,4 +88,4 @@
"jbenet <juan@benet.ai>",
"nginnever <ginneversource@gmail.com>"
]
- }
\ No newline at end of file
+ }
1 change: 1 addition & 0 deletions test/node.js
@@ -50,4 +50,5 @@ describe('IPFS UnixFS Engine', () => {
require('./test-hash-parity-with-go-ipfs')(repo)
require('./test-nested-dir-import-export')(repo)
require('./test-dirbuilder-sharding')(repo)
+ require('./test-dag-api')
})