This repository was archived by the owner on Aug 12, 2020. It is now read-only.

Commit 27565ff

chore: remove importer and exporter code
1 parent 60ae4d5 commit 27565ff

File tree: 223 files changed, +68 / -58125 lines

Some content is hidden: large commits have some content hidden by default.

README.md

Lines changed: 37 additions & 174 deletions
@@ -1,5 +1,4 @@
-IPFS unixFS Engine
-==================
+# ipfs-unixfs-engine

 [![](https://img.shields.io/badge/made%20by-Protocol%20Labs-blue.svg?style=flat-square)](http://ipn.io)
 [![](https://img.shields.io/badge/project-IPFS-blue.svg?style=flat-square)](http://ipfs.io/)
@@ -22,12 +21,8 @@ IPFS unixFS Engine

 - [Install](#install)
 - [Usage](#usage)
-- [Example Importer](#example-importer)
-- [Importer API](#importer-api)
-- [const add = new Importer(dag)](#const-add--new-importerdag)
-- [Example Exporter](#example-exporter)
-- [Exporter: API](#exporter-api)
-- [new Exporter(hash, dagService)](#new-exporterhash-dagservice)
+- [Importing a file](#importing-a-file)
+- [Exporting a file](#exporting-a-file)
 - [Contribute](#contribute)
 - [License](#license)

@@ -39,195 +34,63 @@ IPFS unixFS Engine

 ## Usage

-### Importer
+The `unixfs-engine` exports the [`unixfs-importer`](https://npmjs.com/packages/ipfs-unixfs-importer) and [`unixfs-exporter`](https://npmjs.com/packages/ipfs-unixfs-exporter) modules. Please see those modules for for full documentation.

-#### Importer example
+### Importing a file

-Let's create a little directory to import:
+The importer is a [pull-stream through](https://github.com/pull-stream/pull-stream#through) which takes objects of the form `{ path, content }` where `path` is a string path and `content` can be a `Buffer`, a `ReadableStream` or a `pull-stream` that emits `Buffer`s.

-```sh
-> cd /tmp
-> mkdir foo
-> echo 'hello' > foo/bar
-> echo 'world' > foo/quux
-```
-
-And write the importing logic:
-
-```js
-const Importer = require('ipfs-unixfs-engine').Importer
-
-// You need to create and pass an ipld-resolve instance
-// https://github.com/ipld/js-ipld-resolver
-const filesAddStream = new Importer(<ipld-resolver instance>)
-
-// An array to hold the return of nested file/dir info from the importer
-// A root DAG Node is received upon completion
-
-const res = []
-
-// Import path /tmp/foo/bar
-const rs = fs.createReadStream(file)
-const rs2 = fs.createReadStream(file2)
-const input = { path: '/tmp/foo/bar', content: rs }
-const input2 = { path: '/tmp/foo/quxx', content: rs2 }
-
-// Listen for the data event from the importer stream
-filesAddStream.on('data', (info) => res.push(info))
-
-// The end event of the stream signals that the importer is done
-filesAddStream.on('end', () => console.log('Finished filesAddStreaming files!'))
-
-// Calling write on the importer to filesAddStream the file/object tuples
-filesAddStream.write(input)
-filesAddStream.write(input2)
-filesAddStream.end()
-```
+It requires an [ipld](https://npmjs.com/packages/ipld) resolver to persist [DAGNodes](https://npmjs.com/packages/ipld-dag-pb) and make them available over IPFS.

-When run, the stat of DAG Node is outputted for each file on data event until the root:
+See the [`unixfs-importer`](https://npmjs.com/packages/ipfs-unixfs-importer) module for full documentation.

 ```js
-{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
-  size: 39243,
-  path: '/tmp/foo/bar' }
-
-{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
-  size: 59843,
-  path: '/tmp/foo/quxx' }
-
-{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
-  size: 93242,
-  path: '/tmp/foo' }
-
-{ multihash: <Buffer 12 20 bd e2 2b 57 3f 6f bd 7c cc 5a 11 7f 28 6c a2 9a 9f c0 90 e1 d4 16 d0 5f 42 81 ec 0c 2a 7f 7f 93>,
-  size: 94234,
-  path: '/tmp' }
-
-```
-
-#### Importer API
-
-```js
-const Importer = require('ipfs-unixfs-engine').Importer
-```
-
-#### const import = new Importer(dag [, options])
-
-The `import` object is a duplex pull stream that takes objects of the form:
-
-```js
-{
-  path: 'a name',
-  content: (Buffer or Readable stream)
-}
-```
-
-`import` will output file info objects as files get stored in IPFS. When stats on a node are emitted they are guaranteed to have been written.
-
-`dag` is an instance of the [`IPLD Resolver`](https://github.com/ipld/js-ipld-resolver) or the [`js-ipfs` `dag api`](https://github.com/ipfs/interface-ipfs-core/blob/master/SPEC/DAG.md)
-
-The input's file paths and directory structure will be preserved in the [`dag-pb`](https://github.com/ipld/js-ipld-dag-pb) created nodes.
-
-`options` is an JavaScript option that might include the following keys:
-
-- `wrap` (boolean, defaults to false): if true, a wrapping node will be created
-- `shardSplitThreshold` (positive integer, defaults to 1000): the number of directory entries above which we decide to use a sharding directory builder (instead of the default flat one)
-- `chunker` (string, defaults to `"fixed"`): the chunking strategy. Now only supports `"fixed"`
-- `chunkerOptions` (object, optional): the options for the chunker. Defaults to an object with the following properties:
-  - `maxChunkSize` (positive integer, defaults to `262144`): the maximum chunk size for the `fixed` chunker.
-- `strategy` (string, defaults to `"balanced"`): the DAG builder strategy name. Supports:
-  - `flat`: flat list of chunks
-  - `balanced`: builds a balanced tree
-  - `trickle`: builds [a trickle tree](https://github.com/ipfs/specs/pull/57#issuecomment-265205384)
-- `maxChildrenPerNode` (positive integer, defaults to `174`): the maximum children per node for the `balanced` and `trickle` DAG builder strategies
-- `layerRepeat` (positive integer, defaults to 4): (only applicable to the `trickle` DAG builder strategy). The maximum repetition of parent nodes for each layer of the tree.
-- `reduceSingleLeafToSelf` (boolean, defaults to `true`): optimization for, when reducing a set of nodes with one node, reduce it to that node.
-- `dirBuilder` (object): the options for the directory builder
-  - `hamt` (object): the options for the HAMT sharded directory builder
-    - bits (positive integer, defaults to `8`): the number of bits at each bucket of the HAMT
-- `progress` (function): a function that will be called with the byte length of chunks as a file is added to ipfs.
-- `onlyHash` (boolean, defaults to false): Only chunk and hash - do not write to disk
-- `hashAlg` (string): multihash hashing algorithm to use
-- `cidVersion` (integer, default 0): the CID version to use when storing the data (storage keys are based on the CID, _including_ it's version)
-- `rawLeaves` (boolean, defaults to false): When a file would span multiple DAGNodes, if this is true the leaf nodes will not be wrapped in `UnixFS` protobufs and will instead contain the raw file bytes
-- `leafType` (string, defaults to `'file'`) what type of UnixFS node leaves should be - can be `'file'` or `'raw'` (ignored when `rawLeaves` is `true`)
-
-### Exporter
-
-#### Exporter example
-
-```js
-// Create an export source pull-stream cid or ipfs path you want to export and a
-// <dag or ipld-resolver instance> to fetch the file from
-const filesStream = Exporter(<cid or ipfsPath>, <dag or ipld-resolver instance>)
-
-// Pipe the return stream to console
-filesStream.on('data', (file) => file.content.pipe(process.stdout))
-```
-
-#### Exporter API
-
-```js
-const Exporter = require('ipfs-unixfs-engine').Exporter
-```
-
-### new Exporter(<cid or ipfsPath>, <dag or ipld-resolver>, <options>)
-
-Uses the given [dag API][] or an [ipld-resolver instance][] to fetch an IPFS [UnixFS][] object(s) by their multiaddress.
-
-Creates a new readable stream in object mode that outputs objects of the form
-
-```js
-{
-  path: 'a name',
-  content: (Buffer or Readable stream)
-}
-```
-
-#### `offset` and `length`
-
-`offset` and `length` arguments can optionally be passed to the reader function. These will cause the returned stream to only emit bytes starting at `offset` and with length of `length`.
-
-See [the tests](test/reader.js) for examples of using these arguments.
-
-```js
-const exporter = require('ipfs-unixfs-engine').exporter
+const {
+  importer
+} = require('ipfs-unixfs-engine')
 const pull = require('pull-stream')
-const drain = require('pull-stream/sinks/drain')

+// Import path /tmp/foo/bar
 pull(
-  exporter(cid, ipldResolver, {
-    offset: 0,
-    length: 10
-  })
-  drain((file) => {
-    // file.content is a pull stream containing only the first 10 bytes of the file
+  pull.values([{
+    path: '/tmp/foo/bar',
+    content: fs.createReadStream(file)
+  }]),
+
+  // You need to create and pass an ipld resolver instance
+  // https://npmjs.com/packages/ipld
+  importer(<ipld-resolver instance>, <options>),
+
+  // Handle the error and do something with the results
+  pull.collect((err, files) => {
+    console.info(files)
   })
 )
 ```

-#### Errors
+### Exporting a file
+
+The exporter is a [pull-stream source](https://github.com/pull-stream/pull-stream#through) which takes a [cid](https://npmjs.com/packages/cids) and an [ipld](https://npmjs.com/packages/ipld) resolver.

-Errors are received by [pull-stream][] sinks.
+See the [`unixfs-exporter`](https://npmjs.com/packages/ipfs-unixfs-exporter) module for full documentation.

 ```js
-const exporter = require('ipfs-unixfs-engine').exporter
+const {
+  exporter
+} = require('ipfs-unixfs-engine').exporter
 const pull = require('pull-stream')
-const collect = require('pull-stream/sinks/collect')
+const drain = require('pull-stream/sinks/drain')

 pull(
-  exporter(cid, ipldResolver)
-  collect((error, chunks) => {
-    // handle the error
+  // You need to create and pass an ipld resolver instance
+  // https://npmjs.com/packages/ipld
+  exporter(cid, ipld),
+  drain((file) => {
+    // file.content is a pull stream containing the bytes of the file
   })
 )
 ```

-[dag API]: https://github.com/ipfs/interface-ipfs-core/blob/master/SPEC/DAG.md
-[ipld-resolver instance]: https://github.com/ipld/js-ipld-resolver
-[UnixFS]: https://github.com/ipfs/specs/tree/master/unixfs
-[pull-stream]: https://www.npmjs.com/package/pull-stream

 ## Contribute

 Feel free to join in. All welcome. Open an [issue](https://github.com/ipfs/js-ipfs-unixfs-engine/issues)!
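The removed README above documents the importer's `options` in detail (chunking, DAG-building strategy, raw leaves, progress reporting and so on). As a hedged sketch, assuming `ipfs-unixfs-importer` ~0.34 accepts those same option names through the `importer(ipld, options)` pull-stream shown in the updated README, an import with explicit options might look like this:

```js
// Sketch only: option names are taken from the removed README above; the
// authoritative list now lives in the ipfs-unixfs-importer documentation.
const { importer } = require('ipfs-unixfs-engine')
const pull = require('pull-stream')
const fs = require('fs')

// `ipld` is assumed to be an ipld resolver instance created elsewhere
// (https://npmjs.com/packages/ipld)
pull(
  // One { path, content } entry per file to import
  pull.values([{
    path: '/tmp/foo/bar',
    content: fs.createReadStream('/tmp/foo/bar')
  }]),

  importer(ipld, {
    wrap: true,                               // wrap the input in a directory node
    chunker: 'fixed',                         // the only supported chunking strategy
    chunkerOptions: { maxChunkSize: 262144 }, // bytes per chunk
    strategy: 'balanced',                     // or 'flat' / 'trickle'
    rawLeaves: false,                         // wrap leaves in UnixFS protobufs
    onlyHash: false,                          // true would chunk and hash without writing
    progress: (bytes) => console.log('chunked', bytes, 'bytes')
  }),

  // Each emitted entry carries { path, multihash, size }, as in the removed example output
  pull.collect((err, files) => {
    if (err) throw err
    console.info(files)
  })
)
```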

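The removed `offset` and `length` section has no counterpart in the updated README. A minimal sketch, assuming `ipfs-unixfs-exporter` ~0.34 still honours those options when called as `exporter(cid, ipld, options)`:

```js
// Sketch only: `cid` is assumed to be the CID of a previously imported file and
// `ipld` an ipld resolver instance (https://npmjs.com/packages/ipld).
const { exporter } = require('ipfs-unixfs-engine')
const pull = require('pull-stream')
const drain = require('pull-stream/sinks/drain')

pull(
  exporter(cid, ipld, {
    offset: 0,  // start emitting bytes at this offset
    length: 10  // emit at most this many bytes
  }),
  drain((file) => {
    // file.content is a pull stream containing only the first 10 bytes of the file
  })
)
```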
package.json

Lines changed: 3 additions & 32 deletions
@@ -39,40 +39,11 @@
   "devDependencies": {
     "aegir": "^17.0.0",
     "chai": "^4.2.0",
-    "dirty-chai": "^2.0.1",
-    "ipfs-block-service": "~0.15.1",
-    "ipfs-repo": "~0.25.0",
-    "ipld": "~0.20.0",
-    "mkdirp": "~0.5.1",
-    "multihashes": "~0.4.14",
-    "ncp": "^2.0.0",
-    "pull-generate": "^2.2.0",
-    "pull-stream-to-stream": "^1.3.4",
-    "pull-zip": "^2.0.1",
-    "rimraf": "^2.6.2",
-    "sinon": "^7.1.0"
+    "dirty-chai": "^2.0.1"
   },
   "dependencies": {
-    "async": "^2.6.1",
-    "cids": "~0.5.5",
-    "deep-extend": "~0.6.0",
-    "ipfs-unixfs": "~0.1.16",
-    "ipld-dag-pb": "~0.15.0",
-    "left-pad": "^1.3.0",
-    "multihashing-async": "~0.5.1",
-    "pull-batch": "^1.0.0",
-    "pull-block": "^1.4.0",
-    "pull-cat": "^1.1.11",
-    "pull-pair": "^1.1.0",
-    "pull-paramap": "^1.2.2",
-    "pull-pause": "0.0.2",
-    "pull-pushable": "^2.2.0",
-    "pull-stream": "^3.6.9",
-    "pull-through": "^1.0.18",
-    "pull-traverse": "^1.0.3",
-    "pull-write": "^1.1.4",
-    "sparse-array": "^1.3.1",
-    "stream-to-pull-stream": "^1.7.2"
+    "ipfs-unixfs-exporter": "~0.34.0",
+    "ipfs-unixfs-importer": "~0.34.0"
   },
   "optionalDependencies": {
     "rabin": "^1.6.0"

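With the importer and exporter implementations gone, the package's runtime dependencies reduce to the two extracted modules. The remaining entry point is not shown in this view; going by the updated README, it presumably does little more than re-export them. A hypothetical sketch:

```js
// Hypothetical src/index.js: re-export the extracted modules so that
// require('ipfs-unixfs-engine') keeps working as the README describes.
'use strict'

module.exports = {
  importer: require('ipfs-unixfs-importer'),
  exporter: require('ipfs-unixfs-exporter')
}
```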
src/builder/balanced/balanced-reducer.js

Lines changed: 0 additions & 57 deletions
This file was deleted.

src/builder/balanced/index.js

Lines changed: 0 additions & 12 deletions
This file was deleted.

0 commit comments
