I’ve got this working:

import { create } from "ipfs-http-client"; // or "ipfs-core"
import { concat as uint8ArrayConcat } from "uint8arrays/concat";
import all from "it-all";

fastify.get(
  "/v1/files/:cid",
  async function (request: any, reply: any) {
    const { cid }: { cid: string } = request.params;
    const ipfs = create();
    // Buffers the entire file into memory before responding.
    const data = uint8ArrayConcat(await all(ipfs.cat(cid)));
    reply.type("image/png").send(data);
  }
);

but I’m concerned that accumulating the entire content of the stream in memory will cause problems with large files.
Fastify does support streams. From their docs: "If you are sending a stream and you have not set a 'Content-Type' header, send will set it to 'application/octet-stream'."

fastify.get('/streams', function (request, reply) {
  const fs = require('fs')
  const stream = fs.createReadStream('some-file', 'utf8')
  reply.send(stream)
})
I thought this would work:

const { Readable } = require("stream");
...
fastify.get(
  "/v1/files/:cid",
  async function (request: any, reply: any) {
    const { cid }: { cid: string } = request.params;
    const ipfs = create();
    const readableStream = new Readable({
      async read() {
        for await (const chunk of ipfs.cat(cid)) {
          this.push(chunk);
        }
        this.push(null);
      },
    });
    reply.type("image/png").send(readableStream);
  }
);

but I get INFO (48299): stream closed prematurely.
and my second attempt just hangs:

const readableStream = new Readable({
  async start(controller: any) {
    for await (const chunk of ipfs.cat(cid)) {
      controller.enqueue(chunk);
    }
    controller.close();
  },
});
readableStream.read = () => {};
reply.send(readableStream);
So is there a way to get a large file out of IPFS and stream it to the user without memory issues? Or am I overthinking it?