I have the URL to a possibly large (100+ MB) file. How do I save it to a local directory using fetch?

I looked around but there don't seem to be a lot of resources/tutorials on how to do this.

Thank you!

I'm creating an Electron app, and fetch is supported. Why fetch instead of plain http? Because it's a lot easier to use (or so it has seemed so far). – Gloomy Jun 3, 2016 at 12:53

If someone is looking for a way to save a file using the Fetch API in the browser (and came across this answer), please take a look here: stackoverflow.com/a/42274086/350384 – Mariusz Pawelski Feb 16, 2017 at 12:36

See below for an example that uses the native Node.js http / https libraries. Note that I don't have to deal with 301/302 redirects, so it is straightforward. – angstyloop Oct 4, 2022 at 16:14
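For anyone who, like the second comment above, needs the browser case rather than Node: here is a minimal sketch of the usual approach (saveInBrowser and its arguments are placeholder names, not taken from the linked answer). It fetches the resource into a Blob and triggers the download through a temporary link:

async function saveInBrowser(url: string, fileName: string): Promise<void> {
    const response = await fetch(url);
    if (!response.ok) throw new Error(`Download failed: ${response.status}`);
    const blob = await response.blob();              // buffers the whole file in memory
    const objectUrl = URL.createObjectURL(blob);
    const a = document.createElement("a");
    a.href = objectUrl;
    a.download = fileName;                           // suggested name for the saved file
    document.body.appendChild(a);
    a.click();                                       // triggers the browser's save/download behaviour
    a.remove();
    URL.revokeObjectURL(objectUrl);                  // release the object URL once the click has been dispatched
}

Note that this buffers the whole response in memory before saving, unlike the streaming Node solutions below.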

Updated solution for Node 18:

const fs = require("fs");
const { mkdir } = require("fs/promises");
const { Readable } = require("stream");
const { finished } = require("stream/promises");
const path = require("path");

const downloadFile = async (url, fileName) => {
  const res = await fetch(url);
  if (!fs.existsSync("downloads")) await mkdir("downloads"); // optional if you already have a downloads directory
  const destination = path.resolve("./downloads", fileName);
  const fileStream = fs.createWriteStream(destination, { flags: "wx" });
  await finished(Readable.fromWeb(res.body).pipe(fileStream));
};

downloadFile("<url_to_fetch>", "<filename>");

Old answer (works up to Node 16):

Using the Fetch API, you could write a function that downloads from a URL like this (you will need node-fetch@2; install it with npm i node-fetch@2):

const fetch = require("node-fetch");
const fs = require("fs");

const downloadFile = async (url, path) => {
  const res = await fetch(url);
  const fileStream = fs.createWriteStream(path);
  await new Promise((resolve, reject) => {
    res.body.pipe(fileStream);
    res.body.on("error", reject);
    fileStream.on("finish", resolve);
  });
};

You could even make it a little shorter by writing res.body.on('error', reject); and fileStream.on('finish', resolve);. – Ricki-BumbleDev Jun 14, 2020 at 10:21

The function which calls downloadFile does not wait for it to resolve the promise. I'm calling this function like this: await downloadFile(URL, path). Would you mind correcting me? – Swapnil Jun 21, 2022 at 3:19

Just a style preference, but especially for short example code I much prefer the explicit async function downloadFile style over const somevar = async (...) => {...}. – Purefan Aug 30, 2022 at 12:58
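Regarding Swapnil's comment: await only suspends inside an async function (or at the top level of an ES module), so the caller of downloadFile has to be async itself. A minimal sketch, assuming the downloadFile defined in the answer above and placeholder URL/path values:

const main = async (): Promise<void> => {
  // downloadFile resolves only once the write stream has finished
  await downloadFile("https://example.com/big-file.zip", "./big-file.zip"); // placeholder URL and path
  console.log("Download finished");
};

main().catch(console.error);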

If you want to avoid explicitly making a Promise like in the other very fine answer, and are ok with building a buffer of the entire 100+ MB file, then you could do something simpler:

const fetch = require('node-fetch');
const {writeFile} = require('fs');
const {promisify} = require('util');

const writeFilePromise = promisify(writeFile);

function downloadFile(url, outputPath) {
  return fetch(url)
      .then(x => x.arrayBuffer())
      .then(x => writeFilePromise(outputPath, Buffer.from(x)));
}

But the other answer will be more memory-efficient since it's piping the received data stream directly into a file without accumulating all of it in a Buffer.

I have tried this code but got an error: [Error: EISDIR: illegal operation on a directory, open 'D:\Work\repo\'] { errno: -4068, code: 'EISDIR', syscall: 'open', path: 'D:\\Work\\repo\\' } – Scott Jones May 23, 2022 at 9:08

@ScottJones EISDIR means "Error: IS Directory": you're giving Node a directory when it expects a file. Just use d:\work\repo\file.txt, for example. – Ahmed Fasih May 23, 2022 at 16:45
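To make Ahmed Fasih's fix concrete, here is a small sketch (the directory, file name, and URL are placeholders) that joins a file name onto the directory before calling one of the downloadFile variants above:

import { join } from "path";

const directory = "D:\\Work\\repo";           // placeholder directory
const fileName = "downloaded-file.zip";       // placeholder file name
const outputPath = join(directory, fileName); // e.g. D:\Work\repo\downloaded-file.zip

// Passing the bare directory raises EISDIR; a full file path works.
await downloadFile("https://example.com/big-file.zip", outputPath);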

Older answers here involve node-fetch, but since Node.js v18.x this can be done with no extra dependencies.

The body of a fetch response is a web stream. It can be converted to a Node fs stream using Readable.fromWeb, which can then be piped into a write stream created by fs.createWriteStream. If desired, the resulting stream can then be turned into a Promise using the promise version of stream.finished.

const fs = require('fs');
const { Readable } = require('stream');
const { finished } = require('stream/promises');
const stream = fs.createWriteStream('output.txt');
const { body } = await fetch('https://example.com');
await finished(Readable.fromWeb(body).pipe(stream));
That can also be nicely compacted in one line: const download = async (url, path) => Readable.fromWeb((await fetch(url)).body).pipe(fs.createWriteStream(path)) – Jamby Dec 29, 2022 at 8:42

Does this download the entire file (await fetch(...)) before starting the write stream? – 1252748 Feb 2 at 0:50

Argument of type 'ReadableStream<Uint8Array>' is not assignable to parameter of type 'ReadableStream<any>'. Type 'ReadableStream<Uint8Array>' is missing the following properties from type 'ReadableStream<any>': values, [Symbol.asyncIterator] ts(2345) – RonH Mar 8 at 13:41

@RonH Unfortunately it looks like there are two different ReadableStream definitions, as per stackoverflow.com/questions/63630114/…. You should be able to cast body to the correct ReadableStream from 'stream/web'; i.e. import { ReadableStream } from 'stream/web'; and body as ReadableStream<any>. – antonok Mar 8 at 22:11
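Following up on antonok's comment, here is a TypeScript sketch of that cast (the downloadFile wrapper and its signature are only illustrative, not part of the answer above):

import { createWriteStream } from "fs";
import { Readable } from "stream";
import { finished } from "stream/promises";
import type { ReadableStream } from "stream/web";

const downloadFile = async (url: string, path: string): Promise<void> => {
    const response = await fetch(url);
    if (!response.body) throw new Error("Response has no body");
    // Assert the DOM-typed body to the Node 'stream/web' ReadableStream that Readable.fromWeb expects.
    await finished(Readable.fromWeb(response.body as ReadableStream<any>).pipe(createWriteStream(path)));
};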
const {createWriteStream} = require('fs');
const {pipeline} = require('stream/promises');
const fetch = require('node-fetch');

const downloadFile = async (url, path) => pipeline(
    (await fetch(url)).body,
    createWriteStream(path)
);

I get error TypeError: Cannot read property 'on' of undefined at destroyer (internal/streams/pipeline.js:23:10) – Codler Oct 17, 2020 at 7:35
import { existsSync } from "fs";
import { mkdir, writeFile } from "fs/promises";
import { join } from "path";

export const download = async (url: string, ...folders: string[]) => {
    const fileName = url.split("/").pop();
    const path = join("./downloads", ...folders);
    if (!existsSync(path)) await mkdir(path);
    const filePath = join(path, fileName);
    const response = await fetch(url);
    const blob = await response.blob();
    // const bos = Buffer.from(await blob.arrayBuffer())
    const bos = blob.stream();
    await writeFile(filePath, bos);
    return { path, fileName, filePath };
};

// call it like this ↓
await download("file-url", "subfolder-1", "subfolder-2", ...)
Your answer could be improved by adding more information on what the code does and how it helps the OP. – Tyler2P Aug 9, 2022 at 8:38

I was looking for a similar use case: I wanted to fetch a bunch of API endpoints and save the JSON responses to some static files, so I came up with my own solution. Hope it helps.

const fetch = require('node-fetch'),
    fs = require('fs'),
    VERSIOINS_FILE_PATH = './static/data/versions.json',
    endpoints = [
        {
            name: 'example1',
            type: 'exampleType1',
            url: 'https://example.com/api/url/1',
            filePath: './static/data/exampleResult1.json',
            updateFrequency: 7 // days
        },
        {
            name: 'example2',
            type: 'exampleType1',
            url: 'https://example.com/api/url/2',
            filePath: './static/data/exampleResult2.json',
            updateFrequency: 7
        },
        {
            name: 'example3',
            type: 'exampleType2',
            url: 'https://example.com/api/url/3',
            filePath: './static/data/exampleResult3.json',
            updateFrequency: 30
        },
        {
            name: 'example4',
            type: 'exampleType2',
            url: 'https://example.com/api/url/4',
            filePath: './static/data/exampleResult4.json',
            updateFrequency: 30
        }
    ],
    checkOrCreateFolder = () => {
        var dir = './static/data/';
        if (!fs.existsSync(dir)) {
            fs.mkdirSync(dir);
        }
    },
    syncStaticData = () => {
        checkOrCreateFolder();
        let fetchList = [],
            versions = [];
        endpoints.forEach(endpoint => {
            if (requiresUpdate(endpoint)) {
                console.log(`Updating ${endpoint.name} data... : `, endpoint.filePath);
                fetchList.push(endpoint);
            } else {
                console.log(`Using cached ${endpoint.name} data... : `, endpoint.filePath);
                let endpointVersion = JSON.parse(fs.readFileSync(endpoint.filePath, 'utf8')).lastUpdate;
                versions.push({
                    name: endpoint.name + "Data",
                    version: endpointVersion
                });
            }
        });
        if (fetchList.length > 0) {
            Promise.all(fetchList.map(endpoint => fetch(endpoint.url, { "method": "GET" })))
                .then(responses => Promise.all(responses.map(response => response.json())))
                .then(results => {
                    results.forEach((endpointData, index) => {
                        let endpoint = fetchList[index];
                        let processedData = processData(endpoint.type, endpointData.data);
                        let fileData = {
                            data: processedData,
                            lastUpdate: Date.now() // unix timestamp
                        };
                        versions.push({
                            name: endpoint.name + "Data",
                            version: fileData.lastUpdate
                        });
                        fs.writeFileSync(endpoint.filePath, JSON.stringify(fileData));
                        console.log('updated data: ', endpoint.filePath);
                    });
                })
                .catch(err => console.log(err));
        }
        fs.writeFileSync(VERSIOINS_FILE_PATH, JSON.stringify(versions));
        console.log('updated versions: ', VERSIOINS_FILE_PATH);
    },
    recursiveRemoveKey = (object, keyname) => {
        object.forEach((item) => {
            if (item.items) { // items is the nesting key; if it exists, recurse (change as required)
                recursiveRemoveKey(item.items, keyname);
            }
            delete item[keyname];
        });
    },
    processData = (type, data) => {
        // anything you want to do with the data before it is written to the file
        let processedData = type === 'exampleType1' ? processType1Data(data) : processType2Data(data);
        return processedData;
    },
    processType1Data = data => {
        let fetchedData = [...data];
        recursiveRemoveKey(fetchedData, 'count');
        return fetchedData;
    },
    processType2Data = data => {
        let fetchedData = [...data];
        recursiveRemoveKey(fetchedData, 'keywords');
        return fetchedData;
    },
    requiresUpdate = endpoint => {
        if (fs.existsSync(endpoint.filePath)) {
            let fileData = JSON.parse(fs.readFileSync(endpoint.filePath));
            let lastUpdate = fileData.lastUpdate;
            let now = new Date();
            let diff = now - lastUpdate;
            let diffDays = Math.ceil(diff / (1000 * 60 * 60 * 24));
            if (diffDays >= endpoint.updateFrequency) {
                return true;
            } else {
                return false;
            }
        }
        return true;
    };

syncStaticData();

link to github gist

If you don't need to deal with 301/302 responses (when things have been moved), you can actually just do it in one line with the Node.js native libraries http and/or https.

You can run this example one-liner in the node shell. It just uses the https module to download a gzipped tarball of some source code to the directory where you started the node shell. (You start a node shell by typing node at the command line on any OS where Node.js has been installed.)

require('https').get("https://codeload.github.com/angstyloop/js-utils/tar.gz/refs/heads/develop", it => it.pipe(require('fs').createWriteStream("develop.tar.gz")));

If you don't need/want HTTPS, use this instead:

require('http').get("http://codeload.github.com/angstyloop/js-utils/tar.gz/refs/heads/develop", it => it.pipe(require('fs').createWriteStream("develop.tar.gz")));
