I have the URL of a possibly large (100+ MB) file. How do I save it to a local directory using fetch?
I looked around, but there don't seem to be many resources/tutorials on how to do this.
Thank you!
Updated solution for Node 18:
const fs = require("fs");
const { mkdir } = require("fs/promises");
const { Readable } = require("stream");
const { finished } = require("stream/promises");
const path = require("path");

const downloadFile = async (url, fileName) => {
  const res = await fetch(url);
  if (!fs.existsSync("downloads")) await mkdir("downloads"); // Optional if you already have a downloads directory
  const destination = path.resolve("./downloads", fileName);
  const fileStream = fs.createWriteStream(destination, { flags: "wx" }); // 'wx' fails if the file already exists
  await finished(Readable.fromWeb(res.body).pipe(fileStream));
};

downloadFile("<url_to_fetch>", "<filename>");
Old answer (works up to Node 16):
Using the Fetch API, you can write a function that downloads from a URL like this. You will need node-fetch@2 (run npm i node-fetch@2):
const fetch = require("node-fetch");
const fs = require("fs");

const downloadFile = async (url, path) => {
  const res = await fetch(url);
  const fileStream = fs.createWriteStream(path);
  await new Promise((resolve, reject) => {
    res.body.pipe(fileStream);
    res.body.on("error", reject);
    fileStream.on("finish", resolve);
  });
};
If you want to avoid explicitly creating a Promise as in the other very fine answer, and you are OK with buffering the entire 100+ MB file in memory, you could do something simpler:
const fetch = require('node-fetch');
const {writeFile} = require('fs');
const {promisify} = require('util');
const writeFilePromise = promisify(writeFile);
function downloadFile(url, outputPath) {
  return fetch(url)
    .then(x => x.arrayBuffer())
    .then(x => writeFilePromise(outputPath, Buffer.from(x)));
}
But the other answer will be more memory-efficient since it's piping the received data stream directly into a file without accumulating all of it in a Buffer.
Older answers here involve node-fetch, but since Node.js v18.x this can be done with no extra dependencies.
The body of a fetch response is a web stream. It can be converted to a Node stream using Readable.fromWeb, which can then be piped into a write stream created by fs.createWriteStream. If desired, the resulting stream can then be turned into a Promise using the promise version of stream.finished.
const fs = require('fs');
const { Readable } = require('stream');
const { finished } = require('stream/promises');

// (top-level await requires an ES module; otherwise wrap this in an async function)
const stream = fs.createWriteStream('output.txt');
const { body } = await fetch('https://example.com');
await finished(Readable.fromWeb(body).pipe(stream));
const { createWriteStream } = require('fs');
const { pipeline } = require('stream/promises');
const fetch = require('node-fetch');

const downloadFile = async (url, path) => pipeline(
  (await fetch(url)).body,
  createWriteStream(path)
);
import { existsSync } from "fs";
import { mkdir, writeFile } from "fs/promises";
import { join } from "path";
export const download = async (url: string, ...folders: string[]) => {
  const fileName = url.split("/").pop()!;
  const path = join("./downloads", ...folders);
  if (!existsSync(path)) await mkdir(path, { recursive: true }); // recursive, since nested subfolders are allowed
  const filePath = join(path, fileName);
  const response = await fetch(url);
  const blob = await response.blob();
  // const bos = Buffer.from(await blob.arrayBuffer())
  const bos = blob.stream();
  await writeFile(filePath, bos);
  return { path, fileName, filePath };
};
// call like that ↓
await download("file-url", "subfolder-1", "subfolder-2", ...)
I was looking for a similar use case: I wanted to fetch a bunch of API endpoints and save the JSON responses to static files, so I came up with my own solution. Hope it helps.
const fetch = require('node-fetch'),
  fs = require('fs'),
  VERSIONS_FILE_PATH = './static/data/versions.json',
  endpoints = [
    {
      name: 'example1',
      type: 'exampleType1',
      url: 'https://example.com/api/url/1',
      filePath: './static/data/exampleResult1.json',
      updateFrequency: 7 // days
    },
    {
      name: 'example2',
      type: 'exampleType1',
      url: 'https://example.com/api/url/2',
      filePath: './static/data/exampleResult2.json',
      updateFrequency: 7
    },
    {
      name: 'example3',
      type: 'exampleType2',
      url: 'https://example.com/api/url/3',
      filePath: './static/data/exampleResult3.json',
      updateFrequency: 30
    },
    {
      name: 'example4',
      type: 'exampleType2',
      url: 'https://example.com/api/url/4',
      filePath: './static/data/exampleResult4.json',
      updateFrequency: 30
    }
  ];

const checkOrCreateFolder = () => {
  var dir = './static/data/';
  if (!fs.existsSync(dir)) {
    fs.mkdirSync(dir);
  }
};

const syncStaticData = () => {
  checkOrCreateFolder();
  let fetchList = [],
    versions = [];
  endpoints.forEach(endpoint => {
    if (requiresUpdate(endpoint)) {
      console.log(`Updating ${endpoint.name} data... : `, endpoint.filePath);
      fetchList.push(endpoint);
    } else {
      console.log(`Using cached ${endpoint.name} data... : `, endpoint.filePath);
      let endpointVersion = JSON.parse(fs.readFileSync(endpoint.filePath, 'utf8')).lastUpdate;
      versions.push({
        name: endpoint.name + "Data",
        version: endpointVersion
      });
    }
  });
  if (fetchList.length > 0) {
    Promise.all(fetchList.map(endpoint => fetch(endpoint.url, { method: "GET" })))
      .then(responses => Promise.all(responses.map(response => response.json())))
      .then(results => {
        results.forEach((endpointData, index) => {
          let endpoint = fetchList[index];
          let processedData = processData(endpoint.type, endpointData.data);
          let fileData = {
            data: processedData,
            lastUpdate: Date.now() // unix timestamp
          };
          versions.push({
            name: endpoint.name + "Data",
            version: fileData.lastUpdate
          });
          fs.writeFileSync(endpoint.filePath, JSON.stringify(fileData));
          console.log('updated data: ', endpoint.filePath);
        });
        fs.writeFileSync(VERSIONS_FILE_PATH, JSON.stringify(versions));
        console.log('updated versions: ', VERSIONS_FILE_PATH);
      })
      .catch(err => console.log(err));
  }
};

const recursiveRemoveKey = (object, keyname) => {
  object.forEach((item) => {
    if (item.items) { // 'items' is the nesting key; if it exists, recurse (change as required)
      recursiveRemoveKey(item.items, keyname);
    }
    delete item[keyname];
  });
};

const processData = (type, data) => {
  // anything you want to do with the data before it is written to the file
  let processedData = type === 'exampleType1' ? processType1Data(data) : processType2Data(data);
  return processedData;
};

const processType1Data = data => {
  let fetchedData = [...data];
  recursiveRemoveKey(fetchedData, 'count');
  return fetchedData;
};

const processType2Data = data => {
  let fetchedData = [...data];
  recursiveRemoveKey(fetchedData, 'keywords');
  return fetchedData;
};

const requiresUpdate = endpoint => {
  if (fs.existsSync(endpoint.filePath)) {
    let fileData = JSON.parse(fs.readFileSync(endpoint.filePath));
    let lastUpdate = fileData.lastUpdate;
    let now = new Date();
    let diff = now - lastUpdate;
    let diffDays = Math.ceil(diff / (1000 * 60 * 60 * 24));
    return diffDays >= endpoint.updateFrequency;
  }
  return true;
};

syncStaticData();
link to github gist
If you don't need to deal with 301/302 responses (when things have been moved), you can actually do it in one line with the Node.js native libraries http and/or https.
You can run this example one-liner in the node shell. It just uses the https module to download a GNU zip file of some source code to the directory where you started the node shell. (You start a node shell by typing node at the command line for your OS where Node.js has been installed.)
require('https').get("https://codeload.github.com/angstyloop/js-utils/tar.gz/refs/heads/develop", it => it.pipe(require('fs').createWriteStream("develop.tar.gz")));
If you don't need/want HTTPS use this instead:
require('http').get("http://codeload.github.com/angstyloop/js-utils/tar.gz/refs/heads/develop", it => it.pipe(require('fs').createWriteStream("develop.tar.gz")));