Reading and Writing Files

Unlike the browser, which limits what data scripts can access on the client's computer, Node.js provides full access to the local file system. The routines its fs module provides for reading and writing files are asynchronous. For example, here's how you'd read a JSON file passed as a command-line argument using a callback:

const fs = require('fs');

const jsonFile = process.argv[2];
fs.readFile(jsonFile, 'utf8', (error, json) => {
  if (error) {
    console.error(error);
  } else {
    const object = JSON.parse(json);
    console.log(object);
  }
});

Try running this script on a JSON file.
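
Assuming you've saved the script in a file named read-json.js (the file names here are hypothetical), an invocation might look like this:

node read-json.js settings.json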

Suppose you want to write a script that loads two JSON files and merges their contents into a single JSON object. There are several ways you might organize the two reads. You could read the first file and then read the second inside the callback to the first:

const fs = require('fs');

fs.readFile(process.argv[2], 'utf8', (error, json) => {
  if (error) {
    console.error(error);
  } else {
    const objectA = JSON.parse(json);
    fs.readFile(process.argv[3], 'utf8', (error, json) => {
      if (error) {
        console.error(error);
      } else {
        const objectB = JSON.parse(json);
        // TODO: merge the objects.
        console.log(objectA);
        console.log(objectB);
      }
    });
  }
});

Try running this script on two JSON files.

The script is already messy, and it will get messier still when you add code to write the merged object to a new file. Another option is the alternative promise-based API, which the fs module exposes as fs.promises. Your first approach might be to sequence the promises together:

const fsPromises = require('fs').promises;

fsPromises.readFile(process.argv[2], 'utf8')
  .then(jsonA => {
    const objectA = JSON.parse(jsonA);
    fsPromises.readFile(process.argv[3], 'utf8')
      .then(jsonB => {
        const objectB = JSON.parse(jsonB);
        // TODO: merge the objects.
        console.log(objectA);
        console.log(objectB);
      })
      .catch(error => console.error(error));
  })
  .catch(error => console.error(error));

Try running this script on two JSON files. Does it still work?

This version of the script is also messy. Furthermore, both this approach and the callback approach force the first read to complete before the second begins, which isn't necessary. A better approach is to issue both read calls immediately and use Promise.all to schedule your follow-up code to run once both reads have finished:

const fsPromises = require('fs').promises;

const promiseA = fsPromises.readFile(process.argv[2], 'utf8');
const promiseB = fsPromises.readFile(process.argv[3], 'utf8');

Promise.all([promiseA, promiseB])
  .then(jsons => {
    const objectA = JSON.parse(jsons[0]);
    const objectB = JSON.parse(jsons[1]);
    // TODO: merge the objects.
  })
  .catch(error => console.error(error));

This is much less messy, and because the two files are read concurrently, it may also finish sooner.
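
Note too that if either read fails, Promise.all rejects as soon as the first error arrives, so the single catch at the end handles failures from both files. Here's a minimal, self-contained sketch of that behavior using already-settled promises:

// Promise.all rejects with the first error among its inputs.
Promise.all([
  Promise.resolve('ok'),
  Promise.reject(new Error('boom')),
])
  .then(values => console.log(values))           // never runs
  .catch(error => console.error(error.message)); // logs "boom"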

To merge the two objects, you use spread syntax, which expands the key-value pairs from an existing object into a new object literal: const newObject = {...oldObject}. When the two objects share a key, the value spread later wins, as this small sketch with made-up keys shows:
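
// Key b appears in both objects, so the later spread's value wins.
const merged = {...{a: 1, b: 2}, ...{b: 3}};
console.log(merged);  // { a: 1, b: 3 }

Then you write the merged object using the writeFile method: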

const fsPromises = require('fs').promises;

const promiseA = fsPromises.readFile(process.argv[2], 'utf8');
const promiseB = fsPromises.readFile(process.argv[3], 'utf8');

Promise.all([promiseA, promiseB])
  .then(jsons => {
    const objectA = JSON.parse(jsons[0]);
    const objectB = JSON.parse(jsons[1]);
    const mergedObject = {...objectA, ...objectB};
    // The null and 2 tell JSON.stringify to pretty-print the output
    // with 2-space indentation.
    fsPromises.writeFile(process.argv[4], JSON.stringify(mergedObject, null, 2))
      .catch(error => console.error(error));
  })
  .catch(error => console.error(error));

Try running this script with the names of two existing JSON files and the name of a third file to which the merged object will be written.
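
For instance, assuming the script is saved as merge-json.js (again, a hypothetical name):

node merge-json.js a.json b.json merged.json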