Summary: in this tutorial, you will learn how to use streaming to process large result sets more efficiently.
This tutorial begins where the Calling a stored procedure with an OUTPUT parameter in Node.js tutorial left off.
How to use Node.js streams to handle large result sets
A stream allows you to handle a large result set efficiently by processing it row by row, rather than loading it all into memory at once.
Streaming is useful when you handle large result sets from SQL Server without consuming too much memory in your Node.js application.
The following describes the steps for using a stream:
First, connect to the SQL Server:
const pool = await sql.connect(config);
Second, create a Request object that represents a query executed against the SQL Server, and enable streaming:
const request = new sql.Request(pool);
request.stream = true;
Third, handle data as it arrives. To do that, you use the on() method of the Request object. The request.on() method handles various events that occur during the execution of a query when streaming is enabled:
- recordset: Triggered when the column metadata is available.
- row: Triggered for each row in the result set.
- error: Triggered when an error occurs.
- done: Triggered when the query execution is complete.
For example, the following shows how to process each row of data as it arrives using the request.on() method with the row event:
request.on('row', row => {
  console.log('Row:', row);
});
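The other events are handled the same way. For example, here is a minimal sketch (reusing the request object from above) that logs errors and a completion message:
request.on('error', (err) => {
  console.error('Error:', err);
});

request.on('done', (result) => {
  // result.rowsAffected reports how many rows the query produced
  console.log('Done. Rows affected:', result.rowsAffected);
});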
Streaming data into a CSV file example
We’ll show you how to query data from the Authors table and stream it directly to a file to avoid loading everything into memory:
Step 1. Create a new file named stream.js within the project directory.
Step 2. Define a function exportAuthorsToCSV that exports all rows from the Authors table to a CSV file:
import fs from 'fs';
import sql from 'mssql';
import { config } from './config.js';
const exportAuthorsToCSV = async (csvFilename) => {
  try {
    // Create a write stream for the CSV file
    const writeStream = fs.createWriteStream(csvFilename);

    // Connect to the database
    const pool = await sql.connect(config);

    // Create a new request with streaming enabled
    const request = new sql.Request(pool);
    request.stream = true;

    // Execute the query
    request.query('SELECT * FROM Authors ORDER BY FirstName');

    // Write the CSV header when the column metadata arrives
    request.on('recordset', (columns) => {
      let header = '';
      for (const column in columns) {
        header += `${column},`;
      }
      // remove the last comma (,)
      header = header.slice(0, -1);
      // write the header
      writeStream.write(header + '\n');
    });

    // Process each row when it arrives
    request.on('row', ({ AuthorID, FirstName, LastName, BirthDate }) => {
      const csvRow = `${AuthorID},${FirstName},${LastName},${BirthDate.toLocaleDateString()}`;
      writeStream.write(csvRow + '\n');
    });

    // Error handler for the request
    request.on('error', (err) => {
      console.error('Error:', err);
      writeStream.end();
    });

    // Event handler for when the query is done
    request.on('done', (result) => {
      console.log('Query done. Rows affected:', result.rowsAffected);
      writeStream.end();
    });
  } catch (err) {
    console.error(err);
  }
};

export { exportAuthorsToCSV };
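Note that this function writes field values as-is, which works for the Authors table but would break the CSV layout if a value contained a comma, quote, or newline. If your data may include such characters, you could wrap each value with a small escaping helper; the csvEscape function below is a hypothetical addition, not part of the tutorial code:
const csvEscape = (value) => {
  const s = String(value ?? '');
  // Quote the value and double any embedded quotes if it contains a comma, quote, or newline
  return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
};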
How it works.
First, import the fs, sql, and config objects from the fs, mssql, and config.js modules, respectively.
import fs from 'fs';
import sql from 'mssql';
import { config } from './config.js';
Second, define a function exportAuthorsToCSV that accepts a CSV filename:
const exportAuthorsToCSV = async (csvFilename) => {
  //...
};
Third, create a write stream for the CSV file:
const writeStream = fs.createWriteStream(csvFilename);
Fourth, connect to the SQL Server using the provided config object.
const pool = await sql.connect(config);
Note that the config object contains the necessary parameters for connecting to the SQL Server, including the username, password, server, and database name.
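For reference, a typical config.js might look like the following sketch. The values are placeholders; if you've followed the earlier tutorials in this series, you already have this file with your own settings:
export const config = {
  user: 'your_username',
  password: 'your_password',
  server: 'localhost',
  database: 'your_database',
  options: {
    trustServerCertificate: true, // for local development only
  },
};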
Fifth, create a new Request with streaming enabled:
const request = new sql.Request(pool);
request.stream = true;
Sixth, execute a query that retrieves data from the Authors table:
request.query('SELECT * FROM Authors ORDER BY FirstName');
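Note that with streaming enabled, you don't await the query; the results arrive through the events instead. You can still parameterize the query before executing it. For example, the following sketch filters the authors by birth date (the filter itself is just an illustration, not part of the tutorial code):
request.input('minBirthDate', sql.Date, new Date('1800-01-01'));
request.query(
  'SELECT * FROM Authors WHERE BirthDate >= @minBirthDate ORDER BY FirstName'
);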
Seventh, write the header of the CSV file once in the recordset event handler:
request.on('recordset', (columns) => {
  let header = '';
  for (const column in columns) {
    header += `${column},`;
  }
  // remove the last comma (,)
  header = header.slice(0, -1);
  // write the header
  writeStream.write(header + '\n');
});
How it works.
- Initialize the header to an empty string.
- Iterate over the properties of the columns object using the for...in loop and concatenate the property names (AuthorID, FirstName, LastName, and BirthDate) into a single string. Note that the columns object contains all the fields in the Authors table, including AuthorID, FirstName, LastName, and BirthDate.
- Remove the last comma (,) from the header string using the slice() method.
- Write the header into the CSV file by calling the write() method of the writeStream object.
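As a side note, because columns is a plain object keyed by column name, you could build the same header more concisely with Object.keys():
request.on('recordset', (columns) => {
  // join the column names with commas and write the header line
  writeStream.write(Object.keys(columns).join(',') + '\n');
});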
Eighth, in the row event handler, write each row into the CSV file as it arrives:
request.on('row', ({ AuthorID, FirstName, LastName, BirthDate }) => {
  const csvRow = `${AuthorID},${FirstName},${LastName},${BirthDate.toLocaleDateString()}`;
  writeStream.write(csvRow + '\n');
});
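If the query produces rows faster than the file can absorb them, writeStream.write() returns false to signal backpressure. Since the mssql request supports pause() and resume(), a more defensive version of this handler could look like the following sketch:
request.on('row', ({ AuthorID, FirstName, LastName, BirthDate }) => {
  const csvRow = `${AuthorID},${FirstName},${LastName},${BirthDate.toLocaleDateString()}`;
  if (!writeStream.write(csvRow + '\n')) {
    // the file buffer is full; pause the query until the buffer drains
    request.pause();
    writeStream.once('drain', () => request.resume());
  }
});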
Ninth, in the error event handler, log the error to the console and end the stream:
request.on('error', (err) => {
  console.error('Error:', err);
  writeStream.end();
});
Tenth, in the done event handler, log a message to the console and end the stream:
request.on('done', (result) => {
  console.log('Query done. Rows affected:', result.rowsAffected);
  writeStream.end();
});
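If nothing else in your application uses the pool, you could also close it here once the query finishes, for example:
request.on('done', (result) => {
  console.log('Query done. Rows affected:', result.rowsAffected);
  writeStream.end();
  pool.close(); // release the connection pool when no more queries will run
});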
Eleventh, log the error message to the console in the catch block:
console.error(err);
Finally, export the exportAuthorsToCSV function.
Step 3. Modify the index.js file to use the exportAuthorsToCSV function:
import { exportAuthorsToCSV } from './stream.js';
exportAuthorsToCSV('data/output.csv');
How it works.
First, import the exportAuthorsToCSV function from the stream.js module:
import { exportAuthorsToCSV } from './stream.js';
Second, call the exportAuthorsToCSV() function to export author data to the CSV file located at data/output.csv:
exportAuthorsToCSV('data/output.csv');
Step 4. Open your terminal and execute the following command to run the index.js file:
npm start
If the program runs successfully, you’ll see a new output file output.csv in the data directory (note that the data directory must already exist; fs.createWriteStream() doesn’t create it):
AuthorID,FirstName,LastName,BirthDate
2,Agatha,Christie,9/15/1890
9,Charles,Dickens,2/7/1812
22,Emily,Bronte,7/30/1818
...
Summary
- Use streaming to handle large result sets efficiently.
- Set the stream property of the Request object to true to enable streaming.
- Use the request.on() method to handle events that occur during the execution of a query once streaming is enabled.