Lecture Notes of Class 18: Working with Streams in Node.js

Objectives:
- Understand the concept of streams in Node.js.
- Learn how to implement readable and writable streams to work with files.
Introduction to Streams

Streams in Node.js are powerful objects that enable handling data as a continuous flow, such as reading from or writing to files, which makes them memory-efficient. Streams allow data to be processed piece by piece as it is read or written, rather than loading all of the data into memory at once.

There are four main types of streams in Node.js:

1. Readable Streams - Used for reading data from a source, like a file or an HTTP request.
2. Writable Streams - Used for writing data to a destination, like a file or an HTTP response.
3. Duplex Streams - Both readable and writable; they allow both reading and writing operations, like a TCP socket.
4. Transform Streams - A type of duplex stream where the output is computed from the input, for example, modifying data before it is written.

In this class, we will focus on Readable and Writable streams, particularly in the context of working with files.
Key Concepts

1. Buffer:
   - A buffer in Node.js is a chunk of memory used to store binary data. Streams often work with buffers to hold data temporarily before it is processed.
2. Stream Events:
   - Streams in Node.js are event-driven, meaning they emit various events such as:
     - 'data': Fired when there is data available to be read.
     - 'end': Fired when the stream has no more data to read.
     - 'error': Fired when an error occurs while processing data.
     - 'finish': Fired when all data has been written to a writable stream.
Lab Tasks

Task 1: Implement a Readable Stream to Read a File

In this task, you will create a readable stream to read the contents of a file in chunks.

Steps:
1. Import the fs module:
   - The fs (File System) module in Node.js provides functions to interact with files. We will use it to create a readable stream.
2. Create a readable stream:
   - Use fs.createReadStream() to create a stream that reads a file. You need to specify the path of the file.
3. Handle stream events:
   - Listen for the 'data' event to handle chunks of data as they are read.
   - Listen for the 'end' event to know when the entire file has been read.

Example Code:

const fs = require('fs');

// Create a readable stream
const readableStream = fs.createReadStream('example.txt', 'utf8');

// Handle 'data' event to read data
readableStream.on('data', (chunk) => {
  console.log('Reading chunk:', chunk);
});

// Handle 'end' event when file reading is complete
readableStream.on('end', () => {
  console.log('File reading finished');
});

// Handle 'error' event for any issues
readableStream.on('error', (err) => {
  console.error('Error occurred while reading file:', err);
});
Explanation:
- fs.createReadStream('example.txt', 'utf8'): Creates a readable stream from the file example.txt; the data will be read as strings because of the 'utf8' encoding.
- 'data' event: The stream reads the file in chunks, and each chunk is passed to the event's callback. You can process each chunk of data here.
- 'end' event: Signifies that the stream has finished reading the file.
- 'error' event: If there is any issue reading the file (e.g., the file doesn't exist), the error is caught here.
Task 2: Implement a Writable Stream to Write Data to a File

In this task, you will create a writable stream to write data to a file.

Steps:
1. Import the fs module:
   - Use the fs module to work with files and create a writable stream.
2. Create a writable stream:
   - Use fs.createWriteStream() to create a stream that writes data to a file. You will specify the file path and, optionally, an encoding.
3. Write data:
   - Use the .write() method to write data to the stream.
   - Once all data is written, call the .end() method to signal the end of the writing process.

Example Code:

const fs = require('fs');

// Create a writable stream to a file
const writableStream = fs.createWriteStream('output.txt', 'utf8');

// Write data to the stream
writableStream.write('Hello, Node.js streams!\n');
writableStream.write('This is another line of text.\n');

// End the stream (finish writing)
writableStream.end();

// Handle 'finish' event
writableStream.on('finish', () => {
  console.log('Data has been written to output.txt');
});

// Handle 'error' event
writableStream.on('error', (err) => {
  console.error('Error occurred while writing to file:', err);
});
Explanation:
- fs.createWriteStream('output.txt', 'utf8'): Creates a writable stream that will write data to output.txt in UTF-8 encoding.
- write() method: Writes a chunk of data to the file on each call.
- end() method: Signals that no more data will be written. Calling .end() is necessary to properly close the stream and finish the writing process.
- 'finish' event: Triggered once the data has been fully written, indicating the completion of the writing operation.
Additional Tips and Concepts:
- Pipes: You can use .pipe() to connect a readable stream to a writable stream. This allows for easy data transfer between streams.

Example:

const fs = require('fs');

const readableStream = fs.createReadStream('input.txt');
const writableStream = fs.createWriteStream('output.txt');

readableStream.pipe(writableStream);

- Stream Chaining: You can chain multiple streams together, such as reading from one stream, transforming the data, and writing to another stream. For example, you could use a transform stream to modify the data as it passes from the readable stream to the writable stream.
Conclusion:
- Streams are an essential concept in Node.js for handling large amounts of data efficiently, especially in I/O operations like reading and writing files.
- In this class, you learned how to work with readable and writable streams to read data from and write data to files.
- Next, you can explore more advanced stream concepts such as duplex and transform streams for real-time applications and data processing.

Lab Tasks Summary:
1. Readable Stream: Implemented a readable stream to read data from a file and processed chunks of data.
2. Writable Stream: Implemented a writable stream to write data to a file and handled the completion of the writing process.
Further Practice:
1. Experiment with different stream events and add more error handling for robustness.
2. Try piping a readable stream into a writable stream to copy a file.
Lab Task Solutions

Here are solutions for the lab tasks and the further-practice exercises that you can implement in your Node.js project.
1. Readable Stream: Read Data from a File and Process Chunks of Data

In this solution, we create a readable stream to read data from a file (input.txt) and process it in chunks, with error handling for robustness.

Solution Code:

const fs = require('fs');

// Create a readable stream from 'input.txt'
const readableStream = fs.createReadStream('input.txt', 'utf8');

// Handle 'data' event to read and process chunks of data
readableStream.on('data', (chunk) => {
  console.log('Reading chunk:', chunk);
  // Process the chunk (for example, modify or analyze the data)
});

// Handle 'end' event to know when the file reading is complete
readableStream.on('end', () => {
  console.log('File reading finished');
});

// Handle 'error' event for any issues while reading the file
readableStream.on('error', (err) => {
  console.error('Error occurred while reading file:', err);
});
Explanation:
- fs.createReadStream('input.txt', 'utf8'): Creates a readable stream for the file input.txt with UTF-8 encoding.
- 'data' event: Fires whenever a chunk of data is available to read. Each chunk is logged to the console.
- 'end' event: Fires when the end of the file is reached, signaling that there is no more data to read.
- 'error' event: Handles any errors, such as the file not existing or being unreadable.
2. Writable Stream: Write Data to a File and Handle the Completion

In this solution, we create a writable stream to write data to a file (output.txt) and handle the completion of the writing process, with error handling for robustness.

Solution Code:

const fs = require('fs');

// Create a writable stream to 'output.txt'
const writableStream = fs.createWriteStream('output.txt', 'utf8');

// Write data to the writable stream
writableStream.write('Hello, this is a test message written to the file.\n');
writableStream.write('We can write multiple lines to the file.\n');

// End the stream (finish writing)
writableStream.end();

// Handle 'finish' event when writing is complete
writableStream.on('finish', () => {
  console.log('Data has been written to output.txt');
});

// Handle 'error' event for any issues while writing to the file
writableStream.on('error', (err) => {
  console.error('Error occurred while writing to file:', err);
});
Explanation:
- fs.createWriteStream('output.txt', 'utf8'): Creates a writable stream to write data to output.txt with UTF-8 encoding.
- write() method: Writes chunks of data to the stream. Each call writes one piece of data.
- end() method: Signals the end of the writing process. This is important to properly close the stream.
- 'finish' event: Fires when all data has been written to the file.
- 'error' event: Catches any errors, such as file-permission or disk-space issues.
3. Further Practice: Experiment with Stream Events and Add More Error Handling

We can improve the previous code by adding more stream events to handle different situations and improve error handling.

Solution Code:

const fs = require('fs');

// Create a readable stream from 'input.txt'
const readableStream = fs.createReadStream('input.txt', 'utf8');

// Handle 'data' event
readableStream.on('data', (chunk) => {
  console.log('Reading chunk:', chunk);
});

// Handle 'end' event
readableStream.on('end', () => {
  console.log('File reading finished');
});

// Handle 'error' event with more detailed error handling
readableStream.on('error', (err) => {
  console.error('Error occurred while reading file:', err.message);
});

// Handle 'close' event to signal when the stream has been fully closed
readableStream.on('close', () => {
  console.log('Stream closed');
});
Explanation:
- 'close' event: Fires when the underlying resource (such as a file descriptor) has been fully closed.
- Improved error handling: The 'error' handler now logs a more focused error message (err.message).
4. Piping a Readable Stream into a Writable Stream to Copy a File

The pipe() method connects a readable stream to a writable stream, so data is transferred from one stream to the other automatically. This is useful when copying files.

Solution Code:

const fs = require('fs');

// Create a readable stream from 'input.txt'
const readableStream = fs.createReadStream('input.txt');

// Create a writable stream to 'output.txt'
const writableStream = fs.createWriteStream('output.txt');

// Pipe the readable stream into the writable stream
readableStream.pipe(writableStream);

// Handle 'finish' event when the piping is complete
writableStream.on('finish', () => {
  console.log('File has been copied successfully');
});

// Handle 'error' event for any issues
readableStream.on('error', (err) => {
  console.error('Error reading file:', err);
});

writableStream.on('error', (err) => {
  console.error('Error writing file:', err);
});
Explanation:
- pipe() method: Automatically transfers data from readableStream to writableStream, which in this case copies the content of input.txt to output.txt.
- 'finish' event: Fires when the writable stream finishes writing all data.
- 'error' events: Catch errors on both the readable and writable streams.
Summary:
- We created readable and writable streams to read from and write to files.
- We added detailed error handling to capture and log issues that occur during stream operations.
- We used the pipe() method to easily copy data from one file to another using streams.

By experimenting with these solutions, you can now efficiently handle large files and work with streams in Node.js for various applications.
Assignments

Here are 10 assignments based on the concept of working with streams in Node.js, along with detailed solutions and explanations for each task. These assignments are intended to give students a thorough understanding of streams and help them practice working with them.
Assignment 1: Reading a File Using a Readable Stream

Objective: Learn how to use a readable stream to read data from a file.

Task: Implement a readable stream to read data from a file data.txt and display it in the console.

Solution:

const fs = require('fs');

// Create a readable stream to read 'data.txt'
const readableStream = fs.createReadStream('data.txt', 'utf8');

// Display the data as it is read
readableStream.on('data', (chunk) => {
  console.log('Chunk of data:', chunk);
});

// Handle end of stream
readableStream.on('end', () => {
  console.log('File reading finished');
});

// Handle error
readableStream.on('error', (err) => {
  console.error('Error reading file:', err);
});

Explanation:
- We use fs.createReadStream() to create a readable stream.
- The 'data' event handles chunks of data as they are read from the file.
- The 'end' event is triggered when all data has been read.
- The 'error' event handles any issues while reading the file.
Assignment 2: Writing Data to a File Using a Writable Stream

Objective: Learn how to use a writable stream to write data to a file.

Task: Implement a writable stream to write "Hello, Node.js!" to a file output.txt.

Solution:

const fs = require('fs');

// Create a writable stream to 'output.txt'
const writableStream = fs.createWriteStream('output.txt', 'utf8');

// Write data to the stream
writableStream.write('Hello, Node.js!\n');

// End the writable stream
writableStream.end();

// Handle 'finish' event
writableStream.on('finish', () => {
  console.log('Data has been written to output.txt');
});

// Handle error
writableStream.on('error', (err) => {
  console.error('Error writing to file:', err);
});

Explanation:
- fs.createWriteStream() creates a writable stream to the file output.txt.
- write() sends data to the writable stream.
- end() signals the end of the stream.
- 'finish' is triggered once the data has been written completely.
Assignment 3: Reading and Writing Using Streams (Pipe Method)

Objective: Learn how to use the pipe() method to copy data from one file to another.

Task: Implement a readable stream to read from input.txt and a writable stream to write to output.txt using the pipe() method.

Solution:

const fs = require('fs');

// Create a readable stream from 'input.txt'
const readableStream = fs.createReadStream('input.txt');

// Create a writable stream to 'output.txt'
const writableStream = fs.createWriteStream('output.txt');

// Pipe the readable stream to the writable stream
readableStream.pipe(writableStream);

// Handle 'finish' event
writableStream.on('finish', () => {
  console.log('File has been copied to output.txt');
});

Explanation:
- pipe() connects the readable stream to the writable stream.
- It automatically transfers data from input.txt to output.txt.
Assignment 4: Handling Large Files by Reading in Chunks

Objective: Learn how to handle large files by reading them in chunks using streams.

Task: Read a large file bigFile.txt in chunks and display each chunk in the console.

Solution:

const fs = require('fs');

// Create a readable stream
const readableStream = fs.createReadStream('bigFile.txt', 'utf8');

// Process each chunk of data
readableStream.on('data', (chunk) => {
  console.log('Processing chunk:', chunk);
});

// Handle end of stream
readableStream.on('end', () => {
  console.log('Finished processing the file');
});

// Handle error
readableStream.on('error', (err) => {
  console.error('Error reading the file:', err);
});

Explanation:
- The file is read in chunks, which allows you to process large files efficiently without loading the entire file into memory.
Assignment 5: Write Data to a File with Multiple Chunks

Objective: Learn how to write multiple chunks of data to a file using a writable stream.

Task: Write multiple lines of text to a file multipleLines.txt.

Solution:

const fs = require('fs');

// Create a writable stream
const writableStream = fs.createWriteStream('multipleLines.txt', 'utf8');

// Write multiple chunks of data
writableStream.write('First line of text\n');
writableStream.write('Second line of text\n');
writableStream.write('Third line of text\n');

// End the writable stream
writableStream.end();

// Handle 'finish' event
writableStream.on('finish', () => {
  console.log('Data has been written to multipleLines.txt');
});

Explanation:
- We use multiple write() calls to send different chunks of data to the writable stream.
Assignment 6: Handling Errors in Writable and Readable Streams

Objective: Learn how to handle errors in both readable and writable streams.

Task: Implement error handling while reading from input.txt and writing to output.txt.

Solution:

const fs = require('fs');

// Create a readable stream from 'input.txt'
const readableStream = fs.createReadStream('input.txt', 'utf8');

// Create a writable stream to 'output.txt'
const writableStream = fs.createWriteStream('output.txt', 'utf8');

// Pipe the readable stream to the writable stream
readableStream.pipe(writableStream);

// Handle error in readable stream
readableStream.on('error', (err) => {
  console.error('Error reading file:', err);
});

// Handle error in writable stream
writableStream.on('error', (err) => {
  console.error('Error writing to file:', err);
});

Explanation:
- We have added error handling for both the readable and writable streams, so if there is any issue with reading or writing, the error will be logged.
Assignment 7: Copying a File Using Streams

Objective: Learn how to copy a file using streams.

Task: Use streams to copy a file source.txt to destination.txt.

Solution:

const fs = require('fs');

// Create a readable stream from 'source.txt'
const readableStream = fs.createReadStream('source.txt');

// Create a writable stream to 'destination.txt'
const writableStream = fs.createWriteStream('destination.txt');

// Pipe the readable stream to the writable stream
readableStream.pipe(writableStream);

// Handle 'finish' event
writableStream.on('finish', () => {
  console.log('File has been copied successfully');
});

Explanation:
- The pipe() method automatically copies the content from source.txt to destination.txt.
Assignment 8: Read and Process a JSON File Using a Readable Stream

Objective: Learn how to read and parse a JSON file using a readable stream.

Task: Implement a readable stream to read a JSON file data.json and log its parsed content.

Solution:

const fs = require('fs');

// Create a readable stream to 'data.json'
const readableStream = fs.createReadStream('data.json', 'utf8');

// Accumulate the data as it is read
let rawData = '';
readableStream.on('data', (chunk) => {
  rawData += chunk;
});

// Handle end of stream
readableStream.on('end', () => {
  try {
    const parsedData = JSON.parse(rawData);
    console.log('Parsed JSON data:', parsedData);
  } catch (err) {
    console.error('Error parsing JSON:', err);
  }
});

// Handle error
readableStream.on('error', (err) => {
  console.error('Error reading the JSON file:', err);
});

Explanation:
- The file is read as a string using the utf8 encoding.
- Once the data is fully read, it is parsed using JSON.parse().
Assignment 9: Implementing a Stream for Logging Data

Objective: Learn how to log data using a writable stream.

Task: Implement a writable stream to log messages to a log.txt file.

Solution:

const fs = require('fs');

// Create a writable stream to 'log.txt'
const writableStream = fs.createWriteStream('log.txt', 'utf8');

// Write log messages
writableStream.write('Log entry 1: Server started\n');
writableStream.write('Log entry 2: User logged in\n');
writableStream.write('Log entry 3: Error occurred\n');

// End the writable stream
writableStream.end();

// Handle 'finish' event
writableStream.on('finish', () => {
  console.log('Logs have been written to log.txt');
});

Explanation:
- This assignment simulates logging by writing multiple log entries to log.txt.
Assignment 10: Implementing a Transform Stream

Objective: Learn how to implement a transform stream to modify data as it is read.

Task: Implement a transform stream to read text from input.txt, convert it to uppercase, and write it to output.txt.

Solution:

const fs = require('fs');
const { Transform } = require('stream');

// Create a transform stream to convert text to uppercase
const transformStream = new Transform({
  transform(chunk, encoding, callback) {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
});

// Create a readable stream from 'input.txt'
const readableStream = fs.createReadStream('input.txt', 'utf8');

// Create a writable stream to 'output.txt'
const writableStream = fs.createWriteStream('output.txt', 'utf8');

// Pipe the readable stream through the transform stream to the writable stream
readableStream.pipe(transformStream).pipe(writableStream);

// Handle 'finish' event
writableStream.on('finish', () => {
  console.log('Text has been converted to uppercase and written to output.txt');
});

Explanation:
- The transform stream modifies the data as it passes through, converting it to uppercase.
- The data flows from the readable stream, through the transform stream, and into the writable stream.

These assignments cover a broad range of topics related to streams in Node.js and will help students understand different types of streams, error handling, and more complex concepts like transformation.
MCQs on Working with Streams in Node.js

Here are multiple-choice questions on the concept of working with streams in Node.js. Each question includes the correct answer and an explanation to help you understand the concept better.
1. What is a stream in Node.js?
- A) A type of database
- B) A method for handling large amounts of data in chunks
- C) A function for making HTTP requests
- D) A module for handling events

Answer: B
Explanation: A stream in Node.js is an abstract interface for working with streaming data. It allows handling large amounts of data piece by piece (in chunks) rather than loading the entire dataset into memory at once.
2. Which of the following is not a type of stream in Node.js?
- A) Readable Stream
- B) Writable Stream
- C) Transform Stream
- D) Connection Stream

Answer: D
Explanation: Node.js supports four main types of streams: Readable, Writable, Duplex, and Transform. There is no "Connection Stream."
3. How do you create a readable stream in Node.js?
- A) fs.createWriteStream()
- B) fs.createReadStream()
- C) stream.createRead()
- D) stream.createWrite()

Answer: B
Explanation: The method fs.createReadStream() is used to create a readable stream in Node.js.

4. What does the 'data' event in a readable stream indicate?
- A) The stream has finished processing.
- B) A chunk of data is available to be read.
- C) An error has occurred in the stream.
- D) The stream is about to start reading.

Answer: B
Explanation: The 'data' event is emitted whenever a new chunk of data is available to be read from a readable stream.

5. Which method is used to write data to a writable stream in Node.js?
- A) writeData()
- B) write()
- C) data()
- D) push()

Answer: B
Explanation: The write() method is used to send data to a writable stream in Node.js.
6. What does calling end() on a writable stream signify?
- A) Data has been written successfully.
- B) No more data will be written to the stream.
- C) The stream is starting.
- D) The stream is still processing data.

Answer: B
Explanation: Calling end() signifies that no more data will be written to the writable stream; once the buffered data is flushed, the stream emits 'finish'.

7. What is the pipe() method used for in Node.js streams?
- A) To read data from a file
- B) To connect a readable stream to a writable stream
- C) To convert data into a stream
- D) To create a new readable stream

Answer: B
Explanation: The pipe() method is used to connect a readable stream to a writable stream, enabling data to flow from one stream to another.

8. What does the 'finish' event in a writable stream indicate?
- A) Data has been read successfully.
- B) The writable stream has finished writing all data.
- C) The stream is about to begin writing.
- D) There is an error in the writable stream.

Answer: B
Explanation: The 'finish' event indicates that the writable stream has finished writing all the data and that no more data will be written.

9. Which of the following is a common use case of streams in Node.js?
- A) Handling HTTP requests
- B) Reading or writing large files efficiently
- C) Managing database connections
- D) Sending emails

Answer: B
Explanation: Streams are commonly used in Node.js for reading or writing large files, as they allow data to be processed in chunks, which is more memory-efficient.

10. How do you handle errors in streams in Node.js?
- A) By using errorHandler()
- B) By using an on('error') event listener
- C) By using a try-catch block
- D) By using the catch() method

Answer: B
Explanation: In Node.js, you handle errors in streams by attaching an event listener for the 'error' event, like stream.on('error', callback).
11. What is a transform stream in Node.js?
- A) A stream that reads data
- B) A stream that writes data
- C) A stream that modifies data while it is being read or written
- D) A stream that only handles error events

Answer: C
Explanation: A transform stream modifies data as it is being read or written. It can be used to apply transformations to the data during the stream process.

12. Which of the following can be piped into a writable stream?
- A) A writable stream
- B) A readable stream
- C) A transform stream
- D) Both B and C

Answer: D
Explanation: Both readable and transform streams can be piped into writable streams. Transform streams can also modify the data as it is being piped.

13. Which of these events are emitted by a readable stream in Node.js?
- A) finish
- B) end
- C) data
- D) Both B and C

Answer: D
Explanation: The 'end' and 'data' events are emitted by readable streams: 'data' when new chunks of data are available, and 'end' when no more data is available.

14. What happens when you call the end() method on a writable stream?
- A) It starts writing new data.
- B) It indicates that no more data will be written.
- C) It stops the data flow completely.
- D) It closes the readable stream.

Answer: B
Explanation: Calling end() on a writable stream indicates that no more data will be written to the stream.

15. Which of the following is a method to create a readable stream from a file?
- A) fs.createWriteStream()
- B) fs.createStream()
- C) fs.createReadStream()
- D) stream.read()

Answer: C
Explanation: The method fs.createReadStream() is used to create a readable stream from a file in Node.js.
16. Which encoding is most commonly specified when creating a readable stream for text data?
- A) utf-8
- B) ascii
- C) binary
- D) base64

Answer: A
Explanation: Strictly speaking, readable streams have no default encoding: without one, chunks are emitted as Buffer objects. For text data, utf-8 is the encoding conventionally passed when creating the stream (or via setEncoding('utf8')).
17. What does the push() method do in a transform stream?
- A) It sends data to the writable stream.
- B) It allows you to modify data in a readable stream.
- C) It emits data into the stream.
- D) It triggers the 'end' event.

Answer: C
Explanation: In a transform stream, the push() method sends data into the stream's readable side, making the (possibly modified) data available to consumers.

18. Which of the following is used to consume data from a readable stream as it arrives?
- A) read()
- B) on('data')
- C) pipe()
- D) write()

Answer: B
Explanation: Attaching an on('data') listener puts the stream into flowing mode and delivers chunks of data to the callback as they become available.

19. How can you stop a readable stream from emitting more data?
- A) Call pause() on the readable stream
- B) Call end() on the readable stream
- C) Remove the 'data' event listener
- D) Call write() on the readable stream

Answer: A
Explanation: Calling pause() on a readable stream will stop it from emitting more 'data' events. This allows you to control the flow of data.

20. What happens if an error occurs in a stream?
- A) The stream stops working immediately.
- B) A custom error handler is triggered.
- C) The error is automatically logged to the console.
- D) The 'error' event is emitted.

Answer: D
Explanation: If an error occurs in a stream, the 'error' event is emitted, and the error can be handled by attaching an on('error') event listener.
21. Which stream method is used to pipe data from a readable stream to a writable stream?
- A) stream.pipe()
- B) stream.write()
- C) stream.end()
- D) stream.read()

Answer: A
Explanation: The pipe() method is used to connect a readable stream to a writable stream, allowing data to flow from one to the other.

22. What is the primary advantage of using streams in Node.js?
- A) Easier to handle smaller files
- B) They consume more memory
- C) They allow handling large files without consuming large amounts of memory
- D) Streams are used for network communication only

Answer: C
Explanation: The main advantage of streams is that they allow handling large files or datasets without loading the entire data into memory, thus saving memory resources.
23. How do you know when a writable stream has finished writing data?
- A) Use the on('end') event
- B) Use the on('finish') event
- C) Use the on('complete') event
- D) Use the on('write') event

Answer: B
Explanation: The 'finish' event is emitted when a writable stream finishes writing all its data.
24. What does the unpipe() method do in a stream?
- A) Disconnects the readable stream from a writable stream
- B) Stops the readable stream from emitting data
- C) Stops the writable stream from receiving data
- D) Disconnects both readable and writable streams

Answer: A
Explanation: The unpipe() method removes a pipe connection between a readable stream and a writable stream.
25. How can you manually stop data flow in a readable stream?
- A) Call end() on the readable stream
- B) Call stop() on the writable stream
- C) Call pause() on the readable stream
- D) Remove the 'data' event listener

Answer: C
Explanation: Calling pause() on a readable stream will stop the data flow, allowing you to resume it later.
26. What is the default behavior when data is not consumed quickly enough in a readable stream?
- A) The stream will pause automatically.
- B) The stream will stop emitting data permanently.
- C) The stream will enter a "flowing" mode.
- D) The stream will discard the unread data.

Answer: A
Explanation: When the consumer cannot keep up (for example, when piping into a slow writable stream), backpressure causes the readable stream to pause until the buffered data is consumed, then resume.

27. What does the flowing mode of a readable stream mean?
- A) Data is emitted and consumed automatically as it becomes available.
- B) Data is being consumed slowly.
- C) The stream is in an idle state.
- D) The data is being processed asynchronously.

Answer: A
Explanation: In flowing mode, the readable stream emits data automatically as soon as it is available, passing each chunk to the registered listeners.

28. Which module is specifically designed for working with streams in Node.js?
- A) The buffer module
- B) The stream module
- C) The fs module
- D) The event module

Answer: B
Explanation: The stream module provides the base stream classes, which are ideal for handling large amounts of data efficiently.
29. What does the destroy()
method do in a stream?
- A)
Stops the stream and emits the close event.
- B)
Forces an immediate error to be thrown.
- C)
Deletes the stream object.
- D)
Stops the stream without emitting any events.
Answer:
A
Explanation: The destroy() method stops the stream and emits the close
event, allowing for proper cleanup of resources.
30. How can you check whether a readable stream is in paused or flowing mode?
- A) By using the stream.mode() method
- B) By checking the stream.flowing property
- C) By using the stream.isPaused() method
- D) By using the stream.flow() method
Answer: C
Explanation: readable.isPaused() reports whether the stream has been explicitly paused; the related readableFlowing property (there is no public stream.flowing) exposes the current flowing state.
31. Which encoding is typically set on a readable stream to handle textual data in Node.js?
- A) binary
- B) utf-8
- C) base64
- D) ascii
Answer: B
Explanation: By default, readable streams emit raw Buffer objects; calling setEncoding('utf-8') makes them emit strings instead, and utf-8 is the standard choice for textual data.
32. What happens if you call the push() method in a transform stream but the stream is not yet readable?
- A) The data event is emitted.
- B) The data is buffered and emitted once the stream becomes readable.
- C) The data is ignored.
- D) An error is thrown.
Answer: B
Explanation: If the stream is not yet being read, the data passed to push() is buffered internally and emitted once a consumer starts reading.
33. How do you know when a stream has ended in Node.js?
- A) By checking the finished property
- B) By listening for the end event
- C) By calling the done() method
- D) By checking the status property
Answer: B
Explanation: A readable stream emits the end event when there is no more data to read; writable streams signal completion with the finish event instead.
34. What does the pipe() method do when used on a transform stream?
- A) It reads data from the source and writes it to a destination stream.
- B) It allows data to pass through and be transformed by the transform stream.
- C) It pauses the data flow until data is consumed.
- D) It closes the destination stream once the data is written.
Answer: B
Explanation: When used with a transform stream, pipe() lets data pass through the transform stream, which can modify it on the way to the destination.
35. What type of stream can modify data as it is being written?
- A) Readable stream
- B) Writable stream
- C) Transform stream
- D) Duplex stream
Answer: C
Explanation: A transform stream modifies data as it is being read or written. It can be used to change data (e.g., converting text to uppercase) during the stream process.
36. Which of the following can be used to handle data chunks in a readable stream?
- A) on('data', callback)
- B) on('chunk', callback)
- C) on('read', callback)
- D) on('emit', callback)
Answer: A
Explanation: The on('data', callback) event handler is used to handle data chunks as they are read from a readable stream.
37. What does the resume() method do for a readable stream?
- A) Starts emitting data from the stream.
- B) Pauses the data flow in the stream.
- C) Ends the stream.
- D) Resumes writing data to the stream.
Answer: A
Explanation: The resume() method restarts the flow of data from a stream that has been paused.
38. What happens if a writable stream is full and cannot accept more data?
- A) The stream will automatically pause.
- B) It throws an error.
- C) The stream will buffer the data until space is available.
- D) The stream will stop emitting data.
Answer: C
Explanation: When a writable stream's internal buffer is full, write() returns false and further data is buffered in memory; the drain event signals when it is safe to write again.
39. What is the role of the _flush() method in a transform stream?
- A) It lets the stream emit any remaining buffered data just before it ends.
- B) It clears the stream's internal buffer without emitting it.
- C) It pauses the stream until more data is ready.
- D) It forces the stream to re-process its data.
Answer: A
Explanation: Writable streams in Node.js have no flush() method; the _flush() hook on transform streams is called just before the stream ends, giving it a final chance to push any internally buffered data.
40. Which stream method is typically used in a duplex stream?
- A) write()
- B) pipe()
- C) read()
- D) Both A and C
Answer: D
Explanation: A duplex stream supports both reading and writing, so methods like read() and write() are commonly used on it.
41. What is the readable property of a stream used for?
- A) To check if the stream can emit data.
- B) To determine if the stream is paused.
- C) To check if the stream has ended.
- D) To check the data type of the stream.
Answer: A
Explanation: The readable property indicates whether the stream is currently capable of emitting data.
42. What does the stream.unpipe() method do?
- A) Removes the readable stream from the writable stream.
- B) Disconnects the transform stream from the readable stream.
- C) Stops data flow from the readable stream.
- D) Closes the writable stream.
Answer: A
Explanation: The unpipe() method disconnects a readable stream from a writable stream, stopping the data flow between the two.
43. What event does a writable stream emit when it has finished writing all data?
- A) complete
- B) done
- C) end
- D) finish
Answer: D
Explanation: The finish event is emitted when a writable stream has finished writing all the data it was given.
44. Which Node.js module provides the createReadStream() and createWriteStream() methods?
- A) path
- B) fs
- C) http
- D) stream
Answer: B
Explanation: The fs (File System) module provides the createReadStream() and createWriteStream() methods, which are used to create readable and writable streams for files.
45. What does the stream.pipe() method return?
- A) A reference to the source stream
- B) A reference to the destination stream
- C) The transform stream
- D) Nothing (undefined)
Answer: B
Explanation: The pipe() method returns the destination stream, which allows chaining further operations such as additional pipe() calls.
46. How do you create a writable stream that writes to a file in Node.js?
- A) fs.createReadStream()
- B) fs.createWriteStream()
- C) stream.createWriteStream()
- D) fs.writeStream()
Answer: B
Explanation: The fs.createWriteStream() method is used to create a writable stream that writes data to a file.
47. What happens if you try to read from a stream that has already ended?
- A) An error is thrown.
- B) No data is emitted.
- C) The stream automatically starts emitting new data.
- D) The stream enters the paused state.
Answer: B
Explanation: Once a stream has ended, no further data is emitted, as the stream has completed its operation.
48. How can you handle errors when working with streams in Node.js?
- A) Using the try-catch block
- B) Using the on('error', callback) event
- C) Using the finally block
- D) Using the catch() method
Answer: B
Explanation: You handle stream errors by listening for the on('error', callback) event, which is triggered when an error occurs.
49. What type of stream can both read and write data simultaneously?
- A) Readable stream
- B) Writable stream
- C) Transform stream
- D) Duplex stream
Answer: D
Explanation: A duplex stream can both read and write data simultaneously, making it ideal for bidirectional communication.
50. Which event is used to check the completion of a writable stream in Node.js?
- A) close
- B) finish
- C) end
- D) done
Answer: B
Explanation: The finish event indicates that a writable stream has finished writing all its data.
These multiple-choice
questions will help reinforce students' understanding of streams in Node.js.
