Abdullah

Understanding HTTP Responses and Streams

Table of Contents

1. Understanding Response Objects
2. Different Ways to Consume Responses
3. Streams

1. Understanding Response Objects

A Response object represents the HTTP response to a request.

The most basic way to create a Response:

const response = new Response("Hello world");

// Key properties that tell us about the response:
console.log(response.status); // 200 (default)
console.log(response.statusText); // "" (defaults to an empty string)
console.log(response.ok); // true (status in 200-299 range)
console.log(response.headers); // Headers object

The Response interface gives us information about the HTTP response through its properties:

  • status: The HTTP status code (200, 404, 500, etc.)
  • ok: Boolean indicating if status is in the successful range (200-299)
  • headers: Access to the response headers
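
You can also set these yourself when constructing a Response. A minimal sketch:

const notFound = new Response("Not here", {
  status: 404,
  statusText: "Not Found",
  headers: { "Content-Type": "text/plain" },
});

console.log(notFound.ok); // false (404 is outside 200-299)
console.log(notFound.headers.get("content-type")); // "text/plain"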

An important concept is that Response bodies can only be consumed once:

const response = new Response('{"message": "hello"}');

// This works
const data = await response.json();
console.log(data); // { message: 'hello' }

// This fails
try {
  const data2 = await response.json();
} catch (error) {
  console.log("Error: Body already consumed!");
}

If you need to read the body multiple times, use the clone() method before the first consumption:

const response = new Response('{"message": "hello"}');
const clone = response.clone(); // Create copy before consuming

const data1 = await response.json();
const data2 = await clone.json(); // Works! Using the clone

Most of the time, you'll get Response objects from fetch rather than constructing them yourself:

// Fetch returns a Promise<Response>
const response = await fetch("https://api.example.com/data");

// Check response status
if (!response.ok) {
  throw new Error(`HTTP error! status: ${response.status}`);
}

// Check and handle content type
const contentType = response.headers.get("content-type");
if (contentType && contentType.includes("application/json")) {
  const data = await response.json();
  // Process your JSON data
}

If you already know the endpoint returns JSON (most of the time you do), there's no need to check the content type.
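
A small helper makes that pattern reusable (fetchJSON is a hypothetical name, just a sketch):

async function fetchJSON(url) {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`HTTP error! status: ${response.status}`);
  }
  // Trusting the endpoint to return JSON, so no content-type check
  return response.json();
}

const data = await fetchJSON("https://api.example.com/data");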

2. Different Ways to Consume Responses

We've seen .json() and .clone() in action already.

Let's explore all consumption methods:

// .text() for raw text data
const textResponse = new Response("Hello, world");
const text = await textResponse.text();
console.log(text); // "Hello, world"

// .blob() for binary data like images
const imageResponse = await fetch("image.png");
const blob = await imageResponse.blob();
const imageUrl = URL.createObjectURL(blob);

// .arrayBuffer() for raw binary data
// (fetch again first — .blob() above already consumed imageResponse's body)
const bufferResponse = await fetch("image.png");
const buffer = await bufferResponse.arrayBuffer();
const uint8Array = new Uint8Array(buffer);

// .formData() for form data
const formResponse = new Response("first_name=John&last_name=Doe", {
  headers: { "Content-Type": "application/x-www-form-urlencoded" },
});
const formData = await formResponse.formData();
console.log(formData.get("first_name")); // 'John'

Key points about these methods:

  • Each method returns a Promise that resolves to the appropriate data type
  • Once you use any of these methods, the response body is consumed!
  • Choose the method based on the type of data you're expecting:
    • .json() for JSON data
    • .text() for plain text, HTML, XML, etc.
    • .blob() for files, images
    • .arrayBuffer() when you need to process binary data directly
    • .formData() for form submissions

What is Binary Data?

Data in its raw form of 1s and 0s. Everything in a computer is binary data, just interpreted differently.

Example: the letter 'A' in binary is 01000001 (65 in decimal).

What's ASCII (American Standard Code for Information Interchange)?

ASCII is a character encoding standard where each letter/symbol maps to a number (0-127).

Modern systems mostly use UTF-8 (which is ASCII-compatible for 0-127).

const a = 97; // lowercase 'a' in ASCII
const A = 65; // uppercase 'A' in ASCII

const text = "hello";
// Converts string to Uint8Array using UTF-8 encoding
const asArray = new TextEncoder().encode(text);
console.log(asArray); // Uint8Array: [104, 101, 108, 108, 111]
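
Going the other direction, TextDecoder turns bytes back into a string:

// Converts a Uint8Array back to a string using UTF-8 decoding
const bytes = new Uint8Array([104, 101, 108, 108, 111]);
console.log(new TextDecoder().decode(bytes)); // "hello"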

What is ArrayBuffer?

An ArrayBuffer is a raw binary data buffer: just bytes in memory.

You can't manipulate ArrayBuffer directly. It's just raw memory. You need to use a view (a way to read and write to the ArrayBuffer) to access it.

const buffer = new ArrayBuffer(4); // 4 bytes of memory

const view = new Uint8Array(buffer);
view[0] = 104; // 'h'
view[1] = 105; // 'i'
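
Reading those bytes back out shows the view and the buffer share the same memory:

// Decode only the two bytes we wrote
console.log(new TextDecoder().decode(view.subarray(0, 2))); // "hi"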

What is Uint8Array?

A Uint8Array is a typed array that handles 8-bit unsigned integers.

Each element is 8 bits (1 byte), allowing values from 0 to 255.

const array = new Uint8Array([104, 101, 108, 108, 111]); // "hello" in ASCII
console.log(array[0]); // 104 (ASCII code for 'h')

// Common ways to create
const empty = new Uint8Array(5); // Creates array of 5 zeros
const fromArray = new Uint8Array([1, 2, 3]); // From regular array
const fromBuffer = new Uint8Array(someArrayBuffer); // From ArrayBuffer

// Values outside 0-255 don't fit and get converted
const array2 = new Uint8Array([256, -1, 1.5]);
console.log(array2); // [0, 255, 1] (wraps modulo 256; floats are truncated)

They're often used for:

  • Processing network requests
  • Reading/writing files
  • Working with streams
  • Handling images or audio data

They fit these use cases well because they're fixed-size arrays of 8-bit unsigned integers. Why 8 bits works well:

  • Matches how data is commonly transmitted over networks (byte by byte)
  • Most basic unit of data in computers is a byte (8 bits)
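
For example, summing the bytes of a download works directly on the Uint8Array (the URL here is hypothetical):

// Total up every byte of a response body
const response = await fetch("https://example.com/file.bin"); // hypothetical URL
const bytes = new Uint8Array(await response.arrayBuffer());

let sum = 0;
for (const byte of bytes) {
  sum += byte; // each element is exactly one byte, 0-255
}
console.log(`${bytes.length} bytes, sum: ${sum}`);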

3. Streams

We've seen that Response.body is a ReadableStream.

// Basic stream reading
const response = await fetch("large-file.mp4");
const reader = response.body.getReader();

while (true) {
  const { done, value } = await reader.read();
  if (done) break;

  // value is a Uint8Array chunk
  processChunk(value);
}

A ReadableStream represents a source of data that you can read from piece by piece.

  • Read operations return an object with {done, value}
  • value is typically a Uint8Array chunk
  • done becomes true when the stream is finished

Creating Your Own ReadableStream

You can create your own ReadableStream:

const stream = new ReadableStream({
  start(controller) {
    controller.enqueue("First chunk");
    controller.enqueue("Second chunk");
    controller.close(); // Close the stream
  },
});
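
Reading it back works like any other stream; a quick sketch:

const reader = stream.getReader();
console.log(await reader.read()); // { value: "First chunk", done: false }
console.log(await reader.read()); // { value: "Second chunk", done: false }
console.log(await reader.read()); // { value: undefined, done: true }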

Why Streams?

If streams didn't exist, you'd have to load the entire file into memory.

const response = await fetch("huge-file.mp4");
const blob = await response.blob(); // Memory: 💥

// With streams - process chunk by chunk
const response = await fetch("huge-file.mp4");
for await (const chunk of response.body) {
  // Memory: ✅ Only one chunk at a time
  uploadChunk(chunk);
}

Combining Chunks

// Collecting chunks into a Blob
const response = await fetch("some-file.mp4");
const chunks = [];

// Method 1: Using getReader
const reader = response.body.getReader();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  chunks.push(value);
}

// This is the final blob
// Can be used as is or passed to a file input
const blob = new Blob(chunks);

// Method 2: Using for await...of
// More readable but same concept — note it needs a fresh response,
// since Method 1 already consumed this body
const response2 = await fetch("some-file.mp4");
const chunks2 = [];
for await (const chunk of response2.body) {
  chunks2.push(chunk);
}
const blob2 = new Blob(chunks2);

More info:

  • for await...of: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for-await...of
  • getReader: https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream/getReader

Transform Streams

Transform streams are a type of stream that modifies data as it flows through.

const transformStream = new TransformStream({
  transform(chunk, controller) {
    // chunk arrives here as text because we decode the bytes first (below)
    controller.enqueue(chunk.toUpperCase());
  },
});

// Decode bytes to text, then pipe through the transform
const response = await fetch("data.txt");
const transformedStream = response.body
  .pipeThrough(new TextDecoderStream()) // Uint8Array chunks -> strings
  .pipeThrough(transformStream);

// Now read transformed data
for await (const chunk of transformedStream) {
  console.log(chunk); // UPPERCASE chunks
}

What's Handled for You

  • Backpressure management
  • Internal queuing
  • Stream locking
  • Memory management
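
Stream locking, for instance, is observable directly:

// Only one reader can hold a stream at a time
const response = await fetch("data.txt");
const reader = response.body.getReader();
console.log(response.body.locked); // true — locked to this reader

reader.releaseLock(); // hand the lock back without cancelling
console.log(response.body.locked); // false — a new reader could attach now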

Best Practices

1. Memory Efficiency

// Bad: Loads entire file
const response = await fetch("huge-video.mp4");
const blob = await response.blob(); // Memory: 💥

// Good: Process in chunks
const response2 = await fetch("huge-video.mp4");
for await (const chunk of response2.body) {
  processVideoChunk(chunk); // Memory: ✅
}

2. Cancellation is Important

If you don't cancel a reader on error, you can run into several issues:

// Problem scenario
const reader = response.body.getReader();

try {
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    // If error happens here
    processChunk(value);
    // Reader isn't cancelled
  }
} catch (error) {
  // ❌ No reader.cancel()
  throw error;
}

The main issues:

  • Resource Leaks: The underlying stream resources stay open
  • Memory Leaks: Any internal buffers remain allocated
  • Network Connections: May stay open if it's a network stream
  • Other readers can't access: Stream stays locked (no other reader can read from it)

A good pattern to handle this properly:

const reader = response.body.getReader();

try {
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    processChunk(value);
  }
} finally {
  // Cancel in finally to guarantee cleanup whether or not processChunk
  // threw; cancelling an already-finished reader is a harmless no-op
  reader.cancel();
}

3. Common Stream Pipeline

// Fetch -> Transform -> Process -> Save
// (parseJSON, filterData, compressData are TransformStreams;
//  saveToFileStream is a WritableStream)
const response = await fetch("data.json");
await response.body
  .pipeThrough(parseJSON)
  .pipeThrough(filterData)
  .pipeThrough(compressData)
  .pipeTo(saveToFileStream); // resolves when the whole pipeline finishes
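
Each pipeThrough stage above is a TransformStream. As a sketch, a hypothetical filterData stage might look like:

// Hypothetical filter: only pass chunks matching a predicate
const filterData = new TransformStream({
  transform(chunk, controller) {
    if (isWanted(chunk)) {       // isWanted is a hypothetical predicate
      controller.enqueue(chunk); // pass it downstream
    }                            // otherwise the chunk is simply dropped
  },
});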

Recap of Practical Uses

  • Large file handling (upload/download)
  • Real-time data (video/audio)
  • Progressive loading (load as you scroll)
  • Data transformations (compression, encryption)
  • Network efficiency (don't wait for everything)
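
One last sketch tying these together — reporting download progress from Content-Length and chunk sizes (the URL is hypothetical):

const response = await fetch("https://example.com/large-file.zip"); // hypothetical URL
const total = Number(response.headers.get("content-length")); // 0 if the header is absent

let received = 0;
for await (const chunk of response.body) {
  received += chunk.length;
  if (total) console.log(`Progress: ${((received / total) * 100).toFixed(1)}%`);
}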
