Oh, the dreaded experience of trying to read a monstrous, hundreds-of-megabytes-large file in a web application. It’s a path many have walked down, only to witness the horror of page crashes, the haunting UI freezes, and the ever-so-helpful browser Violation warning messages. Have you ever seen a "[Violation] 'something' handler took 43129ms" message? It's the digital equivalent of "I'm too old for this."
The Horrors of Large Files
Let's paint the picture here: you have a large file, and you need to process it right in the user's browser. What could possibly go wrong? For starters, JavaScript was not designed as a heavyweight lifting crane; think of it more like a versatile Swiss Army knife. When you load a massive file using traditional methods, JavaScript tries to gulp it down in one go. The result? The browser chokes, gasps for memory, and might even crash. Cue the panic as users furiously refresh their browsers, to no avail.
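For contrast, here is a sketch of the kind of naive, read-it-all-at-once approach that gets you into trouble (the input element's id is purely illustrative):

// Naive approach: read the whole file into memory on the main thread, in one go
const $input = document.querySelector<HTMLInputElement>("#big-file-input");

$input?.addEventListener("change", async (e) => {
  const file = (e.target as HTMLInputElement).files?.[0];
  if (!file) return;
  // file.text() buffers the entire file as one giant string.
  // For a multi-hundred-megabyte file, this is where the tab freezes or crashes.
  const text = await file.text();
  console.log(`Loaded ${text.length} characters... if we got this far.`);
});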
The Saviors: TypedArrays and WebWorkers
Before we dive into our superhero tech, let’s talk about what these tools are:
TypedArrays: A Brief Explanation
TypedArrays are not your everyday heroes; they're JavaScript's less-talked-about special forces unit. These arrays let you handle binary data directly, with the precision and finesse of a surgeon. JavaScript traditionally stores strings as UTF-16, which for binary work is about as efficient as using a butter knife to cut a steel beam. TypedArrays, on the other hand, let you manipulate raw binary data efficiently, slicing through it with the precision of a laser.
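To make that concrete, here is a tiny, self-contained sketch of what working with raw bytes looks like (the byte values are made up for illustration):

// Raw bytes: one byte per element, no string overhead
const bytes = new Uint8Array([72, 101, 108, 108, 111]); // "Hello" in UTF-8

// Views slice the underlying ArrayBuffer with byte precision, without copying
const firstTwo = bytes.subarray(0, 2);
console.log(bytes.buffer.byteLength); // 5 — exactly five bytes, no more

// Decode bytes into a string only when you actually need text
const text = new TextDecoder("utf-8").decode(bytes);
console.log(text);     // "Hello"
console.log(firstTwo); // Uint8Array [72, 101]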
WebWorkers: A Brief Explanation
Enter WebWorkers: the covert operatives who take heavy tasks off the main thread, working behind the scenes, unbeknownst to the user. WebWorkers allow you to run JavaScript in the background, in a parallel universe where they don’t block the user interface. Imagine sending your file processing task to a quiet, dedicated worker who sits in a soundproof room, so the rest of the application can continue humming along smoothly without any hiccups.
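If you haven't met one before, here is a minimal sketch of the idea (the file name and the busy-loop are purely illustrative): the heavy work runs on the worker's thread, so clicks and scrolling keep working on the page.

// main.ts — spawn the worker and hand it some heavy work
const counter = new Worker("./busy-worker.js");
counter.onmessage = (e) => console.log("Worker finished:", e.data);
counter.postMessage(1_000_000_000);

// busy-worker.js — runs on its own thread and never blocks the UI
onmessage = (e) => {
  let sum = 0;
  for (let i = 0; i < e.data; i++) sum += i; // CPU-heavy loop
  postMessage(sum);
};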
Let's roll up our sleeves and dive into the solution
WebWorker file
postMessage("Hello from web worker");
onmessage = event => {
const textDecoder = new TextDecoder("utf-8");
const data = new Uint8Array(event.data);
const text = textDecoder.decode(data);
postMessage(text);
};
Handling file change
Here we handle the file change event, send the file to the web worker in chunks, and receive its messages back:
const worker = new Worker("./web-worker.js");

// Container element where decoded chunks are rendered (the id is illustrative)
const $output = document.querySelector("#output");

worker.onmessage = e => {
  const $p = document.createElement("p");
  $p.textContent = e.data;
  if ($output) $output.append($p);
};

worker.onerror = function (e) {
  console.error("Error from worker: ", e.message);
};

const $fileInput = document.querySelector<HTMLInputElement>("#file-input");

const handleFileInputChange = async (e: Event) => {
  try {
    const file = (e.target as HTMLInputElement).files?.[0];
    const chunkSize = 1024 * 1024 * 15; // 15 MB chunk size
    if (!file) return;
    await processFileInChunks(file, chunkSize);
  } catch (error) {
    console.error("Error:", error);
  }
};

$fileInput?.addEventListener("change", handleFileInputChange);
And here is where the magic happens ✨
const readFileAsync = (reader: FileReader): Promise<ArrayBuffer> => {
  return new Promise((resolve, reject) => {
    reader.onload = event => {
      const target = event.target;
      if (!target) return reject(new Error("FileReader produced no target"));
      const buffer = target.result as ArrayBuffer;
      const slice = new Uint8Array(buffer);
      // Hand the bytes to the worker; listing the buffer as a transferable
      // moves it to the worker thread instead of copying it
      worker.postMessage(slice, [buffer]);
      resolve(buffer); // Resolve once the chunk has been handed off
    };
    reader.onerror = error => {
      reject(error); // Reject promise on error
    };
  });
};
// Shape of each chunk handed to the reader
interface Chunk { start: number; end: number; blob: Blob }

async function handleChunk(chunk: Chunk) {
  const reader = new FileReader();
  // chunk.blob already spans [start, end), so this slice reads it in full
  reader.readAsArrayBuffer(chunk.blob.slice(0, chunk.end - chunk.start));
  await readFileAsync(reader);
}
async function processFileInChunks(file: File, chunkSize: number) {
  // Walk the file in fixed-size steps; each iteration slices and processes one chunk
  for (let start = 0; start < file.size; start += chunkSize) {
    const end = Math.min(start + chunkSize, file.size);
    const chunk: Chunk = { start, end, blob: file.slice(start, end) };
    await handleChunk(chunk);
  }
}
In this thrilling TypeScript adventure, we first set up a worker. When a file is selected, instead of reading it all at once in the main thread, we slice it into chunks and use FileReader to read each chunk as an ArrayBuffer. Each buffer is then transferred to the WebWorker, away from the fragile main thread, allowing your application to remain responsive and crash-free.
Conclusion: A Brighter Future for File Processing
With the combined powers of TypedArray and WebWorkers, governed by the wise laws of TypeScript, processing large files in the browser no longer needs to be a nightmare. This strategy not only prevents the UI from freezing but also optimizes performance, making your application a robust, modern marvel that can handle large datasets gracefully. So, why not give your users the smooth, responsive web experience they deserve? After all, nobody likes a slow and cranky application, right?
Now, brave reader, go forth and conquer those gargantuan files with the power of JavaScript, TypeScript, and a little help from your friends, TypedArray and WebWorkers!
Of course, any suggestions or thoughts about this would be much appreciated.
The complete code
The complete code can be found in this repo.
References and Further Reading
Here are some resources that can deepen your understanding of TypedArrays, WebWorkers, and handling large files in JavaScript:
MDN Web Docs on TypedArray
Dive deep into the world of TypedArrays and see all the possibilities for handling binary data in JavaScript.

MDN Web Docs on Web Workers
Learn more about the power and potential of Web Workers in offloading tasks and keeping your UI smooth.

Using Web Workers with TypeScript
A comprehensive guide to implementing Web Workers in a TypeScript project.

HTML5 Rocks - The Basics of Web Workers
Get a historical perspective and practical examples of using Web Workers to improve performance in web applications.

Using Promises with FileReader
A good article on how to read files asynchronously.

Processing huge files using FileReader.readAsArrayBuffer() in a web browser
A good article about using FileReader with ArrayBuffers.
These resources will not only help you understand the technical foundations but also give you practical insights into applying these concepts in real-world applications.