Caching Binary Data With jQuery Ajax and IndexedDB

After long, grueling months (years? or does it only feel like years?), your web application nears completion. It is tightly coded, well documented, works across all modern browsers, and is well received by your beta testers. It’s nearly time to go live, and a smile of pure relief plays upon your lips… and freezes into a rictus grin when your client turns to you, and asks, “so, hey, can we speed up the dynamic cat pic loading? Especially when I close the browser and come back to it later. I think that’s really key to the whole application.”

Long, long ago we discussed our jQuery plugin that allows you to cache the responses of ajax queries in Local Storage, so long as they're strings or something that can be coerced to a string (objects as JSON, numbers). We also previously discussed adding an ajax transport to allow us to send and receive binary blobs and array buffers via jQuery ajax.

But what if we need to cache binary blobs or arraybuffers? Say, we need those cat pics on the double – we could convert them to and from base64, but not only is that slow, but we’re certain to run up against the 5MB limit of local storage in short order. No, what we need is some way to cache binary data in some sort of client-side database…

(more…)

AJAX Upload XHR2 and FileReader, Part 3

So, this week’s stand up meeting is finally concluded. You weren’t really paying attention – blah blah something uploader, the details are in the task, blah blah HTML5. You sit down at your station, pull up the task and – hmm, support for modern browsers, including mobile… need to show previews of certain types of files before uploading… show progress… pause and resume? You seem to remember seeing something like that on one of your favourite developer blogs…

You may have been thinking about the HTML5-based uploader and file reader I shared way back when. However, as reader RLK points out, there wasn’t really a way to pause or resume uploads previously. The logic was there… if you were willing to unwrap DeferXHR from the plugin. Oops. Let’s fix that, and add some functionality while we’re at it.

This article continues work on the plugin we’ve discussed previously in Ajax Upload Part II, and Ajax Upload XHR2, Take Two.

Pausing & Resuming

First off, we'll need to add a couple of functions to the HUp prototype so that we can call pause and resume on the enhanced element (we'll discuss exactly how to use these functions at the end):

/**
* Pause any in-progress, chunked uploads/file reads. If pauseList is specified,
* its elements should be either the names of the files, or the index at which they appear in the files
* list returned from the FILE_LIST_LOADED event. A single string or number can be provided if only a single
* upload/read needs to be paused.
* @param {Array|number|string|boolean|undefined} pauseList
*/
Hup.prototype.pause = function(pauseList){
    pauseList = (!pauseList) ? false : Array.isArray(pauseList) ? pauseList : [pauseList];

    this.fprocessors.forEach(function(fprocess, idx){
        if (!pauseList)
        {
            fprocess.pause();
            return;
        }
        if (pauseList.indexOf(idx) !== -1 || pauseList.indexOf(fprocess.file.name) !== -1)
        {
            fprocess.pause();
        }
    });
};

/**
* Resume any in progress, paused, chunked uploads/file reads, following the same rules for pauseList as
* specified for pause.
* @see Hup.prototype.pause
* @param {Array|number|string|boolean|undefined} pauseList
*/
Hup.prototype.resume = function(pauseList){
    pauseList = (!pauseList) ? false : Array.isArray(pauseList) ? pauseList : [pauseList];

    this.fprocessors.forEach(function(fprocess, idx){
        if (!pauseList)
        {
            fprocess.resume();
            return;
        }
        if (pauseList.indexOf(idx) !== -1 || pauseList.indexOf(fprocess.file.name) !== -1)
        {
            fprocess.resume();
        }
    });
};

You’ll notice the pauseList parameter for both functions – this allows us to specify that only certain file upload or read processors should pause. Remember that file reading and uploading occur asynchronously, and if you’ve allowed the user to select or drag and drop multiple files, there can be multiple file reads/uploads occurring at once. The pauseList parameter therefore allows an interface you’ve built on top of this plugin, for example, to give the user the ability to pause one upload out of a list of uploads.

You can see we’re being fairly permissive about what pauseList looks like. It can be a boolean false, which will indicate we want to pause/resume all file processors (a boolean true is an invalid value). It can be an array, in which case it should be an array of strings or numbers (more on that in a moment). Or it can be a single string or number. Other values are invalid and will cause errors and/or untested strangeness.

If a number or string is specified (on its own, or as part of the array), it should either be the 0-indexed position of the file you want to pause/resume processing on in the files array that gets returned from the FILE_LIST_LOADED event:

this.input.trigger(Hup.state.FILE_LIST_LOADED, {state:Hup.state.FILE_LIST_LOADED, files:files});

Or it should be the full name of the file as a string, e.g. picture_of_my_cat.jpg. Also worth noting is that most of the events that have to do with a specific file will also include the file_name as part of the event data, for use in disambiguating which file the event has taken place on.
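To make that matching rule concrete, here's a small standalone helper – not part of the plugin, just an illustration – that applies the same index-or-name test the pause/resume functions use internally:

```javascript
// Standalone sketch of the pauseList matching rule: an entry matches a
// processor if it equals the processor's index in the files list, or the
// file's full name. A falsy pauseList means "all processors".
function matchesPauseList(pauseList, idx, fileName) {
    if (!pauseList) return true;
    var list = Array.isArray(pauseList) ? pauseList : [pauseList];
    return list.indexOf(idx) !== -1 || list.indexOf(fileName) !== -1;
}

// Given these files (in FILE_LIST_LOADED order)...
var files = ['picture_of_my_cat.jpg', 'report.pdf', 'notes.txt'];

// ...a pauseList of [1, 'notes.txt'] selects report.pdf (index 1)
// and notes.txt (by name), but not the cat picture.
var paused = files.filter(function(name, idx){
    return matchesPauseList([1, 'notes.txt'], idx, name);
});
```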

On the DeferXHR that processes our files for upload in chunks, we have the following:

/**
* Pause the upload - that is, after the current chunk is finished uploading, cease uploading chunks until
* resume is called. For obvious reasons, this only works with chunked uploads.
* If the state of the deferred object is not pending (that is, is either already resolved or rejected),
* return early - we won't attempt to pause an upload that's finished or failed.
*/
DeferXhr.prototype.pause = function(){
    if (this.defer.state() !== 'pending' || !this.options.chunked) return;
    this.paused = true;
    this.defer.notify({
        state:Hup.state.FILE_UPLOAD_PAUSE, file_name:this.file.name,
        current_range:{start:this.start, end:this.end, total:this.file.size}
    });
};

/**
* Resume the upload if paused (works for chunked uploads only).
*/
DeferXhr.prototype.resume = function(){
    if (this.options.chunked && this.paused)
    {
        this.paused = false;
        this.defer.notify({
            state:Hup.state.FILE_UPLOAD_RESUME, file_name:this.file.name,
            current_range:{start:this.start, end:this.end, total:this.file.size}
        });
        this.upload();
    }
};

You’ll notice that we check the state of the Deferred object first – no point in attempting to pause an upload that’s already resolved or rejected. We also check that the upload is chunked – the way ‘pausing’ an upload works is to stop uploading after the current chunk is finished. If we’re not uploading in chunks (because the chunked option has been set to false, or because the file is smaller than the specified chunk_size), then calling pause on the upload has no effect.
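To illustrate the chunk-boundary behaviour in isolation, here's a minimal, hypothetical sketch of a chunked sender – not the plugin's actual DeferXhr, and with a plain array standing in for the real XHR sends – showing how checking a paused flag before each chunk makes "pause" take effect only between chunks:

```javascript
// Hypothetical chunked sender: pause() sets a flag that the chunk loop
// checks before continuing, so pausing never interrupts a chunk in flight.
function ChunkedSender(size, chunkSize) {
    this.size = size;
    this.chunkSize = chunkSize;
    this.start = 0;
    this.paused = false;
    this.sent = [];     // stand-in for the byte ranges actually uploaded
}

ChunkedSender.prototype.upload = function(){
    if (this.start >= this.size || this.paused) return;
    var end = Math.min(this.start + this.chunkSize, this.size);
    this.sent.push([this.start, end]);  // the real plugin issues an XHR here
    this.start = end;
    this.upload();  // the real plugin waits for the XHR to complete instead
};

ChunkedSender.prototype.pause = function(){ this.paused = true; };
ChunkedSender.prototype.resume = function(){
    this.paused = false;
    this.upload();  // pick up at the next chunk boundary
};
```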

So, now it looks like we have all the pieces for pausing and resuming uploads in place – what about file reading?

Chunked File Reading

We just finished discussing how ‘pausing’ an upload works by finishing up the current chunk, and then waiting on resume to start uploading the next. Therefore, if we want to be able to pause a file read, we need to read it in chunks as well.

/**
* Read the entire file or a slice thereof, depending on the value of options.chunked and chunk_size.
*/
DeferReader.prototype.readFile = function(){
    if (this.options.chunked)
    {
        this.reader[this.read_method](this.file.slice(this.start, this.end));
        return;
    }
    this.reader[this.read_method](this.file);
};

This involves a number of changes to the structure of DeferReader, but one obvious starting point is that, if we’re chunking the file read, we follow the same general structure as we do for file uploads – use the slice method on the File to get a specified range of bytes within the file, and read that using the specified read method.

You may be wondering what the advantage of this is. After all, the file is just going to be read into memory whether we read it all at once or in chunks, right? Is it just to have a standard API across both file processors?

On the contrary, it offers us some interesting advantages.

For one, we can potentially avoid locking the browser while loading a large file – if we have to perform the file loading on the UI thread, we can pause between chunks to perform other updates as necessary.

For another, memory isn’t the only place we could load a large file – we could instead plan to re-assemble it into IndexedDB. This would also potentially be a good place to download files to, if we needed to interact with them in the browser before serving them up to the user (say, decrypting them)… more on that in a future article.

For now, if you do decide to chunk your file read, you’ll need to re-assemble it as it’s read in to get the complete file. Notice the changes in DeferReader.prototype.readComplete:

/**
* On read completion: if we're reading in chunks and we've reached the last chunk, report that the file
* read is complete. If there are remaining chunks, report progress and read the next chunk.
* Otherwise, if we're reading the entire file in one go, report that the file read is complete.
* @param event
*/
DeferReader.prototype.readComplete = function(event){
    if (event.target.readyState === FileReader.DONE && (!this.options.chunked || this.end === this.file.size))
    {
        this.defer.resolve({
            state:Hup.state.FILE_READ_FINISHED, file_name:this.file.name, file_size:this.file.size,
            file_type:this.file.type, read_method:this.read_method, read_result:event.target.result
        });
        return;
    }

    this.defer.notify({
        state:Hup.state.FILE_READ_PROGRESS, file_name:this.file.name, progress:this.progress,
        read_result:(event.target.readyState === FileReader.DONE) ? event.target.result : void 0
    });

    this.start = this.end;
    this.end = Math.min(this.start + this.options.chunk_size, this.file.size);

    if (!this.paused)
    {
        this.readFile();
    }
};

As detailed above, whenever a chunk has finished reading in, we’ll check to see if it’s the last chunk. If not, we’ll trigger a FILE_READ_PROGRESS event, much like the kind that was triggered previously while a file read was ongoing. The difference here is that this version of the event will also include a read_result property, with the result of reading the file chunk. A simple test for the presence of this property can be done when listening for the event, and the read_result can be stored in a new Blob in memory or elsewhere (such as IndexedDB).

If it’s the final chunk, then we trigger a FILE_READ_FINISHED as usual, and return the final chunk in the read_result in that event. In the listener for this event, we can finish assembling the chunked file, and do what we want with it at this point.
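For instance, a listener pair along these lines could accumulate the chunks and assemble the final Blob. The handler names here are ours, and the Blob assembly is one possible approach; the event shapes (read_result, file_type) follow the plugin's events described above:

```javascript
// Accumulates read_result chunks from FILE_READ_PROGRESS events, then
// assembles them into a single Blob on FILE_READ_FINISHED.
var chunks = [];

function onReadProgress(data) {
    // Only chunked-read progress events carry a read_result;
    // plain progress events (no completed chunk) are ignored.
    if (data.read_result !== undefined) chunks.push(data.read_result);
}

function onReadFinished(data) {
    // The finished event carries the final chunk's read_result.
    chunks.push(data.read_result);
    var file = new Blob(chunks, {type: data.file_type});
    chunks = [];
    return file;
}
```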

Use

I mentioned at the beginning we’d discuss how to actually call pause or resume on your element and the associated file processor(s). This is very simple. First, you’d attach a HUp instance to an element by calling it like so:

$('#hupInput').hup({options});

Then, you can get back the HUp instance by referencing that same element and looking for ‘hup’ in the data attached to the Node. You can also save a reference to this instance – it will never change within the lifetime of the Node, so it’s safe to save.

var aHupInstance = $('#hupInput').data('hup');

Then, you could call aHupInstance.pause() or aHupInstance.resume(). You could also skip saving a reference and address it directly:

$('#hupInput').data('hup').pause([1, 'file_name.ext']);

Plugin

Thanks again to RLK for noticing that pause and resume weren’t really there yet! The updated plugin code is below – for details on use see the Github repository.

https://gist.github.com/SaneMethod/a3fda36468efd5129005

Ajax Upload XHR2, Take 2

It’s a pleasure to be able to interact with files in the browser at long last, isn’t it? Reading files in without needing to bounce them against the server first opens up a lot of possibilities – and getting progress from a chunked ajax upload is miles away from the indeterminate form uploads of days past.

Last time we touched this subject, I shared an (admittedly rough) jQuery plugin that allowed you to enjoy HTML5 ajax uploading and file reading with the familiar event interface, and convert any element into a drag-and-drop target.

At the request of reader Mateusz, let’s revisit our HUp plugin, and polish it up a little by adding a new feature – the ability to filter files to be read/uploaded by their file size, and/or their mime-type.

(more…)

jQuery Ajax Blobs and Array Buffers

Alien_blob_monster

A big part of what makes jQuery a regular part of so many web projects is the clean interface it offers us for a number of sometimes messy built-in aspects of javascript. The most obvious is the DOM interface; a close second is jQuery ajax and its various shorthand methods. Abstracting away the difference between ActiveXObject and XMLHttpRequest is one of the most obvious benefits – but even if you don’t need to worry about supporting old versions of IE, you might well enjoy the clean, object-based, promise-returning interface that jQuery ajax offers.

It’s a shame then, that if you want to take advantage of XMLHttpRequest Level 2 features like Blob and ArrayBuffer uploading/downloading, you have to fall back to the standard javascript api.

Let’s fix that, shall we?

(more…)

Ajax Caching, Transports Compatible with jQuery Deferred

squirrelwithnuts

Ever since the advent of memcached and its ilk, the server side has been able to reduce load by caching recently or oft-requested resources. That’s no less important and valuable today. If anything, in this era of the web app, when native-application look and feel is increasingly desired, speedy response to requests is critical if your application is to meet the needs of your time-pressed users.

Server-side caching can’t save us from the delay imposed by a round trip to the server, however, and if you’re serving hundreds or thousands of users at the same time, even small memory-cached items may, to the user, seem to be taking their sweet time showing up. For small amounts of oft-requested data that isn’t likely to change often, we can eliminate even these delays by shifting the burden client-side, with local storage and ajax request caching.

(more…)