# Other Languages > jQuery >  [RESOLVED] How much can I load up into the DOM?

## szlamany

Wow - the jQuery section of the forum surely appears dead - no posts since Oct 1st - hope someone reads this!!

At any rate - with jQuery it's become common for me to download objects of data, stash them somewhere in the DOM, and then re-use them for autocompletes, dropdowns, or slickgrid sources.

I've got experience doing 2000 names in an auto-complete drop down - which means I stored all that data somewhere in the page for use as the page ajaxes away.

How about doing 30,000 names in an autocomplete?  I'm not concerned about the speed of the autocomplete itself - I'm really worried about the available storage space in the browsers out there, and wondering if anyone has pushed a large amount of data to a browser and found a "breaking point", so to speak.

Thanks!

----------


## akhileshbc

Are you talking about the amount of data that can be stored using JavaScript? If so, I don't think there is a hard limit, since it would depend on the user's PC.

The more elements or data you have, the more RAM it will consume, and that might cause lag on the user's machine.

That's what I think.  :wave:

----------


## tr333

I don't see any problems occurring on a desktop browser, but I would expect to see problems on a device with low memory, e.g. a smartphone or tablet.

If you're using Web Storage then you can run into quota limits, which are defined by the client browser, but these are usually at least a few megabytes.

I guess the easiest way to find out is to just run some tests: load X objects into the DOM and increase X until you start seeing problems.  It's going to be different for every browser/machine that you're running on due to differing client hardware specs.
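That brute-force test can be sketched in plain JavaScript. Everything here is an illustrative assumption (the record shape, the `makeRecords`/`timeFilter` helper names, and the naive substring filter standing in for the autocomplete's matcher):

```javascript
// Build n fake records shaped like autocomplete source entries.
function makeRecords(n) {
    var records = [];
    for (var i = 0; i < n; i++) {
        records.push({ label: "Person " + i, value: i });
    }
    return records;
}

// Time a naive substring filter over the cached array.
function timeFilter(records, term) {
    var start = Date.now();
    var matches = records.filter(function (r) {
        return r.label.indexOf(term) !== -1;
    });
    return { ms: Date.now() - start, count: matches.length };
}

// Double X until the filter (or the browser) becomes sluggish.
for (var x = 1000; x <= 64000; x *= 2) {
    var result = timeFilter(makeRecords(x), "Person 1");
    // inspect x, result.ms and result.count to find the breaking point
}
```

In a real page you would also watch the browser's memory use in the dev tools while X grows, since the filter time alone won't show a storage limit.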

----------


## xxarmoxx

I'm sure there is a tutorial out there for this. My hunch is NOT to store 30,000 items in the browser. I would make a new ajax call after each keystroke and display the results in the autocomplete.
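The per-keystroke approach implies the server does the narrowing. A DOM-free sketch of the subset logic such an endpoint might run (the `serverSideMatch` name, the prefix matching, and the result cap are all assumptions; a real implementation would be a database query):

```javascript
// Return at most `limit` names starting with the typed term (prefix match).
function serverSideMatch(allNames, term, limit) {
    var lower = term.toLowerCase();
    var hits = [];
    for (var i = 0; i < allNames.length && hits.length < limit; i++) {
        if (allNames[i].toLowerCase().indexOf(lower) === 0) {
            hits.push(allNames[i]);
        }
    }
    return hits;
}
```

Because only the `hits` array goes over the wire, the browser never holds the full 30,000-row list.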

----------


## szlamany

Have you used the standard jQuery UI autocomplete - where you type some letters and it pulls matches from the DOM to display?  Very wicked fast.

I've done 1000, 2000 or so names in a lookup like this.

I was looking for people who have gone down this path with 30,000 or 50,000 rows of data, hoping for experience in this area.  Seems no one has offered any - I'll eventually test this with some users out in the field and hopefully get back here with meaningful results.

----------


## tr333

> I was looking for people who have gone down this path with 30,000 or 50,000 rows of data, hoping for experience in this area.  Seems no one has offered any - I'll eventually test this with some users out in the field and hopefully get back here with meaningful results.


I think that's the only real way to find out.  Do lots of testing and see what you get out of it.

----------


## xxarmoxx

Yes, I have seen the jQuery autocomplete, and yes, it uses an array that has data in it, but that is only good for small data sets. You should make an ajax call that sets that array. The ajax call will return a small subset of the data, making that array very small. It's the only practical way to do this for large data sets. Just use ajax that returns JSON and set that array from the JSON.
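For reference, a hedged sketch of what that remote-source wiring might look like with the stock jQuery UI widget (the element id and endpoint URL are assumptions, not anything from this thread):

```javascript
// Remote-source autocomplete: the server returns a small JSON subset per term.
$("#name").autocomplete({
    minLength: 3,   // don't hit the server until 3 characters are typed
    delay: 300,     // built-in debounce between keystrokes (milliseconds)
    source: function (request, response) {
        // the endpoint should return a JSON array of matches for request.term
        $.getJSON("/api/names", { term: request.term }, response);
    }
});
```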

http://stackoverflow.com/questions/3...on-large-array

----------


## xxarmoxx

Make sure your server call is fast and make sure your query is optimized, it should work well if you make that happen.

----------


## szlamany

Ok - finally got around to loading this town's census data - 43,000 names of people with a corresponding CensusId.

All of this loads at "boot" time for the page (an initial ajax data call after un/pw authentication).  This particular set of JSON data is 2.5 MB - and takes 1.6 seconds locally, running here in the office on development machines.

The jQuery autocomplete is handling 43,000 entries perfectly - I extended it a long time ago to tell you to be more specific if more than 200 rows are being considered for display - so until then you simply get a record count of how many matches it's waiting to get under 200...



```javascript
// Override the widget's static helpers so huge result sets show a
// prompt instead of rendering thousands of menu items.
$.extend($.ui.autocomplete, {
    escapeRegex: function(value) {
        // Escape regex metacharacters in the user's typed term
        return value.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, "\\$&");
    },
    filter: function(array, term) {
        var matcher = new RegExp($.ui.autocomplete.escapeRegex(term), "i");
        var arraytoreturn = $.grep(array, function(value) {
            return matcher.test(value.label || value.value || value);
        });
        // Cap the menu: over 200 matches, show a "be more specific" row
        if (arraytoreturn.length > 200) {
            arraytoreturn = [{ label: "Please be more specific - " + arraytoreturn.length + " matching entries!", value: "" }];
        }
        return arraytoreturn;
    }
});
```
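The cap logic in that extension can be restated without jQuery as a standalone function (this `cappedFilter` is a testable sketch, not part of jQuery UI):

```javascript
// Standalone restatement of the capped filter (sketch, not part of jQuery UI).
function cappedFilter(array, term, cap) {
    // escape regex metacharacters in the typed term, match case-insensitively
    var escaped = term.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, "\\$&");
    var matcher = new RegExp(escaped, "i");
    var matches = array.filter(function (value) {
        return matcher.test(value.label || value.value || value);
    });
    // too many hits: return a single "be more specific" prompt row instead
    if (matches.length > cap) {
        return [{ label: "Please be more specific - " + matches.length +
                  " matching entries!", value: "" }];
    }
    return matches;
}
```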

I'll give further timings after we get the page up on a public facing server...

I also want to try running this in a browser on a low-end / low-memory machine and see if eating 2.5 MB of JavaScript object space is a bad thing...

----------


## szlamany

> Yes, I have seen the jQuery autocomplete, and yes, it uses an array that has data in it, but that is only good for small data sets. You should make an ajax call that sets that array. The ajax call will return a small subset of the data, making that array very small. It's the only practical way to do this for large data sets. Just use ajax that returns JSON and set that array from the JSON.
> 
> http://stackoverflow.com/questions/3...on-large-array


Have you done lots of ajax calls for autocompletes?  What is the performance like going over the wire as you are typing characters?  Does it appear instant??

Do you set it for a minimum number of characters before you start polling the service to get you matching data?

I've only ever done autocompletes with data that was loaded initially and just held for use later as needed.

----------


## Serge

I've used autocomplete with 500k user records (ajax calls). With proper indexes in the table, it works very well even if you set minChars: 1.

----------


## tr333

If you "really" want to speed up things...

I noticed a huge speed increase when switching from WebForms to MVC, because the amount of data sent on each request/response was reduced by not having to send through all the extra stuff that WebForms uses (ajax script methods, viewstate, etc.).  Also, the new Bundling and Optimization stuff in MVC4 is a real winner.  Moving all your scripts into external .js files allows you to concatenate/minify them to save loading time and also allows you to cache them properly, which a dynamically loaded .cshtml or .aspx view/page can't do as easily.

Check out the web.config from the HTML5 Boilerplate template that includes settings for high levels of caching of static assets.  Other useful tools include yslow and pagespeed.

----------


## szlamany

> If you "really" want to speed up things...
> ...
> Other useful tools include yslow


Point #15 - under "Web Performance Best Practices and Rules" of that link - "Make AJAX Cacheable" - is what this thread is all about.

And I can happily report that you can in fact cache a huge amount of SOURCE data in the DOM at page-load time - making ajax calls for autocompletes unnecessary.

I've got four autocompletes that each load 25,000+ rows of data - it takes a couple of seconds to initially load (but the user is unaware, as it's an ajax call that loads the initial data!).

From that point forward the user can access those autocompletes and get full functionality *without revisiting the server* for the data.  Actually, when I display a new bit of DOM with a similar autocomplete that also needs that source data, I can effortlessly point it at the object that is already in the DOM.

I am going to mark this thread resolved.
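The sharing described above comes down to reference semantics: every widget's options object points at the same array, so one download feeds them all. A DOM-free sketch (all names here are assumptions for illustration):

```javascript
// One shared array, filled once by the boot-time ajax call.
var censusNames = [];

// Each widget's options object references the same array - no copies.
function makeWidgetConfig(source) {
    return { source: source };
}

var configA = makeWidgetConfig(censusNames);  // e.g. a dropdown on tab 1
var configB = makeWidgetConfig(censusNames);  // e.g. an autocomplete on tab 2

// When the ajax callback pushes into the shared array, both configs see it.
censusNames.push({ label: "Smith, John", value: 101 });
```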

----------


## Serge

> Point #15 - under "Web Performance Best Practices and Rules" of that link - "Make AJAX Cacheable" - is what this thread is all about.
> 
> And I can happily report that you can in fact cache a huge amount of SOURCE data in the DOM at page-load time - making ajax calls for autocompletes unnecessary.
> 
> *I've got four autocompletes that each load 25,000+ rows of data - it takes a couple of seconds to initially load (but the user is unaware, as it's an ajax call that loads the initial data!).*
> 
> From that point forward the user can access those autocompletes and get full functionality *without revisiting the server* for the data.  Actually, when I display a new bit of DOM with a similar autocomplete that also needs that source data, I can effortlessly point it at the object that is already in the DOM.
> 
> I am going to mark this thread resolved.


That goes against all the rules of web page development. Your initial load will be slow(er). Also, if you have a public website, you will be heavily penalized by search engines - your rank is based partly on the speed of your pages. What you should do is cache your ajax calls on the server side.

----------


## szlamany

My customers are running things like CENSUS and REAL ESTATE TAX collection apps here - all HTTPS secure - no need to worry about how my "page" gets ranked by search engines.

The app uses jQuery UI tabs to let you open dozens of tabs of data for editing, if you want.  I put huge amounts of effort into making sure that the "source" of my dropdowns and autocompletes all comes from "single" cached copies in the browser.  I might have a dozen dropdowns all pointing to the same data object in JavaScript.

All of these data sources get downloaded via ajax calls made in the READY event of the page, so the user is unaware (actually, if a data source is not loaded when they open a tab, that dropdown will display "Loading..." and be disabled - it becomes enabled in the ajax callback).

Your arguments against this method don't seem to fit my situation - although I welcome your opinion!
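That "Loading... / disabled until the callback fires" pattern can be sketched without any DOM (the `createLazySource` name and shape are assumptions; in the real app the ready callback would enable the dropdown):

```javascript
// A data source that starts "loading" and runs callbacks once data arrives.
function createLazySource() {
    var state = { status: "loading", data: null, waiters: [] };
    return {
        state: state,
        // called from the ajax success callback fired at page READY time
        deliver: function (data) {
            state.status = "ready";
            state.data = data;
            state.waiters.forEach(function (fn) { fn(data); });
            state.waiters = [];
        },
        // a dropdown in a newly opened tab asks for the data:
        // the callback runs immediately if ready, or waits for deliver()
        whenReady: function (fn) {
            if (state.status === "ready") {
                fn(state.data);
            } else {
                state.waiters.push(fn);
            }
        }
    };
}
```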

----------


## Serge

I understand your reasons, but until those arrays are loaded the user will just sit there and wait. Although it might take only a couple of seconds, and you might get one array of 25k records to share across multiple autocompletes (it might serve your purposes), what happens when you need more than just one data source?

----------

