
If a user issues a query, I would like AJAX to download the result of the query into the browser so that JavaScript can do the parsing. The results could be around 1MB-5MB at most (though I doubt it would often get anywhere near 0.1MB), and I would send them from PHP to the front-end as JSON, then convert them into an array, which I would then parse depending on the user's expectations. For scale: 57KB is what my asset is worth (project not finished), while DaniWeb for example is 1.232kB as my browser receives it. Would holding 5MB of data in a single array variable hurt the performance of the browser? I'm not talking about the download, I'm talking about after it's done. Will holding 5MB and accessing data[2198124]["text"] hurt a lot?

Edited by Aeonix

4 Contributors · 5 Replies · 41 Views · 1 Year Discussion Span · Last Post by ryantroop

On a "modern" machine, probably not. You will have a longer delay simply downloading the data.

However, if this is the path you plan on taking, I would let PHP format it into the array so that the browser just has to JSON.parse() the value, and then you don't have to worry about "processing". (Pretty much everything in JS arrives as a string. If you send a raw string of whatever along to the script, it will have to take that string and basically "eval" it to figure out what is inside it. You would then turn it into an "array", which means JS has to iterate through the string again to break everything apart. Not such a good idea...)
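To illustrate the difference, here is a minimal sketch (the payload and field names are invented for the example): letting JSON.parse() hand back ready-to-use objects in one pass, versus sending a raw delimited string that script must re-scan and split itself.

```javascript
// Hypothetical payload, as PHP's json_encode() might emit it.
const payload = '[{"text":"first row"},{"text":"second row"}]';

// One pass: the engine's native parser returns ready-to-use objects.
const rows = JSON.parse(payload);
console.log(rows[1].text); // "second row"

// The slower alternative: a raw delimited string that script has to
// walk again itself before any value is usable.
const raw = 'first row|second row';
const manual = raw.split('|').map(text => ({ text }));
console.log(manual[1].text); // "second row"
```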

That said, 5MB in an array may be a bit demanding and could actually stall processing a bit. If you do a straight-up for (;;) loop and you have 100,000 results, you will likely be sitting for a while to process all of that (seconds, if not minutes). It also depends on what the compare values are, whether they are strings, etc., and what actions you are taking with the data as you iterate over your array.
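One common way around that stall (a sketch, not from the thread: the chunk size and function names are my own) is to process the array in slices and yield back to the browser between slices, so the page stays responsive instead of freezing for the whole run:

```javascript
// Process `items` in slices of `chunkSize`, yielding to the event loop
// between slices so the UI thread is not blocked for the whole run.
function processInChunks(items, chunkSize, handleItem, done) {
  let i = 0;
  function step() {
    const end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) {
      handleItem(items[i], i);
    }
    if (i < items.length) {
      setTimeout(step, 0); // let the browser repaint, then continue
    } else if (done) {
      done();
    }
  }
  step();
}

// Usage: walk 100,000 fake rows without locking up the page.
const data = new Array(100000).fill("row");
let seen = 0;
processInChunks(data, 5000, () => { seen++; }, () => {
  console.log("processed", seen, "items");
});
```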

Really, the only way to "know" is to make an HTML page that already has your 5MB of data on it, formatted as you need it, and start processing it in script to check your timings. It likely will not be "usable" without some sort of spinny wheel of death to show progress.
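A rough timing check along those lines can also be done in script alone; a sketch that fabricates a payload of about the right order of magnitude and times the parse and a full iteration (the row shape is invented for the example):

```javascript
// Fabricate a JSON payload several MB in size, roughly the shape
// that json_encode() would produce for query rows.
const rows = [];
for (let i = 0; i < 100000; i++) {
  rows.push({ id: i, text: "some result text for row " + i });
}
const payload = JSON.stringify(rows); // several MB of JSON

console.time("parse");
const parsed = JSON.parse(payload);
console.timeEnd("parse"); // usually well under a second on a modern machine

console.time("iterate");
let chars = 0;
for (let i = 0; i < parsed.length; i++) {
  chars += parsed[i].text.length;
}
console.timeEnd("iterate");
```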

So... what are you trying to accomplish? And why all that data? Why must it be that much?

Edited by ryantroop


I would let PHP format it into the array so that the browser just has to JSON.parse() the value and then you don't have to worry about "processing"

It will be straight JSON that gets JSON.parse()d; I won't eval anything. They're ready-to-use values, strings and integers.

if you do a straight up for (;;) loop, and you have 100000 results, you will likely be sitting for a while to process all of that (seconds, if not minutes)

What would happen if I had the entire 100,000 entries, but only for(;;)'d over 100 results? Would that hurt that much?
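For what it's worth, iterating only 100 of the 100,000 entries is cheap: the loop cost scales with how many items you touch, not with how many sit in the array. A sketch (the variable names are invented):

```javascript
// Holding 100,000 entries in memory is separate from iterating them.
const data = new Array(100000).fill(null).map((_, i) => ({ text: "row " + i }));

// Touch only the 100 entries for the current "page".
const pageStart = 0;
const page = data.slice(pageStart, pageStart + 100);
for (let x = 0; x < page.length; x++) {
  // e.g. write page[x].text into the matching element here
}
console.log(page.length); // 100
```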

if they are strings, etc... or what actions you are taking with the data as you iterate over your array.

Just display, document.getElementById("result" + x).innerHTML = oneofresults[x];.

Really, the only way to "know" is to make an HTML page that has your 5MB of data on it already formatted as you need it

This will take A LONG time... also, why an HTML page? The result comes from an SQL database, is parsed into a usable array in PHP, then json_encode()'d, passed through AJAX to front-end JavaScript, then JSON.parse()'d into a variable, and used. Where does HTML come in?
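The script side of that pipeline is short either way; a sketch with a fabricated stand-in for the AJAX response (the field names and endpoint are assumptions, and the real network call is only hinted at in comments):

```javascript
// In the real app this string arrives from PHP via AJAX, e.g.:
//   const response = await fetch("results.php?q=..."); // hypothetical endpoint
//   const payload = await response.text();
// Here, a stand-in for what json_encode() on the PHP side might emit:
const payload = '[{"id":1,"text":"first"},{"id":2,"text":"second"}]';

const results = JSON.parse(payload);
for (let x = 0; x < results.length; x++) {
  // In the browser this would be something like:
  //   document.getElementById("result" + x).innerHTML = results[x].text;
  console.log(results[x].text);
}
```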

So... what are you trying to accomplish? And why all that data? Why must it be that much?

Knowing my skill to explain things, it will only take 5 years to get my point over. So let's just skip why and how :D

P.S.: Oh, I didn't explain anything... yeah... the user makes a request to the database, and I want the result parsed up front so that the user's access is faster than the combined AJAX/PHP and SQL parsing time.

Edited by Aeonix


It is of course too much for the browser. In that case you can go ahead with reducing the queries so that downloading can be a bit easier.

Edited by jayashreemarg: more information


I work on something that receives a lot of data and displays it in a grid. We are able to receive about 70MB worth of data without issue. After 70MB, the webapi calls start to choke, but not because JavaScript can't handle it. There seems to be a limit on how much data can come through a single webapi call. So we ended up chunking the call to be able to bring in over 70MB worth of data. Basically, you should have no problem with 1-5MB of data coming from a WebAPI/webservice/ajax/whatever call. It then comes down to the speed of the connection between the client machine and the server.
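The chunking described above can be as simple as computing page boundaries on the client and issuing one request per range; a sketch of the boundary math (the totals and chunk size are made up, and the request itself is only hinted at in a comment):

```javascript
// Split `total` records into [start, end) ranges of at most `size` each.
function chunkRanges(total, size) {
  const ranges = [];
  for (let start = 0; start < total; start += size) {
    ranges.push([start, Math.min(start + size, total)]);
  }
  return ranges;
}

// e.g. 250,000 rows fetched 100,000 at a time -> three requests.
const ranges = chunkRanges(250000, 100000);
// for (const [start, end] of ranges) {
//   fetch(`api/results?start=${start}&end=${end}`); // hypothetical endpoint
// }
console.log(ranges.length); // 3
```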


For the record, my suggestion was to make a stub HTML page with a single script element that already has your formatted data on the page. Not really a whole lot of work there.
