Problems:
I'm doing data lookups against a website containing thousands of .json files. Right now I read them with a straightforward for loop of $.getJSON calls. It works fine up to ~3000 files, but when the count grows to 5000 or even 20000 (the maximum I have to read), Chrome becomes extremely slow or fails with ERR_INSUFFICIENT_RESOURCES.
What I'm doing right now:
1. I do a $.getJSON on example.org/1.json through 20000.json and collect the content into arrayone[].
2. I process the array of data captured in arrayone[].
3. I do another $.getJSON on example.org/someplace/1.json through 20000.json and collect the content into arraytwo[].
4. I process arrayone[] & arraytwo[] together.

So given I have to read 20000 files, it's actually 40000 $.getJSON calls.
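To make the pattern concrete, here is a minimal sketch of the two-pass loop described above (my exact code differs; getJSON below is a stand-in for $.getJSON so the sketch runs without jQuery or a network, and the URLs are just the example.org placeholders):

```javascript
// Stand-in for $.getJSON: resolves immediately instead of hitting the network.
const getJSON = (url, cb) => Promise.resolve({ url }).then(cb);

const arrayone = [];
const arraytwo = [];

function loadAll(total) {
  // Pass 1: fire one request per file, all at once -- nothing throttles
  // them, which is presumably what exhausts Chrome at high counts.
  const first = [];
  for (let i = 1; i <= total; i++) {
    first.push(getJSON(`http://example.org/${i}.json`, d => arrayone.push(d)));
  }
  return Promise.all(first).then(() => {
    // ...process arrayone[] here...
    // Pass 2: another request per file, again all at once.
    const second = [];
    for (let i = 1; i <= total; i++) {
      second.push(getJSON(`http://example.org/someplace/${i}.json`, d => arraytwo.push(d)));
    }
    return Promise.all(second);
    // ...then process arrayone[] & arraytwo[] together...
  });
}
```

With total = 20000 this issues 40000 requests in two unbounded bursts, which matches the failure I'm seeing.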
Seeking help:
Is there a smarter way to read this many .json files with better performance?
Hope it's clear, thanks in advance!