javascript - proper way to do thousands of $.getJSON calls to avoid ERR_INSUFFICIENT_RESOURCES

Problem:

I'm doing some data lookup on a website containing thousands of .json files. Right now I read those .json files with a straightforward for loop of $.getJSON calls. It works fine up to roughly 3000 files, but when the count grows to 5000 or even 20000 (the maximum I have to read), Chrome becomes extremely slow or fails with ERR_INSUFFICIENT_RESOURCES.
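The pattern described above can be sketched as follows in plain JavaScript. Here `fetchJson` is a hypothetical stand-in for `$.getJSON` so the sketch is self-contained; the point is that every request is started in the same tick, so with 20000 URLs the browser runs out of sockets and memory:

```javascript
// Start one request per URL, all at once, and wait for them all.
// This mirrors a plain for loop over $.getJSON: nothing throttles
// how many requests are in flight simultaneously.
function fetchAllAtOnce(urls, fetchJson) {
  return Promise.all(urls.map((url) => fetchJson(url)));
}

// Instrumented fake fetch (assumption, for illustration only) that
// records how many requests are in flight at the same time.
let inFlight = 0;
let peak = 0;
function fakeFetch(url) {
  inFlight += 1;
  peak = Math.max(peak, inFlight);
  return new Promise((resolve) =>
    setTimeout(() => {
      inFlight -= 1;
      resolve({ url });
    }, 0)
  );
}
```

With 20000 URLs, `peak` reaches 20000: every request is pending at once, which is exactly the load profile that triggers the error.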

What I'm doing right now:

  1. Scan through data from example.org/1.json to 20000.json.
  2. Read the .json contents into arrayone[] and process the captured data.
  3. Based on arrayone[], do another round of $.getJSON calls on example.org/someplace/1.json to 20000.json.
  4. Read those .json contents into arraytwo[].
  5. Compare arrayone[] and arraytwo[].

So given that I have to read 20000 files, it's actually 40000 $.getJSON calls.
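One common remedy for this kind of workload is to keep the loop but cap how many requests are in flight at once. A minimal promise-pool sketch in plain JavaScript (`mapWithConcurrency` is a hypothetical helper, not part of jQuery; `fn` would wrap a single `$.getJSON` call):

```javascript
// Run fn over items with at most `limit` calls in flight at once.
// Results come back in input order, like Promise.all.
function mapWithConcurrency(items, limit, fn) {
  const results = new Array(items.length);
  let next = 0; // index of the next item to start

  async function worker() {
    // Each worker pulls the next unclaimed index until none remain.
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i], i);
    }
  }

  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    () => worker()
  );
  return Promise.all(workers).then(() => results);
}
```

With a limit of around 6 to 10 (browsers only keep a handful of connections per host open anyway), all 40000 files still get fetched, but only a few requests exist at any moment, so the browser never hits its resource ceiling.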

Seeking help:

Is there a smarter way to read many .json files with better performance? I hope that's clear; thanks in advance!

over 3 years ago · Juan Pablo Isaza