I currently fetch a business's historical sales data from a third-party API, and this data is needed for the entire lifetime of the user's account with my web app. I also need ongoing daily updates from the same third-party API for new sales data.
I am currently able to handle businesses with a few thousand objects of historical data, but I need to scale to businesses with 100,000+ historical objects and 100+ new objects added daily.
Every time a user loads the app, I run a bulk operation that fetches all of their data from the third-party API. I am not currently storing any of the third-party API data in a database.
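For concreteness, my load-time flow is roughly the sketch below. The `fetchPage` helper and its `{ items, nextCursor }` shape are illustrative stand-ins for the vendor's paginated endpoint, not the real API:

```javascript
// Hedged sketch of the current per-load bulk fetch. fetchPage(cursor) stands
// in for one paginated call to the third-party API and resolves to
// { items: [...], nextCursor: string | null }. Nothing is persisted
// server-side; everything is held in memory and cached only client-side
// via Apollo.
async function fetchAllSales(fetchPage) {
  const allSales = [];
  let cursor = null;
  do {
    const page = await fetchPage(cursor);
    allSales.push(...page.items);
    cursor = page.nextCursor;
  } while (cursor !== null);
  return allSales; // can be 100,000+ objects for large businesses
}
```

This runs on every app load, so a large business re-downloads its entire history each time, which is what I am trying to avoid at scale.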
Caching for my API calls is currently done locally with Apollo Client. I use MongoDB for saving general user-provided inputs and Redis for managing sessions. My stack is Node.js, Next.js, and Heroku.
Given that I need to handle 100,000+ objects at installation plus 100+ new objects per day, per user, from the third-party API (and all of the data is vital), how should I approach storing and caching the API data?