I'd like a suggestion (or some clarity) on a possible way to improve DataTables performance when working with large datasets that don't update frequently (e.g., product catalogs, archived reports).
To reduce unnecessary API calls, would it be useful to cache the response in localStorage with a 1-hour expiration? The idea is to avoid re-fetching the same data on every page load while still refreshing the cache periodically.
Here’s a small part of the approach:
const cacheKey = "tableData";
const cacheExpiry = 3600 * 1000; // 1 hour, in milliseconds

function fetchData(callback) {
  const cached = localStorage.getItem(cacheKey);
  // localStorage only stores strings, so coerce the timestamp back to a number
  const cachedTime = Number(localStorage.getItem(cacheKey + "_time"));

  if (cached && cachedTime && Date.now() - cachedTime < cacheExpiry) {
    try {
      callback(JSON.parse(cached)); // Serve from cache
      return;
    } catch (e) {
      // Corrupt cache entry: drop it and fall through to a fresh fetch
      localStorage.removeItem(cacheKey);
      localStorage.removeItem(cacheKey + "_time");
    }
  }
  $.getJSON("your-api-endpoint", function (data) {
    localStorage.setItem(cacheKey, JSON.stringify(data));
    localStorage.setItem(cacheKey + "_time", String(Date.now()));
    callback(data);
  });
}
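One small refactor I'm also considering is pulling the staleness check out into a pure helper, mainly so it's easy to unit test in isolation (the helper name is mine, not anything from DataTables or jQuery):

```javascript
// Pure staleness check: true when the cached timestamp is still within the TTL.
// All three arguments are in milliseconds; cachedTime and now are epoch times.
function isCacheFresh(cachedTime, now, ttl) {
  // Number.isFinite rejects NaN, which is what Number() yields for a
  // missing or garbled "_time" entry read back from localStorage.
  return Number.isFinite(cachedTime) && now - cachedTime < ttl;
}
```

fetchData would then call isCacheFresh(cachedTime, Date.now(), cacheExpiry) instead of inlining the comparison.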
Not sure if this is the best approach, especially for data that occasionally updates. I'm trying to consolidate some good practices for using DataTables with jQuery, Laravel, and Angular. Would appreciate any insights!
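For the occasional-update concern specifically, one idea I've been toying with is versioned cache keys, so a backend version bump invalidates the cache without waiting for the TTL. This is just a sketch; DATA_VERSION is a hypothetical value the server would embed in the page, and clearStaleVersions is my own helper (it takes the storage object as a parameter so it can be tested against a stand-in):

```javascript
// Hypothetical: DATA_VERSION would be rendered into the page by the backend
// and bumped whenever the catalog data changes.
const DATA_VERSION = "v2";
const versionedKey = "tableData_" + DATA_VERSION;

// Remove entries from older versions so localStorage's ~5 MB quota isn't
// slowly exhausted by stale copies. `storage` is any localStorage-compatible
// object (length, key, removeItem).
function clearStaleVersions(storage, prefix, currentKey) {
  // Iterate backwards so removals don't shift the indices we still have to visit
  for (let i = storage.length - 1; i >= 0; i--) {
    const key = storage.key(i);
    if (key && key.indexOf(prefix) === 0 && key.indexOf(currentKey) !== 0) {
      storage.removeItem(key);
    }
  }
}

// On page load, before fetching:
// clearStaleVersions(localStorage, "tableData_", versionedKey);
```

fetchData would then read and write versionedKey instead of the fixed cacheKey, so a version bump naturally misses the cache and re-fetches.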