How to cache data from async function that uses fetch in Node

I was trying to see if there was a way to cache a json response from a fetch async call, possibly using LRU.

I've tried using several packages, such as node-cache and lru-cache, but I don't think they worked because my function is asynchronous.

This is what my fetch function basically looks like:

const jsonFetch = async (url) => {
    try {
        const response = await fetch(url);
        const json = await response.json();
        return json;
    }
    catch (error) {
        console.log(error);
    }
}

For example, if someone hits my route 20 times in a minute, I'd like to return the response in roughly 0.03 ms instead of 0.3 ms. Currently, it always fetches the data from the URL on every request.

asked Jun 7, 2019 at 0:32 by flarow432

  • Try this: github.com/tugrul512bit/LruJS, it's fully asynchronous. – huseyin tugrul buyukisik, Sep 26, 2021

2 Answers


This has been here for a while, but I agree with the comment from @sleepy012. To avoid parallel calls, the trick is to cache the promise, not just the resolved value. So something like this should work:

let cache = {}
function cacheAsync(loader) {
  return async (url) => {
    if (url in cache) {                    // return cached result if available
        console.log("cache hit")
        return cache[url]
    }
    try {
        const responsePromise = loader(url)
        cache[url] = responsePromise
        return responsePromise
    }
    catch (error) {
        console.log('Error', error.message)
    }
  };
}


function delayedLoader(url) {
  console.log('Loading url: ' + url)
  return new Promise((r) => setTimeout(r, 1000,'Returning ' + url));
}

const cachedLoader = cacheAsync(delayedLoader);

cachedLoader('url1').then((d) => console.log('First load got: ' + d));
cachedLoader('url1').then((d) => console.log('Second load got: ' + d));
cachedLoader('url2').then((d) => console.log('Third load got: ' + d));
cachedLoader('url2').then((d) => console.log('Fourth load got: ' + d));
console.log('Waiting for load to complete');
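Since the question mentions LRU, the same promise-caching idea can be bounded with a small least-recently-used eviction policy. Here is a minimal sketch using a plain Map rather than the lru-cache package; the maximum size of 50 and the use of the global fetch are assumptions for illustration:

const MAX_ENTRIES = 50;
const lru = new Map();

const cachedJsonFetch = (url) => {
  if (lru.has(url)) {
    const hit = lru.get(url);
    lru.delete(url);                        // re-insert so this key becomes most recently used
    lru.set(url, hit);
    return hit;
  }
  const promise = fetch(url).then((response) => response.json());
  if (lru.size >= MAX_ENTRIES) {
    lru.delete(lru.keys().next().value);    // evict the least recently used entry
  }
  lru.set(url, promise);
  promise.catch(() => lru.delete(url));     // drop failed requests so errors aren't cached
  return promise;
};

Because the promise itself is stored, concurrent calls for the same URL share a single in-flight request, just as in the snippet above.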

There's nothing about async functions that will prevent caching results. It's possible the libraries you're looking at can't handle the promises, but here's a basic proof of concept that might help to get things started:

let cache = {}
const jsonFetch = async (url) => {
    if (url in cache) {                    // return cached result if available
        console.log("cache hit")
        return cache[url]
    }
    try {
        const response = await fetch(url)
        const json = await response.json();
        cache[url] = json                  // cache response keyed to url
        return json
    }
    catch (error) {
        console.log(error)
    }
}

jsonFetch("https://jsonplaceholder.typicode./todos/1").then((user) => console.log(user.id))

// should be cached -- same url
setTimeout(() => jsonFetch("https://jsonplaceholder.typicode.com/todos/1").then((user) => console.log(user.id)), 2000)

// not in cache
setTimeout(() => jsonFetch("https://jsonplaceholder.typicode.com/todos/2").then((user) => console.log(user.id)), 2000)

Note that you will only get cache hits on requests made after the first request has returned a value to cache.
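If stale data is a concern, the cached value can also be given a time-to-live so it is refetched after a while. A minimal sketch, assuming a 60-second TTL based on the "20 times in a minute" example from the question:

let timedCache = {};
const TTL_MS = 60 * 1000;                  // assumed: keep entries for one minute

const jsonFetchWithTtl = async (url) => {
  const entry = timedCache[url];
  if (entry && Date.now() - entry.storedAt < TTL_MS) {
    return entry.json;                     // fresh enough, serve from the cache
  }
  const response = await fetch(url);
  const json = await response.json();
  timedCache[url] = { json, storedAt: Date.now() };
  return json;
};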
