cache-loader-async

The goal of this crate is to provide a thread-safe, easy way to access any data structure that might be stored in a database, loading each key at most once and keeping it in cache for further requests.

This library is based on tokio-rs and futures.

Usage

Using this library is as easy as this:

use std::collections::HashMap;
use cache_loader_async::cache_api::LoadingCache;

#[tokio::main]
async fn main() {
    let static_db: HashMap<String, u32> =
        vec![("foo".into(), 32), ("bar".into(), 64)]
            .into_iter()
            .collect();

    let (cache, _) = LoadingCache::new(move |key: String| {
        let db_clone = static_db.clone();
        async move {
            db_clone.get(&key).cloned().ok_or("error-message")
        }
    });

    let result = cache.get("foo".to_owned()).await.unwrap().0;

    assert_eq!(result, 32);
}

The LoadingCache first tries to look up the result in an internal HashMap. If the key isn't present and no load is ongoing, it fires the load request and queues any other get requests for that key until the load finishes.
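The "load at most once" behavior can be illustrated with a minimal single-threaded sketch (plain std, no tokio; `SketchCache` and all names below are illustrative and not part of this crate's API):

```rust
use std::collections::HashMap;

// Illustrative sketch of the "load at most once" idea: get() first checks
// the internal map and only invokes the loader on a cache miss.
struct SketchCache<F> {
    map: HashMap<String, u32>,
    loader: F,
    loads: u32, // counts how often the loader actually ran
}

impl<F: FnMut(&str) -> Option<u32>> SketchCache<F> {
    fn new(loader: F) -> Self {
        SketchCache { map: HashMap::new(), loader, loads: 0 }
    }

    fn get(&mut self, key: &str) -> Option<u32> {
        if let Some(v) = self.map.get(key) {
            return Some(*v); // cache hit: no load fired
        }
        self.loads += 1; // cache miss: fire the load exactly once
        let v = (self.loader)(key)?;
        self.map.insert(key.to_owned(), v);
        Some(v)
    }
}

fn main() {
    let mut cache = SketchCache::new(|key| if key == "foo" { Some(32) } else { None });
    assert_eq!(cache.get("foo"), Some(32)); // miss: loader runs
    assert_eq!(cache.get("foo"), Some(32)); // hit: served from the map
    assert_eq!(cache.loads, 1);
}
```

The real LoadingCache additionally handles concurrency: gets arriving while a load is in flight are queued rather than triggering duplicate loads.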

Features & Cache Backings

The library currently supports two additional inbuilt backings: LRU and TTL. LRU evicts keys based on the cache's maximum size, while TTL evicts keys automatically after their time-to-live expires.

LRU Backing

You can use a simple pre-built LRU cache from the lru-rs crate by enabling the lru-cache feature.

To create a LoadingCache with lru cache backing use the with_backing method on the LoadingCache.

use cache_loader_async::backing::LruCacheBacking;
use cache_loader_async::cache_api::LoadingCache;

#[tokio::main]
async fn main() {
    let size: usize = 10;
    let (cache, _) = LoadingCache::with_backing(LruCacheBacking::new(size), move |key: String| {
        async move {
            // Annotate the error type so it can be inferred
            Ok::<String, String>(key.to_lowercase())
        }
    });
}
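The eviction rule the LRU backing applies can be sketched with a minimal std-only model (this is not the lru-rs implementation, just the size-based eviction idea it provides; `SketchLru` is an illustrative name):

```rust
use std::collections::HashMap;

// Minimal illustration of LRU eviction by maximum size: when the cache is
// full, the least recently used key is dropped to make room.
struct SketchLru {
    max_size: usize,
    map: HashMap<String, String>,
    order: Vec<String>, // front = least recently used, back = most recent
}

impl SketchLru {
    fn new(max_size: usize) -> Self {
        SketchLru { max_size, map: HashMap::new(), order: Vec::new() }
    }

    fn set(&mut self, key: String, value: String) {
        self.order.retain(|k| k != &key);
        if self.map.len() >= self.max_size && !self.map.contains_key(&key) {
            let lru = self.order.remove(0); // evict the least recently used key
            self.map.remove(&lru);
        }
        self.order.push(key.clone());
        self.map.insert(key, value);
    }

    fn get(&mut self, key: &str) -> Option<&String> {
        if self.map.contains_key(key) {
            self.order.retain(|k| k != key);
            self.order.push(key.to_owned()); // mark as most recently used
        }
        self.map.get(key)
    }
}

fn main() {
    let mut lru = SketchLru::new(2);
    lru.set("a".into(), "1".into());
    lru.set("b".into(), "2".into());
    let _ = lru.get("a"); // "a" is now most recently used
    lru.set("c".into(), "3".into()); // evicts "b", the LRU entry
    assert!(lru.get("b").is_none());
    assert!(lru.get("a").is_some());
}
```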

TTL Backing

You can use a simple pre-built TTL cache by enabling the ttl-cache feature. This does not require any additional dependencies.

To create a LoadingCache with ttl cache backing use the with_backing method on the LoadingCache.

use std::time::Duration;
use cache_loader_async::backing::TtlCacheBacking;
use cache_loader_async::cache_api::LoadingCache;

#[tokio::main]
async fn main() {
    let duration: Duration = Duration::from_secs(30);
    let (cache, _) = LoadingCache::with_backing(TtlCacheBacking::new(duration), move |key: String| {
        async move {
            // Annotate the error type so it can be inferred
            Ok::<String, String>(key.to_lowercase())
        }
    });
}
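The expiry rule behind a TTL backing can likewise be sketched with plain std (this is not the crate's TtlCacheBacking, only the deadline-based expiry idea; `SketchTtl` is an illustrative name):

```rust
use std::collections::HashMap;
use std::time::{Duration, Instant};

// Minimal illustration of TTL-based expiry: each entry stores a deadline,
// and lookups drop entries whose deadline has passed.
struct SketchTtl {
    ttl: Duration,
    map: HashMap<String, (String, Instant)>, // value plus its expiry deadline
}

impl SketchTtl {
    fn new(ttl: Duration) -> Self {
        SketchTtl { ttl, map: HashMap::new() }
    }

    fn set(&mut self, key: String, value: String) {
        let expires_at = Instant::now() + self.ttl;
        self.map.insert(key, (value, expires_at));
    }

    fn get(&mut self, key: &str) -> Option<&String> {
        // Drop the entry if its deadline has passed, then look it up.
        let deadline = self.map.get(key).map(|(_, d)| *d);
        if let Some(d) = deadline {
            if Instant::now() >= d {
                self.map.remove(key);
            }
        }
        self.map.get(key).map(|(v, _)| v)
    }
}

fn main() {
    let mut fresh = SketchTtl::new(Duration::from_secs(30));
    fresh.set("key".into(), "value".into());
    assert!(fresh.get("key").is_some()); // still within its TTL

    let mut expired = SketchTtl::new(Duration::ZERO);
    expired.set("key".into(), "value".into());
    assert!(expired.get("key").is_none()); // deadline already passed
}
```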

Own Backing

To implement your own cache backing, simply implement the public CacheBacking trait from the backing module.

pub trait CacheBacking<K, V>
    where K: Eq + Hash + Sized + Clone + Send,
          V: Sized + Clone + Send {
    fn get(&mut self, key: &K) -> Option<&V>;
    fn set(&mut self, key: K, value: V) -> Option<V>;
    fn remove(&mut self, key: &K) -> Option<V>;
    fn contains_key(&self, key: &K) -> bool;
}
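As an illustration, here is a trivial unbounded backing over a plain HashMap. The trait is re-declared locally so the sketch compiles stand-alone; in a real project you would import it from the crate's backing module instead and only write the impl block (`HashMapBacking` is an illustrative name, not a type this crate provides):

```rust
use std::collections::HashMap;
use std::hash::Hash;

// Re-declared here for a self-contained sketch; matches the trait shown above.
pub trait CacheBacking<K, V>
    where K: Eq + Hash + Sized + Clone + Send,
          V: Sized + Clone + Send {
    fn get(&mut self, key: &K) -> Option<&V>;
    fn set(&mut self, key: K, value: V) -> Option<V>;
    fn remove(&mut self, key: &K) -> Option<V>;
    fn contains_key(&self, key: &K) -> bool;
}

// A trivial unbounded backing on top of a plain HashMap.
pub struct HashMapBacking<K, V> {
    map: HashMap<K, V>,
}

impl<K, V> HashMapBacking<K, V> {
    pub fn new() -> Self {
        HashMapBacking { map: HashMap::new() }
    }
}

impl<K, V> CacheBacking<K, V> for HashMapBacking<K, V>
    where K: Eq + Hash + Sized + Clone + Send,
          V: Sized + Clone + Send {
    fn get(&mut self, key: &K) -> Option<&V> {
        self.map.get(key)
    }

    fn set(&mut self, key: K, value: V) -> Option<V> {
        self.map.insert(key, value) // returns the previous value, if any
    }

    fn remove(&mut self, key: &K) -> Option<V> {
        self.map.remove(key)
    }

    fn contains_key(&self, key: &K) -> bool {
        self.map.contains_key(key)
    }
}

fn main() {
    let mut backing: HashMapBacking<String, u32> = HashMapBacking::new();
    backing.set("foo".into(), 1);
    assert_eq!(backing.get(&"foo".into()), Some(&1));
    assert!(backing.contains_key(&"foo".into()));
    assert_eq!(backing.remove(&"foo".into()), Some(1));
}
```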
Issues

Collection of the latest Issues

huntc

Thanks for providing this library!

I couldn't see where to give feedback on my experience, but I just wanted to convey that using the library has been very pleasant. Here's a code snippet of how I set up my cache to read secrets from Hashicorp's Vault:

Feel free to use this as a more complex (if incomplete) example if that helps. Thanks once again!

Versions

v0.2.0 - Jan 23, 2022

Full Changelog: https://github.com/ZeroTwo-Bot/cache-loader-async-rs/compare/v0.1.1...v0.2.0

v0.1.1 - Aug 31, 2021

Information - Updated Jun 08, 2022

Stars: 3
Forks: 0
Issues: 0
