Cache

Tip

Location within the framework: bee-agent-framework/cache.

Caching temporarily stores copies of data or computation results in a cache (a storage location) so that future requests can be served faster. Its primary purpose is to improve system efficiency and performance by avoiding repeated fetches or computations from a slower or more resource-intensive source.

Usage

Capabilities showcase

import { UnconstrainedCache } from "bee-agent-framework/cache/unconstrainedCache";

const cache = new UnconstrainedCache();

// Save
await cache.set("a", 1);
await cache.set("b", 2);

// Read
const result = await cache.get("a");
console.log(result); // 1

// Meta
console.log(cache.enabled); // true
console.log(await cache.has("a")); // true
console.log(await cache.has("b")); // true
console.log(await cache.has("c")); // false
console.log(await cache.size()); // 2

// Delete
await cache.delete("a");
console.log(await cache.has("a")); // false

// Clear
await cache.clear();
console.log(await cache.size()); // 0

Source: examples/cache/unconstrainedCache.ts

Caching function output + intermediate steps

import { UnconstrainedCache } from "bee-agent-framework/cache/unconstrainedCache";

const cache = new UnconstrainedCache<number>();

async function fibonacci(n: number): Promise<number> {
  const cacheKey = n.toString();
  const cached = await cache.get(cacheKey);
  if (cached !== undefined) {
    return cached;
  }

  const result = n < 1 ? 0 : n <= 2 ? 1 : (await fibonacci(n - 1)) + (await fibonacci(n - 2));
  await cache.set(cacheKey, result);
  return result;
}

console.info(await fibonacci(10)); // 55
console.info(await fibonacci(9)); // 34 (retrieved from cache)
console.info(`Cache size ${await cache.size()}`); // 10

Source: examples/cache/unconstrainedCacheFunction.ts

Usage with tools

import { SlidingCache } from "bee-agent-framework/cache/slidingCache";
import { WikipediaTool } from "bee-agent-framework/tools/search/wikipedia";

const wikipedia = new WikipediaTool({
  cache: new SlidingCache({
    size: 100, // max 100 entries
    ttl: 5 * 60 * 1000, // 5 minutes lifespan
  }),
});

const response = await wikipedia.run({
  query: "United States",
});
// subsequent requests with exactly the same input will be served from the cache

Source: examples/cache/toolCache.ts

Important

The cache key is created by serializing the function's parameters (the order of keys in the object does not matter).
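
For illustration, a minimal sketch: because both calls below serialize to the same key, the second run is served from the cache and only one entry is stored.

import { SlidingCache } from "bee-agent-framework/cache/slidingCache";
import { WikipediaTool } from "bee-agent-framework/tools/search/wikipedia";

const cache = new SlidingCache({ size: 10 });
const wikipedia = new WikipediaTool({ cache });

await wikipedia.run({ query: "Prague" });
await wikipedia.run({ query: "Prague" }); // identical serialized input → same cache key

console.log(await cache.size()); // 1 (the second call was served from the cache)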

Usage with LLMs

import { SlidingCache } from "bee-agent-framework/cache/slidingCache";
import { OllamaChatLLM } from "bee-agent-framework/adapters/ollama/chat";
import { BaseMessage } from "bee-agent-framework/llms/primitives/message";

const llm = new OllamaChatLLM({
  modelId: "llama3.1",
  parameters: {
    temperature: 0,
    num_predict: 50,
  },
  cache: new SlidingCache({
    size: 50,
  }),
});

console.info(await llm.cache.size()); // 0
const first = await llm.generate([BaseMessage.of({ role: "user", text: "Who was Alan Turing?" })]);
// subsequent requests with exactly the same input will be served from the cache
console.info(await llm.cache.size()); // 1
const second = await llm.generate([BaseMessage.of({ role: "user", text: "Who was Alan Turing?" })]);
console.info(first === second); // true

Source: examples/cache/llmCache.ts

Tip

Caching for non-chat LLMs works exactly the same way.
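
As a sketch, assuming the Ollama text-completion adapter is exported as OllamaLLM from bee-agent-framework/adapters/ollama/llm (the path may differ in your version), the setup mirrors the chat example above:

import { SlidingCache } from "bee-agent-framework/cache/slidingCache";
import { OllamaLLM } from "bee-agent-framework/adapters/ollama/llm"; // assumed adapter path

const llm = new OllamaLLM({
  modelId: "llama3.1",
  cache: new SlidingCache({ size: 50 }),
});

const first = await llm.generate("Who was Alan Turing?");
const second = await llm.generate("Who was Alan Turing?"); // served from the cache
console.info(first === second); // true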

Cache types

The framework provides multiple out-of-the-box cache implementations.

UnconstrainedCache

Unlimited in size.

import { UnconstrainedCache } from "bee-agent-framework/cache/unconstrainedCache";
const cache = new UnconstrainedCache();

await cache.set("a", 1);
console.log(await cache.has("a")); // true
console.log(await cache.size()); // 1

SlidingCache

Keeps the last k entries in memory; when the cache overflows, the oldest entries are removed.

import { SlidingCache } from "bee-agent-framework/cache/slidingCache";

const cache = new SlidingCache<number>({
  size: 3, // (required) maximum number of items kept in the cache at any moment
  ttl: 1000, // (optional, default Infinity) time in milliseconds after which an entry is removed from the cache
});

await cache.set("a", 1);
await cache.set("b", 2);
await cache.set("c", 3);

await cache.set("d", 4); // overflow - cache internally removes the oldest entry (key "a")
console.log(await cache.has("a")); // false
console.log(await cache.size()); // 3

Source: examples/cache/slidingCache.ts

FileCache

To persist cached data to a file so it can be loaded again later, use the FileCache. You have to provide the location where the cache is persisted.

import { FileCache } from "bee-agent-framework/cache/fileCache";
import * as os from "node:os";

const cache = new FileCache({
  fullPath: `${os.tmpdir()}/bee_file_cache_${Date.now()}.json`,
});
console.log(`Saving cache to "${cache.source}"`);
await cache.set("abc", { firstName: "John", lastName: "Doe" });

Source: examples/cache/fileCache.ts

Note

The provided location (fullPath) does not have to exist; it is created automatically when needed.

Note

Every modification to the cache (adding, deleting, clearing) immediately updates the target file.

Using a custom provider

import { FileCache } from "bee-agent-framework/cache/fileCache";
import { UnconstrainedCache } from "bee-agent-framework/cache/unconstrainedCache";
import os from "node:os";

const memoryCache = new UnconstrainedCache<number>();
await memoryCache.set("a", 1);

const fileCache = await FileCache.fromProvider(memoryCache, {
  fullPath: `${os.tmpdir()}/bee_file_cache.json`,
});
console.log(`Saving cache to "${fileCache.source}"`);
console.log(await fileCache.get("a")); // 1

Source: examples/cache/fileCacheCustomProvider.ts

NullCache

NullCache is a special cache type that implements the BaseCache interface but does nothing.

It exists to support the Null Object pattern.
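
A minimal sketch, assuming NullCache is exported from bee-agent-framework/cache/nullCache: every operation is accepted, but nothing is ever stored, so caching can be disabled without changing any calling code.

import { NullCache } from "bee-agent-framework/cache/nullCache"; // assumed import path

const cache = new NullCache();
await cache.set("a", 1); // no-op: nothing is stored
console.log(await cache.get("a")); // undefined
console.log(await cache.size()); // 0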

@Cache (decorator cache)

import { Cache } from "bee-agent-framework/cache/decoratorCache";

class Generator {
  @Cache()
  get(seed: number) {
    return (Math.random() * 1000) / Math.max(seed, 1);
  }
}

const generator = new Generator();
const a = generator.get(5);
const b = generator.get(5);
console.info(a === b); // true
console.info(a === generator.get(6)); // false

Source: examples/cache/decoratorCache.ts

Complex example

import { Cache, SingletonCacheKeyFn } from "bee-agent-framework/cache/decoratorCache";

class MyService {
  @Cache({
    cacheKey: SingletonCacheKeyFn,
    ttl: 3600,
    enumerable: true,
    enabled: true,
  })
  get id() {
    return Math.floor(Math.random() * 1000);
  }

  reset() {
    Cache.getInstance(this, "id").clear();
  }
}

const service = new MyService();
const a = service.id;
console.info(a === service.id); // true
service.reset();
console.info(a === service.id); // false

Source: examples/cache/decoratorCacheComplex.ts

Note

The default cacheKey function is ObjectHashKeyFn.

Caution

Calling a method annotated with the @Cache decorator with different parameters (even if the method does not use them) bypasses the cache, because different arguments produce different cache keys. Be aware of that. If you want the method to always return the same response regardless of its arguments, use SingletonCacheKeyFn.
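
For instance, in the sketch below the argument is never used inside the method body, yet it still becomes part of the cache key:

import { Cache } from "bee-agent-framework/cache/decoratorCache";

class Service {
  @Cache()
  getValue(_ignored?: string) {
    return Math.random(); // the argument is not used at all
  }
}

const service = new Service();
console.info(service.getValue("a") === service.getValue("a")); // true (same cache key)
console.info(service.getValue("a") === service.getValue("b")); // false (different cache key)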

CacheFn

Because the previously mentioned @Cache decorator can only be applied to class methods and getters, the framework also provides CacheFn for caching at the function level.

import { CacheFn } from "bee-agent-framework/cache/decoratorCache";
import { setTimeout } from "node:timers/promises";

const getSecret = CacheFn.create(
  async () => {
    // instead of mocking response you would do a real fetch request
    const response = await Promise.resolve({ secret: Math.random(), expiresIn: 100 });
    getSecret.updateTTL(response.expiresIn);
    return response.secret;
  },
  {}, // options object
);

const token = await getSecret();
console.info(token === (await getSecret())); // true
await setTimeout(150);
console.info(token === (await getSecret())); // false

Source: examples/cache/cacheFn.ts

Note

Internally, the function is wrapped as a class; therefore, the same rules apply here as if it were a method annotated with the @Cache decorator.

Creating a custom cache provider

To create your own cache implementation, extend the BaseCache class.

import { BaseCache } from "bee-agent-framework/cache/base";
import { NotImplementedError } from "bee-agent-framework/errors";

export class CustomCache<T> extends BaseCache<T> {
  size(): Promise<number> {
    throw new NotImplementedError();
  }

  set(key: string, value: T): Promise<void> {
    throw new NotImplementedError();
  }

  get(key: string): Promise<T | undefined> {
    throw new NotImplementedError();
  }

  has(key: string): Promise<boolean> {
    throw new NotImplementedError();
  }

  delete(key: string): Promise<boolean> {
    throw new NotImplementedError();
  }

  clear(): Promise<void> {
    throw new NotImplementedError();
  }

  createSnapshot() {
    throw new NotImplementedError();
  }

  loadSnapshot(snapshot: ReturnType<typeof this.createSnapshot>): void {
    throw new NotImplementedError();
  }
}

Source: examples/cache/custom.ts

The simplest existing implementation to use as a reference is UnconstrainedCache.
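
As a rough sketch of what a working implementation could look like, the skeleton above can be filled in with a plain Map as the backing store (the snapshot shape used here is an illustrative assumption, not the framework's own serialization format):

import { BaseCache } from "bee-agent-framework/cache/base";

export class MapCache<T> extends BaseCache<T> {
  protected provider = new Map<string, T>();

  async size(): Promise<number> {
    return this.provider.size;
  }

  async set(key: string, value: T): Promise<void> {
    this.provider.set(key, value);
  }

  async get(key: string): Promise<T | undefined> {
    return this.provider.get(key);
  }

  async has(key: string): Promise<boolean> {
    return this.provider.has(key);
  }

  async delete(key: string): Promise<boolean> {
    return this.provider.delete(key);
  }

  async clear(): Promise<void> {
    this.provider.clear();
  }

  createSnapshot() {
    // illustrative snapshot shape (assumption)
    return { entries: Array.from(this.provider.entries()) };
  }

  loadSnapshot(snapshot: ReturnType<typeof this.createSnapshot>): void {
    this.provider = new Map(snapshot.entries);
  }
}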