refactor progress

This commit is contained in:
Hampus Kraft
2026-02-17 12:22:36 +00:00
parent cb31608523
commit d5abd1a7e4
8257 changed files with 1190207 additions and 761040 deletions

packages/cache/BEST_PRACTICES.md

@@ -0,0 +1,332 @@
# Cache package best practices
Working document. Append new lessons as they arise.
---
## 1. Extract shared logic into focused, single-responsibility files
When two or more providers (KV, Redis, InMemory) share identical logic, extract it
into a dedicated file rather than duplicating it. Each extracted file should own
exactly one concern:
| File | Concern |
|---|---|
| `CacheProviderTypes.tsx` | Shared interfaces consumed by providers and callers (`CacheLogger`, `CacheTelemetry`, `CacheKeyClassifier`) |
| `CacheSerialization.tsx` | JSON parse/stringify helpers with error handling and truncation |
| `CacheLockValidation.tsx` | Lock key/token format validation, token generation, key formatting |
| `CacheKeyClassification.tsx` | Mapping a cache key to a telemetry dimension string |
| `RedisClientTypes.tsx` | `RedisClient` and `RedisPipeline` interface contracts |
**Why:** duplicated code drifts apart over time, bugs get fixed in one copy but not
another, and readers have to mentally diff near-identical methods.
---
## 2. Separate public contract types from implementation files
Types that callers or tests need to import (`CacheLogger`, `CacheTelemetry`,
`RedisClient`) belong in their own files, not buried inside a provider class.
**Rule of thumb:** if a type appears in an `import type` statement in a file that is
not the defining file, it deserves its own home.
**Before:**
```tsx
// KVCacheProvider.tsx defines AND exports CacheLogger
export interface CacheLogger { ... }
export class KVCacheProvider { ... }
```
**After:**
```tsx
// CacheProviderTypes.tsx: sole owner of the type
export interface CacheLogger { ... }
// KVCacheProvider.tsx imports the type
import type {CacheLogger} from '@fluxer/cache/src/CacheProviderTypes';
```
---
## 3. Use an `instrumented` helper to collapse telemetry boilerplate
When every method in a class follows the same start/try/record-success/catch/record-error
pattern, extract it into a single private method:
```tsx
private async instrumented<T>(
operation: string,
key: string,
fn: () => Promise<T>,
statusResolver?: (result: T) => string,
): Promise<T> { ... }
```
Each call site then becomes one line of intent instead of 25 lines of ceremony:
```tsx
async get<T>(key: string): Promise<T | null> {
return this.instrumented('get', key, async () => {
const value = await this.client.get(key);
if (value == null) return null;
return safeJsonParse<T>(value, this.logger);
}, (result) => (result == null ? 'miss' : 'hit'));
}
```
**Why:** the telemetry pattern was copy-pasted across `get`, `set`, `delete` with
only the operation name and status differing. A helper makes the unique logic visible
and the boilerplate invisible.
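As a sketch of the pattern (with a hypothetical `FakeTelemetry` sink standing in for the real `CacheTelemetry`, and the timing histogram omitted for brevity), each wrapped call records exactly one counter while the call site stays a single expression of intent:

```tsx
interface CounterMetric {
  name: string;
  dimensions?: Record<string, string>;
}

// Hypothetical telemetry sink used only for illustration.
class FakeTelemetry {
  counters: Array<CounterMetric> = [];
  recordCounter(metric: CounterMetric): void {
    this.counters.push(metric);
  }
}

class DemoProvider {
  constructor(private telemetry: FakeTelemetry) {}

  private async instrumented<T>(
    operation: string,
    fn: () => Promise<T>,
    statusResolver?: (result: T) => string,
  ): Promise<T> {
    try {
      const result = await fn();
      const status = statusResolver ? statusResolver(result) : 'success';
      this.telemetry.recordCounter({name: 'cache.operation', dimensions: {operation, status}});
      return result;
    } catch (error) {
      this.telemetry.recordCounter({name: 'cache.operation', dimensions: {operation, status: 'error'}});
      throw error;
    }
  }

  // One line of intent: fetch, then classify hit/miss.
  async get(key: string): Promise<string | null> {
    return this.instrumented(
      'get',
      async () => (key === 'known' ? 'value' : null),
      (result) => (result == null ? 'miss' : 'hit'),
    );
  }
}
```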
---
## 4. Keep serialisation consistent
Use `serializeValue()` and `safeJsonParse()` from `CacheSerialization.tsx` inside the
cache package rather than bare `JSON.stringify` / `JSON.parse`. Benefits:
- Parse errors are caught and logged with truncated values, not swallowed or thrown
raw.
- A single place to add future concerns (metrics on parse failures, encoding changes,
etc.).
- Callers outside the cache package doing their own `JSON.stringify` for HTTP bodies,
worker queues, etc. do **not** need these helpers; they are cache-specific.
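A minimal sketch of the intended behaviour (mirroring the `CacheSerialization.tsx` helpers, with the logger interface inlined for self-containment): corrupt JSON yields `null` plus one logged error, never a thrown exception.

```tsx
interface CacheLogger {
  error(obj: unknown, message: string): void;
}

function safeJsonParse<T>(value: string, logger?: CacheLogger): T | null {
  try {
    return JSON.parse(value) as T;
  } catch (error) {
    // Truncate the offending payload so log lines stay bounded.
    const truncated = value.length > 200 ? `${value.substring(0, 200)}...` : value;
    const errorMessage = error instanceof Error ? error.message : String(error);
    logger?.error({errorMessage, value: truncated}, '[CacheProvider] JSON parse error');
    return null;
  }
}

function serializeValue<T>(value: T): string {
  return JSON.stringify(value);
}
```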
---
## 5. Centralise validation and make it reusable
Lock key and token validation was copy-pasted across all three providers with identical
regexes and error messages. `CacheLockValidation.tsx` now owns:
- `validateLockKey(key)`: throws on bad format
- `validateLockToken(token)`: throws on bad format
- `generateLockToken()`: returns `randomBytes(16).toString('hex')`
- `formatLockKey(key)`: returns `lock:${key}`
**Guideline:** if the same regex or format string appears in more than one file, it
should be a named export in a shared module.
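Concretely (key regex copied from `CacheLockValidation.tsx`), the shared validators reject anything outside the allowed alphabet, and the formatter owns the `lock:` prefix:

```tsx
// Regex as defined in CacheLockValidation.tsx.
const LOCK_KEY_PATTERN = /^[a-zA-Z0-9:_-]+$/;

function validateLockKey(key: string): void {
  if (!LOCK_KEY_PATTERN.test(key)) {
    throw new Error('Invalid lock key format');
  }
}

function formatLockKey(key: string): string {
  return `lock:${key}`;
}

validateLockKey('deletion_queue:rebuild'); // passes
// validateLockKey('bad key!');            // throws: space and '!' are outside the pattern
```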
---
## 6. Import from the defining file ("horse's mouth")
Per project conventions: all callsites import directly from the file that defines the
symbol. Never re-export. Never create barrel files.
```tsx
// Good
import type {CacheLogger} from '@fluxer/cache/src/CacheProviderTypes';
// Bad: importing a type from a file that re-exports it
import type {CacheLogger} from '@fluxer/cache/src/providers/KVCacheProvider';
```
When moving a type to a new home, grep for every import and update it in the same
change. Do not leave the old export as a compatibility shim.
---
## 7. Provider-specific config stays in the provider file
`KVCacheProviderConfig` stays in `KVCacheProvider.tsx` because it references
`IKVProvider`, which is specific to that implementation. Only types shared across
multiple providers get extracted.
**Heuristic:** if a type is only imported by a single file and its tests, it belongs
in that file.
---
## 8. Verify refactors with the full test suite, formatter, and type checker
After any structural refactor, run in order:
```bash
cd packages/cache && pnpm test # all existing tests pass
pnpm biome check --write packages/cache/src/ # formatting (run from repo root)
cd packages/cache && pnpm typecheck # type checking
```
Then spot-check downstream consumers:
```bash
cd packages/api && pnpm typecheck
```
Pre-existing failures in downstream packages are acceptable so long as **none of the
errors reference the refactored package**.
---
## 9. Do not over-extract
Not every use of `randomBytes` or `JSON.stringify` in the codebase needs to use the
cache package's helpers. Cache serialisation helpers are for **cache values**. Other
domains (SSO tokens, CSP nonces, HTTP bodies, worker queues) have their own
serialisation needs and should not couple to the cache package.
**Rule:** extract shared code within a bounded context (the cache package), not across
unrelated domains.
---
## 10. Naming conventions for extracted files
Follow the project's descriptive-filename convention. Avoid generic names:
| Good | Bad |
|---|---|
| `CacheLockValidation.tsx` | `Validation.tsx` |
| `CacheKeyClassification.tsx` | `Utils.tsx` |
| `CacheProviderTypes.tsx` | `Types.tsx` |
| `RedisClientTypes.tsx` | `RedisTypes.tsx` |
Domain-prefix every file so it is unambiguous in search results and import
auto-complete.
---
## 11. Avoid backwards-compatibility shims
When moving exports to new files, do not leave behind re-exports or aliases in the old
location. This creates indirection and lets stale imports survive indefinitely.
Instead:
1. Move the definition.
2. Update every import site in the same commit.
3. Delete the old export entirely.
---
## 12. Keep the abstract contract stable
`ICacheService` and `CacheMSetEntry` stay in `ICacheService.tsx`. They define the
public contract that all providers implement and all consumers depend on. Refactoring
provider internals should never change this file.
---
## 13. One import block, no blank lines inside
Per project conventions, keep a single contiguous import block at the top of each file
with no blank lines or code between imports. Let biome handle ordering.
```tsx
// Good
import {classifyKeyType} from '@fluxer/cache/src/CacheKeyClassification';
import {formatLockKey, generateLockToken} from '@fluxer/cache/src/CacheLockValidation';
import type {CacheLogger} from '@fluxer/cache/src/CacheProviderTypes';
import {safeJsonParse, serializeValue} from '@fluxer/cache/src/CacheSerialization';
import {ICacheService} from '@fluxer/cache/src/ICacheService';
```
---
## 14. Scope of `safeJsonParse` logger parameter
`safeJsonParse` accepts an optional `CacheLogger`. The KV provider passes its logger
so parse errors are reported. The Redis provider passes nothing; its previous
console-logging fallback has been removed, so parse failures silently return null. The
InMemory provider does not call `safeJsonParse` at all because it stores native values,
not serialised strings.
When adding a new provider, decide up front whether parse errors should be logged and
pass the logger accordingly.
---
## 15. Watch for non-atomic operations
The Redis provider's `acquireLock` was (and remains, pending a proper fix) non-atomic:
```tsx
await this.client.set(lockKey, token);
await this.client.expire(lockKey, ttlSeconds);
```
Between these two calls, the lock exists without a TTL. If the process crashes, the
lock is held forever. The KV provider uses `SET ... EX ... NX` in a single command,
which is correct. When implementing distributed primitives, always prefer atomic
operations.
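As a sketch of the atomic shape (an in-memory stand-in for an ioredis-style `SET key value EX ttl NX`; the real fix would issue that single command or a Lua script), the existence check, the write, and the TTL land in one step, so no observer can ever see a lock without an expiry:

```tsx
type LockStore = Map<string, {token: string; expiresAt: number}>;

// Hypothetical single-step acquire: check, set, and TTL happen together,
// modelling what SET ... EX ... NX guarantees server-side.
function setNxEx(store: LockStore, key: string, token: string, ttlSeconds: number): 'OK' | null {
  const existing = store.get(key);
  if (existing && existing.expiresAt > Date.now()) {
    return null; // lock already held
  }
  store.set(key, {token, expiresAt: Date.now() + ttlSeconds * 1000});
  return 'OK';
}
```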
---
## 16. Downstream callsites should use shared utilities
After extracting shared logic, audit the rest of the codebase for consumers that
duplicate the same logic inline. Common patterns:
- `lock:${key}` string templates → `formatLockKey(key)`
- `crypto.randomBytes(16).toString('hex')` → `generateLockToken()`
- `Math.random().toString(36)...` → `generateLockToken()` (also a security fix)
- Mock implementations with inline lock validation → import from `CacheLockValidation`
Update downstream callsites in the same change to prevent the shared code from
becoming an orphan.
---
## 17. Never use Math.random() for tokens
`Math.random()` is not cryptographically secure. For any token used in locking,
authentication, or session management, use `crypto.randomBytes()` or the shared
`generateLockToken()` utility.
```tsx
// bad
const token = Math.random().toString(36).substring(2, 15) + Math.random().toString(36).substring(2, 15);
// good
import {generateLockToken} from '@fluxer/cache/src/CacheLockValidation';
const token = generateLockToken();
```
---
## 18. Remove no-op string operations
Watch for `.replace()` calls that silently match nothing:
```tsx
// bad — 'deletion_queue:rebuild_lock' doesn't contain 'lock:' as a prefix
// .replace('lock:', '') matches nothing and returns the original string unchanged
REBUILD_LOCK_KEY.replace('lock:', '')
// good — pass the key directly
REBUILD_LOCK_KEY
```
No-op operations are a sign the author assumed a different key format. Verify the
format and remove the dead code.
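This is easy to verify directly: `String.prototype.replace` with a string pattern substitutes only the first match, and with no match at all it returns the input untouched.

```tsx
const REBUILD_LOCK_KEY = 'deletion_queue:rebuild_lock';
// 'lock:' (with the trailing colon) never occurs in the key, so the call is a no-op.
const stripped = REBUILD_LOCK_KEY.replace('lock:', '');
// stripped === 'deletion_queue:rebuild_lock'
```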
---
## 19. Mock implementations should use shared utilities
Test mocks (like `MockCacheService`) that implement lock acquisition should use the
same shared utilities as real providers:
```tsx
// bad — inline duplication
const lockKey = `lock:${key}`;
const token = crypto.randomBytes(16).toString('hex');
// good — shared utilities keep mocks consistent
import {formatLockKey, generateLockToken} from '@fluxer/cache/src/CacheLockValidation';
const lockKey = formatLockKey(key);
const token = generateLockToken();
```
This ensures mocks produce tokens in the same format as real providers and that
format changes propagate automatically.
---
## Changelog
| Date | Change |
|---|---|
| 2026-02-06 | Initial version from cache package refactoring |
| 2026-02-06 | Added downstream cleanup lessons (sections 16-19) |

packages/cache/package.json

@@ -0,0 +1,23 @@
{
"name": "@fluxer/cache",
"version": "0.0.0",
"private": true,
"type": "module",
"exports": {
"./*": "./*"
},
"scripts": {
"test": "vitest run",
"test:watch": "vitest",
"typecheck": "tsgo --noEmit"
},
"dependencies": {
"@fluxer/kv_client": "workspace:*"
},
"devDependencies": {
"@types/node": "catalog:",
"@typescript/native-preview": "catalog:",
"vite-tsconfig-paths": "catalog:",
"vitest": "catalog:"
}
}


@@ -0,0 +1,29 @@
/*
* Copyright (C) 2026 Fluxer Contributors
*
* This file is part of Fluxer.
*
* Fluxer is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Fluxer is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Fluxer. If not, see <https://www.gnu.org/licenses/>.
*/
export function classifyKeyType(key: string): string {
if (key.startsWith('lock:')) return 'lock';
if (key.startsWith('bluesky:')) return 'bluesky';
if (key.includes(':session:')) return 'session';
if (key.includes(':user:')) return 'user';
if (key.includes(':guild:')) return 'guild';
if (key.includes(':channel:')) return 'channel';
if (key.includes(':ratelimit:')) return 'ratelimit';
return 'other';
}


@@ -0,0 +1,43 @@
/*
* Copyright (C) 2026 Fluxer Contributors
*
* This file is part of Fluxer.
*
* Fluxer is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Fluxer is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Fluxer. If not, see <https://www.gnu.org/licenses/>.
*/
import {randomBytes} from 'node:crypto';
const LOCK_KEY_PATTERN = /^[a-zA-Z0-9:_-]+$/;
const LOCK_TOKEN_PATTERN = /^[a-z0-9]+$/;
export function validateLockKey(key: string): void {
if (!LOCK_KEY_PATTERN.test(key)) {
throw new Error('Invalid lock key format');
}
}
export function validateLockToken(token: string): void {
if (!LOCK_TOKEN_PATTERN.test(token)) {
throw new Error('Invalid lock token format');
}
}
export function generateLockToken(): string {
return randomBytes(16).toString('hex');
}
export function formatLockKey(key: string): string {
return `lock:${key}`;
}


@@ -0,0 +1,29 @@
/*
* Copyright (C) 2026 Fluxer Contributors
*
* This file is part of Fluxer.
*
* Fluxer is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Fluxer is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Fluxer. If not, see <https://www.gnu.org/licenses/>.
*/
export interface CacheLogger {
error(obj: unknown, message: string): void;
}
export interface CacheTelemetry {
recordCounter(metric: {name: string; dimensions?: Record<string, string>}): void;
recordHistogram(metric: {name: string; valueMs: number; dimensions?: Record<string, string>}): void;
}
export type CacheKeyClassifier = (key: string) => string;


@@ -0,0 +1,37 @@
/*
* Copyright (C) 2026 Fluxer Contributors
*
* This file is part of Fluxer.
*
* Fluxer is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Fluxer is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Fluxer. If not, see <https://www.gnu.org/licenses/>.
*/
import type {CacheLogger} from '@fluxer/cache/src/CacheProviderTypes';
export function safeJsonParse<T>(value: string, logger?: CacheLogger): T | null {
try {
return JSON.parse(value);
} catch (error) {
if (logger) {
const truncatedValue = value.length > 200 ? `${value.substring(0, 200)}...` : value;
const errorMessage = error instanceof Error ? error.message : String(error);
logger.error({errorMessage, value: truncatedValue}, '[CacheProvider] JSON parse error');
}
return null;
}
}
export function serializeValue<T>(value: T): string {
return JSON.stringify(value);
}

packages/cache/src/ICacheService.tsx

@@ -0,0 +1,56 @@
/*
* Copyright (C) 2026 Fluxer Contributors
*
* This file is part of Fluxer.
*
* Fluxer is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Fluxer is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Fluxer. If not, see <https://www.gnu.org/licenses/>.
*/
export interface CacheMSetEntry<T> {
key: string;
value: T;
ttlSeconds?: number;
}
export abstract class ICacheService {
abstract get<T>(key: string): Promise<T | null>;
abstract set<T>(key: string, value: T, ttlSeconds?: number): Promise<void>;
abstract delete(key: string): Promise<void>;
abstract getAndDelete<T>(key: string): Promise<T | null>;
abstract exists(key: string): Promise<boolean>;
abstract expire(key: string, ttlSeconds: number): Promise<void>;
abstract ttl(key: string): Promise<number>;
abstract mget<T>(keys: Array<string>): Promise<Array<T | null>>;
abstract mset<T>(entries: Array<CacheMSetEntry<T>>): Promise<void>;
abstract deletePattern(pattern: string): Promise<number>;
abstract acquireLock(key: string, ttlSeconds: number): Promise<string | null>;
abstract releaseLock(key: string, token: string): Promise<boolean>;
abstract getAndRenewTtl<T>(key: string, newTtlSeconds: number): Promise<T | null>;
abstract publish(channel: string, message: string): Promise<void>;
abstract sadd(key: string, member: string, ttlSeconds?: number): Promise<void>;
abstract srem(key: string, member: string): Promise<void>;
abstract smembers(key: string): Promise<Set<string>>;
abstract sismember(key: string, member: string): Promise<boolean>;
async getOrSet<T>(key: string, valueFactory: () => Promise<T>, ttlSeconds?: number): Promise<T> {
const existingValue = await this.get<T>(key);
if (existingValue !== null) {
return existingValue;
}
const newValue = await valueFactory();
await this.set(key, newValue, ttlSeconds);
return newValue;
}
}

packages/cache/src/RedisClientTypes.tsx

@@ -0,0 +1,47 @@
/*
* Copyright (C) 2026 Fluxer Contributors
*
* This file is part of Fluxer.
*
* Fluxer is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Fluxer is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Fluxer. If not, see <https://www.gnu.org/licenses/>.
*/
export interface RedisClient {
get(key: string): Promise<string | null>;
set(key: string, value: string): Promise<void>;
setex(key: string, ttlSeconds: number, value: string): Promise<void>;
del(...keys: Array<string>): Promise<number>;
getdel(key: string): Promise<string | null>;
exists(key: string): Promise<number>;
expire(key: string, ttlSeconds: number): Promise<number>;
ttl(key: string): Promise<number>;
mget(...keys: Array<string>): Promise<Array<string | null>>;
mset(...args: Array<string>): Promise<void>;
scan(pattern: string, count: number): Promise<Array<string>>;
publish(channel: string, message: string): Promise<number>;
sadd(key: string, ...members: Array<string>): Promise<number>;
srem(key: string, ...members: Array<string>): Promise<number>;
smembers(key: string): Promise<Array<string>>;
sismember(key: string, member: string): Promise<number>;
getex(key: string, ttlSeconds: number): Promise<string | null>;
pipeline(): RedisPipeline;
}
export interface RedisPipeline {
setex(key: string, ttlSeconds: number, value: string): RedisPipeline;
mset(...args: Array<string>): RedisPipeline;
sadd(key: string, ...members: Array<string>): RedisPipeline;
expire(key: string, ttlSeconds: number): RedisPipeline;
exec(): Promise<unknown>;
}


@@ -0,0 +1,278 @@
/*
* Copyright (C) 2026 Fluxer Contributors
*
* This file is part of Fluxer.
*
* Fluxer is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Fluxer is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Fluxer. If not, see <https://www.gnu.org/licenses/>.
*/
import {
formatLockKey,
generateLockToken,
validateLockKey,
validateLockToken,
} from '@fluxer/cache/src/CacheLockValidation';
import {ICacheService} from '@fluxer/cache/src/ICacheService';
interface CacheEntry<T> {
value: T;
expiresAt?: number;
}
export interface InMemoryProviderConfig {
maxSize?: number;
cleanupIntervalMs?: number;
}
export class InMemoryProvider extends ICacheService {
private cache = new Map<string, CacheEntry<unknown>>();
private sets = new Map<string, Set<string>>();
private locks = new Map<string, {token: string; expiresAt: number}>();
private maxSize: number;
private cleanupInterval?: NodeJS.Timeout;
constructor(config: InMemoryProviderConfig = {}) {
super();
this.maxSize = config.maxSize ?? 10000;
if (config.cleanupIntervalMs) {
this.cleanupInterval = setInterval(() => this.cleanup(), config.cleanupIntervalMs);
}
}
private cleanup(): void {
const now = Date.now();
for (const [key, entry] of this.cache.entries()) {
if (entry.expiresAt && entry.expiresAt <= now) {
this.cache.delete(key);
}
}
for (const [key, lock] of this.locks.entries()) {
if (lock.expiresAt <= now) {
this.locks.delete(key);
}
}
}
private isExpired(entry: CacheEntry<unknown>): boolean {
if (!entry.expiresAt) return false;
return Date.now() >= entry.expiresAt;
}
private evictIfNeeded(): void {
if (this.cache.size >= this.maxSize) {
const firstKey = this.cache.keys().next().value;
if (firstKey !== undefined) {
this.cache.delete(firstKey);
}
}
}
async get<T>(key: string): Promise<T | null> {
const entry = this.cache.get(key) as CacheEntry<T> | undefined;
if (!entry) return null;
if (this.isExpired(entry)) {
this.cache.delete(key);
return null;
}
return entry.value;
}
async set<T>(key: string, value: T, ttlSeconds?: number): Promise<void> {
this.evictIfNeeded();
const entry: CacheEntry<T> = {
value,
expiresAt: ttlSeconds ? Date.now() + ttlSeconds * 1000 : undefined,
};
this.cache.set(key, entry);
}
async delete(key: string): Promise<void> {
this.cache.delete(key);
}
async getAndDelete<T>(key: string): Promise<T | null> {
const value = await this.get<T>(key);
if (value !== null) {
this.cache.delete(key);
}
return value;
}
async exists(key: string): Promise<boolean> {
const entry = this.cache.get(key);
if (!entry) return false;
if (this.isExpired(entry)) {
this.cache.delete(key);
return false;
}
return true;
}
async expire(key: string, ttlSeconds: number): Promise<void> {
const entry = this.cache.get(key);
if (entry && !this.isExpired(entry)) {
entry.expiresAt = Date.now() + ttlSeconds * 1000;
}
}
async ttl(key: string): Promise<number> {
const entry = this.cache.get(key);
if (!entry || this.isExpired(entry)) {
return -2;
}
if (!entry.expiresAt) {
return -1;
}
const ttlMs = entry.expiresAt - Date.now();
return Math.max(0, Math.floor(ttlMs / 1000));
}
async mget<T>(keys: Array<string>): Promise<Array<T | null>> {
const results: Array<T | null> = [];
for (const key of keys) {
results.push(await this.get<T>(key));
}
return results;
}
async mset<T>(entries: Array<{key: string; value: T; ttlSeconds?: number}>): Promise<void> {
for (const entry of entries) {
await this.set(entry.key, entry.value, entry.ttlSeconds);
}
}
async deletePattern(pattern: string): Promise<number> {
// Escape regex metacharacters, then translate glob '*' and anchor the match,
// mirroring Redis MATCH semantics (whole-key match, not substring).
const escaped = pattern.replace(/[.+?^${}()|[\]\\]/g, '\\$&');
const regex = new RegExp(`^${escaped.replace(/\*/g, '.*')}$`);
let deletedCount = 0;
for (const key of this.cache.keys()) {
if (regex.test(key)) {
this.cache.delete(key);
deletedCount++;
}
}
return deletedCount;
}
async acquireLock(key: string, ttlSeconds: number): Promise<string | null> {
validateLockKey(key);
const lockKey = formatLockKey(key);
const existingLock = this.locks.get(lockKey);
if (existingLock && existingLock.expiresAt > Date.now()) {
return null;
}
const token = generateLockToken();
this.locks.set(lockKey, {
token,
expiresAt: Date.now() + ttlSeconds * 1000,
});
return token;
}
async releaseLock(key: string, token: string): Promise<boolean> {
validateLockKey(key);
validateLockToken(token);
const lockKey = formatLockKey(key);
const lock = this.locks.get(lockKey);
if (!lock || lock.token !== token) {
return false;
}
this.locks.delete(lockKey);
return true;
}
async getAndRenewTtl<T>(key: string, newTtlSeconds: number): Promise<T | null> {
const value = await this.get<T>(key);
if (value !== null) {
await this.expire(key, newTtlSeconds);
}
return value;
}
async publish(_channel: string, _message: string): Promise<void> {
return;
}
async sadd(key: string, member: string, ttlSeconds?: number): Promise<void> {
let set = this.sets.get(key);
if (!set) {
set = new Set<string>();
this.sets.set(key, set);
}
set.add(member);
// Always write the expiry sentinel (without a TTL it never expires) so that
// smembers/sismember can distinguish a live set from one whose TTL lapsed;
// previously, sets added without a TTL had no sentinel and read back empty.
await this.set(`${key}:expiry`, {}, ttlSeconds);
}
async srem(key: string, member: string): Promise<void> {
const set = this.sets.get(key);
if (set) {
set.delete(member);
if (set.size === 0) {
this.sets.delete(key);
this.cache.delete(`${key}:expiry`);
}
}
}
async smembers(key: string): Promise<Set<string>> {
const expiryExists = await this.exists(`${key}:expiry`);
if (!expiryExists && this.sets.has(key)) {
return new Set();
}
return this.sets.get(key) ?? new Set<string>();
}
async sismember(key: string, member: string): Promise<boolean> {
const set = this.sets.get(key);
if (!set) return false;
const expiryExists = await this.exists(`${key}:expiry`);
if (!expiryExists) {
return false;
}
return set.has(member);
}
destroy(): void {
if (this.cleanupInterval) {
clearInterval(this.cleanupInterval);
this.cleanupInterval = undefined;
}
this.cache.clear();
this.sets.clear();
this.locks.clear();
}
}


@@ -0,0 +1,244 @@
/*
* Copyright (C) 2026 Fluxer Contributors
*
* This file is part of Fluxer.
*
* Fluxer is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Fluxer is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Fluxer. If not, see <https://www.gnu.org/licenses/>.
*/
import {classifyKeyType} from '@fluxer/cache/src/CacheKeyClassification';
import {
formatLockKey,
generateLockToken,
validateLockKey,
validateLockToken,
} from '@fluxer/cache/src/CacheLockValidation';
import type {CacheLogger, CacheTelemetry} from '@fluxer/cache/src/CacheProviderTypes';
import {safeJsonParse, serializeValue} from '@fluxer/cache/src/CacheSerialization';
import {ICacheService} from '@fluxer/cache/src/ICacheService';
import type {IKVProvider} from '@fluxer/kv_client/src/IKVProvider';
export interface KVCacheProviderConfig {
client: IKVProvider;
cacheName?: string;
logger?: CacheLogger;
telemetry?: CacheTelemetry;
}
export class KVCacheProvider extends ICacheService {
private client: IKVProvider;
private cacheName: string;
private logger?: CacheLogger;
private telemetry?: CacheTelemetry;
constructor(config: KVCacheProviderConfig) {
super();
this.client = config.client;
this.cacheName = config.cacheName ?? 'kv';
this.logger = config.logger;
this.telemetry = config.telemetry;
}
private async instrumented<T>(
operation: string,
key: string,
fn: () => Promise<T>,
statusResolver?: (result: T) => string,
): Promise<T> {
const start = Date.now();
try {
const result = await fn();
const duration = Date.now() - start;
const status = statusResolver ? statusResolver(result) : 'success';
this.telemetry?.recordCounter({
name: 'cache.operation',
dimensions: {operation, cache_name: this.cacheName, status, key_type: classifyKeyType(key)},
});
this.telemetry?.recordHistogram({
name: 'cache.operation_latency',
valueMs: duration,
dimensions: {operation, cache_name: this.cacheName},
});
return result;
} catch (error) {
const duration = Date.now() - start;
this.telemetry?.recordCounter({
name: 'cache.operation',
dimensions: {operation, cache_name: this.cacheName, status: 'error', key_type: classifyKeyType(key)},
});
this.telemetry?.recordHistogram({
name: 'cache.operation_latency',
valueMs: duration,
dimensions: {operation, cache_name: this.cacheName},
});
throw error;
}
}
async get<T>(key: string): Promise<T | null> {
return this.instrumented(
'get',
key,
async () => {
const value = await this.client.get(key);
if (value == null) return null;
return safeJsonParse<T>(value, this.logger);
},
(result) => (result == null ? 'miss' : 'hit'),
);
}
async set<T>(key: string, value: T, ttlSeconds?: number): Promise<void> {
return this.instrumented('set', key, async () => {
const serialized = serializeValue(value);
if (ttlSeconds) {
await this.client.setex(key, ttlSeconds, serialized);
} else {
await this.client.set(key, serialized);
}
});
}
async delete(key: string): Promise<void> {
return this.instrumented('delete', key, async () => {
await this.client.del(key);
});
}
async getAndDelete<T>(key: string): Promise<T | null> {
const value = await this.client.getdel(key);
if (value == null) {
return null;
}
return safeJsonParse<T>(value, this.logger);
}
async exists(key: string): Promise<boolean> {
const result = await this.client.exists(key);
return result === 1;
}
async expire(key: string, ttlSeconds: number): Promise<void> {
await this.client.expire(key, ttlSeconds);
}
async ttl(key: string): Promise<number> {
return await this.client.ttl(key);
}
async mget<T>(keys: Array<string>): Promise<Array<T | null>> {
if (keys.length === 0) return [];
const values = await this.client.mget(...keys);
return values.map((value: string | null) => {
if (value == null) return null;
return safeJsonParse<T>(value, this.logger);
});
}
async mset<T>(entries: Array<{key: string; value: T; ttlSeconds?: number}>): Promise<void> {
if (entries.length === 0) return;
const withoutTtl: Array<{key: string; value: T}> = [];
const withTtl: Array<{key: string; value: T; ttlSeconds: number}> = [];
for (const entry of entries) {
if (entry.ttlSeconds) {
withTtl.push({
key: entry.key,
value: entry.value,
ttlSeconds: entry.ttlSeconds,
});
} else {
withoutTtl.push({
key: entry.key,
value: entry.value,
});
}
}
const pipeline = this.client.pipeline();
if (withoutTtl.length > 0) {
const flatArgs: Array<string> = [];
for (const entry of withoutTtl) {
flatArgs.push(entry.key, serializeValue(entry.value));
}
pipeline.mset(...flatArgs);
}
for (const entry of withTtl) {
pipeline.setex(entry.key, entry.ttlSeconds, serializeValue(entry.value));
}
await pipeline.exec();
}
async deletePattern(pattern: string): Promise<number> {
const keys = await this.client.scan(pattern, 1000);
if (keys.length === 0) return 0;
return await this.client.del(...keys);
}
async acquireLock(key: string, ttlSeconds: number): Promise<string | null> {
validateLockKey(key);
const token = generateLockToken();
const lockKey = formatLockKey(key);
const result = await this.client.set(lockKey, token, 'EX', ttlSeconds, 'NX');
return result === 'OK' ? token : null;
}
async releaseLock(key: string, token: string): Promise<boolean> {
validateLockKey(key);
validateLockToken(token);
const lockKey = formatLockKey(key);
return await this.client.releaseLock(lockKey, token);
}
async getAndRenewTtl<T>(key: string, newTtlSeconds: number): Promise<T | null> {
const value = await this.client.getex(key, newTtlSeconds);
if (value == null) return null;
return safeJsonParse<T>(value, this.logger);
}
async publish(channel: string, message: string): Promise<void> {
await this.client.publish(channel, message);
}
async sadd(key: string, member: string, ttlSeconds?: number): Promise<void> {
const pipeline = this.client.pipeline();
pipeline.sadd(key, member);
if (ttlSeconds) {
pipeline.expire(key, ttlSeconds);
}
await pipeline.exec();
}
async srem(key: string, member: string): Promise<void> {
await this.client.srem(key, member);
}
async smembers(key: string): Promise<Set<string>> {
const members = await this.client.smembers(key);
return new Set(members);
}
async sismember(key: string, member: string): Promise<boolean> {
const result = await this.client.sismember(key, member);
return result === 1;
}
}
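Both providers above expose the same token-fencing lock contract: `acquireLock` returns an opaque token (or `null` when the lock is held), and `releaseLock` succeeds only when the caller presents the token it was issued, so one caller can never release another's lock. A minimal self-contained sketch of that contract (the `InMemoryLock` class here is hypothetical, for illustration only — not the package's implementation):

```typescript
// Hypothetical minimal lock illustrating the acquire/release token contract
// used by the cache providers.
class InMemoryLock {
  private locks = new Map<string, {token: string; expiresAt: number}>();

  acquire(key: string, ttlSeconds: number): string | null {
    const existing = this.locks.get(key);
    if (existing && existing.expiresAt > Date.now()) {
      return null; // already held by someone else
    }
    const token = Math.random().toString(16).slice(2);
    this.locks.set(key, {token, expiresAt: Date.now() + ttlSeconds * 1000});
    return token;
  }

  release(key: string, token: string): boolean {
    const entry = this.locks.get(key);
    if (!entry || entry.token !== token) {
      return false; // wrong holder: never release someone else's lock
    }
    this.locks.delete(key);
    return true;
  }
}
```

The token check on release is what makes the lock safe across TTL expiry: if a holder stalls past its TTL and a second caller acquires the lock, the first holder's stale token no longer matches and its late release is a no-op.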

@@ -0,0 +1,178 @@
/*
* Copyright (C) 2026 Fluxer Contributors
*
* This file is part of Fluxer.
*
* Fluxer is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Fluxer is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Fluxer. If not, see <https://www.gnu.org/licenses/>.
*/
import {formatLockKey, generateLockToken, validateLockKey} from '@fluxer/cache/src/CacheLockValidation';
import {safeJsonParse, serializeValue} from '@fluxer/cache/src/CacheSerialization';
import {ICacheService} from '@fluxer/cache/src/ICacheService';
import type {RedisClient} from '@fluxer/cache/src/RedisClientTypes';
export interface RedisCacheProviderConfig {
client: RedisClient;
cacheName?: string;
}
export class RedisCacheProvider extends ICacheService {
private client: RedisClient;
constructor(config: RedisCacheProviderConfig) {
super();
this.client = config.client;
}
async get<T>(key: string): Promise<T | null> {
const value = await this.client.get(key);
if (value == null) return null;
return safeJsonParse<T>(value);
}
async set<T>(key: string, value: T, ttlSeconds?: number): Promise<void> {
const serialized = serializeValue(value);
if (ttlSeconds) {
await this.client.setex(key, ttlSeconds, serialized);
} else {
await this.client.set(key, serialized);
}
}
async delete(key: string): Promise<void> {
await this.client.del(key);
}
async getAndDelete<T>(key: string): Promise<T | null> {
const value = await this.client.getdel(key);
if (value == null) {
return null;
}
return safeJsonParse<T>(value);
}
async exists(key: string): Promise<boolean> {
const result = await this.client.exists(key);
return result === 1;
}
async expire(key: string, ttlSeconds: number): Promise<void> {
await this.client.expire(key, ttlSeconds);
}
async ttl(key: string): Promise<number> {
return await this.client.ttl(key);
}
async mget<T>(keys: Array<string>): Promise<Array<T | null>> {
if (keys.length === 0) return [];
const values = await this.client.mget(...keys);
return values.map((value) => {
if (value == null) return null;
return safeJsonParse<T>(value);
});
}
async mset<T>(entries: Array<{key: string; value: T; ttlSeconds?: number}>): Promise<void> {
if (entries.length === 0) return;
const withoutTtl: Array<{key: string; value: T}> = [];
const withTtl: Array<{key: string; value: T; ttlSeconds: number}> = [];
for (const entry of entries) {
if (entry.ttlSeconds) {
withTtl.push({
key: entry.key,
value: entry.value,
ttlSeconds: entry.ttlSeconds,
});
} else {
withoutTtl.push({
key: entry.key,
value: entry.value,
});
}
}
const pipeline = this.client.pipeline();
if (withoutTtl.length > 0) {
const flatArgs: Array<string> = [];
for (const entry of withoutTtl) {
flatArgs.push(entry.key, serializeValue(entry.value));
}
pipeline.mset(...flatArgs);
}
for (const entry of withTtl) {
pipeline.setex(entry.key, entry.ttlSeconds, serializeValue(entry.value));
}
await pipeline.exec();
}
async deletePattern(pattern: string): Promise<number> {
const keys = await this.client.scan(pattern, 1000);
if (keys.length === 0) return 0;
return await this.client.del(...keys);
}
async acquireLock(key: string, ttlSeconds: number): Promise<string | null> {
validateLockKey(key);
const token = generateLockToken();
const lockKey = formatLockKey(key);
// Acquire atomically: EX + NX write the token and TTL in one command and
// fail when the lock is already held. A separate set + expire would both
// clobber an existing holder's token and risk leaving a lock with no TTL
// if the process dies between the two calls. (Assumes the client's set
// supports the EX/NX options, as KVCacheProvider's client does.)
const result = await this.client.set(lockKey, token, 'EX', ttlSeconds, 'NX');
return result === 'OK' ? token : null;
}
async releaseLock(_key: string, _token: string): Promise<boolean> {
throw new Error('releaseLock not implemented for RedisCacheProvider');
}
async getAndRenewTtl<T>(key: string, newTtlSeconds: number): Promise<T | null> {
const value = await this.client.getex(key, newTtlSeconds);
if (value == null) return null;
return safeJsonParse<T>(value);
}
async publish(channel: string, message: string): Promise<void> {
await this.client.publish(channel, message);
}
async sadd(key: string, member: string, ttlSeconds?: number): Promise<void> {
const pipeline = this.client.pipeline();
pipeline.sadd(key, member);
if (ttlSeconds) {
pipeline.expire(key, ttlSeconds);
}
await pipeline.exec();
}
async srem(key: string, member: string): Promise<void> {
await this.client.srem(key, member);
}
async smembers(key: string): Promise<Set<string>> {
const members = await this.client.smembers(key);
return new Set(members);
}
async sismember(key: string, member: string): Promise<boolean> {
const result = await this.client.sismember(key, member);
return result === 1;
}
}
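Both providers implement `mset` with the same TTL partition: entries without a TTL are flattened into a single pipelined MSET, while TTL'd entries become individual SETEX commands. The partition step in isolation (a hypothetical standalone helper, not code from this package):

```typescript
interface Entry<T> {
  key: string;
  value: T;
  ttlSeconds?: number;
}

// Split entries into one MSET-able batch and per-key SETEX candidates.
// Note the truthiness check: a ttlSeconds of 0 (or undefined) counts as
// "no TTL", mirroring the providers' `if (entry.ttlSeconds)` test.
function partitionByTtl<T>(entries: Array<Entry<T>>): {
  withoutTtl: Array<Entry<T>>;
  withTtl: Array<Required<Entry<T>>>;
} {
  const withoutTtl: Array<Entry<T>> = [];
  const withTtl: Array<Required<Entry<T>>> = [];
  for (const entry of entries) {
    if (entry.ttlSeconds) {
      withTtl.push({key: entry.key, value: entry.value, ttlSeconds: entry.ttlSeconds});
    } else {
      withoutTtl.push(entry);
    }
  }
  return {withoutTtl, withTtl};
}
```

Batching this way keeps the round trip count at one pipeline `exec` regardless of how the entries mix plain and TTL'd keys.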

@@ -0,0 +1,748 @@
/*
* Copyright (C) 2026 Fluxer Contributors
*
* This file is part of Fluxer.
*
* Fluxer is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Fluxer is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Fluxer. If not, see <https://www.gnu.org/licenses/>.
*/
import {InMemoryProvider} from '@fluxer/cache/src/providers/InMemoryProvider';
import {afterEach, beforeEach, describe, expect, it, vi} from 'vitest';
describe('InMemoryProvider', () => {
let cache: InMemoryProvider;
beforeEach(() => {
vi.useFakeTimers();
cache = new InMemoryProvider();
});
afterEach(() => {
cache.destroy();
vi.useRealTimers();
});
describe('get and set', () => {
it('returns null for non-existent key', async () => {
const result = await cache.get('nonexistent');
expect(result).toBeNull();
});
it('stores and retrieves a string value', async () => {
await cache.set('key', 'value');
const result = await cache.get<string>('key');
expect(result).toBe('value');
});
it('stores and retrieves an object value', async () => {
const obj = {name: 'test', count: 42, nested: {foo: 'bar'}};
await cache.set('obj', obj);
const result = await cache.get<typeof obj>('obj');
expect(result).toEqual(obj);
});
it('stores and retrieves an array value', async () => {
const arr = [1, 2, 3, 'four', {five: 5}];
await cache.set('arr', arr);
const result = await cache.get<typeof arr>('arr');
expect(result).toEqual(arr);
});
it('stores and retrieves null value', async () => {
await cache.set('nullKey', null);
const result = await cache.get('nullKey');
expect(result).toBeNull();
});
it('stores and retrieves zero value', async () => {
await cache.set('zero', 0);
const result = await cache.get<number>('zero');
expect(result).toBe(0);
});
it('stores and retrieves empty string', async () => {
await cache.set('empty', '');
const result = await cache.get<string>('empty');
expect(result).toBe('');
});
it('stores and retrieves boolean values', async () => {
await cache.set('true', true);
await cache.set('false', false);
expect(await cache.get<boolean>('true')).toBe(true);
expect(await cache.get<boolean>('false')).toBe(false);
});
it('overwrites existing value', async () => {
await cache.set('key', 'first');
await cache.set('key', 'second');
const result = await cache.get<string>('key');
expect(result).toBe('second');
});
});
describe('TTL and expiration', () => {
it('returns value before TTL expires', async () => {
await cache.set('key', 'value', 60);
vi.advanceTimersByTime(30000);
const result = await cache.get<string>('key');
expect(result).toBe('value');
});
it('returns null after TTL expires', async () => {
await cache.set('key', 'value', 1);
vi.advanceTimersByTime(1001);
const result = await cache.get<string>('key');
expect(result).toBeNull();
});
it('returns null exactly at TTL boundary', async () => {
await cache.set('key', 'value', 5);
vi.advanceTimersByTime(5000);
const result = await cache.get<string>('key');
expect(result).toBeNull();
});
it('stores value without TTL indefinitely', async () => {
await cache.set('key', 'value');
vi.advanceTimersByTime(999999999);
const result = await cache.get<string>('key');
expect(result).toBe('value');
});
it('handles very short TTL', async () => {
await cache.set('key', 'value', 1);
const beforeExpiry = await cache.get<string>('key');
expect(beforeExpiry).toBe('value');
vi.advanceTimersByTime(1000);
const afterExpiry = await cache.get<string>('key');
expect(afterExpiry).toBeNull();
});
it('handles long TTL values', async () => {
const oneYear = 365 * 24 * 60 * 60;
await cache.set('key', 'value', oneYear);
vi.advanceTimersByTime(oneYear * 1000 - 1);
expect(await cache.get<string>('key')).toBe('value');
vi.advanceTimersByTime(2);
expect(await cache.get<string>('key')).toBeNull();
});
});
describe('delete', () => {
it('deletes existing key', async () => {
await cache.set('key', 'value');
await cache.delete('key');
const result = await cache.get('key');
expect(result).toBeNull();
});
it('does not throw when deleting non-existent key', async () => {
await expect(cache.delete('nonexistent')).resolves.toBeUndefined();
});
it('only deletes specified key', async () => {
await cache.set('key1', 'value1');
await cache.set('key2', 'value2');
await cache.delete('key1');
expect(await cache.get('key1')).toBeNull();
expect(await cache.get<string>('key2')).toBe('value2');
});
});
describe('getAndDelete', () => {
it('returns value and deletes key', async () => {
await cache.set('key', 'value');
const result = await cache.getAndDelete<string>('key');
expect(result).toBe('value');
expect(await cache.get('key')).toBeNull();
});
it('returns null for non-existent key', async () => {
const result = await cache.getAndDelete('nonexistent');
expect(result).toBeNull();
});
it('returns null for expired key', async () => {
await cache.set('key', 'value', 1);
vi.advanceTimersByTime(1001);
const result = await cache.getAndDelete('key');
expect(result).toBeNull();
});
});
describe('exists', () => {
it('returns true for existing key', async () => {
await cache.set('key', 'value');
const result = await cache.exists('key');
expect(result).toBe(true);
});
it('returns false for non-existent key', async () => {
const result = await cache.exists('nonexistent');
expect(result).toBe(false);
});
it('returns false for expired key', async () => {
await cache.set('key', 'value', 1);
vi.advanceTimersByTime(1001);
const result = await cache.exists('key');
expect(result).toBe(false);
});
it('cleans up expired key on exists check', async () => {
await cache.set('key', 'value', 1);
vi.advanceTimersByTime(1001);
await cache.exists('key');
await cache.set('key', 'newValue');
expect(await cache.get<string>('key')).toBe('newValue');
});
});
describe('expire', () => {
it('sets TTL on existing key', async () => {
await cache.set('key', 'value');
await cache.expire('key', 5);
vi.advanceTimersByTime(4999);
expect(await cache.get<string>('key')).toBe('value');
vi.advanceTimersByTime(2);
expect(await cache.get('key')).toBeNull();
});
it('does not set TTL on non-existent key', async () => {
await cache.expire('nonexistent', 5);
expect(await cache.exists('nonexistent')).toBe(false);
});
it('does not set TTL on expired key', async () => {
await cache.set('key', 'value', 1);
vi.advanceTimersByTime(1001);
await cache.expire('key', 5);
expect(await cache.exists('key')).toBe(false);
});
it('overwrites existing TTL', async () => {
await cache.set('key', 'value', 10);
await cache.expire('key', 2);
vi.advanceTimersByTime(2001);
expect(await cache.get('key')).toBeNull();
});
it('extends TTL on key with existing TTL', async () => {
await cache.set('key', 'value', 5);
vi.advanceTimersByTime(3000);
await cache.expire('key', 10);
vi.advanceTimersByTime(8000);
expect(await cache.get<string>('key')).toBe('value');
});
});
describe('ttl', () => {
it('returns -2 for non-existent key', async () => {
const result = await cache.ttl('nonexistent');
expect(result).toBe(-2);
});
it('returns -1 for key without TTL', async () => {
await cache.set('key', 'value');
const result = await cache.ttl('key');
expect(result).toBe(-1);
});
it('returns remaining TTL in seconds', async () => {
await cache.set('key', 'value', 60);
vi.advanceTimersByTime(30000);
const result = await cache.ttl('key');
expect(result).toBe(30);
});
it('returns 0 when TTL is almost expired', async () => {
await cache.set('key', 'value', 1);
vi.advanceTimersByTime(999);
const result = await cache.ttl('key');
expect(result).toBe(0);
});
it('returns -2 for expired key', async () => {
await cache.set('key', 'value', 1);
vi.advanceTimersByTime(1001);
const result = await cache.ttl('key');
expect(result).toBe(-2);
});
});
describe('mget', () => {
it('returns values for multiple keys', async () => {
await cache.set('key1', 'value1');
await cache.set('key2', 'value2');
await cache.set('key3', 'value3');
const results = await cache.mget<string>(['key1', 'key2', 'key3']);
expect(results).toEqual(['value1', 'value2', 'value3']);
});
it('returns null for missing keys', async () => {
await cache.set('key1', 'value1');
const results = await cache.mget<string>(['key1', 'missing', 'key1']);
expect(results).toEqual(['value1', null, 'value1']);
});
it('returns empty array for empty input', async () => {
const results = await cache.mget([]);
expect(results).toEqual([]);
});
it('handles expired keys correctly', async () => {
await cache.set('key1', 'value1', 1);
await cache.set('key2', 'value2', 10);
vi.advanceTimersByTime(2000);
const results = await cache.mget<string>(['key1', 'key2']);
expect(results).toEqual([null, 'value2']);
});
});
describe('mset', () => {
it('sets multiple keys at once', async () => {
await cache.mset([
{key: 'key1', value: 'value1'},
{key: 'key2', value: 'value2'},
{key: 'key3', value: 'value3'},
]);
expect(await cache.get<string>('key1')).toBe('value1');
expect(await cache.get<string>('key2')).toBe('value2');
expect(await cache.get<string>('key3')).toBe('value3');
});
it('sets TTL for individual keys', async () => {
await cache.mset([
{key: 'short', value: 'shortVal', ttlSeconds: 1},
{key: 'long', value: 'longVal', ttlSeconds: 60},
]);
vi.advanceTimersByTime(2000);
expect(await cache.get('short')).toBeNull();
expect(await cache.get<string>('long')).toBe('longVal');
});
it('handles empty array', async () => {
await expect(cache.mset([])).resolves.toBeUndefined();
});
it('overwrites existing keys', async () => {
await cache.set('key1', 'original');
await cache.mset([{key: 'key1', value: 'updated'}]);
expect(await cache.get<string>('key1')).toBe('updated');
});
});
describe('deletePattern', () => {
it('deletes keys matching pattern', async () => {
await cache.set('user:1', 'user1');
await cache.set('user:2', 'user2');
await cache.set('session:1', 'session1');
const count = await cache.deletePattern('user:*');
expect(count).toBe(2);
expect(await cache.get('user:1')).toBeNull();
expect(await cache.get('user:2')).toBeNull();
expect(await cache.get<string>('session:1')).toBe('session1');
});
it('returns 0 for no matches', async () => {
await cache.set('key1', 'value1');
const count = await cache.deletePattern('nonexistent:*');
expect(count).toBe(0);
});
it('handles complex patterns', async () => {
await cache.set('prefix:middle:suffix', 'value1');
await cache.set('prefix:other:suffix', 'value2');
await cache.set('other:middle:suffix', 'value3');
const count = await cache.deletePattern('prefix:*:suffix');
expect(count).toBe(2);
expect(await cache.get('prefix:middle:suffix')).toBeNull();
expect(await cache.get('prefix:other:suffix')).toBeNull();
expect(await cache.get<string>('other:middle:suffix')).toBe('value3');
});
it('handles wildcard at start', async () => {
await cache.set('test:suffix', 'value1');
await cache.set('other:suffix', 'value2');
await cache.set('test:other', 'value3');
const count = await cache.deletePattern('*:suffix');
expect(count).toBe(2);
});
});
describe('acquireLock', () => {
it('acquires lock successfully', async () => {
const token = await cache.acquireLock('resource', 60);
expect(token).not.toBeNull();
expect(token).toMatch(/^[a-f0-9]{32}$/);
});
it('fails to acquire lock when already held', async () => {
await cache.acquireLock('resource', 60);
const secondToken = await cache.acquireLock('resource', 60);
expect(secondToken).toBeNull();
});
it('allows reacquiring lock after expiry', async () => {
await cache.acquireLock('resource', 1);
vi.advanceTimersByTime(1001);
const newToken = await cache.acquireLock('resource', 60);
expect(newToken).not.toBeNull();
});
it('throws on invalid key format', async () => {
await expect(cache.acquireLock('invalid key!', 60)).rejects.toThrow('Invalid lock key format');
});
it('allows valid key characters', async () => {
const token = await cache.acquireLock('valid-key_123:test', 60);
expect(token).not.toBeNull();
});
it('acquires different locks independently', async () => {
const token1 = await cache.acquireLock('resource1', 60);
const token2 = await cache.acquireLock('resource2', 60);
expect(token1).not.toBeNull();
expect(token2).not.toBeNull();
expect(token1).not.toBe(token2);
});
});
describe('releaseLock', () => {
it('releases lock with correct token', async () => {
const token = await cache.acquireLock('resource', 60);
const released = await cache.releaseLock('resource', token!);
expect(released).toBe(true);
const newToken = await cache.acquireLock('resource', 60);
expect(newToken).not.toBeNull();
});
it('fails to release with wrong token', async () => {
await cache.acquireLock('resource', 60);
const released = await cache.releaseLock('resource', 'wrongtoken123456789012');
expect(released).toBe(false);
});
it('fails to release non-existent lock', async () => {
const released = await cache.releaseLock('nonexistent', 'sometoken12345678901234');
expect(released).toBe(false);
});
it('throws on invalid key format', async () => {
await expect(cache.releaseLock('invalid key!', 'token')).rejects.toThrow('Invalid lock key format');
});
it('throws on invalid token format', async () => {
await expect(cache.releaseLock('validkey', 'INVALID_TOKEN!')).rejects.toThrow('Invalid lock token format');
});
});
describe('getAndRenewTtl', () => {
it('returns value and renews TTL', async () => {
await cache.set('key', 'value', 10);
vi.advanceTimersByTime(5000);
const result = await cache.getAndRenewTtl<string>('key', 60);
expect(result).toBe('value');
vi.advanceTimersByTime(30000);
expect(await cache.get<string>('key')).toBe('value');
vi.advanceTimersByTime(31000);
expect(await cache.get('key')).toBeNull();
});
it('returns null for non-existent key', async () => {
const result = await cache.getAndRenewTtl('nonexistent', 60);
expect(result).toBeNull();
});
it('returns null for expired key', async () => {
await cache.set('key', 'value', 1);
vi.advanceTimersByTime(1001);
const result = await cache.getAndRenewTtl('key', 60);
expect(result).toBeNull();
});
});
describe('publish', () => {
it('does not throw (no-op in memory provider)', async () => {
await expect(cache.publish('channel', 'message')).resolves.toBeUndefined();
});
});
describe('Set operations', () => {
describe('sadd', () => {
it('adds member to a new set', async () => {
await cache.sadd('myset', 'member1', 60);
const members = await cache.smembers('myset');
expect(members.has('member1')).toBe(true);
});
it('adds member to existing set', async () => {
await cache.sadd('myset', 'member1', 60);
await cache.sadd('myset', 'member2', 60);
const members = await cache.smembers('myset');
expect(members.has('member1')).toBe(true);
expect(members.has('member2')).toBe(true);
});
it('does not duplicate members', async () => {
await cache.sadd('myset', 'member1', 60);
await cache.sadd('myset', 'member1', 60);
const members = await cache.smembers('myset');
expect(members.size).toBe(1);
});
});
describe('srem', () => {
it('removes member from set', async () => {
await cache.sadd('myset', 'member1', 60);
await cache.sadd('myset', 'member2', 60);
await cache.srem('myset', 'member1');
const members = await cache.smembers('myset');
expect(members.has('member1')).toBe(false);
expect(members.has('member2')).toBe(true);
});
it('removes set when last member is removed', async () => {
await cache.sadd('myset', 'member1', 60);
await cache.srem('myset', 'member1');
const members = await cache.smembers('myset');
expect(members.size).toBe(0);
});
it('does not throw when removing from non-existent set', async () => {
await expect(cache.srem('nonexistent', 'member')).resolves.toBeUndefined();
});
});
describe('smembers', () => {
it('returns all members of set', async () => {
await cache.sadd('myset', 'member1', 60);
await cache.sadd('myset', 'member2', 60);
await cache.sadd('myset', 'member3', 60);
const members = await cache.smembers('myset');
expect(members.size).toBe(3);
expect(members).toEqual(new Set(['member1', 'member2', 'member3']));
});
it('returns empty set for non-existent key', async () => {
const members = await cache.smembers('nonexistent');
expect(members.size).toBe(0);
});
it('returns empty set when TTL expires', async () => {
await cache.sadd('myset', 'member1', 1);
vi.advanceTimersByTime(2000);
const members = await cache.smembers('myset');
expect(members.size).toBe(0);
});
});
describe('sismember', () => {
it('returns true for existing member', async () => {
await cache.sadd('myset', 'member1', 60);
const result = await cache.sismember('myset', 'member1');
expect(result).toBe(true);
});
it('returns false for non-existing member', async () => {
await cache.sadd('myset', 'member1', 60);
const result = await cache.sismember('myset', 'member2');
expect(result).toBe(false);
});
it('returns false for non-existent set', async () => {
const result = await cache.sismember('nonexistent', 'member');
expect(result).toBe(false);
});
it('returns false when TTL expires', async () => {
await cache.sadd('myset', 'member1', 1);
vi.advanceTimersByTime(2000);
const result = await cache.sismember('myset', 'member1');
expect(result).toBe(false);
});
});
});
describe('Memory limits', () => {
it('evicts oldest entry when max size reached', async () => {
const smallCache = new InMemoryProvider({maxSize: 3});
try {
await smallCache.set('key1', 'value1');
await smallCache.set('key2', 'value2');
await smallCache.set('key3', 'value3');
await smallCache.set('key4', 'value4');
expect(await smallCache.get('key1')).toBeNull();
expect(await smallCache.get<string>('key2')).toBe('value2');
expect(await smallCache.get<string>('key3')).toBe('value3');
expect(await smallCache.get<string>('key4')).toBe('value4');
} finally {
smallCache.destroy();
}
});
it('evicts multiple entries as needed', async () => {
const smallCache = new InMemoryProvider({maxSize: 2});
try {
await smallCache.set('key1', 'value1');
await smallCache.set('key2', 'value2');
await smallCache.set('key3', 'value3');
await smallCache.set('key4', 'value4');
await smallCache.set('key5', 'value5');
expect(await smallCache.get('key1')).toBeNull();
expect(await smallCache.get('key2')).toBeNull();
expect(await smallCache.get('key3')).toBeNull();
expect(await smallCache.get<string>('key4')).toBe('value4');
expect(await smallCache.get<string>('key5')).toBe('value5');
} finally {
smallCache.destroy();
}
});
it('uses default max size of 10000', async () => {
const defaultCache = new InMemoryProvider();
try {
for (let i = 0; i < 100; i++) {
await defaultCache.set(`key${i}`, `value${i}`);
}
expect(await defaultCache.get<string>('key0')).toBe('value0');
} finally {
defaultCache.destroy();
}
});
});
describe('Cleanup interval', () => {
it('cleans up expired entries periodically', async () => {
const cleanupCache = new InMemoryProvider({cleanupIntervalMs: 1000});
try {
await cleanupCache.set('key1', 'value1', 2);
await cleanupCache.set('key2', 'value2', 10);
vi.advanceTimersByTime(3000);
expect(await cleanupCache.get<string>('key2')).toBe('value2');
} finally {
cleanupCache.destroy();
}
});
it('cleans up expired locks', async () => {
const cleanupCache = new InMemoryProvider({cleanupIntervalMs: 1000});
try {
await cleanupCache.acquireLock('resource', 1);
vi.advanceTimersByTime(2000);
const newToken = await cleanupCache.acquireLock('resource', 60);
expect(newToken).not.toBeNull();
} finally {
cleanupCache.destroy();
}
});
});
describe('destroy', () => {
it('clears all data', async () => {
await cache.set('key1', 'value1');
await cache.set('key2', 'value2');
await cache.sadd('set1', 'member1', 60);
await cache.acquireLock('lock1', 60);
cache.destroy();
const newCache = new InMemoryProvider();
try {
expect(await newCache.get('key1')).toBeNull();
} finally {
newCache.destroy();
}
});
it('stops cleanup interval', () => {
const cleanupCache = new InMemoryProvider({cleanupIntervalMs: 1000});
cleanupCache.destroy();
expect(() => vi.advanceTimersByTime(5000)).not.toThrow();
});
it('can be called multiple times safely', () => {
expect(() => {
cache.destroy();
cache.destroy();
}).not.toThrow();
});
});
describe('Edge cases', () => {
it('handles special characters in keys', async () => {
const specialKey = 'key:with:colons';
await cache.set(specialKey, 'value');
expect(await cache.get<string>(specialKey)).toBe('value');
});
it('handles unicode in values', async () => {
const unicodeValue = 'héllo wörld 日本語 🚀';
await cache.set('unicode', unicodeValue);
expect(await cache.get<string>('unicode')).toBe(unicodeValue);
});
it('handles large objects', async () => {
const largeObj = {
data: Array.from({length: 1000}, (_, i) => ({
id: i,
name: `item-${i}`,
nested: {value: i * 2},
})),
};
await cache.set('large', largeObj);
const result = await cache.get<typeof largeObj>('large');
expect(result?.data.length).toBe(1000);
});
it('handles rapid sequential operations', async () => {
for (let i = 0; i < 100; i++) {
await cache.set(`rapid-${i}`, i);
}
for (let i = 0; i < 100; i++) {
expect(await cache.get<number>(`rapid-${i}`)).toBe(i);
}
});
it('handles concurrent operations', async () => {
const promises = [];
for (let i = 0; i < 50; i++) {
promises.push(cache.set(`concurrent-${i}`, i));
}
await Promise.all(promises);
const getPromises = [];
for (let i = 0; i < 50; i++) {
getPromises.push(cache.get<number>(`concurrent-${i}`));
}
const results = await Promise.all(getPromises);
expect(results).toEqual(Array.from({length: 50}, (_, i) => i));
});
});
});

@@ -0,0 +1,905 @@
/*
* Copyright (C) 2026 Fluxer Contributors
*
* This file is part of Fluxer.
*
* Fluxer is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Fluxer is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Fluxer. If not, see <https://www.gnu.org/licenses/>.
*/
import type {CacheLogger, CacheTelemetry} from '@fluxer/cache/src/CacheProviderTypes';
import {KVCacheProvider} from '@fluxer/cache/src/providers/KVCacheProvider';
import type {IKVPipeline, IKVProvider, IKVSubscription} from '@fluxer/kv_client/src/IKVProvider';
import {afterEach, beforeEach, describe, expect, it, vi} from 'vitest';
class MockKVPipeline implements IKVPipeline {
private operations: Array<{method: string; args: Array<unknown>}> = [];
constructor(
private store: Map<string, string>,
private sets: Map<string, Set<string>>,
private expiries: Map<string, number>,
) {}
get(key: string) {
this.operations.push({method: 'get', args: [key]});
return this;
}
set(key: string, value: string) {
this.operations.push({method: 'set', args: [key, value]});
return this;
}
setex(key: string, ttlSeconds: number, value: string) {
this.operations.push({method: 'setex', args: [key, ttlSeconds, value]});
return this;
}
del(key: string) {
this.operations.push({method: 'del', args: [key]});
return this;
}
expire(key: string, ttlSeconds: number) {
this.operations.push({method: 'expire', args: [key, ttlSeconds]});
return this;
}
sadd(key: string, ...members: Array<string>) {
this.operations.push({method: 'sadd', args: [key, ...members]});
return this;
}
srem(key: string, ...members: Array<string>) {
this.operations.push({method: 'srem', args: [key, ...members]});
return this;
}
zadd(key: string, score: number, value: string) {
this.operations.push({method: 'zadd', args: [key, score, value]});
return this;
}
zrem(key: string, ...members: Array<string>) {
this.operations.push({method: 'zrem', args: [key, ...members]});
return this;
}
mset(...args: Array<string>) {
this.operations.push({method: 'mset', args});
return this;
}
async exec(): Promise<Array<[Error | null, unknown]>> {
for (const op of this.operations) {
switch (op.method) {
case 'set':
this.store.set(op.args[0] as string, op.args[1] as string);
break;
case 'setex': {
const [key, ttl, value] = op.args as [string, number, string];
this.store.set(key, value);
this.expiries.set(key, Date.now() + ttl * 1000);
break;
}
case 'del':
this.store.delete(op.args[0] as string);
break;
case 'expire': {
const [expKey, expTtl] = op.args as [string, number];
if (this.store.has(expKey)) {
this.expiries.set(expKey, Date.now() + expTtl * 1000);
}
break;
}
case 'sadd': {
const [setKey, ...members] = op.args as [string, ...Array<string>];
let set = this.sets.get(setKey);
if (!set) {
set = new Set();
this.sets.set(setKey, set);
}
for (const m of members) {
set.add(m);
}
break;
}
case 'mset': {
const msetArgs = op.args as Array<string>;
for (let i = 0; i < msetArgs.length; i += 2) {
this.store.set(msetArgs[i], msetArgs[i + 1]);
}
break;
}
}
}
return this.operations.map(() => [null, 'OK']);
}
}
class MockKVProvider implements IKVProvider {
private store = new Map<string, string>();
private sets = new Map<string, Set<string>>();
private expiries = new Map<string, number>();
pipeline() {
return new MockKVPipeline(this.store, this.sets, this.expiries);
}
async get(key: string): Promise<string | null> {
return this.store.get(key) ?? null;
}
async set(key: string, value: string, ...args: Array<string | number>): Promise<string | null> {
if (args.includes('NX') && this.store.has(key)) {
return null;
}
this.store.set(key, value);
const exIdx = args.indexOf('EX');
if (exIdx !== -1 && typeof args[exIdx + 1] === 'number') {
this.expiries.set(key, Date.now() + (args[exIdx + 1] as number) * 1000);
}
return 'OK';
}
async setex(key: string, ttlSeconds: number, value: string): Promise<void> {
this.store.set(key, value);
this.expiries.set(key, Date.now() + ttlSeconds * 1000);
}
async setnx(key: string, value: string, ttlSeconds?: number): Promise<boolean> {
if (this.store.has(key)) return false;
this.store.set(key, value);
if (ttlSeconds) {
this.expiries.set(key, Date.now() + ttlSeconds * 1000);
}
return true;
}
async mget(...keys: Array<string>): Promise<Array<string | null>> {
return keys.map((key) => this.store.get(key) ?? null);
}
async mset(...args: Array<string>): Promise<void> {
for (let i = 0; i < args.length; i += 2) {
this.store.set(args[i], args[i + 1]);
}
}
async del(...keys: Array<string>): Promise<number> {
let count = 0;
for (const key of keys) {
if (this.store.delete(key)) count++;
}
return count;
}
async exists(key: string): Promise<number> {
return this.store.has(key) ? 1 : 0;
}
async expire(key: string, ttlSeconds: number): Promise<number> {
if (!this.store.has(key)) return 0;
this.expiries.set(key, Date.now() + ttlSeconds * 1000);
return 1;
}
async ttl(key: string): Promise<number> {
if (!this.store.has(key)) return -2;
const expiry = this.expiries.get(key);
if (!expiry) return -1;
const remaining = Math.floor((expiry - Date.now()) / 1000);
return remaining > 0 ? remaining : -2;
}
async incr(key: string): Promise<number> {
const val = parseInt(this.store.get(key) ?? '0', 10) + 1;
this.store.set(key, String(val));
return val;
}
async getex(key: string, ttlSeconds: number): Promise<string | null> {
const val = this.store.get(key);
if (val !== undefined) {
this.expiries.set(key, Date.now() + ttlSeconds * 1000);
}
return val ?? null;
}
async getdel(key: string): Promise<string | null> {
const val = this.store.get(key);
this.store.delete(key);
return val ?? null;
}
async sadd(key: string, ...members: Array<string>): Promise<number> {
let set = this.sets.get(key);
if (!set) {
set = new Set();
this.sets.set(key, set);
}
let added = 0;
for (const m of members) {
if (!set.has(m)) {
set.add(m);
added++;
}
}
return added;
}
async srem(key: string, ...members: Array<string>): Promise<number> {
const set = this.sets.get(key);
if (!set) return 0;
let removed = 0;
for (const m of members) {
if (set.delete(m)) removed++;
}
return removed;
}
async smembers(key: string): Promise<Array<string>> {
const set = this.sets.get(key);
return set ? Array.from(set) : [];
}
async sismember(key: string, member: string): Promise<number> {
const set = this.sets.get(key);
return set?.has(member) ? 1 : 0;
}
async scard(key: string): Promise<number> {
return this.sets.get(key)?.size ?? 0;
}
async spop(key: string, count = 1): Promise<Array<string>> {
const set = this.sets.get(key);
if (!set) return [];
const results: Array<string> = [];
const iter = set.values();
for (let i = 0; i < count; i++) {
const next = iter.next();
if (next.done) break;
results.push(next.value);
set.delete(next.value);
}
return results;
}
async zadd(_key: string, ..._scoreMembers: Array<number | string>): Promise<number> {
return 1;
}
async zrem(_key: string, ..._members: Array<string>): Promise<number> {
return 1;
}
async zcard(_key: string): Promise<number> {
return 0;
}
async zrangebyscore(
_key: string,
_min: string | number,
_max: string | number,
..._args: Array<string | number>
): Promise<Array<string>> {
return [];
}
async rpush(_key: string, ..._values: Array<string>): Promise<number> {
return 1;
}
async lpop(_key: string, _count?: number): Promise<Array<string>> {
return [];
}
async llen(_key: string): Promise<number> {
return 0;
}
async hset(_key: string, _field: string, _value: string): Promise<number> {
return 1;
}
async hdel(_key: string, ..._fields: Array<string>): Promise<number> {
return 1;
}
async hget(_key: string, _field: string): Promise<string | null> {
return null;
}
async hgetall(_key: string): Promise<Record<string, string>> {
return {};
}
async publish(_channel: string, _message: string): Promise<number> {
return 1;
}
duplicate(): IKVSubscription {
return {} as IKVSubscription;
}
async releaseLock(_key: string, _token: string): Promise<boolean> {
return true;
}
async renewSnowflakeNode(_key: string, _instanceId: string, _ttlSeconds: number): Promise<boolean> {
return true;
}
async tryConsumeTokens(
_key: string,
_requested: number,
_maxTokens: number,
_refillRate: number,
_refillIntervalMs: number,
): Promise<number> {
return 0;
}
async scheduleBulkDeletion(_queueKey: string, _secondaryKey: string, _score: number, _value: string): Promise<void> {}
async removeBulkDeletion(_queueKey: string, _secondaryKey: string): Promise<boolean> {
return true;
}
async scan(pattern: string, _count: number): Promise<Array<string>> {
// Escape regex metacharacters, translate '*' wildcards, and anchor the pattern.
const regex = new RegExp('^' + pattern.split('*').map((part) => part.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')).join('.*') + '$');
return Array.from(this.store.keys()).filter((k) => regex.test(k));
}
multi(): IKVPipeline {
return new MockKVPipeline(this.store, this.sets, this.expiries);
}
async health(): Promise<boolean> {
return true;
}
clear(): void {
this.store.clear();
this.sets.clear();
this.expiries.clear();
}
}
function createNoopLogger(): CacheLogger {
return {
error: () => {},
};
}
function createNoopTelemetry(): CacheTelemetry {
return {
recordCounter: () => {},
recordHistogram: () => {},
};
}
describe('KVCacheProvider', () => {
let mockClient: MockKVProvider;
let cache: KVCacheProvider;
beforeEach(() => {
mockClient = new MockKVProvider();
cache = new KVCacheProvider({
client: mockClient,
cacheName: 'test',
logger: createNoopLogger(),
telemetry: createNoopTelemetry(),
});
});
afterEach(() => {
mockClient.clear();
});
describe('get and set', () => {
it('returns null for non-existent key', async () => {
const result = await cache.get('nonexistent');
expect(result).toBeNull();
});
it('stores and retrieves a string value', async () => {
await cache.set('key', 'value');
const result = await cache.get<string>('key');
expect(result).toBe('value');
});
it('stores and retrieves an object value', async () => {
const obj = {name: 'test', count: 42};
await cache.set('obj', obj);
const result = await cache.get<typeof obj>('obj');
expect(result).toEqual(obj);
});
it('stores value with TTL', async () => {
await cache.set('key', 'value', 60);
const result = await cache.get<string>('key');
expect(result).toBe('value');
});
it('handles invalid JSON gracefully', async () => {
await mockClient.set('invalid', 'not-valid-json{');
const result = await cache.get('invalid');
expect(result).toBeNull();
});
});
describe('delete', () => {
it('deletes existing key', async () => {
await cache.set('key', 'value');
await cache.delete('key');
const result = await cache.get('key');
expect(result).toBeNull();
});
});
describe('getAndDelete', () => {
it('returns value and deletes key', async () => {
await cache.set('key', 'value');
const result = await cache.getAndDelete<string>('key');
expect(result).toBe('value');
expect(await cache.get('key')).toBeNull();
});
it('returns null for non-existent key', async () => {
const result = await cache.getAndDelete('nonexistent');
expect(result).toBeNull();
});
});
describe('exists', () => {
it('returns true for existing key', async () => {
await cache.set('key', 'value');
const result = await cache.exists('key');
expect(result).toBe(true);
});
it('returns false for non-existent key', async () => {
const result = await cache.exists('nonexistent');
expect(result).toBe(false);
});
});
describe('expire', () => {
it('sets TTL on existing key', async () => {
await cache.set('key', 'value');
await cache.expire('key', 60);
const ttl = await cache.ttl('key');
expect(ttl).toBeGreaterThan(0);
});
});
describe('ttl', () => {
it('returns TTL for key with expiry', async () => {
await cache.set('key', 'value', 60);
const result = await cache.ttl('key');
expect(result).toBeGreaterThan(0);
});
it('returns -1 for key without expiry', async () => {
await cache.set('key', 'value');
const result = await cache.ttl('key');
expect(result).toBe(-1);
});
it('returns -2 for non-existent key', async () => {
const result = await cache.ttl('nonexistent');
expect(result).toBe(-2);
});
});
describe('mget', () => {
it('returns values for multiple keys', async () => {
await cache.set('key1', 'value1');
await cache.set('key2', 'value2');
const results = await cache.mget<string>(['key1', 'key2']);
expect(results).toEqual(['value1', 'value2']);
});
it('returns null for missing keys', async () => {
await cache.set('key1', 'value1');
const results = await cache.mget<string>(['key1', 'missing']);
expect(results).toEqual(['value1', null]);
});
it('returns empty array for empty input', async () => {
const results = await cache.mget([]);
expect(results).toEqual([]);
});
});
describe('mset', () => {
it('sets multiple keys at once', async () => {
await cache.mset([
{key: 'key1', value: 'value1'},
{key: 'key2', value: 'value2'},
]);
expect(await cache.get<string>('key1')).toBe('value1');
expect(await cache.get<string>('key2')).toBe('value2');
});
it('handles empty array', async () => {
await expect(cache.mset([])).resolves.toBeUndefined();
});
it('handles mixed TTL entries', async () => {
await cache.mset([
{key: 'withTtl', value: 'val1', ttlSeconds: 60},
{key: 'noTtl', value: 'val2'},
]);
expect(await cache.get<string>('withTtl')).toBe('val1');
expect(await cache.get<string>('noTtl')).toBe('val2');
});
});
describe('deletePattern', () => {
it('deletes keys matching pattern', async () => {
await cache.set('user:1', 'user1');
await cache.set('user:2', 'user2');
await cache.set('session:1', 'session1');
const count = await cache.deletePattern('user:*');
expect(count).toBe(2);
});
it('returns 0 for no matches', async () => {
const count = await cache.deletePattern('nonexistent:*');
expect(count).toBe(0);
});
});
describe('acquireLock', () => {
it('acquires lock successfully', async () => {
const token = await cache.acquireLock('resource', 60);
expect(token).not.toBeNull();
expect(token).toMatch(/^[a-f0-9]{32}$/);
});
it('throws on invalid key format', async () => {
await expect(cache.acquireLock('invalid key!', 60)).rejects.toThrow('Invalid lock key format');
});
});
describe('releaseLock', () => {
it('releases lock', async () => {
const token = await cache.acquireLock('resource', 60);
const released = await cache.releaseLock('resource', token!);
expect(released).toBe(true);
});
it('throws on invalid key format', async () => {
await expect(cache.releaseLock('invalid key!', 'token')).rejects.toThrow('Invalid lock key format');
});
it('throws on invalid token format', async () => {
await expect(cache.releaseLock('validkey', 'INVALID!')).rejects.toThrow('Invalid lock token format');
});
});
describe('getAndRenewTtl', () => {
it('returns value and renews TTL', async () => {
await cache.set('key', 'value', 10);
const result = await cache.getAndRenewTtl<string>('key', 60);
expect(result).toBe('value');
});
it('returns null for non-existent key', async () => {
const result = await cache.getAndRenewTtl('nonexistent', 60);
expect(result).toBeNull();
});
});
describe('publish', () => {
it('publishes message to channel', async () => {
await expect(cache.publish('channel', 'message')).resolves.toBeUndefined();
});
});
describe('Set operations', () => {
describe('sadd', () => {
it('adds member to set', async () => {
await cache.sadd('myset', 'member1', 60);
const members = await cache.smembers('myset');
expect(members.has('member1')).toBe(true);
});
});
describe('srem', () => {
it('removes member from set', async () => {
await mockClient.sadd('myset', 'member1');
await cache.srem('myset', 'member1');
const isMember = await cache.sismember('myset', 'member1');
expect(isMember).toBe(false);
});
});
describe('smembers', () => {
it('returns all members of set', async () => {
await mockClient.sadd('myset', 'member1', 'member2');
const members = await cache.smembers('myset');
expect(members.size).toBe(2);
expect(members.has('member1')).toBe(true);
expect(members.has('member2')).toBe(true);
});
it('returns empty set for non-existent key', async () => {
const members = await cache.smembers('nonexistent');
expect(members.size).toBe(0);
});
});
describe('sismember', () => {
it('returns true for existing member', async () => {
await mockClient.sadd('myset', 'member1');
const result = await cache.sismember('myset', 'member1');
expect(result).toBe(true);
});
it('returns false for non-existing member', async () => {
await mockClient.sadd('myset', 'member1');
const result = await cache.sismember('myset', 'member2');
expect(result).toBe(false);
});
});
});
describe('telemetry', () => {
it('records metrics on get operations', async () => {
const telemetry = {
recordCounter: vi.fn(),
recordHistogram: vi.fn(),
};
const telemetryCache = new KVCacheProvider({
client: mockClient,
cacheName: 'test',
telemetry,
});
await telemetryCache.get('key');
expect(telemetry.recordCounter).toHaveBeenCalledWith({
name: 'cache.operation',
dimensions: expect.objectContaining({
operation: 'get',
cache_name: 'test',
}),
});
expect(telemetry.recordHistogram).toHaveBeenCalledWith({
name: 'cache.operation_latency',
valueMs: expect.any(Number),
dimensions: expect.objectContaining({
operation: 'get',
}),
});
});
it('records metrics on set operations', async () => {
const telemetry = {
recordCounter: vi.fn(),
recordHistogram: vi.fn(),
};
const telemetryCache = new KVCacheProvider({
client: mockClient,
cacheName: 'test',
telemetry,
});
await telemetryCache.set('key', 'value');
expect(telemetry.recordCounter).toHaveBeenCalledWith({
name: 'cache.operation',
dimensions: expect.objectContaining({
operation: 'set',
status: 'success',
}),
});
});
it('records error metrics on failure', async () => {
const telemetry = {
recordCounter: vi.fn(),
recordHistogram: vi.fn(),
};
const failingClient = {
...mockClient,
get: vi.fn().mockRejectedValue(new Error('connection error')),
} as unknown as IKVProvider;
const telemetryCache = new KVCacheProvider({
client: failingClient,
cacheName: 'test',
telemetry,
});
await expect(telemetryCache.get('key')).rejects.toThrow('connection error');
expect(telemetry.recordCounter).toHaveBeenCalledWith({
name: 'cache.operation',
dimensions: expect.objectContaining({
operation: 'get',
status: 'error',
}),
});
});
});
describe('key type detection', () => {
it('identifies lock keys', async () => {
const telemetry = {
recordCounter: vi.fn(),
recordHistogram: vi.fn(),
};
const telemetryCache = new KVCacheProvider({
client: mockClient,
cacheName: 'test',
telemetry,
});
await telemetryCache.get('lock:mylock');
expect(telemetry.recordCounter).toHaveBeenCalledWith({
name: 'cache.operation',
dimensions: expect.objectContaining({
key_type: 'lock',
}),
});
});
it('identifies session keys', async () => {
const telemetry = {
recordCounter: vi.fn(),
recordHistogram: vi.fn(),
};
const telemetryCache = new KVCacheProvider({
client: mockClient,
cacheName: 'test',
telemetry,
});
await telemetryCache.get('user:123:session:abc');
expect(telemetry.recordCounter).toHaveBeenCalledWith({
name: 'cache.operation',
dimensions: expect.objectContaining({
key_type: 'session',
}),
});
});
it('identifies user keys', async () => {
const telemetry = {
recordCounter: vi.fn(),
recordHistogram: vi.fn(),
};
const telemetryCache = new KVCacheProvider({
client: mockClient,
cacheName: 'test',
telemetry,
});
await telemetryCache.get('prefix:user:123');
expect(telemetry.recordCounter).toHaveBeenCalledWith({
name: 'cache.operation',
dimensions: expect.objectContaining({
key_type: 'user',
}),
});
});
it('defaults to other for unknown key types', async () => {
const telemetry = {
recordCounter: vi.fn(),
recordHistogram: vi.fn(),
};
const telemetryCache = new KVCacheProvider({
client: mockClient,
cacheName: 'test',
telemetry,
});
await telemetryCache.get('random:key');
expect(telemetry.recordCounter).toHaveBeenCalledWith({
name: 'cache.operation',
dimensions: expect.objectContaining({
key_type: 'other',
}),
});
});
});
describe('logger', () => {
it('logs JSON parse errors', async () => {
const logger = {
error: vi.fn(),
};
const loggingCache = new KVCacheProvider({
client: mockClient,
cacheName: 'test',
logger,
});
await mockClient.set('invalid', 'not-valid-json{');
await loggingCache.get('invalid');
expect(logger.error).toHaveBeenCalledWith(
expect.objectContaining({
value: 'not-valid-json{',
}),
expect.stringContaining('JSON parse error'),
);
});
it('truncates long values in error logs', async () => {
const logger = {
error: vi.fn(),
};
const loggingCache = new KVCacheProvider({
client: mockClient,
cacheName: 'test',
logger,
});
const longInvalidValue = 'x'.repeat(300);
await mockClient.set('invalid', longInvalidValue);
await loggingCache.get('invalid');
expect(logger.error).toHaveBeenCalledWith(
expect.objectContaining({
value: expect.stringMatching(/^x{200}\.\.\.$/),
}),
expect.any(String),
);
});
});
describe('default config', () => {
it('uses default cache name when not provided', async () => {
const telemetry = {
recordCounter: vi.fn(),
recordHistogram: vi.fn(),
};
const defaultCache = new KVCacheProvider({
client: mockClient,
telemetry,
});
await defaultCache.get('key');
expect(telemetry.recordCounter).toHaveBeenCalledWith({
name: 'cache.operation',
dimensions: expect.objectContaining({
cache_name: 'kv',
}),
});
});
});
});
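The key-type assertions above imply a precedence: `lock:` prefixes win over `:session:` segments, which win over `user:` segments, with `other` as the fallback. A hypothetical standalone classifier consistent with those tests (an illustration only, not the package's actual `CacheKeyClassification` implementation):

```typescript
// Hypothetical classifier mirroring the telemetry dimensions the tests assert.
// Precedence: lock > session > user > other.
function classifyKey(key: string): 'lock' | 'session' | 'user' | 'other' {
  if (key.startsWith('lock:')) return 'lock';
  if (key.includes(':session:')) return 'session';
  if (key.includes('user:')) return 'user';
  return 'other';
}

console.log(classifyKey('lock:mylock')); // lock
console.log(classifyKey('user:123:session:abc')); // session
console.log(classifyKey('prefix:user:123')); // user
console.log(classifyKey('random:key')); // other
```

Checking `:session:` before `user:` is what makes `user:123:session:abc` classify as `session` rather than `user`, as the tests require.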

packages/cache/src/utils/Coalescer.tsx
/*
* Copyright (C) 2026 Fluxer Contributors
*
* This file is part of Fluxer.
*
* Fluxer is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Fluxer is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Fluxer. If not, see <https://www.gnu.org/licenses/>.
*/
export class Coalescer {
private pending = new Map<string, Promise<unknown>>();
async coalesce<T>(key: string, fn: () => Promise<T>): Promise<T> {
const existing = this.pending.get(key) as Promise<T> | undefined;
if (existing) {
return existing;
}
const promise = (async () => {
try {
return await fn();
} finally {
this.pending.delete(key);
}
})();
this.pending.set(key, promise);
return promise;
}
}
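A self-contained sketch of the coalescing behavior (the class body is repeated inline so the snippet runs on its own): concurrent callers for the same key share a single in-flight load, and the pending entry is cleared once it settles.

```typescript
// Inline copy of Coalescer so this sketch is self-contained.
class Coalescer {
  private pending = new Map<string, Promise<unknown>>();
  async coalesce<T>(key: string, fn: () => Promise<T>): Promise<T> {
    const existing = this.pending.get(key) as Promise<T> | undefined;
    if (existing) {
      return existing;
    }
    const promise = (async () => {
      try {
        return await fn();
      } finally {
        this.pending.delete(key);
      }
    })();
    this.pending.set(key, promise);
    return promise;
  }
}

async function demo(): Promise<void> {
  const coalescer = new Coalescer();
  let loads = 0;
  const load = async (): Promise<string> => {
    loads++;
    await new Promise((resolve) => setTimeout(resolve, 10));
    return 'value';
  };
  // Three concurrent callers, one actual load.
  const results = await Promise.all([
    coalescer.coalesce('user:1', load),
    coalescer.coalesce('user:1', load),
    coalescer.coalesce('user:1', load),
  ]);
  console.log(loads, results.join(',')); // 1 value,value,value
}

void demo();
```

Note that deduplication only applies while a load is pending; sequential calls after settlement each invoke the loader again, which the tests below exercise explicitly.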

/*
* Copyright (C) 2026 Fluxer Contributors
*
* This file is part of Fluxer.
*
* Fluxer is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Fluxer is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Fluxer. If not, see <https://www.gnu.org/licenses/>.
*/
import {Coalescer} from '@fluxer/cache/src/utils/Coalescer';
import {beforeEach, describe, expect, it, vi} from 'vitest';
describe('Coalescer', () => {
let coalescer: Coalescer;
beforeEach(() => {
coalescer = new Coalescer();
});
describe('basic functionality', () => {
it('executes function and returns result', async () => {
const result = await coalescer.coalesce('key', async () => 'value');
expect(result).toBe('value');
});
it('executes function with complex return type', async () => {
const complexValue = {name: 'test', count: 42, nested: {data: [1, 2, 3]}};
const result = await coalescer.coalesce('key', async () => complexValue);
expect(result).toEqual(complexValue);
});
it('handles null return value', async () => {
const result = await coalescer.coalesce('key', async () => null);
expect(result).toBeNull();
});
it('handles undefined return value', async () => {
const result = await coalescer.coalesce('key', async () => undefined);
expect(result).toBeUndefined();
});
it('handles numeric return value', async () => {
const result = await coalescer.coalesce('key', async () => 42);
expect(result).toBe(42);
});
it('handles boolean return value', async () => {
const trueResult = await coalescer.coalesce('key1', async () => true);
const falseResult = await coalescer.coalesce('key2', async () => false);
expect(trueResult).toBe(true);
expect(falseResult).toBe(false);
});
});
describe('request coalescing', () => {
it('coalesces concurrent requests with same key', async () => {
const fn = vi.fn().mockImplementation(
() =>
new Promise((resolve) => {
setTimeout(() => resolve('result'), 100);
}),
);
const [result1, result2, result3] = await Promise.all([
coalescer.coalesce('sameKey', fn),
coalescer.coalesce('sameKey', fn),
coalescer.coalesce('sameKey', fn),
]);
expect(fn).toHaveBeenCalledTimes(1);
expect(result1).toBe('result');
expect(result2).toBe('result');
expect(result3).toBe('result');
});
it('does not coalesce requests with different keys', async () => {
const fn = vi.fn().mockResolvedValue('result');
await Promise.all([
coalescer.coalesce('key1', fn),
coalescer.coalesce('key2', fn),
coalescer.coalesce('key3', fn),
]);
expect(fn).toHaveBeenCalledTimes(3);
});
it('allows new request after previous completes', async () => {
let callCount = 0;
const fn = vi.fn().mockImplementation(async () => {
callCount++;
return `result-${callCount}`;
});
const result1 = await coalescer.coalesce('key', fn);
const result2 = await coalescer.coalesce('key', fn);
expect(fn).toHaveBeenCalledTimes(2);
expect(result1).toBe('result-1');
expect(result2).toBe('result-2');
});
it('coalesces only during pending period', async () => {
let resolveFirst: (value: string) => void;
const firstPromise = new Promise<string>((resolve) => {
resolveFirst = resolve;
});
const fn = vi.fn().mockReturnValue(firstPromise);
const coalescedPromise1 = coalescer.coalesce('key', fn);
const coalescedPromise2 = coalescer.coalesce('key', fn);
expect(fn).toHaveBeenCalledTimes(1);
resolveFirst!('first-result');
await Promise.all([coalescedPromise1, coalescedPromise2]);
const fn2 = vi.fn().mockResolvedValue('second-result');
const result = await coalescer.coalesce('key', fn2);
expect(fn2).toHaveBeenCalledTimes(1);
expect(result).toBe('second-result');
});
});
describe('error handling', () => {
it('propagates errors to all coalesced callers', async () => {
const error = new Error('test error');
const fn = vi.fn().mockRejectedValue(error);
const promises = [coalescer.coalesce('key', fn), coalescer.coalesce('key', fn), coalescer.coalesce('key', fn)];
const results = await Promise.allSettled(promises);
expect(fn).toHaveBeenCalledTimes(1);
results.forEach((result) => {
expect(result.status).toBe('rejected');
if (result.status === 'rejected') {
expect(result.reason).toBe(error);
}
});
});
it('clears pending state after error', async () => {
const error = new Error('test error');
const failingFn = vi.fn().mockRejectedValue(error);
const succeedingFn = vi.fn().mockResolvedValue('success');
try {
await coalescer.coalesce('key', failingFn);
} catch {}
const result = await coalescer.coalesce('key', succeedingFn);
expect(result).toBe('success');
expect(succeedingFn).toHaveBeenCalledTimes(1);
});
it('handles synchronous errors', async () => {
const error = new Error('sync error');
const fn = vi.fn().mockImplementation(() => {
throw error;
});
await expect(coalescer.coalesce('key', fn)).rejects.toThrow('sync error');
});
});
describe('different keys', () => {
it('handles multiple different keys concurrently', async () => {
const fn1 = vi.fn().mockImplementation(
() =>
new Promise((resolve) => {
setTimeout(() => resolve('result1'), 50);
}),
);
const fn2 = vi.fn().mockImplementation(
() =>
new Promise((resolve) => {
setTimeout(() => resolve('result2'), 50);
}),
);
const fn3 = vi.fn().mockImplementation(
() =>
new Promise((resolve) => {
setTimeout(() => resolve('result3'), 50);
}),
);
const [r1a, r1b, r2a, r2b, r3a, r3b] = await Promise.all([
coalescer.coalesce('key1', fn1),
coalescer.coalesce('key1', fn1),
coalescer.coalesce('key2', fn2),
coalescer.coalesce('key2', fn2),
coalescer.coalesce('key3', fn3),
coalescer.coalesce('key3', fn3),
]);
expect(fn1).toHaveBeenCalledTimes(1);
expect(fn2).toHaveBeenCalledTimes(1);
expect(fn3).toHaveBeenCalledTimes(1);
expect(r1a).toBe('result1');
expect(r1b).toBe('result1');
expect(r2a).toBe('result2');
expect(r2b).toBe('result2');
expect(r3a).toBe('result3');
expect(r3b).toBe('result3');
});
it('handles special characters in keys', async () => {
const fn = vi.fn().mockResolvedValue('result');
await coalescer.coalesce('key:with:colons', fn);
await coalescer.coalesce('key/with/slashes', fn);
await coalescer.coalesce('key.with.dots', fn);
expect(fn).toHaveBeenCalledTimes(3);
});
it('handles empty string key', async () => {
const fn = vi.fn().mockResolvedValue('result');
const result = await coalescer.coalesce('', fn);
expect(result).toBe('result');
});
});
describe('type safety', () => {
it('preserves return type', async () => {
interface User {
id: number;
name: string;
}
const user: User = {id: 1, name: 'test'};
const result = await coalescer.coalesce<User>('key', async () => user);
expect(result.id).toBe(1);
expect(result.name).toBe('test');
});
it('handles generic array types', async () => {
const arr = [1, 2, 3, 4, 5];
const result = await coalescer.coalesce<Array<number>>('key', async () => arr);
expect(result).toEqual([1, 2, 3, 4, 5]);
});
});
describe('timing behavior', () => {
it('executes function only once even with rapid calls', async () => {
const fn = vi.fn().mockImplementation(
() =>
new Promise((resolve) => {
setTimeout(() => resolve('result'), 100);
}),
);
const promises = [];
for (let i = 0; i < 100; i++) {
promises.push(coalescer.coalesce('key', fn));
}
const results = await Promise.all(promises);
expect(fn).toHaveBeenCalledTimes(1);
results.forEach((result) => {
expect(result).toBe('result');
});
});
it('handles interleaved requests correctly', async () => {
let resolve1: (value: string) => void;
let resolve2: (value: string) => void;
const fn1 = vi.fn().mockReturnValue(
new Promise<string>((resolve) => {
resolve1 = resolve;
}),
);
const fn2 = vi.fn().mockReturnValue(
new Promise<string>((resolve) => {
resolve2 = resolve;
}),
);
const p1 = coalescer.coalesce('key1', fn1);
const p2 = coalescer.coalesce('key2', fn2);
expect(fn1).toHaveBeenCalledTimes(1);
expect(fn2).toHaveBeenCalledTimes(1);
resolve2!('result2');
resolve1!('result1');
const [r1, r2] = await Promise.all([p1, p2]);
expect(r1).toBe('result1');
expect(r2).toBe('result2');
});
});
describe('cleanup', () => {
it('removes key from pending map after completion', async () => {
const fn = vi.fn().mockResolvedValue('result');
await coalescer.coalesce('key', fn);
await coalescer.coalesce('key', fn);
expect(fn).toHaveBeenCalledTimes(2);
});
it('removes key from pending map after error', async () => {
const failingFn = vi.fn().mockRejectedValue(new Error('error'));
const succeedingFn = vi.fn().mockResolvedValue('success');
try {
await coalescer.coalesce('key', failingFn);
} catch {}
await coalescer.coalesce('key', succeedingFn);
expect(succeedingFn).toHaveBeenCalledTimes(1);
});
});
describe('multiple instances', () => {
it('different coalescer instances do not share state', async () => {
const coalescer1 = new Coalescer();
const coalescer2 = new Coalescer();
let callCount = 0;
const fn = vi.fn().mockImplementation(async () => {
callCount++;
return `result-${callCount}`;
});
const [r1, r2] = await Promise.all([coalescer1.coalesce('sameKey', fn), coalescer2.coalesce('sameKey', fn)]);
expect(fn).toHaveBeenCalledTimes(2);
expect(r1).toBe('result-1');
expect(r2).toBe('result-2');
});
});
});

packages/cache/tsconfig.json
{
"extends": "../../tsconfigs/package.json",
"compilerOptions": {},
"include": ["src/**/*"]
}

packages/cache/vitest.config.ts
/*
* Copyright (C) 2026 Fluxer Contributors
*
* This file is part of Fluxer.
*
* Fluxer is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Fluxer is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Fluxer. If not, see <https://www.gnu.org/licenses/>.
*/
import path from 'node:path';
import {fileURLToPath} from 'node:url';
import tsconfigPaths from 'vite-tsconfig-paths';
import {defineConfig} from 'vitest/config';
const __dirname = path.dirname(fileURLToPath(import.meta.url));
export default defineConfig({
plugins: [
tsconfigPaths({
root: path.resolve(__dirname, '../..'),
}),
],
test: {
globals: true,
environment: 'node',
include: ['**/*.{test,spec}.{ts,tsx}'],
exclude: ['node_modules', 'dist'],
coverage: {
provider: 'v8',
reporter: ['text', 'json', 'html'],
exclude: ['**/*.{test,spec}.{ts,tsx}', 'node_modules/'],
},
},
});