The fastest database query is the one you don't make.
Caching in most frameworks involves Redis setup, serialization headaches, and cache invalidation bugs that haunt you at 3 AM. Alepha tries to make it boring.
The simplest use of `$cache`: cache a function's result.
```ts
import { $cache } from "alepha/cache";

class ProductService {
  // cache expensive computation for 10 minutes
  getPopularProducts = $cache({
    name: "popular-products",
    ttl: [10, "minutes"],
    handler: async () => {
      // this runs only once per 10 minutes
      return await this.db.products.findMany({
        orderBy: { sales: "desc" },
        limit: 100,
      });
    },
  });
}
```
First call: runs the handler, stores the result, returns it. Later calls (within 10 minutes) return the cached result instantly.
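For example, a quick sketch of that behavior using `getPopularProducts` from above:

```ts
// hypothetical call site inside ProductService
const first = await this.getPopularProducts();  // cache miss: handler runs, queries the database
const second = await this.getPopularProducts(); // within 10 minutes: served straight from the cache
```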
Most caches depend on input parameters:
```ts
class UserService {
  getUserProfile = $cache({
    name: "user-profile",
    ttl: [5, "minutes"],
    handler: async (userId: string) => {
      return await this.db.users.findById(userId);
    },
  });
}

// usage
const profile = await this.getUserProfile("user-123");
// cached separately for each userId
```
The cache key automatically includes the arguments: `getUserProfile("a")` and `getUserProfile("b")` are cached independently.
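For illustration, reusing `getUserProfile` from above:

```ts
await this.getUserProfile("user-a"); // miss: handler runs, result stored under a key derived from "user-a"
await this.getUserProfile("user-b"); // different argument, different key: handler runs again
await this.getUserProfile("user-a"); // hit: returned from cache, no database call
```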
Sometimes you need direct control:
```ts
class SessionService {
  // define cache without handler
  sessions = $cache<UserSession>({
    name: "sessions",
    ttl: [1, "hour"],
  });

  async createSession(userId: string): Promise<string> {
    const sessionId = crypto.randomUUID();
    const session = { userId, createdAt: Date.now() };

    // manually set
    await this.sessions.set(sessionId, session);

    return sessionId;
  }

  async getSession(sessionId: string): Promise<UserSession | null> {
    // manually get
    return await this.sessions.get(sessionId);
  }

  async destroySession(sessionId: string): Promise<void> {
    // manually delete
    await this.sessions.delete(sessionId);
  }
}
```
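A hypothetical login flow using the service above (assuming `sessionService` is an injected `SessionService` instance):

```ts
const sessionId = await sessionService.createSession("user-123");

const session = await sessionService.getSession(sessionId);
// -> { userId: "user-123", createdAt: ... } until the 1-hour TTL expires

await sessionService.destroySession(sessionId);
// -> getSession(sessionId) now returns null
```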
The two hardest problems in computer science: cache invalidation and naming things.
```ts
// delete specific entry
await this.getUserProfile.invalidate("user-123");
```
```ts
// delete all entries matching pattern
await this.sessions.invalidate("user:*:sessions");
```
Common pattern: invalidate cache when data changes.
```ts
class UserService {
  getUserProfile = $cache({
    name: "user-profile",
    ttl: [5, "minutes"],
    handler: async (userId: string) => {
      return await this.db.users.findById(userId);
    },
  });

  async updateProfile(userId: string, data: UpdateData) {
    await this.db.users.update(userId, data);

    // clear the cache for this user
    await this.getUserProfile.invalidate(userId);
  }
}
```
For API responses, use the cache option on actions:
```ts
class ProductApi {
  // cache the entire HTTP response
  listProducts = $action({
    path: "/products",
    cache: true, // uses default settings
    handler: async () => {
      return await this.db.products.findMany();
    },
  });

  // with custom TTL
  getProduct = $action({
    path: "/products/:id",
    cache: { ttl: [30, "seconds"] },
    handler: async ({ params }) => {
      return await this.db.products.findById(params.id);
    },
  });
}
```
This sets proper HTTP headers (Cache-Control, ETag) so browsers and CDNs can cache too.
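Illustratively, a cached response might look something like this (example values; the exact directives depend on your cache settings):

```
GET /products/123
-> 200 OK
-> Cache-Control: public, max-age=30
-> ETag: "abc123"
```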
Alepha automatically generates ETags for cached responses:
```
// first request
GET /products/123
-> 200 OK
-> ETag: "abc123"

// second request with ETag
GET /products/123
If-None-Match: "abc123"
-> 304 Not Modified (no body, saves bandwidth)
```
You don't write any code for this. It just works.
By default, `$cache` keeps entries in process memory:

```ts
// stored in process memory, lost on restart
const cache = $cache({
  name: "my-cache",
  ttl: [10, "minutes"],
  handler: async () => { /* ... */ },
});
```
Good for: development, single-instance apps, short-lived data.
To switch to Redis, swap the provider:

```ts
import { RedisCacheProvider } from "alepha/cache/redis";

const alepha = Alepha.create()
  .with({ provide: CacheProvider, use: RedisCacheProvider });

// set REDIS_URL in your environment
```
Good for: production, multi-instance apps, persistent cache.
Your cache code stays the same. Only the provider changes.
Pre-populate cache on startup:
```ts
class CacheWarmer {
  products = $inject(ProductService);

  warmup = $hook({
    on: "ready",
    handler: async () => {
      // pre-fetch popular products into cache
      await this.products.getPopularProducts();

      // pre-fetch top categories
      for (const cat of ["electronics", "clothing", "home"]) {
        await this.products.getByCategory(cat);
      }
    },
  });
}
```
First users don't wait for cold cache.
Cache-aside, the default pattern: check the cache; on a miss, compute the value and store it.
```ts
// this is what $cache does internally
async getUser(id: string) {
  const cached = await cache.get(id);
  if (cached) return cached;

  const user = await this.db.users.findById(id);
  await cache.set(id, user);
  return user;
}

// with $cache, just:
getUser = $cache({
  name: "users",
  ttl: [5, "minutes"],
  handler: (id) => this.db.users.findById(id),
});
```
Write-through: update the cache when writing.
```ts
async updateUser(id: string, data: UpdateData) {
  const user = await this.db.users.update(id, data);

  // update cache with fresh data
  await this.userCache.set(id, user);

  return user;
}
```
When a cache entry expires, you don't want 1,000 simultaneous requests all hitting the database (the classic cache stampede). Alepha handles this:
```ts
// only one request computes, others wait
getExpensiveData = $cache({
  name: "expensive",
  ttl: [1, "minute"],
  // implicit: lock while computing
  handler: async () => {
    // only runs once even if 1000 requests hit simultaneously
    return await this.heavyComputation();
  },
});
```
For comparison, the same thing with Redis directly:
```ts
import Redis from "ioredis";

const redis = new Redis();

async function getUser(id: string) {
  const cached = await redis.get(`user:${id}`);
  if (cached) return JSON.parse(cached);

  const user = await db.users.findById(id);
  await redis.setex(`user:${id}`, 300, JSON.stringify(user));
  return user;
}

// don't forget to invalidate...
async function updateUser(id: string, data: any) {
  await db.users.update(id, data);
  await redis.del(`user:${id}`); // easy to forget
}
```
Alepha:
```ts
getUser = $cache({
  name: "users",
  ttl: [5, "minutes"],
  handler: (id) => this.db.users.findById(id),
});

// invalidation is explicit, hard to miss
await this.getUser.invalidate(id);
```
Less boilerplate. Serialization handled. TTL is readable.
| Need | Solution |
|---|---|
| Cache function results | `$cache({ handler })` |
| Manual cache control | `$cache()` + `set`/`get`/`delete` |
| HTTP response caching | `$action({ cache: true })` |
| Pattern-based invalidation | `cache.invalidate("pattern:*")` |
| Production caching | Swap to `RedisCacheProvider` |
Caching doesn't have to be complicated. Define TTL, define handler, forget about it until you need to invalidate.