# Alepha Framework Documentation This file contains all documentation for the Alepha framework, concatenated for LLM context. Latest version: 0.10.3 Generated on: 2025-10-04T20:55:42.033Z --- # Important Guidelines ## 🎯 Framework Overview Alepha is a convention-driven, class-based TypeScript framework that uses **descriptors** (factory functions starting with `$`) to define application components. It's NOT a wrapper around Express/Fastify but a complete framework built from scratch. ### Core Principles 1. **Class-based architecture** - All services are classes, not functional components 2. **Descriptor pattern** - Use `$` prefixed functions to declare functionality 3. **Dependency injection** - Built-in DI container manages all services 4. **Type-safe from database to frontend** - Full TypeScript with TypeBox schemas 5. **Convention over configuration** - Opinionated structure with clear patterns ### Rules Summary - use TypeScript (strict mode) - use Biome for formatting and linting - use Vitest for testing - use Vite for bundling (full-stack) - use React for frontend (full-stack) - use Postgres for database (with built-in support) - use TypeBox for schema definitions (not Zod!), using `t` from Alepha, not importing TypeBox directly - use documentation: https://alepha.dev/llms.txt - one file = one class - descriptors are always a class property, except for `$entity` to be drizzle-kit compatible - no decorators, no functional services, no Express/Fastify patterns - no manual instantiation, always use DI - use import with file extensions (e.g. `import { User } from "./User.ts"`) ## πŸ“‹ Essential Rules for Code Generation ### Rule 1: Project Structure ``` my-app/ β”œβ”€β”€ src/ β”‚ β”œβ”€β”€ server/ # Backend services β”‚ β”‚ β”œβ”€β”€ controllers/ # API controllers with $action β”‚ β”‚ β”œβ”€β”€ services/ # Business logic β”‚ β”‚ β”œβ”€β”€ entities/ # Entities with $entity β”‚ β”‚ └── providers/ # External service providers β”‚ β”œβ”€β”€ client/ # Frontend (if full-stack) β”‚ β”‚ β”œβ”€β”€ components/ # React components β”‚ β”‚ └── AppRouter.ts # Router definition, with $page β”‚ β”œβ”€β”€ shared/ # Shared types/schemas β”‚ β”œβ”€β”€ index.server.ts # Server entry point β”‚ └── index.browser.ts # Browser entry point (if full-stack) β”œβ”€β”€ package.json β”œβ”€β”€ tsconfig.json β”œβ”€β”€ vite.config.ts # If using full-stack features └── index.html # If using full-stack features ``` ### Rule 2: Always Use Classes and Descriptors ```typescript // βœ… CORRECT - Alepha way import { $action, $inject } from "alepha"; import { $logger } from "alepha/logger"; class UserController { log = $logger(); userService = $inject(UserService); getUser = $action({ schema: { params: t.object({ id: t.string() }) }, handler: async ({ params }) => { this.log.info(`Getting user ${params.id}`); return await this.userService.findById(params.id); } }); } // ❌ WRONG - Traditional Express/Fastify way const router = express.Router(); router.get('/users/:id', async (req, res) => { // This is NOT how Alepha works! }); ``` ### Rule 3: Entry Point Pattern (server and browser) ```typescript // Server entry (src/index.server.ts) import { Alepha, run } from "alepha"; import { UserController } from "./controllers/UserController"; const alepha = Alepha.create() .with(UserController) .with(DatabaseService); run(alepha); // Or for simple apps import { run } from "alepha"; run(UserController); ``` ### Rule 4: TypeBox Schemas (Not Zod!) ```typescript import { t } from "alepha"; // TypeBox, NOT zod! 
// βœ… CORRECT - TypeBox schemas const userSchema = t.object({ id: t.uuid(), email: t.string({ format: "email" }), age: t.number({ minimum: 0, maximum: 120 }), isActive: t.boolean({ default: true }) }); // ❌ WRONG - Zod schemas import { z } from "zod"; // DON'T USE THIS! ``` ### Rule 5: Database with Postgres Module ```typescript import { pg, $entity, $repository } from "alepha/postgres"; import { t } from "alepha"; // Define entity with TypeBox schema const users = $entity({ name: "users", schema: t.object({ id: pg.primaryKey(t.uuid()), email: t.string({ format: "email" }), name: t.string(), createdAt: pg.createdAt(), updatedAt: pg.updatedAt(), deletedAt: pg.deletedAt() // Soft delete }), indexes: [ { column: "email", unique: true } ] }); // Use in service class UserService { userRepository = $repository(users); async createUser(data: any) { return await this.userRepository.create(data); } async findByEmail(email: string) { return await this.userRepository.findOne({ email }); } } ``` ### Rule 6: API Endpoints with $action ```typescript import { $action } from "alepha/server"; import { t } from "alepha"; class ProductController { // GET /products listProducts = $action({ schema: { response: t.array(productSchema), // response schema is mandatory for json response }, handler: async () => { return await this.products.find(); } }); // POST /products createProduct = $action({ method: "POST", schema: { body: t.object({ // body schema is mandatory for request json body name: t.string(), price: t.number({ minimum: 0 }) }), response: productSchema, }, handler: async ({ body }) => { return await this.products.create(body); } }); // GET /products/:id getProduct = $action({ path: "/products/:id", schema: { params: t.object({ id: t.string() }), response: productSchema, }, handler: async ({ params }) => { return await this.products.findById(params.id); } }); } ``` ### Rule 7: Full-Stack with React ```typescript // AppRouter.ts import { $page } from "alepha/react"; export class AppRouter { layout = $page({ lazy: () => import("./components/Layout.tsx"), children: () => [this.home, this.about] }); home = $page({ path: "/", lazy: () => import("./components/Home.tsx"), resolve: async () => { // Server-side data fetching const data = await fetchHomeData(); return { data }; } }); about = $page({ path: "/about", lazy: () => import("./components/About.tsx") }); } // vite.config.ts import { viteAlepha } from "alepha/vite"; import viteReact from "@vitejs/plugin-react"; import { defineConfig } from "vite"; export default defineConfig({ plugins: [ viteReact(), viteAlepha({ serverEntry: "./src/index.server.ts" }) ] }); ``` ### Rule 8: Common Descriptors Reference ```typescript // Core $inject(ServiceClass) // Dependency injection $env(schema) // Environment variables $logger(name?) 
// Logging
$hook({ on, handler })          // Lifecycle hooks
$module({ name, services })     // Module definition

// Server
$action({ method?, path?, schema?, handler })   // API endpoint
$route({ method?, path?, handler })             // Simple route
$middleware({ handler })                        // Middleware

// Database (postgres)
$entity({ name, schema, indexes })      // Table definition
$repository(entity)                     // Repository for entity
$transaction({ handler })               // Database transaction
$sequence({ name, start, increment })   // ID sequence

// Features
$cache({ name?, ttl, handler })         // Caching
$queue({ handler })                     // Background jobs
$scheduler({ cron, handler })           // Scheduled tasks
$email({ subject, body, schema })       // Email templates
$bucket({ name, mimeTypes, maxSize })   // File storage
$lock({ handler })                      // Distributed locks
$topic({ name })                        // Pub/sub topics
$subscriber({ topic, handler })         // Topic subscriber
$batch({ schema, maxSize, handler })    // Batch processing

// React
$page({ path?, lazy, resolve?, children? })   // Page definition
```

### Rule 9: Service Communication

```tsx
// Within same module - use $inject
class ServiceA {
  serviceB = $inject(ServiceB);

  async doSomething() {
    return await this.serviceB.process();
  }
}

// Between modules - use $client
import { $client } from "alepha/server/links";
import type { NotificationController } from "../server/controllers/NotificationController.ts";

class UserController {
  notifications = $client<NotificationController>();

  createUser = $action({
    handler: async (data) => {
      const user = await this.users.create(data);
      await this.notifications.sendWelcome(user.email);
      return user;
    }
  });
}

// Between frontend and backend - use $client
import { $client } from "alepha/server/links";
import type { UserController } from "../server/controllers/UserController.ts";
import type { User } from "../server/entities/users.ts";

class UserProfile {
  userApi = $client<UserController>();

  home = $page({
    path: "/",
    lazy: () => import("./Home.tsx"),
    resolve: async () => {
      return { user: await this.userApi.getMyUserProfile() };
    }
  });
}

const Home = ({ user }: { user: User }) => {
  const userApi = useClient<UserController>();
  return <div>Welcome, {user.name}</div>
; }; ``` ### Rule 10: Testing Pattern ```typescript import { describe, it, expect } from "vitest"; import { Alepha } from "alepha"; describe("UserService", () => { const alepha = Alepha.create() .with(UserService); it("should create user", async () => { const service = alepha.inject(UserService); const user = await service.createUser({ email: "test@example.com" }); expect(user.email).toBe("test@example.com"); }); }); ``` ## 🚫 Common Mistakes to Avoid 1. **DON'T use decorators** - Alepha uses descriptors, not decorators 2. **DON'T use Express/Fastify patterns** - No `app.get()`, `router.use()`, etc. 3. **DON'T use Zod** - Use TypeBox (`t`) for schemas 4. **DON'T use functional components for services** - Always use classes 5. **DON'T forget the `$` prefix** - All descriptors start with `$` 6. **DON'T inject across modules** - Use `$client` for cross-module communication 7. **DON'T use async constructors** - Use `$hook({ on: "start" })` instead 8. **DON'T create instances manually** - Let the DI container manage them ## 🎨 Code Generation Template ```typescript // 1. Import Alepha essentials import { Alepha, run, t, $logger, $inject } from "alepha"; import { $action } from "alepha/server"; import { $entity, $repository } from "alepha/postgres"; import { pg } from "alepha"; // 2. Define entities const users = $entity({ name: "table_name", schema: t.object({ id: pg.primaryKey(t.uuid()), // ... fields createdAt: pg.createdAt(), updatedAt: pg.updatedAt() }), indexes: [] }); // 3. Create repository service class EntityService { userRepo = $repository(users); log = $logger(); async create(data: any) { this.log.info("Creating entity"); return await this.userRepo.create(data); } } // 4. Create controller class EntityController { service = $inject(EntityService); list = $action({ handler: async () => { return await this.service.findAll(); } }); create = $action({ method: "POST", schema: { body: t.object({ // ... fields }) }, handler: async ({ body }) => { return await this.service.create(body); } }); } // 5. Bootstrap application const alepha = Alepha.create() .with(EntityService) .with(EntityController); run(alepha); ``` ## πŸ“¦ Package.json Requirements ```json { "type": "module", "dependencies": { "alepha": "latest" }, "scripts": { "dev": "vite", "build": "vite build", "test": "vitest" } } ``` ## πŸ”§ TSConfig Requirements ```json { "compilerOptions": { "module": "nodenext", "target": "esnext", "strict": true, "jsx": "react-jsx", "esModuleInterop": true, "skipLibCheck": true, "allowImportingTsExtensions": true } } ``` ## 🎯 When Generating Code 1. **Start with the entity/data model** using `$entity` 2. **Create services** with `$repository` for data access 3. **Add business logic** in service classes 4. **Expose APIs** with `$action` in controllers 6. **For full-stack**, add `$page` descriptors and React components ## πŸ’‘ Pro Tips - Use `$logger()` extensively for debugging - Leverage `$hook({ on: "start" })` for initialization - Use `$cache()` to wrap expensive operations - Implement `$queue()` for background processing - Add `$scheduler()` for recurring tasks - Use transactions with `$transaction()` for data consistency - Implement `$lock()` for distributed systems - Always type your schemas with TypeBox (`t`) - t.uuid() for UUID - t.datetime() for date-time Remember: Alepha is about **declarative, class-based services** with **descriptor-driven functionality**. Think in terms of services, not routes or middleware! ## Modules Alepha is modular, with a LOT of modules. 
### Core & Application Layer * **Core ([@alepha/core](https://feunard.github.io/alepha/docs/alepha-core)) πŸ“¦:** The heart of the framework, providing a powerful dependency injection container, application lifecycle management, and the core descriptor system. * **Server ([@alepha/server](https://feunard.github.io/alepha/docs/alepha-server)) 🌐:** A high-performance, minimalist HTTP server for creating type-safe REST APIs using declarative `$action` descriptors. * **Database ([@alepha/postgres](https://feunard.github.io/alepha/docs/alepha-postgres)) πŸ—„οΈ:** A powerful and type-safe ORM built on Drizzle. Define your schema with `$entity` and get fully-typed repositories with `$repository`. * **React ([@alepha/react](https://feunard.github.io/alepha/docs/alepha-react)) βš›οΈ:** Build full-stack, server-side rendered React applications with a file-based routing system (`$page`) that handles data fetching, hydration, and type-safe props. ### Backend Infrastructure & Abstractions * **Security ([@alepha/security](https://feunard.github.io/alepha/docs/alepha-security)) πŸ›‘οΈ:** A complete authentication and authorization system. Manage roles (`$role`), permissions (`$permission`), JWTs, and realms (`$realm`). * **Queue ([@alepha/queue](https://feunard.github.io/alepha/docs/alepha-queue)) ⏳:** A simple and robust interface for background job processing. Define workers with the `$queue` descriptor and integrate with backends like Redis. * **Cache ([@alepha/cache](https://feunard.github.io/alepha/docs/alepha-cache)) ⚑:** A flexible caching layer with support for TTL, automatic function caching (`$cache`), and multiple backends like in-memory or Redis. * **Bucket ([@alepha/bucket](https://feunard.github.io/alepha/docs/alepha-bucket)) ☁️:** A unified API for file and object storage. Abstract away the details of local, in-memory, or cloud storage providers like Azure Blob Storage. * **Scheduler ([@alepha/scheduler](https://feunard.github.io/alepha/docs/alepha-scheduler)) ⏰:** Schedule recurring tasks using cron expressions or fixed intervals with the `$scheduler` descriptor, with built-in support for distributed locking. * **Topic ([@alepha/topic](https://feunard.github.io/alepha/docs/alepha-topic)) πŸ“’:** A publish-subscribe (pub/sub) messaging interface for building event-driven architectures with `$topic` and `$subscriber`. * **Lock ([@alepha/lock](https://feunard.github.io/alepha/docs/alepha-lock)) πŸ”’:** A distributed locking mechanism to ensure safe concurrent access to shared resources, using Redis or other backends. ### Server Middleware & Plugins * **Links ([@alepha/server-links](https://feunard.github.io/alepha/docs/alepha-server-links)) πŸ”—:** Enables end-to-end type-safe communication between your frontend and backend, or between microservices, with the `$client` descriptor. * **Swagger ([@alepha/server-swagger](https://feunard.github.io/alepha/docs/alepha-server-swagger)) πŸ“œ:** Automatically generate OpenAPI 3.0 documentation and a beautiful Swagger UI for all your `$action` API endpoints. * **Helmet ([@alepha/server-helmet](https://feunard.github.io/alepha/docs/alepha-server-helmet)) 🎩:** Enhance your application's security by automatically applying essential HTTP security headers like CSP and HSTS. * **CORS ([@alepha/server-cors](https://feunard.github.io/alepha/docs/alepha-server-cors)) ↔️:** A configurable middleware to handle Cross-Origin Resource Sharing (CORS) for your server. 
* **Multipart ([@alepha/server-multipart](https://feunard.github.io/alepha/docs/alepha-server-multipart)) πŸ“Ž:** Seamlessly handle `multipart/form-data` requests for file uploads. * **Compress ([@alepha/server-compress](https://feunard.github.io/alepha/docs/alepha-server-compress)) πŸ“¦πŸ’¨:** Automatically compress server responses with Gzip or Brotli to improve performance. And more, like **Request Logging**, **Error Handling**, and **Response Caching**, cookie parsers, and more, to enhance your server's capabilities. ### Full-Stack & React Ecosystem * **Auth ([@alepha/react-auth](https://feunard.github.io/alepha/docs/alepha-react-auth)) πŸ”‘:** Simplifies frontend authentication flows, providing the `useAuth` hook to manage user sessions and permissions in your React components. * **Head ([@alepha/react-head](https://feunard.github.io/alepha/docs/alepha-react-head)) SEO:** Manage your document's `` for SEO and metadata. Control titles, meta tags, and more, both on the server and client. * **i18n ([@alepha/react-i18n](https://feunard.github.io/alepha/docs/alepha-react-i18n)) 🌍:** A complete internationalization solution for your React applications, with support for lazy-loaded translation files and the `useI18n` hook. * **Form ([@alepha/react-form](https://feunard.github.io/alepha/docs/alepha-react-form)) πŸ“:** Create powerful, type-safe forms with automatic validation using the `useForm` hook, powered by your TypeBox schemas. ### Tooling & Utilities * **Vite ([@alepha/vite](https://feunard.github.io/alepha/docs/alepha-vite)) ✨:** A seamless Vite plugin that handles all the complex build and development server configurations for your full-stack Alepha applications. * **Command ([@alepha/command](https://feunard.github.io/alepha/docs/alepha-command)) ⌨️:** Build powerful, type-safe command-line interfaces and scripts directly within your application using the `$command` descriptor. * **Retry ([@alepha/retry](https://feunard.github.io/alepha/docs/alepha-retry)) πŸ”„:** A declarative and powerful decorator (`$retry`) for automatically retrying failed operations with exponential backoff. --- # concepts-1-alepha-instance.md ## Alepha Instance ```ts import { Alepha } from "alepha"; const alepha = new Alepha(); ``` The `Alepha` class is the core of the Alepha framework. It serves as the main entry point for your application, allowing you to configure and run your app. ### Factory ```ts import { Alepha } from "alepha"; const alepha = Alepha.create(); ``` A preferred way to create an instance of Alepha is by using the `create` method. - Server-side, it will use `process.env.*`. - In testing environments, it will attach `.start` and `.stop` methods to `beforeAll` and `afterAll` hooks if globals is enabled. ### Lifecycle Methods ```ts await alepha.start(); await alepha.stop(); ``` The `start` method initializes the Alepha instance, setting up the necessary environment and configurations. The `stop` method gracefully shuts down the instance, cleaning up resources and connections. #### Running the Application ```ts import { run } from "alepha"; run(alepha) // server: alepha.start().then(() => process.on("exit", () => alepha.stop())); // browser: alepha.start() ``` The `run` function is a convenience method that starts the Alepha instance. It abstracts away the details of server setup, allowing you to focus on building your application. On the server side, `.stop` will be called automatically when the process exits, ensuring a clean shutdown. 
### Configuration ```ts import { Alepha } from "alepha"; Alepha.create({ env: { // custom environment variables MY_VAR: "value", }, // other configuration options }); ``` Alepha constructors can accept a configuration object that allows you to set custom environment variables and other options. Env variables can be accessed using `alepha.env.MY_VAR`, it's immutable, so you cannot change it after the instance is created. ### Container ```ts import { Alepha, run } from "alepha"; import { AlephaServer } from "alepha/server"; const alepha = Alepha.create(); alepha.with(AlephaServer); // register a http server run(alepha); // run http server ``` The Alepha instance acts as a container for your application. You can register services, providers, modules, that your application needs. > Descriptors will automatically register their module when they are used, so you don't need to register them manually.
> Example: A service with `$route()` will register the `AlephaServer` for you. You can also inject services. ```ts import { Alepha, run } from "alepha"; class MyService { greet() { return "Hello from MyService!"; } } const alepha = Alepha.create(); const myService = alepha.inject(MyService); console.log(myService.greet()); // "Hello from MyService!" ``` --- # concepts-2-descriptors.md ## Descriptors Alepha Framework provides factory functions called **descriptors** that allow you to define various aspects of your application in a declarative way. These descriptors are used to create routes, services, and other components without the need for complex boilerplate code. ```ts import { $action } from "alepha/server"; import { run, Alepha, $logger } from "alepha"; import { $queue } from "alepha/queue"; import { $scheduler } from "alepha/scheduler"; class UserController { log = $logger(); findUsers = $action({ handler: () => "List of users", }); sendEmail = $queue(); purgeItems = $scheduler({ cron: "0 0 * * *", handler: () => this.log.info("Purging items..."), }) // etc ... } const alepha = Alepha .create() .with(UserController); run(alepha); ``` Descriptors are functions that return a configuration object, which Alepha uses to set up the corresponding functionality. ### Collection of Descriptors There are more than 30 descriptors available in Alepha, each serving a specific purpose. Here are some of the most commonly used ones: - **$action**: The primary way to build type-safe APIs. Creates a full-featured API endpoint with automatic validation for request parameters, query, and body, along with response serialization. - **$repository**: The main interface for database interaction. Provides a fully-typed repository for a database entity, offering a complete set of CRUD and querying methods (find, create, update, delete, paginate). - **$inject**: The core dependency injection mechanism. Allows a class to easily access instances of other services managed by the Alepha container, acting as the glue that connects the framework. - **$page**: The cornerstone of the React integration. Defines a server-side rendered (SSR) or statically-generated web page, handling its route, data fetching (resolve), and component rendering. - **$cache**: Creates a high-performance cache for functions or arbitrary data. It can wrap a function to automatically cache its results with a defined TTL, improving application speed. - ... --- # concepts-3-providers.md ## Providers Take a look at the following code snippet: ```ts // this is a service class UserNotificationService { // it's a simple method which sends a notification notifyUser(to: string, message: string) { // email.send(to, message); } } ``` The class `NotificationService` inside the Alepha container is called a **service**. It's a stateless singleton. In order to send our email, we need to create a provider that will handle the email sending logic. Providers are classes that encapsulate specific functionality and can be injected into services or other providers. 
```ts import { $hook, $env, t, $inject } from "alepha"; import { createTransport } from "nodemailer"; class EmailProvider { // configure provider with environment variables env = $env(t.object({ SMTP_HOST: t.string(), })); transporter = createTransport({ host: this.env.SMTP_HOST, }); send(to: string, message: string) { return this.transporter.sendMail({ from: to, text: message, }); } } class UserNotificationService { emailProvider = $inject(EmailProvider); notifyUser(to: string, message: string) { return this.emailProvider.send(to, message); } } ``` Voila! Now we have a `UserNotificationService` that can send notifications using the `EmailProvider`. The `EmailProvider` is configured with environment variables and can be injected into any service that needs to send emails. All Alepha packages contains a set of providers that can be used in your application. For example, the `alepha/queue` package provides a `QueueProvider` that can be used to send messages to a queue. ### Polymorphic Providers Sometimes, you may want to use different implementations of a provider based on the environment or configuration. For example, `QueueProvider` may vary based on the queue system you use (e.g., Redis, RabbitMQ, etc.). In this case, you can use polymorphic providers. ```ts import { $env, t, $inject, alepha } from "alepha"; import { QueueProvider, MemoryQueueProvider } from "alepha/queue"; import { RedisQueueProvider } from "alepha/queue/redis"; class TransactionService { // Inject the QueueProvider, which can be either Redis or Memory based on the environment queue = $inject(QueueProvider); } const alepha = Alepha.create().with(TransactionService); if (alepha.isProduction()) { // In production, use a Redis queue provider alepha.with({ provide: QueueProvider, use: RedisQueueProvider }); } else { // In development, use a memory queue provider alepha.with({ provide: QueueProvider, use: MemoryQueueProvider }); } run(alepha); ``` --- # concepts-4-modules.md ## Modules Small applications can be built with a single file, but as your application grows, you will want to organize your code into modules. Modules are a way to group related services, providers, and descriptors together. ```ts import { Alepha, run, $module } from "alepha"; const MyUserModule = $module({ name: "com.example.user", services: [UserController], }); const MyNotificationModule = $module({ name: "com.example.notification", services: [NotificationController], }); const alepha = Alepha .create() .with(MyUserModule) .with(MyNotificationModule); run(alepha); ``` Modules are not mandatory, but they help to organize your code and make it more maintainable. ### Don't $inject across modules When using modules, it's important to avoid injecting services from one module into another. Instead, use the `$client` descriptor to communicate between modules with `$action`. This ensures that your modules remain decoupled and can be reused independently. ```ts import { $client } from "alepha/server/links"; import type { NotificationController } from "@mycompany/api-notifications"; class UserController { notificationCtrl = $client(); createUser = $action({ handler: async () => { // ... await this.notificationCtrl.sendNotification( /* ... */ ); // ... }, }); } // now, modules can run in the same process (monolith) // or in different processes (microservices), does not matter ``` Check out the [alepha/server/links](/docs/alepha-server-links) package for more details on how to use `$client` and communicate between modules. 
--- # guides-1-introduction.md ## What is Alepha? The name "Alepha" is a play on the mathematical concept of *Aleph numbers* (א), which represent infinite sets. With a feminine "a" suffix, it embodies the idea of creating boundless possibilities from a strong, elegant foundation. At its core, **Alepha is an opinionated, class-based framework for building full-stack TypeScript applications with React, Drizzle, and Vite.** Alepha is designed from the ground up to provide a cohesive development experience, where conventions guide you and type safety protects you, from your database schema all the way to your frontend components. ## A Modern, Integrated Foundation Alepha **is not** a wrapper around existing libraries like [Express](https://expressjs.com) or [Fastify](https://fastify.dev). Started in 2024, it is a fresh take on modern backend development. The framework builds for the future of the platform, targeting **Node.js 22+** to leverage the latest features of the runtime. ### An Opinionated Stack, Not a Box of Parts While much of Alepha is a fresh rewrite, it also stands on the shoulders of giants. The framework's philosophy is to build where unique value can be added and integrate where it makes sense. To provide a seamless, end-to-end typed experience, Alepha is built upon a mandatory foundation of three exceptional tools: 1. **[React](https://react.dev/) for UIs:** The `alepha/react` package provides a powerful routing and data-fetching system inspired by the golden era of Next.jsβ€”before Server Components. It offers a straightforward and effective model for building Server-Side Rendered (SSR) applications. 2. **[Drizzle ORM](https://orm.drizzle.team/) for Databases:** Alepha uses Drizzle as its foundation for database interaction. You can use Drizzle's powerful query builder directly, or leverage Alepha's type-safe repository layer (`$repository`), which provides a streamlined and integrated experience for PostgreSQL and SQLite. 3. **[Vite](https://vitejs.dev/) for Building:** Alepha applications are built and bundled using Vite. The `alepha/vite` plugin provides a seamless build process for both your server and client code, with out-of-the-box support for deploying to Docker or Serverless. These three technologies are not optional pillars; they are the core of the Alepha experience. ## Platform Support Alepha is optimized for Node.js but also runs seamlessly on Bun. While a native Bun integration is planned for the future, the current version performs robustly in the Bun runtime. Many packages, especially `alepha/react`, are designed for both server and browser environments, making it a true full-stack solution. --- # guides-2-getting-started.md ## Getting Started Welcome to Alepha! This guide will walk you through creating your first Alepha application in just a few minutes, demonstrating how easily it integrates into any modern TypeScript project. ### Prerequisites All you need is a modern JavaScript runtime. Alepha is built and optimized for **Node.js 22+** or the latest version of **Bun**. * [Install Node.js](https://nodejs.org/) * [Install Bun](https://bun.sh/) If you're new to TypeScript, don't worry! Alepha is designed to be beginner-friendly, and this guide will help you get started without any prior experience. ### 1. Project Setup Let's begin by creating a new project directory and initializing it. ```bash mkdir my-app cd my-app ``` Next, we'll install Alepha as dependency. ```bash # Install the all-in-one Alepha package npm install alepha ``` ### 2. 
Configure TypeScript Alepha is a TypeScript-first framework. Create a `tsconfig.json` file in your project root with the following configuration. This minimal setup enables modern module resolution and JSX support. **`tsconfig.json`** ```json { "compilerOptions": { "module": "nodenext", "target": "esnext", "strict": true, "jsx": "react-jsx" } } ``` You'll also need to update your `package.json` to specify that your project uses ES Modules. Add the following line: **`package.json`** ```json { "type": "module" } ``` ### 3. Create Your First Server Now for the fun part! Create an `src/server.ts` file. This will be the entry point for your application. We'll define a simple server with a single route that responds with "Hello World!". Notice that we are using standard TypeScript classes and methodsβ€”**no decorator shims or complex syntax required.** **`src/server.ts`** ```typescript import { run } from "alepha"; import { $route } from "alepha/server"; class Server { // the $route descriptor declares a new HTTP endpoint // by default, it's a GET request hello = $route({ path: "/", handler: () => "Hello World!", }); } // the run function initializes the Alepha application // and starts the server run(Server); ``` > **Note:** Did you notice the `$` on `$route` ?
> `$route` is a _descriptor_, a factory function usable only within an Alepha context.
> You can learn more about descriptors in the [dedicated page](/docs/descriptors). That's all it takes to write a complete, working web server. Alepha plugs into your project with zero fuss. ### 4. Run Your Application You're all set. You can run your server directly with Node.js or Bun. No extra build steps or runtime tools are needed for development. **Using Node.js:** ```bash node src/server.ts ``` You should see a message indicating that the server has started: ``` [22:05:51.123] INFO : Starting App... [22:05:51.160] INFO : Server listening on http://localhost:3000 [22:05:51.160] INFO : App is now ready [37ms] ``` Now, open your web browser or use a tool like `curl` to access the endpoint: ```bash curl http://localhost:3000 ``` You should see the response: `Hello World!` VoilΓ ! πŸŽ‰ You have successfully created and run your first Alepha application using just your runtime's native capabilities. --- # guides-3-fullstack-app.md ## Building a Full-Stack Application Ready to level up? This guide will show you how to build a complete full-stack application with Alepha, featuring server-side rendering (SSR), client-side navigation, and type-safe routing. We'll transform a basic server into a modern web application with React components and dynamic pages. ### Prerequisites Make sure you've completed the [Getting Started](/docs/getting-started) guide first. You'll need: - Node.js 22+ or the latest version of Bun - A basic Alepha server from the previous guide ### 1. Install Additional Dependencies We need to add React and the Alepha React package for full-stack functionality. ```bash npm install react npm install -D @types/react @vitejs/plugin-react vite ``` ### 2. Project Structure Let's organize our project with the following structure: ``` my-app/ β”œβ”€β”€ src/ β”‚ β”œβ”€β”€ components/ β”‚ β”‚ β”œβ”€β”€ Layout.tsx β”‚ β”‚ β”œβ”€β”€ Home.tsx β”‚ β”‚ └── About.tsx β”‚ β”œβ”€β”€ styles.css β”‚ β”œβ”€β”€ AppRouter.ts β”‚ β”œβ”€β”€ index.server.ts β”‚ └── index.browser.ts β”œβ”€β”€ index.html β”œβ”€β”€ vite.config.ts β”œβ”€β”€ package.json └── tsconfig.json ``` ### 3. Create the App Router The `AppRouter` is the heart of your full-stack application. It defines all your pages using `$page` descriptors and handles routing, data fetching, and error handling. **`src/AppRouter.ts`** ```typescript import { $page } from "alepha/react"; import { t } from "alepha"; export class AppRouter { // Root layout page that wraps all other pages layout = $page({ lazy: () => import("./components/Layout.tsx"), children: () => [this.home, this.about], }); // Home page with static generation for performance home = $page({ path: "/", static: true, lazy: () => import("./components/Home.tsx"), resolve: async () => { // Fetch data on the server before rendering const message = "Welcome to your full-stack Alepha app!"; const timestamp = new Date().toISOString(); return { message, timestamp, }; }, }); about = $page({ path: "/about", lazy: () => import("./components/About.tsx") }); } ``` ### 4. Create React Components Let's create clean React components with simple CSS styling. **`src/components/Layout.tsx`** ```tsx import { Link, NestedView, useRouterEvents } from "alepha/react"; import { useState } from "react"; const Layout = () => { return (

    <div>
      <header>
        <h1>My Alepha App</h1>
        {/* navigation between pages goes here (e.g. <Link> from alepha/react) */}
      </header>
      <main>
        {/* renders the active child page declared in AppRouter */}
        <NestedView />
      </main>
    </div>

); }; export default Layout; ``` **`src/components/Home.tsx`** ```tsx import { Link } from "alepha/react"; interface HomeProps { message: string; timestamp: string; } const Home = ({message, timestamp}: HomeProps) => (

  <div>
    <h1>Full-Stack Alepha Application</h1>
    <p>{message}</p>
    <p>
      Rendered at: <code>{timestamp}</code>
    </p>
    <h2>Features</h2>
    <ul>
      <li>Server-Side Rendering (SSR)</li>
      <li>Type-safe routing with $page descriptors</li>
      <li>Automatic code splitting</li>
      <li>Data fetching with resolve functions</li>
      <li>Static generation for performance</li>
    </ul>
  </div>
); export default Home ``` **`src/components/About.tsx`** ```tsx import { Link } from "alepha/react"; const About = () => (

  <div>
    <h1>About Our Application</h1>
  </div>

); export default About ``` ### 5. Create Simple Styles **`src/styles.css`** ```css body { font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', sans-serif; margin: 0; padding: 20px; line-height: 1.6; } header { background: #f5f5f5; padding: 1rem; margin-bottom: 2rem; border-radius: 8px; } nav a { color: #0066cc; text-decoration: none; margin-right: 1rem; } nav a:hover { text-decoration: underline; } footer { margin-top: 2rem; padding-top: 1rem; border-top: 1px solid #eee; color: #666; } .loading { background: #0066cc; color: white; padding: 0.5rem; border-radius: 4px; margin-bottom: 1rem; } code { background: #f5f5f5; padding: 0.25rem 0.5rem; border-radius: 3px; font-family: monospace; } ``` ### 6. Create HTML Template The HTML template serves as the entry point and includes our CSS file. **`index.html`** ```html My Full-Stack App ``` ### 7. Create Server Entry Point The server entry point initializes your Alepha application and starts the HTTP server. **`src/index.server.ts`** ```typescript import { Alepha, run } from "alepha"; import { AppRouter } from "./AppRouter.ts"; const alepha = Alepha.create(); alepha.with(AppRouter); run(alepha); ``` ### 8. Create Browser Entry Point The browser entry point handles client-side hydration and navigation. **`src/index.browser.ts`** ```typescript import { Alepha, run } from "alepha"; import { AppRouter } from "./AppRouter.ts"; const alepha = Alepha.create(); alepha.with(AppRouter); run(alepha); ``` ### 9. Configure Vite Vite handles the build process and development server for your full-stack application. **`vite.config.ts`** ```typescript import { viteAlepha } from "alepha/vite"; import viteReact from "@vitejs/plugin-react"; import { defineConfig } from "vite"; export default defineConfig({ plugins: [ viteReact(), viteAlepha({ // Point to your server entry file serverEntry: "./src/index.server.ts", }), ], }); ``` ### 10. Update Package.json Add the necessary scripts and update your package.json: **`package.json`** ```json { "type": "module", "scripts": { "dev": "vite", "build": "vite build" } } ``` ### 11. Run Your Full-Stack Application Now you can run your application in development mode: ```bash npm run dev ``` Visit `http://localhost:5173` in your browser. You'll see: - **Server-Side Rendering**: The page loads instantly with pre-rendered HTML - **Client-Side Navigation**: Smooth navigation between pages without full page reloads - **Type Safety**: Full TypeScript support throughout your application - **Data Fetching**: Server-side data loading with automatic serialization ### 12. Build for Production When you're ready to deploy, build your application: ```bash npm run build ``` This creates: - Optimized client-side bundles in `dist/client/` - Server-side code in `dist/server/` - Static assets with proper caching headers ### What You've Built Congratulations! 
πŸŽ‰ You've created a complete full-stack application with: - **Type-Safe Routing**: Using `$page` descriptors for all routes - **Server-Side Rendering**: Fast initial page loads with SEO benefits - **Client-Side Navigation**: Smooth single-page app experience - **Data Fetching**: Server-side data loading with automatic hydration - **Static Generation**: Performance optimization for static content - **Development Tools**: Hot module replacement and TypeScript support ### Next Steps From here, you can: - Add more complex pages with nested routing - Integrate databases with `alepha/postgres` - Add email functionality with `alepha/email` - Implement background jobs with `alepha/queue` - Add authentication with `alepha/security` - Deploy to production platforms like Vercel or Railway Your full-stack Alepha application is now ready for real-world development! --- # packages-alepha-batch.md # Alepha Batch Efficiently process operations in groups by size or time. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module This module allows you to group multiple asynchronous operations into a single "batch," which is then processed together. This is an essential pattern for improving performance, reducing I/O, and interacting efficiently with rate-limited APIs or databases. ```ts import { Alepha, $hook, run, t } from "alepha"; import { $batch } from "alepha/batch"; class LoggingService { // define the batch processor logBatch = $batch({ schema: t.string(), maxSize: 10, maxDuration: [5, "seconds"], handler: async (items) => { console.log(`[BATCH LOG] Processing ${items.length} events:`, items); }, }); // example of how to use it onReady = $hook({ on: "ready", handler: async () => { this.logBatch.push("Application started."); this.logBatch.push("User authenticated."); // ... more events pushed from elsewhere in the app }, }); } ``` This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaBatch } from "alepha/batch"; const alepha = Alepha.create() .with(AlephaBatch); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $batch() Creates a batch processing descriptor for efficient grouping and processing of multiple operations. This descriptor provides a powerful batching mechanism that collects multiple individual items and processes them together in groups, significantly improving performance by reducing overhead and enabling bulk operations. It supports partitioning, concurrent processing, automatic flushing, and intelligent retry mechanisms for robust batch processing workflows. 
**Key Features** - **Intelligent Batching**: Groups items based on size and time thresholds - **Partitioning Support**: Process different types of items in separate batches - **Concurrent Processing**: Handle multiple batches simultaneously with configurable limits - **Automatic Flushing**: Time-based and size-based automatic batch execution - **Type Safety**: Full TypeScript support with schema validation using TypeBox - **Retry Logic**: Built-in retry mechanisms for failed batch operations - **Resource Management**: Automatic cleanup and graceful shutdown handling **Use Cases** Perfect for optimizing high-throughput operations: - Database bulk inserts and updates - API call batching and rate limit optimization - Log aggregation and bulk shipping - File processing and bulk uploads - Event processing and analytics ingestion - Notification delivery optimization - Cache invalidation batching **Basic database batch operations:** ```ts import { $batch } from "alepha/batch"; import { t } from "alepha"; class UserService { userBatch = $batch({ schema: t.object({ id: t.string(), name: t.string(), email: t.string(), createdAt: t.optional(t.string()) }), maxSize: 50, // Process up to 50 users at once maxDuration: [5, "seconds"], // Or flush every 5 seconds handler: async (users) => { // Bulk insert users - much faster than individual inserts console.log(`Processing batch of ${users.length} users`); const result = await this.database.users.insertMany(users.map(user => ({ ...user, createdAt: user.createdAt || new Date().toISOString() }))); console.log(`Successfully inserted ${result.length} users`); return { inserted: result.length, userIds: result.map(r => r.id) }; } }); async createUser(userData: { name: string; email: string }) { // Individual calls are automatically batched const result = await this.userBatch.push({ id: generateId(), name: userData.name, email: userData.email }); return result; // Returns the batch result once batch is processed } } ``` **API call batching with partitioning:** ```ts class NotificationService { notificationBatch = $batch({ schema: t.object({ userId: t.string(), type: t.enum(["email", "sms", "push"]), message: t.string(), priority: t.enum(["high", "normal", "low"]) }), maxSize: 100, maxDuration: [10, "seconds"], // Partition by notification type for different processing partitionBy: (notification) => notification.type, concurrency: 3, // Process up to 3 different types simultaneously handler: async (notifications) => { const type = notifications[0].type; // All items in batch have same type console.log(`Processing ${notifications.length} ${type} notifications`); switch (type) { case 'email': return await this.emailProvider.sendBulk(notifications.map(n => ({ to: n.userId, subject: 'Notification', body: n.message, priority: n.priority }))); case 'sms': return await this.smsProvider.sendBulk(notifications.map(n => ({ to: n.userId, message: n.message }))); case 'push': return await this.pushProvider.sendBulk(notifications.map(n => ({ userId: n.userId, title: 'Notification', body: n.message, priority: n.priority }))); } } }); async sendNotification(userId: string, type: 'email' | 'sms' | 'push', message: string, priority: 'high' | 'normal' | 'low' = 'normal') { // Notifications are automatically batched by type return await this.notificationBatch.push({ userId, type, message, priority }); } } ``` **Log aggregation with retry logic:** ```ts class LoggingService { logBatch = $batch({ schema: t.object({ timestamp: t.number(), level: t.enum(["info", "warn", "error"]), 
message: t.string(), metadata: t.optional(t.record(t.string(), t.any())), source: t.string() }), maxSize: 1000, // Large batches for log efficiency maxDuration: [30, "seconds"], // Longer duration for log aggregation concurrency: 2, // Limit concurrent log shipments retry: { maxAttempts: 5, delay: [2, "seconds"], backoff: "exponential" }, handler: async (logEntries) => { console.log(`Shipping ${logEntries.length} log entries`); try { // Ship logs to external service (e.g., Elasticsearch, Splunk) const response = await this.logShipper.bulkIndex({ index: 'application-logs', body: logEntries.map(entry => ([ { index: { _index: 'application-logs' } }, { ...entry, '@timestamp': new Date(entry.timestamp).toISOString() } ])).flat() }); if (response.errors) { console.error(`Some log entries failed to index`, response.errors); // Retry will be triggered by throwing throw new Error(`Failed to index ${response.errors.length} log entries`); } console.log(`Successfully shipped ${logEntries.length} log entries`); return { shipped: logEntries.length, indexedAt: Date.now() }; } catch (error) { console.error(`Failed to ship logs batch`, error); throw error; // Trigger retry mechanism } } }); async log(level: 'info' | 'warn' | 'error', message: string, metadata?: Record, source: string = 'application') { // Individual log calls are batched and shipped efficiently return await this.logBatch.push({ timestamp: Date.now(), level, message, metadata, source }); } } ``` **File processing with dynamic partitioning:** ```ts class FileProcessingService { fileProcessingBatch = $batch({ schema: t.object({ filePath: t.string(), fileType: t.enum(["image", "video", "document"]), processingOptions: t.object({ quality: t.optional(t.enum(["low", "medium", "high"])), format: t.optional(t.string()), compress: t.optional(t.boolean()) }), priority: t.enum(["urgent", "normal", "background"]) }), maxSize: 20, // Smaller batches for file processing maxDuration: [2, "minutes"], // Reasonable time for file accumulation // Partition by file type and priority for optimal resource usage partitionBy: (file) => `${file.fileType}-${file.priority}`, concurrency: 4, // Multiple concurrent processing pipelines retry: { maxAttempts: 3, delay: [5, "seconds"] }, handler: async (files) => { const fileType = files[0].fileType; const priority = files[0].priority; console.log(`Processing ${files.length} ${fileType} files with ${priority} priority`); try { const results = []; for (const file of files) { const result = await this.processFile(file.filePath, file.fileType, file.processingOptions); results.push({ originalPath: file.filePath, processedPath: result.outputPath, size: result.size, duration: result.processingTime }); } // Update database with batch results await this.updateProcessingStatus(results); console.log(`Successfully processed ${files.length} ${fileType} files`); return { processed: files.length, fileType, priority, totalSize: results.reduce((sum, r) => sum + r.size, 0), results }; } catch (error) { console.error(`Batch file processing failed for ${fileType} files`, error); throw error; } } }); async processFile(filePath: string, fileType: 'image' | 'video' | 'document', options: any, priority: 'urgent' | 'normal' | 'background' = 'normal') { // Files are automatically batched by type and priority return await this.fileProcessingBatch.push({ filePath, fileType, processingOptions: options, priority }); } } ``` --- # packages-alepha-bucket-azure.md # Alepha Bucket Azure Azure Blob Storage implementation for the bucket file storage. 
## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Plugin for Alepha Bucket that provides Azure Blob Storage capabilities. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaBucketAzure } from "alepha/bucket/azure"; const alepha = Alepha.create() .with(AlephaBucketAzure); run(alepha); ``` ## API Reference ### Providers Providers are classes that encapsulate specific functionality and can be injected into your application. They handle initialization, configuration, and lifecycle management. For more details, see the [Providers documentation](/docs/providers). #### AzureFileStorageProvider Azure Blog Storage implementation of File Storage Provider. --- # packages-alepha-bucket-vercel.md # Alepha Bucket Vercel Vercel Blob Storage implementation for the bucket file storage. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Plugin for Alepha Bucket that provides Vercel Blob Storage capabilities. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaBucketVercel } from "alepha/bucket/vercel"; const alepha = Alepha.create() .with(AlephaBucketVercel); run(alepha); ``` ## API Reference ### Providers Providers are classes that encapsulate specific functionality and can be injected into your application. They handle initialization, configuration, and lifecycle management. For more details, see the [Providers documentation](/docs/providers). #### VercelFileStorageProvider Vercel Blob Storage implementation of File Storage Provider. --- # packages-alepha-bucket.md # Alepha Bucket A universal interface for object and file storage providers. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Provides file storage capabilities through declarative bucket descriptors with support for multiple storage backends. The bucket module enables unified file operations across different storage systems using the `$bucket` descriptor on class properties. It abstracts storage provider differences, offering consistent APIs for local filesystem, cloud storage, or in-memory storage for testing environments. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaBucket } from "alepha/bucket"; const alepha = Alepha.create() .with(AlephaBucket); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $bucket() Creates a bucket descriptor for file storage and management with configurable validation. This descriptor provides a comprehensive file storage system that handles file uploads, downloads, validation, and management across multiple storage backends. It supports MIME type validation, size limits, and integrates seamlessly with various storage providers for scalable file management in applications. 
**Key Features** - **Multi-Provider Support**: Works with filesystem, cloud storage (S3, Azure), and in-memory providers - **File Validation**: Automatic MIME type checking and file size validation - **Type Safety**: Full TypeScript support with FileLike interface compatibility - **Event Integration**: Emits events for file operations (upload, delete) for monitoring - **Flexible Configuration**: Per-bucket and per-operation configuration options - **Automatic Detection**: Smart file type and size detection with fallback mechanisms - **Error Handling**: Comprehensive error handling with descriptive error messages **Use Cases** Perfect for handling file storage requirements across applications: - User profile picture and document uploads - Product image and media management - Document storage and retrieval systems - Temporary file handling and processing - Content delivery and asset management - Backup and archival storage - File-based data import/export workflows **Basic file upload bucket:** ```ts import { $bucket } from "alepha/bucket"; class MediaService { images = $bucket({ name: "user-images", description: "User uploaded profile images and photos", mimeTypes: ["image/jpeg", "image/png", "image/gif", "image/webp"], maxSize: 5 // 5MB limit }); async uploadProfileImage(file: FileLike, userId: string): Promise { // File is automatically validated against MIME types and size const fileId = await this.images.upload(file); // Update user profile with new image await this.userService.updateProfileImage(userId, fileId); return fileId; } async getUserProfileImage(userId: string): Promise { const user = await this.userService.getUser(userId); if (!user.profileImageId) { throw new Error('User has no profile image'); } return await this.images.download(user.profileImageId); } } ``` **Document storage with multiple file types:** ```ts class DocumentManager { documents = $bucket({ name: "company-documents", description: "Legal documents, contracts, and reports", mimeTypes: [ "application/pdf", "application/msword", "application/vnd.openxmlformats-officedocument.wordprocessingml.document", "text/plain", "text/csv" ], maxSize: 50 // 50MB for large documents }); async uploadDocument(file: FileLike, metadata: { title: string; category: string; userId: string }): Promise { try { const fileId = await this.documents.upload(file); // Store document metadata in database await this.database.documents.create({ id: fileId, title: metadata.title, category: metadata.category, uploadedBy: metadata.userId, fileName: file.name, fileSize: file.size, mimeType: file.type, uploadedAt: new Date() }); console.log(`Document uploaded successfully: ${metadata.title} (${fileId})`); return fileId; } catch (error) { console.error(`Failed to upload document: ${metadata.title}`, error); throw error; } } async downloadDocument(documentId: string, userId: string): Promise { // Check permissions const document = await this.database.documents.findById(documentId); if (!document) { throw new Error('Document not found'); } const hasAccess = await this.permissionService.canAccessDocument(userId, documentId); if (!hasAccess) { throw new Error('Insufficient permissions to access document'); } // Download and return file return await this.documents.download(documentId); } async deleteDocument(documentId: string, userId: string): Promise { // Verify ownership or admin privileges const document = await this.database.documents.findById(documentId); if (document.uploadedBy !== userId && !await this.userService.isAdmin(userId)) { throw new 
Error('Cannot delete document: insufficient permissions'); } // Delete from storage and database await this.documents.delete(documentId); await this.database.documents.delete(documentId); console.log(`Document deleted: ${document.title} (${documentId})`); } } ``` **Cloud storage integration with custom provider:** ```ts class ProductImageService { productImages = $bucket({ name: "product-images", provider: S3FileStorageProvider, // Use AWS S3 for production storage description: "Product catalog images and thumbnails", mimeTypes: ["image/jpeg", "image/png", "image/webp"], maxSize: 10 // 10MB for high-quality product images }); thumbnails = $bucket({ name: "product-thumbnails", provider: S3FileStorageProvider, description: "Generated product thumbnail images", mimeTypes: ["image/jpeg", "image/webp"], maxSize: 1 // 1MB for thumbnails }); async uploadProductImage(productId: string, file: FileLike): Promise<{ imageId: string; thumbnailId: string }> { try { // Upload original image const imageId = await this.productImages.upload(file); // Generate and upload thumbnail const thumbnailFile = await this.imageProcessor.generateThumbnail(file, { width: 300, height: 300, format: 'webp', quality: 80 }); const thumbnailId = await this.thumbnails.upload(thumbnailFile); // Update product in database await this.database.products.update(productId, { imageId, thumbnailId, imageUpdatedAt: new Date() }); console.log(`Product images uploaded for ${productId}: image=${imageId}, thumbnail=${thumbnailId}`); return { imageId, thumbnailId }; } catch (error) { console.error(`Failed to upload product image for ${productId}`, error); throw error; } } async getProductImage(productId: string, thumbnail: boolean = false): Promise { const product = await this.database.products.findById(productId); if (!product) { throw new Error(`Product ${productId} not found`); } const imageId = thumbnail ? product.thumbnailId : product.imageId; if (!imageId) { throw new Error(`Product ${productId} has no ${thumbnail ? 'thumbnail' : 'image'}`); } const bucket = thumbnail ? 
this.thumbnails : this.productImages; return await bucket.download(imageId); } } ``` **Temporary file processing with memory storage:** ```ts class FileProcessingService { tempFiles = $bucket({ name: "temp-processing", provider: "memory", // Use in-memory storage for temporary files description: "Temporary files during processing workflows", maxSize: 100 // Large limit for processing workflows }); async processDataFile(inputFile: FileLike, transformations: string[]): Promise { let currentFile = inputFile; const intermediateFiles: string[] = []; try { // Upload initial file to temp storage let currentFileId = await this.tempFiles.upload(currentFile); intermediateFiles.push(currentFileId); // Apply each transformation for (const transformation of transformations) { console.log(`Applying transformation: ${transformation}`); // Download current file currentFile = await this.tempFiles.download(currentFileId); // Apply transformation const transformedFile = await this.applyTransformation(currentFile, transformation); // Upload transformed file currentFileId = await this.tempFiles.upload(transformedFile); intermediateFiles.push(currentFileId); } // Download final result const finalFile = await this.tempFiles.download(currentFileId); console.log(`File processing completed with ${transformations.length} transformations`); return finalFile; } finally { // Clean up all intermediate files for (const fileId of intermediateFiles) { try { await this.tempFiles.delete(fileId); } catch (error) { console.warn(`Failed to clean up temp file ${fileId}:`, error.message); } } console.log(`Cleaned up ${intermediateFiles.length} temporary files`); } } } ``` **File validation with dynamic configuration:** ```ts class UserContentService { userContent = $bucket({ name: "user-content", description: "User-generated content with flexible validation" // Base configuration - can be overridden per operation }); async uploadUserFile(file: FileLike, userId: string, contentType: 'avatar' | 'document' | 'media'): Promise { // Dynamic validation based on content type const validationConfig = this.getValidationConfig(contentType, userId); try { // Upload with specific validation rules const fileId = await this.userContent.upload(file, validationConfig); // Log upload for audit trail await this.auditLogger.log({ action: 'file_upload', userId, fileId, contentType, fileName: file.name, fileSize: file.size, mimeType: file.type }); return fileId; } catch (error) { console.error(`File upload failed for user ${userId}`, { contentType, fileName: file.name, error: error.message }); throw error; } } private getValidationConfig(contentType: string, userId: string) { const baseConfig = { avatar: { mimeTypes: ['image/jpeg', 'image/png'], maxSize: 2 // 2MB for avatars }, document: { mimeTypes: ['application/pdf', 'text/plain'], maxSize: 10 // 10MB for documents }, media: { mimeTypes: ['image/jpeg', 'image/png', 'video/mp4'], maxSize: 50 // 50MB for media files } }; const config = baseConfig[contentType]; // Premium users get higher limits if (this.userService.isPremium(userId)) { config.maxSize *= 2; } return config; } } ``` --- # packages-alepha-cache-redis.md # Alepha Cache Redis Redis implementation for the caching interface. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Plugin for Alepha Cache that provides Redis caching capabilities. 
This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaCacheRedis } from "alepha/cache/redis"; const alepha = Alepha.create() .with(AlephaCacheRedis); run(alepha); ``` --- # packages-alepha-cache.md # Alepha Cache A generic key-value caching interface with in-memory implementation. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Provides high-performance caching capabilities for Alepha applications with configurable TTL and multiple storage backends. The cache module enables declarative caching through the `$cache` descriptor, allowing you to cache method results, API responses, or computed values with automatic invalidation and type safety. It supports both in-memory and persistent storage backends for different performance and durability requirements. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaCache } from "alepha/cache"; const alepha = Alepha.create() .with(AlephaCache); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $cache() Creates a cache descriptor for high-performance data caching with automatic cache management. This descriptor provides a powerful caching layer that can significantly improve application performance by storing frequently accessed data in memory or external cache stores like Redis. It supports both function result caching and manual cache operations with intelligent serialization and TTL management. **Key Features** - **Function Result Caching**: Automatically cache function results based on input parameters - **Multiple Storage Backends**: Support for in-memory, Redis, and custom cache providers - **Intelligent Serialization**: Automatic handling of JSON, strings, and binary data - **TTL Management**: Configurable time-to-live with automatic expiration - **Cache Invalidation**: Pattern-based cache invalidation with wildcard support - **Environment Controls**: Enable/disable caching via environment variables - **Type Safety**: Full TypeScript support with generic type parameters ## Cache Strategies ### 1. Function Result Caching (Memoization) Automatically cache the results of expensive operations based on input parameters. ### 2. Manual Cache Operations Direct cache operations for custom caching logic and data storage. 
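Both strategies use the same `$cache` options shown in the examples that follow; the only difference is whether a `handler` is provided. A compact sketch combining the two (service and endpoint names are illustrative):

```ts
import { $cache } from "alepha/cache";

class WeatherService {
  // Strategy 1 - function result caching: the handler only runs on a cache miss,
  // keyed by its arguments, and the result is stored for the configured TTL.
  getForecast = $cache({
    name: "forecast",
    ttl: [10, "minutes"],
    handler: async (city: string) => {
      const res = await fetch(`https://api.example.com/forecast?city=${city}`);
      return await res.json();
    }
  });

  // Strategy 2 - manual cache operations: no handler, reads and writes are explicit.
  lastSearch = $cache({
    name: "last-search",
    ttl: [1, "hour"]
  });

  async search(city: string) {
    const forecast = await this.getForecast(city); // memoized per city
    await this.lastSearch.set(city, new Date().toISOString()); // manual write
    return forecast;
  }
}
```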
## Storage Backends - **Memory**: Fast in-memory cache (default for development) - **Redis**: Distributed cache for production environments - **Custom Providers**: Implement your own cache storage backend **Basic function result caching:** ```ts import { $cache } from "alepha/cache"; class DataService { // Cache expensive database queries getUserData = $cache({ name: "user-data", ttl: [10, "minutes"], handler: async (userId: string) => { // Expensive database operation return await database.users.findById(userId); } }); async getUser(id: string) { // This will hit cache on subsequent calls with same ID return await this.getUserData(id); } } ``` **API response caching with custom key generation:** ```ts class ApiService { fetchUserPosts = $cache({ name: "user-posts", ttl: [5, "minutes"], key: (userId: string, page: number) => `${userId}:page:${page}`, handler: async (userId: string, page: number = 1) => { const response = await fetch(`/api/users/${userId}/posts?page=${page}`); return await response.json(); } }); } ``` **Manual cache operations for custom logic:** ```ts class SessionService { sessionCache = $cache({ name: "user-sessions", ttl: [1, "hour"], provider: "memory" // Use memory cache for sessions }); async storeSession(sessionId: string, session: UserSession) { await this.sessionCache.set(sessionId, session); } async getSession(sessionId: string): Promise { return await this.sessionCache.get(sessionId); } async invalidateUserSessions(userId: string) { // Invalidate all sessions for a user using wildcards await this.sessionCache.invalidate(`user:${userId}:*`); } } ``` **Redis-backed caching for production:** ```ts class ProductService { productCache = $cache({ name: "products", ttl: [1, "hour"], provider: RedisCacheProvider, // Use Redis for distributed caching handler: async (productId: string) => { return await this.database.products.findById(productId); } }); async invalidateProduct(productId: string) { await this.productCache.invalidate(productId); } async invalidateAllProducts() { await this.productCache.invalidate("*"); } } ``` **Conditional caching with environment controls:** ```ts class ExpensiveService { computation = $cache({ name: "heavy-computation", ttl: [1, "day"], disabled: process.env.NODE_ENV === "development", // Disable in dev handler: async (input: ComplexInput) => { // Very expensive computation that should be cached in production return await performHeavyComputation(input); } }); } ``` --- # packages-alepha-command.md # Alepha Command Build powerful, type-safe command-line interfaces for your application. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module This module provides a powerful way to build command-line interfaces directly within your Alepha application, using declarative descriptors. It allows you to define commands using the `$command` descriptor. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaCommand } from "alepha/command"; const alepha = Alepha.create() .with(AlephaCommand); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $command() Declares a CLI command. 
This descriptor allows you to define a command, its flags, and its handler within your Alepha application structure. --- # packages-alepha-core.md # Alepha Core The essential dependency injection and application lifecycle engine. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Core container of the Alepha framework. It is responsible for managing the lifecycle of services, handling dependency injection, and providing a unified interface for the application. ```ts import { Alepha, run } from "alepha"; class MyService { // business logic here } const alepha = Alepha.create({ // state, env, and other properties }) alepha.with(MyService); run(alepha); // trigger .start (and .stop) automatically ``` ### Alepha Factory Alepha.create() is an enhanced version of new Alepha(). - It merges `process.env` with the provided state.env when available. - It populates the test hooks for Vitest or Jest environments when available. new Alepha() is fine if you don't need these helpers. ### Platforms & Environments Alepha is designed to work in various environments: - **Browser**: Runs in the browser, using the global `window` object. - **Serverless**: Runs in serverless environments like Vercel or Vite. - **Test**: Runs in test environments like Jest or Vitest. - **Production**: Runs in production environments, typically with NODE_ENV set to "production". * You can check the current environment using the following methods: - `isBrowser()`: Returns true if the App is running in a browser environment. - `isServerless()`: Returns true if the App is running in a serverless environment. - `isTest()`: Returns true if the App is running in a test environment. - `isProduction()`: Returns true if the App is running in a production environment. ### State & Environment The state of the Alepha container is stored in the `store` property. Most important property is `store.env`, which contains the environment variables. ```ts const alepha = Alepha.create({ env: { MY_VAR: "value" } }); // You can access the environment variables using alepha.env console.log(alepha.env.MY_VAR); // "value" // But you should use $env() descriptor to get typed values from the environment. class App { env = $env( t.object({ MY_VAR: t.string(), }) ); } ``` ### Modules Modules are a way to group services together. You can register a module using the `$module` descriptor. ```ts import { $module } from "alepha"; class MyLib {} const myModule = $module({ name: "my.project.module", services: [MyLib], }); ``` Do not use modules for small applications. ### Hooks Hooks are a way to run async functions from all registered providers/services. You can register a hook using the `$hook` descriptor. ```ts import { $hook } from "alepha"; class App { log = $logger(); onCustomerHook = $hook({ on: "my:custom:hook", handler: () => { this.log?.info("App is being configured"); }, }); } Alepha.create() .with(App) .start() .then(alepha => alepha.events.emit("my:custom:hook")); ``` Hooks are fully typed. You can create your own hooks by using module augmentation: ```ts declare module "alepha" { interface Hooks { "my:custom:hook": { arg1: string; } } } ``` @module alepha ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). 
#### $env() Get typed values from environment variables. ```ts const alepha = Alepha.create({ env: { // Alepha.create() will also use process.env when running on Node.js HELLO: "world", } }); class App { log = $logger(); // program expect a var env "HELLO" as string to works env = $env(t.object({ HELLO: t.string() })); sayHello = () => this.log.info("Hello ${this.env.HELLO}") } run(alepha.with(App)); ``` #### $hook() Registers a new hook. ```ts import { $hook } from "alepha"; class MyProvider { onStart = $hook({ name: "start", // or "configure", "ready", "stop", ... handler: async (app) => { // await db.connect(); ... } }); } ``` Hooks are used to run async functions from all registered providers/services. You can't register a hook after the App has started. It's used under the hood by the `configure`, `start`, and `stop` methods. Some modules also use hooks to run their own logic. (e.g. `@alepha/server`). You can create your own hooks by using module augmentation: ```ts declare module "@alepha/core" { interface Hooks { "my:custom:hook": { arg1: string; } } } await alepha.events.emit("my:custom:hook", { arg1: "value" }); ``` #### $inject() Get the instance of the specified type from the context. ```ts class A { } class B { a = $inject(A); } ``` #### $module() Wrap Services and Descriptors into a Module. - A module is just a Service extended {@link Module}. - You must attach a `name` to it. - Name must follow the pattern: `project.module.submodule`. ```ts import { $module } from "alepha"; import { MyService } from "./MyService.ts"; // export MyService, so it can be used everywhere export * from "./MyService.ts"; export default $module({ name: "my.project.module", // MyService will have a module context "my.project.module" services: [MyService], }); ``` ## Why Modules? ### Logging By default, AlephaLogger will log the module name in the logs. This helps to identify where the logs are coming from. You can also set different log levels for different modules. It means you can set 'some.very.specific.module' to 'debug' and keep the rest of the application to 'info'. ### Modulith Force to structure your application in modules, even if it's a single deployable unit. It helps to keep a clean architecture and avoid monolithic applications. You can also use `MODULE_INCLUDE` and `MODULE_EXCLUDE` environment variables to load only specific modules. A strict mode is planned to enforce module boundaries. Throwing errors when a service from another module is injected. ### When not to use Modules? Small applications does not need modules. It's better to keep it simple. Modules are more useful when the application grows and needs to be structured. If we speak with `$actions`, a module should be used when you have more than 30 actions in a single module. --- # packages-alepha-datetime.md # Alepha Datetime Date, time, and duration utilities based on Day.js. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $interval() Run a function periodically. It uses the `setInterval` internally. It starts by default when the context starts and stops when the context stops. 
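`$interval()` has no example in this reference, so here is a minimal sketch; the option names (`duration`, `handler`) are assumptions modeled on the other time-based descriptors, not a confirmed signature:

```ts
import { $interval } from "alepha/datetime";
import { $logger } from "alepha/logger";

class HeartbeatService {
  log = $logger();

  // Assumed options: a duration tuple plus a handler, mirroring descriptors like $cache.
  heartbeat = $interval({
    duration: [30, "seconds"],
    handler: async () => {
      // Runs periodically while the Alepha context is started
      // and stops automatically when the context stops.
      this.log.info("heartbeat");
    }
  });
}
```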
--- # packages-alepha-devtools.md # Alepha Devtools Developer tools for monitoring and debugging Alepha applications. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Developer tools module for monitoring and debugging Alepha applications. This module provides comprehensive data collection capabilities for tracking application behavior, performance metrics, and debugging information in real-time. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaDevtools } from "alepha/devtools"; const alepha = Alepha.create() .with(AlephaDevtools); run(alepha); ``` --- # packages-alepha-email.md # Alepha Email Email sending interface with multiple provider implementations (memory, local file, nodemailer). ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Provides email sending capabilities for Alepha applications with multiple provider backends. The email module enables declarative email sending through the `$email` descriptor, allowing you to send emails through different providers: memory (for testing), local file system, or SMTP via Nodemailer. It supports HTML email content and automatic provider selection based on environment configuration. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaEmail } from "alepha/email"; const alepha = Alepha.create() .with(AlephaEmail); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $email() Creates an email descriptor for sending type-safe templated emails. The $email descriptor provides a powerful templating system for creating and sending emails with full type safety and validation. It supports multiple email providers, template variable validation, and automatic HTML rendering. **Template Engine** - Simple {{variable}} syntax for dynamic content - Automatic template variable validation at runtime - Support for nested object properties in templates - HTML email support with rich formatting **Type Safety** - Full TypeScript support with schema validation using TypeBox - Compile-time type checking for template variables - Runtime validation of email data before sending - Automatic type inference from schema definitions **Provider Flexibility** - Memory provider for development and testing - Support for SMTP, SendGrid, AWS SES, and other providers - Custom provider implementation for specialized services - Automatic fallback and error handling **Template Management** - Reusable email templates across your application - Centralized template configuration and maintenance - Template variable documentation through schemas - Easy testing and preview capabilities **Development Experience** - Clear error messages for missing template variables - Comprehensive logging for debugging email delivery - Memory provider captures emails for testing - Template validation before sending ```typescript const welcomeEmail = $email({ subject: "Welcome to {{companyName}}, {{firstName}}!", body: `

Welcome {{firstName}} {{lastName}}!
Thank you for joining {{companyName}}.
Your account role is: {{role}}
Get started by visiting your dashboard: {{dashboardUrl}}
`, schema: t.object({ firstName: t.string(), lastName: t.string(), companyName: t.string(), role: t.enum(["admin", "user", "manager"]), dashboardUrl: t.string() }) }); // Send with full type safety await welcomeEmail.send("user@example.com", { firstName: "John", lastName: "Doe", companyName: "Acme Corp", role: "user", dashboardUrl: "https://app.acme.com/dashboard" }); ``` ```typescript const orderConfirmation = $email({ subject: "Order #{{orderNumber}} confirmed - {{totalAmount}}", body: `

Order Confirmed!
Hi {{customerName}},
Your order #{{orderNumber}} has been confirmed.
Order Details:
Total: {{totalAmount}}
Estimated delivery: {{deliveryDate}}
`, schema: t.object({ customerName: t.string(), orderNumber: t.string(), totalAmount: t.string(), deliveryDate: t.string() }) }); ``` ```typescript const testEmail = $email({ subject: "Test: {{subject}}", body: "{{message}}", provider: "memory", // Captures emails for testing schema: t.object({ subject: t.string(), message: t.string() }) }); // In tests - emails are captured, not actually sent await testEmail.send("test@example.com", { subject: "Unit Test", message: "This email was captured for testing" }); ``` --- # packages-alepha-file.md # Alepha File Helpers for creating and managing file-like objects seamlessly. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` --- # packages-alepha-lock-redis.md # Alepha Lock Redis Redis implementation for the distributed locking mechanism. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Plugin for Alepha Lock that provides Redis locking capabilities. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaLockRedis } from "alepha/lock/redis"; const alepha = Alepha.create() .with(AlephaLockRedis); run(alepha); ``` --- # packages-alepha-lock.md # Alepha Lock Distributed mutex and semaphore for resource locking and synchronization. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Lock a resource for a certain period of time. This module provides a memory implementation of the lock provider. You probably want to use an implementation like RedisLockProvider for distributed systems. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaLock } from "alepha/lock"; const alepha = Alepha.create() .with(AlephaLock); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $lock() Creates a distributed lock descriptor for ensuring single-instance execution across processes. This descriptor provides a powerful distributed locking mechanism that prevents multiple instances of the same operation from running simultaneously. It's essential for maintaining data consistency and preventing race conditions in distributed applications, scheduled tasks, and critical sections that must execute atomically.
**Key Features** - **Distributed Coordination**: Works across multiple processes, servers, and containers - **Automatic Expiration**: Locks expire automatically to prevent deadlocks - **Graceful Handling**: Configurable wait behavior for different use cases - **Grace Periods**: Optional lock extension after completion for additional safety - **Topic Integration**: Uses pub/sub for efficient lock release notifications - **Unique Instance IDs**: Prevents lock conflicts between different instances - **Timeout Management**: Configurable durations with intelligent retry logic **Use Cases** Perfect for ensuring single execution in distributed environments: - Database migrations and schema updates - Scheduled job execution (cron-like tasks) - File processing and batch operations - Critical section protection - Resource initialization and cleanup - Singleton service operations - Cache warming and maintenance tasks **Basic lock for scheduled tasks:** ```ts import { $lock } from "alepha/lock"; class ScheduledTaskService { dailyReport = $lock({ handler: async () => { // This will only run on one server even if multiple servers // trigger the task simultaneously console.log('Generating daily report...'); const report = await this.generateDailyReport(); await this.sendReportToManagement(report); console.log('Daily report completed'); } }); async runDailyReport() { // Multiple servers can call this, but only one will execute await this.dailyReport.run(); } } ``` **Migration lock with wait behavior:** ```ts class DatabaseService { migration = $lock({ wait: true, // Wait for other instances to complete migration maxDuration: [10, "minutes"], // Migration timeout handler: async (version: string) => { console.log(`Running migration to version ${version}`); const currentVersion = await this.getCurrentSchemaVersion(); if (currentVersion >= version) { console.log(`Already at version ${version}, skipping`); return; } await this.runMigrationScripts(version); await this.updateSchemaVersion(version); console.log(`Migration to ${version} completed`); } }); async migrateToVersion(version: string) { // All instances will wait for the first one to complete // before continuing with their startup process await this.migration.run(version); } } ``` **Dynamic lock keys with grace periods:** ```ts class FileProcessor { processFile = $lock({ name: (filePath: string) => `file-processing:${filePath}`, wait: false, // Don't wait, skip if already processing maxDuration: [30, "minutes"], gracePeriod: [5, "minutes"], // Keep lock for 5min after completion handler: async (filePath: string) => { console.log(`Processing file: ${filePath}`); try { const fileData = await this.readFile(filePath); const processedData = await this.processData(fileData); await this.saveProcessedData(filePath, processedData); await this.moveToCompleted(filePath); console.log(`File processing completed: ${filePath}`); } catch (error) { console.error(`File processing failed: ${filePath}`, error); await this.moveToError(filePath, error.message); throw error; } } }); async processUploadedFile(filePath: string) { // Each file gets its own lock, preventing duplicate processing // Grace period prevents immediate reprocessing of the same file await this.processFile.run(filePath); } } ``` **Resource initialization with conditional grace periods:** ```ts class CacheService { warmCache = $lock({ name: (cacheKey: string) => `cache-warming:${cacheKey}`, wait: true, // Wait for cache to be warmed before continuing maxDuration: [15, "minutes"], gracePeriod: (cacheKey: 
string) => { // Dynamic grace period based on cache importance const criticalCaches = ['user-sessions', 'product-catalog']; return criticalCaches.includes(cacheKey) ? [30, "minutes"] // Longer grace for critical caches : [5, "minutes"]; // Shorter grace for regular caches }, handler: async (cacheKey: string, force: boolean = false) => { console.log(`Warming cache: ${cacheKey}`); if (!force && await this.isCacheWarm(cacheKey)) { console.log(`Cache ${cacheKey} is already warm`); return; } const startTime = Date.now(); switch (cacheKey) { case 'user-sessions': await this.warmUserSessionsCache(); break; case 'product-catalog': await this.warmProductCatalogCache(); break; case 'configuration': await this.warmConfigurationCache(); break; default: throw new Error(`Unknown cache key: ${cacheKey}`); } const duration = Date.now() - startTime; console.log(`Cache warming completed for ${cacheKey} in ${duration}ms`); await this.markCacheAsWarm(cacheKey); } }); async ensureCacheWarmed(cacheKey: string, force: boolean = false) { // Multiple instances can call this, but cache warming happens only once // All instances wait for completion before proceeding await this.warmCache.run(cacheKey, force); } } ``` **Critical section protection with custom timeout handling:** ```ts class InventoryService { updateInventory = $lock({ name: (productId: string) => `inventory-update:${productId}`, wait: true, // Ensure all inventory updates are sequential maxDuration: [2, "minutes"], gracePeriod: [30, "seconds"], // Brief grace to prevent immediate conflicts handler: async (productId: string, quantity: number, operation: 'add' | 'subtract') => { console.log(`Updating inventory for product ${productId}: ${operation} ${quantity}`); try { // Start transaction for inventory update await this.db.transaction(async (tx) => { const currentInventory = await tx.getInventory(productId); if (operation === 'subtract' && currentInventory.quantity < quantity) { throw new Error(`Insufficient inventory for product ${productId}. Available: ${currentInventory.quantity}, Requested: ${quantity}`); } const newQuantity = operation === 'add' ? currentInventory.quantity + quantity : currentInventory.quantity - quantity; await tx.updateInventory(productId, newQuantity); // Log inventory change for audit await tx.logInventoryChange({ productId, operation, quantity, previousQuantity: currentInventory.quantity, newQuantity, timestamp: new Date() }); console.log(`Inventory updated for product ${productId}: ${currentInventory.quantity} -> ${newQuantity}`); }); // Notify other services about inventory change await this.inventoryChangeNotifier.notify({ productId, operation, quantity, timestamp: new Date() }); } catch (error) { console.error(`Inventory update failed for product ${productId}`, error); throw error; } } }); async addInventory(productId: string, quantity: number) { await this.updateInventory.run(productId, quantity, 'add'); } async subtractInventory(productId: string, quantity: number) { await this.updateInventory.run(productId, quantity, 'subtract'); } } ``` ### Providers Providers are classes that encapsulate specific functionality and can be injected into your application. They handle initialization, configuration, and lifecycle management. For more details, see the [Providers documentation](/docs/providers). #### MemoryLockProvider A simple in-memory store provider. 
--- # packages-alepha-logger.md # Alepha Logger A simple logger for Alepha applications ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $logger() Create a logger. `name` is optional, by default it will use the name of the service. ```ts import { $logger } from "alepha"; class MyService { log = $logger(); constructor() { this.log.info("Service initialized"); // print something like '[23:45:53.326] INFO : Service initialized' } } ``` --- # packages-alepha-postgres.md # Alepha Postgres A type-safe SQL query builder and ORM using Drizzle. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Postgres client based on Drizzle ORM, Alepha type-safe friendly. ```ts const users = $entity({ name: "users", schema: t.object({ id: pg.primaryKey(), name: t.string(), email: t.string(), }), }); class Db { users = $repository(users); } const db = alepha.inject(Db); const user = await db.users.one({ name: { eq: "John Doe" } }); ``` This is not a full ORM, but rather a set of tools to work with Postgres databases in a type-safe way. It provides: - A type-safe way to define entities and repositories. (via `$entity` and `$repository`) - Custom query builders and filters. - Built-in special columns like `createdAt`, `updatedAt`, `deletedAt`, `version`. - Automatic JSONB support. - Automatic synchronization of entities with the database schema (for testing and development). - Fallback to raw SQL via Drizzle ORM `sql` function. Migrations are supported via Drizzle ORM, you need to use the `drizzle-kit` CLI tool to generate and run migrations. Relations are **NOT SUPPORTED** yet. If you need relations, please use the `drizzle-orm` package directly. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaPostgres } from "alepha/postgres"; const alepha = Alepha.create() .with(AlephaPostgres); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $entity() Creates a database entity descriptor that defines table structure using TypeBox schemas. This descriptor provides a type-safe way to define database tables using JSON Schema syntax while generating the necessary database metadata for migrations and operations. It integrates with Drizzle ORM under the hood and works seamlessly with the $repository descriptor for complete database functionality. 
**Key Features** - **Type-Safe Schema Definition**: Uses TypeBox for full TypeScript type inference - **Automatic Table Generation**: Creates Drizzle ORM table structures automatically - **Index Management**: Supports single-column, multi-column, and unique indexes - **Constraint Support**: Foreign keys, unique constraints, and check constraints - **Audit Fields**: Built-in support for created_at, updated_at, deleted_at, and version fields - **Schema Validation**: Automatic insert/update schema generation with validation **Important Note**: This descriptor only defines the table structure - it does not create the physical database table. Use it with $repository to perform actual database operations, and run migrations to create the tables in your database. **Use Cases** Essential for defining database schema in type-safe applications: - User management and authentication tables - Business domain entities (products, orders, customers) - Audit and logging tables - Junction tables for many-to-many relationships - Configuration and settings tables **Basic entity with indexes:** ```ts import { $entity } from "alepha/postgres"; import { pg, t } from "alepha"; const User = $entity({ name: "users", schema: t.object({ id: pg.primaryKey(t.uuid()), email: t.string({ format: "email" }), username: t.string({ minLength: 3, maxLength: 30 }), firstName: t.string(), lastName: t.string(), isActive: t.boolean({ default: true }), createdAt: pg.createdAt(), updatedAt: pg.updatedAt(), deletedAt: pg.deletedAt() }), indexes: [ "email", // Simple index on email "username", // Simple index on username { column: "email", unique: true }, // Unique constraint on email { columns: ["firstName", "lastName"] } // Composite index ] }); ``` **E-commerce product entity with relationships:** ```ts const Product = $entity({ name: "products", schema: t.object({ id: pg.primaryKey(t.uuid()), sku: t.string({ minLength: 3 }), name: t.string({ minLength: 1, maxLength: 200 }), description: t.optional(t.string()), price: t.number({ minimum: 0 }), categoryId: t.string({ format: "uuid" }), inStock: t.boolean({ default: true }), stockQuantity: t.integer({ minimum: 0, default: 0 }), tags: t.optional(t.array(t.string())), // PostgreSQL array column metadata: t.optional(t.record(t.string(), t.any())), // JSONB column version: pg.version(), createdAt: pg.createdAt(), updatedAt: pg.updatedAt() }), indexes: [ { column: "sku", unique: true }, // Unique SKU "categoryId", // Foreign key index "inStock", // Filter frequently by stock status { columns: ["categoryId", "inStock"] }, // Composite for category + stock queries "createdAt" // For date-based queries ], foreignKeys: [ { name: "fk_product_category", columns: ["categoryId"], foreignColumns: [Category.id] // Reference to Category entity } ] }); ``` **Audit log entity with constraints:** ```ts const AuditLog = $entity({ name: "audit_logs", schema: t.object({ id: pg.primaryKey(t.uuid()), tableName: t.string(), recordId: t.string(), action: t.enum(["CREATE", "UPDATE", "DELETE"]), userId: t.optional(t.string({ format: "uuid" })), oldValues: t.optional(t.record(t.string(), t.any())), newValues: t.optional(t.record(t.string(), t.any())), timestamp: pg.createdAt(), ipAddress: t.optional(t.string()), userAgent: t.optional(t.string()) }), indexes: [ "tableName", "recordId", "userId", "action", { columns: ["tableName", "recordId"] }, // Find all changes to a record { columns: ["userId", "timestamp"] }, // User activity timeline "timestamp" // Time-based queries ], constraints: [ { name: 
"valid_action_values", columns: ["action"], check: sql`action IN ('CREATE', 'UPDATE', 'DELETE')` } ] }); ``` **Many-to-many junction table:** ```ts const UserRole = $entity({ name: "user_roles", schema: t.object({ id: pg.primaryKey(t.uuid()), userId: t.string({ format: "uuid" }), roleId: t.string({ format: "uuid" }), assignedBy: t.string({ format: "uuid" }), assignedAt: pg.createdAt(), expiresAt: t.optional(t.datetime()) }), indexes: [ "userId", "roleId", "assignedBy", { columns: ["userId", "roleId"], unique: true }, // Prevent duplicate assignments "expiresAt" // For cleanup of expired roles ], foreignKeys: [ { columns: ["userId"], foreignColumns: [User.id] }, { columns: ["roleId"], foreignColumns: [Role.id] }, { columns: ["assignedBy"], foreignColumns: [User.id] } ] }); ``` **Entity with custom Drizzle configuration:** ```ts const Order = $entity({ name: "orders", schema: t.object({ id: pg.primaryKey(t.uuid()), orderNumber: t.string(), customerId: t.string({ format: "uuid" }), status: t.enum(["pending", "processing", "shipped", "delivered"]), totalAmount: t.number({ minimum: 0 }), currency: t.string({ default: "USD" }), notes: t.optional(t.string()), createdAt: pg.createdAt(), updatedAt: pg.updatedAt(), version: pg.version() }), indexes: [ { column: "orderNumber", unique: true }, "customerId", "status", "createdAt", { columns: ["customerId", "status"] } ], // Advanced Drizzle ORM configuration config: (table) => [ // Custom index with specific options index("idx_orders_amount_status") .on(table.totalAmount, table.status) .where(sql`status != 'cancelled'`), // Partial index // Full-text search index (PostgreSQL specific) index("idx_orders_search") .using("gin", table.notes) ] }); ``` #### $repository() Creates a repository descriptor for database operations on a defined entity. This descriptor provides a comprehensive, type-safe interface for performing all database operations on entities defined with $entity. It offers a rich set of CRUD operations, advanced querying capabilities, pagination, transactions, and built-in support for audit trails and soft deletes. 
**Key Features** - **Complete CRUD Operations**: Create, read, update, delete with full type safety - **Advanced Querying**: Complex WHERE conditions, sorting, pagination, and aggregations - **Transaction Support**: Database transactions for consistency and atomicity - **Soft Delete Support**: Built-in soft delete functionality with `pg.deletedAt()` fields - **Optimistic Locking**: Version-based conflict resolution with `pg.version()` fields - **Audit Trail Integration**: Automatic handling of `createdAt`, `updatedAt` timestamps - **Raw SQL Support**: Execute custom SQL queries when needed - **Pagination**: Built-in pagination with metadata and navigation **Important Requirements** - Must be used with an entity created by $entity - Entity schema must include exactly one primary key field - Database tables must be created via migrations before use **Use Cases** Essential for all database-driven applications: - User management and authentication systems - E-commerce product and order management - Content management and blogging platforms - Financial and accounting applications - Any application requiring persistent data storage **Basic repository with CRUD operations:** ```ts import { $entity, $repository } from "alepha/postgres"; import { pg, t } from "alepha"; // First, define the entity const User = $entity({ name: "users", schema: t.object({ id: pg.primaryKey(t.uuid()), email: t.string({ format: "email" }), firstName: t.string(), lastName: t.string(), isActive: t.boolean({ default: true }), createdAt: pg.createdAt(), updatedAt: pg.updatedAt() }), indexes: [{ column: "email", unique: true }] }); class UserService { users = $repository({ table: User }); async createUser(userData: { email: string; firstName: string; lastName: string }) { return await this.users.create({ id: generateUUID(), email: userData.email, firstName: userData.firstName, lastName: userData.lastName, isActive: true }); } async getUserByEmail(email: string) { return await this.users.findOne({ email }); } async updateUser(id: string, updates: { firstName?: string; lastName?: string }) { return await this.users.updateById(id, updates); } async deactivateUser(id: string) { return await this.users.updateById(id, { isActive: false }); } } ``` **Advanced querying and filtering:** ```ts const Product = $entity({ name: "products", schema: t.object({ id: pg.primaryKey(t.uuid()), name: t.string(), price: t.number({ minimum: 0 }), categoryId: t.string({ format: "uuid" }), inStock: t.boolean(), tags: t.optional(t.array(t.string())), createdAt: pg.createdAt(), updatedAt: pg.updatedAt() }), indexes: ["categoryId", "inStock", "price"] }); class ProductService { products = $repository({ table: Product }); async searchProducts(filters: { categoryId?: string; minPrice?: number; maxPrice?: number; inStock?: boolean; searchTerm?: string; }, page: number = 0, size: number = 20) { const query = this.products.createQuery({ where: { and: [ filters.categoryId ? { categoryId: filters.categoryId } : {}, filters.inStock !== undefined ? { inStock: filters.inStock } : {}, filters.minPrice ? { price: { gte: filters.minPrice } } : {}, filters.maxPrice ? { price: { lte: filters.maxPrice } } : {}, filters.searchTerm ? 
{ name: { ilike: `%${filters.searchTerm}%` } } : {} ] }, orderBy: [{ column: "createdAt", direction: "desc" }] }); return await this.products.paginate({ page, size }, query, { count: true }); } async getTopSellingProducts(limit: number = 10) { // Custom SQL query for complex analytics return await this.products.query( (table, db) => db .select({ id: table.id, name: table.name, price: table.price, salesCount: sql`COALESCE(sales.count, 0)` }) .from(table) .leftJoin( sql`( SELECT product_id, COUNT(*) as count FROM order_items WHERE created_at > NOW() - INTERVAL '30 days' GROUP BY product_id ) sales`, sql`sales.product_id = ${table.id}` ) .orderBy(sql`sales.count DESC NULLS LAST`) .limit(limit) ); } } ``` **Transaction handling and data consistency:** ```ts class OrderService { orders = $repository({ table: Order }); orderItems = $repository({ table: OrderItem }); products = $repository({ table: Product }); async createOrderWithItems(orderData: { customerId: string; items: Array<{ productId: string; quantity: number; price: number }>; }) { return await this.orders.transaction(async (tx) => { // Create the order const order = await this.orders.create({ id: generateUUID(), customerId: orderData.customerId, status: 'pending', totalAmount: orderData.items.reduce((sum, item) => sum + (item.price * item.quantity), 0) }, { tx }); // Create order items and update product inventory for (const itemData of orderData.items) { await this.orderItems.create({ id: generateUUID(), orderId: order.id, productId: itemData.productId, quantity: itemData.quantity, unitPrice: itemData.price }, { tx }); // Update product inventory using optimistic locking const product = await this.products.findById(itemData.productId, { tx }); if (product.stockQuantity < itemData.quantity) { throw new Error(`Insufficient stock for product ${itemData.productId}`); } await this.products.save({ ...product, stockQuantity: product.stockQuantity - itemData.quantity }, { tx }); } return order; }); } } ``` **Soft delete and audit trail:** ```ts const Document = $entity({ name: "documents", schema: t.object({ id: pg.primaryKey(t.uuid()), title: t.string(), content: t.string(), authorId: t.string({ format: "uuid" }), version: pg.version(), createdAt: pg.createdAt(), updatedAt: pg.updatedAt(), deletedAt: pg.deletedAt() // Enables soft delete }) }); class DocumentService { documents = $repository({ table: Document }); async updateDocument(id: string, updates: { title?: string; content?: string }) { // This uses optimistic locking via the version field const document = await this.documents.findById(id); return await this.documents.save({ ...document, ...updates // updatedAt will be set automatically }); } async softDeleteDocument(id: string) { // Soft delete - sets deletedAt timestamp await this.documents.deleteById(id); } async permanentDeleteDocument(id: string) { // Hard delete - actually removes from database await this.documents.deleteById(id, { force: true }); } async getActiveDocuments() { // Automatically excludes soft-deleted records return await this.documents.find({ where: { authorId: { isNotNull: true } }, orderBy: [{ column: "updatedAt", direction: "desc" }] }); } async getAllDocumentsIncludingDeleted() { // Include soft-deleted records return await this.documents.find({}, { force: true }); } } ``` **Complex filtering and aggregation:** ```ts class AnalyticsService { users = $repository({ table: User }); orders = $repository({ table: Order }); async getUserStatistics(filters: { startDate?: string; endDate?: string; isActive?: 
boolean; }) { const whereConditions = []; if (filters.startDate) { whereConditions.push({ createdAt: { gte: filters.startDate } }); } if (filters.endDate) { whereConditions.push({ createdAt: { lte: filters.endDate } }); } if (filters.isActive !== undefined) { whereConditions.push({ isActive: filters.isActive }); } const totalUsers = await this.users.count({ and: whereConditions }); const activeUsers = await this.users.count({ and: [...whereConditions, { isActive: true }] }); // Complex aggregation query const recentActivity = await this.users.query( sql` SELECT DATE_TRUNC('day', created_at) as date, COUNT(*) as new_users, COUNT(*) FILTER (WHERE is_active = true) as active_users FROM users WHERE created_at >= NOW() - INTERVAL '30 days' GROUP BY DATE_TRUNC('day', created_at) ORDER BY date DESC ` ); return { totalUsers, activeUsers, inactiveUsers: totalUsers - activeUsers, recentActivity }; } } ``` #### $sequence() Creates a PostgreSQL sequence descriptor for generating unique numeric values. This descriptor provides a type-safe interface to PostgreSQL sequences, which are database objects that generate unique numeric identifiers. Sequences are commonly used for primary keys, order numbers, invoice numbers, and other cases where guaranteed unique, incrementing values are needed across concurrent operations. **Key Features** - **Thread-Safe**: PostgreSQL sequences are inherently thread-safe and handle concurrency - **Configurable Parameters**: Start value, increment, min/max bounds, and cycling behavior - **Automatic Creation**: Sequences are created automatically when first used - **Type Safety**: Full TypeScript support with numeric return types - **Performance**: Optimized for high-throughput ID generation - **Schema Support**: Works with PostgreSQL schemas for organization **Use Cases** Perfect for generating unique identifiers in concurrent environments: - Primary key generation (alternative to UUIDs) - Order numbers and invoice sequences - Ticket numbers and reference IDs - Version numbers and revision tracking - Batch numbers for processing workflows - Any scenario requiring guaranteed unique incrementing numbers **Basic sequence for order numbers:** ```ts import { $sequence } from "alepha/postgres"; class OrderService { orderNumbers = $sequence({ name: "order_numbers", start: 1000, // Start from order #1000 increment: 1 // Increment by 1 each time }); async createOrder(orderData: OrderData) { const orderNumber = await this.orderNumbers.next(); return await this.orders.create({ id: generateUUID(), orderNumber, ...orderData }); } async getCurrentOrderNumber() { // Get the last generated number without incrementing return await this.orderNumbers.current(); } } ``` **Invoice numbering with yearly reset:** ```ts class InvoiceService { // Separate sequence for each year getInvoiceSequence(year: number) { return $sequence({ name: `invoice_numbers_${year}`, start: 1, increment: 1 }); } async generateInvoiceNumber(): Promise { const year = new Date().getFullYear(); const sequence = this.getInvoiceSequence(year); const number = await sequence.next(); // Format as INV-2024-001, INV-2024-002, etc. 
return `INV-${year}-${number.toString().padStart(3, '0')}`; } } ``` **High-performance ID generation with custom increments:** ```ts class TicketService { // Generate ticket numbers in increments of 10 for better distribution ticketSequence = $sequence({ name: "ticket_numbers", start: 1000, increment: 10, min: 1000, max: 999999, cycle: false // Don't cycle when max is reached }); priorityTicketSequence = $sequence({ name: "priority_ticket_numbers", start: 1, increment: 1, min: 1, max: 999, cycle: true // Cycle when reaching max }); async generateTicketNumber(isPriority: boolean = false): Promise { if (isPriority) { return await this.priorityTicketSequence.next(); } return await this.ticketSequence.next(); } async getSequenceStatus() { return { currentTicketNumber: await this.ticketSequence.current(), currentPriorityNumber: await this.priorityTicketSequence.current() }; } } ``` **Batch processing with sequence-based coordination:** ```ts class BatchProcessor { batchSequence = $sequence({ name: "batch_numbers", start: 1, increment: 1 }); async processBatch(items: any[]) { const batchNumber = await this.batchSequence.next(); console.log(`Starting batch processing #${batchNumber} with ${items.length} items`); try { // Process items with batch number for tracking for (const item of items) { await this.processItem(item, batchNumber); } await this.auditLogger.log({ event: 'batch_completed', batchNumber, itemCount: items.length, timestamp: new Date() }); return { batchNumber, processedCount: items.length }; } catch (error) { await this.auditLogger.log({ event: 'batch_failed', batchNumber, error: error.message, timestamp: new Date() }); throw error; } } async processItem(item: any, batchNumber: number) { // Associate item processing with batch number await this.items.update(item.id, { ...item.updates, batchNumber, processedAt: new Date() }); } } ``` **Multi-tenant sequence management:** ```ts class TenantSequenceService { // Create tenant-specific sequences getTenantSequence(tenantId: string, sequenceType: string) { return $sequence({ name: `${tenantId}_${sequenceType}_seq`, start: 1, increment: 1 }); } async generateTenantOrderNumber(tenantId: string): Promise { const sequence = this.getTenantSequence(tenantId, 'orders'); const number = await sequence.next(); return `${tenantId.toUpperCase()}-ORD-${number.toString().padStart(6, '0')}`; } async generateTenantInvoiceNumber(tenantId: string): Promise { const sequence = this.getTenantSequence(tenantId, 'invoices'); const number = await sequence.next(); return `${tenantId.toUpperCase()}-INV-${number.toString().padStart(6, '0')}`; } async getTenantSequenceStatus(tenantId: string) { const orderSeq = this.getTenantSequence(tenantId, 'orders'); const invoiceSeq = this.getTenantSequence(tenantId, 'invoices'); return { tenant: tenantId, sequences: { orders: { current: await orderSeq.current(), next: await orderSeq.next() }, invoices: { current: await invoiceSeq.current() } } }; } } ``` **Important Notes**: - Sequences are created automatically when first used - PostgreSQL sequences are atomic and handle high concurrency - Sequence values are not rolled back in failed transactions - Consider the impact of max values and cycling behavior - Sequences are schema-scoped in PostgreSQL #### $transaction() Creates a transaction descriptor for database operations requiring atomicity and consistency. 
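No example accompanies `$transaction` here, so the sketch below is only a hedged illustration: the `handler`/`run` shape, the transaction handle passed as the first argument, and the `Account` entity are all assumptions; the `{ tx }` repository option is the part confirmed by the notes that follow.

```ts
import { $repository, $transaction } from "alepha/postgres";

class AccountService {
  accounts = $repository({ table: Account }); // Account is a hypothetical $entity

  // Assumed shape: a handler receiving the active transaction handle,
  // which is forwarded to every repository call via { tx }.
  transferFunds = $transaction({
    handler: async (tx, fromId: string, toId: string, amount: number) => {
      const from = await this.accounts.findById(fromId, { tx });
      const to = await this.accounts.findById(toId, { tx });
      if (from.balance < amount) {
        throw new Error("Insufficient funds"); // any unhandled error rolls the transaction back
      }
      await this.accounts.save({ ...from, balance: from.balance - amount }, { tx });
      await this.accounts.save({ ...to, balance: to.balance + amount }, { tx });
    }
  });

  async transfer(fromId: string, toId: string, amount: number) {
    await this.transferFunds.run(fromId, toId, amount); // assumed .run(), as with $lock
  }
}
```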
This descriptor provides a convenient way to wrap database operations in PostgreSQL transactions, ensuring ACID properties and automatic retry logic for version conflicts. It integrates seamlessly with the repository pattern and provides built-in handling for optimistic locking scenarios with automatic retry on version mismatches. **Important Notes**: - All operations within the transaction handler are atomic - Automatic retry on `PgVersionMismatchError` for optimistic locking - Pass `{ tx }` option to all repository operations within the transaction - Transactions are automatically rolled back on any unhandled error - Use appropriate isolation levels based on your consistency requirements --- # packages-alepha-queue-redis.md # Alepha Queue Redis Redis implementation for the message queueing system. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Plugin for Alepha Queue that provides Redis queue capabilities. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaQueueRedis } from "alepha/queue/redis"; const alepha = Alepha.create() .with(AlephaQueueRedis); run(alepha); ``` --- # packages-alepha-queue.md # Alepha Queue A simple, powerful interface for message queueing systems. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Provides asynchronous message queuing and processing capabilities through declarative queue descriptors. The queue module enables reliable background job processing and message passing using the `$queue` descriptor on class properties. It supports schema validation, automatic retries, and multiple queue backends for building scalable, decoupled applications with robust error handling. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaQueue } from "alepha/queue"; const alepha = Alepha.create() .with(AlephaQueue); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $consumer() Creates a consumer descriptor to process messages from a specific queue. This descriptor creates a dedicated message consumer that connects to a queue and processes its messages using a custom handler function. Consumers provide a clean way to separate message production from consumption, enabling scalable architectures where multiple consumers can process messages from the same queue. 
**Key Features** - **Queue Integration**: Seamlessly connects to any $queue descriptor - **Type Safety**: Full TypeScript support inherited from the connected queue's schema - **Dedicated Processing**: Isolated message processing logic separate from the queue - **Worker Management**: Automatic integration with the worker system for background processing - **Error Handling**: Built-in error handling and retry mechanisms from the queue system - **Scalability**: Multiple consumers can process the same queue for horizontal scaling **Use Cases** Perfect for creating specialized message processors: - Dedicated email sending services - Image processing workers - Data synchronization tasks - Event handlers for specific domains - Microservice message consumers - Background job processors **Basic consumer setup:** ```ts import { $queue, $consumer } from "alepha/queue"; import { t } from "alepha"; class EmailService { // Define the queue emailQueue = $queue({ name: "emails", schema: t.object({ to: t.string(), subject: t.string(), body: t.string(), template: t.optional(t.string()) }) }); // Create a dedicated consumer for this queue emailConsumer = $consumer({ queue: this.emailQueue, handler: async (message) => { const { to, subject, body, template } = message.payload; if (template) { await this.sendTemplatedEmail(to, template, { subject, body }); } else { await this.sendPlainEmail(to, subject, body); } console.log(`Email sent to ${to}: ${subject}`); } }); async sendWelcomeEmail(userEmail: string) { // Push to queue - consumer will automatically process it await this.emailQueue.push({ to: userEmail, subject: "Welcome!", body: "Thanks for joining our platform.", template: "welcome" }); } } ``` **Multiple specialized consumers for different message types:** ```ts class NotificationService { notificationQueue = $queue({ name: "notifications", schema: t.object({ type: t.enum(["email", "sms", "push"]), recipient: t.string(), message: t.string(), metadata: t.optional(t.record(t.string(), t.any())) }) }); // Email-specific consumer emailConsumer = $consumer({ queue: this.notificationQueue, handler: async (message) => { if (message.payload.type === "email") { await this.emailProvider.send({ to: message.payload.recipient, subject: message.payload.metadata?.subject || "Notification", body: message.payload.message }); } } }); // SMS-specific consumer smsConsumer = $consumer({ queue: this.notificationQueue, handler: async (message) => { if (message.payload.type === "sms") { await this.smsProvider.send({ to: message.payload.recipient, message: message.payload.message }); } } }); // Push notification consumer pushConsumer = $consumer({ queue: this.notificationQueue, handler: async (message) => { if (message.payload.type === "push") { await this.pushProvider.send({ deviceToken: message.payload.recipient, title: message.payload.metadata?.title || "Notification", body: message.payload.message }); } } }); } ``` **Consumer with advanced error handling and logging:** ```ts class OrderProcessor { orderQueue = $queue({ name: "order-processing", schema: t.object({ orderId: t.string(), customerId: t.string(), items: t.array(t.object({ productId: t.string(), quantity: t.number(), price: t.number() })) }) }); orderConsumer = $consumer({ queue: this.orderQueue, handler: async (message) => { const { orderId, customerId, items } = message.payload; try { // Log processing start this.logger.info(`Processing order ${orderId} for customer ${customerId}`); // Validate inventory await this.validateInventory(items); // Process payment 
const paymentResult = await this.processPayment(orderId, items); if (!paymentResult.success) { throw new Error(`Payment failed: ${paymentResult.error}`); } // Update inventory await this.updateInventory(items); // Create shipment await this.createShipment(orderId, customerId); // Send confirmation await this.sendOrderConfirmation(customerId, orderId); this.logger.info(`Order ${orderId} processed successfully`); } catch (error) { // Log detailed error information this.logger.error(`Failed to process order ${orderId}`, { error: error.message, orderId, customerId, itemCount: items.length }); // Re-throw to trigger queue retry mechanism throw error; } } }); } ``` **Consumer for batch processing with performance optimization:** ```ts class DataProcessor { dataQueue = $queue({ name: "data-processing", schema: t.object({ batchId: t.string(), records: t.array(t.object({ id: t.string(), data: t.record(t.string(), t.any()) })), processingOptions: t.object({ validateData: t.boolean(), generateReport: t.boolean(), notifyCompletion: t.boolean() }) }) }); dataConsumer = $consumer({ queue: this.dataQueue, handler: async (message) => { const { batchId, records, processingOptions } = message.payload; const startTime = Date.now(); this.logger.info(`Starting batch processing for ${batchId} with ${records.length} records`); try { // Process records in chunks for better performance const chunkSize = 100; const chunks = this.chunkArray(records, chunkSize); for (let i = 0; i < chunks.length; i++) { const chunk = chunks[i]; if (processingOptions.validateData) { await this.validateChunk(chunk); } await this.processChunk(chunk); // Log progress const progress = ((i + 1) / chunks.length) * 100; this.logger.debug(`Batch ${batchId} progress: ${progress.toFixed(1)}%`); } if (processingOptions.generateReport) { await this.generateProcessingReport(batchId, records.length); } if (processingOptions.notifyCompletion) { await this.notifyBatchCompletion(batchId); } const duration = Date.now() - startTime; this.logger.info(`Batch ${batchId} completed in ${duration}ms`); } catch (error) { const duration = Date.now() - startTime; this.logger.error(`Batch ${batchId} failed after ${duration}ms`, error); throw error; } } }); } ``` #### $queue() Creates a queue descriptor for asynchronous message processing with background workers. The $queue descriptor enables powerful asynchronous communication patterns in your application. It provides type-safe message queuing with automatic worker processing, making it perfect for decoupling components and handling background tasks efficiently. 
**Background Processing** - Automatic worker threads for non-blocking message processing - Built-in retry mechanisms and error handling - Dead letter queues for failed message handling - Graceful shutdown and worker lifecycle management **Type Safety** - Full TypeScript support with schema validation using TypeBox - Type-safe message payloads with automatic inference - Runtime validation of all queued messages - Compile-time errors for invalid message structures **Storage Flexibility** - Memory provider for development and testing - Redis provider for production scalability and persistence - Custom provider support for specialized backends - Automatic failover and connection pooling **Performance & Scalability** - Batch processing support for high-throughput scenarios - Horizontal scaling with distributed queue backends - Configurable concurrency and worker pools - Efficient serialization and message routing **Reliability** - Message persistence across application restarts - Automatic retry with exponential backoff - Dead letter handling for permanently failed messages - Comprehensive logging and monitoring integration ```typescript const emailQueue = $queue({ name: "email-notifications", schema: t.object({ to: t.string(), subject: t.string(), body: t.string(), priority: t.optional(t.enum(["high", "normal"])) }), handler: async (message) => { await emailService.send(message.payload); console.log(`Email sent to ${message.payload.to}`); } }); // Push messages for background processing await emailQueue.push({ to: "user@example.com", subject: "Welcome!", body: "Welcome to our platform", priority: "high" }); ``` ```typescript const imageQueue = $queue({ name: "image-processing", provider: RedisQueueProvider, schema: t.object({ imageId: t.string(), operations: t.array(t.enum(["resize", "compress", "thumbnail"])) }), handler: async (message) => { for (const op of message.payload.operations) { await processImage(message.payload.imageId, op); } } }); // Batch processing multiple images await imageQueue.push( { imageId: "img1", operations: ["resize", "thumbnail"] }, { imageId: "img2", operations: ["compress"] }, { imageId: "img3", operations: ["resize", "compress", "thumbnail"] } ); ``` ```typescript const taskQueue = $queue({ name: "dev-tasks", provider: "memory", schema: t.object({ taskType: t.enum(["cleanup", "backup", "report"]), data: t.record(t.string(), t.any()) }), handler: async (message) => { switch (message.payload.taskType) { case "cleanup": await performCleanup(message.payload.data); break; case "backup": await createBackup(message.payload.data); break; case "report": await generateReport(message.payload.data); break; } } }); ``` --- # packages-alepha-react-auth.md # Alepha React Auth Simplifies user authentication flows in React applications. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module The ReactAuthModule provides authentication services for React applications. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaReactAuth } from "alepha/react/auth"; const alepha = Alepha.create() .with(AlephaReactAuth); run(alepha); ``` --- # packages-alepha-react-form.md # Alepha React Form Manages form state and validation in React applications. 
## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module React hooks for managing forms in Alepha applications. This module provides a set of hooks to simplify form handling, validation, and submission in React applications built with Alepha. It includes: - `useForm`: A hook for managing form state, validation, and submission. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaReactForm } from "alepha/react/form"; const alepha = Alepha.create() .with(AlephaReactForm); run(alepha); ``` ## API Reference ### Hooks Hooks provide a way to tap into various lifecycle events and extend functionality. They follow the convention of starting with `use` and return configured hook instances. #### useForm() Custom hook to create a form with validation and field management. This hook uses TypeBox schemas to define the structure and validation rules for the form. It provides a way to handle form submission, field creation, and value management. ```tsx import { t } from "alepha"; const form = useForm({ schema: t.object({ username: t.string(), password: t.string(), }), handler: (values) => { console.log("Form submitted with values:", values); }, }); return (
<form> {/* render inputs bound to the form fields and a submit button */} </form> ); ``` --- # packages-alepha-react-head.md # Alepha React Head Manages the document `<head>` for SEO and metadata. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Fill `<head>` server & client side. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaReactHead } from "alepha/react/head"; const alepha = Alepha.create() .with(AlephaReactHead); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $head() Set global `<head>` options for the application. ### Hooks Hooks provide a way to tap into various lifecycle events and extend functionality. They follow the convention of starting with `use` and return configured hook instances. #### useHead() ```tsx const App = () => { const [head, setHead] = useHead({ // will set the document title on the first render title: "My App", }); return ( // This will update the document title when the button is clicked <button onClick={() => setHead({ title: "New Title" })}>Update title</button> ); } ``` --- # packages-alepha-react-i18n.md # Alepha React I18n Internationalization (i18n) support for React applications using Alepha. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Add i18n support to your Alepha React application. SSR and CSR compatible. It supports lazy loading of translations and provides a context to access the current language. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaReactI18n } from "alepha/react/i18n"; const alepha = Alepha.create() .with(AlephaReactI18n); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $dictionary() Register a dictionary entry for translations. It allows you to define a set of translations for a specific language. Entries can be lazy-loaded, which is useful for large dictionaries or when translations are not needed immediately. ```ts import { $dictionary } from "alepha/react/i18n"; const Example = () => { const { tr } = useI18n();
{tr("hello")}
; // } class App { en = $dictionary({ // { default: { hello: "Hey" } } lazy: () => import("./translations/en.ts"), }); home = $page({ path: "/", component: Example, }) } run(App); ``` ### Hooks Hooks provide a way to tap into various lifecycle events and extend functionality. They follow the convention of starting with `use` and return configured hook instances. #### useI18n() Hook to access the i18n service. --- # packages-alepha-react.md # Alepha React Build server-side rendered (SSR) or single-page React applications. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Provides full-stack React development with declarative routing, server-side rendering, and client-side hydration. The React module enables building modern React applications using the `$page` descriptor on class properties. It delivers seamless server-side rendering, automatic code splitting, and client-side navigation with full type safety and schema validation for route parameters and data. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaReact } from "alepha/react"; const alepha = Alepha.create() .with(AlephaReact); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $page() Main descriptor for defining a React route in the application. The $page descriptor is the core building block for creating type-safe, SSR-enabled React routes. It provides a declarative way to define pages with powerful features: **Routing & Navigation** - URL pattern matching with parameters (e.g., `/users/:id`) - Nested routing with parent-child relationships - Type-safe URL parameter and query string validation **Data Loading** - Server-side data fetching with the `resolve` function - Automatic serialization and hydration for SSR - Access to request context, URL params, and parent data **Component Loading** - Direct component rendering or lazy loading for code splitting - Client-only rendering when browser APIs are needed - Automatic fallback handling during hydration **Performance Optimization** - Static generation for pre-rendered pages at build time - Server-side caching with configurable TTL and providers - Code splitting through lazy component loading **Error Handling** - Custom error handlers with support for redirects - Hierarchical error handling (child β†’ parent) - HTTP status code handling (404, 401, etc.) 
**Page Animations** - CSS-based enter/exit animations - Dynamic animations based on page state - Custom timing and easing functions **Lifecycle Management** - Server response hooks for headers and status codes - Page leave handlers for cleanup (browser only) - Permission-based access control ```typescript const userProfile = $page({ path: "/users/:id", schema: { params: t.object({ id: t.int() }), query: t.object({ tab: t.optional(t.string()) }) }, resolve: async ({ params }) => { const user = await userApi.getUser(params.id); return { user }; }, lazy: () => import("./UserProfile.tsx") }); ``` ```typescript const projectSection = $page({ path: "/projects/:id", children: () => [projectBoard, projectSettings], resolve: async ({ params }) => { const project = await projectApi.get(params.id); return { project }; }, errorHandler: (error) => { if (HttpError.is(error, 404)) { return <NotFoundPage />; // render a 404 component (name illustrative) } } }); ``` ```typescript const blogPost = $page({ path: "/blog/:slug", static: { entries: posts.map(p => ({ params: { slug: p.slug } })) }, resolve: async ({ params }) => { const post = await loadPost(params.slug); return { post }; } }); ``` ### Hooks Hooks provide a way to tap into various lifecycle events and extend functionality. They follow the convention of starting with `use` and return configured hook instances. #### useAlepha() Main Alepha hook. It provides access to the Alepha instance within a React component. With Alepha, you can access the core functionalities of the framework: - alepha.state() for state management - alepha.inject() for dependency injection - alepha.events.emit() for event handling etc... #### useClient() Hook to get a virtual client for the specified scope. It's the React-hook version of `$client()`, from the `AlephaServerLinks` module. #### useInject() Hook to inject a service instance. It's a wrapper of `useAlepha().inject(service)` with memoization. #### useQueryParams() Not well tested. Use with caution. #### useRouter() Use this hook to access the React Router instance. You can add a type parameter to specify the type of your application. This will allow you to use the router in a typesafe way. ```ts class App { home = $page(); } const router = useRouter<App>(); router.go("home"); // typesafe ``` #### useRouterEvents() Subscribe to various router events. #### useStore() Hook to access and mutate the Alepha state. --- # packages-alepha-redis.md # Alepha Redis A Redis client for caching, pub/sub, and more. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Redis client provider for Alepha applications. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaRedis } from "alepha/redis"; const alepha = Alepha.create() .with(AlephaRedis); run(alepha); ``` ## API Reference ### Providers Providers are classes that encapsulate specific functionality and can be injected into your application. They handle initialization, configuration, and lifecycle management. For more details, see the [Providers documentation](/docs/providers). #### RedisProvider Redis client provider. --- # packages-alepha-retry.md # Alepha Retry Simple, declarative, and powerful automatic retry for failed operations. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application.
They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $retry() Creates a function that automatically retries a handler upon failure, with support for exponential backoff, max duration, and cancellation. --- # packages-alepha-scheduler.md # Alepha Scheduler Schedule recurring tasks using cron expressions or fixed intervals. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Generic interface for scheduling tasks. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaScheduler } from "alepha/scheduler"; const alepha = Alepha.create() .with(AlephaScheduler); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $scheduler() Scheduler descriptor. --- # packages-alepha-security.md # Alepha Security Manage realms, roles, permissions, and JWT-based authentication. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Provides comprehensive authentication and authorization capabilities with JWT tokens, role-based access control, and user management. The security module enables building secure applications using descriptors like `$realm`, `$role`, and `$permission` on class properties. It offers JWT-based authentication, fine-grained permissions, service accounts, and seamless integration with various authentication providers and user management systems. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaSecurity } from "alepha/security"; const alepha = Alepha.create() .with(AlephaSecurity); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $permission() Create a new permission. #### $realm() Create a new realm. #### $role() Create a new role. #### $serviceAccount() Allows you to get an access token for a service account. You have some options to configure the service account: - an OAuth2 URL using the client credentials grant type - a JWT secret shared between services ```ts import { $serviceAccount } from "alepha/security"; class MyService { serviceAccount = $serviceAccount({ oauth2: { url: "https://example.com/oauth2/token", clientId: "your-client-id", clientSecret: "your-client-secret", } }); async fetchData() { const token = await this.serviceAccount.token(); // or const response = await this.serviceAccount.fetch("https://api.example.com/data"); } } ``` ### Providers Providers are classes that encapsulate specific functionality and can be injected into your application. They handle initialization, configuration, and lifecycle management. For more details, see the [Providers documentation](/docs/providers). #### JwtProvider Provides utilities for working with JSON Web Tokens (JWT).
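To tie the security pieces above together, here is a minimal, illustrative sketch of a class declaring these descriptors and injecting the JWT provider. The option shapes passed to `$realm`, `$role`, and `$permission` below (a plain `name` option) are assumptions for illustration only and are not specified by this reference; check the exported types for the exact signatures.

```ts
import { $inject } from "alepha";
import { $permission, $realm, $role, JwtProvider } from "alepha/security";

class AppSecurity {
	// assumption: these descriptors accept a "name" option; the real option shape may differ
	realm = $realm({ name: "app" });
	adminRole = $role({ name: "admin" });
	manageUsers = $permission({ name: "users:manage" });

	// JwtProvider can be injected like any other provider; its methods are not listed in this reference
	jwt = $inject(JwtProvider);
}
```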
--- # packages-alepha-server-cache.md # Alepha Server Cache Adds ETag and Cache-Control headers to server responses. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Plugin for Alepha Server that provides server-side caching capabilities. It uses the Alepha Cache module to cache responses from server actions ($action). It also provides an ETag-based cache invalidation mechanism. ```ts import { Alepha, run } from "alepha"; import { $action } from "alepha/server"; import { AlephaServerCache } from "alepha/server/cache"; class ApiServer { hello = $action({ cache: true, handler: () => "Hello, World!", }); } const alepha = Alepha.create() .with(AlephaServerCache) .with(ApiServer); run(alepha); ``` This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaServerCache } from "alepha/server/cache"; const alepha = Alepha.create() .with(AlephaServerCache); run(alepha); ``` --- # packages-alepha-server-compress.md # Alepha Server Compress Gzip and Brotli compression for server responses. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` --- # packages-alepha-server-cookies.md # Alepha Server Cookies Type-safe HTTP cookie parsing and serialization for servers. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Provides HTTP cookie management capabilities for server requests and responses with type-safe cookie descriptors. The server-cookies module enables declarative cookie handling using the `$cookie` descriptor on class properties. It offers automatic cookie parsing, secure cookie configuration, and seamless integration with server routes for managing user sessions, preferences, and authentication tokens. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaServerCookies } from "alepha/server/cookies"; const alepha = Alepha.create() .with(AlephaServerCookies); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $cookie() Declares a type-safe, configurable HTTP cookie. This descriptor provides methods to get, set, and delete the cookie within the server request/response cycle. --- # packages-alepha-server-cors.md # Alepha Server Cors Configurable Cross-Origin Resource Sharing (CORS) support for servers. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` --- # packages-alepha-server-health.md # Alepha Server Health Adds a /health endpoint for monitoring application status. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Plugin for Alepha Server that provides health-check endpoints.
This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaServerHealth } from "alepha/server/health"; const alepha = Alepha.create() .with(AlephaServerHealth); run(alepha); ``` ## API Reference ### Providers Providers are classes that encapsulate specific functionality and can be injected into your application. They handle initialization, configuration, and lifecycle management. For more details, see the [Providers documentation](/docs/providers). #### ServerHealthProvider Register `/health` endpoint. - Provides basic health information about the server. --- # packages-alepha-server-helmet.md # Alepha Server Helmet Essential, configurable security headers for all server responses. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Automatically adds important HTTP security headers to every response to help protect your application from common web vulnerabilities. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaServerHelmet } from "alepha/server/helmet"; const alepha = Alepha.create() .with(AlephaServerHelmet); run(alepha); ``` ## API Reference ### Providers Providers are classes that encapsulate specific functionality and can be injected into your application. They handle initialization, configuration, and lifecycle management. For more details, see the [Providers documentation](/docs/providers). #### ServerHelmetProvider Provides a configurable way to apply essential HTTP security headers to every server response, without external dependencies. --- # packages-alepha-server-links.md # Alepha Server Links Enables type-safe communication between different services. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Provides server-side link management and remote capabilities for client-server interactions. The server-links module enables declarative link definitions using `$remote` and `$client` descriptors, facilitating seamless API endpoint management and client-server communication. It integrates with server security features to ensure safe and controlled access to resources. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaServerLinks } from "alepha/server/links"; const alepha = Alepha.create() .with(AlephaServerLinks); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $client() Create a new client. #### $remote() $remote is a descriptor that allows you to define remote service access. Use it only when you have 2 or more services that need to communicate with each other. All remote services can be exposed as actions, ... or not. You can add a service account if you want to use a security layer. ### Providers Providers are classes that encapsulate specific functionality and can be injected into your application. They handle initialization, configuration, and lifecycle management. For more details, see the [Providers documentation](/docs/providers). #### LinkProvider Browser, SSR friendly, service to handle links. 
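As a rough sketch of how one service might consume another service's actions through a link client: the generic parameter and the call signature shown below are assumptions for illustration and are not confirmed by this reference; the imported `UserController` is a hypothetical controller from another service.

```ts
import { $client } from "alepha/server/links";
// hypothetical controller exposed by another service (illustrative)
import type { UserController } from "../users/UserController.ts";

class BillingService {
	// assumption: $client takes the remote controller type as a generic parameter
	users = $client<UserController>();

	async createInvoice(userId: string) {
		// assumption: actions are invoked through the virtual client with their declared input shape
		const user = await this.users.getUser({ params: { id: userId } });
		return { for: user.id };
	}
}
```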
--- # packages-alepha-server-metrics.md # Alepha Server Metrics Exposes application metrics in Prometheus format at /metrics. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module This module provides Prometheus metrics for the Alepha server. Metrics are exposed at the `/metrics` endpoint. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaServerMetrics } from "alepha/server/metrics"; const alepha = Alepha.create() .with(AlephaServerMetrics); run(alepha); ``` --- # packages-alepha-server-multipart.md # Alepha Server Multipart Handles multipart/form-data requests for file uploads. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module This module provides support for handling multipart/form-data requests. It allows parsing body data containing `t.file()`. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaServerMultipart } from "alepha/server/multipart"; const alepha = Alepha.create() .with(AlephaServerMultipart); run(alepha); ``` --- # packages-alepha-server-proxy.md # Alepha Server Proxy Reverse-proxies incoming requests to other backend services. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Plugin for Alepha that provides proxy server functionality. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaServerProxy } from "alepha/server/proxy"; const alepha = Alepha.create() .with(AlephaServerProxy); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $proxy() Creates a proxy descriptor to forward requests to another server. This descriptor enables you to create reverse proxy functionality, allowing your Alepha server to forward requests to other services while maintaining a unified API surface. It's particularly useful for microservice architectures, API gateways, or when you need to aggregate multiple services behind a single endpoint. **Key Features** - **Path-based routing**: Match specific paths or patterns to proxy - **Dynamic targets**: Support both static and dynamic target resolution - **Request/Response hooks**: Modify requests before forwarding and responses after receiving - **URL rewriting**: Transform URLs before forwarding to the target - **Conditional proxying**: Enable/disable proxies based on environment or conditions **Basic proxy setup:** ```ts import { $proxy } from "alepha/server/proxy"; class ApiGateway { // Forward all /api/* requests to external service api = $proxy({ path: "/api/*", target: "https://api.example.com" }); } ``` **Dynamic target with environment-based routing:** ```ts class ApiGateway { // Route to different environments based on configuration api = $proxy({ path: "/api/*", target: () => process.env.NODE_ENV === "production" ?
"https://api.prod.example.com" : "https://api.dev.example.com" }); } ``` **Advanced proxy with request/response modification:** ```ts class SecureProxy { secure = $proxy({ path: "/secure/*", target: "https://secure-api.example.com", beforeRequest: async (request, proxyRequest) => { // Add authentication headers proxyRequest.headers = { ...proxyRequest.headers, 'Authorization': `Bearer ${await getServiceToken()}`, 'X-Forwarded-For': request.headers['x-forwarded-for'] || request.ip }; }, afterResponse: async (request, proxyResponse) => { // Log response for monitoring console.log(`Proxied ${request.url} -> ${proxyResponse.status}`); }, rewrite: (url) => { // Remove /secure prefix when forwarding url.pathname = url.pathname.replace('/secure', ''); } }); } ``` **Conditional proxy based on feature flags:** ```ts class FeatureProxy { newApi = $proxy({ path: "/v2/*", target: "https://new-api.example.com", disabled: !process.env.ENABLE_V2_API // Disable if feature flag is off }); } ``` --- # packages-alepha-server-rate-limit.md # Alepha Server Rate Limit Blocks requests that exceed a defined rate limit. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Provides rate limiting capabilities for server actions with configurable limits and windows. The server-rate-limit module enables per-action rate limiting using the `rateLimit` option in action descriptors. It offers sliding window rate limiting, custom key generation, and seamless integration with server routes. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaServerRateLimit } from "alepha/server/rate-limit"; const alepha = Alepha.create() .with(AlephaServerRateLimit); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $rateLimit() Declares rate limiting for server actions or custom usage. This descriptor provides methods to check rate limits and configure behavior within the server request/response cycle. --- # packages-alepha-server-security.md # Alepha Server Security Add security layer to the Alepha server. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Plugin for Alepha Server that provides security features. Based on the Alepha Security module. By default, all $action will be guarded by a permission check. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaServerSecurity } from "alepha/server/security"; const alepha = Alepha.create() .with(AlephaServerSecurity); run(alepha); ``` --- # packages-alepha-server-static.md # Alepha Server Static Serves static files like HTML, CSS, and JavaScript. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Create static file server with `$static()`. 
This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaServerStatic } from "alepha/server/static"; const alepha = Alepha.create() .with(AlephaServerStatic); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $serve() Create a new static file handler. --- # packages-alepha-server-swagger.md # Alepha Server Swagger Generates OpenAPI documentation and a Swagger UI for APIs. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Plugin for Alepha Server that provides Swagger documentation capabilities. It generates OpenAPI v3 documentation for the server's endpoints ($action). It also provides a Swagger UI for interactive API documentation. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaServerSwagger } from "alepha/server/swagger"; const alepha = Alepha.create() .with(AlephaServerSwagger); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $swagger() Create a new OpenAPI. --- # packages-alepha-server.md # Alepha Server Core HTTP server for creating REST APIs. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Provides high-performance HTTP server capabilities with declarative routing and action descriptors. The server module enables building REST APIs and web applications using `$route` and `$action` descriptors on class properties. It provides automatic request/response handling, schema validation, middleware support, and seamless integration with other Alepha modules for a complete backend solution. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaServer } from "alepha/server"; const alepha = Alepha.create() .with(AlephaServer); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $action() Creates a server action descriptor for defining type-safe HTTP endpoints. Server actions are the core building blocks for REST APIs in the Alepha framework. They provide a declarative way to define HTTP endpoints with full TypeScript type safety, automatic schema validation, and integrated security features. Actions automatically handle routing, request parsing, response serialization, and OpenAPI documentation generation. 
**Key Features** - **Type Safety**: Full TypeScript inference for request/response types - **Schema Validation**: Automatic validation using TypeBox schemas - **Auto-routing**: Convention-based URL generation with customizable paths - **Multiple Invocation**: Call directly (`run()`) or via HTTP (`fetch()`) - **OpenAPI Integration**: Automatic documentation generation - **Security Integration**: Built-in authentication and authorization support - **Content Type Detection**: Automatic handling of JSON, form-data, and plain text **URL Generation** By default, actions are prefixed with `/api` (configurable via `SERVER_API_PREFIX`): - Property name becomes the endpoint path - Path parameters are automatically detected from schema - HTTP method defaults to GET, or POST if body schema is provided **Use Cases** Perfect for building robust REST APIs: - CRUD operations with full type safety - File upload and download endpoints - Real-time data processing APIs - Integration with external services - Microservice communication - Admin and management interfaces **Basic CRUD operations:** ```ts import { $action } from "alepha/server"; import { t } from "alepha"; class UserController { getUsers = $action({ path: "/users", description: "Retrieve all users with pagination", schema: { query: t.object({ page: t.optional(t.number({ default: 1 })), limit: t.optional(t.number({ default: 10, maximum: 100 })), search: t.optional(t.string()) }), response: t.object({ users: t.array(t.object({ id: t.string(), name: t.string(), email: t.string(), createdAt: t.datetime() })), total: t.number(), hasMore: t.boolean() }) }, handler: async ({ query }) => { const { page, limit, search } = query; const users = await this.userService.findUsers({ page, limit, search }); return { users: users.items, total: users.total, hasMore: (page * limit) < users.total }; } }); createUser = $action({ method: "POST", path: "/users", description: "Create a new user account", schema: { body: t.object({ name: t.string({ minLength: 2, maxLength: 100 }), email: t.string({ format: "email" }), password: t.string({ minLength: 8 }), role: t.optional(t.enum(["user", "admin"])) }), response: t.object({ id: t.string(), name: t.string(), email: t.string(), role: t.string(), createdAt: t.datetime() }) }, handler: async ({ body }) => { // Password validation and hashing await this.authService.validatePassword(body.password); const hashedPassword = await this.authService.hashPassword(body.password); // Create user with default role const user = await this.userService.create({ ...body, password: hashedPassword, role: body.role || "user" }); // Return user without password const { password, ...publicUser } = user; return publicUser; } }); getUser = $action({ path: "/users/:id", description: "Retrieve user by ID", schema: { params: t.object({ id: t.string() }), response: t.object({ id: t.string(), name: t.string(), email: t.string(), role: t.string(), profile: t.optional(t.object({ bio: t.string(), avatar: t.string({ format: "uri" }), location: t.string() })) }) }, handler: async ({ params }) => { const user = await this.userService.findById(params.id); if (!user) { throw new Error(`User not found: ${params.id}`); } return user; } }); updateUser = $action({ method: "PUT", path: "/users/:id", description: "Update user information", schema: { params: t.object({ id: t.string() }), body: t.object({ name: t.optional(t.string({ minLength: 2 })), email: t.optional(t.string({ format: "email" })), profile: t.optional(t.object({ bio: t.optional(t.string()), avatar: 
t.optional(t.string({ format: "uri" })), location: t.optional(t.string()) })) }), response: t.object({ id: t.string(), name: t.string(), email: t.string(), updatedAt: t.datetime() }) }, handler: async ({ params, body }) => { const updatedUser = await this.userService.update(params.id, body); return updatedUser; } }); } ``` **Important Notes**: - Actions are automatically registered with the HTTP server when the service is initialized - Use `run()` for direct invocation (testing, internal calls, or remote services) - Use `fetch()` for explicit HTTP requests (client-side, external services) - Schema validation occurs automatically for all requests and responses - Path parameters are automatically extracted from schema definitions - Content-Type headers are automatically set based on schema types #### $route() Create a basic endpoint. It's a low level descriptor. You probably want to use `$action` instead. ### Providers Providers are classes that encapsulate specific functionality and can be injected into your application. They handle initialization, configuration, and lifecycle management. For more details, see the [Providers documentation](/docs/providers). #### ServerNotReadyProvider On every request, this provider checks if the server is ready. If the server is not ready, it responds with a 503 status code and a message indicating that the server is not ready yet. The response also includes a `Retry-After` header indicating that the client should retry after 5 seconds. #### ServerRouterProvider Main router for all routes on the server side. - $route => generic route - $action => action route (for API calls) - $page => React route (for SSR) --- # packages-alepha-testing.md # Alepha Testing Provides testing utilities, including data factories with Faker.js. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` --- # packages-alepha-thread.md # Alepha Thread Run worker threads in Node.js with a simple API. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Simple interface for managing worker threads in Alepha. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaThread } from "alepha/thread"; const alepha = Alepha.create() .with(AlephaThread); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $thread() Creates a worker thread descriptor for offloading CPU-intensive tasks to separate threads. This descriptor enables you to run JavaScript code in Node.js worker threads, allowing you to leverage multiple CPU cores and avoid blocking the main event loop. It provides a pool-based approach with intelligent thread reuse and automatic lifecycle management. 
**Key Features** - **Thread Pool Management**: Automatically manages a pool of worker threads with configurable limits - **Thread Reuse**: Reuses existing threads to avoid expensive initialization overhead - **Idle Cleanup**: Automatically terminates unused threads after a configurable timeout - **Type-Safe Communication**: Optional TypeBox schema validation for data passed to threads - **CPU-Aware Defaults**: Pool size defaults to CPU count × 2 for optimal performance - **Error Handling**: Proper error propagation and thread cleanup on failures **Use Cases** Perfect for CPU-intensive tasks that would otherwise block the main thread: - Image/video processing - Data transformation and analysis - Cryptographic operations - Heavy computations and algorithms - Background data processing **Basic thread usage:** ```ts import { $thread } from "alepha/thread"; class DataProcessor { heavyComputation = $thread({ name: "compute", handler: async () => { // This runs in a separate worker thread let result = 0; for (let i = 0; i < 1000000; i++) { result += Math.sqrt(i); } return { result, timestamp: Date.now() }; } }); async processData() { // Execute in worker thread without blocking main thread const result = await this.heavyComputation.execute(); console.log(`Computation result: ${result.result}`); } } ``` **Configured thread pool with custom settings:** ```ts class ImageProcessor { imageProcessor = $thread({ name: "image-processing", maxPoolSize: 4, // Limit to 4 concurrent threads idleTimeout: 30000, // Clean up idle threads after 30 seconds handler: async () => { // CPU-intensive image processing logic return await processImageData(); } }); } ``` **Thread with data validation:** ```ts import { t } from "alepha"; class CryptoService { encrypt = $thread({ name: "encryption", handler: async () => { // Perform encryption operations return await encryptData(); } }); async encryptSensitiveData(data: { text: string; key: string }) { // Validate input data before sending to thread const schema = t.object({ text: t.string(), key: t.string() }); return await this.encrypt.execute(data, schema); } } ``` **Parallel processing with multiple threads:** ```ts class BatchProcessor { processor = $thread({ name: "batch-worker", maxPoolSize: 8, // Allow up to 8 concurrent workers handler: async () => { return await processBatchItem(); } }); async processBatch(items: any[]) { // Process multiple items in parallel across different threads const promises = items.map(() => this.processor.execute()); const results = await Promise.all(promises); return results; } } ``` --- # packages-alepha-topic-redis.md # Alepha Topic Redis Redis implementation for the pub/sub messaging interface. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Plugin for Alepha Topic that provides Redis pub/sub capabilities. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaTopicRedis } from "alepha/topic/redis"; const alepha = Alepha.create() .with(AlephaTopicRedis); run(alepha); ``` --- # packages-alepha-topic.md # Alepha Topic A publish-subscribe (pub/sub) messaging interface for eventing. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Generic interface for pub/sub messaging. Gives you the ability to create topics and subscribers.
This module provides only a memory implementation of the topic provider. This module can be imported and used as follows: ```typescript import { Alepha, run } from "alepha"; import { AlephaTopic } from "alepha/topic"; const alepha = Alepha.create() .with(AlephaTopic); run(alepha); ``` ## API Reference ### Descriptors Descriptors are functions that define and configure various aspects of your application. They follow the convention of starting with ` $ ` and return configured descriptor instances. For more details, see the [Descriptors documentation](/docs/descriptors). #### $subscriber() Creates a subscriber descriptor to listen for messages from a specific topic. This descriptor creates a dedicated message subscriber that connects to a topic and processes its messages using a custom handler function. Subscribers provide a clean way to separate message publishing from consumption, enabling scalable pub/sub architectures where multiple subscribers can react to the same events independently. **Key Features** - **Topic Integration**: Seamlessly connects to any $topic descriptor - **Type Safety**: Full TypeScript support inherited from the connected topic's schema - **Dedicated Processing**: Isolated message processing logic separate from the topic - **Real-time Processing**: Immediate message delivery when events are published - **Error Isolation**: Subscriber errors don't affect other subscribers or the topic - **Scalability**: Multiple subscribers can listen to the same topic independently **Use Cases** Perfect for creating specialized event handlers: - Notification services for user events - Analytics and logging systems - Data synchronization between services - Real-time UI updates - Event-driven workflow triggers - Audit and compliance logging **Basic subscriber setup:** ```ts import { $topic, $subscriber } from "alepha/topic"; import { t } from "alepha"; class UserActivityService { // Define the topic userEvents = $topic({ name: "user-activity", schema: { payload: t.object({ userId: t.string(), action: t.enum(["login", "logout", "purchase"]), timestamp: t.number(), metadata: t.optional(t.record(t.string(), t.any())) }) } }); // Create a dedicated subscriber for this topic activityLogger = $subscriber({ topic: this.userEvents, handler: async (message) => { const { userId, action, timestamp } = message.payload; await this.auditLogger.log({ event: 'user_activity', userId, action, timestamp, source: 'user-activity-topic' }); this.log.info(`User ${userId} performed ${action} at ${new Date(timestamp).toISOString()}`); } }); async trackUserLogin(userId: string, metadata: Record<string, any>) { // Publish to topic - subscriber will automatically process it await this.userEvents.publish({ userId, action: "login", timestamp: Date.now(), metadata }); } } ``` **Multiple specialized subscribers for different concerns:** ```ts class OrderEventHandlers { orderEvents = $topic({ name: "order-events", schema: { payload: t.object({ orderId: t.string(), customerId: t.string(), status: t.union([ t.literal("created"), t.literal("paid"), t.literal("shipped"), t.literal("delivered") ]), data: t.optional(t.record(t.string(), t.any())) }) } }); // Analytics subscriber analyticsSubscriber = $subscriber({ topic: this.orderEvents, handler: async (message) => { await this.analytics.track('order_status_changed', { orderId: message.payload.orderId, customerId: message.payload.customerId, status: message.payload.status, timestamp: Date.now() }); } }); // Email notification subscriber emailSubscriber = $subscriber({ topic:
this.orderEvents, handler: async (message) => { const { customerId, orderId, status } = message.payload; const templates = { 'paid': 'order-confirmation', 'shipped': 'order-shipped', 'delivered': 'order-delivered' }; const template = templates[status]; if (template) { await this.emailService.send({ customerId, template, data: { orderId, status } }); } } }); // Inventory management subscriber inventorySubscriber = $subscriber({ topic: this.orderEvents, handler: async (message) => { if (message.payload.status === 'paid') { await this.inventoryService.reserveItems(message.payload.orderId); } else if (message.payload.status === 'delivered') { await this.inventoryService.confirmDelivery(message.payload.orderId); } } }); } ``` **Subscriber with advanced error handling and filtering:** ```ts class NotificationSubscriber { systemEvents = $topic({ name: "system-events", schema: { payload: t.object({ eventType: t.string(), severity: t.enum(["info", "warning", "error"]), serviceId: t.string(), message: t.string(), data: t.optional(t.record(t.string(), t.any())) }) } }); alertSubscriber = $subscriber({ topic: this.systemEvents, handler: async (message) => { const { eventType, severity, serviceId, message: eventMessage, data } = message.payload; try { // Only process error events for alerting if (severity !== 'error') { return; } // Log the event this.logger.error(`System alert from ${serviceId}`, { eventType, message: eventMessage, data }); // Send alerts based on service criticality const criticalServices = ['payment', 'auth', 'database']; const isCritical = criticalServices.includes(serviceId); if (isCritical) { // Immediate alert for critical services await this.alertService.sendImmediate({ title: `Critical Error in ${serviceId}`, message: eventMessage, severity: 'critical', metadata: { eventType, serviceId, data } }); } else { // Queue non-critical alerts for batching await this.alertService.queueAlert({ title: `Error in ${serviceId}`, message: eventMessage, severity: 'error', metadata: { eventType, serviceId, data } }); } // Update service health status await this.healthMonitor.recordError(serviceId, eventType); } catch (error) { // Log subscriber errors but don't re-throw // This prevents one failing subscriber from affecting others this.log.error(`Alert subscriber failed`, { originalEvent: { eventType, serviceId, severity }, subscriberError: error.message }); } } }); } ``` **Subscriber for real-time data aggregation:** ```ts class MetricsAggregator { userActivityTopic = $topic({ name: "user-metrics", schema: { payload: t.object({ userId: t.string(), sessionId: t.string(), eventType: t.string(), timestamp: t.number(), duration: t.optional(t.number()), metadata: t.optional(t.record(t.string(), t.any())) }) } }); metricsSubscriber = $subscriber({ topic: this.userActivityTopic, handler: async (message) => { const { userId, sessionId, eventType, timestamp, duration, metadata } = message.payload; // Update real-time metrics await Promise.all([ // Update user activity counters this.metricsStore.increment(`user:${userId}:events:${eventType}`, 1), this.metricsStore.increment(`global:events:${eventType}`, 1), // Track session activity this.sessionStore.updateActivity(sessionId, timestamp), // Record duration metrics if provided duration ? 
this.metricsStore.recordDuration(`events:${eventType}:duration`, duration) : Promise.resolve(), // Update time-based aggregations this.timeSeriesStore.addPoint({ metric: `user_activity.${eventType}`, timestamp, value: 1, tags: { userId, sessionId } }) ]); // Trigger real-time dashboard updates await this.dashboardService.updateRealTimeStats({ eventType, userId, timestamp }); this.logger.debug(`Processed metrics for ${eventType}`, { userId, eventType, timestamp }); } }); } ``` #### $topic() Creates a topic descriptor for pub/sub messaging and event-driven architecture. This descriptor provides a powerful publish/subscribe system that enables decoupled communication between different parts of your application. Topics allow multiple publishers to send messages and multiple subscribers to receive them, creating flexible event-driven architectures with support for real-time messaging and asynchronous event processing. **Key Features** - **Publish/Subscribe Pattern**: Decoupled communication between publishers and subscribers - **Multiple Subscribers**: One-to-many message distribution with automatic fan-out - **Type-Safe Messages**: Full TypeScript support with schema validation using TypeBox - **Real-time Processing**: Immediate message delivery to active subscribers - **Event Filtering**: Subscribe to specific message types using filter functions - **Timeout Support**: Wait for specific messages with configurable timeouts - **Multiple Backends**: Support for in-memory, Redis, and custom topic providers - **Error Resilience**: Built-in error handling and message processing recovery **Use Cases** Perfect for event-driven architectures and real-time communication: - User activity notifications - Real-time chat and messaging systems - System event broadcasting - Microservice communication - Live data updates and synchronization - Application state change notifications - Webhook and external API event handling **Basic topic with publish/subscribe:** ```ts import { $topic } from "alepha/topic"; import { t } from "alepha"; class NotificationService { userActivity = $topic({ name: "user-activity", schema: { payload: t.object({ userId: t.string(), action: t.enum(["login", "logout", "purchase"]), timestamp: t.number(), metadata: t.optional(t.record(t.string(), t.any())) }) }, handler: async (message) => { // This subscriber runs automatically for all messages console.log(`User ${message.payload.userId} performed ${message.payload.action}`); } }); async trackUserLogin(userId: string) { // Publish event - all subscribers will receive it await this.userActivity.publish({ userId, action: "login", timestamp: Date.now(), metadata: { source: "web", ip: "192.168.1.1" } }); } async setupAdditionalSubscriber() { // Add another subscriber dynamically await this.userActivity.subscribe(async (message) => { if (message.payload.action === "purchase") { await this.sendPurchaseConfirmation(message.payload.userId); } }); } } ``` **Real-time chat system with multiple subscribers:** ```ts class ChatService { messagesTopic = $topic({ name: "chat-messages", description: "Real-time chat messages for all rooms", schema: { payload: t.object({ messageId: t.string(), roomId: t.string(), userId: t.string(), content: t.string(), timestamp: t.number(), messageType: t.enum(["text", "image", "file"]) }) } }); async sendMessage(roomId: string, userId: string, content: string) { await this.messagesTopic.publish({ messageId: generateId(), roomId, userId, content, timestamp: Date.now(), messageType: "text" }); } // Different services can 
subscribe to the same topic async setupMessageLogging() { await this.messagesTopic.subscribe(async (message) => { // Log all messages for compliance await this.auditLogger.log({ action: "message_sent", roomId: message.payload.roomId, userId: message.payload.userId, timestamp: message.payload.timestamp }); }); } async setupNotificationService() { await this.messagesTopic.subscribe(async (message) => { // Send push notifications to offline users const offlineUsers = await this.getOfflineUsersInRoom(message.payload.roomId); await this.sendPushNotifications(offlineUsers, { title: `New message in ${message.payload.roomId}`, body: message.payload.content }); }); } } ``` **Event filtering and waiting for specific messages:** ```ts class OrderService { orderEvents = $topic({ name: "order-events", schema: { payload: t.object({ orderId: t.string(), status: t.union([ t.literal("created"), t.literal("paid"), t.literal("shipped"), t.literal("delivered"), t.literal("cancelled") ]), timestamp: t.number(), data: t.optional(t.record(t.string(), t.any())) }) } }); async processOrder(orderId: string) { // Publish order created event await this.orderEvents.publish({ orderId, status: "created", timestamp: Date.now() }); // Wait for payment confirmation with timeout try { const paymentEvent = await this.orderEvents.wait({ timeout: [5, "minutes"], filter: (message) => message.payload.orderId === orderId && message.payload.status === "paid" }); console.log(`Order ${orderId} was paid at ${paymentEvent.payload.timestamp}`); // Continue with shipping... await this.initiateShipping(orderId); } catch (error) { if (error instanceof TopicTimeoutError) { console.log(`Payment timeout for order ${orderId}`); await this.cancelOrder(orderId); } } } async setupOrderTracking() { // Subscribe only to shipping events await this.orderEvents.subscribe(async (message) => { if (message.payload.status === "shipped") { await this.updateTrackingInfo(message.payload.orderId, message.payload.data); await this.notifyCustomer(message.payload.orderId, "Your order has shipped!"); } }); } } ``` **Redis-backed topic for distributed systems:** ```ts class DistributedEventSystem { systemEvents = $topic({ name: "system-events", provider: RedisTopicProvider, // Use Redis for cross-service communication schema: { payload: t.object({ eventType: t.string(), serviceId: t.string(), data: t.record(t.string(), t.any()), timestamp: t.number(), correlationId: t.optional(t.string()) }) }, handler: async (message) => { // Central event handler for all system events await this.processSystemEvent(message.payload); } }); async publishServiceHealth(serviceId: string, healthy: boolean) { await this.systemEvents.publish({ eventType: "service.health", serviceId, data: { healthy, checkedAt: new Date().toISOString() }, timestamp: Date.now() }); } async setupHealthMonitoring() { await this.systemEvents.subscribe(async (message) => { if (message.payload.eventType === "service.health") { await this.updateServiceStatus( message.payload.serviceId, message.payload.data.healthy ); if (!message.payload.data.healthy) { await this.alertOnCall(`Service ${message.payload.serviceId} is down`); } } }); } } ``` --- # packages-alepha-vite.md # Alepha Vite Vite plugin for building Alepha applications. ## Installation This package is part of the Alepha framework and can be installed via the all-in-one package: ```bash npm install alepha ``` ## Module Plugin vite for Alepha framework. 
This module provides Vite plugins and configurations to integrate Alepha applications with Vite's build and development processes. ```ts import { defineConfig } from "vite"; import { viteAlepha } from "alepha/vite"; export default defineConfig({ plugins: [viteAlepha()], // other Vite configurations... }); ``` ---