Cstannahill/LocalChat

LocalChat AI Orchestration Backend

LocalChat is a .NET 9 backend for local-first AI orchestration with:

  • multi-provider LLM routing (Ollama, OpenRouter, HuggingFace, LlamaCpp)
  • conversation memory and session-state lifecycle tooling
  • retrieval-enhanced prompt composition
  • streaming chat responses over Server-Sent Events (SSE)
  • text-to-speech (Kokoro or Qwen provider integration)
  • contextual image prompting and image generation job orchestration (ComfyUI)
  • import/export, inspection, and maintenance/admin endpoints

This repository contains the API plus application/domain/infrastructure layers and automated tests.

Tech Stack

  • .NET 9.0
  • ASP.NET Core Minimal APIs
  • Entity Framework Core + SQLite
  • Swagger/OpenAPI (enabled in Development)
  • xUnit test suite (domain, application, infrastructure)

Solution Layout

  • src/LocalChat.Api - web host, minimal API endpoints, middleware, static assets
  • src/LocalChat.Application - orchestration and use-case logic
  • src/LocalChat.Domain - entities, value objects, enums
  • src/LocalChat.Infrastructure - persistence, provider adapters, background workers
  • src/LocalChat.Contracts - request/response contracts
  • tests/* - test projects

Quick Start

  1. Install .NET 9 SDK.
  2. Configure src/LocalChat.Api/appsettings.json (or environment variables).
  3. Start dependencies you intend to use (for example Ollama, TTS server, ComfyUI).
  4. Run the API:
dotnet run --project src/LocalChat.Api/LocalChat.Api.csproj

Default dev URL is http://localhost:5170 (see launchSettings.json).
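Step 2 above mentions environment variables: ASP.NET Core maps double-underscore (`__`) in environment variable names to the `:` separator used in appsettings.json. A minimal sketch (the connection string and Ollama key names here mirror the configuration sections documented below, but the exact values are illustrative):

```shell
# ASP.NET Core reads ConnectionStrings__DefaultConnection as the nested
# config key ConnectionStrings:DefaultConnection.
export ConnectionStrings__DefaultConnection="Data Source=App_Data/localchat.db"

# Provider settings can be overridden the same way (property name assumed;
# verify against docs/CONFIGURATION.md).
export Ollama__BaseUrl="http://localhost:11434"

# Then run the API as usual:
# dotnet run --project src/LocalChat.Api/LocalChat.Api.csproj
```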

Detailed setup and provider key placement: QUICKSTART.md

Health Check

GET /health

Swagger (Development Only)

  • http://localhost:5170/swagger

Configuration

Configuration is loaded from ASP.NET configuration sources (appsettings*.json, environment variables, etc.).

High-impact sections:

  • ConnectionStrings:DefaultConnection
  • Ollama, OpenRouter, HuggingFace, LlamaCpp
  • Speech, KokoroTts, QwenTts
  • ComfyUi
  • Summaries, Retrieval, MemoryProposals, SessionStateCleanup
  • ConversationBackgroundWork, BackgroundMemoryProposals
  • Inspection, RequestFlowLogging

Detailed guidance: docs/CONFIGURATION.md
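As a rough sketch, an appsettings.json using the section names above might look like this. The top-level section names come from the list above; the nested property names (`BaseUrl`, `Enabled`) are assumptions for illustration only, so consult docs/CONFIGURATION.md for the real schema:

```json
{
  "ConnectionStrings": {
    "DefaultConnection": "Data Source=App_Data/localchat.db"
  },
  "Ollama": {
    "BaseUrl": "http://localhost:11434"
  },
  "RequestFlowLogging": {
    "Enabled": false
  }
}
```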

Professional Use Cases

The platform architecture is adaptable to multiple production AI scenarios:

  • Customer service AI systems (multi-provider LLM routing with memory)
  • Educational tutoring systems (retrieval-enhanced prompt composition)
  • Healthcare information assistants (with additional compliance, privacy, and access controls)
  • Enterprise knowledge assistants (RAG over internal memory/lore/knowledge stores)
  • Productivity copilots and agents (stateful orchestration plus provider routing)
  • Technical support assistants (domain-specific retrieval over product/code documentation)
  • Multilingual business support experiences (provider/model flexibility for global workloads)

Implementation Highlights

This project includes production-relevant backend engineering for agent systems:

  • clean layered architecture in .NET (Api, Application, Domain, Infrastructure, Contracts)
  • provider-agnostic model routing with per-turn overrides
  • stateful conversation orchestration with memory and retrieval pipelines
  • background work execution, maintenance endpoints, and operational runbook support
  • streaming token delivery over SSE for real-time UX
  • test-backed behavior across domain/application/infrastructure layers

Taken together, these capabilities show a backend designed for practical delivery beyond prototypes: system design, implementation discipline, documentation, and operational thinking.

API Surface

Primary route groups for AI orchestration:

  • /api/chat - streaming send, continue, regenerate, and suggested-user-message generation
  • /api/conversations - conversation CRUD and message mutation
  • /api/agents, /api/user-profiles
  • /api/model-profiles, /api/generation-presets, /api/app-defaults
  • /api/memory, /api/knowledge-bases, /api/inspection
  • /api/tts, /api/images
  • /api/import-export
  • /api/admin, /api/admin/maintenance, /api/admin/background-work

Endpoint catalog and examples: docs/API_OVERVIEW.md
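Since /api/chat streams tokens over SSE, a client has to split the response body into `data:` events per the SSE framing rules (fields are `name: value` lines; a blank line terminates each event). A minimal parser sketch, making no assumptions about the payload schema LocalChat actually emits:

```python
def parse_sse(stream_text: str):
    """Split a raw Server-Sent Events body into (event, data) pairs."""
    event, data_lines = "message", []
    for line in stream_text.splitlines():
        if not line:  # blank line dispatches the accumulated event
            if data_lines:
                yield event, "\n".join(data_lines)
            event, data_lines = "message", []
        elif line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())

# Hypothetical stream: two token events followed by a terminal event.
sample = "data: Hel\n\ndata: lo\n\nevent: done\ndata: [DONE]\n\n"
events = list(parse_sse(sample))
```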

Data and Generated Assets

  • SQLite database: src/LocalChat.Api/App_Data/localchat.db (default)
  • Request flow telemetry: src/LocalChat.Api/App_Data/Logs/request-flow.ndjson (when enabled)
  • Generated speech files: src/LocalChat.Api/wwwroot/generated/audio/...
  • Generated images: src/LocalChat.Api/wwwroot/generated/images/...
  • Agent uploads: src/LocalChat.Api/wwwroot/uploads/agents/...
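The request-flow telemetry file is NDJSON (one JSON object per line), so it can be consumed line by line without loading the whole log. A reader sketch; the fields inside each record are whatever RequestFlowLogging emits, and no schema is assumed here:

```python
import json

def read_ndjson(path):
    """Yield one parsed JSON object per non-empty line (NDJSON framing)."""
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line:  # skip blank lines between records
                yield json.loads(line)
```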

On startup the app:

  1. ensures migration history compatibility for legacy databases
  2. applies pending EF Core migrations
  3. seeds default model profile, generation preset, and agent records

Development Commands

Build:

dotnet build LocalChat.sln

Test:

dotnet test LocalChat.sln

Security and Release Notes

  • This backend currently exposes admin and maintenance endpoints without built-in auth.
  • Treat deployments as trusted-network only until authn/authz is introduced.
  • Never commit real API keys or tokens.
