## Architecture

### Stack
| Layer | Technology | Rationale |
|---|---|---|
| API | FastAPI + uvicorn | Async-native, WebSocket streaming for real-time digest generation |
| CLI | Click 8.0+ | Interactive setup, cron-triggered generation, local management |
| Database | SQLite | Single-file, zero-config, portable — fits the local-first model perfectly |
| Frontend | React 18 + TS + Tailwind | Component model for section-level interactivity — highlights, feedback, annotations |
| Bundler | Vite 6 | Fast HMR, clean dev experience |
| LLM | Anthropic / OpenAI / Ollama | Pluggable provider abstraction with unified streaming interface |
| Auth | Google OAuth | Gmail metadata ingestion — email subjects and threads, never bodies |
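The pluggable provider abstraction from the table can be sketched as an abstract base class with a streaming `generate` method. The class names and toy `EchoProvider` below are illustrative assumptions, not the actual reprise API; the real providers would wrap the Anthropic, OpenAI, and Ollama SDKs behind this shape.

```python
from abc import ABC, abstractmethod
from typing import Callable

class LLMProvider(ABC):
    """Hypothetical sketch of the unified provider interface."""

    @abstractmethod
    def generate(self, prompt: str, on_chunk: Callable[[str], None]) -> str:
        """Stream a completion, invoking on_chunk for each chunk as it arrives."""

class EchoProvider(LLMProvider):
    """Toy provider: streams the prompt back word by word."""

    def generate(self, prompt: str, on_chunk: Callable[[str], None]) -> str:
        parts = []
        for word in prompt.split():
            chunk = word + " "
            on_chunk(chunk)      # push each chunk to the caller immediately
            parts.append(chunk)
        return "".join(parts)    # full text is also returned at the end

chunks: list[str] = []
full = EchoProvider().generate("hello streaming world", chunks.append)
print(full.strip())  # → hello streaming world
```

Because every provider exposes the same `on_chunk` hook, callers (CLI, WebSocket endpoint) never branch on which backend is configured.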
## Data Model
```
-- Core tables (SQLite, XDG-compliant paths)
digests           id, type, date, title
                  sections (JSON array)
                  provider, model, prompt_hash
                  tokens_used, duration_ms
feedback          digest_id → FK, section_id
                  reaction (6 types), comment
learning_profile  category (depth|style|pace|accuracy)
                  key (domain name), value (1–5)
                  UNIQUE(category, key)
annotations       digest_id, section_id
                  highlighted_text, comment
xp_events         event_type, xp_amount
                  metadata (JSON)
badges            badge_id (UNIQUE), earned_at
```
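The `prompt_hash` column is what makes deduplication cheap: hash the rendered prompt, and skip generation when a digest with the same hash already exists. A minimal sketch, with a dict standing in for the `digests` table and illustrative function names:

```python
import hashlib

def prompt_hash(prompt: str) -> str:
    """SHA-256 of the rendered prompt — identical prompts hash identically."""
    return hashlib.sha256(prompt.encode("utf-8")).hexdigest()

# Hypothetical dedup check; `cache` stands in for a lookup on digests.prompt_hash.
cache: dict[str, str] = {}

def generate_once(prompt: str) -> str:
    h = prompt_hash(prompt)
    if h not in cache:
        cache[h] = f"digest for: {prompt}"  # the expensive LLM call goes here
    return cache[h]

generate_once("daily digest 2025-01-01")
generate_once("daily digest 2025-01-01")  # same hash — no second generation
print(len(cache))  # → 1
```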
prompt_hash (SHA-256) prevents redundant generations. Config in ~/.config/reprise/, data in ~/.local/share/reprise/.

## Generation Pipeline
on_chunk interface → structured JSON

## Key Decisions
### Local-First, Not Local-Only
All user data in SQLite on the local machine. Config in XDG-compliant paths. Only outbound call: chosen LLM provider. Fully offline with Ollama.
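The XDG-compliant layout above can be resolved with plain environment-variable fallbacks. A minimal sketch; the helper names are illustrative, not the actual reprise internals:

```python
import os
from pathlib import Path

def config_dir() -> Path:
    """XDG config location, defaulting to ~/.config/reprise/."""
    base = os.environ.get("XDG_CONFIG_HOME", os.path.expanduser("~/.config"))
    return Path(base) / "reprise"

def data_dir() -> Path:
    """XDG data location, defaulting to ~/.local/share/reprise/."""
    base = os.environ.get("XDG_DATA_HOME", os.path.expanduser("~/.local/share"))
    return Path(base) / "reprise"

print(data_dir().name)  # → reprise
```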
### Streaming Everywhere
All providers implement unified on_chunk callback. FastAPI exposes via WebSocket. Frontend renders generation in real-time.
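The callback-to-WebSocket bridge can be sketched with a stub `send` coroutine standing in for the real WebSocket's `send_json`. The frame shapes (`chunk` / `done`) are illustrative assumptions, not the actual wire protocol:

```python
import asyncio
import json

sent: list[str] = []

async def send_json(payload: dict) -> None:
    """Stub for ws.send_json — records frames instead of sending them."""
    sent.append(json.dumps(payload))

async def stream_digest(chunks, send) -> None:
    for text in chunks:
        await send({"type": "chunk", "text": text})  # one frame per on_chunk call
    await send({"type": "done"})                     # frontend stops rendering

asyncio.run(stream_digest(["## Today", " in Python"], send_json))
print(len(sent))  # → 3
```

In the FastAPI endpoint the same loop runs inside a `@app.websocket` handler, so the frontend paints each section as the provider emits it.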
### Profile as Feedback Accumulator
No training step. The profile is a live projection of all feedback signals — reactions, highlights, comments. Every interaction adjusts depth, styles, and pace.
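"Live projection" can be sketched as a fold over reactions: each signal nudges a `(category, key)` value on the 1–5 scale from the schema. The reaction names, weights, and neutral midpoint below are illustrative assumptions, not the shipped tuning:

```python
# Hypothetical signal weights — positive pushes the value up, negative down.
WEIGHTS = {"too_shallow": +0.5, "too_deep": -0.5, "loved_it": +0.2}

def apply_feedback(profile: dict, category: str, key: str, reaction: str) -> None:
    value = profile.get((category, key), 3.0)             # 3 = neutral midpoint
    value += WEIGHTS.get(reaction, 0.0)
    profile[(category, key)] = min(5.0, max(1.0, value))  # clamp to the 1–5 scale

profile: dict = {}
apply_feedback(profile, "depth", "python", "too_shallow")
apply_feedback(profile, "depth", "python", "too_shallow")
print(profile[("depth", "python")])  # → 4.0
```

No model is retrained; the next generation simply reads the current values when building its prompt.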
### Structured JSON Output
Digests return typed section arrays — each with type, content, domain, and depth_level. Enables section-specific UI interactions and targeted feedback.
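The section shape can be sketched as a typed record. Field names follow the text (`type`, `content`, `domain`, `depth_level`); the class and sample payload are illustrative:

```python
from dataclasses import dataclass
import json

@dataclass
class Section:
    type: str          # e.g. "summary", "deep_dive" — hypothetical values
    content: str
    domain: str        # e.g. "python"
    depth_level: int   # 1–5, matching the learning_profile scale

raw = json.loads(
    '[{"type": "summary", "content": "…", "domain": "python", "depth_level": 2}]'
)
sections = [Section(**s) for s in raw]
print(sections[0].domain)  # → python
```

Because each section carries its own `domain` and `depth_level`, a reaction on one section can be attributed to exactly one profile key.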
### Browser History Privacy
Direct read from local SQLite files — no extensions needed. Built-in blocklist filters NSFW, ad networks, and trackers at ingestion time.
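The ingestion path can be sketched against a Chromium-style `urls` table. The helper name and blocklist entries are illustrative, and the demo builds a temp stand-in rather than touching a real browser profile (in practice the locked live file is copied before reading):

```python
import os
import sqlite3
import tempfile

BLOCKLIST = {"ads.example", "tracker.example"}  # illustrative blocklist entries

def ingest_history(db_path: str) -> list[str]:
    """Read URLs from a Chromium-style History DB, dropping blocked hosts."""
    con = sqlite3.connect(db_path)
    rows = con.execute("SELECT url FROM urls").fetchall()
    con.close()
    return [url for (url,) in rows
            if not any(host in url for host in BLOCKLIST)]

# Demo against a temp database with the same table shape.
path = os.path.join(tempfile.mkdtemp(), "History")
con = sqlite3.connect(path)
con.execute("CREATE TABLE urls (url TEXT)")
con.executemany("INSERT INTO urls VALUES (?)",
                [("https://docs.python.org/3/",), ("https://ads.example/x",)])
con.commit()
con.close()
print(ingest_history(path))  # → ['https://docs.python.org/3/']
```

Filtering at ingestion means blocked URLs never reach the database, not merely the UI.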
### Cognitive Colour Themes
Five palettes backed by peer-reviewed research — ART, melatonin suppression, colour-cognition studies. Auto-switch via CSS custom properties, zero re-renders.
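Time-based palette selection could look like the following sketch. The five palette names and hour boundaries are illustrative assumptions, not the shipped themes; the chosen name would then map to a set of CSS custom properties swapped on the document root:

```python
# Hypothetical hour → palette schedule: each entry is (exclusive end hour, name).
SCHEDULE = [(5, "night"), (9, "dawn"), (17, "day"), (20, "dusk"), (24, "evening")]

def palette_for(hour: int) -> str:
    """Pick the palette whose window contains the given local hour (0–23)."""
    for end, name in SCHEDULE:
        if hour < end:
            return name
    return SCHEDULE[0][1]  # defensive fallback; unreachable for 0–23

print(palette_for(14))  # → day
```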
## Circadian Theme Engine

## Feedback System

## Gamification
25 × L × (L-1) XP per level (25 × 2 × 1 = 50 XP at level 2, 25 × 3 × 2 = 150 at level 3).

## Privacy Model
SQLite in ~/.local/share/reprise/. Single file. Your machine only. Config separate in ~/.config/reprise/.
Only outbound call: structured prompt to chosen LLM provider. No telemetry, no analytics, no phone-home. Fully offline with Ollama.
Direct local SQLite read — Chrome, Firefox, Safari, Edge auto-detected. NSFW/ad/tracker domains filtered at ingestion. No extensions needed.
## Quick Start
```bash
# Install
pip install reprise[all]

# Interactive setup — choose provider, configure sources
reprise init

# Generate today's digest
reprise generate daily

# Launch web UI
reprise serve        # → localhost:3141

# Enable daily automation
reprise schedule on  # Timezone-aware cron
```