15 - Developer & Integration Guide
Technical Documentation for Developers
Time Estimate: 30 minutes
What You'll Learn: Architecture, database schema, API integration, building from source
Table of Contents
- Architecture Overview
- Technology Stack
- Database Schema
- Component Architecture
- Building from Source
- Development Setup
- Custom Ollama Models
- API Integration Patterns
- Contributing Guidelines
Architecture Overview
High-Level Architecture
```text
┌─────────────────────────────────────┐
│          Frontend (React)           │
│  - UI Components                    │
│  - State Management                 │
│  - Visualization Rendering          │
└──────────────────┬──────────────────┘
                   │ Tauri IPC
┌──────────────────┴──────────────────┐
│           Backend (Rust)            │
│  - Database Operations              │
│  - File System Access               │
│  - API Integrations                 │
│  - Audio Processing                 │
└──────────────────┬──────────────────┘
         ┌─────────┴─────────┐
┌────────┴────────┐ ┌────────┴────────┐
│ SQLite Database │ │   File System   │
│ - Projects      │ │ - Audio Files   │
│ - Transcripts   │ │ - Models        │
└─────────────────┘ └─────────────────┘
```
Design Principles
1. Local-First:
- All data stored locally (SQLite + file system)
- No cloud dependencies (except AI providers)
- Offline capable with local models
2. Privacy-Focused:
- No telemetry or tracking
- Encrypted API key storage
- User controls all data flow
3. Modular Architecture:
- Clear separation: Frontend ↔ Backend
- Pluggable AI providers
- Extensible visualization system
4. Performance:
- Async processing (Rust Tokio)
- Debounced search (300ms; see the sketch after this list)
- Efficient database queries (indexed)
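For example, the global search input should not hit SQLite on every keystroke; the query is held back until typing pauses. A minimal sketch of the 300ms debounce on the React side (the hook name and wiring are illustrative, not the exact Selfoss implementation):

```ts
import { useEffect, useState } from "react";

// Returns `value` only after it has stopped changing for `delayMs`.
export function useDebouncedValue<T>(value: T, delayMs = 300): T {
  const [debounced, setDebounced] = useState(value);

  useEffect(() => {
    const timer = setTimeout(() => setDebounced(value), delayMs);
    return () => clearTimeout(timer); // restart the timer on every change
  }, [value, delayMs]);

  return debounced;
}

// Usage: const debouncedQuery = useDebouncedValue(query, 300);
// Fire the search request only when debouncedQuery changes.
```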
Technology Stack
Frontend
Core:
- React 18 - UI framework
- TypeScript - Type safety
- Vite - Build tool & dev server
- Tailwind CSS - Utility-first styling
Key Libraries:
```json
{
  "react": "^18.2.0",
  "typescript": "^5.0.0",
  "vite": "^4.3.0",
  "tailwindcss": "^3.3.0",
  "lucide-react": "^0.263.1",
  "@tauri-apps/api": "^1.4.0",
  "openai": "^6.3.0",
  "@google/generative-ai": "^0.24.1"
}
```
Visualization:
- Mermaid.js - Flowcharts
- D3.js - Mind maps
- TanStack Table - Action matrices
Backend
Core:
- Rust - Systems programming
- Tauri 2.0 - Desktop framework
- Tokio - Async runtime
- SQLx - Database access
Key Crates:
```toml
[dependencies]
tauri = "2.0"
tokio = { version = "1.28", features = ["full"] }
sqlx = { version = "0.7", features = ["sqlite", "runtime-tokio"] }
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
reqwest = { version = "0.11", features = ["json"] }
```
AI Integration:
- reqwest - HTTP client for API calls
- Custom service layers for each provider
Database Schema
SQLite Structure
Core Tables:
1. PROJECTS
```sql
CREATE TABLE projects (
  id INTEGER PRIMARY KEY AUTOINCREMENT,
  name TEXT NOT NULL,
  description TEXT,
  color_tag TEXT,
  created_at TEXT NOT NULL,
  last_accessed TEXT NOT NULL
);

CREATE INDEX idx_projects_name ON projects(name);
CREATE INDEX idx_projects_created ON projects(created_at DESC);
```
2. TRANSCRIPTS
```sql
CREATE TABLE transcripts (
  id INTEGER PRIMARY KEY AUTOINCREMENT,
  project_id INTEGER NOT NULL,
  filename TEXT NOT NULL,
  original_text TEXT NOT NULL,
  processed_json TEXT,
  processing_status TEXT DEFAULT 'pending',
  file_size INTEGER,
  upload_date TEXT NOT NULL,

  -- F011 Audio Recording Fields
  audio_file_path TEXT,
  recording_duration INTEGER,
  transcription_method TEXT DEFAULT 'file_upload',
  transcription_provider TEXT,
  transcription_model TEXT,
  audio_format TEXT,
  recording_type TEXT,
  transcription_status TEXT DEFAULT 'pending',

  -- F015 Temporal Data
  temporal_data_json TEXT,

  -- Processing Metadata
  tokens_used INTEGER,
  processing_cost REAL,
  model_used TEXT,
  provider TEXT,

  FOREIGN KEY (project_id) REFERENCES projects(id) ON DELETE CASCADE
);

CREATE INDEX idx_transcripts_project ON transcripts(project_id);
CREATE INDEX idx_transcripts_upload ON transcripts(upload_date DESC);
CREATE INDEX idx_transcripts_audio ON transcripts(audio_file_path);
```
3. LLM_SETTINGS
```sql
CREATE TABLE llm_settings (
  id INTEGER PRIMARY KEY AUTOINCREMENT,

  -- Transcription Settings
  transcription_provider TEXT DEFAULT 'ollama',
  transcription_model TEXT DEFAULT 'whisper:base',
  transcription_api_key TEXT,
  transcription_api_url TEXT DEFAULT 'http://localhost:11434',

  -- Analysis Settings
  analysis_provider TEXT DEFAULT 'ollama',
  analysis_model TEXT DEFAULT 'llama3.1:latest',
  analysis_api_key TEXT,
  analysis_api_url TEXT,

  -- Automation
  auto_transcribe BOOLEAN DEFAULT 0,
  auto_analyze BOOLEAN DEFAULT 0,

  -- Custom Models
  custom_transcription_models TEXT, -- JSON array
  custom_analysis_models TEXT,      -- JSON array

  updated_at TEXT NOT NULL
);

-- Single settings record
INSERT INTO llm_settings (id, updated_at) VALUES (1, datetime('now'));
```
4. LICENSE_SETTINGS
```sql
CREATE TABLE license_settings (
  id INTEGER PRIMARY KEY AUTOINCREMENT,
  license_key TEXT UNIQUE,
  activation_status TEXT DEFAULT 'inactive',
  instance_id TEXT,
  plan_name TEXT,
  activation_date TEXT,
  activation_response_json TEXT,
  updated_at TEXT NOT NULL
);
```
5. VISUALIZATIONS (Planned)
```sql
CREATE TABLE visualizations (
  id INTEGER PRIMARY KEY AUTOINCREMENT,
  transcript_id INTEGER NOT NULL,
  visualization_type TEXT NOT NULL,
  svg_data TEXT,
  mermaid_code TEXT,
  export_format TEXT,
  created_at TEXT NOT NULL,

  FOREIGN KEY (transcript_id) REFERENCES transcripts(id) ON DELETE CASCADE
);
```
Processed JSON Structure
Format stored in transcripts.processed_json:
```ts
interface ProcessedTranscript {
  meeting_metadata: {
    title: string;
    date: string;                   // ISO 8601
    participants: string[];
    meeting_type: "standup" | "planning" | "decision" | "brainstorm" | "other";
    transcript_type: "single_speaker" | "multi_speaker";
  };

  decisions: Array<{
    id: string;                     // "D1", "D2", etc.
    text: string;
    timestamp?: string;
    flows_to: string[];             // Action IDs
  }>;

  actions: Array<{
    id: string;                     // "A1", "A2", etc.
    task: string;
    owner: string;
    deadline: string | null;        // ISO 8601 date
    priority: "high" | "medium" | "low";
    source_decision: string;        // Decision ID
  }>;

  concepts: Array<{
    id: string;                     // "C1", "C2", etc.
    topic: string;
    importance: number;             // 1-5
    sub_topics: string[];
    related_concepts: string[];     // Concept IDs
  }>;

  processing_metadata: {
    model_used: string;
    provider: "openai" | "gemini" | "ollama";
    timestamp: string;              // ISO 8601
    tokens_used?: number;
    cost_estimate?: number;
  };
}
```
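For reference, a hypothetical instance that satisfies this interface (the meeting content is invented purely for illustration):

```ts
const example: ProcessedTranscript = {
  meeting_metadata: {
    title: "Sprint Planning",
    date: "2024-06-03T10:00:00Z",
    participants: ["Alice", "Bob"],
    meeting_type: "planning",
    transcript_type: "multi_speaker",
  },
  decisions: [
    { id: "D1", text: "Ship the export feature in v1.2", flows_to: ["A1"] },
  ],
  actions: [
    {
      id: "A1",
      task: "Implement ZIP export",
      owner: "Alice",
      deadline: "2024-06-10",
      priority: "high",
      source_decision: "D1",
    },
  ],
  concepts: [
    {
      id: "C1",
      topic: "Data export",
      importance: 4,
      sub_topics: ["ZIP packaging"],
      related_concepts: [],
    },
  ],
  processing_metadata: {
    model_used: "llama3.1:latest",
    provider: "ollama",
    timestamp: "2024-06-03T10:45:00Z",
  },
};
```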
Component Architecture
Frontend Structure
```text
src/
├── components/
│   ├── Dashboard/                 # Main app layout
│   ├── Header/                    # Global header with search
│   ├── Sidebar/                   # Project navigation (F001)
│   ├── AudioRecorder/             # Recording UI (F011)
│   ├── TranscriptUpload/          # File upload (F002)
│   ├── LLMProcessing/             # AI processing modal
│   ├── TranscriptVisualization/   # Results display
│   ├── Settings/                  # Settings page (F007)
│   ├── GlobalSearch/              # Search dropdown (F020)
│   ├── EditableText/              # Inline editing (F004)
│   ├── SentimentArc/              # F015 visualization
│   └── ParticipationHeatmap/      # F017 visualization
├── contexts/
│   ├── ProjectContext.tsx         # Global project state
│   └── SettingsContext.tsx        # App settings
├── utils/
│   ├── llmService.ts              # LLM integration
│   ├── transcriptApi.ts           # Backend API calls
│   └── searchApi.ts               # Search functions
├── types/
│   ├── transcript.ts              # TypeScript interfaces
│   ├── project.ts
│   └── settings.ts
└── App.tsx                        # Root component
```
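The utils/ layer is where the frontend crosses the IPC boundary: files such as transcriptApi.ts wrap the Rust commands listed under Key Commands (Tauri IPC) below. A minimal sketch of such a wrapper, assuming the get_transcripts command and the Transcript type from types/transcript.ts (import paths are illustrative and differ between Tauri 1 and 2):

```ts
import { invoke } from "@tauri-apps/api/tauri"; // "@tauri-apps/api/core" on Tauri 2
import type { Transcript } from "../types/transcript";

// Thin wrapper around the Rust `get_transcripts` IPC command.
// By default Tauri exposes snake_case Rust arguments as camelCase keys in JS.
export async function getTranscripts(projectId: number): Promise<Transcript[]> {
  return invoke<Transcript[]>("get_transcripts", { projectId });
}
```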
Backend Structure
```text
src-tauri/src/
├── main.rs               # Tauri app entry
├── commands.rs           # Tauri IPC commands
├── database.rs           # Database operations
├── audio_commands.rs     # Audio recording (F011)
├── ollama_service.rs     # Ollama integration
├── license_service.rs    # License management (F014)
└── lib.rs                # Module exports
```
Key Commands (Tauri IPC)
```rust
// Project Management
#[tauri::command]
async fn create_project(pool: State<'_, SqlitePool>, name: String, description: String, color: String) -> Result<i64, String>

#[tauri::command]
async fn get_projects(pool: State<'_, SqlitePool>) -> Result<Vec<Project>, String>

// Transcript Management
#[tauri::command]
async fn upload_transcript(pool: State<'_, SqlitePool>, project_id: i64, filename: String, content: String) -> Result<i64, String>

#[tauri::command]
async fn get_transcripts(pool: State<'_, SqlitePool>, project_id: i64) -> Result<Vec<Transcript>, String>

// Audio Recording (F011)
#[tauri::command]
async fn save_audio_recording(pool: State<'_, SqlitePool>, project_id: i64, audio_data: Vec<u8>) -> Result<i64, String>

#[tauri::command]
async fn transcribe_audio_ollama(pool: State<'_, SqlitePool>, transcript_id: i64) -> Result<String, String>

// Search (F020)
#[tauri::command]
async fn search_transcripts(pool: State<'_, SqlitePool>, query: String) -> Result<Vec<SearchResult>, String>

// Data Export (F013)
#[tauri::command]
async fn export_app_data(pool: State<'_, SqlitePool>) -> Result<String, String>

#[tauri::command]
async fn import_app_data(pool: State<'_, SqlitePool>, zip_content: Vec<u8>, overwrite: bool) -> Result<ImportResult, String>
```
Building from Source
Prerequisites
Required:
```sh
# Node.js 18+
node --version   # v18.0.0+

# Rust 1.70+
rustc --version  # 1.70.0+

# Cargo (comes with Rust)
cargo --version

# Platform-specific:
# Windows: Visual Studio Build Tools
# macOS: Xcode Command Line Tools
# Linux: build-essential, libwebkit2gtk-4.0-dev
```
Clone Repository
```sh
git clone https://github.com/shobankr/selfoss.git
cd selfoss
```
Install Dependencies
Frontend:
```sh
npm install
```
Backend (Rust):
```sh
# Tauri CLI
cargo install tauri-cli

# Dependencies auto-installed via Cargo.toml
```
Development Build
```sh
# Start dev server (hot reload)
npm run tauri dev

# Runs:
# 1. Vite dev server (frontend)
# 2. Tauri app (backend)
# 3. Opens desktop app window
```
Production Build
```sh
# Build optimized app
npm run tauri build

# Outputs:
# Windows: src-tauri/target/release/selfoss.exe
# macOS:   src-tauri/target/release/bundle/macos/Selfoss.app
# Linux:   src-tauri/target/release/selfoss
```
Platform-Specific Instructions
Windows:
```sh
# Install Visual Studio Build Tools
# https://visualstudio.microsoft.com/downloads/

# Or via winget
winget install Microsoft.VisualStudio.2022.BuildTools

# Then build
npm run tauri build
```
macOS:
```sh
# Install Xcode Command Line Tools
xcode-select --install

# Then build
npm run tauri build

# Code signing (for distribution)
codesign -s "Developer ID" target/release/bundle/macos/Selfoss.app
```
Linux (Ubuntu/Debian):
```sh
# Install dependencies
sudo apt update
sudo apt install -y \
  libwebkit2gtk-4.0-dev \
  build-essential \
  curl \
  wget \
  libssl-dev \
  libgtk-3-dev \
  libayatana-appindicator3-dev \
  librsvg2-dev

# Then build
npm run tauri build
```
Development Setup
IDE Configuration
VS Code (Recommended):
```json
{
  "recommendations": [
    "tauri-apps.tauri-vscode",
    "rust-lang.rust-analyzer",
    "dbaeumer.vscode-eslint",
    "esbenp.prettier-vscode"
  ]
}
```
VS Code Settings:
```json
{
  "editor.formatOnSave": true,
  "rust-analyzer.cargo.allFeatures": true,
  "eslint.format.enable": true
}
```
Running Tests
Frontend (Jest/Vitest):
```sh
# Run tests
npm test

# Watch mode
npm test -- --watch

# Coverage
npm test -- --coverage
```
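A minimal example of what a Vitest spec might look like; the parsePriority helper is hypothetical and exists only to show the test shape:

```ts
import { describe, expect, it } from "vitest";

// Hypothetical helper, defined inline so the example is self-contained.
function parsePriority(value: string): "high" | "medium" | "low" {
  const normalized = value.trim().toLowerCase();
  return normalized === "high" || normalized === "low" ? normalized : "medium";
}

describe("parsePriority", () => {
  it("defaults unknown values to medium", () => {
    expect(parsePriority("URGENT")).toBe("medium");
  });

  it("accepts known priorities case-insensitively", () => {
    expect(parsePriority(" High ")).toBe("high");
  });
});
```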
Backend (Rust):
```sh
# Run tests
cargo test

# With output
cargo test -- --nocapture

# Specific test
cargo test test_name
```
Debugging
Frontend:
```sh
# Chrome DevTools built-in
# Ctrl+Shift+I (Windows/Linux)
# Cmd+Option+I (macOS)
```
Backend (Rust):
```sh
# Use VS Code debugger
# Or rust-lldb
rust-lldb target/debug/selfoss
```
Custom Ollama Models
Creating Custom Models
1. Write Modelfile:
```text
# Modelfile
FROM llama3.1:latest

PARAMETER temperature 0.7
PARAMETER top_p 0.9

SYSTEM """You are an expert meeting analyzer specializing in technical discussions.
Extract decisions, action items, and concepts with high precision.
Focus on technical terminology and specifications."""
```
2. Create Model:
```sh
ollama create technical-meeting -f Modelfile
```
3. Test Model:
```sh
ollama run technical-meeting "Test prompt"
```
4. Add to Selfoss:
Settings → LLM & Processing → Manage Custom Models → Add "technical-meeting"
Fine-Tuning for Domains
Examples:
Medical Meetings:
```text
FROM llama3.1:70b

SYSTEM """You are a medical meeting analyst.
Understand medical terminology, procedures, and diagnoses.
Extract patient-related actions and clinical decisions.
Maintain HIPAA compliance in outputs."""
```
Legal Meetings:
```text
FROM llama3.1:latest

SYSTEM """You are a legal meeting analyst.
Understand legal terminology, case references, and procedures.
Extract legal actions, deadlines, and decisions.
Note document references and jurisdictions."""
```
Sales Calls:
```text
FROM llama3.1:latest

SYSTEM """You are a sales call analyzer.
Extract: objections, commitments, next steps, pricing discussions.
Identify buying signals and deal status.
Track follow-up actions and decision-makers."""
```
API Integration Patterns
Ollama Integration
Example: Custom transcription call
```ts
// Frontend
async function transcribeWithOllama(audioPath: string): Promise<string> {
  const endpoint = 'http://localhost:11434/api/generate';

  const response = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'whisper:base',
      prompt: '', // Audio handling varies
      stream: false
    })
  });

  const data = await response.json();
  return data.response;
}
```
Rust Backend:
```rust
pub async fn transcribe_audio(
    audio_path: &str,
    model: &str,
    endpoint: &str,
) -> Result<String, String> {
    let client = reqwest::Client::new();

    // Read audio file
    let audio_data = tokio::fs::read(audio_path)
        .await
        .map_err(|e| e.to_string())?;

    // Call Ollama API
    let response = client
        .post(format!("{}/api/generate", endpoint))
        .json(&serde_json::json!({
            "model": model,
            "audio": base64::encode(audio_data),
            "stream": false
        }))
        .send()
        .await
        .map_err(|e| e.to_string())?;

    let result: serde_json::Value = response
        .json()
        .await
        .map_err(|e| e.to_string())?;

    // Avoid panicking if the response shape is unexpected
    result["response"]
        .as_str()
        .map(|s| s.to_string())
        .ok_or_else(|| "missing `response` field in Ollama reply".to_string())
}
```
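For a lightweight connection check against a local Ollama instance (the kind of probe a testConnection() implementation could use), Ollama's /api/tags endpoint lists installed models and responds quickly. A small sketch, assuming the default endpoint used elsewhere in this guide:

```ts
// Returns true if an Ollama server answers at the given endpoint.
async function ollamaIsReachable(endpoint = "http://localhost:11434"): Promise<boolean> {
  try {
    const response = await fetch(`${endpoint}/api/tags`); // lists locally installed models
    return response.ok;
  } catch {
    return false; // network error: server not running or unreachable
  }
}
```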
Adding New AI Providers
1. Define Provider Interface:
```ts
interface LLMProvider {
  name: string;
  transcribe(audio: File): Promise<string>;
  analyze(text: string): Promise<ProcessedTranscript>;
  testConnection(): Promise<boolean>;
}
```
2. Implement Provider:
```ts
class CustomProvider implements LLMProvider {
  name = 'custom'; // must match the registry key below
  private apiKey: string;
  private endpoint: string;

  async transcribe(audio: File): Promise<string> {
    // Implementation
  }

  async analyze(text: string): Promise<ProcessedTranscript> {
    // Implementation
  }

  async testConnection(): Promise<boolean> {
    // Implementation
  }
}
```
3. Register Provider:
```ts
const providers = {
  'ollama': OllamaProvider,
  'openai': OpenAIProvider,
  'gemini': GeminiProvider,
  'custom': CustomProvider // Add here
};
```
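A sketch of how this registry might be consumed at runtime; constructor arguments and error handling are illustrative, since real providers will typically need an API key or endpoint:

```ts
async function analyzeWithProvider(
  providerKey: keyof typeof providers,
  text: string
): Promise<ProcessedTranscript> {
  const ProviderClass = providers[providerKey];
  const provider = new ProviderClass(); // real providers likely take config here

  if (!(await provider.testConnection())) {
    throw new Error(`Provider "${provider.name}" is not reachable`);
  }
  return provider.analyze(text);
}
```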
Contributing Guidelines
Getting Started
- Fork repository
- Create feature branch: git checkout -b feature/my-feature
- Follow code style (Prettier, ESLint, Rustfmt)
- Write tests for new features
- Update documentation as needed
- Submit pull request
Code Style
TypeScript:
- Use Prettier for formatting
- Follow ESLint rules
- Use TypeScript strict mode
- Document complex functions
Rust:
- Use rustfmt for formatting
- Follow Clippy lints
- Document public APIs
- Write unit tests
Commit Messages
Format:
```text
type(scope): subject

body (optional)

footer (optional)
```
Types:
- feat: New feature
- fix: Bug fix
- docs: Documentation
- refactor: Code refactoring
- test: Adding tests
- chore: Maintenance
Examples:
```text
feat(audio): add support for MP3 files

Implements MP3 transcoding to WAV before processing.
Uses ffmpeg for conversion.

Closes #123
```
Testing Requirements
New features must include:
- Unit tests (Jest/Vitest or Rust)
- Integration tests (if applicable)
- Manual testing steps in PR
Documentation
Update relevant docs:
- README.md (if user-facing)
- USER_GUIDES/ (if new feature)
- Code comments (for complex logic)
- API documentation (if backend changes)
Next Steps
Ready to contribute!
For Developers:
- Set up the dev environment - Follow the build instructions above
- Read the architecture docs - Understand the codebase
- Fix a bug - Start with a "good first issue"
- Propose a feature - Open a discussion first
- Join the community - GitHub Discussions
Resources:
- Tauri Docs: tauri.app/v1/guides
- React Docs: react.dev
- Rust Book: doc.rust-lang.org/book
- SQLite Docs: sqlite.org/docs.html

Build the future of meeting intelligence.