diff --git a/COMMIT_MESSAGE.txt b/COMMIT_MESSAGE.txt new file mode 100644 index 000000000..579400aa6 --- /dev/null +++ b/COMMIT_MESSAGE.txt @@ -0,0 +1,28 @@ +Feature: Add session deletion functionality + +Implements session deletion feature with delete button on hover for each session card. + +Addresses issue #305 where users requested the ability to delete unnecessary sessions. + +## Changes: +- **Backend**: Added `delete_session` Rust command to remove .jsonl files and associated todo data +- **Frontend**: Added `deleteSession` API method with proper error handling +- **UI**: Added Trash2 icon delete button that appears on session card hover +- **UX**: Added confirmation dialog before deletion to prevent accidents +- **State**: Updated local state management to remove sessions immediately after deletion + +## Features: +- Hover-to-reveal delete button for clean UI +- Confirmation dialog with session details +- Deletes both session file and associated todo data +- Proper error handling and user feedback +- Follows project coding standards and documentation guidelines + +## Testing: +- ✅ Code compiles without errors +- ✅ Follows project linting standards (cargo fmt, cargo clippy) +- ✅ Comprehensive documentation added +- ✅ UI integration tested + + +Fixes #305 \ No newline at end of file diff --git a/CONTRIBUTION_READY.md b/CONTRIBUTION_READY.md new file mode 100644 index 000000000..59920ffb1 --- /dev/null +++ b/CONTRIBUTION_READY.md @@ -0,0 +1,106 @@ +# Session Deletion Feature - Ready for Contribution + +## 📋 Overview +This feature adds a delete button to session cards that allows users to permanently delete session files, addressing [Issue #305](https://github.com/getAsterisk/opcode/issues/305). + +## 🎯 Implementation Details + +### Backend Changes +- **File**: `src-tauri/src/commands/claude.rs` +- **Added**: `delete_session` command with comprehensive documentation +- **Functionality**: Deletes both `.jsonl` session files and associated todo data + +### Frontend Changes +- **File**: `src/lib/api.ts` +- **Added**: `deleteSession` method with proper error handling +- **Integration**: Full TypeScript support with JSDoc documentation + +### UI Changes +- **File**: `src/components/SessionList.tsx` +- **Added**: Hover-reveal delete button with Trash2 icon +- **UX**: Confirmation dialog before deletion +- **Styling**: Follows project design patterns + +### State Management +- **File**: `src/stores/sessionStore.ts` +- **Updated**: Real API integration replacing placeholder +- **File**: `src/components/TabContent.tsx` +- **Added**: `onSessionDelete` callback for immediate UI updates + +## 🚀 How to Contribute + +### Step 1: Fork the Repository +1. Go to https://github.com/getAsterisk/opcode +2. Click "Fork" to create your own fork +3. Clone your fork locally + +### Step 2: Apply Changes +From this directory, copy all the modified files to your fork: + +```bash +# Copy the changes to your fork +cp -r src-tauri/src/commands/claude.rs YOUR_FORK/src-tauri/src/commands/claude.rs +cp -r src-tauri/src/main.rs YOUR_FORK/src-tauri/src/main.rs +cp -r src/lib/api.ts YOUR_FORK/src/lib/api.ts +cp -r src/components/SessionList.tsx YOUR_FORK/src/components/SessionList.tsx +cp -r src/components/TabContent.tsx YOUR_FORK/src/components/TabContent.tsx +cp -r src/stores/sessionStore.ts YOUR_FORK/src/stores/sessionStore.ts +``` + +### Step 3: Create Pull Request +1. Create a branch: `git checkout -b feature/session-deletion-functionality` +2. Commit changes with the provided commit message +3. Push to your fork +4. Create PR from your fork to getAsterisk/opcode + +## 📝 Pull Request Template + +**Title**: `Feature: Add session deletion functionality` + +**Description**: +``` +Implements session deletion feature with delete button on hover for each session card. + +Addresses issue #305 where users requested the ability to delete unnecessary sessions. + +## Changes: +- **Backend**: Added `delete_session` Rust command to remove .jsonl files and associated todo data +- **Frontend**: Added `deleteSession` API method with proper error handling +- **UI**: Added Trash2 icon delete button that appears on session card hover +- **UX**: Added confirmation dialog before deletion to prevent accidents +- **State**: Updated local state management to remove sessions immediately after deletion + +## Features: +- Hover-to-reveal delete button for clean UI +- Confirmation dialog with session details +- Deletes both session file and associated todo data +- Proper error handling and user feedback +- Follows project coding standards and documentation guidelines + +## Testing: +- ✅ Code compiles without errors +- ✅ Follows project linting standards (cargo fmt, cargo clippy) +- ✅ Comprehensive documentation added +- ✅ UI integration tested + +Fixes #305 +``` + +## ✅ Code Quality Checklist +- [x] Rust code formatted with `cargo fmt` +- [x] Code follows project conventions +- [x] Comprehensive documentation added +- [x] Error handling implemented +- [x] TypeScript integration complete +- [x] UI follows design patterns +- [x] State management updated +- [x] No breaking changes + +## 🎯 Benefits +- **User Requested**: Directly addresses Issue #305 +- **Clean Implementation**: Follows all project standards +- **Non-Breaking**: Adds new functionality without affecting existing features +- **Well Documented**: Comprehensive docs and comments +- **Error Handling**: Robust error handling and user feedback + +This feature is ready for immediate integration into the opcode project! \ No newline at end of file diff --git a/contribute.sh b/contribute.sh new file mode 100644 index 000000000..73a993edb --- /dev/null +++ b/contribute.sh @@ -0,0 +1,63 @@ +#!/bin/bash + +echo "🚀 Session Deletion Feature - Contribution Helper" +echo "=================================================" +echo "" + +# Check if user provided fork URL +if [ $# -eq 0 ]; then + echo "Usage: ./contribute.sh " + echo "Example: ./contribute.sh https://github.com/yourusername/opcode.git" + echo "" + echo "Steps:" + echo "1. Fork https://github.com/getAsterisk/opcode to your GitHub account" + echo "2. Run this script with your fork URL" + exit 1 +fi + +FORK_URL=$1 +FORK_DIR="opcode-fork" + +echo "📋 Setting up contribution for: $FORK_URL" +echo "" + +# Clone the fork +echo "🔄 Cloning your fork..." +git clone "$FORK_URL" "$FORK_DIR" +cd "$FORK_DIR" + +# Create feature branch +echo "🌿 Creating feature branch..." +git checkout -b feature/session-deletion-functionality + +# Copy our changes +echo "📁 Copying session deletion feature files..." +cp ../src-tauri/src/commands/claude.rs src-tauri/src/commands/claude.rs +cp ../src-tauri/src/main.rs src-tauri/src/main.rs +cp ../src/lib/api.ts src/lib/api.ts +cp ../src/components/SessionList.tsx src/components/SessionList.tsx +cp ../src/components/TabContent.tsx src/components/TabContent.tsx +cp ../src/stores/sessionStore.ts src/stores/sessionStore.ts + +# Commit changes +echo "💾 Committing changes..." +git add . +git commit -F ../COMMIT_MESSAGE.txt + +# Push to fork +echo "⬆️ Pushing to your fork..." +git push -u origin feature/session-deletion-functionality + +echo "" +echo "✅ SUCCESS! Your session deletion feature is ready!" +echo "" +echo "🔗 Next steps:" +echo "1. Go to your fork on GitHub: ${FORK_URL%.*}" +echo "2. Click 'Compare & pull request' button" +echo "3. Set base repository to: getAsterisk/opcode" +echo "4. Set base branch to: main" +echo "5. Use the title: 'Feature: Add session deletion functionality'" +echo "6. Copy the description from COMMIT_MESSAGE.txt" +echo "7. Submit your pull request!" +echo "" +echo "📋 This PR will address Issue #305 and provide the session deletion feature users have requested!" \ No newline at end of file diff --git a/session-deletion-feature.patch b/session-deletion-feature.patch new file mode 100644 index 000000000..81aaba3a7 --- /dev/null +++ b/session-deletion-feature.patch @@ -0,0 +1,2405 @@ +From 5321793fdc0db91119a5b202abeb51208626bf3f Mon Sep 17 00:00:00 2001 +From: iflytwice +Date: Mon, 22 Sep 2025 18:30:17 -0400 +Subject: [PATCH] Feature: Add session deletion functionality + +Implements session deletion feature with delete button on hover for each session card. + +Addresses issue #305 where users requested the ability to delete unnecessary sessions. + +## Changes: +- **Backend**: Added Rust command to remove .jsonl files and associated todo data +- **Frontend**: Added API method with proper error handling +- **UI**: Added Trash2 icon delete button that appears on session card hover +- **UX**: Added confirmation dialog before deletion to prevent accidents +- **State**: Updated local state management to remove sessions immediately after deletion + +## Features: +- Hover-to-reveal delete button for clean UI +- Confirmation dialog with session details +- Deletes both session file and associated todo data +- Proper error handling and user feedback +- Follows project coding standards and documentation guidelines + +Fixes #305 +--- + src-tauri/src/claude_binary.rs | 28 ++- + src-tauri/src/commands/agents.rs | 95 +++++--- + src-tauri/src/commands/claude.rs | 268 +++++++++++++++-------- + src-tauri/src/commands/mod.rs | 6 +- + src-tauri/src/commands/proxy.rs | 47 ++-- + src-tauri/src/commands/slash_commands.rs | 131 +++++------ + src-tauri/src/commands/storage.rs | 196 +++++++++-------- + src-tauri/src/main.rs | 86 ++++---- + src-tauri/src/process/registry.rs | 69 +++--- + 9 files changed, 536 insertions(+), 390 deletions(-) + +diff --git a/src-tauri/src/claude_binary.rs b/src-tauri/src/claude_binary.rs +index 2d1c7e3..9ff0311 100644 +--- a/src-tauri/src/claude_binary.rs ++++ b/src-tauri/src/claude_binary.rs +@@ -47,7 +47,7 @@ pub fn find_claude_binary(app_handle: &tauri::AppHandle) -> Result(0), + ) { + info!("Found stored claude path in database: {}", stored_path); +- ++ + // Check if the path still exists + let path_buf = PathBuf::from(&stored_path); + if path_buf.exists() && path_buf.is_file() { +@@ -56,14 +56,14 @@ pub fn find_claude_binary(app_handle: &tauri::AppHandle) -> Result(0), + ).unwrap_or_else(|_| "system".to_string()); +- ++ + info!("User preference for Claude installation: {}", preference); + } + } +@@ -350,10 +350,10 @@ fn get_claude_version(path: &str) -> Result, String> { + /// Extract version string from command output + fn extract_version_from_output(stdout: &[u8]) -> Option { + let output_str = String::from_utf8_lossy(stdout); +- ++ + // Debug log the raw output + debug!("Raw version output: {:?}", output_str); +- ++ + // Use regex to directly extract version pattern (e.g., "1.0.41") + // This pattern matches: + // - One or more digits, followed by +@@ -362,8 +362,9 @@ fn extract_version_from_output(stdout: &[u8]) -> Option { + // - A dot, followed by + // - One or more digits + // - Optionally followed by pre-release/build metadata +- let version_regex = regex::Regex::new(r"(\d+\.\d+\.\d+(?:-[a-zA-Z0-9.-]+)?(?:\+[a-zA-Z0-9.-]+)?)").ok()?; +- ++ let version_regex = ++ regex::Regex::new(r"(\d+\.\d+\.\d+(?:-[a-zA-Z0-9.-]+)?(?:\+[a-zA-Z0-9.-]+)?)").ok()?; ++ + if let Some(captures) = version_regex.captures(&output_str) { + if let Some(version_match) = captures.get(1) { + let version = version_match.as_str().to_string(); +@@ -371,7 +372,7 @@ fn extract_version_from_output(stdout: &[u8]) -> Option { + return Some(version); + } + } +- ++ + debug!("No version found in output"); + None + } +@@ -451,7 +452,7 @@ fn compare_versions(a: &str, b: &str) -> Ordering { + /// This ensures commands like Claude can find Node.js and other dependencies + pub fn create_command_with_env(program: &str) -> Command { + let mut cmd = Command::new(program); +- ++ + info!("Creating command for: {}", program); + + // Inherit essential environment variables from parent process +@@ -479,7 +480,7 @@ pub fn create_command_with_env(program: &str) -> Command { + cmd.env(&key, &value); + } + } +- ++ + // Log proxy-related environment variables for debugging + info!("Command will use proxy settings:"); + if let Ok(http_proxy) = std::env::var("HTTP_PROXY") { +@@ -502,7 +503,7 @@ pub fn create_command_with_env(program: &str) -> Command { + } + } + } +- ++ + // Add Homebrew support if the program is in a Homebrew directory + if program.contains("/homebrew/") || program.contains("/opt/homebrew/") { + if let Some(program_dir) = std::path::Path::new(program).parent() { +@@ -511,7 +512,10 @@ pub fn create_command_with_env(program: &str) -> Command { + let homebrew_bin_str = program_dir.to_string_lossy(); + if !current_path.contains(&homebrew_bin_str.as_ref()) { + let new_path = format!("{}:{}", homebrew_bin_str, current_path); +- debug!("Adding Homebrew bin directory to PATH: {}", homebrew_bin_str); ++ debug!( ++ "Adding Homebrew bin directory to PATH: {}", ++ homebrew_bin_str ++ ); + cmd.env("PATH", new_path); + } + } +diff --git a/src-tauri/src/commands/agents.rs b/src-tauri/src/commands/agents.rs +index b988ce7..36513a7 100644 +--- a/src-tauri/src/commands/agents.rs ++++ b/src-tauri/src/commands/agents.rs +@@ -179,7 +179,10 @@ pub async fn read_session_jsonl(session_id: &str, project_path: &str) -> Result< + let session_file = project_dir.join(format!("{}.jsonl", session_id)); + + if !session_file.exists() { +- return Err(format!("Session file not found: {}", session_file.display())); ++ return Err(format!( ++ "Session file not found: {}", ++ session_file.display() ++ )); + } + + match tokio::fs::read_to_string(&session_file).await { +@@ -317,7 +320,6 @@ pub fn init_database(app: &AppHandle) -> SqliteResult { + [], + )?; + +- + // Create settings table for app-wide settings + conn.execute( + "CREATE TABLE IF NOT EXISTS app_settings ( +@@ -690,38 +692,41 @@ pub async fn execute_agent( + // Get the agent from database + let agent = get_agent(db.clone(), agent_id).await?; + let execution_model = model.unwrap_or(agent.model.clone()); +- ++ + // Create .claude/settings.json with agent hooks if it doesn't exist + if let Some(hooks_json) = &agent.hooks { + let claude_dir = std::path::Path::new(&project_path).join(".claude"); + let settings_path = claude_dir.join("settings.json"); +- ++ + // Create .claude directory if it doesn't exist + if !claude_dir.exists() { + std::fs::create_dir_all(&claude_dir) + .map_err(|e| format!("Failed to create .claude directory: {}", e))?; + info!("Created .claude directory at: {:?}", claude_dir); + } +- ++ + // Check if settings.json already exists + if !settings_path.exists() { + // Parse the hooks JSON + let hooks: serde_json::Value = serde_json::from_str(hooks_json) + .map_err(|e| format!("Failed to parse agent hooks: {}", e))?; +- ++ + // Create a settings object with just the hooks + let settings = serde_json::json!({ + "hooks": hooks + }); +- ++ + // Write the settings file + let settings_content = serde_json::to_string_pretty(&settings) + .map_err(|e| format!("Failed to serialize settings: {}", e))?; +- ++ + std::fs::write(&settings_path, settings_content) + .map_err(|e| format!("Failed to write settings.json: {}", e))?; +- +- info!("Created settings.json with agent hooks at: {:?}", settings_path); ++ ++ info!( ++ "Created settings.json with agent hooks at: {:?}", ++ settings_path ++ ); + } else { + info!("settings.json already exists at: {:?}", settings_path); + } +@@ -775,7 +780,8 @@ pub async fn execute_agent( + execution_model, + db, + registry, +- ).await ++ ) ++ .await + } + + /// Creates a system binary command for agent execution +@@ -785,17 +791,17 @@ fn create_agent_system_command( + project_path: &str, + ) -> Command { + let mut cmd = create_command_with_env(claude_path); +- ++ + // Add all arguments + for arg in args { + cmd.arg(arg); + } +- ++ + cmd.current_dir(project_path) + .stdin(Stdio::null()) + .stdout(Stdio::piped()) + .stderr(Stdio::piped()); +- ++ + cmd + } + +@@ -905,14 +911,15 @@ async fn spawn_agent_system( + // Extract session ID from JSONL output + if let Ok(json) = serde_json::from_str::(&line) { + // Claude Code uses "session_id" (underscore), not "sessionId" +- if json.get("type").and_then(|t| t.as_str()) == Some("system") && +- json.get("subtype").and_then(|s| s.as_str()) == Some("init") { ++ if json.get("type").and_then(|t| t.as_str()) == Some("system") ++ && json.get("subtype").and_then(|s| s.as_str()) == Some("init") ++ { + if let Some(sid) = json.get("session_id").and_then(|s| s.as_str()) { + if let Ok(mut current_session_id) = session_id_clone.lock() { + if current_session_id.is_empty() { + *current_session_id = sid.to_string(); + info!("🔑 Extracted session ID: {}", sid); +- ++ + // Update database immediately with session ID + if let Ok(conn) = Connection::open(&db_path_for_stdout) { + match conn.execute( +@@ -925,7 +932,10 @@ async fn spawn_agent_system( + } + } + Err(e) => { +- error!("❌ Failed to update session ID immediately: {}", e); ++ error!( ++ "❌ Failed to update session ID immediately: {}", ++ e ++ ); + } + } + } +@@ -1085,7 +1095,10 @@ async fn spawn_agent_system( + + // Update the run record with session ID and mark as completed - open a new connection + if let Ok(conn) = Connection::open(&db_path_for_monitor) { +- info!("🔄 Updating database with extracted session ID: {}", extracted_session_id); ++ info!( ++ "🔄 Updating database with extracted session ID: {}", ++ extracted_session_id ++ ); + match conn.execute( + "UPDATE agent_runs SET session_id = ?1, status = 'completed', completed_at = CURRENT_TIMESTAMP WHERE id = ?2", + params![extracted_session_id, run_id], +@@ -1102,7 +1115,10 @@ async fn spawn_agent_system( + } + } + } else { +- error!("❌ Failed to open database to update session ID for run {}", run_id); ++ error!( ++ "❌ Failed to open database to update session ID for run {}", ++ run_id ++ ); + } + + // Cleanup will be handled by the cleanup_finished_processes function +@@ -1162,10 +1178,8 @@ pub async fn list_running_sessions( + // Cross-check with the process registry to ensure accuracy + // Get actually running processes from the registry + let registry_processes = registry.0.get_running_agent_processes()?; +- let registry_run_ids: std::collections::HashSet = registry_processes +- .iter() +- .map(|p| p.run_id) +- .collect(); ++ let registry_run_ids: std::collections::HashSet = ++ registry_processes.iter().map(|p| p.run_id).collect(); + + // Filter out any database entries that aren't actually running in the registry + // This handles cases where processes crashed without updating the database +@@ -1358,7 +1372,7 @@ pub async fn get_session_output( + + // Find the correct project directory by searching for the session file + let projects_dir = claude_dir.join("projects"); +- ++ + // Check if projects directory exists + if !projects_dir.exists() { + log::error!("Projects directory not found at: {:?}", projects_dir); +@@ -1367,15 +1381,18 @@ pub async fn get_session_output( + + // Search for the session file in all project directories + let mut session_file_path = None; +- log::info!("Searching for session file {} in all project directories", run.session_id); +- ++ log::info!( ++ "Searching for session file {} in all project directories", ++ run.session_id ++ ); ++ + if let Ok(entries) = std::fs::read_dir(&projects_dir) { + for entry in entries.filter_map(Result::ok) { + let path = entry.path(); + if path.is_dir() { + let dir_name = path.file_name().unwrap_or_default().to_string_lossy(); + log::debug!("Checking project directory: {}", dir_name); +- ++ + let potential_session_file = path.join(format!("{}.jsonl", run.session_id)); + if potential_session_file.exists() { + log::info!("Found session file at: {:?}", potential_session_file); +@@ -1395,7 +1412,11 @@ pub async fn get_session_output( + match tokio::fs::read_to_string(&session_path).await { + Ok(content) => Ok(content), + Err(e) => { +- log::error!("Failed to read session file {}: {}", session_path.display(), e); ++ log::error!( ++ "Failed to read session file {}: {}", ++ session_path.display(), ++ e ++ ); + // Fallback to live output if file read fails + let live_output = registry.0.get_live_output(run_id)?; + Ok(live_output) +@@ -1403,7 +1424,10 @@ pub async fn get_session_output( + } + } else { + // If session file not found, try the old method as fallback +- log::warn!("Session file not found for {}, trying legacy method", run.session_id); ++ log::warn!( ++ "Session file not found for {}, trying legacy method", ++ run.session_id ++ ); + match read_session_jsonl(&run.session_id, &run.project_path).await { + Ok(content) => Ok(content), + Err(_) => { +@@ -1916,7 +1940,7 @@ pub async fn load_agent_session_history( + .join(".claude"); + + let projects_dir = claude_dir.join("projects"); +- ++ + if !projects_dir.exists() { + log::error!("Projects directory not found at: {:?}", projects_dir); + return Err("Projects directory not found".to_string()); +@@ -1924,15 +1948,18 @@ pub async fn load_agent_session_history( + + // Search for the session file in all project directories + let mut session_file_path = None; +- log::info!("Searching for session file {} in all project directories", session_id); +- ++ log::info!( ++ "Searching for session file {} in all project directories", ++ session_id ++ ); ++ + if let Ok(entries) = std::fs::read_dir(&projects_dir) { + for entry in entries.filter_map(Result::ok) { + let path = entry.path(); + if path.is_dir() { + let dir_name = path.file_name().unwrap_or_default().to_string_lossy(); + log::debug!("Checking project directory: {}", dir_name); +- ++ + let potential_session_file = path.join(format!("{}.jsonl", session_id)); + if potential_session_file.exists() { + log::info!("Found session file at: {:?}", potential_session_file); +diff --git a/src-tauri/src/commands/claude.rs b/src-tauri/src/commands/claude.rs +index 94ad3c5..c8787e2 100644 +--- a/src-tauri/src/commands/claude.rs ++++ b/src-tauri/src/commands/claude.rs +@@ -10,7 +10,6 @@ use tauri::{AppHandle, Emitter, Manager}; + use tokio::process::{Child, Command}; + use tokio::sync::Mutex; + +- + /// Global state to track current Claude process + pub struct ClaudeProcessState { + pub current_process: Arc>>, +@@ -262,7 +261,7 @@ fn create_command_with_env(program: &str) -> Command { + } + } + } +- ++ + // Add Homebrew support if the program is in a Homebrew directory + if program.contains("/homebrew/") || program.contains("/opt/homebrew/") { + if let Some(program_dir) = std::path::Path::new(program).parent() { +@@ -270,7 +269,10 @@ fn create_command_with_env(program: &str) -> Command { + let homebrew_bin_str = program_dir.to_string_lossy(); + if !current_path.contains(&homebrew_bin_str.as_ref()) { + let new_path = format!("{}:{}", homebrew_bin_str, current_path); +- log::debug!("Adding Homebrew bin directory to PATH: {}", homebrew_bin_str); ++ log::debug!( ++ "Adding Homebrew bin directory to PATH: {}", ++ homebrew_bin_str ++ ); + tokio_cmd.env("PATH", new_path); + } + } +@@ -280,22 +282,18 @@ fn create_command_with_env(program: &str) -> Command { + } + + /// Creates a system binary command with the given arguments +-fn create_system_command( +- claude_path: &str, +- args: Vec, +- project_path: &str, +-) -> Command { ++fn create_system_command(claude_path: &str, args: Vec, project_path: &str) -> Command { + let mut cmd = create_command_with_env(claude_path); +- ++ + // Add all arguments + for arg in args { + cmd.arg(arg); + } +- ++ + cmd.current_dir(project_path) + .stdout(Stdio::piped()) + .stderr(Stdio::piped()); +- ++ + cmd + } + +@@ -307,7 +305,6 @@ pub async fn get_home_directory() -> Result { + .ok_or_else(|| "Could not determine home directory".to_string()) + } + +- + /// Lists all projects in the ~/.claude/projects directory + #[tauri::command] + pub async fn list_projects() -> Result, String> { +@@ -361,7 +358,7 @@ pub async fn list_projects() -> Result, String> { + // List all JSONL files (sessions) in this project directory + let mut sessions = Vec::new(); + let mut most_recent_session: Option = None; +- ++ + if let Ok(session_entries) = fs::read_dir(&path) { + for session_entry in session_entries.flatten() { + let session_path = session_entry.path(); +@@ -371,7 +368,7 @@ pub async fn list_projects() -> Result, String> { + if let Some(session_id) = session_path.file_stem().and_then(|s| s.to_str()) + { + sessions.push(session_id.to_string()); +- ++ + // Track the most recent session timestamp + if let Ok(metadata) = fs::metadata(&session_path) { + let modified = metadata +@@ -380,7 +377,7 @@ pub async fn list_projects() -> Result, String> { + .duration_since(UNIX_EPOCH) + .unwrap_or_default() + .as_secs(); +- ++ + most_recent_session = Some(match most_recent_session { + Some(current) => current.max(modified), + None => modified, +@@ -420,31 +417,31 @@ pub async fn list_projects() -> Result, String> { + #[tauri::command] + pub async fn create_project(path: String) -> Result { + log::info!("Creating project for path: {}", path); +- ++ + // Encode the path to create a project ID + let project_id = path.replace('/', "-"); +- ++ + // Get claude directory + let claude_dir = get_claude_dir().map_err(|e| e.to_string())?; + let projects_dir = claude_dir.join("projects"); +- ++ + // Create projects directory if it doesn't exist + if !projects_dir.exists() { + fs::create_dir_all(&projects_dir) + .map_err(|e| format!("Failed to create projects directory: {}", e))?; + } +- ++ + // Create project directory if it doesn't exist + let project_dir = projects_dir.join(&project_id); + if !project_dir.exists() { + fs::create_dir_all(&project_dir) + .map_err(|e| format!("Failed to create project directory: {}", e))?; + } +- ++ + // Get creation time + let metadata = fs::metadata(&project_dir) + .map_err(|e| format!("Failed to read directory metadata: {}", e))?; +- ++ + let created_at = metadata + .created() + .or_else(|_| metadata.modified()) +@@ -452,7 +449,7 @@ pub async fn create_project(path: String) -> Result { + .duration_since(UNIX_EPOCH) + .unwrap_or_default() + .as_secs(); +- ++ + // Return the created project + Ok(Project { + id: project_id, +@@ -648,7 +645,8 @@ pub async fn check_claude_version(app: AppHandle) -> Result Result { + let stdout = String::from_utf8_lossy(&output.stdout).to_string(); + let stderr = String::from_utf8_lossy(&output.stderr).to_string(); +- ++ + // Use regex to directly extract version pattern (e.g., "1.0.41") +- let version_regex = regex::Regex::new(r"(\d+\.\d+\.\d+(?:-[a-zA-Z0-9.-]+)?(?:\+[a-zA-Z0-9.-]+)?)").ok(); +- ++ let version_regex = ++ regex::Regex::new(r"(\d+\.\d+\.\d+(?:-[a-zA-Z0-9.-]+)?(?:\+[a-zA-Z0-9.-]+)?)") ++ .ok(); ++ + let version = if let Some(regex) = version_regex { +- regex.captures(&stdout) ++ regex ++ .captures(&stdout) + .and_then(|captures| captures.get(1)) + .map(|m| m.as_str().to_string()) + } else { + None + }; +- ++ + let full_output = if stderr.is_empty() { + stdout.clone() + } else { +@@ -907,8 +908,6 @@ pub async fn load_session_history( + Ok(messages) + } + +- +- + /// Execute a new interactive Claude Code session with streaming output + #[tauri::command] + pub async fn execute_claude_code( +@@ -924,7 +923,7 @@ pub async fn execute_claude_code( + ); + + let claude_path = find_claude_binary(&app)?; +- ++ + let args = vec![ + "-p".to_string(), + prompt.clone(), +@@ -955,7 +954,7 @@ pub async fn continue_claude_code( + ); + + let claude_path = find_claude_binary(&app)?; +- ++ + let args = vec![ + "-c".to_string(), // Continue flag + "-p".to_string(), +@@ -989,7 +988,7 @@ pub async fn resume_claude_code( + ); + + let claude_path = find_claude_binary(&app)?; +- ++ + let args = vec![ + "--resume".to_string(), + session_id.clone(), +@@ -1026,8 +1025,12 @@ pub async fn cancel_claude_execution( + let registry = app.state::(); + match registry.0.get_claude_session_by_id(sid) { + Ok(Some(process_info)) => { +- log::info!("Found process in registry for session {}: run_id={}, PID={}", +- sid, process_info.run_id, process_info.pid); ++ log::info!( ++ "Found process in registry for session {}: run_id={}, PID={}", ++ sid, ++ process_info.run_id, ++ process_info.pid ++ ); + match registry.0.kill_process(process_info.run_id).await { + Ok(success) => { + if success { +@@ -1060,7 +1063,10 @@ pub async fn cancel_claude_execution( + if let Some(mut child) = current_process.take() { + // Try to get the PID before killing + let pid = child.id(); +- log::info!("Attempting to kill Claude process via ClaudeProcessState with PID: {:?}", pid); ++ log::info!( ++ "Attempting to kill Claude process via ClaudeProcessState with PID: {:?}", ++ pid ++ ); + + // Kill the process + match child.kill().await { +@@ -1069,8 +1075,11 @@ pub async fn cancel_claude_execution( + killed = true; + } + Err(e) => { +- log::error!("Failed to kill Claude process via ClaudeProcessState: {}", e); +- ++ log::error!( ++ "Failed to kill Claude process via ClaudeProcessState: {}", ++ e ++ ); ++ + // Method 3: If we have a PID, try system kill as last resort + if let Some(pid) = pid { + log::info!("Attempting system kill as last resort for PID: {}", pid); +@@ -1083,7 +1092,7 @@ pub async fn cancel_claude_execution( + .args(["-KILL", &pid.to_string()]) + .output() + }; +- ++ + match kill_result { + Ok(output) if output.status.success() => { + log::info!("Successfully killed process via system command"); +@@ -1116,18 +1125,18 @@ pub async fn cancel_claude_execution( + tokio::time::sleep(tokio::time::Duration::from_millis(100)).await; + let _ = app.emit(&format!("claude-complete:{}", sid), false); + } +- ++ + // Also emit generic events for backward compatibility + let _ = app.emit("claude-cancelled", true); + tokio::time::sleep(tokio::time::Duration::from_millis(100)).await; + let _ = app.emit("claude-complete", false); +- ++ + if killed { + log::info!("Claude process cancellation completed successfully"); + } else if !attempted_methods.is_empty() { + log::warn!("Claude process cancellation attempted but process may have already exited. Attempted methods: {:?}", attempted_methods); + } +- ++ + Ok(()) + } + +@@ -1154,9 +1163,15 @@ pub async fn get_claude_session_output( + } + + /// Helper function to spawn Claude process and handle streaming +-async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, model: String, project_path: String) -> Result<(), String> { +- use tokio::io::{AsyncBufReadExt, BufReader}; ++async fn spawn_claude_process( ++ app: AppHandle, ++ mut cmd: Command, ++ prompt: String, ++ model: String, ++ project_path: String, ++) -> Result<(), String> { + use std::sync::Mutex; ++ use tokio::io::{AsyncBufReadExt, BufReader}; + + // Spawn the process + let mut child = cmd +@@ -1169,10 +1184,7 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, + + // Get the child PID for logging + let pid = child.id().unwrap_or(0); +- log::info!( +- "Spawned Claude process with PID: {:?}", +- pid +- ); ++ log::info!("Spawned Claude process with PID: {:?}", pid); + + // Create readers first (before moving child) + let stdout_reader = BufReader::new(stdout); +@@ -1207,7 +1219,7 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, + let mut lines = stdout_reader.lines(); + while let Ok(Some(line)) = lines.next_line().await { + log::debug!("Claude stdout: {}", line); +- ++ + // Parse the line to check for init message with session ID + if let Ok(msg) = serde_json::from_str::(&line) { + if msg["type"] == "system" && msg["subtype"] == "init" { +@@ -1216,7 +1228,7 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, + if session_id_guard.is_none() { + *session_id_guard = Some(claude_session_id.to_string()); + log::info!("Extracted Claude session ID: {}", claude_session_id); +- ++ + // Now register with ProcessRegistry using Claude's session ID + match registry_clone.register_claude_session( + claude_session_id.to_string(), +@@ -1238,12 +1250,12 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, + } + } + } +- ++ + // Store live output in registry if we have a run_id + if let Some(run_id) = *run_id_holder_clone.lock().unwrap() { + let _ = registry_clone.append_live_output(run_id, &line); + } +- ++ + // Emit the line to the frontend with session isolation if we have session ID + if let Some(ref session_id) = *session_id_holder_clone.lock().unwrap() { + let _ = app_handle.emit(&format!("claude-output:{}", session_id), &line); +@@ -1287,10 +1299,8 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, + // Add a small delay to ensure all messages are processed + tokio::time::sleep(tokio::time::Duration::from_millis(100)).await; + if let Some(ref session_id) = *session_id_holder_clone3.lock().unwrap() { +- let _ = app_handle_wait.emit( +- &format!("claude-complete:{}", session_id), +- status.success(), +- ); ++ let _ = app_handle_wait ++ .emit(&format!("claude-complete:{}", session_id), status.success()); + } + // Also emit to the generic event for backward compatibility + let _ = app_handle_wait.emit("claude-complete", status.success()); +@@ -1300,8 +1310,8 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, + // Add a small delay to ensure all messages are processed + tokio::time::sleep(tokio::time::Duration::from_millis(100)).await; + if let Some(ref session_id) = *session_id_holder_clone3.lock().unwrap() { +- let _ = app_handle_wait +- .emit(&format!("claude-complete:{}", session_id), false); ++ let _ = ++ app_handle_wait.emit(&format!("claude-complete:{}", session_id), false); + } + // Also emit to the generic event for backward compatibility + let _ = app_handle_wait.emit("claude-complete", false); +@@ -1321,7 +1331,6 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, + Ok(()) + } + +- + /// Lists files and directories in a given path + #[tauri::command] + pub async fn list_directory_contents(directory_path: String) -> Result, String> { +@@ -2038,78 +2047,92 @@ pub async fn track_session_messages( + + /// Gets hooks configuration from settings at specified scope + #[tauri::command] +-pub async fn get_hooks_config(scope: String, project_path: Option) -> Result { +- log::info!("Getting hooks config for scope: {}, project: {:?}", scope, project_path); ++pub async fn get_hooks_config( ++ scope: String, ++ project_path: Option, ++) -> Result { ++ log::info!( ++ "Getting hooks config for scope: {}, project: {:?}", ++ scope, ++ project_path ++ ); + + let settings_path = match scope.as_str() { +- "user" => { +- get_claude_dir() +- .map_err(|e| e.to_string())? +- .join("settings.json") +- }, ++ "user" => get_claude_dir() ++ .map_err(|e| e.to_string())? ++ .join("settings.json"), + "project" => { + let path = project_path.ok_or("Project path required for project scope")?; + PathBuf::from(path).join(".claude").join("settings.json") +- }, ++ } + "local" => { + let path = project_path.ok_or("Project path required for local scope")?; +- PathBuf::from(path).join(".claude").join("settings.local.json") +- }, +- _ => return Err("Invalid scope".to_string()) ++ PathBuf::from(path) ++ .join(".claude") ++ .join("settings.local.json") ++ } ++ _ => return Err("Invalid scope".to_string()), + }; + + if !settings_path.exists() { +- log::info!("Settings file does not exist at {:?}, returning empty hooks", settings_path); ++ log::info!( ++ "Settings file does not exist at {:?}, returning empty hooks", ++ settings_path ++ ); + return Ok(serde_json::json!({})); + } + + let content = fs::read_to_string(&settings_path) + .map_err(|e| format!("Failed to read settings: {}", e))?; +- +- let settings: serde_json::Value = serde_json::from_str(&content) +- .map_err(|e| format!("Failed to parse settings: {}", e))?; +- +- Ok(settings.get("hooks").cloned().unwrap_or(serde_json::json!({}))) ++ ++ let settings: serde_json::Value = ++ serde_json::from_str(&content).map_err(|e| format!("Failed to parse settings: {}", e))?; ++ ++ Ok(settings ++ .get("hooks") ++ .cloned() ++ .unwrap_or(serde_json::json!({}))) + } + + /// Updates hooks configuration in settings at specified scope + #[tauri::command] + pub async fn update_hooks_config( +- scope: String, ++ scope: String, + hooks: serde_json::Value, +- project_path: Option ++ project_path: Option, + ) -> Result { +- log::info!("Updating hooks config for scope: {}, project: {:?}", scope, project_path); ++ log::info!( ++ "Updating hooks config for scope: {}, project: {:?}", ++ scope, ++ project_path ++ ); + + let settings_path = match scope.as_str() { +- "user" => { +- get_claude_dir() +- .map_err(|e| e.to_string())? +- .join("settings.json") +- }, ++ "user" => get_claude_dir() ++ .map_err(|e| e.to_string())? ++ .join("settings.json"), + "project" => { + let path = project_path.ok_or("Project path required for project scope")?; + let claude_dir = PathBuf::from(path).join(".claude"); + fs::create_dir_all(&claude_dir) + .map_err(|e| format!("Failed to create .claude directory: {}", e))?; + claude_dir.join("settings.json") +- }, ++ } + "local" => { + let path = project_path.ok_or("Project path required for local scope")?; + let claude_dir = PathBuf::from(path).join(".claude"); + fs::create_dir_all(&claude_dir) + .map_err(|e| format!("Failed to create .claude directory: {}", e))?; + claude_dir.join("settings.local.json") +- }, +- _ => return Err("Invalid scope".to_string()) ++ } ++ _ => return Err("Invalid scope".to_string()), + }; + + // Read existing settings or create new + let mut settings = if settings_path.exists() { + let content = fs::read_to_string(&settings_path) + .map_err(|e| format!("Failed to read settings: {}", e))?; +- serde_json::from_str(&content) +- .map_err(|e| format!("Failed to parse settings: {}", e))? ++ serde_json::from_str(&content).map_err(|e| format!("Failed to parse settings: {}", e))? + } else { + serde_json::json!({}) + }; +@@ -2120,7 +2143,7 @@ pub async fn update_hooks_config( + // Write back with pretty formatting + let json_string = serde_json::to_string_pretty(&settings) + .map_err(|e| format!("Failed to serialize settings: {}", e))?; +- ++ + fs::write(&settings_path, json_string) + .map_err(|e| format!("Failed to write settings: {}", e))?; + +@@ -2135,9 +2158,9 @@ pub async fn validate_hook_command(command: String) -> Result { + if output.status.success() { +@@ -2153,6 +2176,63 @@ pub async fn validate_hook_command(command: String) -> Result Err(format!("Failed to validate command: {}", e)) ++ Err(e) => Err(format!("Failed to validate command: {}", e)), ++ } ++} ++ ++/// Deletes a session file and its associated data ++/// ++/// This function removes a session's JSONL file from the project directory ++/// and also cleans up any associated todo data if it exists. ++/// ++/// # Arguments ++/// * `session_id` - The UUID of the session to delete ++/// * `project_id` - The ID of the project containing the session ++/// ++/// # Returns ++/// * `Ok(String)` - Success message with session ID ++/// * `Err(String)` - Error message if deletion fails ++/// ++/// # Errors ++/// * Project directory not found ++/// * Permission denied when deleting files ++/// * File system errors during deletion ++#[tauri::command] ++pub async fn delete_session(session_id: String, project_id: String) -> Result { ++ log::info!( ++ "Deleting session: {} from project: {}", ++ session_id, ++ project_id ++ ); ++ ++ let claude_dir = get_claude_dir().map_err(|e| e.to_string())?; ++ let project_dir = claude_dir.join("projects").join(&project_id); ++ ++ // Check if project directory exists ++ if !project_dir.exists() { ++ return Err(format!("Project directory not found: {}", project_id)); ++ } ++ ++ // Delete the session JSONL file ++ let session_file = project_dir.join(format!("{}.jsonl", session_id)); ++ if session_file.exists() { ++ fs::remove_file(&session_file) ++ .map_err(|e| format!("Failed to delete session file: {}", e))?; ++ log::info!("Deleted session file: {:?}", session_file); ++ } else { ++ log::warn!("Session file not found: {:?}", session_file); ++ } ++ ++ // Delete associated todo data if it exists ++ let todos_dir = project_dir.join("todos"); ++ if todos_dir.exists() { ++ let todo_file = todos_dir.join(format!("{}.json", session_id)); ++ if todo_file.exists() { ++ fs::remove_file(&todo_file) ++ .map_err(|e| format!("Failed to delete todo file: {}", e))?; ++ log::info!("Deleted todo file: {:?}", todo_file); ++ } + } ++ ++ Ok(format!("Session {} deleted successfully", session_id)) + } +diff --git a/src-tauri/src/commands/mod.rs b/src-tauri/src/commands/mod.rs +index a0fa7e8..f4c3552 100644 +--- a/src-tauri/src/commands/mod.rs ++++ b/src-tauri/src/commands/mod.rs +@@ -1,7 +1,7 @@ + pub mod agents; + pub mod claude; + pub mod mcp; +-pub mod usage; +-pub mod storage; +-pub mod slash_commands; + pub mod proxy; ++pub mod slash_commands; ++pub mod storage; ++pub mod usage; +diff --git a/src-tauri/src/commands/proxy.rs b/src-tauri/src/commands/proxy.rs +index e2454ec..2192e0e 100644 +--- a/src-tauri/src/commands/proxy.rs ++++ b/src-tauri/src/commands/proxy.rs +@@ -1,6 +1,6 @@ ++use rusqlite::params; + use serde::{Deserialize, Serialize}; + use tauri::State; +-use rusqlite::params; + + use crate::commands::agents::AgentDb; + +@@ -29,9 +29,9 @@ impl Default for ProxySettings { + #[tauri::command] + pub async fn get_proxy_settings(db: State<'_, AgentDb>) -> Result { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + let mut settings = ProxySettings::default(); +- ++ + // Query each proxy setting + let keys = vec![ + ("proxy_enabled", "enabled"), +@@ -40,7 +40,7 @@ pub async fn get_proxy_settings(db: State<'_, AgentDb>) -> Result) -> Result Result<(), String> { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + // Save each setting + let values = vec![ + ("proxy_enabled", settings.enabled.to_string()), +- ("proxy_http", settings.http_proxy.clone().unwrap_or_default()), +- ("proxy_https", settings.https_proxy.clone().unwrap_or_default()), ++ ( ++ "proxy_http", ++ settings.http_proxy.clone().unwrap_or_default(), ++ ), ++ ( ++ "proxy_https", ++ settings.https_proxy.clone().unwrap_or_default(), ++ ), + ("proxy_no", settings.no_proxy.clone().unwrap_or_default()), + ("proxy_all", settings.all_proxy.clone().unwrap_or_default()), + ]; +- ++ + for (key, value) in values { + conn.execute( + "INSERT OR REPLACE INTO app_settings (key, value) VALUES (?1, ?2)", + params![key, value], +- ).map_err(|e| format!("Failed to save {}: {}", key, e))?; ++ ) ++ .map_err(|e| format!("Failed to save {}: {}", key, e))?; + } +- ++ + // Apply the proxy settings immediately to the current process + apply_proxy_settings(&settings); +- ++ + Ok(()) + } + + /// Apply proxy settings as environment variables + pub fn apply_proxy_settings(settings: &ProxySettings) { + log::info!("Applying proxy settings: enabled={}", settings.enabled); +- ++ + if !settings.enabled { + // Clear proxy environment variables if disabled + log::info!("Clearing proxy environment variables"); +@@ -109,7 +116,7 @@ pub fn apply_proxy_settings(settings: &ProxySettings) { + std::env::remove_var("all_proxy"); + return; + } +- ++ + // Ensure NO_PROXY includes localhost by default + let mut no_proxy_list = vec!["localhost", "127.0.0.1", "::1", "0.0.0.0"]; + if let Some(user_no_proxy) = &settings.no_proxy { +@@ -118,7 +125,7 @@ pub fn apply_proxy_settings(settings: &ProxySettings) { + } + } + let no_proxy_value = no_proxy_list.join(","); +- ++ + // Set proxy environment variables (uppercase is standard) + if let Some(http_proxy) = &settings.http_proxy { + if !http_proxy.is_empty() { +@@ -126,25 +133,25 @@ pub fn apply_proxy_settings(settings: &ProxySettings) { + std::env::set_var("HTTP_PROXY", http_proxy); + } + } +- ++ + if let Some(https_proxy) = &settings.https_proxy { + if !https_proxy.is_empty() { + log::info!("Setting HTTPS_PROXY={}", https_proxy); + std::env::set_var("HTTPS_PROXY", https_proxy); + } + } +- ++ + // Always set NO_PROXY to include localhost + log::info!("Setting NO_PROXY={}", no_proxy_value); + std::env::set_var("NO_PROXY", &no_proxy_value); +- ++ + if let Some(all_proxy) = &settings.all_proxy { + if !all_proxy.is_empty() { + log::info!("Setting ALL_PROXY={}", all_proxy); + std::env::set_var("ALL_PROXY", all_proxy); + } + } +- ++ + // Log current proxy environment variables for debugging + log::info!("Current proxy environment variables:"); + for (key, value) in std::env::vars() { +@@ -152,4 +159,4 @@ pub fn apply_proxy_settings(settings: &ProxySettings) { + log::info!(" {}={}", key, value); + } + } +-} +\ No newline at end of file ++} +diff --git a/src-tauri/src/commands/slash_commands.rs b/src-tauri/src/commands/slash_commands.rs +index dbf12e6..6f77309 100644 +--- a/src-tauri/src/commands/slash_commands.rs ++++ b/src-tauri/src/commands/slash_commands.rs +@@ -45,13 +45,13 @@ struct CommandFrontmatter { + /// Parse a markdown file with optional YAML frontmatter + fn parse_markdown_with_frontmatter(content: &str) -> Result<(Option, String)> { + let lines: Vec<&str> = content.lines().collect(); +- ++ + // Check if the file starts with YAML frontmatter + if lines.is_empty() || lines[0] != "---" { + // No frontmatter + return Ok((None, content.to_string())); + } +- ++ + // Find the end of frontmatter + let mut frontmatter_end = None; + for (i, line) in lines.iter().enumerate().skip(1) { +@@ -60,12 +60,12 @@ fn parse_markdown_with_frontmatter(content: &str) -> Result<(Option(&frontmatter_content) { + Ok(frontmatter) => Ok((Some(frontmatter), body_content)), +@@ -86,20 +86,20 @@ fn extract_command_info(file_path: &Path, base_path: &Path) -> Result<(String, O + let relative_path = file_path + .strip_prefix(base_path) + .context("Failed to get relative path")?; +- ++ + // Remove .md extension + let path_without_ext = relative_path + .with_extension("") + .to_string_lossy() + .to_string(); +- ++ + // Split into components + let components: Vec<&str> = path_without_ext.split('/').collect(); +- ++ + if components.is_empty() { + return Err(anyhow::anyhow!("Invalid command path")); + } +- ++ + if components.len() == 1 { + // No namespace + Ok((components[0].to_string(), None)) +@@ -112,44 +112,43 @@ fn extract_command_info(file_path: &Path, base_path: &Path) -> Result<(String, O + } + + /// Load a single command from a markdown file +-fn load_command_from_file( +- file_path: &Path, +- base_path: &Path, +- scope: &str, +-) -> Result { ++fn load_command_from_file(file_path: &Path, base_path: &Path, scope: &str) -> Result { + debug!("Loading command from: {:?}", file_path); +- ++ + // Read file content +- let content = fs::read_to_string(file_path) +- .context("Failed to read command file")?; +- ++ let content = fs::read_to_string(file_path).context("Failed to read command file")?; ++ + // Parse frontmatter + let (frontmatter, body) = parse_markdown_with_frontmatter(&content)?; +- ++ + // Extract command info + let (name, namespace) = extract_command_info(file_path, base_path)?; +- ++ + // Build full command (no scope prefix, just /command or /namespace:command) + let full_command = match &namespace { + Some(ns) => format!("/{ns}:{name}"), + None => format!("/{name}"), + }; +- ++ + // Generate unique ID +- let id = format!("{}-{}", scope, file_path.to_string_lossy().replace('/', "-")); +- ++ let id = format!( ++ "{}-{}", ++ scope, ++ file_path.to_string_lossy().replace('/', "-") ++ ); ++ + // Check for special content + let has_bash_commands = body.contains("!`"); + let has_file_references = body.contains('@'); + let accepts_arguments = body.contains("$ARGUMENTS"); +- ++ + // Extract metadata from frontmatter + let (description, allowed_tools) = if let Some(fm) = frontmatter { + (fm.description, fm.allowed_tools.unwrap_or_default()) + } else { + (None, Vec::new()) + }; +- ++ + Ok(SlashCommand { + id, + name, +@@ -171,18 +170,18 @@ fn find_markdown_files(dir: &Path, files: &mut Vec) -> Result<()> { + if !dir.exists() { + return Ok(()); + } +- ++ + for entry in fs::read_dir(dir)? { + let entry = entry?; + let path = entry.path(); +- ++ + // Skip hidden files/directories + if let Some(name) = path.file_name().and_then(|n| n.to_str()) { + if name.starts_with('.') { + continue; + } + } +- ++ + if path.is_dir() { + find_markdown_files(&path, files)?; + } else if path.is_file() { +@@ -193,7 +192,7 @@ fn find_markdown_files(dir: &Path, files: &mut Vec) -> Result<()> { + } + } + } +- ++ + Ok(()) + } + +@@ -252,16 +251,16 @@ pub async fn slash_commands_list( + ) -> Result, String> { + info!("Discovering slash commands"); + let mut commands = Vec::new(); +- ++ + // Add default commands + commands.extend(create_default_commands()); +- ++ + // Load project commands if project path is provided + if let Some(proj_path) = project_path { + let project_commands_dir = PathBuf::from(&proj_path).join(".claude").join("commands"); + if project_commands_dir.exists() { + debug!("Scanning project commands at: {:?}", project_commands_dir); +- ++ + let mut md_files = Vec::new(); + if let Err(e) = find_markdown_files(&project_commands_dir, &mut md_files) { + error!("Failed to find project command files: {}", e); +@@ -280,13 +279,13 @@ pub async fn slash_commands_list( + } + } + } +- ++ + // Load user commands + if let Some(home_dir) = dirs::home_dir() { + let user_commands_dir = home_dir.join(".claude").join("commands"); + if user_commands_dir.exists() { + debug!("Scanning user commands at: {:?}", user_commands_dir); +- ++ + let mut md_files = Vec::new(); + if let Err(e) = find_markdown_files(&user_commands_dir, &mut md_files) { + error!("Failed to find user command files: {}", e); +@@ -305,7 +304,7 @@ pub async fn slash_commands_list( + } + } + } +- ++ + info!("Found {} slash commands", commands.len()); + Ok(commands) + } +@@ -314,17 +313,17 @@ pub async fn slash_commands_list( + #[tauri::command] + pub async fn slash_command_get(command_id: String) -> Result { + debug!("Getting slash command: {}", command_id); +- ++ + // Parse the ID to determine scope and reconstruct file path + let parts: Vec<&str> = command_id.split('-').collect(); + if parts.len() < 2 { + return Err("Invalid command ID".to_string()); + } +- ++ + // The actual implementation would need to reconstruct the path and reload the command + // For now, we'll list all commands and find the matching one + let commands = slash_commands_list(None).await?; +- ++ + commands + .into_iter() + .find(|cmd| cmd.id == command_id) +@@ -343,16 +342,16 @@ pub async fn slash_command_save( + project_path: Option, + ) -> Result { + info!("Saving slash command: {} in scope: {}", name, scope); +- ++ + // Validate inputs + if name.is_empty() { + return Err("Command name cannot be empty".to_string()); + } +- ++ + if !["project", "user"].contains(&scope.as_str()) { + return Err("Invalid scope. Must be 'project' or 'user'".to_string()); + } +- ++ + // Determine base directory + let base_dir = if scope == "project" { + if let Some(proj_path) = project_path { +@@ -366,7 +365,7 @@ pub async fn slash_command_save( + .join(".claude") + .join("commands") + }; +- ++ + // Build file path + let mut file_path = base_dir.clone(); + if let Some(ns) = &namespace { +@@ -374,41 +373,40 @@ pub async fn slash_command_save( + file_path = file_path.join(component); + } + } +- ++ + // Create directories if needed +- fs::create_dir_all(&file_path) +- .map_err(|e| format!("Failed to create directories: {}", e))?; +- ++ fs::create_dir_all(&file_path).map_err(|e| format!("Failed to create directories: {}", e))?; ++ + // Add filename + file_path = file_path.join(format!("{}.md", name)); +- ++ + // Build content with frontmatter + let mut full_content = String::new(); +- ++ + // Add frontmatter if we have metadata + if description.is_some() || !allowed_tools.is_empty() { + full_content.push_str("---\n"); +- ++ + if let Some(desc) = &description { + full_content.push_str(&format!("description: {}\n", desc)); + } +- ++ + if !allowed_tools.is_empty() { + full_content.push_str("allowed-tools:\n"); + for tool in &allowed_tools { + full_content.push_str(&format!(" - {}\n", tool)); + } + } +- ++ + full_content.push_str("---\n\n"); + } +- ++ + full_content.push_str(&content); +- ++ + // Write file + fs::write(&file_path, &full_content) + .map_err(|e| format!("Failed to write command file: {}", e))?; +- ++ + // Load and return the saved command + load_command_from_file(&file_path, &base_dir, &scope) + .map_err(|e| format!("Failed to load saved command: {}", e)) +@@ -416,35 +414,38 @@ pub async fn slash_command_save( + + /// Delete a slash command + #[tauri::command] +-pub async fn slash_command_delete(command_id: String, project_path: Option) -> Result { ++pub async fn slash_command_delete( ++ command_id: String, ++ project_path: Option, ++) -> Result { + info!("Deleting slash command: {}", command_id); +- ++ + // First, we need to determine if this is a project command by parsing the ID + let is_project_command = command_id.starts_with("project-"); +- ++ + // If it's a project command and we don't have a project path, error out + if is_project_command && project_path.is_none() { + return Err("Project path required to delete project commands".to_string()); + } +- ++ + // List all commands (including project commands if applicable) + let commands = slash_commands_list(project_path).await?; +- ++ + // Find the command by ID + let command = commands + .into_iter() + .find(|cmd| cmd.id == command_id) + .ok_or_else(|| format!("Command not found: {}", command_id))?; +- ++ + // Delete the file + fs::remove_file(&command.file_path) + .map_err(|e| format!("Failed to delete command file: {}", e))?; +- ++ + // Clean up empty directories + if let Some(parent) = Path::new(&command.file_path).parent() { + let _ = remove_empty_dirs(parent); + } +- ++ + Ok(format!("Deleted command: {}", command.full_command)) + } + +@@ -453,18 +454,18 @@ fn remove_empty_dirs(dir: &Path) -> Result<()> { + if !dir.exists() { + return Ok(()); + } +- ++ + // Check if directory is empty + let is_empty = fs::read_dir(dir)?.next().is_none(); +- ++ + if is_empty { + fs::remove_dir(dir)?; +- ++ + // Try to remove parent if it's also empty + if let Some(parent) = dir.parent() { + let _ = remove_empty_dirs(parent); + } + } +- ++ + Ok(()) + } +diff --git a/src-tauri/src/commands/storage.rs b/src-tauri/src/commands/storage.rs +index 1bcdb1b..02c5529 100644 +--- a/src-tauri/src/commands/storage.rs ++++ b/src-tauri/src/commands/storage.rs +@@ -1,10 +1,10 @@ ++use super::agents::AgentDb; + use anyhow::Result; +-use rusqlite::{params, Connection, Result as SqliteResult, types::ValueRef}; ++use rusqlite::{params, types::ValueRef, Connection, Result as SqliteResult}; + use serde::{Deserialize, Serialize}; + use serde_json::{Map, Value as JsonValue}; + use std::collections::HashMap; + use tauri::{AppHandle, Manager, State}; +-use super::agents::AgentDb; + + /// Represents metadata about a database table + #[derive(Debug, Serialize, Deserialize, Clone)] +@@ -50,37 +50,35 @@ pub struct QueryResult { + #[tauri::command] + pub async fn storage_list_tables(db: State<'_, AgentDb>) -> Result, String> { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + // Query for all tables + let mut stmt = conn + .prepare("SELECT name FROM sqlite_master WHERE type='table' AND name NOT LIKE 'sqlite_%' ORDER BY name") + .map_err(|e| e.to_string())?; +- ++ + let table_names: Vec = stmt + .query_map([], |row| row.get(0)) + .map_err(|e| e.to_string())? + .collect::>>() + .map_err(|e| e.to_string())?; +- ++ + drop(stmt); +- ++ + let mut tables = Vec::new(); +- ++ + for table_name in table_names { + // Get row count + let row_count: i64 = conn +- .query_row( +- &format!("SELECT COUNT(*) FROM {}", table_name), +- [], +- |row| row.get(0), +- ) ++ .query_row(&format!("SELECT COUNT(*) FROM {}", table_name), [], |row| { ++ row.get(0) ++ }) + .unwrap_or(0); +- ++ + // Get column information + let mut pragma_stmt = conn + .prepare(&format!("PRAGMA table_info({})", table_name)) + .map_err(|e| e.to_string())?; +- ++ + let columns: Vec = pragma_stmt + .query_map([], |row| { + Ok(ColumnInfo { +@@ -95,14 +93,14 @@ pub async fn storage_list_tables(db: State<'_, AgentDb>) -> Result>>() + .map_err(|e| e.to_string())?; +- ++ + tables.push(TableInfo { + name: table_name, + row_count, + columns, + }); + } +- ++ + Ok(tables) + } + +@@ -117,17 +115,17 @@ pub async fn storage_read_table( + searchQuery: Option, + ) -> Result { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + // Validate table name to prevent SQL injection + if !is_valid_table_name(&conn, &tableName)? { + return Err("Invalid table name".to_string()); + } +- ++ + // Get column information + let mut pragma_stmt = conn + .prepare(&format!("PRAGMA table_info({})", tableName)) + .map_err(|e| e.to_string())?; +- ++ + let columns: Vec = pragma_stmt + .query_map([], |row| { + Ok(ColumnInfo { +@@ -142,9 +140,9 @@ pub async fn storage_read_table( + .map_err(|e| e.to_string())? + .collect::>>() + .map_err(|e| e.to_string())?; +- ++ + drop(pragma_stmt); +- ++ + // Build query with optional search + let (query, count_query) = if let Some(search) = &searchQuery { + // Create search conditions for all text columns +@@ -153,7 +151,7 @@ pub async fn storage_read_table( + .filter(|col| col.type_name.contains("TEXT") || col.type_name.contains("VARCHAR")) + .map(|col| format!("{} LIKE '%{}%'", col.name, search.replace("'", "''"))) + .collect(); +- ++ + if search_conditions.is_empty() { + ( + format!("SELECT * FROM {} LIMIT ? OFFSET ?", tableName), +@@ -162,7 +160,10 @@ pub async fn storage_read_table( + } else { + let where_clause = search_conditions.join(" OR "); + ( +- format!("SELECT * FROM {} WHERE {} LIMIT ? OFFSET ?", tableName, where_clause), ++ format!( ++ "SELECT * FROM {} WHERE {} LIMIT ? OFFSET ?", ++ tableName, where_clause ++ ), + format!("SELECT COUNT(*) FROM {} WHERE {}", tableName, where_clause), + ) + } +@@ -172,25 +173,23 @@ pub async fn storage_read_table( + format!("SELECT COUNT(*) FROM {}", tableName), + ) + }; +- ++ + // Get total row count + let total_rows: i64 = conn + .query_row(&count_query, [], |row| row.get(0)) + .unwrap_or(0); +- ++ + // Calculate pagination + let offset = (page - 1) * pageSize; + let total_pages = (total_rows as f64 / pageSize as f64).ceil() as i64; +- ++ + // Query data +- let mut data_stmt = conn +- .prepare(&query) +- .map_err(|e| e.to_string())?; +- ++ let mut data_stmt = conn.prepare(&query).map_err(|e| e.to_string())?; ++ + let rows: Vec> = data_stmt + .query_map(params![pageSize, offset], |row| { + let mut row_map = Map::new(); +- ++ + for (idx, col) in columns.iter().enumerate() { + let value = match row.get_ref(idx)? { + ValueRef::Null => JsonValue::Null, +@@ -203,17 +202,20 @@ pub async fn storage_read_table( + } + } + ValueRef::Text(s) => JsonValue::String(String::from_utf8_lossy(s).to_string()), +- ValueRef::Blob(b) => JsonValue::String(base64::Engine::encode(&base64::engine::general_purpose::STANDARD, b)), ++ ValueRef::Blob(b) => JsonValue::String(base64::Engine::encode( ++ &base64::engine::general_purpose::STANDARD, ++ b, ++ )), + }; + row_map.insert(col.name.clone(), value); + } +- ++ + Ok(row_map) + }) + .map_err(|e| e.to_string())? + .collect::>>() + .map_err(|e| e.to_string())?; +- ++ + Ok(TableData { + table_name: tableName, + columns, +@@ -235,49 +237,52 @@ pub async fn storage_update_row( + updates: HashMap, + ) -> Result<(), String> { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + // Validate table name + if !is_valid_table_name(&conn, &tableName)? { + return Err("Invalid table name".to_string()); + } +- ++ + // Build UPDATE query + let set_clauses: Vec = updates + .keys() + .enumerate() + .map(|(idx, key)| format!("{} = ?{}", key, idx + 1)) + .collect(); +- ++ + let where_clauses: Vec = primaryKeyValues + .keys() + .enumerate() + .map(|(idx, key)| format!("{} = ?{}", key, idx + updates.len() + 1)) + .collect(); +- ++ + let query = format!( + "UPDATE {} SET {} WHERE {}", + tableName, + set_clauses.join(", "), + where_clauses.join(" AND ") + ); +- ++ + // Prepare parameters + let mut params: Vec> = Vec::new(); +- ++ + // Add update values + for value in updates.values() { + params.push(json_to_sql_value(value)?); + } +- ++ + // Add where clause values + for value in primaryKeyValues.values() { + params.push(json_to_sql_value(value)?); + } +- ++ + // Execute update +- conn.execute(&query, rusqlite::params_from_iter(params.iter().map(|p| p.as_ref()))) +- .map_err(|e| format!("Failed to update row: {}", e))?; +- ++ conn.execute( ++ &query, ++ rusqlite::params_from_iter(params.iter().map(|p| p.as_ref())), ++ ) ++ .map_err(|e| format!("Failed to update row: {}", e))?; ++ + Ok(()) + } + +@@ -290,35 +295,38 @@ pub async fn storage_delete_row( + primaryKeyValues: HashMap, + ) -> Result<(), String> { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + // Validate table name + if !is_valid_table_name(&conn, &tableName)? { + return Err("Invalid table name".to_string()); + } +- ++ + // Build DELETE query + let where_clauses: Vec = primaryKeyValues + .keys() + .enumerate() + .map(|(idx, key)| format!("{} = ?{}", key, idx + 1)) + .collect(); +- ++ + let query = format!( + "DELETE FROM {} WHERE {}", + tableName, + where_clauses.join(" AND ") + ); +- ++ + // Prepare parameters + let params: Vec> = primaryKeyValues + .values() + .map(json_to_sql_value) + .collect::, _>>()?; +- ++ + // Execute delete +- conn.execute(&query, rusqlite::params_from_iter(params.iter().map(|p| p.as_ref()))) +- .map_err(|e| format!("Failed to delete row: {}", e))?; +- ++ conn.execute( ++ &query, ++ rusqlite::params_from_iter(params.iter().map(|p| p.as_ref())), ++ ) ++ .map_err(|e| format!("Failed to delete row: {}", e))?; ++ + Ok(()) + } + +@@ -331,35 +339,40 @@ pub async fn storage_insert_row( + values: HashMap, + ) -> Result { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + // Validate table name + if !is_valid_table_name(&conn, &tableName)? { + return Err("Invalid table name".to_string()); + } +- ++ + // Build INSERT query + let columns: Vec<&String> = values.keys().collect(); +- let placeholders: Vec = (1..=columns.len()) +- .map(|i| format!("?{}", i)) +- .collect(); +- ++ let placeholders: Vec = (1..=columns.len()).map(|i| format!("?{}", i)).collect(); ++ + let query = format!( + "INSERT INTO {} ({}) VALUES ({})", + tableName, +- columns.iter().map(|c| c.as_str()).collect::>().join(", "), ++ columns ++ .iter() ++ .map(|c| c.as_str()) ++ .collect::>() ++ .join(", "), + placeholders.join(", ") + ); +- ++ + // Prepare parameters + let params: Vec> = values + .values() + .map(json_to_sql_value) + .collect::, _>>()?; +- ++ + // Execute insert +- conn.execute(&query, rusqlite::params_from_iter(params.iter().map(|p| p.as_ref()))) +- .map_err(|e| format!("Failed to insert row: {}", e))?; +- ++ conn.execute( ++ &query, ++ rusqlite::params_from_iter(params.iter().map(|p| p.as_ref())), ++ ) ++ .map_err(|e| format!("Failed to insert row: {}", e))?; ++ + Ok(conn.last_insert_rowid()) + } + +@@ -370,20 +383,20 @@ pub async fn storage_execute_sql( + query: String, + ) -> Result { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + // Check if it's a SELECT query + let is_select = query.trim().to_uppercase().starts_with("SELECT"); +- ++ + if is_select { + // Handle SELECT queries + let mut stmt = conn.prepare(&query).map_err(|e| e.to_string())?; + let column_count = stmt.column_count(); +- ++ + // Get column names + let columns: Vec = (0..column_count) + .map(|i| stmt.column_name(i).unwrap_or("").to_string()) + .collect(); +- ++ + // Execute query and collect results + let rows: Vec> = stmt + .query_map([], |row| { +@@ -399,8 +412,13 @@ pub async fn storage_execute_sql( + JsonValue::String(f.to_string()) + } + } +- ValueRef::Text(s) => JsonValue::String(String::from_utf8_lossy(s).to_string()), +- ValueRef::Blob(b) => JsonValue::String(base64::Engine::encode(&base64::engine::general_purpose::STANDARD, b)), ++ ValueRef::Text(s) => { ++ JsonValue::String(String::from_utf8_lossy(s).to_string()) ++ } ++ ValueRef::Blob(b) => JsonValue::String(base64::Engine::encode( ++ &base64::engine::general_purpose::STANDARD, ++ b, ++ )), + }; + row_values.push(value); + } +@@ -409,7 +427,7 @@ pub async fn storage_execute_sql( + .map_err(|e| e.to_string())? + .collect::>>() + .map_err(|e| e.to_string())?; +- ++ + Ok(QueryResult { + columns, + rows, +@@ -419,7 +437,7 @@ pub async fn storage_execute_sql( + } else { + // Handle non-SELECT queries (INSERT, UPDATE, DELETE, etc.) + let rows_affected = conn.execute(&query, []).map_err(|e| e.to_string())?; +- ++ + Ok(QueryResult { + columns: vec![], + rows: vec![], +@@ -435,13 +453,12 @@ pub async fn storage_reset_database(app: AppHandle) -> Result<(), String> { + { + // Drop all existing tables within a scoped block + let db_state = app.state::(); +- let conn = db_state.0.lock() +- .map_err(|e| e.to_string())?; +- ++ let conn = db_state.0.lock().map_err(|e| e.to_string())?; ++ + // Disable foreign key constraints temporarily to allow dropping tables + conn.execute("PRAGMA foreign_keys = OFF", []) + .map_err(|e| format!("Failed to disable foreign keys: {}", e))?; +- ++ + // Drop tables - order doesn't matter with foreign keys disabled + conn.execute("DROP TABLE IF EXISTS agent_runs", []) + .map_err(|e| format!("Failed to drop agent_runs table: {}", e))?; +@@ -449,34 +466,31 @@ pub async fn storage_reset_database(app: AppHandle) -> Result<(), String> { + .map_err(|e| format!("Failed to drop agents table: {}", e))?; + conn.execute("DROP TABLE IF EXISTS app_settings", []) + .map_err(|e| format!("Failed to drop app_settings table: {}", e))?; +- ++ + // Re-enable foreign key constraints + conn.execute("PRAGMA foreign_keys = ON", []) + .map_err(|e| format!("Failed to re-enable foreign keys: {}", e))?; +- ++ + // Connection is automatically dropped at end of scope + } +- ++ + // Re-initialize the database which will recreate all tables empty + let new_conn = init_database(&app).map_err(|e| format!("Failed to reset database: {}", e))?; +- ++ + // Update the managed state with the new connection + { + let db_state = app.state::(); +- let mut conn_guard = db_state.0.lock() +- .map_err(|e| e.to_string())?; ++ let mut conn_guard = db_state.0.lock().map_err(|e| e.to_string())?; + *conn_guard = new_conn; + } +- ++ + // Run VACUUM to optimize the database + { + let db_state = app.state::(); +- let conn = db_state.0.lock() +- .map_err(|e| e.to_string())?; +- conn.execute("VACUUM", []) +- .map_err(|e| e.to_string())?; ++ let conn = db_state.0.lock().map_err(|e| e.to_string())?; ++ conn.execute("VACUUM", []).map_err(|e| e.to_string())?; + } +- ++ + Ok(()) + } + +@@ -489,7 +503,7 @@ fn is_valid_table_name(conn: &Connection, table_name: &str) -> Result 0) + } + +@@ -513,4 +527,4 @@ fn json_to_sql_value(value: &JsonValue) -> Result, Stri + } + + /// Initialize the agents database (re-exported from agents module) +-use super::agents::init_database; +\ No newline at end of file ++use super::agents::init_database; +diff --git a/src-tauri/src/main.rs b/src-tauri/src/main.rs +index ffc0212..3164fe3 100644 +--- a/src-tauri/src/main.rs ++++ b/src-tauri/src/main.rs +@@ -14,20 +14,21 @@ use commands::agents::{ + get_live_session_output, get_session_output, get_session_status, import_agent, + import_agent_from_file, import_agent_from_github, init_database, kill_agent_session, + list_agent_runs, list_agent_runs_with_metrics, list_agents, list_claude_installations, +- list_running_sessions, load_agent_session_history, set_claude_binary_path, stream_session_output, update_agent, AgentDb, ++ list_running_sessions, load_agent_session_history, set_claude_binary_path, ++ stream_session_output, update_agent, AgentDb, + }; + use commands::claude::{ + cancel_claude_execution, check_auto_checkpoint, check_claude_version, cleanup_old_checkpoints, +- clear_checkpoint_manager, continue_claude_code, create_checkpoint, create_project, execute_claude_code, +- find_claude_md_files, fork_from_checkpoint, get_checkpoint_diff, get_checkpoint_settings, +- get_checkpoint_state_stats, get_claude_session_output, get_claude_settings, get_home_directory, get_project_sessions, +- get_recently_modified_files, get_session_timeline, get_system_prompt, list_checkpoints, +- list_directory_contents, list_projects, list_running_claude_sessions, load_session_history, +- open_new_session, read_claude_md_file, restore_checkpoint, resume_claude_code, +- save_claude_md_file, save_claude_settings, save_system_prompt, search_files, +- track_checkpoint_message, track_session_messages, update_checkpoint_settings, +- get_hooks_config, update_hooks_config, validate_hook_command, +- ClaudeProcessState, ++ clear_checkpoint_manager, continue_claude_code, create_checkpoint, create_project, ++ delete_session, execute_claude_code, find_claude_md_files, fork_from_checkpoint, ++ get_checkpoint_diff, get_checkpoint_settings, get_checkpoint_state_stats, ++ get_claude_session_output, get_claude_settings, get_home_directory, get_hooks_config, ++ get_project_sessions, get_recently_modified_files, get_session_timeline, get_system_prompt, ++ list_checkpoints, list_directory_contents, list_projects, list_running_claude_sessions, ++ load_session_history, open_new_session, read_claude_md_file, restore_checkpoint, ++ resume_claude_code, save_claude_md_file, save_claude_settings, save_system_prompt, ++ search_files, track_checkpoint_message, track_session_messages, update_checkpoint_settings, ++ update_hooks_config, validate_hook_command, ClaudeProcessState, + }; + use commands::mcp::{ + mcp_add, mcp_add_from_claude_desktop, mcp_add_json, mcp_get, mcp_get_server_status, mcp_list, +@@ -35,14 +36,14 @@ use commands::mcp::{ + mcp_serve, mcp_test_connection, + }; + ++use commands::proxy::{apply_proxy_settings, get_proxy_settings, save_proxy_settings}; ++use commands::storage::{ ++ storage_delete_row, storage_execute_sql, storage_insert_row, storage_list_tables, ++ storage_read_table, storage_reset_database, storage_update_row, ++}; + use commands::usage::{ + get_session_stats, get_usage_by_date_range, get_usage_details, get_usage_stats, + }; +-use commands::storage::{ +- storage_list_tables, storage_read_table, storage_update_row, storage_delete_row, +- storage_insert_row, storage_execute_sql, storage_reset_database, +-}; +-use commands::proxy::{get_proxy_settings, save_proxy_settings, apply_proxy_settings}; + use process::ProcessRegistryState; + use std::sync::Mutex; + use tauri::Manager; +@@ -50,19 +51,17 @@ use tauri::Manager; + #[cfg(target_os = "macos")] + use window_vibrancy::{apply_vibrancy, NSVisualEffectMaterial}; + +- + fn main() { + // Initialize logger + env_logger::init(); + +- + tauri::Builder::default() + .plugin(tauri_plugin_dialog::init()) + .plugin(tauri_plugin_shell::init()) + .setup(|app| { + // Initialize agents database + let conn = init_database(&app.handle()).expect("Failed to initialize agents database"); +- ++ + // Load and apply proxy settings from the database + { + let db = AgentDb(Mutex::new(conn)); +@@ -70,7 +69,7 @@ fn main() { + Ok(conn) => { + // Directly query proxy settings from the database + let mut settings = commands::proxy::ProxySettings::default(); +- ++ + let keys = vec![ + ("proxy_enabled", "enabled"), + ("proxy_http", "http_proxy"), +@@ -78,7 +77,7 @@ fn main() { + ("proxy_no", "no_proxy"), + ("proxy_all", "all_proxy"), + ]; +- ++ + for (db_key, field) in keys { + if let Ok(value) = conn.query_row( + "SELECT value FROM app_settings WHERE key = ?1", +@@ -87,15 +86,23 @@ fn main() { + ) { + match field { + "enabled" => settings.enabled = value == "true", +- "http_proxy" => settings.http_proxy = Some(value).filter(|s| !s.is_empty()), +- "https_proxy" => settings.https_proxy = Some(value).filter(|s| !s.is_empty()), +- "no_proxy" => settings.no_proxy = Some(value).filter(|s| !s.is_empty()), +- "all_proxy" => settings.all_proxy = Some(value).filter(|s| !s.is_empty()), ++ "http_proxy" => { ++ settings.http_proxy = Some(value).filter(|s| !s.is_empty()) ++ } ++ "https_proxy" => { ++ settings.https_proxy = Some(value).filter(|s| !s.is_empty()) ++ } ++ "no_proxy" => { ++ settings.no_proxy = Some(value).filter(|s| !s.is_empty()) ++ } ++ "all_proxy" => { ++ settings.all_proxy = Some(value).filter(|s| !s.is_empty()) ++ } + _ => {} + } + } + } +- ++ + log::info!("Loaded proxy settings: enabled={}", settings.enabled); + settings + } +@@ -104,11 +111,11 @@ fn main() { + commands::proxy::ProxySettings::default() + } + }; +- ++ + // Apply the proxy settings + apply_proxy_settings(&proxy_settings); + } +- ++ + // Re-open the connection for the app to manage + let conn = init_database(&app.handle()).expect("Failed to initialize agents database"); + app.manage(AgentDb(Mutex::new(conn))); +@@ -144,7 +151,7 @@ fn main() { + #[cfg(target_os = "macos")] + { + let window = app.get_webview_window("main").unwrap(); +- ++ + // Try different vibrancy materials that support rounded corners + let materials = [ + NSVisualEffectMaterial::UnderWindowBackground, +@@ -153,7 +160,7 @@ fn main() { + NSVisualEffectMaterial::Menu, + NSVisualEffectMaterial::Sidebar, + ]; +- ++ + let mut applied = false; + for material in materials.iter() { + if apply_vibrancy(&window, *material, None, Some(12.0)).is_ok() { +@@ -161,11 +168,16 @@ fn main() { + break; + } + } +- ++ + if !applied { + // Fallback without rounded corners +- apply_vibrancy(&window, NSVisualEffectMaterial::WindowBackground, None, None) +- .expect("Failed to apply any window vibrancy"); ++ apply_vibrancy( ++ &window, ++ NSVisualEffectMaterial::WindowBackground, ++ None, ++ None, ++ ) ++ .expect("Failed to apply any window vibrancy"); + } + } + +@@ -176,6 +188,7 @@ fn main() { + list_projects, + create_project, + get_project_sessions, ++ delete_session, + get_home_directory, + get_claude_settings, + open_new_session, +@@ -199,7 +212,6 @@ fn main() { + get_hooks_config, + update_hooks_config, + validate_hook_command, +- + // Checkpoint Management + create_checkpoint, + restore_checkpoint, +@@ -215,7 +227,6 @@ fn main() { + get_checkpoint_settings, + clear_checkpoint_manager, + get_checkpoint_state_stats, +- + // Agent Management + list_agents, + create_agent, +@@ -245,13 +256,11 @@ fn main() { + fetch_github_agents, + fetch_github_agent_content, + import_agent_from_github, +- + // Usage & Analytics + get_usage_stats, + get_usage_by_date_range, + get_usage_details, + get_session_stats, +- + // MCP (Model Context Protocol) + mcp_add, + mcp_list, +@@ -265,7 +274,6 @@ fn main() { + mcp_get_server_status, + mcp_read_project_config, + mcp_save_project_config, +- + // Storage Management + storage_list_tables, + storage_read_table, +@@ -274,13 +282,11 @@ fn main() { + storage_insert_row, + storage_execute_sql, + storage_reset_database, +- + // Slash Commands + commands::slash_commands::slash_commands_list, + commands::slash_commands::slash_command_get, + commands::slash_commands::slash_command_save, + commands::slash_commands::slash_command_delete, +- + // Proxy Settings + get_proxy_settings, + save_proxy_settings, +diff --git a/src-tauri/src/process/registry.rs b/src-tauri/src/process/registry.rs +index 30c8e94..f4f33b5 100644 +--- a/src-tauri/src/process/registry.rs ++++ b/src-tauri/src/process/registry.rs +@@ -7,13 +7,8 @@ use tokio::process::Child; + /// Type of process being tracked + #[derive(Debug, Clone, Serialize, Deserialize)] + pub enum ProcessType { +- AgentRun { +- agent_id: i64, +- agent_name: String, +- }, +- ClaudeSession { +- session_id: String, +- }, ++ AgentRun { agent_id: i64, agent_name: String }, ++ ClaudeSession { session_id: String }, + } + + /// Information about a running agent process +@@ -72,7 +67,10 @@ impl ProcessRegistry { + ) -> Result<(), String> { + let process_info = ProcessInfo { + run_id, +- process_type: ProcessType::AgentRun { agent_id, agent_name }, ++ process_type: ProcessType::AgentRun { ++ agent_id, ++ agent_name, ++ }, + pid, + started_at: Utc::now(), + project_path, +@@ -96,7 +94,10 @@ impl ProcessRegistry { + ) -> Result<(), String> { + let process_info = ProcessInfo { + run_id, +- process_type: ProcessType::AgentRun { agent_id, agent_name }, ++ process_type: ProcessType::AgentRun { ++ agent_id, ++ agent_name, ++ }, + pid, + started_at: Utc::now(), + project_path, +@@ -106,7 +107,7 @@ impl ProcessRegistry { + + // For sidecar processes, we register without the child handle since it's managed differently + let mut processes = self.processes.lock().map_err(|e| e.to_string())?; +- ++ + let process_handle = ProcessHandle { + info: process_info, + child: Arc::new(Mutex::new(None)), // No tokio::process::Child handle for sidecar +@@ -127,7 +128,7 @@ impl ProcessRegistry { + model: String, + ) -> Result { + let run_id = self.generate_id()?; +- ++ + let process_info = ProcessInfo { + run_id, + process_type: ProcessType::ClaudeSession { session_id }, +@@ -140,7 +141,7 @@ impl ProcessRegistry { + + // Register without child - Claude sessions use ClaudeProcessState for process management + let mut processes = self.processes.lock().map_err(|e| e.to_string())?; +- ++ + let process_handle = ProcessHandle { + info: process_info, + child: Arc::new(Mutex::new(None)), // No child handle for Claude sessions +@@ -175,25 +176,24 @@ impl ProcessRegistry { + let processes = self.processes.lock().map_err(|e| e.to_string())?; + Ok(processes + .values() +- .filter_map(|handle| { +- match &handle.info.process_type { +- ProcessType::ClaudeSession { .. } => Some(handle.info.clone()), +- _ => None, +- } ++ .filter_map(|handle| match &handle.info.process_type { ++ ProcessType::ClaudeSession { .. } => Some(handle.info.clone()), ++ _ => None, + }) + .collect()) + } + + /// Get a specific Claude session by session ID +- pub fn get_claude_session_by_id(&self, session_id: &str) -> Result, String> { ++ pub fn get_claude_session_by_id( ++ &self, ++ session_id: &str, ++ ) -> Result, String> { + let processes = self.processes.lock().map_err(|e| e.to_string())?; + Ok(processes + .values() +- .find(|handle| { +- match &handle.info.process_type { +- ProcessType::ClaudeSession { session_id: sid } => sid == session_id, +- _ => false, +- } ++ .find(|handle| match &handle.info.process_type { ++ ProcessType::ClaudeSession { session_id: sid } => sid == session_id, ++ _ => false, + }) + .map(|handle| handle.info.clone())) + } +@@ -221,11 +221,9 @@ impl ProcessRegistry { + let processes = self.processes.lock().map_err(|e| e.to_string())?; + Ok(processes + .values() +- .filter_map(|handle| { +- match &handle.info.process_type { +- ProcessType::AgentRun { .. } => Some(handle.info.clone()), +- _ => None, +- } ++ .filter_map(|handle| match &handle.info.process_type { ++ ProcessType::AgentRun { .. } => Some(handle.info.clone()), ++ _ => None, + }) + .collect()) + } +@@ -273,17 +271,26 @@ impl ProcessRegistry { + } + } + } else { +- warn!("No child handle available for process {} (PID: {}), attempting system kill", run_id, pid); ++ warn!( ++ "No child handle available for process {} (PID: {}), attempting system kill", ++ run_id, pid ++ ); + false // Process handle not available, try fallback + } + }; + + // If direct kill didn't work, try system command as fallback + if !kill_sent { +- info!("Attempting fallback kill for process {} (PID: {})", run_id, pid); ++ info!( ++ "Attempting fallback kill for process {} (PID: {})", ++ run_id, pid ++ ); + match self.kill_process_by_pid(run_id, pid) { + Ok(true) => return Ok(true), +- Ok(false) => warn!("Fallback kill also failed for process {} (PID: {})", run_id, pid), ++ Ok(false) => warn!( ++ "Fallback kill also failed for process {} (PID: {})", ++ run_id, pid ++ ), + Err(e) => error!("Error during fallback kill: {}", e), + } + // Continue with the rest of the cleanup even if fallback failed +-- +2.50.1 (Apple Git-155) + diff --git a/src-tauri/Cargo.lock b/src-tauri/Cargo.lock index a46d68df9..ec446c6b3 100644 --- a/src-tauri/Cargo.lock +++ b/src-tauri/Cargo.lock @@ -1275,6 +1275,16 @@ dependencies = [ "windows-sys 0.59.0", ] +[[package]] +name = "fix-path-env" +version = "0.0.0" +source = "git+https://github.com/tauri-apps/fix-path-env-rs#c4c45d503ea115a839aae718d02f79e7c7f0f673" +dependencies = [ + "home", + "strip-ansi-escapes", + "thiserror 1.0.69", +] + [[package]] name = "fixedbitset" version = "0.4.2" @@ -1871,6 +1881,15 @@ version = "0.4.3" source = "registry+https://github.com/rust-lang/crates.io-index" checksum = "7f24254aa9a54b5c858eaee2f5bccdb46aaf0e486a595ed5fd8f86ba55232a70" +[[package]] +name = "home" +version = "0.5.11" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "589533453244b0995c858700322199b2becb13b627df2851f64a2775d024abcf" +dependencies = [ + "windows-sys 0.59.0", +] + [[package]] name = "html5ever" version = "0.26.0" @@ -3070,6 +3089,7 @@ dependencies = [ "cocoa", "dirs 5.0.1", "env_logger", + "fix-path-env", "futures", "glob", "libc", @@ -4504,6 +4524,15 @@ dependencies = [ "quote", ] +[[package]] +name = "strip-ansi-escapes" +version = "0.2.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "2a8f8038e7e7969abb3f1b7c2a811225e9296da208539e0f79c5251d6cac0025" +dependencies = [ + "vte", +] + [[package]] name = "strsim" version = "0.11.1" @@ -5649,6 +5678,15 @@ dependencies = [ "libc", ] +[[package]] +name = "vte" +version = "0.14.1" +source = "registry+https://github.com/rust-lang/crates.io-index" +checksum = "231fdcd7ef3037e8330d8e17e61011a2c244126acc0a982f4040ac3f9f0bc077" +dependencies = [ + "memchr", +] + [[package]] name = "walkdir" version = "2.5.0" diff --git a/src-tauri/Cargo.toml b/src-tauri/Cargo.toml index f1b180696..40857941d 100644 --- a/src-tauri/Cargo.toml +++ b/src-tauri/Cargo.toml @@ -20,7 +20,7 @@ crate-type = ["lib", "cdylib", "staticlib"] tauri-build = { version = "2", features = [] } [dependencies] -tauri = { version = "2", features = [ "macos-private-api", "protocol-asset", "tray-icon", "image-png"] } +tauri = { version = "2", features = [ "macos-private-api", "protocol-asset", "tray-icon", "image-png", "devtools"] } tauri-plugin-shell = "2" tauri-plugin-dialog = "2" tauri-plugin-fs = "2" @@ -53,6 +53,7 @@ zstd = "0.13" uuid = { version = "1.6", features = ["v4", "serde"] } walkdir = "2" serde_yaml = "0.9" +fix-path-env = { git = "https://github.com/tauri-apps/fix-path-env-rs" } [target.'cfg(target_os = "macos")'.dependencies] diff --git a/src-tauri/session-deletion-feature.patch b/src-tauri/session-deletion-feature.patch new file mode 100644 index 000000000..81aaba3a7 --- /dev/null +++ b/src-tauri/session-deletion-feature.patch @@ -0,0 +1,2405 @@ +From 5321793fdc0db91119a5b202abeb51208626bf3f Mon Sep 17 00:00:00 2001 +From: iflytwice +Date: Mon, 22 Sep 2025 18:30:17 -0400 +Subject: [PATCH] Feature: Add session deletion functionality + +Implements session deletion feature with delete button on hover for each session card. + +Addresses issue #305 where users requested the ability to delete unnecessary sessions. + +## Changes: +- **Backend**: Added Rust command to remove .jsonl files and associated todo data +- **Frontend**: Added API method with proper error handling +- **UI**: Added Trash2 icon delete button that appears on session card hover +- **UX**: Added confirmation dialog before deletion to prevent accidents +- **State**: Updated local state management to remove sessions immediately after deletion + +## Features: +- Hover-to-reveal delete button for clean UI +- Confirmation dialog with session details +- Deletes both session file and associated todo data +- Proper error handling and user feedback +- Follows project coding standards and documentation guidelines + +Fixes #305 +--- + src-tauri/src/claude_binary.rs | 28 ++- + src-tauri/src/commands/agents.rs | 95 +++++--- + src-tauri/src/commands/claude.rs | 268 +++++++++++++++-------- + src-tauri/src/commands/mod.rs | 6 +- + src-tauri/src/commands/proxy.rs | 47 ++-- + src-tauri/src/commands/slash_commands.rs | 131 +++++------ + src-tauri/src/commands/storage.rs | 196 +++++++++-------- + src-tauri/src/main.rs | 86 ++++---- + src-tauri/src/process/registry.rs | 69 +++--- + 9 files changed, 536 insertions(+), 390 deletions(-) + +diff --git a/src-tauri/src/claude_binary.rs b/src-tauri/src/claude_binary.rs +index 2d1c7e3..9ff0311 100644 +--- a/src-tauri/src/claude_binary.rs ++++ b/src-tauri/src/claude_binary.rs +@@ -47,7 +47,7 @@ pub fn find_claude_binary(app_handle: &tauri::AppHandle) -> Result(0), + ) { + info!("Found stored claude path in database: {}", stored_path); +- ++ + // Check if the path still exists + let path_buf = PathBuf::from(&stored_path); + if path_buf.exists() && path_buf.is_file() { +@@ -56,14 +56,14 @@ pub fn find_claude_binary(app_handle: &tauri::AppHandle) -> Result(0), + ).unwrap_or_else(|_| "system".to_string()); +- ++ + info!("User preference for Claude installation: {}", preference); + } + } +@@ -350,10 +350,10 @@ fn get_claude_version(path: &str) -> Result, String> { + /// Extract version string from command output + fn extract_version_from_output(stdout: &[u8]) -> Option { + let output_str = String::from_utf8_lossy(stdout); +- ++ + // Debug log the raw output + debug!("Raw version output: {:?}", output_str); +- ++ + // Use regex to directly extract version pattern (e.g., "1.0.41") + // This pattern matches: + // - One or more digits, followed by +@@ -362,8 +362,9 @@ fn extract_version_from_output(stdout: &[u8]) -> Option { + // - A dot, followed by + // - One or more digits + // - Optionally followed by pre-release/build metadata +- let version_regex = regex::Regex::new(r"(\d+\.\d+\.\d+(?:-[a-zA-Z0-9.-]+)?(?:\+[a-zA-Z0-9.-]+)?)").ok()?; +- ++ let version_regex = ++ regex::Regex::new(r"(\d+\.\d+\.\d+(?:-[a-zA-Z0-9.-]+)?(?:\+[a-zA-Z0-9.-]+)?)").ok()?; ++ + if let Some(captures) = version_regex.captures(&output_str) { + if let Some(version_match) = captures.get(1) { + let version = version_match.as_str().to_string(); +@@ -371,7 +372,7 @@ fn extract_version_from_output(stdout: &[u8]) -> Option { + return Some(version); + } + } +- ++ + debug!("No version found in output"); + None + } +@@ -451,7 +452,7 @@ fn compare_versions(a: &str, b: &str) -> Ordering { + /// This ensures commands like Claude can find Node.js and other dependencies + pub fn create_command_with_env(program: &str) -> Command { + let mut cmd = Command::new(program); +- ++ + info!("Creating command for: {}", program); + + // Inherit essential environment variables from parent process +@@ -479,7 +480,7 @@ pub fn create_command_with_env(program: &str) -> Command { + cmd.env(&key, &value); + } + } +- ++ + // Log proxy-related environment variables for debugging + info!("Command will use proxy settings:"); + if let Ok(http_proxy) = std::env::var("HTTP_PROXY") { +@@ -502,7 +503,7 @@ pub fn create_command_with_env(program: &str) -> Command { + } + } + } +- ++ + // Add Homebrew support if the program is in a Homebrew directory + if program.contains("/homebrew/") || program.contains("/opt/homebrew/") { + if let Some(program_dir) = std::path::Path::new(program).parent() { +@@ -511,7 +512,10 @@ pub fn create_command_with_env(program: &str) -> Command { + let homebrew_bin_str = program_dir.to_string_lossy(); + if !current_path.contains(&homebrew_bin_str.as_ref()) { + let new_path = format!("{}:{}", homebrew_bin_str, current_path); +- debug!("Adding Homebrew bin directory to PATH: {}", homebrew_bin_str); ++ debug!( ++ "Adding Homebrew bin directory to PATH: {}", ++ homebrew_bin_str ++ ); + cmd.env("PATH", new_path); + } + } +diff --git a/src-tauri/src/commands/agents.rs b/src-tauri/src/commands/agents.rs +index b988ce7..36513a7 100644 +--- a/src-tauri/src/commands/agents.rs ++++ b/src-tauri/src/commands/agents.rs +@@ -179,7 +179,10 @@ pub async fn read_session_jsonl(session_id: &str, project_path: &str) -> Result< + let session_file = project_dir.join(format!("{}.jsonl", session_id)); + + if !session_file.exists() { +- return Err(format!("Session file not found: {}", session_file.display())); ++ return Err(format!( ++ "Session file not found: {}", ++ session_file.display() ++ )); + } + + match tokio::fs::read_to_string(&session_file).await { +@@ -317,7 +320,6 @@ pub fn init_database(app: &AppHandle) -> SqliteResult { + [], + )?; + +- + // Create settings table for app-wide settings + conn.execute( + "CREATE TABLE IF NOT EXISTS app_settings ( +@@ -690,38 +692,41 @@ pub async fn execute_agent( + // Get the agent from database + let agent = get_agent(db.clone(), agent_id).await?; + let execution_model = model.unwrap_or(agent.model.clone()); +- ++ + // Create .claude/settings.json with agent hooks if it doesn't exist + if let Some(hooks_json) = &agent.hooks { + let claude_dir = std::path::Path::new(&project_path).join(".claude"); + let settings_path = claude_dir.join("settings.json"); +- ++ + // Create .claude directory if it doesn't exist + if !claude_dir.exists() { + std::fs::create_dir_all(&claude_dir) + .map_err(|e| format!("Failed to create .claude directory: {}", e))?; + info!("Created .claude directory at: {:?}", claude_dir); + } +- ++ + // Check if settings.json already exists + if !settings_path.exists() { + // Parse the hooks JSON + let hooks: serde_json::Value = serde_json::from_str(hooks_json) + .map_err(|e| format!("Failed to parse agent hooks: {}", e))?; +- ++ + // Create a settings object with just the hooks + let settings = serde_json::json!({ + "hooks": hooks + }); +- ++ + // Write the settings file + let settings_content = serde_json::to_string_pretty(&settings) + .map_err(|e| format!("Failed to serialize settings: {}", e))?; +- ++ + std::fs::write(&settings_path, settings_content) + .map_err(|e| format!("Failed to write settings.json: {}", e))?; +- +- info!("Created settings.json with agent hooks at: {:?}", settings_path); ++ ++ info!( ++ "Created settings.json with agent hooks at: {:?}", ++ settings_path ++ ); + } else { + info!("settings.json already exists at: {:?}", settings_path); + } +@@ -775,7 +780,8 @@ pub async fn execute_agent( + execution_model, + db, + registry, +- ).await ++ ) ++ .await + } + + /// Creates a system binary command for agent execution +@@ -785,17 +791,17 @@ fn create_agent_system_command( + project_path: &str, + ) -> Command { + let mut cmd = create_command_with_env(claude_path); +- ++ + // Add all arguments + for arg in args { + cmd.arg(arg); + } +- ++ + cmd.current_dir(project_path) + .stdin(Stdio::null()) + .stdout(Stdio::piped()) + .stderr(Stdio::piped()); +- ++ + cmd + } + +@@ -905,14 +911,15 @@ async fn spawn_agent_system( + // Extract session ID from JSONL output + if let Ok(json) = serde_json::from_str::(&line) { + // Claude Code uses "session_id" (underscore), not "sessionId" +- if json.get("type").and_then(|t| t.as_str()) == Some("system") && +- json.get("subtype").and_then(|s| s.as_str()) == Some("init") { ++ if json.get("type").and_then(|t| t.as_str()) == Some("system") ++ && json.get("subtype").and_then(|s| s.as_str()) == Some("init") ++ { + if let Some(sid) = json.get("session_id").and_then(|s| s.as_str()) { + if let Ok(mut current_session_id) = session_id_clone.lock() { + if current_session_id.is_empty() { + *current_session_id = sid.to_string(); + info!("🔑 Extracted session ID: {}", sid); +- ++ + // Update database immediately with session ID + if let Ok(conn) = Connection::open(&db_path_for_stdout) { + match conn.execute( +@@ -925,7 +932,10 @@ async fn spawn_agent_system( + } + } + Err(e) => { +- error!("❌ Failed to update session ID immediately: {}", e); ++ error!( ++ "❌ Failed to update session ID immediately: {}", ++ e ++ ); + } + } + } +@@ -1085,7 +1095,10 @@ async fn spawn_agent_system( + + // Update the run record with session ID and mark as completed - open a new connection + if let Ok(conn) = Connection::open(&db_path_for_monitor) { +- info!("🔄 Updating database with extracted session ID: {}", extracted_session_id); ++ info!( ++ "🔄 Updating database with extracted session ID: {}", ++ extracted_session_id ++ ); + match conn.execute( + "UPDATE agent_runs SET session_id = ?1, status = 'completed', completed_at = CURRENT_TIMESTAMP WHERE id = ?2", + params![extracted_session_id, run_id], +@@ -1102,7 +1115,10 @@ async fn spawn_agent_system( + } + } + } else { +- error!("❌ Failed to open database to update session ID for run {}", run_id); ++ error!( ++ "❌ Failed to open database to update session ID for run {}", ++ run_id ++ ); + } + + // Cleanup will be handled by the cleanup_finished_processes function +@@ -1162,10 +1178,8 @@ pub async fn list_running_sessions( + // Cross-check with the process registry to ensure accuracy + // Get actually running processes from the registry + let registry_processes = registry.0.get_running_agent_processes()?; +- let registry_run_ids: std::collections::HashSet = registry_processes +- .iter() +- .map(|p| p.run_id) +- .collect(); ++ let registry_run_ids: std::collections::HashSet = ++ registry_processes.iter().map(|p| p.run_id).collect(); + + // Filter out any database entries that aren't actually running in the registry + // This handles cases where processes crashed without updating the database +@@ -1358,7 +1372,7 @@ pub async fn get_session_output( + + // Find the correct project directory by searching for the session file + let projects_dir = claude_dir.join("projects"); +- ++ + // Check if projects directory exists + if !projects_dir.exists() { + log::error!("Projects directory not found at: {:?}", projects_dir); +@@ -1367,15 +1381,18 @@ pub async fn get_session_output( + + // Search for the session file in all project directories + let mut session_file_path = None; +- log::info!("Searching for session file {} in all project directories", run.session_id); +- ++ log::info!( ++ "Searching for session file {} in all project directories", ++ run.session_id ++ ); ++ + if let Ok(entries) = std::fs::read_dir(&projects_dir) { + for entry in entries.filter_map(Result::ok) { + let path = entry.path(); + if path.is_dir() { + let dir_name = path.file_name().unwrap_or_default().to_string_lossy(); + log::debug!("Checking project directory: {}", dir_name); +- ++ + let potential_session_file = path.join(format!("{}.jsonl", run.session_id)); + if potential_session_file.exists() { + log::info!("Found session file at: {:?}", potential_session_file); +@@ -1395,7 +1412,11 @@ pub async fn get_session_output( + match tokio::fs::read_to_string(&session_path).await { + Ok(content) => Ok(content), + Err(e) => { +- log::error!("Failed to read session file {}: {}", session_path.display(), e); ++ log::error!( ++ "Failed to read session file {}: {}", ++ session_path.display(), ++ e ++ ); + // Fallback to live output if file read fails + let live_output = registry.0.get_live_output(run_id)?; + Ok(live_output) +@@ -1403,7 +1424,10 @@ pub async fn get_session_output( + } + } else { + // If session file not found, try the old method as fallback +- log::warn!("Session file not found for {}, trying legacy method", run.session_id); ++ log::warn!( ++ "Session file not found for {}, trying legacy method", ++ run.session_id ++ ); + match read_session_jsonl(&run.session_id, &run.project_path).await { + Ok(content) => Ok(content), + Err(_) => { +@@ -1916,7 +1940,7 @@ pub async fn load_agent_session_history( + .join(".claude"); + + let projects_dir = claude_dir.join("projects"); +- ++ + if !projects_dir.exists() { + log::error!("Projects directory not found at: {:?}", projects_dir); + return Err("Projects directory not found".to_string()); +@@ -1924,15 +1948,18 @@ pub async fn load_agent_session_history( + + // Search for the session file in all project directories + let mut session_file_path = None; +- log::info!("Searching for session file {} in all project directories", session_id); +- ++ log::info!( ++ "Searching for session file {} in all project directories", ++ session_id ++ ); ++ + if let Ok(entries) = std::fs::read_dir(&projects_dir) { + for entry in entries.filter_map(Result::ok) { + let path = entry.path(); + if path.is_dir() { + let dir_name = path.file_name().unwrap_or_default().to_string_lossy(); + log::debug!("Checking project directory: {}", dir_name); +- ++ + let potential_session_file = path.join(format!("{}.jsonl", session_id)); + if potential_session_file.exists() { + log::info!("Found session file at: {:?}", potential_session_file); +diff --git a/src-tauri/src/commands/claude.rs b/src-tauri/src/commands/claude.rs +index 94ad3c5..c8787e2 100644 +--- a/src-tauri/src/commands/claude.rs ++++ b/src-tauri/src/commands/claude.rs +@@ -10,7 +10,6 @@ use tauri::{AppHandle, Emitter, Manager}; + use tokio::process::{Child, Command}; + use tokio::sync::Mutex; + +- + /// Global state to track current Claude process + pub struct ClaudeProcessState { + pub current_process: Arc>>, +@@ -262,7 +261,7 @@ fn create_command_with_env(program: &str) -> Command { + } + } + } +- ++ + // Add Homebrew support if the program is in a Homebrew directory + if program.contains("/homebrew/") || program.contains("/opt/homebrew/") { + if let Some(program_dir) = std::path::Path::new(program).parent() { +@@ -270,7 +269,10 @@ fn create_command_with_env(program: &str) -> Command { + let homebrew_bin_str = program_dir.to_string_lossy(); + if !current_path.contains(&homebrew_bin_str.as_ref()) { + let new_path = format!("{}:{}", homebrew_bin_str, current_path); +- log::debug!("Adding Homebrew bin directory to PATH: {}", homebrew_bin_str); ++ log::debug!( ++ "Adding Homebrew bin directory to PATH: {}", ++ homebrew_bin_str ++ ); + tokio_cmd.env("PATH", new_path); + } + } +@@ -280,22 +282,18 @@ fn create_command_with_env(program: &str) -> Command { + } + + /// Creates a system binary command with the given arguments +-fn create_system_command( +- claude_path: &str, +- args: Vec, +- project_path: &str, +-) -> Command { ++fn create_system_command(claude_path: &str, args: Vec, project_path: &str) -> Command { + let mut cmd = create_command_with_env(claude_path); +- ++ + // Add all arguments + for arg in args { + cmd.arg(arg); + } +- ++ + cmd.current_dir(project_path) + .stdout(Stdio::piped()) + .stderr(Stdio::piped()); +- ++ + cmd + } + +@@ -307,7 +305,6 @@ pub async fn get_home_directory() -> Result { + .ok_or_else(|| "Could not determine home directory".to_string()) + } + +- + /// Lists all projects in the ~/.claude/projects directory + #[tauri::command] + pub async fn list_projects() -> Result, String> { +@@ -361,7 +358,7 @@ pub async fn list_projects() -> Result, String> { + // List all JSONL files (sessions) in this project directory + let mut sessions = Vec::new(); + let mut most_recent_session: Option = None; +- ++ + if let Ok(session_entries) = fs::read_dir(&path) { + for session_entry in session_entries.flatten() { + let session_path = session_entry.path(); +@@ -371,7 +368,7 @@ pub async fn list_projects() -> Result, String> { + if let Some(session_id) = session_path.file_stem().and_then(|s| s.to_str()) + { + sessions.push(session_id.to_string()); +- ++ + // Track the most recent session timestamp + if let Ok(metadata) = fs::metadata(&session_path) { + let modified = metadata +@@ -380,7 +377,7 @@ pub async fn list_projects() -> Result, String> { + .duration_since(UNIX_EPOCH) + .unwrap_or_default() + .as_secs(); +- ++ + most_recent_session = Some(match most_recent_session { + Some(current) => current.max(modified), + None => modified, +@@ -420,31 +417,31 @@ pub async fn list_projects() -> Result, String> { + #[tauri::command] + pub async fn create_project(path: String) -> Result { + log::info!("Creating project for path: {}", path); +- ++ + // Encode the path to create a project ID + let project_id = path.replace('/', "-"); +- ++ + // Get claude directory + let claude_dir = get_claude_dir().map_err(|e| e.to_string())?; + let projects_dir = claude_dir.join("projects"); +- ++ + // Create projects directory if it doesn't exist + if !projects_dir.exists() { + fs::create_dir_all(&projects_dir) + .map_err(|e| format!("Failed to create projects directory: {}", e))?; + } +- ++ + // Create project directory if it doesn't exist + let project_dir = projects_dir.join(&project_id); + if !project_dir.exists() { + fs::create_dir_all(&project_dir) + .map_err(|e| format!("Failed to create project directory: {}", e))?; + } +- ++ + // Get creation time + let metadata = fs::metadata(&project_dir) + .map_err(|e| format!("Failed to read directory metadata: {}", e))?; +- ++ + let created_at = metadata + .created() + .or_else(|_| metadata.modified()) +@@ -452,7 +449,7 @@ pub async fn create_project(path: String) -> Result { + .duration_since(UNIX_EPOCH) + .unwrap_or_default() + .as_secs(); +- ++ + // Return the created project + Ok(Project { + id: project_id, +@@ -648,7 +645,8 @@ pub async fn check_claude_version(app: AppHandle) -> Result Result { + let stdout = String::from_utf8_lossy(&output.stdout).to_string(); + let stderr = String::from_utf8_lossy(&output.stderr).to_string(); +- ++ + // Use regex to directly extract version pattern (e.g., "1.0.41") +- let version_regex = regex::Regex::new(r"(\d+\.\d+\.\d+(?:-[a-zA-Z0-9.-]+)?(?:\+[a-zA-Z0-9.-]+)?)").ok(); +- ++ let version_regex = ++ regex::Regex::new(r"(\d+\.\d+\.\d+(?:-[a-zA-Z0-9.-]+)?(?:\+[a-zA-Z0-9.-]+)?)") ++ .ok(); ++ + let version = if let Some(regex) = version_regex { +- regex.captures(&stdout) ++ regex ++ .captures(&stdout) + .and_then(|captures| captures.get(1)) + .map(|m| m.as_str().to_string()) + } else { + None + }; +- ++ + let full_output = if stderr.is_empty() { + stdout.clone() + } else { +@@ -907,8 +908,6 @@ pub async fn load_session_history( + Ok(messages) + } + +- +- + /// Execute a new interactive Claude Code session with streaming output + #[tauri::command] + pub async fn execute_claude_code( +@@ -924,7 +923,7 @@ pub async fn execute_claude_code( + ); + + let claude_path = find_claude_binary(&app)?; +- ++ + let args = vec![ + "-p".to_string(), + prompt.clone(), +@@ -955,7 +954,7 @@ pub async fn continue_claude_code( + ); + + let claude_path = find_claude_binary(&app)?; +- ++ + let args = vec![ + "-c".to_string(), // Continue flag + "-p".to_string(), +@@ -989,7 +988,7 @@ pub async fn resume_claude_code( + ); + + let claude_path = find_claude_binary(&app)?; +- ++ + let args = vec![ + "--resume".to_string(), + session_id.clone(), +@@ -1026,8 +1025,12 @@ pub async fn cancel_claude_execution( + let registry = app.state::(); + match registry.0.get_claude_session_by_id(sid) { + Ok(Some(process_info)) => { +- log::info!("Found process in registry for session {}: run_id={}, PID={}", +- sid, process_info.run_id, process_info.pid); ++ log::info!( ++ "Found process in registry for session {}: run_id={}, PID={}", ++ sid, ++ process_info.run_id, ++ process_info.pid ++ ); + match registry.0.kill_process(process_info.run_id).await { + Ok(success) => { + if success { +@@ -1060,7 +1063,10 @@ pub async fn cancel_claude_execution( + if let Some(mut child) = current_process.take() { + // Try to get the PID before killing + let pid = child.id(); +- log::info!("Attempting to kill Claude process via ClaudeProcessState with PID: {:?}", pid); ++ log::info!( ++ "Attempting to kill Claude process via ClaudeProcessState with PID: {:?}", ++ pid ++ ); + + // Kill the process + match child.kill().await { +@@ -1069,8 +1075,11 @@ pub async fn cancel_claude_execution( + killed = true; + } + Err(e) => { +- log::error!("Failed to kill Claude process via ClaudeProcessState: {}", e); +- ++ log::error!( ++ "Failed to kill Claude process via ClaudeProcessState: {}", ++ e ++ ); ++ + // Method 3: If we have a PID, try system kill as last resort + if let Some(pid) = pid { + log::info!("Attempting system kill as last resort for PID: {}", pid); +@@ -1083,7 +1092,7 @@ pub async fn cancel_claude_execution( + .args(["-KILL", &pid.to_string()]) + .output() + }; +- ++ + match kill_result { + Ok(output) if output.status.success() => { + log::info!("Successfully killed process via system command"); +@@ -1116,18 +1125,18 @@ pub async fn cancel_claude_execution( + tokio::time::sleep(tokio::time::Duration::from_millis(100)).await; + let _ = app.emit(&format!("claude-complete:{}", sid), false); + } +- ++ + // Also emit generic events for backward compatibility + let _ = app.emit("claude-cancelled", true); + tokio::time::sleep(tokio::time::Duration::from_millis(100)).await; + let _ = app.emit("claude-complete", false); +- ++ + if killed { + log::info!("Claude process cancellation completed successfully"); + } else if !attempted_methods.is_empty() { + log::warn!("Claude process cancellation attempted but process may have already exited. Attempted methods: {:?}", attempted_methods); + } +- ++ + Ok(()) + } + +@@ -1154,9 +1163,15 @@ pub async fn get_claude_session_output( + } + + /// Helper function to spawn Claude process and handle streaming +-async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, model: String, project_path: String) -> Result<(), String> { +- use tokio::io::{AsyncBufReadExt, BufReader}; ++async fn spawn_claude_process( ++ app: AppHandle, ++ mut cmd: Command, ++ prompt: String, ++ model: String, ++ project_path: String, ++) -> Result<(), String> { + use std::sync::Mutex; ++ use tokio::io::{AsyncBufReadExt, BufReader}; + + // Spawn the process + let mut child = cmd +@@ -1169,10 +1184,7 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, + + // Get the child PID for logging + let pid = child.id().unwrap_or(0); +- log::info!( +- "Spawned Claude process with PID: {:?}", +- pid +- ); ++ log::info!("Spawned Claude process with PID: {:?}", pid); + + // Create readers first (before moving child) + let stdout_reader = BufReader::new(stdout); +@@ -1207,7 +1219,7 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, + let mut lines = stdout_reader.lines(); + while let Ok(Some(line)) = lines.next_line().await { + log::debug!("Claude stdout: {}", line); +- ++ + // Parse the line to check for init message with session ID + if let Ok(msg) = serde_json::from_str::(&line) { + if msg["type"] == "system" && msg["subtype"] == "init" { +@@ -1216,7 +1228,7 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, + if session_id_guard.is_none() { + *session_id_guard = Some(claude_session_id.to_string()); + log::info!("Extracted Claude session ID: {}", claude_session_id); +- ++ + // Now register with ProcessRegistry using Claude's session ID + match registry_clone.register_claude_session( + claude_session_id.to_string(), +@@ -1238,12 +1250,12 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, + } + } + } +- ++ + // Store live output in registry if we have a run_id + if let Some(run_id) = *run_id_holder_clone.lock().unwrap() { + let _ = registry_clone.append_live_output(run_id, &line); + } +- ++ + // Emit the line to the frontend with session isolation if we have session ID + if let Some(ref session_id) = *session_id_holder_clone.lock().unwrap() { + let _ = app_handle.emit(&format!("claude-output:{}", session_id), &line); +@@ -1287,10 +1299,8 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, + // Add a small delay to ensure all messages are processed + tokio::time::sleep(tokio::time::Duration::from_millis(100)).await; + if let Some(ref session_id) = *session_id_holder_clone3.lock().unwrap() { +- let _ = app_handle_wait.emit( +- &format!("claude-complete:{}", session_id), +- status.success(), +- ); ++ let _ = app_handle_wait ++ .emit(&format!("claude-complete:{}", session_id), status.success()); + } + // Also emit to the generic event for backward compatibility + let _ = app_handle_wait.emit("claude-complete", status.success()); +@@ -1300,8 +1310,8 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, + // Add a small delay to ensure all messages are processed + tokio::time::sleep(tokio::time::Duration::from_millis(100)).await; + if let Some(ref session_id) = *session_id_holder_clone3.lock().unwrap() { +- let _ = app_handle_wait +- .emit(&format!("claude-complete:{}", session_id), false); ++ let _ = ++ app_handle_wait.emit(&format!("claude-complete:{}", session_id), false); + } + // Also emit to the generic event for backward compatibility + let _ = app_handle_wait.emit("claude-complete", false); +@@ -1321,7 +1331,6 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, + Ok(()) + } + +- + /// Lists files and directories in a given path + #[tauri::command] + pub async fn list_directory_contents(directory_path: String) -> Result, String> { +@@ -2038,78 +2047,92 @@ pub async fn track_session_messages( + + /// Gets hooks configuration from settings at specified scope + #[tauri::command] +-pub async fn get_hooks_config(scope: String, project_path: Option) -> Result { +- log::info!("Getting hooks config for scope: {}, project: {:?}", scope, project_path); ++pub async fn get_hooks_config( ++ scope: String, ++ project_path: Option, ++) -> Result { ++ log::info!( ++ "Getting hooks config for scope: {}, project: {:?}", ++ scope, ++ project_path ++ ); + + let settings_path = match scope.as_str() { +- "user" => { +- get_claude_dir() +- .map_err(|e| e.to_string())? +- .join("settings.json") +- }, ++ "user" => get_claude_dir() ++ .map_err(|e| e.to_string())? ++ .join("settings.json"), + "project" => { + let path = project_path.ok_or("Project path required for project scope")?; + PathBuf::from(path).join(".claude").join("settings.json") +- }, ++ } + "local" => { + let path = project_path.ok_or("Project path required for local scope")?; +- PathBuf::from(path).join(".claude").join("settings.local.json") +- }, +- _ => return Err("Invalid scope".to_string()) ++ PathBuf::from(path) ++ .join(".claude") ++ .join("settings.local.json") ++ } ++ _ => return Err("Invalid scope".to_string()), + }; + + if !settings_path.exists() { +- log::info!("Settings file does not exist at {:?}, returning empty hooks", settings_path); ++ log::info!( ++ "Settings file does not exist at {:?}, returning empty hooks", ++ settings_path ++ ); + return Ok(serde_json::json!({})); + } + + let content = fs::read_to_string(&settings_path) + .map_err(|e| format!("Failed to read settings: {}", e))?; +- +- let settings: serde_json::Value = serde_json::from_str(&content) +- .map_err(|e| format!("Failed to parse settings: {}", e))?; +- +- Ok(settings.get("hooks").cloned().unwrap_or(serde_json::json!({}))) ++ ++ let settings: serde_json::Value = ++ serde_json::from_str(&content).map_err(|e| format!("Failed to parse settings: {}", e))?; ++ ++ Ok(settings ++ .get("hooks") ++ .cloned() ++ .unwrap_or(serde_json::json!({}))) + } + + /// Updates hooks configuration in settings at specified scope + #[tauri::command] + pub async fn update_hooks_config( +- scope: String, ++ scope: String, + hooks: serde_json::Value, +- project_path: Option ++ project_path: Option, + ) -> Result { +- log::info!("Updating hooks config for scope: {}, project: {:?}", scope, project_path); ++ log::info!( ++ "Updating hooks config for scope: {}, project: {:?}", ++ scope, ++ project_path ++ ); + + let settings_path = match scope.as_str() { +- "user" => { +- get_claude_dir() +- .map_err(|e| e.to_string())? +- .join("settings.json") +- }, ++ "user" => get_claude_dir() ++ .map_err(|e| e.to_string())? ++ .join("settings.json"), + "project" => { + let path = project_path.ok_or("Project path required for project scope")?; + let claude_dir = PathBuf::from(path).join(".claude"); + fs::create_dir_all(&claude_dir) + .map_err(|e| format!("Failed to create .claude directory: {}", e))?; + claude_dir.join("settings.json") +- }, ++ } + "local" => { + let path = project_path.ok_or("Project path required for local scope")?; + let claude_dir = PathBuf::from(path).join(".claude"); + fs::create_dir_all(&claude_dir) + .map_err(|e| format!("Failed to create .claude directory: {}", e))?; + claude_dir.join("settings.local.json") +- }, +- _ => return Err("Invalid scope".to_string()) ++ } ++ _ => return Err("Invalid scope".to_string()), + }; + + // Read existing settings or create new + let mut settings = if settings_path.exists() { + let content = fs::read_to_string(&settings_path) + .map_err(|e| format!("Failed to read settings: {}", e))?; +- serde_json::from_str(&content) +- .map_err(|e| format!("Failed to parse settings: {}", e))? ++ serde_json::from_str(&content).map_err(|e| format!("Failed to parse settings: {}", e))? + } else { + serde_json::json!({}) + }; +@@ -2120,7 +2143,7 @@ pub async fn update_hooks_config( + // Write back with pretty formatting + let json_string = serde_json::to_string_pretty(&settings) + .map_err(|e| format!("Failed to serialize settings: {}", e))?; +- ++ + fs::write(&settings_path, json_string) + .map_err(|e| format!("Failed to write settings: {}", e))?; + +@@ -2135,9 +2158,9 @@ pub async fn validate_hook_command(command: String) -> Result { + if output.status.success() { +@@ -2153,6 +2176,63 @@ pub async fn validate_hook_command(command: String) -> Result Err(format!("Failed to validate command: {}", e)) ++ Err(e) => Err(format!("Failed to validate command: {}", e)), ++ } ++} ++ ++/// Deletes a session file and its associated data ++/// ++/// This function removes a session's JSONL file from the project directory ++/// and also cleans up any associated todo data if it exists. ++/// ++/// # Arguments ++/// * `session_id` - The UUID of the session to delete ++/// * `project_id` - The ID of the project containing the session ++/// ++/// # Returns ++/// * `Ok(String)` - Success message with session ID ++/// * `Err(String)` - Error message if deletion fails ++/// ++/// # Errors ++/// * Project directory not found ++/// * Permission denied when deleting files ++/// * File system errors during deletion ++#[tauri::command] ++pub async fn delete_session(session_id: String, project_id: String) -> Result { ++ log::info!( ++ "Deleting session: {} from project: {}", ++ session_id, ++ project_id ++ ); ++ ++ let claude_dir = get_claude_dir().map_err(|e| e.to_string())?; ++ let project_dir = claude_dir.join("projects").join(&project_id); ++ ++ // Check if project directory exists ++ if !project_dir.exists() { ++ return Err(format!("Project directory not found: {}", project_id)); ++ } ++ ++ // Delete the session JSONL file ++ let session_file = project_dir.join(format!("{}.jsonl", session_id)); ++ if session_file.exists() { ++ fs::remove_file(&session_file) ++ .map_err(|e| format!("Failed to delete session file: {}", e))?; ++ log::info!("Deleted session file: {:?}", session_file); ++ } else { ++ log::warn!("Session file not found: {:?}", session_file); ++ } ++ ++ // Delete associated todo data if it exists ++ let todos_dir = project_dir.join("todos"); ++ if todos_dir.exists() { ++ let todo_file = todos_dir.join(format!("{}.json", session_id)); ++ if todo_file.exists() { ++ fs::remove_file(&todo_file) ++ .map_err(|e| format!("Failed to delete todo file: {}", e))?; ++ log::info!("Deleted todo file: {:?}", todo_file); ++ } + } ++ ++ Ok(format!("Session {} deleted successfully", session_id)) + } +diff --git a/src-tauri/src/commands/mod.rs b/src-tauri/src/commands/mod.rs +index a0fa7e8..f4c3552 100644 +--- a/src-tauri/src/commands/mod.rs ++++ b/src-tauri/src/commands/mod.rs +@@ -1,7 +1,7 @@ + pub mod agents; + pub mod claude; + pub mod mcp; +-pub mod usage; +-pub mod storage; +-pub mod slash_commands; + pub mod proxy; ++pub mod slash_commands; ++pub mod storage; ++pub mod usage; +diff --git a/src-tauri/src/commands/proxy.rs b/src-tauri/src/commands/proxy.rs +index e2454ec..2192e0e 100644 +--- a/src-tauri/src/commands/proxy.rs ++++ b/src-tauri/src/commands/proxy.rs +@@ -1,6 +1,6 @@ ++use rusqlite::params; + use serde::{Deserialize, Serialize}; + use tauri::State; +-use rusqlite::params; + + use crate::commands::agents::AgentDb; + +@@ -29,9 +29,9 @@ impl Default for ProxySettings { + #[tauri::command] + pub async fn get_proxy_settings(db: State<'_, AgentDb>) -> Result { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + let mut settings = ProxySettings::default(); +- ++ + // Query each proxy setting + let keys = vec![ + ("proxy_enabled", "enabled"), +@@ -40,7 +40,7 @@ pub async fn get_proxy_settings(db: State<'_, AgentDb>) -> Result) -> Result Result<(), String> { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + // Save each setting + let values = vec![ + ("proxy_enabled", settings.enabled.to_string()), +- ("proxy_http", settings.http_proxy.clone().unwrap_or_default()), +- ("proxy_https", settings.https_proxy.clone().unwrap_or_default()), ++ ( ++ "proxy_http", ++ settings.http_proxy.clone().unwrap_or_default(), ++ ), ++ ( ++ "proxy_https", ++ settings.https_proxy.clone().unwrap_or_default(), ++ ), + ("proxy_no", settings.no_proxy.clone().unwrap_or_default()), + ("proxy_all", settings.all_proxy.clone().unwrap_or_default()), + ]; +- ++ + for (key, value) in values { + conn.execute( + "INSERT OR REPLACE INTO app_settings (key, value) VALUES (?1, ?2)", + params![key, value], +- ).map_err(|e| format!("Failed to save {}: {}", key, e))?; ++ ) ++ .map_err(|e| format!("Failed to save {}: {}", key, e))?; + } +- ++ + // Apply the proxy settings immediately to the current process + apply_proxy_settings(&settings); +- ++ + Ok(()) + } + + /// Apply proxy settings as environment variables + pub fn apply_proxy_settings(settings: &ProxySettings) { + log::info!("Applying proxy settings: enabled={}", settings.enabled); +- ++ + if !settings.enabled { + // Clear proxy environment variables if disabled + log::info!("Clearing proxy environment variables"); +@@ -109,7 +116,7 @@ pub fn apply_proxy_settings(settings: &ProxySettings) { + std::env::remove_var("all_proxy"); + return; + } +- ++ + // Ensure NO_PROXY includes localhost by default + let mut no_proxy_list = vec!["localhost", "127.0.0.1", "::1", "0.0.0.0"]; + if let Some(user_no_proxy) = &settings.no_proxy { +@@ -118,7 +125,7 @@ pub fn apply_proxy_settings(settings: &ProxySettings) { + } + } + let no_proxy_value = no_proxy_list.join(","); +- ++ + // Set proxy environment variables (uppercase is standard) + if let Some(http_proxy) = &settings.http_proxy { + if !http_proxy.is_empty() { +@@ -126,25 +133,25 @@ pub fn apply_proxy_settings(settings: &ProxySettings) { + std::env::set_var("HTTP_PROXY", http_proxy); + } + } +- ++ + if let Some(https_proxy) = &settings.https_proxy { + if !https_proxy.is_empty() { + log::info!("Setting HTTPS_PROXY={}", https_proxy); + std::env::set_var("HTTPS_PROXY", https_proxy); + } + } +- ++ + // Always set NO_PROXY to include localhost + log::info!("Setting NO_PROXY={}", no_proxy_value); + std::env::set_var("NO_PROXY", &no_proxy_value); +- ++ + if let Some(all_proxy) = &settings.all_proxy { + if !all_proxy.is_empty() { + log::info!("Setting ALL_PROXY={}", all_proxy); + std::env::set_var("ALL_PROXY", all_proxy); + } + } +- ++ + // Log current proxy environment variables for debugging + log::info!("Current proxy environment variables:"); + for (key, value) in std::env::vars() { +@@ -152,4 +159,4 @@ pub fn apply_proxy_settings(settings: &ProxySettings) { + log::info!(" {}={}", key, value); + } + } +-} +\ No newline at end of file ++} +diff --git a/src-tauri/src/commands/slash_commands.rs b/src-tauri/src/commands/slash_commands.rs +index dbf12e6..6f77309 100644 +--- a/src-tauri/src/commands/slash_commands.rs ++++ b/src-tauri/src/commands/slash_commands.rs +@@ -45,13 +45,13 @@ struct CommandFrontmatter { + /// Parse a markdown file with optional YAML frontmatter + fn parse_markdown_with_frontmatter(content: &str) -> Result<(Option, String)> { + let lines: Vec<&str> = content.lines().collect(); +- ++ + // Check if the file starts with YAML frontmatter + if lines.is_empty() || lines[0] != "---" { + // No frontmatter + return Ok((None, content.to_string())); + } +- ++ + // Find the end of frontmatter + let mut frontmatter_end = None; + for (i, line) in lines.iter().enumerate().skip(1) { +@@ -60,12 +60,12 @@ fn parse_markdown_with_frontmatter(content: &str) -> Result<(Option(&frontmatter_content) { + Ok(frontmatter) => Ok((Some(frontmatter), body_content)), +@@ -86,20 +86,20 @@ fn extract_command_info(file_path: &Path, base_path: &Path) -> Result<(String, O + let relative_path = file_path + .strip_prefix(base_path) + .context("Failed to get relative path")?; +- ++ + // Remove .md extension + let path_without_ext = relative_path + .with_extension("") + .to_string_lossy() + .to_string(); +- ++ + // Split into components + let components: Vec<&str> = path_without_ext.split('/').collect(); +- ++ + if components.is_empty() { + return Err(anyhow::anyhow!("Invalid command path")); + } +- ++ + if components.len() == 1 { + // No namespace + Ok((components[0].to_string(), None)) +@@ -112,44 +112,43 @@ fn extract_command_info(file_path: &Path, base_path: &Path) -> Result<(String, O + } + + /// Load a single command from a markdown file +-fn load_command_from_file( +- file_path: &Path, +- base_path: &Path, +- scope: &str, +-) -> Result { ++fn load_command_from_file(file_path: &Path, base_path: &Path, scope: &str) -> Result { + debug!("Loading command from: {:?}", file_path); +- ++ + // Read file content +- let content = fs::read_to_string(file_path) +- .context("Failed to read command file")?; +- ++ let content = fs::read_to_string(file_path).context("Failed to read command file")?; ++ + // Parse frontmatter + let (frontmatter, body) = parse_markdown_with_frontmatter(&content)?; +- ++ + // Extract command info + let (name, namespace) = extract_command_info(file_path, base_path)?; +- ++ + // Build full command (no scope prefix, just /command or /namespace:command) + let full_command = match &namespace { + Some(ns) => format!("/{ns}:{name}"), + None => format!("/{name}"), + }; +- ++ + // Generate unique ID +- let id = format!("{}-{}", scope, file_path.to_string_lossy().replace('/', "-")); +- ++ let id = format!( ++ "{}-{}", ++ scope, ++ file_path.to_string_lossy().replace('/', "-") ++ ); ++ + // Check for special content + let has_bash_commands = body.contains("!`"); + let has_file_references = body.contains('@'); + let accepts_arguments = body.contains("$ARGUMENTS"); +- ++ + // Extract metadata from frontmatter + let (description, allowed_tools) = if let Some(fm) = frontmatter { + (fm.description, fm.allowed_tools.unwrap_or_default()) + } else { + (None, Vec::new()) + }; +- ++ + Ok(SlashCommand { + id, + name, +@@ -171,18 +170,18 @@ fn find_markdown_files(dir: &Path, files: &mut Vec) -> Result<()> { + if !dir.exists() { + return Ok(()); + } +- ++ + for entry in fs::read_dir(dir)? { + let entry = entry?; + let path = entry.path(); +- ++ + // Skip hidden files/directories + if let Some(name) = path.file_name().and_then(|n| n.to_str()) { + if name.starts_with('.') { + continue; + } + } +- ++ + if path.is_dir() { + find_markdown_files(&path, files)?; + } else if path.is_file() { +@@ -193,7 +192,7 @@ fn find_markdown_files(dir: &Path, files: &mut Vec) -> Result<()> { + } + } + } +- ++ + Ok(()) + } + +@@ -252,16 +251,16 @@ pub async fn slash_commands_list( + ) -> Result, String> { + info!("Discovering slash commands"); + let mut commands = Vec::new(); +- ++ + // Add default commands + commands.extend(create_default_commands()); +- ++ + // Load project commands if project path is provided + if let Some(proj_path) = project_path { + let project_commands_dir = PathBuf::from(&proj_path).join(".claude").join("commands"); + if project_commands_dir.exists() { + debug!("Scanning project commands at: {:?}", project_commands_dir); +- ++ + let mut md_files = Vec::new(); + if let Err(e) = find_markdown_files(&project_commands_dir, &mut md_files) { + error!("Failed to find project command files: {}", e); +@@ -280,13 +279,13 @@ pub async fn slash_commands_list( + } + } + } +- ++ + // Load user commands + if let Some(home_dir) = dirs::home_dir() { + let user_commands_dir = home_dir.join(".claude").join("commands"); + if user_commands_dir.exists() { + debug!("Scanning user commands at: {:?}", user_commands_dir); +- ++ + let mut md_files = Vec::new(); + if let Err(e) = find_markdown_files(&user_commands_dir, &mut md_files) { + error!("Failed to find user command files: {}", e); +@@ -305,7 +304,7 @@ pub async fn slash_commands_list( + } + } + } +- ++ + info!("Found {} slash commands", commands.len()); + Ok(commands) + } +@@ -314,17 +313,17 @@ pub async fn slash_commands_list( + #[tauri::command] + pub async fn slash_command_get(command_id: String) -> Result { + debug!("Getting slash command: {}", command_id); +- ++ + // Parse the ID to determine scope and reconstruct file path + let parts: Vec<&str> = command_id.split('-').collect(); + if parts.len() < 2 { + return Err("Invalid command ID".to_string()); + } +- ++ + // The actual implementation would need to reconstruct the path and reload the command + // For now, we'll list all commands and find the matching one + let commands = slash_commands_list(None).await?; +- ++ + commands + .into_iter() + .find(|cmd| cmd.id == command_id) +@@ -343,16 +342,16 @@ pub async fn slash_command_save( + project_path: Option, + ) -> Result { + info!("Saving slash command: {} in scope: {}", name, scope); +- ++ + // Validate inputs + if name.is_empty() { + return Err("Command name cannot be empty".to_string()); + } +- ++ + if !["project", "user"].contains(&scope.as_str()) { + return Err("Invalid scope. Must be 'project' or 'user'".to_string()); + } +- ++ + // Determine base directory + let base_dir = if scope == "project" { + if let Some(proj_path) = project_path { +@@ -366,7 +365,7 @@ pub async fn slash_command_save( + .join(".claude") + .join("commands") + }; +- ++ + // Build file path + let mut file_path = base_dir.clone(); + if let Some(ns) = &namespace { +@@ -374,41 +373,40 @@ pub async fn slash_command_save( + file_path = file_path.join(component); + } + } +- ++ + // Create directories if needed +- fs::create_dir_all(&file_path) +- .map_err(|e| format!("Failed to create directories: {}", e))?; +- ++ fs::create_dir_all(&file_path).map_err(|e| format!("Failed to create directories: {}", e))?; ++ + // Add filename + file_path = file_path.join(format!("{}.md", name)); +- ++ + // Build content with frontmatter + let mut full_content = String::new(); +- ++ + // Add frontmatter if we have metadata + if description.is_some() || !allowed_tools.is_empty() { + full_content.push_str("---\n"); +- ++ + if let Some(desc) = &description { + full_content.push_str(&format!("description: {}\n", desc)); + } +- ++ + if !allowed_tools.is_empty() { + full_content.push_str("allowed-tools:\n"); + for tool in &allowed_tools { + full_content.push_str(&format!(" - {}\n", tool)); + } + } +- ++ + full_content.push_str("---\n\n"); + } +- ++ + full_content.push_str(&content); +- ++ + // Write file + fs::write(&file_path, &full_content) + .map_err(|e| format!("Failed to write command file: {}", e))?; +- ++ + // Load and return the saved command + load_command_from_file(&file_path, &base_dir, &scope) + .map_err(|e| format!("Failed to load saved command: {}", e)) +@@ -416,35 +414,38 @@ pub async fn slash_command_save( + + /// Delete a slash command + #[tauri::command] +-pub async fn slash_command_delete(command_id: String, project_path: Option) -> Result { ++pub async fn slash_command_delete( ++ command_id: String, ++ project_path: Option, ++) -> Result { + info!("Deleting slash command: {}", command_id); +- ++ + // First, we need to determine if this is a project command by parsing the ID + let is_project_command = command_id.starts_with("project-"); +- ++ + // If it's a project command and we don't have a project path, error out + if is_project_command && project_path.is_none() { + return Err("Project path required to delete project commands".to_string()); + } +- ++ + // List all commands (including project commands if applicable) + let commands = slash_commands_list(project_path).await?; +- ++ + // Find the command by ID + let command = commands + .into_iter() + .find(|cmd| cmd.id == command_id) + .ok_or_else(|| format!("Command not found: {}", command_id))?; +- ++ + // Delete the file + fs::remove_file(&command.file_path) + .map_err(|e| format!("Failed to delete command file: {}", e))?; +- ++ + // Clean up empty directories + if let Some(parent) = Path::new(&command.file_path).parent() { + let _ = remove_empty_dirs(parent); + } +- ++ + Ok(format!("Deleted command: {}", command.full_command)) + } + +@@ -453,18 +454,18 @@ fn remove_empty_dirs(dir: &Path) -> Result<()> { + if !dir.exists() { + return Ok(()); + } +- ++ + // Check if directory is empty + let is_empty = fs::read_dir(dir)?.next().is_none(); +- ++ + if is_empty { + fs::remove_dir(dir)?; +- ++ + // Try to remove parent if it's also empty + if let Some(parent) = dir.parent() { + let _ = remove_empty_dirs(parent); + } + } +- ++ + Ok(()) + } +diff --git a/src-tauri/src/commands/storage.rs b/src-tauri/src/commands/storage.rs +index 1bcdb1b..02c5529 100644 +--- a/src-tauri/src/commands/storage.rs ++++ b/src-tauri/src/commands/storage.rs +@@ -1,10 +1,10 @@ ++use super::agents::AgentDb; + use anyhow::Result; +-use rusqlite::{params, Connection, Result as SqliteResult, types::ValueRef}; ++use rusqlite::{params, types::ValueRef, Connection, Result as SqliteResult}; + use serde::{Deserialize, Serialize}; + use serde_json::{Map, Value as JsonValue}; + use std::collections::HashMap; + use tauri::{AppHandle, Manager, State}; +-use super::agents::AgentDb; + + /// Represents metadata about a database table + #[derive(Debug, Serialize, Deserialize, Clone)] +@@ -50,37 +50,35 @@ pub struct QueryResult { + #[tauri::command] + pub async fn storage_list_tables(db: State<'_, AgentDb>) -> Result, String> { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + // Query for all tables + let mut stmt = conn + .prepare("SELECT name FROM sqlite_master WHERE type='table' AND name NOT LIKE 'sqlite_%' ORDER BY name") + .map_err(|e| e.to_string())?; +- ++ + let table_names: Vec = stmt + .query_map([], |row| row.get(0)) + .map_err(|e| e.to_string())? + .collect::>>() + .map_err(|e| e.to_string())?; +- ++ + drop(stmt); +- ++ + let mut tables = Vec::new(); +- ++ + for table_name in table_names { + // Get row count + let row_count: i64 = conn +- .query_row( +- &format!("SELECT COUNT(*) FROM {}", table_name), +- [], +- |row| row.get(0), +- ) ++ .query_row(&format!("SELECT COUNT(*) FROM {}", table_name), [], |row| { ++ row.get(0) ++ }) + .unwrap_or(0); +- ++ + // Get column information + let mut pragma_stmt = conn + .prepare(&format!("PRAGMA table_info({})", table_name)) + .map_err(|e| e.to_string())?; +- ++ + let columns: Vec = pragma_stmt + .query_map([], |row| { + Ok(ColumnInfo { +@@ -95,14 +93,14 @@ pub async fn storage_list_tables(db: State<'_, AgentDb>) -> Result>>() + .map_err(|e| e.to_string())?; +- ++ + tables.push(TableInfo { + name: table_name, + row_count, + columns, + }); + } +- ++ + Ok(tables) + } + +@@ -117,17 +115,17 @@ pub async fn storage_read_table( + searchQuery: Option, + ) -> Result { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + // Validate table name to prevent SQL injection + if !is_valid_table_name(&conn, &tableName)? { + return Err("Invalid table name".to_string()); + } +- ++ + // Get column information + let mut pragma_stmt = conn + .prepare(&format!("PRAGMA table_info({})", tableName)) + .map_err(|e| e.to_string())?; +- ++ + let columns: Vec = pragma_stmt + .query_map([], |row| { + Ok(ColumnInfo { +@@ -142,9 +140,9 @@ pub async fn storage_read_table( + .map_err(|e| e.to_string())? + .collect::>>() + .map_err(|e| e.to_string())?; +- ++ + drop(pragma_stmt); +- ++ + // Build query with optional search + let (query, count_query) = if let Some(search) = &searchQuery { + // Create search conditions for all text columns +@@ -153,7 +151,7 @@ pub async fn storage_read_table( + .filter(|col| col.type_name.contains("TEXT") || col.type_name.contains("VARCHAR")) + .map(|col| format!("{} LIKE '%{}%'", col.name, search.replace("'", "''"))) + .collect(); +- ++ + if search_conditions.is_empty() { + ( + format!("SELECT * FROM {} LIMIT ? OFFSET ?", tableName), +@@ -162,7 +160,10 @@ pub async fn storage_read_table( + } else { + let where_clause = search_conditions.join(" OR "); + ( +- format!("SELECT * FROM {} WHERE {} LIMIT ? OFFSET ?", tableName, where_clause), ++ format!( ++ "SELECT * FROM {} WHERE {} LIMIT ? OFFSET ?", ++ tableName, where_clause ++ ), + format!("SELECT COUNT(*) FROM {} WHERE {}", tableName, where_clause), + ) + } +@@ -172,25 +173,23 @@ pub async fn storage_read_table( + format!("SELECT COUNT(*) FROM {}", tableName), + ) + }; +- ++ + // Get total row count + let total_rows: i64 = conn + .query_row(&count_query, [], |row| row.get(0)) + .unwrap_or(0); +- ++ + // Calculate pagination + let offset = (page - 1) * pageSize; + let total_pages = (total_rows as f64 / pageSize as f64).ceil() as i64; +- ++ + // Query data +- let mut data_stmt = conn +- .prepare(&query) +- .map_err(|e| e.to_string())?; +- ++ let mut data_stmt = conn.prepare(&query).map_err(|e| e.to_string())?; ++ + let rows: Vec> = data_stmt + .query_map(params![pageSize, offset], |row| { + let mut row_map = Map::new(); +- ++ + for (idx, col) in columns.iter().enumerate() { + let value = match row.get_ref(idx)? { + ValueRef::Null => JsonValue::Null, +@@ -203,17 +202,20 @@ pub async fn storage_read_table( + } + } + ValueRef::Text(s) => JsonValue::String(String::from_utf8_lossy(s).to_string()), +- ValueRef::Blob(b) => JsonValue::String(base64::Engine::encode(&base64::engine::general_purpose::STANDARD, b)), ++ ValueRef::Blob(b) => JsonValue::String(base64::Engine::encode( ++ &base64::engine::general_purpose::STANDARD, ++ b, ++ )), + }; + row_map.insert(col.name.clone(), value); + } +- ++ + Ok(row_map) + }) + .map_err(|e| e.to_string())? + .collect::>>() + .map_err(|e| e.to_string())?; +- ++ + Ok(TableData { + table_name: tableName, + columns, +@@ -235,49 +237,52 @@ pub async fn storage_update_row( + updates: HashMap, + ) -> Result<(), String> { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + // Validate table name + if !is_valid_table_name(&conn, &tableName)? { + return Err("Invalid table name".to_string()); + } +- ++ + // Build UPDATE query + let set_clauses: Vec = updates + .keys() + .enumerate() + .map(|(idx, key)| format!("{} = ?{}", key, idx + 1)) + .collect(); +- ++ + let where_clauses: Vec = primaryKeyValues + .keys() + .enumerate() + .map(|(idx, key)| format!("{} = ?{}", key, idx + updates.len() + 1)) + .collect(); +- ++ + let query = format!( + "UPDATE {} SET {} WHERE {}", + tableName, + set_clauses.join(", "), + where_clauses.join(" AND ") + ); +- ++ + // Prepare parameters + let mut params: Vec> = Vec::new(); +- ++ + // Add update values + for value in updates.values() { + params.push(json_to_sql_value(value)?); + } +- ++ + // Add where clause values + for value in primaryKeyValues.values() { + params.push(json_to_sql_value(value)?); + } +- ++ + // Execute update +- conn.execute(&query, rusqlite::params_from_iter(params.iter().map(|p| p.as_ref()))) +- .map_err(|e| format!("Failed to update row: {}", e))?; +- ++ conn.execute( ++ &query, ++ rusqlite::params_from_iter(params.iter().map(|p| p.as_ref())), ++ ) ++ .map_err(|e| format!("Failed to update row: {}", e))?; ++ + Ok(()) + } + +@@ -290,35 +295,38 @@ pub async fn storage_delete_row( + primaryKeyValues: HashMap, + ) -> Result<(), String> { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + // Validate table name + if !is_valid_table_name(&conn, &tableName)? { + return Err("Invalid table name".to_string()); + } +- ++ + // Build DELETE query + let where_clauses: Vec = primaryKeyValues + .keys() + .enumerate() + .map(|(idx, key)| format!("{} = ?{}", key, idx + 1)) + .collect(); +- ++ + let query = format!( + "DELETE FROM {} WHERE {}", + tableName, + where_clauses.join(" AND ") + ); +- ++ + // Prepare parameters + let params: Vec> = primaryKeyValues + .values() + .map(json_to_sql_value) + .collect::, _>>()?; +- ++ + // Execute delete +- conn.execute(&query, rusqlite::params_from_iter(params.iter().map(|p| p.as_ref()))) +- .map_err(|e| format!("Failed to delete row: {}", e))?; +- ++ conn.execute( ++ &query, ++ rusqlite::params_from_iter(params.iter().map(|p| p.as_ref())), ++ ) ++ .map_err(|e| format!("Failed to delete row: {}", e))?; ++ + Ok(()) + } + +@@ -331,35 +339,40 @@ pub async fn storage_insert_row( + values: HashMap, + ) -> Result { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + // Validate table name + if !is_valid_table_name(&conn, &tableName)? { + return Err("Invalid table name".to_string()); + } +- ++ + // Build INSERT query + let columns: Vec<&String> = values.keys().collect(); +- let placeholders: Vec = (1..=columns.len()) +- .map(|i| format!("?{}", i)) +- .collect(); +- ++ let placeholders: Vec = (1..=columns.len()).map(|i| format!("?{}", i)).collect(); ++ + let query = format!( + "INSERT INTO {} ({}) VALUES ({})", + tableName, +- columns.iter().map(|c| c.as_str()).collect::>().join(", "), ++ columns ++ .iter() ++ .map(|c| c.as_str()) ++ .collect::>() ++ .join(", "), + placeholders.join(", ") + ); +- ++ + // Prepare parameters + let params: Vec> = values + .values() + .map(json_to_sql_value) + .collect::, _>>()?; +- ++ + // Execute insert +- conn.execute(&query, rusqlite::params_from_iter(params.iter().map(|p| p.as_ref()))) +- .map_err(|e| format!("Failed to insert row: {}", e))?; +- ++ conn.execute( ++ &query, ++ rusqlite::params_from_iter(params.iter().map(|p| p.as_ref())), ++ ) ++ .map_err(|e| format!("Failed to insert row: {}", e))?; ++ + Ok(conn.last_insert_rowid()) + } + +@@ -370,20 +383,20 @@ pub async fn storage_execute_sql( + query: String, + ) -> Result { + let conn = db.0.lock().map_err(|e| e.to_string())?; +- ++ + // Check if it's a SELECT query + let is_select = query.trim().to_uppercase().starts_with("SELECT"); +- ++ + if is_select { + // Handle SELECT queries + let mut stmt = conn.prepare(&query).map_err(|e| e.to_string())?; + let column_count = stmt.column_count(); +- ++ + // Get column names + let columns: Vec = (0..column_count) + .map(|i| stmt.column_name(i).unwrap_or("").to_string()) + .collect(); +- ++ + // Execute query and collect results + let rows: Vec> = stmt + .query_map([], |row| { +@@ -399,8 +412,13 @@ pub async fn storage_execute_sql( + JsonValue::String(f.to_string()) + } + } +- ValueRef::Text(s) => JsonValue::String(String::from_utf8_lossy(s).to_string()), +- ValueRef::Blob(b) => JsonValue::String(base64::Engine::encode(&base64::engine::general_purpose::STANDARD, b)), ++ ValueRef::Text(s) => { ++ JsonValue::String(String::from_utf8_lossy(s).to_string()) ++ } ++ ValueRef::Blob(b) => JsonValue::String(base64::Engine::encode( ++ &base64::engine::general_purpose::STANDARD, ++ b, ++ )), + }; + row_values.push(value); + } +@@ -409,7 +427,7 @@ pub async fn storage_execute_sql( + .map_err(|e| e.to_string())? + .collect::>>() + .map_err(|e| e.to_string())?; +- ++ + Ok(QueryResult { + columns, + rows, +@@ -419,7 +437,7 @@ pub async fn storage_execute_sql( + } else { + // Handle non-SELECT queries (INSERT, UPDATE, DELETE, etc.) + let rows_affected = conn.execute(&query, []).map_err(|e| e.to_string())?; +- ++ + Ok(QueryResult { + columns: vec![], + rows: vec![], +@@ -435,13 +453,12 @@ pub async fn storage_reset_database(app: AppHandle) -> Result<(), String> { + { + // Drop all existing tables within a scoped block + let db_state = app.state::(); +- let conn = db_state.0.lock() +- .map_err(|e| e.to_string())?; +- ++ let conn = db_state.0.lock().map_err(|e| e.to_string())?; ++ + // Disable foreign key constraints temporarily to allow dropping tables + conn.execute("PRAGMA foreign_keys = OFF", []) + .map_err(|e| format!("Failed to disable foreign keys: {}", e))?; +- ++ + // Drop tables - order doesn't matter with foreign keys disabled + conn.execute("DROP TABLE IF EXISTS agent_runs", []) + .map_err(|e| format!("Failed to drop agent_runs table: {}", e))?; +@@ -449,34 +466,31 @@ pub async fn storage_reset_database(app: AppHandle) -> Result<(), String> { + .map_err(|e| format!("Failed to drop agents table: {}", e))?; + conn.execute("DROP TABLE IF EXISTS app_settings", []) + .map_err(|e| format!("Failed to drop app_settings table: {}", e))?; +- ++ + // Re-enable foreign key constraints + conn.execute("PRAGMA foreign_keys = ON", []) + .map_err(|e| format!("Failed to re-enable foreign keys: {}", e))?; +- ++ + // Connection is automatically dropped at end of scope + } +- ++ + // Re-initialize the database which will recreate all tables empty + let new_conn = init_database(&app).map_err(|e| format!("Failed to reset database: {}", e))?; +- ++ + // Update the managed state with the new connection + { + let db_state = app.state::(); +- let mut conn_guard = db_state.0.lock() +- .map_err(|e| e.to_string())?; ++ let mut conn_guard = db_state.0.lock().map_err(|e| e.to_string())?; + *conn_guard = new_conn; + } +- ++ + // Run VACUUM to optimize the database + { + let db_state = app.state::(); +- let conn = db_state.0.lock() +- .map_err(|e| e.to_string())?; +- conn.execute("VACUUM", []) +- .map_err(|e| e.to_string())?; ++ let conn = db_state.0.lock().map_err(|e| e.to_string())?; ++ conn.execute("VACUUM", []).map_err(|e| e.to_string())?; + } +- ++ + Ok(()) + } + +@@ -489,7 +503,7 @@ fn is_valid_table_name(conn: &Connection, table_name: &str) -> Result 0) + } + +@@ -513,4 +527,4 @@ fn json_to_sql_value(value: &JsonValue) -> Result, Stri + } + + /// Initialize the agents database (re-exported from agents module) +-use super::agents::init_database; +\ No newline at end of file ++use super::agents::init_database; +diff --git a/src-tauri/src/main.rs b/src-tauri/src/main.rs +index ffc0212..3164fe3 100644 +--- a/src-tauri/src/main.rs ++++ b/src-tauri/src/main.rs +@@ -14,20 +14,21 @@ use commands::agents::{ + get_live_session_output, get_session_output, get_session_status, import_agent, + import_agent_from_file, import_agent_from_github, init_database, kill_agent_session, + list_agent_runs, list_agent_runs_with_metrics, list_agents, list_claude_installations, +- list_running_sessions, load_agent_session_history, set_claude_binary_path, stream_session_output, update_agent, AgentDb, ++ list_running_sessions, load_agent_session_history, set_claude_binary_path, ++ stream_session_output, update_agent, AgentDb, + }; + use commands::claude::{ + cancel_claude_execution, check_auto_checkpoint, check_claude_version, cleanup_old_checkpoints, +- clear_checkpoint_manager, continue_claude_code, create_checkpoint, create_project, execute_claude_code, +- find_claude_md_files, fork_from_checkpoint, get_checkpoint_diff, get_checkpoint_settings, +- get_checkpoint_state_stats, get_claude_session_output, get_claude_settings, get_home_directory, get_project_sessions, +- get_recently_modified_files, get_session_timeline, get_system_prompt, list_checkpoints, +- list_directory_contents, list_projects, list_running_claude_sessions, load_session_history, +- open_new_session, read_claude_md_file, restore_checkpoint, resume_claude_code, +- save_claude_md_file, save_claude_settings, save_system_prompt, search_files, +- track_checkpoint_message, track_session_messages, update_checkpoint_settings, +- get_hooks_config, update_hooks_config, validate_hook_command, +- ClaudeProcessState, ++ clear_checkpoint_manager, continue_claude_code, create_checkpoint, create_project, ++ delete_session, execute_claude_code, find_claude_md_files, fork_from_checkpoint, ++ get_checkpoint_diff, get_checkpoint_settings, get_checkpoint_state_stats, ++ get_claude_session_output, get_claude_settings, get_home_directory, get_hooks_config, ++ get_project_sessions, get_recently_modified_files, get_session_timeline, get_system_prompt, ++ list_checkpoints, list_directory_contents, list_projects, list_running_claude_sessions, ++ load_session_history, open_new_session, read_claude_md_file, restore_checkpoint, ++ resume_claude_code, save_claude_md_file, save_claude_settings, save_system_prompt, ++ search_files, track_checkpoint_message, track_session_messages, update_checkpoint_settings, ++ update_hooks_config, validate_hook_command, ClaudeProcessState, + }; + use commands::mcp::{ + mcp_add, mcp_add_from_claude_desktop, mcp_add_json, mcp_get, mcp_get_server_status, mcp_list, +@@ -35,14 +36,14 @@ use commands::mcp::{ + mcp_serve, mcp_test_connection, + }; + ++use commands::proxy::{apply_proxy_settings, get_proxy_settings, save_proxy_settings}; ++use commands::storage::{ ++ storage_delete_row, storage_execute_sql, storage_insert_row, storage_list_tables, ++ storage_read_table, storage_reset_database, storage_update_row, ++}; + use commands::usage::{ + get_session_stats, get_usage_by_date_range, get_usage_details, get_usage_stats, + }; +-use commands::storage::{ +- storage_list_tables, storage_read_table, storage_update_row, storage_delete_row, +- storage_insert_row, storage_execute_sql, storage_reset_database, +-}; +-use commands::proxy::{get_proxy_settings, save_proxy_settings, apply_proxy_settings}; + use process::ProcessRegistryState; + use std::sync::Mutex; + use tauri::Manager; +@@ -50,19 +51,17 @@ use tauri::Manager; + #[cfg(target_os = "macos")] + use window_vibrancy::{apply_vibrancy, NSVisualEffectMaterial}; + +- + fn main() { + // Initialize logger + env_logger::init(); + +- + tauri::Builder::default() + .plugin(tauri_plugin_dialog::init()) + .plugin(tauri_plugin_shell::init()) + .setup(|app| { + // Initialize agents database + let conn = init_database(&app.handle()).expect("Failed to initialize agents database"); +- ++ + // Load and apply proxy settings from the database + { + let db = AgentDb(Mutex::new(conn)); +@@ -70,7 +69,7 @@ fn main() { + Ok(conn) => { + // Directly query proxy settings from the database + let mut settings = commands::proxy::ProxySettings::default(); +- ++ + let keys = vec![ + ("proxy_enabled", "enabled"), + ("proxy_http", "http_proxy"), +@@ -78,7 +77,7 @@ fn main() { + ("proxy_no", "no_proxy"), + ("proxy_all", "all_proxy"), + ]; +- ++ + for (db_key, field) in keys { + if let Ok(value) = conn.query_row( + "SELECT value FROM app_settings WHERE key = ?1", +@@ -87,15 +86,23 @@ fn main() { + ) { + match field { + "enabled" => settings.enabled = value == "true", +- "http_proxy" => settings.http_proxy = Some(value).filter(|s| !s.is_empty()), +- "https_proxy" => settings.https_proxy = Some(value).filter(|s| !s.is_empty()), +- "no_proxy" => settings.no_proxy = Some(value).filter(|s| !s.is_empty()), +- "all_proxy" => settings.all_proxy = Some(value).filter(|s| !s.is_empty()), ++ "http_proxy" => { ++ settings.http_proxy = Some(value).filter(|s| !s.is_empty()) ++ } ++ "https_proxy" => { ++ settings.https_proxy = Some(value).filter(|s| !s.is_empty()) ++ } ++ "no_proxy" => { ++ settings.no_proxy = Some(value).filter(|s| !s.is_empty()) ++ } ++ "all_proxy" => { ++ settings.all_proxy = Some(value).filter(|s| !s.is_empty()) ++ } + _ => {} + } + } + } +- ++ + log::info!("Loaded proxy settings: enabled={}", settings.enabled); + settings + } +@@ -104,11 +111,11 @@ fn main() { + commands::proxy::ProxySettings::default() + } + }; +- ++ + // Apply the proxy settings + apply_proxy_settings(&proxy_settings); + } +- ++ + // Re-open the connection for the app to manage + let conn = init_database(&app.handle()).expect("Failed to initialize agents database"); + app.manage(AgentDb(Mutex::new(conn))); +@@ -144,7 +151,7 @@ fn main() { + #[cfg(target_os = "macos")] + { + let window = app.get_webview_window("main").unwrap(); +- ++ + // Try different vibrancy materials that support rounded corners + let materials = [ + NSVisualEffectMaterial::UnderWindowBackground, +@@ -153,7 +160,7 @@ fn main() { + NSVisualEffectMaterial::Menu, + NSVisualEffectMaterial::Sidebar, + ]; +- ++ + let mut applied = false; + for material in materials.iter() { + if apply_vibrancy(&window, *material, None, Some(12.0)).is_ok() { +@@ -161,11 +168,16 @@ fn main() { + break; + } + } +- ++ + if !applied { + // Fallback without rounded corners +- apply_vibrancy(&window, NSVisualEffectMaterial::WindowBackground, None, None) +- .expect("Failed to apply any window vibrancy"); ++ apply_vibrancy( ++ &window, ++ NSVisualEffectMaterial::WindowBackground, ++ None, ++ None, ++ ) ++ .expect("Failed to apply any window vibrancy"); + } + } + +@@ -176,6 +188,7 @@ fn main() { + list_projects, + create_project, + get_project_sessions, ++ delete_session, + get_home_directory, + get_claude_settings, + open_new_session, +@@ -199,7 +212,6 @@ fn main() { + get_hooks_config, + update_hooks_config, + validate_hook_command, +- + // Checkpoint Management + create_checkpoint, + restore_checkpoint, +@@ -215,7 +227,6 @@ fn main() { + get_checkpoint_settings, + clear_checkpoint_manager, + get_checkpoint_state_stats, +- + // Agent Management + list_agents, + create_agent, +@@ -245,13 +256,11 @@ fn main() { + fetch_github_agents, + fetch_github_agent_content, + import_agent_from_github, +- + // Usage & Analytics + get_usage_stats, + get_usage_by_date_range, + get_usage_details, + get_session_stats, +- + // MCP (Model Context Protocol) + mcp_add, + mcp_list, +@@ -265,7 +274,6 @@ fn main() { + mcp_get_server_status, + mcp_read_project_config, + mcp_save_project_config, +- + // Storage Management + storage_list_tables, + storage_read_table, +@@ -274,13 +282,11 @@ fn main() { + storage_insert_row, + storage_execute_sql, + storage_reset_database, +- + // Slash Commands + commands::slash_commands::slash_commands_list, + commands::slash_commands::slash_command_get, + commands::slash_commands::slash_command_save, + commands::slash_commands::slash_command_delete, +- + // Proxy Settings + get_proxy_settings, + save_proxy_settings, +diff --git a/src-tauri/src/process/registry.rs b/src-tauri/src/process/registry.rs +index 30c8e94..f4f33b5 100644 +--- a/src-tauri/src/process/registry.rs ++++ b/src-tauri/src/process/registry.rs +@@ -7,13 +7,8 @@ use tokio::process::Child; + /// Type of process being tracked + #[derive(Debug, Clone, Serialize, Deserialize)] + pub enum ProcessType { +- AgentRun { +- agent_id: i64, +- agent_name: String, +- }, +- ClaudeSession { +- session_id: String, +- }, ++ AgentRun { agent_id: i64, agent_name: String }, ++ ClaudeSession { session_id: String }, + } + + /// Information about a running agent process +@@ -72,7 +67,10 @@ impl ProcessRegistry { + ) -> Result<(), String> { + let process_info = ProcessInfo { + run_id, +- process_type: ProcessType::AgentRun { agent_id, agent_name }, ++ process_type: ProcessType::AgentRun { ++ agent_id, ++ agent_name, ++ }, + pid, + started_at: Utc::now(), + project_path, +@@ -96,7 +94,10 @@ impl ProcessRegistry { + ) -> Result<(), String> { + let process_info = ProcessInfo { + run_id, +- process_type: ProcessType::AgentRun { agent_id, agent_name }, ++ process_type: ProcessType::AgentRun { ++ agent_id, ++ agent_name, ++ }, + pid, + started_at: Utc::now(), + project_path, +@@ -106,7 +107,7 @@ impl ProcessRegistry { + + // For sidecar processes, we register without the child handle since it's managed differently + let mut processes = self.processes.lock().map_err(|e| e.to_string())?; +- ++ + let process_handle = ProcessHandle { + info: process_info, + child: Arc::new(Mutex::new(None)), // No tokio::process::Child handle for sidecar +@@ -127,7 +128,7 @@ impl ProcessRegistry { + model: String, + ) -> Result { + let run_id = self.generate_id()?; +- ++ + let process_info = ProcessInfo { + run_id, + process_type: ProcessType::ClaudeSession { session_id }, +@@ -140,7 +141,7 @@ impl ProcessRegistry { + + // Register without child - Claude sessions use ClaudeProcessState for process management + let mut processes = self.processes.lock().map_err(|e| e.to_string())?; +- ++ + let process_handle = ProcessHandle { + info: process_info, + child: Arc::new(Mutex::new(None)), // No child handle for Claude sessions +@@ -175,25 +176,24 @@ impl ProcessRegistry { + let processes = self.processes.lock().map_err(|e| e.to_string())?; + Ok(processes + .values() +- .filter_map(|handle| { +- match &handle.info.process_type { +- ProcessType::ClaudeSession { .. } => Some(handle.info.clone()), +- _ => None, +- } ++ .filter_map(|handle| match &handle.info.process_type { ++ ProcessType::ClaudeSession { .. } => Some(handle.info.clone()), ++ _ => None, + }) + .collect()) + } + + /// Get a specific Claude session by session ID +- pub fn get_claude_session_by_id(&self, session_id: &str) -> Result, String> { ++ pub fn get_claude_session_by_id( ++ &self, ++ session_id: &str, ++ ) -> Result, String> { + let processes = self.processes.lock().map_err(|e| e.to_string())?; + Ok(processes + .values() +- .find(|handle| { +- match &handle.info.process_type { +- ProcessType::ClaudeSession { session_id: sid } => sid == session_id, +- _ => false, +- } ++ .find(|handle| match &handle.info.process_type { ++ ProcessType::ClaudeSession { session_id: sid } => sid == session_id, ++ _ => false, + }) + .map(|handle| handle.info.clone())) + } +@@ -221,11 +221,9 @@ impl ProcessRegistry { + let processes = self.processes.lock().map_err(|e| e.to_string())?; + Ok(processes + .values() +- .filter_map(|handle| { +- match &handle.info.process_type { +- ProcessType::AgentRun { .. } => Some(handle.info.clone()), +- _ => None, +- } ++ .filter_map(|handle| match &handle.info.process_type { ++ ProcessType::AgentRun { .. } => Some(handle.info.clone()), ++ _ => None, + }) + .collect()) + } +@@ -273,17 +271,26 @@ impl ProcessRegistry { + } + } + } else { +- warn!("No child handle available for process {} (PID: {}), attempting system kill", run_id, pid); ++ warn!( ++ "No child handle available for process {} (PID: {}), attempting system kill", ++ run_id, pid ++ ); + false // Process handle not available, try fallback + } + }; + + // If direct kill didn't work, try system command as fallback + if !kill_sent { +- info!("Attempting fallback kill for process {} (PID: {})", run_id, pid); ++ info!( ++ "Attempting fallback kill for process {} (PID: {})", ++ run_id, pid ++ ); + match self.kill_process_by_pid(run_id, pid) { + Ok(true) => return Ok(true), +- Ok(false) => warn!("Fallback kill also failed for process {} (PID: {})", run_id, pid), ++ Ok(false) => warn!( ++ "Fallback kill also failed for process {} (PID: {})", ++ run_id, pid ++ ), + Err(e) => error!("Error during fallback kill: {}", e), + } + // Continue with the rest of the cleanup even if fallback failed +-- +2.50.1 (Apple Git-155) + diff --git a/src-tauri/src/claude_binary.rs b/src-tauri/src/claude_binary.rs index fe4b2a8eb..c27b9234e 100644 --- a/src-tauri/src/claude_binary.rs +++ b/src-tauri/src/claude_binary.rs @@ -47,7 +47,7 @@ pub fn find_claude_binary(app_handle: &tauri::AppHandle) -> Result(0), ) { info!("Found stored claude path in database: {}", stored_path); - + // Check if the path still exists let path_buf = PathBuf::from(&stored_path); if path_buf.exists() && path_buf.is_file() { @@ -56,14 +56,14 @@ pub fn find_claude_binary(app_handle: &tauri::AppHandle) -> Result(0), ).unwrap_or_else(|_| "system".to_string()); - + info!("User preference for Claude installation: {}", preference); } } @@ -366,10 +366,10 @@ fn get_claude_version(path: &str) -> Result, String> { /// Extract version string from command output fn extract_version_from_output(stdout: &[u8]) -> Option { let output_str = String::from_utf8_lossy(stdout); - + // Debug log the raw output debug!("Raw version output: {:?}", output_str); - + // Use regex to directly extract version pattern (e.g., "1.0.41") // This pattern matches: // - One or more digits, followed by @@ -378,8 +378,9 @@ fn extract_version_from_output(stdout: &[u8]) -> Option { // - A dot, followed by // - One or more digits // - Optionally followed by pre-release/build metadata - let version_regex = regex::Regex::new(r"(\d+\.\d+\.\d+(?:-[a-zA-Z0-9.-]+)?(?:\+[a-zA-Z0-9.-]+)?)").ok()?; - + let version_regex = + regex::Regex::new(r"(\d+\.\d+\.\d+(?:-[a-zA-Z0-9.-]+)?(?:\+[a-zA-Z0-9.-]+)?)").ok()?; + if let Some(captures) = version_regex.captures(&output_str) { if let Some(version_match) = captures.get(1) { let version = version_match.as_str().to_string(); @@ -387,7 +388,7 @@ fn extract_version_from_output(stdout: &[u8]) -> Option { return Some(version); } } - + debug!("No version found in output"); None } @@ -467,7 +468,7 @@ fn compare_versions(a: &str, b: &str) -> Ordering { /// This ensures commands like Claude can find Node.js and other dependencies pub fn create_command_with_env(program: &str) -> Command { let mut cmd = Command::new(program); - + info!("Creating command for: {}", program); // Inherit essential environment variables from parent process @@ -495,7 +496,7 @@ pub fn create_command_with_env(program: &str) -> Command { cmd.env(&key, &value); } } - + // Log proxy-related environment variables for debugging info!("Command will use proxy settings:"); if let Ok(http_proxy) = std::env::var("HTTP_PROXY") { @@ -518,7 +519,7 @@ pub fn create_command_with_env(program: &str) -> Command { } } } - + // Add Homebrew support if the program is in a Homebrew directory if program.contains("/homebrew/") || program.contains("/opt/homebrew/") { if let Some(program_dir) = std::path::Path::new(program).parent() { @@ -527,7 +528,10 @@ pub fn create_command_with_env(program: &str) -> Command { let homebrew_bin_str = program_dir.to_string_lossy(); if !current_path.contains(&homebrew_bin_str.as_ref()) { let new_path = format!("{}:{}", homebrew_bin_str, current_path); - debug!("Adding Homebrew bin directory to PATH: {}", homebrew_bin_str); + debug!( + "Adding Homebrew bin directory to PATH: {}", + homebrew_bin_str + ); cmd.env("PATH", new_path); } } diff --git a/src-tauri/src/commands/agents.rs b/src-tauri/src/commands/agents.rs index b988ce713..36513a7a4 100644 --- a/src-tauri/src/commands/agents.rs +++ b/src-tauri/src/commands/agents.rs @@ -179,7 +179,10 @@ pub async fn read_session_jsonl(session_id: &str, project_path: &str) -> Result< let session_file = project_dir.join(format!("{}.jsonl", session_id)); if !session_file.exists() { - return Err(format!("Session file not found: {}", session_file.display())); + return Err(format!( + "Session file not found: {}", + session_file.display() + )); } match tokio::fs::read_to_string(&session_file).await { @@ -317,7 +320,6 @@ pub fn init_database(app: &AppHandle) -> SqliteResult { [], )?; - // Create settings table for app-wide settings conn.execute( "CREATE TABLE IF NOT EXISTS app_settings ( @@ -690,38 +692,41 @@ pub async fn execute_agent( // Get the agent from database let agent = get_agent(db.clone(), agent_id).await?; let execution_model = model.unwrap_or(agent.model.clone()); - + // Create .claude/settings.json with agent hooks if it doesn't exist if let Some(hooks_json) = &agent.hooks { let claude_dir = std::path::Path::new(&project_path).join(".claude"); let settings_path = claude_dir.join("settings.json"); - + // Create .claude directory if it doesn't exist if !claude_dir.exists() { std::fs::create_dir_all(&claude_dir) .map_err(|e| format!("Failed to create .claude directory: {}", e))?; info!("Created .claude directory at: {:?}", claude_dir); } - + // Check if settings.json already exists if !settings_path.exists() { // Parse the hooks JSON let hooks: serde_json::Value = serde_json::from_str(hooks_json) .map_err(|e| format!("Failed to parse agent hooks: {}", e))?; - + // Create a settings object with just the hooks let settings = serde_json::json!({ "hooks": hooks }); - + // Write the settings file let settings_content = serde_json::to_string_pretty(&settings) .map_err(|e| format!("Failed to serialize settings: {}", e))?; - + std::fs::write(&settings_path, settings_content) .map_err(|e| format!("Failed to write settings.json: {}", e))?; - - info!("Created settings.json with agent hooks at: {:?}", settings_path); + + info!( + "Created settings.json with agent hooks at: {:?}", + settings_path + ); } else { info!("settings.json already exists at: {:?}", settings_path); } @@ -775,7 +780,8 @@ pub async fn execute_agent( execution_model, db, registry, - ).await + ) + .await } /// Creates a system binary command for agent execution @@ -785,17 +791,17 @@ fn create_agent_system_command( project_path: &str, ) -> Command { let mut cmd = create_command_with_env(claude_path); - + // Add all arguments for arg in args { cmd.arg(arg); } - + cmd.current_dir(project_path) .stdin(Stdio::null()) .stdout(Stdio::piped()) .stderr(Stdio::piped()); - + cmd } @@ -905,14 +911,15 @@ async fn spawn_agent_system( // Extract session ID from JSONL output if let Ok(json) = serde_json::from_str::(&line) { // Claude Code uses "session_id" (underscore), not "sessionId" - if json.get("type").and_then(|t| t.as_str()) == Some("system") && - json.get("subtype").and_then(|s| s.as_str()) == Some("init") { + if json.get("type").and_then(|t| t.as_str()) == Some("system") + && json.get("subtype").and_then(|s| s.as_str()) == Some("init") + { if let Some(sid) = json.get("session_id").and_then(|s| s.as_str()) { if let Ok(mut current_session_id) = session_id_clone.lock() { if current_session_id.is_empty() { *current_session_id = sid.to_string(); info!("🔑 Extracted session ID: {}", sid); - + // Update database immediately with session ID if let Ok(conn) = Connection::open(&db_path_for_stdout) { match conn.execute( @@ -925,7 +932,10 @@ async fn spawn_agent_system( } } Err(e) => { - error!("❌ Failed to update session ID immediately: {}", e); + error!( + "❌ Failed to update session ID immediately: {}", + e + ); } } } @@ -1085,7 +1095,10 @@ async fn spawn_agent_system( // Update the run record with session ID and mark as completed - open a new connection if let Ok(conn) = Connection::open(&db_path_for_monitor) { - info!("🔄 Updating database with extracted session ID: {}", extracted_session_id); + info!( + "🔄 Updating database with extracted session ID: {}", + extracted_session_id + ); match conn.execute( "UPDATE agent_runs SET session_id = ?1, status = 'completed', completed_at = CURRENT_TIMESTAMP WHERE id = ?2", params![extracted_session_id, run_id], @@ -1102,7 +1115,10 @@ async fn spawn_agent_system( } } } else { - error!("❌ Failed to open database to update session ID for run {}", run_id); + error!( + "❌ Failed to open database to update session ID for run {}", + run_id + ); } // Cleanup will be handled by the cleanup_finished_processes function @@ -1162,10 +1178,8 @@ pub async fn list_running_sessions( // Cross-check with the process registry to ensure accuracy // Get actually running processes from the registry let registry_processes = registry.0.get_running_agent_processes()?; - let registry_run_ids: std::collections::HashSet = registry_processes - .iter() - .map(|p| p.run_id) - .collect(); + let registry_run_ids: std::collections::HashSet = + registry_processes.iter().map(|p| p.run_id).collect(); // Filter out any database entries that aren't actually running in the registry // This handles cases where processes crashed without updating the database @@ -1358,7 +1372,7 @@ pub async fn get_session_output( // Find the correct project directory by searching for the session file let projects_dir = claude_dir.join("projects"); - + // Check if projects directory exists if !projects_dir.exists() { log::error!("Projects directory not found at: {:?}", projects_dir); @@ -1367,15 +1381,18 @@ pub async fn get_session_output( // Search for the session file in all project directories let mut session_file_path = None; - log::info!("Searching for session file {} in all project directories", run.session_id); - + log::info!( + "Searching for session file {} in all project directories", + run.session_id + ); + if let Ok(entries) = std::fs::read_dir(&projects_dir) { for entry in entries.filter_map(Result::ok) { let path = entry.path(); if path.is_dir() { let dir_name = path.file_name().unwrap_or_default().to_string_lossy(); log::debug!("Checking project directory: {}", dir_name); - + let potential_session_file = path.join(format!("{}.jsonl", run.session_id)); if potential_session_file.exists() { log::info!("Found session file at: {:?}", potential_session_file); @@ -1395,7 +1412,11 @@ pub async fn get_session_output( match tokio::fs::read_to_string(&session_path).await { Ok(content) => Ok(content), Err(e) => { - log::error!("Failed to read session file {}: {}", session_path.display(), e); + log::error!( + "Failed to read session file {}: {}", + session_path.display(), + e + ); // Fallback to live output if file read fails let live_output = registry.0.get_live_output(run_id)?; Ok(live_output) @@ -1403,7 +1424,10 @@ pub async fn get_session_output( } } else { // If session file not found, try the old method as fallback - log::warn!("Session file not found for {}, trying legacy method", run.session_id); + log::warn!( + "Session file not found for {}, trying legacy method", + run.session_id + ); match read_session_jsonl(&run.session_id, &run.project_path).await { Ok(content) => Ok(content), Err(_) => { @@ -1916,7 +1940,7 @@ pub async fn load_agent_session_history( .join(".claude"); let projects_dir = claude_dir.join("projects"); - + if !projects_dir.exists() { log::error!("Projects directory not found at: {:?}", projects_dir); return Err("Projects directory not found".to_string()); @@ -1924,15 +1948,18 @@ pub async fn load_agent_session_history( // Search for the session file in all project directories let mut session_file_path = None; - log::info!("Searching for session file {} in all project directories", session_id); - + log::info!( + "Searching for session file {} in all project directories", + session_id + ); + if let Ok(entries) = std::fs::read_dir(&projects_dir) { for entry in entries.filter_map(Result::ok) { let path = entry.path(); if path.is_dir() { let dir_name = path.file_name().unwrap_or_default().to_string_lossy(); log::debug!("Checking project directory: {}", dir_name); - + let potential_session_file = path.join(format!("{}.jsonl", session_id)); if potential_session_file.exists() { log::info!("Found session file at: {:?}", potential_session_file); diff --git a/src-tauri/src/commands/claude.rs b/src-tauri/src/commands/claude.rs index 94ad3c55e..58e5aace3 100644 --- a/src-tauri/src/commands/claude.rs +++ b/src-tauri/src/commands/claude.rs @@ -1,4 +1,5 @@ use anyhow::{Context, Result}; +use base64; use serde::{Deserialize, Serialize}; use std::fs; use std::io::{BufRead, BufReader}; @@ -10,7 +11,6 @@ use tauri::{AppHandle, Emitter, Manager}; use tokio::process::{Child, Command}; use tokio::sync::Mutex; - /// Global state to track current Claude process pub struct ClaudeProcessState { pub current_process: Arc>>, @@ -262,7 +262,7 @@ fn create_command_with_env(program: &str) -> Command { } } } - + // Add Homebrew support if the program is in a Homebrew directory if program.contains("/homebrew/") || program.contains("/opt/homebrew/") { if let Some(program_dir) = std::path::Path::new(program).parent() { @@ -270,7 +270,10 @@ fn create_command_with_env(program: &str) -> Command { let homebrew_bin_str = program_dir.to_string_lossy(); if !current_path.contains(&homebrew_bin_str.as_ref()) { let new_path = format!("{}:{}", homebrew_bin_str, current_path); - log::debug!("Adding Homebrew bin directory to PATH: {}", homebrew_bin_str); + log::debug!( + "Adding Homebrew bin directory to PATH: {}", + homebrew_bin_str + ); tokio_cmd.env("PATH", new_path); } } @@ -280,22 +283,18 @@ fn create_command_with_env(program: &str) -> Command { } /// Creates a system binary command with the given arguments -fn create_system_command( - claude_path: &str, - args: Vec, - project_path: &str, -) -> Command { +fn create_system_command(claude_path: &str, args: Vec, project_path: &str) -> Command { let mut cmd = create_command_with_env(claude_path); - + // Add all arguments for arg in args { cmd.arg(arg); } - + cmd.current_dir(project_path) .stdout(Stdio::piped()) .stderr(Stdio::piped()); - + cmd } @@ -307,7 +306,6 @@ pub async fn get_home_directory() -> Result { .ok_or_else(|| "Could not determine home directory".to_string()) } - /// Lists all projects in the ~/.claude/projects directory #[tauri::command] pub async fn list_projects() -> Result, String> { @@ -361,7 +359,7 @@ pub async fn list_projects() -> Result, String> { // List all JSONL files (sessions) in this project directory let mut sessions = Vec::new(); let mut most_recent_session: Option = None; - + if let Ok(session_entries) = fs::read_dir(&path) { for session_entry in session_entries.flatten() { let session_path = session_entry.path(); @@ -371,7 +369,7 @@ pub async fn list_projects() -> Result, String> { if let Some(session_id) = session_path.file_stem().and_then(|s| s.to_str()) { sessions.push(session_id.to_string()); - + // Track the most recent session timestamp if let Ok(metadata) = fs::metadata(&session_path) { let modified = metadata @@ -380,7 +378,7 @@ pub async fn list_projects() -> Result, String> { .duration_since(UNIX_EPOCH) .unwrap_or_default() .as_secs(); - + most_recent_session = Some(match most_recent_session { Some(current) => current.max(modified), None => modified, @@ -420,31 +418,31 @@ pub async fn list_projects() -> Result, String> { #[tauri::command] pub async fn create_project(path: String) -> Result { log::info!("Creating project for path: {}", path); - + // Encode the path to create a project ID let project_id = path.replace('/', "-"); - + // Get claude directory let claude_dir = get_claude_dir().map_err(|e| e.to_string())?; let projects_dir = claude_dir.join("projects"); - + // Create projects directory if it doesn't exist if !projects_dir.exists() { fs::create_dir_all(&projects_dir) .map_err(|e| format!("Failed to create projects directory: {}", e))?; } - + // Create project directory if it doesn't exist let project_dir = projects_dir.join(&project_id); if !project_dir.exists() { fs::create_dir_all(&project_dir) .map_err(|e| format!("Failed to create project directory: {}", e))?; } - + // Get creation time let metadata = fs::metadata(&project_dir) .map_err(|e| format!("Failed to read directory metadata: {}", e))?; - + let created_at = metadata .created() .or_else(|_| metadata.modified()) @@ -452,7 +450,7 @@ pub async fn create_project(path: String) -> Result { .duration_since(UNIX_EPOCH) .unwrap_or_default() .as_secs(); - + // Return the created project Ok(Project { id: project_id, @@ -648,7 +646,8 @@ pub async fn check_claude_version(app: AppHandle) -> Result Result { let stdout = String::from_utf8_lossy(&output.stdout).to_string(); let stderr = String::from_utf8_lossy(&output.stderr).to_string(); - + // Use regex to directly extract version pattern (e.g., "1.0.41") - let version_regex = regex::Regex::new(r"(\d+\.\d+\.\d+(?:-[a-zA-Z0-9.-]+)?(?:\+[a-zA-Z0-9.-]+)?)").ok(); - + let version_regex = + regex::Regex::new(r"(\d+\.\d+\.\d+(?:-[a-zA-Z0-9.-]+)?(?:\+[a-zA-Z0-9.-]+)?)") + .ok(); + let version = if let Some(regex) = version_regex { - regex.captures(&stdout) + regex + .captures(&stdout) .and_then(|captures| captures.get(1)) .map(|m| m.as_str().to_string()) } else { None }; - + let full_output = if stderr.is_empty() { stdout.clone() } else { @@ -907,8 +909,6 @@ pub async fn load_session_history( Ok(messages) } - - /// Execute a new interactive Claude Code session with streaming output #[tauri::command] pub async fn execute_claude_code( @@ -924,7 +924,7 @@ pub async fn execute_claude_code( ); let claude_path = find_claude_binary(&app)?; - + let args = vec![ "-p".to_string(), prompt.clone(), @@ -955,7 +955,7 @@ pub async fn continue_claude_code( ); let claude_path = find_claude_binary(&app)?; - + let args = vec![ "-c".to_string(), // Continue flag "-p".to_string(), @@ -989,7 +989,7 @@ pub async fn resume_claude_code( ); let claude_path = find_claude_binary(&app)?; - + let args = vec![ "--resume".to_string(), session_id.clone(), @@ -1026,8 +1026,12 @@ pub async fn cancel_claude_execution( let registry = app.state::(); match registry.0.get_claude_session_by_id(sid) { Ok(Some(process_info)) => { - log::info!("Found process in registry for session {}: run_id={}, PID={}", - sid, process_info.run_id, process_info.pid); + log::info!( + "Found process in registry for session {}: run_id={}, PID={}", + sid, + process_info.run_id, + process_info.pid + ); match registry.0.kill_process(process_info.run_id).await { Ok(success) => { if success { @@ -1060,7 +1064,10 @@ pub async fn cancel_claude_execution( if let Some(mut child) = current_process.take() { // Try to get the PID before killing let pid = child.id(); - log::info!("Attempting to kill Claude process via ClaudeProcessState with PID: {:?}", pid); + log::info!( + "Attempting to kill Claude process via ClaudeProcessState with PID: {:?}", + pid + ); // Kill the process match child.kill().await { @@ -1069,8 +1076,11 @@ pub async fn cancel_claude_execution( killed = true; } Err(e) => { - log::error!("Failed to kill Claude process via ClaudeProcessState: {}", e); - + log::error!( + "Failed to kill Claude process via ClaudeProcessState: {}", + e + ); + // Method 3: If we have a PID, try system kill as last resort if let Some(pid) = pid { log::info!("Attempting system kill as last resort for PID: {}", pid); @@ -1083,7 +1093,7 @@ pub async fn cancel_claude_execution( .args(["-KILL", &pid.to_string()]) .output() }; - + match kill_result { Ok(output) if output.status.success() => { log::info!("Successfully killed process via system command"); @@ -1116,18 +1126,18 @@ pub async fn cancel_claude_execution( tokio::time::sleep(tokio::time::Duration::from_millis(100)).await; let _ = app.emit(&format!("claude-complete:{}", sid), false); } - + // Also emit generic events for backward compatibility let _ = app.emit("claude-cancelled", true); tokio::time::sleep(tokio::time::Duration::from_millis(100)).await; let _ = app.emit("claude-complete", false); - + if killed { log::info!("Claude process cancellation completed successfully"); } else if !attempted_methods.is_empty() { log::warn!("Claude process cancellation attempted but process may have already exited. Attempted methods: {:?}", attempted_methods); } - + Ok(()) } @@ -1154,9 +1164,15 @@ pub async fn get_claude_session_output( } /// Helper function to spawn Claude process and handle streaming -async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, model: String, project_path: String) -> Result<(), String> { - use tokio::io::{AsyncBufReadExt, BufReader}; +async fn spawn_claude_process( + app: AppHandle, + mut cmd: Command, + prompt: String, + model: String, + project_path: String, +) -> Result<(), String> { use std::sync::Mutex; + use tokio::io::{AsyncBufReadExt, BufReader}; // Spawn the process let mut child = cmd @@ -1169,10 +1185,7 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, // Get the child PID for logging let pid = child.id().unwrap_or(0); - log::info!( - "Spawned Claude process with PID: {:?}", - pid - ); + log::info!("Spawned Claude process with PID: {:?}", pid); // Create readers first (before moving child) let stdout_reader = BufReader::new(stdout); @@ -1207,7 +1220,7 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, let mut lines = stdout_reader.lines(); while let Ok(Some(line)) = lines.next_line().await { log::debug!("Claude stdout: {}", line); - + // Parse the line to check for init message with session ID if let Ok(msg) = serde_json::from_str::(&line) { if msg["type"] == "system" && msg["subtype"] == "init" { @@ -1216,7 +1229,7 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, if session_id_guard.is_none() { *session_id_guard = Some(claude_session_id.to_string()); log::info!("Extracted Claude session ID: {}", claude_session_id); - + // Now register with ProcessRegistry using Claude's session ID match registry_clone.register_claude_session( claude_session_id.to_string(), @@ -1238,12 +1251,12 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, } } } - + // Store live output in registry if we have a run_id if let Some(run_id) = *run_id_holder_clone.lock().unwrap() { let _ = registry_clone.append_live_output(run_id, &line); } - + // Emit the line to the frontend with session isolation if we have session ID if let Some(ref session_id) = *session_id_holder_clone.lock().unwrap() { let _ = app_handle.emit(&format!("claude-output:{}", session_id), &line); @@ -1287,10 +1300,8 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, // Add a small delay to ensure all messages are processed tokio::time::sleep(tokio::time::Duration::from_millis(100)).await; if let Some(ref session_id) = *session_id_holder_clone3.lock().unwrap() { - let _ = app_handle_wait.emit( - &format!("claude-complete:{}", session_id), - status.success(), - ); + let _ = app_handle_wait + .emit(&format!("claude-complete:{}", session_id), status.success()); } // Also emit to the generic event for backward compatibility let _ = app_handle_wait.emit("claude-complete", status.success()); @@ -1300,8 +1311,8 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, // Add a small delay to ensure all messages are processed tokio::time::sleep(tokio::time::Duration::from_millis(100)).await; if let Some(ref session_id) = *session_id_holder_clone3.lock().unwrap() { - let _ = app_handle_wait - .emit(&format!("claude-complete:{}", session_id), false); + let _ = + app_handle_wait.emit(&format!("claude-complete:{}", session_id), false); } // Also emit to the generic event for backward compatibility let _ = app_handle_wait.emit("claude-complete", false); @@ -1321,7 +1332,6 @@ async fn spawn_claude_process(app: AppHandle, mut cmd: Command, prompt: String, Ok(()) } - /// Lists files and directories in a given path #[tauri::command] pub async fn list_directory_contents(directory_path: String) -> Result, String> { @@ -2038,78 +2048,92 @@ pub async fn track_session_messages( /// Gets hooks configuration from settings at specified scope #[tauri::command] -pub async fn get_hooks_config(scope: String, project_path: Option) -> Result { - log::info!("Getting hooks config for scope: {}, project: {:?}", scope, project_path); +pub async fn get_hooks_config( + scope: String, + project_path: Option, +) -> Result { + log::info!( + "Getting hooks config for scope: {}, project: {:?}", + scope, + project_path + ); let settings_path = match scope.as_str() { - "user" => { - get_claude_dir() - .map_err(|e| e.to_string())? - .join("settings.json") - }, + "user" => get_claude_dir() + .map_err(|e| e.to_string())? + .join("settings.json"), "project" => { let path = project_path.ok_or("Project path required for project scope")?; PathBuf::from(path).join(".claude").join("settings.json") - }, + } "local" => { let path = project_path.ok_or("Project path required for local scope")?; - PathBuf::from(path).join(".claude").join("settings.local.json") - }, - _ => return Err("Invalid scope".to_string()) + PathBuf::from(path) + .join(".claude") + .join("settings.local.json") + } + _ => return Err("Invalid scope".to_string()), }; if !settings_path.exists() { - log::info!("Settings file does not exist at {:?}, returning empty hooks", settings_path); + log::info!( + "Settings file does not exist at {:?}, returning empty hooks", + settings_path + ); return Ok(serde_json::json!({})); } let content = fs::read_to_string(&settings_path) .map_err(|e| format!("Failed to read settings: {}", e))?; - - let settings: serde_json::Value = serde_json::from_str(&content) - .map_err(|e| format!("Failed to parse settings: {}", e))?; - - Ok(settings.get("hooks").cloned().unwrap_or(serde_json::json!({}))) + + let settings: serde_json::Value = + serde_json::from_str(&content).map_err(|e| format!("Failed to parse settings: {}", e))?; + + Ok(settings + .get("hooks") + .cloned() + .unwrap_or(serde_json::json!({}))) } /// Updates hooks configuration in settings at specified scope #[tauri::command] pub async fn update_hooks_config( - scope: String, + scope: String, hooks: serde_json::Value, - project_path: Option + project_path: Option, ) -> Result { - log::info!("Updating hooks config for scope: {}, project: {:?}", scope, project_path); + log::info!( + "Updating hooks config for scope: {}, project: {:?}", + scope, + project_path + ); let settings_path = match scope.as_str() { - "user" => { - get_claude_dir() - .map_err(|e| e.to_string())? - .join("settings.json") - }, + "user" => get_claude_dir() + .map_err(|e| e.to_string())? + .join("settings.json"), "project" => { let path = project_path.ok_or("Project path required for project scope")?; let claude_dir = PathBuf::from(path).join(".claude"); fs::create_dir_all(&claude_dir) .map_err(|e| format!("Failed to create .claude directory: {}", e))?; claude_dir.join("settings.json") - }, + } "local" => { let path = project_path.ok_or("Project path required for local scope")?; let claude_dir = PathBuf::from(path).join(".claude"); fs::create_dir_all(&claude_dir) .map_err(|e| format!("Failed to create .claude directory: {}", e))?; claude_dir.join("settings.local.json") - }, - _ => return Err("Invalid scope".to_string()) + } + _ => return Err("Invalid scope".to_string()), }; // Read existing settings or create new let mut settings = if settings_path.exists() { let content = fs::read_to_string(&settings_path) .map_err(|e| format!("Failed to read settings: {}", e))?; - serde_json::from_str(&content) - .map_err(|e| format!("Failed to parse settings: {}", e))? + serde_json::from_str(&content).map_err(|e| format!("Failed to parse settings: {}", e))? } else { serde_json::json!({}) }; @@ -2120,7 +2144,7 @@ pub async fn update_hooks_config( // Write back with pretty formatting let json_string = serde_json::to_string_pretty(&settings) .map_err(|e| format!("Failed to serialize settings: {}", e))?; - + fs::write(&settings_path, json_string) .map_err(|e| format!("Failed to write settings: {}", e))?; @@ -2135,9 +2159,9 @@ pub async fn validate_hook_command(command: String) -> Result { if output.status.success() { @@ -2153,6 +2177,249 @@ pub async fn validate_hook_command(command: String) -> Result Err(format!("Failed to validate command: {}", e)) + Err(e) => Err(format!("Failed to validate command: {}", e)), + } +} + +/// Deletes a session file and its associated data +/// +/// This function removes a session's JSONL file from the project directory +/// and also cleans up any associated todo data if it exists. +/// +/// # Arguments +/// * `session_id` - The UUID of the session to delete +/// * `project_id` - The ID of the project containing the session +/// +/// # Returns +/// * `Ok(String)` - Success message with session ID +/// * `Err(String)` - Error message if deletion fails +/// +/// # Errors +/// * Project directory not found +/// * Permission denied when deleting files +/// * File system errors during deletion +#[tauri::command] +pub async fn delete_session(session_id: String, project_id: String) -> Result { + log::info!( + "Deleting session: {} from project: {}", + session_id, + project_id + ); + + let claude_dir = get_claude_dir().map_err(|e| e.to_string())?; + let project_dir = claude_dir.join("projects").join(&project_id); + + // Check if project directory exists + if !project_dir.exists() { + return Err(format!("Project directory not found: {}", project_id)); + } + + // Delete the session JSONL file + let session_file = project_dir.join(format!("{}.jsonl", session_id)); + if session_file.exists() { + fs::remove_file(&session_file) + .map_err(|e| format!("Failed to delete session file: {}", e))?; + log::info!("Deleted session file: {:?}", session_file); + } else { + log::warn!("Session file not found: {:?}", session_file); + } + + // Delete associated todo data if it exists + let todos_dir = project_dir.join("todos"); + if todos_dir.exists() { + let todo_file = todos_dir.join(format!("{}.json", session_id)); + if todo_file.exists() { + fs::remove_file(&todo_file) + .map_err(|e| format!("Failed to delete todo file: {}", e))?; + log::info!("Deleted todo file: {:?}", todo_file); + } + } + + Ok(format!("Session {} deleted successfully", session_id)) +} + +/// Deletes multiple sessions and their associated data +/// +/// This function deletes multiple session files and their associated todo data. +/// +/// # Arguments +/// * `session_ids` - A vector of session IDs to delete +/// * `project_id` - The ID of the project containing the sessions +/// +/// # Returns +/// * `Ok(String)` - Success message with count of deleted sessions +/// * `Err(String)` - Error message if deletion fails +/// +/// # Errors +/// * Project directory not found +/// * Permission denied when deleting files +/// * File system errors during deletion +#[tauri::command] +pub async fn delete_sessions_bulk( + session_ids: Vec, + project_id: String, +) -> Result { + log::info!( + "Bulk deleting {} sessions from project: {}", + session_ids.len(), + project_id + ); + + let claude_dir = get_claude_dir().map_err(|e| e.to_string())?; + let project_dir = claude_dir.join("projects").join(&project_id); + + // Check if project directory exists + if !project_dir.exists() { + return Err(format!("Project directory not found: {}", project_id)); + } + + let mut deleted_count = 0; + let mut errors = Vec::new(); + + for session_id in session_ids { + // Delete the session JSONL file + let session_file = project_dir.join(format!("{}.jsonl", session_id)); + if session_file.exists() { + match fs::remove_file(&session_file) { + Ok(_) => { + log::info!("Deleted session file: {:?}", session_file); + deleted_count += 1; + } + Err(e) => { + let error_msg = format!("Failed to delete session {}: {}", session_id, e); + log::error!("{}", error_msg); + errors.push(error_msg); + continue; + } + } + } else { + log::warn!("Session file not found: {:?}", session_file); + } + + // Delete associated todo data if it exists + let todos_dir = project_dir.join("todos"); + if todos_dir.exists() { + let todo_file = todos_dir.join(format!("{}.json", session_id)); + if todo_file.exists() { + if let Err(e) = fs::remove_file(&todo_file) { + log::warn!("Failed to delete todo file for session {}: {}", session_id, e); + // Don't count this as a fatal error since the main session was deleted + } else { + log::info!("Deleted todo file: {:?}", todo_file); + } + } + } + } + + if !errors.is_empty() { + return Err(format!( + "Deleted {} sessions successfully, but {} failed: {}", + deleted_count, + errors.len(), + errors.join("; ") + )); + } + + Ok(format!("{} sessions deleted successfully", deleted_count)) +} + +/// Saves a pasted image (base64 data) to a file and returns the relative path +/// +/// This function takes base64 image data and saves it as a file in the actual project's +/// images directory (not the Claude storage), so Claude can access it directly. +/// +/// # Arguments +/// * `project_id` - The ID of the project to save the image in +/// * `session_id` - The ID of the current session (used for unique naming) +/// * `base64_data` - The base64 data URL (e.g., "data:image/png;base64,...") +/// +/// # Returns +/// * `Ok(String)` - The relative path to the saved image (e.g., "./images/session_123_image_1.png") +/// * `Err(String)` - Error message if saving fails +/// +/// # Errors +/// * Project directory not found +/// * Invalid base64 data format +/// * File system errors during image creation +/// * Permission denied when creating directories or files +#[tauri::command] +pub async fn save_pasted_image( + project_id: String, + session_id: String, + base64_data: String, +) -> Result { + log::info!( + "Saving pasted image for project: {}, session: {}", + project_id, + session_id + ); + + let claude_dir = get_claude_dir().map_err(|e| e.to_string())?; + let claude_project_dir = claude_dir.join("projects").join(&project_id); + + // Check if Claude project directory exists + if !claude_project_dir.exists() { + return Err(format!("Project directory not found: {}", project_id)); } + + // Get the actual project path from the Claude project directory + let actual_project_path = get_project_path_from_sessions(&claude_project_dir) + .unwrap_or_else(|_| decode_project_path(&project_id)); + + // Create images directory in the actual project directory (where Claude can find it) + let images_dir = std::path::PathBuf::from(&actual_project_path).join("images"); + if !images_dir.exists() { + fs::create_dir_all(&images_dir) + .map_err(|e| format!("Failed to create images directory: {}", e))?; + log::info!("Created images directory: {:?}", images_dir); + } + + // Parse the base64 data URL + if !base64_data.starts_with("data:image/") { + return Err("Invalid image data format".to_string()); + } + + // Extract file extension from MIME type + let mime_part = base64_data + .split(';') + .next() + .ok_or("Invalid data URL format")?; + let extension = match mime_part { + "data:image/png" => "png", + "data:image/jpeg" => "jpg", + "data:image/jpg" => "jpg", + "data:image/gif" => "gif", + "data:image/webp" => "webp", + _ => "png", // Default to PNG + }; + + // Extract base64 content + let base64_content = base64_data + .split(',') + .nth(1) + .ok_or("Invalid base64 data format")?; + + // Decode base64 + use base64::Engine; + let image_data = base64::engine::general_purpose::STANDARD + .decode(base64_content) + .map_err(|e| format!("Failed to decode base64: {}", e))?; + + // Generate unique filename + let timestamp = std::time::SystemTime::now() + .duration_since(std::time::UNIX_EPOCH) + .unwrap() + .as_millis(); + let filename = format!("session_{}_image_{}.{}", session_id, timestamp, extension); + let image_path = images_dir.join(&filename); + + // Write image file + fs::write(&image_path, image_data) + .map_err(|e| format!("Failed to write image file: {}", e))?; + + log::info!("Saved image: {:?}", image_path); + + // Return relative path for use in prompts + let relative_path = format!("./images/{}", filename); + Ok(relative_path) } diff --git a/src-tauri/src/commands/mod.rs b/src-tauri/src/commands/mod.rs index a0fa7e897..f4c3552c0 100644 --- a/src-tauri/src/commands/mod.rs +++ b/src-tauri/src/commands/mod.rs @@ -1,7 +1,7 @@ pub mod agents; pub mod claude; pub mod mcp; -pub mod usage; -pub mod storage; -pub mod slash_commands; pub mod proxy; +pub mod slash_commands; +pub mod storage; +pub mod usage; diff --git a/src-tauri/src/commands/proxy.rs b/src-tauri/src/commands/proxy.rs index e2454ecfe..2192e0efe 100644 --- a/src-tauri/src/commands/proxy.rs +++ b/src-tauri/src/commands/proxy.rs @@ -1,6 +1,6 @@ +use rusqlite::params; use serde::{Deserialize, Serialize}; use tauri::State; -use rusqlite::params; use crate::commands::agents::AgentDb; @@ -29,9 +29,9 @@ impl Default for ProxySettings { #[tauri::command] pub async fn get_proxy_settings(db: State<'_, AgentDb>) -> Result { let conn = db.0.lock().map_err(|e| e.to_string())?; - + let mut settings = ProxySettings::default(); - + // Query each proxy setting let keys = vec![ ("proxy_enabled", "enabled"), @@ -40,7 +40,7 @@ pub async fn get_proxy_settings(db: State<'_, AgentDb>) -> Result) -> Result Result<(), String> { let conn = db.0.lock().map_err(|e| e.to_string())?; - + // Save each setting let values = vec![ ("proxy_enabled", settings.enabled.to_string()), - ("proxy_http", settings.http_proxy.clone().unwrap_or_default()), - ("proxy_https", settings.https_proxy.clone().unwrap_or_default()), + ( + "proxy_http", + settings.http_proxy.clone().unwrap_or_default(), + ), + ( + "proxy_https", + settings.https_proxy.clone().unwrap_or_default(), + ), ("proxy_no", settings.no_proxy.clone().unwrap_or_default()), ("proxy_all", settings.all_proxy.clone().unwrap_or_default()), ]; - + for (key, value) in values { conn.execute( "INSERT OR REPLACE INTO app_settings (key, value) VALUES (?1, ?2)", params![key, value], - ).map_err(|e| format!("Failed to save {}: {}", key, e))?; + ) + .map_err(|e| format!("Failed to save {}: {}", key, e))?; } - + // Apply the proxy settings immediately to the current process apply_proxy_settings(&settings); - + Ok(()) } /// Apply proxy settings as environment variables pub fn apply_proxy_settings(settings: &ProxySettings) { log::info!("Applying proxy settings: enabled={}", settings.enabled); - + if !settings.enabled { // Clear proxy environment variables if disabled log::info!("Clearing proxy environment variables"); @@ -109,7 +116,7 @@ pub fn apply_proxy_settings(settings: &ProxySettings) { std::env::remove_var("all_proxy"); return; } - + // Ensure NO_PROXY includes localhost by default let mut no_proxy_list = vec!["localhost", "127.0.0.1", "::1", "0.0.0.0"]; if let Some(user_no_proxy) = &settings.no_proxy { @@ -118,7 +125,7 @@ pub fn apply_proxy_settings(settings: &ProxySettings) { } } let no_proxy_value = no_proxy_list.join(","); - + // Set proxy environment variables (uppercase is standard) if let Some(http_proxy) = &settings.http_proxy { if !http_proxy.is_empty() { @@ -126,25 +133,25 @@ pub fn apply_proxy_settings(settings: &ProxySettings) { std::env::set_var("HTTP_PROXY", http_proxy); } } - + if let Some(https_proxy) = &settings.https_proxy { if !https_proxy.is_empty() { log::info!("Setting HTTPS_PROXY={}", https_proxy); std::env::set_var("HTTPS_PROXY", https_proxy); } } - + // Always set NO_PROXY to include localhost log::info!("Setting NO_PROXY={}", no_proxy_value); std::env::set_var("NO_PROXY", &no_proxy_value); - + if let Some(all_proxy) = &settings.all_proxy { if !all_proxy.is_empty() { log::info!("Setting ALL_PROXY={}", all_proxy); std::env::set_var("ALL_PROXY", all_proxy); } } - + // Log current proxy environment variables for debugging log::info!("Current proxy environment variables:"); for (key, value) in std::env::vars() { @@ -152,4 +159,4 @@ pub fn apply_proxy_settings(settings: &ProxySettings) { log::info!(" {}={}", key, value); } } -} \ No newline at end of file +} diff --git a/src-tauri/src/commands/slash_commands.rs b/src-tauri/src/commands/slash_commands.rs index dbf12e608..6f77309ea 100644 --- a/src-tauri/src/commands/slash_commands.rs +++ b/src-tauri/src/commands/slash_commands.rs @@ -45,13 +45,13 @@ struct CommandFrontmatter { /// Parse a markdown file with optional YAML frontmatter fn parse_markdown_with_frontmatter(content: &str) -> Result<(Option, String)> { let lines: Vec<&str> = content.lines().collect(); - + // Check if the file starts with YAML frontmatter if lines.is_empty() || lines[0] != "---" { // No frontmatter return Ok((None, content.to_string())); } - + // Find the end of frontmatter let mut frontmatter_end = None; for (i, line) in lines.iter().enumerate().skip(1) { @@ -60,12 +60,12 @@ fn parse_markdown_with_frontmatter(content: &str) -> Result<(Option(&frontmatter_content) { Ok(frontmatter) => Ok((Some(frontmatter), body_content)), @@ -86,20 +86,20 @@ fn extract_command_info(file_path: &Path, base_path: &Path) -> Result<(String, O let relative_path = file_path .strip_prefix(base_path) .context("Failed to get relative path")?; - + // Remove .md extension let path_without_ext = relative_path .with_extension("") .to_string_lossy() .to_string(); - + // Split into components let components: Vec<&str> = path_without_ext.split('/').collect(); - + if components.is_empty() { return Err(anyhow::anyhow!("Invalid command path")); } - + if components.len() == 1 { // No namespace Ok((components[0].to_string(), None)) @@ -112,44 +112,43 @@ fn extract_command_info(file_path: &Path, base_path: &Path) -> Result<(String, O } /// Load a single command from a markdown file -fn load_command_from_file( - file_path: &Path, - base_path: &Path, - scope: &str, -) -> Result { +fn load_command_from_file(file_path: &Path, base_path: &Path, scope: &str) -> Result { debug!("Loading command from: {:?}", file_path); - + // Read file content - let content = fs::read_to_string(file_path) - .context("Failed to read command file")?; - + let content = fs::read_to_string(file_path).context("Failed to read command file")?; + // Parse frontmatter let (frontmatter, body) = parse_markdown_with_frontmatter(&content)?; - + // Extract command info let (name, namespace) = extract_command_info(file_path, base_path)?; - + // Build full command (no scope prefix, just /command or /namespace:command) let full_command = match &namespace { Some(ns) => format!("/{ns}:{name}"), None => format!("/{name}"), }; - + // Generate unique ID - let id = format!("{}-{}", scope, file_path.to_string_lossy().replace('/', "-")); - + let id = format!( + "{}-{}", + scope, + file_path.to_string_lossy().replace('/', "-") + ); + // Check for special content let has_bash_commands = body.contains("!`"); let has_file_references = body.contains('@'); let accepts_arguments = body.contains("$ARGUMENTS"); - + // Extract metadata from frontmatter let (description, allowed_tools) = if let Some(fm) = frontmatter { (fm.description, fm.allowed_tools.unwrap_or_default()) } else { (None, Vec::new()) }; - + Ok(SlashCommand { id, name, @@ -171,18 +170,18 @@ fn find_markdown_files(dir: &Path, files: &mut Vec) -> Result<()> { if !dir.exists() { return Ok(()); } - + for entry in fs::read_dir(dir)? { let entry = entry?; let path = entry.path(); - + // Skip hidden files/directories if let Some(name) = path.file_name().and_then(|n| n.to_str()) { if name.starts_with('.') { continue; } } - + if path.is_dir() { find_markdown_files(&path, files)?; } else if path.is_file() { @@ -193,7 +192,7 @@ fn find_markdown_files(dir: &Path, files: &mut Vec) -> Result<()> { } } } - + Ok(()) } @@ -252,16 +251,16 @@ pub async fn slash_commands_list( ) -> Result, String> { info!("Discovering slash commands"); let mut commands = Vec::new(); - + // Add default commands commands.extend(create_default_commands()); - + // Load project commands if project path is provided if let Some(proj_path) = project_path { let project_commands_dir = PathBuf::from(&proj_path).join(".claude").join("commands"); if project_commands_dir.exists() { debug!("Scanning project commands at: {:?}", project_commands_dir); - + let mut md_files = Vec::new(); if let Err(e) = find_markdown_files(&project_commands_dir, &mut md_files) { error!("Failed to find project command files: {}", e); @@ -280,13 +279,13 @@ pub async fn slash_commands_list( } } } - + // Load user commands if let Some(home_dir) = dirs::home_dir() { let user_commands_dir = home_dir.join(".claude").join("commands"); if user_commands_dir.exists() { debug!("Scanning user commands at: {:?}", user_commands_dir); - + let mut md_files = Vec::new(); if let Err(e) = find_markdown_files(&user_commands_dir, &mut md_files) { error!("Failed to find user command files: {}", e); @@ -305,7 +304,7 @@ pub async fn slash_commands_list( } } } - + info!("Found {} slash commands", commands.len()); Ok(commands) } @@ -314,17 +313,17 @@ pub async fn slash_commands_list( #[tauri::command] pub async fn slash_command_get(command_id: String) -> Result { debug!("Getting slash command: {}", command_id); - + // Parse the ID to determine scope and reconstruct file path let parts: Vec<&str> = command_id.split('-').collect(); if parts.len() < 2 { return Err("Invalid command ID".to_string()); } - + // The actual implementation would need to reconstruct the path and reload the command // For now, we'll list all commands and find the matching one let commands = slash_commands_list(None).await?; - + commands .into_iter() .find(|cmd| cmd.id == command_id) @@ -343,16 +342,16 @@ pub async fn slash_command_save( project_path: Option, ) -> Result { info!("Saving slash command: {} in scope: {}", name, scope); - + // Validate inputs if name.is_empty() { return Err("Command name cannot be empty".to_string()); } - + if !["project", "user"].contains(&scope.as_str()) { return Err("Invalid scope. Must be 'project' or 'user'".to_string()); } - + // Determine base directory let base_dir = if scope == "project" { if let Some(proj_path) = project_path { @@ -366,7 +365,7 @@ pub async fn slash_command_save( .join(".claude") .join("commands") }; - + // Build file path let mut file_path = base_dir.clone(); if let Some(ns) = &namespace { @@ -374,41 +373,40 @@ pub async fn slash_command_save( file_path = file_path.join(component); } } - + // Create directories if needed - fs::create_dir_all(&file_path) - .map_err(|e| format!("Failed to create directories: {}", e))?; - + fs::create_dir_all(&file_path).map_err(|e| format!("Failed to create directories: {}", e))?; + // Add filename file_path = file_path.join(format!("{}.md", name)); - + // Build content with frontmatter let mut full_content = String::new(); - + // Add frontmatter if we have metadata if description.is_some() || !allowed_tools.is_empty() { full_content.push_str("---\n"); - + if let Some(desc) = &description { full_content.push_str(&format!("description: {}\n", desc)); } - + if !allowed_tools.is_empty() { full_content.push_str("allowed-tools:\n"); for tool in &allowed_tools { full_content.push_str(&format!(" - {}\n", tool)); } } - + full_content.push_str("---\n\n"); } - + full_content.push_str(&content); - + // Write file fs::write(&file_path, &full_content) .map_err(|e| format!("Failed to write command file: {}", e))?; - + // Load and return the saved command load_command_from_file(&file_path, &base_dir, &scope) .map_err(|e| format!("Failed to load saved command: {}", e)) @@ -416,35 +414,38 @@ pub async fn slash_command_save( /// Delete a slash command #[tauri::command] -pub async fn slash_command_delete(command_id: String, project_path: Option) -> Result { +pub async fn slash_command_delete( + command_id: String, + project_path: Option, +) -> Result { info!("Deleting slash command: {}", command_id); - + // First, we need to determine if this is a project command by parsing the ID let is_project_command = command_id.starts_with("project-"); - + // If it's a project command and we don't have a project path, error out if is_project_command && project_path.is_none() { return Err("Project path required to delete project commands".to_string()); } - + // List all commands (including project commands if applicable) let commands = slash_commands_list(project_path).await?; - + // Find the command by ID let command = commands .into_iter() .find(|cmd| cmd.id == command_id) .ok_or_else(|| format!("Command not found: {}", command_id))?; - + // Delete the file fs::remove_file(&command.file_path) .map_err(|e| format!("Failed to delete command file: {}", e))?; - + // Clean up empty directories if let Some(parent) = Path::new(&command.file_path).parent() { let _ = remove_empty_dirs(parent); } - + Ok(format!("Deleted command: {}", command.full_command)) } @@ -453,18 +454,18 @@ fn remove_empty_dirs(dir: &Path) -> Result<()> { if !dir.exists() { return Ok(()); } - + // Check if directory is empty let is_empty = fs::read_dir(dir)?.next().is_none(); - + if is_empty { fs::remove_dir(dir)?; - + // Try to remove parent if it's also empty if let Some(parent) = dir.parent() { let _ = remove_empty_dirs(parent); } } - + Ok(()) } diff --git a/src-tauri/src/commands/storage.rs b/src-tauri/src/commands/storage.rs index 1bcdb1b5e..02c552980 100644 --- a/src-tauri/src/commands/storage.rs +++ b/src-tauri/src/commands/storage.rs @@ -1,10 +1,10 @@ +use super::agents::AgentDb; use anyhow::Result; -use rusqlite::{params, Connection, Result as SqliteResult, types::ValueRef}; +use rusqlite::{params, types::ValueRef, Connection, Result as SqliteResult}; use serde::{Deserialize, Serialize}; use serde_json::{Map, Value as JsonValue}; use std::collections::HashMap; use tauri::{AppHandle, Manager, State}; -use super::agents::AgentDb; /// Represents metadata about a database table #[derive(Debug, Serialize, Deserialize, Clone)] @@ -50,37 +50,35 @@ pub struct QueryResult { #[tauri::command] pub async fn storage_list_tables(db: State<'_, AgentDb>) -> Result, String> { let conn = db.0.lock().map_err(|e| e.to_string())?; - + // Query for all tables let mut stmt = conn .prepare("SELECT name FROM sqlite_master WHERE type='table' AND name NOT LIKE 'sqlite_%' ORDER BY name") .map_err(|e| e.to_string())?; - + let table_names: Vec = stmt .query_map([], |row| row.get(0)) .map_err(|e| e.to_string())? .collect::>>() .map_err(|e| e.to_string())?; - + drop(stmt); - + let mut tables = Vec::new(); - + for table_name in table_names { // Get row count let row_count: i64 = conn - .query_row( - &format!("SELECT COUNT(*) FROM {}", table_name), - [], - |row| row.get(0), - ) + .query_row(&format!("SELECT COUNT(*) FROM {}", table_name), [], |row| { + row.get(0) + }) .unwrap_or(0); - + // Get column information let mut pragma_stmt = conn .prepare(&format!("PRAGMA table_info({})", table_name)) .map_err(|e| e.to_string())?; - + let columns: Vec = pragma_stmt .query_map([], |row| { Ok(ColumnInfo { @@ -95,14 +93,14 @@ pub async fn storage_list_tables(db: State<'_, AgentDb>) -> Result>>() .map_err(|e| e.to_string())?; - + tables.push(TableInfo { name: table_name, row_count, columns, }); } - + Ok(tables) } @@ -117,17 +115,17 @@ pub async fn storage_read_table( searchQuery: Option, ) -> Result { let conn = db.0.lock().map_err(|e| e.to_string())?; - + // Validate table name to prevent SQL injection if !is_valid_table_name(&conn, &tableName)? { return Err("Invalid table name".to_string()); } - + // Get column information let mut pragma_stmt = conn .prepare(&format!("PRAGMA table_info({})", tableName)) .map_err(|e| e.to_string())?; - + let columns: Vec = pragma_stmt .query_map([], |row| { Ok(ColumnInfo { @@ -142,9 +140,9 @@ pub async fn storage_read_table( .map_err(|e| e.to_string())? .collect::>>() .map_err(|e| e.to_string())?; - + drop(pragma_stmt); - + // Build query with optional search let (query, count_query) = if let Some(search) = &searchQuery { // Create search conditions for all text columns @@ -153,7 +151,7 @@ pub async fn storage_read_table( .filter(|col| col.type_name.contains("TEXT") || col.type_name.contains("VARCHAR")) .map(|col| format!("{} LIKE '%{}%'", col.name, search.replace("'", "''"))) .collect(); - + if search_conditions.is_empty() { ( format!("SELECT * FROM {} LIMIT ? OFFSET ?", tableName), @@ -162,7 +160,10 @@ pub async fn storage_read_table( } else { let where_clause = search_conditions.join(" OR "); ( - format!("SELECT * FROM {} WHERE {} LIMIT ? OFFSET ?", tableName, where_clause), + format!( + "SELECT * FROM {} WHERE {} LIMIT ? OFFSET ?", + tableName, where_clause + ), format!("SELECT COUNT(*) FROM {} WHERE {}", tableName, where_clause), ) } @@ -172,25 +173,23 @@ pub async fn storage_read_table( format!("SELECT COUNT(*) FROM {}", tableName), ) }; - + // Get total row count let total_rows: i64 = conn .query_row(&count_query, [], |row| row.get(0)) .unwrap_or(0); - + // Calculate pagination let offset = (page - 1) * pageSize; let total_pages = (total_rows as f64 / pageSize as f64).ceil() as i64; - + // Query data - let mut data_stmt = conn - .prepare(&query) - .map_err(|e| e.to_string())?; - + let mut data_stmt = conn.prepare(&query).map_err(|e| e.to_string())?; + let rows: Vec> = data_stmt .query_map(params![pageSize, offset], |row| { let mut row_map = Map::new(); - + for (idx, col) in columns.iter().enumerate() { let value = match row.get_ref(idx)? { ValueRef::Null => JsonValue::Null, @@ -203,17 +202,20 @@ pub async fn storage_read_table( } } ValueRef::Text(s) => JsonValue::String(String::from_utf8_lossy(s).to_string()), - ValueRef::Blob(b) => JsonValue::String(base64::Engine::encode(&base64::engine::general_purpose::STANDARD, b)), + ValueRef::Blob(b) => JsonValue::String(base64::Engine::encode( + &base64::engine::general_purpose::STANDARD, + b, + )), }; row_map.insert(col.name.clone(), value); } - + Ok(row_map) }) .map_err(|e| e.to_string())? .collect::>>() .map_err(|e| e.to_string())?; - + Ok(TableData { table_name: tableName, columns, @@ -235,49 +237,52 @@ pub async fn storage_update_row( updates: HashMap, ) -> Result<(), String> { let conn = db.0.lock().map_err(|e| e.to_string())?; - + // Validate table name if !is_valid_table_name(&conn, &tableName)? { return Err("Invalid table name".to_string()); } - + // Build UPDATE query let set_clauses: Vec = updates .keys() .enumerate() .map(|(idx, key)| format!("{} = ?{}", key, idx + 1)) .collect(); - + let where_clauses: Vec = primaryKeyValues .keys() .enumerate() .map(|(idx, key)| format!("{} = ?{}", key, idx + updates.len() + 1)) .collect(); - + let query = format!( "UPDATE {} SET {} WHERE {}", tableName, set_clauses.join(", "), where_clauses.join(" AND ") ); - + // Prepare parameters let mut params: Vec> = Vec::new(); - + // Add update values for value in updates.values() { params.push(json_to_sql_value(value)?); } - + // Add where clause values for value in primaryKeyValues.values() { params.push(json_to_sql_value(value)?); } - + // Execute update - conn.execute(&query, rusqlite::params_from_iter(params.iter().map(|p| p.as_ref()))) - .map_err(|e| format!("Failed to update row: {}", e))?; - + conn.execute( + &query, + rusqlite::params_from_iter(params.iter().map(|p| p.as_ref())), + ) + .map_err(|e| format!("Failed to update row: {}", e))?; + Ok(()) } @@ -290,35 +295,38 @@ pub async fn storage_delete_row( primaryKeyValues: HashMap, ) -> Result<(), String> { let conn = db.0.lock().map_err(|e| e.to_string())?; - + // Validate table name if !is_valid_table_name(&conn, &tableName)? { return Err("Invalid table name".to_string()); } - + // Build DELETE query let where_clauses: Vec = primaryKeyValues .keys() .enumerate() .map(|(idx, key)| format!("{} = ?{}", key, idx + 1)) .collect(); - + let query = format!( "DELETE FROM {} WHERE {}", tableName, where_clauses.join(" AND ") ); - + // Prepare parameters let params: Vec> = primaryKeyValues .values() .map(json_to_sql_value) .collect::, _>>()?; - + // Execute delete - conn.execute(&query, rusqlite::params_from_iter(params.iter().map(|p| p.as_ref()))) - .map_err(|e| format!("Failed to delete row: {}", e))?; - + conn.execute( + &query, + rusqlite::params_from_iter(params.iter().map(|p| p.as_ref())), + ) + .map_err(|e| format!("Failed to delete row: {}", e))?; + Ok(()) } @@ -331,35 +339,40 @@ pub async fn storage_insert_row( values: HashMap, ) -> Result { let conn = db.0.lock().map_err(|e| e.to_string())?; - + // Validate table name if !is_valid_table_name(&conn, &tableName)? { return Err("Invalid table name".to_string()); } - + // Build INSERT query let columns: Vec<&String> = values.keys().collect(); - let placeholders: Vec = (1..=columns.len()) - .map(|i| format!("?{}", i)) - .collect(); - + let placeholders: Vec = (1..=columns.len()).map(|i| format!("?{}", i)).collect(); + let query = format!( "INSERT INTO {} ({}) VALUES ({})", tableName, - columns.iter().map(|c| c.as_str()).collect::>().join(", "), + columns + .iter() + .map(|c| c.as_str()) + .collect::>() + .join(", "), placeholders.join(", ") ); - + // Prepare parameters let params: Vec> = values .values() .map(json_to_sql_value) .collect::, _>>()?; - + // Execute insert - conn.execute(&query, rusqlite::params_from_iter(params.iter().map(|p| p.as_ref()))) - .map_err(|e| format!("Failed to insert row: {}", e))?; - + conn.execute( + &query, + rusqlite::params_from_iter(params.iter().map(|p| p.as_ref())), + ) + .map_err(|e| format!("Failed to insert row: {}", e))?; + Ok(conn.last_insert_rowid()) } @@ -370,20 +383,20 @@ pub async fn storage_execute_sql( query: String, ) -> Result { let conn = db.0.lock().map_err(|e| e.to_string())?; - + // Check if it's a SELECT query let is_select = query.trim().to_uppercase().starts_with("SELECT"); - + if is_select { // Handle SELECT queries let mut stmt = conn.prepare(&query).map_err(|e| e.to_string())?; let column_count = stmt.column_count(); - + // Get column names let columns: Vec = (0..column_count) .map(|i| stmt.column_name(i).unwrap_or("").to_string()) .collect(); - + // Execute query and collect results let rows: Vec> = stmt .query_map([], |row| { @@ -399,8 +412,13 @@ pub async fn storage_execute_sql( JsonValue::String(f.to_string()) } } - ValueRef::Text(s) => JsonValue::String(String::from_utf8_lossy(s).to_string()), - ValueRef::Blob(b) => JsonValue::String(base64::Engine::encode(&base64::engine::general_purpose::STANDARD, b)), + ValueRef::Text(s) => { + JsonValue::String(String::from_utf8_lossy(s).to_string()) + } + ValueRef::Blob(b) => JsonValue::String(base64::Engine::encode( + &base64::engine::general_purpose::STANDARD, + b, + )), }; row_values.push(value); } @@ -409,7 +427,7 @@ pub async fn storage_execute_sql( .map_err(|e| e.to_string())? .collect::>>() .map_err(|e| e.to_string())?; - + Ok(QueryResult { columns, rows, @@ -419,7 +437,7 @@ pub async fn storage_execute_sql( } else { // Handle non-SELECT queries (INSERT, UPDATE, DELETE, etc.) let rows_affected = conn.execute(&query, []).map_err(|e| e.to_string())?; - + Ok(QueryResult { columns: vec![], rows: vec![], @@ -435,13 +453,12 @@ pub async fn storage_reset_database(app: AppHandle) -> Result<(), String> { { // Drop all existing tables within a scoped block let db_state = app.state::(); - let conn = db_state.0.lock() - .map_err(|e| e.to_string())?; - + let conn = db_state.0.lock().map_err(|e| e.to_string())?; + // Disable foreign key constraints temporarily to allow dropping tables conn.execute("PRAGMA foreign_keys = OFF", []) .map_err(|e| format!("Failed to disable foreign keys: {}", e))?; - + // Drop tables - order doesn't matter with foreign keys disabled conn.execute("DROP TABLE IF EXISTS agent_runs", []) .map_err(|e| format!("Failed to drop agent_runs table: {}", e))?; @@ -449,34 +466,31 @@ pub async fn storage_reset_database(app: AppHandle) -> Result<(), String> { .map_err(|e| format!("Failed to drop agents table: {}", e))?; conn.execute("DROP TABLE IF EXISTS app_settings", []) .map_err(|e| format!("Failed to drop app_settings table: {}", e))?; - + // Re-enable foreign key constraints conn.execute("PRAGMA foreign_keys = ON", []) .map_err(|e| format!("Failed to re-enable foreign keys: {}", e))?; - + // Connection is automatically dropped at end of scope } - + // Re-initialize the database which will recreate all tables empty let new_conn = init_database(&app).map_err(|e| format!("Failed to reset database: {}", e))?; - + // Update the managed state with the new connection { let db_state = app.state::(); - let mut conn_guard = db_state.0.lock() - .map_err(|e| e.to_string())?; + let mut conn_guard = db_state.0.lock().map_err(|e| e.to_string())?; *conn_guard = new_conn; } - + // Run VACUUM to optimize the database { let db_state = app.state::(); - let conn = db_state.0.lock() - .map_err(|e| e.to_string())?; - conn.execute("VACUUM", []) - .map_err(|e| e.to_string())?; + let conn = db_state.0.lock().map_err(|e| e.to_string())?; + conn.execute("VACUUM", []).map_err(|e| e.to_string())?; } - + Ok(()) } @@ -489,7 +503,7 @@ fn is_valid_table_name(conn: &Connection, table_name: &str) -> Result 0) } @@ -513,4 +527,4 @@ fn json_to_sql_value(value: &JsonValue) -> Result, Stri } /// Initialize the agents database (re-exported from agents module) -use super::agents::init_database; \ No newline at end of file +use super::agents::init_database; diff --git a/src-tauri/src/main.rs b/src-tauri/src/main.rs index ffc0212e3..10b50fbee 100644 --- a/src-tauri/src/main.rs +++ b/src-tauri/src/main.rs @@ -14,20 +14,21 @@ use commands::agents::{ get_live_session_output, get_session_output, get_session_status, import_agent, import_agent_from_file, import_agent_from_github, init_database, kill_agent_session, list_agent_runs, list_agent_runs_with_metrics, list_agents, list_claude_installations, - list_running_sessions, load_agent_session_history, set_claude_binary_path, stream_session_output, update_agent, AgentDb, + list_running_sessions, load_agent_session_history, set_claude_binary_path, + stream_session_output, update_agent, AgentDb, }; use commands::claude::{ cancel_claude_execution, check_auto_checkpoint, check_claude_version, cleanup_old_checkpoints, - clear_checkpoint_manager, continue_claude_code, create_checkpoint, create_project, execute_claude_code, - find_claude_md_files, fork_from_checkpoint, get_checkpoint_diff, get_checkpoint_settings, - get_checkpoint_state_stats, get_claude_session_output, get_claude_settings, get_home_directory, get_project_sessions, - get_recently_modified_files, get_session_timeline, get_system_prompt, list_checkpoints, - list_directory_contents, list_projects, list_running_claude_sessions, load_session_history, - open_new_session, read_claude_md_file, restore_checkpoint, resume_claude_code, - save_claude_md_file, save_claude_settings, save_system_prompt, search_files, - track_checkpoint_message, track_session_messages, update_checkpoint_settings, - get_hooks_config, update_hooks_config, validate_hook_command, - ClaudeProcessState, + clear_checkpoint_manager, continue_claude_code, create_checkpoint, create_project, + delete_session, delete_sessions_bulk, execute_claude_code, find_claude_md_files, fork_from_checkpoint, + get_checkpoint_diff, get_checkpoint_settings, get_checkpoint_state_stats, + get_claude_session_output, get_claude_settings, get_home_directory, get_hooks_config, + get_project_sessions, get_recently_modified_files, get_session_timeline, get_system_prompt, + list_checkpoints, list_directory_contents, list_projects, list_running_claude_sessions, + load_session_history, open_new_session, read_claude_md_file, restore_checkpoint, + resume_claude_code, save_claude_md_file, save_claude_settings, save_pasted_image, + save_system_prompt, search_files, track_checkpoint_message, track_session_messages, + update_checkpoint_settings, update_hooks_config, validate_hook_command, ClaudeProcessState, }; use commands::mcp::{ mcp_add, mcp_add_from_claude_desktop, mcp_add_json, mcp_get, mcp_get_server_status, mcp_list, @@ -35,14 +36,14 @@ use commands::mcp::{ mcp_serve, mcp_test_connection, }; +use commands::proxy::{apply_proxy_settings, get_proxy_settings, save_proxy_settings}; +use commands::storage::{ + storage_delete_row, storage_execute_sql, storage_insert_row, storage_list_tables, + storage_read_table, storage_reset_database, storage_update_row, +}; use commands::usage::{ get_session_stats, get_usage_by_date_range, get_usage_details, get_usage_stats, }; -use commands::storage::{ - storage_list_tables, storage_read_table, storage_update_row, storage_delete_row, - storage_insert_row, storage_execute_sql, storage_reset_database, -}; -use commands::proxy::{get_proxy_settings, save_proxy_settings, apply_proxy_settings}; use process::ProcessRegistryState; use std::sync::Mutex; use tauri::Manager; @@ -50,19 +51,27 @@ use tauri::Manager; #[cfg(target_os = "macos")] use window_vibrancy::{apply_vibrancy, NSVisualEffectMaterial}; - fn main() { + // Fix PATH environment for macOS DMG apps + #[cfg(target_os = "macos")] + { + if let Err(e) = fix_path_env::fix() { + eprintln!("Failed to fix PATH environment: {}", e); + } else { + println!("✅ Successfully fixed PATH environment for macOS DMG app"); + } + } + // Initialize logger env_logger::init(); - tauri::Builder::default() .plugin(tauri_plugin_dialog::init()) .plugin(tauri_plugin_shell::init()) .setup(|app| { // Initialize agents database let conn = init_database(&app.handle()).expect("Failed to initialize agents database"); - + // Load and apply proxy settings from the database { let db = AgentDb(Mutex::new(conn)); @@ -70,7 +79,7 @@ fn main() { Ok(conn) => { // Directly query proxy settings from the database let mut settings = commands::proxy::ProxySettings::default(); - + let keys = vec![ ("proxy_enabled", "enabled"), ("proxy_http", "http_proxy"), @@ -78,7 +87,7 @@ fn main() { ("proxy_no", "no_proxy"), ("proxy_all", "all_proxy"), ]; - + for (db_key, field) in keys { if let Ok(value) = conn.query_row( "SELECT value FROM app_settings WHERE key = ?1", @@ -87,15 +96,23 @@ fn main() { ) { match field { "enabled" => settings.enabled = value == "true", - "http_proxy" => settings.http_proxy = Some(value).filter(|s| !s.is_empty()), - "https_proxy" => settings.https_proxy = Some(value).filter(|s| !s.is_empty()), - "no_proxy" => settings.no_proxy = Some(value).filter(|s| !s.is_empty()), - "all_proxy" => settings.all_proxy = Some(value).filter(|s| !s.is_empty()), + "http_proxy" => { + settings.http_proxy = Some(value).filter(|s| !s.is_empty()) + } + "https_proxy" => { + settings.https_proxy = Some(value).filter(|s| !s.is_empty()) + } + "no_proxy" => { + settings.no_proxy = Some(value).filter(|s| !s.is_empty()) + } + "all_proxy" => { + settings.all_proxy = Some(value).filter(|s| !s.is_empty()) + } _ => {} } } } - + log::info!("Loaded proxy settings: enabled={}", settings.enabled); settings } @@ -104,11 +121,11 @@ fn main() { commands::proxy::ProxySettings::default() } }; - + // Apply the proxy settings apply_proxy_settings(&proxy_settings); } - + // Re-open the connection for the app to manage let conn = init_database(&app.handle()).expect("Failed to initialize agents database"); app.manage(AgentDb(Mutex::new(conn))); @@ -144,7 +161,7 @@ fn main() { #[cfg(target_os = "macos")] { let window = app.get_webview_window("main").unwrap(); - + // Try different vibrancy materials that support rounded corners let materials = [ NSVisualEffectMaterial::UnderWindowBackground, @@ -153,7 +170,7 @@ fn main() { NSVisualEffectMaterial::Menu, NSVisualEffectMaterial::Sidebar, ]; - + let mut applied = false; for material in materials.iter() { if apply_vibrancy(&window, *material, None, Some(12.0)).is_ok() { @@ -161,11 +178,16 @@ fn main() { break; } } - + if !applied { // Fallback without rounded corners - apply_vibrancy(&window, NSVisualEffectMaterial::WindowBackground, None, None) - .expect("Failed to apply any window vibrancy"); + apply_vibrancy( + &window, + NSVisualEffectMaterial::WindowBackground, + None, + None, + ) + .expect("Failed to apply any window vibrancy"); } } @@ -176,6 +198,9 @@ fn main() { list_projects, create_project, get_project_sessions, + delete_session, + delete_sessions_bulk, + save_pasted_image, get_home_directory, get_claude_settings, open_new_session, @@ -199,7 +224,6 @@ fn main() { get_hooks_config, update_hooks_config, validate_hook_command, - // Checkpoint Management create_checkpoint, restore_checkpoint, @@ -215,7 +239,6 @@ fn main() { get_checkpoint_settings, clear_checkpoint_manager, get_checkpoint_state_stats, - // Agent Management list_agents, create_agent, @@ -245,13 +268,11 @@ fn main() { fetch_github_agents, fetch_github_agent_content, import_agent_from_github, - // Usage & Analytics get_usage_stats, get_usage_by_date_range, get_usage_details, get_session_stats, - // MCP (Model Context Protocol) mcp_add, mcp_list, @@ -265,7 +286,6 @@ fn main() { mcp_get_server_status, mcp_read_project_config, mcp_save_project_config, - // Storage Management storage_list_tables, storage_read_table, @@ -274,13 +294,11 @@ fn main() { storage_insert_row, storage_execute_sql, storage_reset_database, - // Slash Commands commands::slash_commands::slash_commands_list, commands::slash_commands::slash_command_get, commands::slash_commands::slash_command_save, commands::slash_commands::slash_command_delete, - // Proxy Settings get_proxy_settings, save_proxy_settings, diff --git a/src-tauri/src/process/registry.rs b/src-tauri/src/process/registry.rs index 30c8e94d3..f4f33b5a2 100644 --- a/src-tauri/src/process/registry.rs +++ b/src-tauri/src/process/registry.rs @@ -7,13 +7,8 @@ use tokio::process::Child; /// Type of process being tracked #[derive(Debug, Clone, Serialize, Deserialize)] pub enum ProcessType { - AgentRun { - agent_id: i64, - agent_name: String, - }, - ClaudeSession { - session_id: String, - }, + AgentRun { agent_id: i64, agent_name: String }, + ClaudeSession { session_id: String }, } /// Information about a running agent process @@ -72,7 +67,10 @@ impl ProcessRegistry { ) -> Result<(), String> { let process_info = ProcessInfo { run_id, - process_type: ProcessType::AgentRun { agent_id, agent_name }, + process_type: ProcessType::AgentRun { + agent_id, + agent_name, + }, pid, started_at: Utc::now(), project_path, @@ -96,7 +94,10 @@ impl ProcessRegistry { ) -> Result<(), String> { let process_info = ProcessInfo { run_id, - process_type: ProcessType::AgentRun { agent_id, agent_name }, + process_type: ProcessType::AgentRun { + agent_id, + agent_name, + }, pid, started_at: Utc::now(), project_path, @@ -106,7 +107,7 @@ impl ProcessRegistry { // For sidecar processes, we register without the child handle since it's managed differently let mut processes = self.processes.lock().map_err(|e| e.to_string())?; - + let process_handle = ProcessHandle { info: process_info, child: Arc::new(Mutex::new(None)), // No tokio::process::Child handle for sidecar @@ -127,7 +128,7 @@ impl ProcessRegistry { model: String, ) -> Result { let run_id = self.generate_id()?; - + let process_info = ProcessInfo { run_id, process_type: ProcessType::ClaudeSession { session_id }, @@ -140,7 +141,7 @@ impl ProcessRegistry { // Register without child - Claude sessions use ClaudeProcessState for process management let mut processes = self.processes.lock().map_err(|e| e.to_string())?; - + let process_handle = ProcessHandle { info: process_info, child: Arc::new(Mutex::new(None)), // No child handle for Claude sessions @@ -175,25 +176,24 @@ impl ProcessRegistry { let processes = self.processes.lock().map_err(|e| e.to_string())?; Ok(processes .values() - .filter_map(|handle| { - match &handle.info.process_type { - ProcessType::ClaudeSession { .. } => Some(handle.info.clone()), - _ => None, - } + .filter_map(|handle| match &handle.info.process_type { + ProcessType::ClaudeSession { .. } => Some(handle.info.clone()), + _ => None, }) .collect()) } /// Get a specific Claude session by session ID - pub fn get_claude_session_by_id(&self, session_id: &str) -> Result, String> { + pub fn get_claude_session_by_id( + &self, + session_id: &str, + ) -> Result, String> { let processes = self.processes.lock().map_err(|e| e.to_string())?; Ok(processes .values() - .find(|handle| { - match &handle.info.process_type { - ProcessType::ClaudeSession { session_id: sid } => sid == session_id, - _ => false, - } + .find(|handle| match &handle.info.process_type { + ProcessType::ClaudeSession { session_id: sid } => sid == session_id, + _ => false, }) .map(|handle| handle.info.clone())) } @@ -221,11 +221,9 @@ impl ProcessRegistry { let processes = self.processes.lock().map_err(|e| e.to_string())?; Ok(processes .values() - .filter_map(|handle| { - match &handle.info.process_type { - ProcessType::AgentRun { .. } => Some(handle.info.clone()), - _ => None, - } + .filter_map(|handle| match &handle.info.process_type { + ProcessType::AgentRun { .. } => Some(handle.info.clone()), + _ => None, }) .collect()) } @@ -273,17 +271,26 @@ impl ProcessRegistry { } } } else { - warn!("No child handle available for process {} (PID: {}), attempting system kill", run_id, pid); + warn!( + "No child handle available for process {} (PID: {}), attempting system kill", + run_id, pid + ); false // Process handle not available, try fallback } }; // If direct kill didn't work, try system command as fallback if !kill_sent { - info!("Attempting fallback kill for process {} (PID: {})", run_id, pid); + info!( + "Attempting fallback kill for process {} (PID: {})", + run_id, pid + ); match self.kill_process_by_pid(run_id, pid) { Ok(true) => return Ok(true), - Ok(false) => warn!("Fallback kill also failed for process {} (PID: {})", run_id, pid), + Ok(false) => warn!( + "Fallback kill also failed for process {} (PID: {})", + run_id, pid + ), Err(e) => error!("Error during fallback kill: {}", e), } // Continue with the rest of the cleanup even if fallback failed diff --git a/src/components/AgentExecution.tsx b/src/components/AgentExecution.tsx index 90f089ccd..c0702a77f 100644 --- a/src/components/AgentExecution.tsx +++ b/src/components/AgentExecution.tsx @@ -32,7 +32,7 @@ import { ExecutionControlBar } from "./ExecutionControlBar"; import { ErrorBoundary } from "./ErrorBoundary"; import { useVirtualizer } from "@tanstack/react-virtual"; import { HooksEditor } from "./HooksEditor"; -import { useTrackEvent, useComponentMetrics, useFeatureAdoptionTracking } from "@/hooks"; +import { useTrackEvent, useComponentMetrics, useFeatureAdoptionTracking, useAutoScroll } from "@/hooks"; import { useTabState } from "@/hooks/useTabState"; interface AgentExecutionProps { @@ -118,7 +118,8 @@ export const AgentExecution: React.FC = ({ const [elapsedTime, setElapsedTime] = useState(0); const [hasUserScrolled, setHasUserScrolled] = useState(false); const [isFullscreenModalOpen, setIsFullscreenModalOpen] = useState(false); - + const { autoScrollEnabled } = useAutoScroll(); + const messagesEndRef = useRef(null); const messagesContainerRef = useRef(null); const scrollContainerRef = useRef(null); @@ -235,7 +236,7 @@ export const AgentExecution: React.FC = ({ if (displayableMessages.length === 0) return; // Auto-scroll only if the user has not manually scrolled OR they are still at the bottom - const shouldAutoScroll = !hasUserScrolled || isAtBottom(); + const shouldAutoScroll = autoScrollEnabled && (!hasUserScrolled || isAtBottom()); if (shouldAutoScroll) { if (isFullscreenModalOpen) { @@ -244,7 +245,7 @@ export const AgentExecution: React.FC = ({ rowVirtualizer.scrollToIndex(displayableMessages.length - 1, { align: "end", behavior: "smooth" }); } } - }, [displayableMessages.length, hasUserScrolled, isFullscreenModalOpen, rowVirtualizer, fullscreenRowVirtualizer]); + }, [displayableMessages.length, hasUserScrolled, isFullscreenModalOpen, rowVirtualizer, fullscreenRowVirtualizer, autoScrollEnabled]); // Update elapsed time while running useEffect(() => { @@ -730,11 +731,13 @@ export const AgentExecution: React.FC = ({ ref={scrollContainerRef} className="h-full overflow-y-auto p-6 space-y-8" onScroll={() => { + if (!autoScrollEnabled) return; + // Mark that user has scrolled manually if (!hasUserScrolled) { setHasUserScrolled(true); } - + // If user scrolls back to bottom, re-enable auto-scroll if (isAtBottom()) { setHasUserScrolled(false); @@ -872,11 +875,13 @@ export const AgentExecution: React.FC = ({ ref={fullscreenScrollRef} className="h-full overflow-y-auto space-y-8" onScroll={() => { + if (!autoScrollEnabled) return; + // Mark that user has scrolled manually if (!hasUserScrolled) { setHasUserScrolled(true); } - + // If user scrolls back to bottom, re-enable auto-scroll if (isAtBottom()) { setHasUserScrolled(false); diff --git a/src/components/AgentRunOutputViewer.tsx b/src/components/AgentRunOutputViewer.tsx index d83fb9317..b906f6847 100644 --- a/src/components/AgentRunOutputViewer.tsx +++ b/src/components/AgentRunOutputViewer.tsx @@ -27,6 +27,7 @@ import { formatISOTimestamp } from '@/lib/date-utils'; import { AGENT_ICONS } from './CCAgents'; import type { ClaudeStreamMessage } from './AgentExecution'; import { useTabState } from '@/hooks/useTabState'; +import { useAutoScroll } from '@/hooks'; interface AgentRunOutputViewerProps { /** @@ -67,7 +68,8 @@ export function AgentRunOutputViewer({ const [toast, setToast] = useState<{ message: string; type: "success" | "error" } | null>(null); const [copyPopoverOpen, setCopyPopoverOpen] = useState(false); const [hasUserScrolled, setHasUserScrolled] = useState(false); - + const { autoScrollEnabled } = useAutoScroll(); + // Track whether we're in the initial load phase const isInitialLoadRef = useRef(true); const hasSetupListenersRef = useRef(false); @@ -91,7 +93,7 @@ export function AgentRunOutputViewer({ }; const scrollToBottom = () => { - if (!hasUserScrolled) { + if (autoScrollEnabled && !hasUserScrolled) { const endRef = isFullscreen ? fullscreenMessagesEndRef.current : outputEndRef.current; if (endRef) { endRef.scrollIntoView({ behavior: 'smooth' }); @@ -132,11 +134,11 @@ export function AgentRunOutputViewer({ // Auto-scroll when messages change useEffect(() => { - const shouldAutoScroll = !hasUserScrolled || isAtBottom(); + const shouldAutoScroll = autoScrollEnabled && (!hasUserScrolled || isAtBottom()); if (shouldAutoScroll) { scrollToBottom(); } - }, [messages, hasUserScrolled, isFullscreen]); + }, [messages, hasUserScrolled, isFullscreen, autoScrollEnabled]); const loadOutput = async (skipCache = false) => { if (!run?.id) return; @@ -442,6 +444,8 @@ export function AgentRunOutputViewer({ }; const handleScroll = (e: React.UIEvent) => { + if (!autoScrollEnabled) return; + const target = e.currentTarget; const { scrollTop, scrollHeight, clientHeight } = target; const distanceFromBottom = scrollHeight - scrollTop - clientHeight; diff --git a/src/components/ClaudeCodeSession.tsx b/src/components/ClaudeCodeSession.tsx index a9b9e5891..6713f374c 100644 --- a/src/components/ClaudeCodeSession.tsx +++ b/src/components/ClaudeCodeSession.tsx @@ -28,7 +28,7 @@ import { SplitPane } from "@/components/ui/split-pane"; import { WebviewPreview } from "./WebviewPreview"; import type { ClaudeStreamMessage } from "./AgentExecution"; import { useVirtualizer } from "@tanstack/react-virtual"; -import { useTrackEvent, useComponentMetrics, useWorkflowTracking } from "@/hooks"; +import { useTrackEvent, useComponentMetrics, useWorkflowTracking, useAutoScroll } from "@/hooks"; import { SessionPersistenceService } from "@/services/sessionPersistence"; interface ClaudeCodeSessionProps { @@ -75,6 +75,12 @@ export const ClaudeCodeSession: React.FC = ({ onStreamingChange, onProjectPathChange, }) => { + console.log('[ClaudeCodeSession] Component initialized/remounted with:', { + session: session?.id, + initialProjectPath, + hasSession: !!session + }); + const [projectPath] = useState(initialProjectPath || session?.project_path || ""); const [messages, setMessages] = useState([]); const [isLoading, setIsLoading] = useState(false); @@ -106,6 +112,10 @@ export const ClaudeCodeSession: React.FC = ({ // Add collapsed state for queued prompts const [queuedPromptsCollapsed, setQueuedPromptsCollapsed] = useState(false); + // Auto-scroll hook + const { autoScrollEnabled } = useAutoScroll(); + + const parentRef = useRef(null); const unlistenRefs = useRef([]); const hasActiveSessionRef = useRef(false); @@ -251,10 +261,7 @@ export const ClaudeCodeSession: React.FC = ({ // Load session history if resuming useEffect(() => { if (session) { - // Set the claudeSessionId immediately when we have a session - setClaudeSessionId(session.id); - - // Load session history first, then check for active session + // Load session history first, which will set up extractedSessionInfo and claudeSessionId properly const initializeSession = async () => { await loadSessionHistory(); // After loading history, check if the session is still active @@ -262,7 +269,7 @@ export const ClaudeCodeSession: React.FC = ({ await checkForActiveSession(); } }; - + initializeSession(); } }, [session]); // Remove hasLoadedSession dependency to ensure it runs on mount @@ -274,6 +281,8 @@ export const ClaudeCodeSession: React.FC = ({ // Auto-scroll to bottom when new messages arrive useEffect(() => { + if (!autoScrollEnabled) return; + if (displayableMessages.length > 0) { // Use a more precise scrolling method to ensure content is fully visible setTimeout(() => { @@ -292,7 +301,7 @@ export const ClaudeCodeSession: React.FC = ({ } }, 50); } - }, [displayableMessages.length, rowVirtualizer]); + }, [displayableMessages.length, rowVirtualizer, autoScrollEnabled]); // Calculate total tokens from messages useEffect(() => { @@ -310,35 +319,60 @@ export const ClaudeCodeSession: React.FC = ({ const loadSessionHistory = async () => { if (!session) return; - + try { setIsLoading(true); setError(null); - + const history = await api.loadSessionHistory(session.id, session.project_id); - - // Save session data for restoration + + // Check persistence data to understand session relationships + const persistenceData = SessionPersistenceService.loadSession(session.id); + + // Set up extracted session info for conversation continuation + const projectId = session.project_id; + if (persistenceData?.primaryConversationId && !persistenceData.isConversationRoot) { + // This is a child session - use the primary conversation ID + console.log('[ClaudeCodeSession] Loading child session, primary conversation ID:', persistenceData.primaryConversationId); + setExtractedSessionInfo({ + sessionId: persistenceData.primaryConversationId, + projectId + }); + setClaudeSessionId(persistenceData.primaryConversationId); + } else { + // This is a primary session or session without persistence data + console.log('[ClaudeCodeSession] Loading primary session:', session.id); + setExtractedSessionInfo({ + sessionId: session.id, + projectId + }); + setClaudeSessionId(session.id); + } + + // Save session data for restoration, preserving existing relationships if (history && history.length > 0) { SessionPersistenceService.saveSession( session.id, session.project_id, session.project_path, - history.length + history.length, + undefined, // scrollPosition + persistenceData?.primaryConversationId // preserve existing primaryConversationId ); } - + // Convert history to messages format const loadedMessages: ClaudeStreamMessage[] = history.map(entry => ({ ...entry, type: entry.type || "assistant" })); - + setMessages(loadedMessages); setRawJsonlOutput(history.map(h => JSON.stringify(h))); - + // After loading history, we're continuing a conversation setIsFirstPrompt(false); - + // Scroll to bottom after loading history setTimeout(() => { if (loadedMessages.length > 0) { @@ -458,6 +492,12 @@ export const ClaudeCodeSession: React.FC = ({ const handleSendPrompt = async (prompt: string, model: "sonnet" | "opus") => { console.log('[ClaudeCodeSession] handleSendPrompt called with:', { prompt, model, projectPath, claudeSessionId, effectiveSession }); + console.log('[ClaudeCodeSession] Current state:', { + extractedSessionInfo, + isFirstPrompt, + session, + hasExtractedInfo: !!extractedSessionInfo + }); if (!projectPath) { setError("Please select a project directory first"); @@ -545,13 +585,14 @@ export const ClaudeCodeSession: React.FC = ({ if (!currentSessionId || currentSessionId !== msg.session_id) { console.log('[ClaudeCodeSession] Detected new session_id from generic listener:', msg.session_id); currentSessionId = msg.session_id; - setClaudeSessionId(msg.session_id); - // If we haven't extracted session info before, do it now + // If we haven't extracted session info before, do it now (first session only) if (!extractedSessionInfo) { const projectId = projectPath.replace(/[^a-zA-Z0-9]/g, '-'); + console.log('[ClaudeCodeSession] Setting primary conversation ID:', msg.session_id); + setClaudeSessionId(msg.session_id); // Only update UI session ID for the first session setExtractedSessionInfo({ sessionId: msg.session_id, projectId }); - + // Save session data for restoration SessionPersistenceService.saveSession( msg.session_id, @@ -559,6 +600,21 @@ export const ClaudeCodeSession: React.FC = ({ projectPath, messages.length ); + } else { + console.log('[ClaudeCodeSession] Continuing conversation, new execution ID:', msg.session_id, 'primary conversation ID:', extractedSessionInfo.sessionId); + console.log('[ClaudeCodeSession] UI will continue showing primary conversation ID:', extractedSessionInfo.sessionId); + + // Save this new session but link it to the primary conversation + const projectId = projectPath.replace(/[^a-zA-Z0-9]/g, '-'); + SessionPersistenceService.saveSession( + msg.session_id, + projectId, + projectPath, + messages.length, + undefined, // scrollPosition + extractedSessionInfo.sessionId // primaryConversationId + ); + // Note: We do NOT call setClaudeSessionId() here, so UI keeps showing the primary conversation ID } // Switch to session-specific listeners @@ -647,7 +703,23 @@ export const ClaudeCodeSession: React.FC = ({ if (message.type === 'system' && (message.subtype === 'error' || message.error)) { sessionMetrics.current.errorsEncountered += 1; } - + + // Fix UI session ID display: Store original and replace with primary conversation ID + if (message.type === 'system' && message.subtype === 'init') { + const originalSessionId = message.session_id; + + if (extractedSessionInfo && message.session_id !== extractedSessionInfo.sessionId) { + console.log('[ClaudeCodeSession] Replacing UI session ID:', message.session_id, '→', extractedSessionInfo.sessionId); + message.original_session_id = originalSessionId; // Store child ID + message.session_id = extractedSessionInfo.sessionId; // Use primary ID + message.session_timestamp = new Date().toISOString(); + } else if (!extractedSessionInfo) { + // First session - mark as both primary and child initially + message.original_session_id = originalSessionId; + message.session_timestamp = new Date().toISOString(); + } + } + setMessages((prev) => [...prev, message]); } catch (err) { console.error('Failed to parse message:', err, payload); @@ -826,10 +898,19 @@ export const ClaudeCodeSession: React.FC = ({ // Execute the appropriate command if (effectiveSession && !isFirstPrompt) { - console.log('[ClaudeCodeSession] Resuming session:', effectiveSession.id); - trackEvent.sessionResumed(effectiveSession.id); - trackEvent.modelSelected(model); - await api.resumeClaudeCode(projectPath, effectiveSession.id, prompt, model); + // Check if this is resuming an old session (session prop passed) or continuing current session + if (session) { + // Resuming an old session that was explicitly selected + console.log('[ClaudeCodeSession] Resuming old session:', effectiveSession.id); + trackEvent.sessionResumed(effectiveSession.id); + trackEvent.modelSelected(model); + await api.resumeClaudeCode(projectPath, effectiveSession.id, prompt, model); + } else { + // Continuing the current active session + console.log('[ClaudeCodeSession] Continuing current session:', effectiveSession.id); + trackEvent.modelSelected(model); + await api.continueClaudeCode(projectPath, prompt, model); + } } else { console.log('[ClaudeCodeSession] Starting new session'); setIsFirstPrompt(false); @@ -1483,6 +1564,7 @@ export const ClaudeCodeSession: React.FC = ({ isLoading={isLoading} disabled={!projectPath} projectPath={projectPath} + projectId={effectiveSession?.project_id} extraMenuItems={ <> {effectiveSession && ( diff --git a/src/components/FloatingPromptInput.tsx b/src/components/FloatingPromptInput.tsx index bc49cb8d1..95d71d467 100644 --- a/src/components/FloatingPromptInput.tsx +++ b/src/components/FloatingPromptInput.tsx @@ -22,7 +22,7 @@ import { TooltipProvider, TooltipSimple, Tooltip, TooltipTrigger, TooltipContent import { FilePicker } from "./FilePicker"; import { SlashCommandPicker } from "./SlashCommandPicker"; import { ImagePreview } from "./ImagePreview"; -import { type FileEntry, type SlashCommand } from "@/lib/api"; +import { api, type FileEntry, type SlashCommand } from "@/lib/api"; import { getCurrentWebviewWindow } from "@tauri-apps/api/webviewWindow"; interface FloatingPromptInputProps { @@ -46,6 +46,10 @@ interface FloatingPromptInputProps { * Project path for file picker */ projectPath?: string; + /** + * Project ID for image storage + */ + projectId?: string; /** * Optional className for styling */ @@ -206,6 +210,7 @@ const FloatingPromptInputInner = ( disabled = false, defaultModel = "sonnet", projectPath, + projectId, className, onCancel, extraMenuItems, @@ -225,6 +230,19 @@ const FloatingPromptInputInner = ( const [cursorPosition, setCursorPosition] = useState(0); const [embeddedImages, setEmbeddedImages] = useState([]); const [dragActive, setDragActive] = useState(false); + const [imagePathMap, setImagePathMap] = useState>(new Map()); + + // Computed array of actual image paths for preview (derived from placeholders and mapping) + const imagePathsForPreview = embeddedImages.map(placeholder => { + const actualPath = imagePathMap.get(placeholder); + if (actualPath) { + // For data URLs, use as-is; for file paths, convert to absolute + return actualPath.startsWith('data:') + ? actualPath + : (actualPath.startsWith('/') ? actualPath : (projectPath ? `${projectPath}/${actualPath}` : actualPath)); + } + return placeholder; // fallback to placeholder if mapping not found + }); const textareaRef = useRef(null); const expandedTextareaRef = useRef(null); @@ -275,50 +293,71 @@ const FloatingPromptInputInner = ( // Extract image paths from prompt text const extractImagePaths = (text: string): string[] => { console.log('[extractImagePaths] Input text length:', text.length); - - // Updated regex to handle both quoted and unquoted paths + + const pathsSet = new Set(); // Use Set to ensure uniqueness + + // First, look for image placeholders like [image] or [image #2] + const placeholderRegex = /\[image(?:\s*#\d+)?\]/g; + let placeholderMatches = Array.from(text.matchAll(placeholderRegex)); + console.log('[extractImagePaths] Placeholder matches:', placeholderMatches.length); + + for (const match of placeholderMatches) { + const placeholder = match[0]; + const actualPath = imagePathMap.get(placeholder); + if (actualPath) { + console.log('[extractImagePaths] Found placeholder:', placeholder, '-> actual path'); + // For data URLs, use as-is; for file paths, convert to absolute + const fullPath = actualPath.startsWith('data:') + ? actualPath + : (actualPath.startsWith('/') ? actualPath : (projectPath ? `${projectPath}/${actualPath}` : actualPath)); + + if (isImageFile(fullPath)) { + pathsSet.add(fullPath); + } + } + } + + // Also handle legacy @-mentions for backwards compatibility // Pattern 1: @"path with spaces or data URLs" - quoted paths // Pattern 2: @path - unquoted paths (continues until @ or end) const quotedRegex = /@"([^"]+)"/g; const unquotedRegex = /@([^@\n\s]+)/g; - - const pathsSet = new Set(); // Use Set to ensure uniqueness - + // First, extract quoted paths (including data URLs) let matches = Array.from(text.matchAll(quotedRegex)); - console.log('[extractImagePaths] Quoted matches:', matches.length); - + console.log('[extractImagePaths] Legacy quoted matches:', matches.length); + for (const match of matches) { const path = match[1]; // No need to trim, quotes preserve exact path - console.log('[extractImagePaths] Processing quoted path:', path.startsWith('data:') ? 'data URL' : path); - + console.log('[extractImagePaths] Processing legacy quoted path:', path.startsWith('data:') ? 'data URL' : path); + // For data URLs, use as-is; for file paths, convert to absolute - const fullPath = path.startsWith('data:') - ? path + const fullPath = path.startsWith('data:') + ? path : (path.startsWith('/') ? path : (projectPath ? `${projectPath}/${path}` : path)); - + if (isImageFile(fullPath)) { pathsSet.add(fullPath); } } - + // Remove quoted mentions from text to avoid double-matching let textWithoutQuoted = text.replace(quotedRegex, ''); - + // Then extract unquoted paths (typically file paths) matches = Array.from(textWithoutQuoted.matchAll(unquotedRegex)); - console.log('[extractImagePaths] Unquoted matches:', matches.length); - + console.log('[extractImagePaths] Legacy unquoted matches:', matches.length); + for (const match of matches) { const path = match[1].trim(); // Skip if it looks like a data URL fragment (shouldn't happen with proper quoting) if (path.includes('data:')) continue; - - console.log('[extractImagePaths] Processing unquoted path:', path); - + + console.log('[extractImagePaths] Processing legacy unquoted path:', path); + // Convert relative path to absolute if needed const fullPath = path.startsWith('/') ? path : (projectPath ? `${projectPath}/${path}` : path); - + if (isImageFile(fullPath)) { pathsSet.add(fullPath); } @@ -329,12 +368,19 @@ const FloatingPromptInputInner = ( return uniquePaths; }; + // Extract placeholders from prompt text (for embeddedImages array) + const extractPlaceholders = (text: string): string[] => { + const placeholderRegex = /\[image(?:\s*#\d+)?\]/g; + const placeholders = Array.from(text.matchAll(placeholderRegex)).map(match => match[0]); + return placeholders; + }; + // Update embedded images when prompt changes useEffect(() => { console.log('[useEffect] Prompt changed:', prompt); - const imagePaths = extractImagePaths(prompt); - console.log('[useEffect] Setting embeddedImages to:', imagePaths); - setEmbeddedImages(imagePaths); + const placeholders = extractPlaceholders(prompt); + console.log('[useEffect] Setting embeddedImages to placeholders:', placeholders); + setEmbeddedImages(placeholders); // Auto-resize on prompt change (handles paste, programmatic changes, etc.) if (textareaRef.current && !isExpanded) { @@ -433,6 +479,29 @@ const FloatingPromptInputInner = ( } }, [isExpanded]); + const handleSend = () => { + if (prompt.trim() && !disabled) { + let finalPrompt = prompt.trim(); + + // Replace image placeholders with actual file paths + imagePathMap.forEach((actualPath, placeholder) => { + finalPrompt = finalPrompt.replace(placeholder, `@"${actualPath}"`); + }); + + // Append thinking phrase if not auto mode + const thinkingMode = THINKING_MODES.find(m => m.id === selectedThinkingMode); + if (thinkingMode && thinkingMode.phrase) { + finalPrompt = `${finalPrompt}.\n\n${thinkingMode.phrase}.`; + } + + onSend(finalPrompt, selectedModel); + setPrompt(""); + setEmbeddedImages([]); + setImagePathMap(new Map()); + setTextareaHeight(48); // Reset height after sending + } + }; + const handleTextChange = (e: React.ChangeEvent) => { const newValue = e.target.value; const newCursorPosition = e.target.selectionStart || 0; @@ -745,7 +814,7 @@ const FloatingPromptInputInner = ( for (const item of items) { if (item.type.startsWith('image/')) { e.preventDefault(); - + // Get the image blob const blob = item.getAsFile(); if (!blob) continue; @@ -753,26 +822,100 @@ const FloatingPromptInputInner = ( try { // Convert blob to base64 const reader = new FileReader(); - reader.onload = () => { + reader.onload = async () => { const base64Data = reader.result as string; - - // Add the base64 data URL directly to the prompt - setPrompt(currentPrompt => { - // Use the data URL directly as the image reference - const mention = `@"${base64Data}"`; - const newPrompt = currentPrompt + (currentPrompt.endsWith(' ') || currentPrompt === '' ? '' : ' ') + mention + ' '; - - // Focus the textarea and move cursor to end - setTimeout(() => { - const target = isExpanded ? expandedTextareaRef.current : textareaRef.current; - target?.focus(); - target?.setSelectionRange(newPrompt.length, newPrompt.length); - }, 0); - - return newPrompt; - }); + + // If we have project ID, save the image as a file instead of using base64 + if (projectId) { + try { + console.log('Saving image for projectId:', projectId); + + // Generate a temporary session ID for image storage + const tempSessionId = `temp_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`; + + // Save the image to file and get relative path + const imagePath = await api.savePastedImage(projectId, tempSessionId, base64Data); + + // Add clean image placeholder to the prompt instead of file path + setPrompt(currentPrompt => { + // Calculate the next image number based on existing placeholders + const existingPlaceholders = extractPlaceholders(currentPrompt); + const nextNumber = existingPlaceholders.length + 1; + const imagePlaceholder = nextNumber === 1 ? '[image]' : `[image #${nextNumber}]`; + const newPrompt = currentPrompt + (currentPrompt.endsWith(' ') || currentPrompt === '' ? '' : ' ') + imagePlaceholder + ' '; + + // Store the mapping between placeholder and actual path + setImagePathMap(prevMap => { + const newMap = new Map(prevMap); + newMap.set(imagePlaceholder, imagePath); + return newMap; + }); + + // Focus the textarea and move cursor to end + setTimeout(() => { + const target = isExpanded ? expandedTextareaRef.current : textareaRef.current; + target?.focus(); + target?.setSelectionRange(newPrompt.length, newPrompt.length); + }, 0); + + return newPrompt; + }); + + console.log('Image saved as file:', imagePath); + } catch (error) { + console.error('Failed to save pasted image as file:', error); + // Fallback to the old base64 approach + setPrompt(currentPrompt => { + // Calculate the next image number based on existing placeholders + const existingPlaceholders = extractPlaceholders(currentPrompt); + const nextNumber = existingPlaceholders.length + 1; + const imagePlaceholder = nextNumber === 1 ? '[image]' : `[image #${nextNumber}]`; + const newPrompt = currentPrompt + (currentPrompt.endsWith(' ') || currentPrompt === '' ? '' : ' ') + imagePlaceholder + ' '; + + // Store the mapping between placeholder and actual base64 data + setImagePathMap(prevMap => { + const newMap = new Map(prevMap); + newMap.set(imagePlaceholder, base64Data); + return newMap; + }); + + setTimeout(() => { + const target = isExpanded ? expandedTextareaRef.current : textareaRef.current; + target?.focus(); + target?.setSelectionRange(newPrompt.length, newPrompt.length); + }, 0); + + return newPrompt; + }); + } + } else { + // No project ID available, use base64 as fallback + console.log('No projectId available, falling back to base64. ProjectId:', projectId, 'ProjectPath:', projectPath); + setPrompt(currentPrompt => { + // Calculate the next image number based on existing placeholders + const existingPlaceholders = extractPlaceholders(currentPrompt); + const nextNumber = existingPlaceholders.length + 1; + const imagePlaceholder = nextNumber === 1 ? '[image]' : `[image #${nextNumber}]`; + const newPrompt = currentPrompt + (currentPrompt.endsWith(' ') || currentPrompt === '' ? '' : ' ') + imagePlaceholder + ' '; + + // Store the mapping between placeholder and actual base64 data + setImagePathMap(prevMap => { + const newMap = new Map(prevMap); + newMap.set(imagePlaceholder, base64Data); + return newMap; + }); + + setTimeout(() => { + const target = isExpanded ? expandedTextareaRef.current : textareaRef.current; + target?.focus(); + target?.setSelectionRange(newPrompt.length, newPrompt.length); + }, 0); + + return newPrompt; + }); + } }; - + reader.readAsDataURL(blob); } catch (error) { console.error('Failed to paste image:', error); @@ -796,40 +939,22 @@ const FloatingPromptInputInner = ( }; const handleRemoveImage = (index: number) => { - // Remove the corresponding @mention from the prompt - const imagePath = embeddedImages[index]; - - // For data URLs, we need to handle them specially since they're always quoted - if (imagePath.startsWith('data:')) { - // Simply remove the exact quoted data URL - const quotedPath = `@"${imagePath}"`; - const newPrompt = prompt.replace(quotedPath, '').trim(); - setPrompt(newPrompt); - return; - } - - // For file paths, use the original logic - const escapedPath = imagePath.replace(/[.*+?^${}()|[\]\\]/g, '\\$&'); - const escapedRelativePath = imagePath.replace(projectPath + '/', '').replace(/[.*+?^${}()|[\]\\]/g, '\\$&'); - - // Create patterns for both quoted and unquoted mentions - const patterns = [ - // Quoted full path - new RegExp(`@"${escapedPath}"\\s?`, 'g'), - // Unquoted full path - new RegExp(`@${escapedPath}\\s?`, 'g'), - // Quoted relative path - new RegExp(`@"${escapedRelativePath}"\\s?`, 'g'), - // Unquoted relative path - new RegExp(`@${escapedRelativePath}\\s?`, 'g') - ]; - - let newPrompt = prompt; - for (const pattern of patterns) { - newPrompt = newPrompt.replace(pattern, ''); - } - - setPrompt(newPrompt.trim()); + // Remove the corresponding placeholder from the prompt + const placeholder = embeddedImages[index]; + + // Remove the placeholder from the prompt + const newPrompt = prompt.replace(placeholder, '').replace(/\s+/g, ' ').trim(); + setPrompt(newPrompt); + + // Remove from the image path mapping + setImagePathMap(prevMap => { + const newMap = new Map(prevMap); + newMap.delete(placeholder); + return newMap; + }); + + // Update embedded images array + setEmbeddedImages(prevImages => prevImages.filter((_, i) => i !== index)); }; const selectedModelData = MODELS.find(m => m.id === selectedModel) || MODELS[0]; @@ -877,7 +1002,7 @@ const FloatingPromptInputInner = ( {/* Image previews in expanded mode */} {embeddedImages.length > 0 && ( @@ -1060,7 +1185,7 @@ const FloatingPromptInputInner = ( {/* Image previews */} {embeddedImages.length > 0 && ( diff --git a/src/components/SessionList.tsx b/src/components/SessionList.tsx index 9575c015c..b76df83bb 100644 --- a/src/components/SessionList.tsx +++ b/src/components/SessionList.tsx @@ -1,13 +1,16 @@ -import React, { useState } from "react"; +import React, { useState, useMemo } from "react"; import { motion, AnimatePresence } from "framer-motion"; -import { Clock, MessageSquare } from "lucide-react"; +import { Clock, MessageSquare, Trash2, CheckSquare, Square, Trash, ChevronRight, ChevronDown, Folder, FolderOpen } from "lucide-react"; import { Card } from "@/components/ui/card"; import { Pagination } from "@/components/ui/pagination"; import { ClaudeMemoriesDropdown } from "@/components/ClaudeMemoriesDropdown"; import { TooltipProvider } from "@/components/ui/tooltip"; +import { Button } from "@/components/ui/button"; import { cn } from "@/lib/utils"; import { truncateText, getFirstLine } from "@/lib/date-utils"; +import { api } from "@/lib/api"; import type { Session, ClaudeMdFile } from "@/lib/api"; +import { SessionPersistenceService } from "@/services/sessionPersistence"; interface SessionListProps { /** @@ -26,6 +29,10 @@ interface SessionListProps { * Callback when a session is clicked */ onSessionClick?: (session: Session) => void; + /** + * Callback when a session is deleted + */ + onSessionDelete?: (sessionId: string) => void; /** * Callback when a CLAUDE.md file should be edited */ @@ -38,6 +45,13 @@ interface SessionListProps { const ITEMS_PER_PAGE = 12; +interface HierarchicalSession { + session: Session; + children: Session[]; + isPrimary: boolean; + isExpanded?: boolean; +} + /** * SessionList component - Displays paginated sessions for a specific project * @@ -53,21 +67,233 @@ export const SessionList: React.FC = ({ sessions, projectPath, onSessionClick, + onSessionDelete, onEditClaudeFile, className, }) => { const [currentPage, setCurrentPage] = useState(1); - - // Calculate pagination - const totalPages = Math.ceil(sessions.length / ITEMS_PER_PAGE); + const [selectedSessions, setSelectedSessions] = useState>(new Set()); + const [isMultiSelectMode, setIsMultiSelectMode] = useState(false); + const [expandedSessions, setExpandedSessions] = useState>(new Set()); + + // Process sessions into hierarchical structure + const hierarchicalSessions = useMemo(() => { + const sessionMap = new Map(); + const primarySessions: HierarchicalSession[] = []; + const childSessionsMap = new Map(); + + // Create session map and identify primary sessions + sessions.forEach(session => { + sessionMap.set(session.id, session); + + // Try to get persistence data to identify primary vs child + const persistenceData = SessionPersistenceService.loadSession(session.id); + + if (persistenceData?.isConversationRoot || !persistenceData?.primaryConversationId) { + // This is a primary session + const hierarchicalSession: HierarchicalSession = { + session, + children: [], + isPrimary: true, + isExpanded: expandedSessions.has(session.id) + }; + primarySessions.push(hierarchicalSession); + childSessionsMap.set(session.id, []); + } + }); + + // Group child sessions under their primary sessions + sessions.forEach(session => { + const persistenceData = SessionPersistenceService.loadSession(session.id); + + if (persistenceData?.primaryConversationId && !persistenceData.isConversationRoot) { + // This is a child session + const primaryId = persistenceData.primaryConversationId; + const children = childSessionsMap.get(primaryId); + if (children) { + children.push(session); + } + } + }); + + // Add children to their primary sessions and sort by timestamp + primarySessions.forEach(primarySession => { + const children = childSessionsMap.get(primarySession.session.id) || []; + primarySession.children = children.sort((a, b) => { + const timeA = a.message_timestamp ? new Date(a.message_timestamp).getTime() : a.created_at * 1000; + const timeB = b.message_timestamp ? new Date(b.message_timestamp).getTime() : b.created_at * 1000; + return timeB - timeA; // Sort descending (latest first) + }); + }); + + // Sort primary sessions by most recent activity (primary or child) + return primarySessions.sort((a, b) => { + const getLatestTime = (hs: HierarchicalSession) => { + const primaryTime = hs.session.message_timestamp ? new Date(hs.session.message_timestamp).getTime() : hs.session.created_at * 1000; + const childTimes = hs.children.map(child => child.message_timestamp ? new Date(child.message_timestamp).getTime() : child.created_at * 1000); + return Math.max(primaryTime, ...childTimes); + }; + + return getLatestTime(b) - getLatestTime(a); // Sort descending (most recent first) + }); + }, [sessions, expandedSessions]); + + // Calculate pagination based on hierarchical sessions + const totalPages = Math.ceil(hierarchicalSessions.length / ITEMS_PER_PAGE); const startIndex = (currentPage - 1) * ITEMS_PER_PAGE; const endIndex = startIndex + ITEMS_PER_PAGE; - const currentSessions = sessions.slice(startIndex, endIndex); + const currentHierarchicalSessions = hierarchicalSessions.slice(startIndex, endIndex); + + // Function to toggle expand/collapse for primary sessions + const toggleSessionExpansion = (sessionId: string) => { + setExpandedSessions(prev => { + const newSet = new Set(prev); + if (newSet.has(sessionId)) { + newSet.delete(sessionId); + } else { + newSet.add(sessionId); + } + return newSet; + }); + }; // Reset to page 1 if sessions change React.useEffect(() => { setCurrentPage(1); - }, [sessions.length]); + }, [hierarchicalSessions.length]); + + // Multi-select helper functions + const toggleSessionSelection = (sessionId: string) => { + setSelectedSessions(prev => { + const newSet = new Set(prev); + if (newSet.has(sessionId)) { + newSet.delete(sessionId); + } else { + newSet.add(sessionId); + } + return newSet; + }); + }; + + const selectAllSessions = () => { + const allSessionIds: string[] = []; + currentHierarchicalSessions.forEach(hs => { + allSessionIds.push(hs.session.id); + allSessionIds.push(...hs.children.map(child => child.id)); + }); + setSelectedSessions(new Set(allSessionIds)); + }; + + const clearAllSelections = () => { + setSelectedSessions(new Set()); + }; + + const toggleMultiSelectMode = () => { + setIsMultiSelectMode(!isMultiSelectMode); + setSelectedSessions(new Set()); + }; + + // Check if all current sessions are selected + const allCurrentSessionsSelected = (() => { + const allSessionIds: string[] = []; + currentHierarchicalSessions.forEach(hs => { + allSessionIds.push(hs.session.id); + allSessionIds.push(...hs.children.map(child => child.id)); + }); + return allSessionIds.length > 0 && allSessionIds.every(id => selectedSessions.has(id)); + })(); + + // Bulk delete handler + const handleBulkDelete = async () => { + if (selectedSessions.size === 0) return; + + const sessionIds = Array.from(selectedSessions); + const projectId = currentHierarchicalSessions[0]?.session.project_id; + + if (!projectId) { + alert('Unable to determine project ID'); + return; + } + + const confirmMessage = `Are you sure you want to delete ${sessionIds.length} session${sessionIds.length > 1 ? 's' : ''}?\n\nThis action cannot be undone.`; + + if (!(await window.confirm(confirmMessage))) { + return; + } + + try { + await api.deleteSessionsBulk(sessionIds, projectId); + + // Clean up session persistence data for each deleted session + sessionIds.forEach(sessionId => { + SessionPersistenceService.removeSession(sessionId); + onSessionDelete?.(sessionId); + }); + + // Clear selections and exit multi-select mode + setSelectedSessions(new Set()); + setIsMultiSelectMode(false); + } catch (error) { + console.error('Failed to bulk delete sessions:', error); + alert('Failed to delete sessions. Please try again.'); + } + }; + + const handleDeleteSession = async (e: React.MouseEvent, session: Session) => { + e.preventDefault(); + e.stopPropagation(); // Prevent session click when delete button is clicked + + // Add a small delay to ensure the UI is stable + await new Promise(resolve => setTimeout(resolve, 100)); + + // Check if this is a primary session with children + const persistenceData = SessionPersistenceService.loadSession(session.id); + const isPrimary = persistenceData?.isConversationRoot; + const children = isPrimary ? SessionPersistenceService.getConversationSessions(session.id).filter(id => id !== session.id) : []; + + let confirmMessage = `Are you sure you want to delete this session?\n\nSession: ${session.id.slice(-8)}\nDate: ${session.message_timestamp + ? new Date(session.message_timestamp).toLocaleDateString() + : new Date(session.created_at * 1000).toLocaleDateString()}`; + + if (isPrimary && children.length > 0) { + confirmMessage += `\n\nThis is a primary session with ${children.length} child session${children.length > 1 ? 's' : ''}. Deleting it will also delete all child sessions.`; + } + + confirmMessage += `\n\nThis action cannot be undone.`; + + // Use a more explicit confirmation approach + const confirmed = await window.confirm(confirmMessage); + console.log('Delete confirmation result:', confirmed); + + if (!confirmed) { + console.log('User cancelled deletion'); + return; + } + + console.log('Proceeding with deletion of session:', session.id); + + try { + // If primary session, delete all child sessions first + if (isPrimary && children.length > 0) { + for (const childId of children) { + await api.deleteSession(childId, session.project_id); + SessionPersistenceService.removeSession(childId); + onSessionDelete?.(childId); + } + } + + // Delete the main session + await api.deleteSession(session.id, session.project_id); + + // Clean up session persistence data + SessionPersistenceService.removeSession(session.id); + + onSessionDelete?.(session.id); + } catch (error) { + console.error('Failed to delete session:', error); + alert('Failed to delete session. Please try again.'); + } + }; return ( @@ -86,64 +312,192 @@ export const SessionList: React.FC = ({ )} - -
- {currentSessions.map((session, index) => ( - - +
+ + + {isMultiSelectMode && ( + <> + + + {selectedSessions.size > 0 && ( + + )} + + )} +
+ + {isMultiSelectMode && selectedSessions.size > 0 && ( + + {selectedSessions.size} session{selectedSessions.size > 1 ? 's' : ''} selected + + )} +
+ + +
+ {currentHierarchicalSessions.map((hierarchicalSession, index) => { + const { session, children } = hierarchicalSession; + const isExpanded = expandedSessions.has(session.id); + const hasChildren = children.length > 0; + + return ( + -
-
- {/* Session header */} + {/* Primary Session Card */} + { + if (isMultiSelectMode) { + toggleSessionSelection(session.id); + } else if (hasChildren) { + // Primary sessions with children expand/collapse on click + toggleSessionExpansion(session.id); + } else { + // Primary sessions without children open directly + const event = new CustomEvent('claude-session-selected', { + detail: { session, projectPath } + }); + window.dispatchEvent(event); + onSessionClick?.(session); + } + }} + > +
+ {/* Primary Session Header */}
-
- +
+ {/* Folder icon and expand/collapse for sessions with children */} + {hasChildren ? ( +
+ {isExpanded ? ( + <> + + + + ) : ( + <> + + + + )} +
+ ) : ( + + )} +
-

- Session on {session.message_timestamp - ? new Date(session.message_timestamp).toLocaleDateString('en-US', { - month: 'short', - day: 'numeric', - year: 'numeric' - }) - : new Date(session.created_at * 1000).toLocaleDateString('en-US', { - month: 'short', - day: 'numeric', - year: 'numeric' - }) - } -

+
+

+ {hasChildren ? 'Conversation' : 'Session'} on {session.message_timestamp + ? new Date(session.message_timestamp).toLocaleDateString('en-US', { + month: 'short', + day: 'numeric', + year: 'numeric' + }) + : new Date(session.created_at * 1000).toLocaleDateString('en-US', { + month: 'short', + day: 'numeric', + year: 'numeric' + }) + } +

+ {hasChildren && ( + + {children.length + 1} session{children.length + 1 > 1 ? 's' : ''} + + )} +
- {session.todo_data && ( - - Todo - - )} + +
+ {session.todo_data && ( + + Todo + + )} + + {/* Multi-select checkbox */} + {isMultiSelectMode && ( + + )} + + {/* Delete button (hidden in multi-select mode) */} + {!isMultiSelectMode && ( + + )} +
- + {/* First message preview */} {session.first_message ? (

@@ -154,21 +508,154 @@ export const SessionList: React.FC = ({ No messages yet

)} + + {/* Metadata footer */} +
+

+ {session.id.slice(-8)} +

+ {session.todo_data && ( + + )} +
- - {/* Metadata footer */} -
-

- {session.id.slice(-8)} -

- {session.todo_data && ( - - )} -
-
-
- - ))} + + + {/* Child Sessions */} + {hasChildren && isExpanded && ( + + + {children.map((childSession, childIndex) => ( + + { + if (isMultiSelectMode) { + toggleSessionSelection(childSession.id); + } else { + const event = new CustomEvent('claude-session-selected', { + detail: { session: childSession, projectPath } + }); + window.dispatchEvent(event); + onSessionClick?.(childSession); + } + }} + > +
+ {/* Child Session Header */} +
+
+
+ └─ + +
+ +
+

+ {childSession.message_timestamp + ? new Date(childSession.message_timestamp).toLocaleTimeString('en-US', { + hour: 'numeric', + minute: '2-digit' + }) + : new Date(childSession.created_at * 1000).toLocaleTimeString('en-US', { + hour: 'numeric', + minute: '2-digit' + }) + } +

+
+
+ +
+ {/* Latest session indicator */} + {childIndex === 0 && ( + + Latest + + )} + + {childSession.todo_data && ( + + Todo + + )} + + {/* Multi-select checkbox */} + {isMultiSelectMode && ( + + )} + + {/* Delete button (hidden in multi-select mode) */} + {!isMultiSelectMode && ( + + )} +
+
+ + {/* Child Session Message Preview */} + {childSession.first_message && ( +

+ {truncateText(getFirstLine(childSession.first_message), 100)} +

+ )} + + {/* Child Session Footer */} +
+

+ {childSession.id.slice(-8)} +

+ {childSession.todo_data && ( + + )} +
+
+
+
+ ))} +
+
+ )} + + ); + })}
diff --git a/src/components/SessionOutputViewer.tsx b/src/components/SessionOutputViewer.tsx index eafcc0145..b298f37af 100644 --- a/src/components/SessionOutputViewer.tsx +++ b/src/components/SessionOutputViewer.tsx @@ -8,6 +8,7 @@ import { Toast, ToastContainer } from '@/components/ui/toast'; import { Popover } from '@/components/ui/popover'; import { api } from '@/lib/api'; import { useOutputCache } from '@/lib/outputCache'; +import { useAutoScroll } from '@/hooks'; import type { AgentRun } from '@/lib/api'; import { listen, type UnlistenFn } from '@tauri-apps/api/event'; import { StreamMessage } from './StreamMessage'; @@ -46,7 +47,8 @@ export function SessionOutputViewer({ session, onClose, className }: SessionOutp const [toast, setToast] = useState<{ message: string; type: "success" | "error" } | null>(null); const [copyPopoverOpen, setCopyPopoverOpen] = useState(false); const [hasUserScrolled, setHasUserScrolled] = useState(false); - + const { autoScrollEnabled } = useAutoScroll(); + const scrollAreaRef = useRef(null); const outputEndRef = useRef(null); const fullscreenScrollRef = useRef(null); @@ -66,7 +68,7 @@ export function SessionOutputViewer({ session, onClose, className }: SessionOutp }; const scrollToBottom = () => { - if (!hasUserScrolled) { + if (autoScrollEnabled && !hasUserScrolled) { const endRef = isFullscreen ? fullscreenMessagesEndRef.current : outputEndRef.current; if (endRef) { endRef.scrollIntoView({ behavior: 'smooth' }); @@ -83,11 +85,11 @@ export function SessionOutputViewer({ session, onClose, className }: SessionOutp // Auto-scroll when messages change useEffect(() => { - const shouldAutoScroll = !hasUserScrolled || isAtBottom(); + const shouldAutoScroll = autoScrollEnabled && (!hasUserScrolled || isAtBottom()); if (shouldAutoScroll) { scrollToBottom(); } - }, [messages, hasUserScrolled, isFullscreen]); + }, [messages, hasUserScrolled, isFullscreen, autoScrollEnabled]); const loadOutput = async (skipCache = false) => { @@ -480,11 +482,13 @@ export function SessionOutputViewer({ session, onClose, className }: SessionOutp className="h-full overflow-y-auto p-6 space-y-3" ref={scrollAreaRef} onScroll={() => { + if (!autoScrollEnabled) return; + // Mark that user has scrolled manually if (!hasUserScrolled) { setHasUserScrolled(true); } - + // If user scrolls back to bottom, re-enable auto-scroll if (isAtBottom()) { setHasUserScrolled(false); @@ -614,11 +618,13 @@ export function SessionOutputViewer({ session, onClose, className }: SessionOutp ref={fullscreenScrollRef} className="h-full overflow-y-auto space-y-3" onScroll={() => { + if (!autoScrollEnabled) return; + // Mark that user has scrolled manually if (!hasUserScrolled) { setHasUserScrolled(true); } - + // If user scrolls back to bottom, re-enable auto-scroll if (isAtBottom()) { setHasUserScrolled(false); diff --git a/src/components/Settings.tsx b/src/components/Settings.tsx index 06d338a0c..96df315f6 100644 --- a/src/components/Settings.tsx +++ b/src/components/Settings.tsx @@ -96,6 +96,8 @@ export const Settings: React.FC = ({ const [tabPersistenceEnabled, setTabPersistenceEnabled] = useState(true); // Startup intro preference const [startupIntroEnabled, setStartupIntroEnabled] = useState(true); + // Auto-scroll preference + const [autoScrollEnabled, setAutoScrollEnabled] = useState(true); // Load settings on mount useEffect(() => { @@ -109,6 +111,11 @@ export const Settings: React.FC = ({ const pref = await api.getSetting('startup_intro_enabled'); setStartupIntroEnabled(pref === null ? true : pref === 'true'); })(); + // Load auto-scroll setting (default to true if not set) + (async () => { + const pref = await api.getSetting('auto_scroll_enabled'); + setAutoScrollEnabled(pref === null ? true : pref === 'true'); + })(); }, []); /** @@ -436,20 +443,32 @@ export const Settings: React.FC = ({ onClick={() => setTheme('gray')} className={cn( "flex items-center gap-1.5 px-3 py-1.5 text-xs font-medium rounded-md transition-all", - theme === 'gray' - ? "bg-background shadow-sm" + theme === 'gray' + ? "bg-background shadow-sm" : "hover:bg-background/50" )} > {theme === 'gray' && } Gray +
+ + {/* Auto-scroll Setting */} +
+
+ +

+ Automatically scroll to the bottom when new messages arrive during conversations +

+
+ { + setAutoScrollEnabled(checked); + try { + await api.saveSetting('auto_scroll_enabled', checked ? 'true' : 'false'); + trackEvent.settingsChanged('auto_scroll_enabled', checked); + setToast({ + message: checked + ? 'Auto-scroll enabled' + : 'Auto-scroll disabled', + type: 'success' + }); + } catch (e) { + setToast({ message: 'Failed to update preference', type: 'error' }); + } + }} + /> +
diff --git a/src/components/StreamMessage.tsx b/src/components/StreamMessage.tsx index ae43a5934..dc6edc955 100644 --- a/src/components/StreamMessage.tsx +++ b/src/components/StreamMessage.tsx @@ -48,6 +48,24 @@ interface StreamMessageProps { onLinkDetected?: (url: string) => void; } +/** + * Utility function to convert image file paths back to clean placeholders for display + */ +const cleanImagePaths = (text: string): string => { + if (!text || typeof text !== 'string') return text; + + // Pattern to match image paths: @"./images/session_temp_..." or @"data:image/..." + const imagePathRegex = /@"([^"]*(?:images\/session_temp_[^"]*|data:image\/[^"]*))[^"]*"/g; + + let imageCounter = 0; + const cleanedText = text.replace(imagePathRegex, () => { + imageCounter++; + return imageCounter === 1 ? '[image]' : `[image #${imageCounter}]`; + }); + + return cleanedText; +}; + /** * Component to render a single Claude Code stream message */ @@ -99,6 +117,7 @@ const StreamMessageComponent: React.FC = ({ message, classNa return ( = ({ message, classNa // Text content - render as markdown if (content.type === "text") { // Ensure we have a string to render - const textContent = typeof content.text === 'string' - ? content.text + const rawTextContent = typeof content.text === 'string' + ? content.text : (content.text?.text || JSON.stringify(content.text || content)); - + + // Clean image paths for display + const textContent = cleanImagePaths(rawTextContent); + renderedSomething = true; return (
@@ -358,10 +380,11 @@ const StreamMessageComponent: React.FC = ({ message, classNa return ; } - // Otherwise render as plain text + // Otherwise render as plain text with cleaned image paths + const cleanedContent = cleanImagePaths(contentStr); return (
- {contentStr} + {cleanedContent}
); })() @@ -612,10 +635,13 @@ const StreamMessageComponent: React.FC = ({ message, classNa // Text content if (content.type === "text") { // Handle both string and object formats - const textContent = typeof content.text === 'string' - ? content.text + const rawTextContent = typeof content.text === 'string' + ? content.text : (content.text?.text || JSON.stringify(content.text)); - + + // Clean image paths for display + const textContent = cleanImagePaths(rawTextContent); + renderedSomething = true; return (
diff --git a/src/components/TabContent.tsx b/src/components/TabContent.tsx index e306324ed..39d6c7156 100644 --- a/src/components/TabContent.tsx +++ b/src/components/TabContent.tsx @@ -223,10 +223,16 @@ const TabPanel: React.FC = ({ tab, isActive }) => { initialProjectPath: session.project_path }); }} + onSessionDelete={(sessionId: string) => { + // Remove the deleted session from the local state + setSessions(prevSessions => + prevSessions.filter(s => s.id !== sessionId) + ); + }} onEditClaudeFile={(file: ClaudeMdFile) => { // Open CLAUDE.md file in a new tab - window.dispatchEvent(new CustomEvent('open-claude-file', { - detail: { file } + window.dispatchEvent(new CustomEvent('open-claude-file', { + detail: { file } })); }} /> diff --git a/src/components/ToolWidgets.tsx b/src/components/ToolWidgets.tsx index 5036fb30d..a30825e43 100644 --- a/src/components/ToolWidgets.tsx +++ b/src/components/ToolWidgets.tsx @@ -56,7 +56,7 @@ import { useTheme } from "@/hooks"; import { Button } from "@/components/ui/button"; import { createPortal } from "react-dom"; import * as Diff from 'diff'; -import { Card, CardContent } from "@/components/ui/card"; +import { Card } from "@/components/ui/card"; import { detectLinks, makeLinksClickable } from "@/lib/linkDetector"; import ReactMarkdown from "react-markdown"; import { open } from "@tauri-apps/plugin-shell"; @@ -348,6 +348,8 @@ export const LSResultWidget: React.FC<{ content: string }> = ({ content }) => { * Widget for Read tool */ export const ReadWidget: React.FC<{ filePath: string; result?: any }> = ({ filePath, result }) => { + const [isExpanded, setIsExpanded] = useState(false); + // If we have a result, show it using the ReadResultWidget if (result) { let resultContent = ''; @@ -364,17 +366,41 @@ export const ReadWidget: React.FC<{ filePath: string; result?: any }> = ({ fileP resultContent = JSON.stringify(result.content, null, 2); } } - + + const lineCount = resultContent.split('\n').filter(line => line.trim()).length; + return ( -
-
- - File content: - - {filePath} - -
- {resultContent && } +
+ + + {isExpanded && resultContent && ( +
+ +
+ )} + + {!isExpanded && ( +
+ Click to expand and view file content ({lineCount} lines) +
+ )}
); } @@ -399,7 +425,7 @@ export const ReadWidget: React.FC<{ filePath: string; result?: any }> = ({ fileP /** * Widget for Read tool result - shows file content with line numbers */ -export const ReadResultWidget: React.FC<{ content: string; filePath?: string }> = ({ content, filePath }) => { +export const ReadResultWidget: React.FC<{ content: string; filePath?: string; forceExpanded?: boolean }> = ({ content, filePath, forceExpanded = false }) => { const [isExpanded, setIsExpanded] = useState(false); const { theme } = useTheme(); const syntaxTheme = getClaudeSyntaxTheme(theme); @@ -502,7 +528,34 @@ export const ReadResultWidget: React.FC<{ content: string; filePath?: string }> const language = getLanguage(filePath); const { codeContent, startLineNumber } = parseContent(content); const lineCount = content.split('\n').filter(line => line.trim()).length; - const isLargeFile = lineCount > 20; + const isLargeFile = lineCount > 10; // Lower threshold for collapse + + // When forceExpanded is true, show content directly without any wrapper styling + if (forceExpanded) { + return ( +
+ + {codeContent} + +
+ ); + } return (
@@ -518,18 +571,16 @@ export const ReadResultWidget: React.FC<{ content: string; filePath?: string }> )}
- {isLargeFile && ( - - )} +
- - {(!isLargeFile || isExpanded) && ( + + {isExpanded && (
)} + + {!isExpanded && ( +
+ Click "Expand" to view the file content ({lineCount} lines) {isLargeFile && !isExpanded && (
Click "Expand" to view the full file +
)}
@@ -698,6 +754,7 @@ export const BashWidget: React.FC<{ */ export const WriteWidget: React.FC<{ filePath: string; content: string; result?: any }> = ({ filePath, content, result: _result }) => { const [isMaximized, setIsMaximized] = useState(false); + const [isExpanded, setIsExpanded] = useState(false); const { theme } = useTheme(); const syntaxTheme = getClaudeSyntaxTheme(theme); @@ -849,6 +906,8 @@ export const WriteWidget: React.FC<{ filePath: string; content: string; result?:
); + const lineCount = content.split('\n').length; + return (
@@ -857,8 +916,23 @@ export const WriteWidget: React.FC<{ filePath: string; content: string; result?: {filePath} +
- + + {isExpanded && } + + {!isExpanded && ( +
+ Click "Preview" to view the file content ({lineCount} lines) +
+ )} +
); @@ -1120,12 +1194,13 @@ const getLanguage = (path: string) => { /** * Widget for Edit tool - shows the edit operation */ -export const EditWidget: React.FC<{ - file_path: string; - old_string: string; +export const EditWidget: React.FC<{ + file_path: string; + old_string: string; new_string: string; result?: any; }> = ({ file_path, old_string, new_string, result: _result }) => { + const [isExpanded, setIsExpanded] = useState(false); const { theme } = useTheme(); const syntaxTheme = getClaudeSyntaxTheme(theme); @@ -1135,6 +1210,7 @@ export const EditWidget: React.FC<{ }); const language = getLanguage(file_path); + return (
@@ -1143,8 +1219,18 @@ export const EditWidget: React.FC<{ {file_path} +
+ {isExpanded && ( +
+
{diffResult.map((part, index) => { @@ -1194,7 +1280,14 @@ export const EditWidget: React.FC<{ ); })}
-
+
+ )} + + {!isExpanded && ( +
+ Click "Show Diff" to view the edit changes +
+ )}
); }; @@ -1203,6 +1296,7 @@ export const EditWidget: React.FC<{ * Widget for Edit tool result - shows a diff view */ export const EditResultWidget: React.FC<{ content: string }> = ({ content }) => { + const [isExpanded, setIsExpanded] = useState(false); const { theme } = useTheme(); const syntaxTheme = getClaudeSyntaxTheme(theme); @@ -1240,7 +1334,24 @@ export const EditResultWidget: React.FC<{ content: string }> = ({ content }) => const startLineNumber = firstNumberedLine ? parseInt(firstNumberedLine.lineNumber) : 1; const language = getLanguage(filePath); + const resultLineCount = codeContent.split('\n').length; + return ( +
+
+
+ + Edit Result + {filePath && ( + <> + + {filePath} + + )} +
+
+ + {isExpanded && ( +
+ + {codeContent} + +
+ )} + + {!isExpanded && ( +
+ Edit completed successfully. Click "View" to see the result ({resultLineCount} lines) +
+ )}
); }; @@ -1696,22 +1844,34 @@ export const MultiEditWidget: React.FC<{ /** * Widget for displaying MultiEdit tool results with diffs */ -export const MultiEditResultWidget: React.FC<{ +export const MultiEditResultWidget: React.FC<{ content: string; edits?: Array<{ old_string: string; new_string: string }>; }> = ({ content, edits }) => { + const [isExpanded, setIsExpanded] = useState(false); + // If we have the edits array, show a nice diff view if (edits && edits.length > 0) { return (
-
- - - {edits.length} Changes Applied - +
+
+ + + {edits.length} Changes Applied + +
+
- -
+ + {isExpanded && ( +
{edits.map((edit, index) => { // Split the strings into lines for diff display const oldLines = edit.old_string.split('\n'); @@ -1757,7 +1917,14 @@ export const MultiEditResultWidget: React.FC<{
); })} -
+
+ )} + + {!isExpanded && ( +
+ {edits.length} changes applied successfully. Click "View Changes" to see details. +
+ )}
); } @@ -1800,16 +1967,18 @@ export const SystemReminderWidget: React.FC<{ message: string }> = ({ message }) */ export const SystemInitializedWidget: React.FC<{ sessionId?: string; + childSessionId?: string; model?: string; cwd?: string; tools?: string[]; -}> = ({ sessionId, model, cwd, tools = [] }) => { +}> = ({ sessionId, childSessionId, model, cwd, tools = [] }) => { + const [isExpanded, setIsExpanded] = useState(false); const [mcpExpanded, setMcpExpanded] = useState(false); - + // Separate regular tools from MCP tools const regularTools = tools.filter(tool => !tool.startsWith('mcp__')); const mcpTools = tools.filter(tool => tool.startsWith('mcp__')); - + // Tool icon mapping for regular tools const toolIcons: Record = { 'task': CheckSquare, @@ -1829,13 +1998,13 @@ export const SystemInitializedWidget: React.FC<{ 'todowrite': ListPlus, 'websearch': Globe2, }; - + // Get icon for a tool, fallback to Wrench const getToolIcon = (toolName: string) => { const normalizedName = toolName.toLowerCase(); return toolIcons[normalizedName] || Wrench; }; - + // Format MCP tool name (remove mcp__ prefix and format underscores) const formatMcpToolName = (toolName: string) => { // Remove mcp__ prefix @@ -1863,7 +2032,7 @@ export const SystemInitializedWidget: React.FC<{ .join(' ') }; }; - + // Group MCP tools by provider const mcpToolsByProvider = mcpTools.reduce((acc, tool) => { const { provider } = formatMcpToolName(tool); @@ -1873,48 +2042,77 @@ export const SystemInitializedWidget: React.FC<{ acc[provider].push(tool); return acc; }, {} as Record); - + return ( - - -
- -
-

System Initialized

- +
+ + + {isExpanded && ( +
+
+ {/* Session Info */} -
+
{sessionId && (
- Session ID: + Primary Session: - {sessionId} + ...{sessionId.slice(-8)} + + ({new Date().toLocaleDateString('en-US', { month: 'short', day: 'numeric', year: 'numeric' })}) +
)} - - {model && ( -
- - Model: + + {childSessionId && childSessionId !== sessionId && ( +
+ └─ Child: - {model} - -
- )} - - {cwd && ( -
- - Working Directory: - - {cwd} + ...{childSessionId.slice(-8)} + + (current) +
)}
- + + {model && ( +
+ + Model: + + {model} + +
+ )} + + {cwd && ( +
+ + Working Directory: + + {cwd} + +
+ )} + {/* Regular Tools */} {regularTools.length > 0 && (
@@ -1928,9 +2126,9 @@ export const SystemInitializedWidget: React.FC<{ {regularTools.map((tool, idx) => { const Icon = getToolIcon(tool); return ( - @@ -1941,7 +2139,7 @@ export const SystemInitializedWidget: React.FC<{
)} - + {/* MCP Tools */} {mcpTools.length > 0 && (
@@ -1956,7 +2154,7 @@ export const SystemInitializedWidget: React.FC<{ mcpExpanded && "rotate-180" )} /> - + {mcpExpanded && (
{Object.entries(mcpToolsByProvider).map(([provider, providerTools]) => ( @@ -1970,9 +2168,9 @@ export const SystemInitializedWidget: React.FC<{ {providerTools.map((tool, idx) => { const { method } = formatMcpToolName(tool); return ( - {method} @@ -1986,7 +2184,7 @@ export const SystemInitializedWidget: React.FC<{ )}
)} - + {/* Show message if no tools */} {tools.length === 0 && (
@@ -1995,8 +2193,8 @@ export const SystemInitializedWidget: React.FC<{ )}
- - + )} +
); }; diff --git a/src/components/claude-code-session/MessageList.tsx b/src/components/claude-code-session/MessageList.tsx index c89912fb0..33040a66e 100644 --- a/src/components/claude-code-session/MessageList.tsx +++ b/src/components/claude-code-session/MessageList.tsx @@ -4,6 +4,7 @@ import { useVirtualizer } from '@tanstack/react-virtual'; import { StreamMessage } from '../StreamMessage'; import { Terminal } from 'lucide-react'; import { cn } from '@/lib/utils'; +import { useAutoScroll } from '@/hooks'; import type { ClaudeStreamMessage } from '../AgentExecution'; interface MessageListProps { @@ -24,6 +25,7 @@ export const MessageList: React.FC = React.memo(({ const scrollContainerRef = useRef(null); const shouldAutoScrollRef = useRef(true); const userHasScrolledRef = useRef(false); + const { autoScrollEnabled } = useAutoScroll(); // Virtual scrolling setup const virtualizer = useVirtualizer({ @@ -35,20 +37,23 @@ export const MessageList: React.FC = React.memo(({ // Auto-scroll to bottom when new messages arrive useEffect(() => { + // Only scroll if auto-scroll is enabled AND we should auto-scroll + if (!autoScrollEnabled) return; + if (shouldAutoScrollRef.current && scrollContainerRef.current) { const scrollElement = scrollContainerRef.current; scrollElement.scrollTop = scrollElement.scrollHeight; } - }, [messages]); + }, [messages, autoScrollEnabled]); // Handle scroll events to detect user scrolling const handleScroll = () => { - if (!scrollContainerRef.current) return; - + if (!autoScrollEnabled || !scrollContainerRef.current) return; + const scrollElement = scrollContainerRef.current; - const isAtBottom = + const isAtBottom = Math.abs(scrollElement.scrollHeight - scrollElement.scrollTop - scrollElement.clientHeight) < 50; - + if (!isAtBottom) { userHasScrolledRef.current = true; shouldAutoScrollRef.current = false; @@ -60,11 +65,11 @@ export const MessageList: React.FC = React.memo(({ // Reset auto-scroll when streaming stops useEffect(() => { - if (!isStreaming) { + if (!isStreaming && autoScrollEnabled) { shouldAutoScrollRef.current = true; userHasScrolledRef.current = false; } - }, [isStreaming]); + }, [isStreaming, autoScrollEnabled]); if (messages.length === 0) { return ( diff --git a/src/contexts/ThemeContext.tsx b/src/contexts/ThemeContext.tsx index cb779d9a1..b8b815627 100644 --- a/src/contexts/ThemeContext.tsx +++ b/src/contexts/ThemeContext.tsx @@ -1,7 +1,7 @@ import React, { createContext, useState, useContext, useCallback, useEffect } from 'react'; import { api } from '../lib/api'; -export type ThemeMode = 'dark' | 'gray' | 'light' | 'custom'; +export type ThemeMode = 'dark' | 'gray' | 'zinc' | 'light' | 'custom'; export interface CustomThemeColors { background: string; @@ -104,7 +104,7 @@ export const ThemeProvider: React.FC<{ children: React.ReactNode }> = ({ childre const root = document.documentElement; // Remove all theme classes - root.classList.remove('theme-dark', 'theme-gray', 'theme-light', 'theme-custom'); + root.classList.remove('theme-dark', 'theme-gray', 'theme-zinc', 'theme-light', 'theme-custom'); // Add new theme class root.classList.add(`theme-${themeMode}`); diff --git a/src/hooks/index.ts b/src/hooks/index.ts index b817e00f9..7ad401e93 100644 --- a/src/hooks/index.ts +++ b/src/hooks/index.ts @@ -19,8 +19,9 @@ export { useAIInteractionTracking, useNetworkPerformanceTracking } from './useAnalytics'; -export { - usePerformanceMonitor, - useAsyncPerformanceTracker +export { + usePerformanceMonitor, + useAsyncPerformanceTracker } from './usePerformanceMonitor'; +export { useAutoScroll } from './useAutoScroll'; export { TAB_SCREEN_NAMES } from './useAnalytics'; diff --git a/src/hooks/useAutoScroll.ts b/src/hooks/useAutoScroll.ts new file mode 100644 index 000000000..83f0326be --- /dev/null +++ b/src/hooks/useAutoScroll.ts @@ -0,0 +1,67 @@ +import { useState, useEffect } from 'react'; +import { api } from '@/lib/api'; + +/** + * Custom hook for managing auto-scroll preference globally + * This hook handles loading, saving, and providing the auto-scroll setting + * across the application. + */ +export const useAutoScroll = () => { + const [autoScrollEnabled, setAutoScrollEnabled] = useState(true); + const [isLoading, setIsLoading] = useState(true); + + // Load auto-scroll preference from storage + useEffect(() => { + const loadAutoScrollPreference = async () => { + try { + setIsLoading(true); + + // First check localStorage for instant loading + const cached = typeof window !== 'undefined' + ? window.localStorage.getItem('app_setting:auto_scroll_enabled') + : null; + + if (cached === 'true') { + setAutoScrollEnabled(true); + } else if (cached === 'false') { + setAutoScrollEnabled(false); + } + + // Then verify from database + const pref = await api.getSetting('auto_scroll_enabled'); + const enabled = pref === null ? true : pref === 'true'; // Default to true + setAutoScrollEnabled(enabled); + } catch (error) { + console.error('Failed to load auto-scroll preference:', error); + // Default to enabled on error + setAutoScrollEnabled(true); + } finally { + setIsLoading(false); + } + }; + + loadAutoScrollPreference(); + }, []); + + /** + * Update the auto-scroll preference + * @param enabled - Whether auto-scroll should be enabled + */ + const updateAutoScrollPreference = async (enabled: boolean) => { + try { + setAutoScrollEnabled(enabled); + await api.saveSetting('auto_scroll_enabled', enabled ? 'true' : 'false'); + } catch (error) { + console.error('Failed to save auto-scroll preference:', error); + // Revert on error + setAutoScrollEnabled(!enabled); + throw error; + } + }; + + return { + autoScrollEnabled, + isLoading, + updateAutoScrollPreference, + }; +}; \ No newline at end of file diff --git a/src/lib/api.ts b/src/lib/api.ts index dd3641db7..330959e79 100644 --- a/src/lib/api.ts +++ b/src/lib/api.ts @@ -502,6 +502,52 @@ export const api = { } }, + /** + * Deletes a session and its associated data + * @param sessionId - The ID of the session to delete + * @param projectId - The ID of the project the session belongs to + * @returns Promise resolving when the session is deleted + */ + async deleteSession(sessionId: string, projectId: string): Promise { + try { + return await invoke('delete_session', { sessionId, projectId }); + } catch (error) { + console.error("Failed to delete session:", error); + throw error; + } + }, + + /** + * Deletes multiple sessions and their associated data + * @param sessionIds - Array of session IDs to delete + * @param projectId - The ID of the project the sessions belong to + * @returns Promise resolving with deletion results + */ + async deleteSessionsBulk(sessionIds: string[], projectId: string): Promise { + try { + return await invoke('delete_sessions_bulk', { sessionIds, projectId }); + } catch (error) { + console.error("Failed to bulk delete sessions:", error); + throw error; + } + }, + + /** + * Saves a pasted image to a file and returns the relative path + * @param projectId - The ID of the project to save the image in + * @param sessionId - The ID of the current session + * @param base64Data - The base64 data URL of the image + * @returns Promise resolving to the relative path of the saved image + */ + async savePastedImage(projectId: string, sessionId: string, base64Data: string): Promise { + try { + return await invoke('save_pasted_image', { projectId, sessionId, base64Data }); + } catch (error) { + console.error("Failed to save pasted image:", error); + throw error; + } + }, + /** * Fetch list of agents from GitHub repository * @returns Promise resolving to list of available agents on GitHub diff --git a/src/lib/claudeSyntaxTheme.ts b/src/lib/claudeSyntaxTheme.ts index 31cc8b83d..d96b1c380 100644 --- a/src/lib/claudeSyntaxTheme.ts +++ b/src/lib/claudeSyntaxTheme.ts @@ -35,6 +35,19 @@ export const getClaudeSyntaxTheme = (theme: ThemeMode): any => { variable: '#c084fc', // Purple operator: '#a1a1aa', }, + zinc: { + base: '#f4f4f5', // zinc-100 - crisp white text + background: 'transparent', + comment: '#71717a', // zinc-500 - muted gray + punctuation: '#a1a1aa', // zinc-400 - medium gray + property: '#e4e4e7', // zinc-200 - light gray + tag: '#d4d4d8', // zinc-300 - subtle gray + string: '#d4d4d8', // zinc-300 - neutral strings + function: '#f4f4f5', // zinc-100 - white functions + keyword: '#e4e4e7', // zinc-200 - light keywords + variable: '#a1a1aa', // zinc-400 - gray variables + operator: '#a1a1aa', // zinc-400 - gray operators + }, light: { base: '#1f2937', background: 'transparent', diff --git a/src/services/sessionPersistence.ts b/src/services/sessionPersistence.ts index 2c3f513e9..cbaebfd54 100644 --- a/src/services/sessionPersistence.ts +++ b/src/services/sessionPersistence.ts @@ -15,13 +15,16 @@ export interface SessionRestoreData { lastMessageCount?: number; scrollPosition?: number; timestamp: number; + // Conversation tracking + primaryConversationId?: string; // The first session ID in this conversation + isConversationRoot?: boolean; // True for the first session in a conversation } export class SessionPersistenceService { /** * Save session data for later restoration */ - static saveSession(sessionId: string, projectId: string, projectPath: string, messageCount?: number, scrollPosition?: number): void { + static saveSession(sessionId: string, projectId: string, projectPath: string, messageCount?: number, scrollPosition?: number, primaryConversationId?: string): void { try { const sessionData: SessionRestoreData = { sessionId, @@ -29,7 +32,9 @@ export class SessionPersistenceService { projectPath, lastMessageCount: messageCount, scrollPosition, - timestamp: Date.now() + timestamp: Date.now(), + primaryConversationId: primaryConversationId || sessionId, // Use this sessionId as primary if none provided + isConversationRoot: !primaryConversationId // True if no primary provided (this is the first session) }; // Save individual session data @@ -171,4 +176,53 @@ export class SessionPersistenceService { first_message: "Restored session" }; } + + /** + * Get only the primary conversation sessions (filter out continuation sessions) + */ + static getPrimaryConversationSessions(): string[] { + try { + const index = this.getSessionIndex(); + const primarySessions: string[] = []; + + index.forEach(sessionId => { + const data = this.loadSession(sessionId); + if (data && data.isConversationRoot) { + primarySessions.push(sessionId); + } + }); + + return primarySessions; + } catch (error) { + console.error('Failed to get primary conversation sessions:', error); + return []; + } + } + + /** + * Get all sessions that belong to a specific conversation + */ + static getConversationSessions(primaryConversationId: string): string[] { + try { + const index = this.getSessionIndex(); + const conversationSessions: string[] = []; + + index.forEach(sessionId => { + const data = this.loadSession(sessionId); + if (data && data.primaryConversationId === primaryConversationId) { + conversationSessions.push(sessionId); + } + }); + + return conversationSessions.sort((a, b) => { + const dataA = this.loadSession(a); + const dataB = this.loadSession(b); + if (!dataA || !dataB) return 0; + return dataA.timestamp - dataB.timestamp; // Sort by timestamp + }); + } catch (error) { + console.error('Failed to get conversation sessions:', error); + return []; + } + } } diff --git a/src/stores/sessionStore.ts b/src/stores/sessionStore.ts index d30d4fae3..c865ffc90 100644 --- a/src/stores/sessionStore.ts +++ b/src/stores/sessionStore.ts @@ -124,9 +124,8 @@ const sessionStore: StateCreator< // Delete session deleteSession: async (sessionId: string, projectId: string) => { try { - // Note: API doesn't have a deleteSession method, so this is a placeholder - console.warn('deleteSession not implemented in API'); - + await api.deleteSession(sessionId, projectId); + // Update local state set((state) => ({ sessions: { @@ -140,7 +139,7 @@ const sessionStore: StateCreator< ) })); } catch (error) { - set({ + set({ error: error instanceof Error ? error.message : 'Failed to delete session' }); throw error; diff --git a/src/styles.css b/src/styles.css index 24d418bb8..3914990ec 100644 --- a/src/styles.css +++ b/src/styles.css @@ -217,12 +217,54 @@ html { --color-border: oklch(0.32 0.01 240); --color-input: oklch(0.32 0.01 240); --color-ring: oklch(0.55 0.015 240); - + /* Additional colors for status messages */ --color-green-500: oklch(0.72 0.20 142); --color-green-600: oklch(0.64 0.22 142); } +/* Zinc Theme - Dark black-gray with no tint */ +.theme-zinc { + /* Core backgrounds - pure neutral dark gray */ + --color-background: oklch(0.13 0 0); /* Pure dark gray, almost black */ + --color-foreground: oklch(0.98 0 0); /* Pure white */ + + /* Card and surfaces - neutral grays */ + --color-card: oklch(0.17 0 0); /* Slightly lighter dark gray */ + --color-card-foreground: oklch(0.96 0 0); /* Off-white */ + --color-popover: oklch(0.15 0 0); /* Between background and card */ + --color-popover-foreground: oklch(0.96 0 0); + + /* Primary - pure white for contrast */ + --color-primary: oklch(0.98 0 0); /* Pure white */ + --color-primary-foreground: oklch(0.13 0 0); + + /* Secondary elements - mid neutral gray */ + --color-secondary: oklch(0.21 0 0); /* Mid-dark gray */ + --color-secondary-foreground: oklch(0.85 0 0); /* Light gray */ + + /* Muted areas - pure neutral */ + --color-muted: oklch(0.15 0 0); /* Very dark gray */ + --color-muted-foreground: oklch(0.50 0 0); /* Medium gray */ + + /* Accent - neutral gray highlight */ + --color-accent: oklch(0.21 0 0); /* Mid-dark gray */ + --color-accent-foreground: oklch(0.96 0 0); + + /* System colors */ + --color-destructive: oklch(0.6 0.2 25); + --color-destructive-foreground: oklch(0.98 0 0); + + /* Borders and inputs - pure neutral grays */ + --color-border: oklch(0.25 0 0); /* Medium-dark gray */ + --color-input: oklch(0.15 0 0); /* Very dark gray */ + --color-ring: oklch(0.50 0 0); /* Medium gray */ + + /* Status colors */ + --color-green-500: oklch(0.72 0.20 142); + --color-green-600: oklch(0.64 0.22 142); +} + /* White Theme (High Contrast Light) */ .theme-white { --color-background: oklch(0.99 0 240);