Twenty-four files disappeared from my working tree on a Tuesday afternoon. No error message. No warning. I'd asked Claude Code to commit some changes, and five git commands later, three days of conversation backups were gone forever.

The root cause wasn't a bug. It was an AI agent making a reasonable-sounding but catastrophically wrong decision under routine conditions. This is how it happened and what I built to prevent it.

---

## The Setup

I use Claude Code as the operating system for my Obsidian-based Second Brain. Claude has deep context about my vault structure, conventions, and preferences through persistent instruction files.

I asked it to commit some pending changes. A routine task I've done hundreds of times before.

---

## What Was in the Working Tree

Three groups of uncommitted changes:

- **5 modified files** — Readwise archive documents that had been re-synced with updated content. These were already tracked by git.
- **1 new file** — A web search about reading difficulty and word skipping. Untracked.
- **23 new files** — Conversation backup transcripts from the past three days. Untracked.

Total: 29 files. 24 of them had never been committed to git.

---

## The Cascade

My vault's `CLAUDE.md` has a rule: before any commit, run `git pull --rebase` first. This prevents divergence with the remote. Claude followed the rule.

**Step 1: `git pull --rebase` fails.**

```
error: cannot pull with rebase: You have unstaged changes.
```

The 5 modified Readwise files blocked the rebase. Fair enough.

**Step 2: Claude decides to stash.**

The plan: stash the changes, pull, then pop the stash back. A standard git workflow.

`git stash` only captures tracked changes. The first attempt stashed one copy of the modified files, but the modifications weren't staged, and something about the index state meant they persisted in the working tree after the stash.
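That tracked-only behavior is easy to reproduce in a throwaway repo. A minimal sketch (file names are made up for illustration):

```shell
# Plain `git stash` shelves only tracked changes;
# untracked files stay on disk, untouched.
set -eu
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

echo v1 > tracked.md
git add tracked.md
git commit -qm "initial"

echo v2 > tracked.md          # unstaged modification to a tracked file
echo notes > untracked.md     # never added to git

git stash -q                  # no flags: captures tracked changes only
grep -q v1 tracked.md         # the modification was shelved...
test -f untracked.md          # ...but the untracked file never moved
echo "untracked file survived plain git stash"
```

This is exactly why the first stash didn't "clear" the working tree of everything, and why escalating to `--include-untracked` looked like the logical next step.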
**Step 3: Claude escalates to `git stash --include-untracked`.**

Since `git stash` alone didn't clear the working tree, Claude tried `git stash --include-untracked`. This flag tells git to also capture every untracked file in the repository. All 24 untracked files got sucked into a stash entry. They were no longer on disk. They existed only inside the stash.

**Step 4: The pull still fails.**

Even after two stash attempts, the modified files somehow remained in the working tree. The pull was still blocked. Two stash entries now existed, both containing copies of the data.

**Step 5: Claude drops both stashes.**

```bash
git stash drop stash@{1} && git stash drop stash@{0}
```

This is the moment of destruction. When you drop a git stash, its commit becomes unreachable, and git's garbage collection eventually destroys it. For tracked files, this doesn't matter: the committed versions still exist in the repository's object database. But untracked files captured by `--include-untracked` have *no other copy in git*. Dropping the stash destroys the only copy.

24 files permanently deleted. No recovery path. No `git reflog`. No undo.

---

## The Irony

**Untracked files never block `git pull --rebase`.** Only unstaged changes to tracked files cause that error.

The 24 untracked files would have sat there harmlessly through any number of pulls and rebases. They didn't need to be stashed. They didn't need to be touched at all.

Claude stashed them because `git stash --include-untracked` was the next escalation in the "clear the working tree" playbook. It solved a problem that didn't exist, and destroyed data in the process.

---

## What I Built to Prevent This

A CLAUDE.md rule alone isn't enough. Rules get overlooked under pressure. The rule to "never overwrite original documents" has been in my CLAUDE.md since day one. This incident violated the same principle: destroying user data through a side effect.
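Before the fixes, it's worth verifying the claim from the Irony section. A sketch using two throwaway local repos standing in for a remote and its clone (all names made up):

```shell
# Untracked files do not block `git pull --rebase`.
set -eu
work=$(mktemp -d)
git init -q --bare "$work/origin.git"
git clone -q "$work/origin.git" "$work/clone"
cd "$work/clone"
git config user.email demo@example.com
git config user.name demo

echo v1 > tracked.md
git add tracked.md
git commit -qm "initial"
git push -q -u origin HEAD    # -u sets the upstream for later pulls

# Simulate a commit arriving on the remote from another machine.
git clone -q "$work/origin.git" "$work/other"
( cd "$work/other" \
  && git config user.email demo@example.com \
  && git config user.name demo \
  && echo upstream > upstream.md \
  && git add upstream.md \
  && git commit -qm "upstream change" \
  && git push -q origin HEAD )

# Back in the first clone: an untracked file, like the 24 lost ones.
echo backup > conversation-backup.md

git pull -q --rebase          # succeeds; untracked files are ignored
test -f conversation-backup.md
test -f upstream.md
echo "pull --rebase succeeded with untracked files present"
```

The rebase fast-forwards around the untracked file without complaint. The stash was never needed in the first place.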
### Layer 1: The Rule

Added to `~/.claude/CLAUDE.md` under Safety & Permissions, deployed via nix-darwin across all machines:

> Never use `git stash --include-untracked` / `git stash -u` — untracked files in a stash are permanently destroyed on drop or failed pop. Caused irreversible loss of 24 files (2026-03-18). If tracked files block rebase, commit them first. Untracked files never block pull/rebase — leave them alone.

### Layer 2: The Hook

A `PreToolUse` hook intercepts every Bash tool call *before it executes* and pattern-matches against dangerous commands:

```bash
#!/usr/bin/env bash
set -euo pipefail

# Claude Code delivers the tool-call payload as JSON on stdin.
INPUT=$(cat)
TOOL_NAME=$(echo "$INPUT" | jq -r '.tool_name // ""')

# Only Bash tool calls are of interest; let everything else through.
if [ "$TOOL_NAME" != "Bash" ]; then
  exit 0
fi

COMMAND=$(echo "$INPUT" | jq -r '.tool_input.command // ""')

# Match both the long flag and the short -u form.
if echo "$COMMAND" | grep -qE 'git\s+stash\s+.*(-u\b|--include-untracked)'; then
  echo "BLOCKED: git stash --include-untracked / -u destroys untracked files irreversibly." >&2
  exit 2
fi

exit 0
```

Exit code 2 tells Claude Code to reject the tool call. The command never runs.

It doesn't matter if Claude "forgets" the rule, gets confused under pressure, or decides stashing is the right approach. The hook physically prevents the destructive action.

Policy says don't. Mechanism makes it impossible. You need both, but when they conflict, mechanism wins.

---

## The Broader Lesson

AI agents are most dangerous when they're being helpful.

This wasn't a hallucination or a misunderstanding. Claude correctly identified that the working tree needed to be clean for `git pull --rebase`. It correctly knew that `git stash` is the standard tool for temporarily shelving changes. It escalated to `--include-untracked` because the simpler version didn't work. Every step was locally rational. The cascade was globally catastrophic.

Yes, I should have committed those 24 files sooner. Three days of untracked files is too long. But the point is: an agent should not destroy data while trying to commit data.
The human mistake (not committing) should not become an AI-assisted catastrophe.

And yes, you could argue I shouldn't give an AI agent unsupervised access to destructive git operations. That's a fair point. The hook is my answer: supervised by mechanism, not by my attention span.

This is the failure mode you should be most worried about with AI coding agents:

1. **Reasonable escalation.** When plan A fails, the agent tries plan B. Plan B has a larger blast radius but still sounds reasonable. Nobody pauses to ask whether the problem being solved actually requires this escalation.
2. **Invisible destruction.** The 24 files disappeared silently. No error. No warning. `git stash --include-untracked` succeeded. The data loss only became apparent when I later asked about the commit status.
3. **Irreversibility gaps.** The agent doesn't model which operations are reversible. `git stash` on tracked files is reversible. `git stash --include-untracked` on never-committed files is irreversible. Same command, different consequences, depending on whether the files were ever committed.

If you use AI coding agents with terminal access, build mechanical safeguards. PreToolUse hooks in Claude Code, custom wrappers in other tools, shell aliases that intercept dangerous patterns. Rules are suggestions. Hooks are laws.

Commit early. The gap between "file exists on disk" and "file exists in git" is where data dies.

---

*I lost three days of conversation backups and a research note about reading difficulty. The conversation backups will regenerate. The research note won't. The preventive hook took 15 minutes to build. The files it protects are irreplaceable.*