*Personal knowledge management across 8 tools and 20 years (published March 24, 2026)*
---
## The Graveyard
I've kept notes since 2003. Eight tools, two data losses, one rebuild from scratch. The collection grew steadily. My ability to use it didn't.
I have 2,000 markdown files in an Obsidian vault now. Software engineering, family management, household operations, economic analysis, philosophy, and whatever I was thinking about on a given Tuesday.
At 500 notes, I could hold the structure in my head. At 1,500, I couldn't. You know the feeling. Well-intentioned, carefully built, comprehensively tagged. Still a graveyard.
---
## The Same Problem, Eight Times
The tools changed. The problem didn't.
**Capture era** (2003-2014): Handwritten notebooks, text files, custom PHP apps, Evernote. Each one made it easier to get things in. Evernote's web clipper was so frictionless I clipped everything. Thousands of notes I never looked at again.
**Organization era** (2015-2021): Google Docs, DEVONthink. DEVONthink's "See Also" was the first time a tool surfaced connections for me. But the suggestions were interesting without being actionable, and my notes were locked in a proprietary database.
**Ownership era** (2022-2025): Obsidian, Basecamp. Plain text I actually controlled. Wiki-links, knowledge graph, no vendor lock-in. Nothing broke. But the fundamental problem persisted: 1,500+ notes and no systematic way to surface connections, synthesize across documents, or rediscover forgotten material.
Every tool improved capture. None solved use.
Forte calls it the Volume Paradox: note volume grows linearly, discovery effort grows exponentially, available time stays constant. The bottleneck shifted from capture to synthesis years ago. The hard part was never getting things in. It was filtering, distilling, connecting, and expressing.
---
## What PARA Changed
PARA solved the organizational paralysis that preceded synthesis. You know the taxonomy trap. Is this economics or career development? PARA replaced "What is this about?" with "When will I need this?" and that single reframing was enough to eliminate the filing friction.
It survived every tool migration because it's just folders. It made "good enough" the standard because wrong placement costs a 30-second move, not a broken taxonomy.
PARA organized the graveyard. The notes were still buried. They were just buried neatly.
---
## What AI Changed
When I connected Claude Code to my vault through a `CLAUDE.md` context file, the collection became operational.
`CLAUDE.md` teaches the AI how my vault works: roughly 12,000 words encoding my organizational philosophy, verification standards, and working preferences. Every session starts from shared understanding instead of a cold introduction.
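A sketch of what such a file might look like. The section names and rules below are illustrative, not quoted from my actual file:

```markdown
# CLAUDE.md (vault operating manual)

## Organizational philosophy
- PARA at the top level: Projects, Areas, Resources, Archive
- Wrong placement is cheap; prefer "good enough" filing over taxonomy debates

## Verification standards
- Never state a vault fact without naming the source note
- Flag low-confidence matches when slotting or linking documents

## Working preferences
- Summaries in plain prose, not walls of bullets
- Ask before moving or deleting any file
```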
A typical day:
Morning: `/daily-briefing` pulls news into a dated vault file. `/diceroll` picks a random document from the entire vault. A philosophical framework from 2024 resurfaces during a technical problem in 2026. Search finds what you know you need. Randomness finds what you forgot you had.
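The core of a command like `/diceroll` is tiny. A minimal sketch in Python; the function name and vault layout are my assumptions, not Claude Code's actual implementation:

```python
import random
from pathlib import Path

def diceroll(vault: str) -> Path:
    """Return one uniformly random markdown note from the whole vault.

    Every note gets equal odds, which is exactly the point: no recency
    bias, no relevance ranking, just forced rediscovery.
    """
    notes = list(Path(vault).rglob("*.md"))
    if not notes:
        raise FileNotFoundError(f"no markdown notes under {vault}")
    return random.choice(notes)
```

Uniform sampling is the design choice here: any weighting toward recent or popular notes would defeat the purpose.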
During the day: `/journal spouse She mentioned...` routes to the Spouse area. `/slot-document` scores any file against existing topic folders and moves it to the best fit. PARA only works if organization is effortless. This removes the friction.
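A keyword-overlap version of the slotting idea. The real command relies on the model's judgment, so treat this as a caricature of the scoring step (function and folder names are made up):

```python
import re

def tokenize(text: str) -> set[str]:
    """Lowercase terms of three or more letters."""
    return set(re.findall(r"[a-z]{3,}", text.lower()))

def slot_document(doc_text: str, topics: dict[str, str]) -> str:
    """Pick the topic folder whose sample text shares the most terms
    with the document.

    Pure surface overlap, which is exactly the failure mode worth
    verifying: shared words are not shared themes.
    """
    doc_terms = tokenize(doc_text)
    scores = {name: len(doc_terms & tokenize(sample))
              for name, sample in topics.items()}
    return max(scores, key=scores.get)
```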
Weekly: `/weekly-review` generates a strategic reflection from actual git commits across multiple repositories. Grounded in what happened, not what I think happened.
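Gathering the raw material for something like `/weekly-review` is plain `git log` plumbing. A sketch under my own assumptions about the repo layout; the synthesis itself is the model's job:

```python
import subprocess
from pathlib import Path

def commits_last_week(repos: list[str]) -> dict[str, list[str]]:
    """Collect the past week's commit subjects per repository.

    This raw list of what actually happened is what gets fed to the
    model for the reflection step.
    """
    out = {}
    for repo in repos:
        log = subprocess.run(
            ["git", "-C", repo, "log", "--since=7.days", "--pretty=%s"],
            capture_output=True, text=True, check=True,
        ).stdout
        out[Path(repo).name] = [line for line in log.splitlines() if line]
    return out
```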
When something connects: `/find-similar` discovers related content across the vault. I've found documents I forgot I wrote that directly addressed problems I was actively working on. `/synthesize` produces an overview of a topic folder. One of mine has 80+ documents. The synthesized overview makes it navigable in minutes instead of hours. AI doesn't respect domain boundaries the way human filing does, and that's where the most valuable insights live.
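Under the hood, similarity search can be as simple as comparing term sets. A Jaccard-similarity sketch (names illustrative; a production version would use embeddings rather than word overlap):

```python
import re

def terms(text: str) -> set[str]:
    """Lowercase terms of four or more letters."""
    return set(re.findall(r"[a-z]{4,}", text.lower()))

def find_similar(query: str, notes: dict[str, str], top_n: int = 3) -> list[str]:
    """Rank notes by Jaccard similarity of their term sets to the query."""
    q = terms(query)

    def jaccard(other: set[str]) -> float:
        union = q | other
        return len(q & other) / len(union) if union else 0.0

    ranked = sorted(notes, key=lambda name: jaccard(terms(notes[name])),
                    reverse=True)
    return ranked[:top_n]
```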
---
## The Skill Capture Loop
Every organizational problem I solve with Claude becomes a reusable, auto-activating skill. The vault currently has 250 captured skills and 60 custom commands built over a year of daily use.
The pattern: I encounter a knowledge management problem. We solve it. `/capture-skill` extracts the solution into a persistent skill file. Next time the pattern appears, the skill activates automatically based on keyword detection. No manual invocation needed.
Some examples: `progressive-summarization-technique` guides 4-layer document distillation. `cross-domain-synthesizer` identifies methodological parallels between unrelated fields. `de-ai-voice-editing` removes AI writing artifacts from drafts. `archive-topic-organizer` sorts flat file collections into topic-based folder structures.
Problems become patterns. Patterns become skills. Skills compound. After 250 of them, the system handles most routine knowledge work without custom instructions. The overhead was front-loaded. The benefit hasn't stopped compounding.
---
## What It Costs
API credits add up. `/weekly-review` with comprehensive git analysis across multiple domains isn't cheap. I budget for this explicitly.
Bootstrapping takes months at minimum; my 250 skills and 60 commands accumulated over a year of daily use. The initial period feels like overhead with an unclear payoff. Most people will quit before reaching the inflection point.
The over-engineering temptation is constant. Not everything needs a skill. Not everything needs a framework. Sometimes a note is just a note. I catch myself building systems for problems that don't recur. The discipline is knowing when to stop engineering and just write.
AI misorganization happens. `/slot-document` usually gets it right, but sometimes assigns based on surface keyword overlap rather than thematic fit. Trust but verify.
Context window limits create friction. Complex synthesis across many documents can lose details from earlier in the analysis. This improves with each model generation but it's a real constraint today.
---
## The Collection Was Never the Problem
The value was always in the accumulated notes. Not in any individual document, but in the connections between them — patterns that emerge across years of thinking about different problems.
Every tool migration, every abandoned system, every note I thought was lost was building raw material. The text files from 2003 contained thinking that's still relevant. The Evernote clippings I thought were a graveyard contained patterns I couldn't see at the time.
What was missing wasn't a better capture tool or a better taxonomy. It was a way to systematically extract and connect the value already there.
The bottleneck was never capture. It was always synthesis. For the first time in 20 years, mine is gone.