Overview

This knowledge base tracks research on AI tools, techniques, and knowledge management.

Current State

The wiki contains one ingested source so far: a YouTube transcript covering Andrej Karpathy's viral "LLM wiki" pattern. The core theme is LLM-maintained knowledge bases: using Claude Code and markdown files to build compounding personal wikis that replace traditional RAG for small-to-medium-scale research.

Key Themes

  • Knowledge compounding — The central insight: a wiki that is incrementally built and maintained by an LLM accumulates value in a way that chat sessions or RAG pipelines do not. Cross-references, contradictions, and synthesis are pre-compiled into the pages at write time rather than reconstructed at query time.
  • Simplicity over infrastructure — Markdown files plus an LLM, versus embedding models, vector databases, and chunking pipelines. The wiki approach accepts a lower scale ceiling in exchange for zero infrastructure.
  • AI-assisted organization — The human curates and directs; the LLM does all the bookkeeping. This inverts the traditional wiki maintenance burden.
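To make the "markdown files instead of RAG infrastructure" trade-off concrete, here is a minimal sketch of what retrieval looks like under this pattern: no embeddings, no vector store, just scanning the wiki's markdown pages for a query term and handing matching sections to an LLM as context. The directory name, section-splitting convention, and function name are hypothetical, not from the source.

```python
"""Sketch of grep-style retrieval over a markdown wiki.

Instead of chunking and embedding, scan pages for a query term
and return matching sections verbatim. The 'wiki/' layout and
heading-based splitting are illustrative assumptions.
"""
from pathlib import Path


def find_sections(wiki_dir: str, query: str) -> list[str]:
    """Return markdown sections that mention the query, tagged by page."""
    hits = []
    for page in Path(wiki_dir).glob("**/*.md"):
        # Split on top-level headings so each hit stays self-contained.
        for section in page.read_text(encoding="utf-8").split("\n# "):
            if query.lower() in section.lower():
                hits.append(f"[{page.name}] {section.strip()}")
    return hits


if __name__ == "__main__":
    # Hypothetical usage: gather context for an LLM prompt.
    context = "\n\n---\n\n".join(find_sections("wiki", "RAG"))
    print(context or "no matches")
```

At small-to-medium scale this substring scan is fast enough to run on every query, which is the point of the trade-off: the scale ceiling is real, but so is the zero-infrastructure setup.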

Gaps & Next Steps

  • No primary sources from Karpathy himself yet (the gist would be valuable to ingest directly).
  • The AI 2027 article is referenced but not ingested — it would add depth on AI forecasting.
  • No sources yet on traditional RAG implementations for direct comparison.