I build personal infrastructure around the things I do constantly, refined for my workflow (quirks included), with built-in privacy, and for fun.
At the time of writing, that includes:
- Search
- AI chat
- API server
- Emacs integration
Running in different contexts
A single binary (`indexer`) makes search, indexing, and AI chat available over:
- CLI (`indexer chat`, `indexer search --term "example"`)
- HTTP (`indexer serve`)
- Web UI (`indexer serve && open localhost:2222/chat`)
- SSH via dokku (`ssh -t dokku@myprivatehost indexer chat`)
- PWA (enables push notifications on iOS)
- GitHub Actions (notify when the index updates)
- Emacs (`org-ai-shell`, which wraps the CLI)
Isn’t that a lot of work?
There is an upfront investment, but the nice thing about infrastructure is that it keeps paying off for a long time. As a result, I almost never regret investing more into it.
Coding it up is surprisingly convenient: I’m building applications on well-understood problem areas and libraries, and the result is tooling for my specific way of doing things that would not otherwise exist.
Where to go from here
This all started because there wasn’t an easy way to use org-mode on mobile. I kept coming back to personal search as the missing link, and then AI started taking off. Retrieval-augmented generation (RAG) connects LLMs to data they hadn’t been trained on, which also points back to personal search. And as I got deeper into it, I found I also want local AI.
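The RAG idea above can be sketched in a few lines. This is a hypothetical illustration, not the actual `indexer` implementation: the word-overlap scoring stands in for a real search index, and the model call is left out entirely.

```python
# Minimal RAG sketch (hypothetical): retrieve the most relevant personal
# notes for a query, then pack them into a prompt as context for an LLM.

def retrieve(query, docs, k=2):
    """Rank docs by naive word overlap with the query (stand-in for a real index)."""
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)
    return scored[:k]

def build_prompt(query, docs):
    """Assemble retrieved context plus the question into a single prompt string."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

notes = [
    "dentist appointment moved to Friday",
    "indexer serve exposes chat on port 2222",
    "grocery list: eggs, coffee",
]

print(build_prompt("what port does indexer serve use?", notes))
```

The point is only the shape of the pipeline: retrieval narrows your personal corpus down to a few relevant snippets, and the LLM answers from those snippets rather than from its training data.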
I’m convinced I want to own the indexing/search/chat layers for personal productivity long term.
I’m less convinced about owning more of the AI stack. I’ve experimented with my own AI agents (mostly as background jobs), but the field is moving too fast and there’s a long way to go before they’re truly useful. I can imagine ripping out and replacing that layer many times over as models and techniques improve.