One of the limitations of large language models is that they often lack context when responding. LLMs like ChatGPT (GPT-3.5 and GPT-4 at the time of writing) don’t allow fine-tuning and can only take in ~4,000 tokens (roughly 3,000 words) as part of the prompt. That’s not nearly enough to provide context specific to your application. For example, you might want an AI coding tool to help you add a feature to your codebase, but the codebase is likely much larger than the prompt would allow.
To fit more information into prompts for context, LLMs could benefit from a cheatsheet generated for each prompt. Combining tools like semantic search with an LLM could allow for better applications that are more specific to the user’s domain. For example, when asking an AI coding tool to add a function, the search step could load in the related modules, or just their type signatures, rather than the entire codebase.
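As a rough illustration, here is a minimal sketch of that retrieve-then-prompt flow, assuming the openai Python package (v1-style client), numpy, and a toy in-memory index; the model names, chunks, and question are illustrative placeholders, not a prescribed setup.

```python
# Sketch of retrieval-augmented prompting: embed small chunks once,
# then pull only the most relevant ones into the prompt as a "cheatsheet".
import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def embed(texts: list[str]) -> np.ndarray:
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

# 1. Index: embed each chunk (e.g. type signatures or note snippets).
chunks = [
    "def add_user(name: str, email: str) -> User: ...",
    "class User: id: int; name: str; email: str",
    "def send_welcome_email(user: User) -> None: ...",
]
index = embed(chunks)

# 2. Retrieve: rank chunks by cosine similarity to the question.
question = "Add a function that registers a user and emails them a welcome message."
q = embed([question])[0]
scores = index @ q / (np.linalg.norm(index, axis=1) * np.linalg.norm(q))
top = [chunks[i] for i in np.argsort(-scores)[:2]]

# 3. Generate: the prompt carries only the cheatsheet, not the whole codebase.
answer = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "Relevant context:\n" + "\n".join(top)},
        {"role": "user", "content": question},
    ],
)
print(answer.choices[0].message.content)
```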
Read “Cheating is All You Need.”
Links to this note
- Chatgpt Lowers Barriers to Building Small Projects
  After using ChatGPT for a few coding projects recently, I find that it’s a great way to lower the barriers to building smaller, self-contained projects: things that have been hiding in your to-do list because they take a bit too much effort to attempt but are still good ideas.
- Llamafile Has the Best Ergonomics for Local Language Models
  By far the best installation and running experience for using a large language model locally is llamafile. The entire model, weights, and a server are packaged into a single binary that can be run across multiple runtime environments.
- Summarization With Chain of Density
  A recent paper studied text-summarization improvements using a chain-of-density prompt. The prompt improves over vanilla GPT responses and comes close to human-written summaries in informativeness and readability.
- It’s useful to think about the underlying utilities that go into running one’s life and business with the same rigor used to build something significant. After all, the things we rely on every day can have an outsized impact on our own performance, so why not treat them that way?
- LLM-First Programming Language
  There are many barriers to adoption for a new programming language looking to go mainstream. You have to attract a small group of enthusiasts, build an ecosystem of high-quality libraries, help new people learn, and eventually grow a talent marketplace.
- There are a few packages and libraries that are being built to use ChatGPT along with Emacs.
- I started building AI for notes to help me chat with my library of notes. The result of that exploration is org-ai, my one-of-one software that helps me remember what I’ve previously written and summarize information. Under the hood, it uses vector-based similarity search, LLMs, and agent-based AI to extract useful information from my zettelkasten in a chat-based interface.
- Why Vector Databases Are Important to AI
  The recent success of large language models like ChatGPT has led to a new stage of applied AI and, with it, new challenges. One of those challenges is building context with a limited amount of space to get good results.