My Transient UI For Local LLMs
Two big frustrations for me when I started using Large Language Models (LLMs) for coding were:
- copy-pasting large chunks of code back and forth
- having to painstakingly provide many details about the context of the piece of code I wanted:
  - where it would be called from,
  - what it could call, whether libraries or other methods from the project,
  - what the overall project was about,
  - what the project's coding style was.
To alleviate this, I created a transient UI for gptel that fits my workflow.
With it, a single keystroke switches prompts or includes or excludes a file.
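The general shape of such a menu can be sketched with transient's `transient-define-prefix` macro. This is a minimal, hypothetical example, not my actual configuration: the command names and the README flag are made up for illustration.

```emacs-lisp
(require 'transient)

(defvar my/llm-include-readme nil
  "Hypothetical flag: whether to include the README in the prompt context.")

(defun my/llm-toggle-readme ()
  "Toggle inclusion of the README (illustrative stub)."
  (interactive)
  (setq my/llm-include-readme (not my/llm-include-readme)))

(defun my/llm-send ()
  "Placeholder for the command that builds and sends the prompt."
  (interactive)
  (message "would send prompt (README included: %s)" my/llm-include-readme))

(transient-define-prefix my/llm-menu ()
  "One keystroke per toggle or action."
  ["Context files"
   ;; :transient t keeps the menu open so several files can be toggled in a row
   ("r" "Toggle README" my/llm-toggle-readme :transient t)]
  ["Actions"
   ("s" "Send prompt" my/llm-send)])
```

Binding `my/llm-menu` to a global key then gives the one-keystroke workflow described above.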
The MANIFEST.org file, whose use I advocated for before, comes in very handy: not only does it give the LLM context about what each file does, it is also parsed for links to other files, which are added to the list of includable files, even if they live outside the project.
This allows me to bring in my zettelkasten notes, files from other projects, external documentation, etc.
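Collecting those links is straightforward with org's parser. The sketch below, with an illustrative function name, uses the real `org-element` API to pull `file:` link targets out of a manifest; my actual implementation may differ in details.

```emacs-lisp
(require 'org-element)

(defun my/manifest-linked-files (manifest)
  "Return absolute paths of files linked from MANIFEST (an org file)."
  (with-temp-buffer
    (insert-file-contents manifest)
    (org-mode)
    (org-element-map (org-element-parse-buffer) 'link
      (lambda (link)
        ;; Only keep file: links; other link types (https, id, ...) are skipped.
        (when (string= (org-element-property :type link) "file")
          (expand-file-name (org-element-property :path link)))))))
```

Because the paths are expanded rather than restricted to the project root, links to notes or other projects work just as well.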
All the selected files are then concatenated into a big prompt that is fed to my locally running LLM, and its answer is asynchronously inserted where I called it.
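The concatenate-and-send step can be sketched with gptel's asynchronous `gptel-request` API. The variable `my/llm-files` and the command name are hypothetical; the callback shape (`response` and an `info` plist) is gptel's, though a robust version would record the invocation point with a marker before inserting.

```emacs-lisp
(require 'gptel)

(defvar my/llm-files nil
  "Hypothetical list of files currently selected for inclusion.")

(defun my/llm-send-prompt (prompt)
  "Concatenate `my/llm-files' with PROMPT and send it via gptel."
  (interactive "sPrompt: ")
  (let ((context
         (mapconcat (lambda (f)
                      (format "File: %s\n%s" f
                              (with-temp-buffer
                                (insert-file-contents f)
                                (buffer-string))))
                    my/llm-files "\n\n")))
    (gptel-request (concat context "\n\n" prompt)
      :callback (lambda (response _info)
                  ;; Insert the answer asynchronously when it arrives.
                  (when (stringp response)
                    (insert response))))))
```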
In the demo below, instead of copy-pasting an existing script and modifying some parts of it, I just explain what I want in plain English. Because I include relevant files (the MANIFEST and the README, as well as the other script that I reference in the prompt), the output from the LLM includes my Guix preamble, which an online LLM wouldn't have done unless I had asked for it explicitly or copy-pasted the example script.
1. Changelog
Initial version