Commit bddc11a0 authored by ale
Add README with simple installation instructions

Experimental Emacs LLM autocomplete assistant
===

Note: the code in this repository isn't ready for use yet.

This is a simple, minimalistic Emacs package that provides
autocomplete-like functionality using LLMs. It uses the Ollama API for
LLM access.

It tries to improve completion quality by providing *additional
context* to the LLM, obtained by indexing your project's source code
and retrieving snippets that are relevant to the code being completed
(a form of *RAG*).

The indexing code is taken from
[Aider](https://github.com/Aider-AI/aider) and can run as a local
service with its own HTTP API to return results with low latency.

## Usage

You can run the indexing service under systemd, but first you need to
install the *ecopilot_srcindex* Python package somewhere; for the
purposes of this example, we're going to use a virtualenv.

```
virtualenv venv
./venv/bin/pip3 install -e .
```

Install the unit files in your user's systemd service directory:

```
cp ecopilot-srcindex.{socket,service} ~/.config/systemd/user/
```

Then edit the service unit to reflect the actual installation path of
the *ecopilot-src-context-server* program. Lastly:

```
systemctl --user daemon-reload
systemctl --user enable ecopilot-srcindex.socket
systemctl --user start ecopilot-srcindex.socket
```
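
The path edit mentioned above typically means pointing the service
unit's `ExecStart=` line at the installed binary. A minimal sketch,
assuming the virtualenv created earlier; the exact path is
illustrative and depends on where you ran the install:

```
# ecopilot-srcindex.service (excerpt)
# Adjust ExecStart to wherever your virtualenv actually lives.
[Service]
ExecStart=/home/youruser/venv/bin/ecopilot-src-context-server
```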

The service will be automatically started when necessary.