diff --git a/README.md b/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..a7eace38f245a2961c490453451b8e90f99fa484
--- /dev/null
+++ b/README.md
@@ -0,0 +1,47 @@
+Experimental Emacs LLM autocomplete assistant
+===
+
+Note: the code in this repository isn't ready for use yet.
+
+This is a minimalistic Emacs package that provides
+autocomplete-like functionality using LLMs. It uses the Ollama API
+for LLM access.
+
+It tries to improve completion quality by providing the LLM with
+*additional context*, obtained by indexing your project's source code
+and retrieving snippets that are relevant to the code being completed
+(a form of *RAG*, retrieval-augmented generation).
+
+The indexing code is taken from
+[Aider](https://github.com/Aider-AI/aider) and can run as a local
+service with its own HTTP API, so that it can return results with low
+latency.
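+
+For illustration, a context query against the local service might
+look something like the following; the endpoint, port, and parameter
+name here are hypothetical, since the actual API is defined by the
+service itself:
+
+```
+# Hypothetical example: ask the index for snippets related to a symbol.
+curl 'http://localhost:8080/search?q=parse_config'
+```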
+
+
+## Usage
+
+To run the indexing service you can use systemd. First, install the
+*ecopilot_srcindex* Python package somewhere; for the purposes of
+this example, we're going to use a virtualenv.
+
+```
+virtualenv venv
+./venv/bin/pip3 install -e .
+```
+
+Install the unit files in your user's systemd service directory:
+
+```
+mkdir -p ~/.config/systemd/user
+cp ecopilot-srcindex.{socket,service} ~/.config/systemd/user/
+```
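+
+After copying, the `ExecStart` line in *ecopilot-srcindex.service*
+needs to point at the installed program. Assuming the virtualenv
+created above, it might look like this (the path is an example, not a
+fixed location):
+
+```
+[Service]
+ExecStart=/path/to/venv/bin/ecopilot-src-context-server
+```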
+
+Then edit the service unit to reflect the actual installation path of
+the *ecopilot-src-context-server* program. Lastly, reload systemd and
+enable the socket:
+
+```
+systemctl --user daemon-reload
+systemctl --user enable ecopilot-srcindex.socket
+systemctl --user start ecopilot-srcindex.socket
+```
+
+Thanks to socket activation, the service will be started
+automatically on the first connection to the socket.
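+
+To verify that everything is wired up, you can inspect the socket
+unit; it should be reported as active and listening:
+
+```
+# Show the state of the socket unit (it triggers the service on demand).
+systemctl --user status ecopilot-srcindex.socket
+```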