Installation

How to install and start using Memoire.

Requirements, use cases & limitations

Memoire is in alpha development and still has some kinks we are working on.

It currently runs on a single core and keeps everything in memory, with persistence on disk in case of failure. Our goal was to build something fast (and it is fast!) that can handle the most common use cases.

As such, the application will be memory-hungry until we complete our refactor. In our benchmarks, it can handle 50k documents on a machine with 1 GB of RAM and reply within 100 ms. If you are approaching this limit, we highly recommend increasing the RAM available to the application.

Get the application

Memoire is packaged as a Docker container; the easiest way to get it running is with docker-compose.

Memoire also needs an attached volume to save documents and processed data.

```yaml
services:
  memoire:
    image: ghcr.io/a-star-logic/memoire:latest
    container_name: Memoire
    ports:
      - 3003:3003
    environment:
      - API_KEY=abc123
      - SHOW_DOC=true
    deploy:
      resources:
        limits:
          cpus: '1'
          memory: 1G
    volumes:
      - .memoire:/memoire/.memoire
      # or
      # - your/local/path:/memoire/.memoire
```
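With this file saved as `docker-compose.yml`, a typical way to start the container and confirm it came up is with the standard Docker Compose CLI:

```bash
# Start Memoire in the background
docker compose up -d

# Follow the logs to confirm the service started correctly
docker compose logs -f memoire
```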

Environment variables

To work, Memoire needs at least the API_KEY environment variable; the application will refuse to start without this key.

In dev, you can set it to anything you want, but in production you must set this variable to a securely generated string (for example with `openssl rand -hex 32`).
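For example, one way to generate a key and hand it to docker-compose (this assumes your compose file references `${API_KEY}` instead of a hard-coded value like the `abc123` above):

```bash
# Generate a random 64-character hex key and export it for docker-compose
export API_KEY=$(openssl rand -hex 32)

# docker-compose substitutes ${API_KEY} from the shell environment
docker compose up -d
```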

This key must be included in every request you make to Memoire, in the Authorization header:

```bash
curl http://localhost:3003/endpoint -H "Authorization: Bearer my_API_KEY"
```

The SHOW_DOC variable enables the Swagger documentation endpoint and pretty-prints the logs.

Full list of variables:

| Variable | Description |
| --- | --- |
| `API_KEY`* (string) | The key to authorize your requests. |
| `SHOW_DOC` (true / false) | Display the Swagger doc and pretty-print logs (default: false). |
| `EMBEDDING_MODEL` (string) | Choose an embedding model; leave empty to use the local CPU-only model. Possible values: `cohere`, `openai`, `titanG1`, `titanV2`. |
| `AWS_SECRET_ACCESS_KEY`, `AWS_ACCESS_KEY_ID`, `AWS_REGION` (string) | Required if you are using Bedrock or Cohere models hosted on Bedrock. See our tutorial here to get the AWS variables. |
| `OPENAI_KEY` (string) | The API key to interact with OpenAI models (both Azure and the regular API). |
| `OPENAI_DEPLOYMENT` (string) | If you are using Azure OpenAI, this indicates the deployment. |
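As an illustration of how these variables fit together, here is a sketch of running the image directly with `docker run` and the OpenAI embedding model; the key value is a placeholder, and the model choice is just one of the values from the table:

```bash
# Run Memoire with the OpenAI embedding model instead of the local CPU-only one
docker run -d --name Memoire \
  -p 3003:3003 \
  -e API_KEY="$(openssl rand -hex 32)" \
  -e EMBEDDING_MODEL=openai \
  -e OPENAI_KEY="your-openai-key" \
  -v "$(pwd)/.memoire:/memoire/.memoire" \
  ghcr.io/a-star-logic/memoire:latest
```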

Telemetry

To improve our product, understand how it is used, and catch any issues you might face, we collect anonymous data.

This data contains no sensitive information and is sent to Posthog and Sentry.

There is no way to disable it for now (sorry, it's on our roadmap); in the meantime, you can block their domains if this is an issue for you.
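For example, if you run the container with plain `docker run`, one way to black-hole those domains is with `--add-host` entries. This is a sketch only: the exact telemetry hostnames depend on how Posthog and Sentry are configured, so treat the ones below as assumptions.

```bash
# Point common Posthog and Sentry hostnames at 0.0.0.0 inside the container
# (assumed hostnames; check your outbound traffic for the actual ones)
docker run -d --name Memoire \
  --add-host=app.posthog.com:0.0.0.0 \
  --add-host=sentry.io:0.0.0.0 \
  -p 3003:3003 \
  -e API_KEY="$(openssl rand -hex 32)" \
  -v "$(pwd)/.memoire:/memoire/.memoire" \
  ghcr.io/a-star-logic/memoire:latest
```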