Self-Hosted
Getting started with Simba installed on your local system
This guide walks you through installing and running Simba on your local system using pip, Git, or Docker.
You can choose the method that suits you best: if you simply want to use the SDK, we recommend the pip installation method; if you want more control over the source code, we recommend installing the full system from Git; and if you want a prebuilt solution, we recommend Docker.
Installation Methods
Install simba-core
simba-core is the PyPI package that contains the server logic and API. The server must be running before you can use the SDK.
To install the dependencies faster, we recommend using uv.
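For example (the package name comes from the heading above; `uv pip` is uv's drop-in replacement for the pip interface):

```shell
pip install simba-core

# or, with uv for faster dependency resolution:
pip install uv
uv pip install simba-core
```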
Create a config.yaml file
The config.yaml file is one of the most important pieces of this setup: it configures the embedding model, the vector store type, the retrieval strategy, the database, the Celery workers used for parsing, and the LLM you are using.
Go to your project root and create config.yaml; you can use the example below as a starting point.
The config file must be in the same directory from which you run Simba; otherwise it will not be picked up.
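An illustrative sketch covering the components listed above. The section and key names here are assumptions, not the definitive schema; adapt them to the options documented by Simba:

```yaml
# Illustrative config.yaml -- key names are assumptions, check the Simba docs.
llm:
  provider: "openai"
  model_name: "gpt-4o-mini"
  temperature: 0.0

embedding:
  provider: "huggingface"
  model_name: "BAAI/bge-base-en-v1.5"
  device: "cpu"

vector_store:
  provider: "faiss"
  collection_name: "simba_collection"

retrieval:
  method: "hybrid"
  k: 5

database:
  provider: "litedb"

celery:
  broker_url: "redis://localhost:6379/0"
  result_backend: "redis://localhost:6379/1"
```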
Create .env file
If you want to use OpenAI or Mistral AI, log chatbot traces with LangSmith, or run models through Ollama, you should set the corresponding variables in your .env file.
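A minimal sketch of such a .env file. The OpenAI, Mistral, and LangSmith variable names are the standard ones those services document; the Ollama URL is its common local default. Set only what you actually use:

```bash
# Set only the providers you actually use.
OPENAI_API_KEY=sk-...          # OpenAI
MISTRAL_API_KEY=...            # Mistral AI

# Optional: trace the chatbot with LangSmith
LANGCHAIN_TRACING_V2=true
LANGCHAIN_API_KEY=...

# Optional: local models through Ollama (default local URL)
OLLAMA_BASE_URL=http://localhost:11434
```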
Run the server
Now that you have your .env and config.yaml, you can run the following command:
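Assuming simba-core installs a `simba` command-line entry point (check `simba --help` after installation), starting the system looks like:

```shell
# Start the API server
simba server

# If your config enables Celery parsing workers, start them in a separate
# terminal (subcommand name is an assumption -- see `simba --help`)
simba parsers
```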
This will start the server at http://localhost:8000, and you will see logging messages in the console.
Install SDK
You can now install the SDK to start using Simba in local mode.
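For example (the SDK package name below is an assumption; check the project's PyPI listing for the exact name):

```shell
pip install simba-sdk   # package name assumed -- see the Simba docs / PyPI
```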
Basic usage
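The SDK's exact client class and method names should be taken from its own documentation. As an SDK-independent sketch, you can confirm the local server is reachable over plain HTTP using only the Python standard library; the endpoint path shown is a placeholder, not a documented route:

```python
# SDK-independent sketch: reach the local Simba server over HTTP with only
# the standard library. Endpoint paths are placeholders -- consult the SDK
# and API docs for the real client interface.
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # default address from "Run the server"

def api_url(path: str, base: str = BASE_URL) -> str:
    """Join the server base URL and an API path without doubling slashes."""
    return base.rstrip("/") + "/" + path.lstrip("/")

def get_json(path: str):
    """GET a JSON payload from the running server."""
    with urllib.request.urlopen(api_url(path)) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example (requires the server to be running; route name is hypothetical):
#   print(get_json("/health"))
```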
Dependencies
Simba has the following key dependencies:
Troubleshooting
To be added…
Next Steps
Once you have Simba installed, proceed to: