Model Configuration

Sally supports multiple models, which you can configure in the system. The following models have been tested and work with Sally; the more stars a model has, the better its performance. You can also add your own models.

  • GPT-4o-mini ⭐️⭐️
  • GPT-4o ⭐️⭐️⭐️
  • Claude 3.5 Haiku ⭐️⭐️⭐️
  • Claude 3.5 Sonnet ⭐️⭐️⭐️⭐️⭐️
  • DeepSeek V3 ⭐️⭐️⭐️⭐️

Supported AI Model Providers

You can obtain the models you want from the following providers and add them to the system. Priority is given to GPT, Claude, and DeepSeek models.

Provider      Status   Link
OpenAI        Done     link
OpenRouter    Done     link
SiliconFlow   Done     link
DeepSeek      Done     link

How to Add a Model

  1. Open Sally in the sidepanel.
  2. Click the "Plus" icon at the bottom left.
  3. Select an AI provider from the dropdown menu; if you want to use your own model, select "Custom".
  4. Enter the API key for the model provider. You need to register an account on the provider's website to get an API key.
  5. Select a model from the dropdown menu; you can search for the model you want to use.
  6. If you use a custom model, enter the model's base URL. A quick way to verify the key and URL is shown in the sketch after this list.
  7. Click "Confirm" to add the model to the system.
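
If you are unsure whether a key and base URL are valid, you can test them before adding the model. Below is a minimal sketch in TypeScript, assuming the provider exposes an OpenAI-compatible /chat/completions endpoint; the base URL, key, and model name are placeholders you must replace.

```ts
// Hypothetical pre-flight check: does this base URL + API key pair answer
// an OpenAI-compatible chat request? Adjust the path and model name if
// your provider deviates from the OpenAI API shape.
const BASE_URL = "https://api.example.com/v1"; // placeholder base URL
const API_KEY = "sk-...";                      // placeholder API key

async function checkModel(model: string): Promise<void> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: "Hello" }],
    }),
  });
  if (!res.ok) throw new Error(`Request failed: HTTP ${res.status}`);
  console.log(await res.json());
}

checkModel("gpt-4o-mini").catch(console.error);
```

If the call returns a normal completion, the same key and base URL should work when entered in Sally.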

How to Remove a Model

  1. Select the model you want to remove from the dropdown menu at the bottom left.
  2. Click the "Setting" icon on the right.
  3. Click the "Trash" button at the bottom.

Where the API Key Is Stored

The API key is stored in your browser's local storage, so you don't need to worry about it being leaked.

However, because the key lives only in this browser, you need to configure it again in every other software you use.
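
Sally's exact storage code is not shown here, but keeping a key in a browser's local storage typically looks like the sketch below, using the standard Web Storage API; the storage key name is illustrative.

```ts
// Illustrative sketch: persisting an API key in browser local storage.
// The key stays on this machine and is only transmitted when a request
// to the model provider is actually made.
const STORAGE_KEY = "sally.apiKey.openai"; // hypothetical storage key name

function saveApiKey(apiKey: string): void {
  window.localStorage.setItem(STORAGE_KEY, apiKey);
}

function loadApiKey(): string | null {
  return window.localStorage.getItem(STORAGE_KEY);
}

function removeApiKey(): void {
  window.localStorage.removeItem(STORAGE_KEY);
}
```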

We Do Not Proxy Your API Requests

If you configure your API key in the system, API requests are sent directly from your browser to the model provider.

This can cause issues such as CORS errors; if you use one of the AI providers listed above, you don't need to worry about this.
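
To illustrate what "directly from your browser" means, here is a minimal sketch of such a call; the endpoint and key are placeholders. If a provider does not send CORS headers, the browser blocks the response and fetch() rejects with a TypeError.

```ts
// Direct browser-to-provider call; there is no proxy in between.
const API_KEY = "sk-..."; // placeholder API key

async function directRequest(): Promise<void> {
  try {
    const res = await fetch("https://api.openai.com/v1/models", {
      headers: { Authorization: `Bearer ${API_KEY}` },
    });
    console.log(res.ok ? "Provider reachable" : `HTTP ${res.status}`);
  } catch (err) {
    // In a browser, this is typically a CORS or network failure.
    console.error("Request blocked or failed:", err);
  }
}

directRequest();
```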

Can I use LM Studio or Ollama?

We don't currently support LM Studio or Ollama, because requests to localhost APIs may be restricted in some software, but you can try it yourself.
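
If you want to experiment anyway, Ollama exposes an OpenAI-compatible API on http://localhost:11434/v1 by default, so you can try adding it as a "Custom" provider. The sketch below probes that endpoint; whether it succeeds depends on whether your environment allows requests to localhost.

```ts
// Probe a local Ollama server. Assumes Ollama's default port (11434)
// and its OpenAI-compatible /v1 API; if this request is blocked from
// the extension's context, the model cannot be used from Sally either.
async function probeOllama(): Promise<void> {
  try {
    const res = await fetch("http://localhost:11434/v1/models");
    console.log("Ollama reachable, models:", await res.json());
  } catch (err) {
    console.error("localhost blocked or Ollama not running:", err);
  }
}

probeOllama();
```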