# More AI models

{% tabs %}
{% tab title="Free servers" %}

### Image generation

* Stable Diffusion 3.5 Large
* Animagine XL 3.0

### Character engines

* Gemini 1.5 Flash
* Gemini 2.0 Flash

{% endtab %}

{% tab title="Premium servers" %}

### Image generation

* Stable Diffusion 3.5 Large
* Animagine XL 3.0
* Flux

### Character engines

* Gemini 1.5 Flash
* Gemini 2.0 Flash
* Llama 3 Instruct

{% endtab %}
{% endtabs %}

### Image generation

| Model                      | Description                                                                                                                                         | Free                 | Premium              |
| -------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------- | -------------------- | -------------------- |
| Stable Diffusion 3.5 Large | A solid text-to-image model with improved image quality, typography, complex prompt understanding, and resource efficiency. | :white\_check\_mark: | :white\_check\_mark: |
| Animagine XL 3.0           | Specializes in generating anime-styled images and characters.                                                               | :white\_check\_mark: | :white\_check\_mark: |
| Flux                       | Our most realistic image generation engine, capable of rendering legible text within images.                                | :x:                  | :white\_check\_mark: |

### Character engines

| Model                                                                                                                                                   | Free                                                  | Premium                     |
| ------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------- | --------------------------- |
| Gemini 1.5 Flash                                                                                                                                        | :white\_check\_mark: (global rate limit)              | :white\_check\_mark:        |
| Gemini 2.0 Flash                                                                                                                                        | :white\_check\_mark: (global rate limit)              | :white\_check\_mark:        |
| Llama 3 Instruct                                                                                                                                        | :white\_check\_mark: (fallback when Gemini cannot handle NSFW content) | :white\_check\_mark:        |
| Llama 4                                                                                                                                                 | :x:                                                   | :white\_check\_mark: (SOON) |
| GPT 4o                                                                                                                                                  | :x:                                                   | :white\_check\_mark: (SOON) |
| DeepSeek R1                                                                                                                                             | :x:                                                   | :white\_check\_mark: (SOON) |
| BYOK (Bring your own key) - use any AI model of your choice with your own valid API key (see supported providers below)                                 | :x:                                                   | :white\_check\_mark: (SOON) |

{% hint style="warning" %}
Support for engine choice is currently planned, and we'll be adding many more models.
{% endhint %}

### Supported custom API key providers

* **Google** - 39 characters long, starts with `AIza`
* **OpenAI** - 164 characters long, starts with `sk-proj`
* **Anthropic** - 108 characters long, starts with `sk-ant-`
* **Grok** - 84 characters long, starts with `xai-`
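As a client-side sanity check before submitting a key, the documented lengths and prefixes above can be turned into simple patterns. This is a hedged sketch: the patterns are inferred only from the lengths and prefixes listed here, and providers may change their key formats at any time.

```python
import re

# Patterns derived from the documented key lengths/prefixes above.
# Treat as a best-effort sanity check, not an authoritative validator.
KEY_PATTERNS = {
    "google":    re.compile(r"AIza\S{35}"),      # 39 characters total
    "openai":    re.compile(r"sk-proj\S{157}"),  # 164 characters total
    "anthropic": re.compile(r"sk-ant-\S{101}"),  # 108 characters total
    "grok":      re.compile(r"xai-\S{80}"),      # 84 characters total
}

def looks_valid(provider: str, key: str) -> bool:
    """Return True if the key matches the documented shape for the provider."""
    pattern = KEY_PATTERNS.get(provider.lower())
    return bool(pattern and pattern.fullmatch(key))

print(looks_valid("google", "AIza" + "x" * 35))  # True: correct prefix and length
print(looks_valid("google", "AIza" + "x" * 10))  # False: too short
```

A check like this only catches obviously malformed keys; the bot still verifies the key against the provider's API.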


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.aicordapp.com/premium/more-ai-models.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
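The query above can be issued from any HTTP client. A minimal Python sketch, assuming only that the question must be URL-encoded as the `ask` parameter:

```python
from urllib.parse import quote
from urllib.request import urlopen

BASE = "https://docs.aicordapp.com/premium/more-ai-models.md"

def ask(question: str) -> str:
    """Send a natural-language question to the docs endpoint and return the answer text."""
    url = f"{BASE}?ask={quote(question)}"
    with urlopen(url) as resp:  # performs the HTTP GET request
        return resp.read().decode("utf-8")

# Building the request URL without sending it:
print(f"{BASE}?ask={quote('Which models are free?')}")
```

Note that `quote` percent-encodes spaces and punctuation so the question survives as a single query parameter.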
