At its core, Sophia’s LoreBary is a proxy service, primarily used by JanitorAI users, that provides a unified library of commands, plugins, lorebooks, and more to enhance their roleplay experience. The service also offers features to manage context cache and make long roleplays enjoyable!
Following the LoreBary 2.0 update, the service, its content, and its features can be used on almost any frontend. WyvernChat offers several of LoreBary’s features natively on its platform; however, if you enjoy LoreBary’s unified content library and use its features across other platforms, you may want to set up LoreBary on WyvernChat as well.
Set Up LoreBary As A Provider On WyvernChat
Note: For this guide, our LLM provider is DeepSeek’s first-party API.
To use LoreBary on WyvernChat, you have to first set it up as a provider. Navigate to the AI Connections settings page under the Account section from the navigation menu. Alternatively, you can click on your Profile Picture in the top right corner and select AI Connections from the drop-down menu.

In the AI Connections menu, click on + Create Provider.

- Provider Name: LoreBary DeepSeek (you can enter any name).
- Landing Page URL: https://lorebary.com/
- Description: Enter any description you wish.
- Chat Completion Only?: Enable this option.
- No Key Required: Enable this option.
- Stored API Key (Optional): Add the API key you created on your LLM provider’s site (in our case, we use the API key we created on DeepSeek’s first-party API platform).
- API URL Type: Select OpenAI.
- API URL: https://api.lorebary.com/deepseek/v1 (use the appropriate LoreBary proxy link for the provider of your choice, or enter your custom proxy link and add /v1 at the end).
- API Backend Type: Select OPENAI.
- Allowed Parameters: Leave this option empty.
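The settings above boil down to an OpenAI-compatible base URL plus a stored key. As a rough sketch of what an OpenAI-style frontend assembles from them (the base URL is the LoreBary DeepSeek proxy from this guide; the key value is a placeholder, and no request is actually sent here):

```python
# Sketch: how an OpenAI-compatible frontend assembles a chat request from
# the provider settings above. The API key value is a placeholder.
API_URL = "https://api.lorebary.com/deepseek/v1"   # must end with /v1
STORED_API_KEY = "sk-your-deepseek-key"            # placeholder, not a real key

def build_chat_request(model: str, messages: list[dict]) -> dict:
    """Assemble endpoint, headers, and JSON body in the OpenAI chat format."""
    return {
        "endpoint": f"{API_URL}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {STORED_API_KEY}",
            "Content-Type": "application/json",
        },
        "body": {"model": model, "messages": messages},
    }

req = build_chat_request("deepseek-chat", [{"role": "user", "content": "Hello"}])
print(req["endpoint"])  # https://api.lorebary.com/deepseek/v1/chat/completions
```

This is also why the API URL must end in /v1: the frontend appends /chat/completions to it, and a missing /v1 produces a path the proxy doesn’t serve.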
Create A Connection Using LoreBary As Your Provider
Navigate to the AI Connections settings page under the Account section from the navigation menu. Alternatively, you can click on your Profile Picture in the top right corner and select AI Connections from the drop-down menu. Then, go to Connections and click + New Connection.
Create Connection
- Name: You can choose any name. We recommend using the name of the model you intend to use. In this case, we’ll name it LoreBary deepseek-chat.
- Description: You can enter any information you like in the description box.
- Visibility: Private. Make sure it’s a private connection.
- Force Chat Completion: Enable this option.
- Tool Call Enabled: This option can remain disabled unless you want to use LoreBary’s Google Search command (available when using OpenRouter or AI Studio as your LLM provider).
- Inline Image Support: This option can remain disabled.
- Use Browser Request: This option can remain disabled.
Model Configuration:
- Provider: Select the provider you created (in our case, we had named it LoreBary DeepSeek).
- Context Length (tokens): Set the context size; we recommend 16,384. Remember, a very large context size can lead to context rot.

Select A Model
The Fetch Models option won’t work when using LoreBary as your provider, so you’ll have to enter the model name manually, exactly as specified by your LLM provider. In our case, DeepSeek’s first-party API offers two models:
- deepseek-chat
- deepseek-reasoner
Enter the model name exactly as specified by your LLM provider. Ensure there are no typos, extra spaces, etc.
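Since the name is entered by hand, a small check like the following illustrates what “exactly as specified” means in practice (assuming DeepSeek’s first-party API with the two models listed above; other providers have different model lists):

```python
# Sketch: a simple sanity check for a manually entered model name,
# assuming DeepSeek's first-party API with its two models.
KNOWN_MODELS = {"deepseek-chat", "deepseek-reasoner"}

def normalize_model_name(raw: str) -> str:
    """Trim stray whitespace; the name must otherwise match exactly."""
    name = raw.strip()
    if name not in KNOWN_MODELS:
        raise ValueError(f"Unknown model {name!r}; check for typos")
    return name

print(normalize_model_name("  deepseek-chat "))  # deepseek-chat
```

Leading or trailing spaces are trimmed by most backends, but an underscore instead of a hyphen, or a wrong capitalization, will be rejected.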
Instruct Templates
You can either select an Instruct Template from WyvernChat’s library that you have bookmarked or configure your own. You need to add LoreBary’s commands, plugins, lorebooks, and other content codes to the System Prompt.

Configure the rest of the instruct template as you desire, or use defaults set by the template you’ve selected.
Parameters
You can either select a Sampler Preset from WyvernChat’s library that you have bookmarked or configure generation parameters on your own.
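In OpenAI-compatible requests, these sampler settings travel as extra fields in the request body. The values below are common neutral defaults, not WyvernChat’s exact “Stable” preset (which is defined by the platform):

```python
# Sketch: generation parameters as they appear in an OpenAI-style request
# body. These are common neutral defaults, NOT WyvernChat's "Stable" preset.
DEFAULT_PARAMS = {
    "temperature": 1.0,
    "top_p": 1.0,
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
}

def apply_params(body: dict, params: dict = DEFAULT_PARAMS) -> dict:
    """Merge sampler settings into a request body without mutating it."""
    return {**body, **params}

body = apply_params({"model": "deepseek-chat", "messages": []})
```

Values outside the ranges a model accepts are a common source of the generation errors covered in the troubleshooting section.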
Select Connection
Click Create to save your Connection. Open any chat on WyvernChat, click the cog icon, and select the Connection you created to use LoreBary on WyvernChat.

Troubleshooting Errors
LoreBary doesn’t verify or validate your API key, and that is by design: your API key stays between you, your provider, and your frontend. This is why you enable “No Key Required” and add your key in the “Stored API Key (Optional)” field while adding LoreBary as a Provider; otherwise, WyvernChat would require you to provide and validate an API key to configure the Provider.
- 401 Error: Your API key is invalid. Edit the Provider you created on WyvernChat, re-enter your Stored API Key, then save.
- 404 Error: Your API URL is invalid. Edit the Provider you created on WyvernChat, make sure the API URL ends with /v1, and that “Chat Completion Only?” is enabled. Alternatively, you may have made a mistake while manually entering the model name; edit the Connection you created and ensure the model name matches exactly what your LLM provider specifies.
- “This model is having trouble responding, please try a different model or wait for this one to become operational.”: The Parameters configuration within your Connection is incorrect for the model (incompatible samplers, unacceptable values, etc.). Edit the Connection and select the “Stable” preset to reset all samplers to their default values.
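The two HTTP errors above can be summarized as a simple lookup (a sketch for illustration; the messages paraphrase the fixes described in this guide):

```python
# Sketch: the troubleshooting notes above as a lookup table, mapping an
# HTTP status code from the proxy to the likely fix from this guide.
TROUBLESHOOTING = {
    401: "Invalid API key: re-enter the Stored API Key in your Provider.",
    404: "Bad API URL (must end with /v1) or a typo in the model name.",
}

def diagnose(status_code: int) -> str:
    """Return the guide's suggested fix for a known status code."""
    return TROUBLESHOOTING.get(
        status_code, "Unrecognized error: check your Connection parameters."
    )

print(diagnose(401))
```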
Use LoreBary On WyvernChat
Following the LoreBary 2.0 update, you can use its library of content and its features on any online frontend that allows you to “bring your own model,” like JanitorAI, Chub, or WyvernChat.
To use LoreBary on WyvernChat, you have to first set it up as a Provider, and then create a Connection using LoreBary as your provider. Add your LoreBary content codes to the System Prompt field, select the Connection you created while in a chat, and your requests will be routed through LoreBary.
