Use style tags to modify the writing style of the output.
Banned Tokens
Sequences you don't want to appear in the output. One per line. Text or [token ids].
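For illustration, an entry of "as an AI language model" would keep that exact phrase out of the output, while an entry of [420, 1337] (hypothetical token IDs) would ban that specific token sequence.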
Logit Bias
Add
Helps to ban or reinforce the usage of certain tokens.
Unrestricted maximum value for the context size slider. Enable only if you know what you're doing.
Context Size (tokens)
Max Response Length (tokens)
Multiple swipes per generation
Middle-out Transform
Max prompt cost: Unknown
Display the response bit by bit as it is generated.
When this is off, responses will be displayed all at once when they are complete.
Temperature
Frequency Penalty
Presence Penalty
Top K
Top P
Repetition Penalty
Min P
Top A
Quick Prompts Edit
Main
Auxiliary
Post-History Instructions
Utility Prompts
Impersonation prompt
Prompt that is used for the Impersonation function.
World Info format template
Wraps activated World Info entries before inserting into the prompt. Use {0} to mark a place where the content is inserted.
Scenario format template
Use {{scenario}} to mark a place where the content is inserted.
Personality format template
Use {{personality}} to mark a place where the content is inserted.
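For illustration only (not the defaults): a World Info format template of "[World lore: {0}]" would wrap each activated entry's content in that bracketed phrasing, and a Scenario format template of "Scenario: {{scenario}}" would substitute the scenario text at the placeholder; the Personality template works the same way with {{personality}}.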
Group Nudge prompt template
Sent at the end of the group chat history to force a reply from a specific character.
New Chat
Set at the beginning of the chat history to indicate that a new chat is about to start.
New Group Chat
Set at the beginning of the chat history to indicate that a new group chat is about to start.
New Example Chat
Set at the beginning of Dialogue examples to indicate that a new example chat is about to start.
Continue nudge
Set at the end of the chat history when the continue button is pressed.
Replace empty message
Send this text instead of nothing when the text box is empty.
Seed
Set to get deterministic results. Use -1 for random seed.
Temperature
Top K
Top P
Typical P
Min P
Top A
TFS
Repetition Penalty
Rep Pen Range
Repetition Penalty Slope
Mirostat
Mode
Tau
Eta
Seed
GBNF Grammar
Samplers Order
Samplers will be applied in a top-down order.
Use with caution.
Top K: 0
Top A: 1
Top P & Min P: 2
Tail Free Sampling: 3
Typical P: 4
Temperature: 5
Repetition Penalty: 6
Load koboldcpp order
Samplers Order
Samplers will be applied in a top-down order. Use with caution.
Temperature: 0
Top K Sampling: 1
Nucleus Sampling: 2
Tail Free Sampling: 3
Top A Sampling: 4
Typical P: 5
Mirostat: 8
Unified Sampling: 9
Min P: 10
Neutralize Samplers
Sampler Select
Multiple swipes per generation
Temperature
Top K
Top P
Typical P
Min P
Top A
TFS
Epsilon Cutoff
Top nsigma
Min Keep
Eta Cutoff
Repetition Penalty
Rep Pen Range
Rep Pen Slope
Rep Pen Decay
Encoder Penalty
Frequency Penalty
Presence Penalty
No Repeat Ngram Size
Skew
Min Length
Maximum tokens/second
Smoothing Factor
Smoothing Curve
Threshold
Probability
Multiplier
Base
Allowed Length
Penalty Range
Sequence Breakers
Dynamic Temperature
Minimum Temp
Maximum Temp
Exponent
Mode
Tau
Eta
Contrastive Search
Banned Tokens/Strings
Global list
Preset-specific list
Logit Bias
Add
Helps to ban or reinforce the usage of certain tokens.
CFG
Scale
Negative Prompt
Grammar String
JSON Schema
Samplers Order
kcpp only. Samplers will be applied in a top-down order.
Use with caution.
Top K: 0
Top A: 1
Top P & Min P: 2
Tail Free Sampling: 3
Typical P: 4
Temperature: 5
Repetition Penalty: 6
Load default order
Sampler Order
llama.cpp only. Determines the order of samplers. If Mirostat mode is not 0, sampler order is ignored.
Temperature
Top K
Top P
Typical P
Min P
Exclude Top Choices
DRY
Rep/Freq/Pres Penalties
Top N-Sigma
Load default order
Sampler Priority
Ooba only. Determines the order of samplers.
Repetition Penalty
Presence Penalty
Frequency Penalty
DRY
Temperature
Dynamic Temperature
Quadratic / Smooth Sampling
Top Nsigma
Top K
Top P
Typical P
Epsilon Cutoff
Eta Cutoff
Tail Free Sampling
Top A
Min P
Mirostat
XTC
Encoder Repetition Penalty
No Repeat Ngram
Load default order
Sampler Order
Aphrodite only. Determines the order of samplers.
DRY
Penalties
No Repeat Ngram
Dynatemp & Temperature
Top Nsigma
Top P & Top K
Top A
Min P
Tail-Free Sampling
Eta Cutoff
Epsilon Cutoff
Typical P
Cubic and Quadratic Sampling
XTC
Load default order
Character Names Behavior
Continue Postfix
Continue sends the last message as the assistant role instead of a system message with an instruction.
Combines consecutive system messages into one (excluding example dialogues). May improve coherence for some models.
Send the system prompt for supported models. If disabled, the user message is added to the beginning of the prompt.
Use search capabilities provided by the backend.
Not free, adds a $0.02 fee to each prompt.
Allows using function tools.
Can be utilized by various extensions to provide additional functionality. Not supported when Prompt Post-Processing with "no tools" is used!
Sends attached media in prompts if supported by the model. Videos must be less than 20 MB and under 1 minute long. Audio must be less than 20 MB.
Allows the model to return image attachments.
Incompatible with the following features: function calling, web search, system prompt.
Allows the model to return its thinking process.
This setting affects visibility only.
OpenAI-style options: low, medium, high. Minimum and maximum are aliased to low and high. Auto does not send an effort level.
Allocates a portion of the response length for thinking (min: 1024 tokens, low: 10%, medium: 25%, high: 50%, max: 95%), with a minimum of 1024 tokens in all cases. Auto does not request thinking.
Sets a dynamic reasoning depth level for thinking (Flash 3/Pro 3). High and low are supported by both, minimal and medium are Flash 3 only. Auto lets the model decide.
Allocates a portion of the response length for thinking (Flash 2.5/Pro 2.5) (min: 0/128 tokens, low: 10%, medium: 25%, high: 50%, max: 24576/32768 tokens). Auto lets the model decide.
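As a rough worked example of the percentage-based budgets: with a Max Response Length of 4096 tokens, a medium effort would reserve about 4096 × 0.25 ≈ 1024 tokens for thinking, subject to the minimums and caps listed above.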
Constrains the verbosity of the model's response.
Assistant Prefill
Assistant Impersonation Prefill
Prefills won't work when function calling is enabled and any tools are registered.
Logit Bias
Helps to ban or reinforce the usage of certain tokens. Confirm token parsing with Tokenizer.
View / Edit bias preset
Add bias entry
Most tokens have a leading space.
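For illustration, an entry for " however" (with the leading space) targets a different token than "however" without it; a strongly negative bias discourages that token, while a strongly positive bias reinforces it.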
API
No connection...
Not connected...
Chat Completion Source
Reverse Proxy
Proxy Presets
Saved addresses and passwords.
Proxy Name
This will show up as your saved preset.
Proxy Server URL
Alternative server URL (leave empty to use the default value).
Doesn't work? Try adding /v1 at the end! The /chat/completions suffix will be added automatically.
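For example (hypothetical address): entering https://my-proxy.example.com/v1 would send requests to https://my-proxy.example.com/v1/chat/completions.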
Proxy Password
Will be used as a password for the proxy instead of the API key.
Using a proxy that you're not running yourself is a risk to your data privacy.
ANY support requests will be REFUSED if you are using a proxy.
Chat backgrounds generated with the Image Generation extension will appear here.
Extensions
Manage extensions
Install extension
(DEPRECATED) Extras API:
Not connected...
Connect
Persona Management
Usage Stats
Backup
Restore
Create
+
Current Persona
[Persona Name]
Persona Description
Position
Tokens: 0
Connections
Default
Character
Chat
Global Settings
PNG
JSON
Chat
Character
text
Delete
Cancel
Advanced Definitions
Prompt Overrides
(For Chat Completion and Instruct Mode)
Insert {{original}} into either box to include the respective default prompt from system settings.
Main Prompt
Tokens: counting...
Post-History Instructions
Tokens: counting...
Creator's Metadata
(Not sent with the AI Prompt)
Everything here is optional
Created by
Character Version
Creator's Notes
Tags to Embed
Personality summary
Tokens: counting...
Scenario
Tokens: counting...
Character's Note
@ Depth
Role
Tokens: counting...
Talkativeness
How often the character speaks in group chats!
Shy / Normal / Chatty
Examples of dialogue
Important for establishing the character's writing style.
Tokens: counting...
Save
Chat History
New Chat
Import Chat
Select a World Info file for:
Primary Lorebook
A selected World Info will be bound to this character as its own Lorebook. When generating an AI reply, it will be combined with the entries from the global World Info selector. Exporting a character will also export the selected Lorebook file embedded in the JSON data.
Additional Lorebooks
Associate one or more auxiliary Lorebooks with this character. NOTE: These choices are optional and won't be preserved on character export!
entries
Comma separated (required)
Primary Keywords
Logic
(ignored if empty)
Optional Filter
Outlet Name
Scan Depth
Case-Sensitive
Whole Words
Group Scoring
Automation ID
Recursion Level
Inclusion Group
Prioritize
Group Weight
Sticky
Cooldown
Delay
Filter to Characters or Tags
Exclude
Filter to Generation Triggers
Additional Matching Sources
Character Description
Character Personality
Scenario
Persona Description
Character's Note
Creator's Notes
Import from supported sources or view Sample characters
Your Persona
Before you get started, you must select a persona name.
This can be changed at any time via the icon.
Persona Name:
in this group
Go back
Alternate Greetings
Add
These will be displayed as swipes on the first message when starting a new chat.
Group members can select one of them to initiate the conversation.
Click the button to get started!
Alternate Greeting #
Delete
Audio
Author's Note
Unique to this chat. Checkpoints inherit the Note from their parent, and can be changed individually after that.
Tokens: 0
Include in World Info Scanning
Before Main Prompt / Story String
After Main Prompt / Story String
In-chat @ Depth as
Insertion Frequency
(0 = Disable, 1 = Always)
User inputs until next insertion: (disabled)
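For example, a frequency of 4 would insert the Author's Note once every four user inputs; 1 inserts it every time, and 0 disables it.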
Character Author's Note (Private)
Won't be shared with the character card on export.
Will be automatically added as the author's note for this character. Will be used in groups, but can't be modified when a group chat is open.
Tokens: 0
Use character author's note
Replace Author's Note
Top of Author's Note
Bottom of Author's Note
Default Author's Note
Will be automatically added as the Author's Note for all new chats.
Tokens: 0
Before Main Prompt / Story String
After Main Prompt / Story String
In-chat @ Depth as
Insertion Frequency
(0 = Disable, 1 = Always)
Chat CFG
Unique to this chat.
Scale
1 = disabled
Negative Prompt
Positive Prompt
Use character CFG scales
Character CFG
Will be automatically added as the CFG for this character.
Scale
1 = disabled
Negative Prompt
Positive Prompt
Global CFG
Will be used as the default CFG options for every chat unless overridden.
Scale
1 = disabled
Negative Prompt
Positive Prompt
CFG Prompt Cascading
Combine positive/negative prompts from other boxes. For example, ticking the chat, global, and character boxes combines all negative prompts into a comma-separated string.
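For illustration, if the chat negative prompt were "purple prose" and the character negative prompt were "repetition" (hypothetical values), cascading both would produce the combined negative prompt "purple prose, repetition".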