Customization and Extensibility
Editing and Adding Modes
All of the modes can be edited in the config.yaml file. Adding a new mode is as simple as adding it to the config. To use a mode when generating, append m=<name of your mode> to the end of your prompt. To change the defaults, create a default mode.
Mode configuration options:

| Option | Description | Default |
|---|---|---|
| cfg | Guidance scale | 3.5 |
| steps | Number of sampling steps | 20 |
| lora | Name of the LoRA to use, from your ComfyUI models/loras folder | No LoRA |
| lora_strength | Strength of the LoRA effect | 1.0 |
| sampler | Sampling method to use | "euler" |
| scheduler | Scheduler to use | "simple" |
| prompt_template_pre_pe | Template applied before prompt enhancement; {} is replaced with the prompt before enhancement | "{}" |
| prompt_template_post_pe | Template applied after prompt enhancement; {} is replaced with the prompt after enhancement | "{}" |
| description | Description of the mode | No description |
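Putting the options together, a new mode entry in config.yaml might look like the sketch below. The option names come from the table above; the exact nesting (a top-level modes key and the mode name fast) is an assumption — mirror the structure already present in your config.yaml.

```yaml
# Hypothetical mode entry; the top-level "modes" key and the mode name
# "fast" are assumptions -- mirror the structure already in config.yaml.
modes:
  fast:
    description: "Quick drafts with fewer steps"
    cfg: 3.0
    steps: 12
    sampler: "euler"
    scheduler: "simple"
```

With an entry like this, appending m=fast to a prompt would select the mode.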
Changing the Model and Editing the Workflow
This project works with only one workflow, which limits it to UNET models. If you want to use a different workflow or model, you will have to change some code. To make the project work with a custom workflow, edit the _prepare_workflow method in the comfyui_telegram_bot/image_gen.py file so that it matches the node IDs and node attributes in your workflow file.
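As a rough sketch of what such an edit involves: a ComfyUI API-format workflow is a JSON object keyed by node ID, where each node carries an "inputs" dict. The node IDs ("3", "6"), the class types, and the function shape below are all assumptions for illustration, not the project's actual method — match them to your exported workflow file.

```python
# Hypothetical sketch of patching a ComfyUI API-format workflow dict.
# Node IDs "3" and "6" are assumptions; look them up in your own export.
import json


def prepare_workflow(workflow: dict, prompt: str, cfg: float, steps: int) -> dict:
    wf = json.loads(json.dumps(workflow))  # deep copy so the template stays untouched
    wf["3"]["inputs"]["cfg"] = cfg         # "3" assumed to be the KSampler node
    wf["3"]["inputs"]["steps"] = steps
    wf["6"]["inputs"]["text"] = prompt     # "6" assumed to be the positive CLIPTextEncode node
    return wf


template = {
    "3": {"class_type": "KSampler", "inputs": {"cfg": 3.5, "steps": 20}},
    "6": {"class_type": "CLIPTextEncode", "inputs": {"text": ""}},
}
wf = prepare_workflow(template, "a red fox", cfg=4.0, steps=30)
```

The deep copy matters: it keeps the loaded template reusable across requests instead of mutating it in place.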
To export a workflow from ComfyUI in the correct format:
- Open your workflow in the ComfyUI web interface.
- Go to Settings (gear icon in the right vertical menu) -> Dev Mode -> Enable dev mode options -> ON
- Close the settings.
- Click the Save (API Format) button in the right vertical menu.
Prompt Enhancement
Prompt enhancement is fully customizable: you can change the system prompts for the LLMs, and you can also change which LLM to use.
System Prompts
All of the system prompts can be edited in the config.yaml file. Adding a new prompt enhancement type is as simple as adding it to the config. To use it when generating, append pe=<name of your prompt enhancement type> to the end of your prompt.
Prompt enhancement type configuration options:

| Option | Description |
|---|---|
| description | Description of the prompt enhancement type |
| system_prompt | System prompt for the LLM. Important: the LLM has to be instructed to return the enhanced prompt in <prompt></prompt> tags! |
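A new prompt enhancement type in config.yaml might look like the fragment below. The two option names come from the table above; the top-level key and the type name detailed are assumptions — mirror the structure already present in your config.yaml.

```yaml
# Hypothetical prompt enhancement type; the surrounding key names are
# assumptions -- mirror the structure already in config.yaml.
prompt_enhancement_types:
  detailed:
    description: "Expands short prompts with extra visual detail"
    system_prompt: |
      You improve image-generation prompts. Reply with the enhanced
      prompt wrapped in <prompt></prompt> tags and nothing else.
```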
Services
There is currently only one prompt enhancement service implemented, and that is for the Anthropic API. Adding more is as simple as creating a new Python script in the comfyui_telegram_bot/services folder, named after your service. Then write a class that extends the PromptEnhanceService class and implements the enhance_prompt method. For reference on how this can be done, read the comfyui_telegram_bot/services/anthropic.py file.
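A minimal sketch of such a service follows. The real PromptEnhanceService base class lives in the project and its exact signatures are not shown here, so a stub stands in for it; the toy service also fakes the LLM call, but it does show the one contract the config documents — extracting the result from <prompt></prompt> tags.

```python
# Hypothetical service sketch. PromptEnhanceService below is a stand-in stub;
# the project's real base class may differ. The "LLM call" is faked.
import re


class PromptEnhanceService:  # stub standing in for the project's base class
    def enhance_prompt(self, prompt: str) -> str:
        raise NotImplementedError


class EchoService(PromptEnhanceService):
    """Toy service whose 'LLM' just echoes a tagged, slightly expanded reply."""

    def _call_llm(self, prompt: str) -> str:
        # Replace this with a real API call to your provider.
        return f"<prompt>{prompt}, highly detailed</prompt>"

    def enhance_prompt(self, prompt: str) -> str:
        reply = self._call_llm(prompt)
        # Pull the enhanced prompt out of the <prompt></prompt> tags, as the
        # system prompt instructs the LLM to format its answer.
        match = re.search(r"<prompt>(.*?)</prompt>", reply, re.DOTALL)
        return match.group(1) if match else prompt
```

Falling back to the original prompt when the tags are missing is a deliberate choice: a malformed LLM reply then degrades to no enhancement instead of an error.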
Contributing
I welcome all contributions. Simply create a pull request on the GitHub repository and I will review it.
Ideas for Improvements:
- Model switching
- Configuration options to work with any Workflow file
- IMG2IMG
- OpenAI prompt enhancement service