Compare commits

11 Commits
0.5.2 ... main

* `3e379e32cd` Bump version: 0.5.5 → 0.5.6 (2025-11-02 10:24:26 +00:00)
* `c3eaf7f639` Fix flux workflow (2025-11-02 10:24:17 +00:00)
* `c27b2c8aba` Bump version: 0.5.4 → 0.5.5 (2025-10-31 21:54:24 +00:00)
* `86efefb490` refactor(libs): update imports to use extracted build_user_content (2025-10-31 21:54:21 +00:00)
* `b7a2516dd4` Bump version: 0.5.3 → 0.5.4 (2025-10-31 16:16:51 +00:00)
* `9e3731bfdc` feat(config): add automatic config file creation from sample (2025-10-31 16:15:42 +00:00)
  * Modify load_config function to copy user_config.cfg from user_config.cfg.sample if it doesn't exist
  * Update README.md to reflect new features and configuration options, including web-based settings and job queue management
* `854732b1c2` Merge branch 'secondtopic' (2025-10-31 16:08:41 +00:00)
* `6172fb4f73` refactor(libs): extract build_user_content function to avoid duplication (2025-10-31 16:08:12 +00:00)
  * Move the logic for building user content, including topic selection and recent-prompt avoidance, into a shared function in generic.py. Update openrouter.py and openwebui.py to use this new function instead of duplicating code.
* `6a6802472c` Merge pull request #1 from karl0ss/secondtopic (2025-10-31 15:32:03 +00:00)
* `aa7092a7ed` Bump version: 0.5.2 → 0.5.3 (2025-10-31 15:31:20 +00:00)
* `435b687585` 20 words (2025-10-31 15:31:13 +00:00)
7 changed files with 96 additions and 87 deletions

```diff
@@ -1,5 +1,5 @@
 [tool.bumpversion]
-current_version = "0.5.2"
+current_version = "0.5.6"
 parse = "(?P<major>\\d+)\\.(?P<minor>\\d+)\\.(?P<patch>\\d+)"
 serialize = ["{major}.{minor}.{patch}"]
 replace = "{new_version}"
```

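The `parse` pattern in the bump-my-version config above can be sanity-checked in isolation. A minimal sketch (the `split_version` helper is illustrative, not part of the project):

```python
import re

# The same pattern the [tool.bumpversion] config declares.
PARSE = r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)"

def split_version(version: str) -> dict:
    """Split a semver-style string into named integer parts, as the parse pattern does."""
    m = re.match(PARSE, version)
    if m is None:
        raise ValueError(f"unparseable version: {version!r}")
    return {name: int(value) for name, value in m.groupdict().items()}

print(split_version("0.5.6"))  # {'major': 0, 'minor': 5, 'patch': 6}
```

bump-my-version feeds these named groups back into the `serialize` template `{major}.{minor}.{patch}` when writing the bumped version.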
```diff
@@ -4,7 +4,7 @@ FROM python:3.11-slim
 # Set the working directory in the container
 WORKDIR /app
 # Set version label
-ARG VERSION="0.5.2"
+ARG VERSION="0.5.6"
 LABEL version=$VERSION
 # Copy project files into the container
```

````diff
@@ -4,12 +4,19 @@ This project is a Flask-based web server designed to generate and display images
 ## Features
-* **Web Interface:** A simple web interface to view generated images.
-* **Image Generation:** Integrates with ComfyUI to generate images based on given prompts and models.
+* **Web Interface:** A simple web interface to view generated images, manage favourites, and monitor job queues.
+* **Image Generation:** Integrates with ComfyUI to generate images using SDXL, FLUX, and Qwen models based on given prompts.
+* **Prompt Generation:** Automatic prompt generation using OpenWebUI or OpenRouter APIs with topic-based theming.
 * **Scheduled Generation:** Automatically generates new images at a configurable time.
+* **Favourites System:** Mark and manage favourite images.
+* **Job Queue Management:** View and cancel running/pending image generation jobs.
+* **Thumbnail Generation:** Automatic thumbnail creation for generated images.
+* **Prompt Logging:** Maintains a log of recent prompts to avoid repetition.
+* **Settings Management:** Web-based configuration editor for all settings.
 * **Docker Support:** Comes with a `Dockerfile` and `docker-compose.yml` for easy setup and deployment.
-* **Configurable:** Most options can be configured through a `user_config.cfg` file.
+* **Configurable:** Most options can be configured through a `user_config.cfg` file or web interface.
 * **Authentication:** Optional password protection for image creation.
+* **Version Management:** Uses bump-my-version for version tracking.

 ## Prerequisites
@@ -33,8 +40,8 @@
 ```
 3. **Configure the application:**
-    * Copy the `user_config.cfg.sample` to `user_config.cfg`.
-    * Edit `user_config.cfg` with your settings. See the [Configuration](#configuration) section for more details.
+    * The `user_config.cfg` file will be automatically created from `user_config.cfg.sample` on first run if it doesn't exist.
+    * Edit `user_config.cfg` with your settings, or use the web-based settings page accessible by clicking the version number in the bottom right corner of the home page. See the [Configuration](#configuration) section for more details.
 4. **Run the application:**
 ```bash
@@ -51,8 +58,8 @@
 ```
 2. **Configure the application:**
-    * Copy the `user_config.cfg.sample` to `user_config.cfg`.
-    * Edit `user_config.cfg` with your settings. The `comfyui_url` should be the address of your ComfyUI instance, accessible from within the Docker network (e.g., `http://host.docker.internal:8188` or your server's IP).
+    * The `user_config.cfg` file will be automatically created from `user_config.cfg.sample` on first run if it doesn't exist.
+    * Edit `user_config.cfg` with your settings, or use the web-based settings page accessible by clicking the version number in the bottom right corner of any page. The `comfyui_url` should be the address of your ComfyUI instance, accessible from within the Docker network (e.g., `http://host.docker.internal:8188` or your server's IP).
 3. **Build and run with Docker Compose:**
 ```bash
@@ -82,7 +89,6 @@ The application is configured via the `user_config.cfg` file.
 | `[comfyui]` | `secondary_topic` | A secondary topic for prompt generation. | |
 | `[comfyui]` | `flux` | Enable FLUX models (`True`/`False`). | `False` |
 | `[comfyui]` | `qwen` | Enable Qwen models (`True`/`False`). | `False` |
-| `[comfyui]` | `only_flux` | Only use FLUX models (`True`/`False`). | `False` |
 | `[comfyui:flux]` | `models` | A comma-separated list of FLUX models. | `flux1-dev-Q4_0.gguf,flux1-schnell-Q4_0.gguf` |
 | `[comfyui:qwen]` | `models` | A comma-separated list of Qwen models. | `qwen-image-Q4_K_S.gguf, qwen-image-Q2_K.gguf` |
 | `[openwebui]` | `base_url` | The base URL for OpenWebUI. | `https://openwebui` |
@@ -96,14 +102,24 @@ The application is configured via the `user_config.cfg` file.
 ## Usage
 * **Gallery:** Open your browser to `http://<server_ip>:<port>` to see the gallery of generated images.
-* **Create Image:** Navigate to `/create` to manually trigger image generation.
+* **Create Image:** Navigate to `/create` or `/create_image` to manually trigger image generation with various model options.
+* **Job Queue:** Monitor and cancel running/pending jobs via the gallery interface.
+* **API Endpoints:**
+    * `/api/queue` - Get current job queue details (JSON)
+    * `/cancel` - Cancel the current running job

 ## Dependencies
 * Flask
 * comfy_api_simplified
 * APScheduler
 * Pillow
+* tenacity
+* nest_asyncio
+* openai
+* websockets
+* bump-my-version
+* openwebui-chat-client
 * And others, see `requirements.txt`.

 ## Contributing
````

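The README's configuration table documents the comma-separated model lists (e.g. `[comfyui:flux] models`). A quick sketch of how such a value parses with the split-and-strip idiom the project's code uses; the inline sample config mirrors the table's example rather than a real file:

```python
import configparser

# Hypothetical config text mirroring the README's [comfyui:flux] example row.
SAMPLE = """
[comfyui:flux]
models = flux1-dev-Q4_0.gguf,flux1-schnell-Q4_0.gguf
"""

config = configparser.ConfigParser()
config.read_string(SAMPLE)

# Same split-and-strip idiom the project applies to model lists.
models = [m.strip() for m in config["comfyui:flux"]["models"].split(",") if m.strip()]
print(models)  # ['flux1-dev-Q4_0.gguf', 'flux1-schnell-Q4_0.gguf']
```

The trailing `if m.strip()` guard means stray commas or whitespace-only entries in `user_config.cfg` are silently ignored.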
```diff
@@ -5,6 +5,7 @@ import sys
 import time
 import os
 import random
+import shutil
 from PIL import Image
 import nest_asyncio
 import json
@@ -38,10 +39,21 @@ def save_prompt(prompt):
 def load_config() -> configparser.ConfigParser:
-    """Loads user configuration from ./user_config.cfg."""
+    """Loads user configuration from ./user_config.cfg. If it doesn't exist, copies from user_config.cfg.sample."""
     user_config = configparser.ConfigParser()
+    config_path = "./user_config.cfg"
+    sample_path = "./user_config.cfg.sample"
+    if not os.path.exists(config_path):
+        if os.path.exists(sample_path):
+            shutil.copy(sample_path, config_path)
+            logging.info("Configuration file copied from sample.")
+        else:
+            logging.error("Neither user_config.cfg nor user_config.cfg.sample found.")
+            sys.exit(1)
     try:
-        user_config.read("./user_config.cfg")
+        user_config.read(config_path)
         logging.debug("Configuration loaded successfully.")
         return user_config
     except KeyError as e:
@@ -187,24 +199,63 @@ def load_prompt_models_from_config():
     return prompt_models

+def build_user_content(topic: str = "random") -> str:
+    """Build the user content string for prompt generation, including topic instructions and recent prompts avoidance."""
+    config = load_config()
+    topic_instruction = ""
+    selected_topic = ""
+    secondary_topic_instruction = ""
+    # Unique list of recent prompts
+    recent_prompts = list(set(load_recent_prompts()))
+    if topic == "random":
+        topics = [t.strip() for t in config["comfyui"]["topics"].split(",") if t.strip()]
+        selected_topic = random.choice(topics) if topics else ""
+    elif topic != "":
+        selected_topic = topic
+    else:
+        # Decide on whether to include a topic (e.g., 30% chance to include)
+        topics = [t.strip() for t in config["comfyui"]["topics"].split(",") if t.strip()]
+        if random.random() < 0.3 and topics:
+            selected_topic = random.choice(topics)
+    if selected_topic != "":
+        topic_instruction = f" Incorporate the theme of '{selected_topic}' into the new prompt."
+    # Add secondary topic if configured and not empty
+    secondary_topic = config["comfyui"].get("secondary_topic", "").strip()
+    if secondary_topic:
+        secondary_topic_instruction = f" Additionally incorporate the theme of '{secondary_topic}' into the new prompt."
+    user_content = (
+        "Can you generate me a really random image idea, Do not exceed 20 words. Use clear language, not poetic metaphors."
+        + topic_instruction
+        + secondary_topic_instruction
+        + "Avoid prompts similar to the following:"
+        + "\n".join(f"{i+1}. {p}" for i, p in enumerate(recent_prompts))
+    )
+    return user_content

 def create_prompt_with_random_model(base_prompt: str, topic: str = "random"):
     """Create a prompt using a randomly selected model from OpenWebUI or OpenRouter.
     If OpenWebUI fails, it will retry once. If it fails again, it will fallback to OpenRouter.
     """
     prompt_models = load_prompt_models_from_config()
     if not prompt_models:
         logging.warning("No prompt generation models configured.")
         return None

     # Randomly select a model
     service, model = random.choice(prompt_models)

     # Import here to avoid circular imports
     from libs.openwebui import create_prompt_on_openwebui
     from libs.openrouter import create_prompt_on_openrouter

     if service == "openwebui":
         try:
             # First attempt with OpenWebUI
@@ -212,13 +263,13 @@ def create_prompt_with_random_model(base_prompt: str, topic: str = "random"):
             result = create_prompt_on_openwebui(base_prompt, topic, model)
             if result:
                 return result

             # If first attempt returns None, try again
             logging.warning("First OpenWebUI attempt failed. Retrying...")
             result = create_prompt_on_openwebui(base_prompt, topic, model)
             if result:
                 return result

             # If second attempt fails, fallback to OpenRouter
             logging.warning("Second OpenWebUI attempt failed. Falling back to OpenRouter...")
             openrouter_models = [m for m in prompt_models if m[0] == "openrouter"]
@@ -228,7 +279,7 @@ def create_prompt_with_random_model(base_prompt: str, topic: str = "random"):
             else:
                 logging.error("No OpenRouter models configured for fallback.")
                 return "A colorful abstract composition"  # Default fallback prompt
         except Exception as e:
             logging.error(f"Error with OpenWebUI: {e}")
             # Fallback to OpenRouter on exception
@@ -244,7 +295,7 @@ def create_prompt_with_random_model(base_prompt: str, topic: str = "random"):
             else:
                 logging.error("No OpenRouter models configured for fallback.")
                 return "A colorful abstract composition"  # Default fallback prompt
     elif service == "openrouter":
         try:
             # Use OpenRouter
@@ -252,7 +303,7 @@ def create_prompt_with_random_model(base_prompt: str, topic: str = "random"):
         except Exception as e:
             logging.error(f"Error with OpenRouter: {e}")
             return "A colorful abstract composition"  # Default fallback prompt

 user_config = load_config()
 output_folder = user_config["comfyui"]["output_dir"]
```

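The copy-from-sample behaviour added to `load_config` boils down to a small, testable pattern. A standalone sketch of that pattern (paths are parameters here rather than the project's hard-coded `./user_config.cfg`, and it raises instead of calling `sys.exit`):

```python
import configparser
import logging
import os
import shutil

def load_config(config_path: str, sample_path: str) -> configparser.ConfigParser:
    """Load config_path, first seeding it from sample_path if it doesn't exist yet."""
    if not os.path.exists(config_path):
        if not os.path.exists(sample_path):
            # The real code logs an error and exits; raising is friendlier for reuse.
            raise FileNotFoundError("neither config nor sample found")
        shutil.copy(sample_path, config_path)
        logging.info("Configuration file copied from sample.")
    config = configparser.ConfigParser()
    config.read(config_path)
    return config
```

First run creates the real config from the sample; subsequent runs read the (possibly user-edited) config untouched, which is why the web settings editor can safely write back to it.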
```diff
@@ -2,7 +2,7 @@ import random
 import logging
 from openai import OpenAI, RateLimitError
 import nest_asyncio
-from libs.generic import load_recent_prompts, load_config
+from libs.generic import load_recent_prompts, load_config, build_user_content
 from libs.openwebui import create_prompt_on_openwebui
 import re

 nest_asyncio.apply()
@@ -40,36 +40,7 @@ def create_prompt_on_openrouter(prompt: str, topic: str = "random", model: str =
         logging.warning("OpenRouter is not enabled in the configuration.")
         return ""

-    topic_instruction = ""
-    selected_topic = ""
-    secondary_topic_instruction = ""
-    # Unique list of recent prompts
-    recent_prompts = list(set(load_recent_prompts()))
-    if topic == "random":
-        topics = [t.strip() for t in config["comfyui"]["topics"].split(",") if t.strip()]
-        selected_topic = random.choice(topics) if topics else ""
-    elif topic != "":
-        selected_topic = topic
-    else:
-        # Decide on whether to include a topic (e.g., 30% chance to include)
-        topics = [t.strip() for t in config["comfyui"]["topics"].split(",") if t.strip()]
-        if random.random() < 0.3 and topics:
-            selected_topic = random.choice(topics)
-    if selected_topic != "":
-        topic_instruction = f" Incorporate the theme of '{selected_topic}' into the new prompt."
-    # Add secondary topic if configured and not empty
-    secondary_topic = config["comfyui"].get("secondary_topic", "").strip()
-    if secondary_topic:
-        secondary_topic_instruction = f" Additionally incorporate the theme of '{secondary_topic}' into the new prompt, in the style of."
-    user_content = (
-        "Can you generate me a really random image idea, Do not exceed 20 words. Use clear language, not poetic metaphors."
-        + topic_instruction
-        + secondary_topic_instruction
-        + "Avoid prompts similar to the following:"
-        + "\n".join(f"{i+1}. {p}" for i, p in enumerate(recent_prompts))
-    )
+    user_content = build_user_content(topic)

     # Load configured models
     configured_models = [m.strip() for m in user_config["openrouter"]["models"].split(",") if m.strip()]
```

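The deduplicated block above centres on one piece of branching: how a topic gets chosen. Isolated from config and file I/O, that selection logic can be sketched as a pure function (the name `pick_topic` and the injected `rng` are illustrative, not the project's API):

```python
import random

def pick_topic(topic: str, topics: list[str], rng: random.Random) -> str:
    """Mirror build_user_content's rules: an explicit topic wins, "random"
    always draws from the configured list, and an empty topic draws one
    only about 30% of the time."""
    if topic == "random":
        return rng.choice(topics) if topics else ""
    if topic != "":
        return topic
    # Empty topic: include a theme only ~30% of the time.
    if topics and rng.random() < 0.3:
        return rng.choice(topics)
    return ""

print(pick_topic("space", ["cats", "space"], random.Random(0)))  # space
```

Passing the `random.Random` instance in, rather than using the module-level functions as the project does, makes the 30%-inclusion branch deterministic under test.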
```diff
@@ -1,7 +1,7 @@
 import random
 import logging
 import nest_asyncio
-from libs.generic import load_recent_prompts, load_config
+from libs.generic import load_recent_prompts, load_config, build_user_content
 import re
 from openwebui_chat_client import OpenWebUIClient
 from datetime import datetime
@@ -19,36 +19,7 @@ def create_prompt_on_openwebui(prompt: str, topic: str = "random", model: str =
     """Sends prompt to OpenWebui and returns the generated response."""
     # Reload config to get latest values
     config = load_config()
-    topic_instruction = ""
-    selected_topic = ""
-    secondary_topic_instruction = ""
-    # Unique list of recent prompts
-    recent_prompts = list(set(load_recent_prompts()))
-    if topic == "random":
-        topics = [t.strip() for t in config["comfyui"]["topics"].split(",") if t.strip()]
-        selected_topic = random.choice(topics)
-    elif topic != "":
-        selected_topic = topic
-    else:
-        # Decide on whether to include a topic (e.g., 30% chance to include)
-        topics = [t.strip() for t in config["comfyui"]["topics"].split(",") if t.strip()]
-        if random.random() < 0.3 and topics:
-            selected_topic = random.choice(topics)
-    if selected_topic != "":
-        topic_instruction = f" Incorporate the theme of '{selected_topic}' into the new prompt."
-    # Add secondary topic if configured and not empty
-    secondary_topic = config["comfyui"].get("secondary_topic", "").strip()
-    if secondary_topic:
-        secondary_topic_instruction = f" Additionally incorporate the theme of '{secondary_topic}' into the new prompt, in the style of."
-    user_content = (
-        "Can you generate me a really random image idea, Do not exceed 10 words. Use clear language, not poetic metaphors."
-        + topic_instruction
-        + secondary_topic_instruction
-        + "Avoid prompts similar to the following:"
-        + "\n".join(f"{i+1}. {p}" for i, p in enumerate(recent_prompts))
-    )
+    user_content = build_user_content(topic)

     if model:
         # Use the specified model
```

```diff
@@ -167,7 +167,7 @@
 "38": {
   "inputs": {
     "unet_name": "flux1-dev-Q4_0.gguf",
-    "device": "cuda:1",
+    "device": "cuda:0",
     "virtual_vram_gb": 0,
     "use_other_vram": true,
     "expert_mode_allocations": ""
```
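The final change retargets the workflow's GGUF UNet loader (node `38`) from `cuda:1` to `cuda:0`. Since exported ComfyUI workflows are plain JSON, the same edit can be applied programmatically; a sketch using only the node shown in the diff, with the rest of the workflow elided:

```python
import json

# Minimal stand-in for the exported workflow: just the node the diff touches.
workflow = json.loads("""
{
  "38": {
    "inputs": {
      "unet_name": "flux1-dev-Q4_0.gguf",
      "device": "cuda:1",
      "virtual_vram_gb": 0,
      "use_other_vram": true,
      "expert_mode_allocations": ""
    }
  }
}
""")

# Apply the same change the "Fix flux workflow" commit makes.
workflow["38"]["inputs"]["device"] = "cuda:0"
print(workflow["38"]["inputs"]["device"])  # cuda:0
```

Note that JSON's `true` round-trips to Python's `True` and back, so rewriting the file with `json.dump` preserves the `use_other_vram` flag as-is.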